WorldWideScience

Sample records for classification design challenges

  1. Classifications, applications, and design challenges of drones: A review

    Science.gov (United States)

    Hassanalian, M.; Abdelkefi, A.

    2017-05-01

    Nowadays, there is a growing need for flying drones with diverse capabilities for both civilian and military applications. There is also significant interest in the development of novel drones which can autonomously fly in different environments and locations and can perform various missions. In the past decade, the broad spectrum of applications of these drones has received the most attention, which has led to the invention of various types of drones with different sizes and weights. In this review paper, we identify a novel classification of flying drones, ranging from unmanned air vehicles to smart dusts at the two ends of this spectrum, together with their newly defined applications. Design and fabrication challenges of micro drones, existing methods for increasing their endurance, and various navigation and control approaches are discussed in detail. Limitations of the existing drones, proposed solutions for the next generation of drones, and recommendations are also presented and discussed.

  2. 6 CFR 7.30 - Classification challenges.

    Science.gov (United States)

    2010-01-01

    ... 6 Domestic Security 1 2010-01-01 2010-01-01 false Classification challenges. 7.30 Section 7.30... INFORMATION Classified Information § 7.30 Classification challenges. (a) Authorized holders of information... classified are encouraged and expected to challenge the classification status of that information pursuant to...

  3. 22 CFR 9.8 - Classification challenges.

    Science.gov (United States)

    2010-04-01

    ... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Classification challenges. 9.8 Section 9.8 Foreign Relations DEPARTMENT OF STATE GENERAL SECURITY INFORMATION REGULATIONS § 9.8 Classification... classification status is improper are expected and encouraged to challenge the classification status of the...

  4. Securing classification and regulatory approval for deepwater projects: management challenges in a global environment

    Energy Technology Data Exchange (ETDEWEB)

    Feijo, Luiz P.; Burton, Gareth C. [American Bureau of Shipping (ABS), Rio de Janeiro, RJ (Brazil)

    2008-07-01

    As the offshore industry continues to develop and move into increasingly deeper waters, technological boundaries are being pushed to new limits. Along with these advances, the design, fabrication and installation of deepwater oil and gas projects have become an increasingly global endeavor. After providing an overview of the history and role of Classification Societies, this paper reviews the challenges of securing classification and regulatory approval in a global environment. Operational, procedural and technological changes which one Classification Society, the American Bureau of Shipping (ABS), has implemented to address these challenges are presented. The result of these changes has been a more customized service aimed at a faster and more streamlined classification approval process. (author)

  5. Towards a unified classification of the ectodermal dysplasias: opportunities outweigh challenges.

    LENUS (Irish Health Repository)

    Irvine, Alan D

    2012-02-01

    The ectodermal dysplasias include a complex and highly diverse group of heritable disorders that share in common developmental abnormalities of ectodermal derivatives. The broader definition of ectodermal dysplasias (as heritable disorders involving at least two of the ectodermal derivatives nails, teeth, hair, and eccrine sweat glands) encompasses 170-200 conditions. Some conditions included by this definition are relatively common; others are rare and, in some cases, family-specific. Classification of the ectodermal dysplasias has largely been approached by categorizing patterns of clinical findings (phenotypic grouping). In the last 2 decades great progress has been made in understanding the molecular pathogenesis and inter-relatedness of some of these conditions and a new consensus approach to classification that incorporates this new information is needed. A comprehensive and definitive classification of these disorders would be highly valuable for the many stakeholders in ED. As disease-specific molecular treatments are developed, accurate classification will assume greater importance in designing registries to enable rapid identification of those with rare disorders who may wish to participate in clinical trials. Ideally a working classification of such a disparate collection of conditions would have a design and architecture that would facilitate easy accessibility by each of the key stakeholder groups and would encourage enhanced interaction between these parties. Attaining this objective is a major challenge but is achievable. This article reviews the historical-clinical perspective and the impact of recent developments in molecular biology in the field. Reflections are offered as to the future direction of classification systems in these disorders.

  6. Moving research tools into practice: the successes and challenges in promoting uptake of classification tools.

    Science.gov (United States)

    Cunningham, Barbara Jane; Hidecker, Mary Jo Cooley; Thomas-Stonell, Nancy; Rosenbaum, Peter

    2018-05-01

    In this paper, we present our experiences - both successes and challenges - in implementing evidence-based classification tools into clinical practice. We also make recommendations for others wanting to promote the uptake and application of new research-based assessment tools. We first describe classification systems and the benefits of using them in both research and practice. We then present a theoretical framework from Implementation Science to report strategies we have used to implement two research-based classification tools into practice. We also illustrate some of the challenges we have encountered by reporting results from an online survey investigating 58 speech-language pathologists' knowledge and use of the Communication Function Classification System (CFCS), a new tool to classify children's functional communication skills. We offer recommendations for researchers wanting to promote the uptake of new tools in clinical practice. Specifically, we identify structural, organizational, innovation, practitioner, and patient-related factors that we recommend researchers address in the design of implementation interventions. Roles and responsibilities of both researchers and clinicians in making implementation science a success are presented. Implications for rehabilitation: Promoting uptake of new and evidence-based tools into clinical practice is challenging. Implementation science can help researchers to close the knowledge-to-practice gap. Using concrete examples, we discuss our experiences in implementing evidence-based classification tools into practice within a theoretical framework. Recommendations are provided for researchers wanting to implement new tools in clinical practice. Implications for researchers and clinicians are presented.

  7. Design evaluaion: pneumatic transport and classification

    International Nuclear Information System (INIS)

    McNair, J.M.

    1979-10-01

    This report describes the evaluation of selected design features of the cold engineering scale pneumatic transport and classification subsystems used in the development of the head-end equipment for HTGR fuel reprocessing. The report identifies areas that require further design effort and evaluation of alternatives prior to the design of the HTGR reference recycle facility (HRRF). Seven areas in the transport subsystem and three in the classification subsystem were selected for evaluation. Seventeen specific recommendations are presented for further design effort

  8. 15 CFR 2008.7 - Challenges to classification.

    Science.gov (United States)

    2010-01-01

    ... 15 Commerce and Foreign Trade 3 2010-01-01 2010-01-01 false Challenges to classification. 2008.7 Section 2008.7 Commerce and Foreign Trade Regulations Relating to Foreign Trade Agreements OFFICE OF THE UNITED STATES TRADE REPRESENTATIVE REGULATIONS TO IMPLEMENT E.O. 12065; OFFICE OF THE UNITED STATES TRADE...

  9. Post engineered nanomaterials lifespan: nanowastes classification, legislative development/implementation challenges, and proactive approaches

    CSIR Research Space (South Africa)

    Musee, N

    2012-05-01

    NANOLCA Symposium, "Safety issues and regulatory challenges of nanomaterials", San Sebastian, Spain, 3-4 May 2012. Post engineered nanomaterials lifespan: nanowastes classification, legislative development/implementation challenges, and proactive...

  10. Virtual Bridge Design Challenge

    Science.gov (United States)

    Mitts, Charles R.

    2013-01-01

    This design/problem-solving activity challenges students to design a replacement bridge for one that has been designated as either structurally deficient or functionally obsolete. The Aycock MS Technology/STEM Magnet Program Virtual Bridge Design Challenge is an authentic introduction to the engineering design process. It is a socially relevant…

  11. Ecosystem services provided by a complex coastal region: challenges of classification and mapping

    Science.gov (United States)

    Sousa, Lisa P.; Sousa, Ana I.; Alves, Fátima L.; Lillebø, Ana I.

    2016-03-01

    A variety of ecosystem services classification systems and mapping approaches are available in the scientific and technical literature; these need to be selected and adapted when applied to complex territories (e.g. at the interface between water and land, or between estuary and sea). This paper provides a framework for addressing ecosystem services in complex coastal regions. The roadmap comprises the definition of the exact geographic boundaries of the study area; the use of CICES (Common International Classification of Ecosystem Services) for ecosystem services identification and classification; and the definition of qualitative indicators that will serve as the basis to map the ecosystem services. Due to its complexity, the Ria de Aveiro coastal region was selected as the case study, presenting an opportunity to explore the application of such approaches at a regional scale. The main challenges of implementing the proposed roadmap, together with its advantages, are discussed in this research. The results highlight the importance of considering both the connectivity of natural systems and the complexity of the governance framework; the flexibility and robustness, but also the challenges, of applying CICES at a regional scale; and the challenges regarding ecosystem services mapping.

  13. Cell-based therapy technology classifications and translational challenges

    Science.gov (United States)

    Mount, Natalie M.; Ward, Stephen J.; Kefalas, Panos; Hyllner, Johan

    2015-01-01

    Cell therapies offer the promise of treating and altering the course of diseases which cannot be addressed adequately by existing pharmaceuticals. Cell therapies are a diverse group across cell types and therapeutic indications and have been an active area of research for many years, but they are now emerging strongly through translation towards successful commercial development and patient access. In this article, we present a classification of cell therapies on the basis of their underlying technologies, rather than the more commonly used classification by cell type, because the regulatory path and manufacturing solutions are often similar within a technology area due to the nature of the methods used. We analyse the progress of new cell therapies towards clinical translation, examine how they are addressing the clinical, regulatory, manufacturing and reimbursement requirements, describe some of the remaining challenges and provide perspectives on how the field may progress in the future. PMID:26416686

  14. Post-industrial landscape - its identification and classification as contemporary challenges faced by geographic research

    Czech Academy of Sciences Publication Activity Database

    Kolejka, Jaromír

    2010-01-01

    Vol. 14, No. 2 (2010), pp. 67-78 ISSN 1842-5135 Institutional research plan: CEZ:AV0Z30860518 Keywords: classification * geographical research * identification method * landscape structure Subject RIV: DE - Earth Magnetism, Geodesy, Geography http://studiacrescent.com/images/02_2010/09_jaromir_kolejka_post_industrial_landscape_its_identification_and_classification_as_contemporary_challenges_faced_by_geographic_.pdf

  15. Classification of male lower torso for underwear design

    Science.gov (United States)

    Cheng, Z.; Kuzmichev, V. E.

    2017-10-01

    Using scanning technology, we obtained new information about the morphology of male bodies and revised the classification of men's underwear to adapt it to consumer demands. Based on the new classification, which reflects the characteristic factors of the male lower torso, we developed an underwear design method that yields accurate and convenient products for consumers.

  16. Design and implementation based on the classification protection vulnerability scanning system

    International Nuclear Information System (INIS)

    Wang Chao; Lu Zhigang; Liu Baoxu

    2010-01-01

    With the application and spread of classification protection, network security vulnerability scanning should consider both efficiency and function expansion. This paper proposes a vulnerability scanning system oriented to classification protection and elaborates its design and implementation, which is based on vulnerability-classification plug-in technology. Experiments show that the system adapts and scales well in classification protection applications and confirm the efficiency of its scanning. (authors)
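    The plug-in approach described above can be sketched as a small registry that groups scan checks by vulnerability class and runs only the groups a given protection level requires. All names and checks below are invented for illustration; the paper does not publish its implementation.

```python
# Minimal sketch of a vulnerability-classification plug-in registry
# (hypothetical names and checks; not the authors' code).

PLUGINS = {}  # vulnerability class -> list of scan functions

def plugin(vuln_class):
    """Decorator registering a scan function under a vulnerability class."""
    def register(fn):
        PLUGINS.setdefault(vuln_class, []).append(fn)
        return fn
    return register

@plugin("weak-auth")
def check_default_password(host):
    # Placeholder check: flag hosts still using a default credential.
    return "default credential" if host.get("password") == "admin" else None

@plugin("info-leak")
def check_banner(host):
    # Placeholder check: flag services that disclose version banners.
    return "version banner exposed" if host.get("banner") else None

def scan(host, required_classes):
    """Run only the plug-in groups required by the host's protection class."""
    findings = []
    for vuln_class in required_classes:
        for check in PLUGINS.get(vuln_class, []):
            result = check(host)
            if result:
                findings.append((vuln_class, check.__name__, result))
    return findings

findings = scan({"password": "admin", "banner": "sshd 7.4"},
                ["weak-auth", "info-leak"])
```

    Grouping checks by vulnerability class is what lets such a scanner skip whole plug-in families for lower protection levels, which is one plausible source of the efficiency the abstract reports.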

  17. Designing the robot inclusive space challenge

    Directory of Open Access Journals (Sweden)

    Rajesh Elara Mohan

    2015-11-01

    A novel robotic challenge, namely the robot inclusive spaces (RIS) challenge, is proposed in this paper; it is a cross-disciplinary and design-focused initiative. It aims to bring roboticists, architects, and designers together to realize robot-friendly social spaces. Contrary to conventional robotics competitions, which focus on designing robots and their component technologies, the robot inclusive spaces challenge adopts an interdisciplinary "design for robots" strategy to overcome the traditional research problems in real-world deployments of social robots. In order to realize the RIS, various architectural elements must be adapted, including design principles for inclusive spaces, lighting schemes, furniture choices and arrangement, wall and floor surfaces, and pathways, among others. This paper introduces the format and design principles of the RIS challenge, presents a first run of the challenge, and gives the corresponding analysis.

  18. Classification of the web

    DEFF Research Database (Denmark)

    Mai, Jens Erik

    2004-01-01

    This paper discusses the challenges faced by investigations into the classification of the Web and outlines inquiries that are needed to use principles for bibliographic classification to construct classifications of the Web. It suggests that the classification of the Web meets challenges that call for inquiries into the theoretical foundation of bibliographic classification theory.

  19. Using QA classification to guide design and manage risk

    International Nuclear Information System (INIS)

    Lathrop, J.; DeKlever, R.; Petrie, E.H.

    1993-01-01

    Raytheon Services Nevada has developed a classification process based on probabilistic risk assessment, using accident/impact scenarios for each system classified. Initial classification analyses were performed for the 20 systems of Package IA of the Exploratory Studies Facility (ESF). The analyses demonstrated a solid, defensible methodological basis for classification which minimizes the use of direct engineering judgment. They provide guidance for ESF design and risk management through the identification of the critical characteristics of each system that need to be controlled, and of the parts of the information base that most need to be further developed through performance assessment or other efforts.

  20. Challenges in Designing Mechatronic Systems

    DEFF Research Database (Denmark)

    Torry-Smith, Jonas; Qamar, Ahsan; Achiche, Sofiane

    2013-01-01

    Development of mechatronic products is traditionally carried out by several design experts from different design domains. Performing development of mechatronic products is thus greatly challenging. In order to tackle this, the critical challenges in mechatronics have to be well understood and well supported through applicable methods and tools. This paper aims at identifying the major challenges by conducting a systematic and thorough survey of the most relevant research work in mechatronic design. Solutions proposed in the literature are assessed and illustrated through a case study in order to investigate if the challenges can be handled appropriately by the methods, tools, and mindsets suggested by the mechatronic community. Using a real-world mechatronics case, the paper identifies the areas where further research is required, by showing a clear connection between the actual problems faced during...

  1. Mechatronic Design - Still a Considerable Challenge

    DEFF Research Database (Denmark)

    Torry-Smith, Jonas; Qamar, Ahsan; Achiche, Sofiane

    2011-01-01

    Development of mechatronic products is traditionally carried out by several design experts from different design domains. Performing development of mechatronic products is thus greatly challenging. In order to tackle this, the critical challenges in mechatronics have to be well understood and well supported through applicable methods and tools. This paper aims at identifying the major challenges by conducting a survey of the most relevant research work in mechatronic design. Solutions proposed in the literature are assessed and illustrated through a case study in order to investigate if the challenges can be handled appropriately by the methods, tools, and mindsets suggested by the mechatronic community. Using a real-world mechatronics case, the paper identifies the areas where further research is required, by showing a clear connection between the actual problems faced during the design task...

  2. Challenges and Approaches to Statistical Design and Inference in High Dimensional Investigations

    Science.gov (United States)

    Garrett, Karen A.; Allison, David B.

    2015-01-01

    Advances in modern technologies have facilitated high-dimensional experiments (HDEs) that generate tremendous amounts of genomic, proteomic, and other “omic” data. HDEs involving whole-genome sequences and polymorphisms, expression levels of genes, protein abundance measurements, and combinations thereof have become a vanguard for new analytic approaches to the analysis of HDE data. Such situations demand creative approaches to the processes of statistical inference, estimation, prediction, classification, and study design. The novel and challenging biological questions asked from HDE data have resulted in many specialized analytic techniques being developed. This chapter discusses some of the unique statistical challenges facing investigators studying high-dimensional biology, and describes some approaches being developed by statistical scientists. We have included some focus on the increasing interest in questions involving testing multiple propositions simultaneously, appropriate inferential indicators for the types of questions biologists are interested in, and the need for replication of results across independent studies, investigators, and settings. A key consideration inherent throughout is the challenge in providing methods that a statistician judges to be sound and a biologist finds informative. PMID:19588106

  3. Challenges and approaches to statistical design and inference in high-dimensional investigations.

    Science.gov (United States)

    Gadbury, Gary L; Garrett, Karen A; Allison, David B

    2009-01-01

    Advances in modern technologies have facilitated high-dimensional experiments (HDEs) that generate tremendous amounts of genomic, proteomic, and other "omic" data. HDEs involving whole-genome sequences and polymorphisms, expression levels of genes, protein abundance measurements, and combinations thereof have become a vanguard for new analytic approaches to the analysis of HDE data. Such situations demand creative approaches to the processes of statistical inference, estimation, prediction, classification, and study design. The novel and challenging biological questions asked from HDE data have resulted in many specialized analytic techniques being developed. This chapter discusses some of the unique statistical challenges facing investigators studying high-dimensional biology and describes some approaches being developed by statistical scientists. We have included some focus on the increasing interest in questions involving testing multiple propositions simultaneously, appropriate inferential indicators for the types of questions biologists are interested in, and the need for replication of results across independent studies, investigators, and settings. A key consideration inherent throughout is the challenge in providing methods that a statistician judges to be sound and a biologist finds informative.
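    One concrete example of "testing multiple propositions simultaneously" is the Benjamini-Hochberg step-up procedure for controlling the false discovery rate. It is a standard technique in this setting and is offered here as an illustrative sketch, not as the chapter's own method; the p-values are made up.

```python
# Benjamini-Hochberg step-up procedure: reject the hypotheses with the
# k_max smallest p-values, where k_max is the largest rank k such that
# p_(k) <= k * alpha / m. Controls the false discovery rate at alpha
# for independent tests.

def benjamini_hochberg(p_values, alpha=0.05):
    """Return sorted indices of hypotheses rejected at FDR level alpha."""
    m = len(p_values)
    # Sort p-values while remembering their original positions.
    order = sorted(range(m), key=lambda i: p_values[i])
    k_max = 0
    for rank, i in enumerate(order, start=1):
        if p_values[i] <= rank * alpha / m:
            k_max = rank  # largest rank satisfying the step-up criterion
    return sorted(order[:k_max])

# Ten hypothetical p-values from a multiple-testing experiment.
pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.060, 0.074, 0.205, 0.212, 0.216]
rejected = benjamini_hochberg(pvals, alpha=0.05)  # indices of rejected nulls
```

    With these inputs only the two smallest p-values survive the step-up criterion; a naive per-test threshold of 0.05 would instead have declared seven of the ten "significant", which illustrates why HDE analyses need such corrections.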

  4. Safeguards by Design Challenge

    Energy Technology Data Exchange (ETDEWEB)

    Alwin, Jennifer Louise [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-09-13

    The International Atomic Energy Agency (IAEA) defines Safeguards as a system of inspection and verification of the peaceful uses of nuclear materials as part of the Nuclear Nonproliferation Treaty. IAEA oversees safeguards worldwide. Safeguards by Design (SBD) involves incorporation of safeguards technologies, techniques, and instrumentation during the design phase of a facility, rather than after the fact. The design challenge goals are the following: design a system of safeguards technologies, techniques, and instrumentation for inspection and verification of the peaceful uses of nuclear materials. Cost should be minimized to work with the IAEA’s limited budget. Dose to workers should always be as low as reasonably achievable (ALARA). Time is of the essence in operating facilities, and the flow of material should not be interrupted significantly. Proprietary process information in facilities may need to be protected; thus the amount of information obtained by inspectors should be the minimum required to achieve the measurement goal. Three different design challenges are then detailed: Plutonium Waste Item Measurement System, Marine-based Modular Reactor, and Floating Nuclear Power Plant (FNPP).

  6. Challenges to Designing Game-Based Business

    DEFF Research Database (Denmark)

    Henriksen, Thomas Duus

    2014-01-01

    The four categories labelled game design, didactic design, organisational design and business design each constitute a set of challenges, each requiring a particular set of competencies. The key conclusion of the paper is that even though the learning game design constitutes the core of establishing game-based business (GBB), the subsequent stages of development call for other kinds of competencies in order to become a viable GBB.

  7. Evaluating and comparing imaging techniques: a review and classification of study designs

    International Nuclear Information System (INIS)

    Freedman, L.S.

    1987-01-01

    The design of studies to evaluate and compare imaging techniques is reviewed. Thirteen principles for the design of studies of diagnostic accuracy are given. Because of the 'independence principle', these studies are not able to evaluate directly the contribution of a technique to clinical management. For the latter, the 'clinical value' study design is recommended. A classification of study designs is proposed in parallel with the standard classification of clinical trials. Studies of diagnostic accuracy are analogous to Phase II, whereas studies evaluating the contribution to clinical management correspond to the Phase III category. Currently the majority of published studies employ the Phase II design. More emphasis on Phase III studies is required. (author)

  8. Challenges in biomimetic design and innovation

    DEFF Research Database (Denmark)

    Lenau, Torben Anker; Barfoed, Michael; Shu, Li

    Biomimetic design copies desired principles found in nature and implements them into artificial applications. Applications could be products we use in our daily life, but biomimetics can also be used to inspire material innovation. However, there are significant challenges in performing biomimetic design. One concerns the terminology and knowledge organisation: it is often easy to recognise the splendour of a biological solution, but it can be much more difficult to understand the underlying mechanisms. Another challenge in biomimetic design is the search for and identification of relevant solutions in nature. This is a key issue in design and innovation work, where problem identification and systematic search for suitable solution principles are major activities. One way to deal with this challenge is to use a biology search method. The use of such a method is illustrated with a case story describing the design...

  9. Applying Topographic Classification, Based on the Hydrological Process, to Design Habitat Linkages for Climate Change

    Directory of Open Access Journals (Sweden)

    Yongwon Mo

    2017-11-01

    The use of biodiversity surrogates has been discussed in the context of designing habitat linkages to support the migration of species affected by climate change. Topography has been proposed as a useful surrogate in the coarse-filter approach, as the hydrological processes caused by topography, such as erosion and accumulation, are the basis of ecological processes. However, studies that have designed topographic linkages as habitat linkages have so far focused mainly on the shape of the topography (morphometric topographic classification), with little emphasis on the hydrological processes (generic topographic classification), when finding such topographic linkages. We aimed to understand whether generic classification was valid for designing these linkages. First, we evaluated which topographic classification is more appropriate for describing actual (coniferous and deciduous trees) and potential (mammals and amphibians) habitat distributions. Second, we analyzed the difference in the linkages between the morphometric and generic topographic classifications. The results showed that the generic classification represented the actual distribution of the trees, but neither the morphometric nor the generic classification could represent the potential animal distributions adequately. Our study demonstrated that the topographic classes, according to the generic classification, were arranged successively according to the flow of water, nutrients, and sediment; therefore, it would be advantageous to secure linkages with a width of 1 km or more. In addition, the edge effect would be smaller than with the morphometric classification. Accordingly, we suggest that topographic characteristics based on the hydrological process are required to design topographic linkages for climate change.
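    The idea that generic (process-based) topographic classes follow the flow of water, nutrients, and sediment can be illustrated with a toy one-dimensional sketch: route each cell's flow to its steepest lower neighbour, accumulate it, and label cells by accumulated flow. The thresholds and labels below are invented for illustration and are not the paper's GIS procedure.

```python
# Toy 1-D illustration of a generic (hydrological-process-based)
# topographic classification. Cells are visited from highest to lowest
# so that upslope flow has already arrived when a cell passes its flow on.

def classify_profile(elevation, valley_threshold=3):
    n = len(elevation)
    accumulation = [1] * n  # each cell contributes one unit of flow
    for i in sorted(range(n), key=lambda c: -elevation[c]):
        lower = [j for j in (i - 1, i + 1)
                 if 0 <= j < n and elevation[j] < elevation[i]]
        if lower:  # send all flow to the steepest-descent neighbour
            j = min(lower, key=lambda c: elevation[c])
            accumulation[j] += accumulation[i]
    labels = []
    for a in accumulation:
        if a >= valley_threshold:
            labels.append("accumulation")  # valley floor: deposition
        elif a > 1:
            labels.append("transport")     # mid-slope: sediment transit
        else:
            labels.append("erosion")       # ridge/crest: source area
    return accumulation, labels

# A small ridge-valley-ridge profile: flow converges on the low cell.
acc, labels = classify_profile([5, 4, 2, 3, 6])
```

    The classes come out ordered along the flow path (erosion, transport, accumulation), which is the successive arrangement the abstract attributes to the generic classification.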

  10. Rooftop Garden Design Challenge

    Science.gov (United States)

    Roman, Harry T.

    2010-01-01

    A small commercial building in a nearby industrial park has decided to install a rooftop garden for its employees to enjoy. The garden will be about 100 feet long and 75 feet wide. This article presents a design challenge for technology and engineering students wherein they will assist in the initial conceptual design of the rooftop garden. The…

  11. Pattern recognition and classification an introduction

    CERN Document Server

    Dougherty, Geoff

    2012-01-01

    The use of pattern recognition and classification is fundamental to many of the automated electronic systems in use today. However, despite the existence of a number of notable books in the field, the subject remains very challenging, especially for the beginner. Pattern Recognition and Classification presents a comprehensive introduction to the core concepts involved in automated pattern recognition. It is designed to be accessible to newcomers from varied backgrounds, but it will also be useful to researchers and professionals in image and signal processing and analysis, and in computer vision.

  12. A Two-Stream Deep Fusion Framework for High-Resolution Aerial Scene Classification

    Directory of Open Access Journals (Sweden)

    Yunlong Yu

    2018-01-01

    One of the challenging problems in understanding high-resolution remote sensing images is aerial scene classification. A well-designed feature representation method and classifier can improve classification accuracy. In this paper, we construct a new two-stream deep architecture for aerial scene classification. First, we use two pretrained convolutional neural networks (CNNs) as feature extractors to learn deep features from the original aerial image and from the aerial image processed through saliency detection, respectively. Second, two feature fusion strategies are adopted to fuse the two different types of deep convolutional features extracted by the original RGB stream and the saliency stream. Finally, we use an extreme learning machine (ELM) classifier for final classification with the fused features. The effectiveness of the proposed architecture is tested on four challenging datasets: the UC-Merced dataset with 21 scene categories, the WHU-RS dataset with 19 scene categories, the AID dataset with 30 scene categories, and the NWPU-RESISC45 dataset with 45 challenging scene categories. The experimental results demonstrate that our architecture achieves a significant classification accuracy improvement over all state-of-the-art references.
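
    The fusion-plus-ELM pipeline described above can be sketched compactly: an extreme learning machine is a fixed random hidden layer with a closed-form least-squares readout, applied here to the concatenation of two feature streams. The sketch below uses random stand-in features rather than actual CNN outputs, and all names are illustrative assumptions, not the authors' implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Stand-ins for the two deep-feature streams (RGB stream and saliency stream).
    n, d, classes = 300, 32, 3
    y = rng.integers(0, classes, n)
    f_rgb = rng.normal(0, 1, (n, d)) + y[:, None]        # class-informative features
    f_sal = rng.normal(0, 1, (n, d)) + 0.5 * y[:, None]

    # Fusion strategy: simple concatenation of the two feature vectors.
    X = np.hstack([f_rgb, f_sal])

    def elm_fit(X, y, hidden=200):
        """Extreme learning machine: random hidden layer + least-squares readout."""
        W = rng.normal(0, 1, (X.shape[1], hidden))
        H = np.tanh(X @ W)                               # fixed random features
        T = np.eye(classes)[y]                           # one-hot targets
        beta, *_ = np.linalg.lstsq(H, T, rcond=None)     # closed-form output weights
        return W, beta

    def elm_predict(X, W, beta):
        return (np.tanh(X @ W) @ beta).argmax(axis=1)

    W, beta = elm_fit(X, y)
    acc = (elm_predict(X, W, beta) == y).mean()
    ```

    Because the readout is solved in closed form, training is a single linear solve; only the number of hidden units is tuned.
    
    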

  13. Design of a hybrid model for cardiac arrhythmia classification based on Daubechies wavelet transform.

    Science.gov (United States)

    Rajagopal, Rekha; Ranganathan, Vidhyapriya

    2018-06-05

    Automation in cardiac arrhythmia classification helps medical professionals make accurate decisions about the patient's health. The aim of this work was to design a hybrid classification model to classify cardiac arrhythmias. The design phase of the classification model comprises the following stages: preprocessing of the cardiac signal by eliminating detail coefficients that contain noise, feature extraction through Daubechies wavelet transform, and arrhythmia classification using a collaborative decision from the K nearest neighbor classifier (KNN) and a support vector machine (SVM). The proposed model is able to classify 5 arrhythmia classes as per the ANSI/AAMI EC57: 1998 classification standard. Level 1 of the proposed model involves classification using the KNN and the classifier is trained with examples from all classes. Level 2 involves classification using an SVM and is trained specifically to classify overlapped classes. The final classification of a test heartbeat pertaining to a particular class is done using the proposed KNN/SVM hybrid model. The experimental results demonstrated that the average sensitivity of the proposed model was 92.56%, the average specificity 99.35%, the average positive predictive value 98.13%, the average F-score 94.5%, and the average accuracy 99.78%. The results obtained using the proposed model were compared with the results of discriminant, tree, and KNN classifiers. The proposed model is able to achieve a high classification accuracy.
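
    The two-level decision rule above (a KNN over all classes, with ambiguous predictions deferred to a model specialised for the overlapped classes) can be sketched as follows. For brevity, a nearest-centroid rule stands in for the SVM, and the "heartbeat features" are synthetic; all names are illustrative, not the authors' code.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic 2-D "heartbeat features": three classes, two of which overlap.
    def sample(mean, n):
        return rng.normal(mean, 0.6, (n, 2))

    X_train = np.vstack([sample([0, 0], 60), sample([3, 3], 60), sample([3.8, 3.8], 60)])
    y_train = np.array([0] * 60 + [1] * 60 + [2] * 60)
    OVERLAPPED = {1, 2}          # classes the level-2 model is specialised for

    def knn_predict(x, X, y, k=5):
        """Level 1: plain k-nearest-neighbour vote over all classes."""
        d = np.linalg.norm(X - x, axis=1)
        votes = y[np.argsort(d)[:k]]
        return np.bincount(votes).argmax()

    # Level 2 stand-in for the SVM: nearest class centroid among overlapped classes.
    centroids = {c: X_train[y_train == c].mean(axis=0) for c in OVERLAPPED}

    def hybrid_predict(x):
        c = knn_predict(x, X_train, y_train)
        if c in OVERLAPPED:      # defer ambiguous beats to the specialised model
            c = min(centroids, key=lambda k_: np.linalg.norm(x - centroids[k_]))
        return c

    acc = np.mean([hybrid_predict(x) == yi for x, yi in zip(X_train, y_train)])
    ```

    The design point is that the level-2 model only ever has to separate the classes the level-1 model confuses, so it can be trained on a much narrower problem.
    
    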

  14. Weakly supervised classification in high energy physics

    Energy Technology Data Exchange (ETDEWEB)

    Dery, Lucio Mwinmaarong [Physics Department, Stanford University,Stanford, CA, 94305 (United States); Nachman, Benjamin [Physics Division, Lawrence Berkeley National Laboratory,1 Cyclotron Rd, Berkeley, CA, 94720 (United States); Rubbo, Francesco; Schwartzman, Ariel [SLAC National Accelerator Laboratory, Stanford University,2575 Sand Hill Rd, Menlo Park, CA, 94025 (United States)

    2017-05-29

    As machine learning algorithms become increasingly sophisticated to exploit subtle features of the data, they often become more dependent on simulations. This paper presents a new approach called weakly supervised classification in which class proportions are the only input into the machine learning algorithm. Using one of the most challenging binary classification tasks in high energy physics — quark versus gluon tagging — we show that weakly supervised classification can match the performance of fully supervised algorithms. Furthermore, by design, the new algorithm is insensitive to any mis-modeling of discriminating features in the data by the simulation. Weakly supervised classification is a general procedure that can be applied to a wide variety of learning problems to boost performance and robustness when detailed simulations are not reliable or not available.
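
    The core idea, that class proportions alone can supervise a classifier, can be illustrated with a minimal numpy sketch: a logistic model is trained so that its batch-averaged output matches each batch's known signal fraction, and is then evaluated against per-sample labels it never saw. The data, features, and training scheme are toy stand-ins of my own, not the paper's setup.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def make_mixed_batches(n_batches=100, batch_size=128):
        """Batches of two-feature samples where only the signal fraction is known."""
        batches, fractions = [], []
        for _ in range(n_batches):
            n_sig = int(rng.integers(int(0.2 * batch_size), int(0.8 * batch_size)))
            sig = rng.normal(+1.0, 1.0, (n_sig, 2))               # signal-like
            bkg = rng.normal(-1.0, 1.0, (batch_size - n_sig, 2))  # background-like
            batches.append(np.vstack([sig, bkg]))
            fractions.append(n_sig / batch_size)
        return batches, np.array(fractions)

    def train_from_proportions(batches, fractions, lr=0.5, epochs=100):
        """Fit a logistic model so each batch-mean output matches the known fraction."""
        w, b = np.zeros(2), 0.0
        for _ in range(epochs):
            for X, f in zip(batches, fractions):
                z = np.clip(X @ w + b, -30, 30)
                p = 1.0 / (1.0 + np.exp(-z))     # per-sample probabilities
                err = p.mean() - f               # proportion mismatch for this batch
                s = p * (1.0 - p)                # sigmoid derivative
                w -= lr * err * (s[:, None] * X).mean(axis=0)
                b -= lr * err * s.mean()
        return w, b

    batches, fractions = make_mixed_batches()
    w, b = train_from_proportions(batches, fractions)

    # The model never saw a per-sample label; evaluate it on labelled test data.
    X_test = np.vstack([rng.normal(+1.0, 1.0, (500, 2)),
                        rng.normal(-1.0, 1.0, (500, 2))])
    y_test = np.array([1] * 500 + [0] * 500)
    acc = ((X_test @ w + b > 0) == y_test).mean()
    ```

    Because only the batch fractions enter the loss, any per-sample mis-modeling that preserves the fractions leaves the training signal unchanged, which is the robustness argument made in the abstract.
    
    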

  15. Weakly supervised classification in high energy physics

    International Nuclear Information System (INIS)

    Dery, Lucio Mwinmaarong; Nachman, Benjamin; Rubbo, Francesco; Schwartzman, Ariel

    2017-01-01

    As machine learning algorithms become increasingly sophisticated to exploit subtle features of the data, they often become more dependent on simulations. This paper presents a new approach called weakly supervised classification, in which class proportions are the only input into the machine learning algorithm. Using one of the most challenging binary classification tasks in high energy physics — quark versus gluon tagging — we show that weakly supervised classification can match the performance of fully supervised algorithms. Furthermore, by design, the new algorithm is insensitive to any mis-modeling of discriminating features in the data by the simulation. Weakly supervised classification is a general procedure that can be applied to a wide variety of learning problems to boost performance and robustness when detailed simulations are not reliable or not available.

  16. Challenges of Collaborative Product Styling in Design Teams

    DEFF Research Database (Denmark)

    Ovesen, Nis

    2016-01-01

    Apart from a long list of advantages, design students face certain challenges when working collectively in groups on form, styling and aesthetics. An investigation of these challenges has been carried out and a number of challenges have been identified. The most apparent challenges relate to different aesthetic preferences, lack of tools and methods, and difficulties in establishing form-related requirements. The challenges are presented, and design quality as well as some solution strategies are discussed.

  17. Analysis of effect of safety classification on DCS design in nuclear power plants

    International Nuclear Information System (INIS)

    Gou Guokai; Li Guomin; Wang Qunfeng

    2011-01-01

    By analyzing the safety classification of the systems and functions of nuclear power plants based on the general design requirements for nuclear power plants, especially the requirements on availability and reliability of I&C systems, the characteristics of modern DCS technology and the I&C products currently applied in the nuclear power field are interpreted. According to the requirements on the safe operation of nuclear power plants and the regulations for safety audit, the effect of different safety classifications on DCS design in nuclear power plants is analyzed, taking into account the actual design process of different DCS solutions in the nuclear power plants under construction. (authors)

  18. Use of Ecohydraulic-Based Mesohabitat Classification and Fish Species Traits for Stream Restoration Design

    Directory of Open Access Journals (Sweden)

    John S. Schwartz

    2016-11-01

    Stream restoration practice typically relies on a geomorphological design approach in which the integration of ecological criteria is limited and generally qualitative, although the most commonly stated project objective is to restore biological integrity by enhancing habitat and water quality. Restoration has achieved mixed results in terms of ecological successes, and it is evident that improved methodologies for assessment and design are needed. A design approach is suggested for mesohabitat restoration based on a review and integration of fundamental processes associated with: (1) lotic ecological concepts; (2) applied geomorphic processes for mesohabitat self-maintenance; (3) multidimensional hydraulics and habitat suitability modeling; (4) species functional traits correlated with fish mesohabitat use; and (5) multi-stage ecohydraulics-based mesohabitat classification. Classification of the mesohabitat units demonstrated in this article was based on fish preferences specifically linked to functional trait strategies (i.e., feeding, resting, evasion, spawning, and flow refugia), recognizing that habitat preferences shift by season and flow stage. A multi-stage classification scheme developed under this premise provides the basic “building blocks” for ecological design criteria for stream restoration. The scheme was developed for Midwest US prairie streams, but the conceptual framework for mesohabitat classification and functional traits analysis can be applied to other ecoregions.

  19. Challenges of Aircraft Design Integration

    Science.gov (United States)

    2003-03-01

    Kafyeke, F.; Piperni, P.; Laurendeau, E. (Advanced Aerodynamics, Bombardier Aerospace, 400 Côte Vertu Road, Dorval, Quebec, Canada, H4S 1Y9). Abstract: The design of a modern airplane brings together many disciplines: structures, aerodynamics, controls, systems, propulsion…

  20. The challenge of integrating evidence-based design.

    Science.gov (United States)

    Martin, Caren S

    2009-01-01

    This paper discusses the integration of evidence-based design (EBD) into the design process as an innovation, illuminates the significance and progress of the diffusion of this innovation, and identifies EBD advocates and the consequences of meeting the EBD challenge. A free tool for engaging in EBD is explored. Healthcare designers are leading the EBD charge, because their clients depend on it. But not all designers engage in EBD, because it may be beyond the resources of a firm or outside its culture. However, as with other meaningful design innovations, designers who do not practice EBD could fall by the wayside. EBD is a product of the diffusion of the innovation of evidence-based medicine. The academy (i.e., the collective of institutions of higher education), design organizations, design communities, and the media all contribute to the diffusion of EBD. However, the quantity, quality, and understandability of evidence continue to challenge its broad adoption. InformeDesign®, a free, Internet-based tool, presents information to designers in a concise, understandable way. Firms must invest in EBD incrementally as a value-added component of design to meet current and future challenges. It is important for designers to realize that engaging in EBD is not a rejection of creativity, but a means by which to elevate their design solutions. ©2009 VENDOME GROUP, LLC

  1. Challenges in the automated classification of variable stars in large databases

    Directory of Open Access Journals (Sweden)

    Graham Matthew

    2017-01-01

    With ever-increasing numbers of astrophysical transient surveys, new facilities and archives of astronomical time series, time domain astronomy is emerging as a mainstream discipline. However, the sheer volume of data alone (hundreds of observations for hundreds of millions of sources) necessitates advanced statistical and machine learning methodologies for scientific discovery: characterization, categorization, and classification. Whilst these techniques are slowly entering the astronomer’s toolkit, their application to astronomical problems is not without its issues. In this paper, we review some of the challenges posed by trying to identify variable stars in large data collections, including appropriate feature representations, dealing with uncertainties, establishing ground truths, and simple discrete classes.
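
    A hedged illustration of the "appropriate feature representations" issue: irregularly sampled light curves are usually reduced to a fixed-length feature vector before any classifier is applied. The features and names below are generic examples of such a reduction, not any particular survey's pipeline.

    ```python
    import numpy as np

    def light_curve_features(t, mag):
        """A few simple features used to characterise variable-star light curves.

        The choice of features (amplitude, scatter, a crude variability index)
        is illustrative only; real pipelines use dozens of such statistics.
        """
        amp = mag.max() - mag.min()
        std = mag.std()
        # Fraction of points beyond 1 std of the median: crude variability index.
        beyond1 = np.mean(np.abs(mag - np.median(mag)) > std)
        return {"amplitude": amp, "std": std, "beyond1std": beyond1}

    # Toy periodic variable with irregular sampling and photometric noise.
    rng = np.random.default_rng(3)
    t = np.sort(rng.uniform(0, 100, 400))
    mag = 15.0 + 0.3 * np.sin(2 * np.pi * t / 5.0) + rng.normal(0, 0.05, 400)
    feats = light_curve_features(t, mag)
    ```

    Note that the vector is the same length however many observations each source has, which is what makes heterogeneous archives comparable to a single classifier.
    
    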

  2. Design, Results and Plans for Power Beaming Competitive Challenge

    International Nuclear Information System (INIS)

    Shelef, Ben

    2008-01-01

    In our context, Power Beaming refers to the extraction of useable electrical power from a directed electromagnetic beam. In order to promote interest in this technology, the Spaceward Foundation proposed and is managing a technology prize challenge based on a Space Elevator design scenario. The challenge has a prize purse of $2M, provided by NASA's Centennial Challenges office. This paper covers the considerations that went into the design of the challenge, a brief chronology of past results, and plans for the future

  3. A CNN Based Approach for Garments Texture Design Classification

    Directory of Open Access Journals (Sweden)

    S.M. Sofiqul Islam

    2017-05-01

    Identifying garment texture designs automatically for recommending fashion trends is important nowadays because of the rapid growth of online shopping. By learning the properties of images efficiently, a machine can give better classification accuracy. Several hand-engineered feature coding methods exist for identifying garment design classes. Recently, Deep Convolutional Neural Networks (CNNs) have shown better performance for different object recognition tasks. A deep CNN uses multiple levels of representation and abstraction that help a machine understand types of data more accurately. In this paper, a CNN model for identifying garment design classes is proposed. Experimental results on two different datasets show better results than two existing well-known CNN models (AlexNet and VGGNet) and some state-of-the-art hand-engineered feature extraction methods.

  4. A classification plan of design class for systems of an advanced research reactor

    International Nuclear Information System (INIS)

    Yoon, Doo Byung; Ryu, Jeong Soo

    2005-01-01

    The Advanced Research Reactor (ARR) has been under design by KAERI since 2002. The final goal of the project is to develop a new and unique research reactor model that is superior in safety and economic aspects. The conceptual design for the systems, structures, and components of the ARR will be completed by 2005, and the basic design will be performed from 2006. The ARR will be designed based on the technical experience gained from the design and operation of the HANARO. It is necessary to classify the safety class, quality class, and seismic category of the systems, structures, and components. The objective of this work is to propose a classification plan of design class for the systems, structures, and components of the ARR. To achieve this purpose, the revision status of the regulations used as criteria for determining the design class of the systems, structures, and components of the HANARO was investigated, together with the present revision status of the codes and standards utilized in the design of the HANARO. Based on these investigations, codes and standards for the design of the systems, structures, and components of the ARR are proposed. The feasibility of the proposed classification plan will be verified by performing the conceptual and basic design of the systems, structures, and components of the ARR.

  5. Classification in context

    DEFF Research Database (Denmark)

    Mai, Jens Erik

    2004-01-01

    This paper surveys the classification research literature, discusses various classification theories, and shows that the focus has traditionally been on establishing a scientific foundation for classification research. This paper argues that a shift has taken place, and suggests that contemporary classification research focuses on contextual information as the guide for the design and construction of classification schemes.

  6. Radon-technical design methods based on radon classification of the soil

    International Nuclear Information System (INIS)

    Kettunen, A.V.

    1993-01-01

    Radon-technical classification of the foundation soil divides the foundation soil into four classes: negligible, normal, high and very high. Separate radon-technical designing methods and radon-technical solutions have been developed for each class. On regions of negligible class, no specific radon-technical designing methods are needed. On regions of normal radon class, there is no need for actual radon-technical designing based on calculations, whereas existing radon-technical solutions can be used. On regions of high and very high radon class, a separate radon-technical designing should be performed in each case, where radon-technical solutions are designed so that expected value for indoor radon content is lower than the maximum allowable radon content. (orig.). (3 refs., 2 figs., 2 tabs.)

  7. Incorporating Engineering Design Challenges into STEM Courses

    Science.gov (United States)

    Householder, Daniel L., Ed.; Hailey, Christine E., Ed.

    2012-01-01

    Successful strategies for incorporating engineering design challenges into science, technology, engineering, and mathematics (STEM) courses in American high schools are presented in this paper. The developers have taken the position that engineering design experiences should be an important component of the high school education of all American youth…

  8. New challenges for data design

    CERN Document Server

    2015-01-01

    The present work provides a platform for leading data designers whose vision and creativity help us to anticipate major changes occurring in the data design field, and pre-empt the future. Each of them strives to provide new answers to the question, “What challenges await data design?” To avoid falling into too narrow a mind-set, each works hard to elucidate the breadth of data design today and to demonstrate its widespread application across a variety of business sectors. With end users in mind, designer-contributors bring to light the myriad of purposes for which the field was originally intended, forging the bond even further between data design and the aims and intentions of those who contribute to it. The first seven parts of the book outline the scope of data design and present a line-up of “viewpoints” that highlight this discipline’s main topics, and offer an in-depth look into practices boasting both foresight and imagination. The eighth and final part features a series of interviews with…

  9. Challenges and progress in turbomachinery design systems

    International Nuclear Information System (INIS)

    Van den Braembussche, R A

    2013-01-01

    This paper first describes the requirements that a modern design system should meet, followed by a comparison between design systems based on inverse-design and optimization techniques. The second part of the paper presents how these challenges are addressed in an optimization method combining an evolutionary algorithm with a metamodel. Extensions to multi-disciplinary, multi-point and multi-objective optimization are illustrated by examples.

  10. Designing a Smaller Power Inverter: the Google Littlebox Challenge - Text

    Science.gov (United States)

    Below is the text version of “Designing a Smaller Power Inverter: the Google Littlebox Challenge” (Energy Systems Integration Facility, NREL).

  11. Incorporating Engineering Design Challenges into STEM Courses

    OpenAIRE

    Householder, Daniel L.; Hailey, Christine E.

    2012-01-01

    Successful strategies for incorporating engineering design challenges into science, technology, engineering, and mathematics (STEM) courses in American high schools are presented in this paper. The developers have taken the position that engineering design experiences should be an important component of the high school education of all American youth. In most instances, these experiences in engineering design are infused into instruction programs in standards-based courses in science, technology…

  12. SAFETY BASIS DESIGN DEVELOPMENT CHALLENGES IMECE2007-42747

    Energy Technology Data Exchange (ETDEWEB)

    RYAN GW

    2007-09-24

    'Designing in Safety' is a desired part of the development of any new potentially hazardous system, process, or facility. It is a required part of nuclear safety activities as specified in U.S. Department of Energy (DOE) Order 420.1B, Facility Safety. This order addresses the design of nuclear-related facilities developed under federal regulation 10CFR830, Nuclear Safety Management. 10CFR830 requires that safety basis documentation be provided to identify how nuclear safety is being adequately addressed as a condition for system operation (e.g., the safety basis). To support the development of the safety basis, a safety analysis is performed. Although the concept of developing a design that addresses 'Safety' is simple, the execution can be complex and challenging. This paper addresses those complexities and challenges for the design activity of a system to treat sludge, a corrosion product of spent nuclear fuel, at DOE's Hanford Site in Washington State. The system being developed is referred to as the Sludge Treatment Project (STP). This paper describes the portion of the safety analysis that addresses the selection of design basis events using the experience gained from the STP and the development of design requirements for safety features associated with those events. Specifically, the paper describes the safety design process and the application of the process for two types of potential design basis accidents associated with the operation of the system: (1) flashing spray leaks and (2) splash and splatter leaks. Also presented are the technical challenges that are being addressed to develop effective safety features to deal with these design basis accidents.

  13. SAFETY BASIS DESIGN DEVELOPMENT CHALLENGES IMECE2007-42747

    International Nuclear Information System (INIS)

    RYAN GW

    2007-01-01

    'Designing in Safety' is a desired part of the development of any new potentially hazardous system, process, or facility. It is a required part of nuclear safety activities as specified in U.S. Department of Energy (DOE) Order 420.1B, Facility Safety. This order addresses the design of nuclear-related facilities developed under federal regulation 10CFR830, Nuclear Safety Management. 10CFR830 requires that safety basis documentation be provided to identify how nuclear safety is being adequately addressed as a condition for system operation (e.g., the safety basis). To support the development of the safety basis, a safety analysis is performed. Although the concept of developing a design that addresses 'Safety' is simple, the execution can be complex and challenging. This paper addresses those complexities and challenges for the design activity of a system to treat sludge, a corrosion product of spent nuclear fuel, at DOE's Hanford Site in Washington State. The system being developed is referred to as the Sludge Treatment Project (STP). This paper describes the portion of the safety analysis that addresses the selection of design basis events using the experience gained from the STP and the development of design requirements for safety features associated with those events. Specifically, the paper describes the safety design process and the application of the process for two types of potential design basis accidents associated with the operation of the system: (1) flashing spray leaks and (2) splash and splatter leaks. Also presented are the technical challenges that are being addressed to develop effective safety features to deal with these design basis accidents.

  14. Challenges in IC design for hearing aids

    DEFF Research Database (Denmark)

    Jørgensen, Ivan Harald Holger

    2012-01-01

    Designing modern hearing aids is a formidable challenge. The size of hearing aids is constantly decreasing, making them virtually invisible today. Still, as in all other modern electronics, more and more features are added to these devices, driven by the development in modern IC technology. The demands for performance and features at very low supply voltage and power consumption constantly prove a challenge to the physical design of hearing aids, and not least to the design of the ICs for these. As a result, all large hearing aid manufacturers use fully customized ASICs in their products to produce a competitive advantage. This presentation will give a brief insight into the hearing aid market and industry, a brief view of the historic development of hearing aids, and an introduction to how a modern hearing aid is constructed, showing the amplifier as the key component in the modern hearing aid…

  15. Possibilities and Challenges designing low-carbon-energy technologies

    DEFF Research Database (Denmark)

    Bjarklev, Araceli

    Though there is broad consensus that one of the solutions to the current environmental challenge will be based on the use of low-carbon technologies, and even though there is great potential to turn to more sustainable design and innovation, there are several elements that need to be taken into account. This paper takes low-carbon illumination technology as its study object and discusses the question: What are the main possibilities and challenges when designing low-carbon illumination technologies? To answer this question, we use a systemic approach including environmental, economic, energy and political issues, using relevant concepts from the Ecological…

  16. Four categories of design challenges to building game-based business

    DEFF Research Database (Denmark)

    Henriksen, Thomas Duus; Harpelund, Christian

    2014-01-01

    Building a business on the basis of designing and selling learning games is seldom a straightforward task. Often, such a project involves a diversity of competencies for handling a wide variety of challenges. On the basis of a longitudinal study of the game ChangeSetter, this chapter proposes a four-category approach to understanding such challenges. The four categories include: 1) the learning game design; 2) the didactic design of how the game is to be used; 3) the organisational design for establishing both supply and demand; and finally 4) the business design, which concerns the establishment of a business model that ensures continual rather than incidental income. While the four categories can be used for understanding the various challenges and the competencies they prompt for, the key argument of the chapter is to start with the business design, as it is likely to cause extensive iterations…

  17. Power-Efficient Design Challenges

    Science.gov (United States)

    Pangrle, Barry

    …significant gains can be realized and why power-efficiency requirements will continue to challenge designers into the future. Despite new process technologies, the future will continue to rely on innovative design approaches.

  18. Pattern Classification with Memristive Crossbar Circuits

    Science.gov (United States)

    2016-03-31

    Strukov, Dmitri B. (Department of Electrical and Computer Engineering, UC Santa…). Keywords: pattern classification; deep learning; convolutional neural networks. Deep-learning convolutional neural networks (DLCNN), which…the best classification performances on a variety of benchmark tasks [1]. The major challenge in building fast and energy-efficient networks of this…

  19. Challenges of designing fusion reactors for remote maintainability

    International Nuclear Information System (INIS)

    Masson, L.S.

    1981-01-01

    One of the major problems faced by the fusion community is the development of the high level of reliability required to assure that fusion will be a viable commercial power source. Much of the responsibility for solving this problem falls directly on the designer: in the near term, in developing concepts that have a high level of maintainability for the next-generation, engineering-oriented reactors; and in the long range, in developing full maintainability for the more complicated commercial concepts with their required high level of on-line time. The near-term challenge will include the development of unique design concepts to perform inspection, maintenance, replacement, and testing under the stringent conditions imposed by the next-generation, engineering-oriented machines. The long-range challenge will focus on basic design concepts that will enable the full maintainability required by commercial fusion. In addition to the purely technical challenges, the fusion community is also faced with the problem of developing programmatic means to assure that reactor maintenance issues are given proper and timely emphasis as the nuclear phase of fusion is approached

  20. Classification of refrigerants; Classification des fluides frigorigenes

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-07-01

    This document is based on the US standard ANSI/ASHRAE 34, published in 2001 and entitled 'Designation and safety classification of refrigerants'. This classification organizes, in an internationally consistent way, all refrigerants used in the world through a codification corresponding to their chemical composition. This note explains the codification: prefix, suffixes (hydrocarbons and derived fluids, azeotropic and non-azeotropic mixtures, various organic compounds, non-organic compounds), and the safety classification (toxicity, flammability, case of mixtures). (J.S.)
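
    For illustration, the codification for the methane/ethane/propane series assigns the digits (C−1)(H+1)(F), with the remaining bonds of the saturated molecule filled by chlorine. A short sketch of this rule follows; isomer suffixes such as the 'a' in R-134a are outside its scope, and the function name is mine, not the standard's.

    ```python
    def refrigerant_number(carbons, hydrogens, fluorines):
        """ASHRAE-34-style number for methane/ethane/propane-series refrigerants.

        Digits are (C-1)(H+1)(F); any remaining bonds are assumed chlorine.
        Isomer suffixes (a, b, ...) are not handled by this sketch.
        """
        bonds = 2 * carbons + 2                  # saturated molecule
        chlorines = bonds - hydrogens - fluorines
        assert chlorines >= 0, "impossible composition"
        n = (carbons - 1) * 100 + (hydrogens + 1) * 10 + fluorines
        return f"R-{n}", chlorines

    print(refrigerant_number(1, 1, 2))   # CHClF2  -> ('R-22', 1)
    print(refrigerant_number(2, 2, 4))   # C2H2F4  -> ('R-134', 0); R-134a is an isomer
    ```

    Reading the rule backwards decodes a designation: R-12 gives C=1, H=0, F=2, hence CCl2F2.
    
    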

  1. Addressing the challenges of patient-centred design

    Directory of Open Access Journals (Sweden)

    Karen LaBat

    2009-11-01

    Patient-centred design is a relatively new term, but a longstanding concept in clinical practice. This discussion looks at patient-centred design and explores its relationships to universal design, user-centred design and the newer human-centred design. It also explores why interdisciplinary approaches are needed for patient-centred design and how interdisciplinary collaboration works to address the challenges of patient-centred design. Successful patient-centred solutions can grow from collaborations that include shared visions; understanding of both the nature and degree of variation in the patient, materials, and the designed solution; clear, regular communication among all parties with careful definition of terms; and respect for the inherent cultures of all disciplines involved.

  2. iPad Learning Ecosystem: Developing Challenge-Based Learning Using Design Thinking

    Science.gov (United States)

    Marin, Catalina; Hargis, Jace; Cavanaugh, Cathy

    2013-01-01

    In order to maximize college English language students' learning, product development, 21st Century skills and engagement with real world meaningful challenges, a course was designed to integrate Challenge Based Learning (CBL) and iPad mobile learning technology. This article describes the course design, which was grounded in design thinking, and…

  3. The Challenges of Designing Digital Services for Multiple Mobile Platforms

    DEFF Research Database (Denmark)

    Ghazawneh, Ahmad

    2016-01-01

    The value of digital services is increasingly recognized by owners of digital platforms. These services have a central role in building and sustaining the business of the digital platform. In order to sustain the design of digital services, owners of digital platforms encourage third-party developers to tap into and join the digital ecosystem. However, while there is an emerging literature on designing digital services, little empirical evidence exists about the challenges faced by third-party developers while designing digital services, in particular for multiple mobile platforms. Drawing on a multiple case study of three mobile application development firms from Sweden, Denmark and Norway, we synthesize a digital service design taxonomy to understand the challenges faced by third-party developers. Our study identifies a set of challenges at four different levels: user level, platform level…

  4. Powering the Future: A Wind Turbine Design Challenge

    Science.gov (United States)

    Pries, Caitlin Hicks; Hughes, Julie

    2011-01-01

    Nothing brings out the best in eighth-grade physical science students quite like an engineering challenge. The wind turbine design challenge described in this article has proved to be a favorite among students with its focus on teamwork and creativity and its (almost) sneaky reinforcement of numerous physics concepts. For this activity, pairs of…

  5. Challenges to Deploy Service Design in Organizations

    DEFF Research Database (Denmark)

    Akasaka, Fumiya; Ohno, Takehiko; Jensen, Mika Yasuoka

    2016-01-01

    More and more companies are applying service design approaches to develop services and products. Not every project, however, has achieved its goals. In many cases, difficulties arise from organizational issues such as organization structure and evaluation systems. In this research, we held workshops where success and failure factors of service design projects in organizations were presented. By analysing the results, we construct a model that explains the "difficulties of deploying the service design approach in organization." On the basis of the model, this paper discusses the challenges to the deployment of the service design approach in organizations.

  6. Advanced Steel Microstructural Classification by Deep Learning Methods.

    Science.gov (United States)

    Azimi, Seyed Majid; Britz, Dominik; Engstler, Michael; Fritz, Mario; Mücklich, Frank

    2018-02-01

    The inner structure of a material is called its microstructure. It stores the genesis of a material and determines all its physical and chemical properties. While microstructural characterization is widespread and well known, microstructural classification is mostly done manually by human experts, which gives rise to uncertainties due to subjectivity. Since the microstructure can be a combination of different phases or constituents with complex substructures, its automatic classification is very challenging and only a few prior studies exist. Prior works relied on features designed and engineered by experts and classified microstructures separately from the feature extraction step. Recently, Deep Learning methods have shown strong performance in vision applications by learning the features from data together with the classification step. In this work, we propose a Deep Learning method for microstructural classification in the examples of certain microstructural constituents of low carbon steel. This novel method employs pixel-wise segmentation via a Fully Convolutional Neural Network (FCNN) accompanied by a max-voting scheme. Our system achieves 93.94% classification accuracy, drastically outperforming the state-of-the-art method's 48.89% accuracy. Beyond the strong performance of our method, this line of research offers a more robust and, above all, objective way for the difficult task of steel quality appreciation.
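The max-voting step described above is simple to state concretely: after pixel-wise segmentation, a region is assigned the majority class of its pixels. A minimal sketch on toy data (not the authors' FCNN pipeline; `max_vote_classify` is a hypothetical helper):

```python
import numpy as np

def max_vote_classify(pixel_labels, object_mask):
    # Majority vote over the pixel-wise class labels inside the mask.
    votes = pixel_labels[object_mask]
    classes, counts = np.unique(votes, return_counts=True)
    return int(classes[np.argmax(counts)])

# Toy 4x4 segmentation map with two constituent classes (0 and 1).
labels = np.array([[0, 0, 1, 1],
                   [0, 0, 1, 1],
                   [0, 1, 1, 1],
                   [0, 0, 0, 0]])
region = np.ones_like(labels, dtype=bool)   # treat the whole map as one region
majority = max_vote_classify(labels, region)
print(majority)  # → 0 (nine 0-pixels vs. seven 1-pixels)
```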

  7. Climate classification for the simulation of thermally activated building systems (TABS)

    DEFF Research Database (Denmark)

    Behrendt, Benjamin; Christensen, Jørgen Erik

    2013-01-01

    alternative (sustainable) energy sources that would otherwise be insufficient. The design of TABS is, however, challenging and most often requires a complete simulation of the building. The standard ISO 11855-4 (2011) suggests a simplified sizing method for TABS; its results, however, omit condensation risk entirely. The proposed climate classification should fill this gap by providing the missing data in a simple manner.

  8. Barriers and Challenges in the Integrated Design Process Approach

    DEFF Research Database (Denmark)

    Knudstrup, Mary-Ann

    2006-01-01

    ABSTRACT: In the future, it will be a huge challenge to make sustainable building design by using a more holistic and innovative approach in order to reduce the use of energy for heating and cooling in new building projects. This is seen in the perspective of the Kyoto agreement for reducing global warming. This paper will briefly present the method of the Integrated Design Process, IDP [1]. It describes the background and means for developing a new method for designing integrated architecture in an interdisciplinary approach between architecture and engineering. It also describes the barriers and the challenges that must be overcome when trying to cross the borders between the two fields of engineering and architecture to design sustainable architecture.

  9. Event Classification using Concepts

    NARCIS (Netherlands)

    Boer, M.H.T. de; Schutte, K.; Kraaij, W.

    2013-01-01

    The semantic gap is one of the challenges in the GOOSE project. In this paper a Semantic Event Classification (SEC) system is proposed as an initial step in tackling the semantic gap challenge in the GOOSE project. This system uses semantic text analysis, multiple feature detectors using the BoW

  10. Active Learning for Text Classification

    OpenAIRE

    Hu, Rong

    2011-01-01

    Text classification approaches are used extensively to solve real-world challenges. The success or failure of text classification systems hangs on the datasets used to train them; without a good dataset it is impossible to build a quality system. This thesis examines the applicability of active learning in text classification for the rapid and economical creation of labelled training data. Four main contributions are made in this thesis. First, we present two novel selection strategies to cho...
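Uncertainty sampling is the textbook baseline that active-learning selection strategies are usually measured against: label the pool items the current model is least confident about. A hedged sketch (illustrative only, not the thesis's novel strategies; `uncertainty_sample` is a hypothetical helper):

```python
import numpy as np

def uncertainty_sample(probs, k):
    # Pick the k pool items whose predicted positive-class probability
    # is closest to 0.5, i.e. where the classifier is least confident.
    margin = np.abs(probs - 0.5)
    return np.argsort(margin)[:k]

# Predicted probabilities for five unlabelled documents.
pool_probs = np.array([0.95, 0.48, 0.10, 0.52, 0.70])
picked = uncertainty_sample(pool_probs, 2)
print(picked)  # indices of the two most uncertain documents
```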

  11. Teaching Strategies to Promote Concept Learning by Design Challenges

    Science.gov (United States)

    Van Breukelen, Dave; Van Meel, Adrianus; De Vries, Marc

    2017-01-01

    Background: This study is the second study of a design-based research, organised around four studies, that aims to improve student learning, teaching skills and teacher training concerning the design-based learning approach called Learning by Design (LBD). Purpose: LBD uses the context of design challenges to learn, among other things, science.…

  12. High-End Computing Challenges in Aerospace Design and Engineering

    Science.gov (United States)

    Bailey, F. Ronald

    2004-01-01

    High-End Computing (HEC) has had significant impact on aerospace design and engineering and is poised to make even more in the future. In this paper we describe four aerospace design and engineering challenges: Digital Flight, Launch Simulation, Rocket Fuel System and Digital Astronaut. The paper discusses modeling capabilities needed for each challenge and presents projections of future near- and far-term HEC computing requirements. NASA's HEC Project Columbia is described and programming strategies presented that are necessary to achieve high real performance.

  13. Challenges of designing fusion reactors for remote maintainability

    International Nuclear Information System (INIS)

    Mason, L.S.

    1981-01-01

    One of the major problems faced by the fusion community is the development of the high level of reliability required to assure that fusion will be a viable commercial power source. Much of the responsibility for solving this problem falls directly on the designer in developing concepts that have a high level of maintainability. The problems are both near-term, in developing maintainability for next generation engineering oriented reactors, and long range, in developing full maintainability for the more commercial concepts with their required high level of on-line time. The near-term challenge will include development of unique design concepts to perform inspection, maintenance, replacement, and testing under the stringent conditions imposed by the next generation engineering oriented machines. The long range challenge will focus on basic design concepts that will enable the full maintainability required by commercial fusion.

  14. Biomimetics as a design methodology – possibilities and challenges

    DEFF Research Database (Denmark)

    Lenau, Torben Anker

    2009-01-01

    Biomimetics – or bionik as it is called in parts of Europe – offers a number of promising opportunities and challenges for the designer. The paper investigates how biomimetics as a design methodology is used in engineering design by looking at examples of biological searches and highlight...

  15. Design challenges for large Arctic crude oil tanker

    International Nuclear Information System (INIS)

    Iyerusalimskiy, A.; Noble, P.

    2008-01-01

    The Vasily Dinkov vessel was delivered by the Samsung Heavy Industries shipyard to Russian ship-owner Sovcomflot. It is the largest icebreaking tanker ever built. The vessel was designed and constructed to transport crude oil from the Varandey offshore terminal in the southeastern Barents Sea to a transshipment location near Murmansk, Russia. The vessel is under long-term charter for Naryanmarneftegas, a joint venture of Lukoil and ConocoPhillips. The new ship was constructed strictly to the requirements, specification, and concept design provided by the charterer. The Varandey oil transportation concept and the vessel operational profile resulted in some conflicting requirements, compromising technical solutions, and assumptions yet to be proven in operation. This paper described the design challenges and the approaches selected to address the tanker's key design elements. These included the ice transit and other Arctic environmental challenges; open water performance issues; and icebreaking hull structure design challenges associated with modern shipbuilding technology standards and cost efficiency. The principal characteristics of the Vasily Dinkov were first presented and the Varandey crude oil transportation system was also described. Several features have made the Vasily Dinkov the most advanced icebreaking tanker to date, such as the icebreaking concept, which has expanded the capability of both traditional icebreaking ships fitted with the icebreaker bow and double acting ships intended to operate astern only in the ice; the largest azimuthal twin screw propulsion plant for the Arctic, with the highest ice torque capacity ever specified for a cargo vessel; and the first customized, automated, asymmetric steering control system designed to improve open water maneuverability and steering stability of podded vessels. It was concluded that the transportation system, the overall vessel concept and many of the vessel features require validation based on operational

  16. Achievements and Challenges in Computational Protein Design.

    Science.gov (United States)

    Samish, Ilan

    2017-01-01

    Computational protein design (CPD), a yet evolving field, includes computer-aided engineering for partial or full de novo designs of proteins of interest. Designs are defined by a requested structure, function, or working environment. This chapter describes the birth and maturation of the field by presenting 101 CPD examples in a chronological order emphasizing achievements and pending challenges. Integrating these aspects presents the plethora of CPD approaches with the hope of providing a "CPD 101". These reflect on the broader structural bioinformatics and computational biophysics field and include: (1) integration of knowledge-based and energy-based methods, (2) hierarchical designated approach towards local, regional, and global motifs and the integration of high- and low-resolution design schemes that fit each such region, (3) systematic differential approaches towards different protein regions, (4) identification of key hot-spot residues and the relative effect of remote regions, (5) assessment of shape-complementarity, electrostatics and solvation effects, (6) integration of thermal plasticity and functional dynamics, (7) negative design, (8) systematic integration of experimental approaches, (9) objective cross-assessment of methods, and (10) successful ranking of potential designs. Future challenges also include dissemination of CPD software to the general use of life-sciences researchers and the emphasis of success within an in vivo milieu. CPD increases our understanding of protein structure and function and the relationships between the two along with the application of such know-how for the benefit of mankind. Applied aspects range from biological drugs, via healthier and tastier food products to nanotechnology and environmentally friendly enzymes replacing toxic chemicals utilized in the industry.

  17. Multidisciplinary design optimization of large wind turbines—Technical, economic, and design challenges

    International Nuclear Information System (INIS)

    Ashuri, Turaj; Zaaijer, Michiel B.; Martins, Joaquim R.R.A.; Zhang, Jie

    2016-01-01

    Highlights: • 5, 10 and 20 MW wind turbines are developed using multidisciplinary design optimization. • Technical feasibility and economy of large wind turbines are investigated. • Critical upscaling trends of existing wind turbines are presented up to 20 MW. • Design challenges of large wind turbines are identified, and design solutions proposed. • With no design innovation, upscaling of existing turbines will increase the costs. - Abstract: Wind energy has experienced a continuous cost reduction in the last decades. A popular cost reduction technique is to increase the rated power of the wind turbine by making it larger. However, it is not clear whether further upscaling of existing wind turbines beyond the 5–7 MW range is technically feasible and economically attractive. To address this question, this study uses 5, 10, and 20 MW wind turbines that were developed using multidisciplinary design optimization as upscaling data points. These wind turbines are upwind, 3-bladed, pitch-regulated, variable-speed machines with a tubular tower. Based on the design data and properties of these wind turbines, scaling trends such as loading, mass, and cost are developed. These trends are used to study the technical and economic aspects of upscaling and its impact on the design and cost. The results of this research show the technical feasibility of existing wind turbines up to 20 MW, but the design of such an upscaled machine is cost prohibitive. Mass increase of the rotor is identified as a main design challenge to overcome. The results of this research support the development of alternative lightweight materials and design concepts, such as a two-bladed downwind design, for upscaling to remain a cost-effective solution for future wind turbines.
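The rotor-mass challenge the abstract identifies follows from the classical square-cube law: rated power grows with swept area (roughly R²) while blade mass grows with volume (roughly R³), so mass per megawatt worsens linearly with scale. A toy calculation under those idealized exponents (real designs deviate, which is exactly what the optimization study quantifies):

```python
# Idealized square-cube scaling: power ~ R^2 (swept area), mass ~ R^3 (volume).
def upscale(power_mw, scale):
    """Return (rated power, relative rotor mass) after scaling radius by `scale`."""
    return power_mw * scale**2, scale**3

p, m = upscale(5.0, 2.0)   # double the rotor radius of a 5 MW reference machine
print(p, m)  # → 20.0 8.0 : 4x the power but 8x the rotor mass
```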

  18. Planning pesticides usage for herbal and animal pests based on intelligent classification system with image processing and neural networks

    Directory of Open Access Journals (Sweden)

    Dimililer Kamil

    2018-01-01

    Full Text Available In agriculture, pests are divided into two groups, herbal and animal, and their detection and the use of minimal pesticides are quite challenging tasks. Over the last three decades, researchers have been improving their studies on these matters. Therefore, effective, efficient, as well as intelligent systems are designed and modelled. In this paper, an intelligent classification system is designed for detecting pests as herbal or animal so that proper pesticides can be used accordingly. The designed system comprises two main stages. First, images are processed using different image processing techniques, since the images have specific distinguishing geometric patterns. The second stage is the neural network phase for classification. A backpropagation neural network is used for training and testing with the processed images. The system is tested, and experimental results show an efficient and effective classification rate. Autonomy and time efficiency in pesticide usage are also discussed.
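The backpropagation stage can be sketched on synthetic features; the feature vectors and labels below are stand-ins for the paper's processed geometric patterns, not its data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the processed geometric features (assumption: the
# image-processing stage yields a fixed-length feature vector per image).
X = rng.normal(size=(200, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(float)   # 0 = herbal, 1 = animal (toy labels)

# One-hidden-layer network trained with plain backpropagation.
W1 = rng.normal(scale=0.5, size=(4, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(500):
    h = np.tanh(X @ W1 + b1)                 # forward pass
    p = sigmoid(h @ W2 + b2).ravel()
    dz2 = (p - y)[:, None] / len(X)          # grad of mean cross-entropy wrt logits
    dW2, db2 = h.T @ dz2, dz2.sum(0)
    dh = (dz2 @ W2.T) * (1.0 - h ** 2)       # backprop through tanh
    dW1, db1 = X.T @ dh, dh.sum(0)
    W1 -= 0.5 * dW1; b1 -= 0.5 * db1         # gradient step, lr = 0.5
    W2 -= 0.5 * dW2; b2 -= 0.5 * db2

p = sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2).ravel()
acc = float(((p > 0.5) == y).mean())
print(f"training accuracy: {acc:.2f}")
```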

  19. How well do the rosgen classification and associated "natural channel design" methods integrate and quantify fluvial processes and channel response?

    Science.gov (United States)

    Simon, A.; Doyle, M.; Kondolf, M.; Shields, F.D.; Rhoads, B.; Grant, G.; Fitzpatrick, F.; Juracek, K.; McPhillips, M.; MacBroom, J.

    2005-01-01

    Over the past 10 years the Rosgen classification system and its associated methods of "natural channel design" have become synonymous (to many without prior knowledge of the field) with the term "stream restoration" and the science of fluvial geomorphology. Since the mid 1990s, this classification approach has become widely, and perhaps dominantly, adopted by governmental agencies, particularly those funding restoration projects. For example, in a request for proposals for the restoration of Trout Creek in Montana, the Natural Resources Conservation Service required "experience in the use and application of a stream classification system and its implementation." Similarly, classification systems have been used in evaluation guides for riparian areas and U.S. Forest Service management plans. Most notably, many highly trained geomorphologists and hydraulic engineers are often held suspect, or even thought incorrect, if their approach does not include reference to or application of a classification system. This, combined with the para-professional training provided by some involved in "natural channel design", empowers individuals and groups with limited backgrounds in stream and watershed sciences to engineer wholesale re-patterning of stream reaches using 50-year-old technology that was never intended for engineering design. At Level I, the Rosgen classification system consists of eight or nine major stream types, based on hydraulic-geometry relations and four other measures of channel shape to distinguish the dimensions of alluvial stream channels as a function of the bankfull stage. Six classes of the particle size of the boundary sediments are used to further sub-divide each of the major stream types, resulting in 48 or 54 stream types. Aside from the difficulty in identifying bankfull stage, particularly in incising channels, and the issue of sampling from two distinct populations (beds and banks) to classify the boundary sediments, the classification provides a

  20. Challenges in designing interactive systems for emergency response

    DEFF Research Database (Denmark)

    Kristensen, Margit; Kyng, Morten; Nielsen, Esben Toftdahl

    2007-01-01

    This paper presents research on participatory design of interactive systems for emergency response. We present the work by going through the design method with a focus on the new elements that we developed for the participatory design toolkit; in particular we emphasize the use of challenges and visions as ways to bridge between fieldwork and literature studies on the one hand and the emerging computer based prototypes on the other. Our case concerns design of innovative interactive systems for support in emergency response, including patient identification and monitoring as well as construction...

  1. 75 FR 33169 - Dental Devices: Classification of Dental Amalgam, Reclassification of Dental Mercury, Designation...

    Science.gov (United States)

    2010-06-11

    .... FDA-2008-N-0163] (formerly Docket No. 2001N-0067) RIN 0910-AG21 Dental Devices: Classification of Dental Amalgam, Reclassification of Dental Mercury, Designation of Special Controls for Dental Amalgam... the Federal Register of August 4, 2009 (74 FR 38686) which classified dental amalgam as a class II...

  2. Supernova Photometric Lightcurve Classification

    Science.gov (United States)

    Zaidi, Tayeb; Narayan, Gautham

    2016-01-01

    This is a preliminary report on photometric supernova classification. We first explore the properties of supernova light curves, and attempt to restructure the unevenly sampled and sparse data from assorted datasets to allow for processing and classification. The data was primarily drawn from the Dark Energy Survey (DES) simulated data, created for the Supernova Photometric Classification Challenge. This poster shows a method for producing a non-parametric representation of the light curve data, and applying a Random Forest classifier algorithm to distinguish between supernova types. We examine the impact of Principal Component Analysis to reduce the dimensionality of the dataset, for future classification work. The classification code will be used in a stage of the ANTARES pipeline, created for use on the Large Synoptic Survey Telescope alert data and other wide-field surveys. The final figure-of-merit for the DES data in the r band was 60% for binary classification (Type I vs II). Zaidi was supported by the NOAO/KPNO Research Experiences for Undergraduates (REU) Program, which is funded by the National Science Foundation Research Experiences for Undergraduates Program (AST-1262829).
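The Random Forest step can be illustrated with scikit-learn on synthetic feature vectors (a stand-in for the non-parametric light-curve representation; not the DES challenge data):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)

# Hypothetical fixed-length light-curve features (e.g. a smoothed curve
# sampled on a common grid); the two classes mimic "Type I" vs "Type II".
n, d = 300, 10
X = np.vstack([rng.normal(0.0, 1.0, size=(n, d)),
               rng.normal(1.5, 1.0, size=(n, d))])
y = np.array([0] * n + [1] * n)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X[::2], y[::2])               # train on even rows
acc = clf.score(X[1::2], y[1::2])     # evaluate on held-out odd rows
print(f"held-out accuracy: {acc:.2f}")
```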

  3. Low Dimensional Representation of Fisher Vectors for Microscopy Image Classification.

    Science.gov (United States)

    Song, Yang; Li, Qing; Huang, Heng; Feng, Dagan; Chen, Mei; Cai, Weidong

    2017-08-01

    Microscopy image classification is important in various biomedical applications, such as cancer subtype identification and protein localization for high content screening. To achieve automated and effective microscopy image classification, the representative and discriminative capability of image feature descriptors is essential. To this end, in this paper, we propose a new feature representation algorithm to facilitate automated microscopy image classification. In particular, we incorporate Fisher vector (FV) encoding with multiple types of local features that are handcrafted or learned, and we design a separation-guided dimension reduction method to reduce the descriptor dimension while increasing its discriminative capability. Our method is evaluated on four publicly available microscopy image data sets of different imaging types and applications, including the UCSB breast cancer data set, the MICCAI 2015 CBTC challenge data set, and the IICBU malignant lymphoma and RNAi data sets. Our experimental results demonstrate the advantage of the proposed low-dimensional FV representation, showing consistent performance improvement over the existing state of the art and the commonly used dimension reduction techniques.
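As a familiar point of comparison for supervised dimension reduction, linear discriminant analysis also compresses descriptors while using class labels to preserve separation; the paper's separation-guided method is its own algorithm, so the sketch below is only a stand-in on synthetic descriptors:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(2)

# Stand-in for high-dimensional encoded descriptors (not real FVs).
X = rng.normal(size=(120, 50))
y = np.repeat([0, 1, 2], 40)
X[y == 1] += 1.0   # shift class means so the classes are separable
X[y == 2] -= 1.0

lda = LinearDiscriminantAnalysis(n_components=2)   # 50-D -> 2-D, label-aware
Z = lda.fit_transform(X, y)
print(Z.shape)  # → (120, 2)
```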

  4. A Java-based tool for the design of classification microarrays

    Directory of Open Access Journals (Sweden)

    Broschat Shira L

    2008-08-01

    Full Text Available Abstract Background Classification microarrays are used for purposes such as identifying strains of bacteria and determining genetic relationships to understand the epidemiology of an infectious disease. For these cases, mixed microarrays, which are composed of DNA from more than one organism, are more effective than conventional microarrays composed of DNA from a single organism. Selection of probes is a key factor in designing successful mixed microarrays because redundant sequences are inefficient and limited representation of diversity can restrict application of the microarray. We have developed a Java-based software tool, called PLASMID, for use in selecting the minimum set of probe sequences needed to classify different groups of plasmids or bacteria. Results The software program was successfully applied to several different sets of data. The utility of PLASMID was illustrated using existing mixed-plasmid microarray data as well as data from a virtual mixed-genome microarray constructed from different strains of Streptococcus. Moreover, use of data from expression microarray experiments demonstrated the generality of PLASMID. Conclusion In this paper we describe a new software tool for selecting a set of probes for a classification microarray. While the tool was developed for the design of mixed microarrays–and mixed-plasmid microarrays in particular–it can also be used to design expression arrays. The user can choose from several clustering methods (including hierarchical, non-hierarchical, and a model-based genetic algorithm), several probe ranking methods, and several different display methods. A novel approach is used for probe redundancy reduction, and probe selection is accomplished via stepwise discriminant analysis. Data can be entered in different formats (including Excel and comma-delimited text), and dendrogram, heat map, and scatter plot images can be saved in several different formats (including jpeg and tiff). Weights

  5. A Java-based tool for the design of classification microarrays.

    Science.gov (United States)

    Meng, Da; Broschat, Shira L; Call, Douglas R

    2008-08-04

    Classification microarrays are used for purposes such as identifying strains of bacteria and determining genetic relationships to understand the epidemiology of an infectious disease. For these cases, mixed microarrays, which are composed of DNA from more than one organism, are more effective than conventional microarrays composed of DNA from a single organism. Selection of probes is a key factor in designing successful mixed microarrays because redundant sequences are inefficient and limited representation of diversity can restrict application of the microarray. We have developed a Java-based software tool, called PLASMID, for use in selecting the minimum set of probe sequences needed to classify different groups of plasmids or bacteria. The software program was successfully applied to several different sets of data. The utility of PLASMID was illustrated using existing mixed-plasmid microarray data as well as data from a virtual mixed-genome microarray constructed from different strains of Streptococcus. Moreover, use of data from expression microarray experiments demonstrated the generality of PLASMID. In this paper we describe a new software tool for selecting a set of probes for a classification microarray. While the tool was developed for the design of mixed microarrays-and mixed-plasmid microarrays in particular-it can also be used to design expression arrays. The user can choose from several clustering methods (including hierarchical, non-hierarchical, and a model-based genetic algorithm), several probe ranking methods, and several different display methods. A novel approach is used for probe redundancy reduction, and probe selection is accomplished via stepwise discriminant analysis. Data can be entered in different formats (including Excel and comma-delimited text), and dendrogram, heat map, and scatter plot images can be saved in several different formats (including jpeg and tiff). 
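The stepwise discriminant analysis used for probe selection can be approximated by greedy forward selection: repeatedly add the probe whose inclusion best separates the classes. The scoring rule below is a simplified toy criterion, not PLASMID's implementation:

```python
import numpy as np

def forward_select(X, y, n_probes):
    # Greedy forward selection: at each step add the feature (probe)
    # maximizing between-class mean separation relative to overall spread.
    chosen = []
    for _ in range(n_probes):
        best, best_score = None, -np.inf
        for j in range(X.shape[1]):
            if j in chosen:
                continue
            cols = chosen + [j]
            m0 = X[y == 0][:, cols].mean(axis=0)
            m1 = X[y == 1][:, cols].mean(axis=0)
            score = np.sum((m0 - m1) ** 2) / (X[:, cols].var() + 1e-9)
            if score > best_score:
                best, best_score = j, score
        chosen.append(best)
    return chosen

rng = np.random.default_rng(3)
X = rng.normal(size=(60, 6))          # 60 samples x 6 candidate probes
y = np.array([0] * 30 + [1] * 30)
X[y == 1, 2] += 3.0                   # probe 2 is the informative one
chosen = forward_select(X, y, 2)
print(chosen)                         # probe 2 should be picked first
```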
Weights generated using stepwise discriminant analysis can be stored for

  6. Bosniak Classification system

    DEFF Research Database (Denmark)

    Graumann, Ole; Osther, Susanne Sloth; Karstoft, Jens

    2014-01-01

    Background: The Bosniak classification is a diagnostic tool for the differentiation of cystic changes in the kidney. The process of categorizing renal cysts may be challenging, involving a series of decisions that may affect the final diagnosis and clinical outcome, such as surgical management. Purpose: To investigate the inter- and intra-observer agreement among experienced uroradiologists when categorizing complex renal cysts according to the Bosniak classification. Material and Methods: The original categories of 100 cystic renal masses were chosen as "Gold Standard" (GS), established... According to the calculated weighted κ, all readers performed "very good" for both inter-observer and intra-observer variation. Most variation was seen in cysts categorized as Bosniak II, IIF, and III. These results show that radiologists who evaluate complex renal cysts routinely may apply the Bosniak classification...

  7. 32 CFR 2400.34 - Classification.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 6 2010-07-01 2010-07-01 false Classification. 2400.34 Section 2400.34 National... Government Information § 2400.34 Classification. (a) Foreign government information classified by a foreign government or international organization of governments shall retain its original classification designation...

  8. Deep water challenges for drilling rig design

    Energy Technology Data Exchange (ETDEWEB)

    Roth, M [Transocean Sedco Forex, Houston, TX (United States)

    2001-07-01

    Drilling rigs designed for deep water must meet specific design considerations for harsh environments. The early lessons for rig design came from experiences in the North Sea. Rig efficiency and safety considerations must include structural integrity, isolated/redundant ballast controls, triple redundant DP systems, enclosed heated work spaces, and automated equipment such as bridge cranes, pipe handling gear, offline capabilities, subsea tree handling, and computerized drill floors. All components must be designed to harmonize man and machine. Some challenges which are unique to Eastern Canada include frequent storms and fog, cold temperature, icebergs, rig ice, and difficult logistics. This PowerPoint presentation described station keeping and mooring issues in terms of dynamic positioning. The environmental influence on riser management during forced disconnects was also described. Design issues for connected deep water risers must ensure elastic stability and control deflected shape. The design must also keep stresses within acceptable limits. Codes and standards for stress limits, flex joints and tension were also presented. tabs., figs.

  9. Extension classification method for low-carbon product cases

    Directory of Open Access Journals (Sweden)

    Yanwei Zhao

    2016-05-01

    Full Text Available In product low-carbon design, intelligent decision systems integrated with certain classification algorithms recommend existing design cases to designers. However, these systems mostly depend on prior experience, and product designers not only expect to get a satisfactory case from an intelligent system but also hope to receive assistance in modifying unsatisfactory cases. In this article, we propose a new categorization method composed of static and dynamic classification based on extension theory. This classification method can be integrated into a case-based reasoning system to obtain accurate classification results and to inform designers of detailed information about unsatisfactory cases. First, we establish the static classification model for cases by a dependent function in a hierarchical structure. Then, for dynamic classification, we make transformations for cases based on the case model, attributes, attribute values, and dependent function, so that cases can undergo qualitative changes. Finally, the applicability of the proposed method is demonstrated through a case study of screw air compressor cases.
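The dependent function at the heart of extension theory can be written down directly for a target interval nested inside a broader one. This is the standard elementary form from extenics, a sketch rather than the paper's exact hierarchical formulation:

```python
def rho(x, a, b):
    # Extension distance of point x from the interval [a, b].
    return abs(x - (a + b) / 2) - (b - a) / 2

def dependent(x, a, b, c, d):
    # Elementary dependent function for the nested intervals
    # [a, b] (classical domain) within [c, d] (joint domain).
    denom = rho(x, c, d) - rho(x, a, b)
    if denom == 0:
        return -rho(x, a, b) - 1   # conventional handling on the boundary
    return rho(x, a, b) / denom

# x inside [a, b] gives k > 0; x outside [c, d] gives k < 0.
inside = dependent(5, 4, 6, 0, 10)    # → 0.25
outside = dependent(12, 4, 6, 0, 10)  # → -1.5
print(inside, outside)
```

Positive values mean the case satisfies the requirement, values in (-1, 0) mean it can be transformed to satisfy it, which is what makes the function useful for flagging "modifiable" unsatisfactory cases.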

  10. Environmental Monitoring, Water Quality - MO 2009 Water Quality Standards - Table G Lake Classifications and Use Designations (SHP)

    Data.gov (United States)

    NSGIC State | GIS Inventory — This data set contains Missouri Water Quality Standards (WQS) lake classifications and use designations described in the Missouri Code of State Regulations (CSR), 10...

  11. The Pediatric Home Care/Expenditure Classification Model (P/ECM): A Home Care Case-Mix Model for Children Facing Special Health Care Challenges

    Science.gov (United States)

    Phillips, Charles D.

    2015-01-01

    Case-mix classification and payment systems help assure that persons with similar needs receive similar amounts of care resources, which is a major equity concern for consumers, providers, and programs. Although health service programs for adults regularly use case-mix payment systems, programs providing health services to children and youth rarely use such models. This research utilized Medicaid home care expenditures and assessment data on 2,578 children receiving home care in one large state in the USA. Using classification and regression tree analyses, a case-mix model for long-term pediatric home care was developed. The Pediatric Home Care/Expenditure Classification Model (P/ECM) grouped children and youth in the study sample into 24 groups, explaining 41% of the variance in annual home care expenditures. The P/ECM creates the possibility of a more equitable, and potentially more effective, allocation of home care resources among children and youth facing serious health care challenges. PMID:26740744

  13. The Pediatric Home Care/Expenditure Classification Model (P/ECM): A Home Care Case-Mix Model for Children Facing Special Health Care Challenges

    Directory of Open Access Journals (Sweden)

    Charles D. Phillips

    2015-01-01

    Full Text Available Case-mix classification and payment systems help assure that persons with similar needs receive similar amounts of care resources, which is a major equity concern for consumers, providers, and programs. Although health service programs for adults regularly use case-mix payment systems, programs providing health services to children and youth rarely use such models. This research utilized Medicaid home care expenditures and assessment data on 2,578 children receiving home care in one large state in the USA. Using classification and regression tree analyses, a case-mix model for long-term pediatric home care was developed. The Pediatric Home Care/Expenditure Classification Model (P/ECM) grouped children and youth in the study sample into 24 groups, explaining 41% of the variance in annual home care expenditures. The P/ECM creates the possibility of a more equitable, and potentially more effective, allocation of home care resources among children and youth facing serious health care challenges.

  14. Game Design Principles based on Human Error

    Directory of Open Access Journals (Sweden)

    Guilherme Zaffari

    2016-03-01

    Full Text Available This paper presents the results of the authors' research on incorporating Human Error, through design principles, into video game design. In general, designers must consider Human Error factors throughout video game interface development; however, for core game design, adaptations are needed, since challenge is an important factor for fun, and from the perspective of Human Error, challenge can be considered a flaw in the system. The research used Human Error classifications, data triangulation via predictive human error analysis, and expanded flow theory to design a set of principles that match the design of playful challenges with the principles of Human Error. From the results, it was possible to conclude that the application of Human Error in game design has a positive effect on player experience, allowing the player to interact only with errors associated with the intended aesthetics of the game.

  15. 14 CFR 1203.701 - Classification.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 5 2010-01-01 2010-01-01 false Classification. 1203.701 Section 1203.701... Government Information § 1203.701 Classification. (a) Foreign government information that is classified by a foreign entity shall either retain its original classification designation or be marked with a United...

  16. 14 CFR 298.3 - Classification.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Classification. 298.3 Section 298.3... REGULATIONS EXEMPTIONS FOR AIR TAXI AND COMMUTER AIR CARRIER OPERATIONS General § 298.3 Classification. (a) There is hereby established a classification of air carriers, designated as “air taxi operators,” which...

  17. Spaceflight Flow Cytometry: Design Challenges and Applications

    Science.gov (United States)

    Pappas, Dimitri; Kao, Shih-Hsin; Jeevarajan, Antony S.

    2004-01-01

    Future space exploration missions will require analytical technology capable of providing both autonomous medical care to the crew and investigative capabilities to researchers. While several promising candidate technologies exist for further development, flow cytometry is an attractive technology as it offers both crew health assays and a wide array of biochemistry and immunology assays. While flow cytometry has been widely used for cellular analysis in both clinical and research settings, the requirements for proper operation in spaceflight impose constraints on any instrument design. The challenges of designing a spaceflight-ready flow cytometer are discussed, as well as some preliminary results using a prototype system.

  18. Design and update of a classification system: the UCSD map of science.

    Directory of Open Access Journals (Sweden)

    Katy Börner

    Full Text Available Global maps of science can be used as a reference system to chart career trajectories, the location of emerging research frontiers, or the expertise profiles of institutes or nations. This paper details the data preparation, analysis, and layout performed when designing and subsequently updating the UCSD map of science and classification system. The original classification and map use 7.2 million papers and their references from Elsevier's Scopus (about 15,000 source titles, 2001-2005) and Thomson Reuters' Web of Science (WoS) Science, Social Science, and Arts & Humanities Citation Indexes (about 9,000 source titles, 2001-2004), for about 16,000 unique source titles. The updated map and classification add six years (2005-2010) of WoS data and three years (2006-2008) from Scopus to the existing category structure, increasing the number of source titles to about 25,000. To our knowledge, this is the first time that a widely used map of science has been updated. A comparison of the original 5-year and the new 10-year maps and classification system shows (i) an increase in the total number of journals that can be mapped by 9,409 journals (social sciences had an 80% increase, humanities a 119% increase, medical a 32% increase, and natural science a 74% increase), (ii) a simplification of the map by assigning all but five highly interdisciplinary journals to exactly one discipline, (iii) a more even distribution of journals over the 554 subdisciplines and 13 disciplines when calculating the coefficient of variation, and (iv) a better reflection of journal clusters when compared with paper-level citation data. When evaluated against a listing of desirable features for maps of science, the updated map is shown to have higher mapping accuracy, easier understandability as fewer journals are multiply classified, and higher usability for the generation of data overlays, among others.

  19. Challenges and learning outcomes of educational design research for PhD students

    NARCIS (Netherlands)

    Bronkhorst, L.H.; de Kleijn, R.A.M.

    2016-01-01

    Educational design research (EDR) is described as a complex research approach. The challenges resulting from this complexity are typically described as procedural, whereas EDR might also be challenging for different reasons, specifically for early career researchers. Yet challenging experiences may

  20. Challenges in design of zirconium alloy reactor components

    International Nuclear Information System (INIS)

    Kakodkar, Anil; Sinha, R.K.

    1992-01-01

    Zirconium alloy components used in core-internal assemblies of heavy water reactors have to be designed under constraints imposed by the need for minimum mass, limitations of the fabrication, welding, and joining techniques available for this material, and unique mechanisms of degradation in the operating performance of these components. These constraints manifest as challenges for design and development when the size, shape, and dimensions of the components and assemblies are unconventional or untried, or when one is aiming to maximize the service life of these components under severe operating conditions. A number of such challenges were successfully met during the development of core-internal components and assemblies of the Dhruva reactor. Some of the then-untried ideas which were developed and successfully implemented include the use of electron beam welding, cold forming of hemispherical ends of reentrant cans, and a large variety of rolled joints of innovative designs. This experience provided the foundation for taking up and successfully completing several tasks relating to coolant channels, liquid poison channels, and sparger channels for PHWRs and test sections for the in-pile loops of the Dhruva reactor. For life prediction and safety assessment of coolant channels of PHWRs, some analytical tools have been developed, notably a computer code for prediction of the creep-limited life of coolant channels. Some of the future challenges include the development of easily replaceable coolant channels and large-diameter coolant channels for the Advanced Heavy Water Reactor, and the development of solutions to overcome deterioration of the service life of coolant channels due to hydriding. (author). 5 refs., 13 figs., 1 tab

  1. Library Classification 2020

    Science.gov (United States)

    Harris, Christopher

    2013-01-01

    In this article the author explores how a new library classification system might be designed using some aspects of the Dewey Decimal Classification (DDC) and ideas from other systems to create something that works for school libraries in the year 2020. By examining what works well with the Dewey Decimal System, what features should be carried…

  2. Improving Student Question Classification

    Science.gov (United States)

    Heiner, Cecily; Zachary, Joseph L.

    2009-01-01

    Students in introductory programming classes often articulate their questions and information needs incompletely. Consequently, the automatic classification of student questions to provide automated tutorial responses is a challenging problem. This paper analyzes 411 questions from an introductory Java programming course by reducing the natural…

  3. Emotion models for textual emotion classification

    Science.gov (United States)

    Bruna, O.; Avetisyan, H.; Holub, J.

    2016-11-01

    This paper deals with textual emotion classification, which has gained attention in recent years. Emotion classification is used in user experience, product evaluation, national security, and tutoring applications. It attempts to detect the emotional content in the input text and, based on different approaches, establish what kind of emotional content is present, if any. Textual emotion classification is the most difficult to handle, since it relies mainly on linguistic resources and introduces many challenges in assigning text to an emotion represented by a proper model. A crucial part of each emotion detector is its emotion model. The focus of this paper is to introduce emotion models used for classification. Categorical and dimensional models of emotion are explained and some more advanced approaches are mentioned.
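
A categorical model of the kind surveyed here reduces, in its simplest form, to matching text against per-emotion keyword sets; the lexicon below is an illustrative toy, not one of the linguistic resources the paper discusses:

```python
# Toy categorical-model classifier: score the input text against
# per-emotion keyword sets and pick the best match. The lexicon is an
# illustrative placeholder, not a real linguistic resource.
LEXICON = {
    "joy": {"happy", "delighted", "great"},
    "anger": {"furious", "hate", "annoyed"},
    "sadness": {"sad", "miserable", "cry"},
}

def classify_emotion(text):
    words = set(text.lower().split())
    scores = {emo: len(words & kws) for emo, kws in LEXICON.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None   # None: no emotional content found
```

A dimensional model would instead map the text onto continuous axes such as valence and arousal rather than discrete labels.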

  4. Ensemble support vector machine classification of dementia using structural MRI and mini-mental state examination.

    Science.gov (United States)

    Sørensen, Lauge; Nielsen, Mads

    2018-05-15

    The International Challenge for Automated Prediction of MCI from MRI data offered independent, standardized comparison of machine learning algorithms for multi-class classification of normal control (NC), mild cognitive impairment (MCI), converting MCI (cMCI), and Alzheimer's disease (AD) using brain imaging and general cognition. We proposed to use an ensemble of support vector machines (SVMs) that combined bagging without replacement and feature selection. SVM is the most commonly used algorithm in multivariate classification of dementia, and it was therefore valuable to evaluate the potential benefit of ensembling this type of classifier. The ensemble SVM, using either a linear or a radial basis function (RBF) kernel, achieved multi-class classification accuracies of 55.6% and 55.0% in the challenge test set (60 NC, 60 MCI, 60 cMCI, 60 AD), resulting in a third place in the challenge. Similar feature subset sizes were obtained for both kernels, and the most frequently selected MRI features were the volumes of the two hippocampal subregions left presubiculum and right subiculum. Post-challenge analysis revealed that enforcing a minimum number of selected features and increasing the number of ensemble classifiers improved classification accuracy up to 59.1%. The ensemble SVM outperformed single SVM classifications consistently in the challenge test set. Ensemble methods using bagging and feature selection can improve the performance of the commonly applied SVM classifier in dementia classification. This resulted in competitive classification accuracies in the International Challenge for Automated Prediction of MCI from MRI data. Copyright © 2018 Elsevier B.V. All rights reserved.
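
The ensemble recipe described above (bagging without replacement plus per-classifier feature selection, aggregated by majority vote) can be sketched as follows; a simple nearest-centroid rule stands in for the SVM base learner, and the two-class data are synthetic, not the challenge's MRI features:

```python
import random

# Sketch of bagging *without* replacement combined with random feature
# selection per classifier, aggregated by majority vote. A nearest-centroid
# rule stands in for the SVM base learner; the data below are synthetic.

def fit_centroids(X, y, feats):
    cents = {}
    for label in set(y):
        pts = [[x[f] for f in feats] for x, lab in zip(X, y) if lab == label]
        cents[label] = [sum(col) / len(col) for col in zip(*pts)]
    return feats, cents

def predict_centroid(model, x):
    feats, cents = model
    v = [x[f] for f in feats]
    return min(cents, key=lambda c: sum((a - b) ** 2 for a, b in zip(v, cents[c])))

def bagged_ensemble(X, y, n_models=11, frac=0.8, n_feats=2, seed=0):
    rng = random.Random(seed)
    models, idx = [], list(range(len(X)))
    for _ in range(n_models):
        sub = rng.sample(idx, int(frac * len(X)))        # bagging without replacement
        feats = rng.sample(range(len(X[0])), n_feats)    # random feature subset
        models.append(fit_centroids([X[i] for i in sub], [y[i] for i in sub], feats))
    return models

def vote(models, x):
    preds = [predict_centroid(m, x) for m in models]
    return max(set(preds), key=preds.count)              # majority vote

# Two well-separated synthetic classes in three features
X = [(0.0, 0.0, 0.0), (0.5, 0.0, 0.2), (0.1, 0.4, 0.0),
     (5.0, 5.0, 5.0), (4.8, 5.2, 5.0), (5.1, 4.9, 5.2)]
y = ["A", "A", "A", "B", "B", "B"]
models = bagged_ensemble(X, y)
```

The paper's post-challenge finding, that more ensemble members and a minimum feature count improve accuracy, corresponds here to tuning `n_models` and `n_feats`.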

  5. Power quality event classification: an overview and key issues ...

    African Journals Online (AJOL)

    ... used for PQ events' classifications. Various artificial intelligent techniques which are used in PQ event classification are also discussed. Major Key issues and challenges in classifying PQ events are critically examined and outlined. Keywords: Power quality, PQ event classifiers, artificial intelligence techniques, PQ noise, ...

  6. Teaching habitat and animal classification to fourth graders using an engineering-design model

    Science.gov (United States)

    Marulcu, Ismail

    2014-05-01

    Background: The motivation for this work is built upon the premise that there is a need for research-based materials for design-based science instruction. In this paper, a small portion of our work investigating the impact of a LEGO™ engineering unit on fourth grade students' preconceptions and understanding of animals is presented. Purpose: The driving questions for our work are: (1) What is the impact of an engineering-design-based curricular module on students' understanding of habitat and animal classification? (2) What are students' misconceptions regarding animal classification and habitat? Sample: The study was conducted in an inner-city K-8 school in the northeastern region of the United States. There were two fourth grade classrooms in the school. The first classroom included seven girls and nine boys, whereas the other classroom included eight girls and eight boys. All fourth grade students participated in the study. Design and methods: In answering the research questions, mixed-method approaches are used. Data collection methods included pre- and post-tests, pre- and post-interviews, student journals, and classroom observations. Identical pre- and post-tests were administered to measure students' understanding of animals. They included four multiple-choice and six open-ended questions. Identical pre- and post-interviews were administered to explore students' in-depth understanding of animals. Results: Our results show that students significantly increased their performance after instruction on both the multiple-choice questions (t = -3.586, p = .001) and the open-ended questions (t = -5.04, p = .000). They performed better on the post interviews as well. Also, it is found that design-based instruction helped students comprehend core concepts of a life science subject, animals. Conclusions: Based on these results, the main argument of the study is that engineering design is a useful framework for teaching not only physical science-related subjects, but

  7. Big Data in Designing Clinical Trials: Opportunities and Challenges.

    Science.gov (United States)

    Mayo, Charles S; Matuszak, Martha M; Schipper, Matthew J; Jolly, Shruti; Hayman, James A; Ten Haken, Randall K

    2017-01-01

    Emergence of big data analytics resource systems (BDARSs) as a part of routine practice in Radiation Oncology is on the horizon. Gradually, individual researchers, vendors, and professional societies are leading initiatives to create and demonstrate use of automated systems. What are the implications for design of clinical trials as these systems emerge? Gold standard randomized controlled trials (RCTs) have high internal validity for the patients and settings fitting the constraints of the trial, but also have limitations including: reproducibility, generalizability to routine practice, infrequent external validation, selection bias, characterization of confounding factors, ethics, and use for rare events. BDARSs present opportunities to augment and extend RCTs. Preliminary modeling using single- and multi-institutional BDARSs may lead to better design and less cost. Standardizations in data elements, clinical processes, and nomenclatures, used to decrease variability and increase the veracity needed for automation and multi-institutional data pooling in BDARSs, also support the ability to add clinical validation phases to clinical trial design and increase participation. However, volume and variety in BDARSs present other technical, policy, and conceptual challenges, including applicable statistical concepts and cloud-based technologies. In this summary, we will examine both the opportunities and the challenges of using big data in the design of clinical trials.

  8. Big Data in Designing Clinical Trials: Opportunities and Challenges

    Directory of Open Access Journals (Sweden)

    Charles S. Mayo

    2017-08-01

    Full Text Available Emergence of big data analytics resource systems (BDARSs) as a part of routine practice in Radiation Oncology is on the horizon. Gradually, individual researchers, vendors, and professional societies are leading initiatives to create and demonstrate use of automated systems. What are the implications for design of clinical trials as these systems emerge? Gold standard randomized controlled trials (RCTs) have high internal validity for the patients and settings fitting the constraints of the trial, but also have limitations including: reproducibility, generalizability to routine practice, infrequent external validation, selection bias, characterization of confounding factors, ethics, and use for rare events. BDARSs present opportunities to augment and extend RCTs. Preliminary modeling using single- and multi-institutional BDARSs may lead to better design and less cost. Standardizations in data elements, clinical processes, and nomenclatures, used to decrease variability and increase the veracity needed for automation and multi-institutional data pooling in BDARSs, also support the ability to add clinical validation phases to clinical trial design and increase participation. However, volume and variety in BDARSs present other technical, policy, and conceptual challenges, including applicable statistical concepts and cloud-based technologies. In this summary, we will examine both the opportunities and the challenges of using big data in the design of clinical trials.

  9. Cancer classification in the genomic era: five contemporary problems.

    Science.gov (United States)

    Song, Qingxuan; Merajver, Sofia D; Li, Jun Z

    2015-10-19

    Classification is an everyday instinct as well as a full-fledged scientific discipline. Throughout the history of medicine, disease classification is central to how we develop knowledge, make diagnosis, and assign treatment. Here, we discuss the classification of cancer and the process of categorizing cancer subtypes based on their observed clinical and biological features. Traditionally, cancer nomenclature is primarily based on organ location, e.g., "lung cancer" designates a tumor originating in lung structures. Within each organ-specific major type, finer subgroups can be defined based on patient age, cell type, histological grades, and sometimes molecular markers, e.g., hormonal receptor status in breast cancer or microsatellite instability in colorectal cancer. In the past 15+ years, high-throughput technologies have generated rich new data regarding somatic variations in DNA, RNA, protein, or epigenomic features for many cancers. These data, collected for increasingly large tumor cohorts, have provided not only new insights into the biological diversity of human cancers but also exciting opportunities to discover previously unrecognized cancer subtypes. Meanwhile, the unprecedented volume and complexity of these data pose significant challenges for biostatisticians, cancer biologists, and clinicians alike. Here, we review five related issues that represent contemporary problems in cancer taxonomy and interpretation. (1) How many cancer subtypes are there? (2) How can we evaluate the robustness of a new classification system? (3) How are classification systems affected by intratumor heterogeneity and tumor evolution? (4) How should we interpret cancer subtypes? (5) Can multiple classification systems co-exist? While related issues have existed for a long time, we will focus on those aspects that have been magnified by the recent influx of complex multi-omics data. Exploration of these problems is essential for data-driven refinement of cancer classification.

  10. 14 CFR 1203.412 - Classification guides.

    Science.gov (United States)

    2010-01-01

    ... of the classification designations (i.e., Top Secret, Secret or Confidential) apply to the identified... writing by an official with original Top Secret classification authority; the identity of the official...

  11. Exploring Challenging Group Dynamics in Participatory Design with Children

    OpenAIRE

    Van Mechelen, Maarten; Gielen, Matthieu; Vanden Abeele, Vero; Laenen, Ann; Zaman, Bieke

    2014-01-01

    This paper presents a structured way to evaluate challenging group or 'codesign dynamics' in participatory design processes with children. In the form of a critical reflection on a project in which 103 children were involved as design partners, we describe the most prevalent codesign dynamics. For example, some groups rush too quickly towards consensus to safeguard group cohesiveness instead of examining other choice alternatives (i.e., groupthink). Besides 'groupthink' we describe five more ...

  12. Solar Probe Plus: Mission design challenges and trades

    Science.gov (United States)

    Guo, Yanping

    2010-11-01

    NASA plans to launch the first mission to the Sun, named Solar Probe Plus, as early as 2015, after a comprehensive feasibility study that significantly changed the original Solar Probe mission concept. The original Solar Probe mission concept, based on a Jupiter gravity assist trajectory, was no longer feasible under the new guidelines given to the mission. A complete redesign of the mission was required, which called for developing alternative trajectories that excluded a flyby of Jupiter. Without the very powerful gravity assist from Jupiter it was extremely difficult to get to the Sun, so designing a trajectory to reach the Sun that is technically feasible under the new mission guidelines became a key enabler to this highly challenging mission. Mission design requirements and challenges unique to this mission are reviewed and discussed, including various mission scenarios and six different trajectory designs utilizing various planetary gravity assists that were considered. The V5GA trajectory design using five Venus gravity assists achieves a perihelion of 11.8 solar radii (RS) in 3.3 years without any deep space maneuver (DSM). The V7GA trajectory design reaches a perihelion of 9.5 RS using seven Venus gravity assists in 6.39 years without any DSM. With nine Venus gravity assists, the V9GA trajectory design shows that a solar orbit at an inclination as high as 37.9° from the ecliptic plane can be achieved with a time of flight of 5.8 years. Using combined Earth and Venus gravity assists, a distance as close as 9 RS from the Sun can be achieved in less than 10 years of flight time at moderate launch C3. Ultimately the V7GA trajectory was chosen as the new baseline mission trajectory. Its design, allowing for science investigation right after launch and continuing for nearly 7 years, is unprecedented for interplanetary missions. The redesigned Solar Probe Plus mission is not only feasible under the new guidelines but also significantly outperforms the original mission concept.

  13. Mechanical design features and challenges for the ITER ICRH antenna

    Energy Technology Data Exchange (ETDEWEB)

    Borthwick, A. [UKAEA/Euratom Fusion Association, Culham Science Centre, Abingdon, Oxon OX14 3DB (United Kingdom)], E-mail: andy.borthwick@yahoo.co.uk; Agarici, G. [Fusion for Energy, Barcelona (Spain); Davis, A. [UKAEA/Euratom Fusion Association, Culham Science Centre, Abingdon, Oxon OX14 3DB (United Kingdom); Dumortier, P.; Durodie, F. [LPP-ERM-KMS, Association EURATOM-Belgian State, Brussels (Belgium); Fanthome, J.; Hamlyn-Harris, C.; Hancock, A.D.; Lockley, D. [UKAEA/Euratom Fusion Association, Culham Science Centre, Abingdon, Oxon OX14 3DB (United Kingdom); Mitteau, R. [Euratom-CEA Association, DSM/IRFM, CEA-Cadarache, 13108 St Paul lez Durance (France); Nightingale, M. [UKAEA/Euratom Fusion Association, Culham Science Centre, Abingdon, Oxon OX14 3DB (United Kingdom); Sartori, R. [Fusion for Energy, Barcelona (Spain); Vulliez, K. [Euratom-CEA Association, DSM/IRFM, CEA-Cadarache, 13108 St Paul lez Durance (France)

    2009-06-15

    The ITER Ion Cyclotron Resonant Heating (ICRH) antenna provides plasma heating at a power of 20 MW. Operation in the ITER environment imposes significant thermal power handling capability, structural integrity, shielding and operations requirements. The design will require a step change over any predecessor in terms of power, scale and complexity. This paper reports the main mechanical design features that address the challenges and often conflicting requirements during the conceptual design phase.

  14. Classification of parotidectomy: a proposed modification to the European Salivary Gland Society classification system.

    Science.gov (United States)

    Wong, Wai Keat; Shetty, Subhaschandra

    2017-08-01

    Parotidectomy remains the mainstay of treatment for both benign and malignant lesions of the parotid gland. A wide range of surgical options exists in parotidectomy in terms of the extent of parotid tissue removed. There is an increasing need for uniformity of terminology resulting from growing interest in modifications of the conventional parotidectomy. A standardized classification system for describing the extent of parotidectomy is, therefore, of paramount importance. Recently, the European Salivary Gland Society (ESGS) proposed a novel classification system for parotidectomy. The aim of this study is to evaluate this system. The classification system proposed by the ESGS was critically re-evaluated and modified to increase its accuracy and acceptability. Modifications mainly focused on subdividing Levels I and II into IA, IB, IIA, and IIB. From June 2006 to June 2016, 126 patients underwent 130 parotidectomies at our hospital. The classification system was tested in that cohort of patients. While the ESGS classification system is comprehensive, it does not cover all possibilities. The addition of Sublevels IA, IB, IIA, and IIB may help to address some of the clinical situations seen and is clinically relevant. We aim to test the modified classification system for partial parotidectomy to address some of the challenges mentioned.

  15. Application Study of Fire Severity Classification

    International Nuclear Information System (INIS)

    Kim, In Hwan; Kim, Hyeong Taek; Jee, Moon Hak; Kim, Yun Jung

    2013-01-01

    This paper introduces a Fire Incident Severity Classification Method for Korean NPPs that may be derived directly from the data fields, along with a feasibility study for domestic use. The FEDB was characterized in more detail and assessed based on the significance of the fire incidents in the updated database, and five fire severity categories were defined. The logical approach to determining fire severity starts from the most severe characteristics, namely challenging fires, and proceeds to define the less challenging and undetermined categories. If the FEDB is utilized for Korean NPPs, the Fire Severity Classification described in Section 2.4 above can be used for quantitative fire risk analysis in the future. The Fire Events Database (FEDB) is the primary source of fire data used to derive fire frequencies in Fire PSA (Probabilistic Safety Assessment). The purpose of its development is to calculate quantitative fire frequencies from a comprehensive and consolidated source of the fire incident information available for Nuclear Power Plants (NPPs). Recently, the FEDB was updated by the Electric Power Research Institute (EPRI) and the Nuclear Regulatory Commission (NRC) in the U.S. The FEDB is intended to update the fire event history up to 2009. A significant enhancement is the reorganization and refinement of the database structure and data fields. It has expanded and improved data fields, coding consistency, incident detail, data review fields, and reference data source traceability. It has been designed to better support several Fire PRA uses as well.
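
The cascading logic described, testing for the most severe ("challenging") characteristics first and falling through to less challenging and undetermined categories, might look like this in outline; the field names, thresholds, and category labels are illustrative assumptions, not the actual FEDB coding rules:

```python
# Sketch of cascading fire-severity classification: check the most
# severe characteristics first, fall through to milder categories, and
# return "undetermined" for incomplete records. Field names, thresholds,
# and labels are hypothetical, NOT the FEDB's actual coding rules.
def classify_fire_severity(event):
    if event.get("suppression_minutes") is None:
        return "undetermined"                      # record lacks key data
    if event.get("damage_beyond_ignition_source") and event["suppression_minutes"] > 15:
        return "challenging"
    if event["suppression_minutes"] > 15:
        return "potentially challenging"
    if event.get("self_extinguished"):
        return "non-challenging, self-extinguished"
    return "non-challenging"
```

Ordering the checks from most to least severe guarantees each incident receives exactly one category, which is what a quantitative fire-frequency analysis needs.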

  16. Effects of sample survey design on the accuracy of classification tree models in species distribution models

    Science.gov (United States)

    Thomas C. Edwards; D. Richard Cutler; Niklaus E. Zimmermann; Linda Geiser; Gretchen G. Moisen

    2006-01-01

    We evaluated the effects of probabilistic (hereafter DESIGN) and non-probabilistic (PURPOSIVE) sample surveys on resultant classification tree models for predicting the presence of four lichen species in the Pacific Northwest, USA. Models derived from both survey forms were assessed using an independent data set (EVALUATION). Measures of accuracy as gauged by...

  17. From products to services : reflections on the challenges in designing for services

    NARCIS (Netherlands)

    Bhomer, ten M.; De Lille, C. S. H.; Tomico, O.; Kleinsmann, M.S.

    2013-01-01

    In this paper we point to implications for designers who support organizations in the transition from product-based to service-based business models. These implications are based on four important challenges when designing for services: the designerly mindset, collaboration, empathy with

  18. Convolutional neural network-based classification system design with compressed wireless sensor network images.

    Science.gov (United States)

    Ahn, Jungmo; Park, JaeYeon; Park, Donghwan; Paek, Jeongyeup; Ko, JeongGil

    2018-01-01

    With the introduction of various advanced deep learning algorithms, initiatives for image classification systems have transitioned from traditional machine learning algorithms (e.g., SVM) to Convolutional Neural Networks (CNNs) using deep learning software tools. A prerequisite for applying CNNs to real-world applications is a system that collects meaningful and useful data. For such purposes, Wireless Image Sensor Networks (WISNs), which are capable of monitoring natural environment phenomena using tiny and low-power cameras on resource-limited embedded devices, can be considered an effective means of data collection. However, with limited battery resources, sending high-resolution raw images to the backend server is a burdensome task that has a direct impact on network lifetime. To address this problem, we propose an energy-efficient pre- and post-processing mechanism using image resizing and color quantization that can significantly reduce the amount of data transferred while maintaining the classification accuracy of the CNN at the backend server. We show that, if well designed, an image in its highly compressed form can be well classified with a CNN model trained in advance using adequately compressed data. Our evaluation using a real image dataset shows that an embedded device can reduce the amount of transmitted data by ∼71% while maintaining a classification accuracy of ∼98%. Under the same conditions, this process naturally reduces energy consumption by ∼71% compared to a WISN that sends the original uncompressed images.
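
The pre-processing stage (resizing plus color quantization) can be sketched with block averaging and uniform quantization; the block size and number of levels below are illustrative parameters, not the values the paper tunes against CNN accuracy:

```python
# Sketch of the on-device pre-processing: block-average downsampling
# followed by uniform quantization of pixel values. Parameters (block
# size, number of levels) are illustrative, not the paper's tuned values.
def downsample(img, block):
    """img: 2-D list of grayscale values; averages each block x block tile."""
    h, w = len(img), len(img[0])
    out = []
    for r in range(0, h - h % block, block):
        row = []
        for c in range(0, w - w % block, block):
            tile = [img[r + i][c + j] for i in range(block) for j in range(block)]
            row.append(sum(tile) / len(tile))
        out.append(row)
    return out

def quantize(img, levels, maxval=255):
    """Snap each value down to one of `levels` evenly spaced bins."""
    step = (maxval + 1) / levels
    return [[int(min(v // step, levels - 1) * step) for v in row] for row in img]
```

Both steps shrink the payload sent over the radio: downsampling cuts the pixel count, and quantization cuts the bits needed per pixel.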

  19. Two approaches to meeting the economic challenge for advanced BWR designs

    International Nuclear Information System (INIS)

    Arnold, H.; Rao, A.S.; Sawyer, C.D.

    1996-01-01

    In developing next generation nuclear power plants, many economic challenges must be addressed before the plants become economically attractive to utilities. The economic challenges vary from country to country but share several common characteristics. First and foremost, a plant has to have the lowest construction costs to even be considered for design and construction. Additionally, the plant design has to have a reasonable chance of being licensed by the regulatory authorities in order to minimize the financial risk to the constructing utility. With the long lead times involved in the design and development of advanced plants nowadays, the overall development costs have also become a key factor in the evolution of advanced plants. This paper presents the design overview and approach to addressing the aforementioned economic challenges for two Advanced Boiling Water Reactor (ABWR) designs. The first plant is the ABWR and the second is the European Simplified Boiling Water Reactor (ESBWR). The ABWR relies on proven technology and components and an extensive infrastructure that has been built up over the last 20 years. Because it has proven and standard safety systems, which have been licensed in two countries, it carries very limited uncertainty regarding licensing. Finally, it relies on economies of scale and design flexibility to improve the overall economics of power generation. The ESBWR, on the other hand, has taken an innovative approach of reducing systems and components to simplify the overall plant and improve plant economics. The overall plant design is indeed simpler, but improved economics required some reliance on economies of scale as well. The ESBWR design has also minimized the overall development cost by utilizing features and components from the ABWR and Simplified Boiling Water Reactor technology programs. (authors)

  20. Teaching chemical product design to engineering students: course contents and challenges

    DEFF Research Database (Denmark)

    Skov, Anne Ladegaard; Kiil, Søren

    Chemical product design is not taught in the same way as traditional engineering courses like unit operations or transport phenomena. This paper gives an overview of the challenges that we, as teachers, have faced when teaching chemical product design to engineering students. Specific course...

  1. Use of a design challenge to develop postural support devices for intermediate wheelchair users

    Directory of Open Access Journals (Sweden)

    Brenda N. Onguti

    2017-09-01

    Full Text Available The provision of an appropriate wheelchair, one that provides proper fit and postural support, promotes wheelchair users’ physical health and quality of life. Many wheelchair users have postural difficulties, requiring supplemental postural support devices for added trunk support. However, in many low- and middle-income settings, postural support devices are inaccessible, inappropriate or unaffordable. This article describes the use of the design challenge model, informed by a design thinking approach, to catalyse the development of an affordable, simple and robust postural support device for low- and middle-income countries. The article also illustrates how not-for-profit organisations can utilise design thinking and, in particular, the design challenge model to successfully support the development of innovative solutions to product or process challenges.

  2. Robert Spitzer and psychiatric classification: technical challenges and ethical dilemmas.

    Science.gov (United States)

    Jacob, K S

    2016-01-01

    Dr Robert Leopold Spitzer (May 22, 1932-December 25, 2015), the architect of modern psychiatric diagnostic criteria and classification, died recently at the age of 83 in Seattle. Under his leadership, the American Psychiatric Association's (APA) Diagnostic and Statistical Manuals (DSM) became the international standard.

  3. Introduction to South Africa's safety classification

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Kyung Jun; Wu, Sang Ik; Yoon, Juh Yeon [KAERI, Daejeon (Korea, Republic of)

    2012-10-15

    The safety functions of nuclear reactor facilities such as research reactors have to be maintained for all initiating events, incidents and accidents. From the licensee's position, meeting the nuclear regulator's licensing requirements for these safety functions is a very important issue and a significant design challenge. This paper introduces South Africa's licensing requirements and process for the safety classification of SSCs. South Africa's licensing requirements are shown in Table 1. Three categories, A, B and C, are defined based on occurrence frequency and the dose limits for worker and public exposure. The Defense in Depth (DiD) and ALARA principles must be applied to a nuclear reactor facility design. South Africa's safety and quality classes are also compared with those of ANSI 51.1.

  4. Characterising Extrinsic Challenges Linked to the Design and Implementation of Inquiry-Based Practical Work

    Science.gov (United States)

    Akuma, Fru Vitalis; Callaghan, Ronel

    2017-11-01

    Inquiry-based science education has been incorporated in science curricula internationally. In this regard, however, many teachers encounter challenges. The challenges have been characterised into those linked to the personal characteristics of these teachers (intrinsic challenges) and others associated with contextual factors (extrinsic challenges). However, this level of characterisation is inadequate in terms of appreciating the complexity of the challenges, tracking of their development, and discovering knowledge within specific categories. Against this background, the purpose of the research presented here was to characterise extrinsic challenges linked to the design and implementation of inquiry-based practical work. In order to do so, we used a conceptual framework of teaching challenges based on Bronfenbrenner's ecological theory of human development. The data gathered using a multi-method case study of practical work in two South African high schools, was analysed by combining the data-driven inductive approach and the deductive a priori template of codes approach in thematic analysis. On this basis, the extrinsic challenges linked to the design and implementation of inquiry-based practical work that participants are confronted with, were found to consist of macrosystem challenges (such as a restrictive curriculum) and microsystem challenges. At the latter level, the challenges are material-related (e.g., lack of science education equipment and materials) or non-material-related (such as time constraints and the lack of access to interactive computer simulations). We have discussed the theory-, practice- and research-based implications of these results in relation to the design and implementation of inquiry-based practical work in South Africa and internationally.

  5. A review of the automated detection and classification of acute leukaemia: Coherent taxonomy, datasets, validation and performance measurements, motivation, open challenges and recommendations.

    Science.gov (United States)

    Alsalem, M A; Zaidan, A A; Zaidan, B B; Hashim, M; Madhloom, H T; Azeez, N D; Alsyisuf, S

    2018-05-01

    Acute leukaemia diagnosis is a field requiring automated solutions, tools and methods and the ability to facilitate early detection and even prediction. Many studies have focused on the automatic detection and classification of acute leukaemia and its subtypes to enable highly accurate diagnosis. This study aimed to review and analyse the literature related to the detection and classification of acute leukaemia. To improve understanding of the field's various contextual aspects, the published studies were examined in terms of their motivation, the open challenges that confronted researchers and the recommendations presented to researchers to enhance this vital research area. We systematically searched all articles about the classification and detection of acute leukaemia, as well as their evaluation and benchmarking, in three main databases: ScienceDirect, Web of Science and IEEE Xplore, from 2007 to 2017. These indices were considered sufficiently extensive to encompass our field of literature. Based on our inclusion and exclusion criteria, 89 articles were selected. Most studies (58/89) focused on the methods or algorithms of acute leukaemia classification, a number of papers (22/89) covered systems developed for the detection or diagnosis of acute leukaemia, and a few papers (5/89) presented evaluation and comparative studies. The smallest portion (4/89) of articles comprised reviews and surveys. Research areas in medical-image classification vary, but they are all equally vital. We expect this systematic review to help emphasise current research opportunities and thus extend and create additional research fields.

  6. Furniture design

    CERN Document Server

    Smardzewski, Jerzy

    2015-01-01

    Maximizing reader insights into the principles of designing furniture as wooden structures, this book discusses issues related to the history of furniture structures, their classification and characteristics, ergonomic approaches to anthropometric requirements and safety of use. It presents key methods and highlights common errors in designing the characteristics of the materials, components, joints and structures, as well as looking at the challenges of developing the associated design documentation. Including analysis of how designers may go about calculating the stiffness and endurance of parts, joints and whole structures, the book analyzes questions regarding the loss of furniture stability and the resulting threats to the health of the user, putting forward a concept of furniture design as an engineering process. Creating an attractive, functional, ergonomic and safe piece of furniture is not only the fruit of the work of individual architects and artists, but requires an effort of many people working ...

  7. Rectifier Design Challenges for RF Wireless Power Transfer and Energy Harvesting Systems

    Directory of Open Access Journals (Sweden)

    A. Collado

    2017-06-01

    Full Text Available The design of wireless power transfer (WPT) and energy harvesting (EH) solutions poses different challenges towards achieving maximum RF-DC conversion efficiency in these systems. This paper covers several selected challenges in developing WPT and electromagnetic EH solutions, such as the design of multiband and broadband rectifiers, the minimization of the effect that load and input power variations have on system performance, and the optimum power-combining mechanisms to use with multi-element rectifiers.

  8. The MedlinePlus public user interface: studies of design challenges and opportunities

    Science.gov (United States)

    Marill, Jennifer L.; Miller, Naomi; Kitendaugh, Paula

    2006-01-01

    Question: What are the challenges involved in designing, modifying, and improving a major health information portal that serves over sixty million page views a month? Setting: MedlinePlus, the National Library of Medicine's (NLM's) consumer health Website, is examined. Method: Challenges are presented as six “studies,” which describe selected design issues and how NLM staff resolved them. Main Result: Improving MedlinePlus is an iterative process. Changes in the public user interface are ongoing, reflecting Web design trends, usability testing recommendations, user survey results, new technical requirements, and the need to grow the site in an orderly way. Conclusion: Testing and analysis should accompany Website design modifications. New technologies may enhance a site but also introduce problems. Further modifications to MedlinePlus will be informed by the experiences described here. PMID:16404467

  9. Design and manufacturing challenges of optogenetic neural interfaces: a review

    Science.gov (United States)

    Goncalves, S. B.; Ribeiro, J. F.; Silva, A. F.; Costa, R. M.; Correia, J. H.

    2017-08-01

    Optogenetics is a relatively new technology to achieve cell-type specific neuromodulation with millisecond-scale temporal precision. Optogenetic tools are being developed to address neuroscience challenges, and to improve the knowledge about brain networks, with the ultimate aim of catalyzing new treatments for brain disorders and diseases. To reach this ambitious goal the implementation of mature and reliable engineered tools is required. The success of optogenetics relies on optical tools that can deliver light into the neural tissue. Objective/Approach: Here, the design and manufacturing approaches available to the scientific community are reviewed, and current challenges to accomplish appropriate scalable, multimodal and wireless optical devices are discussed. Significance: Overall, this review aims at presenting a helpful guidance to the engineering and design of optical microsystems for optogenetic applications.

  10. Challenges in Design of an Orientation free Micro Direct Methanol Fuel Cell (µDMFC)

    DEFF Research Database (Denmark)

    Omidvarnia, Farzaneh; Hansen, Hans Nørgaard; Hales, Jan Harry

    2014-01-01

    the challenges in the design and manufacturing of a micro direct methanol fuel cell (μDMFC) as the power generator in hearing aid devices are investigated. Among the different challenges in design for the μDMFC, CO2 bubble management and orientation independency of the cell are addressed by proposing a spring loaded...

  11. Classification guide: Paralympic Games London 2012

    OpenAIRE

    2013-01-01

    The London 2012 Paralympic Games Classification Guide is designed to provide National Paralympic Committees (NPCs) and International Paralympic Sport Federations (IPSFs) with information about the classification policies and procedures that will apply to the London 2012 Paralympic Games.

  12. Domain Adaptation for Opinion Classification: A Self-Training Approach

    Directory of Open Access Journals (Sweden)

    Yu, Ning

    2013-03-01

    Full Text Available Domain transfer is a widely recognized problem for machine learning algorithms because models built upon one data domain generally do not perform well in another data domain. This is especially a challenge for tasks such as opinion classification, which often has to deal with insufficient quantities of labeled data. This study investigates the feasibility of self-training for dealing with the domain transfer problem in opinion classification by leveraging labeled data in non-target data domain(s) and unlabeled data in the target domain. Specifically, self-training is evaluated for effectiveness in sparse data situations and feasibility for domain adaptation in opinion classification. Three types of Web content are tested: edited news articles, semi-structured movie reviews, and the informal and unstructured content of the blogosphere. Findings of this study suggest that, when there are limited labeled data, self-training is a promising approach for opinion classification, although the contributions vary across data domains. Significant improvement was demonstrated for the most challenging data domain, the blogosphere, when a domain transfer-based self-training strategy was implemented.
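The self-training strategy evaluated in this record can be sketched on one-dimensional toy data: train on labelled source data, pseudo-label the most confident unlabelled target examples, add them to the training set, and retrain. The nearest-centroid classifier, the margin-based confidence rule and all names below are illustrative assumptions, not the study's actual opinion classifier.

```python
# Minimal self-training loop: a nearest-centroid "classifier" on 1-D
# features, repeatedly absorbing confidently pseudo-labelled target data.

def centroids(labelled):
    """Mean feature value per class for a list of (x, label) pairs."""
    sums, counts = {}, {}
    for x, y in labelled:
        sums[y] = sums.get(y, 0.0) + x
        counts[y] = counts.get(y, 0) + 1
    return {y: sums[y] / counts[y] for y in sums}

def self_train(labelled, unlabelled, rounds=3, margin=1.0):
    labelled, pool = list(labelled), list(unlabelled)
    for _ in range(rounds):
        cents = centroids(labelled)
        confident, rest = [], []
        for x in pool:
            dists = sorted((abs(x - c), y) for y, c in cents.items())
            # Confident if the best class is clearly closer than the runner-up.
            if len(dists) > 1 and dists[1][0] - dists[0][0] >= margin:
                confident.append((x, dists[0][1]))
            else:
                rest.append(x)
        if not confident:
            break
        labelled += confident  # absorb pseudo-labelled target examples
        pool = rest
    return centroids(labelled)

if __name__ == "__main__":
    source = [(0.0, "neg"), (1.0, "neg"), (9.0, "pos"), (10.0, "pos")]
    target = [0.5, 0.8, 9.5, 9.2, 5.0]  # unlabelled target-domain data
    print(self_train(source, target))
```

The ambiguous point (5.0) is never absorbed, which mirrors the point made above: self-training helps most where confident pseudo-labels exist, and its contribution varies with the data domain.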

  13. LTE and the evolution to 4G wireless design and measurement challenges

    CERN Document Server

    Rumney, Moray

    2013-01-01

    A practical guide to LTE design, test and measurement, this new edition has been updated to include the latest developments This book presents the latest details on LTE from a practical and technical perspective. Written by Agilent's measurement experts, it offers a valuable insight into LTE technology and its design and test challenges. Chapters cover the upper layer signaling and system architecture evolution (SAE). Basic concepts such as MIMO and SC-FDMA, the new uplink modulation scheme, are introduced and explained, and the authors look into the challenges of verifying the

  14. A neural network-based optimal spatial filter design method for motor imagery classification.

    Directory of Open Access Journals (Sweden)

    Ayhan Yuksel

    Full Text Available In this study, a novel spatial filter design method is introduced. Spatial filtering is an important processing step for feature extraction in motor imagery-based brain-computer interfaces. This paper introduces a new motor imagery signal classification method combined with spatial filter optimization. We simultaneously train the spatial filter and the classifier using a neural network approach. The proposed spatial filter network (SFN is composed of two layers: a spatial filtering layer and a classifier layer. These two layers are linked to each other with non-linear mapping functions. The proposed method addresses two shortcomings of the common spatial patterns (CSP algorithm. First, CSP aims to maximize the between-classes variance while ignoring the minimization of within-classes variances. Consequently, the features obtained using the CSP method may have large within-classes variances. Second, the maximizing optimization function of CSP increases the classification accuracy indirectly because an independent classifier is used after the CSP method. With SFN, we aimed to maximize the between-classes variance while minimizing within-classes variances and simultaneously optimizing the spatial filter and the classifier. To classify motor imagery EEG signals, we modified the well-known feed-forward structure and derived forward and backward equations that correspond to the proposed structure. We tested our algorithm on simple toy data. Then, we compared the SFN with conventional CSP and its multi-class version, called one-versus-rest CSP, on two data sets from BCI competition III. The evaluation results demonstrate that SFN is a good alternative for classifying motor imagery EEG signals with increased classification accuracy.
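For context, the CSP criterion that the SFN builds on can be illustrated on toy data: choose a spatial filter w that maximizes the ratio of filtered-signal variance between the two classes. The brute-force angle search and the synthetic two-channel signals below are illustrative assumptions, not the paper's neural-network method.

```python
# Sketch of the CSP variance-ratio criterion for a single two-channel
# spatial filter, found by brute-force search over the filter's angle.
import math

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def apply_filter(trials, w):
    # Project each 2-channel sample onto w and pool the filtered samples.
    return [w[0] * a + w[1] * b for trial in trials for a, b in trial]

def csp_filter(class1, class2, steps=180):
    """Return the unit filter w maximizing var(class1) / var(class2)."""
    best, best_ratio = None, -1.0
    for k in range(steps):
        theta = math.pi * k / steps
        w = (math.cos(theta), math.sin(theta))
        ratio = variance(apply_filter(class1, w)) / variance(apply_filter(class2, w))
        if ratio > best_ratio:
            best, best_ratio = w, ratio
    return best

if __name__ == "__main__":
    # Class 1 varies mostly on channel 1, class 2 mostly on channel 2.
    c1 = [[(math.sin(t / 3.0) * 2, math.sin(t / 7.0) * 0.2) for t in range(50)]]
    c2 = [[(math.sin(t / 7.0) * 0.2, math.sin(t / 3.0) * 2) for t in range(50)]]
    print(csp_filter(c1, c2))
```

The recovered filter aligns with channel 1, where class 1 has high variance and class 2 low variance. Note this objective only maximizes between-class variance, the first CSP shortcoming the abstract says SFN addresses by also minimizing within-class variances.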

  15. The Importance of Classification to Business Model Research

    OpenAIRE

    Susan Lambert

    2015-01-01

    Purpose: To bring to the fore the scientific significance of classification and its role in business model theory building. To propose a method by which existing classifications of business models can be analyzed and new ones developed. Design/Methodology/Approach: A review of the scholarly literature relevant to classifications of business models is presented along with a brief overview of classification theory applicable to business model research. Existing business model classification...

  16. Computer Aided Design for Soil Classification Relational Database ...

    African Journals Online (AJOL)

    unique firstlady

    engineering, several developers were asked what rules they applied to identify ... classification is actually a part of all good science. As Michalski ... by a large number of soil scientists. .... and use. The calculus relational database processing is.

  17. Challenges and opportunities in integration of design and control

    DEFF Research Database (Denmark)

    Huusom, Jakob Kjøbsted

    2015-01-01

    Process synthesis and design of plant operation are related topics but current industrial practice solves these problems sequentially. The implication of this sequential strategy may result in design of processing systems which are very hard to control. This paper presents a discussion on drivers...... for an integrated approach and outlines the challenges in formulation of such a multi-objective synthesis problem. This discussion is viewed in relation to some of the changing trends in the industry. Significant results have been published which in different ways seek to handle the integrated problem. Further...

  18. Designing and Developing Game-Like Learning Experience in Virtual Worlds: Challenges and Design Decisions of Novice Instructional Designers

    Science.gov (United States)

    Yilmaz, Turkan Karakus; Cagiltay, Kursat

    2016-01-01

    Many virtual worlds have been adopted for implementation within educational settings because they are potentially useful for building effective learning environments. Since the flexibility of virtual worlds makes it challenging to obtain effective and efficient educational outcomes, the design of such platforms needs more attention. In the present study, the…

  19. Classification hierarchies for product data modelling

    NARCIS (Netherlands)

    Pels, H.J.

    2006-01-01

    Abstraction is an essential element in data modelling that appears mainly in one of the following forms: generalisation, classification or aggregation. In the design of complex products, classification hierarchies can be found where product families are viewed as classes of product types, while

  20. Cirse Quality Assurance Document and Standards for Classification of Complications: The Cirse Classification System.

    Science.gov (United States)

    Filippiadis, D K; Binkert, C; Pellerin, O; Hoffmann, R T; Krajina, A; Pereira, P L

    2017-08-01

    Interventional radiology provides a wide variety of vascular, nonvascular, musculoskeletal, and oncologic minimally invasive techniques aimed at therapy or palliation of a broad spectrum of pathologic conditions. Outcome data for these techniques are globally evaluated by hospitals, insurance companies, and government agencies targeting a high-quality health care policy, including reimbursement strategies. To analyze the outcome of a technique effectively, accurate reporting of complications is necessary. Throughout the literature, numerous systems for the grading and classification of complications have been reported. Until now, there has been no method for the uniform reporting of complications in terms of both definition and grading. The purpose of this CIRSE guideline is to provide a classification system of complications based on combining outcome and severity of sequelae. The ultimate challenge will be the adoption of this system by practitioners in different countries and health economies within the European Union and beyond.

  1. Classification of schizophrenia patients based on resting-state functional network connectivity

    Directory of Open Access Journals (Sweden)

    Mohammad Reza Arbabshirani

    2013-07-01

    Full Text Available There is a growing interest in automatic classification of mental disorders based on neuroimaging data. Small training data sets (subjects) and very large amounts of high-dimensional data make it a challenging task to design robust and accurate classifiers for heterogeneous disorders such as schizophrenia. Most previous studies considered structural MRI, diffusion tensor imaging and task-based fMRI for this purpose. However, resting-state data has rarely been used to discriminate schizophrenia patients from healthy controls. Resting data are of great interest, since they are relatively easy to collect and are not confounded by behavioral performance on a task. Several linear and non-linear classification methods were trained using a training dataset and evaluated with a separate testing dataset. Results show that classification with high accuracy is achievable using simple non-linear discriminative methods such as k-nearest neighbors, which is very promising. We compare and report detailed results of each classifier as well as statistical analysis and evaluation of each single feature. To our knowledge, our results represent the first use of resting-state functional network connectivity features to classify schizophrenia.
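A minimal k-nearest-neighbours classifier of the kind this study found effective might look as follows. The tiny synthetic feature vectors stand in for resting-state functional network connectivity features and are purely illustrative.

```python
# Sketch: majority-vote k-NN on small feature vectors.

def knn_predict(train, query, k=3):
    """train: list of (feature_vector, label) pairs. Returns the majority
    label among the k training points closest to `query` (squared
    Euclidean distance)."""
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(x, query)), y) for x, y in train
    )
    votes = {}
    for _, y in dists[:k]:
        votes[y] = votes.get(y, 0) + 1
    return max(votes, key=votes.get)

if __name__ == "__main__":
    # Hypothetical 2-D connectivity features for two groups.
    train = [
        ((0.9, 0.8), "control"), ((0.8, 0.9), "control"), ((0.85, 0.75), "control"),
        ((0.2, 0.3), "patient"), ((0.3, 0.1), "patient"), ((0.25, 0.2), "patient"),
    ]
    print(knn_predict(train, (0.82, 0.85)))  # control
```

With so few training subjects, a simple non-parametric method like this avoids fitting many parameters, which is consistent with the abstract's point about small training sets and high-dimensional data.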

  2. Design and Evaluation of Smart Glasses for Food Intake and Physical Activity Classification.

    Science.gov (United States)

    Chung, Jungman; Oh, Wonjoon; Baek, Dongyoub; Ryu, Sunwoong; Lee, Won Gu; Bang, Hyunwoo

    2018-02-14

    This study presents a series of protocols of designing and manufacturing a glasses-type wearable device that detects the patterns of temporalis muscle activities during food intake and other physical activities. We fabricated a 3D-printed frame of the glasses and a load cell-integrated printed circuit board (PCB) module inserted in both hinges of the frame. The module was used to acquire the force signals, and transmit them wirelessly. These procedures provide the system with higher mobility, which can be evaluated in practical wearing conditions such as walking and waggling. A performance of the classification is also evaluated by distinguishing the patterns of food intake from those physical activities. A series of algorithms were used to preprocess the signals, generate feature vectors, and recognize the patterns of several featured activities (chewing and winking), and other physical activities (sedentary rest, talking, and walking). The results showed that the average F1 score of the classification among the featured activities was 91.4%. We believe this approach can be potentially useful for automatic and objective monitoring of ingestive behaviors with higher accuracy as practical means to treat ingestive problems.
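The average F1 score reported above combines per-class precision and recall across the recognized activities. A minimal macro-averaged F1 computation might look like this; the activity labels in the example are illustrative, not the study's data.

```python
# Sketch: macro-averaged F1 from true and predicted activity labels.

def macro_f1(true, pred):
    """Unweighted mean of per-class F1 scores over all observed labels."""
    labels = set(true) | set(pred)
    scores = []
    for c in labels:
        tp = sum(1 for t, p in zip(true, pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(true, pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(true, pred) if t == c and p != c)
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        scores.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
    return sum(scores) / len(scores)

if __name__ == "__main__":
    true = ["chew", "chew", "wink", "walk"]
    pred = ["chew", "wink", "wink", "walk"]
    print(round(macro_f1(true, pred), 3))  # 0.778
```

Macro-averaging weights every activity class equally, so a rare activity (e.g. winking) affects the score as much as a frequent one, which matters when activity classes are imbalanced.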

  3. Deep Recurrent Neural Networks for Supernovae Classification

    Science.gov (United States)

    Charnock, Tom; Moss, Adam

    2017-03-01

    We apply deep recurrent neural networks, which are capable of learning complex sequential information, to classify supernovae (code available at https://github.com/adammoss/supernovae). The observational time and filter fluxes are used as inputs to the network, but since the inputs are agnostic, additional data such as host galaxy information can also be included. Using the Supernovae Photometric Classification Challenge (SPCC) data, we find that deep networks are capable of learning about light curves, however the performance of the network is highly sensitive to the amount of training data. For a training size of 50% of the representational SPCC data set (around 10^4 supernovae) we obtain a type-Ia versus non-type-Ia classification accuracy of 94.7%, an area under the Receiver Operating Characteristic curve AUC of 0.986 and an SPCC figure-of-merit F1 = 0.64. When using only the data for the early-epoch challenge defined by the SPCC, we achieve a classification accuracy of 93.1%, AUC of 0.977, and F1 = 0.58, results almost as good as with the whole light curve. By employing bidirectional neural networks, we can acquire impressive classification results between supernovae types I, II and III at an accuracy of 90.4% and AUC of 0.974. We also apply a pre-trained model to obtain classification probabilities as a function of time and show that it can give early indications of supernovae type. Our method is competitive with existing algorithms and has applications for future large-scale photometric surveys.

  4. Classification of research reactors and discussion of thinking of safety regulation based on the classification

    International Nuclear Information System (INIS)

    Song Chenxiu; Zhu Lixin

    2013-01-01

    Research reactors differ widely in reactor type, use, power level, design principle, operation model and safety performance, and they also differ significantly with respect to nuclear safety regulation. This paper introduces a classification of research reactors and discusses thinking on safety regulation based on that classification. (authors)

  5. Bread crumb classification using fractal and multifractal features

    OpenAIRE

    Baravalle, Rodrigo Guillermo; Delrieux, Claudio Augusto; Gómez, Juan Carlos

    2017-01-01

    Adequate image descriptors are fundamental in image classification and object recognition. The main requirements for image features are robustness and low dimensionality, which would lead to low classification errors in a variety of situations and with a reasonable computational cost. In this context, the identification of materials poses a significant challenge, since typical (geometric and/or differential) feature extraction methods are not robust enough. Texture features based on Fourier or wav...

  6. Some Challenges in the Design of Human-Automation Interaction for Safety-Critical Systems

    Science.gov (United States)

    Feary, Michael S.; Roth, Emilie

    2014-01-01

    Increasing amounts of automation are being introduced to safety-critical domains. While the introduction of automation has led to an overall increase in reliability and improved safety, it has also introduced a class of failure modes, and new challenges in risk assessment for the new systems, particularly in the assessment of rare events resulting from complex inter-related factors. Designing successful human-automation systems is challenging, and the challenges go beyond good interface development (e.g., Roth, Malin, & Schreckenghost 1997; Christoffersen & Woods, 2002). Human-automation design is particularly challenging when the underlying automation technology generates behavior that is difficult for the user to anticipate or understand. These challenges have been recognized in several safety-critical domains, and have resulted in increased efforts to develop training, procedures, regulations and guidance material (CAST, 2008, IAEA, 2001, FAA, 2013, ICAO, 2012). This paper points to the continuing need for new methods to describe and characterize the operational environment within which new automation concepts are being presented. We will describe challenges to the successful development and evaluation of human-automation systems in safety-critical domains, and describe some approaches that could be used to address these challenges. We will draw from experience with the aviation, spaceflight and nuclear power domains.

  7. WFIRST: Microlensing Analysis Data Challenge

    Science.gov (United States)

    Street, Rachel; WFIRST Microlensing Science Investigation Team

    2018-01-01

    WFIRST will produce thousands of high-cadence, high-photometric-precision lightcurves of microlensing events, from which a wealth of planetary and stellar systems will be discovered. However, the analysis of such lightcurves has historically been very time consuming and expensive in both labor and computing facilities. This poses a potential bottleneck to deriving the full science potential of the WFIRST mission. To address this problem, the WFIRST Microlensing Science Investigation Team is designing a series of data challenges to stimulate research on outstanding problems of microlensing analysis. These range from the classification and modeling of triple lens events to methods to efficiently yet thoroughly search a high-dimensional parameter space for the best-fitting models.

  8. Research of design challenges and new technologies for floating LNG

    Directory of Open Access Journals (Sweden)

    Dong-Hyun Lee

    2014-06-01

    Full Text Available With the rate of worldwide LNG demand expected to grow faster than that of gas demand, most major oil companies are currently investing their resources to develop floating LNG-FLNG (i.e. LNG FSRU and LNG FPSO. The global Floating LNG (FLNG market trend will be reviewed based on demand and supply chain relationships. Typical technical issues associated with FLNG design are categorized in terms of global performance evaluation. Although many proven technologies developed through LNG carrier and oil FPSO projects are available for FLNG design, we are still faced with several technical challenges to clear for successful FLNG projects. In this study, some of the challenges encountered during development of the floating LNG facility (i.e. LNG FPSO and FSRU will be reviewed together with their investigated solution. At the same time, research of new LNG-related technologies such as combined containment system will be presented.

  9. Challenges and Limitations of Applying an Emotion-driven Design Approach on Elderly Users

    DEFF Research Database (Denmark)

    Andersen, Casper L.; Gudmundsson, Hjalte P.; Achiche, Sofiane

    2011-01-01

    a competitive advantage for companies. In this paper, challenges of applying an emotion-driven design approach applied on elderly people, in order to identify their user needs towards walking frames, are discussed. The discussion will be based on the experiences and results obtained from the case study...... related to the participants’ age and cognitive abilities. The challenges encountered are discussed and guidelines on what should be taken into account to facilitate an emotion-driven design approach for elderly people are proposed....

  10. Design Challenges Encountered in a Propulsion-Controlled Aircraft Flight Test Program

    Science.gov (United States)

    Maine, Trindel; Burken, John; Burcham, Frank; Schaefer, Peter

    1994-01-01

    The NASA Dryden Flight Research Center conducted flight tests of a propulsion-controlled aircraft system on an F-15 airplane. This system was designed to explore the feasibility of providing safe emergency landing capability using only the engines to provide flight control in the event of a catastrophic loss of conventional flight controls. Control laws were designed to control the flightpath and bank angle using only commands to the throttles. Although the program was highly successful, this paper highlights some of the challenges associated with using engine thrust as a control effector. These challenges include slow engine response time, poorly modeled nonlinear engine dynamics, unmodeled inlet-airframe interactions, and difficulties with ground effect and gust rejection. Flight and simulation data illustrate these difficulties.

  11. Design and Fabrication Challenges for Millimeter-Scale Three-Dimensional Phononic Crystals

    Directory of Open Access Journals (Sweden)

    Frieder Lucklum

    2017-11-01

    Full Text Available While phononic crystals can be theoretically modeled with a variety of analytical and numerical methods, the practical realization and comprehensive characterization of complex designs is often challenging. This is especially important for the nearly limitless possibilities of periodic, three-dimensional structures. In this contribution, we take a look at these design and fabrication challenges of different 3D phononic elements based on recent research using additive manufacturing. Different fabrication technologies introduce specific limitations in terms of, e.g., material choices, minimum feature size, aspect ratios, or support requirements that have to be taken into account during design and theoretical modeling. We discuss advantages and disadvantages of additive technologies suitable for millimeter and sub-millimeter feature sizes. Furthermore, we present comprehensive experimental characterization of finite, simple cubic lattices in terms of wave polarization and propagation direction to demonstrate the substantial differences between a complete phononic band gap and application-oriented directional band gaps of selected propagation modes.

  12. Construction of a knowledge classification scheme for sharing and usage of knowledge

    International Nuclear Information System (INIS)

    Yoo, Jae Bok; Oh, Jeong Hoon; Lee, Ji Ho; Ko, Young Chul

    2003-12-01

    To efficiently share knowledge among our members on the basis of a knowledge management system, we first need to systematically design a knowledge classification scheme by which this knowledge can be classified well. The objective of this project is to construct the knowledge classification scheme best suited for sharing knowledge across the Korea Atomic Energy Research Institute (KAERI). To construct a knowledge classification scheme covering the whole organization, we established a few design principles and examined many related classification schemes. We then carried out three steps to complete the scheme: 1) designing a draft of the knowledge classification scheme, 2) revising the draft, and 3) verifying the revised scheme and deciding on its final form. The completed scheme consists of 218 items in total: 8 sections, 43 classes, and 167 sub-classes. We expect that the knowledge classification scheme designed in this project will play an important role as the framework for sharing knowledge among our members when we introduce a knowledge management system in our organization. In addition, we expect that both the scheme itself and the methods used to design it can be applied when designing knowledge classification schemes at other R&D institutes and enterprises

  13. Electronic health records challenges in design and implementation

    CERN Document Server

    Sittig, Dean F

    2013-01-01

    This book provides an overview of the challenges in electronic health records (EHR) design and implementation along with an introduction to the best practices that have been identified over the past several years. The book examines concerns surrounding EHR use and proposes eight examples of proper EHR use. It discusses the complex strategic planning that accompanies the systemic organizational changes associated with EHR programs and highlights key lessons learned regarding health information-including technology errors and risk management concerns.

  14. Nanomedical device and systems design challenges, possibilities, visions

    CERN Document Server

    2014-01-01

    Nanomedical Device and Systems Design: Challenges, Possibilities, Visions serves as a preliminary guide toward the inspiration of specific investigative pathways that may lead to meaningful discourse and significant advances in nanomedicine/nanotechnology. This volume considers the potential of future innovations that will involve nanomedical devices and systems. It endeavors to explore remarkable possibilities spanning medical diagnostics, therapeutics, and other advancements that may be enabled within this discipline. In particular, this book investigates just how nanomedical diagnostic and

  15. Discriminative Hierarchical K-Means Tree for Large-Scale Image Classification.

    Science.gov (United States)

    Chen, Shizhi; Yang, Xiaodong; Tian, Yingli

    2015-09-01

    A key challenge in large-scale image classification is how to achieve efficiency in terms of both computation and memory without compromising classification accuracy. The learning-based classifiers achieve the state-of-the-art accuracies, but have been criticized for the computational complexity that grows linearly with the number of classes. The nonparametric nearest neighbor (NN)-based classifiers naturally handle large numbers of categories, but incur prohibitively expensive computation and memory costs. In this brief, we present a novel classification scheme, i.e., discriminative hierarchical K-means tree (D-HKTree), which combines the advantages of both learning-based and NN-based classifiers. The complexity of the D-HKTree only grows sublinearly with the number of categories, which is much better than the recent hierarchical support vector machines-based methods. The memory requirement is an order of magnitude less than that of the recent Naïve Bayesian NN-based approaches. The proposed D-HKTree classification scheme is evaluated on several challenging benchmark databases and achieves state-of-the-art accuracies with significantly lower computation cost and memory requirements.
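The sublinear lookup the abstract describes can be illustrated with a toy hierarchical k-means tree. The sketch below is not the D-HKTree itself (it omits the discriminative weighting entirely, uses k = 2 on one-dimensional points, and does plain Lloyd iterations); it only shows why a greedy descent through cluster centroids visits far fewer items than a full scan. All data and names are invented.

```python
# Toy hierarchical 2-means tree: recursively split the data into two
# clusters, then answer nearest-neighbour queries by greedy descent.

def kmeans2(points, iters=10):
    """Split points into two clusters; returns (centroids, clusters)."""
    c0, c1 = min(points), max(points)
    for _ in range(iters):
        a = [p for p in points if abs(p - c0) <= abs(p - c1)]
        b = [p for p in points if abs(p - c0) > abs(p - c1)]
        if not a or not b:          # degenerate split
            break
        c0, c1 = sum(a) / len(a), sum(b) / len(b)
    return (c0, c1), (a, b)

def build(points, leaf_size=2):
    if len(points) <= leaf_size:
        return ("leaf", points)
    (c0, c1), (a, b) = kmeans2(points)
    if not a or not b:              # all points identical: stop recursing
        return ("leaf", points)
    return ("node", c0, c1, build(a, leaf_size), build(b, leaf_size))

def nearest(tree, q):
    """Greedy descent: cost grows with tree depth, not dataset size."""
    while tree[0] == "node":
        _, c0, c1, left, right = tree
        tree = left if abs(q - c0) <= abs(q - c1) else right
    return min(tree[1], key=lambda p: abs(p - q))  # scan one small leaf

data = [1.0, 1.5, 2.0, 10.0, 10.5, 11.0, 20.0, 20.5]
tree = build(data)
print(nearest(tree, 10.2))  # -> 10.0, found without scanning all points
```

Each query compares against two centroids per level and then scans a single small leaf; the price of the sublinear cost is that the greedy descent is only approximate for queries near cluster boundaries.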

  16. New Course Design: Classification Schemes and Information Architecture.

    Science.gov (United States)

    Weinberg, Bella Hass

    2002-01-01

    Describes a course developed at St. John's University (New York) in the Division of Library and Information Science that relates traditional classification schemes to information architecture and Web sites. Highlights include functional aspects of information architecture, that is, the way content is structured; assignments; student reactions; and…

  17. Component design challenges for the ground-based SP-100 nuclear assembly test

    International Nuclear Information System (INIS)

    Markley, R.A.; Disney, R.K.; Brown, G.B.

    1989-01-01

    The SP-100 ground engineering system (GES) program involves a ground test of the nuclear subsystems to demonstrate their design. The GES nuclear assembly test (NAT) will be performed in a simulated space environment within a vessel maintained at ultrahigh vacuum. The NAT employs a radiation shielding system that is comprised of both prototypical and nonprototypical shield subsystems to attenuate the reactor radiation leakage and also nonprototypical heat transport subsystems to remove the heat generated by the reactor. The reactor is cooled by liquid lithium, which will operate at temperatures prototypical of the flight system. In designing the components for these systems, a number of design challenges were encountered in meeting the operational requirements of the simulated space environment (and where necessary, prototypical requirements) while also accommodating the restrictions of a ground-based test facility with its limited available space. This paper presents a discussion of the design challenges associated with the radiation shield subsystem components and key components of the heat transport systems

  18. Packet Classification by Multilevel Cutting of the Classification Space: An Algorithmic-Architectural Solution for IP Packet Classification in Next Generation Networks

    Directory of Open Access Journals (Sweden)

    Motasem Aldiab

    2008-01-01

    Full Text Available Traditionally, the Internet provides only a “best-effort” service, treating all packets going to the same destination equally. However, providing differentiated services for different users based on their quality requirements is increasingly becoming a demanding issue. For this, routers need to have the capability to distinguish and isolate traffic belonging to different flows. This ability to determine the flow each packet belongs to is called packet classification. Technology vendors are reluctant to support algorithmic solutions for classification due to their nondeterministic performance. Although content addressable memories (CAMs) are favoured by technology vendors due to their deterministic high lookup rates, they suffer from the problems of high power consumption and high silicon cost. This paper provides a new algorithmic-architectural solution for packet classification that mixes CAMs with algorithms based on multilevel cutting of the classification space into smaller spaces. The provided solution utilizes the geometrical distribution of rules in the classification space. It provides the deterministic performance of CAMs, support for dynamic updates, and added flexibility for system designers.
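The idea of multilevel cutting of the classification space can be sketched with a toy decision tree over two packet-header fields. This is an illustrative reconstruction in the spirit of cutting-based classifiers, not the paper's actual algorithm or its CAM integration; the rule table, 8-bit field widths, and alternating-halving heuristic are all made up.

```python
# Toy packet classifier: recursively cut the 2-D (src, dst) space in
# half, storing in each leaf only the rules whose boxes reach it.

# A rule: (priority, (src_lo, src_hi), (dst_lo, dst_hi), action)
RULES = [
    (0, (0, 127), (0, 255), "deny"),
    (1, (0, 255), (0, 63), "qos"),
    (2, (0, 255), (0, 255), "accept"),  # default catch-all rule
]

LEAF_SIZE = 2  # stop cutting once a node holds this many rules or fewer

def overlaps(rule, box):
    (_, (slo, shi), (dlo, dhi), _) = rule
    (bslo, bshi, bdlo, bdhi) = box
    return slo <= bshi and shi >= bslo and dlo <= bdhi and dhi >= bdlo

def build(rules, box, depth=0):
    if len(rules) <= LEAF_SIZE or depth > 8:
        return ("leaf", rules)
    bslo, bshi, bdlo, bdhi = box
    if depth % 2 == 0:  # alternate which dimension is cut in half
        mid = (bslo + bshi) // 2
        lbox, rbox = (bslo, mid, bdlo, bdhi), (mid + 1, bshi, bdlo, bdhi)
    else:
        mid = (bdlo + bdhi) // 2
        lbox, rbox = (bslo, bshi, bdlo, mid), (bslo, bshi, mid + 1, bdhi)
    left = [r for r in rules if overlaps(r, lbox)]
    right = [r for r in rules if overlaps(r, rbox)]
    return ("node", depth % 2, mid,
            build(left, lbox, depth + 1), build(right, rbox, depth + 1))

def classify(tree, src, dst):
    while tree[0] == "node":
        _, dim, mid, left, right = tree
        key = src if dim == 0 else dst
        tree = left if key <= mid else right
    # linear scan of the few rules in the leaf; lowest priority wins
    matching = [r for r in tree[1]
                if r[1][0] <= src <= r[1][1] and r[2][0] <= dst <= r[2][1]]
    return min(matching)[3] if matching else None

tree = build(RULES, (0, 255, 0, 255))
print(classify(tree, 10, 200))  # -> "deny"  (rule 0 wins in its leaf)
print(classify(tree, 200, 30))  # -> "qos"   (only rules 1 and 2 reach here)
```

Each cut halves one dimension of the current box, so a lookup inspects only the few rules whose boxes reach the leaf containing the packet, instead of scanning the whole rule table.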

  19. A Classification Model and an Open E-Learning System Based on Intuitionistic Fuzzy Sets for Instructional Design Concepts

    Science.gov (United States)

    Güyer, Tolga; Aydogdu, Seyhmus

    2016-01-01

    This study suggests a classification model and an e-learning system based on this model for all instructional theories, approaches, models, strategies, methods, and techniques being used in the process of instructional design that constitute a direct or indirect resource for educational technology based on the theory of intuitionistic fuzzy sets…

  20. Hierarchical classification of dynamically varying radar pulse repetition interval modulation patterns.

    Science.gov (United States)

    Kauppi, Jukka-Pekka; Martikainen, Kalle; Ruotsalainen, Ulla

    2010-12-01

    The central purpose of passive signal intercept receivers is to perform automatic categorization of unknown radar signals. Currently, there is an urgent need to develop intelligent classification algorithms for these devices due to emerging complexity of radar waveforms. Especially multifunction radars (MFRs) capable of performing several simultaneous tasks by utilizing complex, dynamically varying scheduled waveforms are a major challenge for automatic pattern classification systems. To assist recognition of complex radar emissions in modern intercept receivers, we have developed a novel method to recognize dynamically varying pulse repetition interval (PRI) modulation patterns emitted by MFRs. We use robust feature extraction and classifier design techniques to assist recognition in unpredictable real-world signal environments. We classify received pulse trains hierarchically which allows unambiguous detection of the subpatterns using a sliding window. Accuracy, robustness and reliability of the technique are demonstrated with extensive simulations using both static and dynamically varying PRI modulation patterns. Copyright © 2010 Elsevier Ltd. All rights reserved.

  1. Quantum Ensemble Classification: A Sampling-Based Learning Control Approach.

    Science.gov (United States)

    Chen, Chunlin; Dong, Daoyi; Qi, Bo; Petersen, Ian R; Rabitz, Herschel

    2017-06-01

    Quantum ensemble classification (QEC) has significant applications in discrimination of atoms (or molecules), separation of isotopes, and quantum information extraction. However, quantum mechanics forbids deterministic discrimination among nonorthogonal states. The classification of inhomogeneous quantum ensembles is very challenging, since there exist variations in the parameters characterizing the members within different classes. In this paper, we recast QEC as a supervised quantum learning problem. A systematic classification methodology is presented by using a sampling-based learning control (SLC) approach for quantum discrimination. The classification task is accomplished via simultaneously steering members belonging to different classes to their corresponding target states (e.g., mutually orthogonal states). First, a new discrimination method is proposed for two similar quantum systems. Then, an SLC method is presented for QEC. Numerical results demonstrate the effectiveness of the proposed approach for the binary classification of two-level quantum ensembles and the multiclass classification of multilevel quantum ensembles.

  2. Modern classification of neoplasms: reconciling differences between morphologic and molecular approaches

    International Nuclear Information System (INIS)

    Berman, Jules

    2005-01-01

    For over 150 years, pathologists have relied on histomorphology to classify and diagnose neoplasms. Their success has been stunning, permitting the accurate diagnosis of thousands of different types of neoplasms using only a microscope and a trained eye. In the past two decades, cancer genomics has challenged the supremacy of histomorphology by identifying genetic alterations shared by morphologically diverse tumors and by finding genetic features that distinguish subgroups of morphologically homogeneous tumors. The Developmental Lineage Classification and Taxonomy of Neoplasms groups neoplasms by their embryologic origin. The putative value of this classification is based on the expectation that tumors of a common developmental lineage will share common metabolic pathways and common responses to drugs that target these pathways. The purpose of this manuscript is to show that grouping tumors according to their developmental lineage can reconcile certain fundamental discrepancies resulting from morphologic and molecular approaches to neoplasm classification. In this study, six issues in tumor classification are described that exemplify the growing rift between morphologic and molecular approaches to tumor classification: 1) the morphologic separation between epithelial and non-epithelial tumors; 2) the grouping of tumors based on shared cellular functions; 3) the distinction between germ cell tumors and pluripotent tumors of non-germ cell origin; 4) the distinction between tumors that have lost their differentiation and tumors that arise from uncommitted stem cells; 5) the molecular properties shared by morphologically disparate tumors that have a common developmental lineage, and 6) the problem of re-classifying morphologically identical but clinically distinct subsets of tumors. The discussion of these issues in the context of describing different methods of tumor classification is intended to underscore the clinical value of a robust tumor classification.

  3. Classification guide: Sochi 2014 Paralympic Winter Games

    OpenAIRE

    2014-01-01

    The Sochi 2014 Paralympic Winter Games classification guide is designed to provide National Paralympic Committees (NPCs) and International Federations (IFs) with information about the classification policies and procedures that will apply to the Sochi 2014 Paralympic Winter Games.

  4. Computer Aided Design for Soil Classification Relational Database ...

    African Journals Online (AJOL)

    The paper focuses on the problems associated with classification, storage, and retrieval of information on soil data, such as the incompatibility of soil data semantics, inadequate documentation, and lack of indexing; hence it is difficult to efficiently access large databases. Consequently, information on soil is very difficult ...

  5. Hybrid Societies: Challenges and Perspectives in the Design of Collective Behavior in Self-organizing Systems

    Directory of Open Access Journals (Sweden)

    Heiko eHamann

    2016-04-01

    Full Text Available Hybrid societies are self-organizing, collective systems composed of different components, for example, natural and artificial parts (bio-hybrid) or human beings interacting with and through technical systems (socio-technical). Many different disciplines investigate methods and systems closely related to the design of hybrid societies. A stronger collaboration between these disciplines could allow for re-use of methods and create significant synergies. We identify three main areas of challenges in the design of self-organizing hybrid societies. First, we identify the formalization challenge. There is an urgent need for a generic model that allows a description and comparison of collective hybrid societies. Second, we identify the system design challenge. Starting from the formal specification of the system, we need to develop an integrated design process. Third, we identify the challenge of interdisciplinarity. Current research on self-organizing hybrid societies stretches over many different fields and hence requires the re-use and synthesis of methods at intersections between disciplines. We then conclude by presenting our perspective for future approaches with high potential in this area.

  6. Near infrared spectroscopy is suitable for the classification of hazelnuts according to Protected Designation of Origin.

    Science.gov (United States)

    Moscetti, Roberto; Radicetti, Emanuele; Monarca, Danilo; Cecchini, Massimo; Massantini, Riccardo

    2015-10-01

    This study investigates the possibility of using near infrared spectroscopy for the authentication of the 'Nocciola Romana' hazelnut (Corylus avellana L. cvs Tonda Gentile Romana and Nocchione) as a Protected Designation of Origin (PDO) hazelnut from central Italy. Algorithms for the selection of the optimal pretreatments were tested in combination with the following discriminant routines: k-nearest neighbour, soft independent modelling of class analogy, partial least squares discriminant analysis and support vector machine discriminant analysis. The best results were obtained using a support vector machine discriminant analysis routine. Thus, classification performance rates with specificities, sensitivities and accuracies as high as 96.0%, 95.0% and 95.5%, respectively, were achieved. Various pretreatments, such as standard normal variate, mean centring and a Savitzky-Golay filter with seven smoothing points, were used. The optimal wavelengths for classification were mainly correlated with lipids, although some contribution from minor constituents, such as proteins and carbohydrates, was also observed. Near infrared spectroscopy could classify hazelnuts according to the PDO 'Nocciola Romana' designation. Thus, the experimentation lays the foundations for a rapid, online authentication system for hazelnuts. However, model robustness should be improved taking into account agro-pedo-climatic growing conditions. © 2014 Society of Chemical Industry.
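Of the pretreatments listed, the standard normal variate (SNV) transform is simple enough to sketch: each spectrum is centred on its own mean and scaled by its own standard deviation, which removes per-sample baseline and gain (scatter) effects before the discriminant step. The numbers below are invented, not NIR data.

```python
# Minimal SNV pretreatment: per-spectrum standardization.
import math

def snv(spectrum):
    n = len(spectrum)
    mean = sum(spectrum) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in spectrum) / (n - 1))
    return [(x - mean) / sd for x in spectrum]

raw = [0.10, 0.30, 0.20, 0.40]
scattered = [2 * x + 0.5 for x in raw]  # same shape, shifted baseline and gain
a, b = snv(raw), snv(scattered)
# SNV removes the additive/multiplicative scatter: both come out identical.
print(all(abs(x - y) < 1e-9 for x, y in zip(a, b)))  # -> True
```

Because SNV is invariant to any positive affine rescaling of a spectrum, two samples that differ only in scatter map to the same pretreated vector, so the classifier sees shape, not baseline.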

  7. [New International Classification of Chronic Pancreatitis (M-ANNHEIM multifactor classification system, 2007): principles, merits, and demerits].

    Science.gov (United States)

    Tsimmerman, Ia S

    2008-01-01

    The new International Classification of Chronic Pancreatitis (designated as M-ANNHEIM) proposed by a group of German specialists in late 2007 is reviewed. All its sections are subjected to analysis (risk group categories, clinical stages and phases, variants of clinical course, diagnostic criteria for "established" and "suspected" pancreatitis, instrumental methods and functional tests used in the diagnosis, evaluation of the severity of the disease using a scoring system, stages of elimination of pain syndrome). The new classification is compared with the earlier classification proposed by the author. Its merits and demerits are discussed.

  8. Challenges to the Use of Artificial Neural Networks for Diagnostic Classifications with Student Test Data

    Science.gov (United States)

    Briggs, Derek C.; Circi, Ruhan

    2017-01-01

    Artificial Neural Networks (ANNs) have been proposed as a promising approach for the classification of students into different levels of a psychological attribute hierarchy. Unfortunately, because such classifications typically rely upon internally produced item response patterns that have not been externally validated, the instability of ANN…

  9. Integrating medical, assistive, and universally designed products and technologies: assistive technology device classification (ATDC).

    Science.gov (United States)

    Bauer, Stephen; Elsaesser, Linda-Jeanne

    2012-09-01

    ISO26000:2010 International Guidance Standard on Organizational Social Responsibility requires that effective organizational performance recognize social responsibility, including the rights of persons with disabilities (PWD), engage stakeholders and contribute to sustainable development. Millennium Development Goals 2010 notes that the most vulnerable people require special attention, while the World Report on Disability 2011 identifies improved data collection and removal of barriers to rehabilitation as the means to empower PWD. The Assistive Technology Device Classification (ATDC), Assistive Technology Service Method (ATSM) and Matching Person and Technology models provide an evidence-based, standardized, internationally comparable framework to improve data collection and rehabilitation interventions. The ATDC and ATSM encompass and support universal design (UD) principles, and use the language and concepts of the International Classification of Functioning, Disability and Health (ICF). Use ATDC and ICF concepts to differentiate medical, assistive and UD products and technology; relate technology "types" to markets and costs; and support provision of UD products and technologies as sustainable and socially responsible behavior. Supply-side and demand-side incentives are suggested to foster private sector development and commercialization of UD products and technologies. Health and health-related professionals should be knowledgeable of UD principles and interventions.

  10. Impacts of Sample Design for Validation Data on the Accuracy of Feedforward Neural Network Classification

    Directory of Open Access Journals (Sweden)

    Giles M. Foody

    2017-08-01

    Full Text Available Validation data are often used to evaluate the performance of a trained neural network and used in the selection of a network deemed optimal for the task at hand. Optimality is commonly assessed with a measure, such as overall classification accuracy. The latter is often calculated directly from a confusion matrix showing the counts of cases in the validation set with particular labelling properties. The sample design used to form the validation set can, however, influence the estimated magnitude of the accuracy. Commonly, the validation set is formed with a stratified sample to give balanced classes, but also via random sampling, which reflects class abundance. It is suggested that if the ultimate aim is to accurately classify a dataset in which the classes do vary in abundance, a validation set formed via random, rather than stratified, sampling is preferred. This is illustrated with the classification of simulated and remotely-sensed datasets. With both datasets, statistically significant differences in the accuracy with which the data could be classified arose from the use of validation sets formed via random and stratified sampling (z = 2.7 and 1.9 for the simulated and real datasets, respectively, both at p < 0.05). The accuracy of the classifications that used a stratified sample in validation was smaller, a result of cases of an abundant class being commissioned into a rarer class. Simple means to address the issue are suggested.
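The abstract's core point can be reproduced with two lines of arithmetic: holding per-class accuracies fixed, a validation set drawn in proportion to class abundance estimates a different overall accuracy than a balanced (stratified) one. The class proportions and per-class accuracies below are invented for illustration.

```python
# Expected overall accuracy under two validation-set sample designs.

abundance = {"abundant": 0.9, "rare": 0.1}        # true class proportions
per_class_acc = {"abundant": 0.95, "rare": 0.60}  # classifier accuracy by class

# Random sampling weights classes by abundance, matching the accuracy
# that would actually be observed on the full dataset.
random_estimate = sum(abundance[c] * per_class_acc[c] for c in abundance)

# Stratified (balanced) sampling weights every class equally.
stratified_estimate = sum(per_class_acc.values()) / len(per_class_acc)

print(round(random_estimate, 3))      # -> 0.915
print(round(stratified_estimate, 3))  # -> 0.775
```

Because the rare class is classified less accurately, equal weighting drags the stratified estimate down, mirroring the paper's finding that accuracies computed from stratified validation samples were smaller.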

  11. Integrating Human and Machine Intelligence in Galaxy Morphology Classification Tasks

    Science.gov (United States)

    Beck, Melanie Renee

    The large flood of data flowing from observatories presents significant challenges to astronomy and cosmology--challenges that will only be magnified by projects currently under development. Growth in both volume and velocity of astrophysics data is accelerating: whereas the Sloan Digital Sky Survey (SDSS) has produced 60 terabytes of data in the last decade, the upcoming Large Synoptic Survey Telescope (LSST) plans to register 30 terabytes per night starting in the year 2020. Additionally, the Euclid Mission will acquire imaging for 5 x 10^7 resolvable galaxies. The field of galaxy evolution faces a particularly challenging future as complete understanding often cannot be reached without analysis of detailed morphological galaxy features. Historically, morphological analysis has relied on visual classification by astronomers, accessing the human brain's capacity for advanced pattern recognition. However, this accurate but inefficient method falters when confronted with many thousands (or millions) of images. In the SDSS era, efforts to automate morphological classifications of galaxies (e.g., Conselice et al., 2000; Lotz et al., 2004) are reasonably successful and can distinguish between elliptical and disk-dominated galaxies with accuracies of 80%. While this is statistically very useful, a key problem with these methods is that they often cannot say which 80% of their samples are accurate. Furthermore, when confronted with the more complex task of identifying key substructure within galaxies, automated classification algorithms begin to fail. The Galaxy Zoo project uses a highly innovative approach to solving the scalability problem of visual classification. Displaying images of SDSS galaxies to volunteers via a simple and engaging web interface, www.galaxyzoo.org asks people to classify images by eye. Within the first year hundreds of thousands of members of the general public had classified each of the 1 million SDSS galaxies an average of 40 times.

  12. Automatic Parallelization Tool: Classification of Program Code for Parallel Computing

    Directory of Open Access Journals (Sweden)

    Mustafa Basthikodi

    2016-04-01

    Full Text Available Performance growth of single-core processors came to a halt in the past decade but was re-enabled by the introduction of parallelism in processors. Multicore frameworks, along with graphical processing units, have broadly enhanced parallelism. Several compilers have been updated to handle the resulting challenges of synchronization and threading. Appropriate program and algorithm classifications will greatly benefit software engineers seeking opportunities for effective parallelization. In the present work we investigated current species-based approaches to the classification of algorithms; related work on classification is discussed along with a comparison of the issues that challenge classification. A set of algorithms was chosen whose structure matches the different issues and performs the given tasks. We tested these algorithms using existing automatic species extraction tools along with the Bones compiler. We added functionalities to the existing tool, providing a more detailed characterization. The contributions of our work include support for pointer arithmetic, conditional and incremental statements, user-defined types, constants, and mathematical functions. With this, we can retain significant data which is not captured by the original species of algorithms. We implemented the new theories in the tool, enabling automatic characterization of program code.

  13. Typecasting catchments: Classification, directionality, and the pursuit of universality

    Science.gov (United States)

    Smith, Tyler; Marshall, Lucy; McGlynn, Brian

    2018-02-01

    Catchment classification poses a significant challenge to hydrology and hydrologic modeling, restricting widespread transfer of knowledge from well-studied sites. The identification of important physical, climatological, or hydrologic attributes (to varying degrees depending on application/data availability) has traditionally been the focus for catchment classification. Classification approaches are regularly assessed with regard to their ability to provide suitable hydrologic predictions - commonly by transferring fitted hydrologic parameters from a data-rich catchment to a data-poor catchment deemed similar by the classification. While such approaches to hydrology's grand challenges are intuitive, they often ignore the most uncertain aspect of the process - the model itself. We explore catchment classification and parameter transferability and the concept of universal donor/acceptor catchments. We identify the implications of the assumption that the transfer of parameters between "similar" catchments is reciprocal (i.e., non-directional). These concepts are considered through three case studies situated across multiple gradients that include model complexity, process description, and site characteristics. Case study results highlight that some catchments are more successfully used as donor catchments and others are better suited as acceptor catchments. These results were observed for both black-box and process consistent hydrologic models, as well as for differing levels of catchment similarity. Therefore, we suggest that similarity does not adequately satisfy the underlying assumptions being made in parameter regionalization approaches regardless of model appropriateness. Furthermore, we suggest that the directionality of parameter transfer is an important factor in determining the success of parameter regionalization approaches.

  14. An operational framework for object-based land use classification of heterogeneous rural landscapes

    DEFF Research Database (Denmark)

    Watmough, Gary Richard; Palm, Cheryl; Sullivan, Clare

    2017-01-01

    The characteristics of very high resolution (VHR) satellite data are encouraging development agencies to investigate its use in monitoring and evaluation programmes. VHR data pose challenges for land use classification of heterogeneous rural landscapes as it is not possible to develop generalised and transferable land use classification definitions and algorithms. We present an operational framework for classifying VHR satellite data in heterogeneous rural landscapes using an object-based and random forest classifier. The framework overcomes the challenges of classifying VHR data in anthropogenic

  15. Indexing Density Models for Incremental Learning and Anytime Classification on Data Streams

    DEFF Research Database (Denmark)

    Seidl, Thomas; Assent, Ira; Kranen, Philipp

    2009-01-01

    Classification of streaming data faces three basic challenges: it has to deal with huge amounts of data, the varying time between two stream data items must be used as well as possible (anytime classification), and additional training data must be incrementally learned (anytime learning) for applying...... to the individual object to be classified) a hierarchy of mixture densities that represent kernel density estimators at successively coarser levels. Our probability density queries together with novel classification improvement strategies provide the necessary information for very effective classification at any...... point of interruption. Moreover, we propose a novel evaluation method for anytime classification using Poisson streams and demonstrate the anytime learning performance of the Bayes tree....
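    The anytime idea - a density-based Bayes classifier whose answer is available after any prefix of the work - can be sketched with a flat stand-in for the paper's Bayes tree: training points are split into coarse-to-fine chunks, and classification may stop after any number of chunks. The chunking scheme and bandwidth below are invented for illustration; the actual method maintains a hierarchy of mixture densities.

```python
import numpy as np

def gauss_kernel_sum(x, centers, h):
    # summed Gaussian kernels centered on the training points, evaluated at x
    return np.exp(-0.5 * ((x - centers) / h) ** 2).sum()

class AnytimeKDEClassifier:
    """Flat sketch of anytime density-based classification: training points are
    stored in coarse-to-fine chunks, and classify() may be interrupted after
    any number of chunks (the 'budget')."""
    def __init__(self, data_by_class, h=0.5, levels=4):
        self.h, self.levels = h, levels
        self.chunks = {c: np.array_split(np.asarray(pts), levels)
                       for c, pts in data_by_class.items()}

    def classify(self, x, budget):
        budget = min(budget, self.levels)
        scores = {}
        for c, chunks in self.chunks.items():
            pts = np.concatenate(chunks[:budget])
            scores[c] = gauss_kernel_sum(x, pts, self.h) / len(pts)
        return max(scores, key=scores.get)

rng = np.random.default_rng(1)
clf = AnytimeKDEClassifier({
    "A": rng.normal(-2.0, 0.5, 200),
    "B": rng.normal(+2.0, 0.5, 200),
})
coarse = clf.classify(1.8, budget=1)   # quick answer from the first chunk only
fine = clf.classify(1.8, budget=4)     # refined answer using all training data
```

    The point of the anytime property is that `coarse` is already a usable decision; a larger budget only sharpens the density estimates behind it.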

  16. Augmented reality as a design tool for mobile interfaces

    DEFF Research Database (Denmark)

    Bertelsen, Olav Wedege; Nielsen, Christina

    2000-01-01

    applications derived from the classification of augmented reality interfaces. The focus on physical interaction with objects of work and with the mobile device provides us with a range of interaction styles, based on e.g. gestures and manipulation of objects. Furthermore, issues of transparency and directness......This paper challenges user interface paradigms for mobile devices, by using the technical classification of augmented reality interfaces as a thinking tool to develop ideas for interaction with mobile devices. The paper presents future work scenarios from a wastewater treatment plant embodying PDA...... are addressed. The future scenarios indicate that the concepts of augmented reality support solving context problems in mobile design....

  17. Classification system on the selection of number of implants and superstructure design on the basis of available vertical restorative space and interforaminal distance for implant supported mandibular overdenture

    Directory of Open Access Journals (Sweden)

    Akshay Bhargava

    2016-01-01

    Full Text Available Purpose: The rehabilitation of the edentulous mandible is a challenge due to various limiting factors, of which the available vertical restorative space (AVRS) has been well understood in the literature. However, other anatomic variations such as arch form, arch size, and the interforaminal distance (IFD) (due to the presence of the mandibular nerve) influence the selection of the size and position of implants, and thereby the prosthetic design. Materials and Method: In the present study, 30 edentulous patients from a group of 300 edentulous patients, representing all three jaw relations (Class I, II, and III), were evaluated for designing a classification that could help in a comprehensive treatment plan for the edentulous mandible. Dental panoramic radiographs of each individual with a trial or final prosthesis were made. The horizontal IFD and AVRS values were calculated. Results: One-way analysis of variance followed by post-hoc multiple comparisons (Bonferroni method, with P < 0.05 considered significant) showed an overall mean of 38.9 mm for the horizontal distance and 13.69 mm for the AVRS in the 30 edentulous patients. Conclusion: The results showed that in the majority of cases (90%) there is insufficient space to place a bar attachment supported by five implants for mandibular overdentures. This suggests that a universal treatment plan cannot be followed due to varying anatomic factors. Hence, it becomes imperative to have a set of clinical guidelines, based on the AVRS and IFD, for the selection of implant number and type of attachment. The article proposes a simple classification system based on the AVRS and IFD for establishing guidelines in the treatment planning of the edentulous mandible, to aid in the selection of implant size, number, and position along with the associated prosthetic design.

  18. Music classification with MPEG-7

    Science.gov (United States)

    Crysandt, Holger; Wellhausen, Jens

    2003-01-01

    Driven by the increasing amount of music available electronically, the need for and feasibility of automatic music classification systems become more and more important. Currently most search engines for music are based on textual descriptions such as artist and/or title. This paper presents a system for automatic music description, classification and visualization for a set of songs. The system is designed to extract significant features of a piece of music in order to find songs of a similar genre or with similar sound characteristics. The description is done with the help of MPEG-7 only. The classification and visualization are done with the self-organizing map algorithm.
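    The visualization step rests on the self-organizing map (SOM), which the abstract names explicitly. A minimal NumPy SOM is easy to sketch; the three-dimensional "song descriptors" below are invented stand-ins for MPEG-7 audio features, and the two clusters play the role of two genres.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_som(data, grid=(6, 6), iters=2000, lr0=0.5, sigma0=2.0):
    # minimal self-organizing map: a grid of weight vectors pulled toward inputs
    h, w = grid
    weights = rng.normal(0, 0.1, size=(h * w, data.shape[1]))
    coords = np.array([(i, j) for i in range(h) for j in range(w)], float)
    for step in range(iters):
        x = data[rng.integers(len(data))]
        frac = step / iters
        lr, sigma = lr0 * (1 - frac), sigma0 * (1 - frac) + 0.5
        bmu = np.argmin(((weights - x) ** 2).sum(axis=1))   # best matching unit
        dist2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
        nbh = np.exp(-dist2 / (2 * sigma ** 2))             # neighborhood kernel
        weights += lr * nbh[:, None] * (x - weights)
    return weights, coords

# hypothetical per-song descriptors for two "genres"
genre1 = rng.normal([0.2, 0.8, 0.3], 0.05, size=(40, 3))
genre2 = rng.normal([0.9, 0.1, 0.7], 0.05, size=(40, 3))
songs = np.vstack([genre1, genre2])
weights, coords = train_som(songs)

def bmu_of(x):
    # map unit whose weight vector is closest to the descriptor x
    return np.argmin(((weights - x) ** 2).sum(axis=1))
```

    After training, songs with similar descriptors land on nearby map units, which is what makes the SOM usable as a 2-D visualization of a music collection.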

  19. Reliability of Oronasal Fistula Classification.

    Science.gov (United States)

    Sitzman, Thomas J; Allori, Alexander C; Matic, Damir B; Beals, Stephen P; Fisher, David M; Samson, Thomas D; Marcus, Jeffrey R; Tse, Raymond W

    2018-01-01

    Objective Oronasal fistula is an important complication of cleft palate repair that is frequently used to evaluate surgical quality, yet reliability of fistula classification has never been examined. The objective of this study was to determine the reliability of oronasal fistula classification both within individual surgeons and between multiple surgeons. Design Using intraoral photographs of children with repaired cleft palate, surgeons rated the location of palatal fistulae using the Pittsburgh Fistula Classification System. Intrarater and interrater reliability scores were calculated for each region of the palate. Participants Eight cleft surgeons rated photographs obtained from 29 children. Results Within individual surgeons reliability for each region of the Pittsburgh classification ranged from moderate to almost perfect (κ = .60-.96). By contrast, reliability between surgeons was lower, ranging from fair to substantial (κ = .23-.70). Between-surgeon reliability was lowest for the junction of the soft and hard palates (κ = .23). Within-surgeon and between-surgeon reliability were almost perfect for the more general classification of fistula in the secondary palate (κ = .95 and κ = .83, respectively). Conclusions This is the first reliability study of fistula classification. We show that the Pittsburgh Fistula Classification System is reliable when used by an individual surgeon, but less reliable when used among multiple surgeons. Comparisons of fistula occurrence among surgeons may be subject to less bias if they use the more general classification of "presence or absence of fistula of the secondary palate" rather than the Pittsburgh Fistula Classification System.
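    The κ values reported in this abstract are chance-corrected agreement statistics. For two raters, Cohen's kappa can be computed in a few lines; the region codes below are fabricated for illustration, not the study's data.

```python
import numpy as np

def cohens_kappa(r1, r2):
    # chance-corrected agreement between two raters' categorical labels
    r1, r2 = np.asarray(r1), np.asarray(r2)
    categories = np.union1d(r1, r2)
    p_obs = np.mean(r1 == r2)                               # observed agreement
    p_chance = sum(np.mean(r1 == c) * np.mean(r2 == c)      # agreement by chance
                   for c in categories)
    return (p_obs - p_chance) / (1 - p_chance)

# fabricated region codes assigned to 12 photographs by two raters
surgeon1 = [1, 1, 2, 3, 3, 3, 4, 4, 5, 5, 6, 7]
surgeon2 = [1, 1, 2, 3, 2, 3, 4, 5, 5, 5, 6, 7]
kappa = cohens_kappa(surgeon1, surgeon2)
```

    Here the raters agree on 10 of 12 photographs, and after correcting for chance agreement κ ≈ 0.80; identical ratings give κ = 1, which is why within-surgeon κ can exceed between-surgeon κ as the study reports.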

  20. Managing the Cigeo design: a challenge and an opportunity

    International Nuclear Information System (INIS)

    Muscetti, R.

    2015-01-01

    INGEROP has been working for years on the design of the French deep geological repository for Andra, carrying out since 2012 the preliminary design, the global project management and the technical integration for the engineering of the Cigeo project (in a 50-50 consortium with the French company TECHNIP). The article presents some particular organizational aspects that turned out to be more challenging than foreseen in the course of our activities. Starting from real examples, some lessons learned are derived, as well as practices of interest for solving analogous issues in similar projects, with a focus on their application to the management of the engineering phase of geological repositories and other 'megaprojects' in different countries. (author)

  1. STRONG LENS TIME DELAY CHALLENGE. I. EXPERIMENTAL DESIGN

    Energy Technology Data Exchange (ETDEWEB)

    Dobler, Gregory [Kavli Institute for Theoretical Physics, University of California Santa Barbara, Santa Barbara, CA 93106 (United States); Fassnacht, Christopher D.; Rumbaugh, Nicholas [Department of Physics, University of California, 1 Shields Avenue, Davis, CA 95616 (United States); Treu, Tommaso; Liao, Kai [Department of Physics, University of California, Santa Barbara, CA 93106 (United States); Marshall, Phil [Kavli Institute for Particle Astrophysics and Cosmology, P.O. Box 20450, MS29, Stanford, CA 94309 (United States); Hojjati, Alireza [Department of Physics and Astronomy, University of British Columbia, 6224 Agricultural Road, Vancouver, B.C. V6T 1Z1 (Canada); Linder, Eric, E-mail: tt@astro.ucla.edu [Lawrence Berkeley National Laboratory and University of California, Berkeley, CA 94720 (United States)

    2015-02-01

    The time delays between point-like images in gravitational lens systems can be used to measure cosmological parameters. The number of lenses with measured time delays is growing rapidly; the upcoming Large Synoptic Survey Telescope (LSST) will monitor ∼10³ strongly lensed quasars. In an effort to assess the community's present capabilities to accurately measure time delays, and to provide input to dedicated monitoring campaigns and future LSST cosmology feasibility studies, we have invited the community to take part in a "Time Delay Challenge" (TDC). The challenge is organized as a set of "ladders", each containing a group of simulated data sets to be analyzed blindly by participating teams. Each rung on a ladder consists of a set of realistic mock observed lensed quasar light curves, with the rungs' data sets increasing in complexity and realism. The initial challenge described here has two ladders, TDC0 and TDC1. TDC0 has a small number of data sets, and is designed to be used as a practice set by the participating teams. The (non-mandatory) deadline for completion of TDC0 was the TDC1 launch date, 2013 December 1. The TDC1 deadline was 2014 July 1. Here we give an overview of the challenge, introduce a set of metrics that will be used to quantify the goodness of fit, efficiency, precision, and accuracy of the algorithms, and present the results of TDC0. Thirteen teams participated in TDC0 using 47 different methods. Seven of those teams qualified for TDC1, which is described in the companion paper.
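    The core measurement task in the challenge - recovering a time delay from two noisy light curves - can be illustrated with a brute-force cross-correlation on synthetic curves. The sinusoidal "source" and the 12-day delay are invented stand-ins; real quasar variability is stochastic and the challenge data (gaps, seasons, microlensing) are far less forgiving.

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(0.0, 400.0, 1.0)                  # daily sampling over ~400 days

def source(time):
    # smooth stand-in for intrinsic quasar variability (sum of sinusoids)
    return (np.sin(2 * np.pi * time / 53.0)
            + 0.6 * np.sin(2 * np.pi * time / 17.0 + 1.0)
            + 0.3 * np.sin(2 * np.pi * time / 7.0 + 2.0))

true_delay = 12.0                               # image B trails image A by 12 days
img_a = source(t) + rng.normal(0, 0.05, t.size)
img_b = 0.8 * source(t - true_delay) + rng.normal(0, 0.05, t.size)

def corr_at(lag):
    # correlation after shifting B back by `lag` days (lag > 0: B trails A)
    n = t.size
    if lag >= 0:
        a, b = img_a[:n - lag], img_b[lag:]
    else:
        a, b = img_a[-lag:], img_b[:n + lag]
    return np.corrcoef(a, b)[0, 1]

lags = np.arange(-30, 31)
best = lags[np.argmax([corr_at(lag) for lag in lags])]
```

    With dense, even sampling the correlation peaks at the true 12-day lag; the challenge metrics (precision, accuracy, efficiency) quantify how well methods do when these idealizations are removed.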

  2. New Challenges for Design Participation in the Era of Ubiquitous Computing

    DEFF Research Database (Denmark)

    Brereton, Margot; Buur, Jacob

    2008-01-01

    Since the advent of participatory design in the work democracy projects of the 1970s and 1980s in Scandinavia, computing technology and people's engagement with it have undergone fundamental changes. Although participatory design continues to be a precondition for designing computing that aligns...... with human practices, the motivations to engage in participatory design have changed, and the new era requires formats that are different from the original ones. Through the analysis of three case studies this paper seeks to explain why participatory design must be brought to bear on the field of ubiquitous...... computing, and how this challenges the original participatory design thinking. In particular we will argue that more casual, exploratory formats of engagement with people are required, and rather than planning the all-encompassing systems development project, participatory design needs to move towards...

  3. PHOTOMETRIC SUPERNOVA CLASSIFICATION WITH MACHINE LEARNING

    Energy Technology Data Exchange (ETDEWEB)

    Lochner, Michelle; Peiris, Hiranya V.; Lahav, Ofer; Winter, Max K. [Department of Physics and Astronomy, University College London, Gower Street, London WC1E 6BT (United Kingdom); McEwen, Jason D., E-mail: dr.michelle.lochner@gmail.com [Mullard Space Science Laboratory, University College London, Surrey RH5 6NT (United Kingdom)

    2016-08-01

    Automated photometric supernova classification has become an active area of research in recent years in light of current and upcoming imaging surveys such as the Dark Energy Survey (DES) and the Large Synoptic Survey Telescope, given that spectroscopic confirmation of type for all supernovae discovered will be impossible. Here, we develop a multi-faceted classification pipeline, combining existing and new approaches. Our pipeline consists of two stages: extracting descriptive features from the light curves and classification using a machine learning algorithm. Our feature extraction methods vary from model-dependent techniques, namely SALT2 fits, to more independent techniques that fit parametric models to curves, to a completely model-independent wavelet approach. We cover a range of representative machine learning algorithms, including naive Bayes, k-nearest neighbors, support vector machines, artificial neural networks, and boosted decision trees (BDTs). We test the pipeline on simulated multi-band DES light curves from the Supernova Photometric Classification Challenge. Using the commonly used area under the curve (AUC) of the Receiver Operating Characteristic as a metric, we find that the SALT2 fits and the wavelet approach, with the BDTs algorithm, each achieve an AUC of 0.98, where 1 represents perfect classification. We find that a representative training set is essential for good classification, whatever the feature set or algorithm, with implications for spectroscopic follow-up. Importantly, we find that by using either the SALT2 or the wavelet feature sets with a BDT algorithm, accurate classification is possible purely from light curve data, without the need for any redshift information.
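    The two-stage pipeline (feature extraction, then a machine-learning classifier scored by AUC) can be sketched end to end with scikit-learn. The light-curve shape, the three hand-made features and the class parameters below are all invented for illustration - they are not SALT2 or wavelet features - but the structure mirrors the paper's approach of feeding light-curve features to a boosted-tree classifier.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
t = np.linspace(-20, 80, 40)                    # observation epochs in days

def bazin(time, amp, rise, fall):
    # simple analytic transient shape (a stand-in, not a SALT2 model)
    return amp * np.exp(-time / fall) / (1 + np.exp(-time / rise))

def features(flux):
    peak = flux.max()
    width = (flux > 0.5 * peak).sum()           # epochs above half maximum
    decline = (peak - flux[-1]) / peak          # fractional fall-off by the end
    return [peak, width, decline]

def make_set(n, rise, fall, label):
    X, y = [], []
    for _ in range(n):
        f = bazin(t, rng.uniform(0.8, 1.2),
                  rise * rng.uniform(0.9, 1.1), fall * rng.uniform(0.9, 1.1))
        f += rng.normal(0, 0.02, t.size)        # photometric noise
        X.append(features(f))
        y.append(label)
    return X, y

X1, y1 = make_set(200, 5.0, 18.0, 0)            # fast-declining class
X2, y2 = make_set(200, 8.0, 45.0, 1)            # slow-declining class
X, y = np.array(X1 + X2), np.array(y1 + y2)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)
bdt = GradientBoostingClassifier(random_state=0).fit(Xtr, ytr)
auc = roc_auc_score(yte, bdt.predict_proba(Xte)[:, 1])
```

    Swapping in a different feature extractor or classifier only changes the first or last few lines, which is exactly what makes the pipeline "multi-faceted".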

  4. PHOTOMETRIC SUPERNOVA CLASSIFICATION WITH MACHINE LEARNING

    International Nuclear Information System (INIS)

    Lochner, Michelle; Peiris, Hiranya V.; Lahav, Ofer; Winter, Max K.; McEwen, Jason D.

    2016-01-01

    Automated photometric supernova classification has become an active area of research in recent years in light of current and upcoming imaging surveys such as the Dark Energy Survey (DES) and the Large Synoptic Survey Telescope, given that spectroscopic confirmation of type for all supernovae discovered will be impossible. Here, we develop a multi-faceted classification pipeline, combining existing and new approaches. Our pipeline consists of two stages: extracting descriptive features from the light curves and classification using a machine learning algorithm. Our feature extraction methods vary from model-dependent techniques, namely SALT2 fits, to more independent techniques that fit parametric models to curves, to a completely model-independent wavelet approach. We cover a range of representative machine learning algorithms, including naive Bayes, k-nearest neighbors, support vector machines, artificial neural networks, and boosted decision trees (BDTs). We test the pipeline on simulated multi-band DES light curves from the Supernova Photometric Classification Challenge. Using the commonly used area under the curve (AUC) of the Receiver Operating Characteristic as a metric, we find that the SALT2 fits and the wavelet approach, with the BDTs algorithm, each achieve an AUC of 0.98, where 1 represents perfect classification. We find that a representative training set is essential for good classification, whatever the feature set or algorithm, with implications for spectroscopic follow-up. Importantly, we find that by using either the SALT2 or the wavelet feature sets with a BDT algorithm, accurate classification is possible purely from light curve data, without the need for any redshift information.

  5. EPR design features to mitigate severe accident challenges

    International Nuclear Information System (INIS)

    Mazurkiewicz, S.M.; Fischer, M.; Bittermann, D.

    2005-01-01

    The EPR, an evolutionary pressurized water reactor (PWR), is a 4300-4500 MWth design that incorporates proven technology within an optimized configuration to enhance safety. The EPR was originally developed through a joint effort between Framatome ANP and Siemens by incorporating the best technological features from the French and German nuclear reactor fleets into a cost-competitive product. Commercial EPR units are currently being built in Finland at the Olkiluoto site, and planned for France at the Flamanville site. In recent months, Framatome ANP announced its intention to market EPR units to China in response to a request for vendor bids, as well as its intent to pursue design certification in the United States under 10CFR52. The EPR safety philosophy is based on a deterministic consideration of defense-in-depth complemented by probabilistic analyses. Not only is the EPR designed to prevent and mitigate design basis accidents (DBAs), it employs an extra level of safety associated with severe accident response. Therefore, as a design objective, features are included to ensure that radiological consequences are limited such that the need for stringent countermeasures, such as evacuation and relocation of the nearby population, can be reasonably excluded. This paper discusses some of the innovative features of the EPR to address severe accident challenges. (author)

  6. NASA Engineering Design Challenges: Environmental Control and Life Support Systems. Water Filtration Challenge. EG-2008-09-134-MSFC

    Science.gov (United States)

    Schneider, Twila, Ed.

    2010-01-01

    This educator guide is organized into seven chapters: (1) Overview; (2) The Design Challenge; (3) Connections to National Curriculum Standards; (4) Preparing to Teach; (5) Classroom Sessions; (6) Opportunities for Extension; and (7) Teacher Resources. Chapter 1 provides information about Environmental Control and Life Support Systems used on NASA…

  7. Designing multiple ligands - medicinal chemistry strategies and challenges.

    Science.gov (United States)

    Morphy, Richard; Rankovic, Zoran

    2009-01-01

    It has been widely recognised in recent years that parallel modulation of multiple biological targets can be beneficial for the treatment of diseases with complex etiologies such as cancer, asthma, and psychiatric disease. In this article, current strategies for the generation of ligands with a specific multi-target profile (designed multiple ligands, or DMLs) are described and a number of illustrative examples are given. Designing multiple ligands is frequently a challenging endeavour for medicinal chemists, who need to appropriately balance affinity for two or more targets whilst obtaining physicochemical and pharmacokinetic properties that are consistent with the administration of an oral drug. Given that the properties of DMLs are influenced to a large extent by the proteomic superfamily to which the targets belong and the lead generation strategy that is pursued, an early assessment of the feasibility of any given DML project is essential.

  8. Classification of Error-Diffused Halftone Images Based on Spectral Regression Kernel Discriminant Analysis

    Directory of Open Access Journals (Sweden)

    Zhigao Zeng

    2016-01-01

    Full Text Available This paper proposes a novel algorithm to solve the challenging problem of classifying error-diffused halftone images. We first design the class feature matrices, after extracting the image patches according to their statistical characteristics, to classify the error-diffused halftone images. Then, spectral regression kernel discriminant analysis is used for feature dimension reduction. The error-diffused halftone images are finally classified using an approach similar to the nearest centroid classifier. As demonstrated by the experimental results, our method is fast and can achieve a high classification accuracy rate, with an added benefit of robustness in tackling noise.
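    The final step - assigning an image to the class whose feature centroid is nearest - is simple enough to sketch directly in NumPy. The six-dimensional "class feature" vectors below are placeholders, not the paper's class feature matrices or its dimension-reduced features.

```python
import numpy as np

class NearestCentroid:
    # assign each sample to the class whose feature-space centroid is closest
    def fit(self, X, y):
        self.classes = np.unique(y)
        self.centroids = np.array([X[y == c].mean(axis=0) for c in self.classes])
        return self

    def predict(self, X):
        # squared Euclidean distance from every sample to every centroid
        d2 = ((X[:, None, :] - self.centroids[None, :, :]) ** 2).sum(axis=2)
        return self.classes[np.argmin(d2, axis=1)]

rng = np.random.default_rng(5)
# hypothetical 6-D feature vectors for three error-diffusion kernels
X = np.vstack([rng.normal(m, 0.2, size=(50, 6)) for m in (0.0, 1.0, 2.0)])
y = np.repeat([0, 1, 2], 50)
clf = NearestCentroid().fit(X, y)
accuracy = (clf.predict(X) == y).mean()
```

    A centroid rule is attractive here because it is cheap at prediction time, which matches the paper's emphasis on speed.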

  9. Interaction design challenges and solutions for ALMA operations monitoring and control

    Science.gov (United States)

    Pietriga, Emmanuel; Cubaud, Pierre; Schwarz, Joseph; Primet, Romain; Schilling, Marcus; Barkats, Denis; Barrios, Emilio; Vila Vilaro, Baltasar

    2012-09-01

    The ALMA radio-telescope, currently under construction in northern Chile, is a very advanced instrument that presents numerous challenges. From a software perspective, one critical issue is the design of graphical user interfaces for operations monitoring and control that scale to the complexity of the system and to the massive amounts of data users are faced with. Early experience operating the telescope with only a few antennas has shown that conventional user interface technologies are not adequate in this context. They consume too much screen real-estate, require many unnecessary interactions to access relevant information, and fail to provide operators and astronomers with a clear mental map of the instrument. They increase extraneous cognitive load, impeding tasks that call for quick diagnosis and action. To address this challenge, the ALMA software division adopted a user-centered design approach. For the last two years, astronomers, operators, software engineers and human-computer interaction researchers have been involved in participatory design workshops, with the aim of designing better user interfaces based on state-of-the-art visualization techniques. This paper describes the process that led to the development of those interface components and to a proposal for the science and operations console setup: brainstorming sessions, rapid prototyping, joint implementation work involving software engineers and human-computer interaction researchers, feedback collection from a broader range of users, further iterations and testing.

  10. Renewing Theories, Methods and Design Practices: Challenges for Architectural Education

    Directory of Open Access Journals (Sweden)

    Andri Yatmo Yandi

    2018-01-01

    Full Text Available Architectural education should promote the advancement of knowledge that is necessary as the basis for the development of excellent design practice. Architectural education needs to respond appropriately to current issues in society. To find its way into society in an appropriate way, architecture needs to be liquid. The ability to address the liquidity of architecture requires an educational approach that promotes the ability to work with a range of design methods and approaches. Several principles form the basis for developing architectural education that could strengthen its position within society: to promote knowledge-based design practice; to embrace a variety of design methods and approaches; to keep a balance between design knowledge and design skills; and at the same time to aim for mastery and excellence in design. These principles should be the basis for defining and developing the curriculum and the process of design learning in architectural education. The main challenge, then, is our willingness to be liquid in developing architectural education, which needs continuous renewal and updating to respond to the changing context of knowledge, technology and society.

  11. New guidelines for dam safety classification

    International Nuclear Information System (INIS)

    Dascal, O.

    1999-01-01

    Elements are outlined of recommended new guidelines for the safety classification of dams. Arguments are provided for the view that more than one classification system is required, as follows: (a) classification for the selection of design criteria, operation procedures and emergency measures plans, based on the potential consequences of a dam failure - the hazard classification of water retaining structures; (b) classification for the establishment of surveillance activities and for the safety evaluation of dams, based on the probability and consequences of failure - the risk classification of water retaining structures; and (c) classification for the establishment of water management plans, for the safety evaluation of the entire project, for the preparation of emergency measures plans, for the definition of the frequency and extent of maintenance operations, and for the evaluation of changes and modifications required - the hazard classification of the project. The hazard classification of the dam considers, as consequences, mainly the loss of lives or persons in jeopardy and the property damage to third parties. The difficulty in determining the risk classification of the dam lies in the fact that no tool exists to evaluate the probability of the dam's failure. To overcome this, the probability of failure can be replaced by a set of dam characteristics that express the failure potential of the dam and its foundation. The hazard classification of the entire project is based on the probable consequences of dam failure influencing loss of life, persons in jeopardy, and property and environmental damage. The classification scheme is illustrated for dam-threatening events such as earthquakes and floods. 17 refs., 5 tabs

  12. CLASSIFICATION OF THE MGR WASTE HANDLING BUILDING VENTILATION SYSTEM

    International Nuclear Information System (INIS)

    J.A. Ziegler

    2000-01-01

    The purpose of this analysis is to document the Quality Assurance (QA) classification of the Monitored Geologic Repository (MGR) waste handling building ventilation system structures, systems and components (SSCs) performed by the MGR Preclosure Safety and Systems Engineering Section. This analysis also provides the basis for revision of YMP/90-55Q, Q-List (YMP 2000). The Q-List identifies those MGR SSCs subject to the requirements of DOE/RW-0333P, "Quality Assurance Requirements and Description" (QARD) (DOE 2000). This QA classification incorporates the current MGR design and the results of the "Design Basis Event Frequency and Dose Calculation for Site Recommendation" (CRWMS M&O 2000a) and "Bounding Individual Category 1 Design Basis Event Dose Calculation to Support Quality Assurance Classification" (Gwyn 2000).

  13. A review of classification algorithms for EEG-based brain-computer interfaces: a 10 year update.

    Science.gov (United States)

    Lotte, F; Bougrain, L; Cichocki, A; Clerc, M; Congedo, M; Rakotomamonjy, A; Yger, F

    2018-06-01

    Most current electroencephalography (EEG)-based brain-computer interfaces (BCIs) are based on machine learning algorithms. There is a large diversity of classifier types that are used in this field, as described in our 2007 review paper. Now, approximately ten years after this review publication, many new algorithms have been developed and tested to classify EEG signals in BCIs. The time is therefore ripe for an updated review of EEG classification algorithms for BCIs. We surveyed the BCI and machine learning literature from 2007 to 2017 to identify the new classification approaches that have been investigated to design BCIs. We synthesize these studies in order to present such algorithms, to report how they were used for BCIs, what were the outcomes, and to identify their pros and cons. We found that the recently designed classification algorithms for EEG-based BCIs can be divided into four main categories: adaptive classifiers, matrix and tensor classifiers, transfer learning and deep learning, plus a few other miscellaneous classifiers. Among these, adaptive classifiers were demonstrated to be generally superior to static ones, even with unsupervised adaptation. Transfer learning can also prove useful although the benefits of transfer learning remain unpredictable. Riemannian geometry-based methods have reached state-of-the-art performances on multiple BCI problems and deserve to be explored more thoroughly, along with tensor-based methods. Shrinkage linear discriminant analysis and random forests also appear particularly useful for small training samples settings. On the other hand, deep learning methods have not yet shown convincing improvement over state-of-the-art BCI methods. This paper provides a comprehensive overview of the modern classification algorithms used in EEG-based BCIs, presents the principles of these methods and guidelines on when and how to use them. It also identifies a number of challenges to further advance EEG classification in BCI.
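    One concrete recommendation from this review - shrinkage linear discriminant analysis for small training samples - is a one-line change in scikit-learn. The 64 "band-power features" and 40 training trials below are invented to mimic the small-sample, high-dimensional regime typical of BCI calibration, not any real EEG data set.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
n_features = 64                       # e.g. band-power features across channels
delta = np.zeros(n_features)
delta[:4] = 1.5                       # class difference lives in a few features

def draw_trials(n, sign):
    # synthetic trials for one class, centered at +/- delta/2
    return rng.normal(sign * delta / 2, 1.0, size=(n, n_features))

Xtr = np.vstack([draw_trials(20, -1), draw_trials(20, +1)])   # only 40 trials
ytr = np.repeat([0, 1], 20)
Xte = np.vstack([draw_trials(200, -1), draw_trials(200, +1)])
yte = np.repeat([0, 1], 200)

# shrinkage="auto" (Ledoit-Wolf) regularizes the ill-conditioned covariance
# estimate that plain LDA would form from 40 trials in 64 dimensions
lda = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto").fit(Xtr, ytr)
accuracy = lda.score(Xte, yte)
```

    With fewer trials than features, the sample covariance is singular; shrinkage pulls it toward a well-conditioned target, which is why the review singles this method out for small training sets.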

  14. A review of classification algorithms for EEG-based brain–computer interfaces: a 10 year update

    Science.gov (United States)

    Lotte, F.; Bougrain, L.; Cichocki, A.; Clerc, M.; Congedo, M.; Rakotomamonjy, A.; Yger, F.

    2018-06-01

    Objective. Most current electroencephalography (EEG)-based brain–computer interfaces (BCIs) are based on machine learning algorithms. There is a large diversity of classifier types that are used in this field, as described in our 2007 review paper. Now, approximately ten years after this review publication, many new algorithms have been developed and tested to classify EEG signals in BCIs. The time is therefore ripe for an updated review of EEG classification algorithms for BCIs. Approach. We surveyed the BCI and machine learning literature from 2007 to 2017 to identify the new classification approaches that have been investigated to design BCIs. We synthesize these studies in order to present such algorithms, to report how they were used for BCIs, what were the outcomes, and to identify their pros and cons. Main results. We found that the recently designed classification algorithms for EEG-based BCIs can be divided into four main categories: adaptive classifiers, matrix and tensor classifiers, transfer learning and deep learning, plus a few other miscellaneous classifiers. Among these, adaptive classifiers were demonstrated to be generally superior to static ones, even with unsupervised adaptation. Transfer learning can also prove useful although the benefits of transfer learning remain unpredictable. Riemannian geometry-based methods have reached state-of-the-art performances on multiple BCI problems and deserve to be explored more thoroughly, along with tensor-based methods. Shrinkage linear discriminant analysis and random forests also appear particularly useful for small training samples settings. On the other hand, deep learning methods have not yet shown convincing improvement over state-of-the-art BCI methods. Significance. This paper provides a comprehensive overview of the modern classification algorithms used in EEG-based BCIs, presents the principles of these methods and guidelines on when and how to use them. It also identifies a number of challenges to further advance EEG classification in BCI.

  15. EEG Signal Classification With Super-Dirichlet Mixture Model

    DEFF Research Database (Denmark)

    Ma, Zhanyu; Tan, Zheng-Hua; Prasad, Swati

    2012-01-01

    Classification of the Electroencephalogram (EEG) signal is a challenging task in brain-computer interface systems. The marginalized discrete wavelet transform (mDWT) coefficients extracted from the EEG signals have been frequently used in research since they reveal features related...

  16. Automated Decision Tree Classification of Corneal Shape

    Science.gov (United States)

    Twa, Michael D.; Parthasarathy, Srinivasan; Roberts, Cynthia; Mahmoud, Ashraf M.; Raasch, Thomas W.; Bullimore, Mark A.

    2011-01-01

    Purpose The volume and complexity of data produced during videokeratography examinations present a challenge of interpretation. As a consequence, results are often analyzed qualitatively by subjective pattern recognition or reduced to comparisons of summary indices. We describe the application of decision tree induction, an automated machine learning classification method, to discriminate between normal and keratoconic corneal shapes in an objective and quantitative way. We then compared this method with other known classification methods. Methods The corneal surface was modeled with a seventh-order Zernike polynomial for 132 normal eyes of 92 subjects and 112 eyes of 71 subjects diagnosed with keratoconus. A decision tree classifier was induced using the C4.5 algorithm, and its classification performance was compared with the modified Rabinowitz–McDonnell index, Schwiegerling’s Z3 index (Z3), Keratoconus Prediction Index (KPI), KISA%, and Cone Location and Magnitude Index using recommended classification thresholds for each method. We also evaluated the area under the receiver operator characteristic (ROC) curve for each classification method. Results Our decision tree classifier performed equal to or better than the other classifiers tested: accuracy was 92% and the area under the ROC curve was 0.97. Our decision tree classifier reduced the information needed to distinguish between normal and keratoconus eyes using four of 36 Zernike polynomial coefficients. The four surface features selected as classification attributes by the decision tree method were inferior elevation, greater sagittal depth, oblique toricity, and trefoil. Conclusions Automated decision tree classification of corneal shape through Zernike polynomials is an accurate quantitative method of classification that is interpretable and can be generated from any instrument platform capable of raw elevation data output. This method of pattern classification is extendable to other classification
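    The classification setup this abstract describes - a decision tree over Zernike polynomial coefficients, scored by the area under the ROC curve - can be sketched with scikit-learn. The 36 synthetic coefficients and the four "keratoconus-shifted" terms below are invented stand-ins for the surface features the study identified, not real videokeratography data.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n_coef = 36                                     # 7th-order Zernike expansion

def corneas(n, shift):
    # synthetic coefficient vectors; `shift` moves four hypothetical
    # disease-signature terms for the keratoconic group
    Z = rng.normal(0.0, 1.0, size=(n, n_coef))
    Z[:, [4, 7, 11, 16]] += shift
    return Z

X = np.vstack([corneas(132, 0.0), corneas(112, 2.0)])  # normal vs keratoconic
y = np.r_[np.zeros(132), np.ones(112)]
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0,
                                      stratify=y)
tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(Xtr, ytr)
auc = roc_auc_score(yte, tree.predict_proba(Xte)[:, 1])
```

    A shallow tree is enough here because only a handful of coefficients carry signal - mirroring the study's finding that four of 36 Zernike terms sufficed - and, unlike a black-box score, the learned thresholds can be read directly off the tree.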

  17. Rock suitability classification RSC 2012

    Energy Technology Data Exchange (ETDEWEB)

    McEwen, T. (ed.) [McEwen Consulting, Leicester (United Kingdom); Kapyaho, A. [Geological Survey of Finland, Espoo (Finland); Hella, P. [Saanio and Riekkola, Helsinki (Finland); Aro, S.; Kosunen, P.; Mattila, J.; Pere, T.

    2012-12-15

    This report presents Posiva's Rock Suitability Classification (RSC) system, developed for locating suitable rock volumes for repository design and construction. The RSC system comprises both the revised rock suitability criteria and the procedure for the suitability classification during the construction of the repository. The aim of the classification is to avoid such features of the host rock that may be detrimental to the favourable conditions within the repository, either initially or in the long term. This report also discusses the implications of applying the RSC system for the fulfilment of the regulatory requirements concerning the host rock as a natural barrier and the site's overall suitability for hosting a final repository of spent nuclear fuel.

  18. Rock suitability classification RSC 2012

    International Nuclear Information System (INIS)

    McEwen, T.; Kapyaho, A.; Hella, P.; Aro, S.; Kosunen, P.; Mattila, J.; Pere, T.

    2012-12-01

    This report presents Posiva's Rock Suitability Classification (RSC) system, developed for locating suitable rock volumes for repository design and construction. The RSC system comprises both the revised rock suitability criteria and the procedure for the suitability classification during the construction of the repository. The aim of the classification is to avoid such features of the host rock that may be detrimental to the favourable conditions within the repository, either initially or in the long term. This report also discusses the implications of applying the RSC system for the fulfilment of the regulatory requirements concerning the host rock as a natural barrier and the site's overall suitability for hosting a final repository of spent nuclear fuel.

  19. On the Feature Selection and Classification Based on Information Gain for Document Sentiment Analysis

    Directory of Open Access Journals (Sweden)

    Asriyanti Indah Pratiwi

    2018-01-01

    Full Text Available Sentiment analysis of movie reviews is a need of today's lifestyle. Unfortunately, the enormous number of features makes sentiment analysis slow and less sensitive. Finding the optimal feature selection and classification scheme is still a challenge. In order to handle an enormous number of features and provide better sentiment classification, an information-gain-based feature selection and classification scheme is proposed. The proposed method removes more than 90% of unnecessary features, while the proposed classification scheme achieves 96% accuracy in sentiment classification. From the experimental results, it can be concluded that the combination of the proposed feature selection and classification achieves the best performance so far.
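The information-gain feature selection the abstract refers to can be sketched on a toy word-presence matrix; the vocabulary, documents, and 0.5 gain cut-off are illustrative assumptions, not the paper's setup:

```python
import math
from collections import Counter

def info_gain(feature, labels):
    """Information gain of a binary feature with respect to the class labels."""
    def H(ls):
        n = len(ls)
        return -sum((c / n) * math.log2(c / n) for c in Counter(ls).values())
    gain = H(labels)
    for value in (0, 1):
        subset = [l for f, l in zip(feature, labels) if f == value]
        if subset:
            gain -= len(subset) / len(labels) * H(subset)
    return gain

# Toy movie reviews as word-presence vectors over a hypothetical vocabulary.
vocab = ["great", "boring", "the"]
X = [[1, 0, 1], [1, 0, 0], [0, 1, 1], [0, 1, 1]]  # one row per document
y = ["pos", "pos", "neg", "neg"]

gains = {w: info_gain([row[i] for row in X], y) for i, w in enumerate(vocab)}
selected = [w for w, g in sorted(gains.items(), key=lambda kv: -kv[1])
            if g >= 0.5]
print(selected)  # → ['great', 'boring']  ("the" carries little information)
```

Ranking all features this way and keeping only the informative ones is what allows the method to drop the vast majority of the feature space before classification.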

  20. Naïve and Robust: Class-Conditional Independence in Human Classification Learning

    Science.gov (United States)

    Jarecki, Jana B.; Meder, Björn; Nelson, Jonathan D.

    2018-01-01

    Humans excel in categorization. Yet from a computational standpoint, learning a novel probabilistic classification task involves severe computational challenges. The present paper investigates one way to address these challenges: assuming class-conditional independence of features. This feature independence assumption simplifies the inference…
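Class-conditional independence is exactly the assumption a naive Bayes classifier encodes: the likelihood factorizes as P(x|c) = Π_i P(x_i|c). A self-contained toy sketch with binary features (the data and the Laplace smoothing constant are hypothetical):

```python
def fit(X, y, alpha=1.0):
    """Estimate class priors and per-feature Bernoulli likelihoods, assuming
    class-conditional independence (naive Bayes with Laplace smoothing)."""
    classes = sorted(set(y))
    priors = {c: y.count(c) / len(y) for c in classes}
    likes = {}
    for c in classes:
        rows = [x for x, label in zip(X, y) if label == c]
        for i in range(len(X[0])):
            ones = sum(row[i] for row in rows)
            likes[(c, i)] = (ones + alpha) / (len(rows) + 2 * alpha)
    return classes, priors, likes

def predict(x, classes, priors, likes):
    """Pick the class maximizing P(c) * prod_i P(x_i | c)."""
    def score(c):
        p = priors[c]
        for i, value in enumerate(x):
            q = likes[(c, i)]
            p *= q if value == 1 else 1.0 - q
        return p
    return max(classes, key=score)

# Toy task: two binary features predicting one of two classes.
X = [[1, 1], [1, 0], [0, 1], [0, 0]]
y = ["A", "A", "B", "B"]
model = fit(X, y)
print(predict([1, 1], *model), predict([0, 0], *model))  # → A B
```

The independence assumption reduces the number of parameters to estimate from exponential to linear in the number of features, which is one reason it is computationally attractive for a learner.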

  1. Review of Design Aspects and Challenges of Efficient and Quiet Amphibious Aircraft

    Science.gov (United States)

    Liem, Rhea P., Ph.D.

    2018-04-01

    Apart from the commercial and military aviation sectors, the general aviation (GA) sector is expected to experience rapid growth, especially in Asia. The increasing economic activities in the region will demand more efficient and convenient transportation, which would open the door to more GA services. This development requires sufficient infrastructure support, including airports. However, insufficient land area often limits airport development, so some areas (e.g., remote islands) are not easily accessible by air. One implication is that travel can only be done via land or water, which can prolong travel times. This affects business travel in particular, since the significant increase in business and economic activities demands more efficient and faster mobility. In other cases, rural areas lack well-developed infrastructure, and the geographical terrain is too challenging to build a pad for vertical takeoff and landing (VTOL) air vehicles. Under such circumstances, it is imperative to enable air travel to carry critical logistics such as medical supplies, food, and even sick patients. In this regard, we propose to develop a low-payload, low-altitude amphibious aircraft, which can take off and land on both water and land. Aircraft design is a complex, multidisciplinary procedure, and for an amphibious aircraft the two takeoff and landing modes impose further design challenges. In this paper we present two preliminary design projects, for a two-seater and a ten-seater aircraft. To design an efficient and quiet amphibious aircraft, we conduct experiments on noise-shielding mechanisms to reduce the propeller noise. The challenges and resulting designs are briefly discussed in this paper. Amphibious aircraft development will be very relevant to Indonesia, which is the world’s largest archipelago with

  2. Device reliability challenges for modern semiconductor circuit design – a review

    Directory of Open Access Journals (Sweden)

    C. Schlünder

    2009-05-01

    Full Text Available Product development based on highly integrated semiconductor circuits faces various challenges. To ensure the function of circuits, the electrical parameters of every device must lie within a specific window. This window is restricted by competing mechanisms like process variations and device degradation (Fig. 1). Degradation mechanisms like Negative Bias Temperature Instability (NBTI) or Hot Carrier Injection (HCI) lead to parameter drifts during operation, adding on top of the process variations.

    The safety margin between the real lifetime of MOSFETs and product lifetime requirements decreases in advanced technologies. The assignment of tasks to ensure the product lifetime has to change in the future. Up to now, technology development has had the main responsibility for adjusting the technology processes to achieve the required lifetime. In the future, reliability can no longer be the task of technology development alone. Device degradation becomes a collective challenge for semiconductor technologists, reliability experts, and circuit designers. Reliability issues have to be considered in design as well to achieve reliable and competitive products. For this work, designers require support from smart software tools with built-in reliability know-how. Design for reliability will be one of the key requirements for modern product designs.

    An overview will be given of the physical device damage mechanisms, the operating conditions within circuits that lead to stress, and the impact of the corresponding device parameter degradation on the function of the circuit. Based on this understanding, various approaches to Design for Reliability (DfR) will be described. The function of aging simulators will be explained and the flow of circuit simulation will be described. Furthermore, the difference between full-custom and semi-custom design, and therefore the different approaches required, will be discussed.
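Aging simulators of the kind mentioned above project device parameter drift from compact degradation models. A deliberately simplified sketch using a generic NBTI-style power law, ΔVth = A·t^n, with invented fitting constants (real A and n are process-specific and would be calibrated from stress measurements):

```python
def vth_shift(t_seconds, A=1e-3, n=0.16):
    """Illustrative NBTI power-law drift: delta_Vth = A * t**n, in volts.
    A and n are hypothetical constants, not data for any real process."""
    return A * t_seconds ** n

def lifetime_ok(years, margin_v=0.05):
    """Check that the projected drift stays inside the allowed parameter
    window, i.e. the degradation share of the design margin."""
    t_seconds = years * 365.25 * 24 * 3600
    return vth_shift(t_seconds) < margin_v

# Ten operating years against a hypothetical 50 mV threshold-voltage budget.
print(lifetime_ok(10))  # → True
```

A real aging simulator additionally accounts for temperature, duty cycle, and recovery effects, and back-annotates the drifted parameters into the circuit netlist for re-simulation.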

  3. Smart Industry Research in the Field of HRM : Resetting Job Design as an Example of Upcoming Challenges

    NARCIS (Netherlands)

    Habraken, Milou Maria Petronella; Bondarouk, Tatiana; Bondarouk, Tanya; Ruel, Huub; Parry, Emma

    2017-01-01

    Purpose – This chapter aims to encourage and guide Smart Industry HRM-related research by addressing upcoming challenges developed using a Job Design lens. Methodology/approach – The challenges are constructed based on a developed overview of the existing body of work related to Job Design and a

  4. Design Challenges for a Wide-Aperture Insertion Quadrupole Magnet

    CERN Document Server

    Russenschuck, S; Perez, J C; Ramos, D; Fessia, P; Karppinen, M; Kirby, G; Sahner, T; Schwerg, N

    2011-01-01

    The design and development of a superconducting (Nb-Ti) quadrupole with a 120 mm aperture, for an upgrade of the LHC insertion region, faces challenges arising from the LHC beam optics requirements and the heat deposition. The first triggered extensive studies of coil alternatives with four and six coil-blocks in view of field quality and operating margins. The latter requires more porous insulation schemes for both the cables and the ground-plane. This in turn necessitates extensive heat-propagation and quench-velocity studies, as well as more efficient quench heaters. The engineering design of the magnet includes innovative features such as self-locking collars, which will enable the collaring to be performed with the coils on a horizontal assembly bench, a spring-loaded and collapsible assembly mandrel, tuning-shims for field quality, porous collaring-shoes, and coil end-spacer design based on differential geometry methods. The project also initiated code extensions in the quench-simulation and CAD/CAM module...

  5. MULTI-TEMPORAL CLASSIFICATION AND CHANGE DETECTION USING UAV IMAGES

    Directory of Open Access Journals (Sweden)

    S. Makuti

    2018-05-01

    Full Text Available In this paper, different methodologies for the classification and change detection of UAV image blocks are explored. A UAV is not only the cheapest platform for image acquisition but also the easiest platform to operate in repeated data collections over a changing area like a building construction site. Two change detection techniques have been evaluated in this study: the pre-classification and the post-classification algorithms. These methods are based on three main steps: feature extraction, classification, and change detection. A set of state-of-the-art features has been used in the tests: colour features (HSV), textural features (GLCM), and 3D geometric features. For classification purposes a Conditional Random Field (CRF) has been used: the unary potential was determined using the Random Forest algorithm, while the pairwise potential was defined by the fully connected CRF. In the performed tests, different feature configurations and settings have been considered to assess the performance of these methods in such a challenging task. Experimental results showed that the post-classification approach outperforms the pre-classification change detection method. This was analysed using the overall accuracy, where post-classification reached an accuracy of up to 62.6% and pre-classification change detection an accuracy of 46.5%. These results represent a first useful indication for future works and developments.
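The post-classification workflow that performed best here (classify each epoch independently, then compare the label maps) can be sketched in a few lines; the one-feature threshold classifier and the pixel values are illustrative stand-ins for the paper's Random Forest/CRF pipeline:

```python
def classify(heights, threshold=0.5):
    """Toy per-epoch classifier: label each pixel from a single feature
    (height in metres); a stand-in for the paper's RF/CRF classification."""
    return ["building" if h > threshold else "ground" for h in heights]

def change_map(labels_t1, labels_t2):
    """Post-classification change detection: mark pixels whose class label
    differs between the two independently classified epochs."""
    return [a != b for a, b in zip(labels_t1, labels_t2)]

# Heights of the same five pixels at two survey dates of a construction site.
t1 = [0.1, 0.2, 0.1, 0.0, 0.3]
t2 = [0.1, 2.5, 3.0, 0.0, 0.2]
changes = change_map(classify(t1), classify(t2))
print(changes)  # → [False, True, True, False, False]
```

A pre-classification approach would instead compare the raw features of the two epochs first and classify only the detected differences, which is why classifier errors propagate differently in the two schemes.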

  6. Challenges and perspectives in Service Design curricula. The case of the Service Systems Design Master of Aalborg University in Copenhagen

    DEFF Research Database (Denmark)

    Götzen, Amalia De; Morelli, Nicola; Grani, Francesco

    2014-01-01

    In this paper the new Master program on Service Systems Design at Aalborg University in Copenhagen will be presented, focusing on the challenges of building such a curriculum and on its peculiar approach to Service Design through the Problem Based Learning methodology. All the semesters will be d...

  7. Rule-guided human classification of Volunteered Geographic Information

    Science.gov (United States)

    Ali, Ahmed Loai; Falomir, Zoe; Schmid, Falko; Freksa, Christian

    2017-05-01

    During the last decade, web technologies and location-sensing devices have evolved, generating a form of crowdsourcing known as Volunteered Geographic Information (VGI). VGI has acted as a platform for spatial data collection, in particular when a group of public participants is involved in collaborative mapping activities: they work together to collect, share, and use information about geographic features. VGI exploits participants' local knowledge to produce rich data sources. However, the resulting data suffers from problematic classification. In VGI projects, the challenges of data classification are due to the following: (i) data is prone to subjective classification, (ii) most projects rely on remote contributions and flexible contribution mechanisms, and (iii) spatial data is uncertain and geographic features have non-strict definitions. These factors lead to various forms of problematic classification: inconsistent, incomplete, and imprecise data classification. This research addresses classification appropriateness. Whether the classification of an entity is appropriate or inappropriate is related to quantitative and/or qualitative observations. Small differences between observations may not be recognizable, particularly for non-expert participants. Hence, in this paper, the problem is tackled by developing a rule-guided classification approach. This approach exploits the data mining technique of Association Classification (AC) to extract descriptive (qualitative) rules for specific geographic features. The rules are extracted based on the investigation of qualitative topological relations between target features and their context. Afterwards, the extracted rules are used to develop a recommendation system able to guide participants to the most appropriate classification. The approach proposes two scenarios to guide participants towards enhancing the quality of data classification. An empirical study is conducted to investigate the classification of grass
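The rule-guided idea (mine qualitative context-to-class rules, then recommend the best-supported class) can be sketched as follows; the relations, labels, and confidence threshold are invented for illustration and are not the paper's mined rules:

```python
from collections import Counter

def mine_rules(examples, min_conf=0.8):
    """Mine one-condition association rules (relation -> class), kept only if
    their confidence reaches min_conf. Each example is (relations, label)."""
    rules = {}
    relations = {r for rels, _ in examples for r in rels}
    for r in sorted(relations):
        matching = [label for rels, label in examples if r in rels]
        label, count = Counter(matching).most_common(1)[0]
        confidence = count / len(matching)
        if confidence >= min_conf:
            rules[r] = (label, confidence)
    return rules

def recommend(rels, rules):
    """Recommend the class of the highest-confidence rule that fires."""
    hits = [(conf, label) for r, (label, conf) in rules.items() if r in rels]
    return max(hits)[1] if hits else None

# Hypothetical training contributions: topological context -> feature class.
data = [
    ({"inside_park", "adjacent_path"}, "grass"),
    ({"inside_park"}, "grass"),
    ({"adjacent_road", "touches_building"}, "parking"),
    ({"adjacent_road"}, "parking"),
]
rules = mine_rules(data)
print(recommend({"inside_park", "adjacent_path"}, rules))  # → grass
```

In a recommendation system of the kind described, such rules would be surfaced to the contributor as a suggested class rather than applied automatically.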

  8. Biometric Authentication for Gender Classification Techniques: A Review

    Science.gov (United States)

    Mathivanan, P.; Poornima, K.

    2017-12-01

    One of the challenging biometric authentication applications is gender identification and age classification, which capture gait from a far distance and analyse physical information about the subject, such as gender, race, and emotional state. It is found that most gender identification techniques have focused only on the frontal pose of different human subjects, the image size, and the type of database used in the process. The study also classifies different feature extraction processes, such as Principal Component Analysis (PCA) and Local Directional Pattern (LDP), that are used to extract the authentication features of a person. This paper aims to analyse different gender classification techniques to help evaluate the strengths and weaknesses of existing gender identification algorithms, and thereby to help develop a novel gender classification algorithm with lower computation cost and higher accuracy. In this paper, an overview and classification of different gender identification techniques are first presented and compared with other existing human identification systems by means of their performance.
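As a concrete example of the PCA feature-extraction step such systems use, here is a minimal power-iteration computation of the first principal component; the toy 2-D "face feature" vectors are hypothetical:

```python
def pca_first_component(X, iters=200):
    """First principal component via power iteration on the sample covariance
    matrix: a minimal stand-in for the PCA feature-extraction stage."""
    n, d = len(X), len(X[0])
    means = [sum(row[i] for row in X) / n for i in range(d)]
    centred = [[x - m for x, m in zip(row, means)] for row in X]
    cov = [[sum(centred[k][i] * centred[k][j] for k in range(n)) / (n - 1)
            for j in range(d)] for i in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]   # renormalise each iteration
    return v

# Toy 2-D "face feature" vectors whose variance lies mainly along axis 0.
X = [[2.0, 0.1], [4.0, 0.2], [6.0, 0.1], [8.0, 0.2]]
v = pca_first_component(X)
print(v)  # unit vector dominated by the first axis
```

In a full PCA pipeline, data would be projected onto the top few such components to obtain a compact feature vector before classification.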

  9. Toward accountable land use mapping: Using geocomputation to improve classification accuracy and reveal uncertainty

    NARCIS (Netherlands)

    Beekhuizen, J.; Clarke, K.C.

    2010-01-01

    The classification of satellite imagery into land use/cover maps is a major challenge in the field of remote sensing. This research aimed at improving the classification accuracy while also revealing uncertain areas by employing a geocomputational approach. We computed numerous land use maps by

  10. A Challenge to Change: Necessary Changes in the Library Classification System for the Chicago Public Schools.

    Science.gov (United States)

    Williams, Florence M.

    This report addresses the feasibility of changing the classification of library materials in the Chicago Public School libraries from the Dewey Decimal classification system (DDC) to the Library of Congress system (LC), thus patterning the city school libraries after the Chicago Public Library and strengthening the existing close relationship…

  11. Design Innovations and Implementation Challenges - A Case of Smart Textiles in Future Hospital Interiors

    DEFF Research Database (Denmark)

    Mogensen, Jeppe; Jørgensen, Poul-Erik; Poulsen, Søren Bolvig

    2014-01-01

    Concerned with the overall challenges of implementing design innovations, this paper relates to the specific case of applying smart textiles in future hospital interiors. The methodological approach is inspired by design thinking and implementation processes, and through the scope of a developed ...

  12. Validation of a new classification system for interprosthetic femoral fractures.

    Science.gov (United States)

    Pires, Robinson Esteves Santos; Silveira, Marcelo Peixoto Sena; Resende, Alessandra Regina da Silva; Junior, Egidio Oliveira Santana; Campos, Tulio Vinicius Oliveira; Santos, Leandro Emilio Nascimento; Balbachevsky, Daniel; Andrade, Marco Antônio Percope de

    2017-07-01

    Interprosthetic femoral fracture (IFF) incidence is gradually increasing as the population progressively ages. However, treatment remains challenging due to several contributing factors, such as poor bone quality, patient comorbidities, a small interprosthetic fragment, and prosthesis instability. An effective and specific classification system is essential to optimize treatment management and thereby diminish complication rates. This study aims to validate a previously described classification system for interprosthetic femoral fractures. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Assessing Cognitive Function in Bipolar Disorder: Challenges and Recommendations for Clinical Trial Design

    Science.gov (United States)

    Burdick, Katherine E.; Ketter, Terence A.; Goldberg, Joseph F.; Calabrese, Joseph R.

    2015-01-01

    OBJECTIVE Neurocognitive impairment in schizophrenia has been recognized for more than a century. In contrast, only recently have significant neurocognitive deficits been recognized in bipolar disorder. Converging data suggest the importance of cognitive problems in relation to quality of life in bipolar disorder, highlighting the need for treatment and prevention efforts targeting cognition in bipolar patients. Future treatment trials targeting cognitive deficits will be met with methodological challenges due to the inherent complexity and heterogeneity of the disorder, including significant diagnostic comorbidities, the episodic nature of the illness, frequent use of polypharmacy, cognitive heterogeneity, and a lack of consensus regarding measurement of cognition and outcome in bipolar patients. Guidelines for use in designing future trials are needed. PARTICIPANTS The members of the consensus panel (each of the bylined authors) were selected based upon their expertise in bipolar disorder. Dr. Burdick is a neuropsychologist who has studied cognition in this illness for 15 years; Drs. Ketter, Calabrese, and Goldberg each bring considerable expertise in the treatment of bipolar disorder both within and outside of controlled clinical trials. This consensus statement was derived from work together at scientific meetings (e.g. symposium presentation at the 2014 Annual Meeting of the American Society of Clinical Psychopharmacology, among others) and ongoing discussions by conference call. With the exception of the public presentations on this topic, these meetings were closed to outside participants. EVIDENCE A literature review was undertaken by the authors to identify illness-specific challenges relevant to the design and conduct of treatment trials targeting neurocognition in bipolar disorder. Expert opinion from each of the authors guided the consensus recommendations. CONSENSUS PROCESS Consensus recommendations, reached by unanimous opinion of the authors, are

  14. Impact of Passive Safety on FHR Instrumentation Systems Design and Classification

    International Nuclear Information System (INIS)

    Holcomb, David Eugene

    2015-01-01

    Fluoride salt-cooled high-temperature reactors (FHRs) will rely more extensively on passive safety than earlier reactor classes. 10CFR50 Appendix A, General Design Criteria for Nuclear Power Plants, establishes minimum design requirements to provide reasonable assurance of adequate safety. 10CFR50.69, Risk-Informed Categorization and Treatment of Structures, Systems and Components for Nuclear Power Reactors, provides guidance on how the safety significance of systems, structures, and components (SSCs) should be reflected in their regulatory treatment. The Nuclear Energy Institute (NEI) has provided 10 CFR 50.69 SSC Categorization Guideline (NEI-00-04) that factors in probabilistic risk assessment (PRA) model insights, as well as deterministic insights, through an integrated decision-making panel. Employing the PRA to inform deterministic requirements enables an appropriately balanced, technically sound categorization to be established. No FHR currently has an adequate PRA or set of design basis accidents to enable establishing the safety classification of its SSCs. While all SSCs used to comply with the general design criteria (GDCs) will be safety related, the intent is to limit the instrumentation risk significance through effective design and reliance on inherent passive safety characteristics. For example, FHRs have no safety-significant temperature threshold phenomena, thus enabling the primary and reserve reactivity control systems required by GDC 26 to be passively, thermally triggered at temperatures well below those for which core or primary coolant boundary damage would occur. Moreover, the passive thermal triggering of the primary and reserve shutdown systems may relegate the control rod drive motors to the control system, substantially decreasing the amount of safety-significant wiring needed. 
Similarly, FHR decay heat removal systems are intended to be running continuously to minimize the amount of safety-significant instrumentation needed to initiate

  15. UAS Detection Classification and Neutralization: Market Survey 2015

    Energy Technology Data Exchange (ETDEWEB)

    Birch, Gabriel Carisle [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Griffin, John Clark [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Erdman, Matthew Kelly [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2015-07-01

    The purpose of this document is to briefly frame the challenges of detecting low, slow, and small (LSS) unmanned aerial systems (UAS). The conclusion drawn from internal discussions and external reports is the following: detection of LSS UAS is a challenging problem that cannot be achieved with a single detection modality for all potential targets. Classification of LSS UAS, especially classification in the presence of background clutter (e.g., urban environment) or other non-threatening targets (e.g., birds), is under-explored. Though information on available technologies is sparse, many of the existing options for UAS detection appear to be in their infancy (when compared to more established ground-based air defense systems for larger and/or faster threats). Companies currently providing or developing technologies to combat the UAS safety and security problem are certainly worth investigating; however, no company has provided the statistical evidence necessary to support robust detection, identification, and/or neutralization of LSS UAS targets. The results of a market survey are included that highlight potential commercial entities that could contribute some technology that assists in the detection, classification, and neutralization of a LSS UAS. This survey found no clear and obvious commercial solution, though recommendations are given for further investigation of several potential systems.

  16. CLASSIFICATION BY USING MULTISPECTRAL POINT CLOUD DATA

    Directory of Open Access Journals (Sweden)

    C. T. Liao

    2012-07-01

    Full Text Available Remote sensing images are generally recorded in a two-dimensional format containing multispectral information. The semantic information is also clearly visualized, so ground features can easily be recognized and classified via supervised or unsupervised classification methods. Nevertheless, the shortcomings of multispectral images are their strong dependence on light conditions and the lack of three-dimensional semantic information in the classification results. On the other hand, LiDAR has become a main technology for acquiring high-accuracy point cloud data. The advantages of LiDAR are a high data acquisition rate, independence from light conditions, and the ability to directly produce three-dimensional coordinates. However, compared with multispectral images, its disadvantage is the shortage of multispectral information, which remains a challenge in ground feature classification from massive point cloud data. Consequently, by combining the advantages of both LiDAR and multispectral images, point cloud data with three-dimensional coordinates and multispectral information can provide an integrated solution for point cloud classification. Therefore, this research acquires visible-light and near-infrared images via close-range photogrammetry, matching the images automatically through a free online service to generate a multispectral point cloud. A three-dimensional affine coordinate transformation is then used to compare the data increment. Finally, given thresholds on height and colour information are used for classification.
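The final thresholding step (classify fused points by height and spectral values) might look like the following sketch; the class names, the pseudo-NDVI rule, and all threshold values are illustrative assumptions, not the paper's settings:

```python
def classify_point(z, red, nir, z_ground=0.5, ndvi_min=0.3):
    """Threshold rule over fused geometry and spectra: height separates
    ground-level from raised points; a pseudo-NDVI from the red and
    near-infrared channels separates vegetation from other surfaces."""
    ndvi = (nir - red) / (nir + red) if (nir + red) else 0.0
    if z <= z_ground:
        return "vegetation" if ndvi >= ndvi_min else "ground"
    return "tree" if ndvi >= ndvi_min else "building"

# (height in m, red, near-infrared) triples for four fused points.
points = [(0.1, 120, 130), (0.2, 60, 200), (6.0, 130, 140), (7.5, 70, 210)]
labels = [classify_point(*p) for p in points]
print(labels)  # → ['ground', 'vegetation', 'building', 'tree']
```

Because the multispectral point cloud carries both 3D coordinates and spectral channels per point, such rules can combine height and colour in a single pass without rasterizing the data.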

  17. Classification by Using Multispectral Point Cloud Data

    Science.gov (United States)

    Liao, C. T.; Huang, H. H.

    2012-07-01

    Remote sensing images are generally recorded in a two-dimensional format containing multispectral information. The semantic information is also clearly visualized, so ground features can easily be recognized and classified via supervised or unsupervised classification methods. Nevertheless, the shortcomings of multispectral images are their strong dependence on light conditions and the lack of three-dimensional semantic information in the classification results. On the other hand, LiDAR has become a main technology for acquiring high-accuracy point cloud data. The advantages of LiDAR are a high data acquisition rate, independence from light conditions, and the ability to directly produce three-dimensional coordinates. However, compared with multispectral images, its disadvantage is the shortage of multispectral information, which remains a challenge in ground feature classification from massive point cloud data. Consequently, by combining the advantages of both LiDAR and multispectral images, point cloud data with three-dimensional coordinates and multispectral information can provide an integrated solution for point cloud classification. Therefore, this research acquires visible-light and near-infrared images via close-range photogrammetry, matching the images automatically through a free online service to generate a multispectral point cloud. A three-dimensional affine coordinate transformation is then used to compare the data increment. Finally, given thresholds on height and colour information are used for classification.

  18. A Visual Analytics Approach for Correlation, Classification, and Regression Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Steed, Chad A [ORNL; SwanII, J. Edward [Mississippi State University (MSU); Fitzpatrick, Patrick J. [Mississippi State University (MSU); Jankun-Kelly, T.J. [Mississippi State University (MSU)

    2012-02-01

    New approaches that combine the strengths of humans and machines are necessary to equip analysts with the proper tools for exploring today's increasingly complex, multivariate data sets. In this paper, a novel visual data mining framework, called the Multidimensional Data eXplorer (MDX), is described that addresses the challenges of today's data by combining automated statistical analytics with a highly interactive, parallel-coordinates-based canvas. In addition to several intuitive interaction capabilities, this framework offers a rich set of graphical statistical indicators, interactive regression analysis, visual correlation mining, automated axis arrangement and filtering, and data classification techniques. The current work provides a detailed description of the system as well as a discussion of key design aspects and critical feedback from domain experts.
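One of the automated-arrangement ideas mentioned (ordering parallel-coordinate axes so that strongly related dimensions sit side by side) can be sketched with a greedy Pearson-correlation heuristic; the data and the specific greedy rule are illustrative, not MDX's algorithm:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def order_axes(columns):
    """Greedy arrangement: start with the first axis, then repeatedly append
    the remaining axis most correlated (in absolute value) with the last."""
    names = list(columns)
    order = [names.pop(0)]
    while names:
        last = columns[order[-1]]
        nxt = max(names, key=lambda name: abs(pearson(last, columns[name])))
        names.remove(nxt)
        order.append(nxt)
    return order

# Invented multivariate sample; wind and pressure are strongly anti-correlated.
data = {
    "wind":  [10, 12, 14, 18, 22],
    "press": [1010, 1008, 1006, 1001, 996],
    "noise": [3, 1, 4, 1, 5],
}
order = order_axes(data)
print(order)  # → ['wind', 'press', 'noise']
```

Placing correlated axes adjacently makes the line-crossing patterns between neighbouring axes legible, which is what makes visual correlation mining possible on such a canvas.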

  19. Hausdorff and Hellinger for colorimetric sensor array classification

    DEFF Research Database (Denmark)

    Alstrøm, Tommy Sonne; Jensen, Bjørn Sand; Schmidt, Mikkel Nørgaard

    2012-01-01

    Development of sensors and systems for detection of chemical compounds is an important challenge with applications in areas such as anti-terrorism, demining, and environmental monitoring. A newly developed colorimetric sensor array is able to detect explosives and volatile organic compounds; however, each sensor reading consists of hundreds of pixel values, and methods for combining these readings from multiple sensors must be developed to make a classification system. In this work we examine two distance-based classification methods, K-Nearest Neighbor (KNN) and Gaussian process (GP) classification, which both rely on a suitable distance metric. We evaluate a range of different distance measures and propose a method for sensor fusion in the GP classifier. Our results indicate that the best choice of distance measure depends on the sensor and the chemical of interest.
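A distance-based classifier of the kind compared here can be sketched as KNN under the Hellinger metric; the "colour-change histograms" and analyte labels below are invented for illustration:

```python
import math
from collections import Counter

def hellinger(p, q):
    """Hellinger distance between two discrete probability distributions."""
    return math.sqrt(sum((math.sqrt(a) - math.sqrt(b)) ** 2
                         for a, b in zip(p, q))) / math.sqrt(2)

def knn_predict(x, train, k=3):
    """K-nearest-neighbour majority vote under the Hellinger metric."""
    nearest = sorted(train, key=lambda item: hellinger(x, item[0]))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

# Invented normalised colour-change histograms for two analytes.
train = [
    ([0.80, 0.10, 0.10], "TNT"), ([0.70, 0.20, 0.10], "TNT"),
    ([0.75, 0.15, 0.10], "TNT"), ([0.10, 0.80, 0.10], "DNT"),
    ([0.20, 0.70, 0.10], "DNT"), ([0.15, 0.75, 0.10], "DNT"),
]
print(knn_predict([0.72, 0.18, 0.10], train))  # → TNT
```

Swapping `hellinger` for another metric (Euclidean, Hausdorff over pixel sets, etc.) changes only the `key` function, which is what makes KNN a convenient test bed for comparing distance measures.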

  20. Co-design and implementation research: challenges and solutions for ethics committees.

    Science.gov (United States)

    Goodyear-Smith, Felicity; Jackson, Claire; Greenhalgh, Trisha

    2015-11-16

    Implementation science research, especially when using participatory and co-design approaches, raises unique challenges for research ethics committees. Such challenges may be poorly addressed by approval and governance mechanisms that were developed for more traditional research approaches such as randomised controlled trials. Implementation science commonly involves the partnership of researchers and stakeholders, attempting to understand and encourage uptake of completed or piloted research. A co-creation approach involves collaboration between researchers and end users from the onset, in question framing, research design and delivery, and influencing strategy, with implementation and broader dissemination strategies part of its design from gestation. A defining feature of co-creation is its emergent and adaptive nature, making detailed pre-specification of interventions and outcome measures impossible. This methodology sits oddly with ethics committee protocols that require precise pre-definition of interventions, mode of delivery, outcome measurements, and the role of study participants. But the strict (and, some would say, inflexible) requirements of ethics committees were developed for a purpose - to protect participants from harm and help ensure the rigour and transparency of studies. We propose some guiding principles to help square this circle. First, ethics committees should acknowledge and celebrate the diversity of research approaches, both formally (through training) and informally (by promoting debate and discussion); without active support, their members may not understand or value participatory designs. Second, ground rules should be established for co-design applications (e.g. how to judge when 'consultation' or 'engagement' becomes research) and communicated to committee members and stakeholders. 
Third, the benefits of power-sharing should be recognised and credit given to measures likely to support this important goal, especially in research with

  1. Identification and classification of behavioural indicators to assess innovation competence

    Directory of Open Access Journals (Sweden)

    María Jose Pérez Peñalver

    2018-02-01

    Design/methodology/approach: A literature review was conducted by means of a search in Elsevier’s Scopus, Web of Science and Google Scholar. By applying inclusion and exclusion criteria, references were obtained with the search protocol. After filtering and scanning, a selection of references was made, plus other articles added by the snowball effect. The final phase was the classification of the main indicators raised in the selected publications. Findings: Our main contribution is the identification of the behavioural indicators of innovators at the workplace and their classification into five dimensions. Practical implications: This research may shed some light on the assessment of innovative workplace performance of individuals in organisations, as well as on the development of the innovative competence of students in academic institutions, as a challenge to meet the needs of both professionals and Higher Education institutions. Originality/value: Some authors have studied the characteristics of innovative people, mainly focusing on cognitive abilities, personality, motivation and knowledge. We have sought to offer a better understanding of the phenomenon of individual innovation in organisations through the analysis of behavioural indicators, an issue that has not previously been studied from this perspective.

  2. Simple Fully Automated Group Classification on Brain fMRI

    International Nuclear Information System (INIS)

    Honorio, J.; Goldstein, R.; Samaras, D.; Tomasi, D.; Goldstein, R.Z.

    2010-01-01

    We propose a simple, well-grounded classification technique suited for group classification on brain fMRI data sets that have high dimensionality, a small number of subjects, high noise levels, high subject variability and imperfect registration, and that capture subtle cognitive effects. We propose threshold-split region as a new feature selection method and majority vote as the classification technique. Our method does not require a predefined set of regions of interest. We use averages across sessions, only one feature per experimental condition, a feature independence assumption, and simple classifiers. The seemingly counter-intuitive approach of using a simple design is supported by signal processing and statistical theory. Experimental results in two block-design data sets that capture brain function under distinct monetary rewards for cocaine-addicted and control subjects show that our method exhibits increased generalization accuracy compared to commonly used feature selection and classification techniques.
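
The majority-vote idea described above can be sketched in a few lines. This is a hedged illustration, not the authors' threshold-split-region implementation: it uses generic per-feature decision stumps on invented synthetic data, with all names and values chosen only for the example.

```python
import numpy as np

def train_threshold_stumps(X, y):
    """For each feature, choose the threshold and polarity that best
    separate the two classes on the training data (a decision stump)."""
    stumps = []
    for j in range(X.shape[1]):
        best_acc, best_t, best_pol = 0.0, X[0, j], 1
        for t in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = (pol * (X[:, j] - t) > 0).astype(int)
                acc = (pred == y).mean()
                if acc > best_acc:
                    best_acc, best_t, best_pol = acc, t, pol
        stumps.append((best_t, best_pol))
    return stumps

def majority_vote(X, stumps):
    """Each stump casts a 0/1 vote per sample; the majority wins."""
    votes = np.column_stack([
        (pol * (X[:, j] - t) > 0).astype(int)
        for j, (t, pol) in enumerate(stumps)
    ])
    return (votes.mean(axis=1) >= 0.5).astype(int)
```

The appeal of such a design, as the abstract argues, is that one simple classifier per feature plus a vote has very few parameters to overfit on small, noisy data sets.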

  3. Simple Fully Automated Group Classification on Brain fMRI

    Energy Technology Data Exchange (ETDEWEB)

    Honorio, J.; Goldstein, R.; Honorio, J.; Samaras, D.; Tomasi, D.; Goldstein, R.Z.

    2010-04-14

    We propose a simple, well-grounded classification technique suited for group classification on brain fMRI data sets that have high dimensionality, a small number of subjects, high noise levels, high subject variability and imperfect registration, and that capture subtle cognitive effects. We propose threshold-split region as a new feature selection method and majority vote as the classification technique. Our method does not require a predefined set of regions of interest. We use averages across sessions, only one feature per experimental condition, a feature independence assumption, and simple classifiers. The seemingly counter-intuitive approach of using a simple design is supported by signal processing and statistical theory. Experimental results in two block-design data sets that capture brain function under distinct monetary rewards for cocaine-addicted and control subjects show that our method exhibits increased generalization accuracy compared to commonly used feature selection and classification techniques.

  4. Inter Genre Similarity Modelling For Automatic Music Genre Classification

    OpenAIRE

    Bagci, Ulas; Erzin, Engin

    2009-01-01

    Music genre classification is an essential tool for music information retrieval systems and has found critical applications in various media platforms. Two important problems in automatic music genre classification are feature extraction and classifier design. This paper investigates inter-genre similarity modelling (IGS) to improve the performance of automatic music genre classification. Inter-genre similarity information is extracted over the mis-classified feature population....

  5. Review article: A systematic review of emergency department incident classification frameworks.

    Science.gov (United States)

    Murray, Matthew; McCarthy, Sally

    2017-10-11

    As in any part of the hospital system, safety incidents can occur in the ED. These incidents arguably have a distinct character, as the ED involves unscheduled flows of urgent patients who require disparate services. To aid understanding of safety issues and support risk management of the ED, a comparison of published ED-specific incident classification frameworks was performed. A review of emergency medicine, health management and general medical publications, using Ovid SP to interrogate Medline (1976-2016), was undertaken to identify any type of taxonomy or classification-like framework for ED-related incidents. These frameworks were then analysed and compared. The review identified 17 publications containing an incident classification framework. Comparison of the factors and themes making up the classifications' constituent elements revealed some commonality, but no overall consistency, nor evolution towards an ideal framework. Inconsistency arises from differences in the evidential basis and design methodology of classifications, with design itself being an inherently subjective process. It was not possible to identify an 'ideal' incident classification framework for ED risk management, and there is significant variation in the selection of categories used by frameworks. The variation in classification could risk an unbalanced emphasis in findings through application of a particular framework. Design of an ED-specific, ideal incident classification framework should be informed by a much wider range of theories of how organisations and systems work, in addition to clinical and human factors. © 2017 Australasian College for Emergency Medicine and Australasian Society for Emergency Medicine.

  6. Positioning Industrial Design Education within Higher Education: How to face increasingly challenging market forces?

    Directory of Open Access Journals (Sweden)

    André Liem

    2014-05-01

    This paper discusses how Industrial Design Education should be adapted to pressing future challenges of higher education with respect to promoting high-quality mentorship and scholarship, as well as becoming more economically self-sufficient through stronger collaborative engagements with industry. The following four trends in the development of prospective design programs are presented: (1) mass-education and rationalisation, (2) links between education and research, (3) globalisation and internationalisation, and (4) collaboration with industry and research commercialisation. Given the challenges of market forces within academia, a consensus within the design education community should be established in order to expose students more to “active learning” and to commute back and forth between generic and specialist, and between abstract and concrete, modes of working. Comprehensive and collaborative studio projects should be implemented as platforms where social, interdisciplinary and inquiry-based learning can be developed in line with selected design themes, processes and methods.

  7. Positioning Learning Design: Learner Experience and the challenges of transforming teaching practice

    NARCIS (Netherlands)

    Johnson, Mark; Griffiths, Dai; Hanslot, Zubair

    2010-01-01

    Johnson, M., Griffiths, D., & Hanslot, Z. (2010). Positioning Learning Design: Learner Experience and the challenges of transforming teaching practice. In D. Griffiths, & R. Koper (Eds.), Rethinking Learning and Employment at a Time of Economic Uncertainty. Proceedings of the 6th TENCompetence Open

  8. Classification across gene expression microarray studies

    Directory of Open Access Journals (Sweden)

    Kuner Ruprecht

    2009-12-01

    Full Text Available Abstract Background The increasing number of gene expression microarray studies represents an important resource in biomedical research. As a result, gene expression based diagnosis has entered clinical practice for patient stratification in breast cancer. However, the integration and combined analysis of microarray studies remains still a challenge. We assessed the potential benefit of data integration on the classification accuracy and systematically evaluated the generalization performance of selected methods on four breast cancer studies comprising almost 1000 independent samples. To this end, we introduced an evaluation framework which aims to establish good statistical practice and a graphical way to monitor differences. The classification goal was to correctly predict estrogen receptor status (negative/positive and histological grade (low/high of each tumor sample in an independent study which was not used for the training. For the classification we chose support vector machines (SVM, predictive analysis of microarrays (PAM, random forest (RF and k-top scoring pairs (kTSP. Guided by considerations relevant for classification across studies we developed a generalization of kTSP which we evaluated in addition. Our derived version (DV aims to improve the robustness of the intrinsic invariance of kTSP with respect to technologies and preprocessing. Results For each individual study the generalization error was benchmarked via complete cross-validation and was found to be similar for all classification methods. The misclassification rates were substantially higher in classification across studies, when each single study was used as an independent test set while all remaining studies were combined for the training of the classifier. However, with increasing number of independent microarray studies used in the training, the overall classification performance improved. DV performed better than the average and showed slightly less variance. In
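
The top-scoring-pairs idea the study builds on can be illustrated with a single pair: the prediction depends only on the relative ordering of two features, which is what makes it invariant to monotone preprocessing differences between platforms. The following is a minimal single-pair sketch on invented data, not the authors' derived version (DV):

```python
import numpy as np
from itertools import combinations

def top_scoring_pair(X, y):
    """Find the feature pair (i, j) for which the event X[:, i] < X[:, j]
    has the largest frequency difference between the two classes."""
    best_pair, best_score = None, -1.0
    for i, j in combinations(range(X.shape[1]), 2):
        p0 = (X[y == 0, i] < X[y == 0, j]).mean()
        p1 = (X[y == 1, i] < X[y == 1, j]).mean()
        if abs(p0 - p1) > best_score:
            best_pair, best_score = (i, j), abs(p0 - p1)
    i, j = best_pair
    # orientation: True if class 1 is the class where X_i < X_j is more common
    flip = (X[y == 0, i] < X[y == 0, j]).mean() < (X[y == 1, i] < X[y == 1, j]).mean()
    return i, j, flip

def predict_tsp(X, i, j, flip):
    """Classify each sample purely from the ordering of features i and j."""
    less = X[:, i] < X[:, j]
    return less.astype(int) if flip else (~less).astype(int)
```

Because only the rank order of the two features matters, any per-sample monotone rescaling of the data leaves the predictions unchanged, which is the robustness property the abstract highlights.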

  9. Classifying Classifications

    DEFF Research Database (Denmark)

    Debus, Michael S.

    2017-01-01

    This paper critically analyzes seventeen game classifications. The classifications were chosen on the basis of diversity, ranging from pre-digital classifications (e.g. Murray 1952), over game studies classifications (e.g. Elverdam & Aarseth 2007), to classifications of drinking games (e.g. LaBrie et al. 2013). The analysis aims at three goals: the classifications’ internal consistency, the abstraction of classification criteria, and the identification of differences in classification across fields and/or time. Especially the abstraction of classification criteria can be used in future endeavors into the topic of game classifications.

  10. EMG finger movement classification based on ANFIS

    Science.gov (United States)

    Caesarendra, W.; Tjahjowidodo, T.; Nico, Y.; Wahyudati, S.; Nurhasanah, L.

    2018-04-01

    An increasing number of people suffering from stroke has driven the rapid development of finger/hand exoskeletons to enable automatic physical therapy. Prior to the development of a finger exoskeleton, an important preliminary research topic, machine learning for finger gesture classification, is addressed. This paper presents a study on EMG signal classification of 5 finger gestures as a preliminary study toward finger exoskeleton design and development in Indonesia. The EMG signals of the 5 finger gestures were acquired using a Myo EMG sensor. The EMG signal features were extracted and reduced using PCA. ANFIS-based learning is used to classify the reduced features of the 5 finger gestures. The results show that classification performance for the 5 finger gestures is lower than that for 7 hand gestures.
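
The pipeline described (feature extraction, PCA reduction, then classification) can be sketched as follows. Since ANFIS implementations vary by library, a nearest-centroid classifier stands in for the ANFIS stage here; the data, dimensions and cluster positions are invented for illustration only.

```python
import numpy as np

def pca_reduce(X, n_components):
    """Project features onto the top principal components (via SVD
    of the mean-centred data matrix)."""
    mu = X.mean(axis=0)
    Xc = X - mu
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T, mu, Vt[:n_components]

def nearest_centroid_fit(Z, y):
    """Store one centroid per class in the reduced feature space."""
    return {c: Z[y == c].mean(axis=0) for c in np.unique(y)}

def nearest_centroid_predict(Z, centroids):
    """Assign each sample to the class with the closest centroid."""
    classes = list(centroids)
    d = np.stack([np.linalg.norm(Z - centroids[c], axis=1) for c in classes])
    return np.array([classes[k] for k in d.argmin(axis=0)])
```

In a real EMG setting the rows of `X` would be windowed feature vectors (e.g. amplitude statistics per channel) and the labels the gesture classes; only the reduction/classification skeleton is shown here.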

  11. Designing capacity-building in e-learning expertise: Challenges and strategies

    OpenAIRE

    Aczel, J. C.; Peake, S. R.; Hardy, P.

    2008-01-01

    This research study looks at how organizations in developing countries perceive the challenge of building capacity in e-learning expertise. Data was collected on six such organizations, and a range of perceived rationales and constraints were identified. The paper hypothesizes a four-part framework to define the e-learning capacity gaps that these circumstances appear to represent: the 'instructional design capacity gap', the 'production capacity gap', the 'tutorial capacity gap' and the 'com...

  12. Addressing current challenges in cancer immunotherapy with mathematical and computational modelling.

    Science.gov (United States)

    Konstorum, Anna; Vella, Anthony T; Adler, Adam J; Laubenbacher, Reinhard C

    2017-06-01

    The goal of cancer immunotherapy is to boost a patient's immune response to a tumour. Yet, the design of an effective immunotherapy is complicated by various factors, including a potentially immunosuppressive tumour microenvironment, immune-modulating effects of conventional treatments and therapy-related toxicities. These complexities can be incorporated into mathematical and computational models of cancer immunotherapy that can then be used to aid in rational therapy design. In this review, we survey modelling approaches under the umbrella of the major challenges facing immunotherapy development, which encompass tumour classification, optimal treatment scheduling and combination therapy design. Although overlapping, each challenge has presented unique opportunities for modellers to make contributions using analytical and numerical analysis of model outcomes, as well as optimization algorithms. We discuss several examples of models that have grown in complexity as more biological information has become available, showcasing how model development is a dynamic process interlinked with the rapid advances in tumour-immune biology. We conclude the review with recommendations for modellers both with respect to methodology and biological direction that might help keep modellers at the forefront of cancer immunotherapy development. © 2017 The Author(s).

  13. Classification of diffuse lung diseases: why and how.

    Science.gov (United States)

    Hansell, David M

    2013-09-01

    The understanding of complex lung diseases, notably the idiopathic interstitial pneumonias and small airways diseases, owes as much to repeated attempts over the years to classify them as to any single conceptual breakthrough. One of the many benefits of a successful classification scheme is that it allows workers, within and between disciplines, to be clear that they are discussing the same disease. This may be of particular importance in the recruitment of individuals for a clinical trial that requires a standardized and homogeneous study population. Different specialties require fundamentally different things from a classification: for epidemiologic studies, a classification that requires categorization of individuals according to histopathologic pattern is not usually practicable. Conversely, a scheme that simply divides diffuse parenchymal disease into inflammatory and noninflammatory categories is unlikely to further the understanding about the pathogenesis of disease. Thus, for some disease groupings, for example, pulmonary vasculopathies, there may be several appropriate classifications, each with its merits and demerits. There has been an interesting shift in the past few years, from the accepted primacy of histopathology as the sole basis on which the classification of parenchymal lung disease has rested, to new ways of considering how these entities relate to each other. Some inventive thinking has resulted in new classifications that undoubtedly benefit patients and clinicians in their endeavor to improve management and outcome. The challenge of understanding the logic behind current classifications and their shortcomings are explored in various examples of lung diseases.

  14. Mapping of the Universe of Knowledge in Different Classification Schemes

    Directory of Open Access Journals (Sweden)

    M. P. Satija

    2017-06-01

    Given the variety of approaches to mapping the universe of knowledge that have been presented and discussed in the literature, the purpose of this paper is to systematize their main principles and their applications in the major general modern library classification schemes. We conducted an analysis of the literature on classification and the main classification systems, namely the Dewey/Universal Decimal Classification, Cutter’s Expansive Classification, the Subject Classification of J.D. Brown, Colon Classification, Library of Congress Classification, Bibliographic Classification, Rider’s International Classification, Bibliothecal Bibliographic Klassification (BBK), and the Broad System of Ordering (BSO). We conclude that the arrangement of the main classes can be done following four principles that are not mutually exclusive: ideological principle, social purpose principle, scientific order, and division by discipline. The paper provides examples and analysis of each system. We also conclude that as knowledge is ever-changing, classifications also change and present a different structure of knowledge depending upon the society and time of their design.

  15. 77 FR 31161 - Designation of Officers of the Millennium Challenge Corporation To Act as Chief Executive Officer...

    Science.gov (United States)

    2012-05-25

    ... of May 21, 2012 Designation of Officers of the Millennium Challenge Corporation To Act as Chief Executive Officer of the Millennium Challenge Corporation Memorandum for the Chief Executive Officer of the... following officers of the Millennium Challenge Corporation (MCC), in the order listed, shall act as and...

  16. State‐of‐the‐art and progress in the optimization‐based simultaneous design and control for chemical processes

    DEFF Research Database (Denmark)

    Yuan, Zhihong; Chen, Bingzhen; Sin, Gürkan

    2012-01-01

    ‐based frameworks that are capable of screening alternative designs, and (2) optimization‐based frameworks that integrate the process design and control system design. The major objective is to give an up‐to‐date review of the state‐of‐the‐art and progress in the challenging area of optimization‐based simultaneous design and control. First, motivations and significances of simultaneous design and control are illustrated. Second, a general classification of existing methodologies of optimization‐based simultaneous design and control is outlined. Subsequently, the mathematical formulations and relevant theoretical...

  17. The influence of spine surgeons' experience on the classification and intraobserver reliability of the novel AOSpine thoracolumbar spine injury classification system : an international study

    NARCIS (Netherlands)

    Sadiqi, Said; Oner, F. Cumhur; Dvorak, Marcel F.; Aarabi, Bizhan; Schroeder, Gregory D.; Vaccaro, Alexander R.

    2015-01-01

    Study Design. International validation study. Objective. To investigate the influence of the spine surgeons' level of experience on the intraobserver reliability of the novel AOSpine Thoracolumbar Spine Injury Classification system, and the appropriate classification according to this system.

  18. Perforator chimerism for the reconstruction of complex defects: A new chimeric free flap classification system.

    Science.gov (United States)

    Kim, Jeong Tae; Kim, Youn Hwan; Ghanem, Ali M

    2015-11-01

    Complex defects present structural and functional challenges to reconstructive surgeons. When compared to multiple free flaps or staged reconstruction, the use of chimeric flaps to reconstruct such defects has many advantages, such as a reduced number of operative procedures and donor site morbidity as well as preservation of recipient vessels. With the increased popularity of perforator flaps, chimeric flap harvest and design has benefited from the 'perforator concept' towards more versatile and better reconstruction solutions. This article discusses perforator based chimeric flaps and presents a practice based classification system that incorporates the perforator flap concept into "Perforator Chimerism". The authors analyzed a variety of chimeric patterns used in 31 consecutive cases to present an illustrative case series and their new classification system. Accordingly, chimeric flaps are classified into four types. Type I: Classical Chimerism; Type II: Anastomotic Chimerism; Type III: Perforator Chimerism; and Type IV: Mixed Chimerism. Type I relies on specific source vessel anatomy, whilst Type II requires microvascular anastomosis to create the chimeric reconstructive solution. Type III chimeric flaps utilize the perforator concept to raise two components of tissue without microvascular anastomosis between them. Type IV chimeric flaps are mixed type flaps comprising any combination of Types I to III. Incorporation of the perforator concept in planning and designing chimeric flaps has allowed safe, effective and aesthetically superior reconstruction of complex defects. The new classification system aids reconstructive surgeons and trainees in understanding chimeric flap design, facilitating effective incorporation of this important reconstructive technique into the armamentarium of the reconstruction toolbox. Copyright © 2015 British Association of Plastic, Reconstructive and Aesthetic Surgeons. Published by Elsevier Ltd. All rights reserved.

  19. Entrepreneurship and response strategies to challenges in engineering and design education

    DEFF Research Database (Denmark)

    Jørgensen, Ulrik; Pineda, Andres Felipe Valderrama

    2012-01-01

    Entrepreneurship is one of the contemporary expectations of engineers and their training at engineering schools. But what is entrepreneurship? We propose three different conceptualizations of entrepreneurship in engineering and design programs. They are: (1) the technology-driven promotion response strategy centered in technological development; (2) the business selection response strategy centered in business skills (which should be additional to the technical skills); and (3) the design intervention response strategy focused on a network approach to technology, business and society. These conceptualizations are response strategies from engineering communities, professors and institutions to perceived challenges. We argue that all engineering educators deal in one way or another with the three response strategies when approaching issues of curricular design, academic reform and the international accreditation...

  20. Classification of protein-protein interaction full-text documents using text and citation network features.

    Science.gov (United States)

    Kolchinsky, Artemy; Abi-Haidar, Alaa; Kaur, Jasleen; Hamed, Ahmed Abdeen; Rocha, Luis M

    2010-01-01

    We participated (as Team 9) in the Article Classification Task of the BioCreative II.5 Challenge: binary classification of full-text documents relevant for protein-protein interaction. We used two distinct classifiers for the online and offline challenges: 1) the lightweight Variable Trigonometric Threshold (VTT) linear classifier we successfully introduced in BioCreative 2 for binary classification of abstracts, and 2) a novel Naive Bayes classifier using features from the citation network of the relevant literature. We supplemented the supplied training data with full-text documents from the MIPS database. The lightweight VTT classifier was very competitive in this new full-text scenario: it was a top-performing submission in this task, taking into account the rank product of the Area Under the interpolated precision and recall Curve, Accuracy, Balanced F-Score, and Matthews Correlation Coefficient performance measures. The novel citation network classifier for the biomedical text mining domain, while not a top performing classifier in the challenge, performed above the central tendency of all submissions, and therefore indicates a promising new avenue to investigate further in bibliome informatics.
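
A Naive Bayes classifier over binary features, such as the presence or absence of particular citations, might look like the following Bernoulli sketch. The feature encoding (one bit per cited work) and the toy data are assumptions for illustration, not Team 9's actual implementation:

```python
import numpy as np

def bernoulli_nb_fit(X, y, alpha=1.0):
    """Laplace-smoothed Bernoulli Naive Bayes over 0/1 features.
    Returns per-class (log prior, log theta, log(1 - theta))."""
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        theta = (Xc.sum(axis=0) + alpha) / (len(Xc) + 2 * alpha)
        prior = len(Xc) / len(X)
        params[c] = (np.log(prior), np.log(theta), np.log1p(-theta))
    return params

def bernoulli_nb_predict(X, params):
    """Score each class as log prior plus the feature log-likelihoods,
    then pick the argmax per sample."""
    classes = list(params)
    scores = np.stack([
        lp + X @ lt + (1 - X) @ lnt
        for lp, lt, lnt in (params[c] for c in classes)
    ])
    return np.array([classes[k] for k in scores.argmax(axis=0)])
```

With documents encoded as sparse citation-indicator vectors, the same two functions apply unchanged; the independence assumption is what keeps the model tractable at bibliome scale.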

  1. Two approaches to meeting the economic challenge for advanced BWR designs

    International Nuclear Information System (INIS)

    Rao, A.S.; Sawyer, C.D.

    1997-01-01

    This paper presents the design overview and approach to addressing the aforementioned economic challenges for two advanced BWR designs. The first plant is the ABWR and the second is the ESBWR. The ABWR relies on proven technology and components and an extensive infrastructure that has been built up over the last 20 years. Because it has proven and standard safety systems, it has very limited uncertainty regarding licensing. Finally, it relies on economies of scale and overall design flexibility to improve the overall economics of power generation. The ESBWR, on the other hand, has taken an innovative approach of reducing systems and components to simplify the overall plant and improve plant economics. The overall plant design is indeed simpler, but improved economics required some reliance on economies of scale as well. This design, embodied in the ESBWR, has also minimized the overall development cost by utilizing features and components from the ABWR and SBWR technology programs.

  2. Constructed Wetlands for Treatment of Combined Sewer Overflow in the US: A Review of Design Challenges and Application Status

    Directory of Open Access Journals (Sweden)

    Wendong Tao

    2014-11-01

    As combined sewer systems and centralized wastewater treatment facilities age, many communities in the world are challenged by management of combined sewer overflow (CSO). Constructed wetlands are considered to be one of the green infrastructure solutions to CSOs in the US. Despite the wide application of constructed wetlands to different types of wastewaters, the stochastic and intermittent nature of CSO presents challenges for design and performance assessment of constructed wetlands. This paper reviews the application status of CSO constructed wetlands in the US, assesses the benefits of CSO constructed wetlands, identifies challenges to designing CSO constructed wetlands, and proposes design considerations. This review finds that constructed wetlands are effective in CSO treatment and relatively less expensive to build than comparable grey infrastructure. Constructed wetlands not only remove pollutants, but also mitigate the event-associated flow regime. The design challenges include incorporating considerations of green infrastructure into permit requirements, determining design capacity for highly variable flows, requiring pretreatment, and needing adaptive design and intensive monitoring. Simultaneous monitoring of flow rate and water quality at both the inflow and outflow of CSO constructed wetlands is required for performance assessment and needed to support design, but is rarely available.

  3. Challenges of building and sustaining living labs for designing services and products

    DEFF Research Database (Denmark)

    Subasi, Özge; Werner, Katharina; Fitzpatrick, Geraldine

    2016-01-01

    In this paper, we show examples from one of the living labs from the Give&Take project and discuss the observed challenges of establishing and sustaining living labs in a participatory design context. The observations we present are around the mismatch between research language and everyday...

  4. Hyperspectral image classification based on local binary patterns and PCANet

    Science.gov (United States)

    Yang, Huizhen; Gao, Feng; Dong, Junyu; Yang, Yang

    2018-04-01

    Hyperspectral image classification has been well acknowledged as one of the challenging tasks of hyperspectral data processing. In this paper, we propose a novel hyperspectral image classification framework based on local binary pattern (LBP) features and PCANet. In the proposed method, linear prediction error (LPE) is first employed to select a subset of informative bands, and LBP is utilized to extract texture features. Then, spectral and texture features are stacked into a high-dimensional vector. Next, the extracted features of a specified position are transformed to a 2-D image. The obtained images of all pixels are fed into PCANet for classification. Experimental results on a real hyperspectral dataset demonstrate the effectiveness of the proposed method.
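
The LBP texture features mentioned above can be sketched for the basic 3×3 neighbourhood case: each pixel is compared with its 8 neighbours and the comparisons are packed into an 8-bit code, whose histogram serves as the texture descriptor. The paper may use a different radius or sampling; this is a generic illustration.

```python
import numpy as np

def lbp_image(img):
    """Basic 3x3 local binary pattern: compare each interior pixel to its
    8 neighbours and pack the results into an 8-bit code."""
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    h, w = img.shape
    out = np.zeros((h - 2, w - 2), dtype=np.uint8)
    center = img[1:h - 1, 1:w - 1]
    for bit, (dy, dx) in enumerate(offsets):
        neigh = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        out |= ((neigh >= center) << bit).astype(np.uint8)
    return out

def lbp_histogram(img, bins=256):
    """Normalized histogram of LBP codes: the texture feature vector."""
    codes = lbp_image(img)
    hist, _ = np.histogram(codes, bins=bins, range=(0, bins))
    return hist / hist.sum()
```

In the framework described, one such histogram per band (or per spatial window) would be stacked with the spectral values before the PCANet stage.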

  5. CADASTRAL CLASSIFICATION OF THE LAND PLOTS IN UKRAINE

    Directory of Open Access Journals (Sweden)

    KIRICHEK Yu. O.

    2016-04-01

    Summary: This work concerns the development of a national system for classifying land plots. The developed classification will make it possible to correctly solve a number of related cadastral, land-management, valuation and other tasks. An analysis of existing classifications of land, improvements and real estate in general is presented, and proposals are made for creating a new classification of land plots in Ukraine. Today the Ukrainian real estate market has no single system that divides properties into groups, classes and types. This significantly complicates work in the market and prevents a full understanding of its specific situation. Property classification is designed to solve this task: it is used to move from a diversity of individual properties to a limited number of classes of evaluation objects, differing in functional purpose (the use of the facility being assessed), which determines differences in value.

  6. BluLab: Temporal Information Extraction for the 2015 Clinical TempEval Challenge

    OpenAIRE

    Velupillai, Sumithra; Mowery, Danielle L.; Abdelrahman, Samir; Christensen, Lee; Chapman, Wendy W.

    2015-01-01

    The 2015 Clinical TempEval Challenge addressed the problem of temporal reasoning in the clinical domain by providing an annotated corpus of pathology and clinical notes related to colon cancer patients. The challenge consisted of six subtasks: TIMEX3 and event span detection, TIMEX3 and event attribute classification, and document time relation and narrative container relation classification. Our BluLab team participated in all six subtasks. For the TIMEX3 and event subtasks, we developed a Clear...

  7. Integration of scholastic curriculum in computergames – impossible or a design challenge?

    DEFF Research Database (Denmark)

    Larsen, Lasse Juel

    The present paper argues that integration of scholastic knowledge in computer games is a design challenge, and one that will only work if you preserve the computer game as a game. This is important because if you do not adhere to or understand the dynamics of computer games, you run the risk of destroying your own goal. In order to integrate the scholastic curriculum in computer games for a learning purpose, it is important (and cannot be stressed enough) to preserve the action-outcome circle inside the game world. Stated in simpler terms, this means that users of learning games must see

  8. Present day design challenges exemplified by the Clinch River Breeder Reactor Plant

    International Nuclear Information System (INIS)

    Dickson, P.W. Jr.; Anderson, C.A. Jr.

    1976-01-01

    The present day design challenges faced by the Clinch River Breeder Reactor Plant engineer result from two causes. The first cause is the aspiration to achieve a design that will operate at conditions desirable for future LMFBRs, in order for them to achieve low power costs and good breeding. The second cause is the licensing impact. Although licensing the CRBRP will not eliminate future licensing effort, many licensing questions will have been resolved and precedents set for the future LMFBR industry.

  9. The Hand Eczema Trial (HET): Design of a randomised clinical trial of the effect of classification and individual counselling versus no intervention among health-care workers with hand eczema

    DEFF Research Database (Denmark)

    Ibler, Kristina Sophie; Agner, Tove; Hansen, Jane L.

    2010-01-01

    ...strategies are needed to reduce occupational hand eczema. METHODS/DESIGN: We describe the design of a randomised clinical trial to investigate the effects of classification of hand eczema plus individual counselling versus no intervention. The trial includes health-care workers with hand eczema identified... The experimental group undergoes patch and prick testing; classification of the hand eczema; demonstration of hand washing and application of emollients; individual counselling; and a skin-care programme. The control group receives no intervention. All participants are reassessed after six months. The primary...

  10. Multivariate decision tree designing for the classification of multi-jet topologies in e+e- collisions

    CERN Document Server

    Mjahed, M

    2002-01-01

    The binary decision tree method is used to separate between several multi-jet topologies in e+e- collisions. Instead of the univariate process usually taken, a new design procedure for constructing multivariate decision trees is proposed. The segmentation is obtained by considering some feature functions, where linear and non-linear discriminant functions and a minimal distance method are used. The classification focuses on ALEPH simulated events with multi-jet topologies. Compared to a standard univariate tree, the multivariate decision trees offer significantly better performance.
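The idea of a multivariate split can be illustrated with a single tree node: instead of thresholding one variable, the node thresholds a linear discriminant computed from all variables. The NumPy sketch below (a generic illustration, not the paper's implementation; the data and names are invented) uses a Fisher discriminant as the split function:

```python
import numpy as np

def fisher_split(X, y):
    """One multivariate tree node: split on a linear discriminant w.x > t
    rather than on a single variable (the univariate case)."""
    m0, m1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
    Sw = np.cov(X[y == 0].T) + np.cov(X[y == 1].T)  # within-class scatter
    w = np.linalg.solve(Sw, m1 - m0)                # Fisher direction
    t = w @ (m0 + m1) / 2.0                         # midpoint threshold
    return w, t

rng = np.random.default_rng(2)
# two classes separated along a diagonal that no axis-aligned cut matches well
X = np.vstack([rng.normal([0, 0], 1, (100, 2)), rng.normal([3, 3], 1, (100, 2))])
y = np.repeat([0, 1], 100)
w, t = fisher_split(X, y)
acc = ((X @ w > t) == y).mean()
print(acc > 0.9)  # → True
```

A full multivariate tree would apply such a split recursively to each resulting subset; the single node shown here is the building block that distinguishes the approach from a standard univariate tree.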

  11. Deep learning for tumor classification in imaging mass spectrometry.

    Science.gov (United States)

    Behrmann, Jens; Etmann, Christian; Boskamp, Tobias; Casadonte, Rita; Kriegsmann, Jörg; Maaß, Peter

    2018-04-01

    Tumor classification using imaging mass spectrometry (IMS) data has a high potential for future applications in pathology. Due to the complexity and size of the data, automated feature extraction and classification steps are required to fully process the data. Since mass spectra exhibit certain structural similarities to image data, deep learning may offer a promising strategy for classification of IMS data as it has been successfully applied to image classification. Methodologically, we propose an adapted architecture based on deep convolutional networks to handle the characteristics of mass spectrometry data, as well as a strategy to interpret the learned model in the spectral domain based on a sensitivity analysis. The proposed methods are evaluated on two algorithmically challenging tumor classification tasks and compared to a baseline approach. Competitiveness of the proposed methods is shown on both tasks by studying the performance via cross-validation. Moreover, the learned models are analyzed by the proposed sensitivity analysis revealing biologically plausible effects as well as confounding factors of the considered tasks. Thus, this study may serve as a starting point for further development of deep learning approaches in IMS classification tasks. https://gitlab.informatik.uni-bremen.de/digipath/Deep_Learning_for_Tumor_Classification_in_IMS. jbehrmann@uni-bremen.de or christianetmann@uni-bremen.de. Supplementary data are available at Bioinformatics online.
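The basic building block of such an architecture adapted to spectra is a one-dimensional convolution sliding along the m/z axis. The plain-NumPy sketch below shows a toy forward pass of one convolution-plus-ReLU layer (an illustration only, not the authors' network; all shapes and data are invented):

```python
import numpy as np

def conv1d(spectrum, kernels, stride=1):
    """Valid 1-D convolution of a spectrum with a bank of filters,
    followed by a ReLU, as in the first layer of a CNN for mass spectra."""
    k = kernels.shape[1]
    n_out = (spectrum.size - k) // stride + 1
    out = np.empty((kernels.shape[0], n_out))
    for j in range(n_out):
        window = spectrum[j * stride : j * stride + k]
        out[:, j] = kernels @ window  # one dot product per filter
    return np.maximum(out, 0.0)       # ReLU nonlinearity

spectrum = np.sin(np.linspace(0, 10, 100))              # stand-in intensity vector
kernels = np.random.default_rng(0).normal(size=(8, 5))  # 8 filters of width 5
features = conv1d(spectrum, kernels)
print(features.shape)  # → (8, 96)
```

A deep network stacks several such layers (with pooling and a final fully connected classifier); deep-learning frameworks implement the same operation far more efficiently.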

  12. Land use/cover classification in the Brazilian Amazon using satellite images.

    Science.gov (United States)

    Lu, Dengsheng; Batistella, Mateus; Li, Guiying; Moran, Emilio; Hetrick, Scott; Freitas, Corina da Costa; Dutra, Luciano Vieira; Sant'anna, Sidnei João Siqueira

    2012-09-01

    Land use/cover classification is one of the most important applications in remote sensing. However, mapping accurate land use/cover spatial distribution is a challenge, particularly in moist tropical regions, due to the complex biophysical environment and limitations of remote sensing data per se. This paper reviews experiments related to land use/cover classification in the Brazilian Amazon over a decade. Through comprehensive analysis of the classification results, it is concluded that spatial information inherent in remote sensing data plays an essential role in improving land use/cover classification. Incorporation of suitable textural images into multispectral bands and use of segmentation-based methods are valuable ways to improve land use/cover classification, especially for high spatial resolution images. Data fusion of multi-resolution images within optical sensor data is vital for visual interpretation, but may not improve classification performance. In contrast, integration of optical and radar data did improve classification performance when the proper data fusion method was used. Of the classification algorithms available, the maximum likelihood classifier is still an important method for providing reasonably good accuracy, but nonparametric algorithms, such as classification tree analysis, have the potential to provide better results; however, they often require more time for parameter optimization. Proper use of hierarchical-based methods is fundamental for developing accurate land use/cover classification, mainly from historical remotely sensed data.
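The maximum likelihood classifier referred to here is, in its classical Gaussian form, a per-class multivariate normal fit followed by assignment to the class with the highest likelihood. A compact NumPy sketch (illustrative only, with invented "spectral" data, not the paper's implementation):

```python
import numpy as np

def ml_classify(X_train, y_train, X_test):
    """Gaussian maximum likelihood classifier, the classic remote-sensing
    baseline: fit a multivariate normal per class, assign by highest likelihood."""
    labels = np.unique(y_train)
    scores = []
    for c in labels:
        Xc = X_train[y_train == c]
        mu = Xc.mean(axis=0)
        cov = np.cov(Xc.T)
        inv, (_, logdet) = np.linalg.inv(cov), np.linalg.slogdet(cov)
        d = X_test - mu
        # log-likelihood up to a constant: -0.5 * (Mahalanobis distance + log|cov|)
        scores.append(-0.5 * (np.einsum("ij,jk,ik->i", d, inv, d) + logdet))
    return labels[np.argmax(scores, axis=0)]

rng = np.random.default_rng(3)
# two toy "land cover" classes with different mean spectral responses
X_train = np.vstack([rng.normal(0, 1, (100, 3)), rng.normal(4, 1, (100, 3))])
y_train = np.repeat([0, 1], 100)
X_test = np.vstack([rng.normal(0, 1, (50, 3)), rng.normal(4, 1, (50, 3))])
y_test = np.repeat([0, 1], 50)
pred = ml_classify(X_train, y_train, X_test)
print((pred == y_test).mean() > 0.9)  # → True
```

With equal class priors this is exactly maximum likelihood; adding a log-prior term per class turns it into the maximum a posteriori variant often used when class frequencies differ.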

  13. Engineering challenges encountered in the design of the ELMO BUMPY TORUS proof-of-principle fusion device

    International Nuclear Information System (INIS)

    Dillow, C.F.; Imster, H.F.

    1982-01-01

    This paper first provides a summary of the history and current status of the Elmo Bumpy Torus (EBT) fusion concept. A brief description of the EBT-P is then provided in which the many unique features of this fusion device are highlighted. This description will provide the technical background for the following discussions of some of the more challenging mechanical engineering problems encountered to date in the evolution of the EBT-P design. The problems discussed are: optimization of the device primary structure design, optimization of the superconducting magnet x-ray shield design, design of the liquid helium supply and distribution system, and selection of high vacuum seals and pumps and their protection from the high power microwave environment. The common challenge in each of these design issues was to assure adequate performance at minimum cost

  14. Smart integrated microsystems: the energy efficiency challenge (Conference Presentation) (Plenary Presentation)

    Science.gov (United States)

    Benini, Luca

    2017-06-01

    The "internet of everything" envisions trillions of connected objects loaded with high-bandwidth sensors requiring massive amounts of local signal processing, fusion, pattern extraction and classification. From the computational viewpoint, the challenge is formidable and can be addressed only by pushing computing fabrics toward massive parallelism and brain-like energy efficiency levels. CMOS technology can still take us a long way toward this goal, but technology scaling is losing steam. Energy efficiency improvement will increasingly hinge on architecture, circuits, design techniques such as heterogeneous 3D integration, mixed-signal preprocessing, event-based approximate computing and non-Von-Neumann architectures for scalable acceleration.

  15. CLASSIFICATION OF THE MGR MUCK HANDLING SYSTEM

    International Nuclear Information System (INIS)

    R. Garrett

    1999-01-01

    The purpose of this analysis is to document the Quality Assurance (QA) classification of the Monitored Geologic Repository (MGR) muck handling system structures, systems and components (SSCs) performed by the MGR Safety Assurance Department. This analysis also provides the basis for revision of YMP/90-55Q, Q-List (YMP 1998). The Q-List identifies those MGR SSCs subject to the requirements of DOE/RW-0333P, ''Quality Assurance Requirements and Description'' (QARD) (DOE 1998). This QA classification incorporates the current MGR design and the results of the ''Preliminary Preclosure Design Basis Event Calculations for the Monitored Geologic Repository'' (CRWMS M&O 1998a).

  16. Nuclear power plant systems, structures and components and their safety classification

    International Nuclear Information System (INIS)

    2000-01-01

    The assurance of a nuclear power plant's safety is based on the reliable functioning of the plant as well as on its appropriate maintenance and operation. To ensure reliability of operation, special attention shall be paid to the design, manufacturing, commissioning and operation of the plant and its components. To control these functions, the nuclear power plant is divided into structural and functional entities, i.e. systems. A system's safety class is determined by its safety significance. The safety class specifies the procedures to be employed in plant design, construction, monitoring and operation. The classification document contains all documentation related to the classification of the nuclear power plant. The principles of safety classification and the procedures pertaining to the classification document are presented in this guide. In the Appendix of the guide, examples of systems most typical of each safety class are given to clarify the safety classification principles.

  17. Multi-Label Classification by Semi-Supervised Singular Value Decomposition.

    Science.gov (United States)

    Jing, Liping; Shen, Chenyang; Yang, Liu; Yu, Jian; Ng, Michael K

    2017-10-01

    Multi-label problems arise in various domains, including automatic multimedia data categorization, and have generated significant interest in the computer vision and machine learning communities. However, existing methods do not adequately address two key challenges: exploiting correlations between labels and making up for the lack of, or even missing, labelled data. In this paper, we propose to use a semi-supervised singular value decomposition (SVD) to handle these two challenges. The proposed model takes advantage of nuclear norm regularization on the SVD to effectively capture the label correlations. Meanwhile, it introduces manifold regularization on the mapping to capture the intrinsic structure among data, which provides a good way to reduce the required labelled data while improving the classification performance. Furthermore, we designed an efficient algorithm to solve the proposed model based on the alternating direction method of multipliers, and thus it can efficiently deal with large-scale data sets. Experimental results for synthetic and real-world multimedia data sets demonstrate that the proposed method can exploit the label correlations and obtain promising and better label prediction results than the state-of-the-art methods.
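Inside an ADMM solver, nuclear-norm regularization is typically handled by singular-value soft-thresholding: shrink every singular value by the regularization weight and clip at zero. A minimal NumPy sketch of that proximal step (a generic illustration, not the authors' algorithm; the matrix and threshold are invented):

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: the proximal operator of tau * nuclear norm.
    Each singular value is shrunk by tau, clipping at zero."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)
    return U @ np.diag(s_shrunk) @ Vt

rng = np.random.default_rng(1)
# a rank-2 matrix plus small noise, standing in for a label-correlation matrix
L = rng.normal(size=(30, 2)) @ rng.normal(size=(2, 20))
M = L + 0.01 * rng.normal(size=(30, 20))
X = svt(M, tau=1.0)
print(np.linalg.matrix_rank(X, tol=1e-6))  # → 2 (noise singular values fell below tau)
```

Because small singular values are driven exactly to zero, the operator promotes low-rank solutions, which is how the nuclear norm captures correlations between labels.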

  18. Opportunities and Challenges for Drug Development: Public-Private Partnerships, Adaptive Designs and Big Data.

    Science.gov (United States)

    Yildirim, Oktay; Gottwald, Matthias; Schüler, Peter; Michel, Martin C

    2016-01-01

    Drug development faces the double challenge of increasing costs and increasing pressure on pricing. To prevent a perceived lack of commercial perspective from leaving existing medical needs unmet, pharmaceutical companies and many other stakeholders are discussing ways to improve the efficiency of drug Research and Development. Based on an international symposium organized by the Medical School of the University of Duisburg-Essen (Germany) and held in January 2016, we discuss the opportunities and challenges of three specific areas, i.e., public-private partnerships, adaptive designs and big data. Public-private partnerships come in many different forms with regard to scope, duration and type and number of participants. They range from project-specific collaborations to strategic alliances to large multi-party consortia. Each of them offers specific opportunities and faces distinct challenges. Among types of collaboration, investigator-initiated studies are becoming increasingly popular but have legal, ethical, and financial implications. Adaptive trial designs are also increasingly discussed. However, adaptive should not be used as a euphemism for the repurposing of a failed trial; rather, it requires careful planning and specification before a trial starts. Adaptive licensing can be a counterpart of adaptive trial design. The use of Big Data is another opportunity to leverage existing information into knowledge usable for drug discovery and development. Respecting limitations of informed consent and privacy is a key challenge in the use of Big Data. Speakers and participants at the symposium were convinced that appropriate use of the above new options may indeed help to increase the efficiency of future drug development.

  19. Opportunities and challenges for drug development: public-private partnerships, adaptive designs and big data

    Directory of Open Access Journals (Sweden)

    Oktay Yildirim

    2016-12-01

    Full Text Available Drug development faces the double challenge of increasing costs and increasing pressure on pricing. To prevent a perceived lack of commercial perspective from leaving existing medical needs unmet, pharmaceutical companies and many other stakeholders are discussing ways to improve the efficiency of drug Research & Development. Based on an international symposium organized by the Medical School of the University of Duisburg-Essen (Germany) and held in January 2016, we discuss the opportunities and challenges of three specific areas, i.e. public-private partnerships, adaptive designs and big data. Public-private partnerships come in many different forms with regard to scope, duration and type and number of participants. They range from project-specific collaborations to strategic alliances to large multi-party consortia. Each of them offers specific opportunities and faces distinct challenges. Among types of collaboration, investigator-initiated studies are becoming increasingly popular but have legal, ethical and financial implications. Adaptive trial designs are also increasingly discussed. However, adaptive should not be used as a euphemism for the repurposing of a failed trial; rather, it requires careful planning and specification before a trial starts. Adaptive licensing can be a counterpart of adaptive trial design. The use of Big Data is another opportunity to leverage existing information into knowledge usable for drug discovery and development. Respecting limitations of informed consent and privacy is a key challenge in the use of Big Data. Speakers and participants at the symposium were convinced that appropriate use of the above new options may indeed help to increase the efficiency of future drug development.

  20. Opportunities and Challenges for Drug Development: Public–Private Partnerships, Adaptive Designs and Big Data

    Science.gov (United States)

    Yildirim, Oktay; Gottwald, Matthias; Schüler, Peter; Michel, Martin C.

    2016-01-01

    Drug development faces the double challenge of increasing costs and increasing pressure on pricing. To prevent a perceived lack of commercial perspective from leaving existing medical needs unmet, pharmaceutical companies and many other stakeholders are discussing ways to improve the efficiency of drug Research and Development. Based on an international symposium organized by the Medical School of the University of Duisburg-Essen (Germany) and held in January 2016, we discuss the opportunities and challenges of three specific areas, i.e., public–private partnerships, adaptive designs and big data. Public–private partnerships come in many different forms with regard to scope, duration and type and number of participants. They range from project-specific collaborations to strategic alliances to large multi-party consortia. Each of them offers specific opportunities and faces distinct challenges. Among types of collaboration, investigator-initiated studies are becoming increasingly popular but have legal, ethical, and financial implications. Adaptive trial designs are also increasingly discussed. However, adaptive should not be used as a euphemism for the repurposing of a failed trial; rather, it requires careful planning and specification before a trial starts. Adaptive licensing can be a counterpart of adaptive trial design. The use of Big Data is another opportunity to leverage existing information into knowledge usable for drug discovery and development. Respecting limitations of informed consent and privacy is a key challenge in the use of Big Data. Speakers and participants at the symposium were convinced that appropriate use of the above new options may indeed help to increase the efficiency of future drug development. PMID:27999543

  1. Progress in the diagnosis and classification of pituitary adenomas

    Directory of Open Access Journals (Sweden)

    Luis V Syro

    2015-06-01

    Full Text Available Pituitary adenomas are common neoplasms. Their classification is based upon size, invasion of adjacent structures, sporadic or familial cases, biochemical activity, clinical manifestations, morphological characteristics, response to treatment and recurrence. Although they are considered benign tumors, some of them are difficult to treat due to their tendency to recur, despite standardized treatment. Functional tumors present other challenges for normalizing their biochemical activity. Novel approaches for early diagnosis as well as different perspectives on classification may help to identify subgroups of patients with similar characteristics, creating opportunities to match each patient with the best personalized treatment option. In this paper we present the progress in the diagnosis and classification of different subgroups of patients with pituitary tumors that may be managed with specific considerations according to their tumor subtype.

  2. CLASSIFICATION OF THE MGR WASTE EMPLACEMENT/RETRIEVAL SYSTEM

    International Nuclear Information System (INIS)

    J.A. Ziegler

    2000-01-01

    The purpose of this analysis is to document the Quality Assurance (QA) classification of the Monitored Geologic Repository (MGR) waste emplacement/retrieval system structures, systems and components (SSCs) performed by the MGR Preclosure Safety and Systems Engineering Section. This analysis also provides the basis for revision of YMP/90-55Q, Q-List (YMP 2000). The Q-List identifies those MGR SSCs subject to the requirements of DOE/RW-0333P, ''Quality Assurance Requirements and Description'' (QARD) (DOE 2000). This QA classification incorporates the current MGR design and the results of the ''Design Basis Event Frequency and Dose Calculation for Site Recommendation'' (CRWMS M&O 2000a). The content and technical approach of this analysis are in accordance with the development plan ''QA Classification of MGR Structures, Systems, and Components'' (CRWMS M&O 1999b).

  3. Small-Scale Design Experiments as Working Space for Larger Mobile Communication Challenges

    Science.gov (United States)

    Lowe, Sarah; Stuedahl, Dagny

    2014-01-01

    In this paper, a design experiment using Instagram as a cultural probe is submitted as a method for analyzing the challenges that arise when considering the implementation of social media within a distributed communication space. It outlines how small, iterative investigations can reveal deeper research questions relevant to the education of…

  4. Integration of heterogeneous features for remote sensing scene classification

    Science.gov (United States)

    Wang, Xin; Xiong, Xingnan; Ning, Chen; Shi, Aiye; Lv, Guofang

    2018-01-01

    Scene classification is one of the most important issues in remote sensing (RS) image processing. We find that features from different channels (shape, spectral, texture, etc.), levels (low-level and middle-level), or perspectives (local and global) could provide various properties for RS images, and then propose a heterogeneous feature framework to extract and integrate heterogeneous features of different types for RS scene classification. The proposed method is composed of three modules: (1) heterogeneous feature extraction, where three heterogeneous feature types, called DS-SURF-LLC, mean-Std-LLC, and MS-CLBP, are calculated; (2) heterogeneous feature fusion, where multiple kernel learning (MKL) is utilized to integrate the heterogeneous features; and (3) an MKL support vector machine classifier for RS scene classification. The proposed method is extensively evaluated on three challenging benchmark datasets (a 6-class dataset, a 12-class dataset, and a 21-class dataset), and the experimental results show that the proposed method leads to good classification performance and produces informative features to describe the RS image scenes. Moreover, the integration of heterogeneous features outperforms some state-of-the-art features on RS scene classification tasks.

  5. Proposal of a new classification scheme for periocular injuries

    Directory of Open Access Journals (Sweden)

    Devi Prasad Mohapatra

    2017-01-01

    Full Text Available Background: Eyelids are important structures that protect the globe from trauma and brightness, maintain the integrity of the tear film, move tears towards the lacrimal drainage system, and contribute to the aesthetic appearance of the face. Ophthalmic trauma is an important cause of morbidity among individuals and has also been responsible for additional healthcare costs. Periocular trauma involving the eyelids and adjacent structures has increased recently, probably due to the increased pace of life and increased dependence on machinery. A comprehensive classification of periocular trauma would help in stratifying these injuries as well as studying outcomes. Material and Methods: This study was carried out at our institute from June 2015 to Dec 2015. We searched multiple English-language databases for existing classification systems for periocular trauma. We designed a system of classification of periocular soft tissue injuries based on clinico-anatomical presentations. This classification was applied prospectively to patients presenting with periocular soft tissue injuries to our department. Results: A comprehensive classification scheme was designed consisting of five types of periocular injuries. A total of 38 eyelid injuries in 34 patients were evaluated in this study. According to the System for Peri-Ocular Trauma (SPOT) classification, Type V injuries were most common. SPOT Type II injuries were the most common isolated injuries among all zones. Discussion: Classification systems are necessary in order to provide a framework in which to scientifically study the etiology, pathogenesis, and treatment of diseases in an orderly fashion. The SPOT classification takes into account periocular soft tissue injuries, i.e., upper eyelid, lower eyelid, and medial and lateral canthus injuries, based on observed clinico-anatomical patterns of eyelid injuries. Conclusion: The SPOT classification seems to be a reliable

  6. Ecological Design of Fernery based on Bioregion Classification System in Ecopark Cibinong Science Center Botanic Gardens, Indonesia

    Science.gov (United States)

    Nafar, S.; Gunawan, A.

    2017-10-01

    Indonesia, as a mega-biodiversity country, has a wide variety of ferns. However, the natural habitats of ferns are currently degrading, particularly in lowlands, due to increasing urban sprawl and the development of industrial zones. Therefore, Ecology Park (Ecopark) Cibinong Science Center-Botanic Gardens, an ex-situ conservation area, is expected to be the best location to conserve lowland ferns. The purpose of this study is to design a fernery through an ecological landscape design process. The main concept is The Journey of Fern; this concept aims at providing users with experiences in the fernery by combining conservational, educational, and recreational aspects. Ecological landscape design is applied in general through the principle of reduce, reuse, and recycle (3R). The bioregion classification system is applied by grouping the plants based on the characteristics of light, water, soil, air, and temperature. The design concept is inspired by the morphology of ferns and their growth patterns, which are transformed into organic and geometric forms. The result of this study is a design of a fernery which consists of a welcome area, a recreation area, a service area, and a conservation education area as the main area, which provides 66 species of ferns.

  7. Limited resources of genome sequencing in developing countries: Challenges and solutions

    Directory of Open Access Journals (Sweden)

    Mohamed Helmy

    2016-06-01

    Full Text Available The differences between countries in national income, growth, human development and many other factors are used to classify countries into developed and developing countries. There are several classification systems that use different sets of measures and criteria. The most common classifications are the United Nations (UN) and the World Bank (WB) systems. The UN classification system uses the UN Human Development Index (HDI), an indicator that uses statistics of life expectancy, education, and income per capita for countries' classification, while the WB system uses gross national income (GNI) per capita, calculated using the World Bank Atlas method. According to the UN and WB classification systems, there are 151 and 134 developing countries, respectively, with 89% overlap between the two systems. Developing countries have limited human development and limited expenditure on education and research, among several other limitations. The biggest challenge facing genomic researchers and clinicians is limited resources. As a result, genomic tools, specifically genome sequencing technologies, which are rapidly becoming indispensable, are not widely available. In this report, we explore the current status of sequencing technologies in developing countries, describe the associated challenges and emphasize potential solutions.

  8. The Hybrid of Classification Tree and Extreme Learning Machine for Permeability Prediction in Oil Reservoir

    KAUST Repository

    Prasetyo Utomo, Chandra

    2011-06-01

    Permeability is an important parameter connected with oil reservoirs. Predicting the permeability could save millions of dollars. Unfortunately, petroleum engineers have faced numerous challenges arriving at cost-efficient predictions. Much work has been carried out to solve this problem. The main challenge is to handle the high range of permeability in each reservoir. For about a hundred years, mathematicians and engineers have tried to deliver the best prediction models. However, none of them have produced satisfying results. In the last two decades, artificial intelligence models have been used. The current best prediction model in permeability prediction is the extreme learning machine (ELM). It produces fairly good results, but a clear explanation of the model is hard to come by because it is so complex. The aim of this research is to propose a way out of this complexity through the design of a hybrid intelligent model. In this proposal, the system combines classification and regression models to predict the permeability value, based on the well logs data. In order to handle the high range of the permeability value, a classification tree is utilized. A benefit of this innovation is that the tree represents knowledge in a clear and succinct fashion and thereby avoids the complexity of all previous models. Finally, it is important to note that the ELM is used as the final predictor. Results demonstrate that this proposed hybrid model performs better when compared with support vector machines (SVM) and ELM in terms of correlation coefficient. Moreover, the classification tree model potentially leads to better communication among petroleum engineers concerning this important process and has wider implications for oil reservoir management efficiency.
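The extreme learning machine used as the final predictor can be sketched in a few lines: the hidden-layer weights are drawn at random and never trained, and only the output weights are fitted, by ordinary least squares. The toy example below (a generic illustration, not the thesis's implementation; the two-class data stand in for permeability classes and are invented):

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_fit(X, y, n_hidden=50):
    """Extreme learning machine: random hidden layer, output weights by least squares."""
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights, never trained
    b = rng.normal(size=n_hidden)                 # random biases
    H = np.tanh(X @ W + b)                        # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # only these weights are fitted
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# toy two-class problem standing in for reservoir classes
X = np.vstack([rng.normal(-2, 1, (100, 4)), rng.normal(2, 1, (100, 4))])
y = np.repeat([-1.0, 1.0], 100)
W, b, beta = elm_fit(X, y)
acc = (np.sign(elm_predict(X, W, b, beta)) == np.sign(y)).mean()
print(acc > 0.9)  # → True
```

Because training reduces to one linear least-squares solve, fitting is very fast, but the random hidden layer is exactly what makes the model hard to interpret, which is the motivation the abstract gives for pairing it with a classification tree.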

  9. Data sharing policy design for consortia: challenges for sustainability.

    Science.gov (United States)

    Kaye, Jane; Hawkins, Naomi

    2014-01-01

    The field of human genomics has led advances in the sharing of data with a view to facilitating translation of research into innovations for human health. This change in scientific practice has been implemented through new policy developed by many principal investigators, project managers and funders, which has ultimately led to new forms of practice and innovative governance models for data sharing. Here, we examine the development of the governance of data sharing in genomics, and explore some of the key challenges associated with the design and implementation of these policies. We examine how the incremental nature of policy design, the perennial problem of consent, the gridlock caused by multiple and overlapping access systems, the administrative burden and the problems with incentives and acknowledgment all have an impact on the potential for data sharing to be maximized. We conclude by proposing ways in which the scientific community can address these problems, to improve the sustainability of data sharing into the future.

  10. From fault classification to fault tolerance for multi-agent systems

    CERN Document Server

    Potiron, Katia; Taillibert, Patrick

    2013-01-01

    Faults are a concern for Multi-Agent Systems (MAS) designers, especially if the MAS are built for industrial or military use because there must be some guarantee of dependability. Some fault classification exists for classical systems, and is used to define faults. When dependability is at stake, such fault classification may be used from the beginning of the system's conception to define fault classes and specify which types of faults are expected. Thus, one may want to use fault classification for MAS; however, From Fault Classification to Fault Tolerance for Multi-Agent Systems argues that

  11. Volunteer-Based System for classification of traffic in computer networks

    DEFF Research Database (Denmark)

    Bujlow, Tomasz; Balachandran, Kartheepan; Riaz, M. Tahir

    2011-01-01

    To overcome the drawbacks of existing methods for traffic classification (by ports, Deep Packet Inspection, statistical classification) a new system was developed, in which the data are collected from client machines. This paper presents design of the system, implementation, initial runs and obta...

  12. ACCELERATOR PHYSICS CHALLENGES IN THE DESIGN OF MULTI-BEND-ACHROMAT-BASED STORAGE RINGS

    Energy Technology Data Exchange (ETDEWEB)

    Borland, M.; Hettel, R.; Leemann, S. C.; Robin, D. S.

    2017-06-01

    With the recent success in commissioning of MAX IV, the multi-bend achromat (MBA) lattice has begun to deliver on its promise to usher in a new generation of higher-brightness synchrotron light sources. In this paper, we begin by reviewing the challenges, recent success, and lessons learned of the MAX-IV project. Drawing on these lessons, we then describe the physics challenges in even more ambitious rings and how these can be met. In addition, we touch on engineering issues and choices that are tightly linked with the physics design.

  13. Design of a flexible tactile sensor for classification of rigid and deformable objects

    DEFF Research Database (Denmark)

    Drimus, Alin; Kootstra, Gert; Bilberg, Arne

    2014-01-01

For both humans and robots, tactile sensing is important for interaction with the environment: it is the core sensing used for exploration and manipulation of objects. In this paper, we present a novel tactile-array sensor based on flexible piezoresistive rubber. We describe the design of the sensor and data acquisition system. We evaluate the sensitivity and robustness of the sensor, and show that it is consistent over time with little relaxation. Furthermore, the sensor has the benefit of being flexible, having a high resolution, being easy to mount, and simple to manufacture. We demonstrate the use of the sensor in an active object-classification system. A robotic gripper with two sensors mounted on its fingers performs a palpation procedure on a set of objects. By squeezing an object, the robot actively explores the material properties, and the system acquires tactile information corresponding...

  14. Classification-based comparison of pre-processing methods for interpretation of mass spectrometry generated clinical datasets

    Directory of Open Access Journals (Sweden)

    Hoefsloot Huub CJ

    2009-05-01

Full Text Available Abstract Background Mass spectrometry is increasingly being used to discover proteins or protein profiles associated with disease. Experimental design of mass-spectrometry studies has come under close scrutiny and the importance of strict protocols for sample collection is now understood. However, the question of how best to process the large quantities of data generated is still unanswered. The main challenges for the analysis are the choice of proper pre-processing and classification methods. While these two issues have been investigated in isolation, we propose to use the classification of patient samples as a clinically relevant benchmark for the evaluation of pre-processing methods. Results Two in-house generated clinical SELDI-TOF MS datasets are used in this study as an example of high-throughput mass-spectrometry data. We perform a systematic comparison of two commonly used pre-processing methods as implemented in Ciphergen ProteinChip Software and in the Cromwell package. With respect to reproducibility, Ciphergen and Cromwell pre-processing are largely comparable. We find that the overlap between peaks detected by either Ciphergen ProteinChip Software or Cromwell is large. This is especially the case for the more stringent peak detection settings. Moreover, similarity of the estimated intensities between matched peaks is high. We evaluate the pre-processing methods using five different classification methods. Classification is done in a double cross-validation protocol using repeated random sampling to obtain an unbiased estimate of classification accuracy. No pre-processing method significantly outperforms the other for all peak detection settings evaluated. Conclusion We use classification of patient samples as a clinically relevant benchmark for the evaluation of pre-processing methods. Both pre-processing methods lead to similar classification results on an ovarian cancer and a Gaucher disease dataset. However, the settings for pre
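The double cross-validation protocol this record relies on can be sketched in a few lines. Only the protocol shape comes from the abstract (an inner split selects the model, an outer split with repeated random sampling measures accuracy on data never used for that choice); the 1-D nearest-neighbour classifier, the synthetic Gaussian data, and the split sizes below are illustrative assumptions, not details from the paper.

```python
import random
import statistics

def knn_predict(train, x, k):
    """Predict the label of x by majority vote among its k nearest
    1-D training points."""
    neigh = sorted(train, key=lambda p: abs(p[0] - x))[:k]
    votes = [label for _, label in neigh]
    return max(set(votes), key=votes.count)

def double_cv(data, ks, n_outer=20, seed=0):
    """Double cross-validation with repeated random sampling: the inner
    split picks the hyperparameter k, the outer split measures accuracy
    on samples never used for that choice."""
    rng = random.Random(seed)
    accs = []
    for _ in range(n_outer):
        d = data[:]
        rng.shuffle(d)
        cut = len(d) * 2 // 3
        outer_train, outer_test = d[:cut], d[cut:]
        half = len(outer_train) // 2
        inner_train, inner_val = outer_train[:half], outer_train[half:]
        # Inner loop: choose k on the validation part of the outer training set.
        best_k = max(ks, key=lambda k: sum(
            knn_predict(inner_train, x, k) == y for x, y in inner_val))
        # Outer loop: score the chosen model on held-out data.
        accs.append(sum(knn_predict(outer_train, x, best_k) == y
                        for x, y in outer_test) / len(outer_test))
    return statistics.mean(accs)

gen = random.Random(1)
data = [(gen.gauss(0.0, 1.0), "a") for _ in range(60)] + \
       [(gen.gauss(3.0, 1.0), "b") for _ in range(60)]
acc = double_cv(data, ks=[1, 3, 5, 7])
```

Because the outer test data never influences the choice of k, the resulting accuracy estimate is unbiased, which is the point the abstract makes.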

  15. Hierarchical structure for audio-video based semantic classification of sports video sequences

    Science.gov (United States)

    Kolekar, M. H.; Sengupta, S.

    2005-07-01

A hierarchical structure for sports event classification based on audio and video content analysis is proposed in this paper. Compared to event classification in other games, that of cricket is very challenging and yet unexplored. We have successfully solved the cricket video classification problem using a six-level hierarchical structure. The first level performs event detection based on audio energy and the Zero Crossing Rate (ZCR) of the short-time audio signal. In the subsequent levels, we classify the events based on video features using a Hidden Markov Model implemented through Dynamic Programming (HMM-DP), using color or motion as a likelihood function. For some of the game-specific decisions, a rule-based classification is also performed. Our proposed hierarchical structure can easily be applied to other sports. Our results are very promising and we have moved a step forward towards addressing semantic classification problems in general.
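The level-1 detector described above rests on two short-time features, frame energy and zero-crossing rate. A minimal sketch, with the frame length, hop size, and test signals assumed for illustration (they are not taken from the paper):

```python
import numpy as np

def frame_features(signal, frame_len=256, hop=128):
    """Short-time energy and zero-crossing rate (ZCR) per frame --
    the level-1 audio features described in the record."""
    feats = []
    for start in range(0, len(signal) - frame_len + 1, hop):
        frame = signal[start:start + frame_len]
        energy = float(np.sum(frame ** 2)) / frame_len
        # ZCR: fraction of adjacent sample pairs whose sign differs
        zcr = float(np.mean(np.abs(np.diff(np.sign(frame))) > 0))
        feats.append((energy, zcr))
    return feats

# A high-frequency signal crosses zero far more often than a low-frequency tone.
t = np.arange(2048) / 8000.0
low = np.sin(2 * np.pi * 100 * t)              # 100 Hz tone
high = np.sign(np.sin(2 * np.pi * 2000 * t))   # 2 kHz square wave
zcr_low = float(np.mean([z for _, z in frame_features(low)]))
zcr_high = float(np.mean([z for _, z in frame_features(high)]))
```

Thresholding these two per-frame values is what allows a cheap first-level event/no-event decision before any video analysis.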

  16. Audio stream classification for multimedia database search

    Science.gov (United States)

    Artese, M.; Bianco, S.; Gagliardi, I.; Gasparini, F.

    2013-03-01

Search and retrieval of huge archives of multimedia data is a challenging task. A classification step is often used to reduce the number of entries on which to perform the subsequent search. In particular, when new entries of the database are continuously added, a fast classification based on simple threshold evaluation is desirable. In this work we present a CART-based (Classification And Regression Tree [1]) classification framework for audio streams belonging to multimedia databases. The database considered is the Archive of Ethnography and Social History (AESS) [2], which is mainly composed of popular songs and other audio records describing the popular traditions handed down generation by generation, such as traditional fairs and customs. The peculiarities of this database are that it is continuously updated; the audio recordings are acquired in unconstrained environments; and it is difficult for non-expert human users to create the ground-truth labels. In our experiments, half of all the available audio files were randomly extracted and used as the training set. The remaining ones were used as the test set. The classifier was trained to distinguish among three different classes: speech, music, and song. All the audio files in the dataset were previously labeled manually by domain experts into the three classes defined above.
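The "fast classification based on simple threshold evaluation" that the record describes can be illustrated with a depth-1 CART tree (a decision stump) trained on a random half of the data, mirroring the half/half split above. The single feature, its values, and the two class names are invented for the sketch:

```python
import random

def fit_stump(xs, ys):
    """Fit a depth-1 decision tree (a single CART-style threshold split)
    by exhaustive search over candidate thresholds."""
    best_thr, best_pair, best_acc = None, None, -1.0
    for thr in sorted(set(xs)):
        for pair in (("music", "speech"), ("speech", "music")):
            lo, hi = pair
            preds = [lo if x <= thr else hi for x in xs]
            acc = sum(p == y for p, y in zip(preds, ys)) / len(ys)
            if acc > best_acc:
                best_thr, best_pair, best_acc = thr, pair, acc
    return best_thr, best_pair, best_acc

random.seed(0)
# Synthetic one-feature data: "speech" frames score higher than "music".
data = [(random.gauss(0.30, 0.05), "speech") for _ in range(100)] + \
       [(random.gauss(0.10, 0.05), "music") for _ in range(100)]
random.shuffle(data)
train, test = data[:100], data[100:]   # random half/half split, as in the record
thr, (lo, hi), _ = fit_stump(*zip(*train))
test_acc = sum((lo if x <= thr else hi) == y for x, y in test) / len(test)
```

At prediction time a new entry costs one comparison per tree level, which is why a CART framework suits a continuously growing archive.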

  17. Global hierarchical classification of deepwater and wetland environments from remote sensing products

    Science.gov (United States)

    Fluet-Chouinard, E.; Lehner, B.; Aires, F.; Prigent, C.; McIntyre, P. B.

    2017-12-01

Global surface water maps have improved in spatial and temporal resolution through various remote sensing methods: open water extents with compiled Landsat archives and inundation with topographically downscaled multi-sensor retrievals. These time-series capture variations through time of open water and inundation without discriminating between hydrographic features (e.g. lakes, reservoirs, river channels and wetland types) as other databases have done in static representations. Available data sources present the opportunity to generate a comprehensive map and typology of aquatic environments (deepwater and wetlands) that improves on earlier digitized inventories and maps. The challenge of classifying surface waters globally is to distinguish wetland types with meaningful characteristics or proxies (hydrology, water chemistry, soils, vegetation) while accommodating the limitations of remote sensing data. We present a new wetland classification scheme designed for global application and produce a map of aquatic ecosystem types globally using state-of-the-art remote sensing products. Our classification scheme combines open water extent and expands it with downscaled multi-sensor inundation data to capture the maximal vegetated wetland extent. The hierarchical structure of the classification is modified from the Cowardin Systems (1979) developed for the USA. The first-level classification is based on a combination of landscape positions and water sources (e.g. lacustrine, riverine, palustrine, coastal and artificial) while the second level represents the hydrologic regime (e.g. perennial, seasonal, intermittent and waterlogged). Class-specific descriptors can further detail the wetland types with soils and vegetation cover. Our globally consistent nomenclature and top-down mapping allow for direct comparison across biogeographic regions, for upscaling biogeochemical fluxes, as well as for other landscape-level functions.

  18. IR-360 nuclear power plant safety functions and component classification

    International Nuclear Information System (INIS)

    Yousefpour, F.; Shokri, F.; Soltani, H.

    2010-01-01

The IR-360 nuclear power plant, a 2-loop PWR with 360 MWe of power generation capacity, is under design at the MASNA Company. For the design of the IR-360 structures, systems and components (SSCs), the codes and standards and their design requirements must be determined. Correctly classifying the IR-360 safety functions and the safety grades of structures, systems and components is a prerequisite for selecting and adopting suitable design codes and standards. This paper refers to the IAEA nuclear safety codes and standards as well as the USNRC standard system to determine the IR-360 safety functions and to formulate the principles of the IR-360 component classification in accordance with the safety philosophy and features of the IR-360. By implementation of the defined classification procedures for the IR-360 SSCs, the appropriate design codes and standards are specified. The requirements of specific codes and standards are used in the design process of IR-360 SSCs by design engineers of the MASNA Company. In this paper, the individual determination of the IR-360 safety functions and the definition of the classification procedures and roles are presented. Implementation of this work, which is described with an example, ensures the safety and reliability of the IR-360 nuclear power plant.

  19. IR-360 nuclear power plant safety functions and component classification

    Energy Technology Data Exchange (ETDEWEB)

    Yousefpour, F., E-mail: fyousefpour@snira.co [Management of Nuclear Power Plant Construction Company (MASNA) (Iran, Islamic Republic of); Shokri, F.; Soltani, H. [Management of Nuclear Power Plant Construction Company (MASNA) (Iran, Islamic Republic of)

    2010-10-15

The IR-360 nuclear power plant, a 2-loop PWR with 360 MWe of power generation capacity, is under design at the MASNA Company. For the design of the IR-360 structures, systems and components (SSCs), the codes and standards and their design requirements must be determined. Correctly classifying the IR-360 safety functions and the safety grades of structures, systems and components is a prerequisite for selecting and adopting suitable design codes and standards. This paper refers to the IAEA nuclear safety codes and standards as well as the USNRC standard system to determine the IR-360 safety functions and to formulate the principles of the IR-360 component classification in accordance with the safety philosophy and features of the IR-360. By implementation of the defined classification procedures for the IR-360 SSCs, the appropriate design codes and standards are specified. The requirements of specific codes and standards are used in the design process of IR-360 SSCs by design engineers of the MASNA Company. In this paper, the individual determination of the IR-360 safety functions and the definition of the classification procedures and roles are presented. Implementation of this work, which is described with an example, ensures the safety and reliability of the IR-360 nuclear power plant.

  20. ICF-based classification and measurement of functioning.

    Science.gov (United States)

    Stucki, G; Kostanjsek, N; Ustün, B; Cieza, A

    2008-09-01

If we aim towards a comprehensive understanding of human functioning and the development of comprehensive programs to optimize the functioning of individuals and populations, we need to develop suitable measures. The approval of the International Classification of Functioning, Disability and Health (ICF) in 2001 by the 54th World Health Assembly as the first universally shared model and classification of functioning, disability and health therefore marks an important step in the development of measurement instruments and ultimately for our understanding of functioning, disability and health. The acceptance and use of the ICF as a reference framework and classification have been facilitated by its development in a worldwide, comprehensive consensus process and the increasing evidence regarding its validity. However, the broad acceptance and use of the ICF as a reference framework and classification will also depend on the resolution of conceptual and methodological challenges relevant for the classification and measurement of functioning. This paper therefore first describes how the ICF categories can serve as building blocks for the measurement of functioning, and then the current state of the development of ICF-based practical tools and international standards such as the ICF Core Sets. Finally, it illustrates how to map the world of measures to the ICF and vice versa, and the methodological principles relevant for the transformation of information obtained with a clinical test or a patient-oriented instrument to the ICF, as well as the development of ICF-based clinical and self-reported measurement instruments.

  1. SVM Based Descriptor Selection and Classification of Neurodegenerative Disease Drugs for Pharmacological Modeling.

    Science.gov (United States)

    Shahid, Mohammad; Shahzad Cheema, Muhammad; Klenner, Alexander; Younesi, Erfan; Hofmann-Apitius, Martin

    2013-03-01

Systems pharmacological modeling of drug mode of action for the next generation of multitarget drugs may open new routes for drug design and discovery. Computational methods are widely used in this context, amongst which support vector machines (SVM) have proven successful in addressing the challenge of classifying drugs with similar features. We have applied one such SVM-based approach, namely SVM-based recursive feature elimination (SVM-RFE). We use the approach to predict the pharmacological properties of drugs widely used against complex neurodegenerative disorders (NDD) and to build an in-silico computational model for the binary classification of NDD drugs from other drugs. Application of an SVM-RFE model to a set of drugs successfully classified NDD drugs from non-NDD drugs and resulted in an overall accuracy of ∼80% with 10-fold cross-validation, using the 40 top-ranked molecular descriptors selected out of a total of 314 descriptors. Moreover, the SVM-RFE method outperformed linear discriminant analysis (LDA)-based feature selection and classification. The model reduced the multidimensional descriptor space of drugs dramatically and predicted NDD drugs with high accuracy, while avoiding overfitting. Based on these results, NDD-specific focused libraries of drug-like compounds can be designed and existing NDD-specific drugs can be characterized by a well-characterized set of molecular descriptors. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
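The recursive feature elimination (RFE) loop at the heart of SVM-RFE can be sketched compactly. In this hedged sketch, a least-squares linear model stands in for the linear SVM so the example stays dependency-free; the data, the informative feature indices, and the weight-magnitude ranking rule are illustrative assumptions, not the paper's descriptors:

```python
import numpy as np

def rfe_select(X, y, n_keep):
    """Recursive feature elimination: repeatedly fit a linear model and
    drop the feature with the smallest absolute weight. A least-squares
    model stands in for the SVM purely for self-containedness."""
    active = list(range(X.shape[1]))
    while len(active) > n_keep:
        w, *_ = np.linalg.lstsq(X[:, active], y, rcond=None)
        active.pop(int(np.argmin(np.abs(w))))   # drop the weakest feature
    return active

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
# Only features 2 and 7 carry class signal; the rest are noise.
y = np.sign(X[:, 2] + 0.8 * X[:, 7] + 0.1 * rng.normal(size=200))
kept = rfe_select(X, y, n_keep=2)
```

Refitting after every elimination (rather than ranking once) is what lets RFE re-evaluate correlated descriptors, which is why it can outperform one-shot rankings such as LDA-based selection.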

  2. Modern classification and outcome predictors of surgery in patients with brain arteriovenous malformations.

    Science.gov (United States)

    Tayebi Meybodi, Ali; Lawton, Michael T

    2018-02-23

Brain arteriovenous malformations (bAVM) are challenging lesions. Part of this challenge stems from the infinite diversity of these lesions regarding shape, location, anatomy, and physiology. This diversity has called for a variety of treatment modalities for these lesions, of which microsurgical resection prevails as the mainstay of treatment. As such, outcome prediction and management strategy mainly rely on unraveling the nature of these complex tangles and the ways each lesion responds to various therapeutic modalities. This strategy requires the ability to decipher each lesion through accurate and efficient categorization. Therefore, classification schemes are essential parts of treatment planning and outcome prediction. This article summarizes different surgical classification schemes and outcome predictors proposed for bAVMs.

  3. Challenges Facing 3-D Audio Display Design for Multimedia

    Science.gov (United States)

    Begault, Durand R.; Null, Cynthia H. (Technical Monitor)

    1998-01-01

The challenges facing successful multimedia presentation depend largely on the expectations of the designer and end user for a given application. Perceptual limitations in distance, elevation and azimuth sound source simulation differ significantly between headphone and cross-talk cancellation loudspeaker listening and therefore must be considered. Simulation of an environmental context is desirable, but the quality depends on processing resources and the lack of interaction with the host acoustical environment. While techniques such as data reduction of head-related transfer functions have been used widely to improve simulation fidelity, another approach involves determining thresholds for environmental acoustic events. Psychoacoustic studies relevant to this approach are reviewed in consideration of multimedia applications.

  4. Current Trends in the Molecular Classification of Renal Neoplasms

    Directory of Open Access Journals (Sweden)

    Andrew N. Young

    2006-01-01

Full Text Available Renal cell carcinoma (RCC) is the most common form of kidney cancer in adults. RCC is a significant challenge for pathologic diagnosis and clinical management. The primary approach to diagnosis is by light microscopy, using the World Health Organization (WHO) classification system, which defines histopathologic tumor subtypes with distinct clinical behavior and underlying genetic mutations. However, light microscopic diagnosis of RCC subtypes is often difficult due to variable histology. In addition, the clinical behavior of RCC is highly variable and therapeutic response rates are poor. Few clinical assays are available to predict outcome in RCC or correlate behavior with histology. Therefore, novel RCC classification systems based on gene expression should be useful for diagnosis, prognosis, and treatment. Recent microarray studies have shown that renal tumors are characterized by distinct gene expression profiles, which can be used to discover novel diagnostic and prognostic biomarkers. Here, we review clinical features of kidney cancer, the WHO classification system, and the growing role of molecular classification for diagnosis, prognosis, and therapy of this disease.

  5. Accurate crop classification using hierarchical genetic fuzzy rule-based systems

    Science.gov (United States)

    Topaloglou, Charalampos A.; Mylonas, Stelios K.; Stavrakoudis, Dimitris G.; Mastorocostas, Paris A.; Theocharis, John B.

    2014-10-01

This paper investigates the effectiveness of an advanced classification system for accurate crop classification using very high resolution (VHR) satellite imagery. Specifically, a recently proposed genetic fuzzy rule-based classification system (GFRBCS) is employed, namely, the Hierarchical Rule-based Linguistic Classifier (HiRLiC). HiRLiC's model comprises a small set of simple IF-THEN fuzzy rules, easily interpretable by humans. One of its most important attributes is that its learning algorithm requires minimum user interaction, since the most important learning parameters affecting the classification accuracy are determined by the learning algorithm automatically. HiRLiC is applied in a challenging crop classification task, using a SPOT5 satellite image over an intensively cultivated area in a lake-wetland ecosystem in northern Greece. A rich set of higher-order spectral and textural features is derived from the initial bands of the (pan-sharpened) image, resulting in an input space comprising 119 features. The experimental analysis proves that HiRLiC compares favorably to other interpretable classifiers in the literature, both in terms of structural complexity and classification accuracy. Its testing accuracy was very close to that obtained by complex state-of-the-art classification systems, such as the support vector machine (SVM) and random forest (RF) classifiers. Nevertheless, visual inspection of the derived classification maps shows that HiRLiC is characterized by higher generalization properties, providing more homogeneous classifications than the competitors. Moreover, the runtime requirements for producing the thematic map were orders of magnitude lower than those of the competitors.

  6. Challenges and design solutions of the liquid hydrogen circuit at the European Spallation Source

    Energy Technology Data Exchange (ETDEWEB)

    Gallimore, S.; Nilsson, P.; Sabbagh, P.; Takibayev, A.; Weisend II, J. G. [European Spallation Source ESS AB, SE-22100 Lund (Sweden); Beßler, Y. [Forschungzentrum Jülich, Jülich (Germany); Klaus, M. [Technische Universität Dresden, Dresden (Germany)

    2014-01-29

    The European Spallation Source (ESS), Lund, Sweden will be a 5MW long-pulse neutron spallation research facility and will enable new opportunities for researchers in the fields of life sciences, energy, environmental technology, cultural heritage and fundamental physics. Neutrons are produced by accelerating a high-energy proton beam into a rotating helium-cooled tungsten target. These neutrons pass through moderators to reduce their energy to an appropriate range (< 5 meV for cold neutrons); two of which will use liquid hydrogen at 17 K as the moderating and cooling medium. There are several technical challenges to overcome in the design of a robust system that will operate under such conditions, not least the 20 kW of deposited heat. These challenges and the associated design solutions will be detailed in this paper.

  7. Automatic Classification of Attacks on IP Telephony

    Directory of Open Access Journals (Sweden)

    Jakub Safarik

    2013-01-01

Full Text Available This article proposes an algorithm for automatic analysis of attack data in an IP telephony network with a neural network. Data for the analysis are gathered from various monitoring applications running in the network. These monitoring systems are a typical part of today's networks. Information from them is usually used after an attack. It is possible to use automatic classification of IP telephony attacks for near real-time classification and counter-attack or mitigation of potential attacks. The classification uses the proposed neural network, and the article covers the design of the neural network and its practical implementation. It also contains methods for neural network learning and data-gathering functions from the honeypot application.

  8. The Design of Cluster Randomized Trials with Random Cross-Classifications

    Science.gov (United States)

    Moerbeek, Mirjam; Safarkhani, Maryam

    2018-01-01

    Data from cluster randomized trials do not always have a pure hierarchical structure. For instance, students are nested within schools that may be crossed by neighborhoods, and soldiers are nested within army units that may be crossed by mental health-care professionals. It is important that the random cross-classification is taken into account…

  9. Sentiment classification of Roman-Urdu opinions using Naïve Bayesian, Decision Tree and KNN classification techniques

    Directory of Open Access Journals (Sweden)

    Muhammad Bilal

    2016-07-01

Full Text Available Sentiment mining is a field of text mining to determine the attitude of people about a particular product, topic, or politician in newsgroup posts, review sites, comments on Facebook posts, Twitter, etc. There are many issues involved in opinion mining. One important issue is that opinions could be in different languages (English, Urdu, Arabic, etc.). To tackle each language according to its orientation is a challenging task. Most of the research work in sentiment mining has been done in the English language. Currently, limited research is being carried out on sentiment classification of other languages like Arabic, Italian, Urdu and Hindi. In this paper, three classification models are used for text classification using the Waikato Environment for Knowledge Analysis (WEKA). Opinions written in Roman-Urdu and English are extracted from a blog. These extracted opinions are documented in text files to prepare a training dataset containing 150 positive and 150 negative opinions as labeled examples. The testing dataset is supplied to the three different models and the results in each case are analyzed. The results show that Naïve Bayesian outperformed Decision Tree and KNN in terms of accuracy, precision, recall and F-measure.
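The Naïve Bayesian model the record found best can be illustrated with a minimal multinomial Naive Bayes classifier using Laplace smoothing. The WEKA implementation and the Roman-Urdu corpus are not reproduced here; the tiny English training set below is invented purely as a stand-in for the labeled blog opinions:

```python
import math
from collections import Counter

class NaiveBayes:
    """Minimal multinomial Naive Bayes with Laplace (add-one) smoothing."""
    def fit(self, docs, labels):
        self.classes = sorted(set(labels))
        self.prior = {c: labels.count(c) / len(labels) for c in self.classes}
        self.counts = {c: Counter() for c in self.classes}
        for doc, c in zip(docs, labels):
            self.counts[c].update(doc.lower().split())
        self.vocab = {w for cnt in self.counts.values() for w in cnt}
        return self

    def predict(self, doc):
        def log_score(c):
            total = sum(self.counts[c].values())
            s = math.log(self.prior[c])
            for w in doc.lower().split():
                # Add-one smoothing keeps unseen words from zeroing the score.
                s += math.log((self.counts[c][w] + 1) /
                              (total + len(self.vocab)))
            return s
        return max(self.classes, key=log_score)

# Tiny invented training set (English stand-ins for the blog opinions).
train_docs = ["great phone loved it", "awesome quality very happy",
              "terrible battery hated it", "very bad waste of money"]
train_labels = ["pos", "pos", "neg", "neg"]
nb = NaiveBayes().fit(train_docs, train_labels)
pred = nb.predict("loved the quality")
```

The same bag-of-words features can be fed to a decision tree or a KNN classifier, which is exactly the three-way comparison the record carries out in WEKA.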

  10. Automatic classification of blank substrate defects

    Science.gov (United States)

    Boettiger, Tom; Buck, Peter; Paninjath, Sankaranarayanan; Pereira, Mark; Ronald, Rob; Rost, Dan; Samir, Bhamidipati

    2014-10-01

Technology Center (MPMask). The Calibre ADC tool was qualified on production mask blanks against manual classification. The classification accuracy of ADC is greater than 95% for critical defects, with an overall accuracy of 90%. Sensitivity to weak defect signals and locating the defect in the images are challenges we are resolving. The performance of the tool has been demonstrated on multiple mask types and is ready for deployment in the full-volume mask manufacturing production flow. Implementation of Calibre ADC is estimated to reduce the misclassification of critical defects by 60-80%.

  11. Intelligent Computer Vision System for Automated Classification

    International Nuclear Information System (INIS)

    Jordanov, Ivan; Georgieva, Antoniya

    2010-01-01

    In this paper we investigate an Intelligent Computer Vision System applied for recognition and classification of commercially available cork tiles. The system is capable of acquiring and processing gray images using several feature generation and analysis techniques. Its functionality includes image acquisition, feature extraction and preprocessing, and feature classification with neural networks (NN). We also discuss system test and validation results from the recognition and classification tasks. The system investigation also includes statistical feature processing (features number and dimensionality reduction techniques) and classifier design (NN architecture, target coding, learning complexity and performance, and training with our own metaheuristic optimization method). The NNs trained with our genetic low-discrepancy search method (GLPτS) for global optimisation demonstrated very good generalisation abilities. In our view, the reported testing success rate of up to 95% is due to several factors: combination of feature generation techniques; application of Analysis of Variance (ANOVA) and Principal Component Analysis (PCA), which appeared to be very efficient for preprocessing the data; and use of suitable NN design and learning method.
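The statistical preprocessing the record credits (dimensionality reduction before the neural network classifier) can be sketched with PCA via the singular value decomposition. The data below is synthetic with two known latent degrees of freedom; it is an illustration of the technique, not the cork-tile features or the GLPτS-trained network from the paper:

```python
import numpy as np

def pca_reduce(X, n_components):
    """PCA via SVD: center the data, project onto the top principal
    directions, and return the projection plus all singular values."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T, S

rng = np.random.default_rng(0)
latent = rng.normal(size=(100, 2))            # two true degrees of freedom
mixing = rng.normal(size=(2, 20))             # spread into 20 observed features
X = latent @ mixing + 0.01 * rng.normal(size=(100, 20))
Z, S = pca_reduce(X, n_components=2)
explained = float((S[:2] ** 2).sum() / (S ** 2).sum())
```

Reducing 20 correlated features to 2 components with nearly all the variance retained is the kind of preprocessing that simplifies the downstream NN design and training.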

  12. Low-cost real-time automatic wheel classification system

    Science.gov (United States)

    Shabestari, Behrouz N.; Miller, John W. V.; Wedding, Victoria

    1992-11-01

This paper describes the design and implementation of a low-cost machine vision system for identifying various types of automotive wheels which are manufactured in several styles and sizes. In this application, a variety of wheels travel on a conveyor in random order through a number of processing steps. One of these processes requires the identification of the wheel type, which was performed manually by an operator. A vision system was designed to provide the required identification. The system consisted of an annular illumination source, a CCD TV camera, a frame grabber, and a 386-compatible computer. Statistical pattern recognition techniques were used to provide robust classification as well as a simple means for adding new wheel designs to the system. Maintenance of the system can be performed by plant personnel with minimal training. The basic steps for identification include image acquisition, segmentation of the regions of interest, extraction of selected features, and classification. The vision system has been installed in a plant and has proven to be extremely effective. The system correctly identifies wheels at rates of up to 30 wheels per minute, regardless of rotational orientation in the camera's field of view. Correct classification can be achieved even if a portion of the wheel is blocked from the camera. Significant cost savings have been achieved by a reduction in scrap associated with incorrect manual classification, as well as a reduction of labor in a tedious task.

  13. Design of Passive Power Filter for Hybrid Series Active Power Filter using Estimation, Detection and Classification Method

    Science.gov (United States)

    Swain, Sushree Diptimayee; Ray, Pravat Kumar; Mohanty, K. B.

    2016-06-01

This research paper presents the design of a shunt Passive Power Filter (PPF) for a Hybrid Series Active Power Filter (HSAPF) that employs a novel analytic methodology which is superior to FFT analysis. This novel approach consists of the estimation, detection and classification of the signals. The proposed method is applied to estimate, detect and classify power quality (PQ) disturbances such as harmonics. The proposed work deals with three methods: harmonic detection through the wavelet transform method, harmonic estimation by a Kalman filter algorithm, and harmonic classification by a decision tree method. Among the different types of mother wavelets in the wavelet transform method, db8 is selected as the suitable mother wavelet because of its potency on transient response and crouched oscillation in the frequency domain. In the harmonic compensation process, the detected harmonics are compensated through the Hybrid Series Active Power Filter (HSAPF) based on the Instantaneous Reactive Power Theory (IRPT). The efficacy of the proposed method is verified in the MATLAB/SIMULINK domain as well as with an experimental setup. The obtained results confirm the superiority of the proposed methodology over FFT analysis. The newly proposed PPF makes the conventional HSAPF more robust and stable.
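The wavelet-based detection step can be illustrated with a single level of wavelet detail coefficients, where a transient disturbance stands out sharply against a smooth fundamental. Haar is used here instead of db8 purely because its filter fits in one line; the synthetic 5-cycle fundamental and the injected spike are assumptions for the sketch, not the paper's signals:

```python
import numpy as np

def haar_detail(x):
    """One level of Haar wavelet detail coefficients: transients and
    sharp edges concentrate in this band, while a smooth fundamental
    contributes almost nothing."""
    x = np.asarray(x, dtype=float)
    n = len(x) - len(x) % 2
    return (x[0:n:2] - x[1:n:2]) / np.sqrt(2.0)

t = np.arange(512) / 512.0
clean = np.sin(2 * np.pi * 5 * t)   # smooth 5-cycle fundamental
disturbed = clean.copy()
disturbed[300] += 2.0               # injected spike (a PQ-style transient)
d_clean = float(np.abs(haar_detail(clean)).max())
d_dist = float(np.abs(haar_detail(disturbed)).max())
```

Thresholding the detail band localizes the disturbance in time, which a plain FFT cannot do; with db8 the same idea applies, only with a longer, smoother filter.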

  14. A Comprehensive Study of Features and Algorithms for URL-Based Topic Classification

    CERN Document Server

    Weber, I; Henzinger, M; Baykan, E

    2011-01-01

    Given only the URL of a Web page, can we identify its topic? We study this problem in detail by exploring a large number of different feature sets and algorithms on several datasets. We also show that the inherent overlap between topics and the sparsity of the information in URLs makes this a very challenging problem. Web page classification without a page's content is desirable when the content is not available at all, when a classification is needed before obtaining the content, or when classification speed is of utmost importance. For our experiments we used five different corpora comprising a total of about 3 million (URL, classification) pairs. We evaluated several techniques for feature generation and classification algorithms. The individual binary classifiers were then combined via boosting into metabinary classifiers. We achieve typical F-measure values between 80 and 85, and a typical precision of around 86. The precision can be pushed further over 90 while maintaining a typical level of recall betw...
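The basic feature-generation step for URL-only classification that this record studies starts with splitting the URL into tokens. A minimal sketch; the URL below is invented for illustration:

```python
import re

def url_tokens(url):
    """Split a URL into lowercase alphanumeric tokens, the simplest
    URL-only feature set for topic classification."""
    return [t for t in re.split(r"[^a-z0-9]+", url.lower()) if t]

# Hypothetical URL, invented for this example.
tokens = url_tokens("http://www.example.com/Sports/football-scores-2011")
```

Each token (or character n-gram derived from it) then becomes a binary feature for the per-topic classifiers that the paper combines via boosting.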

  15. Data Modeling Challenges of Advanced Interoperability.

    Science.gov (United States)

    Blobel, Bernd; Oemig, Frank; Ruotsalainen, Pekka

    2018-01-01

Progressive health paradigms, involving many different disciplines and combining multiple policy domains, require advanced interoperability solutions. This results in special challenges for modeling health systems. The paper discusses classification systems for data models and enterprise business architectures and compares them with the ISO Reference Architecture. On that basis, existing definitions, specifications and standards of data models for interoperability are evaluated and their limitations are discussed. Amendments to correctly use those models and to better meet the aforementioned challenges are offered.

  16. Clinically-inspired automatic classification of ovarian carcinoma subtypes

    Directory of Open Access Journals (Sweden)

    Aicha BenTaieb

    2016-01-01

    Context: It has been shown that ovarian carcinoma subtypes are distinct pathologic entities with differing prognostic and therapeutic implications. Histotyping by pathologists has good reproducibility, but occasional cases are challenging and require immunohistochemistry and subspecialty consultation. Motivated by the need for more accurate and reproducible diagnoses and to facilitate pathologists' workflow, we propose an automatic framework for ovarian carcinoma classification. Materials and Methods: Our method is inspired by pathologists' workflow. We analyse imaged tissues at two magnification levels and extract clinically-inspired color, texture, and segmentation-based shape descriptors using image-processing methods. We propose a carefully designed machine learning technique composed of four modules: a dissimilarity matrix, dimensionality reduction, feature selection and a support vector machine classifier to separate the five ovarian carcinoma subtypes using the extracted features. Results: This paper presents the details of our implementation and its validation on a clinically derived dataset of eighty high-resolution histopathology images. The proposed system achieved a multiclass classification accuracy of 95.0% when classifying unseen tissues. Assessment of the classifier's confusion (confusion matrix) between the five different ovarian carcinoma subtypes agrees with clinicians' confusion and reflects the difficulty in diagnosing endometrioid and serous carcinomas. Conclusions: Our results from this first study highlight the difficulty of ovarian carcinoma diagnosis, which originates from the intrinsic class imbalance observed among subtypes, and suggest that the automatic analysis of ovarian carcinoma subtypes could be valuable to clinicians' diagnostic procedure by providing a second opinion.

  17. Retrieval and classification of food images.

    Science.gov (United States)

    Farinella, Giovanni Maria; Allegra, Dario; Moltisanti, Marco; Stanco, Filippo; Battiato, Sebastiano

    2016-10-01

    Automatic food understanding from images is an interesting challenge with applications in different domains. In particular, food intake monitoring is becoming more and more important because of the key role that it plays in health and market economies. In this paper, we address the study of food image processing from the perspective of Computer Vision. As a first contribution we present a survey of the studies in the context of food image processing, from the early attempts to the current state-of-the-art methods. Since retrieval and classification engines able to work on food images are required to build automatic systems for diet monitoring (e.g., to be embedded in wearable cameras), we focus our attention on the representation of food images, because it plays a fundamental role in the understanding engines. Food retrieval and classification is a challenging task since food presents high variability and intrinsic deformability. To properly study the peculiarities of different image representations we propose the UNICT-FD1200 dataset. It is composed of 4754 food images of 1200 distinct dishes acquired during real meals. Each food plate is acquired multiple times and the overall dataset presents both geometric and photometric variability. The images of the dataset have been manually labeled considering 8 categories: Appetizer, Main Course, Second Course, Single Course, Side Dish, Dessert, Breakfast, Fruit. We have performed tests employing different state-of-the-art representations to assess the related performance on the UNICT-FD1200 dataset. Finally, we propose a new representation based on the perceptual concept of Anti-Textons, which is able to encode spatial information between Textons, outperforming other representations in the context of food retrieval and classification.

  18. Mechatronic futures challenges and solutions for mechatronic systems and their designers

    CERN Document Server

    Bradley, David

    2016-01-01

    Offering a comprehensive overview of the challenges, risks and options facing the future of mechatronics, this book provides insights into how these issues are currently assessed and managed. Building on the previously published book ‘Mechatronics in Action,’ it identifies and discusses the key issues likely to impact on future mechatronic systems. It supports mechatronics practitioners in identifying key areas in design, modeling and technology and places these in the wider context of concepts such as cyber-physical systems and the Internet of Things. For educators it considers the potential effects of developments in these areas on mechatronic course design, and ways of integrating these. Written by experts in the field, it explores topics including systems integration, design, modeling, privacy, ethics and future application domains. Highlighting novel innovation directions, it is intended for academics, engineers and students working in the field of mechatronics, particularly those developing new conc...

  19. A New Classification Approach Based on Multiple Classification Rules

    OpenAIRE

    Zhongmei Zhou

    2014-01-01

    A good classifier can correctly predict new data for which the class label is unknown, so it is important to construct a high-accuracy classifier. Hence, classification techniques are very useful in ubiquitous computing. Associative classification achieves higher classification accuracy than some traditional rule-based classification approaches. However, the approach also has two major deficiencies. First, it generates a very large number of association classification rules, especially when t...

  20. A Way Forward for Ship Classification and Technical Services

    Directory of Open Access Journals (Sweden)

    Lam-Bee Goh

    2014-04-01

    Classification societies are among the key organizations that promote the highest standards in ship safety and quality shipping. The paper reviews the ship classification industry and identifies what classification societies can do to add value to the maritime industry more effectively. To meet this objective, an analysis of the five competitive forces is carried out, together with an opinion survey of some of the leading shipping companies, to assess and establish some of the key factors which should be considered when formulating an overall business strategy for the growth of the classification services business. The findings from the study are discussed along with the strategic options and choices. An industrial value chain analysis of classification services, together with ship management and operation, is undertaken to explore the opportunities for classification societies. These findings also provide guidance to policy-makers who design and seek to implement more effective international shipping policies.

  1. Is classification necessary after Google?

    DEFF Research Database (Denmark)

    Hjørland, Birger

    2012-01-01

    believe that the activity of “classification” is not worth the effort, as search engines can be improved without the heavy cost of providing metadata. Design/methodology/approach – The basic issue in classification is seen as providing criteria for deciding whether A should be classified as X...

  2. Design challenges for long-term interaction with a robot in a science classroom

    NARCIS (Netherlands)

    Davison, Daniel Patrick; Charisi, Vasiliki; Wijnen, Frances Martine; Papenmeier, Andrea; van der Meij, Jan; Reidsma, Dennis; Evers, Vanessa

    This paper presents the main challenges that emerged during the research design of a longitudinal study on child-robot interaction for science education, and discusses relevant suggestions in this context. The theoretical rationale is based on aspects of the theory of social

  3. 78 FR 68983 - Cotton Futures Classification: Optional Classification Procedure

    Science.gov (United States)

    2013-11-18

    ...-AD33 Cotton Futures Classification: Optional Classification Procedure AGENCY: Agricultural Marketing... regulations to allow for the addition of an optional cotton futures classification procedure--identified and... response to requests from the U.S. cotton industry and ICE, AMS will offer a futures classification option...

  4. HDPE (High Density Polyethylene) pipeline and riser design in Guanabara Bay: challenges and solutions

    Energy Technology Data Exchange (ETDEWEB)

    Bomfimsilva, Carlos; Jorge, Joao Paulo Carrijo; Schmid, Dominique; Gomes, Rodrigo Klim [INTECSEA, Sao Paulo, SP (Brazil); Lima, Alexander Piraja [GDK, Salvador, BA (Brazil)

    2009-12-19

    Worldwide shipments of plastic pipes are forecast to increase 5.2% per year from 2008, such pipes being commonly used for water supply and sewage disposal. HDPE (High Density Polyethylene) pipes have recently been applied to deliver potable water and fire-fighting water for the main pier of the LNG system in Guanabara Bay, Rio de Janeiro. The system contains three pipe outside diameters: 110 mm and 160 mm for water supply, and 500 mm for the fire-fighting system. The main design challenges of the pipeline system included providing on-bottom stability, a suitable installation procedure and a proper riser design. The on-bottom stability calculations, which are quite different from those for conventional steel pipelines, were developed by designing concrete blocks to be assembled on the pipeline at a required spacing to assure long-term stability, since plastic pipes are buoyant even in flooded condition. The installation procedure was developed considering a lay-down methodology based on the surface towing technique. The riser was designed to be installed together with an additional steel support structure to allow the entire underwater system to have the same plastic pipe specification up to the surface. This paper presents the main challenges that were faced during the design of the HDPE pipelines for the LNG system in Guanabara Bay, addressing the solutions and recommendations adopted for the plastic underwater pipeline system.
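    The remark that plastic pipes are buoyant even when flooded can be checked with a simple hydrostatic balance, which is also the starting point for sizing the concrete ballast blocks. The sketch below uses invented densities, wall thickness and block volume, and considers only hydrostatic uplift; a real on-bottom stability design (e.g. per DNV recommended practice) must also account for wave and current loads:

```python
import math

RHO_SW = 1025.0    # seawater density, kg/m^3 (assumed)
RHO_CONC = 2400.0  # concrete density, kg/m^3 (assumed)
G = 9.81           # gravity, m/s^2

def net_uplift_per_m(od: float, wall: float, rho_pipe: float, rho_content: float) -> float:
    """Net buoyant force per metre of submerged pipe (N/m).
    Positive means the pipe floats and needs ballast."""
    a_ext = math.pi * od ** 2 / 4
    a_int = math.pi * (od - 2 * wall) ** 2 / 4
    buoyancy = RHO_SW * G * a_ext               # Archimedes uplift
    w_pipe = rho_pipe * G * (a_ext - a_int)     # pipe wall weight
    w_content = rho_content * G * a_int         # weight of contents
    return buoyancy - w_pipe - w_content

def block_spacing(block_vol: float, uplift_per_m: float, safety: float = 1.1) -> float:
    """Max spacing (m) of concrete ballast blocks to hold the pipe down."""
    submerged_block_weight = block_vol * (RHO_CONC - RHO_SW) * G
    return submerged_block_weight / (uplift_per_m * safety)

# Hypothetical 500 mm OD HDPE line, flooded with seawater:
uplift = net_uplift_per_m(od=0.5, wall=0.030, rho_pipe=955.0, rho_content=RHO_SW)
print(f"net uplift: {uplift:.0f} N/m; block spacing: {block_spacing(0.15, uplift):.1f} m")
```

Because HDPE is slightly lighter than seawater, the uplift stays positive even for a flooded line, which is why ballast blocks are needed at all.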

  5. Development of a classification system for cup anemometers - CLASSCUP

    DEFF Research Database (Denmark)

    Friis Pedersen, Troels

    2003-01-01

    the objectives to quantify the errors associated with the use of cup anemometers, to determine the requirements for an optimum design of a cup anemometer, and to develop a classification system for quantification of systematic errors of cup anemometers. The present report describes this proposed...... classification system. A classification method for cup anemometers has been developed, which proposes general external operational ranges to be used. A normal category range connected to ideal sites of the IEC power performance standard was made, and another extended category range for complex terrain...... was proposed. General classification indices were proposed for all types of cup anemometers. As a result of the classification, the cup anemometer will be assigned to a certain class: 0.5, 1, 2, 3 or 5, with corresponding intrinsic errors (%) as a vector instrument (3D) or as a horizontal instrument (2D...
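    The class number is tied to a permitted intrinsic error band. As a minimal sketch of the final lookup step only: the real procedure (as later standardised in IEC 61400-12-1) derives the maximum deviation by weighting errors over defined ranges of wind speed, turbulence, temperature and flow angle, which is not reproduced here; the input below is an assumed, already-computed deviation:

```python
# Sketch: assigning a cup anemometer to a class from its maximum intrinsic
# error, under the simplifying assumption that class k permits roughly k%
# deviation. Only the final class lookup is shown; deriving the deviation
# itself is the substantial part of the classification method.

CLASSES = (0.5, 1.0, 2.0, 3.0, 5.0)

def assign_class(max_error_pct: float) -> float:
    """Return the smallest class whose error band contains the deviation."""
    for c in CLASSES:
        if max_error_pct <= c:
            return c
    raise ValueError("deviation exceeds the largest defined class")

print(assign_class(0.8))  # -> 1.0
print(assign_class(2.4))  # -> 3.0
```

The same instrument can land in different classes depending on whether it is evaluated as a vector (3D) or horizontal (2D) instrument, since the two definitions weight vertical flow differently.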

  6. The Cool and Belkin Faceted Classification of Information Interactions Revisited

    Science.gov (United States)

    Huvila, Isto

    2010-01-01

    Introduction: The complexity of human information activity is a challenge for both practice and research in information sciences and information management. Literature presents a wealth of approaches to analytically structure and make sense of human information activity including a faceted classification model of information interactions published…

  7. Safety classification of nuclear power plant systems, structures and components

    International Nuclear Information System (INIS)

    1992-01-01

    The safety classification principles used for the systems, structures and components of a nuclear power plant are detailed in the guide. For classification, the nuclear power plant is divided into structural and operational units called systems. Every structure and component under control is included in some system. The safety classes are 1, 2 and 3, plus the class EYT (non-nuclear). Instructions on how to assign each system, structure and component to an appropriate safety class are given in the guide. The guide applies to new nuclear power plants and to the safety classification of systems, structures and components designed for the refitting of old nuclear power plants. The classification principles and the procedures applying to the classification document are also given.

  8. Minimisation de fonctions de perte calibrée pour la classification des images

    OpenAIRE

    Bel Haj Ali , Wafa

    2013-01-01

    Image classification has become a major challenge, since it concerns on the one hand the millions or billions of images available on the web, and on the other hand images used for critical real-time applications. This classification generally involves learning methods and classifiers that must deliver both precision and speed. These learning problems concern a large number of application areas: namely, web applications (profiling, targeting, social networks, search engines),...

  9. Deep learning for EEG-Based preference classification

    Science.gov (United States)

    Teo, Jason; Hou, Chew Lin; Mountstephens, James

    2017-10-01

    Electroencephalogram (EEG)-based emotion classification is rapidly becoming one of the most intensely studied areas of brain-computer interfacing (BCI). The ability to passively identify yet accurately correlate brainwaves with our immediate emotions opens up truly meaningful and previously unattainable human-computer interactions such as in forensic neuroscience, rehabilitative medicine, affective entertainment and neuro-marketing. One particularly useful yet rarely explored area of EEG-based emotion classification is preference recognition [1], which is simply the detection of like versus dislike. Within the limited investigations into preference classification, all reported studies were based on musically-induced stimuli except for a single study which used 2D images. The main objective of this study is to apply deep learning, which has been shown to produce state-of-the-art results in diverse hard problems such as computer vision, natural language processing and audio recognition, to 3D object preference classification over a larger group of test subjects. A cohort of 16 users was shown 60 bracelet-like objects as rotating visual stimuli on a computer display while their preferences and EEGs were recorded. After training a variety of machine learning approaches, which included deep neural networks, we then attempted to classify the users' preferences for the 3D visual stimuli based on their EEGs. Here, we show that deep learning outperforms a variety of other machine learning classifiers for this EEG-based preference classification task, particularly on a highly challenging dataset with large inter- and intra-subject variability.

  10. Design of Multistable Origami Structures

    Science.gov (United States)

    Gillman, Andrew; Fuchi, Kazuko; Bazzan, Giorgio; Reich, Gregory; Alyanak, Edward; Buskohl, Philip

    Origami is being transformed from an art to a mathematically robust method for device design in a variety of scientific applications. These structures often require multiple stable configurations, e.g. efficient well-controlled deployment. However, the discovery of origami structures with mechanical instabilities is challenging given the complex geometric nonlinearities and the large design space to investigate. To address this challenge, we have developed a topology optimization framework for discovering origami fold patterns that realize stable and metastable positions. The objective function targets both the desired stable positions and nonlinear loading profiles of specific vertices in the origami structure. Multistable compliant structures have been shown to offer advantages in their stability and efficiency, and certain origami fold patterns exhibit multistable behavior. Building on this previous work of single vertex multistability analysis, e.g. waterbomb origami pattern, we are expanding the solution set of multistable mechanisms to include multiple vertices and a broader set of reference configurations. Collectively, these results enable an initial classification of geometry-induced mechanical instabilities that can be programmed into active material systems. This work was supported by the Air Force Office of Scientific Research.

  11. HIV classification using coalescent theory

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Ming [Los Alamos National Laboratory; Letiner, Thomas K [Los Alamos National Laboratory; Korber, Bette T [Los Alamos National Laboratory

    2008-01-01

    Algorithms for subtype classification and breakpoint detection of HIV-1 sequences are based on a classification system of HIV-1. Hence, their quality highly depends on this system. Due to the history of its creation, the current HIV-1 nomenclature contains inconsistencies such as: the phylogenetic distance between subtypes B and D is remarkably small compared with other pairs of subtypes; in fact, it is more like the distance of a pair of sub-subtypes Robertson et al. (2000). Subtypes E and I no longer exist, since they were discovered to be composed of recombinants Robertson et al. (2000). It is currently discussed whether -- instead of CRF02 being a recombinant of subtypes A and G -- subtype G should be designated as a circulating recombinant form (CRF) and CRF02 as a subtype Abecasis et al. (2007). There are 8 complete and over 400 partial HIV genomes in the LANL database which belong neither to a subtype nor to a CRF (denoted by U). Moreover, the current classification system is somewhat arbitrary, like all complex classification systems that were created manually. It is therefore desirable to deduce the classification system of HIV systematically by an algorithm. Of course, this problem is not restricted to HIV, but applies to all fast mutating and recombining viruses. Our work addresses the simpler subproblem of scoring classifications of given input sequences of some virus species (a classification denotes a partition of the input sequences into several subtypes and CRFs). To this end, we reconstruct ancestral recombination graphs (ARGs) of the input sequences under restrictions determined by the given classification. These restrictions are imposed in order to ensure that the reconstructed ARGs do not contradict the classification under consideration. Then, we find the ARG with maximal probability by means of Markov Chain Monte Carlo methods. The probability of the most probable ARG is interpreted as a score for the classification. To our

  12. Gas Classification Using Deep Convolutional Neural Networks

    Science.gov (United States)

    Peng, Pai; Zhao, Xiaojin; Pan, Xiaofang; Ye, Wenbin

    2018-01-01

    In this work, we propose a novel Deep Convolutional Neural Network (DCNN) tailored for gas classification. Inspired by the great success of DCNN in the field of computer vision, we designed a DCNN with up to 38 layers. In general, the proposed gas neural network, named GasNet, consists of: six convolutional blocks, each consisting of six layers; a pooling layer; and a fully-connected layer. Together, these various layers make up a powerful deep model for gas classification. Experimental results show that the proposed DCNN method is an effective technique for classifying electronic nose data. We also demonstrate that the DCNN method can provide higher classification accuracy than comparable Support Vector Machine (SVM) methods and Multiple Layer Perceptron (MLP). PMID:29316723

  13. Gas Classification Using Deep Convolutional Neural Networks.

    Science.gov (United States)

    Peng, Pai; Zhao, Xiaojin; Pan, Xiaofang; Ye, Wenbin

    2018-01-08

    In this work, we propose a novel Deep Convolutional Neural Network (DCNN) tailored for gas classification. Inspired by the great success of DCNN in the field of computer vision, we designed a DCNN with up to 38 layers. In general, the proposed gas neural network, named GasNet, consists of: six convolutional blocks, each consisting of six layers; a pooling layer; and a fully-connected layer. Together, these various layers make up a powerful deep model for gas classification. Experimental results show that the proposed DCNN method is an effective technique for classifying electronic nose data. We also demonstrate that the DCNN method can provide higher classification accuracy than comparable Support Vector Machine (SVM) methods and Multiple Layer Perceptron (MLP).

  14. Building Information Modelling: Challenges and Barriers in Implement of BIM for Interior Design Industry in Malaysia

    Science.gov (United States)

    Hamid, A. B. Abd; Taib, M. Z. Mohd; Razak, A. H. N. Abdul; Embi, M. R.

    2018-04-01

    Building Information Modelling (BIM) is an innovative approach that has developed across the global architecture, engineering and construction (AEC) industry. The construction industry of Malaysia has undergone rapid development and dynamic adoption of advanced technologies and methods among industry players and stakeholders. Nevertheless, the limited technologies and devices adopted have not been as successful as they should have been. This study highlights the challenges and barriers to adopting BIM in the interior design industry in Malaysia, from the designer's perspective. Data were collected through questionnaires distributed to randomly selected interior design firms, to identify barriers, knowledge, readiness and awareness. The aim of this research is to examine the barriers to, and the variables affecting, BIM usage in the interior design industry in Malaysia. The outcome of this study is to identify the constraints on BIM adoption in the interior design industry compared with other players in the same industry.

  15. Integrating human and machine intelligence in galaxy morphology classification tasks

    Science.gov (United States)

    Beck, Melanie R.; Scarlata, Claudia; Fortson, Lucy F.; Lintott, Chris J.; Simmons, B. D.; Galloway, Melanie A.; Willett, Kyle W.; Dickinson, Hugh; Masters, Karen L.; Marshall, Philip J.; Wright, Darryl

    2018-06-01

    Quantifying galaxy morphology is a challenging yet scientifically rewarding task. As the scale of data continues to increase with upcoming surveys, traditional classification methods will struggle to handle the load. We present a solution through an integration of visual and automated classifications, preserving the best features of both human and machine. We demonstrate the effectiveness of such a system through a re-analysis of visual galaxy morphology classifications collected during the Galaxy Zoo 2 (GZ2) project. We reprocess the top-level question of the GZ2 decision tree with a Bayesian classification aggregation algorithm dubbed SWAP, originally developed for the Space Warps gravitational lens project. Through a simple binary classification scheme, we increase the classification rate nearly 5-fold classifying 226 124 galaxies in 92 d of GZ2 project time while reproducing labels derived from GZ2 classification data with 95.7 per cent accuracy. We next combine this with a Random Forest machine learning algorithm that learns on a suite of non-parametric morphology indicators widely used for automated morphologies. We develop a decision engine that delegates tasks between human and machine and demonstrate that the combined system provides at least a factor of 8 increase in the classification rate, classifying 210 803 galaxies in just 32 d of GZ2 project time with 93.1 per cent accuracy. As the Random Forest algorithm requires a minimal amount of computational cost, this result has important implications for galaxy morphology identification tasks in the era of Euclid and other large-scale surveys.
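    The core of a SWAP-style aggregation is a per-volunteer confusion matrix and a Bayesian update of each subject's posterior after every vote. A minimal sketch of that update for a binary question; the volunteer skill estimates and prior below are invented, and the real system also updates the skills themselves against gold-standard subjects and retires subjects that cross probability thresholds:

```python
# Minimal sketch of SWAP-style Bayesian aggregation for a binary task
# (e.g. "featured" vs "not featured" galaxy). Each user has an estimated
# probability of voting "yes" in each true class; every vote updates the
# subject's posterior via Bayes' rule. All numbers are illustrative.

def update(prior: float, said_yes: bool, p_yes_given_true: float,
           p_yes_given_false: float) -> float:
    """Posterior P(true) after one classification."""
    if said_yes:
        like_t, like_f = p_yes_given_true, p_yes_given_false
    else:
        like_t, like_f = 1 - p_yes_given_true, 1 - p_yes_given_false
    num = like_t * prior
    return num / (num + like_f * (1 - prior))

p = 0.5  # uninformative prior for a fresh subject
votes = [(True, 0.9, 0.2),   # skilled volunteer says yes
         (True, 0.8, 0.3),   # second yes from a decent volunteer
         (False, 0.7, 0.1)]  # one dissenting no
for said_yes, skill_t, skill_f in votes:
    p = update(p, said_yes, skill_t, skill_f)
print(f"posterior P(featured) = {p:.3f}")  # -> 0.800
```

Once the posterior crosses a retirement threshold in either direction, the subject needs no further votes, which is where the 5-fold increase in classification rate comes from.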

  16. NASA Green Flight Challenge: Conceptual Design Approaches and Technologies to Enable 200 Passenger Miles per Gallon

    Science.gov (United States)

    Wells, Douglas P.

    2011-01-01

    The Green Flight Challenge is one of the National Aeronautics and Space Administration's Centennial Challenges, designed to push technology and make passenger aircraft more efficient. Airliners currently average around 50 passenger-miles per gallon, and this competition will push teams to greater than 200 passenger-miles per gallon. The aircraft must also fly at least 100 miles per hour for 200 miles. The total prize money for this competition is $1.65 million. The Green Flight Challenge will be run by the Comparative Aircraft Flight Efficiency (CAFE) Foundation September 25 to October 1, 2011, at Charles M. Schulz Sonoma County Airport in California. Thirteen custom aircraft were developed with electric, bio-diesel, and other bio-fuel engines. The aircraft use various technologies to improve aerodynamic, propulsion, and structural efficiency. This paper will explore the feasibility of the rule set, competitor vehicles, design approaches, and technologies used.
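    The competition metric is simple arithmetic: passenger-miles per gallon is passengers carried times distance flown divided by fuel (or fuel-equivalent energy) burned. The aircraft and fuel figures below are invented for illustration, not competitor data:

```python
# Passenger-miles per gallon (PMPG), the Green Flight Challenge metric.
# For electric aircraft the "gallons" are a gasoline-equivalent energy.

def pmpg(passengers: int, miles: float, gallons: float) -> float:
    """Passenger-miles per gallon for one flight."""
    return passengers * miles / gallons

# Hypothetical 2-seat entrant flying the required 200 miles on
# 1.8 gallons equivalent:
score = pmpg(passengers=2, miles=200.0, gallons=1.8)
print(f"{score:.0f} passenger-miles per gallon")  # -> 222
assert score > 200  # clears the competition threshold
```

Note how carrying a second occupant doubles the score for the same fuel burn, which is why multi-seat designs dominate this metric.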

  17. Improvement of Bioactive Compound Classification through Integration of Orthogonal Cell-Based Biosensing Methods

    Directory of Open Access Journals (Sweden)

    Goran N. Jovanovic

    2007-01-01

    Lack of specificity for different classes of chemical and biological agents, and false positives and negatives, can limit the range of applications for cell-based biosensors. This study suggests that the integration of results from algal cells (Mesotaenium caldariorum) and fish chromatophores (Betta splendens) improves classification efficiency and detection reliability. Cells were challenged with paraquat, mercuric chloride, sodium arsenite and clonidine. The two detection systems were independently investigated for classification of the toxin set by performing discriminant analysis. The algal system correctly classified 72% of the bioactive compounds, whereas the fish chromatophore system correctly classified 68%. The combined classification efficiency was 95%. The algal sensor readout is based on fluorescence measurements of changes in the energy-producing pathways of photosynthetic cells, whereas the response from fish chromatophores was quantified using optical density. Change in optical density reflects interference with the functioning of cellular signal transduction networks. Thus, algal cells and fish chromatophores respond to the challenge agents through sufficiently different mechanisms of action to be considered orthogonal.

  18. Academic Performance as a Predictor of Student Growth in Achievement and Mental Motivation During an Engineering Design Challenge in Engineering and Technology Education

    OpenAIRE

    Mentzer, Nathan

    2008-01-01

    The purpose of this correlational research study was to determine if students' academic success was correlated with: (a) student change in achievement during an engineering design challenge; and (b) student change in mental motivation toward solving problems and critical thinking during an engineering design challenge. Multiple experimental studies have shown engineering design challenges increase student achievement and attitude toward learning, but conflicting evidence surrounded the im...

  19. EEG BASED COGNITIVE WORKLOAD CLASSIFICATION DURING NASA MATB-II MULTITASKING

    Directory of Open Access Journals (Sweden)

    Sushil Chandra

    2015-06-01

    The objective of this experiment was to determine the best input EEG feature for classification of workload while designing load-balancing logic for an automated operator. The input features compared in this study consisted of spectral features of the electroencephalogram, objective scoring and subjective scoring. The method was used to identify the best EEG feature as an input to neural network classifiers for workload classification, to identify the channels which could provide classification with the highest accuracy, and to identify the EEG feature which could discriminate among workload levels without adding any classifiers. The results showed that the Engagement Index is the best feature for neural network classification.
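    The abstract does not spell out which Engagement Index definition was used; in the workload literature it is commonly defined (following Pope et al.) as the ratio of beta band power to the sum of alpha and theta band powers. A minimal sketch under that assumption, with invented band-power values:

```python
# Sketch of the EEG Engagement Index as commonly defined in the workload
# literature: beta / (alpha + theta), computed from spectral band powers.
# This definition and the sample band powers are assumptions for
# illustration; the study's exact formulation is not given in the abstract.

def engagement_index(theta: float, alpha: float, beta: float) -> float:
    """beta / (alpha + theta), from band powers in arbitrary units."""
    return beta / (alpha + theta)

low_load = engagement_index(theta=6.0, alpha=8.0, beta=4.0)    # relaxed
high_load = engagement_index(theta=4.0, alpha=5.0, beta=7.5)   # multitasking
print(f"low={low_load:.2f} high={high_load:.2f}")  # index rises with workload
```

Because it is a ratio of band powers from the same epoch, the index is insensitive to overall amplitude scaling between subjects, which helps explain its usefulness as a classifier input.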

  20. A REVIEW OF OIL PALM BIOCOMPOSITES FOR FURNITURE DESIGN AND APPLICATIONS: POTENTIAL AND CHALLENGES

    OpenAIRE

    Siti Suhaily; Mohammad Jawaid; H. P. S. Abdul Khalil; A. Rahman Mohamed; F. Ibrahim

    2012-01-01

    This review considers the potential and challenges of using agro-based oil palm biomasses, including the trunk, frond, empty fruit bunch, and palm press fiber biocomposites, for furniture applications. Currently, design and quality rather than price are becoming the primary concern for consumers when buying new furniture. Within this context, this paper focuses on the design of innovative, sustainable furniture from agro-based biocomposites to meet the needs of future population growth and te...

  1. Towards the use of similarity distances to music genre classification: A comparative study.

    Science.gov (United States)

    Goienetxea, Izaro; Martínez-Otzeta, José María; Sierra, Basilio; Mendialdua, Iñigo

    2018-01-01

    Music genre classification is a challenging research area, in which open questions remain regarding the classification approach, the representation of music pieces, distances between and within genres, and so on. In this paper an investigation of the classification of generated music pieces is performed, based on the following idea: if closely related known pieces are grouped into different sets -or clusters- and a new song is then automatically generated which is somehow "inspired" by each set, the new song would be more likely to be classified as belonging to the set which inspired it, based on the same distance used to separate the clusters. Different representations of music pieces and distances among pieces are used; the obtained results are promising and indicate the appropriateness of the approach, even in an area as subjective as music genre classification.
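    The core mechanism, assigning a piece to the cluster with the nearest centroid under some distance, can be sketched in a few lines. The pitch-class-histogram representation, Euclidean distance, and all histogram values below are assumptions for illustration; the paper compares several richer representations and distances:

```python
import math

# Sketch: distance-based classification of a generated piece. Each piece
# is represented here as a (hypothetical) normalised 3-bin histogram; a
# piece is assigned to the cluster whose centroid is nearest.

def euclidean(a: list[float], b: list[float]) -> float:
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def centroid(pieces: list[list[float]]) -> list[float]:
    return [sum(col) / len(pieces) for col in zip(*pieces)]

def classify(piece: list[float], clusters: dict[str, list[list[float]]]) -> str:
    """Name of the cluster whose centroid is closest to the piece."""
    return min(clusters, key=lambda name: euclidean(piece, centroid(clusters[name])))

clusters = {
    "set_a": [[0.7, 0.2, 0.1], [0.6, 0.3, 0.1]],
    "set_b": [[0.1, 0.2, 0.7], [0.2, 0.2, 0.6]],
}
generated = [0.6, 0.25, 0.15]  # a piece "inspired" by set_a
print(classify(generated, clusters))  # -> set_a
```

The paper's hypothesis is exactly this round trip: a piece generated from a cluster should fall back into that cluster under the same distance that defined it.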

  2. Acoustic classification schemes in Europe – Applicability for new, existing and renovated housing

    DEFF Research Database (Denmark)

    Rasmussen, Birgit

    2016-01-01

    The first acoustic classification schemes for dwellings were published in the 1990s as national standards, with the main purpose of making it easy to specify stricter acoustic criteria for new-build housing than the minimum requirements found in building regulations. Since then, more...... countries have introduced acoustic classification schemes, the first countries have updated theirs several times, and some countries have also introduced acoustic classification for other building categories. However, the classification schemes have continued to focus on new buildings and in general have limited applicability...... for existing buildings from before the implementation of acoustic regulations, typically in the 1950s or later. The paper will summarize the main characteristics, differences and similarities of the current national quality classes for housing in ten countries in Europe. In addition, the status and challenges...

  3. Can surgical simulation be used to train detection and classification of neural networks?

    Science.gov (United States)

    Zisimopoulos, Odysseas; Flouty, Evangello; Stacey, Mark; Muscroft, Sam; Giataganas, Petros; Nehme, Jean; Chow, Andre; Stoyanov, Danail

    2017-10-01

    Computer-assisted interventions (CAI) aim to increase the effectiveness, precision and repeatability of procedures to improve surgical outcomes. The presence and motion of surgical tools are a key information input for CAI surgical phase recognition algorithms. Vision-based tool detection and recognition approaches are an attractive solution and can be designed to take advantage of the powerful deep learning paradigm that is rapidly advancing image recognition and classification. The challenge for such algorithms is the availability and quality of labelled data used for training. In this Letter, surgical simulation is used to train tool detection and segmentation based on deep convolutional neural networks and generative adversarial networks. The authors experiment with two network architectures for image segmentation in tool classes commonly encountered during cataract surgery. A commercially available simulator is used to create a simulated cataract dataset for training models prior to performing transfer learning on real surgical data. To the best of the authors' knowledge, this is the first attempt to train deep learning models for surgical instrument detection on simulated data, and it demonstrates promising results that generalise to real data. Results indicate that simulated data does have some potential for training advanced classification methods for CAI systems.

  4. Wind Turbine Condition Monitoring: State-of-the-Art Review, New Trends, and Future Challenges

    Directory of Open Access Journals (Sweden)

    Pierre Tchakoua

    2014-04-01

    Full Text Available As the demand for wind energy continues to grow at exponential rates, reducing operation and maintenance (O&M) costs and improving reliability have become top priorities in wind turbine (WT) maintenance strategies. In addition to the development of more highly evolved WT designs intended to improve availability, the application of reliable and cost-effective condition-monitoring (CM) techniques offers an efficient approach to achieve this goal. This paper provides a general review and classification of wind turbine condition monitoring (WTCM) methods and techniques with a focus on trends and future challenges. After highlighting the relevant CM, diagnosis, and maintenance analysis, this work outlines the relationship between these concepts and related theories, and examines new trends and future challenges in the WTCM industry. Interesting insights from this research are used to point out strengths and weaknesses in today’s WTCM industry and define research priorities needed for the industry to meet the challenges in wind industry technological evolution and market growth.

  5. Is overall similarity classification less effortful than single-dimension classification?

    Science.gov (United States)

    Wills, Andy J; Milton, Fraser; Longmore, Christopher A; Hester, Sarah; Robinson, Jo

    2013-01-01

    It is sometimes argued that the implementation of an overall similarity classification is less effortful than the implementation of a single-dimension classification. In the current article, we argue that the evidence securely in support of this view is limited, and report additional evidence in support of the opposite proposition--overall similarity classification is more effortful than single-dimension classification. Using a match-to-standards procedure, Experiments 1A, 1B and 2 demonstrate that concurrent load reduces the prevalence of overall similarity classification, and that this effect is robust to changes in the concurrent load task employed, the level of time pressure experienced, and the short-term memory requirements of the classification task. Experiment 3 demonstrates that participants who produced overall similarity classifications from the outset have larger working memory capacities than those who produced single-dimension classifications initially, and Experiment 4 demonstrates that instructions to respond meticulously increase the prevalence of overall similarity classification.

  6. Predictive Manufacturing: Classification of categorical data

    DEFF Research Database (Denmark)

    Khan, Abdul Rauf; Schiøler, Henrik; Kulahci, Murat

    2018-01-01

    and classification capabilities of our methodology (on different experimental settings) is done through a specially designed simulation experiment. Secondly, in order to demonstrate the applicability to a real-life problem, a data set from electronics component manufacturing is analysed through our proposed...

  7. Consensus classification of posterior cortical atrophy.

    Science.gov (United States)

    Crutch, Sebastian J; Schott, Jonathan M; Rabinovici, Gil D; Murray, Melissa; Snowden, Julie S; van der Flier, Wiesje M; Dickerson, Bradford C; Vandenberghe, Rik; Ahmed, Samrah; Bak, Thomas H; Boeve, Bradley F; Butler, Christopher; Cappa, Stefano F; Ceccaldi, Mathieu; de Souza, Leonardo Cruz; Dubois, Bruno; Felician, Olivier; Galasko, Douglas; Graff-Radford, Jonathan; Graff-Radford, Neill R; Hof, Patrick R; Krolak-Salmon, Pierre; Lehmann, Manja; Magnin, Eloi; Mendez, Mario F; Nestor, Peter J; Onyike, Chiadi U; Pelak, Victoria S; Pijnenburg, Yolande; Primativo, Silvia; Rossor, Martin N; Ryan, Natalie S; Scheltens, Philip; Shakespeare, Timothy J; Suárez González, Aida; Tang-Wai, David F; Yong, Keir X X; Carrillo, Maria; Fox, Nick C

    2017-08-01

    A classification framework for posterior cortical atrophy (PCA) is proposed to improve the uniformity of definition of the syndrome in a variety of research settings. Consensus statements about PCA were developed through a detailed literature review, the formation of an international multidisciplinary working party which convened on four occasions, and a Web-based quantitative survey regarding symptom frequency and the conceptualization of PCA. A three-level classification framework for PCA is described comprising both syndrome- and disease-level descriptions. Classification level 1 (PCA) defines the core clinical, cognitive, and neuroimaging features and exclusion criteria of the clinico-radiological syndrome. Classification level 2 (PCA-pure, PCA-plus) establishes whether, in addition to the core PCA syndrome, the core features of any other neurodegenerative syndromes are present. Classification level 3 (PCA attributable to AD [PCA-AD], Lewy body disease [PCA-LBD], corticobasal degeneration [PCA-CBD], prion disease [PCA-prion]) provides a more formal determination of the underlying cause of the PCA syndrome, based on available pathophysiological biomarker evidence. The issue of additional syndrome-level descriptors is discussed in relation to the challenges of defining stages of syndrome severity and characterizing phenotypic heterogeneity within the PCA spectrum. There was strong agreement regarding the definition of the core clinico-radiological syndrome, meaning that the current consensus statement should be regarded as a refinement, development, and extension of previous single-center PCA criteria rather than any wholesale alteration or redescription of the syndrome. The framework and terminology may facilitate the interpretation of research data across studies, be applicable across a broad range of research scenarios (e.g., behavioral interventions, pharmacological trials), and provide a foundation for future collaborative work. 

  8. Fully Convolutional Networks for Ground Classification from LIDAR Point Clouds

    Science.gov (United States)

    Rizaldy, A.; Persello, C.; Gevaert, C. M.; Oude Elberink, S. J.

    2018-05-01

    Deep Learning has been massively used for image classification in recent years. The use of deep learning for ground classification from LIDAR point clouds has also been recently studied. However, point clouds need to be converted into an image in order to use Convolutional Neural Networks (CNNs). In state-of-the-art techniques, this conversion is slow because each point is converted into a separate image. This approach leads to highly redundant computation during conversion and classification. The goal of this study is to design a more efficient data conversion and ground classification. This goal is achieved by first converting the whole point cloud into a single image. The classification is then performed by a Fully Convolutional Network (FCN), a modified version of CNN designed for pixel-wise image classification. The proposed method is significantly faster than state-of-the-art techniques. On the ISPRS Filter Test dataset, it is 78 times faster for conversion and 16 times faster for classification. Our experimental analysis on the same dataset shows that the proposed method results in 5.22 % of total error, 4.10 % of type I error, and 15.07 % of type II error. Compared to the previous CNN-based technique and LAStools software, the proposed method reduces the total error and type I error (while type II error is slightly higher). The method was also tested on very high point density LIDAR point clouds, resulting in 4.02 % of total error, 2.15 % of type I error and 6.14 % of type II error.
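The whole-cloud-to-single-image conversion described above can be sketched in a few lines of Python; the 1 m cell size and the keep-lowest-elevation rule are illustrative assumptions, not the paper's exact parameters:

```python
# Sketch: rasterize a LIDAR point cloud into ONE image-like grid so that a
# pixel-wise classifier (e.g. an FCN) can label every cell in a single pass.
# Assumptions for illustration: 1 m cells, keep the lowest z per cell.

def rasterize(points, cell=1.0):
    """points: iterable of (x, y, z); returns {(col, row): min z}."""
    grid = {}
    for x, y, z in points:
        key = (int(x // cell), int(y // cell))
        if key not in grid or z < grid[key]:
            grid[key] = z
    return grid

points = [(0.2, 0.3, 10.0), (0.7, 0.1, 9.5), (1.4, 0.2, 12.0)]
img = rasterize(points)
# the two points falling into cell (0, 0) are reduced to the lower elevation
```

A pixel-wise classifier would then operate on the resulting elevation grid once, rather than on one image per point.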

  9. Formalization of the classification pattern: survey of classification modeling in information systems engineering.

    Science.gov (United States)

    Partridge, Chris; de Cesare, Sergio; Mitchell, Andrew; Odell, James

    2018-01-01

    Formalization is becoming more common in all stages of the development of information systems, as a better understanding of its benefits emerges. Classification systems are ubiquitous, no more so than in domain modeling. The classification pattern that underlies these systems provides a good case study of the move toward formalization in part because it illustrates some of the barriers to formalization, including the formal complexity of the pattern and the ontological issues surrounding the "one and the many." Powersets are a way of characterizing the (complex) formal structure of the classification pattern, and their formalization has been extensively studied in mathematics since Cantor's work in the late nineteenth century. One can use this formalization to develop a useful benchmark. There are various communities within information systems engineering (ISE) that are gradually working toward a formalization of the classification pattern. However, for most of these communities, this work is incomplete, in that they have not yet arrived at a solution with the expressiveness of the powerset benchmark. This contrasts with the early smooth adoption of powerset by other information systems communities to, for example, formalize relations. One way of understanding the varying rates of adoption is recognizing that the different communities have different historical baggage. Many conceptual modeling communities emerged from work done on database design, and this creates hurdles to the adoption of the high level of expressiveness of powersets. Another relevant factor is that these communities also often feel, particularly in the case of domain modeling, a responsibility to explain the semantics of whatever formal structures they adopt. 
This paper aims to make sense of the formalization of the classification pattern in ISE and surveys its history through the literature, starting from the relevant theoretical works of the mathematical literature and gradually shifting focus

  10. Manifold regularized multitask learning for semi-supervised multilabel image classification.

    Science.gov (United States)

    Luo, Yong; Tao, Dacheng; Geng, Bo; Xu, Chao; Maybank, Stephen J

    2013-02-01

    It is a significant challenge to classify images with multiple labels by using only a small number of labeled samples. One option is to learn a binary classifier for each label and use manifold regularization to improve the classification performance by exploring the underlying geometric structure of the data distribution. However, such an approach does not perform well in practice when images from multiple concepts are represented by high-dimensional visual features. Thus, manifold regularization is insufficient to control the model complexity. In this paper, we propose a manifold regularized multitask learning (MRMTL) algorithm. MRMTL learns a discriminative subspace shared by multiple classification tasks by exploiting the common structure of these tasks. It effectively controls the model complexity because different tasks limit one another's search volume, and the manifold regularization ensures that the functions in the shared hypothesis space are smooth along the data manifold. We conduct extensive experiments, on the PASCAL VOC'07 dataset with 20 classes and the MIR dataset with 38 classes, by comparing MRMTL with popular image classification algorithms. The results suggest that MRMTL is effective for image classification.
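The manifold regularization term referred to above penalizes predictions that vary sharply between samples that are close on the data manifold. A minimal sketch, with an illustrative similarity graph (not the paper's features or weights):

```python
# Sketch: graph-based manifold regularization. For a similarity graph with
# each undirected edge (i, j) listed once with weight w, the penalty
# sum_ij w * (f_i - f_j)^2 grows when predictions differ across strong
# edges, encouraging functions that are smooth along the data manifold.

def manifold_penalty(f, weights):
    """f: list of predictions; weights: {(i, j): similarity}, each edge once."""
    return sum(w * (f[i] - f[j]) ** 2 for (i, j), w in weights.items())

f = [1.0, 0.9, -1.0]            # predictions for three samples
w = {(0, 1): 1.0, (1, 2): 0.1}  # 0 and 1 close on the manifold; 1 and 2 not
penalty = manifold_penalty(f, w)
# the similar pair (0, 1) contributes little; the weak edge (1, 2) is damped
```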

  11. DNA methylation-based classification of central nervous system tumours.

    Science.gov (United States)

    Capper, David; Jones, David T W; Sill, Martin; Hovestadt, Volker; Schrimpf, Daniel; Sturm, Dominik; Koelsche, Christian; Sahm, Felix; Chavez, Lukas; Reuss, David E; Kratz, Annekathrin; Wefers, Annika K; Huang, Kristin; Pajtler, Kristian W; Schweizer, Leonille; Stichel, Damian; Olar, Adriana; Engel, Nils W; Lindenberg, Kerstin; Harter, Patrick N; Braczynski, Anne K; Plate, Karl H; Dohmen, Hildegard; Garvalov, Boyan K; Coras, Roland; Hölsken, Annett; Hewer, Ekkehard; Bewerunge-Hudler, Melanie; Schick, Matthias; Fischer, Roger; Beschorner, Rudi; Schittenhelm, Jens; Staszewski, Ori; Wani, Khalida; Varlet, Pascale; Pages, Melanie; Temming, Petra; Lohmann, Dietmar; Selt, Florian; Witt, Hendrik; Milde, Till; Witt, Olaf; Aronica, Eleonora; Giangaspero, Felice; Rushing, Elisabeth; Scheurlen, Wolfram; Geisenberger, Christoph; Rodriguez, Fausto J; Becker, Albert; Preusser, Matthias; Haberler, Christine; Bjerkvig, Rolf; Cryan, Jane; Farrell, Michael; Deckert, Martina; Hench, Jürgen; Frank, Stephan; Serrano, Jonathan; Kannan, Kasthuri; Tsirigos, Aristotelis; Brück, Wolfgang; Hofer, Silvia; Brehmer, Stefanie; Seiz-Rosenhagen, Marcel; Hänggi, Daniel; Hans, Volkmar; Rozsnoki, Stephanie; Hansford, Jordan R; Kohlhof, Patricia; Kristensen, Bjarne W; Lechner, Matt; Lopes, Beatriz; Mawrin, Christian; Ketter, Ralf; Kulozik, Andreas; Khatib, Ziad; Heppner, Frank; Koch, Arend; Jouvet, Anne; Keohane, Catherine; Mühleisen, Helmut; Mueller, Wolf; Pohl, Ute; Prinz, Marco; Benner, Axel; Zapatka, Marc; Gottardo, Nicholas G; Driever, Pablo Hernáiz; Kramm, Christof M; Müller, Hermann L; Rutkowski, Stefan; von Hoff, Katja; Frühwald, Michael C; Gnekow, Astrid; Fleischhack, Gudrun; Tippelt, Stephan; Calaminus, Gabriele; Monoranu, Camelia-Maria; Perry, Arie; Jones, Chris; Jacques, Thomas S; Radlwimmer, Bernhard; Gessi, Marco; Pietsch, Torsten; Schramm, Johannes; Schackert, Gabriele; Westphal, Manfred; Reifenberger, Guido; Wesseling, Pieter; Weller, Michael; Collins, Vincent Peter; Blümcke, 
Ingmar; Bendszus, Martin; Debus, Jürgen; Huang, Annie; Jabado, Nada; Northcott, Paul A; Paulus, Werner; Gajjar, Amar; Robinson, Giles W; Taylor, Michael D; Jaunmuktane, Zane; Ryzhova, Marina; Platten, Michael; Unterberg, Andreas; Wick, Wolfgang; Karajannis, Matthias A; Mittelbronn, Michel; Acker, Till; Hartmann, Christian; Aldape, Kenneth; Schüller, Ulrich; Buslei, Rolf; Lichter, Peter; Kool, Marcel; Herold-Mende, Christel; Ellison, David W; Hasselblatt, Martin; Snuderl, Matija; Brandner, Sebastian; Korshunov, Andrey; von Deimling, Andreas; Pfister, Stefan M

    2018-03-22

    Accurate pathological diagnosis is crucial for optimal management of patients with cancer. For the approximately 100 known tumour types of the central nervous system, standardization of the diagnostic process has been shown to be particularly challenging, with substantial inter-observer variability in the histopathological diagnosis of many tumour types. Here we present a comprehensive approach for the DNA methylation-based classification of central nervous system tumours across all entities and age groups, and demonstrate its application in a routine diagnostic setting. We show that the availability of this method may have a substantial impact on diagnostic precision compared to standard methods, resulting in a change of diagnosis in up to 12% of prospective cases. For broader accessibility, we have designed a free online classifier tool, the use of which does not require any additional onsite data processing. Our results provide a blueprint for the generation of machine-learning-based tumour classifiers across other cancer entities, with the potential to fundamentally transform tumour pathology.

  12. Feasibility Study on a Portable Field Pest Classification System Design Based on DSP and 3G Wireless Communication Technology

    Directory of Open Access Journals (Sweden)

    Fei Liu

    2012-03-01

    Full Text Available This paper presents a feasibility study on a real-time in field pest classification system design based on Blackfin DSP and 3G wireless communication technology. This prototype system is composed of a remote on-line classification platform (ROCP), which uses a digital signal processor (DSP) as a core CPU, and a host control platform (HCP). The ROCP is in charge of acquiring the pest image, extracting image features and detecting the class of pest using an Artificial Neural Network (ANN) classifier. It sends the image data, which is encoded using JPEG 2000 in DSP, to the HCP through the 3G network at the same time for further identification. The image transmission and communication are accomplished using 3G technology. Our system transmits the data via a commercial base station. The system can work properly based on the effective coverage of base stations, no matter the distance from the ROCP to the HCP. In the HCP, the image data is decoded and the pest image displayed in real-time for further identification. Authentication and performance tests of the prototype system were conducted. The authentication test showed that the image data were transmitted correctly. Based on the performance test results on six classes of pests, the average accuracy is 82%. Considering the different live pests’ pose and different field lighting conditions, the result is satisfactory. The proposed technique is well suited for implementation in field pest classification on-line for precision agriculture.

  13. Feasibility study on a portable field pest classification system design based on DSP and 3G wireless communication technology.

    Science.gov (United States)

    Han, Ruizhen; He, Yong; Liu, Fei

    2012-01-01

    This paper presents a feasibility study on a real-time in field pest classification system design based on Blackfin DSP and 3G wireless communication technology. This prototype system is composed of remote on-line classification platform (ROCP), which uses a digital signal processor (DSP) as a core CPU, and a host control platform (HCP). The ROCP is in charge of acquiring the pest image, extracting image features and detecting the class of pest using an Artificial Neural Network (ANN) classifier. It sends the image data, which is encoded using JPEG 2000 in DSP, to the HCP through the 3G network at the same time for further identification. The image transmission and communication are accomplished using 3G technology. Our system transmits the data via a commercial base station. The system can work properly based on the effective coverage of base stations, no matter the distance from the ROCP to the HCP. In the HCP, the image data is decoded and the pest image displayed in real-time for further identification. Authentication and performance tests of the prototype system were conducted. The authentication test showed that the image data were transmitted correctly. Based on the performance test results on six classes of pests, the average accuracy is 82%. Considering the different live pests' pose and different field lighting conditions, the result is satisfactory. The proposed technique is well suited for implementation in field pest classification on-line for precision agriculture.
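The ROCP's classification step, feature extraction followed by an ANN, can be sketched with a tiny one-hidden-layer network; the weights and feature values below are illustrative placeholders, not the trained network from the study:

```python
import math

# Sketch of the ROCP classification step: image features fed to a small
# feed-forward ANN. All weights and features are illustrative placeholders.

def ann_forward(features, w_hidden, w_out):
    """One sigmoid hidden layer; returns one score per pest class."""
    sigmoid = lambda v: 1.0 / (1.0 + math.exp(-v))
    hidden = [sigmoid(sum(w * x for w, x in zip(row, features)))
              for row in w_hidden]
    return [sum(w * h for w, h in zip(row, hidden)) for row in w_out]

features = [0.8, 0.2]                  # e.g. normalized shape/colour features
w_hidden = [[2.0, -1.0], [-1.0, 2.0]]  # two hidden units
w_out = [[1.0, 0.0], [0.0, 1.0]]       # two pest classes
scores = ann_forward(features, w_hidden, w_out)
pest_class = scores.index(max(scores))  # argmax gives the predicted class
```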

  14. Uav-Based Crops Classification with Joint Features from Orthoimage and Dsm Data

    Science.gov (United States)

    Liu, B.; Shi, Y.; Duan, Y.; Wu, W.

    2018-04-01

    Accurate crop classification remains a challenging task due to the phenomena of the same crop exhibiting different spectra and different crops sharing the same spectrum. Recently, the UAV-based remote sensing approach has gained popularity not only for its high spatial and temporal resolution, but also for its ability to obtain spectral and spatial data at the same time. This paper focuses on how to take full advantage of spatial and spectral features to improve crop classification accuracy, based on a UAV platform equipped with a general digital camera. Texture and spatial features extracted from the RGB orthoimage and the digital surface model of the monitoring area are analysed and integrated within an SVM classification framework. Extensive experimental results indicate that the overall classification accuracy improves drastically, from 72.9 % to 94.5 %, when the spatial features are combined, which verifies the feasibility and effectiveness of the proposed method.
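The benefit of fusing DSM height with spectral features can be illustrated with a toy example; a nearest-centroid rule stands in for the paper's SVM, and all feature values are invented for illustration:

```python
# Toy illustration of joint spectral + DSM features: two crops with nearly
# identical spectra are separated by the height feature. A nearest-centroid
# rule stands in for an SVM; all numbers are invented.

def fuse(rgb, height):
    return list(rgb) + [height]  # joint feature vector

def nearest_centroid(x, centroids):
    dist = lambda a, b: sum((p - q) ** 2 for p, q in zip(a, b))
    return min(centroids, key=lambda label: dist(x, centroids[label]))

centroids = {
    "wheat": fuse((0.4, 0.5, 0.2), 0.6),  # same spectrum as maize...
    "maize": fuse((0.4, 0.5, 0.2), 1.8),  # ...but a much taller canopy
}
sample = fuse((0.41, 0.49, 0.21), 1.7)
label = nearest_centroid(sample, centroids)  # the DSM height decides
```

With spectra alone the two centroids would be indistinguishable; the appended height coordinate is what separates them, mirroring the accuracy gain the abstract reports.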

  15. The history of transdisciplinary race classification: methods, politics and institutions, 1840s-1940s.

    Science.gov (United States)

    McMahon, Richard

    2018-03-01

    A recently blossoming historiographical literature recognizes that physical anthropologists allied with scholars of diverse aspects of society and history to racially classify European peoples over a period of about a hundred years. They created three successive race classification coalitions - ethnology, from around 1840; anthropology, from the 1850s; and interwar raciology - each of which successively disintegrated. The present genealogical study argues that representing these coalitions as 'transdisciplinary' can enrich our understanding of challenges to disciplinary specialization. This is especially the case for the less well-studied nineteenth century, when disciplines and challenges to disciplinary specialization were both gradually emerging. Like Marxism or structuralism, race classification was a holistic interpretive framework, which, at its most ambitious, aimed to structure the human sciences as a whole. It resisted the organization of academia and knowledge into disciplines with separate organizational institutions and research practices. However, the 'transdisciplinarity' of this nationalistic project also bridged emerging borderlines between science and politics. I ascribe race classification's simultaneous longevity and instability to its complex and intricately entwined processes of political and interdisciplinary coalition building. Race classification's politically useful conclusions helped secure public support for institutionalizing the coalition's component disciplines. Institutionalization in turn stimulated disciplines to professionalize. They emphasized disciplinary boundaries and insisted on apolitical science, thus ultimately undermining the 'transdisciplinary' project.

  16. Guidance on classification for reproductive toxicity under the globally harmonized system of classification and labelling of chemicals (GHS).

    Science.gov (United States)

    Moore, Nigel P; Boogaard, Peter J; Bremer, Susanne; Buesen, Roland; Edwards, James; Fraysse, Benoit; Hallmark, Nina; Hemming, Helena; Langrand-Lerche, Carole; McKee, Richard H; Meisters, Marie-Louise; Parsons, Paul; Politano, Valerie; Reader, Stuart; Ridgway, Peter; Hennes, Christa

    2013-11-01

    The Globally Harmonised System of Classification (GHS) is a framework within which the intrinsic hazards of substances may be determined and communicated. It is not a legislative instrument per se, but is enacted into national legislation with the appropriate legislative instruments. GHS covers many aspects of effects upon health and the environment, including adverse effects upon sexual function and fertility or on development. Classification for these effects is based upon observations in humans or from properly designed experiments in animals, although only the latter is covered herein. The decision to classify a substance based upon experimental data, and the category of classification ascribed, is determined by the level of evidence that is available for an adverse effect on sexual function and fertility or on development that does not arise as a secondary non-specific consequence of another toxic effect. This document offers guidance on the determination of level of concern as a measure of adversity, and the level of evidence to ascribe classification based on data from tests in laboratory animals.

  17. Image Classification Based on Convolutional Denoising Sparse Autoencoder

    Directory of Open Access Journals (Sweden)

    Shuangshuang Chen

    2017-01-01

    Full Text Available Image classification aims to group images into corresponding semantic categories. Due to the difficulties of interclass similarity and intraclass variability, it is a challenging issue in computer vision. In this paper, an unsupervised feature learning approach called convolutional denoising sparse autoencoder (CDSAE) is proposed based on the theory of the visual attention mechanism and deep learning methods. Firstly, a saliency detection method is utilized to get training samples for unsupervised feature learning. Next, these samples are sent to the denoising sparse autoencoder (DSAE), followed by a convolutional layer and a local contrast normalization layer. Generally, prior knowledge of a specific task is helpful for the task solution. Therefore, a new pooling strategy, spatial pyramid pooling (SPP) fused with a center-bias prior, is introduced into our approach. Experimental results on two common image datasets (STL-10 and CIFAR-10) demonstrate that our approach is effective in image classification. They also demonstrate that none of the three components (local contrast normalization, SPP fused with center-bias prior, and l2 vector normalization) can be excluded from our proposed approach; they jointly improve image representation and classification performance.

  18. Automated Feature Design for Time Series Classification by Genetic Programming

    OpenAIRE

    Harvey, Dustin Yewell

    2014-01-01

    Time series classification (TSC) methods discover and exploit patterns in time series and other one-dimensional signals. Although many accurate, robust classifiers exist for multivariate feature sets, general approaches are needed to extend machine learning techniques to make use of signal inputs. Numerous applications of TSC can be found in structural engineering, especially in the areas of structural health monitoring and non-destructive evaluation. Additionally, the fields of process contr...

  19. SAW Classification Algorithm for Chinese Text Classification

    OpenAIRE

    Xiaoli Guo; Huiyu Sun; Tiehua Zhou; Ling Wang; Zhaoyang Qu; Jiannan Zang

    2015-01-01

    With the explosive growth of data, the increasing amount of text data places ever higher requirements on the performance of text categorization, requirements that existing classification methods cannot satisfy. Based on a study of existing text classification technology and semantics, this paper puts forward a Chinese-text-classification-oriented SAW (Structural Auxiliary Word) algorithm. The algorithm uses the special space effect of Chinese text where words...

  20. A Classification of BPEL Extensions

    Directory of Open Access Journals (Sweden)

    Oliver Kopp

    2011-10-01

    Full Text Available The Business Process Execution Language (BPEL) has emerged as the de-facto standard for business process implementation. This language is designed to be extensible so that additional valuable features can be included in a standardized manner. A number of BPEL extensions are available. They are, however, neither classified nor evaluated with respect to their compliance with the BPEL standard. This article fills this gap by providing a framework for classifying BPEL extensions, a classification of existing extensions, and a guideline for designing BPEL extensions.

  1. Towards the use of similarity distances to music genre classification: A comparative study.

    Directory of Open Access Journals (Sweden)

    Izaro Goienetxea

    Full Text Available Music genre classification is a challenging research area, for which open questions remain regarding the classification approach, the representation of music pieces, distances between/within genres, and so on. In this paper an investigation on the classification of generated music pieces is performed, based on the following idea: if closely related known pieces are grouped into different sets -or clusters- and a new song is then generated automatically so that it is somehow "inspired" by one set, the new song should be more likely to be classified as belonging to the set which inspired it, under the same distance used to separate the clusters. Different representations of music pieces and distances among pieces are used; the obtained results are promising and indicate the appropriateness of the approach even in such a subjective area as music genre classification.

  2. Benefits and challenges of using the cohort multiple randomised controlled trial design for testing an intervention for depression.

    Science.gov (United States)

    Viksveen, Petter; Relton, Clare; Nicholl, Jon

    2017-07-06

    Trials which test the effectiveness of interventions compared with the status quo frequently encounter challenges. The cohort multiple randomised controlled trial (cmRCT) design is an innovative approach to the design and conduct of pragmatic trials which seeks to address some of these challenges. In this article, we report our experiences with the first completed randomised controlled trial (RCT) using the cmRCT design. This trial, the Depression in South Yorkshire (DEPSY) trial, involved comparison of treatment as usual (TAU) with TAU plus the offer of an intervention for people with self-reported long-term moderate to severe depression. In the trial, we used an existing large population-based cohort: the Yorkshire Health Study. We discuss our experiences with recruitment, attrition, crossover, data analysis, generalisability of results, and cost. The main challenges in using the cmRCT design were the high crossover to the control group and the lower questionnaire response rate among patients who refused the offer of treatment. However, the design did help facilitate efficient and complete recruitment of the trial population as well as analysable data that were generalisable to the population of interest. Attrition rates were also smaller than those reported in other depression trials. This first completed full trial using the cmRCT design testing an intervention for self-reported depression was associated with a number of important benefits. Further research is required to compare the acceptability and cost effectiveness of the standard pragmatic RCT design with the cmRCT design. ISRCTN registry: ISRCTN02484593. Registered on 7 Jan 2013.

  3. 10 CFR 1045.17 - Classification levels.

    Science.gov (United States)

    2010-01-01

    ...) Top Secret. The Director of Classification shall classify RD information Top Secret if it is vital to... exceptionally grave damage to the national security. Examples of RD information that warrant Top Secret... comprehensive to warrant designation as Top Secret. Examples of RD information that warrant Secret...

  4. VOCAL SEGMENT CLASSIFICATION IN POPULAR MUSIC

    DEFF Research Database (Denmark)

    Feng, Ling; Nielsen, Andreas Brinch; Hansen, Lars Kai

    2008-01-01

    This paper explores the vocal and non-vocal music classification problem within popular songs. A newly built labeled database covering 147 popular songs is announced. It is designed for classifying signals from 1sec time windows. Features are selected for this particular task, in order to capture...

  5. The space shuttle ascent vehicle aerodynamic challenges configuration design and data base development

    Science.gov (United States)

    Dill, C. C.; Young, J. C.; Roberts, B. B.; Craig, M. K.; Hamilton, J. T.; Boyle, W. W.

    1985-01-01

    The phase B Space Shuttle systems definition studies resulted in a generic configuration consisting of a delta wing orbiter, and two solid rocket boosters (SRB) attached to an external fuel tank (ET). The initial challenge facing the aerodynamic community was aerodynamically optimizing, within limits, this configuration. As the Shuttle program developed and the sensitivities of the vehicle to aerodynamics were better understood, the requirements of the aerodynamic data base grew. Adequately characterizing the vehicle to support the various design studies exploded the size of the data base to proportions that created a data modeling/management challenge for the aerodynamicist. The ascent aerodynamic data base originated primarily from wind tunnel test results. The complexity of the configuration rendered conventional analytic methods of little use. Initial wind tunnel tests provided results which included undesirable effects from model support structure, inadequate element proximity, and inadequate plume simulation. The challenge to improve the quality of test results by determining the extent of these undesirable effects and subsequently develop testing techniques to eliminate them was imposed on the aerodynamic community. The challenges to the ascent aerodynamics community documented here are unique due to the aerodynamic complexity of the Shuttle launch. Never before was such a complex vehicle aerodynamically characterized. The challenges were met with innovative engineering analyses/methodology development and wind tunnel testing techniques.

  6. Classification for Inconsistent Decision Tables

    KAUST Repository

    Azad, Mohammad

    2016-09-28

Decision trees have been used widely to discover patterns from consistent data sets. But if the data set is inconsistent, with groups of examples that have equal values of conditional attributes but different labels, then discovering the essential patterns or knowledge from the data set is challenging. Three approaches (generalized, most common, and many-valued decision) have been considered to handle such inconsistency. The decision tree model has been used to compare the classification results among the three approaches. The many-valued decision approach outperforms the others, and the M_ws_entM greedy algorithm gives faster and better prediction accuracy.
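The "most common decision" approach mentioned in this record can be sketched in a few lines: rows sharing identical conditional-attribute values are merged and relabeled with their majority label, after which any standard decision-tree learner can be applied to the now-consistent table. This is a minimal illustration of that one approach, not the authors' M_ws_entM algorithm.

```python
from collections import Counter, defaultdict

def most_common_decision(rows, labels):
    """Resolve an inconsistent decision table: examples with equal
    conditional-attribute values but different labels are merged and
    relabeled with their majority (most common) label."""
    groups = defaultdict(list)
    for row, label in zip(rows, labels):
        groups[tuple(row)].append(label)
    return {row: Counter(ls).most_common(1)[0][0]
            for row, ls in groups.items()}

# The row (1, 0) is inconsistent: it appears with labels 'a', 'a', 'b'.
table = most_common_decision([[1, 0], [1, 0], [1, 0], [0, 1]],
                             ['a', 'a', 'b', 'b'])
```

The resolved table maps each distinct attribute tuple to a single label, so a tree learner sees consistent training data.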

  7. Classification for Inconsistent Decision Tables

    KAUST Repository

    Azad, Mohammad; Moshkov, Mikhail

    2016-01-01

Decision trees have been used widely to discover patterns from consistent data sets. But if the data set is inconsistent, with groups of examples that have equal values of conditional attributes but different labels, then discovering the essential patterns or knowledge from the data set is challenging. Three approaches (generalized, most common, and many-valued decision) have been considered to handle such inconsistency. The decision tree model has been used to compare the classification results among the three approaches. The many-valued decision approach outperforms the others, and the M_ws_entM greedy algorithm gives faster and better prediction accuracy.

  8. Classification

    DEFF Research Database (Denmark)

    Hjørland, Birger

    2017-01-01

This article presents and discusses definitions of the term “classification” and the related concepts “Concept/conceptualization,” “categorization,” “ordering,” “taxonomy” and “typology.” It further presents and discusses theories of classification including the influences of Aristotle and Wittgenstein. It presents different views on forming classes, including logical division, numerical taxonomy, historical classification, hermeneutical and pragmatic/critical views. Finally, issues related to artificial versus natural classification and taxonomic monism versus taxonomic pluralism are briefly…

  9. FULLY CONVOLUTIONAL NETWORKS FOR GROUND CLASSIFICATION FROM LIDAR POINT CLOUDS

    Directory of Open Access Journals (Sweden)

    A. Rizaldy

    2018-05-01

Full Text Available Deep Learning has been widely used for image classification in recent years. The use of deep learning for ground classification from LIDAR point clouds has also been studied recently. However, point clouds need to be converted into images in order to use Convolutional Neural Networks (CNNs). In state-of-the-art techniques, this conversion is slow because each point is converted into a separate image. This approach leads to highly redundant computation during conversion and classification. The goal of this study is to design a more efficient data conversion and ground classification. This goal is achieved by first converting the whole point cloud into a single image. The classification is then performed by a Fully Convolutional Network (FCN), a modified version of a CNN designed for pixel-wise image classification. The proposed method is significantly faster than state-of-the-art techniques. On the ISPRS Filter Test dataset, it is 78 times faster for conversion and 16 times faster for classification. Our experimental analysis on the same dataset shows that the proposed method results in 5.22 % total error, 4.10 % type I error, and 15.07 % type II error. Compared to the previous CNN-based technique and LAStools software, the proposed method reduces the total error and type I error (while the type II error is slightly higher). The method was also tested on very high point density LIDAR point clouds, resulting in 4.02 % total error, 2.15 % type I error and 6.14 % type II error.
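The efficiency idea in this record, converting the whole point cloud into a single image rather than one image per point, can be sketched as a minimum-elevation rasterisation. This is a hypothetical stand-in for the authors' actual conversion, shown only to illustrate the one-image principle:

```python
import numpy as np

def rasterize_min_z(points, cell=1.0):
    """Convert an (N, 3) point cloud into a single 2D elevation image:
    each pixel stores the minimum z of the points falling in that cell
    (the lowest return is a common proxy for ground candidates)."""
    xy = points[:, :2]
    origin = xy.min(axis=0)
    idx = np.floor((xy - origin) / cell).astype(int)
    h, w = idx[:, 1].max() + 1, idx[:, 0].max() + 1
    img = np.full((h, w), np.nan)
    for (i, j), z in zip(idx, points[:, 2]):
        r, c = j, i                      # row = y index, column = x index
        if np.isnan(img[r, c]) or z < img[r, c]:
            img[r, c] = z
    return img

pts = np.array([[0.2, 0.3, 5.0], [0.8, 0.4, 4.0], [1.5, 0.2, 7.0]])
img = rasterize_min_z(pts, cell=1.0)
```

The resulting raster can be fed once to a pixel-wise classifier such as an FCN, instead of rendering a separate image per point.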

  10. EOG and EMG: two important switches in automatic sleep stage classification.

    Science.gov (United States)

    Estrada, E; Nazeran, H; Barragan, J; Burk, J R; Lucas, E A; Behbehani, K

    2006-01-01

Sleep is a natural periodic state of rest for the body, in which the eyes are usually closed and consciousness is completely or partially lost. In this investigation we used the EOG and EMG signals acquired from 10 patients undergoing overnight polysomnography, with their sleep stages determined by expert sleep specialists based on R&K rules. Differentiation between the Stage 1, Awake and REM stages challenged a well-trained neural network classifier when only EEG-derived signal features were used. To meet this challenge and improve the classification rate, extra features extracted from the EOG and EMG signals were fed to the classifier. In this study, two simple feature extraction algorithms were applied to the EOG and EMG signals. The statistics of the results were calculated and displayed in an easy-to-visualize fashion to observe tendencies for each sleep stage. Inclusion of these features shows great promise for improving the classification rate towards the target rate of 100%.
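The idea of widening the EEG feature vector with EOG/EMG-derived features can be sketched as below. The statistics chosen here (mean absolute amplitude, variance, zero crossings) are illustrative assumptions, not the paper's exact feature set:

```python
import numpy as np

def channel_features(signal):
    """Simple per-channel features: mean absolute amplitude,
    variance, and zero-crossing count."""
    zc = int(np.sum(np.diff(np.sign(signal)) != 0))
    return [float(np.mean(np.abs(signal))), float(np.var(signal)), zc]

def epoch_features(eeg, eog, emg):
    # Concatenate EEG-derived features with the extra EOG/EMG "switches"
    # so the classifier sees all three modalities for each epoch.
    return channel_features(eeg) + channel_features(eog) + channel_features(emg)

rng = np.random.default_rng(0)
feats = epoch_features(rng.standard_normal(100),
                       rng.standard_normal(100),
                       rng.standard_normal(100))
```

A sleep-stage classifier would then be trained on these widened per-epoch vectors instead of EEG-only features.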

  11. Toward the establishment of standardized in vitro tests for lipid-based formulations, part 4: proposing a new lipid formulation performance classification system.

    Science.gov (United States)

    Williams, Hywel D; Sassene, Philip; Kleberg, Karen; Calderone, Marilyn; Igonin, Annabel; Jule, Eduardo; Vertommen, Jan; Blundell, Ross; Benameur, Hassan; Müllertz, Anette; Porter, Christopher J H; Pouton, Colin W

    2014-08-01

    The Lipid Formulation Classification System Consortium looks to develop standardized in vitro tests and to generate much-needed performance criteria for lipid-based formulations (LBFs). This article highlights the value of performing a second, more stressful digestion test to identify LBFs near a performance threshold and to facilitate lead formulation selection in instances where several LBF prototypes perform adequately under standard digestion conditions (but where further discrimination is necessary). Stressed digestion tests can be designed based on an understanding of the factors that affect LBF performance, including the degree of supersaturation generated on dispersion/digestion. Stresses evaluated included decreasing LBF concentration (↓LBF), increasing bile salt, and decreasing pH. Their capacity to stress LBFs was dependent on LBF composition and drug type: ↓LBF was a stressor to medium-chain glyceride-rich LBFs, but not more hydrophilic surfactant-rich LBFs, whereas decreasing pH stressed tolfenamic acid LBFs, but not fenofibrate LBFs. Lastly, a new Performance Classification System, that is, LBF composition independent, is proposed to promote standardized LBF comparisons, encourage robust LBF development, and facilitate dialogue with the regulatory authorities. This classification system is based on the concept that performance evaluations across three in vitro tests, designed to subject a LBF to progressively more challenging conditions, will enable effective LBF discrimination and performance grading. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.

  12. An indigenous soil classification system for Bellona Island - a raised atoll in the Solomon Islands

    DEFF Research Database (Denmark)

    Elberling, Bo; Breuning-Madsen, Henrik; Bruun, Thilde Bech

    2010-01-01

One of the challenges of evaluating existing traditional farming systems is to combine local knowledge and modern scientific methods and terminology. This requires an evaluation of indigenous soil classification in modern terms. This paper focuses on an indigenous soil classification system … perceive the same four out of seven soil types as highly useful for cultivation and rank these soil types similarly according to their suitability for different crops such as yam, watermelon, cassava and sweet potato. It is concluded that the indigenous soil classification is in line with the soil production potential and useful for land evaluation on Bellona.

  13. Design challenges in nanoparticle-based platforms: Implications for targeted drug delivery systems

    Science.gov (United States)

    Mullen, Douglas Gurnett

Characterization and control of heterogeneous distributions of nanoparticle-ligand components are major design challenges for nanoparticle-based platforms. This dissertation begins with an examination of a poly(amidoamine) (PAMAM) dendrimer-based targeted delivery platform. A folic acid targeted modular platform was developed to target human epithelial cancer cells. Although active targeting was observed in vitro, active targeting was not found in vivo using a mouse tumor model. A major flaw of this platform design was that it did not provide for characterization or control of the component distribution. Motivated by the problems experienced with the modular design, the actual composition of nanoparticle-ligand distributions was examined using a model dendrimer-ligand system. High Pressure Liquid Chromatography (HPLC) resolved the distribution of components in samples with mean ligand/dendrimer ratios ranging from 0.4 to 13. A peak fitting analysis enabled the quantification of the component distribution. Quantified distributions were found to be significantly more heterogeneous than commonly expected, and standard analytical parameters, namely the mean ligand/nanoparticle ratio, failed to adequately represent the component heterogeneity. The distribution of components was also found to be sensitive to particle modifications that preceded the ligand conjugation. With the knowledge gained from this detailed distribution analysis, a new platform design was developed to provide a system with dramatically improved control over the number of components and with improved batch reproducibility. Using semi-preparative HPLC, individual dendrimer-ligand components were isolated. The isolated dendrimers with precise numbers of ligands were characterized by NMR and analytical HPLC. In total, nine different dendrimer-ligand components were obtained with degrees of purity ≥80%. This system has the potential to serve as a platform to which a precise number of functional molecules…

  14. A hierarchical inferential method for indoor scene classification

    Directory of Open Access Journals (Sweden)

    Jiang Jingzhe

    2017-12-01

    Full Text Available Indoor scene classification forms a basis for scene interaction for service robots. The task is challenging because the layout and decoration of a scene vary considerably. Previous studies on knowledge-based methods commonly ignore the importance of visual attributes when constructing the knowledge base. These shortcomings restrict the performance of classification. The structure of a semantic hierarchy was proposed to describe similarities of different parts of scenes in a fine-grained way. Besides the commonly used semantic features, visual attributes were also introduced to construct the knowledge base. Inspired by the processes of human cognition and the characteristics of indoor scenes, we proposed an inferential framework based on the Markov logic network. The framework is evaluated on a popular indoor scene dataset, and the experimental results demonstrate its effectiveness.

  15. Exploring Group Life Design with Teachers in the Context of Poverty Related Psychosocial Challenges

    Science.gov (United States)

    Setlhare, Rubina; Wood, Lesley; Meyer, Lukas

    2017-01-01

    Working in challenging contexts can impact negatively on a teacher's sense of purpose and efficacy. This article explores the potential of group Life Design (LD), a narrative constructivist career counselling process, for supporting ten South African school teachers working at an under-resourced school with understanding their career aspirations…

  16. DTI measurements for Alzheimer’s classification

    Science.gov (United States)

Maggipinto, Tommaso; Bellotti, Roberto; Amoroso, Nicola; Diacono, Domenico; Donvito, Giacinto; Lella, Eufemia; Monaco, Alfonso; Scelsi, Marzia Antonella; Tangaro, Sabina; Alzheimer's Disease Neuroimaging Initiative

    2017-03-01

Diffusion tensor imaging (DTI) is a promising imaging technique that provides insight into white matter microstructure integrity, and it has greatly helped in identifying white matter regions affected by Alzheimer’s disease (AD) in its early stages. DTI can therefore be a valuable source of information when designing machine-learning strategies to discriminate between healthy control (HC) subjects, AD patients and subjects with mild cognitive impairment (MCI). Nonetheless, several studies have so far reported conflicting results, especially because of the adoption of biased feature selection strategies. In this paper we first analyzed DTI scans of 150 subjects from the Alzheimer’s Disease Neuroimaging Initiative (ADNI) database. We measured a significant effect of the feature selection bias on the classification performance, and assessed the informative content provided by DTI measurements for AD classification. Classification performances and biological insight, concerning brain regions related to the disease, provided by cross-validation analysis were both confirmed on the independent test set.
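The feature-selection bias discussed in this record arises when features are chosen on the full dataset before cross-validation; keeping selection inside each training fold removes it. A minimal scikit-learn sketch on pure-noise data illustrates the two protocols (this is not the study's actual pipeline):

```python
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.standard_normal((60, 500))   # pure noise: no real signal
y = rng.integers(0, 2, 60)

# Biased protocol: select features on ALL data, then cross-validate.
mask = SelectKBest(f_classif, k=10).fit(X, y).get_support()
biased = cross_val_score(LogisticRegression(max_iter=1000),
                         X[:, mask], y, cv=5).mean()

# Unbiased protocol: selection happens inside each training fold.
pipe = Pipeline([("select", SelectKBest(f_classif, k=10)),
                 ("clf", LogisticRegression(max_iter=1000))])
unbiased = cross_val_score(pipe, X, y, cv=5).mean()
```

On noise data the biased protocol tends to report optimistic accuracy while the unbiased protocol stays near chance, which is the effect the study measured.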

  17. Inventory classification based on decoupling points

    Directory of Open Access Journals (Sweden)

    Joakim Wikner

    2015-01-01

Full Text Available The ideal state of continuous one-piece flow may never be achieved. Still, the logistics manager can improve the flow by carefully positioning inventory to buffer against variations. Strategies such as lean, postponement, mass customization, and outsourcing all rely on strategic positioning of decoupling points to separate forecast-driven from customer-order-driven flows. Planning and scheduling of the flow are also based on classification of decoupling points as master scheduled or not. A comprehensive classification scheme for these types of decoupling points is introduced. The approach rests on identification of flows as being either demand based or supply based. The demand or supply is then combined with exogenous factors, classified as independent, or endogenous factors, classified as dependent. As a result, eight types of strategic as well as tactical decoupling points are identified, resulting in a process-based framework for inventory classification that can be used for flow design.
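One reading of the scheme described here (demand- vs supply-based flow, independent/exogenous vs dependent/endogenous factors, strategic vs tactical level) yields the eight types by simple enumeration. The labels below are a hypothetical rendering of that 2 × 2 × 2 structure, not the paper's own terminology:

```python
from itertools import product

flows = ("demand-based", "supply-based")
factors = ("exogenous/independent", "endogenous/dependent")
levels = ("strategic", "tactical")

# Cartesian product of the three binary dimensions -> eight types.
decoupling_points = [
    {"flow": f, "factor": fa, "level": lv}
    for f, fa, lv in product(flows, factors, levels)
]
```

Such an enumeration makes the completeness of the classification scheme explicit: every combination of the three dimensions is covered exactly once.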

  18. Apples to committee consensus: the challenge of gender identity classification.

    Science.gov (United States)

    Rettew, David C

    2012-01-01

The debate surrounding the inclusion of gender dysphoria/gender variant behavior (GD/GV) as a psychiatric diagnosis exposes many of the fundamental shortcomings and inconsistencies of our current diagnostic classification system. Proposals raised by the authors of this special issue, including basing diagnosis on cause rather than overt behavior, reclassifying GD/GV behavior as a physical rather than mental condition, and basing diagnosis on impairment or distress, offer some solutions but have limitations of their own given the available database. In contrast to most accepted psychiatric conditions, where emphasis is placed on ultimately changing internal thoughts, feelings, and behaviors, consensus treatment for most GD/GV individuals, at least from adolescence onward, focuses on modifying the external body and external environment to maximize positive outcomes. This series of articles illustrating the diversity of opinions on when and if gender incongruence should be considered pathological reflects the relative lack of scientific indicators of disease in this area, similar to many other domains of mental functioning.

  19. Challenges and Opportunities for Establishing Design as a Research Discipline in Civil and Environmental Engineering

    DEFF Research Database (Denmark)

    Thompson, Mary Kathryn

    2013-01-01

There are a number of fields including architecture, industrial design, and urban planning and design, where design is the discipline upon which all research and teaching activities are based. In other fields such as aerospace and mechanical engineering, design is a sub-discipline with its own faculty, research and education communities, conferences, and journals. However, design remains an emerging sub-discipline in civil and environmental engineering – practiced, valued, and taught but not subject to rigorous academic research. This paper presents some of the challenges associated with the establishment of design as a research discipline within civil and environmental engineering, some of the benefits and opportunities that will come from that establishment, and some evidence for the fact that this process has already begun.

  20. 78 FR 54970 - Cotton Futures Classification: Optional Classification Procedure

    Science.gov (United States)

    2013-09-09

    ... Service 7 CFR Part 27 [AMS-CN-13-0043] RIN 0581-AD33 Cotton Futures Classification: Optional Classification Procedure AGENCY: Agricultural Marketing Service, USDA. ACTION: Proposed rule. SUMMARY: The... optional cotton futures classification procedure--identified and known as ``registration'' by the U.S...

  1. Authorization Basis Safety Classification of Transfer Bay Bridge Crane at the 105-K Basins

    International Nuclear Information System (INIS)

    CHAFFEE, G.A.

    2000-01-01

This supporting document provides the bases for the safety classification of the K Basin transfer bay bridge crane and for the Structures, Systems, and Components (SSC) safety classification. A table is presented that delineates the safety significant components. This safety classification is based on a review of the Authorization Basis (AB), which addressed AB and design baseline issues. The primary issues are: (1) What is the AB for the safety classification of the transfer bay bridge crane? (2) What does the SSC safety classification ''Safety Significant'' or ''Safety Significant for Design Only'' mean for design requirements and for quality requirements in procurement, installation, and maintenance (including replacement of parts) activities for the crane during its expected lifetime? The AB information on the crane was identified based on a review of Department of Energy--Richland Office (RL) and Spent Nuclear Fuel (SNF) Project correspondence, the K Basin Safety Analysis Report (SAR), and RL Safety Evaluation Reports (SERs) of SNF Project SAR submittals. The relevant correspondence, actions and activities taken, and substantive directions or conclusions of these documents are provided in Appendix A.

  2. Nuclear challenges and progress in designing stellarator power plants

    International Nuclear Information System (INIS)

    El-Guebaly, L.

    2007-01-01

As an alternative to the mainline magnetic fusion tokamaks, the stellarator concept offers steady state operation without externally driven current, eliminating the risk of plasma disruptions. Over the past 2-3 decades, stellarator power plants have been studied in the U.S., Japan, and Europe to enhance the physics and engineering aspects and optimize the design parameters, which are subject to numerous constraints. The earlier 1980s studies delivered large stellarators with an average major radius exceeding 20 m. The most recent development of the compact stellarator concept has led to the construction of the National Compact Stellarator Experiment (NCSX) in the U.S. and the 3-year power plant study of ARIES-CS, a compact stellarator with a 7.75 m average major radius, approaching that of tokamaks. The ARIES-CS first wall configuration deviates from the standard practice of a uniform toroidal shape in order to achieve compactness. Modeling such a complex geometry for 3-D nuclear analysis was a challenging engineering task. A novel approach based on coupling the CAD model with the MCNP Monte Carlo code was developed to model, for the first time ever, the complex stellarator geometry for nuclear assessments. The most important parameter that determines the stellarator size and cost is the minimum distance between the plasma boundary and mid-coil. Accommodating the breeding blanket and the shield necessary to protect the superconducting magnet represented another challenging task. An innovative approach utilizing a non-uniform blanket combined with a highly efficient WC shield for this highly constrained area reduced the radial standoff (and machine size and cost) by 25-30%, which is significant. As stellarators generate more radwaste than tokamaks, managing ARIES-CS active materials during operation and after plant decommissioning was essential for the environmental attractiveness of the machine. The geological disposal option could be replaced with more attractive scenarios…

  3. The Australian National Sub-Acute and Non-Acute Patient casemix classification.

    Science.gov (United States)

    Eagar, K

    1999-01-01

    The Australian National Sub-Acute and Non-Acute Patient (AN-SNAP) Version 1 casemix classification was completed in 1997. AN-SNAP is designed for the classification of sub-acute and non-acute care provided in both inpatient and ambulatory settings and is intended to be useful for both funding and clinical management purposes. The National Sub-Acute and Non-Acute Casemix Classification study has produced the first version of a national classification of sub-acute and non-acute care. Ongoing refinement (leading to Version 2) will be possible through further analysis of the existing data set in combination with analysis of the results of a carefully planned and phased implementation.

  4. Feature Selection for Motor Imagery EEG Classification Based on Firefly Algorithm and Learning Automata.

    Science.gov (United States)

    Liu, Aiming; Chen, Kun; Liu, Quan; Ai, Qingsong; Xie, Yi; Chen, Anqi

    2017-11-08

Motor Imagery (MI) electroencephalography (EEG) is widely studied for its non-invasiveness, easy availability, portability, and high temporal resolution. As for MI EEG signal processing, the high dimensionality of the features represents a research challenge. It is necessary to eliminate redundant features, which not only create an additional overhead of managing the space complexity, but also might include outliers, thereby reducing classification accuracy. The firefly algorithm (FA) can adaptively select the best subset of features and improve classification accuracy. However, the FA is easily entrapped in a local optimum. To solve this problem, this paper proposes a method of combining the firefly algorithm and learning automata (LA) to optimize feature selection for motor imagery EEG. We employed a method of combining common spatial pattern (CSP) and local characteristic-scale decomposition (LCD) algorithms to obtain a high dimensional feature set, and classified it by using the spectral regression discriminant analysis (SRDA) classifier. Both data from the fourth Brain-Computer Interface Competition and real-time data acquired in our designed experiments were used to verify the validity of the proposed method. Compared with genetic and adaptive weight particle swarm optimization algorithms, the experimental results show that our proposed method effectively eliminates redundant features and improves the classification accuracy of MI EEG signals. In addition, a real-time brain-computer interface system was implemented to verify the feasibility of our proposed methods being applied in practical brain-computer interface systems.
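A greatly simplified binary-firefly feature-selection loop, shown only to illustrate the population-based subset search this record describes; it omits the learning-automata component and uses a toy brightness function in place of cross-validated classification accuracy:

```python
import numpy as np

rng = np.random.default_rng(42)
n_features, informative = 20, {0, 3, 7}   # hypothetical ground truth

def brightness(mask):
    """Toy objective standing in for classification accuracy:
    reward informative features, penalise redundant ones."""
    chosen = set(np.flatnonzero(mask))
    return len(chosen & informative) - 0.1 * len(chosen - informative)

# Population of binary feature masks ("fireflies").
pop = rng.integers(0, 2, size=(10, n_features))
for _ in range(50):
    light = np.array([brightness(m) for m in pop])
    best = pop[light.argmax()].copy()
    for i in range(len(pop)):
        for j in range(len(pop)):
            if light[j] > light[i]:
                copy = rng.random(n_features) < 0.3   # attraction step
                pop[i][copy] = pop[j][copy]
        flip = rng.random(n_features) < 0.02          # random exploration
        pop[i][flip] ^= 1
    pop[light.argmin()] = best                        # elitism

best_mask = max(pop, key=brightness)
```

In a real pipeline the brightness would be the cross-validated accuracy of a classifier (e.g. SRDA) on the masked feature set, which is what makes the search expensive and local optima a concern.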

  5. Feature Selection for Motor Imagery EEG Classification Based on Firefly Algorithm and Learning Automata

    Directory of Open Access Journals (Sweden)

    Aiming Liu

    2017-11-01

Full Text Available Motor Imagery (MI) electroencephalography (EEG) is widely studied for its non-invasiveness, easy availability, portability, and high temporal resolution. As for MI EEG signal processing, the high dimensionality of the features represents a research challenge. It is necessary to eliminate redundant features, which not only create an additional overhead of managing the space complexity, but also might include outliers, thereby reducing classification accuracy. The firefly algorithm (FA) can adaptively select the best subset of features and improve classification accuracy. However, the FA is easily entrapped in a local optimum. To solve this problem, this paper proposes a method of combining the firefly algorithm and learning automata (LA) to optimize feature selection for motor imagery EEG. We employed a method of combining common spatial pattern (CSP) and local characteristic-scale decomposition (LCD) algorithms to obtain a high dimensional feature set, and classified it by using the spectral regression discriminant analysis (SRDA) classifier. Both data from the fourth brain–computer interface competition and real-time data acquired in our designed experiments were used to verify the validity of the proposed method. Compared with genetic and adaptive weight particle swarm optimization algorithms, the experimental results show that our proposed method effectively eliminates redundant features and improves the classification accuracy of MI EEG signals. In addition, a real-time brain–computer interface system was implemented to verify the feasibility of our proposed methods being applied in practical brain–computer interface systems.

  6. The complexity of intestinal permeability: Assigning the correct BCS classification through careful data interpretation.

    Science.gov (United States)

    Zur, Moran; Hanson, Allison S; Dahan, Arik

    2014-09-30

While the solubility parameter is fairly straightforward when assigning BCS classification, the intestinal permeability (Peff) is more complex than generally recognized. In this paper we emphasize this complexity through the analysis of codeine, a commonly used antitussive/analgesic drug. Codeine was previously classified as a low-permeability compound, based on its lower LogP compared to metoprolol, a marker for the low-high permeability class boundary. In contrast, a high fraction of dose absorbed (Fabs) was reported for codeine, which challenges the generally recognized Peff-Fabs correlation. The purpose of this study was to clarify this ambiguity through elucidation of codeine's BCS solubility/permeability class membership. Codeine's BCS solubility class was determined, and its intestinal permeability throughout the small intestine was investigated, both in vitro and in vivo in rats. Codeine was found to be unequivocally a high-solubility compound. All in vitro studies indicated that codeine's permeability is higher than metoprolol's. In vivo studies in rats showed similar permeability for both drugs throughout the entire small intestine. In conclusion, codeine was found to be a BCS Class I compound. No Peff-Fabs discrepancy is involved in its absorption; rather, it reflects the risk of assigning BCS classification based merely on limited physicochemical characteristics. A thorough investigation using multiple experimental methods is prudent before assigning a BCS classification, to avoid misjudgment in various settings, e.g., drug discovery, formulation design, drug development and regulation. Copyright © 2013 Elsevier B.V. All rights reserved.

  7. An automated cirrus classification

    Science.gov (United States)

    Gryspeerdt, Edward; Quaas, Johannes; Goren, Tom; Klocke, Daniel; Brueck, Matthias

    2018-05-01

    Cirrus clouds play an important role in determining the radiation budget of the earth, but many of their properties remain uncertain, particularly their response to aerosol variations and to warming. Part of the reason for this uncertainty is the dependence of cirrus cloud properties on the cloud formation mechanism, which itself is strongly dependent on the local meteorological conditions. In this work, a classification system (Identification and Classification of Cirrus or IC-CIR) is introduced to identify cirrus clouds by the cloud formation mechanism. Using reanalysis and satellite data, cirrus clouds are separated into four main types: orographic, frontal, convective and synoptic. Through a comparison to convection-permitting model simulations and back-trajectory-based analysis, it is shown that these observation-based regimes can provide extra information on the cloud-scale updraughts and the frequency of occurrence of liquid-origin ice, with the convective regime having higher updraughts and a greater occurrence of liquid-origin ice compared to the synoptic regimes. Despite having different cloud formation mechanisms, the radiative properties of the regimes are not distinct, indicating that retrieved cloud properties alone are insufficient to completely describe them. This classification is designed to be easily implemented in GCMs, helping improve future model-observation comparisons and leading to improved parametrisations of cirrus cloud processes.
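A regime assignment in the spirit of the four-type scheme described in this record can be sketched as a simple decision rule over meteorological proxies. The thresholds and variable names below are illustrative assumptions, not the published IC-CIR criteria:

```python
def classify_cirrus(w_orographic, frontal_gradient, convective_updraught):
    """Hypothetical rule-of-thumb assignment of a cirrus scene to one of
    the four IC-CIR-style regimes; thresholds are purely illustrative."""
    if w_orographic > 0.1:          # strong terrain-forced vertical motion
        return "orographic"
    if convective_updraught > 1.0:  # deep convective source below
        return "convective"
    if frontal_gradient > 5.0:      # strong low-level baroclinicity
        return "frontal"
    return "synoptic"               # default: large-scale in-situ formation

regime = classify_cirrus(0.0, 6.0, 0.0)
```

A rule-based formulation like this is also what makes such a classification cheap enough to embed in a GCM diagnostic, as the record suggests.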

  8. Device and Circuit Design Challenges in the Digital Subthreshold Region for Ultralow-Power Applications

    Directory of Open Access Journals (Sweden)

    Ramesh Vaddi

    2009-01-01

Full Text Available In recent years, subthreshold operation has gained a lot of attention due to its ultra-low power consumption in applications requiring low to medium performance. It has also been shown that by optimizing the device structure, the power consumption of digital subthreshold logic can be further minimized while improving its performance. Therefore, subthreshold circuit design is very promising for future ultra-low-energy sensor applications as well as high-performance parallel processing. This paper deals with various device and circuit design challenges associated with the state of the art in optimal digital subthreshold circuit design and reviews device design methodologies and circuit topologies for optimal digital subthreshold operation. This paper identifies suitable candidates for subthreshold operation at the device and circuit levels for optimal subthreshold circuit design and provides an effective roadmap for digital designers interested in working with ultra-low-power applications.
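For reference, digital subthreshold operation exploits the exponential weak-inversion drain current. The standard textbook model (a general result, not taken from this paper) is:

```latex
I_D \approx I_0 \, e^{\frac{V_{GS}-V_{TH}}{n V_T}}
      \left(1 - e^{-\frac{V_{DS}}{V_T}}\right),
\qquad V_T = \frac{kT}{q}
```

where $I_0$ is the drain current at $V_{GS} = V_{TH}$, $n$ is the subthreshold slope factor, and $V_T \approx 26\,\mathrm{mV}$ at room temperature. Operating logic with a supply below $V_{TH}$ trades exponential current (and hence speed) for dramatically reduced dynamic and leakage power, which is the trade-off the review discusses.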

  9. Sparse Bayesian classification and feature selection for biological expression data with high correlations.

    Directory of Open Access Journals (Sweden)

    Xian Yang

Full Text Available Classification models built on biological expression data are increasingly used to predict distinct disease subtypes. Selected features that separate sample groups can be candidate biomarkers, helping us to discover biological functions/pathways. However, three challenges are associated with building a robust classification and feature selection model: 1) the number of significant biomarkers is much smaller than that of measured features, for which the search will be exhaustive; 2) current biological expression data are big in both sample size and feature size, which will worsen the scalability of any search algorithm; and 3) expression profiles of certain features are typically highly correlated, which may prevent distinguishing the predominant features. Unfortunately, most existing algorithms address only part of these challenges rather than all of them. In this paper, we propose a unified framework to address the above challenges. The classification and feature selection problem is first formulated as a nonconvex optimisation problem. Then the problem is relaxed and solved iteratively by a sequence of convex optimisation procedures, which can be computed in a distributed fashion and therefore allow efficient implementation on advanced infrastructures. To illustrate the competence of our method over others, we first analyse a randomly generated simulation dataset under various conditions. We then analyse a real gene expression dataset on embryonal tumour. Further downstream analyses, such as functional annotation and pathway analysis, are performed on the selected features, which elucidate several biological findings.
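The general strategy of relaxing a sparse-classification problem to a convex one can be illustrated with L1-penalised logistic regression, the most common convex surrogate for sparse feature selection. This is a generic sketch of that idea, not the authors' iterative relaxation scheme, and the data are synthetic:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n, p = 200, 50
X = rng.standard_normal((n, p))
X[:, 1] = X[:, 0] + 0.01 * rng.standard_normal(n)   # highly correlated pair
y = (X[:, 0] + X[:, 5] > 0).astype(int)             # sparse true signal

# Convex relaxation of sparse classification: L1-penalised logistic loss.
clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.5).fit(X, y)
selected = np.flatnonzero(np.abs(clf.coef_[0]) > 1e-6)
```

The L1 penalty drives most coefficients exactly to zero, so the surviving indices act as the selected features; with highly correlated columns (as in gene expression data) the penalty tends to keep one member of a correlated pair, which is precisely the difficulty the record highlights.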

  10. A Support Vector Machine Hydrometeor Classification Algorithm for Dual-Polarization Radar

    Directory of Open Access Journals (Sweden)

    Nicoletta Roberto

    2017-07-01

    Full Text Available An algorithm based on a support vector machine (SVM) is proposed for hydrometeor classification. The training phase is driven by the output of a fuzzy-logic hydrometeor classification algorithm, i.e., the most popular approach among hydrometeor classification algorithms used for ground-based weather radar. The performance of the SVM is evaluated on a weather scenario generated by a weather model; the corresponding radar measurements are obtained by simulation, and the results of the SVM classification are compared with those obtained by a fuzzy-logic classifier. Results based on the weather model and simulations show a higher accuracy for the SVM classification. An objective comparison of the two classifiers applied to real radar data shows that the SVM classification maps are spatially more homogeneous (the textural indices energy and homogeneity increase by 21% and 12%, respectively) and contain no unclassified data. The improvements found with the SVM classifier, even though it is applied pixel by pixel, can be attributed to its ability to learn from the entire hyperspace of radar measurements and to the accurate training. The reliability of the results and the higher computing performance make the SVM attractive for challenging tasks such as implementation in Decision Support Systems that help pilots make optimal decisions about changes in the flight route caused by unexpected adverse weather.
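A minimal sketch of the training idea, with a simple hand-written rule base standing in for the fuzzy-logic classifier and synthetic "radar" features; all variable names, thresholds, and class labels are illustrative, not from the paper.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
Z = rng.uniform(0, 60, 500)        # toy reflectivity (dBZ)
Zdr = rng.uniform(-1, 4, 500)      # toy differential reflectivity (dB)

# Stand-in rule base labels the training pixels:
# 2 = drizzle (low Z), 1 = hail (high Z, low Zdr), 0 = rain (everything else)
labels = np.where(Z < 20, 2, np.where((Z > 45) & (Zdr < 1.0), 1, 0))

X = np.column_stack([Z, Zdr])
Xs = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize before the RBF kernel
svm = SVC(kernel="rbf", C=10.0).fit(Xs, labels)
agreement = (svm.predict(Xs) == labels).mean()
```

The SVM learns a smooth decision surface over the whole measurement space from the rule-based labels, which is the essence of the fuzzy-driven training phase described above.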

  11. CLASSIFICATION OF INFORMAL SETTLEMENTS THROUGH THE INTEGRATION OF 2D AND 3D FEATURES EXTRACTED FROM UAV DATA

    Directory of Open Access Journals (Sweden)

    C. M. Gevaert

    2016-06-01

    Full Text Available Unmanned Aerial Vehicles (UAVs) are capable of providing very high resolution and up-to-date information to support informal settlement upgrading projects. In order to provide accurate basemaps, urban scene understanding through the identification and classification of buildings and terrain is imperative. However, common characteristics of informal settlements such as small, irregular buildings with heterogeneous roof material and a large presence of clutter challenge state-of-the-art algorithms. In particular, dense buildings and steeply sloped terrain cause difficulties in identifying elevated objects. This work investigates how 2D radiometric and textural features, 2.5D topographic features, and 3D geometric features obtained from UAV imagery can be integrated to obtain high classification accuracy in challenging classification problems for the analysis of informal settlements. It compares the utility of pixel-based and segment-based features obtained from an orthomosaic and DSM with point-based and segment-based features extracted from the point cloud to classify an unplanned settlement in Kigali, Rwanda. Findings show that the integration of 2D and 3D features leads to higher classification accuracies.
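The benefit of integrating 2D and 3D features can be sketched with synthetic data in which radiometry alone cannot separate buildings from terrain but a height feature can; the feature names and values below are illustrative placeholders, not the paper's feature set.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(8)
n = 200
y = rng.integers(0, 2, n)                        # 0 = terrain, 1 = building
radiometry = rng.standard_normal((n, 3))         # toy 2D features: uninformative here
height = 2.5 * y + 0.3 * rng.standard_normal(n)  # toy 2.5D feature: height above ground

X2d = radiometry
X23d = np.column_stack([radiometry, height])     # integrated 2D + 3D feature set

rf2d = RandomForestClassifier(random_state=0).fit(X2d[:150], y[:150])
rf23d = RandomForestClassifier(random_state=0).fit(X23d[:150], y[:150])
acc2d = rf2d.score(X2d[150:], y[150:])
acc23d = rf23d.score(X23d[150:], y[150:])
```

With heterogeneous roof materials the 2D channels carry little class signal, so only the concatenated feature set lets the classifier separate elevated objects from terrain.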

  12. LDA boost classification: boosting by topics

    Science.gov (United States)

    Lei, La; Qiao, Guo; Qimin, Cao; Qitao, Li

    2012-12-01

    AdaBoost is an efficacious classification algorithm, especially in text categorization (TC) tasks. Its methodology of setting up a classifier committee and voting on the documents for classification can achieve high categorization precision. However, the traditional Vector Space Model can easily lead to the curse of dimensionality and feature-sparsity problems, seriously degrading classification performance. This article proposes a novel classification algorithm called LDABoost, based on the boosting ideology, which uses Latent Dirichlet Allocation (LDA) to model the feature space. Instead of using words or phrases, LDABoost uses latent topics as the features, significantly reducing the feature dimension. An improved Naïve Bayes (NB) is designed as the weak classifier, which keeps the efficiency advantage of the classic NB algorithm while achieving higher precision. Moreover, a two-stage iterative weighting method, called Cute Integration in this article, is proposed to improve accuracy by integrating the weak classifiers into a strong classifier in a more rational way. Mutual Information is used as the metric for weight allocation, and the voting information and categorization decisions made by the base classifiers are fully utilized when generating the strong classifier. Experimental results reveal that LDABoost, performing categorization in a low-dimensional space, has higher accuracy than traditional AdaBoost algorithms and many other classic classification algorithms. Moreover, its runtime is lower than that of different versions of AdaBoost and of TC algorithms based on support vector machines and neural networks.

  13. Genome-Wide Comparative Gene Family Classification

    Science.gov (United States)

    Frech, Christian; Chen, Nansheng

    2010-01-01

    Correct classification of genes into gene families is important for understanding gene function and evolution. Although gene families of many species have been resolved both computationally and experimentally with high accuracy, gene family classification in most newly sequenced genomes has not been done with the same high standard. This project has been designed to develop a strategy to effectively and accurately classify gene families across genomes. We first examine and compare the performance of computer programs developed for automated gene family classification. We demonstrate that some programs, including the hierarchical average-linkage clustering algorithm MC-UPGMA and the popular Markov clustering algorithm TRIBE-MCL, can reconstruct manual curation of gene families accurately. However, their performance is highly sensitive to parameter setting, i.e. different gene families require different program parameters for correct resolution. To circumvent the problem of parameterization, we have developed a comparative strategy for gene family classification. This strategy takes advantage of existing curated gene families of reference species to find suitable parameters for classifying genes in related genomes. To demonstrate the effectiveness of this novel strategy, we use TRIBE-MCL to classify chemosensory and ABC transporter gene families in C. elegans and its four sister species. We conclude that fully automated programs can establish biologically accurate gene families if parameterized accordingly. Comparative gene family classification finds optimal parameters automatically, thus allowing rapid insights into gene families of newly sequenced species. PMID:20976221

  14. The Speeding Car Design Challenge

    Science.gov (United States)

    Roman, Harry T.

    2009-01-01

    All too often, one reads about high-speed police chases in pursuit of stolen cars that result in death and injury to people and innocent bystanders. Isn't there another way to accomplish the apprehension of the thieves that does not put people at such great risk? This article presents a classroom challenge to use technology to remotely shutdown…

  15. Validating the Danish adaptation of the World Health Organization's International Classification for Patient Safety classification of patient safety incident types

    DEFF Research Database (Denmark)

    Mikkelsen, Kim Lyngby; Thommesen, Jacob; Andersen, Henning Boje

    2013-01-01

    Objectives Validation of a Danish patient safety incident classification adapted from the World Health Organizaton's International Classification for Patient Safety (ICPS-WHO). Design Thirty-three hospital safety management experts classified 58 safety incident cases selected to represent all types.......513 (range: 0.193–0.804). Kappa and ICC showed high correlation (r = 0.99). An inverse correlation was found between the prevalence of type and inter-rater reliability. Results are discussed according to four factors known to determine the inter-rater agreement: skill and motivation of raters; clarity...

  16. Fuzzy set classifier for waste classification tracking

    International Nuclear Information System (INIS)

    Gavel, D.T.

    1992-01-01

    We have developed an expert system based on fuzzy logic theory to fuse the data from multiple sensors and make classification decisions for objects in a waste reprocessing stream. Fuzzy set theory has been applied in decision and control applications with some success, particularly by the Japanese. We have found that the fuzzy logic system is rather easy to design and train, a feature that can cut development costs considerably. With proper training, the classification accuracy is quite high. We performed several tests sorting radioactive test samples using a gamma spectrometer to compare fuzzy logic to more conventional sorting schemes
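A toy sketch of fuzzy-membership sensor fusion with a min (AND) rule; the sensor names, class names, and membership functions below are invented for illustration and are not the system described above.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    return np.maximum(0.0, np.minimum((x - a) / (b - a), (c - x) / (c - b)))

# Hypothetical class definitions per sensor (normalized readings in [0, 1]).
classes = {
    "low_activity":  {"gamma": (0.0, 0.2, 0.5), "mass": (0.0, 0.3, 0.6)},
    "high_activity": {"gamma": (0.4, 0.8, 1.0), "mass": (0.5, 0.8, 1.0)},
}

def classify(gamma_reading, mass_reading):
    # Fuse the two sensors with the min (fuzzy AND) rule, then take the
    # class with the highest fused membership.
    scores = {name: min(tri(gamma_reading, *mf["gamma"]),
                        tri(mass_reading, *mf["mass"]))
              for name, mf in classes.items()}
    return max(scores, key=scores.get), scores

label, scores = classify(0.15, 0.25)
```

Training such a system amounts to tuning the membership-function breakpoints, which is why it is comparatively cheap to design.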

  17. A canonical correlation analysis based EMG classification algorithm for eliminating electrode shift effect.

    Science.gov (United States)

    Zhe Fan; Zhong Wang; Guanglin Li; Ruomei Wang

    2016-08-01

    Motion classification systems based on surface electromyography (sEMG) pattern recognition have achieved good results under experimental conditions, but clinical implementation and practical application remain a challenge. Many factors contribute to the difficulty of clinical use of EMG-based dexterous control; the most obvious and important is noise in the EMG signal caused by electrode shift, muscle fatigue, motion artifacts, the inherent instability of the signal, and interference from other biological signals such as the electrocardiogram. In this paper, a novel method based on Canonical Correlation Analysis (CCA) was developed to eliminate the reduction in classification accuracy caused by electrode shift. The average classification accuracy of our method was above 95% for the healthy subjects. In the process, we validated the influence of electrode shift on motion classification accuracy and found a strong correlation (correlation coefficient >0.9) between shifted-position data and normal-position data.

  18. STAR-GALAXY CLASSIFICATION IN MULTI-BAND OPTICAL IMAGING

    International Nuclear Information System (INIS)

    Fadely, Ross; Willman, Beth; Hogg, David W.

    2012-01-01

    Ground-based optical surveys such as PanSTARRS, DES, and LSST will produce large catalogs to limiting magnitudes of r ≳ 24. Star-galaxy separation poses a major challenge to such surveys because galaxies—even very compact galaxies—outnumber halo stars at these depths. We investigate photometric classification techniques for stars and compact galaxies, comparing maximum likelihood (ML), hierarchical Bayesian (HB), and support vector machine (SVM) methods under two scenarios: (1) one in which the training data are (unrealistically) a random sampling of the data in both signal-to-noise and demographics (SVM_best), and (2) a more realistic scenario in which training is done on higher signal-to-noise data at brighter apparent magnitudes (SVM_real). Testing with COSMOS ugriz data, we find that HB outperforms ML, delivering ∼80% completeness, with purity of ∼60%-90% for both stars and galaxies. We find that no algorithm delivers perfect performance and that studies of metal-poor main-sequence turnoff stars may be challenged by poor star-galaxy separation. Using the Receiver Operating Characteristic curve, we find a best-to-worst ranking of SVM_best, HB, ML, and SVM_real. We conclude, therefore, that a well-trained SVM will outperform template-fitting methods; however, a normally trained SVM performs worse. Thus, HB template fitting may prove to be the optimal classification method in future surveys.

  19. Classification of cultivated mussels from Galicia (Northwest Spain) with European Protected Designation of Origin using trace element fingerprint and chemometric analysis

    International Nuclear Information System (INIS)

    Costas-Rodriguez, M.; Lavilla, I.; Bendicho, C.

    2010-01-01

    Inductively coupled plasma-mass spectrometry (ICP-MS) in combination with different supervised chemometric approaches has been used to classify mussels cultivated in Galicia (Northwest Spain) under the European Protected Designation of Origin (PDO). A total of 158 mussel samples, collected in the five rías in proportion to their production, were analyzed for minor and trace elements, including high field strength elements (HFSEs) and rare earth elements (REEs). Samples were classified according to their origin: Galician vs. other regions (Tarragona, Spain, and Étang de Thau, France) and among the Galician rías. The ability of linear discriminant analysis (LDA), soft independent modelling of class analogy (SIMCA), and artificial neural networks (ANNs) to classify the samples was investigated. Correct assignments of Galician and non-Galician samples were obtained with LDA and SIMCA; ANNs were more effective when classification according to the ría of origin was required.

  20. Comparative study of deep learning methods for one-shot image classification (abstract)

    NARCIS (Netherlands)

    van den Bogaert, J.; Mohseni, H.; Khodier, M.; Stoyanov, Y.; Mocanu, D.C.; Menkovski, V.

    2017-01-01

    Training deep learning models for image classification requires large amounts of labeled data to overcome the challenges of overfitting and underfitting. In many practical applications, such labeled data are not available. In an attempt to solve this problem, the one-shot learning paradigm

  1. A New Well Classification Scheme For The Nigerian Oil Industry

    International Nuclear Information System (INIS)

    Ojoh, K.

    2002-01-01

    Oil was discovered in the Niger Delta Basin in 1956, with Oloibiri 1, after 21 wildcats had been drilled without success. In the 46 years since, 25 companies have discovered 52 billion barrels, of which 20 billion have been produced, leaving proven reserves of 32 billion barrels. Between now and 2010, the country would like to add 15 billion barrels of oil to these reserves. The target is 40 billion barrels. The national aspiration is to obtain an OPEC quota to produce 4 million barrels of oil per day. A large percentage of the reserves additions will certainly come from the deepwater segment of the basin, where fields of over 500 million barrels are expected. Exploration also continues on the shelf and on land, but the rate of discovery in these areas is - after 46 years of constant effort - constrained by the relative maturity of the basin. The challenges are that few, small, untested structures remain on the shelf and on land, whereas most undiscovered reserves are in stratigraphic accumulations within known producing areas. These are only visible on 3-D seismic data processed using state-of-the-art, high-technology attribute analyses. In the deepwater province, the stratigraphy poses problems of reservoir continuity: channels and lobe fans have complex spatial distributions that systematically require more than the classical two appraisal wells of conventional classification. The industry agrees that the current well classification scheme, which came into place in 1977, needs to be overhauled to take cognisance of these challenges. At a workshop last May, a Well Classification Committee comprising members from OPTS, DEWOG, NAIPEC, as well as the DPR, was mandated to produce a well classification scheme for the industry. This paper examines the current scheme and proposes a technically sound, widely accepted alternative, complete with exhaustive illustrations.

  2. Classification of Herbaceous Vegetation Using Airborne Hyperspectral Imagery

    Directory of Open Access Journals (Sweden)

    Péter Burai

    2015-02-01

    Full Text Available Alkali landscapes hold an extremely fine-scale mosaic of several vegetation types, so separating these classes by remote sensing is challenging. Our aim was to test the applicability of different image classification methods for hyperspectral data in this complex situation. To reach the highest classification accuracy, we tested a traditional image classifier (maximum likelihood classifier, MLC), machine learning algorithms (support vector machine, SVM; random forest, RF), and feature extraction (minimum noise fraction, MNF, transformation) on training datasets of different sizes. Digital images were acquired from an AISA EAGLE II hyperspectral sensor with 128 contiguous bands (400–1000 nm), a spectral sampling of 5 nm bandwidth, and a ground pixel size of 1 m. For the classification, we established twenty vegetation classes based on the dominant species, canopy height, and total vegetation cover. Image classification was applied to the original and MNF-transformed datasets with various training sample sizes between 10 and 30 pixels. To select the optimal number of transformed features, we applied SVM, RF, and MLC classification to 2–15 MNF-transformed bands. In the case of the original bands, the SVM and RF classifiers provided high accuracy irrespective of the number of training pixels. We found that SVM and RF produced the best accuracy when using the first nine MNF-transformed bands; involving further features did not increase classification accuracy. SVM and RF provided high accuracies with the transformed bands, especially for the aggregated groups. Even MLC provided high accuracy with 30 training pixels (80.78%), but a smaller training dataset (10 training pixels) significantly reduced the classification accuracy (52.56%). Our results suggest that in alkali landscapes the application of SVM is a feasible solution, as it provided the highest accuracies compared to RF and MLC.
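A compact sketch of the transform-then-classify pipeline. PCA stands in for the MNF transformation here (MNF additionally orders components by signal-to-noise rather than variance), and the spectra, class count, and pixel counts are synthetic.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC

# Synthetic "hyperspectral" pixels: 4 classes, 30 training pixels each,
# 128 correlated spectral bands per pixel.
rng = np.random.default_rng(4)
n_classes, pixels_per_class, bands = 4, 30, 128
means = rng.uniform(0, 1, (n_classes, bands))
X = np.vstack([m + 0.1 * rng.standard_normal((pixels_per_class, bands))
               for m in means])
y = np.repeat(np.arange(n_classes), pixels_per_class)

# Reduce the 128 bands to the first 9 components, as in the study,
# then train an RBF SVM on the transformed features.
X_t = PCA(n_components=9).fit_transform(X)
svm = SVC(kernel="rbf", gamma="scale").fit(X_t, y)
accuracy = svm.score(X_t, y)
```

Collapsing many correlated bands into a few components is what lets small training sets (tens of pixels per class) support a stable classifier.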

  3. Representation learning with deep extreme learning machines for efficient image set classification

    KAUST Repository

    Uzair, Muhammad

    2016-12-09

    Efficient and accurate representation of a collection of images that belong to the same class is a major research challenge for practical image set classification. Existing methods either make prior assumptions about the data structure or perform heavy computations to learn structure from the data itself. In this paper, we propose an efficient image set representation that does not make any prior assumptions about the structure of the underlying data. We learn the nonlinear structure of image sets with deep extreme learning machines that are very efficient and generalize well even on a limited number of training samples. Extensive experiments on a broad range of public datasets for image set classification show that the proposed algorithm consistently outperforms state-of-the-art image set classification methods both in terms of speed and accuracy.

  4. Representation learning with deep extreme learning machines for efficient image set classification

    KAUST Repository

    Uzair, Muhammad; Shafait, Faisal; Ghanem, Bernard; Mian, Ajmal

    2016-01-01

    Efficient and accurate representation of a collection of images that belong to the same class is a major research challenge for practical image set classification. Existing methods either make prior assumptions about the data structure or perform heavy computations to learn structure from the data itself. In this paper, we propose an efficient image set representation that does not make any prior assumptions about the structure of the underlying data. We learn the nonlinear structure of image sets with deep extreme learning machines that are very efficient and generalize well even on a limited number of training samples. Extensive experiments on a broad range of public datasets for image set classification show that the proposed algorithm consistently outperforms state-of-the-art image set classification methods both in terms of speed and accuracy.
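The core of an extreme learning machine can be sketched in a few lines: a fixed random hidden layer plus a closed-form ridge readout, with no backpropagation. The two-class data below are synthetic, and the paper's deep, image-set variant is considerably more elaborate.

```python
import numpy as np

rng = np.random.default_rng(5)
n, d, hidden, reg = 400, 10, 200, 1e-2

# Toy two-class data with shifted class means.
y = rng.integers(0, 2, n)
X = rng.standard_normal((n, d)) + 1.5 * y[:, None]

# Random hidden layer: the weights are drawn once and never trained.
W = rng.standard_normal((d, hidden))
b = rng.standard_normal(hidden)
H = np.tanh(X @ W + b)                     # hidden activations

# Readout solved in closed form by ridge regression (no gradient descent).
T = np.where(y == 1, 1.0, -1.0)
beta = np.linalg.solve(H.T @ H + reg * np.eye(hidden), H.T @ T)
pred = (H @ beta > 0).astype(int)
accuracy = (pred == y).mean()
```

The single linear solve is why ELMs train orders of magnitude faster than backpropagated networks, which underpins the speed claims above.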

  5. A NEW WASTE CLASSIFYING MODEL: HOW WASTE CLASSIFICATION CAN BECOME MORE OBJECTIVE?

    Directory of Open Access Journals (Sweden)

    Burcea Stefan Gabriel

    2015-07-01

    Full Text Available The waste management specialist must be able to identify and analyze waste generation sources, to propose proper solutions to prevent waste generation, and to encourage waste minimisation. In certain situations, such as implementing an integrated waste management system and configuring waste collection methods and capacities, practitioners can face the challenge of classifying the generated waste. This tends to be all the more demanding because the literature does not provide a coherent system of criteria for an objective waste classification process. Waste incineration will no doubt call for a different classification than waste composting or mechanical-biological treatment. The main question, then, is: what are the proper classification criteria for an objective waste classification? The article provides a short critical review of the existing waste classification criteria and suggests the conclusion that the literature cannot provide a unitary waste classification system that is unanimously accepted and adopted by theorists and practitioners. There are various classification criteria and interesting perspectives in the literature regarding waste classification, but the most common criteria by which specialists classify waste into classes, categories, and types are the generation source, physical and chemical features, aggregation state, origin or derivation, degree of hazard, etc. The traditional classification criteria divide waste into various categories, subcategories, and types; such an approach is conjectural, because it is inevitable that the criteria used will differ significantly according to the context in which the waste classification is required; hence the need to standardize waste classification systems. The first part of the article uses the indirect-observation research method, analyzing the literature and the various

  6. Organizational information assets classification model and security architecture methodology

    Directory of Open Access Journals (Sweden)

    Mostafa Tamtaji

    2015-12-01

    Full Text Available Today, organizations are exposed to a huge volume and diversity of information and information assets, produced in different systems such as KMS, financial and accounting systems, office and industrial automation systems, and so on, and the protection of this information is necessary. Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released. The several benefits of this model have created a strong trend among organizations toward implementing cloud computing. Maintaining and managing information security is the main challenge in developing and accepting this model. In this paper, following the "design science research methodology" and compatible with the "design process in information systems research", a complete categorization of organizational assets is first presented, comprising 355 different types of information assets in 7 groups and 3 levels, so that managers can plan corresponding security controls according to the importance of each group. Then, to direct the organization in architecting its information security in a cloud computing environment, an appropriate methodology is presented. The presented cloud computing security architecture, the methodology resulting from it, and the presented classification model were discussed and verified using the Delphi method and expert comments.

  7. Basic considerations on radioactive waste classification regarding the different waste management steps

    International Nuclear Information System (INIS)

    Berg, H.P.; Brennecke, P.

    1993-01-01

    Radioactive waste classification systems are designed to facilitate the exchange of technical information between waste management institutions and, more generally, between different countries. Because such waste classification systems may serve a wide range of often competing and conflicting objectives, one classification system cannot serve all purposes. Different approaches are described, considering the different waste management steps and taking into account the fact that radioactive waste must finally be disposed of in an appropriate repository. (orig.) [de

  8. Identification and classification of similar looking food grains

    Science.gov (United States)

    Anami, B. S.; Biradar, Sunanda D.; Savakar, D. G.; Kulkarni, P. V.

    2013-01-01

    This paper describes a comparative study of Artificial Neural Network (ANN) and Support Vector Machine (SVM) classifiers, taking as a case study the identification and classification of four pairs of similar-looking food grains, namely Finger Millet, Mustard, Soyabean, Pigeon Pea, Aniseed, Cumin-seeds, Split Greengram, and Split Blackgram. Algorithms are developed to acquire and process color images of these grain samples and to extract 18 color (hue-saturation-value, HSV) features and 42 wavelet-based texture features. A Back Propagation Neural Network (BPNN) classifier is designed using three feature sets, namely color-HSV, wavelet-texture, and their combination; an SVM model for the color-HSV feature set is designed for the same samples. For the ANN-based models, classification accuracies ranging from 93% to 96% for color-HSV, 78% to 94% for wavelet-texture, and 92% to 97% for the combined model are obtained; the color-HSV-based SVM model achieves 80% to 90%. The training time required for the SVM-based model is substantially less than that for the ANN on the same set of images.
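The color-HSV feature model can be sketched as follows: mean HSV statistics of small image patches feed an SVM. The patch generator, colors, and grain names below are synthetic placeholders, not the paper's data.

```python
import numpy as np
import colorsys
from sklearn.svm import SVC

rng = np.random.default_rng(7)

def patch(rgb_mean, n=64):
    """Synthetic grain patch: n pixels of noisy RGB around a mean colour."""
    return np.clip(rgb_mean + 0.05 * rng.standard_normal((n, 3)), 0, 1)

def hsv_features(pixels):
    """Mean H, S, V over a patch: 3 of the 18 colour features used here."""
    hsv = np.array([colorsys.rgb_to_hsv(*p) for p in pixels])
    return hsv.mean(axis=0)

# Two look-alike "grain" classes differing mainly in saturation and value.
mustard = np.array([0.80, 0.70, 0.20])
soyabean = np.array([0.85, 0.80, 0.45])
X = np.array([hsv_features(patch(mustard)) for _ in range(40)]
             + [hsv_features(patch(soyabean)) for _ in range(40)])
y = np.array([0] * 40 + [1] * 40)
accuracy = SVC(kernel="rbf").fit(X, y).score(X, y)
```

Averaging HSV over a patch suppresses per-pixel noise, which is why a handful of colour statistics can already separate visually similar grains.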

  9. Design and Evaluation of Reform Plan for Local Academic Nursing Challenges Using Action Research.

    Science.gov (United States)

    Asadizaker, Marziyeh; Abedsaeedi, Zhila; Abedi, Heidarali; Saki, Azadeh

    2016-12-01

    This study identifies challenges to the first nurse training program for undergraduate nursing students at a nursing and midwifery school in Iran using a collaborative approach in order to improve the program. Action research was used as a research strategy with qualitative content analysis and quantitative evaluation. The participants were 148 individuals from nursing academic and clinical settings, including administrators, faculty members, students, and staff nurses. We obtained approval from the research deputy and ethics committee of Shahid Beheshti University of Medical Sciences in Tehran, Iran for this study. Lack of coherence in the educational program and implementation of the program, inadequate communication between management inside and outside the organization, insufficient understanding of situations by students, and improper control of inhibitors and use of facilitators in teaching and in practice were among the major challenges in the first training process in the context of this study. After classification of problems, the educational decision-making authorities of the school developed an operational program with stakeholder cooperation to plan initial reforms, implementation of reforms, reflection about the actions, and evaluation. Comparison of student satisfaction with the collaborative learning process versus the traditional method showed that except for the atmosphere in the clinical learning environment (p>.05), the mean differences for all dimensions were statistically significant. The results confirm the overall success of the revised partnership program, but stressed the need for further modification of some details for its implementation in future rounds. Copyright © 2016. Published by Elsevier B.V.

  10. Fission--fusion systems: classification and critique

    International Nuclear Information System (INIS)

    Lidsky, L.M.

    1974-01-01

    A useful classification scheme for hybrid systems is described and some common features that the scheme makes apparent are pointed out. The early history of fusion-fission systems is reviewed. Some designs are described along with advantages and disadvantages of each. The extension to low and moderate Q devices is noted. (U.S.)

  11. Technology Transfer Challenges: A Case Study of User-Centered Design in NASA's Systems Engineering Culture

    Science.gov (United States)

    Quick, Jason

    2009-01-01

    The Upper Stage (US) section of the National Aeronautics and Space Administration's (NASA) Ares I rocket will require internal access platforms for maintenance tasks performed by humans inside the vehicle. Tasks will occur during expensive critical path operations at Kennedy Space Center (KSC) including vehicle stacking and launch preparation activities. Platforms must be translated through a small human access hatch, installed in an enclosed worksite environment, support the weight of ground operators and be removed before flight - and their design must minimize additional vehicle mass at attachment points. This paper describes the application of a user-centered conceptual design process and the unique challenges encountered within NASA's systems engineering culture focused on requirements and "heritage hardware". The NASA design team at Marshall Space Flight Center (MSFC) initiated the user-centered design process by studying heritage internal access kits and proposing new design concepts during brainstorming sessions. Simultaneously, they partnered with the Technology Transfer/Innovative Partnerships Program to research inflatable structures and dynamic scaffolding solutions that could enable ground operator access. While this creative, technology-oriented exploration was encouraged by upper management, some design stakeholders consistently opposed ideas utilizing novel, untested equipment. Subsequent collaboration with an engineering consulting firm improved the technical credibility of several options, however, there was continued resistance from team members focused on meeting system requirements with pre-certified hardware. After a six-month idea-generating phase, an intensive six-week effort produced viable design concepts that justified additional vehicle mass while optimizing the human factors of platform installation and use. 
Although these selected final concepts closely resemble heritage internal access platforms, challenges from the application of the

  12. Joint classification and contour extraction of large 3D point clouds

    Science.gov (United States)

    Hackel, Timo; Wegner, Jan D.; Schindler, Konrad

    2017-08-01

    We present an effective and efficient method for point-wise semantic classification and extraction of object contours of large-scale 3D point clouds. What makes point cloud interpretation challenging is the sheer size of several million points per scan and the non-grid, sparse, and uneven distribution of points. Standard image processing tools like texture filters, for example, cannot handle such data efficiently, which calls for dedicated point cloud labeling methods. It turns out that one of the major drivers for efficient computation and for handling strong variations in point density is a careful formulation of per-point neighborhoods at multiple scales. This allows us both to define an expressive feature set and to extract topologically meaningful object contours. Semantic classification and contour extraction are interlaced problems: point-wise semantic classification enables extracting a meaningful candidate set of contour points, while contours help generate a rich feature representation that benefits point-wise classification. These methods are tailored to have fast run times and small memory footprints for processing large-scale, unstructured, and inhomogeneous point clouds, while still achieving high classification accuracy. We evaluate our methods on the semantic3d.net benchmark for terrestrial laser scans with more than 10⁹ points.
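Per-point multi-scale neighborhood features can be sketched with a k-d tree and covariance eigenvalues. The toy scene, neighborhood sizes, and the single eigenvalue-ratio feature below are illustrative, not the authors' full feature set.

```python
import numpy as np
from scipy.spatial import cKDTree

# Toy scene: a flat ground patch plus a thin vertical wall strip.
rng = np.random.default_rng(6)
ground = np.column_stack([rng.uniform(0, 10, 800), rng.uniform(0, 10, 800),
                          0.02 * rng.standard_normal(800)])
wall = np.column_stack([rng.uniform(0, 10, 400),
                        np.full(400, 5.0) + 0.02 * rng.standard_normal(400),
                        rng.uniform(0, 3, 400)])
points = np.vstack([ground, wall])
tree = cKDTree(points)

def eigen_features(k):
    """Smallest/largest covariance-eigenvalue ratio of each k-neighborhood."""
    _, idx = tree.query(points, k=k)
    feats = np.empty(len(points))
    for i, nb in enumerate(idx):
        cov = np.cov(points[nb].T)
        ev = np.linalg.eigvalsh(cov)       # ascending eigenvalues
        feats[i] = ev[0] / ev[-1]          # near 0 for planar neighborhoods
    return feats

# Multi-scale feature set: one column per neighborhood size.
features = np.column_stack([eigen_features(k) for k in (10, 30)])
```

Computing the same geometric descriptor at several neighborhood sizes is what makes the features robust to the strong density variations mentioned above.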

  13. Classification of red wine based on its protected designation of origin (PDO) using Laser-induced Breakdown Spectroscopy (LIBS).

    Science.gov (United States)

    Moncayo, S; Rosales, J D; Izquierdo-Hornillos, R; Anzano, J; Caceres, J O

    2016-09-01

    This work reports a simple and fast classification procedure for the quality control of red wines with protected designation of origin (PDO) by means of the Laser-Induced Breakdown Spectroscopy (LIBS) technique combined with Neural Networks (NN), in order to address quality assurance and authenticity issues. A total of thirty-eight red wine samples from different PDOs were analyzed to detect fake wines and to avoid unfair competition in the market. LIBS is well known for not requiring sample preparation; however, to improve its analytical performance, a new sample preparation treatment was developed based on a prior liquid-to-solid transformation of the wine using a dry collagen gel. The use of collagen pellets made successful classification possible, avoiding the limitations and difficulties of working with aqueous samples. The performance of the NN model was assessed by three validation procedures taking into account its sensitivity (internal validation), generalization ability, and robustness (independent external validation). The results of coupling a spectroscopic technique with chemometric analysis (LIBS-NN) are discussed in terms of its potential use in the food industry, providing a methodology able to perform the quality control of alcoholic beverages.
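
    The LIBS-NN step can be illustrated with a small feed-forward network on synthetic spectra. The channel count, emission-line positions, and class setup below are invented stand-ins for real LIBS measurements of PDO wines.

```python
# Toy stand-in for LIBS-NN: an MLP separates synthetic "spectra" whose
# emission-line position encodes the (hypothetical) PDO class.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
N_CHANNELS = 200  # stand-in for the number of LIBS wavelength channels

def synth_spectrum(peak):
    """Gaussian emission line at `peak` plus noise, as a fake LIBS spectrum."""
    x = np.arange(N_CHANNELS)
    return np.exp(-0.5 * ((x - peak) / 4.0) ** 2) + 0.05 * rng.standard_normal(N_CHANNELS)

X = np.vstack([[synth_spectrum(p) for _ in range(40)] for p in (50, 100, 150)])
y = np.repeat([0, 1, 2], 40)  # three hypothetical PDO classes

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, random_state=0, stratify=y)
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=800, random_state=0)
acc = clf.fit(Xtr, ytr).score(Xte, yte)  # held-out accuracy (external validation idea)
```

    The held-out split mirrors, in miniature, the paper's separation of internal and independent external validation.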

  14. Towards a consensus on a hearing preservation classification system.

    Science.gov (United States)

    Skarzynski, Henryk; van de Heyning, P; Agrawal, S; Arauz, S L; Atlas, M; Baumgartner, W; Caversaccio, M; de Bodt, M; Gavilan, J; Godey, B; Green, K; Gstoettner, W; Hagen, R; Han, D M; Kameswaran, M; Karltorp, E; Kompis, M; Kuzovkov, V; Lassaletta, L; Levevre, F; Li, Y; Manikoth, M; Martin, J; Mlynski, R; Mueller, J; O'Driscoll, M; Parnes, L; Prentiss, S; Pulibalathingal, S; Raine, C H; Rajan, G; Rajeswaran, R; Rivas, J A; Rivas, A; Skarzynski, P H; Sprinzl, G; Staecker, H; Stephan, K; Usami, S; Yanov, Y; Zernotti, M E; Zimmermann, K; Lorens, A; Mertens, G

    2013-01-01

    The comprehensive Hearing Preservation classification system presented in this paper is suitable for all cochlear implant users with measurable pre-operative residual hearing. If adopted as a universal reporting standard, as it was designed to be, it should prove highly beneficial by enabling future studies to quickly and easily compare the results of previous studies and meta-analyze their data. The aim was to develop a comprehensive Hearing Preservation classification system suitable for all cochlear implant users with measurable pre-operative residual hearing. The HEARRING group discussed and reviewed a number of proposed HP classification systems and critical appraisals in order to develop a qualitative system that meets the prerequisites. The Hearing Preservation Classification System proposed herein fulfills the following necessary criteria: 1) classification is independent of users' initial hearing; 2) it is appropriate for all cochlear implant users with measurable pre-operative residual hearing; 3) it covers the whole pure tone average range from 0 to 120 dB; 4) it is easy to use and easy to understand.
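
    The system rests on relative hearing preservation computed from pre- and post-operative pure tone averages over the 0-120 dB range. A minimal sketch follows, assuming the commonly cited HP formula and band cut-offs; both are assumptions to be checked against the original HEARRING publication, not a clinical tool.

```python
# Assumed relative hearing preservation (HP) formula and band cut-offs;
# verify against the original paper before any real use.
def hearing_preservation(pta_pre, pta_post, pta_max=120.0):
    """Relative HP in percent: 100 = no post-operative shift, 0 = total loss."""
    s = (1.0 - (pta_post - pta_pre) / (pta_max - pta_pre)) * 100.0
    return max(0.0, min(100.0, s))

def hp_band(s):
    """Map an HP percentage to a qualitative band (assumed cut-offs)."""
    if s > 75.0:
        return "complete"
    if s >= 25.0:
        return "partial"
    if s > 0.0:
        return "minimal"
    return "loss of hearing"

# e.g. pre-op PTA 40 dB worsening to 50 dB post-op:
band = hp_band(hearing_preservation(40.0, 50.0))
```

    Normalizing by the distance to the 120 dB audiogram limit is what makes the measure independent of the user's initial hearing, criterion 1 above.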

  15. Turning challenges into design principles: Telemonitoring systems for patients with multiple chronic conditions.

    Science.gov (United States)

    Sultan, Mehwish; Kuluski, Kerry; McIsaac, Warren J; Cafazzo, Joseph A; Seto, Emily

    2018-01-01

    People with multiple chronic conditions often struggle with managing their health. The purpose of this research was to identify specific challenges of patients with multiple chronic conditions and to use the findings to form design principles for a telemonitoring system tailored for these patients. Semi-structured interviews with 15 patients with multiple chronic conditions and 10 clinicians were conducted to gain an understanding of their needs and preferences for a smartphone-based telemonitoring system. The interviews were analyzed using a conventional content analysis technique, resulting in six themes. Design principles developed from the themes included that the system must be modular to accommodate various combinations of conditions, reinforce a routine, consolidate record keeping, as well as provide actionable feedback to the patients. Designing an application for multiple chronic conditions is complex due to variability in patient conditions, and therefore, design principles developed in this study can help with future innovations aimed to help manage this population.

  16. Fingerprint classification using a simplified rule-set based on directional patterns and singularity features

    CSIR Research Space (South Africa)

    Dorasamy, K

    2015-07-01

    The use of directional patterns has recently received more attention in fingerprint classification. It provides a global representation of a fingerprint, by dividing it into homogeneous orientation partitions. With this technique, the challenge...

  17. A Framework and Classification for Fault Detection Approaches in Wireless Sensor Networks with an Energy Efficiency Perspective

    DEFF Research Database (Denmark)

    Zhang, Yue; Dragoni, Nicola; Wang, Jiangtao

    2015-01-01

    efficiency to facilitate the design of fault detection methods and the evaluation of their energy efficiency. Following the same design principle of the fault detection framework, the paper proposes a classification for fault detection approaches. The classification is applied to a number of fault detection...

  18. Deep learning application: rubbish classification with aid of an android device

    Science.gov (United States)

    Liu, Sijiang; Jiang, Bo; Zhan, Jie

    2017-06-01

    Deep learning is currently a very hot topic in pattern recognition and artificial intelligence research. Addressing the practical problem that people often do not know the correct category to which a piece of rubbish belongs, and building on the powerful image classification ability of deep learning methods, we have designed a prototype system that helps users classify rubbish. First, the CaffeNet model was adopted for classification network training on the ImageNet dataset, and the trained network was deployed on a web server. Second, an Android app was developed that lets users capture images of unclassified rubbish, upload them to the web server for background analysis, and retrieve the feedback, so that users can conveniently obtain classification guidance on an Android device. Tests on our prototype rubbish classification system show that an image of a single type of rubbish in its original shape can be used to judge its classification, while an image containing several kinds of rubbish, or rubbish with a changed shape, may fail to help users decide the classification. The system nevertheless shows promise as an auxiliary aid for rubbish classification if the network training strategy is optimized further.
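
    Only the server-side decision step is sketched here: the paper scores images with a trained CaffeNet, whereas below a hypothetical feature extractor is assumed to have already produced a vector, and the prototype vectors and class names are invented for illustration.

```python
# Hypothetical server-side step: pick the rubbish class whose prototype
# vector is most cosine-similar to the image's feature vector.
import numpy as np

def classify_rubbish(feature, prototypes):
    """Return the class name with the highest cosine similarity."""
    names = list(prototypes)
    P = np.stack([prototypes[n] for n in names])
    sims = P @ feature / (np.linalg.norm(P, axis=1) * np.linalg.norm(feature) + 1e-12)
    return names[int(np.argmax(sims))]

prototypes = {"recyclable": np.array([1.0, 0.0]), "organic": np.array([0.0, 1.0])}
label = classify_rubbish(np.array([0.9, 0.1]), prototypes)
```

    In the real system this decision runs behind the upload endpoint, and the app simply displays the returned label.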

  19. Binary patterns encoded convolutional neural networks for texture recognition and remote sensing scene classification

    Science.gov (United States)

    Anwer, Rao Muhammad; Khan, Fahad Shahbaz; van de Weijer, Joost; Molinier, Matthieu; Laaksonen, Jorma

    2018-04-01

    Designing discriminative powerful texture features robust to realistic imaging conditions is a challenging computer vision problem with many applications, including material recognition and analysis of satellite or aerial imagery. In the past, most texture description approaches were based on dense orderless statistical distribution of local features. However, most recent approaches to texture recognition and remote sensing scene classification are based on Convolutional Neural Networks (CNNs). The de facto practice when learning these CNN models is to use RGB patches as input with training performed on large amounts of labeled data (ImageNet). In this paper, we show that Local Binary Patterns (LBP) encoded CNN models, codenamed TEX-Nets, trained using mapped coded images with explicit LBP based texture information provide complementary information to the standard RGB deep models. Additionally, two deep architectures, namely early and late fusion, are investigated to combine the texture and color information. To the best of our knowledge, we are the first to investigate Binary Patterns encoded CNNs and different deep network fusion architectures for texture recognition and remote sensing scene classification. We perform comprehensive experiments on four texture recognition datasets and four remote sensing scene classification benchmarks: UC-Merced with 21 scene categories, WHU-RS19 with 19 scene classes, RSSCN7 with 7 categories and the recently introduced large scale aerial image dataset (AID) with 30 aerial scene types. We demonstrate that TEX-Nets provide complementary information to standard RGB deep model of the same network architecture. Our late fusion TEX-Net architecture always improves the overall performance compared to the standard RGB network on both recognition problems. Furthermore, our final combination leads to consistent improvement over the state-of-the-art for remote sensing scene classification.
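
    The texture coding that TEX-Nets consume can be sketched in a few lines. This is only the textbook 8-neighbour Local Binary Pattern, not the exact mapped coding described in the paper.

```python
# Minimal 8-neighbour LBP: one bit per neighbour that is >= the center pixel.
import numpy as np

def lbp8(img):
    """LBP code for each interior pixel of a 2D grayscale image."""
    c = img[1:-1, 1:-1]
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros_like(c, dtype=np.uint8)
    h, w = img.shape
    for bit, (dy, dx) in enumerate(shifts):
        nb = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]  # neighbour at offset (dy, dx)
        code |= (nb >= c).astype(np.uint8) << np.uint8(bit)
    return code

flat = np.full((5, 5), 7, dtype=np.uint8)
codes = lbp8(flat)  # every neighbour ties the center, so every bit is set
```

    Feeding such coded maps to a CNN alongside the RGB input is what the early/late fusion architectures in the abstract combine.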

  20. Classification and Segmentation of Satellite Orthoimagery Using Convolutional Neural Networks

    Directory of Open Access Journals (Sweden)

    Martin Längkvist

    2016-04-01

    The availability of high-resolution remote sensing (HRRS) data has opened up the possibility of new and interesting applications, such as per-pixel classification of individual objects in greater detail. This paper shows how a convolutional neural network (CNN) can be applied to multispectral orthoimagery and a digital surface model (DSM) of a small city for a full, fast and accurate per-pixel classification. The predicted low-level pixel classes are then used to improve the high-level segmentation. Various design choices of the CNN architecture are evaluated and analyzed. The investigated land area is fully manually labeled into five categories (vegetation, ground, roads, buildings, and water), and the classification accuracy is compared to other per-pixel classification works on other land areas that have a similar choice of categories. The results of the full classification and segmentation on selected segments of the map show that CNNs are a viable tool for solving both the segmentation and object recognition task for remote sensing data.

  1. Classification of hydrocephalus: critical analysis of classification categories and advantages of "Multi-categorical Hydrocephalus Classification" (Mc HC).

    Science.gov (United States)

    Oi, Shizuo

    2011-10-01

    Hydrocephalus is a complex pathophysiology with disturbed cerebrospinal fluid (CSF) circulation. Numerous classification schemes have been published, focusing on various criteria such as associated anomalies/underlying lesions, CSF circulation/intracranial pressure patterns, clinical features, and other categories. However, no definitive classification comprehensively covers all of these aspects. The new classification of hydrocephalus, "Multi-categorical Hydrocephalus Classification" (Mc HC), was invented and developed to cover the entire spectrum of hydrocephalus with all relevant classification items and categories. The ten "Mc HC" categories are I: onset (age, phase), II: cause, III: underlying lesion, IV: symptomatology, V: pathophysiology 1-CSF circulation, VI: pathophysiology 2-ICP dynamics, VII: chronology, VIII: post-shunt, IX: post-endoscopic third ventriculostomy, and X: others. From a 100-year search of publications related to the classification of hydrocephalus, 14 representative publications were reviewed and divided into the 10 categories. The Baumkuchen classification graph made from the round-the-clock classification demonstrated a historical tendency to favor the pathophysiology categories, either CSF circulation or ICP dynamics. In a preliminary clinical application, it was concluded that "Mc HC" is extremely effective in expressing an individual patient's state across the various categories, in past and present condition, among comparable cases of hydrocephalus, and with respect to possible chronological change in the future.

  2. Nuclear challenges and progress in designing stellarator fusion power plants

    International Nuclear Information System (INIS)

    El-Guebaly, L.A.; Wilson, P.; Henderson, D.; Sawan, M.; Sviatoslavsky, G.; Tautges, T.; Slaybaugh, R.; Kiedrowski, B.; Ibrahim, A.

    2008-01-01

    Over the past 5-6 decades, stellarator power plants have been studied in the US, Europe, and Japan as an alternative to the mainline magnetic fusion tokamaks, offering steady-state operation and eliminating the risk of plasma disruptions. The earlier studies of the 1980s suggested large-scale stellarator power plants with an average major radius exceeding 20 m. The most recent development of the compact stellarator concept delivered ARIES-CS, a compact stellarator with a 7.75 m average major radius, approaching that of tokamaks. For stellarators, the most important engineering parameter determining machine size and cost is the minimum distance between the plasma boundary and mid-coil. Accommodating the breeding blanket and the shield needed to protect the ARIES-CS superconducting magnet within this distance represents a challenging task. Selecting the ARIES-CS nuclear and engineering parameters to produce an economic optimum, modeling the complex geometry for 3D nuclear analysis to confirm the key parameters, and minimizing the radwaste stream received considerable attention during the design process. These engineering design elements, combined with advanced physics, helped make the compact stellarator a viable concept. This paper provides a brief historical overview of the progress in designing stellarator power plants and a perspective on the successful integration of the nuclear activity into the final ARIES-CS configuration.

  3. Classification

    Science.gov (United States)

    Clary, Renee; Wandersee, James

    2013-01-01

    In this article, Renee Clary and James Wandersee describe the beginnings of "Classification," which lies at the very heart of science and depends upon pattern recognition. Clary and Wandersee approach patterns by first telling the story of the "Linnaean classification system," introduced by Carl Linnaeus (1707-1778), who is…

  4. Using Patent Classification to Discover Chemical Information in a Free Patent Database: Challenges and Opportunities

    Science.gov (United States)

    Härtinger, Stefan; Clarke, Nigel

    2016-01-01

    Developing skills for searching the patent literature is an essential element of chemical information literacy programs at the university level. The present article creates awareness of patents as a rich source of chemical information. Patent classification is introduced as a key-component in comprehensive search strategies. The free Espacenet…

  5. MERRF Classification: Implications for Diagnosis and Clinical Trials.

    Science.gov (United States)

    Finsterer, Josef; Zarrouk-Mahjoub, Sinda; Shoffner, John M

    2018-03-01

    Given the etiologic heterogeneity of disease classification using clinical phenomenology, we employed contemporary criteria to classify variants associated with myoclonic epilepsy with ragged-red fibers (MERRF) syndrome and to assess the strength of evidence of gene-disease associations. Standardized approaches are used to clarify the definition of MERRF, which is essential for patient diagnosis, patient classification, and clinical trial design. Systematic literature and database search with application of standardized assessment of gene-disease relationships using modified Smith criteria and of variants reported to be associated with MERRF using modified Yarham criteria. Review of available evidence supports a gene-disease association for two MT-tRNAs and for POLG. Using modified Smith criteria, definitive evidence of a MERRF gene-disease association is identified for MT-TK. Strong gene-disease evidence is present for MT-TL1 and POLG. Functional assays that directly associate variants with oxidative phosphorylation impairment were critical to mtDNA variant classification. In silico analysis was of limited utility to the assessment of individual MT-tRNA variants. With the use of contemporary classification criteria, several mtDNA variants previously reported as pathogenic or possibly pathogenic are reclassified as neutral variants. MERRF is primarily an MT-TK disease, with pathogenic variants in this gene accounting for ~90% of MERRF patients. Although MERRF is phenotypically and genotypically heterogeneous, myoclonic epilepsy is the clinical feature that distinguishes MERRF from other categories of mitochondrial disorders. Given its low frequency in mitochondrial disorders, myoclonic epilepsy is not explained simply by an impairment of cellular energetics. Although MERRF phenocopies can occur in other genes, additional data are needed to establish a MERRF disease-gene association. 
This approach to MERRF emphasizes standardized classification rather than clinical

  6. General regression and representation model for classification.

    Directory of Open Access Journals (Sweden)

    Jianjun Qian

    Recently, regularized coding-based classification methods (e.g., SRC and CRC) have shown great potential for pattern classification. However, most existing coding methods assume that the representation residuals are uncorrelated. In real-world applications, this assumption does not hold. In this paper, we take account of the correlations of the representation residuals and develop a general regression and representation model (GRR) for classification. GRR not only has the advantages of CRC, but also makes full use of prior information (e.g., the correlations between representation residuals and representation coefficients) and specific information (a weight matrix of image pixels) to enhance classification performance. GRR uses generalized Tikhonov regularization and K Nearest Neighbors to learn the prior information from the training data. Meanwhile, the specific information is obtained by using an iterative algorithm to update the feature (or image pixel) weights of the test sample. With the proposed model as a platform, we design two classifiers: the basic general regression and representation classifier (B-GRR) and the robust general regression and representation classifier (R-GRR). The experimental results demonstrate the performance advantages of the proposed methods over state-of-the-art algorithms.
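
    The coding-based classification idea that GRR generalizes can be sketched as a CRC-style classifier with plain Tikhonov (ridge) regularization; GRR's learned priors and pixel weights are omitted, and the block-orthogonal toy dictionary is invented so the expected class is unambiguous.

```python
# CRC-style sketch: code the test sample over the whole training dictionary,
# then assign the class whose columns give the smallest reconstruction residual.
import numpy as np

def crc_classify(D, labels, y, lam=0.01):
    """D: columns are training samples; y: test sample; lam: ridge strength."""
    alpha = np.linalg.solve(D.T @ D + lam * np.eye(D.shape[1]), D.T @ y)
    classes = np.unique(labels)
    residuals = [np.linalg.norm(y - D[:, labels == c] @ alpha[labels == c])
                 for c in classes]
    return classes[int(np.argmin(residuals))]

rng = np.random.default_rng(1)
D = np.hstack([np.vstack([rng.normal(size=(10, 10)), np.zeros((10, 10))]),   # class 0
               np.vstack([np.zeros((10, 10)), rng.normal(size=(10, 10))])])  # class 1
labels = np.repeat([0, 1], 10)
y = D[:, 10:] @ rng.normal(size=10)  # a sample lying in the class-1 subspace
pred = crc_classify(D, labels, y)
```

    GRR replaces the identity matrix in the ridge term with learned prior and pixel-weight matrices, which is exactly where the generalized Tikhonov regularization enters.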

  7. REAL-TIME INTELLIGENT MULTILAYER ATTACK CLASSIFICATION SYSTEM

    Directory of Open Access Journals (Sweden)

    T. Subbhulakshmi

    2014-01-01

    Intrusion Detection Systems (IDS) take the lion's share of the current security infrastructure. Detection of intrusions is vital for initiating defensive procedures. Intrusion detection has traditionally been done with statistical and distance-based methods, which use a threshold value to indicate the level of normalcy; network traffic that crosses this threshold is flagged as anomalous. When new intrusion events occur, which are increasingly a key part of system security, statistical techniques cannot detect them. To overcome this issue, learning techniques are used, which help in identifying new intrusion activities in a computer system. The objective of the system designed in this paper is to classify intrusions using an Intelligent Multi-Layered Attack Classification System (IMLACS), which helps in detecting and classifying intrusions with improved classification accuracy. The approach contains three intelligent layers. The first layer uses binary Support Vector Machine classification to distinguish normal traffic from attacks. The second layer uses neural network classification to assign attacks to classes of attacks. The third layer uses a fuzzy inference system to classify the attacks into various subclasses. IMLACS is able to detect intrusion behavior in networks because the system combines three intelligent classification layers with a better set of rules. Feature selection is also used to improve detection time. The experimental results show that IMLACS achieves a classification rate of 97.31%.
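
    The first two layers of such a multilayer scheme can be sketched on invented traffic features: a binary SVM flags attack vs. normal, then a second classifier names the attack class. The fuzzy third layer, the feature selection step, and real IDS features are omitted; the class names and Gaussian clusters below are illustrative stand-ins.

```python
# Two-stage sketch: layer 1 (binary SVM) separates normal from attack,
# layer 2 assigns an attack class; all data here is synthetic.
import numpy as np
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
normal = rng.normal(0, 1, size=(100, 4))   # synthetic "normal" flows
dos = rng.normal(5, 1, size=(50, 4))       # synthetic DoS-like flows
probe = rng.normal(-5, 1, size=(50, 4))    # synthetic probe-like flows

layer1 = SVC().fit(np.vstack([normal, dos, probe]), [0] * 100 + [1] * 100)
layer2 = DecisionTreeClassifier(random_state=0).fit(
    np.vstack([dos, probe]), ["dos"] * 50 + ["probe"] * 50)

def classify_flow(x):
    """Layer 1 flags attacks; layer 2 names the attack class."""
    if layer1.predict([x])[0] == 0:
        return "normal"
    return layer2.predict([x])[0]
```

    Cascading the classifiers keeps each stage's decision simple, which is the design rationale the abstract gives for the layered approach.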

  8. A classification model of Hyperion image base on SAM combined decision tree

    Science.gov (United States)

    Wang, Zhenghai; Hu, Guangdao; Zhou, YongZhang; Liu, Xin

    2009-10-01

    Monitoring the Earth using imaging spectrometers has necessitated more accurate analyses and new applications in remote sensing. A very high dimensional input space requires an exponentially large amount of data to adequately and reliably represent the classes in that space. On the other hand, as the input dimensionality increases, the hypothesis space grows exponentially, which makes classification performance highly unreliable; traditional classification algorithms struggle with such data, so classification of hyperspectral images is challenging and new algorithms have to be developed for hyperspectral data classification. The Spectral Angle Mapper (SAM) is a physically based spectral classification that uses an n-dimensional angle to match pixels to reference spectra. The algorithm determines the spectral similarity between two spectra by calculating the angle between them, treating them as vectors in a space with dimensionality equal to the number of bands. The key difficulty is that the SAM threshold must be defined manually; the classification precision depends on how reasonable that threshold is. To resolve this problem, this paper proposes a new automatic classification model for remote sensing images using SAM combined with a decision tree. It can automatically choose an appropriate SAM threshold and improve the classification precision of SAM based on the analysis of field spectra. The test area, located in Heqing, Yunnan, was imaged by the EO-1 Hyperion imaging spectrometer using 224 bands in the visible and near infrared. The area included limestone areas, rock fields, soil, and forests, and was classified into four different vegetation and soil types. The results show that this method chooses an appropriate SAM threshold and effectively eliminates the disturbance and influence of unwanted objects, improving the classification precision. Compared with the likelihood classification by field survey data, the classification precision of this model
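
    The SAM core is compact enough to sketch directly: the angle between a pixel spectrum and each reference spectrum decides the label. The threshold below is a fixed illustrative value, not the decision-tree-derived threshold the paper proposes, and the reference spectra are invented.

```python
# Minimal Spectral Angle Mapper over toy 3-band spectra.
import numpy as np

def spectral_angle(s, r):
    """Angle in radians between two spectra treated as vectors."""
    cos = np.dot(s, r) / (np.linalg.norm(s) * np.linalg.norm(r))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def sam_classify(pixel, references, threshold=0.1):
    """Label of the angularly closest reference, or None if none is close enough."""
    angles = {name: spectral_angle(pixel, ref) for name, ref in references.items()}
    best = min(angles, key=angles.get)
    return best if angles[best] <= threshold else None

refs = {"limestone": np.array([0.9, 0.8, 0.2]), "forest": np.array([0.1, 0.6, 0.9])}
label = sam_classify(np.array([1.8, 1.6, 0.4]), refs)  # scaled spectrum, angle 0
```

    Because the angle ignores vector magnitude, SAM is insensitive to illumination scaling, which is why the threshold choice, not brightness, dominates its precision.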

  9. A new classification system for congenital laryngeal cysts.

    Science.gov (United States)

    Forte, Vito; Fuoco, Gabriel; James, Adrian

    2004-06-01

    A new classification system for congenital laryngeal cysts based on the extent of the cyst and on the embryologic tissue of origin is proposed. Retrospective chart review. The charts of 20 patients with either congenital or acquired laryngeal cysts that were treated surgically between 1987 and 2002 at the Hospital for Sick Children, Toronto were retrospectively reviewed. Clinical presentation, radiologic findings, surgical management, histopathology, and outcome were recorded. A new classification system is proposed to better appreciate the origin of these cysts and to guide in their successful surgical management. Fourteen of the supraglottic and subglottic simple mucous retention cysts posed no diagnostic or therapeutic challenge and were treated successfully by a single endoscopic excision or marsupialization. The remaining six patients with congenital cysts in the study were deemed more complex, and all required open surgical procedures for cure. On the basis of the analysis of the data of these patients, a new classification of congenital laryngeal cysts is proposed. Type I cysts are confined to the larynx, the cyst wall composed of endodermal elements only, and can be managed endoscopically. Type II cysts extend beyond the confines of the larynx and require an external approach. The Type II cysts are further subclassified histologically on the basis of the embryologic tissue of origin: IIa, composed of endoderm only and IIb, containing endodermal and mesodermal elements (epithelium and cartilage) in the wall of the cyst. A new classification system for congenital laryngeal cysts is proposed on the basis of the extent of the cyst and the embryologic tissue of origin. This classification can help guide the surgeon with initial management and help us better understand the origin of these cysts.

  10. Phylogenetic classification and the universal tree.

    Science.gov (United States)

    Doolittle, W F

    1999-06-25

    From comparative analyses of the nucleotide sequences of genes encoding ribosomal RNAs and several proteins, molecular phylogeneticists have constructed a "universal tree of life," taking it as the basis for a "natural" hierarchical classification of all living things. Although confidence in some of the tree's early branches has recently been shaken, new approaches could still resolve many methodological uncertainties. More challenging is evidence that most archaeal and bacterial genomes (and the inferred ancestral eukaryotic nuclear genome) contain genes from multiple sources. If "chimerism" or "lateral gene transfer" cannot be dismissed as trivial in extent or limited to special categories of genes, then no hierarchical universal classification can be taken as natural. Molecular phylogeneticists will have failed to find the "true tree," not because their methods are inadequate or because they have chosen the wrong genes, but because the history of life cannot properly be represented as a tree. However, taxonomies based on molecular sequences will remain indispensable, and understanding of the evolutionary process will ultimately be enriched, not impoverished.

  11. Waste classification: a management approach

    International Nuclear Information System (INIS)

    Wickham, L.E.

    1984-01-01

    A waste classification system designed to quantify the total hazard of a waste has been developed by the Low-Level Waste Management Program. As originally conceived, the system was designed to deal with mixed radioactive waste. The methodology has been developed and successfully applied to radiological and chemical wastes, both individually and mixed together. Management options to help evaluate the financial and safety trade-offs between waste segregation, waste treatment, container types, and site factors are described. Using the system provides a very simple and cost effective way of making quick assessments of a site's capabilities to contain waste materials. 3 references

  12. A New Insight into Land Use Classification Based on Aggregated Mobile Phone Data

    OpenAIRE

    Pei, Tao; Sobolevsky, Stanislav; Ratti, Carlo; Shaw, Shih-Lung; Zhou, Chenghu

    2013-01-01

    Land use classification is essential for urban planning. Urban land use types can be differentiated either by their physical characteristics (such as reflectivity and texture) or social functions. Remote sensing techniques have been recognized as a vital method for urban land use classification because of their ability to capture the physical characteristics of land use. Although significant progress has been achieved in remote sensing methods designed for urban land use classification, most ...

  13. 44 CFR 8.2 - Original classification authority.

    Science.gov (United States)

    2010-10-01

    ... information originally as TOP SECRET, as designated by the President in the Federal Register, Vol 47, No. 91...)(2), E.O. 12356, the following positions have been delegated ORIGINAL TOP SECRET CLASSIFICATION... Preparedness Directorate (3) Director, Office of Security (c) The positions delegated original Top Secret...

  14. Challenges for eco-design of emerging technologies: The case of electronic textiles

    International Nuclear Information System (INIS)

    Köhler, Andreas R.

    2013-01-01

    Highlights: • Recent innovations of electronic textiles and their end-of-life impacts are reviewed. • The properties of e-textiles are examined against Design for Recycling (DfR) principles. • Eco-design strategies for sustainable product development are discussed. • Compatibility standards for e-textiles are proposed as a waste prevention strategy. • Labelling of e-textiles is suggested as a measure to facilitate recycling. - Abstract: The combination of textile and electronic technologies results in new challenges for sustainable product design. Electronic textiles (e-textiles) feature a seamless integration of textiles with electronics and other high-tech materials. Such products may, if they become mass consumer applications, result in a new kind of waste that could be difficult to recycle. The ongoing innovation process of e-textiles holds opportunities to prevent future end-of-life impacts. Implementing eco-design in the technological development process can help to minimise future waste. However, the existing Design for Recycling (DfR) principles for textiles or electronics do not match the properties of the combined products. This article examines possibilities to advance eco-design of a converging technology. DfR strategies for e-textiles are discussed against the background of contemporary innovation trends. Three waste-preventative eco-design approaches for e-textiles are discussed: (1) harnessing the inherent advantages of smart materials for sustainable design; (2) establishing open compatibility standards; (3) labelling the e-textiles to facilitate their recycling. It is argued that life-cycle thinking needs to be implemented concurrently with the technological development process.

  15. A review on technologies and their usage in solid waste monitoring and management systems: Issues and challenges

    International Nuclear Information System (INIS)

    Hannan, M.A.; Abdulla Al Mamun, Md.; Hussain, Aini; Basri, Hassan; Begum, R.A.

    2015-01-01

    Highlights: • Classification of available technologies for SWM systems in four core categories. • Organization of technology-based SWM systems in three main groups. • Summary of SWM systems with target application, methodology and functional domain. • Issues and challenges are highlighted for further design of a sustainable system. - Abstract: Against the backdrop of rapid advancement, information and communication technology (ICT) has become an inevitable part of planning and designing modern solid waste management (SWM) systems. This study presents a critical review of the existing ICTs and their usage in SWM systems to unfold the issues and challenges of moving towards integrated technology-based systems. To plan, monitor, collect and manage solid waste, the ICTs are divided into four categories: spatial technologies, identification technologies, data acquisition technologies and data communication technologies. The ICT-based SWM systems classified in this paper are based on the first three technologies, while the fourth is employed by almost every system. This review may guide the reader through the basics of available ICTs and their application in SWM to facilitate the search for planning and design of a sustainable new system.

  16. A review on technologies and their usage in solid waste monitoring and management systems: Issues and challenges

    Energy Technology Data Exchange (ETDEWEB)

    Hannan, M.A., E-mail: hannan@eng.ukm.my [Department of Electrical, Electronic and Systems Engineering, Faculty of Engineering & Built Environment, Universiti Kebangsaan Malaysia, Bangi, Selangor DE (Malaysia); Abdulla Al Mamun, Md., E-mail: md.abdulla@siswa.ukm.edu.my [Department of Electrical, Electronic and Systems Engineering, Faculty of Engineering & Built Environment, Universiti Kebangsaan Malaysia, Bangi, Selangor DE (Malaysia); Hussain, Aini, E-mail: aini@eng.ukm.my [Department of Electrical, Electronic and Systems Engineering, Faculty of Engineering & Built Environment, Universiti Kebangsaan Malaysia, Bangi, Selangor DE (Malaysia); Basri, Hassan, E-mail: drhb@ukm.my [Department of Civil and Structural Engineering, Faculty of Engineering & Built Environment, Universiti Kebangsaan Malaysia, Bangi, Selangor DE (Malaysia); Begum, R.A., E-mail: rawshan@ukm.edu.my [Institute of Climate Change, Universiti Kebangsaan Malaysia, Bangi, Selangor DE (Malaysia)

    2015-09-15

    Highlights: • Classification of available technologies for SWM systems into four core categories. • Organization of technology-based SWM systems into three main groups. • Summary of SWM systems with target application, methodology and functional domain. • Issues and challenges are highlighted for the further design of a sustainable system. - Abstract: Against the backdrop of rapid advancement, information and communication technology (ICT) has become an integral part of the planning and design of modern solid waste management (SWM) systems. This study presents a critical review of existing ICTs and their usage in SWM systems to unfold the issues and challenges of moving towards integrated technology-based systems. To plan, monitor, collect and manage solid waste, the ICTs are divided into four categories: spatial technologies, identification technologies, data acquisition technologies and data communication technologies. The ICT-based SWM systems classified in this paper are based on the first three technologies, while the fourth is employed by almost every system. This review may guide the reader through the basics of available ICTs and their application in SWM to facilitate the planning and design of a sustainable new system.

  17. A systematic review and development of a classification framework for factors associated with missing patient-reported outcome data.

    Science.gov (United States)

    Palmer, Michael J; Mercieca-Bebber, Rebecca; King, Madeleine; Calvert, Melanie; Richardson, Harriet; Brundage, Michael

    2018-02-01

    Missing patient-reported outcome data can lead to biased results, to loss of power to detect between-treatment differences, and to research waste. Awareness of these factors may help researchers reduce missing patient-reported outcome data through study design and trial processes. The aim was to construct a Classification Framework of factors associated with missing patient-reported outcome data in the context of comparative studies. The first step in this process was informed by a systematic review. Two databases (MEDLINE and CINAHL) were searched from inception to March 2015 for English articles. Inclusion criteria were (a) relevant to patient-reported outcomes, (b) discussed missing data or compliance in prospective medical studies, and (c) examined predictors or causes of missing data, including reasons identified in actual trial datasets and reported on cover sheets. Two reviewers independently screened titles and abstracts. Discrepancies were discussed with the research team prior to finalizing the list of eligible papers. In completing the systematic review, four particular challenges to synthesizing the extracted information were identified. To address these challenges, operational principles were established by consensus to guide the development of the Classification Framework. A total of 6027 records were screened. In all, 100 papers were eligible and included in the review. Of these, 57% focused on cancer, 23% did not specify disease, and 20% reported for patients with a variety of non-cancer conditions. In total, 40% of the papers offered a descriptive analysis of possible factors associated with missing data, but some papers used other methods. In total, 663 excerpts of text (units), each describing a factor associated with missing patient-reported outcome data, were extracted verbatim. Redundant units were identified and sequestered. Similar units were grouped, and an iterative process of consensus among the investigators was used to reduce these units to a …

  18. Two conceptual designs of helical fusion reactor FFHR-d1A based on ITER technologies and challenging ideas

    Science.gov (United States)

    Sagara, A.; Miyazawa, J.; Tamura, H.; Tanaka, T.; Goto, T.; Yanagi, N.; Sakamoto, R.; Masuzaki, S.; Ohtani, H.; The FFHR Design Group

    2017-08-01

    The Fusion Engineering Research Project (FERP) at the National Institute for Fusion Science (NIFS) is conducting conceptual design activities for the LHD-type helical fusion reactor FFHR-d1A. This paper newly defines two design options, ‘basic’ and ‘challenging.’ Conservative technologies, including those that will be demonstrated in ITER, are chosen in the basic option, in which two helical coils are made of continuously wound cable-in-conduit superconductors of Nb3Sn strands, the divertor is composed of water-cooled tungsten monoblocks, and the blanket is composed of water-cooled ceramic breeders. In contrast, new ideas that would possibly be beneficial for making the reactor design more attractive are boldly included in the challenging option, in which the helical coils are wound by connecting high-temperature REBCO superconductors using mechanical joints, the divertor is composed of a shower of molten tin jets, and the blanket is composed of molten salt FLiNaBe including Ti powder to increase hydrogen solubility. The main targets of the challenging option are early construction and easy maintenance of a large and three-dimensionally complicated helical structure, high thermal efficiency, and, in particular, realistic feasibility of the helical reactor.

  19. A segmentation and classification scheme for single tooth in MicroCT images based on 3D level set and k-means+.

    Science.gov (United States)

    Wang, Liansheng; Li, Shusheng; Chen, Rongzhen; Liu, Sze-Yu; Chen, Jyh-Cheng

    2017-04-01

    Accurate classification of different anatomical structures of teeth from medical images provides crucial information for stress analysis in dentistry. Usually, the anatomical structures of teeth are manually labeled by experienced clinical doctors, which is time consuming. However, automatic segmentation and classification is a challenging task because the anatomical structures and surroundings of the tooth in medical images are rather complex. Therefore, in this paper, we propose an effective framework designed to segment the tooth with a Selective Binary and Gaussian Filtering Regularized Level Set (GFRLS) method, improved by fully utilizing 3-dimensional (3D) information, and to classify the tooth by employing an unsupervised learning method, i.e., k-means++. To evaluate the proposed method, experiments are conducted on extensive datasets of mandibular molars. The experimental results show that our method can achieve higher accuracy and robustness compared to three other clustering methods. Copyright © 2016 Elsevier Ltd. All rights reserved.
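The unsupervised classification stage described above hinges on k-means++ seeding. The idea can be sketched with scikit-learn's `KMeans` (whose default `init` is k-means++) on synthetic intensity values standing in for MicroCT voxels; the tissue names and intensity ranges below are illustrative assumptions, not the paper's data.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Synthetic stand-in for MicroCT voxel intensities of three tooth
# tissues (e.g. enamel, dentine, pulp); values are illustrative only.
enamel = rng.normal(250, 10, (300, 1))
dentine = rng.normal(150, 10, (300, 1))
pulp = rng.normal(50, 10, (300, 1))
voxels = np.vstack([enamel, dentine, pulp])

# k-means++ seeding spreads the initial centres out, which makes the
# clustering far less sensitive to initialization than plain k-means.
km = KMeans(n_clusters=3, init="k-means++", n_init=10, random_state=0)
labels = km.fit_predict(voxels)

# Order cluster ids by centre intensity so they map onto tissue classes.
centres = sorted(km.cluster_centers_.ravel())
print([round(c, 1) for c in centres])
```

Real pipelines would run this on level-set-segmented voxel features rather than raw 1-D intensities, but the clustering step is the same.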

  20. A Classification-based Review Recommender

    Science.gov (United States)

    O'Mahony, Michael P.; Smyth, Barry

    Many online stores encourage their users to submit product/service reviews in order to guide future purchasing decisions. These reviews are often listed alongside product recommendations but, to date, limited attention has been paid as to how best to present these reviews to the end-user. In this paper, we describe a supervised classification approach that is designed to identify and recommend the most helpful product reviews. Using the TripAdvisor service as a case study, we compare the performance of several classification techniques using a range of features derived from hotel reviews. We then describe how these classifiers can be used as the basis for a practical recommender that automatically suggests the most helpful contrasting reviews to end-users. We present an empirical evaluation which shows that our approach achieves a statistically significant improvement over alternative review ranking schemes.
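The core of such a recommender is a supervised classifier over review features, whose predicted helpfulness score is then used for ranking. A minimal sketch with hypothetical toy data follows; the paper derives richer hand-crafted features from hotel reviews, so TF-IDF text features here are a stand-in.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labelled reviews: 1 = voted helpful, 0 = not helpful.
reviews = [
    "Great location, spotless rooms, and staff arranged a late checkout.",
    "Breakfast was varied and the pool area was quiet and very clean.",
    "Terrible value: thin walls, a broken AC unit, and rude reception.",
    "bad hotel",
    "meh",
    "would not stay",
]
helpful = [1, 1, 1, 0, 0, 0]

# Train a helpfulness classifier, then rank reviews by its score.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(reviews, helpful)

scores = model.predict_proba(reviews)[:, 1]
ranked = [r for _, r in sorted(zip(scores, reviews), reverse=True)]
```

A deployed recommender would train on held-out helpfulness votes and surface the top-scoring contrasting (positive and negative) reviews.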

  1. Approach to design neural cryptography: a generalized architecture and a heuristic rule.

    Science.gov (United States)

    Mu, Nankun; Liao, Xiaofeng; Huang, Tingwen

    2013-06-01

    Neural cryptography, a type of public key exchange protocol, is widely considered an effective method for sharing a common secret key between two neural networks over public channels. How to design neural cryptography remains a great challenge. In this paper, in order to provide an approach to solving this challenge, a generalized network architecture and a significant heuristic rule are designed. The proposed generic framework is named the tree state classification machine (TSCM), which extends and unifies the existing structures, i.e., the tree parity machine (TPM) and the tree committee machine (TCM). Furthermore, we carefully study and find that the heuristic rule can improve the security of TSCM-based neural cryptography. Therefore, TSCM and the heuristic rule can guide us in designing a great number of effective neural cryptography candidates, among which it is possible to find more secure instances. Significantly, in light of TSCM and the heuristic rule, we further show that our designed neural cryptography outperforms TPM (the most secure model at present) in security. Finally, a series of numerical simulation experiments is provided to verify the validity and applicability of our results.
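The tree parity machine that TSCM generalizes can be sketched concretely: two TPMs with identical structure exchange public inputs and outputs, and update their weights (Hebbian rule here) only when their outputs agree, until the weight matrices synchronize into a shared key. The parameters K, N, L below are illustrative, not taken from the paper.

```python
import numpy as np

K, N, L = 3, 10, 3  # hidden units, inputs per unit, weight bound (assumed)
rng = np.random.default_rng(42)

def tpm_output(w, x):
    # sigma_k: sign of each hidden unit's local field; tau: their product.
    sigma = np.sign(np.sum(w * x, axis=1))
    sigma[sigma == 0] = -1
    return sigma, int(np.prod(sigma))

def hebbian_update(w, x, sigma, tau):
    # Update only hidden units that agree with the output; clip to [-L, L].
    for k in range(K):
        if sigma[k] == tau:
            w[k] += x[k] * tau
    np.clip(w, -L, L, out=w)

wa = rng.integers(-L, L + 1, (K, N))  # party A's secret weights
wb = rng.integers(-L, L + 1, (K, N))  # party B's secret weights

steps = 0
while not np.array_equal(wa, wb):
    x = rng.choice([-1, 1], (K, N))   # public random input
    sa, ta = tpm_output(wa, x)
    sb, tb = tpm_output(wb, x)
    if ta == tb:                      # mutual learning only on agreement
        hebbian_update(wa, x, sa, ta)
        hebbian_update(wb, x, sb, tb)
    steps += 1
    if steps > 200_000:               # safety cap; sync is normally fast
        break

print("synchronized:", np.array_equal(wa, wb), "after", steps, "exchanges")
```

After synchronization the common weight matrix serves as the shared secret; an eavesdropper who can only listen (not update mutually) synchronizes much more slowly, which is the security argument the abstract refers to.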

  2. Software life cycle process and classification guides for KNICS digital instrumentation and control system design

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jang Soo; Son, Han Seung; Kim, Jang Yeol; Kwon, Kee Choon; Lee, Soon Seung; Kim, Doo Hwan [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2002-04-01

    Documentation should exist that shows that the qualification activities have been successfully accomplished for each life cycle activity group. In particular, the documentation should show that the system safety requirements have been adequately addressed for each life cycle activity group, that no new hazards have been introduced, and that the software requirements, design elements, and code elements that can affect safety have been identified. Because the safety of software can be assured both through the process Verification and Validation (V and V) itself and through the V and V of all the intermediate and final products during the software development life cycle, the KNICS Software Safety Framework (KSSF) must be established. As the first activity in establishing the KSSF, we have developed this report, Software Life Cycle Process and Classification Guides for KNICS Digital I and C System. This report is organized as follows. Chapter I describes the background, definitions, and references of the SLCP. Chapter II describes KNICS safety software categorization. In Chapter III, we define the requirements on the software life cycle process for designing the digital KNICS. Chapter III.3, the main section of the chapter, includes the requirements for software life cycle process planning, implementation, and design outputs. Finally, we describe the results of a case study on the SLCP for developing the software of the ESF-CCS system that is being developed by a private company, BNF. 29 refs., 5 figs., 7 tabs. (Author)

  3. Hand eczema classification

    DEFF Research Database (Denmark)

    Diepgen, T L; Andersen, Klaus Ejner; Brandao, F M

    2008-01-01

    … of the disease is rarely evidence based, and a classification system for different subdiagnoses of hand eczema is not agreed upon. Randomized controlled trials investigating the treatment of hand eczema are called for. For this, as well as for clinical purposes, a generally accepted classification system … A classification system for hand eczema is proposed. Conclusions: It is suggested that this classification be used in clinical work and in clinical trials.

  4. Challenges and considerations for the design and production of a purpose-optimized body-worn wrist-watch computer

    Science.gov (United States)

    Narayanaswami, Chandra; Raghunath, Mandayam T.

    2004-09-01

    We outline a collection of technological challenges in the design of wearable computers with a focus on one of the most desirable form-factors, the wrist watch. We describe our experience with building three generations of wrist watch computers. We built these research prototypes as platforms to investigate the fundamental limitations of wearable computing. Results of our investigations are presented in the form of challenges that have been overcome and those that still remain.

  5. A Preliminary Study on the Multiple Mapping Structure of Classification Systems for Heterogeneous Databases

    Directory of Open Access Journals (Sweden)

    Seok-Hyoung Lee

    2012-06-01

    Full Text Available While science and technology information service portals and heterogeneous databases produced in Korea and other countries are integrated, methods of connecting the unique classification systems applied to each database have been studied. Results of technologists' research, such as, journal articles, patent specifications, and research reports, are organically related to each other. In this case, if the most basic and meaningful classification systems are not connected, it is difficult to achieve interoperability of the information and thus not easy to implement meaningful science technology information services through information convergence. This study aims to address the aforementioned issue by analyzing mapping systems between classification systems in order to design a structure to connect a variety of classification systems used in the academic information database of the Korea Institute of Science and Technology Information, which provides science and technology information portal service. This study also aims to design a mapping system for the classification systems to be applied to actual science and technology information services and information management systems.

  6. Security classification of information

    Energy Technology Data Exchange (ETDEWEB)

    Quist, A.S.

    1993-04-01

    This document is the second of a planned four-volume work that comprehensively discusses the security classification of information. The main focus of Volume 2 is on the principles for classification of information. Included herein are descriptions of the two major types of information that governments classify for national security reasons (subjective and objective information), guidance to use when determining whether information under consideration for classification is controlled by the government (a necessary requirement for classification to be effective), information disclosure risks and benefits (the benefits and costs of classification), standards to use when balancing information disclosure risks and benefits, guidance for assigning classification levels (Top Secret, Secret, or Confidential) to classified information, guidance for determining how long information should be classified (classification duration), classification of associations of information, classification of compilations of information, and principles for declassifying and downgrading information. Rules or principles of certain areas of our legal system (e.g., trade secret law) are sometimes mentioned to .provide added support to some of those classification principles.

  7. PAI-1 and EGFR expression in adult glioma tumors: toward a molecular prognostic classification

    International Nuclear Information System (INIS)

    Muracciole, Xavier; Romain, Sylvie; Dufour, Henri; Palmari, Jacqueline; Chinot, Olivier; Ouafik, L'Houcine; Grisoli, Francois; Figarella-Branger, Dominique; Martin, Pierre-Marie

    2002-01-01

    Purpose: Molecular classification of gliomas is a major challenge in the effort to improve therapeutic decisions. The plasminogen activator system, including plasminogen activator inhibitor type 1 (PAI-1), plays a key role in tumor invasion and neoangiogenesis. Epidermal growth factor receptor (EGFR) is involved in the control of proliferation. The contribution of PAI-1 and EGFR to the survival of gliomas was retrospectively investigated. Methods and Materials: Fifty-nine adult gliomas treated by neurosurgery and conventional irradiation were analyzed, including 9 low-grade (2) and 50 high-grade (3-4) tumors (WHO classification). PAI-1 was measured on cytosols and EGFR on solubilized membranes using ELISA methods. Results: High PAI-1 levels were strongly associated with high histologic grade (p<0.001) and histologic necrosis (p<0.001). PAI-1 also correlated positively with patient age (p=0.05) and negatively with Karnofsky index (p=0.01). By univariate analysis of the high-grade population, higher PAI-1 (p<0.0001) and EGFR values (p=0.02) were associated with shorter overall survival. Only PAI-1 was an independent factor in multivariate analysis. Grade 3 tumors with low PAI-1 (100% 3-year overall survival rate) presented the same clinical outcome as the low-grade tumors. Conclusions: In this prognostic study, PAI-1 and EGFR expression revealed similarities and differences between high-grade gliomas that were not apparent by traditional clinical criteria. These data strongly support that biologic factors should be included in glioma classification and the design of clinical trials to treat more homogeneous populations

  8. Support Vector Machines for Hyperspectral Remote Sensing Classification

    Science.gov (United States)

    Gualtieri, J. Anthony; Cromp, R. F.

    1998-01-01

    The Support Vector Machine provides a new way to design classification algorithms that learn from examples (supervised learning) and generalize when applied to new data. We demonstrate its success on a difficult classification problem from hyperspectral remote sensing, where we obtain accuracies of 96% and 87% correct for a 4-class problem and a 16-class problem, respectively. These results are somewhat better than other recent results on the same data. A key feature of this classifier is its ability to use high-dimensional data without the usual recourse to a feature selection step to reduce the dimensionality of the data. For this application, this is important, as hyperspectral data consists of several hundred contiguous spectral channels for each exemplar. We provide an introduction to this new approach and demonstrate its application to classification of an agricultural scene.
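The point about skipping feature selection can be illustrated with a small sketch: an SVM trained directly on synthetic "spectra" with hundreds of bands. The band counts, class offsets, and noise level below are assumptions chosen to make the toy data separable, not the paper's dataset.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(1)
# Synthetic stand-in for hyperspectral pixels: 200 contiguous "bands"
# per exemplar, four classes whose mean spectra are slightly offset.
n_bands, n_per_class = 200, 100
base = np.linspace(0.0, 1.0, n_bands)
X, y = [], []
for c in range(4):
    X.append(base + 0.05 * c + rng.normal(0, 0.03, (n_per_class, n_bands)))
    y.append(np.full(n_per_class, c))
X, y = np.vstack(X), np.concatenate(y)

Xtr, Xte, ytr, yte = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

# The SVM consumes all bands directly: no dimensionality-reduction or
# feature-selection step, which is the property the abstract emphasizes.
clf = SVC(kernel="linear", C=1.0).fit(Xtr, ytr)
print("held-out accuracy:", round(clf.score(Xte, yte), 3))
```

Because the SVM's capacity is controlled by the margin rather than the input dimension, adding hundreds of weakly informative bands does not force a preliminary feature-selection pass.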

  9. Competency-based education: programme design and challenges to implementation.

    Science.gov (United States)

    Gruppen, Larry D; Burkhardt, John C; Fitzgerald, James T; Funnell, Martha; Haftel, Hilary M; Lypson, Monica L; Mullan, Patricia B; Santen, Sally A; Sheets, Kent J; Stalburg, Caren M; Vasquez, John A

    2016-05-01

    Competency-based education (CBE) has been widely cited as an educational framework for medical students and residents, and provides a framework for designing educational programmes that reflect four critical features: a focus on outcomes, an emphasis on abilities, a reduction of emphasis on time-based training, and promotion of learner centredness. Each of these features has implications and potential challenges for implementing CBE. As an experiment in CBE programme design and implementation, the University of Michigan Master of Health Professions Education (UM-MHPE) degree programme was examined for lessons to be learned when putting CBE into practice. The UM-MHPE identifies 12 educational competencies and 20 educational entrustable professional activities (EPAs) that serve as the vehicle for both learning and assessment. The programme also defines distinct roles of faculty members as assessors, mentors and subject-matter experts focused on highly individualised learning plans adapted to each learner. Early experience with implementing the UM-MHPE indicates that EPAs and competencies can provide a viable alternative to traditional courses and a vehicle for rigorous assessment. A high level of individualisation is feasible but carries with it significant costs and makes intentional community building essential. Most significantly, abandoning a time-based framework is a difficult innovation to implement in a university structure that is predicated on time-based education. © 2016 John Wiley & Sons Ltd.

  10. Reporting Qualitative Research: Standards, Challenges, and Implications for Health Design.

    Science.gov (United States)

    Peditto, Kathryn

    2018-04-01

    This Methods column describes the existing reporting standards for qualitative research, their application to health design research, and the challenges to implementation. Intended for both researchers and practitioners, this article provides multiple perspectives on both reporting and evaluating high-quality qualitative research. Two popular reporting standards exist for reporting qualitative research: the Consolidated Criteria for Reporting Qualitative Research (COREQ) and the Standards for Reporting Qualitative Research (SRQR). Though compiled using similar procedures, they differ in their criteria and the methods to which they apply. Creating and applying reporting criteria is inherently difficult due to the undefined and fluctuating nature of qualitative research when compared to quantitative studies. Qualitative research is expansive and occasionally controversial, spanning many different methods of inquiry and epistemological approaches. A "one-size-fits-all" standard for reporting qualitative research can be restrictive, but COREQ and SRQR both serve as valuable tools for developing responsible qualitative research proposals, effectively communicating research decisions, and evaluating submissions. Ultimately, tailoring a set of standards specific to health design research and its frequently used methods would ensure quality research and aid reviewers in their evaluations.

  11. Automatic Segmentation of Dermoscopic Images by Iterative Classification

    Directory of Open Access Journals (Sweden)

    Maciel Zortea

    2011-01-01

    Full Text Available Accurate detection of the borders of skin lesions is a vital first step for computer aided diagnostic systems. This paper presents a novel automatic approach to segmentation of skin lesions that is particularly suitable for analysis of dermoscopic images. Assumptions about the image acquisition, in particular, the approximate location and color, are used to derive an automatic rule to select small seed regions, likely to correspond to samples of skin and the lesion of interest. The seed regions are used as initial training samples, and the lesion segmentation problem is treated as binary classification problem. An iterative hybrid classification strategy, based on a weighted combination of estimated posteriors of a linear and quadratic classifier, is used to update both the automatically selected training samples and the segmentation, increasing reliability and final accuracy, especially for those challenging images, where the contrast between the background skin and lesion is low.
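The iterative hybrid strategy in this abstract (seed samples, then a weighted combination of linear and quadratic classifier posteriors, retrained on the evolving segmentation) can be sketched with scikit-learn's discriminant analysis classifiers. The colour distributions, seed-selection rule, weight alpha, and confidence threshold below are illustrative assumptions on synthetic "pixels", not the paper's parameters.

```python
import numpy as np
from sklearn.discriminant_analysis import (
    LinearDiscriminantAnalysis, QuadraticDiscriminantAnalysis)

rng = np.random.default_rng(0)
# Synthetic low-contrast "pixel colours": background skin vs lesion.
skin = rng.normal([0.80, 0.60], 0.08, (500, 2))
lesion = rng.normal([0.60, 0.45], 0.08, (500, 2))
pixels = np.vstack([skin, lesion])
truth = np.r_[np.zeros(500, dtype=int), np.ones(500, dtype=int)]

# Seed regions: assume some border pixels are skin and some central
# pixels are lesion (here simulated by taking known indices).
seed_idx = np.r_[np.arange(0, 50), np.arange(500, 550)]
train_X, labels = pixels[seed_idx], truth[seed_idx]

alpha = 0.5  # weight between the linear and quadratic posteriors
for _ in range(5):
    lda = LinearDiscriminantAnalysis().fit(train_X, labels)
    qda = QuadraticDiscriminantAnalysis().fit(train_X, labels)
    post = (alpha * lda.predict_proba(pixels)
            + (1 - alpha) * qda.predict_proba(pixels))
    pred = post.argmax(axis=1)
    # Retrain on the most confident pixels of the current segmentation.
    keep = post.max(axis=1) > 0.9
    train_X, labels = pixels[keep], pred[keep]

accuracy = (pred == truth).mean()
print("segmentation accuracy:", round(accuracy, 3))
```

Each pass enlarges the training set with confidently classified pixels, which is what lets the method cope with low skin/lesion contrast.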

  12. An evaluation of sampling and full enumeration strategies for Fisher Jenks classification in big data settings

    Science.gov (United States)

    Rey, Sergio J.; Stephens, Philip A.; Laura, Jason R.

    2017-01-01

    Large data contexts present a number of challenges to optimal choropleth map classifiers. Application of optimal classifiers to a sample of the attribute space is one proposed solution. The properties of alternative sampling-based classification methods are examined through a series of Monte Carlo simulations. The impacts of spatial autocorrelation, number of desired classes, and form of sampling are shown to have significant impacts on the accuracy of map classifications. Tradeoffs between improved speed of the sampling approaches and loss of accuracy are also considered. The results suggest the possibility of guiding the choice of classification scheme as a function of the properties of large data sets.
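The speed/accuracy tradeoff the abstract studies can be made concrete: Fisher-Jenks is an exact 1-D optimal classifier with O(k·n²) dynamic-programming cost, so running it on a sample and comparing the breaks to the full-data breaks shows how little accuracy the sampling approach gives up on well-clustered data. The implementation below is a straightforward DP sketch, not the authors' code, and the data are synthetic.

```python
import numpy as np

def jenks_breaks(values, k):
    """Exact Fisher-Jenks classification via dynamic programming.
    Returns the upper break value of each of the k classes."""
    v = np.sort(np.asarray(values, dtype=float))
    n = len(v)
    s1 = np.concatenate([[0.0], np.cumsum(v)])
    s2 = np.concatenate([[0.0], np.cumsum(v * v)])

    def ssd(i, j):  # within-class sum of squared deviations for v[i..j]
        m = j - i + 1
        s = s1[j + 1] - s1[i]
        return (s2[j + 1] - s2[i]) - s * s / m

    cost = np.full((n, k + 1), np.inf)
    back = np.zeros((n, k + 1), dtype=int)
    for j in range(n):
        cost[j, 1] = ssd(0, j)
    for m in range(2, k + 1):
        for j in range(m - 1, n):
            for i in range(m - 1, j + 1):
                c = cost[i - 1, m - 1] + ssd(i, j)
                if c < cost[j, m]:
                    cost[j, m], back[j, m] = c, i
    breaks, j = [], n - 1
    for m in range(k, 0, -1):
        breaks.append(v[j])
        j = back[j, m] - 1
    return sorted(breaks)

rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(0, 1, 200),
                       rng.normal(10, 1, 200),
                       rng.normal(20, 1, 200)])

full_breaks = jenks_breaks(data, 3)                     # O(k n^2) on all data
sample_breaks = jenks_breaks(rng.choice(data, 100, replace=False), 3)
print("full:  ", [round(b, 2) for b in full_breaks])
print("sample:", [round(b, 2) for b in sample_breaks])
```

On clearly clustered attributes the sampled breaks land close to the full-enumeration breaks at a fraction of the quadratic cost, which is the tradeoff the Monte Carlo simulations quantify.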

  13. Effects of Classification Exposure upon Numerical Achievement of Educable Mentally Retarded Children.

    Science.gov (United States)

    Funk, Kerri L.; Tseng, M. S.

    Two groups of 32 educable mentally retarded children (ages 7 to 14 years) were compared as to their arithmetic and classification performances attributable to the presence or absence of a 4 1/2 week exposure to classification tasks. The randomized block pretest-posttest design was used. The experimental group and the control group were matched on…

  14. Hazard classification methodology

    International Nuclear Information System (INIS)

    Brereton, S.J.

    1996-01-01

    This document outlines the hazard classification methodology used to determine the hazard classification of the NIF LTAB, OAB, and the support facilities on the basis of radionuclides and chemicals. The hazard classification determines the safety analysis requirements for a facility

  15. Evaluation of the Waste Isolation Pilot Plant classification of systems, structures and components

    International Nuclear Information System (INIS)

    1985-07-01

    A review of the classification system for systems, structures, and components at the Waste Isolation Pilot Plant (WIPP) was performed using the WIPP Safety Analysis Report (SAR) and Bechtel document D-76-D-03 as primary source documents. The regulations of the US Nuclear Regulatory Commission (NRC) covering ''Disposal of High level Radioactive Wastes in Geologic Repositories,'' 10 CFR 60, and the regulations relevant to nuclear power plant siting and construction (10 CFR 50, 51, 100) were used as standards to evaluate the WIPP design classification system, although it is recognized that the US Department of Energy (DOE) is not required to comply with these NRC regulations in the design and construction of WIPP. The DOE General Design Criteria Manual (DOE Order 6430.1) and the Safety Analysis and Review System for AL Operation document (AL 54f81.1A) were reviewed in part. This report includes a discussion of the historical basis for nuclear power plant requirements, a review of WIPP and nuclear power plant classification bases, and a comparison of the codes and standards applicable to each quality level. Observations made during the review of the WIPP SAR are noted in the text of this report. The conclusions reached by this review are: WIPP classification methodology is comparable to corresponding nuclear power procedures. The classification levels assigned to WIPP systems are qualitatively the same as those assigned to nuclear power plant systems.

  16. Development of climatic zones and passive solar design in Madagascar

    International Nuclear Information System (INIS)

    Rakoto-Joseph, O.; Garde, F.; David, M.; Adelard, L.; Randriamanantany, Z.A.

    2009-01-01

    Climate classification is extremely useful to design buildings for thermal comfort purposes. This paper presents the first work on a climate classification of Madagascar Island. This classification is based on the meteorological data measured in different cities of this country. Three major climatic zones are identified. Psychrometric charts for the six urban areas of Madagascar are proposed, and suited passive solar designs related to each climate are briefly discussed. Finally, a total of three passive design zones have been identified and appropriate design strategies such as solar heating, natural ventilation, and thermal mass are suggested for each zone. The specificity of this work is that it is the first published survey on climate classification and passive solar design for this developing country.

  17. Design of Embedded System for Multivariate Classification of Finger and Thumb Movements Using EEG Signals for Control of Upper Limb Prosthesis

    Science.gov (United States)

    Javed, Amna; Tiwana, Mohsin I.; Khan, Umar Shahbaz

    2018-01-01

    Brain Computer Interface (BCI) determines the intent of the user from a variety of electrophysiological signals. These signals, Slow Cortical Potentials, are recorded from scalp, and cortical neuronal activity is recorded by implanted electrodes. This paper is focused on design of an embedded system that is used to control the finger movements of an upper limb prosthesis using Electroencephalogram (EEG) signals. This is a follow-up of our previous research which explored the best method to classify three movements of fingers (thumb movement, index finger movement, and first movement). Two-stage logistic regression classifier exhibited the highest classification accuracy while Power Spectral Density (PSD) was used as a feature of the filtered signal. The EEG signal data set was recorded using a 14-channel electrode headset (a noninvasive BCI system) from right-handed, neurologically intact volunteers. Mu (commonly known as alpha waves) and Beta Rhythms (8–30 Hz) containing most of the movement data were retained through filtering using “Arduino Uno” microcontroller followed by 2-stage logistic regression to obtain a mean classification accuracy of 70%. PMID:29888252
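The processing chain described here (retain the 8–30 Hz mu/beta band, compute a Power Spectral Density feature vector, classify with logistic regression) can be sketched end to end. The sketch below is a simplified single-stage version of the paper's two-stage classifier, run on synthetic single-channel trials; the sampling rate, trial length, and class-dependent frequencies are assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt, welch
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

fs = 128  # Hz (assumed sampling rate for a consumer EEG headset)
b, a = butter(4, [8, 30], btype="bandpass", fs=fs)  # mu/beta band

rng = np.random.default_rng(0)

def synth_trial(f):
    """Synthetic 2 s single-channel trial with class-dependent power at f Hz."""
    t = np.arange(2 * fs) / fs
    return np.sin(2 * np.pi * f * t) + 0.5 * rng.normal(size=t.size)

# Three hypothetical movement classes encoded as dominant frequencies.
freqs = {0: 10, 1: 18, 2: 26}
X, y = [], []
for label, f in freqs.items():
    for _ in range(60):
        sig = filtfilt(b, a, synth_trial(f))       # keep 8-30 Hz content
        _, pxx = welch(sig, fs=fs, nperseg=fs)     # PSD feature vector
        X.append(pxx)
        y.append(label)
X, y = np.array(X), np.array(y)

Xtr, Xte, ytr, yte = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)
clf = LogisticRegression(max_iter=1000).fit(Xtr, ytr)
print("PSD + logistic regression accuracy:", round(clf.score(Xte, yte), 2))
```

The paper's two-stage variant chains two such logistic regressions; on real multi-channel EEG the PSD features are far noisier, which is why the reported mean accuracy is around 70% rather than the near-perfect separation this toy signal gives.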

  18. Design of Embedded System for Multivariate Classification of Finger and Thumb Movements Using EEG Signals for Control of Upper Limb Prosthesis

    Directory of Open Access Journals (Sweden)

    Nasir Rashid

    2018-01-01

    Full Text Available Brain Computer Interface (BCI) determines the intent of the user from a variety of electrophysiological signals. These signals, Slow Cortical Potentials, are recorded from scalp, and cortical neuronal activity is recorded by implanted electrodes. This paper is focused on design of an embedded system that is used to control the finger movements of an upper limb prosthesis using Electroencephalogram (EEG) signals. This is a follow-up of our previous research which explored the best method to classify three movements of fingers (thumb movement, index finger movement, and first movement). Two-stage logistic regression classifier exhibited the highest classification accuracy while Power Spectral Density (PSD) was used as a feature of the filtered signal. The EEG signal data set was recorded using a 14-channel electrode headset (a noninvasive BCI system) from right-handed, neurologically intact volunteers. Mu (commonly known as alpha waves) and Beta Rhythms (8–30 Hz) containing most of the movement data were retained through filtering using “Arduino Uno” microcontroller followed by 2-stage logistic regression to obtain a mean classification accuracy of 70%.

  19. Design of Embedded System for Multivariate Classification of Finger and Thumb Movements Using EEG Signals for Control of Upper Limb Prosthesis.

    Science.gov (United States)

    Rashid, Nasir; Iqbal, Javaid; Javed, Amna; Tiwana, Mohsin I; Khan, Umar Shahbaz

    2018-01-01

    Brain Computer Interface (BCI) determines the intent of the user from a variety of electrophysiological signals. These signals, Slow Cortical Potentials, are recorded from scalp, and cortical neuronal activity is recorded by implanted electrodes. This paper is focused on design of an embedded system that is used to control the finger movements of an upper limb prosthesis using Electroencephalogram (EEG) signals. This is a follow-up of our previous research which explored the best method to classify three movements of fingers (thumb movement, index finger movement, and first movement). Two-stage logistic regression classifier exhibited the highest classification accuracy while Power Spectral Density (PSD) was used as a feature of the filtered signal. The EEG signal data set was recorded using a 14-channel electrode headset (a noninvasive BCI system) from right-handed, neurologically intact volunteers. Mu (commonly known as alpha waves) and Beta Rhythms (8-30 Hz) containing most of the movement data were retained through filtering using "Arduino Uno" microcontroller followed by 2-stage logistic regression to obtain a mean classification accuracy of 70%.

  20. Multi-level discriminative dictionary learning with application to large scale image classification.

    Science.gov (United States)

    Shen, Li; Sun, Gang; Huang, Qingming; Wang, Shuhui; Lin, Zhouchen; Wu, Enhua

    2015-10-01

    The sparse coding technique has shown flexibility and capability in image representation and analysis. It is a powerful tool in many visual applications. Some recent work has shown that incorporating the properties of the task (such as discrimination for a classification task) into dictionary learning is effective for improving accuracy. However, traditional supervised dictionary learning methods suffer from high computational complexity when dealing with a large number of categories, making them less satisfactory in large scale applications. In this paper, we propose a novel multi-level discriminative dictionary learning method and apply it to large scale image classification. Our method takes advantage of hierarchical category correlation to encode multi-level discriminative information. Each internal node of the category hierarchy is associated with a discriminative dictionary and a classification model. The dictionaries at different layers are learnt to capture the information of different scales. Moreover, each node at lower layers also inherits the dictionary of its parent, so that the categories at lower layers can be described with multi-scale information. The learning of dictionaries and associated classification models is jointly conducted by minimizing an overall tree loss. The experimental results on challenging data sets demonstrate that our approach achieves excellent accuracy and competitive computation cost compared with other sparse coding methods for large scale image classification.
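The building block at each node of the hierarchy, a dictionary plus a classifier over the sparse codes, can be sketched with scikit-learn. This is a simplification: the paper trains dictionaries and classifiers jointly under a tree loss, whereas the sketch below learns the dictionary unsupervised and fits the classifier afterwards, on synthetic data built from disjoint latent atoms.

```python
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Two synthetic categories generated from disjoint sets of latent atoms.
atoms = rng.normal(0, 1, (6, 20))

def make_class(idx, n):
    codes = np.abs(rng.normal(0, 1, (n, len(idx))))
    return codes @ atoms[idx] + 0.05 * rng.normal(0, 1, (n, 20))

X = np.vstack([make_class([0, 1, 2], 150), make_class([3, 4, 5], 150)])
y = np.r_[np.zeros(150), np.ones(150)]

# One "node": learn a dictionary, then train a classifier on the sparse
# codes produced by coding each sample against it.
dico = MiniBatchDictionaryLearning(n_components=8, alpha=0.5, random_state=0)
codes = dico.fit_transform(X)

Xtr, Xte, ytr, yte = train_test_split(
    codes, y, test_size=0.3, random_state=0, stratify=y)
clf = LogisticRegression(max_iter=1000).fit(Xtr, ytr)
print("accuracy on sparse codes:", round(clf.score(Xte, yte), 2))
```

In the multi-level method, each child node would additionally inherit its parent's dictionary atoms, so lower-level categories are coded against multi-scale information rather than a single flat dictionary.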

  1. 76 FR 80278 - Revision of Cotton Classification Procedures for Determining Cotton Leaf Grade

    Science.gov (United States)

    2011-12-23

    ... challenge to the provisions of this rule. Regulatory Flexibility Act Pursuant to requirements set forth in... currently part of the official USDA cotton classification. Accurate assignment of leaf grade is of economic... cost factor associated with its removal. Furthermore, since small leaf particles cannot always be...

  2. PASTEC: an automatic transposable element classification tool.

    Directory of Open Access Journals (Sweden)

    Claire Hoede

    Full Text Available SUMMARY: The classification of transposable elements (TEs) is a key step towards deciphering their potential impact on the genome. However, this process is often based on manual sequence inspection by TE experts. With the wealth of genomic sequences now available, this task requires automation, making it accessible to most scientists. We propose a new tool, PASTEC, which classifies TEs by searching for structural features and similarities. This tool outperforms currently available software for TE classification. The main innovation of PASTEC is the search for HMM profiles, which is useful for inferring the classification of unknown TEs on the basis of conserved functional domains of the proteins. In addition, PASTEC is the only tool providing an exhaustive spectrum of possible classifications to the order level of the Wicker hierarchical TE classification system. It can also automatically classify other repeated elements, such as SSRs (Simple Sequence Repeats), rDNA, or potential repeated host genes. Finally, the output of this new tool is designed to facilitate manual curation by providing biologists with all the evidence accumulated for each TE consensus. AVAILABILITY: PASTEC is available as a REPET module or standalone software (http://urgi.versailles.inra.fr/download/repet/REPET_linux-x64-2.2.tar.gz). It requires a Unix-like system. There are two standalone versions, one of which is parallelized (requiring Sun Grid Engine or Torque) and the other of which is not.

  3. Handling Imbalanced Data Sets in Multistage Classification

    Science.gov (United States)

    López, M.

    Multistage classification is a logical approach, based on a divide-and-conquer solution, for dealing with problems with a high number of classes. The classification problem is divided into several sequential steps, each one associated to a single classifier that works with subgroups of the original classes. In each level, the current set of classes is split into smaller subgroups of classes until the subgroups are composed of only one class. The resulting chain of classifiers can be represented as a tree, which (1) simplifies the classification process by using fewer categories in each classifier and (2) makes it possible to combine several algorithms or use different attributes in each stage. Most classification algorithms can be biased in the sense of selecting the most populated class in overlapping areas of the input space. This can degrade a multistage classifier's performance if the training set sample frequencies do not reflect the real prevalence in the population. Several techniques, such as applying prior probabilities, assigning weights to the classes, or replicating instances, have been developed to overcome this handicap. Most of them are designed for two-class (accept-reject) problems. In this article, we evaluate several of these techniques as applied to multistage classification and analyze how they can be useful for astronomy. We compare the results obtained by classifying a data set based on Hipparcos with and without these methods.
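
    Of the rebalancing techniques listed, instance replication is the simplest to sketch; the helper below (a hypothetical name) oversamples minority classes until the training set is balanced.

```python
import random
from collections import Counter

def oversample(X, y, seed=0):
    """Replicate minority-class instances until every class matches the majority count."""
    rng = random.Random(seed)
    counts = Counter(y)
    target = max(counts.values())
    Xb, yb = list(X), list(y)
    for cls, n in counts.items():
        idx = [i for i, lab in enumerate(y) if lab == cls]
        for _ in range(target - n):
            j = rng.choice(idx)
            Xb.append(X[j])
            yb.append(y[j])
    return Xb, yb

# Three 'a' instances vs. one 'b': the 'b' instance gets replicated twice.
Xb, yb = oversample([[0], [1], [2], [3]], ['a', 'a', 'a', 'b'])
```

    Applied per stage, this keeps each node's classifier from simply favoring the locally dominant class.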

  4. Status of Foreground and Instrument Challenges for 21cm EoR experiments - Design Strategies for SKA and HERA

    Science.gov (United States)

    Thyagarajan, Nithyanandan

    2018-05-01

    Direct detection of the Epoch of Reionization (EoR) via the redshifted 21 cm line of H I will reveal the nature of the first stars and galaxies as well as revolutionize our understanding of a poorly explored evolutionary phase of the Universe. Projects such as the MWA, LOFAR, and PAPER commenced in the last decade with the promise of high-significance statistical detection of the EoR, but have so far only weakly constrained models owing to unforeseen challenges from bright foreground sources and instrument systematics. It is essential for next-generation instruments like HERA and the SKA to have these challenges addressed. I present an analysis of these challenges - wide-field measurements, antenna beam chromaticity, reflections in the instrument, and antenna position errors - along with performance specifications and design solutions that will be critical to designing successful next-generation instruments, enabling the first detection and placing meaningful constraints on reionization models.

  5. Comparing Linear Discriminant Function with Logistic Regression for the Two-Group Classification Problem.

    Science.gov (United States)

    Fan, Xitao; Wang, Lin

    The Monte Carlo study compared the performance of predictive discriminant analysis (PDA) and that of logistic regression (LR) for the two-group classification problem. Prior probabilities were used for classification, but the cost of misclassification was assumed to be equal. The study used a fully crossed three-factor experimental design (with…
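
    Since the abstract is truncated, the contrast can still be made concrete: a two-group predictive discriminant reduces to Fisher's linear discriminant function, sketched below with a pooled covariance and midpoint cutoff (the textbook form, not the study's exact Monte Carlo design).

```python
import numpy as np

def fisher_ldf(X0, X1):
    """Two-group linear discriminant: w = S_pooled^{-1}(m1 - m0), midpoint cutoff c.
    Classify x into group 1 when w @ x > c (equal priors and costs assumed)."""
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    S = ((X0 - m0).T @ (X0 - m0) + (X1 - m1).T @ (X1 - m1)) / (len(X0) + len(X1) - 2)
    w = np.linalg.solve(S, m1 - m0)
    c = w @ (m0 + m1) / 2
    return w, c

# Two well-separated clusters: points near each mean fall on the correct side.
X0 = np.array([[0, 0], [1, 0], [0, 1], [1, 1]], dtype=float)
X1 = X0 + 5.0
w, c = fisher_ldf(X0, X1)
```

    Logistic regression instead models P(group 1 | x) directly and makes no normality assumption, which is one axis of the comparison.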

  6. Feature extraction based on extended multi-attribute profiles and sparse autoencoder for remote sensing image classification

    Science.gov (United States)

    Teffahi, Hanane; Yao, Hongxun; Belabid, Nasreddine; Chaib, Souleyman

    2018-02-01

    Satellite images with very high spatial resolution have recently been widely used in image classification, which has become a challenging task in the remote sensing field. Due to a number of limitations, such as the redundancy of features and the high dimensionality of the data, different classification methods have been proposed for remote sensing image classification, particularly methods using feature extraction techniques. This paper proposes a simple, efficient method exploiting the capability of extended multi-attribute profiles (EMAP) combined with a sparse autoencoder (SAE) for remote sensing image classification. The proposed method classifies various remote sensing datasets, including hyperspectral and multispectral images, by extracting spatial and spectral features based on the combination of EMAP and SAE, linking them to a kernel support vector machine (SVM) for classification. Experiments on the new hyperspectral image "Houston data" and the multispectral image "Washington DC data" show that this new scheme can achieve better feature learning performance than primitive features, traditional classifiers, and an ordinary autoencoder, and has great potential to achieve higher classification accuracy in a short running time.

  7. Multi-q pattern classification of polarization curves

    Science.gov (United States)

    Fabbri, Ricardo; Bastos, Ivan N.; Neto, Francisco D. Moura; Lopes, Francisco J. P.; Gonçalves, Wesley N.; Bruno, Odemir M.

    2014-02-01

    Several experimental measurements are expressed in the form of one-dimensional profiles, for which there is a scarcity of methodologies able to classify the pertinence of a given result to a specific group. The polarization curves that evaluate the corrosion kinetics of electrodes in corrosive media are applications where the behavior is chiefly analyzed from profiles. Polarization curves are indeed a classic method to determine the global kinetics of metallic electrodes, but the strong nonlinearity from different metals and alloys can overlap, and the discrimination becomes a challenging problem. Moreover, even finding a typical curve from replicated tests requires subjective judgment. In this paper, we used the so-called multi-q approach, based on Tsallis statistics, in a classification engine to separate the multiple polarization curve profiles of two stainless steels. We collected 48 experimental polarization curves in an aqueous chloride medium for two stainless steel types with different resistance against localized corrosion. Multi-q pattern analysis was then carried out on a wide potential range, from the cathodic up to the anodic region. Excellent classification rates were obtained: 90%, 80%, and 83% for the low (cathodic), high (anodic), and combined potential ranges, respectively, using only 2% of the original profile data. These results show the potential of the proposed approach towards efficient, robust, systematic and automatic classification of highly nonlinear profile curves.

  8. Classification of Breast Cancer Subtypes by combining Gene Expression and DNA Methylation Data

    DEFF Research Database (Denmark)

    List, Markus; Hauschild, Anne-Christin; Tan, Qihua

    2014-01-01

    expression data for hundreds of patients, the challenge is to extract a minimal optimal set of genes with good prognostic properties from a large bulk of genes making a moderate contribution to classification. Several studies have successfully applied machine learning algorithms to solve this so-called gene...... on the transcriptomic, but also on an epigenetic level. We compared so-called random forest derived classification models based on gene expression and methylation data alone, to a model based on the combined features and to a model based on the gold standard PAM50. We obtained bootstrap errors of 10...

  9. Enhanced land use/cover classification of heterogeneous tropical landscapes using support vector machines and textural homogeneity

    Science.gov (United States)

    Paneque-Gálvez, Jaime; Mas, Jean-François; Moré, Gerard; Cristóbal, Jordi; Orta-Martínez, Martí; Luz, Ana Catarina; Guèze, Maximilien; Macía, Manuel J.; Reyes-García, Victoria

    2013-08-01

    Land use/cover classification is a key research field in remote sensing and land change science as thematic maps derived from remotely sensed data have become the basis for analyzing many socio-ecological issues. However, land use/cover classification remains a difficult task and it is especially challenging in heterogeneous tropical landscapes where nonetheless such maps are of great importance. The present study aims at establishing an efficient classification approach to accurately map all broad land use/cover classes in a large, heterogeneous tropical area, as a basis for further studies (e.g., land use/cover change, deforestation and forest degradation). Specifically, we first compare the performance of parametric (maximum likelihood), non-parametric (k-nearest neighbor and four different support vector machines - SVM), and hybrid (unsupervised-supervised) classifiers, using hard and soft (fuzzy) accuracy assessments. We then assess, using the maximum likelihood algorithm, what textural indices from the gray-level co-occurrence matrix lead to greater classification improvements at the spatial resolution of Landsat imagery (30 m), and rank them accordingly. Finally, we use the textural index that provides the most accurate classification results to evaluate whether its usefulness varies significantly with the classifier used. We classified imagery corresponding to dry and wet seasons and found that SVM classifiers outperformed all the rest. We also found that the use of some textural indices, but particularly homogeneity and entropy, can significantly improve classifications. We focused on the use of the homogeneity index, which has so far been neglected in land use/cover classification efforts, and found that this index along with reflectance bands significantly increased the overall accuracy of all the classifiers, but particularly of SVM. We observed that improvements in producer's and user's accuracies through the inclusion of homogeneity were different
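
    The homogeneity index singled out above comes from the gray-level co-occurrence matrix (GLCM); a compact sketch for one pixel offset on an already-quantized image follows (production workflows would use an optimized library implementation such as scikit-image's).

```python
import numpy as np

def glcm_homogeneity(img, levels, dx=1, dy=0):
    """Homogeneity = sum_ij P(i, j) / (1 + |i - j|) over co-occurring gray levels."""
    glcm = np.zeros((levels, levels))
    h, w = img.shape
    for y in range(h - dy):
        for x in range(w - dx):
            glcm[img[y, x], img[y + dy, x + dx]] += 1   # count neighbor pairs
    glcm /= glcm.sum()                                   # normalize to probabilities
    i, j = np.indices(glcm.shape)
    return float((glcm / (1.0 + np.abs(i - j))).sum())
```

    A perfectly uniform window scores 1.0, while a checkerboard, where every horizontal neighbor differs, scores 0.5, so the index rises with local smoothness.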

  10. The front end electronics of the NA62 Gigatracker: challenges, design and experimental measurements

    Science.gov (United States)

    Noy, M.; Aglieri Rinella, G.; Ceccucci, A.; Dellacasa, G.; Fiorini, M.; Garbolino, S.; Jarron, P.; Kaplon, J.; Kluge, A.; Marchetto, F.; Martin, E.; Mazza, G.; Martoiu, S.; Morel, M.; Perktold, L.; Rivetti, A.; Tiuraniemi, S.

    2011-06-01

    The beam spectrometer of the NA62 experiment consists of 3 Gigatracker (GTK) stations. Each station comprises a pixel detector of 16 cm² active area made of an assembly of 10 readout ASICs bump bonded to a 200 μm thick silicon pixel sensor, comprising 18000 pixels of 300 μm×300 μm. The main challenge of the NA62 pixel GTK station is the combination of an extremely high kaon/pion beam rate, where the intensity in the center of the beam reaches up to 1.5 Mhit s⁻¹ mm⁻², together with an extreme time resolution of 100 ps. To date, it is the first silicon tracking system with this time resolution. To face this challenge, the pixel analogue front end has been designed with a peaking time of 4 ns, with a planar silicon sensor operating up to 300 V over depletion. Moreover, the radiation level is severe, 2×10¹⁴ 1 MeV neutron equivalent cm⁻² per year of operation. Easy replacement of the GTK stations is foreseen as a design requirement. The amount of material of a single station should also be less than 0.5% X₀ to minimize the background, which imposes strong constraints on the mechanics and the cooling system. We report upon the design and architecture of the 2 prototype demonstrator chips, both designed in 130 nm CMOS technology: one with a constant fraction discriminator and the time stamp digitisation in each pixel (In-Pixel), and the other with a time-over-threshold discriminator and the processing of the time stamp located in the End of Column (EoC) region at the chip periphery. Some preliminary results are presented.

  11. CLASSIFICATION ALGORITHMS FOR BIG DATA ANALYSIS, A MAP REDUCE APPROACH

    Directory of Open Access Journals (Sweden)

    V. A. Ayma

    2015-03-01

    Full Text Available For many years, the scientific community has been concerned with how to increase the accuracy of different classification methods, and major achievements have been made so far. Besides this issue, the increasing amount of data that is generated every day by remote sensors raises more challenges to be overcome. In this work, a tool within the scope of the InterIMAGE Cloud Platform (ICP), an open-source, distributed framework for automatic image interpretation, is presented. The tool, named ICP: Data Mining Package, is able to perform supervised classification procedures on huge amounts of data, usually referred to as big data, on a distributed infrastructure using Hadoop MapReduce. The tool has four classification algorithms implemented, taken from WEKA's machine learning library, namely: Decision Trees, Naïve Bayes, Random Forest, and Support Vector Machines (SVM). The results of an experimental analysis using an SVM classifier on data sets of different sizes for different cluster configurations demonstrate the potential of the tool, as well as aspects that affect its performance.
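
    The MapReduce pattern behind the tool can be caricatured in a few lines: each map task classifies its partition and the reduce step merges the partial results (a pure-Python stand-in, with no Hadoop or WEKA involved).

```python
from collections import Counter
from functools import reduce

def map_stage(chunk, predict):
    """Map: a worker classifies its partition and emits per-label counts."""
    return Counter(predict(x) for x in chunk)

def reduce_stage(partials):
    """Reduce: merge the per-partition label counts into a global tally."""
    return reduce(lambda a, b: a + b, partials, Counter())

predict = lambda x: 'urban' if x > 0 else 'water'   # stand-in for a trained classifier
chunks = [[1, -1], [2, 3]]                          # two data partitions
totals = reduce_stage([map_stage(c, predict) for c in chunks])
```

    The framework's value is that the map stage runs on many nodes in parallel, so classification time scales with the largest partition rather than the whole data set.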

  12. Common occupational classification system - revision 3

    Energy Technology Data Exchange (ETDEWEB)

    Stahlman, E.J.; Lewis, R.E.

    1996-05-01

    Workforce planning has become an increasing concern within the DOE community as the Office of Environmental Restoration and Waste Management (ER/WM or EM) seeks to consolidate and refocus its activities and the Office of Defense Programs (DP) closes production sites. Attempts to manage the growth and skills mix of the EM workforce while retaining the critical skills of the DP workforce have been difficult due to the lack of a consistent set of occupational titles and definitions across the complex. Two reasons for this difficulty may be cited. First, classification systems commonly used in industry often fail to cover in sufficient depth the unique demands of DOE`s nuclear energy and research community. Second, the government practice of contracting the operation of government facilities to the private sector has introduced numerous contractor-specific classification schemes to the DOE complex. As a result, sites/contractors report their workforce needs using unique classification systems. It becomes difficult, therefore, to roll these data up to the national level necessary to support strategic planning and analysis. The Common Occupational Classification System (COCS) is designed to overcome these workforce planning barriers. The COCS is based on earlier workforce planning activities and the input of technical, workforce planning, and human resource managers from across the DOE complex. It provides a set of mutually-exclusive occupation titles and definitions that cover the broad range of activities present in the DOE complex. The COCS is not a required record-keeping or data management guide. Neither is it intended to replace contractor/DOE-specific classification systems. Instead, the system provides a consistent, high- level, functional structure of occupations to which contractors can crosswalk (map) their job titles.

  13. Deep multi-scale convolutional neural network for hyperspectral image classification

    Science.gov (United States)

    Zhang, Feng-zhe; Yang, Xia

    2018-04-01

    In this paper, we propose a multi-scale convolutional neural network for the hyperspectral image classification task. Firstly, in contrast to conventional convolution, we utilize multi-scale convolutions, which possess larger receptive fields, to extract the spectral features of hyperspectral images. We design a deep neural network with a multi-scale convolution layer that contains 3 different convolution kernel sizes. Secondly, to avoid overfitting of the deep neural network, dropout is utilized, which randomly deactivates neurons and slightly improves classification accuracy. In addition, recent deep learning techniques such as ReLU activations are utilized. We conduct experiments on the University of Pavia and Salinas datasets, and obtain better classification accuracy compared with other methods.
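
    The multi-scale idea, several kernel sizes applied to the same spectrum with the responses concatenated, can be illustrated without a deep learning framework; the moving-average "kernels" below are a toy stand-in for learned convolution filters.

```python
import numpy as np

def multiscale_features(spectrum, kernel_sizes=(3, 5, 7)):
    """Concatenate valid-mode convolution responses at several kernel sizes."""
    feats = []
    for k in kernel_sizes:
        kernel = np.ones(k) / k            # toy filter; a CNN would learn these weights
        feats.append(np.convolve(spectrum, kernel, mode='valid'))
    return np.concatenate(feats)

out = multiscale_features(np.arange(10.0))
```

    For a length-10 input, the valid outputs have lengths 8, 6, and 4, so the feature vector has 18 entries, each scale seeing a different spectral context.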

  14. Thermal Protection for Mars Sample Return Earth Entry Vehicle: A Grand Challenge for Design Methodology and Reliability Verification

    Science.gov (United States)

    Venkatapathy, Ethiraj; Gage, Peter; Wright, Michael J.

    2017-01-01

    Mars Sample Return is our Grand Challenge for the coming decade. TPS (Thermal Protection System) nominal performance is not the key challenge. The main difficulty for designers is the need to verify unprecedented reliability for the entry system: current guidelines for prevention of backward contamination require that the probability of spores larger than 1 micron diameter escaping into the Earth environment be lower than 1/1,000,000 for the entire system, and the allocation to TPS would be more stringent than that. For reference, the reliability allocation for Orion TPS is closer to 1/1000, and the demonstrated reliability for previous human Earth return systems was closer to 1/100. Improving reliability by more than 3 orders of magnitude is a grand challenge indeed. The TPS community must embrace the possibility of new architectures that are focused on reliability above thermal performance and mass efficiency. The MSR (Mars Sample Return) EEV (Earth Entry Vehicle) will be hit with MMOD (Micrometeoroid and Orbital Debris) prior to reentry. A chute-less aero-shell design which allows for a self-righting shape was baselined in prior MSR studies, with the assumption that a passive system will maximize EEV robustness. Hence the aero-shell, along with the TPS, has to withstand ground impact without breaking apart. System verification will require testing to establish ablative performance and thermal failure, but also testing of damage from MMOD and of structural performance at ground impact. Mission requirements will demand analysis, testing, and verification that are focused on establishing the reliability of the design. In this proposed talk, we will focus on the grand challenge of MSR EEV TPS and the need for innovative approaches to address challenges in modeling, testing, manufacturing, and verification.

  15. Structural health monitoring feature design by genetic programming

    International Nuclear Information System (INIS)

    Harvey, Dustin Y; Todd, Michael D

    2014-01-01

    Structural health monitoring (SHM) systems provide real-time damage and performance information for civil, aerospace, and other high-capital or life-safety critical structures. Conventional data processing involves pre-processing and extraction of low-dimensional features from in situ time series measurements. The features are then input to a statistical pattern recognition algorithm to perform the relevant classification or regression task necessary to facilitate decisions by the SHM system. Traditional design of signal processing and feature extraction algorithms can be an expensive and time-consuming process requiring extensive system knowledge and domain expertise. Genetic programming, a heuristic program search method from evolutionary computation, was recently adapted by the authors to perform automated, data-driven design of signal processing and feature extraction algorithms for statistical pattern recognition applications. The proposed method, called Autofead, is particularly suitable to handle the challenges inherent in algorithm design for SHM problems where the manifestation of damage in structural response measurements is often unclear or unknown. Autofead mines a training database of response measurements to discover information-rich features specific to the problem at hand. This study provides experimental validation on three SHM applications including ultrasonic damage detection, bearing damage classification for rotating machinery, and vibration-based structural health monitoring. Performance comparisons with common feature choices for each problem area are provided demonstrating the versatility of Autofead to produce significant algorithm improvements on a wide range of problems. (paper)

  16. Design Anthropology in Participatory Design

    DEFF Research Database (Denmark)

    Smith, Rachel Charlotte; Gislev Kjærsgaard, Mette

    2014-01-01

    In this workshop we explore the opportunities of ethnography and design anthropology in Participatory Design (PD) as an approach to design in an increasingly global and digital world. Traditionally, ethnography has been used in PD to research real-life contexts and challenges, and as ways...... opportunities of using design anthropology as a holistic and critical approach to societal challenges, and a way for anthropologists and designers to engage in design that extends beyond the empirical....

  17. A novel neural network based image reconstruction model with scale and rotation invariance for target identification and classification for Active millimetre wave imaging

    Science.gov (United States)

    Agarwal, Smriti; Bisht, Amit Singh; Singh, Dharmendra; Pathak, Nagendra Prasad

    2014-12-01

    Millimetre wave (MMW) imaging is gaining tremendous interest among researchers, with potential applications in security checks, standoff personnel screening, automotive collision avoidance, and more. Current state-of-the-art imaging techniques, viz. microwave and X-ray imaging, suffer from lower resolution and harmful ionizing radiation, respectively. In contrast, MMW imaging operates at lower power and is non-ionizing, hence medically safe. Despite these favourable attributes, MMW imaging encounters various challenges: it is still a sparsely explored area and lacks a suitable imaging methodology for extracting complete target information. Keeping these challenges in view, an MMW active imaging radar system at 60 GHz was designed for standoff imaging applications. A C-scan (horizontal and vertical scanning) methodology was developed that provides a cross-range resolution of 8.59 mm. The paper further details a suitable target identification and classification methodology. For identification of regular-shape targets, a mean-standard-deviation-based segmentation technique was formulated and validated using a different target shape. For classification, a probability density function based target material discrimination methodology was proposed and validated on a different dataset. Lastly, a novel artificial neural network based, scale- and rotation-invariant image reconstruction methodology has been proposed to counter the distortions in the image caused by noise, rotation, or scale variations. The designed neural network, once trained with sample images, automatically takes care of these deformations and successfully reconstructs the corrected image for the test targets. The techniques developed in this paper are tested and validated using four different regular shapes, viz. rectangle, square, triangle, and circle.
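
    The mean-standard-deviation segmentation rule can be read as a global intensity threshold; the sketch below is one plausible interpretation (the abstract does not give the exact rule, and k is a free parameter here).

```python
import numpy as np

def mean_std_segment(img, k=1.0):
    """Mark pixels brighter than mean + k*std as target."""
    return img > img.mean() + k * img.std()

scene = np.zeros((10, 10))
scene[4:6, 4:6] = 10.0          # a bright 2x2 "target" on a dark background
mask = mean_std_segment(scene)
```

    On this toy scene the threshold lands well below the target intensity, so exactly the four bright pixels are flagged.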

  18. Classification of subsurface objects using singular values derived from signal frames

    Science.gov (United States)

    Chambers, David H; Paglieroni, David W

    2014-05-06

    The classification system represents a detected object with a feature vector derived from the return signals acquired by an array of N transceivers operating in multistatic mode. The classification system generates the feature vector by transforming the real-valued return signals into complex-valued spectra, using, for example, a Fast Fourier Transform. The classification system then generates a feature vector of singular values for each user-designated spectral sub-band by applying a singular value decomposition (SVD) to the N×N square complex-valued matrix formed from sub-band samples associated with all possible transmitter-receiver pairs. The resulting feature vector of singular values may be transformed into a feature vector of singular value likelihoods and then subjected to a multi-category linear or neural network classifier for object classification.
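
    The pipeline described, an FFT per transmitter-receiver pair, an N×N sub-band matrix, then singular values as features, can be sketched directly in NumPy (the sub-band is collapsed to its mean here purely for brevity).

```python
import numpy as np

def subband_svd_features(returns, n, band):
    """returns: (n*n, samples) real return signals, one row per tx-rx pair (tx-major).
    band: slice selecting the spectral sub-band. Output: descending singular values."""
    spectra = np.fft.rfft(returns, axis=1)     # real signals -> complex spectra
    sub = spectra[:, band].mean(axis=1)        # one complex value per pair
    M = sub.reshape(n, n)                      # N x N complex pair matrix
    return np.linalg.svd(M, compute_uv=False)  # feature vector of singular values

rng = np.random.default_rng(0)
sv = subband_svd_features(rng.standard_normal((9, 64)), 3, slice(4, 8))
```

    NumPy returns the singular values already sorted in descending order, which gives the classifier a permutation-stable feature vector.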

  19. Segmentation and classification of colon glands with deep convolutional neural networks and total variation regularization

    Directory of Open Access Journals (Sweden)

    Philipp Kainz

    2017-10-01

    Full Text Available Segmentation of histopathology sections is a necessary preprocessing step for digital pathology. Due to the large variability of biological tissue, machine learning techniques have shown superior performance over conventional image processing methods. Here we present our deep neural network-based approach for segmentation and classification of glands in tissue of benign and malignant colorectal cancer, which was developed to participate in the GlaS@MICCAI2015 colon gland segmentation challenge. We use two distinct deep convolutional neural networks (CNNs) for pixel-wise classification of Hematoxylin-Eosin stained images. While the first classifier separates glands from background, the second classifier identifies gland-separating structures. In a subsequent step, a figure-ground segmentation based on weighted total variation produces the final segmentation result by regularizing the CNN predictions. We present both quantitative and qualitative segmentation results on the recently released and publicly available Warwick-QU colon adenocarcinoma dataset associated with the GlaS@MICCAI2015 challenge and compare our approach to the other approaches developed simultaneously for the same challenge. On two test sets, we demonstrate our segmentation performance and show that we achieve a tissue classification accuracy of 98% and 95%, making use of the inherent capability of our system to distinguish between benign and malignant tissue. Our results show that deep learning approaches can yield highly accurate and reproducible results for biomedical image analysis, with the potential to significantly improve the quality and speed of medical diagnoses.

  20. Classification of heterogeneous electron microscopic projections into homogeneous subsets

    International Nuclear Information System (INIS)

    Herman, G.T.; Kalinowski, M.

    2008-01-01

    The co-existence of different states of a macromolecular complex in samples used by three-dimensional electron microscopy (3D-EM) constitutes a serious challenge. The single particle method applied directly to such heterogeneous sets is unable to provide useful information about the encountered conformational diversity and produces reconstructions with severely reduced resolution. One approach to solving this problem is to partition the heterogeneous projection set into homogeneous components and apply existing reconstruction techniques to each of them. Due to the nature of the projection images and the high noise level present in them, this classification task is difficult. A method is presented to achieve the desired classification by using a novel image similarity measure and solving the corresponding optimization problem. Unlike the majority of competing approaches, the presented method employs unsupervised classification (it does not require any prior knowledge about the objects being classified) and does not involve a 3D reconstruction procedure. We demonstrate a fast implementation of this method, capable of classifying projection sets that originate from 3D-EM. The method's performance is evaluated on synthetically generated data sets produced by projecting 3D objects that resemble biological structures.

  1. The Spectrometer/Telescope for Imaging X-rays on Solar Orbiter: Flight design, challenges and trade-offs

    International Nuclear Information System (INIS)

    Krucker, S.; Bednarzik, M.; Grimm, O.; Hurford, G.J.; Limousin, O.; Meuris, A.; Orleański, P.; Seweryn, K.; Skup, K.R.

    2016-01-01

    STIX is the X-ray spectral imaging instrument on-board the Solar Orbiter space mission of the European Space Agency, and together with nine other instruments will address questions of the interaction between the Sun and the heliosphere. STIX will study the properties of thermal and accelerated electrons near the Sun through their Bremsstrahlung X-ray emission, addressing in particular the emission from flaring regions on the Sun. The design phase of STIX has been concluded. This paper reports the final flight design of the instrument, focusing on design challenges that were faced recently and how they were addressed.

  2. Creating wheelchair-controlled video games: challenges and opportunities when involving young people with mobility impairments and game design experts

    OpenAIRE

    Gerling, Kathrin; Linehan, Conor; Kirman, Ben; Kalyn, Michael; Evans, Adam; Hicks, Kieran

    2016-01-01

    Although participatory design (PD) is currently the most acceptable and respectful process we have for designing technology, recent discussions suggest that there may be two barriers to the successful application of PD to the design of digital games: First, the involvement of audiences with special needs can introduce new practical and ethical challenges to the design process. Second, the use of non-experts in game design roles has been criticised in that participants lack skills necessary to...

  3. Classification, disease, and diagnosis.

    Science.gov (United States)

    Jutel, Annemarie

    2011-01-01

    Classification shapes medicine and guides its practice. Understanding classification must be part of the quest to better understand the social context and implications of diagnosis. Classifications are part of the human work that provides a foundation for the recognition and study of illness: deciding how the vast expanse of nature can be partitioned into meaningful chunks, stabilizing and structuring what is otherwise disordered. This article explores the aims of classification, their embodiment in medical diagnosis, and the historical traditions of medical classification. It provides a brief overview of the aims and principles of classification and their relevance to contemporary medicine. It also demonstrates how classifications operate as social framing devices that enable and disable communication, assert and refute authority, and are important items for sociological study.

  4. U.S. Geological Survey ArcMap Sediment Classification tool

    Science.gov (United States)

    O'Malley, John

    2007-01-01

    The U.S. Geological Survey (USGS) ArcMap Sediment Classification tool is a custom toolbar that extends the Environmental Systems Research Institute, Inc. (ESRI) ArcGIS 9.2 Desktop application to aid in the analysis of seabed sediment classification. The tool takes as input either a point data layer with field attributes containing percentages of gravel, sand, silt, and clay, or four raster data layers each representing a sediment percentage (0-100%) for the four grain-size fractions: gravel, sand, silt, and clay. The tool is designed to analyze the percentages of sediment at a given location and classify the sediments according to either the Folk (1954, 1974) or the Shepard (1954) classification scheme as modified by Schlee (1973). The sediment analysis tool is based upon the USGS SEDCLASS program (Poppe et al., 2004).
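
    The record above describes classification driven by grain-size percentages. As a rough illustration only (this is not the SEDCLASS logic, and it is far simpler than the real Folk or Shepard ternary-diagram boundaries), a dominant-fraction classifier might look like:

```python
def classify_sediment(gravel, sand, silt, clay):
    """Toy sediment classifier: the dominant grain-size fraction names the
    class when it holds at least half the sample; otherwise the sample is
    'mixed'. The real Folk/Shepard schemes use ternary-diagram boundaries
    and gravel ratios, so treat this purely as a sketch."""
    fractions = {"gravel": gravel, "sand": sand, "silt": silt, "clay": clay}
    total = sum(fractions.values())
    if abs(total - 100.0) > 0.5:
        raise ValueError("grain-size percentages should sum to ~100")
    name, share = max(fractions.items(), key=lambda kv: kv[1])
    return name if share >= 50.0 else "mixed"
```

    A raster implementation would apply the same per-location rule cell by cell.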

  5. Can Automatic Classification Help to Increase Accuracy in Data Collection?

    Directory of Open Access Journals (Sweden)

    Frederique Lang

    2016-09-01

    Full Text Available Purpose: The authors aim at testing the performance of a set of machine learning algorithms that could improve the process of data cleaning when building datasets. Design/methodology/approach: The paper is centered on cleaning datasets gathered from publishers and online resources by the use of specific keywords. In this case, we analyzed data from the Web of Science. The accuracy of various forms of automatic classification was tested here in comparison with manual coding in order to determine their usefulness for data collection and cleaning. We assessed the performance of seven supervised classification algorithms (Support Vector Machine (SVM), Scaled Linear Discriminant Analysis, Lasso and elastic-net regularized generalized linear models, Maximum Entropy, Regression Tree, Boosting, and Random Forest) and analyzed two properties: accuracy and recall. We assessed not only each algorithm individually, but also their combinations through a voting scheme. We also tested the performance of these algorithms with different sizes of training data. When assessing the performance of different combinations, we used an indicator of coverage to account for the agreement and disagreement on classification between algorithms. Findings: We found that the performance of the algorithms used varies with the size of the sample for training. However, for the classification exercise in this paper the best performing algorithms were SVM and Boosting. The combination of these two algorithms achieved a high agreement on coverage and was highly accurate. This combination performs well with a small training dataset (10%), which may reduce the manual work needed for classification tasks. Research limitations: The dataset gathered has significantly more records related to the topic of interest compared to unrelated topics. This may affect the performance of some algorithms, especially in their identification of unrelated papers. Practical implications: Although the
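
    The voting scheme and coverage indicator described above can be sketched in plain Python. The exact weighting the authors used is not specified in this record, so the sketch assumes simple majority voting and defines coverage as full agreement among classifiers:

```python
from collections import Counter

def majority_vote(predictions):
    """predictions: one list of labels per classifier, all the same length.
    Returns the per-sample majority label (ties broken by first-seen label)."""
    return [Counter(votes).most_common(1)[0][0] for votes in zip(*predictions)]

def coverage(predictions):
    """Fraction of samples on which every classifier agrees -- a simple
    stand-in for the agreement indicator described in the abstract."""
    agree = sum(1 for votes in zip(*predictions) if len(set(votes)) == 1)
    return agree / len(predictions[0])
```

    High coverage between two classifiers (e.g. SVM and Boosting) means their combined vote adds little disagreement to resolve, which is the regime the abstract reports as most useful.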

  6. Ethical challenges in developing drugs for psychiatric disorders.

    Science.gov (United States)

    Carrier, Felix; Banayan, David; Boley, Randy; Karnik, Niranjan

    2017-05-01

    As the classification of mental disorders advances towards a disease model as promoted by the National Institute of Mental Health (NIMH) Research Domain Criteria (RDoC), there is hope that a more thorough neurobiological understanding of mental illness may allow clinicians and researchers to determine treatment efficacy with less diagnostic variability. This paradigm shift has presented a variety of ethical issues to be considered in the development of psychiatric drugs. These challenges are not limited to informed consent practices, industry funding, and placebo use. The consideration of alternative research models and the quality of research design also present ethical challenges in the development of psychiatric drugs. The imperative to create valid and sound research that justifies the human time, cost, risk, and use of limited resources must also be considered. Clinical innovation and consideration for special populations are also important aspects to take into account. Given the breadth of these ethical concerns, it is particularly important that scientific questions regarding the development of psychiatric drugs be answered collaboratively by a variety of stakeholders. As the field expands, new ethical considerations will be raised with increased focus on genetic markers, personalized medicine, patient-centered outcomes research, and tension over funding. We suggest that innovation in trial design is necessary to better reflect practices in clinical settings, and that there must be an emphasized focus on expanding the transparency of consent processes, regard for suicidality, and care in working with special populations to support the goal of developing sound psychiatric drug therapies. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  7. Advanced Geophysical Classification with the Marine Towed Array

    Science.gov (United States)

    Steinhurst, D.; Harbaugh, G.; Keiswetter, D.; Bell, T. W.; Massey, G.; Wright, D.

    2017-12-01

    The Marine Towed Array, or MTA, is an underwater dual-mode sensor array that has been successfully deployed at multiple marine venues in support of Strategic Environmental Research and Development Program (SERDP) and Environmental Security Technology Certification Program (ESTCP) demonstrations beginning in 2004. It provided both marine electromagnetic and marine magnetic sensors for detection and mapping of underwater UXO. The electromagnetic induction (EMI) sensor array was based on older technology, which in several ESTCP demonstrations has not been able to support advanced geophysical classification (AGC). Under ESTCP funding, the U.S. Naval Research Laboratory is in the process of upgrading the MTA with modern, advanced EMI electronics and replacing the sensor array with a modern, multistatic array design. A half-scale version of the proposed array has been built and tested on land. Six tri-axial receiver cubes were placed inside two- and three-transmit-coil configurations in positions equivalent to the design locations for the MTA wing. The responses of a variety of munitions items and test spheres were measured over a range of target-to-array geometries, in both static and simulated dynamic data collection modes. The multi-transmit-coil configuration was shown to provide enhanced single-pass classification performance over the original single-coil design, particularly as a function of target location relative to the centerline. The ability to go beyond anomaly detection and additionally classify detected anomalies from survey data would dramatically improve the state of the art for underwater UXO remediation by reducing costs and improving the efficiency of these efforts. Our efforts to return the MTA to service and to validate the new EMI array's design for UXO detection and classification in the underwater environment will be the focus of this presentation.

  8. Design Anthropology in Participatory Design

    DEFF Research Database (Denmark)

    Smith, Rachel Charlotte; Kjærsgaard, Mette Gislev

    2015-01-01

    This focus section explores the opportunities of design anthropology in participatory design as an approach to research and design in an increasingly global and digital world. Traditionally, ethnography has been used in Participatory design to research real-life contexts and challenges, and as ways...... opportunities of using design anthropology as a holistic and critical approach to addressing societal challenges and change, and a way for anthropologists and designers to engage in participatory research and design that extend beyond the empirical....

  9. Standard classification: Physics

    International Nuclear Information System (INIS)

    1977-01-01

    This is a draft standard classification of physics. The conception is based on the physics part of the systematic catalogue of the Bayerische Staatsbibliothek and on the classification given in standard textbooks. The ICSU-AB classification now used worldwide by physics information services was not taken into account. (BJ) [de

  10. Deep Multi-Task Learning for Tree Genera Classification

    Science.gov (United States)

    Ko, C.; Kang, J.; Sohn, G.

    2018-05-01

    The goal of our paper is to classify tree genera using airborne Light Detection and Ranging (LiDAR) data with a Convolutional Neural Network (CNN) Multi-task Network (MTN) implementation. Unlike a Single-task Network (STN), where only one task is assigned to the learning outcome, an MTN is a deep learning architecture for learning a main task (classification of tree genera) together with other tasks (in our study, classification of coniferous versus deciduous) simultaneously, with shared classification features. The main contribution of this paper is to improve classification accuracy from CNN-STN to CNN-MTN. This is achieved by introducing a concurrence loss (Lcd) to the designed MTN. This term regulates the overall network performance by minimizing the inconsistencies between the two tasks. Results show that we can increase the classification accuracy from 88.7% to 91.0% (from STN to MTN). The second goal of this paper is to address the problem of small training sample size by multiple-view data generation. The motivation for this goal is to address one of the most common problems in implementing deep learning architectures: an insufficient amount of training data. We address this problem by simulating the training dataset with a multiple-view approach. The promising results from this paper provide a basis for classifying larger datasets and more classes in the future.
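
    The concurrence idea, pushing the genus head and the coniferous/deciduous head to agree, can be illustrated with a toy penalty. The genus names, the grouping, and the exact form of the paper's Lcd term are assumptions here, not taken from the paper:

```python
# Illustrative grouping of genera into the auxiliary task's classes;
# the paper's actual genera and grouping may differ.
CONIFEROUS_GENERA = {"pine", "spruce"}

def concurrence_penalty(genus_probs, group_probs):
    """Toy concurrence term: compare the coniferous probability implied by
    the genus head's distribution with the probability the group head
    reports. A multi-task loss would add a penalty like this to the two
    tasks' own classification losses."""
    implied_conifer = sum(p for g, p in genus_probs.items()
                          if g in CONIFEROUS_GENERA)
    return abs(implied_conifer - group_probs["coniferous"])
```

    When the two heads are consistent the penalty is zero, so minimizing it regularizes the shared features exactly in the spirit the abstract describes.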

  11. 3D scattering transforms for disease classification in neuroimaging

    Directory of Open Access Journals (Sweden)

    Tameem Adel

    2017-01-01

    Full Text Available Classifying neurodegenerative brain diseases in MRI aims at correctly assigning discrete labels to MRI scans. Such labels usually refer to a diagnostic decision a learner infers based on what it has learned from a training sample of MRI scans. Individual MRI voxels typically do not provide independent evidence towards or against a class; the information relevant for classification is only present in the form of complicated multivariate patterns (or “features”). Deep learning solves this problem by learning a sequence of non-linear transformations that result in feature representations that are better suited to classification. Such learned features have been shown to drastically outperform hand-engineered features in computer vision and audio analysis domains. However, applying the deep learning approach to the task of MRI classification is extremely challenging, because it requires a very large amount of data, which is currently not available. We propose to instead use a three-dimensional scattering transform, which resembles a deep convolutional neural network but has no learnable parameters. Furthermore, the scattering transform linearizes diffeomorphisms (due to, e.g., residual anatomical variability in MRI scans), making the different disease states more easily separable using a linear classifier. In experiments on brain morphometry in Alzheimer's disease, and on white matter microstructural damage in HIV, scattering representations are shown to be highly effective for the task of disease classification. For instance, in semi-supervised learning of progressive versus stable MCI, we reach an accuracy of 82.7%. We also present a visualization method to highlight areas that provide evidence for or against a certain class, both on an individual and group level.

  12. HEp-2 cell image classification method based on very deep convolutional networks with small datasets

    Science.gov (United States)

    Lu, Mengchi; Gao, Long; Guo, Xifeng; Liu, Qiang; Yin, Jianping

    2017-07-01

    Classification of Human Epithelial-2 (HEp-2) cell image staining patterns has been widely used to identify autoimmune diseases via the anti-nuclear antibody (ANA) test in the Indirect Immunofluorescence (IIF) protocol. Because the manual test is time-consuming, subjective, and labor-intensive, image-based Computer Aided Diagnosis (CAD) systems for HEp-2 cell classification are being developed. However, recently proposed methods mostly rely on manual feature extraction and achieve low accuracy. Moreover, the available benchmark datasets are small, which makes them poorly suited to deep learning methods; this directly limits the accuracy of cell classification even after data augmentation. To address these issues, this paper presents a high-accuracy automatic HEp-2 cell classification method for small datasets, utilizing very deep convolutional networks (VGGNet). Specifically, the proposed method consists of three main phases, namely image preprocessing, feature extraction, and classification. Moreover, an improved VGGNet is presented to address the challenges of small-scale datasets. Experimental results over two benchmark datasets demonstrate that the proposed method achieves superior performance in terms of accuracy compared with existing methods.

  13. Improving Remote Sensing Scene Classification by Integrating Global-Context and Local-Object Features

    Directory of Open Access Journals (Sweden)

    Dan Zeng

    2018-05-01

    Full Text Available Recently, many researchers have been dedicated to using convolutional neural networks (CNNs) to extract global-context features (GCFs) for remote-sensing scene classification. Commonly, accurate classification of scenes requires knowledge about both the global context and local objects. However, unlike natural images, in which objects cover most of the image, objects in remote-sensing images are generally small and decentralized. Thus, it is hard for vanilla CNNs to focus on both the global context and small local objects. To address this issue, this paper proposes a novel end-to-end CNN that integrates the GCFs and local-object-level features (LOFs). The proposed network includes two branches, the local object branch (LOB) and the global semantic branch (GSB), which are used to generate the LOFs and GCFs, respectively. The concatenation of features extracted from the two branches allows our method to be more discriminative in scene classification. Extensive experiments on three challenging benchmark remote-sensing datasets show that the proposed approach outperforms existing scene classification methods and achieves state-of-the-art results on all three datasets.
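
    At the feature level, the two-branch design reduces to concatenating the two descriptors before the final classifier. A minimal structural sketch (the feature dimensions, names, and the linear scorer are made up for illustration; the real network uses learned CNN branches and a learned classification layer):

```python
def fuse_features(gcf, lof):
    """Concatenate global-context features (GCFs) and local-object-level
    features (LOFs) into one descriptor, as the two-branch network does
    before its classification layer."""
    return list(gcf) + list(lof)

def score(fused, weights, bias=0.0):
    """Linear scoring over the fused descriptor; stands in for the
    network's fully connected classification layer."""
    return sum(w * x for w, x in zip(weights, fused)) + bias
```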

  14. Seismic Target Classification Using a Wavelet Packet Manifold in Unattended Ground Sensors Systems

    Directory of Open Access Journals (Sweden)

    Enliang Song

    2013-07-01

    Full Text Available One of the most challenging problems in target classification is the extraction of a robust feature that can effectively represent a specific type of target. The use of seismic signals in unattended ground sensor (UGS) systems makes this problem more complicated, because the seismic target signal is non-stationary, geology-dependent, and has a high-dimensional feature space. This paper proposes a new feature extraction algorithm, called the wavelet packet manifold (WPM), which applies the neighborhood preserving embedding (NPE) algorithm of manifold learning to the wavelet packet node energy (WPNE) of seismic signals. By combining non-stationary information and low-dimensional manifold information, WPM provides a more robust representation for seismic target classification. By using a K-nearest-neighbors classifier on the WPM signature, the wavelet packet manifold classification (WPMC) algorithm is proposed. Experimental results show that the proposed WPMC can not only reduce feature dimensionality, but also improve the classification accuracy up to 95.03%. Moreover, compared with state-of-the-art methods, WPMC is more suitable for UGS in terms of recognition ratio and computational complexity.
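
    The final stage of WPMC, a K-nearest-neighbors vote over the low-dimensional WPM signature, is easy to sketch in plain Python. The Euclidean metric and the value of k are assumptions, and the wavelet/manifold feature extraction itself is omitted:

```python
import math
from collections import Counter

def knn_classify(train, query, k=3):
    """train: list of (feature_vector, label) pairs in the reduced feature
    space. Returns the majority label among the k nearest neighbors under
    the Euclidean distance."""
    dist = lambda a, b: math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    nearest = sorted(train, key=lambda pair: dist(pair[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]
```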

  15. Vision-Based Perception and Classification of Mosquitoes Using Support Vector Machine

    Directory of Open Access Journals (Sweden)

    Masataka Fuchida

    2017-01-01

    Full Text Available The need for a novel automated mosquito perception and classification method has become increasingly pressing in recent years, with a steeply increasing number of mosquito-borne diseases and associated casualties. Remote sensing and GIS-based methods exist for mapping potential mosquito habitats and locations that are prone to mosquito-borne diseases, but these methods generally do not account for species-wise identification of mosquitoes in closed-perimeter regions. Traditional methods for mosquito classification involve highly manual processes requiring tedious sample collection and supervised laboratory analysis. In this research work, we present the design and experimental validation of an automated vision-based mosquito classification module that can be deployed in closed-perimeter mosquito habitats. The module is capable of distinguishing mosquitoes from other bugs such as bees and flies by extracting morphological features, followed by support vector machine-based classification. In addition, this paper presents the results of three variants of the support vector machine classifier in the context of the mosquito classification problem. This vision-based approach presents an efficient alternative to the conventional methods for mosquito surveillance, mapping, and sample image collection. Experimental results involving classification between mosquitoes and a predefined set of other bugs using multiple classification strategies demonstrate the efficacy and validity of the proposed approach, with a maximum recall of 98%.
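
    Separating mosquitoes from other bugs by morphological features is, at its core, a margin classification problem. Below is a minimal linear SVM trained by sub-gradient descent on the regularized hinge loss; the feature meanings, hyperparameters, and toy data are illustrative, and the paper's actual pipeline and kernel variants are not reproduced:

```python
import random

def train_linear_svm(xs, ys, epochs=200, lr=0.05, lam=0.01, seed=0):
    """xs: feature vectors (e.g. hypothetical wing/body measurements);
    ys: +1 (mosquito) / -1 (other bug) labels. Minimizes the regularized
    hinge loss with plain stochastic sub-gradient steps."""
    rng = random.Random(seed)
    w = [0.0] * len(xs[0])
    b = 0.0
    for _ in range(epochs):
        order = list(range(len(xs)))
        rng.shuffle(order)
        for i in order:
            margin = ys[i] * (sum(wj * xj for wj, xj in zip(w, xs[i])) + b)
            if margin < 1:  # point inside the margin: push it out
                w = [wj + lr * (ys[i] * xj - lam * wj)
                     for wj, xj in zip(w, xs[i])]
                b += lr * ys[i]
            else:           # satisfied constraint: only shrink the weights
                w = [wj - lr * lam * wj for wj in w]
    return w, b

def predict(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1
```

    A library implementation (e.g. an off-the-shelf SVM with kernels) would replace this sketch in practice; the point here is only the hinge-loss margin mechanics.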

  16. A simplified classification system for partially edentulous spaces

    Directory of Open Access Journals (Sweden)

    Bhandari Aruna J, Bhandari Akshay J

    2014-04-01

    Full Text Available Background: There is no single universally employed classification system that specifies the exact edentulous situation. Several classification systems exist to group situations and avoid confusion. Existing classifications are based on edentulous areas, finished restored prostheses, types of direct retainers, or fulcrum lines; some depend on the placement of implants. The widely accepted Kennedy–Applegate classification does not give any idea of the length or span of the edentulous space or the number of teeth missing. Rule 6 governing the application of the Kennedy method states that additional edentulous areas are referred to as modifications 1, 2, etc. Rule 7 states that the extent of the modification is not considered; only the number of edentulous areas is considered. Hence there is a need to modify the Kennedy–Applegate system. Aims: This new classification system is an attempt to modify the Kennedy–Applegate system so as to give an exact idea of the missing teeth, space, span, side, and areas of partially edentulous arches. Methods and Material: This system provides information on whether the arch is maxillary or mandibular, the left or right side, the length of the edentulous space, the number of teeth missing, and whether the prosthesis will be tooth-borne or tooth–tissue-borne. Conclusions: This classification is easy to apply and communicate, and will also help in designing the removable cast partial denture in a more logical and systematic way. This system will also give an idea of the edentulous status and the number of missing teeth in fixed, hybrid, or implant prostheses.

  17. Challenges in the design of a Home Telemanagement trial for patients with ulcerative colitis.

    Science.gov (United States)

    Cross, Raymond K; Finkelstein, Joseph

    2009-12-01

    Nonadherence, inadequate monitoring, and side-effects result in suboptimal outcomes in ulcerative colitis (UC). We hypothesize that telemanagement for UC will improve symptoms, quality of life, and adherence, and decrease costs. This article describes the challenges encountered in the design of the home telemanagement in patients with UC trial. In a randomized trial to assess the effectiveness of telemanagement for UC compared to best available care, 100 patients will be enrolled. Subjects in the intervention arm will complete self-testing with telemanagement weekly; best available care subjects will receive scheduled follow-up, educational fact sheets, and written action plans. Telemanagement consists of a home unit, a decision support server, and a web-based clinician portal. The home unit includes a scale and a laptop. Subjects will respond to questions about symptoms, side-effects, adherence, and knowledge weekly, and will receive action plans after self-testing. Outcome variables to be assessed every 4 months include: disease activity, using the Seo index; quality of life, using the Inflammatory Bowel Disease Questionnaire; adherence, using pharmacy refill data and the Morisky Medication Adherence Scale; and utilization of healthcare resources, using urgent care visits and hospitalizations. We encountered several challenges during the design and implementation of our trial. First, we selected a randomized controlled trial design; we could have selected a quasi-experimental design to decrease the sample size needed and costs. Second, identification of a control group was challenging. Telemanagement patients received self-care plans and an educational curriculum. Since controls would not receive these interventions, we thought our results would be biased in favor of telemanagement. In addition, we wanted to evaluate the mode of delivery of these components of care. Therefore, we included written action plans and educational materials for patients in the control group ('best

  18. MISSION PROFILE AND DESIGN CHALLENGES FOR MARS LANDING EXPLORATION

    Directory of Open Access Journals (Sweden)

    J. Dong

    2017-07-01

    Full Text Available An orbiter and a descent module will be delivered to Mars in the first Chinese Mars exploration mission. The descent module is composed of a landing platform and a rover. The module will be released into the atmosphere by the orbiter and make a controlled landing on the Martian surface. After landing, the rover will egress from the platform to start its science mission. The rover payloads mainly include a subsurface radar, terrain camera, multispectral camera, magnetometer, and anemometer to achieve the scientific investigation of the terrain, soil characteristics, material composition, magnetic field, atmosphere, etc. The landing process is divided into three phases (entry phase, parachute descent phase, and powered descent phase), each of which is full of risks. Many uncertain parameters and design constraints affect the selection of the landing sites and the phase-switch events (mortar deployment of the parachute, separation of the heat shield, and cut-off of the parachute). A number of new technologies (disk-gap-band parachute, guidance and navigation, etc.) need to be developed. Mars and Earth have gravity and atmosphere conditions that are significantly different from one another, so meaningful environmental conditions cannot be recreated terrestrially and a full-scale flight validation on Earth is difficult. Therefore, end-to-end simulation and some critical subsystem tests must be considered instead. The challenges above and the corresponding design solutions are introduced in this paper, which can provide a reference for the Mars exploration mission.

  19. Single-labelled music genre classification using content-based features

    CSIR Research Space (South Africa)

    Ajoodha, R

    2015-11-01

    Full Text Available In this paper we use content-based features to perform automatic classification of music pieces into genres. We categorise these features into four groups: features extracted from the Fourier transform’s magnitude spectrum, features designed...

  20. Detecting Hijacked Journals by Using Classification Algorithms.

    Science.gov (United States)

    Andoohgin Shahri, Mona; Jazi, Mohammad Davarpanah; Borchardt, Glenn; Dadkhah, Mehdi

    2018-04-01

    Invalid journals are a recent challenge in the academic world, and many researchers are unacquainted with the phenomenon. The number of victims appears to be accelerating. Researchers might be suspicious of predatory journals because they have unfamiliar names, but hijacked journals are imitations of well-known, reputable journals whose websites have been hijacked. Hijacked journals issue calls for papers via generally laudatory emails that delude researchers into paying exorbitant page charges for publication in a nonexistent journal. This paper presents a method for detecting hijacked journals by using a classification algorithm. The number of published articles exposing hijacked journals is limited, and most of them use simple techniques that are limited to specific journals. Hence we needed to amass Internet addresses and pertinent data for analyzing this type of attack. We inspected the websites of 104 scientific journals by using a classification algorithm that used criteria common to reputable journals. We then prepared a decision tree that we used to test five journals we knew were authentic and five we knew were hijacked.
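
    A decision tree over website criteria like those the abstract describes amounts to nested rules. These specific criteria and thresholds are invented for illustration; the paper derives its tree from the 104 inspected journal sites:

```python
def journal_risk(site):
    """site: dict of simple website features. Returns 'hijacked-suspect'
    or 'likely-authentic'. A learned decision tree would pick the splits
    from labeled data; these splits are hand-written placeholders."""
    if site["domain_age_years"] < 1:
        return "hijacked-suspect"   # freshly registered lookalike domains
    if not site["doi_links_resolve"]:
        return "hijacked-suspect"   # broken archival links
    if site["fee_requested_before_review"]:
        return "hijacked-suspect"   # pay-first solicitation pattern
    return "likely-authentic"
```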

  1. Low-Rank Sparse Coding for Image Classification

    KAUST Repository

    Zhang, Tianzhu; Ghanem, Bernard; Liu, Si; Xu, Changsheng; Ahuja, Narendra

    2013-01-01

    In this paper, we propose a low-rank sparse coding (LRSC) method that exploits local structure information among features in an image for the purpose of image-level classification. LRSC represents densely sampled SIFT descriptors, in a spatial neighborhood, collectively as low-rank, sparse linear combinations of code words. As such, it casts the feature coding problem as a low-rank matrix learning problem, which is different from previous methods that encode features independently. This LRSC has a number of attractive properties. (1) It encourages sparsity in feature codes, locality in codebook construction, and low-rankness for spatial consistency. (2) LRSC encodes local features jointly by considering their low-rank structure information, and is computationally attractive. We evaluate the LRSC by comparing its performance on a set of challenging benchmarks with that of 7 popular coding and other state-of-the-art methods. Our experiments show that by representing local features jointly, LRSC not only outperforms the state-of-the-art in classification accuracy but also improves the time complexity of methods that use a similar sparse linear representation model for feature coding.

  2. Low-Rank Sparse Coding for Image Classification

    KAUST Repository

    Zhang, Tianzhu

    2013-12-01

    In this paper, we propose a low-rank sparse coding (LRSC) method that exploits local structure information among features in an image for the purpose of image-level classification. LRSC represents densely sampled SIFT descriptors, in a spatial neighborhood, collectively as low-rank, sparse linear combinations of code words. As such, it casts the feature coding problem as a low-rank matrix learning problem, which is different from previous methods that encode features independently. This LRSC has a number of attractive properties. (1) It encourages sparsity in feature codes, locality in codebook construction, and low-rankness for spatial consistency. (2) LRSC encodes local features jointly by considering their low-rank structure information, and is computationally attractive. We evaluate the LRSC by comparing its performance on a set of challenging benchmarks with that of 7 popular coding and other state-of-the-art methods. Our experiments show that by representing local features jointly, LRSC not only outperforms the state-of-the-art in classification accuracy but also improves the time complexity of methods that use a similar sparse linear representation model for feature coding.

  3. Challenges and opportunities in designing clinical trials for neuromyelitis optica

    Science.gov (United States)

    Barron, Gerard; Behne, Jacinta M.; Bennett, Jeffery L.; Chin, Peter S.; Cree, Bruce A.C.; de Seze, Jerome; Flor, Armando; Fujihara, Kazuo; Greenberg, Benjamin; Higashi, Sayumi; Holt, William; Khan, Omar; Knappertz, Volker; Levy, Michael; Melia, Angela T.; Palace, Jacqueline; Smith, Terry J.; Sormani, Maria Pia; Van Herle, Katja; VanMeter, Susan; Villoslada, Pablo; Walton, Marc K.; Wasiewski, Warren; Wingerchuk, Dean M.; Yeaman, Michael R.

    2015-01-01

    Current management of neuromyelitis optica (NMO) is noncurative and only partially effective. Immunosuppressive or immunomodulatory agents are the mainstays of maintenance treatment. Safer, better-tolerated, and proven effective treatments are needed. The perceived rarity of NMO has impeded clinical trials for this disease. However, a diagnostic biomarker and recognition of a wider spectrum of NMO presentations has expanded the patient population from which study candidates might be recruited. Emerging insights into the pathogenesis of NMO have provided rationale for exploring new therapeutic targets. Academic, pharmaceutical, and regulatory communities are increasingly interested in meeting the unmet needs of patients with NMO. Clinical trials powered to yield unambiguous outcomes and designed to facilitate rapid evaluation of an expanding pipeline of experimental agents are needed. NMO-related disability occurs incrementally as a result of attacks; thus, limiting attack frequency and severity are critical treatment goals. Yet, the severity of NMO and perception that currently available agents are effective pose challenges to study design. We propose strategies for NMO clinical trials to evaluate agents targeting recovery from acute attacks and prevention of relapses, the 2 primary goals of NMO treatment. Aligning the interests of all stakeholders is an essential step to this end. PMID:25841026

  4. Challenges of implementation and implementation research: Learning from an intervention study designed to improve tumor registry reporting

    Directory of Open Access Journals (Sweden)

    Ann Scheck McAlearney

    2016-08-01

    Full Text Available Objectives: Implementation of interventions designed to improve the quality of medical care often proceeds differently from what is planned. Improving existing conceptual models to better understand the sources of these differences can help future projects avoid these pitfalls and achieve desired effectiveness. To inform an adaptation of an existing theoretical model, we examined unanticipated changes that occurred in an intervention designed to improve reporting of adjuvant therapies for breast cancer patients at a large, urban academic medical center. Methods: Guided by the complex innovation implementation conceptual framework, our study team observed and evaluated the implementation of an intervention designed to improve reporting to a tumor registry. Findings were assessed against the conceptual framework to identify boundary conditions and modifications that could improve implementation effectiveness. Results: The intervention successfully increased identification of the managing medical oncologist and treatment reporting. During implementation, however, unexpected external challenges including hospital acquisitions of community practices and practices’ responses to government incentives to purchase electronic medical record systems led to unanticipated changes and associated threats to implementation. We present a revised conceptual model that incorporates the sources of these unanticipated challenges. Conclusion: This report of our experience highlights the importance of monitoring implementation over time and accounting for changes that affect both implementation and measurement of intervention impact. In this article, we use our study to examine the challenges of implementation research in health care, and our experience can help future implementation efforts.

  5. Three-class classification in computer-aided diagnosis of breast cancer by support vector machine

    Science.gov (United States)

    Sun, Xuejun; Qian, Wei; Song, Dansheng

    2004-05-01

    The design of the classifier in a computer-aided diagnosis (CAD) scheme for breast cancer plays an important role in its overall sensitivity and specificity. Classifying a detected object on a mammogram as a malignant lesion, a benign lesion, or normal tissue is a typical three-class pattern recognition problem. This paper presents a three-class classification approach that uses a two-stage classifier combined with a support vector machine (SVM) learning algorithm for classification of breast cancer on mammograms. The first classification stage separates abnormal areas from normal breast tissue, and the second stage classifies the detected abnormal objects as malignant or benign. A series of spatial, morphological, and texture features were extracted from the detected object areas. Using a genetic algorithm (GA), different feature groups were investigated for each classification stage. Computerized free-response receiver operating characteristic (FROC) and receiver operating characteristic (ROC) analyses were employed at the different classification stages. Results showed a clear performance improvement in both sensitivity and specificity for the proposed classification approach compared with conventional two-class classification approaches, indicating its effectiveness for classification of breast cancer on mammograms.
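
The two-stage decision logic this record describes can be sketched in a few lines. The Pegasos-style linear-SVM trainer and the synthetic 2-D features below are illustrative stand-ins assumed for demonstration only; the paper's actual classifier, mammographic features, and GA-based feature selection are not reproduced here.

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, epochs=200):
    # Pegasos-style sub-gradient training of a linear SVM; labels in {-1, +1}.
    w = np.zeros(X.shape[1])
    b = 0.0
    t = 0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            t += 1
            eta = 1.0 / (lam * t)
            if yi * (xi @ w + b) < 1.0:   # margin violated: hinge-loss step
                w = (1.0 - eta * lam) * w + eta * yi * xi
                b += eta * yi
            else:                          # only regularisation shrinkage
                w = (1.0 - eta * lam) * w
    return w, b

def two_stage_predict(x, stage1, stage2):
    # Stage 1 separates normal tissue from abnormal regions;
    # stage 2 grades the remaining abnormal detections.
    w1, b1 = stage1
    if x @ w1 + b1 < 0.0:
        return "normal"
    w2, b2 = stage2
    return "malignant" if x @ w2 + b2 >= 0.0 else "benign"

# Synthetic, well-separated 2-D "feature" clusters for demonstration.
rng = np.random.default_rng(0)
normal = rng.normal([0.0, 0.0], 0.3, (40, 2))
benign = rng.normal([4.0, 0.0], 0.3, (40, 2))
malignant = rng.normal([4.0, 4.0], 0.3, (40, 2))
stage1 = train_linear_svm(np.vstack([normal, benign, malignant]),
                          np.array([-1] * 40 + [1] * 80))
stage2 = train_linear_svm(np.vstack([benign, malignant]),
                          np.array([-1] * 40 + [1] * 40))
```

Stage 1 filters out normal tissue so that stage 2 only has to grade the remaining abnormal detections, mirroring how the record decomposes the three-class problem into two binary ones.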

  6. Active Metric Learning for Supervised Classification

    OpenAIRE

    Kumaran, Krishnan; Papageorgiou, Dimitri; Chang, Yutong; Li, Minhan; Takáč, Martin

    2018-01-01

    Clustering and classification critically rely on distance metrics that provide meaningful comparisons between data points. We present mixed-integer optimization approaches to find optimal distance metrics that generalize the Mahalanobis metric extensively studied in the literature. Additionally, we generalize and improve upon leading methods by removing reliance on pre-designated "target neighbors," "triplets," and "similarity pairs." Another salient feature of our method is its ability to en...
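
A Mahalanobis-family metric of the kind this record generalizes is simply a quadratic form parameterized by a positive semi-definite matrix M. The sketch below shows the distance itself and the classical, label-free choice M = inverse sample covariance; the record's mixed-integer optimization of M against labels is beyond this illustration.

```python
import numpy as np

def mahalanobis(x, y, M):
    # Distance induced by a symmetric PSD matrix M: sqrt((x - y)^T M (x - y)).
    d = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return float(np.sqrt(d @ M @ d))

def fit_classical_mahalanobis(X):
    # Classical, unsupervised choice M = inverse sample covariance; supervised
    # metric learning instead optimises M (or a factor L with M = L^T L).
    cov = np.cov(np.asarray(X, dtype=float), rowvar=False)
    return np.linalg.inv(cov)
```

With M equal to the identity matrix the metric reduces to the ordinary Euclidean distance, the degenerate case that learned metrics improve upon.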

  7. Methodology, Algorithms, and Emerging Tool for Automated Design of Intelligent Integrated Multi-Sensor Systems

    Directory of Open Access Journals (Sweden)

    Andreas König

    2009-11-01

    Full Text Available The emergence of novel sensing elements, computing nodes, wireless communication, and integration technology provides unprecedented possibilities for the design and application of intelligent systems. Each new application system must be designed from scratch, employing sophisticated methods ranging from conventional signal processing to computational intelligence. Currently, a significant part of this overall algorithmic chain of the computational system model still has to be assembled manually by experienced designers in a time- and labor-consuming process. This research work takes up that challenge and introduces a methodology and algorithms for the automated design of intelligent, integrated, and resource-aware multi-sensor systems employing multi-objective evolutionary computation. The proposed methodology tackles the challenge of rapid prototyping of such systems under realization constraints and additionally includes features of system-instance-specific self-correction for sustained operation in large-volume deployment and in dynamically changing environments. Extending these concepts to reconfigurable hardware platforms yields so-called self-x sensor systems, i.e., self-monitoring, -calibrating, -trimming, and -repairing/-healing systems. Selected experimental results demonstrate the applicability and effectiveness of the proposed methodology and emerging tool. The approach achieved competitive results with regard to classification accuracy, flexibility, and design speed under additional design constraints.
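
At the core of any multi-objective evolutionary design loop of the kind this record describes is non-dominated (Pareto) selection over competing objectives, e.g. recognition error versus resource cost. The minimal Pareto-front filter below is a generic illustration, not the authors' tool:

```python
import numpy as np

def pareto_front(points):
    # Keep the indices of candidate designs (rows of [error, cost, ...])
    # not dominated by any other candidate; all objectives are minimised.
    pts = np.asarray(points, dtype=float)
    keep = []
    for i, p in enumerate(pts):
        dominated = any(
            np.all(q <= p) and np.any(q < p)
            for j, q in enumerate(pts) if j != i
        )
        if not dominated:
            keep.append(i)
    return keep
```

An evolutionary design loop would alternate such a selection step with mutation/recombination of the surviving system configurations.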

  8. Deep learning architectures for multi-label classification of intelligent health risk prediction.

    Science.gov (United States)

    Maxwell, Andrew; Li, Runzhi; Yang, Bei; Weng, Heng; Ou, Aihua; Hong, Huixiao; Zhou, Zhaoxian; Gong, Ping; Zhang, Chaoyang

    2017-12-28

    Multi-label classification of data remains a challenging problem. Because of the complexity of the data, it is sometimes difficult to infer information about classes that are not mutually exclusive. For medical data, patients may have symptoms of multiple diseases at the same time, and it is important to develop tools that help identify problems early. Intelligent health risk prediction models built with deep learning architectures offer a powerful tool for physicians to identify patterns in patient data that indicate risks associated with certain types of chronic diseases. Physical examination records of 110,300 anonymous patients were used to predict diabetes, hypertension, fatty liver, combinations of these three chronic diseases, and the absence of disease (8 classes in total). The dataset was split into training (90%) and testing (10%) sub-datasets. Ten-fold cross-validation was used to evaluate prediction accuracy with metrics such as precision, recall, and F-score. Deep learning (DL) architectures were compared with standard and state-of-the-art multi-label classification methods. Preliminary results suggest that Deep Neural Networks (DNN), a DL architecture, when applied to multi-label classification of chronic diseases, produced accuracy comparable to that of common methods such as support vector machines. We implemented DNNs to handle both problem-transformation and algorithm-adaptation multi-label methods and compared the two to see which is preferable. Deep learning architectures have the potential to infer more information about the patterns in physical examination data than common classification methods. Deep learning techniques can be used to identify the significance of different features in physical examination data, as well as to learn the contribution of each feature to a patient's risk for chronic diseases. However, accurate prediction of chronic disease risks remains a challenging problem.
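
What distinguishes this multi-label setting from ordinary multi-class prediction is that each output unit gets its own sigmoid and threshold, so several disease labels can fire for the same patient. A minimal forward pass, with toy weights standing in for a trained network, might look like:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def multilabel_forward(x, W1, b1, W2, b2, threshold=0.5):
    # One hidden ReLU layer; independent sigmoid outputs, one per disease,
    # so a single patient can be flagged for several conditions at once.
    h = np.maximum(0.0, W1 @ x + b1)      # hidden representation
    p = sigmoid(W2 @ h + b2)              # per-label probabilities
    return p, (p >= threshold).astype(int)

x = np.array([1.0, 0.0])                  # toy patient feature vector
W1, b1 = np.eye(2), np.zeros(2)
W2 = np.array([[2.0, 0.0], [-2.0, 0.0]])  # toy weights, one row per label
b2 = np.zeros(2)
probs, labels = multilabel_forward(x, W1, b1, W2, b2)
```

Training such a network against one binary cross-entropy term per label is the "algorithm adaptation" route the record mentions; "problem transformation" would instead recode the label combinations into a single multi-class target.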

  9. Science Planning and Orbit Classification for Solar Probe Plus

    Science.gov (United States)

    Kusterer, M. B.; Fox, N. J.; Rodgers, D. J.; Turner, F. S.

    2016-12-01

    There are a number of challenges for the Science Planning Team (SPT) of the Solar Probe Plus (SPP) mission. Since SPP uses a decoupled payload operations approach, tight coordination between the mission operations and payload teams will be required. The payload teams must manage the volume of data that they write to the spacecraft solid-state recorders (SSR) for their individual instruments for downlink to the ground. Making this process more difficult, the geometry of the celestial bodies and the spacecraft during some of the SPP mission orbits limits uplink and downlink opportunities. The payload teams will also be required to coordinate power-on opportunities, command uplink opportunities, and data transfers from instrument memory to the spacecraft SSR with the operations team. The SPT also intends to coordinate observations with other spacecraft and ground-based systems. To address these challenges, detailed orbit activity planning is required in advance for each orbit. An orbit planning process is being created to facilitate the coordination of spacecraft and payload activities for each orbit. An interactive Science Planning Tool is being designed to integrate the payload data volume and priority allocations, spacecraft ephemeris, attitude, downlink and uplink schedules, spacecraft and payload activities, and the ephemerides of other spacecraft. It will be used during science planning to select the instrument data priorities and data volumes that satisfy the orbit data volume constraints, as well as the power-on, command uplink, and data transfer time periods. To aid the initial stages of science planning, we have created an orbit classification scheme based on downlink availability and significant science events. Different types of challenges arise in the management of science data driven by orbital geometry and operational constraints, and this scheme attempts to identify the patterns that emerge.

  10. Improved motion description for action classification

    Directory of Open Access Journals (Sweden)

    Mihir eJain

    2016-01-01

    Full Text Available Even though the importance of explicitly integrating motion characteristics in video descriptions has been demonstrated by several recent papers on action classification, our current work concludes that adequately decomposing visual motion into dominant and residual motions, i.e., camera and scene motion, significantly improves action recognition algorithms. This holds true both for the extraction of the space-time trajectories and for the computation of descriptors. We designed a new motion descriptor, the DCS descriptor, that captures additional information on local motion patterns, enhancing results based on the differential motion scalar quantities divergence, curl, and shear. Finally, applying the recent VLAD coding technique proposed in image retrieval provides a substantial improvement for action recognition. These findings are complementary to each other, and together they outperformed all previously reported results by a significant margin on three challenging datasets: Hollywood2, HMDB51, and Olympic Sports, as reported in Jain et al. (2013). These results were further improved by Oneata et al. (2013), Wang and Schmid (2013), and Zhu et al. (2013) through the use of the Fisher vector encoding. We therefore also employ the Fisher vector in this paper, and we further enhance our approach by combining trajectories from both optical flow and compensated flow. We also provide additional details of the DCS descriptor, including visualization. To extend the evaluation, a novel dataset with 101 action classes, UCF101, was added.
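
The divergence, curl, and shear quantities behind the DCS descriptor are first-order spatial derivatives of the (residual) flow field. A small sketch using finite differences via `np.gradient` on a dense flow `(u, v)`:

```python
import numpy as np

def flow_kinematics(u, v):
    # Divergence, curl, and one shear component of a dense 2-D flow (u, v),
    # the scalar motion quantities underlying a DCS-style descriptor.
    du_dy, du_dx = np.gradient(u)   # axis 0 = y, axis 1 = x
    dv_dy, dv_dx = np.gradient(v)
    div = du_dx + dv_dy
    curl = dv_dx - du_dy
    shear = du_dx - dv_dy           # hyperbolic term; (du_dy + dv_dx) is the other
    return div, curl, shear

# Rigid rotation field: u = -y, v = x.
y, x = np.mgrid[0:6, 0:6].astype(float)
div, curl, shear = flow_kinematics(-y, x)
```

For a rigid rotation the divergence and shear vanish while the curl is constant, the kind of local motion pattern these scalar maps expose before being aggregated into descriptors.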

  11. DNA methylation-based classification of central nervous system tumours

    DEFF Research Database (Denmark)

    Capper, David; Jones, David T.W.; Sill, Martin

    2018-01-01

    Accurate pathological diagnosis is crucial for optimal management of patients with cancer. For the approximately 100 known tumour types of the central nervous system, standardization of the diagnostic process has been shown to be particularly challenging, with substantial inter-observer variability in the histopathological diagnosis of many tumour types. Here we present a comprehensive approach for the DNA methylation-based classification of central nervous system tumours across all entities and age groups, and demonstrate its application in a routine diagnostic setting. We show...

  12. The paradox of atheoretical classification

    DEFF Research Database (Denmark)

    Hjørland, Birger

    2016-01-01

    A distinction can be made between “artificial classifications” and “natural classifications,” where artificial classifications may adequately serve some limited purposes, but natural classifications are overall most fruitful by allowing inference and thus many different purposes. There is strong support for the view that a natural classification should be based on a theory (and, of course, that the most fruitful theory provides the most fruitful classification). Nevertheless, atheoretical (or “descriptive”) classifications are often produced. Paradoxically, atheoretical classifications may be very successful. The best example of a successful “atheoretical” classification is probably the prestigious Diagnostic and Statistical Manual of Mental Disorders (DSM) since its third edition from 1980. Based on such successes one may ask: Should the claim that classifications ideally are natural...

  13. Design and Development of the Blackbird: Challenges and Lessons Learned

    Science.gov (United States)

    Merlin, Peter W.

    2009-01-01

    The Lockheed Blackbirds hold a unique place in the development of aeronautics. In their day, the A-12, YF-12, M-21, D-21, and SR-71 variants outperformed all other jet airplanes in terms of altitude and speed. Now retired, they remain the only production aircraft capable of sustained Mach 3 cruise and operational altitudes above 80,000 feet. In this paper the author describes the design evolution of the Blackbird from Lockheed's early Archangel studies for the Central Intelligence Agency through Senior Crown, production of the Air Force's SR-71. He describes the construction and materials challenges faced by Lockheed, the Blackbird's performance characteristics and capabilities, and the National Aeronautics and Space Administration's role in using the aircraft as a flying laboratory to collect data on materials, structures, loads, heating, aerodynamics, and performance for high-speed aircraft.

  14. Hindi vowel classification using QCN-MFCC features

    Directory of Open Access Journals (Sweden)

    Shipra Mishra

    2016-09-01

    Full Text Available In the presence of environmental noise, speakers tend to increase their vocal effort to improve the audibility of their voice. This involuntary adjustment is known as the Lombard effect (LE). Due to LE, the signal-to-noise ratio of speech increases, but at the same time the loudness, pitch, and duration of phonemes change. Hence, the accuracy of automatic speech recognition systems degrades. In this paper, the effect of unsupervised equalization of the Lombard effect is investigated for a Hindi vowel classification task using the Hindi database designed at TIFR Mumbai, India. The proposed Quantile-based Dynamic Cepstral Normalization MFCC (QCN-MFCC) features, along with baseline MFCC features, have been used for vowel classification. A hidden Markov model (HMM) is used as the classifier. It is observed that QCN-MFCC features give a maximum improvement of 5.97% and 5% over MFCC features for the context-dependent and context-independent cases, respectively. QCN-MFCC features also give improvements of 13% and 11.5% over MFCC features for context-dependent and context-independent classification of mid vowels.
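
As a rough illustration of quantile-based normalization (not the paper's exact QCN formula), each cepstral coefficient track can be centred and scaled by per-utterance quantiles instead of mean and variance, which makes the normalization less sensitive to Lombard-induced outliers:

```python
import numpy as np

def quantile_normalize(cepstra, q_low=0.25, q_high=0.75):
    # cepstra: (frames, coefficients) matrix for one utterance.
    # Shift each coefficient track by its midhinge and scale by its
    # inter-quantile range; a hypothetical sketch of quantile-based
    # dynamic cepstral normalization.
    lo = np.quantile(cepstra, q_low, axis=0)
    hi = np.quantile(cepstra, q_high, axis=0)
    scale = np.where(hi - lo > 1e-8, hi - lo, 1.0)
    return (cepstra - (lo + hi) / 2.0) / scale

rng = np.random.default_rng(1)
C = rng.normal(0.0, 2.0, (200, 13))   # toy 13-coefficient cepstral track
N = quantile_normalize(C)
```

After normalization every coefficient track has unit inter-quartile range, so utterance-level shifts caused by changed vocal effort are largely removed before HMM training.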

  15. Molecular classification of pesticides including persistent organic pollutants, phenylurea and sulphonylurea herbicides.

    Science.gov (United States)

    Torrens, Francisco; Castellano, Gloria

    2014-06-05

    Pesticide residues in wine were analyzed by liquid chromatography-tandem mass spectrometry, and the retentions are modelled by structure-property relationships. Bioplastic evolution is an evolutionary perspective conjugating the effect of acquired characters with the evolutionary indeterminacy-morphological determination-natural selection principles; its application to design a co-ordination index barely improves the correlations. Fractal dimensions and the partition coefficient differentiate the pesticides. The classification algorithms are based on information entropy and its production. The pesticides allow a structural classification by nonplanarity and by the number of O, S, N, and Cl atoms and cycles; different behaviours depend on the number of cycles. The novelty of the approach is that the structural parameters are related to the retentions. When the procedures are applied to moderate-sized sets, the number of results grows combinatorially; however, the equipartition conjecture selects a criterion for choosing among hierarchical classification trees. Information entropy permits classifying the compounds in agreement with principal component analyses. The periodic classification shows that pesticides in the same group present similar properties, and those also in the same period show maximum resemblance. The advantage of the classification is that it predicts the retentions for molecules not included in the categorization. The classification extends to phenyl/sulphonylureas, and the application will be to predict their retentions.
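
The information-entropy machinery used throughout this classification work reduces, at its simplest, to the Shannon entropy of a partition of compounds into classes; equipartition-style criteria then compare such entropies across candidate groupings. A minimal sketch:

```python
import numpy as np

def partition_entropy(labels):
    # Shannon entropy (in bits) of a grouping of compounds into classes.
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())
```

A four-way equipartition attains the maximum entropy of 2 bits, while a single class has entropy 0; comparing such values across candidate hierarchical trees is one way to select a classification criterion.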

  16. On Internet Traffic Classification: A Two-Phased Machine Learning Approach

    Directory of Open Access Journals (Sweden)

    Taimur Bakhshi

    2016-01-01

    Full Text Available Traffic classification using flow measurement enables operators to perform essential network management. Flow accounting methods such as NetFlow are, however, considered inadequate for classification, as they require additional packet-level information, host behaviour analysis, and specialized hardware, limiting their practical adoption. This paper aims to overcome these challenges by proposing a two-phased machine learning classification mechanism with NetFlow records as input. Individual flow classes are derived per application through k-means clustering and are then used to train a C5.0 decision tree classifier. As part of validation, the initial unsupervised phase used flow records of fifteen popular Internet applications that were collected and independently subjected to k-means clustering to determine the unique flow classes generated per application. The derived flow classes were afterwards used to train and test a supervised C5.0-based decision tree. The resulting classifier reported an average accuracy of 92.37% on approximately 3.4 million test cases, increasing to 96.67% with adaptive boosting. The classifier's specificity, which accounts for differentiating content-specific from supplementary flows, ranged between 98.37% and 99.57%. Furthermore, the computational performance and accuracy of the proposed methodology in comparison with similar machine learning techniques lead us to recommend its extension to other applications for achieving highly granular real-time traffic classification.
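
Phase one of the proposed mechanism, deriving per-application flow classes by k-means over NetFlow features, can be sketched with a tiny pure-NumPy clusterer. The feature choice and the second-phase C5.0 decision tree are omitted, and the random initialization below is an assumption of this sketch:

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    # Cluster NetFlow feature vectors (e.g. packet count, byte count,
    # duration) into k flow classes via Lloyd's algorithm.
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels

# Two toy "applications" with well-separated flow statistics.
rng = np.random.default_rng(0)
flows_a = rng.normal([0.0, 0.0], 0.3, (30, 2))
flows_b = rng.normal([10.0, 10.0], 0.3, (30, 2))
centers, labels = kmeans(np.vstack([flows_a, flows_b]), k=2)
```

The cluster labels produced here are what phase two would use as training targets for the supervised decision tree.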

  17. Transporter Classification Database (TCDB)

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Transporter Classification Database details a comprehensive classification system for membrane transport proteins known as the Transporter Classification (TC)...

  18. Intraoperative neuropathology of glioma recurrence: cell detection and classification

    Science.gov (United States)

    Abas, Fazly S.; Gokozan, Hamza N.; Goksel, Behiye; Otero, Jose J.; Gurcan, Metin N.

    2016-03-01

    Intraoperative neuropathology of glioma recurrence presents significant visual challenges to pathologists, and these diagnoses carry significant clinical implications. For example, rendering a diagnosis of recurrent glioma can help the surgeon decide to perform a more aggressive resection, if surgically appropriate. In addition, the success of recent clinical trials of intraoperatively administered therapies, such as inoculation with oncolytic viruses, suggests that refinement of the intraoperative diagnosis during neurosurgery is an emerging need for pathologists. Typically, these diagnoses require rapid/STAT processing lasting only 20-30 minutes after receipt from neurosurgery. In this relatively short time frame, only dyes, such as hematoxylin and eosin (H and E), can be used. The visual challenge lies in the fact that these patients have undergone chemotherapy and radiation, both of which induce cytological atypia in astrocytes, and pathologists are unable to use helpful biomarkers in their diagnoses. Therefore, there is a need to help pathologists differentiate between astrocytes that are cytologically atypical due to treatment and infiltrating, recurrent, neoplastic astrocytes. This study focuses on classification of neoplastic versus non-neoplastic astrocytes, with the long-term goal of providing better neuropathological computer-aided consultation via classification of cells into reactive gliosis versus recurrent glioma. We present a method to detect cells in H and E stained digitized slides of intraoperative cytologic preparations. The method uses a combination of the 'value' component of the HSV color space and the 'b*' component of the CIE L*a*b* color space to create an enhanced image that suppresses the background while revealing cells. A composite image is formed based on the morphological closing of the hue-luminance combined image. Geometrical and textural features are extracted from Discrete Wavelet Frames and combined to classify the detected cells.
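
The colour-space combination step can be approximated as follows. The 'value' channel is the per-pixel RGB maximum; as a stand-in for the CIE b* channel (the full L*a*b* conversion is omitted), a simple blue-yellow opponent term is used here, so this is a hedged sketch of the idea rather than the authors' exact enhancement:

```python
import numpy as np

def enhance_cells(rgb):
    # rgb: (H, W, 3) uint8 image. Combine the HSV 'value' channel with a
    # blue-yellow opponent channel (an approximation of CIE L*a*b* 'b*')
    # so that blue hematoxylin-stained cells stand out from pale background.
    rgb = rgb.astype(float) / 255.0
    value = rgb.max(axis=-1)                                       # HSV V channel
    opponent_b = rgb[..., 2] - (rgb[..., 0] + rgb[..., 1]) / 2.0   # blue vs yellow
    return value * (0.5 + 0.5 * np.tanh(3.0 * opponent_b))

img = np.zeros((1, 2, 3), dtype=np.uint8)
img[0, 0] = (60, 40, 200)    # bluish, hematoxylin-stained nucleus
img[0, 1] = (230, 230, 230)  # pale background
enhanced = enhance_cells(img)
```

In the enhanced map the stained pixel scores well above the pale background pixel, which is the property the subsequent morphological closing and cell detection rely on.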

  19. Structural and mechanical design challenges of space shuttle solid rocket boosters separation and recovery subsystems

    Science.gov (United States)

    Woodis, W. R.; Runkle, R. E.

    1985-01-01

    The design of the space shuttle solid rocket booster (SRB) subsystems for reuse posed some unique and challenging design considerations. The separation of the SRBs from the cluster (orbiter and external tank) at 150,000 ft, while the orbiter engines were running at full thrust, meant the two SRBs had to have positive separation forces pushing them away. At the same instant, the large attachments that had reacted launch loads of 7.5 million pounds of thrust had to be severed. These design considerations dictated the design requirements for the pyrotechnics and separation rocket motors. The recovery and reuse of the two SRBs meant they had to be safely lowered to the ocean, remain afloat, and be towed back to shore. In general, both the pyrotechnic and recovery subsystems have met or exceeded design requirements. In twelve vehicles, there has been only one instance where the pyrotechnic system failed to function properly.

  20. Monitoring nanotechnology using patent classifications: an overview and comparison of nanotechnology classification schemes

    Energy Technology Data Exchange (ETDEWEB)

    Jürgens, Björn, E-mail: bjurgens@agenciaidea.es [Agency of Innovation and Development of Andalusia, CITPIA PATLIB Centre (Spain); Herrero-Solana, Victor, E-mail: victorhs@ugr.es [University of Granada, SCImago-UGR (SEJ036) (Spain)

    2017-04-15

    Patents are an essential information source used to monitor, track, and analyze nanotechnology. When it comes to searching for nanotechnology-related patents, a keyword search is often incomplete and struggles to cover such an interdisciplinary discipline. Patent classification schemes can reveal far better results, since they are assigned by experts who classify the patent documents according to their technology. In this paper, we present the most important classifications for searching nanotechnology patents and analyze how nanotechnology is covered in the main patent classification systems used in search systems today: the International Patent Classification (IPC), the United States Patent Classification (USPC), and the Cooperative Patent Classification (CPC). We conclude that nanotechnology has significantly better patent coverage in the CPC, since considerably more nanotechnology documents were retrieved than by using the other classifications, and we therefore recommend its use for all professionals involved in nanotechnology patent searches.