WorldWideScience

Sample records for classified elimination facilities

  1. Region 9 National Pollution Discharge Elimination System (NPDES) Facilities

    Data.gov (United States)

    U.S. Environmental Protection Agency — Point geospatial dataset representing locations of NPDES Facilities. NPDES (National Pollution Discharge Elimination System) is an EPA permit program that regulates...

  2. National Pollution Discharge Elimination System (NPDES) Facility Points, Region 9, 2012, US EPA Region 9

    Data.gov (United States)

    U.S. Environmental Protection Agency — Point geospatial dataset representing locations of NPDES Facilities. NPDES (National Pollution Discharge Elimination System) is an EPA permit program that regulates...

  3. National Pollution Discharge Elimination System (NPDES) Facility Points, Region 9, 2011, US EPA Region 9

    Data.gov (United States)

    U.S. Environmental Protection Agency — Point geospatial dataset representing locations of NPDES Facilities. NPDES (National Pollution Discharge Elimination System) is an EPA permit program that regulates...

  4. Science Facilities. A Classified List of Literature Related to Design, Construction and Other Architectural Matters.

    Science.gov (United States)

    National Science Foundation, Washington, DC.

    A classified list of articles, papers and catalogs in the science facilities collection of the Architectural Services Staff of the National Science Foundation is presented which may be useful in searching for pertinent literature on problems in the design of science facilities. Citations cover such areas as general planning, space utilization and…

  5. Elimination of Porcine Epidemic Diarrhea Virus in an Animal Feed Manufacturing Facility.

    Science.gov (United States)

    Huss, Anne R; Schumacher, Loni L; Cochrane, Roger A; Poulsen, Elizabeth; Bai, Jianfa; Woodworth, Jason C; Dritz, Steve S; Stark, Charles R; Jones, Cassandra K

    2017-01-01

    Porcine Epidemic Diarrhea Virus (PEDV) was the first virus of wide-scale concern to be linked to possible transmission by livestock feed or ingredients. Measures to exclude pathogens, prevent cross-contamination, and actively reduce the pathogenic load of feed and ingredients are being developed. However, research thus far has focused on the role of chemicals or thermal treatment to reduce the RNA in the actual feedstuffs, and has not addressed potential residual contamination within the manufacturing facility that may lead to continuous contamination of finished feeds. The purpose of this experiment was to evaluate the use of a standardized protocol to sanitize an animal feed manufacturing facility contaminated with PEDV. Environmental swabs were collected throughout the facility during the manufacturing of a swine diet inoculated with PEDV. To monitor facility contamination of the virus, swabs were collected at: 1) baseline prior to inoculation, 2) after production of the inoculated feed, 3) after application of a quaternary ammonium-glutaraldehyde blend cleaner, 4) after application of a sodium hypochlorite sanitizing solution, and 5) after facility heat-up to 60°C for 48 hours. Decontamination step, surface, type, zone and their interactions were all found to impact the quantity of detectable PEDV RNA (P < 0.05). Samples collected from surfaces in direct contact with feed were positive for PEDV RNA after production of the inoculated feed. Additionally, the majority of samples collected from non-direct feed contact surfaces were also positive for PEDV RNA after the production of the contaminated feed, emphasizing the potential role dust plays in cross-contamination of the pathogen throughout a manufacturing facility. Application of the cleaner, sanitizer, and heat was effective at reducing PEDV genomic material (P < 0.05), but did not completely eliminate it.

  6. Impact study of classified facilities. Impacts of the facility; Etude d'impact des ICPE. Effets de l'installation

    Energy Technology Data Exchange (ETDEWEB)

    Seveque, J.L.

    2002-01-01

    The operation of a classified facility has direct or indirect, temporary and permanent impacts on the environment, in particular on sites and landscapes, on ecosystems, on the neighborhood, on the agriculture, on public health, etc.. Thus an impact study is necessary to identify the overall harmful effects of the facility. Content: 1 - aim of the impact study; 2 - environment of the facility (impact on the landscape, on the fauna and flora, on material goods and cultural patrimony, on agriculture); 3 - water pollution (impact on ground waters, on drinkable water catchment, on surface waters, on public health); 4 - air pollution (impact on air quality, odors, public health); 5 - noise and vibrations; 6 - wastes; 7 - transport (impact of road traffic). (J.S.)

  7. List of currently classified documents relative to Hanford Production Facilities Operations originated on the Hanford Site between 1961 and 1972

    Energy Technology Data Exchange (ETDEWEB)

    1993-04-01

    The United States Department of Energy (DOE) has declared that all Hanford plutonium production- and operations-related information generated between 1944 and 1972 is declassified. Any documents found and deemed useful for meeting Hanford Environmental Dose Reconstruction (HEDR) objectives may be declassified with or without deletions in accordance with DOE guidance by Authorized Derivative Declassifiers. The September 1992 letter report, Declassifications Requested by the Technical Steering Panel of Hanford Documents Produced 1944-1960 (PNWD-2024 HEDR UC-707), provides an important milestone toward achieving a complete listing of documents that may be useful to the HEDR Project. The attached listing of approximately 7,000 currently classified Hanford-originated documents relative to Hanford Production Facilities Operations between 1961 and 1972 fulfills Technical Steering Panel (TSP) Directive 89-3. This list does not include such titles as the Irradiation Processing Department, Chemical Processing Department, and Hanford Laboratory Operations monthly reports generated after 1960, which have previously been declassified with minor deletions and made publicly available. Also, Kaiser Engineers Hanford (KEH) Document Control determined that no KEH documents generated between January 1, 1961 and December 31, 1972 are currently classified. Titles which address work for others have not been included because Hanford Site contractors currently having custodial responsibility for these documents do not have the authority to determine whether anyone other than their own staff has an appropriate need-to-know on file. Furthermore, these documents do not normally contain information relative to Hanford Site operations.

  8. Elimination of Pasteurella pneumotropica from a Mouse Barrier Facility by Using a Modified Enrofloxacin Treatment Regimen

    OpenAIRE

    Towne, Justin W; Wagner, April M; Griffin, Kurt J; Buntzman, Adam S.; Frelinger, Jeffrey A.; Besselsen, David G.

    2014-01-01

    Multiple NOD.Cg-Prkdcscid Il2rgtm1WjlTg(HLA-A2.1)Enge/Sz (NSG/A2) transgenic mice maintained in a mouse barrier facility were submitted for necropsy to determine the cause of facial alopecia, tachypnea, dyspnea, and sudden death. Pneumonia and soft-tissue abscesses were observed, and Pasteurella pneumotropica biotype Jawetz was consistently isolated from the upper respiratory tract, lung, and abscesses. Epidemiologic investigation within the facility revealed presence of this pathogen in mice...

  9. Chemical elimination of the harmful properties of asbestos from military facilities.

    Science.gov (United States)

    Pawełczyk, Adam; Božek, František; Grabas, Kazimierz; Chęcmanowski, Jacek

    2017-03-01

    This work presents research on the neutralization of asbestos banned from military use and its conversion to usable products. The studies showed that asbestos can be decomposed by the use of phosphoric acid. The process proved very effective when the phosphoric acid concentration was 30%, the temperature was 90°C and the reaction time 60min. Contrary to the common asbestos treatment method that consists of landfilling, the proposed process ensures elimination of the harmful properties of this waste material and its transformation into inert substances. The obtained products include calcium phosphate, magnesium phosphate and silica. Chemical, microscopic and X-ray analyses proved that the products are free of harmful fibers and can be, in particular, utilized for fertilizers production. The obtained results may contribute to development of an asbestos utilization technique that fits well into the European waste policy, regulated by the EU waste management law. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Elimination of Pasteurella pneumotropica from a mouse barrier facility by using a modified enrofloxacin treatment regimen.

    Science.gov (United States)

    Towne, Justin W; Wagner, April M; Griffin, Kurt J; Buntzman, Adam S; Frelinger, Jeffrey A; Besselsen, David G

    2014-09-01

    Multiple NOD. Cg-Prkdc(scid)Il2rg(tm1Wjl)Tg(HLA-A2.1)Enge/Sz (NSG/A2) transgenic mice maintained in a mouse barrier facility were submitted for necropsy to determine the cause of facial alopecia, tachypnea, dyspnea, and sudden death. Pneumonia and soft-tissue abscesses were observed, and Pasteurella pneumotropica biotype Jawetz was consistently isolated from the upper respiratory tract, lung, and abscesses. Epidemiologic investigation within the facility revealed presence of this pathogen in mice generated or rederived by the intramural Genetically Engineered Mouse Model (GEMM) Core but not in mice procured from several approved commercial vendors. Epidemiologic data suggested the infection originated from female or vasectomized male ND4 mice obtained from a commercial vendor and then comingled by the GEMM Core to induce pseudopregnancy in female mice for embryo implantation. Enrofloxacin delivered in drinking water (85 mg/kg body weight daily) for 14 d was sufficient to clear bacterial infection in normal, breeding, and immune-deficient mice without the need to change the antibiotic water source. This modified treatment regimen was administered to 2400 cages of mice to eradicate Pasteurella pneumotropica from the facility. Follow-up PCR testing for P. pneumotropica biotype Jawetz remained uniformly negative at 2, 6, 12, and 52 wk after treatment in multiple strains of mice that were originally infected. Together, these data indicate that enrofloxacin can eradicate P. pneumotropica from infected mice in a less labor-intensive approach that does not require breeding cessation and that is easily adaptable to the standard biweekly cage change schedule for individually ventilated cages.

  11. Analysis of the application of selected physico-chemical methods in eliminating odor nuisance of municipal facilities

    Science.gov (United States)

    Miller, Urszula; Grzelka, Agnieszka; Romanik, Elżbieta; Kuriata, Magdalena

    2018-01-01

    The operation of municipal management facilities is inseparable from the problem of emissions of malodorous compounds to the atmospheric air. The odor nuisance is related to the chemical composition of waste, sewage and sludge, as well as to the activity of microorganisms whose metabolic products can be these odorous compounds. A significant reduction of odorant emission from many sources can be achieved by optimizing process parameters and conditions. However, it is not always possible to limit the formation of odorants, and in such cases it is best to use appropriate deodorizing methods. The choice of an appropriate method is based on the physical parameters and emission intensity of the polluted gases and, where it can be determined, their composition. Among the solutions used in municipal management, physico-chemical methods such as sorption and oxidation can be distinguished. In cases where the emission source is not encapsulated, odor masking techniques are used, which consist of spraying preparations that neutralize unpleasant odors. The paper presents the characteristics of selected methods of eliminating odor nuisance and evaluates their applicability in municipal management facilities.

  12. Analysis of the application of selected physico-chemical methods in eliminating odor nuisance of municipal facilities

    Directory of Open Access Journals (Sweden)

    Miller Urszula

    2018-01-01

    Full Text Available The operation of municipal management facilities is inseparable from the problem of emissions of malodorous compounds to the atmospheric air. The odor nuisance is related to the chemical composition of waste, sewage and sludge, as well as to the activity of microorganisms whose metabolic products can be these odorous compounds. A significant reduction of odorant emission from many sources can be achieved by optimizing process parameters and conditions. However, it is not always possible to limit the formation of odorants, and in such cases it is best to use appropriate deodorizing methods. The choice of an appropriate method is based on the physical parameters and emission intensity of the polluted gases and, where it can be determined, their composition. Among the solutions used in municipal management, physico-chemical methods such as sorption and oxidation can be distinguished. In cases where the emission source is not encapsulated, odor masking techniques are used, which consist of spraying preparations that neutralize unpleasant odors. The paper presents the characteristics of selected methods of eliminating odor nuisance and evaluates their applicability in municipal management facilities.

  13. Classifying Microorganisms

    DEFF Research Database (Denmark)

    Sommerlund, Julie

    2006-01-01

    This paper describes the coexistence of two systems for classifying organisms and species: a dominant genetic system and an older naturalist system. The former classifies species and traces their evolution on the basis of genetic characteristics, while the latter employs physiological characteristics. The coexistence of the classification systems does not lead to a conflict between them. Rather, the systems seem to co-exist in different configurations, through which they are complementary, contradictory and inclusive in different situations, sometimes simultaneously. The systems come…

  14. Carbon classified?

    DEFF Research Database (Denmark)

    Lippert, Ingmar

    2012-01-01

    Using an actor-network theory (ANT) framework, the aim is to investigate the actors who bring together the elements needed to classify their carbon emission sources and unpack the heterogeneous relations drawn on. Based on an ethnographic study of corporate agents of ecological modernisation over… and corporations construing themselves as able and suitable to manage their emissions, and, additionally, given that the construction of carbon emissions has performative consequences, the underlying practices need to be declassified, i.e. opened for public scrutiny. Hence the paper concludes by arguing…

  15. Multimedia Classifier

    Science.gov (United States)

    Costache, G. N.; Gavat, I.

    2004-09-01

    Along with the aggressive growth in the amount of digital data available (text, audio samples, digital photos and digital movies, all joined in the multimedia domain), the need for classification, recognition and retrieval of this kind of data has become very important. This paper presents a system structure for handling multimedia data from a recognition perspective. The main processing steps for the multimedia objects of interest are: first, parameterization by analysis, in order to obtain a feature-based description forming the parameter vector; second, classification, generally with a hierarchical structure, to make the necessary decisions. For audio signals, both speech and music, the derived perceptual features are the mel-cepstral (MFCC) and the perceptual linear predictive (PLP) coefficients. For images, the derived features are the geometric parameters of the speaker's mouth. The hierarchical classifier generally consists of a clustering stage, based on Kohonen Self-Organizing Maps (SOM), and a final stage based on a powerful classification algorithm called Support Vector Machines (SVM). The system, in specific variants, is applied with good results to two tasks: the first is bimodal speech recognition, which fuses features obtained from the speech signal with features obtained from the speaker's image; the second is music retrieval from a large music database.
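
    The two-stage structure described above (a clustering stage followed by an SVM decision stage) can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: it assumes the perceptual feature vectors (e.g. per-utterance MFCC averages) are already extracted, and it substitutes scikit-learn's KMeans for the Kohonen SOM clustering stage.

```python
# Minimal sketch of a two-stage (clustering + decision) classifier, assuming
# feature vectors (e.g. per-utterance MFCC averages) are already extracted.
# KMeans stands in here for the Kohonen SOM stage described in the abstract.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Illustrative synthetic features: two classes with different means.
X = np.vstack([rng.normal(0.0, 1.0, (100, 13)), rng.normal(1.5, 1.0, (100, 13))])
y = np.array([0] * 100 + [1] * 100)

# Stage 1: unsupervised clustering of the feature space.
clusterer = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)
cluster_id = clusterer.labels_

# Stage 2: one SVM per cluster makes the final decision; a cluster that
# happens to contain a single class just remembers that label.
experts = {}
for c in range(4):
    Xc, yc = X[cluster_id == c], y[cluster_id == c]
    experts[c] = SVC(kernel="rbf").fit(Xc, yc) if len(set(yc)) > 1 else int(yc[0])

def classify(x):
    c = int(clusterer.predict(x.reshape(1, -1))[0])
    expert = experts[c]
    return expert if isinstance(expert, int) else int(expert.predict(x.reshape(1, -1))[0])

print(classify(X[0]), classify(X[-1]))
```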

  16. Eliminating animal facility light-at-night contamination and its effect on circadian regulation of rodent physiology, tumor growth, and metabolism: a challenge in the relocation of a cancer research laboratory.

    Science.gov (United States)

    Dauchy, Robert T; Dupepe, Lynell M; Ooms, Tara G; Dauchy, Erin M; Hill, Cody R; Mao, Lulu; Belancio, Victoria P; Slakey, Lauren M; Hill, Steven M; Blask, David E

    2011-05-01

    Appropriate laboratory animal facility lighting and lighting protocols are essential for maintaining the health and wellbeing of laboratory animals and ensuring the credible outcome of scientific investigations. Our recent experience in relocating to a new laboratory facility illustrates the importance of these considerations. Previous studies in our laboratory demonstrated that animal room contamination with light-at-night (LAN) of as little as 0.2 lx at rodent eye level during an otherwise normal dark-phase disrupted host circadian rhythms and stimulated the metabolism and proliferation of human cancer xenografts in rats. Here we examined how simple improvements in facility design at our new location completely eliminated dark-phase LAN contamination and restored normal circadian rhythms in nontumor-bearing rats and normal tumor metabolism and growth in host rats bearing tissue-isolated MCF7(SR(-)) human breast tumor xenografts or 7288CTC rodent hepatomas. Reducing LAN contamination in the animal quarters from 24.5 ± 2.5 lx to nondetectable levels (complete darkness) restored normal circadian regulation of rodent arterial blood melatonin, glucose, total fatty and linoleic acid concentrations, tumor uptake of O(2), glucose, total fatty acid and CO(2) production and tumor levels of cAMP, triglycerides, free fatty acids, phospholipids, and cholesterol esters, as well as extracellular-signal-regulated kinase, mitogen-activated protein kinase, serine-threonine protein kinase, glycogen synthase kinase 3β, γ-histone 2AX, and proliferating cell nuclear antigen.

  17. Facile synthesis of 2D CuO nanoleaves for the catalytic elimination of hazardous and toxic dyes from aqueous phase: a sustainable approach.

    Science.gov (United States)

    Bhattacharjee, Archita; Begum, Shamima; Neog, Kashmiri; Ahmaruzzaman, M

    2016-06-01

    This article reports for the first time a facile, green synthesis of 2D CuO nanoleaves (NLs) using the amino acid, namely aspartic acid, and NaOH by a microwave heating method. The amino acid acts as a complexing/capping agent in the synthesis of CuO NLs. This method resulted in the formation of self-assembled 2D CuO NLs with an average length and width of ~300-400 and ~50-82 nm, respectively. The as-synthesized 2D CuO NLs were built up from the primary CuO nanoparticles by oriented attachment growth mechanism. The CuO NLs were characterized by an X-ray diffraction (XRD) method, transmission electron microscopy (TEM), selected-area electron diffraction (SAED) pattern, and Fourier transform infrared spectroscopy (FT-IR). The optical properties were investigated using UV-visible spectroscopy. For the first time, rose bengal and eosin Y dyes were degraded photochemically by solar irradiation using CuO NLs as a photocatalyst. The synthesized CuO NLs act as an efficient photocatalyst in the degradation of rose bengal and eosin Y dye under direct sunlight. The degradation of both the dyes, namely rose bengal and eosin Y, took place within 120 and 45 min, respectively, using CuO NLs as a photocatalyst, whereas commercial CuO, SnO2 quantum dots (QDs), and commercial SnO2 took more than 120 and 45 min for the degradation of rose bengal and eosin Y, respectively. The synthesized CuO NLs showed a superior photocatalytic activity as compared to that of commercial CuO, SnO2 QDs, and commercial SnO2. The reusability of the CuO NLs as a photocatalyst in the degradation of dyes was investigated, and it was evident that the catalytic efficiency decreases to a small extent (5-6 %) after the fifth cycle of operation.

  18. Analysis, diagnosis and prognosis of leprosy utilizing fuzzy classifier ...

    African Journals Online (AJOL)

    The proposed expert system eliminates uncertainty and imprecision associated with the diagnosis of Leprosy. Keywords: Leprosy, Fuzzy Set, Fuzzy Logic, Fuzzy Classifier, Diagnosis. INTRODUCTION: Leprosy (Hansen's Disease) is chronic, ...

  19. Quantum Minimum Distance Classifier

    OpenAIRE

    Enrica Santucci

    2017-01-01

    We propose a quantum version of the well-known minimum distance classification model called Nearest Mean Classifier (NMC). In this regard, we presented our first results in two previous works. First, a quantum counterpart of the NMC for two-dimensional problems was introduced, named Quantum Nearest Mean Classifier (QNMC), together with a possible generalization to any number of dimensions. Secondly, we studied the n-dimensional problem in detail and we showed a new encoding for arbitrary n-...

  20. Classifying Returns as Extreme

    DEFF Research Database (Denmark)

    Christiansen, Charlotte

    2014-01-01

    I consider extreme returns for the stock and bond markets of 14 EU countries using two classification schemes: one, the univariate classification scheme from the previous literature that classifies extreme returns for each market separately, and two, a novel multivariate classification scheme that classifies extreme returns for several markets jointly. The new classification scheme holds about the same information as the old one, while demanding a shorter sample period. The new classification scheme is useful.
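
    The paper itself defines the two schemes precisely; as a rough, hypothetical illustration of the difference, the sketch below flags a return as extreme under a univariate rule when it falls in a single market's own empirical tails, and under a multivariate rule when the day is jointly extreme across markets, here via the tail of a Mahalanobis-type distance. The 5% tail thresholds and the distance criterion are assumptions for illustration, not the paper's definitions.

```python
# Hypothetical illustration of univariate vs. multivariate "extreme return"
# classification; the 5% tail threshold and the Mahalanobis criterion are
# assumptions, not the definitions used in the paper.
import numpy as np

rng = np.random.default_rng(1)
returns = rng.standard_t(df=4, size=(1000, 3))    # 1000 days, 3 markets

# Univariate scheme: extreme if a return lies in that market's own 5% tails.
lo, hi = np.quantile(returns, [0.05, 0.95], axis=0)
extreme_uni = (returns < lo) | (returns > hi)      # shape (1000, 3)

# Multivariate scheme: extreme if the day is jointly far from the mean,
# measured by Mahalanobis distance, beyond its empirical 95th percentile.
mu = returns.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(returns, rowvar=False))
d2 = np.einsum("ij,jk,ik->i", returns - mu, cov_inv, returns - mu)
extreme_multi = d2 > np.quantile(d2, 0.95)         # shape (1000,)

print(extreme_uni.mean(axis=0), extreme_multi.mean())
```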

  1. Wastewater Treatment Facilities

    Data.gov (United States)

    Iowa State University GIS Support and Research Facility — Individual permits for municipal, industrial, and semi-public wastewater treatment facilities in Iowa for the National Pollutant Discharge Elimination System (NPDES)...

  2. 46 CFR 503.59 - Safeguarding classified information.

    Science.gov (United States)

    2010-10-01

    ... classification. (b) Whenever classified material is removed from a storage facility, such material shall not be left unattended and shall be protected by attaching an appropriate classified document cover sheet to... in approved equipment or facilities, whenever it is not under the direct supervision of authorized...

  3. Dynamic system classifier

    CERN Document Server

    Pumpe, Daniel; Müller, Ewald; Enßlin, Torsten A

    2016-01-01

    Stochastic differential equations describe well many physical, biological and sociological systems, despite the simplification often made in their derivation. Here the usage of simple stochastic differential equations to characterize and classify complex dynamical systems is proposed within a Bayesian framework. To this end, we develop a dynamic system classifier (DSC). The DSC first abstracts training data of a system in terms of time-dependent coefficients of the descriptive stochastic differential equation. Thereby the DSC identifies unique correlation structures within the training data. For definiteness we restrict the presentation of DSC to oscillation processes with a time-dependent frequency ω(t) and damping factor γ(t). Although real systems might be more complex, this simple oscillator captures many characteristic features. The ω and γ timelines represent the abstract system characterization and permit the construction of efficient signal classifiers. Numerical experiment...
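
    As a hedged sketch of the kind of training signal the abstract describes, the snippet below simulates a stochastically driven oscillator with a time-dependent frequency ω(t) and damping γ(t) using a simple Euler-Maruyama step. The specific ω(t)/γ(t) timelines and the noise level are illustrative assumptions; the DSC's Bayesian inference machinery is not shown.

```python
# Euler-Maruyama simulation of a damped, stochastically driven oscillator:
#   dx = v dt
#   dv = -(gamma(t) v + omega(t)^2 x) dt + sigma dW
# The omega(t)/gamma(t) timelines and sigma are illustrative assumptions.
import numpy as np

def simulate(T=20.0, dt=1e-3, sigma=0.3, seed=0):
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    t = np.arange(n) * dt
    omega = 2.0 + 0.5 * np.sin(0.3 * t)      # slowly varying frequency
    gamma = 0.1 + 0.05 * t / T               # slowly increasing damping
    x, v = np.empty(n), np.empty(n)
    x[0], v[0] = 1.0, 0.0
    for i in range(n - 1):
        x[i + 1] = x[i] + v[i] * dt
        v[i + 1] = v[i] - (gamma[i] * v[i] + omega[i] ** 2 * x[i]) * dt \
                   + sigma * np.sqrt(dt) * rng.standard_normal()
    return t, x

t, x = simulate()
print(x[:5])
```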

  4. Application of Data Clustering Embedded in Fuzzy Classifier Expert ...

    African Journals Online (AJOL)

    The conventional (traditional) methods for water quality recognition employed by different individuals are expressed using a fuzzy classifier. The proposed expert system eliminates uncertainties and imprecision associated with the recognition of water quality. Keywords: Fuzzy classifier, fuzzy logic, fuzzy set, Water…

  5. Classifying Cereal Data

    Science.gov (United States)

    The DSQ includes questions about cereal intake and allows respondents up to two responses on which cereals they consume. We classified each cereal reported first by hot or cold, and then along four dimensions: density of added sugars, whole grains, fiber, and calcium.

  6. LCC: Light Curves Classifier

    Science.gov (United States)

    Vo, Martin

    2017-08-01

    Light Curves Classifier uses data mining and machine learning to obtain and classify desired objects. This task can be accomplished by attributes of light curves or any time series, including shapes, histograms, or variograms, or by other available information about the inspected objects, such as color indices, temperatures, and abundances. After specifying features which describe the objects to be searched, the software trains on a given training sample, and can then be used for unsupervised clustering for visualizing the natural separation of the sample. The package can also be used for automatic tuning of the parameters of the methods used (for example, number of hidden neurons or binning ratio). Trained classifiers can be used for filtering outputs from astronomical databases or data stored locally. The Light Curve Classifier can also be used for simple downloading of light curves and all available information of queried stars. It can connect natively to OgleII, OgleIII, ASAS, CoRoT, Kepler, Catalina and MACHO, and new connectors or descriptors can be implemented. In addition to direct usage of the package and command line UI, the program can be used through a web interface. Users can create jobs for "training" methods on given objects, querying databases and filtering outputs by trained filters. Preimplemented descriptors, classifiers and connectors can be picked by simple clicks and their parameters can be tuned by giving ranges of these values. All combinations are then calculated and the best one is used for creating the filter. Natural separation of the data can be visualized by unsupervised clustering.

  7. Intelligent Garbage Classifier

    Directory of Open Access Journals (Sweden)

    Ignacio Rodríguez Novelle

    2008-12-01

    Full Text Available IGC (Intelligent Garbage Classifier) is a system for visual classification and separation of solid waste products. Currently, an important part of the separation effort is based on manual work, from household separation to industrial waste management. Taking advantage of the technologies currently available, a system has been built that can analyze images from a camera and control a robot arm and conveyor belt to automatically separate different kinds of waste.

  8. Quantum Minimum Distance Classifier

    Directory of Open Access Journals (Sweden)

    Enrica Santucci

    2017-12-01

    Full Text Available We propose a quantum version of the well-known minimum distance classification model called Nearest Mean Classifier (NMC). In this regard, we presented our first results in two previous works. First, a quantum counterpart of the NMC for two-dimensional problems was introduced, named Quantum Nearest Mean Classifier (QNMC), together with a possible generalization to any number of dimensions. Secondly, we studied the n-dimensional problem in detail and we showed a new encoding for arbitrary n-feature vectors into density operators. In the present paper, another promising encoding is considered, suggested by recent debates on quantum machine learning. Further, we observe a significant property concerning the non-invariance by feature rescaling of our quantum classifier. This fact, which represents a meaningful difference between the NMC and the respective quantum version, allows us to introduce a free parameter whose variation provides, in some cases, better classification results for the QNMC. The experimental section is devoted: (i) to compare the NMC and QNMC performance on different datasets; and (ii) to study the effects of the non-invariance under uniform rescaling for the QNMC.
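
    For orientation, the classical model the QNMC starts from is easy to state: each class is represented by the centroid of its training vectors, and a new point is assigned to the class with the nearest centroid. The sketch below implements that classical NMC with NumPy (scikit-learn's NearestCentroid is equivalent); the quantum encoding into density operators discussed in the paper is not shown.

```python
# Classical Nearest Mean Classifier (NMC): assign each point to the class
# whose training-set centroid is closest in Euclidean distance.
import numpy as np

def fit_nmc(X, y):
    classes = np.unique(y)
    centroids = np.array([X[y == c].mean(axis=0) for c in classes])
    return classes, centroids

def predict_nmc(classes, centroids, X):
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return classes[d.argmin(axis=1)]

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
classes, centroids = fit_nmc(X, y)
print((predict_nmc(classes, centroids, X) == y).mean())
```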

  9. USCIS Backlog Elimination

    Data.gov (United States)

    Department of Homeland Security — USCIS is streamlining the way immigration benefits are delivered. By working smarter and eliminating redundancies, USCIS is bringing a business model to government....

  10. Classifying TDSS Stellar Variables

    Science.gov (United States)

    Amaro, Rachael Christina; Green, Paul J.; TDSS Collaboration

    2017-01-01

    The Time Domain Spectroscopic Survey (TDSS), a subprogram of SDSS-IV eBOSS, obtains classification/discovery spectra of point-source photometric variables selected from PanSTARRS and SDSS multi-color light curves regardless of object color or lightcurve shape. Tens of thousands of TDSS spectra are already available and have been spectroscopically classified both via pipeline and by visual inspection. About half of these spectra are quasars, half are stars. Our goal is to classify the stars with their correct variability types. We do this by acquiring public multi-epoch light curves for brighter stars (selected by r-band magnitude) and comparing them with classifications and parameters in the Catalina Surveys Periodic Variable Star Catalog. Variable star classifications include RR Lyr, close eclipsing binaries, CVs, pulsating white dwarfs, and other exotic systems. The key difference between our catalog and others is that along with the light curves, we will be using TDSS spectra to help in the classification of variable type, as spectra are rich with information allowing estimation of physical parameters like temperature, metallicity, gravity, etc. This work was supported by the SDSS Research Experience for Undergraduates program, which is funded by a grant from the Sloan Foundation to the Astrophysical Research Consortium.

  11. Classifying Facial Actions

    Science.gov (United States)

    Donato, Gianluca; Bartlett, Marian Stewart; Hager, Joseph C.; Ekman, Paul; Sejnowski, Terrence J.

    2010-01-01

    The Facial Action Coding System (FACS) [23] is an objective method for quantifying facial movement in terms of component actions. This system is widely used in behavioral investigations of emotion, cognitive processes, and social interaction. The coding is presently performed by highly trained human experts. This paper explores and compares techniques for automatically recognizing facial actions in sequences of images. These techniques include analysis of facial motion through estimation of optical flow; holistic spatial analysis, such as principal component analysis, independent component analysis, local feature analysis, and linear discriminant analysis; and methods based on the outputs of local filters, such as Gabor wavelet representations and local principal components. Performance of these systems is compared to naive and expert human subjects. Best performances were obtained using the Gabor wavelet representation and the independent component representation, both of which achieved 96 percent accuracy for classifying 12 facial actions of the upper and lower face. The results provide converging evidence for the importance of using local filters, high spatial frequencies, and statistical independence for classifying facial actions. PMID:21188284

  12. Stack filter classifiers

    Energy Technology Data Exchange (ETDEWEB)

    Porter, Reid B [Los Alamos National Laboratory; Hush, Don [Los Alamos National Laboratory

    2009-01-01

    Just as linear models generalize the sample mean and weighted average, weighted order statistic models generalize the sample median and weighted median. This analogy can be continued informally to generalized additive models in the case of the mean, and Stack Filters in the case of the median. Both of these model classes have been extensively studied for signal and image processing but it is surprising to find that for pattern classification, their treatment has been significantly one-sided. Generalized additive models are now a major tool in pattern classification and many different learning algorithms have been developed to fit model parameters to finite data. However, Stack Filters remain largely confined to signal and image processing and learning algorithms for classification are yet to be seen. This paper is a step towards Stack Filter Classifiers and it shows that the approach is interesting from both a theoretical and a practical perspective.
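
    The building block named above, the weighted median, can be computed directly: sort the samples and take the smallest value at which the cumulative weight reaches half the total weight. A minimal sketch of that computation (not the stack filter learning algorithm itself):

```python
# Weighted median: the value at which the cumulative weight first reaches
# half of the total weight (the weighted analogue of the sample median).
import numpy as np

def weighted_median(values, weights):
    values, weights = np.asarray(values, float), np.asarray(weights, float)
    order = np.argsort(values)
    v, w = values[order], weights[order]
    cum = np.cumsum(w)
    return v[np.searchsorted(cum, 0.5 * w.sum())]

print(weighted_median([1.0, 2.0, 3.0, 10.0], [1, 1, 4, 1]))   # -> 3.0
```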

  13. Eliminating the xy Term.

    Science.gov (United States)

    Roberti, Joseph V.

    1979-01-01

    A process for eliminating the xy term in a quadratic equation in two variables is presented. The author feels this process will be within the reach of more high school students than more commonly used methods. (MK)

  14. Evolving extended naive Bayes classifiers

    OpenAIRE

    Klawonn, Frank; Angelov, Plamen

    2006-01-01

    Naive Bayes classifiers are a very simple, but often effective tool for classification problems, although they are based on independence assumptions that do not hold in most cases. Extended naive Bayes classifiers also rely on independence assumptions, but break them down to artificial subclasses, in this way becoming more powerful than ordinary naive Bayes classifiers. Since the involved computations for Bayes classifiers are basically generalised mean value calculations, they easily render ...
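
    As context for the extension described above, an ordinary (Gaussian) naive Bayes classifier can be written in a few lines; it treats each feature as independent within a class, which is exactly the assumption the extended variant relaxes by splitting classes into artificial subclasses. A minimal sketch of that baseline (not the authors' evolving algorithm):

```python
# Plain Gaussian naive Bayes: per-class feature means/variances plus a prior;
# prediction maximizes the log joint under the feature-independence assumption.
import numpy as np

def fit_gnb(X, y):
    stats = {}
    for c in np.unique(y):
        Xc = X[y == c]
        stats[c] = (Xc.mean(0), Xc.var(0) + 1e-9, len(Xc) / len(X))
    return stats

def predict_gnb(stats, X):
    scores = []
    for c, (mu, var, prior) in stats.items():
        ll = -0.5 * (np.log(2 * np.pi * var) + (X - mu) ** 2 / var).sum(axis=1)
        scores.append(ll + np.log(prior))
    classes = np.array(list(stats.keys()))
    return classes[np.argmax(scores, axis=0)]

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (60, 3)), rng.normal(2, 1, (60, 3))])
y = np.array([0] * 60 + [1] * 60)
print((predict_gnb(fit_gnb(X, y), X) == y).mean())
```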

  15. Eliminating cracking during drying.

    Science.gov (United States)

    Jin, Qiu; Tan, Peng; Schofield, Andrew B; Xu, Lei

    2013-03-01

    When colloidal suspensions dry, stresses build up and cracks often occur - a phenomenon undesirable for important industries such as paint and ceramics. We demonstrate an effective method which can completely eliminate cracking during drying: by adding emulsion droplets into colloidal suspensions, we can systematically decrease the amount of cracking, and eliminate it completely above a critical droplet concentration. Since the emulsion droplets eventually also evaporate, our technique achieves an effective function while making little change to the components of the final product, and may therefore serve as a promising approach for cracking elimination. Furthermore, adding droplets also varies the speed of air invasion and provides a powerful method to adjust drying rate. With the effective control over cracking and drying rate, our study may find important applications in many drying- and cracking-related industrial processes.

  16. Eliminating common PACU delays.

    Science.gov (United States)

    Jenkins, Jamie

    2007-01-01

    This article discusses how one hospital identified patient flow delays in its PACU. By using lean methods focused on eliminating waste, the team was able to improve patient flow. Lean thinking required the team to keep issues that were important to patients at top of mind. The improvements not only saved staff time, but they also helped the department prepare for the addition of six beds by focusing on methods to eliminate delays. The team, assigned by the vice president of surgical services, included a process engineer, two decision support analysts, the PACU charge nurse, the nursing manager and ad hoc department nurses. The team recommended and implemented changes to improve operational effectiveness.

  17. Remediation Technologies Eliminate Contaminants

    Science.gov (United States)

    2012-01-01

    "All research and development has a story behind it," says Jacqueline Quinn, environmental engineer at Kennedy Space Center. For Quinn, one such story begins with the Saturn 1B launch stand at Kennedy and ends with a unique solution to a challenging environmental problem. Used in a number of Apollo missions and during the Skylab program, the Saturn 1B launch stand was dismantled following the transition to the Space Shuttle Program and stored in an open field at Kennedy. Decades later, the Center's Environmental Program Office discovered evidence of chemicals called polychlorinated biphenyls (PCBs) in the field's soil. The findings were puzzling since PCBs, a toxin classified as a probable carcinogen by the Environmental Protection Agency (EPA), have been banned in the United States since 1979. Before the ban, PCBs were commonly used in transformer oils that leached into the ground when the oils were changed out and dumped near transformer sites, but there were no electrical transformers near the dismantled stand. It soon became apparent that the source of the PCBs was the launch stand itself. Prior to the ban, PCBs were used extensively in paints to add elasticity and other desirable characteristics. The PCB-laden paint on the Saturn 1B launch stand was flaking off into the field's soil. "Nobody knew there were PCBs in the paint," says Quinn, noting that the ingredient was not monitored carefully when it was in use in the 1960s. In fact, she says, the U.S. EPA was not even established until 1970, a year after Neil Armstrong first set foot on the Moon. "Nobody knew any better at the time," Quinn says, "but today, we have the responsibility to return any natural environmental media to as close to pristine a condition as possible." Quinn, fellow engineer Kathleen Loftin, and other Kennedy colleagues already had experience developing unprecedented solutions for environmental contamination; the team invented the emulsified zero-valent iron (EZVI) technology to safely treat

  18. Recognizing, Confronting, and Eliminating Workplace Bullying.

    Science.gov (United States)

    Berry, Peggy Ann; Gillespie, Gordon L; Fisher, Bonnie S; Gormley, Denise K

    2016-07-01

    Workplace bullying (WPB) behaviors negatively affect nurse productivity, satisfaction, and retention, and hinder safe patient care. The purpose of this article is to define WPB, differentiate between incivility and WPB, and recommend actions to prevent WPB behaviors. Informed occupational and environmental health nurses and nurse leaders must recognize, confront, and eliminate WPB in their facilities and organizations. Recognizing, confronting, and eliminating WPB behaviors in health care is a crucial first step toward sustained improvements in patient care quality and the health and safety of health care employees. © 2016 The Author(s).

  19. Verification of the Accountability Method as a Means to Classify Radioactive Wastes Processed Using THOR Fluidized Bed Steam Reforming at the Studsvik Processing Facility in Erwin, Tennessee, USA - 13087

    Energy Technology Data Exchange (ETDEWEB)

    Olander, Jonathan [Studsvik Processing Facility Erwin, 151 T.C. Runnion Rd., Erwin, TN 37650 (United States); Myers, Corey [Studsvik, Inc., 5605 Glenridge Drive, Suite 705, Atlanta, GA 30342 (United States)

    2013-07-01

    Studsviks' Processing Facility Erwin (SPFE) has been treating Low-Level Radioactive Waste using its patented THOR process for over 13 years. Studsvik has been mixing and processing wastes of the same waste classification but different chemical and isotopic characteristics for the full extent of this period as a general matter of operations. Studsvik utilizes the accountability method to track the movement of radionuclides from acceptance of waste, through processing, and finally in the classification of waste for disposal. Recently the NRC has proposed to revise the 1995 Branch Technical Position on Concentration Averaging and Encapsulation (1995 BTP on CA) with additional clarification (draft BTP on CA). The draft BTP on CA has paved the way for large scale blending of higher activity and lower activity waste to produce a single waste for the purpose of classification. With the onset of blending in the waste treatment industry, there is concern from the public and state regulators as to the robustness of the accountability method and the ability of processors to prevent the inclusion of hot spots in waste. To address these concerns and verify the accountability method as applied by the SPFE, as well as the SPFE's ability to control waste package classification, testing of actual waste packages was performed. Testing consisted of a comprehensive dose rate survey of a container of processed waste. Separately, the waste package was modeled chemically and radiologically. Comparing the observed and theoretical data demonstrated that actual dose rates were lower than, but consistent with, modeled dose rates. Moreover, the distribution of radioactivity confirms that the SPFE can produce a radiologically homogeneous waste form. The results of the study demonstrate: 1) the accountability method as applied by the SPFE is valid and produces expected results; 2) the SPFE can produce a radiologically homogeneous waste; and 3) the SPFE can effectively control the

  20. Minding Rachlin's Eliminative Materialism

    Science.gov (United States)

    McDowell, J. J.

    2012-01-01

    Rachlin's teleological behaviorism eliminates the first-person ontology of conscious experience by identifying mental states with extended patterns of behavior, and thereby maintains the materialist ontology of science. An alternate view, informed by brain-based and externalist philosophies of mind, is shown also to maintain the materialist…

  1. Emergent behaviors of classifier systems

    Energy Technology Data Exchange (ETDEWEB)

    Forrest, S.; Miller, J.H.

    1989-01-01

    This paper discusses some examples of emergent behavior in classifier systems, describes some recently developed methods for studying them based on dynamical systems theory, and presents some initial results produced by the methodology. The goal of this work is to find techniques for noticing when interesting emergent behaviors of classifier systems emerge, to study how such behaviors might emerge over time, and to make suggestions for designing classifier systems that exhibit preferred behaviors. 20 refs., 1 fig.

  2. Targeting rubella for elimination.

    Science.gov (United States)

    Taneja, Davendra K; Sharma, Pragya

    2012-01-01

    Rubella is an acute, usually mild viral disease. However, when rubella infection occurs just before conception or during the first 8-10 weeks of gestation, it causes multiple fetal defects in up to 90% of cases, known as Congenital Rubella Syndrome (CRS). It may result in fetal wastage, stillbirths and sensorineural hearing deficit up to 20 weeks of gestation. Rubella vaccine (RA 27/3) is highly effective and has resulted in elimination of rubella and CRS from the western hemisphere and several European countries. A review of several studies documents the duration of protection over 10-21 years following one dose of RA 27/3 vaccination, and persistent seropositivity in over 95% of cases. Studies in India show that seronegativity to rubella among adolescent girls varies from 10% to 36%. Because infection at an early age confers protection in the reproductive age group, the incidence of rubella in India is not very high. However, given the severity of CRS, and because the introduction of rubella-containing vaccine (RCV) in the private sector and in some of the states is likely to lead to sub-optimal coverage and a resulting higher risk of rubella during pregnancy in the coming decades, it is imperative to adopt the goal of rubella elimination. As the country has adopted, in order to control measles, a strategy of delivering a second dose of measles vaccine through campaigns covering children 9 months to 10 years of age in 14 states, it is recommended to synergize efforts for elimination of rubella with these campaigns by replacing measles vaccine with MR or MMR vaccine. The other states, which are to give the second dose of measles vaccine through routine immunization, will also have to adopt a campaign mode in order to eliminate rubella from the country over 10-20 years. Subsequently, measles vaccine can be replaced by MR or MMR vaccine in the national schedule.

  3. Eliminating Perinatal HIV Transmission

    Centers for Disease Control (CDC) Podcasts

    2012-11-26

    In this podcast, CDC’s Dr. Steve Nesheim discusses perinatal HIV transmission, including the importance of preventing HIV among women, preconception care, and timely HIV testing of the mother. Dr. Nesheim also introduces the revised curriculum Eliminating Perinatal HIV Transmission intended for faculty of OB/GYN and pediatric residents and nurse midwifery students.  Created: 11/26/2012 by Division of HIV/AIDS Prevention.   Date Released: 11/26/2012.

  4. Classified

    CERN Multimedia

    Computer Security Team

    2011-01-01

    In the last issue of the Bulletin, we have discussed recent implications for privacy on the Internet. But privacy of personal data is just one facet of data protection. Confidentiality is another one. However, confidentiality and data protection are often perceived as not relevant in the academic environment of CERN.   But think twice! At CERN, your personal data, e-mails, medical records, financial and contractual documents, MARS forms, group meeting minutes (and of course your password!) are all considered to be sensitive, restricted or even confidential. And this is not all. Physics results, in particular when being preliminary and pending scrutiny, are sensitive, too. Just recently, an ATLAS collaborator copy/pasted the abstract of an ATLAS note onto an external public blog, despite the fact that this document was clearly marked as an "Internal Note". Such an act was not only embarrassing to the ATLAS collaboration, and had negative impact on CERN’s reputation --- i...

  5. Risks: diagnosing and eliminating

    Directory of Open Access Journals (Sweden)

    Yuriy A. Tikhomirov

    2016-01-01

    Full Text Available Objective: to develop conceptual, theoretical and legal provisions and scientific recommendations on the identification, analysis and elimination of risk. Methods: the universal dialectic method of cognition, as well as general scientific and specific research methods based on it. Results: the system of risk diagnostics in the legal sphere was researched, together with the mechanism of influencing "risk situations" and their consequences - damage to the environment and harm to society. The concept of risk in the legal sphere was formulated; the author's classification of risks in the legal sphere is presented. The rules of analysis, evaluation and prevention of risks and the model risk management framework are elaborated. Scientific novelty: the mechanism for the identification, analysis and elimination of risk has been developed and introduced into scientific circulation; the author has proposed the classification and types of risks, and the reasons and conditions promoting risk occurrence. Practical significance: the provisions and conclusions of the article can be used in scientific, lawmaking and law-enforcement activity, as well as in the educational process of higher educational establishments.

  6. Tackling imported malaria: an elimination endgame.

    Science.gov (United States)

    Sturrock, Hugh J W; Roberts, Kathryn W; Wegbreit, Jennifer; Ohrt, Colin; Gosling, Roly D

    2015-07-01

    As countries move toward malaria elimination, imported infections become increasingly significant as they often represent the majority of cases, can sustain transmission, cause resurgences, and lead to mortality. Here we review and critique current methods to prevent malaria importation in countries pursuing elimination and explore methods applied in other transmission settings and to other diseases that could be transferred to support malaria elimination. To improve intervention targeting we need a better understanding of the characteristics of populations importing infections and their patterns of migration, improved methods to reliably classify infections as imported or acquired locally, and ensure early and accurate diagnosis. The potential for onward transmission in the most receptive and vulnerable locations can be predicted through high-resolution risk mapping that can help malaria elimination or prevention of reintroduction programs target resources. Cross border and regional initiatives can be highly effective when based on an understanding of human and parasite movement. Ultimately, determining the optimal combinations of approaches to address malaria importation will require an evaluation of their impact, cost effectiveness, and operational feasibility. © The American Society of Tropical Medicine and Hygiene.

  7. Hybrid classifiers methods of data, knowledge, and classifier combination

    CERN Document Server

    Wozniak, Michal

    2014-01-01

    This book delivers definite and compact knowledge on how hybridization can help improve the quality of computer classification systems. In order to make readers clearly realize the knowledge of hybridization, this book primarily focuses on introducing the different levels of hybridization and illuminating the problems faced when dealing with such projects. In the first instance, the data and knowledge incorporated in hybridization are the action points; then a still-growing area of classifier systems known as combined classifiers is considered. This book comprises the aforementioned state-of-the-art topics and the latest research results of the author and his team from the Department of Systems and Computer Networks, Wroclaw University of Technology, including classifiers based on feature space splitting, one-class classification, imbalanced data, and data stream classification.

  8. 76 FR 34761 - Classified National Security Information

    Science.gov (United States)

    2011-06-14

    ... Classified National Security Information AGENCY: Marine Mammal Commission. ACTION: Notice. SUMMARY: This..., "Classified National Security Information," and 32 CFR part 2001, "Classified National Security Information".... Executive Order 13526, "Classified National Security Information," December 29, 2009 b. 32 CFR part 2001...

  9. Single elimination competition

    Science.gov (United States)

    Fink, T. M. A.; Coe, J. B.; Ahnert, S. E.

    2008-09-01

    We study a simple model of competition in which each player has a fixed strength: randomly selected pairs of players compete, the stronger one wins and the loser is eliminated. We show that the best indicator of future success is not the number of wins but a player's wealth: the accumulated wealth of all defeated players. We calculate the distributions of strength and wealth for two versions of the problem: in the first, the loser is replaced; in the second, the loser is not. The probability of attaining a given wealth is shown to be path-independent. We illustrate our model with the popular game of conkers and discuss an extension to round-robin sports competition.
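
    The model in this abstract is simple enough to simulate directly. The sketch below implements the replacement variant as an illustration: players have fixed random strengths, random pairs compete, the stronger player wins and absorbs the loser's accumulated wealth, and the loser is replaced by a fresh player. Population size, the uniform strength distribution, unit starting wealth and the number of rounds are illustrative choices, not values from the paper.

```python
# Simulation of the single-elimination competition model: random pairs meet,
# the stronger player wins and absorbs the loser's wealth, and (in this
# variant) the loser is replaced by a new player.
import numpy as np

rng = np.random.default_rng(0)
N, rounds = 1000, 50_000
strength = rng.random(N)          # fixed strengths, uniform on [0, 1)
wealth = np.ones(N)               # unit starting wealth (an assumption)

for _ in range(rounds):
    i, j = rng.choice(N, size=2, replace=False)
    winner, loser = (i, j) if strength[i] > strength[j] else (j, i)
    wealth[winner] += wealth[loser]            # winner absorbs loser's wealth
    strength[loser] = rng.random()             # loser replaced by a new player
    wealth[loser] = 1.0

print(wealth.max(), np.corrcoef(strength, wealth)[0, 1])
```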

  10. 33 CFR 154.1216 - Facility classification.

    Science.gov (United States)

    2010-07-01

    ... 33 Navigation and Navigable Waters 2 2010-07-01 2010-07-01 false Facility classification. 154.1216... Vegetable Oils Facilities § 154.1216 Facility classification. (a) The Coast Guard classifies facilities that... classification of a facility that handles, stores, or transports animal fats or vegetable oils. The COTP may...

  11. 3D Bayesian contextual classifiers

    DEFF Research Database (Denmark)

    Larsen, Rasmus

    2000-01-01

    We extend a series of multivariate Bayesian 2-D contextual classifiers to 3-D by specifying a simultaneous Gaussian distribution for the feature vectors as well as a prior distribution of the class variables of a pixel and its 6 nearest 3-D neighbours.

  12. Classifying Cereal Data (Earlier Methods)

    Science.gov (United States)

    The DSQ includes questions about cereal intake and allows respondents up to two responses on which cereals they consume. We classified each cereal reported first by hot or cold, and then along four dimensions: density of added sugars, whole grains, fiber, and calcium.

  13. Adaptive Regularization of Neural Classifiers

    DEFF Research Database (Denmark)

    Andersen, Lars Nonboe; Larsen, Jan; Hansen, Lars Kai

    1997-01-01

    We present a regularization scheme which iteratively adapts the regularization parameters by minimizing the validation error. It is suggested to use the adaptive regularization scheme in conjunction with optimal brain damage pruning to optimize the architecture and to avoid overfitting. Furthermore, we propose an improved neural classification architecture eliminating an inherent redundancy in the widely used SoftMax classification network. Numerical results demonstrate the viability of the method...
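
    The inherent SoftMax redundancy is commonly understood as its shift invariance: adding the same constant to all K logits leaves the probabilities unchanged, so one logit can be pinned (e.g. to zero) and only K-1 outputs need to be modelled. The sketch below illustrates that general point; it is an assumption about what the abstract refers to, not the paper's specific architecture.

```python
# A K-class softmax is unchanged if the same constant is added to every logit,
# so one logit can be pinned to zero and only K-1 outputs modelled.
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def reduced_softmax(z_reduced):
    """Probabilities from K-1 free logits, with the K-th logit fixed at 0."""
    z = np.concatenate([z_reduced, np.zeros(z_reduced.shape[:-1] + (1,))], axis=-1)
    return softmax(z)

z = np.array([1.0, -0.5, 0.3])                    # 3 free logits -> 4 classes
print(reduced_softmax(z))
print(softmax(np.array([1.0, -0.5, 0.3, 0.0])))   # identical result
```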

  14. Eliminating rabies in Estonia.

    Science.gov (United States)

    Cliquet, Florence; Robardet, Emmanuelle; Must, Kylli; Laine, Marjana; Peik, Katrin; Picard-Meyer, Evelyne; Guiot, Anne-Laure; Niin, Enel

    2012-01-01

    The compulsory vaccination of pets, the recommended vaccination of farm animals in grazing areas and the extermination of stray animals did not succeed in eliminating rabies in Estonia because the virus was maintained in two main wildlife reservoirs, foxes and raccoon dogs. These two species became a priority target therefore in order to control rabies. Supported by the European Community, successive oral vaccination (OV) campaigns were conducted twice a year using Rabigen® SAG2 baits, beginning in autumn 2005 in North Estonia. They were then extended to the whole territory from spring 2006. Following the vaccination campaigns, the incidence of rabies cases dramatically decreased, with 266 cases in 2005, 114 in 2006, four in 2007 and three in 2008. Since March 2008, no rabies cases have been detected in Estonia other than three cases reported in summer 2009 and one case in January 2011, all in areas close to the South-Eastern border with Russia. The bait uptake was satisfactory, with tetracycline positivity rates ranging from 85% to 93% in foxes and from 82% to 88% in raccoon dogs. Immunisation rates evaluated by ELISA ranged from 34% to 55% in foxes and from 38% to 55% in raccoon dogs. The rabies situation in Estonia was compared to that of the other two Baltic States, Latvia and Lithuania. Despite regular OV campaigns conducted throughout their territory since 2006, and an improvement in the epidemiological situation, rabies has still not been eradicated in these countries. An analysis of the number of baits distributed and the funding allocated by the European Commission showed that the strategy for rabies control is more cost-effective in Estonia than in Latvia and Lithuania.

  15. Eliminating Rabies in Estonia

    Science.gov (United States)

    Cliquet, Florence; Robardet, Emmanuelle; Must, Kylli; Laine, Marjana; Peik, Katrin; Picard-Meyer, Evelyne; Guiot, Anne-Laure; Niin, Enel

    2012-01-01

    The compulsory vaccination of pets, the recommended vaccination of farm animals in grazing areas and the extermination of stray animals did not succeed in eliminating rabies in Estonia because the virus was maintained in two main wildlife reservoirs, foxes and raccoon dogs. These two species became a priority target therefore in order to control rabies. Supported by the European Community, successive oral vaccination (OV) campaigns were conducted twice a year using Rabigen® SAG2 baits, beginning in autumn 2005 in North Estonia. They were then extended to the whole territory from spring 2006. Following the vaccination campaigns, the incidence of rabies cases dramatically decreased, with 266 cases in 2005, 114 in 2006, four in 2007 and three in 2008. Since March 2008, no rabies cases have been detected in Estonia other than three cases reported in summer 2009 and one case in January 2011, all in areas close to the South-Eastern border with Russia. The bait uptake was satisfactory, with tetracycline positivity rates ranging from 85% to 93% in foxes and from 82% to 88% in raccoon dogs. Immunisation rates evaluated by ELISA ranged from 34% to 55% in foxes and from 38% to 55% in raccoon dogs. The rabies situation in Estonia was compared to that of the other two Baltic States, Latvia and Lithuania. Despite regular OV campaigns conducted throughout their territory since 2006, and an improvement in the epidemiological situation, rabies has still not been eradicated in these countries. An analysis of the number of baits distributed and the funding allocated by the European Commission showed that the strategy for rabies control is more cost-effective in Estonia than in Latvia and Lithuania. PMID:22393461

  16. Fingerprint prediction using classifier ensembles

    CSIR Research Space (South Africa)

    Molale, P

    2011-11-01

    Full Text Available for improving fingerprint prediction accuracy. The study is organized as follows. The next section briefly gives details of the five classifiers used in this study, followed by a description of different types of MCS architectures. Then the robustness... discrimination (LgDA): Logistic Discrimination Analysis (LgDA), due to Cox (1966) is related to logistic regression analysis. The dependent variable can only take values of 0 and 1, say, given two classes. This technique is partially parametric...

  17. Clustering signatures classify directed networks

    Science.gov (United States)

    Ahnert, S. E.; Fink, T. M. A.

    2008-09-01

    We use a clustering signature, based on a recently introduced generalization of the clustering coefficient to directed networks, to analyze 16 directed real-world networks of five different types: social networks, genetic transcription networks, word adjacency networks, food webs, and electric circuits. We show that these five classes of networks are cleanly separated in the space of clustering signatures due to the statistical properties of their local neighborhoods, demonstrating the usefulness of clustering signatures as a classifier of directed networks.

  18. National Pollution Discharge Elimination System (NPDES) Wastewater Treatment Plant Points, Region 9, 2007, US EPA Region 9

    Data.gov (United States)

    U.S. Environmental Protection Agency — Point geospatial dataset representing locations of NPDES Waste Water Treatment Plant Facilities. NPDES (National Pollution Discharge Elimination System) is an EPA...

  19. National Pollution Discharge Elimination System (NPDES) Wastewater Treatment Plant Points, Region 9, 2012, US EPA Region 9

    Data.gov (United States)

    U.S. Environmental Protection Agency — Point geospatial dataset representing locations of NPDES Waste Water Treatment Plant Facilities. NPDES (National Pollution Discharge Elimination System) is an EPA...

  20. National Pollution Discharge Elimination System (NPDES) Wastewater Treatment Plant Points, Region 9, 2011, US EPA Region 9

    Data.gov (United States)

    U.S. Environmental Protection Agency — Point geospatial dataset representing locations of NPDES Waste Water Treatment Plant Facilities. NPDES (National Pollution Discharge Elimination System) is an EPA...

  1. Dimensionality Reduction Through Classifier Ensembles

    Science.gov (United States)

    Oza, Nikunj C.; Tumer, Kagan; Norwig, Peter (Technical Monitor)

    1999-01-01

    In data mining, one often needs to analyze datasets with a very large number of attributes. Performing machine learning directly on such data sets is often impractical because of extensive run times, excessive complexity of the fitted model (often leading to overfitting), and the well-known "curse of dimensionality." In practice, to avoid such problems, feature selection and/or extraction are often used to reduce data dimensionality prior to the learning step. However, existing feature selection/extraction algorithms either evaluate features by their effectiveness across the entire data set or simply disregard class information altogether (e.g., principal component analysis). Furthermore, feature extraction algorithms such as principal components analysis create new features that are often meaningless to human users. In this article, we present input decimation, a method that provides "feature subsets" that are selected for their ability to discriminate among the classes. These features are subsequently used in ensembles of classifiers, yielding results superior to single classifiers, ensembles that use the full set of features, and ensembles based on principal component analysis on both real and synthetic datasets.
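
    The decimation idea lends itself to a compact sketch: each ensemble member sees only the features most correlated with one class, and the members' probability estimates are averaged. The sketch below (Python, scikit-learn) is illustrative only; the feature count top_k and the absolute-correlation ranking are assumptions, not the authors' exact algorithm.

      # Minimal sketch of "input decimation": each ensemble member sees only the
      # features most correlated with one class, and the members' predicted
      # class probabilities are averaged. top_k and the correlation criterion are
      # illustrative assumptions.
      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import accuracy_score
      from sklearn.model_selection import train_test_split

      X, y = make_classification(n_samples=600, n_features=100, n_informative=10,
                                 n_classes=3, n_clusters_per_class=1, random_state=0)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

      top_k = 15                                   # features kept per class (assumed)
      members = []                                 # (feature indices, fitted model) pairs
      for c in np.unique(y_tr):
          target = (y_tr == c).astype(float)       # one-vs-rest class indicator
          corr = np.array([abs(np.corrcoef(X_tr[:, j], target)[0, 1])
                           for j in range(X_tr.shape[1])])
          idx = np.argsort(corr)[::-1][:top_k]     # most class-discriminative features
          members.append((idx, LogisticRegression(max_iter=1000).fit(X_tr[:, idx], y_tr)))

      # Combine the members by averaging their class-probability estimates.
      proba = np.mean([clf.predict_proba(X_te[:, idx]) for idx, clf in members], axis=0)
      print("ensemble accuracy:", accuracy_score(y_te, proba.argmax(axis=1)))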

  2. Detection of microaneurysms in retinal images using an ensemble classifier

    Directory of Open Access Journals (Sweden)

    M.M. Habib

    2017-01-01

    Full Text Available This paper introduces, and reports on the performance of, a novel combination of algorithms for automated microaneurysm (MA) detection in retinal images. The presence of MAs in retinal images is a pathognomonic sign of Diabetic Retinopathy (DR), which is one of the leading causes of blindness amongst the working age population. An extensive survey of the literature is presented and current techniques in the field are summarised. The proposed technique first detects an initial set of candidates using a Gaussian Matched Filter and then classifies this set to reduce the number of false positives. A Tree Ensemble classifier is used with a set of 70 features (the most common features in the literature). A new set of 32 MA groundtruth images (with a total of 256 labelled MAs), based on images from the MESSIDOR dataset, is introduced as a public dataset for benchmarking MA detection algorithms. We evaluate our algorithm on this dataset as well as another public dataset (DIARETDB1 v2.1) and compare it against the best available alternative. Results show that the proposed classifier is superior in terms of eliminating false positive MA detection from the initial set of candidates. The proposed method achieves an ROC score of 0.415 compared to 0.2636 achieved by the best available technique. Furthermore, results show that the classifier model maintains consistent performance across datasets, illustrating the generalisability of the classifier and that overfitting does not occur.
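
    A rough sketch of the two-stage idea (a Gaussian matched filter proposes candidate spots, a tree ensemble rejects false positives) is given below on a synthetic image. The kernel size, detection threshold and three toy patch features are assumptions, not the paper's 70-feature set.

      # Stage 1: Gaussian matched filter proposes candidates as thresholded local
      # maxima of the filter response. Stage 2: a random forest trained on a few
      # toy patch features rejects false positives. Synthetic data only.
      import numpy as np
      from scipy.signal import fftconvolve
      from scipy.ndimage import maximum_filter
      from sklearn.ensemble import RandomForestClassifier

      def gaussian_template(size=11, sigma=1.5):
          ax = np.arange(size) - size // 2
          xx, yy = np.meshgrid(ax, ax)
          k = np.exp(-(xx ** 2 + yy ** 2) / (2 * sigma ** 2))
          return k - k.mean()                      # zero-mean template for matched filtering

      def detect_candidates(image, thresh=0.5):
          response = fftconvolve(image, gaussian_template(), mode="same")
          peaks = (response == maximum_filter(response, size=5)) & (response > thresh)
          return np.argwhere(peaks)                # (row, col) candidate positions

      def patch_features(image, rc, win=5):
          r, c = rc
          patch = image[max(r - win, 0):r + win + 1, max(c - win, 0):c + win + 1]
          return [patch.mean(), patch.std(), patch.max()]

      rng = np.random.default_rng(0)
      img = rng.normal(0.0, 0.1, (256, 256))
      spots = rng.integers(20, 236, size=(30, 2))  # ground-truth bright blobs
      for r, c in spots:
          img[r - 2:r + 3, c - 2:c + 3] += 0.8

      cands = detect_candidates(img)
      X = np.array([patch_features(img, rc) for rc in cands])
      y = np.array([int(np.min(np.linalg.norm(spots - rc, axis=1)) < 3) for rc in cands])

      clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
      print("candidates kept by stage 2:", int(clf.predict(X).sum()), "of", len(cands))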

  3. Defining and Classifying Interest Groups

    DEFF Research Database (Denmark)

    Baroni, Laura; Carroll, Brendan; Chalmers, Adam

    2014-01-01

    The interest group concept is defined in many different ways in the existing literature and a range of different classification schemes are employed. This complicates comparisons between different studies and their findings. One of the important tasks faced by interest group scholars engaged...... in large-N studies is therefore to define the concept of an interest group and to determine which classification scheme to use for different group types. After reviewing the existing literature, this article sets out to compare different approaches to defining and classifying interest groups with a sample...... of lobbying actors coded according to different coding schemes. We systematically assess the performance of different schemes by comparing how actor types in the different schemes differ with respect to a number of background characteristics. This is done in a two-stage approach where we first cluster actors...

  4. General and Local: Averaged k-Dependence Bayesian Classifiers

    Directory of Open Access Journals (Sweden)

    Limin Wang

    2015-06-01

    Full Text Available The inference of a general Bayesian network has been shown to be an NP-hard problem, even for approximate solutions. Although the k-dependence Bayesian (KDB) classifier can construct at arbitrary points (values of k) along the attribute dependence spectrum, it cannot identify the changes of interdependencies when attributes take different values. Local KDB, which learns in the framework of KDB, is proposed in this study to describe the local dependencies implicated in each test instance. Based on the analysis of functional dependencies, substitution-elimination resolution, a new type of semi-naive Bayesian operation, is proposed to substitute or eliminate generalization to achieve accurate estimation of conditional probability distribution while reducing computational complexity. The final classifier, averaged k-dependence Bayesian (AKDB) classifiers, will average the output of KDB and local KDB. Experimental results on the repository of machine learning databases from the University of California Irvine (UCI) showed that AKDB has significant advantages in zero-one loss and bias relative to naive Bayes (NB), tree augmented naive Bayes (TAN), Averaged one-dependence estimators (AODE), and KDB. Moreover, KDB and local KDB show mutually complementary characteristics with respect to variance.

  5. Defense Logistics Agency Revenue Eliminations

    National Research Council Canada - National Science Library

    1996-01-01

    The issue of revenue eliminations was identified during our work on the Defense Logistics Agency portion of the Audit of Revenue Accounts in the FY 1996 Financial Statements of the Defense Business Operations Fund...

  6. A History of Classified Activities at Oak Ridge National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Quist, A.S.

    2001-01-30

    The facilities that became Oak Ridge National Laboratory (ORNL) were created in 1943 during the United States' super-secret World War II project to construct an atomic bomb (the Manhattan Project). During World War II and for several years thereafter, essentially all ORNL activities were classified. Now, in 2000, essentially all ORNL activities are unclassified. The major purpose of this report is to provide a brief history of ORNL's major classified activities from 1943 until the present (September 2000). This report is expected to be useful to the ORNL Classification Officer and to ORNL's Authorized Derivative Classifiers and Authorized Derivative Declassifiers in their classification review of ORNL documents, especially those documents that date from the 1940s and 1950s.

  7. Conceptualizing cancer drugs as classifiers.

    Directory of Open Access Journals (Sweden)

    Patrick Nathan Lawlor

    Full Text Available Cancer and healthy cells have distinct distributions of molecular properties and thus respond differently to drugs. Cancer drugs ideally kill cancer cells while limiting harm to healthy cells. However, the inherent variance among cells in both cancer and healthy cell populations increases the difficulty of selective drug action. Here we formalize a classification framework based on the idea that an ideal cancer drug should maximally discriminate between cancer and healthy cells. More specifically, this discrimination should be performed on the basis of measurable cell markers. We divide the problem into three parts which we explore with examples. First, molecular markers should discriminate cancer cells from healthy cells at the single-cell level. Second, the effects of drugs should be statistically predicted by these molecular markers. Third, drugs should be optimized for classification performance. We find that expression levels of a handful of genes suffice to discriminate well between individual cells in cancer and healthy tissue. We also find that gene expression predicts the efficacy of some cancer drugs, suggesting that these cancer drugs act as suboptimal classifiers using gene profiles. Finally, we formulate a framework that defines an optimal drug, and predicts drug cocktails that may target cancer more accurately than the individual drugs alone. Conceptualizing cancer drugs as solving a discrimination problem in the high-dimensional space of molecular markers promises to inform the design of new cancer drugs and drug cocktails.
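
    A toy illustration of the drug-as-classifier viewpoint: a drug's response is modelled as a score over gene-expression markers, and its quality is measured by how well that score separates cancer from healthy cells (ROC AUC). All data below are synthetic and the single-marker versus multi-marker comparison is purely illustrative.

      # "Drug as classifier": a drug's dose-response is modelled as a score over
      # (synthetic) gene-expression markers; the better the score separates cancer
      # cells (label 1) from healthy cells (label 0), the better the "drug".
      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score
      from sklearn.model_selection import train_test_split

      X, y = make_classification(n_samples=2000, n_features=50, n_informative=5,
                                 class_sep=1.5, random_state=1)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=1)

      # "Drug A": acts through a single marker gene (column 0).
      auc0 = roc_auc_score(y_te, X_te[:, 0])
      auc_single = max(auc0, 1 - auc0)             # orientation-agnostic for a raw marker

      # "Optimised cocktail": a linear combination of markers fitted to discriminate.
      clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
      auc_combo = roc_auc_score(y_te, clf.decision_function(X_te))

      print(f"single-marker drug AUC: {auc_single:.2f}")
      print(f"optimised cocktail AUC: {auc_combo:.2f}")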

  8. Hierarchical mixtures of naive Bayes classifiers

    OpenAIRE

    Wiering, M.A.

    2002-01-01

    Naive Bayes classifiers tend to perform very well on a large number of problem domains, although their representation power is quite limited compared to more sophisticated machine learning algorithms. In this paper we study combining multiple naive Bayes classifiers by using the hierarchical mixtures of experts system. This system, which we call hierarchical mixtures of naive Bayes classifiers, is compared to a simple naive Bayes classifier and to using bagging and boosting for combining ...

  9. Toward the Elimination of Paper Orders

    Science.gov (United States)

    Ramirez, Ricardo; Webster, S. Luke

    2016-01-01

    Summary With the adoption of Computerized Patient Order Entry (CPOE), many physicians – particularly consultants and those who are affiliated with multiple hospital systems – are faced with the challenge of learning to navigate and commit to memory the details of multiple EHRs and CPOE software modules. These physicians may resist CPOE adoption, and their refusal to use CPOE presents a risk to patient safety when paper and electronic orders co-exist, as paper orders generated in an electronic ordering environment can be missed or acted upon after delay, are frequently illegible, and bypass the Clinical Decision Support (CDS) that is part of the evidence-based value of CPOE. We defined a category of CPOE Low Frequency Users (LFUs) – physicians issuing a total of less than 10 orders per month – and found that 50.4% of all physicians issuing orders in 3 urban/suburban hospitals were LFUs and actively issuing orders across all shifts and days of the week. Data are presented for 2013 on the number of LFUs by month, day of week, shift and facility, over 2.3 million orders issued. A menu of 6 options to assist LFUs in the use of CPOE, from which hospital leaders could select, was instituted so that paper orders could be increasingly eliminated. The options, along with their cost implications, are described, as is the initial option selected by hospital leaders. In practice, however, a mixed pattern involving several LFU support options emerged. We review data on how the option mix selected may have impacted CPOE adoption and physician use rates at the facilities. The challenge of engaging LFU physicians in CPOE adoption may be common in moderately sized hospitals, and these options can be deployed by other systems in advancing CPOE pervasiveness of use and the eventual elimination of paper orders. PMID:27081405

  10. 15 CFR 4.8 - Classified Information.

    Science.gov (United States)

    2010-01-01

    ... 15 Commerce and Foreign Trade 1 2010-01-01 2010-01-01 false Classified Information. 4.8 Section 4... INFORMATION Freedom of Information Act § 4.8 Classified Information. In processing a request for information..., the information shall be reviewed to determine whether it should remain classified. Ordinarily the...

  11. Aggregation Operator Based Fuzzy Pattern Classifier Design

    DEFF Research Database (Denmark)

    Mönks, Uwe; Larsen, Henrik Legind; Lohweg, Volker

    2009-01-01

    This paper presents a novel modular fuzzy pattern classifier design framework for intelligent automation systems, developed on the base of the established Modified Fuzzy Pattern Classifier (MFPC) and allows designing novel classifier models which are hardware-efficiently implementable. The perfor...

  12. The economics of malaria control and elimination: a systematic review.

    Science.gov (United States)

    Shretta, Rima; Avanceña, Anton L V; Hatefi, Arian

    2016-12-12

    Declining donor funding and competing health priorities threaten the sustainability of malaria programmes. Elucidating the cost and benefits of continued investments in malaria could encourage sustained political and financial commitments. The evidence, although available, remains disparate. This paper reviews the existing literature on the economic and financial cost and return of malaria control, elimination and eradication. A review of articles that were published on or before September 2014 on the cost and benefits of malaria control and elimination was performed. Studies were classified based on their scope and were analysed according to two major categories: cost of malaria control and elimination to a health system, and cost-benefit studies. Only studies involving more than two control or elimination interventions were included. Outcomes of interest were total programmatic cost, cost per capita, and benefit-cost ratios (BCRs). All costs were converted to 2013 US$ for standardization. Of the 6425 articles identified, 54 studies were included in this review. Twenty-two were focused on elimination or eradication while 32 focused on intensive control. Forty-eight per cent of studies included in this review were published on or after 2000. Overall, the annual per capita cost of malaria control to a health system ranged from $0.11 to $39.06 (median: $2.21) while that for malaria elimination ranged from $0.18 to $27 (median: $3.00). BCRs of investing in malaria control and elimination ranged from 2.4 to over 145. Overall, investments needed for malaria control and elimination varied greatly amongst the various countries and contexts. In most cases, the cost of elimination was greater than the cost of control. At the same time, the benefits of investing in malaria greatly outweighed the costs. While the cost of elimination in most cases was greater than the cost of control, the benefits greatly outweighed the cost. Information from this review provides guidance to

  13. 78 FR 72676 - Draft National Pollutant Discharge Elimination System (NPDES) General Permit for Stormwater...

    Science.gov (United States)

    2013-12-03

    ... AGENCY Draft National Pollutant Discharge Elimination System (NPDES) General Permit for Stormwater... Pollutant Discharge Elimination System (NPDES) general permit for stormwater discharges from industrial... permit covering stormwater discharges from industrial facilities in EPA's Regions 1, 2, 3, 5, 6, 9, and...

  14. Facilities & Leadership

    Data.gov (United States)

    Department of Veterans Affairs — The facilities web service provides VA facility information. The VA facilities locator is a feature that is available across the enterprise, on any webpage, for the...

  15. Biochemistry Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Biochemistry Facility provides expert services and consultation in biochemical enzyme assays and protein purification. The facility currently features 1) Liquid...

  16. Error minimizing algorithms for nearest neighbor classifiers

    Energy Technology Data Exchange (ETDEWEB)

    Porter, Reid B [Los Alamos National Laboratory]; Hush, Don [Los Alamos National Laboratory]; Zimmer, G. Beate [Texas A&M]

    2011-01-03

    Stack Filters define a large class of discrete nonlinear filters first introduced in image and signal processing for noise removal. In recent years we have suggested their application to classification problems, and investigated their relationship to other types of discrete classifiers such as Decision Trees. In this paper we focus on a continuous domain version of Stack Filter Classifiers which we call Ordered Hypothesis Machines (OHM), and investigate their relationship to Nearest Neighbor classifiers. We show that OHM classifiers provide a novel framework in which to train Nearest Neighbor type classifiers by minimizing empirical error based loss functions. We use the framework to investigate a new cost sensitive loss function that allows us to train a Nearest Neighbor type classifier for low false alarm rate applications. We report results on both synthetic data and real-world image data.
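
    The sketch below is not the OHM construction itself; it only illustrates the final point, steering a nearest-neighbour-type decision toward low false-alarm rates by penalising false alarms more heavily than misses. The cost value and the toy data are assumptions.

      # Plain k-nearest-neighbour voting with an asymmetric cost: predict the
      # target class only when the expected cost of doing so is lower than the
      # cost of predicting background. false_alarm_cost is an assumed value.
      import numpy as np

      def knn_predict(X_train, y_train, X_test, k=7, false_alarm_cost=3.0):
          preds = []
          for x in X_test:
              d = np.linalg.norm(X_train - x, axis=1)
              nn = y_train[np.argsort(d)[:k]]
              p1 = nn.mean()                             # neighbourhood estimate of P(y=1|x)
              cost_pred1 = (1 - p1) * false_alarm_cost   # expected cost of a false alarm
              cost_pred0 = p1 * 1.0                      # expected cost of a miss
              preds.append(int(cost_pred1 < cost_pred0))
          return np.array(preds)

      rng = np.random.default_rng(0)
      X = np.vstack([rng.normal(0.0, 1.0, (200, 2)),     # background class (y=0)
                     rng.normal(1.5, 1.0, (200, 2))])    # target class (y=1)
      y = np.r_[np.zeros(200), np.ones(200)].astype(int)

      yhat = knn_predict(X, y, X)                  # resubstitution, for illustration only
      print(f"false-alarm rate: {yhat[y == 0].mean():.2f}, "
            f"detection rate: {yhat[y == 1].mean():.2f}")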

  17. A Novel Cascade Classifier for Automatic Microcalcification Detection.

    Directory of Open Access Journals (Sweden)

    Seung Yeon Shin

    Full Text Available In this paper, we present a novel cascaded classification framework for automatic detection of individual and clusters of microcalcifications (μC). Our framework comprises three classification stages: (i) a random forest (RF) classifier for simple features capturing the second order local structure of individual μCs, where non-μC pixels in the target mammogram are efficiently eliminated; (ii) a more complex discriminative restricted Boltzmann machine (DRBM) classifier for μC candidates determined in the RF stage, which automatically learns the detailed morphology of μC appearances for improved discriminative power; and (iii) a detector to detect clusters of μCs from the individual μC detection results, using two different criteria. From the two-stage RF-DRBM classifier, we are able to distinguish μCs using explicitly computed features, as well as learn implicit features that are able to further discriminate between confusing cases. Experimental evaluation is conducted on the original Mammographic Image Analysis Society (MIAS) and mini-MIAS databases, as well as our own Seoul National University Bundang Hospital digital mammographic database. It is shown that the proposed method outperforms comparable methods in terms of receiver operating characteristic (ROC) and precision-recall curves for detection of individual μCs and free-response receiver operating characteristic (FROC) curve for detection of clustered μCs.
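
    A minimal sketch of the cascade pattern follows: a fast random forest removes obvious negatives at a high-recall threshold and a second, stronger model re-scores the survivors. Scikit-learn has no discriminative RBM, so gradient boosting stands in for the DRBM stage, and both thresholds are illustrative assumptions.

      # Two-stage cascade on an imbalanced synthetic problem: a random forest
      # removes obvious negatives at a low (high-recall) threshold, then gradient
      # boosting (standing in for the DRBM) re-scores the surviving candidates.
      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
      from sklearn.model_selection import train_test_split

      X, y = make_classification(n_samples=4000, n_features=20, weights=[0.95, 0.05],
                                 random_state=0)   # rare positives, akin to candidate pixels
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

      stage1 = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_tr, y_tr)
      keep = stage1.predict_proba(X_te)[:, 1] > 0.1       # assumed high-recall threshold

      stage2 = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
      final = np.zeros(len(y_te), dtype=int)
      final[keep] = (stage2.predict_proba(X_te[keep])[:, 1] > 0.5).astype(int)

      print("passed to stage 2:", int(keep.sum()), "of", len(y_te))
      print("final detections:", int(final.sum()),
            "true positives:", int((final & y_te).sum()))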

  18. A fuzzy classifier system for process control

    Science.gov (United States)

    Karr, C. L.; Phillips, J. C.

    1994-01-01

    A fuzzy classifier system that discovers rules for controlling a mathematical model of a pH titration system was developed by researchers at the U.S. Bureau of Mines (USBM). Fuzzy classifier systems successfully combine the strengths of learning classifier systems and fuzzy logic controllers. Learning classifier systems resemble familiar production rule-based systems, but they represent their IF-THEN rules by strings of characters rather than in the traditional linguistic terms. Fuzzy logic is a tool that allows for the incorporation of abstract concepts into rule-based systems, thereby allowing the rules to resemble the familiar 'rules-of-thumb' commonly used by humans when solving difficult process control and reasoning problems. Like learning classifier systems, fuzzy classifier systems employ a genetic algorithm to explore and sample new rules for manipulating the problem environment. Like fuzzy logic controllers, fuzzy classifier systems encapsulate knowledge in the form of production rules. The results presented in this paper demonstrate the ability of fuzzy classifier systems to generate a fuzzy logic-based process control system.
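
    To make the rule mechanics concrete, the sketch below evaluates a few hand-written Mamdani-style fuzzy rules for a pH-setpoint problem, with triangular membership functions and weighted-average defuzzification. The membership functions and rules are invented for illustration; they are not the rules discovered by the USBM system.

      # Hand-written fuzzy rules for a pH-setpoint problem:
      # IF error is negative THEN add acid; IF error is near zero THEN hold;
      # IF error is positive THEN add base.
      import numpy as np

      def tri(x, a, b, c):
          """Triangular membership function peaking at b with support [a, c]."""
          return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

      def fuzzy_control(ph, setpoint=7.0):
          error = setpoint - ph
          # Antecedent memberships: how negative / near zero / positive the error is.
          mu = np.array([tri(error, -4, -2, 0), tri(error, -1, 0, 1), tri(error, 0, 2, 4)])
          actions = np.array([-1.0, 0.0, +1.0])    # consequents: add acid, hold, add base
          if mu.sum() == 0.0:
              return 0.0
          return float(np.dot(mu, actions) / mu.sum())   # weighted-average defuzzification

      for ph in (4.0, 6.5, 7.0, 9.0):
          print(f"pH={ph:.1f} -> control action {fuzzy_control(ph):+.2f}")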

  19. Data characteristics that determine classifier performance

    CSIR Research Space (South Africa)

    Van der Walt, Christiaan M

    2006-11-01

    Full Text Available The relationship between the distribution of data, on the one hand, and classifier performance, on the other, for non-parametric classifiers has been studied. It is shown that predictable factors such as the available amount of training data...

  20. Hierarchical mixtures of naive Bayes classifiers

    NARCIS (Netherlands)

    Wiering, M.A.

    2002-01-01

    Naive Bayes classifiers tend to perform very well on a large number of problem domains, although their representation power is quite limited compared to more sophisticated machine learning algorithms. In this paper we study combining multiple naive Bayes classifiers by using the hierarchical

  1. Deconvolution When Classifying Noisy Data Involving Transformations

    KAUST Repository

    Carroll, Raymond

    2012-09-01

    In the present study, we consider the problem of classifying spatial data distorted by a linear transformation or convolution and contaminated by additive random noise. In this setting, we show that classifier performance can be improved if we carefully invert the data before the classifier is applied. However, the inverse transformation is not constructed so as to recover the original signal, and in fact, we show that taking the latter approach is generally inadvisable. We introduce a fully data-driven procedure based on cross-validation, and use several classifiers to illustrate numerical properties of our approach. Theoretical arguments are given in support of our claims. Our procedure is applied to data generated by light detection and ranging (Lidar) technology, where we improve on earlier approaches to classifying aerosols. This article has supplementary materials online.
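
    A minimal sketch of the underlying idea, on one-dimensional synthetic signals: observations are blurred by a known kernel plus noise, a regularised (Wiener-style) inverse filter partially undoes the blur, and classification is done on the filtered signals. The fixed regularisation constant is an assumption; the paper chooses the inversion in a data-driven, cross-validated way.

      # Signals are blurred by a known kernel plus noise; a regularised
      # (Wiener-style) inverse filter partially undoes the blur before a
      # nearest-centroid classifier is applied. reg is a fixed assumption.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 128
      t = np.linspace(0, 1, n)
      templates = [np.sin(2 * np.pi * 3 * t),
                   np.sign(np.sin(2 * np.pi * 3 * t))]   # two signal classes

      kernel = np.exp(-0.5 * ((np.arange(n) - n // 2) / 3.0) ** 2)
      kernel /= kernel.sum()
      K = np.fft.fft(np.fft.ifftshift(kernel))           # transfer function of the blur

      def blur_and_noise(sig):
          return np.real(np.fft.ifft(np.fft.fft(sig) * K)) + rng.normal(0, 0.2, n)

      def wiener_inverse(obs, reg=1e-2):
          Of = np.fft.fft(obs)
          return np.real(np.fft.ifft(Of * np.conj(K) / (np.abs(K) ** 2 + reg)))

      # Small training set of filtered observations from each class.
      X, y = [], []
      for label, tmpl in enumerate(templates):
          for _ in range(50):
              X.append(wiener_inverse(blur_and_noise(tmpl)))
              y.append(label)
      X, y = np.array(X), np.array(y)

      centroids = np.array([X[y == c].mean(axis=0) for c in (0, 1)])
      test = wiener_inverse(blur_and_noise(templates[1]))
      print("predicted class:", int(np.argmin(np.linalg.norm(centroids - test, axis=1))))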

  2. Surveillance considerations for malaria elimination

    Directory of Open Access Journals (Sweden)

    Barclay Victoria C

    2012-08-01

    Full Text Available Abstract Constant malaria monitoring and surveillance systems have been highlighted as critical for malaria elimination. The absence of robust monitoring and surveillance systems able to respond to outbreaks in a timely manner undeniably contributed to the failure of the last global attempt to eradicate malaria. Today, technological advances could allow for rapid detection of focal outbreaks and improved deployment of diagnostic and treatment supplies to areas needing support. However, optimizing diffusion activities (e.g., distributing vector controls and medicines, as well as deploying behaviour change campaigns) requires networks of diverse scholars to monitor, learn, and evaluate data and multiple organizations to coordinate their intervention activities. Surveillance systems that can gather, store and process information, from communities to national levels, in a centralized, widely accessible system will allow tailoring of surveillance and intervention efforts. Different systems, and thus reactions, will be effective in different endemic, geographical or socio-cultural contexts. Investing in carefully designed monitoring technologies, built for a multiple-actor, dynamic system, will help to improve malaria elimination efforts by improving the coordination, timing, coverage, and deployment of malaria technologies.

  3. Surveillance considerations for malaria elimination.

    Science.gov (United States)

    Barclay, Victoria C; Smith, Rachel A; Findeis, Jill L

    2012-08-31

    Constant malaria monitoring and surveillance systems have been highlighted as critical for malaria elimination. The absence of robust monitoring and surveillance systems able to respond to outbreaks in a timely manner undeniably contributed to the failure of the last global attempt to eradicate malaria. Today, technological advances could allow for rapid detection of focal outbreaks and improved deployment of diagnostic and treatment supplies to areas needing support. However, optimizing diffusion activities (e.g., distributing vector controls and medicines, as well as deploying behaviour change campaigns) requires networks of diverse scholars to monitor, learn, and evaluate data and multiple organizations to coordinate their intervention activities. Surveillance systems that can gather, store and process information, from communities to national levels, in a centralized, widely accessible system will allow tailoring of surveillance and intervention efforts. Different systems, and thus reactions, will be effective in different endemic, geographical or socio-cultural contexts. Investing in carefully designed monitoring technologies, built for a multiple-actor, dynamic system, will help to improve malaria elimination efforts by improving the coordination, timing, coverage, and deployment of malaria technologies.

  4. Identifying Malaria Transmission Foci for Elimination Using Human Mobility Data.

    Science.gov (United States)

    Ruktanonchai, Nick W; DeLeenheer, Patrick; Tatem, Andrew J; Alegana, Victor A; Caughlin, T Trevor; Zu Erbach-Schoenberg, Elisabeth; Lourenço, Christopher; Ruktanonchai, Corrine W; Smith, David L

    2016-04-01

    Humans move frequently and tend to carry parasites among areas with endemic malaria and into areas where local transmission is unsustainable. Human-mediated parasite mobility can thus sustain parasite populations in areas where they would otherwise be absent. Data describing human mobility and malaria epidemiology can help classify landscapes into parasite demographic sources and sinks, ecological concepts that have parallels in malaria control discussions of transmission foci. By linking transmission to parasite flow, it is possible to stratify landscapes for malaria control and elimination, as sources are disproportionately important to the regional persistence of malaria parasites. Here, we identify putative malaria sources and sinks for pre-elimination Namibia using malaria parasite rate (PR) maps and call data records from mobile phones, using a steady-state analysis of a malaria transmission model to infer where infections most likely occurred. We also examined how the landscape of transmission and burden changed from the pre-elimination setting by comparing the location and extent of predicted pre-elimination transmission foci with modeled incidence for 2009. This comparison suggests that while transmission was spatially focal pre-elimination, the spatial distribution of cases changed as burden declined. The changing spatial distribution of burden could be due to importation, with cases focused around importation hotspots, or due to heterogeneous application of elimination effort. While this framework is an important step towards understanding progressive changes in malaria distribution and the role of subnational transmission dynamics in a policy-relevant way, future work should account for international parasite movement, utilize real time surveillance data, and relax the steady state assumption required by the presented model.

  5. Logarithmic learning for generalized classifier neural network.

    Science.gov (United States)

    Ozyildirim, Buse Melis; Avci, Mutlu

    2014-12-01

    Generalized classifier neural network has been introduced as an efficient classifier among others. Unless the initial smoothing parameter value is close to the optimal one, the generalized classifier neural network suffers from convergence problems and requires quite a long time to converge. In this work, to overcome this problem, a logarithmic learning approach is proposed. The proposed method uses a logarithmic cost function instead of the squared error. Minimization of this cost function reduces the number of iterations needed to reach the minimum. The proposed method is tested on 15 different data sets and the performance of the logarithmic-learning generalized classifier neural network is compared with that of the standard one. Thanks to the operating range of the radial basis function used in the generalized classifier neural network, the proposed logarithmic cost and its derivative take continuous values. This allows the proposed learning method to exploit the fast convergence of the logarithmic cost. Due to this fast convergence, training time is reduced by up to 99.2%. In addition to the decrease in training time, classification performance may also be improved by up to 60%. According to the test results, the proposed method not only provides a solution to the convergence-time problem of the generalized classifier neural network but may also improve classification accuracy. The proposed method can therefore be considered an efficient way of reducing the training-time requirement of the generalized classifier neural network. Copyright © 2014 Elsevier Ltd. All rights reserved.
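
    The paper's exact cost function is not reproduced here; the sketch below uses cross-entropy as a standard logarithmic alternative to squared error and shows why such costs tend to converge in fewer iterations: for a saturated, badly wrong sigmoid output the squared-error gradient nearly vanishes while the logarithmic cost keeps a usable gradient.

      # Cross-entropy as the logarithmic alternative to squared error for a single
      # sigmoid output: for a saturated, wrong output the squared-error gradient
      # nearly vanishes, while the log cost keeps a usable gradient.
      import numpy as np

      def sigmoid(z):
          return 1.0 / (1.0 + np.exp(-z))

      t = 1.0                          # target
      for z in (-6.0, -2.0, 0.0):      # pre-activation; very negative = saturated and wrong
          y = sigmoid(z)
          grad_sq = (y - t) * y * (1 - y)   # d/dz of 0.5*(y - t)^2 through the sigmoid
          grad_ce = (y - t)                 # d/dz of cross-entropy through the sigmoid
          print(f"z={z:+.1f}  output={y:.3f}  squared-error grad={grad_sq:+.4f}  "
                f"log-cost grad={grad_ce:+.4f}")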

  6. A CLASSIFIER SYSTEM USING SMOOTH GRAPH COLORING

    Directory of Open Access Journals (Sweden)

    JORGE FLORES CRUZ

    2017-01-01

    Full Text Available Unsupervised classifiers allow clustering with little or no human intervention; it is therefore desirable to group the set of items with as little data processing as possible. This paper proposes an unsupervised classifier system using the model of soft graph coloring. The method was tested on some classic instances from the literature and the results obtained were compared with classifications made with human intervention, yielding results as good as or better than supervised classifiers and sometimes providing alternative classifications that consider additional information that humans did not consider.

  7. Prediction of Pork Quality by Fuzzy Support Vector Machine Classifier

    Science.gov (United States)

    Zhang, Jianxi; Yu, Huaizhi; Wang, Jiamin

    Existing objective methods to evaluate pork quality in general do not yield satisfactory results and their applications in the meat industry are limited. In this study, a fuzzy support vector machine (FSVM) method was developed to evaluate and predict pork quality rapidly and nondestructively. Firstly, the discrete wavelet transform (DWT) was used to eliminate the noise component in the original spectrum and the new spectrum was reconstructed. Then, considering that the characteristic variables are still correlated and contain some redundant information, principal component analysis (PCA) was carried out. Lastly, FSVM was developed to differentiate and classify pork samples into different quality grades using the features from PCA. Jackknife tests on the working datasets indicated that the prediction accuracies were higher than those of other methods.
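
    The described pipeline can be sketched on synthetic spectra as below: wavelet denoising, then PCA, then an SVM. A plain SVC stands in for the fuzzy SVM, and the wavelet family, decomposition level and threshold are assumptions.

      # DWT denoising -> PCA -> SVM on synthetic spectra; SVC stands in for the
      # fuzzy SVM. Wavelet family, level and threshold are assumed values.
      import numpy as np
      import pywt
      from sklearn.decomposition import PCA
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.svm import SVC

      rng = np.random.default_rng(0)
      n_samples, n_points = 120, 256
      x = np.linspace(0, 1, n_points)
      grades = rng.integers(0, 3, n_samples)       # three "quality grades"
      # Each grade shifts a broad spectral peak; noise mimics instrument variation.
      X = np.stack([np.exp(-((x - 0.3 - 0.1 * g) ** 2) / 0.01)
                    + rng.normal(0, 0.15, n_points) for g in grades])

      def dwt_denoise(signal, wavelet="db4", level=4, thresh=0.1):
          coeffs = pywt.wavedec(signal, wavelet, level=level)
          coeffs = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
          return pywt.waverec(coeffs, wavelet)[:len(signal)]

      X_denoised = np.array([dwt_denoise(s) for s in X])
      model = make_pipeline(PCA(n_components=10), SVC(kernel="rbf", C=10.0))
      print("cross-validated accuracy:",
            cross_val_score(model, X_denoised, grades, cv=5).mean())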

  8. NPDES (National Pollution Discharge & Elimination System) Minor Dischargers

    Science.gov (United States)

    As authorized by the Clean Water Act, the National Pollutant Discharge Elimination System (NPDES) permit program controls water pollution by regulating point sources that discharge pollutants into waters of the United States. The NPDES permit program regulates direct discharges from municipal and industrial wastewater treatment facilities that discharge directly into surface waters. The NPDES permit program is part of the Permit Compliance System (PCS) which issues, records, tracks, and regulates point source discharge facilities. Individual homes that are connected to a municipal system, use a septic system, or do not have a surface discharge do not need an NPDES permit. Facilities in PCS are identified as either major or minor. Within the major/minor classification, facilities are grouped into municipals or non-municipals. In many cases, non-municipals are industrial facilities. This data layer contains Minor dischargers. Major municipal dischargers include all facilities with design flows of greater than one million gallons per day; minor dischargers are less than one million gallons per day. Essentially, a minor discharger does not meet the discharge criteria for a major. Since its introduction in 1972, the NPDES permit program has been responsible for significant improvements to our Nation's water quality.

  9. Survival Processing Eliminates Collaborative Inhibition.

    Science.gov (United States)

    Reysen, Matthew B; Bliss, Heather; Baker, Melissa A

    2017-04-11

    The present experiments examined the effect of processing words for their survival value, relevance to moving, and pleasantness on participants' free recall scores in both nominal groups (non-redundant pooled individual scores) and collaborative dyads. Overall, participants recalled more words in the survival processing conditions than in the moving and pleasantness processing conditions. Furthermore, nominal groups in both the pleasantness condition (Experiment 1) and the moving and pleasantness conditions (Experiment 2) recalled more words than collaborative groups, thereby replicating the oft-observed effect of collaborative inhibition. However, processing words for their survival value appeared to eliminate the deleterious effects of collaborative remembering in both Experiments 1 and 2. These results are discussed in the context of the retrieval strategy disruption hypothesis and the effects of both expertise and collaborative skill on group remembering.

  10. Leprosy elimination: A myth busted

    Directory of Open Access Journals (Sweden)

    Nidhi Yadav

    2014-01-01

    Full Text Available Background: Leprosy is a chronic infectious disease caused by Mycobacterium leprae. The disease mainly affects the skin, the peripheral nerves, the mucosa of the upper respiratory tract and the eyes. Although the target of leprosy elimination was achieved at the national level in 2006, a large proportion of the leprosy cases reported globally still originate from India. Aim and Objective: To study the clinico-epidemiological profile of new cases of leprosy in a rural tertiary hospital. Materials and Methods: Thirty-five newly diagnosed cases of leprosy presenting to the out-patient clinic of, or admitted to, the department of Dermatology, Venereology and Leprosy (between September 2012 and August 2013) were included in the study. A detailed history regarding leprosy, assessment of deformity and sensory loss, skin smear for AFB and histopathological examination were done in every patient. Results: The incidence was highest in the age groups of 20 to 39 years (48.57%) and 40 to 59 years (37.14%); 68.57% were males. Facial deformity was found in 48.57% of cases, with ear lobe thickening the predominant form. The ulnar (88.87%) and common peroneal (34.28%) nerves were the most commonly involved. The split skin smear examination was positive in 27 out of 35 cases. On histopathological examination, 10 patients (28.57%) were of the lepromatous pole (LL), 4 (11.43%) were indeterminate, 6 (17.14%) were of the tuberculoid type (TT), 4 (11.4%) were BT and 1 (2.8%) was BL. Conclusions: This study supports the conclusion that leprosy has still not been eliminated. Active surveillance is still needed to detect sub-clinical and undiagnosed cases.

  11. Elimination communication as colic therapy.

    Science.gov (United States)

    Jordan, Geraldine J

    2014-09-01

    Colic is generally defined as excessive crying in early infancy and can have negative consequences on the infant as well as on the infant's family life. Excessive crying can result in escalating parental stress levels, abusive caregiver response, increased risk of shaken baby syndrome and parental postpartum depression. In addition to excessive crying, symptoms and descriptors of infant colic include inconsolable crying, screaming, legs drawn up against the abdomen, furrowing of eyebrows, distended abdomen, arched back, passing gas, post-feeding crying and difficulty defecating. There are few well-designed, reproducible, randomized, large-scale studies which demonstrate efficacy of any therapeutic method for colic. An unexplored etiology is that colic is functionally related to a decrease in stooling frequency. Gut distention may periodically result in intensifying discomfort for the infant and in concomitant inconsolable crying. Elimination communication (EC; also known as Natural Infant Hygiene and sometimes referred to as infant potty training, baby-led potty training or assisted infant toilet training) involves the use of cues by which the infant signals to the caregiver that the infant needs to micturate or defecate. Such cues can include types of crying, squirming, straining, wriggling, grimacing, fussing, vocalizing, intent look at caregiver, red face, passing gas and grunting, many of which are the same initial symptoms related to the onset of colicky infant states. A caregiver's attentive and nurturant response to an infant's cues involve uncovering the infant's intergluteal cleft and cradling the infant gently and non-coercively in a supported, secure squatting position. This position will increase the infant's anorectal angle thus facilitating complete defecation. It is hypothesized that effective and timely elimination will cause increased physical comfort for the infant; colic symptoms will concomitantly decrease. Copyright © 2014 Elsevier Ltd. All

  12. Eliminating US hospital medical errors.

    Science.gov (United States)

    Kumar, Sameer; Steinebach, Marc

    2008-01-01

    Healthcare costs in the USA have continued to rise steadily since the 1980s. Medical errors are one of the major causes of deaths and injuries of thousands of patients every year, contributing to soaring healthcare costs. The purpose of this study is to examine what has been done to deal with the medical-error problem in the last two decades and present a closed-loop mistake-proof operation system for surgery processes that would likely eliminate preventable medical errors. The design method used is a combination of creating a service blueprint, implementing the six sigma DMAIC cycle, developing cause-and-effect diagrams as well as devising poka-yokes in order to develop a robust surgery operation process for a typical US hospital. In the improve phase of the six sigma DMAIC cycle, a number of poka-yoke techniques are introduced to prevent typical medical errors (identified through cause-and-effect diagrams) that may occur in surgery operation processes in US hospitals. It is the authors' assertion that implementing the new service blueprint along with the poka-yokes will likely improve the current medical-error rate to the six-sigma level. Additionally, designing as many redundancies as possible into the delivery of care will help reduce medical errors. Primary healthcare providers should strongly consider investing in adequate doctor and nurse staffing, and improving their education related to the quality of service delivery, to minimize clinical errors. This will lead to higher fixed costs, especially in the short term. This paper focuses additional attention on making a sound technical and business case for implementing six sigma tools to eliminate medical errors, which will enable hospital managers to increase their hospital's profitability in the long run while also ensuring patient safety.

  13. Classifying the conflict: a soldier's dilemma

    National Research Council Canada - National Science Library

    Carswell, Andrew J

    2009-01-01

    .... Classifying these various scenarios to determine the applicable international law is rendered difficult by both the lack of clarity inherent in the law and the political factors that tend to enter...

  14. Classifiers based on optimal decision rules

    KAUST Repository

    Amin, Talha

    2013-11-25

    Based on a dynamic programming approach we design algorithms for sequential optimization of exact and approximate decision rules relative to the length and coverage [3, 4]. In this paper, we use optimal rules to construct classifiers, and study two questions: (i) which rules are better from the point of view of classification: exact or approximate; and (ii) which order of optimization gives better classifier performance: length, length+coverage, coverage, or coverage+length. Experimental results show that, on average, classifiers based on exact rules are better than classifiers based on approximate rules, and sequential optimization (length+coverage or coverage+length) is better than the ordinary optimization (length or coverage).

  15. Combining multiple classifiers for age classification

    CSIR Research Space (South Africa)

    Van Heerden, C

    2009-11-01

    Full Text Available The authors compare several different classifier combination methods on a single task, namely speaker age classification. This task is well suited to combination strategies, since significantly different feature classes are employed. Support vector...

  16. Test Bias and the Elimination of Racism

    Science.gov (United States)

    Sedlacek, William E.

    1977-01-01

    Three types of test bias are discussed: content bias, atmosphere bias, and use bias. Use bias is considered the most important. Tests reflect the bias in society, and eliminating test bias means eliminating racism and sexism in society. A six-stage model to eliminate racism and sexism is presented. (Author)

  17. Waste Facilities

    Data.gov (United States)

    Vermont Center for Geographic Information — This dataset was developed from the Vermont DEC's list of certified solid waste facilities. It includes facility name, contact information, and the materials...

  18. Fabrication Facilities

    Data.gov (United States)

    Federal Laboratory Consortium — The Fabrication Facilities are a direct result of years of testing support. Through years of experience, the three fabrication facilities (Fort Hood, Fort Lewis, and...

  19. Quality Classifiers for Open Source Software Repositories

    OpenAIRE

    Tsatsaronis, George; Halkidi, Maria; Giakoumakis, Emmanouel A.

    2009-01-01

    Open Source Software (OSS) often relies on large repositories, like SourceForge, for initial incubation. The OSS repositories offer a large variety of meta-data providing interesting information about projects and their success. In this paper we propose a data mining approach for training classifiers on the OSS meta-data provided by such data repositories. The classifiers learn to predict the successful continuation of an OSS project. The `successfulness' of projects is defined in terms of th...

  20. Vision-based posture recognition using an ensemble classifier and a vote filter

    Science.gov (United States)

    Ji, Peng; Wu, Changcheng; Xu, Xiaonong; Song, Aiguo; Li, Huijun

    2016-10-01

    Posture recognition is a very important mode of Human-Robot Interaction (HRI). To segment an effective posture from an image, we propose an improved region-growing algorithm combined with the Single Gauss Color Model. The experiment shows that the improved region-growing algorithm can extract the posture more completely and accurately than the traditional Single Gauss Model and region-growing algorithm, while also eliminating similar regions from the background. For posture recognition, we propose a CNN ensemble classifier to improve the recognition rate, and a vote filter, applied to the sequence of recognition results, to reduce misjudgments during continuous gesture control. Compared with a single CNN classifier, the proposed CNN ensemble classifier yields a 96.27% recognition rate, and the proposed vote filter further improves the recognition results and reduces misjudgments during consecutive gesture switches.
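
    The vote filter is simple enough to sketch directly: the reported label is the majority vote over a short sliding window of per-frame predictions, which suppresses isolated misclassifications. The window length and the example label stream are assumptions.

      # Majority vote over a sliding window of per-frame posture labels; the
      # isolated 'fist' is suppressed and the switch to 'point' registers after
      # a short lag.
      from collections import Counter, deque

      def vote_filter(predictions, window=5):
          recent, smoothed = deque(maxlen=window), []
          for p in predictions:
              recent.append(p)
              smoothed.append(Counter(recent).most_common(1)[0][0])
          return smoothed

      raw = ["open", "open", "fist", "open", "open", "point", "point", "point", "point"]
      print(vote_filter(raw))
      # ['open', 'open', 'open', 'open', 'open', 'open', 'open', 'point', 'point']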

  1. REMOTE INTERVENTION TOWER ELIMINATION SYSTEM

    Energy Technology Data Exchange (ETDEWEB)

    Dave Murnane; Renauld Washington

    2002-02-15

    This Topical Report is presented to satisfy the reporting requirements in the Statement of Work, section J.5, page 120, of Department of Energy contract DE-AC26-01NT41093. The project does not contain any empirical research data. This report describes the assembly of commercial off-the-shelf (COTS) items configured in a unique manner to represent new and innovative technology in the service of size reduction and material handling at DOE sites, to assist in the D&D effort currently underway at the designated DOE Facilities.

  2. Costs and financial feasibility of malaria elimination

    Science.gov (United States)

    Sabot, Oliver; Cohen, Justin M; Hsiang, Michelle S; Kahn, James G; Basu, Suprotik; Tang, Linhua; Zheng, Bin; Gao, Qi; Zou, Linda; Tatarsky, Allison; Aboobakar, Shahina; Usas, Jennifer; Barrett, Scott; Cohen, Jessica L; Jamison, Dean T; Feachem, Richard GA

    2010-01-01

    Summary The marginal costs and benefits of converting malaria programmes from a control to an elimination goal are central to strategic decisions, but empirical evidence is scarce. We present a conceptual framework to assess the economics of elimination and analyse a central component of that framework—potential short-term to medium-term financial savings. After a review that showed a dearth of existing evidence, the net present value of elimination in five sites was calculated and compared with effective control. The probability that elimination would be cost-saving over 50 years ranged from 0% to 42%, with only one site achieving cost-savings in the base case. These findings show that financial savings should not be a primary rationale for elimination, but that elimination might still be a worthy investment if total benefits are sufficient to outweigh marginal costs. Robust research into these elimination benefits is urgently needed. PMID:21035839

  3. CLASSIFIED BY SUBJECT IN SPORT SCIENCES

    Directory of Open Access Journals (Sweden)

    Petar Protić

    2007-05-01

    Full Text Available High school and academic library users need precise classification and subject-access review of printed and electronic resources. In the library catalogue, the Universal Decimal Classification (UDC), similar to the Dewey system, classifies research and scientific areas, for example the subject areas 796 Sport and 371 Teaching. Nowadays, users need subjects structured by scientific discipline. Fully open library resources must be made available to users through a subject-access catalogue: taking the bachelor's degree theses at the Faculty of Physical Education in Novi Sad as an example, users reach for disciplines in a database with 36 indexes sorted by the first letters of their names (Athletics, Boxing, Cycling, etc.). This database has single and multiple indexes for each thesis. Users turn to the subject-access catalogue of this library in 80% of research cases.

  4. Classifiers of quantity and quality in Romanian

    Directory of Open Access Journals (Sweden)

    Mihaela Tănase-Dogaru

    2013-11-01

    Full Text Available The present paper proposes that classifiers in Romanian pertain to two distinct categories: classifiers of quantity or “massifiers” and classifiers of quality or “count-classifiers”, to borrow the terms from Cheng and Sybesma (1999). The first category is represented by the first nominal in a pseudopartitive construction of the type o bucată de brânză / a piece of cheese (Tănase-Dogaru 2009). The second category is represented by the first nominal in the so-called restrictive appositives, an example of which is Planeta Venus / the planet Venus (van Riemsdijk 1998, Cornilescu 2007).

  5. Region 9 NPDES Facilities - Waste Water Treatment Plants

    Science.gov (United States)

    Point geospatial dataset representing locations of NPDES Waste Water Treatment Plant Facilities. NPDES (National Pollution Discharge Elimination System) is an EPA permit program that regulates direct discharges from facilities that discharge treated waste water into waters of the US. Facilities are issued NPDES permits regulating their discharge as required by the Clean Water Act. A facility may have one or more outfalls (dischargers). The location represents the facility or operating plant.

  6. Region 9 NPDES Facilities 2012- Waste Water Treatment Plants

    Science.gov (United States)

    Point geospatial dataset representing locations of NPDES Waste Water Treatment Plant Facilities. NPDES (National Pollution Discharge Elimination System) is an EPA permit program that regulates direct discharges from facilities that discharge treated waste water into waters of the US. Facilities are issued NPDES permits regulating their discharge as required by the Clean Water Act. A facility may have one or more outfalls (dischargers). The location represents the facility or operating plant.

  7. Design of Robust Neural Network Classifiers

    DEFF Research Database (Denmark)

    Larsen, Jan; Andersen, Lars Nonboe; Hintz-Madsen, Mads

    1998-01-01

    This paper addresses a new framework for designing robust neural network classifiers. The network is optimized using the maximum a posteriori technique, i.e., the cost function is the sum of the log-likelihood and a regularization term (prior). In order to perform robust classification, we present...... a modified likelihood function which incorporates the potential risk of outliers in the data. This leads to the introduction of a new parameter, the outlier probability. Designing the neural classifier involves optimization of network weights as well as outlier probability and regularization parameters. We...
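
    One common way to fold an outlier probability into the likelihood is to mix the model's class posterior with a uniform distribution over classes; the sketch below shows the resulting per-sample negative log-likelihood. It is consistent with the description above but is not necessarily the authors' exact formulation.

      # Robust negative log-likelihood with an outlier probability eps: the class
      # posterior is mixed with a uniform distribution, so a confidently wrong
      # (possibly outlying or mislabelled) sample cannot dominate the cost.
      import numpy as np

      def robust_nll(probs, targets, eps=0.05):
          """probs: (n, C) softmax outputs; targets: (n,) class indices; eps: outlier prob."""
          n, C = probs.shape
          p_target = probs[np.arange(n), targets]
          mixed = (1.0 - eps) * p_target + eps / C      # outlier mass spread uniformly
          return -np.log(mixed).mean()

      probs = np.array([[0.90, 0.05, 0.05],
                        [0.01, 0.01, 0.98]])             # second sample: confident but wrong
      targets = np.array([0, 0])
      for eps in (0.0, 0.05, 0.2):
          print(f"eps={eps:.2f}  robust NLL={robust_nll(probs, targets, eps):.3f}")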

  8. A survey of decision tree classifier methodology

    Science.gov (United States)

    Safavian, S. R.; Landgrebe, David

    1991-01-01

    Decision tree classifiers (DTCs) are used successfully in many diverse areas such as radar signal classification, character recognition, remote sensing, medical diagnosis, expert systems, and speech recognition. Perhaps the most important feature of DTCs is their capability to break down a complex decision-making process into a collection of simpler decisions, thus providing a solution which is often easier to interpret. A survey of current methods is presented for DTC designs and the various existing issues. After considering potential advantages of DTCs over single-stage classifiers, subjects of tree structure design, feature selection at each internal node, and decision and search strategies are discussed.

  9. A survey of decision tree classifier methodology

    Science.gov (United States)

    Safavian, S. Rasoul; Landgrebe, David

    1990-01-01

    Decision Tree Classifiers (DTC's) are used successfully in many diverse areas such as radar signal classification, character recognition, remote sensing, medical diagnosis, expert systems, and speech recognition. Perhaps the most important feature of DTC's is their capability to break down a complex decision-making process into a collection of simpler decisions, thus providing a solution which is often easier to interpret. A survey of current methods is presented for DTC designs and the various existing issues. After considering potential advantages of DTC's over single stage classifiers, subjects of tree structure design, feature selection at each internal node, and decision and search strategies are discussed.

  10. Evaluating Microarray-based Classifiers: An Overview

    Directory of Open Access Journals (Sweden)

    M. Daumer

    2008-01-01

    Full Text Available For the last eight years, microarray-based class prediction has been the subject of numerous publications in medicine, bioinformatics and statistics journals. However, in many articles, the assessment of classification accuracy is carried out using suboptimal procedures and is not paid much attention. In this paper, we carefully review various statistical aspects of classifier evaluation and validation from a practical point of view. The main topics addressed are accuracy measures, error rate estimation procedures, variable selection, choice of classifiers and validation strategy.
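
    One frequently cited pitfall in this literature is performing variable selection on the full dataset and only then cross-validating the classifier, which inflates the accuracy estimate. The sketch below contrasts that with keeping selection inside each cross-validation fold via a pipeline; on pure-noise data the honest estimate should hover near chance.

      # Selection bias demonstration: selecting genes on the full dataset before
      # cross-validation inflates accuracy; a Pipeline keeps selection inside each
      # training fold. Data are pure noise, so the honest estimate is near 0.5.
      import numpy as np
      from sklearn.feature_selection import SelectKBest, f_classif
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import Pipeline

      rng = np.random.default_rng(0)
      X = rng.normal(size=(60, 5000))              # 60 "patients", 5000 noise "genes"
      y = rng.integers(0, 2, 60)

      # Biased: the selector has already seen the labels of every future test fold.
      X_sel = SelectKBest(f_classif, k=20).fit_transform(X, y)
      biased = cross_val_score(LogisticRegression(max_iter=1000), X_sel, y, cv=5).mean()

      # Honest: selection is refit from scratch inside every training fold.
      pipe = Pipeline([("select", SelectKBest(f_classif, k=20)),
                       ("clf", LogisticRegression(max_iter=1000))])
      honest = cross_val_score(pipe, X, y, cv=5).mean()

      print(f"selection outside CV: {biased:.2f}   selection inside CV: {honest:.2f}")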

  11. A Customizable Text Classifier for Text Mining

    Directory of Open Access Journals (Sweden)

    Yun-liang Zhang

    2007-12-01

    Full Text Available Text mining deals with complex and unstructured texts. Usually a particular collection of texts that is specific to one or more domains is necessary. We have developed a customizable text classifier for users to mine the collection automatically. It derives from the sentence category of the HNC theory and corresponding techniques. It can start with a few texts, and it can adjust automatically or be adjusted by the user. The user can also control the number of domains chosen and decide the standard with which to choose the texts based on demand and abundance of materials. The performance of the classifier varies with the user's choice.
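
    The sketch below is not the HNC-based method from the paper; it is only a minimal customizable starting point in the same spirit: the user seeds each domain with a few texts, a TF-IDF plus naive Bayes model is fitted, and new texts are routed to the best-matching domain. All example texts and domain names are made up.

      # Minimal customizable text classifier: a few seed texts per domain, TF-IDF
      # features, naive Bayes. All texts and domain names are invented.
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.naive_bayes import MultinomialNB
      from sklearn.pipeline import make_pipeline

      seed_texts = [
          ("finance", "quarterly earnings and stock market volatility"),
          ("finance", "central bank raises interest rates"),
          ("sports",  "the team won the championship final"),
          ("sports",  "the striker scored twice in the second half"),
      ]
      labels, texts = zip(*seed_texts)

      model = make_pipeline(TfidfVectorizer(), MultinomialNB())
      model.fit(texts, labels)                     # users extend seed_texts to customise

      print(model.predict(["the striker scored in the final minutes",
                           "interest rates and stock earnings fell"]))
      # expected: ['sports' 'finance']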

  12. Elimination of Onchocerciasis from Mexico.

    Directory of Open Access Journals (Sweden)

    Mario A Rodríguez-Pérez

    Full Text Available Mexico is one of the six countries formerly endemic for onchocerciasis in Latin America. Transmission has been interrupted in the three endemic foci of that country and mass drug distribution has ceased. Three years after mass drug distribution ended, post-treatment surveillance (PTS) surveys were undertaken which employed entomological indicators to check for transmission recrudescence. In-depth entomologic assessments were performed in 18 communities in the three endemic foci of Mexico. None of the 108,212 Simulium ochraceum s.l. collected from the three foci were found to contain parasite DNA when tested by polymerase chain reaction-enzyme-linked immunosorbent assay (PCR-ELISA), resulting in a maximum upper bound of the 95% confidence interval (95%-ULCI) of the infective rate in the vectors of 0.035/2,000 flies examined. This is an order of magnitude below the threshold of a 95%-ULCI of less than one infective fly per 2,000 flies tested, the current entomological criterion for interruption of transmission developed by the international community. The point estimate of seasonal transmission potential (STP) was zero, and the upper bound of the 95% confidence interval for the STP ranged from 1.2 to 1.7 L3/person/season in the different foci. This value is below all previous estimates for the minimum transmission potential required to maintain the parasite population. The results from the in-depth entomological post treatment surveillance surveys strongly suggest that transmission has not resumed in the three foci of Mexico during the three years since the last distribution of ivermectin occurred; it was concluded that transmission remains undetectable without intervention, and Onchocerca volvulus has been eliminated from Mexico.

  13. Elimination of Onchocerciasis from Mexico

    Science.gov (United States)

    Rodríguez-Pérez, Mario A.; Fernández-Santos, Nadia A.; Orozco-Algarra, María E.; Rodríguez-Atanacio, José A.; Domínguez-Vázquez, Alfredo; Rodríguez-Morales, Kristel B.; Real-Najarro, Olga; Prado-Velasco, Francisco G.; Cupp, Eddie W.; Richards, Frank O.; Hassan, Hassan K.; González-Roldán, Jesús F.; Kuri-Morales, Pablo A.; Unnasch, Thomas R.

    2015-01-01

    Background Mexico is one of the six countries formerly endemic for onchocerciasis in Latin America. Transmission has been interrupted in the three endemic foci of that country and mass drug distribution has ceased. Three years after mass drug distribution ended, post-treatment surveillance (PTS) surveys were undertaken which employed entomological indicators to check for transmission recrudescence. Methodology/Principal findings In-depth entomologic assessments were performed in 18 communities in the three endemic foci of Mexico. None of the 108,212 Simulium ochraceum s.l. collected from the three foci were found to contain parasite DNA when tested by polymerase chain reaction-enzyme-linked immunosorbent assay (PCR-ELISA), resulting in a maximum upper bound of the 95% confidence interval (95%-ULCI) of the infective rate in the vectors of 0.035/2,000 flies examined. This is an order of magnitude below the threshold of a 95%-ULCI of less than one infective fly per 2,000 flies tested, the current entomological criterion for interruption of transmission developed by the international community. The point estimate of seasonal transmission potential (STP) was zero, and the upper bound of the 95% confidence interval for the STP ranged from 1.2 to 1.7 L3/person/season in the different foci. This value is below all previous estimates for the minimum transmission potential required to maintain the parasite population. Conclusions/Significance The results from the in-depth entomological post treatment surveillance surveys strongly suggest that transmission has not resumed in the three foci of Mexico during the three years since the last distribution of ivermectin occurred; it was concluded that transmission remains undetectable without intervention, and Onchocerca volvulus has been eliminated from Mexico. PMID:26161558

  14. Classifying and quantifying basins of attraction

    Energy Technology Data Exchange (ETDEWEB)

    Sprott, J. C.; Xiong, Anda [Physics Department, University of Wisconsin-Madison, 1150 University Avenue, Madison, Wisconsin 53706 (United States)

    2015-08-15

    A scheme is proposed to classify the basins for attractors of dynamical systems in arbitrary dimensions. There are four basic classes depending on their size and extent, and each class can be further quantified to facilitate comparisons. The calculation uses a Monte Carlo method and is applied to numerous common dissipative chaotic maps and flows in various dimensions.
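
    A minimal sketch of the Monte Carlo ingredient mentioned above (not the authors' full classification scheme; the map, sampling box and thresholds are illustrative): sample initial conditions uniformly in a box around the Hénon attractor and estimate the fraction that remain bounded rather than escaping to infinity.

```python
# Monte Carlo estimate of the relative basin size of the Hénon attractor within
# a sampling box; escape threshold and iteration budget are illustrative choices.
import numpy as np

def henon(x, y, a=1.4, b=0.3):
    return 1.0 - a * x**2 + y, b * x

def in_basin(x, y, n_iter=500, escape=1e6):
    for _ in range(n_iter):
        x, y = henon(x, y)
        if x * x + y * y > escape:      # diverged: not in the basin
            return False
    return True                          # still bounded: counted as in the basin

rng = np.random.default_rng(1)
samples = rng.uniform(low=[-2.5, -2.5], high=[2.5, 2.5], size=(2000, 2))
frac = np.mean([in_basin(x, y) for x, y in samples])
print(f"estimated basin fraction of the sampling box: {frac:.3f}")
```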

  15. Feature selection based classifier combination approach for ...

    Indian Academy of Sciences (India)

    Conditional mutual information based feature selection, when driving the ensemble of classifiers, produces improved recognition results for most of the benchmarking datasets. The improvement is also observed with maximum relevance minimum redundancy based feature selection when used in combination with ensemble ...

  16. Classifying bicrossed products of two Taft algebras

    OpenAIRE

    Agore, A. L.

    2016-01-01

    We classify all Hopf algebras which factorize through two Taft algebras $\\mathbb{T}_{n^{2}}(\\bar{q})$ and respectively $T_{m^{2}}(q)$. To start with, all possible matched pairs between the two Taft algebras are described: if $\\bar{q} \

  17. A Classifier Learning Method through Data Summaries

    Science.gov (United States)

    Suematsu, Nobuo; Nakayasu, Toshiko; Hayashi, Akira

    Knowledge discovery in databases (KDD) has been studied intensively in recent years. In KDD, inductive classifier learning methods developed in statistics and machine learning are used to extract classification rules from databases. Although KDD often has to deal with large databases, many of the earlier classifier learning methods are not suitable for them: they were designed under the assumption that any record in the database is accessible on demand, and they usually need to access each record several times during learning, so they require either a huge memory space or a large I/O cost for accessing storage devices. In this paper, we propose a classifier learning method, called CIDRE, in which data summaries are constructed and classifiers are learned from the summaries. The learning method is realized using a clustering method, called MCF-tree, which is an extension of the CF-tree proposed by Zhang et al. With this method, the size of the memory space occupied by the data summaries can be specified, and the database is swept only once to construct the summaries. In addition, new instances can be inserted into the summaries incrementally. The method thus possesses important properties that are desirable for dealing with large databases. We also report empirical results indicating that our method performs very well in comparison to C4.5 and naive Bayes, and that the extension from CF-tree to MCF-tree is indispensable for achieving high classification accuracy.
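
    The sketch below is a deliberately simplified, hypothetical illustration of the general "learn from summaries" idea rather than the CIDRE/MCF-tree algorithm itself: the data stream is swept once, only per-class counts and sums are kept as the summary, and new points are classified by the nearest class centroid.

```python
# Single-pass, incremental class summaries (toy stand-in for tree-structured summaries).
import numpy as np

class ClassSummary:
    def __init__(self, dim):
        self.n = 0
        self.sum = np.zeros(dim)
    def add(self, x):                    # one-pass, incremental update
        self.n += 1
        self.sum += x
    def centroid(self):
        return self.sum / max(self.n, 1)

rng = np.random.default_rng(2)
dim = 4
summaries = {0: ClassSummary(dim), 1: ClassSummary(dim)}
for _ in range(10000):                   # simulated stream, swept only once
    label = int(rng.integers(0, 2))
    x = rng.normal(loc=label * 2.0, size=dim)
    summaries[label].add(x)

def predict(x):
    return min(summaries, key=lambda c: np.linalg.norm(x - summaries[c].centroid()))

print(predict(np.full(dim, 0.1)), predict(np.full(dim, 1.9)))   # expect 0 then 1
```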

  18. Multilevel Growth Mixture Models for Classifying Groups

    Science.gov (United States)

    Palardy, Gregory J.; Vermunt, Jeroen K.

    2010-01-01

    This article introduces a multilevel growth mixture model (MGMM) for classifying both the individuals and the groups they are nested in. Nine variations of the general model are described that differ in terms of categorical and continuous latent variable specification within and between groups. An application in the context of school effectiveness…

  19. On the interpretation of number and classifiers

    NARCIS (Netherlands)

    Cheng, L.L.; Doetjes, J.S.; Sybesma, R.P.E.; Zamparelli, R.

    2012-01-01

    Mandarin and Cantonese, both of which are numeral classifier languages, present an interesting puzzle concerning a compositional account of number in the various forms of nominals. First, bare nouns are number neutral (or vague in number). Second, cl-noun combinations appear to have different

  20. Neural Classifier Construction using Regularization, Pruning

    DEFF Research Database (Denmark)

    Hintz-Madsen, Mads; Hansen, Lars Kai; Larsen, Jan

    1998-01-01

    In this paper we propose a method for construction of feed-forward neural classifiers based on regularization and adaptive architectures. Using a penalized maximum likelihood scheme, we derive a modified form of the entropic error measure and an algebraic estimate of the test error. In conjunction...

  1. Face detection by aggregated Bayesian network classifiers

    NARCIS (Netherlands)

    Pham, T.V.; Worring, M.; Smeulders, A.W.M.

    2002-01-01

    A face detection system is presented. A new classification method using forest-structured Bayesian networks is used. The method is used in an aggregated classifier to discriminate face from non-face patterns. The process of generating non-face patterns is integrated with the construction of the

  2. MScanner: a classifier for retrieving Medline citations.

    Science.gov (United States)

    Poulter, Graham L; Rubin, Daniel L; Altman, Russ B; Seoighe, Cathal

    2008-02-19

    Keyword searching through PubMed and other systems is the standard means of retrieving information from Medline. However, ad-hoc retrieval systems do not meet all of the needs of databases that curate information from literature, or of text miners developing a corpus on a topic that has many terms indicative of relevance. Several databases have developed supervised learning methods that operate on a filtered subset of Medline, to classify Medline records so that fewer articles have to be manually reviewed for relevance. A few studies have considered generalisation of Medline classification to operate on the entire Medline database in a non-domain-specific manner, but existing applications lack speed, available implementations, or a means to measure performance in new domains. MScanner is an implementation of a Bayesian classifier that provides a simple web interface for submitting a corpus of relevant training examples in the form of PubMed IDs and returning results ranked by decreasing probability of relevance. For maximum speed it uses the Medical Subject Headings (MeSH) and journal of publication as a concise document representation, and takes roughly 90 seconds to return results against the 16 million records in Medline. The web interface provides interactive exploration of the results, and cross validated performance evaluation on the relevant input against a random subset of Medline. We describe the classifier implementation, cross validate it on three domain-specific topics, and compare its performance to that of an expert PubMed query for a complex topic. In cross validation on the three sample topics against 100,000 random articles, the classifier achieved excellent separation of relevant and irrelevant article score distributions, ROC areas between 0.97 and 0.99, and averaged precision between 0.69 and 0.92. MScanner is an effective non-domain-specific classifier that operates on the entire Medline database, and is suited to retrieving topics for which
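
    A toy sketch of the ranking idea (not MScanner's implementation; the descriptor terms stand in for MeSH headings and the corpora are invented): estimate term frequencies in the relevant corpus and in a background sample, then rank unseen records by a naive Bayes log-likelihood-ratio score.

```python
# Rank records by how much their descriptor terms favour the relevant corpus
# over a background sample, using a smoothed log-likelihood-ratio score.
import math
from collections import Counter

relevant = [{"humans", "pharmacogenetics", "alleles"},
            {"pharmacogenetics", "genotype", "drug therapy"}]
background = [{"humans", "surgery"}, {"ecology", "population"},
              {"drug therapy", "neoplasms"}]

def term_counts(docs):
    c = Counter()
    for d in docs:
        c.update(d)
    return c

rel_c, bg_c = term_counts(relevant), term_counts(background)
n_rel, n_bg = len(relevant), len(background)

def score(doc):
    s = 0.0
    for t in doc:
        p_rel = (rel_c[t] + 0.5) / (n_rel + 1.0)   # Laplace-style smoothing
        p_bg = (bg_c[t] + 0.5) / (n_bg + 1.0)
        s += math.log(p_rel / p_bg)
    return s

queries = [{"pharmacogenetics", "humans"}, {"ecology", "surgery"}]
for q in sorted(queries, key=score, reverse=True):
    print(sorted(q), round(score(q), 2))
```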

  3. Creating Diverse Ensemble Classifiers to Reduce Supervision

    Science.gov (United States)

    2005-12-01

    disagree on some inputs (Hansen & Salamon, 1990; Tumer & Ghosh, 1996). We refer to the measure of disagreement as the diversity/ambiguity of the ensemble ... they use an objective function that incorporates both an accuracy and a diversity term. Tumer and Ghosh (1996) reduce the correlation between ... controlled by the amount of features that are eliminated. This method, called input decimation, has been further explored by Tumer and Oza (1999). Zenobi and

  4. Parallelisation of surface-related multiple elimination

    NARCIS (Netherlands)

    vanWaveren, GM; Godfrey, IM; Hertzberger, B; Serazzi, G

    1995-01-01

    This paper presents the first published parallelisation of the surface-related multiple elimination method from the Delphi (3) software release. The method is used in the seismic industry to eliminate multiples from recorded seismic data. Both data-parallel and message-passing implementation

  5. Comparing cosmic web classifiers using information theory

    Science.gov (United States)

    Leclercq, Florent; Lavaux, Guilhem; Jasche, Jens; Wandelt, Benjamin

    2016-08-01

    We introduce a decision scheme for optimally choosing a classifier, which segments the cosmic web into different structure types (voids, sheets, filaments, and clusters). Our framework, based on information theory, accounts for the design aims of different classes of possible applications: (i) parameter inference, (ii) model selection, and (iii) prediction of new observations. As an illustration, we use cosmographic maps of web-types in the Sloan Digital Sky Survey to assess the relative performance of the classifiers T-WEB, DIVA and ORIGAMI for: (i) analyzing the morphology of the cosmic web, (ii) discriminating dark energy models, and (iii) predicting galaxy colors. Our study substantiates a data-supported connection between cosmic web analysis and information theory, and paves the path towards principled design of analysis procedures for the next generation of galaxy surveys. We have made the cosmic web maps, galaxy catalog, and analysis scripts used in this work publicly available.

  6. Classifying objects in LWIR imagery via CNNs

    Science.gov (United States)

    Rodger, Iain; Connor, Barry; Robertson, Neil M.

    2016-10-01

    The aim of the presented work is to demonstrate enhanced target recognition and improved false alarm rates for a mid to long range detection system utilising a Long Wave Infrared (LWIR) sensor. By exploiting high quality thermal image data and recent techniques in machine learning, the system can provide automatic target recognition capabilities. A Convolutional Neural Network (CNN) is trained and the classifier achieves an overall accuracy of > 95% for 6 object classes related to land defence. Although the CNN struggles to recognise long range target classes due to low signal quality, robust target discrimination is achieved for challenging candidates. The overall performance of the methodology presented is assessed using human ground truth information, generating classifier evaluation metrics for thermal image sequences.

  7. N-Zero Integrated Analog Classifier (NINA)

    Science.gov (United States)

    2017-03-01

    detected signals. For physical-asset detection and classification, passive piezoelectric MEMS resonant seismic sensors will be utilized to produce ... computational circuits operating deep in the weak inversion regime and non-volatile analog memories to classify seismic signals at extremely low ... frequencies to maximize the sensor output voltage with the generator signal. (Figure 3 caption: vibration signal spectrum measured by a sensor located atop the ...)

  8. Robot Learning Using Learning Classifier Systems Approach

    OpenAIRE

    Jabin, Suraiya

    2010-01-01

    In this chapter, I have presented Learning Classifier Systems, which add to the classical Reinforcement Learning framework the possibility of representing the state as a vector of attributes and finding a compact expression of the representation so induced. Their formalism conveys a nice interaction between learning and evolution, which makes them a class of particularly rich systems, at the intersection of several research domains. As a result, they profit from the accumulated extensions of ...

  9. Classifying and Evaluating Architecture Design Methods

    OpenAIRE

    Tekinerdogan, B.; Aksit, Mehmet; Aksit, Mehmet

    2002-01-01

    The concept of software architecture has gained a wide popularity and is generally considered to play a fundamental role in coping with the inherent difficulties of the development of large-scale and complex software systems. This chapter first gives a definition of architecture. Second, a meta-model for architecture design methods is presented. This model is used for classifying and evaluating various architecture design approaches. The chapter concludes with the description of the identifie...

  10. Classifying and evaluating architecture design methods

    OpenAIRE

    Aksit, Mehmet; Tekinerdogan, B.

    1999-01-01

    The concept of software architecture has gained a wide popularity and is generally considered to play a fundamental role in coping with the inherent difficulties of the development of large-scale and complex software systems. This document first gives a definition of architectures. Second, a meta-model for architecture design methods is presented. This model is used for classifying and evaluating various architecture design approaches. The document concludes with the description of the identi...

  11. Dynamic Dimensionality Selection for Bayesian Classifier Ensembles

    Science.gov (United States)

    2015-03-19

    Final Report for AOARD Grant AOARD-124030 “Dynamic dimensionality selection for Bayesian classifier ensembles”, March 19, 2015 ... learning workbench. We compared the two levels of ANDE without either extension, with each in isolation and with both in combination. We also compared

  12. Acquiring Visual Classifiers from Human Imagination

    Science.gov (United States)

    2014-01-01

    Abstract The human mind can remarkably imagine objects that it has never seen, touched, or heard, all in vivid detail. Motivated by the desire to ... vivid detail. In this paper, we seek to transfer the mental images of what a human can imagine into an object recognition system. We combine the ... Acquiring Visual Classifiers from Human Imagination. Carl Vondrick, Hamed Pirsiavash, Aude Oliva, Antonio Torralba, Massachusetts Institute of

  13. Classifying Variable Sources in SDSS Stripe 82

    Science.gov (United States)

    Willecke Lindberg, Christina

    2018-01-01

    SDSS (Sloan Digital Sky Survey) Stripe 82 is a well-documented and researched region of the sky that does not have all of its ~67,500 variable objects labeled. By collecting data and consulting different catalogs such as the Catalina Survey, we are able to slowly cross-match more objects and add classifications within the Stripe 82 catalog. Such matching is performed either by pairing SDSS identification numbers, or by converting and comparing the coordinates of every object within the Stripe 82 catalog to every object within the classified catalog, such as the Catalina Survey catalog. If matching is performed with converted coordinates, a follow-up check is performed to ascertain that the magnitudes of the paired objects are within a reasonable margin of error and that objects have not been mismatched. Once matches have been confirmed, the light curves of classified objects can then be used to determine features that most effectively separate the different types of variable objects in feature spaces. By classifying variable objects, we can construct a reference for subsequent large research surveys, such as LSST (the Large Synoptic Survey Telescope), that could utilize SDSS data as a training set for its own classifications.
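
    A minimal sketch of the coordinate-based cross-matching step described above (illustrative field names, tolerances and catalog entries, not the Stripe 82 pipeline): pair objects from the two catalogs by angular separation and keep only pairs whose magnitudes agree within a margin.

```python
# Cross-match two small catalogs by angular separation, then apply a magnitude
# sanity check to reject mismatched pairs; tolerances are illustrative.
import math

def ang_sep_arcsec(ra1, dec1, ra2, dec2):
    # small-angle approximation, adequate for sub-arcminute matching
    dra = (ra1 - ra2) * math.cos(math.radians((dec1 + dec2) / 2.0))
    ddec = dec1 - dec2
    return math.hypot(dra, ddec) * 3600.0

sdss = [{"id": 1, "ra": 10.0001, "dec": -0.5000, "mag": 18.2}]
catalina = [{"id": "CSS_a", "ra": 10.0002, "dec": -0.5001, "mag": 18.4, "cls": "RR Lyrae"}]

matches = []
for s in sdss:
    for c in catalina:
        if ang_sep_arcsec(s["ra"], s["dec"], c["ra"], c["dec"]) < 2.0 \
                and abs(s["mag"] - c["mag"]) < 0.5:      # magnitude sanity check
            matches.append((s["id"], c["id"], c["cls"]))
print(matches)
```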

  14. Fumigation success for California facility.

    Science.gov (United States)

    Hacker, Robert

    2010-02-01

    As Robert Hacker, at the time director of facilities management at the St John's Regional Medical Center in Oxnard, California, explains, the hospital, one of the area's largest, recently successfully utilised a new technology to eliminate mould, selecting a cost and time-saving fumigation process in place of the traditional "rip and tear" method. Although hospital managers knew the technology had been used extremely effectively in other US buildings, this was reportedly among the first ever healthcare applications.

  15. Mammography Facilities

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Mammography Facility Database is updated periodically based on information received from the four FDA-approved accreditation bodies: the American College of...

  16. Canyon Facilities

    Data.gov (United States)

    Federal Laboratory Consortium — B Plant, T Plant, U Plant, PUREX, and REDOX (see their links) are the five facilities at Hanford where the original objective was plutonium removal from the uranium...

  17. Health Facilities

    Science.gov (United States)

    Health facilities are places that provide health care. They include hospitals, clinics, outpatient care centers, and specialized care centers, such as birthing centers and psychiatric care centers. When you ...

  18. Cut elimination in multifocused linear logic

    DEFF Research Database (Denmark)

    Guenot, Nicolas; Brock-Nannestad, Taus

    2015-01-01

    We study cut elimination for a multifocused variant of full linear logic in the sequent calculus. The multifocused normal form of proofs yields problems that do not appear in a standard focused system, related to the constraints in grouping rule instances in focusing phases. We show that cut...... elimination can be performed in a sensible way even though the proof requires some specific lemmas to deal with multifocusing phases, and discuss the difficulties arising with cut elimination when considering normal forms of proofs in linear logic....

  19. Hailstone classifier based on Rough Set Theory

    Science.gov (United States)

    Wan, Huisong; Jiang, Shuming; Wei, Zhiqiang; Li, Jian; Li, Fengjiao

    2017-09-01

    Rough Set Theory was used to construct the hailstone classifier. First, a database of radar image features was built: the base data from the Doppler radar were transformed into viewable bitmaps, and colour, texture, shape and other features were then extracted through image processing and saved as the feature database to support the follow-up work. Second, a hailstone classifier was built using Rough Set Theory to classify the hailstone samples automatically.

  20. Gearbox Condition Monitoring Using Advanced Classifiers

    Directory of Open Access Journals (Sweden)

    P. Večeř

    2010-01-01

    Full Text Available New efficient and reliable methods for gearbox diagnostics are needed in the automotive industry because of the growing demand for production quality. This paper presents the application of two different classifiers for gearbox diagnostics – Kohonen Neural Networks and the Adaptive-Network-based Fuzzy Inference System (ANFIS). Two different practical applications are presented. In the first application, the tested gearboxes are separated into two classes according to their condition indicators. In the second example, ANFIS is applied to label the tested gearboxes with a Quality Index according to the condition indicators. In both applications, the condition indicators were computed from the vibration of the gearbox housing.

  1. The extended clearance model and its use for the interpretation of hepatobiliary elimination data

    Directory of Open Access Journals (Sweden)

    Gian Camenisch

    2015-03-01

    Full Text Available Hepatic elimination is a function of the interplay between different processes such as sinusoidal uptake, intracellular metabolism, canalicular (biliary) secretion, and sinusoidal efflux. In this review, we outline how drugs can be classified according to their in vitro determined clearance mechanisms using the extended clearance model as a reference. The approach enables the determination of the rate-determining hepatic clearance step. Some successful applications will be highlighted, together with a discussion on the major consequences for the pharmacokinetics and the drug-drug interaction potential of drugs. Special emphasis is placed on the role of passive permeability and active transport processes in hepatic elimination.

  2. A systematic comparison of supervised classifiers.

    Science.gov (United States)

    Amancio, Diego Raphael; Comin, Cesar Henrique; Casanova, Dalcimar; Travieso, Gonzalo; Bruno, Odemir Martinez; Rodrigues, Francisco Aparecido; Costa, Luciano da Fontoura

    2014-01-01

    Pattern recognition has been employed in a myriad of industrial, commercial and academic applications. Many techniques have been devised to tackle such a diversity of applications. Despite the long tradition of pattern recognition research, there is no technique that yields the best classification in all scenarios. Therefore, as many techniques as possible should be considered in high accuracy applications. Typical related works either focus on the performance of a given algorithm or compare various classification methods. In many occasions, however, researchers who are not experts in the field of machine learning have to deal with practical classification tasks without an in-depth knowledge about the underlying parameters. Actually, the adequate choice of classifiers and parameters in such practical circumstances constitutes a long-standing problem and is one of the subjects of the current paper. We carried out a performance study of nine well-known classifiers implemented in the Weka framework and compared the influence of the parameter configurations on the accuracy. The default configuration of parameters in Weka was found to provide near optimal performance for most cases, not including methods such as the support vector machine (SVM). In addition, the k-nearest neighbor method frequently allowed the best accuracy. In certain conditions, it was possible to improve the quality of SVM by more than 20% with respect to their default parameter configuration.
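
    The snippet below is a small scikit-learn analogue of the comparison described above (not the study's Weka protocol or datasets): several classifiers are evaluated with their default parameter configurations under the same cross-validation split.

```python
# Compare a few classifiers with default parameters under identical cross-validation.
from sklearn.datasets import load_iris
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
classifiers = {
    "kNN": KNeighborsClassifier(),       # defaults only, as in the comparison
    "NaiveBayes": GaussianNB(),
    "DecisionTree": DecisionTreeClassifier(random_state=0),
    "SVM": SVC(),
}
for name, clf in classifiers.items():
    acc = cross_val_score(clf, X, y, cv=cv, scoring="accuracy").mean()
    print(f"{name:>12}: {acc:.3f}")
```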

  3. A systematic comparison of supervised classifiers.

    Directory of Open Access Journals (Sweden)

    Diego Raphael Amancio

    Full Text Available Pattern recognition has been employed in a myriad of industrial, commercial and academic applications. Many techniques have been devised to tackle such a diversity of applications. Despite the long tradition of pattern recognition research, there is no technique that yields the best classification in all scenarios. Therefore, as many techniques as possible should be considered in high accuracy applications. Typical related works either focus on the performance of a given algorithm or compare various classification methods. In many occasions, however, researchers who are not experts in the field of machine learning have to deal with practical classification tasks without an in-depth knowledge about the underlying parameters. Actually, the adequate choice of classifiers and parameters in such practical circumstances constitutes a long-standing problem and is one of the subjects of the current paper. We carried out a performance study of nine well-known classifiers implemented in the Weka framework and compared the influence of the parameter configurations on the accuracy. The default configuration of parameters in Weka was found to provide near optimal performance for most cases, not including methods such as the support vector machine (SVM). In addition, the k-nearest neighbor method frequently allowed the best accuracy. In certain conditions, it was possible to improve the quality of SVM by more than 20% with respect to their default parameter configuration.

  4. Classifying Coding DNA with Nucleotide Statistics

    Directory of Open Access Journals (Sweden)

    Nicolas Carels

    2009-10-01

    Full Text Available In this report, we compared the success rate of classification of coding sequences (CDS) vs. introns by Codon Structure Factor (CSF) and by a method that we called Universal Feature Method (UFM). UFM is based on the scoring of purine bias (Rrr) and stop codon frequency. We show that the success rate of CDS/intron classification by UFM is higher than by CSF. UFM classifies ORFs as coding or non-coding through a score based on (i) the stop codon distribution, (ii) the product of purine probabilities in the three positions of nucleotide triplets, (iii) the product of Cytosine (C), Guanine (G), and Adenine (A) probabilities in the 1st, 2nd, and 3rd positions of triplets, respectively, (iv) the probabilities of G in 1st and 2nd position of triplets and (v) the distance of their GC3 vs. GC2 levels to the regression line of the universal correlation. More than 80% of CDSs (true positives) of Homo sapiens (>250 bp), Drosophila melanogaster (>250 bp) and Arabidopsis thaliana (>200 bp) are successfully classified with a false positive rate lower or equal to 5%. The method releases coding sequences in their coding strand and coding frame, which allows their automatic translation into protein sequences with 95% confidence. The method is a natural consequence of the compositional bias of nucleotides in coding sequences.
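
    The following sketch illustrates two of the ingredients named above, in-frame stop-codon frequency and position-wise purine bias, on a toy sequence; it is not the authors' UFM scoring function, and the sequence and any thresholds are purely illustrative.

```python
# Compute in-frame stop-codon frequency and the purine fraction at each codon
# position for a toy "coding-like" sequence.
STOPS = {"TAA", "TAG", "TGA"}
PURINES = {"A", "G"}

def frame_stats(seq, frame=0):
    codons = [seq[i:i + 3] for i in range(frame, len(seq) - 2, 3)]
    codons = [c for c in codons if len(c) == 3]
    stop_freq = sum(c in STOPS for c in codons) / len(codons)
    purine_by_pos = [sum(c[p] in PURINES for c in codons) / len(codons)
                     for p in range(3)]
    return stop_freq, purine_by_pos

orf = "ATGGCTAAAGGTGAAGCTGCTAAAGGTGCTGAA" * 3   # toy sequence, length is a multiple of 3
stop_freq, purines = frame_stats(orf)
print(f"in-frame stop frequency: {stop_freq:.3f}")
print("purine fraction at codon positions 1-3:", [round(p, 2) for p in purines])
```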

  5. Classifying smoke in laparoscopic videos using SVM

    Directory of Open Access Journals (Sweden)

    Alshirbaji Tamer Abdulbaki

    2017-09-01

    Full Text Available Smoke in laparoscopic videos usually appears due to the use of electrocautery when cutting or coagulating tissues. Therefore, detecting smoke can be used for event-based annotation in laparoscopic surgeries by retrieving the events associated with the electrocauterization. Furthermore, smoke detection can also be used for automatic smoke removal. However, detecting smoke in laparoscopic video is a challenge because of the changeability of smoke patterns, the moving camera and the different lighting conditions. In this paper, we present a video-based smoke detection algorithm to detect smoke of different densities such as fog, low and high density in laparoscopic videos. The proposed method depends on extracting various visual features from the laparoscopic images and providing them to a support vector machine (SVM) classifier. Features are based on motion, colour and texture patterns of the smoke. We validated our algorithm using experimental evaluation on four laparoscopic cholecystectomy videos. These four videos were manually annotated by defining every frame as a smoke or non-smoke frame. The algorithm was applied to the videos by using different feature combinations for classification. Experimental results show that the combination of all proposed features gives the best classification performance. The overall accuracy (i.e. correctly classified frames) is around 84%, while the sensitivity (i.e. correctly detected smoke frames) and the specificity (i.e. correctly detected non-smoke frames) are 89% and 80%, respectively.
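
    A minimal sketch of the classification stage only (not the authors' feature extractors or videos): per-frame motion, colour and texture feature vectors are concatenated and fed to an SVM that labels frames as smoke or non-smoke. The features and labels below are synthetic.

```python
# Concatenate per-frame feature groups and train an SVM on synthetic data.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n = 400
motion = rng.normal(size=(n, 4))
colour = rng.normal(size=(n, 6))
texture = rng.normal(size=(n, 8))
X = np.hstack([motion, colour, texture])        # feature combination
y = (X[:, :4].mean(axis=1) + 0.3 * rng.normal(size=n) > 0).astype(int)  # toy labels

svm = SVC(kernel="rbf", C=1.0, gamma="scale")
print("cross-validated accuracy:", cross_val_score(svm, X, y, cv=5).mean().round(3))
```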

  6. Learning ensemble classifiers for diabetic retinopathy assessment.

    Science.gov (United States)

    Saleh, Emran; Błaszczyński, Jerzy; Moreno, Antonio; Valls, Aida; Romero-Aroca, Pedro; de la Riva-Fernández, Sofia; Słowiński, Roman

    2017-10-06

    Diabetic retinopathy is one of the most common comorbidities of diabetes. Unfortunately, the recommended annual screening of the eye fundus of diabetic patients is too resource-consuming. Therefore, it is necessary to develop tools that may help doctors to determine the risk of each patient to attain this condition, so that patients with a low risk may be screened less frequently and the use of resources can be improved. This paper explores the use of two kinds of ensemble classifiers learned from data: fuzzy random forest and dominance-based rough set balanced rule ensemble. These classifiers use a small set of attributes which represent main risk factors to determine whether a patient is in risk of developing diabetic retinopathy. The levels of specificity and sensitivity obtained in the presented study are over 80%. This study is thus a first successful step towards the construction of a personalized decision support system that could help physicians in daily clinical practice. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Focused crawler based on Bayesian classifier.

    Directory of Open Access Journals (Sweden)

    JIA Haijun

    2013-12-01

    Full Text Available With the rapid development of the network, its information resources are increasingly large; faced with such a huge amount of information, the search engine plays an important role. The focused crawling technique, as a core component of a search engine, is used to calculate the relationship between search results and search topics, which is called correlation. Normally, the focused crawling method is used only to calculate the correlation between web content and search-related topics. In this paper, the focused crawling method is used to compute the importance of links through link content and anchor text, then a Bayesian classifier is used to classify the links, and finally a cosine similarity function is used to calculate the relevance of web pages. If the correlation value is greater than the threshold, the page is considered to be associated with the predetermined topics, otherwise it is not relevant. Experimental results show that a high accuracy can be obtained by using the proposed crawling approach.
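
    A minimal sketch of the page-relevance step described above (not the authors' crawler; the topic string, pages and threshold are invented): a fetched page is scored against the topic description with TF-IDF cosine similarity and kept only if the score exceeds the threshold.

```python
# Score candidate pages against a topic with TF-IDF cosine similarity and
# keep only those above an illustrative relevance threshold.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

topic = "malaria elimination vector control insecticide"
pages = [
    "progress towards malaria elimination through vector control programmes",
    "recipes for baking sourdough bread at home",
]
THRESHOLD = 0.2                                   # illustrative cut-off

vec = TfidfVectorizer()
mat = vec.fit_transform([topic] + pages)
scores = cosine_similarity(mat[0], mat[1:]).ravel()
for page, s in zip(pages, scores):
    verdict = "relevant" if s > THRESHOLD else "not relevant"
    print(f"{s:.2f}  {verdict}: {page[:50]}")
```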

  8. Eliminative behaviour of dairy cows at pasture

    DEFF Research Database (Denmark)

    Whistance, Lindsay Kay; Sinclair, Liam A.; Arney, David Richard

    2011-01-01

    Despite a strong avoidance of grazing near dung patches, cattle have traditionally been considered not to avoid bodily contact with faeces, regardless of any risk of disease. Little is understood of the behaviour of pasture-kept dairy cows at the time of defaecation and therefore, the eliminative...... was the predominant behaviour pattern of dairy cows at pasture, regardless of activity. Avoidance of bodily contamination with fresh faeces was shown at all observed eliminative events....

  9. Elimination of schistosomiasis: the tools required.

    Science.gov (United States)

    Bergquist, Robert; Zhou, Xiao-Nong; Rollinson, David; Reinhard-Rupp, Jutta; Klohe, Katharina

    2017-11-20

    Historically, the target in the schistosomiasis control has shifted from infection to morbidity, then back to infection, but now as a public health problem, before moving on to transmission control. Currently, all endemic countries are encouraged to increase control efforts and move towards elimination as required by the World Health Organization (WHO) roadmap for the global control of the neglected tropical diseases (NTDs) and the WHA65.21 resolution issued by the World Health Assembly. However, schistosomiasis prevalence is still alarmingly high and the global number of disability-adjusted life years (DALYs) due to this infection has in fact increased due to inclusion of some 'subtle' clinical symptoms not previously counted. There is a need to restart and improve efforts to reach the elimination goal. To that end, the first conference of the Global Schistosomiasis Alliance (GSA) Research Working Group was held in mid-June 2016 in Shanghai, People's Republic of China. It reviewed current progress in schistosomiasis control and elimination, identified pressing operational research gaps that need to be addressed and discussed new tools and strategies required to make elimination a reality. The articles emanating from the lectures and discussions during this meeting, together with some additional invited papers, have been collected as a special issue of the 'Infectious Diseases of Poverty' entitled 'Schistosomiasis Research: Providing the Tools Needed for Elimination', consisting of 26 papers in all. This paper refers to these papers and discusses critical questions arising at the conference related to elimination of schistosomiasis. The currently most burning questions are the following: Can schistosomiasis be eliminated? Does it require better, more highly sensitive diagnostics? What is the role of preventive chemotherapy at the elimination stage? Is praziquantel sufficient or do we need new drugs? Contemplating these questions, it is felt that the heterogeneity

  10. 36 CFR 1256.46 - National security-classified information.

    Science.gov (United States)

    2010-07-01

    ... 36 Parks, Forests, and Public Property 3 2010-07-01 2010-07-01 false National security-classified... Restrictions § 1256.46 National security-classified information. In accordance with 5 U.S.C. 552(b)(1), NARA... properly classified under the provisions of the pertinent Executive Order on Classified National Security...

  11. Classifier Calibration for Multi-Domain Sentiment Classification.

    NARCIS (Netherlands)

    Raaijmakers, S.A.; Kraaij, W.

    2010-01-01

    Textual sentiment classifiers classify texts into a fixed number of affective classes, such as positive, negative or neutral sentiment, or subjective versus objective information. It has been observed that sentiment classifiers suffer from a lack of generalization capability: a classifier trained on

  12. Detailed Facility Report | ECHO | US EPA

    Science.gov (United States)

    ECHO, Enforcement and Compliance History Online, provides compliance and enforcement information for approximately 800,000 EPA-regulated facilities nationwide. ECHO includes permit, inspection, violation, enforcement action, and penalty information about facilities regulated under the Clean Air Act (CAA) Stationary Source Program, Clean Water Act (CWA) National Pollutant Discharge Elimination System (NPDES), and/or Resource Conservation and Recovery Act (RCRA). Information also is provided on surrounding demographics when available.

  13. Hybrid Neuro-Fuzzy Classifier Based On Nefclass Model

    Directory of Open Access Journals (Sweden)

    Bogdan Gliwa

    2011-01-01

    Full Text Available The paper presents a hybrid neuro-fuzzy classifier, based on the NEFCLASS model, which was modified. The presented classifier was compared to popular classifiers – neural networks and k-nearest neighbours. The efficiency of the modifications in the classifier was compared with the methods used in the original NEFCLASS model (learning methods). The accuracy of the classifier was tested using 3 datasets from the UCI Machine Learning Repository: iris, wine and breast cancer wisconsin. Moreover, the influence of ensemble classification methods on classification accuracy is presented.

  14. Gene-expression Classifier in Papillary Thyroid Carcinoma: Validation and Application of a Classifier for Prognostication

    DEFF Research Database (Denmark)

    Londero, Stefano Christian; Jespersen, Marie Louise; Krogdahl, Annelise

    2016-01-01

    BACKGROUND: No reliable biomarker for metastatic potential in the risk stratification of papillary thyroid carcinoma exists. We aimed to develop a gene-expression classifier for metastatic potential. MATERIALS AND METHODS: Genome-wide expression analyses were used. Development cohort: freshly...... frozen tissue from 38 patients was collected between the years 1986 and 2009. Validation cohort: formalin-fixed paraffin-embedded tissues were collected from 183 consecutively treated patients. RESULTS: A 17-gene classifier was identified based on the expression values in patients with and without...... metastasis in the development cohort. The 17-gene classifier for regional/distant metastasis identified was tested against the clinical status in the validation cohort. Sensitivity for detection of metastases was 51.5% and specificity 61.6%. Log-rank testing failed to identify any significance (p=0...

  15. Classifying prion and prion-like phenomena.

    Science.gov (United States)

    Harbi, Djamel; Harrison, Paul M

    2014-01-01

    The universe of prion and prion-like phenomena has expanded significantly in the past several years. Here, we overview the challenges in classifying this data informatically, given that terms such as "prion-like", "prion-related" or "prion-forming" do not have a stable meaning in the scientific literature. We examine the spectrum of proteins that have been described in the literature as forming prions, and discuss how "prion" can have a range of meaning, with a strict definition being for demonstration of infection with in vitro-derived recombinant prions. We suggest that although prion/prion-like phenomena can largely be apportioned into a small number of broad groups dependent on the type of transmissibility evidence for them, as new phenomena are discovered in the coming years, a detailed ontological approach might be necessary that allows for subtle definition of different "flavors" of prion / prion-like phenomena.

  16. Classifying the precancers: A metadata approach

    Directory of Open Access Journals (Sweden)

    Henson Donald E

    2003-06-01

    Full Text Available Abstract Background During carcinogenesis, precancers are the morphologically identifiable lesions that precede invasive cancers. In theory, the successful treatment of precancers would result in the eradication of most human cancers. Despite the importance of these lesions, there has been no effort to list and classify all of the precancers. The purpose of this study is to describe the first comprehensive taxonomy and classification of the precancers. As a novel approach to disease classification, terms and classes were annotated with metadata (data that describes the data) so that the classification could be used to link precancer terms to data elements in other biological databases. Methods Terms in the UMLS (Unified Medical Language System) related to precancers were extracted. Extracted terms were reviewed and additional terms added. Each precancer was assigned one of six general classes. The entire classification was assembled as an XML (eXtensible Mark-up Language) file. A Perl script converted the XML file into a browser-viewable HTML (HyperText Mark-up Language) file. Results The classification contained 4700 precancer terms, 568 distinct precancer concepts and six precancer classes: (1) acquired microscopic precancers; (2) acquired large lesions with microscopic atypia; (3) precursor lesions occurring with inherited hyperplastic syndromes that progress to cancer; (4) acquired diffuse hyperplasias and diffuse metaplasias; (5) currently unclassified entities; and (6) superclass and modifiers. Conclusion This work represents the first attempt to create a comprehensive listing of the precancers, the first attempt to classify precancers by their biological properties and the first attempt to create a pathologic classification of precancers using standard metadata (XML). The classification is placed in the public domain, and comment is invited by the authors, who are prepared to curate and modify the classification.

  17. Planning Facilities.

    Science.gov (United States)

    Flynn, Richard B., Ed.; And Others

    1983-01-01

    Nine articles give information to help make professionals in health, physical education, recreation, dance, and athletics more knowledgeable about planning facilities. Design of natatoriums, physical fitness laboratories, fitness trails, gymnasium lighting, homemade play equipment, indoor soccer arenas, and dance floors is considered. A…

  18. Martian Atmospheric Pressure Static Charge Elimination Tool

    Science.gov (United States)

    Johansen, Michael R.

    2014-01-01

    A Martian pressure static charge elimination tool is currently in development in the Electrostatics and Surface Physics Laboratory (ESPL) at NASA's Kennedy Space Center. In standard Earth atmosphere conditions, static charge can be neutralized from an insulating surface using air ionizers. These air ionizers generate ions through corona breakdown. The Martian atmosphere is 7 Torr of mostly carbon dioxide, which makes it inherently difficult to use similar methods as those used for standard atmosphere static elimination tools. An initial prototype has been developed to show feasibility of static charge elimination at low pressure, using corona discharge. A needle point and thin wire loop are used as the corona generating electrodes. A photo of the test apparatus is shown below. Positive and negative high voltage pulses are sent to the needle point. This creates positive and negative ions that can be used for static charge neutralization. In a preliminary test, a floating metal plate was charged to approximately 600 volts under Martian atmospheric conditions. The static elimination tool was enabled and the voltage on the metal plate dropped rapidly to -100 volts. This test data is displayed below. Optimization is necessary to improve the electrostatic balance of the static elimination tool.

  19. Gaussian elimination is not optimal, revisited

    DEFF Research Database (Denmark)

    Macedo, Hugo Daniel

    2016-01-01

    We refactor the universal law for the tensor product to express matrix multiplication as the product MN of two matrices M and N, thus making it possible to use such a matrix product to encode and transform algorithms performing matrix multiplication using techniques from linear algebra. We explore...... of the transformation correspond to applying Gaussian elimination to the columns of M and to the lines of N, therefore providing explicit evidence on why "Gaussian elimination is not optimal", the aphorism serving as the title of the succinct paper introducing Strassen's matrix multiplication algorithm. Although...... the end results are equations involving matrix products, our exposition builds upon previous works on the category of matrices (and the related category of finite vector spaces) which we extend by showing: why the direct sum (⊕,0) monoid is not closed, a biproduct encoding of Gaussian elimination

  20. Eliminating deformations in fluorescence emission difference microscopy.

    Science.gov (United States)

    You, Shangting; Kuang, Cuifang; Rong, Zihao; Liu, Xu

    2014-10-20

    We propose a method for eliminating the deformations in fluorescence emission difference microscopy (FED). Due to excessive subtraction, negative values are inevitable in the original FED method, giving rise to deformations. We propose modulating the beam to generate an extended solid focal spot and a hollow focal spot. Negative image values can be avoided by using these two types of excitation spots in FED imaging. Hence, deformations are eliminated, and the signal-to-noise ratio is improved. In deformation-free imaging, the resolution is higher than that of confocal imaging by 32%. Compared to standard FED imaging with the same level of deformations, our method provides superior resolution.
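
    The toy calculation below is only a numerical illustration of why the original subtraction can go negative (it is not the authors' optical method): the difference between a solid-spot response and a scaled doughnut-like response dips below zero away from the centre, which is the kind of artifact the modified excitation spots are designed to avoid.

```python
# Illustrate how subtracting a scaled hollow-spot image from a solid-spot image
# produces negative values; all profiles and scale factors are invented.
import numpy as np

x = np.linspace(-1.0, 1.0, 11)
solid = np.exp(-x**2 / 0.08)                            # solid focal spot response
hollow = np.exp(-x**2 / 0.18) - np.exp(-x**2 / 0.05)    # doughnut-like response
fed = solid - 0.9 * hollow                              # conventional FED subtraction
print("min of difference image:", fed.min().round(3))   # negative => deformation
```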

  1. Ten years left to eliminate blinding trachoma

    Directory of Open Access Journals (Sweden)

    Haddad D.

    2010-09-01

    Full Text Available In 1997, the World Health Organization formed the Global Alliance to Eliminate Blinding Trachoma by 2020 (GET 2020), a coalition of governmental, non-governmental, research, and pharmaceutical partners. In 1998, the World Health Assembly urged member states to map blinding trachoma in endemic areas, implement the SAFE strategy (which stands for surgery for trichiasis, antibiotics, facial cleanliness and environmental change, such as clean water and latrines), and collaborate with the global alliance in its work to eliminate blinding trachoma.

  2. A Weighted Voting Classifier Based on Differential Evolution

    Directory of Open Access Journals (Sweden)

    Yong Zhang

    2014-01-01

    Full Text Available Ensemble learning employs multiple individual classifiers and combines their predictions, which can achieve better performance than a single classifier. Considering that different base classifiers make different contributions to the final classification result, this paper assigns greater weights to the classifiers with better performance and proposes a weighted voting approach based on differential evolution. After optimizing the weights of the base classifiers by differential evolution, the proposed method combines the results of each classifier according to the weighted voting combination rule. Experimental results show that the proposed method not only improves the classification accuracy, but also has strong generalization ability and universality.
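
    A compact sketch of the approach described above (dataset, base classifiers and optimizer settings are illustrative, not the authors' implementation): a few base classifiers are trained, differential evolution searches for voting weights that maximise accuracy on a tuning split, and predictions are combined by weighted voting.

```python
# Optimize voting weights for base classifiers with differential evolution.
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_tune, y_tr, y_tune = train_test_split(X, y, test_size=0.3, random_state=0)
bases = [LogisticRegression(max_iter=5000), GaussianNB(),
         DecisionTreeClassifier(random_state=0)]
probas = [b.fit(X_tr, y_tr).predict_proba(X_tune) for b in bases]

def weighted_vote(w):
    w = np.abs(w) / (np.abs(w).sum() + 1e-12)
    combined = sum(wi * p for wi, p in zip(w, probas))
    return combined.argmax(axis=1)

def neg_accuracy(w):                                   # DE minimises, so negate
    return -np.mean(weighted_vote(w) == y_tune)

result = differential_evolution(neg_accuracy, bounds=[(0, 1)] * len(bases), seed=0)
print("weights:", np.round(result.x / result.x.sum(), 2), "tuning accuracy:", -result.fun)
```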

  3. Neural classifiers using one-time updating.

    Science.gov (United States)

    Diamantaras, K I; Strintzis, M G

    1998-01-01

    The linear threshold element (LTE), or perceptron, is a linear classifier with limited capabilities due to the problems arising when the input pattern set is linearly nonseparable. Assuming that the patterns are presented in a sequential fashion, we derive a theory for the detection of linear nonseparability as soon as it appears in the pattern set. This theory is based on the precise determination of the solution region in the weight space with the help of a special set of vectors. For this region, called the solution cone, we present a recursive computation procedure which allows immediate detection of nonseparability. The separability-violating patterns may be skipped so that, at the end, we derive a totally separable subset of the original pattern set along with its solution cone. The intriguing aspect of this algorithm is that it can be directly cast into a simple neural-network implementation. In this model the synaptic weights are committed (they are updated only once, and the only change that may happen after that is their destruction). This bears resemblance to the behavior of biological neural networks, and it is a feature unlike those of most other artificial neural techniques. Finally, by combining many such neural models we develop a learning procedure capable of separating convex classes.

  4. Is it important to classify ischaemic stroke?

    LENUS (Irish Health Repository)

    Iqbal, M

    2012-02-01

    Thirty-five percent of all ischaemic events remain classified as cryptogenic. This study was conducted to ascertain the accuracy of diagnosis of ischaemic stroke based on information given in the medical notes. It was tested by applying the clinical information to the TOAST criteria. One hundred and five patients presented with acute stroke between Jan-Jun 2007. Data were collected on 90 patients. The male to female ratio was 39:51, with an age range of 47-93 years. Sixty (67%) patients had total/partial anterior circulation stroke; 5 (5.6%) had a lacunar stroke and in 25 (28%) the mechanism of stroke could not be identified. Four (4.4%) patients with small vessel disease were anticoagulated; 5 (5.6%) with atrial fibrillation received antiplatelet therapy and 2 (2.2%) patients with atrial fibrillation underwent CEA. This study revealed deficiencies in the clinical assessment of patients, and treatment was not tailored to the mechanism of stroke in some patients.

  5. How should we classify intersex disorders?

    Science.gov (United States)

    Aaronson, Ian A; Aaronson, Alistair J

    2010-10-01

    The term disorders of sex development (DSD) has achieved widespread acceptance as replacement for the term intersex, but how to classify these conditions remains problematic. The LWPES-ESPE (Lawson Wilkins Pediatric Endocrine Society and European Society of Paediatric Endocrinology) Consensus Group proposed using the karyotype as a basis for classification; however, this is but a crude reflection of the genetic makeup, is diagnostically non-specific, and is not in itself relevant to subsequent clinical developments. The historical classification of intersex disorders based on gonadal histology is currently out of favor, being tainted by association with the terms hermaphroditism and pseudohermaphroditism. We believe this is regrettable, for the histology of the gonad remains fundamental to the understanding of normal and aberrant sexual development by medical students and residents in training, as well as being a major determinant of clinical outcome for the patient. We propose a comprehensive classification of those DSD conditions generally regarded as belonging under the heading of intersex, based on gonadal histology. Biopsy will not be required when the diagnosis is clearly established biochemically or by gene studies as the histology can be confidently predicted. It will only be required when an ovotestis or dysgenetic gonad is suspected in order to determine the definitive diagnosis. Copyright © 2010 Journal of Pediatric Urology Company. Published by Elsevier Ltd. All rights reserved.

  6. Combining classifiers for robust PICO element detection.

    Science.gov (United States)

    Boudin, Florian; Nie, Jian-Yun; Bartlett, Joan C; Grad, Roland; Pluye, Pierre; Dawes, Martin

    2010-05-15

    Formulating a clinical information need in terms of the four atomic parts which are Population/Problem, Intervention, Comparison and Outcome (known as PICO elements) facilitates searching for a precise answer within a large medical citation database. However, using PICO defined items in the information retrieval process requires a search engine to be able to detect and index PICO elements in the collection in order for the system to retrieve relevant documents. In this study, we tested multiple supervised classification algorithms and their combinations for detecting PICO elements within medical abstracts. Using the structural descriptors that are embedded in some medical abstracts, we have automatically gathered large training/testing data sets for each PICO element. Combining multiple classifiers using a weighted linear combination of their prediction scores achieves promising results with an f-measure score of 86.3% for P, 67% for I and 56.6% for O. Our experiments on the identification of PICO elements showed that the task is very challenging. Nevertheless, the performance achieved by our identification method is competitive with previously published results and shows that this task can be achieved with a high accuracy for the P element but lower ones for I and O elements.

  7. Fault diagnosis with the Aladdin transient classifier

    Science.gov (United States)

    Roverso, Davide

    2003-08-01

    The purpose of Aladdin is to assist plant operators in the early detection and diagnosis of faults and anomalies in the plant that either have an impact on the plant performance, or that could lead to a plant shutdown or component damage if allowed to go unnoticed. The kind of early fault detection and diagnosis performed by Aladdin is aimed at allowing more time for decision making, increasing the operator awareness, reducing component damage, and supporting improved plant availability and reliability. In this paper we describe in broad lines the Aladdin transient classifier, which combines techniques such as recurrent neural network ensembles, Wavelet On-Line Pre-processing (WOLP), and Autonomous Recursive Task Decomposition (ARTD), in an attempt to improve the practical applicability and scalability of this type of systems to real processes and machinery. The paper focuses then on describing an application of Aladdin to a Nuclear Power Plant (NPP) through the use of the HAMBO experimental simulator of the Forsmark 3 boiling water reactor NPP in Sweden. It should be pointed out that Aladdin is not necessarily restricted to applications in NPPs. Other types of power plants, or even other types of processes, can also benefit from the diagnostic capabilities of Aladdin.

  8. Identifying and classifying juvenile stalking behavior.

    Science.gov (United States)

    Evans, Thomas M; Reid Meloy, J

    2011-01-01

    Despite the growing research in the area of stalking, the focus has been on adults who engage in this behavior. Unfortunately, almost no studies investigate the prevalence of this behavior in adolescents. Two cases are presented demonstrating not only that stalking occurs during the period of adolescence, but also that there is a significant difference in the motivation underlying this behavior that can be classified similarly to that of adult stalkers. Further, a suggested classification based on these two cases as well as our experience with other juveniles who have exhibited stalking behaviors is proposed. The first case involves a narcissistic youth who also possesses psychopathic traits, while the second involves a lonely, severely socially awkward teen. Juvenile stalking is a societal problem that has not yet garnered the attention it deserves, and all systems that deal with juvenile delinquency (juvenile court, law enforcement, and mental health personnel) as well as the school system must be educated to the prevalence and severity of this yet-to-be-recognized problem. © 2010 American Academy of Forensic Sciences.

  9. EPA Facility Registry Service (FRS): Facility Interests Dataset

    Science.gov (United States)

    This web feature service consists of location and facility identification information from EPA's Facility Registry Service (FRS) for all sites that are available in the FRS individual feature layers. The layers comprise the FRS major program databases, including:Assessment Cleanup and Redevelopment Exchange System (ACRES) : brownfields sites ; Air Facility System (AFS) : stationary sources of air pollution ; Air Quality System (AQS) : ambient air pollution data from monitoring stations; Bureau of Indian Affairs (BIA) : schools data on Indian land; Base Realignment and Closure (BRAC) facilities; Clean Air Markets Division Business System (CAMDBS) : market-based air pollution control programs; Comprehensive Environmental Response, Compensation, and Liability Information System (CERCLIS) : hazardous waste sites; Integrated Compliance Information System (ICIS) : integrated enforcement and compliance information; National Compliance Database (NCDB) : Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA) and the Toxic Substances Control Act (TSCA); National Pollutant Discharge Elimination System (NPDES) module of ICIS : NPDES surface water permits; Radiation Information Database (RADINFO) : radiation and radioactivity facilities; RACT/BACT/LAER Clearinghouse (RBLC) : best available air pollution technology requirements; Resource Conservation and Recovery Act Information System (RCRAInfo) : tracks generators, transporters, treaters, storers, and disposers of haz

  10. Eliminating Problems Caused by Multicollinearity: A Warning.

    Science.gov (United States)

    Kennedy, Peter E.

    1982-01-01

    Explains why an econometric practice introduced by J.C. Soper cannot eliminate the problems caused by multicollinearity. The author suggests that it can be a useful technique in that it forces researchers to pay more attention to the specifications of their models. (AM)

  11. Teaching Projectile Motion to Eliminate Misconceptions

    Science.gov (United States)

    Prescott, Anne; Mitchelmore, Michael

    2005-01-01

    Student misconceptions of projectile motion are well documented, but their effect on the teaching and learning of the mathematics of motion under gravity has not been investigated. An experimental unit was designed that was intended to confront and eliminate misconceptions in senior secondary school students. The approach was found to be…

  12. Eliminating transducer distortion in acoustic measurements

    DEFF Research Database (Denmark)

    Agerkvist, Finn T.; Torras Rosell, Antoni; McWalter, Richard Ian

    2014-01-01

    This paper investigates the influence of nonlinear components that contaminate the linear response of an acoustic transducer, and presents a method for eliminating the influence of nonlinearities in acoustic measurements. The method is evaluated on simulated as well as experimental data, and is shown...

  13. Health promotion: From malaria control to elimination

    African Journals Online (AJOL)

    Advocacy, health promotion, health education, strategic marketing, advertising, and the strengthening of existing partnerships are essential prerequisites in closing the identified gaps in the malaria control programme when moving from control to elimination.[10]. To chart the way forward for moving malaria programmes ...

  14. Naive Bayesian classifiers for multinomial features: a theoretical analysis

    CSIR Research Space (South Africa)

    Van Dyk, E

    2007-11-01

    Full Text Available The authors investigate the use of naive Bayesian classifiers for multinomial feature spaces and derive error estimates for these classifiers. The error analysis is done by developing a mathematical model to estimate the probability density...

  15. A Weighted Voting Classifier Based on Differential Evolution

    National Research Council Canada - National Science Library

    Zhang, Yong; Zhang, Hongrui; Cai, Jing; Yang, Binbin

    2014-01-01

    ... a weighted voting approach based on differential evolution. After optimizing the weights of the base classifiers by differential evolution, the proposed method combines the results of each classifier according to the weighted voting combination rule...
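    A minimal sketch of the idea, with base-classifier weights tuned by differential evolution and outputs combined by weighted voting, might look like the following; the dataset, base learners and fitness function below are placeholders rather than those used in the paper.

        import numpy as np
        from scipy.optimize import differential_evolution
        from sklearn.datasets import make_classification
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import train_test_split
        from sklearn.naive_bayes import GaussianNB
        from sklearn.tree import DecisionTreeClassifier

        # Placeholder data and base classifiers (not those used in the cited paper).
        X, y = make_classification(n_samples=600, n_features=10, random_state=0)
        X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.4, random_state=0)
        base = [LogisticRegression(max_iter=1000), GaussianNB(), DecisionTreeClassifier(max_depth=3)]
        scores = np.array([c.fit(X_tr, y_tr).predict_proba(X_val)[:, 1] for c in base])

        def neg_accuracy(w):
            # Weighted-voting combination rule: weight each base classifier's score.
            w = np.abs(w) / (np.abs(w).sum() + 1e-12)
            combined = (w[:, None] * scores).sum(axis=0)
            return -np.mean((combined > 0.5) == y_val)   # negated because DE minimizes

        result = differential_evolution(neg_accuracy, bounds=[(0.0, 1.0)] * len(base), seed=0)
        print("tuned weights:", np.round(result.x, 3), "validation accuracy:", -result.fun)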

  16. Evaluating Pixel vs. Segmentation based Classifiers with Height ...

    African Journals Online (AJOL)

    Windows User

    2017-10-13

    Oct 13, 2017 ... classification of digital imagery. ... Traditional pixel-based classifiers have been widely used for classifying optical imagery from ..... Chavez, P, Sides, SC & Anderson, JA 1991, 'Comparison of three different methods to merge.

  17. Endemicity response timelines for Plasmodium falciparum elimination

    Directory of Open Access Journals (Sweden)

    Hay Simon I

    2009-04-01

    Full Text Available Abstract Background The scaling up of malaria control and renewed calls for malaria eradication have raised interest in defining timelines for changes in malaria endemicity. Methods The epidemiological theory for the decline in the Plasmodium falciparum parasite rate (PfPR, the prevalence of infection following intervention was critically reviewed and where necessary extended to consider superinfection, heterogeneous biting, and aging infections. Timelines for malaria control and elimination under different levels of intervention were then established using a wide range of candidate mathematical models. Analysis focused on the timelines from baseline to 1% and from 1% through the final stages of elimination. Results The Ross-Macdonald model, which ignores superinfection, was used for planning during the Global Malaria Eradication Programme (GMEP. In models that consider superinfection, PfPR takes two to three years longer to reach 1% starting from a hyperendemic baseline, consistent with one of the few large-scale malaria control trials conducted in an African population with hyperendemic malaria. The time to elimination depends fundamentally upon the extent to which malaria transmission is interrupted and the size of the human population modelled. When the PfPR drops below 1%, almost all models predict similar and proportional declines in PfPR in consecutive years from 1% through to elimination and that the waiting time to reduce PfPR from 10% to 1% and from 1% to 0.1% are approximately equal, but the decay rate can increase over time if infections senesce. Conclusion The theory described herein provides simple "rules of thumb" and likely time horizons for the impact of interventions for control and elimination. Starting from a hyperendemic baseline, the GMEP planning timelines, which were based on the Ross-Macdonald model with completely interrupted transmission, were inappropriate for setting endemicity timelines and they represent the most
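    As a rough orienting calculation (not taken from the paper): in the Ross-Macdonald model with transmission completely interrupted and superinfection ignored, prevalence simply decays at the recovery rate r, which yields the kind of "rule of thumb" timeline discussed above.

        \frac{dX}{dt} = -rX \;\Longrightarrow\; X(t) = X_0\,e^{-rt},
        \qquad t_{X_0 \to 1\%} = \frac{1}{r}\,\ln\frac{X_0}{0.01}

    With an assumed mean infection duration of about 200 days (r of roughly 1/200 per day) and a hyperendemic baseline X_0 = 0.5, this gives about 200 ln(50), i.e. a little over two years, before superinfection or residual transmission is accounted for.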

  18. How do we classify functional status?

    Science.gov (United States)

    Meyboom-de Jong, B M; Smith, R J

    1992-02-01

    The original question "How do we classify functional status?" is rephrased as "How do we order or arrange the limitations of function of primary care patients in classes?". After a review of the functions to be considered, the concept of functional status is presented using empirical data from the research project "Morbidity and Functional Status of the Elderly." The group studied consisted of 5,502 patients older than age 65 and 25 general practitioners in 12 practices. Functional status was assessed using five COOP charts: physical status, psychological status, daily activities, social status, and change. Morbidity was registered using the International Classification of Primary Care (ICPC). At the beginning and end of the study, 30% of the elderly patients assessed their physical functions as seriously limited, whereas 6% to 8% reported psychological problems and limitations in daily activities or social contacts. During the encounters, more serious limitations were recorded: 35% of encounters involved serious physical limitations; 18% involved serious limitations in activities of daily living, and 11% involved constant psychological problems or limitations in social contact. Women reported more physical limitations than men. Older patients reported more physical limitations than younger ones, and people living in nursing homes reported more limitations than patients living independently. From the disease-specific health profiles, we concluded that the greatest limitation of all aspects of function was scored during encounters for cerebrovascular disease, dementia, and cancer of the lung, stomach, intestine, and breast. Hypertension, "no disease," and "common cold" elicited the lowest functional limitations.(ABSTRACT TRUNCATED AT 250 WORDS)

  19. Counting, Measuring And The Semantics Of Classifiers

    Directory of Open Access Journals (Sweden)

    Susan Rothstein

    2010-12-01

    Full Text Available This paper makes two central claims. The first is that there is an intimate and non-trivial relation between the mass/count distinction on the one hand and the measure/individuation distinction on the other: a (if not the) defining property of mass nouns is that they denote sets of entities which can be measured, while count nouns denote sets of entities which can be counted. Crucially, this is a difference in grammatical perspective and not in ontological status. The second claim is that the mass/count distinction between two types of nominals has its direct correlate at the level of classifier phrases: classifier phrases like two bottles of wine are ambiguous between a counting, or individuating, reading and a measure reading. On the counting reading, this phrase has count semantics, on the measure reading it has mass semantics.

  20. Recognition of pornographic web pages by classifying texts and images.

    Science.gov (United States)

    Hu, Weiming; Wu, Ou; Chen, Zhouyao; Fu, Zhouyu; Maybank, Steve

    2007-06-01

    With the rapid development of the World Wide Web, people benefit more and more from the sharing of information. However, Web pages with obscene, harmful, or illegal content can be easily accessed. It is important to recognize such unsuitable, offensive, or pornographic Web pages. In this paper, a novel framework for recognizing pornographic Web pages is described. A C4.5 decision tree is used to divide Web pages, according to content representations, into continuous text pages, discrete text pages, and image pages. These three categories of Web pages are handled, respectively, by a continuous text classifier, a discrete text classifier, and an algorithm that fuses the results from the image classifier and the discrete text classifier. In the continuous text classifier, statistical and semantic features are used to recognize pornographic texts. In the discrete text classifier, the naive Bayes rule is used to calculate the probability that a discrete text is pornographic. In the image classifier, the object's contour-based features are extracted to recognize pornographic images. In the text and image fusion algorithm, the Bayes theory is used to combine the recognition results from images and texts. Experimental results demonstrate that the continuous text classifier outperforms the traditional keyword-statistics-based classifier, the contour-based image classifier outperforms the traditional skin-region-based image classifier, the results obtained by our fusion algorithm outperform those by either of the individual classifiers, and our framework can be adapted to different categories of Web pages.
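    The record does not reproduce the fusion formula itself. A common Bayes-style combination, assuming the text and image scores are conditionally independent given the class, is sketched below with made-up numbers; it is not necessarily the exact rule used by the authors.

        def bayes_fuse(p_text, p_image, prior=0.5):
            """Fuse two per-modality posteriors for the 'pornographic' class into one,
            assuming conditional independence of the modalities given the class.
            Scores are assumed to lie strictly between 0 and 1."""
            odds_prior = prior / (1.0 - prior)
            lr_text = (p_text / (1.0 - p_text)) / odds_prior      # likelihood ratio from the text classifier
            lr_image = (p_image / (1.0 - p_image)) / odds_prior   # likelihood ratio from the image classifier
            odds_post = odds_prior * lr_text * lr_image
            return odds_post / (1.0 + odds_post)

        print(bayes_fuse(0.8, 0.6))   # agreeing evidence reinforces: about 0.86
        print(bayes_fuse(0.8, 0.2))   # conflicting evidence cancels out: 0.5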

  1. 32 CFR 2400.28 - Dissemination of classified information.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 6 2010-07-01 2010-07-01 false Dissemination of classified information. 2400.28... SECURITY PROGRAM Safeguarding § 2400.28 Dissemination of classified information. Heads of OSTP offices shall establish procedures consistent with this Regulation for dissemination of classified material. The...

  2. Emission Facilities - Erosion & Sediment Control Facilities

    Data.gov (United States)

    NSGIC Education | GIS Inventory — An Erosion and Sediment Control Facility is a DEP primary facility type related to the Water Pollution Control program. The following sub-facility types related to...

  3. Neural correlates of quantity processing of Chinese numeral classifiers.

    Science.gov (United States)

    Her, One-Soon; Chen, Ying-Chun; Yen, Nai-Shing

    2017-11-08

    Linguistic analysis suggests that numeral classifiers carry quantity information. However, previous neuroimaging studies have shown that classifiers did not elicit higher activation in the intraparietal sulcus (IPS), associated with representation of numerical magnitude, than tool nouns did. This study aimed to control the semantic attributes of classifiers and reexamine the underlying neural correlates. Participants performed a semantic distance comparison task in which they judged which one of the two items was semantically closer to the target. Processing classifiers elicited higher activation than tool nouns in the bilateral inferior parietal lobules (IPL), middle frontal gyri (MFG), right superior frontal gyrus (SFG), and left lingual gyrus. Conjunction analysis showed that the IPS was commonly activated for classifiers, numbers, dots, and number words. The results support that classifiers activate quantity representations, implicating that the system of classifiers is part of magnitude cognition. Furthermore, the results suggest that the IPS represents magnitude independent of notations. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. Elimination of Dominated Strategies and Inessential Players

    Directory of Open Access Journals (Sweden)

    Mamoru Kaneko

    2015-01-01

    Full Text Available We study the process, called the IEDI process, of iterated elimination of (strictly dominated) strategies and inessential players for finite strategic games. Such elimination may reduce the size of a game considerably, for example, from a game with a large number of players to one with a few players. We extend two existing results to our context: the preservation of Nash equilibria and order-independence. These give a way of computing the set of Nash equilibria for an initial situation from the endgame. Then, we reverse our perspective to ask the question of what initial situations end up at a given final game. We assess what situations underlie an endgame. We give conditions for the pattern of player sets required for a resulting sequence of the IEDI process to an endgame. We illustrate our development with a few extensions of the battle of the sexes. (original abstract)

  5. Hitting Hotspots: Spatial Targeting of Malaria for Control and Elimination

    NARCIS (Netherlands)

    Bousema, T.; Griffin, J.T.; Sauerwein, R.W.; Smith, D.L.; Churcher, T.S.; Takken, W.; Ghani, A.; Drakeley, C.; Gosling, R.

    2012-01-01

    Current malaria elimination guidelines are based on the concept that malaria transmission becomes heterogeneous in the later phases of malaria elimination [1]. In the pre-elimination and elimination phases, interventions have to be targeted to entire villages or towns with higher malaria incidence

  6. Reducing allergic symptoms through eliminating subgingival plaque

    OpenAIRE

    Utomo, Haryono; Prahasanti, Chiquita; Ruhadi, Iwan

    2008-01-01

    Background: Elimination of subgingival plaque for prevention and treatment of periodontal diseases through scaling is a routine procedure. It is also well-known that periodontal disease is related to systemic diseases. Nevertheless, the idea how scaling procedures also able to reduce allergic symptoms i.e. eczema and asthma, is not easily accepted, because it is contradictory to the “hygiene hypothesis”. However, since allergic symptoms also depend on variable factors such as genetic, environ...

  7. Safe, Multiphase Bounds Check Elimination in Java

    Science.gov (United States)

    2010-01-28


  8. Air Quality Facilities

    Data.gov (United States)

    Iowa State University GIS Support and Research Facility: Facilities with operating permits for Title V of the Federal Clean Air Act, as well as facilities required to submit an air emissions inventory, and other facilities...

  9. Elimination of Rhodnius prolixus in Central America

    Directory of Open Access Journals (Sweden)

    Hashimoto Ken

    2012-02-01

    Full Text Available Abstract Rhodnius prolixus is one of the main vectors of Trypanosoma cruzi, causative agent of Chagas disease. In Central America, it was first discovered in 1915 in El Salvador, from where it spread northwest to Guatemala and Mexico, and southeast to Nicaragua and Costa Rica, arriving also in Honduras in the late 1950s. Indoor residual spraying (IRS) by the antimalaria services of Costa Rica prevented its spread southwards, and similar IRS programmes appear to have eliminated it from El Salvador by the late 1970s. In 1997, by resolution of the Ministers of Health of the seven Central American countries, a multinational initiative against Chagas disease (IPCA) was launched with one of the specific objectives being the elimination of R. prolixus from the region. As a result, more and more infested areas were encountered, and progressively sprayed using an IRS strategy already deployed against Triatoma infestans in the southern cone countries of South America. In 2008, Guatemala became the first of these countries to be formally certified as free of Chagas disease transmission due to R. prolixus. The other infested countries have since been similarly certified, and none of these has reported the presence of R. prolixus since June 2010. Further surveillance is required, but current evidence suggests that R. prolixus may now have been eliminated from throughout the Mesoamerican region, with a corresponding decline in the incidence of T. cruzi infections.

  10. Planning of elimination of emergency consequences

    Directory of Open Access Journals (Sweden)

    S. Kovalenko

    2015-05-01

    Full Text Available Introduction. The volume of useful information in the process of planning the elimination of emergency consequences is best assessed with computational problems and mathematical models. Materials and methods. The expert survey method is used to calculate quantitative values of probability and to determine the optimal solution before information on the situation is received. Results. It is determined that the quality of a decision on eliminating emergency consequences depends primarily on the number of factors taken into account in the particular circumstances of the situation, and on the level of information readiness of the control bodies to decide on eliminating the consequences as quickly as possible and to consider several options so that the decision is reasonable and concrete. The ratio between the volume of useful information collected and processed during operation planning and the volume required for identifying the optimal solution is calculated. This ratio allows a graph to be constructed of the probability of identifying a solution in the existing environment and of the probability of identifying the optimal solution before information in the P* condition is obtained. The graph also shows the ratio between the volume of useful information collected and processed during operation planning and the volume of information necessary for identifying the optimal solution. Conclusion. The results of this research can be used to improve control bodies' decisions so as to ensure safe working conditions for employees of the food industry.

  11. Achieving universal access and moving towards elimination of new HIV infections in Cambodia

    Science.gov (United States)

    Vun, Mean Chhi; Fujita, Masami; Rathavy, Tung; Eang, Mao Tang; Sopheap, Seng; Sovannarith, Samreth; Chhorvann, Chhea; Vanthy, Ly; Sopheap, Oum; Welle, Emily; Ferradini, Laurent; Sedtha, Chin; Bunna, Sok; Verbruggen, Robert

    2014-01-01

    Introduction In the mid-1990s, Cambodia faced one of the fastest growing HIV epidemics in Asia. For its achievement in reversing this trend, and achieving universal access to HIV treatment, the country received a United Nations millennium development goal award in 2010. This article reviews Cambodia’s response to HIV over the past two decades and discusses its current efforts towards elimination of new HIV infections. Methods A literature review of published and unpublished documents, including programme data and presentations, was conducted. Results and discussion Cambodia classifies its response to one of the most serious HIV epidemics in Asia into three phases. In Phase I (1991–2000), when adult HIV prevalence peaked at 1.7% and incidence exceeded 20,000 cases, a nationwide HIV prevention programme targeted brothel-based sex work. Voluntary confidential counselling and testing and home-based care were introduced, and peer support groups of people living with HIV emerged. Phase II (2001–2011) observed a steady decline in adult prevalence to 0.8% and incidence to 1600 cases by 2011, and was characterized by: expanding antiretroviral treatment (coverage reaching more than 80%) and continuum of care; linking with tuberculosis and maternal and child health services; accelerated prevention among key populations, including entertainment establishment-based sex workers, men having sex with men, transgender persons, and people who inject drugs; engagement of health workers to deliver quality services; and strengthening health service delivery systems. The third phase (2012–2020) aims to attain zero new infections by 2020 through: sharpening responses to key populations at higher risk; maximizing access to community and facility-based testing and retention in prevention and care; and accelerating the transition from vertical approaches to linked/integrated approaches. Conclusions Cambodia has tailored its prevention strategy to its own epidemic, established

  12. Achieving universal access and moving towards elimination of new HIV infections in Cambodia

    Directory of Open Access Journals (Sweden)

    Mean Chhi Vun

    2014-06-01

    Full Text Available Introduction: In the mid-1990s, Cambodia faced one of the fastest growing HIV epidemics in Asia. For its achievement in reversing this trend, and achieving universal access to HIV treatment, the country received a United Nations millennium development goal award in 2010. This article reviews Cambodia's response to HIV over the past two decades and discusses its current efforts towards elimination of new HIV infections. Methods: A literature review of published and unpublished documents, including programme data and presentations, was conducted. Results and discussion: Cambodia classifies its response to one of the most serious HIV epidemics in Asia into three phases. In Phase I (1991–2000), when adult HIV prevalence peaked at 1.7% and incidence exceeded 20,000 cases, a nationwide HIV prevention programme targeted brothel-based sex work. Voluntary confidential counselling and testing and home-based care were introduced, and peer support groups of people living with HIV emerged. Phase II (2001–2011) observed a steady decline in adult prevalence to 0.8% and incidence to 1600 cases by 2011, and was characterized by: expanding antiretroviral treatment (coverage reaching more than 80%) and continuum of care; linking with tuberculosis and maternal and child health services; accelerated prevention among key populations, including entertainment establishment-based sex workers, men having sex with men, transgender persons, and people who inject drugs; engagement of health workers to deliver quality services; and strengthening health service delivery systems. The third phase (2012–2020) aims to attain zero new infections by 2020 through: sharpening responses to key populations at higher risk; maximizing access to community and facility-based testing and retention in prevention and care; and accelerating the transition from vertical approaches to linked/integrated approaches. Conclusions: Cambodia has tailored its prevention strategy to its own epidemic

  13. Stochastic margin-based structure learning of Bayesian network classifiers.

    Science.gov (United States)

    Pernkopf, Franz; Wohlmayr, Michael

    2013-02-01

    The margin criterion for parameter learning in graphical models gained significant impact over the last years. We use the maximum margin score for discriminatively optimizing the structure of Bayesian network classifiers. Furthermore, greedy hill-climbing and simulated annealing search heuristics are applied to determine the classifier structures. In the experiments, we demonstrate the advantages of maximum margin optimized Bayesian network structures in terms of classification performance compared to traditionally used discriminative structure learning methods. Stochastic simulated annealing requires less score evaluations than greedy heuristics. Additionally, we compare generative and discriminative parameter learning on both generatively and discriminatively structured Bayesian network classifiers. Margin-optimized Bayesian network classifiers achieve similar classification performance as support vector machines. Moreover, missing feature values during classification can be handled by discriminatively optimized Bayesian network classifiers, a case where purely discriminative classifiers usually require mechanisms to complete unknown feature values in the data first.

  14. The analysis of cross-classified categorical data

    CERN Document Server

    Fienberg, Stephen E

    2007-01-01

    A variety of biological and social science data come in the form of cross-classified tables of counts, commonly referred to as contingency tables. Until recent years the statistical and computational techniques available for the analysis of cross-classified data were quite limited. This book presents some of the recent work on the statistical analysis of cross-classified data using log-linear models, especially in the multidimensional situation.

  15. Breadboard Facility

    Science.gov (United States)

    1977-01-01

    In the sixties, Chrysler was NASA's prime contractor for the Saturn I and IB test launch vehicles. The company installed and operated at Huntsville what was known as the Saturn I/IB Development Breadboard Facility. "Breadboard" means an array of electrical and electronic equipment for performing a variety of development and test functions. This work gave Chrysler a broad capability in computerized testing to assure quality control in the development of solid-state electronic systems. Today that division is manufacturing many products not destined for NASA, most of them associated with the company's automotive line. A major project is production and quality-control testing of the "lean-burn" engine, one that has a built-in computer to control emission timing and allow the engine to run on a leaner mixture of fuel and air. Other environment-related products include vehicle emission analyzers. The newest of the line is an accurate, portable solid-state instrument for testing auto exhaust gases. The exhaust analyzers, now being produced for company dealers and for service

  16. Reducing allergic symptoms through eliminating subgingival plaque

    Directory of Open Access Journals (Sweden)

    Haryono Utomo

    2008-12-01

    Full Text Available Background: Elimination of subgingival plaque through scaling, for the prevention and treatment of periodontal diseases, is a routine procedure. It is also well known that periodontal disease is related to systemic diseases. Nevertheless, the idea that scaling procedures are also able to reduce allergic symptoms, i.e. eczema and asthma, is not easily accepted, because it contradicts the "hygiene hypothesis". However, since allergic symptoms also depend on variable factors such as genetic, environmental and infection factors, every possible effort to eliminate or avoid these factors has to be considered. Subgingival plaque is a source of infection, especially the Gram-negative bacteria that produce endotoxin (lipopolysaccharides, LPS), a potential stimulator of immunocompetent cells that may also be related to allergy, such as mast cells and basophils. In addition, it also triggers the "neurogenic switching" mechanism, which may be initiated from chronic gingivitis. Objective: This case report may explain the possible connection between subgingival plaque and allergy, based on documented cases. Case: Two adult siblings who suffered from chronic gingivitis also showed different manifestations of allergy, namely allergic dermatitis and asthma, for years. They had also undergone unsuccessful medical treatment for years. Case Management: The patients underwent deep scaling procedures, and the allergic symptoms gradually diminished within days, even without the usual medications. Conclusion: Given the effectiveness of scaling procedures, which concomitantly eliminate subgingival plaque, in these allergic patients, it is concluded that the concept is logical. Nevertheless, further verification and a collaborative study with allergy experts should be done.

  17. The global cost of eliminating avoidable blindness

    Directory of Open Access Journals (Sweden)

    Kirsten L Armstrong

    2012-01-01

    Full Text Available Aims: To complete an initial estimate of the global cost of eliminating avoidable blindness, including the investment required to build ongoing primary and secondary health care systems, as well as to eliminate the 'backlog' of avoidable blindness. This analysis also seeks to understand and articulate where key data limitations lie. Materials and Methods: Data were collected in line with a global estimation approach, including separate costing frameworks for the primary and secondary care sectors, and the treatment of backlog. Results: The global direct health cost to eliminate avoidable blindness over a 10-year period from 2011 to 2020 is estimated at $632 billion per year (2009 US$). As countries already spend $592 billion per annum on eye health, this represents additional investment of $397.8 billion over 10 years, which is $40 billion per year or $5.80 per person for each year between 2010 and 2020. This is concentrated in high-income nations, which require 68% of the investment but comprise 16% of the world's inhabitants. For all other regions, the additional investment required is $127 billion. Conclusions: This costing estimate has identified that low- and middle-income countries require less than half the additional investment compared with high-income nations. Low- and middle-income countries comprise the greater investment proportion in secondary care whereas high-income countries require the majority of investment into the primary sector. However, there is a need to improve sector data. Investment in better data will have positive flow-on effects for the eye health sector.

  18. Eliminating corner effects in square lattice simulation

    Science.gov (United States)

    Pang, Gang; Ji, Songsong; Yang, Yibo; Tang, Shaoqiang

    2017-10-01

    Using an alternative source decomposition, we propose new exact boundary conditions on numerical boundary of a square lattice for out-of-plane motion over the whole space. A set of recurrence relations are found for the resulting kernel functions, hence allow their efficient and accurate evaluation with a system of ordinary differential equations. Stability of the boundary conditions is proved rigorously. Numerical results illustrate effective suppression for spurious wave reflection, and elimination of corner effects. This approach may be extended to other lattice structures and in higher dimensions.

  19. Heuristic Drift Elimination for Personnel Tracking Systems

    Science.gov (United States)

    Borenstein, Johann; Ojeda, Lauro

    This paper pertains to the reduction of the effects of measurement errors in rate gyros used for tracking, recording, or monitoring the position of persons walking indoors. In such applications, bias drift and other gyro errors can degrade accuracy within minutes. To overcome this problem we developed the Heuristic Drift Elimination (HDE) method, that effectively corrects bias drift and other slow-changing errors. HDE works by making assumptions about walking in structured, indoor environments. The paper explains the heuristic assumptions and the HDE method, and shows experimental results. In typical applications, HDE maintains near-zero heading errors in walks of unlimited duration.
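    The heuristic can be illustrated with a toy version (not the authors' exact implementation): whenever the walker appears to be heading along one of a building's dominant directions, the heading estimate is nudged toward that direction, continuously bleeding off slow gyro drift. The axes, threshold and gain below are assumed values.

        import math

        # Toy sketch of Heuristic Drift Elimination: indoor corridors mostly run along
        # a few dominant directions, so small deviations from them are treated as drift.
        DOMINANT = [0.0, 90.0, 180.0, 270.0]   # assumed building axes, in degrees
        THRESHOLD = 7.0                        # deg: smaller deviations are treated as drift (assumed)
        GAIN = 0.1                             # fraction of the deviation removed per step (assumed)

        def hde_correct(heading_deg):
            """Nudge the gyro-derived heading toward the nearest dominant direction."""
            heading = heading_deg % 360.0
            nearest = min(DOMINANT, key=lambda d: abs((heading - d + 180.0) % 360.0 - 180.0))
            deviation = (heading - nearest + 180.0) % 360.0 - 180.0
            if abs(deviation) < THRESHOLD:      # plausibly walking straight along a corridor
                heading -= GAIN * deviation     # remove a fraction of the presumed drift
            return heading % 360.0

        # Example: a heading that has drifted to 93.5 deg is pulled back toward 90 deg.
        print(hde_correct(93.5))   # -> 93.15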

  20. Optimal classifier for imbalanced data using Matthews Correlation Coefficient metric.

    Science.gov (United States)

    Boughorbel, Sabri; Jarray, Fethi; El-Anbari, Mohammed

    2017-01-01

    Data imbalance is frequently encountered in biomedical applications. Resampling techniques can be used in binary classification to tackle this issue. However, such solutions are not desirable when the number of samples in the small class is limited. Moreover, the use of inadequate performance metrics, such as accuracy, leads to poor generalization because classifiers tend to predict the largest class. A good approach to this issue is to optimize performance metrics that are designed to handle data imbalance. The Matthews Correlation Coefficient (MCC) is widely used in bioinformatics as a performance metric. We are interested in developing a new classifier based on the MCC metric to handle imbalanced data. We derive an optimal Bayes classifier for the MCC metric using an approach based on the Frechet derivative. We show that the proposed algorithm has the nice theoretical property of consistency. Using simulated data, we verify the correctness of our optimality result by searching in the space of all possible binary classifiers. The proposed classifier is evaluated on 64 datasets spanning a wide range of data imbalance. We compare both classification performance and CPU efficiency for three classifiers: (1) the proposed algorithm (MCC-classifier), (2) the Bayes classifier with a default threshold (MCC-base), and (3) an imbalanced SVM (SVM-imba). The experimental evaluation shows that MCC-classifier performs close to SVM-imba while being simpler and more efficient.
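    For reference, MCC is computed directly from the confusion matrix; the short sketch below also shows the simple "pick the score threshold that maximizes MCC" baseline on synthetic imbalanced data. It is only an illustration of the metric, not the authors' Frechet-derivative construction.

        import numpy as np

        def mcc(y_true, y_pred):
            """Matthews Correlation Coefficient computed from the binary confusion matrix."""
            tp = np.sum((y_true == 1) & (y_pred == 1))
            tn = np.sum((y_true == 0) & (y_pred == 0))
            fp = np.sum((y_true == 0) & (y_pred == 1))
            fn = np.sum((y_true == 1) & (y_pred == 0))
            denom = np.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
            return 0.0 if denom == 0 else (tp * tn - fp * fn) / denom

        def best_threshold(y_true, scores, grid=np.linspace(0.05, 0.95, 19)):
            """Pick the score threshold that maximizes MCC (a simple tuned baseline)."""
            return max(grid, key=lambda t: mcc(y_true, (scores >= t).astype(int)))

        # Synthetic imbalanced data: roughly 5% positives with noisy scores (illustrative only).
        rng = np.random.default_rng(0)
        y = (rng.random(2000) < 0.05).astype(int)
        scores = np.clip(0.4 * y + rng.normal(0.3, 0.15, 2000), 0.0, 1.0)
        t = best_threshold(y, scores)
        print("threshold:", round(t, 2), "MCC:", round(mcc(y, (scores >= t).astype(int)), 3))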

  1. Multiple-instance learning as a classifier combining problem

    DEFF Research Database (Denmark)

    Li, Yan; Tax, David M. J.; Duin, Robert P. W.

    2013-01-01

    posteriors. Given the instance labels, the label of a bag can be obtained as a classifier combining problem. An optimal decision rule is derived that determines the threshold on the fraction of instances in a bag that is assigned to the concept class. We provide estimators for the two parameters in the model...... with the assumption that instances are drawn from a mixture distribution of the concept and the non-concept, which leads to a convenient way to solve MIL as a classifier combining problem. It is shown that instances can be classified with any standard supervised classifier by re-weighting the classification...
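    A minimal sketch of the combining rule described here: each instance receives a posterior from an ordinary supervised classifier, and the bag is labelled by thresholding the fraction of instances assigned to the concept class. The instance scores and the threshold value below are invented for illustration; the paper's estimators for the mixture parameters are not reproduced.

        import numpy as np

        def bag_label(instance_scores, threshold=0.3):
            """Label a bag positive if the fraction of instances assigned to the
            concept class reaches a threshold (the MIL-as-classifier-combining rule)."""
            concept_fraction = np.mean(np.asarray(instance_scores) >= 0.5)   # instance-level decisions
            return int(concept_fraction >= threshold)

        # Toy bags of instance posteriors from some standard supervised classifier.
        print(bag_label([0.9, 0.2, 0.1, 0.7]))   # 2 of 4 instances look like the concept -> 1
        print(bag_label([0.1, 0.2, 0.3, 0.4]))   # 0 of 4 -> 0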

  2. Developing a radiomics framework for classifying non-small cell lung carcinoma subtypes

    Science.gov (United States)

    Yu, Dongdong; Zang, Yali; Dong, Di; Zhou, Mu; Gevaert, Olivier; Fang, Mengjie; Shi, Jingyun; Tian, Jie

    2017-03-01

    Patient-targeted treatment of non-small cell lung carcinoma (NSCLC) has been well documented according to the histologic subtypes over the past decade. In parallel, quantitative image biomarkers have recently been highlighted as important diagnostic tools to facilitate histological subtype classification. In this study, we present a radiomics analysis that classifies adenocarcinoma (ADC) and squamous cell carcinoma (SqCC). We extract 52-dimensional, CT-based features (7 statistical features and 45 image texture features) to represent each nodule. We evaluate our approach on a clinical dataset including 324 ADC and 110 SqCC patients with CT image scans. Classification of these features is performed with four different machine-learning classifiers including Support Vector Machines with Radial Basis Function kernel (RBF-SVM), Random Forest (RF), K-nearest neighbor (KNN), and RUSBoost algorithms. To improve the classifiers' performance, an optimal feature subset is selected from the original feature set using an iterative forward-inclusion and backward-elimination algorithm. Extensive experimental results demonstrate that radiomics features achieve encouraging classification results on both the complete feature set (AUC=0.89) and the optimal feature subset (AUC=0.91).
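    The selection procedure is only named in the record; a generic wrapper version of iterative forward inclusion followed by backward elimination, scored by cross-validated AUC with an RBF-SVM, might look roughly like this (the data, classifier settings and stopping rule are assumptions):

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.model_selection import cross_val_score
        from sklearn.svm import SVC

        def cv_auc(X, y, feats):
            """Cross-validated AUC of an RBF-SVM restricted to the given feature subset."""
            return cross_val_score(SVC(kernel="rbf"), X[:, feats], y, cv=5, scoring="roc_auc").mean()

        def forward_backward_select(X, y):
            selected = []
            improved = True
            while improved:                       # forward inclusion: greedily add helpful features
                improved = False
                current = cv_auc(X, y, selected) if selected else 0.5
                best_gain, best_f = 0.0, None
                for f in range(X.shape[1]):
                    if f in selected:
                        continue
                    gain = cv_auc(X, y, selected + [f]) - current
                    if gain > best_gain:
                        best_gain, best_f = gain, f
                if best_f is not None:
                    selected.append(best_f)
                    improved = True
            for f in list(selected):              # backward elimination: drop redundant features
                reduced = [s for s in selected if s != f]
                if len(selected) > 1 and cv_auc(X, y, reduced) >= cv_auc(X, y, selected):
                    selected = reduced
            return selected

        # Placeholder data standing in for the 52-dimensional radiomics features.
        X, y = make_classification(n_samples=200, n_features=12, n_informative=4, random_state=0)
        print("selected feature indices:", forward_backward_select(X, y))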

  3. The utility of measles and rubella IgM serology in an elimination setting, Ontario, Canada, 2009-2014.

    Science.gov (United States)

    Bolotin, Shelly; Lim, Gillian; Dang, Vica; Crowcroft, Natasha; Gubbay, Jonathan; Mazzulli, Tony; Schabas, Richard

    2017-01-01

    In Canada, measles was eliminated in 1998 and rubella in 2000. Effective measles and rubella surveillance is vital in elimination settings, hinging on reliable laboratory methods. However, low-prevalence settings affect the predictive value of laboratory tests. We conducted an analysis to determine the performance of measles and rubella IgM testing in a jurisdiction where both infections are eliminated. 21,299 test results were extracted from the Public Health Ontario Laboratories database and 1,239 reports were extracted from the Ontario Integrated Public Health Information System (iPHIS) from 2008 and 2010 for measles and rubella, respectively, to 2014. Deterministic linkage resulted in 658 linked measles records (2009-2014) and 189 linked rubella records (2010-2014). Sixty-six iPHIS measles entries were classified as confirmed cases, of which 53 linked to laboratory data. Five iPHIS rubella entries were classified as confirmed, all linked to IgM results. The positive predictive value was 17.4% for measles and 3.6% for rubella. Sensitivity was 79.2% for measles and 100.0% for rubella. Specificity was 65.7% for measles and 25.8% for rubella. Our study confirms that a positive IgM alone does not confirm a measles case in elimination settings. This has important implications for countries that are working towards measles and rubella elimination.

  4. Injections, Cocktails and Diviners: Therapeutic Flexibility in the Context of Malaria Elimination and Drug Resistance in Northeast Cambodia

    NARCIS (Netherlands)

    Gryseels, C.; Uk, S.; Erhart, A.; Gerrets, R.; Sluydts, V.; Durnez, L.; Muela Ribera, J.; Hausmann Muela, S.; Menard, D.; Heng, S.; Sochantha, T.; D'Alessandro, U.; Coosemans, M.; Peeters Grietens, K.

    2013-01-01

    Background: Adherence to effective malaria medication is extremely important in the context of Cambodia’s elimination targets and drug resistance containment. Although the public sector health facilities are accessible to the local ethnic minorities of Ratanakiri province (Northeast Cambodia), their

  5. Iterative elimination algorithm for thermal image processing

    Directory of Open Access Journals (Sweden)

    A. H. Alkali

    2014-08-01

    Full Text Available Segmentation is employed in everyday image processing in order to remove unwanted objects present in the image. There are scenarios where segmentation alone does not do the intended job automatically. In such cases, subjective means are required to eliminate the remnants, which is time consuming, especially when multiple images are involved, and is not feasible in real-time applications. This is compounded in thermal imaging, where foreground and background objects can have similar thermal distributions, making it impossible for straight segmentation to distinguish between the two. In this study, a real-time Iterative Elimination Algorithm (IEA) was developed, and it was shown that false foreground was removed in thermal images where segmentation failed to do so. The algorithm was tested on thermal images that were segmented using inter-variance thresholding. The thermal images contained human subjects as foreground, with some background objects having a thermal distribution similar to that of the subject. Informed consent was obtained from the subject who voluntarily took part in the study. The IEA was only tested on thermal images and failed when a false background object was connected to the foreground after segmentation.

  6. Multiple classifier integration for the prediction of protein structural classes.

    Science.gov (United States)

    Chen, Lei; Lu, Lin; Feng, Kairui; Li, Wenjin; Song, Jie; Zheng, Lulu; Yuan, Youlang; Zeng, Zhenbin; Feng, Kaiyan; Lu, Wencong; Cai, Yudong

    2009-11-15

    Supervised classifiers, such as artificial neural network, partition trees, and support vector machines, are often used for the prediction and analysis of biological data. However, choosing an appropriate classifier is not straightforward because each classifier has its own strengths and weaknesses, and each biological dataset has its own characteristics. By integrating many classifiers together, people can avoid the dilemma of choosing an individual classifier out of many to achieve an optimized classification results (Rahman et al., Multiple Classifier Combination for Character Recognition: Revisiting the Majority Voting System and Its Variation, Springer, Berlin, 2002, 167-178). The classification algorithms come from Weka (Witten and Frank, Data Mining: Practical Machine Learning Tools and Techniques, Morgan Kaufmann, San Francisco, 2005) (a collection of software tools for machine learning algorithms). By integrating many predictors (classifiers) together through simple voting, the correct prediction (classification) rates are 65.21% and 65.63% for a basic training dataset and an independent test set, respectively. These results are better than any single machine learning algorithm collected in Weka when exactly the same data are used. Furthermore, we introduce an integration strategy which takes care of both classifier weightings and classifier redundancy. A feature selection strategy, called minimum redundancy maximum relevance (mRMR), is transferred into algorithm selection to deal with classifier redundancy in this research, and the weightings are based on the performance of each classifier. The best classification results are obtained when 11 algorithms are selected by mRMR method, and integrated together through majority votes with weightings. As a result, the prediction correct rates are 68.56% and 69.29% for the basic training dataset and the independent test dataset, respectively. The web-server is available at http

  7. Multi-input distributed classifiers for synthetic genetic circuits.

    Directory of Open Access Journals (Sweden)

    Oleg Kanakov

    Full Text Available For practical construction of complex synthetic genetic networks able to perform elaborate functions it is important to have a pool of relatively simple modules with different functionality which can be compounded together. To complement engineering of very different existing synthetic genetic devices such as switches, oscillators or logical gates, we propose and develop here a design of synthetic multi-input classifier based on a recently introduced distributed classifier concept. A heterogeneous population of cells acts as a single classifier, whose output is obtained by summarizing the outputs of individual cells. The learning ability is achieved by pruning the population, instead of tuning parameters of an individual cell. The present paper is focused on evaluating two possible schemes of multi-input gene classifier circuits. We demonstrate their suitability for implementing a multi-input distributed classifier capable of separating data which are inseparable for single-input classifiers, and characterize performance of the classifiers by analytical and numerical results. The simpler scheme implements a linear classifier in a single cell and is targeted at separable classification problems with simple class borders. A hard learning strategy is used to train a distributed classifier by removing from the population any cell answering incorrectly to at least one training example. The other scheme implements a circuit with a bell-shaped response in a single cell to allow potentially arbitrary shape of the classification border in the input space of a distributed classifier. Inseparable classification problems are addressed using soft learning strategy, characterized by probabilistic decision to keep or discard a cell at each training iteration. We expect that our classifier design contributes to the development of robust and predictable synthetic biosensors, which have the potential to affect applications in a lot of fields, including that of

  8. Evaluation of three classifiers in mapping forest stand types using ...

    African Journals Online (AJOL)

    Three classifiers were examined for their suitability in mapping the different forest stand types in the area (maximum likelihood, spectral angle mapper and decision tree). The results showed that using maximum likelihood classifier and ASTER imagery, different forest stand types can be accurately mapped with an overall ...

  9. Classifying spaces with virtually cyclic stabilizers for linear groups

    DEFF Research Database (Denmark)

    Degrijse, Dieter Dries; Köhl, Ralf; Petrosyan, Nansen

    2015-01-01

    We show that every discrete subgroup of GL(n, ℝ) admits a finite-dimensional classifying space with virtually cyclic stabilizers. Applying our methods to SL(3, ℤ), we obtain a four-dimensional classifying space with virtually cyclic stabilizers and a decomposition of the algebraic K-theory of its...

  10. 16 CFR 1610.4 - Requirements for classifying textiles.

    Science.gov (United States)

    2010-01-01

    ... 16 Commercial Practices 2 2010-01-01 2010-01-01 false Requirements for classifying textiles. 1610... REGULATIONS STANDARD FOR THE FLAMMABILITY OF CLOTHING TEXTILES The Standard § 1610.4 Requirements for classifying textiles. (a) Class 1, Normal Flammability. Class 1 textiles exhibit normal flammability and are...

  11. 32 CFR 2400.30 - Reproduction of classified information.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 6 2010-07-01 2010-07-01 false Reproduction of classified information. 2400.30... SECURITY PROGRAM Safeguarding § 2400.30 Reproduction of classified information. Documents or portions of... the originator or higher authority. Any stated prohibition against reproduction shall be strictly...

  12. Using naive Bayes classifier for classification of convective rainfall ...

    Indian Academy of Sciences (India)

    ... based on a naive Bayes classifier is applied. This is a simple probabilistic classifier based on applying Bayes' theorem with strong (naive) independence assumptions. For a 9-month period, the ability of SEVIRI to classify the rainfall intensity in convective clouds is evaluated using weather radar over northern Algeria.

  13. Implications of physical symmetries in adaptive image classifiers

    DEFF Research Database (Denmark)

    Sams, Thomas; Hansen, Jonas Lundbek

    2000-01-01

    It is demonstrated that rotational invariance and reflection symmetry of image classifiers lead to a reduction in the number of free parameters in the classifier. When used in adaptive detectors, e.g. neural networks, this may be used to decrease the number of training samples necessary to learn...

  14. Classifying defects in pallet stringers by ultrasonic scanning

    Science.gov (United States)

    Mohammed F. Kabir; Daniel L. Schmoldt; Philip A. Araman; Mark E. Schafer; Sang-Mook Lee

    2003-01-01

    Detecting and classifying defects are required to grade and sort pallet parts. Use of quality parts can extend the life cycle of pallets and can reduce long-term cost. An investigation has been carried out to detect and classify defects in yellow-poplar (Liriodendron tulipifera, L.) and red oak (Quercus rubra, L.) stringers using ultrasonic scanning. Data were...

  15. 33 CFR 149.405 - How are fire extinguishers classified?

    Science.gov (United States)

    2010-07-01

    ... 33 Navigation and Navigable Waters 2 2010-07-01 2010-07-01 false How are fire extinguishers... Fire Protection Equipment Firefighting Requirements § 149.405 How are fire extinguishers classified? (a) Portable and semi-portable extinguishers on a manned deepwater port must be classified using the Coast...

  16. Human classifier: Observers can deduce task solely from eye movements.

    Science.gov (United States)

    Bahle, Brett; Mills, Mark; Dodd, Michael D

    2017-07-01

    Computer classifiers have been successful at classifying various tasks using eye movement statistics. However, the question of human classification of task from eye movements has rarely been studied. Across two experiments, we examined whether humans could classify task based solely on the eye movements of other individuals. In Experiment 1, human classifiers were shown one of three sets of eye movements: Fixations, which were displayed as blue circles, with larger circles meaning longer fixation durations; Scanpaths, which were displayed as yellow arrows; and Videos, in which a neon green dot moved around the screen. There was an additional Scene manipulation in which eye movement properties were displayed either on the original scene where the task (Search, Memory, or Rating) was performed or on a black background in which no scene information was available. Experiment 2 used similar methods but only displayed Fixations and Videos with the same Scene manipulation. The results of both experiments showed successful classification of Search. Interestingly, Search was best classified in the absence of the original scene, particularly in the Fixation condition. Memory also was classified above chance with the strongest classification occurring with Videos in the presence of the scene. Additional analyses on the pattern of correct responses in these two conditions demonstrated which eye movement properties successful classifiers were using. These findings demonstrate conditions under which humans can extract information from eye movement characteristics in addition to providing insight into the relative success/failure of previous computer classifiers.

  17. 40 CFR 152.175 - Pesticides classified for restricted use.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 23 2010-07-01 2010-07-01 false Pesticides classified for restricted...) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Classification of Pesticides § 152.175 Pesticides classified for restricted use. The following uses of pesticide products containing the...

  18. Prediction Scores as a Window into Classifier Behavior

    NARCIS (Netherlands)

    M. Katehara (Medha); E.M.A.L. Beauxis-Aussalet (Emmanuelle); B. Alsallakh (Bilal)

    2017-01-01

    Most multi-class classifiers make their prediction for a test sample by scoring the classes and selecting the one with the highest score. Analyzing these prediction scores is useful to understand the classifier behavior and to assess its reliability. We present an interactive

  19. Blocking Effects in the Learning of Chinese Classifiers

    Science.gov (United States)

    Paul, Jing Z.; Grüter, Theres

    2016-01-01

    This study investigated order-of-learning effects on the acquisition of classifier-noun associations in Chinese in two experiments modeled after Arnon and Ramscar's (2012) study of artificial language learning. In Experiment 1, learners with no prior exposure to Chinese showed better learning of classifier-noun associations when exposed to larger…

  20. Irradiation Facilities at CERN

    CERN Document Server

    Gkotse, Blerina; Carbonez, Pierre; Danzeca, Salvatore; Fabich, Adrian; Garcia Alia, Ruben; Glaser, Maurice; Gorine, Georgi; Jaekel, Martin Richard; Mateu Suau, Isidre; Pezzullo, Giuseppe; Pozzi, Fabio; Ravotti, Federico; Silari, Marco; Tali, Maris

    2017-01-01

    CERN provides unique irradiation facilities for applications in many scientific fields. This paper summarizes the facilities currently operating for proton, gamma, mixed-field and electron irradiations, including their main usage, characteristics and information about their operation. The new CERN irradiation facilities database is also presented. This includes not only CERN facilities but also irradiation facilities available worldwide.

  1. An ensemble of dissimilarity based classifiers for Mackerel gender determination

    Science.gov (United States)

    Blanco, A.; Rodriguez, R.; Martinez-Maranon, I.

    2014-03-01

    Mackerel is an undervalued fish captured by European fishing vessels. One way to add value to this species is to classify specimens according to sex. Colour measurements were performed on gonads extracted from female and male Mackerel (fresh and defrosted) to find differences between the sexes. Several linear and non-linear classifiers such as Support Vector Machines (SVM), k Nearest Neighbors (k-NN) or Diagonal Linear Discriminant Analysis (DLDA) can be applied to this problem. However, they are usually based on Euclidean distances that fail to reflect the sample proximities accurately. Classifiers based on non-Euclidean dissimilarities misclassify a different set of patterns. We combine different kinds of dissimilarity-based classifiers. Diversity is induced by considering a set of complementary dissimilarities for each model. The experimental results suggest that our algorithm helps to improve classifiers based on a single dissimilarity.
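    A bare-bones version of the idea, with one simple nearest-neighbour rule per dissimilarity and a majority vote across them, is sketched below; the particular distance measures and the 1-NN base rule are assumptions rather than the models used in the paper.

        import numpy as np
        from scipy.spatial.distance import cdist
        from sklearn.datasets import make_classification
        from sklearn.model_selection import train_test_split

        # Placeholder data standing in for the colour measurements.
        X, y = make_classification(n_samples=300, n_features=6, random_state=1)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

        def one_nn_predict(metric):
            """1-NN classifier built on a single (possibly non-Euclidean) dissimilarity."""
            d = cdist(X_te, X_tr, metric=metric)
            return y_tr[np.argmin(d, axis=1)]

        metrics = ["euclidean", "cityblock", "cosine", "chebyshev"]   # complementary dissimilarities
        votes = np.array([one_nn_predict(m) for m in metrics])
        ensemble = (votes.mean(axis=0) >= 0.5).astype(int)            # majority vote across dissimilarities
        print("ensemble accuracy:", np.mean(ensemble == y_te))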

  2. North Slope, Alaska ESI: FACILITY (Facility Points)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This data set contains data for oil field facilities for the North Slope of Alaska. Vector points in this data set represent oil field facility locations. This data...

  3. Hepatic, renal, and total body galactose elimination in the pig

    DEFF Research Database (Denmark)

    Winkler, K; Henriksen, Jens Henrik; Tygstrup, N

    1993-01-01

    reabsorption (Tm 178 +/- 3.0 mumol/min, Km 3.8 +/- 0.9 mmol/l, n = 20). Metabolic conversion of galactose in the kidney was not demonstrable. At all concentrations studied (0.4-5.8 mmol/l), total galactose elimination from the body exceeded the sum of hepatic and renal elimination by approximately 100 mumol...... that estimation of the hepatic galactose elimination capacity from whole body elimination curves requires correction for renal removal of galactose....

  4. The Effect of non-Hermiticity on Adiabatic Elimination

    OpenAIRE

    Sharaf, Rahman; Dehghani, Mojgan; Darbari, Sara; Ramezani, Hamidreza

    2017-01-01

    We investigate the influence of non-Hermiticity on the adiabatic elimination in coupled waveguides. We show that adiabatic elimination is not affected when the system is in parity-time symmetric phase. However, in the broken phase the eliminated waveguide loses its darkness namely its amplitude starts increasing, which means adiabatic elimination does not hold in the broken phase. Our results can advance the control of the dynamics in coupled laser cavities, and help the design of controllabl...

  5. Lean for Government: Eliminating the Seven Wastes

    Science.gov (United States)

    Shepherd, Christena C.

    2012-01-01

    With shrinking budgets and a slow economy, it is becoming increasingly important for all government agencies to become more efficient. Citizens expect and deserve efficient and effective services from federal, state and local government agencies. One of the best methods to improve efficiency and eliminate waste is to institute the business process improvement methodologies known collectively as Lean; however, with reduced budgets, it may not be possible to train everyone in Lean or to engage the services of a trained consultant. It is possible, however, to raise awareness of the "Seven Wastes" of Lean in each employee, and encourage them to identify areas for improvement. Management commitment is vital to the success of these initiatives, and it is also important to develop the right metrics that will track the success of these changes.

  6. Elimination of photoresist linewidth slimming by fluorination

    Science.gov (United States)

    Garza, Cesar M.; Conley, Willard E.

    2004-05-01

    Typically, resist performance has lagged behind exposure tools as new, shorter wavelengths are introduced in the never-ending industry quest to print smaller features. Over time, however, the performance improves until it matches or exceeds that of the resists used in the previous wavelength node; 193 nm resists have been no exception. Their resolution and stability have improved, but one issue that remains is linewidth slimming. This phenomenon consists of a reduction of resist features when they are exposed to an electron beam in a scanning electron microscope during linewidth metrology. Although this phenomenon has been well described and reasonably well understood, no solution exists to eliminate this problem. In this paper we show linewidth slimming can be significantly reduced by fluorinating the resist after the relief image has been developed, keeping the lithographic dimensions unchanged.

  7. Adaptive elimination of synchronization in coupled oscillator

    Science.gov (United States)

    Zhou, Shijie; Ji, Peng; Zhou, Qing; Feng, Jianfeng; Kurths, Jürgen; Lin, Wei

    2017-08-01

    We present here an adaptive control scheme with a feedback delay to achieve elimination of synchronization in a large population of coupled and synchronized oscillators. We validate the feasibility of this scheme not only in the coupled Kuramoto’s oscillators with a unimodal or bimodal distribution of natural frequency, but also in two representative models of neuronal networks, namely, the FitzHugh-Nagumo spiking oscillators and the Hindmarsh-Rose bursting oscillators. More significantly, we analytically illustrate the feasibility of the proposed scheme with a feedback delay and reveal how the exact topological form of the bimodal natural frequency distribution influences the scheme performance. We anticipate that our developed scheme will deepen the understanding and refinement of those controllers, e.g. techniques of deep brain stimulation, which have been implemented in remedying some synchronization-induced mental disorders including Parkinson disease and epilepsy.
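
    The control idea in the record above is easy to illustrate numerically. Below is a minimal sketch, not the authors' implementation: a Kuramoto population with a delayed mean-field feedback whose gain is adapted while synchrony persists; the gain-update rule, delay and all parameter values are illustrative assumptions.

```python
import numpy as np

# Minimal illustration: delayed mean-field feedback whose gain adapts to suppress
# synchronization in a Kuramoto population. All parameters are arbitrary choices.
rng = np.random.default_rng(0)
N, K, dt, T = 200, 2.0, 0.01, 40.0           # oscillators, coupling, time step, horizon
tau_steps = int(0.5 / dt)                    # feedback delay of 0.5 time units
omega = rng.normal(0.0, 1.0, N)              # unimodal natural frequencies
theta = rng.uniform(0, 2 * np.pi, N)
gain, eta = 0.0, 0.5                         # adaptive feedback gain and its learning rate
history = []                                 # stores past complex order parameters

for step in range(int(T / dt)):
    z = np.exp(1j * theta).mean()            # complex order parameter
    r = np.abs(z)
    history.append(z)
    z_delayed = history[-tau_steps] if len(history) > tau_steps else 0.0
    # Delayed mean-field feedback injected into every oscillator
    control = -gain * np.imag(z_delayed * np.exp(-1j * theta))
    coupling = K * r * np.sin(np.angle(z) - theta)
    theta = (theta + dt * (omega + coupling + control)) % (2 * np.pi)
    gain += dt * eta * r                     # grow the gain while synchrony persists

print(f"final order parameter r = {np.abs(np.exp(1j * theta).mean()):.3f}")
```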

  8. RESULTS of the "ELIMINATING NOISE" campaign

    CERN Multimedia

    SC Unit

    2008-01-01

    From 4 to 6 August, CERN’s nurses conducted a screening campaign entitled "Eliminating noise". This campaign was especially aimed at young people exposed to noise during their leisure hours (playing in a band, listening to MP3 players, attending concerts, etc.). In all, 166 people attended the infirmary, where they were able to receive personalised advice, documentation and, above all, a hearing test (audiogram). While the high attendance of people in the younger age category (18-30) was a success, their audiogram data were a cause for concern, with 24.5% showing abnormal results, hearing deficiencies which, we should remind you, are irreversible. It should be noted that such conditions are almost exclusively caused by noise exposure in a non-professional environment (leisure activities, music, etc.). This latest campaign confirms the harmful effects of noise on people’s hearing due to the absence or insufficiency of protective equipment during music-related activities; this further unde...

  9. RESULTS of the "ELIMINATING NOISE" campaign

    CERN Document Server

    SC Unit

    2008-01-01

    From 4 to 6 August, CERN’s nurses conducted a screening campaign entitled "Eliminating noise". This campaign was especially aimed at young people exposed to noise during their leisure hours (playing in a band, listening to MP3 players, attending concerts, etc.). In all, 166 people attended the Infirmary, where they were able to receive personalised advice, documentation and, above all, a hearing test (audiogram). While the high attendance of people in the younger age category (18-30) was a success, their audiogram data were a cause for concern, with 24.5% showing abnormal results, hearing deficiencies which, we should remind you, are irreversible. It should be noted that such conditions are almost exclusively caused by noise exposure in a non-professional environment (leisure activities, music, etc.). This latest campaign confirms the harmful effects of noise on people’s hearing due to the absence or insufficiency of protective equipment during music-related activities; this further unde...

  10. Eliminating paediatric infections and keeping mothers alive

    Directory of Open Access Journals (Sweden)

    Gray G

    2012-11-01

    Full Text Available The global plan of reducing the number of new child HIV infections and a reduction in the number of HIV-related maternal deaths by 2015 will require inordinate political commitment and strengthening of health systems in Sub-Saharan Africa where the burden of HIV infections in pregnant women is the highest. Preventing HIV infection in women of child-bearing age and unwanted pregnancies in HIV-positive women forms the cornerstone of long-term control of paediatric HIV infections. To achieve the goal of eliminating paediatric HIV infection by 2015, health systems strengthening to address prevention of mother-to-child HIV transmission cascade attrition and focusing on the elimination of breastmilk transmission is critical. Understanding the pathogenesis of breastmilk transmission and the mechanisms by which antiretroviral therapy impacts on transmission through this compartment will drive future interventions. Identifying and retaining HIV-positive pregnant women in care and committed to long-term antiretroviral therapy will improve maternal outcomes and concomitant reductions in maternal mortality. Research assessing the natural history of HIV infection and long-term outcomes in women who interrupt antiretroviral therapy post-weaning is urgently required. Data on the outcome of women who opt to continue the long-term use of antiretroviral therapy after initiating therapy during pregnancy will determine future policy in countries considering option B+. The prevalence of antiretroviral resistance and impact on survival in infants who sero-convert whilst receiving neonatal prophylaxis, or are exposed to maternal HAART through breastmilk at a population level, are currently unknown. In addition to the provision of biomedical interventions, healthcare workers and policy makers must address the structural, cultural and community issues that impact on treatment uptake, adherence to medication and retention in care.

  11. Jupiter Laser Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Jupiter Laser Facility is an institutional user facility in the Physical and Life Sciences Directorate at LLNL. The facility is designed to provide a high degree...

  12. Basic Research Firing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Basic Research Firing Facility is an indoor ballistic test facility that has recently transitioned from a customer-based facility to a dedicated basic research...

  13. Unbiased feature selection through successive elimination of poor performers for EEG classification

    Science.gov (United States)

    Siddiqui, Khalid J.

    1996-04-01

    Electroencephalogram (EEG) pattern recognition problem is considered as a composite of three subproblems: feature extraction, feature selection, and pattern classification. Focusing particularly on the feature selection issue, each subproblem is reviewed briefly and a new method for feature selection is proposed. The method suggests that first one shall extract as much information (features) as conveniently possible in several pattern information domains and then apply the proposed unbiased successive feature elimination process to remove redundant and poor features. From this set select a significantly smaller, yet useful, feature subset that enhances the performance of the classifier. The successive feature elimination process is formally described. The method is successfully applied to an EEG signal classification problem. The features selected by the algorithm are used to classify three signal classes. The classes identified were eye artifacts, muscle artifacts, and clean (subject in stationary state). Two hundred samples for each of the three classes were selected and the data set was arbitrarily divided into two subsets: design subset, and testing subset. A proximity index classifier using Mahalanobis distance as the proximity criterion was developed using the smaller feature subset. The system was trained on the design set. The recognition performance on the design set was 92.33%. The recognition performance on the testing set was 88.67% by successfully identifying the samples in eye-blinks, muscle response, and clean classes, respectively, with 80%, 97%, and 89%. This performance is very encouraging. In addition, the method is computationally inexpensive and particularly useful for large data set problems. The method further reduces the need for a careful feature determination problem that a system designer usually encounters during the initial design phase of a pattern classifier.
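
    The two ingredients described in this record, a Mahalanobis-distance proximity classifier and successive elimination of poor features, can be sketched as follows. This is an illustrative reconstruction under assumptions, not the paper's code: the greedy backward-elimination criterion (drop the feature whose removal costs the least validation accuracy) and the pooled-covariance classifier are simplifications.

```python
import numpy as np

def mahalanobis_classifier(X_train, y_train):
    """Fit per-class means and a pooled covariance; classify by minimum Mahalanobis distance."""
    classes = np.unique(y_train)
    means = {c: X_train[y_train == c].mean(axis=0) for c in classes}
    cov = np.cov(X_train, rowvar=False) + 1e-6 * np.eye(X_train.shape[1])
    cov_inv = np.linalg.inv(cov)

    def predict(X):
        dists = np.stack([
            np.einsum('ij,jk,ik->i', X - means[c], cov_inv, X - means[c])
            for c in classes], axis=1)
        return classes[dists.argmin(axis=1)]
    return predict

def backward_elimination(X, y, X_val, y_val, keep=5):
    """Successively drop the feature whose removal hurts validation accuracy the least."""
    features = list(range(X.shape[1]))
    while len(features) > keep:
        scores = []
        for f in features:
            trial = [g for g in features if g != f]
            pred = mahalanobis_classifier(X[:, trial], y)(X_val[:, trial])
            scores.append((np.mean(pred == y_val), f))
        best_acc, worst_feature = max(scores)   # removing this feature costs the least
        features.remove(worst_feature)
    return features
```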

  14. Cross-Subject EEG Feature Selection for Emotion Recognition Using Transfer Recursive Feature Elimination.

    Science.gov (United States)

    Yin, Zhong; Wang, Yongxiong; Liu, Li; Zhang, Wei; Zhang, Jianhua

    2017-01-01

Using machine-learning methodologies to analyze EEG signals becomes increasingly attractive for recognizing human emotions because of the objectivity of physiological data and the capability of the learning principles on modeling emotion classifiers from heterogeneous features. However, the conventional subject-specific classifiers may induce additional burdens to each subject for preparing multiple-session EEG data as training sets. To this end, we developed a new EEG feature selection approach, transfer recursive feature elimination (T-RFE), to determine a set of the most robust EEG indicators with stable geometrical distribution across a group of training subjects and a specific testing subject. A validating set is introduced to independently determine the optimal hyper-parameter and the feature ranking of the T-RFE model aiming at controlling the overfitting. The effectiveness of the T-RFE algorithm for such a cross-subject emotion classification paradigm has been validated on the DEAP database. With a linear least square support vector machine classifier implemented, the performance of the T-RFE is compared against several conventional feature selection schemes and a statistically significant improvement has been found. The classification rate and F-score achieve 0.7867, 0.7526, 0.7875, and 0.8077 for arousal and valence dimensions, respectively, and outperform several recently reported works on the same database. In the end, the T-RFE based classifier is compared against two subject-generic classifiers in the literature. The investigation of the computational time for all classifiers indicates the accuracy improvement of the T-RFE is at the cost of a longer training time.
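
    The recursive-feature-elimination core of the approach is what scikit-learn's RFE implements; the sketch below shows plain RFE with a linear SVM on synthetic stand-in features. The transfer/validation-set machinery that makes T-RFE cross-subject is omitted, and the data and label construction are assumptions for illustration only.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.feature_selection import RFE
from sklearn.model_selection import train_test_split

# Synthetic stand-in for EEG features: 200 trials x 64 features, binary valence label.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 64))
y = (X[:, :4].sum(axis=1) + 0.5 * rng.normal(size=200) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=1)

# Recursively eliminate the weakest features according to the linear SVM weights.
selector = RFE(LinearSVC(C=1.0, max_iter=5000), n_features_to_select=8, step=1)
selector.fit(X_train, y_train)

print("selected feature indices:", np.flatnonzero(selector.support_))
print("test accuracy:", selector.score(X_test, y_test))
```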

  15. Malignancy and Abnormality Detection of Mammograms using Classifier Ensembling

    Directory of Open Access Journals (Sweden)

    Nawazish Naveed

    2011-07-01

Full Text Available Breast cancer detection and diagnosis is a critical and complex procedure that demands a high degree of accuracy. In computer aided diagnostic systems, breast cancer detection is a two-stage procedure: the first stage classifies mammograms as malignant or benign, while the second stage detects the type of abnormality. In this paper, we have developed a novel architecture to enhance the classification of malignant and benign mammograms using multi-classification of malignant mammograms into six abnormality classes. DWT (Discrete Wavelet Transformation) features are extracted from preprocessed images and passed through different classifiers. To improve accuracy, results generated by various classifiers are ensembled. The genetic algorithm is used to find optimal weights rather than assigning weights to the results of classifiers on the basis of heuristics. The mammograms declared as malignant by ensemble classifiers are divided into six classes. The ensemble classifiers are further used for multi-classification using the one-against-all technique. The output of all ensemble classifiers is combined by product, median and mean rule. It has been observed that the accuracy of classification of abnormalities is more than 97% in the case of the mean rule. The Mammographic Image Analysis Society dataset is used for experimentation.
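
    The fusion step, combining per-classifier class probabilities by mean, median or product rules with optional weights, can be sketched as below. The genetic-algorithm weight search is replaced here by fixed example weights, and all arrays are synthetic placeholders.

```python
import numpy as np

def combine(probas, weights=None, rule="mean"):
    """Combine class-probability matrices from several classifiers.

    probas: list of (n_samples, n_classes) arrays, one per classifier.
    weights: optional per-classifier weights (e.g. tuned by a genetic algorithm);
             applied in the mean rule only.
    """
    P = np.stack(probas)                      # (n_classifiers, n_samples, n_classes)
    if rule == "mean":
        fused = np.average(P, axis=0, weights=weights)
    elif rule == "median":
        fused = np.median(P, axis=0)
    elif rule == "product":
        fused = np.prod(P, axis=0)
    else:
        raise ValueError(rule)
    return fused.argmax(axis=1)

# Example: three classifiers voting on two samples over six abnormality classes.
rng = np.random.default_rng(0)
p1, p2, p3 = (rng.dirichlet(np.ones(6), size=2) for _ in range(3))
print(combine([p1, p2, p3], weights=[0.5, 0.3, 0.2], rule="mean"))
```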

  16. Aperture area measurement facility

    Data.gov (United States)

    Federal Laboratory Consortium — NIST has established an absolute aperture area measurement facility for circular and near-circular apertures use in radiometric instruments. The facility consists of...

  17. Facility Registry Service (FRS)

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Facility Registry Service (FRS) provides an integrated source of comprehensive (air, water, and waste) environmental information about facilities across EPA,...

  18. Licensed Healthcare Facilities

    Data.gov (United States)

    California Department of Resources — The Licensed Healthcare Facilities point layer represents the locations of all healthcare facilities licensed by the State of California, Department of Health...

  19. High Throughput Facility

    Data.gov (United States)

Federal Laboratory Consortium — Argonne's high throughput facility provides highly automated and parallel approaches to material and materials chemistry development. The facility allows scientists...

  20. Malaria elimination in Haiti by the year 2020: an achievable goal?

    Science.gov (United States)

    Boncy, Paul Jacques; Adrien, Paul; Lemoine, Jean Frantz; Existe, Alexandre; Henry, Patricia Jean; Raccurt, Christian; Brasseur, Philippe; Fenelon, Natael; Dame, John B; Okech, Bernard A; Kaljee, Linda; Baxa, Dwayne; Prieur, Eric; El Badry, Maha A; Tagliamonte, Massimiliano S; Mulligan, Connie J; Carter, Tamar E; Beau de Rochars, V Madsen; Lutz, Chelsea; Parke, Dana M; Zervos, Marcus J

    2015-06-05

    Haiti and the Dominican Republic, which share the island of Hispaniola, are the last locations in the Caribbean where malaria still persists. Malaria is an important public health concern in Haiti with 17,094 reported cases in 2014. Further, on January 12, 2010, a record earthquake devastated densely populated areas in Haiti including many healthcare and laboratory facilities. Weakened infrastructure provided fertile reservoirs for uncontrolled transmission of infectious pathogens. This situation results in unique challenges for malaria epidemiology and elimination efforts. To help Haiti achieve its malaria elimination goals by year 2020, the Laboratoire National de Santé Publique and Henry Ford Health System, in close collaboration with the Direction d'Épidémiologie, de Laboratoire et de Recherches and the Programme National de Contrôle de la Malaria, hosted a scientific meeting on "Elimination Strategies for Malaria in Haiti" on January 29-30, 2015 at the National Laboratory in Port-au-Prince, Haiti. The meeting brought together laboratory personnel, researchers, clinicians, academics, public health professionals, and other stakeholders to discuss main stakes and perspectives on malaria elimination. Several themes and recommendations emerged during discussions at this meeting. First, more information and research on malaria transmission in Haiti are needed including information from active surveillance of cases and vectors. Second, many healthcare personnel need additional training and critical resources on how to properly identify malaria cases so as to improve accurate and timely case reporting. Third, it is necessary to continue studies genotyping strains of Plasmodium falciparum in different sites with active transmission to evaluate for drug resistance and impacts on health. Fourth, elimination strategies outlined in this report will continue to incorporate use of primaquine in addition to chloroquine and active surveillance of cases. Elimination of

  1. Automatically classifying question types for consumer health questions.

    Science.gov (United States)

    Roberts, Kirk; Kilicoglu, Halil; Fiszman, Marcelo; Demner-Fushman, Dina

    2014-01-01

    We present a method for automatically classifying consumer health questions. Our thirteen question types are designed to aid in the automatic retrieval of medical answers from consumer health resources. To our knowledge, this is the first machine learning-based method specifically for classifying consumer health questions. We demonstrate how previous approaches to medical question classification are insufficient to achieve high accuracy on this task. Additionally, we describe, manually annotate, and automatically classify three important question elements that improve question classification over previous techniques. Our results and analysis illustrate the difficulty of the task and the future directions that are necessary to achieve high-performing consumer health question classification.

  2. Automatically Classifying Question Types for Consumer Health Questions

    Science.gov (United States)

    Roberts, Kirk; Kilicoglu, Halil; Fiszman, Marcelo; Demner-Fushman, Dina

    2014-01-01

    We present a method for automatically classifying consumer health questions. Our thirteen question types are designed to aid in the automatic retrieval of medical answers from consumer health resources. To our knowledge, this is the first machine learning-based method specifically for classifying consumer health questions. We demonstrate how previous approaches to medical question classification are insufficient to achieve high accuracy on this task. Additionally, we describe, manually annotate, and automatically classify three important question elements that improve question classification over previous techniques. Our results and analysis illustrate the difficulty of the task and the future directions that are necessary to achieve high-performing consumer health question classification. PMID:25954411

  3. Automatic speech recognition using a predictive echo state network classifier.

    Science.gov (United States)

    Skowronski, Mark D; Harris, John G

    2007-04-01

    We have combined an echo state network (ESN) with a competitive state machine framework to create a classification engine called the predictive ESN classifier. We derive the expressions for training the predictive ESN classifier and show that the model was significantly more noise robust compared to a hidden Markov model in noisy speech classification experiments by 8+/-1 dB signal-to-noise ratio. The simple training algorithm and noise robustness of the predictive ESN classifier make it an attractive classification engine for automatic speech recognition.

  4. On the Importance of Elimination Heuristics in Lazy Propagation

    DEFF Research Database (Denmark)

    Madsen, Anders Læsø; Butz, Cory J.

    2012-01-01

elimination orders on-line. This paper considers the importance of elimination heuristics in LP when using Variable Elimination (VE) as the message and single marginal computation algorithm. It considers well-known cost measures for selecting the next variable to eliminate and a new cost measure....... The empirical evaluation examines different heuristics as well as sequences of cost measures, and was conducted on real-world and randomly generated Bayesian networks. The results show that for most cases performance is robust relative to the cost measure used and in some cases the elimination heuristic can have...
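
    The cost measures discussed in this record (e.g. minimum degree, minimum fill-in) plug into a greedy elimination-ordering loop over the moral graph. The sketch below is a generic illustration of that loop, not the Lazy Propagation implementation; the example graph is made up.

```python
import itertools

def fill_in_edges(graph, v):
    """Edges that must be added among v's neighbours when v is eliminated."""
    nbrs = list(graph[v])
    return [(a, b) for a, b in itertools.combinations(nbrs, 2) if b not in graph[a]]

def min_fill_cost(graph, v):
    return len(fill_in_edges(graph, v))

def min_degree_cost(graph, v):
    return len(graph[v])

def greedy_elimination_order(graph, cost=min_fill_cost):
    """Greedily eliminate the variable with the lowest cost, connecting its neighbours."""
    g = {v: set(n) for v, n in graph.items()}
    order = []
    while g:
        v = min(g, key=lambda u: cost(g, u))
        for a, b in fill_in_edges(g, v):
            g[a].add(b); g[b].add(a)
        for n in g[v]:
            g[n].discard(v)
        del g[v]
        order.append(v)
    return order

# Undirected moral graph of a small Bayesian network (adjacency sets).
moral = {'A': {'B', 'C'}, 'B': {'A', 'C', 'D'}, 'C': {'A', 'B', 'D'}, 'D': {'B', 'C'}}
print(greedy_elimination_order(moral, cost=min_fill_cost))
```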

  5. Testing strategy for classifying self-heating substances for transport of dangerous goods.

    Science.gov (United States)

    Chervin, Sima; Bodman, Glenn T

    2004-11-11

    A testing strategy for the classification of self-heating substances for transport of dangerous goods is proposed. The strategy was developed based on the tests described and correlations used in the UN Recommendations. It was demonstrated that the value of activation energy of the exothermic reaction has a significant impact on the extrapolation of test results with regard to different container sizes and temperatures. Based on a combination of the Grewer Oven test screening, the 25 mm cube test at 140 degrees C, and the determination of the activation energy of a specific material, a flowchart is presented for classifying chemicals as self-heating. The presented approach allows predicting chemical stability in large containers more accurately and eliminates the need to perform hazardous large-scale tests of energetic chemicals in a laboratory.

  6. Guide to research facilities

    Energy Technology Data Exchange (ETDEWEB)

    1993-06-01

    This Guide provides information on facilities at US Department of Energy (DOE) and other government laboratories that focus on research and development of energy efficiency and renewable energy technologies. These laboratories have opened these facilities to outside users within the scientific community to encourage cooperation between the laboratories and the private sector. The Guide features two types of facilities: designated user facilities and other research facilities. Designated user facilities are one-of-a-kind DOE facilities that are staffed by personnel with unparalleled expertise and that contain sophisticated equipment. Other research facilities are facilities at DOE and other government laboratories that provide sophisticated equipment, testing areas, or processes that may not be available at private facilities. Each facility listing includes the name and phone number of someone you can call for more information.

  7. A Comprehensive Copper Compliance Strategy: Implementing Regulatory Guidance at Pearl Harbor Naval Shipyard & Intermediate Maintenance Facility

    National Research Council Canada - National Science Library

    Earley, P. J; Rosen, G; Rivera-Duarte, I; Gauthier, R. D; Arias-Thode, Y; Thompson, J; Swope, B

    2007-01-01

    Studies were performed to develop a new National Pollution Discharge Elimination Systems Permit for the discharge of effluents from the Pearl Harbor Naval Shipyard and Intermediate Maintenance Facility into Pearl Harbor...

  8. NPDES Permit for Soap Creek Associates Wastewater Treatment Facility in Montana

    Science.gov (United States)

    Under National Pollutant Discharge Elimination System permit number MT-0023183, Soap Creek Associates, Inc. is authorized to discharge from its wastewater treatment facility located in West, Bighorn County, Montana, to Soap Creek.

  9. Automatically classifying question types for consumer health questions

    National Research Council Canada - National Science Library

    Roberts, Kirk; Kilicoglu, Halil; Fiszman, Marcelo; Demner-Fushman, Dina

    2014-01-01

    We present a method for automatically classifying consumer health questions. Our thirteen question types are designed to aid in the automatic retrieval of medical answers from consumer health resources...

  10. Using decision tree classifier to predict income levels

    OpenAIRE

    Bekena, Sisay Menji

    2017-01-01

    In this study Random Forest Classifier machine learning algorithm is applied to predict income levels of individuals based on attributes including education, marital status, gender, occupation, country and others. Income levels are defined as a binary variable 0 for income

  11. Evaluation of diagnostic classifiers using artificial clinical cases

    Directory of Open Access Journals (Sweden)

    Antczak Karol

    2017-01-01

Full Text Available Evaluation of classifiers in diagnosis support systems is a non-trivial task. It can be done in the form of a controlled and blinded clinical trial, which is often difficult and costly. We propose a new method for generating artificial medical cases from a knowledge base, utilizing the concept of so-called medical diamonds. Cases generated using this method have features analogous to those of a double-blinded trial and, thus, can be used for measuring sensitivity and specificity of diagnostic classifiers. This is an easy and low-cost method of evaluation and comparison of classifiers in diagnosis support systems. We demonstrate that this method is able to produce valuable results when used for evaluation of similarity-based classifiers as well as shallow and deep neural networks.

  12. New microsatellite markers classifying nontoxic and toxic Jatropha ...

    Indian Academy of Sciences (India)

New microsatellite markers classifying nontoxic and toxic Jatropha curcas. Patcharin Tanya; Sujinna Dachapak; Maung Maung Tar; Peerasak Srinives. Journal of Genetics, Volume 90, Online resources, 2011, pp. e76-e78 ...

  13. One pass learning for generalized classifier neural network.

    Science.gov (United States)

    Ozyildirim, Buse Melis; Avci, Mutlu

    2016-01-01

The generalized classifier neural network, introduced as a kind of radial basis function neural network, uses a gradient-descent-optimized smoothing parameter value to provide efficient classification. However, this optimization consumes quite a long time, which is a drawback. In this work, one pass learning for the generalized classifier neural network is proposed to overcome this disadvantage. The proposed method utilizes the standard deviation of each class to calculate the corresponding smoothing parameter. Since different datasets may have different standard deviations and data distributions, the proposed method tries to handle these differences by defining two functions for smoothing parameter calculation. Thresholding is applied to determine which function will be used. One of these functions is defined for datasets having different ranges of values. It provides balanced smoothing parameters for these datasets through a logarithmic function and by changing the operation range to the lower boundary. The other function calculates the smoothing parameter value for classes whose standard deviation is smaller than the threshold value. The proposed method is tested on 14 datasets, and the performance of the one-pass learning generalized classifier neural network is compared with that of the probabilistic neural network, radial basis function neural network, extreme learning machines, and the standard and logarithmic learning generalized classifier neural networks in the MATLAB environment. One-pass learning generalized classifier neural network provides more than a thousand times faster classification than the standard and logarithmic generalized classifier neural networks. Due to its classification accuracy and speed, the one-pass generalized classifier neural network can be considered as an efficient alternative to the probabilistic neural network. Test results show that the proposed method overcomes the computational drawback of the generalized classifier neural network and may increase the classification performance. Copyright
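
    The one-pass idea, deriving each class's smoothing parameter directly from the class standard deviation instead of tuning it by gradient descent, can be sketched with a small PNN/RBF-style classifier. The per-class sigma rule below is a simplified stand-in for the paper's two thresholded functions, and the toy data are assumptions.

```python
import numpy as np

class OnePassRBFClassifier:
    """PNN/RBF-style classifier whose per-class smoothing parameter is taken
    directly from the class standard deviation (no gradient-descent tuning)."""

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.centers_ = {c: X[y == c] for c in self.classes_}
        # One-pass smoothing: mean per-feature std of each class, floored for stability.
        self.sigma_ = {c: max(X[y == c].std(axis=0).mean(), 1e-3) for c in self.classes_}
        return self

    def predict(self, X):
        scores = []
        for c in self.classes_:
            d2 = ((X[:, None, :] - self.centers_[c][None, :, :]) ** 2).sum(axis=2)
            scores.append(np.exp(-d2 / (2 * self.sigma_[c] ** 2)).mean(axis=1))
        return self.classes_[np.argmax(np.stack(scores, axis=1), axis=1)]

# Toy usage with two Gaussian blobs.
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0, 1, (50, 3)), rng.normal(3, 2, (50, 3))])
y = np.array([0] * 50 + [1] * 50)
print((OnePassRBFClassifier().fit(X, y).predict(X) == y).mean())
```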

  14. Subtractive Fuzzy Classifier Based Driver Distraction Levels Classification Using EEG

    OpenAIRE

    Wali, Mousa Kadhim; Murugappan, Murugappan; Ahmad, Badlishah

    2013-01-01

[Purpose] In earlier studies of driver distraction, researchers classified distraction into two levels (not distracted, and distracted). This study classified four levels of distraction (neutral, low, medium, high). [Subjects and Methods] Fifty Asian subjects (n=50, 43 males, 7 females), age range 20-35 years, who were free from any disease, participated in this study. Wireless EEG signals were recorded by 14 electrodes during four types of distraction stimuli (Global Position Systems (GPS), ...

  15. Automatic Music Genre Classification Using Ensemble of Classifiers

    OpenAIRE

    Silla Jr, Carlos N.; Kaestner, Celso A. A.; Koerich, Alessandro L.

    2007-01-01

    This paper presents a novel approach to the task of automatic music genre classification which is based on multiple feature vectors and ensemble of classifiers. Multiple feature vectors are extracted from a single music piece. First, three 30-second music segments, one from the beginning, one from the middle and one from end part of a music piece are selected and feature vectors are extracted from each segment. Individual classifiers are trained to account for each feature vector extracted fr...

  16. Progress toward measles elimination in Kyrgyzstan.

    Science.gov (United States)

    Suvanbekov, Akbar; Kitarova, Gulzhan; Reyer, Joshua A; Hamajima, Nobuyuki

    2015-02-01

Measles is one of the most severe infectious diseases of childhood, and one of the major causes of mortality, especially in developing countries. Despite rare measles outbreaks in recent years, Kyrgyzstan seeks to show its commitment towards the global anti-measles campaign. The aim of this article is to summarize the scattered information on the recent status of measles, valid surveillance system, and measles elimination strategies in Kyrgyzstan, based on sources that include non-confidential but usually inaccessible governmental data. Information was extracted from the reports to the Ministry of Health and documents on the national surveillance system, in addition to outbreak cases extracted from the Republican Infectious Diseases Hospital's archive. To tackle the worsening measles situation in Kyrgyzstan, the Ministry of Health established the Republican Center for Immunoprophylaxis in 1994. Measles related death, which was rampant up until 1992, has not been registered since 2000 due to improved routine vaccination coverage, increasing from 88% in 1994 to 97% and over in 1997. The national surveillance system was modernized thanks to the World Health Organization, helping to detect measles cases and prevent major outbreaks. The system identified 222 cases in the outbreak of 2011, and the case cards in the hospital provided the findings of 69 admitted cases (42 infants, 22 children aged 1 to 14 years, and 5 aged 15 years or over), including 32 severe cases. This article provides a whole view on measles in Kyrgyzstan, which would be useful to control measles worldwide.

  17. Malaria vaccine: a step toward elimination.

    Science.gov (United States)

    Jindal, Harashish; Bhatt, Bhumika; Malik, Jagbir S; Sk, Shashikantha; Mehta, Bharti

    2014-01-01

    Malaria has long been recognized as a public health problem. At the community level, vector control, and antimalarial medicines are the main means for reducing incidence, morbidity, and mortality of malaria. A vaccine not only would bring streamlining in the prevention of morbidity and mortality from malaria but also would be more accessible if integrated with Expanded Programme of Immunization (EPI). Globally, an estimated 3.4 billion people are at risk of malaria. Most cases (80%) and deaths (90%) occurred in Africa, and most deaths (77%) are in children under 5 years of age. An effective vaccine has long been envisaged as a valuable addition to the available tools for malaria control. Although research toward the development of malaria vaccines has been pursued since the 1960s, there are no licensed malaria vaccines. The RTS,S/AS01 vaccine, which targets P. falciparum, has reached phase 3 clinical trials and results are promising. Malaria Vaccine Technology Road Map 2013 has envisaged the world aiming for a licensed vaccine by 2030 that would reduce malaria cases by 75% and be capable of eliminating malaria. It will not only fill the gaps of today's interventions but also be a cost-effective method of decreasing morbidity and mortality from malaria.

  18. Eliminating Residents Increases the Cost of Care.

    Science.gov (United States)

    DeMarco, Deborah M; Forster, Richard; Gakis, Thomas; Finberg, Robert W

    2017-08-01

    Academic health centers are facing a potential reduction in Medicare financing for graduate medical education (GME). Both the Medicare Payment Advisory Commission and the National Commission on Fiscal Responsibility and Reform (Deficit Commission) have suggested cutting approximately half the funding that teaching hospitals receive for indirect medical education. Because of the effort that goes into teaching trainees, who are only transient employees, hospital executives often see teaching programs as a drain on resources. In light of the possibility of a Medicare cut to GME programs, we undertook an analysis to assess the financial risk of training programs to our institution and the possibility of saving money by reducing resident positions. The chief administrative officer, in collaboration with the hospital chief financial officer, performed a financial analysis to examine the possibility of decreasing costs by reducing residency programs at the University of Massachusetts Memorial Medical Center. Despite the real costs of our training programs, the analysis demonstrated that GME programs have a positive impact on hospital finances. Reducing or eliminating GME programs would have a negative impact on our hospital's bottom line.

  19. Ringing effects eliminated spin echo in solids

    Science.gov (United States)

    Ma, Chao; Li, Peng; Chen, Qun; Zhang, Shanmin

    2013-08-01

    Two types of ringing effects eliminated spin echo sequences have been introduced. To achieve the task, two additional 90° pulses with proper phase cycles are placed at the beginning of the pulse sequences. The spin echo time is calculated with the perturbation method to the first order, i.e. taking into account only the dipolar secular term. The non-secular term causes an imaginary part of the FID, leading to an unsymmetrical NMR spectrum. This effect, according to a symmetry of NMR sequences under phase inversion, can be compensated by inverting all the x and -x or y and -y phases. The properties of the symmetry are derived based on the theory of density matrix. In addition, the non-secular term also results in a small drop (several per cent) of the echo amplitude, but it nearly does not affect the echo time. With these pulse sequences we are able to get a spectrum with an echo delay only 1.1 μs without distortion using a Bruker AVANCE III NMR instrument.

  20. A cardiorespiratory classifier of voluntary and involuntary electrodermal activity

    Directory of Open Access Journals (Sweden)

    Sejdic Ervin

    2010-02-01

Full Text Available Abstract Background Electrodermal reactions (EDRs) can be attributed to many origins, including spontaneous fluctuations of electrodermal activity (EDA) and stimuli such as deep inspirations, voluntary mental activity and startling events. In fields that use EDA as a measure of psychophysiological state, the fact that EDRs may be elicited from many different stimuli is often ignored. This study attempts to classify observed EDRs as voluntary (i.e., generated from intentional respiratory or mental activity) or involuntary (i.e., generated from startling events or spontaneous electrodermal fluctuations). Methods Eight able-bodied participants were subjected to conditions that would cause a change in EDA: music imagery, startling noises, and deep inspirations. A user-centered cardiorespiratory classifier consisting of (1) an EDR detector, (2) a respiratory filter and (3) a cardiorespiratory filter was developed to automatically detect a participant's EDRs and to classify the origin of their stimulation as voluntary or involuntary. Results Detected EDRs were classified with a positive predictive value of 78%, a negative predictive value of 81% and an overall accuracy of 78%. Without the classifier, EDRs could only be correctly attributed as voluntary or involuntary with an accuracy of 50%. Conclusions The proposed classifier may enable investigators to form more accurate interpretations of electrodermal activity as a measure of an individual's psychophysiological state.

  1. Local-global classifier fusion for screening chest radiographs

    Science.gov (United States)

    Ding, Meng; Antani, Sameer; Jaeger, Stefan; Xue, Zhiyun; Candemir, Sema; Kohli, Marc; Thoma, George

    2017-03-01

    Tuberculosis (TB) is a severe comorbidity of HIV and chest x-ray (CXR) analysis is a necessary step in screening for the infective disease. Automatic analysis of digital CXR images for detecting pulmonary abnormalities is critical for population screening, especially in medical resource constrained developing regions. In this article, we describe steps that improve previously reported performance of NLM's CXR screening algorithms and help advance the state of the art in the field. We propose a local-global classifier fusion method where two complementary classification systems are combined. The local classifier focuses on subtle and partial presentation of the disease leveraging information in radiology reports that roughly indicates locations of the abnormalities. In addition, the global classifier models the dominant spatial structure in the gestalt image using GIST descriptor for the semantic differentiation. Finally, the two complementary classifiers are combined using linear fusion, where the weight of each decision is calculated by the confidence probabilities from the two classifiers. We evaluated our method on three datasets in terms of the area under the Receiver Operating Characteristic (ROC) curve, sensitivity, specificity and accuracy. The evaluation demonstrates the superiority of our proposed local-global fusion method over any single classifier.

  2. Sports Facility Management.

    Science.gov (United States)

    Walker, Marcia L., Ed.; Stotlar, David K., Ed.

    The numbers of both sports facility management college courses and sport and exercise facilities are increasing, along with the need for an understanding of the trends and management concepts of these facilities. This book focuses exclusively on managing facilities where sporting events occur and includes examples in physical education, athletics,…

  3. Jitter Elimination at Optical Control of Servomotors

    Directory of Open Access Journals (Sweden)

    Radek Novak

    2014-01-01

Full Text Available The article describes the application of the microcontroller PIC18F25K22 to servomechanism electronics built into a model car. The model is controlled optically, in the infrared part of the spectrum. The microcontroller is well suited to this application: it has timers with capture facilities, a sufficient number of PWM modules and a powerful instruction set. The main task of the microcontroller is to process the incoming PWM signals S1 and S2 (which have jitter) into the output PWM signals P1 and P2 (jitter free). P1 controls the wheel angle and P2 handles the speed. The values of the incoming signals are continuously summed and rounded. A hysteresis method was chosen in the algorithm for setting the output PWM signals P1 and P2 using duty-cycle tables.

  4. Automated EEG artifact elimination by applying machine learning algorithms to ICA-based features

    Science.gov (United States)

    Radüntz, Thea; Scouten, Jon; Hochmuth, Olaf; Meffert, Beate

    2017-08-01

Objective. Biological and non-biological artifacts cause severe problems when dealing with electroencephalogram (EEG) recordings. Independent component analysis (ICA) is a widely used method for eliminating various artifacts from recordings. However, evaluating and classifying the calculated independent components (ICs) as artifact or EEG is not fully automated at present. Approach. In this study, we propose a new approach for automated artifact elimination, which applies machine learning algorithms to ICA-based features. Main results. We compared the performance of our classifiers with the visual classification results given by experts. The best result with an accuracy rate of 95% was achieved using features obtained by range filtering of the topoplots and IC power spectra combined with an artificial neural network. Significance. Compared with the existing automated solutions, our proposed method is not limited to specific types of artifacts, electrode configurations, or number of EEG channels. The main advantage of the proposed method is that it provides an automatic, reliable, real-time capable, and practical tool, which avoids the need for the time-consuming manual selection of ICs during artifact removal.
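
    The pipeline described (ICA decomposition, per-component features, automatic artifact/EEG classification, reconstruction without the artifact components) might look roughly like the sketch below. The kurtosis rule is only a placeholder for the trained classifier, and the feature choice is an assumption, not the paper's range-filtered topoplot and spectrum features.

```python
import numpy as np
from sklearn.decomposition import FastICA

def remove_artifact_components(eeg, is_artifact):
    """eeg: (n_samples, n_channels). is_artifact: callable taking an IC time course
    and its mixing column and returning True/False. Returns the cleaned recording."""
    ica = FastICA(n_components=eeg.shape[1], random_state=0)
    sources = ica.fit_transform(eeg)               # (n_samples, n_components)
    keep = np.array([not is_artifact(sources[:, k], ica.mixing_[:, k])
                     for k in range(sources.shape[1])])
    sources[:, ~keep] = 0.0                         # zero out components flagged as artifacts
    return ica.inverse_transform(sources)

# Placeholder "classifier": flag components with extreme kurtosis (eye blinks are spiky).
def kurtosis_rule(source, mixing_col, threshold=8.0):
    s = (source - source.mean()) / (source.std() + 1e-12)
    return np.mean(s ** 4) > threshold

# cleaned = remove_artifact_components(raw_eeg, kurtosis_rule)   # raw_eeg: your own recording
```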

  5. Reliable Facility Location Problem with Facility Protection.

    Science.gov (United States)

    Tang, Luohao; Zhu, Cheng; Lin, Zaili; Shi, Jianmai; Zhang, Weiming

    2016-01-01

    This paper studies a reliable facility location problem with facility protection that aims to hedge against random facility disruptions by both strategically protecting some facilities and using backup facilities for the demands. An Integer Programming model is proposed for this problem, in which the failure probabilities of facilities are site-specific. A solution approach combining Lagrangian Relaxation and local search is proposed and is demonstrated to be both effective and efficient based on computational experiments on random numerical examples with 49, 88, 150 and 263 nodes in the network. A real case study for a 100-city network in Hunan province, China, is presented, based on which the properties of the model are discussed and some managerial insights are analyzed.

  6. Reliable Facility Location Problem with Facility Protection.

    Directory of Open Access Journals (Sweden)

    Luohao Tang

    Full Text Available This paper studies a reliable facility location problem with facility protection that aims to hedge against random facility disruptions by both strategically protecting some facilities and using backup facilities for the demands. An Integer Programming model is proposed for this problem, in which the failure probabilities of facilities are site-specific. A solution approach combining Lagrangian Relaxation and local search is proposed and is demonstrated to be both effective and efficient based on computational experiments on random numerical examples with 49, 88, 150 and 263 nodes in the network. A real case study for a 100-city network in Hunan province, China, is presented, based on which the properties of the model are discussed and some managerial insights are analyzed.
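
    A much-simplified version of the kind of integer program involved, with site-specific failure probabilities folded into an expected assignment cost, can be written with PuLP as below. This compact model and its toy data are illustrative assumptions; it is not the paper's formulation or its Lagrangian-relaxation solver.

```python
import pulp

# Toy data: 3 candidate sites, 4 demand points.
sites, demands = range(3), range(4)
open_cost = [10, 12, 8]                     # fixed cost of opening each site
fail_prob = [0.05, 0.20, 0.10]              # site-specific failure probabilities
dist = [[2, 4, 6, 3], [5, 2, 3, 4], [4, 3, 2, 5]]   # dist[j][i]: site j to demand i
penalty = 20                                # cost of losing a demand when its site fails

prob = pulp.LpProblem("reliable_facility_location", pulp.LpMinimize)
y = pulp.LpVariable.dicts("open", sites, cat="Binary")
x = pulp.LpVariable.dicts("assign", [(i, j) for i in demands for j in sites], cat="Binary")

# Expected cost: service cost if the site survives, penalty if it fails.
prob += (pulp.lpSum(open_cost[j] * y[j] for j in sites)
         + pulp.lpSum(((1 - fail_prob[j]) * dist[j][i] + fail_prob[j] * penalty) * x[(i, j)]
                      for i in demands for j in sites))

for i in demands:
    prob += pulp.lpSum(x[(i, j)] for j in sites) == 1          # every demand assigned once
for i in demands:
    for j in sites:
        prob += x[(i, j)] <= y[j]                              # assign only to an open site

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("open sites:", [j for j in sites if y[j].value() == 1])
```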

  7. Assessing the Elimination of User Fees for Delivery Services in Laos

    Science.gov (United States)

    Boudreaux, Chantelle; Chanthala, Phetdara; Lindelow, Magnus

    2014-01-01

    A pilot eliminating user fees associated with delivery at the point of services was introduced in two districts of Laos in March 2009. Following two years of implementation, an evaluation was conducted to assess the pilot impact, as well as to document the pilot design and implementation challenges. Study results show that, even in the presence of the substantial access and cultural barriers, user fees associated with delivery at health facilities act as a serious deterrent to care seeking behavior. We find a tripling of facility-based delivery rates in the intervention areas, compared to a 40% increase in the control areas. While findings from the control region suggest that facility-based delivery rates may be on the rise across the country, the substantially higher increase in the pilot areas highlight the impact of financial burden associated with facility-based delivery fees. These fees can play an important role in rapidly increasing the uptake of facility delivery to reach the national targets and, ultimately, to improve maternal and child health outcomes. The pilot achieved important gains while relying heavily on capacity and systems already in place. However, the high cost associated with monitoring and evaluation suggest broad-scale expansion of the pilot activities is likely to necessitate targeted capacity building initiatives, especially in areas with limited district level capacity to manage funds and deliver detailed and timely reports. PMID:24632592

  8. Shapley Facility Location Games

    OpenAIRE

    Ben-Porat, Omer; Tennenholtz, Moshe

    2017-01-01

    Facility location games have been a topic of major interest in economics, operations research and computer science, starting from the seminal work by Hotelling. Spatial facility location models have successfully predicted the outcome of competition in a variety of scenarios. In a typical facility location game, users/customers/voters are mapped to a metric space representing their preferences, and each player picks a point (facility) in that space. In most facility location games considered i...

  9. Nonparametric Coupled Bayesian Dictionary and Classifier Learning for Hyperspectral Classification.

    Science.gov (United States)

    Akhtar, Naveed; Mian, Ajmal

    2017-10-03

    We present a principled approach to learn a discriminative dictionary along a linear classifier for hyperspectral classification. Our approach places Gaussian Process priors over the dictionary to account for the relative smoothness of the natural spectra, whereas the classifier parameters are sampled from multivariate Gaussians. We employ two Beta-Bernoulli processes to jointly infer the dictionary and the classifier. These processes are coupled under the same sets of Bernoulli distributions. In our approach, these distributions signify the frequency of the dictionary atom usage in representing class-specific training spectra, which also makes the dictionary discriminative. Due to the coupling between the dictionary and the classifier, the popularity of the atoms for representing different classes gets encoded into the classifier. This helps in predicting the class labels of test spectra that are first represented over the dictionary by solving a simultaneous sparse optimization problem. The labels of the spectra are predicted by feeding the resulting representations to the classifier. Our approach exploits the nonparametric Bayesian framework to automatically infer the dictionary size--the key parameter in discriminative dictionary learning. Moreover, it also has the desirable property of adaptively learning the association between the dictionary atoms and the class labels by itself. We use Gibbs sampling to infer the posterior probability distributions over the dictionary and the classifier under the proposed model, for which, we derive analytical expressions. To establish the effectiveness of our approach, we test it on benchmark hyperspectral images. The classification performance is compared with the state-of-the-art dictionary learning-based classification methods.

  10. Speed and Cardiac Recovery Variables Predict the Probability of Elimination in Equine Endurance Events.

    Directory of Open Access Journals (Sweden)

    Mohamed Younes

Full Text Available Nearly 50% of the horses participating in endurance events are eliminated at a veterinary examination (a vet gate). Detecting unfit horses before a health problem occurs and treatment is required is a challenge for veterinarians but is essential for improving equine welfare. We hypothesized that it would be possible to detect unfit horses earlier in the event by measuring heart rate recovery variables. Hence, the objective of the present study was to compute logistic regressions of heart rate, cardiac recovery time and average speed data recorded at the previous vet gate (n-1) and thus predict the probability of elimination during successive phases (n and following) in endurance events. Speed and heart rate data were extracted from an electronic database of endurance events (80-160 km in length) organized in four countries. Overall, 39% of the horses that started an event were eliminated--mostly due to lameness (64%) or metabolic disorders (15%). For each vet gate, logistic regressions of explanatory variables (average speed, cardiac recovery time and heart rate measured at the previous vet gate) and categorical variables (age and/or event distance) were computed to estimate the probability of elimination. The predictive logistic regressions for vet gates 2 to 5 correctly classified between 62% and 86% of the eliminated horses. The robustness of these results was confirmed by high areas under the receiver operating characteristic curves (0.68-0.84). Overall, a horse has a 70% chance of being eliminated at the next gate if its cardiac recovery time is longer than 11 min at vet gate 1 or 2, or longer than 13 min at vet gates 3 or 4. Heart rate recovery and average speed variables measured at the previous vet gate(s) enabled us to predict elimination at the following vet gate. These variables should be checked at each veterinary examination, in order to detect unfit horses as early as possible. Our predictive method may help to improve equine welfare and
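
    The predictive model described, a logistic regression of elimination on speed and cardiac recovery variables from the previous vet gate, can be sketched as follows. The data are synthetic placeholders that only loosely mimic the reported recovery-time effect; the coefficients are not the study's estimates.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n = 500
speed = rng.normal(15, 3, n)                 # km/h on the previous loop
heart_rate = rng.normal(60, 6, n)            # bpm at the previous vet gate
recovery_min = rng.gamma(4, 2.5, n)          # minutes to reach the presentation heart rate

# Synthetic ground truth loosely mimicking the finding that long recovery times
# (roughly > 11-13 min) sharply raise the odds of elimination.
logit = -4 + 0.35 * (recovery_min - 11) + 0.05 * (heart_rate - 60) + 0.08 * (speed - 15)
eliminated = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([speed, heart_rate, recovery_min])
model = LogisticRegression().fit(X, eliminated)
print("P(eliminated | 16 km/h, 62 bpm, 14 min recovery) =",
      model.predict_proba([[16, 62, 14]])[0, 1].round(2))
```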

  11. Defending Malicious Script Attacks Using Machine Learning Classifiers

    Directory of Open Access Journals (Sweden)

    Nayeem Khan

    2017-01-01

Full Text Available Web applications have become a primary target for cyber criminals, who inject malware, especially JavaScript, to perform malicious activities such as impersonation. Thus, it is imperative to detect such malicious code in real time before any malicious activity is performed. This study proposes an efficient method of detecting previously unknown malicious JavaScript using an interceptor at the client side by classifying the key features of the malicious code. The feature subset was obtained using a wrapper method for dimensionality reduction. Supervised machine learning classifiers were used on the dataset to achieve high accuracy. Experimental results show that our method can efficiently classify malicious code from benign code with promising results.

  12. Local feature saliency classifier for real-time intrusion monitoring

    Science.gov (United States)

    Buch, Norbert; Velastin, Sergio A.

    2014-07-01

    We propose a texture saliency classifier to detect people in a video frame by identifying salient texture regions. The image is classified into foreground and background in real time. No temporal image information is used during the classification. The system is used for the task of detecting people entering a sterile zone, which is a common scenario for visual surveillance. Testing is performed on the Imagery Library for Intelligent Detection Systems sterile zone benchmark dataset of the United Kingdom's Home Office. The basic classifier is extended by fusing its output with simple motion information, which significantly outperforms standard motion tracking. A lower detection time can be achieved by combining texture classification with Kalman filtering. The fusion approach running at 10 fps gives the highest result of F1=0.92 for the 24-h test dataset. The paper concludes with a detailed analysis of the computation time required for the different parts of the algorithm.

  13. Grinding wheel condition monitoring with boosted minimum distance classifiers

    Science.gov (United States)

    Liao, T. Warren; Tang, Fengming; Qu, J.; Blau, P. J.

    2008-01-01

Grinding wheels get dull as more material is removed. This paper presents a methodology to detect a 'dull' wheel online based on acoustic emission (AE) signals. The methodology has three major steps: preprocessing, signal analysis and feature extraction, and constructing boosted classifiers using the minimum distance classifier (MDC) as the weak learner. Two boosting algorithms, i.e., AdaBoost and A-Boost, were implemented. The methodology was tested with signals obtained in grinding of two ceramic materials with a diamond wheel under different grinding conditions. The results of cross-validation tests indicate that: (i) boosting greatly improves the effectiveness of the basic MDC; (ii) overall, A-Boost does not outperform AdaBoost in terms of classification accuracy; and (iii) the performance of the boosted classifiers improves as the ensemble size increases.
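
    AdaBoost with a minimum-distance classifier as the weak learner can be sketched as below. The weak learner here uses Euclidean distance to weighted class centroids and assumes binary labels in {0, 1}; the AE-signal features, the A-Boost variant and other details of the paper are not reproduced.

```python
import numpy as np

class MinimumDistanceClassifier:
    """Weak learner: classify to the nearest weighted class centroid."""
    def fit(self, X, y, w):
        self.classes_ = np.unique(y)
        self.centroids_ = np.stack([np.average(X[y == c], axis=0, weights=w[y == c])
                                    for c in self.classes_])
        return self
    def predict(self, X):
        d = ((X[:, None, :] - self.centroids_[None, :, :]) ** 2).sum(axis=2)
        return self.classes_[d.argmin(axis=1)]

def adaboost_mdc(X, y, rounds=10):
    """AdaBoost.M1 with the MDC above as base classifier (labels assumed to be 0/1)."""
    n = len(y)
    w = np.full(n, 1.0 / n)
    learners, alphas = [], []
    for _ in range(rounds):
        clf = MinimumDistanceClassifier().fit(X, y, w)
        pred = clf.predict(X)
        err = np.clip(w[pred != y].sum(), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        w *= np.exp(alpha * (pred != y) - alpha * (pred == y))   # up-weight mistakes
        w /= w.sum()
        learners.append(clf); alphas.append(alpha)
    def predict(Xq):
        votes = sum(a * np.where(c.predict(Xq) == 1, 1, -1) for a, c in zip(alphas, learners))
        return (votes > 0).astype(int)
    return predict
```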

  14. Optimal threshold estimation for binary classifiers using game theory.

    Science.gov (United States)

    Sanchez, Ignacio Enrique

    2016-01-01

Many bioinformatics algorithms can be understood as binary classifiers. They are usually compared using the area under the receiver operating characteristic (ROC) curve. On the other hand, choosing the best threshold for practical use is a complex task, due to uncertain and context-dependent skews in the abundance of positives in nature and in the yields/costs for correct/incorrect classification. We argue that considering a classifier as a player in a zero-sum game allows us to use the minimax principle from game theory to determine the optimal operating point. The proposed classifier threshold corresponds to the intersection between the ROC curve and the descending diagonal in ROC space and yields a minimax accuracy of 1-FPR. Our proposal can be readily implemented in practice, and reveals that the empirical condition for threshold estimation of "specificity equals sensitivity" maximizes robustness against uncertainties in the abundance of positives in nature and classification costs.
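
    The proposed operating point, where the ROC curve crosses the descending diagonal so that sensitivity equals specificity, is straightforward to compute from scored predictions. The sketch below uses scikit-learn's roc_curve on toy scores; it illustrates the criterion and is not the authors' code.

```python
import numpy as np
from sklearn.metrics import roc_curve

def minimax_threshold(y_true, scores):
    """Return the score threshold closest to the point where TPR = 1 - FPR,
    i.e. where the ROC curve crosses the descending diagonal."""
    fpr, tpr, thresholds = roc_curve(y_true, scores)
    gap = np.abs(tpr - (1 - fpr))             # distance from sensitivity = specificity
    return thresholds[gap.argmin()]

# Toy example: positives score higher on average than negatives.
rng = np.random.default_rng(4)
y = np.r_[np.ones(300), np.zeros(700)]
s = np.r_[rng.normal(1.0, 1.0, 300), rng.normal(0.0, 1.0, 700)]
t = minimax_threshold(y, s)
print(f"threshold {t:.2f}, accuracy at threshold {np.mean((s >= t) == y):.3f}")
```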

  15. Mining housekeeping genes with a Naive Bayes classifier

    Directory of Open Access Journals (Sweden)

    Aitken Stuart

    2006-10-01

Full Text Available Abstract Background Traditionally, housekeeping and tissue specific genes have been classified using direct assay of mRNA presence across different tissues, but these experiments are costly and the results not easy to compare and reproduce. Results In this work, a Naive Bayes classifier based only on physical and functional characteristics of genes already available in databases, like exon length and measures of chromatin compactness, has achieved a 97% success rate in classification of human housekeeping genes (93% for mouse and 90% for fruit fly). Conclusion The newly obtained lists of housekeeping and tissue specific genes adhere to the expected functions and tissue expression patterns for the two classes. Overall, the classifier shows promise, and in the future additional attributes might be included to improve its discriminating power.
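
    A Gaussian Naive Bayes classifier over simple physical/functional gene attributes is easy to set up; the sketch below uses synthetic stand-ins for attributes such as exon length and a compactness score, so the numbers are placeholders rather than the study's database-derived features.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

# Synthetic stand-ins for gene attributes: [mean exon length, exon count, compactness score].
rng = np.random.default_rng(5)
housekeeping = np.column_stack([rng.normal(120, 20, 300), rng.poisson(6, 300), rng.normal(0.7, 0.1, 300)])
tissue_spec = np.column_stack([rng.normal(180, 40, 300), rng.poisson(10, 300), rng.normal(0.4, 0.15, 300)])

X = np.vstack([housekeeping, tissue_spec])
y = np.array([1] * 300 + [0] * 300)          # 1 = housekeeping, 0 = tissue specific

scores = cross_val_score(GaussianNB(), X, y, cv=5)
print("5-fold accuracy:", scores.mean().round(3))
```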

  16. COMPARISON OF SVM AND FUZZY CLASSIFIER FOR AN INDIAN SCRIPT

    Directory of Open Access Journals (Sweden)

    M. J. Baheti

    2012-01-01

Full Text Available With the advent of the technological era, the conversion of scanned documents (handwritten or printed) into machine-editable format has attracted many researchers. This paper deals with the problem of recognition of Gujarati handwritten numerals. Gujarati numeral recognition requires performing some specific steps as a part of preprocessing. For preprocessing, digitization, segmentation, normalization and thinning are done, assuming that the image has almost no noise. Further, an affine-invariant-moments-based model is used for feature extraction, and finally Support Vector Machine (SVM) and fuzzy classifiers are used for numeral classification. A comparison of the SVM and fuzzy classifiers shows that the SVM produced better results than the fuzzy classifier.

  17. A Topic Model Approach to Representing and Classifying Football Plays

    KAUST Repository

    Varadarajan, Jagannadan

    2013-09-09

We address the problem of modeling and classifying American Football offense teams’ plays in video, a challenging example of group activity analysis. Automatic play classification will allow coaches to infer patterns and tendencies of opponents more efficiently, resulting in better strategy planning in a game. We define a football play as a unique combination of player trajectories. To this end, we develop a framework that uses player trajectories as inputs to MedLDA, a supervised topic model. The joint maximization of both likelihood and inter-class margins of MedLDA in learning the topics allows us to learn semantically meaningful play type templates, as well as classify different play types with 70% average accuracy. Furthermore, this method is extended to analyze individual player roles in classifying each play type. We validate our method on a large dataset comprising 271 play clips from real-world football games, which will be made publicly available for future comparisons.

  18. Mining housekeeping genes with a Naive Bayes classifier

    Science.gov (United States)

    De Ferrari, Luna; Aitken, Stuart

    2006-01-01

    Background Traditionally, housekeeping and tissue specific genes have been classified using direct assay of mRNA presence across different tissues, but these experiments are costly and the results not easy to compare and reproduce. Results In this work, a Naive Bayes classifier based only on physical and functional characteristics of genes already available in databases, like exon length and measures of chromatin compactness, has achieved a 97% success rate in classification of human housekeeping genes (93% for mouse and 90% for fruit fly). Conclusion The newly obtained lists of housekeeping and tissue specific genes adhere to the expected functions and tissue expression patterns for the two classes. Overall, the classifier shows promise, and in the future additional attributes might be included to improve its discriminating power. PMID:17074078

  19. [A novel spectral classifier based on coherence measure].

    Science.gov (United States)

    Li, Xiang-ru; Wu, Fu-chao; Hu, Zhan-yi; Luo, A-li

    2005-11-01

Classification and discovery of new types of celestial bodies from voluminous celestial spectra are two important issues in astronomy, and these two issues are treated separately in the literature to our knowledge. In the present paper, a novel coherence measure is introduced which can effectively measure the coherence of a new spectrum of unknown type with the training samples located within its neighbourhood, then a novel classifier is designed based on this coherence measure. The proposed classifier is capable of carrying out spectral classification and knowledge discovery simultaneously. In particular, it can effectively deal with the situation where different types of training spectra exist within the neighbourhood of a new spectrum, and the traditional k-nearest neighbour method usually fails to reach a correct classification. The satisfactory performance for classification and knowledge discovery has been obtained by the proposed novel classifier over active galactic nuclei (AGNs) and active galaxies (AGs) data.

  20. Examining the significance of fingerprint-based classifiers

    Directory of Open Access Journals (Sweden)

    Collins Jack R

    2008-12-01

    Full Text Available Abstract Background: Experimental examinations of biofluids to measure concentrations of proteins or their fragments or metabolites are being explored as a means of early disease detection, distinguishing diseases with similar symptoms, and drug treatment efficacy. Many studies have produced classifiers with a high sensitivity and specificity, and it has been argued that accurate results necessarily imply some underlying biology-based features in the classifier. The simplest test of this conjecture is to examine datasets designed to contain no information with classifiers used in many published studies. Results: The classification accuracy of two fingerprint-based classifiers, a decision tree (DT) algorithm and a medoid classification algorithm (MCA), is examined. These methods are used to examine 30 artificial datasets that contain random concentration levels for 300 biomolecules. Each dataset contains between 30 and 300 Cases and Controls, and since the 300 observed concentrations are randomly generated, these datasets are constructed to contain no biological information. A modest search of decision trees containing at most seven decision nodes finds a large number of unique decision trees with an average sensitivity and specificity above 85% for datasets containing 60 Cases and 60 Controls or fewer, and for datasets with 90 Cases and 90 Controls many DTs have an average sensitivity and specificity above 80%. For even the largest dataset (300 Cases and 300 Controls) the MCA procedure finds several unique classifiers that have an average sensitivity and specificity above 88% using only six or seven features. Conclusion: While it has been argued that accurate classification results must imply some biological basis for the separation of Cases from Controls, our results show that this is not necessarily true. The DT and MCA classifiers are sufficiently flexible that they can produce good results from datasets specifically constructed to contain no biological information.
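
    The paper's central point can be reproduced with off-the-shelf tools: a shallow decision tree fitted to purely random "concentration" data still achieves high apparent accuracy on the data it was fitted to, so accuracy without rigorous validation does not imply underlying biology. The sketch below uses scikit-learn and random data shaped like the paper's smallest design (30 Cases, 30 Controls, 300 features); it is an illustration, not the authors' code.

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier

        rng = np.random.default_rng(42)
        X = rng.random((60, 300))                  # 30 Cases + 30 Controls, 300 random "biomolecule" levels
        y = np.array([0] * 30 + [1] * 30)

        tree = DecisionTreeClassifier(max_leaf_nodes=8, random_state=0).fit(X, y)
        print("Apparent accuracy on pure noise:", tree.score(X, y))   # close to 1.0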

  1. Text Classification: Classifying Plain Source Files with Neural Network

    Directory of Open Access Journals (Sweden)

    Jaromir Veber

    2010-10-01

    Full Text Available The automated categorization of text files has an important place in computer engineering, particularly in the process called data management automation. A lot has been written about text classification, and the methods allowing classification of these files are well known. Unfortunately, most studies are theoretical, and more research is needed for practical implementation. I decided to contribute with research focused on creating a classifier for different kinds of programs (source files, scripts, …). This paper describes a practical implementation of a classifier for text files based on their content.

  2. Silicon nanowire arrays as learning chemical vapour classifiers

    Energy Technology Data Exchange (ETDEWEB)

    Niskanen, A O; Colli, A; White, R; Li, H W; Spigone, E; Kivioja, J M, E-mail: antti.niskanen@nokia.com [Nokia Research Center, Broers Building, 21 JJ Thomson Avenue, Cambridge CB3 0FA (United Kingdom)

    2011-07-22

    Nanowire field-effect transistors are a promising class of devices for various sensing applications. Apart from detecting individual chemical or biological analytes, it is especially interesting to use multiple selective sensors to look at their collective response in order to perform classification into predetermined categories. We show that non-functionalised silicon nanowire arrays can be used to robustly classify different chemical vapours using simple statistical machine learning methods. We were able to distinguish between acetone, ethanol and water with 100% accuracy while methanol, ethanol and 2-propanol were classified with 96% accuracy in ambient conditions.
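
    A hedged sketch of the classification idea (not the device firmware or the authors' analysis): each measurement is the joint response of several nanowire sensors, and a simple linear model is trained to separate the vapour classes. The sensor responses below are synthetic.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(3)
        n_per_class, n_sensors = 40, 8
        centres = rng.normal(0.0, 1.0, size=(3, n_sensors))   # assumed response patterns: acetone, ethanol, water
        X = np.vstack([c + 0.3 * rng.normal(size=(n_per_class, n_sensors)) for c in centres])
        y = np.repeat([0, 1, 2], n_per_class)

        clf = LogisticRegression(max_iter=1000)
        print("5-fold CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())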

  3. A Vertical Search Engine – Based On Domain Classifier

    OpenAIRE

    Rajashree Shettar; Rahul Bhuptani

    2008-01-01

    The World Wide Web is growing exponentially, and the dynamic, unstructured nature of the web makes it difficult to locate useful resources. Web search engines such as Google and AltaVista provide a huge amount of information, much of which might not be relevant to the user's query. In this paper, we build a vertical search engine which takes a seed URL and classifies the crawled URLs into the Medical or Finance domains. The filter component of the vertical search engine classifies the web pages downloaded…

  4. 48 CFR 3004.470 - Security requirements for access to unclassified facilities, Information Technology resources...

    Science.gov (United States)

    2010-10-01

    HSAR 3004.470 — Security requirements for access to unclassified facilities, Information Technology resources, and sensitive information (Homeland Security Acquisition Regulation; General Administrative Matters; Safeguarding Classified and Sensitive Information).

  5. Materiel Evaluation Facility

    Data.gov (United States)

    Federal Laboratory Consortium — CRREL's Materiel Evaluation Facility (MEF) is a large cold-room facility that can be set up at temperatures ranging from −20°F to 120°F with a temperature change...

  6. Integrated Disposal Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Located near the center of the 586-square-mile Hanford Site is the Integrated Disposal Facility, also known as the IDF. This facility is a landfill similar in concept...

  7. Environmental Toxicology Research Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Fully equipped facilities for environmental toxicology research. The Environmental Toxicology Research Facility (ETRF), located in Vicksburg, MS, provides over 8,200 ft...

  8. Explosive Components Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The 98,000 square foot Explosive Components Facility (ECF) is a state-of-the-art facility that provides a full-range of chemical, material, and performance analysis...

  9. Dialysis Facility Compare

    Data.gov (United States)

    U.S. Department of Health & Human Services — Dialysis Facility Compare helps you find detailed information about Medicare-certified dialysis facilities. You can compare the services and the quality of care that...

  10. Armament Technology Facility (ATF)

    Data.gov (United States)

    Federal Laboratory Consortium — The Armament Technology Facility is a 52,000 square foot, secure and environmentally-safe, integrated small arms and cannon caliber design and evaluation facility....

  11. Cold Vacuum Drying Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Located near the K-Basins (see K-Basins link) in Hanford's 100 Area is a facility called the Cold Vacuum Drying Facility (CVDF). Between 2000 and 2004, workers at the...

  12. Lesotho - Health Facility Survey

    Data.gov (United States)

    Millennium Challenge Corporation — The main objective of the 2011 Health Facility Survey (HFS) was to establish a baseline for informing the Health Project performance indicators on health facilities,...

  13. Ouellette Thermal Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Thermal Test Facility is a joint Army/Navy state-of-the-art facility (8,100 ft²) that was designed to: Evaluate and characterize the effect of flame and thermal...

  14. Projectile Demilitarization Facilities

    Data.gov (United States)

    Federal Laboratory Consortium — The Projectile Wash Out Facility is US Army Ammunition Peculiar Equipment (APE 1300). It is a pilot scale wash out facility that uses high pressure water and steam...

  15. Energetics Conditioning Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Energetics Conditioning Facility is used for long term and short term aging studies of energetic materials. The facility has 10 conditioning chambers of which 2...

  16. Robustness in facility location

    OpenAIRE

    Van Lokven, Sander W.M.

    2009-01-01

    Facility location concerns the placement of facilities, for various objectives, by use of mathematical models and solution procedures. Almost all facility location models that can be found in literature are based on minimizing costs or maximizing cover, to cover as much demand as possible. These models are quite efficient for finding an optimal location for a new facility for a particular data set, which is considered to be constant and known in advance. In a real world situation, input da...

  17. Efficient architecture for global elimination algorithm for H. 264 ...

    Indian Academy of Sciences (India)

    Keywords: fast block matching motion estimation; global elimination; matching complexity reduction; power reduction. The proposed architecture is based on the Global Elimination (GE) Algorithm, which uses pixel averaging to reduce the complexity of motion search while keeping ...

  18. Competition: Butterflies eliminate milkweed bugs from a Caribbean Island.

    Science.gov (United States)

    Blakley, Nigel R; Dingle, Hugh

    1978-01-01

    By eliminating the food plant, Asclepias curassavica, monarch butterflies, Danaus plexippus, have virtually eliminated milkweed bugs, Oncopeltus spp., from the island of Barbados. The relatively open terrain of Barbados means the plants have no refuge; the butterflies survive on an alternate milkweed food plant, Calotropis procera, whose thick-walled pods make seeds unavailable to the bugs.

  19. 77 FR 30871 - Implementing the Prison Rape Elimination Act

    Science.gov (United States)

    2012-05-23

    Memorandum of May 17, 2012 (Federal Register, May 23, 2012): Implementing the Prison Rape Elimination Act. The memorandum calls prison rape an assault on human dignity and an affront to American values, and directs implementation of the Prison Rape Elimination Act of 2003 (PREA)...

  20. Suppressor plate eliminates undesired arcing during electron beam welding

    Science.gov (United States)

    Hanchey, K. K.; Kubik, J.; Mahon, J. C.

    1966-01-01

    Suppressor grid eliminates undesired arcing during electron beam welding in one of two ways. A grid at ground potential collects secondary emission of ions and electrons produced by the beam as it strikes the workpiece, or a negatively energized grid repels the plasma arc back to the workpiece. This eliminates ground screens used to cover view ports.

  1. Novel shift register eliminates logic gates and power switching circuits

    Science.gov (United States)

    Cliff, R. A.

    1971-01-01

    The register requires two integrated circuits per stage and has a nominal power dissipation of 3.5 mW per stage. Its use eliminates the reset pulse, allowing data transfer to occur in less than 1 microsecond, and eliminates applying power to both the right and left portions of the register simultaneously.

  2. Renal lactate elimination is maintained during moderate exercise in humans

    DEFF Research Database (Denmark)

    Volianitis, Stefanos; Dawson, Ellen A; Dalsgaard, Mads

    2012-01-01

    Reduced hepatic lactate elimination initiates blood lactate accumulation during incremental exercise. In this study, we wished to determine whether renal lactate elimination contributes to the initiation of blood lactate accumulation. The renal arterial-to-venous (a-v) lactate difference was dete...

  3. Using zeolites to eliminate mercaptans from natural gas

    Energy Technology Data Exchange (ETDEWEB)

    Nemkov, V.V.; Afanasev, U.M.; Dubinskii, V.M.; Frolov, G.S.; Krigman, L.E.; Kuzmenko, N.M.; Oppengein, M.B.

    1980-01-01

    Results from experimental tests on an adsorption method for eliminating mercaptans from Orenburg natural gas using zeolites are given. An engineering and economic comparison of three methods for eliminating mercaptans from natural gas is presented: a method employing zeolite, the use of an alkaline solution with catalytic reclamation of the saturated absorbent, and the use of a tributyl phosphate solution.

  4. New ways of working to support sustainable disease elimination

    Directory of Open Access Journals (Sweden)

    Geordie Woods

    2017-03-01

    Full Text Available How can we ensure that Neglected Tropical Diseases (NTDs) are not just eliminated, but eliminated once and for all? This article explores the key role that water, sanitation and hygiene (WASH) interventions can play and what partnerships, programs and policies can be adopted to help see the end of certain diseases for good.

  5. Detection and elimination of sweetpotato viruses | Rukarwa | African ...

    African Journals Online (AJOL)

    Based on shoot survival and effectiveness of virus elimination, the best results were obtained by exposing plantlets to a daily temperature regime of 32 °C for 8 hr of darkness and 36 °C for 16 hr of light for four weeks. Meristem-tip culture combined with thermotherapy allowed elimination of SPFMV and SPMMV in 77% of ...

  6. Malaria elimination practices in rural community residents in ...

    African Journals Online (AJOL)

    Background: Rwanda is moving towards malaria pre-elimination phase by the year 2017 and the role of the community will be critical. However, there is limited information about community perspective of the malaria elimination strategy. A study was thus designed to explore that. Methods: A descriptive cross-sectional study ...

  7. Recognition of Arabic Sign Language Alphabet Using Polynomial Classifiers

    Directory of Open Access Journals (Sweden)

    M. Al-Rousan

    2005-08-01

    Full Text Available Building an accurate automatic sign language recognition system is of great importance in facilitating efficient communication with deaf people. In this paper, we propose the use of polynomial classifiers as a classification engine for the recognition of the Arabic sign language (ArSL) alphabet. Polynomial classifiers have several advantages over other classifiers in that they do not require iterative training, and that they are highly computationally scalable with the number of classes. Based on polynomial classifiers, we have built an ArSL system and measured its performance using real ArSL data collected from deaf people. We show that the proposed system provides superior recognition results when compared with previously published results using ANFIS-based classification on the same dataset and feature extraction methodology. The comparison is shown in terms of the number of misclassified test patterns. The reduction in the rate of misclassified patterns was very significant. In particular, we have achieved a 36% reduction of misclassifications on the training data and 57% on the test data.

  8. Discrimination-Aware Classifiers for Student Performance Prediction

    Science.gov (United States)

    Luo, Ling; Koprinska, Irena; Liu, Wei

    2015-01-01

    In this paper we consider discrimination-aware classification of educational data. Mining and using rules that distinguish groups of students based on sensitive attributes such as gender and nationality may lead to discrimination. It is desirable to keep the sensitive attributes during the training of a classifier to avoid information loss but…

  9. Multiple Classifier System for Remote Sensing Image Classification: A Review

    Directory of Open Access Journals (Sweden)

    Yi Liu

    2012-04-01

    Full Text Available Over the last two decades, multiple classifier systems (MCS), or classifier ensembles, have shown great potential to improve the accuracy and reliability of remote sensing image classification. Although there is a large literature covering MCS approaches, a comprehensive review presenting the overall architecture, basic principles and trends behind the design of remote sensing classifier ensembles has been lacking. Therefore, in order to give a reference point for MCS approaches, this paper explicitly reviews the remote sensing implementations of MCS and proposes some modified approaches. The effectiveness of existing and improved algorithms is analyzed and evaluated on multi-source remotely sensed images, including a high spatial resolution image (QuickBird), a hyperspectral image (OMIS II) and a multi-spectral image (Landsat ETM+). Experimental results demonstrate that MCS can effectively improve the accuracy and stability of remote sensing image classification, and diversity measures play an active role in the combination of multiple classifiers. Furthermore, this survey provides a roadmap to guide future research, algorithm enhancement and knowledge accumulation of MCS in the remote sensing community.

  10. Dynamic Classifier Aggregation using Interaction-Sensitive Fuzzy Measures

    Czech Academy of Sciences Publication Activity Database

    Štefka, D.; Holeňa, Martin

    2015-01-01

    Roč. 270, 1 July (2015), s. 25-52 ISSN 0165-0114 R&D Projects: GA ČR GA13-17187S Institutional support: RVO:67985807 Keywords : Fuzzy integral * Fuzzy measure * Dynamic classifier aggregation Subject RIV: IN - Informatics, Computer Science Impact factor: 2.098, year: 2015

  11. An improved predictive association rule based classifier using gain ...

    Indian Academy of Sciences (India)

    Consequently, retaining significant attributes in a dataset contributes to high classifier accuracy. To achieve that, a statistical T-test and reducts computation have been performed to identify the significant attributes from health care datasets. To conclude, the impact of both dimensionality reduction techniques with CPAR and ...

  12. Scoring and Classifying Examinees Using Measurement Decision Theory

    Science.gov (United States)

    Rudner, Lawrence M.

    2009-01-01

    This paper describes and evaluates the use of measurement decision theory (MDT) to classify examinees based on their item response patterns. The model has a simple framework that starts with the conditional probabilities of examinees in each category or mastery state responding correctly to each item. The presented evaluation investigates: (1) the…

  13. A novel statistical method for classifying habitat generalists and specialists

    DEFF Research Database (Denmark)

    Chazdon, Robin L; Chao, Anne; Colwell, Robert K

    2011-01-01

    in second-growth (SG) and old-growth (OG) rain forests in the Caribbean lowlands of northeastern Costa Rica. We evaluate the multinomial model in detail for the tree data set. Our results for birds were highly concordant with a previous nonstatistical classification, but our method classified a higher...

  14. Gene-expression Classifier in Papillary Thyroid Carcinoma

    DEFF Research Database (Denmark)

    Londero, Stefano Christian; Jespersen, Marie Louise; Krogdahl, Annelise

    2016-01-01

    BACKGROUND: No reliable biomarker for metastatic potential in the risk stratification of papillary thyroid carcinoma exists. We aimed to develop a gene-expression classifier for metastatic potential. MATERIALS AND METHODS: Genome-wide expression analyses were used. Development cohort: freshly...

  15. Data Stream Classification Based on the Gamma Classifier

    Directory of Open Access Journals (Sweden)

    Abril Valeria Uriarte-Arcia

    2015-01-01

    Full Text Available The ever increasing data generation confronts us with the problem of handling online massive amounts of information. One of the biggest challenges is how to extract valuable information from these massive continuous data streams during a single scan. In a data stream context, data arrive continuously at high speed; therefore the algorithms developed to address this context must be efficient regarding memory and time management and capable of detecting changes over time in the underlying distribution that generated the data. This work describes a novel method for the task of pattern classification over a continuous data stream based on an associative model. The proposed method is based on the Gamma classifier, which is inspired by the Alpha-Beta associative memories; both are supervised pattern recognition models. The proposed method is capable of handling the space and time constraints inherent in data stream scenarios. The Data Streaming Gamma classifier (DS-Gamma classifier) implements a sliding window approach to provide concept drift detection and a forgetting mechanism. In order to test the classifier, several experiments were performed using different data stream scenarios with real and synthetic data streams. The experimental results show that the method exhibits competitive performance when compared to other state-of-the-art algorithms.
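
    A minimal sliding-window sketch of the general data-stream approach (not the Gamma classifier itself): only the most recent labelled instances are kept, a lightweight model is refit on that window, and older concepts are forgotten automatically. The stream below is a toy with one concept drift.

        from collections import deque
        import numpy as np
        from sklearn.naive_bayes import GaussianNB

        WINDOW = 500
        window_X, window_y = deque(maxlen=WINDOW), deque(maxlen=WINDOW)
        model = GaussianNB()

        def process(x, true_label):
            """Predict the arriving instance, then update the window and refit."""
            pred = model.predict([x])[0] if len(window_y) > 10 else None
            window_X.append(x)
            window_y.append(true_label)
            model.fit(np.array(window_X), np.array(window_y))   # forgetting happens via maxlen
            return pred

        rng = np.random.default_rng(0)
        correct = total = 0
        for t in range(2000):
            drift = t >= 1000                        # concept drift halfway through the stream
            label = int(rng.integers(0, 2))
            x = rng.normal(loc=(label if not drift else 1 - label), scale=1.0, size=3)
            pred = process(x, label)
            if pred is not None:
                correct += int(pred == label)
                total += 1
        print("prequential accuracy:", correct / total)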

  16. Classifier fusion for VoIP attacks classification

    Science.gov (United States)

    Safarik, Jakub; Rezac, Filip

    2017-05-01

    SIP is one of the most successful protocols in the field of IP telephony communication. It establishes and manages VoIP calls. As the number of SIP implementations rises, we can expect a higher number of attacks on the communication system in the near future. This work aims at malicious SIP traffic classification. A number of various machine learning algorithms have been developed for attack classification. The paper presents a comparison with current research and the use of a classifier fusion method leading to a potential decrease in classification error rate. Combining classifiers yields a more robust solution that avoids the difficulties that may affect single algorithms. Different voting schemes, combination rules, and classifiers are discussed to improve the overall performance. All classifiers have been trained on real malicious traffic. The traffic monitoring concept depends on a network of honeypot nodes. These honeypots run in several networks spread across different locations. Separation of the honeypots allows us to gain independent and trustworthy attack information.
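
    The fusion step can be sketched with scikit-learn's majority-vote combiner; this is only one of the fusion rules the paper compares, and the "SIP traffic" features below are synthetic rather than the honeypot data.

        from sklearn.datasets import make_classification
        from sklearn.ensemble import VotingClassifier, RandomForestClassifier
        from sklearn.linear_model import LogisticRegression
        from sklearn.naive_bayes import GaussianNB
        from sklearn.model_selection import cross_val_score

        # stand-in for feature vectors extracted from malicious SIP traffic (3 attack classes)
        X, y = make_classification(n_samples=1000, n_features=20, n_informative=8,
                                   n_classes=3, n_clusters_per_class=1, random_state=0)

        fusion = VotingClassifier(estimators=[
            ("lr", LogisticRegression(max_iter=1000)),
            ("nb", GaussianNB()),
            ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
        ], voting="hard")

        print("fused 5-fold CV accuracy:", cross_val_score(fusion, X, y, cv=5).mean())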

  17. Diagnosis of Broiler Livers by Classifying Image Patches

    DEFF Research Database (Denmark)

    Jørgensen, Anders; Fagertun, Jens; Moeslund, Thomas B.

    2017-01-01

    The manual health inspection is becoming the bottleneck at poultry processing plants. We present a computer vision method for automatic diagnosis of broiler livers. The non-rigid livers, of varying shapes and sizes, are classified in patches by a convolutional neural network, outputting maps...

  18. Subtractive fuzzy classifier based driver distraction levels classification using EEG.

    Science.gov (United States)

    Wali, Mousa Kadhim; Murugappan, Murugappan; Ahmad, Badlishah

    2013-09-01

    [Purpose] In earlier studies of driver distraction, researchers classified distraction into two levels (not distracted, and distracted). This study classified four levels of distraction (neutral, low, medium, high). [Subjects and Methods] Fifty Asian subjects (n=50, 43 males, 7 females), age range 20-35 years, who were free from any disease, participated in this study. Wireless EEG signals were recorded by 14 electrodes during four types of distraction stimuli (Global Positioning System (GPS), music player, short message service (SMS), and mental tasks). We derived the amplitude spectrum of three different frequency bands of EEG: theta, alpha, and beta. Then, based on fusion of discrete wavelet packet transforms and fast Fourier transform yields, we extracted two features (power spectral density, spectral centroid frequency) for different wavelets (db4, db8, sym8, and coif5). Mean ± SD was calculated and analysis of variance (ANOVA) was performed. A fuzzy inference system classifier was applied to the different wavelets using the two extracted features. [Results] The results indicate that the two features of sym8 possess highly significant discrimination across the four levels of distraction, and the best average accuracy achieved by the subtractive fuzzy classifier was 79.21%, using the power spectral density feature extracted with the sym8 wavelet. [Conclusion] These findings suggest that EEG signals can be used to monitor distraction level intensity in order to alert drivers to high levels of distraction.

  19. Congestive heart failure detection using random forest classifier.

    Science.gov (United States)

    Masetic, Zerina; Subasi, Abdulhamit

    2016-07-01

    Automatic electrocardiogram (ECG) heartbeat classification is substantial for diagnosing heart failure. The aim of this paper is to evaluate the effect of machine learning methods in creating a model which classifies normal and congestive heart failure (CHF) on long-term ECG time series. The study was performed in two phases: a feature extraction phase and a classification phase. In the feature extraction phase, the autoregressive (AR) Burg method is applied for extracting features. In the classification phase, five different classifiers are examined, namely C4.5 decision tree, k-nearest neighbor, support vector machine, artificial neural networks and random forest classifier. The ECG signals were acquired from the BIDMC Congestive Heart Failure and PTB Diagnostic ECG databases and classified by applying various experiments. The experimental results are evaluated with several statistical measures (sensitivity, specificity, accuracy, F-measure and ROC curve) and showed that the random forest method gives 100% classification accuracy. The impressive performance of the random forest method shows that it plays a significant role in detecting congestive heart failure (CHF) and can be valuable in expressing knowledge useful in medicine. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
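
    A rough sketch of the pipeline shape (autoregressive features followed by a random forest). The AR coefficients here come from a plain least-squares fit rather than the Burg method used in the paper, and the "ECG" segments are synthetic AR processes, so this only illustrates the structure of the approach.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)

        def ar_coeffs(x, order=4):
            """Least-squares AR(order) coefficients of a 1-D signal (stand-in for Burg's method)."""
            rows = [x[t - order:t][::-1] for t in range(order, len(x))]
            coef, *_ = np.linalg.lstsq(np.array(rows), x[order:], rcond=None)
            return coef

        def segment(cls, n=512):
            # two different AR(2) processes stand in for normal vs. CHF heartbeat series
            a = (1.6, -0.8) if cls == 0 else (0.9, -0.2)
            x = np.zeros(n)
            e = rng.normal(size=n)
            for t in range(2, n):
                x[t] = a[0] * x[t - 1] + a[1] * x[t - 2] + e[t]
            return x

        labels = np.repeat([0, 1], 100)
        X = np.array([ar_coeffs(segment(c)) for c in labels])

        rf = RandomForestClassifier(n_estimators=200, random_state=0)
        print("5-fold CV accuracy:", cross_val_score(rf, X, labels, cv=5).mean())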

  20. Development of Oscillating Classifiers for Forage Chop Length ...

    African Journals Online (AJOL)

    The chop length produced by forage harvesting systems is an important factor in many aspects of silage production, yet the only reliable method of measuring it has been tedious hand measurement of every particle in a sample of chopped material. Three oscillating particle length classifiers have been developed. Each of them differed from one ...

  1. Two Stage Comparison of Classifier Performances for Highly Imbalanced Datasets

    Directory of Open Access Journals (Sweden)

    Goran Oreški

    2015-12-01

    Full Text Available During the process of knowledge discovery in data, imbalanced learning data often emerges and presents a significant challenge for data mining methods. In this paper, we investigate the influence of class-imbalanced data on the classification results of artificial intelligence methods, i.e. neural networks and support vector machines, and on the classification results of classical classification methods represented by RIPPER and the Naïve Bayes classifier. All experiments are conducted on 30 different imbalanced datasets obtained from the KEEL (Knowledge Extraction based on Evolutionary Learning) repository. To measure the quality of classification, the accuracy and the area under the ROC curve (AUC) measures are used. The results of the research indicate that the neural network and support vector machine show improvement in the AUC measure when applied to balanced data, but at the same time show a deterioration in classification accuracy. RIPPER results are similar, but the changes are of a smaller magnitude, while the results of the Naïve Bayes classifier show an overall deterioration on balanced distributions. The number of instances in the presented highly imbalanced datasets has a significant additional impact on the classification performance of the SVM classifier. The results show the potential of the SVM classifier for ensemble creation on imbalanced datasets.
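
    The reason both accuracy and AUC are reported can be shown in a few lines: on a highly imbalanced set a classifier can post high accuracy while AUC exposes how well (or poorly) the minority class is actually separated. The data below are synthetic, not the KEEL datasets.

        from sklearn.datasets import make_classification
        from sklearn.model_selection import train_test_split
        from sklearn.svm import SVC
        from sklearn.metrics import accuracy_score, roc_auc_score

        # ~95% majority class, deliberately weak signal
        X, y = make_classification(n_samples=2000, n_features=10, n_informative=3,
                                   weights=[0.95, 0.05], flip_y=0.05, random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

        svm = SVC(probability=True, random_state=0).fit(X_tr, y_tr)
        print("accuracy:", accuracy_score(y_te, svm.predict(X_te)))
        print("AUC     :", roc_auc_score(y_te, svm.predict_proba(X_te)[:, 1]))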

  2. 5 CFR 1312.5 - Authority to classify.

    Science.gov (United States)

    2010-01-01

    5 CFR 1312.5 (Administrative Personnel; Office of Management and Budget; OMB Directives; Classification) — Authority to classify. The section designates the officials authorized to classify information, including the Associate Director for International Affairs and the Associate Director for Natural Resources, Energy and Science, with a separate listing of officials authorized to classify Secret and below...

  3. Should Omphaloceles be Re-classified? | Adeniran | East and ...

    African Journals Online (AJOL)

    Background: Omphaloceles are presently classified into 'minor' and 'major' categories depending on the diameter of the umbilical defect. In developed countries most 'major' cases are treated with silo, parenteral nutrition and progressive compression. In developing countries most cases are managed conservatively with ...

  4. 25 CFR 304.3 - Classifying and marking of silver.

    Science.gov (United States)

    2010-04-01

    25 CFR 304.3 (Indians; Indian Arts and Crafts Board, Department of the Interior; Navajo, Pueblo, and Hopi silver; use of Government mark) — Classifying and marking of silver. For the present the Indian Arts and Crafts Board ... Government mark. All such marking of silver shall, for the present, be done by an agent of the Indian Arts and Crafts Board...

  5. An Investigation to Improve Classifier Accuracy for Myo Collected Data

    Science.gov (United States)

    2017-02-01

    Report excerpt (table of contents and figure captions): Section 5, Bad Samples' Effect on Classification Accuracy — 5.1 Naïve Bayes (NB) Classifier Accuracy; 5.2 Logistic Model Tree (LMT); 5.3 K-Nearest Neighbor. Figure captions: … gesture, pitch feature, user 06 (all samples exhibit reversed movement); Fig. A-2, Come gesture, pitch feature, user 14 (all samples exhibit reversed movement).

  6. Naming and Classifying: Theory, Evidence, and Equity in Education

    Science.gov (United States)

    Lucas, Samuel R.; Beresford, Lauren

    2010-01-01

    Education names and classifies individuals. This result seems unavoidable. For example, some students will graduate, and some will not. Those who graduate will be "graduates"; those who do not graduate will be labeled otherwise. The only way to avoid such labeling is to fail to make distinctions of any kind. Yet education is rife with…

  7. A Gene Expression Classifier of Node-Positive Colorectal Cancer

    Directory of Open Access Journals (Sweden)

    Paul F. Meeh

    2009-10-01

    Full Text Available We used digital long serial analysis of gene expression to discover gene expression differences between node-negative and node-positive colorectal tumors and developed a multigene classifier able to discriminate between these two tumor types. We prepared and sequenced long serial analysis of gene expression libraries from one node-negative and one node-positive colorectal tumor, sequenced to a depth of 26,060 unique tags, and identified 262 tags significantly differentially expressed between these two tumors (P < 2 × 10⁻⁶). We confirmed the tag-to-gene assignments and differential expression of 31 genes by quantitative real-time polymerase chain reaction, 12 of which were elevated in the node-positive tumor. We analyzed the expression levels of these 12 upregulated genes in a validation panel of 23 additional tumors and developed an optimized seven-gene logistic regression classifier. The classifier discriminated between node-negative and node-positive tumors with 86% sensitivity and 80% specificity. Receiver operating characteristic analysis of the classifier revealed an area under the curve of 0.86. Experimental manipulation of the function of one classification gene, Fibronectin, caused profound effects on invasion and migration of colorectal cancer cells in vitro. These results suggest that the development of node-positive colorectal cancer occurs in part through elevated epithelial FN1 expression and suggest novel strategies for the diagnosis and treatment of advanced disease.
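
    An illustrative sketch of a small multi-gene logistic-regression classifier evaluated by cross-validated ROC AUC, mirroring the shape of the analysis above; the expression matrix and class labels are simulated, and the seven "genes" are generic columns rather than the study's gene set.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_predict
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(7)
        n_samples, n_genes = 120, 7
        y = rng.integers(0, 2, size=n_samples)            # 1 = node-positive (illustrative)
        effect = rng.normal(0.8, 0.2, size=n_genes)       # assumed per-gene up-regulation in node-positive tumors
        X = rng.normal(size=(n_samples, n_genes)) + y[:, None] * effect

        clf = LogisticRegression(max_iter=1000)
        proba = cross_val_predict(clf, X, y, cv=5, method="predict_proba")[:, 1]
        print("cross-validated AUC:", roc_auc_score(y, proba))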

  8. 18 CFR 367.18 - Criteria for classifying leases.

    Science.gov (United States)

    2010-04-01

    18 CFR 367.18 — Criteria for classifying leases. Among the criteria: (1) ownership transfers to the lessee by the end of the lease term; (2) the lease contains a bargain purchase option; (3) a condition on the lease term. A revised agreement whose changed terms, had they been in effect at the inception of the lease, would have resulted in a different classification must be considered as a new agreement over its term and classified according to the criteria in this section.

  9. 41 CFR 109-45.309-52 - Classified property.

    Science.gov (United States)

    2010-07-01

    41 CFR 109-45.309-52 (Public Contracts and Property Management; Federal Property Management Regulations System (Continued); Department of Energy Property Management Regulations; Utilization and Disposal) — Classified property.

  10. Classifying aquatic macrophytes as indicators of eutrophication in European lakes

    NARCIS (Netherlands)

    Penning, W.E.; Mjelde, M.; Dudley, B.; Hellsten, S.; Hanganu, J.; Kolada, A.; van den Berg, Marcel S.; Poikane, S.; Phillips, G.; Willby, N.; Ecke, F.

    2008-01-01

    Aquatic macrophytes are one of the biological quality elements in the Water Framework Directive (WFD) for which status assessments must be defined. We tested two methods to classify macrophyte species and their response to eutrophication pressure: one based on percentiles of occurrence along a

  11. 18 CFR 3a.71 - Accountability for classified material.

    Science.gov (United States)

    2010-04-01

    18 CFR 3a.71 — Accountability for classified material. Control numbers will be prefixed by the letters "TS" (example: 9006, the sixth classified document controlled by the office); an accountability record for the document will be prepared in addition to FPC Form 55; a physical inventory of all classified material is required; and results go as recommendations to the chairman, through the Review Committee, for appropriate administrative action.

  12. A Framework for Identifying and Classifying Undergraduate Student Proof Errors

    Science.gov (United States)

    Strickland, S.; Rand, B.

    2016-01-01

    This paper describes a framework for identifying, classifying, and coding student proofs, modified from existing proof-grading rubrics. The framework includes 20 common errors, as well as categories for interpreting the severity of the error. The coding scheme is intended for use in a classroom context, for providing effective student feedback. In…

  13. 32 CFR 2700.42 - Responsibility for safeguarding classified information.

    Science.gov (United States)

    2010-07-01

    32 CFR 2700.42 (Office for Micronesian Status Negotiations; Security Information Regulations; Safeguarding) — Responsibility for safeguarding classified information. Responsibility for maintenance of the security of classified information rests with each person having knowledge or physical control of it, and classified information may be shared only in the interest of national security and with assurance of the recipient's trustworthiness and need-to-know.

  14. A novel statistical method for classifying habitat generalists and specialists.

    Science.gov (United States)

    Chazdon, Robin L; Chao, Anne; Colwell, Robert K; Lin, Shang-Yi; Norden, Natalia; Letcher, Susan G; Clark, David B; Finegan, Bryan; Arroyo, J Pablo

    2011-06-01

    We develop a novel statistical approach for classifying generalists and specialists in two distinct habitats. Using a multinomial model based on estimated species relative abundance in two habitats, our method minimizes bias due to differences in sampling intensities between two habitat types as well as bias due to insufficient sampling within each habitat. The method permits a robust statistical classification of habitat specialists and generalists, without excluding rare species a priori. Based on a user-defined specialization threshold, the model classifies species into one of four groups: (1) generalist; (2) habitat A specialist; (3) habitat B specialist; and (4) too rare to classify with confidence. We illustrate our multinomial classification method using two contrasting data sets: (1) bird abundance in woodland and heath habitats in southeastern Australia and (2) tree abundance in second-growth (SG) and old-growth (OG) rain forests in the Caribbean lowlands of northeastern Costa Rica. We evaluate the multinomial model in detail for the tree data set. Our results for birds were highly concordant with a previous nonstatistical classification, but our method classified a higher fraction (57.7%) of bird species with statistical confidence. Based on a conservative specialization threshold and adjustment for multiple comparisons, 64.4% of tree species in the full sample were too rare to classify with confidence. Among the species classified, OG specialists constituted the largest class (40.6%), followed by generalist tree species (36.7%) and SG specialists (22.7%). The multinomial model was more sensitive than indicator value analysis or abundance-based phi coefficient indices in detecting habitat specialists and also detects generalists statistically. Classification of specialists and generalists based on rarefied subsamples was highly consistent with classification based on the full sample, even for sampling percentages as low as 20%. Major advantages of the new
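
    A heavily simplified sketch of the decision logic only (not the authors' estimator): each species is classified by the share of its individuals observed in habitat A against a user-defined specialisation threshold, with a binomial test supplying the statistical confidence and a minimum-count rule flagging species too rare to classify. The published method additionally corrects for unequal sampling intensity between habitats, which is omitted here.

        from scipy.stats import binomtest

        K = 2 / 3          # user-defined specialisation threshold
        MIN_COUNT = 10     # illustrative cut-off for "too rare to classify"

        def classify_species(count_a, count_b):
            n = count_a + count_b
            if n < MIN_COUNT:
                return "too rare to classify"
            if binomtest(count_a, n, K, alternative="greater").pvalue < 0.05:
                return "habitat A specialist"
            if binomtest(count_b, n, K, alternative="greater").pvalue < 0.05:
                return "habitat B specialist"
            return "generalist"

        print(classify_species(28, 4))    # habitat A specialist
        print(classify_species(12, 11))   # generalist
        print(classify_species(3, 1))     # too rare to classify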

  15. CLEAR test facility

    CERN Multimedia

    Ordan, Julien Marius

    2017-01-01

    A new user facility for accelerator R&D, the CERN Linear Electron Accelerator for Research (CLEAR), started operation in August 2017. CLEAR evolved from the former CLIC Test Facility 3 (CTF3) used by the Compact Linear Collider (CLIC). The new facility is able to host and test a broad range of ideas in the accelerator field.

  16. Hepatic, renal, and total body galactose elimination in the pig

    DEFF Research Database (Denmark)

    Winkler, K; Henriksen, Jens Henrik Sahl; Tygstrup, N

    1993-01-01

    reabsorption (Tm 178 ± 3.0 μmol/min, Km 3.8 ± 0.9 mmol/l, n = 20). Metabolic conversion of galactose in the kidney was not demonstrable. At all concentrations studied (0.4-5.8 mmol/l), total galactose elimination from the body exceeded the sum of hepatic and renal elimination by approximately 100 μmol/min, independent of the concentration. At blood concentrations usually used for clinical estimation of the galactose elimination capacity (approximately 4 mmol/l), hepatic removal in the pig accounted for 55% and renal removal for 30% of total removal; 15% of removal occurred in other organs. We conclude that estimation of the hepatic galactose elimination capacity from whole body elimination curves requires correction for renal removal of galactose.

  17. Investment case concepts in leprosy elimination: A systematic review.

    Science.gov (United States)

    Tiwari, Anuj; Richardus, Jan Hendrik

    2016-03-01

    Leprosy continues to be a global public health problem, but draws less attention because 'prevalence based elimination' has been misinterpreted as eradication. The ongoing transmission of M. leprae has renewed interest in complete elimination. The aim of our study is to review systematically the literature regarding the elimination of leprosy, and to assess this information on its applicability for defining a Leprosy Elimination Investment Case (LEIC) based on Eradication Investment Case guidelines. A literature search was conducted using the MeSH subheadings and synonyms of leprosy. A total of 1007 articles were considered and 112 were included in the final selection. The search focused on the literature covering leprosy elimination and its public health aspects. The LEIC framework was adapted from an existing "Guide to Preparing an Eradication Investment Case". The LEIC framework provided 11 topics under which information was synthesized from the literature. The fields were categorised under sections: 1) Proposed investment; 2) Rationale for investing; 3) Issues to consider when moving from control to eradication; 4) Management and governance. Scanty quantitative data are available for developing a LEIC, particularly regarding disease burden, and new interventions that could contribute to elimination are not yet applied routinely. For monitoring global elimination, it is necessary to measure disease burden comprehensively, and contact centered preventive interventions should be part of a global elimination strategy. The biological and technical feasibility of elimination is not certain and advanced microbiological and operational research is necessary to understand transmission better. The current WHO road map for leprosy elimination is too vague and needs further structuring through a thoroughly prepared LEIC.

  18. Geospatial Technology: A Tool to Aid in the Elimination of Malaria in Bangladesh

    Directory of Open Access Journals (Sweden)

    Karen E. Kirk

    2014-12-01

    Full Text Available Bangladesh is a malaria endemic country. There are 13 districts in the country bordering India and Myanmar that are at risk of malaria. The majority of malaria morbidity and mortality cases are in the Chittagong Hill Tracts, the mountainous southeastern region of Bangladesh. In recent years, the malaria burden has declined in the country. In this study, we reviewed and summarized published data (through 2014) on the use of geospatial technologies for malaria epidemiology in Bangladesh and outlined potential contributions of geospatial technologies to eliminating malaria in the country. We completed a literature review using the "malaria, Bangladesh" search terms and found 218 articles published in peer-reviewed journals listed in PubMed. After a detailed review, 201 articles were excluded because they did not meet our inclusion criteria, and 17 articles were selected for final evaluation. The published studies indicated that geospatial technology tools (Geographic Information Systems, Global Positioning Systems, and Remote Sensing) were used to determine vector-breeding sites, land cover classification, accessibility to health facilities, treatment seeking behaviors, and risk mapping at the household, regional, and national levels in Bangladesh. To achieve the goal of malaria elimination in Bangladesh, we concluded that further research using geospatial technologies should be integrated into the country's ongoing surveillance system to identify and better assess progress towards malaria elimination.

  19. Celebrating 50 years of polio elimination in New Zealand: but inadequate progress in eliminating other vaccine-preventable diseases.

    Science.gov (United States)

    Wilson, Nick; Baker, Michael G

    2012-11-09

    New Zealanders can now reflect on and celebrate 50 years of polio elimination in this country. This success was followed by eliminating two other infectious diseases, brucellosis and hydatids, and an imported potential disease vector, the southern saltmarsh mosquito. However, this country has made inadequate progress in eliminating several other vaccine-preventable diseases. These include measles, mumps, and rubella, which are priority candidates for elimination, and potentially Hib disease and rotavirus infection. To achieve such successes almost certainly requires that the country: (i) builds national leadership for elimination goals; (ii) develops detailed plans; (iii) continues recent successes in enhancing routine vaccination coverage; (iv) introduces rotavirus vaccine into the childhood immunisation schedule; and (v) strengthens surveillance and research (on such questions as the cost-effectiveness of new vaccines, measures to enhance uptake, and effective border controls to reduce the risk of disease importation).

  20. Enhanced passive screening and diagnosis for gambiense human African trypanosomiasis in north-western Uganda - Moving towards elimination.

    Directory of Open Access Journals (Sweden)

    Charles Wamboga

    Full Text Available The incidence of gambiense human African trypanosomiasis (gHAT) in Uganda has been declining, from 198 cases in 2008 to only 20 in 2012. Interruption of transmission of the disease by early diagnosis and treatment is core to the control and eventual elimination of gHAT. Until recently, the format of available screening tests had restricted screening and diagnosis to central health facilities (passive screening). We describe a novel strategy that is contributing to elimination of gHAT in Uganda through expansion of passive screening to the entire population at risk. In this strategy, patients who are clinically suspected of having gHAT at primary health facilities are screened using a rapid diagnostic test (RDT), followed by parasitological confirmation at strategically located microscopy centres. For patients who are positive with the RDT and negative by microscopy, blood samples undergo further testing using loop-mediated isothermal amplification (LAMP), a molecular test that detects parasite DNA. LAMP-positive patients are considered strong suspects and are re-evaluated by microscopy. Location and upgrading of facilities to perform microscopy and LAMP was informed by results of georeferencing and characterization of all public healthcare facilities in the 7 gHAT-endemic districts in Uganda. Three facilities were upgraded to perform RDTs, microscopy and LAMP, 9 to perform RDTs and microscopy, and 200 to screen patients with RDTs. This reduced the distance that a sick person must travel to be screened for gHAT to a median distance of 2.5 km, compared to 23 km previously. In this strategy, 9 gHAT cases were diagnosed in 2014 and 4 in 2015. This enhanced passive screening strategy for gHAT has enabled full coverage of the population at risk and is being replicated in other gHAT-endemic countries. The improvement in case detection is making elimination of the disease in Uganda an imminent possibility.

  1. Comparative metabolism and elimination of acetanilide compounds by rat.

    Science.gov (United States)

    Davison, K L; Larsen, G L; Feil, V J

    1994-10-01

    1. 14C-labelled propachlor, alachlor, butachlor, metolachlor, methoxypropachlor and some of their mercapturic acid pathway metabolites (MAP) were given to rats either by gavage or by perfusion into a renal artery. MAP metabolites were isolated from bile and urine. 2. Rats gavaged with propachlor and methoxypropachlor eliminated 14C mostly in urine, whereas rats gavaged with alachlor, butachlor and metolachlor eliminated 14C about equally divided between urine and faeces. When bile ducts were cannulated, the gavaged rats eliminated most of the 14C in bile for all compounds. The amount of 14C in bile from the propachlor-gavaged rats was less than that for the other acetanilides, with the difference being in the urine. 3. The mercapturic acid metabolites 2-methylsulphinyl-N-(1-methylhydroxyethyl)-N-phenylacetamide and 2-methylsulphinyl-N-(1-methylmethoxyethyl)-N-phenylacetamide were isolated from the urine and bile of the methoxypropachlor-gavaged rats. 4. Bile was the major route for 14C elimination when MAP metabolites of alachlor, butachlor and metolachlor were perfused into a renal artery. Urine was the major route for 14C elimination when MAP metabolites of propachlor and methoxypropachlor were perfused. Mercapturic acid conjugates were major metabolites in bile and urine when MAP metabolites were perfused. 5. We conclude that alkyl groups on the phenyl portion of the acetanilide cause biliary elimination to be favoured over urinary elimination.

  2. Facilities improvement for sustainability of existing public office ...

    African Journals Online (AJOL)

    The study examined the building design features of a cosmopolitan public office building in Abuja. The features were classified into Spatial Plan, Structure and Facilities, to determine which of the 3 variables requires urgent sustainable improvement from end-users' perspective in existing public office buildings in developing ...

  3. Scoring and Classifying Examinees Using Measurement Decision Theory

    Directory of Open Access Journals (Sweden)

    Lawrence M. Rudner

    2009-04-01

    Full Text Available This paper describes and evaluates the use of measurement decision theory (MDT) to classify examinees based on their item response patterns. The model has a simple framework that starts with the conditional probabilities of examinees in each category or mastery state responding correctly to each item. The presented evaluation investigates: (1) the classification accuracy of tests scored using decision theory; (2) the effectiveness of different sequential testing procedures; and (3) the number of items needed to make a classification. A large percentage of examinees can be classified accurately with very few items using decision theory. A Java Applet for self-instruction and software for generating, calibrating and scoring MDT data are provided.
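
    A minimal sketch of the core MDT computation: given each mastery state's probability of a correct response per item, an observed response pattern is scored by the posterior probability of each state, and the examinee is assigned to the most probable state. A uniform prior is assumed and the item probabilities below are illustrative, not calibrated values.

        import numpy as np

        # P(correct | state) for 5 items and 2 states: [non-master, master]
        p_correct = np.array([[0.3, 0.80],
                              [0.4, 0.90],
                              [0.2, 0.70],
                              [0.5, 0.90],
                              [0.3, 0.85]])
        prior = np.array([0.5, 0.5])

        responses = np.array([1, 1, 0, 1, 1])   # observed item scores (1 = correct)

        likelihood = np.prod(np.where(responses[:, None] == 1, p_correct, 1 - p_correct), axis=0)
        posterior = prior * likelihood
        posterior /= posterior.sum()
        print("P(non-master), P(master):", posterior)   # classify into the more probable state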

  4. Classifying Human Leg Motions with Uniaxial Piezoelectric Gyroscopes

    Directory of Open Access Journals (Sweden)

    Kerem Altun

    2009-10-01

    Full Text Available This paper provides a comparative study of different techniques for classifying human leg motions that are performed using two low-cost uniaxial piezoelectric gyroscopes worn on the leg. A number of feature sets, extracted from the raw inertial sensor data in different ways, are used in the classification process. The classification techniques implemented and compared in this study are: Bayesian decision making (BDM), a rule-based algorithm (RBA) or decision tree, the least-squares method (LSM), the k-nearest neighbor algorithm (k-NN), dynamic time warping (DTW), support vector machines (SVM), and artificial neural networks (ANN). A performance comparison of these classification techniques is provided in terms of their correct differentiation rates, confusion matrices, computational cost, and training and storage requirements. Three different cross-validation techniques are employed to validate the classifiers. The results indicate that BDM, in general, results in the highest correct classification rate with relatively small computational cost.
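
    A compact sketch of one of the compared pipelines (k-NN with cross-validation) on synthetic stand-ins for the gyroscope feature vectors; the paper's actual feature extraction from the raw inertial signals is not reproduced here.

        import numpy as np
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.model_selection import cross_val_score
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(0)
        n_classes, n_per_class, n_feat = 8, 40, 12          # e.g. 8 leg-motion types (assumed)
        centres = rng.normal(0, 2, size=(n_classes, n_feat))
        X = np.vstack([c + rng.normal(size=(n_per_class, n_feat)) for c in centres])
        y = np.repeat(np.arange(n_classes), n_per_class)

        knn = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))
        print("10-fold CV accuracy:", cross_val_score(knn, X, y, cv=10).mean())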

  5. Utilizing a class labeling feature in an adaptive Bayesian classifier

    Science.gov (United States)

    Lynch, Robert S., Jr.; Willett, Peter K.

    2001-08-01

    In this paper, the Mean-Field Bayesian Data Reduction Algorithm is developed that adaptively trains on data containing missing values. In the basic data model for this algorithm each feature vector of a given class contains a class-labeling feature. Thus, the methods developed here are used to demonstrate performance for problems in which it is desired to adapt the existing training data with data containing missing values, such as the class-labeling feature. Given that, the Mean-Field Bayesian Data Reduction Algorithm labels the adapted data, while simultaneously determining those features that provide best classification performance. That is, performance is improved by reducing the data to mitigate the effects of the curse of dimensionality. Further, to demonstrate performance, the algorithm is compared to the classifier that does not adapt and bases its decisions on only the prior training data, and also the optimal clairvoyant classifier.

  6. Machine learning classifiers and fMRI: a tutorial overview.

    Science.gov (United States)

    Pereira, Francisco; Mitchell, Tom; Botvinick, Matthew

    2009-03-01

    Interpreting brain image experiments requires analysis of complex, multivariate data. In recent years, one analysis approach that has grown in popularity is the use of machine learning algorithms to train classifiers to decode stimuli, mental states, behaviours and other variables of interest from fMRI data and thereby show the data contain information about them. In this tutorial overview we review some of the key choices faced in using this approach as well as how to derive statistically significant results, illustrating each point from a case study. Furthermore, we show how, in addition to answering the question of 'is there information about a variable of interest' (pattern discrimination), classifiers can be used to tackle other classes of question, namely 'where is the information' (pattern localization) and 'how is that information encoded' (pattern characterization).
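
    A toy sketch of the "is there information?" (pattern discrimination) question discussed above: a linear classifier is trained to decode a stimulus label from voxel patterns and evaluated with cross-validation, with above-chance accuracy as the quantity of interest. The voxel data are synthetic, with signal planted in a small subset of voxels.

        import numpy as np
        from sklearn.svm import LinearSVC
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n_trials, n_voxels = 120, 500
        y = rng.integers(0, 2, size=n_trials)          # two stimulus conditions
        signal = np.zeros(n_voxels)
        signal[:20] = 0.6                              # information carried by 20 voxels only
        X = rng.normal(size=(n_trials, n_voxels)) + np.outer(y, signal)

        acc = cross_val_score(LinearSVC(max_iter=5000), X, y, cv=5).mean()
        print("decoding accuracy (chance = 0.5):", acc)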

  7. Lung Nodule Detection in CT Images using Neuro Fuzzy Classifier

    Directory of Open Access Journals (Sweden)

    M. Usman Akram

    2013-07-01

    Full Text Available Automated lung cancer detection using computer aided diagnosis (CAD) is an important area in clinical applications. As manual nodule detection is very time consuming and costly, computerized systems can be helpful for this purpose. In this paper, we propose a computerized system for lung nodule detection in CT scan images. The automated system consists of two stages: (i) lung segmentation and enhancement, and (ii) feature extraction and classification. The segmentation process separates lung tissue from the rest of the image, and only the lung tissues under examination are considered as candidate regions for detecting malignant nodules in the lung portion. A feature vector for possible abnormal regions is calculated and regions are classified using a neuro-fuzzy classifier. It is a fully automatic system that does not require any manual intervention, and experimental results show the validity of our system.

  8. Testing mediation effects in cross-classified multilevel data.

    Science.gov (United States)

    Luo, Wen

    2017-04-01

    In this article, we propose an approach to test mediation effects in cross-classified multilevel data in which the initial cause is associated with one crossed factor, the mediator is associated with the other crossed factor, and the outcome is associated with Level-1 units (i.e., the 2(A)→2(B)→1 design). Multiple-membership models and cross-classified random effects models are used to estimate the indirect effects. The method is illustrated using real data from the Early Childhood Longitudinal Study-Kindergarten Cohort (1998). The results from the simulation study show that the proposed method can produce a consistent estimate of the indirect effect and reliable statistical inferences, given an adequate sample size.

  9. Implementation of Emergency Medical Text Classifier for syndromic surveillance.

    Science.gov (United States)

    Travers, Debbie; Haas, Stephanie W; Waller, Anna E; Schwartz, Todd A; Mostafa, Javed; Best, Nakia C; Crouch, John

    2013-01-01

    Public health officials use syndromic surveillance systems to facilitate early detection and response to infectious disease outbreaks. Emergency department clinical notes are becoming more available for surveillance but present the challenge of accurately extracting concepts from these text data. The purpose of this study was to implement a new system, Emergency Medical Text Classifier (EMT-C), into daily production for syndromic surveillance and evaluate system performance and user satisfaction. The system was designed to meet user preferences for a syndromic classifier that maximized positive predictive value and minimized false positives in order to provide a manageable workload. EMT-C performed better than the baseline system on all metrics and users were slightly more satisfied with it. It is vital to obtain user input and test new systems in the production environment.

  10. Feasibility study for banking loan using association rule mining classifier

    Directory of Open Access Journals (Sweden)

    Agus Sasmito Aribowo

    2015-03-01

    Full Text Available The problem of bad loans in a koperasi (credit cooperative) can be reduced if the koperasi can detect whether a member will be able to repay the loan or will default. The method used in this study to identify characteristic patterns of prospective borrowers is called the Association Rule Mining Classifier. The patterns of existing credit members are converted into knowledge and used to classify other applicants. The classification process separates applicants into two groups: a good-credit group and a bad-credit group. The research used prototyping to implement the design as an application with a programming language and development tools. The association rule mining process uses the Weighted Itemset-Tidset (WIT-tree) method. The results show that the method can predict prospective customers' creditworthiness. The training data set comprised 120 customers with known credit histories. The test data comprised 61 customers applying for credit. The results concluded that 42 customers are predicted to pay off their loans and 19 are predicted to default.

  11. Classifier ensemble selection based on affinity propagation clustering.

    Science.gov (United States)

    Meng, Jun; Hao, Han; Luan, Yushi

    2016-04-01

    A small number of features are significantly correlated with classification in high-dimensional data. An ensemble feature selection method based on cluster grouping is proposed in this paper. Classification-related features are chosen using a ranking aggregation technique. These features are divided into unrelated groups by an affinity propagation clustering algorithm with a bicor correlation coefficient. Diverse and distinguishing feature subsets are constructed by randomly selecting a feature from each group and are used to train base classifiers. Finally, base classifiers that have better classification performance are selected using a kappa coefficient and integrated using a majority voting strategy. The experimental results based on five gene expression datasets show that the proposed method has low classification error rates, stable classification performance and strong scalability in terms of sensitivity, specificity, accuracy and G-Mean criteria. Copyright © 2016 Elsevier Inc. All rights reserved.
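
    A rough sketch of the grouping-then-ensemble idea (not the full published method): features are clustered by the similarity of their profiles with affinity propagation, one feature is drawn at random from each cluster to build each base learner, and the base learners vote. Plain Pearson correlation stands in for the paper's bicor coefficient, and the ranking-aggregation and kappa-based selection steps are omitted.

        import numpy as np
        from sklearn.cluster import AffinityPropagation
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.datasets import make_classification

        X, y = make_classification(n_samples=200, n_features=40, n_informative=10, random_state=0)
        rng = np.random.default_rng(0)

        corr = np.corrcoef(X.T)                                        # feature-feature similarity
        clusters = AffinityPropagation(affinity="precomputed", damping=0.9,
                                       random_state=0).fit_predict(corr)

        ensemble = []
        for _ in range(11):                                            # 11 diverse base classifiers
            picks = [int(rng.choice(np.flatnonzero(clusters == c))) for c in np.unique(clusters)]
            ensemble.append((picks, DecisionTreeClassifier(random_state=0).fit(X[:, picks], y)))

        votes = np.array([clf.predict(X[:, picks]) for picks, clf in ensemble])
        majority = (votes.mean(axis=0) > 0.5).astype(int)              # simple majority vote
        print("training-set accuracy of the voted ensemble:", (majority == y).mean())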

  12. Nonlinear interpolation fractal classifier for multiple cardiac arrhythmias recognition

    Energy Technology Data Exchange (ETDEWEB)

    Lin, C.-H. [Department of Electrical Engineering, Kao-Yuan University, No. 1821, Jhongshan Rd., Lujhu Township, Kaohsiung County 821, Taiwan (China); Institute of Biomedical Engineering, National Cheng-Kung University, Tainan 70101, Taiwan (China)], E-mail: eechl53@cc.kyu.edu.tw; Du, Y.-C.; Chen Tainsong [Institute of Biomedical Engineering, National Cheng-Kung University, Tainan 70101, Taiwan (China)

    2009-11-30

    This paper proposes a method for cardiac arrhythmia recognition using a nonlinear interpolation fractal classifier. A typical electrocardiogram (ECG) consists of the P-wave, QRS complexes, and T-wave. The iterated function system (IFS) uses nonlinear interpolation in the map and uses similarity maps to construct various data sequences, including the fractal patterns of supraventricular ectopic beats, bundle branch ectopic beats, and ventricular ectopic beats. Grey relational analysis (GRA) is proposed to recognize normal heartbeats and cardiac arrhythmias. The nonlinear interpolation terms produce family functions with fractal dimension (FD), the so-called nonlinear interpolation functions (NIF), and make the fractal patterns more distinguishable between normal and ill subjects. The proposed QRS classifier is tested using the Massachusetts Institute of Technology-Beth Israel Hospital (MIT-BIH) arrhythmia database. Compared with other methods, the proposed hybrid method demonstrates greater efficiency and higher accuracy in recognizing ECG signals.

  13. Deep Feature Learning and Cascaded Classifier for Large Scale Data

    DEFF Research Database (Denmark)

    Prasoon, Adhish

    This thesis focuses on voxel/pixel classification based approaches for image segmentation. The main application is segmentation of articular cartilage in knee MRIs. The first major contribution of the thesis deals with large scale machine learning problems. Many medical imaging problems need huge amounts of training data to cover sufficient biological variability, and learning methods that scale badly with the number of training data points cannot be used in such scenarios; this may restrict the usage of many powerful classifiers having excellent generalization ability. We propose a cascaded classifier which... We also explore a deep learning approach, convolutional neural networks (CNN), for segmenting three-dimensional medical images, learning features from data rather than relying on a predefined feature set. We propose a novel system integrating three 2D CNNs, which have a one-to-one association with the xy, yz and zx planes of the 3D image...

  14. Security Enrichment in Intrusion Detection System Using Classifier Ensemble

    Directory of Open Access Journals (Sweden)

    Uma R. Salunkhe

    2017-01-01

    Full Text Available In the era of the Internet and with an increasing number of people as its end users, a large number of attack categories are introduced daily. Hence, effective detection of various attacks with the help of Intrusion Detection Systems is an emerging trend in research. Existing studies show the effectiveness of machine learning approaches in handling Intrusion Detection Systems. In this work, we aim to enhance the detection rate of Intrusion Detection Systems by using a machine learning technique. We propose a novel classifier-ensemble-based IDS constructed using a hybrid approach that combines data-level and feature-level approaches. Classifier ensembles combine the opinions of different experts and improve the intrusion detection rate. Experimental results show the improved detection rates of our system compared to the reference technique.

  15. Predicting Cutting Forces in Aluminum Using Polynomial Classifiers

    Science.gov (United States)

    Kadi, H. El; Deiab, I. M.; Khattab, A. A.

    Due to increased calls for environmentally benign machining processes, there has been focus and interest in making processes more lean and agile to enhance efficiency, reduce emissions and increase profitability. One approach to achieving lean machining is to develop a virtual simulation environment that enables fast and reasonably accurate predictions of various machining scenarios. Polynomial Classifiers (PCs) are employed to develop a smart data base that can provide fast prediction of cutting forces resulting from various combinations of cutting parameters. With time, the force model can expand to include different materials, tools, fixtures and machines and would be consulted prior to starting any job. In this work, first, second and third order classifiers are used to predict the cutting coefficients that can be used to determine the cutting forces. Predictions obtained using PCs are compared to experimental results and are shown to be in good agreement.
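
    A rough sketch of the general idea, fitting polynomial models of order one to three that map cutting parameters to a measured force response, is shown below. It uses ordinary least squares on polynomial features as a stand-in for the paper's polynomial classifiers, and all variable names and the cross-validation setup are assumptions.

```python
# Illustrative sketch: polynomial models of order 1-3 mapping cutting parameters
# (e.g., speed, feed, depth of cut) to a measured cutting force or coefficient.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

def fit_polynomial_force_models(X_params, y_force, max_order=3):
    """X_params: (n_samples, n_cutting_parameters); y_force: measured responses."""
    models = {}
    for order in range(1, max_order + 1):
        model = make_pipeline(PolynomialFeatures(degree=order), LinearRegression())
        # Cross-validated R^2 gives a quick way to compare first-, second- and third-order fits.
        score = cross_val_score(model, X_params, y_force, cv=5, scoring="r2").mean()
        models[order] = (model.fit(X_params, y_force), score)
    return models
```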

  16. Hydrogen bond induced HF elimination from photoionized fluorophenol dimers in the gas phase.

    Science.gov (United States)

    Chatterjee, Piyali; Ghosh, Arup K; Chakraborty, Tapas

    2017-02-28

    In this paper, we report the finding of a remarkable chemical effect of hydrogen bonding: elimination of hydrogen fluoride (HF) from the hydrogen-bonded dimers of 2-fluorophenol (2-FP) and 3-fluorophenol (3-FP) in a supersonic jet expansion upon multi-photon ionization using the 4th harmonic wavelength (266 nm) of a Q-switched Nd:YAG laser; the reaction has been probed by time-of-flight mass spectrometry. No HF elimination is observed to occur by such means from the monomer of 3-FP, but it occurs with a small yield from the monomer of 2-FP. On the other hand, upon dimerization the reaction is triggered for 3-FP, and for 2-FP it becomes so facile that no intact dimer cation survives and only the HF-eliminated product ion appears in the mass spectra. Electronic structure calculation shows that in the cationic ground (D0) state, although the reaction for the 2-FP dimer is exothermic, the associated barrier is significantly high (2.75 eV) and its occurrence requires absorption of three photons (2+1 type). However, the reaction is predicted to be barrierless in the intermediate S1 state of this dimer, and the HF-loss dimer cation mass peak can appear in the mass spectrum through an effective two-photon (1+1) ionization process. In the case of the 3-FP dimer, the energy barriers in both the S1 (neutral) and D0 (ionic) states are high, and it is suggested that for HF elimination to occur, the dimer cation needs to absorb an additional photon. To facilitate HF loss from this dimer cation, a rearrangement of the geometry and formation of an intermediate adduct have been suggested, and it is argued that the latter could be produced by nucleophilic attack of the neutral moiety at the ortho site of the cationic counterpart.

  18. Complexity Measure Revisited: A New Algorithm for Classifying Cardiac Arrhythmias

    Science.gov (United States)

    2001-10-25

    Only report-documentation fields and reference fragments are available for this record. The cited references concern the electrical therapy of cardiac arrhythmias and the detection and identification of cardiac arrhythmias using an adaptive linear-predictive filter (Jenkins and DiCarlo), together with notes on setting up the acquisition and processing characteristics of the ECG signal.

  19. Evaluating Classifiers in Detecting 419 Scams in Bilingual Cybercriminal Communities

    OpenAIRE

    Mbaziira, Alex V.; Abozinadah, Ehab; Jones Jr, James H.

    2015-01-01

    Incidents of organized cybercrime are rising because criminals are reaping high financial rewards while incurring low costs to commit crime. As the digital landscape broadens to accommodate more internet-enabled devices and technologies like social media, more cybercriminals who are not native English speakers are invading cyberspace to cash in on quick exploits. In this paper we evaluate the performance of three machine learning classifiers in detecting 419 scams in a bilingual Nigerian c...

  20. Organisms can essentially be classified according to two codon patterns.

    Science.gov (United States)

    Okayasu, T; Sorimachi, K

    2009-02-01

    We recently classified 23 bacteria into two types based on their complete genomes; "S-type" as represented by Staphylococcus aureus and "E-type" as represented by Escherichia coli. Classification was characterized by concentrations of Arg, Ala or Lys in the amino acid composition calculated from the complete genome. Based on these previous classifications, not only prokaryotic but also eukaryotic genome structures were investigated by amino acid compositions and nucleotide contents. Organisms consisting of 112 bacteria, 15 archaea and 18 eukaryotes were classified into two major groups by cluster analysis using GC contents at the three codon positions calculated from complete genomes. The 145 organisms were classified into "AT-type" and "GC-type" represented by high A or T (low G or C) and high G or C (low A or T) contents, respectively, at every third codon position. Reciprocal changes between G or C and A or T contents at the third codon position occurred almost synchronously in every codon among the organisms. Correlations between amino acid concentrations (Ala, Ile and Lys) and the nucleotide contents at the codon position were obtained in both "AT-type" and "GC-type" organisms, but with different regression coefficients. In certain correlations of amino acid concentrations with GC contents, eukaryotes, archaea and bacteria showed different behaviors; thus these kingdoms evolved differently. All organisms are basically classifiable into two groups having characteristic codon patterns; organisms with low GC and high AT contents at the third codon position and their derivatives, and organisms with an inverse relationship.

  1. The Relationship Between Diversity and Accuracy in Multiple Classifier Systems

    Science.gov (United States)

    2012-03-22

    Only excerpt fragments are available for this record. The study examines the relationship between diversity and accuracy in multiple classifier systems, including the case where a classifier only produces scores for each class, using fourteen data sets from the UCI Machine Learning repository and validation sets chosen large enough that a few difficult exemplars do not have an overly adverse effect on accuracy.

  2. Classifying patient portal messages using Convolutional Neural Networks.

    Science.gov (United States)

    Sulieman, Lina; Gilmore, David; French, Christi; Cronin, Robert M; Jackson, Gretchen Purcell; Russell, Matthew; Fabbri, Daniel

    2017-10-01

    Patients communicate with healthcare providers via secure messaging in patient portals. As patient portal adoption increases, growing messaging volumes may overwhelm providers. Prior research has demonstrated promise in automating classification of patient portal messages into communication types to support message triage or answering. This paper examines if using semantic features and word context improves portal message classification. Portal messages were classified into the following categories: informational, medical, social, and logistical. We constructed features from portal messages including bag of words, bag of phrases, graph representations, and word embeddings. We trained one-versus-all random forest and logistic regression classifiers, and convolutional neural network (CNN) with a softmax output. We evaluated each classifier's performance using Area Under the Curve (AUC). Representing the messages using bag of words, the random forest detected informational, medical, social, and logistical communications in patient portal messages with AUCs: 0.803, 0.884, 0.828, and 0.928, respectively. Graph representations of messages outperformed simpler features with AUCs: 0.837, 0.914, 0.846, 0.884 for informational, medical, social, and logistical communication, respectively. Representing words with Word2Vec embeddings, and mapping features using a CNN had the best performance with AUCs: 0.908 for informational, 0.917 for medical, 0.935 for social, and 0.943 for logistical categories. Word2Vec and graph representations improved the accuracy of classifying portal messages compared to features that lacked semantic information such as bag of words, and bag of phrases. Furthermore, using Word2Vec along with a CNN model, which provide a higher order representation, improved the classification of portal messages. Copyright © 2017 Elsevier Inc. All rights reserved.
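
    As a hedged illustration of the bag-of-words baseline described above (not the authors' exact pipeline), the snippet below trains a one-vs-all random forest on message word counts and reports AUC; the message texts and binary category labels are placeholders.

```python
# Sketch of a bag-of-words, one-vs-all baseline for portal message categories,
# scored with AUC. Inputs are hypothetical lists of message strings and 0/1 labels.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

def bag_of_words_auc(messages, labels):
    """labels: 1 if the message belongs to the category (e.g., 'medical'), else 0."""
    X = CountVectorizer(min_df=2).fit_transform(messages)
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, labels, test_size=0.3, stratify=labels, random_state=0)
    clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
    # AUC on the held-out messages for this one-vs-all category.
    return roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
```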

  3. Applying deep learning to classify pornographic images and videos

    OpenAIRE

    Moustafa, Mohamed

    2015-01-01

    It is no secret that pornographic material is now one click away from everyone, including children and minors. General social media networks are striving to isolate adult images and videos from normal ones. Intelligent image analysis methods can help to automatically detect and isolate questionable images in media. Unfortunately, these methods require vast experience to design the classifier, including one or more of the popular computer vision feature descriptors. We propose to build a clas...

  4. Optimally splitting cases for training and testing high dimensional classifiers

    Directory of Open Access Journals (Sweden)

    Simon Richard M

    2011-04-01

    Full Text Available Abstract Background We consider the problem of designing a study to develop a predictive classifier from high dimensional data. A common study design is to split the sample into a training set and an independent test set, where the former is used to develop the classifier and the latter to evaluate its performance. In this paper we address the question of what proportion of the samples should be devoted to the training set. How does this proportion impact the mean squared error (MSE) of the prediction accuracy estimate? Results We develop a non-parametric algorithm for determining an optimal splitting proportion that can be applied with a specific dataset and classifier algorithm. We also perform a broad simulation study for the purpose of better understanding the factors that determine the best split proportions and to evaluate commonly used splitting strategies (1/2 training or 2/3 training) under a wide variety of conditions. These methods are based on a decomposition of the MSE into three intuitive component parts. Conclusions By applying these approaches to a number of synthetic and real microarray datasets we show that for linear classifiers the optimal proportion depends on the overall number of samples available and the degree of differential expression between the classes. The optimal proportion was found to depend on the full dataset size (n) and classification accuracy, with higher accuracy and smaller n resulting in more samples assigned to the training set. The commonly used strategy of allocating 2/3rd of cases for training was close to optimal for reasonably sized datasets (n ≥ 100) with strong signals (i.e. 85% or greater full dataset accuracy). In general, we recommend use of our nonparametric resampling approach for determining the optimal split. This approach can be applied to any dataset, using any predictor development method, to determine the best split.
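
    The snippet below is a rough, minimal sketch (not the authors' non-parametric algorithm) of how candidate split proportions might be compared by repeated resampling: for each proportion, the spread of hold-out accuracy estimates around a full-data cross-validated reference is used as a crude MSE. The classifier, proportions and repeat counts are arbitrary assumptions.

```python
# Simplified, illustrative comparison of training/test split proportions.
import numpy as np
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.svm import LinearSVC

def compare_split_proportions(X, y, proportions=(0.5, 0.67, 0.8), n_repeats=50):
    # Full-data cross-validated accuracy used as a rough reference value.
    reference = cross_val_score(LinearSVC(dual=False), X, y, cv=10).mean()
    results = {}
    for p in proportions:
        estimates = []
        for seed in range(n_repeats):
            X_tr, X_te, y_tr, y_te = train_test_split(
                X, y, train_size=p, stratify=y, random_state=seed)
            estimates.append(LinearSVC(dual=False).fit(X_tr, y_tr).score(X_te, y_te))
        # Mean squared deviation of hold-out estimates from the reference accuracy.
        results[p] = np.mean((np.array(estimates) - reference) ** 2)
    return results   # smaller value -> more reliable accuracy estimate at that split
```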

  5. [A New HAC Unsupervised Classifier Based on Spectral Harmonic Analysis].

    Science.gov (United States)

    Yang, Ke-ming; Wei, Hua-feng; Shi, Gang-qiang; Sun, Yang-yang; Liu, Fei

    2015-07-01

    Hyperspectral image classification is one of the important methods to identify image information, and it has great significance for feature identification, dynamic monitoring, thematic information extraction, etc. Unsupervised classification without prior knowledge is widely used in hyperspectral image classification. This article proposes a new unsupervised classification algorithm for hyperspectral images based on harmonic analysis (HA), called the harmonic analysis classifier (HAC). First, the HAC algorithm counts the first harmonic component and draws its histogram, so it can determine the initial feature categories and the cluster-center pixels according to the number and location of the peaks. Then, the algorithm maps the spectral waveform information of the pixels to be classified into a feature space made up of harmonic decomposition times, amplitude and phase, where similar features group together, and these pixels are classified according to the minimum-distance principle. Finally, the algorithm computes the Euclidean distance of these pixels from the cluster centers and merges the initial classes by setting a distance threshold, so that the HAC achieves hyperspectral image classification. The paper collects spectral curves of two feature categories and obtains harmonic decomposition times, amplitude and phase after harmonic analysis; the distribution of HA components in the feature space verified the correctness of the HAC. The HAC algorithm was also applied to an EO-1 Hyperion hyperspectral image to obtain classification results. Comparing the classification results of the K-MEANS, ISODATA and HAC classifiers confirms that the HAC, as an unsupervised classification method, is better suited to hyperspectral image classification.

  6. Classifier Performance Estimation with Unbalanced, Partially Labeled Data

    Science.gov (United States)

    2017-05-26

    Only journal-header and excerpt fragments are available for this record. The paper addresses classifier performance estimation with unbalanced, partially labeled data in settings where operators work with a score computed from available observations: all cases whose score exceeds a certain value are investigated and determined to be true or false positives, while detections below that value are missed by the system.

  7. Classifying Floating Potential Measurement Unit Data Products as Science Data

    Science.gov (United States)

    Coffey, Victoria; Minow, Joseph

    2015-01-01

    We are Co-Investigators for the Floating Potential Measurement Unit (FPMU) on the International Space Station (ISS) and members of the FPMU operations and data analysis team. We are providing this memo for the purpose of classifying raw and processed FPMU data products and ancillary data as NASA science data with unrestricted, public availability in order to best support science uses of the data.

  8. Classifying Radio Galaxies with the Convolutional Neural Network

    Science.gov (United States)

    Aniyan, A. K.; Thorat, K.

    2017-06-01

    We present the application of a deep machine learning technique to classify radio images of extended sources on a morphological basis using convolutional neural networks (CNN). In this study, we have taken the case of the Fanaroff-Riley (FR) class of radio galaxies as well as radio galaxies with bent-tailed morphology. We have used archival data from the Very Large Array (VLA)—Faint Images of the Radio Sky at Twenty Centimeters survey and existing visually classified samples available in the literature to train a neural network for morphological classification of these categories of radio sources. Our training sample size for each of these categories is ˜200 sources, which has been augmented by rotated versions of the same images. Our study shows that CNNs can classify images of the FRI, FRII and bent-tailed radio galaxies with high accuracy (maximum precision at 95%) using well-defined samples and a “fusion classifier,” which combines the results of binary classifications, while allowing for a mechanism to find sources with unusual morphologies. The individual precision is highest for bent-tailed radio galaxies at 95% and is 91% and 75% for the FRI and FRII classes, respectively, whereas the recall is highest for FRI and FRIIs at 91% each, while the bent-tailed class has a recall of 79%. These results are comparable to those of manual classification while being much faster. Finally, we discuss the computational and data-related challenges associated with the morphological classification of radio galaxies with CNNs.
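
    A small sketch of the rotation-based augmentation mentioned above is given below, assuming the radio images are available as numpy arrays; the angles and interpolation settings are illustrative choices, not the authors' exact preprocessing.

```python
# Minimal sketch of rotation augmentation for image classification training sets.
import numpy as np
from scipy.ndimage import rotate

def augment_by_rotation(images, angles=(0, 90, 180, 270)):
    """images: array of shape (n, height, width); returns rotated copies stacked."""
    augmented = [rotate(img, angle, reshape=False, order=1)
                 for img in images for angle in angles]
    return np.stack(augmented)
```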

  9. Classifying handheld Augmented Reality: Three categories linked by spatial mappings

    OpenAIRE

    Vincent, Thomas; Nigay, Laurence; Kurata, Takeshi

    2012-01-01

    Session 1: Research papers; International audience; Handheld Augmented Reality (AR) relies on a spatial coupling of the on-screen content with the physical surrounding. To help the design of such systems and to classify existing AR systems, we present a framework made of three categories and two spatial relationships. Our framework highlights spatial relationships between the physical world, the representation of the physical world on screen and the augmentation on screen. Within this framewo...

  10. Automatic Classification of Cetacean Vocalizations Using an Aural Classifier

    Science.gov (United States)

    2013-09-30

    Only excerpt fragments are available for this record. The work applies an aural classifier to vocalizations primarily from four cetacean species (the sperm whale, northern right whale, bowhead whale and humpback whale), using recordings that include data from the Gulf of Mexico as well as sets of pre-recorded and synthetic bowhead and humpback whale vocalizations; additional species, such as the minke whale vocalizations available on the MobySound website, are to be tested with the classifier as time permits.

  11. Evaluation of Polarimetric SAR Decomposition for Classifying Wetland Vegetation Types

    Directory of Open Access Journals (Sweden)

    Sang-Hoon Hong

    2015-07-01

    Full Text Available The Florida Everglades is the largest subtropical wetland system in the United States and, as with subtropical and tropical wetlands elsewhere, has been threatened by severe environmental stresses. It is very important to monitor such wetlands to inform management on the status of these fragile ecosystems. This study aims to examine the applicability of TerraSAR-X quadruple polarimetric (quad-pol synthetic aperture radar (PolSAR data for classifying wetland vegetation in the Everglades. We processed quad-pol data using the Hong & Wdowinski four-component decomposition, which accounts for double bounce scattering in the cross-polarization signal. The calculated decomposition images consist of four scattering mechanisms (single, co- and cross-pol double, and volume scattering. We applied an object-oriented image analysis approach to classify vegetation types with the decomposition results. We also used a high-resolution multispectral optical RapidEye image to compare statistics and classification results with Synthetic Aperture Radar (SAR observations. The calculated classification accuracy was higher than 85%, suggesting that the TerraSAR-X quad-pol SAR signal had a high potential for distinguishing different vegetation types. Scattering components from SAR acquisition were particularly advantageous for classifying mangroves along tidal channels. We conclude that the typical scattering behaviors from model-based decomposition are useful for discriminating among different wetland vegetation types.

  12. Evaluating Classifiers to Detect Arm Movement Intention from EEG Signals

    Directory of Open Access Journals (Sweden)

    Daniel Planelles

    2014-09-01

    Full Text Available This paper presents a methodology to detect the intention to make a reaching movement with the arm in healthy subjects before the movement actually starts. This is done by measuring brain activity through electroencephalographic (EEG signals that are registered by electrodes placed over the scalp. The preparation and performance of an arm movement generate a phenomenon called event-related desynchronization (ERD in the mu and beta frequency bands. A novel methodology to characterize this cognitive process based on three sums of power spectral frequencies involved in ERD is presented. The main objective of this paper is to set the benchmark for classifiers and to choose the most convenient. The best results are obtained using an SVM classifier with around 72% accuracy. This classifier will be used in further research to generate the control commands to move a robotic exoskeleton that helps people suffering from motor disabilities to perform the movement. The final aim is that this brain-controlled robotic exoskeleton improves the current rehabilitation processes of disabled people.
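
    The sketch below illustrates, under simplifying assumptions, the kind of pipeline the abstract describes: band-power features in the mu and beta bands fed to an SVM. The sampling rate, band edges, epoch layout and classifier settings are all hypothetical, not the authors' parameters.

```python
# Illustrative sketch: mu/beta band-power features from EEG epochs classified with an SVM.
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def band_power_features(epochs, fs=256, bands=((8, 12), (13, 30))):
    """epochs: (n_epochs, n_channels, n_samples) -> (n_epochs, n_channels * len(bands))."""
    freqs, psd = welch(epochs, fs=fs, nperseg=fs, axis=-1)
    feats = [psd[..., (freqs >= lo) & (freqs <= hi)].mean(axis=-1) for lo, hi in bands]
    return np.concatenate(feats, axis=-1).reshape(epochs.shape[0], -1)

def train_intention_detector(epochs, labels):
    """labels: e.g., 0 for rest epochs, 1 for pre-movement (intention) epochs."""
    X = band_power_features(epochs)
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    return clf.fit(X, labels)
```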

  13. Recurrent neural networks for classifying relations in clinical notes.

    Science.gov (United States)

    Luo, Yuan

    2017-08-01

    We proposed the first models based on recurrent neural networks (more specifically Long Short-Term Memory, LSTM) for classifying relations from clinical notes. We tested our models on the i2b2/VA relation classification challenge dataset. We showed that our segment LSTM model, with only a word embedding feature and no manual feature engineering, achieved a micro-averaged f-measure of 0.661 for classifying medical problem-treatment relations, 0.800 for medical problem-test relations, and 0.683 for medical problem-medical problem relations. These results are comparable to those of the state-of-the-art systems on the i2b2/VA relation classification challenge. We compared the segment LSTM model with the sentence LSTM model, and demonstrated the benefits of exploring the difference between concept text and context text, and between different contextual parts of the sentence. We also evaluated the impact of word embeddings on the performance of the LSTM models and showed that medical-domain word embeddings help improve relation classification. These results support the use of LSTM models for classifying relations between medical concepts, as they show comparable performance to previously published systems while requiring no manual feature engineering. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. Self-organizing map classifier for stressed speech recognition

    Science.gov (United States)

    Partila, Pavol; Tovarek, Jaromir; Voznak, Miroslav

    2016-05-01

    This paper presents a method for detecting speech under stress using Self-Organizing Maps. Most people exposed to stressful situations cannot respond adequately to stimuli. The army, police and fire departments operate in environments with a particularly high number of stressful situations, and personnel in action are directed by a control center whose commands should be adapted to the psychological state of the person in the field. Psychological changes in the human body are also reflected physiologically and consequently affect speech, so a system for recognizing stress in speech is needed by the security forces. One possible classifier, popular for its flexibility, is the self-organizing map (SOM), a type of artificial neural network. Flexibility here means that the classifier is independent of the character of the input data, a feature well suited to speech processing. Human stress can be seen as a kind of emotional state. Mel-frequency cepstral coefficients, LPC coefficients and prosodic features were selected as input data because of their sensitivity to emotional changes. The parameters were calculated from speech recordings divided into two classes, stressed-state recordings and normal-state recordings. The contribution of the experiment is a method using a SOM classifier for stressed speech detection; the results showed the advantage of this method, namely its flexibility with respect to the input data.

  15. A new system for classifying tooth, root and canal anomalies.

    Science.gov (United States)

    Ahmed, H M A; Dummer, P M H

    2017-10-12

    Understanding the normal anatomical features as well as the more unusual developmental anomalies of teeth, roots and root canals is essential for successful root canal treatment. In addition to various types of root canal configuration and accessory canal morphology, a wide range of developmental tooth, root and canal anomalies exists, including C-shaped canals, dens invaginatus, taurodontism, root fusion, dilacerations and palato-gingival grooves. There is a direct association between developmental anomalies and pulp and periradicular diseases that usually require a multidisciplinary treatment approach to achieve a successful outcome. A number of classifications have categorized tooth, root and canal anomalies; however, several important details are often missed making the classifications less than ideal and potentially confusing. Recently, a new coding system for classifying root, root canal and accessory canal morphology has been introduced. The purpose of this article is to introduce a new system for classifying tooth, root and canal anomalies for use in research, clinical practice and training, which can serve as complementary codes to the recently described system for classifying root, as well as main and accessory canal morphology. © 2017 International Endodontic Journal. Published by John Wiley & Sons Ltd.

  16. A Distributed Fuzzy Associative Classifier for Big Data.

    Science.gov (United States)

    Segatori, Armando; Bechini, Alessio; Ducange, Pietro; Marcelloni, Francesco

    2017-09-19

    Fuzzy associative classification has not been widely analyzed in the literature, although associative classifiers (ACs) have proved to be very effective in different real domain applications. The main reason is that learning fuzzy ACs is a very heavy task, especially when dealing with large datasets. To overcome this drawback, in this paper, we propose an efficient distributed fuzzy associative classification approach based on the MapReduce paradigm. The approach exploits a novel distributed discretizer based on fuzzy entropy for efficiently generating fuzzy partitions of the attributes. Then, a set of candidate fuzzy association rules is generated by employing a distributed fuzzy extension of the well-known FP-Growth algorithm. Finally, this set is pruned by using three purposely adapted types of pruning. We implemented our approach on the popular Hadoop framework, which allows distributing the storage and processing of very large data sets on computer clusters built from commodity hardware. We have performed an extensive experimentation and a detailed analysis of the results using six very large datasets with up to 11,000,000 instances. We have also experimented with different types of reasoning methods. Focusing on accuracy, model complexity, computation time, and scalability, we compare the results achieved by our approach with those obtained by two distributed nonfuzzy ACs recently proposed in the literature. We highlight that, although the accuracies are comparable, the complexity, evaluated in terms of number of rules, of the classifiers generated by the fuzzy distributed approach is lower than that of the nonfuzzy classifiers.

  17. Analysis of classifiers performance for classification of potential microcalcification

    Science.gov (United States)

    M. N., Arun K.; Sheshadri, H. S.

    2013-07-01

    Breast cancer is a significant public health problem in the world. According to the literature, early detection improves breast cancer prognosis. Mammography is a screening tool used for early detection of breast cancer. About 10-30% of cases are missed during routine checks, as it is difficult for radiologists to make an accurate analysis due to the large amount of data. Microcalcifications (MCs) are considered to be important signs of breast cancer. It has been reported in the literature that 30%-50% of breast cancers detected radiographically show MCs on mammograms, and histologic examinations report that 62% to 79% of breast carcinomas reveal MCs. MCs are tiny, vary in size, shape and distribution, and may be closely connected to surrounding tissues. Classifying individual potential MCs with traditional classifiers is a major challenge, because processing mammograms at the appropriate stage generates data sets with an unequal amount of information for the two classes (i.e., MC and not-MC). Most existing state-of-the-art classification approaches assume that the underlying training set is evenly distributed; however, they face a severe bias problem when the training set is highly imbalanced in distribution. This paper addresses this issue by using classifiers which handle imbalanced data sets. In this paper, we also compare the performance of classifiers used in the classification of potential MCs.
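
    One common way to handle the class imbalance described above is cost-sensitive learning; the sketch below is illustrative only (not the paper's method), using scikit-learn class weights and an F1 score on hypothetical patch features.

```python
# Sketch of imbalance-aware classification via class weights; feature extraction
# from mammogram patches is assumed to have been done elsewhere.
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def imbalance_aware_mc_classifier(X_patches, y_is_mc):
    """y_is_mc: 1 for microcalcification, 0 otherwise; typically heavily imbalanced."""
    clf = RandomForestClassifier(n_estimators=200, class_weight="balanced", random_state=0)
    # F1 rather than accuracy, since accuracy is misleading on imbalanced data.
    print("cv F1:", cross_val_score(clf, X_patches, y_is_mc, cv=5, scoring="f1").mean())
    return clf.fit(X_patches, y_is_mc)
```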

  18. Classifier ensembles for land cover mapping using multitemporal SAR imagery

    Science.gov (United States)

    Waske, Björn; Braun, Matthias

    SAR data are almost independent from weather conditions, and thus are well suited for mapping of seasonally changing variables such as land cover. In regard to recent and upcoming missions, multitemporal and multi-frequency approaches become even more attractive. In the present study, classifier ensembles (i.e., boosted decision tree and random forests) are applied to multi-temporal C-band SAR data, from different study sites and years. A detailed accuracy assessment shows that classifier ensembles, in particularly random forests, outperform standard approaches like a single decision tree and a conventional maximum likelihood classifier by more than 10% independently from the site and year. They reach up to almost 84% of overall accuracy in rural areas with large plots. Visual interpretation confirms the statistical accuracy assessment and reveals that also typical random noise is considerably reduced. In addition the results demonstrate that random forests are less sensitive to the number of training samples and perform well even with only a small number. Random forests are computationally highly efficient and are hence considered very well suited for land cover classifications of future multifrequency and multitemporal stacks of SAR imagery.
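
    A minimal sketch of a pixel-wise random forest applied to a multitemporal SAR stack is given below, assuming hypothetical inputs (a date-by-row-by-column backscatter array, a boolean training mask and training labels); it illustrates the general approach rather than the study's exact processing chain.

```python
# Illustrative pixel-wise land cover classification on a multitemporal SAR stack,
# where each pixel's feature vector is its backscatter time series.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def classify_sar_stack(stack, train_mask, train_labels, n_trees=500):
    """stack: (n_dates, height, width); train_mask: boolean (height, width);
    train_labels: class labels for the pixels where train_mask is True."""
    n_dates, h, w = stack.shape
    pixels = stack.reshape(n_dates, -1).T            # (n_pixels, n_dates)
    clf = RandomForestClassifier(n_estimators=n_trees, n_jobs=-1, random_state=0)
    clf.fit(pixels[train_mask.ravel()], train_labels)
    return clf.predict(pixels).reshape(h, w)         # land cover map
```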

  19. Entropy based classifier for cross-domain opinion mining

    Directory of Open Access Journals (Sweden)

    Jyoti S. Deshmukh

    2018-01-01

    Full Text Available In recent years, the growth of social networks has increased people's interest in analyzing reviews and opinions for products before they buy them. Consequently, this has given rise to domain adaptation as a prominent area of research in sentiment analysis. A classifier trained on one domain often gives poor results on data from another domain, because the expression of sentiment differs across domains and labeling each domain separately is very expensive and time consuming. Therefore, this study proposes an approach that extracts and classifies opinion words from one domain, called the source domain, and predicts opinion words of another domain, called the target domain, using a semi-supervised approach that combines modified maximum entropy and bipartite graph clustering. A comparison of opinion classification on reviews from four different product domains is presented. The results demonstrate that the proposed method performs relatively well in comparison to the other methods. Comparison with SentiWordNet of domain-specific and domain-independent words reveals that on average 72.6% and 88.4% of words, respectively, are correctly classified.

  20. A New Classifier for Flood Hazard Mapping over Large Regions

    Science.gov (United States)

    Samela, C.; Troy, T. J.; Manfreda, S.

    2015-12-01

    The knowledge of the position and the extent of the areas exposed to the flood hazard is essential to any strategy for minimizing the risk. Unfortunately, in ungauged basins the use of traditional floodplain mapping techniques is prevented by the lack of the extensive data required. The main aim of the present work is to overcome this limitation by defining an alternative simplified procedure for a preliminary, but efficient, floodplain delineation. To validate the method in a data-rich environment, eleven flood-related morphological descriptors derived from DEMs have been used as linear binary classifiers over the Ohio River basin and its sub-catchments. Their performances in identifying the floodplains have been measured at the change of the topography and the size of the calibration area, and the best performing classifiers among those analysed have been applied and validated across the continental U.S. The results suggest that the classifier based on the index ln(hr/H), named the Geomorphic Flood Index (GFI), is the most suitable to detect the flood-prone areas in ungauged basins and for large-scale applications, providing good accuracies with low requirements in terms of data and computational costs.
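
    A minimal sketch of how such a DEM-derived descriptor can act as a linear binary classifier is given below; the index follows the ln(hr/H) form quoted in the abstract, while the calibration data (hr, H and an observed flood map) and the F1-based threshold search are illustrative assumptions rather than the study's procedure.

```python
# Illustrative calibration of a single-threshold classifier on a DEM-derived index.
import numpy as np

def calibrate_gfi_threshold(hr, H, observed_flood_mask, n_candidates=200):
    """hr, H: arrays defining the index per cell; observed_flood_mask: boolean array."""
    gfi = np.log(hr / H)                               # Geomorphic Flood Index per cell
    candidates = np.linspace(np.nanmin(gfi), np.nanmax(gfi), n_candidates)
    best_tau, best_score = None, -np.inf
    for tau in candidates:
        pred = gfi >= tau                              # flood-prone where the index exceeds tau
        tp = np.sum(pred & observed_flood_mask)
        fp = np.sum(pred & ~observed_flood_mask)
        fn = np.sum(~pred & observed_flood_mask)
        f1 = 2 * tp / (2 * tp + fp + fn + 1e-9)        # F1 against the mapped floodplain
        if f1 > best_score:
            best_tau, best_score = tau, f1
    return best_tau, best_score
```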

  1. Exploiting Language Models to Classify Events from Twitter

    Directory of Open Access Journals (Sweden)

    Duc-Thuan Vo

    2015-01-01

    Full Text Available Classifying events is challenging in Twitter because tweet texts contain a large amount of temporal data with a lot of noise and a wide variety of topics. In this paper, we propose a method to classify events from Twitter. We first find the distinguishing terms between tweets in events and measure their similarities with learned language models such as ConceptNet and a latent Dirichlet allocation method for selectional preferences (LDA-SP), which have been widely studied based on large text corpora within computational linguistic relations; the relationships between term words in tweets are discovered by checking them under each model. We then propose a method to compute the similarity between tweets based on the tweets' features, including common term words and relationships among their distinguishing term words. This representation is explicit and convenient to apply with k-nearest-neighbor techniques for classification. Careful experiments on the Edinburgh Twitter Corpus show that our method achieves competitive results for classifying events.
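
    A sketch of the final classification step is shown below: tweets represented by term features and labeled by k-nearest-neighbor voting. The plain TF-IDF cosine similarity used here is a stand-in for the model-based similarity described above, and the inputs are hypothetical.

```python
# Illustrative k-NN event classifier over TF-IDF tweet representations.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

def knn_event_classifier(train_tweets, train_event_labels, k=5):
    """train_tweets: list of tweet strings; train_event_labels: event label per tweet."""
    return make_pipeline(
        TfidfVectorizer(min_df=2),
        KNeighborsClassifier(n_neighbors=k, metric="cosine"),
    ).fit(train_tweets, train_event_labels)
```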

  2. Jointly Learning Structured Analysis Discriminative Dictionary and Analysis Multiclass Classifier.

    Science.gov (United States)

    Zhang, Zhao; Jiang, Weiming; Qin, Jie; Zhang, Li; Li, Fanzhang; Zhang, Min; Yan, Shuicheng

    2017-09-14

    In this paper, we propose an analysis mechanism-based structured analysis discriminative dictionary learning (ADDL) framework. ADDL seamlessly integrates dictionary learning, analysis representation, and analysis classifier training into a unified model. The applied analysis mechanism ensures that the learned dictionaries, representations, and linear classifiers over different classes are as independent and discriminative as possible. The dictionary is obtained by minimizing a reconstruction error and an analytical incoherence-promoting term that encourages the subdictionaries associated with different classes to be independent. To obtain the representation coefficients, ADDL imposes a sparse l2,1-norm constraint on the coding coefficients instead of using the l₀ or l₁ norm, since the l₀- or l₁-norm constraint applied in most existing DL criteria makes the training phase time consuming. The code-extraction projection that bridges data with the sparse codes by extracting special features from the given samples is calculated by minimizing a sparse code approximation term. Then we compute a linear classifier based on the approximated sparse codes by an analysis mechanism to simultaneously consider the classification and representation powers. The classification stage of our model is therefore very efficient, because it avoids the extra time-consuming sparse reconstruction process with the trained dictionary that most existing DL algorithms require for each new test sample. Simulations on real image databases demonstrate that our ADDL model can obtain superior performance over other state-of-the-art methods.

  3. Classifier-Guided Sampling for Complex Energy System Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Backlund, Peter B. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Eddy, John P. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    This report documents the results of a Laboratory Directed Research and Development (LDRD) effort entitled "Classifier-Guided Sampling for Complex Energy System Optimization" that was conducted during FY 2014 and FY 2015. The goal of this project was to develop, implement, and test major improvements to the classifier-guided sampling (CGS) algorithm. CGS is a type of evolutionary algorithm for performing search and optimization over a set of discrete design variables in the face of one or more objective functions. Existing evolutionary algorithms, such as genetic algorithms, may require a large number of objective function evaluations to identify optimal or near-optimal solutions. Reducing the number of evaluations can result in significant time savings, especially if the objective function is computationally expensive. CGS reduces the evaluation count by using a Bayesian network classifier to filter out non-promising candidate designs, prior to evaluation, based on their posterior probabilities. In this project, both the single-objective and multi-objective versions of CGS are developed and tested on a set of benchmark problems. As a domain-specific case study, CGS is used to design a microgrid for use in islanded mode during an extended bulk power grid outage.
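
    As a rough illustration of the screening idea described above (a sketch only, not the Sandia implementation), the snippet below uses a Gaussian naive Bayes classifier in place of the Bayesian network classifier to filter candidate designs before evaluating an expensive objective; the names, the median-based labeling and the probability threshold are all assumptions.

```python
# Illustrative classifier-guided screening of candidate designs before evaluation.
import numpy as np
from sklearn.naive_bayes import GaussianNB

def classifier_guided_step(evaluated_X, evaluated_f, candidates, expensive_objective,
                           keep_probability=0.5):
    """evaluated_X/evaluated_f: already-evaluated designs and objective values (numpy arrays);
    candidates: numpy array of new candidate designs; expensive_objective: callable."""
    # Label past designs as "promising" if better (lower) than the median objective value.
    labels = (evaluated_f < np.median(evaluated_f)).astype(int)
    screen = GaussianNB().fit(evaluated_X, labels)
    p_promising = screen.predict_proba(candidates)[:, 1]
    selected = candidates[p_promising >= keep_probability]   # filter before evaluating
    new_f = np.array([expensive_objective(x) for x in selected])
    return selected, new_f
```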

  4. Hybrid approach using fuzzy sets and extreme learning machine for classifying clinical datasets

    Directory of Open Access Journals (Sweden)

    Kindie Biredagn Nahato

    Full Text Available Data mining techniques play a major role in developing computer aided diagnosis systems and expert systems that will aid a physician in clinical decision making. In this work, a classifier that combines the relative merits of fuzzy sets and extreme learning machine (FELM) for clinical datasets is proposed. The three major subsystems in the FELM framework are the preprocessing subsystem, the fuzzification subsystem and the classification subsystem. Missing value imputation and outlier elimination are handled by the preprocessing subsystem. The fuzzification subsystem maps each feature to a fuzzy set, and the classification subsystem uses an extreme learning machine for classification. Cleveland heart disease (CHD), Statlog heart disease (SHD) and Pima Indian diabetes (PID) datasets from the University of California Irvine (UCI) machine learning repository have been used for experimentation. The CHD and SHD datasets have been experimented with two class labels, one indicating the absence and the other indicating the presence of heart disease. The CHD dataset has also been experimented with five class labels, one class label indicating the absence of heart disease and the other four class labels indicating the severity of heart disease, namely low risk, medium risk, high risk and serious. The PID dataset has been experimented with two class labels, one indicating the absence and the other indicating the presence of gestational diabetes. The classifier has achieved an accuracy of 93.55% for the CHD dataset with two class labels, 73.77% for the CHD dataset with five class labels, 94.44% for the SHD dataset and 92.54% for the PID dataset. Keywords: Extreme learning machine, Fuzzification, Fuzzy set, Classification, Euclidean distance, Membership function
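
    A minimal numpy sketch of the extreme learning machine used in the classification subsystem is given below. It is illustrative only, omits the fuzzification step (fuzzy membership values would simply replace the raw feature matrix), and assumes one-hot encoded class labels.

```python
# Minimal extreme learning machine: a random hidden layer followed by a
# least-squares output layer computed with a pseudo-inverse.
import numpy as np

class TinyELM:
    def __init__(self, n_hidden=100, seed=0):
        self.n_hidden, self.rng = n_hidden, np.random.default_rng(seed)

    def fit(self, X, y_onehot):
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)                 # random hidden-layer features
        self.beta = np.linalg.pinv(H) @ y_onehot         # analytic output weights
        return self

    def predict(self, X):
        return np.argmax(np.tanh(X @ self.W + self.b) @ self.beta, axis=1)
```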

  5. Microbiological investigations on the elimination of volatile chlorinated hydrocarbons. Final report. Mikrobiologische Untersuchungen zur Elimination leichtfluechtiger Chlorkohlenwasserstoffe durch Biofilme. Abschlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Scholz-Muramatsu, H.

    1989-03-01

    The anaerobic elimination of 1,1,1-trichloroethane (TCA), trichloroethene (TCE) and tetrachloroethene (PCE) was studied under batch conditions and in a fixed bed reactor (concentrations of the chlorinated compounds: 50 to 200 µmol/l). Co-substrates for the anaerobic elimination of the chlorinated hydrocarbons were hydrogen, glucose, and benzoate. With glucose (2 mmol/l), PCE was dehalogenated to DCE (25 µmol/(m²·h)). PCE was eliminated to unchlorinated compounds under batch conditions with hydrogen as co-substrate (5 µmol/(l·d)). In continuous culture with benzoate (2 to 4 mmol/l), PCE was dechlorinated to low concentrations of DCE, or even completely, as long as there was active methanogenesis. Under batch conditions, TCA was dechlorinated completely with glucose as well as with benzoate (1.5 to 5 µmol/(l·d)). PCE could be eliminated to DCE by fermenting bacteria without methanogenesis. The factors inhibiting the elimination of dichloromethane (DCM) by groundwater contaminated with leachate from a lacquer sludge landfill were also studied. About 50% of the inhibiting substances belonged to the volatile fraction of the groundwater. Selected compounds from this fraction were examined with regard to their inhibiting effect on the DCM elimination rate. A distinct inhibition was caused by 1,2-dichloroethane, ethylbenzene, and xylene, with DCM elimination being most sensitive to 1,2-dichloroethane. Besides this, the leachate contained nonvolatile inhibitors which could not be identified. (orig.) With 42 refs., 12 tabs., 11 figs.

  6. 3 CFR 13526 - Executive Order 13526 of December 29, 2009. Classified National Security Information

    Science.gov (United States)

    2010-01-01

    Executive Order 13526 of December 29, 2009, Classified National Security Information. This order prescribes a uniform system for classifying, safeguarding, and declassifying national security information...

  7. Genotoxicity of wastewater from health care facilities.

    Science.gov (United States)

    Vlková, Alena; Wittlingerová, Zdeňka; Zimová, Magdalena; Jírová, Gabriela; Kejlová, Kristina; Janoušek, Stanislav; Jírová, Dagmar

    2016-12-18

    Health care facilities use a large number of chemical compounds for therapeutic purposes, diagnostics, research, and disinfection, such as pharmaceuticals (e.g. antibiotics, cytostatics, antidepressants), disinfectants, surfactants, metals, radioactive elements, bleach preparations, etc. Hospitals consume significant amounts of water (in the range of 400 to 1200 liters/day/bed), corresponding to the amount of wastewater discharged. Some of these chemicals are not eliminated in wastewater treatment plants and are a source of pollution for surface water and groundwater supplies. Hospital wastewater represents chemical and biological risks for public and environmental health, as many of these compounds may be genotoxic and are suspected of contributing to the increased incidence of cancer observed during the last decades. Changes to genetic information can have a lethal effect, but more often cause tumor processes or mutations in embryonic development that result in serious defects. A review of the available literature on the mutagenicity/genotoxicity of wastewater from medical facilities is presented in this article.

  8. A valid measure to eliminate the influence of polysaccharides and ...

    African Journals Online (AJOL)

    Mike

    2015-07-22

    Here, a valid combination measure (β-mercaptoethanol, PVP40 and PVPP were used at different stages) was created to eliminate the influence of polysaccharides and polyphenols in recalcitrant longan during DNA.

  9. Saturation current spikes eliminated in saturable core transformers

    Science.gov (United States)

    Schwarz, F. C.

    1971-01-01

    An unsaturating composite magnetic core transformer, consisting of two separate parallel cores designed so that impending core saturation causes signal generation, terminates the high current spike in the converter primary circuit. A simplified waveform demonstrates the transformer's effectiveness in eliminating current spikes.

  10. Health promotion: From malaria control to elimination | Groepe ...

    African Journals Online (AJOL)

    Here we reflect on the achievement of some of the diverse activities that have brought malaria under control, highlight key challenges and propose specific health promotion interventions required to move South Africa's malaria programme from control to elimination.

  11. NPDES (National Pollution Discharge & Elimination System) Minor Dischargers

    Data.gov (United States)

    U.S. Environmental Protection Agency — As authorized by the Clean Water Act, the National Pollutant Discharge Elimination System (NPDES) permit program controls water pollution by regulating point sources...

  12. Testing the hypothesis that treatment can eliminate HIV

    DEFF Research Database (Denmark)

    Okano, Justin T; Robbins, Danielle; Palk, Laurence

    2016-01-01

    The elimination threshold is one new HIV infection per 1000 individuals. Here, we test the hypothesis that TasP can substantially reduce epidemics and eliminate HIV. We estimate the impact of TasP, between 1996 and 2013, on the Danish HIV epidemic in men who have sex with men (MSM), an epidemic UNAIDS has identified as a priority for elimination. METHODS: We use a CD4-staged Bayesian back-calculation approach to estimate incidence and the hidden epidemic (the number of HIV-infected undiagnosed MSM). To develop the back-calculation model, we use data from an ongoing nationwide population-based study: the Danish HIV Cohort Study. FINDINGS: Incidence and the hidden epidemic decreased substantially after treatment was introduced in 1996. By 2013, incidence was close to the elimination threshold: 1·4 (median, 95% Bayesian credible interval [BCI] 0·4-2·1) new HIV infections per 1000 MSM and there were only...

  13. 78 FR 48076 - Facility Security Clearance and Safeguarding of National Security Information and Restricted Data

    Science.gov (United States)

    2013-08-07

    [NRC-2011-0268] RIN 3150-AJ07, Facility Security Clearance and Safeguarding of National Security..., Classified National Security Information. The rule would allow licensees flexibility in determining the means... to 10 CFR Part 95, Facility Security Clearance and Safeguarding of National Security...

  14. A minisum model with forbidden regions for locating a semi-desirable facility in the plane

    DEFF Research Database (Denmark)

    Juel, Henrik; Brimberg, Jack

    1998-01-01

    Most facilities in today's technological society may be classified as semi-desirable. That is, the facility provides a benefit or service to society, while adversely affecting the quality of life or social values in a number of possible ways. The paper proposes a location model for a new semi-des...

  15. Chromium Elimination and Cannon Life Extension for Gun Tubes

    Science.gov (United States)

    2012-08-30

    Only report-documentation fragments are available for this record (ESTCP project WP-201111, U.S. Army Research, Development and Engineering Command, dates covered 2012). The excerpts note that Stellite, an erosion-resistant chrome-cobalt alloy matrix with 15% tungsten, is used as the M60 machine gun barrel liner, and list tantalum, cobalt and tungsten among the materials considered.

  16. Salivary paracetamol elimination kinetics during the menstrual cycle.

    OpenAIRE

    Somaja, L; Thangam, J

    1987-01-01

    Studies were done to examine the influence of the menstrual cycle on the elimination kinetics of paracetamol. Salivary concentrations of paracetamol were determined after oral administration of 1 g of paracetamol on day 3, 10, 14, 20 and 25 of the menstrual cycle in normal healthy women volunteers with regular menstrual cycles. There was no significant difference in elimination half-life (t 1/2) or metabolic clearance rate (CL) between the various days of the menstrual cycle. The result sugge...

  17. Facility Measures Magnetic Fields

    Science.gov (United States)

    Honess, Shawn B.; Narvaez, Pablo; Mcauley, James M.

    1991-01-01

    Partly automated facility measures and computes steady near magnetic field produced by object. Designed to determine magnetic fields of equipment to be installed on spacecraft, including sensitive magnetometers, with a view toward application of compensating fields to reduce interference with spacecraft-magnetometer readings. Because of its convenient operating features and sensitivity of its measurements, facility serves as prototype for similar facilities devoted to magnetic characterization of medical equipment, magnets for high-energy particle accelerators, and magnetic materials.

  18. Synchrotron radiation facilities

    CERN Multimedia

    1972-01-01

    Particularly in the past few years, interest in using the synchrotron radiation emanating from high energy, circular electron machines has grown considerably. In our February issue we included an article on the synchrotron radiation facility at Frascati. This month we are spreading the net wider — saying something about the properties of the radiation, listing the centres where synchrotron radiation facilities exist, adding a brief description of three of them and mentioning areas of physics in which the facilities are used.

  19. Environmental leaders 2 update - Voluntary action on toxic substances: Accelerated Reduction/Elimination of Toxics (ARET): progress report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-01-01

    The Accelerated Reduction/Elimination of Toxics (ARET) program was launched in 1994 as a voluntary, non-regulatory initiative to reduce emissions of the worst toxic substances. ARET targets 117 toxic substances including 30 that persist in the environment and bioaccumulate in living organisms. The pollution prevention activities of 292 facilities that responded to the ARET challenge are described in this report. The program includes facilities from eight major industrial sectors including smelting, chemical manufacturing, oil and gas processing, manufacturing, and electrical utilities. ARET participants have reduced their toxic emissions by 61 per cent from base-year levels. This represents a reduction of 35,175 tonnes of pollutants. A list of ARET target substances, a directory of ARET participants and data on emissions by facility and by substance are also included. tabs., figs.

  20. Composite Structures Manufacturing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Composite Structures Manufacturing Facility specializes in the design, analysis, fabrication and testing of advanced composite structures and materials for both...

  1. GPS Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Global Positioning System (GPS) Test Facility Instrumentation Suite (GPSIS) provides great flexibility in testing receivers by providing operational control of...

  2. Flexible Electronics Research Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Flexible Electronics Research Facility designs, synthesizes, tests, and fabricates materials and devices compatible with flexible substrates for Army information...

  3. Nonlinear Materials Characterization Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Nonlinear Materials Characterization Facility conducts photophysical research and development of nonlinear materials operating in the visible spectrum to protect...

  4. Mobile Solar Tracker Facility

    Data.gov (United States)

    Federal Laboratory Consortium — NIST's mobile solar tracking facility is used to characterize the electrical performance of photovoltaic panels. It incorporates meteorological instruments, a solar...

  5. Heated Tube Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Heated Tube Facility at NASA GRC investigates cooling issues by simulating conditions characteristic of rocket engine thrust chambers and high speed airbreathing...

  6. Imagery Data Base Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Imagery Data Base Facility supports AFRL and other government organizations by providing imagery interpretation and analysis to users for data selection, imagery...

  7. Universal Drive Train Facility

    Data.gov (United States)

    Federal Laboratory Consortium — This vehicle drive train research facility is capable of evaluating helicopter and ground vehicle power transmission technologies in a system level environment. The...

  8. Proximal Probes Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Proximal Probes Facility consists of laboratories for microscopy, spectroscopy, and probing of nanostructured materials and their functional properties. At the...

  9. Catalytic Fuel Conversion Facility

    Data.gov (United States)

    Federal Laboratory Consortium — This facility enables unique catalysis research related to power and energy applications using military jet fuels and alternative fuels. It is equipped with research...

  10. Textiles Performance Testing Facilities

    Data.gov (United States)

    Federal Laboratory Consortium — The Textiles Performance Testing Facilities has the capabilities to perform all physical wet and dry performance testing, and visual and instrumental color analysis...

  11. Manufacturing Demonstration Facility (MDF)

    Data.gov (United States)

    Federal Laboratory Consortium — The U.S. Department of Energy Manufacturing Demonstration Facility (MDF) at Oak Ridge National Laboratory (ORNL) provides a collaborative, shared infrastructure to...

  12. Magnetics Research Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Magnetics Research Facility houses three Helmholtz coils that generate magnetic fields in three perpendicular directions to balance the earth's magnetic field....

  13. Neutron Therapy Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Neutron Therapy Facility provides a moderate intensity, broad energy spectrum neutron beam that can be used for short term irradiations for radiobiology (cells)...

  14. Target Assembly Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Target Assembly Facility integrates new armor concepts into actual armored vehicles. Featuring the capability of machining and cutting radioactive materials, it...

  15. Engine Test Facility (ETF)

    Data.gov (United States)

    Federal Laboratory Consortium — The Air Force Arnold Engineering Development Center's Engine Test Facility (ETF) test cells are used for development and evaluation testing of propulsion systems for...

  16. Pavement Testing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Comprehensive Environmental and Structural AnalysesThe ERDC Pavement Testing Facility, located on the ERDC Vicksburg campus, was originally constructed to provide an...

  17. Geospatial Data Analysis Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Geospatial application development, location-based services, spatial modeling, and spatial analysis are examples of the many research applications that this facility...

  18. Transonic Experimental Research Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Transonic Experimental Research Facility evaluates aerodynamics and fluid dynamics of projectiles, smart munitions systems, and sub-munitions dispensing systems;...

  19. DUPIC facility engineering

    Energy Technology Data Exchange (ETDEWEB)

    Park, J. J.; Lee, H. H.; Kim, K. H. and others

    2000-03-01

    The objectives of this study are (1) the refurbishment of PIEF (Post Irradiation Examination Facility) and the M6 hot-cell in IMEF (Irradiated Material Examination Facility), (2) the establishment of a compatible facility for DUPIC fuel fabrication experiments, licensed by the government organization, and (3) the establishment of the transportation system and transportation cask for nuclear material between facilities. The report for this project describes the objectives, necessity, scope, contents, results of the current stage, and future R&D plans.

  20. Facility Environmental Management System

    Data.gov (United States)

    Federal Laboratory Consortium — This is the Web site of the Federal Highway Administration's (FHWA's) Turner-Fairbank Highway Research Center (TFHRC) facility Environmental Management System (EMS)....

  1. Materials Characterization Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Materials Characterization Facility enables detailed measurements of the properties of ceramics, polymers, glasses, and composites. It features instrumentation...

  2. Malaria control in Nepal 1963–2012: challenges on the path towards elimination

    Science.gov (United States)

    2014-01-01

    Background Malaria is still a priority public health problem of Nepal where about 84% of the population are at risk. The aim of this paper is to highlight the past and present malaria situation in this country and its challenges for long-term malaria elimination strategies. Methods Malariometric indicator data of Nepal recorded through routine surveillance of health facilities for the years between 1963 and 2012 were compiled. Trends and differences in malaria indicator data were analysed. Results The trend of confirmed malaria cases in Nepal between 1963 and 2012 shows fluctuation, with a peak in 1985 when the number exceeded 42,321, representing the highest malaria case-load ever recorded in Nepal. This was followed by a steep declining trend of malaria with some major outbreaks. Nepal has made significant progress in controlling malaria transmission over the past decade: total confirmed malaria cases declined by 84% (12,750 in 2002 vs 2,092 in 2012), and there was only one reported death in 2012. Based on the evaluation of the National Malaria Control Programme in 2010, Nepal recently adopted a long-term malaria elimination strategy for the years 2011–2026 with the ambitious vision of a malaria-free Nepal by 2026. However, there has been an increasing trend of Plasmodium falciparum and imported malaria proportions in the last decade. Furthermore, the analysis of malariometric indicators of 31 malaria-risk districts between 2004 and 2012 shows a statistically significant reduction in the incidence of confirmed malaria and of Plasmodium vivax, but not in the incidence of P. falciparum and clinically suspected malaria. Conclusions Based on the achievements the country has made over the last decade, Nepal is preparing to move towards malaria elimination by 2026. However, considerable challenges lie ahead. These include especially, the need to improve access to diagnostic facilities to confirm clinically suspected cases and their treatment, the development of

  3. A Novel Multi-Class Ensemble Model for Classifying Imbalanced Biomedical Datasets

    Science.gov (United States)

    Bikku, Thulasi; Sambasiva Rao, N., Dr; Rao, Akepogu Ananda, Dr

    2017-08-01

    This paper mainly focuses on developing a Hadoop-based framework for feature selection and classification models to classify high-dimensionality data in heterogeneous biomedical databases. Extensive research has been performed in the fields of machine learning, big data and data mining for identifying patterns. The main challenge is extracting useful features generated from diverse biological systems. The proposed model can be used for predicting diseases in various applications and identifying the features relevant to particular diseases. There is an exponential growth of biomedical repositories such as PubMed and Medline; an accurate predictive model is essential for knowledge discovery in a Hadoop environment. Extracting key features from unstructured documents often leads to uncertain results due to outliers and missing values. In this paper, we proposed a two-phase map-reduce framework with a text preprocessor and classification model. In the first phase, a mapper-based preprocessing method was designed to eliminate irrelevant features, missing values and outliers from the biomedical data. In the second phase, a Map-Reduce based multi-class ensemble decision tree model was designed and implemented on the preprocessed mapper data to improve the true positive rate and reduce computational time. The experimental results on the complex biomedical datasets show that the performance of our proposed Hadoop-based multi-class ensemble model significantly outperforms state-of-the-art baselines.
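
    The two-phase idea (a preprocessing "map" step that drops records with missing values or outliers, followed by an ensemble of decision trees) can be sketched without a Hadoop cluster. The snippet below is a minimal single-machine analogue using scikit-learn, with synthetic data standing in for the biomedical features; it illustrates the workflow only, not the authors' implementation.

      # Minimal single-machine analogue of the two-phase framework:
      # phase 1 filters missing values and outliers, phase 2 fits an
      # ensemble of decision trees (bagging) and reports accuracy.
      import numpy as np
      from sklearn.ensemble import BaggingClassifier
      from sklearn.tree import DecisionTreeClassifier
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import accuracy_score

      rng = np.random.default_rng(0)
      X = rng.normal(size=(1000, 20))          # synthetic feature matrix
      y = (X[:, 0] + X[:, 1] > 0).astype(int)  # synthetic labels
      X[rng.random(X.shape) < 0.01] = np.nan   # inject missing values

      # Phase 1: "mapper-style" preprocessing - drop rows with missing
      # values, then drop rows lying far outside the bulk of the data.
      mask = ~np.isnan(X).any(axis=1)
      X, y = X[mask], y[mask]
      z = np.abs((X - X.mean(axis=0)) / X.std(axis=0))
      keep = (z < 4).all(axis=1)
      X, y = X[keep], y[keep]

      # Phase 2: ensemble of decision trees on the cleaned data.
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
      model = BaggingClassifier(DecisionTreeClassifier(max_depth=5),
                                n_estimators=50, random_state=0).fit(X_tr, y_tr)
      print("accuracy:", accuracy_score(y_te, model.predict(X_te)))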

  4. Coping with unobservable and mis-classified states in capture-recapture studies

    Directory of Open Access Journals (Sweden)

    Kendall, W. L.

    2004-01-01

    Full Text Available Multistate mark-recapture methods provide an excellent conceptual framework for considering estimation in studies of marked animals. Traditional methods include the assumptions that (1) each state an animal occupies is observable, and (2) state is assigned correctly at each point in time. Failure of either of these assumptions can lead to biased estimates of demographic parameters. I review design and analysis options for minimizing or eliminating these biases. Unobservable states can be adjusted for by including them in the state space of the statistical model, with zero capture probability, and incorporating the robust design, or observing animals in the unobservable state through telemetry, tag recoveries, or incidental observations. Mis-classification can be adjusted for by auxiliary data or incorporating the robust design, in order to estimate the probability of detecting the state an animal occupies. For both unobservable and mis-classified states, the key feature of the robust design is the assumption that the state of the animal is static for at least two sampling occasions.

  5. Receiver operating characteristic for a spectrogram correlator-based humpback whale detector-classifier.

    Science.gov (United States)

    Abbot, Ted A; Premus, Vincent E; Abbot, Philip A; Mayer, Owen A

    2012-09-01

    This paper presents recent experimental results and a discussion of system enhancements made to the real-time autonomous humpback whale detector-classifier algorithm first presented by Abbot et al. [J. Acoust. Soc. Am. 127, 2894-2903 (2010)]. In February 2010, a second-generation system was deployed in an experiment conducted off of leeward Kauai during which 26 h of humpback vocalizations were recorded via sonobuoy and processed in real time. These data have been analyzed along with 40 h of humpbacks-absent data collected from the same location during July-August 2009. The extensive whales-absent data set in particular has enabled the quantification of system false alarm rates and the measurement of receiver operating characteristic curves. The performance impact of three enhancements incorporated into the second-generation system are discussed, including (1) a method to eliminate redundancy in the kernel library, (2) increased use of contextual analysis, and (3) the augmentation of the training data with more recent humpback vocalizations. It will be shown that the performance of the real-time system was improved to yield a probability of correct classification of 0.93 and a probability of false alarm of 0.004 over the 66 h of independent test data.
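
    The reported operating point (probability of correct classification 0.93 at a false-alarm probability of 0.004) is a single point on a receiver operating characteristic curve. A generic sketch of how such a curve is traced from detector scores follows, using simulated scores rather than the spectrogram-correlator output.

      # Generic ROC sketch: scores for whales-present vs whales-absent data.
      import numpy as np
      from sklearn.metrics import roc_curve, auc

      rng = np.random.default_rng(1)
      scores_present = rng.normal(2.0, 1.0, 500)    # simulated detector scores
      scores_absent = rng.normal(0.0, 1.0, 5000)
      y_true = np.r_[np.ones(500), np.zeros(5000)]
      y_score = np.r_[scores_present, scores_absent]

      fpr, tpr, thr = roc_curve(y_true, y_score)
      print("AUC:", auc(fpr, tpr))
      # Operating point at the threshold whose false-alarm probability is
      # closest to 0.004:
      i = np.argmin(np.abs(fpr - 0.004))
      print("Pfa=%.4f  Pd=%.3f  threshold=%.2f" % (fpr[i], tpr[i], thr[i]))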

  6. A new training algorithm using artificial neural networks to classify gender-specific dynamic gait patterns.

    Science.gov (United States)

    Andrade, Andre; Costa, Marcelo; Paolucci, Leopoldo; Braga, Antônio; Pires, Flavio; Ugrinowitsch, Herbert; Menzel, Hans-Joachim

    2015-01-01

    The aim of this study was to present a new training algorithm using artificial neural networks called multi-objective least absolute shrinkage and selection operator (MOBJ-LASSO) applied to the classification of dynamic gait patterns. The movement pattern is identified by 20 characteristics from the three components of the ground reaction force which are used as input information for the neural networks in gender-specific gait classification. The classification performance between MOBJ-LASSO (97.4%) and multi-objective algorithm (MOBJ) (97.1%) is similar, but the MOBJ-LASSO algorithm achieved more improved results than the MOBJ because it is able to eliminate the inputs and automatically select the parameters of the neural network. Thus, it is an effective tool for data mining using neural networks. From 20 inputs used for training, MOBJ-LASSO selected the first and second peaks of the vertical force and the force peak in the antero-posterior direction as the variables that classify the gait patterns of the different genders.

  7. Understanding and classifying metabolite space and metabolite-likeness.

    Directory of Open Access Journals (Sweden)

    Julio E Peironcely

    Full Text Available While the entirety of 'Chemical Space' is huge (and assumed to contain between 10(63) and 10(200) 'small molecules'), distinct subsets of this space can nonetheless be defined according to certain structural parameters. An example of such a subspace is the chemical space spanned by endogenous metabolites, defined as 'naturally occurring' products of an organisms' metabolism. In order to understand this part of chemical space in more detail, we analyzed the chemical space populated by human metabolites in two ways. Firstly, in order to understand metabolite space better, we performed Principal Component Analysis (PCA), hierarchical clustering and scaffold analysis of metabolites and non-metabolites in order to analyze which chemical features are characteristic for both classes of compounds. Here we found that heteroatom (both oxygen and nitrogen) content, as well as the presence of particular ring systems was able to distinguish both groups of compounds. Secondly, we established which molecular descriptors and classifiers are capable of distinguishing metabolites from non-metabolites, by assigning a 'metabolite-likeness' score. It was found that the combination of MDL Public Keys and Random Forest exhibited best overall classification performance with an AUC value of 99.13%, a specificity of 99.84% and a selectivity of 88.79%. This performance is slightly better than previous classifiers; and interestingly we found that drugs occupy two distinct areas of metabolite-likeness, the one being more 'synthetic' and the other being more 'metabolite-like'. Also, on a truly prospective dataset of 457 compounds, 95.84% correct classification was achieved. Overall, we are confident that we contributed to the tasks of classifying metabolites, as well as to understanding metabolite chemical space better. This knowledge can now be used in the development of new drugs that need to resemble metabolites, and in our work particularly for assessing the metabolite

  8. Understanding and classifying metabolite space and metabolite-likeness.

    Science.gov (United States)

    Peironcely, Julio E; Reijmers, Theo; Coulier, Leon; Bender, Andreas; Hankemeier, Thomas

    2011-01-01

    While the entirety of 'Chemical Space' is huge (and assumed to contain between 10(63) and 10(200) 'small molecules'), distinct subsets of this space can nonetheless be defined according to certain structural parameters. An example of such a subspace is the chemical space spanned by endogenous metabolites, defined as 'naturally occurring' products of an organisms' metabolism. In order to understand this part of chemical space in more detail, we analyzed the chemical space populated by human metabolites in two ways. Firstly, in order to understand metabolite space better, we performed Principal Component Analysis (PCA), hierarchical clustering and scaffold analysis of metabolites and non-metabolites in order to analyze which chemical features are characteristic for both classes of compounds. Here we found that heteroatom (both oxygen and nitrogen) content, as well as the presence of particular ring systems was able to distinguish both groups of compounds. Secondly, we established which molecular descriptors and classifiers are capable of distinguishing metabolites from non-metabolites, by assigning a 'metabolite-likeness' score. It was found that the combination of MDL Public Keys and Random Forest exhibited best overall classification performance with an AUC value of 99.13%, a specificity of 99.84% and a selectivity of 88.79%. This performance is slightly better than previous classifiers; and interestingly we found that drugs occupy two distinct areas of metabolite-likeness, the one being more 'synthetic' and the other being more 'metabolite-like'. Also, on a truly prospective dataset of 457 compounds, 95.84% correct classification was achieved. Overall, we are confident that we contributed to the tasks of classifying metabolites, as well as to understanding metabolite chemical space better. This knowledge can now be used in the development of new drugs that need to resemble metabolites, and in our work particularly for assessing the metabolite-likeness of candidate
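
    The classification step described (binary structural keys scored by a Random Forest and evaluated by AUC) follows a standard workflow; the sketch below substitutes random bit vectors for MDL Public Keys, so it reproduces only the workflow, not the chemistry.

      # Random Forest on binary "fingerprint" vectors, scored by ROC AUC.
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(2)
      n_bits = 166                                   # MDL key length
      X = rng.integers(0, 2, size=(2000, n_bits))    # placeholder fingerprints
      y = rng.integers(0, 2, size=2000)              # 1 = metabolite, 0 = non-metabolite

      clf = RandomForestClassifier(n_estimators=500, random_state=0, n_jobs=-1)
      aucs = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
      print("mean AUC: %.3f" % aucs.mean())          # ~0.5 here, since labels are random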

  9. Regression modeling of consumption or exposure variables classified by type.

    Science.gov (United States)

    Dorfman, A; Kimball, A W; Friedman, L A

    1985-12-01

    Consumption or exposure variables, as potential risk factors, are commonly measured and related to health effects. The measurements may be continuous or discrete, may be grouped into categories and may, in addition, be classified by type. Data analyses utilizing regression methods for the assessment of these risk factors present many problems of modeling and interpretation. Various models are proposed and evaluated, and recommendations are made. Use of the models is illustrated with Cox regression analyses of coronary heart disease mortality after 24 years of follow-up of subjects in the Framingham Study, with the focus being on alcohol consumption among these subjects.

  10. DFRFT: A Classified Review of Recent Methods with Its Application

    Directory of Open Access Journals (Sweden)

    Ashutosh Kumar Singh

    2013-01-01

    Full Text Available In the literature, there are various algorithms available for computing the discrete fractional Fourier transform (DFRFT). In this paper, all the existing methods are reviewed, classified into four categories, and subsequently compared to find out the best alternative from the viewpoint of minimal computational error, computational complexity, transform features, and additional features like security. Subsequently, the correlation theorem of FRFT has been utilized to significantly remove the Doppler shift caused by the motion of the receiver in the DSB-SC AM signal. Finally, the role of DFRFT has been investigated in the area of steganography.

  11. Classifying BCI signals from novice users with extreme learning machine

    Directory of Open Access Journals (Sweden)

    Rodríguez-Bermúdez Germán

    2017-07-01

    Full Text Available Brain computer interface (BCI) allows external devices to be controlled using only the electrical activity of the brain. In order to improve the system, several approaches have been proposed. However, it is usual to test algorithms with standard BCI signals from expert users or from repositories available on the Internet. In this work, an extreme learning machine (ELM) has been tested with signals from 5 novice users to compare with standard classification algorithms. Experimental results show that ELM is a suitable method to classify electroencephalogram signals from novice users.

  12. Classifying BCI signals from novice users with extreme learning machine

    Science.gov (United States)

    Rodríguez-Bermúdez, Germán; Bueno-Crespo, Andrés; José Martinez-Albaladejo, F.

    2017-07-01

    Brain computer interface (BCI) allows external devices to be controlled using only the electrical activity of the brain. In order to improve the system, several approaches have been proposed. However, it is usual to test algorithms with standard BCI signals from expert users or from repositories available on the Internet. In this work, an extreme learning machine (ELM) has been tested with signals from 5 novice users to compare with standard classification algorithms. Experimental results show that ELM is a suitable method to classify electroencephalogram signals from novice users.
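
    An extreme learning machine is straightforward to implement: hidden-layer weights are drawn at random and only the output weights are fitted, by regularized least squares. The following is a generic single-hidden-layer ELM on synthetic two-class data, not the authors' BCI pipeline.

      # Minimal extreme learning machine: random hidden layer + ridge solve.
      import numpy as np

      class ELM:
          def __init__(self, n_hidden=100, reg=1e-3, seed=0):
              self.n_hidden, self.reg = n_hidden, reg
              self.rng = np.random.default_rng(seed)

          def _hidden(self, X):
              return np.tanh(X @ self.W + self.b)

          def fit(self, X, y):
              self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
              self.b = self.rng.normal(size=self.n_hidden)
              H = self._hidden(X)
              T = np.eye(len(np.unique(y)))[y]        # one-hot targets
              # Output weights by regularized least squares.
              self.beta = np.linalg.solve(H.T @ H + self.reg * np.eye(self.n_hidden),
                                          H.T @ T)
              return self

          def predict(self, X):
              return np.argmax(self._hidden(X) @ self.beta, axis=1)

      # Toy usage with synthetic two-class "EEG features".
      rng = np.random.default_rng(3)
      X = rng.normal(size=(300, 10)); y = (X[:, 0] > 0).astype(int)
      print("training accuracy:", (ELM(50).fit(X, y).predict(X) == y).mean())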

  13. Anytime query-tuned kernel machine classifiers via Cholesky factorization

    Science.gov (United States)

    DeCoste, D.

    2002-01-01

    We recently demonstrated 2 to 64-fold query-time speedups of Support Vector Machine and Kernel Fisher classifiers via a new computational geometry method for anytime output bounds (DeCoste, 2002). This new paper refines our approach in two key ways. First, we introduce a simple linear algebra formulation based on Cholesky factorization, yielding simpler equations and lower computational overhead. Second, this new formulation suggests new methods for achieving additional speedups, including tuning on query samples. We demonstrate effectiveness on benchmark datasets.
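
    The role of the Cholesky factorization can be illustrated with a kernel ridge classifier: the regularized kernel matrix is factored once and the factor is reused for subsequent solves. This is a generic numerical sketch assuming an RBF kernel; it does not reproduce the paper's anytime output-bound scheme.

      # Kernel ridge "classifier" solved via a Cholesky factorization.
      import numpy as np
      from scipy.linalg import cho_factor, cho_solve

      def rbf(A, B, gamma=0.5):
          d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
          return np.exp(-gamma * d2)

      rng = np.random.default_rng(4)
      X = rng.normal(size=(200, 5))
      y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)   # +/-1 labels

      K = rbf(X, X)
      lam = 1e-2
      c, low = cho_factor(K + lam * np.eye(len(X)))    # one O(n^3) factorization
      alpha = cho_solve((c, low), y)                   # cheap reuse for solves

      X_query = rng.normal(size=(5, 5))
      scores = rbf(X_query, X) @ alpha                 # decision values
      print(np.sign(scores))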

  14. Interface Prostheses With Classifier-Feedback-Based User Training.

    Science.gov (United States)

    Fang, Yinfeng; Zhou, Dalin; Li, Kairu; Liu, Honghai

    It is evident that user training significantly affects performance of pattern-recognition-based myoelectric prosthetic device control. Despite plausible classification accuracy on offline datasets, online accuracy usually suffers from the changes in physiological conditions and electrode displacement. The user ability in generating consistent electromyographic (EMG) patterns can be enhanced via proper user training strategies in order to improve online performance. This study proposes a clustering-feedback strategy that provides real-time feedback to users by means of a visualized online EMG signal input as well as the centroids of the training samples, whose dimensionality is reduced to minimal number by dimension reduction. Clustering feedback provides a criterion that guides users to adjust motion gestures and muscle contraction forces intentionally. The experiment results have demonstrated that hand motion recognition accuracy increases steadily along the progress of the clustering-feedback-based user training, while conventional classifier-feedback methods, i.e., label feedback, hardly achieve any improvement. The result concludes that the use of proper classifier feedback can accelerate the process of user training, and implies prosperous future for the amputees with limited or no experience in pattern-recognition-based prosthetic device manipulation.
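
    The feedback display described (the live EMG feature vector shown against per-class centroids of the training samples after dimension reduction) can be approximated with PCA; the sketch below uses synthetic features in place of EMG descriptors and is illustrative only.

      # Project training features to 2-D and compute per-class centroids,
      # then place a new "online" sample in the same space for feedback.
      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(5)
      X_train = rng.normal(size=(400, 16))             # stand-in EMG feature vectors
      labels = rng.integers(0, 4, size=400)            # four hand motions (synthetic)

      pca = PCA(n_components=2).fit(X_train)
      Z = pca.transform(X_train)
      centroids = {c: Z[labels == c].mean(axis=0) for c in np.unique(labels)}

      z_online = pca.transform(rng.normal(size=(1, 16)))[0]
      for c, mu in centroids.items():
          print("class %d: distance %.2f" % (c, np.linalg.norm(z_online - mu)))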

  15. Support vector machine classifiers for large data sets.

    Energy Technology Data Exchange (ETDEWEB)

    Gertz, E. M.; Griffin, J. D.

    2006-01-31

    This report concerns the generation of support vector machine classifiers for solving the pattern recognition problem in machine learning. Several methods are proposed based on interior point methods for convex quadratic programming. Software implementations are developed by adapting the object-oriented package OOQP to the problem structure and by using the software package PETSc to perform time-intensive computations in a distributed setting. Linear systems arising from classification problems with moderately large numbers of features are solved by using two techniques--one a parallel direct solver, the other a Krylov-subspace method incorporating novel preconditioning strategies. Numerical results are provided, and computational experience is discussed.

  16. Role of dermatologists in leprosy elimination and post-elimination era: the Brazilian contribution.

    Science.gov (United States)

    Oliveira, Maria Leide Wand-Del-Rey; Penna, Gerson O; Telhari, S

    2007-03-01

    Dermatologists in Brazil have always been involved in care of leprosy patients, and have been alternating with public health physicians in the management of control policies. It is worth mentioning that Fernando Terra, founder of the Brazilian Society of Dermatology (BSD) in 1912, established the position of intern dermatologist at the Hospital dos Lázaros, in Rio de Janeiro, in 1913 (Souza-Araújo, 1952; Oliveira, 1991). In 1920, the dermatologist Eduardo Rabello formulated the first national public policy on the control of leprosy in the country, which was called 'Inspection of Prophylaxis of Leprosy and Venereal Diseases'. His son was an enthusiast of dermatological research and his main legacy was the polarity concept of leprosy (Rabelo, 1937). However, from 1930 to 1985, the public health physicians were in charge of the political guidelines that represented the period of establishing the vertical programmatic structure, with compulsory isolation of patients (1933-1962). Moreover, the federal states coordinated the control actions, based on the leprosy prophylaxis campaign. The dermatologists resumed the conduction of the control process in 1986, when multi-drug therapy (MDT) was implemented in the country, and in 1991, when decentralization of public healthcare services to the municipal level took place. In 2003 again, the dermatologists were no longer in control of the national policy. However, active dermatologists have acted in Brazilian references on diagnosis and treatment of Hansen's disease, at municipal, state and national levels. It is true that dermatologists have been getting away from leprosy control actions. And one could ask: who will replace this specialist? In the 'post-elimination' era, when the public primary healthcare technicians no longer consider leprosy of much significance, the knowledge of the expert in this disease and its differential diagnoses will be crucial.

  17. An assessment of national surveillance systems for malaria elimination in the Asia Pacific.

    Science.gov (United States)

    Mercado, Chris Erwin G; Ekapirat, Nattwut; Dondorp, Arjen M; Maude, Richard J

    2017-03-21

    Heads of Government from Asia and the Pacific have committed to a malaria-free region by 2030. In 2015, the total number of confirmed cases reported to the World Health Organization by 22 Asia Pacific countries was 2,461,025. However, this was likely a gross underestimate due in part to incidence data not being available from the wide variety of known sources. There is a recognized need for an accurate picture of malaria over time and space to support the goal of elimination. A survey was conducted to gain a deeper understanding of the collection of malaria incidence data for surveillance by National Malaria Control Programmes in 22 countries identified by the Asia Pacific Leaders Malaria Alliance. In 2015-2016, a short questionnaire on malaria surveillance was distributed to 22 country National Malaria Control Programmes (NMCP) in the Asia Pacific. It collected country-specific information about the extent of inclusion of the range of possible sources of malaria incidence data and the role of the private sector in malaria treatment. The findings were used to produce recommendations for the regional heads of government on improving malaria surveillance to inform regional efforts towards malaria elimination. A survey response was received from all 22 target countries. Most of the malaria incidence data collected by NMCPs originated from government health facilities, while many did not collect comprehensive data from mobile and migrant populations, the private sector or the military. All data from village health workers were included by 10/20 countries and some by 5/20. Other sources of data included by some countries were plantations, police and other security forces, sentinel surveillance sites, research or academic institutions, private laboratories and other government ministries. Malaria was treated in private health facilities in 19/21 countries, while anti-malarials were available in private pharmacies in 16/21 and private shops in 6/21. Most countries use

  18. Samarbejdsformer og Facilities Management

    DEFF Research Database (Denmark)

    Storgaard, Kresten

    Results from a survey of the advantages and disadvantages of different forms of collaboration within Facilities Management are presented.

  19. Machine Learning for Zwicky Transient Facility

    Science.gov (United States)

    Mahabal, Ashish; Zwicky Transient Facility, Catalina Real-Time Transient Survey

    2018-01-01

    The Zwicky Transient Facility (ZTF) will operate from 2018 to 2020 covering the accessible sky with its large 47 square degree camera. The transient detection rate is expected to be about a million per night. ZTF is thus a perfect LSST prototype. The big difference is that all of the ZTF transients can be followed up by 4- to 8-m class telescopes. Given the large numbers, using human scanners for separating the genuine transients from artifacts is out of question. For that first step as well as for classifying the transients with minimal follow-up requires machine learning. We describe the tools and plans to take on this task using follow-up facilities, and knowledge gained from archival datasets.

  20. Accuracy of Birth Certificate Data for Classifying Preterm Birth.

    Science.gov (United States)

    Stout, Molly J; Macones, George A; Tuuli, Methodius G

    2017-05-01

    Classifying preterm birth as spontaneous or indicated is critical both for clinical care and research, yet the accuracy of classification based on different data sources is unclear. We examined the accuracy of preterm birth classification as spontaneous or indicated based on birth certificate data. This is a retrospective cohort study of 123 birth certificates from preterm births in Missouri. Correct classification of spontaneous or indicated preterm birth subtype was based on multi-provider (RN, MFM Fellow, MFM attending) consensus after full medical record review. A categorisation algorithm based on clinical data available in the birth certificate was designed a priori and classification was performed by a single investigator according to the algorithm. Accuracy of birth certificate classification as spontaneous or indicated was compared to the consensus classification. Errors in misclassification were explored. Classification based on birth certificates was correct for 66% of preterm births. Most errors in classification by birth certificate occurred in classifying a birth as spontaneous when it was in fact indicated. The vast majority of errors occurred when preterm rupture of membranes (≥12 h) was checked on the birth certificate causing classification as spontaneous when there was a maternal or fetal indication for delivery. Birth certificate classification overestimated spontaneous preterm birth and underestimated indicated preterm birth compared to classification performed from medical record review. Revisions to birth certificate clinical data would allow more accurate population level surveillance of preterm birth subtypes. © 2017 John Wiley & Sons Ltd.
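
    The a priori categorisation algorithm itself is not given in the abstract; a hypothetical rule of the general kind described (classify as indicated when a maternal or fetal indication is recorded, otherwise as spontaneous when labor or prolonged rupture of membranes is recorded) might look like the sketch below, with all field names invented for illustration.

      # Hypothetical birth-certificate classification rule; field names are
      # invented for illustration and are not the actual certificate fields.
      def classify_preterm_birth(record):
          indicated_flags = ("preeclampsia", "placenta_previa",
                             "fetal_growth_restriction", "other_indication")
          if any(record.get(flag) for flag in indicated_flags):
              return "indicated"
          if record.get("spontaneous_labor") or record.get("prom_12h"):
              return "spontaneous"
          return "unclassified"

      print(classify_preterm_birth({"prom_12h": True}))                 # spontaneous
      print(classify_preterm_birth({"prom_12h": True,
                                     "preeclampsia": True}))            # indicated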

  1. Fuzzy classifier for fault diagnosis in analog electronic circuits.

    Science.gov (United States)

    Kumar, Ashwani; Singh, A P

    2013-11-01

    Many studies have presented different approaches for fault diagnosis with fault models having ± 50% variation in the component values in analog electronic circuits. There is still a need for approaches which provide fault diagnosis with variation in the component value below ± 50%. A new single and multiple fault diagnosis technique for soft faults in analog electronic circuits using a fuzzy classifier has been proposed in this paper. This technique uses the simulation before test (SBT) approach by analyzing the frequency response of the analog circuit under faulty and fault-free conditions. Three signature parameters of the frequency response of the analog circuit (peak gain, and the frequency and phase associated with the peak gain) are observed and extracted such that they give unique values for faulty and fault-free configurations of the circuit. Single and double fault models with component variations from ± 10% to ± 50% are considered. The fuzzy classifier, along with the classification of faults, gives the estimated component value under faulty and fault-free conditions. The proposed method is validated using simulated data and real-time data for a benchmark analog circuit. The comparative analysis is also presented for both validations. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.

  2. Ensemble classifier using GRG algorithm for land cover classification

    Science.gov (United States)

    Abe, Bolanle T.; Jordaan, J. A.; Marwala, Tshilidzi

    2013-12-01

    Image processing is of great value because it enables satellite images to be translated into useful information. The preprocessing of remotely sensed images before feature extraction is important to remove noise and improve the ability to interpret image data more accurately. All images should appear as if they were acquired from the same sensor at the end of image preprocessing. A major challenge associated with hyperspectral imagery in remote sensing analysis is mixed pixels, which are due to the high-dimensional nature of the data. This study makes a positive contribution to the problem of land cover classification by exploring the Generalized Reduced Gradient (GRG) algorithm on hyperspectral datasets, using the Washington DC Mall and the Indian Pines test site of Northwestern Indiana, USA, as study sites. The algorithm was used to estimate the fractional abundance in the datasets for land cover classification. Ensemble classifiers such as random forest, bagging and support vector machines were implemented in the Waikato Environment for Knowledge Analysis (WEKA) to carry out the classification procedures. Experimental results show that the random forest ensemble outperformed the other ensemble methods. The comparison of the classifiers is crucial for a decision maker weighing trade-offs between classification accuracy and technique complexity.

  3. Even more Chironomid species for classifying lake nutrient status

    Directory of Open Access Journals (Sweden)

    Les Ruse

    2015-07-01

    Full Text Available The European Union Water Framework Directive (WFD) classifies the ecological status of a waterbody by the determination of its natural reference state to provide a measure of perturbation by human impacts, based on the taxonomic composition and abundance of aquatic species. Ruse (2010; 2011) has provided methods of assessing anthropogenic perturbations to lake ecological status, in terms of nutrient enrichment and acidification, by analysing collections of floating pupal exuviae discarded by emerging adult Chironomidae. The previous nutrient assessment method was derived from chironomid and environmental data collected during 178 lake surveys of all WFD types found in Britain. Canonical Correspondence Analysis provided species optima in relation to phosphate and nitrogen concentrations. Species found in less than three surveys were excluded from analysis in case of spurious association with environmental values. Since Ruse (2010), an additional 72 lakes have been surveyed, adding 31 more species for use in nutrient status assessment. These additional scoring species are reported here. The practical application of the Chironomid Pupal Exuvial Technique (CPET) to classify WFD lake nutrient status is demonstrated using CPET survey data from lakes in Poland.

  4. The Marker State Space (MSS) method for classifying clinical samples.

    Directory of Open Access Journals (Sweden)

    Brian P Fallon

    Full Text Available The development of accurate clinical biomarkers has been challenging in part due to the diversity between patients and diseases. One approach to account for the diversity is to use multiple markers to classify patients, based on the concept that each individual marker contributes information from its respective subclass of patients. Here we present a new strategy for developing biomarker panels that accounts for completely distinct patient subclasses. Marker State Space (MSS) defines "marker states" based on all possible patterns of high and low values among a panel of markers. Each marker state is defined as either a case state or a control state, and a sample is classified as case or control based on the state it occupies. MSS was used to define multi-marker panels that were robust in cross validation and training-set/test-set analyses and that yielded similar classification accuracy to several other classification algorithms. A three-marker panel for discriminating pancreatic cancer patients from control subjects revealed subclasses of patients based on distinct marker states. MSS provides a straightforward approach for modeling highly divergent subclasses of patients, which may be adaptable for diverse applications.

  5. Speaker gender identification based on majority vote classifiers

    Science.gov (United States)

    Mezghani, Eya; Charfeddine, Maha; Nicolas, Henri; Ben Amar, Chokri

    2017-03-01

    Speaker gender identification is considered among the most important tools in several multimedia applications, namely automatic speech recognition, interactive voice response systems and audio browsing systems. Gender identification system performance is closely linked to the selected feature set and the employed classification model. Typical techniques are based on selecting the best-performing classification method or searching for the optimum tuning of one classifier's parameters through experimentation. In this paper, we consider a relevant and rich set of features involving pitch and MFCCs as well as other temporal and frequency-domain descriptors. Five classification models, including decision tree, discriminant analysis, naive Bayes, support vector machine and k-nearest neighbor, were evaluated. The three best-performing classifiers among the five contribute by majority voting on their scores. Experiments were performed on three different datasets spoken in three languages (English, German and Arabic) in order to validate the language independence of the proposed scheme. Results confirm that the presented system has reached a satisfying accuracy rate and promising classification performance thanks to the discriminating abilities and diversity of the used features combined with mid-level statistics.
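
    Majority voting among the best-performing classifiers can be assembled directly with scikit-learn; the sketch below hard-votes three of the five model types mentioned (decision tree, support vector machine and k-nearest neighbour) on placeholder acoustic features, with the pitch/MFCC extraction omitted.

      # Hard majority vote over three classifiers on placeholder features.
      import numpy as np
      from sklearn.ensemble import VotingClassifier
      from sklearn.tree import DecisionTreeClassifier
      from sklearn.svm import SVC
      from sklearn.neighbors import KNeighborsClassifier
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(6)
      X = rng.normal(size=(600, 14))            # pitch + MFCC-like descriptors
      y = rng.integers(0, 2, size=600)          # 0 = male, 1 = female (synthetic)

      vote = VotingClassifier(
          estimators=[("dt", DecisionTreeClassifier(max_depth=5)),
                      ("svm", SVC(kernel="rbf")),
                      ("knn", KNeighborsClassifier(n_neighbors=7))],
          voting="hard")
      print("cv accuracy:", cross_val_score(vote, X, y, cv=5).mean())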

  6. REPTREE CLASSIFIER FOR IDENTIFYING LINK SPAM IN WEB SEARCH ENGINES

    Directory of Open Access Journals (Sweden)

    S.K. Jayanthi

    2013-01-01

    Full Text Available Search engines are used for retrieving information from the web. Most of the time, importance is placed on the top 10 results, sometimes shrinking to the top 5, because of time constraints and reliance on the search engines. Users believe that the top 10 or 5 of the total results are more relevant. Here comes the problem of spamdexing, a method of deceiving search result quality. Falsified metrics, such as inserting enormous amounts of keywords or links in a website, may take that website to the top 10 or 5 positions. This paper proposes a classifier based on REPTree (a regression tree representative). As an initial step, link-based features such as neighbors, PageRank, truncated PageRank, TrustRank and assortativity-related attributes are inferred. Based on these features, a tree is constructed. The tree uses the feature inference to differentiate spam sites from legitimate sites. The WEBSPAM-UK-2007 dataset is taken as a base. It is preprocessed and converted into five datasets: FEATA, FEATB, FEATC, FEATD and FEATE. Only link-based features are taken for experiments; this paper focuses on link spam alone. Finally, a representative tree is created which more precisely classifies the web spam entries. Results are given. Regression tree classification seems to perform well, as shown through experiments.

  7. Phenotype Recognition with Combined Features and Random Subspace Classifier Ensemble

    Directory of Open Access Journals (Sweden)

    Pham Tuan D

    2011-04-01

    Full Text Available Abstract Background Automated, image based high-content screening is a fundamental tool for discovery in biological science. Modern robotic fluorescence microscopes are able to capture thousands of images from massively parallel experiments such as RNA interference (RNAi) or small-molecule screens. As such, efficient computational methods are required for automatic cellular phenotype identification capable of dealing with large image data sets. In this paper we investigated an efficient method for the extraction of quantitative features from images by combining second order statistics, or Haralick features, with curvelet transform. A random subspace based classifier ensemble with multiple layer perceptron (MLP) as the base classifier was then exploited for classification. Haralick features estimate image properties related to second-order statistics based on the grey level co-occurrence matrix (GLCM), which has been extensively used for various image processing applications. The curvelet transform has a more sparse representation of the image than wavelet, thus offering a description with higher time frequency resolution and high degree of directionality and anisotropy, which is particularly appropriate for many images rich with edges and curves. A combined feature description from Haralick feature and curvelet transform can further increase the accuracy of classification by taking their complementary information. We then investigate the applicability of the random subspace (RS) ensemble method for phenotype classification based on microscopy images. A base classifier is trained with a RS sampled subset of the original feature set and the ensemble assigns a class label by majority voting. Results Experimental results on the phenotype recognition from three benchmarking image sets including HeLa, CHO and RNAi show the effectiveness of the proposed approach. The combined feature is better than any individual one in the classification accuracy. The
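
    A random subspace ensemble with MLP base classifiers can be approximated with scikit-learn's bagging machinery by sampling feature subsets instead of bootstrap samples; the configuration below uses placeholder feature vectors standing in for the Haralick and curvelet descriptors.

      # Random subspace ensemble: each MLP sees a random 50% of the features.
      import numpy as np
      from sklearn.ensemble import BaggingClassifier
      from sklearn.neural_network import MLPClassifier
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(7)
      X = rng.normal(size=(500, 60))      # placeholder Haralick + curvelet features
      y = rng.integers(0, 3, size=500)    # three phenotype classes (synthetic)

      rs_ensemble = BaggingClassifier(
          MLPClassifier(hidden_layer_sizes=(32,), max_iter=500),
          n_estimators=15,
          max_features=0.5,               # random feature subspace per learner
          bootstrap=False,                # keep all samples, vary only features
          bootstrap_features=False,
          random_state=0)
      print("cv accuracy:", cross_val_score(rs_ensemble, X, y, cv=3).mean())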

  8. The Onchocerciasis Elimination Program for the Americas (OEPA).

    Science.gov (United States)

    Sauerbrey, M

    2008-09-01

    Human onchocerciasis (river blindness) occurs in 13 foci distributed among six countries in Latin America (Brazil, Colombia, Ecuador, Guatemala, Mexico and Venezuela), where about 500,000 people are considered at risk. An effort to eliminate the disease from the region was launched in response to a specific resolution adopted by the PanAmerican Health Organization (PAHO) in 1991: to eliminate onchocerciasis from the region, as a public-health problem, by 2007. The effort took advantage of the donation of the drug Mectizan (ivermectin) by Merck & Co., Inc. In 1992, the Onchocerciasis Elimination Program for the Americas (OEPA) was launched, with its headquarters in Guatemala, to act as a technical and co-ordinating body of a multinational, multi-agency coalition that includes the endemic countries, PAHO, The Carter Center, Lions Clubs, the United States Centers for Disease Control and Prevention, The Bill and Melinda Gates Foundation, Merck & Co., Inc., and other partners. This public-private partnership facilitated the establishment of programmes for the semi-annual mass administration of Mectizan in the six countries with onchocerciasis. The aims were to (1) provide sustained treatments, with coverage reaching at least 85% of those eligible to receive the drug (in the 1845 endemic communities that are distributed within the 13 regional foci); (2) eliminate new morbidity caused by Onchocerca volvulus infection by 2007; and (3) eliminate transmission of the parasite wherever feasible. Significant progress has already been made in all six countries, each of which has active programmes with treatment coverages exceeding the target of 85%. The progress is being documented in accordance with certification guidelines for onchocerciasis elimination established by the World Health Organization. No new cases of onchocercal blindness are being reported in the region, and ocular disease attributable to O. volvulus has been eliminated from nine of the 13 foci. Treatment is no

  9. Effect of eliminating chronic diseases among elderly individuals

    Directory of Open Access Journals (Sweden)

    Alessandro Goncalves Campolina

    2013-06-01

    Full Text Available OBJECTIVE: To determine whether the elimination of certain chronic diseases is capable of leading to the compression of morbidity among elderly individuals. METHODS: A population-based, cross-sectional study was carried out with official data for the city of Sao Paulo, Southeastern Brazil, in 2000 and data from the SABE (Health, Wellbeing and Ageing) study. Sullivan's method was used to calculate disability-free life expectancy. Cause-deleted life tables were used to calculate the probabilities of death and disabilities with the elimination of health conditions. RESULTS: The largest gains in disability-free life expectancy, with the elimination of chronic illness, occurred in the female gender. Among individuals of a more advanced age, gains in disability-free life expectancy occurred as a result of a relative compression of morbidity. Among men aged 75 years, all conditions studied, except heart disease and systemic arterial pressure, led to an absolute expansion of morbidity and, at the same time, to a relative compression of morbidity upon being eliminated. CONCLUSIONS: The elimination of chronic diseases in the elderly could lead to the compression of morbidity in elderly men and women.
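
    Sullivan's method combines person-years lived from a period life table with age-specific disability prevalence; a minimal worked sketch with made-up numbers (not the Sao Paulo data) is:

      # Sullivan's method: disability-free life expectancy (DFLE) at age 60.
      # L = person-years lived in each 5-year age interval (from a life table),
      # prev = proportion disabled in that interval, l60 = survivors at age 60.
      # All numbers below are illustrative, not real data.
      L = [470000, 430000, 370000, 290000, 310000]   # made-up person-years
      prev = [0.10, 0.15, 0.22, 0.33, 0.50]          # made-up disability prevalence
      l60 = 95000                                    # made-up survivors at age 60

      life_expectancy = sum(L) / l60
      dfle = sum(Lx * (1 - p) for Lx, p in zip(L, prev)) / l60
      print("LE(60)   = %.1f years" % life_expectancy)
      print("DFLE(60) = %.1f years" % dfle)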

  10. Primary chromatic aberration elimination via optimization work with genetic algorithm

    Science.gov (United States)

    Wu, Bo-Wen; Liu, Tung-Kuan; Fang, Yi-Chin; Chou, Jyh-Horng; Tsai, Hsien-Lin; Chang, En-Hao

    2008-09-01

    Chromatic Aberration plays a part in modern optical systems, especially in digitalized and smart optical systems. Much effort has been devoted to eliminating specific chromatic aberration in order to match the demand for advanced digitalized optical products. Basically, the elimination of axial chromatic and lateral color aberration of an optical lens and system depends on the selection of optical glass. According to reports from glass companies all over the world, the number of various newly developed optical glasses in the market exceeds three hundred. However, due to the complexity of a practical optical system, optical designers have so far had difficulty in finding the right solution to eliminate small axial and lateral chromatic aberration except by the Damped Least Squares (DLS) method, which is limited in so far as the DLS method has not yet managed to find a better optical system configuration. In the present research, genetic algorithms are used to replace traditional DLS so as to eliminate axial and lateral chromatic aberration, by combining the theories of geometric optics in Tessar type lenses and a technique involving Binary/Real Encoding, Multiple Dynamic Crossover and Random Gene Mutation to find a much better configuration for optical glasses. By implementing the algorithms outlined in this paper, satisfactory results can be achieved in eliminating axial and lateral color aberration.
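
    The search itself is a standard genetic-algorithm loop over glass choices; the toy sketch below evolves pairs of indices into a hypothetical glass catalogue against a placeholder aberration score, since the real merit function would come from ray tracing, which is not reproduced here.

      # Toy genetic algorithm over glass pairs; the fitness function is a
      # placeholder, not a real chromatic-aberration merit function.
      import random
      random.seed(0)

      N_GLASSES = 300                  # size of a hypothetical glass catalogue

      def aberration(pair):            # placeholder score to be minimized
          a, b = pair
          return abs((a % 37) - (b % 41)) + 0.01 * (a + b)

      def crossover(p1, p2):
          return (p1[0], p2[1]) if random.random() < 0.5 else (p2[0], p1[1])

      def mutate(pair, rate=0.2):
          return tuple(random.randrange(N_GLASSES) if random.random() < rate else g
                       for g in pair)

      population = [(random.randrange(N_GLASSES), random.randrange(N_GLASSES))
                    for _ in range(40)]
      for _ in range(100):
          population.sort(key=aberration)
          parents = population[:10]                        # elitist selection
          children = [mutate(crossover(random.choice(parents),
                                       random.choice(parents)))
                      for _ in range(30)]
          population = parents + children
      best = min(population, key=aberration)
      print("best glass pair:", best, "score:", round(aberration(best), 3))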

  11. Rabies elimination research: juxtaposing optimism, pragmatism and realism.

    Science.gov (United States)

    Cleaveland, Sarah; Hampson, Katie

    2017-12-20

    More than 100 years of research has now been conducted into the prevention, control and elimination of rabies with safe and highly efficacious vaccines developed for use in human and animal populations. Domestic dogs are a major reservoir for rabies, and although considerable advances have been made towards the elimination and control of canine rabies in many parts of the world, the disease continues to kill tens of thousands of people every year in Africa and Asia. Policy efforts are now being directed towards a global target of zero human deaths from dog-mediated rabies by 2030 and the global elimination of canine rabies. Here we demonstrate how research provides a cause for optimism as to the feasibility of these goals through strategies based around mass dog vaccination. We summarize some of the pragmatic insights generated from rabies epidemiology and dog ecology research that can improve the design of dog vaccination strategies in low- and middle-income countries and which should encourage implementation without further delay. We also highlight the need for realism in reaching the feasible, although technically more difficult and longer-term goal of global elimination of canine rabies. Finally, we discuss how research on rabies has broader relevance to the control and elimination of a suite of diseases of current concern to human and animal health, providing an exemplar of the value of a 'One Health' approach. © 2017 The Authors.

  12. DUPIC facility engineering

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J. S.; Choi, J. W.; Go, W. I.; Kim, H. D.; Song, K. C.; Jeong, I. H.; Park, H. S.; Im, C. S.; Lee, H. M.; Moon, K. H.; Hong, K. P.; Lee, K. S.; Suh, K. S.; Kim, E. K.; Min, D. K.; Lee, J. C.; Chun, Y. B.; Paik, S. Y.; Lee, E. P.; Yoo, G. S.; Kim, Y. S.; Park, J. C.

    1997-09-01

    In the early stage of the project, a comprehensive survey was conducted to identify the feasibility of using available facilities and of the interface between those facilities. It was found out that the shielded cell M6 of IMEF could be used for the main process experiments of DUPIC fuel fabrication in regard to space adequacy, material flow, equipment layout, etc. Based on such examination, a suitable adapter system for material transfer around the M6 cell was engineered. Regarding the PIEF facility, where spent PWR fuel assemblies are stored in an annex pool, disassembly devices in the pool are retrofitted and a spent fuel rod cutting and shipping system to the IMEF is designed and built. For acquisition of casks for radioactive material transport between the facilities, some adaptive refurbishment was applied to the available cask (Padirac) based on extensive analysis of safety requirements. A mockup test facility was newly acquired for remote testing of DUPIC fuel fabrication process equipment prior to installation in the M6 cell of the IMEF facility. (author). 157 refs., 57 tabs., 65 figs.

  13. Exploratory Study to Identify Radiomics Classifiers for Lung Cancer Histology.

    Science.gov (United States)

    Wu, Weimiao; Parmar, Chintan; Grossmann, Patrick; Quackenbush, John; Lambin, Philippe; Bussink, Johan; Mak, Raymond; Aerts, Hugo J W L

    2016-01-01

    Radiomics can quantify tumor phenotypic characteristics non-invasively by applying feature algorithms to medical imaging data. In this study of lung cancer patients, we investigated the association between radiomic features and the tumor histologic subtypes (adenocarcinoma and squamous cell carcinoma). Furthermore, in order to predict histologic subtypes, we employed machine-learning methods and independently evaluated their prediction performance. Two independent radiomic cohorts with a combined size of 350 patients were included in our analysis. A total of 440 radiomic features were extracted from the segmented tumor volumes of pretreatment CT images. These radiomic features quantify tumor phenotypic characteristics on medical images using tumor shape and size, intensity statistics, and texture. Univariate analysis was performed to assess each feature's association with the histological subtypes. In our multivariate analysis, we investigated 24 feature selection methods and 3 classification methods for histology prediction. Multivariate models were trained on the training cohort and their performance was evaluated on the independent validation cohort using the area under ROC curve (AUC). Histology was determined from surgical specimens. In our univariate analysis, we observed that fifty-three radiomic features were significantly associated with tumor histology. In multivariate analysis, feature selection methods ReliefF and its variants showed higher prediction accuracy as compared to other methods. We found that the Naive Bayes classifier outperforms other classifiers and achieved the highest AUC (0.72; p-value = 2.3 × 10(-7)) with five features: Stats_min, Wavelet_HLL_rlgl_lowGrayLevelRunEmphasis, Wavelet_HHL_stats_median, Wavelet_HLL_stats_skewness, and Wavelet_HLH_glcm_clusShade. Histological subtypes can influence the choice of a treatment/therapy for lung cancer patients. We observed that radiomic features show significant association with the lung

  14. A deep learning method for classifying mammographic breast density categories.

    Science.gov (United States)

    Mohamed, Aly A; Berg, Wendie A; Peng, Hong; Luo, Yahong; Jankowitz, Rachel C; Wu, Shandong

    2018-01-01

    Mammographic breast density is an established risk marker for breast cancer and is visually assessed by radiologists in routine mammogram image reading, using four qualitative Breast Imaging and Reporting Data System (BI-RADS) breast density categories. It is particularly difficult for radiologists to consistently distinguish the two most common and most variably assigned BI-RADS categories, i.e., "scattered density" and "heterogeneously dense". The aim of this work was to investigate a deep learning-based breast density classifier to consistently distinguish these two categories, aiming at providing a potential computerized tool to assist radiologists in assigning a BI-RADS category in current clinical workflow. In this study, we constructed a convolutional neural network (CNN)-based model coupled with a large (i.e., 22,000 images) digital mammogram imaging dataset to evaluate the classification performance between the two aforementioned breast density categories. All images were collected from a cohort of 1,427 women who underwent standard digital mammography screening from 2005 to 2016 at our institution. The truths of the density categories were based on standard clinical assessment made by board-certified breast imaging radiologists. Effects of direct training from scratch solely using digital mammogram images and transfer learning of a pretrained model on a large nonmedical imaging dataset were evaluated for the specific task of breast density classification. In order to measure the classification performance, the CNN classifier was also tested on a refined version of the mammogram image dataset by removing some potentially inaccurately labeled images. Receiver operating characteristic (ROC) curves and the area under the curve (AUC) were used to measure the accuracy of the classifier. The AUC was 0.9421 when the CNN-model was trained from scratch on our own mammogram images, and the accuracy increased gradually along with an increased size of training samples

  15. Biotechnology Facility: An ISS Microgravity Research Facility

    Science.gov (United States)

    Gonda, Steve R.; Tsao, Yow-Min

    2000-01-01

    The International Space Station (ISS) will support several facilities dedicated to scientific research. One such facility, the Biotechnology Facility (BTF), is sponsored by the Microgravity Sciences and Applications Division (MSAD) and developed at NASA's Johnson Space Center. The BTF is scheduled for delivery to the ISS via Space Shuttle in April 2005. The purpose of the BTF is to provide: (1) the support structure and integration capabilities for the individual modules in which biotechnology experiments will be performed, (2) the capability for human-tended, repetitive, long-duration biotechnology experiments, and (3) opportunities to perform repetitive experiments in a short period by allowing continuous access to microgravity. The MSAD has identified cell culture and tissue engineering, protein crystal growth, and fundamentals of biotechnology as areas that contain promising opportunities for significant advancements through low-gravity experiments. The focus of this coordinated ground- and space-based research program is the use of the low-gravity environment of space to conduct fundamental investigations leading to major advances in the understanding of basic and applied biotechnology. Results from planned investigations can be used in applications ranging from rational drug design and testing to cancer diagnosis and treatment, and tissue engineering leading to replacement tissues.

  16. Eliminating the Neglected Tropical Diseases: Translational Science and New Technologies.

    Directory of Open Access Journals (Sweden)

    Peter J Hotez

    2016-03-01

    Full Text Available Today, the World Health Organization recognizes 17 major parasitic and related infections as the neglected tropical diseases (NTDs). Despite recent gains in the understanding of the nature and prevalence of NTDs, as well as successes in recent scaled-up preventive chemotherapy strategies and other health interventions, the NTDs continue to rank among the world's greatest global health problems. For virtually all of the NTDs (including those slated for elimination under the auspices of a 2012 London Declaration for NTDs and a 2013 World Health Assembly resolution [WHA 66.12]), additional control mechanisms and tools are needed, including new NTD drugs, vaccines, diagnostics, and vector control agents and strategies. Elimination will not be possible without these new tools. Here we summarize some of the key challenges in translational science to develop and introduce these new technologies in order to ensure success in global NTD elimination efforts.

  17. Discrimination of Mine Seismic Events and Blasts Using the Fisher Classifier, Naive Bayesian Classifier and Logistic Regression

    Science.gov (United States)

    Dong, Longjun; Wesseloo, Johan; Potvin, Yves; Li, Xibing

    2016-01-01

    Seismic events and blasts generate seismic waveforms that have different characteristics. The challenge to confidently differentiate these two signatures is complex and requires the integration of physical and statistical techniques. In this paper, the different characteristics of blasts and seismic events were investigated by comparing probability density distributions of different parameters. Five typical parameters of blasts and events and the probability density functions of blast time, as well as probability density functions of origin time difference for neighbouring blasts were extracted as discriminant indicators. The Fisher classifier, naive Bayesian classifier and logistic regression were used to establish discriminators. Databases from three Australian and Canadian mines were established for training, calibrating and testing the discriminant models. The classification performances and discriminant precision of the three statistical techniques were discussed and compared. The proposed discriminators have explicit and simple functions which can be easily used by workers in mines or researchers. Back-test, applied results, cross-validated results and analysis of receiver operating characteristic curves in different mines have shown that the discriminator for one of the mines has a reasonably good discriminating performance.
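
    The comparison of the three discriminators named above can be reproduced in miniature with scikit-learn; in the sketch below, linear discriminant analysis stands in for the Fisher classifier, and the two-class feature data are synthetic rather than mine waveform parameters.

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.naive_bayes import GaussianNB
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(1)
      X = np.vstack([rng.normal(0.0, 1.0, (300, 5)), rng.normal(1.2, 1.0, (300, 5))])
      y = np.r_[np.zeros(300), np.ones(300)]  # 0 = seismic event, 1 = blast

      for name, clf in [("Fisher (LDA)", LinearDiscriminantAnalysis()),
                        ("naive Bayes", GaussianNB()),
                        ("logistic regression", LogisticRegression(max_iter=1000))]:
          auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean()
          print(f"{name}: mean cross-validated AUC = {auc:.2f}")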

  18. Determinants of Human African Trypanosomiasis Elimination via Paratransgenesis.

    Directory of Open Access Journals (Sweden)

    Jennifer A Gilbert

    2016-03-01

    Full Text Available Human African trypanosomiasis (HAT), transmitted by tsetse flies, has historically infected hundreds of thousands of individuals annually in sub-Saharan Africa. Over the last decade, concerted control efforts have reduced reported cases to below 10,000 annually, bringing complete elimination within reach. A potential technology to eliminate HAT involves rendering the flies resistant to trypanosome infection. This approach can be achieved through the introduction of transgenic Sodalis symbiotic bacteria that have been modified to produce a trypanocide, and propagated via Wolbachia symbionts, which confer a reproductive advantage to the paratransgenic tsetse. However, the population dynamics of these symbionts within tsetse flies have not yet been evaluated. Specifically, the key factors that determine the effectiveness of paratransgenesis have yet to be quantified. To identify the impact of these determinants on T.b. gambiense and T.b. rhodesiense transmission, we developed a mathematical model of trypanosome transmission that incorporates tsetse and symbiont population dynamics. We found that fecundity and mortality penalties associated with Wolbachia or recombinant Sodalis colonization, probabilities of vertical transmission, and tsetse migration rates are fundamental to the feasibility of HAT elimination. For example, we determined that HAT elimination could be sustained over 25 years when Wolbachia colonization minimally impacted fecundity or mortality, and when the probability of recombinant Sodalis vertical transmission exceeded 99.9%. We also found that for a narrow range of recombinant Sodalis vertical transmission probabilities (99.9-90.6% for T.b. gambiense and 99.9-85.8% for T.b. rhodesiense), cumulative HAT incidence was reduced between 30% and 1% for T.b. gambiense and between 21% and 3% for T.b. rhodesiense, although elimination was not predicted. Our findings indicate that the fitness and mortality penalties associated with paratransgenic symbiont colonization, together with vertical transmission probabilities, are key determinants of the feasibility of HAT elimination.

  19. Wind Energy Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Office of Energy Efficiency and Renewable Energy

    2017-02-01

    This book takes readers inside the places where daily discoveries shape the next generation of wind power systems. Energy Department laboratory facilities span the United States and offer wind research capabilities to meet industry needs. The facilities described in this book make it possible for industry players to increase reliability, improve efficiency, and reduce the cost of wind energy -- one discovery at a time. Whether you require blade testing or resource characterization, grid integration or high-performance computing, Department of Energy laboratory facilities offer a variety of capabilities to meet your wind research needs.

  20. Sustainable Facilities Management

    DEFF Research Database (Denmark)

    Nielsen, Susanne Balslev; Elle, Morten; Hoffmann, Birgitte

    2004-01-01

    The Danish public housing sector has more than 20 years of experience with sustainable facilities management based on user involvement. The paper outlines this development in a historical perspective and gives an analysis of different approaches to sustainable facilities management. The focus...... is on the housing departments and strategies for the management of the use of resources. The research methods used are case studies based on interviews in addition to literature studies. The paper explores lessons to be learned about sustainable facilities management in general, and points to a need for new...

  1. HVAC optimization as facility requirements change with corporate restructuring

    Energy Technology Data Exchange (ETDEWEB)

    Rodak, R.R.; Sankey, M.S.

    1997-06-01

    The hyper-competitive, dynamic 1990s forced many corporations to "Right-Size," relocating resources and equipment -- even consolidating. These changes led to utility reductions when HVAC optimization was thoroughly addressed and energy conservation opportunities were identified and properly designed. This is particularly true when the facility's heating and cooling systems are matched to the load changes attributable to the reduction of staff and computers. Computers have been downsized while processing power per unit of energy input has increased; thus, the need for large mainframe computer centers, with their associated high-intensity energy usage, has been reduced or eliminated. Cooling loads, therefore, also have been reduced.

  2. An Automated Acoustic System to Monitor and Classify Birds

    Directory of Open Access Journals (Sweden)

    Ho KC

    2006-01-01

    Full Text Available This paper presents a novel bird monitoring and recognition system for noisy environments. The project objective is to avoid bird strikes to aircraft. First, a cost-effective microphone dish concept (a microphone array with many concentric rings) is presented that can provide directional and accurate acquisition of bird sounds and can simultaneously pick up bird sounds from different directions. Second, direction-of-arrival (DOA) and beamforming algorithms have been developed for the circular array. Third, an efficient recognition algorithm is proposed which uses Gaussian mixture models (GMMs). The overall system is suitable for monitoring and recognition of a large number of birds. Fourth, a hardware prototype has been built, and initial experiments demonstrated that the array can acquire and classify birds accurately.
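
    The GMM-based recognition step mentioned above can be sketched as follows: one Gaussian mixture model is fitted per species over acoustic feature frames, and a recording is assigned to the species whose model gives the highest log-likelihood. The features below are synthetic placeholders for real acoustic features such as MFCCs.

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(2)
      train = {"species_a": rng.normal(0.0, 1.0, (500, 13)),
               "species_b": rng.normal(2.0, 1.0, (500, 13))}

      # One GMM per species, fitted on that species' training frames.
      models = {name: GaussianMixture(n_components=4, random_state=0).fit(frames)
                for name, frames in train.items()}

      def classify(frames):
          # Pick the species whose GMM assigns the highest average log-likelihood.
          return max(models, key=lambda name: models[name].score(frames))

      print(classify(rng.normal(2.0, 1.0, (50, 13))))  # expected: species_b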

  3. Classifying prion and prion-like phenomena

    Science.gov (United States)

    Harbi, Djamel; Harrison, Paul M

    2014-01-01

    The universe of prion and prion-like phenomena has expanded significantly in the past several years. Here, we overview the challenges in classifying this data informatically, given that terms such as “prion-like”, “prion-related” or “prion-forming” do not have a stable meaning in the scientific literature. We examine the spectrum of proteins that have been described in the literature as forming prions, and discuss how “prion” can have a range of meaning, with a strict definition being for demonstration of infection with in vitro-derived recombinant prions. We suggest that although prion/prion-like phenomena can largely be apportioned into a small number of broad groups dependent on the type of transmissibility evidence for them, as new phenomena are discovered in the coming years, a detailed ontological approach might be necessary that allows for subtle definition of different “flavors” of prion / prion-like phenomena. PMID:24549098

  4. Sex Bias in Classifying Borderline and Narcissistic Personality Disorder.

    Science.gov (United States)

    Braamhorst, Wouter; Lobbestael, Jill; Emons, Wilco H M; Arntz, Arnoud; Witteman, Cilia L M; Bekker, Marrie H J

    2015-10-01

    This study investigated sex bias in the classification of borderline and narcissistic personality disorders. A sample of psychologists in training for a post-master degree (N = 180) read brief case histories (male or female version) and made DSM classifications. To differentiate sex bias due to sex stereotyping from sex bias due to base rate variation, we used different case histories, respectively: (1) non-ambiguous case histories with enough criteria of either borderline or narcissistic personality disorder to meet the threshold for classification, and (2) an ambiguous case with subthreshold features of both borderline and narcissistic personality disorder. Results showed significant differences due to the sex of the patient in the ambiguous condition. Thus, when the diagnosis is not straightforward, as in the case of mixed subthreshold features, sex bias is present and is influenced by base-rate variation. These findings emphasize the need for caution in classifying personality disorders, especially borderline or narcissistic traits.

  5. Higher School Marketing Strategy Formation: Classifying the Factors

    Directory of Open Access Journals (Sweden)

    N. K. Shemetova

    2012-01-01

    Full Text Available The paper deals with the main trends in higher school marketing strategy formation. The author specifies the educational changes in the modern information society that determine the strategy options. For each professional training level the author denotes the set of strategic factors affecting the educational service consumers and, therefore, the effectiveness of higher school marketing. The given factors are classified from the standpoints of the providers and consumers of educational services (enrollees, students, graduates and postgraduates). The research methods include statistical analysis and general methods of scientific analysis, synthesis, induction, deduction, comparison, and classification. The author is convinced that university management should develop the necessary prerequisites for raising graduates' competitiveness in the labor market, and should stimulate active marketing policies of the related subdivisions and departments. In the author's opinion, the above classification of marketing strategy factors can be used as a system of values for educational service providers.

  6. Improved method for predicting protein fold patterns with ensemble classifiers.

    Science.gov (United States)

    Chen, W; Liu, X; Huang, Y; Jiang, Y; Zou, Q; Lin, C

    2012-01-27

    Protein folding is recognized as a critical problem in the field of biophysics in the 21st century. Predicting protein-folding patterns is challenging due to the complex structure of proteins. In an attempt to solve this problem, we employed ensemble classifiers to improve prediction accuracy. In our experiments, 188-dimensional features were extracted based on the composition and physical-chemical property of proteins and 20-dimensional features were selected using a coupled position-specific scoring matrix. Compared with traditional prediction methods, these methods were superior in terms of prediction accuracy. The 188-dimensional feature-based method achieved 71.2% accuracy in five cross-validations. The accuracy rose to 77% when we used a 20-dimensional feature vector. These methods were used on recent data, with 54.2% accuracy. Source codes and dataset, together with web server and software tools for prediction, are available at: http://datamining.xmu.edu.cn/main/~cwc/ProteinPredict.html.
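
    A toy version of the ensemble idea described above is sketched below: several base classifiers are combined by soft voting and scored with five-fold cross-validation. The 188-dimensional features and fold labels are random placeholders, not the protein data used in the study.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier, VotingClassifier
      from sklearn.neighbors import KNeighborsClassifier
      from sklearn.svm import SVC
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(3)
      X, y = rng.normal(size=(400, 188)), rng.integers(0, 4, 400)  # 4 toy fold classes

      ensemble = VotingClassifier([
          ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
          ("svm", SVC(probability=True, random_state=0)),
          ("knn", KNeighborsClassifier()),
      ], voting="soft")

      print(f"five-fold accuracy: {cross_val_score(ensemble, X, y, cv=5).mean():.2f}")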

  7. An automated approach to the design of decision tree classifiers

    Science.gov (United States)

    Argentiero, P.; Chin, R.; Beaudet, P.

    1982-01-01

    An automated technique is presented for designing effective decision tree classifiers predicated only on a priori class statistics. The procedure relies on linear feature extractions and Bayes table look-up decision rules. Associated error matrices are computed and utilized to provide an optimal design of the decision tree at each so-called 'node'. A by-product of this procedure is a simple algorithm for computing the global probability of correct classification assuming the statistical independence of the decision rules. Attention is given to a more precise definition of decision tree classification, the mathematical details on the technique for automated decision tree design, and an example of a simple application of the procedure using class statistics acquired from an actual Landsat scene.
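
    The by-product noted above, the global probability of correct classification under statistically independent decision rules, amounts to multiplying the per-node correct-decision probabilities along each path through the tree. A hypothetical two-level tree illustrates the arithmetic:

      # Hypothetical per-node probabilities of a correct decision (not from the paper).
      node_accuracy = {"root": 0.95, "left": 0.90, "right": 0.92}

      # Each class is reached by a path of nodes through the tree.
      paths = {"class_1": ["root", "left"],
               "class_2": ["root", "right"]}

      for cls, path in paths.items():
          p_correct = 1.0
          for node in path:
              p_correct *= node_accuracy[node]  # independence assumption
          print(cls, round(p_correct, 3))       # class_1: 0.855, class_2: 0.874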

  8. Some factors influencing interobserver variation in classifying simple pneumoconiosis.

    Science.gov (United States)

    Musch, D C; Higgins, I T; Landis, J R

    1985-01-01

    Three experienced physician readers assessed the chest radiographs of 743 men from a coal mining community in West Virginia for the signs of simple pneumoconiosis, using the ILO U/C 1971 Classification of Radiographs of the Pneumoconioses. The number of films categorised by each reader as showing evidence of simple pneumoconiosis varied from 63 (8.5%) to 114 (15.3%) of the 743 films classified. The effect of film quality and obesity on interobserver agreement was assessed by use of kappa-type analytic procedures for measuring agreement on categorical data. Poor film quality and obesity both affected agreement adversely. Poor quality films were disproportionately frequent in obese individuals, as defined by the Quetelet index. On control of film quality by stratification, the effect of obesity on interobserver profusion agreement was no longer evident. PMID:3986146
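
    The kappa-type agreement analysis referred to above can be illustrated with Cohen's kappa for a pair of readers; the film categorisations below are invented solely to show the computation.

      from sklearn.metrics import cohen_kappa_score

      reader_1 = [0, 0, 1, 1, 0, 1, 0, 0, 1, 0]  # 1 = simple pneumoconiosis present
      reader_2 = [0, 0, 1, 0, 0, 1, 0, 1, 1, 0]

      # Kappa corrects the observed agreement for agreement expected by chance.
      print(f"kappa = {cohen_kappa_score(reader_1, reader_2):.2f}")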

  9. Some factors influencing interobserver variation in classifying simple pneumoconiosis

    Energy Technology Data Exchange (ETDEWEB)

    Musch, D.C.; Higgins, I.T.; Landis, J.R.

    1985-05-01

    Three experienced physician readers assessed the chest radiographs of 743 men from a coal mining community in West Virginia for the signs of simple pneumoconiosis, using the ILO U/C 1971 Classification of Radiographs of the Pneumoconioses. The number of films categorized by each reader as showing evidence of simple pneumoconiosis varied from 63 (8.5%) to 114 (15.3%) of the 743 films classified. The effect of film quality and obesity on interobserver agreement was assessed by use of kappa-type analytic procedures for measuring agreement on categorical data. Poor film quality and obesity both affected agreement adversely. Poor quality films were disproportionately frequent in obese individuals, as defined by the Quetelet index. On control of film quality by stratification, the effect of obesity on interobserver profusion agreement was no longer evident.

  10. A robust dataset-agnostic heart disease classifier from Phonocardiogram.

    Science.gov (United States)

    Banerjee, Rohan; Dutta Choudhury, Anirban; Deshpande, Parijat; Bhattacharya, Sakyajit; Pal, Arpan; Mandana, K M

    2017-07-01

    Automatic classification of normal and abnormal heart sounds is a popular area of research. However, building a robust algorithm unaffected by signal quality and patient demography is a challenge. In this paper we have analysed a wide range of phonocardiogram (PCG) features in the time and frequency domains, along with morphological and statistical features, to construct a robust and discriminative feature set for dataset-agnostic classification of normal and cardiac patients. The large and open-access database made available in the PhysioNet 2016 challenge was used for feature selection, internal validation and creation of training models. A second dataset of 41 PCG segments, collected using our in-house smartphone-based digital stethoscope from an Indian hospital, was used for performance evaluation. Our proposed methodology yielded sensitivity and specificity scores of 0.76 and 0.75 respectively on the test dataset in classifying cardiovascular diseases. The methodology also outperformed three popular prior-art approaches when applied on the same dataset.
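
    The sensitivity and specificity figures reported above follow directly from the confusion matrix of a binary classifier, as in this small sketch with made-up labels:

      from sklearn.metrics import confusion_matrix

      y_true = [1, 1, 0, 1, 0, 0, 1, 0, 1, 0]  # 1 = abnormal PCG
      y_pred = [1, 0, 0, 1, 0, 1, 1, 0, 1, 0]

      tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
      print(f"sensitivity = {tp / (tp + fn):.2f}, specificity = {tn / (tn + fp):.2f}")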

  11. Business process modeling for processing classified documents using RFID technology

    Directory of Open Access Journals (Sweden)

    Koszela Jarosław

    2016-01-01

    Full Text Available The article outlines the application of the process approach to the functional description of a designed IT system supporting the operations of a secret office which processes classified documents. The article describes the application of the method of incremental modeling of business processes according to the BPMN model to the description of the processes currently implemented manually (“as is”) and of the target processes (“to be”) that use RFID technology for the purpose of their automation. Additionally, examples of applying methods of structural and dynamic analysis of the processes (process simulation) to verify their correctness and efficiency are presented. An extension of the process analysis method is the possibility of applying a process warehouse and process mining methods.

  12. Sliding Control with Chattering Elimination for Hydraulic Drives

    DEFF Research Database (Denmark)

    Schmidt, Lasse; Andersen, Torben Ole; Pedersen, Henrik C.

    2012-01-01

    This paper presents the development of a sliding mode control scheme with chattering elimination, generally applicable for position tracking control of electro-hydraulic valve-cylinder drives. The proposed control scheme requires only common data sheet information and no knowledge of load characteristics...
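
    A standard way to eliminate chattering in sliding mode control, replacing the discontinuous sign(s) switching term with a smooth approximation such as tanh(s/phi), is sketched below; this illustrates the general idea only and is not the specific control law developed in the paper.

      import numpy as np

      def smc_switching_term(s, k=50.0, phi=0.02):
          # s: sliding surface value; k: switching gain; phi: boundary-layer width.
          return -k * np.tanh(s / phi)   # smooth, chattering-free approximation
          # return -k * np.sign(s)       # discontinuous version that chatters

      print(smc_switching_term(0.001), smc_switching_term(0.1))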

  13. Rabies Elimination in Dogs in the United States

    Centers for Disease Control (CDC) Podcasts

    2008-12-01

    Rabies has been eliminated from dogs in the United States through efforts to promote annual vaccination, but it's still a problem in wildlife in the U.S. and in wild and domesticated animals abroad. In this podcast, CDC's Dr. Charles Rupprecht discusses a study which provides proof of the elimination of rabies in dogs and what this means for the average American.  Created: 12/1/2008 by Emerging Infectious Diseases.   Date Released: 12/1/2008.

  14. Association between dysfunctional elimination syndrome and sensory processing disorder.

    Science.gov (United States)

    Pollock, Mary R; Metz, Alexia E; Barabash, Theresa

    2014-01-01

    OBJECTIVE. We explored whether sensory processing disorder (SPD) is related to dysfunctional elimination syndrome (DES). METHOD. We used the Vancouver Nonneurogenic Lower Urinary Tract Dysfunction/Dysfunctional Elimination Syndrome Questionnaire and the Short Sensory Profile with participants who sought treatment of DES (n = 19) and healthy control participants (n = 55). RESULTS. Significantly more children with DES (53%) had SPD than has been reported for the general population. Knowing a child's sensory processing pattern would be an important aspect that could influence the plan of care. Copyright © 2014 by the American Occupational Therapy Association, Inc.

  15. Application of exponential homotopy algorithm in the inverter harmonic elimination

    Science.gov (United States)

    Zhang, Li

    2017-02-01

    Eliminating harmonic pollution and improving the power factor are important tasks in the field of power electronics. From a mathematical point of view, harmonic elimination problems can be translated into systems of nonlinear equations, but these equations are difficult to solve directly because of their complexity. For this reason, an exponential homotopy method based on the homotopy method is proposed in this paper. The focus is on building the homotopy equations by modifying the singularities of the Jacobian matrix; on this basis, the homotopy equations are transformed into differential initial-value problems. Numerical results show that the new exponential homotopy method has higher precision than other algorithms and that the singularity problem is improved.
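
    The general homotopy idea behind the method above can be sketched on a toy nonlinear system: a homotopy H(x, t) = t*F(x) + (1 - t)*(x - x0) is tracked from the trivial system at t = 0 to the target system F(x) = 0 at t = 1, with a Newton correction at each step. The system F below is an arbitrary stand-in for the transcendental harmonic-elimination equations.

      import numpy as np

      def F(x):  # toy stand-in for the harmonic-elimination equations
          return np.array([x[0]**2 + x[1] - 3.0,
                           x[0] + x[1]**2 - 5.0])

      def jac_F(x):
          return np.array([[2.0 * x[0], 1.0],
                           [1.0, 2.0 * x[1]]])

      x0 = np.array([1.0, 1.0])  # solution of the trivial system x - x0 = 0
      x = x0.copy()
      for t in np.linspace(0.0, 1.0, 200)[1:]:
          # One Newton correction of H(x, t) = t*F(x) + (1 - t)*(x - x0) = 0.
          H = t * F(x) + (1 - t) * (x - x0)
          J = t * jac_F(x) + (1 - t) * np.eye(2)
          x = x - np.linalg.solve(J, H)

      for _ in range(5):  # final Newton polish at t = 1
          x = x - np.linalg.solve(jac_F(x), F(x))

      print(x, F(x))  # x should be close to (1, 2), where F(x) = 0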

  16. The Challenges of Identifying and Classifying Child Sexual Abuse Material.

    Science.gov (United States)

    Kloess, Juliane A; Woodhams, Jessica; Whittle, Helen; Grant, Tim; Hamilton-Giachritsis, Catherine E

    2017-08-01

    The aim of the present study was to (a) assess the reliability with which indecent images of children (IIOC) are classified as being of an indecent versus nonindecent nature, and (b) examine in detail the decision-making process engaged in by law enforcement personnel who undertake the difficult task of identifying and classifying IIOC as per the current legislative offense categories. One experienced researcher and four employees from a police force in the United Kingdom coded an extensive amount of IIOC (n = 1,212-2,233) to determine if they (a) were deemed to be of an indecent nature, and (b) depicted a child. Interrater reliability analyses revealed both considerable agreement and disagreement across coders, which were followed up with two focus groups involving the four employees. The first entailed a general discussion of the aspects that made such material more or less difficult to identify; the second focused on images where there had been either agreement (n = 20) or disagreement (n = 36) across coders that the images were of an indecent nature. Using thematic analysis, a number of factors apparent within IIOC were revealed to make the determination of youthfulness and indecency significantly more challenging for coders, with most relating to the developmental stage of the victim and the ambiguity of the context of an image. Findings are discussed in light of their implications for the identification of victims of ongoing sexual exploitation/abuse, the assessment and treatment of individuals in possession of IIOC, as well as the practice of policing and sentencing this type of offending behavior.

  17. Deep Learning to Classify Radiology Free-Text Reports.

    Science.gov (United States)

    Chen, Matthew C; Ball, Robyn L; Yang, Lingyao; Moradzadeh, Nathaniel; Chapman, Brian E; Larson, David B; Langlotz, Curtis P; Amrhein, Timothy J; Lungren, Matthew P

    2017-11-13

    Purpose To evaluate the performance of a deep learning convolutional neural network (CNN) model compared with a traditional natural language processing (NLP) model in extracting pulmonary embolism (PE) findings from thoracic computed tomography (CT) reports from two institutions. Materials and Methods Contrast material-enhanced CT examinations of the chest performed between January 1, 1998, and January 1, 2016, were selected. Annotations by two human radiologists were made for three categories: the presence, chronicity, and location of PE. Classification of performance of a CNN model with an unsupervised learning algorithm for obtaining vector representations of words was compared with the open-source application PeFinder. Sensitivity, specificity, accuracy, and F1 scores for both the CNN model and PeFinder in the internal and external validation sets were determined. Results The CNN model demonstrated an accuracy of 99% and an area under the curve value of 0.97. For internal validation report data, the CNN model had a statistically significant larger F1 score (0.938) than did PeFinder (0.867) when classifying findings as either PE positive or PE negative, but no significant difference in sensitivity, specificity, or accuracy was found. For external validation report data, no statistical difference between the performance of the CNN model and PeFinder was found. Conclusion A deep learning CNN model can classify radiology free-text reports with accuracy equivalent to or beyond that of an existing traditional NLP model. © RSNA, 2017 Online supplemental material is available for this article.
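
    A minimal word-level CNN text classifier of the kind compared above (embedding, one-dimensional convolution, max pooling, binary output) can be sketched in PyTorch as follows; the vocabulary size, dimensions and token ids are placeholders rather than the paper's configuration.

      import torch
      import torch.nn as nn

      class ReportCNN(nn.Module):
          def __init__(self, vocab_size=5000, emb_dim=100, n_filters=64, kernel=5):
              super().__init__()
              self.emb = nn.Embedding(vocab_size, emb_dim)
              self.conv = nn.Conv1d(emb_dim, n_filters, kernel)
              self.fc = nn.Linear(n_filters, 2)  # PE positive / PE negative

          def forward(self, token_ids):                 # (batch, seq_len)
              x = self.emb(token_ids).transpose(1, 2)   # (batch, emb_dim, seq_len)
              x = torch.relu(self.conv(x)).max(dim=2).values
              return self.fc(x)

      model = ReportCNN()
      logits = model(torch.randint(0, 5000, (4, 120)))  # four dummy tokenized reports
      print(logits.shape)  # torch.Size([4, 2])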

  18. Hierarchical classification of protein folds using a novel ensemble classifier.

    Science.gov (United States)

    Lin, Chen; Zou, Ying; Qin, Ji; Liu, Xiangrong; Jiang, Yi; Ke, Caihuan; Zou, Quan

    2013-01-01

    The analysis of biological information from protein sequences is important for the study of cellular functions and interactions, and protein fold recognition plays a key role in the prediction of protein structures. Unfortunately, the prediction of protein fold patterns is challenging due to the existence of compound protein structures. Here, we processed the latest release of the Structural Classification of Proteins (SCOP, version 1.75) database and exploited novel techniques to markedly increase the accuracy of protein fold classification. The techniques proposed in this paper include ensemble classification and a hierarchical framework: in the first layer, similar or redundant sequences are removed in two ways and a set of base classifiers, fused by various selection strategies, divides the input into seven classes; in the second layer, an analogous ensemble method is adopted to predict all protein folds. To our knowledge, this is the first time that all protein folds can be detected hierarchically. Compared with prior studies, our experimental results demonstrated the efficiency and effectiveness of the proposed method, which achieved a success rate of 74.21%, much higher than the results obtained with previous methods (ranging from 45.6% to 70.5%). When applied to the second layer of classification, the prediction accuracy was in the range between 23.13% and 46.05%. This value, while not remarkably high, is encouraging compared to the relatively low numbers of proteins recognized by most fold recognition programs. The web server Hierarchical Protein Fold Prediction (HPFP) is available at http://datamining.xmu.edu.cn/software/hpfp.
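
    The two-layer hierarchy described above can be outlined as follows: a first classifier assigns one of the seven structural classes, and a per-class classifier then assigns the fold. The feature matrix and labels below are synthetic placeholders, and random forests are used purely for illustration rather than the paper's fused base classifiers.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      rng = np.random.default_rng(4)
      X = rng.normal(size=(700, 50))
      y_class = rng.integers(0, 7, 700)   # seven structural classes
      y_fold = rng.integers(0, 10, 700)   # toy fold labels within each class

      layer1 = RandomForestClassifier(random_state=0).fit(X, y_class)
      layer2 = {c: RandomForestClassifier(random_state=0).fit(X[y_class == c],
                                                              y_fold[y_class == c])
                for c in range(7)}

      def predict_fold(x):
          # Layer 1 picks the structural class; layer 2 picks the fold within it.
          c = layer1.predict(x.reshape(1, -1))[0]
          return c, layer2[c].predict(x.reshape(1, -1))[0]

      print(predict_fold(X[0]))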

  19. Multivariate analysis of quantitative traits can effectively classify rapeseed germplasm

    Directory of Open Access Journals (Sweden)

    Jankulovska Mirjana

    2014-01-01

    Full Text Available In this study, the use of different multivariate approaches to classify rapeseed genotypes based on quantitative traits is presented. Tree regression analysis, PCA and two-way cluster analysis were applied in order to describe and understand the extent of genetic variability in spring rapeseed genotype-by-trait data. The traits that highly influenced seed and oil yield in rapeseed were successfully identified by the tree regression analysis. The principal predictor for both response variables was the number of pods per plant (NP). NP and 1000-seed weight could help in the selection of high-yielding genotypes. High values for both traits together with high oil content could lead to high oil-yielding genotypes. These traits may serve as indirect selection criteria and can lead to improvement of seed and oil yield in rapeseed. Quantitative traits that explained most of the variability in the studied germplasm were classified using principal component analysis. In this data set, five PCs were identified, of which the first three explained 63% of the total variance. This facilitated the choice of variables on which the clustering of the genotypes could be performed. The two-way cluster analysis simultaneously clustered genotypes and quantitative traits. The final number of clusters was determined using a bootstrapping technique. This approach provided a clear overview of the variability of the analyzed genotypes. The genotypes that have similar performance regarding the traits included in this study can be easily detected on the heatmap. Genotypes grouped in clusters 1 and 8 had high values for seed and oil yield and a relatively short vegetative growth period, and those in cluster 9 combined moderate to low values for vegetative growth duration with moderate to high seed and oil yield. These genotypes should be further exploited and implemented in the rapeseed breeding program. The combined application of these multivariate methods can effectively classify rapeseed germplasm.
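
    The multivariate workflow summarized above (standardize the quantitative traits, extract principal components, then cluster the genotypes) is sketched below with synthetic trait data standing in for the rapeseed measurements.

      import numpy as np
      from sklearn.preprocessing import StandardScaler
      from sklearn.decomposition import PCA
      from sklearn.cluster import AgglomerativeClustering

      rng = np.random.default_rng(5)
      traits = rng.normal(size=(60, 12))   # 60 genotypes x 12 quantitative traits

      scores = PCA(n_components=5).fit_transform(StandardScaler().fit_transform(traits))
      clusters = AgglomerativeClustering(n_clusters=9).fit_predict(scores)
      print(np.bincount(clusters))         # number of genotypes per cluster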

  20. Technical bases DWPF Late Washing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Fish, D.L.; Landon, L.F.

    1992-08-10

    A task force recommended that the technical feasibility of a "Late Wash" facility be assessed [1]. In this facility, each batch of tetraphenylborate slurry from Tank 49 would be given a final wash to reduce the concentrations of nitrite and radiolysis products to acceptable levels. Laboratory-scale studies have demonstrated that if the nitrite content of the slurry fed to DWPF is reduced to 0.01 M or less (and at least a 4X reduction in concentration of the soluble species is attained), (1) the need for HAN during hydrolysis is eliminated (eliminating the production of ammonium ion during hydrolysis), (2) hydrolysis may be done with a catalyst concentration that will not exceed the copper solubility in glass and (3) the non-polar organic production during hydrolysis is significantly reduced. The first phase of an aggressive research and development program has been completed and all test results obtained to date support the technical feasibility of Late Washing. Paralleling this research and development effort is an aggressive design study directed by DWPF to scope and cost retrofitting the Auxiliary Pump Pit (APP) to enable performing a final wash of each batch of precipitate slurry before it is transferred into the DWPF Salt Processing Cell (SPC). An initial technical basis for the Late Wash Facility was transmitted to DWPF on June 15, 1992. Research and development activities are continuing, directed principally at optimization of the cross-flow filter decontamination methodology and pilot-scale validation of the recommended benzene stripping methodology.

  1. Inpatient Rehabilitation Facility - Conditions

    Data.gov (United States)

    U.S. Department of Health & Human Services — A list of inpatient rehabilitation facilities with data on the number of times people with Medicare who had certain medical conditions were treated in the last year.

  2. Powder Metallurgy Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The facility is uniquely equipped as the only laboratory within DA to conduct PM processing of refractory metals and alloys as well as the processing of a wide range...

  3. Ballistic Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Ballistic Test Facility is comprised of two outdoor and one indoor test ranges, which are all instrumented for data acquisition and analysis. Full-size aircraft...

  4. Laser Guidance Analysis Facility

    Data.gov (United States)

    Federal Laboratory Consortium — This facility, which provides for real time, closed loop evaluation of semi-active laser guidance hardware, has and continues to be instrumental in the development...

  5. Waste Water Facilities

    Data.gov (United States)

    Vermont Center for Geographic Information — This dataset contains the locations of municipal and industrial direct discharge wastewater treatment facilities throughout the state of Vermont. Spatial data is not...

  6. Dialysis Facility Compare Data

    Data.gov (United States)

    U.S. Department of Health & Human Services — These are the official datasets used on the Medicare.gov Dialysis Facility Compare Website provided by the Centers for Medicare and Medicaid Services. These data...

  7. Mark 1 Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Mark I Test Facility is a state-of-the-art space environment simulation test chamber for full-scale space systems testing. A $1.5M dollar upgrade in fiscal year...

  8. Advanced Microanalysis Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Advanced Microanalysis Facility fully integrates capabilities for chemical and structural analysis of electronic materials and devices for the U.S. Army and DoD....

  9. Air Data Calibration Facility

    Data.gov (United States)

    Federal Laboratory Consortium — This facility is for low altitude subsonic altimeter system calibrations of air vehicles. Mission is a direct support of the AFFTC mission. Postflight data merge is...

  10. Concrete Research Facility

    Data.gov (United States)

    Federal Laboratory Consortium — This is a 20,000-sq ft laboratory that supports research on all aspects of concrete and materials technology. The staff of this facility offer wide-ranging expertise...

  11. Coastal Harbors Modeling Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Coastal Harbors Modeling Facility is used to aid in the planning of harbor development and in the design and layout of breakwaters, absorbers, etc.. The goal is...

  12. Coastal Inlet Model Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Coastal Inlet Model Facility, as part of the Coastal Inlets Research Program (CIRP), is an idealized inlet dedicated to the study of coastal inlets and equipped...

  13. Field Research Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Field Research Facility (FRF) located in Duck, N.C. was established in 1977 to support the U.S. Army Corps of Engineers' coastal engineering mission. The FRF is...

  14. Geophysical Research Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Geophysical Research Facility (GRF) is a 60 ft long × 22 ft wide × 7 ft deep concrete basin at CRREL for fresh or saltwater investigations and can be temperature...

  15. VT Telecommunication Facilities

    Data.gov (United States)

    Vermont Center for Geographic Information — (Link to Metadata) The UtilityTelecom_TELEFAC data layer contains points which are intended to represent the location of telecommunications facilities (towers and/or...

  16. GPS Satellite Simulation Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The GPS satellite simulation facility consists of a GPS satellite simulator controlled by either a Silicon Graphics Origin 2000 or PC depending upon unit under test...

  17. TNO HVAC facilities

    NARCIS (Netherlands)

    Hammink, H.A.J.

    2015-01-01

    TNO has extensive knowledge of heating, ventilation and air conditioning (HVAC), and can offer its services through theoretical studies, laboratory experiments and field measurements. This complete scope, made possible through our test facilities, enables the effective development of new products,

  18. Skilled Nursing Facility PPS

    Data.gov (United States)

    U.S. Department of Health & Human Services — Section 4432(a) of the Balanced Budget Act (BBA) of 1997 modified how payment is made for Medicare skilled nursing facility (SNF) services. Effective with cost...

  19. Joint Computing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Raised Floor Computer Space for High Performance ComputingThe ERDC Information Technology Laboratory (ITL) provides a robust system of IT facilities to develop and...

  20. Environmental Test Facility (ETF)

    Data.gov (United States)

    Federal Laboratory Consortium — The Environmental Test Facility (ETF) provides non-isolated shock testing for stand-alone equipment and full size cabinets under MIL-S-901D specifications. The ETF...