WorldWideScience

Sample records for classified elimination facilities

  1. Classified facilities for environmental protection

    International Nuclear Information System (INIS)

    Anon.

    1993-02-01

    The legislation on classified facilities governs most dangerous or polluting industries and fixed activities. It rests on the law of 9 July 1976 concerning facilities classified for environmental protection and its application decree of 21 September 1977. This legislation, the general texts of which appear in this volume 1, aims to prevent all risks and harmful effects arising from an installation (air, water or soil pollution, wastes, even aesthetic harm). Polluting or dangerous activities are defined in a list called the nomenclature, which subjects facilities to either a declaration or an authorization procedure. The authorization is issued by the prefect at the end of an open, adversarial procedure following a public inquiry. In addition, facilities can be subjected to technical regulations set by the Environment Minister (volume 2) or by the prefect for facilities subject to declaration (volume 3). (A.B.)

  2. Elimination of mercury in health care facilities.

    Science.gov (United States)

    2000-03-01

    Mercury is a persistent, bioaccumulative toxin that has been linked to numerous health effects in humans and wildlife. It is a potent neurotoxin that may also harm the brain, kidneys, and lungs. Unborn children and young infants are at particular risk for brain damage from mercury exposure. Hospitals' use of mercury in chemical solutions, thermometers, blood pressure gauges, batteries, and fluorescent lamps makes these facilities large contributors to the overall emission of mercury into the environment. Most hospitals recognize the dangers of mercury. In a recent survey, four out of five hospitals stated that they have policies in place to eliminate the use of mercury-containing products. Sixty-two percent of them require vendors to disclose the presence of mercury in chemicals that the hospitals purchase. Only 12 percent distribute mercury-containing thermometers to new parents. Ninety-two percent teach their employees about the health and environmental effects of mercury, and 46 percent teach all employees how to clean up mercury spills. However, the same study showed that many hospitals have not implemented their policies. Forty-two percent were not aware whether they still purchased items containing mercury. In addition, 49 percent still purchase mercury thermometers, 44 percent purchase mercury gastrointestinal diagnostic equipment, and 64 percent still purchase mercury lab thermometers.

  3. The studies and experiments on size elimination of fine-grained feldspar from Asia Mining by using Vertical Air Classifier

    Directory of Open Access Journals (Sweden)

    Siribumrungsukha, B.

    2002-04-01

    Asia Mining Company produces feldspar and supplies it to both domestic and overseas industries. Ore from the mine is crushed, ground, and then classified by screening. The coarse fraction (+40 mesh) can be sold to the market, while the fine fraction is left unsold because the market requires a size of -40+140 mesh. This research designed and constructed a Vertical Air Classifier in which the fine mineral is fed to a vertical chamber while air is blown from the bottom. The main variables are the air flow rate and the length of contact between the air and the mineral (lengths of 10 and 15 cm). The air is blown in by an air compressor and its rate is controlled by a thin-plate orifice. Experiments show that the classifier eliminates more of the -140 mesh size as the air flow rate increases. The % fractional recovery at the size of -140 mesh is found to decrease as the air flow rate increases. The length of contact between the air and the mineral influences both the elimination and the % fractional recovery. When the length is shorter (10 cm), the elimination of the -140 mesh size is better and the % fractional recovery at the size -140 mesh in the underflow is lower. At an air flow rate of 6.42 L/s and a contact length of 10 cm, the -140 mesh fraction can be reduced from 37.11% in the feed down to 17.70% in the underflow. The results demonstrate the potential of the Vertical Air Classifier to be further developed for eliminating the -140 mesh size by connecting the classifiers in series.
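
    The performance figures quoted above (feed at 37.11% -140 mesh reduced to 17.70% in the underflow, with fractional recovery falling as air flow rises) follow from a simple mass balance on the fine fraction. The sketch below illustrates that bookkeeping; the mass split between underflow and overflow is a hypothetical example, not data from the paper.

      # Mass-balance sketch for an air classifier, assuming a hypothetical 100 kg feed
      # split 80/20 between underflow (coarse product) and overflow (removed fines).

      def classifier_balance(feed_mass, underflow_mass, fines_in_feed, fines_in_underflow):
          """Return the -140 mesh recovery to the underflow and the fraction eliminated."""
          fines_fed = feed_mass * fines_in_feed              # kg of -140 mesh entering
          fines_kept = underflow_mass * fines_in_underflow   # kg of -140 mesh retained
          recovery = fines_kept / fines_fed                  # fractional recovery to underflow
          removed = 1.0 - recovery                           # fraction eliminated to overflow
          return recovery, removed

      # Fines fractions from the abstract (37.11% in feed, 17.70% in underflow);
      # the 80 kg underflow mass is an assumed figure for illustration only.
      recovery, removed = classifier_balance(100.0, 80.0, 0.3711, 0.1770)
      print(f"-140 mesh recovery to underflow: {recovery:.1%}, eliminated: {removed:.1%}")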

  4. National Pollution Discharge Elimination System (NPDES) Facility Points, Region 9, 2007, US EPA Region 9

    Data.gov (United States)

    U.S. Environmental Protection Agency — Point geospatial dataset representing locations of NPDES Facilities. NPDES (National Pollution Discharge Elimination System) is an EPA permit program that regulates...

  5. National Pollution Discharge Elimination System (NPDES) Facility Points, Region 9, 2011, US EPA Region 9

    Data.gov (United States)

    U.S. Environmental Protection Agency — Point geospatial dataset representing locations of NPDES Facilities. NPDES (National Pollution Discharge Elimination System) is an EPA permit program that regulates...

  6. National Pollution Discharge Elimination System (NPDES) Facility Points, Region 9, 2012, US EPA Region 9

    Data.gov (United States)

    U.S. Environmental Protection Agency — Point geospatial dataset representing locations of NPDES Facilities. NPDES (National Pollution Discharge Elimination System) is an EPA permit program that regulates...

  7. Elimination of Porcine Epidemic Diarrhea Virus in an Animal Feed Manufacturing Facility.

    Directory of Open Access Journals (Sweden)

    Anne R Huss

    Porcine Epidemic Diarrhea Virus (PEDV) was the first virus of wide-scale concern to be linked to possible transmission by livestock feed or ingredients. Measures to exclude pathogens, prevent cross-contamination, and actively reduce the pathogenic load of feed and ingredients are being developed. However, research thus far has focused on the role of chemicals or thermal treatment to reduce the RNA in the actual feedstuffs, and has not addressed potential residual contamination within the manufacturing facility that may lead to continuous contamination of finished feeds. The purpose of this experiment was to evaluate the use of a standardized protocol to sanitize an animal feed manufacturing facility contaminated with PEDV. Environmental swabs were collected throughout the facility during the manufacturing of a swine diet inoculated with PEDV. To monitor facility contamination with the virus, swabs were collected at: 1) baseline prior to inoculation, 2) after production of the inoculated feed, 3) after application of a quaternary ammonium-glutaraldehyde blend cleaner, 4) after application of a sodium hypochlorite sanitizing solution, and 5) after facility heat-up to 60°C for 48 hours. Decontamination step, surface, type, zone and their interactions were all found to affect the quantity of detectable PEDV RNA (P < 0.05). As expected, all samples collected from equipment surfaces contained PEDV RNA after production of the contaminated feed. Additionally, the majority of samples collected from non-direct feed contact surfaces were also positive for PEDV RNA after the production of the contaminated feed, emphasizing the potential role dust plays in cross-contamination of pathogens throughout a manufacturing facility. Application of the cleaner, sanitizer, and heat was effective at reducing PEDV genomic material (P < 0.05), but did not completely eliminate it.

  8. Trilateral Initiative: IAEA authentication and national certification of verification equipment for facilities with classified forms of fissile material

    International Nuclear Information System (INIS)

    Haas, Eckard; Sukhanov, Alexander; Murphy, John

    2001-01-01

    Full text: Within the framework of the Trilateral Initiative, technical challenges have arisen from the prospect of the International Atomic Energy Agency (IAEA) monitoring fissile material with classified characteristics, as well as the IAEA using facility- or host-country-supplied monitoring equipment. In monitoring material with classified characteristics, it is recognized that the host country needs to ensure that classified information is not made available to the IAEA inspectors. Thus, any monitoring equipment used to monitor material with classified characteristics has to incorporate information security capabilities, such as information barriers. Likewise, when using host-country-supplied monitoring equipment, the IAEA has to have confidence that the information the equipment provides about the monitored material is genuine and can be used by the IAEA in fulfilling its obligation to draw conclusions based on independent verification measures. Thus the IAEA needs to go through the process of authenticating the monitoring equipment. In the same way, the host country needs to go through the process of assuring itself that the monitoring equipment integrated with an information barrier will not divulge any classified information about an inspected sensitive item. Both processes require largely identical measures, but they may also partly conflict with each other. The fact that monitoring equipment needs to maintain information security throughout its lifecycle while remaining capable of being authenticated calls for creative technical approaches. (author)

  9. Hazards study of environmental protection classified facilities. Scenarios analysis; Etude de dangers des ICPE. Analyse des scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Seveque, J.L. [Cour d' Appel d' Amiens, 80 (France)

    2006-04-15

    This article describes the analysis and study of the possible impacts of accidents occurring at industrial facilities classified with respect to environmental protection. The operators of such facilities have to describe the possible risks and their consequences, the measures taken to prevent them, and the level of residual risk. This consists in calculating the consequences of all possible aggressions that a facility can undergo. The receptors are of two types: the human body (burns, asphyxia, intoxication, shock wave, projectiles) and the surrounding equipment (fire, unconfined vapour cloud explosion (UVCE), boiling liquid expanding vapour explosion (BLEVE), fireball, dispersion of toxic gases). Content: 1 - fire-type scenario: description, modeling of thermal effects, conclusion; 2 - UVCE-type scenario: description, Lannoy method (TNT equivalent), multi-energy method, conclusion; 3 - BLEVE-type scenario: description, modeling of overpressure effects, thermal effects of the fireball; 4 - toxic cloud scenario: modeling of toxic cloud dispersion, effects and consequences; 5 - conclusions. (J.S.)
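
    The TNT-equivalent approach mentioned for the UVCE scenario is commonly expressed by converting the combustion energy of the flammable mass into an equivalent TNT charge and then reading overpressure off blast charts via a scaled distance. The sketch below uses that generic formulation with an assumed yield factor and an assumed propane cloud; it illustrates the general method, not the specific correlations of the Lannoy or multi-energy models.

      # Generic TNT-equivalence sketch for a vapour cloud explosion (illustrative only).
      # The 10% yield factor and the 50 000 kg propane cloud are assumed example values.

      HEAT_OF_COMBUSTION_PROPANE = 46.0e6   # J/kg, approximate lower heating value
      TNT_BLAST_ENERGY = 4.68e6             # J/kg, commonly used TNT equivalence energy

      def tnt_equivalent_mass(fuel_mass_kg, heat_of_combustion, yield_factor=0.10):
          """Equivalent TNT mass: W = eta * m_fuel * dHc / E_TNT."""
          return yield_factor * fuel_mass_kg * heat_of_combustion / TNT_BLAST_ENERGY

      def scaled_distance(distance_m, tnt_mass_kg):
          """Hopkinson-scaled distance Z = R / W**(1/3), used to enter blast charts."""
          return distance_m / tnt_mass_kg ** (1.0 / 3.0)

      w_tnt = tnt_equivalent_mass(50_000.0, HEAT_OF_COMBUSTION_PROPANE)
      print(f"Equivalent TNT mass: {w_tnt:,.0f} kg")
      print(f"Scaled distance at 500 m: {scaled_distance(500.0, w_tnt):.2f} m/kg^(1/3)")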

  10. Methodological guide for the acceptance of waste with a natural radioactivity in the classified elimination facilities. Part 1: methodological guide

    International Nuclear Information System (INIS)

    Cazala, Ch.; Cessac, B.; Gay, D.

    2006-01-01

    This document is part of the work initiated by the Direction of Pollution Prevention and Risks (D.P.P.R.) of the Ministry of Ecology and Sustainable Development within a working group on means for detecting radioactivity at the entrance of waste storage centers (the so-called gantries group). The D.P.P.R. entrusted its implementation to the I.R.S.N. The guide was completed under the supervision of a steering committee composed of representatives of the industrial producers concerned by this type of control, operators of storage centers, associations for environmental protection, experts and the administration. The content and structure of the guide and the accompanying sheets result from discussions within the Steering Committee between the D.P.P.R. and the I.R.S.N. (N.C.)

  11. National Pollution Discharge Elimination System (NPDES) All Facility Points, Region 9, 2007, US EPA Region 9

    Data.gov (United States)

    U.S. Environmental Protection Agency — Point geospatial dataset representing locations of NPDES facilities, outfalls/dischargers, waste water treatment plant facilities and waste water treatment plants...

  12. List of currently classified documents relative to Hanford Production Facilities Operations originated on the Hanford Site between 1961 and 1972

    Energy Technology Data Exchange (ETDEWEB)

    1993-04-01

    The United States Department of Energy (DOE) has declared that all Hanford plutonium production- and operations-related information generated between 1944 and 1972 is declassified. Any documents found and deemed useful for meeting Hanford Environmental Dose Reconstruction (HEDR) objectives may be declassified with or without deletions in accordance with DOE guidance by Authorized Derivative Declassifiers. The September 1992 letter report, Declassifications Requested by the Technical Steering Panel of Hanford Documents Produced 1944-1960 (PNWD-2024 HEDR UC-707), provides an important milestone toward achieving a complete listing of documents that may be useful to the HEDR Project. The attached listing of approximately 7,000 currently classified Hanford-originated documents relative to Hanford Production Facilities Operations between 1961 and 1972 fulfills TSP Directive 89-3. This list does not include such titles as the Irradiation Processing Department, Chemical Processing Department, and Hanford Laboratory Operations monthly reports generated after 1960, which have been previously declassified with minor deletions and made publicly available. Also, Kaiser Engineers Hanford (KEH) Document Control determined that no KEH documents generated between January 1, 1961 and December 31, 1972 are currently classified. Titles which address work for others have not been included because the Hanford Site contractors currently having custodial responsibility for these documents do not have the authority to determine whether anyone other than their own staff has an appropriate need-to-know on file. Furthermore, these documents do not normally contain information relative to Hanford Site operations.

  13. Elimination of Pasteurella pneumotropica from a Mouse Barrier Facility by Using a Modified Enrofloxacin Treatment Regimen

    Science.gov (United States)

    Towne, Justin W; Wagner, April M; Griffin, Kurt J; Buntzman, Adam S; Frelinger, Jeffrey A; Besselsen, David G

    2014-01-01

    Multiple NOD.Cg-Prkdc(scid) Il2rg(tm1Wjl) Tg(HLA-A2.1)Enge/Sz (NSG/A2) transgenic mice maintained in a mouse barrier facility were submitted for necropsy to determine the cause of facial alopecia, tachypnea, dyspnea, and sudden death. Pneumonia and soft-tissue abscesses were observed, and Pasteurella pneumotropica biotype Jawetz was consistently isolated from the upper respiratory tract, lung, and abscesses. Epidemiologic investigation within the facility revealed presence of this pathogen in mice generated or rederived by the intramural Genetically Engineered Mouse Model (GEMM) Core but not in mice procured from several approved commercial vendors. Epidemiologic data suggested the infection originated from female or vasectomized male ND4 mice obtained from a commercial vendor and then comingled by the GEMM Core to induce pseudopregnancy in female mice for embryo implantation. Enrofloxacin delivered in drinking water (85 mg/kg body weight daily) for 14 d was sufficient to clear bacterial infection in normal, breeding, and immune-deficient mice without the need to change the antibiotic water source. This modified treatment regimen was administered to 2400 cages of mice to eradicate Pasteurella pneumotropica from the facility. Follow-up PCR testing for P. pneumotropica biotype Jawetz remained uniformly negative at 2, 6, 12, and 52 wk after treatment in multiple strains of mice that were originally infected. Together, these data indicate that enrofloxacin can eradicate P. pneumotropica from infected mice in a less labor-intensive approach that does not require breeding cessation and that is easily adaptable to the standard biweekly cage change schedule for individually ventilated cages. PMID:25255075

  14. Elimination of Pasteurella pneumotropica from a mouse barrier facility by using a modified enrofloxacin treatment regimen.

    Science.gov (United States)

    Towne, Justin W; Wagner, April M; Griffin, Kurt J; Buntzman, Adam S; Frelinger, Jeffrey A; Besselsen, David G

    2014-09-01

    Multiple NOD. Cg-Prkdc(scid)Il2rg(tm1Wjl)Tg(HLA-A2.1)Enge/Sz (NSG/A2) transgenic mice maintained in a mouse barrier facility were submitted for necropsy to determine the cause of facial alopecia, tachypnea, dyspnea, and sudden death. Pneumonia and soft-tissue abscesses were observed, and Pasteurella pneumotropica biotype Jawetz was consistently isolated from the upper respiratory tract, lung, and abscesses. Epidemiologic investigation within the facility revealed presence of this pathogen in mice generated or rederived by the intramural Genetically Engineered Mouse Model (GEMM) Core but not in mice procured from several approved commercial vendors. Epidemiologic data suggested the infection originated from female or vasectomized male ND4 mice obtained from a commercial vendor and then comingled by the GEMM Core to induce pseudopregnancy in female mice for embryo implantation. Enrofloxacin delivered in drinking water (85 mg/kg body weight daily) for 14 d was sufficient to clear bacterial infection in normal, breeding, and immune-deficient mice without the need to change the antibiotic water source. This modified treatment regimen was administered to 2400 cages of mice to eradicate Pasteurella pneumotropica from the facility. Follow-up PCR testing for P. pneumotropica biotype Jawetz remained uniformly negative at 2, 6, 12, and 52 wk after treatment in multiple strains of mice that were originally infected. Together, these data indicate that enrofloxacin can eradicate P. pneumotropica from infected mice in a less labor-intensive approach that does not require breeding cessation and that is easily adaptable to the standard biweekly cage change schedule for individually ventilated cages.

  15. Elimination of liquid discharge to the environment from the TA-50 Radioactive Liquid Waste Treatment Facility

    International Nuclear Information System (INIS)

    Moss, D.; Williams, N.; Hall, D.; Hargis, K.; Saladen, M.; Sanders, M.; Voit, S.; Worland, P.; Yarbro, S.

    1998-06-01

    Alternatives were evaluated for management of treated radioactive liquid waste from the radioactive liquid waste treatment facility (RLWTF) at Los Alamos National Laboratory. The alternatives included continued discharge into Mortandad Canyon, diversion to the sanitary wastewater treatment facility and discharge of its effluent to Sandia Canyon or Canada del Buey, and zero liquid discharge. Implementation of a zero liquid discharge system is recommended in addition to two phases of upgrades currently under way. Three additional phases of upgrades to the present radioactive liquid waste system are proposed to accomplish zero liquid discharge. The first phase involves minimization of liquid waste generation, along with improved characterization and monitoring of the remaining liquid waste. The second phase removes dissolved salts from the reverse osmosis concentrate stream to yield a higher effluent quality. In the final phase, the high-quality effluent is reused for industrial purposes within the Laboratory or evaporated. Completion of these three phases will result in zero discharge of treated radioactive liquid wastewater from the RLWTF

  16. Elimination of liquid discharge to the environment from the TA-50 Radioactive Liquid Waste Treatment Facility

    Energy Technology Data Exchange (ETDEWEB)

    Moss, D.; Williams, N.; Hall, D.; Hargis, K.; Saladen, M.; Sanders, M.; Voit, S.; Worland, P.; Yarbro, S.

    1998-06-01

    Alternatives were evaluated for management of treated radioactive liquid waste from the radioactive liquid waste treatment facility (RLWTF) at Los Alamos National Laboratory. The alternatives included continued discharge into Mortandad Canyon, diversion to the sanitary wastewater treatment facility and discharge of its effluent to Sandia Canyon or Canada del Buey, and zero liquid discharge. Implementation of a zero liquid discharge system is recommended in addition to two phases of upgrades currently under way. Three additional phases of upgrades to the present radioactive liquid waste system are proposed to accomplish zero liquid discharge. The first phase involves minimization of liquid waste generation, along with improved characterization and monitoring of the remaining liquid waste. The second phase removes dissolved salts from the reverse osmosis concentrate stream to yield a higher effluent quality. In the final phase, the high-quality effluent is reused for industrial purposes within the Laboratory or evaporated. Completion of these three phases will result in zero discharge of treated radioactive liquid wastewater from the RLWTF.

  17. Analysis of the application of selected physico-chemical methods in eliminating odor nuisance of municipal facilities

    Directory of Open Access Journals (Sweden)

    Miller Urszula

    2018-01-01

    The operation of municipal management facilities is inseparable from the problem of emissions of malodorous compounds to the atmospheric air. In this case, odor nuisance is related to the chemical composition of waste, sewage and sludge, as well as to the activity of microorganisms whose metabolic products can be those odorous compounds. Significant reduction of odorant emission from many sources can be achieved by optimizing process parameters and conditions. However, it is not always possible to limit the formation of odorants. In such cases it is best to use appropriate deodorizing methods. The choice of the appropriate method is based on the physical parameters and emission intensity of the polluted gases and, where it can be determined, their composition. Among the solutions used in municipal management, physico-chemical methods such as sorption and oxidation can be distinguished. In cases where the emission source is not enclosed, odor masking techniques are used, which consist of spraying preparations that neutralize unpleasant odors. The paper presents the characteristics of selected methods of eliminating odor nuisance and an evaluation of their applicability in municipal management facilities.

  18. Analysis of the application of selected physico-chemical methods in eliminating odor nuisance of municipal facilities

    Science.gov (United States)

    Miller, Urszula; Grzelka, Agnieszka; Romanik, Elżbieta; Kuriata, Magdalena

    2018-01-01

    The operation of municipal management facilities is inseparable from the problem of emissions of malodorous compounds to the atmospheric air. In this case, odor nuisance is related to the chemical composition of waste, sewage and sludge, as well as to the activity of microorganisms whose metabolic products can be those odorous compounds. Significant reduction of odorant emission from many sources can be achieved by optimizing process parameters and conditions. However, it is not always possible to limit the formation of odorants. In such cases it is best to use appropriate deodorizing methods. The choice of the appropriate method is based on the physical parameters and emission intensity of the polluted gases and, where it can be determined, their composition. Among the solutions used in municipal management, physico-chemical methods such as sorption and oxidation can be distinguished. In cases where the emission source is not enclosed, odor masking techniques are used, which consist of spraying preparations that neutralize unpleasant odors. The paper presents the characteristics of selected methods of eliminating odor nuisance and an evaluation of their applicability in municipal management facilities.

  19. Classifying Microorganisms

    DEFF Research Database (Denmark)

    Sommerlund, Julie

    2006-01-01

    This paper describes the coexistence of two systems for classifying organisms and species: a dominant genetic system and an older naturalist system. The former classifies species and traces their evolution on the basis of genetic characteristics, while the latter employs physiological characteristics. The coexistence of the classification systems does not lead to a conflict between them. Rather, the systems seem to co-exist in different configurations, through which they are complementary, contradictory and inclusive in different situations, sometimes simultaneously. The systems come...

  20. Carbon classified?

    DEFF Research Database (Denmark)

    Lippert, Ingmar

    2012-01-01

    Using an actor-network theory (ANT) framework, the aim is to investigate the actors who bring together the elements needed to classify their carbon emission sources and unpack the heterogeneous relations drawn on. Based on an ethnographic study of corporate agents of ecological modernisation over a period of 13 months, this paper provides an exploration of three cases of enacting classification. Drawing on ANT, we problematise the silencing of a range of possible modalities of consumption facts and point to the ontological ethics involved in such performances. In a context of global warming...

  1. Eliminating armaments

    International Nuclear Information System (INIS)

    Adams, R.

    1998-01-01

    The end of the Cold War induced optimistic projections concerning disarmament, the elimination of nuclear weapons, and the elimination of massive inequities - poverty, hatred, racism. All these goals should be achieved simultaneously, but little has been achieved so far

  2. Eliminating animal facility light-at-night contamination and its effect on circadian regulation of rodent physiology, tumor growth, and metabolism: a challenge in the relocation of a cancer research laboratory.

    Science.gov (United States)

    Dauchy, Robert T; Dupepe, Lynell M; Ooms, Tara G; Dauchy, Erin M; Hill, Cody R; Mao, Lulu; Belancio, Victoria P; Slakey, Lauren M; Hill, Steven M; Blask, David E

    2011-05-01

    Appropriate laboratory animal facility lighting and lighting protocols are essential for maintaining the health and wellbeing of laboratory animals and ensuring the credible outcome of scientific investigations. Our recent experience in relocating to a new laboratory facility illustrates the importance of these considerations. Previous studies in our laboratory demonstrated that animal room contamination with light-at-night (LAN) of as little as 0.2 lx at rodent eye level during an otherwise normal dark-phase disrupted host circadian rhythms and stimulated the metabolism and proliferation of human cancer xenografts in rats. Here we examined how simple improvements in facility design at our new location completely eliminated dark-phase LAN contamination and restored normal circadian rhythms in nontumor-bearing rats and normal tumor metabolism and growth in host rats bearing tissue-isolated MCF7(SR(-)) human breast tumor xenografts or 7288CTC rodent hepatomas. Reducing LAN contamination in the animal quarters from 24.5 ± 2.5 lx to nondetectable levels (complete darkness) restored normal circadian regulation of rodent arterial blood melatonin, glucose, total fatty and linoleic acid concentrations, tumor uptake of O(2), glucose, total fatty acid and CO(2) production and tumor levels of cAMP, triglycerides, free fatty acids, phospholipids, and cholesterol esters, as well as extracellular-signal-regulated kinase, mitogen-activated protein kinase, serine-threonine protein kinase, glycogen synthase kinase 3β, γ-histone 2AX, and proliferating cell nuclear antigen.

  3. Classifying Returns as Extreme

    DEFF Research Database (Denmark)

    Christiansen, Charlotte

    2014-01-01

    I consider extreme returns for the stock and bond markets of 14 EU countries using two classification schemes: One, the univariate classification scheme from the previous literature that classifies extreme returns for each market separately, and two, a novel multivariate classification scheme tha...

  4. LCC: Light Curves Classifier

    Science.gov (United States)

    Vo, Martin

    2017-08-01

    Light Curves Classifier uses data mining and machine learning to obtain and classify desired objects. This task can be accomplished using attributes of light curves or any time series, including shapes, histograms, or variograms, or using other available information about the inspected objects, such as color indices, temperatures, and abundances. After specifying features which describe the objects to be searched, the software trains on a given training sample, and can then be used for unsupervised clustering to visualize the natural separation of the sample. The package can also be used for automatic tuning of the parameters of the methods used (for example, the number of hidden neurons or the binning ratio). Trained classifiers can be used for filtering outputs from astronomical databases or data stored locally. The Light Curve Classifier can also be used for simple downloading of light curves and all available information about queried stars. It can natively connect to OgleII, OgleIII, ASAS, CoRoT, Kepler, Catalina and MACHO, and new connectors or descriptors can be implemented. In addition to direct usage of the package and a command line UI, the program can be used through a web interface. Users can create jobs for "training" methods on given objects, querying databases and filtering outputs by trained filters. Preimplemented descriptors, classifiers and connectors can be picked by simple clicks and their parameters can be tuned by giving ranges of these values. All combinations are then calculated and the best one is used for creating the filter. The natural separation of the data can be visualized by unsupervised clustering.
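
    As a rough illustration of the workflow described above (hand-crafted light-curve features, supervised training, and unsupervised clustering to inspect the separation), the sketch below uses scikit-learn on synthetic data. It is not the LCC package's own interface; the feature choices, class names and synthetic curves are assumptions made for the example.

      # Illustrative light-curve classification sketch (not the LCC package API).
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(0)

      def features(flux):
          """Simple descriptors of a light curve: amplitude, scatter, and skewness."""
          amp = flux.max() - flux.min()
          std = flux.std()
          skew = ((flux - flux.mean()) ** 3).mean() / (std ** 3 + 1e-12)
          return [amp, std, skew]

      # Synthetic 'periodic' vs 'flat' light curves stand in for real survey data.
      t = np.linspace(0, 10, 200)
      periodic = [np.sin(2 * np.pi * t / rng.uniform(1, 3)) + rng.normal(0, 0.1, t.size)
                  for _ in range(50)]
      flat = [rng.normal(0, 0.1, t.size) for _ in range(50)]

      X = np.array([features(f) for f in periodic + flat])
      y = np.array([1] * 50 + [0] * 50)          # 1 = periodic, 0 = non-variable

      clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
      print("training accuracy:", clf.score(X, y))

      # Unsupervised clustering to visualize the natural separation of the sample.
      labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
      print("cluster sizes:", np.bincount(labels))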

  5. Intelligent Garbage Classifier

    Directory of Open Access Journals (Sweden)

    Ignacio Rodríguez Novelle

    2008-12-01

    IGC (Intelligent Garbage Classifier) is a system for the visual classification and separation of solid waste products. Currently, an important part of the separation effort is based on manual work, from household separation to industrial waste management. Taking advantage of the technologies currently available, a system has been built that can analyze images from a camera and control a robot arm and conveyor belt to automatically separate different kinds of waste.

  6. Classifying Linear Canonical Relations

    OpenAIRE

    Lorand, Jonathan

    2015-01-01

    In this Master's thesis, we consider the problem of classifying, up to conjugation by linear symplectomorphisms, linear canonical relations (lagrangian correspondences) from a finite-dimensional symplectic vector space to itself. We give an elementary introduction to the theory of linear canonical relations and present partial results toward the classification problem. This exposition should be accessible to undergraduate students with a basic familiarity with linear algebra.

  7. Distribution and dynamics of radionuclides and stable elements in the coastal waters off Rokkasho Village, Japan, prior to the opening of a nuclear reprocessing facility. Part 1. Sedimentation flux of suspended particles and elimination of radionuclides and stable elements from seawater

    International Nuclear Information System (INIS)

    Kondo, K.; Kawabata, H.; Ueda, S.; Hasegawa, H.; Inaba, J.; Ohmomo, Y.; Mitamura, O.; Seike, Y.

    2004-01-01

    A nuclear fuel reprocessing facility is currently under construction in Rokkasho Village, Aomori, Japan. After completion and start-up, this facility will discharge radionuclides into the Pacific Ocean through an outlet pipe set on the seafloor offshore. For future assessments of the fate of these radionuclides in the environment, a sufficient understanding of the behavior of radionuclides in this ocean ecosystem before the start-up of the facility is necessary. To understand the processes by which radionuclides and various other types of elements are eliminated from seawater, we measured the sedimentation flux of suspended particles in the coastal waters off Rokkasho Village, where the marine discharge pipes will be placed. (author)

  8. Stack filter classifiers

    Energy Technology Data Exchange (ETDEWEB)

    Porter, Reid B [Los Alamos National Laboratory; Hush, Don [Los Alamos National Laboratory

    2009-01-01

    Just as linear models generalize the sample mean and weighted average, weighted order statistic models generalize the sample median and weighted median. This analogy can be continued informally to generalized additive models in the case of the mean, and Stack Filters in the case of the median. Both of these model classes have been extensively studied for signal and image processing, but it is surprising to find that for pattern classification, their treatment has been significantly one-sided. Generalized additive models are now a major tool in pattern classification and many different learning algorithms have been developed to fit model parameters to finite data. However, Stack Filters remain largely confined to signal and image processing and learning algorithms for classification are yet to be seen. This paper is a step towards Stack Filter Classifiers and it shows that the approach is interesting from both a theoretical and a practical perspective.
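
    Since the abstract builds on the analogy between weighted averages and weighted medians (the building block that Stack Filters generalize), a minimal sketch of a weighted median and its contrast with a weighted average may help; this is a generic illustration, not the classifier construction proposed in the paper.

      # Weighted median sketch: the order-statistic analogue of a weighted average.

      def weighted_median(values, weights):
          """Smallest value whose cumulative weight reaches half of the total weight."""
          pairs = sorted(zip(values, weights))
          half = sum(weights) / 2.0
          cumulative = 0.0
          for v, w in pairs:
              cumulative += w
              if cumulative >= half:
                  return v
          return pairs[-1][0]

      # A weighted average is pulled strongly by the outlier 100.0;
      # the weighted median of the same data is robust to it.
      values = [1.0, 2.0, 3.0, 100.0]
      weights = [1.0, 2.0, 2.0, 1.0]
      mean = sum(v * w for v, w in zip(values, weights)) / sum(weights)
      print("weighted average:", mean)                             # 18.5
      print("weighted median:", weighted_median(values, weights))  # 2.0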

  9. USCIS Backlog Elimination

    Data.gov (United States)

    Department of Homeland Security — USCIS is streamlining the way immigration benefits are delivered. By working smarter and eliminating redundancies, USCIS is bringing a business model to government....

  10. Wastewater Treatment Facilities

    Data.gov (United States)

    Iowa State University GIS Support and Research Facility — Individual permits for municipal, industrial, and semi-public wastewater treatment facilities in Iowa for the National Pollutant Discharge Elimination System (NPDES)...

  11. Universal leakage elimination

    International Nuclear Information System (INIS)

    Byrd, Mark S.; Lidar, Daniel A.; Wu, L.-A.; Zanardi, Paolo

    2005-01-01

    'Leakage' errors are particularly serious errors which couple states within a code subspace to states outside of that subspace, thus destroying the error protection benefit afforded by an encoded state. We generalize an earlier method for producing leakage elimination decoupling operations and examine the effects of the leakage eliminating operations on decoherence-free or noiseless subsystems which encode one logical, or protected qubit into three or four qubits. We find that by eliminating a large class of leakage errors, under some circumstances, we can create the conditions for a decoherence-free evolution. In other cases we identify a combined decoherence-free and quantum error correcting code which could eliminate errors in solid-state qubits with anisotropic exchange interaction Hamiltonians and enable universal quantum computing with only these interactions

  12. Disassembly and Sanitization of Classified Matter

    International Nuclear Information System (INIS)

    Stockham, Dwight J.; Saad, Max P.

    2008-01-01

    ... Pantex personnel have asked the DSO team to assist them with the destruction of their stored classified components. The DSO process is in full-scale operation and continues to grow and serve SNL/NM and DOE by providing a solution to this evolving disposal issue. For some time, SNL has incurred significant expenses for the management and storage of classified components. This project is estimated to save DOE and Sandia several hundreds of thousands of dollars while the excess inventory is eliminated. This innovative approach eliminates the need for long-term storage of classified weapons components and the associated monitoring and accounting expenditures

  13. Radioactive wastes eliminating device

    International Nuclear Information System (INIS)

    Mitsutsuka, Norimasa.

    1979-01-01

    Purpose: To eliminate impurities and radioactive wastes by passing liquid sodium through a cold trap and an adsorption device. Constitution: Heated sodium is partially extracted from the core of a nuclear reactor by a pump, flows into heat exchangers where it is cooled, and is then introduced into a cold trap for the removal of impurities. The liquid sodium, freed of impurities, is introduced into an adsorption separator and purified by the elimination of radioactive wastes. The purified sodium is returned to the nuclear reactor. A heater is provided between the cold trap and the adsorption separator so that the temperature of the liquid sodium introduced into the adsorption separator is not lower than the minimum temperature in the cold trap, thereby preventing the deposition of impurities in the adsorption separator. (Kawakami, Y.)

  14. Remediation Technologies Eliminate Contaminants

    Science.gov (United States)

    2012-01-01

    "All research and development has a story behind it," says Jacqueline Quinn, environmental engineer at Kennedy Space Center. For Quinn, one such story begins with the Saturn 1B launch stand at Kennedy and ends with a unique solution to a challenging environmental problem. Used in a number of Apollo missions and during the Skylab program, the Saturn 1B launch stand was dismantled following the transition to the Space Shuttle Program and stored in an open field at Kennedy. Decades later, the Center's Environmental Program Office discovered evidence of chemicals called polychlorinated biphenyls (PCBs) in the field's soil. The findings were puzzling, since PCBs, a toxin classified as a probable carcinogen by the Environmental Protection Agency (EPA), have been banned in the United States since 1979. Before the ban, PCBs were commonly used in transformer oils that leached into the ground when the oils were changed out and dumped near transformer sites, but there were no electrical transformers near the dismantled stand. It soon became apparent that the source of the PCBs was the launch stand itself. Prior to the ban, PCBs were used extensively in paints to add elasticity and other desirable characteristics. The PCB-laden paint on the Saturn 1B launch stand was flaking off into the field's soil. "Nobody knew there were PCBs in the paint," says Quinn, noting that the ingredient was not monitored carefully when it was in use in the 1960s. In fact, she says, the U.S. EPA was not even established until 1970, a year after Neil Armstrong first set foot on the Moon. "Nobody knew any better at the time," Quinn says, "but today, we have the responsibility to return any natural environmental media to as close to pristine a condition as possible." Quinn, fellow engineer Kathleen Loftin, and other Kennedy colleagues already had experience developing unprecedented solutions for environmental contamination; the team invented the emulsified zero-valent iron (EZVI) technology to safely treat

  15. Fingerprint prediction using classifier ensembles

    CSIR Research Space (South Africa)

    Molale, P

    2011-11-01

    ... ); logistic discrimination (LgD), k-nearest neighbour (k-NN), artificial neural network (ANN), association rules (AR), decision tree (DT), naive Bayes classifier (NBC) and the support vector machine (SVM). The performance of several multiple classifier systems...
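
    The abstract compares several base classifiers and multiple classifier systems built from them. As a generic illustration of such a system (not the paper's fingerprint features or data), the sketch below combines three of the listed base learners with majority voting in scikit-learn; the digits dataset is a stand-in.

      # Generic multiple-classifier-system sketch with majority voting
      # (illustrative; the fingerprint dataset itself is not reproduced here).
      from sklearn.datasets import load_digits
      from sklearn.ensemble import VotingClassifier
      from sklearn.model_selection import train_test_split
      from sklearn.naive_bayes import GaussianNB
      from sklearn.neighbors import KNeighborsClassifier
      from sklearn.tree import DecisionTreeClassifier

      X, y = load_digits(return_X_y=True)           # stand-in dataset for the example
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

      ensemble = VotingClassifier(
          estimators=[
              ("knn", KNeighborsClassifier(n_neighbors=5)),    # k-NN base learner
              ("dt", DecisionTreeClassifier(random_state=0)),  # decision tree base learner
              ("nbc", GaussianNB()),                           # naive Bayes base learner
          ],
          voting="hard",                                       # majority vote of the members
      )
      ensemble.fit(X_tr, y_tr)
      print("ensemble accuracy:", round(ensemble.score(X_te, y_te), 3))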

  16. Minding Rachlin's Eliminative Materialism

    Science.gov (United States)

    McDowell, J. J.

    2012-01-01

    Rachlin's teleological behaviorism eliminates the first-person ontology of conscious experience by identifying mental states with extended patterns of behavior, and thereby maintains the materialist ontology of science. An alternate view, informed by brain-based and externalist philosophies of mind, is shown also to maintain the materialist…

  17. Eliminating Perinatal HIV Transmission

    Centers for Disease Control (CDC) Podcasts

    In this podcast, CDC’s Dr. Steve Nesheim discusses perinatal HIV transmission, including the importance of preventing HIV among women, preconception care, and timely HIV testing of the mother. Dr. Nesheim also introduces the revised curriculum Eliminating Perinatal HIV Transmission intended for faculty of OB/GYN and pediatric residents and nurse midwifery students.

  18. Recognizing, Confronting, and Eliminating Workplace Bullying.

    Science.gov (United States)

    Berry, Peggy Ann; Gillespie, Gordon L; Fisher, Bonnie S; Gormley, Denise K

    2016-07-01

    Workplace bullying (WPB) behaviors negatively affect nurse productivity, satisfaction, and retention, and hinder safe patient care. The purpose of this article is to define WPB, differentiate between incivility and WPB, and recommend actions to prevent WPB behaviors. Informed occupational and environmental health nurses and nurse leaders must recognize, confront, and eliminate WPB in their facilities and organizations. Recognizing, confronting, and eliminating WPB behaviors in health care is a crucial first step toward sustained improvements in patient care quality and the health and safety of health care employees. © 2016 The Author(s).

  19. Classified

    CERN Multimedia

    Computer Security Team

    2011-01-01

    In the last issue of the Bulletin, we discussed recent implications for privacy on the Internet. But privacy of personal data is just one facet of data protection. Confidentiality is another one. However, confidentiality and data protection are often perceived as not relevant in the academic environment of CERN.   But think twice! At CERN, your personal data, e-mails, medical records, financial and contractual documents, MARS forms, group meeting minutes (and of course your password!) are all considered to be sensitive, restricted or even confidential. And this is not all. Physics results, in particular when they are preliminary and pending scrutiny, are sensitive, too. Just recently, an ATLAS collaborator copy/pasted the abstract of an ATLAS note onto an external public blog, despite the fact that this document was clearly marked as an "Internal Note". Such an act was not only embarrassing to the ATLAS collaboration, it also had a negative impact on CERN’s reputation --- i...

  20. Verification of the Accountability Method as a Means to Classify Radioactive Wastes Processed Using THOR Fluidized Bed Steam Reforming at the Studsvik Processing Facility in Erwin, Tennessee, USA - 13087

    Energy Technology Data Exchange (ETDEWEB)

    Olander, Jonathan [Studsvik Processing Facility Erwin, 151 T.C. Runnion Rd., Erwin, TN 37650 (United States); Myers, Corey [Studsvik, Inc., 5605 Glenridge Drive, Suite 705, Atlanta, GA 30342 (United States)

    2013-07-01

    Studsvik's Processing Facility Erwin (SPFE) has been treating Low-Level Radioactive Waste using its patented THOR process for over 13 years. Studsvik has been mixing and processing wastes of the same waste classification but different chemical and isotopic characteristics for the full extent of this period as a general matter of operations. Studsvik utilizes the accountability method to track the movement of radionuclides from acceptance of waste, through processing, and finally in the classification of waste for disposal. Recently the NRC has proposed to revise the 1995 Branch Technical Position on Concentration Averaging and Encapsulation (1995 BTP on CA) with additional clarification (draft BTP on CA). The draft BTP on CA has paved the way for large scale blending of higher activity and lower activity waste to produce a single waste for the purpose of classification. With the onset of blending in the waste treatment industry, there is concern from the public and state regulators as to the robustness of the accountability method and the ability of processors to prevent the inclusion of hot spots in waste. To address these concerns and verify the accountability method as applied by the SPFE, as well as the SPFE's ability to control waste package classification, testing of actual waste packages was performed. Testing consisted of a comprehensive dose rate survey of a container of processed waste. Separately, the waste package was modeled chemically and radiologically. Comparing the observed and theoretical data demonstrated that actual dose rates were lower than, but consistent with, modeled dose rates. Moreover, the distribution of radioactivity confirms that the SPFE can produce a radiologically homogeneous waste form. The results of the study demonstrate: 1) the accountability method as applied by the SPFE is valid and produces expected results; 2) the SPFE can produce a radiologically homogeneous waste; and 3) the SPFE can effectively control the

  1. Verification of the Accountability Method as a Means to Classify Radioactive Wastes Processed Using THOR Fluidized Bed Steam Reforming at the Studsvik Processing Facility in Erwin, Tennessee, USA - 13087

    International Nuclear Information System (INIS)

    Olander, Jonathan; Myers, Corey

    2013-01-01

    Studsvik's Processing Facility Erwin (SPFE) has been treating Low-Level Radioactive Waste using its patented THOR process for over 13 years. Studsvik has been mixing and processing wastes of the same waste classification but different chemical and isotopic characteristics for the full extent of this period as a general matter of operations. Studsvik utilizes the accountability method to track the movement of radionuclides from acceptance of waste, through processing, and finally in the classification of waste for disposal. Recently the NRC has proposed to revise the 1995 Branch Technical Position on Concentration Averaging and Encapsulation (1995 BTP on CA) with additional clarification (draft BTP on CA). The draft BTP on CA has paved the way for large scale blending of higher activity and lower activity waste to produce a single waste for the purpose of classification. With the onset of blending in the waste treatment industry, there is concern from the public and state regulators as to the robustness of the accountability method and the ability of processors to prevent the inclusion of hot spots in waste. To address these concerns and verify the accountability method as applied by the SPFE, as well as the SPFE's ability to control waste package classification, testing of actual waste packages was performed. Testing consisted of a comprehensive dose rate survey of a container of processed waste. Separately, the waste package was modeled chemically and radiologically. Comparing the observed and theoretical data demonstrated that actual dose rates were lower than, but consistent with, modeled dose rates. Moreover, the distribution of radioactivity confirms that the SPFE can produce a radiologically homogeneous waste form. The results of the study demonstrate: 1) the accountability method as applied by the SPFE is valid and produces expected results; 2) the SPFE can produce a radiologically homogeneous waste; and 3) the SPFE can effectively control the waste package

  2. Classifying Sluice Occurrences in Dialogue

    DEFF Research Database (Denmark)

    Baird, Austin; Hamza, Anissa; Hardt, Daniel

    2018-01-01

    ... perform manual annotation with acceptable inter-coder agreement. We build classifier models with Decision Trees and Naive Bayes, with an accuracy of 67%. We deploy a classifier to automatically classify sluice occurrences in OpenSubtitles, resulting in a corpus with 1.7 million occurrences. This will support.... Despite this, the corpus can be of great use in research on sluicing and the development of systems, and we are making the corpus freely available on request. Furthermore, we are in the process of improving the accuracy of sluice identification and annotation for the purpose of creating a subsequent version...

  3. Eliminating Perinatal HIV Transmission

    Centers for Disease Control (CDC) Podcasts

    2012-11-26

    In this podcast, CDC’s Dr. Steve Nesheim discusses perinatal HIV transmission, including the importance of preventing HIV among women, preconception care, and timely HIV testing of the mother. Dr. Nesheim also introduces the revised curriculum Eliminating Perinatal HIV Transmission intended for faculty of OB/GYN and pediatric residents and nurse midwifery students.  Created: 11/26/2012 by Division of HIV/AIDS Prevention.   Date Released: 11/26/2012.

  4. Quantum ensembles of quantum classifiers.

    Science.gov (United States)

    Schuld, Maria; Petruccione, Francesco

    2018-02-09

    Quantum machine learning witnesses an increasing amount of quantum algorithms for data-driven decision making, a problem with potential applications ranging from automated image recognition to medical diagnosis. Many of those algorithms are implementations of quantum classifiers, or models for the classification of data inputs with a quantum computer. Following the success of collective decision making with ensembles in classical machine learning, this paper introduces the concept of quantum ensembles of quantum classifiers. Creating the ensemble corresponds to a state preparation routine, after which the quantum classifiers are evaluated in parallel and their combined decision is accessed by a single-qubit measurement. This framework naturally allows for exponentially large ensembles in which - similar to Bayesian learning - the individual classifiers do not have to be trained. As an example, we analyse an exponentially large quantum ensemble in which each classifier is weighed according to its performance in classifying the training data, leading to new results for quantum as well as classical machine learning.
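
    A classical analogue may make the weighting idea concrete: each ensemble member is assigned a weight based on how well it classifies the training data, and the ensemble decision is the weighted vote over a large pool of untrained members. The sketch below is that classical analogue only; it does not implement the quantum state-preparation routine or single-qubit readout described in the paper, and the pool size, weighting rule and toy data are assumptions.

      # Classical analogue of an accuracy-weighted classifier ensemble
      # (illustration of the weighting idea only; no quantum circuitry involved).
      import numpy as np

      rng = np.random.default_rng(1)

      def make_stump(dim):
          """Random linear rule h(x) = sign(w.x); plays the role of an untrained ensemble member."""
          w = rng.normal(size=dim)
          return lambda X: np.sign(X @ w)

      # Toy linearly separable data.
      X = rng.normal(size=(200, 5))
      true_w = np.array([1.0, -2.0, 0.5, 0.0, 1.5])
      y = np.sign(X @ true_w)

      members = [make_stump(5) for _ in range(500)]           # large pool of untrained classifiers
      accuracies = np.array([np.mean(h(X) == y) for h in members])
      weights = np.clip(accuracies - 0.5, 0.0, None)          # weight only better-than-chance members

      def ensemble_predict(X_new):
          votes = np.stack([h(X_new) for h in members])       # shape: (members, samples)
          return np.sign(weights @ votes)                     # accuracy-weighted majority vote

      print("ensemble training accuracy:", np.mean(ensemble_predict(X) == y))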

  5. IAEA safeguards and classified materials

    International Nuclear Information System (INIS)

    Pilat, J.F.; Eccleston, G.W.; Fearey, B.L.; Nicholas, N.J.; Tape, J.W.; Kratzer, M.

    1997-01-01

    The international community in the post-Cold War period has suggested that the International Atomic Energy Agency (IAEA) utilize its expertise in support of the arms control and disarmament process in unprecedented ways. The pledges of the US and Russian presidents to place excess defense materials, some of which are classified, under some type of international inspections raises the prospect of using IAEA safeguards approaches for monitoring classified materials. A traditional safeguards approach, based on nuclear material accountancy, would seem unavoidably to reveal classified information. However, further analysis of the IAEA's safeguards approaches is warranted in order to understand fully the scope and nature of any problems. The issues are complex and difficult, and it is expected that common technical understandings will be essential for their resolution. Accordingly, this paper examines and compares traditional safeguards item accounting of fuel at a nuclear power station (especially spent fuel) with the challenges presented by inspections of classified materials. This analysis is intended to delineate more clearly the problems as well as reveal possible approaches, techniques, and technologies that could allow the adaptation of safeguards to the unprecedented task of inspecting classified materials. It is also hoped that a discussion of these issues can advance ongoing political-technical debates on international inspections of excess classified materials

  6. Hybrid classifiers methods of data, knowledge, and classifier combination

    CERN Document Server

    Wozniak, Michal

    2014-01-01

    This book delivers definite and compact knowledge on how hybridization can help improve the quality of computer classification systems. In order to make readers clearly grasp the idea of hybridization, the book primarily focuses on introducing the different levels of hybridization and illuminating what problems we will face when dealing with such projects. In the first instance the data and knowledge incorporated in hybridization were the action points, and then a still-growing area of classifier systems known as combined classifiers was considered. This book comprises the aforementioned state-of-the-art topics and the latest research results of the author and his team from the Department of Systems and Computer Networks, Wroclaw University of Technology, including classifiers based on feature space splitting, one-class classification, imbalanced data, and data stream classification.

  7. Risks: diagnosing and eliminating

    Directory of Open Access Journals (Sweden)

    Yuriy A. Tikhomirov

    2016-01-01

    Objective: to develop conceptual, theoretical and legal provisions and scientific recommendations on the identification, analysis and elimination of risk. Methods: the universal dialectic method of cognition, as well as general scientific and specific research methods based on it. Results: the system of risk diagnostics in the legal sphere was researched, together with the mechanism for influencing "risk situations" and their consequences, i.e. damage to the environment and harm to society. The concept of risk in the legal sphere was formulated, and the author's classification of risks in the legal sphere is presented. Rules for the analysis, evaluation and prevention of risks and a model risk management framework are elaborated. Scientific novelty: a mechanism for the identification, analysis and elimination of risk has been developed and introduced into scientific circulation; the author has proposed a classification and typology of risks, and the reasons and conditions promoting risk occurrence. Practical significance: the provisions and conclusions of the article can be used in scientific, lawmaking and law-enforcement activity, as well as in the educational process of higher educational establishments.

  8. 3D Bayesian contextual classifiers

    DEFF Research Database (Denmark)

    Larsen, Rasmus

    2000-01-01

    We extend a series of multivariate Bayesian 2-D contextual classifiers to 3-D by specifying a simultaneous Gaussian distribution for the feature vectors as well as a prior distribution of the class variables of a pixel and its 6 nearest 3-D neighbours.
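
    To make the contextual idea concrete, the sketch below combines a Gaussian class-conditional likelihood with a simple prior that favours agreement with the 6 face-neighbours of each voxel, updated by iterated conditional modes. It is a generic illustration of 3-D contextual Bayesian classification under assumed model parameters, not the specific simultaneous model of the paper.

      # Generic 3-D contextual Bayesian classification sketch (illustrative only).
      import numpy as np

      rng = np.random.default_rng(0)

      # Two classes with Gaussian likelihoods N(0, 1) and N(2, 1) on a 20^3 volume.
      means, sigma, beta = np.array([0.0, 2.0]), 1.0, 0.75
      truth = (rng.random((20, 20, 20)) < 0.5).astype(int)
      data = rng.normal(means[truth], sigma)

      def log_likelihood(data, c):
          return -0.5 * ((data - means[c]) / sigma) ** 2

      def neighbour_agreement(labels, c):
          """Count of the 6 face-neighbours sharing label c (edges wrap around for simplicity)."""
          agree = np.zeros(labels.shape)
          for axis in range(3):
              for shift in (1, -1):
                  agree += np.roll(labels, shift, axis=axis) == c
          return agree

      # Start from the non-contextual maximum-likelihood labels, then run ICM sweeps.
      labels = np.argmax(np.stack([log_likelihood(data, c) for c in (0, 1)]), axis=0)
      for _ in range(5):
          posterior = np.stack([log_likelihood(data, c) + beta * neighbour_agreement(labels, c)
                                for c in (0, 1)])
          labels = np.argmax(posterior, axis=0)

      print("agreement with ground truth:", np.mean(labels == truth))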

  9. Knowledge Uncertainty and Composed Classifier

    Czech Academy of Sciences Publication Activity Database

    Klimešová, Dana; Ocelíková, E.

    2007-01-01

    Roč. 1, č. 2 (2007), s. 101-105 ISSN 1998-0140 Institutional research plan: CEZ:AV0Z10750506 Keywords : Boosting architecture * contextual modelling * composed classifier * knowledge management, * knowledge * uncertainty Subject RIV: IN - Informatics, Computer Science

  10. Correlation Dimension-Based Classifier

    Czech Academy of Sciences Publication Activity Database

    Jiřina, Marcel; Jiřina jr., M.

    2014-01-01

    Roč. 44, č. 12 (2014), s. 2253-2263 ISSN 2168-2267 R&D Projects: GA MŠk(CZ) LG12020 Institutional support: RVO:67985807 Keywords : classifier * multidimensional data * correlation dimension * scaling exponent * polynomial expansion Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 3.469, year: 2014

  11. Adaptive Regularization of Neural Classifiers

    DEFF Research Database (Denmark)

    Andersen, Lars Nonboe; Larsen, Jan; Hansen, Lars Kai

    1997-01-01

    We present a regularization scheme which iteratively adapts the regularization parameters by minimizing the validation error. It is suggested to use the adaptive regularization scheme in conjunction with optimal brain damage pruning to optimize the architecture and to avoid overfitting. Furthermore, we propose an improved neural classification architecture eliminating an inherent redundancy in the widely used SoftMax classification network. Numerical results demonstrate the viability of the method
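
    The core loop described in the abstract, adapting a regularization parameter so as to reduce validation error, can be sketched in a few lines. The multiplicative update rule and the logistic-regression stand-in below are assumptions for illustration; the paper's scheme operates on neural network classifiers with gradient-based adaptation and pruning.

      # Sketch of validation-driven adaptation of a regularization parameter
      # (illustrative stand-in using logistic regression rather than a neural classifier).
      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import train_test_split

      X, y = make_classification(n_samples=600, n_features=30, n_informative=5, random_state=0)
      X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

      alpha, step = 1.0, 2.0          # initial weight-decay strength and multiplicative step
      best_err = np.inf
      for _ in range(20):
          # Try increasing and decreasing the regularization and keep whichever
          # candidate gives the lower validation error.
          candidates = [alpha * step, alpha / step]
          errs = []
          for a in candidates:
              model = LogisticRegression(C=1.0 / a, max_iter=1000).fit(X_tr, y_tr)
              errs.append(1.0 - model.score(X_val, y_val))
          if min(errs) >= best_err:
              break                   # no further improvement on the validation set
          best_err = min(errs)
          alpha = candidates[int(np.argmin(errs))]

      print(f"selected weight decay: {alpha:.4g}, validation error: {best_err:.3f}")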

  12. Electron-impact and pyrolytic eliminations from 4-tert-butylcyclohexyl xanthates

    International Nuclear Information System (INIS)

    Eadon, G.; Jefson, M.

    1976-01-01

    The stereochemistry of electron--impact induced xanthic acid elimination reactions was assessed by mass spectrographic studies of cis and trans deuterated 4-tert-butylcyclohexyl xanthates and their derivatives. Cis elimination was observed to be about 30 times as facile as trans elimination in the axial xanthate reaction. In the equatorial ester derivative reactions, the cis elimination was found to be slightly preferred. The electron-impact induced elimination results were compared with pyrolytic elimination results for the xanthates; and similar stereochemistry was observed for each type of elimination

  13. Energy-Efficient Neuromorphic Classifiers.

    Science.gov (United States)

    Martí, Daniel; Rigotti, Mattia; Seok, Mingoo; Fusi, Stefano

    2016-10-01

    Neuromorphic engineering combines the architectural and computational principles of systems neuroscience with semiconductor electronics, with the aim of building efficient and compact devices that mimic the synaptic and neural machinery of the brain. The energy consumptions promised by neuromorphic engineering are extremely low, comparable to those of the nervous system. Until now, however, the neuromorphic approach has been restricted to relatively simple circuits and specialized functions, thereby obfuscating a direct comparison of their energy consumption to that used by conventional von Neumann digital machines solving real-world tasks. Here we show that a recent technology developed by IBM can be leveraged to realize neuromorphic circuits that operate as classifiers of complex real-world stimuli. Specifically, we provide a set of general prescriptions to enable the practical implementation of neural architectures that compete with state-of-the-art classifiers. We also show that the energy consumption of these architectures, realized on the IBM chip, is typically two or more orders of magnitude lower than that of conventional digital machines implementing classifiers with comparable performance. Moreover, the spike-based dynamics display a trade-off between integration time and accuracy, which naturally translates into algorithms that can be flexibly deployed for either fast and approximate classifications, or more accurate classifications at the mere expense of longer running times and higher energy costs. This work finally proves that the neuromorphic approach can be efficiently used in real-world applications and has significant advantages over conventional digital devices when energy consumption is considered.

  14. College Choice in a Brand Elimination Framework: The Administrator's Perspective.

    Science.gov (United States)

    Rosen, Deborah E.; Curran, James M.; Greenlee, Timothy B.

    1998-01-01

    Through a survey of business programs, a study examined the nature and extent of student recruiting activities and classified them according to a "brand elimination" model. Timing and methods of recruiting were then compared to reports of enrollment changes. Results suggest that targeted recruitment activities aimed at creating awareness…

  15. 76 FR 34761 - Classified National Security Information

    Science.gov (United States)

    2011-06-14

    ... MARINE MAMMAL COMMISSION Classified National Security Information [Directive 11-01] AGENCY: Marine... Commission's (MMC) policy on classified information, as directed by Information Security Oversight Office... of Executive Order 13526, ``Classified National Security Information,'' and 32 CFR part 2001...

  16. Eliminating Rabies in Estonia

    Science.gov (United States)

    Cliquet, Florence; Robardet, Emmanuelle; Must, Kylli; Laine, Marjana; Peik, Katrin; Picard-Meyer, Evelyne; Guiot, Anne-Laure; Niin, Enel

    2012-01-01

    The compulsory vaccination of pets, the recommended vaccination of farm animals in grazing areas and the extermination of stray animals did not succeed in eliminating rabies in Estonia because the virus was maintained in two main wildlife reservoirs, foxes and raccoon dogs. These two species became a priority target therefore in order to control rabies. Supported by the European Community, successive oral vaccination (OV) campaigns were conducted twice a year using Rabigen® SAG2 baits, beginning in autumn 2005 in North Estonia. They were then extended to the whole territory from spring 2006. Following the vaccination campaigns, the incidence of rabies cases dramatically decreased, with 266 cases in 2005, 114 in 2006, four in 2007 and three in 2008. Since March 2008, no rabies cases have been detected in Estonia other than three cases reported in summer 2009 and one case in January 2011, all in areas close to the South-Eastern border with Russia. The bait uptake was satisfactory, with tetracycline positivity rates ranging from 85% to 93% in foxes and from 82% to 88% in raccoon dogs. Immunisation rates evaluated by ELISA ranged from 34% to 55% in foxes and from 38% to 55% in raccoon dogs. The rabies situation in Estonia was compared to that of the other two Baltic States, Latvia and Lithuania. Despite regular OV campaigns conducted throughout their territory since 2006, and an improvement in the epidemiological situation, rabies has still not been eradicated in these countries. An analysis of the number of baits distributed and the funding allocated by the European Commission showed that the strategy for rabies control is more cost-effective in Estonia than in Latvia and Lithuania. PMID:22393461

  17. Detection of microaneurysms in retinal images using an ensemble classifier

    Directory of Open Access Journals (Sweden)

    M.M. Habib

    2017-01-01

    Full Text Available This paper introduces, and reports on the performance of, a novel combination of algorithms for automated microaneurysm (MA) detection in retinal images. The presence of MAs in retinal images is a pathognomonic sign of Diabetic Retinopathy (DR), which is one of the leading causes of blindness amongst the working age population. An extensive survey of the literature is presented and current techniques in the field are summarised. The proposed technique first detects an initial set of candidates using a Gaussian Matched Filter and then classifies this set to reduce the number of false positives. A Tree Ensemble classifier is used with a set of 70 features (the most common features in the literature). A new set of 32 MA groundtruth images (with a total of 256 labelled MAs) based on images from the MESSIDOR dataset is introduced as a public dataset for benchmarking MA detection algorithms. We evaluate our algorithm on this dataset as well as another public dataset (DIARETDB1 v2.1) and compare it against the best available alternative. Results show that the proposed classifier is superior in terms of eliminating false positive MA detection from the initial set of candidates. The proposed method achieves an ROC score of 0.415 compared to 0.2636 achieved by the best available technique. Furthermore, results show that the classifier model maintains consistent performance across datasets, illustrating the generalisability of the classifier and that overfitting does not occur.
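
    A minimal sketch (not the authors' code) of the two-stage idea described above: a Gaussian matched filter proposes candidate dark spots, and a tree-ensemble classifier prunes false positives. The feature set, thresholds and data names here are illustrative assumptions, far simpler than the 70 features used in the paper.

        # Illustrative two-stage MA detector sketch; all parameters are placeholders.
        import numpy as np
        from scipy import ndimage
        from sklearn.ensemble import RandomForestClassifier

        def candidate_map(green_channel, sigma=2.0, percentile=99.5):
            """Correlate the inverted green channel with a Gaussian kernel and threshold."""
            response = ndimage.gaussian_filter(-green_channel.astype(float), sigma)
            labels, n = ndimage.label(response > np.percentile(response, percentile))
            return labels, n, response

        def candidate_features(labels, n, response, image):
            """A few simple per-candidate features; the paper uses a much richer set."""
            feats = []
            for i in range(1, n + 1):
                mask = labels == i
                feats.append([mask.sum(),             # candidate area
                              response[mask].max(),   # peak matched-filter response
                              image[mask].mean(),     # mean intensity
                              image[mask].std()])     # intensity spread
            return np.asarray(feats)

        # Training (placeholders): X_train / y_train are labelled candidate features.
        # clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
        # At test time, keep candidates whose clf.predict_proba(feats)[:, 1] exceeds a cutoff.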

  18. Waste classifying and separation device

    International Nuclear Information System (INIS)

    Kakiuchi, Hiroki.

    1997-01-01

    A flexible plastic bag containing solid wastes of indefinite shape is broken and the wastes are classified. The bag-cutting portion of the device has an ultrasonic-type or a heater-type cutting means, and the cutting means moves in parallel with the transferring direction of the plastic bags. A classification portion separates and discriminates the plastic bag from the contents and conducts classification while rotating a classification table. Accordingly, the plastic bag containing solids of indefinite shape can be broken and classification can be conducted efficiently and reliably. The device of the present invention has a simple structure which requires small installation space and enables easy maintenance. (T.M.)

  19. Defining and Classifying Interest Groups

    DEFF Research Database (Denmark)

    Baroni, Laura; Carroll, Brendan; Chalmers, Adam

    2014-01-01

    The interest group concept is defined in many different ways in the existing literature and a range of different classification schemes are employed. This complicates comparisons between different studies and their findings. One of the important tasks faced by interest group scholars engaged...... in large-N studies is therefore to define the concept of an interest group and to determine which classification scheme to use for different group types. After reviewing the existing literature, this article sets out to compare different approaches to defining and classifying interest groups with a sample...... in the organizational attributes of specific interest group types. As expected, our comparison of coding schemes reveals a closer link between group attributes and group type in narrower classification schemes based on group organizational characteristics than those based on a behavioral definition of lobbying....

  20. Elimination device for metal impurities

    International Nuclear Information System (INIS)

    Yanagisawa, Ko.

    1982-01-01

    Purpose: To enable reuse of adsorbing materials by eliminating Fe3O4 films whose adsorbing performance has been reduced, by way of electrolytic polishing, and then forming fresh films using high-temperature steam. Constitution: An elimination device is provided in a coolant clean-up system of a reactor for eliminating impurities such as cobalt. The elimination device comprises adsorbing materials made of stainless steel tips or the like having Fe3O4 films. The adsorbing materials are regenerated by applying an electric current between grid-like cathode plates and anode plates to leach out the Fe3O4 films, washing out the electrolytic solution with cleaning water, and then applying high-temperature steam onto the adsorbing materials to form fresh Fe3O4 films again. The regeneration of the adsorbing materials enables efficient elimination of Co-60 and the like in the primary coolant. (Moriyama, K.)

  1. General and Local: Averaged k-Dependence Bayesian Classifiers

    Directory of Open Access Journals (Sweden)

    Limin Wang

    2015-06-01

    Full Text Available The inference of a general Bayesian network has been shown to be an NP-hard problem, even for approximate solutions. Although the k-dependence Bayesian (KDB) classifier can construct at arbitrary points (values of k) along the attribute dependence spectrum, it cannot identify the changes of interdependencies when attributes take different values. Local KDB, which learns in the framework of KDB, is proposed in this study to describe the local dependencies implicated in each test instance. Based on the analysis of functional dependencies, substitution-elimination resolution, a new type of semi-naive Bayesian operation, is proposed to substitute or eliminate generalization to achieve accurate estimation of conditional probability distribution while reducing computational complexity. The final classifier, averaged k-dependence Bayesian (AKDB) classifiers, will average the output of KDB and local KDB. Experimental results on the repository of machine learning databases from the University of California Irvine (UCI) showed that AKDB has significant advantages in zero-one loss and bias relative to naive Bayes (NB), tree augmented naive Bayes (TAN), averaged one-dependence estimators (AODE), and KDB. Moreover, KDB and local KDB show mutually complementary characteristics with respect to variance.
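
    A hedged sketch of the averaging step described above: AKDB averages the class posteriors of a global model (KDB) and a per-instance local model (local KDB). KDB itself is not a standard scikit-learn estimator, so two naive-Bayes variants stand in here purely to illustrate how two posterior distributions can be averaged; the dataset and parameters are placeholders.

        # Average the posteriors of a "global" and a "local" stand-in model.
        import numpy as np
        from sklearn.datasets import load_iris
        from sklearn.model_selection import train_test_split
        from sklearn.naive_bayes import GaussianNB

        X, y = load_iris(return_X_y=True)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

        global_model = GaussianNB().fit(X_tr, y_tr)                    # stands in for KDB
        local_model = GaussianNB(var_smoothing=1e-2).fit(X_tr, y_tr)   # stands in for local KDB

        posterior = (global_model.predict_proba(X_te) + local_model.predict_proba(X_te)) / 2.0
        y_pred = posterior.argmax(axis=1)
        print("accuracy:", (y_pred == y_te).mean())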

  2. Comparing materials used in mist eliminators

    Energy Technology Data Exchange (ETDEWEB)

    Looney, B.; Baleno, B.; Boles, G.L.; Telow, J. [Solvay Advanced Polymers (United States)

    2007-11-15

    Wet flue gas desulfurization (FGD) systems, or wet scrubbers, are notoriously capital- and maintenance-intensive. Mist eliminators are an integral part of most wet FGD systems. These are available in a variety of materials - polypropylene, fiberglass reinforced polymer (FRP), polysulfone and stainless steel. The article discusses the material properties, performance attributes and relative cost differences associated with each of these four materials. It describes the common problems with mist eliminators - fouling and corrosion. These can be minimised by routine cleaning and use of chemical additives to prevent deposition. An analysis was carried out to compare the four materials at APS Cholla power plant. As a result, the facility is retrofitting its remaining wet scrubber towers in Unit 2 with mist eliminators constructed from polysulfone as each of the existing polypropylene units needs replacing. Polysulfone is cheaper to clean and components require replacing less frequently than polypropylene. Switching from stainless steel to polypropylene has proved advantageous on 22 wet scrubbers operated by PPL Montana. 5 figs. 2 tabs.

  3. 33 CFR 154.1216 - Facility classification.

    Science.gov (United States)

    2010-07-01

    ... 33 Navigation and Navigable Waters 2 2010-07-01 2010-07-01 false Facility classification. 154.1216... Vegetable Oils Facilities § 154.1216 Facility classification. (a) The Coast Guard classifies facilities that... classification of a facility that handles, stores, or transports animal fats or vegetable oils. The COTP may...

  4. 340 waste handling facility interim safety basis

    Energy Technology Data Exchange (ETDEWEB)

    VAIL, T.S.

    1999-04-01

    This document presents an interim safety basis for the 340 Waste Handling Facility classifying the 340 Facility as a Hazard Category 3 facility. The hazard analysis quantifies the operating safety envelope for this facility and demonstrates that the facility can be operated without a significant threat to onsite or offsite people.

  5. 340 waste handling facility interim safety basis

    International Nuclear Information System (INIS)

    VAIL, T.S.

    1999-01-01

    This document presents an interim safety basis for the 340 Waste Handling Facility classifying the 340 Facility as a Hazard Category 3 facility. The hazard analysis quantifies the operating safety envelope for this facility and demonstrates that the facility can be operated without a significant threat to onsite or offsite people

  6. A History of Classified Activities at Oak Ridge National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Quist, A.S.

    2001-01-30

    The facilities that became Oak Ridge National Laboratory (ORNL) were created in 1943 during the United States' super-secret World War II project to construct an atomic bomb (the Manhattan Project). During World War II and for several years thereafter, essentially all ORNL activities were classified. Now, in 2000, essentially all ORNL activities are unclassified. The major purpose of this report is to provide a brief history of ORNL's major classified activities from 1943 until the present (September 2000). This report is expected to be useful to the ORNL Classification Officer and to ORNL's Authorized Derivative Classifiers and Authorized Derivative Declassifiers in their classification review of ORNL documents, especially those documents that date from the 1940s and 1950s.

  7. Proportional counter end effects eliminator

    International Nuclear Information System (INIS)

    Meekins, J.F.

    1976-01-01

    An improved gas-filled proportional counter which includes a resistor network connected between the anode and cathode at the ends of the counter in order to eliminate ''end effects'' is described. 3 Claims, 2 Drawing Figures

  8. Defense Logistics Agency Revenue Eliminations

    National Research Council Canada - National Science Library

    1996-01-01

    The issue of revenue eliminations was identified during our work on the Defense Logistics Agency portion of the Audit of Revenue Accounts in the FY 1996 Financial Statements of the Defense Business Operations Fund...

  9. Composite Classifiers for Automatic Target Recognition

    National Research Council Canada - National Science Library

    Wang, Lin-Cheng

    1998-01-01

    ...) using forward-looking infrared (FLIR) imagery. Two existing classifiers, one based on learning vector quantization and the other on modular neural networks, are used as the building blocks for our composite classifiers...

  10. National Pollution Discharge Elimination System (NPDES) Wastewater Treatment Plant Points, Region 9, 2007, US EPA Region 9

    Data.gov (United States)

    U.S. Environmental Protection Agency — Point geospatial dataset representing locations of NPDES Waste Water Treatment Plant Facilities. NPDES (National Pollution Discharge Elimination System) is an EPA...

  11. National Pollution Discharge Elimination System (NPDES) Wastewater Treatment Plant Points, Region 9, 2011, US EPA Region 9

    Data.gov (United States)

    U.S. Environmental Protection Agency — Point geospatial dataset representing locations of NPDES Waste Water Treatment Plant Facilities. NPDES (National Pollution Discharge Elimination System) is an EPA...

  12. National Pollution Discharge Elimination System (NPDES) Wastewater Treatment Plant Points, Region 9, 2012, US EPA Region 9

    Data.gov (United States)

    U.S. Environmental Protection Agency — Point geospatial dataset representing locations of NPDES Waste Water Treatment Plant Facilities. NPDES (National Pollution Discharge Elimination System) is an EPA...

  13. Aggregation Operator Based Fuzzy Pattern Classifier Design

    DEFF Research Database (Denmark)

    Mönks, Uwe; Larsen, Henrik Legind; Lohweg, Volker

    2009-01-01

    This paper presents a novel modular fuzzy pattern classifier design framework for intelligent automation systems, developed on the basis of the established Modified Fuzzy Pattern Classifier (MFPC), which allows designing novel classifier models that are hardware-efficiently implementable. The performances of novel classifiers using substitutes of MFPC's geometric mean aggregator are benchmarked in the scope of an image processing application against the MFPC to reveal classification improvement potentials for obtaining higher classification rates.

  14. 15 CFR 4.8 - Classified Information.

    Science.gov (United States)

    2010-01-01

    ... 15 Commerce and Foreign Trade 1 2010-01-01 2010-01-01 false Classified Information. 4.8 Section 4... INFORMATION Freedom of Information Act § 4.8 Classified Information. In processing a request for information..., the information shall be reviewed to determine whether it should remain classified. Ordinarily the...

  15. Elimination reactions. V. Steric effects in Hofmann elimination

    International Nuclear Information System (INIS)

    Coke, J.L.; Smith, G.D.; Britton, G.H. Jr.

    1975-01-01

    Earlier Hofmann elimination studies were extended, and the percent syn eliminations in several ring systems have been correlated using cis-d1 and trans-d1 models. The measurements of several syn and anti k_H/k_D kinetic isotope effects are reported. Results indicate that Hofmann elimination of N,N,N-trimethyl-3,3-dimethylcyclopentylammonium hydroxide goes by a 97 percent syn mechanism to give 3,3-dimethylcyclopentene and by a 70 ± 6 percent syn mechanism to give 4,4-dimethylcyclopentene. There appear to be severe steric interactions in the anti mechanism in the 3,3-dimethylcyclopentyl system. Results indicate that, for Hofmann pyrolysis of trimethylammonium hydroxides, cyclopentene is formed by a 39 ± 7 percent syn mechanism, cyclohexene is formed by a 2 ± 2 percent syn mechanism, and cycloheptene is formed by a 30 ± 2 percent syn mechanism. Steric effects on isotope effects and mechanisms are discussed. (U.S.)

  16. Refurbishment of JMTR pure water facility

    International Nuclear Information System (INIS)

    Asano, Norikazu; Hanakawa, Hiroki; Kusunoki, Hidehiko; Satou, Shinichi

    2012-05-01

    In the refurbishment of JMTR, facilities were classified into those which (1) were fully updated, (2) were partly updated, and (3) were kept in continued use, based on consideration of the maintenance history, the availability of replacement parts, and the latest technology. Based on this consideration, the JMTR pure water facility was classified as a fully updated facility. The update construction was conducted between FY2007 and FY2008. The refurbishment of the JMTR pure water facility is summarized in this report. (author)

  17. The economics of malaria control and elimination: a systematic review.

    Science.gov (United States)

    Shretta, Rima; Avanceña, Anton L V; Hatefi, Arian

    2016-12-12

    Declining donor funding and competing health priorities threaten the sustainability of malaria programmes. Elucidating the cost and benefits of continued investments in malaria could encourage sustained political and financial commitments. The evidence, although available, remains disparate. This paper reviews the existing literature on the economic and financial cost and return of malaria control, elimination and eradication. A review of articles that were published on or before September 2014 on the cost and benefits of malaria control and elimination was performed. Studies were classified based on their scope and were analysed according to two major categories: cost of malaria control and elimination to a health system, and cost-benefit studies. Only studies involving more than two control or elimination interventions were included. Outcomes of interest were total programmatic cost, cost per capita, and benefit-cost ratios (BCRs). All costs were converted to 2013 US$ for standardization. Of the 6425 articles identified, 54 studies were included in this review. Twenty-two were focused on elimination or eradication while 32 focused on intensive control. Forty-eight per cent of studies included in this review were published on or after 2000. Overall, the annual per capita cost of malaria control to a health system ranged from $0.11 to $39.06 (median: $2.21) while that for malaria elimination ranged from $0.18 to $27 (median: $3.00). BCRs of investing in malaria control and elimination ranged from 2.4 to over 145. Overall, investments needed for malaria control and elimination varied greatly amongst the various countries and contexts. In most cases, the cost of elimination was greater than the cost of control. At the same time, the benefits of investing in malaria greatly outweighed the costs. While the cost of elimination in most cases was greater than the cost of control, the benefits greatly outweighed the cost. Information from this review provides guidance to

  18. Error minimizing algorithms for nearest neighbor classifiers

    Energy Technology Data Exchange (ETDEWEB)

    Porter, Reid B [Los Alamos National Laboratory; Hush, Don [Los Alamos National Laboratory; Zimmer, G. Beate [TEXAS A&M

    2011-01-03

    Stack Filters define a large class of discrete nonlinear filters first introduced in image and signal processing for noise removal. In recent years we have suggested their application to classification problems, and investigated their relationship to other types of discrete classifiers such as Decision Trees. In this paper we focus on a continuous domain version of Stack Filter Classifiers which we call Ordered Hypothesis Machines (OHM), and investigate their relationship to Nearest Neighbor classifiers. We show that OHM classifiers provide a novel framework in which to train Nearest Neighbor type classifiers by minimizing empirical error based loss functions. We use the framework to investigate a new cost sensitive loss function that allows us to train a Nearest Neighbor type classifier for low false alarm rate applications. We report results on both synthetic data and real-world image data.
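
    OHM is not a public library, so the sketch below only illustrates the general idea the abstract points at: tuning a nearest-neighbour decision rule for a low false-alarm rate by requiring a strong neighbour vote before declaring the positive (alarm) class. The threshold, data names and the assumption that the positive class is labelled 1 are all placeholders.

        # Cost-sensitive use of a k-NN classifier: only alarm when the vote is strong.
        from sklearn.neighbors import KNeighborsClassifier

        def predict_low_false_alarm(knn, X, alarm_threshold=0.9):
            """Declare the positive class only when the neighbour vote exceeds the threshold."""
            p_alarm = knn.predict_proba(X)[:, 1]   # assumes positive class is labelled 1
            return (p_alarm >= alarm_threshold).astype(int)

        # knn = KNeighborsClassifier(n_neighbors=15).fit(X_train, y_train)
        # y_hat = predict_low_false_alarm(knn, X_test, alarm_threshold=0.9)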

  19. A Novel Cascade Classifier for Automatic Microcalcification Detection.

    Directory of Open Access Journals (Sweden)

    Seung Yeon Shin

    Full Text Available In this paper, we present a novel cascaded classification framework for automatic detection of individual and clusters of microcalcifications (μCs). Our framework comprises three classification stages: (i) a random forest (RF) classifier for simple features capturing the second order local structure of individual μCs, where non-μC pixels in the target mammogram are efficiently eliminated; (ii) a more complex discriminative restricted Boltzmann machine (DRBM) classifier for μC candidates determined in the RF stage, which automatically learns the detailed morphology of μC appearances for improved discriminative power; and (iii) a detector to detect clusters of μCs from the individual μC detection results, using two different criteria. From the two-stage RF-DRBM classifier, we are able to distinguish μCs using explicitly computed features, as well as learn implicit features that are able to further discriminate between confusing cases. Experimental evaluation is conducted on the original Mammographic Image Analysis Society (MIAS) and mini-MIAS databases, as well as our own Seoul National University Bundang Hospital digital mammographic database. It is shown that the proposed method outperforms comparable methods in terms of receiver operating characteristic (ROC) and precision-recall curves for detection of individual μCs and free-response receiver operating characteristic (FROC) curve for detection of clustered μCs.
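
    A hedged sketch of a two-stage cascade in the spirit of the RF → DRBM design above. scikit-learn has no discriminative RBM, so an RBM feature extractor feeding a logistic regression stands in for the second stage; the survivor cutoff, feature matrices and labels are placeholder assumptions.

        # Two-stage cascade sketch: cheap first stage, richer second stage on survivors.
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.linear_model import LogisticRegression
        from sklearn.neural_network import BernoulliRBM
        from sklearn.pipeline import Pipeline

        def train_cascade(X_simple, X_rich, y):
            """Stage 1 sees cheap features on all candidates; stage 2 sees only stage-1 survivors."""
            stage1 = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_simple, y)
            keep = stage1.predict_proba(X_simple)[:, 1] > 0.1   # loose cut: drop easy negatives
            stage2 = Pipeline([("rbm", BernoulliRBM(n_components=64, random_state=0)),
                               ("clf", LogisticRegression(max_iter=1000))]).fit(X_rich[keep], y[keep])
            return stage1, stage2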

  20. Comprehensive benchmarking and ensemble approaches for metagenomic classifiers.

    Science.gov (United States)

    McIntyre, Alexa B R; Ounit, Rachid; Afshinnekoo, Ebrahim; Prill, Robert J; Hénaff, Elizabeth; Alexander, Noah; Minot, Samuel S; Danko, David; Foox, Jonathan; Ahsanuddin, Sofia; Tighe, Scott; Hasan, Nur A; Subramanian, Poorani; Moffat, Kelly; Levy, Shawn; Lonardi, Stefano; Greenfield, Nick; Colwell, Rita R; Rosen, Gail L; Mason, Christopher E

    2017-09-21

    One of the main challenges in metagenomics is the identification of microorganisms in clinical and environmental samples. While an extensive and heterogeneous set of computational tools is available to classify microorganisms using whole-genome shotgun sequencing data, comprehensive comparisons of these methods are limited. In this study, we use the largest-to-date set of laboratory-generated and simulated controls across 846 species to evaluate the performance of 11 metagenomic classifiers. Tools were characterized on the basis of their ability to identify taxa at the genus, species, and strain levels, quantify relative abundances of taxa, and classify individual reads to the species level. Strikingly, the number of species identified by the 11 tools can differ by over three orders of magnitude on the same datasets. Various strategies can ameliorate taxonomic misclassification, including abundance filtering, ensemble approaches, and tool intersection. Nevertheless, these strategies were often insufficient to completely eliminate false positives from environmental samples, which are especially important where they concern medically relevant species. Overall, pairing tools with different classification strategies (k-mer, alignment, marker) can combine their respective advantages. This study provides positive and negative controls, titrated standards, and a guide for selecting tools for metagenomic analyses by comparing ranges of precision, accuracy, and recall. We show that proper experimental design and analysis parameters can reduce false positives, provide greater resolution of species in complex metagenomic samples, and improve the interpretation of results.
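
    One simple way to realise the "tool intersection" strategy mentioned above is to keep only taxa reported by at least k of the classifiers. The sketch below is a hedged illustration, not any of the benchmarked tools; tool outputs are assumed to be dictionaries mapping taxon names to abundances, and the example calls are made up.

        # Keep only taxa called by at least `min_tools` of the classifiers.
        from collections import Counter

        def intersect_calls(tool_outputs, min_tools=2):
            """Return the set of taxa reported by at least `min_tools` tools."""
            votes = Counter(taxon for calls in tool_outputs for taxon in calls)
            return {taxon for taxon, n in votes.items() if n >= min_tools}

        calls = [{"E. coli": 0.40, "S. aureus": 0.10},     # hypothetical tool 1
                 {"E. coli": 0.50, "B. subtilis": 0.05},   # hypothetical tool 2
                 {"E. coli": 0.45, "S. aureus": 0.08}]     # hypothetical tool 3
        print(intersect_calls(calls, min_tools=2))          # E. coli and S. aureus survive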

  1. Technique eliminates high voltage arcing at electrode-insulator contact area

    Science.gov (United States)

    Mealy, G.

    1967-01-01

    Coating the electrode-insulator contact area with silver epoxy conductive paint and forcing the electrode and insulator tightly together into a permanent connection eliminates electrical arcing in high-voltage electrodes supplying electrical power to vacuum facilities.

  2. Hierarchical mixtures of naive Bayes classifiers

    NARCIS (Netherlands)

    Wiering, M.A.

    2002-01-01

    Naive Bayes classifiers tend to perform very well on a large number of problem domains, although their representation power is quite limited compared to more sophisticated machine learning algorithms. In this paper we study combining multiple naive Bayes classifiers by using the hierarchical

  3. Comparing classifiers for pronunciation error detection

    NARCIS (Netherlands)

    Strik, H.; Truong, K.; Wet, F. de; Cucchiarini, C.

    2007-01-01

    Providing feedback on pronunciation errors in computer assisted language learning systems requires that pronunciation errors be detected automatically. In the present study we compare four types of classifiers that can be used for this purpose: two acoustic-phonetic classifiers (one of which employs

  4. Feature extraction for dynamic integration of classifiers

    NARCIS (Netherlands)

    Pechenizkiy, M.; Tsymbal, A.; Puuronen, S.; Patterson, D.W.

    2007-01-01

    Recent research has shown the integration of multiple classifiers to be one of the most important directions in machine learning and data mining. In this paper, we present an algorithm for the dynamic integration of classifiers in the space of extracted features (FEDIC). It is based on the technique

  5. Deconvolution When Classifying Noisy Data Involving Transformations

    KAUST Repository

    Carroll, Raymond

    2012-09-01

    In the present study, we consider the problem of classifying spatial data distorted by a linear transformation or convolution and contaminated by additive random noise. In this setting, we show that classifier performance can be improved if we carefully invert the data before the classifier is applied. However, the inverse transformation is not constructed so as to recover the original signal, and in fact, we show that taking the latter approach is generally inadvisable. We introduce a fully data-driven procedure based on cross-validation, and use several classifiers to illustrate numerical properties of our approach. Theoretical arguments are given in support of our claims. Our procedure is applied to data generated by light detection and ranging (Lidar) technology, where we improve on earlier approaches to classifying aerosols. This article has supplementary materials online.

  6. Deconvolution When Classifying Noisy Data Involving Transformations.

    Science.gov (United States)

    Carroll, Raymond; Delaigle, Aurore; Hall, Peter

    2012-09-01

    In the present study, we consider the problem of classifying spatial data distorted by a linear transformation or convolution and contaminated by additive random noise. In this setting, we show that classifier performance can be improved if we carefully invert the data before the classifier is applied. However, the inverse transformation is not constructed so as to recover the original signal, and in fact, we show that taking the latter approach is generally inadvisable. We introduce a fully data-driven procedure based on cross-validation, and use several classifiers to illustrate numerical properties of our approach. Theoretical arguments are given in support of our claims. Our procedure is applied to data generated by light detection and ranging (Lidar) technology, where we improve on earlier approaches to classifying aerosols. This article has supplementary materials online.

  7. Deconvolution When Classifying Noisy Data Involving Transformations

    KAUST Repository

    Carroll, Raymond; Delaigle, Aurore; Hall, Peter

    2012-01-01

    In the present study, we consider the problem of classifying spatial data distorted by a linear transformation or convolution and contaminated by additive random noise. In this setting, we show that classifier performance can be improved if we carefully invert the data before the classifier is applied. However, the inverse transformation is not constructed so as to recover the original signal, and in fact, we show that taking the latter approach is generally inadvisable. We introduce a fully data-driven procedure based on cross-validation, and use several classifiers to illustrate numerical properties of our approach. Theoretical arguments are given in support of our claims. Our procedure is applied to data generated by light detection and ranging (Lidar) technology, where we improve on earlier approaches to classifying aerosols. This article has supplementary materials online.
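
    A hedged sketch (not the authors' procedure) of the idea behind the three records above: carefully invert a known convolution before applying a classifier. Here a Wiener-style regularised Fourier inversion is used, with a fixed damping weight standing in for the cross-validated choice described in the paper; the blur kernel, data arrays and classifier are placeholder assumptions.

        # Regularised deconvolution of 1-D signals prior to classification.
        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        def deconvolve(signal, kernel, reg=0.1):
            """Wiener-style inversion: divide in Fourier space with a ridge-like damping term."""
            n = len(signal)
            K = np.fft.rfft(kernel, n)
            S = np.fft.rfft(signal, n)
            return np.fft.irfft(S * np.conj(K) / (np.abs(K) ** 2 + reg), n)

        # X_blurred: rows are convolved, noisy observations; kernel: the assumed known blur.
        # X_clean = np.apply_along_axis(deconvolve, 1, X_blurred, kernel, reg=0.1)
        # clf = LinearDiscriminantAnalysis().fit(X_clean, y_train)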

  8. Cancer risks: Strategies for elimination

    International Nuclear Information System (INIS)

    Bannasch, P.

    1987-01-01

    This book deals with the possibilities for identifying and eliminating cancer risk factors. The current state of knowledge on the detection, assessment and elimination of chemical, physical (radiation), and biological (viruses) risk factors are comprehensively presented in 15 contributions. Chemical risk factors resulting from smoking and environmental contamination are given special attention. The coverage of cancer risks by radiation includes some of the consequences of the Chernobyl disaster. Finally, the discussion of the possible risks that certain viruses hold for cancer in man is intended to further the development of vaccinations against these viral infections. The information is directed not only at specialists, but also at a wider interested audience. Its primary aim is to convey established findings that are already being used for cancer prevention. Furthermore, the book aims to promote more intense research in the field of primary cancer prevention. Contents: General aspects; chemical carcinogens: Risk assessment; chemical carcinogens: Primary prevention; physical carcinogens - Oncogenic viruses and subject index

  9. How To Eliminate Narcissism Overnight

    Science.gov (United States)

    2011-01-01

    The Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition appears likely to eliminate the diagnosis of narcissistic personality disorder. There are significant problems with the discriminant validity of the current narcissistic personality disorder criteria set; furthermore, the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition's narrow focus on “grandiosity” probably contributes to the wide disparity between low narcissistic personality disorder prevalence rates in epidemiological studies and high rates of narcissistic personality disorder in clinical practice. Nevertheless, the best course of action may be to refine the narcissistic personality disorder criteria, followed by careful field testing and a search for biomarkers, rather than wholesale elimination of the narcissistic personality disorder category. The construct of “malignant narcissism” is also worthy of more intense empirical investigation. PMID:21468294

  10. Challenges beyond elimination in leprosy.

    Science.gov (United States)

    Naaz, Farah; Mohanty, Partha Sarathi; Bansal, Avi Kumar; Kumar, Dilip; Gupta, Umesh Datta

    2017-01-01

    Every year >200,000 new leprosy cases are registered globally. This number has been fairly stable over the past 8 years. The World Health Organization has set a target to interrupt the transmission of leprosy globally by 2020. It is important, in terms of global action and research activities, to consider the eventuality of multidrug therapy (MDT) resistance developing. It is necessary to measure disease burden comprehensively, and contact-centered preventive interventions should be part of a global elimination strategy. Drug resistance is the reduction in effectiveness of a drug such as an antimicrobial or an antineoplastic in curing a disease or condition. MDT has proven to be a powerful tool in the control of leprosy, especially when patients report early and start prompt treatment. Adherence to treatment and its successful completion are equally important. This paper reviews the current state of leprosy worldwide and discusses the challenges, with emphasis on those that lie beyond elimination.

  11. Logarithmic learning for generalized classifier neural network.

    Science.gov (United States)

    Ozyildirim, Buse Melis; Avci, Mutlu

    2014-12-01

    Generalized classifier neural network is introduced as an efficient classifier among the others. Unless the initial smoothing parameter value is close to the optimal one, the generalized classifier neural network suffers from a convergence problem and requires quite a long time to converge. In this work, to overcome this problem, a logarithmic learning approach is proposed. The proposed method uses a logarithmic cost function instead of squared error. Minimization of this cost function reduces the number of iterations needed to reach the minimum. The proposed method is tested on 15 different data sets and the performance of the logarithmic learning generalized classifier neural network is compared with that of the standard one. Thanks to the operation range of the radial basis function included in the generalized classifier neural network, the proposed logarithmic approach and its derivative take continuous values. This makes it possible to exploit the fast convergence of the logarithmic cost by the proposed learning method. Due to the fast convergence of the logarithmic cost function, training time is decreased by up to 99.2%. In addition to the decrease in training time, classification performance may also be improved by up to 60%. According to the test results, while the proposed method provides a solution for the time requirement problem of the generalized classifier neural network, it may also improve the classification accuracy. The proposed method can be considered as an efficient way of reducing the time requirement problem of the generalized classifier neural network. Copyright © 2014 Elsevier Ltd. All rights reserved.
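
    A hedged numerical sketch of the contrast the abstract draws, squared error versus a logarithmic (cross-entropy-style) cost, for a scalar target t and prediction p. The exact cost used in the paper may differ; this only illustrates why the logarithmic cost penalises confidently wrong predictions much more steeply, which is what drives faster convergence.

        # Compare squared-error and logarithmic cost for a target of 1.0.
        import numpy as np

        def squared_error(p, t):
            return 0.5 * (p - t) ** 2        # gradient (p - t) shrinks as p saturates

        def log_cost(p, t, eps=1e-12):
            return -(t * np.log(p + eps) + (1 - t) * np.log(1 - p + eps))  # steep when wrong

        for p in (0.1, 0.5, 0.9):
            print(f"p={p}: squared={squared_error(p, 1.0):.3f}  log={log_cost(p, 1.0):.3f}")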

  12. Evidence-Based Psychosocial Treatments for Pediatric Elimination Disorders.

    Science.gov (United States)

    Shepard, Jaclyn A; Poler, Joseph E; Grabman, Jesse H

    2017-01-01

    Pediatric elimination disorders are common in childhood, yet psychosocial correlates are generally unclear. Given the physiological concomitants of both enuresis and encopresis, and the fact that many children with elimination disorders are initially brought to their primary care physician for treatment, medical evaluation and management are crucial and may serve as the first-line treatment approach. Scientific investigation on psychological and behavioral interventions has progressed over the past couple of decades, resulting in the identification of effective treatments for enuresis and encopresis. However, the body of literature has inherent challenges, particularly given the multicomponent nature of many of the treatment packages. This review identified 25 intervention studies-18 for nocturnal enuresis and 7 for encopresis-over the past 15 years and classified them according to the guidelines set forth by the Task Force on the Promotion and Dissemination of Psychological Procedures. For nocturnal enuresis, the urine alarm and dry-bed training were identified as well-established treatments, Full Spectrum Home Therapy was probably efficacious, lifting was possibly efficacious, and hypnotherapy and retention control training were classified as treatments of questionable efficacy. For encopresis, only two probably efficacious treatments were identified: biofeedback and enhanced toilet training (ETT). Best practice recommendations and suggestions for future research are provided to address existing limitations, including heterogeneity and the multicomponent nature of many of the interventions for pediatric elimination disorders.

  13. A CLASSIFIER SYSTEM USING SMOOTH GRAPH COLORING

    Directory of Open Access Journals (Sweden)

    JORGE FLORES CRUZ

    2017-01-01

    Full Text Available Unsupervised classifiers allow clustering methods with little or no human intervention. Therefore it is desirable to group the set of items with less data processing. This paper proposes an unsupervised classifier system using the model of soft graph coloring. This method was tested with some classic instances in the literature and the results obtained were compared with classifications made with human intervention, yielding as good or better results than supervised classifiers, sometimes providing alternative classifications that consider additional information that humans did not consider.

  14. High dimensional classifiers in the imbalanced case

    DEFF Research Database (Denmark)

    Bak, Britta Anker; Jensen, Jens Ledet

    We consider the binary classification problem in the imbalanced case where the number of samples from the two groups differs. The classification problem is considered in the high dimensional case where the number of variables is much larger than the number of samples, and where the imbalance leads to a bias in the classification. A theoretical analysis of the independence classifier reveals the origin of the bias and based on this we suggest two new classifiers that can handle any imbalance ratio. The analytical results are supplemented by a simulation study, where the suggested classifiers in some...

  15. Arabic Handwriting Recognition Using Neural Network Classifier

    African Journals Online (AJOL)

    pc

    2018-03-05

    Mar 5, 2018 ... an OCR using Neural Network classifier preceded by a set of preprocessing .... Artificial Neural Networks (ANNs), which we adopt in this research, consist of ... advantage and disadvantages of each technique. In [9], Khemiri ...

  16. Classifiers based on optimal decision rules

    KAUST Repository

    Amin, Talha

    2013-11-25

    Based on a dynamic programming approach, we design algorithms for sequential optimization of exact and approximate decision rules relative to the length and coverage [3, 4]. In this paper, we use optimal rules to construct classifiers, and study two questions: (i) which rules are better from the point of view of classification-exact or approximate; and (ii) which order of optimization gives better results of classifier work: length, length+coverage, coverage, or coverage+length. Experimental results show that, on average, classifiers based on exact rules are better than classifiers based on approximate rules, and sequential optimization (length+coverage or coverage+length) is better than the ordinary optimization (length or coverage).

  17. Classifiers based on optimal decision rules

    KAUST Repository

    Amin, Talha M.; Chikalov, Igor; Moshkov, Mikhail; Zielosko, Beata

    2013-01-01

    Based on a dynamic programming approach, we design algorithms for sequential optimization of exact and approximate decision rules relative to the length and coverage [3, 4]. In this paper, we use optimal rules to construct classifiers, and study two questions: (i) which rules are better from the point of view of classification-exact or approximate; and (ii) which order of optimization gives better results of classifier work: length, length+coverage, coverage, or coverage+length. Experimental results show that, on average, classifiers based on exact rules are better than classifiers based on approximate rules, and sequential optimization (length+coverage or coverage+length) is better than the ordinary optimization (length or coverage).

  18. Combining multiple classifiers for age classification

    CSIR Research Space (South Africa)

    Van Heerden, C

    2009-11-01

    Full Text Available The authors compare several different classifier combination methods on a single task, namely speaker age classification. This task is well suited to combination strategies, since significantly different feature classes are employed. Support vector...

  19. Neural Network Classifiers for Local Wind Prediction.

    Science.gov (United States)

    Kretzschmar, Ralf; Eckert, Pierre; Cattani, Daniel; Eggimann, Fritz

    2004-05-01

    This paper evaluates the quality of neural network classifiers for wind speed and wind gust prediction with prediction lead times between +1 and +24 h. The predictions were realized based on local time series and model data. The selection of appropriate input features was initiated by time series analysis and completed by empirical comparison of neural network classifiers trained on several choices of input features. The selected input features involved day time, yearday, features from a single wind observation device at the site of interest, and features derived from model data. The quality of the resulting classifiers was benchmarked against persistence for two different sites in Switzerland. The neural network classifiers exhibited superior quality when compared with persistence judged on a specific performance measure, hit and false-alarm rates.

  20. Consistency Analysis of Nearest Subspace Classifier

    OpenAIRE

    Wang, Yi

    2015-01-01

    The Nearest subspace classifier (NSS) finds an estimation of the underlying subspace within each class and assigns data points to the class that corresponds to its nearest subspace. This paper mainly studies how well NSS can be generalized to new samples. It is proved that NSS is strongly consistent under certain assumptions. For completeness, NSS is evaluated through experiments on various simulated and real data sets, in comparison with some other linear model based classifiers. It is also ...

  1. Analysis and minimization of overtraining effect in rule-based classifiers for computer-aided diagnosis

    International Nuclear Information System (INIS)

    Li Qiang; Doi Kunio

    2006-01-01

    Computer-aided diagnostic (CAD) schemes have been developed to assist radiologists detect various lesions in medical images. In CAD schemes, classifiers play a key role in achieving a high lesion detection rate and a low false-positive rate. Although many popular classifiers such as linear discriminant analysis and artificial neural networks have been employed in CAD schemes for reduction of false positives, a rule-based classifier has probably been the simplest and most frequently used one since the early days of development of various CAD schemes. However, with existing rule-based classifiers, there are major disadvantages that significantly reduce their practicality and credibility. The disadvantages include manual design, poor reproducibility, poor evaluation methods such as resubstitution, and a large overtraining effect. An automated rule-based classifier with a minimized overtraining effect can overcome or significantly reduce the extent of the above-mentioned disadvantages. In this study, we developed an 'optimal' method for the selection of cutoff thresholds and a fully automated rule-based classifier. Experimental results performed with Monte Carlo simulation and a real lung nodule CT data set demonstrated that the automated threshold selection method can completely eliminate overtraining effect in the procedure of cutoff threshold selection, and thus can minimize overall overtraining effect in the constructed rule-based classifier. We believe that this threshold selection method is very useful in the construction of automated rule-based classifiers with minimized overtraining effect
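
    A hedged sketch of one way an automated cutoff-threshold choice for a single rule could be made with held-out data rather than by eye, in the spirit of the cross-validated, overtraining-aware selection discussed above; this is a generic illustration, not the authors' "optimal" method, and the sensitivity constraint, feature arrays and candidate cuts are assumptions.

        # Pick the cutoff for one rule by averaging held-out performance over folds.
        import numpy as np
        from sklearn.model_selection import KFold

        def pick_threshold(feature, is_lesion, candidate_cuts, n_splits=5):
            """Choose the cut that removes the most false positives on held-out folds
            while discarding no true lesion (100% sensitivity constraint)."""
            best_cut, best_removed = None, -1.0
            for cut in candidate_cuts:
                removed = []
                for _, val_idx in KFold(n_splits, shuffle=True, random_state=0).split(feature):
                    f_val, y_val = feature[val_idx], is_lesion[val_idx]
                    keep = f_val >= cut
                    if np.all(keep[y_val == 1]):                     # no lesion may be cut
                        removed.append(np.mean(~keep[y_val == 0]))   # fraction of FPs removed
                if removed and np.mean(removed) > best_removed:
                    best_cut, best_removed = cut, float(np.mean(removed))
            return best_cut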

  2. Removal of micropollutants with coarse-ground activated carbon for enhanced separation with hydrocyclone classifiers.

    Science.gov (United States)

    Otto, N; Platz, S; Fink, T; Wutscherk, M; Menzel, U

    2016-01-01

    One key technology to eliminate organic micropollutants (OMP) from wastewater effluent is adsorption using powdered activated carbon (PAC). To avoid a discharge of highly loaded PAC particles into natural water bodies, a separation stage has to be implemented. Commonly, large settling tanks and flocculation filters with the application of coagulants and flocculation aids are used. In this study, a multi-hydrocyclone classifier with a downstream cloth filter has been investigated on a pilot plant as a space-saving alternative with no need for a dosing of chemical additives. To improve the separation, a coarser ground PAC type was compared to a standard PAC type with regard to elimination results of OMP as well as separation performance. With a PAC dosing rate of 20 mg/l an average of 64.7 wt% of the standard PAC and 79.5 wt% of the coarse-ground PAC could be separated in the hydrocyclone classifier. A total average separation efficiency of 93-97 wt% could be reached with a combination of both hydrocyclone classifier and cloth filter. Nonetheless, the OMP elimination of the coarse-ground PAC was not sufficient to compete with the standard PAC. Further research and development is necessary to find applicable coarse-grained PAC types with adequate OMP elimination capabilities.
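
    A small arithmetic sketch of how two separation stages combine, assuming they act independently so that total = 1 - (1 - e1)(1 - e2). The cloth-filter efficiency below is a made-up value chosen only so that the combined figure lands in the reported 93-97 wt% range; it is not a measured number from the study.

        # Combined separation efficiency of two independent stages (illustrative values).
        e_cyclone = 0.795   # coarse-ground PAC, hydrocyclone stage (reported above)
        e_cloth = 0.76      # assumed cloth-filter stage efficiency (placeholder)
        total = 1 - (1 - e_cyclone) * (1 - e_cloth)
        print(f"combined separation: {total:.1%}")   # ~95%, consistent with the reported range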

  3. 76 FR 43230 - National Pollutant Discharge Elimination System-Cooling Water Intake Structures at Existing...

    Science.gov (United States)

    2011-07-20

    ... ENVIRONMENTAL PROTECTION AGENCY 40 CFR Parts 122 and 125 [EPA-HQ-OW-2008-0667, FRL-9441-8] RIN 2040-AE95 National Pollutant Discharge Elimination System--Cooling Water Intake Structures at Existing Facilities and Phase I Facilities AGENCY: Environmental Protection Agency (EPA). ACTION: Proposed rule...

  4. Eliminating US hospital medical errors.

    Science.gov (United States)

    Kumar, Sameer; Steinebach, Marc

    2008-01-01

    Healthcare costs in the USA have continued to rise steadily since the 1980s. Medical errors are one of the major causes of deaths and injuries of thousands of patients every year, contributing to soaring healthcare costs. The purpose of this study is to examine what has been done to deal with the medical-error problem in the last two decades and present a closed-loop mistake-proof operation system for surgery processes that would likely eliminate preventable medical errors. The design method used is a combination of creating a service blueprint, implementing the six sigma DMAIC cycle, developing cause-and-effect diagrams as well as devising poka-yokes in order to develop a robust surgery operation process for a typical US hospital. In the improve phase of the six sigma DMAIC cycle, a number of poka-yoke techniques are introduced to prevent typical medical errors (identified through cause-and-effect diagrams) that may occur in surgery operation processes in US hospitals. It is the authors' assertion that implementing the new service blueprint along with the poka-yokes, will likely result in the current medical error rate to significantly improve to the six-sigma level. Additionally, designing as many redundancies as possible in the delivery of care will help reduce medical errors. Primary healthcare providers should strongly consider investing in adequate doctor and nurse staffing, and improving their education related to the quality of service delivery to minimize clinical errors. This will lead to an increase in higher fixed costs, especially in the shorter time frame. This paper focuses additional attention needed to make a sound technical and business case for implementing six sigma tools to eliminate medical errors that will enable hospital managers to increase their hospital's profitability in the long run and also ensure patient safety.

  5. Reinforcement Learning Based Artificial Immune Classifier

    Directory of Open Access Journals (Sweden)

    Mehmet Karakose

    2013-01-01

    Full Text Available One of the widely used methods for classification, which is a decision-making process, is artificial immune systems. Artificial immune systems, based on the natural immune system, can be successfully applied for classification, optimization, recognition, and learning in real-world problems. In this study, a reinforcement learning based artificial immune classifier is proposed as a new approach. This approach uses reinforcement learning to find better antibodies with immune operators. The proposed new approach has many advantages compared to other methods in the literature, such as effectiveness, fewer memory cells, high accuracy, speed, and data adaptability. The performance of the proposed approach is demonstrated by simulation and experimental results using real data in Matlab and FPGA. Some benchmark data and remote image data are used for experimental results. The comparative results with supervised/unsupervised based artificial immune system, negative selection classifier, and resource limited artificial immune classifier are given to demonstrate the effectiveness of the proposed new method.

  6. Classifier Fusion With Contextual Reliability Evaluation.

    Science.gov (United States)

    Liu, Zhunga; Pan, Quan; Dezert, Jean; Han, Jun-Wei; He, You

    2018-05-01

    Classifier fusion is an efficient strategy to improve the classification performance for the complex pattern recognition problem. In practice, the multiple classifiers to combine can have different reliabilities and the proper reliability evaluation plays an important role in the fusion process for getting the best classification performance. We propose a new method for classifier fusion with contextual reliability evaluation (CF-CRE) based on inner reliability and relative reliability concepts. The inner reliability, represented by a matrix, characterizes the probability of the object belonging to one class when it is classified to another class. The elements of this matrix are estimated from the k-nearest neighbors of the object. A cautious discounting rule is developed under the belief functions framework to revise the classification result according to the inner reliability. The relative reliability is evaluated based on a new incompatibility measure which allows to reduce the level of conflict between the classifiers by applying the classical evidence discounting rule to each classifier before their combination. The inner reliability and relative reliability capture different aspects of the classification reliability. The discounted classification results are combined with Dempster-Shafer's rule for the final class decision making support. The performance of CF-CRE has been evaluated and compared with those of main classical fusion methods using real data sets. The experimental results show that CF-CRE can produce substantially higher accuracy than other fusion methods in general. Moreover, CF-CRE is robust to the changes of the number of nearest neighbors chosen for estimating the reliability matrix, which is appealing for the applications.
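
    A hedged sketch of the generic belief-function machinery the abstract builds on: classical evidence discounting followed by Dempster's rule on a two-class frame {A, B}. The reliability values and classifier masses below are invented for illustration; the CF-CRE reliability estimates themselves are not reproduced here.

        # Discount two classifier mass functions by their reliability, then combine them.
        def discount(m, alpha):
            """Shafer discounting: keep alpha of the committed mass, move the rest to ignorance."""
            return {"A": alpha * m["A"], "B": alpha * m["B"],
                    "AB": 1 - alpha + alpha * m["AB"]}

        def dempster(m1, m2):
            """Dempster's rule on {A, B, AB}; conflicting mass is normalised out."""
            k = m1["A"] * m2["B"] + m1["B"] * m2["A"]          # conflict
            mA = m1["A"] * m2["A"] + m1["A"] * m2["AB"] + m1["AB"] * m2["A"]
            mB = m1["B"] * m2["B"] + m1["B"] * m2["AB"] + m1["AB"] * m2["B"]
            mAB = m1["AB"] * m2["AB"]
            return {x: v / (1 - k) for x, v in (("A", mA), ("B", mB), ("AB", mAB))}

        c1 = discount({"A": 0.8, "B": 0.2, "AB": 0.0}, alpha=0.9)   # more reliable classifier
        c2 = discount({"A": 0.3, "B": 0.7, "AB": 0.0}, alpha=0.5)   # less reliable classifier
        print(dempster(c1, c2))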

  7. Storm Water General Permit 1 for Industrial Facilities

    Data.gov (United States)

    Iowa State University GIS Support and Research Facility — General permit #1 for storm water discharges associated with industrial facilities in Iowa for the National Pollutant Discharge Elimination System (NPDES) program.

  8. Classifying sows' activity types from acceleration patterns

    DEFF Research Database (Denmark)

    Cornou, Cecile; Lundbye-Christensen, Søren

    2008-01-01

    An automated method of classifying sow activity using acceleration measurements would allow the individual sow's behavior to be monitored throughout the reproductive cycle; applications for detecting behaviors characteristic of estrus and farrowing or to monitor illness and welfare can be foreseen. This article suggests a method of classifying five types of activity exhibited by group-housed sows. The method involves the measurement of acceleration in three dimensions. The five activities are: feeding, walking, rooting, lying laterally and lying sternally. Four time series of acceleration (the three...

  9. Data characteristics that determine classifier performance

    CSIR Research Space (South Africa)

    Van der Walt, Christiaan M

    2006-11-01

    Full Text Available available at [11]. The kNN uses a LinearNN nearest neighbour search algorithm with an Euclidean distance metric [8]. The optimal k value is determined by performing 10-fold cross-validation. An optimal k value between 1 and 10 is used for Experiments 1... classifiers. 10-fold cross-validation is used to evaluate and compare the performance of the classifiers on the different data sets. 3.1. Artificial data generation Multivariate Gaussian distributions are used to generate artificial data sets. We use d...

  10. A Customizable Text Classifier for Text Mining

    Directory of Open Access Journals (Sweden)

    Yun-liang Zhang

    2007-12-01

    Full Text Available Text mining deals with complex and unstructured texts. Usually a particular collection of texts that is specified to one or more domains is necessary. We have developed a customizable text classifier for users to mine the collection automatically. It derives from the sentence category of the HNC theory and corresponding techniques. It can start with a few texts, and it can adjust automatically or be adjusted by user. The user can also control the number of domains chosen and decide the standard with which to choose the texts based on demand and abundance of materials. The performance of the classifier varies with the user's choice.

  11. A survey of decision tree classifier methodology

    Science.gov (United States)

    Safavian, S. R.; Landgrebe, David

    1991-01-01

    Decision tree classifiers (DTCs) are used successfully in many diverse areas such as radar signal classification, character recognition, remote sensing, medical diagnosis, expert systems, and speech recognition. Perhaps the most important feature of DTCs is their capability to break down a complex decision-making process into a collection of simpler decisions, thus providing a solution which is often easier to interpret. A survey of current methods is presented for DTC designs and the various existing issues. After considering potential advantages of DTCs over single-stage classifiers, subjects of tree structure design, feature selection at each internal node, and decision and search strategies are discussed.

  12. Very Low Activity Waste Disposal Facility Recently Commissioned as an Extension of El Cabril LILW Disposal Facility in Spain

    International Nuclear Information System (INIS)

    Zuloaga, P.; Navarro, M.

    2009-01-01

    This paper describes the Very Low Activity Radioactive Waste (VLLW) disposal facility designed, built and operated by ENRESA as part of the El Cabril LILW disposal facility. The El Cabril facility was commissioned in 1992 and has 28 concrete vaults with an internal volume of 100,000 m3, as well as waste treatment systems and waste characterization laboratories. The total needs identified in Spain for LILW disposal are some 176,000 m3, of which around 120,000 m3 might be classified as VLLW. This project was launched in 2003, and the major licensing steps have been the town planning license (2003), the construction authorization (after the Environmental Impact Statement and a report from the Nuclear Safety Council-CSN, 2006), and the Operations Authorization (after a report from the CSN, July 2008). The new VLLW disposal facility has a capacity of 130,000 m3 in four disposal cells of approximately the same size. Only the first cell has been built. The design of the barriers is based on the European Directive for the elimination of dangerous waste and consists of a 1 m clay layer, 3 cm geo-bentonite films, and a 4 mm HDPE film. In order to minimize the leachate volumes collected and to facilitate good monitoring of the site, each cell is divided into different sections, which are protected during operation (before placing a provisional HDPE capping) by a light shelter, and leachate collection is segregated between sections. (authors)

  13. REMOTE INTERVENTION TOWER ELIMINATION SYSTEM

    International Nuclear Information System (INIS)

    Dave Murnane; Renauld Washington

    2002-01-01

    This Topical Report is presented to satisfy reporting requirements in the Statement of Work, section J.5, page 120, per Department of Energy contract DE-AC26-01NT41093. The project does not contain any empirical research data. This report describes the assembly of commercial off-the-shelf (COTS) items configured in a unique manner to represent new and innovative technology for size reduction and material handling at DOE sites, to assist in the D&D effort currently underway at the designated DOE facilities.

  14. 75 FR 37253 - Classified National Security Information

    Science.gov (United States)

    2010-06-28

    ... ``Secret.'' (3) Each interior page of a classified document shall be marked at the top and bottom either... ``(TS)'' for Top Secret, ``(S)'' for Secret, and ``(C)'' for Confidential will be used. (2) Portions... from the informational text. (1) Conspicuously place the overall classification at the top and bottom...

  15. 75 FR 707 - Classified National Security Information

    Science.gov (United States)

    2010-01-05

    ... classified at one of the following three levels: (1) ``Top Secret'' shall be applied to information, the... exercise this authority. (2) ``Top Secret'' original classification authority may be delegated only by the... official has been delegated ``Top Secret'' original classification authority by the agency head. (4) Each...

  16. Neural Network Classifier Based on Growing Hyperspheres

    Czech Academy of Sciences Publication Activity Database

    Jiřina Jr., Marcel; Jiřina, Marcel

    2000-01-01

    Roč. 10, č. 3 (2000), s. 417-428 ISSN 1210-0552. [Neural Network World 2000. Prague, 09.07.2000-12.07.2000] Grant - others: MŠMT ČR(CZ) VS96047; MPO(CZ) RP-4210 Institutional research plan: AV0Z1030915 Keywords: neural network * classifier * hyperspheres * big-dimensional data Subject RIV: BA - General Mathematics

  17. Histogram deconvolution - An aid to automated classifiers

    Science.gov (United States)

    Lorre, J. J.

    1983-01-01

    It is shown that N-dimensional histograms are convolved by the addition of noise in the picture domain. Three methods are described which provide the ability to deconvolve such noise-affected histograms. The purpose of the deconvolution is to provide automated classifiers with a higher quality N-dimensional histogram from which to obtain classification statistics.

  18. Classifying web pages with visual features

    NARCIS (Netherlands)

    de Boer, V.; van Someren, M.; Lupascu, T.; Filipe, J.; Cordeiro, J.

    2010-01-01

    To automatically classify and process web pages, current systems use the textual content of those pages, including both the displayed content and the underlying (HTML) code. However, a very important feature of a web page is its visual appearance. In this paper, we show that using generic visual

  19. Facilities & Leadership

    Data.gov (United States)

    Department of Veterans Affairs — The facilities web service provides VA facility information. The VA facilities locator is a feature that is available across the enterprise, on any webpage, for the...

  20. 75 FR 28554 - Elimination of Classification Requirement in the Green Technology Pilot Program

    Science.gov (United States)

    2010-05-21

    ...] Elimination of Classification Requirement in the Green Technology Pilot Program AGENCY: United States Patent... (USPTO) implemented the Green Technology Pilot Program on December 8, 2009, which permits patent... technologies. However, the pilot program was limited to only applications classified in a number of U.S...

  1. Classifying features in CT imagery: accuracy for some single- and multiple-species classifiers

    Science.gov (United States)

    Daniel L. Schmoldt; Jing He; A. Lynn Abbott

    1998-01-01

    Our current approach to automatically label features in CT images of hardwood logs classifies each pixel of an image individually. These feature classifiers use a back-propagation artificial neural network (ANN) and feature vectors that include a small, local neighborhood of pixels and the distance of the target pixel to the center of the log. Initially, this type of...

  2. Factor analysis on hazards for safety assessment in decommissioning workplace of nuclear facilities using a semantic differential method

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Kwan-Seong [Korea Atomic Energy Research Institute, 1045 Daedeok-daero, Yuseong-gu, Daejeon 305-353 (Korea, Republic of)], E-mail: ksjeongl@kaeri.re.kr; Lim, Hyeon-Kyo [Chungbuk National University, 410 Sungbong-ro, Heungduk-gu, Cheongju, Chungbuk 361-763 (Korea, Republic of)

    2009-10-15

    The decommissioning of nuclear facilities must be accomplished according to its structural conditions and radiological characteristics. An effective risk analysis requires basic knowledge about possible risks, characteristics of potential hazards, and comprehensive understanding of the associated cause-effect relationships within a decommissioning for nuclear facilities. The hazards associated with a decommissioning plan are important not only because they may be a direct cause of harm to workers but also because their occurrence may, indirectly, result in increased radiological and non-radiological hazards. Workers need to be protected by eliminating or reducing the radiological and non-radiological hazards that may arise during routine decommissioning activities as well as during accidents. Therefore, to prepare the safety assessment for decommissioning of nuclear facilities, the radiological and non-radiological hazards should be systematically identified and classified. With a semantic differential method of screening factor and risk perception factor, the radiological and non-radiological hazards are screened and identified.

  3. Biochemistry Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Biochemistry Facility provides expert services and consultation in biochemical enzyme assays and protein purification. The facility currently features 1) Liquid...

  4. Elimination of Onchocerciasis from Mexico

    Science.gov (United States)

    Rodríguez-Pérez, Mario A.; Fernández-Santos, Nadia A.; Orozco-Algarra, María E.; Rodríguez-Atanacio, José A.; Domínguez-Vázquez, Alfredo; Rodríguez-Morales, Kristel B.; Real-Najarro, Olga; Prado-Velasco, Francisco G.; Cupp, Eddie W.; Richards, Frank O.; Hassan, Hassan K.; González-Roldán, Jesús F.; Kuri-Morales, Pablo A.; Unnasch, Thomas R.

    2015-01-01

    Background Mexico is one of the six countries formerly endemic for onchocerciasis in Latin America. Transmission has been interrupted in the three endemic foci of that country and mass drug distribution has ceased. Three years after mass drug distribution ended, post-treatment surveillance (PTS) surveys were undertaken which employed entomological indicators to check for transmission recrudescence. Methodology/Principal findings In-depth entomologic assessments were performed in 18 communities in the three endemic foci of Mexico. None of the 108,212 Simulium ochraceum s.l. collected from the three foci were found to contain parasite DNA when tested by polymerase chain reaction-enzyme-linked immunosorbent assay (PCR-ELISA), resulting in a maximum upper bound of the 95% confidence interval (95%-ULCI) of the infective rate in the vectors of 0.035/2,000 flies examined. This is an order of magnitude below the threshold of a 95%-ULCI of less than one infective fly per 2,000 flies tested, the current entomological criterion for interruption of transmission developed by the international community. The point estimate of seasonal transmission potential (STP) was zero, and the upper bound of the 95% confidence interval for the STP ranged from 1.2 to 1.7 L3/person/season in the different foci. This value is below all previous estimates for the minimum transmission potential required to maintain the parasite population. Conclusions/Significance The results from the in-depth entomological post treatment surveillance surveys strongly suggest that transmission has not resumed in the three foci of Mexico during the three years since the last distribution of ivermectin occurred; it was concluded that transmission remains undetectable without intervention, and Onchocerca volvulus has been eliminated from Mexico. PMID:26161558

  5. Elimination of Onchocerciasis from Mexico.

    Directory of Open Access Journals (Sweden)

    Mario A Rodríguez-Pérez

    Full Text Available Mexico is one of the six countries formerly endemic for onchocerciasis in Latin America. Transmission has been interrupted in the three endemic foci of that country and mass drug distribution has ceased. Three years after mass drug distribution ended, post-treatment surveillance (PTS surveys were undertaken which employed entomological indicators to check for transmission recrudescence.In-depth entomologic assessments were performed in 18 communities in the three endemic foci of Mexico. None of the 108,212 Simulium ochraceum s.l. collected from the three foci were found to contain parasite DNA when tested by polymerase chain reaction-enzyme-linked immunosorbent assay (PCR-ELISA, resulting in a maximum upper bound of the 95% confidence interval (95%-ULCI of the infective rate in the vectors of 0.035/2,000 flies examined. This is an order of magnitude below the threshold of a 95%-ULCI of less than one infective fly per 2,000 flies tested, the current entomological criterion for interruption of transmission developed by the international community. The point estimate of seasonal transmission potential (STP was zero, and the upper bound of the 95% confidence interval for the STP ranged from 1.2 to 1.7 L3/person/season in the different foci. This value is below all previous estimates for the minimum transmission potential required to maintain the parasite population.The results from the in-depth entomological post treatment surveillance surveys strongly suggest that transmission has not resumed in the three foci of Mexico during the three years since the last distribution of ivermectin occurred; it was concluded that transmission remains undetectable without intervention, and Onchocerca volvulus has been eliminated from Mexico.

  6. Facilities for Waste Management at Chalk River, Canada; Les Installations d'Elimination et d'Utilisation des Dechets a Chalk River, Canada; Instalaciones Utilizadas para el Provechamiento y Evacuacion de Desechos Radiactivos en Chalk River, Canada

    Energy Technology Data Exchange (ETDEWEB)

    Mawson, C. A.; Russell, A. E. [Environmental Research Branch, Atomic Energy of Canada Ltd. (Canada)

    1960-07-01

    The waste disposal areas used by Atomic Energy of Canada Limited are situated in a rock basin filled with glacial till and sand, draining into the Ottawa River. Low-activity liquid effluent is run into pits in the sand, which are filled with small rocks to prevent contact of the liquid with the air. Medium-level liquid is mixed with cement in drums which are stacked and totally enclosed in concrete trenches; medium-level solids are buried in concrete-lined trenches; high-level solids are placed in holes lined with steel or concrete piping. Special facilities are provided for organic liquids and bottled wastes. Details are given of the structural work and procedures, with an outline of the results of environmental monitoring. (author)

  7. Comparing cosmic web classifiers using information theory

    International Nuclear Information System (INIS)

    Leclercq, Florent; Lavaux, Guilhem; Wandelt, Benjamin; Jasche, Jens

    2016-01-01

    We introduce a decision scheme for optimally choosing a classifier, which segments the cosmic web into different structure types (voids, sheets, filaments, and clusters). Our framework, based on information theory, accounts for the design aims of different classes of possible applications: (i) parameter inference, (ii) model selection, and (iii) prediction of new observations. As an illustration, we use cosmographic maps of web-types in the Sloan Digital Sky Survey to assess the relative performance of the classifiers T-WEB, DIVA and ORIGAMI for: (i) analyzing the morphology of the cosmic web, (ii) discriminating dark energy models, and (iii) predicting galaxy colors. Our study substantiates a data-supported connection between cosmic web analysis and information theory, and paves the path towards principled design of analysis procedures for the next generation of galaxy surveys. We have made the cosmic web maps, galaxy catalog, and analysis scripts used in this work publicly available.

  8. Design of Robust Neural Network Classifiers

    DEFF Research Database (Denmark)

    Larsen, Jan; Andersen, Lars Nonboe; Hintz-Madsen, Mads

    1998-01-01

    This paper addresses a new framework for designing robust neural network classifiers. The network is optimized using the maximum a posteriori technique, i.e., the cost function is the sum of the log-likelihood and a regularization term (prior). In order to perform robust classification, we present a modified likelihood function which incorporates the potential risk of outliers in the data. This leads to the introduction of a new parameter, the outlier probability. Designing the neural classifier involves optimization of network weights as well as the outlier probability and regularization parameters. We suggest adapting the outlier probability and regularisation parameters by minimizing the error on a validation set, and a simple gradient descent scheme is derived. In addition, the framework allows for constructing a simple outlier detector. Experiments with artificial data demonstrate the potential...

  9. Comparing cosmic web classifiers using information theory

    Energy Technology Data Exchange (ETDEWEB)

    Leclercq, Florent [Institute of Cosmology and Gravitation (ICG), University of Portsmouth, Dennis Sciama Building, Burnaby Road, Portsmouth PO1 3FX (United Kingdom); Lavaux, Guilhem; Wandelt, Benjamin [Institut d' Astrophysique de Paris (IAP), UMR 7095, CNRS – UPMC Université Paris 6, Sorbonne Universités, 98bis boulevard Arago, F-75014 Paris (France); Jasche, Jens, E-mail: florent.leclercq@polytechnique.org, E-mail: lavaux@iap.fr, E-mail: j.jasche@tum.de, E-mail: wandelt@iap.fr [Excellence Cluster Universe, Technische Universität München, Boltzmannstrasse 2, D-85748 Garching (Germany)

    2016-08-01

    We introduce a decision scheme for optimally choosing a classifier, which segments the cosmic web into different structure types (voids, sheets, filaments, and clusters). Our framework, based on information theory, accounts for the design aims of different classes of possible applications: (i) parameter inference, (ii) model selection, and (iii) prediction of new observations. As an illustration, we use cosmographic maps of web-types in the Sloan Digital Sky Survey to assess the relative performance of the classifiers T-WEB, DIVA and ORIGAMI for: (i) analyzing the morphology of the cosmic web, (ii) discriminating dark energy models, and (iii) predicting galaxy colors. Our study substantiates a data-supported connection between cosmic web analysis and information theory, and paves the path towards principled design of analysis procedures for the next generation of galaxy surveys. We have made the cosmic web maps, galaxy catalog, and analysis scripts used in this work publicly available.

  10. Detection of Fundus Lesions Using Classifier Selection

    Science.gov (United States)

    Nagayoshi, Hiroto; Hiramatsu, Yoshitaka; Sako, Hiroshi; Himaga, Mitsutoshi; Kato, Satoshi

    A system for detecting fundus lesions caused by diabetic retinopathy from fundus images is being developed. The system can screen the images in advance in order to reduce the inspection workload on doctors. One of the difficulties that must be addressed in completing this system is how to remove false positives (which tend to arise near blood vessels) without decreasing the detection rate of lesions in other areas. To overcome this difficulty, we developed classifier selection according to the position of a candidate lesion, and we introduced new features that can distinguish true lesions from false positives. A system incorporating classifier selection and these new features was tested in experiments using 55 fundus images with some lesions and 223 images without lesions. The results of the experiments confirm the effectiveness of the proposed system, namely, degrees of sensitivity and specificity of 98% and 81%, respectively.

  11. Classifying objects in LWIR imagery via CNNs

    Science.gov (United States)

    Rodger, Iain; Connor, Barry; Robertson, Neil M.

    2016-10-01

    The aim of the presented work is to demonstrate enhanced target recognition and improved false alarm rates for a mid to long range detection system, utilising a Long Wave Infrared (LWIR) sensor. By exploiting high quality thermal image data and recent techniques in machine learning, the system can provide automatic target recognition capabilities. A Convolutional Neural Network (CNN) is trained and the classifier achieves an overall accuracy of > 95% for 6 object classes related to land defence. While the highly accurate CNN struggles to recognise long range target classes, due to low signal quality, robust target discrimination is achieved for challenging candidates. The overall performance of the methodology presented is assessed using human ground truth information, generating classifier evaluation metrics for thermal image sequences.

  12. Learning for VMM + WTA Embedded Classifiers

    Science.gov (United States)

    2016-03-31

    Full-text fragment (Jennifer Hasler and Sahil Shah, Electrical and Computer Engineering, Georgia Institute of Technology): the approach enables correct classification of each novel acoustic signal (generator, idle car, and idle truck), measured on our SoC FPAA IC. The test input is composed of signals from an urban environment for 3 objects (generator, idle car, and idle truck).

  13. Bayes classifiers for imbalanced traffic accidents datasets.

    Science.gov (United States)

    Mujalli, Randa Oqab; López, Griselda; Garach, Laura

    2016-03-01

    Traffic accidents data sets are usually imbalanced, where the number of instances classified under the killed or severe injuries class (minority) is much lower than those classified under the slight injuries class (majority). This, however, supposes a challenging problem for classification algorithms and may cause obtaining a model that well cover the slight injuries instances whereas the killed or severe injuries instances are misclassified frequently. Based on traffic accidents data collected on urban and suburban roads in Jordan for three years (2009-2011); three different data balancing techniques were used: under-sampling which removes some instances of the majority class, oversampling which creates new instances of the minority class and a mix technique that combines both. In addition, different Bayes classifiers were compared for the different imbalanced and balanced data sets: Averaged One-Dependence Estimators, Weightily Average One-Dependence Estimators, and Bayesian networks in order to identify factors that affect the severity of an accident. The results indicated that using the balanced data sets, especially those created using oversampling techniques, with Bayesian networks improved classifying a traffic accident according to its severity and reduced the misclassification of killed and severe injuries instances. On the other hand, the following variables were found to contribute to the occurrence of a killed causality or a severe injury in a traffic accident: number of vehicles involved, accident pattern, number of directions, accident type, lighting, surface condition, and speed limit. This work, to the knowledge of the authors, is the first that aims at analyzing historical data records for traffic accidents occurring in Jordan and the first to apply balancing techniques to analyze injury severity of traffic accidents. Copyright © 2015 Elsevier Ltd. All rights reserved.
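
    The balancing idea referred to above can be sketched as follows: the minority class is randomly oversampled until the classes are even, and a Bayes-family classifier is then trained on the balanced set. The snippet uses a plain Gaussian naive Bayes model and synthetic data as stand-ins; the study itself used AODE/WAODE and Bayesian networks on the Jordanian accident records.

    # Sketch: random oversampling of the minority class, then a simple
    # Gaussian naive Bayes model on the balanced data. Data are placeholders.
    import numpy as np
    from sklearn.naive_bayes import GaussianNB

    def oversample(X, y, random_state=0):
        rng = np.random.default_rng(random_state)
        classes, counts = np.unique(y, return_counts=True)
        n_max = counts.max()
        Xs, ys = [X], [y]
        for c, n in zip(classes, counts):
            if n < n_max:
                idx = rng.choice(np.flatnonzero(y == c), size=n_max - n, replace=True)
                Xs.append(X[idx])
                ys.append(y[idx])
        return np.vstack(Xs), np.concatenate(ys)

    X = np.random.randn(200, 4)
    y = np.array([0] * 180 + [1] * 20)      # imbalanced labels (severe class is rare)
    X_bal, y_bal = oversample(X, y)
    model = GaussianNB().fit(X_bal, y_bal)
    print(np.bincount(y_bal))               # classes are now even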

  14. A Bayesian classifier for symbol recognition

    OpenAIRE

    Barrat , Sabine; Tabbone , Salvatore; Nourrissier , Patrick

    2007-01-01

    URL : http://www.buyans.com/POL/UploadedFile/134_9977.pdf; International audience; We present in this paper an original adaptation of Bayesian networks to symbol recognition problem. More precisely, a descriptor combination method, which enables to improve significantly the recognition rate compared to the recognition rates obtained by each descriptor, is presented. In this perspective, we use a simple Bayesian classifier, called naive Bayes. In fact, probabilistic graphical models, more spec...

  15. Optimization of short amino acid sequences classifier

    Science.gov (United States)

    Barcz, Aleksy; Szymański, Zbigniew

    This article describes processing methods used for short amino acid sequences classification. The data processed are 9-symbols string representations of amino acid sequences, divided into 49 data sets - each one containing samples labeled as reacting or not with given enzyme. The goal of the classification is to determine for a single enzyme, whether an amino acid sequence would react with it or not. Each data set is processed separately. Feature selection is performed to reduce the number of dimensions for each data set. The method used for feature selection consists of two phases. During the first phase, significant positions are selected using Classification and Regression Trees. Afterwards, symbols appearing at the selected positions are substituted with numeric values of amino acid properties taken from the AAindex database. In the second phase the new set of features is reduced using a correlation-based ranking formula and Gram-Schmidt orthogonalization. Finally, the preprocessed data is used for training LS-SVM classifiers. SPDE, an evolutionary algorithm, is used to obtain optimal hyperparameters for the LS-SVM classifier, such as error penalty parameter C and kernel-specific hyperparameters. A simple score penalty is used to adapt the SPDE algorithm to the task of selecting classifiers with best performance measures values.
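
    The second-phase reduction described above (a correlation-based ranking combined with Gram-Schmidt orthogonalization) can be sketched as a forward selection loop: pick the feature most correlated with the target, then orthogonalize the target and the remaining features against it so that redundant features lose their apparent relevance. The ranking formula and data below are illustrative assumptions, not the authors' exact procedure.

    # Sketch of correlation-plus-Gram-Schmidt forward feature selection.
    # X is an (n_samples, n_features) matrix, y a numeric label vector.
    import numpy as np

    def gram_schmidt_select(X, y, n_select):
        X = X - X.mean(axis=0)
        y = y - y.mean()
        selected = []
        for _ in range(n_select):
            # rank features by squared correlation with the residual target
            norms = np.linalg.norm(X, axis=0) + 1e-12
            corr = (X.T @ y) / (norms * (np.linalg.norm(y) + 1e-12))
            corr[selected] = 0.0                  # ignore already-chosen columns
            j = int(np.argmax(corr ** 2))
            selected.append(j)
            v = X[:, j] / norms[j]
            # orthogonalize the target and remaining features against column j
            y = y - (v @ y) * v
            X = X - np.outer(v, v @ X)
        return selected

    X = np.random.randn(100, 20)
    y = (X[:, 3] + 0.5 * X[:, 7] + 0.1 * np.random.randn(100) > 0).astype(float)
    print(gram_schmidt_select(X, y, n_select=3))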

  16. SVM classifier on chip for melanoma detection.

    Science.gov (United States)

    Afifi, Shereen; GholamHosseini, Hamid; Sinha, Roopak

    2017-07-01

    Support Vector Machine (SVM) is a common classifier used for efficient classification with high accuracy. SVM shows high accuracy for classifying melanoma (skin cancer) clinical images within computer-aided diagnosis systems used by skin cancer specialists to detect melanoma early and save lives. We aim to develop a medical low-cost handheld device that runs a real-time embedded SVM-based diagnosis system for use in primary care for early detection of melanoma. In this paper, an optimized SVM classifier is implemented onto a recent FPGA platform using the latest design methodology to be embedded into the proposed device for realizing online efficient melanoma detection on a single system on chip/device. The hardware implementation results demonstrate a high classification accuracy of 97.9% and a significant acceleration factor of 26 from equivalent software implementation on an embedded processor, with 34% of resources utilization and 2 watts for power consumption. Consequently, the implemented system meets crucial embedded systems constraints of high performance and low cost, resources utilization and power consumption, while achieving high classification accuracy.

  17. Dance Facilities.

    Science.gov (United States)

    Ashton, Dudley, Ed.; Irey, Charlotte, Ed.

    This booklet represents an effort to assist teachers and administrators in the professional planning of dance facilities and equipment. Three chapters present the history of dance facilities, provide recommended dance facilities and equipment, and offer some adaptations of dance facilities and equipment, for elementary, secondary and college level…

  18. Robust Framework to Combine Diverse Classifiers Assigning Distributed Confidence to Individual Classifiers at Class Level

    Directory of Open Access Journals (Sweden)

    Shehzad Khalid

    2014-01-01

    Full Text Available We have presented a classification framework that combines multiple heterogeneous classifiers in the presence of class label noise. An extension of m-Mediods based modeling is presented that generates models of the various classes whilst identifying and filtering noisy training data. This noise-free data is further used to learn models for other classifiers such as GMM and SVM. A weight learning method is then introduced to learn weights on each class for the different classifiers to construct an ensemble. For this purpose, we applied a genetic algorithm to search for an optimal weight vector on which the classifier ensemble is expected to give the best accuracy. The proposed approach is evaluated on a variety of real-life datasets. It is also compared with existing standard ensemble techniques such as Adaboost, Bagging, and Random Subspace Methods. Experimental results show the superiority of the proposed ensemble method as compared to its competitors, especially in the presence of class label noise and imbalanced classes.
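
    A rough sketch of the weight-learning step described above follows: each base classifier gets one weight per class, and a search procedure looks for the weight matrix that maximizes accuracy on a validation set. Plain random search stands in for the genetic algorithm used in the paper, and the probabilistic outputs are assumed placeholders.

    # Sketch: per-class weights for each base classifier, chosen by random
    # search on a validation set. probas is a list of (n_samples, n_classes)
    # arrays from already-trained classifiers; y_val holds true labels.
    import numpy as np

    def weighted_vote(probas, W):
        # W is an (n_classifiers, n_classes) weight matrix
        score = sum(W[i] * probas[i] for i in range(len(probas)))
        return score.argmax(axis=1)

    def search_weights(probas, y_val, n_iter=500, seed=0):
        rng = np.random.default_rng(seed)
        n_clf, n_cls = len(probas), probas[0].shape[1]
        best_w, best_acc = np.ones((n_clf, n_cls)), 0.0
        for _ in range(n_iter):
            W = rng.random((n_clf, n_cls))
            acc = np.mean(weighted_vote(probas, W) == y_val)
            if acc > best_acc:
                best_w, best_acc = W, acc
        return best_w, best_acc

    # Usage with hypothetical trained classifiers and a validation set:
    # probas_val = [clf.predict_proba(X_val) for clf in trained_classifiers]
    # W, acc = search_weights(probas_val, y_val)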

  19. The Protection of Classified Information: The Legal Framework

    National Research Council Canada - National Science Library

    Elsea, Jennifer K

    2006-01-01

    Recent incidents involving leaks of classified information have heightened interest in the legal framework that governs security classification, access to classified information, and penalties for improper disclosure...

  20. Classifying smoking urges via machine learning.

    Science.gov (United States)

    Dumortier, Antoine; Beckjord, Ellen; Shiffman, Saul; Sejdić, Ervin

    2016-12-01

    Smoking is the largest preventable cause of death and diseases in the developed world, and advances in modern electronics and machine learning can help us deliver real-time intervention to smokers in novel ways. In this paper, we examine different machine learning approaches to use situational features associated with having or not having urges to smoke during a quit attempt in order to accurately classify high-urge states. To test our machine learning approaches, specifically, Bayes, discriminant analysis and decision tree learning methods, we used a dataset collected from over 300 participants who had initiated a quit attempt. The three classification approaches are evaluated observing sensitivity, specificity, accuracy and precision. The outcome of the analysis showed that algorithms based on feature selection make it possible to obtain high classification rates with only a few features selected from the entire dataset. The classification tree method outperformed the naive Bayes and discriminant analysis methods, with an accuracy of the classifications up to 86%. These numbers suggest that machine learning may be a suitable approach to deal with smoking cessation matters, and to predict smoking urges, outlining a potential use for mobile health applications. In conclusion, machine learning classifiers can help identify smoking situations, and the search for the best features and classifier parameters significantly improves the algorithms' performance. In addition, this study also supports the usefulness of new technologies in improving the effect of smoking cessation interventions, the management of time and patients by therapists, and thus the optimization of available health care resources. Future studies should focus on providing more adaptive and personalized support to people who really need it, in a minimum amount of time by developing novel expert systems capable of delivering real-time interventions. Copyright © 2016 Elsevier Ireland Ltd. All rights
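
    The four evaluation measures named in this abstract (sensitivity, specificity, accuracy and precision) reduce to simple counts over a binary confusion matrix; the sketch below computes them for placeholder labels, with 1 standing for a high-urge state.

    # Sketch: sensitivity, specificity, accuracy and precision from binary
    # labels; "1" denotes the high-urge class. Labels here are placeholders.
    import numpy as np

    def binary_metrics(y_true, y_pred):
        y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
        tp = np.sum((y_true == 1) & (y_pred == 1))
        tn = np.sum((y_true == 0) & (y_pred == 0))
        fp = np.sum((y_true == 0) & (y_pred == 1))
        fn = np.sum((y_true == 1) & (y_pred == 0))
        return {
            "sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "accuracy": (tp + tn) / len(y_true),
            "precision": tp / (tp + fp),
        }

    print(binary_metrics([1, 0, 1, 1, 0], [1, 0, 0, 1, 1]))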

  1. Classifying spaces of degenerating polarized Hodge structures

    CERN Document Server

    Kato, Kazuya

    2009-01-01

    In 1970, Phillip Griffiths envisioned that points at infinity could be added to the classifying space D of polarized Hodge structures. In this book, Kazuya Kato and Sampei Usui realize this dream by creating a logarithmic Hodge theory. They use the logarithmic structures begun by Fontaine-Illusie to revive nilpotent orbits as a logarithmic Hodge structure. The book focuses on two principal topics. First, Kato and Usui construct the fine moduli space of polarized logarithmic Hodge structures with additional structures. Even for a Hermitian symmetric domain D, the present theory is a refinem

  2. Gearbox Condition Monitoring Using Advanced Classifiers

    Directory of Open Access Journals (Sweden)

    P. Večeř

    2010-01-01

    Full Text Available New efficient and reliable methods for gearbox diagnostics are needed in the automotive industry because of the growing demand for production quality. This paper presents the application of two different classifiers for gearbox diagnostics – Kohonen Neural Networks and the Adaptive-Network-based Fuzzy Inference System (ANFIS). Two different practical applications are presented. In the first application, the tested gearboxes are separated into two classes according to their condition indicators. In the second example, ANFIS is applied to label the tested gearboxes with a Quality Index according to the condition indicators. In both applications, the condition indicators were computed from the vibration of the gearbox housing.

  3. Cubical sets as a classifying topos

    DEFF Research Database (Denmark)

    Spitters, Bas

    Coquand’s cubical set model for homotopy type theory provides the basis for a computational interpretation of the univalence axiom and some higher inductive types, as implemented in the cubical proof assistant. We show that the underlying cube category is the opposite of the Lawvere theory of De Morgan algebras. The topos of cubical sets itself classifies the theory of ‘free De Morgan algebras’. This provides us with a topos with an internal ‘interval’. Using this interval we construct a model of type theory following van den Berg and Garner. We are currently investigating the precise relation...

  4. Double Ramp Loss Based Reject Option Classifier

    Science.gov (United States)

    2015-05-22

    Full-text fragment: the proposed double ramp loss L_DR is written as a difference of convex (DC) functions and minimized using a DC programming approach [1]. L_DR does not put any restriction on ρ for it to be an upper bound of the reject-option loss L_{0-d-1}. An example classifier is learnt using the L_DR based approach (C = 100, μ = 1, d = 0.2), with filled circles and triangles representing the support vectors.

  5. Online Feature Selection for Classifying Emphysema in HRCT Images

    Directory of Open Access Journals (Sweden)

    M. Prasad

    2008-06-01

    Full Text Available Feature subset selection, applied as a pre-processing step to machine learning, is valuable in dimensionality reduction, eliminating irrelevant data and improving classifier performance. In the classic formulation of the feature selection problem, it is assumed that all the features are available at the beginning. However, in many real world problems, there are scenarios where not all features are present initially and must be integrated as they become available. In such scenarios, online feature selection provides an efficient way to sort through a large space of features. It is in this context that we introduce online feature selection for the classification of emphysema, a smoking related disease that appears as low attenuation regions in High Resolution Computed Tomography (HRCT) images. The technique was successfully evaluated on 61 HRCT scans and compared with different online feature selection approaches, including hill climbing, best first search, grafting, and correlation-based feature selection. The results were also compared against 'density mask', a standard approach used for emphysema detection in medical image analysis.
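
    Hill climbing, one of the selection strategies compared above, can be sketched as a greedy forward loop that adds whichever feature most improves a cross-validated score and stops when no candidate helps. The classifier, scoring and data below are placeholders, not the emphysema pipeline itself.

    # Sketch: greedy (hill-climbing) forward feature selection scored by
    # cross-validated accuracy. Classifier choice and data are placeholders.
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier

    def hill_climb_select(X, y, max_features):
        remaining, selected, best_score = list(range(X.shape[1])), [], 0.0
        while remaining and len(selected) < max_features:
            scores = {
                f: cross_val_score(KNeighborsClassifier(), X[:, selected + [f]], y, cv=5).mean()
                for f in remaining
            }
            f_best = max(scores, key=scores.get)
            if scores[f_best] <= best_score:      # stop when no candidate improves
                break
            selected.append(f_best)
            remaining.remove(f_best)
            best_score = scores[f_best]
        return selected, best_score

    X, y = make_classification(n_samples=200, n_features=10, random_state=0)
    print(hill_climb_select(X, y, max_features=4))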

  6. Classifying Coding DNA with Nucleotide Statistics

    Directory of Open Access Journals (Sweden)

    Nicolas Carels

    2009-10-01

    Full Text Available In this report, we compared the success rate of classification of coding sequences (CDS) vs. introns by Codon Structure Factor (CSF) and by a method that we called Universal Feature Method (UFM). UFM is based on the scoring of purine bias (Rrr) and stop codon frequency. We show that the success rate of CDS/intron classification by UFM is higher than by CSF. UFM classifies ORFs as coding or non-coding through a score based on (i) the stop codon distribution, (ii) the product of purine probabilities in the three positions of nucleotide triplets, (iii) the product of Cytosine (C), Guanine (G), and Adenine (A) probabilities in the 1st, 2nd, and 3rd positions of triplets, respectively, (iv) the probabilities of G in the 1st and 2nd positions of triplets and (v) the distance of their GC3 vs. GC2 levels to the regression line of the universal correlation. More than 80% of CDSs (true positives) of Homo sapiens (>250 bp), Drosophila melanogaster (>250 bp) and Arabidopsis thaliana (>200 bp) are successfully classified with a false positive rate lower than or equal to 5%. The method releases coding sequences in their coding strand and coding frame, which allows their automatic translation into protein sequences with 95% confidence. The method is a natural consequence of the compositional bias of nucleotides in coding sequences.
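
    Two of the five ingredients listed above, the stop-codon frequency and the purine (A/G) frequency at each codon position, are simple to compute for an ORF read in frame, as in the sketch below; this is only an illustration of those inputs, not the published UFM score.

    # Sketch: stop-codon frequency and per-position purine (A/G) frequency
    # for an ORF read in frame. The example sequence is illustrative.
    def codon_statistics(orf):
        orf = orf.upper()
        codons = [orf[i:i + 3] for i in range(0, len(orf) - len(orf) % 3, 3)]
        stops = {"TAA", "TAG", "TGA"}
        stop_freq = sum(c in stops for c in codons) / len(codons)
        purine = [0.0, 0.0, 0.0]
        for c in codons:
            for pos, base in enumerate(c):
                if base in "AG":
                    purine[pos] += 1.0 / len(codons)
        return stop_freq, purine

    print(codon_statistics("ATGGCTAAAGGGTGCTAA"))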

  7. A systematic comparison of supervised classifiers.

    Directory of Open Access Journals (Sweden)

    Diego Raphael Amancio

    Full Text Available Pattern recognition has been employed in a myriad of industrial, commercial and academic applications. Many techniques have been devised to tackle such a diversity of applications. Despite the long tradition of pattern recognition research, there is no technique that yields the best classification in all scenarios. Therefore, as many techniques as possible should be considered in high accuracy applications. Typical related works either focus on the performance of a given algorithm or compare various classification methods. On many occasions, however, researchers who are not experts in the field of machine learning have to deal with practical classification tasks without an in-depth knowledge about the underlying parameters. Actually, the adequate choice of classifiers and parameters in such practical circumstances constitutes a long-standing problem and is one of the subjects of the current paper. We carried out a performance study of nine well-known classifiers implemented in the Weka framework and compared the influence of the parameter configurations on the accuracy. The default configuration of parameters in Weka was found to provide near optimal performance for most cases, not including methods such as the support vector machine (SVM). In addition, the k-nearest neighbor method frequently allowed the best accuracy. In certain conditions, it was possible to improve the quality of SVM by more than 20% with respect to its default parameter configuration.

  8. STATISTICAL TOOLS FOR CLASSIFYING GALAXY GROUP DYNAMICS

    International Nuclear Information System (INIS)

    Hou, Annie; Parker, Laura C.; Harris, William E.; Wilman, David J.

    2009-01-01

    The dynamical state of galaxy groups at intermediate redshifts can provide information about the growth of structure in the universe. We examine three goodness-of-fit tests, the Anderson-Darling (A-D), Kolmogorov, and χ² tests, in order to determine which statistical tool is best able to distinguish between groups that are relaxed and those that are dynamically complex. We perform Monte Carlo simulations of these three tests and show that the χ² test is profoundly unreliable for groups with fewer than 30 members. Power studies of the Kolmogorov and A-D tests are conducted to test their robustness for various sample sizes. We then apply these tests to a sample of the second Canadian Network for Observational Cosmology Redshift Survey (CNOC2) galaxy groups and find that the A-D test is far more reliable and powerful at detecting real departures from an underlying Gaussian distribution than the more commonly used χ² and Kolmogorov tests. We use this statistic to classify a sample of the CNOC2 groups and find that 34 of 106 groups are inconsistent with an underlying Gaussian velocity distribution, and thus do not appear relaxed. In addition, we compute velocity dispersion profiles (VDPs) for all groups with more than 20 members and compare the overall features of the Gaussian and non-Gaussian groups, finding that the VDPs of the non-Gaussian groups are distinct from those classified as Gaussian.
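
    As a small illustration of the classification criterion described above, a velocity sample can be flagged as inconsistent with a Gaussian when its Anderson-Darling statistic exceeds the critical value at a chosen significance level; the 5% level and the synthetic velocities below are assumptions for the example.

    # Sketch: flag a sample of line-of-sight velocities as non-Gaussian with
    # the Anderson-Darling test. The 5% level and the data are illustrative.
    import numpy as np
    from scipy import stats

    def is_non_gaussian(velocities, level=5.0):
        result = stats.anderson(np.asarray(velocities), dist='norm')
        idx = list(result.significance_level).index(level)
        return result.statistic > result.critical_values[idx]

    rng = np.random.default_rng(1)
    print(is_non_gaussian(rng.normal(0, 300, size=40)))                # expect False
    print(is_non_gaussian(np.concatenate([rng.normal(-500, 50, 20),
                                          rng.normal(500, 50, 20)])))  # bimodal, expect True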

  9. Mercury⊕: An evidential reasoning image classifier

    Science.gov (United States)

    Peddle, Derek R.

    1995-12-01

    MERCURY⊕ is a multisource evidential reasoning classification software system based on the Dempster-Shafer theory of evidence. The design and implementation of this software package is described for improving the classification and analysis of multisource digital image data necessary for addressing advanced environmental and geoscience applications. In the remote-sensing context, the approach provides a more appropriate framework for classifying modern, multisource, and ancillary data sets which may contain a large number of disparate variables with different statistical properties, scales of measurement, and levels of error which cannot be handled using conventional Bayesian approaches. The software uses a nonparametric, supervised approach to classification, and provides a more objective and flexible interface to the evidential reasoning framework using a frequency-based method for computing support values from training data. The MERCURY⊕ software package has been implemented efficiently in the C programming language, with extensive use made of dynamic memory allocation procedures and compound linked list and hash-table data structures to optimize the storage and retrieval of evidence in a Knowledge Look-up Table. The software is complete with a full user interface and runs under Unix, Ultrix, VAX/VMS, MS-DOS, and Apple Macintosh operating system. An example of classifying alpine land cover and permafrost active layer depth in northern Canada is presented to illustrate the use and application of these ideas.

  10. Waste Facilities

    Data.gov (United States)

    Vermont Center for Geographic Information — This dataset was developed from the Vermont DEC's list of certified solid waste facilities. It includes facility name, contact information, and the materials...

  11. Health Facilities

    Science.gov (United States)

    Health facilities are places that provide health care. They include hospitals, clinics, outpatient care centers, and specialized care centers, ... psychiatric care centers. When you choose a health facility, you might want to consider How close it ...

  12. Fabrication Facilities

    Data.gov (United States)

    Federal Laboratory Consortium — The Fabrication Facilities are a direct result of years of testing support. Through years of experience, the three fabrication facilities (Fort Hood, Fort Lewis, and...

  13. 36 CFR 1256.46 - National security-classified information.

    Science.gov (United States)

    2010-07-01

    ... 36 Parks, Forests, and Public Property 3 2010-07-01 2010-07-01 false National security-classified... Restrictions § 1256.46 National security-classified information. In accordance with 5 U.S.C. 552(b)(1), NARA... properly classified under the provisions of the pertinent Executive Order on Classified National Security...

  14. Elimination of Ideas and Professional Socialisation

    DEFF Research Database (Denmark)

    Gravengaard, Gitte; Rimestad, Lene

    2012-01-01

    Our aim is to study how this building of expertise takes place at meetings, with a particular focus on the decision-making process concerning ideas for new news stories. In order to do this, we perform linguistic analysis of news production practices, as we investigate how the journalists' ideas for potential news stories are eliminated by the editor at the daily newsroom meetings. The eliminations of ideas for news stories are not just eliminations; they are also corrections of culturally undesirable behaviour, producing and reproducing the proper perception of an important object of knowledge...

  15. Cut elimination in multifocused linear logic

    DEFF Research Database (Denmark)

    Guenot, Nicolas; Brock-Nannestad, Taus

    2015-01-01

    We study cut elimination for a multifocused variant of full linear logic in the sequent calculus. The multifocused normal form of proofs yields problems that do not appear in a standard focused system, related to the constraints in grouping rule instances in focusing phases. We show that cut elimination can be performed in a sensible way even though the proof requires some specific lemmas to deal with multifocusing phases, and discuss the difficulties arising with cut elimination when considering normal forms of proofs in linear logic.

  16. Evaluating performance of high efficiency mist eliminators

    Energy Technology Data Exchange (ETDEWEB)

    Waggoner, Charles A.; Parsons, Michael S.; Giffin, Paxton K. [Mississippi State University, Institute for Clean Energy Technology, 205 Research Blvd, Starkville, MS (United States)

    2013-07-01

    Processing liquid wastes frequently generates off-gas streams with high humidity and liquid aerosols. Droplet-laden air streams can be produced from tank mixing or sparging and processes such as reforming or evaporative volume reduction. Unfortunately, these wet air streams represent a genuine threat to HEPA filters. High efficiency mist eliminators (HEME) are one option for removal of liquid aerosols with high dissolved or suspended solids content. HEMEs have been used extensively in industrial applications; however, they have not seen widespread use in the nuclear industry. Filtering efficiency data along with loading curves are not readily available for these units, and the data that exist are not easily translated to operational parameters in liquid waste treatment plants. A specialized test stand has been developed to evaluate the performance of HEME elements under use conditions of a US DOE facility. HEME elements were tested at three volumetric flow rates using aerosols produced from an iron-rich waste surrogate. The challenge aerosol included submicron particles produced from Laskin nozzles and super-micron particles produced from a hollow cone spray nozzle. Test conditions included ambient temperature and relative humidities greater than 95%. Data collected during testing HEME elements from three different manufacturers included volumetric flow rate, differential temperature across the filter housing, downstream relative humidity, and differential pressure (dP) across the filter element. Filter challenge was discontinued at three intermediate dPs to allow determining filter efficiency using dioctyl phthalate and then with dry surrogate aerosols. Filtering efficiencies of the clean HEME, the clean HEME loaded with water, and the HEME at maximum dP were also collected using the two test aerosols. Results of the testing included differential pressure vs. time loading curves for the nine elements tested along with the mass of moisture and solid

  17. Two channel EEG thought pattern classifier.

    Science.gov (United States)

    Craig, D A; Nguyen, H T; Burchey, H A

    2006-01-01

    This paper presents a real-time electro-encephalogram (EEG) identification system with the goal of achieving hands-free control. With two EEG electrodes placed on the scalp of the user, EEG signals are amplified and digitised directly using a ProComp+ encoder and transferred to the host computer through the RS232 interface. Using a real-time multilayer neural network, the actual classification for the control of a powered wheelchair has a very fast response. It can detect changes in the user's thought pattern in 1 second. Using only two EEG electrodes at positions O1 and C4, the system can classify three mental commands (forward, left and right) with an accuracy of more than 79%.

  18. Classifying Drivers' Cognitive Load Using EEG Signals.

    Science.gov (United States)

    Barua, Shaibal; Ahmed, Mobyen Uddin; Begum, Shahina

    2017-01-01

    A growing traffic safety issue is the effect of cognitive loading activities on traffic safety and driving performance. To monitor drivers' mental state, understanding cognitive load is important since while driving, performing cognitively loading secondary tasks, for example talking on the phone, can affect the performance in the primary task, i.e. driving. Electroencephalography (EEG) is one of the reliable measures of cognitive load that can detect the changes in instantaneous load and effect of cognitively loading secondary task. In this driving simulator study, 1-back task is carried out while the driver performs three different simulated driving scenarios. This paper presents an EEG based approach to classify a drivers' level of cognitive load using Case-Based Reasoning (CBR). The results show that for each individual scenario as well as using data combined from the different scenarios, CBR based system achieved approximately over 70% of classification accuracy.

  19. Classifying prion and prion-like phenomena.

    Science.gov (United States)

    Harbi, Djamel; Harrison, Paul M

    2014-01-01

    The universe of prion and prion-like phenomena has expanded significantly in the past several years. Here, we overview the challenges in classifying this data informatically, given that terms such as "prion-like", "prion-related" or "prion-forming" do not have a stable meaning in the scientific literature. We examine the spectrum of proteins that have been described in the literature as forming prions, and discuss how "prion" can have a range of meaning, with a strict definition being for demonstration of infection with in vitro-derived recombinant prions. We suggest that although prion/prion-like phenomena can largely be apportioned into a small number of broad groups dependent on the type of transmissibility evidence for them, as new phenomena are discovered in the coming years, a detailed ontological approach might be necessary that allows for subtle definition of different "flavors" of prion / prion-like phenomena.

  20. Hybrid Neuro-Fuzzy Classifier Based On Nefclass Model

    Directory of Open Access Journals (Sweden)

    Bogdan Gliwa

    2011-01-01

    Full Text Available The paper presents a hybrid neuro-fuzzy classifier, based on the NEFCLASS model, which was modified. The presented classifier was compared to popular classifiers – neural networks and k-nearest neighbours. The efficiency of the modifications in the classifier was compared with the methods used in the original NEFCLASS model (learning methods). The accuracy of the classifier was tested using 3 datasets from the UCI Machine Learning Repository: iris, wine and breast cancer wisconsin. Moreover, the influence of ensemble classification methods on classification accuracy was presented.

  1. Classifying Transition Behaviour in Postural Activity Monitoring

    Directory of Open Access Journals (Sweden)

    James BRUSEY

    2009-10-01

    Full Text Available A few accelerometers positioned on different parts of the body can be used to accurately classify steady state behaviour, such as walking, running, or sitting. Such systems are usually built using supervised learning approaches. Transitions between postures are, however, difficult to deal with using posture classification systems proposed to date, since there is no label set for intermediary postures and also the exact point at which the transition occurs can sometimes be hard to pinpoint. The usual bypass when using supervised learning to train such systems is to discard a section of the dataset around each transition. This leads to poorer classification performance when the systems are deployed out of the laboratory and used on-line, particularly if the regimes monitored involve fast paced activity changes. Time-based filtering that takes advantage of sequential patterns is a potential mechanism to improve posture classification accuracy in such real-life applications. Also, such filtering should reduce the number of event messages needed to be sent across a wireless network to track posture remotely, hence extending the system’s life. To support time-based filtering, understanding transitions, which are the major event generators in a classification system, is a key. This work examines three approaches to post-process the output of a posture classifier using time-based filtering: a naïve voting scheme, an exponentially weighted voting scheme, and a Bayes filter. Best performance is obtained from the exponentially weighted voting scheme although it is suspected that a more sophisticated treatment of the Bayes filter might yield better results.
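
    The exponentially weighted voting scheme reported above as the best performer can be sketched as a running vote in which older classifier outputs decay geometrically; the decay factor and the toy label stream below are illustrative assumptions.

    # Sketch: exponentially weighted voting filter over a stream of raw
    # posture labels. The decay factor is an assumption for the example.
    from collections import defaultdict

    def exp_weighted_filter(raw_labels, decay=0.7):
        weights = defaultdict(float)
        smoothed = []
        for label in raw_labels:
            for k in weights:                 # fade older votes
                weights[k] *= decay
            weights[label] += 1.0             # add the newest vote at full weight
            smoothed.append(max(weights, key=weights.get))
        return smoothed

    stream = ["sit", "sit", "stand", "sit", "stand", "stand", "stand"]
    print(exp_weighted_filter(stream))        # brief misclassifications are smoothed out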

  2. Just-in-time adaptive classifiers-part II: designing the classifier.

    Science.gov (United States)

    Alippi, Cesare; Roveri, Manuel

    2008-12-01

    Aging effects, environmental changes, thermal drifts, and soft and hard faults affect physical systems by changing their nature and behavior over time. To cope with a process evolution adaptive solutions must be envisaged to track its dynamics; in this direction, adaptive classifiers are generally designed by assuming the stationary hypothesis for the process generating the data with very few results addressing nonstationary environments. This paper proposes a methodology based on k-nearest neighbor (NN) classifiers for designing adaptive classification systems able to react to changing conditions just-in-time (JIT), i.e., exactly when it is needed. k-NN classifiers have been selected for their computational-free training phase, the possibility to easily estimate the model complexity k and keep under control the computational complexity of the classifier through suitable data reduction mechanisms. A JIT classifier requires a temporal detection of a (possible) process deviation (aspect tackled in a companion paper) followed by an adaptive management of the knowledge base (KB) of the classifier to cope with the process change. The novelty of the proposed approach resides in the general framework supporting the real-time update of the KB of the classification system in response to novel information coming from the process both in stationary conditions (accuracy improvement) and in nonstationary ones (process tracking) and in providing a suitable estimate of k. It is shown that the classification system grants consistency once the change targets the process generating the data in a new stationary state, as it is the case in many real applications.
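
    A minimal sketch of the k-NN knowledge-base management described above follows: new supervised samples are appended to the KB as they arrive, and a simple recency-based cap stands in for the paper's data reduction mechanisms. The capacity, k and data below are assumptions for the example.

    # Sketch: a k-NN classifier whose knowledge base (KB) can be extended when
    # new labeled samples arrive and trimmed to a fixed capacity.
    import numpy as np

    class JITKNN:
        def __init__(self, k=5, capacity=1000):
            self.k, self.capacity = k, capacity
            self.X = np.empty((0, 0))
            self.y = np.empty(0, dtype=int)

        def update(self, X_new, y_new):
            X_new = np.atleast_2d(X_new)
            if self.X.size == 0:
                self.X, self.y = X_new, np.asarray(y_new, dtype=int)
            else:
                self.X = np.vstack([self.X, X_new])
                self.y = np.concatenate([self.y, y_new])
            if len(self.y) > self.capacity:       # keep only the most recent samples
                self.X, self.y = self.X[-self.capacity:], self.y[-self.capacity:]

        def predict(self, x):
            d = np.linalg.norm(self.X - np.asarray(x), axis=1)
            nearest = self.y[np.argsort(d)[:self.k]]
            return np.bincount(nearest).argmax()

    clf = JITKNN(k=3, capacity=100)
    clf.update(np.random.randn(50, 4), np.random.randint(0, 2, 50))
    print(clf.predict(np.zeros(4)))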

  3. TOWARDS THE ELIMINATION OF PREVENTABLE DISEASES

    Directory of Open Access Journals (Sweden)

    O. V. Shamsheva

    2013-01-01

    Full Text Available The article presents incidence rates of major vaccine-preventable diseases in the world and the Russian Federation and cites mitigation measures that, in the end, must lead to the elimination of the diseases. 

  4. Region 9 NPDES Facilities 2012- Waste Water Treatment Plants

    Science.gov (United States)

    Point geospatial dataset representing locations of NPDES Waste Water Treatment Plant Facilities. NPDES (National Pollution Discharge Elimination System) is an EPA permit program that regulates direct discharges from facilities that discharge treated waste water into waters of the US. Facilities are issued NPDES permits regulating their discharge as required by the Clean Water Act. A facility may have one or more outfalls (dischargers). The location represents the facility or operating plant.

  5. Region 9 NPDES Facilities - Waste Water Treatment Plants

    Science.gov (United States)

    Point geospatial dataset representing locations of NPDES Waste Water Treatment Plant Facilities. NPDES (National Pollution Discharge Elimination System) is an EPA permit program that regulates direct discharges from facilities that discharge treated waste water into waters of the US. Facilities are issued NPDES permits regulating their discharge as required by the Clean Water Act. A facility may have one or more outfalls (dischargers). The location represents the facility or operating plant.

  6. QUEST: Eliminating Online Supervised Learning for Efficient Classification Algorithms

    Directory of Open Access Journals (Sweden)

    Ardjan Zwartjes

    2016-10-01

    Full Text Available In this work, we introduce QUEST (QUantile Estimation after Supervised Training), an adaptive classification algorithm for Wireless Sensor Networks (WSNs) that eliminates the necessity for online supervised learning. Online processing is important for many sensor network applications. Transmitting raw sensor data puts high demands on the battery, reducing network life time. By merely transmitting partial results or classifications based on the sampled data, the amount of traffic on the network can be significantly reduced. Such classifications can be made by learning based algorithms using sampled data. An important issue, however, is the training phase of these learning based algorithms. Training a deployed sensor network requires a lot of communication and an impractical amount of human involvement. QUEST is a hybrid algorithm that combines supervised learning in a controlled environment with unsupervised learning on the location of deployment. Using the SITEX02 dataset, we demonstrate that the presented solution works with a performance penalty of less than 10% in 90% of the tests. Under some circumstances, it even outperforms a network of classifiers completely trained with supervised learning. As a result, the need for on-site supervised learning and communication for training is completely eliminated by our solution.

  7. QUEST: Eliminating Online Supervised Learning for Efficient Classification Algorithms.

    Science.gov (United States)

    Zwartjes, Ardjan; Havinga, Paul J M; Smit, Gerard J M; Hurink, Johann L

    2016-10-01

    In this work, we introduce QUEST (QUantile Estimation after Supervised Training), an adaptive classification algorithm for Wireless Sensor Networks (WSNs) that eliminates the necessity for online supervised learning. Online processing is important for many sensor network applications. Transmitting raw sensor data puts high demands on the battery, reducing network lifetime. By merely transmitting partial results or classifications based on the sampled data, the amount of traffic on the network can be significantly reduced. Such classifications can be made by learning-based algorithms using sampled data. An important issue, however, is the training phase of these learning-based algorithms. Training a deployed sensor network requires a lot of communication and an impractical amount of human involvement. QUEST is a hybrid algorithm that combines supervised learning in a controlled environment with unsupervised learning on the location of deployment. Using the SITEX02 dataset, we demonstrate that the presented solution works with a performance penalty of less than 10% in 90% of the tests. Under some circumstances, it even outperforms a network of classifiers completely trained with supervised learning. As a result, the need for on-site supervised learning and communication for training is completely eliminated by our solution.

  8. Classifying Adverse Events in the Dental Office.

    Science.gov (United States)

    Kalenderian, Elsbeth; Obadan-Udoh, Enihomo; Maramaldi, Peter; Etolue, Jini; Yansane, Alfa; Stewart, Denice; White, Joel; Vaderhobli, Ram; Kent, Karla; Hebballi, Nutan B; Delattre, Veronique; Kahn, Maria; Tokede, Oluwabunmi; Ramoni, Rachel B; Walji, Muhammad F

    2017-06-30

    Dentists strive to provide safe and effective oral healthcare. However, some patients may encounter an adverse event (AE), defined as "unnecessary harm due to dental treatment." In this research, we propose and evaluate two systems for categorizing the type and severity of AEs encountered at the dental office. Several existing medical AE type and severity classification systems were reviewed and adapted for dentistry. Using data collected in previous work, two initial dental AE type and severity classification systems were developed. Eight independent reviewers performed focused chart reviews, and the AEs identified were used to evaluate and modify these newly developed classifications. A total of 958 charts were independently reviewed. Among the reviewed charts, 118 prospective AEs were found and 101 (85.6%) were verified as AEs through a consensus process. At the end of the study, a final AE type classification comprising 12 categories and an AE severity classification comprising 7 categories emerged. Pain and infection were the most common AE types, representing 73% of the cases reviewed (56% and 17%, respectively), and 88% were found to cause temporary, moderate to severe harm to the patient. Adverse events found during the chart review process were successfully classified using the novel dental AE type and severity classifications. Understanding the types of AEs and their severity is an important step if we are to learn from and prevent patient harm in the dental office.

  9. Is it important to classify ischaemic stroke?

    LENUS (Irish Health Repository)

    Iqbal, M

    2012-02-01

    Thirty-five percent of all ischaemic events remain classified as cryptogenic. This study was conducted to ascertain the accuracy of diagnosis of ischaemic stroke based on information given in the medical notes. It was tested by applying the clinical information to the TOAST criteria. One hundred and five patients presented with acute stroke between Jan-Jun 2007. Data were collected on 90 patients. The male to female ratio was 39:51, with an age range of 47-93 years. Sixty (67%) patients had total/partial anterior circulation stroke; 5 (5.6%) had a lacunar stroke and in 25 (28%) the mechanism of stroke could not be identified. Four (4.4%) patients with small vessel disease were anticoagulated; 5 (5.6%) with atrial fibrillation received antiplatelet therapy and 2 (2.2%) patients with atrial fibrillation underwent CEA. This study revealed deficiencies in the clinical assessment of patients, and treatment was not tailored to the mechanism of stroke in some patients.

  10. Stress fracture development classified by bone scintigraphy

    International Nuclear Information System (INIS)

    Zwas, S.T.; Elkanovich, R.; Frank, G.; Aharonson, Z.

    1985-01-01

    There is no consensus on classifying stress fractures (SF) appearing on bone scans. The authors present a system of classification based on grading the severity and development of bone lesions by visual inspection, according to three main scintigraphic criteria: focality and size, intensity of uptake compared to adjacent bone, and local medullary extension. Four grades of development (I-IV) were ranked, ranging from ill-defined, slightly increased cortical uptake to well-defined regions with markedly increased uptake extending transversely bicortically. 310 male subjects aged 19-2, who had been suffering for several weeks from leg pains occurring during intensive physical training, underwent bone scans of the pelvis and lower extremities using Tc-99m-MDP. 76% of the scans were positive with 354 lesions, of which 88% were in the mild (I-II) grades and 12% in the moderate (III) and severe (IV) grades. Post-treatment scans were obtained in 65 cases having 78 lesions during 1- to 6-month intervals. Complete resolution was found after 1-2 months in 36% of the mild lesions but in only 12% of the moderate and severe ones, and after 3-6 months in 55% of the mild lesions and 15% of the severe ones. 75% of the moderate and severe lesions showed residual uptake in various stages throughout the follow-up period. Early recognition and treatment of mild SF lesions in this study prevented protracted disability and progression of the lesions and facilitated complete healing

  11. Animal facilities

    International Nuclear Information System (INIS)

    Fritz, T.E.; Angerman, J.M.; Keenan, W.G.; Linsley, J.G.; Poole, C.M.; Sallese, A.; Simkins, R.C.; Tolle, D.

    1981-01-01

    The animal facilities in the Division are described. They consist of kennels, animal rooms, service areas, and technical areas (examining rooms, operating rooms, pathology labs, x-ray rooms, and 60Co exposure facilities). The computer support facility is also described. The advent of the Conversational Monitor System at Argonne has launched a new effort to set up conversational computing and graphics software for users. The existing LS-11 data acquisition systems have been further enhanced and expanded. The divisional radiation facilities include a number of gamma, neutron, and x-ray radiation sources with accompanying areas for related equipment. There are five 60Co irradiation facilities; a research reactor, Janus, is a source for fission-spectrum neutrons; two other neutron sources in the Chicago area are also available to the staff for cell biology studies. The electron microscope facilities are also described

  12. Dismantling of nuclear facilities

    International Nuclear Information System (INIS)

    Tallec, M.; Kus, J.P.

    2009-01-01

    Nuclear facilities have long but necessarily limited lifetimes. At the end of their operation period, basic nuclear installations are the object of cleansing operations and transformations that will lead to their definitive decommissioning and then to their dismantling. Because each facility is to some extent unique, cleansing and dismantling require specific techniques. Dismantling consists of the disassembly and disposal of large equipment, the elimination of radioactivity from all rooms of the facility, the demolition of buildings and, eventually, the reconversion of all or part of the facility. This article describes these different steps: 1 - dismantling strategy: main de-construction guidelines, expected final state; 2 - industries and sites: cleansing and dismantling at the CEA, EDF's sites under de-construction; 3 - de-construction: main steps, definitive shutdown, preparation of dismantling, electromechanical dismantling, cleansing/decommissioning, demolition, dismantling taken into account at the design stage, management of polluted soils; 4 - waste management: dismantling wastes, national policy of radioactive waste management, management of dismantling wastes; 5 - mastery of risks: risk analysis, conformity of risk management with reference documents, main risks encountered during de-construction works; 6 - regulatory procedures; 7 - international overview; 8 - conclusion. (J.S.)

  13. Martian Atmospheric Pressure Static Charge Elimination Tool

    Science.gov (United States)

    Johansen, Michael R.

    2014-01-01

    A Martian pressure static charge elimination tool is currently in development in the Electrostatics and Surface Physics Laboratory (ESPL) at NASA's Kennedy Space Center. In standard Earth atmosphere conditions, static charge can be neutralized from an insulating surface using air ionizers. These air ionizers generate ions through corona breakdown. The Martian atmosphere is 7 Torr of mostly carbon dioxide, which makes it inherently difficult to use similar methods as those used for standard atmosphere static elimination tools. An initial prototype has been developed to show feasibility of static charge elimination at low pressure, using corona discharge. A needle point and thin wire loop are used as the corona generating electrodes. A photo of the test apparatus is shown below. Positive and negative high voltage pulses are sent to the needle point. This creates positive and negative ions that can be used for static charge neutralization. In a preliminary test, a floating metal plate was charged to approximately 600 volts under Martian atmospheric conditions. The static elimination tool was enabled and the voltage on the metal plate dropped rapidly to -100 volts. This test data is displayed below. Optimization is necessary to improve the electrostatic balance of the static elimination tool.

  14. Body elimination attitude family resemblance in Kuwait.

    Science.gov (United States)

    Al-Fayez, Ghenaim; Awadalla, Abdelwahid; Arikawa, Hiroko; Templer, Donald I; Hutton, Shane

    2009-12-01

    The purpose of the present study was to determine the family resemblance of attitude toward body elimination in Kuwaiti participants. This study was conceptualized in the context of the theories of moral development, importance of cleanliness in the Muslim religion, cross-cultural differences in personal hygiene practices, previous research reporting an association between family attitudes and body elimination attitude, and health implications. The 24-item Likert-type format Body Elimination Attitude Scale-Revised was administered to 277 Kuwaiti high school students and 437 of their parents. Females scored higher, indicating greater disgust, than the males. Moreover, sons' body elimination attitude correlated more strongly with fathers' attitude (r = .85) than with that of the mothers (r = .64). Daughters' attitude was similarly associated with the fathers' (r = .89) and the mothers' attitude (r = .86). The high correlations were discussed within the context of Kuwait having a collectivistic culture with authoritarian parenting style. The higher adolescent correlations, and in particular the boys' correlation with fathers than with mothers, was explained in terms of the more dominant role of the Muslim father in the family. Public health and future research implications were suggested. A theoretical formulation was advanced in which "ideal" body elimination attitude is relative rather than absolute, and is a function of one's life circumstances, one's occupation, one's culture and subculture, and the society that one lives in.

  15. 41 CFR 105-62.102 - Authority to originally classify.

    Science.gov (United States)

    2010-07-01

    ... originally classify. (a) Top secret, secret, and confidential. The authority to originally classify information as Top Secret, Secret, or Confidential may be exercised only by the Administrator and is delegable...

  16. Naive Bayesian classifiers for multinomial features: a theoretical analysis

    CSIR Research Space (South Africa)

    Van Dyk, E

    2007-11-01

    Full Text Available The authors investigate the use of naive Bayesian classifiers for multinomial feature spaces and derive error estimates for these classifiers. The error analysis is done by developing a mathematical model to estimate the probability density...

  17. Ensemble of classifiers based network intrusion detection system performance bound

    CSIR Research Space (South Africa)

    Mkuzangwe, Nenekazi NP

    2017-11-01

    Full Text Available This paper provides a performance bound of a network intrusion detection system (NIDS) that uses an ensemble of classifiers. Currently researchers rely on implementing the ensemble of classifiers based NIDS before they can determine the performance...

  18. Selecting non-classified hotels in Kenya: what really matters for business guests?

    Directory of Open Access Journals (Sweden)

    Alex K Kivuva

    2014-01-01

    Full Text Available Non-classified hotels, which comprise small hotels and guest houses, are important accommodation providers offering limited services and products compared to classified hotels. Through guest satisfaction, they can achieve repeat business and also gain new business through word of mouth from previous guests. The main focus is for hoteliers to know exactly what determines their guests' selection of a hotel. In this case, the focus was on non-classified hotels in Mtwapa town at the Kenyan coast. The study adopted a cross-sectional descriptive survey design. Results from this study clearly indicate that all aspects of hotel operations are important to business guests' selection of a non-classified hotel, although not on an equal basis. Results indicate that the core product (guestroom comfortability, hygiene and cleanliness) was the most important factor in determining guests' selection of where to stay. This research therefore suggests that any efforts towards quality improvement in a hotel should focus primarily on ensuring customer satisfaction with the guestroom. While acknowledging the importance of all aspects of hotel operations, managers should recognize the importance of the guestroom and its facilities towards hotel selection and overall customer satisfaction. Therefore, it is imperative that managers channel their resources towards improving guest services in the guestrooms in accordance with the requirements of the clientele. This includes such aspects as the look of the guest rooms, the facilities provided in the guest rooms and the comfortability of the bed and mattress.

  19. Fast Most Similar Neighbor (MSN) classifiers for Mixed Data

    OpenAIRE

    Hernández Rodríguez, Selene

    2010-01-01

    The k nearest neighbor (k-NN) classifier has been extensively used in Pattern Recognition because of its simplicity and its good performance. However, in large datasets applications, the exhaustive k-NN classifier becomes impractical. Therefore, many fast k-NN classifiers have been developed; most of them rely on metric properties (usually the triangle inequality) to reduce the number of prototype comparisons. Hence, the existing fast k-NN classifiers are applicable only when the comparison f...

  20. Method of eliminating gaseous hydrogen isotopes

    International Nuclear Information System (INIS)

    Nagakura, Masaaki; Imaizumi, Hideki; Suemori, Nobuo; Aizawa, Takashi; Naito, Taisei.

    1983-01-01

    Purpose: To prevent the external diffusion of gaseous hydrogen isotopes such as tritium upon occurrence of a tritium leakage accident in a thermonuclear reactor, by recovering and eliminating the isotopes rapidly and safely. Method: Gases in the region of the reactor container where hydrogen isotopes might leak are sucked in by a recycling pump, dehumidified in a dehumidifier and then recycled from a preheater through a catalytic oxidation reactor to a water absorption tower. In this structure, the dehumidifier is disposed upstream of the catalytic oxidation reactor to reduce the water content of the gases to be processed, whereby the elimination efficiency for the gases to be processed can be maintained even when the oxidation reactor is operated at a low temperature near the ambient temperature. This method is based on the fact that the oxidizing reactivity of the catalyst can be improved significantly by eliminating the water content in the gases to be processed. (Yoshino, Y.)

  1. Three data partitioning strategies for building local classifiers (Chapter 14)

    NARCIS (Netherlands)

    Zliobaite, I.; Okun, O.; Valentini, G.; Re, M.

    2011-01-01

    The divide-and-conquer approach has been recognized in multiple classifier systems as a way to utilize the local expertise of individual classifiers. In this study we experimentally investigate three strategies for building local classifiers that are based on different routines of sampling data for training.

  2. Recognition of pornographic web pages by classifying texts and images.

    Science.gov (United States)

    Hu, Weiming; Wu, Ou; Chen, Zhouyao; Fu, Zhouyu; Maybank, Steve

    2007-06-01

    With the rapid development of the World Wide Web, people benefit more and more from the sharing of information. However, Web pages with obscene, harmful, or illegal content can be easily accessed. It is important to recognize such unsuitable, offensive, or pornographic Web pages. In this paper, a novel framework for recognizing pornographic Web pages is described. A C4.5 decision tree is used to divide Web pages, according to content representations, into continuous text pages, discrete text pages, and image pages. These three categories of Web pages are handled, respectively, by a continuous text classifier, a discrete text classifier, and an algorithm that fuses the results from the image classifier and the discrete text classifier. In the continuous text classifier, statistical and semantic features are used to recognize pornographic texts. In the discrete text classifier, the naive Bayes rule is used to calculate the probability that a discrete text is pornographic. In the image classifier, the object's contour-based features are extracted to recognize pornographic images. In the text and image fusion algorithm, the Bayes theory is used to combine the recognition results from images and texts. Experimental results demonstrate that the continuous text classifier outperforms the traditional keyword-statistics-based classifier, the contour-based image classifier outperforms the traditional skin-region-based image classifier, the results obtained by our fusion algorithm outperform those by either of the individual classifiers, and our framework can be adapted to different categories of Web pages.
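
    The fusion step the abstract mentions can be illustrated with a small odds-based combination (a generic sketch under the assumption that the two classifiers return independent posterior estimates; the paper's actual fusion rule may differ):

        def fuse_posteriors(p_text, p_image, prior=0.5):
            # combine two conditionally independent posteriors via likelihood-ratio odds
            odds = lambda p: p / (1.0 - p)
            lr = (odds(p_text) / odds(prior)) * (odds(p_image) / odds(prior))
            posterior_odds = lr * odds(prior)
            return posterior_odds / (1.0 + posterior_odds)

        # e.g. fuse_posteriors(0.8, 0.6) is about 0.86, higher than either input alone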

  3. 32 CFR 2400.28 - Dissemination of classified information.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 6 2010-07-01 2010-07-01 false Dissemination of classified information. 2400.28... SECURITY PROGRAM Safeguarding § 2400.28 Dissemination of classified information. Heads of OSTP offices... originating official may prescribe specific restrictions on dissemination of classified information when...

  4. Facilities Programming.

    Science.gov (United States)

    Bullis, Robert V.

    1992-01-01

    A procedure for physical facilities management written 17 years ago is still worth following today. Each of the steps outlined for planning, organizing, directing, controlling, and evaluating must be accomplished if school facilities are to be properly planned and constructed. However, lessons have been learned about energy consumption and proper…

  5. Nuclear facilities

    International Nuclear Information System (INIS)

    Anon.

    2000-01-01

    This is decree 2000-1065 of 25 October 2000 publishing the convention between the Government of the French Republic and CERN concerning the safety of the LHC (Large Hadron Collider) and SPS (Super Proton Synchrotron) facilities, signed in Geneva on 11 July 2000. By this convention, CERN undertakes to ensure the safety of the LHC and SPS facilities and of the LEP decommissioning operations. The French legislation and regulations on basic nuclear facilities (concerning more particularly the protection against ionizing radiation, the protection of the environment and the safety of facilities), and those which may be decided later on, apply to the LHC, SPS and auxiliary facilities. (O.M.)

  6. Ten years left to eliminate blinding trachoma

    Directory of Open Access Journals (Sweden)

    Haddad D.

    2010-09-01

    Full Text Available In 1997, the World Health Organization formed the Global Alliance to Eliminate Blinding Trachoma by 2020 (GET 2020), a coalition of governmental, non-governmental, research, and pharmaceutical partners. In 1998, the World Health Assembly urged member states to map blinding trachoma in endemic areas, implement the SAFE strategy (which stands for surgery for trichiasis, antibiotics, facial cleanliness and environmental change, such as clean water and latrines), and collaborate with the global alliance in its work to eliminate blinding trachoma.

  7. Duplicate Record Elimination in Large Data Files.

    Science.gov (United States)

    1981-08-01

    COMPUTER SCIENCES DEPARTMENT, University of Wisconsin ... we propose a combinatorial model for use in the analysis of algorithms for duplicate elimination. We contend that this model can serve as a ... duplicates in a multiset of records, knowing the size of the multiset and the number of distinct records in it. 3. Algorithms for Duplicate Elimination
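
    The record itself is fragmentary, but the task it analyses, removing duplicate records from a large file, can be illustrated with a trivial single-pass routine (a generic sketch, unrelated to the combinatorial model the report proposes):

        # Single-pass duplicate elimination; records must be hashable (e.g. tuples of fields).
        def eliminate_duplicates(records):
            seen = set()
            for rec in records:
                if rec not in seen:
                    seen.add(rec)
                    yield rec

        # usage: list(eliminate_duplicates(["a", "b", "a", "c"]))  ->  ["a", "b", "c"]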

  8. Noise elimination algorithm for modal analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bao, X. X., E-mail: baoxingxian@upc.edu.cn [Department of Naval Architecture and Ocean Engineering, China University of Petroleum (East China), Qingdao 266580 (China); Li, C. L. [Key Laboratory of Marine Geology and Environment, Institute of Oceanology, Chinese Academy of Sciences, Qingdao 266071 (China); Xiong, C. B. [The First Institute of Oceanography, State Oceanic Administration, Qingdao 266061 (China)

    2015-07-27

    Modal analysis is an ongoing interdisciplinary physical issue. Modal parameter estimation is applied to determine the dynamic characteristics of structures under vibration excitation. Modal analysis is more challenging when the measured vibration response signals are contaminated with noise. This study develops a mathematical algorithm of structured low rank approximation combined with the complex exponential method to estimate the modal parameters. Physical experiments using a steel cantilever beam with ten accelerometers mounted, excited by an impulse load, demonstrate that this method can significantly eliminate noise from measured signals and accurately identify the modal frequencies and damping ratios. This study provides a fundamental mechanism of noise elimination using structured low rank approximation in physical fields.
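
    A common way to realise structured low rank approximation for a noisy 1-D signal is Cadzow-style denoising; the sketch below illustrates that general idea (not necessarily the authors' exact algorithm): embed the signal in a Hankel matrix, truncate its SVD to rank r, and average the anti-diagonals back into a signal.

        import numpy as np

        def low_rank_denoise(x, r, L=None):
            x = np.asarray(x, dtype=float)
            N = len(x)
            L = L or N // 2
            K = N - L + 1
            H = np.array([x[i:i + K] for i in range(L)])      # L x K Hankel matrix
            U, s, Vt = np.linalg.svd(H, full_matrices=False)
            H_r = (U[:, :r] * s[:r]) @ Vt[:r, :]              # rank-r approximation
            y, counts = np.zeros(N), np.zeros(N)
            for i in range(L):                                # anti-diagonal averaging
                for j in range(K):
                    y[i + j] += H_r[i, j]
                    counts[i + j] += 1
            return y / counts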

  9. Peat classified as slowly renewable biomass fuel

    International Nuclear Information System (INIS)

    2001-01-01

    thousands of years. The report also states that peat should be classified as a biomass fuel, distinct from biofuels such as wood and from fossil fuels such as coal. According to the report, peat is a renewable biomass fuel like biofuels, but due to its slow accumulation it should be considered a slowly renewable fuel. The report estimates that the bonding of carbon in both virgin and forest-drained peatlands is so high that it can compensate for the emissions formed in the combustion of energy peat

  10. Waste Receiving and Processing Facility (WRAP) Drawing List

    International Nuclear Information System (INIS)

    WEIDERT, J.R.

    1999-01-01

    This supporting document delineates the process of identification, categorization, and/or classification of the WRAP facility drawings used to support facility operations and maintenance. This document provides a listing of those essential or safety-related drawings which have been identified to date. All other WRAP facility drawings have been classified as general.

  11. Fumigation success for California facility.

    Science.gov (United States)

    Hacker, Robert

    2010-02-01

    As Robert Hacker, at the time director of facilities management at the St John's Regional Medical Center in Oxnard, California, explains, the hospital, one of the area's largest, recently successfully utilised a new technology to eliminate mould, selecting a cost and time-saving fumigation process in place of the traditional "rip and tear" method. Although hospital managers knew the technology had been used extremely effectively in other US buildings, this was reportedly among the first ever healthcare applications.

  12. Taking Centrioles to the Elimination Round.

    Science.gov (United States)

    Schoborg, Todd A; Rusan, Nasser M

    2016-07-11

    Two recent papers published in The Journal of Cell Biology (Borrego-Pinto et al., 2016) and Science (Pimenta-Marques et al., 2016) have begun to shed light on the mechanism of centriole elimination during female oogenesis, highlighting a protective role for Polo kinase and the pericentriolar material. Published by Elsevier Inc.

  13. Elimination Problems in Infants and Children

    Science.gov (United States)

    ... inability to digest wheat (CELIAC DISEASE) or milk (LACTOSE INTOLERANCE) can cause these symptoms. Self care: eliminate foods that ... be an appropriate substitute for infants who have lactose intolerance. Diagnosis: pain from HEMORRHOIDS or an ANAL FISSURE ...

  14. Eliminating Problems Caused by Multicollinearity: A Warning.

    Science.gov (United States)

    Kennedy, Peter E.

    1982-01-01

    Explains why an econometric practice introduced by J.C. Soper cannot eliminate the problems caused by multicollinearity. The author suggests that it can be a useful technique in that it forces researchers to pay more attention to the specifications of their models. (AM)

  15. Double elimination voltammetry of short oligonucleotides

    Czech Academy of Sciences Publication Activity Database

    Mikelová, R.; Trnková, L.; Jelen, František

    2007-01-01

    Roč. 19, č. 17 (2007), s. 1807-1814 ISSN 1040-0397 R&D Projects: GA AV ČR(CZ) IAA100040602 Institutional research plan: CEZ:AV0Z50040507; CEZ:AV0Z50040702 Keywords : adsorptive stripping voltammetry * elimination voltammetry * oligodeoxynucleotide Subject RIV: BO - Biophysics Impact factor: 2.949, year: 2007

  16. Strategy elimination in games with interaction structures

    NARCIS (Netherlands)

    Witzel, A.; Apt, K.R.; Zvesper, J.A.

    2009-01-01

    We study games in the presence of an interaction structure, which allows players to communicate their preferences, assuming that each player initially only knows his own preferences. We study the outcomes of iterated elimination of strictly dominated strategies (IESDS) that can be obtained in any

  17. [Application of thermosetting plastics to eliminate undercuts].

    Science.gov (United States)

    Bielawski, T

    1989-01-01

    The author proposes to exploit the properties of thermosetting plastics used in other fields and apply them in prosthetics in order to eliminate undercuts. The use of extra equipment on the claspograph, in the form of three-dimensional studs, makes it easier to block out undercuts and at the same time improves the result of the work.

  18. Challenges for malaria elimination in Brazil.

    Science.gov (United States)

    Ferreira, Marcelo U; Castro, Marcia C

    2016-05-20

    Brazil currently contributes 42 % of all malaria cases reported in Latin America and the Caribbean, a region where major progress towards malaria elimination has been achieved in recent years. In 2014, the malaria burden in Brazil (143,910 microscopically confirmed cases and 41 malaria-related deaths) reached its lowest level in 35 years, Plasmodium falciparum is highly focal, and the geographic boundary of transmission has considerably shrunk. Transmission in Brazil remains entrenched in the Amazon Basin, which accounts for 99.5 % of the country's malaria burden. This paper reviews major lessons learned from past and current malaria control policies in Brazil. A comprehensive discussion of the scientific and logistic challenges that may impact malaria elimination efforts in the country is presented in light of the launching of the Plan for Elimination of Malaria in Brazil in November 2015. Challenges for malaria elimination addressed include the high prevalence of symptomless and submicroscopic infections, emerging anti-malarial drug resistance in P. falciparum and Plasmodium vivax and the lack of safe anti-relapse drugs, the largely neglected burden of malaria in pregnancy, the need for better vector control strategies where Anopheles mosquitoes present a highly variable biting behaviour, human movement, the need for effective surveillance and tools to identify foci of infection in areas with low transmission, and the effects of environmental changes and climatic variability on transmission.

  19. Eliminating transducer distortion in acoustic measurements

    DEFF Research Database (Denmark)

    Agerkvist, Finn T.; Torras Rosell, Antoni; McWalter, Richard Ian

    2014-01-01

    This paper investigates the influence of nonlinear components that contaminate the linear response of an acoustic transducer, and presents a method for eliminating the influence of nonlinearities in acoustic measurements. The method is evaluated on simulated as well as experimental data, and is shown...

  20. Visceral Leishmaniasis : Potential for Control and Elimination

    NARCIS (Netherlands)

    E.A. le Rutte (Epke)

    2018-01-01

    Over the past years there has been a steep increase in awareness of visceral leishmaniasis (VL); many large-scale interventions are being implemented and targets for control and elimination have been set. In this thesis the potential of reaching these targets will be explored. To

  1. Mammography Facilities

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Mammography Facility Database is updated periodically based on information received from the four FDA-approved accreditation bodies: the American College of...

  2. Canyon Facilities

    Data.gov (United States)

    Federal Laboratory Consortium — B Plant, T Plant, U Plant, PUREX, and REDOX (see their links) are the five facilities at Hanford where the original objective was plutonium removal from the uranium...

  3. Concluding remarks on future facilities

    International Nuclear Information System (INIS)

    Jean-Marie, B.

    1989-12-01

    The principles of some of the facilities and projects for the study of hadron spectroscopy are summarized. The work focuses on e+e- machines, which are classified according to the quark family they can study: the U, D, S quark families, C quark and τ studies, and the B quark family. The analysis leads to the conclusion that high-luminosity e+e- machines are needed to make progress in the exploration of hadron spectroscopy

  4. Possibilities of Sulphate Elimination from Mine Water

    Directory of Open Access Journals (Sweden)

    Heviánková Silvie

    2004-12-01

    Full Text Available The issue of „acid water“ (or AMD) has been well known in the world for centuries. In Eastern Slovakia, the most acidic surface water occurs in the area of the old Smolník mine, which has been closed and submerged for 15 years. The submitted contribution deals with sulphate elimination from this locality. Several methods of sulphate elimination from mine water are currently applied. The best-known methods are biological, physical-chemical and chemical precipitation. The method described in this contribution consists of chemical precipitation by sodium aluminate and calcium hydrate. Very interesting results were obtained with this method. The concentration of SO42- anions decreased to almost zero when optimal doses of the chemical reagents were used.

  5. McGuire snubber elimination program

    International Nuclear Information System (INIS)

    Cloud, R.L.; Leung, J.S.M.; Taylor, W.H.; Morgan, R.L. Jr.

    1993-01-01

    An engineering program has been initiated at McGuire Nuclear Stations 1 and 2 to eliminate all existing snubbers. The elimination is achieved by replacing existing snubbers with limit stop pipe supports. The program establishes plant-wide modification procedures for one-to-one substitution under the 10 CFR 50.59 requirement. Nuclear Regulatory Commission (NRC) acceptance is based on the results of both comparison analyses and the hardware implementation of sample piping systems at McGuire nuclear stations. Experimental results obtained on shake table testing and from the NRC sponsored HDR research program are also used to formulate the technical basis and design procedures for plant-wide implementation of the snubber replacement effort. The overall program plan is for nearly 3,000 snubbers to be replaced in phases consistent with the plant scheduled outages. Duke Power estimates the program, when completed, will maintain ALARA, improve reliability, and reduce plant operating costs

  6. Elimination of salmonella from animal glandular products.

    Science.gov (United States)

    De Fiebre, C W; Burck, K T; Feldman, D

    1969-03-01

    Methods for the elimination of salmonellae from selected powdered pharmaceuticals of animal glandular origin were studied. Terminal heat treatment under carefully controlled conditions was effective for pancreatin-a powder containing proteolytic, amylolytic, and lipolytic enzymes prepared from hog pancreas glands. Use of this method resulted in a significant reduction in the number of salmonella-positive batches and also reduced the testing procedures required to confirm the absence of viable salmonellae among the majority of samples tested. Powders such as stomach substance and thyroid, in which the biological activity is not enzyme in nature, were treated successfully with acidified organic solvents. Other methods were investigated but were not suitable because of a deleterious effect on the biological activity or physical properties of the product or an inability to effect salmonella elimination.

  7. Elimination of Salmonellae from Animal Glandular Products

    Science.gov (United States)

    De Fiebre, Conrad W.; Burck, Kenneth T.; Feldman, David

    1969-01-01

    Methods for the elimination of salmonellae from selected powdered pharmaceuticals of animal glandular origin were studied. Terminal heat treatment under carefully controlled conditions was effective for pancreatin—a powder containing proteolytic, amylolytic, and lipolytic enzymes prepared from hog pancreas glands. Use of this method resulted in a significant reduction in the number of salmonella-positive batches and also reduced the testing procedures required to confirm the absence of viable salmonellae among the majority of samples tested. Powders such as stomach substance and thyroid, in which the biological activity is not enzyme in nature, were treated successfully with acidified organic solvents. Other methods were investigated but were not suitable because of a deleterious effect on the biological activity or physical properties of the product or an inability to effect salmonella elimination. PMID:5780395

  8. Eliminative behaviour of dairy cows at pasture

    DEFF Research Database (Denmark)

    Whistance, Lindsay Kay; Sinclair, Liam A.; Arney, David Richard

    2011-01-01

    Walking whilst defaecating was most likely to occur when cows were simultaneously engaged in an ‘active’ state, such as going to drink or catching up with the herd. Overall, standing to defaecate and moving forward...... behaviour of 40 Holstein-Friesian cows was observed at pasture for 6 h each day between morning and afternoon milking for a total of 24 h. Lying (l), standing (s) and walking (w) behaviours were recorded pre, during and post-elimination. Sequences of 3–6 changes in these behaviours were recorded if expressed...... within 30 s of an eliminative event. Intentional, incidental or no avoidance of faeces was also recorded for each event. Activity, characterised as static (lying, grazing or loafing), or active (moving to a different area of field, going to drink and catching up with herd) was also recorded. Of the 437

  9. Elimination of Plasmodium falciparum malaria in Tajikistan.

    Science.gov (United States)

    Kondrashin, Anatoly V; Sharipov, Azizullo S; Kadamov, Dilshod S; Karimov, Saifuddin S; Gasimov, Elkhan; Baranova, Alla M; Morozova, Lola F; Stepanova, Ekaterina V; Turbabina, Natalia A; Maksimova, Maria S; Morozov, Evgeny N

    2017-05-30

    Malaria was eliminated in Tajikistan by the beginning of the 1960s. However, sporadic introduced cases of malaria occurred subsequently, probably as a result of transmission from infected Anopheles mosquitoes flying across the river Punj from the border areas of Afghanistan. During the 1970s and 1980s, local outbreaks of malaria were reported in the southern districts bordering Afghanistan. The malaria situation changed dramatically during the 1990s following armed conflict and civil unrest in the newly independent Tajikistan, which paralyzed health services, including malaria control activities, and a large-scale malaria epidemic occurred with more than 400,000 malaria cases. The malaria epidemic was contained by 1999 as a result of considerable financial input from the Government and the international community. Although Plasmodium falciparum constituted only about 5% of total malaria cases, the reduction of its incidence was slower than that of Plasmodium vivax. To prevent an increase in P. falciparum malaria, both in terms of incidence and territory, a P. falciparum elimination programme in the Republic was launched in 200, jointly supported by the Government and the Global Fund for control of AIDS, tuberculosis and malaria. The main activities included the use of pyrethroids for IRS with a determined periodicity, deployment of insecticide-impregnated mosquito nets, use of larvivorous fish as a biological larvicide, implementation of small-scale environmental management, and use of personal protection methods by the population at malaria risk. The malaria surveillance system was strengthened by the use of ACD, PCD, RCD and selective mass blood surveys. All detected cases were epidemiologically investigated in a timely manner and treated based on the results of laboratory diagnosis. As a result, by 2009, P. falciparum malaria was eliminated from all of Tajikistan, one year ahead of the originally targeted date. Elimination of P. falciparum also contributed towards

  10. An Elimination of Resonance in Electric Drives

    Directory of Open Access Journals (Sweden)

    Michal Malek

    2011-01-01

    Full Text Available Flexible couplings, together with the resonance phenomenon, are present mainly in the field of servo drives, where high accuracy and dynamic requirements are crucial. When the dynamics do not match the mechanical system design, unwanted frequencies in the system are excited. Sometimes we do not have the conditions (whether material or spatial) to design a mechanical system with resonant frequencies too high to be excited. In that case we must choose compensating methods which can eliminate these phenomena. This paper is dedicated to them.
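
    One widely used compensating method (chosen here only as an illustration; the record does not commit to a particular technique) is a digital notch filter placed in the torque or speed command path and tuned to the mechanical resonance frequency:

        import math

        def notch_coefficients(f0, fs, Q=10.0):
            # Biquad notch centred at f0 (Hz) for sampling rate fs (Hz), quality factor Q.
            w0 = 2.0 * math.pi * f0 / fs
            alpha = math.sin(w0) / (2.0 * Q)
            b = [1.0, -2.0 * math.cos(w0), 1.0]
            a = [1.0 + alpha, -2.0 * math.cos(w0), 1.0 - alpha]
            return [v / a[0] for v in b], [v / a[0] for v in a]

        def apply_filter(samples, b, a):
            # Direct-form I difference equation with normalized a[0] = 1.
            y, xp, yp = [], [0.0, 0.0], [0.0, 0.0]
            for xn in samples:
                yn = b[0]*xn + b[1]*xp[0] + b[2]*xp[1] - a[1]*yp[0] - a[2]*yp[1]
                xp, yp = [xn, xp[0]], [yn, yp[0]]
                y.append(yn)
            return y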

  11. Planning of elimination of emergency consequences

    Directory of Open Access Journals (Sweden)

    S. Kovalenko

    2015-05-01

    Full Text Available Introduction. The volume of useful information in the process of planning the elimination of emergency consequences can reasonably be assessed with computational problems and mathematical models. Materials and methods. The expert survey method is used to calculate quantitative probability values and to determine the optimal solution before information on the conditions is received. Results. It is determined that the quality of decisions on eliminating emergency consequences depends primarily on the number of factors that are taken into account in the particular circumstances of the situation; on the level of information readiness of control bodies to take the decision to eliminate emergency consequences as soon as possible; and on considering several options in order to achieve a reasoned and concrete decision. The ratio between the volume of useful information collected and processed during operation planning and the volume required for identifying the optimal solution is calculated. This ratio allows a graph to be constructed of the probability of identifying a solution in the existing environment and of the probability of identifying the optimal solution before information on the P* condition is obtained. This graph also shows the ratio of the volume of useful information collected and processed during operation planning to the volume of information necessary for identifying the optimal solution. Conclusion. The results of this research can be used to improve the decisions of control bodies in order to ensure safe working conditions for employees of the food industry.

  12. Immune Interventions to Eliminate the HIV Reservoir.

    Science.gov (United States)

    Hsu, Denise C; Ananworanich, Jintanat

    2017-10-26

    Inducing HIV remission is a monumental challenge. A potential strategy is the "kick and kill" approach where latently infected cells are first activated to express viral proteins and then eliminated through cytopathic effects of HIV or immune-mediated killing. However, pre-existing immune responses to HIV cannot eradicate HIV infection due to the presence of escape variants, inadequate magnitude, and breadth of responses as well as immune exhaustion. The two major approaches to boost immune-mediated elimination of infected cells include enhancing cytotoxic T lymphocyte mediated killing and harnessing antibodies to eliminate HIV. Specific strategies include increasing the magnitude and breadth of T cell responses through therapeutic vaccinations, reversing the effects of T cell exhaustion using immune checkpoint inhibition, employing bispecific T cell targeting immunomodulatory proteins or dual-affinity re-targeting molecules to direct cytotoxic T lymphocytes to virus-expressing cells and broadly neutralizing antibody infusions. Methods to steer immune responses to tissue sites where latently infected cells are located need to be further explored. Ultimately, strategies to induce HIV remission must be tolerable, safe, and scalable in order to make a global impact.

  13. A Supervised Multiclass Classifier for an Autocoding System

    Directory of Open Access Journals (Sweden)

    Yukako Toko

    2017-11-01

    Full Text Available Classification is often required in various contexts, including in the field of official statistics. In a previous study, we developed a multiclass classifier that can classify short text descriptions with high accuracy. The algorithm borrows the concept of the naïve Bayes classifier and is so simple that its structure is easily understandable. The proposed classifier has the following two advantages. First, the processing times for both learning and classifying are extremely practical. Second, the proposed classifier yields high-accuracy results for a large portion of a dataset. We have previously developed an autocoding system for the Family Income and Expenditure Survey in Japan that has a better-performing classifier. While the original system was developed in Perl in order to improve the efficiency of the coding process for short Japanese texts, the proposed system is implemented in the R programming language in order to explore versatility, and it is modified to make the system easily applicable to English text descriptions, in consideration of the increasing number of R users in the field of official statistics. We are planning to publish the proposed classifier as an R package. The proposed classifier would be generally applicable to other classification tasks, including coding activities in the field of official statistics, and it would contribute greatly to improving their efficiency.
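
    The record's system is implemented in R; the sketch below only illustrates the naïve Bayes idea the classifier borrows, written in Python for brevity (a generic multinomial naïve Bayes coder with Laplace smoothing, not the published algorithm):

        import math
        from collections import defaultdict, Counter

        def train(token_lists, codes):
            word_counts = defaultdict(Counter)       # code -> word frequencies
            code_counts = Counter(codes)
            for tokens, code in zip(token_lists, codes):
                word_counts[code].update(tokens)
            return word_counts, code_counts

        def classify(tokens, word_counts, code_counts, alpha=1.0):
            vocab = {w for counter in word_counts.values() for w in counter}
            total = sum(code_counts.values())
            best, best_lp = None, -math.inf
            for code, n_c in code_counts.items():
                lp = math.log(n_c / total)           # log prior
                denom = sum(word_counts[code].values()) + alpha * len(vocab)
                for w in tokens:
                    lp += math.log((word_counts[code][w] + alpha) / denom)
                if lp > best_lp:
                    best, best_lp = code, lp
            return best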

  14. 18 CFR 3a.12 - Authority to classify official information.

    Science.gov (United States)

    2010-04-01

    ... efficient administration. (b) The authority to classify information or material originally as Top Secret is... classify information or material originally as Secret is exercised only by: (1) Officials who have Top... information or material originally as Confidential is exercised by officials who have Top Secret or Secret...

  15. Using Neural Networks to Classify Digitized Images of Galaxies

    Science.gov (United States)

    Goderya, S. N.; McGuire, P. C.

    2000-12-01

    Automated classification of galaxies into Hubble types is of paramount importance for studying the large-scale structure of the Universe, particularly as survey projects like the Sloan Digital Sky Survey complete their data acquisition of one million galaxies. At present it is not possible to find robust and efficient artificial-intelligence-based galaxy classifiers. In this study we summarize progress made in the development of automated galaxy classifiers using neural networks as machine learning tools. We explore the Bayesian linear algorithm, the higher order probabilistic network, the multilayer perceptron neural network and the Support Vector Machine classifier. The performance of any machine classifier is dependent on the quality of the parameters that characterize the different groups of galaxies. Our effort is to develop geometric and invariant-moment-based parameters as input to the machine classifiers instead of the raw pixel data. Such an approach reduces the dimensionality of the classifier considerably, removes the effects of scaling and rotation, and makes it easier to solve for the unknown parameters in the galaxy classifier. To judge the quality of training and classification we develop the concept of Mathews coefficients for the galaxy classification community. Mathews coefficients are single numbers that quantify classifier performance even with unequal prior probabilities of the classes.
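
    As an illustration of the kind of invariant-moment features described (assuming a grayscale galaxy image held in a 2-D NumPy array; this is a generic computation, not the authors' feature set), the first two Hu moment invariants can be computed as follows:

        import numpy as np

        def hu_moments_12(img):
            y, x = np.mgrid[:img.shape[0], :img.shape[1]]
            m00 = img.sum()
            xc, yc = (x * img).sum() / m00, (y * img).sum() / m00
            def mu(p, q):                      # central moments (translation invariant)
                return ((x - xc) ** p * (y - yc) ** q * img).sum()
            def eta(p, q):                     # scale-normalized central moments
                return mu(p, q) / m00 ** (1 + (p + q) / 2.0)
            h1 = eta(2, 0) + eta(0, 2)         # rotation-invariant combinations
            h2 = (eta(2, 0) - eta(0, 2)) ** 2 + 4 * eta(1, 1) ** 2
            return np.array([h1, h2])          # features fed to a neural network or SVM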

  16. Fisher classifier and its probability of error estimation

    Science.gov (United States)

    Chittineni, C. B.

    1979-01-01

    Computationally efficient expressions are derived for estimating the probability of error using the leave-one-out method. The optimal threshold for the classification of patterns projected onto Fisher's direction is derived. A simple generalization of the Fisher classifier to multiple classes is presented. Computational expressions are developed for estimating the probability of error of the multiclass Fisher classifier.
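
    A brute-force version of the quantities involved (the paper itself derives efficient closed-form leave-one-out expressions; this sketch merely illustrates the two-class Fisher direction, the projected-mean threshold, and a naive leave-one-out error count):

        import numpy as np

        def fisher_direction(X1, X2):
            # w = Sw^{-1} (m1 - m2) for two classes given as (n_i x d) arrays
            m1, m2 = X1.mean(0), X2.mean(0)
            Sw = np.cov(X1.T, bias=True) * len(X1) + np.cov(X2.T, bias=True) * len(X2)
            return np.linalg.solve(Sw, m1 - m2)

        def loo_error(X1, X2):
            X = np.vstack([X1, X2])
            y = np.r_[np.zeros(len(X1)), np.ones(len(X2))]
            errors = 0
            for i in range(len(X)):
                mask = np.arange(len(X)) != i
                A, B = X[mask][y[mask] == 0], X[mask][y[mask] == 1]
                w = fisher_direction(A, B)
                thr = 0.5 * (A.mean(0) @ w + B.mean(0) @ w)   # midpoint of projected means
                pred = 0 if X[i] @ w > thr else 1             # class 0 projects higher
                errors += int(pred != y[i])
            return errors / len(X)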

  17. Performance of classification confidence measures in dynamic classifier systems

    Czech Academy of Sciences Publication Activity Database

    Štefka, D.; Holeňa, Martin

    2013-01-01

    Roč. 23, č. 4 (2013), s. 299-319 ISSN 1210-0552 R&D Projects: GA ČR GA13-17187S Institutional support: RVO:67985807 Keywords : classifier combining * dynamic classifier systems * classification confidence Subject RIV: IN - Informatics, Computer Science Impact factor: 0.412, year: 2013

  18. 32 CFR 2400.30 - Reproduction of classified information.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 6 2010-07-01 2010-07-01 false Reproduction of classified information. 2400.30... SECURITY PROGRAM Safeguarding § 2400.30 Reproduction of classified information. Documents or portions of... the originator or higher authority. Any stated prohibition against reproduction shall be strictly...

  19. Classifying spaces with virtually cyclic stabilizers for linear groups

    DEFF Research Database (Denmark)

    Degrijse, Dieter Dries; Köhl, Ralf; Petrosyan, Nansen

    2015-01-01

    We show that every discrete subgroup of GL(n, ℝ) admits a finite-dimensional classifying space with virtually cyclic stabilizers. Applying our methods to SL(3, ℤ), we obtain a four-dimensional classifying space with virtually cyclic stabilizers and a decomposition of the algebraic K-theory of its...

  20. Dynamic integration of classifiers in the space of principal components

    NARCIS (Netherlands)

    Tsymbal, A.; Pechenizkiy, M.; Puuronen, S.; Patterson, D.W.; Kalinichenko, L.A.; Manthey, R.; Thalheim, B.; Wloka, U.

    2003-01-01

    Recent research has shown the integration of multiple classifiers to be one of the most important directions in machine learning and data mining. It was shown that, for an ensemble to be successful, it should consist of accurate and diverse base classifiers. However, it is also important that the

  1. An ensemble of dissimilarity based classifiers for Mackerel gender determination

    International Nuclear Information System (INIS)

    Blanco, A; Rodriguez, R; Martinez-Maranon, I

    2014-01-01

    Mackerel is an undervalued fish captured by European fishing vessels. One way to add value to this species is to classify specimens according to their sex. Colour measurements were performed on gonads extracted from Mackerel females and males (fresh and defrosted) to obtain differences between the sexes. Several linear and non-linear classifiers, such as Support Vector Machines (SVM), k Nearest Neighbors (k-NN) or Diagonal Linear Discriminant Analysis (DLDA), can be applied to this problem. However, they are usually based on Euclidean distances that fail to reflect the sample proximities accurately. Classifiers based on non-Euclidean dissimilarities misclassify a different set of patterns. We combine different kinds of dissimilarity-based classifiers. The diversity is induced by considering a set of complementary dissimilarities for each model. The experimental results suggest that our algorithm helps to improve classifiers based on a single dissimilarity

  2. An ensemble of dissimilarity based classifiers for Mackerel gender determination

    Science.gov (United States)

    Blanco, A.; Rodriguez, R.; Martinez-Maranon, I.

    2014-03-01

    Mackerel is an undervalued fish captured by European fishing vessels. One way to add value to this species is to classify specimens according to their sex. Colour measurements were performed on gonads extracted from Mackerel females and males (fresh and defrosted) to obtain differences between the sexes. Several linear and non-linear classifiers, such as Support Vector Machines (SVM), k Nearest Neighbors (k-NN) or Diagonal Linear Discriminant Analysis (DLDA), can be applied to this problem. However, they are usually based on Euclidean distances that fail to reflect the sample proximities accurately. Classifiers based on non-Euclidean dissimilarities misclassify a different set of patterns. We combine different kinds of dissimilarity-based classifiers. The diversity is induced by considering a set of complementary dissimilarities for each model. The experimental results suggest that our algorithm helps to improve classifiers based on a single dissimilarity.

  3. Just-in-time classifiers for recurrent concepts.

    Science.gov (United States)

    Alippi, Cesare; Boracchi, Giacomo; Roveri, Manuel

    2013-04-01

    Just-in-time (JIT) classifiers operate in evolving environments by classifying instances and reacting to concept drift. In stationary conditions, a JIT classifier improves its accuracy over time by exploiting additional supervised information coming from the field. In nonstationary conditions, however, the classifier reacts as soon as concept drift is detected; the current classification setup is discarded and a suitable one activated to keep the accuracy high. We present a novel generation of JIT classifiers able to deal with recurrent concept drift by means of a practical formalization of the concept representation and the definition of a set of operators working on such representations. The concept-drift detection activity, which is crucial in promptly reacting to changes exactly when needed, is advanced by considering change-detection tests monitoring both inputs and classes distributions.
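
    The reaction pattern the abstract describes can be reduced to a small monitoring loop (illustration only; the paper's change-detection tests operate on both input and class distributions, whereas this hypothetical monitor watches only a windowed error rate):

        class DriftMonitor:
            def __init__(self, window=100, threshold=0.15):
                self.window, self.threshold = window, threshold
                self.errors = []
                self.baseline = None

            def add(self, is_error):
                # record one classification outcome; return True when drift is suspected
                self.errors.append(float(is_error))
                if len(self.errors) < self.window:
                    return False
                recent = sum(self.errors[-self.window:]) / self.window
                if self.baseline is None:
                    self.baseline = recent
                    return False
                return recent - self.baseline > self.threshold

        # on True, the JIT classifier would discard or re-weight its current knowledge base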

  4. The global cost of eliminating avoidable blindness

    Directory of Open Access Journals (Sweden)

    Kirsten L Armstrong

    2012-01-01

    Full Text Available Aims: To complete an initial estimate of the global cost of eliminating avoidable blindness, including the investment required to build ongoing primary and secondary health care systems, as well as to eliminate the 'backlog' of avoidable blindness. This analysis also seeks to understand and articulate where key data limitations lie. Materials and Methods: Data were collected in line with a global estimation approach, including separate costing frameworks for the primary and secondary care sectors, and the treatment of backlog. Results: The global direct health cost to eliminate avoidable blindness over a 10-year period from 2011 to 2020 is estimated at $632 billion per year (2009 US$). As countries already spend $592 billion per annum on eye health, this represents additional investment of $397.8 billion over 10 years, which is $40 billion per year or $5.80 per person for each year between 2010 and 2020. This is concentrated in high-income nations, which require 68% of the investment but comprise 16% of the world's inhabitants. For all other regions, the additional investment required is $127 billion. Conclusions: This costing estimate has identified that low- and middle-income countries require less than half the additional investment compared with high-income nations. Low- and middle-income countries comprise the greater investment proportion in secondary care whereas high-income countries require the majority of investment into the primary sector. However, there is a need to improve sector data. Investment in better data will have positive flow-on effects for the eye health sector.
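
    The headline figures are internally consistent; a quick check (assuming a world population of roughly 6.9 billion around 2010, an assumption not stated in the record):

        \[
        632 - 592 \approx 40 \ \text{billion US\$ per year},\qquad
        \frac{397.8}{10} \approx 40,\qquad
        \frac{39.8 \times 10^{9}\ \text{US\$}}{6.9 \times 10^{9}\ \text{people}} \approx 5.8\ \text{US\$ per person per year}
        \]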

  5. Redundancy Elimination in DTN via ACK Mechanism

    Directory of Open Access Journals (Sweden)

    Xiqing Zhang

    2015-08-01

    Full Text Available Traditional routing protocols for delay tolerant networks (DTN) usually take the strategy of spreading multiple copies of one message across the network. When one copy reaches the destination, the transmission of the other copies not only wastes bandwidth but also deprives other messages of opportunities for transmission. This paper proposes a mechanism to eliminate the redundant copies. By adding an acknowledgement field to the packet header to delete redundant copies, it can reduce the network overhead while improving the delivery ratio. Simulation results confirm that the proposed method can improve the performance of the epidemic and Spray and Wait routing protocols.
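
    The acknowledgement idea can be sketched as follows (a simplified, hypothetical node model; the paper's exact header format and purge rules are not reproduced here): once a message is known to be delivered, its identifier propagates between nodes on contact and every stored copy of it is dropped.

        class DTNNode:
            def __init__(self):
                self.buffer = {}     # msg_id -> message payload
                self.acked = set()   # ids of messages known to be delivered

            def on_contact(self, other):
                # exchange acknowledgement sets, then purge acknowledged copies
                self.acked |= other.acked
                other.acked |= self.acked
                for node in (self, other):
                    for mid in list(node.buffer):
                        if mid in node.acked:
                            del node.buffer[mid]
                # (epidemic exchange of the remaining, unacknowledged messages would follow)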

  6. Achieving universal access and moving towards elimination of new HIV infections in Cambodia

    Science.gov (United States)

    Vun, Mean Chhi; Fujita, Masami; Rathavy, Tung; Eang, Mao Tang; Sopheap, Seng; Sovannarith, Samreth; Chhorvann, Chhea; Vanthy, Ly; Sopheap, Oum; Welle, Emily; Ferradini, Laurent; Sedtha, Chin; Bunna, Sok; Verbruggen, Robert

    2014-01-01

    Introduction In the mid-1990s, Cambodia faced one of the fastest growing HIV epidemics in Asia. For its achievement in reversing this trend, and achieving universal access to HIV treatment, the country received a United Nations millennium development goal award in 2010. This article reviews Cambodia’s response to HIV over the past two decades and discusses its current efforts towards elimination of new HIV infections. Methods A literature review of published and unpublished documents, including programme data and presentations, was conducted. Results and discussion Cambodia classifies its response to one of the most serious HIV epidemics in Asia into three phases. In Phase I (1991–2000), when adult HIV prevalence peaked at 1.7% and incidence exceeded 20,000 cases, a nationwide HIV prevention programme targeted brothel-based sex work. Voluntary confidential counselling and testing and home-based care were introduced, and peer support groups of people living with HIV emerged. Phase II (2001–2011) observed a steady decline in adult prevalence to 0.8% and incidence to 1600 cases by 2011, and was characterized by: expanding antiretroviral treatment (coverage reaching more than 80%) and continuum of care; linking with tuberculosis and maternal and child health services; accelerated prevention among key populations, including entertainment establishment-based sex workers, men having sex with men, transgender persons, and people who inject drugs; engagement of health workers to deliver quality services; and strengthening health service delivery systems. The third phase (2012–2020) aims to attain zero new infections by 2020 through: sharpening responses to key populations at higher risk; maximizing access to community and facility-based testing and retention in prevention and care; and accelerating the transition from vertical approaches to linked/integrated approaches. Conclusions Cambodia has tailored its prevention strategy to its own epidemic, established

  7. Reactor feedwater facility

    Energy Technology Data Exchange (ETDEWEB)

    Fujii, Tadashi; Kinoshita, Shoichiro; Akatsu, Jun-ichi

    1996-04-30

    In a reactor feedwater facility in which one stand-by system and at least three ordinary systems are disposed in parallel, each feedwater pump is driven by an electric motor and has substantially the same capacity. At least two of the ordinary systems have a means for varying pump rotation speed. Since the capacity of the feedwater pump in each system is substantially equal, the pumps can be standardized, which facilitates production. Although the number of electric motors increases, motor drive eliminates the turbines, steam pipelines and valves otherwise needed to drive the feedwater pumps. The feedwater pumps can therefore be placed in a region of low radiation dose, separated from the main turbine and main condenser, which improves the freedom of installation. In addition, accessibility to the equipment during operation is improved, easing maintenance of the feedwater facilities. The number of equipment parts is reduced compared with a turbine-driven system, thereby reducing the work required for maintenance and inspection. (N.H.)

  8. Apparatus for eliminating electrodeposition of radioactive nuclide

    International Nuclear Information System (INIS)

    Inomata, Ichiro; Ishibe, Tadao; Matsunaga, Masaaki; Konuki, Ryoichi; Suzuki, Kazunori; Watanabe, Minoru; Tomoshige, Shozo; Kondo, Kozo.

    1990-01-01

    In a conventional device for eliminating radioactive nuclides by electrodeposition, a liquid containing radioactive nuclides is electrolyzed in the presence of non-radioactive heavy metals, and the radioactive nuclides are removed by electrodepositing them together with the heavy metals. In this device, two anode plates are opposed in an electrolysis vessel. A plurality (4 to 6) of cathode plates are arranged between the anodes in parallel with them, and the cathode surfaces opposed to the anodes are insulated. Further, such a plurality of cathode plates are grouped into respective units. Alternatively, the anode plates are made of platinum-plated titanium and the cathode plates of stainless steel. In the electrodeposition eliminating device thus constituted, since the cathode surfaces directed toward the anodes at both ends are insulated, all of the electric current from the anodes reaches the central cathodes after flowing around the cathodes at both ends. As a result, there is no substantial difference in the path length of the electrolyzing current to each cathode, nor in the amount of electrodeposition. The electrodeposited products adhere uniformly and densely to the electrodes and, simultaneously, Co-60 and Mn-54, etc. are also electrodeposited. (I.S.)

  9. Processes and problems of ammonia elimination

    Energy Technology Data Exchange (ETDEWEB)

    Tippmer, K

    1974-01-01

    In many cases a conversion of ammonia in coke oven gases to ammonium sulfate (fertilizer) is not useful. It must then be eliminated by oxidation to nitrogen and water or catalytically to N2 and hydrogen. Several processes are available for this which are combined with the simultaneous removal of hydrogen sulfide. The absorption of NH3 with NH3 incineration with and without heat utilization, the NH3 absorption with catalytic cracking of NH3, H2S and NH3 scrubbing with NH3 incineration and production of sulfuric acid (78 or 96 percent), as well as H2S and NH3 scrubbing with catalytic cracking of NH3 and production of pure sulfur are discussed in great detail. A cost comparison of these methods is provided. Lowest investments are required for an NH3 scrubbing process with elimination of NH3 but without desulfurization. Expenditures for an NH3 scrubber with desulfurization of the coke oven gas to about 1.5 g H2S/cu m and NH3 incineration with production of 78 percent H2SO4 are lower than those for the production of 96 percent H2SO4. For the latter there is more demand, however. Desulfurization to about 0.7 g H2S/cu m is only slightly more expensive. The process producing sulfur in combination with an H2S oxidation method requires somewhat lower investment costs.

  10. Elimination of frequency noise from groundwater measurements

    International Nuclear Information System (INIS)

    Chien, Y.M.; Bryce, R.W.; Strait, S.R.; Yeatman, R.A.

    1986-04-01

    Groundwater response to atmospheric fluctuation can be effectively removed from downhole-pressure records using the systematic approach. The technique is not as successful for removal of earth tides, due to a probable discrepancy between the actual earth tide and the theoretical earth tide. The advantage of the systematic technique is that a causative relationship is established for each component of the pressure response removed. This concept of data reduction is easily understood and well accepted. The disadvantage is that a record of the stress causing the pressure fluctuation must be obtained. This may be done by monitoring or synthesizing the stress. Frequency analysis offers a simpler way to eliminate the undesirable hydrologic fluctuations from the downhole pressure. Frequency analysis may prove to be impractical if the fluctuations being removed have broadband characteristics. A combination of the two techniques, such as eliminating the atmospheric effect with the systematic method and the earth-tide fluctuations with the frequency method, is the most effective and efficient approach
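
    As an illustration of the frequency-analysis approach mentioned above, the sketch below removes a narrow tidal band from a synthetic pressure record with an FFT notch. The sampling rate, the semidiurnal frequency and the notch width are assumed values chosen for the example, not parameters from the report.

```python
# Sketch of frequency-domain removal of a narrow-band tidal component from a
# synthetic downhole-pressure record.  All numbers are illustrative assumptions.
import numpy as np

fs = 24.0                                        # samples per day (hourly data)
t = np.arange(0, 30, 1.0 / fs)                   # 30 days of record
tide = 0.5 * np.sin(2 * np.pi * 1.9323 * t)      # semidiurnal earth-tide-like term
trend = 0.01 * t                                 # slow hydrologic trend to keep
pressure = trend + tide + 0.02 * np.random.randn(t.size)

spectrum = np.fft.rfft(pressure)
freqs = np.fft.rfftfreq(pressure.size, d=1.0 / fs)   # cycles per day

# Zero out a narrow band around the tidal frequency.
notch = (freqs > 1.8) & (freqs < 2.1)
spectrum[notch] = 0.0
cleaned = np.fft.irfft(spectrum, n=pressure.size)

print("rms about trend, before:", np.std(pressure - trend))
print("rms about trend, after :", np.std(cleaned - trend))
```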

  11. Herbivory eliminates fitness costs of mutualism exploiters.

    Science.gov (United States)

    Simonsen, Anna K; Stinchcombe, John R

    2014-04-01

    A common empirical observation in mutualistic interactions is the persistence of variation in partner quality and, in particular, the persistence of exploitative phenotypes. For mutualisms between hosts and symbionts, most mutualism theory assumes that exploiters always impose fitness costs on their host. We exposed legume hosts to mutualistic (nitrogen-fixing) and exploitative (non-nitrogen-fixing) symbiotic rhizobia in field conditions, and manipulated the presence or absence of insect herbivory to determine if the costly fitness effects of exploitative rhizobia are context-dependent. Exploitative rhizobia predictably reduced host fitness when herbivores were excluded. However, insects caused greater damage on hosts associating with mutualistic rhizobia, as a consequence of feeding preferences related to leaf nitrogen content, resulting in the elimination of fitness costs imposed on hosts by exploitative rhizobia. Our experiment shows that herbivory is potentially an important factor in influencing the evolutionary dynamic between legumes and rhizobia. Partner choice and host sanctioning are theoretically predicted to stabilize mutualisms by reducing the frequency of exploitative symbionts. We argue that herbivore pressure may actually weaken selection on choice and sanction mechanisms, thus providing one explanation of why host-based discrimination mechanisms may not be completely effective in eliminating nonbeneficial partners. © 2014 The Authors. New Phytologist © 2014 New Phytologist Trust.

  12. Class-specific Error Bounds for Ensemble Classifiers

    Energy Technology Data Exchange (ETDEWEB)

    Prenger, R; Lemmond, T; Varshney, K; Chen, B; Hanley, W

    2009-10-06

    The generalization error, or probability of misclassification, of ensemble classifiers has been shown to be bounded above by a function of the mean correlation between the constituent (i.e., base) classifiers and their average strength. This bound suggests that increasing the strength and/or decreasing the correlation of an ensemble's base classifiers may yield improved performance under the assumption of equal error costs. However, this and other existing bounds do not directly address application spaces in which error costs are inherently unequal. For applications involving binary classification, Receiver Operating Characteristic (ROC) curves, performance curves that explicitly trade off false alarms and missed detections, are often utilized to support decision making. To address performance optimization in this context, we have developed a lower bound for the entire ROC curve that can be expressed in terms of the class-specific strength and correlation of the base classifiers. We present empirical analyses demonstrating the efficacy of these bounds in predicting relative classifier performance. In addition, we specify performance regions of the ROC curve that are naturally delineated by the class-specific strengths of the base classifiers and show that each of these regions can be associated with a unique set of guidelines for performance optimization of binary classifiers within unequal error cost regimes.

  13. Frog sound identification using extended k-nearest neighbor classifier

    Science.gov (United States)

    Mukahar, Nordiana; Affendi Rosdi, Bakhtiar; Athiar Ramli, Dzati; Jaafar, Haryati

    2017-09-01

    Frog sound identification based on vocalization is important for biological research and environmental monitoring. As a result, different types of feature extraction and classifiers have been employed to evaluate the accuracy of frog sound identification. This paper presents frog sound identification with an Extended k-Nearest Neighbor (EKNN) classifier. The EKNN classifier integrates the nearest-neighbor and mutual-neighborhood concepts, with the aim of improving classification performance. It makes a prediction based on which training samples are the nearest neighbors of the testing sample and which training samples consider the testing sample as their nearest neighbor. In order to evaluate classification performance in frog sound identification, the EKNN classifier is compared with competing classifiers, k-Nearest Neighbor (KNN), Fuzzy k-Nearest Neighbor (FKNN), k-General Nearest Neighbor (KGNN) and Mutual k-Nearest Neighbor (MKNN), on recorded sounds of 15 frog species obtained in Malaysian forests. The recorded sounds were segmented using Short Time Energy and Short Time Average Zero Crossing Rate (STE+STAZCR), sinusoidal modeling (SM), manual segmentation, and the combination of Energy (E) and Zero Crossing Rate (ZCR) (E+ZCR), while the features are extracted by Mel Frequency Cepstrum Coefficients (MFCC). The experimental results show that the EKNN classifier exhibits the best performance in terms of accuracy compared to the competing classifiers, KNN, FKNN, KGNN and MKNN, for all cases.
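
    A rough sketch of the mutual-neighborhood idea behind EKNN is given below: the predicted label is voted both by the test sample's k nearest training points and by the training points that would count the test sample among their own k nearest neighbors. This is a simplified illustration, not the exact EKNN formulation or the MFCC feature pipeline used in the paper.

```python
# Simplified sketch of an "extended" k-NN decision using mutual neighborhoods.
import numpy as np
from collections import Counter

def eknn_predict(X_train, y_train, x_test, k=3):
    d_test = np.linalg.norm(X_train - x_test, axis=1)
    forward = set(np.argsort(d_test)[:k])          # neighbors of the test sample

    backward = set()
    for i, x in enumerate(X_train):
        # distances from training sample i to the other training samples + test point
        cand = np.vstack([np.delete(X_train, i, axis=0), x_test])
        d = np.linalg.norm(cand - x, axis=1)
        rank_of_test = np.argsort(d).tolist().index(len(cand) - 1)
        if rank_of_test < k:
            backward.add(i)                        # i considers the test point a neighbor

    votes = Counter(y_train[j] for j in forward | backward)
    return votes.most_common(1)[0][0]

X = np.array([[0, 0], [0, 1], [1, 0], [5, 5], [5, 6], [6, 5]], dtype=float)
y = np.array(["A", "A", "A", "B", "B", "B"])
print(eknn_predict(X, y, np.array([0.5, 0.5])))    # expected "A"
```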

  14. Support facilities

    International Nuclear Information System (INIS)

    Williamson, F.S.; Blomquist, J.A.; Fox, C.A.

    1977-01-01

    Computer support is centered on the Remote Access Data Station (RADS), which is equipped with a 1000 lpm printer, 1000 cpm reader, and a 300 cps paper tape reader with 500-foot spools. The RADS is located in a data preparation room with four 029 key punches (two of which interpret), a storage vault for archival magnetic tapes, card files, and a 30 cps interactive terminal principally used for job inquiry and routing. An adjacent room provides work space for users, with a documentation library and a consultant's office, plus file storage for programs and their documentations. The facility has approximately 2,600 square feet of working laboratory space, and includes two fully equipped photographic darkrooms, sectioning and autoradiographic facilities, six microscope cubicles, and five transmission electron microscopes and one Cambridge scanning electron microscope equipped with an x-ray energy dispersive analytical system. Ancillary specimen preparative equipment includes vacuum evaporators, freeze-drying and freeze-etching equipment, ultramicrotomes, and assorted photographic and light microscopic equipment. The extensive physical plant of the animal facilities includes provisions for holding all species of laboratory animals under controlled conditions of temperature, humidity, and lighting. More than forty rooms are available for studies of the smaller species. These have a potential capacity of more than 75,000 mice, or smaller numbers of larger species and those requiring special housing arrangements. There are also six dog kennels to accommodate approximately 750 dogs housed in runs that consist of heated indoor compartments and outdoor exercise areas

  15. Ship localization in Santa Barbara Channel using machine learning classifiers.

    Science.gov (United States)

    Niu, Haiqiang; Ozanich, Emma; Gerstoft, Peter

    2017-11-01

    Machine learning classifiers are shown to outperform conventional matched field processing for a deep water (600 m depth) ocean acoustic-based ship range estimation problem in the Santa Barbara Channel Experiment when limited environmental information is known. Recordings of three different ships of opportunity on a vertical array were used as training and test data for the feed-forward neural network and support vector machine classifiers, demonstrating the feasibility of machine learning methods to locate unseen sources. The classifiers perform well up to 10 km range whereas the conventional matched field processing fails at about 4 km range without accurate environmental information.

  16. Cut Elimination, Identity Elimination, and Interpolation in Super-Belnap Logics

    Czech Academy of Sciences Publication Activity Database

    Přenosil, Adam

    2017-01-01

    Roč. 105, č. 6 (2017), s. 1255-1289 ISSN 0039-3215 R&D Projects: GA ČR GBP202/12/G061 EU Projects: European Commission (XE) 689176 - SYSMICS Institutional support: RVO:67985807 Keywords: Super-Belnap logics * Dunn–Belnap logic * Logic of Paradox * Strong Kleene logic * Exactly True Logic * Gentzen calculus * Cut elimination * Identity elimination * Interpolation Subject RIV: BA - General Mathematics OBOR OECD: Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8) Impact factor: 0.589, year: 2016

  17. Classifying hot water chemistry: Application of MULTIVARIATE STATISTICS

    OpenAIRE

    Sumintadireja, Prihadi; Irawan, Dasapta Erwin; Rezky, Yuanno; Gio, Prana Ugiana; Agustin, Anggita

    2016-01-01

    This file is the dataset for the following paper "Classifying hot water chemistry: Application of MULTIVARIATE STATISTICS". Authors: Prihadi Sumintadireja, Dasapta Erwin Irawan, Yuanno Rezky, Prana Ugiana Gio, Anggita Agustin

  18. Robust Combining of Disparate Classifiers Through Order Statistics

    Science.gov (United States)

    Tumer, Kagan; Ghosh, Joydeep

    2001-01-01

    Integrating the outputs of multiple classifiers via combiners or meta-learners has led to substantial improvements in several difficult pattern recognition problems. In this article we investigate a family of combiners based on order statistics, for robust handling of situations where there are large discrepancies in performance of individual classifiers. Based on a mathematical modeling of how the decision boundaries are affected by order statistic combiners, we derive expressions for the reductions in error expected when simple output combination methods based on the median, the maximum and, in general, the ith order statistic, are used. Furthermore, we analyze the trim and spread combiners, both based on linear combinations of the ordered classifier outputs, and show that in the presence of uneven classifier performance, they often provide substantial gains over both linear and simple order statistics combiners. Experimental results on both real world data and standard public domain data sets corroborate these findings.
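
    The combiners discussed above reduce, for each class, the ordered vector of base-classifier outputs to a single score. The numpy sketch below shows the median, maximum, i-th order statistic and trimmed-mean combiners on a made-up posterior matrix; the numbers are purely illustrative.

```python
# Sketch of order-statistic combiners applied to base-classifier posteriors.
import numpy as np

# rows = individual classifiers, columns = classes; one test example
posteriors = np.array([
    [0.70, 0.20, 0.10],
    [0.55, 0.35, 0.10],
    [0.10, 0.60, 0.30],   # one poorly performing classifier
    [0.65, 0.25, 0.10],
])

median_comb = np.median(posteriors, axis=0)
max_comb    = np.max(posteriors, axis=0)
second_os   = np.sort(posteriors, axis=0)[-2]                      # 2nd largest per class
trimmed     = np.mean(np.sort(posteriors, axis=0)[1:-1], axis=0)   # drop extremes

for name, comb in [("median", median_comb), ("max", max_comb),
                   ("2nd order stat", second_os), ("trimmed mean", trimmed)]:
    print(f"{name:15s} -> class {np.argmax(comb)}  {np.round(comb, 3)}")
```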

  19. Using Statistical Process Control Methods to Classify Pilot Mental Workloads

    National Research Council Canada - National Science Library

    Kudo, Terence

    2001-01-01

    .... These include cardiac, ocular, respiratory, and brain activity measures. The focus of this effort is to apply statistical process control methodology on different psychophysiological features in an attempt to classify pilot mental workload...

  20. An ensemble classifier to predict track geometry degradation

    International Nuclear Information System (INIS)

    Cárdenas-Gallo, Iván; Sarmiento, Carlos A.; Morales, Gilberto A.; Bolivar, Manuel A.; Akhavan-Tabatabaei, Raha

    2017-01-01

    Railway operations are inherently complex and a source of several problems. In particular, track geometry defects are one of the leading causes of train accidents in the United States. This paper presents a solution approach which entails the construction of an ensemble classifier to forecast the degradation of track geometry. Our classifier is constructed by solving the problem from three different perspectives: deterioration, regression and classification. We considered a different model from each perspective and our results show that using an ensemble method improves the predictive performance. - Highlights: • We present an ensemble classifier to forecast the degradation of track geometry. • Our classifier considers three perspectives: deterioration, regression and classification. • We construct and test three models and our results show that using an ensemble method improves the predictive performance.

  1. A novel statistical method for classifying habitat generalists and specialists

    DEFF Research Database (Denmark)

    Chazdon, Robin L; Chao, Anne; Colwell, Robert K

    2011-01-01

    ... (1) generalist; (2) habitat A specialist; (3) habitat B specialist; and (4) too rare to classify with confidence. We illustrate our multinomial classification method using two contrasting data sets: (1) bird abundance in woodland and heath habitats in southeastern Australia and (2) tree abundance in second-growth (SG) and old-growth (OG) rain forests in the Caribbean lowlands of northeastern Costa Rica. We evaluate the multinomial model in detail for the tree data set. Our results for birds were highly concordant with a previous nonstatistical classification, but our method classified a higher fraction (57.7%) of bird species with statistical confidence. Based on a conservative specialization threshold and adjustment for multiple comparisons, 64.4% of tree species in the full sample were too rare to classify with confidence. Among the species classified, OG specialists constituted the largest...

  2. 6 CFR 7.23 - Emergency release of classified information.

    Science.gov (United States)

    2010-01-01

    ... Classified Information Non-disclosure Form. In emergency situations requiring immediate verbal release of... information through approved communication channels by the most secure and expeditious method possible, or by...

  3. Emergency reactor core cooling facility

    International Nuclear Information System (INIS)

    Yoshikawa, Kazuhiro; Kinoshita, Shoichiro; Iwata, Yasutaka.

    1996-01-01

    The present invention provides an emergency reactor core cooling device for a BWR type nuclear power plant. Namely, D/S pit (gas/water separator storage pool) water is used as a water source for the emergency reactor core cooling facility upon occurrence of loss of coolant accidents (LOCA) by introducing the D/S pit water to the emergency reactor core cooling (ECCS) pump. As a result, the function as the ECCS facility can be eliminated from the function of the condensate storage tank which has been used as the ECCS facility. If the function is unnecessary, the level of quality control and that of earthquake resistance of the condensate storage tank can be lowered to a level of ordinary facilities to provide an effect of reducing the cost. On the other hand, since the D/S pit as the alternative water source is usually a facility at high quality control level and earthquake resistant level, there is no problem. The quality of the water in the D/S pit can be maintained constant by elevating pressure of the D/S pit water by a suppression pool cleanup (SPCU) pump to pass it through a filtration desalter thereby purifying the D/S pit water during the plant operation. (I.S.)

  4. Emergency reactor core cooling facility

    Energy Technology Data Exchange (ETDEWEB)

    Yoshikawa, Kazuhiro; Kinoshita, Shoichiro; Iwata, Yasutaka

    1996-11-01

    The present invention provides an emergency reactor core cooling device for a BWR type nuclear power plant. Namely, D/S pit (gas/water separator storage pool) water is used as a water source for the emergency reactor core cooling facility upon occurrence of loss of coolant accidents (LOCA) by introducing the D/S pit water to the emergency reactor core cooling (ECCS) pump. As a result, the function as the ECCS facility can be eliminated from the function of the condensate storage tank which has been used as the ECCS facility. If the function is unnecessary, the level of quality control and that of earthquake resistance of the condensate storage tank can be lowered to a level of ordinary facilities to provide an effect of reducing the cost. On the other hand, since the D/S pit as the alternative water source is usually a facility at high quality control level and earthquake resistant level, there is no problem. The quality of the water in the D/S pit can be maintained constant by elevating pressure of the D/S pit water by a suppression pool cleanup (SPCU) pump to pass it through a filtration desalter thereby purifying the D/S pit water during the plant operation. (I.S.)

  5. For a convention for nuclear weapon elimination

    International Nuclear Information System (INIS)

    2008-03-01

    This document contains two texts linked with the project of an international convention for the elimination of nuclear weapons (the text of this project has been sent to the UN General Secretary and is part of an international campaign to abolish nuclear weapons, ICAN). These two texts are contributions presented in London at the Global Summit for a Nuclear Weapon-free World. The first one calls into question the deterrence principle and the idea of a nuclear weapon-based security. It calls for different forms of action to promote a nuclear weapon-free world. The second text stresses the role and the responsibility of states with nuclear weapons in nuclear disarmament and in the reinforcement of the nuclear non proliferation treaty (NPT)

  6. Lean for Government: Eliminating the Seven Wastes

    Science.gov (United States)

    Shepherd, Christena C.

    2012-01-01

    With shrinking budgets and a slow economy, it is becoming increasingly important for all government agencies to become more efficient. Citizens expect and deserve efficient and effective services from federal, state and local government agencies. One of the best methods to improve efficiency and eliminate waste is to institute the business process improvement methodologies known collectively as Lean; however, with reduced budgets, it may not be possible to train everyone in Lean or to engage the services of a trained consultant. It is possible, however, to raise awareness of the "Seven Wastes" of Lean in each employee, and encourage them to identify areas for improvement. Management commitment is vital to the success of these initiatives, and it is also important to develop the right metrics that will track the success of these changes.

  7. Adaptive elimination of synchronization in coupled oscillator

    Science.gov (United States)

    Zhou, Shijie; Ji, Peng; Zhou, Qing; Feng, Jianfeng; Kurths, Jürgen; Lin, Wei

    2017-08-01

    We present here an adaptive control scheme with a feedback delay to achieve elimination of synchronization in a large population of coupled and synchronized oscillators. We validate the feasibility of this scheme not only in the coupled Kuramoto’s oscillators with a unimodal or bimodal distribution of natural frequency, but also in two representative models of neuronal networks, namely, the FitzHugh-Nagumo spiking oscillators and the Hindmarsh-Rose bursting oscillators. More significantly, we analytically illustrate the feasibility of the proposed scheme with a feedback delay and reveal how the exact topological form of the bimodal natural frequency distribution influences the scheme performance. We anticipate that our developed scheme will deepen the understanding and refinement of those controllers, e.g. techniques of deep brain stimulation, which have been implemented in remedying some synchronization-induced mental disorders including Parkinson disease and epilepsy.
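
    To make the control idea concrete, the sketch below simulates Kuramoto oscillators with a delayed mean-field feedback term intended to counteract synchronization. It is a simplified, non-adaptive illustration of delayed feedback control; the coupling strength, gain, delay and frequency distribution are assumed for the example and do not reproduce the authors' adaptive scheme.

```python
# Simplified illustration: Kuramoto oscillators with delayed mean-field feedback.
import numpy as np

N, K, gain, dt, steps = 200, 1.5, 3.0, 0.01, 20000
delay = int(1.0 / dt)                              # 1.0 time-unit feedback delay
rng = np.random.default_rng(0)
omega = rng.normal(0.0, 0.5, N)                    # natural frequencies
theta = rng.uniform(0, 2 * np.pi, N)

def order_parameter(th):
    return np.mean(np.exp(1j * th))

history = [order_parameter(theta)] * delay         # ring buffer of past mean fields

for step in range(steps):
    z = order_parameter(theta)                     # current mean field
    z_delayed = history[step % delay]
    # Kuramoto coupling minus feedback proportional to the delayed mean field
    drive = (K * np.imag(z * np.exp(-1j * theta))
             - gain * np.imag(z_delayed * np.exp(-1j * theta)))
    theta = theta + dt * (omega + drive)
    history[step % delay] = z
    if step in (steps // 2, steps - 1):
        print(f"step {step:6d}: |r| = {abs(order_parameter(theta)):.3f}")
```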

  8. Gaussian elimination is not optimal, revisited

    DEFF Research Database (Denmark)

    Macedo, Hugo Daniel

    2016-01-01

    We refactor the universal law for the tensor product to express matrix multiplication as the product MN of two matrices M and N, thus making it possible to use such a matrix product to encode and transform algorithms performing matrix multiplication using techniques from linear algebra. We explore such possibility and show two stepwise refinements transforming the composition MN into the Naïve and Strassen's matrix multiplication algorithms. The inspection of the stepwise transformation of the composition of matrices MN into the Naïve matrix multiplication algorithm evidences that the steps ... the end results are equations involving matrix products; our exposition builds upon previous works on the category of matrices (and the related category of finite vector spaces), which we extend by showing why the direct sum (⊕,0) monoid is not closed and a biproduct encoding of Gaussian elimination ...
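
    For readers who want the computational content of the two refinements, the sketch below contrasts the naïve triple-loop product with one level of Strassen's seven-product recursion on 2x2 blocks. It is plain numpy, not the categorical derivation developed in the paper, and assumes even matrix dimensions.

```python
# Naive matrix product vs. one level of Strassen's algorithm on 2x2 blocks.
import numpy as np

def naive_mult(M, N):
    p, q = M.shape
    _, r = N.shape
    C = np.zeros((p, r))
    for i in range(p):
        for j in range(r):
            for k in range(q):
                C[i, j] += M[i, k] * N[k, j]
    return C

def strassen_one_level(M, N):
    n = M.shape[0] // 2
    A11, A12, A21, A22 = M[:n, :n], M[:n, n:], M[n:, :n], M[n:, n:]
    B11, B12, B21, B22 = N[:n, :n], N[:n, n:], N[n:, :n], N[n:, n:]
    M1 = (A11 + A22) @ (B11 + B22)      # the seven Strassen products
    M2 = (A21 + A22) @ B11
    M3 = A11 @ (B12 - B22)
    M4 = A22 @ (B21 - B11)
    M5 = (A11 + A12) @ B22
    M6 = (A21 - A11) @ (B11 + B12)
    M7 = (A12 - A22) @ (B21 + B22)
    C11 = M1 + M4 - M5 + M7
    C12 = M3 + M5
    C21 = M2 + M4
    C22 = M1 - M2 + M3 + M6
    return np.block([[C11, C12], [C21, C22]])

A = np.random.rand(4, 4)
B = np.random.rand(4, 4)
print(np.allclose(naive_mult(A, B), strassen_one_level(A, B)))   # True
```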

  9. RESULTS of the "ELIMINATING NOISE" campaign

    CERN Multimedia

    SC Unit

    2008-01-01

    From 4 to 6 August, CERN’s nurses conducted a screening campaign entitled "Eliminating noise". This campaign was especially aimed at young people exposed to noise during their leisure hours (playing in a band, listening to MP3 players, attending concerts, etc.). In all, 166 people attended the Infirmary, where they were able to receive personalised advice, documentation and, above all, a hearing test (audiogram). While the high attendance of people in the younger age category (18-30) was a success, their audiogram data were a cause for concern, with 24.5% showing abnormal results, hearing deficiencies which, we should remind you, are irreversible. It should be noted that such conditions are almost exclusively caused by noise exposure in a non-professional environment (leisure activities, music, etc.). This latest campaign confirms the harmful effects of noise on people’s hearing due to the absence or insufficiency of protective equipment during music-related activities; this further unde...

  10. RESULTS of the "ELIMINATING NOISE" campaign

    CERN Multimedia

    SC Unit

    2008-01-01

    From 4 to 6 August, CERN’s nurses conducted a screening campaign entitled "Eliminating noise". This campaign was especially aimed at young people exposed to noise during their leisure hours (playing in a band, listening to MP3 players, attending concerts, etc.). In all, 166 people attended the infirmary, where they were able to receive personalised advice, documentation and, above all, a hearing test (audiogram). While the high attendance of people in the younger age category (18-30) was a success, their audiogram data were a cause for concern, with 24.5% showing abnormal results, hearing deficiencies which, we should remind you, are irreversible. It should be noted that such conditions are almost exclusively caused by noise exposure in a non-professional environment (leisure activities, music, etc.). This latest campaign confirms the harmful effects of noise on people’s hearing due to the absence or insufficiency of protective equipment during music-related activities; this further unde...

  11. Adaptive elimination of synchronization in coupled oscillator

    International Nuclear Information System (INIS)

    Zhou, Shijie; Lin, Wei; Ji, Peng; Feng, Jianfeng; Zhou, Qing; Kurths, Jürgen

    2017-01-01

    We present here an adaptive control scheme with a feedback delay to achieve elimination of synchronization in a large population of coupled and synchronized oscillators. We validate the feasibility of this scheme not only in the coupled Kuramoto’s oscillators with a unimodal or bimodal distribution of natural frequency, but also in two representative models of neuronal networks, namely, the FitzHugh–Nagumo spiking oscillators and the Hindmarsh–Rose bursting oscillators. More significantly, we analytically illustrate the feasibility of the proposed scheme with a feedback delay and reveal how the exact topological form of the bimodal natural frequency distribution influences the scheme performance. We anticipate that our developed scheme will deepen the understanding and refinement of those controllers, e.g. techniques of deep brain stimulation, which have been implemented in remedying some synchronization-induced mental disorders including Parkinson disease and epilepsy. (paper)

  12. DECISION TREE CLASSIFIERS FOR STAR/GALAXY SEPARATION

    International Nuclear Information System (INIS)

    Vasconcellos, E. C.; Ruiz, R. S. R.; De Carvalho, R. R.; Capelato, H. V.; Gal, R. R.; LaBarbera, F. L.; Frago Campos Velho, H.; Trevisan, M.

    2011-01-01

    We study the star/galaxy classification efficiency of 13 different decision tree algorithms applied to photometric objects in the Sloan Digital Sky Survey Data Release Seven (SDSS-DR7). Each algorithm is defined by a set of parameters which, when varied, produce different final classification trees. We extensively explore the parameter space of each algorithm, using the set of 884,126 SDSS objects with spectroscopic data as the training set. The efficiency of star-galaxy separation is measured using the completeness function. We find that the Functional Tree algorithm (FT) yields the best results as measured by the mean completeness in two magnitude intervals: 14 ≤ r ≤ 21 (85.2%) and r ≥ 19 (82.1%). We compare the performance of the tree generated with the optimal FT configuration to the classifications provided by the SDSS parametric classifier, 2DPHOT, and Ball et al. We find that our FT classifier is comparable to or better in completeness over the full magnitude range 15 ≤ r ≤ 21, with much lower contamination than all but the Ball et al. classifier. At the faintest magnitudes (r > 19), our classifier is the only one that maintains high completeness (>80%) while simultaneously achieving low contamination (∼2.5%). We also examine the SDSS parametric classifier (psfMag - modelMag) to see if the dividing line between stars and galaxies can be adjusted to improve the classifier. We find that currently stars in close pairs are often misclassified as galaxies, and suggest a new cut to improve the classifier. Finally, we apply our FT classifier to separate stars from galaxies in the full set of 69,545,326 SDSS photometric objects in the magnitude range 14 ≤ r ≤ 21.
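
    A toy version of a tree-based star/galaxy separator is sketched below using a single concentration-like feature (psfMag - modelMag) and synthetic photometry; the real study trains 13 tree algorithms on 884,126 spectroscopically labelled SDSS-DR7 objects, so the data, tree depth and score here are illustrative only.

```python
# Toy decision-tree star/galaxy separator on a synthetic psfMag - modelMag feature.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 5000
# stars: point sources, psfMag - modelMag near 0; galaxies: extended, larger values
stars = rng.normal(0.03, 0.05, n)
galaxies = rng.normal(0.40, 0.20, n)
X = np.concatenate([stars, galaxies]).reshape(-1, 1)
y = np.concatenate([np.zeros(n), np.ones(n)])        # 0 = star, 1 = galaxy

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = DecisionTreeClassifier(max_depth=3).fit(X_tr, y_tr)
print("accuracy on held-out set:", clf.score(X_te, y_te))
```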

  13. Data management facility for JT-60

    International Nuclear Information System (INIS)

    Ohasa, K.; Kurimoto, K.; Mochizuki, O.

    1983-01-01

    This paper describes the Data Management Facility, which provides unified management of the various diagnostic data from JT-60 experiments. The facility is designed for data access. About 30 kinds of diagnostic devices, classified by diagnostic object, are installed on the JT-60 facility. They gather about 10 megabytes of diagnostic data per discharge. These diagnostic data vary qualitatively and quantitatively with the experimental purpose. Other fundamental information, such as discharge conditions and adjustment values for the diagnostic devices, is also required to process the gathered data

  14. Local-global classifier fusion for screening chest radiographs

    Science.gov (United States)

    Ding, Meng; Antani, Sameer; Jaeger, Stefan; Xue, Zhiyun; Candemir, Sema; Kohli, Marc; Thoma, George

    2017-03-01

    Tuberculosis (TB) is a severe comorbidity of HIV and chest x-ray (CXR) analysis is a necessary step in screening for the infective disease. Automatic analysis of digital CXR images for detecting pulmonary abnormalities is critical for population screening, especially in medical resource constrained developing regions. In this article, we describe steps that improve previously reported performance of NLM's CXR screening algorithms and help advance the state of the art in the field. We propose a local-global classifier fusion method where two complementary classification systems are combined. The local classifier focuses on subtle and partial presentation of the disease leveraging information in radiology reports that roughly indicates locations of the abnormalities. In addition, the global classifier models the dominant spatial structure in the gestalt image using GIST descriptor for the semantic differentiation. Finally, the two complementary classifiers are combined using linear fusion, where the weight of each decision is calculated by the confidence probabilities from the two classifiers. We evaluated our method on three datasets in terms of the area under the Receiver Operating Characteristic (ROC) curve, sensitivity, specificity and accuracy. The evaluation demonstrates the superiority of our proposed local-global fusion method over any single classifier.
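
    The fusion step can be written in a few lines. The sketch below combines a local and a global posterior for a single image with weights derived from each classifier's confidence; the posteriors and the specific weighting rule are assumptions for illustration, not the exact calculation used in the paper.

```python
# Sketch of confidence-weighted linear fusion of two classifier decisions.
import numpy as np

p_local  = np.array([0.35, 0.65])    # P(normal), P(abnormal) from the local classifier
p_global = np.array([0.80, 0.20])    # the same posteriors from the GIST-based classifier

# Use each classifier's own confidence (distance from a 0.5 coin flip) as its weight.
w_local  = abs(p_local[1]  - 0.5)
w_global = abs(p_global[1] - 0.5)
weights = np.array([w_local, w_global]) / (w_local + w_global)

fused = weights[0] * p_local + weights[1] * p_global
print("fusion weights:", np.round(weights, 3))
print("fused posterior:", np.round(fused, 3),
      "->", "abnormal" if fused[1] > 0.5 else "normal")
```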

  15. Verification of classified fissile material using unclassified attributes

    International Nuclear Information System (INIS)

    Nicholas, N.J.; Fearey, B.L.; Puckett, J.M.; Tape, J.W.

    1998-01-01

    This paper reports on the most recent efforts of US technical experts to explore verification by IAEA of unclassified attributes of classified excess fissile material. Two propositions are discussed: (1) that multiple unclassified attributes could be declared by the host nation and then verified (and reverified) by the IAEA in order to provide confidence in that declaration of a classified (or unclassified) inventory while protecting classified or sensitive information; and (2) that attributes could be measured, remeasured, or monitored to provide continuity of knowledge in a nonintrusive and unclassified manner. They believe attributes should relate to characteristics of excess weapons materials and should be verifiable and authenticatable with methods usable by IAEA inspectors. Further, attributes (along with the methods to measure them) must not reveal any classified information. The approach that the authors have taken is as follows: (1) assume certain attributes of classified excess material, (2) identify passive signatures, (3) determine range of applicable measurement physics, (4) develop a set of criteria to assess and select measurement technologies, (5) select existing instrumentation for proof-of-principle measurements and demonstration, and (6) develop and design information barriers to protect classified information. While the attribute verification concepts and measurements discussed in this paper appear promising, neither the attribute verification approach nor the measurement technologies have been fully developed, tested, and evaluated

  16. A cardiorespiratory classifier of voluntary and involuntary electrodermal activity

    Directory of Open Access Journals (Sweden)

    Sejdic Ervin

    2010-02-01

    Full Text Available Abstract Background Electrodermal reactions (EDRs) can be attributed to many origins, including spontaneous fluctuations of electrodermal activity (EDA) and stimuli such as deep inspirations, voluntary mental activity and startling events. In fields that use EDA as a measure of psychophysiological state, the fact that EDRs may be elicited from many different stimuli is often ignored. This study attempts to classify observed EDRs as voluntary (i.e., generated from intentional respiratory or mental activity) or involuntary (i.e., generated from startling events or spontaneous electrodermal fluctuations). Methods Eight able-bodied participants were subjected to conditions that would cause a change in EDA: music imagery, startling noises, and deep inspirations. A user-centered cardiorespiratory classifier consisting of (1) an EDR detector, (2) a respiratory filter and (3) a cardiorespiratory filter was developed to automatically detect a participant's EDRs and to classify the origin of their stimulation as voluntary or involuntary. Results Detected EDRs were classified with a positive predictive value of 78%, a negative predictive value of 81% and an overall accuracy of 78%. Without the classifier, EDRs could only be correctly attributed as voluntary or involuntary with an accuracy of 50%. Conclusions The proposed classifier may enable investigators to form more accurate interpretations of electrodermal activity as a measure of an individual's psychophysiological state.

  17. 76 FR 24802 - Eliminating the Decision Review Board

    Science.gov (United States)

    2011-05-03

    ... 0960-AG80 Eliminating the Decision Review Board AGENCY: Social Security Administration. ACTION: Final rules. SUMMARY: We are eliminating the Decision Review Board (DRB) portions of part 405 of our rules...-level process. DSI also eliminated review by the Appeals Council, the final step in our administrative...

  18. 77 FR 30871 - Implementing the Prison Rape Elimination Act

    Science.gov (United States)

    2012-05-23

    ...--Implementing the Prison Rape Elimination Act Proclamation 8823--Armed Forces Day, 2012 Presidential... Prison Rape Elimination Act Memorandum for the Heads of Executive Departments and Agencies Sexual... Rape Elimination Act of 2003 (PREA) was enacted with bipartisan support and established a ``zero...

  19. Emission Facilities - Erosion & Sediment Control Facilities

    Data.gov (United States)

    NSGIC Education | GIS Inventory — An Erosion and Sediment Control Facility is a DEP primary facility type related to the Water Pollution Control program. The following sub-facility types related to...

  20. Balanced sensitivity functions for tuning multi-dimensional Bayesian network classifiers

    NARCIS (Netherlands)

    Bolt, J.H.; van der Gaag, L.C.

    Multi-dimensional Bayesian network classifiers are Bayesian networks of restricted topological structure, which are tailored to classifying data instances into multiple dimensions. Like more traditional classifiers, multi-dimensional classifiers are typically learned from data and may include

  1. On the Importance of Elimination Heuristics in Lazy Propagation

    DEFF Research Database (Denmark)

    Madsen, Anders Læsø; Butz, Cory J.

    2012-01-01

    elimination orders on-line. This paper considers the importance of elimination heuristics in LP when using Variable Elimination (VE) as the message and single marginal computation algorithm. It considers well-known cost measures for selecting the next variable to eliminate and a new cost measure....... The empirical evaluation examines different heuristics as well as sequences of cost measures, and was conducted on real-world and randomly generated Bayesian networks. The results show that for most cases performance is robust relative to the cost measure used and in some cases the elimination heuristic can have
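
    Two of the well-known cost measures referred to above can be sketched directly on a moral graph: minimum degree and minimum fill-in. The graph below is a made-up example; the paper's evaluation uses real-world and randomly generated Bayesian networks and additional cost measures.

```python
# Sketch of min-degree and min-fill heuristics for choosing the next variable
# to eliminate in Variable Elimination.  The moral graph is a toy example.
import itertools

graph = {                      # undirected moral graph: node -> set of neighbours
    "A": {"B", "C"}, "B": {"A", "C", "D"},
    "C": {"A", "B", "D"}, "D": {"B", "C", "E"}, "E": {"D"},
}

def degree_cost(v):
    return len(graph[v])

def fill_in_cost(v):
    # number of edges that must be added to make v's neighbours a clique
    nbrs = graph[v]
    return sum(1 for x, y in itertools.combinations(nbrs, 2) if y not in graph[x])

for measure in (degree_cost, fill_in_cost):
    best = min(graph, key=measure)
    print(f"{measure.__name__}: eliminate {best} (cost {measure(best)})")
```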

  2. Nonparametric Coupled Bayesian Dictionary and Classifier Learning for Hyperspectral Classification.

    Science.gov (United States)

    Akhtar, Naveed; Mian, Ajmal

    2017-10-03

    We present a principled approach to learn a discriminative dictionary along a linear classifier for hyperspectral classification. Our approach places Gaussian Process priors over the dictionary to account for the relative smoothness of the natural spectra, whereas the classifier parameters are sampled from multivariate Gaussians. We employ two Beta-Bernoulli processes to jointly infer the dictionary and the classifier. These processes are coupled under the same sets of Bernoulli distributions. In our approach, these distributions signify the frequency of the dictionary atom usage in representing class-specific training spectra, which also makes the dictionary discriminative. Due to the coupling between the dictionary and the classifier, the popularity of the atoms for representing different classes gets encoded into the classifier. This helps in predicting the class labels of test spectra that are first represented over the dictionary by solving a simultaneous sparse optimization problem. The labels of the spectra are predicted by feeding the resulting representations to the classifier. Our approach exploits the nonparametric Bayesian framework to automatically infer the dictionary size--the key parameter in discriminative dictionary learning. Moreover, it also has the desirable property of adaptively learning the association between the dictionary atoms and the class labels by itself. We use Gibbs sampling to infer the posterior probability distributions over the dictionary and the classifier under the proposed model, for which, we derive analytical expressions. To establish the effectiveness of our approach, we test it on benchmark hyperspectral images. The classification performance is compared with the state-of-the-art dictionary learning-based classification methods.

  3. Classifying a smoker scale in adult daily and nondaily smokers.

    Science.gov (United States)

    Pulvers, Kim; Scheuermann, Taneisha S; Romero, Devan R; Basora, Brittany; Luo, Xianghua; Ahluwalia, Jasjit S

    2014-05-01

    Smoker identity, or the strength of beliefs about oneself as a smoker, is a robust marker of smoking behavior. However, many nondaily smokers do not identify as smokers, underestimating their risk for tobacco-related disease and resulting in missed intervention opportunities. Assessing underlying beliefs about characteristics used to classify smokers may help explain the discrepancy between smoking behavior and smoker identity. This study examines the factor structure, reliability, and validity of the Classifying a Smoker scale among a racially diverse sample of adult smokers. A cross-sectional survey was administered through an online panel survey service to 2,376 current smokers who were at least 25 years of age. The sample was stratified to obtain equal numbers of 3 racial/ethnic groups (African American, Latino, and White) across smoking level (nondaily and daily smoking). The Classifying a Smoker scale displayed a single factor structure and excellent internal consistency (α = .91). Classifying a Smoker scores significantly increased at each level of smoking, F(3,2375) = 23.68, p < .001. Participants with higher scores reported stronger smoker identity, stronger dependence on cigarettes, greater health risk perceptions, more smoking friends, and were more likely to carry cigarettes. Classifying a Smoker scores explained unique variance in smoking variables above and beyond that explained by smoker identity. The present study supports the use of the Classifying a Smoker scale among diverse, experienced smokers. Stronger endorsement of characteristics used to classify a smoker (i.e., stricter criteria) was positively associated with heavier smoking and related characteristics. Prospective studies are needed to inform prevention and treatment efforts.

  4. Representative Vector Machines: A Unified Framework for Classical Classifiers.

    Science.gov (United States)

    Gui, Jie; Liu, Tongliang; Tao, Dacheng; Sun, Zhenan; Tan, Tieniu

    2016-08-01

    Classifier design is a fundamental problem in pattern recognition. A variety of pattern classification methods such as the nearest neighbor (NN) classifier, support vector machine (SVM), and sparse representation-based classification (SRC) have been proposed in the literature. These typical and widely used classifiers were originally developed from different theory or application motivations and they are conventionally treated as independent and specific solutions for pattern classification. This paper proposes a novel pattern classification framework, namely, representative vector machines (or RVMs for short). The basic idea of RVMs is to assign the class label of a test example according to its nearest representative vector. The contributions of RVMs are twofold. On one hand, the proposed RVMs establish a unified framework of classical classifiers because NN, SVM, and SRC can be interpreted as the special cases of RVMs with different definitions of representative vectors. Thus, the underlying relationship among a number of classical classifiers is revealed for better understanding of pattern classification. On the other hand, novel and advanced classifiers are inspired in the framework of RVMs. For example, a robust pattern classification method called discriminant vector machine (DVM) is motivated from RVMs. Given a test example, DVM first finds its k -NNs and then performs classification based on the robust M-estimator and manifold regularization. Extensive experimental evaluations on a variety of visual recognition tasks such as face recognition (Yale and face recognition grand challenge databases), object categorization (Caltech-101 dataset), and action recognition (Action Similarity LAbeliNg) demonstrate the advantages of DVM over other classifiers.
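
    The following sketch illustrates the representative-vector view on toy data: each class nominates a representative vector, and the test point takes the label of the nearest one. Choosing the class's nearest training sample as representative recovers the ordinary 1-NN rule; the class-mean option shown is just another simple instantiation, not one of the specific classifiers (SVM, SRC, DVM) analysed in the paper.

```python
# Sketch of the representative-vector view of classification.
import numpy as np

def rvm_predict(X, y, x_test, representative="nearest"):
    labels = np.unique(y)
    reps = []
    for c in labels:
        Xc = X[y == c]
        if representative == "nearest":
            # class representative = training sample of class c nearest to x_test
            reps.append(Xc[np.argmin(np.linalg.norm(Xc - x_test, axis=1))])
        else:
            # "mean": class representative = class centroid (nearest-class-mean rule)
            reps.append(Xc.mean(axis=0))
    reps = np.array(reps)
    return labels[np.argmin(np.linalg.norm(reps - x_test, axis=1))]

X = np.array([[0, 0], [1, 0], [0, 1], [4, 4], [5, 4], [4, 5]], dtype=float)
y = np.array([0, 0, 0, 1, 1, 1])
print(rvm_predict(X, y, np.array([0.6, 0.4])))           # 0
print(rvm_predict(X, y, np.array([3.5, 3.5]), "mean"))   # 1
```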

  5. Performance of smokeless gasoline fire test facility

    International Nuclear Information System (INIS)

    Griffin, J.F.; Watkins, R.A.

    1978-01-01

    Packaging for radioactive materials must perform satisfactorily when subjected to temperatures simulating an accident involving a fire. The new thermal test facility has proved to be a reliable method for satisfactorily performing the required test. The flame provides sufficient heat to assure that the test is valid, and the temperature can be controlled satisfactorily. Also, the air and water mist systems virtually eliminate any smoke and thereby exceed the local EPA requirements. The combination of the two systems provides an inexpensive, low maintenance technique for elimination of the smoke plume

  6. Elimination of onchocerciasis from Colombia: first proof of concept of river blindness elimination in the world.

    Science.gov (United States)

    Nicholls, Rubén Santiago; Duque, Sofía; Olaya, Luz Adriana; López, Myriam Consuelo; Sánchez, Sol Beatriz; Morales, Alba Lucía; Palma, Gloria Inés

    2018-04-11

    Onchocerciasis is a chronic parasitic infection originally endemic in 13 discrete regional foci distributed among six countries of Latin America (Brazil, Colombia, Ecuador, Guatemala, Mexico and Venezuela). In Colombia, this disease was discovered in 1965 in the Pacific Coast of the country. The National Onchocerciasis Elimination Program was established in 1993 with the aim of eliminating disease morbidity and infection transmission. In 2013, the World Health Organization (WHO) verified Colombia as free of onchocerciasis, becoming the first country in the world to reach such a goal. This report provides the empirical evidence of the elimination of Onchocerca volvulus transmission by Simulium exiguum (s.l.) after 12 years of 6-monthly mass drug administration of Mectizan® (ivermectin) to all the eligible residents living in this endemic area. From 1996 onwards, a biannual community-based mass ivermectin administration programme was implemented, complemented by health education and community participation. In-depth parasitological, serological and entomological surveys were conducted periodically between 1998 and 2007 to evaluate the impact of ivermectin treatment according to the 2001 WHO guidelines. When the interruption of parasite transmission was demonstrated, the drug distribution ceased and a three-year post-treatment surveillance (PTS) period (2008-2010) was initiated. After 23 rounds of treatment, parasitological and ophthalmological assessments showed absence of microfilariae in skin and anterior chamber of the eyes. Serological tests proved lack of antibodies against O. volvulus in children under 10 years-old. A total of 10,500 S. exiguum flies tested by PCR had no L3 infection (infectivity rate = 0.0095%; 95% CI: 0.0029-0.049) during 2004, indicating interruption of parasite transmission. However, biannual ivermectin treatments continued until 2007 followed by a 3-year PTS period at the end of which 13,481 flies were analyzed and no infective flies were

  7. Polio elimination in Nigeria: A review.

    Science.gov (United States)

    Nasir, Usman Nakakana; Bandyopadhyay, Ananda Sankar; Montagnani, Francesca; Akite, Jacqueline Elaine; Mungu, Etaluka Blanche; Uche, Ifeanyi Valentine; Ismaila, Ahmed Mohammed

    2016-03-03

    Nigeria has made tremendous strides towards eliminating polio and has been free of wild polio virus (WPV) for more than a year as of August 2015. However, sustained focus on eliminating all types of poliovirus, by improving population immunity and enhancing disease surveillance, will be needed to ensure the country sustains its polio-free status. We reviewed the pertinent literature, including published and unpublished official reports and working documents of the Global Polio Eradication Initiative (GPEI) partners as well as other concerned organizations. The literature was selected based on the following criteria: published in the English language, published after the year 2000, and relevant content conforming to the theme of the review; the selected documents were sorted accordingly. The challenges facing the Polio Eradication Initiative (PEI) in Nigeria were found to fall into three broad categories: failure to vaccinate, failure of the Oral Polio Vaccine (OPV), and the epidemiology of the virus. Failure to vaccinate resulted from insecurity, heterogeneous political support, programmatic limitations in the implementation of vaccination campaigns, poor performance of vaccination teams in persistently poor-performing Local Government Areas, and sporadic vaccine refusals in Northern Nigeria. Suboptimal effectiveness of OPV in some settings, as well as the rare occurrence of VDPVs associated with OPV type 2 in areas of low immunization coverage, were also found to be key issues. Some of the innovations which helped to manage the threats to the PEI include a strong government accountability framework, the change from type 2-containing OPV to bivalent OPVs for supplementary immunization activities (SIAs), and enhanced environmental surveillance in key states (Sokoto, Kano and Borno), along with an overall improvement in SIA quality. There has been an improvement in the coverage of routine immunization and vaccination campaigns, which has resulted in Nigeria being removed from the list of endemic countries

  8. Malaria elimination in Haiti by the year 2020: an achievable goal?

    Science.gov (United States)

    Boncy, Paul Jacques; Adrien, Paul; Lemoine, Jean Frantz; Existe, Alexandre; Henry, Patricia Jean; Raccurt, Christian; Brasseur, Philippe; Fenelon, Natael; Dame, John B; Okech, Bernard A; Kaljee, Linda; Baxa, Dwayne; Prieur, Eric; El Badry, Maha A; Tagliamonte, Massimiliano S; Mulligan, Connie J; Carter, Tamar E; Beau de Rochars, V Madsen; Lutz, Chelsea; Parke, Dana M; Zervos, Marcus J

    2015-06-05

    Haiti and the Dominican Republic, which share the island of Hispaniola, are the last locations in the Caribbean where malaria still persists. Malaria is an important public health concern in Haiti with 17,094 reported cases in 2014. Further, on January 12, 2010, a record earthquake devastated densely populated areas in Haiti including many healthcare and laboratory facilities. Weakened infrastructure provided fertile reservoirs for uncontrolled transmission of infectious pathogens. This situation results in unique challenges for malaria epidemiology and elimination efforts. To help Haiti achieve its malaria elimination goals by year 2020, the Laboratoire National de Santé Publique and Henry Ford Health System, in close collaboration with the Direction d'Épidémiologie, de Laboratoire et de Recherches and the Programme National de Contrôle de la Malaria, hosted a scientific meeting on "Elimination Strategies for Malaria in Haiti" on January 29-30, 2015 at the National Laboratory in Port-au-Prince, Haiti. The meeting brought together laboratory personnel, researchers, clinicians, academics, public health professionals, and other stakeholders to discuss main stakes and perspectives on malaria elimination. Several themes and recommendations emerged during discussions at this meeting. First, more information and research on malaria transmission in Haiti are needed including information from active surveillance of cases and vectors. Second, many healthcare personnel need additional training and critical resources on how to properly identify malaria cases so as to improve accurate and timely case reporting. Third, it is necessary to continue studies genotyping strains of Plasmodium falciparum in different sites with active transmission to evaluate for drug resistance and impacts on health. Fourth, elimination strategies outlined in this report will continue to incorporate use of primaquine in addition to chloroquine and active surveillance of cases. Elimination of

  9. Current Directional Protection of Series Compensated Line Using Intelligent Classifier

    Directory of Open Access Journals (Sweden)

    M. Mollanezhad Heydarabadi

    2016-12-01

    Full Text Available Current inversion conditions lead to incorrect operation of current-based directional relays in power systems with series compensation devices. This paper suggests the application of an intelligent system for fault direction classification. A new current directional protection scheme based on an intelligent classifier is proposed for the series compensated line. The proposed classifier uses only half a cycle of pre-fault and post-fault current samples at the relay location as its input. A large number of forward and backward fault simulations under different system conditions, on a transmission line with a fixed series capacitor, are carried out using PSCAD/EMTDC software. The applicability of the decision tree (DT), probabilistic neural network (PNN) and support vector machine (SVM) is investigated using data simulated under different system conditions. The performance comparison of the classifiers indicates that the SVM is the most suitable classifier for fault direction discrimination. Backward faults can be accurately distinguished from forward faults even under current inversion, without requiring detection of the current inversion condition.

  10. Neural network classifier of attacks in IP telephony

    Science.gov (United States)

    Safarik, Jakub; Voznak, Miroslav; Mehic, Miralem; Partila, Pavol; Mikulec, Martin

    2014-05-01

    Various types of monitoring mechanisms allow us to detect and monitor the behavior of attackers in VoIP networks. Analysis of detected malicious traffic is crucial for further investigation and for hardening the network. This analysis is typically based on statistical methods, and this article presents a solution based on a neural network. The proposed algorithm is used as a classifier of attacks in a distributed monitoring network of independent honeypot probes. Information about attacks on these honeypots is collected on a centralized server and then classified. This classification is based on different mechanisms, one of which is a multilayer perceptron neural network. The article describes the inner structure of the neural network used and its implementation. The learning set for this neural network is based on real attack data collected from an IP telephony honeypot called Dionaea. We prepare the learning set from real attack data after collecting, cleaning and aggregating this information. After proper training, the neural network is capable of classifying 6 of the most commonly used types of VoIP attacks. Using a neural network classifier brings more accurate attack classification in a distributed system of honeypots. With this approach it is possible to detect malicious behavior in different parts of networks which are logically or geographically divided, and to use the information from one network to harden security in other networks. The centralized server for the distributed set of nodes serves not only as a collector and classifier of attack data, but also as a mechanism for generating precautionary steps against attacks.

  11. Maximum margin classifier working in a set of strings.

    Science.gov (United States)

    Koyano, Hitoshi; Hayashida, Morihiro; Akutsu, Tatsuya

    2016-03-01

    Numbers and numerical vectors account for a large portion of data. However, recently, the amount of string data generated has increased dramatically. Consequently, classifying string data is a common problem in many fields. The most widely used approach to this problem is to convert strings into numerical vectors using string kernels and subsequently apply a support vector machine that works in a numerical vector space. However, this non-one-to-one conversion involves a loss of information and makes it impossible to evaluate, using probability theory, the generalization error of a learning machine, considering that the given data to train and test the machine are strings generated according to probability laws. In this study, we approach this classification problem by constructing a classifier that works in a set of strings. To evaluate the generalization error of such a classifier theoretically, probability theory for strings is required. Therefore, we first extend a limit theorem for a consensus sequence of strings demonstrated by one of the authors and co-workers in a previous study. Using the obtained result, we then demonstrate that our learning machine classifies strings in an asymptotically optimal manner. Furthermore, we demonstrate the usefulness of our machine in practical data analysis by applying it to predicting protein-protein interactions using amino acid sequences and classifying RNAs by the secondary structure using nucleotide sequences.
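
    For contrast with the string-space classifier proposed in the paper, the sketch below shows the conventional route it discusses: strings are mapped to k-mer count vectors via a spectrum kernel and fed to an SVM with a precomputed kernel matrix. The toy sequences, labels and k value are invented for illustration.

```python
# Spectrum (k-mer) string kernel + SVM with a precomputed kernel matrix.
from collections import Counter
import numpy as np
from sklearn.svm import SVC

def spectrum_features(s, k=2):
    return Counter(s[i:i + k] for i in range(len(s) - k + 1))

def spectrum_kernel(A, B, k=2):
    K = np.zeros((len(A), len(B)))
    for i, a in enumerate(A):
        fa = spectrum_features(a, k)
        for j, b in enumerate(B):
            fb = spectrum_features(b, k)
            K[i, j] = sum(fa[m] * fb[m] for m in fa)   # inner product of k-mer counts
    return K

train = ["ACGTACGT", "ACGTTTGT", "GGGGCCCC", "GGCCGGCC"]
labels = [0, 0, 1, 1]
test = ["ACGTACTT", "GGGCCCGG"]

clf = SVC(kernel="precomputed").fit(spectrum_kernel(train, train), labels)
print(clf.predict(spectrum_kernel(test, train)))        # expected [0 1]
```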

  12. Use of information barriers to protect classified information

    International Nuclear Information System (INIS)

    MacArthur, D.; Johnson, M.W.; Nicholas, N.J.; Whiteson, R.

    1998-01-01

    This paper discusses the detailed requirements for an information barrier (IB) for use with verification systems that employ intrusive measurement technologies. The IB would protect classified information in a bilateral or multilateral inspection of classified fissile material. Such a barrier must strike a balance between providing the inspecting party the confidence necessary to accept the measurement while protecting the inspected party's classified information. The authors discuss the structure required of an IB as well as the implications of the IB on detector system maintenance. A defense-in-depth approach is proposed which would provide assurance to the inspected party that all sensitive information is protected and to the inspecting party that the measurements are being performed as expected. The barrier could include elements of physical protection (such as locks, surveillance systems, and tamper indicators), hardening of key hardware components, assurance of capabilities and limitations of hardware and software systems, administrative controls, validation and verification of the systems, and error detection and resolution. Finally, an unclassified interface could be used to display and, possibly, record measurement results. The introduction of an IB into an analysis system may result in many otherwise innocuous components (detectors, analyzers, etc.) becoming classified and unavailable for routine maintenance by uncleared personnel. System maintenance and updating will be significantly simplified if the classification status of as many components as possible can be made reversible (i.e. the component can become unclassified following the removal of classified objects)

  13. Generalization in the XCSF classifier system: analysis, improvement, and extension.

    Science.gov (United States)

    Lanzi, Pier Luca; Loiacono, Daniele; Wilson, Stewart W; Goldberg, David E

    2007-01-01

    We analyze generalization in XCSF and introduce three improvements. We begin by showing that the types of generalizations evolved by XCSF can be influenced by the input range. To explain these results we present a theoretical analysis of the convergence of classifier weights in XCSF which highlights a broader issue. In XCSF, because of the mathematical properties of the Widrow-Hoff update, the convergence of classifier weights in a given subspace can be slow when the spread of the eigenvalues of the autocorrelation matrix associated with each classifier is large. As a major consequence, the system's accuracy pressure may act before classifier weights are adequately updated, so that XCSF may evolve piecewise constant approximations, instead of the intended, and more efficient, piecewise linear ones. We propose three different ways to update classifier weights in XCSF so as to increase the generalization capabilities of XCSF: one based on a condition-based normalization of the inputs, one based on linear least squares, and one based on the recursive version of linear least squares. Through a series of experiments we show that while all three approaches significantly improve XCSF, least squares approaches appear to be best performing and most robust. Finally we show how XCSF can be extended to include polynomial approximations.
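
    As a rough illustration of the recursive least squares update the abstract proposes as a replacement for the Widrow-Hoff rule, a single classifier's linear prediction could be maintained as sketched below; the initialization constant and forgetting factor are assumptions, not values from the paper.

    # Sketch: recursive-least-squares weight update of the kind proposed for XCSF.
    # Constants (delta, forgetting factor) are illustrative assumptions.
    import numpy as np

    class RLSPredictor:
        """Linear prediction w.x with an RLS update (one classifier's weights)."""

        def __init__(self, n_inputs, delta=100.0, forget=1.0):
            self.w = np.zeros(n_inputs)
            self.P = np.eye(n_inputs) * delta  # inverse correlation matrix estimate
            self.forget = forget               # forgetting factor (1.0 = none)

        def predict(self, x):
            return float(self.w @ x)

        def update(self, x, target):
            # Gain vector, then rank-one update of P and of the weights.
            Px = self.P @ x
            k = Px / (self.forget + x @ Px)
            self.w += k * (target - self.predict(x))
            self.P = (self.P - np.outer(k, Px)) / self.forget

    # Toy usage: learn y = 2*x1 - x2 + 0.5 (input augmented with a constant 1).
    rls = RLSPredictor(3)
    rng = np.random.default_rng(2)
    for _ in range(200):
        x = np.append(rng.uniform(-1, 1, 2), 1.0)
        rls.update(x, 2 * x[0] - x[1] + 0.5)
    print(np.round(rls.w, 3))  # should approach [2, -1, 0.5]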

  14. Dynamic cluster generation for a fuzzy classifier with ellipsoidal regions.

    Science.gov (United States)

    Abe, S

    1998-01-01

    In this paper, we discuss a fuzzy classifier with ellipsoidal regions that dynamically generates clusters. First, for the data belonging to a class we define a fuzzy rule with an ellipsoidal region: using the training data for each class, we calculate the center and the covariance matrix of the ellipsoidal region for that class. We then tune the fuzzy rules, i.e., the slopes of the membership functions, successively until there is no improvement in the recognition rate on the training data. If the number of data points belonging to a class that are misclassified into another class exceeds a prescribed number, we define a new cluster to which those data belong, together with the associated fuzzy rule. We then tune the newly defined fuzzy rules in a similar way, keeping the already obtained fuzzy rules fixed. We iterate cluster generation and tuning of the newly generated fuzzy rules until the number of data points belonging to a class that are misclassified into another class no longer exceeds the prescribed number. We evaluate our method using thyroid data, Japanese Hiragana data from vehicle license plates, and blood cell data. Through dynamic cluster generation, the generalization ability of the classifier is improved, and when there are no discrete input variables the recognition rate of the fuzzy classifier on the test data is the best among the neural network classifiers and other fuzzy classifiers compared.
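
    The basic ellipsoidal-region rule described above, one cluster per class defined by its mean and covariance and classified by a Mahalanobis-distance membership, might be sketched as follows; the tuning of membership slopes and the dynamic cluster generation are omitted, and the data are synthetic placeholders.

    # Sketch: one ellipsoidal rule per class (mean + covariance), membership based on
    # Mahalanobis distance. Slope tuning and dynamic cluster generation are omitted.
    import numpy as np

    class EllipsoidalClassifier:
        def __init__(self):
            self.rules = []  # list of (class label, mean, inverse covariance, slope)

        def fit(self, X, y):
            for label in np.unique(y):
                Xc = X[y == label]
                mean = Xc.mean(axis=0)
                cov_inv = np.linalg.pinv(np.cov(Xc, rowvar=False))
                self.rules.append((label, mean, cov_inv, 1.0))  # slope = 1 (untuned)

        def predict(self, X):
            preds = []
            for x in X:
                # Membership decreases with the slope-scaled Mahalanobis distance.
                degrees = [(lbl, np.exp(-a * (x - m) @ S @ (x - m)))
                           for lbl, m, S, a in self.rules]
                preds.append(max(degrees, key=lambda d: d[1])[0])
            return np.array(preds)

    rng = np.random.default_rng(3)
    X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(4, 1, (50, 2))])
    y = np.array([0] * 50 + [1] * 50)
    clf = EllipsoidalClassifier()
    clf.fit(X, y)
    print((clf.predict(X) == y).mean())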

  15. Air Quality Facilities

    Data.gov (United States)

    Iowa State University GIS Support and Research Facility. Facilities with operating permits for Title V of the Federal Clean Air Act, as well as facilities required to submit an air emissions inventory, and other facilities...

  16. Reward eliminates retrieval-induced forgetting.

    Science.gov (United States)

    Imai, Hisato; Kim, Dongho; Sasaki, Yuka; Watanabe, Takeo

    2014-12-02

    Although it is well known that reward enhances learning and memory, how extensively such enhancement occurs remains unclear. To address this question, we examined how reward influences retrieval-induced forgetting (RIF) in which the retrieval of a nonpracticed item under the same category as a practiced item is worse than the retrieval of a nonpracticed item outside the category. Subjects were asked to try to encode category-exemplar pairs (e.g., FISH-salmon). Then, they were presented with a category name and a two-letter word stem (e.g., FISH-sa) and were asked to complete an encoded word (retrieval practice). For a correct response, apple juice was given as a reward in the reward condition and a beeping sound was presented in the no-reward condition. Finally, subjects were asked to report whether each exemplar had been presented in the first phase. RIF was replicated in the no-reward condition. However, in the reward condition, RIF was eliminated. These results suggest that reward enhances processing of retrieval of unpracticed members by mechanisms such as spreading activation within the same category, irrespective of whether items were practiced or not.

  17. Eliminating Residents Increases the Cost of Care.

    Science.gov (United States)

    DeMarco, Deborah M; Forster, Richard; Gakis, Thomas; Finberg, Robert W

    2017-08-01

    Academic health centers are facing a potential reduction in Medicare financing for graduate medical education (GME). Both the Medicare Payment Advisory Commission and the National Commission on Fiscal Responsibility and Reform (Deficit Commission) have suggested cutting approximately half the funding that teaching hospitals receive for indirect medical education. Because of the effort that goes into teaching trainees, who are only transient employees, hospital executives often see teaching programs as a drain on resources. In light of the possibility of a Medicare cut to GME programs, we undertook an analysis to assess the financial risk of training programs to our institution and the possibility of saving money by reducing resident positions. The chief administrative officer, in collaboration with the hospital chief financial officer, performed a financial analysis to examine the possibility of decreasing costs by reducing residency programs at the University of Massachusetts Memorial Medical Center. Despite the real costs of our training programs, the analysis demonstrated that GME programs have a positive impact on hospital finances. Reducing or eliminating GME programs would have a negative impact on our hospital's bottom line.

  18. Seasonal blood shortages can be eliminated.

    Science.gov (United States)

    Gilcher, Ronald O; McCombs, Suzanne

    2005-11-01

    This review is designed to help readers understand seasonal blood shortages and provide solutions through the use of technology that can increase the number of red blood cell units collected and the use of recruitment and marketing initiatives that appeal to the increasingly diverse donor base. Seasonal shortages are, in reality, mostly shortages of group O red blood cells and occur most commonly during midsummer and early winter. The shortages arise primarily from increased use of group O red blood cells at times of decreased donor availability. While reducing the disproportionate use of red cells will help, blood centers can more quickly reduce the seasonal deficits by using automated red cell technology to collect double red blood cell units; targeted marketing programs to provide effective messages; seasonal advertising campaigns; and recognition, benefits, and incentives to strengthen donor motivation to donate. A multi-level approach to increasing blood donations at difficult times of the year can ensure that donations are increased at a time when regular donor availability is decreased. Seasonal blood shortages can be eliminated by understanding the nature of the shortages, why and when they occur, and by using more sophisticated recruitment and marketing strategies as well as automated collection technologies to enhance the blood supply.

  19. Elimination of Sitophilus oryzae (L.) by irradiation

    International Nuclear Information System (INIS)

    Castro, Fernanda Peixoto

    2005-01-01

    Food treatment by exposure to ionizing radiation, known as food irradiation, presents several attractive features such as: leaving no residues, posing no threat to consumer health, usually causing no damage to sensory or nutritional properties and acting uniformly throughout the volume of the products. This work investigated the efficiency of irradiation for eliminating Sitophilus oryzae (L.), also known as 'the rice weevil', a small beetle frequently found in infested grains. A total of 444 individuals of Sitophilus oryzae (L.) found in corn meal and noodles supplies were irradiated with gamma ray doses of 0, 0.6, 0.9, 1.0, 1.2, 1.5 and 2.0 kGy and then visually monitored for 4 days in order to determine the number of insects still alive. The least-squares fitting method was used to determine the survival curves as functions of post-irradiation time and dose. The living fraction of the irradiated population was found to decrease exponentially with time. The results indicated that doses of 2.0, 1.5 and 0.6 kGy cause immediate death, instantaneous immobility and death of the species within one week, respectively. The findings suggest that disinfestation of Sitophilus oryzae (L.) by irradiation is an interesting option to the dangerous use of toxic chemicals. (author)

  20. Minimizing or eliminating refueling of nuclear reactor

    Science.gov (United States)

    Doncals, Richard A.; Paik, Nam-Chin; Andre, Sandra V.; Porter, Charles A.; Rathbun, Roy W.; Schwallie, Ambrose L.; Petras, Diane S.

    1989-01-01

    Demand for refueling of a liquid metal fast nuclear reactor having a life of 30 years is eliminated, or reduced to intervals of at least 10 years, by operating the reactor at a low linear-power density, typically 2.5 kW/ft of fuel rod, rather than 7.5 or 15 kW/ft, which is the prior art practice. So that power of the same magnitude as for prior art reactors is produced, the volume of the core is increased. In addition, the height and diameter of the core are dimensioned so that their ratio approximates 1 to the extent practicable, considering the requirements of control and that the pressure drop in the coolant shall not be excessive. The surface area of a cylinder of given volume is a minimum when the ratio of the height to the diameter is 1. By minimizing the surface area, the leakage of neutrons is reduced. By reducing the linear-power density, increasing core volume, reducing fissile enrichment and optimizing core geometry, internal-core breeding of fissionable fuel is substantially enhanced. As a result, core operational life, limited by control worth requirements and fuel burnup capability, is extended up to 30 years of continuous power operation.

  1. Elimination device for decontaminated surface layers

    International Nuclear Information System (INIS)

    Yoshikawa, Kozo.

    1983-01-01

    Purpose: To perform efficient decontamination by injecting solid carbon dioxide particles at high speed, using a simple and compact device. Constitution: Liquid carbon dioxide is injected from a first vessel containing liquid carbon dioxide through a carbon dioxide supply tube to a solid carbon dioxide particle jetting device. The liquid carbon dioxide is partially converted into fine solid carbon dioxide particles by the temperature reduction caused by adiabatic expansion of the gaseous carbon dioxide in an expansion space formed in the jetting device, and arrives at a solid carbon dioxide injection nozzle in communication with the expansion space. The fine solid carbon dioxide particles are then further cooled and accelerated by nitrogen gas jetted from a nitrogen gas nozzle at the top of a nitrogen gas supply tube, in communication with a second vessel containing liquid nitrogen, disposed within the injection nozzle, and are jetted out from the solid carbon dioxide injection nozzle to collide against the surface to be decontaminated and remove the surface contamination. (Seki, T.)

  2. Elimination of Sitophilus oryzae (L.) by irradiation

    Energy Technology Data Exchange (ETDEWEB)

    Castro, Fernanda Peixoto [Exercito Brasileiro, Brasilia, DF (Brazil). Diretoria de Suprimento. Dept. Logistico]. E-mail: peixotocastro@dlog.eb.mil.br; Vital, Helio de Carvalho [Centro Tecnologico do Exercito (CTEx), Rio de Janeiro, RJ (Brazil)]. E-mail: vital@ctex.eb.br

    2005-07-01

    Food treatment by exposure to ionizing radiation, known as food irradiation, presents several attractive features such as: leaving no residues, posing no threat to consumer health, usually causing no damage to sensory or nutritional properties and acting uniformly throughout the volume of the products. This work investigated the efficiency of irradiation for eliminating Sitophilus oryzae (L.), also known as 'the rice weevil', a small beetle frequently found in infested grains. A total of 444 individuals of Sitophilus oryzae (L.) found in corn meal and noodles supplies were irradiated with gamma ray doses of 0, 0.6, 0.9, 1.0, 1.2, 1.5 and 2.0 kGy and then visually monitored for 4 days in order to determine the number of insects still alive. The least-squares fitting method was used to determine the survival curves as functions of post-irradiation time and dose. The living fraction of the irradiated population was found to decrease exponentially with time. The results indicated that doses of 2.0, 1.5 and 0.6 kGy cause immediate death, instantaneous immobility and death of the species within one week, respectively. The findings suggest that disinfestation of Sitophilus oryzae (L.) by irradiation is an interesting option to the dangerous use of toxic chemicals. (author)

  3. Elimination of excess molybdenum by cattle

    Energy Technology Data Exchange (ETDEWEB)

    Toelgyesi, G.; Elmoty, I.A.

    1967-01-01

    It was found that cattle would spontaneously ingest 5-15 g of molybdenum on one occasion. The uptake of this quantity caused only moderate loss of appetite and mild enteritis, both normalizing within one week. Severe acute molybdenum poisoning can be practically excluded, owing to refusal of the contaminated feed. Spontaneously ingested molybdenum caused, on the first day, a 30-100 fold rise in the ruminal Mo level, which decreased to the order of the normal value in about one week. In the urine and faeces, however, the Mo level remained at least 10 fold, and in the blood and milk about 4 fold, the normal value, even one or two weeks after ingestion. During this period at least 90% of the ingested Mo was eliminated with the faeces, urine and milk. One week after the ingestion of molybdenum, the rumen content showed no evidence of poisoning and no trace of molybdenum. Oral administration of ammonium molybdate in an amount equivalent to 40 g of molybdenum caused no fatality. In fact, cattle would never spontaneously ingest such a large dose.

  4. SpectraClassifier 1.0: a user friendly, automated MRS-based classifier-development system

    Directory of Open Access Journals (Sweden)

    Julià-Sapé Margarida

    2010-02-01

    Full Text Available Abstract Background SpectraClassifier (SC) is a Java solution for designing and implementing Magnetic Resonance Spectroscopy (MRS)-based classifiers. The main goal of SC is to allow users with minimum background knowledge of multivariate statistics to perform a fully automated pattern recognition analysis. SC incorporates feature selection (greedy stepwise approach, either forward or backward) and feature extraction (PCA). Fisher Linear Discriminant Analysis is the method of choice for classification. Classifier evaluation is performed through various methods: display of the confusion matrix of the training and testing datasets; K-fold cross-validation, leave-one-out and bootstrapping as well as Receiver Operating Characteristic (ROC) curves. Results SC is composed of the following modules: Classifier design, Data exploration, Data visualisation, Classifier evaluation, Reports, and Classifier history. It is able to read low resolution in-vivo MRS (single-voxel and multi-voxel) and high resolution tissue MRS (HRMAS), processed with existing tools (jMRUI, INTERPRET, 3DiCSI or TopSpin). In addition, to facilitate exchanging data between applications, a standard format capable of storing all the information needed for a dataset was developed. Each functionality of SC has been specifically validated with real data with the purpose of bug-testing and methods validation. Data from the INTERPRET project was used. Conclusions SC is a user-friendly software designed to fulfil the needs of potential users in the MRS community. It accepts all kinds of pre-processed MRS data types and classifies them semi-automatically, allowing spectroscopists to concentrate on interpretation of results with the use of its visualisation tools.
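
    SpectraClassifier itself is a Java application; a rough Python analogue of the pipeline it implements, feature extraction by PCA followed by Fisher Linear Discriminant Analysis evaluated with k-fold cross-validation, could look like the sketch below, with synthetic spectra standing in for MRS data.

    # Sketch: PCA feature extraction + Fisher LDA classification with k-fold CV.
    # Synthetic "spectra" stand in for MRS data; sizes are illustrative assumptions.
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(4)
    n_points = 512                            # assumed number of spectral points
    spectra = rng.normal(size=(120, n_points))
    labels = rng.integers(0, 3, size=120)     # e.g. three tumour classes

    pipeline = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
    scores = cross_val_score(pipeline, spectra, labels, cv=5)
    print("5-fold accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))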

  5. COMPARISON OF SVM AND FUZZY CLASSIFIER FOR AN INDIAN SCRIPT

    Directory of Open Access Journals (Sweden)

    M. J. Baheti

    2012-01-01

    Full Text Available With the advent of the technological era, conversion of scanned documents (handwritten or printed) into machine-editable format has attracted many researchers. This paper deals with the problem of recognition of Gujarati handwritten numerals. Gujarati numeral recognition requires performing some specific steps as part of preprocessing. Preprocessing consists of digitization, segmentation, normalization and thinning, under the assumption that the image contains almost no noise. An affine invariant moments based model is then used for feature extraction, and finally Support Vector Machine (SVM) and Fuzzy classifiers are used for numeral classification. The comparison of the SVM and Fuzzy classifiers shows that the SVM procured better results than the Fuzzy classifier.

  6. Optimal threshold estimation for binary classifiers using game theory.

    Science.gov (United States)

    Sanchez, Ignacio Enrique

    2016-01-01

    Many bioinformatics algorithms can be understood as binary classifiers. They are usually compared using the area under the receiver operating characteristic (ROC) curve. On the other hand, choosing the best threshold for practical use is a complex task, due to uncertain and context-dependent skews in the abundance of positives in nature and in the yields/costs for correct/incorrect classification. We argue that considering a classifier as a player in a zero-sum game allows us to use the minimax principle from game theory to determine the optimal operating point. The proposed classifier threshold corresponds to the intersection between the ROC curve and the descending diagonal in ROC space and yields a minimax accuracy of 1-FPR. Our proposal can be readily implemented in practice, and reveals that the empirical condition for threshold estimation of "specificity equals sensitivity" maximizes robustness against uncertainties in the abundance of positives in nature and in classification costs.
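
    A minimal sketch of the proposed rule, choosing the threshold where the ROC curve meets the descending diagonal so that sensitivity equals specificity, is given below; the scores and labels are synthetic placeholders.

    # Sketch: pick the operating threshold where the ROC curve crosses the descending
    # diagonal (sensitivity = specificity = 1 - FPR). Scores/labels are synthetic.
    import numpy as np
    from sklearn.metrics import roc_curve

    rng = np.random.default_rng(5)
    labels = rng.integers(0, 2, size=500)
    scores = labels * 0.8 + rng.normal(0, 0.5, size=500)  # noisy classifier scores

    fpr, tpr, thresholds = roc_curve(labels, scores)
    # On the descending diagonal TPR = 1 - FPR; choose the point closest to it.
    idx = np.argmin(np.abs(tpr - (1.0 - fpr)))
    print("threshold ~ %.3f, sensitivity %.3f, specificity %.3f"
          % (thresholds[idx], tpr[idx], 1.0 - fpr[idx]))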

  7. Statistical text classifier to detect specific type of medical incidents.

    Science.gov (United States)

    Wong, Zoie Shui-Yee; Akiyama, Masanori

    2013-01-01

    WHO Patient Safety has placed a focus on increasing the coherence and expressiveness of patient safety classification with the foundation of the International Classification for Patient Safety (ICPS). Text classification and statistical approaches have been shown to be successful in identifying safety problems in the aviation industry using incident text information. It has been challenging to comprehend the taxonomy of medical incidents in a structured manner. Independent reporting mechanisms for patient safety incidents have been established in the UK, Canada, Australia, Japan, Hong Kong, etc. This research demonstrates the potential to construct statistical text classifiers to detect specific types of medical incidents using incident text data. An illustrative example of classifying look-alike sound-alike (LASA) medication incidents using structured text from 227 advisories related to medication errors from Global Patient Safety Alerts (GPSA) is shown in this poster presentation. The classifier was built using a logistic regression model. The ROC curve and the AUC value indicated that this is a satisfactorily good model.
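
    A bag-of-words logistic regression classifier of the kind described might be sketched as follows; the four example reports and their labels are invented placeholders, not GPSA advisories.

    # Sketch: tf-idf bag-of-words + logistic regression for flagging look-alike
    # sound-alike (LASA) incidents. The example reports are invented placeholders.
    from sklearn.pipeline import make_pipeline
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression

    reports = [
        "patient given hydroxyzine instead of hydralazine due to similar names",
        "dose omitted because infusion pump was unavailable on the ward",
        "celebrex dispensed in place of celexa, names confused at pharmacy",
        "patient fell while transferring from bed to chair without assistance",
    ]
    is_lasa = [1, 0, 1, 0]  # 1 = look-alike sound-alike incident

    clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
    clf.fit(reports, is_lasa)
    print(clf.predict(["metformin confused with metronidazole on the order"]))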

  8. A Topic Model Approach to Representing and Classifying Football Plays

    KAUST Repository

    Varadarajan, Jagannadan

    2013-09-09

    We address the problem of modeling and classifying American Football offense teams' plays in video, a challenging example of group activity analysis. Automatic play classification will allow coaches to infer patterns and tendencies of opponents more efficiently, resulting in better strategy planning in a game. We define a football play as a unique combination of player trajectories. To this end, we develop a framework that uses player trajectories as inputs to MedLDA, a supervised topic model. The joint maximization of both likelihood and inter-class margins of MedLDA in learning the topics allows us to learn semantically meaningful play type templates, as well as to classify different play types with 70% average accuracy. Furthermore, this method is extended to analyze individual player roles in classifying each play type. We validate our method on a large dataset comprising 271 play clips from real-world football games, which will be made publicly available for future comparisons.

  9. Defending Malicious Script Attacks Using Machine Learning Classifiers

    Directory of Open Access Journals (Sweden)

    Nayeem Khan

    2017-01-01

    Full Text Available Web applications have become a primary target for cyber criminals, who inject malware, especially JavaScript, to perform malicious activities such as impersonation. It is therefore imperative to detect such malicious code in real time, before any malicious activity is performed. This study proposes an efficient method of detecting previously unknown malicious JavaScript using an interceptor at the client side, by classifying the key features of the malicious code. The feature subset was obtained by using a wrapper method for dimensionality reduction. Supervised machine learning classifiers were used on the dataset to achieve high accuracy. Experimental results show that our method can efficiently classify malicious code from benign code with promising results.

  10. A systems biology-based classifier for hepatocellular carcinoma diagnosis.

    Directory of Open Access Journals (Sweden)

    Yanqiong Zhang

    Full Text Available AIM: The diagnosis of hepatocellular carcinoma (HCC) in the early stage is crucial to the application of curative treatments which are the only hope for increasing the life expectancy of patients. Recently, several large-scale studies have shed light on this problem through analysis of gene expression profiles to identify markers correlated with HCC progression. However, those marker sets shared few genes in common and were poorly validated using independent data. Therefore, we developed a systems biology based classifier by combining the differential gene expression with topological features of human protein interaction networks to enhance the ability of HCC diagnosis. METHODS AND RESULTS: In the Oncomine platform, genes differentially expressed in HCC tissues relative to their corresponding normal tissues were filtered by a corrected Q value cut-off and Concept filters. The identified genes that are common to different microarray datasets were chosen as the candidate markers. Then, their networks were analyzed by GeneGO Meta-Core software and the hub genes were chosen. After that, an HCC diagnostic classifier was constructed by Partial Least Squares modeling based on the microarray gene expression data of the hub genes. Validations of diagnostic performance showed that this classifier had high predictive accuracy (85.88∼92.71%) and area under the ROC curve (approximating 1.0), and that the network topological features integrated into this classifier contribute greatly to improving the predictive performance. Furthermore, it has been demonstrated that this modeling strategy is not only applicable to HCC, but also to other cancers. CONCLUSION: Our analysis suggests that the systems biology-based classifier that combines differential gene expression with topological features of the human protein interaction network may enhance the diagnostic performance of an HCC classifier.

  11. analysis, diagnosis and prognosis of leprosy utilizing fuzzy classifier

    African Journals Online (AJOL)

    ... expert system eliminates uncertainty and imprecision associated with the diagnosis of Leprosy. ... learns by observing how people regulate real systems, Leondes (2010). ... not a thing at the same time, Zadeh (1965). ...

  12. The Radiological Research Accelerator Facility:

    International Nuclear Information System (INIS)

    Hall, E.J.; Goldhagen, P.

    1988-07-01

    The Radiological Research Accelerator Facility (RARAF) is based on a 4-MV Van de Graaff accelerator, which is used to generate a variety of well-characterized radiation beams for research in radiobiology, radiological physics, and radiation chemistry. It is part of the Radiological Research Laboratory (RRL) of Columbia University, and its operation is supported as a National Facility by the U.S. Department of Energy. As such, RARAF is available to all potential users on an equal basis, and scientists outside the RRL are encouraged to submit proposals for experiments at RARAF. Facilities and services are provided to users, but the research projects themselves must be supported separately. RARAF was located at BNL from 1967 until 1980, when it was dismantled and moved to the Nevis Laboratories of Columbia University, where it was then reassembled and put back into operation. Data obtained from experiments using RARAF have been of pragmatic value to radiation protection and to neutron therapy. At a more fundamental level, the research at RARAF has provided insight into the biological action of radiation and especially its relation to energy distribution in the cell. High-LET radiations are an agent of special importance because they can cause measurable cellular effects by single particles, eliminating some of the complexities of multievent action and more clearly disclosing basic features. This applies particularly to radiation carcinogenesis. Facilities are available at RARAF for exposing objects to different radiations having a wide range of linear energy transfers (LETs).

  13. Implications of physical symmetries in adaptive image classifiers

    DEFF Research Database (Denmark)

    Sams, Thomas; Hansen, Jonas Lundbek

    2000-01-01

    It is demonstrated that rotational invariance and reflection symmetry of image classifiers lead to a reduction in the number of free parameters in the classifier. When used in adaptive detectors, e.g. neural networks, this may be used to decrease the number of training samples necessary to learn a given classification task, or to improve generalization of the neural network. Notably, the symmetrization of the detector does not compromise the ability to distinguish objects that break the symmetry. (C) 2000 Elsevier Science Ltd. All rights reserved.

  14. Silicon nanowire arrays as learning chemical vapour classifiers

    International Nuclear Information System (INIS)

    Niskanen, A O; Colli, A; White, R; Li, H W; Spigone, E; Kivioja, J M

    2011-01-01

    Nanowire field-effect transistors are a promising class of devices for various sensing applications. Apart from detecting individual chemical or biological analytes, it is especially interesting to use multiple selective sensors to look at their collective response in order to perform classification into predetermined categories. We show that non-functionalised silicon nanowire arrays can be used to robustly classify different chemical vapours using simple statistical machine learning methods. We were able to distinguish between acetone, ethanol and water with 100% accuracy while methanol, ethanol and 2-propanol were classified with 96% accuracy in ambient conditions.

  15. Fabrication techniques to eliminate postweld heat treatment

    International Nuclear Information System (INIS)

    Lochhead, J.C.

    1978-01-01

    Postweld heat treatments to reduce residual stresses (stress relief operations) have been a common practice in the pressure vessel industry for a large number of years. A suitable heat treatment operation can, in particular for low alloy steels, have additional beneficial effects, i.e. a reduction in peak hardness values in the heat-affected zone, an improvement in weld metal properties, and a lowering of the adverse effects of the welding process on the mechanical properties of the parent material adjacent to the weld metal. However, continuing studies in the field of brittle fracture, improved parent materials, and more sophisticated nondestructive testing techniques have led to the elimination of such a practice in ever-increasing thickness ranges and types of material. For instance, the recently issued BS 5500 compared with BS 1113 (1969) lifts the thickness limit requiring stress relief in certain circumstances from 19 to 35mm for C steels. With respect to materials the CEGB has stated that as a result of successful operational experience it will no longer be necessary to postweld heat treat butt welds in 2 1/4 Cr-1Mo tubes of certain dimensions. Despite this trend, over a period of years a number of instances have arisen where, because of some factor, postweld heat treatment, although perhaps desirable, is not possible. This Paper describes several such examples. It must be noted that the examples quoted consist of relatively important and major items. It has been necessary within the confines of this Paper to condense the reports. It is hoped that no significant factors have been omitted. (author)

  16. Eliminating LH2 in LOX-collect space launchers - Key to on-demand capability

    Science.gov (United States)

    Leingang, J. L.; Carreiro, L. R.; Maurice, L. Q.

    1993-01-01

    Two air-breathing reusable two-stage space launch vehicle concepts are proposed, in which the first stage employs turboramjet propulsion and the second stage uses rockets; these are expected to provide very rapid response launch of 10,000 lb polar-orbit payloads. In both concepts, liquid oxygen (LOX) for the second stage is collected during first stage ascent, thus eliminating the need for LOX ground servicing facilities. In the first concept, liquid hydrogen in the amount just sufficient to condense and collect the second stage LOX is the only cryogenic fluid loaded on the vehicle at takeoff. The second concept uses the heat sink of conventional jet propulsion fuel and water coolant to drive a lightweight adaptation of the commercial LOX production process, eliminating all cryogenics at takeoff. Both concepts should permit true launch-on-demand capability with aircraftlike ground operations.

  17. Reactor facility

    International Nuclear Information System (INIS)

    Suzuki, Hiroaki; Murase, Michio; Yokomizo, Osamu.

    1997-01-01

    The present invention provides a BWR type reactor facility capable of suppressing the amount of steam generated by the interaction of a failed reactor core and coolant upon occurrence of a hypothetical accident, without requiring special countermeasures for enhancing the pressure resistance of the container vessel. Namely, a means for supplying cooling water at a temperature no more than 30 deg C below the saturation temperature corresponding to the inner pressure of the container vessel upon occurrence of an accident is disposed in the lower dry well below the pressure vessel. As a result, in an accident in which the reactor core melts and flows down below the pressure vessel, when cooling water at a temperature not lower than the saturation temperature, for example cooling water at 100 deg C or higher, is supplied to the lower dry well, abrupt generation of steam by the interaction of the failed reactor core and cooling water is scarcely caused, compared with the case of supplying cooling water at a temperature lower than the saturation temperature by 30 deg C or more. Accordingly, the amount of steam generated can be suppressed, and no special countermeasure for enhancing the pressure resistance of the container vessel is needed. (I.S.)

  18. Nuclear facilities

    International Nuclear Information System (INIS)

    Anon.

    2002-01-01

    During September and October 2001, 15 events were recorded on the first grade and 1 on the second grade of the INES scale. The second grade event is in fact a re-classification of an incident that occurred on the second april 2001 at Dampierre power plant. This event happened during core refueling, a shift in the operation sequence led to the wrong positioning of 113 assemblies. A preliminary study of this event shows that this wrong positioning could have led, in other circumstances, to the ignition of nuclear reactions. Even in that case, the analysis made by EDF shows that the consequences on the staff would have been limited. Nevertheless a further study has shown that the existing measuring instruments could not have detected the power increase announcing the beginning of the chain reaction. The investigation has shown that there were deficiencies in the control of the successive operations involved in refueling. EDF has proposed a series of corrective measures to be implemented in all nuclear power plants. The other 15 events are described in the article. During this period 121 inspections have been made in nuclear facilities. (A.C.)

  19. 18 CFR 367.18 - Criteria for classifying leases.

    Science.gov (United States)

    2010-04-01

    ... the lessee) must not give rise to a new classification of a lease for accounting purposes. ... classifying the lease. (4) The present value at the beginning of the lease term of the minimum lease payments... taxes to be paid by the lessor, including any related profit, equals or exceeds 90 percent of the excess...

  20. Discrimination-Aware Classifiers for Student Performance Prediction

    Science.gov (United States)

    Luo, Ling; Koprinska, Irena; Liu, Wei

    2015-01-01

    In this paper we consider discrimination-aware classification of educational data. Mining and using rules that distinguish groups of students based on sensitive attributes such as gender and nationality may lead to discrimination. It is desirable to keep the sensitive attributes during the training of a classifier to avoid information loss but…

  1. 29 CFR 1910.307 - Hazardous (classified) locations.

    Science.gov (United States)

    2010-07-01

    ... equipment at the location. (c) Electrical installations. Equipment, wiring methods, and installations of... covers the requirements for electric equipment and wiring in locations that are classified depending on... provisions of this section. (4) Division and zone classification. In Class I locations, an installation must...

  2. 29 CFR 1926.407 - Hazardous (classified) locations.

    Science.gov (United States)

    2010-07-01

    ...) locations, unless modified by provisions of this section. (b) Electrical installations. Equipment, wiring..., DEPARTMENT OF LABOR (CONTINUED) SAFETY AND HEALTH REGULATIONS FOR CONSTRUCTION Electrical Installation Safety... electric equipment and wiring in locations which are classified depending on the properties of the...

  3. 18 CFR 3a.71 - Accountability for classified material.

    Science.gov (United States)

    2010-04-01

    ... numbers assigned to top secret material will be separate from the sequence for other classified material... central control registry in calendar year 1969. TS 1006—Sixth Top Secret document controlled by the... control registry when the document is transferred. (e) For Top Secret documents only, an access register...

  4. Classifier fusion for VoIP attacks classification

    Science.gov (United States)

    Safarik, Jakub; Rezac, Filip

    2017-05-01

    SIP is one of the most successful protocols in the field of IP telephony communication. It establishes and manages VoIP calls. As the number of SIP implementation rises, we can expect a higher number of attacks on the communication system in the near future. This work aims at malicious SIP traffic classification. A number of various machine learning algorithms have been developed for attack classification. The paper presents a comparison of current research and the use of classifier fusion method leading to a potential decrease in classification error rate. Use of classifier combination makes a more robust solution without difficulties that may affect single algorithms. Different voting schemes, combination rules, and classifiers are discussed to improve the overall performance. All classifiers have been trained on real malicious traffic. The concept of traffic monitoring depends on the network of honeypot nodes. These honeypots run in several networks spread in different locations. Separation of honeypots allows us to gain an independent and trustworthy attack information.

  5. Bayesian Classifier for Medical Data from Doppler Unit

    Directory of Open Access Journals (Sweden)

    J. Málek

    2006-01-01

    Full Text Available Nowadays, hand-held ultrasonic Doppler units (probes) are often used for noninvasive screening of atherosclerosis in the arteries of the lower limbs. The mean velocity of blood flow over time and blood pressures are measured at several positions on each lower limb. By listening to the acoustic signal generated by the device or by reading the signal displayed on screen, a specialist can detect peripheral arterial disease (PAD). This project aims to design software able to analyze data from such a device and classify it into several diagnostic classes. At the Department of Functional Diagnostics at the Regional Hospital in Liberec, a database of several hundred signals was collected. In cooperation with the specialist, the signals were manually classified into four classes. For each class, selected signal features were extracted and then used for training a Bayesian classifier. Another set of signals was used for evaluating and optimizing the parameters of the classifier. Slightly above 84% of diagnostic states were successfully recognized on the test data.
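
    A simple Gaussian naive Bayes classifier over features extracted from Doppler velocity recordings could be sketched as below; the feature set and the four diagnostic class labels are assumptions, since the record does not give the exact parametrisation.

    # Sketch: Gaussian naive Bayes over features of Doppler velocity curves.
    # Feature choices and class labels are illustrative assumptions.
    import numpy as np
    from sklearn.naive_bayes import GaussianNB

    CLASSES = ["normal", "mild PAD", "moderate PAD", "severe PAD"]  # hypothetical

    def extract_features(velocity, pressure_index):
        """Summarise one recording: mean, peak and pulsatility of velocity + ABI."""
        return [np.mean(velocity), np.max(velocity),
                (np.max(velocity) - np.min(velocity)) / (np.mean(velocity) + 1e-9),
                pressure_index]

    rng = np.random.default_rng(6)
    X = [extract_features(rng.random(100) + c, 1.1 - 0.2 * c)
         for c in range(4) for _ in range(30)]
    y = [c for c in range(4) for _ in range(30)]

    model = GaussianNB().fit(X, y)
    new_recording = extract_features(rng.random(100) + 2, 0.7)
    print(CLASSES[model.predict([new_recording])[0]])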

  6. An Investigation to Improve Classifier Accuracy for Myo Collected Data

    Science.gov (United States)

    2017-02-01

    [Only the table of contents and figure captions are available for this record: sections cover the effect of bad samples on classification accuracy for the Naive Bayes (NB) classifier, the Logistic Model Tree (LMT) and the K-Nearest Neighbor classifier; appendix figures show the Come gesture, pitch feature, for users 06 and 14, with all samples exhibiting reversed movement.]

  7. Diagnosis of Broiler Livers by Classifying Image Patches

    DEFF Research Database (Denmark)

    Jørgensen, Anders; Fagertun, Jens; Moeslund, Thomas B.

    2017-01-01

    Manual health inspection is becoming the bottleneck at poultry processing plants. We present a computer vision method for automatic diagnosis of broiler livers. The non-rigid livers, of varying shapes and sizes, are classified in patches by a convolutional neural network, outputting maps...

  8. Support vector machines classifiers of physical activities in preschoolers

    Science.gov (United States)

    The goal of this study is to develop, test, and compare multinomial logistic regression (MLR) and support vector machines (SVM) in classifying preschool-aged children physical activity data acquired from an accelerometer. In this study, 69 children aged 3-5 years old were asked to participate in a s...

  9. A Linguistic Image of Nature: The Burmese Numerative Classifier System

    Science.gov (United States)

    Becker, Alton L.

    1975-01-01

    The Burmese classifier system is coherent because it is based upon a single elementary semantic dimension: deixis. On that dimension, four distances are distinguished, distances which metaphorically substitute for other conceptual relations between people and other living beings, people and things, and people and concepts. (Author/RM)

  10. Data Stream Classification Based on the Gamma Classifier

    Directory of Open Access Journals (Sweden)

    Abril Valeria Uriarte-Arcia

    2015-01-01

    Full Text Available The ever increasing generation of data confronts us with the problem of handling massive amounts of information online. One of the biggest challenges is how to extract valuable information from these massive continuous data streams during a single scan. In a data stream context, data arrive continuously at high speed; therefore the algorithms developed to address this context must be efficient regarding memory and time management and capable of detecting changes over time in the underlying distribution that generated the data. This work describes a novel method for the task of pattern classification over a continuous data stream based on an associative model. The proposed method is based on the Gamma classifier, which is inspired by the Alpha-Beta associative memories; both are supervised pattern recognition models. The proposed method is capable of handling the space and time constraints inherent to data stream scenarios. The Data Streaming Gamma classifier (DS-Gamma classifier) implements a sliding window approach to provide concept drift detection and a forgetting mechanism. In order to test the classifier, several experiments were performed using different data stream scenarios with real and synthetic data streams. The experimental results show that the method exhibits competitive performance when compared to other state-of-the-art algorithms.
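
    The sliding-window idea behind the DS-Gamma classifier, keeping only the most recent labelled instances so that older concepts are forgotten, can be sketched as follows; a nearest-neighbour rule stands in for the Gamma associative classifier itself, which is not reproduced here.

    # Sketch: sliding-window stream classification with implicit forgetting.
    # A 1-nearest-neighbour rule is a placeholder for the Gamma classifier.
    from collections import deque
    import numpy as np

    class SlidingWindowClassifier:
        def __init__(self, window_size=200):
            self.window = deque(maxlen=window_size)  # old samples fall out automatically

        def learn_one(self, x, y):
            self.window.append((np.asarray(x), y))

        def predict_one(self, x):
            x = np.asarray(x)
            # Nearest neighbour over the current window (placeholder decision rule).
            nearest = min(self.window, key=lambda item: np.linalg.norm(item[0] - x))
            return nearest[1]

    # Toy stream whose concept drifts halfway through.
    rng = np.random.default_rng(7)
    clf = SlidingWindowClassifier(window_size=100)
    for t in range(1000):
        x = rng.normal(size=2)
        y = int(x[0] > 0) if t < 500 else int(x[0] < 0)  # label rule flips (drift)
        if t > 0:
            clf.predict_one(x)   # would be scored in a prequential evaluation
        clf.learn_one(x, y)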

  11. Building an automated SOAP classifier for emergency department reports.

    Science.gov (United States)

    Mowery, Danielle; Wiebe, Janyce; Visweswaran, Shyam; Harkema, Henk; Chapman, Wendy W

    2012-02-01

    Information extraction applications that extract structured event and entity information from unstructured text can leverage knowledge of clinical report structure to improve performance. The Subjective, Objective, Assessment, Plan (SOAP) framework, used to structure progress notes to facilitate problem-specific, clinical decision making by physicians, is one example of a well-known, canonical structure in the medical domain. Although its applicability to structuring data is understood, its contribution to information extraction tasks has not yet been determined. The first step to evaluating the SOAP framework's usefulness for clinical information extraction is to apply the model to clinical narratives and develop an automated SOAP classifier that classifies sentences from clinical reports. In this quantitative study, we applied the SOAP framework to sentences from emergency department reports, and trained and evaluated SOAP classifiers built with various linguistic features. We found the SOAP framework can be applied manually to emergency department reports with high agreement (Cohen's kappa coefficients over 0.70). Using a variety of features, we found classifiers for each SOAP class can be created with moderate to outstanding performance with F(1) scores of 93.9 (subjective), 94.5 (objective), 75.7 (assessment), and 77.0 (plan). We look forward to expanding the framework and applying the SOAP classification to clinical information extraction tasks. Copyright © 2011. Published by Elsevier Inc.

  12. Learning to classify wakes from local sensory information

    Science.gov (United States)

    Alsalman, Mohamad; Colvert, Brendan; Kanso, Eva; Kanso Team

    2017-11-01

    Aquatic organisms exhibit remarkable abilities to sense local flow signals contained in their fluid environment and to surmise the origins of these flows. For example, fish can discern the information contained in various flow structures and utilize this information for obstacle avoidance and prey tracking. Flow structures created by flapping and swimming bodies are well characterized in the fluid dynamics literature; however, such characterization relies on classical methods that use an external observer to reconstruct global flow fields. The reconstructed flows, or wakes, are then classified according to the unsteady vortex patterns. Here, we propose a new approach for wake identification: we classify the wakes resulting from a flapping airfoil by applying machine learning algorithms to local flow information. In particular, we simulate the wakes of an oscillating airfoil in an incoming flow, extract the downstream vorticity information, and train a classifier to learn the different flow structures and classify new ones. This data-driven approach provides a promising framework for underwater navigation and detection in application to autonomous bio-inspired vehicles.

  13. The Closing of the Classified Catalog at Boston University

    Science.gov (United States)

    Hazen, Margaret Hindle

    1974-01-01

    Although the classified catalog at the Boston University libraries has been a useful research tool, it has proven too expensive to keep current. The library has converted to a traditional alphabetic subject catalog and will receive catalog cards from the Ohio College Library Center through the New England Library Network. (Author/LS)

  14. Recognition of Arabic Sign Language Alphabet Using Polynomial Classifiers

    Directory of Open Access Journals (Sweden)

    M. Al-Rousan

    2005-08-01

    Full Text Available Building an accurate automatic sign language recognition system is of great importance in facilitating efficient communication with deaf people. In this paper, we propose the use of polynomial classifiers as a classification engine for the recognition of the Arabic sign language (ArSL) alphabet. Polynomial classifiers have several advantages over other classifiers in that they do not require iterative training, and that they are highly computationally scalable with the number of classes. Based on polynomial classifiers, we have built an ArSL system and measured its performance using real ArSL data collected from deaf people. We show that the proposed system provides superior recognition results when compared with previously published results using ANFIS-based classification on the same dataset and feature extraction methodology. The comparison is shown in terms of the number of misclassified test patterns. The reduction in the rate of misclassified patterns was very significant. In particular, we have achieved a 36% reduction of misclassifications on the training data and 57% on the test data.
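
    A polynomial classifier in the sense used above, a polynomial expansion of the features followed by linear models fitted in closed form so that no iterative training is needed, might look like the sketch below; the data are synthetic placeholders rather than ArSL measurements.

    # Sketch: polynomial feature expansion + closed-form least-squares class weights,
    # so training requires no iteration. Data are synthetic placeholders.
    import numpy as np
    from sklearn.preprocessing import PolynomialFeatures

    def fit_polynomial_classifier(X, y, degree=2):
        expander = PolynomialFeatures(degree)
        Phi = expander.fit_transform(X)
        # One-hot targets; a single least-squares solve gives all class weights.
        T = np.eye(y.max() + 1)[y]
        W, *_ = np.linalg.lstsq(Phi, T, rcond=None)
        return expander, W

    def predict(expander, W, X):
        return np.argmax(expander.transform(X) @ W, axis=1)

    rng = np.random.default_rng(8)
    X = rng.normal(size=(150, 4))                # e.g. hypothetical gesture features
    y = (X[:, 0] * X[:, 1] > 0).astype(int)      # a nonlinearly separable toy target
    expander, W = fit_polynomial_classifier(X, y)
    print((predict(expander, W, X) == y).mean())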

  15. Reconfigurable support vector machine classifier with approximate computing

    NARCIS (Netherlands)

    van Leussen, M.J.; Huisken, J.; Wang, L.; Jiao, H.; De Gyvez, J.P.

    2017-01-01

    Support Vector Machine (SVM) is one of the most popular machine learning algorithms. An energy-efficient SVM classifier is proposed in this paper, where approximate computing is utilized to reduce energy consumption and silicon area. A hardware architecture with reconfigurable kernels and

  16. Classifying regularized sensor covariance matrices: An alternative to CSP

    NARCIS (Netherlands)

    Roijendijk, L.M.M.; Gielen, C.C.A.M.; Farquhar, J.D.R.

    2016-01-01

    Common spatial patterns ( CSP) is a commonly used technique for classifying imagined movement type brain-computer interface ( BCI) datasets. It has been very successful with many extensions and improvements on the basic technique. However, a drawback of CSP is that the signal processing pipeline

  17. Classifying regularised sensor covariance matrices: An alternative to CSP

    NARCIS (Netherlands)

    Roijendijk, L.M.M.; Gielen, C.C.A.M.; Farquhar, J.D.R.

    2016-01-01

    Common spatial patterns (CSP) is a commonly used technique for classifying imagined movement type brain computer interface (BCI) datasets. It has been very successful with many extensions and improvements on the basic technique. However, a drawback of CSP is that the signal processing pipeline

  18. Two-categorical bundles and their classifying spaces

    DEFF Research Database (Denmark)

    Baas, Nils A.; Bökstedt, M.; Kro, T.A.

    2012-01-01

    -category is a classifying space for the associated principal 2-bundles. In the process of proving this we develop a lot of powerful machinery which may be useful in further studies of 2-categorical topology. As a corollary we get a new proof of the classification of principal bundles. A calculation based...

  19. 3 CFR - Classified Information and Controlled Unclassified Information

    Science.gov (United States)

    2010-01-01

    ... on Transparency and Open Government and on the Freedom of Information Act, my Administration is... memoranda of January 21, 2009, on Transparency and Open Government and on the Freedom of Information Act; (B...

  20. Comparison of Classifier Architectures for Online Neural Spike Sorting.

    Science.gov (United States)

    Saeed, Maryam; Khan, Amir Ali; Kamboh, Awais Mehmood

    2017-04-01

    High-density, intracranial recordings from micro-electrode arrays need to undergo Spike Sorting in order to associate the recorded neuronal spikes to particular neurons. This involves spike detection, feature extraction, and classification. To reduce the data transmission and power requirements, on-chip real-time processing is becoming very popular. However, high computational resources are required for classifiers in on-chip spike-sorters, making scalability a great challenge. In this review paper, we analyze several popular classifiers to propose five new hardware architectures using the off-chip training with on-chip classification approach. These include support vector classification, fuzzy C-means classification, self-organizing maps classification, moving-centroid K-means classification, and Cosine distance classification. The performance of these architectures is analyzed in terms of accuracy and resource requirement. We establish that the neural networks based Self-Organizing Maps classifier offers the most viable solution. A spike sorter based on the Self-Organizing Maps classifier, requires only 7.83% of computational resources of the best-reported spike sorter, hierarchical adaptive means, while offering a 3% better accuracy at 7 dB SNR.

  1. A Gene Expression Classifier of Node-Positive Colorectal Cancer

    Directory of Open Access Journals (Sweden)

    Paul F. Meeh

    2009-10-01

    Full Text Available We used digital long serial analysis of gene expression to discover gene expression differences between node-negative and node-positive colorectal tumors and developed a multigene classifier able to discriminate between these two tumor types. We prepared and sequenced long serial analysis of gene expression libraries from one node-negative and one node-positive colorectal tumor, sequenced to a depth of 26,060 unique tags, and identified 262 tags significantly differentially expressed between these two tumors (P < 2 x 10^-6). We confirmed the tag-to-gene assignments and differential expression of 31 genes by quantitative real-time polymerase chain reaction, 12 of which were elevated in the node-positive tumor. We analyzed the expression levels of these 12 upregulated genes in a validation panel of 23 additional tumors and developed an optimized seven-gene logistic regression classifier. The classifier discriminated between node-negative and node-positive tumors with 86% sensitivity and 80% specificity. Receiver operating characteristic analysis of the classifier revealed an area under the curve of 0.86. Experimental manipulation of the function of one classification gene, Fibronectin, caused profound effects on invasion and migration of colorectal cancer cells in vitro. These results suggest that the development of node-positive colorectal cancer occurs in part through elevated epithelial FN1 expression and suggest novel strategies for the diagnosis and treatment of advanced disease.

  2. Cascaded lexicalised classifiers for second-person reference resolution

    NARCIS (Netherlands)

    Purver, M.; Fernández, R.; Frampton, M.; Peters, S.; Healey, P.; Pieraccini, R.; Byron, D.; Young, S.; Purver, M.

    2009-01-01

    This paper examines the resolution of the second person English pronoun you in multi-party dialogue. Following previous work, we attempt to classify instances as generic or referential, and in the latter case identify the singular or plural addressee. We show that accuracy and robustness can be

  3. Human Activity Recognition by Combining a Small Number of Classifiers.

    Science.gov (United States)

    Nazabal, Alfredo; Garcia-Moreno, Pablo; Artes-Rodriguez, Antonio; Ghahramani, Zoubin

    2016-09-01

    We consider the problem of daily human activity recognition (HAR) using multiple wireless inertial sensors, and specifically, HAR systems with a very low number of sensors, each one providing an estimation of the performed activities. We propose new Bayesian models to combine the output of the sensors. The models are based on a soft combination of the outputs of the individual classifiers to deal with the small number of sensors. We also incorporate the dynamic nature of human activities as a first-order homogeneous Markov chain. We develop both inductive and transductive inference methods for each model, to be employed in supervised and semisupervised situations, respectively. Using different real HAR databases, we compare our classifier combination models against a single classifier that employs all the signals from the sensors. Our models exhibit consistently a reduction of the error rate and an increase of robustness against sensor failures. Our models also outperform other classifier combination models that do not consider soft outputs and a Markovian structure of the human activities.

  4. Evaluation of three classifiers in mapping forest stand types using ...

    African Journals Online (AJOL)

    applied for classification of the image. Supervised classification technique using maximum likelihood algorithm is the most commonly and widely used method for land cover classification (Jia and Richards, 2006). In Australia, the maximum likelihood classifier was effectively used to map different forest stand types with high.

  5. Classifying patients' complaints for regulatory purposes : A Pilot Study

    NARCIS (Netherlands)

    Bouwman, R.J.R.; Bomhoff, Manja; Robben, Paul; Friele, R.D.

    2018-01-01

    Objectives: It is assumed that classifying and aggregated reporting of patients' complaints by regulators helps to identify problem areas, to respond better to patients and increase public accountability. This pilot study addresses what a classification of complaints in a regulatory setting

  6. Localizing genes to cerebellar layers by classifying ISH images.

    Directory of Open Access Journals (Sweden)

    Lior Kirsch

    Full Text Available Gene expression controls how the brain develops and functions. Understanding control processes in the brain is particularly hard since they involve numerous types of neurons and glia, and very little is known about which genes are expressed in which cells and brain layers. Here we describe an approach to detect genes whose expression is primarily localized to a specific brain layer and apply it to the mouse cerebellum. We learn typical spatial patterns of expression from a few markers that are known to be localized to specific layers, and use these patterns to predict localization for new genes. We analyze images of in-situ hybridization (ISH) experiments, which we represent using histograms of local binary patterns (LBP), and train image classifiers and gene classifiers for four layers of the cerebellum: the Purkinje, granular, molecular and white matter layer. On held-out data, the layer classifiers achieve accuracy above 94% (AUC) by representing each image at multiple scales and by combining multiple image scores into a single gene-level decision. When applied to the full mouse genome, the classifiers predict specific layer localization for hundreds of new genes in the Purkinje and granular layers. Many genes localized to the Purkinje layer are likely to be expressed in astrocytes, and many others are involved in lipid metabolism, possibly due to the unusual size of Purkinje cells.
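
    A minimal sketch of the LBP-histogram representation mentioned above is given below, using scikit-image; the radius, number of points, and bin settings are illustrative and not those of the study.

    ```python
    # Sketch: represent one image by a histogram of uniform local binary patterns (LBP).
    # Such fixed-length histograms (possibly at several scales) can feed any classifier.
    import numpy as np
    from skimage.feature import local_binary_pattern

    rng = np.random.default_rng(1)
    image = rng.random((128, 128))        # placeholder for one ISH image

    n_points, radius = 24, 3
    lbp = local_binary_pattern(image, n_points, radius, method="uniform")
    hist, _ = np.histogram(lbp, bins=np.arange(n_points + 3), density=True)

    print(hist.shape)                     # one feature vector per image (and per scale)
    ```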

  7. An ensemble self-training protein interaction article classifier.

    Science.gov (United States)

    Chen, Yifei; Hou, Ping; Manderick, Bernard

    2014-01-01

    Protein-protein interaction (PPI) is essential to understand the fundamental processes governing cell biology. The mining and curation of PPI knowledge are critical for analyzing proteomics data. Hence it is desirable to classify articles automatically as PPI-related or not. In order to build interaction article classification systems, an annotated corpus is needed. However, it is usually the case that only a small number of labeled articles can be obtained manually, while a large number of unlabeled articles are available. By combining ensemble learning and semi-supervised self-training, an ensemble self-training interaction classifier called EST_IACer is designed to classify PPI-related articles based on a small number of labeled articles and a large number of unlabeled articles. A biological-background-based feature weighting strategy is extended using the category information from both labeled and unlabeled data. Moreover, a heuristic constraint is put forward to select optimal instances from the unlabeled data to improve the performance further. Experimental results show that EST_IACer can classify PPI-related articles effectively and efficiently.
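
    A plain self-training loop in the spirit of the description above is sketched below; it uses a single base classifier and a confidence threshold, and does not reproduce the ensemble, the biological feature weighting, or the heuristic instance selection of EST_IACer (all data are synthetic placeholders).

    ```python
    # Sketch: self-training with pseudo-labels for confidently classified unlabeled articles.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(2)
    X_lab = rng.normal(size=(40, 20))      # few labeled articles (toy feature vectors)
    y_lab = rng.integers(0, 2, size=40)    # 1 = PPI-related, 0 = not
    X_unlab = rng.normal(size=(400, 20))   # many unlabeled articles

    clf = LogisticRegression(max_iter=1000)
    for _ in range(5):                                 # a few self-training rounds
        if len(X_unlab) == 0:
            break
        clf.fit(X_lab, y_lab)
        proba = clf.predict_proba(X_unlab)
        confident = proba.max(axis=1) > 0.95           # select confident predictions
        if not confident.any():
            break
        X_lab = np.vstack([X_lab, X_unlab[confident]])
        y_lab = np.concatenate([y_lab, proba[confident].argmax(axis=1)])
        X_unlab = X_unlab[~confident]
    ```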

  8. Classifying Your Food as Acid, Low-Acid, or Acidified

    OpenAIRE

    Bacon, Karleigh

    2012-01-01

    As a food entrepreneur, you should be aware of how ingredients in your product make the food look, feel, and taste, as well as how the ingredients create environments for microorganisms like bacteria, yeast, and molds to survive and grow. This guide will help you classify your food as acid, low-acid, or acidified.

  9. Gene-expression Classifier in Papillary Thyroid Carcinoma

    DEFF Research Database (Denmark)

    Londero, Stefano Christian; Jespersen, Marie Louise; Krogdahl, Annelise

    2016-01-01

    BACKGROUND: No reliable biomarker for metastatic potential in the risk stratification of papillary thyroid carcinoma exists. We aimed to develop a gene-expression classifier for metastatic potential. MATERIALS AND METHODS: Genome-wide expression analyses were used. Development cohort: freshly...

  10. Abbreviations: Their Effects on Comprehension of Classified Advertisements.

    Science.gov (United States)

    Sokol, Kirstin R.

    Two experimental designs were used to test the hypothesis that abbreviations in classified advertisements decrease the reader's comprehension of such ads. In the first experimental design, 73 high school students read four ads (for employment, used cars, apartments for rent, and articles for sale) either with abbreviations or with all…

  11. Origin of malaria cases: a 7-year audit of global trends in indigenous and imported cases in relation to malaria elimination

    Directory of Open Access Journals (Sweden)

    Mar Velarde-Rodríguez

    2015-10-01

    Full Text Available Background: Countries in the different stages of pre-elimination, elimination, and prevention of reintroduction are required to report the number of indigenous and imported malaria cases to the World Health Organization (WHO). However, these data have not been systematically analysed at the global level. Objective: For the period 2007 to 2013, we aimed to report on (1) the proportion of countries providing data on the origin of malaria cases and (2) the origin of malaria cases in countries classified as being in the stages of pre-elimination, elimination and prevention of reintroduction. Design: An observational study using annual data reported through routine health information systems to the WHO Global Malaria Programme between 2007 and 2013. Results: For all countries classified as being in pre-elimination, elimination, and prevention of reintroduction in the year 2013, there has been a substantial decrease in the total number of indigenous malaria cases, from more than 15,000 cases reported in 2007 to less than 4,000 cases reported in 2013. However, the total number of imported malaria cases has increased over that time period, from 5,600 imported cases in 2007 to approximately 6,800 in 2013. Conclusions: Vigilant monitoring of the numbers of imported and indigenous malaria cases at national and global levels as well as appropriate strategies to target these cases will be critical to achieve malaria eradication.

  12. General considerations for neutron capture therapy at a reactor facility

    International Nuclear Information System (INIS)

    Binney, S.E.

    2001-01-01

    In addition to neutron beam intensity and quality, there are also a number of other significant criteria related to a nuclear reactor that contribute to a successful neutron capture therapy (NCT) facility. These criteria are classified into four main categories: Nuclear design factors, facility management and operations factors, facility resources, and non-technical factors. Important factors to consider are given for each of these categories. In addition to an adequate neutron beam intensity and quality, key requirements for a successful neutron capture therapy facility include necessary finances to construct or convert a facility for NCT, a capable medical staff to perform the NCT, and the administrative support for the facility. The absence of any one of these four factors seriously jeopardizes the overall probability of success of the facility. Thus nuclear reactor facility management considering becoming involved in neutron capture therapy, should it be proven clinically successful, should take all these factors into consideration. (author)

  13. Application of Facility Management in Brownfield Conversion

    Directory of Open Access Journals (Sweden)

    Wernerová Eva

    2016-12-01

    Full Text Available The subject of this paper covers two issues, namely the issue of brownfields and their conversion, and the issue of Facility Management, which offers the possibility of applying its principles and tools for extending the benefit of construction works as a means of actively caring for the property. This paper aims to link these two topics and to identify the possibility of applying Facility Management in the conversion process of revitalizing brownfields, so that subsequent commissioning eliminates the risk of future costly operation and of the revitalized building relapsing into the category of brownfields.

  14. Australian rubella serosurvey 2012-2013: On track for elimination?

    Science.gov (United States)

    Edirisuriya, Chathura; Beard, Frank H; Hendry, Alexandra J; Dey, Aditi; Gidding, Heather F; Hueston, Linda; Dwyer, Dominic E; Wood, James G; Macartney, Kristine K; McIntyre, Peter B

    2018-04-13

    The World Health Organization has targeted rubella virus for elimination regionally. Australia was one of the first countries to implement a nationally funded rubella immunisation program, in 1971, and conducts regular national rubella serosurveillance studies. We aimed to estimate the seroprevalence of rubella-specific IgG antibody in the Australian population by age and sex in 2012-2013, to compare the results with three previous serosurveys conducted in 1996-1999, 2002 and 2007, and to estimate the effective reproduction number (Rn). This study used 2729 serum and plasma specimens, randomly selected from a specimen bank collected in 2012-2013 across Australia. Age groups included in the sample ranged from 1 to 49 years. Sera were tested for rubella-specific IgG antibody using the Enzygnost anti-rubella IgG enzyme immunoassay and classified as positive, negative or equivocal according to rubella-specific IgG concentrations of >7 IU/ml, <3 IU/ml and 3-7 IU/ml, respectively. The overall proportions seropositive, seronegative and equivocal for rubella-specific IgG were 92.1% (95% CI, 91.0-93.2), 6.7% (95% CI, 5.7-7.7) and 1.2% (95% CI, 0.8-1.6), respectively. The proportion of males seropositive was significantly lower than that of females in the 30-34 (83.1% vs. 96.8%, p = 0.003), 35-39 (86.1% vs. 96.3%, p = 0.02) and 40-44 (86.1% vs. 95.7%, p = 0.03) year age groups. Rn for rubella in 2012-2013 was estimated to be 0.33 (95% CI 0.28-0.39). The 2012-2013 national serosurvey showed that levels of rubella-specific IgG seropositivity in the Australian population are relatively high, with no evidence of decrease compared to previous serosurveys conducted in 1996-1999, 2002 and 2007. The lower proportion of seropositive males aged 30-44 years likely reflects the initial immunisation program targeting females only. To our knowledge this study represents the longest period of serosurveillance following introduction of a nationally funded rubella immunisation program.

  15. Irradiation Facilities at CERN

    CERN Document Server

    Gkotse, Blerina; Carbonez, Pierre; Danzeca, Salvatore; Fabich, Adrian; Garcia, Alia, Ruben; Glaser, Maurice; Gorine, Georgi; Jaekel, Martin, Richard; Mateu,Suau, Isidre; Pezzullo, Giuseppe; Pozzi, Fabio; Ravotti, Federico; Silari, Marco; Tali, Maris

    2017-01-01

    CERN provides unique irradiation facilities for applications in many scientific fields. This paper summarizes the facilities currently operating for proton, gamma, mixed-field and electron irradiations, including their main usage, characteristics and information about their operation. The new CERN irradiation facilities database is also presented. This includes not only CERN facilities but also irradiation facilities available worldwide.

  16. Research Facilities | Wind | NREL

    Science.gov (United States)

    NREL's state-of-the-art wind research facilities include structural research facilities, where turbine blades are tested, and computer simulation capabilities.

  17. North Slope, Alaska ESI: FACILITY (Facility Points)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This data set contains data for oil field facilities for the North Slope of Alaska. Vector points in this data set represent oil field facility locations. This data...

  18. Deep Feature Learning and Cascaded Classifier for Large Scale Data

    DEFF Research Database (Denmark)

    Prasoon, Adhish

    This thesis focuses on voxel/pixel classification based approaches for image segmentation. The main application is segmentation of articular cartilage in knee MRIs. The first major contribution of the thesis deals with large scale machine learning problems. Many medical imaging problems need a huge amount of training data to cover sufficient biological variability. Learning methods scaling badly with the number of training data points cannot be used in such scenarios. This may restrict the usage of many powerful classifiers having excellent generalization ability. We propose a cascaded classifier which ... learns from data rather than having a predefined feature set. We explore a deep learning approach of convolutional neural networks (CNN) for segmenting three dimensional medical images. We propose a novel system integrating three 2D CNNs, which have a one-to-one association with the xy, yz and zx planes of 3D ...

  19. Scoring and Classifying Examinees Using Measurement Decision Theory

    Directory of Open Access Journals (Sweden)

    Lawrence M. Rudner

    2009-04-01

    Full Text Available This paper describes and evaluates the use of measurement decision theory (MDT) to classify examinees based on their item response patterns. The model has a simple framework that starts with the conditional probabilities of examinees in each category or mastery state responding correctly to each item. The presented evaluation investigates: (1) the classification accuracy of tests scored using decision theory; (2) the effectiveness of different sequential testing procedures; and (3) the number of items needed to make a classification. A large percentage of examinees can be classified accurately with very few items using decision theory. A Java Applet for self instruction and software for generating, calibrating and scoring MDT data are provided.
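
    To make the framework concrete, the sketch below applies the basic decision-theory update: starting from the conditional probability of a correct response in each mastery state, the posterior over states is accumulated item by item and the examinee is assigned to the most probable state (the item parameters and responses are illustrative, not taken from the paper).

    ```python
    # Sketch: measurement decision theory scoring with two mastery states and Bayes' rule.
    import numpy as np

    p_correct = np.array([        # P(correct | state) per item: [master, non-master]
        [0.9, 0.4],
        [0.8, 0.3],
        [0.7, 0.2],
    ])
    responses = [1, 0, 1]              # observed responses (1 = correct, 0 = incorrect)
    posterior = np.array([0.5, 0.5])   # uniform prior over the two states

    for p, r in zip(p_correct, responses):
        likelihood = p if r == 1 else 1.0 - p
        posterior = posterior * likelihood
        posterior = posterior / posterior.sum()   # renormalize after each item

    print(posterior, "-> classify as state", int(posterior.argmax()))
    ```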

  20. MAMMOGRAMS ANALYSIS USING SVM CLASSIFIER IN COMBINED TRANSFORMS DOMAIN

    Directory of Open Access Journals (Sweden)

    B.N. Prathibha

    2011-02-01

    Full Text Available Breast cancer is a primary cause of mortality and morbidity in women. Reports reveal that the earlier abnormalities are detected, the better the improvement in survival. Digital mammograms are one of the most effective means for detecting possible breast anomalies at early stages. Digital mammograms supported with Computer Aided Diagnostic (CAD) systems help radiologists in taking reliable decisions. The proposed CAD system extracts wavelet features and spectral features for the better classification of mammograms. The Support Vector Machines classifier is used to analyze 206 mammogram images from the MIAS database pertaining to the severity of abnormality, i.e., benign and malignant. The proposed system gives 93.14% accuracy for discrimination between normal and malignant samples, 87.25% accuracy for normal and benign samples, and 89.22% accuracy for benign and malignant samples. The study reveals that features extracted in the hybrid transform domain with an SVM classifier prove to be a promising tool for analysis of mammograms.
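
    The sketch below illustrates the general shape of such a pipeline: simple wavelet-subband energy features computed with PyWavelets feeding an SVM. The spectral features, preprocessing, and tuning of the actual CAD system are not reproduced, and the data are synthetic.

    ```python
    # Sketch: wavelet-energy features from image patches + SVM classification (toy data).
    import numpy as np
    import pywt
    from sklearn.svm import SVC

    def wavelet_energy_features(img, wavelet="db4", level=2):
        coeffs = pywt.wavedec2(img, wavelet, level=level)
        feats = [np.mean(np.abs(coeffs[0]))]              # approximation energy
        for cH, cV, cD in coeffs[1:]:                     # detail energies per level
            feats += [np.mean(np.abs(cH)), np.mean(np.abs(cV)), np.mean(np.abs(cD))]
        return np.array(feats)

    rng = np.random.default_rng(3)
    patches = rng.random((60, 64, 64))      # placeholder regions of interest
    labels = rng.integers(0, 2, size=60)    # 0 = benign, 1 = malignant (toy labels)

    X = np.array([wavelet_energy_features(p) for p in patches])
    clf = SVC(kernel="rbf", gamma="scale").fit(X, labels)
    print("training accuracy:", clf.score(X, labels))
    ```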

  1. Evaluation of LDA Ensembles Classifiers for Brain Computer Interface

    International Nuclear Information System (INIS)

    Arjona, Cristian; Pentácolo, José; Gareis, Iván; Atum, Yanina; Gentiletti, Gerardo; Acevedo, Rubén; Rufiner, Leonardo

    2011-01-01

    The Brain Computer Interface (BCI) translates brain activity into computer commands. To increase the performance of a BCI in decoding user intentions, it is necessary to improve the feature extraction and classification techniques. In this article, the performance of an ensemble of three linear discriminant analysis (LDA) classifiers is studied. A system based on an ensemble can theoretically achieve better classification results than its individual counterpart, depending on the algorithm used to generate the individual classifiers and the procedure for combining their outputs. Classic ensemble-based algorithms such as bagging and boosting are discussed here. For the application to BCI, it was concluded that the results generated using ER and AUC as performance indices do not give enough information to establish which configuration is better.

  2. Security Enrichment in Intrusion Detection System Using Classifier Ensemble

    Directory of Open Access Journals (Sweden)

    Uma R. Salunkhe

    2017-01-01

    Full Text Available In the era of the Internet, with an increasing number of people as its end users, a large number of attack categories are introduced daily. Hence, effective detection of various attacks with the help of Intrusion Detection Systems is an emerging trend in research these days. Existing studies show the effectiveness of machine learning approaches in handling Intrusion Detection Systems. In this work, we aim to enhance the detection rate of an Intrusion Detection System by using machine learning techniques. We propose a novel classifier-ensemble-based IDS that is constructed using a hybrid approach which combines data-level and feature-level approaches. Classifier ensembles combine the opinions of different experts and improve the intrusion detection rate. Experimental results show the improved detection rates of our system compared to the reference technique.

  3. The three-dimensional origin of the classifying algebra

    International Nuclear Information System (INIS)

    Fuchs, Juergen; Schweigert, Christoph; Stigner, Carl

    2010-01-01

    It is known that reflection coefficients for bulk fields of a rational conformal field theory in the presence of an elementary boundary condition can be obtained as representation matrices of irreducible representations of the classifying algebra, a semisimple commutative associative complex algebra. We show how this algebra arises naturally from the three-dimensional geometry of factorization of correlators of bulk fields on the disk. This allows us to derive explicit expressions for the structure constants of the classifying algebra as invariants of ribbon graphs in the three-manifold S²×S¹. Our result unravels a precise relation between intertwiners of the action of the mapping class group on spaces of conformal blocks and boundary conditions in rational conformal field theories.

  4. Machine learning classifiers and fMRI: a tutorial overview.

    Science.gov (United States)

    Pereira, Francisco; Mitchell, Tom; Botvinick, Matthew

    2009-03-01

    Interpreting brain image experiments requires analysis of complex, multivariate data. In recent years, one analysis approach that has grown in popularity is the use of machine learning algorithms to train classifiers to decode stimuli, mental states, behaviours and other variables of interest from fMRI data and thereby show the data contain information about them. In this tutorial overview we review some of the key choices faced in using this approach as well as how to derive statistically significant results, illustrating each point from a case study. Furthermore, we show how, in addition to answering the question of 'is there information about a variable of interest' (pattern discrimination), classifiers can be used to tackle other classes of question, namely 'where is the information' (pattern localization) and 'how is that information encoded' (pattern characterization).

  5. Lung Nodule Detection in CT Images using Neuro Fuzzy Classifier

    Directory of Open Access Journals (Sweden)

    M. Usman Akram

    2013-07-01

    Full Text Available Automated lung cancer detection using computer aided diagnosis (CAD) is an important area in clinical applications. As manual nodule detection is very time consuming and costly, computerized systems can be helpful for this purpose. In this paper, we propose a computerized system for lung nodule detection in CT scan images. The automated system consists of two stages, i.e. lung segmentation and enhancement, followed by feature extraction and classification. The segmentation process results in separating lung tissue from the rest of the image, and only the lung tissues under examination are considered as candidate regions for detecting malignant nodules in the lung portion. A feature vector for possible abnormal regions is calculated and the regions are classified using a neuro fuzzy classifier. It is a fully automatic system that does not require any manual intervention, and experimental results show the validity of our system.

  6. A Bayesian Classifier for X-Ray Pulsars Recognition

    Directory of Open Access Journals (Sweden)

    Hao Liang

    2016-01-01

    Full Text Available Recognition of X-ray pulsars is important for the problem of spacecraft attitude determination by X-ray Pulsar Navigation (XPNAV). By using the nonhomogeneous Poisson model of the received photons and the minimum recognition error criterion, a classifier based on Bayes' theorem is proposed. For X-ray pulsar recognition with unknown Doppler frequency and initial phase, the features of every X-ray pulsar are extracted and the unknown parameters are estimated using the Maximum Likelihood (ML) method. Besides that, a method to recognize unknown X-ray pulsars or X-ray disturbances is proposed. Simulation results confirm the validity of the proposed Bayesian classifier.
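
    Under a nonhomogeneous Poisson model, the minimum-error Bayes rule sketched above can be stated generically: given photon arrival times t_1, ..., t_N observed over [0, T] and candidate pulsar k with rate function \lambda_k(t) and prior P(k), choose

    \[ \hat{k} = \arg\max_k \; P(k)\, \exp\!\Big(-\int_0^T \lambda_k(t)\,dt\Big) \prod_{i=1}^{N} \lambda_k(t_i), \]

    with the unknown Doppler frequency and initial phase entering through \lambda_k and replaced by their maximum-likelihood estimates. This is the standard form of the decision rule for Poisson data, written here as an illustration rather than the paper's exact formulation.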

  7. Conformal anomaly and elimination of infrared divergences in curved spacetime

    International Nuclear Information System (INIS)

    Grib, A.A.; Nesteruk, A.V.; Pritomanov, S.A.

    1984-01-01

    The relation between the problem of eliminating the infrared divergences and the conformal anomaly of the regularized energy-momentum tensor is studied in homogeneous isotropic and anisotropic spacetime. It is shown that elimination of the infrared divergence by means of a cutoff or the introduction of a conformally invariant mass of the field leads to the absence of the conformal anomaly

  8. The Role of Quantifier Alternations in Cut Elimination

    DEFF Research Database (Denmark)

    Gerhardy, Philipp

    2005-01-01

    Extending previous results on the complexity of cut elimination for the sequent calculus LK, we discuss the role of quantifier alternations and develop a measure to describe the complexity of cut elimination in terms of quantifier alternations in cut formulas and contractions on such formulas...

  9. Rabies Elimination in Dogs in the United States

    Centers for Disease Control (CDC) Podcasts

    Rabies has been eliminated from dogs in the United States through efforts to promote annual vaccination, but it's still a problem in wildlife in the U.S. and in wild and domesticated animals abroad. In this podcast, CDC's Dr. Charles Rupprecht discusses a study which provides proof of the elimination of rabies in dogs and what this means for the average American.

  10. Malaria elimination practices in rural community residents in ...

    African Journals Online (AJOL)

    Rwanda Journal Series F: Medicine and Health Sciences, Vol. 2, No. 1, 2015. Malaria elimination practices in rural community residents in Rwanda: a cross-sectional study. ... is an entirely preventable and treatable disease, provided that effective ... The most ... way used for malaria prevention, control and elimination ...

  11. Making Career Decisions--A Sequential Elimination Approach.

    Science.gov (United States)

    Gati, Itamar

    1986-01-01

    Presents a model for career decision making based on the sequential elimination of occupational alternatives, an adaptation for career decisions of Tversky's (1972) elimination-by-aspects theory of choice. The expected utility approach is reviewed as a representative compensatory model for career decisions. Advantages, disadvantages, and…

  12. 75 FR 65238 - Loan Guaranty: Elimination of Redundant Regulations; Correction

    Science.gov (United States)

    2010-10-22

    ... DEPARTMENT OF VETERANS AFFAIRS 38 CFR Part 36 RIN 2900-AN71 Loan Guaranty: Elimination of... June 15, 2010 (75 FR 33704), amending its loan guaranty regulations to eliminate redundant regulations... INFORMATION CONTACT: William White, Acting Assistant Director for Loan Processing and Valuation (262...

  13. Wavelet classifier used for diagnosing shock absorbers in cars

    Directory of Open Access Journals (Sweden)

    Janusz GARDULSKI

    2007-01-01

    Full Text Available The paper discusses some commonly used methods of hydraulic shock absorber testing. Disadvantages of the methods are described. A vibro-acoustic method is presented and recommended for practical use on existing test rigs. The method is based on continuous wavelet analysis combined with a neural classifier: a 25-neuron, one-way, three-layer back-propagation network. The analysis satisfies the intended aim.

  14. Classified installations for environmental protection subject to declaration. Tome 2

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    Legislation concerning classified installations governs most dangerous or pollutant industries and activities. This legislation aims to prevent risks and harmful effects coming from an installation: air pollution, water pollution, noise, wastes produced by installations, and even aesthetic harm. Pollutant or dangerous activities are defined in a list called the nomenclature, which subjects installations to a declaration or authorization procedure. Technical regulations issued by the Secretary of State for the Environment are listed in Tome 2.

  15. Classified study and clinical value of the phase imaging features

    International Nuclear Information System (INIS)

    Dang Yaping; Ma Aiqun; Zheng Xiaopu; Yang Aimin; Xiao Jiang; Gao Xinyao

    2000-01-01

    445 patients with various heart diseases were examined by gated cardiac blood pool imaging, and the phase images were classified. The relationship of the seven types with left ventricular function indices, clinical heart function, different heart diseases, as well as the electrocardiogram was studied. The results showed that the phase image classification matched clinical heart function. It can visually, directly and accurately indicate clinical heart function and can be used to aid the diagnosis of heart disease.

  16. Evaluating Classifiers in Detecting 419 Scams in Bilingual Cybercriminal Communities

    OpenAIRE

    Mbaziira, Alex V.; Abozinadah, Ehab; Jones Jr, James H.

    2015-01-01

    Incidents of organized cybercrime are rising because criminals are reaping high financial rewards while incurring low costs to commit crime. As the digital landscape broadens to accommodate more internet-enabled devices and technologies like social media, more cybercriminals who are not native English speakers are invading cyberspace to cash in on quick exploits. In this paper we evaluate the performance of three machine learning classifiers in detecting 419 scams in a bilingual Nigerian c...

  17. Classifying Radio Galaxies with the Convolutional Neural Network

    International Nuclear Information System (INIS)

    Aniyan, A. K.; Thorat, K.

    2017-01-01

    We present the application of a deep machine learning technique to classify radio images of extended sources on a morphological basis using convolutional neural networks (CNN). In this study, we have taken the case of the Fanaroff–Riley (FR) class of radio galaxies as well as radio galaxies with bent-tailed morphology. We have used archival data from the Very Large Array (VLA)—Faint Images of the Radio Sky at Twenty Centimeters survey and existing visually classified samples available in the literature to train a neural network for morphological classification of these categories of radio sources. Our training sample size for each of these categories is ∼200 sources, which has been augmented by rotated versions of the same. Our study shows that CNNs can classify images of the FRI and FRII and bent-tailed radio galaxies with high accuracy (maximum precision at 95%) using well-defined samples and a “fusion classifier,” which combines the results of binary classifications, while allowing for a mechanism to find sources with unusual morphologies. The individual precision is highest for bent-tailed radio galaxies at 95% and is 91% and 75% for the FRI and FRII classes, respectively, whereas the recall is highest for FRI and FRIIs at 91% each, while the bent-tailed class has a recall of 79%. These results show that our results are comparable to that of manual classification, while being much faster. Finally, we discuss the computational and data-related challenges associated with the morphological classification of radio galaxies with CNNs.
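
    As a rough sketch of the ingredients named above (a small CNN with rotation augmentation for a binary morphological decision), the code below defines a minimal Keras model; the architecture, input size, and settings are illustrative, not the network used in the paper, and a "fusion classifier" as described would combine several such binary models.

    ```python
    # Sketch: a small CNN for a binary radio-galaxy morphology decision (e.g. FRI vs FRII).
    # Illustrative architecture only; not the network or training setup of the paper.
    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(128, 128, 1)),
        tf.keras.layers.RandomRotation(0.5),              # augment with rotated versions
        tf.keras.layers.Conv2D(16, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    model.summary()
    ```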

  18. Classifying Radio Galaxies with the Convolutional Neural Network

    Energy Technology Data Exchange (ETDEWEB)

    Aniyan, A. K.; Thorat, K. [Department of Physics and Electronics, Rhodes University, Grahamstown (South Africa)

    2017-06-01

    We present the application of a deep machine learning technique to classify radio images of extended sources on a morphological basis using convolutional neural networks (CNN). In this study, we have taken the case of the Fanaroff–Riley (FR) class of radio galaxies as well as radio galaxies with bent-tailed morphology. We have used archival data from the Very Large Array (VLA)—Faint Images of the Radio Sky at Twenty Centimeters survey and existing visually classified samples available in the literature to train a neural network for morphological classification of these categories of radio sources. Our training sample size for each of these categories is ∼200 sources, which has been augmented by rotated versions of the same. Our study shows that CNNs can classify images of the FRI and FRII and bent-tailed radio galaxies with high accuracy (maximum precision at 95%) using well-defined samples and a “fusion classifier,” which combines the results of binary classifications, while allowing for a mechanism to find sources with unusual morphologies. The individual precision is highest for bent-tailed radio galaxies at 95% and is 91% and 75% for the FRI and FRII classes, respectively, whereas the recall is highest for FRI and FRIIs at 91% each, while the bent-tailed class has a recall of 79%. These results show that our results are comparable to that of manual classification, while being much faster. Finally, we discuss the computational and data-related challenges associated with the morphological classification of radio galaxies with CNNs.

  19. Efficient Multi-Concept Visual Classifier Adaptation in Changing Environments

    Science.gov (United States)

    2016-09-01

    sets of images, hand annotated by humans with region boundary outlines followed by label assignment. This annotation is time consuming, and ... performed as a necessary but time-consuming step to train supervised classifiers. Unsupervised or self-supervised approaches have been used to ... time-consuming labeling process. However, the lack of human supervision has limited most of this work to binary classification (e.g., traversability

  20. Classifying apples by the means of fluorescence imaging

    OpenAIRE

    Codrea, Marius C.; Nevalainen, Olli S.; Tyystjärvi, Esa; VAN DE VEN, Martin; VALCKE, Roland

    2004-01-01

    Classification of harvested apples when predicting their storage potential is an important task. This paper describes how chlorophyll a fluorescence images, taken in blue light through a red filter, can be used to classify apples. In such an image, fluorescence appears as a relatively homogeneous area broken by a number of small nonfluorescing spots, corresponding to normal corky tissue patches, lenticels, and to damaged areas that lower the quality of the apple. The damaged regions appear mor...

  1. Building Road-Sign Classifiers Using a Trainable Similarity Measure

    Czech Academy of Sciences Publication Activity Database

    Paclík, P.; Novovičová, Jana; Duin, R.P.W.

    2006-01-01

    Roč. 7, č. 3 (2006), s. 309-321 ISSN 1524-9050 R&D Projects: GA AV ČR IAA2075302 EU Projects: European Commission(XE) 507752 - MUSCLE Institutional research plan: CEZ:AV0Z10750506 Keywords : classifier system design * road-sign classification * similarity data representation Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.434, year: 2006 http://www.ewh.ieee.org/tc/its/trans.html

  2. Classifying Radio Galaxies with the Convolutional Neural Network

    Science.gov (United States)

    Aniyan, A. K.; Thorat, K.

    2017-06-01

    We present the application of a deep machine learning technique to classify radio images of extended sources on a morphological basis using convolutional neural networks (CNN). In this study, we have taken the case of the Fanaroff-Riley (FR) class of radio galaxies as well as radio galaxies with bent-tailed morphology. We have used archival data from the Very Large Array (VLA)—Faint Images of the Radio Sky at Twenty Centimeters survey and existing visually classified samples available in the literature to train a neural network for morphological classification of these categories of radio sources. Our training sample size for each of these categories is ˜200 sources, which has been augmented by rotated versions of the same. Our study shows that CNNs can classify images of the FRI and FRII and bent-tailed radio galaxies with high accuracy (maximum precision at 95%) using well-defined samples and a “fusion classifier,” which combines the results of binary classifications, while allowing for a mechanism to find sources with unusual morphologies. The individual precision is highest for bent-tailed radio galaxies at 95% and is 91% and 75% for the FRI and FRII classes, respectively, whereas the recall is highest for FRI and FRIIs at 91% each, while the bent-tailed class has a recall of 79%. These results show that our results are comparable to that of manual classification, while being much faster. Finally, we discuss the computational and data-related challenges associated with the morphological classification of radio galaxies with CNNs.

  3. Classifying Floating Potential Measurement Unit Data Products as Science Data

    Science.gov (United States)

    Coffey, Victoria; Minow, Joseph

    2015-01-01

    We are Co-Investigators for the Floating Potential Measurement Unit (FPMU) on the International Space Station (ISS) and members of the FPMU operations and data analysis team. We are providing this memo for the purpose of classifying raw and processed FPMU data products and ancillary data as NASA science data with unrestricted, public availability in order to best support science uses of the data.

  4. Snoring classified: The Munich-Passau Snore Sound Corpus.

    Science.gov (United States)

    Janott, Christoph; Schmitt, Maximilian; Zhang, Yue; Qian, Kun; Pandit, Vedhas; Zhang, Zixing; Heiser, Clemens; Hohenhorst, Winfried; Herzog, Michael; Hemmert, Werner; Schuller, Björn

    2018-03-01

    Snoring can be excited in different locations within the upper airways during sleep. It was hypothesised that the excitation locations are correlated with distinct acoustic characteristics of the snoring noise. To verify this hypothesis, a database of snore sounds is developed, labelled with the location of sound excitation. Video and audio recordings taken during drug induced sleep endoscopy (DISE) examinations from three medical centres have been semi-automatically screened for snore events, which subsequently have been classified by ENT experts into four classes based on the VOTE classification. The resulting dataset containing 828 snore events from 219 subjects has been split into Train, Development, and Test sets. An SVM classifier has been trained using low level descriptors (LLDs) related to energy, spectral features, mel frequency cepstral coefficients (MFCC), formants, voicing, harmonic-to-noise ratio (HNR), spectral harmonicity, pitch, and microprosodic features. An unweighted average recall (UAR) of 55.8% could be achieved using the full set of LLDs including formants. Best performing subset is the MFCC-related set of LLDs. A strong difference in performance could be observed between the permutations of train, development, and test partition, which may be caused by the relatively low number of subjects included in the smaller classes of the strongly unbalanced data set. A database of snoring sounds is presented which are classified according to their sound excitation location based on objective criteria and verifiable video material. With the database, it could be demonstrated that machine classifiers can distinguish different excitation location of snoring sounds in the upper airway based on acoustic parameters. Copyright © 2018 Elsevier Ltd. All rights reserved.

  5. Young module multiplicities and classifying the indecomposable Young permutation modules

    OpenAIRE

    Gill, Christopher C.

    2012-01-01

    We study the multiplicities of Young modules as direct summands of permutation modules on cosets of Young subgroups. Such multiplicities have become known as the p-Kostka numbers. We classify the indecomposable Young permutation modules, and, applying the Brauer construction for p-permutation modules, we give some new reductions for p-Kostka numbers. In particular we prove that p-Kostka numbers are preserved under multiplying partitions by p, and strengthen a known reduction given by Henke, c...

  6. BIOPHARMACEUTICS CLASSIFICATION SYSTEM: A STRATEGIC TOOL FOR CLASSIFYING DRUG SUBSTANCES

    OpenAIRE

    Rohilla Seema; Rohilla Ankur; Marwaha RK; Nanda Arun

    2011-01-01

    The biopharmaceutical classification system (BCS) is a scientific approach for classifying drug substances based on their dose/solubility ratio and intestinal permeability. The BCS has been developed to allow prediction of in vivo pharmacokinetic performance of drug products from measurements of permeability and solubility. Moreover, the drugs can be categorized into four classes of BCS on the basis of permeability and solubility, namely: high permeability high solubility, high permeability lo...

  7. NPDES Permit for Soap Creek Associates Wastewater Treatment Facility in Montana

    Science.gov (United States)

    Under National Pollutant Discharge Elimination System permit number MT-0023183, Soap Creek Associates, Inc. is authorized to discharge from its wastewater treatment facility located in West, Bighorn County, Montana, to Soap Creek.

  8. NPDES Permit for Town of Lodge Grass Wastewater Treatment Facility in Montana

    Science.gov (United States)

    Under National Pollutant Discharge Elimination System permit number MT0021890, the Town of Lodge Grass is authorized to discharge from its wastewater treatment facility in Big Horn County to an unnamed slough to the Little Bighorn River.

  9. A Comprehensive Copper Compliance Strategy: Implementing Regulatory Guidance at Pearl Harbor Naval Shipyard & Intermediate Maintenance Facility

    National Research Council Canada - National Science Library

    Earley, P. J; Rosen, G; Rivera-Duarte, I; Gauthier, R. D; Arias-Thode, Y; Thompson, J; Swope, B

    2007-01-01

    Studies were performed to develop a new National Pollutant Discharge Elimination System (NPDES) permit for the discharge of effluents from the Pearl Harbor Naval Shipyard and Intermediate Maintenance Facility into Pearl Harbor...

  10. Self-organizing map classifier for stressed speech recognition

    Science.gov (United States)

    Partila, Pavol; Tovarek, Jaromir; Voznak, Miroslav

    2016-05-01

    This paper presents a method for detecting speech under stress using Self-Organizing Maps. Most people who are exposed to stressful situations cannot respond adequately to stimuli. The army, police, and fire departments account for the largest part of the occupations that typically face an increased number of stressful situations. The role of personnel in action is controlled by the control center, and control commands should be adapted to the psychological state of the person in action. It is known that psychological changes in the human body are also reflected physiologically, which consequently means that stress affects speech. Therefore, it is clear that a system for recognizing stress in speech is required by the security forces. One possible classifier, popular for its flexibility, is the self-organizing map, a type of artificial neural network. Flexibility here means that the classifier is independent of the character of the input data, a feature suitable for speech processing. Human stress can be seen as a kind of emotional state. Mel-frequency cepstral coefficients, LPC coefficients, and prosodic features were selected as input data because of their sensitivity to emotional changes. The parameters were calculated on speech recordings, which can be divided into two classes, namely stressed-state recordings and normal-state recordings. The benefit of the experiment is a method using a SOM classifier for stressed speech detection. The results showed the advantage of this method, namely its flexibility with respect to the input data.

  11. Deconstructing Cross-Entropy for Probabilistic Binary Classifiers

    Directory of Open Access Journals (Sweden)

    Daniel Ramos

    2018-03-01

    Full Text Available In this work, we analyze the cross-entropy function, widely used in classifiers both as a performance measure and as an optimization objective. We contextualize cross-entropy in the light of Bayesian decision theory, the formal probabilistic framework for making decisions, and we thoroughly analyze its motivation, meaning and interpretation from an information-theoretical point of view. In this sense, this article presents several contributions: First, we explicitly analyze the contribution to cross-entropy of (i) prior knowledge; and (ii) the value of the features in the form of a likelihood ratio. Second, we introduce a decomposition of cross-entropy into two components: discrimination and calibration. This decomposition enables the measurement of different performance aspects of a classifier in a more precise way; and justifies previously reported strategies to obtain reliable probabilities by means of the calibration of the output of a discriminating classifier. Third, we give different information-theoretical interpretations of cross-entropy, which can be useful in different application scenarios, and which are related to the concept of reference probabilities. Fourth, we present an analysis tool, the Empirical Cross-Entropy (ECE) plot, a compact representation of cross-entropy and its aforementioned decomposition. We show the power of ECE plots, as compared to other classical performance representations, in two diverse experimental examples: a speaker verification system, and a forensic case where some glass findings are present.
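
    For reference, the quantity being decomposed is the empirical binary cross-entropy: with posterior probability p_i assigned to the event that the i-th trial belongs to class 1, and true labels y_i in {0, 1},

    \[ \mathrm{CE} = -\frac{1}{N}\sum_{i=1}^{N}\Big( y_i \log_2 p_i + (1-y_i)\log_2 (1-p_i) \Big). \]

    The discrimination/calibration split discussed in the article then separates the loss due to poor class separation from the loss due to miscalibrated probabilities; the formula above is only the standard definition, not the article's full decomposition.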

  12. Evaluation of Polarimetric SAR Decomposition for Classifying Wetland Vegetation Types

    Directory of Open Access Journals (Sweden)

    Sang-Hoon Hong

    2015-07-01

    Full Text Available The Florida Everglades is the largest subtropical wetland system in the United States and, as with subtropical and tropical wetlands elsewhere, has been threatened by severe environmental stresses. It is very important to monitor such wetlands to inform management on the status of these fragile ecosystems. This study aims to examine the applicability of TerraSAR-X quadruple polarimetric (quad-pol) synthetic aperture radar (PolSAR) data for classifying wetland vegetation in the Everglades. We processed quad-pol data using the Hong & Wdowinski four-component decomposition, which accounts for double bounce scattering in the cross-polarization signal. The calculated decomposition images consist of four scattering mechanisms (single, co- and cross-pol double, and volume scattering). We applied an object-oriented image analysis approach to classify vegetation types with the decomposition results. We also used a high-resolution multispectral optical RapidEye image to compare statistics and classification results with Synthetic Aperture Radar (SAR) observations. The calculated classification accuracy was higher than 85%, suggesting that the TerraSAR-X quad-pol SAR signal had a high potential for distinguishing different vegetation types. Scattering components from SAR acquisition were particularly advantageous for classifying mangroves along tidal channels. We conclude that the typical scattering behaviors from model-based decomposition are useful for discriminating among different wetland vegetation types.

  13. Patients on weaning trials classified with support vector machines

    International Nuclear Information System (INIS)

    Garde, Ainara; Caminal, Pere; Giraldo, Beatriz F; Schroeder, Rico; Voss, Andreas; Benito, Salvador

    2010-01-01

    The process of discontinuing mechanical ventilation is called weaning and is one of the most challenging problems in intensive care. An unnecessary delay in the discontinuation process and an early weaning trial are undesirable. This study aims to characterize the respiratory pattern through features that permit the identification of patients' conditions in weaning trials. Three groups of patients have been considered: 94 patients with successful weaning trials, who could maintain spontaneous breathing after 48 h (GSucc); 39 patients who failed the weaning trial (GFail) and 21 patients who had successful weaning trials, but required reintubation in less than 48 h (GRein). Patients are characterized by their cardiorespiratory interactions, which are described by joint symbolic dynamics (JSD) applied to the cardiac interbeat and breath durations. The most discriminating features in the classification of the different groups of patients (GSucc, GFail and GRein) are identified by support vector machines (SVMs). The SVM-based feature selection algorithm has an accuracy of 81% in classifying GSucc versus the rest of the patients, 83% in classifying GRein versus GSucc patients and 81% in classifying GRein versus the rest of the patients. Moreover, a good balance between sensitivity and specificity is achieved in all classifications.

  14. Comparison of artificial intelligence classifiers for SIP attack data

    Science.gov (United States)

    Safarik, Jakub; Slachta, Jiri

    2016-05-01

    Honeypot applications are a source of valuable data about attacks on the network. We run several SIP honeypots in various computer networks, which are separated geographically and logically. Each honeypot runs on a public IP address and uses standard SIP PBX ports. All information gathered via the honeypots is periodically sent to a centralized server. This server classifies all attack data with a neural network algorithm. The paper describes optimizations of a neural network classifier which lower the classification error. The article contains a comparison of two neural network algorithms used for the classification of validation data. The first is the original implementation of the neural network described in recent work; the second neural network uses further optimizations such as input normalization or a cross-entropy cost function. We also use other implementations of neural networks and machine learning classification algorithms. The comparison tests their capabilities on validation data to find the optimal classifier. The results show promise for further development of an accurate SIP attack classification engine.

  15. Classifier-Guided Sampling for Complex Energy System Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Backlund, Peter B. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Eddy, John P. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    This report documents the results of a Laboratory Directed Research and Development (LDRD) effort entitled "Classifier-Guided Sampling for Complex Energy System Optimization" that was conducted during FY 2014 and FY 2015. The goal of this project was to develop, implement, and test major improvements to the classifier-guided sampling (CGS) algorithm. CGS is a type of evolutionary algorithm for performing search and optimization over a set of discrete design variables in the face of one or more objective functions. Existing evolutionary algorithms, such as genetic algorithms, may require a large number of objective function evaluations to identify optimal or near-optimal solutions. Reducing the number of evaluations can result in significant time savings, especially if the objective function is computationally expensive. CGS reduces the evaluation count by using a Bayesian network classifier to filter out non-promising candidate designs, prior to evaluation, based on their posterior probabilities. In this project, both the single-objective and multi-objective versions of CGS are developed and tested on a set of benchmark problems. As a domain-specific case study, CGS is used to design a microgrid for use in islanded mode during an extended bulk power grid outage.

  16. Application of the Naive Bayesian Classifier to optimize treatment decisions

    International Nuclear Information System (INIS)

    Kazmierska, Joanna; Malicki, Julian

    2008-01-01

    Background and purpose: To study the accuracy, specificity and sensitivity of the Naive Bayesian Classifier (NBC) in the assessment of individual risk of cancer relapse or progression after radiotherapy (RT). Materials and methods: Data of 142 brain tumour patients irradiated from 2000 to 2005 were analyzed. Ninety-six attributes related to disease, patient and treatment were chosen. The attributes, in binary form, constituted the training set for NBC learning. The NBC calculated an individual conditional probability of being assigned to the relapse or progression (1) group, or the no relapse or progression (0) group. Accuracy, attribute selection and quality of the classifier were determined by comparison with actual treatment results, leave-one-out and cross validation methods, respectively. A clinical setting test utilized data from 35 patients. Treatment results at classification were unknown and were compared with classification results after 3 months. Results: High classification accuracy (84%), specificity (0.87) and sensitivity (0.80) were achieved, both for classifier training and in progressive clinical evaluation. Conclusions: NBC is a useful tool to support the assessment of individual risk of relapse or progression in patients diagnosed with brain tumour undergoing RT postoperatively.
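
    A minimal sketch of Naive Bayes over binary attributes, in the spirit of the setup above, is shown below; scikit-learn's BernoulliNB and the synthetic data stand in for the authors' implementation and the leave-one-out / cross-validation protocol.

    ```python
    # Sketch: Naive Bayes over binary clinical/treatment attributes, giving per-patient
    # conditional probabilities of relapse/progression (toy data only).
    import numpy as np
    from sklearn.naive_bayes import BernoulliNB

    rng = np.random.default_rng(4)
    X = rng.integers(0, 2, size=(142, 96))   # 142 patients x 96 binary attributes (placeholder)
    y = rng.integers(0, 2, size=142)         # 1 = relapse/progression, 0 = none

    nbc = BernoulliNB().fit(X, y)
    new_patient = rng.integers(0, 2, size=(1, 96))
    print(nbc.predict_proba(new_patient))    # individual conditional probability per group
    ```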

  17. A support vector machine (SVM) based voltage stability classifier

    Energy Technology Data Exchange (ETDEWEB)

    Dosano, R.D.; Song, H. [Kunsan National Univ., Kunsan, Jeonbuk (Korea, Republic of); Lee, B. [Korea Univ., Seoul (Korea, Republic of)

    2007-07-01

    Power system stability has become even more complex and critical with the advent of deregulated energy markets and the growing desire to fully utilize existing transmission infrastructure. The economic pressure on electricity markets forces the operation of power systems and components to their limits of capacity and performance. System conditions can be more exposed to instability due to greater uncertainty in day-to-day system operations and an increase in the number of potential components for system disturbances, potentially resulting in voltage instability. This paper proposed a support vector machine (SVM) based power system voltage stability classifier using local measurements of voltage and active power of load. It described the procedure for fast classification of long-term voltage stability using the SVM algorithm. The application of the SVM based voltage stability classifier was presented with reference to the choice of input parameters; input data preconditioning; moving window for feature vector; determination of learning samples; and other considerations in SVM applications. The paper presented a case study with numerical examples of an 11-bus test system. The test results for the feasibility study demonstrated that the classifier could offer an excellent performance in classification with time-series measurements in terms of long-term voltage stability. 9 refs., 14 figs.

  18. Entropy based classifier for cross-domain opinion mining

    Directory of Open Access Journals (Sweden)

    Jyoti S. Deshmukh

    2018-01-01

    Full Text Available In recent years, the growth of social networks has increased people's interest in analyzing reviews and opinions about products before they buy them. Consequently, this has given rise to domain adaptation as a prominent area of research in sentiment analysis. A classifier trained on one domain often gives poor results on data from another domain, since the expression of sentiment is different in every domain. Labeling each domain separately is very costly as well as time consuming. Therefore, this study has proposed an approach that extracts and classifies opinion words from one domain, called the source domain, and predicts opinion words of another domain, called the target domain, using a semi-supervised approach which combines modified maximum entropy and bipartite graph clustering. A comparison of opinion classification on reviews from four different product domains is presented. The results demonstrate that the proposed method performs relatively well in comparison to the other methods. Comparison of SentiWordNet for domain-specific and domain-independent words reveals that, on average, 72.6% and 88.4% of words, respectively, are correctly classified.

  19. Hepatic, renal, and total body galactose elimination in the pig

    DEFF Research Database (Denmark)

    Winkler, K; Henriksen, Jens Henrik Sahl; Tygstrup, N

    1993-01-01

    Galactose elimination capacity is used as a quantitative measure of liver function on the assumption that galactose elimination outside the liver is negligible or easily corrected for. The relationship between hepatic and extrahepatic removal of galactose was studied in anesthetized pigs during ... reabsorption (Tm 178 ± 3.0 μmol/min, Km 3.8 ± 0.9 mmol/l, n = 20). Metabolic conversion of galactose in the kidney was not demonstrable. At all concentrations studied (0.4-5.8 mmol/l), total galactose elimination from the body exceeded the sum of hepatic and renal elimination by approximately 100 μmol/min, independent of the concentration. At blood concentrations usually used for clinical estimation of the galactose elimination capacity (approximately 4 mmol/l), hepatic removal in the pig accounted for 55% and renal removal for 30% of total removal; 15% of removal occurred in other organs. We conclude ...
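
    Saturable removal characterized by a Tm and a Km of this kind is conventionally written in Michaelis-Menten form, v(C) = Tm·C/(Km + C). As a rough illustration only (not a calculation from the paper), inserting the renal parameters above at a blood concentration of about 4 mmol/l gives

    \[ v \approx \frac{178\ \mu\mathrm{mol/min} \times 4\ \mathrm{mmol/l}}{3.8\ \mathrm{mmol/l} + 4\ \mathrm{mmol/l}} \approx 91\ \mu\mathrm{mol/min}. \]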

  20. Elimination of americium-241 after a case of accidental inhalation

    International Nuclear Information System (INIS)

    Edvardsson, K.A.; Lindgren, L.

    1976-01-01

    In handling a 241Am source one person received an internal contamination of about 140 nCi of americium oxide, which was deposited in the lung region. Elimination of the activity was followed for more than 3 months by external gamma counting and excreta analyses. During the first week after the inhalation about 80% of the total intake was eliminated with an effective half-life of less than 2 days. The remaining activity, deposited in the lung region, was eliminated with an effective half-life of about 17 days. About 15% of the activity eliminated from the lung region from the 10th to the 50th day was eliminated in the faeces. (author)
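
    The effective half-lives quoted above combine physical decay and biological clearance in the standard way,

    \[ \frac{1}{T_{\mathrm{eff}}} = \frac{1}{T_{\mathrm{phys}}} + \frac{1}{T_{\mathrm{biol}}}. \]

    Because the physical half-life of 241Am is about 432 years, the physical term is negligible on the time scale of this study, so the roughly 2-day and 17-day effective half-lives essentially reflect biological clearance from the respective compartments.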

  1. Legal problems of waste treatment in German atomic energy facilities

    International Nuclear Information System (INIS)

    Pfaffelhuber, J.K.

    1980-01-01

    The implementation of strategies for waste treatment and disposal calls for laws and regulations on the obligations of the owners of equipment and facilities, and of the state, for securing safety and the final elimination of radioactive wastes; these are defined mainly in Article 9 of the Atomgesetz and Section 2 (Articles 44-48) of the ordinance on protection from radiation. The owners of equipment and facilities of atomic energy technology shall limit the emission of radiation to about 6% of internationally permissible values, avoid uncontrolled emission without fail, monitor emissions and submit reports yearly to government offices. The owners have a duty of care to utilize harmlessly the radioactive residues produced and the removed or dismantled parts of radioactive equipment, or to eliminate such material in an orderly manner as radioactive waste when such utilization is technically or economically impracticable, or not compatible with the protection aims of the Atomgesetz. The possessors of radioactive wastes shall deliver the wastes to the collection sites of the provinces for intermediate storage, to the facilities of the Federal Republic for securing safety or final storage, or to facilities authorized by government offices for the elimination of radioactive wastes. The provinces shall install collection sites for the intermediate storage of radioactive wastes produced in their territories, and the Federal Republic shall set up the facilities for securing safety and the final elimination of radioactive wastes (Article 9, Atomgesetz). (Okada, K.)

  2. Jupiter Laser Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Jupiter Laser Facility is an institutional user facility in the Physical and Life Sciences Directorate at LLNL. The facility is designed to provide a high degree...

  3. Basic Research Firing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Basic Research Firing Facility is an indoor ballistic test facility that has recently transitioned from a customer-based facility to a dedicated basic research...

  4. Operational and safety requirement of radiation facility

    International Nuclear Information System (INIS)

    Zulkafli Ghazali

    2007-01-01

    Gamma and electron irradiation facilities are the most common industrial sources of ionizing radiation. They have been used for medical, industrial and research purposes since the 1950s. Currently there are more than 160 gamma irradiation facilities and over 600 electron beam facilities in operation worldwide. These facilities are used either for the sterilization of medical and pharmaceutical products, the preservation of foodstuffs, polymer synthesis and modification, or the eradication of insect infestation. Irradiation with electron beam, gamma ray or ultraviolet light can also destroy complex organic contaminants in both liquid and gaseous waste. EB systems are replacing traditional chemical sterilization methods in the medical supply industry. The ultraviolet curing facility, however, has found more industrial application in the printing and furniture industries. Gamma and electron beam facilities produce very high dose rates during irradiation, and thus there is a potential for accidental exposure in the irradiation chamber, which can be lethal within minutes. Although the safety record of this industry has been relatively good, fatalities have been recorded in Italy (1975), Norway (1982), El Salvador (1989) and Israel (1990). Precautions against uncontrolled entry into the irradiation chamber must therefore be taken. This is especially so in the case of gamma irradiation facilities, which contain large amounts of radioactivity. If the mechanism for retracting the source is damaged, the source may remain exposed. This paper describes, to a certain extent, the safety procedures and systems installed at ALURTRON, Nuclear Malaysia, to eliminate accidental exposure from electron beam irradiation. (author)

  5. Prevention and control of tuberculosis in correctional and detention facilities: recommendations from the CDC

    CSIR Research Space (South Africa)

    Parsons, S

    2006-07-01

    Full Text Available Prevention and Control of Tuberculosis in Correctional and Detention Facilities: Recommendations from CDC. Endorsed by the Advisory Council for the Elimination of Tuberculosis, the National Commission on Correctional Health Care, and the American Correctional Association. MMWR contents: Introduction... Summary: Tuberculosis (TB...

  6. Aperture area measurement facility

    Data.gov (United States)

    Federal Laboratory Consortium — NIST has established an absolute aperture area measurement facility for circular and near-circular apertures use in radiometric instruments. The facility consists of...

  7. High Throughput Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Argonne's high throughput facility provides highly automated and parallel approaches to material and materials chemistry development. The facility allows scientists...

  8. Licensed Healthcare Facilities

    Data.gov (United States)

    California Natural Resource Agency — The Licensed Healthcare Facilities point layer represents the locations of all healthcare facilities licensed by the State of California, Department of Health...

  9. Facility Registry Service (FRS)

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Facility Registry Service (FRS) provides an integrated source of comprehensive (air, water, and waste) environmental information about facilities across EPA,...

  10. Interim safety basis for fuel supply shutdown facility

    International Nuclear Information System (INIS)

    Brehm, J.R.; Deobald, T.L.; Benecke, M.W.; Remaize, J.A.

    1995-01-01

    This ISB, in conjunction with the new TSRs, will provide the required basis for interim operation, or restrictions on interim operations, and administrative controls for the Facility until a SAR is prepared in accordance with the new requirements. It is concluded that the risk associated with the current operational mode of the Facility (uranium closure, clean-up, and transition activities required for permanent closure) is within Risk Acceptance Guidelines. The Facility is classified as a Moderate Hazard Facility because of the potential for an unmitigated fire associated with the uranium storage buildings

  11. Geospatial Technology: A Tool to Aid in the Elimination of Malaria in Bangladesh

    Directory of Open Access Journals (Sweden)

    Karen E. Kirk

    2014-12-01

    Full Text Available Bangladesh is a malaria endemic country. There are 13 districts in the country bordering India and Myanmar that are at risk of malaria. The majority of malaria morbidity and mortality cases are in the Chittagong Hill Tracts, the mountainous southeastern region of Bangladesh. In recent years, malaria burden has declined in the country. In this study, we reviewed and summarized published data (through 2014) on the use of geospatial technologies on malaria epidemiology in Bangladesh and outlined potential contributions of geospatial technologies for eliminating malaria in the country. We completed a literature review using the search terms “malaria, Bangladesh” and found 218 articles published in peer-reviewed journals listed in PubMed. After a detailed review, 201 articles were excluded because they did not meet our inclusion criteria, and 17 articles were selected for final evaluation. Published studies indicated that geospatial technology tools (Geographic Information System, Global Positioning System, and Remote Sensing) were used to determine vector-breeding sites, land cover classification, accessibility to health facilities, treatment seeking behaviors, and risk mapping at the household, regional, and national levels in Bangladesh. To achieve the goal of malaria elimination in Bangladesh, we concluded that further research using geospatial technologies should be integrated into the country’s ongoing surveillance system to identify and better assess progress towards malaria elimination.

  12. Guide to research facilities

    Energy Technology Data Exchange (ETDEWEB)

    1993-06-01

    This Guide provides information on facilities at US Department of Energy (DOE) and other government laboratories that focus on research and development of energy efficiency and renewable energy technologies. These laboratories have opened these facilities to outside users within the scientific community to encourage cooperation between the laboratories and the private sector. The Guide features two types of facilities: designated user facilities and other research facilities. Designated user facilities are one-of-a-kind DOE facilities that are staffed by personnel with unparalleled expertise and that contain sophisticated equipment. Other research facilities are facilities at DOE and other government laboratories that provide sophisticated equipment, testing areas, or processes that may not be available at private facilities. Each facility listing includes the name and phone number of someone you can call for more information.

  13. Comparative metabolism and elimination of acetanilide compounds by rat.

    Science.gov (United States)

    Davison, K L; Larsen, G L; Feil, V J

    1994-10-01

    1. 14C-labelled propachlor, alachlor, butachlor, metolachlor, methoxypropachlor and some of their mercapturic acid pathway metabolites (MAP) were given to rats either by gavage or by perfusion into a renal artery. MAP metabolites were isolated from bile and urine. 2. Rats gavaged with propachlor and methoxypropachlor eliminated 14C mostly in urine, whereas rats gavaged with alachlor, butachlor and metolachlor eliminated 14C about equally divided between urine and faeces. When bile ducts were cannulated, the gavaged rats eliminated most of the 14C in bile for all compounds. The amount of 14C in bile from the propachlor-gavaged rats was less than that for the other acetanilides, with the difference being in the urine. 3. The mercapturic acid metabolites 2-methylsulphinyl-N-(1-methylhydroxyethyl)-N-phenylacetamide and 2-methylsulphinyl-N-(1-methylmethoxyethyl)-N-phenylacetamide were isolated from the urine and bile of the methoxypropachlor-gavaged rats. 4. Bile was the major route for 14C elimination when MAP metabolites of alachlor, butachlor and metolachlor were perfused into a renal artery. Urine was the major route for 14C elimination when MAP metabolites of propachlor and methoxypropachlor were perfused. Mercapturic acid conjugates were major metabolites in bile and urine when MAP metabolites were perfused. 5. We conclude that alkyl groups on the phenyl portion of the acetanilide cause biliary elimination to be favoured over urinary elimination.

  14. Uptake, disposition, and elimination of acrylamide in rainbow trout

    International Nuclear Information System (INIS)

    Petersen, D.W.; Kleinow, K.M.; Kraska, R.C.; Lech, J.J.

    1985-01-01

    The uptake, disposition, and elimination of [2,3-14C]acrylamide was studied in fingerling rainbow trout exposed to 0.388 and 0.710 mg/liter [2,3-14C]acrylamide at 12 degrees C under static water conditions for 72 hr. 14C in carcass and viscera was determined at times ranging from 4 to 72 hr after the beginning of the exposure period and 4 to 96 hr after transfer of the fish to fresh flowing water for the elimination studies. Uptake of 14C was initially rapid and plateaued after 72 hr of acrylamide exposure. No appreciable bioaccumulation occurred in carcass or viscera at either exposure concentration and 14C distributed approximately equally to all tissues studied. Elimination of 14C from carcass and viscera was biphasic with a terminal half-life of approximately 7 days. 14C elimination was not uniform in all tissues studied with the most rapid elimination occurring in blood and gill and the slowest elimination occurring in muscle and intestine. In addition, 10 to 15% of the initial total 14C in carcass or viscera was nonextractable and was associated with the protein fraction of the sample at all time points in the depuration period. Approximately 20% of an ip administered dose of [14C]acrylamide was eliminated via the gills, 7% via the urine, and less than 1% via the bile in 2 hr. At least three biliary metabolites were isolated by HPLC

  15. Progress toward elimination of onchocerciasis in the Americas.

    Science.gov (United States)

    Sauerbrey, Mauricio; Rakers, Lindsay J; Richards, Frank O

    2018-03-01

    The Onchocerciasis Elimination Program for the Americas (OEPA) is a regional initiative and international partnership that has made considerable progress toward its goal since it was launched in 1993. Its strategy is based on mass drug administration of ivermectin (Mectizan, donated by MSD, also known as Merck & Co., Inc., Kenilworth, NJ, USA), twice or four times per year, with at least 85% coverage of eligible populations. From 1989 to 2016, 11 741 276 ivermectin treatments have been given in the Americas, eliminating transmission in 11 of 13 foci. The OEPA's success has had a great influence on programs in Africa, especially Sudan and Uganda, which moved from a control to an elimination strategy in 2006 and 2007, respectively. The successes in the Americas have also greatly influenced WHO guidelines for onchocerciasis transmission elimination. With four of the six originally endemic American countries now WHO verified as having eliminated onchocerciasis transmission, and 95% of ivermectin treatments in the region halted, the regional focus is now on the remaining active transmission zone, called the Yanomami Area, on the border between Venezuela and Brazil. Both countries have difficult political climates that hinder the elimination task in this remote and relatively neglected region. As with other elimination efforts, 'the final inch' is often the most difficult task of all.

  16. An elimination diet for chronic urticaria of childhood.

    Science.gov (United States)

    Kemp, A S; Schembri, G

    1985-09-16

    Twenty-three children with chronic urticaria were treated with an elimination diet for two weeks. Eighteen completed the period of dietary elimination; in seven of the 18 children there was a marked remission of the urticaria during the second week of the diet. The administration of challenge capsules provoked an exacerbation of urticaria in five of the 14 (36%) children given aspirin. The incidence of reactions to tartrazine, sodium benzoate and yeast (7%) was not significantly different from those to the lactose placebo (9%). In selected cases, elimination diets with controlled reintroduction of foods have a role in the management of chronic urticaria in childhood.

  17. Criteria for eliminating items of a Test of Figural Analogies

    Directory of Open Access Journals (Sweden)

    Diego Blum

    2013-12-01

    Full Text Available This paper describes the steps taken to eliminate two of the items in a Test of Figural Analogies (TFA). The main guidelines of psychometric analysis concerning Classical Test Theory (CTT) and Item Response Theory (IRT) are explained. The item elimination process was based on both the study of the CTT difficulty and discrimination index, and the unidimensionality analysis. The a, b, and c parameters of the Three Parameter Logistic Model of IRT were also considered for this purpose, as well as the assessment of each item fitting this model. The unfavourable characteristics of a group of TFA items are detailed, and decisions leading to their possible elimination are discussed.
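    For reference, the Three Parameter Logistic Model referred to above has the standard textbook form shown below (not a formula quoted from the paper), where a_i is the discrimination, b_i the difficulty, and c_i the pseudo-guessing parameter of item i, and theta is the examinee ability:

    ```latex
    P_i(\theta) = c_i + (1 - c_i)\,\frac{1}{1 + e^{-a_i(\theta - b_i)}}
    ```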

  18. Classifying and monitoring water quality by use of satellite imagery

    Science.gov (United States)

    Scherz, J. P.; Crane, D. R.; Rogers, R. H.

    1976-01-01

    A technique is developed to eliminate the atmosphere and surface noise effects on Landsat signals of water bodies by manipulating the total signal from Landsat in such a way that only the volume reflectance is left as a residual. With the Landsat signal from a lake and the known volume reflectance for its clear water it is possible to eliminate the surface and atmospheric effects and have residual signals that are indicative only of the type and concentration of the material in other lakes. Laboratory values are more precise than field values because in the field one must contend with indirect skylight and wave action which can be removed in the laboratory. The volume reflectance of distilled water or a very clear lake approaching distilled water was determined in the laboratory by the use of the Bendix radiant power measuring instrument. The Bendix multispectral data analysis system provided a color categorized image of several hundred lakes in a Wisconsin area. These lakes were categorized for tannin and nontannin waters and for the degrees of algae, silt, weeds, and bottom effects present.
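    Read loosely, and only as a hedged summary of the correction described above, the observed signal over a water body can be decomposed and the non-volume terms, estimated from a clear-water reference lake of known volume reflectance, subtracted off:

    ```latex
    L_{\mathrm{total}} = L_{\mathrm{atm}} + L_{\mathrm{surface}} + L_{\mathrm{volume}},
    \qquad
    L_{\mathrm{volume}} \approx L_{\mathrm{total}} - \big(L_{\mathrm{atm}} + L_{\mathrm{surface}}\big)_{\mathrm{reference}}
    ```

    The residual volume term is then interpreted in terms of the type and concentration of material suspended or dissolved in the water.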

  19. On the statistical assessment of classifiers using DNA microarray data

    Directory of Open Access Journals (Sweden)

    Carella M

    2006-08-01

    Full Text Available Abstract Background In this paper we present a method for the statistical assessment of cancer predictors which make use of gene expression profiles. The methodology is applied to a new data set of microarray gene expression data collected in Casa Sollievo della Sofferenza Hospital, Foggia – Italy. The data set is made up of normal (22) and tumor (25) specimens extracted from 25 patients affected by colon cancer. We propose to give answers to some questions which are relevant for the automatic diagnosis of cancer such as: Is the size of the available data set sufficient to build accurate classifiers? What is the statistical significance of the associated error rates? In what ways can accuracy be considered dependent on the adopted classification scheme? How many genes are correlated with the pathology and how many are sufficient for an accurate colon cancer classification? The method we propose answers these questions whilst avoiding the potential pitfalls hidden in the analysis and interpretation of microarray data. Results We estimate the generalization error, evaluated through the Leave-K-Out Cross Validation error, for three different classification schemes by varying the number of training examples and the number of the genes used. The statistical significance of the error rate is measured by using a permutation test. We provide a statistical analysis in terms of the frequencies of the genes involved in the classification. Using the whole set of genes, we found that the Weighted Voting Algorithm (WVA) classifier learns the distinction between normal and tumor specimens with 25 training examples, providing e = 21% (p = 0.045) as an error rate. This remains constant even when the number of examples increases. Moreover, Regularized Least Squares (RLS) and Support Vector Machines (SVM) classifiers can learn with only 15 training examples, with an error rate of e = 19% (p = 0.035) and e = 18% (p = 0.037) respectively. Moreover, the error rate
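    As a rough illustration of the kind of assessment described above (a cross-validated error estimate plus a label-permutation test for its significance), the sketch below uses scikit-learn on synthetic data; it is not the authors' code, and the classifier, fold count and data sizes are placeholders.

    ```python
    # Cross-validated error plus permutation test -- illustrative sketch only.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import (StratifiedKFold, cross_val_score,
                                         permutation_test_score)

    rng = np.random.default_rng(0)
    X = rng.normal(size=(47, 2000))      # 47 specimens x 2000 genes (toy data)
    y = np.array([0] * 22 + [1] * 25)    # 22 normal, 25 tumour labels

    clf = SVC(kernel="linear")
    cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)

    # Cross-validated error rate of the classifier
    error = 1.0 - cross_val_score(clf, X, y, cv=cv).mean()

    # Permutation test: re-score the classifier on label-shuffled data
    score, perm_scores, p_value = permutation_test_score(
        clf, X, y, cv=cv, n_permutations=200, random_state=0)
    print(f"error = {error:.2f}, p = {p_value:.3f}")
    ```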

  20. Can scientific journals be classified based on their citation profiles?

    Directory of Open Access Journals (Sweden)

    Sayed-Amir Marashi

    2015-03-01

    Full Text Available Classification of scientific publications is of great importance in biomedical research evaluation. However, accurate classification of research publications is challenging and normally is performed in a rather subjective way. In the present paper, we propose to classify biomedical publications into superfamilies, by analysing their citation profiles, i.e. the location of citations in the structure of citing articles. Such a classification may help authors to find the appropriate biomedical journal for publication, may make journal comparisons more rational, and may even help planners to better track the consequences of their policies on biomedical research.

  1. Classifying the future of universes with dark energy

    International Nuclear Information System (INIS)

    Chiba, Takeshi; Takahashi, Ryuichi; Sugiyama, Naoshi

    2005-01-01

    We classify the future of the universe for general cosmological models including matter and dark energy. If the equation of state of dark energy is less than -1, the age of the universe becomes finite. We compute the remaining age of the universe for such universe models. The behaviour of the future growth of the matter density perturbation is also studied. We find that the collapse of the spherical overdensity region is greatly changed if the equation of state of dark energy is less than -1

  2. DFRFT: A Classified Review of Recent Methods with Its Application

    Directory of Open Access Journals (Sweden)

    Ashutosh Kumar Singh

    2013-01-01

    Full Text Available In the literature, there are various algorithms available for computing the discrete fractional Fourier transform (DFRFT). In this paper, all the existing methods are reviewed, classified into four categories, and subsequently compared to find out the best alternative from the viewpoint of minimal computational error, computational complexity, transform features, and additional features like security. Subsequently, the correlation theorem of the FRFT has been utilized to significantly remove the Doppler shift caused by the motion of the receiver in the DSB-SC AM signal. Finally, the role of the DFRFT has been investigated in the area of steganography.

  3. Application of a naive Bayesians classifiers in assessing the supplier

    Directory of Open Access Journals (Sweden)

    Mijailović Snežana

    2017-01-01

    Full Text Available The paper considers a class of interactive knowledge-based systems whose main purpose is to make proposals and assist customers in making decisions. The mathematical model provides a set of learning examples describing the series of deliveries from three suppliers, as well as an analysis of an illustrative example of assessing a supplier using a naive Bayesian classifier. The model was developed on the basis of the analysis of subjective probabilities, which are later revised with the help of new empirical information and the Bayesian theorem on posterior probability, i.e. by combining subjective and objective conditional probabilities in the choice of a reliable supplier.
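    A minimal sketch of the general idea, assuming categorical delivery attributes and invented data (none of this is taken from the paper):

    ```python
    # Naive Bayes supplier assessment -- illustrative sketch with made-up data.
    import numpy as np
    from sklearn.naive_bayes import CategoricalNB

    # Columns: delivered on time (0/1), quantity complete (0/1), quality OK (0/1)
    X = np.array([[1, 1, 1], [1, 0, 1], [0, 0, 0], [1, 1, 0],
                  [0, 1, 1], [1, 1, 1], [0, 0, 1], [1, 0, 0]])
    y = np.array([1, 1, 0, 1, 0, 1, 0, 0])   # 1 = reliable supplier, 0 = not

    model = CategoricalNB().fit(X, y)        # learns prior and conditional probabilities

    new_delivery = np.array([[1, 1, 0]])     # a new observed delivery record
    print(model.predict(new_delivery), model.predict_proba(new_delivery))
    ```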

  4. Interface Prostheses With Classifier-Feedback-Based User Training.

    Science.gov (United States)

    Fang, Yinfeng; Zhou, Dalin; Li, Kairu; Liu, Honghai

    2017-11-01

    It is evident that user training significantly affects performance of pattern-recognition-based myoelectric prosthetic device control. Despite plausible classification accuracy on offline datasets, online accuracy usually suffers from the changes in physiological conditions and electrode displacement. The user ability in generating consistent electromyographic (EMG) patterns can be enhanced via proper user training strategies in order to improve online performance. This study proposes a clustering-feedback strategy that provides real-time feedback to users by means of a visualized online EMG signal input as well as the centroids of the training samples, whose dimensionality is reduced to minimal number by dimension reduction. Clustering feedback provides a criterion that guides users to adjust motion gestures and muscle contraction forces intentionally. The experiment results have demonstrated that hand motion recognition accuracy increases steadily along the progress of the clustering-feedback-based user training, while conventional classifier-feedback methods, i.e., label feedback, hardly achieve any improvement. The result concludes that the use of proper classifier feedback can accelerate the process of user training, and implies prosperous future for the amputees with limited or no experience in pattern-recognition-based prosthetic device manipulation.

  5. Nonlinear Knowledge in Kernel-Based Multiple Criteria Programming Classifier

    Science.gov (United States)

    Zhang, Dongling; Tian, Yingjie; Shi, Yong

    The Kernel-based Multiple Criteria Linear Programming (KMCLP) model is used as a classification method that can learn from training examples, whereas in the traditional machine learning area, data sets are classified only by prior knowledge. Some works combine the above two classification principles to overcome the shortcomings of each approach. In this paper, we propose a model to incorporate nonlinear knowledge into KMCLP in order to solve the problem when the input consists not only of training examples, but also of nonlinear prior knowledge. In dealing with the real-world case of breast cancer diagnosis, the model shows better performance than the model based solely on training data.

  6. On-line computing in a classified environment

    International Nuclear Information System (INIS)

    O'Callaghan, P.B.

    1982-01-01

    Westinghouse Hanford Company (WHC) recently developed a Department of Energy (DOE) approved real-time, on-line computer system to control nuclear material. The system simultaneously processes both classified and unclassified information. Implementation of this system required application of many security techniques. The system has a secure, but user friendly interface. Many software applications protect the integrity of the data base from malevolent or accidental errors. Programming practices ensure the integrity of the computer system software. The audit trail and the reports generation capability record user actions and status of the nuclear material inventory

  7. A Handbook for Derivative Classifiers at Los Alamos National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Sinkula, Barbara Jean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-02-23

    The Los Alamos Classification Office (within the SAFE-IP group) prepared this handbook as a resource for the Laboratory’s derivative classifiers (DCs). It contains information about United States Government (USG) classification policy, principles, and authorities as they relate to the LANL Classification Program in general, and to the LANL DC program specifically. At a working level, DCs review Laboratory documents and material that are subject to classification review requirements, while the Classification Office provides the training and resources for DCs to perform that vital function.

  8. Classifying BCI signals from novice users with extreme learning machine

    Directory of Open Access Journals (Sweden)

    Rodríguez-Bermúdez Germán

    2017-07-01

    Full Text Available Brain computer interface (BCI) allows controlling external devices using only the electrical activity of the brain. In order to improve the system, several approaches have been proposed. However, it is usual to test algorithms with standard BCI signals from expert users or from repositories available on the Internet. In this work, extreme learning machine (ELM) has been tested with signals from 5 novice users and compared with standard classification algorithms. Experimental results show that ELM is a suitable method to classify electroencephalogram signals from novice users.
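    For orientation, a bare-bones extreme learning machine works as sketched below: random input weights and biases, a nonlinear hidden layer, and output weights obtained from a least-squares fit. The tanh activation, hidden-layer size and toy data are assumptions for illustration, not details taken from the study.

    ```python
    # Minimal extreme learning machine (ELM) for binary classification -- sketch.
    import numpy as np

    def elm_train(X, y, n_hidden=100, seed=0):
        rng = np.random.default_rng(seed)
        W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights
        b = rng.normal(size=n_hidden)                 # random biases
        H = np.tanh(X @ W + b)                        # hidden-layer activations
        beta = np.linalg.pinv(H) @ y                  # least-squares output weights
        return W, b, beta

    def elm_predict(X, W, b, beta):
        return np.sign(np.tanh(X @ W + b) @ beta)     # binary decision (+1 / -1)

    # Toy EEG-like feature matrix: 40 trials x 16 features, labels +1/-1
    rng = np.random.default_rng(1)
    X, y = rng.normal(size=(40, 16)), rng.choice([-1.0, 1.0], size=40)
    W, b, beta = elm_train(X, y)
    print((elm_predict(X, W, b, beta) == y).mean())   # training accuracy
    ```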

  9. Towards elimination of asbestos-related diseases: a theoretical basis for international cooperation.

    Science.gov (United States)

    Takahashi, Ken; Kang, Seong-Kyu

    2010-12-01

    We develop a theoretical framework for international cooperation that can be used for the elimination of asbestos-related diseases (ARDs). The framework is based on the similarities in the temporal patterns of asbestos use and occurrence of ARDs in diverse countries. The status of each nation can be characterized by observing asbestos use and ARD frequency therein using a time window. Countries that supply technology for prevention of ARDs can be classified as donors and countries that receive these technologies as recipients. We suggest identification of three levels of core preventative technologies. Development of a common platform to gather and manage core preventative technologies will combine the strengths of donor countries and the needs of recipient countries.

  10. The mixed waste management facility

    International Nuclear Information System (INIS)

    Streit, R.D.

    1995-10-01

    During FY96, the Mixed Waste Management Facility (MWMF) Project has the following major objectives: (1) Complete Project Preliminary Design Review (PDR). (2) Complete final design (Title II) of MWMF major systems. (3) Coordinate all final interfaces with the Decontamination and Waste Treatment Facility (DWTF) for facility utilities and facility integration. (4) Begin long-lead procurements. (5) Issue Project Baseline Revision 2-Preliminary Design (PB2), modifying previous baselines per DOE-requested budget profiles and cost reduction. Delete Mediated Electrochemical Oxidation (MEO) as a treatment process for initial demonstration. (6) Complete submittal of, and ongoing support for, applications for air permit. (7) Begin detailed planning for start-up, activation, and operational interfaces with the Laboratory's Hazardous Waste Management Division (HWM). In achieving these objectives during FY96, the Project will incorporate and implement recent DOE directives to maximize the cost savings associated with the DWTF/MWMF integration (initiated in PB1.2); to reduce FY96 new Budget Authority to ∼$10M (reduced from FY97 Validation of $15.3M); and to keep Project fiscal year funding requirements largely uniform at ∼$10M/yr. A revised Project Baseline (i.e., PB2), to be issued during the second quarter of FY96, will address the implementation and impact of this guidance from an overall Project viewpoint. For FY96, the impact of this guidance is that completion of final design has been delayed relative to previous baselines (resulting from the delay in the completion of preliminary design); ramp-up in staffing has been essentially eliminated; and procurements have been balanced through the Project to help balance budget needs to funding availability

  11. Improved training for target detection using Fukunaga-Koontz transform and distance classifier correlation filter

    Science.gov (United States)

    Elbakary, M. I.; Alam, M. S.; Aslan, M. S.

    2008-03-01

    In a FLIR image sequence, a target may disappear permanently or may reappear after some frames, and crucial information about the target such as direction, position and size is lost. If the target reappears in a later frame, it may not be tracked again because the 3D orientation, size and location of the target may have changed. To obtain information about the target before it disappears and to detect it after it reappears, the distance classifier correlation filter (DCCF) is traditionally trained manually by selecting a number of chips at random. This paper introduces a novel idea to eliminate the manual intervention in the training phase of DCCF. Instead of selecting the training chips manually and choosing their number at random, we adopt the K-means algorithm to cluster the training frames and, based on the number of clusters, select the training chips such that there is one training chip per cluster. To detect and track the target after it reappears in the field of view, TBF and DCCF are employed. The conducted experiments using real FLIR sequences show results similar to the traditional algorithm, but eliminating the manual intervention is the advantage of the proposed algorithm.
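    A hedged sketch of the chip-selection step described above: cluster the available training chips with K-means and keep the chip nearest each cluster centre as a representative for DCCF training. The feature representation (flattened image chips) and cluster count are illustrative assumptions.

    ```python
    # K-means based selection of representative training chips -- sketch only.
    import numpy as np
    from sklearn.cluster import KMeans

    def select_training_chips(chips, n_clusters=5):
        """chips: array of shape (n_chips, n_features), e.g. flattened image chips."""
        km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(chips)
        selected = []
        for k in range(n_clusters):
            members = np.where(km.labels_ == k)[0]
            dists = np.linalg.norm(chips[members] - km.cluster_centers_[k], axis=1)
            selected.append(members[np.argmin(dists)])   # chip nearest the centroid
        return selected

    chips = np.random.default_rng(2).normal(size=(200, 64 * 64))  # toy chips
    print(select_training_chips(chips, n_clusters=4))
    ```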

  12. SVM-RFE based feature selection and Taguchi parameters optimization for multiclass SVM classifier.

    Science.gov (United States)

    Huang, Mei-Ling; Hung, Yung-Hsiang; Lee, W M; Li, R K; Jiang, Bo-Ru

    2014-01-01

    Recently, the support vector machine (SVM) has shown excellent performance in classification and prediction and is widely used in disease diagnosis and medical assistance. However, SVM only functions well on two-group classification problems. This study combines feature selection and SVM recursive feature elimination (SVM-RFE) to investigate the classification accuracy of multiclass problems for the Dermatology and Zoo databases. The Dermatology dataset contains 33 feature variables, 1 class variable, and 366 testing instances; the Zoo dataset contains 16 feature variables, 1 class variable, and 101 testing instances. The feature variables in the two datasets were sorted in descending order by explanatory power, and different feature sets were selected by SVM-RFE to explore classification accuracy. Meanwhile, the Taguchi method was combined with the SVM classifier in order to optimize parameters C and γ to increase classification accuracy for multiclass classification. The experimental results show that the classification accuracy can exceed 95% after SVM-RFE feature selection and Taguchi parameter optimization for the Dermatology and Zoo databases.
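    The pipeline can be approximated with scikit-learn as below, using a built-in dataset as a stand-in for the Dermatology/Zoo data and a plain grid search in place of the Taguchi design; this is an illustrative sketch, not the study's implementation.

    ```python
    # SVM-RFE feature selection followed by (C, gamma) tuning -- sketch only.
    from sklearn.datasets import load_iris            # stand-in dataset
    from sklearn.feature_selection import RFE
    from sklearn.model_selection import GridSearchCV, cross_val_score
    from sklearn.svm import SVC

    X, y = load_iris(return_X_y=True)

    # Rank features with a linear SVM and keep a small top subset
    rfe = RFE(SVC(kernel="linear"), n_features_to_select=2).fit(X, y)
    X_sel = rfe.transform(X)

    # Tune C and gamma for an RBF SVM on the selected features
    grid = GridSearchCV(SVC(kernel="rbf"),
                        {"C": [0.1, 1, 10, 100], "gamma": [0.01, 0.1, 1]},
                        cv=5).fit(X_sel, y)

    print(rfe.support_, grid.best_params_,
          cross_val_score(grid.best_estimator_, X_sel, y, cv=5).mean())
    ```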

  13. Coping with unobservable and mis-classified states in capture-recapture studies

    Directory of Open Access Journals (Sweden)

    Kendall, W. L.

    2004-01-01

    Full Text Available Multistate mark-recapture methods provide an excellent conceptual framework for considering estimation in studies of marked animals. Traditional methods include the assumptions that (1) each state an animal occupies is observable, and (2) state is assigned correctly at each point in time. Failure of either of these assumptions can lead to biased estimates of demographic parameters. I review design and analysis options for minimizing or eliminating these biases. Unobservable states can be adjusted for by including them in the state space of the statistical model, with zero capture probability, and incorporating the robust design, or observing animals in the unobservable state through telemetry, tag recoveries, or incidental observations. Mis-classification can be adjusted for by auxiliary data or incorporating the robust design, in order to estimate the probability of detecting the state an animal occupies. For both unobservable and mis-classified states, the key feature of the robust design is the assumption that the state of the animal is static for at least two sampling occasions.

  14. A calibration facility for radon fluxmeter

    International Nuclear Information System (INIS)

    Li Xianjie; Qiu Shoukang; Zhou Jianliang; Liu Chunkui; Pan Jialin; Yang Mingli

    1998-01-01

    Calibration facilities for radon fluxmeters with three different kinds of emanation medium have been developed. The stability of the radon flux is 5% and 9% (RSD), respectively, and the uniformity of the radon flux is 4.5% and 8.5% (RSD), respectively. These specifications fulfill the calibration requirements for radon fluxmeters. The determination of the radon flux of the facility takes full account of the main error source, the attenuation effect (including leakage, back diffusion, etc.), not only by preventing attenuation but also by making a relevant correction; the accuracy of the determination is therefore assured. The calibration and intercomparison of radon fluxmeters and the quantitative evaluation of radon flux measurement methods are made possible by the successful establishment of this facility. (author)

  15. Los Alamos Plutonium Facility Waste Management System

    International Nuclear Information System (INIS)

    Smith, K.; Montoya, A.; Wieneke, R.; Wulff, D.; Smith, C.; Gruetzmacher, K.

    1997-01-01

    This paper describes the new computer-based transuranic (TRU) Waste Management System (WMS) being implemented at the Plutonium Facility at Los Alamos National Laboratory (LANL). The Waste Management System is a distributed computer processing system stored in a Sybase database and accessed by a graphical user interface (GUI) written in Omnis7. It resides on the local area network at the Plutonium Facility and is accessible by authorized TRU waste originators, count room personnel, radiation protection technicians (RPTs), quality assurance personnel, and waste management personnel for data input and verification. Future goals include bringing outside groups like the LANL Waste Management Facility on-line to participate in this streamlined system. The WMS is changing the TRU paper trail into a computer trail, saving time and eliminating errors and inconsistencies in the process

  16. Communication grounding facility

    International Nuclear Information System (INIS)

    Lee, Gye Seong

    1998-06-01

    This document concerns communication grounding facilities and is made up of twelve chapters. It covers general grounding and its purpose; materials and thermal insulating materials; construction of grounding; super-strength grounding methods; grounding facilities, grounding methods and building insulation; switched grounding with No. 1A and LCR; grounding facilities for transmission lines; grounding of wireless facilities; grounding facilities in wireless base stations; grounding of power facilities; grounding of low-tension interior power wiring; communication facilities of railroads; installation of arresters in apartments and houses; installation of arresters on the introduction; and earth conductivity and the measurement of introduction and grounding resistance.

  17. Formic Acid Free Flowsheet Development To Eliminate Catalytic Hydrogen Generation In The Defense Waste Processing

    Energy Technology Data Exchange (ETDEWEB)

    Lambert, Dan P.; Stone, Michael E.; Newell, J. David; Fellinger, Terri L.; Bricker, Jonathan M.

    2012-09-14

    The Defense Waste Processing Facility (DWPF) processes legacy nuclear waste generated at the Savannah River Site (SRS) during production of plutonium and tritium demanded by the Cold War. The nuclear waste is first treated via a complex sequence of controlled chemical reactions and then vitrified into a borosilicate glass form and poured into stainless steel canisters. Converting the nuclear waste into borosilicate glass canisters is a safe, effective way to reduce the volume of the waste and stabilize the radionuclides. Testing was initiated to determine whether the elimination of formic acid from the DWPF's chemical processing flowsheet would eliminate catalytic hydrogen generation. Historically, hydrogen is generated in chemical processing of alkaline High Level Waste sludge in DWPF. In current processing, sludge is combined with nitric and formic acid to neutralize the waste, reduce mercury and manganese, destroy nitrite, and modify (thin) the slurry rheology. The noble metal catalyzed formic acid decomposition produces hydrogen and carbon dioxide. Elimination of formic acid by replacement with glycolic acid has the potential to eliminate the production of catalytic hydrogen. Flowsheet testing was performed to develop the nitric-glycolic acid flowsheet as an alternative to the nitric-formic flowsheet currently being processed at the DWPF. This new flowsheet has shown that mercury can be reduced and removed by steam stripping in DWPF with no catalytic hydrogen generation. All processing objectives were also met, including greatly reducing the Slurry Mix Evaporator (SME) product yield stress as compared to the baseline nitric/formic flowsheet. Ten DWPF tests were performed with nonradioactive simulants designed to cover a broad compositional range. No hydrogen was generated in testing without formic acid.

  18. Progress towards malaria elimination in Zimbabwe with special reference to the period 2003-2015.

    Science.gov (United States)

    Sande, Shadreck; Zimba, Moses; Mberikunashe, Joseph; Tangwena, Andrew; Chimusoro, Anderson

    2017-07-24

    An intensive effort to control malaria in Zimbabwe has produced dramatic reductions in the burden of the disease over the past 13 years. The successes have prompted Zimbabwe's National Malaria Control Programme to commit to elimination of malaria. It is critical to analyse the changes in the morbidity trends based on surveillance data, and to scrutinize the reorientation to strategies for elimination. This is a retrospective study of available Ministry of Health surveillance data and programme reports, mostly from 2003 to 2015. Malaria epidemiological data were drawn from the National Health Information System database. Data on available resources, malaria control strategies, morbidity and mortality trends were analysed, and opportunities for the Zimbabwe malaria elimination agenda were reviewed. With strong government commitment and partner support, the financial gap for malaria programming shrank by 91.4% from about US$13 million in 2012 to US$1 million in 2015. Vector control comprises indoor residual house spraying (IRS) and long-lasting insecticidal nets, and spray coverage increased from 28% in 2003 to 95% in 2015. The population protected by IRS also increased from 20 to 96% over the same period. In 2009, diagnostics improved from clinical to parasitological confirmation, either by rapid diagnostic tests or microscopy. Artemisinin-based combination therapy was used to treat malaria following chloroquine resistance in 2000, and sulfadoxine-pyrimethamine in 2004. In 2003, there were 155 malaria cases per 1000 population reported from all health facilities throughout the country. The following decade witnessed a substantial decline in cases to only 22 per 1000 population in 2012. A resurgence was reported in 2013 (29/1000) and 2014 (39/1000); thereafter morbidity declined to 29 cases per 1000 population, the same level as in 2013. Overall, morbidity declined by 81% from 2003 to 2015. Inpatient malaria deaths per 100,000 population doubled in 4 years, from 2

  19. Malaria training for community health workers in the setting of elimination: a qualitative study from China.

    Science.gov (United States)

    Lu, Guangyu; Liu, Yaobao; Wang, Jinsong; Li, Xiangming; Liu, Xing; Beiersmann, Claudia; Feng, Yu; Cao, Jun; Müller, Olaf

    2018-02-23

    Continuous training of health workers is a key intervention to maintain their good performance and keep their vigilance during malaria elimination programmes. However, countries progressing toward malaria elimination have a largely decreased malaria disease burden, less frequent exposure of health workers to malaria patients, and new challenges in the epidemiology of the remaining malaria cases. Moreover, competing health priorities and usually a decline in resources and in political commitment also pose challenges to the elimination programme. As a consequence, the acceptability, sustainability, and impact of malaria training and education programmes face challenges. However, little is known of the perceptions and expectations regarding malaria training and education programmes among health workers engaged in countries with malaria elimination programmes. This qualitative study provides information on the perceptions and expectations of health workers concerning malaria training programmes in China, which aims at malaria elimination by the year 2020. This study was embedded into a larger study on the challenges and lessons learned during the malaria surveillance strategy in China, involving 42 interviews with malaria experts, health staff, laboratory practitioners, and village doctors at the provincial, city, county, township, and village levels from Gansu province (northwestern China) and Jiangsu province (southeastern China). In the context of an increasing number of imported malaria cases in China, the majority of respondents emphasized the necessity and importance of such programmes and complained about a decreasing frequency of training courses. Moreover, they called for innovative strategies to improve the implementation and sustainability of the malaria training programmes until the elimination goal has been achieved. Perceptions and expectations of health workers from different health centres were quite different. Health workers from higher-level facilities were more

  20. Classifying and mapping wetlands and peat resources using digital cartography

    Science.gov (United States)

    Cameron, Cornelia C.; Emery, David A.

    1992-01-01

    Digital cartography allows the portrayal of spatial associations among diverse data types and is ideally suited for land use and resource analysis. We have developed methodology that uses digital cartography for the classification of wetlands and their associated peat resources and applied it to a 1:24 000 scale map area in New Hampshire. Classifying and mapping wetlands involves integrating the spatial distribution of wetlands types with depth variations in associated peat quality and character. A hierarchically structured classification that integrates the spatial distribution of variations in (1) vegetation, (2) soil type, (3) hydrology, (4) geologic aspects, and (5) peat characteristics has been developed and can be used to build digital cartographic files for resource and land use analysis. The first three parameters are the bases used by the National Wetlands Inventory to classify wetlands and deepwater habitats of the United States. The fourth parameter, geological aspects, includes slope, relief, depth of wetland (from surface to underlying rock or substrate), wetland stratigraphy, and the type and structure of solid and unconsolidated rock surrounding and underlying the wetland. The fifth parameter, peat characteristics, includes the subsurface variation in ash, acidity, moisture, heating value (Btu), sulfur content, and other chemical properties as shown in specimens obtained from core holes. These parameters can be shown as a series of map data overlays with tables that can be integrated for resource or land use analysis.

  1. Efficacy of MRI in classifying proximal focal femoral deficiency

    International Nuclear Information System (INIS)

    Maldjian, C.; Patel, T.Y.; Klein, R.M.; Smith, R.C.

    2007-01-01

    To evaluate the efficacy of MRI in classifying PFFD and to compare MRI to radiographic classification of PFFD. Radiographic and MRI classification of the cases was performed utilizing the Amstutz classification system. Retrospective evaluation of radiographs and MRI exams in nine hips of eight patients with proximal focal femoral deficiency was performed by two radiologists. The cases were classified by radiographs as Amstutz 1: n=3, Amstutz 3: n=3, Amstutz 4: n=1 and Amstutz 5: n=2. The classifications based on MRI were Amstutz 1: n=6, Amstutz 2: n=1, Amstutz 3: n=0, Amstutz 4: n=2 and Amstutz 5: n=0. Three hips demonstrated complete agreement. There were six discordant hips. In two of the discordant cases, follow-up radiographs of 6 months or greater intervals were available and helped to confirm MRI findings. Errors in radiographic evaluation consisted of overestimating the degree of deficiency. MRI is more accurate than radiographic evaluation for the classification of PFFD, particularly early on, prior to the ossification of cartilaginous components in the femurs. Since radiographic evaluation tends to overestimate the degree of deficiency, MRI is a more definitive modality for evaluation of PFFD. (orig.)

  2. REPTREE CLASSIFIER FOR IDENTIFYING LINK SPAM IN WEB SEARCH ENGINES

    Directory of Open Access Journals (Sweden)

    S.K. Jayanthi

    2013-01-01

    Full Text Available Search engines are used for retrieving information from the web. Most of the time, importance is placed on the top 10 results, and sometimes this may shrink to the top 5, because of time constraints and reliance on the search engines. Users believe that the top 10 or 5 of the total results are more relevant. Here comes the problem of spamdexing, a method of deceiving search-result quality. Falsified metrics, such as inserting an enormous number of keywords or links in a website, may take that website to the top 10 or 5 positions. This paper proposes a classifier based on REPTree (a regression tree representative). As an initial step, link-based features such as neighbors, PageRank, truncated PageRank, TrustRank and assortativity-related attributes are inferred. Based on these features, a tree is constructed. The tree uses the feature inference to differentiate spam sites from legitimate sites. The WEBSPAM-UK-2007 dataset is taken as a base. It is preprocessed and converted into five datasets: FEATA, FEATB, FEATC, FEATD and FEATE. Only link-based features are used in the experiments; this paper focuses on link spam alone. Finally, a representative tree is created which more precisely classifies the web spam entries. Results are given. Regression tree classification seems to perform well, as shown through the experiments.
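    As a rough analogue (REPTree itself is a Weka learner), a decision tree over link-based features might be set up as below; the feature columns and the labelling rule are invented purely for illustration.

    ```python
    # Decision tree on link-based features as a stand-in for REPTree -- sketch.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(3)
    # Columns: in-degree, PageRank, truncated PageRank, TrustRank, assortativity
    X = rng.random(size=(500, 5))
    y = (X[:, 3] < 0.2).astype(int)          # toy rule: low TrustRank -> spam

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)
    print("hold-out accuracy:", tree.score(X_te, y_te))
    ```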

  3. Deposition of Nanostructured Thin Film from Size-Classified Nanoparticles

    Science.gov (United States)

    Camata, Renato P.; Cunningham, Nicholas C.; Seol, Kwang Soo; Okada, Yoshiki; Takeuchi, Kazuo

    2003-01-01

    Materials comprising nanometer-sized grains (approximately 1-50 nm) exhibit properties dramatically different from those of their homogeneous and uniform counterparts. These properties vary with size, shape, and composition of nanoscale grains. Thus, nanoparticles may be used as building blocks to engineer tailor-made artificial materials with desired properties, such as non-linear optical absorption, tunable light emission, charge-storage behavior, selective catalytic activity, and countless other characteristics. This bottom-up engineering approach requires exquisite control over nanoparticle size, shape, and composition. We describe the design and characterization of an aerosol system conceived for the deposition of size-classified nanoparticles whose performance is consistent with these strict demands. A nanoparticle aerosol is generated by laser ablation and sorted according to size using a differential mobility analyzer. Nanoparticles within a chosen window of sizes (e.g., (8.0 ± 0.6) nm) are deposited electrostatically on a surface, forming a film of the desired material. The system allows the assembly and engineering of thin films using size-classified nanoparticles as building blocks.

  4. Speaker gender identification based on majority vote classifiers

    Science.gov (United States)

    Mezghani, Eya; Charfeddine, Maha; Nicolas, Henri; Ben Amar, Chokri

    2017-03-01

    Speaker gender identification is considered among the most important tools in several multimedia applications, namely in automatic speech recognition, interactive voice response systems and audio browsing systems. Gender identification system performance is closely linked to the selected feature set and the employed classification model. Typical techniques are based on selecting the best-performing classification method or on searching for the optimum tuning of one classifier's parameters through experimentation. In this paper, we consider a relevant and rich set of features involving pitch, MFCCs, as well as other temporal and frequency-domain descriptors. Five classification models, including decision tree, discriminant analysis, naive Bayes, support vector machine and k-nearest neighbor, were experimented with. The three best-performing classifiers among the five contribute by majority voting between their scores. Experiments were performed on three different datasets spoken in three languages, English, German and Arabic, in order to validate the language independency of the proposed scheme. Results confirm that the presented system has reached a satisfying accuracy rate and promising classification performance thanks to the discriminating abilities and diversity of the used features combined with mid-level statistics.
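    A schematic version of the described scheme (train five models, keep the three with the best cross-validated accuracy, and combine them by majority vote) could look like the following; the acoustic feature vectors are synthetic placeholders.

    ```python
    # Top-3-of-5 majority-vote gender classifier -- illustrative sketch only.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.ensemble import VotingClassifier
    from sklearn.model_selection import cross_val_score
    from sklearn.naive_bayes import GaussianNB
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.svm import SVC
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(4)
    X = rng.normal(size=(300, 20))                   # toy pitch/MFCC feature vectors
    y = rng.integers(0, 2, size=300)                 # 0 = male, 1 = female (toy labels)

    candidates = {"tree": DecisionTreeClassifier(random_state=0),
                  "lda": LinearDiscriminantAnalysis(),
                  "nb": GaussianNB(),
                  "svm": SVC(),
                  "knn": KNeighborsClassifier()}

    # Keep the three classifiers with the best cross-validated accuracy
    scores = {name: cross_val_score(clf, X, y, cv=5).mean()
              for name, clf in candidates.items()}
    top3 = sorted(scores, key=scores.get, reverse=True)[:3]

    ensemble = VotingClassifier([(n, candidates[n]) for n in top3], voting="hard")
    print(top3, cross_val_score(ensemble, X, y, cv=5).mean())
    ```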

  5. Spread-sheet application to classify radioactive material for shipment

    International Nuclear Information System (INIS)

    Brown, A.N.

    1998-01-01

    A spread-sheet application has been developed at the Idaho National Engineering and Environmental Laboratory to aid the shipper when classifying nuclide mixtures of normal form, radioactive materials. The results generated by this spread-sheet are used to confirm the proper US DOT classification when offering radioactive material packages for transport. The user must input to the spread-sheet the mass of the material being classified, the physical form (liquid or not) and the activity of each regulated nuclide. The spread-sheet uses these inputs to calculate two general values: (1) the specific activity of the material, and (2) a summation calculation of the nuclide content. The specific activity is used to determine if the material exceeds the DOT minimal threshold for a radioactive material. If the material is calculated to be radioactive, the specific activity is also used to determine if the material meets the activity requirement for one of the three low specific activity designations (LSA-I, LSA-II, LSA-III, or not LSA). Again, if the material is calculated to be radioactive, the summation calculation is then used to determine which activity category the material will meet (Limited Quantity, Type A, Type B, or Highway Route Controlled Quantity). This spread-sheet has proven to be an invaluable aid for shippers of radioactive materials at the Idaho National Engineering and Environmental Laboratory. (authors)
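    The two calculations can be mimicked in a few lines of Python; note that the thresholds, the limit values and the category logic below are simplified placeholders, not the actual DOT numbers or rules the spread-sheet implements.

    ```python
    # Simplified sketch of the specific-activity and summation calculations.
    EXEMPT_SPECIFIC_ACTIVITY = {"Co-60": 1.0e1, "Cs-137": 1.0e1}   # placeholder Bq/g
    A2_LIMITS_TBQ = {"Co-60": 0.4, "Cs-137": 0.6}                  # placeholder TBq

    def classify(mass_g, activities_bq):
        """activities_bq: dict of nuclide -> activity in Bq."""
        specific_activity = sum(activities_bq.values()) / mass_g    # Bq/g
        radioactive = any(a / mass_g > EXEMPT_SPECIFIC_ACTIVITY[n]
                          for n, a in activities_bq.items())
        # Summation rule: sum of (nuclide activity / its limit) over the mixture
        fraction_of_limit = sum((a / 1e12) / A2_LIMITS_TBQ[n]
                                for n, a in activities_bq.items())
        category = "Type A" if fraction_of_limit <= 1.0 else "Type B"
        return specific_activity, radioactive, category

    print(classify(500.0, {"Co-60": 2.0e6, "Cs-137": 5.0e5}))
    ```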

  6. Identifying aggressive prostate cancer foci using a DNA methylation classifier.

    Science.gov (United States)

    Mundbjerg, Kamilla; Chopra, Sameer; Alemozaffar, Mehrdad; Duymich, Christopher; Lakshminarasimhan, Ranjani; Nichols, Peter W; Aron, Manju; Siegmund, Kimberly D; Ukimura, Osamu; Aron, Monish; Stern, Mariana; Gill, Parkash; Carpten, John D; Ørntoft, Torben F; Sørensen, Karina D; Weisenberger, Daniel J; Jones, Peter A; Duddalwar, Vinay; Gill, Inderbir; Liang, Gangning

    2017-01-12

    Slow-growing prostate cancer (PC) can be aggressive in a subset of cases. Therefore, prognostic tools to guide clinical decision-making and avoid overtreatment of indolent PC and undertreatment of aggressive disease are urgently needed. PC has a propensity to be multifocal with several different cancerous foci per gland. Here, we have taken advantage of the multifocal propensity of PC and categorized aggressiveness of individual PC foci based on DNA methylation patterns in primary PC foci and matched lymph node metastases. In a set of 14 patients, we demonstrate that over half of the cases have multiple epigenetically distinct subclones and determine the primary subclone from which the metastatic lesion(s) originated. Furthermore, we develop an aggressiveness classifier consisting of 25 DNA methylation probes to determine aggressive and non-aggressive subclones. Upon validation of the classifier in an independent cohort, the predicted aggressive tumors are significantly associated with the presence of lymph node metastases and invasive tumor stages. Overall, this study provides molecular-based support for determining PC aggressiveness with the potential to impact clinical decision-making, such as targeted biopsy approaches for early diagnosis and active surveillance, in addition to focal therapy.

  7. Spreadsheet application to classify radioactive material for shipment

    International Nuclear Information System (INIS)

    Brown, A.N.

    1997-12-01

    A spreadsheet application has been developed at the Idaho National Engineering and Environmental Laboratory to aid the shipper when classifying nuclide mixtures of normal form, radioactive materials. The results generated by this spreadsheet are used to confirm the proper US Department of Transportation (DOT) classification when offering radioactive material packages for transport. The user must input to the spreadsheet the mass of the material being classified, the physical form (liquid or not), and the activity of each regulated nuclide. The spreadsheet uses these inputs to calculate two general values: (1) the specific activity of the material, and (2) a summation calculation of the nuclide content. The specific activity is used to determine if the material exceeds the DOT minimal threshold for a radioactive material (Yes or No). If the material is calculated to be radioactive, the specific activity is also used to determine if the material meets the activity requirement for one of the three Low Specific Activity designations (LSA-I, LSA-II, LSA-III, or Not LSA). Again, if the material is calculated to be radioactive, the summation calculation is then used to determine which activity category the material will meet (Limited Quantity, Type A, Type B, or Highway Route Controlled Quantity)

  8. Classifying decommissioning wastes for allocation to appropriate final repositories

    International Nuclear Information System (INIS)

    Alder, J.C.; Tunaboylu, K.

    1982-01-01

    For the safe disposal of radioactive wastes in different repositories, it is advantageous to classify them in well-defined conditioned categories appropriate for final disposal. These categories, the so-called waste sorts, are characterized by similar radionuclide distribution, similar nuclide-specific activity concentrations and a similar waste matrix. A methodology is presented for classifying decommissioning wastes and is applied to the decommissioning wastes arising from a Swiss program of 6 GWe. The amounts and nuclide-specific activity inventories of the decommissioning waste sorts have been estimated. A first allocation into two different repository types has been performed. Such a classification enables one to define the source parameters for repository safety analysis and allows one to allocate the different waste categories into appropriate final repositories. This work presents a first iteration to determine which waste sorts belong to which repository type. The characteristics of waste sorts have to be better defined and the protective strength of the repository barriers has to be optimized. 7 references, 2 figures, 4 tables

  9. Classifying magnetic resonance image modalities with convolutional neural networks

    Science.gov (United States)

    Remedios, Samuel; Pham, Dzung L.; Butman, John A.; Roy, Snehashis

    2018-02-01

    Magnetic Resonance (MR) imaging allows the acquisition of images with different contrast properties depending on the acquisition protocol and the magnetic properties of tissues. Many MR brain image processing techniques, such as tissue segmentation, require multiple MR contrasts as inputs, and each contrast is treated differently. Thus it is advantageous to automate the identification of image contrasts for various purposes, such as facilitating image processing pipelines, and managing and maintaining large databases via content-based image retrieval (CBIR). Most automated CBIR techniques focus on a two-step process: extracting features from data and classifying the image based on these features. We present a novel 3D deep convolutional neural network (CNN)- based method for MR image contrast classification. The proposed CNN automatically identifies the MR contrast of an input brain image volume. Specifically, we explored three classification problems: (1) identify T1-weighted (T1-w), T2-weighted (T2-w), and fluid-attenuated inversion recovery (FLAIR) contrasts, (2) identify pre vs postcontrast T1, (3) identify pre vs post-contrast FLAIR. A total of 3418 image volumes acquired from multiple sites and multiple scanners were used. To evaluate each task, the proposed model was trained on 2137 images and tested on the remaining 1281 images. Results showed that image volumes were correctly classified with 97.57% accuracy.
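
    The record does not give the network architecture, so the following is only a minimal PyTorch sketch of a 3D CNN that maps an input brain volume to one of three contrast labels; the layer sizes and the 64-cubed input are assumptions for illustration, not the authors' model.

    ```python
    # Minimal 3D CNN sketch for volume-level contrast classification (e.g. T1 / T2 / FLAIR).
    # NOT the architecture from the paper; layer sizes and the 64^3 input are assumptions.
    import torch
    import torch.nn as nn

    class ContrastCNN(nn.Module):
        def __init__(self, n_classes=3):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv3d(1, 8, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool3d(2),
                nn.Conv3d(8, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool3d(2),
                nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool3d(1),
            )
            self.classifier = nn.Linear(32, n_classes)

        def forward(self, x):              # x: (batch, 1, depth, height, width)
            h = self.features(x).flatten(1)
            return self.classifier(h)      # raw logits; train with CrossEntropyLoss

    model = ContrastCNN()
    logits = model(torch.randn(2, 1, 64, 64, 64))   # two dummy volumes
    print(logits.shape)                              # torch.Size([2, 3])
    ```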

  10. Testing the hypothesis that treatment can eliminate HIV

    DEFF Research Database (Denmark)

    Okano, Justin T; Robbins, Danielle; Palk, Laurence

    2016-01-01

    BACKGROUND: Worldwide, approximately 35 million individuals are infected with HIV; about 25 million of these live in sub-Saharan Africa. WHO proposes using treatment as prevention (TasP) to eliminate HIV. Treatment suppresses viral load, decreasing the probability an individual transmits HIV....... The elimination threshold is one new HIV infection per 1000 individuals. Here, we test the hypothesis that TasP can substantially reduce epidemics and eliminate HIV. We estimate the impact of TasP, between 1996 and 2013, on the Danish HIV epidemic in men who have sex with men (MSM), an epidemic UNAIDS has...... identified as a priority for elimination. METHODS: We use a CD4-staged Bayesian back-calculation approach to estimate incidence, and the hidden epidemic (the number of HIV-infected undiagnosed MSM). To develop the back-calculation model, we use data from an ongoing nationwide population-based study...

  11. Elimination kinetic model for organic chemicals in earthworms.

    NARCIS (Netherlands)

    Dimitrova, N.; Dimitrov, S.; Georgieva, D.; van Gestel, C.A.M.; Hankard, P.; Spurgeon, D.J.; Li, H.; Mekenyan, O.

    2010-01-01

    Mechanistic understanding of bioaccumulation in different organisms and environments should take into account the influence of organism and chemical depending factors on the uptake and elimination kinetics of chemicals. Lipophilicity, metabolism, sorption (bioavailability) and biodegradation of

  12. Elimination of the Landau ghost from chiral solitons

    International Nuclear Information System (INIS)

    Hartmann, J.; Beck, F.; Bentz, W.

    1994-01-01

    We show a practical way based on the Kaellen-Lehmann representation for the two-point functions to eliminate the instability of the vacuum against formation of small sized meson configurations in the chiral σ model

  13. Eliminating Adversary Weapons of Mass Destruction: What's at Stake?

    National Research Council Canada - National Science Library

    Hersman, Rebecca K

    2004-01-01

    .... Unfortunately, the current preoccupation with intelligence might mask other issues and shortcomings in the American ability to eliminate the threat posed by weapons of mass destruction in the hands of its enemies...

  14. QUEST : Eliminating online supervised learning for efficient classification algorithms

    NARCIS (Netherlands)

    Zwartjes, Ardjan; Havinga, Paul J.M.; Smit, Gerard J.M.; Hurink, Johann L.

    2016-01-01

    In this work, we introduce QUEST (QUantile Estimation after Supervised Training), an adaptive classification algorithm for Wireless Sensor Networks (WSNs) that eliminates the necessity for online supervised learning. Online processing is important for many sensor network applications. Transmitting

  15. 48 CFR 3004.470 - Security requirements for access to unclassified facilities, Information Technology resources...

    Science.gov (United States)

    2010-10-01

    Section 3004.470 of the Homeland Security Acquisition Regulation (HSAR), under General Administrative Matters, Safeguarding Classified and Sensitive Information, sets out security requirements for access to unclassified facilities, Information Technology resources, and sensitive information.

  16. AOV Facility Tool/Facility Safety Specifications -

    Data.gov (United States)

    Department of Transportation — Develop and maintain authorizing documents that are standards that facilities must follow. These standards reference FAA regulations and are specific to the...

  17. Natural elimination of volatile halogenated hydrocarbons from the environment

    Energy Technology Data Exchange (ETDEWEB)

    Harress, H.M.; Grathwohl, P.; Torunski, H.

    1987-01-01

    Recently conducted field investigations of groundwater contamination with volatile halogenated hydrocarbons have shown evidence of natural elimination of these hazardous substances. This elimination effect is rare and is observed in connection with special geological conditions. With regard to some contaminated sites, the following mechanisms for this behaviour are discussed: 1. Stripping by naturally ascending gases. 2. Sorption on soil organic matter. 3. Biodegradation. The knowledge compiled so far has allowed further research programmes to be developed, which are being pursued in various projects.

  18. Lacunarity Elimination in the Translation of Nonequivalent Juridical Terms

    Directory of Open Access Journals (Sweden)

    Vladimir A. Lazarev

    2017-06-01

    In the article, the authors address the problem of lacunarity elimination in legal texts reflecting nationally specific features of the legal sphere. Various ways of filling gaps when translating from English into Russian are suggested, using legal-dogmatic theoretical texts as an example. Different variants of lacunae elimination are proposed which take into account the peculiarities of this type of text and can assist the work of a legal translator.

  19. Which structural rules admit cut elimination? An algebraic criterion

    OpenAIRE

    Terui, Kazushige

    2007-01-01

    Consider a general class of structural inference rules such as exchange, weakening, contraction and their generalizations. Among them, some are harmless but others do harm to cut elimination. Hence it is natural to ask under which condition cut elimination is preserved when a set of structural rules is added to a structure-free logic. The aim of this work is to give such a condition by using algebraic semantics. ¶ We consider full Lambek calculus (FL), i.e., intuitioni...

  20. Neural classifiers for learning higher-order correlations

    International Nuclear Information System (INIS)

    Gueler, M.

    1999-01-01

    Studies by various authors suggest that higher-order networks can be more powerful and biologically more plausible than the more traditional multilayer networks. These architectures make explicit use of nonlinear interactions between input variables in the form of higher-order units or product units. If it is known a priori that the problem to be implemented possesses a given set of invariances, as in translation-, rotation-, and scale-invariant recognition problems, those invariances can be encoded, thus eliminating all higher-order terms which are incompatible with the invariances. In general, however, it is a serious setback that the complexity of learning increases exponentially with the size of the inputs. This paper reviews higher-order networks and introduces an implicit representation in which learning complexity is mainly decided by the number of higher-order terms to be learned and increases only linearly with the input size.

  1. Neural Classifiers for Learning Higher-Order Correlations

    Science.gov (United States)

    Güler, Marifi

    1999-01-01

    Studies by various authors suggest that higher-order networks can be more powerful and are biologically more plausible with respect to the more traditional multilayer networks. These architectures make explicit use of nonlinear interactions between input variables in the form of higher-order units or product units. If it is known a priori that the problem to be implemented possesses a given set of invariances like in the translation, rotation, and scale invariant pattern recognition problems, those invariances can be encoded, thus eliminating all higher-order terms which are incompatible with the invariances. In general, however, it is a serious set-back that the complexity of learning increases exponentially with the size of inputs. This paper reviews higher-order networks and introduces an implicit representation in which learning complexity is mainly decided by the number of higher-order terms to be learned and increases only linearly with the input size.
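
    As a toy illustration of the higher-order units mentioned in both records, the NumPy sketch below contrasts an ordinary weighted-sum unit with a product unit that represents a multiplicative interaction between inputs directly; it is a schematic example, not the implicit representation introduced in the paper.

    ```python
    # Toy comparison of a first-order (sigma) unit and a higher-order (product) unit.
    # Schematic NumPy sketch only, not the representation proposed in the paper.
    import numpy as np

    def sigma_unit(x, w, b=0.0):
        """Ordinary first-order unit: nonlinearity applied to a weighted sum."""
        return np.tanh(w @ x + b)

    def product_unit(x, p):
        """Higher-order (product) unit: inputs combined multiplicatively, so one
        unit can represent an n-th order correlation such as x1 * x2 * x3**2."""
        return np.prod(np.power(x, p))

    x = np.array([0.9, 0.8, 1.1])
    print(sigma_unit(x, np.array([0.5, -0.2, 0.3])))
    print(product_unit(x, np.array([1.0, 1.0, 2.0])))   # captures x1*x2*x3^2 directly
    ```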

  2. Health information system strengthening and malaria elimination in Papua New Guinea.

    Science.gov (United States)

    Rosewell, Alexander; Makita, Leo; Muscatello, David; John, Lucy Ninmongo; Bieb, Sibauk; Hutton, Ross; Ramamurthy, Sundar; Shearman, Phil

    2017-07-05

    The objective of the study was to describe an m-health initiative to strengthen malaria surveillance in a 184-health-facility, multi-province project aimed at strengthening the National Health Information System (NHIS) in a country with fragmented malaria surveillance that is striving towards enhanced control and pre-elimination. A remote-loading mobile application and secure online platform for health professionals was created to interface with the new system (eNHIS). A case-based malaria testing register was developed, integrating geo-coded households, villages and health facilities. A malaria programme management dashboard was created, with village-level malaria mapping tools and statistical algorithms to identify malaria outbreaks. Since its inception in 2015, 160,750 malaria testing records, including village of residence, have been reported to the eNHIS. These case-based, geo-coded malaria data are 100% complete, with a median data entry delay of 9 days from the date of testing. The system maps malaria to the village level in near real-time, as well as the availability of treatment and diagnostics at health-facility level. Data aggregation, analysis, outbreak detection, and reporting are automated. The study demonstrates that using mobile technologies and GIS in the capture and reporting of NHIS data in Papua New Guinea provides the timely, high-quality, geo-coded, case-based malaria data required for malaria elimination. The health systems strengthening approach of integrating malaria information management into the eNHIS optimizes sustainability and provides enormous flexibility to cater for future malaria programme needs.
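
    The record states that the eNHIS applies statistical algorithms for outbreak detection but does not specify them. The sketch below shows one generic approach (flagging weeks whose case count exceeds the historical mean plus two standard deviations); this is an assumed illustration, not the algorithm used in Papua New Guinea.

    ```python
    # Generic weekly-threshold outbreak flagging: an assumed illustration,
    # not the statistical algorithm implemented in the eNHIS dashboard.
    import numpy as np

    def flag_outbreak_weeks(weekly_cases, baseline_weeks=8, z=2.0):
        """Return indices of weeks whose count exceeds mean + z*sd of the preceding baseline."""
        cases = np.asarray(weekly_cases, dtype=float)
        flagged = []
        for i in range(baseline_weeks, len(cases)):
            baseline = cases[i - baseline_weeks:i]
            threshold = baseline.mean() + z * baseline.std(ddof=1)
            if cases[i] > threshold:
                flagged.append(i)
        return flagged

    print(flag_outbreak_weeks([12, 9, 11, 10, 13, 8, 12, 11, 30, 14]))  # -> [8]
    ```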

  3. A deep learning method for classifying mammographic breast density categories.

    Science.gov (United States)

    Mohamed, Aly A; Berg, Wendie A; Peng, Hong; Luo, Yahong; Jankowitz, Rachel C; Wu, Shandong

    2018-01-01

    Mammographic breast density is an established risk marker for breast cancer and is visually assessed by radiologists in routine mammogram image reading, using four qualitative Breast Imaging and Reporting Data System (BI-RADS) breast density categories. It is particularly difficult for radiologists to consistently distinguish the two most common and most variably assigned BI-RADS categories, i.e., "scattered density" and "heterogeneously dense". The aim of this work was to investigate a deep learning-based breast density classifier to consistently distinguish these two categories, aiming at providing a potential computerized tool to assist radiologists in assigning a BI-RADS category in current clinical workflow. In this study, we constructed a convolutional neural network (CNN)-based model coupled with a large (i.e., 22,000 images) digital mammogram imaging dataset to evaluate the classification performance between the two aforementioned breast density categories. All images were collected from a cohort of 1,427 women who underwent standard digital mammography screening from 2005 to 2016 at our institution. The truths of the density categories were based on standard clinical assessment made by board-certified breast imaging radiologists. Effects of direct training from scratch solely using digital mammogram images and transfer learning of a pretrained model on a large nonmedical imaging dataset were evaluated for the specific task of breast density classification. In order to measure the classification performance, the CNN classifier was also tested on a refined version of the mammogram image dataset by removing some potentially inaccurately labeled images. Receiver operating characteristic (ROC) curves and the area under the curve (AUC) were used to measure the accuracy of the classifier. The AUC was 0.9421 when the CNN-model was trained from scratch on our own mammogram images, and the accuracy increased gradually along with an increased size of training samples

  4. Research activities by INS cyclotron facility

    International Nuclear Information System (INIS)

    1992-06-01

    Research activities made by the cyclotron facility and the related apparatuses at Institute for Nuclear Study (INS), University of Tokyo, have been reviewed in terms of the associated scientific publications. This publication list, which is to be read as a continuation of INS-Rep.-608 (October, 1986), includes experimental works on low-energy nuclear physics, accelerator technology, instrumental developments, radiation physics and other applications in interdisciplinary fields. The publications are classified into the following four categories. (A) : Internal reports published in INS. (B) : Publications in international scientific journals on experimental research works done by the cyclotron facility and the related apparatuses at INS. Those made by outside users are also included. (C) : Publications in international scientific journals on experimental low-energy nuclear physics, which have been done by the staff of INS Nuclear Physics Division using facilities outside INS. (D) : Contributions to international conferences. (author)

  5. Least Square Support Vector Machine Classifier vs a Logistic Regression Classifier on the Recognition of Numeric Digits

    Directory of Open Access Journals (Sweden)

    Danilo A. López-Sarmiento

    2013-11-01

    This paper compares the performance of a multi-class least squares support vector machine (LSSVM mc) against a multi-class logistic regression classifier on the problem of recognizing handwritten numeric digits (0-9). For the comparison, a data set consisting of 5000 images of handwritten numeric digits was used (500 images for each digit from 0 to 9), each image of 20 x 20 pixels. The inputs to each of the systems were 400-dimensional vectors corresponding to each image (no feature extraction was performed). Both classifiers used a OneVsAll strategy to enable multi-class classification and a random cross-validation function for the process of minimizing the cost function. The comparison metrics were precision and training time under the same computational conditions. Both techniques showed a precision above 95%, with LS-SVM slightly more accurate. In computational cost, however, a marked difference was found: LS-SVM training requires 16.42% less time than the logistic regression model under the same low computational conditions.
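
    A rough analogue of the comparison can be run with scikit-learn, as sketched below. Note the assumptions: scikit-learn ships no least-squares SVM, so an ordinary SVC serves as a stand-in, and the 5000-image 20 x 20 pixel dataset is replaced by the library's small built-in digits set.

    ```python
    # One-vs-all logistic regression vs an SVM on a small digit-recognition task.
    # SVC is a stand-in for LS-SVM (not available in scikit-learn), and load_digits()
    # replaces the 5000-image 20x20 dataset used in the paper.
    from sklearn.datasets import load_digits
    from sklearn.model_selection import train_test_split
    from sklearn.multiclass import OneVsRestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.svm import SVC

    X, y = load_digits(return_X_y=True)        # 8x8 digit images flattened to 64 features
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    models = {
        "one-vs-all logistic regression": OneVsRestClassifier(LogisticRegression(max_iter=1000)),
        "SVM (stand-in for LS-SVM)": OneVsRestClassifier(SVC(kernel="rbf", gamma="scale")),
    }
    for name, model in models.items():
        model.fit(X_train, y_train)
        print(f"{name}: test accuracy = {model.score(X_test, y_test):.3f}")
    ```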

  6. 78 FR 48076 - Facility Security Clearance and Safeguarding of National Security Information and Restricted Data

    Science.gov (United States)

    2013-08-07

    ...-2011-0268] RIN 3150-AJ07 Facility Security Clearance and Safeguarding of National Security Information..., Classified National Security Information. The rule would allow licensees flexibility in determining the means... licensee security education and training programs and enhances the protection of classified information...

  7. Higher School Marketing Strategy Formation: Classifying the Factors

    Directory of Open Access Journals (Sweden)

    N. K. Shemetova

    2012-01-01

    Full Text Available The paper deals with the main trends of higher school management strategy formation. The author specifies the educational changes in the modern information society determining the strategy options. For each professional training level the author denotes the set of strategic factors affecting the educational service consumers and, therefore, the effectiveness of the higher school marketing. The given factors are classified from the stand-points of the providers and consumers of educational service (enrollees, students, graduates and postgraduates. The research methods include the statistic analysis and general methods of scientific analysis, synthesis, induction, deduction, comparison, and classification. The author is convinced that the university management should develop the necessary prerequisites for raising the graduates’ competitiveness in the labor market, and stimulate the active marketing policies of the relating subdivisions and departments. In author’s opinion, the above classification of marketing strategy factors can be used as the system of values for educational service providers. 

  8. An automated approach to the design of decision tree classifiers

    Science.gov (United States)

    Argentiero, P.; Chin, R.; Beaudet, P.

    1982-01-01

    An automated technique is presented for designing effective decision tree classifiers predicated only on a priori class statistics. The procedure relies on linear feature extractions and Bayes table look-up decision rules. Associated error matrices are computed and utilized to provide an optimal design of the decision tree at each so-called 'node'. A by-product of this procedure is a simple algorithm for computing the global probability of correct classification assuming the statistical independence of the decision rules. Attention is given to a more precise definition of decision tree classification, the mathematical details on the technique for automated decision tree design, and an example of a simple application of the procedure using class statistics acquired from an actual Landsat scene.
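
    The abstract mentions a simple algorithm for the global probability of correct classification under the assumption of statistically independent decision rules. One common way to express this (an assumption here, not necessarily the paper's exact formulation) is a prior-weighted product of the per-node correct-decision probabilities along each class path, as sketched below with made-up numbers.

    ```python
    # Prior-weighted product rule for a decision tree's global probability of
    # correct classification, assuming independent node decisions.  The numbers
    # are illustrative, not taken from the paper.
    def global_correct_probability(class_paths, priors):
        """class_paths: {class: [p_correct_at_node_1, p_correct_at_node_2, ...]}
           priors:      {class: prior probability}"""
        total = 0.0
        for cls, path_probs in class_paths.items():
            p_path = 1.0
            for p in path_probs:
                p_path *= p                 # product along the path to this class
            total += priors[cls] * p_path   # weight by class prior
        return total

    paths = {"water": [0.98], "forest": [0.95, 0.90], "urban": [0.95, 0.88]}
    priors = {"water": 0.2, "forest": 0.5, "urban": 0.3}
    print(global_correct_probability(paths, priors))
    ```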

  9. A robust dataset-agnostic heart disease classifier from Phonocardiogram.

    Science.gov (United States)

    Banerjee, Rohan; Dutta Choudhury, Anirban; Deshpande, Parijat; Bhattacharya, Sakyajit; Pal, Arpan; Mandana, K M

    2017-07-01

    Automatic classification of normal and abnormal heart sounds is a popular area of research. However, building a robust algorithm unaffected by signal quality and patient demography is a challenge. In this paper we have analysed a wide list of Phonocardiogram (PCG) features in time and frequency domain along with morphological and statistical features to construct a robust and discriminative feature set for dataset-agnostic classification of normal and cardiac patients. The large and open access database, made available in Physionet 2016 challenge was used for feature selection, internal validation and creation of training models. A second dataset of 41 PCG segments, collected using our in-house smart phone based digital stethoscope from an Indian hospital was used for performance evaluation. Our proposed methodology yielded sensitivity and specificity scores of 0.76 and 0.75 respectively on the test dataset in classifying cardiovascular diseases. The methodology also outperformed three popular prior art approaches, when applied on the same dataset.

  10. Business process modeling for processing classified documents using RFID technology

    Directory of Open Access Journals (Sweden)

    Koszela Jarosław

    2016-01-01

    Full Text Available The article outlines the application of the processing approach to the functional description of the designed IT system supporting the operations of the secret office, which processes classified documents. The article describes the application of the method of incremental modeling of business processes according to the BPMN model to the description of the processes currently implemented (“as is” in a manual manner and target processes (“to be”, using the RFID technology for the purpose of their automation. Additionally, the examples of applying the method of structural and dynamic analysis of the processes (process simulation to verify their correctness and efficiency were presented. The extension of the process analysis method is a possibility of applying the warehouse of processes and process mining methods.

  11. The Motivation of Betrayal by Leaking of Classified Information

    Directory of Open Access Journals (Sweden)

    Lăzăroiu Laurențiu-Leonard

    2017-03-01

    Forecasting human behavior requires knowledge of motivational theories applicable to the profile of each organization and, in particular, to each individual's style. Anticipating personal attitudes is not aimed only at passive monitoring of professional activity, but also at improving risk avoidance in accordance with a specific organizational environment. The emergence and development of motivational forms and values whose projections determine social crimes are risk factors, affecting the professional activity of the person as well as the performance and stability of the institution. Moreover, if the motivation produces attitudes aimed at compromising classified information, the resulting actions may be considered threats to national security. The prevention of such threats can only be achieved by understanding the motivational mechanisms and the external conditions of the personnel that make it possible for some intentions to turn into real actions.

  12. Using point-set compression to classify folk songs

    DEFF Research Database (Denmark)

    Meredith, David

    2014-01-01

    -neighbour algorithm and leave-one-out cross-validation to classify the 360 melodies into tune families. The classifications produced by the algorithms were compared with a ground-truth classification prepared by expert musicologists. Twelve of the thirteen compressors used in the experiment were based...... compared. The highest classification success rate of 77–84% was achieved by COSIATEC, followed by 60–64% for Forth’s algorithm and then 52–58% for SIATECCompress. When the NCDs were calculated using bzip2, the success rate was only 12.5%. The results demonstrate that the effectiveness of NCD for measuring...... similarity between folk-songs for classification purposes is highly dependent upon the actual compressor chosen. Furthermore, it seems that compressors based on finding maximal repeated patterns in point-set representations of music show more promise for NCD-based music classification than general...
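
    The normalised compression distance (NCD) underlying the comparison can be computed directly from any compressor. The sketch below uses bzip2, the general-purpose baseline mentioned in the record; the point-set compressors (COSIATEC, SIATECCompress, Forth's algorithm) are not reproduced, and the byte-string 'melodies' are toy stand-ins for real point-set encodings.

    ```python
    # Normalised Compression Distance with bzip2:
    # NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)).
    # The point-set compressors from the study are not reproduced here.
    import bz2

    def c(data: bytes) -> int:
        return len(bz2.compress(data))

    def ncd(x: bytes, y: bytes) -> float:
        cx, cy, cxy = c(x), c(y), c(x + y)
        return (cxy - min(cx, cy)) / max(cx, cy)

    # Toy "melodies" as byte strings; a 1-nearest-neighbour classifier would assign
    # a query to the tune family of the training melody with the smallest NCD.
    mel_a = b"C D E F G A B C " * 20
    mel_b = b"C D E F G A B C " * 19 + b"C D E F G G G C "
    print(round(ncd(mel_a, mel_b), 3))
    ```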

  13. Sex Bias in Classifying Borderline and Narcissistic Personality Disorder.

    Science.gov (United States)

    Braamhorst, Wouter; Lobbestael, Jill; Emons, Wilco H M; Arntz, Arnoud; Witteman, Cilia L M; Bekker, Marrie H J

    2015-10-01

    This study investigated sex bias in the classification of borderline and narcissistic personality disorders. A sample of psychologists in training for a post-master degree (N = 180) read brief case histories (male or female version) and made DSM classification. To differentiate sex bias due to sex stereotyping or to base rate variation, we used different case histories, respectively: (1) non-ambiguous case histories with enough criteria of either borderline or narcissistic personality disorder to meet the threshold for classification, and (2) an ambiguous case with subthreshold features of both borderline and narcissistic personality disorder. Results showed significant differences due to sex of the patient in the ambiguous condition. Thus, when the diagnosis is not straightforward, as in the case of mixed subthreshold features, sex bias is present and is influenced by base-rate variation. These findings emphasize the need for caution in classifying personality disorders, especially borderline or narcissistic traits.

  14. Fisher information metrics for binary classifier evaluation and training

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    Different evaluation metrics for binary classifiers are appropriate to different scientific domains and even to different problems within the same domain. This presentation focuses on the optimisation of event selection to minimise statistical errors in HEP parameter estimation, a problem that is best analysed in terms of the maximisation of Fisher information about the measured parameters. After describing a general formalism to derive evaluation metrics based on Fisher information, three more specific metrics are introduced for the measurements of signal cross sections in counting experiments (FIP1) or distribution fits (FIP2) and for the measurements of other parameters from distribution fits (FIP3). The FIP2 metric is particularly interesting because it can be derived from any ROC curve, provided that prevalence is also known. In addition to its relation to measurement errors when used as an evaluation criterion (which makes it more interesting than the ROC AUC), a further advantage of the FIP2 metric is ...

  15. Multivariate analysis of quantitative traits can effectively classify rapeseed germplasm

    Directory of Open Access Journals (Sweden)

    Jankulovska Mirjana

    2014-01-01

    In this study, the use of different multivariate approaches to classify rapeseed genotypes based on quantitative traits is presented. Tree regression analysis, PCA analysis and two-way cluster analysis were applied in order to describe and understand the extent of genetic variability in spring rapeseed genotype-by-trait data. The traits which highly influenced seed and oil yield in rapeseed were successfully identified by the tree regression analysis. The principal predictor for both response variables was the number of pods per plant (NP). NP and 1000-seed weight could help in the selection of high-yielding genotypes. High values for both traits and for oil content could lead to high oil-yielding genotypes. These traits may serve as indirect selection criteria and can lead to improvement of seed and oil yield in rapeseed. Quantitative traits that explained most of the variability in the studied germplasm were classified using principal component analysis. In this data set, five PCs were identified, of which the first three explained 63% of the total variance. This helped in facilitating the choice of variables on which the genotypes' clustering could be based. The two-way cluster analysis simultaneously clustered genotypes and quantitative traits. The final number of clusters was determined using a bootstrapping technique. This approach provided a clear overview of the variability of the analyzed genotypes. Genotypes that have similar performance regarding the traits included in this study can be easily detected on the heatmap. Genotypes grouped in clusters 1 and 8 had high values for seed and oil yield and a relatively short vegetative growth duration period, and those in cluster 9 combined moderate to low values for vegetative growth duration with moderate to high seed and oil yield. These genotypes should be further exploited and implemented in the rapeseed breeding program. The combined application of these multivariate methods

  16. Classifying and Visualising Roman Pottery using Computer-scanned Typologies

    Directory of Open Access Journals (Sweden)

    Jacqueline Christmas

    2018-05-01

    For many archaeological assemblages and type-series, accurate drawings of standardised pottery vessels have been recorded in consistent styles. This provides the opportunity to extract individual pot drawings and derive from them data that can be used for analysis and visualisation. Starting from PDF scans of the original pages of pot drawings, we have automated much of the process for locating, defining the boundaries, extracting and orientating each individual pot drawing. From these processed images, basic features such as width and height, the volume of the interior, the edges, and the shape of the cross-section outline are extracted and are then used to construct more complex features such as a measure of a pot's 'circularity'. Capturing these traits opens up new possibilities for (a) classifying vessel form in a way that is sensitive to the physical characteristics of pots relative to other vessels in an assemblage, and (b) visualising the results of quantifying assemblages using standard typologies. A frequently encountered problem when trying to compare pottery from different archaeological sites is that the pottery is classified into forms and labels using different standards. With a set of data from early Roman urban centres and related sites that has been labelled both with forms (e.g. 'platter' and 'bowl') and shape identifiers (based on the Camulodunum type-series), we use the extracted features from images to look both at how the pottery forms cluster for a given set of features, and at how the features may be used to compare finds from different sites.
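
    The record mentions a derived 'circularity' feature without defining it. A standard choice, assumed here purely for illustration, is the isoperimetric ratio 4*pi*Area/Perimeter^2 of an outline, which equals 1 for a circle and is smaller for any other shape; the sketch below computes it for a polygonal outline.

    ```python
    # Isoperimetric circularity of a closed outline: 4*pi*Area / Perimeter**2.
    # An assumed illustration, not necessarily the measure used by the authors.
    import math

    def polygon_area_perimeter(points):
        """Shoelace area and perimeter of a closed polygon given as (x, y) vertices."""
        area, perim = 0.0, 0.0
        n = len(points)
        for i in range(n):
            x1, y1 = points[i]
            x2, y2 = points[(i + 1) % n]
            area += x1 * y2 - x2 * y1
            perim += math.hypot(x2 - x1, y2 - y1)
        return abs(area) / 2.0, perim

    def circularity(points):
        area, perim = polygon_area_perimeter(points)
        return 4.0 * math.pi * area / perim ** 2

    square = [(0, 0), (1, 0), (1, 1), (0, 1)]
    print(round(circularity(square), 3))   # ~0.785 for a square; 1.0 for a circle
    ```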

  17. Deep Learning to Classify Radiology Free-Text Reports.

    Science.gov (United States)

    Chen, Matthew C; Ball, Robyn L; Yang, Lingyao; Moradzadeh, Nathaniel; Chapman, Brian E; Larson, David B; Langlotz, Curtis P; Amrhein, Timothy J; Lungren, Matthew P

    2018-03-01

    Purpose To evaluate the performance of a deep learning convolutional neural network (CNN) model compared with a traditional natural language processing (NLP) model in extracting pulmonary embolism (PE) findings from thoracic computed tomography (CT) reports from two institutions. Materials and Methods Contrast material-enhanced CT examinations of the chest performed between January 1, 1998, and January 1, 2016, were selected. Annotations by two human radiologists were made for three categories: the presence, chronicity, and location of PE. Classification of performance of a CNN model with an unsupervised learning algorithm for obtaining vector representations of words was compared with the open-source application PeFinder. Sensitivity, specificity, accuracy, and F1 scores for both the CNN model and PeFinder in the internal and external validation sets were determined. Results The CNN model demonstrated an accuracy of 99% and an area under the curve value of 0.97. For internal validation report data, the CNN model had a statistically significant larger F1 score (0.938) than did PeFinder (0.867) when classifying findings as either PE positive or PE negative, but no significant difference in sensitivity, specificity, or accuracy was found. For external validation report data, no statistical difference between the performance of the CNN model and PeFinder was found. Conclusion A deep learning CNN model can classify radiology free-text reports with accuracy equivalent to or beyond that of an existing traditional NLP model. © RSNA, 2017 Online supplemental material is available for this article.
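
    The study reports sensitivity, specificity, accuracy and F1 scores for both models. The sketch below only shows how those metrics follow from a 2 x 2 confusion matrix; the counts used are invented for illustration and are not taken from the paper.

    ```python
    # Sensitivity, specificity, accuracy and F1 from a 2x2 confusion matrix.
    # The counts below are made up for illustration, not the study's data.
    def binary_metrics(tp, fp, tn, fn):
        sensitivity = tp / (tp + fn)            # recall for the PE-positive class
        specificity = tn / (tn + fp)
        accuracy = (tp + tn) / (tp + fp + tn + fn)
        precision = tp / (tp + fp)
        f1 = 2 * precision * sensitivity / (precision + sensitivity)
        return dict(sensitivity=sensitivity, specificity=specificity,
                    accuracy=accuracy, f1=f1)

    print(binary_metrics(tp=180, fp=12, tn=880, fn=20))
    ```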

  18. Immunohistochemical analysis of breast tissue microarray images using contextual classifiers

    Directory of Open Access Journals (Sweden)

    Stephen J McKenna

    2013-01-01

    Full Text Available Background: Tissue microarrays (TMAs are an important tool in translational research for examining multiple cancers for molecular and protein markers. Automatic immunohistochemical (IHC scoring of breast TMA images remains a challenging problem. Methods: A two-stage approach that involves localization of regions of invasive and in-situ carcinoma followed by ordinal IHC scoring of nuclei in these regions is proposed. The localization stage classifies locations on a grid as tumor or non-tumor based on local image features. These classifications are then refined using an auto-context algorithm called spin-context. Spin-context uses a series of classifiers to integrate image feature information with spatial context information in the form of estimated class probabilities. This is achieved in a rotationally-invariant manner. The second stage estimates ordinal IHC scores in terms of the strength of staining and the proportion of nuclei stained. These estimates take the form of posterior probabilities, enabling images with uncertain scores to be referred for pathologist review. Results: The method was validated against manual pathologist scoring on two nuclear markers, progesterone receptor (PR and estrogen receptor (ER. Errors for PR data were consistently lower than those achieved with ER data. Scoring was in terms of estimated proportion of cells that were positively stained (scored on an ordinal scale of 0-6 and perceived strength of staining (scored on an ordinal scale of 0-3. Average absolute differences between predicted scores and pathologist-assigned scores were 0.74 for proportion of cells and 0.35 for strength of staining (PR. Conclusions: The use of context information via spin-context improved the precision and recall of tumor localization. The combination of the spin-context localization method with the automated scoring method resulted in reduced IHC scoring errors.

  19. Progress Toward Regional Measles Elimination - Worldwide, 2000-2016.

    Science.gov (United States)

    Dabbagh, Alya; Patel, Minal K; Dumolard, Laure; Gacic-Dobo, Marta; Mulders, Mick N; Okwo-Bele, Jean-Marie; Kretsinger, Katrina; Papania, Mark J; Rota, Paul A; Goodson, James L

    2017-10-27

    The fourth United Nations Millennium Development Goal, adopted in 2000, set a target to reduce child mortality by two thirds by 2015. One indicator of progress toward this target was measles vaccination coverage (1). In 2010, the World Health Assembly (WHA) set three milestones for measles control by 2015: 1) increase routine coverage with the first dose of a measles-containing vaccine (MCV1) among children aged 1 year to ≥90% at the national level and to ≥80% in every district; 2) reduce global annual measles incidence to less than five cases per million population; and 3) reduce global measles mortality by 95% from the 2000 estimate (2).* In 2012, WHA endorsed the Global Vaccine Action Plan,† with the objective of eliminating measles in four World Health Organization (WHO) regions by 2015 and in five regions by 2020. Countries in all six WHO regions have adopted goals for measles elimination by or before 2020. Measles elimination is defined as the absence of endemic measles virus transmission in a region or other defined geographic area for ≥12 months, in the presence of a high quality surveillance system that meets targets of key performance indicators. This report updates a previous report (3) and describes progress toward global measles control milestones and regional measles elimination goals during 2000-2016. During this period, annual reported measles incidence decreased 87%, from 145 to 19 cases per million persons, and annual estimated measles deaths decreased 84%, from 550,100 to 89,780; measles vaccination prevented an estimated 20.4 million deaths. However, the 2015 milestones have not yet been met; only one WHO region has been verified as having eliminated measles. Improved implementation of elimination strategies by countries and their partners is needed, with focus on increasing vaccination coverage through substantial and sustained additional investments in health systems, strengthening surveillance systems, using surveillance data to drive programmatic actions, securing political commitment, and raising

  20. Elimination of copper in tissues and organs of rainbow trout

    Directory of Open Access Journals (Sweden)

    Gaye Dogan

    2011-01-01

    Copper (Cu) elimination was investigated in the tissues and organs of rainbow trout (Oncorhynchus mykiss, Walbaum, 1792) after exposure to Cu-free diets. In the current study, fish were fed to satiation on diets containing 0.022 (Group 1; Control), 0.043 (Group 2), 0.123 (Group 3) and 0.424 (Group 4) g Cu*kg-1 diet for 60 days before the elimination experiment. A total of 288 fish (mean weight 84.28±1.05 g) were randomly transferred to 12 fibreglass tanks. The fish were fed the Cu-free diet twice daily, until apparent satiation, for 60 days. Subsequently, the experiment was established for a period of elimination, during which samples were taken at days 15, 30, 45 and 60. Cu concentrations in the muscle, gill tissue, digestive system, liver and whole body of the fish were determined after 60 days of depuration. Cu concentrations in the tissues of rainbow trout decreased during the depuration period, and the order of Cu elimination in the tissues and organs was: digestive system (73.1%), then gill (41.1%), muscle (31.5%) and liver (17.2%) for group 2; digestive system (74.1%), then muscle (65.8%), gill (60.0%) and liver (34.6%) for group 3; and digestive system (85.8%), then muscle (80.8%), liver (50.5%) and less/equal in gill (50.2%) for group 4. In the statistical analysis, both group and time were significant factors (P < 0.05) for elimination rate. Moreover, a significant interaction between group and time was identified for elimination rate. The digestive system showed the fastest elimination rates of Cu in all groups compared with the other tissues.

  1. An integrated lean-methods approach to hospital facilities redesign.

    Science.gov (United States)

    Nicholas, John

    2012-01-01

    Lean production methods for eliminating waste and improving processes in manufacturing are now being applied in healthcare. As the author shows, the methods are appropriate for redesigning hospital facilities. When used in an integrated manner and employing teams of mostly clinicians, the methods produce facility designs that are custom-fit to patient needs and caregiver work processes, and reduce operational costs. The author reviews lean methods and an approach for integrating them in the redesign of hospital facilities. A case example of the redesign of an emergency department shows the feasibility and benefits of the approach.

  2. Primary chromatic aberration elimination via optimization work with genetic algorithm

    Science.gov (United States)

    Wu, Bo-Wen; Liu, Tung-Kuan; Fang, Yi-Chin; Chou, Jyh-Horng; Tsai, Hsien-Lin; Chang, En-Hao

    2008-09-01

    Chromatic aberration plays a part in modern optical systems, especially in digitalized and smart optical systems. Much effort has been devoted to eliminating specific chromatic aberration in order to match the demand for advanced digitalized optical products. Basically, the elimination of axial chromatic and lateral color aberration of an optical lens and system depends on the selection of optical glass. According to reports from glass companies all over the world, the number of various newly developed optical glasses in the market exceeds three hundred. However, due to the complexity of a practical optical system, optical designers have so far had difficulty in finding the right solution to eliminate small axial and lateral chromatic aberration except by the Damped Least Squares (DLS) method, which is limited in so far as the DLS method has not yet managed to find a better optical system configuration. In the present research, genetic algorithms are used to replace traditional DLS so as to eliminate axial and lateral chromatic aberration, by combining the theories of geometric optics in Tessar type lenses and a technique involving Binary/Real Encoding, Multiple Dynamic Crossover and Random Gene Mutation to find a much better configuration for optical glasses. By implementing the algorithms outlined in this paper, satisfactory results can be achieved in eliminating axial and lateral color aberration.
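
    The following toy genetic algorithm shows the general selection-crossover-mutation loop applied to choosing catalogue glasses for a small number of lens elements. The merit function is an arbitrary placeholder rather than a chromatic-aberration model, and the operators are a simplification of the Binary/Real encoding, multiple dynamic crossover and random gene mutation named in the record.

    ```python
    # Toy genetic algorithm for assigning catalogue glasses to lens elements.
    # The "aberration" merit function is a PLACEHOLDER, not an optical model, and the
    # operators simplify the paper's Binary/Real encoding and dynamic crossover.
    import random

    GLASSES = list(range(20))          # 20 hypothetical catalogue glasses
    N_ELEMENTS = 4                     # lens elements that each need a glass

    def aberration(genome):            # placeholder merit: lower is better
        return sum((g - 9.5) ** 2 for g in genome)

    def evolve(pop_size=30, generations=50, mutation_rate=0.1):
        pop = [[random.choice(GLASSES) for _ in range(N_ELEMENTS)] for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=aberration)                       # rank by merit
            survivors = pop[: pop_size // 2]
            children = []
            while len(children) < pop_size - len(survivors):
                a, b = random.sample(survivors, 2)
                cut = random.randrange(1, N_ELEMENTS)      # one-point crossover
                child = a[:cut] + b[cut:]
                if random.random() < mutation_rate:        # random gene mutation
                    child[random.randrange(N_ELEMENTS)] = random.choice(GLASSES)
                children.append(child)
            pop = survivors + children
        return min(pop, key=aberration)

    best = evolve()
    print(best, aberration(best))
    ```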

  3. Rabies elimination research: juxtaposing optimism, pragmatism and realism.

    Science.gov (United States)

    Cleaveland, Sarah; Hampson, Katie

    2017-12-20

    More than 100 years of research has now been conducted into the prevention, control and elimination of rabies with safe and highly efficacious vaccines developed for use in human and animal populations. Domestic dogs are a major reservoir for rabies, and although considerable advances have been made towards the elimination and control of canine rabies in many parts of the world, the disease continues to kill tens of thousands of people every year in Africa and Asia. Policy efforts are now being directed towards a global target of zero human deaths from dog-mediated rabies by 2030 and the global elimination of canine rabies. Here we demonstrate how research provides a cause for optimism as to the feasibility of these goals through strategies based around mass dog vaccination. We summarize some of the pragmatic insights generated from rabies epidemiology and dog ecology research that can improve the design of dog vaccination strategies in low- and middle-income countries and which should encourage implementation without further delay. We also highlight the need for realism in reaching the feasible, although technically more difficult and longer-term goal of global elimination of canine rabies. Finally, we discuss how research on rabies has broader relevance to the control and elimination of a suite of diseases of current concern to human and animal health, providing an exemplar of the value of a 'One Health' approach. © 2017 The Authors.

  4. State plans to force companies to eliminate environmental burdens

    International Nuclear Information System (INIS)

    Marcan, P.

    2004-01-01

    The Ministry of Environment is preparing legislation aimed at forcing the state and especially private enterprises to map and eliminate tips, refuse from company premises and farmyards, and manure heaps. It is expected that the main burden will fall on private enterprises. The department is still working on the wording of this new Act on environmental burdens and so it is not yet clear whether it will be of assistance in the elimination of environmental burdens. The Ministry is aware that economic aspects must also be taken into account when exercising pressure on the companies. Closing down a company that cannot meet environmental criteria would result in redundancies and so the time schedule for the elimination of environmental burdens will be adjusted to fit the financial situation of the company involved. The ministry plans to first find companies responsible for environmental debts and then set a deadline for the preparation of a project to eliminate the environmental burden. The project would have to contain a description of elimination methods, in addition to a time schedule and cost assessment. If a private company does not report an environmental burden, the competent public authority will have the power to request access to the premises to undertake an inspection. (author)

  5. Lesotho - Health Facility Survey

    Data.gov (United States)

    Millennium Challenge Corporation — The main objective of the 2011 Health Facility Survey (HFS) was to establish a baseline for informing the Health Project performance indicators on health facilities,...

  6. Armament Technology Facility (ATF)

    Data.gov (United States)

    Federal Laboratory Consortium — The Armament Technology Facility is a 52,000 square foot, secure and environmentally-safe, integrated small arms and cannon caliber design and evaluation facility....

  7. Projectile Demilitarization Facilities

    Data.gov (United States)

    Federal Laboratory Consortium — The Projectile Wash Out Facility is US Army Ammunition Peculiar Equipment (APE 1300). It is a pilot scale wash out facility that uses high pressure water and steam...

  8. Rocketball Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — This test facility offers the capability to emulate and measure guided missile radar cross-section without requiring flight tests of tactical missiles. This facility...

  9. Materiel Evaluation Facility

    Data.gov (United States)

    Federal Laboratory Consortium — CRREL's Materiel Evaluation Facility (MEF) is a large cold-room facility that can be set up at temperatures ranging from −20°F to 120°F with a temperature change...

  10. Environmental Toxicology Research Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Fully-equipped facilities for environmental toxicology researchThe Environmental Toxicology Research Facility (ETRF) located in Vicksburg, MS provides over 8,200 ft...

  11. Dialysis Facility Compare

    Data.gov (United States)

    U.S. Department of Health & Human Services — Dialysis Facility Compare helps you find detailed information about Medicare-certified dialysis facilities. You can compare the services and the quality of care that...

  12. Energetics Conditioning Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Energetics Conditioning Facility is used for long term and short term aging studies of energetic materials. The facility has 10 conditioning chambers of which 2...

  13. Explosive Components Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The 98,000 square foot Explosive Components Facility (ECF) is a state-of-the-art facility that provides a full-range of chemical, material, and performance analysis...

  14. Facilities for US Radioastronomy.

    Science.gov (United States)

    Thaddeus, Patrick

    1982-01-01

    Discusses major developments in radioastronomy since 1945. Topics include proposed facilities, very-long-baseline interferometric array, millimeter-wave telescope, submillimeter-wave telescope, and funding for radioastronomy facilities and projects. (JN)

  15. Neighbourhood facilities for sustainability

    CSIR Research Space (South Africa)

    Gibberd, Jeremy T

    2013-01-01

    In this paper these are referred to as 'Neighbourhood Facilities for Sustainability'. Neighbourhood Facilities for Sustainability (NFS) are initiatives undertaken by individuals and communities to build local sustainable systems which not only improve...

  16. Cold Vacuum Drying Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Located near the K-Basins (see K-Basins link) in Hanford's 100 Area is a facility called the Cold Vacuum Drying Facility (CVDF).Between 2000 and 2004, workers at the...

  17. Ouellette Thermal Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Thermal Test Facility is a joint Army/Navy state-of-the-art facility (8,100 ft2) that was designed to:Evaluate and characterize the effect of flame and thermal...

  18. Integrated Disposal Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Located near the center of the 586-square-mile Hanford Site is the Integrated Disposal Facility, also known as the IDF.This facility is a landfill similar in concept...

  19. Facility design: introduction

    International Nuclear Information System (INIS)

    Unger, W.E.

    1980-01-01

    The design of shielded chemical processing facilities for handling plutonium is discussed. The TRU facility is considered in particular; its features for minimizing the escape of process materials are listed. 20 figures

  20. Facilities improvement for sustainability of existing public office ...

    African Journals Online (AJOL)

    The study examined the building design features of a cosmopolitan public office building in Abuja. The features were classified into Spatial Plan, Structure and Facilities, to determine which of the 3 variables requires urgent sustainable improvement from end-users' perspective in existing public office buildings in developing ...

  1. Classified model and characteristics of strategies at tourist companies

    Directory of Open Access Journals (Sweden)

    I.V. Saukh

    2017-12-01

    Full Text Available The research is devoted to the assessment of the scientific approaches to the identification of classification features of the strategy and its types distinguished in accordance with the mentioned features. The research object is the activities of tourist companies and this determines the choice of strategies typical for the tourism field. It is substantiated that the scientific approaches to the classification of strategies are various in specific literature because of obscurity in the strategy definition, vagueness and plurality of its classified features. Due to the current research the authors have improved the classified model of strategies for tourist companies that will result in making effective management decisions directed to the development of enterprise potential under conditions of unstable and unpredictable external environment. The paper singles out the peculiarities of functioning the tourism branch, which are the following : high sensitivity to the changes in external environment; the high level of competition in the field; dynamics and the lack of necessity for the use of «far-seeing» strategies; insufficiency of information provision for the application of traditional western models and matric methods of strategy development; time gap between obtaining the service and its consumption; a great number of intermediaries; seasonal swings in demands; the sudden shift of external environment caused by cyclicity, globalization, political decisions of separate countries and etc. The article shows essential differences in the development of financial strategies of small-scale enterprises and stock companies of tourist business. It is substantiated that small-scale enterprises develop strategies directed to a higher level of personal services, occupational competence, ability and experience in designing, the best knowledge of regional conditions and flexible decisions caused by the peculiarities of the received orders. Taking into

  2. A NEW WASTE CLASSIFYING MODEL: HOW WASTE CLASSIFICATION CAN BECOME MORE OBJECTIVE?

    Directory of Open Access Journals (Sweden)

    Burcea Stefan Gabriel

    2015-07-01

    The waste management specialist must be able to identify and analyze waste generation sources and to propose proper solutions to prevent waste generation and encourage waste minimisation. In certain situations, such as implementing an integrated waste management system and configuring waste collection methods and capacities, practitioners can face the challenge of classifying the generated waste. This tends to be all the more demanding as the literature does not provide a coherent system of criteria for an objective waste classification process. Waste incineration will no doubt lead to a different waste classification than waste composting or mechanical and biological treatment. In this case the main question is: what are the proper classification criteria which can be used to achieve an objective waste classification? The article provides a short critical literature review of the existing waste classification criteria and concludes that the literature cannot provide a unitary waste classification system which is unanimously accepted and adopted by ideologists and practitioners. There are various classification criteria and interesting perspectives in the literature regarding waste classification, but the most common criteria by which specialists classify waste into classes, categories and types are the generation source, physical and chemical features, aggregation state, origin or derivation, degree of hazard, etc. The traditional classification criteria divide waste into various categories, subcategories and types; such an approach is a conjectural one because it is inevitable that the criteria used will differ significantly according to the context in which the waste classification is required; hence the need to unify waste classification systems. For the first part of the article, an indirect observation research method was used, analyzing the literature and the various

  3. Passive elimination of static electricity in oil industry

    Directory of Open Access Journals (Sweden)

    Gaćanović Mićo

    2014-01-01

    This study explains the existing and realistic conditions for possible passive elimination of static electricity when loading oil and oil derivatives. We consider the formation and survival of gas bubbles both within the bulk of the oil and at the surface of oil and oil derivatives in a partly filled reservoir, as well as the formation of both volume and surface electric charge in oil and oil derivatives. The study presents research on the formation and survival of static electricity in reservoirs and tank trucks of different geometric shapes partly filled with oil and oil derivatives. We propose a new, original possibility for passive elimination of static electricity when loading oil and oil derivatives into reservoirs and tank trucks. The proposed passive device for elimination of static electricity is protected at the international level in the domain of intellectual property (with a patent, model and distinctive mark).

  4. Sliding Control with Chattering Elimination for Hydraulic Drives

    DEFF Research Database (Denmark)

    Schmidt, Lasse; Andersen, Torben Ole; Pedersen, Henrik C.

    2012-01-01

    This paper presents the development of a sliding mode control scheme with chattering elimination, generally applicable for position tracking control of electro-hydraulic valve-cylinder drives. The proposed control scheme requires only common data sheet information, no knowledge on load characteristics... A controller is developed for the control derivative based on a reduced order model. Simulation results demonstrate strong robustness when subjected to parameter perturbations and that chattering is eliminated.

  5. Risk Management - Variance Minimization or Lower Tail Outcome Elimination

    DEFF Research Database (Denmark)

    Aabo, Tom

    2002-01-01

    This paper illustrates the profound difference between a risk management strategy of variance minimization and a risk management strategy of lower tail outcome elimination. Risk managers concerned about the variability of cash flows will tend to center their hedge decisions on their best guess on future cash flows (the budget), while risk managers concerned about costly lower tail outcomes will hedge (considerably) less depending on the level of uncertainty. A risk management strategy of lower tail outcome elimination is in line with theoretical recommendations in a corporate value-adding perspective. A cross-case study of blue-chip industrial companies partly supports the empirical use of a risk management strategy of lower tail outcome elimination but does not exclude other factors from (co-)driving the observations.

  6. Hypothyroidism in patients after thyroid elimination by 131I

    International Nuclear Information System (INIS)

    Vana, S.; Nemec, J.; Reisenauer, R.

    1979-01-01

    Patients after elimination of the thyroid gland with radioiodine 131 I develop hypothyroidism only slowly, the peripheral parameters lagging behind the protein-bound iodine, especially until the fiftieth day after elimination. In young patients the Achilles tendon reflex and the preejection period lag behind symmetrically; in older patients the effect of the supply of thyroid hormones to the skeletal muscles disappears faster, whereas the heart retains the reserves of thyroid hormones, or of systems dependent on thyroid hormones affecting the rapidity of myocardial contraction, for a relatively longer period of time. Thus, in older patients after elimination of the thyroid gland with radioiodine 131 I the Achilles tendon reflex is a better criterion of hypothyroidism than the preejection period of heart contraction. (author)

  7. In situ measured elimination of Vibrio cholerae from brackish water.

    Science.gov (United States)

    Pérez, María Elena Martínez; Macek, Miroslav; Galván, María Teresa Castro

    2004-01-01

    In situ elimination of fluorescently labelled Vibrio cholerae (FLB) was measured in two saline water bodies in Mexico: a brackish water lagoon, Mecoacán (Gulf of Mexico; State of Tabasco), and an athalassohaline lake, Alchichica (State of Puebla). Disappearance rates of fluorescently labelled V. cholerae O1 showed that the bacteria were eliminated from the environment at average rates of 32% and 63% per day, respectively (based on the bacterial standing stocks). The indirect immunofluorescence method confirmed the presence of V. cholerae O1 in the lagoon. However, the elimination of FLB was not directly related either to the presence or absence of the bacterium in the water body or to the phytoplankton concentration.
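
    If the quoted disappearance rates are read as approximately first-order losses (an assumption; the record gives only %/day figures), they translate into rate constants and clearance times as follows.

      import math

      for site, daily_loss in (("Mecoacán lagoon", 0.32), ("Lake Alchichica", 0.63)):
          k = -math.log(1.0 - daily_loss)      # first-order rate constant, per day
          t99 = math.log(100.0) / k            # time until 99% of labelled cells are gone
          print(f"{site}: k = {k:.2f} per day, 99% elimination after {t99:.1f} days")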

  8. Soil-biofilters for elimination of xenobiotics from wastewaters

    DEFF Research Database (Denmark)

    Bester, Kai; Schäfer, D; Janzen, N.

    Wastewater treatment plants are not designed to eliminate xenobiotic compounds, and even more of these compounds are discharged by storm waters and combined sewer overflows. It is generally suggested that separating sewers into waste water and rainwater systems might help to improve the situation; however, in the last few years it has been demonstrated that storm waters can be heavily polluted with biocides, lubricants and PAHs. In this study we investigated the possibilities of eliminating lipophilic fragrances, bactericides, UV blockers, lubricants and similar compounds, as well as more hydrophilic organophosphate flame retardants, biocides and other components, with low-cost soil biofilter techniques suited for on-site decentralised treatment of storm waters and combined sewer overflows. The same systems and compounds were also tested for polishing treated waste water with respect to further elimination of xenobiotic compounds.

  9. Biologic phosphorus elimination - influencing parameters, boundary conditions, process optimation

    International Nuclear Information System (INIS)

    Dai Xiaohu.

    1992-01-01

    This paper first presents a systematic study of the basic process of biologic phosphorus elimination as employed by the original 'Phoredox (Main Stream) Process'. The conditions governing the process and the factors influencing its performance were determined in trial operation. A stationary model was developed for modelling biologic phosphorus elimination in such a main-stream process and for optimising the dimensioning. The validity of the model was confirmed by operational data from the literature and from the authors' own semi-technical-scale experimental plant. The model permits simulation of the effluent phosphorus and phosphate concentrations to be expected for given influent data and boundary conditions. It is thus possible to dimension a plant for the original Phoredox (Main Stream) Process or any similar phosphorus-eliminating plant that works according to the principle of the main-stream process. (orig./EF)
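
    A deliberately crude steady-state sketch of what such a stationary model does: predicting effluent phosphorus from influent data through a sludge-production mass balance. The coefficients below are plausible guesses, not the author's calibrated values.

      def effluent_p(influent_p, influent_bod, sludge_yield=0.45,
                     p_fraction=0.06, residual_p=0.1):
          # sludge_yield: g excess sludge per g BOD removed; p_fraction: phosphorus mass
          # fraction of phosphate-accumulating biomass; both are illustrative assumptions.
          p_uptake = sludge_yield * influent_bod * p_fraction   # mg P per litre removed with sludge
          return max(influent_p - p_uptake, residual_p)

      print(f"predicted effluent P: {effluent_p(8.0, 250.0):.2f} mg/l")   # -> 1.25 mg/l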

  10. Rayleigh Instability-Assisted Satellite Droplets Elimination in Inkjet Printing.

    Science.gov (United States)

    Yang, Qiang; Li, Huizeng; Li, Mingzhu; Li, Yanan; Chen, Shuoran; Bao, Bin; Song, Yanlin

    2017-11-29

    Elimination of satellite droplets in inkjet printing has long been desired for high-resolution, high-precision printing of functional materials and tissues. Generally, the strategy to suppress satellite droplets is to control ink properties, such as viscosity or surface tension, so that the ink filament retracts into a single drop. However, this strategy imposes new restrictions on the ink, such as its viscosity, surface tension, and concentration. Here, we report an alternative strategy in which satellite droplets are eliminated by enhancing the Rayleigh instability of the filament at the break point, accelerating pinch-off of the droplet from the nozzle. A superhydrophobic, ultralow-adhesion nozzle with a cone morphology is able to eliminate satellite droplets by cutting the ink filament effectively at the breakup point. As a result, nozzles of different sizes (10-80 μm) are able to print a wider range of inks (1 [...]) for printing electronics and biotechnologies.
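
    For scale, a back-of-envelope Plateau-Rayleigh estimate (classical inviscid-jet results, not taken from the paper) of the fastest-growing breakup wavelength and the droplet size it implies for the nozzle diameters mentioned.

      # Classical results: fastest-growing wavelength ~= 9.01 x jet radius, and one
      # wavelength of filament collapses into a single droplet of equal volume.
      for d_nozzle in (10, 20, 40, 80):                     # nozzle diameter, micrometres
          r = d_nozzle / 2.0
          wavelength = 9.01 * r
          d_drop = (6.0 * r**2 * wavelength) ** (1.0 / 3.0)  # sphere holding one wavelength of jet
          print(f"nozzle {d_nozzle:2d} um: wavelength {wavelength:6.1f} um, droplet ~{d_drop:5.1f} um")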

  11. Eliminating the Neglected Tropical Diseases: Translational Science and New Technologies.

    Directory of Open Access Journals (Sweden)

    Peter J Hotez

    2016-03-01

    Full Text Available Today, the World Health Organization recognizes 17 major parasitic and related infections as the neglected tropical diseases (NTDs). Despite recent gains in the understanding of the nature and prevalence of NTDs, as well as successes in recent scaled-up preventive chemotherapy strategies and other health interventions, the NTDs continue to rank among the world's greatest global health problems. For virtually all of the NTDs (including those slated for elimination under the auspices of a 2012 London Declaration for NTDs and a 2013 World Health Assembly resolution [WHA 66.12]), additional control mechanisms and tools are needed, including new NTD drugs, vaccines, diagnostics, and vector control agents and strategies. Elimination will not be possible without these new tools. Here we summarize some of the key challenges in translational science to develop and introduce these new technologies in order to ensure success in global NTD elimination efforts.

  12. CLEAR test facility

    CERN Multimedia

    Ordan, Julien Marius

    2017-01-01

    A new user facility for accelerator R&D, the CERN Linear Electron Accelerator for Research (CLEAR), started operation in August 2017. CLEAR evolved from the former CLIC Test Facility 3 (CTF3) used by the Compact Linear Collider (CLIC). The new facility is able to host and test a broad range of ideas in the accelerator field.

  13. 340 Waste handling Facility Hazard Categorization and Safety Analysis

    International Nuclear Information System (INIS)

    Rodovsky, T.J.

    2010-01-01

    decommissioning or pumping of radioactive materials from the vault tanks is prohibited. The Criticality Safety Program, HNF-7098, currently classifies an Exempt facility as one that is less than HC 3 per DOE STD 1027-92, Hazard Categorization and Accident Analysis Techniques for Compliance with DOE Order 5480.23, Nuclear Safety Analysis Reports; the 340 Facility is therefore classified as Exempt. Exempt facilities are not required to comply with most of the requirements specified in the Criticality Safety Program. The exceptions, with regard to the 340 Facility, include maintaining an accounting of the facility source term to ensure that the facility hazard category is not changed and ensuring that fissionable materials are appropriately labeled.

  14. Tissue distribution and elimination of rotenone in rainbow trout

    Science.gov (United States)

    Gingerich, W.H.

    1986-01-01

    The fate of a single i.v. dose (120 μg/kg) of the piscicide [14C]rotenone was evaluated in rainbow trout for periods up to 72 h after dosing. Rotenone was rapidly cleared from the plasma; less than 2% of the dose remained in the plasma compartment after 20 min. The highest concentrations of rotenone residues (% dose/g tissue) were in the hepatobiliary system, bile, intestine, and in heart, lateral line swimming muscle, and posterior kidney; tissues that are highly dependent on oxidative metabolism. Although rotenone activity was present in all cell fractions examined, greater than 40% was associated with the mitochondrial fraction of liver, kidney, and muscle. More than 85% of the activity extracted from these tissues, except the liver, was parent rotenone. Elimination from whole body and major tissue depots conformed to simple first-order kinetics; the estimated half-life from whole body was 68.5 h. Branchial elimination accounted for 5% of the injected dose over a 4-h period, and urinary elimination was less than 2% over a 48-h period. Rotenone was eliminated essentially unchanged across the gills; however, parent rotenone was not found in either urine or bile. More than 80% of the activity in both urine and bile eluted from HPLC chromatographs as a highly polar fraction that was not hydrolyzed by incubation with either β-glucuronidase or sulfatase. The results imply that hepatobiliary excretion is the major route of elimination for rotenone residues in the trout and that metabolism to a more polar form is a prerequisite for elimination in both the bile and the urine.
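
    A short worked example of the first-order whole-body kinetics reported above (half-life 68.5 h); the time points are arbitrary.

      import math

      t_half = 68.5                      # h, whole-body half-life reported above
      k = math.log(2.0) / t_half         # first-order elimination rate constant, per hour

      for t in (24, 72, 168):            # hours after dosing (chosen arbitrarily)
          print(f"after {t:3d} h: {math.exp(-k * t):5.1%} of the dose remains")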

  15. Determinants of Human African Trypanosomiasis Elimination via Paratransgenesis.

    Directory of Open Access Journals (Sweden)

    Jennifer A Gilbert

    2016-03-01

    Full Text Available Human African trypanosomiasis (HAT), transmitted by tsetse flies, has historically infected hundreds of thousands of individuals annually in sub-Saharan Africa. Over the last decade, concerted control efforts have reduced reported cases to below 10,000 annually, bringing complete elimination within reach. A potential technology to eliminate HAT involves rendering the flies resistant to trypanosome infection. This approach can be achieved through the introduction of transgenic Sodalis symbiotic bacteria that have been modified to produce a trypanocide, and propagated via Wolbachia symbionts, which confer a reproductive advantage to the paratransgenic tsetse. However, the population dynamics of these symbionts within tsetse flies have not yet been evaluated. Specifically, the key factors that determine the effectiveness of paratransgenesis have yet to be quantified. To identify the impact of these determinants on T.b. gambiense and T.b. rhodesiense transmission, we developed a mathematical model of trypanosome transmission that incorporates tsetse and symbiont population dynamics. We found that fecundity and mortality penalties associated with Wolbachia or recombinant Sodalis colonization, probabilities of vertical transmission, and tsetse migration rates are fundamental to the feasibility of HAT elimination. For example, we determined that HAT elimination could be sustained over 25 years when Wolbachia colonization minimally impacted fecundity or mortality, and when the probability of recombinant Sodalis vertical transmission exceeded 99.9%. We also found that for a narrow range of recombinant Sodalis vertical transmission probabilities (99.9-90.6% for T.b. gambiense and 99.9-85.8% for T.b. rhodesiense), cumulative HAT incidence was reduced between 30% and 1% for T.b. gambiense and between 21% and 3% for T.b. rhodesiense, although elimination was not predicted. Our findings indicate that fitness and mortality penalties associated with paratransgenic [...]
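
    A far cruder generational sketch than the authors' transmission model (the recurrence, the reproductive-advantage parameter and all numbers are assumptions) of why vertical transmission probability and fitness penalties produce threshold behaviour in the colonized fraction of the fly population.

      def colonized_fraction(v, c, a=0.05, p0=0.5, generations=300):
          # v: vertical transmission probability of recombinant Sodalis
          # c: fecundity/mortality penalty of carrying the symbionts
          # a: crude stand-in for the Wolbachia-conferred reproductive advantage
          p = p0
          for _ in range(generations):
              w = (1.0 + a) * (1.0 - c)             # relative fecundity of colonized females
              p = p * w * v / (p * w + (1.0 - p))   # colonized fraction in the next generation
          return p

      for v, c in ((0.999, 0.01), (0.950, 0.01), (0.999, 0.10)):
          print(f"v = {v:.3f}, penalty = {c:.0%}: "
                f"long-run colonized fraction = {colonized_fraction(v, c):.2f}")

    With these invented parameters the symbiont fixes only when vertical transmission is near-perfect and the fitness penalty is small; otherwise it is lost, echoing the thresholds reported above.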

  16. Three parallel information systems for malaria elimination in Swaziland, 2010-2015: are the numbers the same?

    Science.gov (United States)

    Zulu, Z; Kunene, S; Mkhonta, N; Owiti, P; Sikhondze, W; Mhlanga, M; Simelane, Z; Geoffroy, E; Zachariah, R

    2018-04-25

    Background: To be able to eliminate malaria, accurate, timely reporting and tracking of all confirmed malaria cases is crucial. Swaziland, a country in the process of eliminating malaria, has three parallel health information systems. Design: This was a cross-sectional study using country-wide programme data from 2010 to 2015. Methods: The Malaria Surveillance Database System (MSDS) is a comprehensive malaria database, the Immediate Disease Notification System (IDNS) is meant to provide early warning and trigger case investigations to prevent onward malaria transmission and potential epidemics, and the Health Management Information Systems (HMIS) reports on all morbidity at health facility level. Discrepancies were stratified by health facility level and type. Results: Consistent over-reporting of 9-85% was noticed in the HMIS, principally at the primary health care level (clinic and/or health centre). In the IDNS, the discrepancy went from under-reporting (12%) to over-reporting (32%); this was also seen at the primary care level. At the hospital level, there was under-reporting in both the HMIS and IDNS. Conclusions: There are considerable discrepancies in the numbers of confirmed malaria cases in the HMIS and IDNS in Swaziland. This may misrepresent the malaria burden and delay case investigation, predisposing the population to potential epidemics. There is an urgent need to improve data integrity in order to guide and evaluate efforts toward elimination.
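
    A trivial illustration, with invented counts, of the discrepancy measure implied above: percentage over- or under-reporting of confirmed cases relative to the MSDS reference.

      # Hypothetical yearly counts of confirmed malaria cases; MSDS is taken as the reference.
      msds, hmis, idns = 620, 890, 545

      for system, count in (("HMIS", hmis), ("IDNS", idns)):
          diff = 100.0 * (count - msds) / msds
          kind = "over-reporting" if diff > 0 else "under-reporting"
          print(f"{system}: {diff:+.0f}% ({kind} relative to MSDS)")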

  17. The decision tree classifier - Design and potential. [for Landsat-1 data

    Science.gov (United States)

    Hauska, H.; Swain, P. H.

    1975-01-01

    A new classifier has been developed for the computerized analysis of remote sensor data. The decision tree classifier is essentially a maximum likelihood classifier using multistage decision logic. It is characterized by the fact that an unknown sample can be classified into a class using one or several decision functions in a successive manner. The classifier is applied to the analysis of data sensed by Landsat-1 over Kenosha Pass, Colorado. The classifier is illustrated by a tree diagram which for processing purposes is encoded as a string of symbols such that there is a unique one-to-one relationship between string and decision tree.
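
    A compact sketch of the idea, not the authors' implementation: Gaussian maximum-likelihood decisions applied in stages, so a sample is first routed to a coarse group and only then assigned a final class. The class statistics below are invented.

      import numpy as np

      def gaussian_ml(x, classes):
          # Pick the class whose multivariate normal log-density is highest at x.
          best, best_ll = None, -np.inf
          for name, (mean, cov) in classes.items():
              diff = x - mean
              ll = -0.5 * (np.log(np.linalg.det(cov)) + diff @ np.linalg.inv(cov) @ diff)
              if ll > best_ll:
                  best, best_ll = name, ll
          return best

      # Hypothetical per-class spectral statistics (mean vector, covariance).
      stage1 = {"vegetation":     (np.array([40., 80.]), np.eye(2) * 25),
                "non-vegetation": (np.array([90., 30.]), np.eye(2) * 25)}
      stage2 = {"vegetation":     {"forest": (np.array([35., 90.]), np.eye(2) * 16),
                                   "grass":  (np.array([50., 70.]), np.eye(2) * 16)},
                "non-vegetation": {"water":  (np.array([70., 20.]), np.eye(2) * 16),
                                   "soil":   (np.array([100., 40.]), np.eye(2) * 16)}}

      x = np.array([48., 74.])                 # one pixel's features
      coarse = gaussian_ml(x, stage1)          # first decision node
      final = gaussian_ml(x, stage2[coarse])   # second decision node
      print(coarse, "->", final)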

  18. Use of gamma radiation to eliminate fungi from wood

    International Nuclear Information System (INIS)

    Freitag, C.M.; Morrell, J.J.

    1998-01-01

    The use of gamma irradiation for eliminating pests from imported wood products was investigated, using ponderosa pine blocks colonized by Aspergillus niger, Ophiostoma piceae, O. perfectum, Penicillium spp., Phlebia subserialis, or Postia placenta. While previous studies suggest that a dosage of 2.5 Mrads is required to eliminate fungi from wood, only one isolation was made from wafers exposed to 1.5 Mrad. This suggests that lower dosages may be adequate for mitigating pests in wood, although further studies using other fungi are recommended

  19. Feasibility and roadmap analysis for malaria elimination in China.

    Science.gov (United States)

    Zhou, Xiao-Nong; Xia, Zhi-Gui; Wang, Ru-Bo; Qian, Ying-Jun; Zhou, Shui-Sen; Utzinger, Jürg; Tanner, Marcel; Kramer, Randall; Yang, Wei-Zhong

    2014-01-01

    To understand the current status of the malaria control programme at the county level in accordance with the criteria of the World Health Organisation, the gaps and feasibility of malaria elimination at the county and national levels were analysed based on three kinds of indicators: transmission capacity, capacity of the professional team, and the intensity of intervention. Finally, a roadmap for national malaria elimination in the People's Republic of China is proposed based on the results of a feasibility assessment at the national level. Copyright © 2014 Elsevier Ltd. All rights reserved.

  20. Eliminating the Influence of Harmonic Components in Operational Modal Analysis

    DEFF Research Database (Denmark)

    Jacobsen, Niels-Jørgen; Andersen, Palle; Brincker, Rune

    2007-01-01

    structures, in contrast, are inherently subject to deterministic forces due to the rotating parts in the machinery. These forces are seen as harmonic components in the responses, and their influence should be eliminated before extracting the modes in their vicinity. This paper describes a new method, based on the well-known Enhanced Frequency Domain Decomposition (EFDD) technique, for eliminating these harmonic components in the modal parameter extraction process. For assessing the quality of the method, various experiments were carried out in which the results were compared with those obtained with pure stochastic [...]
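
    The record does not detail the authors' criterion, but one widely used indicator for this problem is sketched below as an assumption: the amplitude distribution of a narrow-band harmonic response is bimodal (excess kurtosis near -1.5), whereas a stochastically excited mode stays close to Gaussian (excess kurtosis near 0).

      import numpy as np

      rng = np.random.default_rng(1)
      fs = 1000.0
      t = np.arange(0, 20, 1 / fs)

      harmonic = np.sin(2 * np.pi * 25 * t)            # deterministic rotor component at 25 Hz
      tau = np.arange(0, 0.5, 1 / fs)
      impulse = np.exp(-5 * tau) * np.sin(2 * np.pi * 25 * tau)   # lightly damped 25 Hz "mode"
      random_mode = np.convolve(rng.normal(size=t.size), impulse, mode="same")

      def excess_kurtosis(x):
          x = x - x.mean()
          return np.mean(x**4) / np.mean(x**2) ** 2 - 3.0

      for name, x in (("harmonic", harmonic), ("stochastic mode", random_mode)):
          print(f"{name}: excess kurtosis = {excess_kurtosis(x):+.2f}")   # ~ -1.5 vs ~ 0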