WorldWideScience

Sample records for deep pleat hepa

  1. The effect of media area on the dust holding capacity of deep pleat HEPA filters

    Energy Technology Data Exchange (ETDEWEB)

    Dyment, J. [AWE, Aldermaston (United Kingdom)]; Loughborough, D. [AEAT Harwell, Oxford (United Kingdom)]

    1997-08-01

    The high potential cost of storage, treatment and disposal of radioactive wastes places a premium on the longevity of installed HEPA filters in radioactive processing facilities where dust capacity is a life-determining factor. Previous work investigated the dust holding capacity versus pressure drop characteristics of different HEPA filter designs and the effect of using graded-density papers. This paper records an investigation of the effect of media area variation on the dust holding capacity of the "deep-pleat" design of HEPA filter. As in the previously reported work, two test dusts (carbon black and sub-micron sodium chloride) in the range 0.15-0.4 μm were used. Media area adjustment was effected by varying the number of separators within the range 60-90. Results with the coarser dust allowed an optimum media area to be identified: media areas greater or smaller than this optimum retained less dust for the same terminal pressure drop. Conversely, with the finer sodium chloride aerosol the dust holding capacity continued to increase up to the maximum area investigated. 7 refs., 4 figs.

  2. Criteria for calculating the efficiency of deep-pleated HEPA filters with aluminum separators during and after design basis accidents

    International Nuclear Information System (INIS)

    Bergman, W.; First, M.W.; Anderson, W.L.

    1995-01-01

    We have reviewed the literature on the performance of HEPA filters under normal and abnormal conditions to establish criteria for calculating the efficiency of HEPA filters in a DOE nonreactor nuclear facility during and after a Design Basis Accident (DBA). This study is applicable only to the standard deep-pleated HEPA filter with aluminum separators as specified in ASME N509 [1]. Other HEPA filter designs, such as mini-pleat and separatorless filters, are not included in this study. The literature review covered the performance of new filters and the parameters that may cause deterioration in filter performance, such as filter age, radiation, corrosive chemicals, seismic and rough handling, high temperature, moisture, particle clogging, high air flow and pressure pulses. The deterioration of filter efficiency depends on the exposure parameters; under severe exposure conditions the filter will be damaged and have a residual efficiency of 0%. There are large gaps and limitations in the data that introduce significant error into estimates of HEPA filter efficiencies under DBA conditions; because of this limitation, conservative values of filter efficiency were chosen. Estimating the efficiency of HEPA filters under DBA conditions involves three steps: (1) determining the filter pressure drop and environmental parameters during and after the DBA; (2) comparing the filter pressure drop to a set of threshold values above which the filter is damaged, with a different threshold value for each combination of environmental parameters; and (3) determining the filter efficiency. If the filter pressure drop is greater than the threshold value, the filter is damaged and is assigned 0% efficiency; if it is less, the filter is not damaged and the efficiency is determined from literature values at the given environmental conditions.
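The three-step screening procedure described in this abstract can be sketched in code. This is only an illustrative reading of the logic; the threshold and efficiency values below are hypothetical placeholders, not the report's actual criteria, which vary with the combination of environmental parameters.

```python
# Sketch of the threshold-based efficiency assignment described above.
# All numeric values used in the example are hypothetical.

def filter_efficiency(dp_kpa: float, threshold_kpa: float,
                      literature_efficiency: float) -> float:
    """Return the assumed HEPA filter efficiency under DBA conditions.

    dp_kpa                -- filter pressure drop during/after the DBA (step 1)
    threshold_kpa         -- damage threshold for the given environment (step 2)
    literature_efficiency -- efficiency from literature data at those conditions
    """
    if dp_kpa > threshold_kpa:
        return 0.0  # filter assumed damaged: residual efficiency 0%
    return literature_efficiency  # step 3: intact filter, literature value

# Hypothetical example: a 12 kPa drop against a 10 kPa damage threshold
print(filter_efficiency(12.0, 10.0, 99.97))  # 0.0 (damaged)
print(filter_efficiency(5.0, 10.0, 99.97))   # 99.97 (intact)
```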

  3. Criteria for calculating the efficiency of deep-pleated HEPA filters with aluminum separators during and after design basis accidents

    Energy Technology Data Exchange (ETDEWEB)

    Bergman, W.; First, M.W.; Anderson, W.L. [Lawrence Livermore National Laboratory, CA (United States)] [and others]

    1995-02-01

    We have reviewed the literature on the performance of HEPA filters under normal and abnormal conditions to establish criteria for calculating the efficiency of HEPA filters in a DOE nonreactor nuclear facility during and after a Design Basis Accident (DBA). This study is applicable only to the standard deep-pleated HEPA filter with aluminum separators as specified in ASME N509 [1]. Other HEPA filter designs, such as mini-pleat and separatorless filters, are not included in this study. The literature review covered the performance of new filters and the parameters that may cause deterioration in filter performance, such as filter age, radiation, corrosive chemicals, seismic and rough handling, high temperature, moisture, particle clogging, high air flow and pressure pulses. The deterioration of filter efficiency depends on the exposure parameters; under severe exposure conditions the filter will be damaged and have a residual efficiency of 0%. There are large gaps and limitations in the data that introduce significant error into estimates of HEPA filter efficiencies under DBA conditions; because of this limitation, conservative values of filter efficiency were chosen. Estimating the efficiency of HEPA filters under DBA conditions involves three steps: (1) determining the filter pressure drop and environmental parameters during and after the DBA; (2) comparing the filter pressure drop to a set of threshold values above which the filter is damaged, with a different threshold value for each combination of environmental parameters; and (3) determining the filter efficiency. If the filter pressure drop is greater than the threshold value, the filter is damaged and is assigned 0% efficiency; if it is less, the filter is not damaged and the efficiency is determined from literature values at the given environmental conditions.

  4. The effect of media area on the dust holding capacity of deep pleat HEPA filters

    International Nuclear Information System (INIS)

    Dyment, J.; Loughborough, D.

    1997-01-01

    The high potential cost of storage, treatment and disposal of radioactive wastes places a premium on the longevity of installed HEPA filters in radioactive processing facilities where dust capacity is a life-determining factor. Previous work investigated the dust holding capacity versus pressure drop characteristics of different HEPA filter designs and the effect of using graded-density papers. This paper records an investigation of the effect of media area variation on the dust holding capacity of the "deep-pleat" design of HEPA filter. As in the previously reported work, two test dusts (carbon black and sub-micron sodium chloride) in the range 0.15-0.4 μm were used. Media area adjustment was effected by varying the number of separators within the range 60-90. Results with the coarser dust allowed an optimum media area to be identified: media areas greater or smaller than this optimum retained less dust for the same terminal pressure drop. Conversely, with the finer sodium chloride aerosol the dust holding capacity continued to increase up to the maximum area investigated. 7 refs., 4 figs.

  5. Development and evaluation of a HEPA filter for increased strength and resistance to elevated temperature

    International Nuclear Information System (INIS)

    Gilbert, H.; Bergman, W.; Fretthold, J.K.

    1993-01-01

    We have completed a preliminary study of an improved HEPA filter with increased strength and resistance to elevated temperature, intended to improve the reliability of the standard deep-pleat HEPA filter under accident conditions. The improvements consist of a silicone rubber sealant and a new HEPA medium reinforced with a glass cloth. Three prototype filters were built and evaluated for temperature resistance, pressure resistance and resistance to rough handling. The temperature resistance test consisted of exposing the HEPA filter to 1,000 scfm (1,700 m³/hr) at 700 °F (371 °C) for five minutes. The pressure resistance test consisted of exposing the HEPA filter to a differential pressure of 10 in. w.g. (2.5 kPa) using a water-saturated air flow at 95 °F (35 °C). For the rough handling test, we used a vibrating machine designated the Q110. DOP filter efficiency tests were performed before and after each of the environmental tests. In addition to following the standard practice of using a separate new filter for each environmental test, we also subjected the same filter to the elevated temperature test followed by the pressure resistance test. The efficiency test results show that the improved HEPA filter is significantly better than the standard HEPA filter. Further studies are recommended to evaluate the improved HEPA filter and to assess its performance under more severe accident conditions.

  6. Evaluation of the effect of media velocity on HEPA filter performance

    International Nuclear Information System (INIS)

    Alderman, Steven; Parsons, Michael; Hogancamp, Kristina; Norton, O. Perry; Waggoner, Charles

    2007-01-01

    Section FC of the ASME AG-1 Code addresses glass fiber HEPA filters and restricts the media velocity to a maximum of 2.54 cm/s (5 ft/min). Advances in filter media technology allow glass fiber HEPA filters to function at significantly higher velocities and still achieve HEPA performance. However, diffusional capture of particles < 100 nm is reduced at higher media velocities due to shorter residence times within the media matrix. Therefore, it is unlikely that higher media velocities for HEPA filters will be allowed without data demonstrating the effect of media velocity on removal of particles in the smaller size classes. To address this issue, static testing was conducted to generate performance-related data, and a range of dynamic testing provided data on filter lifetimes, loading characteristics, changes in filter efficiency and the most penetrating particle size over time. Testing was conducted using 31 cm x 31 cm x 29 cm deep-pleat HEPA filters supplied by two manufacturers, at media velocities ranging from 2.0 to 4.5 cm/s with a solid aerosol challenge composed of potassium chloride. Two sets of media velocity data were obtained for each filter type: in one set of evaluations the maximum aerosol challenge particle size was limited to 3 μm, while in the second set particles above 3 μm were not constrained. This provided considerable variability in the challenge mass mean diameter and overall mass loading rate. Results of this testing will be provided to the ASME AG-1 FC Committee for consideration in future versions of the HEPA standard. In general, the initial filter efficiency decreased with increasing media velocity, though initial filter efficiencies were generally good in all cases: filter efficiency values averaged over the first ten minutes of the loading cycle ranged from 99.970 to 99.996%. Additionally, the most penetrating particle size was observed to decrease with increasing media velocity.
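The media velocity discussed above is simply the volumetric flow divided by the total (pleated) media area, which is why deep-pleat designs with more media area run at lower media velocities for the same rated flow. A minimal sketch of that relationship, with a hypothetical media area (not a value from the study):

```python
# Hedged sketch: media velocity = volumetric flow / total media area.
# The 2.54 cm/s (5 ft/min) figure is the AG-1 Section FC limit quoted
# in the abstract; the 20 m^2 media area below is a made-up example.

def media_velocity_cm_s(flow_m3_per_h: float, media_area_m2: float) -> float:
    """Convert volumetric flow and media area to media velocity in cm/s."""
    flow_m3_per_s = flow_m3_per_h / 3600.0
    return (flow_m3_per_s / media_area_m2) * 100.0  # m/s -> cm/s

# Hypothetical example: 1,700 m3/h through 20 m2 of pleated media
v = media_velocity_cm_s(1700.0, 20.0)
print(round(v, 2))  # 2.36, i.e. below the 2.54 cm/s code limit
```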

  7. Performance of HEPA filters under hot dynamic conditions

    International Nuclear Information System (INIS)

    Frankum, D.P.; Costigan, G.

    1995-01-01

    Accidents in nuclear facilities involving fires may have implications for the ventilation systems, where high efficiency particulate air (HEPA) filters are used to minimise the airborne release of radioactive or toxic particles. The Filter Development Section at Harwell Laboratory has been investigating the effect of temperature on the performance of HEPA filters under hot dynamic conditions [1] for a number of years. The test rig is capable of delivering air flows of 1000 l/s (at ambient conditions) at temperatures up to 500 °C, where measurements of the penetration and pressure drop across the filter are obtained. This paper reports experiments on two different constructions of HEPA filter: rectangular and circular. The filters were tested at an air temperature of 200 °C for up to 48 hours at the rated airflow to assess their performance. The penetration measurements for rectangular filters were observed to remain below 0.021% after prolonged operation. In a number of cases, holes appeared along the pleat creases of circular filters, although the penetration remained below 1%. The sealing gasket for these filters was noted to deform with temperature, permitting a leakage path. A prototype high strength circular filter was evaluated at temperatures of up to 400 °C with a penetration of less than 0.65%.

  8. Performance of HEPA filters under hot dynamic conditions

    Energy Technology Data Exchange (ETDEWEB)

    Frankum, D.P.; Costigan, G. [AEA Technology, Oxfordshire (United Kingdom)

    1995-02-01

    Accidents in nuclear facilities involving fires may have implications for the ventilation systems, where high efficiency particulate air (HEPA) filters are used to minimise the airborne release of radioactive or toxic particles. The Filter Development Section at Harwell Laboratory has been investigating the effect of temperature on the performance of HEPA filters under hot dynamic conditions [1] for a number of years. The test rig is capable of delivering air flows of 1000 l/s (at ambient conditions) at temperatures up to 500 °C, where measurements of the penetration and pressure drop across the filter are obtained. This paper reports experiments on two different constructions of HEPA filter: rectangular and circular. The filters were tested at an air temperature of 200 °C for up to 48 hours at the rated airflow to assess their performance. The penetration measurements for rectangular filters were observed to remain below 0.021% after prolonged operation. In a number of cases, holes appeared along the pleat creases of circular filters, although the penetration remained below 1%. The sealing gasket for these filters was noted to deform with temperature, permitting a leakage path. A prototype high strength circular filter was evaluated at temperatures of up to 400 °C with a penetration of less than 0.65%.

  9. Effect of elevated temperature on the mechanical strength of HEPA filters

    International Nuclear Information System (INIS)

    Elfawal, M.M.; Eladham, K.A.; Hammed, F.H.; Abdrabbo, M.F.

    1993-01-01

    The effect of elevated temperature on the mechanical strength of HEPA filters was studied in order to evaluate and improve their performance under high temperature conditions. As part of this study, the mechanical strength of the HEPA filter medium, which is the limiting factor in terms of filter strength, was studied experimentally at elevated temperatures up to 400 °C, with thermal exposure times ranging from 2 min to 4 h. The failure pressures of HEPA filter units after long exposure to 250 °C were also investigated. The test results show that the medium strength decreases with increasing temperature and thermal exposure time, due to burnout of the organic binder used to improve the strength and flexibility of the medium. The test results also show that the tensile strength of the conventional filter medium drops to about 40% of its room-temperature value after exposure to 250 °C for 6 h; continuous exposure of the conventional filter medium to this temperature is therefore critical. The average failure differential pressures of all commercial filters tested were found to lie between 9 and 18 kPa at ambient temperature and between 6 and 11 kPa after thermal challenge at 250 °C for 100 h. Swelling and capture of the ends of individual pleats was found to have led to filter failure. 3 figs., 2 tabs.

  10. Further development of the cleanable steel HEPA filter, cost/benefit analysis, and comparison with competing technologies

    Energy Technology Data Exchange (ETDEWEB)

    Bergman, W.; Lopez, R.; Wilson, K. [Lawrence Livermore National Lab., CA (United States)] [and others

    1997-08-01

    We have made further progress in developing a cleanable steel fiber HEPA filter. We fabricated a pleated cylindrical cartridge using commercially available steel fiber media made with 1 μm stainless steel fibers sintered into sheet form. Test results at the Department of Energy (DOE) Filter Test Station at Oak Ridge show the prototype filter cartridge has 99.99% efficiency for 0.3 μm dioctyl phthalate (DOP) aerosols and a pressure drop of 1.5 inches. Filter loading and cleaning tests using AC Fine dust showed the filter could be repeatedly cleaned using reverse air pulses. Our analysis of commercially optimized filters suggests that cleanable steel HEPA filters need to be made from steel fibers less than 1 μm, and preferably 0.5 μm, in diameter to meet the standard HEPA filter requirements in production units. We have demonstrated that 0.5 μm steel fibers can be produced using the fiber bundling and drawing process. The 0.5 μm steel fibers were then sintered into small filter samples and tested for efficiency and pressure drop. Test results on the sample showed a penetration of 0.0015% at 0.3 μm and a pressure drop of 1.15 inches at 6.9 ft/min (3.5 cm/s) velocity. Based on these results, steel fiber media can easily meet the requirements of 0.03% penetration and 1.0 inch of pressure drop by using fewer fibers in the media. A cost analysis of the cleanable steel HEPA filter shows that, although it costs much more than the standard glass fiber HEPA filter, it has the potential to be very cost effective because of the high disposal costs of contaminated HEPA filters. We estimate that the steel HEPA filter will save an average of $16,000 over its 30-year life. The additional savings in clean-up costs from avoiding glass HEPA filter ruptures during accidents were not included, but make the steel HEPA filter even more cost effective. 33 refs., 28 figs., 1 tab.
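The cost argument in this abstract turns on lifecycle cost rather than purchase price: a reusable steel filter avoids the repeated purchase and contaminated-waste disposal of glass fiber filters. A rough sketch of that trade-off follows; every number in it is a hypothetical placeholder, not data from the paper (which reports only the $16,000 average lifetime saving).

```python
# Hedged lifecycle-cost sketch. Unit costs, disposal costs, and
# change-out counts below are invented for illustration only.

def lifecycle_cost(unit_cost: float, disposal_cost: float,
                   replacements: int) -> float:
    """Total cost = (purchase + disposal) per change-out, summed over life."""
    return replacements * (unit_cost + disposal_cost)

# Hypothetical: glass filter replaced 15 times over a 30-year life
# (change-out every 2 years), steel filter bought once and cleaned in place.
glass = lifecycle_cost(unit_cost=500.0, disposal_cost=1500.0, replacements=15)
steel = lifecycle_cost(unit_cost=8000.0, disposal_cost=0.0, replacements=1)
print(glass - steel)  # lifetime saving with the steel filter, in dollars
```

Even with a much higher purchase price, the steel filter comes out ahead in this toy comparison because the disposal cost of each contaminated glass filter dominates.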

  11. The development of a HEPA filter with improved dust holding characteristics

    International Nuclear Information System (INIS)

    Dyment, J.; Hamblin, C.

    1995-01-01

    A limitation of the HEPA filters used in the extract systems of nuclear facilities is their relatively low capacity for captured dust. The costs associated with the disposal of a typical filter mean that there are clear incentives to extend filter life. The work described in this report represents the initial stages in the development of a filter incorporating a medium with enhanced dust holding capacity. Experimental equipment was installed to enable the dust loading characteristics of candidate media to be compared with those of the glass fibre based papers currently used in filter construction. These tests involved challenging representative samples of the media with an air stream containing a controlled concentration of thermally generated sodium chloride particles. The dust loading characteristics of the media were then compared in terms of the rate of increase in pressure differential. A number of "graded density" papers were subsequently identified which appeared to offer significant improvements in dust holding. In the second phase of the programme, deep-pleat filters (1,700 m³ h⁻¹) incorporating graded density papers were manufactured and tested. Improvements of up to 50% were observed in their capacity for the sub-micron sodium chloride test dust. Smaller differences (15%) were measured when a coarser, carbon black, challenge was used. This is attributed to the differences in the particle sizes of the two dusts.

  12. ASME AG-1 Section FC Qualified HEPA Filters; a Particle Loading Comparison - 13435

    International Nuclear Information System (INIS)

    Stillo, Andrew; Ricketts, Craig I.

    2013-01-01

    High Efficiency Particulate Air (HEPA) filters used to protect personnel, the public and the environment from airborne radioactive materials are designed, manufactured and qualified in accordance with ASME AG-1 Code Section FC (HEPA Filters) [1]. The qualification process requires that filters manufactured in accordance with this code section meet several performance requirements, including specifications for resistance to airflow, aerosol penetration, resistance to rough handling, resistance to pressure (including high humidity and water droplet exposure), resistance to heated air, spot flame resistance and a visual/dimensional inspection. None of these requirements evaluates the particle loading capacity of a HEPA filter design. Concerns over the particle loading capacity of the different designs included within the ASME AG-1 Section FC code [1] have been voiced in the recent past. Additionally, the ability of a filter to maintain its integrity when subjected to severe operating conditions, such as elevated relative humidity, fog conditions or elevated temperature, after loading in use over long service intervals is also a major concern. Although currently qualified HEPA filter media are likely to have similar loading characteristics when evaluated independently, filter pleat geometry can have a significant impact on the in-situ particle loading capacity of filter packs. Aerosol particle characteristics, such as size and composition, may also have a significant impact on filter loading capacity. Test results comparing filter loading capacities for three different aerosol particles and three different filter pack configurations are reviewed. The information presented represents an empirical performance comparison among the filter designs tested. The results may serve as a basis for further discussion toward the possible development of a particle loading test to be included in the qualification requirements of ASME AG-1.

  13. ASME AG-1 Section FC Qualified HEPA Filters; a Particle Loading Comparison - 13435

    Energy Technology Data Exchange (ETDEWEB)

    Stillo, Andrew [Camfil Farr, 1 North Corporate Drive, Riverdale, NJ 07457 (United States); Ricketts, Craig I. [New Mexico State University, Department of Engineering Technology and Surveying Engineering, P.O. Box 30001 MSC 3566, Las Cruces, NM 88003-8001 (United States)

    2013-07-01

    High Efficiency Particulate Air (HEPA) filters used to protect personnel, the public and the environment from airborne radioactive materials are designed, manufactured and qualified in accordance with ASME AG-1 Code Section FC (HEPA Filters) [1]. The qualification process requires that filters manufactured in accordance with this code section meet several performance requirements, including specifications for resistance to airflow, aerosol penetration, resistance to rough handling, resistance to pressure (including high humidity and water droplet exposure), resistance to heated air, spot flame resistance and a visual/dimensional inspection. None of these requirements evaluates the particle loading capacity of a HEPA filter design. Concerns over the particle loading capacity of the different designs included within the ASME AG-1 Section FC code [1] have been voiced in the recent past. Additionally, the ability of a filter to maintain its integrity when subjected to severe operating conditions, such as elevated relative humidity, fog conditions or elevated temperature, after loading in use over long service intervals is also a major concern. Although currently qualified HEPA filter media are likely to have similar loading characteristics when evaluated independently, filter pleat geometry can have a significant impact on the in-situ particle loading capacity of filter packs. Aerosol particle characteristics, such as size and composition, may also have a significant impact on filter loading capacity. Test results comparing filter loading capacities for three different aerosol particles and three different filter pack configurations are reviewed. The information presented represents an empirical performance comparison among the filter designs tested. The results may serve as a basis for further discussion toward the possible development of a particle loading test to be included in the qualification requirements of ASME AG-1.

  14. The development of a HEPA filter with improved dust holding characteristics

    Energy Technology Data Exchange (ETDEWEB)

    Dyment, J.; Hamblin, C.

    1995-02-01

    A limitation of the HEPA filters used in the extract systems of nuclear facilities is their relatively low capacity for captured dust. The costs associated with the disposal of a typical filter mean that there are clear incentives to extend filter life. The work described in this report represents the initial stages in the development of a filter incorporating a medium with enhanced dust holding capacity. Experimental equipment was installed to enable the dust loading characteristics of candidate media to be compared with those of the glass fibre based papers currently used in filter construction. These tests involved challenging representative samples of the media with an air stream containing a controlled concentration of thermally generated sodium chloride particles. The dust loading characteristics of the media were then compared in terms of the rate of increase in pressure differential. A number of "graded density" papers were subsequently identified which appeared to offer significant improvements in dust holding. In the second phase of the programme, deep-pleat filters (1,700 m³ h⁻¹) incorporating graded density papers were manufactured and tested. Improvements of up to 50% were observed in their capacity for the sub-micron sodium chloride test dust. Smaller differences (15%) were measured when a coarser, carbon black, challenge was used. This is attributed to the differences in the particle sizes of the two dusts.

  15. Experimental and numerical study of pleated filters clogging

    International Nuclear Information System (INIS)

    Gervais, Pierre-Colin

    2013-01-01

    Pleated filters are widely used in air treatment because of the advantageous ratio of effective surface area to overall dimensions they offer. Their major drawback, however, is their reduced lifetime, which still needs to be controlled: as the filter clogs, the pressure drop increases considerably, the filtration flow can no longer be maintained, and the media may deteriorate. It is therefore crucial to characterize the evolution of the pressure drop under operating conditions in order to best design this equipment. Part of our work consisted in studying how the operating conditions influence the geometry of the deposit. To do so, we used Single-Photon Emission Computed Tomography (SPECT), a non-destructive imaging technique that leaves the particle structuring intact. Visualization of the aerosol deposit at the beginning of the filtration process shows preferential particle deposition over the whole height of the pleat. A numerical approach was used to study the permeability of bimodal fibrous media, and we experimentally studied the local velocity as well as the biphasic flow inside pleated filter media. Comparison between experiments and simulations allowed us to validate the GeoDict code for a wide range of media properties and velocities. Regarding bimodal fibrous media, the fast data acquisition allowed several existing models to be tested, which resulted in a unified classification of them. While the experimental results on the initial deposition in pleated filters are encouraging, those relating to more advanced clogging point to several improvements needed in the technique we used. (author)

  16. A Study on the Bandwidth Characteristics of Pleated Pneumatic Artificial Muscles

    Directory of Open Access Journals (Sweden)

    Rino Versluys

    2009-01-01

    Pleated pneumatic artificial muscles have interesting properties that can be of considerable significance in robotics and automation. With a view to the potential use of pleated pneumatic artificial muscles as actuators for a fatigue test bench (high forces and small displacements), the bandwidth characteristics of a muscle-valve system were investigated. Bandwidth is commonly used for linear systems, as the Bode plot is independent of the amplitude of the input signal. However, due to the non-linear behaviour of pleated pneumatic artificial muscles, the system's gain becomes dependent on the amplitude of the input sine wave. As a result, a single Bode plot is insufficient to clearly describe or identify a non-linear system. In this study, the bandwidth of a muscle-valve system was assessed from two perspectives: a varying amplitude and a varying offset of the input sine wave. A brief introduction to pneumatic artificial muscles is given, and the concept of pleated pneumatic artificial muscles is explained. Furthermore, the different test methods and experimental results are presented.

  17. Catalytic pleat filter bags for combined particulate separation and nitrogen oxides removal from flue gas streams

    International Nuclear Information System (INIS)

    Park, Young Ok; Choi, Ho Kyung

    2010-01-01

    The development of a high temperature, catalytically active pleated filter bag with hybrid filter equipment for the combined removal of particles and nitrogen oxides from flue gas streams is presented. A special catalyst load in a stainless steel mesh cartridge, combined with a high temperature pleated filter bag and followed by optimized catalytic activation, was developed to reach the required nitrogen oxides levels and to maintain high collection efficiencies. The catalytic properties of the developed high temperature filter bags with hybrid filter equipment were studied and demonstrated in a pilot-scale test rig and in a demonstration plant using commercial-scale high temperature catalytic pleated filter bags. The performance of the catalytic pleated filter bags was tested under different operating conditions, such as filtration velocity and operating temperature. Moreover, the cleaning efficiency and residual pressure drop of the catalyst-loaded cartridges in pleated filter bags were tested. As a result of these studies, the optimum operating conditions for the catalytic pleated filter bags were determined. (author)

  18. Modelling of air flows in pleated filters and of their clogging by solid particles

    International Nuclear Information System (INIS)

    Del Fabbro, L.

    2002-01-01

    Air cleaning devices for particle removal are widespread in various branches of industry: nuclear, automotive, food, electronics, and others. Many of these devices are built from pleated porous media to increase the filtration surface and thus reduce the pressure drop for a given air flow. The objective of our work is to address an evident lack of knowledge about the evolution of the pressure drop of a pleated filter during clogging, and to derive a model of it from experiments on industrial filters of the nuclear and automotive types. The resulting model is a function of the characteristics of the filtering medium and pleats, the characteristics of the solid particles deposited on the filter, the mass of particles, and the aeraulic conditions of the air flow. It also depends on clogging data for flat filters of equivalent medium. To develop this pressure drop model, an initial stage was carried out to characterize, experimentally and numerically, the pressure drop and the distribution of air flow in clean pleated filters of the nuclear (high efficiency particulate air filter, in glass fibres) and automotive (medium efficiency filter, in cellulose fibres) types. The numerical model made it possible to understand the fundamental role played by the aeraulic resistance of the filtering medium. From a non-dimensional approach, we established a semi-empirical pressure drop model for a clean pleated filter valid for both types of medium studied; this model serves as the first building block of the final clogging model. The study of filter clogging showed the complexity of the phenomenon, which depends mainly on a reduction of the filtration surface. This observation leads us to propose that pleated filters clog in three phases: the first two phases are similar to those observed for flat filters, while the last phase corresponds to a reduction of the filtration surface and leads to a strong increase in the filter pressure drop.
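The "aeraulic resistance of the filtering medium" that this abstract highlights is, for a clean flat medium, a Darcy-law pressure drop. The sketch below computes only that flat-media contribution; real semi-empirical pleat models of the kind described add terms for flow in the pleat channels, and all parameter values here are illustrative assumptions rather than data from the thesis.

```python
# Minimal sketch of the Darcy-law medium resistance underlying clean
# pleated-filter pressure-drop models. Parameter values are invented.

def darcy_pressure_drop(mu_pa_s: float, velocity_m_s: float,
                        thickness_m: float, permeability_m2: float) -> float:
    """Pressure drop (Pa) across a flat filter medium per Darcy's law:
    dP = mu * v * e / k
    """
    return mu_pa_s * velocity_m_s * thickness_m / permeability_m2

# Illustrative: air through a 0.4 mm thick glass fibre medium at ~2.5 cm/s
dp = darcy_pressure_drop(mu_pa_s=1.8e-5,        # air dynamic viscosity (Pa s)
                         velocity_m_s=0.025,     # media velocity
                         thickness_m=4.0e-4,     # medium thickness
                         permeability_m2=7.2e-12)  # assumed permeability
print(round(dp))  # 25 (Pa)
```

Because dP scales linearly with velocity, concentrating the same flow onto less media area (as clogging effectively does in the third phase described above) raises the pressure drop sharply.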

  19. HEPA Filter Vulnerability Assessment

    International Nuclear Information System (INIS)

    GUSTAVSON, R.D.

    2000-01-01

    This assessment of High Efficiency Particulate Air (HEPA) filter vulnerability was requested by the USDOE Office of River Protection (ORP) to satisfy a DOE-HQ directive to evaluate the effect of filter degradation on the facility authorization basis assumptions. Within the scope of this assessment are ventilation system HEPA filters that are classified as Safety-Class (SC) or Safety-Significant (SS) components that perform an accident mitigation function. The objective of the assessment is to verify whether HEPA filters that perform a safety function during an accident are likely to perform as intended to limit release of hazardous or radioactive materials, considering factors that could degrade the filters. Filter degradation factors considered include aging, wetting of filters, exposure to high temperature, exposure to corrosive or reactive chemicals, and exposure to radiation. Screening and evaluation criteria were developed by a site-wide group of HVAC engineers and HEPA filter experts from published empirical data. For River Protection Project (RPP) filters, the only degradation factor that exceeded the screening threshold was filter aging. Subsequent evaluation of the effect of filter aging on filter strength was conducted, and the results were compared with the performance required to meet the conditions assumed in the RPP Authorization Basis (AB). It was found that the reduction in filter strength due to aging does not affect the filter performance requirements as specified in the AB. A portion of the HEPA filter vulnerability assessment is being conducted by the ORP and is not part of the scope of this study: the ORP is assessing the existing policies and programs relating to maintenance, testing, and change-out of HEPA filters used for SC/SS service. This document presents the results of a HEPA filter vulnerability assessment conducted for the River Protection Project as requested by the DOE Office of River Protection

  20. Analysis of an MCU HEPA filter

    Energy Technology Data Exchange (ETDEWEB)

    Peters, T. B. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Fondeur, F. F. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-01-01

    A series of direct analyses on three portions (inlet, center, and outlet) of the High Efficiency Particulate Air (HEPA) filter material from the Modular Caustic-Side Solvent Extraction Unit (MCU) have been performed; these include x-ray methods such as X-Ray Diffraction (XRD), Contained Scanning Electron Microscopy (CSEM) and X-Ray Fluorescence (XRF), as well as Fourier Transform InfraRed spectroscopy (FTIR). Additionally, two leaching studies (one with water, one with dichloromethane) have been performed on three portions (inlet, center, and outlet) of the HEPA filter material, with the leachates analyzed by Inductively Coupled Plasma Emission Spectroscopy (ICPES), Semi-Volatile Organic Analysis (SVOA) and gamma scan. From the results of the analyses, SRNL concludes that cesium-depleted solvent is being introduced into the HEPA filter. The most likely avenue for this is mechanical aerosolization of solvent, with the aerosol then carried along an airstream into the HEPA filter. Once introduced into the HEPA filter media, the solvent wicks throughout the material and migrates towards the outlet end. Once on the outlet end, continual drying could cause particulate flakes to exit the filter and travel farther down the airstream path.

  1. Qualification of box HEPA filters for nuclear applications

    International Nuclear Information System (INIS)

    Bergman, W.; Larsen, G.; Wilson, K.; Rainer, F.

    1995-03-01

    We have successfully completed qualification tests on high efficiency particulate air (HEPA) filters that are encapsulated within a box and manufactured by American Air Filters. The qualification tests are required by the American Society of Mechanical Engineers Standard ASME N509 and the U.S. Military Standard MIL-F-51068 for HEPA filters to be used in nuclear applications. The qualification tests specify minimum filter efficiencies following exposure to heated air, overpressure, and rough handling. Prior to this study, no box HEPA filters from any manufacturer had been qualified despite their widespread use in Department of Energy (DOE) facilities. Box HEPA filters are not addressed in any of the existing HEPA standards and are only briefly discussed in the Nuclear Air Cleaning Handbook

  2. Experimental investigation of in situ cleanable HEPA filters

    International Nuclear Information System (INIS)

    Adamson, D.J.

    2000-01-01

    Savannah River Technology Center (SRTC), High Level Waste Division, Tanks Focus Area, and the Federal Energy Technology Center (FETC) have been investigating high efficiency particulate air (HEPA) filters which can be regenerated or cleaned in situ as an alternative to conventional disposable HEPA filters. This technical report documents concerns pertaining to conventional HEPA filters

  3. Simulation of the air flows in many industrial pleated filters

    International Nuclear Information System (INIS)

    Del Fabbro, L.; Brun, P.; Laborde, J.C.; Lacan, J.; Ricciardi, L.; Renoux, A.

    2000-01-01

    The study presents results concerning the characterization of the pressure drop and the air flow in nuclear- and automotive-type pleated filters. The experimental studies, in correlation with the numerical models, showed a homogeneous distribution of the air flow in a high-efficiency (THE) nuclear-type filter, whereas the distribution is heterogeneous in the case of an automotive filter. (A.L.B.)

  4. Self Cleaning HEPA Filtration without Interrupting Process Flow

    International Nuclear Information System (INIS)

    Wylde, M.

    2009-01-01

    The strategy of protecting the traditional glass fibre HEPA filtration train from its blinding contamination, and of recovering dust by means of self-cleaning pre-filtration, is a proven means of reducing ultimate disposal volumes and has been used within the fuel production industry. However, there is an increasing demand in nuclear applications requiring elevated operating temperatures, fire resistance, moisture resistance and chemical compositions that existing glass fibre HEPA filtration cannot accommodate, which can be remedied by the use of a metallic HEPA filter medium. Previous research (Bergman et al 1997, Moore et al 1992) suggests that the then cost to the DOE, based on a five year life cycle, was $29.5 million for the installation, testing, removal and disposal of glass fibre HEPA filtration trains. Within these costs, $300 was the value given to the filter and $4,450 was given to the peripheral activity. Development of a low cost, cleanable, metallic, direct replacement for the traditional filter train would be the clear solution. The Bergman et al work suggested that a 1,000 ft³/min cleanable stainless HEPA filter could be commercially available for $5,000 each, whereas the industry has determined that the truer cost of such an item in isolation would be closer to $15,000. This results in a conflict within the requirement between 'low cost' and 'stainless HEPA'. By proposing a system that combines metallic HEPA filtration with the ability to self-clean without interrupting the process flow, the need for a traditional HEPA filtration train is eliminated, dramatically reducing the resources required for cleaning or disposal and thus presenting a route to reduced ultimate costs. The paper examines the performance characteristics, filtration efficiency, flow versus differential pressure, and cleanability of a self-cleaning HEPA-grade sintered metal filter element, together with data to prove the contention. (authors)

  5. In-place HEPA filter penetration test

    International Nuclear Information System (INIS)

    Bergman, W.; Wilson, K.; Elliott, J.; Bettencourt, B.; Slawski, J.W.

    1997-01-01

    We have demonstrated the feasibility of conducting penetration tests on high efficiency particulate air (HEPA) filters as installed in nuclear ventilation systems. The in-place penetration test, which is designed to yield equivalent penetration measurements as the standard DOP efficiency test, is based on measuring the aerosol penetration of the filter installation as a function of particle size using a portable laser particle counter. This in-place penetration test is compared to the current in-place leak test using light scattering photometers for single HEPA filter installations and for HEPA filter plenums using the shroud method. Test results show the in-place penetration test is more sensitive than the in-place leak test, has a similar operating procedure, but takes longer to conduct. Additional tests are required to confirm that the in-place penetration test yields identical results as the standard dioctyl phthalate (DOP) penetration test for HEPA filters with controlled leaks in the filter and gasket and duct by-pass leaks. Further development of the procedure is also required to reduce the test time before the in-place penetration test is practical

  6. Performance of HEPA filters under severe conditions, 3

    International Nuclear Information System (INIS)

    Osaki, Makoto; Zanma, Tokugo; Kanagawa, Akira.

    1986-01-01

    Performance of high efficiency particulate air (HEPA) filters at temperatures from ambient to 240 deg C was measured to prove that HEPA filters kept up their regulated decontamination factor (DF) at elevated temperatures. The DF for NaCl aerosol was measured by using a laser particle spectrometer. Pressure drop of HEPA filters at elevated temperatures was also measured. The DF increased at elevated temperatures. The DF at 200 deg C was an order of magnitude higher than that at ambient. The change of DF at elevated temperatures of various HEPA filters was effectively evaluated by using the ratio of single fiber collection efficiencies at ambient to those at elevated temperatures. Pressure drop of HEPA filters also increased at elevated temperatures. The pressure drop at 200 deg C was 1.3 times larger than that at ambient. The change of DF and pressure drop at elevated temperatures was explained by applying Kirsh's theory to elevated temperatures. (author)
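
    The temperature dependence reported here can be related to the classical log-penetration law for fibrous media. The following is a hedged sketch of that relation; the coefficient alpha, which lumps media thickness, packing density, and fiber size, is an illustrative assumption rather than a value from the paper.

```python
import math

def penetration(alpha, eta_fiber):
    """Log-penetration law for a fibrous medium: P = exp(-alpha * eta),
    where eta is the single-fiber collection efficiency."""
    return math.exp(-alpha * eta_fiber)

def df_ratio(alpha, eta_hot, eta_ambient):
    """DF(T)/DF(ambient) implied by the change in single-fiber efficiency;
    DF = 1/P, so the ratio is exp(alpha * (eta_hot - eta_ambient))."""
    return penetration(alpha, eta_ambient) / penetration(alpha, eta_hot)
```

    Because eta enters the exponent, a modest rise in single-fiber efficiency at elevated temperature multiplies the DF strongly, which is consistent with the order-of-magnitude DF increase reported at 200 deg C.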

  7. In-place HEPA filter penetration test

    Energy Technology Data Exchange (ETDEWEB)

    Bergman, W.; Wilson, K.; Elliott, J. [Lawrence Livermore National Lab., CA (United States)] [and others]

    1997-08-01

    We have demonstrated the feasibility of conducting penetration tests on high efficiency particulate air (HEPA) filters as installed in nuclear ventilation systems. The in-place penetration test, which is designed to yield equivalent penetration measurements as the standard DOP efficiency test, is based on measuring the aerosol penetration of the filter installation as a function of particle size using a portable laser particle counter. This in-place penetration test is compared to the current in-place leak test using light scattering photometers for single HEPA filter installations and for HEPA filter plenums using the shroud method. Test results show the in-place penetration test is more sensitive than the in-place leak test, has a similar operating procedure, but takes longer to conduct. Additional tests are required to confirm that the in-place penetration test yields identical results as the standard dioctyl phthalate (DOP) penetration test for HEPA filters with controlled leaks in the filter and gasket and duct by-pass leaks. Further development of the procedure is also required to reduce the test time before the in-place penetration test is practical. 14 refs., 14 figs., 3 tabs.

  8. HEPA Help

    Science.gov (United States)

    Rathey, Allen

    2006-01-01

    Poor indoor air quality in school facilities can detract from the health and productivity of students, teachers and other employees. Asthma--often triggered or aggravated by dust--is the No. 1 cause of chronic absenteeism in schools. Using vacuum cleaners equipped with high-efficiency particulate air (HEPA) filters to clean education institutions…

  9. Efficient simulations of fluid flow coupled with poroelastic deformations in pleated filters

    KAUST Repository

    Calo, Victor M.

    2015-04-27

    Pleated filters are broadly used for various applications. In certain cases, especially in the solid-liquid separation case, the filtering media may deflect, and that may change the overall performance characteristics of the filter. From the modeling point of view, this is a challenging multiphysics problem, namely the interaction of the fluid with a so-called poroelastic structure. This work focuses on the development of an algorithm for the simulation of the Fluid Porous Structure Interaction (FPSI) problem in the case of pleated filtering media. The first part of the work is concerned with the development of a robust and accurate numerical method for solving the Stokes-Brinkman system of equations on quadrilateral grids. The mathematical model describes a free fluid flow coupled with a flow in porous media in a domain that contains the filtering media. To discretize the complex computational domain we use quadrilateral boundary-fitted grids which resolve porous-fluid interfaces. The Stokes-Brinkman system of equations is discretized here using a sophisticated finite volume method, namely the multi-point flux approximation (MPFA) O-method. MPFA is widely used, e.g., in solving scalar elliptic equations with full-tensor, highly varying coefficients and/or solving on heterogeneous non-orthogonal grids. To the authors' knowledge, there has been no investigation of MPFA discretization for Stokes-Brinkman problems, and this study aims to fill that gap. Some numerical experiments are presented in order to demonstrate the robustness of the proposed numerical algorithm [1]. The second part of this study focuses on the coupling of the flow model with the deflection of the filtering media. For the consideration of the FPSI problem in 3D, the classical Biot system describes coupled flow and deformations in a porous body due to the difference in the upstream and downstream pressures. Solving the Biot system of equations is complicated and requires a significant amount of

  10. In Situ Cleanable Alternative HEPA Filter Media

    International Nuclear Information System (INIS)

    Adamson, D. J.; Terry, M. T.

    2002-01-01

    The Westinghouse Savannah River Company, located at the Savannah River Site in Aiken, South Carolina, is currently testing two types of filter media for possible deployment as in situ regenerable/cleanable High Efficiency Particulate Air (HEPA) filters. The filters are being investigated to replace conventional, disposable, glass-fiber HEPA filters that require frequent removal, replacement, and disposal. This is not only costly and subjects site personnel to radiation exposure, but adds to the ever-growing waste disposal problem. The types of filter media being tested, as part of a National Energy Technology Laboratory procurement, are sintered nickel metal and ceramic monolith membrane. These media were subjected to a hostile environment to simulate conditions that challenge the high-level waste tank ventilation systems. The environment promoted rapid filter plugging to maximize the number of filter loading/cleaning cycles that would occur in a specified period of time. The filters were challenged using nonradioactive simulated high-level waste materials and atmospheric dust; materials that cause filter pluggage in the field. The filters are cleaned in situ using an aqueous solution. The study found that both filter media were insensitive to high humidity or moisture conditions and were easily cleaned in situ. The filters regenerated to approximately clean-filter status even after numerous plugging and in situ cleaning cycles. Air Techniques International is conducting particle retention testing on the filter media at the Oak Ridge Filter Test Facility. The filters are challenged using 0.3-μm dioctyl phthalate particles. Both the ceramic and sintered media have a particle retention efficiency > 99.97%. The sintered metal and ceramic filters not only can be cleaned in situ, but also hold great potential as a long-life alternative to conventional HEPA filters. The Defense Nuclear Facility Safety Board Technical Report, ''HEPA Filters Used in the Department of

  11. Experimental investigation of in situ cleanable HEPA filter

    International Nuclear Information System (INIS)

    Adamson, D.J.

    1999-01-01

    The Westinghouse Savannah River Company, located at the Savannah River Site (SRS) in Aiken, South Carolina, is currently testing the feasibility of developing an in situ cleanable high efficiency particulate air (HEPA) filter system. Sintered metal filters are being tested for regenerability or cleanability under simulated conditions found in a high level waste (HLW) tank ventilation system. The filters are being challenged using materials found in HLW tanks: simulated HLW salt, simulated HLW sludge, and South Carolina road dust. Various cleaning solutions have been used to clean the filters in situ. The tanks are equipped with a ventilation system to maintain the tank contents at negative pressure to prevent the release of radioactive material to the environment. This system is equipped with conventional disposable glass-fiber HEPA filter cartridges. Removal and disposal of these filters is not only costly, but subjects site personnel to radiation exposure and possible contamination. A test apparatus was designed to simulate the ventilation system of a HLW tank with an in situ cleaning system. Test results indicate that the Mott sintered metal HEPA filter is suitable as an in situ cleanable or regenerable HEPA filter. Data indicate that high humidity or water did not affect the filter performance, and the sintered metal HEPA filter was easily cleaned numerous times back to new-filter performance by an in situ spray system. The test apparatus allows the cleaning of the soiled HEPA filters to be accomplished without removing the filters from process. This innovative system would eliminate personnel radiation exposure associated with removal of contaminated filters and the high costs of filter replacement and disposal. The results of these investigations indicate that an in situ cleanable HEPA filter system for radioactive and commercial use could be developed and manufactured

  12. Performance of multiple HEPA filters against plutonium aerosols

    International Nuclear Information System (INIS)

    Gonzales, M.; Elder, J.; Ettinger, H.

    1975-01-01

    Performance of multiple stages of High Efficiency Particulate Air (HEPA) filters against aerosols similar to those produced by plutonium processing facilities has been verified as part of an experimental program. A system of three HEPA filters in series was tested against ²³⁸PuO₂ aerosol concentrations as high as 3.3 × 10¹⁰ d/s-m³. An air nebulization aerosol generation system, using ball-milled plutonium oxide suspended in water, provided test aerosols with size characteristics similar to those defined by a field sampling program at several different AEC plutonium processing facilities. Aerosols have been produced ranging from 0.22 μm activity median aerodynamic diameter (amad) to 1.6 μm amad. The smaller size distributions yield 10 to 30 percent of the total activity in the less than 0.22 μm size range, allowing efficiency measurement as a function of size for the first two HEPA filters in series. The low level of activity on the sampler downstream of the third HEPA filter (approximately 0.01 c/s) precludes aerosol size characterization downstream of this filter. For the first two HEPA filters, overall efficiency, and efficiency as a function of size, exceed 99.98 percent, including the <0.12 μm and the 0.12 to 0.22 μm size intervals. Efficiency of the third HEPA filter is somewhat lower, with an overall average efficiency of 99.8 percent and an apparent minimum efficiency of 99.5 percent. This apparently lower efficiency is an artifact due to the low level of activity on the sampler downstream of HEPA No. 3 and the variations due to counting statistics. Recent runs with higher concentrations, thereby improving statistical variations, show efficiencies well within minimum requirements. (U.S.)
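
    The series-stage arithmetic behind these measurements is simply that per-stage penetrations (1 - efficiency) multiply. A minimal sketch, using the stage efficiencies reported above:

```python
def overall_efficiency(stage_efficiencies):
    """Overall efficiency of filter stages in series: per-stage
    penetrations (1 - efficiency) multiply."""
    penetration = 1.0
    for eff in stage_efficiencies:
        penetration *= (1.0 - eff)
    return 1.0 - penetration

# Three stages at 99.98%, 99.98%, and 99.8% give an overall penetration of
# 2e-4 * 2e-4 * 2e-3 = 8e-11, i.e. an overall efficiency of 99.999999992%.
```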

  13. A method and machine for forming pleated and bellow tubes

    International Nuclear Information System (INIS)

    Banks, J.W.

    1975-01-01

    In the machine, the rollers outside the blank tube are rigidly supported to ensure accurate forming of each turn of the pleated tube, the latter being position-indexed independently of the already-formed turns. An inner roller is supported by a device that adjusts and indexes its position on a carriage. The tubes thus obtained are suitable, in particular, for forming expansion sealing joints for power generators or nuclear reactors

  14. Fundamental study on recovery uranium oxide from HEPA filters

    International Nuclear Information System (INIS)

    Izumida, T.; Noguchi, Y.

    1993-01-01

    Large numbers of spent HEPA filters are produced at uranium fuel fabrication facilities. Uranium oxide particles have been collected on these filters. Then, a spent HEPA filter treatment system was developed from the viewpoint of recovering the UO₂ and minimizing the volume. The system consists of a mechanical separation process and a chemical dissolution process. This paper describes the results of fundamental experiments on recovering UO₂ from HEPA filters

  15. Evaluation of HEPA filter service life

    International Nuclear Information System (INIS)

    Fretthold, J.K.; Stithem, A.R.

    1997-01-01

    Rocky Flats Environmental Technology Site (RFETS) has approximately 10,000 High Efficiency Particulate Air (HEPA) filters installed in a variety of filter plenums. These ventilation/filtration plenum systems are used to control the release of airborne particulate contaminants to the environment during normal operations and potential accidents. This report summarizes the results of destructive and non-destructive tests on HEPA filters obtained from a wide variety of ages and service conditions. These tests were performed to determine an acceptable service life criterion for HEPA filters used at RFETS. A total of 140 filters of various ages (1972 to 1996) and service histories (new, aged unused, used) were tested. For the purpose of this report, filter age from manufacture date/initial test date to the current sample date was used, as opposed to the actual time a filter was installed in an operating system

  16. Experience with HEPA filters at United States nuclear installations

    International Nuclear Information System (INIS)

    Bellamy, R.R.

    1977-01-01

    Part 50 of Title 10 of the United States Code of Federal Regulations requires that a number of atmosphere cleanup systems be included in the design of commercial nuclear power plants to be licensed in the United States. These filtering systems are to contain high efficiency particulate air (HEPA) filters for removal of radioactive particulate matter generated during normal and accident conditions. Recommendations for the design, testing and maintenance of the filtering systems and HEPA filter components are contained in a number of United States Nuclear Regulatory Commission documents and industry standards. This paper will discuss this published guidance available to designers of filtering systems and the plant operators of U.S. commercial nuclear power plants. The paper will also present a survey of published reports of experience with HEPA filters, failures and possible causes for the failures, and other abnormal occurrences pertaining to HEPA filters installed in U.S. nuclear power installations. A discussion will be included of U.S. practices for qualification of HEPA filters before installation, and verification of continued performance capability at scheduled intervals during operation

  17. DOE standard: Quality assurance inspection and testing of HEPA filters

    International Nuclear Information System (INIS)

    1999-02-01

    This standard establishes essential elements for the quality assurance inspection and testing of HEPA filters by US Department of Energy (DOE)-accepted Filter Test Facilities (FTF). The standard specifies HEPA filter quality assurance inspection and testing practices established in DOE-STD-3022-98, DOE HEPA Filter Test Program, and provides a basis for the preparation of written operating procedures for primary FTF functions

  18. Pressure transients across HEPA filters

    International Nuclear Information System (INIS)

    Gregory, W.; Reynolds, G.; Ricketts, C.; Smith, P.R.

    1977-01-01

    Nuclear fuel cycle facilities require ventilation for health and safety reasons. High efficiency particulate air (HEPA) filters are located within ventilation systems to trap radioactive dust released in reprocessing and fabrication operations. Pressure transients within the air cleaning systems may be such that the effectiveness of the filtration system is questioned under certain accident conditions. These pressure transients can result from both natural and man-made phenomena: atmospheric pressure drop caused by a tornado, or pressure pulses initiated by explosions and nuclear excursions, could create undesirable conditions across HEPA filters. Tornado depressurization is a relatively slow transient compared to the pressure pulses that result from combustible hydrogen-air mixtures. Experimental investigation of these pressure transients across air cleaning equipment has been undertaken by Los Alamos Scientific Laboratory and New Mexico State University. An experimental apparatus has been constructed to impose pressure pulses across HEPA filters. The experimental equipment is described, as are preliminary results using variable pressurization rates. Two modes of filtration of an aerosol injected upstream of the filter are examined. Laser instrumentation for measuring aerosol release during the transient is described

  19. HEPA air filter (image)

    Science.gov (United States)

    ... pet dander and other irritating allergens from the air. Along with other methods to reduce allergens, such ... controlling the amount of allergens circulating in the air. HEPA filters can be found in most air ...

  20. Use of evidence in 3 local level HEPA policies in Denmark

    DEFF Research Database (Denmark)

    Jakobsen, Mette Winge; Juel Lau, Cathrine; Skovgaard, Thomas

    2013-01-01

    of relevant evidence for HEPA, resources as well as organizational structure, culture and capacity. Discussion: Our insight into the actual impact of research in HEPA policy making is still sketchy. However, projects such as REPOPA will help to further our understanding of how research and other kind...... activity (HEPA) policies in 7 countries. This presentation draws on the Danish results of the policy analyses. Focus is on the use and the type of research used in three local level HEPA policies in Denmark. Methods: Three municipal level policies were selected for further investigation. Document analysis...

  1. ALTERNATE HIGH EFFICIENCY PARTICULATE AIR (HEPA) FILTRATION SYSTEM

    Energy Technology Data Exchange (ETDEWEB)

    Bruce Bishop; Robert Goldsmith; Karsten Nielsen; Phillip Paquette

    2002-08-16

    In Phase IIA of this project, CeraMem has further developed and scaled up ceramic HEPA filters that are appropriate for use in filtration of vent gas from HLW tanks at DOE sites around the country. This work included procuring recrystallized SiC monoliths, developing membrane and cement materials, and defining a manufacturing process for the production of prototype full-size HEPA filters. CeraMem has demonstrated that prototype full-size filters can be manufactured by producing 9 full-size filters that passed DOP aerosol testing at the Oak Ridge Filter Test Facility. One of these filters was supplied to the Savannah River Technical Center (SRTC) for process tests using simulated HLW tank waste. SRTC has reported that the filter was regenerable (with some increase in pressure drop) and that the filter retained its HEPA retention capability. CeraMem has also developed a Regenerable HEPA Filter System (RHFS) design and acceptance test plan that was reviewed by DOE personnel. The design and acceptance test plan form the basis of the system proposal for follow-on work in Phase IIB of this project.

  2. ALTERNATE HIGH EFFICIENCY PARTICULATE AIR (HEPA) FILTRATION SYSTEM

    International Nuclear Information System (INIS)

    Bruce Bishop; Robert Goldsmith; Karsten Nielsen; Phillip Paquette

    2002-01-01

    In Phase IIA of this project, CeraMem has further developed and scaled up ceramic HEPA filters that are appropriate for use in filtration of vent gas from HLW tanks at DOE sites around the country. This work included procuring recrystallized SiC monoliths, developing membrane and cement materials, and defining a manufacturing process for the production of prototype full-size HEPA filters. CeraMem has demonstrated that prototype full-size filters can be manufactured by producing 9 full-size filters that passed DOP aerosol testing at the Oak Ridge Filter Test Facility. One of these filters was supplied to the Savannah River Technical Center (SRTC) for process tests using simulated HLW tank waste. SRTC has reported that the filter was regenerable (with some increase in pressure drop) and that the filter retained its HEPA retention capability. CeraMem has also developed a Regenerable HEPA Filter System (RHFS) design and acceptance test plan that was reviewed by DOE personnel. The design and acceptance test plan form the basis of the system proposal for follow-on work in Phase IIB of this project

  3. Factors Influencing HEPA Filter Performance

    International Nuclear Information System (INIS)

    Parsons, M.S.; Waggoner, Ch.A.

    2009-01-01

    Properly functioning HEPA air filtration systems depend on a variety of factors that start with the use of fully characterized challenge conditions for system design, followed by process control during operation. This paper addresses factors that should be considered during the design phase as well as operating parameters that can be monitored to ensure filter function and lifetime. HEPA filters used in nuclear applications are expected to meet design, fabrication, and performance requirements set forth in the ASME AG-1 standard. The DOE publication Nuclear Air Cleaning Handbook (NACH) is an additional guidance document for the design and operation of HEPA filter systems in DOE facilities. These two guidelines establish basic maximum operating parameters for temperature, maximum aerosol particle size, maximum particulate matter mass concentration, acceptable differential pressure range, and filter media velocity. Each of these parameters is discussed along with data linking the variability of each parameter with filter function and lifetime. Temporal uncertainty associated with gas composition, temperature, and absolute pressure of the air flow can have a direct impact on the volumetric flow rate of the system, with a corresponding impact on filter media velocity. Correlations between standard units of flow rate (standard cubic meters per minute or standard cubic feet per minute) and actual volumetric flow rate are shown for variations in relative humidity over a 70 deg. C to 200 deg. C temperature range, as an example of a gas-composition effect that, uncorrected, will influence media velocity. The AG-1 standard establishes a 2.5 cm/s (5 feet per minute) ceiling for media velocities of nuclear grade HEPA filters. Data are presented that show the impact of media velocities from 2.0 to 4.0 cm/s (4 to 8 fpm) on differential pressure, filter efficiency, and filter lifetime. Data will also be presented correlating media velocity effects with two different particle size
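
    The standard-to-actual flow correction described above can be illustrated with the ideal-gas relation. In this sketch, the reference conditions and the example filter figures (nominal flow, media area) are assumptions for illustration, not values from the paper.

```python
def actual_flow(q_std_m3s, t_actual_k, p_actual_pa,
                t_std_k=294.26, p_std_pa=101325.0):
    """Convert a standard volumetric flow to actual conditions using the
    ideal-gas law: Q_act = Q_std * (T_act / T_std) * (P_std / P_act)."""
    return q_std_m3s * (t_actual_k / t_std_k) * (p_std_pa / p_actual_pa)

def media_velocity_cm_s(q_actual_m3s, media_area_m2):
    """Media velocity in cm/s = actual volumetric flow / total media area."""
    return 100.0 * q_actual_m3s / media_area_m2

# A nominal 1000 cfm (about 0.472 m^3/s) filter with roughly 20 m^2 of media
# runs near 2.4 cm/s at standard conditions, just under the AG-1 ceiling of
# 2.5 cm/s; the same standard flow heated to 200 deg C exceeds the ceiling.
```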

  4. Requirements for a cleanable steel HEPA filter derived from a systems analysis

    International Nuclear Information System (INIS)

    Bergman, W.

    1996-06-01

    A systems analysis was conducted to determine customer requirements for a cleanable high efficiency particulate air (HEPA) filter in DOE Environmental Management (EM) facilities. The three principal drivers for cleanable steel HEPA filters are large cost savings, improved filter reliability, and new regulations; together they give DOE customers a strong incentive to use cleanable steel HEPA filters. Input on customer requirements was obtained from field trips to EM sites and from discussions. Most existing applications require that cleanable steel HEPA filters meet the size and performance requirements of standard glass HEPA filters; applications in new facilities can relax size, weight, and pressure drop requirements on a case-by-case basis. We then obtained input from commercial firms on the availability of cleanable steel HEPA filters. The systems analysis showed that currently available technology was able to meet customer needs in only a limited number of cases. Further development is needed to meet the requirements of EM customers. For cleanable steel HEPA filters to be retrofitted into existing systems, pressure drop and weight must be reduced. Pressure drop can be reduced by developing steel fiber media from 0.5 μm diameter steel fibers. Weight can be reduced by packaging the steel fiber media in one of the standard HEPA configurations. Although most applications will be able to use standard 304 or 316L alloys, an acid resistant alloy such as Hastelloy or Inconel will be needed for incinerators and other thermal processes

  5. HEPA filter concerns - an overview

    Energy Technology Data Exchange (ETDEWEB)

    Leonard, J.F. [Department of Energy, Washington, DC (United States)

    1995-02-01

    The U.S. Department of Energy (DOE) recently initiated a complete review of the DOE High Efficiency Particulate Air (HEPA) Filter Program to identify areas for improvement. Although this process is currently ongoing, various issues and problems have already been identified for action that not only impact the DOE HEPA filter program, but potentially the national and international air cleaning community as well. This paper briefly reviews a few of those concerns that may be of interest and discusses actions initiated by the DOE to address the associated issues and problems. Issues discussed include: guidance standards, in-place testing, specifications, Test Facilities, portable units, vacuum cleaners, substitute aerosols, filter efficiencies, aging/shelf life/service life, fire suppression, handbook, Quality Products List (QPL), QA testing, and evaluations.

  6. Closure of 324 Facility potential HEPA filter failure unreviewed safety questions

    International Nuclear Information System (INIS)

    Enghusen, M.B.

    1997-01-01

    This document summarizes the activities which occurred to resolve an Unreviewed Safety Question (USQ) for the 324 Facility [Waste Technology Engineering Laboratory] involving a potential HEPA filter breach. The facility ventilation system had the capacity to fail the HEPA filters during accident conditions that would totally plug the filters. The ventilation system fans were modified to lower the fan operating parameters, preventing HEPA filter failures which might occur during accident conditions

  7. Survey of life-cycle costs of glass-paper HEPA filters

    International Nuclear Information System (INIS)

    Moore, P.; Bergman, W.; Gilbert, H.

    1992-08-01

    We have conducted a survey of the major users of glass-paper HEPA filters in the DOE complex to ascertain the life-cycle costs of these filters. The purchase price of the filters is only a minor portion of the costs; the major expenditures are incurred during the removal and disposal of contaminated filters. Through personal interviews, site visits and completion of questionnaires, we have determined the costs associated with the use of HEPA filters in the DOE complex. The total approximate life-cycle cost for a standard (2 ft x 2 ft x 1 ft) glass-paper HEPA filter is $3,000 for one considered low-level waste (LLW), $11,780 for transuranic (TRU) and $15,000 for high-level waste (HLW). The weighted-average cost for a standard HEPA filter in the complex is $4,753
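    The weighted-average figure combines the per-class life-cycle costs with the fleet mix across waste classes. A minimal sketch of that combination; the per-class costs come from the survey abstract, but the fleet fractions below are hypothetical, since the survey's actual mix is not reproduced here.

```python
# Weighted-average life-cycle cost per filter across waste classes.
# Per-class costs from the survey; fleet fractions are assumed.

COST_PER_FILTER = {"LLW": 3_000, "TRU": 11_780, "HLW": 15_000}
FLEET_FRACTION = {"LLW": 0.85, "TRU": 0.10, "HLW": 0.05}  # hypothetical mix

def weighted_average_cost(costs, fractions):
    """Fleet-weighted average; fractions must sum to 1."""
    assert abs(sum(fractions.values()) - 1.0) < 1e-9
    return sum(costs[k] * fractions[k] for k in costs)

avg = weighted_average_cost(COST_PER_FILTER, FLEET_FRACTION)
print(f"weighted-average cost: ${avg:,.0f}")  # depends on the assumed mix
```

    With a different (LLW-heavier or TRU-heavier) mix the average shifts accordingly; the survey's reported $4,753 implies a mix somewhat richer in TRU/HLW filters than the illustrative fractions above.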

  8. Self Cleaning High Efficiency Particulate Air (HEPA) Filtration without Interrupting Process Flow - 59347

    International Nuclear Information System (INIS)

    Chadwick, Chris

    2012-01-01

    The strategy of protecting the traditional glass fibre HEPA filtration train from its blinding contamination, and recovering the dust by means of self cleaning pre-filtration, is a proven means of reducing ultimate disposal volumes and has been used within the fuel production industry. However, there is an increasing demand in nuclear applications requiring elevated operating temperatures, fire resistance, moisture resistance and chemical compositions that existing glass fibre HEPA filtration cannot accommodate, which can be remedied by the use of a metallic HEPA filter medium. Previous research suggests that the then cost to the Department of Energy (DOE), based on a five year life cycle, was $29.5 million for the installation, testing, removal and disposal of glass fibre HEPA filtration trains. Within these costs, $300 was the value given to the filter and $4,450 was given to the peripheral activity. Development of a low cost, cleanable, metallic, direct replacement of the traditional filter train would be the clear solution. The Bergman et al. work suggested that a 1000 ft³/min, cleanable, stainless HEPA filter could be commercially available for $5,000 each, whereas the industry has determined that the truer cost of such an item in isolation would be closer to $15,000. This results in a conflict within the requirement between 'low cost' and 'stainless HEPA'. By proposing a system that combines metallic HEPA filtration with the ability to self clean without interrupting the process flow, the need for a traditional HEPA filtration train is eliminated, dramatically reducing the resources required for cleaning or disposal and thus presenting a route to reducing ultimate costs. The paper will examine the performance characteristics, filtration efficiency, flow versus differential pressure and cleanability of a self cleaning HEPA grade sintered metal filter element, together with data to prove the contention. (authors)

  9. A review of DOE HEPA filter component test activities

    Energy Technology Data Exchange (ETDEWEB)

    Slawski, J.W.; Bresson, J.F. [Informatics Corp., Inc., Albuquerque, NM (United States); Scripsick, R.C. [Los Alamos National Lab., NM (United States)

    1997-08-01

    All HEPA filters purchased for installation in DOE nuclear facilities are required to be tested at a Filter Test Facility (FTF) prior to installation. The number of HEPA filters purchased by DOE has been reduced so much that the Hanford FTF was closed. From Fiscal Year (FY) 1992 to 1994, funding was not provided to the FTF Technical Support Group (TSG) at the Los Alamos National Laboratory. As a consequence, Round Robin Tests (RRTs), performed twice each year by the FTFs to assess the consistency of test results among the FTFs, were not performed in FY 1992 and FY 1993. The Annual Reports of FTF test activities were not prepared for FY 1992 - 1995. Technical support provided to the FTFs was minimal. There is talk of closing a second FTF, and ongoing discussions as to whether DOE will continue to fund operation of the FTFs. In FY 1994, DOE Defense Programs commenced funding the TSG. RRT data for FY 1994 and 1995 have been entered into the database; the FY 1994 RRT report has been issued; and the FY 1995 RRT report is in progress. Data from semiannual reports have been retrieved and entered into the database. Standards related to HEPA filter test and procurement activities are now scheduled for issuance by FY 1996. Continuation of these activities depends on whether DOE will continue to support the HEPA filter test program. The history and activities of the FTFs and the TSG at Los Alamos have been reported at previous Air Cleaning Conferences. Data from the FY 1991 Annual Report of FTF activities were presented at the 1992 Air Cleaning Conference. Preparation of the Annual Reports was temporarily suspended in 1992. However, all of the FTF semiannual report data have been retrieved and entered into the database. This paper focuses primarily on the results of HEPA filter tests conducted by FTFs during FY 1992 - FY 1995, and the possible effects of the DOE program uncertainties on the quality of HEPA filters for installation at the DOE sites. 15 refs., 13 tabs.

  10. Development and evaluation of a HEPA filter for increased strength and resistance to elevated temperature

    International Nuclear Information System (INIS)

    Gilbert, H.; Bergman, W.; Fretthold, J.K.

    1992-01-01

    We have developed an improved HEPA filter with increased strength and resistance to elevated temperature, to improve the reliability of HEPA filters under accident conditions. The improvements to the HEPA filter consist of a silicone rubber sealant and a new HEPA medium reinforced with a glass cloth. Several prototype filters were built and evaluated for temperature and pressure resistance and resistance to rough handling. The temperature resistance test consisted of exposing the HEPA filter to 1,000 scfm at 700 degrees F for five minutes. The pressure resistance test consisted of exposing the HEPA filter to a differential pressure of 10 in. w.g. using a water saturated air flow at 95 degrees F. For the rough handling test, we used a vibrating machine designated the Q110. DOP filter efficiency tests were performed before and after each of the environmental tests. In addition to following the standard practice of using a separate new filter for each environmental test, we also subjected the same filter to the elevated temperature test followed by the pressure resistance test. The efficiency test results show that the improved HEPA filter is significantly better than the standard HEPA filter

  11. Performance of multiple HEPA filters against plutonium aerosols

    International Nuclear Information System (INIS)

    Gonzales, M.; Elder, J.C.; Tillery, M.I.; Ettinger, H.J.

    1976-11-01

    Performance of multiple stages of high-efficiency particulate air (HEPA) filters has been verified against plutonium aerosols similar in size characteristics to those challenging the air-cleaning systems of plutonium-processing facilities. An experimental program was conducted to test each filter in systems of three HEPA filters operated in series against ²³⁸PuO₂ aerosols as high as 3.3 × 10¹⁰ dis/s·m³ in activity concentration and ranging from 0.22 μm to 1.6 μm in activity median aerodynamic diameter (amad). Mean penetration (ratio of downstream to upstream concentration) of each of the three filters in series was below 0.0002, but it apparently increased at each successive filter. Penetration vs size measurements showed that maximum penetration of ²³⁸PuO₂ occurred for sizes between 0.4- and 0.7-μm aerodynamic diameter (Dae). HEPA filter penetration at half of rated flow differed little from full-flow penetration
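    The series behavior reported above follows from multiplying the per-stage penetrations, under the assumption that the stages act independently (which the observed stage-to-stage increase slightly violates). A minimal sketch:

```python
# Overall penetration of filter stages in series is the product of the
# per-stage penetrations, assuming independent stages and no size-shift
# of the challenge aerosol between stages.

def series_penetration(stage_penetrations):
    p = 1.0
    for stage in stage_penetrations:
        p *= stage
    return p

# Three stages each at the 0.0002 mean penetration reported above:
p_total = series_penetration([0.0002, 0.0002, 0.0002])
print(f"overall penetration: {p_total:.1e}")
print(f"overall efficiency:  {1 - p_total:.10%}")
```

    Even with penetration rising at each successive stage, three stages near 2 × 10⁻⁴ apiece put the combined penetration in the 10⁻¹¹ range, which is why multi-stage trains are used for plutonium facilities.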

  12. Remote aerosol testing of large size HEPA filter banks

    International Nuclear Information System (INIS)

    Franklin, B.; Pasha, M.; Bronger, C.A.

    1987-01-01

    Different methods of testing HEPA filter banks are described. Difficulties in remote testing of large banks of HEPA filters in series with minimum distances between banks, and with no available access upstream and downstream of the filter house, are discussed. Modifications incorporated to make the filter system suitable for remote testing without personnel re-entry into the filter house are described for a 51,000 m³/hr filter unit at the WIPP site

  13. FULL SCALE REGENERABLE HEPA FILTER DESIGN USING SINTERED METAL FILTER ELEMENTS

    International Nuclear Information System (INIS)

    Gil Ramos; Kenneth Rubow; Ronald Sekellick

    2002-01-01

    A Department of Energy funded contract involved the development of porous metal as a HEPA filter, and the subsequent design of a full-scale regenerable HEPA filtration system (RHFS). This RHFS could replace the glass fiber HEPA filters currently being used on the high level waste (HLW) tank ventilation system with a system that would be moisture tolerant, durable, and cleanable in place. The origins of the contract are a 1996 investigation at the Savannah River Technology Center (SRTC) regarding the use of porous metal as a HEPA filter material. This contract was divided into Phases I, IIA and IIB. Phase I of the contract evaluated simple filter cylinders in a simulated High Level Waste (HLW) environment and the ability to clean and regenerate the filter media after fouling. Upon the successful completion of Phase I, Phase IIA was conducted, which included lab scale prototype testing and design of a full-scale system. The work completed under Phase IIA included development of a full-scale system design, development of a filter media meeting the HEPA filtration efficiency that would also be regenerable using prescribed cleaning procedures, and the testing of a single element system prototype at Savannah River. All contract objectives were met. The filter media selected was a nickel material already under development at Mott, which met the HEPA filtration efficiency standard. The Mott nickel media met and exceeded the HEPA requirement, providing 99.99% removal against a requirement of 99.97%. Double open-ended elements of this media were provided to the Savannah River Test Center for HLW simulation testing in the single element prototype filter. These elements performed well and further demonstrated the practicality of a metallic media regenerable HEPA filter system. An evaluation of the manufacturing method on many elements demonstrated the reproducibility to meet the HEPA filtration requirement. 
The full-scale design of the Mott RHFS incorporated several important

  14. Degradation of HEPA filters exposed to DMSO

    International Nuclear Information System (INIS)

    Bergman, W.; Wilson, K.; Larsen, G.; Lopez, R.; LeMay, J.

    1994-01-01

    Dimethyl sulfoxide (DMSO) sprays are being used to remove the high explosive (HE) from nuclear weapons in the process of their dismantlement. A boxed 50 cfm HEPA filter with an integral prefilter was exposed to DMSO vapor and aerosols that were generated by a spray nozzle to simulate conditions expected in the HE dissolution operation. After 198 hours of operation, the pressure drop of the filter had increased from 1.15 inches to 2.85 inches, and the efficiency for 0.3 μm dioctyl sebacate (DOS) aerosols decreased from 99.992% to 98.6%. Most of the DMSO aerosols had collected as a liquid pool inside the boxed HEPA. The liquid was blown out of the filter exit with 100 cfm air flow at the end of the test. Since the filter still met the minimum allowed efficiency of 99.97% after 166 hours of exposure, we recommend replacing the filter every 160 hours of operation or sooner if the pressure drop increases by 50%. Examination of the filter showed that visible cracks appeared at the joints of the wooden frame and a portion of the sealant had pulled away from the frame. Since all of the DMSO will be trapped in the first HEPA filter, the second HEPA filter should not suffer from DMSO degradation. Thus the combined efficiency for the first filter (98.6%) and the second filter (99.97%) is 99.99996% for 0.3μm particles. If the first filter is replaced prior to its degradation, each of the filters will have 99.97% efficiency, and the combined efficiency will be 99.999991%. The collection efficiency for DMSO/HE aerosols will be much higher because the particle size is much greater

  15. Degradation of HEPA filters exposed to DMSO

    Energy Technology Data Exchange (ETDEWEB)

    Bergman, W.; Wilson, K.; Larsen, G. [Lawrence Livermore National Laboratory, CA (United States)] [and others

    1995-02-01

    Dimethyl sulfoxide (DMSO) sprays are being used to remove the high explosive (HE) from nuclear weapons in the process of their dismantlement. A boxed 50 cfm HEPA filter with an integral prefilter was exposed to DMSO vapor and aerosols that were generated by a spray nozzle to simulate conditions expected in the HE dissolution operation. After 198 hours of operation, the pressure drop of the filter had increased from 1.15 inches to 2.85 inches, and the efficiency for 0.3 μm dioctyl sebacate (DOS) aerosols decreased from 99.992% to 98.6%. Most of the DMSO aerosols had collected as a liquid pool inside the boxed HEPA. The liquid was blown out of the filter exit with 100 cfm air flow at the end of the test. Since the filter still met the minimum allowed efficiency of 99.97% after 166 hours of exposure, we recommend replacing the filter every 160 hours of operation or sooner if the pressure drop increases by 50%. Examination of the filter showed that visible cracks appeared at the joints of the wooden frame and a portion of the sealant had pulled away from the frame. Since all of the DMSO will be trapped in the first HEPA filter, the second HEPA filter should not suffer from DMSO degradation. Thus the combined efficiency for the first filter (98.6%) and the second filter (99.97%) is 99.99996% for 0.3 μm particles. If the first filter is replaced prior to its degradation, each of the filters will have 99.97% efficiency, and the combined efficiency will be 99.999991%. The collection efficiency for DMSO/HE aerosols will be much higher because the particle size is much greater.
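    The two-filter efficiency arithmetic above follows from multiplying the per-stage penetrations (independent stages assumed). A minimal sketch of the replacement scenario, where both filters are at the 99.97% minimum:

```python
# Series combination of two filter stages: total penetration is the
# product of the individual penetrations (1 - efficiency).

def combined_efficiency(e1, e2):
    """Combined efficiency of two independent filter stages in series."""
    return 1.0 - (1.0 - e1) * (1.0 - e2)

# Two filters each at the 99.97% minimum, as in the replacement scenario:
e = combined_efficiency(0.9997, 0.9997)
print(f"{e:.6%}")  # 99.999991%, matching the abstract
```

    The same function applies to the degraded first-filter case; the combined efficiency is dominated by whichever stage passes the most aerosol.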

  16. Filter Paper: Solution to High Self-Attenuation Corrections in HEPA Filter Measurements

    International Nuclear Information System (INIS)

    Oberer, R.B.; Harold, N.B.; Gunn, C.A.; Brummett, M.; Chaing, L.G.

    2005-01-01

    An 8 by 8 by 6 inch High Efficiency Particulate Air (HEPA) filter was measured as part of a uranium holdup survey in June of 2005, as it has been routinely measured every two months since 1998. Although the survey relies on gross gamma count measurements, this was one of a few measurements that had been converted to a quantitative measurement in 1998. The measurement was analyzed using the traditional Generalized Geometry Holdup (GGH) approach, using HMS3 software, with an area calibration and self-attenuation corrected with an empirical correction factor of 1.06. A result of 172 grams of ²³⁵U was reported. The actual quantity of ²³⁵U in the filter was approximately 1,700 g. Because of this unusually large discrepancy, the measurement of HEPA filters will be discussed. Various techniques for measuring HEPA filters will be described using the measurement of a 24 by 24 by 12 inch HEPA filter as an example. A new method to correct for self-attenuation will be proposed for this measurement. Following the discussion of the 24 by 24 by 12 inch HEPA filter, the measurement of the 8 by 8 by 6 inch filter will be discussed in detail
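    The scale of the problem can be illustrated with the standard far-field self-attenuation correction for a uniform slab deposit, CF = x / (1 − e⁻ˣ) with x = (μ/ρ) × areal density, which grows rapidly with loading. The mass attenuation coefficient and loadings below are illustrative assumptions, not the survey's calibration values.

```python
import math

# Far-field self-attenuation correction for a uniform slab deposit
# (generic GGH-style form). Inputs below are illustrative only.

def slab_self_attenuation_cf(mass_atten_cm2_g, areal_density_g_cm2):
    """CF = x / (1 - exp(-x)); multiply the raw result by CF."""
    x = mass_atten_cm2_g * areal_density_g_cm2
    if x < 1e-9:
        return 1.0  # optically thin deposit: essentially no correction
    return x / (1.0 - math.exp(-x))

# Thin deposit: correction stays near the small empirical 1.06 factor.
print(round(slab_self_attenuation_cf(2.6, 0.045), 3))
# Heavily loaded filter: the correction grows well beyond 1.06, which is
# why a fixed thin-deposit factor grossly under-reports a large holdup.
print(round(slab_self_attenuation_cf(2.6, 1.0), 2))
```

    A fixed 1.06 factor is only valid while the deposit stays optically thin; once the loading grows by an order of magnitude, the assay must re-evaluate the correction or it will report a small fraction of the true quantity, consistent with the 172 g versus ~1,700 g discrepancy described above.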

  17. Real world industrial solutions to cost and waste volume reduction using metallic HEPA/THE filtration together with an examination of effective HEPA Pre-Filtration Preventing the Blinding Solids from reaching the HEPA/THE filters and recovering the blinding solids for disposal, reducing both waste volume and cost

    International Nuclear Information System (INIS)

    Chadwick, Ch.

    2008-01-01

    The disposal costs of contaminated HEPA and THE filter elements have proved to be disproportionately high compared with the cost of the elements themselves. If HEPA filters could be cleaned to a condition where they could either be re-used or decontaminated to the extent that they could be stored as a lower cost waste form, or if HEPA/THE filter elements were available without any organic content likely to give rise to flammable or explosive decomposition gases during long term storage, the costs and monitoring necessary in storage would also be reduced. Using current state-of-the-art metallic filter media, it is possible to provide robust, completely inorganic, cleanable HEPA/THE filter elements to meet any duty already met by traditional glass-fibre HEPA/THE elements, within the same space limitations and with equivalent pressure loss. Additionally, traditional HEPA filter systems have limitations that often prevent them from solving many of the filtration problems in the nuclear industry. The paper will address several of these matters of concern by considering the use of metallic filter media to solve HEPA filtration problems ranging from the long term storage of transuranic waste at the WIPP site, spent and damaged fuel assemblies, glove box ventilation and tank venting, to the venting of fumes at elevated temperatures from incinerators, vitrification processes, conversion and sintering furnaces, as well as downstream of iodine absorbers in gas cooled reactors in the UK. The paper reviews the technology, development, performance characteristics, filtration efficiency, flow/differential pressure characteristics, cleanability and cost of sintered metal fiber media in comparison with traditional resin bonded glass fiber filter media and sintered metal powder filter media. Examples of typical filter element and system configurations and applications will be presented. In addition, the paper will also address the economic case for installing self cleaning pre

  18. Cost and waste volume reduction in HEPA filter trains by effective pre-filtration

    International Nuclear Information System (INIS)

    Chadwick, Chris

    2007-01-01

    Data published elsewhere (Moore, et al., 1992; Bergman et al., 1997) suggest that the then cost of disposable glass fibre HEPA filtration trains to the DOE was $55 million per year (based on an average usage of 11,748 HEPA panels per year between 1987 and 1990), $50 million of which was attributable to installation, testing, removal and disposal. The same authors suggest that by 1995 the number of HEPA panels being used had dropped to an estimated 4000 pieces per year due to the ending of the Cold War. The yearly cost to the DOE of 4000 units per year was estimated to be $29.5 million using the same parameters that previously suggested the $55 million figure. Within that cost estimate, $300 each was the value given to the filter and $4,450 was given to peripheral activity per filter. Clearly, if the $4,450 component could be reduced, tremendous savings could result, in addition to a significant reduction in the legacy burden of waste volumes. This same cost is applied to both the 11,748 and 4000 usage figures. The work up to now has focussed on the development of a low cost, long life (cleanable), direct replacement of the traditional filter train. This paper will review an alternative strategy: preventing the contaminating dust from reaching and blinding the HEPA filters, and thereby removing the need to replace them. What has become clear is that 'low cost' and 'metallic HEPA' are not compatible terms. The original Bergman et al., 1997 work suggested that 1000 cfm (cubic feet per minute) (1690 m³/hr) stainless HEPA filters could be commercially available for $5000 each after development (although the $70,000 development unit may be somewhat exaggerated - the author's own company has estimated that development units able to be retrofitted into strengthened standard housings would be available for perhaps $30,000). The likely true cost of such an item produced industrially in significant numbers may be closer to $15,000 each.
That being the case, the

  19. Hygroscopic Metamorphic 4D Pleats

    Science.gov (United States)

    Yang, Shu

    There has been significant interest in morphing 2D sheets into 3D structures via programmed out-of-plane distortion, including bending, tilting, rotating, and folding, as seen in recent origami and kirigami strategies. Hydrogel is one of the unique soft materials that can swell and shrink, thereby enabling real-time 4D motions in response to external stimuli such as pH, temperature, and moisture. To achieve reliable folding behaviors, a large amount of water molecules or ions often must diffuse in and out of the hydrogel sheet, so the entire sheet is immersed in an aqueous solution. Here, we demonstrate the design and folding of hierarchical pleats patterned from a combination of hydrophobic and hygroscopic materials, allowing us to spatially and locally control the water condensation induced by environmental humidity. In turn, we show out-of-plane deformation of the 2D sheets only in the patterned hygroscopic regions, much like the folding behaviors of many plants. By designing the dimension, geometry, and density of hygroscopic microstructures (as pixels) in the hydrophobic materials, we can display enhanced water condensation together with spatial guidance of the obtained droplets as unified water-harvesting systems. When the water droplets become large enough, they roll off from the hierarchical sheet along the inclined plane that is programmed by the hygroscopic motion of the hydrogel, and are eventually wrapped by the folded sheet to keep them from evaporating. We acknowledge support from NSF/EFRI-ODISSEI, EFRI 13-31583.

  20. Modelling of air flows in pleated filters and of their clogging by solid particles; Modelisation des ecoulements d'air et du colmatage des filtres plisses par des aerosols solides

    Energy Technology Data Exchange (ETDEWEB)

    Del Fabbro, L

    2002-07-01

    Air cleaning devices for particle removal are widely used in various branches of industry: nuclear, automotive, food, electronics, and others. Many of them are built from pleated porous media to increase the filtration surface and thus reduce the pressure drop for a given air flow. The objective of our work is to remedy an evident lack of knowledge on the evolution of the pressure drop of pleated filters during clogging, and to derive a model for it from experiments on industrial filters of the nuclear and automotive types. The resulting model is a function of the characteristics of the filtering medium and the pleats, the characteristics of the solid particles deposited on the filter, the mass of particles, and the aeraulic conditions of the air flow. It also depends on clogging data for flat filters of equivalent media. To develop this pressure drop model, an initial stage was carried out to characterize, experimentally and numerically, the pressure drop and the distribution of air flow in clean pleated filters of the nuclear (high efficiency particulate air filter, in glass fibers) and automotive (medium efficiency filter, in cellulose fibers) types. The numerical model made it possible to understand the fundamental role played by the aeraulic resistance of the filtering medium. From a non-dimensional approach, we established a semi-empirical pressure drop model for a clean pleated filter valid for both studied types of media; this model serves as the first building block for the development of the final clogging model. The study of the clogging of the filters showed the complexity of the phenomenon, which depends mainly on a reduction of the filtration surface. This observation leads us to propose a clogging of pleated filters in three phases. The first two phases are similar to those observed for flat filters, while the last phase corresponds to a reduction of the filtration surface and leads to a strong increase in the filter pressure drop
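    The role the abstract attributes to the medium's aeraulic resistance can be sketched with a Darcy-law term: pleating multiplies the filtration area, so for a fixed volumetric flow the media velocity, and with it the Darcy loss, falls as the pleat area ratio rises. The permeability, thickness, and geometry values below are illustrative assumptions, not the thesis's fitted parameters.

```python
# Darcy-law pressure drop of a clean filter medium, the dominant term a
# clean-pleated-filter model builds on. All numeric inputs are assumed.

MU_AIR = 1.81e-5  # Pa.s, dynamic viscosity of air at ~20 deg C

def darcy_dp(media_velocity_m_s, thickness_m, permeability_m2):
    """Pressure drop across a flat medium sample in the Darcy regime."""
    return MU_AIR * media_velocity_m_s * thickness_m / permeability_m2

def pleated_media_velocity(face_velocity_m_s, area_ratio):
    """Media velocity when pleating gives area_ratio x the frontal area."""
    return face_velocity_m_s / area_ratio

# Hypothetical duct face velocity 1 m/s with a 40:1 pleated-area gain:
u_media = pleated_media_velocity(1.0, area_ratio=40.0)
dp = darcy_dp(u_media, thickness_m=0.5e-3, permeability_m2=6.0e-12)
print(f"media velocity {u_media*100:.1f} cm/s, clean-media dP {dp:.0f} Pa")
```

    This captures only the medium term; the thesis's semi-empirical model adds the pleat-channel losses, and its clogging model then reduces the effective filtration surface over time, driving the strong late-phase pressure rise.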

  1. Effect of age on the structural integrity of HEPA filters

    International Nuclear Information System (INIS)

    Johnson, J.S.; Beason, D.G.; Smith, P.R.; Gregory, W.S.

    1989-01-01

    All of the controls on high-efficiency particulate air (HEPA) filters are based on rigid manufacturing standards with regard to filtration efficiency, temperature performance, pressure integrity, and strength. Third-party inspection and testing by the US Department of Energy increases the reliability of new HEPA filters, but only routine in-place testing is used to assure that an aging filter performs adequately. In 1980 the Lawrence Livermore National Laboratory initiated a small evaluation to determine if age has a significant effect on the structural integrity of HEPA filters. A series of used, uncontaminated filters dating back to 1965 was obtained for these tests. Tensile strength tests on the old media indicated a decrease in strength. To provide an additional measurement of the filters' overall strength, several of these aged filters were subjected to pressure pulses equivalent to the NRC Region I tornado pulses and shock-wave overpressures. Data from these tests indicate a decrease in breaking pressure of 25-50%. A large increase in complete filter pack blow-out during the simulated NRC Region I tornado tests was also observed. The preliminary results indicate the need for an administrative lifetime for HEPA filters used in critical nuclear facilities. Due to the unique conditions in each facility, different administrative lifetimes may be necessary

  2. Multi-Canister overpack internal HEPA filters

    International Nuclear Information System (INIS)

    SMITH, K.E.

    1998-01-01

    The rationale for locating a filter assembly inside each Multi-Canister Overpack (MCO) rather than include the filter in the Cold Vacuum Drying (CVD) process piping system was to eliminate the potential for contamination to the operators, processing equipment, and the MCO. The internal HEPA filters provide essential protection to facility workers from alpha contamination, both external skin contamination and potential internal depositions. Filters installed in the CVD process piping cannot mitigate potential contamination when breaking the process piping connections. Experience with K-Basin material has shown that even an extremely small release can result in personnel contamination and costly schedule disruptions to perform equipment and facility decontamination. Incorporating the filter function internal to the MCO rather than external is consistent with ALARA requirements of 10 CFR 835. Based on the above, the SNF Project position is to retain the internal HEPA filters in the MCO design

  3. A user's evaluation of radial flow HEPA filters

    International Nuclear Information System (INIS)

    Purcell, J.A.

    1992-07-01

    High efficiency particulate air (HEPA) filters of rectangular cross section have been used to remove particulates and the associated radioactivity from air ventilation streams since the advent of nuclear materials processing. Use of round axial flow HEPA filters is also longstanding. The advantages of radial flow filters in a circular configuration have been well demonstrated in UKAEA facilities during the last 5-7 years. An evaluation of radial flow filters for fissile process gloveboxes reveals several substantial benefits in addition to the advantages claimed for UKAEA facilities. The radial flow filter may be provided in a favorable geometry resulting in improved criticality safety. The filter configuration lends itself to in-place testing at the glovebox to exhaust duct interface. This will achieve compliance with DOE Order 6430.1A, Section 99.0.2. Preliminary testing at SRS of radial flow filters manufactured by Flanders Filters, Inc. revealed compliance with all the usual specifications for filtration efficiency, pressure differential and materials of construction. An evaluation, further detailed in this report, indicates that the radial flow HEPA filter should be considered for inclusion in new ventilation system designs

  4. Study on the Metal Fiber Filter Modeling for Capturing Radioactive Aerosol

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seunguk; Lee, Chanhyun; Park, Minchan; Lee, Jaekeun [EcoEnergy Research Institute, Busan (Korea, Republic of)

    2015-05-15

    The components of an air cleaning system are demisters to remove entrained moisture, pre-filters to remove the bulk of the particulate matter, high efficiency particulate air (HEPA) filters, iodine absorbers (generally activated carbon), and HEPA filters after the absorbers for redundancy and collection of carbon fines. The HEPA filters are the most important components for preventing radioactive aerosols from being released to the control room and the adjacent environment. The conventional HEPA filter has pleated media for low pressure drop. Consequently, the filters must provide high collection efficiency as well as low pressure drop. Unfortunately, conventional HEPA filters are made of glass fiber and polyester, and pose disposal issues since they cannot be recycled. In fact, 31,055 HEPA filters used in nuclear facilities in the U.S. are disposed of annually. Analyses at face velocities of 1 cm/s and 10 cm/s were also carried out, and they also show an R² value of 0.995. However, since official HEPA filter standards are established at a face velocity of 5 cm/s, this value will be used in further analysis. From the comparative studies carried out at different filter thicknesses and face velocities, a good correlation is found between the model and the experiment.
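The model-versus-experiment agreement quoted above is expressed as a coefficient of determination (R²). A minimal sketch of that comparison follows; the penetration values are invented for illustration and are not the paper's data.

```python
# Hypothetical illustration of comparing modeled and measured filter
# penetrations via the coefficient of determination (R^2).
# All numeric values below are invented, not taken from the paper.

def r_squared(measured, modeled):
    """R^2 = 1 - SS_res / SS_tot for paired observations."""
    mean = sum(measured) / len(measured)
    ss_tot = sum((y - mean) ** 2 for y in measured)
    ss_res = sum((y - f) ** 2 for y, f in zip(measured, modeled))
    return 1.0 - ss_res / ss_tot

# Invented penetration fractions at several particle sizes (face velocity 5 cm/s)
measured = [0.012, 0.034, 0.051, 0.040, 0.022]
modeled  = [0.013, 0.033, 0.050, 0.041, 0.023]

print(round(r_squared(measured, modeled), 3))  # → 0.995
```

An R² near 1 indicates that the single-fiber model reproduces the measured penetrations closely at the standard face velocity.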

  5. Investigation of water accumulation in an offgas test facility HEPA housing

    International Nuclear Information System (INIS)

    Speed, D.L.; Burns, D.B.; Van Pelt, W.B.; Burns, H.H.

    1997-01-01

    The Consolidated Incineration Facility (CIF) at the Department of Energy's Savannah River Site is designed to treat solid and liquid RCRA hazardous and mixed wastes generated by site operations and clean-up activities. During CIF's pretrial burn campaigns in 1995, an appreciable amount of water was recovered from the HEPA housings. Questions were immediately raised as to the source of the water and the degree of wetness of the filters during operation. There are two primary issues involved: water could reduce the life expectancy and performance of the HEPA filters, housing, and associated ducting; and wet HEPAs present radiological concerns for personnel during filter change-out. A similar phenomenon was noted at the Offgas Components Test Facility (OCTF), a 1/10 scale pilot of CIF's air pollution control system. Tests at the OCTF indicated the water's most likely origin to be vapor condensing out of the flue gas stream due to excessive air in-leakage at housing door seals, ducting flanges, and actual holes in the ducting. The rate of accumulation bears no statistical correlation to such process parameters as steam flow, reheater outlet temperature, and offgas velocity in the duct. Test results also indicated that the HEPA filter media are moistened by the initial process flow while the facility is being brought on line. However, even when the HEPA filters were manually drenched prior to startup, they became completely dry within four hours of the time steam was introduced to the reheater. Finally, no demonstrable relationship was found between the degree of filter media wetness and filter pressure drop

  6. Cost and waste volume reduction in HEPA filter trains by effective pre-filtration

    International Nuclear Information System (INIS)

    Chadwick, Chris; Kaufman, Seth

    2006-01-01

    Data published elsewhere (Moore et al. 1992; Bergman et al. 1997) suggest that the then cost of disposable glass-fibre HEPA filtration trains to the DOE was USD 55 million per year (based on an average usage of 11,748 HEPA panels per year between 1987 and 1990), USD 50 million of which was attributable to installation, testing, removal, and disposal - although the life cycle costs are themselves based on estimates dating from 1987-1990. The same authors suggest that by 1995 the number of HEPA panels being used had dropped to an estimated 4000 pieces per year due to the ending of the Cold War. The yearly cost to the DOE of 4000 units per year was estimated to be USD 29.5 million using the same parameters that suggested the previously stated USD 55 million for the larger quantity. Within that cost estimate, USD 300 was the value given to the filter and USD 4,450 was given to peripheral activity per filter. Clearly, if the USD 4,450 component could be reduced, tremendous savings could result, in addition to a significant reduction in the legacy burden of waste volumes. This same per-filter cost is applied to both the 11,748 and 4000 usage figures. The work up to now has focussed on the development of a low cost, long life (cleanable) direct replacement for the traditional filter train, but this paper will review an alternative strategy: preventing the contaminating dust from reaching and blinding the HEPA filters, and thereby removing the need to replace them. What has become clear is that 'low cost' and 'stainless HEPA' are not compatible terms. The original Bergman et al. work suggested that 1000 ft³/min stainless HEPAs could be commercially available for USD 5000 each after development (although the USD 70,000 development unit may be somewhat exaggerated - the authors have estimated that development units able to be retro-fitted into strengthened standard housings would be available for perhaps USD 30,000). The likely true cost of such an item produced
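The headline USD 55 million figure follows from the quoted per-filter values and usage rate. A rough reconstruction of that arithmetic (the component values are from the cited estimates; the multiplication is ours and only approximate, and the USD 29.5 million figure for 4000 units evidently includes factors beyond this simple product):

```python
# Approximate reconstruction of the quoted life-cycle cost arithmetic.
filter_cost = 300        # USD per HEPA filter (quoted estimate)
peripheral_cost = 4450   # USD per filter: installation, testing, removal, disposal

panels_per_year = 11748  # average annual usage, 1987-1990
total = panels_per_year * (filter_cost + peripheral_cost)
print(f"${total / 1e6:.1f} million per year")  # → $55.8 million per year
```

The product, about USD 55.8 million, is consistent with the stated USD 55 million per year, and makes clear that the USD 4,450 peripheral component dominates the total.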

  7. Response of HEPA filters to simulated-accident conditions

    International Nuclear Information System (INIS)

    Gregory, W.S.; Martin, R.A.; Smith, P.R.; Fenton, D.E.

    1982-01-01

    High-efficiency particulate air (HEPA) filters have been subjected to simulated accident conditions to determine their response to abnormal operating events. Both domestic and European standard and high-capacity filters have been evaluated to determine their response to simulated fire, explosion, and tornado conditions. The HEPA filter structural limitations for tornado and explosive loadings are discussed. In addition, filtration efficiencies during these accident conditions are reported for the first time. Our data indicate efficiencies between 80% and 90% for shock loadings below the structural limit level. We describe two types of testing for ineffective filtration - clean filters exposed to pulse-entrained aerosol and dirty filters exposed to tornado and shock pulses. Efficiency and material loss data are described. Also, the response of standard HEPA filters to simulated fire conditions is presented. We describe a unique method of measuring accumulated combustion products on the filter. Additionally, data relating pressure drop to accumulated mass during plugging are reported for simulated combustion aerosols. The effects of concentration and moisture levels on filter plugging were evaluated. We are obtaining all of the above data so that mathematical models can be developed for fire, explosion, and tornado accident analysis computer codes. These computer codes can be used to assess the response of nuclear air cleaning systems to accident conditions
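Pressure-drop-versus-accumulated-mass data of the kind described above are commonly fitted with a linear clogging (cake filtration) model. The sketch below shows that generic form; it is an illustration under assumed parameter values, not the authors' model, and the constants are invented.

```python
# Generic linear clogging model for a loading HEPA filter:
#   dP = dP_clean + K2 * (M / A) * v
# where K2 is a specific cake resistance fitted to pressure-drop-vs-mass data.
# All parameter values here are invented for illustration.

def filter_dp(mass_kg, dp_clean_pa=250.0, k2=1.7e6, area_m2=21.0, velocity_m_s=0.025):
    """Pressure drop (Pa) after depositing `mass_kg` of aerosol on the filter."""
    return dp_clean_pa + k2 * (mass_kg / area_m2) * velocity_m_s

print(round(filter_dp(0.0)))   # clean-filter pressure drop → 250
print(round(filter_dp(0.5)))   # partially loaded filter
```

A fit of K2 to measured loading curves is one way such data feed the accident-analysis computer codes mentioned in the abstract.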

  8. Evaluation of data from HEPA filter quality assurance testing stations

    International Nuclear Information System (INIS)

    Collins, J.T.; Bellamy, R.R.; Allen, J.R.

    1979-01-01

    In Revision 1 to Regulatory Guide 1.52, issued in July 1976, the NRC recommended that high efficiency particulate air (HEPA) filters for use in engineered safety features (ESF) atmosphere cleanup systems be visually inspected and dioctyl phthalate (DOP) tested at either of two Department of Energy (DOE) operated QA Filter Testing Stations prior to their installation and use in commercial nuclear power plants. This practice was initiated because filter vendors were unable to consistently provide a HEPA filter that would meet the stringent requirements established by DOE and NRC and its predecessor, the AEC. In 1977, the NRC staff undertook a program to revise Regulatory Guide 1.52 to reflect recently issued industry standards (e.g., ANSI N509 and N510) and current industry practices. Revision 2 to Regulatory Guide 1.52 was formally issued in March 1978. In conducting this review, the recommendation that HEPA filters intended for use in ESF systems in commercial nuclear power plants be routinely tested at the DOE QA Filter Testing Stations was reevaluated. As part of this evaluation, a detailed analysis of the filter test results recorded by the two QA Testing Stations during the period 1971 to 1977 was conducted. This paper summarizes the results of the analysis and explains the rationale for deleting the requirement that all HEPA filters intended for use in ESF systems be tested at the QA Testing Stations

  9. Ceramic High Efficiency Particulate Air (HEPA) Filter Final Report CRADA No. TC02160.0

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Bergman, W. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-08-25

    The technical objective of this project was to develop a ceramic HEPA filter technology, by initially producing and testing coupon ceramics, small scale prototypes, and full scale prototype HEPA filters, and to address relevant manufacturing and commercialization technical issues.

  10. Determination of HEPA Filter Efficiency With Dioctyl Phthalate Aerosol

    International Nuclear Information System (INIS)

    Bunawas; Ruslanto, P O; Suhariyono, G

    1996-01-01

    Ultrafine aerosol filtration by HEPA (High Efficiency Particulate Air) filters was determined experimentally, based on the measurement of monodisperse dioctyl phthalate (DOP) aerosol concentration before and after passing through the test filter. Using this technique, filter efficiency can be determined as a function of aerosol diameter over the range from 0.017 to 0.747 µm. The average efficiencies for the Whatman-41, Whatman-42, and Whatman GF/A filters were 56.14%, 95.74%, and 99.65%, respectively. The Gelman A fiber glass and Whatman membrane filters fulfilled the criterion for HEPA filters according to the IAEA standard, because of their minimum efficiency of 99.90%
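The efficiency determination described above reduces to comparing the aerosol concentration upstream and downstream of the test filter. A minimal sketch of that calculation (the concentration values are illustrative, not the paper's data):

```python
# Filter collection efficiency from upstream/downstream aerosol concentrations.
# Concentrations may be in any consistent units (e.g. particles per cm^3).

def filter_efficiency(c_upstream, c_downstream):
    """Collection efficiency in percent: E = (1 - C_down / C_up) * 100."""
    return (1.0 - c_downstream / c_upstream) * 100.0

# Illustrative values: 10,000 particles/cm^3 upstream, 35 downstream
print(f"{filter_efficiency(10000, 35):.2f} %")  # → 99.65 %
```

Repeating the measurement with monodisperse aerosols of different diameters yields the efficiency-versus-size curve the abstract refers to.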

  11. Review of Department of Energy HEPA filter test activities

    International Nuclear Information System (INIS)

    McIntyre, J.A.

    1993-01-01

    Filter Test Facilities (FTFs) and the FTF Technical Support Group (TSG) continue to provide services to the Department of Energy (DOE). Additional tasks relating to the HEPA filter cycle have been added to the TSG. These tasks include the quality assessment review for the in-place testing of HEPA filters at DOE sites and the formation of an in-place testing standards writing group. A summary of ongoing FTF and TSG activities for FY 1990-FY 1992 is presented, including the technical input for implementation of the High Flow Alternative Test System (HFATS), the update of the DOE standards, and the status of the quality assessment review and the in-place testing standards writing group

  12. HEPA filter fire (and subsequent unfiltered release)

    International Nuclear Information System (INIS)

    Powers, T.B.

    1996-01-01

    This document supports the development and presentation of the following accident scenario in the TWRS Final Safety Analysis Report: HEPA Filter Failure - Exposure to High Temperature or Pressure. The calculations needed to quantify the risk associated with this accident scenario are included within

  13. Mini-pleat filters for improved indoor air quality. Filtri a 'piccole pieghe' per una migliore qualita' dell'aria negli ambienti civili e negli impianti industriali

    Energy Technology Data Exchange (ETDEWEB)

    Zucchelli, D.

    1992-07-01

    Advanced manufacturing techniques applied to the fabrication of air filters have led to the creation of a high quality, high efficiency mini-pleat filter which, however, has yet to see wide use in commercial space heating, ventilation, and air conditioning systems. Now, with greater attention being given to indoor air quality, these high performance filters should see greater market demand. This paper discusses the design and performance characteristics of mini-pleat filters and surveys the range of models currently available on the market.

  14. Testing cleanable/reuseable HEPA prefilters for mixed waste incinerator air pollution control systems

    Energy Technology Data Exchange (ETDEWEB)

    Burns, D.B.; Wong, A.; Walker, B.W.; Paul, J.D. [Westinghouse Savannah River Co., Aiken, SC (United States)

    1997-08-01

    The Consolidated Incineration Facility (CIF) at the US DOE Savannah River Site is undergoing preoperational testing. The CIF is designed to treat solid and liquid RCRA hazardous and mixed wastes from site operations and clean-up activities. The technologies selected for use in the air pollution control system (APCS) were based on reviews of existing incinerators, air pollution control experience, and recommendations from consultants. This approach resulted in a facility design using experience from other operating hazardous/radioactive incinerators. In order to study the CIF APCS prior to operation, a 1/10 scale pilot facility, the Offgas Components Test Facility (OCTF), was constructed and has been in operation since late 1994. Its mission is to demonstrate the design integrity of the CIF APCS and optimize equipment/instrument performance of the full scale production facility. Operation of the pilot facility has provided long-term performance data on integrated systems and critical facility components. This has reduced facility startup problems and helped ensure compliance with facility performance requirements. Technical support programs assist in assuring all stakeholders that the CIF can properly treat combustible hazardous, mixed, and low-level radioactive wastes. High Efficiency Particulate Air (HEPA) filters are used to remove hazardous and radioactive particulates from the exhaust gas stream before it is released into the atmosphere. The HEPA filter change-out frequency has been a potential issue and was the first technical issue to be studied at the OCTF. Tests were conducted to evaluate the performance of HEPA filters under different operating conditions. These tests included evaluating the impact on HEPA life of scrubber operating parameters and the type of HEPA prefilter used. This pilot-scale testing demonstrated satisfactory HEPA filter life when using cleanable metal prefilters and high flows of steam and water in the offgas scrubber. 8 figs., 2 tabs.

  15. Criteria for calculating the efficiency of HEPA filters during and after design basis accidents

    International Nuclear Information System (INIS)

    Bergman, W.; First, M.W.; Anderson, W.L.; Gilbert, H.; Jacox, J.W.

    1994-12-01

    We have reviewed the literature on the performance of high efficiency particulate air (HEPA) filters under normal and abnormal conditions to establish criteria for calculating the efficiency of HEPA filters in a DOE nonreactor nuclear facility during and after a Design Basis Accident (DBA). The literature review included the performance of new filters and the parameters that may cause deterioration in filter performance, such as filter age, radiation, corrosive chemicals, seismic and rough handling, high temperature, moisture, particle clogging, high air flow, and pressure pulses. The deterioration of the filter efficiency depends on the exposure parameters; in severe exposure conditions the filter will be structurally damaged and have a residual efficiency of 0%. Despite the many studies on HEPA filter performance under adverse conditions, there are large gaps and limitations in the data that introduce significant error in the estimates of HEPA filter efficiencies under DBA conditions. Because of this limitation, conservative values of filter efficiency were chosen when there was insufficient data
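The criteria described above amount to a threshold test: if the accident pressure drop exceeds the structural damage threshold for the prevailing environmental conditions, the filter is assumed failed (0% efficiency); otherwise a conservative literature efficiency is assigned. A minimal sketch of that decision logic follows; the threshold and efficiency numbers are placeholders, not values from the report.

```python
# Sketch of the DBA efficiency-assignment logic: damage threshold depends on
# environmental condition; exceeding it implies structural failure.
# All numeric values below are hypothetical placeholders.

DAMAGE_THRESHOLD_KPA = {"dry": 10.0, "wet": 2.5}       # hypothetical thresholds
CONSERVATIVE_EFFICIENCY = {"dry": 99.9, "wet": 99.0}   # hypothetical, percent

def dba_filter_efficiency(dp_kpa, condition="dry"):
    """Assigned HEPA efficiency (%) given accident pressure drop and condition."""
    if dp_kpa > DAMAGE_THRESHOLD_KPA[condition]:
        return 0.0  # filter structurally damaged: zero residual efficiency
    return CONSERVATIVE_EFFICIENCY[condition]

print(dba_filter_efficiency(1.0, "wet"))   # below threshold → 99.0
print(dba_filter_efficiency(5.0, "wet"))   # above wet threshold → 0.0
```

The conservative choice of efficiency values reflects the data gaps the abstract identifies: where the literature is insufficient, the lower bound is used.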

  16. Structural testing of salt loaded HEPA filters for WIPP

    International Nuclear Information System (INIS)

    Smith, P.R.; Leslie, I.H.; Hensel, E.C.; Shultheis, T.M.; Walls, J.R.

    1993-01-01

    The ventilation studies of the Waste Isolation Pilot Plant described in this paper were performed by personnel from New Mexico State University in collaboration with Sandia National Laboratories, Los Alamos National Laboratory, and Westinghouse Corporation. High efficiency particulate air filters (0.61 m by 0.61 m by 0.3 m) of the type in use at the Waste Isolation Pilot Plant were loaded with salt aerosol provided from that site. The structural strength of the salt-loaded high-efficiency filters was investigated at two humidity levels, high (75% RH) and low (13-14% RH), by subjecting the filters to pressure transients of the types expected from tornadoes. Filters loaded under the high humidity condition proved to have greater structural strength than filters loaded under the low humidity condition when both were subjected to tornado-like pressure pulses. This unexpected result was apparently due to the crystallization of salt upon the wire face guard of the HEPA filters loaded under the high humidity condition, which kept salt from penetrating the filter medium while still providing a substantial pressure drop at the standard flow rate. Results are also presented for HEPA filters pre-conditioned at 100% RH before structural testing and for HEPA filters in series with pre-filters

  17. Potential for HEPA filter damage from water spray systems in filter plenums

    Energy Technology Data Exchange (ETDEWEB)

    Bergman, W. [Lawrence Livermore National Lab., CA (United States); Fretthold, J.K. [Rocky Flats Safe Sites of Colorado, Golden, CO (United States); Slawski, J.W. [Department of Energy, Germantown, MD (United States)

    1997-08-01

    The water spray systems in high efficiency particulate air (HEPA) filter plenums that are used in nearly all Department of Energy (DOE) facilities for protection against fire were designed under the assumption that the HEPA filters would not be damaged by the water sprays. The most likely scenario for filter damage involves filter plugging by the water spray, followed by the fan blowing out the filter medium. A number of controlled laboratory tests previously conducted in the late 1980s are reviewed in this paper to provide a technical basis for the potential HEPA filter damage by the water spray system in HEPA filter plenums. In addition to the laboratory tests, the scenario for HEPA filter damage during fires has also occurred in the field. A fire in a four-stage HEPA filter plenum at Rocky Flats in 1980 caused the first three stages of HEPA filters to blow out of their housing and the fourth stage to severely bow. Details of this recently declassified fire are presented in this paper. Although these previous findings suggest serious potential problems exist with the current water spray system in filter plenums, additional studies are required to confirm unequivocally that DOE's critical facilities are at risk. 22 refs., 15 figs.

  18. Improved remote HEPA filtration development program

    International Nuclear Information System (INIS)

    Wilson, C.E. III.

    1987-03-01

    This paper presents a summary of the prototype development and hot cell mock-up testing program undertaken to adapt a commercial remote HEPA filter housing for use in the Process Facility Modification Project (PFMP). This program was initiated in response to the project design criteria and documentation that required the air from the hot cell environment to be exhausted through three stages of HEPA filtration. Due to the anticipated quantity of radioactive contamination captured by the first stage of filters, it was determined that the first stage would need to be located in a remotely operated and maintained shielded cell adjoining the primary hot cell areas. Commercially available remote filtration equipment was evaluated and a candidate unit was identified which could be developed into a suitable filter housing. A candidate unit was obtained from Flanders Filters, Inc., and a series of hot cell mock-up tests was conducted in the 305 facility at the Hanford site. The results of these tests, and further interaction with the vendor, led to a prototype remote filter housing which satisfied most PFMP criteria and proved to be significantly superior to existing commercial units for remote operation/maintenance

  19. Overexpression of HepaCAM inhibits cell viability and motility through suppressing nucleus translocation of androgen receptor and ERK signaling in prostate cancer.

    Science.gov (United States)

    Song, Xuedong; Wang, Yin; Du, Hongfei; Fan, Yanru; Yang, Xue; Wang, Xiaorong; Wu, Xiaohou; Luo, Chunli

    2014-07-01

    HepaCAM is suppressed in a variety of human cancers and is involved in cell adhesion, growth, migration, invasion, and survival. However, the expression and function of HepaCAM in prostate cancer are still unknown. HepaCAM expression was detected by RT-PCR, Western blotting, and immunohistochemistry staining in the prostate cell lines RWPE-1, LNCaP, DU145, and PC3, and in 75 human prostate tissue specimens, respectively. Cell proliferation was assessed by WST-8 assay. The role of HepaCAM in prostate cancer cell migration and invasion was examined by wound healing and transwell assays, and flow cytometry was used to observe apoptosis of prostate cancer cells. We then detected changes in androgen receptor translocation and ERK signaling using immunofluorescence staining and Western blotting after overexpression of HepaCAM. HepaCAM expression was significantly down-regulated in prostate cancer tissues and undetected in prostate cancer cells. However, low HepaCAM expression was not statistically associated with the clinicopathological characteristics of prostate cancer. Overexpression of HepaCAM in prostate cancer cells decreased cell proliferation, migration, and invasion, and induced apoptosis. Meanwhile, HepaCAM prevented androgen receptor translocation from the cytoplasm to the nucleus and down-regulated MAPK/ERK signaling. Our results suggest that HepaCAM acts as a tumor suppressor in prostate cancer, inhibiting cell viability and motility, possibly by suppressing the nuclear translocation of the androgen receptor and down-regulating ERK signaling. HepaCAM may therefore be a potential therapeutic target for prostate cancer. © 2014 Wiley Periodicals, Inc.

  20. Advantageous use of HepaRG cells for the screening and mechanistic study of drug-induced steatosis

    Energy Technology Data Exchange (ETDEWEB)

    Tolosa, Laia [Unidad de Hepatología Experimental, Instituto de Investigación Sanitaria La Fe, Valencia 46026 (Spain); Gómez-Lechón, M. José [Unidad de Hepatología Experimental, Instituto de Investigación Sanitaria La Fe, Valencia 46026 (Spain); CIBERehd, FIS, Barcelona 08036 (Spain); Jiménez, Nuria [Unidad de Hepatología Experimental, Instituto de Investigación Sanitaria La Fe, Valencia 46026 (Spain); Hervás, David [Biostatistics Unit, Instituto de Investigación Sanitaria La Fe, Valencia 46026 (Spain); Jover, Ramiro [Unidad de Hepatología Experimental, Instituto de Investigación Sanitaria La Fe, Valencia 46026 (Spain); CIBERehd, FIS, Barcelona 08036 (Spain); Departamento de Bioquímica y Biología Molecular, Facultad de Medicina, Universidad de Valencia, Valencia 46010 (Spain); Donato, M. Teresa, E-mail: donato_mte@gva.es [Unidad de Hepatología Experimental, Instituto de Investigación Sanitaria La Fe, Valencia 46026 (Spain); CIBERehd, FIS, Barcelona 08036 (Spain); Departamento de Bioquímica y Biología Molecular, Facultad de Medicina, Universidad de Valencia, Valencia 46010 (Spain)

    2016-07-01

    Only a few in vitro assays have been proposed to evaluate the steatotic potential of new drugs. The present study examines the utility of HepaRG cells as a cell-based assay system for screening drug-induced liver steatosis. A high-content screening assay was run to evaluate multiple toxicity-related cell parameters in HepaRG cells exposed to 28 compounds, including drugs reported to cause steatosis through different mechanisms and non-steatotic compounds. Lipid content was the most sensitive parameter for all the steatotic drugs, whereas no effects on lipid levels were produced by non-steatotic compounds. Apart from fat accumulation, increased ROS production and altered mitochondrial membrane potential were also found in the cells exposed to steatotic drugs, which indicates that all these cellular events contributed to drug-induced hepatotoxicity. These findings are of clinical relevance as most effects were observed at drug concentrations under 100-fold of the therapeutic peak plasma concentration. HepaRG cells showed increased lipid overaccumulation vs. HepG2 cells, which suggests greater sensitivity to drug-induced steatosis. An altered expression profile of transcription factors and the genes that code key proteins in lipid metabolism was also found in the cells exposed to drugs capable of inducing liver steatosis. Our results generally indicate the value of HepaRG cells for assessing the risk of liver damage associated with steatogenic compounds and for investigating the molecular mechanisms involved in drug-induced steatosis. - Highlights: • HepaRG cells were explored as an in vitro model to detect steatogenic potential. • Multiple toxicity-related endpoints were analysed by HCS. • HepaRG cells showed a greater sensitivity to drug-induced steatosis than HepG2 cells. • Changes in the expression of genes related to lipid metabolism were revealed. • HepaRG cells allow mechanistic understanding of liver damage induced by steatogenic drugs.

  1. Advantageous use of HepaRG cells for the screening and mechanistic study of drug-induced steatosis

    International Nuclear Information System (INIS)

    Tolosa, Laia; Gómez-Lechón, M. José; Jiménez, Nuria; Hervás, David; Jover, Ramiro; Donato, M. Teresa

    2016-01-01

    Only a few in vitro assays have been proposed to evaluate the steatotic potential of new drugs. The present study examines the utility of HepaRG cells as a cell-based assay system for screening drug-induced liver steatosis. A high-content screening assay was run to evaluate multiple toxicity-related cell parameters in HepaRG cells exposed to 28 compounds, including drugs reported to cause steatosis through different mechanisms and non-steatotic compounds. Lipid content was the most sensitive parameter for all the steatotic drugs, whereas no effects on lipid levels were produced by non-steatotic compounds. Apart from fat accumulation, increased ROS production and altered mitochondrial membrane potential were also found in the cells exposed to steatotic drugs, which indicates that all these cellular events contributed to drug-induced hepatotoxicity. These findings are of clinical relevance as most effects were observed at drug concentrations under 100-fold of the therapeutic peak plasma concentration. HepaRG cells showed increased lipid overaccumulation vs. HepG2 cells, which suggests greater sensitivity to drug-induced steatosis. An altered expression profile of transcription factors and the genes that code key proteins in lipid metabolism was also found in the cells exposed to drugs capable of inducing liver steatosis. Our results generally indicate the value of HepaRG cells for assessing the risk of liver damage associated with steatogenic compounds and for investigating the molecular mechanisms involved in drug-induced steatosis. - Highlights: • HepaRG cells were explored as an in vitro model to detect steatogenic potential. • Multiple toxicity-related endpoints were analysed by HCS. • HepaRG cells showed a greater sensitivity to drug-induced steatosis than HepG2 cells. • Changes in the expression of genes related to lipid metabolism were revealed. • HepaRG cells allow mechanistic understanding of liver damage induced by steatogenic drugs.

  2. The impact of metallic filter media on HEPA filtration

    International Nuclear Information System (INIS)

    Chadwick, Chris; Kaufman, Seth

    2006-01-01

    Traditional HEPA filter systems have limitations that often prevent them from solving many of the filtration problems in the nuclear industry, particularly in applications where long service or storage life, high levels of radioactivity, dangerous decomposition products, chemical aggression, organic solvents, elevated operating temperatures, fire resistance, and resistance to moisture are issues. This paper addresses several of these matters of concern by considering the use of metallic filter media to solve HEPA filtration problems ranging from the long term storage of transuranic waste at the WIPP site, spent and damaged fuel assemblies, glove box ventilation, and tank venting to the venting of fumes at elevated temperatures from incinerators, vitrification processes, and conversion and sintering furnaces, as well as downstream of iodine absorbers in gas cooled reactors in the UK. The paper reviews the basic technology, development, performance characteristics and filtration efficiency, flow versus differential pressure, cleanability, and costs of sintered metal fiber in comparison with traditional resin-bonded glass fiber filter media and sintered metal powder filter media. Examples of typical filter element and system configurations and applications will be presented. The paper will also address the economic case for installing self-cleaning pre-filtration, using metallic media, to recover the small volumes of dust that would otherwise blind large volumes of final disposable HEPA filters, thus presenting a route to reduce ultimate disposal volumes and secondary waste streams. (authors)
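Flow-versus-differential-pressure comparisons of the kind reviewed above are often summarized for clean media by Darcy's law, dP = mu * v * t / k. The sketch below applies that relation under invented permeability and thickness values (they are illustrative only, not figures from the paper):

```python
# Darcy's-law comparison of clean-media pressure drop for two filter media.
# Permeability and thickness values are invented for illustration.

MU_AIR = 1.81e-5  # Pa*s, dynamic viscosity of air at ~20 C

def media_dp(face_velocity_m_s, thickness_m, permeability_m2):
    """Clean-media pressure drop (Pa) from Darcy's law: dP = mu * v * t / k."""
    return MU_AIR * face_velocity_m_s * thickness_m / permeability_m2

glass_fiber = media_dp(0.025, 0.4e-3, 7.0e-13)   # hypothetical glass-fiber paper
metal_fiber = media_dp(0.025, 0.6e-3, 1.2e-12)   # hypothetical sintered metal fiber
print(round(glass_fiber, 1), round(metal_fiber, 1))
```

Fitting the permeability k to measured flow/dP curves gives a one-parameter basis for comparing media of different thickness and construction at a common face velocity.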

  3. Summary of meeting on disposal of LET&D HEPA filters

    International Nuclear Information System (INIS)

    1991-01-01

    This report is a compilation of correspondence between Westinghouse Idaho Nuclear Company and the US EPA over a period of time from 1988 to 1992 (most from 1991-92) regarding waste management compliance with EPA regulations. Typical subjects include: compliance with satellite accumulation requirements; usage of "Sure Shot" containers in place of aerosol cans; notice of upcoming recyclable battery shipments; disposition of batteries; HEPA filter leach sampling and permit impacts; functional and operational requirements for the spent filter handling system; summary of meeting on disposal of LET&D HEPA filters; solvent substitution database report; and mercury vapor light analytical testing

  4. A single standard for in-place testing of DOE HEPA filters - not

    Energy Technology Data Exchange (ETDEWEB)

    Mokler, B.V. [Los Alamos National Laboratory, NM (United States)

    1995-02-01

    This article is a review of arguments against the use of a single standard for in-place testing of DOE HEPA filters. The author holds that the term 'standard' entails mandatory compliance, and that the variety of DOE HEPA systems requiring in-place testing is such that the guidance for testing must be written in a permissive fashion, allowing options and alternatives. With this in mind, it is not possible to write a single document entailing mandatory compliance for all DOE facilities.

  5. Volatility and leachability of heavy metals and radionuclides in thermally treated HEPA filter media generated from nuclear facilities.

    Science.gov (United States)

    Yoon, In-Ho; Choi, Wang-Kyu; Lee, Suk-Chol; Min, Byung-Youn; Yang, Hee-Chul; Lee, Kune-Woo

    2012-06-15

    The purpose of the present study was to apply thermal treatments to reduce the volume of HEPA filter media and to investigate the volatility and leachability of heavy metals and radionuclides during thermal treatment. HEPA filter media were transformed to glassy bulk material by thermal treatment at 900 °C for 2 h. The most abundant heavy metal in the HEPA filter media was Zn, followed by Sr, Pb and Cr, and the main radionuclide was Cs-137. The volatility tests showed that the heavy metals and radionuclides in radioactive HEPA filter media were not volatilized during the thermal treatment. PCT tests indicated that the leachability of heavy metals and radionuclides was relatively low compared to those of other glasses. XRD results showed that Zn and Cs reacted with the HEPA filter media and were transformed into crystalline willemite (ZnO·SiO₂) and pollucite (Cs₂O·Al₂O₃·4SiO₂), which are not volatile or leachable. The proposed technique for the volume reduction and transformation of radioactive HEPA filter media into glassy bulk material is a simple and energy-efficient procedure without additives that can be performed at relatively low temperature compared with the conventional vitrification process. Copyright © 2012 Elsevier B.V. All rights reserved.

  6. Volatility and leachability of heavy metals and radionuclides in thermally treated HEPA filter media generated from nuclear facilities

    International Nuclear Information System (INIS)

    Yoon, In-Ho; Choi, Wang-Kyu; Lee, Suk-Chol; Min, Byung-Youn; Yang, Hee-Chul; Lee, Kune-Woo

    2012-01-01

    Highlights: ► Thermally treated HEPA filter media were transformed into glassy bulk materials. ► The main radionuclide and heavy metal were Cs-137 and Zn. ► Cs and Zn were transformed into stable forms without volatilization and leaching. ► The proposed technique is a simple and energy-efficient procedure. - Abstract: The purpose of the present study was to apply thermal treatments to reduce the volume of HEPA filter media and to investigate the volatility and leachability of heavy metals and radionuclides during thermal treatment. HEPA filter media were transformed to glassy bulk material by thermal treatment at 900 °C for 2 h. The most abundant heavy metal in the HEPA filter media was Zn, followed by Sr, Pb and Cr, and the main radionuclide was Cs-137. The volatility tests showed that the heavy metals and radionuclides in radioactive HEPA filter media were not volatilized during the thermal treatment. PCT tests indicated that the leachability of heavy metals and radionuclides was relatively low compared with those of other glasses. XRD results showed that Zn and Cs reacted with the HEPA filter media and were transformed into crystalline willemite (ZnO·SiO₂) and pollucite (Cs₂O·Al₂O₃·4SiO₂), which are not volatile or leachable. The proposed technique for the volume reduction and transformation of radioactive HEPA filter media into glassy bulk material is a simple and energy-efficient procedure without additives that can be performed at relatively low temperature compared with the conventional vitrification process.

  7. Studies on Hepa filter test methods

    International Nuclear Information System (INIS)

    Lee, S.H.; Jon, K.S.; Park, W.J.; Ryoo, R.

    1981-01-01

    The purpose of this study is to compare the HEPA filter testing methods adopted in different countries and to design and construct a test duct system with which to establish testing methods. The American D.O.P. test method, the British NaCl test method and several other independently developed methods are compared. The D.O.P. method is considered the most suitable for in-plant and leak tests

  8. Test plan for N2 HEPA filters assembly shop stock used on PFP E4 exhaust system

    International Nuclear Information System (INIS)

    DICK, J.D.

    1999-01-01

    At the Plutonium Finishing Plant (PFP) and Plutonium Reclamation Facility (PRF), self-contained HEPA filters, encased in wooden frames and boxes, are installed in the E4 Exhaust Ventilation System to provide confinement of radioactive releases to the environment and confinement of radioactive contamination within designated zones inside the facility. Recently, during routine testing, in-leakage was discovered downstream of the self-contained HEPA filter boxes. This Test Plan describes the approach to investigating the root causes of the HEPA filter in-leakage

  9. Safety evaluation for packaging (onsite) for the Pacific Northwest National Laboratory HEPA filter box

    International Nuclear Information System (INIS)

    McCoy, J.C.

    1998-01-01

    This safety evaluation for packaging (SEP) evaluates and documents the safe onsite transport of eight high-efficiency particulate air (HEPA) filters in the Pacific Northwest National Laboratory HEPA Filter Box from the 300 Area of the Hanford Site to the Central Waste Complex and on to burial in the 200 West Area. Use of this SEP is authorized for 1 year from the date of release

  10. Performance of HEPA Filter Medium under Accidental Conditions in Nuclear Installations

    International Nuclear Information System (INIS)

    El-Fawal, M.M.

    2011-01-01

    High Efficiency Particulate Air filters (HEPA filters) are the main components in ventilation or confinement systems for the retention of radioactive particles in nuclear installations. During abnormal conditions or accidents (e.g. a fire, criticality in a nuclear fuel cycle facility, or a LOCA in power reactors) the resulting heat, smoke and humidity affect to a large extent the performance of HEPA filters. As part of a research programme aimed at evaluating and improving the performance of HEPA filter media during abnormal conditions, the effect of elevated temperatures up to 400 °C on the resistance of the medium to penetration of water under pressure has been investigated. The test results showed that the resistance of the medium to penetration of water decreases with increasing temperature and thermal exposure time. This could be attributed to burnout of the organic binder used to improve the resistance of the medium to the penetration of water. The results also showed that at 400 °C the resistance of the medium to the penetration of water disappeared. This was confirmed by inspection of the filter medium samples after exposure to high temperature using a scanning electron microscope. The inspection of the medium samples showed that the organic binder in the medium was deformed and finally collapsed at 400 °C. Also, a best-estimate model for the relation of filter medium resistance to water penetration under elevated temperature has been implemented. The results of this study can help in establishing regulatory operating limit conditions (OLCs) for HEPA filter operation under high-temperature conditions in nuclear installations

  11. Performance of HEPA Filter Medium under Accidental Conditions in Nuclear Installations

    International Nuclear Information System (INIS)

    ElFawal, M.M.

    2009-01-01

    High Efficiency Particulate Air filters (HEPA filters) are the main components in ventilation or confinement systems for the retention of radioactive particles in nuclear installations. During abnormal conditions or accidents (e.g. a fire, criticality in a nuclear fuel cycle facility, or a LOCA in power reactors) the resulting heat, smoke and humidity affect to a large extent the performance of HEPA filters. As part of a research programme aimed at evaluating and improving the performance of HEPA filter media during abnormal conditions, the effect of elevated temperatures up to 400 °C on the resistance of the medium to penetration of water under pressure has been investigated. The test results showed that the resistance of the medium to penetration of water decreases with increasing temperature and thermal exposure time. This could be attributed to burnout of the organic binder used to improve the resistance of the medium to the penetration of water. The results also showed that at 400 °C the resistance of the medium to the penetration of water disappeared. This was confirmed by inspection of the filter medium samples after exposure to high temperature using a scanning electron microscope. The inspection of the medium samples showed that the organic binder in the medium was deformed and finally collapsed at 400 °C. Also, a best-estimate model for the relation of filter medium resistance to water penetration under elevated temperature has been implemented. The results of this study can help in establishing regulatory operating limit conditions (OLCs) for HEPA filter operation under high-temperature conditions in nuclear installations.

  12. Multiple HEPA filter test methods, July 1, 1974--March 31, 1975

    International Nuclear Information System (INIS)

    Schuster, B.G.; Osetek, D.J.

    1975-08-01

    A laboratory apparatus has been constructed for testing two HEPA filters in a series configuration. The apparatus consists of an instrumented wind tunnel in which the HEPA filters are mounted, and an auxiliary wind tunnel for obtaining diluted samples of the challenge aerosol upstream of the first filter. Measurements performed with a single-particle aerosol spectrometer demonstrate the capability for measuring overall protection factors of greater than 2.5 × 10⁸. The decay of penetration as a function of time in individual HEPA filters indicates no preferential size discrimination in the range of 0.1 μm to 1.0 μm, nor is there preferential size discrimination of penetration in this same range. A theoretical feasibility study has been performed on the use of an inhomogeneous electric field/induced aerosol electric dipole interaction for potential use as an air cleaning mechanism. Numerical evaluation of a coaxial cylinder geometry indicates that the method is feasible for collection of particles down to 0.1 μm under typical airflow velocity conditions. Small modifications in the geometry may be incorporated to create an instrument capable of measuring particle size. Geometries other than coaxial cylinders are also under investigation
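The overall protection factor quoted above for a tandem arrangement follows from the stage efficiencies: assuming the two filter stages act independently, penetrations multiply, so protection factors (the reciprocal of penetration) multiply as well. A minimal sketch with illustrative numbers (the function names are ours, not the paper's):

```python
def protection_factor(efficiency):
    """Protection factor PF = 1 / penetration = 1 / (1 - efficiency)."""
    return 1.0 / (1.0 - efficiency)

def tandem_protection_factor(*efficiencies):
    """Assuming independent stages, the overall PF is the product of stage PFs."""
    pf = 1.0
    for e in efficiencies:
        pf *= protection_factor(e)
    return pf

# Two HEPA stages, each just meeting the 99.97% DOP rating:
pf = tandem_protection_factor(0.9997, 0.9997)
print(f"{pf:.3g}")  # on the order of 1e7; real stages exceed the rating,
                    # which is how measured systems can reach > 2.5e8
```

This also shows why in-place testing of each stage matters: the product form collapses if one stage leaks.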

  13. A Method for Cobalt and Cesium Leaching from Glass Fiber in HEPA Filter

    International Nuclear Information System (INIS)

    Kim, Gye Nam; Lee, Suk Chol; Yang, Hee Chul; Yoon, In Ho; Choi, Wang Kyu; Moon, Jei Kwon

    2011-01-01

    A great amount of radioactive waste has been generated during the operation of nuclear facilities. Recently, the storage space of the radioactive waste storage facility at the Korea Atomic Energy Research Institute (KAERI) has become almost saturated, so a volume reduction of the stored wastes is now needed. About 2,226 sets of spent HEPA filter wastes are held in the radioactive waste storage facility at KAERI. All these spent filter wastes have been stored in their original form without any treatment. Up to now, compression treatment of these spent HEPA filters has been carried out to repack the compressed filters into 200 liter drums for volume reduction. The frame and separator are contaminated with a low concentration of nuclides, while the glass fiber is contaminated with a high concentration. For disposal of the glass fiber to the environment, it should first be leached to lower its radioactive concentration and then stabilized by solidification or similar means. Therefore, it is necessary to develop a leaching process for the glass fiber in a HEPA filter. Leaching is a separation technology, often used to remove a metal or a nuclide from a solid mixture with the help of a liquid solvent

  14. HEPA Filter Disposal Write-Up 10/19/16

    Energy Technology Data Exchange (ETDEWEB)

    Loll, C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-10-20

    Process knowledge (PK) collection on HEPA filters is handled via the same process as other waste streams at LLNL. The field technician or characterization point of contact creates an information gathering document (IGD) in the IGD database, with input provided by the generator, and submits it for electronic approval. This document is essentially a waste generation profile, detailing the physical, chemical, and radiological characteristics and hazards of a waste stream. It will typically contain a general, but sometimes detailed, description of the work processes that generated the waste. It will contain PK as well as radiological and industrial hygiene analytical swipe results, and any other analytical or supporting knowledge related to characterization. The IGD goes through an electronic approval process to formalize the characterization and to ensure the waste has an appropriate disposal path. The waste generator is responsible for providing initial process knowledge information, and approves the IGD before it is routed to chemical and radiological waste characterization professionals. This is the standard characterization process for LLNL-generated HEPA filters.

  15. Predicting mass loading as a function of pressure difference across prefilter/HEPA filter systems

    International Nuclear Information System (INIS)

    Novick, V.J.; Klassen, J.F.; Monson, P.R.

    1992-01-01

    The purpose of this work is to develop a methodology for predicting the mass loading and pressure drop effects on a prefilter/HEPA filter system. The methodology relies on empirical equations for the specific resistance of the aerosol-loaded filter as a function of particle diameter. These correlations relate the pressure difference across a filter to the mass loading on the filter and account for aerosol particle density effects. These predictions are necessary for the efficient design of new filtration systems and for risk assessment studies of existing filter systems. This work specifically addresses the prefilter/HEPA filter Airborne Activity Confinement Systems (AACS) at the Savannah River Plant. In order to determine the mass loading on the system, it is necessary to establish the efficiency characteristics of the prefilter, the mass loading characteristics of the prefilter measured as a function of pressure difference across the prefilter, and the mass loading characteristics of the HEPA filter as a function of pressure difference across the filter. Furthermore, the efficiency and mass loading characteristics need to be determined as a function of the aerosol particle diameter. A review of the literature revealed that no previous work had been performed to characterize the prefilter material of interest. In order to complete the foundation of information necessary to predict total mass loadings on prefilter/HEPA filter systems, it was necessary to determine the prefilter efficiency and mass loading characteristics. The measured prefilter characteristics combined with the previously determined HEPA filter characteristics allowed the resulting pressure difference across both filters to be predicted as a function of total particle mass for a given particle distribution. These predictions compare favorably to experimental measurements (±25%)
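The kind of correlation described above is commonly written as a linear cake-filtration model, ΔP = ΔP₀ + K₂·W·U, where K₂ is the specific resistance of the deposited dust (particle-diameter and density dependent), W the areal mass loading, and U the face velocity. The sketch below is an illustration of this generic model with made-up coefficient values, not the paper's fitted correlations:

```python
def pressure_drop(dp0_pa, k2, mass_per_area, velocity):
    """Linear cake-filtration model: dP = dP0 + K2 * W * U.

    dp0_pa:        clean-filter pressure drop [Pa]
    k2:            specific resistance of the dust cake [Pa*s*m/kg],
                   an empirical, particle-size-dependent coefficient
    mass_per_area: areal mass loading W [kg/m^2]
    velocity:      filtration face velocity U [m/s]
    """
    return dp0_pa + k2 * mass_per_area * velocity

# Illustrative numbers only: invert the model to find the mass loading
# at which a hypothetical 1 kPa terminal pressure drop is reached.
dp0, k2, u = 250.0, 1.0e5, 0.025
limit = 1000.0
w_max = (limit - dp0) / (k2 * u)
print(round(w_max, 3), "kg/m^2")
```

Inverting the correlation this way is how such models support filter-life (dust holding capacity) estimates for a given terminal pressure drop.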

  16. Extension of the maintenance cycle of HEPA filters by optimization of the technical characteristics of filters and their construction

    International Nuclear Information System (INIS)

    Bella, H.; Stiehl, H.H.; Sinhuber, D.

    1977-01-01

    Knowledge of the parameters of the HEPA filters used at present in nuclear plants allows such filters to be optimized with respect to flow rate, pressure drop and service life. The application of these optimization methods to new types of HEPA filters with improved performance is reported. The calculated results were checked experimentally. The use of HEPA filters optimized with respect to dust capacity and service life, and the effects of this new type of filter on the reduction of operating and maintenance costs, are discussed

  17. Efficiency and mass loading characteristics of a typical HEPA filter media material

    International Nuclear Information System (INIS)

    Novick, V.J.; Higgins, P.J.; Dierkschiede, B.; Abrahamson, C.; Richardson, W.B.; Monson, P.R.; Ellison, P.G.

    1991-01-01

    The particle removal efficiency of the high-efficiency particulate air (HEPA) filter material used at the Savannah River Site was measured as a function of monodisperse particle diameter at two gas filtration velocities. The results indicate that the material meets or exceeds the minimum specified efficiency of 99.97% for all particle diameters at both the normal and minimum operating flow conditions encountered at the Savannah River Site. The pressure drop across the HEPA filter material used at the Savannah River Site was measured as a function of particle mass loading for various aerosol size distributions. The pressure drop was found to increase linearly with the particle mass loaded onto the filters, as long as the particles were completely dry. The slope of the curve was found to be dependent on the particle diameter and velocity of the aerosol. The linear behavior between the initial pressure drop (clean filter) and the final pressure drop (loaded filter) implies that the filtration mechanism is dominated by the particle cake that rapidly forms on the front surface of the HEPA filter. This behavior is consistent with the high filtration efficiency of the material

  18. Applied patent RFID systems for building reacting HEPA air ventilation system in hospital operation rooms.

    Science.gov (United States)

    Lin, Jesun; Pai, Jar-Yuan; Chen, Chih-Cheng

    2012-12-01

    RFID technology, an automatic identification and data capture technology providing identification, tracing, security and so on, has been widely applied in the healthcare industry in recent years. Employing a HEPA ventilation system in a hospital is a way to ensure healthful indoor air quality and to protect patients and healthcare workers against hospital-acquired infections. However, the system consumes a great deal of electricity at considerable cost. This study aims to apply RFID technology to provide unique identification of medical staff and patients, together with a reacting HEPA air ventilation system, in order to reduce cost, save energy and prevent the prevalence of hospital-acquired infection. The reacting HEPA air ventilation system contains RFID tags (for medical staff and patients), sensors, and a reacting system which receives information regarding the number of medical staff and the status of the surgery, and controls the air volume of the HEPA air ventilation system accordingly. A pilot program was carried out in a unit of operation rooms of a medical center with 1,500 beds located in central Taiwan from Jan to Aug 2010. The results showed that the air ventilation system was able to function much more efficiently with less energy consumed, while the indoor air quality remained within specification and hospital-acquired infections and other occupational diseases could still be prevented.

  19. Invasive aspergillosis in severely neutropenic patients over 18 years: impact of intranasal amphotericin B and HEPA filtration.

    Science.gov (United States)

    Withington, S; Chambers, S T; Beard, M E; Inder, A; Allen, J R; Ikram, R B; Schousboe, M I; Heaton, D C; Spearing, R I; Hart, D N

    1998-01-01

    The impact of intranasal amphotericin B and high-efficiency particulate air (HEPA) filtration on the incidence of invasive aspergillosis was reviewed in patients from 1977 to 1994 undergoing intensive chemotherapy. Overall, the incidence of proven invasive aspergillosis was reduced from 24.4% (1977-1984) to 7.1% (1985-1991) (P < 0.001) following the introduction of intranasal prophylaxis, but when probable cases of aspergillosis were included and lymphoma cases excluded, there was no change in incidence. Following the introduction of HEPA filtration, patient exposure to aspergillus spores as measured by air sampling was markedly reduced and there were no new cases of invasive aspergillosis. HEPA filtration proved effective in reducing invasive aspergillosis and has allowed increasingly aggressive treatment regimens to be introduced.

  20. Evaluation of the effect of media velocity on filter efficiency and most penetrating particle size of nuclear grade high-efficiency particulate air filters.

    Science.gov (United States)

    Alderman, Steven L; Parsons, Michael S; Hogancamp, Kristina U; Waggoner, Charles A

    2008-11-01

    High-efficiency particulate air (HEPA) filters are widely used to control particulate matter emissions from processes that involve management or treatment of radioactive materials. Section FC of the American Society of Mechanical Engineers AG-1 Code on Nuclear Air and Gas Treatment currently restricts media velocity to a maximum of 2.5 cm/sec in any application where this standard is invoked. There is some desire to eliminate or increase this media velocity limit. A concern is that increasing media velocity will result in higher emissions of ultrafine particles; thus, it is unlikely that higher media velocities will be allowed without data to demonstrate the effect of media velocity on removal of ultrafine particles. In this study, the performance of nuclear grade HEPA filters, with respect to filter efficiency and most penetrating particle size, was evaluated as a function of media velocity. Deep-pleat nuclear grade HEPA filters (31 cm × 31 cm × 29 cm) were evaluated at media velocities ranging from 2.0 to 4.5 cm/sec using a potassium chloride aerosol challenge having a particle size distribution centered near the HEPA filter most penetrating particle size. Filters were challenged under two distinct mass loading rate regimes through the use or exclusion of a 3 μm aerodynamic diameter cut point cyclone. Filter efficiency and most penetrating particle size measurements were made throughout the duration of filter testing. Filter efficiency measured at the onset of aerosol challenge was noted to decrease with increasing media velocity, with values ranging from 99.999 to 99.977%. The filter most penetrating particle size recorded at the onset of testing was noted to decrease slightly as media velocity was increased and was typically in the range of 110-130 nm. Although additional testing is needed, these findings indicate that filters operating at media velocities up to 4.5 cm/sec will meet or exceed current filter efficiency requirements. Additionally
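The efficiency figures quoted above are easier to interpret as penetration (1 − efficiency), which makes the media-velocity effect far more visible than the efficiency numbers themselves. A small sketch using only the two endpoint values from the abstract:

```python
def penetration(efficiency_pct):
    """Penetration fraction corresponding to a percent efficiency."""
    return 1.0 - efficiency_pct / 100.0

# Endpoint efficiencies quoted for the 2.0 and 4.5 cm/sec media velocities:
p_low = penetration(99.999)   # ~1e-5 at the lowest velocity tested
p_high = penetration(99.977)  # ~2.3e-4 at the highest velocity tested
print(round(p_high / p_low))  # penetration grows ~23x across the range,
                              # yet efficiency still exceeds the 99.97% rating
```

This is why the abstract can report both a large velocity effect and continued compliance with the efficiency requirement: a 23-fold rise in penetration still sits below the 0.03% penetration ceiling.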

  1. Review of Department of Energy HEPA filter test activities, FY 1990--FY 1992

    International Nuclear Information System (INIS)

    McIntyre, J.A.

    1992-01-01

    Filter Test Facilities (FTFs) and the FTF Technical Support Group (TSG) continue to provide services to the Department of Energy (DOE). Additional tasks relating to the HEPA filter cycle have been added to the TSG. The tasks include the quality assessment review for the in-place testing of HEPA filters at DOE sites and the formation of an in-place testing standards writing group. A summary of ongoing FTF and TSG activities for FY 1990-FY 1992 is presented, including the technical input for implementation of the High Flow Alternative Test System (HFATS), updates of the DOE standards, and the status of the quality assessment review and the in-place testing standards writing group

  2. HEPA filter testing - Department of Energy Office of Nuclear Energy Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Sherwood, G.L. Jr. [Department of Energy, Washington, DC (United States)

    1995-02-01

    This paper provides the background of, and some results from, a review of HEPA filter testing during 1993 at selected Department of Energy (DOE) facilities. Recommendations for improvements in standards resulting from the review are also presented.

  3. Development of acid-resistant HEPA filter components

    International Nuclear Information System (INIS)

    Terada, K.; Woodard, R.W.; Buttedahl, O.I.

    1981-01-01

    Laboratory and in-service tests of various HEPA filter media and separators were conducted to establish their relative resistances to HNO₃-HF vapors. Filter medium of glass fiber with Nomex additive and aluminum separators with an epoxy-vinyl coating have performed quite well in the acid environment in the laboratory, and in prototype filters placed in service in a plenum at Rocky Flats. Proprietary filters with new designs and/or components were also tested in service, with generally good results

  4. Particle size for greatest penetration of HEPA filters - and their true efficiency

    International Nuclear Information System (INIS)

    da Roza, R.A.

    1982-01-01

    The particle size that penetrates a filter most readily is a function of filter media construction, aerosol density, and air velocity. In this paper the published results of several experiments are compared with a modern filtration theory that predicts single-fiber efficiency and the particle size of maximum penetration. For high-efficiency particulate air (HEPA) filters used under design conditions this size is calculated to be 0.21 μm diameter, in good agreement with the experimental data. The penetration at 0.21 μm is calculated to be seven times greater than at the 0.3 μm used for testing HEPA filters. Several mechanisms by which filters may have a lower efficiency in use than when tested are discussed
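The practical consequence of the 7× figure above can be seen with a short calculation: a filter that just meets the 99.97% rating at the 0.3 μm test size would, under the paper's estimate, pass seven times as much aerosol at the 0.21 μm most penetrating size. Illustrative arithmetic only:

```python
# Penetration at the 0.3 um DOP test size for a filter at the 99.97% rating:
p_test = 1.0 - 0.9997          # 3e-4
# Penetration at the 0.21 um most penetrating size (7x greater, per the paper):
p_mpps = 7.0 * p_test          # 2.1e-3
eff_mpps_pct = 100.0 * (1.0 - p_mpps)
print(f"{eff_mpps_pct:.2f}%")  # 99.79% worst-case ("true") efficiency
```

So the "true" efficiency at the most penetrating size is about 99.79%, not 99.97% — the gap the title alludes to.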

  5. Alternative strategies to reduce cost and waste volume in HEPA filtration using metallic filter media - 59348

    International Nuclear Information System (INIS)

    Chadwick, Chris

    2012-01-01

    Document available in abstract form only. Full text of publication follows: The disposal costs of contaminated HEPA and THE filter elements have proved to be disproportionately high compared with the cost of the elements themselves. Work published elsewhere (Moore et al., 1992; Bergman et al., 1997) suggests that the cost to the DOE of using traditional, panel-type, glass fibre HEPA filtration trains was, during that period, $29.5 million, based on a five-year life cycle including installation, testing, removal and disposal, with life cycle costs based on estimates dating from 1987-1990. Within that cost estimate, $300 was the value given to the filter and $4,450 was given to the peripheral activity. Clearly, if the $4,450 component could be reduced, tremendous savings could ensue, in addition to a reduction of the legacy burden of waste volume. This issue exists for operators in both the US and in Europe. The costs and monitoring necessary in storage would also be reduced if HEPA filters could be cleaned to a condition where they could either be re-used or decontaminated to the extent that they could be stored as a lower-cost waste form, or if HEPA/THE filter elements were available without any organic content likely to give rise to flammable or explosive decomposition gases during long-term storage. (author)
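The cost argument above is easy to quantify with the figures quoted: a lifecycle cost of $4,750 per filter, of which only $300 is the element itself. The sketch below is a hypothetical illustration (the function and the 50% reduction figure are ours) of the saving if cleanable metallic media cut the peripheral component by some fraction:

```python
FILTER_COST = 300.0       # element cost per the cited estimate (USD)
PERIPHERAL_COST = 4450.0  # installation/testing/removal/disposal (USD)

def lifecycle_saving(n_filters, peripheral_reduction):
    """Saving if the peripheral cost component is cut by a given fraction.

    peripheral_reduction: e.g. 0.5 means half the peripheral cost is avoided.
    """
    return n_filters * PERIPHERAL_COST * peripheral_reduction

# Illustrative: a fleet whose five-year cost matches the cited $29.5M total,
# then the saving if the peripheral component were halved.
n = 29.5e6 / (FILTER_COST + PERIPHERAL_COST)
print(round(n), "filters;", round(lifecycle_saving(n, 0.5)), "USD saved")
```

Even a partial reduction of the peripheral component dwarfs the element cost, which is the core of the abstract's economic case.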

  6. Improved HEPA Filter Technology for Flexible and Rigid Containment Barriers

    International Nuclear Information System (INIS)

    Pinson, Paul Arthur

    1998-01-01

    Safety and reliability in glovebox operations can be significantly improved and waste packaging efficiencies can be increased by inserting flexible, lightweight, high capacity HEPA filters into the walls of plastic sheet barriers. This HEPA filter/barrier technology can be adapted to a wide variety of applications: disposable waste bags, protective environmental barriers for electronic equipment, single or multiple use glovebag assemblies, flexible glovebox wall elements, and room partitions. These reliable and inexpensive filtered barriers have many uses in fields such as radioactive waste processing, HVAC filter changeout, vapor or grit blasting, asbestos cleanup, pharmaceutical, medical, biological, and electronic equipment containment. The applications can result in significant cost savings, improved operational reliability and safety, and total waste volume reduction. This technology was developed at the Argonne National Laboratory-West (ANL-W) in 1993 and has been used at ANL-W since then at the TRU Waste Characterization Chamber Gloveboxes. Another 1998 AGS Conference paper, titled "TRU Waste Characterization Gloveboxes" and presented by Mr. David Duncan of ANL-W, describes these boxes

  7. Preliminary studies to determine the shelf life of HEPA filters

    Energy Technology Data Exchange (ETDEWEB)

    Gilbert, H.; Fretthold, J.K.; Rainer, F. [Lawrence Livermore National Laboratory, CA (United States)] [and others]

    1995-02-01

    We have completed a preliminary study using filter media tests and filter qualification tests to investigate the effect of shelf life on HEPA filter performance. Our media studies showed that tensile strength decreased with age, but the data were not sufficient to establish a shelf life. Thermogravimetric analyses demonstrated that one manufacturer had media with low tensile strength due to insufficient binder. The filter qualification tests (heated air and overpressure) conducted on filters of different ages showed that filter age is not the primary factor affecting filter performance; materials and construction design have a greater effect. An unexpected finding of our study was that sub-standard HEPA filters have been installed in DOE facilities despite existing regulations and filter qualification tests. We found that the filter with low tensile strength failed the overpressure test. The same filter had passed the heated air test, but that test left it so structurally weak that it was prone to blow-out. We recommend that DOE initiate a filter qualification program to prevent this occurrence.

  8. Recleaning of HEPA filters by reverse flow - evaluation of the underlying processes and the cleaning technique

    International Nuclear Information System (INIS)

    Leibold, H.; Leiber, T.; Doeffert, I.; Wilhelm, J.G.

    1993-08-01

    HEPA filter operation at high concentrations of fine dusts requires periodic recleaning of the filter units in their service locations. Because of the low mechanical stress induced during the recleaning process, regeneration via low-pressure reverse flow is a very suitable technique. Recleanability of HEPA filters had been attained for particle diameters >0.4 μm at air velocities up to 1 m/s, but filter clogging occurred in the case of smaller particles, for which the recleaning forces are too weak

  9. Efficient transfection of Xenobiotic Responsive Element-biosensor plasmid using diether lipid and phosphatidylcholine liposomes in differentiated HepaRG cells.

    Science.gov (United States)

    Demazeau, Maxime; Quesnot, Nicolas; Ripoche, Nicolas; Rauch, Claudine; Jeftić, Jelena; Morel, Fabrice; Gauffre, Fabienne; Benvegnu, Thierry; Loyer, Pascal

    2017-05-30

    In this study, we evaluated cationic liposomes prepared from diether-NH₂ and egg phosphatidylcholine (EPC) for in vitro gene delivery. The impact of the lipid composition, i.e. the EPC and diether-NH₂ molar ratio, on in vitro transfection efficiency and cytotoxicity was investigated using the human HEK293T and hepatoma HepaRG cells, known to be permissive and poorly permissive cells for liposome-mediated gene transfer, respectively. Here, we report that EPC/diether-NH₂-based liposomes enabled a very efficient transfection with low cytotoxicity compared to commercial transfection reagents in both HEK293T and proliferating progenitor HepaRG cells. Taking advantage of these non-toxic EPC/diether-NH₂-based liposomes, we developed a method to efficiently transfect differentiated hepatocyte-like HepaRG cells with a biosensor plasmid containing a Xenobiotic Responsive Element and a minimal promoter driving the transcription of the luciferase reporter gene. We demonstrated that the luciferase activity was induced by a canonical inducer of cytochrome P450 genes, benzo[a]pyrene, and two environmental contaminants, fluoranthene, a polycyclic aromatic hydrocarbon, and endosulfan, an organochlorine insecticide, known to induce toxicity and genotoxicity in differentiated HepaRG cells. In conclusion, we established a new efficient lipofection-mediated gene transfer in hepatocyte-like HepaRG cells, opening new perspectives in drug evaluation relying on xenobiotic-inducible biosensor plasmids. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Multiple HEPA filter test methods, January--December 1976

    International Nuclear Information System (INIS)

    Schuster, B.; Kyle, T.; Osetek, D.

    1977-06-01

    The testing of tandem high-efficiency particulate air (HEPA) filter systems is of prime importance for the measurement of accurate overall system protection factors. A procedure, based on the use of an intra-cavity laser particle spectrometer, has been developed for measuring protection factors in the 10⁸ range. A laboratory scale model of a filter system was constructed and initially tested to determine individual HEPA filter characteristics with regard to size and state (liquid or solid) of several test aerosols. Based on these laboratory measurements, in-situ testing has been successfully conducted on a number of single and tandem filter installations within the Los Alamos Scientific Laboratory as well as on extraordinarily large single systems at Rocky Flats. For the purposes of recovery, simplified solid waste disposal, or prefiltering, two versions of an inhomogeneous electric field air cleaner have been devised and are undergoing testing. Initial experience with one of the systems, which relies on an electrostatic spraying phenomenon, indicates performance efficiency of greater than 99.9% for flow velocities commonly used in air cleaning systems. Among the effluents associated with nuclear fuel reprocessing is ¹²⁹I. An intra-cavity laser detection system is under development which shows promise of being able to detect mixing ratios of one part in 10⁷ of I₂ in air

  11. Viral Penetration of High Efficiency Particulate Air (HEPA) Filters

    Science.gov (United States)

    2007-02-01

    PVC tubing (Excelon® RNT, US Plastics, Lima, Ohio). Each path runs through a test article and thence through one AGI-30 all-glass impinger (Chemglass...a mechanical flow meter (Blue–White 400, Huntington Beach, California, or PMR1-101346, Cole–Parmer, Vernon Hills, Illinois). At the end of the...fibrous Filters." Air Pollution Control Association 30(4): 377-381. Leenders, G. J. M. and J. H. Stadhouders (1980s). "Effectiveness of HEPA

  12. Three-dimensional HepaRG model as an attractive tool for toxicity testing.

    Science.gov (United States)

    Leite, Sofia B; Wilk-Zasadna, Iwona; Zaldivar, Jose M; Airola, Elodie; Reis-Fernandes, Marcos A; Mennecozzi, Milena; Guguen-Guillouzo, Christiane; Chesne, Christopher; Guillou, Claude; Alves, Paula M; Coecke, Sandra

    2012-11-01

    The culture of HepaRG cells as three-dimensional (3D) structures in a spinner-bioreactor may represent added value as a hepatic system for toxicological purposes. The use of a cost-effective commercially available bioreactor, which is compatible with high-throughput cell analysis, constitutes an attractive approach for routine use in the drug testing industry. In order to assess specific aspects of the biotransformation capacity of the bioreactor-based HepaRG system, the induction of CYP450 enzymes (i.e., CYP1A2, 2B6, 2C9, and 3A4) and the activity of the phase II enzyme, uridine diphosphate glucuronosyltransferase (UGT), were tested. The long-term functionality of the system was demonstrated by 7-week stable profiles of albumin secretion, CYP3A4 induction, and UGT activities. Immunofluorescence-based staining showed formation of tissue-like arrangements including bile canaliculi-like structures and polar distribution of transporters. The use of in silico models to analyze the in vitro data related to hepatotoxic activity of acetaminophen (APAP) demonstrated the advantage of the integration of kinetic and dynamic aspects for a better understanding of the in vitro cell behavior. The bioactivation of APAP and its related cytotoxicity was assessed in a system compatible with high-throughput screening. The approach also proved to be a good strategy to reduce the time necessary to obtain fully differentiated cell cultures. In conclusion, HepaRG cells cultured in 3D spinner-bioreactors are an attractive tool for toxicological studies, showing a liver-like performance and demonstrating a practical applicability for toxicodynamic approaches.

  13. Institute for Clean Energy Technology Mississippi State University NSR&D Aged HEPA Filter Study Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Jacks, Robert [Mississippi State Univ., Mississippi State, MS (United States); Stormo, Julie [Mississippi State Univ., Mississippi State, MS (United States); Rose, Coralie [Mississippi State Univ., Mississippi State, MS (United States); Rickert, Jaime [Mississippi State Univ., Mississippi State, MS (United States); Waggoner, Charles A. [Mississippi State Univ., Mississippi State, MS (United States)

    2017-03-22

    Data have demonstrated that filter media lose tensile strength and the ability to resist the effects of moisture as a function of age. Testing of new and aged filters needs to be conducted to correlate reduction of physical strength of HEPA media to the ability of filters to withstand upset conditions. Appendix C of the Nuclear Air Cleaning Handbook provides the basis for DOE’s HEPA filter service life guidance. However, this appendix also points out the variability of data, and it does not correlate performance of aged filters to degradation of media due to age. Funding awarded by NSR&D to initiate full-scale testing of aged HEPA filters addresses the issue of correlating media degradation due to age with testing of new and aged HEPA filters under a generic design basis event set of conditions. This funding has accelerated the process of describing this study via: (1) establishment of a Technical Working Group of all stakeholders, (2) development and approval of a test plan, (3) development of testing and autopsy procedures, (4) acquiring an initial set of aged filters, (5) testing the initial set of aged filters, and (6) developing the filter test report content for each filter tested. This funding was very timely and has moved the project forward by at least three years. Activities have been correlated with testing conducted under DOE-EM funding for evaluating performance envelopes for AG-1 Section FC Separator and Separatorless filters. This coordination allows correlation of results from the NSR&D Aged Filter Study with results from testing new filters of the Separator and Separatorless Filter Study. DOE-EM efforts have identified approximately 100 more filters of various ages that have been stored under Level B conditions. NSR&D funded work allows a time for rigorous review among subject matter experts before moving forward with development of the testing matrix that will be used for additional filters. The NSR&D data sets are extremely valuable in as much

  14. Optimal Performance Simulation of a Metal Fiber Filter for Capturing Radioactive Aerosols

    International Nuclear Information System (INIS)

    Lee, Seung Uk; Lee, Chan Hyun; Park, Min Chan; Lee, Jaek Eun

    2016-01-01

    In this study, the metal fiber filter used for removing radioactive aerosols is systematically analyzed in order to determine the optimal design applicable to actual operating conditions in nuclear heating, ventilation and air conditioning (HVAC) systems for particle collection. To derive the optimal design for a metal fiber HEPA filter, a numerical model is developed and its results are compared to experimental data to verify its reliability. Moreover, a sensitivity analysis is performed on the important parameters to determine which have the largest influence on filter performance. Using the model developed in this study, optimal design parameters for pleated metal fiber filters are derived: fiber diameter less than 4 μm, solidity larger than 0.2, filter thickness larger than 1 mm, and face velocity lower than 5 cm/s. Under these conditions, the metal filter meets the HEPA filter standard of 99.97% efficiency at the 0.3 μm particle size.
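
The kind of modeling described above can be illustrated with the classical log-penetration expression for a fibrous filter, P = exp(-4αEZ / (π d_f (1 − α))), where α is the solidity, Z the medium thickness, d_f the fiber diameter, and E a composite single-fiber efficiency. A hedged Python sketch, using an assumed value of E that is not taken from the paper:

```python
import math

def filter_efficiency(fiber_diameter, solidity, thickness, single_fiber_eff):
    """Overall efficiency of a fibrous filter medium from the classical
    log-penetration expression P = exp(-4*a*E*Z / (pi*d_f*(1-a))).

    All lengths in meters; single_fiber_eff is dimensionless.
    """
    penetration = math.exp(
        -4.0 * solidity * single_fiber_eff * thickness
        / (math.pi * fiber_diameter * (1.0 - solidity))
    )
    return 1.0 - penetration

# Parameters near the design window quoted above: 4 um fibers,
# solidity 0.2, 1 mm thick medium. E = 0.05 is an illustrative
# assumption for the composite single-fiber efficiency.
eff = filter_efficiency(4e-6, 0.2, 1e-3, 0.05)
print(f"overall efficiency: {eff:.4f}")
```

The exponential form shows why the abstract's thresholds matter: efficiency rises steeply with solidity and thickness and falls with fiber diameter, so even a modest single-fiber efficiency drives overall penetration toward the sub-percent levels required of HEPA media.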

  15. First Study Of HEPA Filter Prototype Performance To Control The Airborne Pollution

    International Nuclear Information System (INIS)

    Soetomo; Suwarno

    2000-01-01

    This paper reports the efficiency test results for a prototype High Efficiency Particulate Air (HEPA) filter for low temperatures, intended to control airborne pollution by solid and liquid aerosol particles. The prototype design was based on the characteristic data of the filter material (fiber diameter, density, filter thickness), the air flow rate, and the initial pressure drop. In laboratory-scale tests using DOP/PSL aerosol of 0.3 μm diameter at a flow rate of 3.78 m³/min, filtration efficiencies between 89.90 and 99.94% were obtained for filter prototypes A, B, C, and D. The theoretical efficiency estimated with the filtration program differed from the experimental values by about 1%. The efficiency of prototype D was close to that of the AAF-USA filter, at a price about 30% lower.

  16. In-situ continuous scanning high efficiency particulate air (HEPA) filter monitoring system

    International Nuclear Information System (INIS)

    Kirchner, K.N.; Johnson, C.M.; Lucerna, J.J.; Barnett, R.L.

    1985-01-01

    The testing and replacement of HEPA filters, which are widely used in the nuclear industry to purify process air before it is ventilated to the atmosphere, is a costly and labor-intensive undertaking. Current methods of testing filter performance, such as differential pressure measurement and scanning air monitoring, allow for determination of overall filter performance but preclude detection of symptoms of incipient filter failure, such as small holes in the filters themselves. Using current technology, a continual in-situ monitoring system has been designed which provides three major improvements over current methods of filter testing and replacement. This system (1) realizes a cost savings by reducing the number of intact filters which are currently being replaced unnecessarily, (2) provides a more accurate and quantitative measurement of filter performance than is currently achieved with existing testing methods, and (3) reduces personnel exposure to a radioactive environment by automatically performing most testing operations. The operation and performance of the HEPA filter monitoring system are discussed

  17. Behavior of the polygonal HEPA filter exposed to water droplets carried by the offgas flow

    International Nuclear Information System (INIS)

    Jannakos, K.; Potgeter, G.; Legner, W.

    1991-01-01

    A polygonal high-efficiency particulate air (HEPA) filter element has been developed and tested with a view to cleaning the dissolver offgas from reprocessing plants. It is likewise suited to filtering process offgases generated in other plants. Due to its high dew point (about 30°C), the dissolver offgas, before being directed into the HEPA filter, is heated with a gas heater to approximately 100°C so that condensation in the pipework upstream of the filter and in the filter proper is avoided. In case of failure of the heater, the offgas may undergo condensation upstream of the HEPA filter until it is bypassed to a standby heater or a standby filter system. Consequently, the filter may be loaded with water droplets. Therefore, experiments have been performed to estimate the behavior of the polygonal filter element when exposed to condensate droplets in a real plant. According to the experiments performed so far, it can be anticipated that in case of failure of the heater, the amount of condensate produced until bypassing to a standby system will not damage a new or lightly loaded polygonal filter element. The experiments will be continued with the goal of investigating the behavior of a heavily loaded polygonal filter element exposed to water droplets

  18. Penetration of HEPA filters by alpha recoil aerosols

    International Nuclear Information System (INIS)

    McDowell, W.J.; Seeley, F.G.; Ryan, M.T.

    1976-01-01

    The self-scattering of alpha-active substances has long been recognized and is attributed to expulsion of aggregates of atoms from the surface of alpha-active materials by alpha emission recoil energy, and perhaps to further propulsion of these aggregates by subsequent alpha recoils. Workers at the University of Lowell recently predicted that this phenomenon might affect the retention of alpha-active particulate matter by HEPA filters, and found support in experiments with 212Pb. Tests at Oak Ridge National Laboratory have confirmed that alpha-emitting particulate matter does penetrate high-efficiency filter media, such as that used in HEPA filters, much more effectively than do non-radioactive or beta-gamma active aerosols. Filter retention efficiencies drastically lower than the 99.9 percent quoted for ordinary particulate matter were observed with 212Pb, 253Es, and 238Pu sources, indicating that the phenomenon is common to all of these and probably to all alpha-emitting materials of appropriate half-life. Results with controlled air-flow through filters in series are consistent with the picture of small particles dislodged from the "massive" surface of an alpha-active material, and then repeatedly dislodged from positions on the filter fibers by subsequent alpha recoils. The process shows only a small dependence on the physical form of the source material. Oxide dust, nitrate salt, and plated metal all seem to generate the recoil particles effectively. The amount penetrating a series of filters depends on the total amount of activity in the source material, its specific activity, and the length of time of air flow

  19. Investigations into the penetration and pressure drop of HEPA filter media during loading with submicron particle aerosols at high concentrations

    International Nuclear Information System (INIS)

    Leibold, H; Wilhelm, J.G.

    1991-01-01

    High Efficiency Particulate Air (HEPA) filters are typically employed in particle removal and retention within the air cleaning systems of clean rooms in the pharmaceutical, nuclear and semiconductor industries for dust concentrations of some μg/m³. Their extremely high removal efficiencies for submicron particles make them attractive candidates in complying with increasingly lower emission limits for industrial processes that involve dust concentrations of up to several g/m³. Cost-effective operation under such conditions requires the filter units to be recleanable. The recleanability of HEPA filter media depends not only on the operating conditions during the cleaning process but also on the filtration conditions during particle loading. The structure and location of the particles captured by the glass fiber matrix greatly affect the degree to which they can be subsequently dislodged and removed from the filter medium. Changes in filtration efficiency with service time for various particle diameters in the critical submicron size range, as well as the effects of filtration velocity on the increase in pressure drop, are important criteria with regard to recleaning HEPA filter units. Of special significance for the recleanability of HEPA filter media is knowledge of how operating conditions affect dust cake formation. (author)

  20. Behavior of HEPA filters under high humidity airflows

    International Nuclear Information System (INIS)

    Ricketts, C.I.

    1992-10-01

    To help determine and improve the safety margins of High Efficiency Particulate Air (HEPA) filter units in nuclear facilities under possible accident conditions, the structural limits and failure mechanisms of filters in high-humidity airflows were established and the fundamental physical phenomena underlying filter failure or malfunction in humid air were identified. Empirical models for increases in filter pressure drop with time in terms of the relevant airstream parameters were also developed. The weaknesses of currently employed humidity countermeasures used in filter protection are discussed, and fundamental explanations for reported filter failures in normal service are given. (orig./DG)

  1. HEPA-filter smoke plugging problem

    International Nuclear Information System (INIS)

    Gaskill, J.R.; Magee, M.W.

    1975-01-01

    Actual experiences indicate that during the early stages of a fire, pyrolysis and incomplete combustion of organic materials used in the furnishings or interior finishes of laboratories yield copious quantities of smoke particulates, both liquid and solid. Furthermore, the use of fire retardants in materials used for the above purpose interferes with the combustion process, so that burning of such materials in later stages of a fire will yield dense smoke. These particulates can plug up a HEPA filter or even a more porous prefilter, and thus effectively shut off the exhaust ventilation. In this case, the fire room will pressurize and contamination may spread in an uncontrolled manner. Both small- and large-scale tests have been conducted to evaluate the nature and degree of the problem as a function of materials involved, rate of exposure to the fire, and kinds and temperatures of smoke so generated. Some test work has also been done on scrubbing of smoke. Proposed future work is described. (U.S.)

  2. Evaluating the Efficiency of Hepatoprotector Hepa Veda in Patients with Liver Pathology

    Directory of Open Access Journals (Sweden)

    Yu.M. Stepanov

    2015-04-01

    The article presents the results on the efficiency of monotherapy with the hepatoprotector Hepa Veda in patients with liver pathology. A significant decrease in aminotransferase levels was found in patients with non-alcoholic steatohepatitis, and a tendency to decrease in patients with chronic viral hepatitis C, demonstrating the efficiency of this hepatoprotector.

  3. Proposed retrofit of HEPA filter plenums with injection and sampling manifolds for in-place filter testing

    Energy Technology Data Exchange (ETDEWEB)

    Fretthold, J.K. [EG&G Rocky Flats, Inc., Golden, CO (United States)]

    1995-02-01

    Testing HEPA filter exhaust plenums with consideration for As Low As Reasonably Achievable (ALARA) principles will require that new technology be applied to existing plenum designs. HEPA filter testing at Rocky Flats has evolved slowly for a number of reasons. The first plenums were built in the 1950s, preceding many standards. The plenums were large, which caused air dispersal problems. The systems had variable air flow. Access to the filters was difficult. The test methods became extremely conservative, and changes in methods were difficult to make. The acceptance of new test methods has come in recent years with the change in plant mission and the emphasis on worker safety.

  4. Submicron and Nanoparticulate Matter Removal by HEPA-Rated Media Filters and Packed Beds of Granular Materials

    Science.gov (United States)

    Perry, J. L.; Agui, J. H.; Vijayakimar, R

    2016-01-01

    Contaminants generated aboard crewed spacecraft by diverse sources consist of both gaseous chemical contaminants and particulate matter. HEPA media filters and packed beds of granular material such as activated carbon, both commonly employed for cabin atmosphere purification, have efficacy for removing nanoparticulate contaminants from the cabin atmosphere. The phenomena associated with particulate matter removal by HEPA media filters and packed beds of granular material are reviewed relative to their efficacy for removing fine (less than 2.5 micrometers) and ultrafine (less than 0.01 micrometers) particulate matter. Considerations are discussed for using these methods in an appropriate configuration to provide the most effective performance across a broad range of particle sizes, including nanoparticulates.

  5. Preliminary studies to determine the shelf life of HEPA filters. Revision 1

    International Nuclear Information System (INIS)

    Gilbert, H.; Fretthold, J.K.; Rainer, F.; Bergman, W.; Beason, D.

    1995-02-01

    We have completed a preliminary study using filter media tests and filter qualification tests to investigate the effect of shelf life on HEPA filter performance. Our media studies showed that the tensile strength decreased with age, but the data were not sufficient to establish a shelf life. Thermogravimetric analyses demonstrated that one manufacturer had media with low tensile strength due to insufficient binder. The filter qualification tests (heated air and overpressure) conducted on filters of different ages showed that filter age is not the primary factor affecting filter performance; materials and the construction design have a greater effect. An unexpected finding of our study was that sub-standard HEPA filters have been installed in DOE facilities despite existing regulations and filter qualification tests. We found that the filter with low tensile strength failed the overpressure test. The same filter had passed the heated air test, but that test left it so structurally weak that it was prone to blow-out. We recommend that DOE initiate a filter qualification program to prevent this occurrence

  6. High-efficiency particulate air (HEPA) filter performance following service and radiation exposure

    International Nuclear Information System (INIS)

    Jones, L.R.

    1975-01-01

    Small HEPA filters were exposed to a 60Co source with a radiation strength of 3 × 10^7 rads per hour and then exposed to steam-air mixtures at several times filter design flow, followed by extended exposure to steam and air at reduced flow. Additional filters were exposed to air flow in a reactor confinement system and then similarly tested with steam-air mixture flows. The test data and calculated effects of filter pluggage with moisture on confinement system performance following potential reactor accidents are described. Gamma radiation exposure impaired the performance of new filters only slightly and temporarily improved performance of service-aged filters. Normal confinement system service significantly impaired filter performance, although not sufficiently to prevent adequate performance of the SRP confinement system following an unlikely reactor accident. Calculations based on measured filter pluggage indicate that during an accident air flow could be reduced approximately 50 percent with service-degraded HEPA filters present, or approximately 10 percent with new filters damaged by the radiation exposure. (U.S.)

  7. Methods for in-place testing of HEPA and iodine filters used in nuclear power plants

    International Nuclear Information System (INIS)

    Holmberg, R.; Laine, J.

    1978-04-01

    The purpose of this work was a general investigation of existing in-place test methods and the construction of equipment for in-place testing of HEPA and iodine sorption filters. The discussion is limited to methods used for in-place testing of HEPA and iodine sorption filters in light-water-cooled reactor plants. Delay systems, built for the separation of noble gases, and their testing are not discussed in this work. Contaminants present in the air of a reactor containment can roughly be divided into three groups: aerosols, reactive gases, and noble gases. The aerosols are filtered with HEPA (High Efficiency Particulate Air) filters. The most important reactive gases are molecular iodine and two of its compounds: hydrogen iodide and methyl iodide. Of the gases to be removed by the filters, methyl iodide is the most difficult to remove, especially at high relative humidities. Impregnated activated charcoal is generally used as the sorption material in iodine filters. Experience gained from the operation of nuclear power plants shows that the function of high-efficiency air filter systems cannot be considered safe until it is proved by in-place tests. The in-place tests in use are basically similar: a known test agent is injected upstream of the filter to be tested, and the efficiency is calculated from air samples taken on both sides of the filter. (author)
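
The arithmetic of such an in-place test is simple: a removal efficiency is computed from paired upstream and downstream samples. A minimal sketch (the concentrations below are illustrative, not values from the report):

```python
def in_place_efficiency(upstream_conc, downstream_conc):
    """Filter removal efficiency from paired air samples taken on
    both sides of the filter during an in-place test.
    Both concentrations must be in the same units."""
    return 1.0 - downstream_conc / upstream_conc

def decontamination_factor(upstream_conc, downstream_conc):
    """Upstream/downstream concentration ratio (DF = 1/penetration)."""
    return upstream_conc / downstream_conc

# Illustrative test-agent concentrations, e.g. in ug/m3:
eff = in_place_efficiency(100.0, 0.003)
df = decontamination_factor(100.0, 0.003)
print(f"efficiency {eff:.5%}, DF {df:.0f}")
```

The same two-sample calculation applies whether the test agent is a DOP-type aerosol for HEPA stages or a tracer gas for iodine sorption filters; only the injection and sampling techniques differ.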

  8. Stable Overexpression of the Constitutive Androstane Receptor Reduces the Requirement for Culture with Dimethyl Sulfoxide for High Drug Metabolism in HepaRG Cells.

    Science.gov (United States)

    van der Mark, Vincent A; Rudi de Waart, D; Shevchenko, Valery; Elferink, Ronald P J Oude; Chamuleau, Robert A F M; Hoekstra, Ruurdtje

    2017-01-01

    Dimethylsulfoxide (DMSO) induces cellular differentiation and expression of drug metabolic enzymes in the human liver cell line HepaRG; however, DMSO also induces cell death and interferes with cellular activities. The aim of this study was to examine whether overexpression of the constitutive androstane receptor (CAR, NR1I3), the nuclear receptor controlling various drug metabolism genes, would sufficiently promote differentiation and drug metabolism in HepaRG cells, optionally without using DMSO. By stable lentiviral overexpression of CAR, HepaRG cultures were less affected by DMSO in total protein content and obtained increased resistance to acetaminophen- and amiodarone-induced cell death. Transcript levels of CAR target genes were significantly increased in HepaRG-CAR cultures without DMSO, resulting in increased activities of cytochrome P450 (P450) enzymes and bilirubin conjugation to levels equal or surpassing those of HepaRG cells cultured with DMSO. Unexpectedly, CAR overexpression also increased the activities of non-CAR target P450s, as well as albumin production. In combination with DMSO treatment, CAR overexpression further increased transcript levels and activities of CAR targets. Induction of CYP1A2 and CYP2B6 remained unchanged, whereas CYP3A4 was reduced. Moreover, the metabolism of low-clearance compounds warfarin and prednisolone was increased. In conclusion, CAR overexpression creates a more physiologically relevant environment for studies on hepatic (drug) metabolism and differentiation in HepaRG cells without the utilization of DMSO. DMSO still may be applied to accomplish higher drug metabolism, required for sensitive assays, such as low-clearance studies and identification of (rare) metabolites, whereas reduced total protein content after DMSO culture is diminished by CAR overexpression. Copyright © 2016 by The American Society for Pharmacology and Experimental Therapeutics.

  9. Extraction of semivolatile organic compounds from high-efficiency particulate air (HEPA) filters by supercritical carbon dioxide

    International Nuclear Information System (INIS)

    Schilling, J.B.

    1997-09-01

    Supercritical fluid extraction (SFE) using unmodified carbon dioxide has been explored as an alternative method for the extraction of semivolatile organic compounds from high-efficiency particulate air (HEPA) filters. HEPA filters provide the final stage of containment on many exhaust systems in US Department of Energy (DOE) facilities by preventing the escape of chemical and radioactive materials entrained in the exhausted air. The efficiency of the filters is tested by the manufacturer and DOE using dioctyl phthalate (DOP), a substance regulated by the US Environmental Protection Agency under the Resource Conservation and Recovery Act. Therefore, the filters must be analyzed for semivolatile organics before disposal. Ninety-eight acid, base, and neutral semivolatile organics were spiked onto blank HEPA material and extracted using SFE, Soxhlet, automated Soxhlet, and sonication techniques. The SFE conditions were optimized using a Dionex SFE-703 instrument. Average recoveries for the 98 semivolatile compounds are 82.7% for Soxhlet, 74.0% for sonication, 70.2% for SFE, and 62.9% for Soxtec. Supercritical fluid extraction reduces the extraction solvent volume to 10-15 mL, a factor of 20-30 less than Soxhlet and more than 5 times less than Soxtec and sonication. Extraction times of 30-45 min are used, compared to 16-18 h for Soxhlet extraction

  10. PPAR agonists reduce steatosis in oleic acid-overloaded HepaRG cells

    International Nuclear Information System (INIS)

    Rogue, Alexandra; Anthérieu, Sébastien; Vluggens, Aurore; Umbdenstock, Thierry; Claude, Nancy; Moureyre-Spire, Catherine de la; Weaver, Richard J.; Guillouzo, André

    2014-01-01

    Although non-alcoholic fatty liver disease (NAFLD) is currently the most common form of chronic liver disease, there is no pharmacological agent approved for its treatment. Since peroxisome proliferator-activated receptors (PPARs) are closely associated with hepatic lipid metabolism, they seem to play important roles in NAFLD. However, the effects of PPAR agonists on steatosis, a common pathology associated with NAFLD, remain largely controversial. In this study, the effects of various PPAR agonists, i.e. fenofibrate, bezafibrate, troglitazone, rosiglitazone, muraglitazar and tesaglitazar, on oleic acid-induced steatotic HepaRG cells were investigated after a single 24-hour or 2-week repeat treatment. Lipid vesicle staining by Oil Red O and triglyceride accumulation caused by oleic acid overload were decreased by up to 50%, while fatty acid oxidation was induced after 2-week co-treatment with PPAR agonists. The greatest effects on reduction of steatosis were obtained with the dual PPARα/γ agonist muraglitazar. Such improvement of steatosis was associated with up-regulation of genes related to fatty acid oxidation activity and down-regulation of many genes involved in lipogenesis. Moreover, modulation of expression of some nuclear receptor genes, such as FXR, LXRα and CAR, which are potent actors in the control of lipogenesis, was observed and might explain repression of de novo lipogenesis. Conclusion: Altogether, our in vitro data on steatotic HepaRG cells treated with PPAR agonists correlated well with clinical investigations, bringing a proof of concept that drug-induced reversal of steatosis in humans can be evaluated in vitro before conducting long-term and costly in vivo studies in animals and patients. - Highlights: • There is no pharmacological agent approved for the treatment of NAFLD. • This study demonstrates that PPAR agonists can reduce fatty acid-induced steatosis. • Some nuclear receptors appear to be potent actors in the control

  11. PPAR agonists reduce steatosis in oleic acid-overloaded HepaRG cells

    Energy Technology Data Exchange (ETDEWEB)

    Rogue, Alexandra [Inserm UMR 991, 35043 Rennes Cedex (France); Université de Rennes 1, Faculté des Sciences Pharmaceutiques et Biologiques, 35043 Rennes Cedex (France); Biologie Servier, Gidy (France); Anthérieu, Sébastien; Vluggens, Aurore [Inserm UMR 991, 35043 Rennes Cedex (France); Université de Rennes 1, Faculté des Sciences Pharmaceutiques et Biologiques, 35043 Rennes Cedex (France); Umbdenstock, Thierry [Technologie Servier, Orléans (France); Claude, Nancy [Institut de Recherches Servier, Courbevoie (France); Moureyre-Spire, Catherine de la; Weaver, Richard J. [Biologie Servier, Gidy (France); Guillouzo, André, E-mail: Andre.Guillouzo@univ-rennes1.fr [Inserm UMR 991, 35043 Rennes Cedex (France); Université de Rennes 1, Faculté des Sciences Pharmaceutiques et Biologiques, 35043 Rennes Cedex (France)

    2014-04-01

    Although non-alcoholic fatty liver disease (NAFLD) is currently the most common form of chronic liver disease, there is no pharmacological agent approved for its treatment. Since peroxisome proliferator-activated receptors (PPARs) are closely associated with hepatic lipid metabolism, they seem to play important roles in NAFLD. However, the effects of PPAR agonists on steatosis, a common pathology associated with NAFLD, remain largely controversial. In this study, the effects of various PPAR agonists, i.e. fenofibrate, bezafibrate, troglitazone, rosiglitazone, muraglitazar and tesaglitazar, on oleic acid-induced steatotic HepaRG cells were investigated after a single 24-hour or 2-week repeat treatment. Lipid vesicle staining by Oil Red O and triglyceride accumulation caused by oleic acid overload were decreased by up to 50%, while fatty acid oxidation was induced after 2-week co-treatment with PPAR agonists. The greatest effects on reduction of steatosis were obtained with the dual PPARα/γ agonist muraglitazar. Such improvement of steatosis was associated with up-regulation of genes related to fatty acid oxidation activity and down-regulation of many genes involved in lipogenesis. Moreover, modulation of expression of some nuclear receptor genes, such as FXR, LXRα and CAR, which are potent actors in the control of lipogenesis, was observed and might explain repression of de novo lipogenesis. Conclusion: Altogether, our in vitro data on steatotic HepaRG cells treated with PPAR agonists correlated well with clinical investigations, bringing a proof of concept that drug-induced reversal of steatosis in humans can be evaluated in vitro before conducting long-term and costly in vivo studies in animals and patients. - Highlights: • There is no pharmacological agent approved for the treatment of NAFLD. • This study demonstrates that PPAR agonists can reduce fatty acid-induced steatosis. • Some nuclear receptors appear to be potent actors in the control

  12. Use of sulfuric-nitric acid for the recovery of plutonium from HEPA filters. (620.2, WH001/LWE)

    International Nuclear Information System (INIS)

    Clark, D.E.

    1978-09-01

    Contaminated high-efficiency particulate air (HEPA) filter media, containing PuO₂ powder which had been calcined at 700°C, were treated with concentrated H₂SO₄-HNO₃ at 190 to 200°C for periods ranging from 0.5 to 2.0 hours. When followed by a dilute HNO₃ rinse, this treatment was shown to be very effective as a plutonium recovery operation (≳97% of the plutonium was solubilized). A proposed treatment scheme is given which could provide both a plutonium recovery option for HEPA filters and a reduction in overall waste volume

  13. Health hazards associated with the use of di-(2-ethylhexyl) phthalate (commonly referred to as DOP) in HEPA filter test

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-01-01

    Di-(2-ethylhexyl) phthalate (DEHP), commonly referred to as di-octyl phthalate, is an important production chemical in the US. In addition to its major use as an additive in plastics, DEHP is widely used to evaluate the effectiveness of high efficiency particulate air (HEPA) filters. Historically, DEHP was also used in quantitative fit testing for respirators. Evaluations of this compound a decade ago showed that it can induce hepatocellular carcinomas in laboratory animals. Although most Department of Energy (DOE) facilities have since discontinued using DEHP in respirator fit testing, DEHP continues to be used for evaluating HEPA filters. This report summarizes available information on the toxicity, mutagenicity, carcinogenicity, and other hazards and problems posed by DEHP, specifically with reference to HEPA filter testing. Information on work practice improvements as well as the availability and suitability of DEHP substitutes are also presented. This material should assist the DOE in the safe use of this material.

  14. In-duct countermeasures for reducing fire-generated-smoke-aerosol exposure to HEPA filters

    International Nuclear Information System (INIS)

    Alvares, N.J.; Beason, D.G.; Ford, H.W.

    1978-01-01

    An experimental program was conducted to assess the endurance and lifetime of HEPA filters exposed to fire-generated aerosols, and to reduce the aerosol exposure by installing engineering countermeasures in the duct between the fire source and HEPA filters. Large cribs of wood and other potential fuels of interest were "forcefully burned" in a partially ventilated enclosure. In a "forceful burn" the crib of fuel is continuously exposed to an energetic premixed methane flame during the entire experimental period. This tactic serves two purposes: it optimizes the production of smoke rich in unburned pyrolyzates, which provides severe exposure to the filters, and it facilitates the ignition and enhances the combustion of cribs formed with synthetic polymers. The experiments were conducted in an enclosure specifically designed and instrumented for fire tests. The test cell has a volume of 100 m³ and includes instrumentation to measure the internal temperature distribution, pressure, thermal radiation field, flow fields, gas concentration, particulate size distribution and mass, fuel weight loss, inlet and exit air velocities, and smoke optical density. The countermeasure techniques include the use of passively operated sprinkler systems in the fire test cell, of fine and dense water scrubbing sprays, and of rolling prefiltration systems in the exit duct of the fire test cell. Of the countermeasures surveyed, the rolling prefilter system showed the most promise. This paper concentrates on the effect of control variables, i.e., enclosure air supply, fuel composition and crib porosity, on the combustion response, i.e., crib burning rate, enclosure temperature rise, oxygen consumption, and CO, CO₂ and total hydrocarbon production. A discussion of the attempts to rationalize smoke aerosol properties will be included along with results from the effect of countermeasure application on HEPA filter lifetimes

  15. Transient Heating and Thermomechanical Stress Modeling of Ceramic HEPA Filters

    Energy Technology Data Exchange (ETDEWEB)

    Bogle, Brandon [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kelly, James [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Haslam, Jeffrey [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-09-29

    The purpose of this report is to showcase an initial finite-element analysis model of a ceramic High-Efficiency Particulate Air (HEPA) filter design. Next generation HEPA filter assemblies are being developed at LLNL to withstand high-temperature fire scenarios by use of ceramics and advanced materials. The filters are meant for use in radiological and nuclear facilities, and are required to survive 500°C fires over an hour duration. During such conditions, however, collecting data under varying parameters can be challenging; therefore, a Finite Element Analysis model of the filter was conducted using COMSOL® Multiphysics to analyze the effects of fire. Finite Element Analysis (FEA) modeling offers several opportunities: researchers can quickly and easily consider impacts of potential design changes, material selection, and flow characterization on filter performance. Specifically, this model provides stress references for the sealant at high temperatures. Modeling of full filter assemblies was deemed inefficient given the computational requirements, so a section of three tubes from the assembly was modeled. The model looked at the transient heating and thermomechanical stress development during a 500°C air flow at 6 CFM. Significant stresses were found at the ceramic-metal interfaces of the filter, and conservative temperature profiles at locations of interest were plotted. The model can be used for the development of sealants that minimize stresses at the ceramic-metal interface. Further work on the model would include the full filter assembly and consider heat losses to make more accurate predictions.

  16. Performance of 1000- and 1800- cfm HEPA filters on long exposure to low atmospheric dust loadings, II

    International Nuclear Information System (INIS)

    First, M.W.; Rudnick, S.N.

    1981-01-01

    Comparative tests were made to evaluate the performance characteristics of American- and European-design HEPA filters when exposed, for a number of years, to aerosols characteristic of nuclear and biohazard service. Although some of the European-design filters were operated at their rated airflow capacity of 1800 cfm, some were downrated to 1000 cfm to determine if their service life could be more than tripled compared to conventional 1000-cfm American-design HEPA filters, as filter theory predicts. Initial results indicate, however, that for the ambient aerosol used in this study, a European-design filter has a service life of only 1.6 times that of an American-design filter when both operate at 1000 cfm. Further tests are in progress to verify this result
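The "more than tripled" prediction can be recovered from a common simplification of filter-loading theory in which service life scales roughly with the square of the flow derating; the quadratic scaling in this sketch is an illustrative assumption, not a relation stated in the report.

```python
# Back-of-envelope check of the service-life prediction above.
# Assumption: dust-holding life ~ (rated flow / operating flow)^2,
# a common simplification of filter-loading theory.
rated_flow_cfm = 1800.0    # European-design filter rating
derated_flow_cfm = 1000.0  # operating point in the comparison tests

flow_ratio = rated_flow_cfm / derated_flow_cfm   # media velocity drops 1.8x
predicted_life_ratio = flow_ratio ** 2           # ~3.24x, i.e. "more than tripled"
observed_life_ratio = 1.6                        # measured result reported above

print(f"theory: {predicted_life_ratio:.2f}x  observed: {observed_life_ratio:.1f}x")
```

The gap between the ~3.2x prediction and the observed 1.6x is exactly the discrepancy the abstract flags for further testing.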

  17. Development of glass-fiber high-efficiency particulate air filters of high structural strength on the basis of the establishment of failure mechanisms

    International Nuclear Information System (INIS)

    Ruedinger, V.; Ricketts, C.I.; Wilhelm, J.G.; Alken, W.

    1987-01-01

    Practical experience from routine operation in nuclear installations, as well as extensive bench and laboratory testing, proved the structural limits of HEPA filters to be very low, demonstrating the need for improvement of their structural strength. Detailed analysis of the courses and modes of filter failure under the challenge of dry air at high velocities and ambient temperature, together with additional measurements, allowed the dominant mechanisms of filter failure to be established. Based on this information, the following three options for effective and economical improvements in filter structural limits exist: (1) an increase in the tensile strength of the filter medium; (2) an increase in the stability of the pack to prevent the swelling of individual pleats; and (3) an increase in the area moment of inertia of the separators and a decrease in the sharpness of their edges. By using a reinforced glass-fiber filter medium, the structural strength of standard size HEPA filters was increased to 31 kPa with dry air and beyond 10 kPa with air at high humidity. Prototype filters built with standard glass-fiber media and separators with inclined corrugations exhibited failure pressures of approximately 50 kPa under high-velocity airflows. The combination of both types of improvements, together with other measures, will soon lead to even higher HEPA-filter structural strength

  18. Analysis of fire and smoke threat to off-gas HEPA filters in a transuranium processing plant

    International Nuclear Information System (INIS)

    Alvares, N.J.

    1988-01-01

    The author performed an analysis of fire risk to the high-efficiency particulate air (HEPA) filters that provide ventilation containment for a transuranium processing plant at the Oak Ridge National Laboratory. A fire-safety survey by an independent fire-protection consulting company had identified the HEPA filters in the facility's off-gas containment ventilation system as being at risk from fire effects. The ventilation networks and flow dynamics were studied independently, and typical fuel loads were analyzed. It was found that virtually no condition for fire initiation exists and that, even if a fire started, its consequences would be minimal as a result of standard shut-down procedures. Moreover, the installed fire-protection system would limit any fire and thus would further reduce smoke or heat exposure to the ventilation components. 4 references, 4 figures, 5 tables

  19. Observations of the distribution and the nature of alpha-active particulate material in a HEPA filter used for plutonium-containing dust

    International Nuclear Information System (INIS)

    Ryan, M.T.; McDowell, W.J.

    1977-02-01

    Autoradiography has been used to determine the distribution and the nature of plutonium particulate material on a high-efficiency particulate air (HEPA) filter used to filter ²³⁹Pu-containing dust. Higher concentrations of alpha-active material on upstream and downstream folds of the filter indicate uneven airflow through the filter. Observations of aggregate recoil particles on the downstream face of the filter suggest that aggregate recoil transfer, a mechanism which may reduce long-term HEPA filter efficiency, may be occurring. Amounts of alpha activity found on downstream filters confirm this supposition

  20. Investigation and deactivation of B Plant HEPA filters

    International Nuclear Information System (INIS)

    Roege, P.E.

    1997-01-01

    This paper describes the integrated approach used to manage environmental, safety, and health considerations related to the B Plant canyon exhaust air filters at the US Department of Energy (DOE) Hanford Site. The narrative illustrates the development and implementation of integrated safety management as applied to a facility and its systems undergoing deactivation. During their lifetime, the high efficiency particulate air (HEPA) filters prevented the release of significant quantities of radioactive materials into the air. As the material in B Plant AVESF accumulated on the filters, it created an unusual situation. Over long periods of time, the radiation dose from the filter loading, combined with aging and chemical exposure, actually degraded those filters, which were intended to protect against any release to the environment

  1. Performance of HEPA filters at LLNL following the 1980 and 1989 earthquakes

    International Nuclear Information System (INIS)

    Bergman, W.; Elliott, J.; Wilson, K.

    1995-01-01

    The Lawrence Livermore National Laboratory has experienced two significant earthquakes for which data are available to assess the ability of HEPA filters to withstand seismic conditions. A 5.9 magnitude earthquake with an epicenter 10 miles from LLNL struck on January 24, 1980. Estimates of the peak ground accelerations ranged from 0.2 to 0.3 g. A 7.0 magnitude earthquake with an epicenter about 50 miles from LLNL struck on October 17, 1989. Measurements of the ground accelerations at LLNL averaged 0.1 g. The results from the in-place filter tests obtained after each of the earthquakes were compiled and studied to determine if the earthquakes had caused filter leakage. Our study showed that only the 1980 earthquake resulted in a small increase in the number of HEPA filters developing leaks. In the 12 months following the 1980 and 1989 earthquakes, the in-place filter tests showed that 8.0% and 4.1% of all filters, respectively, developed leaks. The average percentage of filters developing leaks from 1980 to 1993 was 3.3% ± 1.7%. The increase in filter leaks is significant for the 1980 earthquake, but not for the 1989 earthquake. No contamination was detected following the earthquakes that would suggest transient releases from the filtration system
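The significance claim can be sanity-checked by expressing each post-earthquake leak rate as a standard score against the 1980-1993 baseline; the two-sigma cutoff used here is a conventional choice, not one stated in the record.

```python
def z_score(observed_pct, mean_pct, sd_pct):
    """Standard score of an annual leak percentage against the long-term baseline."""
    return (observed_pct - mean_pct) / sd_pct

baseline_mean, baseline_sd = 3.3, 1.7  # % of filters leaking per year, 1980-1993

z_1980 = z_score(8.0, baseline_mean, baseline_sd)  # post-1980-earthquake year
z_1989 = z_score(4.1, baseline_mean, baseline_sd)  # post-1989-earthquake year

# A conventional two-sigma threshold reproduces the paper's conclusion:
print(z_1980 > 2.0, z_1989 > 2.0)  # True False
```

The 1980 rate sits roughly 2.8 standard deviations above the baseline, while the 1989 rate is well within normal year-to-year variation.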

  2. Performance of HEPA filters at LLNL following the 1980 and 1989 earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Bergman, W.; Elliott, J.; Wilson, K. [Lawrence Livermore National Laboratory, CA (United States)

    1995-02-01

    The Lawrence Livermore National Laboratory has experienced two significant earthquakes for which data are available to assess the ability of HEPA filters to withstand seismic conditions. A 5.9 magnitude earthquake with an epicenter 10 miles from LLNL struck on January 24, 1980. Estimates of the peak ground accelerations ranged from 0.2 to 0.3 g. A 7.0 magnitude earthquake with an epicenter about 50 miles from LLNL struck on October 17, 1989. Measurements of the ground accelerations at LLNL averaged 0.1 g. The results from the in-place filter tests obtained after each of the earthquakes were compiled and studied to determine if the earthquakes had caused filter leakage. Our study showed that only the 1980 earthquake resulted in a small increase in the number of HEPA filters developing leaks. In the 12 months following the 1980 and 1989 earthquakes, the in-place filter tests showed that 8.0% and 4.1% of all filters, respectively, developed leaks. The average percentage of filters developing leaks from 1980 to 1993 was 3.3% ± 1.7%. The increase in filter leaks is significant for the 1980 earthquake, but not for the 1989 earthquake. No contamination was detected following the earthquakes that would suggest transient releases from the filtration system.

  3. Software Verification and Validation Test Report for the HEPA filter Differential Pressure Fan Interlock System

    International Nuclear Information System (INIS)

    ERMI, A.M.

    2000-01-01

    The HEPA Filter Differential Pressure Fan Interlock System PLC ladder logic software was tested using a Software Verification and Validation (V&V) Test Plan as required by the "Computer Software Quality Assurance Requirements". The purpose of this document is to report the results of the software qualification

  4. Particle Removal Efficiency of the Portable HEPA Air Cleaner in a Simulated Hospital Ward

    DEFF Research Database (Denmark)

    Qian, Hua; Li, Yuguo; Sun, Hequan

    2010-01-01

    of beds in an isolation ward is insufficient. An experiment was conducted in a full scale experimental ward with a dimension of 6.7 m × 6 m × 2.7 m and 6 beds to test these hypotheses for a portable HEPA filter. The removal efficiency for different size particles was measured at different locations...

  5. Measurement of gamma activity from the PUREX stack, Number 296-A-10, HEPA filters

    International Nuclear Information System (INIS)

    Barnett, J.M.

    1995-11-01

    In response to the Environmental Protection Agency's requirements for evaluating radioactive emissions from stacks, this test plan was developed. The test plan employs the use of low resolution (NaI) portable gamma spectrometry to identify and measure gamma emitting radionuclides from HEPA filters. The test description, expected results, and test set-up and steps are discussed

  6. Multiple HEPA filter test methods. Progress report, January--December 1977

    International Nuclear Information System (INIS)

    Schuster, B.; Kyle, T.; Osetek, D.

    1978-09-01

    Tandem high-efficiency particulate air (HEPA) filter efficiency measurements have been successfully performed on a large number of 20,000 CFM installations. The testing procedure relies on the use of a laser intracavity particle spectrometer and a very high-volume thermal dioctyl phthalate aerosol generator designed and constructed specifically for this purpose. For systems that cannot be tested in this fashion, work has been initiated on the generation and detection of a fluorescent self-identifying aerosol to eliminate the background problem. General candidate aerosols and methods to disperse them have been uncovered. Two distinct detection concepts have evolved for the measurement of size and concentration of these particles

  7. Impact of isomalathion on malathion cytotoxicity and genotoxicity in human HepaRG cells.

    OpenAIRE

    Josse, Rozenn; Sharanek, Ahmad; Savary, Camille C.; Guillouzo, André

    2014-01-01

    International audience; Isomalathion is a major impurity of technical-grade malathion, one of the most abundantly applied insecticides; however, little is known about its hepatotoxicity. In the present study, the cytotoxicity and genotoxicity of malathion and isomalathion, either individually or in combination, were assessed using the metabolically competent human liver HepaRG cell line. Isomalathion reduced cell viability starting at a 100 μM concentration after a 24 h exposure. It also significant...

  8. HEPA filter leaching concept validation trials at the Idaho Chemical Processing Plant

    International Nuclear Information System (INIS)

    Chakravartty, A.C.

    1995-04-01

    The enclosed report documents six New Waste Calcining Facility (NWCF) HEPA filter leaching trials conducted at the Idaho Chemical Processing Plant using a filter leaching system to validate the filter leaching treatment concept. The test results show that a modified filter leaching system will be able to successfully remove both hazardous and radiological constituents to RCRA disposal levels. Based on the success of the filter leach trials, the existing leaching system will be modified to provide a safe, simple, effective, and operationally flexible filter leaching system

  9. Human HepaRG Cells can be Cultured in Hanging-drop Plates for Cytochrome P450 Induction and Function Assays.

    Science.gov (United States)

    Murayama, Norie; Usui, Takashi; Slawny, Nicky; Chesné, Christophe; Yamazaki, Hiroshi

    2015-01-01

    Recent guidance/guidelines for industry recommend that cytochrome P450 induction can be assessed using human hepatocyte enzyme activity and/or mRNA levels to evaluate potential drug-drug interactions. To evaluate time-dependent cytochrome P450 induction precisely, induction of CYP1A2, CYP2B6, and CYP3A4 mRNA was confirmed (>2-fold) by treatment with omeprazole, phenobarbital, and rifampicin, respectively, for 24 or 48 h on day 3 from the start of culture. After 24 h, the fold induction of CYP1A2 with 3.6 and 1.8 × 10⁴ HepaRG cells per well was lower than that for 7.2 × 10⁴ cells. CYP1A2 induction levels at 24 h were higher than those after 48 h. In contrast, higher CYP2B6 inductions were confirmed after 48 h exposure than after 24 h, independent of the number of cells per well. To help reduce the use of human cryopreserved hepatocytes, typical P450-dependent enzyme activities were investigated in human HepaRG cells cultured in commercial hanging-drop plates. Newly designed 96-well hanging-drop plates were capable of maintaining human CYP3A-dependent midazolam hydroxylation activities for up to 4 days using only 10% of the recommended initial 7.2 × 10⁴ cells per well. Favorable HepaRG function using hanging-drop plates was confirmed by detecting 1′-hydroxymidazolam O-glucuronide on day 3, suggesting an improvement over traditional 24-well control plates in which this metabolite can be detected. These results suggest that the catalytic function and/or induction of CYP1A2, CYP2B6, and CYP3A4 can be readily assessed with reduced numbers of starting HepaRG cells cultured in three-dimensional cultures in drops prepared with hanging-drop plates.
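The >2-fold induction criterion used in the study is simple to express directly; the mRNA signal values in this sketch are hypothetical and serve only to illustrate the rule.

```python
def fold_induction(treated_mrna, control_mrna):
    """Ratio of treated to vehicle-control mRNA signal."""
    return treated_mrna / control_mrna

def is_induced(treated_mrna, control_mrna, cutoff=2.0):
    # >2-fold over vehicle control counts as induction,
    # per the criterion cited in the abstract
    return fold_induction(treated_mrna, control_mrna) > cutoff

# Hypothetical signal values for illustration only:
print(is_induced(6.4, 2.0), is_induced(3.0, 2.0))  # True False
```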

  10. In Vivo Evaluation of a New Embolic Spherical Particle (HepaSphere) in a Kidney Animal Model

    International Nuclear Information System (INIS)

    Luis, Esther de; Bilbao, Jose I.; Ciercoles, Jose A. Garcia Jalon de; Martinez-Cuesta, Antonio; Martino Rodriguez, Alba de; Lozano, Maria D.

    2008-01-01

    HepaSphere is a new spherical embolic material developed in a dry state that absorbs fluids and adapts to the vessel wall, leaving no space between the particle and the arterial wall. The aim of this study was to elucidate the final in vivo size, deformation, final location, and main properties of the particles when reconstituted with two different contrast media (Iodixanol and Ioxaglate) in an animal model. Two sizes of 'dry-state' particles (50-100 and 150-200 μm) were reconstituted using both ionic and nonionic contrast media. The mixture was used to partly embolize both kidneys in an animal model (14 pigs). The animals were sacrificed 4 weeks after the procedure and the samples processed. The final size of the particles was 230.2 ± 62.5 μm for the 50- to 100-μm dry-state particles and 314.4 ± 71 μm for the 150- to 200-μm dry-state particles. When the contrast medium (ionic versus nonionic) used for the reconstitution was studied to compare (Student's t-test) the final size of the particles, no differences were found (p > 0.05). The mean in vivo deformation for HepaSphere was 17.1% ± 12.3%. No differences (p > 0.05) were found in the deformation of the particle regarding the dry-state size or the contrast medium (Mann-Whitney test). We conclude that HepaSphere is stable, occludes perfectly, and morphologically adapts to the vessel lumen of the arteries embolized. There is no recanalization of the arteries 4 weeks after embolization. Its final in vivo size is predictable and the particle has the same properties in terms of size and deformation with the two different contrast media (Iodixanol and Ioxaglate)
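The reported in vivo diameters imply a swelling factor relative to the dry state; comparing against the midpoint of each dry-state size range is a simplification introduced here, not a calculation from the paper.

```python
def swelling_factor(final_um, dry_lo_um, dry_hi_um):
    # Ratio of the measured in vivo diameter to the midpoint of the
    # dry-state size range (midpoint choice is an assumption).
    dry_mid = (dry_lo_um + dry_hi_um) / 2.0
    return final_um / dry_mid

small = swelling_factor(230.2, 50, 100)   # 50-100 um dry-state particles
large = swelling_factor(314.4, 150, 200)  # 150-200 um dry-state particles
print(f"{small:.1f}x {large:.1f}x")  # 3.1x 1.8x
```

Note the smaller particles swell proportionally more, which is why the final in vivo size, not the dry-state label, determines the level of occlusion.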

  11. Evaluation of Alternative Control for Prevention and or Mitigation of HEPA Filter Failure Accidents at Tank Farm Facilities

    International Nuclear Information System (INIS)

    GUSTAVSON, R.D.

    2000-01-01

    This study evaluates the adequacy and benefit of use of HEPA filter differential pressure limiting setpoints to initiate exhauster shut down as an alternative safety control for postulated accidents that might result in filtration failure and subsequent unfiltered release from Tank Farm primary tank ventilators

  12. Performance testing of HEPA filters: Progress towards a European standard procedure

    Energy Technology Data Exchange (ETDEWEB)

    Dyment, J.

    1997-08-01

    Proposals for a future European testing procedure for "High Efficiency Particulate Air Filters (HEPA and ULPA)" are being developed by CEN (Comite Europeen de Normalisation). The new standard will be given the status of national standard in participating countries, conflicting national standards being withdrawn. The standard will comprise 5 parts covering the grouping and classification of HEPA and ULPA filters according to their efficiency, fundamental principles of testing, marking, etc. (in part 1). Part 2 will cover aerosol production, measurement principles, counting equipment and statistics. Parts 3-5 will cover testing flat sheet media, leak testing of filter elements, and the efficiency testing of filter elements, respectively. The efficiency test methods allow the use of either homogeneous monodisperse or polydisperse aerosols for the determination of particulate filtration efficiencies as a function of particle size. The particle size at which maximum penetration occurs is first determined in flat sheet media tests; tests on filter elements (constructed using the same filter medium) may be carried out using either a homogeneous monodisperse aerosol of the size at which maximum penetration occurs (MPPS) or a polydisperse aerosol whose median size is close to the MPPS. Tests with monodisperse aerosols may be conducted using condensation nucleus counting equipment; tests using polydisperse test aerosols require the use of optical sizing particle counters. When determining the efficiency of filter elements the downstream aerosol concentrations may be determined from air samples obtained using either an overall method (single point sampling after mixing) or a scan method. The scan method also allows "local" efficiency values to be determined. 1 ref., 1 fig., 1 tab.
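Whichever sampling method is used, the efficiency figures reduce to penetration, the ratio of downstream to upstream concentration at the MPPS. A minimal sketch, using hypothetical particle counts:

```python
def penetration(downstream_count, upstream_count):
    """Fraction of challenge particles passing the filter."""
    return downstream_count / upstream_count

def efficiency_pct(downstream_count, upstream_count):
    """Filtration efficiency as a percentage."""
    return 100.0 * (1.0 - penetration(downstream_count, upstream_count))

# Hypothetical counts at the most-penetrating particle size (MPPS):
print(f"{efficiency_pct(5, 1_000_000):.4f}%")  # 99.9995%
```

The "overall" method applies this once to a mixed downstream sample; the scan method applies it point by point across the filter face to yield local efficiencies.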

  13. Electrofibrous prefilters for use in nuclear ventilation systems

    International Nuclear Information System (INIS)

    Bergman, W.; Kuhl, W.D.; Russell, W.L.; Taylor, R.D.; Hebard, H.D.; Biermann, A.H.; Alvares, N.J.; Beason, D.G.; Lum, B.Y.

    1981-01-01

    We have established a comprehensive program for the US Department of Energy to develop electrofibrous prefilters to extend the life of High Efficiency Particulate Air (HEPA) filters that are used in the nuclear industry. We have selected the electrofibrous filter because, compared to the mechanical fibrous filter, it has a higher efficiency and longer lifetime. Two different electrofibrous filters have been developed for use in nuclear ventilation systems. One prototype is a stationary prefilter while the other is a rolling prefilter. Both prefilters use the same basic filtering technique in which a fibrous filter medium is sandwiched between a high voltage electrode and a ground electrode, both electrodes having a sufficient open area to offer minimum air resistance. The applied voltage on the electrodes generates an electric field that polarizes the filter fibers, which then attract suspended particles via electrostatic forces. The filter media and electrodes have been pleated to provide a sufficiently long particle residence time. The special requirement of protecting the HEPA filter from a high concentration of smoke aerosols during fire conditions led to the development of the rolling, electrofibrous prefilter. We established the feasibility of this concept in a series of tests using commercially available rolling prefilters that were modified for removing smoke aerosols. Although the rolling prefilter concept is not a cost effective measure for the sole purpose of protecting HEPA filters from smoke aerosols, it became cost effective when used primarily for protecting the HEPA filters from normal production aerosols. The same piece of equipment is then used for both normal operating conditions as well as emergency fire conditions. Several prototype electrofibrous rolling prefilters were designed, built and evaluated. The filter evaluations were conducted using NaCl and DOP aerosols as well as smoke aerosols

  14. Comparison of Emery 3004 and 3006 characteristics with DOP for possible use in HEPA filter leak tests

    Energy Technology Data Exchange (ETDEWEB)

    Kovach, B.J.; Banks, E.M.; Kovacs, G. [Nuclear Consulting Services, Inc., Columbus, OH (United States)

    1995-02-01

    The particle size distribution, concentration, liquid to aerosol conversion rate and ignition properties of DOP, Emery 3004 and Emery 3006 aerosols generated by the NUCON Aerosol Generators Models SN-10 and DG-F were obtained. Results demonstrate the Emery products are acceptable replacements for DOP in performing leak testing of HEPA filters.

  15. HEPA Filter Performance under Adverse Conditions

    International Nuclear Information System (INIS)

    Parsons, Michael; Hogancamp, Kristina; Alderman, Steven; Waggoner, Charles

    2007-01-01

    This study involved challenging nuclear grade high-efficiency particulate air (HEPA) filters under a variety of conditions that can arise in Department of Energy (DOE) applications such as: low or high RH, controlled and uncontrolled challenge, and filters with physically damaged media or seals (i.e., leaks). Reported findings correlate filter function as measured by traditional differential pressure techniques in comparison with simultaneous instrumental determination of upstream and downstream PM concentrations. Additionally, emission rates and failure signatures will be discussed for filters that have either failed or exceeded their usable lifetime. Significant findings from this effort include the use of thermocouples upstream and downstream of the filter housing to detect the presence of moisture. Also demonstrated in the moisture challenge series of tests is the effect of repeated wetting of the filter. This produces a phenomenon referred to as transient failure before the tensile strength of the media weakens to the point of physical failure. An evaluation of the effect of particle size distribution of the challenge aerosol on loading capacity of filters is also included. Results for soot and two size distributions of KCl are reported. Loading capacities for filters ranged from approximately 70 g of soot to nearly 900 g for the larger particle size distribution of KCl. (authors)

  16. Dose- and time-dependent effects of phenobarbital on gene expression profiling in human hepatoma HepaRG cells

    International Nuclear Information System (INIS)

    Lambert, Carine B.; Spire, Catherine; Claude, Nancy; Guillouzo, Andre

    2009-01-01

    Phenobarbital (PB) induces or represses a wide spectrum of genes in rodent liver. Much less is known about its effects in human liver. We used pangenomic cDNA microarrays to analyze concentration- and time-dependent gene expression profile changes induced by PB in the well-differentiated human HepaRG cell line. Changes in gene expression profiles clustered at specific concentration ranges and treatment times. The number of correctly annotated genes significantly modulated by at least three different PB concentration ranges (spanning 0.5 to 3.2 mM) at 20 h exposure amounted to 77 and 128 genes (p ≤ 0.01) using 2- and 1.8-fold change filters, respectively. At low concentrations (0.5 and 1 mM), PB-responsive genes included the well-recognized CAR- and PXR-dependent responsive cytochromes P450 (CYP2B6, CYP3A4), sulfotransferase 2A1 and plasma transporters (ABCB1, ABCC2), as well as a number of genes critically involved in various metabolic pathways, including lipid (CYP4A11, CYP4F3), vitamin D (CYP24A1) and bile (CYP7A1 and CYP8B1) metabolism. At concentrations of 3.2 mM or higher after 20 h, and especially 48 h, increased cytotoxic effects were associated with dysregulation of numerous genes related to oxidative stress, DNA repair and apoptosis. Primary human hepatocyte cultures were also exposed to 1 and 3.2 mM PB for 20 h and the changes were comparable to those found in HepaRG cells treated under the same conditions. Taken altogether, our data provide further evidence that HepaRG cells closely resemble primary human hepatocytes and provide new information on the effects of PB in human liver. These data also emphasize the importance of investigating dose- and time-dependent effects of chemicals when using toxicogenomic approaches
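The combined fold-change and p-value filter described in the abstract can be sketched directly; the cutoff values come from the text, while the symmetric treatment of down-regulated genes is an assumption made here.

```python
def passes_filter(fold_change, p_value, fc_cutoff=1.8, p_cutoff=0.01):
    # A gene is kept if it is significantly up- OR down-regulated
    # beyond the fold-change threshold (symmetry assumed here).
    regulated = fold_change >= fc_cutoff or fold_change <= 1.0 / fc_cutoff
    return p_value <= p_cutoff and regulated

# Illustrative calls with hypothetical fold changes and p-values:
print(passes_filter(2.3, 0.004), passes_filter(1.2, 0.004))  # True False
```

Tightening `fc_cutoff` from 1.8 to 2.0 is what shrinks the reported gene count from 128 to 77.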

  17. U-235 Holdup Measurements in the 321-M Lathe HEPA Banks

    International Nuclear Information System (INIS)

    Salaymeh, S.R.

    2002-01-01

    The Analytical Development Section of Savannah River Technology Center (SRTC) was requested by the Facilities Decommissioning Division (FDD) to determine the holdup of enriched uranium in the 321-M facility as part of an overall deactivation project of the facility. The results of the holdup assays are essential for determining compliance with the Waste Acceptance Criteria, Material Control and Accountability, and to meet criticality safety controls. This report covers holdup measurements of uranium residue in six high efficiency particulate air (HEPA) filter banks of the A-lathe and B-lathe exhaust systems of the 321-M facility. This report discusses the non-destructive assay measurements, assumptions, calculations, and results of the uranium holdup in these six items

  18. Development and evaluation of a cleanable high efficiency steel filter

    International Nuclear Information System (INIS)

    Bergman, W.; Larsen, G.; Weber, F.; Wilson, P.; Lopez, R.; Valha, G.; Conner, J.; Garr, J.; Williams, K.; Biermann, A.; Wilson, K.; Moore, P.; Gellner, C.; Rapchun, D.; Simon, K.; Turley, J.; Frye, L.; Monroe, D.

    1993-01-01

    We have developed a high efficiency steel filter that can be cleaned in-situ by reverse air pulses. The filter consists of 64 pleated cylindrical filter elements packaged into a 610 x 610 x 292 mm aluminum frame and has 13.5 m² of filter area. The filter media consists of a sintered steel fiber mat using 2 μm diameter fibers. We conducted an optimization study for filter efficiency and pressure drop to determine the filter design parameters of pleat width, pleat depth, outside diameter of the cylinder, and the total number of cylinders. Several prototype cylinders were then built and evaluated in terms of filter cleaning by reverse air pulses. The results of these studies were used to build the high efficiency steel filter. We evaluated the prototype filter for efficiency and cleanability. The DOP filter certification test showed the filter has a passing efficiency of 99.99% but a failing pressure drop of 0.80 kPa at 1,700 m³/hr. Since we were not able to achieve a pressure drop less than 0.25 kPa, the steel filter does not meet all the criteria for a HEPA filter. Filter loading and cleaning tests using AC Fine dust showed the filter could be repeatedly cleaned by reverse air pulses. The next phase of the prototype evaluation consisted of installing the unit and support housing in the exhaust duct work of a uranium grit blaster for a field evaluation at the Y-12 Plant in Oak Ridge, TN. The grit blaster is used to clean the surface of uranium parts and generates a cloud of UO₂ aerosols. We used a 1,700 m³/hr slip stream from the 10,200 m³/hr exhaust system

  19. Investigation of HEPA filters subjected to tornado pressure pulses

    International Nuclear Information System (INIS)

    Gregory, W.S.; Horak, H.L.; Smith, P.R.; Ricketts, C.

    1977-03-01

    An experimental program is described that will determine the response of 0.6 x 0.6 m (24 x 24 in.) high-efficiency particulate air (HEPA) filters to tornado-induced pressure transients. A blow-down system will be used to impose pressure differentials across the filters. Progress in construction of this system is reported with a description of the component parts and their functions. The test facility is essentially complete with the exception of an air dryer system that has not yet been delivered. Initial structural testing will begin in March 1977. A description is given of the instrumentation needed to measure air pressure, velocity, turbulence, humidity and particulate concentration. This instrumentation includes pressure transducers, humidity equipment, laser Doppler velocimeters (LDV), signal processors and a data acquisition system. Operational theory of the LDV and its proposed use as a particle counting device are described

  20. Selecting Cells for Bioartificial Liver Devices and the Importance of a 3D Culture Environment: A Functional Comparison between the HepaRG and C3A Cell Lines.

    Science.gov (United States)

    van Wenum, Martien; Adam, Aziza A A; Hakvoort, Theodorus B M; Hendriks, Erik J; Shevchenko, Valery; van Gulik, Thomas M; Chamuleau, Robert A F M; Hoekstra, Ruurdtje

    2016-01-01

    Recently, the first clinical trials on Bioartificial Livers (BALs) loaded with a proliferative human hepatocyte cell source have started. There are two cell lines that are currently in an advanced state of BAL development: HepaRG and HepG2/C3A. In this study we aimed to compare both cell lines on their applicability in BALs and to identify possible strategies for further improvement. We tested both cell lines in monolayer- and BAL cultures on growth characteristics, hepatic differentiation, nitrogen-, carbohydrate-, amino acid- and xenobiotic metabolism. Interestingly, both cell lines approximated the hepatocyte phenotype more closely when cultured in BALs; e.g. monolayer cultures produced lactate, while BAL cultures showed diminished lactate production (C3A) or conversion to elimination (HepaRG), and urea cycle activity increased upon BAL culturing in both cell lines. HepaRG-BALs outperformed C3A-BALs on xenobiotic metabolism, ammonia elimination and lactate elimination, while protein synthesis was comparable. In BAL cultures of both cell lines ammonia elimination correlated positively with glutamine production and glutamate consumption, suggesting ammonia elimination was mainly driven by the balance between glutaminase and glutamine synthetase activity. Both cell lines lacked significant urea cycle activity and both required multiple culture weeks before reaching optimal differentiation in BALs. In conclusion, culturing in BALs enhanced hepatic functionality of both cell lines and of these, the HepaRG cells are the most promising proliferative cell source for BAL application.

  1. Evaluation of the HEPA filter in-place test method in a corrosive off-gas environment

    International Nuclear Information System (INIS)

    Murphy, L.P.; Wong, M.A.; Girton, R.C.

    1978-01-01

    Experiments were performed to determine if the combined effects of temperature, humidity, and oxides of nitrogen (NOₓ) hinder the in-place testing of high-efficiency particulate air (HEPA) filters used for cleaning the off-gas from a nuclear waste solidification facility. The laboratory system that was designed to simulate the process off-gas contained two HEPA filters in series with sample ports before each filter and after the filter bank. The system also included a reaction bomb for partial conversion of NO to NO₂. Instrumentation measured stream flow, humidity, NOₓ concentration, and temperature. Comparison measurements of the DOP concentrations were made by a forward light-scattering photometer and a single particle intra-cavity laser particle spectrometer. Experimental conditions could be varied, but maximum system capabilities were 95% relative humidity, 90 °C, and 10,000 ppm of NOₓ. A 2³ factorial experimental design was used for the test program. This design determined the main effects of each factor plus the interactions of the factors in combination. The results indicated that water vapor and NOₓ interfere with the conventional photometer measurements. Suggested modifications that include a unique sample dryer are described to correct the interferences. The laser particle spectrometer appears to be an acceptable instrument for measurements under adverse off-gas conditions
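    The 2³ factorial analysis mentioned above can be sketched as follows. The factor names follow the abstract, but the response values and effect computation below are illustrative assumptions, not the study's data: under ±1 level coding, a main effect is simply mean(response at high level) minus mean(response at low level).

```python
import itertools

# Hedged sketch of a 2^3 factorial design: eight runs over low/high (-1/+1)
# levels of temperature, humidity, and NOx. Response values are invented:
# baseline + linear effects + one humidity*NOx interaction.
factors = ["temperature", "humidity", "NOx"]
runs = list(itertools.product([-1, 1], repeat=3))
response = {r: 10 + 2 * r[0] + 5 * r[1] + 3 * r[2] + 1.5 * r[1] * r[2]
            for r in runs}

def main_effect(i):
    # mean over the 4 runs at the high level minus mean over the 4 at low
    hi = sum(response[r] for r in runs if r[i] == 1) / 4
    lo = sum(response[r] for r in runs if r[i] == -1) / 4
    return hi - lo

for i, name in enumerate(factors):
    print(name, main_effect(i))   # each main effect = 2x the coefficient
```

    Note that with ±1 coding the interaction term averages out of the main effects, which is why the design can separate "main effects of each factor" from "interactions of the factors in combination" as the abstract describes.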

  2. Method for HEPA filter leak scanning with differentiating aerosol detector

    Energy Technology Data Exchange (ETDEWEB)

    Kovach, B.J.; Banks, E.M.; Wikoff, W.O. [NUCON International, Inc., Columbus, OH (United States)

    1997-08-01

    While scanning HEPA filters for leaks with "off the shelf" aerosol detection equipment, the operator's scanning speed is limited by the time constant and threshold sensitivity of the detector. This is based on detection of the aerosol density, where the maximum signal is achieved when the scanning probe resides over the pinhole longer than several detector time-constants. Since the differential value of the changing signal can be determined by observing only the first small fraction of the rising signal, using a differentiating amplifier will speed up the locating process. The other advantage of differentiation is that slow signal drift or zero offset will not interfere with the process of locating the leak, since they are not detected. A scanning hand-probe attachable to any NUCON® Aerosol Detector displaying the combination of both aerosol density and differentiated signal was designed. 3 refs., 1 fig.
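    A minimal numerical sketch of why differentiation helps, assuming a first-order detector response over a pinhole plus a slow linear baseline drift (all constants invented for illustration, not NUCON specifications):

```python
import math

tau = 1.0          # detector time constant, s (assumed)
A = 100.0          # steady-state leak signal, arbitrary units (assumed)
drift_rate = 0.5   # slow baseline drift, units/s (assumed)
dt = 0.01          # sampling interval, s

def signal(t):
    # first-order rise toward A, plus drift the differentiator ignores
    return A * (1.0 - math.exp(-t / tau)) + drift_rate * t

density_threshold = 95.0   # near steady state: needs a ~3*tau dwell
slope_threshold = 20.0     # well above drift_rate, well below A/tau

t_density = t_slope = None
t, prev = 0.0, signal(0.0)
while t < 10.0 and (t_density is None or t_slope is None):
    t += dt
    s = signal(t)
    slope = (s - prev) / dt            # the differentiated signal
    if t_density is None and s >= density_threshold:
        t_density = t
    if t_slope is None and slope >= slope_threshold:
        t_slope = t
    prev = s

print(t_slope, t_density)  # slope-based detection fires far earlier
```

    The slope threshold trips within the first fraction of the rise, while the density threshold requires dwelling for several time constants; the drift term shifts the raw signal but barely affects its derivative, matching the two advantages claimed in the abstract.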

  3. Liver Progenitor Cell Line HepaRG Differentiated in a Bioartificial Liver Effectively Supplies Liver Support to Rats with Acute Liver Failure

    NARCIS (Netherlands)

    Nibourg, Geert A. A.; Chamuleau, Robert A. F. M.; van der Hoeven, Tessa V.; Maas, Martinus A. W.; Ruiter, An F. C.; Lamers, Wouter H.; Oude Elferink, Ronald P. J.; van Gulik, Thomas M.; Hoekstra, Ruurdtje

    2012-01-01

    A major roadblock to the application of bioartificial livers is the need for a human liver cell line that displays a high and broad level of hepatic functionality. The human bipotent liver progenitor cell line HepaRG is a promising candidate in this respect, for its potential to differentiate into

  4. Study of the effect of humidity, particle hygroscopicity and size on the mass loading capacity of HEPA filters

    International Nuclear Information System (INIS)

    Gupta, A.

    1992-01-01

    The effect of humidity, particle hygroscopicity and size on the mass loading capacity of glass fiber HEPA filters has been studied. At humidities above the deliquescent point, the pressure drop across the HEPA filter increased non-linearly with the areal loading density (mass collected/filtration area) of NaCl aerosol, thus significantly reducing the mass loading capacity of the filter compared to dry hygroscopic or non-hygroscopic particle mass loadings. The specific cake resistance, K₂, has been computed for different test conditions and used as a measure of the mass loading capacity. K₂ was found to decrease with increasing humidity for the non-hygroscopic aluminum oxide particles and the hygroscopic NaCl particles (at humidities below the deliquescent point). It is postulated that an increase in humidity leads to the formation of a more open particulate cake which lowers the pressure drop for a given mass loading. A formula for predicting K₂ for lognormally distributed aerosols (parameters obtained from impactor data) is derived. The resistance factor, R, calculated using this formula was compared to the theoretical R calculated using the Rudnick-Happel expression. For the non-hygroscopic aluminum oxide the agreement was good but for the hygroscopic sodium chloride, due to large variation in the cake porosity estimates, the agreement was poor
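    The linear cake-filtration relation underlying the specific cake resistance K₂ can be sketched as below. The clean-filter pressure drop, face velocity, and K₂ value are illustrative assumptions, not the study's measured data; the model is the standard linear form dP = dP0 + K₂·W·U, where W is the areal loading density and U the face velocity.

```python
dP0 = 250.0   # clean-filter pressure drop, Pa (assumed)
U = 0.025     # face velocity, m/s (assumed)
K2 = 4.0e4    # specific cake resistance (illustrative; units consistent
              # with W in g/m^2 and U in m/s)

def pressure_drop(W):
    # linear cake model: filter resistance plus dust-cake resistance
    return dP0 + K2 * W * U

# K2 can be recovered from the slope of two (loading, dP) observations,
# which is how it serves as a measure of mass loading capacity:
W1, W2 = 5.0, 20.0
K2_est = (pressure_drop(W2) - pressure_drop(W1)) / ((W2 - W1) * U)
print(K2_est)
```

    A larger K₂ means pressure drop rises faster per unit of collected mass, i.e. a lower mass loading capacity at a given terminal pressure drop; the abstract's non-linear NaCl behavior above the deliquescent point is precisely a departure from this linear model.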

  5. Replacement of HEPA Filters at the LANL CMR Facility: Risks Reduced by Comprehensive Waste Characterization

    International Nuclear Information System (INIS)

    Corpion, J.; Barr, A.; Martinez, P.; Bader, M.

    2002-01-01

    In March 2001, the Los Alamos National Laboratory (LANL) completed the replacement of 720 radioactively contaminated HEPA filters for $5.7M. This project was completed five months ahead of schedule and $6.0M under budget with no worker injuries or contaminations. Numerous health and safety, environmental, and waste disposal problems were overcome, including having to perform work in a radioactively contaminated work environment that was also contaminated with perchlorates (a potential explosive). High waste disposal costs were also an issue. A project risk analysis and government cost estimate determined that the cost of performing the work would be $11.8M. To reduce risk, a $1.2M comprehensive condition assessment was performed to determine the degree of toxic and radioactive contamination trapped on the HEPA filters and to determine whether explosive concentrations of perchlorates were present. Workers from LANL and personnel from Waldheim International of Knoxville, TN, collected hundreds of samples wearing personal protective gear against radioactive, toxic, and explosive hazards. LANL also funded research at the New Mexico Institute of Mining and Technology to determine the explosivity of perchlorates. The data acquired from the condition assessment showed that toxic metals, toxic organic compounds, and explosive concentrations of perchlorates were absent. The data also showed that the extent of actinide metal contamination was less than expected, reducing the potential of transuranic waste generation by 50%. Consequently, $4.2M in cost savings and $1.8M in risk reduction were realized by increased worker productivity and waste segregation

  6. Study on DOP substitutes for leaking rate testing of HEPA filter used in nuclear air cleaning systems

    International Nuclear Information System (INIS)

    Qiu Dangui; Zhang Jirong; Hou Jianrong; Qiao Taifei; Shen Dapeng; Shi Yingxia

    2012-01-01

    Based on an extensive survey of the available literature on HEPA filter testing, PEG400, Shell Ondina oil 15 and P.a. were chosen as candidate DOP substitutes, and a series of tests was conducted on their aerosol conversion rate, particle size distribution, DOP detector response and leak rate in HEPA filters. With consideration of technical properties, safety performance and economy, homemade P.a. is finally selected as the best substitute for DOP among the three. (authors)

  7. Reliability and Validity of the SE-HEPA: Examining Physical Activity- and Healthy Eating-Specific Self-Efficacy among a Sample of Preadolescents

    Science.gov (United States)

    Steele, Michael M.; Burns, Leonard G.; Whitaker, Brandi N.

    2013-01-01

    Objective. The purpose of this study was to examine the psychometric properties of the self-efficacy for healthy eating and physical activity measure (SE-HEPA) for preadolescents. Method. The reliability of the measure was examined to determine if the internal consistency of the measure was adequate (i.e., αs greater than 0.70). Next, in an…

  8. Preferential induction of the AhR gene battery in HepaRG cells after a single or repeated exposure to heterocyclic aromatic amines

    International Nuclear Information System (INIS)

    Dumont, Julie; Josse, Rozenn; Lambert, Carine; Antherieu, Sebastien; Laurent, Veronique; Loyer, Pascal; Robin, Marie-Anne; Guillouzo, Andre

    2010-01-01

    2-Amino-1-methyl-6-phenylimidazo[4,5-b]pyridine (PhIP) and 2-amino-3,8-dimethylimidazo[4,5-f]quinoxaline (MeIQx) are two of the most common heterocyclic aromatic amines (HAA) produced during cooking of meat, fish and poultry. Both HAA produce different tumor profiles in rodents and are suspected to be carcinogenic in humans. In order to better understand the molecular basis of HAA toxicity, we have analyzed gene expression profiles in the metabolically competent human HepaRG cells using pangenomic oligonucleotide microarrays, after either a single (24-h) or a repeated (28-day) exposure to 10 μM PhIP or MeIQx. The most responsive genes to both HAA were downstream targets of the arylhydrocarbon receptor (AhR): CYP1A1 and CYP1A2 after both time points and CYP1B1 and ALDH3A1 after 28 days. Accordingly, CYP1A1/1A2 induction in HAA-treated HepaRG cells was prevented by chemical inhibition or small interference RNA-mediated down-regulation of the AhR. Consistently, HAA induced activity of the CYP1A1 promoter, which contains a consensus AhR-related xenobiotic-responsive element (XRE). In addition, several other genes exhibited both time-dependent and compound-specific expression changes with, however, a smaller magnitude than previously reported for the prototypical AhR target genes. These changes concerned genes mainly related to cell growth and proliferation, apoptosis, and cancer. In conclusion, these results identify the AhR gene battery as the preferential target of PhIP and MeIQx in HepaRG cells and further support the hypothesis that intake of HAA in diet might increase human cancer risk.

  9. Solutions for Dioctyl Phthalate (DOP) tested high efficiency particulate air (HEPA) filters destined for disposal at Hanford, Washington

    International Nuclear Information System (INIS)

    Gablin, K.A.

    1992-11-01

    In January 1992, Argonne National Laboratory East, Environmental and Waste Management Program, learned that a chemical material used for testing of all HEPA filters at the primary source, Flanders Filter, Inc. in Washington, NC, was considered a hazardous chemical under Washington State Dangerous Waste Regulations. These regulations fall under the jurisdiction of the Washington Administrative Code, Chapter 173-303, and therefore directly impact the Hanford Site Solid Waste Acceptance Criteria. Dioctyl phthalate (DOP) is added in small test quantities at the factory, at three Department of Energy (DOE) operated HEPA filter test facilities, and in the installed duct work at various operating laboratories or production facilities. When small amounts of radioactivity are added to the filter media in operation, the result is a mixed waste. This definition would normally arise only in the state of Washington, since its acceptance criteria are ten times more stringent than those of the US Environmental Protection Agency (US EPA). Methods of processing are discussed, including detoxification, physical separation, heat and vacuum separation, and compaction. The economic impact of a mixed waste definition in the State of Washington, and a low-level waste (LLW) definition in other locations, may make this product a prime candidate for commercial disposal in the future, or a possible de-listing by the State of Washington

  10. Dual-color fluorescence imaging to monitor CYP3A4 and CYP3A7 expression in human hepatic carcinoma HepG2 and HepaRG cells.

    Directory of Open Access Journals (Sweden)

    Saori Tsuji

    Human adult hepatocytes expressing CYP3A4, a major cytochrome P450 enzyme, are required for cell-based assays to evaluate the potential risk of drug-drug interactions caused by transcriptional induction of P450 enzymes in early-phase drug discovery and development. However, CYP3A7 is preferentially expressed in premature hepatoblasts and major hepatic carcinoma cell lines. The human hepatocellular carcinoma cell line HepaRG possesses a high self-renewal capacity and can differentiate into hepatic cells similar to human adult hepatocytes in vitro. Transgenic HepaRG cells, in which the expression of fluorescent reporters is regulated by 35 kb regulatory elements of CYP3A4, have a distinct advantage over human hepatocytes isolated by collagenase perfusion, which are unstable in culture. Thus, we created transgenic HepaRG and HepG2 cells by replacing the protein-coding regions of human CYP3A4 and CYP3A7 with enhanced green fluorescent protein (EGFP) and DsRed reporters, respectively, in a bacterial artificial chromosome vector that included whole regulatory elements. The intensity of DsRed fluorescence was initially high during the proliferation of transgenic HepaRG cells. However, most EGFP-positive cells were derived from those in which DsRed fluorescence was extinguished. Comparative analyses in these transgenic clones showed that changes in the total fluorescence intensity of EGFP reflected fold changes in the mRNA level of endogenous CYP3A4. Moreover, CYP3A4 induction was monitored by the increase in EGFP fluorescence. Thus, this assay provides a real-time evaluation system for quality assurance of hepatic differentiation into CYP3A4-expressing cells, unfavourable CYP3A4 induction, and fluorescence-activated cell sorting-mediated enrichment of CYP3A4-expressing hepatocytes based on the total fluorescence intensities of fluorescent reporters, without the need for many time-consuming steps.

  11. Characterizing radionuclides in the B Plant HEPA filters

    International Nuclear Information System (INIS)

    Roege, P.E.

    1998-01-01

    B Plant was built during World War II to separate plutonium for nuclear weapons from reactor fuel. Later, the plant was re-equipped and used to separate radioactive fission products from the Hanford Site's nuclear processing waste tanks. The facility is now being deactivated: eliminating, stabilizing, and documenting existing hazards to allow safe surveillance and maintenance pending a final disposition which is yet to be determined. The processing areas of the plant, including process cells and exhaust air system, are heavily contaminated with radioactive cesium and strontium from the tank waste separation process. However, detailed characterization is difficult because many of these areas are inaccessible because of physical barriers and high radiological dose rates. The five existing canyon high efficiency particulate air (HEPA) filters were thought to contain a significant fraction of the inventory, but estimates were highly uncertain. This paper describes the process used to inspect and characterize the radionuclide content in one of these filters. The investigation required a collaborative effort among field and technical personnel. Sophisticated computer modeling and detector technologies were employed in conjunction with sound radiological control and field work practices. The outcome of the effort was a considerable reduction in the filter inventory estimate, accompanied by a greatly improved level of confidence in the data. The information derived from this project will provide a sound basis for future decisions regarding filter disposition

  12. Collection of aerosols in high efficiency particulate air filters

    International Nuclear Information System (INIS)

    Pratt, R.P.; Green, B.L.

    1987-01-01

    The investigation of the performance of HEPA filters of both minipleat and conventional deep pleat designs has continued at Harwell. Samples of filters from several manufacturers have been tested against the UKAEA/BNF plc filter purchasing specification. No unexpected problems have come to light in these tests, apart from some evidence to suggest that, although meeting the specification, minipleat filters are inherently weaker in burst strength than conventional filters. In addition tests have been carried out to investigate the dust loading versus pressure drop characteristics of both designs of filters using a range of test dusts - ASHRAE dust, carbon black, BS 2831 No. 2 test dust and sodium chloride. In parallel with the laboratory test work, a more fundamental study on the effects of geometric arrangement of filter media within the filter frame has been carried out on behalf of the UKAEA by Loughborough University. The result of this study has been the development of a mathematical model to predict the dust load versus pressure drop characteristic as a function of filter media geometry. This has produced good agreement with laboratory test results using a challenge aerosol in the 1-5 μm size range. Further observations have been made to enhance understanding of the deposition of aerosols within the filter structure. The observations suggest that the major influence on dust loading is the depth of material collected in the flow channel as a surface deposition, and this explains the relatively poor performance of the minipleat design of filter

  13. Evaluation of self-contained HEPA filter

    Energy Technology Data Exchange (ETDEWEB)

    Arndt, T.E. [Westinghouse Hanford Company, Richland, WA (United States)

    1995-02-01

    This paper presents the results of an evaluation of a self-contained high-efficiency particulate air (SCHEPA) filter used in nuclear applications. A SCHEPA consists of filter medium encapsulated in a casing that is part of the system boundary. The SCHEPA filter serves as a combination of filter housing and filter. The filter medium is attached directly to the casing using adhesive as a bonding agent. A cylindrical connection in the middle of the end caps connects the filter assembly to adjoining ductwork. The SCHEPA must perform the functions of a filter housing, filter frame, and filter. It was recognized that the codes and standards do not address the SCHEPA specifically. Therefore, the investigation evaluated the SCHEPA against current codes and standards related to the functional requirements of an air-cleaning system. The specific standards used are required by DOE Order 6430.1A and include ASME N509, ASME N510, ERDA 76-21, MIL-F-51068F, NFPA 90A, and NFPA 91. The evaluation does not address whether the SCHEPA as a standard (off-the-shelf) filter could be upgraded to meet the current code requirements for an air-cleaning unit. The evaluation also did not consider how the SCHEPA was used in a system (e.g., whether it was under positive or negative pressure or whether it served as an air inlet filter to prevent contamination releases under system pressurization). The results of the evaluation show that the SCHEPA filter does not meet the design, fabrication, testing, and documentation requirements of ASME N509 and ASME N510. The paper identifies these deficiencies. Specific exhaust system requirements and application should be considered when an evaluation of the SCHEPA filter is being performed in existing systems. When new designs are being contemplated, other types of HEPA filter housings can be used in lieu of the SCHEPA filter.

  14. A device for uranium series leaching from glass fiber in HEPA filter

    International Nuclear Information System (INIS)

    Gye-Nam Kim; Hye-Min Park; Wang-Kyu Choi; Jei-Kwon Moon

    2012-01-01

    For the disposal of a high efficiency particulate air (HEPA) glass filter into the environment, the glass fiber should be leached to lower its radioactive concentration to the clearance level. To derive an optimum method for the removal of the uranium series from a HEPA glass fiber, five methods were applied in this study: chemical leaching by a 4.0 M HNO₃-0.1 M Ce(IV) solution, chemical leaching by a 5 wt% NaOH solution, chemical leaching by a 0.5 M H₂O₂-1.0 M Na₂CO₃ solution, consecutive chemical leaching by a 4.0 M HNO₃ solution, and repeated chemical leaching by a 4.0 M HNO₃ solution. The residual radioactivity concentrations of ²³⁸U, ²³⁵U, ²²⁶Ra, and ²³⁴Th in the glass after leaching for 5 h by the 4.0 M HNO₃-0.1 M Ce(IV) solution were 2.1, 0.3, 1.1, and 1.2 Bq/g. The residual concentrations after leaching for 36 h by the 4.0 M HNO₃-0.1 M Ce(IV) solution were 76.9, 3.4, 63.7, and 71.9 Bq/g. The residual concentrations after leaching for 8 h by a 0.5 M H₂O₂-1.0 M Na₂CO₃ solution were 8.9, 0.0, 1.91, and 6.4 Bq/g. The residual concentrations after consecutive leaching for 8 h by the 4.0 M HNO₃ solution were 2.08, 0.12, 1.55, and 2.0 Bq/g. The residual concentrations after three repetitions of leaching for 3 h by the 4.0 M HNO₃ solution were 0.02, 0.02, 0.29, and 0.26 Bq/g. Meanwhile, the removal efficiencies of ²³⁸U, ²³⁵U, ²²⁶Ra, and ²³⁴Th from the waste solution after its precipitation-filtration treatment with NaOH and alum, for reuse of the 4.0 M HNO₃ waste solution, were 100, 100, 93.3, and 100%. (author)

  15. Penetration of HEPA filters by alpha recoil aerosols

    International Nuclear Information System (INIS)

    McDowell, W.J.; Seeley, F.G.; Ryan, M.T.

    1976-01-01

    Tests at Oak Ridge National Laboratory confirmed that alpha-emitting particulate matter does penetrate high-efficiency filter medium, identical to that used in HEPA filters, much more effectively than do non-radioactive or beta-gamma active aerosols. Filter retention efficiencies drastically lower than the 99.97 percent quoted for ordinary particulate matter have been observed with ²¹²Pb, ²⁵³Es, and ²³⁸Pu sources, indicating that the phenomenon is common to all of these and probably to all alpha-emitting materials of appropriate half-life. Results with controlled air-flow through filters in series are consistent with the picture of small particles dislodged from the "massive" surface of an alpha-active material, and then repeatedly dislodged from positions on the filter fibers, by the alpha recoils. The process shows only a small dependence on the physical form of the source material. Oxide dust, nitrate salt, and plated metal all seem to generate the recoil particles effectively. The amount penetrating a series of filters depends on the total amount of activity in the source material, its specific activity, and the length of time of air flow. Dependence on the air flow velocity is slight. It appears that this phenomenon has not been observed in previous experiments with alpha-active aerosols because the tests did not continue for a sufficiently long time. A theoretical model of the process has been developed, amenable to computer handling, that should allow calculation of the rate constants associated with the transfer through and release of radioactive material from a filter system by this process
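    The kind of rate-constant model the abstract alludes to can be sketched as a chain of first-order compartments: recoil events detach activity from the source and re-release a fraction of what each filter stage holds, feeding the next stage. All rate constants below are invented for illustration, not the study's fitted values.

```python
k_source = 1e-4   # fraction of source activity re-entrained per hour (assumed)
k_filter = 1e-3   # fraction of filter-held activity re-released per hour (assumed)
dt = 0.1          # time step, hours
hours = 1000.0    # long run time: the abstract notes penetration builds slowly

source = 1.0                 # normalized source activity
filters = [0.0, 0.0, 0.0]    # activity held on three filters in series

t = 0.0
while t < hours:
    detached = source * k_source * dt
    source -= detached
    carry = detached                      # activity arriving at stage 1
    for i in range(len(filters)):
        released = filters[i] * k_filter * dt
        filters[i] += carry - released    # collect incoming, lose recoils
        carry = released                  # what moves on to the next stage
    # 'carry' leaving the last stage is the penetrating activity
    t += dt

print(source, filters)
```

    Downstream stages load up only after long exposure, which is consistent with the abstract's observation that earlier, shorter experiments missed the effect.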

  16. Review of in-place HEPA filter testing at several DOE facilities

    Energy Technology Data Exchange (ETDEWEB)

    Mokler, B.V.; Scripsick, R.C. [Los Alamos National Laboratory, NM (United States)

    1995-02-01

    The Office of Nuclear Energy Self-Assessment recently sponsored reviews of HEPA filter systems at several DOE facilities. One aspect emphasized in these reviews was in-place filter testing practices. Although in-place testing was generally performed as required in facility specifications, we noted several areas in which improvements were possible. Examples of some common problems and approaches to their solution will be presented. Areas of suggested improvement include: (1) ensuring the validity of test results; (2) recognizing and quantifying the uncertainty in penetration measurements; (3) expanding the analysis and reporting of test results to provide more than pass/fail information; (4) addressing the special problems of multiple stage systems; and (5) increasing the technical support and training provided to in-place testing personnel. Ensuring the validity of test results, for example, requires more careful attention to the operation of test equipment, checking test measurements and system operating parameters for internal consistency, and more attention to documentation of system geometry and operation. Some issues will require additional study before the results can be incorporated into decision making on filter bank testing requirements and performance specifications.

  17. Review of in-place HEPA filter testing at several DOE facilities

    International Nuclear Information System (INIS)

    Mokler, B.V.; Scripsick, R.C.

    1995-01-01

    The Office of Nuclear Energy Self-Assessment recently sponsored reviews of HEPA filter systems at several DOE facilities. One aspect emphasized in these reviews was in-place filter testing practices. Although in-place testing was generally performed as required in facility specifications, we noted several areas in which improvements were possible. Examples of some common problems and approaches to their solution will be presented. Areas of suggested improvement include: (1) ensuring the validity of test results; (2) recognizing and quantifying the uncertainty in penetration measurements; (3) expanding the analysis and reporting of test results to provide more than pass/fail information; (4) addressing the special problems of multiple stage systems; and (5) increasing the technical support and training provided to in-place testing personnel. Ensuring the validity of test results, for example, requires more careful attention to the operation of test equipment, checking test measurements and system operating parameters for internal consistency, and more attention to documentation of system geometry and operation. Some issues will require additional study before the results can be incorporated into decision making on filter bank testing requirements and performance specifications

  18. Evaluation of multistage filtration to reduce sand filter exhaust activity

    International Nuclear Information System (INIS)

    Zippler, D.B.

    1975-01-01

    Air from the Savannah River Plant Fuel Reprocessing facilities is filtered through deep bed sand filters consisting of 8-1/2 feet of gravel and sand. These filters have performed satisfactorily for the past 18 years in maintaining radioactive release levels to a minimum. The apparent filter efficiency has been determined for many years by measurements of the quantity of radioactivity in the air stream before and after the filter. Such tests have indicated efficiencies of 99.9 percent or better. Even with sand filter efficiency approaching a single stage HEPA filter, new emphasis on further reduction in release of plutonium activity to the environment prompted a study to determine what value backup HEPA filtration could provide. To evaluate the specific effect additional HEPA filtration would have on the removal of Pu from the existing sand filter exhaust stream, a test was conducted by passing a sidestream of sand-filtered air through a standard 24 x 24 x 11-1/2 in. HEPA filter. Isokinetic air samples were withdrawn upstream and downstream of the HEPA filter and counted for alpha activity. Efficiency calculations indicated that backup HEPA filtration could be expected to provide an additional 99 percent removal of the Pu activity from the present sand-filter exhaust. (U.S.)
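The multistage arithmetic in the study above follows from the fact that stage penetrations multiply: a 99.9%-efficient sand filter followed by a 99%-efficient backup HEPA passes 0.001 x 0.01 = 1e-5 of the activity. A short sketch (function name is ours, for illustration):

```python
# Sketch: combined efficiency of filtration stages in series.
# Penetrations multiply, so overall efficiency = 1 - product(1 - eff_i).

def overall_efficiency(*stage_efficiencies):
    penetration = 1.0
    for eff in stage_efficiencies:
        penetration *= (1.0 - eff)
    return 1.0 - penetration

# Sand filter (99.9%) plus backup HEPA (99%), as in the study above:
print(overall_efficiency(0.999, 0.99))
```

The combined stage is 99.999% efficient, i.e. the backup HEPA removes an additional 99% of what the sand filter passes.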

  19. Tailored liquid chromatography-mass spectrometry analysis improves the coverage of the intracellular metabolome of HepaRG cells.

    Science.gov (United States)

    Cuykx, Matthias; Negreira, Noelia; Beirnaert, Charlie; Van den Eede, Nele; Rodrigues, Robim; Vanhaecke, Tamara; Laukens, Kris; Covaci, Adrian

    2017-03-03

    Metabolomics protocols are often combined with Liquid Chromatography-Mass Spectrometry (LC-MS), mostly using reversed phase chromatography coupled to accurate mass spectrometry, e.g. quadrupole time-of-flight (QTOF) mass spectrometers, to measure as many metabolites as possible. In this study, we optimised the LC-MS separation of cell extracts after fractionation into polar and non-polar fractions. Both phases were analysed separately in a tailored approach in four different runs (two for the non-polar and two for the polar fraction), each of them specifically adapted to improve the separation of the metabolites present in the extract. This approach improves the coverage of a broad range of the metabolome of the HepaRG cells and the separation of intra-class metabolites. The non-polar fraction was analysed using a C18 column with end-capping; mobile phase compositions were specifically adapted for each ionisation mode using different co-solvents and buffers. The polar extracts were analysed with a mixed-mode Hydrophilic Interaction Liquid Chromatography (HILIC) system. Acidic metabolites from glycolysis and the Krebs cycle, together with phosphorylated compounds, were best detected with a method using ion pairing (IP) with tributylamine and separation on a phenyl-hexyl column. Accurate mass detection was performed with the QTOF in MS-mode only, using an extended dynamic range to improve the quality of the dataset. Parameters with the greatest impact on detection were the balance between mass accuracy and linear range, the fragmentor voltage, the capillary voltage, the nozzle voltage, and the nebuliser pressure. By using a tailored approach for the intracellular HepaRG metabolome, consisting of three different LC techniques, over 2200 metabolites can be measured with high precision and an acceptable linear range. The developed method is suited for qualitative untargeted LC-MS metabolomics studies.

  20. Preliminary field evaluation of high efficiency steel filters

    Energy Technology Data Exchange (ETDEWEB)

    Bergman, W.; Larsen, G.; Lopez, R. [Lawrence Livermore National Laboratory, CA (United States)] [and others]

    1995-02-01

    We have conducted an evaluation of two high efficiency steel filters in the exhaust of a uranium oxide grit blaster at the Y-12 Plant in Oak Ridge, Tennessee. The filters were installed in a specially designed filter housing with a reverse air-pulse cleaning system for automatically cleaning the filters in place. Previous tests conducted on the same filters and housing at LLNL under controlled conditions using Arizona road dust showed good cleanability with reverse air pulses. Two high efficiency steel filters, each containing 64 pleated cartridge elements housed in the standard 2 x 2 x 1 ft HEPA frame, were evaluated in the filter test housing using a 1,000 cfm slip stream containing a high concentration of depleted uranium oxide dust. One filter had pleated cartridges manufactured to our specifications by the Pall Corporation and the other by Memtec Corporation. Test results showed both filters had a rapid increase in pressure drop with time, and reverse air pulses could not decrease the pressure drop. We suspected moisture accumulation in the filters was the problem, since there were heavy rains during the evaluations, and the pressure drop of the Memtec filter decreased dramatically after passing clean, dry air through the filter and after the filter sat idle for one week. Subsequent laboratory tests on a single filter cartridge confirmed that water accumulation in the filter was responsible for the increase in filter pressure drop and the inability to lower the pressure drop by reverse air pulses. No effort was made to identify the source of the water accumulation and correct the problem because the available funds were exhausted.

  1. Experimental relationship between the specific resistance of a HEPA [High Efficiency Particulate Air] filter and particle diameters of different aerosol materials

    International Nuclear Information System (INIS)

    Novick, V.J.; Monson, P.R.; Ellison, P.E.

    1990-01-01

    The increase in pressure drop across a HEPA filter has been measured as a function of particle mass loading using two materials with different particle morphologies. The HEPA filter media chosen is identical to the filter media used in the Airborne Activity Confinement System (AACS) on the Savannah River Reactors. The velocity through the test filter media was the same as the velocity through the AACS media under normal operating flow conditions. Sodium chloride challenge particles were generated using an atomizer, resulting in regularly shaped crystalline forms. Ammonium chloride aerosols were formed from the gas-phase reaction of HCl and NH4OH vapors, resulting in irregular agglomerates. In both cases, the generation conditions were adjusted to provide several different particle size distributions. For each particle size distribution, the mass of material loaded per unit area of filter per unit pressure drop for a given filtration velocity (1/specific resistance) was measured. Theoretical considerations in the most widely accepted filter cake model predict that the mass per unit area per unit pressure drop should increase with the particle density times the particle diameter squared. However, these test results indicate that the increase in the mass loaded per unit area per unit pressure drop, for both materials, can be better described by plotting the specific resistance divided by the particle density as an inverse function of the particle density times the particle diameter squared. 9 refs., 7 figs.
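The filter-cake model discussed above can be sketched as a clean-filter pressure drop plus a cake term whose specific resistance scales inversely with particle density times diameter squared. The functions and all numeric constants below are illustrative assumptions, not values from the paper:

```python
# Minimal sketch of a filter-cake pressure-drop model with a Kozeny-type
# specific resistance K2 ~ 1 / (rho_p * d_p**2). The proportionality
# constant k and the example numbers are placeholders, not measured values.

def specific_resistance(rho_p, d_p, k=1.0):
    """Cake specific resistance; scales inversely with rho_p * d_p**2."""
    return k / (rho_p * d_p ** 2)

def pressure_drop(dp0, rho_p, d_p, velocity, areal_mass, k=1.0):
    """Clean-filter drop plus cake term: dP = dP0 + K2 * U * W."""
    return dp0 + specific_resistance(rho_p, d_p, k) * velocity * areal_mass

# The model predicts that doubling the particle diameter quarters the
# cake contribution to the pressure drop (density 2165 kg/m3 ~ NaCl):
small = pressure_drop(100.0, 2165.0, 0.2e-6, 0.025, 0.01) - 100.0
large = pressure_drop(100.0, 2165.0, 0.4e-6, 0.025, 0.01) - 100.0
print(small / large)  # -> 4.0
```

This d-squared scaling is exactly the prediction the measurements above were tested against.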

  2. Aging assessment of nuclear air-treatment system HEPA filters and adsorbers

    International Nuclear Information System (INIS)

    Winegardner, W.K.

    1993-08-01

    A Phase I aging assessment of high-efficiency particulate air (HEPA) filters and activated carbon gas adsorption units (adsorbers) was performed by the Pacific Northwest Laboratory (PNL) as part of the US Nuclear Regulatory Commission's (NRC) Nuclear Plant Aging Research (NPAR) Program. Information concerning design features; failure experience; aging mechanisms, effects, and stressors; and surveillance and monitoring methods for these key air-treatment system components was compiled. Over 1100 failures, or 12 percent of the filter installations, were reported as part of a Department of Energy (DOE) survey. Investigators from other national laboratories have suggested that aging effects could have contributed to over 80 percent of these failures. Tensile strength tests on aged filter media specimens indicated a decrease in strength. Filter aging mechanisms range from those associated with particle loading to reactions that alter properties of sealants and gaskets. Low radioiodine decontamination factors associated with the Three Mile Island (TMI) accident were attributed to the premature aging of the carbon in the adsorbers. Mechanisms that can lead to impaired adsorber performance include oxidation as well as the loss of potentially available active sites as a result of the adsorption of pollutants. Stressors include heat, moisture, radiation, and airborne particles and contaminants

  3. Survey of HEPA filter applications and experience at Department of Energy sites

    International Nuclear Information System (INIS)

    Carbaugh, E.H.

    1981-11-01

    Results indicated that approximately 58% of the filters surveyed were changed out in the 1977 to 1979 study period and some 18% of all filters were changed out more than once. Most changeouts (60%) were due to the existence of a high pressure drop across the filter, indicative of filter plugging. The next most recurrent reasons for changeout and their percentage changeouts were leak test failure (15%) and preventive maintenance service life limit (12%). An average filter service life was calculated to be 3.0 years with a 2.0-year standard deviation. The labor required for filter changeout was calculated as 1.5 manhours per filter changed. Filter failures occurred with approximately 12% of all installed filters. Most failures (60%) occurred for unknown reasons and handling or installation damage accounted for an additional 20% of all failures. Media ruptures, filter frame failures and seal failures occurred with approximately equal frequency at 5 to 6% each. Subjective responses to the questionnaire indicate problems are: need for improved acid and moisture resistant filters; filters more readily disposable as radioactive waste; improved personnel training in filter handling and installation; and need for pretreatment of air prior to HEPA filtration

  4. Selecting Cells for Bioartificial Liver Devices and the Importance of a 3D Culture Environment: A Functional Comparison between the HepaRG and C3A Cell Lines

    NARCIS (Netherlands)

    van Wenum, Martien; Adam, Aziza A. A.; Hakvoort, Theodorus B. M.; Hendriks, Erik J.; Shevchenko, Valery; van Gulik, Thomas M.; Chamuleau, Robert A. F. M.; Hoekstra, Ruurdtje

    2016-01-01

    Recently, the first clinical trials on Bioartificial Livers (BALs) loaded with a proliferative human hepatocyte cell source have started. There are two cell lines that are currently in an advanced state of BAL development; HepaRG and HepG2/C3A. In this study we aimed to compare both cell lines on

  5. Improvements of High Current/ Low Pressure Liquid And Gas Targets For Cyclotron Produced Radioisotopes

    Energy Technology Data Exchange (ETDEWEB)

    Hur, M. G. [Korea Atomic Energy Research Institute, Jeongup (Korea, Republic of); Hong, B. H. [Korea Institute of Radiological and Medical Sciences, Seoul (Korea, Republic of); Chai, J. S. [SungKyunKwan University, Seoul (Korea, Republic of)

    2009-07-01

    The development of a C-11 cylindrical target with cooling fins for 13 MeV and 30 MeV proton beams and of a pleated double-foil O-18 water target was carried out. The new target systems were tested at two pilot cyclotron centres in Korea. The pleated foil has several advantages over a flat foil: with the same beam bombardment, the cooled pleated foil gave a higher yield of F-18 production. CFD and FEM studies were used in the design of the pleated-foil and flat-foil structures. (author)

  6. A Transcriptional Regulatory Network Containing Nuclear Receptors and Long Noncoding RNAs Controls Basal and Drug-Induced Expression of Cytochrome P450s in HepaRG Cells.

    Science.gov (United States)

    Chen, Liming; Bao, Yifan; Piekos, Stephanie C; Zhu, Kexin; Zhang, Lirong; Zhong, Xiao-Bo

    2018-07-01

    Cytochrome P450 (P450) enzymes are responsible for metabolizing drugs. Expression of P450s can directly affect drug metabolism, resulting in various outcomes in therapeutic efficacy and adverse effects. Several nuclear receptors are transcription factors that can regulate expression of P450s at both basal and drug-induced levels. Some long noncoding RNAs (lncRNAs) near a transcription factor are found to participate in the regulatory functions of the transcription factors. The aim of this study is to determine whether there is a transcriptional regulatory network containing nuclear receptors and lncRNAs controlling both basal and drug-induced expression of P450s in HepaRG cells. Small interfering RNAs or small hairpin RNAs were applied to knock down four nuclear receptors [hepatocyte nuclear factor 1α (HNF1α), hepatocyte nuclear factor 4α (HNF4α), pregnane X receptor (PXR), and constitutive androstane receptor (CAR)] as well as two lncRNAs [HNF1α antisense RNA 1 (HNF1α-AS1) and HNF4α antisense RNA 1 (HNF4α-AS1)] in HepaRG cells with or without treatment of phenobarbital or rifampicin. Expression of eight P450 enzymes was examined at both basal and drug-induced levels. CAR and PXR mainly regulated expression of specific P450s. HNF1α and HNF4α affected expression of a wide range of P450s as well as other transcription factors. HNF1α and HNF4α controlled the expression of their neighboring lncRNAs, HNF1α-AS1 and HNF4α-AS1, respectively. HNF1α-AS1 and HNF4α-AS1 were also involved in the regulation of P450s and transcription factors in diverse manners. Altogether, our study concludes that a transcriptional regulatory network containing the nuclear receptors and lncRNAs controls both basal and drug-induced expression of P450s in HepaRG cells.

  7. Pilot-scale tests of HEME and HEPA dissolution process

    Energy Technology Data Exchange (ETDEWEB)

    Qureshi, Z.H.; Strege, D.K.

    1994-06-01

    A series of pilot-scale demonstration tests for the dissolution of High Efficiency Mist Eliminators (HEMEs) and High Efficiency Particulate Air filters (HEPAs) were performed on a 1/5th linear scale. These fiberglass filters are to be used in the Defense Waste Processing Facility (DWPF) to decontaminate the effluents from the off-gases generated during the feed preparation process and vitrification. When removed, these filters will be dissolved in the Decontamination Waste Treatment Tank (DWTT) using 5 wt% NaOH solution. The contaminated fiberglass is converted to an aqueous stream which will be transferred to the waste tanks. The filter metal structure will be rinsed with process water before its disposal as low-level solid waste. The pilot-scale study reported here successfully demonstrated a simple one-step process using 5 wt% NaOH solution. The proposed process requires the installation of a new water spray ring with 30 nozzles. In addition to the reduced waste generated, the total process time is reduced to only 48 hours (a 66% saving in time). The pilot-scale tests clearly demonstrated that the dissolution process for HEMEs has two stages: chemical digestion of the filter and mechanical erosion of the digested filter. The digestion is achieved by a boiling 5 wt% caustic solution, whereas the mechanical breakdown of the digested filter is successfully achieved by spraying process water on the digested filter. An alternate method of breaking down the digested filter by increased air sparging of the solution was found to be marginally successful at best. The pilot-scale tests also demonstrated that the products of dissolution are easily pumpable by a centrifugal pump.

  8. Pilot-scale tests of HEME and HEPA dissolution process

    International Nuclear Information System (INIS)

    Qureshi, Z.H.; Strege, D.K.

    1994-06-01

    A series of pilot-scale demonstration tests for the dissolution of High Efficiency Mist Eliminators (HEMEs) and High Efficiency Particulate Air filters (HEPAs) were performed on a 1/5th linear scale. These fiberglass filters are to be used in the Defense Waste Processing Facility (DWPF) to decontaminate the effluents from the off-gases generated during the feed preparation process and vitrification. When removed, these filters will be dissolved in the Decontamination Waste Treatment Tank (DWTT) using 5 wt% NaOH solution. The contaminated fiberglass is converted to an aqueous stream which will be transferred to the waste tanks. The filter metal structure will be rinsed with process water before its disposal as low-level solid waste. The pilot-scale study reported here successfully demonstrated a simple one-step process using 5 wt% NaOH solution. The proposed process requires the installation of a new water spray ring with 30 nozzles. In addition to the reduced waste generated, the total process time is reduced to only 48 hours (a 66% saving in time). The pilot-scale tests clearly demonstrated that the dissolution process for HEMEs has two stages: chemical digestion of the filter and mechanical erosion of the digested filter. The digestion is achieved by a boiling 5 wt% caustic solution, whereas the mechanical breakdown of the digested filter is successfully achieved by spraying process water on the digested filter. An alternate method of breaking down the digested filter by increased air sparging of the solution was found to be marginally successful at best. The pilot-scale tests also demonstrated that the products of dissolution are easily pumpable by a centrifugal pump

  9. Activation of the sonic hedgehog signaling pathway occurs in the CD133 positive cells of mouse liver cancer Hepa 1–6 cells

    Directory of Open Access Journals (Sweden)

    Jeng KS

    2013-08-01

    Kuo-Shyang Jeng,1 I-Shyan Sheen,2 Wen-Juei Jeng,2 Ming-Che Yu,3 Hsin-I Hsiau,3 Fang-Yu Chang,3 Hsin-Hua Tsai3 1Department of Surgery, Far Eastern Memorial Hospital, Taipei; 2Department of Hepato-Gastroenterology, Chang Gung Memorial Hospital, Linkou Medical Center, Chang Gung University; 3Department of Medical Research, Far Eastern Memorial Hospital, Taipei, Taiwan, Republic of China. Background: The important role of cancer stem cells in carcinogenesis has been emphasized in research. CD133+ cells have been mentioned as liver cancer stem cells in hepatocellular carcinoma (HCC). Some researchers have proposed that the sonic hedgehog (Shh) pathway contributes to hepatocarcinogenesis and that the pathway activation occurs mainly in cancer stem cells. We investigated whether the activation of the Shh pathway occurs in CD133+ cells from liver cancer. Materials and methods: We used magnetic sorting to isolate CD133+ cells from mouse liver cancer Hepa 1–6 cells. To examine the clonogenicity, cell culture and soft agar colony formation assays were performed on both CD133+ and CD133- cells. To study the activation of the Shh pathway, we examined the mRNA expression of Shh, patched homolog 1 (Ptch-1), glioma-associated oncogene homolog 1 (Gli-1), and smoothened homolog (Smoh) by real-time polymerase chain reaction in both CD133+ and CD133- cells. Results: The number (mean ± standard deviation) of colonies of CD133+ cells and CD133- cells was 1,031.0 ± 104.7 and 119.7 ± 17.6, respectively. This difference was statistically significant (P < 0.001). Their clonogenicity was 13.7% ± 1.4% and 1.6% ± 0.2%, respectively, also a statistically significant difference (P < 0.001). CD133+ and CD133- cells were found to have statistically significant differences in Shh mRNA and Smoh mRNA (P = 0.005 and P = 0.043, respectively). Conclusion: CD133+ Hepa 1–6 cells have a significantly higher colony proliferation and clonogenicity. The Shh pathway is activated in these

  10. The histone deacetylase inhibiting drug Entinostat induces lipid accumulation in differentiated HepaRG cells

    Science.gov (United States)

    Nunn, Abigail D. G.; Scopigno, Tullio; Pediconi, Natalia; Levrero, Massimo; Hagman, Henning; Kiskis, Juris; Enejder, Annika

    2016-06-01

    Dietary overload of toxic, free metabolic intermediates leads to disrupted insulin signalling and fatty liver disease. However, it was recently reported that this pathway might not be universal: depletion of histone deacetylase (HDAC) enhances insulin sensitivity alongside hepatic lipid accumulation in mice, but the mechanistic role of microscopic lipid structure in this effect remains unclear. Here we study the effect of Entinostat, a synthetic HDAC inhibitor undergoing clinical trials, on hepatic lipid metabolism in the paradigmatic HepaRG liver cell line. Specifically, we statistically quantify lipid droplet morphology at single cell level utilizing label-free microscopy, coherent anti-Stokes Raman scattering, supported by gene expression. We observe Entinostat efficiently rerouting carbohydrates and free-fatty acids into lipid droplets, upregulating lipid coat protein gene Plin4, and relocating droplets nearer to the nucleus. Our results demonstrate the power of Entinostat to promote lipid synthesis and storage, allowing reduced systemic sugar levels and sequestration of toxic metabolites within protected protein-coated droplets, suggesting a potential therapeutic strategy for diseases such as diabetes and metabolic syndrome.

  11. Walking in the high-rise city: a Health Enhancement and Pedometer-determined Ambulatory (HEPA) program in Hong Kong

    Directory of Open Access Journals (Sweden)

    Leung AYM

    2014-08-01

    Angela YM Leung,1,2 Mike KT Cheung,3 Michael A Tse,4 Wai Chuen Shum,5 BJ Lancaster,1,6 Cindy LK Lam7 1School of Nursing, 2Research Centre on Heart, Brain, Hormone and Healthy Aging, Li Ka Shing Faculty of Medicine, University of Hong Kong; 3Centre on Research and Advocacy, Hong Kong Society for Rehabilitation; 4Institute of Human Performance, University of Hong Kong; 5Sheng Kung Hui Holy Carpenter Church Social Services, Hong Kong Special Administrative Region, People’s Republic of China; 6School of Nursing, Vanderbilt University, Nashville, TN, USA; 7Department of Family Medicine and Primary Care, University of Hong Kong, Hong Kong Special Administrative Region, People’s Republic of China. Abstract: Due to the lack of good infrastructure in the public estates, many older adults in urban areas are sedentary. The Health Enhancement and Pedometer-Determined Ambulatory (HEPA) program was developed to assist older adults with diabetes and/or hypertension to acquire walking exercise habits and to build social support while engaged in regular physical activity. This study aimed to describe the HEPA program and to report changes in participants’ walking capacity and body strength after 10-week walking sessions. A pre- and postintervention design was used. Pedometers were used to measure the number of steps taken per day before and after the 10-week intervention. Upper and lower body strength, lower body flexibility, and quality of life were assessed. A total of 205 older adults completed the program and all health assessments. After the 10-week intervention, the average number of steps per day increased by 36%, from 6,591 to 8,934. Lower body strength, upper body strength, and aerobic fitness increased significantly after 10 weeks, along with improvement in the 12-item Short Form Health Survey (SF-12) physical and mental health component summary scores. A social support network was built in the neighborhood, and the local environment was
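A quick arithmetic check of the reported change in daily steps (the before/after values are from the abstract above):

```python
# Verify the reported ~36% increase in average daily steps
# (6,591 -> 8,934 steps/day).

def percent_increase(before, after):
    """Relative change expressed as a percentage of the baseline."""
    return (after - before) / before * 100.0

print(round(percent_increase(6591, 8934)))  # -> 36
```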

  12. Differences in TCDD-elicited gene expression profiles in human HepG2, mouse Hepa1c1c7 and rat H4IIE hepatoma cells

    Directory of Open Access Journals (Sweden)

    Burgoon Lyle D

    2011-04-01

    Background: 2,3,7,8-Tetrachlorodibenzo-p-dioxin (TCDD) is an environmental contaminant that elicits a broad spectrum of toxic effects in a species-specific manner. Current risk assessment practices routinely extrapolate results from in vivo and in vitro rodent models to assess human risk. In order to further investigate the species-specific responses elicited by TCDD, temporal gene expression responses in human HepG2, mouse Hepa1c1c7 and rat H4IIE cells were compared. Results: Microarray analysis identified a core set of conserved gene expression responses across species consistent with the role of AhR in mediating adaptive metabolic responses. However, significant species-specific as well as species-divergent responses were identified. Computational analysis of the regulatory regions of species-specific and -divergent responses suggests that dioxin response elements (DREs) are involved. These results are consistent with in vivo rat vs. mouse species-specific differential gene expression, and more comprehensive comparative DRE searches. Conclusions: Comparative analysis of human HepG2, mouse Hepa1c1c7 and rat H4IIE TCDD-elicited gene expression responses is consistent with in vivo rat-mouse comparative gene expression studies, and more comprehensive comparative DRE searches, suggesting that AhR-mediated gene expression is species-specific.

  13. The human hepatocyte cell lines IHH and HepaRG: models to study glucose, lipid and lipoprotein metabolism.

    Science.gov (United States)

    Samanez, Carolina Huaman; Caron, Sandrine; Briand, Olivier; Dehondt, Hélène; Duplan, Isabelle; Kuipers, Folkert; Hennuyer, Nathalie; Clavey, Véronique; Staels, Bart

    2012-07-01

    Metabolic diseases reach epidemic proportions. A better knowledge of the associated alterations in the metabolic pathways in the liver is necessary. These studies need in vitro human cell models. Several human hepatoma models are used, but the response of many metabolic pathways to physiological stimuli is often lost. Here, we characterize two human hepatocyte cell lines, IHH and HepaRG, by analysing the expression and regulation of genes involved in glucose and lipid metabolism. Our results show that the glycolysis pathway is activated by glucose and insulin in both lines. Gluconeogenesis gene expression is induced by forskolin in IHH cells and inhibited by insulin in both cell lines. The lipogenic pathway is regulated by insulin in IHH cells. Finally, both cell lines secrete apolipoprotein B-containing lipoproteins, an effect promoted by increasing glucose concentrations. These two human cell lines are thus interesting models to study the regulation of glucose and lipid metabolism.

  14. The case for improved HEPA-filter mechanical performance standards revisited

    Energy Technology Data Exchange (ETDEWEB)

    Ricketts, C.I.; Smith, P.R. [New Mexico State Univ., Las Cruces, NM (United States)

    1997-08-01

    Under benign operating conditions, High Efficiency Particulate Air (HEPA) filter units serve as reliable and relatively economical components in the air cleaning systems of nuclear facilities worldwide. Despite more than four decades of filter-unit evaluation and improvement, however, the material strength characteristics of the glass fiber filter medium continue to ultimately limit filter functional reliability. In worst-case scenarios involving fire suppression, loss-of-coolant accidents (LOCAs), or exposure to shock waves or tornado-induced flows, rupture of the filter medium of units meeting current qualification standards cannot be entirely ruled out. Even under so-called normal conditions of operation, instances of filter failure reported in the literature leave open questions of filter-unit reliability. Though development of filter units with improved burst strength has been pursued outside the United States, support for such efforts in this country has been comparatively minimal. This is despite user requests for filters with greater moisture resistance, for example, and despite the fact that conventional filter designs result in not only the least robust component to be found in a nuclear air cleaning system, but also the one most sensitive to the adverse effects of conditions deviating from those of normal operation. Filter qualification-test specifications of current codes, standards, and regulatory guidelines in the United States are based primarily upon research performed in a 30-year period beginning in the 1950s. They do not reflect the benefits of the more significant developments and understanding of filter failure modes and mechanisms achieved since that time. One overseas design, based on such knowledge, has proven reliability under adverse operating conditions involving combined and serial challenges. Its widespread use, however, has faltered on a lack of consensus in upgrading filter performance standards. 34 refs., 2 figs., 3 tabs.

  15. Prospective evaluation of FibroTest®, FibroMeter®, and HepaScore® for staging liver fibrosis in chronic hepatitis B: comparison with hepatitis C.

    Science.gov (United States)

    Leroy, Vincent; Sturm, Nathalie; Faure, Patrice; Trocme, Candice; Marlu, Alice; Hilleret, Marie-Noëlle; Morel, Françoise; Zarski, Jean-Pierre

    2014-07-01

    Fibrosis blood tests have been validated in chronic hepatitis C. Their diagnostic accuracy is less documented in hepatitis B. The aim of this study was to describe the diagnostic performance of FibroTest®, FibroMeter®, and HepaScore® for liver fibrosis in hepatitis B compared to hepatitis C. 510 patients mono-infected with hepatitis B or C and matched on fibrosis stage were included. Blood tests were performed the day of the liver biopsy. Histological lesions were staged according to METAVIR. Fibrosis stages were distributed as follows: F0 n=76, F1 n=192, F2 n=132, F3 n=54, F4 n=56. Overall diagnostic performance of the blood tests was similar between hepatitis B and C, with AUROCs ranging from 0.75 to 0.84 for significant fibrosis, 0.82 to 0.85 for extensive fibrosis, and 0.84 to 0.87 for cirrhosis. Optimal cut-offs were consistently lower in hepatitis B compared to hepatitis C, especially for the diagnosis of extensive fibrosis and cirrhosis, with decreased sensitivity and negative predictive values. More hepatitis B than C patients with F ⩾3 were underestimated (FibroTest®: 47% vs. 26%, FibroMeter®: 24% vs. 6%, HepaScore®: 41% vs. 24%), indicating a greater risk of fibrosis underestimation. Overall the diagnostic performance of the blood tests is similar in hepatitis B and C. The risk of underestimating significant fibrosis and cirrhosis is however greater in hepatitis B and cannot be entirely corrected by the use of more stringent cut-offs.

  16. DeepPy: Pythonic deep learning

    DEFF Research Database (Denmark)

    Larsen, Anders Boesen Lindbo

    This technical report introduces DeepPy – a deep learning framework built on top of NumPy with GPU acceleration. DeepPy bridges the gap between high-performance neural networks and the ease of development from Python/NumPy. Users with a background in scientific computing in Python will quickly be able to understand and change the DeepPy codebase as it is mainly implemented using high-level NumPy primitives. Moreover, DeepPy supports complex network architectures by letting the user compose mathematical expressions as directed graphs. The latest version is available at http...
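The idea of composing mathematical expressions as a directed graph can be illustrated with a small NumPy sketch. This is a hypothetical toy, not DeepPy's actual API; the `Node` class and its `eval` method are our own illustration of the concept:

```python
# Toy expression graph over NumPy operations (NOT DeepPy's API):
# each Node holds an operation and its parent nodes; evaluation walks
# the graph, pulling leaf values from a caller-supplied feed dict.

import numpy as np

class Node:
    def __init__(self, fn, *parents):
        self.fn, self.parents = fn, parents

    def eval(self, feed):
        if self in feed:                  # leaf/placeholder: value supplied by caller
            return feed[self]
        return self.fn(*(p.eval(feed) for p in self.parents))

x = Node(None)                            # input placeholder
w = Node(None)                            # weight placeholder
affine = Node(np.dot, x, w)               # x @ w
out = Node(np.tanh, affine)               # elementwise nonlinearity

result = out.eval({x: np.ones((2, 3)), w: np.full((3, 1), 0.1)})
print(result.shape)  # -> (2, 1)
```

Frameworks built this way let users define arbitrary architectures while keeping the underlying operations as plain, readable NumPy calls.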

  17. Resistance of HEPA filter separator materials to humid air--hydrogen fluoride--fluorine environments

    International Nuclear Information System (INIS)

    Weber, C.W.; Petit, G.S.; Woodfin, S.B.

    1977-01-01

    The U.S. Energy Research and Development Administration (ERDA) is interested in the development of a high-efficiency particulate air (HEPA) filter that is resistant to such corrosive reagents as hydrogen fluoride (HF) and fluorine (F2) in air environments of normal relative humidity (about 50% RH). Several types of separator materials are used in the fabrication of commercial filters. The basic types of separator materials are asbestos, Kraft paper, plastic, and aluminum. At the request of the ERDA Division of Operational Safety, the different types of separator materials have been evaluated for their resistance to corrosive attack by HF and F2. The separator materials were dynamically tested in the 4-stage multiunit tester located in the Oak Ridge Gaseous Diffusion Plant laboratories. This is the system previously used in the evaluation of the Herty Foundation filter paper samples. Concurrent with the testing of filter media for its resistance to HF and F2, another component of the completed filter, the separator, was tested. All samples were exposed to a constant air flow (50% RH) of 32 liters/min, at 100°F, containing 900 ppm HF and 300 ppm F2. Exposure periods varied from 2 to 1000 h; however, the longer exposures were made only on the stronger candidates. Test results show the plastic and aluminum separator materials to be superior to the other types in resistance to HF and F2. The asbestos separators disintegrated after a relatively short exposure time; the Kraft paper types were the next weakest. The Clear Plastic S was the best performer of the plastics tested

  18. Whole house particle removal and clean air delivery rates for in-duct and portable ventilation systems.

    Science.gov (United States)

    Macintosh, David L; Myatt, Theodore A; Ludwig, Jerry F; Baker, Brian J; Suh, Helen H; Spengler, John D

    2008-11-01

    A novel method for determining whole house particle removal and clean air delivery rates attributable to central and portable ventilation/air cleaning systems is described. The method is used to characterize total and air-cleaner-specific particle removal rates during operation of four in-duct air cleaners and two portable air-cleaning devices in a fully instrumented test home. Operation of in-duct and portable air cleaners typically increased particle removal rates over the baseline rates determined in the absence of operating a central fan or an indoor air cleaner. Removal rates of 0.3- to 0.5-microm particles ranged from 1.5 hr(-1) during operation of an in-duct, 5-in. pleated media filter to 7.2 hr(-1) for an in-duct electrostatic air cleaner in comparison to a baseline rate of 0 hr(-1) when the air handler was operating without a filter. Removal rates for total particulate matter less than 2.5 microm in aerodynamic diameter (PM2.5) mass concentrations were 0.5 hr(-1) under baseline conditions, 0.5 hr(-1) during operation of three portable ionic air cleaners, 1 hr(-1) for an in-duct 1-in. media filter, 2.4 hr(-1) for a single high-efficiency particle arrestance (HEPA) portable air cleaner, 4.6 hr(-1) for an in-duct 5-in. media filter, 4.7 hr(-1) during operation of five portable HEPA filters, 6.1 hr(-1) for a conventional in-duct electronic air cleaner, and 7.5 hr(-1) for a high efficiency in-duct electrostatic air cleaner. Corresponding whole house clean air delivery rates for PM2.5 attributable to the air cleaner independent of losses within the central ventilation system ranged from 2 m3/min for the conventional media filter to 32 m3/min for the high efficiency in-duct electrostatic device. Except for the portable ionic air cleaner, the devices considered here increased particle removal indoors over baseline deposition rates.
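    The rate-to-CADR conversion behind figures like these can be sketched in a few lines: the clean air delivery rate attributable to a device equals its removal rate, minus the baseline rate, times the house volume. The house volume and function name below are illustrative assumptions (the abstract does not state the test home's volume), so the printed values are only ballpark figures.

```python
# Clean air delivery rate (CADR) from first-order particle removal rates.
# Sketch of the standard relationship CADR = (k_device - k_baseline) * V,
# assuming a hypothetical test-home volume V.

V_HOUSE_M3 = 270.0  # assumed interior volume, m^3 (hypothetical)

def cadr_m3_per_min(k_device_per_hr, k_baseline_per_hr, volume_m3=V_HOUSE_M3):
    """CADR in m^3/min attributable to the air cleaner alone."""
    return (k_device_per_hr - k_baseline_per_hr) * volume_m3 / 60.0

# PM2.5 removal rates reported in the abstract (hr^-1), baseline 0.5 hr^-1:
for name, k in [("in-duct 1-in. media filter", 1.0),
                ("portable HEPA air cleaner", 2.4),
                ("high-efficiency in-duct electrostatic", 7.5)]:
    print(f"{name}: CADR ~ {cadr_m3_per_min(k, 0.5):.1f} m^3/min")
```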

  19. A novel genotoxin-specific qPCR array based on the metabolically competent human HepaRG™ cell line as a rapid and reliable tool for improved in vitro hazard assessment.

    Science.gov (United States)

    Ates, Gamze; Mertens, Birgit; Heymans, Anja; Verschaeve, Luc; Milushev, Dimiter; Vanparys, Philippe; Roosens, Nancy H C; De Keersmaecker, Sigrid C J; Rogiers, Vera; Doktorova, Tatyana Y

    2018-04-01

    Although the value of the regulatory accepted batteries for in vitro genotoxicity testing is recognized, they result in a high number of false positives. This has a major impact on society and industries developing novel compounds for pharmaceutical, chemical, and consumer products, as afflicted compounds have to be (prematurely) abandoned or further tested on animals. Using the metabolically competent human HepaRG™ cell line and toxicogenomics approaches, we have developed an upgraded, innovative, and proprietary gene classifier. This gene classifier is based on transcriptomic changes induced by 12 genotoxic and 12 non-genotoxic reference compounds tested at sub-cytotoxic concentrations, i.e., IC10 concentrations as determined by the 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide (MTT) assay. The resulting gene classifier was translated into an easy-to-handle qPCR array that, as shown by pathway analysis, covers several different cellular processes related to genotoxicity. To further assess the predictivity of the tool, a set of 5 known positive and 5 known negative test compounds for genotoxicity was evaluated. In addition, 2 compounds with debatable genotoxicity data were tested to explore how the qPCR array would classify these. With an accuracy of 100%, when equivocal results were considered positive, the results showed that combining HepaRG™ cells with a genotoxin-specific qPCR array can improve (geno)toxicological hazard assessment. In addition, the developed qPCR array was able to provide additional information on compounds for which so far debatable genotoxicity data are available. The results indicate that the new in vitro tool can improve human safety assessment of chemicals in general by basing predictions on mechanistic toxicogenomics information.

  20. Protection of the vehicle cab environment against bacteria, fungi and endotoxins in composting facilities.

    Science.gov (United States)

    Schlosser, O; Huyard, A; Rybacki, D; Do Quang, Z

    2012-06-01

    Microbial quality of air inside vehicle cabs is a major occupational health risk management issue in composting facilities. Large differences and discrepancies in protection factors between vehicles and between biological agents have been reported. This study aimed at estimating the mean protection efficiency of the vehicle cab environment against bioaerosols with higher precision. In-cab measurement results were also analysed to ascertain whether or not these protection systems reduce workers' exposure to tolerable levels. Five front-end loaders, one mobile mixer and two agricultural tractors pulling windrow turners were investigated. Four vehicles were fitted with a pressurisation and high efficiency particulate air (HEPA) filtration system. The four others were only equipped with pleated paper filter without pressurisation. Bacteria, fungi and endotoxins were measured in 72 pairs of air samples, simultaneously collected inside the cab and on the outside of the cab with a CIP 10-M sampler. A front-end loader, purchased a few weeks previously, fitted with a pressurisation and high efficiency particulate air (HEPA) filtration system, and with a clean cab, exhibited a mean protection efficiency of between 99.47% CI 95% [98.58-99.97%] and 99.91% [99.78-99.98%] depending on the biological agent. It is likely that the lower protection efficiency demonstrated in other vehicles was caused by penetration through the only moderately efficient filters, by the absence of pressurisation, by leakage in the filter-sealing system, and by re-suspension of particles which accumulated in dirty cabs. Mean protection efficiency in regards to bacteria and endotoxins ranged between 92.64% [81.87-97.89%] and 98.61% [97.41-99.38%], and between 92.68% [88.11-96.08%] and 98.43% [97.44-99.22%], respectively. The mean protection efficiency was the lowest when confronted with fungal spores, from 59.76% [4.19-90.75%] to 94.71% [91.07-97.37%]. The probability that in-cab exposure to fungi
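    The protection efficiencies quoted above follow from paired inside/outside concentrations. A minimal sketch of that calculation, using the standard definition for paired samples; the sample values are illustrative, not data from the study:

```python
# Cab protection efficiency from paired inside/outside bioaerosol samples:
# efficiency (%) = (1 - C_inside / C_outside) * 100.

def protection_efficiency(c_inside, c_outside):
    """Percent of the outside bioaerosol concentration excluded from the cab."""
    if c_outside <= 0:
        raise ValueError("outside concentration must be positive")
    return (1.0 - c_inside / c_outside) * 100.0

# Illustrative paired samples (CFU/m^3), e.g. bacteria inside vs. outside:
pairs = [(120.0, 25000.0), (80.0, 18000.0), (200.0, 30000.0)]
efficiencies = [protection_efficiency(cin, cout) for cin, cout in pairs]
mean_eff = sum(efficiencies) / len(efficiencies)
print(f"mean protection efficiency: {mean_eff:.2f}%")
```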

  1. Deep Echo State Network (DeepESN): A Brief Survey

    OpenAIRE

    Gallicchio, Claudio; Micheli, Alessio

    2017-01-01

    The study of deep recurrent neural networks (RNNs) and, in particular, of deep Reservoir Computing (RC) is gaining increasing research attention in the neural networks community. The recently introduced deep Echo State Network (deepESN) model opened the way to an extremely efficient approach for designing deep neural networks for temporal data. At the same time, the study of deepESNs has shed light on the intrinsic properties of state dynamics developed by hierarchical compositions ...
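    For background, a single reservoir of the kind deepESNs stack can be sketched in NumPy. The hyperparameters here are illustrative; the recurrent weights are rescaled so their largest singular value stays below one, a sufficient (not necessary) condition for the echo state property, which the final check demonstrates:

```python
import numpy as np

# Minimal (shallow) echo state network reservoir; deepESNs stack several
# such reservoirs, feeding each layer's states to the next.

rng = np.random.default_rng(0)
N, rho = 100, 0.9  # reservoir size, target largest singular value (< 1)

W = rng.standard_normal((N, N))
W *= rho / np.linalg.norm(W, 2)  # contractive: largest singular value = rho
W_in = rng.uniform(-0.5, 0.5, size=N)

def run(u_seq, x0):
    """Drive the reservoir with input sequence u_seq from initial state x0."""
    x = x0.copy()
    for u in u_seq:
        x = np.tanh(W @ x + W_in * u)  # standard leak-free state update
    return x

u_seq = rng.uniform(-1, 1, size=200)
xa = run(u_seq, np.zeros(N))
xb = run(u_seq, rng.standard_normal(N))  # different initial state
# Echo state property: states driven by the same input forget their start.
print("state gap after 200 steps:", np.linalg.norm(xa - xb))
```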

  2. Why & When Deep Learning Works: Looking Inside Deep Learning

    OpenAIRE

    Ronen, Ronny

    2017-01-01

    The Intel Collaborative Research Institute for Computational Intelligence (ICRI-CI) has been heavily supporting Machine Learning and Deep Learning research from its foundation in 2012. We have asked six leading ICRI-CI Deep Learning researchers to address the challenge of "Why & When Deep Learning works", with the goal of looking inside Deep Learning, providing insights on how deep networks function, and uncovering key observations on their expressiveness, limitations, and potential. The outp...

  3. Differential toxicity of heterocyclic aromatic amines and their mixture in metabolically competent HepaRG cells

    International Nuclear Information System (INIS)

    Dumont, Julie; Josse, Rozenn; Lambert, Carine; Antherieu, Sebastien; Le Hegarat, Ludovic; Aninat, Caroline; Robin, Marie-Anne; Guguen-Guillouzo, Christiane

    2010-01-01

    Human exposure to heterocyclic aromatic amines (HAA) usually occurs through mixtures rather than individual compounds. However, the toxic effects and related mechanisms of co-exposure to HAA in humans remain unknown. We compared the effects of two of the most common HAA, 2-amino-1-methyl-6-phenylimidazo[4,5-b]pyridine (PhIP) and 2-amino-3,8-dimethylimidazo[4,5-f]quinoxaline (MeIQx), individually or in combination, in the metabolically competent human hepatoma HepaRG cells. Various endpoints were measured including cytotoxicity, apoptosis, oxidative stress and DNA damage by the comet assay. Moreover, the effects of PhIP and/or MeIQx on mRNA expression and activities of enzymes involved in their activation and detoxification pathways were evaluated. After a 24 h treatment, PhIP and MeIQx, individually and in combination, exerted differential effects on apoptosis, oxidative stress, DNA damage and cytochrome P450 (CYP) activities. Only PhIP induced DNA damage. It was also a stronger inducer of CYP1A1 and CYP1B1 expression and activity than MeIQx. In contrast, only MeIQx exposure resulted in a significant induction of CYP1A2 activity. The combination of PhIP with MeIQx induced an oxidative stress and showed synergistic effects on apoptosis. However, PhIP-induced genotoxicity was abolished by a co-exposure with MeIQx. Such an inhibitory effect could be explained by a significant decrease in CYP1A2 activity which is responsible for PhIP genotoxicity. Our findings highlight the need to investigate interactions between HAA when assessing risks for human health and provide new insights in the mechanisms of interaction between PhIP and MeIQx.

  4. A new look at the economic disposal of contaminated aerosol filters

    International Nuclear Information System (INIS)

    Sinhuber, D.; Stiehl, H.H.; Schroth, W.

    1977-01-01

    HEPA filter elements loaded with radioactive aerosols represent a large percentage of the radioactive waste of nuclear plants. The HEPA filter elements used at present in air filter units are not suitable to an economic disposal and space-saving final storage on account of their construction. New concepts for HEPA filters are necessary from the point of view of waste disposal. The criteria of these new designs are as follows: reduction of space required for HEPA filters used at the same or even increased flow rates and with the same efficiency; essentially smaller dimensions of HEPA filters for storage in waste containers; removal without contamination, volume reduction (without contamination) for disposal of the HEPA filters; smallest possible volume of the HEPA filters after processing for final storage. The construction of HEPA filters as a result of the requirements mentioned above and taking into consideration the present stage of technology is explained. The advantages of such construction with regard to the criteria mentioned before are presented in comparison with the HEPA filters at present in use

  5. Deep iCrawl: An Intelligent Vision-Based Deep Web Crawler

    OpenAIRE

    R.Anita; V.Ganga Bharani; N.Nityanandam; Pradeep Kumar Sahoo

    2011-01-01

    The explosive growth of World Wide Web has posed a challenging problem in extracting relevant data. Traditional web crawlers focus only on the surface web while the deep web keeps expanding behind the scene. Deep web pages are created dynamically as a result of queries posed to specific web databases. The structure of the deep web pages makes it impossible for traditional web crawlers to access deep web contents. This paper, Deep iCrawl, gives a novel and vision-based app...

  6. Folded membrane dialyzer with mechanically sealed edges

    Energy Technology Data Exchange (ETDEWEB)

    Markley, F.W.

    A semipermeable membrane is folded in accordion fashion to form a stack of pleats and the edges are sealed so as to isolate the opposite surfaces of the membrane. The stack is contained within a case that provides ports for flow of blood in contact with one surface of the membrane through channels formed by the pleats and also provides ports for flow of a dialysate through channels formed by the pleats in contact with the other surface of the membrane. The serpentine side edges of the membrane are sealed by a solidified plastic material, whereas effective mechanical means are provided to seal the end edges of the folded membrane. The mechanical means include a clamping strip which biases case sealing flanges into a sealed relationship with end portions of the membrane near the end edges, which portions extend from the stack and between the sealing flanges.

  7. Promotion of health-enhancing physical activity in rheumatoid arthritis: a comparative study on healthcare providers in Italy, The Netherlands and Sweden.

    Science.gov (United States)

    Brodin, Nina; Hurkmans, Emalie; DiMatteo, Luigi; Nava, Tiziana; Vliet Vlieland, Thea; Opava, Christina H

    2015-10-01

    The objectives of this study were to compare attitudes, practice of advice, perceived competencies and educational needs related to health-enhancing physical activity (HEPA) in rheumatoid arthritis (RA) among Dutch, Italian and Swedish healthcare providers (HCP) and to explore associations between these factors and age, gender and HEPA levels of HCP. Questionnaires were sent to 2939 HCP, members of their national rheumatology organizations. HEPA was assessed with the Short Questionnaire to Assess Health-Enhancing Physical Activity or the International Physical Activity Questionnaire; attitudes, practice of advice, perceived competencies and educational needs with a 23-item questionnaire. Overall response rate was 33%. Ninety-five percent of HCP agreed that HEPA is an important health goal in RA. More Swedish HCP had positive attitudes to the attainability and safety of HEPA in RA. There were no differences between countries in practice of advice on HEPA to patients with RA in general or to those with recent onset disease, but more Italian HCP were reluctant to advise HEPA to patients with established disease. Of the total HCP, 36 to 60% used public health guidelines to advise on HEPA, with Dutch HCP taking less advantage. Still, they estimated a higher proportion of patients with RA to follow such advice. Italian HCP perceived their competencies the highest, but were also more interested in education to promote HEPA. Gender, age and HEPA performance had no association with attitudes toward HEPA, while a number of associations were found between these factors and practice of advice and perceived competencies. The differences found between HCP in the three countries might indicate the need for educational initiatives to improve HEPA promotion.

  8. Deep learning

    CERN Document Server

    Goodfellow, Ian; Courville, Aaron

    2016-01-01

    Deep learning is a form of machine learning that enables computers to learn from experience and understand the world in terms of a hierarchy of concepts. Because the computer gathers knowledge from experience, there is no need for a human computer operator to formally specify all the knowledge that the computer needs. The hierarchy of concepts allows the computer to learn complicated concepts by building them out of simpler ones; a graph of these hierarchies would be many layers deep. This book introduces a broad range of topics in deep learning. The text offers mathematical and conceptual background, covering relevant concepts in linear algebra, probability theory and information theory, numerical computation, and machine learning. It describes deep learning techniques used by practitioners in industry, including deep feedforward networks, regularization, optimization algorithms, convolutional networks, sequence modeling, and practical methodology; and it surveys such applications as natural language proces...

  9. Deep Incremental Boosting

    OpenAIRE

    Mosca, Alan; Magoulas, George D

    2017-01-01

    This paper introduces Deep Incremental Boosting, a new technique derived from AdaBoost, specifically adapted to work with Deep Learning methods, that reduces the required training time and improves generalisation. We draw inspiration from Transfer of Learning approaches to reduce the start-up time to training each incremental Ensemble member. We show a set of experiments that outlines some preliminary results on some common Deep Learning datasets and discuss the potential improvements Deep In...

  10. Deep Super Learner: A Deep Ensemble for Classification Problems

    OpenAIRE

    Young, Steven; Abdou, Tamer; Bener, Ayse

    2018-01-01

    Deep learning has become very popular for tasks such as predictive modeling and pattern recognition in handling big data. Deep learning is a powerful machine learning method that extracts lower level features and feeds them forward for the next layer to identify higher level features that improve performance. However, deep neural networks have drawbacks, which include many hyper-parameters and infinite architectures, opaqueness into results, and relatively slower convergence on smaller datase...

  11. DeepRT: deep learning for peptide retention time prediction in proteomics

    OpenAIRE

    Ma, Chunwei; Zhu, Zhiyong; Ye, Jun; Yang, Jiarui; Pei, Jianguo; Xu, Shaohang; Zhou, Ruo; Yu, Chang; Mo, Fan; Wen, Bo; Liu, Siqi

    2017-01-01

    Accurate predictions of peptide retention times (RT) in liquid chromatography have many applications in mass spectrometry-based proteomics. Herein, we present DeepRT, a deep learning based software for peptide retention time prediction. DeepRT automatically learns features directly from the peptide sequences using the deep convolutional Neural Network (CNN) and Recurrent Neural Network (RNN) model, which eliminates the need to use hand-crafted features or rules. After the feature learning, pr...

  12. Physician Knowledge and Attitudes About Hepatitis A and Current Practices Regarding Hepatitis A Vaccination Delivery.

    Science.gov (United States)

    Nelson, Noele P; Allison, Mandy A; Lindley, Megan C; Brtnikova, Michaela; Crane, Lori A; Beaty, Brenda L; Hurley, Laura P; Kempe, Allison

    2017-07-01

    To assess physicians': 1) knowledge and attitudes about hepatitis A disease and hepatitis A (HepA) vaccine, 2) child care and school HepA vaccine mandates, 3) practices related to HepA vaccine delivery, 4) factors associated with strongly recommending HepA vaccine to all 1- to 2-year-olds, and 5) feasibility of implementing HepA catch-up vaccination at health maintenance visits. A national survey was conducted among representative networks of pediatricians and family medicine physicians (FMs) from March to June, 2014 via e-mail or mail on the basis of provider preference. Response rates were 81% (356 of 440) among pediatricians and 75% (348 of 464) among FMs. Less than 50% correctly identified that hepatitis A virus (HAV) infection is usually asymptomatic in young children and that morbidity from HAV disease increases with age. Ninety-two percent of pediatricians and 59% of FMs strongly recommend HepA vaccine for all 1- to 2-year-olds. In addition to practice specialty, belief that HepA vaccine is required for kindergarten enrollment was the most robust predictor of strong physician recommendation. Gaps in knowledge regarding HAV infection and hepatitis A recommendations and lack of a strong recommendation for routine HepA vaccination of young children among FMs likely contribute to suboptimal coverage. Closing knowledge gaps and addressing barriers that prevent all physicians from strongly recommending HepA vaccine to 1- to 2-year-olds could help increase HepA vaccine coverage and ultimately improve population protection against HAV infection. Published by Elsevier Inc.

  13. Absolute air filtering equipment in the nuclear industry. Design - Safety - Experience

    International Nuclear Information System (INIS)

    Lucas, J.C.

    1977-01-01

    The problems encountered in the design of absolute filters (HEPA filters) are presented: glass-fibre filter papers; standards and characteristics (efficiency, fire resistance, humidity resistance, radiation resistance, etc.); various types of paper folding (deep folds and small folds, dihedrally mounted); filtering elements; designs; and characteristics and quality control. The design of filtration equipment is also analysed: mounting in metal or concrete casings, French and American designs (Regulatory Guide 1.52), and gas-tight casings allowing contaminated filters to be renewed without breaking the gas-tight seal.

  14. Analyses of the deep borehole drilling status for a deep borehole disposal system

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jong Youl; Choi, Heui Joo; Lee, Min Soo; Kim, Geon Young; Kim, Kyung Su [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    The purpose of disposal for radioactive wastes is not only to isolate them from humans, but also to inhibit leakage of any radioactive materials into the accessible environment. Because of the extremely high activity and long time scale of HLW (high-level radioactive waste), a mined deep geological disposal concept, with a disposal depth of about 500 m below ground, is considered the safest method to isolate spent fuel or high-level radioactive waste from the human environment with the best technology available at present. Therefore, deep borehole disposal technology is under consideration as an alternative disposal concept in a number of countries on account of its outstanding safety and cost effectiveness. In this paper, as one of the key technologies of a deep borehole disposal system, the general status of deep drilling technologies in the oil, geothermal, and geoscientific fields is reviewed for deep borehole disposal of high-level radioactive wastes. Based on the results of this review, the preliminary applicability of deep drilling technology to deep borehole disposal, including the relation between depth and diameter, drilling time, and feasibility classification, is analyzed.

  15. Deep Space Telecommunications

    Science.gov (United States)

    Kuiper, T. B. H.; Resch, G. M.

    2000-01-01

    The increasing load on NASA's deep Space Network, the new capabilities for deep space missions inherent in a next-generation radio telescope, and the potential of new telescope technology for reducing construction and operation costs suggest a natural marriage between radio astronomy and deep space telecommunications in developing advanced radio telescope concepts.

  16. Greedy Deep Dictionary Learning

    OpenAIRE

    Tariyal, Snigdha; Majumdar, Angshul; Singh, Richa; Vatsa, Mayank

    2016-01-01

    In this work we propose a new deep learning tool called deep dictionary learning. Multi-level dictionaries are learnt in a greedy fashion, one layer at a time. This requires solving a simple (shallow) dictionary learning problem, the solution to this is well known. We apply the proposed technique on some benchmark deep learning datasets. We compare our results with other deep learning tools like stacked autoencoder and deep belief network; and state of the art supervised dictionary learning t...
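    The greedy layer-wise scheme described in the abstract can be sketched as repeated shallow dictionary learning: each layer fits X ≈ DZ, here by alternating ridge-regularized least squares for simplicity, and passes its codes Z down as the next layer's input. This is an illustrative reading under those simplifying assumptions, not the authors' implementation:

```python
import numpy as np

# Greedy deep dictionary learning sketch: solve one shallow dictionary
# learning problem per layer, one layer at a time.

def shallow_dictionary(X, n_atoms, n_iter=30, lam=1e-3, seed=0):
    """Alternating least-squares fit of X ~ D @ Z (ridge-regularized)."""
    rng = np.random.default_rng(seed)
    D = rng.standard_normal((X.shape[0], n_atoms))
    for _ in range(n_iter):
        # Code update:       Z = argmin ||X - D Z||^2 + lam ||Z||^2
        Z = np.linalg.solve(D.T @ D + lam * np.eye(n_atoms), D.T @ X)
        # Dictionary update: D = argmin ||X - D Z||^2 + lam ||D||^2
        D = np.linalg.solve(Z @ Z.T + lam * np.eye(n_atoms), Z @ X.T).T
    return D, Z

X = np.random.default_rng(1).standard_normal((20, 200))  # 20-dim, 200 samples
D1, Z1 = shallow_dictionary(X, n_atoms=15)   # layer 1 on the raw data
D2, Z2 = shallow_dictionary(Z1, n_atoms=10)  # layer 2 on layer-1 codes
err = np.linalg.norm(X - D1 @ (D2 @ Z2)) / np.linalg.norm(X)
print(f"relative reconstruction error through both layers: {err:.3f}")
```

Because each layer is a well-understood shallow problem, no end-to-end backpropagation is needed, which is the efficiency argument the abstract makes.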

  17. DeepBipolar: Identifying genomic mutations for bipolar disorder via deep learning.

    Science.gov (United States)

    Laksshman, Sundaram; Bhat, Rajendra Rana; Viswanath, Vivek; Li, Xiaolin

    2017-09-01

    Bipolar disorder, also known as manic depression, is a brain disorder that affects the brain structure of a patient. It results in extreme mood swings, severe states of depression, and overexcitement simultaneously. It is estimated that roughly 3% of the population of the United States (about 5.3 million adults) suffers from bipolar disorder. Recent research efforts like the Twin studies have demonstrated a high heritability factor for the disorder, making genomics a viable alternative for detecting and treating bipolar disorder, in addition to the conventional lengthy and costly postsymptom clinical diagnosis. Motivated by this study, leveraging several emerging deep learning algorithms, we design an end-to-end deep learning architecture (called DeepBipolar) to predict bipolar disorder based on limited genomic data. DeepBipolar adopts the Deep Convolutional Neural Network (DCNN) architecture that automatically extracts features from genotype information to predict the bipolar phenotype. We participated in the Critical Assessment of Genome Interpretation (CAGI) bipolar disorder challenge and DeepBipolar was considered the most successful by the independent assessor. In this work, we thoroughly evaluate the performance of DeepBipolar and analyze the type of signals we believe could have affected the classifier in distinguishing the case samples from the control set. © 2017 Wiley Periodicals, Inc.

  18. Deep learning? What deep learning? | Fourie | South African ...

    African Journals Online (AJOL)

    In teaching generally over the past twenty years, there has been a move towards teaching methods that encourage deep, rather than surface approaches to learning. The reason for this being that students, who adopt a deep approach to learning are considered to have learning outcomes of a better quality and desirability ...

  19. DeepInfer: open-source deep learning deployment toolkit for image-guided therapy

    Science.gov (United States)

    Mehrtash, Alireza; Pesteie, Mehran; Hetherington, Jorden; Behringer, Peter A.; Kapur, Tina; Wells, William M.; Rohling, Robert; Fedorov, Andriy; Abolmaesumi, Purang

    2017-03-01

    Deep learning models have outperformed some of the previous state-of-the-art approaches in medical image analysis. Instead of using hand-engineered features, deep models attempt to automatically extract hierarchical representations at multiple levels of abstraction from the data. Therefore, deep models are usually considered to be more flexible and robust solutions for image analysis problems compared to conventional computer vision models. They have demonstrated significant improvements in computer-aided diagnosis and automatic medical image analysis applied to such tasks as image segmentation, classification and registration. However, deploying deep learning models often has a steep learning curve and requires detailed knowledge of various software packages. Thus, many deep models have not been integrated into clinical research workflows, causing a gap between the state-of-the-art machine learning in medical applications and evaluation in clinical research procedures. In this paper, we propose "DeepInfer" - an open-source toolkit for developing and deploying deep learning models within the 3D Slicer medical image analysis platform. Utilizing a repository of task-specific models, DeepInfer allows clinical researchers and biomedical engineers to deploy a trained model selected from the public registry, and apply it to new data without the need for software development or configuration. As two practical use cases, we demonstrate the application of DeepInfer in prostate segmentation for targeted MRI-guided biopsy and identification of the target plane in 3D ultrasound for spinal injections.

  20. Deep learning with Python

    CERN Document Server

    Chollet, Francois

    2018-01-01

    DESCRIPTION Deep learning is applicable to a widening range of artificial intelligence problems, such as image classification, speech recognition, text classification, question answering, text-to-speech, and optical character recognition. Deep Learning with Python is structured around a series of practical code examples that illustrate each new concept introduced and demonstrate best practices. By the time you reach the end of this book, you will have become a Keras expert and will be able to apply deep learning in your own projects. KEY FEATURES • Practical code examples • In-depth introduction to Keras • Teaches the difference between Deep Learning and AI ABOUT THE TECHNOLOGY Deep learning is the technology behind photo tagging systems at Facebook and Google, self-driving cars, speech recognition systems on your smartphone, and much more. AUTHOR BIO Francois Chollet is the author of Keras, one of the most widely used libraries for deep learning in Python. He has been working with deep neural ...

  1. Improving indoor air quality by using the new generation of corrugated cardboard-based filters.

    Science.gov (United States)

    Candiani, Gabriele; Del Curto, Barbara; Cigada, Alberto

    2012-09-27

    Indoor Air Quality (IAQ) is strictly affected by the concentration of total suspended particulate matter (TSP). Air filtration is by far the most feasible suggestion to improve IAQ. Unfortunately, highly effective HEPA filters also have a few major weaknesses that have hindered their widespread use. There is therefore a renewed interest in developing novel, cost-effective filtration systems. We have recently reported the development of cardboard-based filters for bacterial removal that were further implemented and tested herein. A parallelepiped filter manufactured by aligning strips of corrugated cardboard and surrounded by a cardboard frame was specifically designed with an internal pocket holding a partially cut antistatic pleated fabric (HP). This filter, together with its parent version (CTRL) and a commercially sourced specimen (CAF), were compared in a long-term test of their effectiveness at TSP removal. We found that the TSP abatement efficiency (E%) of the HP filter was relatively high and invariable over the 93-day test, and the pressure drop (PD%) decrease caused by filter clogging was moderate. Most important, the HP filter was the most effective if assessed in terms of overall yield (Y%), and its performance was quite constant over the entire period considered. This work presents this novel class of corrugated cardboard-based filters as a promising tool to ameliorate IAQ, in light of their good TSP removal properties that endure over time. Moreover, cardboard is a lightweight, inexpensive, and eco-friendly material, and corrugated cardboard-based air filters are very easy to shape and mount on and/or replace in existing ventilation systems.

  2. Deep learning evaluation using deep linguistic processing

    OpenAIRE

    Kuhnle, Alexander; Copestake, Ann

    2017-01-01

    We discuss problems with the standard approaches to evaluation for tasks like visual question answering, and argue that artificial data can be used to address these as a complement to current practice. We demonstrate that with the help of existing 'deep' linguistic processing technology we are able to create challenging abstract datasets, which enable us to investigate the language understanding abilities of multimodal deep learning models in detail, as compared to a single performance value ...

  3. Cross-sector cooperation in health-enhancing physical activity policymaking

    DEFF Research Database (Denmark)

    Hämäläinen, Riitta-Maija; Aro, Arja R.; Juel Lau, Cathrine

    2016-01-01

    in health-enhancing physical activity (HEPA) policies in six European Union (EU) member states. METHODS: Qualitative content analysis of HEPA policies and semi-structured interviews with key policymakers in six European countries. RESULTS: Cross-sector cooperation varied between EU member states within HEPA...

  4. Morphology and identification of fly eggs: application in forensic entomology.

    Science.gov (United States)

    Sanit, S; Sribanditmongkol, P; Sukontason, K L; Moophayak, K; Klong-Klaew, T; Yasanga, T; Sukontason, K

    2013-06-01

    Fly eggs found in corpses can be used as entomological evidence in forensic investigation. This study aims to investigate the morphology of forensically important fly eggs. Eggs of Chrysomya rufifacies, Chrysomya megacephala, Chrysomya pinguis, Chrysomya nigripes, Hypopygiopsis tumrasvini, Lucilia cuprina, Lucilia porphyrina and Musca domestica were examined using 1% potassium permanganate solution for 1 min. Morphometric analysis revealed that the mean length of Hy. tumrasvini (1.63 mm) and C. pinguis (1.65 mm) eggs was the longest, followed by that of L. porphyrina (1.45 mm), C. rufifacies (1.34 mm). The egg length, width of median area and darkness staining of hatching pleats were distinctive features. Four categories of median area were proposed, based on width; (1) distinctly wide (Megaselia scalaris, Synthesiomyia nudiseta); (2) wide (C. nigripes, M. domestica); (3) slightly widening (Hy. tumrasvini, L. cuprina, L. porphyrina); and (4) narrow (C. rufifacies, C. albiceps, C. megacephala, C. pinguis). Four species were examined using SEM, i.e., C. megacephala, C. pinguis, Hy. tumrasvini and L. porphyrina. The eggs of C. megacephala demonstrated swollen hatching pleats. Inside, the hexagon of the chorion appeared as a sponging bumpy feature. The egg of C. pinguis was similar to C. megacephala, except for the sponging bumpy feature on the outer surface of the hatching pleats. Regarding Hy. tumrasvini and L. porphyrina, their island structure was apparent at the inner surface of the upright hatching pleats. The key for identifying these eggs together with other reported species in Thailand has been updated.

  5. Geometric mechanics of periodic pleated origami.

    Science.gov (United States)

    Wei, Z Y; Guo, Z V; Dudte, L; Liang, H Y; Mahadevan, L

    2013-05-24

    Origami structures are mechanical metamaterials with properties that arise almost exclusively from the geometry of the constituent folds and the constraint of piecewise isometric deformations. Here we characterize the geometry and planar and nonplanar effective elastic response of a simple periodically folded Miura-ori structure, which is composed of identical unit cells of mountain and valley folds with four-coordinated ridges, defined completely by two angles and two lengths. We show that the in-plane and out-of-plane Poisson's ratios are equal in magnitude, but opposite in sign, independent of material properties. Furthermore, we show that effective bending stiffness of the unit cell is singular, allowing us to characterize the two-dimensional deformation of a plate in terms of a one-dimensional theory. Finally, we solve the inverse design problem of determining the geometric parameters for the optimal geometric and mechanical response of these extreme structures.

  6. DeepMitosis: Mitosis detection via deep detection, verification and segmentation networks.

    Science.gov (United States)

    Li, Chao; Wang, Xinggang; Liu, Wenyu; Latecki, Longin Jan

    2018-04-01

    Mitotic count is a critical predictor of tumor aggressiveness in breast cancer diagnosis. Nowadays, mitosis counting is mainly performed by pathologists manually, which is extremely arduous and time-consuming. In this paper, we propose an accurate method for detecting the mitotic cells from histopathological slides using a novel multi-stage deep learning framework. Our method consists of a deep segmentation network for generating mitosis regions when only a weak label is given (i.e., only the centroid pixel of mitosis is annotated), an elaborately designed deep detection network for localizing mitosis by using contextual region information, and a deep verification network for improving detection accuracy by removing false positives. We validate the proposed deep learning method on two widely used Mitosis Detection in Breast Cancer Histological Images (MITOSIS) datasets. Experimental results show that we can achieve the highest F-score on the MITOSIS dataset from the ICPR 2012 grand challenge merely using the deep detection network. For the ICPR 2014 MITOSIS dataset that only provides the centroid location of mitosis, we employ the segmentation model to estimate the bounding box annotation for training the deep detection network. We also apply the verification model to eliminate some false positives produced from the detection model. By fusing scores of the detection and verification models, we achieve the state-of-the-art results. Moreover, our method is very fast with GPU computing, which makes it feasible for clinical practice. Copyright © 2018 Elsevier B.V. All rights reserved.

  7. Deep frying

    NARCIS (Netherlands)

    Koerten, van K.N.

    2016-01-01

    Deep frying is one of the most used methods in the food processing industry. Though practically any food can be fried, French fries are probably the most well-known deep fried products. The popularity of French fries stems from their unique taste and texture, a crispy outside with a mealy soft

  8. DeepPVP: phenotype-based prioritization of causative variants using deep learning

    KAUST Repository

    Boudellioua, Imene

    2018-05-02

    Background: Prioritization of variants in personal genomic data is a major challenge. Recently, computational methods that rely on comparing phenotype similarity have been shown to be useful for identifying causative variants. In these methods, pathogenicity prediction is combined with a semantic similarity measure to prioritize not only variants that are likely to be dysfunctional but also those that are likely involved in the pathogenesis of a patient's phenotype. Results: We have developed DeepPVP, a variant prioritization method that combines automated inference with deep neural networks to identify the likely causative variants in whole exome or whole genome sequence data. We demonstrate that DeepPVP performs significantly better than existing methods, including phenotype-based methods that use similar features. DeepPVP is freely available at https://github.com/bio-ontology-research-group/phenomenet-vp Conclusions: DeepPVP further improves on existing variant prioritization methods in terms of both speed and accuracy.

  9. Hot, deep origin of petroleum: deep basin evidence and application

    Science.gov (United States)

    Price, Leigh C.

    1978-01-01

    Use of the model of a hot deep origin of oil places rigid constraints on the migration and entrapment of crude oil. Specifically, oil originating from depth migrates vertically up faults and is emplaced in traps at shallower depths. Review of petroleum-producing basins worldwide shows oil occurrence in these basins conforms to the restraints of and therefore supports the hypothesis. Most of the world's oil is found in the very deepest sedimentary basins, and production over or adjacent to the deep basin is cut by or directly updip from faults dipping into the basin deep. Generally the greater the fault throw the greater the reserves. Fault-block highs next to deep sedimentary troughs are the best target areas by the present concept. Traps along major basin-forming faults are quite prospective. The structural style of a basin governs the distribution, types, and amounts of hydrocarbons expected and hence the exploration strategy. Production in delta depocenters (Niger) is in structures cut by or updip from major growth faults, and structures not associated with such faults are barren. Production in block fault basins is on horsts next to deep sedimentary troughs (Sirte, North Sea). In basins whose sediment thickness, structure and geologic history are known to a moderate degree, the main oil occurrences can be specifically predicted by analysis of fault systems and possible hydrocarbon migration routes. Use of the concept permits the identification of significant targets which have either been downgraded or ignored in the past, such as production in or just updip from thrust belts, stratigraphic traps over the deep basin associated with major faulting, production over the basin deep, and regional stratigraphic trapping updip from established production along major fault zones.

  10. Deep learning in bioinformatics.

    Science.gov (United States)

    Min, Seonwoo; Lee, Byunghan; Yoon, Sungroh

    2017-09-01

    In the era of big data, transformation of biomedical big data into valuable knowledge has been one of the most important challenges in bioinformatics. Deep learning has advanced rapidly since the early 2000s and now demonstrates state-of-the-art performance in various fields. Accordingly, application of deep learning in bioinformatics to gain insight from data has been emphasized in both academia and industry. Here, we review deep learning in bioinformatics, presenting examples of current research. To provide a useful and comprehensive perspective, we categorize research both by the bioinformatics domain (i.e. omics, biomedical imaging, biomedical signal processing) and deep learning architecture (i.e. deep neural networks, convolutional neural networks, recurrent neural networks, emergent architectures) and present brief descriptions of each study. Additionally, we discuss theoretical and practical issues of deep learning in bioinformatics and suggest future research directions. We believe that this review will provide valuable insights and serve as a starting point for researchers to apply deep learning approaches in their bioinformatics studies. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  11. DeepSimulator: a deep simulator for Nanopore sequencing

    KAUST Repository

    Li, Yu

    2017-12-23

    Motivation: Oxford Nanopore sequencing is a sequencing technology that has developed rapidly in recent years. To keep pace with the explosion of downstream data analysis tools, a versatile Nanopore sequencing simulator is needed to complement the experimental data as well as to benchmark those newly developed tools. However, all the currently available simulators are based on simple statistics of the produced reads, which have difficulty in capturing the complex nature of the Nanopore sequencing procedure, the main task of which is the generation of raw electrical current signals. Results: Here we propose a deep learning based simulator, DeepSimulator, to mimic the entire pipeline of Nanopore sequencing. Starting from a given reference genome or assembled contigs, we simulate the electrical current signals by a context-dependent deep learning model, followed by a base-calling procedure to yield simulated reads. This workflow mimics the sequencing procedure more naturally. The thorough experiments performed across four species show that the signals generated by our context-dependent model are more similar to the experimentally obtained signals than the ones generated by the official context-independent pore model. In terms of the simulated reads, we provide a parameter interface to users so that they can obtain reads with different accuracies ranging from 83% to 97%. The reads generated by the default parameters have almost the same properties as the real data. Two case studies demonstrate the application of DeepSimulator to benefit the development of tools in de novo assembly and in low coverage SNP detection. Availability: The software can be accessed freely at: https://github.com/lykaust15/DeepSimulator.

  12. Deep learning relevance

    DEFF Research Database (Denmark)

    Lioma, Christina; Larsen, Birger; Petersen, Casper

    2016-01-01

    train a Recurrent Neural Network (RNN) on existing relevant information to that query. We then use the RNN to "deep learn" a single, synthetic, and we assume, relevant document for that query. We design a crowdsourcing experiment to assess how relevant the "deep learned" document is, compared...... to existing relevant documents. Users are shown a query and four wordclouds (of three existing relevant documents and our deep learned synthetic document). The synthetic document is ranked on average most relevant of all....

  13. STIMULATION TECHNOLOGIES FOR DEEP WELL COMPLETIONS

    Energy Technology Data Exchange (ETDEWEB)

    Stephen Wolhart

    2003-06-01

    The Department of Energy (DOE) is sponsoring a Deep Trek Program targeted at improving the economics of drilling and completing deep gas wells. Under the DOE program, Pinnacle Technologies is conducting a project to evaluate the stimulation of deep wells. The objective of the project is to assess U.S. deep well drilling & stimulation activity, review rock mechanics & fracture growth in deep, high pressure/temperature wells and evaluate stimulation technology in several key deep plays. Phase 1 was recently completed and consisted of assessing deep gas well drilling activity (1995-2007) and an industry survey on deep gas well stimulation practices by region. Of the 29,000 oil, gas and dry holes drilled in 2002, about 300 were deep wells; 25% were dry, 50% were high temperature/high pressure completions and 25% were simply deep completions. South Texas has about 30% of these wells, Oklahoma 20%, Gulf of Mexico Shelf 15% and the Gulf Coast about 15%. The Rockies represent only 2% of deep drilling. Of the 60 operators who drill deep and HTHP wells, the top 20 drill almost 80% of the wells. Six operators drill half the U.S. deep wells. Deep drilling peaked at 425 wells in 1998 and fell to 250 in 1999. Drilling is expected to rise through 2004, after which it should cycle down as overall drilling declines.

  14. Deep learning in TMVA Benchmarking Benchmarking TMVA DNN Integration of a Deep Autoencoder

    CERN Document Server

    Huwiler, Marc

    2017-01-01

    The TMVA library in ROOT is dedicated to multivariate analysis, and in particular offers numerous machine learning algorithms in a standardized framework. It is widely used in High Energy Physics for data analysis, mainly to perform regression and classification. To keep up to date with the state of the art in deep learning, a new deep learning module was being developed this summer, offering deep neural networks, convolutional neural networks, and autoencoders. TMVA did not yet have any autoencoder method, and the present project consists in implementing the TMVA autoencoder class based on the deep learning module. It also includes some benchmarking performed on the actual deep neural network implementation, in comparison to the Keras framework with TensorFlow and Theano backends.

  15. Hepatic tissue environment in NEMO-deficient mice critically regulates positive selection of donor cells after hepatocyte transplantation.

    Directory of Open Access Journals (Sweden)

    Michaela Kaldenbach

    Full Text Available BACKGROUND: Hepatocyte transplantation (HT is a promising alternative treatment strategy for end-stage liver diseases compared with orthotopic liver transplantation. A limitation for this approach is the low engraftment of donor cells. The deletion of the I-kappa B kinase-regulatory subunit IKKγ/NEMO in hepatocytes prevents nuclear factor (NF-kB activation and triggers spontaneous liver apoptosis, chronic hepatitis and the development of liver fibrosis and hepatocellular carcinoma. We hypothesized that NEMOΔhepa mice may therefore serve as an experimental model to study HT. METHODS: Pre-conditioned NEMOΔhepa mice were transplanted with donor-hepatocytes from wildtype (WT and mice deficient for the pro-apoptotic mediator Caspase-8 (Casp8Δhepa. RESULTS: Transplantation of isolated WT-hepatocytes into pre-conditioned NEMOΔhepa mice resulted in a 6-7 fold increase of donor cells 12 weeks after HT, while WT-recipients showed no liver repopulation. The use of apoptosis-resistant Casp8Δhepa-derived donor cells further enhanced the selection 3-fold after 12-weeks and up to 10-fold increase after 52 weeks compared with WT donors. While analysis of NEMOΔhepa mice revealed strong liver injury, HT-recipient NEMOΔhepa mice showed improved liver morphology and decrease in serum transaminases. Concomitant with these findings, the histological examination elicited an improved liver tissue architecture associated with significantly lower levels of apoptosis, decreased proliferation and a lesser amount of liver fibrogenesis. Altogether, our data clearly support the therapeutic benefit of the HT procedure into NEMOΔhepa mice. CONCLUSION: This study demonstrates the feasibility of the NEMOΔhepa mouse as an in vivo tool to study liver repopulation after HT. The improvement of the characteristic phenotype of chronic liver injury in NEMOΔhepa mice after HT suggests the therapeutic potential of HT in liver diseases with a chronic inflammatory phenotype and

  16. Deep subsurface microbial processes

    Science.gov (United States)

    Lovley, D.R.; Chapelle, F.H.

    1995-01-01

    Information on the microbiology of the deep subsurface is necessary in order to understand the factors controlling the rate and extent of the microbially catalyzed redox reactions that influence the geophysical properties of these environments. Furthermore, there is an increasing threat that deep aquifers, an important drinking water resource, may be contaminated by man's activities, and there is a need to predict the extent to which microbial activity may remediate such contamination. Metabolically active microorganisms can be recovered from a diversity of deep subsurface environments. The available evidence suggests that these microorganisms are responsible for catalyzing the oxidation of organic matter coupled to a variety of electron acceptors just as microorganisms do in surface sediments, but at much slower rates. The technical difficulties in aseptically sampling deep subsurface sediments and the fact that microbial processes in laboratory incubations of deep subsurface material often do not mimic in situ processes frequently necessitate that microbial activity in the deep subsurface be inferred through nonmicrobiological analyses of ground water. These approaches include measurements of dissolved H2, which can predict the predominant microbially catalyzed redox reactions in aquifers, as well as geochemical and groundwater flow modeling, which can be used to estimate the rates of microbial processes. Microorganisms recovered from the deep subsurface have the potential to affect the fate of toxic organics and inorganic contaminants in groundwater. Microbial activity also greatly influences the chemistry of many pristine groundwaters and contributes to such phenomena as porosity development in carbonate aquifers, accumulation of undesirably high concentrations of dissolved iron, and production of methane and hydrogen sulfide. Although the last decade has seen a dramatic increase in interest in deep subsurface microbiology, in comparison with the study of

  17. DeepSurv: personalized treatment recommender system using a Cox proportional hazards deep neural network.

    Science.gov (United States)

    Katzman, Jared L; Shaham, Uri; Cloninger, Alexander; Bates, Jonathan; Jiang, Tingting; Kluger, Yuval

    2018-02-26

    Medical practitioners use survival models to explore and understand the relationships between patients' covariates (e.g. clinical and genetic features) and the effectiveness of various treatment options. Standard survival models like the linear Cox proportional hazards model require extensive feature engineering or prior medical knowledge to model treatment interaction at an individual level. While nonlinear survival methods, such as neural networks and survival forests, can inherently model these high-level interaction terms, they have yet to be shown as effective treatment recommender systems. We introduce DeepSurv, a Cox proportional hazards deep neural network and state-of-the-art survival method for modeling interactions between a patient's covariates and treatment effectiveness in order to provide personalized treatment recommendations. We perform a number of experiments training DeepSurv on simulated and real survival data. We demonstrate that DeepSurv performs as well as or better than other state-of-the-art survival models and validate that DeepSurv successfully models increasingly complex relationships between a patient's covariates and their risk of failure. We then show how DeepSurv models the relationship between a patient's features and effectiveness of different treatment options to show how DeepSurv can be used to provide individual treatment recommendations. Finally, we train DeepSurv on real clinical studies to demonstrate how its personalized treatment recommendations would increase the survival time of a set of patients. The predictive and modeling capabilities of DeepSurv will enable medical researchers to use deep neural networks as a tool in their exploration, understanding, and prediction of the effects of a patient's characteristics on their risk of failure.
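    The objective behind a Cox proportional hazards network is the negative log Cox partial likelihood. The sketch below is a minimal NumPy illustration of that standard loss, not the authors' code, and the variable names are ours:

```python
import numpy as np

def cox_neg_log_partial_likelihood(risk_scores, times, events):
    """Negative log Cox partial likelihood, averaged over observed events.

    risk_scores: network outputs h(x), one per patient (higher = riskier)
    times:       observed follow-up times
    events:      1 if the failure was observed, 0 if censored
    """
    order = np.argsort(-np.asarray(times))           # sort by descending time
    risk = np.asarray(risk_scores, dtype=float)[order]
    ev = np.asarray(events, dtype=float)[order]
    # running log-sum-exp gives log of each patient's risk set
    log_risk_set = np.logaddexp.accumulate(risk)
    n_events = max(ev.sum(), 1.0)
    return -np.sum((risk - log_risk_set) * ev) / n_events

# Two patients with equal scores, both events observed:
loss = cox_neg_log_partial_likelihood([0.0, 0.0], [2.0, 1.0], [1, 1])
print(round(loss, 4))  # 0.3466 (= log(2)/2)
```

    This simple form ignores tied event times; practical implementations use the Breslow or Efron correction for ties.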

  18. Hepatocyte polarization is essential for the productive entry of the hepatitis B virus.

    Science.gov (United States)

    Schulze, Andreas; Mills, Kerry; Weiss, Thomas S; Urban, Stephan

    2012-02-01

    Human hepatitis B virus (HBV) is characterized by a high species specificity and a distinct liver tropism. Within the liver, HBV replication occurs in differentiated and polarized hepatocytes. Accordingly, the in vitro HBV infection of primary human hepatocytes (PHHs) and the human hepatoma cell line, HepaRG, is restricted to differentiated, hepatocyte-like cells. Though preparations of PHH contain up to 100% hepatic cells, cultures of differentiated HepaRG cells are a mixture of hepatocyte-like and biliary-like epithelial cells. We used PHH and HepaRG cells and compared the influence of virus inoculation dose, cell differentiation, and polarization on productive HBV infection. At multiplicities of genome equivalents (mge) >8,000, almost 100% of PHHs could be infected. In contrast, only a subset of HepaRG cells stained positive for HBcAg at comparable or even higher mge. Infection predominantly occurred at the edges of islands of hepatocyte-like HepaRG cells. This indicates a limited accessibility of the HBV receptor, possibly as a result of its polar sorting. Multidrug resistance protein 2 (MRP2), a marker selectively transported to the apical (i.e., canalicular) cell membrane, revealed two polarization phenotypes of HepaRG cells. HBV infection within the islands of hepatocyte-like HepaRG cells preferentially occurred in cells that resemble PHH, exhibiting canalicular structures. However, disruption of cell-cell junctions allowed the additional infection of cells that do not display a PHH-like polarization. HBV enters hepatocytes via the basolateral membrane. This model, at least partially, explains the difference of PHH and HepaRG cells in infection efficacy, provides insights into natural HBV infection, and establishes a basis for optimization of the HepaRG infection system. Copyright © 2011 American Association for the Study of Liver Diseases.

  19. The impact of portable high-efficiency particulate air filters on the incidence of invasive aspergillosis in a large acute tertiary-care hospital.

    Science.gov (United States)

    Abdul Salam, Zakir-Hussain; Karlin, Rubiyah Binte; Ling, Moi Lin; Yang, Kok Soong

    2010-05-01

    Worldwide, the frequency of invasive fungal infections has been increasing, with a corresponding increase in the numbers of high-risk patients. Exposure reduction through the use of high-efficiency particulate air (HEPA) filters has been the preferred primary preventive strategy for these high-risk patients. Although the efficiency and benefits of fixed HEPA filters is well proven, the benefits of portable HEPA filters are still inconclusive. This was a retrospective study to assess the impact of 48 portable HEPA filter units deployed in selected wards in Singapore General Hospital, an acute tertiary-care hospital in Singapore. Data were extracted between December 2005 and June 2008 on the diagnoses at discharge and microbiological and histological laboratory findings. All patients with possible, probable, or proven invasive aspergillosis (IA) were included. In wards with portable HEPA filters, the incidence rate of IA of 34.61/100,000 patient-days in the pre-installation period was reduced to 17.51/100,000 patient-days in the post-installation period (P = .01), for an incidence rate ratio of 1.98 (95% confidence interval [CI], 1.10-2.97). In wards with no HEPA filters, there was no significant change in the incidence rate during the study period. Portable HEPA filters were associated with an adjusted odds ratio of 0.49 (95% CI, 0.28-0.85; P = .01), adjusted for diagnosis and length of hospital stay. Portable HEPA filters are effective in the prevention of IA. The cost of widespread portable HEPA filtration in hospitals will be more than offset by the decreases in nosocomial infections in general and in IA in particular. Copyright (c) 2010 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Mosby, Inc. All rights reserved.
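    The headline incidence rate ratio can be reproduced directly from the two rates quoted in the record (a trivial arithmetic check; the confidence interval would require the underlying case counts, which the abstract does not give):

```python
pre_install = 34.61   # IA incidence per 100,000 patient-days, before HEPA units
post_install = 17.51  # incidence after portable HEPA units were deployed
irr = pre_install / post_install
print(round(irr, 2))  # 1.98
```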

  20. Pathogenesis of deep endometriosis.

    Science.gov (United States)

    Gordts, Stephan; Koninckx, Philippe; Brosens, Ivo

    2017-12-01

    The pathophysiology of (deep) endometriosis is still unclear. As originally suggested by Cullen, the definition "deeper than 5 mm" should be changed to "adenomyosis externa." With the discovery of the old European literature on uterine bleeding in 5%-10% of neonates and histologic evidence that the bleeding represents decidual shedding, it is hypothesized that endometrial stem/progenitor cells, implanted in the pelvic cavity after birth, may be at the origin of adolescent and even occasionally premenarcheal pelvic endometriosis. Endometriosis in the adolescent is characterized by angiogenic and hemorrhagic peritoneal and ovarian lesions. The development of deep endometriosis at a later age suggests that deep infiltrating endometriosis is a delayed stage of endometriosis. Another hypothesis is that the endometriotic cell has undergone genetic or epigenetic changes and that those specific changes determine the development into deep endometriosis. This is compatible with the hereditary aspects and with the clonality of deep and cystic ovarian endometriosis. It explains the predisposition and an eventual causal effect of dioxin or radiation. Specific genetic/epigenetic changes could explain the various expressions, and thus typical, cystic, and deep endometriosis become three different diseases. Subtle lesions are not a disease until (epi)genetic changes occur. A classification should reflect that deep endometriosis is a specific disease. In conclusion, the pathophysiology of deep endometriosis remains debated and the mechanisms of disease progression, as well as the role of genetics and epigenetics in the process, still need to be unraveled. Copyright © 2017 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  1. Macroanatomic, light, and electron microscopic examination of pecten oculi in the seagull (Larus canus).

    Science.gov (United States)

    Ince, Nazan Gezer; Onuk, Burcu; Kabak, Yonca Betil; Alan, Aydin; Kabak, Murat

    2017-07-01

    The present study was conducted to determine the macroanatomic characteristics of the pecten oculi, together with light and scanning electron microscopic (SEM) examination; a total of 20 bulbi oculi belonging to 10 seagulls (Larus canus) were used. The pecten oculi consisted of 18 to 21 pleats and resembled a snail shell in shape. The apical length of the pleats forming the pecten oculi averaged 5.77 ± 0.56 mm, the retina-attached base length averaged 9.01 ± 1.35 mm, and the height 6.4 ± 0.62 mm. In the pecten oculi, which extends up to 1/3 of the bulbus oculi, two different vascular formations were distinguished according to vessel diameter. Among these, the larger-diameter vessels, fewer in number, were classified as afferent and efferent vessels, while the smaller vessels, greater in number, were classified as capillaries. Furthermore, granules observed densely on the apical side of the pleats of the pecten oculi were found to be distributed randomly along the plica. © 2017 Wiley Periodicals, Inc.

  2. DeepSpark: A Spark-Based Distributed Deep Learning Framework for Commodity Clusters

    OpenAIRE

    Kim, Hanjoo; Park, Jaehong; Jang, Jaehee; Yoon, Sungroh

    2016-01-01

    The increasing complexity of deep neural networks (DNNs) has made it challenging to exploit existing large-scale data processing pipelines for handling massive data and parameters involved in DNN training. Distributed computing platforms and GPGPU-based acceleration provide a mainstream solution to this computational challenge. In this paper, we propose DeepSpark, a distributed and parallel deep learning framework that exploits Apache Spark on commodity clusters. To support parallel operation...

  3. Feasibility for the medium efficiency filter as a postfilter in the air cleaning unit

    International Nuclear Information System (INIS)

    Lim, H. S.; Jung, D. Y.; Byun, S. C.; Kim, S. H.

    2002-01-01

    The Air Cleaning Unit (ACU) is provided in a nuclear facility to filter radioactive materials from gaseous effluents released from the facility during normal operation and during a postulated accident. The ACU consists of pre-HEPA filters, a charcoal adsorber, post-HEPA filters, fans, etc. The charcoal filters keep on-site doses and off-site effluents ALARA, consistent with regulatory requirements. The function of the HEPA filter downstream of the charcoal (carbon) adsorber in the ACU is to catch potentially radioactive carbon dust and to serve as a backup in the event of failure of the upstream HEPA filter. Earlier revisions of the Regulatory Guide allowed only a HEPA filter downstream of the charcoal adsorber, but the current revision allows the use of 95% dust-spot-efficiency filters in lieu of HEPA filters downstream of the carbon adsorber. This paper describes the background of the filters, the current Regulatory Guide as revised by the United States Nuclear Regulatory Commission, and the feasibility of a medium-efficiency filter as the post-filter downstream of the carbon adsorber in the Air Cleaning Unit.

  4. Equivalent molecular mass of cytosolic and nuclear forms of Ah receptor from Hepa-1 cells determined by photoaffinity labeling with 2,3,7,8-[3H]tetrachlorodibenzo-p-dioxin

    International Nuclear Information System (INIS)

    Prokipcak, R.D.; Okey, A.B.

    1990-01-01

    The structure of the Ah receptor has previously been extensively characterized by reversible binding of the high-affinity ligand 2,3,7,8-tetrachlorodibenzo-p-dioxin. We report the use of [3H]2,3,7,8-tetrachlorodibenzo-p-dioxin as a photoaffinity ligand for the Ah receptor from the mouse hepatoma cell line Hepa-1c1c9. Both cytosolic and nuclear forms of the Ah receptor could be specifically photoaffinity-labeled, which allowed determination of molecular mass for the two forms under denaturing conditions. After analysis by fluorography of polyacrylamide gels run in the presence of sodium dodecyl sulfate, the molecular mass of the cytosolic form of the Ah receptor was estimated at 92,000 ± 4,300 and that of the nuclear form at 93,500 ± 3,400. Receptor in a mixture of cytosol and nuclear extract (each labeled separately with [3H]2,3,7,8-tetrachlorodibenzo-p-dioxin) migrated as a single band. These results are consistent with the presence of a common ligand-binding subunit of identical molecular mass in both cytosolic and nuclear complexes.

  5. DeepPVP: phenotype-based prioritization of causative variants using deep learning

    KAUST Repository

    Boudellioua, Imene; Kulmanov, Maxat; Schofield, Paul N; Gkoutos, Georgios V; Hoehndorf, Robert

    2018-01-01

    phenotype-based methods that use similar features. DeepPVP is freely available at https://github.com/bio-ontology-research-group/phenomenet-vp Conclusions: DeepPVP further improves on existing variant prioritization methods both in terms of speed as well

  6. DeepARG: a deep learning approach for predicting antibiotic resistance genes from metagenomic data.

    Science.gov (United States)

    Arango-Argoty, Gustavo; Garner, Emily; Pruden, Amy; Heath, Lenwood S; Vikesland, Peter; Zhang, Liqing

    2018-02-01

    Growing concerns about increasing rates of antibiotic resistance call for expanded and comprehensive global monitoring. Advancing methods for monitoring of environmental media (e.g., wastewater, agricultural waste, food, and water) is especially needed for identifying potential resources of novel antibiotic resistance genes (ARGs), hot spots for gene exchange, and as pathways for the spread of ARGs and human exposure. Next-generation sequencing now enables direct access and profiling of the total metagenomic DNA pool, where ARGs are typically identified or predicted based on the "best hits" of sequence searches against existing databases. Unfortunately, this approach produces a high rate of false negatives. To address such limitations, we propose here a deep learning approach, taking into account a dissimilarity matrix created using all known categories of ARGs. Two deep learning models, DeepARG-SS and DeepARG-LS, were constructed for short read sequences and full gene length sequences, respectively. Evaluation of the deep learning models over 30 antibiotic resistance categories demonstrates that the DeepARG models can predict ARGs with both high precision (> 0.97) and recall (> 0.90). The models displayed an advantage over the typical best hit approach, yielding consistently lower false negative rates and thus higher overall recall (> 0.9). As more data become available for under-represented ARG categories, the DeepARG models' performance can be expected to be further enhanced due to the nature of the underlying neural networks. Our newly developed ARG database, DeepARG-DB, encompasses ARGs predicted with a high degree of confidence and extensive manual inspection, greatly expanding current ARG repositories. The deep learning models developed here offer more accurate antimicrobial resistance annotation relative to current bioinformatics practice. DeepARG does not require strict cutoffs, which enables identification of a much broader diversity of ARGs. The
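
The precision and recall figures above can be reproduced from per-category prediction counts. A minimal sketch; the category names and read labels below are invented for illustration, not DeepARG data:

```python
def precision_recall(true_labels, predicted_labels, category):
    """Per-category precision and recall from paired label lists."""
    tp = sum(1 for t, p in zip(true_labels, predicted_labels)
             if t == category and p == category)
    fp = sum(1 for t, p in zip(true_labels, predicted_labels)
             if t != category and p == category)
    fn = sum(1 for t, p in zip(true_labels, predicted_labels)
             if t == category and p != category)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

# Toy example: reads with their true ARG category vs. model predictions.
truth = ["beta-lactam", "beta-lactam", "tetracycline", "beta-lactam"]
pred  = ["beta-lactam", "tetracycline", "tetracycline", "beta-lactam"]
p, r = precision_recall(truth, pred, "beta-lactam")
print(p, r)  # precision 1.0, recall 2/3
```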

  7. Stimulation Technologies for Deep Well Completions

    Energy Technology Data Exchange (ETDEWEB)

    None

    2003-09-30

    The Department of Energy (DOE) is sponsoring the Deep Trek Program targeted at improving the economics of drilling and completing deep gas wells. Under the DOE program, Pinnacle Technologies is conducting a study to evaluate the stimulation of deep wells. The objective of the project is to assess U.S. deep well drilling & stimulation activity, review rock mechanics & fracture growth in deep, high pressure/temperature wells and evaluate stimulation technology in several key deep plays. An assessment of historical deep gas well drilling activity and forecast of future trends was completed during the first six months of the project; this segment of the project was covered in Technical Project Report No. 1. The second progress report covers the next six months of the project during which efforts were primarily split between summarizing rock mechanics and fracture growth in deep reservoirs and contacting operators about case studies of deep gas well stimulation.

  8. 3-Nitrobenzanthrone and 3-aminobenzanthrone induce DNA damage and cell signalling in Hepa1c1c7 cells.

    Science.gov (United States)

    Landvik, N E; Arlt, V M; Nagy, E; Solhaug, A; Tekpli, X; Schmeiser, H H; Refsnes, M; Phillips, D H; Lagadic-Gossmann, D; Holme, J A

    2010-02-03

    3-Nitrobenzanthrone (3-NBA) is a mutagenic and carcinogenic environmental pollutant found in diesel exhaust and urban air pollution. In the present work we have characterised the effects of 3-NBA and its metabolite 3-aminobenzanthrone (3-ABA) on cell death and cytokine release in mouse hepatoma Hepa1c1c7 cells. These effects were related to induced DNA damage and changes in cell signalling pathways. 3-NBA resulted in cell death and caused most DNA damage as judged by the amount of DNA adducts ((32)P-postlabelling assay), single strand (ss)DNA breaks and oxidative DNA lesions (comet assay) detected. An increased phosphorylation of H2AX, chk1, chk2 and partly ATM was observed using flow cytometry and/or Western blotting. Both compounds increased phosphorylation of p53 and MAPKs (ERK, p38 and JNK). However, only 3-NBA caused an accumulation of p53 in the nucleus and a translocation of Bax to the mitochondria. The p53 inhibitor pifithrin-alpha inhibited 3-NBA-induced apoptosis, indicating that cell death was a result of the triggering of DNA signalling pathways. The highest phosphorylation of Akt and degradation of IkappaB-alpha (suggesting activation of NF-kappaB) were also seen after treatment with 3-NBA. In contrast 3-ABA increased IL-6 release, but caused little or no toxicity. Cytokine release was inhibited by PD98059 and curcumin, suggesting that ERK and NF-kappaB play a role in this process. In conclusion, 3-NBA seems to have a higher potency to induce DNA damage compatible with its cytotoxic effects, while 3-ABA seems to have a greater effect on the immune system. Copyright 2009 Elsevier B.V. All rights reserved.

  9. 3-Nitrobenzanthrone and 3-aminobenzanthrone induce DNA damage and cell signalling in Hepa1c1c7 cells

    Energy Technology Data Exchange (ETDEWEB)

    Landvik, N.E. [Division of Environmental Medicine, Norwegian Institute of Public Health, P.O. Box 404 Torshov N-4303 Oslo (Norway); Arlt, V.M.; Nagy, E. [Section of Molecular Carcinogenesis, Institute of Cancer Research, Brookes Lawley Building, Sutton, Surrey SM2 5NG (United Kingdom); Solhaug, A. [Section for Toxicology, Department of Feed and Food Safety, National Veterinary Institute Pb 750 Sentrum, N-0106 Oslo (Norway); Tekpli, X. [EA SeRAIC, Equipe labellisee Ligue contre le Cancer, IFR 140, Universite de Rennes 1, Rennes (France); Schmeiser, H.H. [Research Group Genetic Alteration in Carcinogenesis, German Cancer Research Center, Im Neuenheimer Feld 280, 69120 Heidelberg (Germany); Refsnes, M. [Division of Environmental Medicine, Norwegian Institute of Public Health, P.O. Box 404 Torshov N-4303 Oslo (Norway); Phillips, D.H. [Section of Molecular Carcinogenesis, Institute of Cancer Research, Brookes Lawley Building, Sutton, Surrey SM2 5NG (United Kingdom); Lagadic-Gossmann, D. [EA SeRAIC, Equipe labellisee Ligue contre le Cancer, IFR 140, Universite de Rennes 1, Rennes (France); Holme, J.A., E-mail: jorn.holme@fhi.no [Division of Environmental Medicine, Norwegian Institute of Public Health, P.O. Box 404 Torshov N-4303 Oslo (Norway)

    2010-02-03

    3-Nitrobenzanthrone (3-NBA) is a mutagenic and carcinogenic environmental pollutant found in diesel exhaust and urban air pollution. In the present work we have characterised the effects of 3-NBA and its metabolite 3-aminobenzanthrone (3-ABA) on cell death and cytokine release in mouse hepatoma Hepa1c1c7 cells. These effects were related to induced DNA damage and changes in cell signalling pathways. 3-NBA resulted in cell death and caused most DNA damage as judged by the amount of DNA adducts ((32)P-postlabelling assay), single strand (ss)DNA breaks and oxidative DNA lesions (comet assay) detected. An increased phosphorylation of H2AX, chk1, chk2 and partly ATM was observed using flow cytometry and/or Western blotting. Both compounds increased phosphorylation of p53 and MAPKs (ERK, p38 and JNK). However, only 3-NBA caused an accumulation of p53 in the nucleus and a translocation of Bax to the mitochondria. The p53 inhibitor pifithrin-alpha inhibited 3-NBA-induced apoptosis, indicating that cell death was a result of the triggering of DNA signalling pathways. The highest phosphorylation of Akt and degradation of IκB-α (suggesting activation of NF-κB) were also seen after treatment with 3-NBA. In contrast 3-ABA increased IL-6 release, but caused little or no toxicity. Cytokine release was inhibited by PD98059 and curcumin, suggesting that ERK and NF-κB play a role in this process. In conclusion, 3-NBA seems to have a higher potency to induce DNA damage compatible with its cytotoxic effects, while 3-ABA seems to have a greater effect on the immune system.

  10. DeepQA: improving the estimation of single protein model quality with deep belief networks.

    Science.gov (United States)

    Cao, Renzhi; Bhattacharya, Debswapna; Hou, Jie; Cheng, Jianlin

    2016-12-05

    Protein quality assessment (QA), useful for ranking and selecting protein models, has long been viewed as one of the major challenges for protein tertiary structure prediction. In particular, estimating the quality of a single protein model, which is important for selecting a few good models out of a large model pool consisting of mostly low-quality models, is still a largely unsolved problem. We introduce a novel single-model quality assessment method, DeepQA, based on a deep belief network that utilizes a number of selected features describing the quality of a model from different perspectives, such as energy, physio-chemical characteristics, and structural information. The deep belief network is trained on several large datasets consisting of models from the Critical Assessment of Protein Structure Prediction (CASP) experiments, several publicly available datasets, and models generated by our in-house ab initio method. Our experiments demonstrate that the deep belief network has better performance compared to Support Vector Machines and Neural Networks on the protein model quality assessment problem, and our method DeepQA achieves state-of-the-art performance on the CASP11 dataset. It also outperformed two well-established methods in selecting good outlier models from a large set of models of mostly low quality generated by ab initio modeling methods. DeepQA is a useful deep learning tool for protein single-model quality assessment and protein structure prediction. The source code, executable, documentation and training/test datasets of DeepQA for Linux are freely available to non-commercial users at http://cactus.rnet.missouri.edu/DeepQA/ .

  11. Stimulation Technologies for Deep Well Completions

    Energy Technology Data Exchange (ETDEWEB)

    Stephen Wolhart

    2005-06-30

    The Department of Energy (DOE) is sponsoring the Deep Trek Program targeted at improving the economics of drilling and completing deep gas wells. Under the DOE program, Pinnacle Technologies conducted a study to evaluate the stimulation of deep wells. The objective of the project was to review U.S. deep well drilling and stimulation activity, review rock mechanics and fracture growth in deep, high-pressure/temperature wells and evaluate stimulation technology in several key deep plays. This report documents results from this project.

  12. DeepPicker: A deep learning approach for fully automated particle picking in cryo-EM.

    Science.gov (United States)

    Wang, Feng; Gong, Huichao; Liu, Gaochao; Li, Meijing; Yan, Chuangye; Xia, Tian; Li, Xueming; Zeng, Jianyang

    2016-09-01

    Particle picking is a time-consuming step in single-particle analysis and often requires significant interventions from users, which has become a bottleneck for future automated electron cryo-microscopy (cryo-EM). Here we report a deep learning framework, called DeepPicker, to address this problem and fill the current gaps toward a fully automated cryo-EM pipeline. DeepPicker employs a novel cross-molecule training strategy to capture common features of particles from previously-analyzed micrographs, and thus does not require any human intervention during particle picking. Tests on the recently-published cryo-EM data of three complexes have demonstrated that our deep learning based scheme can successfully accomplish the human-level particle picking process and identify a sufficient number of particles that are comparable to those picked manually by human experts. These results indicate that DeepPicker can provide a practically useful tool to significantly reduce the time and manual effort spent in single-particle analysis and thus greatly facilitate high-resolution cryo-EM structure determination. DeepPicker is released as an open-source program, which can be downloaded from https://github.com/nejyeah/DeepPicker-python. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. DeepBase: annotation and discovery of microRNAs and other noncoding RNAs from deep-sequencing data.

    Science.gov (United States)

    Yang, Jian-Hua; Qu, Liang-Hu

    2012-01-01

    Recent advances in high-throughput deep-sequencing technology have produced large numbers of short and long RNA sequences and enabled the detection and profiling of known and novel microRNAs (miRNAs) and other noncoding RNAs (ncRNAs) at unprecedented sensitivity and depth. In this chapter, we describe the use of deepBase, a database that we have developed to integrate all public deep-sequencing data and to facilitate the comprehensive annotation and discovery of miRNAs and other ncRNAs from these data. deepBase provides an integrative, interactive, and versatile web graphical interface to evaluate miRBase-annotated miRNA genes and other known ncRNAs, explore the expression patterns of miRNAs and other ncRNAs, and discover novel miRNAs and other ncRNAs from deep-sequencing data. deepBase also provides a deepView genome browser to comparatively analyze these data at multiple levels. deepBase is available at http://deepbase.sysu.edu.cn/.

  14. Deep Borehole Disposal as an Alternative Concept to Deep Geological Disposal

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jongyoul; Lee, Minsoo; Choi, Heuijoo; Kim, Kyungsu [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    In this paper, the general concept and key technologies for deep borehole disposal of spent fuels or HLW, as an alternative to the mined geological disposal method, were reviewed. An analysis of the distance between boreholes for the disposal of HLW was then carried out. Based on the results, the disposal area was calculated approximately and compared with that of mined geological disposal. These results will be used as an input for the analyses of the applicability of DBD in Korea. The disposal safety of this system has been demonstrated with underground research laboratories, and some advanced countries such as Finland and Sweden are implementing their disposal projects at the commercial stage. However, if the spent fuels or high-level radioactive wastes can be disposed of at a depth of 3-5 km in a more stable rock formation, this offers several advantages. Therefore, as an alternative to the mined deep geological disposal concept (DGD), very deep borehole disposal (DBD) technology is under consideration in a number of countries in terms of its outstanding safety and cost effectiveness. In this paper, the general concept of deep borehole disposal for spent fuels or high-level radioactive wastes was reviewed. The key technologies, such as large-diameter borehole drilling, packaging and emplacement, sealing, and performance/safety analysis technologies, and their challenges in the development of a deep borehole disposal system were analyzed. Also, a very preliminary deep borehole disposal concept, including a disposal canister concept, was developed according to the nuclear environment in Korea.
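
The footprint comparison mentioned above comes down to grid arithmetic: area scales with the square of the borehole spacing. A toy sketch, where the borehole count and 200 m spacing are illustrative assumptions, not the paper's values:

```python
import math

def disposal_area_m2(n_boreholes, spacing_m):
    """Approximate footprint of boreholes laid out on a square grid,
    one borehole per spacing_m x spacing_m cell."""
    side = math.ceil(math.sqrt(n_boreholes))  # boreholes per row/column
    return (side * spacing_m) ** 2

# Illustrative comparison point: 100 boreholes at 200 m spacing.
area = disposal_area_m2(100, 200)
print(f"{area / 1e6:.1f} km^2")
```

Halving the required spacing quarters the footprint, which is why the borehole-distance analysis drives the area comparison with mined disposal.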

  15. Deep Borehole Disposal as an Alternative Concept to Deep Geological Disposal

    International Nuclear Information System (INIS)

    Lee, Jongyoul; Lee, Minsoo; Choi, Heuijoo; Kim, Kyungsu

    2016-01-01

    In this paper, the general concept and key technologies for deep borehole disposal of spent fuels or HLW, as an alternative to the mined geological disposal method, were reviewed. An analysis of the distance between boreholes for the disposal of HLW was then carried out. Based on the results, the disposal area was calculated approximately and compared with that of mined geological disposal. These results will be used as an input for the analyses of the applicability of DBD in Korea. The disposal safety of this system has been demonstrated with underground research laboratories, and some advanced countries such as Finland and Sweden are implementing their disposal projects at the commercial stage. However, if the spent fuels or high-level radioactive wastes can be disposed of at a depth of 3-5 km in a more stable rock formation, this offers several advantages. Therefore, as an alternative to the mined deep geological disposal concept (DGD), very deep borehole disposal (DBD) technology is under consideration in a number of countries in terms of its outstanding safety and cost effectiveness. In this paper, the general concept of deep borehole disposal for spent fuels or high-level radioactive wastes was reviewed. The key technologies, such as large-diameter borehole drilling, packaging and emplacement, sealing, and performance/safety analysis technologies, and their challenges in the development of a deep borehole disposal system were analyzed. Also, a very preliminary deep borehole disposal concept, including a disposal canister concept, was developed according to the nuclear environment in Korea

  16. Tumor necrosis factor-alpha potentiates the cytotoxicity of amiodarone in Hepa1c1c7 cells: roles of caspase activation and oxidative stress.

    Science.gov (United States)

    Lu, Jingtao; Miyakawa, Kazuhisa; Roth, Robert A; Ganey, Patricia E

    2013-01-01

    Amiodarone (AMD), a class III antiarrhythmic drug, causes idiosyncratic hepatotoxicity in human patients. We demonstrated previously that tumor necrosis factor-alpha (TNF-α) plays an important role in a rat model of AMD-induced hepatotoxicity under inflammatory stress. In this study, we developed a model in vitro to study the roles of caspase activation and oxidative stress in TNF potentiation of AMD cytotoxicity. AMD caused cell death in Hepa1c1c7 cells, and TNF cotreatment potentiated its toxicity. Activation of caspases 9 and 3/7 was observed in AMD/TNF-cotreated cells, and caspase inhibitors provided minor protection from cytotoxicity. Intracellular reactive oxygen species (ROS) generation and lipid peroxidation were observed after treatment with AMD and were further elevated by TNF cotreatment. Adding water-soluble antioxidants (trolox, N-acetylcysteine, glutathione, or ascorbate) produced only minor attenuation of AMD/TNF-induced cytotoxicity and did not influence the effect of AMD alone. On the other hand, α-tocopherol (TOCO), which reduced lipid peroxidation and ROS generation, prevented AMD toxicity and caused pronounced reduction in cytotoxicity from AMD/TNF cotreatment. α-TOCO plus a pancaspase inhibitor completely abolished AMD/TNF-induced cytotoxicity. In summary, activation of caspases and oxidative stress were observed after AMD/TNF cotreatment, and caspase inhibitors and a lipid-soluble free-radical scavenger attenuated AMD/TNF-induced cytotoxicity.

  17. Deep Mapping and Spatial Anthropology

    Directory of Open Access Journals (Sweden)

    Les Roberts

    2016-01-01

    This paper provides an introduction to the Humanities Special Issue on “Deep Mapping”. It sets out the rationale for the collection and explores the broad-ranging nature of perspectives and practices that fall within the “undisciplined” interdisciplinary domain of spatial humanities. Sketching a cross-current of ideas that have begun to coalesce around the concept of “deep mapping”, the paper argues that rather than attempting to outline a set of defining characteristics and “deep” cartographic features, a more instructive approach is to pay closer attention to the multivalent ways deep mapping is performatively put to work. Casting a critical and reflexive gaze over the developing discourse of deep mapping, it is argued that what deep mapping “is” cannot be reduced to the otherwise a-spatial and a-temporal fixity of the “deep map”. In this respect, as an undisciplined survey of this increasing expansive field of study and practice, the paper explores the ways in which deep mapping can engage broader discussion around questions of spatial anthropology.

  18. Deep Vein Thrombosis

    African Journals Online (AJOL)

    OWNER

    Deep Vein Thrombosis: Risk Factors and Prevention in Surgical Patients. Deep Vein ... preventable morbidity and mortality in hospitalized surgical patients. ... the elderly. It is very rare before the age ... depends on the risk level; therefore an ... but also in the post-operative period ... is continuing uncertainty regarding ...

  19. Hepatitis A vaccination coverage among adults 18-49 years traveling to a country of high or intermediate endemicity, United States.

    Science.gov (United States)

    Lu, Peng-Jun; Byrd, Kathy K; Murphy, Trudy V

    2013-05-01

    Since 1996, hepatitis A vaccine (HepA) has been recommended for adults at increased risk for infection including travelers to high or intermediate hepatitis A endemic countries. In 2009, travel outside the United States and Canada was the most common exposure nationally reported for persons with hepatitis A virus (HAV) infection. To assess HepA vaccination coverage among adults 18-49 years traveling to a country of high or intermediate endemicity in the United States, we analyzed data from the 2010 National Health Interview Survey (NHIS) to determine self-reported HepA vaccination coverage (≥1 dose) and series completion (≥2 dose) among persons 18-49 years who traveled, since 1995, to a country of high or intermediate HAV endemicity. Multivariable logistic regression and predictive marginal analyses were conducted to identify factors independently associated with HepA vaccine receipt. In 2010, approximately 36.6% of adults 18-49 years reported traveling to high or intermediate hepatitis A endemic countries; among this group unadjusted HepA vaccination coverage was 26.6% compared to 12.7% among non-travelers (P-values …). Travel status was an independent predictor of HepA coverage and series completion (both P-values …). Among travelers, HepA coverage and series completion (≥2 doses) were higher for travelers 18-25 years (prevalence ratios 2.3, 2.8, respectively, P-values …) and for travelers 26-39 years (prevalence ratios 1.5, 1.5, respectively, P-value …) than for travelers 40-49 years. Other characteristics independently associated with a higher likelihood of HepA receipt among travelers included Asian race/ethnicity, male sex, never having been married, having a high school or higher education, living in the western United States, and having a greater number of physician contacts or receipt of influenza vaccination in the previous year. HepB vaccination was excluded from the model because the significant correlation between receipt of HepA vaccination and HepB vaccination could distort the model
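
The unadjusted comparison above (26.6% coverage among travelers vs. 12.7% among non-travelers) corresponds to a crude prevalence ratio, sketched here from the abstract's point estimates (no adjustment, unlike the paper's regression model):

```python
def prevalence_ratio(p_exposed, p_unexposed):
    """Crude prevalence ratio of two proportions."""
    return p_exposed / p_unexposed

# HepA coverage: travelers vs. non-travelers (point estimates from the abstract).
pr = prevalence_ratio(0.266, 0.127)
print(f"travelers were {pr:.1f}x as likely to report HepA vaccination")
```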

  20. pDeep: Predicting MS/MS Spectra of Peptides with Deep Learning.

    Science.gov (United States)

    Zhou, Xie-Xuan; Zeng, Wen-Feng; Chi, Hao; Luo, Chunjie; Liu, Chao; Zhan, Jianfeng; He, Si-Min; Zhang, Zhifei

    2017-12-05

    In tandem mass spectrometry (MS/MS)-based proteomics, search engines rely on comparison between an experimental MS/MS spectrum and the theoretical spectra of the candidate peptides. Hence, accurate prediction of the theoretical spectra of peptides appears to be particularly important. Here, we present pDeep, a deep neural network-based model for the spectrum prediction of peptides. Using the bidirectional long short-term memory (BiLSTM), pDeep can predict higher-energy collisional dissociation, electron-transfer dissociation, and electron-transfer and higher-energy collision dissociation MS/MS spectra of peptides with >0.9 median Pearson correlation coefficients. Further, we showed that intermediate layer of the neural network could reveal physicochemical properties of amino acids, for example the similarities of fragmentation behaviors between amino acids. We also showed the potential of pDeep to distinguish extremely similar peptides (peptides that contain isobaric amino acids, for example, GG = N, AG = Q, or even I = L), which were very difficult to distinguish using traditional search engines.
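
The >0.9 median Pearson correlation quoted above scores a predicted fragment-intensity vector against the measured one. A self-contained sketch of that metric; the intensity values below are invented:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Toy predicted vs. observed fragment intensities for one spectrum:
predicted = [0.1, 0.5, 0.9, 0.3]
observed  = [0.12, 0.45, 0.95, 0.25]
print(round(pearson(predicted, observed), 3))
```

pDeep reports the median of this coefficient over many spectra; a value near 1 means the predicted peak pattern closely tracks the measured one.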

  1. Searching for prostate cancer by fully automated magnetic resonance imaging classification: deep learning versus non-deep learning.

    Science.gov (United States)

    Wang, Xinggang; Yang, Wei; Weinreb, Jeffrey; Han, Juan; Li, Qiubai; Kong, Xiangchuang; Yan, Yongluan; Ke, Zan; Luo, Bo; Liu, Tao; Wang, Liang

    2017-11-13

    Prostate cancer (PCa) is a major cause of death, documented since ancient times in Egyptian Ptolemaic mummy imaging. PCa detection is critical to personalized medicine and varies considerably under an MRI scan. 172 patients with 2,602 morphologic images (axial 2D T2-weighted imaging) of the prostate were obtained. A deep learning method using a deep convolutional neural network (DCNN) and a non-deep learning method using SIFT image features and bag-of-words (BoW), a representative approach for image recognition and analysis, were used to distinguish pathologically confirmed PCa patients from patients with prostate benign conditions (BCs) such as prostatitis or benign prostatic hyperplasia (BPH). In fully automated detection of PCa patients, deep learning had a statistically higher area under the receiver operating characteristics curve (AUC) than non-deep learning (P = 0.0007): the AUC was … for the deep learning method and 0.70 (95% CI 0.63-0.77) for the non-deep learning method, respectively. Our results suggest that deep learning with DCNN is superior to non-deep learning with SIFT image features and the BoW model for fully automated differentiation of PCa patients from prostate BCs patients. Our deep learning method is extensible to image modalities such as MR imaging, CT and PET of other organs.
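
The AUC used above to compare the two classifiers can be computed directly from scores via the rank (Mann-Whitney) formulation: the probability that a randomly chosen positive outscores a randomly chosen negative. A minimal sketch with invented scores:

```python
def auc(pos_scores, neg_scores):
    """AUC = P(positive score > negative score), ties counted as half."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Toy classifier scores: PCa patients (positives) vs. benign conditions.
pca    = [0.9, 0.8, 0.7, 0.4]
benign = [0.6, 0.3, 0.2, 0.1]
print(auc(pca, benign))  # 0.9375
```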

  2. Deep learning for image classification

    Science.gov (United States)

    McCoppin, Ryan; Rizki, Mateen

    2014-06-01

    This paper provides an overview of deep learning and introduces several subfields of deep learning, including a specific tutorial on convolutional neural networks. Traditional methods for learning image features are compared to deep learning techniques. In addition, we present our preliminary classification results, our basic implementation of a convolutional restricted Boltzmann machine on the Modified National Institute of Standards and Technology database (MNIST), and we explain how to use deep learning networks to assist in our development of a robust gender classification system.
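
The convolutional layer at the core of the networks discussed above is a sliding dot product over the image. A pure-Python valid-mode sketch for illustration (a teaching example, not the paper's implementation):

```python
def conv2d_valid(image, kernel):
    """Valid-mode 2D cross-correlation (the 'convolution' of CNN layers)."""
    kh, kw = len(kernel), len(kernel[0])
    oh = len(image) - kh + 1
    ow = len(image[0]) - kw + 1
    out = [[0.0] * ow for _ in range(oh)]
    for i in range(oh):
        for j in range(ow):
            out[i][j] = sum(image[i + di][j + dj] * kernel[di][dj]
                            for di in range(kh) for dj in range(kw))
    return out

# A 3x3 vertical-edge detector applied to a 4x4 image with a left/right split:
img = [[0, 0, 1, 1],
       [0, 0, 1, 1],
       [0, 0, 1, 1],
       [0, 0, 1, 1]]
edge = [[-1, 0, 1],
        [-1, 0, 1],
        [-1, 0, 1]]
print(conv2d_valid(img, edge))  # strong response everywhere the edge crosses
```

A CNN learns the kernel weights from data rather than using a hand-designed detector like the one here.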

  3. Deep learning for computational chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Goh, Garrett B. [Advanced Computing, Mathematics, and Data Division, Pacific Northwest National Laboratory, 902 Battelle Blvd Richland Washington 99354; Hodas, Nathan O. [Advanced Computing, Mathematics, and Data Division, Pacific Northwest National Laboratory, 902 Battelle Blvd Richland Washington 99354; Vishnu, Abhinav [Advanced Computing, Mathematics, and Data Division, Pacific Northwest National Laboratory, 902 Battelle Blvd Richland Washington 99354

    2017-03-08

    The rise and fall of artificial neural networks is well documented in the scientific literature of both computer science and computational chemistry. Yet almost two decades later, we are now seeing a resurgence of interest in deep learning, a machine learning algorithm based on “deep” neural networks. Within the last few years, we have seen the transformative impact of deep learning in the computer science domain, notably in speech recognition and computer vision, to the extent that the majority of practitioners in those fields are now regularly eschewing prior established models in favor of deep learning models. In this review, we provide an introductory overview of the theory of deep neural networks and their unique properties as compared to traditional machine learning algorithms used in cheminformatics. By providing an overview of the variety of emerging applications of deep neural networks, we highlight their ubiquity and broad applicability to a wide range of challenges in the field, including QSAR, virtual screening, protein structure modeling, QM calculations, materials synthesis and property prediction. In reviewing the performance of deep neural networks, we observed consistent outperformance of non-neural-network state-of-the-art models across disparate research topics, and deep neural network based models often exceeded the “glass ceiling” expectations of their respective tasks. Coupled with the maturity of GPU-accelerated computing for training deep neural networks and the exponential growth of chemical data on which to train these networks, we anticipate that deep learning algorithms will be a useful tool and may grow into a pivotal role for various challenges in the computational chemistry field.

  4. Deep learning for computational chemistry.

    Science.gov (United States)

    Goh, Garrett B; Hodas, Nathan O; Vishnu, Abhinav

    2017-06-15

    The rise and fall of artificial neural networks is well documented in the scientific literature of both computer science and computational chemistry. Yet almost two decades later, we are now seeing a resurgence of interest in deep learning, a machine learning algorithm based on multilayer neural networks. Within the last few years, we have seen the transformative impact of deep learning in many domains, particularly in speech recognition and computer vision, to the extent that the majority of expert practitioners in those fields are now regularly eschewing prior established models in favor of deep learning models. In this review, we provide an introductory overview of the theory of deep neural networks and their unique properties that distinguish them from traditional machine learning algorithms used in cheminformatics. By providing an overview of the variety of emerging applications of deep neural networks, we highlight their ubiquity and broad applicability to a wide range of challenges in the field, including quantitative structure activity relationship, virtual screening, protein structure prediction, quantum chemistry, materials design, and property prediction. In reviewing the performance of deep neural networks, we observed consistent outperformance of non-neural-network state-of-the-art models across disparate research topics, and deep neural network-based models often exceeded the "glass ceiling" expectations of their respective tasks. Coupled with the maturity of GPU-accelerated computing for training deep neural networks and the exponential growth of chemical data on which to train these networks, we anticipate that deep learning algorithms will be a valuable tool for computational chemistry. © 2017 Wiley Periodicals, Inc.

  5. What Really is Deep Learning Doing?

    OpenAIRE

    Xiong, Chuyu

    2017-01-01

    Deep learning has achieved a great success in many areas, from computer vision to natural language processing, to game playing, and much more. Yet, what deep learning is really doing is still an open question. There are a lot of works in this direction. For example, [5] tried to explain deep learning by group renormalization, and [6] tried to explain deep learning from the view of functional approximation. In order to address this very crucial question, here we see deep learning from perspect...

  6. Hepatitis A vaccination coverage among adults 18–49 years traveling to a country of high or intermediate endemicity, United States

    Science.gov (United States)

    Lu, Peng-jun; Byrd, Kathy K.; Murphy, Trudy V.

    2018-01-01

    Background Since 1996, hepatitis A vaccine (HepA) has been recommended for adults at increased risk for infection, including travelers to countries of high or intermediate hepatitis A endemicity. In 2009, travel outside the United States and Canada was the most common exposure nationally reported for persons with hepatitis A virus (HAV) infection. Objective To assess HepA vaccination coverage among adults 18–49 years in the United States traveling to a country of high or intermediate endemicity. Methods We analyzed data from the 2010 National Health Interview Survey (NHIS) to determine self-reported HepA vaccination coverage (≥1 dose) and series completion (≥2 doses) among persons 18–49 years who had traveled, since 1995, to a country of high or intermediate HAV endemicity. Multivariable logistic regression and predictive marginal analyses were conducted to identify factors independently associated with HepA vaccine receipt. Results In 2010, approximately 36.6% of adults 18–49 years reported traveling to high or intermediate hepatitis A endemic countries; among this group, unadjusted HepA vaccination coverage was 26.6%, compared with 12.7% among non-travelers. Although travel to a country of high or intermediate hepatitis A endemicity was associated with a higher likelihood of HepA vaccination in 2010 among adults 18–49 years, self-reported HepA vaccination coverage was low among adult travelers to these areas. Healthcare providers should ask about their patients' upcoming travel plans and recommend and offer travel-related vaccinations to their patients. PMID:23523408

  7. Taoism and Deep Ecology.

    Science.gov (United States)

    Sylvan, Richard; Bennett, David

    1988-01-01

    Contrasted are the philosophies of Deep Ecology and ancient Chinese Taoism. Discusses the cosmology, morality, lifestyle, views of power, politics, and environmental philosophy of each. Concludes that Deep Ecology could gain much from Taoism. (CW)

  8. deepTools2: a next generation web server for deep-sequencing data analysis.

    Science.gov (United States)

    Ramírez, Fidel; Ryan, Devon P; Grüning, Björn; Bhardwaj, Vivek; Kilpert, Fabian; Richter, Andreas S; Heyne, Steffen; Dündar, Friederike; Manke, Thomas

    2016-07-08

    We present an update to our Galaxy-based web server for processing and visualizing deeply sequenced data. Its core tool set, deepTools, allows users to perform complete bioinformatic workflows ranging from quality controls and normalizations of aligned reads to integrative analyses, including clustering and visualization approaches. Since we first described our deepTools Galaxy server in 2014, we have implemented new solutions for many requests from the community and our users. Here, we introduce significant enhancements and new tools to further improve data visualization and interpretation. deepTools continues to be open to all users and freely available as a web service at deeptools.ie-freiburg.mpg.de. The new deepTools2 suite can be easily deployed within any Galaxy framework via the toolshed repository, and we also provide source code for command-line usage under Linux and Mac OS X. A public and documented API for access to deepTools functionality is also available. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  9. Interactions of endosulfan and methoxychlor involving CYP3A4 and CYP2B6 in human HepaRG cells.

    Science.gov (United States)

    Savary, Camille C; Jossé, Rozenn; Bruyère, Arnaud; Guillet, Fabrice; Robin, Marie-Anne; Guillouzo, André

    2014-08-01

    Humans are usually exposed to several pesticides simultaneously; consequently, combined actions between pesticides themselves or between pesticides and other chemicals need to be addressed in the risk assessment. Many pesticides are efficient activators of pregnane X receptor (PXR) and/or constitutive androstane receptor (CAR), two major nuclear receptors that are also activated by other substrates. In the present work, we searched for interactions between endosulfan and methoxychlor, two organochlorine pesticides whose major routes of metabolism involve CAR- and PXR-regulated CYP3A4 and CYP2B6, and whose mechanisms of action in humans remain poorly understood. For this purpose, HepaRG cells were treated with both pesticides separately or in mixture for 24 hours or 2 weeks at concentrations relevant to human exposure levels. In combination they exerted synergistic cytotoxic effects. Whatever the duration of treatment, both compounds increased CYP3A4 and CYP2B6 mRNA levels while differently affecting the corresponding activities. Endosulfan exerted a direct reversible inhibition of CYP3A4 activity that was confirmed in human liver microsomes. By contrast, methoxychlor induced this activity. The effects of the mixture on CYP3A4 activity were equal to the sum of those of each individual compound, suggesting an additive effect of each pesticide. Although CYP2B6 activity was unchanged by endosulfan and increased by methoxychlor, no change was observed with their mixture, supporting an antagonistic effect. Altogether, our data suggest that the CAR and PXR activators endosulfan and methoxychlor can interact together and with other exogenous substrates in human hepatocytes. Their effects on CYP3A4 and CYP2B6 activities could have important consequences if extrapolated to the in vivo situation. Copyright © 2014 by The American Society for Pharmacology and Experimental Therapeutics.

  10. Is Multitask Deep Learning Practical for Pharma?

    Science.gov (United States)

    Ramsundar, Bharath; Liu, Bowen; Wu, Zhenqin; Verras, Andreas; Tudor, Matthew; Sheridan, Robert P; Pande, Vijay

    2017-08-28

    Multitask deep learning has emerged as a powerful tool for computational drug discovery. However, despite a number of preliminary studies, multitask deep networks have yet to be widely deployed in the pharmaceutical and biotech industries. This lack of acceptance stems from both software difficulties and lack of understanding of the robustness of multitask deep networks. Our work aims to resolve both of these barriers to adoption. We introduce a high-quality open-source implementation of multitask deep networks as part of the DeepChem open-source platform. Our implementation enables simple python scripts to construct, fit, and evaluate sophisticated deep models. We use our implementation to analyze the performance of multitask deep networks and related deep models on four collections of pharmaceutical data (three of which have not previously been analyzed in the literature). We split these data sets into train/valid/test using time and neighbor splits to test multitask deep learning performance under challenging conditions. Our results demonstrate that multitask deep networks are surprisingly robust and can offer strong improvement over random forests. Our analysis and open-source implementation in DeepChem provide an argument that multitask deep networks are ready for widespread use in commercial drug discovery.
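The central data issue such multitask networks face is sparse labeling: each compound is assayed against only a few targets. A minimal sketch of the usual remedy, masking unlabeled tasks out of the loss (illustrative code, not DeepChem's actual API):

```python
import numpy as np

# Sketch of a masked multitask loss: pred and labels are shaped
# (n_compounds, n_tasks); NaN marks assays that were never run.
def masked_mse(pred, labels):
    mask = ~np.isnan(labels)                   # True where a label exists
    diff = np.where(mask, pred - labels, 0.0)  # zero out unlabeled entries
    return (diff ** 2).sum() / mask.sum()      # average over observed labels only

pred = np.array([[0.5, 0.0],
                 [1.0, 1.0]])
labels = np.array([[1.0, np.nan],
                   [1.0, 0.0]])
print(masked_mse(pred, labels))  # (0.25 + 0 + 1) / 3 ≈ 0.4167
```

Because the gradient of masked entries is exactly zero, a single network can be trained on the union of all assays without imputing missing measurements.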

  11. Measurement and characterization of filtration efficiencies for prefilter materials used in aerosol filtration

    International Nuclear Information System (INIS)

    Sciortino, J.

    1991-01-01

    In applications where the filtration of large quantities of mixed (liquid and solid) aerosols is desired, a multistage filtration system is often employed. This system consists of a prefilter, a High Efficiency Particulate Air (HEPA) filter, and any number of specialized filters particular to the filtration application. The prefilter removes liquids and any large particles from the air stream, keeping them from prematurely loading the HEPA filter downstream. The HEPA filter removes at least 99.97% of particulates in the aerosol (its rating at the most-penetrating particle size, approximately 0.3 µm). The specialized filters downstream of the HEPA filter can be used to remove organic volatiles or other vapors. While the properties of HEPA filters have been extensively investigated, literature characterizing the prefilter is scarce. The purpose of this report is to characterize the efficiency of the prefilter as a function of particle size, nature of the particle (solid or liquid), and the gas flow rate across the face of the prefilter. 1 ref., 4 figs
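The efficiency-versus-particle-size measurement described reduces to comparing particle counts upstream and downstream of the filter in each size bin; a minimal sketch (function name and counts are illustrative):

```python
# Sketch: fractional collection efficiency per particle-size bin,
# eta = 1 - penetration = 1 - (downstream count / upstream count).
def efficiency(upstream_counts, downstream_counts):
    return [1.0 - down / up
            for up, down in zip(upstream_counts, downstream_counts)]

# Example: a stage passing 3 of 10000 particles in one size bin has
# efficiency 0.9997 (99.97%, the familiar HEPA rating).
eta = efficiency([10000, 5000], [3, 10])
print([round(e, 4) for e in eta])  # [0.9997, 0.998]
```

Repeating this ratio across size bins and flow rates yields exactly the efficiency curves the report sets out to characterize.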

  12. DeepGait: A Learning Deep Convolutional Representation for View-Invariant Gait Recognition Using Joint Bayesian

    Directory of Open Access Journals (Sweden)

    Chao Li

    2017-02-01

    Full Text Available Human gait, as a soft biometric, helps to recognize people by their walking. To further improve recognition performance, we propose a novel video-sensor-based gait representation, DeepGait, using deep convolutional features, and introduce Joint Bayesian to model view variance. DeepGait is generated by using a pre-trained "very deep" network, "D-Net" (VGG-D), without any fine-tuning. In the non-view setting, DeepGait outperforms hand-crafted representations (e.g., Gait Energy Image, Frequency-Domain Feature, and Gait Flow Image). Furthermore, in the cross-view setting, the 256-dimensional DeepGait after PCA significantly outperforms the state-of-the-art methods on the OU-ISIR large population (OULP) dataset. The OULP dataset, which includes 4007 subjects, makes our result statistically reliable.
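The reduction from raw "very deep" network activations to the 256-dimensional DeepGait descriptor is a standard PCA projection. A generic sketch of that step with toy data (dimensions and names are illustrative, not the OULP pipeline):

```python
import numpy as np

# Sketch: PCA-reduce high-dimensional deep features (rows = samples).
def pca_reduce(X, k):
    Xc = X - X.mean(axis=0)                        # center each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                           # project onto top-k components

rng = np.random.default_rng(0)
feats = rng.normal(size=(300, 1024))               # stand-in for pooled conv features
reduced = pca_reduce(feats, 64)
print(reduced.shape)  # (300, 64)
```

The compressed descriptors are then what a metric model such as Joint Bayesian compares across viewing angles.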

  13. Invited talk: Deep Learning Meets Physics

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    Deep Learning has emerged as one of the most successful fields of machine learning and artificial intelligence with overwhelming success in industrial speech, text and vision benchmarks. Consequently it evolved into the central field of research for IT giants like Google, facebook, Microsoft, Baidu, and Amazon. Deep Learning is founded on novel neural network techniques, the recent availability of very fast computers, and massive data sets. In its core, Deep Learning discovers multiple levels of abstract representations of the input. The main obstacle to learning deep neural networks is the vanishing gradient problem. The vanishing gradient impedes credit assignment to the first layers of a deep network or to early elements of a sequence, therefore limits model selection. Major advances in Deep Learning can be related to avoiding the vanishing gradient like stacking, ReLUs, residual networks, highway networks, and LSTM. For Deep Learning, we suggested self-normalizing neural networks (SNNs) which automatica...
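The self-normalizing behavior mentioned at the (truncated) end rests on the SELU activation, whose two fixed constants drive layer activations toward zero mean and unit variance. A minimal implementation (constants as published by Klambauer et al., 2017):

```python
import numpy as np

# SELU activation: scale * x for x > 0, scale * alpha * (exp(x) - 1) otherwise.
# The constants are chosen so repeated application keeps activations near
# zero mean / unit variance, mitigating vanishing gradients in deep nets.
ALPHA = 1.6732632423543772
SCALE = 1.0507009873554805

def selu(x):
    return SCALE * np.where(x > 0, x, ALPHA * (np.exp(x) - 1.0))

x = np.array([-2.0, 0.0, 2.0])
print(selu(x))  # negative inputs saturate toward -SCALE * ALPHA ≈ -1.7581
```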

  14. Deep geothermics

    International Nuclear Information System (INIS)

    Anon.

    1995-01-01

    The hot dry rocks located at 3-4 km depth correspond to low-permeability rocks carrying a large amount of heat. The extraction of this heat usually requires artificial hydraulic fracturing of the rock to increase its permeability before water injection. Hot-dry-rock geothermics, or deep geothermics, is not yet a commercial industry but a scientific and technological research field. The Soultz-sous-Forets site (Northern Alsace, France) is characterized by a geothermal gradient of about 6 degrees per 100 m and is used as a natural laboratory for deep geothermal and geological studies in the framework of a European research program. Two boreholes have been drilled to 3600 m depth in the highly-fractured granite massif beneath the site. The aim is to create a deep heat exchanger using only the natural fracturing for water transfer. A consortium of German, French and Italian industrial companies (Pfalzwerke, Badenwerk, EdF and Enel) has been created for a more active participation in the pilot phase. (J.S.). 1 fig., 2 photos

  15. Stable architectures for deep neural networks

    Science.gov (United States)

    Haber, Eldad; Ruthotto, Lars

    2018-01-01

    Deep neural networks have become invaluable tools for supervised machine learning, e.g. classification of text or images. While often offering superior results over traditional techniques and successfully expressing complicated patterns in data, deep architectures are known to be challenging to design and train such that they generalize well to new data. Critical issues with deep architectures are numerical instabilities in derivative-based learning algorithms commonly called exploding or vanishing gradients. In this paper, we propose new forward propagation techniques inspired by systems of ordinary differential equations (ODE) that overcome this challenge and lead to well-posed learning problems for arbitrarily deep networks. The backbone of our approach is our interpretation of deep learning as a parameter estimation problem of nonlinear dynamical systems. Given this formulation, we analyze stability and well-posedness of deep learning and use this new understanding to develop new network architectures. We relate the exploding and vanishing gradient phenomenon to the stability of the discrete ODE and present several strategies for stabilizing deep learning for very deep networks. While our new architectures restrict the solution space, several numerical experiments show their competitiveness with state-of-the-art networks.
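The ODE interpretation identifies a residual layer with one forward-Euler step, Y_{j+1} = Y_j + h·σ(Y_j K_j + b_j), so stability hinges on the step size h and the spectra of the layer matrices K_j. A toy sketch of that forward propagation (all names and sizes are illustrative):

```python
import numpy as np

# Sketch: residual forward propagation as forward Euler on the ODE
#   dY/dt = tanh(Y @ K + b),   i.e.   Y_{j+1} = Y_j + h * tanh(Y_j @ K_j + b_j).
def forward_euler_net(Y, weights, biases, h=0.1):
    for K, b in zip(weights, biases):
        Y = Y + h * np.tanh(Y @ K + b)
    return Y

rng = np.random.default_rng(1)
depth, width = 50, 8
Ks = [rng.normal(scale=0.5, size=(width, width)) for _ in range(depth)]
bs = [np.zeros(width) for _ in range(depth)]
Y0 = rng.normal(size=(4, width))                   # 4 examples, 8 features
Y = forward_euler_net(Y0, Ks, bs)
print(np.isfinite(Y).all())                        # small h keeps the pass stable
```

Because tanh is bounded, each step changes an activation by at most h, which is the discrete analogue of the well-posedness the paper seeks for arbitrarily deep networks.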

  16. Deep Seawater Intrusion Enhanced by Geothermal Through Deep Faults in Xinzhou Geothermal Field in Guangdong, China

    Science.gov (United States)

    Lu, G.; Ou, H.; Hu, B. X.; Wang, X.

    2017-12-01

    This study investigates anomalous seawater intrusion from depth, riding an inland-directed deep groundwater flow that is enhanced by deep faults and geothermal processes. The study site, the Xinzhou geothermal field, lies 20 km from the coastline on the coast of Guangdong in southern China, part of China's long coastal geothermal belt. The geothermal water is salty, which has fueled speculation that it is retained ancient seawater. However, the perpetual "pumping" by the self-flowing discharge of geothermal waters may alter the deep underground flow in a way that favors large-scale, long-distance seawater intrusion. We studied the geochemical characteristics of the geothermal water and found it to be a mixture of seawater with rain water or pore water, with no indication of dilution. Numerical studies of the buoyancy-driven geothermal flow show that, at depths of over a thousand meters, the hydraulic gradient favors inland-directed groundwater flow, allowing seawater to intrude inland over an unusually long distance of tens of kilometers in a granitic groundwater flow system. This work is a first step in understanding the geo-environment of deep groundwater flow.

  17. Deep Learning in Neuroradiology.

    Science.gov (United States)

    Zaharchuk, G; Gong, E; Wintermark, M; Rubin, D; Langlotz, C P

    2018-02-01

    Deep learning is a form of machine learning using a convolutional neural network architecture that shows tremendous promise for imaging applications. It is increasingly being adapted from its original demonstration in computer vision applications to medical imaging. Because of the high volume and wealth of multimodal imaging information acquired in typical studies, neuroradiology is poised to be an early adopter of deep learning. Compelling deep learning research applications have been demonstrated, and their use is likely to grow rapidly. This review article describes the reasons, outlines the basic methods used to train and test deep learning models, and presents a brief overview of current and potential clinical applications with an emphasis on how they are likely to change future neuroradiology practice. Facility with these methods among neuroimaging researchers and clinicians will be important to channel and harness the vast potential of this new method. © 2018 by American Journal of Neuroradiology.

  18. Temperature impacts on deep-sea biodiversity.

    Science.gov (United States)

    Yasuhara, Moriaki; Danovaro, Roberto

    2016-05-01

    Temperature is considered to be a fundamental factor controlling biodiversity in marine ecosystems, but precisely what role temperature plays in modulating diversity is still not clear. The deep ocean, lacking light and in situ photosynthetic primary production, is an ideal model system to test the effects of temperature changes on biodiversity. Here we synthesize current knowledge on temperature-diversity relationships in the deep sea. Our results from both present and past deep-sea assemblages suggest that, when a wide range of deep-sea bottom-water temperatures is considered, a unimodal relationship exists between temperature and diversity (that may be right skewed). It is possible that temperature is important only when at relatively high and low levels but does not play a major role in the intermediate temperature range. Possible mechanisms explaining the temperature-biodiversity relationship include the physiological-tolerance hypothesis, the metabolic hypothesis, island biogeography theory, or some combination of these. The possible unimodal relationship discussed here may allow us to identify tipping points at which on-going global change and deep-water warming may increase or decrease deep-sea biodiversity. Predicted changes in deep-sea temperatures due to human-induced climate change may have more adverse consequences than expected considering the sensitivity of deep-sea ecosystems to temperature changes. © 2014 Cambridge Philosophical Society.

  19. Extreme Longevity in Proteinaceous Deep-Sea Corals

    Energy Technology Data Exchange (ETDEWEB)

    Roark, E B; Guilderson, T P; Dunbar, R B; Fallon, S J; Mucciarone, D A

    2009-02-09

    Deep-sea corals are found on hard substrates on seamounts and continental margins worldwide at depths of 300 to ~3000 meters. Deep-sea coral communities are hotspots of deep ocean biomass and biodiversity, providing critical habitat for fish and invertebrates. Newly applied radiocarbon age data from the deep-water proteinaceous corals Gerardia sp. and Leiopathes glaberrima show that radial growth rates are as low as 4 to 35 µm yr⁻¹ and that individual colony longevities are on the order of thousands of years. The management and conservation of deep-sea coral communities is challenged by their commercial harvest for the jewelry trade and damage caused by deep-water fishing practices. In light of their unusual longevity, a better understanding of deep-sea coral ecology and their interrelationships with associated benthic communities is needed to inform coherent international conservation strategies for these important deep-sea ecosystems.

  20. New optimized drill pipe size for deep-water, extended reach and ultra-deep drilling

    Energy Technology Data Exchange (ETDEWEB)

    Jellison, Michael J.; Delgado, Ivanni [Grant Prideco, Inc., Hoston, TX (United States); Falcao, Jose Luiz; Sato, Ademar Takashi [PETROBRAS, Rio de Janeiro, RJ (Brazil); Moura, Carlos Amsler [Comercial Perfuradora Delba Baiana Ltda., Rio de Janeiro, RJ (Brazil)

    2004-07-01

    A new drill pipe size, 5-7/8 in. OD, represents enabling technology for Extended Reach Drilling (ERD), deep water and other deep well applications. Most world-class ERD and deep water wells have traditionally been drilled with 5-1/2 in. drill pipe or a combination of 6-5/8 in. and 5-1/2 in. drill pipe. The hydraulic performance of 5-1/2 in. drill pipe can be a major limitation in substantial ERD and deep water wells resulting in poor cuttings removal, slower penetration rates, diminished control over well trajectory and more tendency for drill pipe sticking. The 5-7/8 in. drill pipe provides a significant improvement in hydraulic efficiency compared to 5-1/2 in. drill pipe and does not suffer from the disadvantages associated with use of 6-5/8 in. drill pipe. It represents a drill pipe assembly that is optimized dimensionally and on a performance basis for casing and bit programs that are commonly used for ERD, deep water and ultra-deep wells. The paper discusses the engineering philosophy behind 5-7/8 in. drill pipe, the design challenges associated with development of the product and reviews the features and capabilities of the second-generation double-shoulder connection. The paper provides drilling case history information on significant projects where the pipe has been used and details results achieved with the pipe. (author)

  1. Deep Reinforcement Learning: An Overview

    OpenAIRE

    Li, Yuxi

    2017-01-01

    We give an overview of recent exciting achievements of deep reinforcement learning (RL). We discuss six core elements, six important mechanisms, and twelve applications. We start with background of machine learning, deep learning and reinforcement learning. Next we discuss core RL elements, including value function, in particular, Deep Q-Network (DQN), policy, reward, model, planning, and exploration. After that, we discuss important mechanisms for RL, including attention and memory, unsuperv...

  2. DeepMirTar: a deep-learning approach for predicting human miRNA targets.

    Science.gov (United States)

    Wen, Ming; Cong, Peisheng; Zhang, Zhimin; Lu, Hongmei; Li, Tonghua

    2018-06-01

    MicroRNAs (miRNAs) are small noncoding RNAs that function in RNA silencing and post-transcriptional regulation of gene expression by targeting messenger RNAs (mRNAs). Because the underlying mechanisms associated with miRNA binding to mRNA are not fully understood, a major challenge of miRNA studies involves the identification of miRNA-target sites on mRNA. In silico prediction of miRNA-target sites can expedite costly and time-consuming experimental work by providing the most promising miRNA-target-site candidates. In this study, we reported the design and implementation of DeepMirTar, a deep-learning-based approach for accurately predicting human miRNA targets at the site level. The predicted miRNA-target sites are those having canonical or non-canonical seed, and features, including high-level expert-designed, low-level expert-designed, and raw-data-level, were used to represent the miRNA-target site. Comparison with other state-of-the-art machine-learning methods and existing miRNA-target-prediction tools indicated that DeepMirTar improved overall predictive performance. DeepMirTar is freely available at https://github.com/Bjoux2/DeepMirTar_SdA. lith@tongji.edu.cn, hongmeilu@csu.edu.cn. Supplementary data are available at Bioinformatics online.

  3. Deep Unfolding for Topic Models.

    Science.gov (United States)

    Chien, Jen-Tzung; Lee, Chao-Hsi

    2018-02-01

    Deep unfolding provides an approach to integrate the probabilistic generative models and the deterministic neural networks. Such an approach is benefited by deep representation, easy interpretation, flexible learning and stochastic modeling. This study develops the unsupervised and supervised learning of deep unfolded topic models for document representation and classification. Conventionally, the unsupervised and supervised topic models are inferred via the variational inference algorithm where the model parameters are estimated by maximizing the lower bound of logarithm of marginal likelihood using input documents without and with class labels, respectively. The representation capability or classification accuracy is constrained by the variational lower bound and the tied model parameters across inference procedure. This paper aims to relax these constraints by directly maximizing the end performance criterion and continuously untying the parameters in learning process via deep unfolding inference (DUI). The inference procedure is treated as the layer-wise learning in a deep neural network. The end performance is iteratively improved by using the estimated topic parameters according to the exponentiated updates. Deep learning of topic models is therefore implemented through a back-propagation procedure. Experimental results show the merits of DUI with increasing number of layers compared with variational inference in unsupervised as well as supervised topic models.

  4. Docker Containers for Deep Learning Experiments

    OpenAIRE

    Gerke, Paul K.

    2017-01-01

    Deep learning is a powerful tool to solve problems in the area of image analysis. The dominant compute platform for deep learning is Nvidia's proprietary CUDA, which can only be used together with Nvidia graphics cards. The nvidia-docker project allows exposing Nvidia graphics cards to docker containers and thus makes it possible to run deep learning experiments in docker containers. In our department, we use deep learning to solve problems in the area of medical image analysis and use docker ...
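The pattern described, a CUDA base image with the deep learning stack baked into a container, might look like the following Dockerfile sketch (image tag, packages, and `train.py` are illustrative placeholders, not the authors' setup):

```dockerfile
# Illustrative only: tags and package versions are placeholders.
FROM nvidia/cuda:11.8.0-cudnn8-runtime-ubuntu22.04

RUN apt-get update && apt-get install -y --no-install-recommends \
        python3 python3-pip \
    && rm -rf /var/lib/apt/lists/*
RUN pip3 install --no-cache-dir torch torchvision

WORKDIR /experiment
COPY train.py .
CMD ["python3", "train.py"]
```

On modern Docker the GPUs are exposed with `docker run --gpus all <image>`; the nvidia-docker wrapper described in the record achieved the same on earlier versions.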

  5. Auxiliary Deep Generative Models

    DEFF Research Database (Denmark)

    Maaløe, Lars; Sønderby, Casper Kaae; Sønderby, Søren Kaae

    2016-01-01

    Deep generative models parameterized by neural networks have recently achieved state-of-the-art performance in unsupervised and semi-supervised learning. We extend deep generative models with auxiliary variables which improve the variational approximation. The auxiliary variables leave the generative model unchanged but make the variational distribution more expressive. Inspired by the structure of the auxiliary variable we also propose a model with two stochastic layers and skip connections. Our findings suggest that more expressive and properly specified deep generative models converge faster with better results. We show state-of-the-art performance within semi-supervised learning on MNIST (0.96%), SVHN (16.61%) and NORB (9.40%) datasets.

  6. Accelerating Deep Learning with Shrinkage and Recall

    OpenAIRE

    Zheng, Shuai; Vishnu, Abhinav; Ding, Chris

    2016-01-01

    Deep Learning is a very powerful machine learning model. Deep Learning trains a large number of parameters across multiple layers and is very slow when the data are large-scale and the architecture is large. Inspired by the shrinking technique used to accelerate the computation of the Support Vector Machine (SVM) algorithm and the screening technique used in LASSO, we propose a shrinking Deep Learning with recall (sDLr) approach to speed up deep learning computation. We experiment shrinking Deep Lea...

  7. Consolidated Deep Actor Critic Networks (DRAFT)

    NARCIS (Netherlands)

    Van der Laan, T.A.

    2015-01-01

    The works [Volodymyr et al. Playing atari with deep reinforcement learning. arXiv preprint arXiv:1312.5602, 2013.] and [Volodymyr et al. Human-level control through deep reinforcement learning. Nature, 518(7540):529–533, 2015.] have demonstrated the power of combining deep neural networks with
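The DQN cited above bootstraps its regression targets as y = r + γ·max_a′ Q(s′, a′), with terminal transitions bootstrapping to zero. A numpy sketch of that target computation (illustrative, not the papers' code):

```python
import numpy as np

# Sketch of DQN-style TD targets:
#   y_i = r_i + gamma * (1 - done_i) * max_a' Q_target(s'_i, a').
def td_targets(rewards, q_next, done, gamma=0.99):
    return rewards + gamma * (1.0 - done) * q_next.max(axis=1)

rewards = np.array([1.0, 0.0])
q_next = np.array([[0.5, 2.0],     # non-terminal transition
                   [3.0, 1.0]])    # terminal transition (done = 1)
done = np.array([0.0, 1.0])
print(td_targets(rewards, q_next, done))  # [2.98, 0.0]
```

The deep network is then regressed toward these targets; an actor-critic variant replaces the max with the value under the current policy.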

  8. Deep Galaxy: Classification of Galaxies based on Deep Convolutional Neural Networks

    OpenAIRE

    Khalifa, Nour Eldeen M.; Taha, Mohamed Hamed N.; Hassanien, Aboul Ella; Selim, I. M.

    2017-01-01

    In this paper, a deep convolutional neural network architecture for galaxy classification is presented. A galaxy can be classified based on its features into three main categories: Elliptical, Spiral, and Irregular. The proposed deep galaxy architecture consists of 8 layers: one main convolutional layer for feature extraction with 96 filters, followed by two principal fully connected layers for classification. It is trained over 1356 images and achieved 97.272% testing accuracy. A c...

  9. Ventilation design modifications at Los Alamos Scientific Laboratory major plutonium operational areas

    International Nuclear Information System (INIS)

    Stafford, R.G.; Gallimore, J.C. Jr.; Mitchell, R.N.; Maraman, W.J.; McNeese, W.D.

    1975-01-01

    Major ventilation design modifications in plutonium operational areas at Los Alamos have occurred during the past two years. An additional stage of HEPA filters has been added to the DP West glove-box process exhaust, resulting in significant effluent reductions. The additional stage of HEPA filters is unique in that each filter may be individually DOP tested. Radiological filter efficiencies of each process exhaust stage are presented. DP West room air ventilation systems have been modified to incorporate a single stage of HEPA filters in contrast to the previous American Air Filter PL-24 filtration system. Plutonium effluent reductions of 10² to 10³ have resulted from these new systems. Modified DOP testing procedures for room air filtration systems are discussed. Major plutonium areas of the CMR Building utilizing Aerosolve 95 process exhaust filtration systems have been upgraded with two stages of HEPA filters. Significant reductions in effluent are evident. A unique method of DOP testing each bank of HEPA filters is discussed. Radiological efficiencies of both single- and two-stage filters are discussed. (U.S.)

  10. Deep Feature Consistent Variational Autoencoder

    OpenAIRE

    Hou, Xianxu; Shen, Linlin; Sun, Ke; Qiu, Guoping

    2016-01-01

    We present a novel method for constructing Variational Autoencoder (VAE). Instead of using pixel-by-pixel loss, we enforce deep feature consistency between the input and the output of a VAE, which ensures the VAE's output to preserve the spatial correlation characteristics of the input, thus leading the output to have a more natural visual appearance and better perceptual quality. Based on recent deep learning works such as style transfer, we employ a pre-trained deep convolutional neural net...
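The core idea is to replace the pixel-by-pixel loss with a distance in the feature space of a fixed pretrained network Φ. A toy sketch where a frozen random two-layer map stands in for the pretrained convolutional layers (the real method uses VGG features):

```python
import numpy as np

# Sketch: deep-feature-consistency loss  sum_l ||phi_l(x) - phi_l(x_hat)||^2,
# with a fixed random linear-ReLU stack standing in for pretrained conv layers.
rng = np.random.default_rng(2)
layers = [rng.normal(size=(64, 32)), rng.normal(size=(32, 16))]  # frozen weights

def features(x):
    feats = []
    for W in layers:
        x = np.maximum(x @ W, 0.0)   # "pretrained" layer + ReLU
        feats.append(x)
    return feats

def feature_loss(x, x_hat):
    return sum(((a - b) ** 2).sum() for a, b in zip(features(x), features(x_hat)))

x = rng.normal(size=(1, 64))
print(feature_loss(x, x))            # identical inputs give zero loss
```

Training the VAE against `feature_loss` instead of squared pixel error is what preserves the spatial correlation structure the abstract describes.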

  11. EXPERIENCE OF ORNITHINE ASPARTATE (HEPA-MERZ AND PROBIOTICS BIOFLORUM FORTE IN THE TREATMENT OF NON-SEVERE FORMS OF ALCOHOLIC AND NON-ALCOHOLIC FATTY LIVER DISEASE

    Directory of Open Access Journals (Sweden)

    L. Yu. Ilchenko

    2016-01-01

    Full Text Available Aim: to evaluate the efficacy and tolerability of ornithine aspartate, the probiotic Bioflorum Forte, and their combination in patients with steatosis and steatohepatitis due to alcoholic and non-alcoholic fatty liver disease. Materials and methods. An open, randomized, comparative clinical study included 30 outpatients and inpatients with a diagnosis of steatosis or steatohepatitis. We analyzed clinical symptoms and the functional state of the liver. Questionnaires (the LeGo grid and the post-intoxication alcohol syndrome questionnaire) established the presence of chronic alcohol intoxication. The number connection test was used to characterize cognitive function and to detect minimal hepatic encephalopathy. Quality of life was assessed with the questionnaire for patients with chronic liver disease, CLDQ (the chronic liver disease questionnaire). The duration of treatment was 4 weeks. Results: all three treatment regimens demonstrated therapeutic efficacy: clinical improvement, recovery of liver function, and improved results on cognitive testing. Combined therapy also produced a significant improvement in patients' quality of life. The treatments were safe and well tolerated; no adverse events were reported. Conclusion: the results obtained allow us to recommend the use of ornithine aspartate (Hepa-Merz, both as monotherapy and as part of complex therapy of steatosis and steatohepatitis with the probiotic Bioflorum Forte, in patients with alcoholic and non-alcoholic fatty liver disease.

  12. Identifying Strategies Programs Adopt to Meet Healthy Eating and Physical Activity Standards in Afterschool Programs.

    Science.gov (United States)

    Weaver, Robert G; Moore, Justin B; Turner-McGrievy, Brie; Saunders, Ruth; Beighle, Aaron; Khan, M Mahmud; Chandler, Jessica; Brazendale, Keith; Randell, Allison; Webster, Collin; Beets, Michael W

    2017-08-01

    The YMCA of the USA has adopted Healthy Eating and Physical Activity (HEPA) Standards for its afterschool programs (ASPs). Little is known about the strategies YMCA ASPs are implementing to achieve the Standards, or about these strategies' effectiveness. Aims: (1) identify strategies implemented in YMCA ASPs and (2) evaluate the relationship between strategy implementation and meeting the Standards. HEPA was measured via accelerometer (moderate-to-vigorous physical activity [MVPA]) and direct observation (snacks served) in 20 ASPs. Strategies were identified and mapped onto a capacity-building framework (Strategies To Enhance Practice [STEPs]). Mixed-effects regression estimated increases in HEPA outcomes as implementation increased. Model-implied estimates were calculated for high (i.e., highest implementation score achieved), moderate (median implementation score across programs), and low (lowest implementation score achieved) implementation, for each HEPA outcome separately. Programs implemented a variety of strategies identified in STEPs. For every 1-point increase in implementation score, 1.45% (95% confidence interval = 0.33% to 2.55%, p ≤ .001) more girls accumulated 30 min/day of MVPA, and fruits and/or vegetables were served on 0.11 more days (95% confidence interval = 0.11-0.45, p ≤ .01). Relationships between implementation and other HEPA outcomes did not reach statistical significance. Still, regression estimates indicated that desserts are served on 1.94 fewer days (i.e., 0.40 vs. 2.34) in the highest-implementing program than in the lowest-implementing program, and water is served on 0.73 more days (i.e., 2.37 vs. 1.64). Adopting HEPA Standards at the national level does not lead to changes in routine practice in all programs. Practical strategies that programs could adopt to more fully comply with the HEPA Standards are identified.

  13. Research in Hospitality Management - Vol 6, No 1 (2016)

    African Journals Online (AJOL)

    Sustainable tourism development and the world heritage status of the Wadden Sea: ... Employees, sustainability and motivation: Increasing employee engagement by ... Food waste reduction at Restaurant De Pleats: Small steps for mankind ...

  14. How Stressful Is "Deep Bubbling"?

    Science.gov (United States)

    Tyrmi, Jaana; Laukkanen, Anne-Maria

    2017-03-01

    Water resistance therapy, phonating through a tube into water, is used to treat dysphonia. Deep submersion (≥10 cm in water, "deep bubbling") is used for hypofunctional voice disorders. Using it with caution is recommended to avoid vocal overloading. This experimental study aimed to investigate how strenuous "deep bubbling" is. Fourteen subjects, half of them with voice training, repeated the syllable [pa:] at a comfortable speaking pitch and loudness, loudly, and in strained voice. Thereafter, they phonated a vowel-like sound both at comfortable loudness and loudly into a glass resonance tube immersed 10 cm into the water. Oral pressure, contact quotient (CQ, calculated from the electroglottographic signal), and sound pressure level were studied. The peak oral pressure P(oral) during [p] and during shuttering of the outer end of the tube was measured to estimate the subglottic pressure P(sub), and the mean P(oral) during vowel portions was measured to enable calculation of the transglottic pressure P(trans). Sensations during phonation were reported in an open-ended interview. P(sub) and P(oral) were higher in "deep bubbling" and P(trans) lower than in loud syllable phonation, but the CQ did not differ significantly. Similar results were obtained for the comparison between loud "deep bubbling" and strained phonation, although P(sub) did not differ significantly. Most of the subjects reported "deep bubbling" to be stressful only for the respiratory and lip muscles. No large differences were found between trained and untrained subjects. The CQ values suggest that "deep bubbling" may increase vocal fold loading. Further studies should address impact stress during water resistance exercises. Copyright © 2017 The Voice Foundation. Published by Elsevier Inc. All rights reserved.
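    The pressure relationship underlying the study can be written down directly as a minimal sketch; the numerical values below are invented for illustration and are not the paper's data:

```python
# Transglottic pressure is the pressure drop across the glottis:
# estimated subglottic pressure minus mean oral (supraglottic) pressure.
# Values are hypothetical, in cm H2O, for illustration only.

def transglottic_pressure(p_sub, p_oral_mean):
    """P(trans) = P(sub) - P(oral)."""
    return p_sub - p_oral_mean

# In "deep bubbling", P(oral) is elevated by the ~10 cm water column,
# so P(trans) can come out lower even when P(sub) is high:
p_trans_bubbling = transglottic_pressure(p_sub=18.0, p_oral_mean=11.0)
p_trans_loud = transglottic_pressure(p_sub=15.0, p_oral_mean=1.0)
assert p_trans_bubbling < p_trans_loud
```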

  15. Nor-ursodeoxycholic acid reverses hepatocyte-specific nemo-dependent steatohepatitis.

    Science.gov (United States)

    Beraza, Naiara; Ofner-Ziegenfuss, Lisa; Ehedego, Haksier; Boekschoten, Mark; Bischoff, Stephan C; Mueller, Michael; Trauner, Michael; Trautwein, Christian

    2011-03-01

    Hepatocyte-specific NEMO/NF-κB-deleted mice (NEMO(Δhepa)) develop spontaneous non-alcoholic steatohepatitis (NASH). Free fatty acids and bile acids promote DR5 expression. TRAIL/NK cell-mediated activation of TRAIL-R2/DR5 plays an important role during acute injury in NEMO(Δhepa) mice. The aim was to inhibit the progression of NASH in the absence of hepatocyte NEMO/NF-κB signalling. NEMOf/f and NEMO(Δhepa) mice were fed a low-fat diet or one of two anticholestatic diets, UDCA and NorUDCA, and the impact of these treatments on the progression of NASH was evaluated. We show that high expression of DR5 in livers from NEMO(Δhepa) mice is accompanied by an abundant presence of bile acids (BAs), dysregulation of BA transporters, and significant alteration of lipid metabolism-related genes. Additionally, mice lacking NEMO in hepatocytes spontaneously showed a ductular response at a young age. Unexpectedly, feeding NEMO(Δhepa) mice a low-fat diet failed to improve chronic liver injury. Conversely, anticholestatic treatment with nor-ursodeoxycholic acid (NorUDCA), but not with ursodeoxycholic acid (UDCA), led to a significant attenuation of liver damage in NEMO(Δhepa) mice. The strong therapeutic effect of NorUDCA relied on a significant downregulation of LXR-dependent lipogenesis and the normalisation of BA metabolism through mechanisms involving cross-talk between Cyp7a1 and SHP. This was associated with significant improvement of liver histology: NEMO(Δhepa)/NorUDCA-treated mice showed lower apoptosis and reduced CyclinD1 expression, indicating attenuation of the compensatory proliferative response to hepatocellular damage. Finally, fibrosis and ductular reaction markers were significantly reduced in NorUDCA-treated NEMO(Δhepa) mice. Overall, our work demonstrates the contribution of bile acid metabolism to the progression of NASH in the absence of hepatocyte NF-κB through mechanisms involving DR5-mediated apoptosis, inflammation and fibrosis. Our work suggests a potential

  16. Evaluation of the DeepWind concept

    DEFF Research Database (Denmark)

    Schmidt Paulsen, Uwe; Borg, Michael; Gonzales Seabra, Luis Alberto

    The report describes the DeepWind 5 MW conceptual design as a baseline for results obtained in the scientific and technical work packages of the DeepWind project. A comparison of DeepWind with existing VAWTs and paper projects is carried out, along with an evaluation of the concept in terms of cost...

  17. Simulator Studies of the Deep Stall

    Science.gov (United States)

    White, Maurice D.; Cooper, George E.

    1965-01-01

    Simulator studies of the deep-stall problem encountered with modern airplanes are discussed. The results indicate that the basic deep-stall tendencies produced by aerodynamic characteristics are augmented by operational considerations. Because of control difficulties to be anticipated in the deep stall, it is desirable that adequate safeguards be provided against inadvertent penetrations.

  18. Deep Learning

    DEFF Research Database (Denmark)

    Jensen, Morten Bornø; Bahnsen, Chris Holmberg; Nasrollahi, Kamal

    2018-01-01

    Over the last 10 years, artificial neural networks have gone from being a dusty, cast-off technology to playing a leading role in the development of artificial intelligence. This phenomenon is called deep learning and is inspired by the structure of the brain.

  19. DeepNAT: Deep convolutional neural network for segmenting neuroanatomy.

    Science.gov (United States)

    Wachinger, Christian; Reuter, Martin; Klein, Tassilo

    2018-04-15

    We introduce DeepNAT, a 3D deep convolutional neural network for the automatic segmentation of NeuroAnaTomy in T1-weighted magnetic resonance images. DeepNAT is an end-to-end learning-based approach to brain segmentation that jointly learns an abstract feature representation and a multi-class classification. We propose a 3D patch-based approach, where we predict not only the center voxel of the patch but also its neighbors, which is formulated as multi-task learning. To address a class imbalance problem, we arrange two networks hierarchically, where the first one separates foreground from background, and the second one identifies 25 brain structures on the foreground. Since patches lack spatial context, we augment them with coordinates. To this end, we introduce a novel intrinsic parameterization of the brain volume, formed by eigenfunctions of the Laplace-Beltrami operator. As network architecture, we use three convolutional layers with pooling, batch normalization, and non-linearities, followed by fully connected layers with dropout. The final segmentation is inferred from the probabilistic output of the network with a 3D fully connected conditional random field, which ensures label agreement between close voxels. The roughly 2.7 million parameters in the network are learned with stochastic gradient descent. Our results show that DeepNAT compares favorably to state-of-the-art methods. Finally, the purely learning-based method may have a high potential for the adaptation to young, old, or diseased brains by fine-tuning the pre-trained network with a small training sample on the target application, where the availability of larger datasets with manual annotations may boost the overall segmentation accuracy in the future. Copyright © 2017 Elsevier Inc. All rights reserved.
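    The patch-plus-coordinates idea can be sketched in a few lines of NumPy. This is a stand-in illustration, not DeepNAT's code: plain normalized volume coordinates replace the spectral (Laplace-Beltrami) coordinates, and the patch size is arbitrary:

```python
import numpy as np

# Sketch: extract a 3D patch from a volume and append the patch
# center's normalized (x, y, z) coordinates as extra features, giving
# the classifier the spatial context that raw patches lack.

def patch_with_coords(volume, center, size=7):
    half = size // 2
    x, y, z = center
    patch = volume[x - half:x + half + 1,
                   y - half:y + half + 1,
                   z - half:z + half + 1]
    coords = np.array(center) / np.array(volume.shape)  # in [0, 1]
    return np.concatenate([patch.ravel(), coords])

vol = np.random.rand(64, 64, 64).astype(np.float32)
feat = patch_with_coords(vol, center=(32, 32, 32))
print(feat.shape)  # (7*7*7 + 3,) = (346,)
```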

  20. Deep Learning and Its Applications in Biomedicine.

    Science.gov (United States)

    Cao, Chensi; Liu, Feng; Tan, Hai; Song, Deshou; Shu, Wenjie; Li, Weizhong; Zhou, Yiming; Bo, Xiaochen; Xie, Zhi

    2018-02-01

    Advances in biological and medical technologies have been providing us explosive volumes of biological and physiological data, such as medical images, electroencephalography, genomic and protein sequences. Learning from these data facilitates the understanding of human health and disease. Developed from artificial neural networks, deep learning-based algorithms show great promise in extracting features and learning patterns from complex data. The aim of this paper is to provide an overview of deep learning techniques and some of the state-of-the-art applications in the biomedical field. We first introduce the development of artificial neural network and deep learning. We then describe two main components of deep learning, i.e., deep learning architectures and model optimization. Subsequently, some examples are demonstrated for deep learning applications, including medical image classification, genomic sequence analysis, as well as protein structure classification and prediction. Finally, we offer our perspectives for the future directions in the field of deep learning. Copyright © 2018. Production and hosting by Elsevier B.V.

  1. Deep Water Acoustics

    Science.gov (United States)

    2016-06-28

    the Deep Water project and participate in the NPAL Workshops, including Art Baggeroer (MIT), J. Beron-Vera (UMiami), M. Brown (UMiami), T..., and Kathleen E. Wage. The North Pacific Acoustic Laboratory deep-water acoustic propagation experiments in the Philippine Sea. J. Acoust. Soc. Am., 134(4)... An estimate of the angle α during PhilSea09 was made from ADCP measurements at the site of the DVLA.

  2. Overview of deep learning in medical imaging.

    Science.gov (United States)

    Suzuki, Kenji

    2017-09-01

    The use of machine learning (ML) has been increasing rapidly in the medical imaging field, including computer-aided diagnosis (CAD), radiomics, and medical image analysis. Recently, an ML area called deep learning emerged in the computer vision field and became very popular in many fields. It started from an event in late 2012, when a deep-learning approach based on a convolutional neural network (CNN) won an overwhelming victory in the best-known worldwide computer vision competition, ImageNet Classification. Since then, researchers in virtually all fields, including medical imaging, have started actively participating in the explosively growing field of deep learning. In this paper, the area of deep learning in medical imaging is overviewed, including (1) what was changed in machine learning before and after the introduction of deep learning, (2) what is the source of the power of deep learning, (3) two major deep-learning models: a massive-training artificial neural network (MTANN) and a convolutional neural network (CNN), (4) similarities and differences between the two models, and (5) their applications to medical imaging. This review shows that ML with feature input (or feature-based ML) was dominant before the introduction of deep learning, and that the major and essential difference between ML before and after deep learning is the learning of image data directly without object segmentation or feature extraction; thus, it is the source of the power of deep learning, although the depth of the model is an important attribute. The class of ML with image input (or image-based ML) including deep learning has a long history, but recently gained popularity due to the use of the new terminology, deep learning. There are two major models in this class of ML in medical imaging, MTANN and CNN, which have similarities as well as several differences. In our experience, MTANNs were substantially more efficient in their development, had a higher performance, and required a

  3. WFIRST: Science from Deep Field Surveys

    Science.gov (United States)

    Koekemoer, Anton; Foley, Ryan; WFIRST Deep Field Working Group

    2018-01-01

    WFIRST will enable deep field imaging across much larger areas than those previously obtained with Hubble, opening up completely new areas of parameter space for extragalactic deep fields including cosmology, supernova and galaxy evolution science. The instantaneous field of view of the Wide Field Instrument (WFI) is about 0.3 square degrees, which would for example yield an Ultra Deep Field (UDF) reaching similar depths at visible and near-infrared wavelengths to that obtained with Hubble, over an area about 100-200 times larger, for a comparable investment in time. Moreover, wider fields on scales of 10-20 square degrees could achieve depths comparable to large HST surveys at medium depths such as GOODS and CANDELS, and would enable multi-epoch supernova science that could be matched in area to LSST Deep Drilling fields or other large survey areas. Such fields may benefit from being placed on locations in the sky that have ancillary multi-band imaging or spectroscopy from other facilities, from the ground or in space. The WFIRST Deep Fields Working Group has been examining the science considerations for various types of deep fields that may be obtained with WFIRST, and present here a summary of the various properties of different locations in the sky that may be considered for future deep fields with WFIRST.
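    The quoted area gain can be checked with rough numbers. Only the 0.3 square degree WFI field is from the text; the Hubble UDF footprint (~11 square arcminutes) is an assumption of this sketch:

```python
# Rough area comparison: WFIRST WFI field of view vs. the Hubble UDF.
WFI_FIELD_DEG2 = 0.3        # from the text
HUDF_AREA_ARCMIN2 = 11.0    # assumed HUDF footprint, ~11 arcmin^2

wfi_arcmin2 = WFI_FIELD_DEG2 * 60 * 60  # 1 deg^2 = 3600 arcmin^2
ratio = wfi_arcmin2 / HUDF_AREA_ARCMIN2
print(round(ratio))  # -> 98, roughly the "100-200 times larger" quoted
```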

  4. TOPIC MODELING: CLUSTERING OF DEEP WEBPAGES

    OpenAIRE

    Muhunthaadithya C; Rohit J.V; Sadhana Kesavan; E. Sivasankar

    2015-01-01

    The internet comprises a massive amount of information in the form of zillions of web pages. This information can be categorized into the surface web and the deep web. Existing search engines can effectively make use of surface web information, but the deep web remains largely unexploited. Machine learning techniques have been commonly employed to access deep web content.

  5. DeepSimulator: a deep simulator for Nanopore sequencing

    KAUST Repository

    Li, Yu; Han, Renmin; Bi, Chongwei; Li, Mo; Wang, Sheng; Gao, Xin

    2017-01-01

    or assembled contigs, we simulate the electrical current signals by a context-dependent deep learning model, followed by a base-calling procedure to yield simulated reads. This workflow mimics the sequencing procedure more naturally. The thorough experiments

  6. 1-Nitropyrene (1-NP) induces apoptosis and apparently a non-apoptotic programmed cell death (paraptosis) in Hepa1c1c7 cells

    International Nuclear Information System (INIS)

    Asare, Nana; Landvik, Nina E.; Lagadic-Gossmann, Dominique; Rissel, Mary; Tekpli, Xavier; Ask, Kjetil; Lag, Marit; Holme, Jorn A.

    2008-01-01

    Mechanistic studies of nitro-PAHs (polycyclic aromatic hydrocarbons) of interest might help elucidate which chemical characteristics are most important in eliciting toxic effects. 1-Nitropyrene (1-NP) is the predominant nitrated PAH emitted in diesel exhaust. 1-NP-exposed Hepa1c1c7 cells exhibited marked changes in cellular morphology, decreased proliferation, and different forms of cell death. A dramatic increase in cytoplasmic vacuolization was observed as early as 6 h of exposure, and the cells started to round up at 12 h. The rate of cell proliferation was markedly reduced at 24 h, and apoptotic as well as propidium iodide (PI)-positive cells appeared. Electron microscopic examination revealed that the vacuolization was partly due to mitochondrial swelling. The caspase inhibitor Z-VAD-FMK inhibited only the apoptotic cell death, and Nec-1 (an inhibitor of necroptosis) exhibited no inhibitory effects on either cell death or vacuolization. In contrast, cycloheximide markedly reduced both the number of apoptotic and PI-positive cells as well as the cytoplasmic vacuolization, suggesting that 1-NP induced paraptotic cell death. All the MAPKs (ERK1/2, p38, and JNK) appear to be involved in the death process, since marked activation was observed upon 1-NP exposure and their inhibitors partly reduced the induced cell death. The ERK1/2 pathway inhibitor PD98059 completely blocked the induced vacuolization, whereas the other MAPK inhibitors had only minor effects on this process. These findings suggest that 1-NP may cause apoptosis and paraptosis. In contrast, the corresponding amine (1-aminopyrene) elicited only minor apoptotic and necrotic cell death, and cells with characteristics typical of paraptosis were absent

  7. Building Program Vector Representations for Deep Learning

    OpenAIRE

    Mou, Lili; Li, Ge; Liu, Yuxuan; Peng, Hao; Jin, Zhi; Xu, Yan; Zhang, Lu

    2014-01-01

    Deep learning has made significant breakthroughs in various fields of artificial intelligence. Advantages of deep learning include the ability to capture highly complicated features, weak involvement of human engineering, etc. However, it is still virtually impossible to use deep learning to analyze programs since deep architectures cannot be trained effectively with pure back propagation. In this pioneering paper, we propose the "coding criterion" to build program vector representations, whi...

  8. [Deep vein thrombosis prophylaxis].

    Science.gov (United States)

    Sandoval-Chagoya, Gloria Alejandra; Laniado-Laborín, Rafael

    2013-01-01

    Background: despite the proven effectiveness of preventive therapy for deep vein thrombosis, a significant proportion of patients at risk for thromboembolism do not receive prophylaxis during hospitalization. Our objective was to determine the adherence to thrombosis prophylaxis guidelines in a general hospital as a quality control strategy. Methods: a random audit of clinical charts was conducted at the Tijuana General Hospital, Baja California, Mexico, to determine the degree of adherence to deep vein thrombosis prophylaxis guidelines. The instrument used was the Caprini's checklist for thrombosis risk assessment in adult patients. Results: the sample included 300 patient charts; 182 (60.7 %) were surgical patients and 118 were medical patients. Forty six patients (15.3 %) received deep vein thrombosis pharmacologic prophylaxis; 27.1 % of medical patients received deep vein thrombosis prophylaxis versus 8.3 % of surgical patients (p < 0.0001). Conclusions: our results show that adherence to DVT prophylaxis at our hospital is extremely low. Only 15.3 % of our patients at risk received treatment, and even patients with very high risk received treatment in less than 25 % of the cases. We have implemented strategies to increase compliance with clinical guidelines.
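    The headline proportions in the abstract follow directly from the reported counts; a quick arithmetic check (using only figures stated above):

```python
# Consistency check of the reported adherence figures.
total_charts = 300
surgical = 182
medical = total_charts - surgical     # 118, as reported
received_prophylaxis = 46

overall_pct = round(received_prophylaxis / total_charts * 100, 1)
surgical_share = round(surgical / total_charts * 100, 1)
print(medical, overall_pct, surgical_share)  # 118 15.3 60.7
```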

  9. Contemporary deep recurrent learning for recognition

    Science.gov (United States)

    Iftekharuddin, K. M.; Alam, M.; Vidyaratne, L.

    2017-05-01

    Large-scale feed-forward neural networks have seen intense application in many computer vision problems. However, these networks can become large and computationally intensive with increasing task complexity. Our work, for the first time in the literature, introduces a Cellular Simultaneous Recurrent Network (CSRN)-based hierarchical neural network for object detection. CSRNs have been shown to be more effective at solving complex tasks such as maze traversal and image processing than generic feed-forward networks. While deep neural networks (DNNs) have exhibited excellent performance in object detection and recognition, such hierarchical structure has largely been absent from neural networks with recurrency. Further, our work introduces deep hierarchy into SRNs for object recognition. The simultaneous recurrency results in an unfolding effect of the SRN through time, potentially enabling the design of an arbitrarily deep network. This paper presents experiments on face, facial expression, and character recognition tasks using the novel deep recurrent model and compares recognition performance with that of a generic deep feed-forward model. Finally, we demonstrate the flexibility of incorporating our proposed deep SRN-based recognition framework into a humanoid robotic platform called NAO.

  10. Towards deep learning with segregated dendrites.

    Science.gov (United States)

    Guerguiev, Jordan; Lillicrap, Timothy P; Richards, Blake A

    2017-12-05

    Deep learning has led to significant advances in artificial intelligence, in part, by adopting strategies motivated by neurophysiology. However, it is unclear whether deep learning could occur in the real brain. Here, we show that a deep learning algorithm that utilizes multi-compartment neurons might help us to understand how the neocortex optimizes cost functions. Like neocortical pyramidal neurons, neurons in our model receive sensory information and higher-order feedback in electrotonically segregated compartments. Thanks to this segregation, neurons in different layers of the network can coordinate synaptic weight updates. As a result, the network learns to categorize images better than a single layer network. Furthermore, we show that our algorithm takes advantage of multilayer architectures to identify useful higher-order representations, the hallmark of deep learning. This work demonstrates that deep learning can be achieved using segregated dendritic compartments, which may help to explain the morphology of neocortical pyramidal neurons.

  11. The deep ocean under climate change

    Science.gov (United States)

    Levin, Lisa A.; Le Bris, Nadine

    2015-11-01

    The deep ocean absorbs vast amounts of heat and carbon dioxide, providing a critical buffer to climate change but exposing vulnerable ecosystems to combined stresses of warming, ocean acidification, deoxygenation, and altered food inputs. Resulting changes may threaten biodiversity and compromise key ocean services that maintain a healthy planet and human livelihoods. There exist large gaps in understanding of the physical and ecological feedbacks that will occur. Explicit recognition of deep-ocean climate mitigation and inclusion in adaptation planning by the United Nations Framework Convention on Climate Change (UNFCCC) could help to expand deep-ocean research and observation and to protect the integrity and functions of deep-ocean ecosystems.

  12. Fire protection countermeasures for containment ventilation systems

    International Nuclear Information System (INIS)

    Alvares, N.J.; Beason, D.G.; Bergman, W.; Ford, H.W.; Lipska, A.E.

    1980-01-01

    The goal of this project is to find countermeasures to protect HEPA filters in exit ventilation ducts from the heat and smoke generated by fire. Several methods for partially mitigating the smoke exposure to the HEPA filters were identified through testing and analysis. These independently involve controlling the fuel, controlling the fire, and intercepting the smoke aerosol prior to its sorption on the HEPA filter. Exit duct treatment of aerosols is not unusual in industrial applications and involves the use of scrubbers, prefilters, and inertial impaction, depending on the size, distribution, and concentration of the subject aerosol. However, when these unmodified techniques were applied to smoke aerosols from fires involving materials common to the experimental laboratories of LLNL, it was found they offered minimal protection to the HEPA filters. Ultimately, a continuous, movable, high-efficiency prefilter using modified commercial equipment was designed. This technique is capable of protecting HEPA filters over the total duration of the test fires. The key to its success was modification of the prefiltration media. Commercially available filter media has a particle sorption efficiency that is inversely proportional to media strength. To achieve both efficiency and strength, we laminated rolling filter media with the desired properties. The use of rolling prefilters solely to protect HEPA filters from fire-generated smoke aerosols is not cost-effective in every type of containment system, especially where standard fire-protection systems are available in the space. But in areas of high fire risk, where the potential fuel load is large and ignition sources are plentiful, the complication of a rolling prefilter in exit ventilation ducts to protect HEPA filters from smoke aerosols is definitely justified.

  13. The impact of onsite workplace health-enhancing physical activity interventions on worker productivity: a systematic review.

    Science.gov (United States)

    Pereira, Michelle Jessica; Coombes, Brooke Kaye; Comans, Tracy Anne; Johnston, Venerina

    2015-06-01

    The aim of this study is to investigate the effects of onsite workplace health-enhancing physical activity (HEPA) programmes on worker productivity. The PROSPERO registration number is CRD42014008750. A search for controlled trials or randomised controlled trials (RCTs) that investigated the effects of onsite workplace HEPA programmes on productivity levels of working adults was performed. Risk of bias of included studies was assessed, and the inter-rater reliability of the quality assessment was analysed. Qualitative synthesis of available evidence is presented. Eight studies were included in the review. There is consistent evidence that onsite workplace HEPA programmes do not reduce levels of sick leave. There appears to be inconsistent evidence of the impact of onsite workplace HEPA programmes on worker productivity. A high-quality study of an onsite combination (aerobic, strengthening and flexibility) HEPA regime and a moderate-quality study of a Tai Chi programme improved worker productivity measured with questionnaires in female laundry workers and older female nurses, respectively. Two high-quality studies and four moderate-quality studies did not show benefit. Studies that showed benefit were mainly those that were designed with productivity measures as primary outcomes, were delivered to occupations involving higher physical loads, and had higher compliance and programme intensity. The small number of studies and the lack of consistency among studies limited further analyses. There is inconsistent evidence that onsite workplace HEPA programmes improve self-reported worker productivity. Future high-quality RCTs of onsite workplace HEPA programmes should be designed around productivity outcomes, target at-risk groups, and investigate interventions of sufficient intensity. High attendance with improved recording is needed to achieve significant results in augmenting worker productivity. Published by the BMJ Publishing Group Limited.

  14. Combined use of an electrostatic precipitator and a high-efficiency particulate air filter in building ventilation systems: Effects on cardiorespiratory health indicators in healthy adults.

    Science.gov (United States)

    Day, D B; Xiang, J; Mo, J; Clyde, M A; Weschler, C J; Li, F; Gong, J; Chung, M; Zhang, Y; Zhang, J

    2018-05-01

    High-efficiency particulate air (HEPA) filtration in combination with an electrostatic precipitator (ESP) can be a cost-effective approach to reducing indoor particulate exposure, but ESPs produce ozone. The health effect of combined ESP-HEPA filtration has not been examined. We conducted an intervention study in 89 volunteers. At baseline, the air-handling units of the offices and residences of all subjects included coarse, ESP, and HEPA filtration stages. During the 5-week intervention, the subjects were split into 2 groups, one with just the ESP removed and the other with both the ESP and HEPA removed. Each subject was measured for cardiopulmonary risk indicators once at baseline, twice during the intervention, and once 2 weeks after baseline conditions were restored. Measured indoor and outdoor PM2.5 and ozone concentrations, coupled with time-activity data, were used to calculate exposures. Removal of HEPA filters increased 24-hour mean PM2.5 exposure by 38 (95% CI: 31, 45) μg/m3. Removal of ESPs decreased 24-hour mean ozone exposure by 2.2 (2.0, 2.5) ppb. No biomarkers were significantly associated with HEPA filter removal. In contrast, ESP removal was associated with a -16.1% (-21.5%, -10.4%) change in plasma-soluble P-selectin and a -3.0% (-5.1%, -0.8%) change in systolic blood pressure, suggesting reduced cardiovascular risks. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
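    The exposure calculation described (measured concentrations coupled with time-activity data) is a time-weighted average across microenvironments. A minimal sketch, with concentration values invented purely for illustration:

```python
# Time-weighted 24-h mean exposure from per-microenvironment
# concentrations and hours spent in each (time-activity data).
# All concentration values below are hypothetical.

def daily_mean_exposure(segments):
    """segments: list of (concentration, hours); hours must sum to 24."""
    total_hours = sum(h for _, h in segments)
    assert total_hours == 24, "time-activity data must cover 24 h"
    return sum(c * h for c, h in segments) / total_hours

pm25 = daily_mean_exposure([
    (8.0, 9),    # filtered office, ug/m3
    (12.0, 13),  # filtered residence
    (55.0, 2),   # outdoors / in transit
])
print(round(pm25, 1))  # -> 14.1
```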

  15. Genetic Nrf2 Overactivation Inhibits the Deleterious Effects Induced by Hepatocyte-Specific c-met Deletion during the Progression of NASH

    Directory of Open Access Journals (Sweden)

    Pierluigi Ramadori

    2017-01-01

    We have recently shown that hepatocyte-specific c-met deficiency accelerates the progression of nonalcoholic steatohepatitis in experimental murine models, resulting in augmented production of reactive oxygen species and accelerated development of fibrosis. This study focuses on elucidating the underlying cellular mechanisms driven by Nrf2 overactivation in hepatocytes lacking the c-met receptor, which are characterized by a severe imbalance between pro-oxidant and antioxidant functions. Control mice (c-metfx/fx), single c-met knockouts (c-metΔhepa), and double c-met/Keap1 knockouts (met/Keap1Δhepa) were fed a chow or a methionine-choline-deficient (MCD) diet, respectively, for 4 weeks to reproduce the features of nonalcoholic steatohepatitis. Upon MCD feeding, met/Keap1Δhepa mice displayed increased liver mass albeit decreased triglyceride accumulation. The marked increase in oxidative stress observed in c-metΔhepa mice was restored in the double mutants, as assessed by 4-HNE immunostaining and by the expression of genes responsible for the generation of free radicals. Moreover, double knockout mice presented a reduced amount of liver-infiltrating cells, and the exacerbated fibrosis progression observed in c-metΔhepa livers was significantly inhibited in met/Keap1Δhepa. Therefore, genetic activation of the antioxidant transcription factor Nrf2 improves liver damage and repair in hepatocyte-specific c-met-deficient mice, mainly through restoring a balance in cellular redox homeostasis.

  16. NATURAL GAS RESOURCES IN DEEP SEDIMENTARY BASINS

    Energy Technology Data Exchange (ETDEWEB)

    Thaddeus S. Dyman; Troy Cook; Robert A. Crovelli; Allison A. Henry; Timothy C. Hester; Ronald C. Johnson; Michael D. Lewan; Vito F. Nuccio; James W. Schmoker; Dennis B. Riggin; Christopher J. Schenk

    2002-02-05

    From a geological perspective, deep natural gas resources are generally defined as resources occurring in reservoirs at or below 15,000 feet, whereas ultra-deep gas occurs below 25,000 feet. From an operational point of view, "deep" is often thought of in a relative sense based on the geologic and engineering knowledge of gas (and oil) resources in a particular area. Deep gas can be found in either conventionally trapped or unconventional basin-center accumulations that are essentially large single fields having spatial dimensions often exceeding those of conventional fields. Exploration for deep conventional and unconventional basin-center natural gas resources deserves special attention because these resources are widespread and occur in diverse geologic environments. In 1995, the U.S. Geological Survey estimated that 939 trillion cubic feet (Tcf) of technically recoverable natural gas remained to be discovered or was part of reserve appreciation from known fields in the onshore areas and State waters of the United States. Of this USGS resource, nearly 114 Tcf of technically recoverable gas remains to be discovered from deep sedimentary basins. Worldwide estimates of deep gas are also high. The U.S. Geological Survey World Petroleum Assessment 2000 Project recently estimated a world mean undiscovered conventional gas resource outside the U.S. of 844 Tcf below 4.5 km (about 15,000 feet). Less is known about the origins of deep gas than about the origins of gas at shallower depths because fewer wells have been drilled into the deeper portions of many basins. Some of the many factors contributing to the origin of deep gas include the thermal stability of methane, the role of water and non-hydrocarbon gases in natural gas generation, porosity loss with increasing thermal maturity, the kinetics of deep gas generation, thermal cracking of oil to gas, and source rock potential based on thermal maturity and kerogen type. Recent experimental simulations
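    The depth cut-offs in the opening sentence can be captured in a small helper; the function name and return labels are my own, not USGS terminology:

```python
# Classify a reservoir by depth using the cut-offs given in the text:
# "deep" at or below 15,000 ft, "ultra-deep" below 25,000 ft.

def gas_depth_class(depth_ft):
    if depth_ft > 25_000:
        return "ultra-deep"
    if depth_ft >= 15_000:
        return "deep"
    return "conventional depth"

print(gas_depth_class(16_500))  # deep
print(gas_depth_class(30_000))  # ultra-deep
```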

  17. Deep smarts.

    Science.gov (United States)

    Leonard, Dorothy; Swap, Walter

    2004-09-01

When a person sizes up a complex situation and rapidly comes to a decision that proves to be not just good but brilliant, you think, "That was smart." After you watch him do this a few times, you realize you're in the presence of something special. It's not raw brainpower, though that helps. It's not emotional intelligence, either, though that, too, is often involved. It's deep smarts. Deep smarts are not philosophical--they're not "wisdom" in that sense, but they're as close to wisdom as business gets. You see them in the manager who understands when and how to move into a new international market, in the executive who knows just what kind of talk to give when her organization is in crisis, in the technician who can track a product failure back to an interaction between independently produced elements. These are people whose knowledge would be hard to purchase on the open market. Their insight is based on know-how more than on know-what; it comprises a system view as well as expertise in individual areas. Because deep smarts are experience based and often context specific, they can't be produced overnight or readily imported into an organization. It takes years for an individual to develop them--and no time at all for an organization to lose them when a valued veteran walks out the door. They can be taught, however, with the right techniques. Drawing on their forthcoming book Deep Smarts, Dorothy Leonard and Walter Swap say the best way to transfer such expertise to novices--and, on a larger scale, to make individual knowledge institutional--isn't through PowerPoint slides, a Web site of best practices, online training, project reports, or lectures. Rather, the sage needs to teach the neophyte individually how to draw wisdom from experience. Companies have to be willing to dedicate time and effort to such extensive training, but the investment more than pays for itself.

  18. Deep Learning and Developmental Learning: Emergence of Fine-to-Coarse Conceptual Categories at Layers of Deep Belief Network.

    Science.gov (United States)

    Sadeghi, Zahra

    2016-09-01

In this paper, I investigate conceptual categories derived from developmental processing in a deep neural network. The similarity matrices of the deep representation at each layer of the network are computed and compared with those of the raw representation. While the clusters generated by the raw representation stand at the basic level of abstraction, the conceptual categories obtained from the deep representation show a bottom-up transition procedure. Results demonstrate a developmental course of learning from the specific to the general level of abstraction through the learned layers of representation in a deep belief network. © The Author(s) 2016.
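The layer-wise analysis described above rests on pairwise similarity matrices of representations. A minimal sketch of that computation, not the paper's code; the four 3-component "raw" vectors (two fine-grained pairs) are invented for illustration:

```python
# Sketch: pairwise cosine-similarity matrix of a set of representations,
# as used when comparing raw inputs to a layer's learned representation.
import math

def cosine_sim_matrix(vectors):
    """Pairwise cosine-similarity matrix for a list of equal-length vectors."""
    def cos(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        return dot / (na * nb)
    return [[cos(a, b) for b in vectors] for a in vectors]

# Hypothetical "raw" representation: items 0/1 and 2/3 form two clusters.
raw = [[1.0, 0.0, 0.1], [0.9, 0.1, 0.0], [0.0, 1.0, 0.1], [0.1, 0.9, 0.0]]
sim = cosine_sim_matrix(raw)
```

Repeating the same computation on each layer's activations, and comparing the resulting block structure, is the kind of analysis the abstract describes.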

  19. Climate, carbon cycling, and deep-ocean ecosystems.

    Science.gov (United States)

    Smith, K L; Ruhl, H A; Bett, B J; Billett, D S M; Lampitt, R S; Kaufmann, R S

    2009-11-17

    Climate variation affects surface ocean processes and the production of organic carbon, which ultimately comprises the primary food supply to the deep-sea ecosystems that occupy approximately 60% of the Earth's surface. Warming trends in atmospheric and upper ocean temperatures, attributed to anthropogenic influence, have occurred over the past four decades. Changes in upper ocean temperature influence stratification and can affect the availability of nutrients for phytoplankton production. Global warming has been predicted to intensify stratification and reduce vertical mixing. Research also suggests that such reduced mixing will enhance variability in primary production and carbon export flux to the deep sea. The dependence of deep-sea communities on surface water production has raised important questions about how climate change will affect carbon cycling and deep-ocean ecosystem function. Recently, unprecedented time-series studies conducted over the past two decades in the North Pacific and the North Atlantic at >4,000-m depth have revealed unexpectedly large changes in deep-ocean ecosystems significantly correlated to climate-driven changes in the surface ocean that can impact the global carbon cycle. Climate-driven variation affects oceanic communities from surface waters to the much-overlooked deep sea and will have impacts on the global carbon cycle. Data from these two widely separated areas of the deep ocean provide compelling evidence that changes in climate can readily influence deep-sea processes. However, the limited geographic coverage of these existing time-series studies stresses the importance of developing a more global effort to monitor deep-sea ecosystems under modern conditions of rapidly changing climate.

  20. The deep ocean under climate change.

    Science.gov (United States)

    Levin, Lisa A; Le Bris, Nadine

    2015-11-13

    The deep ocean absorbs vast amounts of heat and carbon dioxide, providing a critical buffer to climate change but exposing vulnerable ecosystems to combined stresses of warming, ocean acidification, deoxygenation, and altered food inputs. Resulting changes may threaten biodiversity and compromise key ocean services that maintain a healthy planet and human livelihoods. There exist large gaps in understanding of the physical and ecological feedbacks that will occur. Explicit recognition of deep-ocean climate mitigation and inclusion in adaptation planning by the United Nations Framework Convention on Climate Change (UNFCCC) could help to expand deep-ocean research and observation and to protect the integrity and functions of deep-ocean ecosystems. Copyright © 2015, American Association for the Advancement of Science.

  1. SEDS: THE SPITZER EXTENDED DEEP SURVEY. SURVEY DESIGN, PHOTOMETRY, AND DEEP IRAC SOURCE COUNTS

    International Nuclear Information System (INIS)

    Ashby, M. L. N.; Willner, S. P.; Fazio, G. G.; Huang, J.-S.; Hernquist, L.; Hora, J. L.; Arendt, R.; Barmby, P.; Barro, G.; Faber, S.; Guhathakurta, P.; Bell, E. F.; Bouwens, R.; Cattaneo, A.; Croton, D.; Davé, R.; Dunlop, J. S.; Egami, E.; Finlator, K.; Grogin, N. A.

    2013-01-01

The Spitzer Extended Deep Survey (SEDS) is a very deep infrared survey within five well-known extragalactic science fields: the UKIDSS Ultra-Deep Survey, the Extended Chandra Deep Field South, COSMOS, the Hubble Deep Field North, and the Extended Groth Strip. SEDS covers a total area of 1.46 deg² to a depth of 26 AB mag (3σ) in both of the warm Infrared Array Camera (IRAC) bands at 3.6 and 4.5 μm. Because of its uniform depth of coverage in so many widely-separated fields, SEDS is subject to roughly 25% smaller errors due to cosmic variance than a single-field survey of the same size. SEDS was designed to detect and characterize galaxies from intermediate to high redshifts (z = 2-7) with a built-in means of assessing the impact of cosmic variance on the individual fields. Because the full SEDS depth was accumulated in at least three separate visits to each field, typically with six-month intervals between visits, SEDS also furnishes an opportunity to assess the infrared variability of faint objects. This paper describes the SEDS survey design, processing, and publicly-available data products. Deep IRAC counts for the more than 300,000 galaxies detected by SEDS are consistent with models based on known galaxy populations. Discrete IRAC sources contribute 5.6 ± 1.0 and 4.4 ± 0.8 nW m⁻² sr⁻¹ at 3.6 and 4.5 μm to the diffuse cosmic infrared background (CIB). IRAC sources cannot contribute more than half of the total CIB flux estimated from DIRBE data. Barring an unexpected error in the DIRBE flux estimates, half the CIB flux must therefore come from a diffuse component.

  2. The deep lymphatic anatomy of the hand.

    Science.gov (United States)

    Ma, Chuan-Xiang; Pan, Wei-Ren; Liu, Zhi-An; Zeng, Fan-Qiang; Qiu, Zhi-Qiang

    2018-04-03

The deep lymphatic anatomy of the hand remains the least described in the medical literature. Eight hands, amputated above the wrist, were harvested from four nonembalmed human cadavers. A small amount of 6% hydrogen peroxide was employed to detect the lymphatic vessels around the superficial and deep palmar vascular arches, in the webs from the index to little fingers, and in the thenar and hypothenar areas. A 30-gauge needle was inserted into the vessels and injected with a barium sulphate compound. Each specimen was dissected, photographed and radiographed to demonstrate the deep lymphatic distribution of the hand. Five groups of deep collecting lymph vessels were found in the hand: superficial palmar arch lymph vessel (SPALV); deep palmar arch lymph vessel (DPALV); thenar lymph vessel (TLV); hypothenar lymph vessel (HTLV); deep finger web lymph vessel (DFWLV). Each group of vessels drained in different directions first, then all turned and ran towards the wrist in different layers. The deep lymphatic drainage of the hand has been presented. The results will provide an anatomical basis for clinical management, educational reference and scientific research. Copyright © 2018 Elsevier GmbH. All rights reserved.

  3. Deep ECGNet: An Optimal Deep Learning Framework for Monitoring Mental Stress Using Ultra Short-Term ECG Signals.

    Science.gov (United States)

    Hwang, Bosun; You, Jiwoo; Vaessen, Thomas; Myin-Germeys, Inez; Park, Cheolsoo; Zhang, Byoung-Tak

    2018-02-08

Stress recognition using electrocardiogram (ECG) signals ordinarily requires an intractable long-term heart rate variability (HRV) parameter extraction process. This study proposes a novel deep learning framework, the Deep ECGNet, to recognize stressful states using ultra short-term raw ECG signals without any feature engineering. The Deep ECGNet was developed through various experiments and analysis of ECG waveforms. We propose an optimal recurrent and convolutional neural network architecture, together with the optimal convolution filter length (related to the P, Q, R, S, and T wave durations of the ECG) and pooling length (related to the heart-beat period), based on optimization experiments and analysis of the waveform characteristics of ECG signals. Experiments were also conducted with conventional methods using HRV parameters and frequency features as a benchmark test. The data used in this study were obtained from Kwangwoon University in Korea (13 subjects, Case 1) and KU Leuven University in Belgium (9 subjects, Case 2). Experiments were designed according to various experimental protocols to elicit stressful conditions. The Deep ECGNet outperformed the conventional approaches with accuracies of 87.39% for Case 1 and 73.96% for Case 2, improvements of 16.22% and 10.98%, respectively, over the conventional HRV method. We propose an optimal deep learning architecture and its parameters for stress recognition, along with a theoretical consideration of how to design the deep learning structure based on the periodic patterns of raw ECG data. The experimental results show that the proposed deep learning model, the Deep ECGNet, is an effective structure for recognizing stress conditions from ultra short-term ECG data.
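The abstract ties the convolution filter length to the P-QRS-T wave durations and the pooling length to the heart-beat period. That relation can be sketched as simple sampling arithmetic; the sampling rate, complex duration, and heart rate below are illustrative assumptions, not the paper's actual values:

```python
# Illustrative sketch only: converting physiological time spans into
# convolution-filter and pooling lengths in samples. All numbers are
# assumptions for demonstration, not the Deep ECGNet's published settings.

def samples(duration_s, fs_hz):
    """Convert a time span in seconds to a sample count at sampling rate fs_hz."""
    return round(duration_s * fs_hz)

fs = 256                    # assumed ECG sampling rate (Hz)
pqrst_duration = 0.6        # assumed span of one P-QRS-T complex (s)
beat_period = 60.0 / 70.0   # beat period at an assumed 70 bpm (s)

conv_filter_len = samples(pqrst_duration, fs)  # filter spans one wave complex
pool_len = samples(beat_period, fs)            # pooling spans roughly one beat
```

The point of the sketch is only the design principle: the filter window covers a single wave complex, while pooling aggregates over roughly one full heart beat.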

  4. Deep inelastic electron and muon scattering

    International Nuclear Information System (INIS)

    Taylor, R.E.

    1975-07-01

From the review of deep inelastic electron and muon scattering it is concluded that the puzzle of deep inelastic scattering versus annihilation has been replaced by the challenge of the new particles, and that the evidence for the simplest quark-algebra models of deep inelastic processes is weaker than it was a year ago. Definite evidence of scale breaking was found, but the specific form of that scale breaking is difficult to extract from the data. 59 references

  5. Fast, Distributed Algorithms in Deep Networks

    Science.gov (United States)

    2016-05-11

A Trident Scholar project report (no. 446), "Fast, Distributed Algorithms in Deep Networks," by Midshipman 1/C Ryan J. Burmeister, USN. While ADMM has been applied to shallow networks, additional work will need to be done in order to allow for the application of ADMM to deep nets. The ADMM method allows for quick... Cited references include Quoc V. Le et al., "Large scale distributed deep networks," in Advances in Neural Information Processing Systems, pages 1223-1231, 2012.

  6. Deep Learning from Crowds

    DEFF Research Database (Denmark)

    Rodrigues, Filipe; Pereira, Francisco Camara

Over the last few years, deep learning has revolutionized the field of machine learning by dramatically improving the state-of-the-art in various domains. However, as the size of supervised artificial neural networks grows, typically so does the need for larger labeled datasets. Recently, crowdsourcing has established itself as an efficient and cost-effective solution for labeling large sets of data in a scalable manner, but it often requires aggregating labels from multiple noisy contributors with different levels of expertise. In this paper, we address the problem of learning deep neural networks from crowds. We begin by describing an EM algorithm for jointly learning the parameters of the network and the reliabilities of the annotators. Then, a novel general-purpose crowd layer is proposed, which allows us to train deep neural networks end-to-end, directly from the noisy labels...
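A minimal sketch of the crowd-layer idea, under the assumption (common in crowd-labeling models, not spelled out in this abstract) that each annotator is represented by a matrix mapping the shared classifier's class probabilities to that annotator's noisy label distribution; all numbers are invented for illustration:

```python
# Sketch (not the authors' implementation): a shared classifier emits class
# probabilities, and a per-annotator "crowd layer" matrix transforms them
# into that annotator's expected (noisy) label distribution. During training,
# each transformed output is matched against that annotator's observed labels.

def matvec(m, v):
    """Multiply matrix m (list of rows) by vector v."""
    return [sum(mij * vj for mij, vj in zip(row, v)) for row in m]

shared_probs = [0.7, 0.2, 0.1]          # shared classifier's softmax output

annotators = {
    "reliable": [[0.9, 0.05, 0.05],     # near-identity: mostly reports the truth
                 [0.05, 0.9, 0.05],
                 [0.05, 0.05, 0.9]],
    "confuses_0_and_1": [[0.5, 0.5, 0.0],   # mixes up the first two classes
                         [0.5, 0.5, 0.0],
                         [0.0, 0.0, 1.0]],
}

per_annotator = {name: matvec(w, shared_probs) for name, w in annotators.items()}
```

Because the annotator matrices are themselves learned jointly with the network, unreliable annotators end up down-weighted, which is the role the EM algorithm's reliability estimates play in the non-end-to-end variant.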

  7. Deep learning methods for protein torsion angle prediction.

    Science.gov (United States)

    Li, Haiou; Hou, Jie; Adhikari, Badri; Lyu, Qiang; Cheng, Jianlin

    2017-09-18

Deep learning is one of the most powerful machine learning methods and has achieved state-of-the-art performance in many domains. Since deep learning was introduced to the field of bioinformatics in 2012, it has achieved success in a number of areas such as protein residue-residue contact prediction, secondary structure prediction, and fold recognition. In this work, we developed deep learning methods to improve the prediction of torsion (dihedral) angles of proteins. We design four different deep learning architectures to predict protein torsion angles: a deep neural network (DNN), a deep restricted Boltzmann machine (DRBM), a deep recurrent neural network (DRNN), and a deep recurrent restricted Boltzmann machine (DReRBM), since protein torsion angle prediction is a sequence-related problem. In addition to existing protein features, two new features (predicted residue contact number and the error distribution of torsion angles extracted from sequence fragments) are used as input to each of the four deep learning architectures to predict the phi and psi angles of the protein backbone. The mean absolute error (MAE) of the phi and psi angles predicted by DRNN, DReRBM, DRBM and DNN is about 20-21° and 29-30° on an independent dataset. The MAE of the phi angle is comparable to that of existing methods, but the MAE of the psi angle is 29°, 2° lower than that of existing methods. On the latest CASP12 targets, our methods also achieved performance better than or comparable to a state-of-the-art method. Our experiment demonstrates that deep learning is a valuable method for predicting protein torsion angles. The deep recurrent network architecture performs slightly better than the deep feed-forward architecture, and the predicted residue contact number and the error distribution of torsion angles extracted from sequence fragments are useful features for improving prediction accuracy.
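Torsion angles are periodic, so an MAE like the 20-21° and 29-30° figures above must measure angular differences with wrap-around at ±180°. A sketch of a circular MAE with made-up angles for illustration (the wrap-around convention is a standard assumption about such evaluations, not a detail stated in the abstract):

```python
# Sketch: mean absolute error for periodic angles. A naive |pred - true|
# would report 350° for a prediction of 175° against a truth of -175°,
# when the true angular separation is only 10°.

def angle_error(pred_deg, true_deg):
    """Smallest absolute difference between two angles, in degrees."""
    diff = (pred_deg - true_deg) % 360.0
    return min(diff, 360.0 - diff)

def circular_mae(preds, trues):
    return sum(angle_error(p, t) for p, t in zip(preds, trues)) / len(preds)

preds = [175.0, -60.0, 45.0]   # hypothetical predicted angles (deg)
trues = [-175.0, -65.0, 40.0]  # hypothetical true angles (deg)
mae = circular_mae(preds, trues)
```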

  8. Deep Learning in Gastrointestinal Endoscopy.

    Science.gov (United States)

    Patel, Vivek; Armstrong, David; Ganguli, Malika; Roopra, Sandeep; Kantipudi, Neha; Albashir, Siwar; Kamath, Markad V

    2016-01-01

    Gastrointestinal (GI) endoscopy is used to inspect the lumen or interior of the GI tract for several purposes, including, (1) making a clinical diagnosis, in real time, based on the visual appearances; (2) taking targeted tissue samples for subsequent histopathological examination; and (3) in some cases, performing therapeutic interventions targeted at specific lesions. GI endoscopy is therefore predicated on the assumption that the operator-the endoscopist-is able to identify and characterize abnormalities or lesions accurately and reproducibly. However, as in other areas of clinical medicine, such as histopathology and radiology, many studies have documented marked interobserver and intraobserver variability in lesion recognition. Thus, there is a clear need and opportunity for techniques or methodologies that will enhance the quality of lesion recognition and diagnosis and improve the outcomes of GI endoscopy. Deep learning models provide a basis to make better clinical decisions in medical image analysis. Biomedical image segmentation, classification, and registration can be improved with deep learning. Recent evidence suggests that the application of deep learning methods to medical image analysis can contribute significantly to computer-aided diagnosis. Deep learning models are usually considered to be more flexible and provide reliable solutions for image analysis problems compared to conventional computer vision models. The use of fast computers offers the possibility of real-time support that is important for endoscopic diagnosis, which has to be made in real time. Advanced graphics processing units and cloud computing have also favored the use of machine learning, and more particularly, deep learning for patient care. This paper reviews the rapidly evolving literature on the feasibility of applying deep learning algorithms to endoscopic imaging.

  9. Neuromorphic Deep Learning Machines

    OpenAIRE

    Neftci, E; Augustine, C; Paul, S; Detorakis, G

    2017-01-01

    An ongoing challenge in neuromorphic computing is to devise general and computationally efficient models of inference and learning which are compatible with the spatial and temporal constraints of the brain. One increasingly popular and successful approach is to take inspiration from inference and learning algorithms used in deep neural networks. However, the workhorse of deep learning, the gradient descent Back Propagation (BP) rule, often relies on the immediate availability of network-wide...

  10. Deep Restricted Kernel Machines Using Conjugate Feature Duality.

    Science.gov (United States)

    Suykens, Johan A K

    2017-08-01

The aim of this letter is to propose a theory of deep restricted kernel machines offering new foundations for deep learning with kernel machines. From the viewpoint of deep learning, it is partially related to restricted Boltzmann machines, which are characterized by visible and hidden units in a bipartite graph without hidden-to-hidden connections, and to deep learning extensions such as deep belief networks and deep Boltzmann machines. From the viewpoint of kernel machines, it includes least squares support vector machines for classification and regression, kernel principal component analysis (PCA), matrix singular value decomposition, and Parzen-type models. A key element is to first characterize these kernel machines in terms of so-called conjugate feature duality, yielding a representation with visible and hidden units. It is shown how this is related to the energy form in restricted Boltzmann machines, with continuous variables in a nonprobabilistic setting. In this new framework of so-called restricted kernel machine (RKM) representations, the dual variables correspond to hidden features. Deep RKMs are obtained by coupling RKMs. The method is illustrated for a deep RKM consisting of three levels: a least squares support vector machine regression level and two kernel PCA levels. In its primal form, deep feedforward neural networks can also be trained within this framework.

  11. Preliminary analyses of the deep geoenvironmental characteristics for the deep borehole disposal of high-level radioactive waste in Korea

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jong Youl; Lee, Min Soo; Choi, Heui Joo; Kim, Geon Young; Kim, Kyung Su [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-06-15

Spent fuels from nuclear power plants, as well as high-level radioactive waste from the recycling of spent fuels, should be safely isolated from the human environment for an extremely long time. Recently, meaningful studies on the development of deep borehole radioactive waste disposal systems at 3-5 km depth have been carried out in the USA and some countries in Europe, owing to great advances in deep borehole drilling technology. In this paper, domestic deep geoenvironmental characteristics are preliminarily investigated to analyze the applicability of deep borehole disposal technology in Korea. To do this, state-of-the-art technologies in the USA and some countries in Europe are reviewed, and geological and geothermal data from deep boreholes drilled for geothermal use are analyzed. Based on the results on the crystalline rock depth, the geothermal gradient and the spent fuel types generated in Korea, a preliminary deep borehole concept, including a disposal canister and sealing system, is suggested.

  12. Preliminary analyses of the deep geoenvironmental characteristics for the deep borehole disposal of high-level radioactive waste in Korea

    International Nuclear Information System (INIS)

    Lee, Jong Youl; Lee, Min Soo; Choi, Heui Joo; Kim, Geon Young; Kim, Kyung Su

    2016-01-01

Spent fuels from nuclear power plants, as well as high-level radioactive waste from the recycling of spent fuels, should be safely isolated from the human environment for an extremely long time. Recently, meaningful studies on the development of deep borehole radioactive waste disposal systems at 3-5 km depth have been carried out in the USA and some countries in Europe, owing to great advances in deep borehole drilling technology. In this paper, domestic deep geoenvironmental characteristics are preliminarily investigated to analyze the applicability of deep borehole disposal technology in Korea. To do this, state-of-the-art technologies in the USA and some countries in Europe are reviewed, and geological and geothermal data from deep boreholes drilled for geothermal use are analyzed. Based on the results on the crystalline rock depth, the geothermal gradient and the spent fuel types generated in Korea, a preliminary deep borehole concept, including a disposal canister and sealing system, is suggested.

  13. Toolkits and Libraries for Deep Learning.

    Science.gov (United States)

    Erickson, Bradley J; Korfiatis, Panagiotis; Akkus, Zeynettin; Kline, Timothy; Philbrick, Kenneth

    2017-08-01

    Deep learning is an important new area of machine learning which encompasses a wide range of neural network architectures designed to complete various tasks. In the medical imaging domain, example tasks include organ segmentation, lesion detection, and tumor classification. The most popular network architecture for deep learning for images is the convolutional neural network (CNN). Whereas traditional machine learning requires determination and calculation of features from which the algorithm learns, deep learning approaches learn the important features as well as the proper weighting of those features to make predictions for new data. In this paper, we will describe some of the libraries and tools that are available to aid in the construction and efficient execution of deep learning as applied to medical images.

  14. Comparative feeding on chlorophyll - rich versus remaining organic matter in bivalve shellfish

    NARCIS (Netherlands)

    Hawkins, A.; Pascoe, P.L.; Parry, H.; Brinsley, M.; Cacciatore, F.; Black, K.; Fang, J.; Smaal, A.C.

    2013-01-01

Filter feeding was compared in the blue mussel Mytilus edulis, Mediterranean mussel Mytilus galloprovincialis, Pacific oyster Crassostrea gigas, Chinese pleated oyster Crassostrea plicatula, Chinese scallop Chlamys farreri, Manila clam Tapes philippinarum, razor clam Sinonvacula constricta, and blood

  15. Deep-sea coral research and technology program: Alaska deep-sea coral and sponge initiative final report

    Science.gov (United States)

    Rooper, Chris; Stone, Robert P.; Etnoyer, Peter; Conrath, Christina; Reynolds, Jennifer; Greene, H. Gary; Williams, Branwen; Salgado, Enrique; Morrison, Cheryl L.; Waller, Rhian G.; Demopoulos, Amanda W.J.

    2017-01-01

Deep-sea coral and sponge ecosystems are widespread throughout most of Alaska’s marine waters. In some places, such as the central and western Aleutian Islands, deep-sea coral and sponge resources can be extremely diverse and may rank among the most abundant deep-sea coral and sponge communities in the world. Many different species of fishes and invertebrates are associated with deep-sea coral and sponge communities in Alaska. Because of their biology, these benthic invertebrates are potentially impacted by climate change and ocean acidification. Deep-sea coral and sponge ecosystems are also vulnerable to the effects of commercial fishing activities. Because of the size and scope of Alaska’s continental shelf and slope, the vast majority of the area has not been visually surveyed for deep-sea corals and sponges. NOAA’s Deep Sea Coral Research and Technology Program (DSCRTP) sponsored a field research program in the Alaska region between 2012–2015, referred to hereafter as the Alaska Initiative. The priorities for Alaska were derived from ongoing data needs and objectives identified by the DSCRTP, the North Pacific Fishery Management Council (NPFMC), and the Essential Fish Habitat-Environmental Impact Statement (EFH-EIS) process. This report presents the results of 15 projects conducted using DSCRTP funds from 2012-2015. Three of the projects conducted as part of the Alaska deep-sea coral and sponge initiative included dedicated at-sea cruises and fieldwork spread across multiple years. These projects were the eastern Gulf of Alaska Primnoa pacifica study, the Aleutian Islands mapping study, and the Gulf of Alaska fish productivity study. In all, there were nine separate research cruises carried out with a total of 109 at-sea days conducting research. The remaining projects either used data and samples collected by the three major fieldwork projects or were piggy-backed onto existing research programs at the Alaska Fisheries Science Center (AFSC).

  16. Image Captioning with Deep Bidirectional LSTMs

    OpenAIRE

    Wang, Cheng; Yang, Haojin; Bartz, Christian; Meinel, Christoph

    2016-01-01

This work presents an end-to-end trainable deep bidirectional LSTM (Long Short-Term Memory) model for image captioning. Our model builds on a deep convolutional neural network (CNN) and two separate LSTM networks. It is capable of learning long-term visual-language interactions by making use of history and future context information at a high-level semantic space. Two novel deep bidirectional variant models, in which we increase the depth of the nonlinearity transition in different ways, are proposed...
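The bidirectional idea can be sketched with a toy scalar RNN: one pass reads the sequence left-to-right, another right-to-left, and the per-step states are concatenated so every position sees both history and future context. This is a hypothetical illustration, not the paper's model or weights:

```python
# Sketch of bidirectional recurrence with a minimal scalar RNN:
# h_t = tanh(w_in * x_t + w_rec * h_{t-1}), run in both directions.
import math

def rnn_pass(seq, w_in=0.5, w_rec=0.5):
    """One directional pass; returns the hidden state at each step."""
    h, states = 0.0, []
    for x in seq:
        h = math.tanh(w_in * x + w_rec * h)
        states.append(h)
    return states

def bidirectional(seq):
    fwd = rnn_pass(seq)                 # left-to-right states
    bwd = rnn_pass(seq[::-1])[::-1]     # right-to-left states, re-aligned
    return [(f, b) for f, b in zip(fwd, bwd)]  # "concatenated" per-step states

states = bidirectional([1.0, -1.0, 2.0])
```

In the full model, the scalar recurrence is replaced by LSTM cells and the inputs come from CNN image features and word embeddings, but the direction-merging structure is the same.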

  17. An overview of latest deep water technologies

    International Nuclear Information System (INIS)

    Anon.

    1995-01-01

The 8th Deep Offshore Technology Conference (DOT VIII, Rio de Janeiro, October 30 - November 3, 1995) brought together renowned specialists in deep water development projects, as well as managers from oil companies and engineering/service companies, to discuss state-of-the-art technologies and ongoing projects in the deep offshore. This paper is a compilation of the session summaries on subsea technologies, mooring and dynamic positioning, floaters (Tension Leg Platforms (TLP) and Floating Production, Storage and Offloading (FPSO) units), pipelines and risers, exploration and drilling, and other deep water techniques. (J.S.)

  18. Deep learning in neural networks: an overview.

    Science.gov (United States)

    Schmidhuber, Jürgen

    2015-01-01

    In recent years, deep artificial neural networks (including recurrent ones) have won numerous contests in pattern recognition and machine learning. This historical survey compactly summarizes relevant work, much of it from the previous millennium. Shallow and Deep Learners are distinguished by the depth of their credit assignment paths, which are chains of possibly learnable, causal links between actions and effects. I review deep supervised learning (also recapitulating the history of backpropagation), unsupervised learning, reinforcement learning & evolutionary computation, and indirect search for short programs encoding deep and large networks.

  19. Combining shallow and deep processing for a robust, fast, deep-linguistic dependency parser

    OpenAIRE

    Schneider, G

    2004-01-01

This paper describes Pro3Gres, a fast, robust, broad-coverage parser that delivers deep-linguistic grammatical relation structures as output, which are closer to predicate-argument structures and more informative than pure constituency structures. The parser stays as shallow as possible for each task, combining shallow and deep-linguistic methods by integrating chunking and by expressing the majority of long-distance dependencies in a context-free way. It combines statistical and rule-based...

  20. Fire protection countermeasures for containment ventilation systems

    International Nuclear Information System (INIS)

    Alvares, N.; Beason, D.; Bergman, V.; Creighton, J.; Ford, H.; Lipska, A.

    1980-01-01

The goal of this project is to find countermeasures to protect High Efficiency Particulate Air (HEPA) filters, in exit ventilation ducts, from the heat and smoke generated by fire. Initially, methods were developed to cool fire-heated air by fine water spray upstream of the filters. It was recognized that smoke aerosol exposure to HEPA filters could also cause disruption of the containment system. Through testing and analysis, several methods to partially mitigate the smoke exposure to the HEPA filters were identified. A continuous, movable, high-efficiency prefilter using modified commercial equipment was designed. The technique is capable of protecting HEPA filters over the total time duration of the test fires. Its success depended on modifying the prefiltration media: commercially available filter media have particle sorption efficiencies that are inversely proportional to media strength, so to obtain both efficiency and strength, rolling filter media with the desired properties were laminated together. The approach was Edisonian, but it converged in short order on an effective combination of prefilters. The application of this technique is qualified, since it is of use only to protect HEPA filters from fire-generated smoke aerosols. It is not believed that this technique is cost effective in the total spectrum of containment systems, especially if standard fire protection systems are available in the space. But in areas of high fire risk, where the potential fuel load is large and ignition sources are plentiful, the complication of a rolling prefilter in exit ventilation ducts to protect HEPA filters from smoke aerosols is definitely justified.

  1. DeepVel: Deep learning for the estimation of horizontal velocities at the solar surface

    Science.gov (United States)

    Asensio Ramos, A.; Requerey, I. S.; Vitas, N.

    2017-07-01

    Many phenomena taking place in the solar photosphere are controlled by plasma motions. Although the line-of-sight component of the velocity can be estimated using the Doppler effect, we do not have direct spectroscopic access to the components that are perpendicular to the line of sight. These components are typically estimated using methods based on local correlation tracking. We have designed DeepVel, an end-to-end deep neural network that produces an estimation of the velocity at every single pixel, every time step, and at three different heights in the atmosphere from just two consecutive continuum images. We confront DeepVel with local correlation tracking, pointing out that they give very similar results in the time and spatially averaged cases. We use the network to study the evolution in height of the horizontal velocity field in fragmenting granules, supporting the buoyancy-braking mechanism for the formation of integranular lanes in these granules. We also show that DeepVel can capture very small vortices, so that we can potentially expand the scaling cascade of vortices to very small sizes and durations. The movie attached to Fig. 3 is available at http://www.aanda.org
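Local correlation tracking, the baseline DeepVel is compared against, estimates the displacement field by finding the shift that maximizes the cross-correlation between patches of two consecutive frames. A 1-D, integer-lag toy sketch of that idea (not the actual LCT code used in the paper); the "frames" are invented test signals:

```python
# Sketch: estimate the displacement of a feature between two frames as the
# integer lag that maximizes their overlap cross-correlation (1-D toy case).

def best_lag(frame_a, frame_b, max_lag=3):
    """Return the shift of frame_b relative to frame_a that maximizes
    the cross-correlation over the overlapping samples."""
    def score(lag):
        pairs = [(frame_a[i], frame_b[i + lag])
                 for i in range(len(frame_a))
                 if 0 <= i + lag < len(frame_b)]
        return sum(a * b for a, b in pairs)
    return max(range(-max_lag, max_lag + 1), key=score)

frame1 = [0.0, 0.0, 1.0, 3.0, 1.0, 0.0, 0.0, 0.0]
frame2 = [0.0, 0.0, 0.0, 0.0, 1.0, 3.0, 1.0, 0.0]  # same feature, shifted right by 2

lag = best_lag(frame1, frame2)
```

Real LCT does this in 2-D with apodized windows and sub-pixel peak interpolation; DeepVel instead maps the two frames directly to a per-pixel velocity with a neural network.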

  2. Deep Learning in Drug Discovery.

    Science.gov (United States)

    Gawehn, Erik; Hiss, Jan A; Schneider, Gisbert

    2016-01-01

    Artificial neural networks had their first heyday in molecular informatics and drug discovery approximately two decades ago. Currently, we are witnessing renewed interest in adapting advanced neural network architectures for pharmaceutical research by borrowing from the field of "deep learning". Compared with some of the other life sciences, their application in drug discovery is still limited. Here, we provide an overview of this emerging field of molecular informatics, present the basic concepts of prominent deep learning methods and offer motivation to explore these techniques for their usefulness in computer-assisted drug discovery and design. We specifically emphasize deep neural networks, restricted Boltzmann machine networks and convolutional networks. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. The Start2Bike program is effective in increasing health-enhancing physical activity : A controlled study

    NARCIS (Netherlands)

    Ooms, Linda; Veenhof, Cindy; De Bakker, Dinny H.

    2017-01-01

    Background: The sports club is seen as a new relevant setting to promote health-enhancing physical activity (HEPA) among inactive population groups. Little is known about the effectiveness of strategies and activities implemented in the sports club setting on increasing HEPA levels. This study

  4. 76 FR 35744 - Amendments to National Emission Standards for Hazardous Air Pollutants for Area Sources: Plating...

    Science.gov (United States)

    2011-06-20

    ... Metal Valve and Pipe Fitting Manufacturing; 332999, All Other Miscellaneous Fabricated Metal Product... spraying operation and exhaust them to a water curtain, or a cartridge, fabric, or HEPA filter... from the thermal spraying operation and exhaust them to a cartridge, fabric, or HEPA filter...

  5. Iris Transponder-Communications and Navigation for Deep Space

    Science.gov (United States)

    Duncan, Courtney B.; Smith, Amy E.; Aguirre, Fernando H.

    2014-01-01

    The Jet Propulsion Laboratory has developed the Iris CubeSat compatible deep space transponder for INSPIRE, the first CubeSat to deep space. Iris is 0.4 U, 0.4 kg, consumes 12.8 W, and interoperates with NASA's Deep Space Network (DSN) on X-Band frequencies (7.2 GHz uplink, 8.4 GHz downlink) for command, telemetry, and navigation. This talk discusses the Iris for INSPIRE, its features and requirements; future developments and improvements underway; deep space and proximity operations applications for Iris; high rate earth orbit variants; and ground requirements, such as are implemented in the DSN, for deep space operations.

  6. A Comparative Morphometrical Study of the Pecten Oculi in Different Avian Species

    Directory of Open Access Journals (Sweden)

    Mustafa Orhun Dayan

    2013-01-01

    Full Text Available The structure of the pecten oculi was investigated in the ostrich, duck, pigeon, turkey, and starling. The pecten oculi of the ostrich was of the vaned type, made up of primary, secondary, and a few tertiary lamellae. However, the duck, pigeon, turkey, and starling had a pleated-type pecten oculi, which displayed a folded structure. The numbers of pleats of the pectens were 12, 13-14, 21-22, and 17 in the duck, pigeon, turkey, and starling, respectively. Light microscopic investigation demonstrated that the pecten oculi is basically composed of numerous capillaries, large blood vessels, and pigment cells in all investigated avian species. Capillaries were 20.23, 14.34, 11.78, 12.58, and 12.78 μm in diameter in the ostrich, duck, pigeon, turkey, and starling, respectively. The capillaries are surrounded by a thick basal membrane, and pigmented cells were observed around the capillaries.

  7. Rapid fabricating technique for multi-layered human hepatic cell sheets by forceful contraction of the fibroblast monolayer.

    Directory of Open Access Journals (Sweden)

    Yusuke Sakai

    Full Text Available Cell sheet engineering is attracting attention from investigators in various fields, from basic research scientists to clinicians focused on regenerative medicine. However, hepatocytes have a limited proliferation potential in vitro, and it generally takes several days to form a sheet morphology and multi-layered sheets. We herein report our rapid and efficient technique for generating multi-layered human hepatic cell (HepaRG® cell) sheets using pre-cultured fibroblast monolayers derived from human skin (TIG-118 cells) as a feeder layer on a temperature-responsive culture dish. Multi-layered TIG-118/HepaRG cell sheets with a thick morphology were harvested on day 4 of culturing HepaRG cells by forceful contraction of the TIG-118 cells, and the resulting sheet could be easily handled. In addition, the human albumin and alpha 1-antitrypsin synthesis activities of TIG-118/HepaRG cells were approximately 1.2 and 1.3 times higher than those of HepaRG cells, respectively. Therefore, this technique is considered to be a promising modality for rapidly fabricating multi-layered human hepatocyte sheets from cells with limited proliferation potential, and the engineered cell sheet could be used for cell transplantation with highly specific functions.

  8. Strategies to Increase After-School Program Staff Skills to Promote Healthy Eating and Physical Activity.

    Science.gov (United States)

    Weaver, R Glenn; Beets, Michael W; Beighle, Aaron; Webster, Collin; Huberty, Jennifer; Moore, Justin B

    2016-01-01

    Standards targeting children's healthy eating and physical activity (HEPA) in after-school programs call for staff to display or refrain from HEPA-promoting or -discouraging behaviors that are linked to children's HEPA. This study evaluated strategies to align staff behaviors with HEPA Standards. Staff at four after-school programs serving approximately 500 children participated in professional development training from January 2012 to May 2013. Site leaders also attended workshops and received technical support during the same time frame. Changes in staff behaviors were evaluated using the System for Observing Staff Promotion of Activity and Nutrition in a pre- (fall 2011) multiple-post (spring 2012, fall 2012, and spring 2013), no-control group study design. A total of 8,949 scans were completed across the four measurement periods. Of the 19 behaviors measured, 14 changed in the appropriate direction. For example, staff engaging in physical activity with children increased from 27% to 40% of scans and staff eating unhealthy foods decreased from 56% to 14% of days. Ongoing training and technical assistance can have a measurable impact on staff behaviors linked to child-level HEPA outcomes. Future research should explore the feasibility of disseminating ongoing trainings to after-school program staff on a large scale. © 2015 Society for Public Health Education.

  9. Context and Deep Learning Design

    Science.gov (United States)

    Boyle, Tom; Ravenscroft, Andrew

    2012-01-01

    Conceptual clarification is essential if we are to establish a stable and deep discipline of technology enhanced learning. The technology is alluring; this can distract from deep design in a surface rush to exploit the affordances of the new technology. We need a basis for design, and a conceptual unit of organization, that are applicable across…

  10. Deep Generative Models for Molecular Science

    DEFF Research Database (Denmark)

    Jørgensen, Peter Bjørn; Schmidt, Mikkel Nørgaard; Winther, Ole

    2018-01-01

    Generative deep machine learning models now rival traditional quantum-mechanical computations in predicting properties of new structures, and they come with a significantly lower computational cost, opening new avenues in computational molecular science. In the last few years, a variety of deep...... generative models have been proposed for modeling molecules, which differ in both their model structure and choice of input features. We review these recent advances within deep generative models for predicting molecular properties, with particular focus on models based on the probabilistic autoencoder (or...

  11. Too Deep or Not Too Deep?: A Propensity-Matched Comparison of the Analgesic Effects of a Superficial Versus Deep Serratus Fascial Plane Block for Ambulatory Breast Cancer Surgery.

    Science.gov (United States)

    Abdallah, Faraj W; Cil, Tulin; MacLean, David; Madjdpour, Caveh; Escallon, Jaime; Semple, John; Brull, Richard

    2018-07-01

    Serratus fascial plane block can reduce pain following breast surgery, but the question of whether to inject the local anesthetic superficial or deep to the serratus muscle has not been answered. This cohort study compares the analgesic benefits of superficial versus deep serratus plane blocks in ambulatory breast cancer surgery patients at Women's College Hospital between February 2014 and December 2016. We tested the joint hypothesis that deep serratus block is noninferior to superficial serratus block for postoperative in-hospital (pre-discharge) opioid consumption and pain severity. One hundred sixty-six patients were propensity matched among 2 groups (83/group): superficial and deep serratus blocks. The cohort was used to evaluate the effect of blocks on postoperative oral morphine equivalent consumption and area under the curve for rest pain scores. We considered deep serratus block to be noninferior to superficial serratus block if it were noninferior for both outcomes, within 15 mg morphine and 4 cm·h units margins. Other outcomes included intraoperative fentanyl requirements, time to first analgesic request, recovery room stay, and incidence of postoperative nausea and vomiting. Deep serratus block was associated with postoperative morphine consumption and pain scores area under the curve that were noninferior to those of the superficial serratus block. Intraoperative fentanyl requirements, time to first analgesic request, recovery room stay, and postoperative nausea and vomiting were not different between blocks. The postoperative in-hospital analgesia associated with deep serratus block is as effective (within an acceptable margin) as superficial serratus block following ambulatory breast cancer surgery. These new findings are important to inform both current clinical practices and future prospective studies.
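    The joint hypothesis above amounts to a pair of one-sided noninferiority checks, one per outcome, each against its own margin. The sketch below illustrates the logic only; the summary statistics are hypothetical placeholders, not the study's data.

```python
def noninferior(mean_diff, se, margin, z=1.645):
    """Deep minus superficial; noninferior if the one-sided upper
    confidence bound of the difference stays below the margin."""
    upper = mean_diff + z * se
    return upper < margin

# Hypothetical summary statistics (assumptions, not from the study):
opioids_ok = noninferior(mean_diff=2.0, se=3.0, margin=15.0)  # mg morphine
pain_ok = noninferior(mean_diff=0.5, se=1.2, margin=4.0)      # cm·h AUC units
print(opioids_ok and pain_ok)  # joint hypothesis holds only if both pass
```

    Requiring both outcomes to pass is what makes the hypothesis "joint": failing either margin would defeat the noninferiority claim.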

  12. Deep learning architecture for iris recognition based on optimal Gabor filters and deep belief network

    Science.gov (United States)

    He, Fei; Han, Ye; Wang, Han; Ji, Jinchao; Liu, Yuanning; Ma, Zhiqiang

    2017-03-01

    Gabor filters are widely utilized to detect iris texture information in several state-of-the-art iris recognition systems. However, the proper Gabor kernels and the generative pattern of iris Gabor features need to be predetermined in application. The traditional empirical Gabor filters and shallow iris encoding schemes are incapable of dealing with complex variations in iris imaging, including illumination, aging, deformation, and device variations. Therefore, an adaptive Gabor filter selection strategy and deep learning architecture are presented. We first employ the particle swarm optimization approach and its binary version to define a set of data-driven Gabor kernels fitting the most informative filtering bands, and then capture complex patterns from the optimal Gabor filtered coefficients with a trained deep belief network. A succession of comparative experiments validates that our optimal Gabor filters produce more distinctive Gabor coefficients and that our deep iris representations are more robust and stable than traditional iris Gabor codes. Furthermore, the depth and scales of the deep learning architecture are also discussed.
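    A data-driven Gabor kernel of the kind such an optimizer would tune can be generated in a few lines. This is a generic real-valued 2-D Gabor filter (Gaussian envelope times a cosine carrier); the parameter names (size, sigma, theta, lam, psi) are a standard textbook parameterization, not the paper's exact formulation.

```python
import numpy as np

def gabor_kernel(size, sigma, theta, lam, psi=0.0):
    """Real part of a 2-D Gabor filter: Gaussian envelope times a cosine
    carrier at orientation theta and wavelength lam."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)   # rotate coordinates
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + yr**2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * xr / lam + psi)
    return envelope * carrier

k = gabor_kernel(size=9, sigma=2.0, theta=np.pi / 4, lam=4.0)
print(k.shape)  # (9, 9)
```

    A swarm-based search like the one described would treat (sigma, theta, lam) as the particle coordinates and score each kernel by how discriminative its filtered iris coefficients are.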

  13. Molecular analysis of deep subsurface bacteria

    International Nuclear Information System (INIS)

    Jimenez Baez, L.E.

    1989-09-01

    Deep sediment samples from site C10a, in Appleton, and sites P24, P28, and P29 at the Savannah River Site (SRS), near Aiken, South Carolina, were studied to determine their microbial community composition, DNA homology, and mol %G+C. Different geological formations with great variability in hydrogeological parameters were found across the depth profile. Phenotypic identification of deep subsurface bacteria underestimated the bacterial diversity at the three SRS sites, since bacteria with the same phenotype had different DNA composition and less than 70% DNA homology. Total DNA hybridization and mol %G+C analysis of deep sediment bacterial isolates suggested that each formation comprises different microbial communities. Depositional environment was more important than site and geological formation in determining the DNA relatedness between deep subsurface bacteria, since more than 70% of bacteria with 20% or more DNA homology came from the same depositional environments. Based on phenotypic and genotypic tests, Pseudomonas spp. and Acinetobacter spp.-like bacteria were identified in 85 million year old sediments. This suggests that these microbial communities might have adapted over a long period of time to the environmental conditions of the deep subsurface
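    Mol %G+C, one of the genotypic measures used above, is simply the molar fraction of guanine plus cytosine in the DNA. A minimal sketch (the chromatographic or thermal-denaturation assays actually used in such studies measure this indirectly, of course):

```python
def mol_percent_gc(seq):
    """Mol %G+C of a DNA sequence (ambiguous bases are not counted as G/C)."""
    seq = seq.upper()
    gc = sum(seq.count(base) for base in "GC")
    return 100.0 * gc / len(seq)

print(mol_percent_gc("ATGCGC"))  # ~66.7
```

    Isolates with the same phenotype but clearly different mol %G+C values, as reported above, cannot belong to the same species, which is why the genotypic measures revealed diversity the phenotypic tests missed.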

  14. 76 FR 57913 - Amendments to National Emission Standards for Hazardous Air Pollutants for Area Sources: Plating...

    Science.gov (United States)

    2011-09-19

    ... Fixture Fitting and Trim Manufacturing; Other Metal Valve and Pipe Fitting Manufacturing; 332999, All... thermal spraying operation and exhaust them to a water curtain, or a cartridge, fabric, or HEPA filter... from the thermal spraying operation and exhaust them to a cartridge, fabric, or HEPA filter...

  15. Joint Training of Deep Boltzmann Machines

    OpenAIRE

    Goodfellow, Ian; Courville, Aaron; Bengio, Yoshua

    2012-01-01

    We introduce a new method for training deep Boltzmann machines jointly. Prior methods require an initial learning pass that trains the deep Boltzmann machine greedily, one layer at a time, or do not perform well on classification tasks.

  16. Deep boreholes; Tiefe Bohrloecher

    Energy Technology Data Exchange (ETDEWEB)

    Bracke, Guido [Gesellschaft fuer Anlagen- und Reaktorsicherheit gGmbH Koeln (Germany); Charlier, Frank [NSE international nuclear safety engineering gmbh, Aachen (Germany); Geckeis, Horst [Karlsruher Institut fuer Technologie (Germany). Inst. fuer Nukleare Entsorgung; and others

    2016-02-15

    The report on deep boreholes covers the following subject areas: methods for safe enclosure of radioactive wastes, requirements concerning the geological conditions of possible boreholes, reversibility of decisions and retrievability, and the status of drilling technology. The introduction covers national and international activities. Further chapters deal with the following issues: the basic concept of storage in deep boreholes, the status of drilling technology, safe enclosure, geomechanics and stability, reversibility of decisions, risk scenarios, compliance with safety requirements and site selection criteria, and research and development needs.

  17. Effectiveness of interim stage filter in the exhaust system of glove boxes

    International Nuclear Information System (INIS)

    Patre, D.K.; Vangara, H.; Thanamani, S.; Gopalakrishnan, R.K.; Mhatre, Amol M.

    2018-01-01

    All operations in radiochemical laboratories are carried out in containment systems such as glove boxes and fume hoods. For controlling air contamination, two separate air cleaning systems are incorporated: the laboratory has a general ventilation system, and the glove boxes are provided with a negative pressure system (NPS). Glove box exhaust air is passed through a three-stage filtration system (in situ, interim, and final) before being discharged to the atmosphere. In addition to the individual HEPA filters of each glove box, an interim HEPA filter bank is introduced at the laboratory end; it was added to reduce the load on the main exhaust filter system. Finally, the exhaust air is discharged through the final-stage HEPA filter, located in the filter house, through the stack. The interim HEPA filter bank provides additional protection against the release of particulate activity and reduces the load on the final-stage filters. In the present work, efforts have been made to validate the interim-stage filter, which was introduced to limit environmental releases
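    For filter stages in series, as in the three-stage arrangement described, the penetrations (1 minus efficiency) multiply, which is why an interim bank adds protection as well as reducing load downstream. A minimal sketch, using illustrative 99.97% per-stage efficiencies (an assumption, the nominal HEPA rating, not measured values from this work):

```python
def overall_efficiency(stage_efficiencies):
    """Overall collection efficiency of filters in series:
    penetrations multiply, so E = 1 - prod(1 - E_i)."""
    penetration = 1.0
    for e in stage_efficiencies:
        penetration *= (1.0 - e)
    return 1.0 - penetration

# in situ, interim, and final stages, each assumed 99.97% efficient
print(overall_efficiency([0.9997, 0.9997, 0.9997]))
```

    With three such stages the overall penetration is (3e-4)^3, i.e. roughly one particle in 37 billion, illustrating why the interim bank is worthwhile even though each stage alone is already highly efficient.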

  18. Loading capacity of various filters for lithium fire generated aerosols

    International Nuclear Information System (INIS)

    Jeppson, D.W.; Barreca, J.R.

    1980-01-01

    The lithium aerosol loading capacity of a prefilter, HEPA filters, and a sand and gravel bed filter was determined. The test aerosol was characterized and was generated by burning lithium in an unlimited air atmosphere. Correlations to sodium aerosol loading capacities were made to relate existing data to lithium aerosol loadings under varying conditions. This work is being conducted in support of the fusion reactor safety program. The lithium aerosol was generated by burning lithium pools of up to 45 kg in a 340 m³ low-humidity air atmosphere to supply aerosol to recirculating filter test loops. The aerosol was sampled to determine particle size, mass concentrations, and chemical species. The dew point and gas concentrations were monitored throughout the tests. Loop inlet aerosol mass concentrations ranged up to 5 g/m³. Chemical compounds analyzed to be present in the aerosol include Li₂O, LiOH, and Li₂CO₃. HEPA filters with and without separators, and a prefilter and HEPA filter in series, were loaded with 7.8 to 11.1 kg/m² of aerosol at a flow rate of 1.31 m/sec and a 5 kPa pressure drop. The HEPA filter loading capacity was determined to be greater at a lower flow rate: the loading capacity increased from 0.4 to 2.8 kg when the flow rate was decreased from 1.31 to 0.26 m/sec, for a pressure drop of 0.11 kPa due to aerosol buildup. The prefilter tested in series with a HEPA filter did not increase the total loading capacity significantly for the same total pressure drop. Separators in the HEPA filter had only a minor effect on loading capacity. The sand and gravel bed filter loaded to 0.50 kg/m² at an aerosol flow rate of 0.069 m/sec and a final pressure drop of 6.2 kPa. These loading capacities and their dependence on test variables are similar to those reported for sodium aerosols, except for the dependence of the lithium aerosol HEPA loading capacity on flow rate
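    The loading capacities quoted above are areal loadings, i.e. captured aerosol mass per unit of filter face area. A worked example; the 0.33 m² filter area is an assumed illustrative value, not one stated in the abstract:

```python
def areal_loading(mass_kg, area_m2):
    """Aerosol mass captured per unit of filter face area, in kg/m^2."""
    return mass_kg / area_m2

# Hypothetical: 2.8 kg of aerosol captured on an assumed 0.33 m^2 face
# gives ~8.5 kg/m^2, within the 7.8-11.1 kg/m^2 range reported above.
print(round(areal_loading(2.8, 0.33), 1))  # 8.5
```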

  19. Hepatitis A and hepatitis B vaccination coverage among adults with chronic liver disease.

    Science.gov (United States)

    Yue, Xin; Black, Carla L; O'Halloran, Alissa; Lu, Peng-Jun; Williams, Walter W; Nelson, Noele P

    2018-02-21

    Infection with hepatitis A and hepatitis B virus can increase the risk of morbidity and mortality in persons with chronic liver disease (CLD). The Advisory Committee on Immunization Practices recommends hepatitis A (HepA) and hepatitis B (HepB) vaccination for persons with CLD. Data from the 2014 and 2015 National Health Interview Surveys (NHIS), nationally representative, in-person interview surveys of the non-institutionalized US civilian population, were used to assess self-reported HepA (≥1 and ≥2 doses) and HepB vaccination (≥1 and ≥3 doses) coverage among adults who reported a chronic or long-term liver condition. Multivariable logistic regression was used to identify factors independently associated with HepA and HepB vaccination among adults with CLD. Overall, 19.4% and 11.5% of adults aged ≥ 18 years with CLD reported receiving ≥1 dose and ≥2 doses of HepA vaccine, respectively, compared with 14.7% and 9.1% of adults without CLD (p CLD, ≥1 dose). Age, education, geographic region, and international travel were associated with receipt of ≥2 doses of HepA vaccine among adults with CLD. Overall, 35.7% and 29.1% of adults with CLD reported receiving ≥1 dose and ≥3 doses of HepB vaccine, respectively, compared with 30.2% and 24.7% of adults without CLD (p CLD, ≥1 dose). Age, education, and receipt of influenza vaccination in the past 12 months were associated with receipt of ≥3 doses of HepB vaccine among adults with CLD. Among adults with CLD and ≥10 provider visits, only 13.8% and 35.3% had received ≥2 doses of HepA and ≥3 doses of HepB vaccine, respectively. HepA and HepB vaccination among adults with CLD is suboptimal and missed opportunities to vaccinate occurred. Providers should adhere to recommendations to vaccinate persons with CLD to increase vaccination among this population. Copyright © 2018 Elsevier Ltd. All rights reserved.

  20. DeepLoc: prediction of protein subcellular localization using deep learning

    DEFF Research Database (Denmark)

    Almagro Armenteros, Jose Juan; Sønderby, Casper Kaae; Sønderby, Søren Kaae

    2017-01-01

    The prediction of eukaryotic protein subcellular localization is a well-studied topic in bioinformatics due to its relevance in proteomics research. Many machine learning methods have been successfully applied in this task, but in most of them, predictions rely on annotation of homologues from...... knowledge databases. For novel proteins where no annotated homologues exist, and for predicting the effects of sequence variants, it is desirable to have methods for predicting protein properties from sequence information only. Here, we present a prediction algorithm using deep neural networks to predict...... current state-of-the-art algorithms, including those relying on homology information. The method is available as a web server at http://www.cbs.dtu.dk/services/DeepLoc . Example code is available at https://github.com/JJAlmagro/subcellular_localization . The dataset is available at http...

  1. Pre-cementation of deep shaft

    Science.gov (United States)

    Heinz, W. F.

    1988-12-01

    Pre-cementation or pre-grouting of deep shafts in South Africa is an established technique to improve safety and reduce water ingress during shaft sinking. The recent completion of several pre-cementation projects for shafts deeper than 1000 m has once again highlighted the effectiveness of pre-grouting of shafts utilizing deep slimline boreholes, incorporating wireline techniques for drilling and conventional deep borehole grouting techniques for pre-cementation. Pre-cementation of a deep shaft will: (i) Increase the safety of the shaft sinking operation. (ii) Minimize water and gas inflow during shaft sinking. (iii) Minimize the time lost due to additional grouting operations during sinking of the shaft, and hence minimize costly delays and standing time of shaft sinking crews and equipment. (iv) Provide detailed information on the geology of the proposed shaft site; information on anomalies, dykes, and faults, as well as reef (gold-bearing conglomerate) intersections, can be obtained from the evaluation of cores from the pre-cementation boreholes. (v) Provide improved rock strength for excavations in the immediate vicinity of the shaft area. The paper describes pre-cementation techniques recently applied successfully from surface and draws some conclusions for further consideration.

  2. Applications of Deep Learning in Biomedicine.

    Science.gov (United States)

    Mamoshina, Polina; Vieira, Armando; Putin, Evgeny; Zhavoronkov, Alex

    2016-05-02

    Increases in throughput and installed base of biomedical research equipment led to a massive accumulation of -omics data known to be highly variable, high-dimensional, and sourced from multiple often incompatible data platforms. While this data may be useful for biomarker identification and drug discovery, the bulk of it remains underutilized. Deep neural networks (DNNs) are efficient algorithms based on the use of compositional layers of neurons, with advantages well matched to the challenges -omics data presents. While achieving state-of-the-art results and even surpassing human accuracy in many challenging tasks, the adoption of deep learning in biomedicine has been comparatively slow. Here, we discuss key features of deep learning that may give this approach an edge over other machine learning methods. We then consider limitations and review a number of applications of deep learning in biomedical studies demonstrating proof of concept and practical utility.

  3. Deep Complementary Bottleneck Features for Visual Speech Recognition

    NARCIS (Netherlands)

    Petridis, Stavros; Pantic, Maja

    Deep bottleneck features (DBNFs) have been used successfully in the past for acoustic speech recognition from audio. However, research on extracting DBNFs for visual speech recognition is very limited. In this work, we present an approach to extract deep bottleneck visual features based on deep

  4. HEPA Filter Bank In-Place Leak Test of Advanced Fuel Science Building

    Energy Technology Data Exchange (ETDEWEB)

    Ji, C. G.; Bae, S. O.; Kim, C. H

    2007-12-15

    To maintain the optimum condition of the Advanced Fuel Science Building at KAERI, this report describes leak tests for the HEPA filters of the HVAC system in this facility. The main topics of this report are as follows: - Procurement Specification - Visual Inspection - Airflow Capacity Test - HEPA Filter Bank In-Place Test.

  5. Producing deep-water hydrocarbons

    International Nuclear Information System (INIS)

    Pilenko, Thierry

    2011-01-01

    Several studies have related the history of and progress in offshore production from oil and gas fields, in relation to reserves and the techniques for producing oil offshore. The intention here is not to review these studies but rather to argue that the activities of prospecting for and producing deep-water oil and gas call for a combination of technology and project management and, above all, of devotion and innovation. Without this sense of commitment motivating the men and women in this industry, the human adventure of deep-water production would never have taken place.

  6. Deep inelastic processes and the parton model

    International Nuclear Information System (INIS)

    Altarelli, G.

    The lecture was intended as an elementary introduction to the physics of deep inelastic phenomena from the point of view of theory. General formulae and facts concerning inclusive deep inelastic processes of the form l+N→l'+hadrons (electroproduction, neutrino scattering) are first recalled. The deep inelastic annihilation e⁺e⁻→hadrons is then envisaged. The light cone approach, the parton model, and their relation are mainly emphasized.

  7. Life Support for Deep Space and Mars

    Science.gov (United States)

    Jones, Harry W.; Hodgson, Edward W.; Kliss, Mark H.

    2014-01-01

    How should life support for deep space be developed? The International Space Station (ISS) life support system is the operational result of many decades of research and development. Long duration deep space missions such as Mars have been expected to use matured and upgraded versions of ISS life support. Deep space life support must use the knowledge base incorporated in ISS but it must also meet much more difficult requirements. The primary new requirement is that life support in deep space must be considerably more reliable than on ISS or anywhere in the Earth-Moon system, where emergency resupply and a quick return are possible. Due to the great distance from Earth and the long duration of deep space missions, if life support systems fail, the traditional approaches for emergency supply of oxygen and water, emergency supply of parts, and crew return to Earth or escape to a safe haven are likely infeasible. The Orbital Replacement Unit (ORU) maintenance approach used by ISS is unsuitable for deep space with ORUs as large and complex as those originally provided in ISS designs because it minimizes opportunities for commonality of spares, requires replacement of many functional parts with each failure, and results in substantial launch mass and volume penalties. It has become impractical even for ISS after the shuttle era, resulting in the need for ad hoc repair activity at lower assembly levels with consequent crew time penalties and extended repair timelines. Less complex, more robust technical approaches may be needed to meet the difficult deep space requirements for reliability, maintainability, and reparability. Developing an entirely new life support system would neglect what has been achieved. The suggested approach is to use the ISS life support technologies as a platform to build on and to continue to improve ISS subsystems while also developing new subsystems where needed to meet deep space requirements.

  8. Automatic Segmentation and Deep Learning of Bird Sounds

    NARCIS (Netherlands)

    Koops, Hendrik Vincent; Van Balen, J.M.H.; Wiering, F.

    2015-01-01

    We present a study on automatic birdsong recognition with deep neural networks using the BIRDCLEF2014 dataset. Through deep learning, feature hierarchies are learned that represent the data on several levels of abstraction. Deep learning has been applied with success to problems in fields such as

  9. Deep Learning: A Primer for Radiologists.

    Science.gov (United States)

    Chartrand, Gabriel; Cheng, Phillip M; Vorontsov, Eugene; Drozdzal, Michal; Turcotte, Simon; Pal, Christopher J; Kadoury, Samuel; Tang, An

    2017-01-01

    Deep learning is a class of machine learning methods that are gaining success and attracting interest in many domains, including computer vision, speech recognition, natural language processing, and playing games. Deep learning methods produce a mapping from raw inputs to desired outputs (eg, image classes). Unlike traditional machine learning methods, which require hand-engineered feature extraction from inputs, deep learning methods learn these features directly from data. With the advent of large datasets and increased computing power, these methods can produce models with exceptional performance. These models are multilayer artificial neural networks, loosely inspired by biologic neural systems. Weighted connections between nodes (neurons) in the network are iteratively adjusted based on example pairs of inputs and target outputs by back-propagating a corrective error signal through the network. For computer vision tasks, convolutional neural networks (CNNs) have proven to be effective. Recently, several clinical applications of CNNs have been proposed and studied in radiology for classification, detection, and segmentation tasks. This article reviews the key concepts of deep learning for clinical radiologists, discusses technical requirements, describes emerging applications in clinical radiology, and outlines limitations and future directions in this field. Radiologists should become familiar with the principles and potential applications of deep learning in medical imaging. © RSNA, 2017.
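    The weight-adjustment loop described above (back-propagating a corrective error signal) can be shown on a toy one-weight regression; this is a pedagogical sketch, not a clinical model:

```python
import numpy as np

# Toy task: learn y = 2x with a single linear weight via gradient descent.
w = 0.0          # weighted connection, initialized at zero
lr = 0.1         # learning rate
xs = np.array([1.0, 2.0, 3.0])
ys = 2.0 * xs    # target outputs for the example input/target pairs

for _ in range(100):
    pred = w * xs
    err = pred - ys                # corrective error signal
    grad = 2 * np.mean(err * xs)  # gradient of mean squared error w.r.t. w
    w -= lr * grad                # iterative weight adjustment

print(round(w, 3))  # ~2.0
```

    A CNN of the kind discussed for radiology applies exactly this update rule, just to millions of convolutional weights at once, with the gradient computed layer by layer via the chain rule.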

  10. Protein Secondary Structure Prediction Using Deep Convolutional Neural Fields.

    Science.gov (United States)

    Wang, Sheng; Peng, Jian; Ma, Jianzhu; Xu, Jinbo

    2016-01-11

    Protein secondary structure (SS) prediction is important for studying protein structure and function. When only the sequence (profile) information is used as input feature, currently the best predictors can obtain ~80% Q3 accuracy, which has not been improved in the past decade. Here we present DeepCNF (Deep Convolutional Neural Fields) for protein SS prediction. DeepCNF is a Deep Learning extension of Conditional Neural Fields (CNF), which is an integration of Conditional Random Fields (CRF) and shallow neural networks. DeepCNF can model not only complex sequence-structure relationship by a deep hierarchical architecture, but also interdependency between adjacent SS labels, so it is much more powerful than CNF. Experimental results show that DeepCNF can obtain ~84% Q3 accuracy, ~85% SOV score, and ~72% Q8 accuracy, respectively, on the CASP and CAMEO test proteins, greatly outperforming currently popular predictors. As a general framework, DeepCNF can be used to predict other protein structure properties such as contact number, disorder regions, and solvent accessibility.
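    The Q3 metric reported above is simply the percentage of residues whose three-state label (helix H, strand E, coil C) is predicted correctly; a minimal sketch:

```python
def q3_accuracy(predicted, observed):
    """Q3: percentage of residues with the correct 3-state SS label (H/E/C)."""
    assert len(predicted) == len(observed)
    correct = sum(p == o for p, o in zip(predicted, observed))
    return 100.0 * correct / len(observed)

print(q3_accuracy("HHHECC", "HHEECC"))  # ~83.3: 5 of 6 residues correct
```

    Q8 is the same computation over the finer eight-state alphabet, and SOV additionally scores the overlap of predicted and observed segments rather than per-residue matches.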

  11. Stratification-Based Outlier Detection over the Deep Web.

    Science.gov (United States)

    Xian, Xuefeng; Zhao, Pengpeng; Sheng, Victor S; Fang, Ligang; Gu, Caidong; Yang, Yuanfeng; Cui, Zhiming

    2016-01-01

    For many applications, finding rare instances or outliers can be more interesting than finding common patterns. Existing work on outlier detection has not considered the context of the deep web. In this paper, we argue that, for many scenarios, it is more meaningful to detect outliers over the deep web. In the context of the deep web, users must submit queries through a query interface to retrieve the corresponding data, so traditional data mining methods cannot be applied directly. The primary contribution of this paper is a new data mining method for outlier detection over the deep web. In our approach, the query space of a deep web data source is stratified based on a pilot sample. Neighborhood sampling and uncertainty sampling are developed with the goal of improving recall and precision based on this stratification. Finally, a careful performance evaluation of our algorithm confirms that our approach can effectively detect outliers in the deep web.
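
    A heavily simplified sketch of the stratification idea (partition the data, then apply a per-stratum deviation test). The strata, values, and 2-sigma rule below are illustrative assumptions, not the paper's exact algorithm:

```python
import statistics

# Illustrative strata with made-up values; 50 is a planted outlier.
strata = {
    "low":  [10, 11, 9, 10, 12, 50],
    "high": [100, 98, 102, 101, 99],
}

outliers = []
for name, values in strata.items():
    mu = statistics.mean(values)
    sigma = statistics.pstdev(values)
    for v in values:
        # flag values far from their own stratum's statistics
        if sigma and abs(v - mu) > 2 * sigma:
            outliers.append((name, v))

print(outliers)
```

    Stratifying first matters here: against the pooled statistics of both strata, the planted value 50 would sit between the two clusters and be harder to flag.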

  12. Deep Learning and Bayesian Methods

    Directory of Open Access Journals (Sweden)

    Prosper Harrison B.

    2017-01-01

    Full Text Available A revolution is underway in which deep neural networks are routinely used to solve difficult problems such as face recognition and natural language understanding. Particle physicists have taken notice and have started to deploy these methods, achieving results that suggest a potentially significant shift in how data might be analyzed in the not too distant future. We discuss a few recent developments in the application of deep neural networks and then indulge in speculation about how such methods might be used to automate certain aspects of data analysis in particle physics. Next, the connection to Bayesian methods is discussed and the paper ends with thoughts on a significant practical issue, namely, how, from a Bayesian perspective, one might optimize the construction of deep neural networks.

  13. Eric Davidson and deep time.

    Science.gov (United States)

    Erwin, Douglas H

    2017-10-13

    Eric Davidson had a deep and abiding interest in the role developmental mechanisms played in generating evolutionary patterns documented in deep time, from the origin of the euechinoids to the processes responsible for the morphological architectures of major animal clades. Although he was not an evolutionary biologist, Davidson's interests long preceded the current excitement over comparative evolutionary developmental biology. Here I discuss three aspects of the intersection between his research and evolutionary patterns in deep time: First, understanding the mechanisms of body plan formation, particularly those associated with the early diversification of major metazoan clades. Second, a critique of early claims about ancestral metazoans based on the discoveries of highly conserved genes across bilaterian animals. Third, Davidson's own involvement in paleontology through a collaborative study of the fossil embryos from the Ediacaran Doushantuo Formation in south China.

  14. More Far-Side Deep Moonquake Nests Discovered

    Science.gov (United States)

    Nakamura, Y.; Jackson, John A.; Jackson, Katherine G.

    2004-01-01

    As reported last year, we started to reanalyze the seismic data acquired from 1969 to 1977 with a network of stations established on the Moon during the Apollo missions. The reanalysis was motivated by recent advances in computer technology, which make it possible to employ much more sophisticated analysis techniques than were previously practical. The primary objective of the reanalysis was to search for deep moonquakes on the far side of the Moon and, if found, to use them to infer the structure of the Moon's deep interior, including a possible central core. The first step was to identify any new deep moonquakes that escaped our earlier search by applying a combination of waveform cross-correlation and single-link cluster analysis, and then to see if any of them are from previously unknown nests of deep moonquakes. We positively identified 7245 deep moonquakes, more than a five-fold increase from the previous 1360. We also found at least 88 previously unknown deep-moonquake nests. The question was whether any of these newly discovered nests were on the far side of the Moon, and we now report that our analysis of the data indicates that some of them are indeed on the far side.
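
    The waveform cross-correlation step mentioned above can be sketched as sliding a template waveform along a longer record and taking the lag of maximum correlation; the synthetic signals below are illustrative, not Apollo data:

```python
import numpy as np

# A known template waveform is buried 30 samples into a noisy record;
# cross-correlation recovers its position.
rng = np.random.default_rng(1)
template = rng.normal(size=50)                     # reference waveform
record = np.concatenate([np.zeros(30), template, np.zeros(20)])
record += 0.1 * rng.normal(size=record.size)       # measurement noise

cc = np.correlate(record, template, mode="valid")  # correlation at each lag
best_lag = int(np.argmax(cc))
print(best_lag)                                    # the event starts 30 samples in
```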

  15. DeepAnomaly: Combining Background Subtraction and Deep Learning for Detecting Obstacles and Anomalies in an Agricultural Field

    Directory of Open Access Journals (Sweden)

    Peter Christiansen

    2016-11-01

    Full Text Available Convolutional neural network (CNN)-based systems are increasingly used in autonomous vehicles for detecting obstacles. CNN-based object detection and per-pixel classification (semantic segmentation) algorithms are trained for detecting and classifying a predefined set of object types. These algorithms have difficulties in detecting distant and heavily occluded objects and are, by definition, not capable of detecting unknown object types or unusual scenarios. The visual characteristics of an agricultural field are homogeneous, and obstacles, such as people and animals, occur rarely and are of distinct appearance compared to the field. This paper introduces DeepAnomaly, an algorithm combining deep learning and anomaly detection to exploit the homogeneous characteristics of a field to perform anomaly detection. We demonstrate DeepAnomaly as a fast state-of-the-art detector for obstacles that are distant, heavily occluded and unknown. DeepAnomaly is compared to state-of-the-art obstacle detectors including “Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks” (RCNN). In a human detector test case, we demonstrate that DeepAnomaly detects humans at longer ranges (45–90 m) than RCNN. RCNN has a similar performance at short range (0–30 m). However, DeepAnomaly has far fewer model parameters and a 7.28-times (182 ms / 25 ms) faster processing time per image. Unlike most CNN-based methods, the high accuracy, the low computation time and the low memory footprint make it suitable for a real-time system running on an embedded GPU (Graphics Processing Unit).
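
    The anomaly-detection idea DeepAnomaly builds on can be sketched in miniature: model the homogeneous field as per-feature background statistics and flag feature vectors that deviate from them. The arrays and the z-score rule below are illustrative assumptions; DeepAnomaly itself operates on CNN feature maps.

```python
import numpy as np

# Background model: statistics of feature vectors from field patches.
rng = np.random.default_rng(2)
background = rng.normal(0.0, 1.0, size=(1000, 8))   # field patches
mu, sigma = background.mean(axis=0), background.std(axis=0)

def anomaly_score(feature: np.ndarray) -> float:
    # mean absolute z-score of a patch relative to the background model
    return float(np.abs((feature - mu) / sigma).mean())

field_patch = rng.normal(0.0, 1.0, size=8)   # looks like background
obstacle = rng.normal(5.0, 1.0, size=8)      # distinct appearance

print(anomaly_score(field_patch) < anomaly_score(obstacle))
```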

  16. Learning Transferable Features with Deep Adaptation Networks

    OpenAIRE

    Long, Mingsheng; Cao, Yue; Wang, Jianmin; Jordan, Michael I.

    2015-01-01

    Recent studies reveal that a deep neural network can learn transferable features which generalize well to novel tasks for domain adaptation. However, as deep features eventually transition from general to specific along the network, the feature transferability drops significantly in higher layers with increasing domain discrepancy. Hence, it is important to formally reduce the dataset bias and enhance the transferability in task-specific layers. In this paper, we propose a new Deep Adaptation...

  17. Theory of deep inelastic lepton-hadron scattering

    International Nuclear Information System (INIS)

    Geyer, B.; Robaschik, D.; Wieczorek, E.

    1979-01-01

    The description of deep inelastic lepton-nucleon scattering in the lowest order of the electromagnetic and weak coupling constants leads to a study of virtual Compton amplitudes and their absorptive parts. Some aspects of quantum chromodynamics are discussed. Deep inelastic scattering gives access to a central quantity of quantum field theory, namely the light-cone behaviour of the current commutator. The moments of the structure functions are used for the description of deep inelastic scattering. (author)

  18. DeepQA: Improving the estimation of single protein model quality with deep belief networks

    OpenAIRE

    Cao, Renzhi; Bhattacharya, Debswapna; Hou, Jie; Cheng, Jianlin

    2016-01-01

    Background Protein quality assessment (QA), which is useful for ranking and selecting protein models, has long been viewed as one of the major challenges for protein tertiary structure prediction. In particular, estimating the quality of a single protein model, which is important for selecting a few good models out of a large model pool consisting of mostly low-quality models, is still a largely unsolved problem. Results We introduce a novel single-model quality assessment method DeepQA based on deep belie...

  19. DeepDive: Declarative Knowledge Base Construction.

    Science.gov (United States)

    De Sa, Christopher; Ratner, Alex; Ré, Christopher; Shin, Jaeho; Wang, Feiran; Wu, Sen; Zhang, Ce

    2016-03-01

    The dark data extraction or knowledge base construction (KBC) problem is to populate a SQL database with information from unstructured data sources including emails, webpages, and pdf reports. KBC is a long-standing problem in industry and research that encompasses problems of data extraction, cleaning, and integration. We describe DeepDive, a system that combines database and machine learning ideas to help develop KBC systems. The key idea in DeepDive is that statistical inference and machine learning are key tools to attack classical data problems in extraction, cleaning, and integration in a unified and more effective manner. DeepDive programs are declarative in that one cannot write probabilistic inference algorithms; instead, one interacts by defining features or rules about the domain. A key reason for this design choice is to enable domain experts to build their own KBC systems. We present the applications, abstractions, and techniques of DeepDive employed to accelerate construction of KBC systems.

  20. Pathways to deep decarbonization - 2015 report

    International Nuclear Information System (INIS)

    Ribera, Teresa; Colombier, Michel; Waisman, Henri; Bataille, Chris; Pierfederici, Roberta; Sachs, Jeffrey; Schmidt-Traub, Guido; Williams, Jim; Segafredo, Laura; Hamburg Coplan, Jill; Pharabod, Ivan; Oury, Christian

    2015-12-01

    In September 2015, the Deep Decarbonization Pathways Project published the Executive Summary of the Pathways to Deep Decarbonization: 2015 Synthesis Report. The full 2015 Synthesis Report was launched in Paris on December 3, 2015, at a technical workshop with the Mitigation Action Plans and Scenarios (MAPS) program. The Deep Decarbonization Pathways Project (DDPP) is a collaborative initiative to understand and show how individual countries can transition to a low-carbon economy and how the world can meet the internationally agreed target of limiting the increase in global mean surface temperature to less than 2 degrees Celsius (deg. C). Achieving the 2 deg. C limit will require that global net emissions of greenhouse gases (GHG) approach zero by the second half of the century. In turn, this will require a profound transformation of energy systems by mid-century through steep declines in carbon intensity in all sectors of the economy, a transition we call 'deep decarbonization'.

  1. Issey Miyake exhibition in Paris

    Index Scriptorium Estoniae

    1998-01-01

    Fashion designer Issey Miyake's exhibition 'Making Things' runs until 17 January 1999 at the Fondation Cartier in Paris. The exhibition consists of four parts; the second part, 'Pleats Please Guest Series', is Miyake's collaboration with four artists: Yasumasa Morimura, Nobuyoshi Araki, Tim Hawkinson, and Cai Guo-Qiang

  2. ENVIRONMENTAL TECHNOLOGY VERIFICATION, TEST REPORT OF CONTROL OF BIOAEROSOLS IN HVAC SYSTEMS, COLUMBUS INDUSTRIES SL-3 RING PANEL

    Science.gov (United States)

    The Environmental Technology Verification report discusses the technology and performance of the High Efficiency Mini Pleat air filter for dust and bioaerosol filtration manufactured by Columbus Industries. The pressure drop across the filter was 142 Pa clean and 283 Pa dust load...

  3. Determination of the radionuclide release factor for an evaporator process using nondestructive assay

    International Nuclear Information System (INIS)

    Johnson, R.E.

    1998-01-01

    The 242-A Evaporator is the primary waste evaporator for the Hanford Site radioactive liquid waste stored in underground double-shell tanks. Low pressure evaporation is used to remove water from the waste, thus reducing the amount of tank space required for storage. The process produces a concentrated slurry, a process condensate, and an offgas. The offgas exhausts through two stages of high-efficiency particulate air (HEPA) filters before being discharged to the atmosphere. 40 CFR 61 Subpart H requires assessment of the unfiltered exhaust to determine if continuous compliant sampling is required. Because potential (unfiltered) emissions are not measured, methods have been developed to estimate these emissions. One of the methods accepted by the Environmental Protection Agency is the measurement of the accumulation of radionuclides on the HEPA filters. Nondestructive assay (NDA) was selected for determining the accumulation on the HEPA filters. NDA was performed on the HEPA filters before and after a campaign in 1997. NDA results indicate that 2.1 E+4 becquerels of cesium-137 were accumulated on the primary HEPA 1700 filter during the campaign. The feed material processed in the campaign contained a total of 1.4 E+16 Bq of cesium-137. The release factor for the evaporator process is 1.5 E-12. Based on this release factor, continuous compliant sampling is not required

  4. Text feature extraction based on deep learning: a review.

    Science.gov (United States)

    Liang, Hong; Sun, Xiao; Sun, Yunlei; Gao, Yuan

    2017-01-01

    Selection of text feature items is a basic and important matter for text mining and information retrieval. Traditional methods of feature extraction require handcrafted features. Hand-designing an effective feature is a lengthy process, but, aiming at new applications, deep learning makes it possible to acquire new effective feature representations from training data. As a new feature extraction method, deep learning has made achievements in text mining. The major difference between deep learning and conventional methods is that deep learning automatically learns features from big data instead of adopting handcrafted features, which mainly depend on the prior knowledge of designers and make it hard to take advantage of big data. Deep learning can automatically learn feature representations from big data, with models that include millions of parameters. This paper first outlines the common methods used in text feature extraction, then expands on frequently used deep learning methods in text feature extraction and its applications, and forecasts the application of deep learning in feature extraction.

  5. Deep-seated sarcomas of the penis

    Directory of Open Access Journals (Sweden)

    Alberto A. Antunes

    2005-06-01

    Full Text Available Mesenchymal neoplasias represent 5% of tumors affecting the penis. Due to the rarity of such tumors, there is no agreement concerning the best method for staging and managing these patients. Sarcomas of the penis can be classified as deep-seated if they derive from the structures forming the spongy body and the cavernous bodies. Superficial lesions are usually low-grade and show a small tendency towards distant metastasis. In contrast, deep-seated lesions usually show behavior that is more aggressive and have poorer prognosis. The authors report 3 cases of deep-seated primary sarcomas of the penis and review the literature on this rare and aggressive neoplasia.

  6. In Brief: Deep-sea observatory

    Science.gov (United States)

    Showstack, Randy

    2008-11-01

    The first deep-sea ocean observatory offshore of the continental United States has begun operating in the waters off central California. The remotely operated Monterey Accelerated Research System (MARS) will allow scientists to monitor the deep sea continuously. Among the first devices to be hooked up to the observatory are instruments to monitor earthquakes, videotape deep-sea animals, and study the effects of acidification on seafloor animals. "Some day we may look back at the first packets of data streaming in from the MARS observatory as the equivalent of those first words spoken by Alexander Graham Bell: 'Watson, come here, I need you!'" commented Marcia McNutt, president and CEO of the Monterey Bay Aquarium Research Institute, which coordinated construction of the observatory. For more information, see http://www.mbari.org/news/news_releases/2008/mars-live/mars-live.html.

  7. Deep Learning for Computer Vision: A Brief Review

    Science.gov (United States)

    Voulodimos, Athanasios; Doulamis, Nikolaos; Doulamis, Anastasios; Protopapadakis, Eftychios

    2018-01-01

    Over the last years deep learning methods have been shown to outperform previous state-of-the-art machine learning techniques in several fields, with computer vision being one of the most prominent cases. This review paper provides a brief overview of some of the most significant deep learning schemes used in computer vision problems, that is, Convolutional Neural Networks, Deep Boltzmann Machines and Deep Belief Networks, and Stacked Denoising Autoencoders. A brief account of their history, structure, advantages, and limitations is given, followed by a description of their applications in various computer vision tasks, such as object detection, face recognition, action and activity recognition, and human pose estimation. Finally, a brief overview is given of future directions in designing deep learning schemes for computer vision problems and the challenges involved therein. PMID:29487619

  10. Is deep dreaming the new collage?

    Science.gov (United States)

    Boden, Margaret A.

    2017-10-01

    Deep dreaming (DD) can combine and transform images in surprising ways. But, being based in deep learning (DL), it is not analytically understood. Collage is an art form that is constrained along various dimensions. DD will not be able to generate collages until DL can be guided in a disciplined fashion.

  11. Density functionals from deep learning

    OpenAIRE

    McMahon, Jeffrey M.

    2016-01-01

    Density-functional theory is a formally exact description of a many-body quantum system in terms of its density; in practice, however, approximations to the universal density functional are required. In this work, a model based on deep learning is developed to approximate this functional. Deep learning allows computational models that are capable of naturally discovering intricate structure in large and/or high-dimensional data sets, with multiple levels of abstraction. As no assumptions are ...

  12. A Survey: Time Travel in Deep Learning Space: An Introduction to Deep Learning Models and How Deep Learning Models Evolved from the Initial Ideas

    OpenAIRE

    Wang, Haohan; Raj, Bhiksha

    2015-01-01

    This report traces how deep learning evolved. It goes back as far as the initial belief in connectionist modelling of the brain, and comes back to look at its early-stage realization: neural networks. With the background of neural networks, we gradually introduce how the convolutional neural network, as a representative of deep discriminative models, was developed from neural networks, together with many practical techniques that can help in the optimization of neural networks. On t...

  13. Assessment of deep geological environment condition

    International Nuclear Information System (INIS)

    Bae, Dae Seok; Han, Kyung Won; Joen, Kwan Sik

    2003-05-01

    The geoscientific study in the 2nd stage focused mainly on the near-field condition of the deep geologic environment, and aimed to generate the geologic input data for a Korean reference disposal system for high-level radioactive wastes and to establish a site characterization methodology, covering neotectonic features, fracture systems and mechanical properties of plutonic rocks, and hydrogeochemical characteristics. The preliminary assessment of neotectonics in the Korean peninsula was performed on the basis of the recorded seismicity, the Quaternary faults investigated, uplift characteristics studied in limited areas, and the distribution of the major regional faults and their characteristics. The local fracture system was studied in detail from the data obtained from deep boreholes in granitic terrain. Through this deep drilling project, the geometrical and hydraulic properties of different fracture sets were statistically analysed on a block scale. The mechanical properties of intact rocks were evaluated from core samples by laboratory testing, and the in-situ stress conditions were estimated by a hydrofracturing test in the boreholes. The hydrogeochemical conditions in the deep boreholes were characterized based on hydrochemical composition and isotopic signatures, and an attempt was made to assess their interrelation with a major fracture system. The residence time of deep groundwater was estimated by C-14 dating. For the travel time of groundwater between the boreholes, the methodology and equipment for tracer tests were established

  14. Deep Carbon Observatory investigates Carbon from Crust to Core: An Academic Record of the History of Deep Carbon Science

    Science.gov (United States)

    Mitton, S. A.

    2017-12-01

    Carbon plays an unparalleled role in our lives: as the element of life, as the basis of most of society's energy, as the backbone of most new materials, and as the central focus in efforts to understand Earth's variable and uncertain climate. Yet in spite of carbon's importance, scientists remain largely ignorant of the physical, chemical, and biological behavior of many of Earth's carbon-bearing systems. The Deep Carbon Observatory (DCO) is a global research program to transform our understanding of carbon in Earth. At its heart, DCO is a community of scientists, from biologists to physicists, geoscientists to chemists, and many others whose work crosses these disciplinary lines, forging a new, integrative field of deep carbon science. As a historian of science, I specialise in the history of planetary science and astronomy since 1900, work directed toward understanding the steps on the road to discovering the internal dynamics of our planet. Within a framework that describes the historical background to the new field of Earth System Science, I present the first history of deep carbon science. The project identifies the key discoveries of deep carbon science and assesses the impact of new knowledge on geochemistry, geodynamics, and geobiology. It will lead to publication, in book form in 2019, of an illuminating narrative that will highlight the engaging human stories of many remarkable scientists and natural philosophers from whom we have learned about the complexity of Earth's internal world. On this journey of discovery we will encounter not just the pioneering researchers of deep carbon science, but also their institutions, their instrumental inventiveness, and their passion for exploration. The book is organised thematically around the four communities of the Deep Carbon Observatory: Deep Life, Extreme Physics and Chemistry, Reservoirs and Fluxes, and Deep Energy. The presentation has a gallery and list of Deep Carbon

  15. Deep Energy Retrofit

    DEFF Research Database (Denmark)

    Zhivov, Alexander; Lohse, Rüdiger; Rose, Jørgen

    Deep Energy Retrofit – A Guide to Achieving Significant Energy Use Reduction with Major Renovation Projects contains recommendations for characteristics of some of the core technologies and measures that are based on studies conducted by national teams associated with the International Energy Agency Energy Conservation in Buildings and Communities Program (IEA-EBC) Annex 61 (Lohse et al. 2016, Case, et al. 2016, Rose et al. 2016, Yao, et al. 2016, Dake 2014, Stankevica et al. 2016, Kiatreungwattana 2014). Results of these studies provided a base for setting minimum requirements for the building envelope-related technologies to make Deep Energy Retrofit feasible and, in many situations, cost effective. Use of energy efficiency measures (EEMs) in addition to the core technologies bundle and high-efficiency appliances will foster further energy use reduction. This Guide also provides best practice

  16. Incore inspection device

    International Nuclear Information System (INIS)

    Ogisu, Tatsuki; Taguchi, Kosei.

    1995-01-01

    The device of the present invention can inspect the surfaces of equipment immersed in reactor water in a nuclear reactor under atmospheric-air conditions. Namely, an inspection device is movable forwardly and backwardly in a water-proof vessel. An annular sucker with pleats is disposed on the outer side of the lid of the water-proof vessel. A television camera for underwater monitoring is disposed on the inner side of the lid of the water-proof vessel by way of a partitioning wall with a lid. Transferring screws are disposed at the back and on the side of the water-proof vessel. In the device having such a constitution, (1) the inside of the water-proof vessel is first made water-tight by closing the partitioning wall with a lid, (2) the back and side screws are operated, guided by the underwater monitoring television camera, to transfer the water-proof vessel to the reactor core surface to be inspected, (3) the annular sucker with pleats is pressed onto the surface to be inspected by the back screw, to fix the water-proof vessel, (4) reactor water in the space inside the annular sucker with pleats is discharged and replaced with air, and (5) the lid of the partition wall is opened and the inspection device is placed at the position of the underwater monitoring television camera, to inspect the surface in a state of atmospheric air. (I.S.)

  17. Deep-well injection of radioactive waste in Russia

    International Nuclear Information System (INIS)

    Hoek, J.

    1998-01-01

    In the Russian federation, deep borehole injection of liquid radioactive waste has been established practice since at least 1963. The liquid is injected into sandy or other formations with high porosity, which are isolated by water-tight layers. This technique has also been used elsewhere for toxic liquid waste and residues from mining operations. Deep-well injection of radioactive waste is not currently used in any of the European Commission (EC) countries. In this paper the results of an EC-funded study are presented. The study is entitled 'Measurements, modelling of migration and possible radiological consequences at deep well injection sites for liquid radioactive waste in Russia', COSU-CT94-0099-UK. The study was carried out jointly by AEA Technology, CAG and the Research Institute for Nuclear Reactors (NIIAR) at Dimitrovgrad. Many scientists have contributed to the results reported here. The aims of the study are: Provision of extensive information on the deep-well injection repositories and their use in the former Soviet Union; Provision of a methodology to assess safety aspects of deep-well injection of liquid radioactive waste in deep geological formations; This will allow evaluation of proposals to use deep-well injection techniques in other regions; Support for Russian regulatory bodies through evaluation of the suitability of the sites, including estimates of the maximum amount of waste that can be safely stored in them; and Provision of a methodology to assess the use of deep-well injection repositories as an alternative disposal technique for EC countries. 7 refs

  18. DEEP: a general computational framework for predicting enhancers

    KAUST Repository

    Kleftogiannis, Dimitrios A.

    2014-11-05

    Transcription regulation in multicellular eukaryotes is orchestrated by a number of DNA functional elements located at gene regulatory regions. Some regulatory regions (e.g. enhancers) are located far away from the gene they affect. Identification of distal regulatory elements is a challenge for the bioinformatics research. Although existing methodologies increased the number of computationally predicted enhancers, performance inconsistency of computational models across different cell-lines, class imbalance within the learning sets and ad hoc rules for selecting enhancer candidates for supervised learning, are some key questions that require further examination. In this study we developed DEEP, a novel ensemble prediction framework. DEEP integrates three components with diverse characteristics that streamline the analysis of enhancer's properties in a great variety of cellular conditions. In our method we train many individual classification models that we combine to classify DNA regions as enhancers or non-enhancers. DEEP uses features derived from histone modification marks or attributes coming from sequence characteristics. Experimental results indicate that DEEP performs better than four state-of-the-art methods on the ENCODE data. We report the first computational enhancer prediction results on FANTOM5 data where DEEP achieves 90.2% accuracy and 90% geometric mean (GM) of specificity and sensitivity across 36 different tissues. We further present results derived using in vivo-derived enhancer data from VISTA database. DEEP-VISTA, when tested on an independent test set, achieved GM of 80.1% and accuracy of 89.64%. DEEP framework is publicly available at http://cbrc.kaust.edu.sa/deep/.
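
    The geometric mean (GM) of specificity and sensitivity reported above is a standard summary statistic for binary classifiers on imbalanced data; a minimal sketch with made-up counts (not DEEP's actual confusion matrix):

```python
import math

# Illustrative confusion-matrix counts for an enhancer classifier.
tp, fn = 90, 10      # enhancers: correctly / incorrectly classified
tn, fp = 80, 20      # non-enhancers: correctly / incorrectly classified

sensitivity = tp / (tp + fn)           # true positive rate: 0.9
specificity = tn / (tn + fp)           # true negative rate: 0.8
gm = math.sqrt(sensitivity * specificity)

print(round(gm, 3))                    # sqrt(0.9 * 0.8) ~= 0.849
```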

  20. DEWS (DEep White matter hyperintensity Segmentation framework): A fully automated pipeline for detecting small deep white matter hyperintensities in migraineurs.

    Science.gov (United States)

    Park, Bo-Yong; Lee, Mi Ji; Lee, Seung-Hak; Cha, Jihoon; Chung, Chin-Sang; Kim, Sung Tae; Park, Hyunjin

    2018-01-01

    Migraineurs show an increased load of white matter hyperintensities (WMHs) and more rapid deep WMH progression. Previous methods for WMH segmentation have limited efficacy to detect small deep WMHs. We developed a new fully automated detection pipeline, DEWS (DEep White matter hyperintensity Segmentation framework), for small and superficially-located deep WMHs. A total of 148 non-elderly subjects with migraine were included in this study. The pipeline consists of three components: 1) white matter (WM) extraction, 2) WMH detection, and 3) false positive reduction. In WM extraction, we adjusted the WM mask to re-assign misclassified WMHs back to WM using many sequential low-level image processing steps. In WMH detection, the potential WMH clusters were detected using an intensity based threshold and region growing approach. For false positive reduction, the detected WMH clusters were classified into final WMHs and non-WMHs using the random forest (RF) classifier. Size, texture, and multi-scale deep features were used to train the RF classifier. DEWS successfully detected small deep WMHs with a high positive predictive value (PPV) of 0.98 and true positive rate (TPR) of 0.70 in the training and test sets. Similar performance of PPV (0.96) and TPR (0.68) was attained in the validation set. DEWS showed a superior performance in comparison with other methods. Our proposed pipeline is freely available online to help the research community in quantifying deep WMHs in non-elderly adults.
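    The WMH-detection step described above (an intensity threshold followed by region growing) can be sketched in miniature. This toy 4-connected region grower on a 2D intensity grid is an assumption-laden illustration with hypothetical threshold parameters, not the DEWS implementation:

```python
from collections import deque

def grow_clusters(image, seed_thresh, grow_thresh):
    """Detect candidate clusters: pixels above seed_thresh start a
    cluster, which is expanded breadth-first over 4-connected
    neighbours whose intensity is at least grow_thresh."""
    rows, cols = len(image), len(image[0])
    visited = [[False] * cols for _ in range(rows)]
    clusters = []
    for r in range(rows):
        for c in range(cols):
            if visited[r][c] or image[r][c] < seed_thresh:
                continue
            # BFS from the seed, accepting pixels above grow_thresh
            cluster, queue = [], deque([(r, c)])
            visited[r][c] = True
            while queue:
                y, x = queue.popleft()
                cluster.append((y, x))
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if (0 <= ny < rows and 0 <= nx < cols
                            and not visited[ny][nx]
                            and image[ny][nx] >= grow_thresh):
                        visited[ny][nx] = True
                        queue.append((ny, nx))
            clusters.append(cluster)
    return clusters
```

    In the pipeline described above, clusters found this way would then be passed to the random-forest stage for false-positive reduction.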

  1. Deep-Sea Corals: A New Oceanic Archive

    National Research Council Canada - National Science Library

    Adkins, Jess

    1998-01-01

    Deep-sea corals are an extraordinary new archive of deep ocean behavior. The species Desmophyllum cristagalli is a solitary coral composed of uranium rich, density banded aragonite that I have calibrated for several paleoclimate tracers...

  2. Challenging oil bioremediation at deep-sea hydrostatic pressure

    Directory of Open Access Journals (Sweden)

    Alberto Scoma

    2016-08-01

    Full Text Available The Deepwater Horizon (DWH accident has brought oil contamination of deep-sea environments to worldwide attention. The risk for new deep-sea spills is not expected to decrease in the future, as political pressure mounts to access deep-water fossil reserves, and poorly tested technologies are used to access oil. This also applies to the response to oil-contamination events, with bioremediation the only (biotechnology presently available to combat deep-sea spills. Many questions about the fate of petroleum-hydrocarbons at deep-sea remain unanswered, as much as the main constraints limiting bioremediation under increased hydrostatic pressures and low temperatures. The microbial pathways fueling oil take up are unclear, and the mild upregulation observed for beta-oxidation-related genes in both water and sediments contrasts with the high amount of alkanes present in the spilled-oil. The fate of solid alkanes (tar and that of hydrocarbons degradation rates was largely overlooked, as the reason why the most predominant hydrocarbonoclastic genera were not enriched at deep-sea, despite being present at hydrocarbon seeps at the Gulf of Mexico. This mini-review aims at highlighting the missing information in the field, proposing a holistic approach where in situ and ex situ studies are integrated to reveal the principal mechanisms accounting for deep-sea oil bioremediation.

  3. ENVIRONMENTAL TECHNOLOGY VERIFICATION, TEST REPORT OF CONTROL OF BIOAEROSOLS IN HVAC SYSTEMS:AAF INTERNATIONAL, PERFECTPLEAT ULTRA, 175-102-863

    Science.gov (United States)

    The Environmental Technology Verification report discusses the technology and performance of the PerfectPleat Ultra 175-102-863 air filter for dust and bioaerosol filtration manufactured by AAF International. The pressure drop across the filter was 112 Pa clean and 229 Pa dust lo...

  4. Deep Crustal Melting and the Survival of Continental Crust

    Science.gov (United States)

    Whitney, D.; Teyssier, C. P.; Rey, P. F.; Korchinski, M.

    2017-12-01

    Plate convergence involving continental lithosphere leads to crustal melting, which ultimately stabilizes the crust because it drives rapid upward flow of hot deep crust, followed by rapid cooling at shallow levels. Collision drives partial melting during crustal thickening (at 40-75 km) and/or continental subduction (at 75-100 km). These depths are not typically exceeded by crustal rocks that are exhumed in each setting because partial melting significantly decreases viscosity, facilitating upward flow of deep crust. Results from numerical models and nature indicate that deep crust moves laterally and then vertically, crystallizing at depths as shallow as 2 km. Deep crust flows en masse, without significant segregation of melt into magmatic bodies, over tens of kilometres of vertical transport. This is a major mechanism by which deep crust is exhumed and is therefore a significant process of heat and mass transfer in continental evolution. The result of vertical flow of deep, partially molten crust is a migmatite dome. When lithosphere is under extension or transtension, the deep crust is solicited by faulting of the brittle upper crust, and the flow of deep crust in migmatite domes traverses nearly the entire thickness of orogenic crust. Recognition of the importance of migmatite (gneiss) domes as archives of orogenic deep crust is applicable to determining the chemical and physical properties of continental crust, as well as mechanisms and timescales of crustal differentiation.

  5. Deep Learning and Bayesian Methods

    OpenAIRE

    Prosper Harrison B.

    2017-01-01

    A revolution is underway in which deep neural networks are routinely used to solve difficult problems such as face recognition and natural language understanding. Particle physicists have taken notice and have started to deploy these methods, achieving results that suggest a potentially significant shift in how data might be analyzed in the not too distant future. We discuss a few recent developments in the application of deep neural networks and then indulge in speculation about how such meth...

  6. The dynamics of biogeographic ranges in the deep sea.

    Science.gov (United States)

    McClain, Craig R; Hardy, Sarah Mincks

    2010-12-07

    Anthropogenic disturbances such as fishing, mining, oil drilling, bioprospecting, warming, and acidification in the deep sea are increasing, yet generalities about deep-sea biogeography remain elusive. Owing to the lack of perceived environmental variability and geographical barriers, ranges of deep-sea species were traditionally assumed to be exceedingly large. In contrast, seamount and chemosynthetic habitats with reported high endemicity challenge the broad applicability of a single biogeographic paradigm for the deep sea. New research benefiting from higher resolution sampling, molecular methods and public databases can now more rigorously examine dispersal distances and species ranges on the vast ocean floor. Here, we explore the major outstanding questions in deep-sea biogeography. Based on current evidence, many taxa appear broadly distributed across the deep sea, a pattern replicated in both the abyssal plains and specialized environments such as hydrothermal vents. Cold waters may slow larval metabolism and development augmenting the great intrinsic ability for dispersal among many deep-sea species. Currents, environmental shifts, and topography can prove to be dispersal barriers but are often semipermeable. Evidence of historical events such as points of faunal origin and climatic fluctuations are also evident in contemporary biogeographic ranges. Continued synthetic analysis, database construction, theoretical advancement and field sampling will be required to further refine hypotheses regarding deep-sea biogeography.

  7. Survey on deep learning for radiotherapy.

    Science.gov (United States)

    Meyer, Philippe; Noblet, Vincent; Mazzara, Christophe; Lallement, Alex

    2018-05-17

    More than 50% of cancer patients are treated with radiotherapy, either exclusively or in combination with other methods. The planning and delivery of radiotherapy treatment is a complex process, but can now be greatly facilitated by artificial intelligence technology. Deep learning is the fastest-growing field in artificial intelligence and has been successfully used in recent years in many domains, including medicine. In this article, we first explain the concept of deep learning, addressing it in the broader context of machine learning. The most common network architectures are presented, with a more specific focus on convolutional neural networks. We then present a review of the published works on deep learning methods that can be applied to radiotherapy, which are classified into seven categories related to the patient workflow, and can provide some insights of potential future applications. We have attempted to make this paper accessible to both radiotherapy and deep learning communities, and hope that it will inspire new collaborations between these two communities to develop dedicated radiotherapy applications. Copyright © 2018 Elsevier Ltd. All rights reserved.

  8. Unravelling networks in local public health policymaking in three European countries - a systems analysis.

    Science.gov (United States)

    Spitters, Hilde P E M; Lau, Cathrine J; Sandu, Petru; Quanjel, Marcel; Dulf, Diana; Glümer, Charlotte; van Oers, Hans A M; van de Goor, Ien A M

    2017-02-03

    Facilitating and enhancing interaction between stakeholders involved in the policymaking process to stimulate collaboration and use of evidence, is important to foster the development of effective Health Enhancing Physical Activity (HEPA) policies. Performing an analysis of real-world policymaking processes will help reveal the complexity of a network of stakeholders. Therefore, the main objectives were to unravel the stakeholder network in the policy process by conducting three systems analyses, and to increase insight into the similarities and differences in the policy processes of these European country cases. A systems analysis of the local HEPA policymaking process was performed in three European countries involved in the 'REsearch into POlicy to enhance Physical Activity' (REPOPA) project, resulting in three schematic models showing the main stakeholders and their relationships. The models were used to compare the systems, focusing on implications with respect to collaboration and use of evidence in local HEPA policymaking. Policy documents and relevant webpages were examined and main stakeholders were interviewed. The systems analysis in each country identified the main stakeholders involved and their position and relations in the policymaking process. The Netherlands and Denmark were the most similar and both differed most from Romania, especially at the level of accountability of the local public authorities for local HEPA policymaking. The categories of driving forces underlying the relations between stakeholders were formal relations, informal interaction and knowledge exchange. A systems analysis providing detailed descriptions of positions and relations in the stakeholder network in local level HEPA policymaking is rather unique in this area. The analyses are useful when a need arises for increased interaction, collaboration and use of knowledge between stakeholders in the local HEPA network, as they provide an overview of the stakeholders involved and

  9. Turning the tide: national policy approaches to increasing physical activity in seven European countries.

    Science.gov (United States)

    Bull, Fiona; Milton, Karen; Kahlmeier, Sonja; Arlotti, Alberto; Juričan, Andrea Backović; Belander, Olov; Martin, Brian; Martin-Diener, Eva; Marques, Ana; Mota, Jorge; Vasankari, Tommi; Vlasveld, Anita

    2015-06-01

    Physical inactivity is one of the four leading behavioural risk factors for non-communicable disease (NCD). Like tobacco control, increasing levels of health-enhancing physical activity (HEPA) will require a national policy framework providing direction and a clear set of actions. Despite frequent calls, there has been insufficient progress on policy development in the majority of countries around the world. This study sought and summarised national HEPA policy in seven European countries (Finland, Italy, the Netherlands, Norway, Portugal, Slovenia and Switzerland). Data collection used a policy audit tool (PAT), a 27-item instrument structured into four sections. All countries reported some legislation or policy across the sectors of education, sport and health. Only some countries reported supportive policy in the transport and environment sectors. Five countries reported a stand-alone HEPA policy and six countries reported national recommendations. HEPA prevalence targets varied in magnitude and specificity and the presence of other relevant goals from different sectors highlighted the opportunity for joint action. Evaluation and the use of scientific evidence were endorsed but described as weak in practice. Only two countries reported a national multisector coordinating committee and most countries reported challenges with partnerships on different levels of policy implementation. Bringing together the key components for success within a national HEPA policy framework is not simple. This in-depth policy audit and country comparison highlighted similarities and differences and revealed new opportunities for consideration by other countries. These examples can inform countries within and beyond Europe and guide the development of national HEPA policy within the NCD prevention agenda. Published by the BMJ Publishing Group Limited. 

  10. HBV Bypasses the Innate Immune Response and Does Not Protect HCV From Antiviral Activity of Interferon.

    Science.gov (United States)

    Mutz, Pascal; Metz, Philippe; Lempp, Florian A; Bender, Silke; Qu, Bingqian; Schöneweis, Katrin; Seitz, Stefan; Tu, Thomas; Restuccia, Agnese; Frankish, Jamie; Dächert, Christopher; Schusser, Benjamin; Koschny, Ronald; Polychronidis, Georgios; Schemmer, Peter; Hoffmann, Katrin; Baumert, Thomas F; Binder, Marco; Urban, Stephan; Bartenschlager, Ralf

    2018-05-01

    Hepatitis C virus (HCV) infection is sensitive to interferon (IFN)-based therapy, whereas hepatitis B virus (HBV) infection is not. It is unclear whether HBV escapes detection by the IFN-mediated immune response or actively suppresses it. Moreover, little is known on how HBV and HCV influence each other in coinfected cells. We investigated interactions between HBV and the IFN-mediated immune response using HepaRG cells and primary human hepatocytes (PHHs). We analyzed the effects of HBV on HCV replication, and vice versa, at the single-cell level. PHHs were isolated from liver resection tissues from HBV-, HCV-, and human immunodeficiency virus-negative patients. Differentiated HepaRG cells overexpressing the HBV receptor sodium taurocholate cotransporting polypeptide (dHepaRGNTCP) and PHHs were infected with HBV. Huh7.5 cells were transfected with circular HBV DNA genomes resembling viral covalently closed circular DNA (cccDNA), and subsequently infected with HCV; this served as a model of HBV and HCV coinfection. Cells were incubated with IFN inducers, or IFNs, and antiviral response and viral replication were analyzed by immune fluorescence, reverse-transcription quantitative polymerase chain reaction, enzyme-linked immunosorbent assays, and flow cytometry. HBV infection of dHepaRGNTCP cells and PHHs neither activated nor inhibited signaling via pattern recognition receptors. Incubation of dHepaRGNTCP cells and PHHs with IFN had little effect on HBV replication or levels of cccDNA. HBV infection of these cells did not inhibit JAK-STAT signaling or up-regulation of IFN-stimulated genes. In coinfected cells, HBV did not prevent IFN-induced suppression of HCV replication. In dHepaRGNTCP cells and PHHs, HBV evades the induction of IFN and IFN-induced antiviral effects. HBV infection does not rescue HCV from the IFN-mediated response. Copyright © 2018 AGA Institute. Published by Elsevier Inc. All rights reserved.

  11. Hello World Deep Learning in Medical Imaging.

    Science.gov (United States)

    Lakhani, Paras; Gray, Daniel L; Pett, Carl R; Nagy, Paul; Shih, George

    2018-05-03

    There is recent popularity in applying machine learning to medical imaging, notably deep learning, which has achieved state-of-the-art performance in image analysis and processing. The rapid adoption of deep learning may be attributed to the availability of machine learning frameworks and libraries to simplify their use. In this tutorial, we provide a high-level overview of how to build a deep neural network for medical image classification, and provide code that can help those new to the field begin their informatics projects.

  12. Deep Neuromuscular Blockade Improves Laparoscopic Surgical Conditions

    DEFF Research Database (Denmark)

    Rosenberg, Jacob; Herring, W Joseph; Blobner, Manfred

    2017-01-01

    INTRODUCTION: Sustained deep neuromuscular blockade (NMB) during laparoscopic surgery may facilitate optimal surgical conditions. This exploratory study assessed whether deep NMB improves surgical conditions and, in doing so, allows use of lower insufflation pressures during laparoscopic cholecys...

  13. Operating manual for the electrostatic glove-box prefilter installed inside the filter glove box No. 046 at Rocky Flats, Building 776

    International Nuclear Information System (INIS)

    Bergman, W.; Kaifer, R.C.; Hebard, H.D.; Taylor, R.D.; Lum, B.Y.; Boling, R.M.; Buttedahl, O.I.; Woodard, R.W.; Terada, K.

    1979-01-01

    The objective of the evaluation is to assess the effectiveness of the electrostatic prefilter in prolonging the life of HEPA (high-efficiency particulate-air) filters. The theory of the electrostatic filter is reviewed, and Glove Box Number 046 is described in detail, followed by a description of the electrostatic prefilter used in the present application. Engineering drawings of the electrostatic prefilter are included. The procedure for evaluating the electrostatic prefilter includes the steps for conducting five different tests: evaluating (1) the HEPA filter alone, (2 and 3) the HEPA filter with a standard prefilter treated both as disposable and reusable, and (4 and 5) the HEPA filter with the electrostatic prefilter, again treated as disposable and reusable. Procedures for flowmeter calibrations and measurements of particle-size distributions are also included. Long-term maintenance of the system during the evaluation program is outlined, and estimates of component durability are given. An electrical engineering safety note describes the high-voltage operational hazard of the electrostatic prefilter and the testing of safety devices.

  14. Development of Hydro-Mechanical Deep Drawing

    DEFF Research Database (Denmark)

    Zhang, Shi-Hong; Danckert, Joachim

    1998-01-01

    The hydro-mechanical deep-drawing process is reviewed in this article. The process principles and features are introduced and the developments of the hydro-mechanical deep-drawing process in process performances, in theory and in numerical simulation are described. The applications are summarized. Some other related hydraulic forming processes are also dealt with as a comparison.

  15. 76 FR 66078 - Notice of Industry Workshop on Technical and Regulatory Challenges in Deep and Ultra-Deep Outer...

    Science.gov (United States)

    2011-10-25

    ...-0087] Notice of Industry Workshop on Technical and Regulatory Challenges in Deep and Ultra-Deep Outer... discussions expected to help identify Outer Continental Shelf (OCS) challenges and technologies associated... structured venue for consultation among offshore deepwater oil and gas industry and regulatory experts in...

  16. Deep Corals, Deep Learning: Moving the Deep Net Towards Real-Time Image Annotation

    OpenAIRE

    Lea-Anne Henry; Sankha S. Mukherjee; Neil M. Roberston; Laurence De Clippele; J. Murray Roberts

    2016-01-01

    The mismatch between human capacity and the acquisition of Big Data such as Earth imagery undermines commitments to Convention on Biological Diversity (CBD) and Aichi targets. Artificial intelligence (AI) solutions to Big Data issues are urgently needed as these could prove to be faster, more accurate, and cheaper. Reducing costs of managing protected areas in remote deep waters and in the High Seas is of great importance, and this is a realm where autonomous technology will be transformative.

  17. Evolutionary process of deep-sea bathymodiolus mussels.

    Science.gov (United States)

    Miyazaki, Jun-Ichi; de Oliveira Martins, Leonardo; Fujita, Yuko; Matsumoto, Hiroto; Fujiwara, Yoshihiro

    2010-04-27

    Since the discovery of deep-sea chemosynthesis-based communities, much work has been done to clarify their organismal and environmental aspects. However, major topics remain to be resolved, including when and how organisms invade and adapt to deep-sea environments; whether strategies for invasion and adaptation are shared by different taxa or unique to each taxon; how organisms extend their distribution and diversity; and how they become isolated to speciate in continuous waters. Deep-sea mussels are one of the dominant organisms in chemosynthesis-based communities, thus investigations of their origin and evolution contribute to resolving questions about life in those communities. We investigated worldwide phylogenetic relationships of deep-sea Bathymodiolus mussels and their mytilid relatives by analyzing nucleotide sequences of the mitochondrial cytochrome c oxidase subunit I (COI) and NADH dehydrogenase subunit 4 (ND4) genes. Phylogenetic analysis of the concatenated sequence data showed that mussels of the subfamily Bathymodiolinae from vents and seeps were divided into four groups, and that mussels of the subfamily Modiolinae from sunken wood and whale carcasses assumed the outgroup position and shallow-water modioline mussels were positioned more distantly to the bathymodioline mussels. We provisionally hypothesized the evolutionary history of Bathymodiolus mussels by estimating evolutionary time under a relaxed molecular clock model. Diversification of bathymodioline mussels was initiated in the early Miocene, and subsequently diversification of the groups occurred in the early to middle Miocene. The phylogenetic relationships support the "Evolutionary stepping stone hypothesis," in which mytilid ancestors exploited sunken wood and whale carcasses in their progressive adaptation to deep-sea environments. This hypothesis is also supported by the evolutionary transition of symbiosis in that nutritional adaptation to the deep sea proceeded from extracellular

  18. Ultra Deep Wave Equation Imaging and Illumination

    Energy Technology Data Exchange (ETDEWEB)

    Alexander M. Popovici; Sergey Fomel; Paul Sava; Sean Crawley; Yining Li; Cristian Lupascu

    2006-09-30

    In this project we developed and tested a novel technology, designed to enhance seismic resolution and imaging of ultra-deep complex geologic structures by using state-of-the-art wave-equation depth migration and wave-equation velocity model building technology for deeper data penetration and recovery, steeper dip and ultra-deep structure imaging, accurate velocity estimation for imaging and pore pressure prediction and accurate illumination and amplitude processing for extending the AVO prediction window. Ultra-deep wave-equation imaging provides greater resolution and accuracy under complex geologic structures where energy multipathing occurs, than what can be accomplished today with standard imaging technology. The objective of the research effort was to examine the feasibility of imaging ultra-deep structures onshore and offshore, by using (1) wave-equation migration, (2) angle-gathers velocity model building, and (3) wave-equation illumination and amplitude compensation. The effort consisted of answering critical technical questions that determine the feasibility of the proposed methodology, testing the theory on synthetic data, and finally applying the technology for imaging ultra-deep real data. Some of the questions answered by this research addressed: (1) the handling of true amplitudes in the downward continuation and imaging algorithm and the preservation of the amplitude with offset or amplitude with angle information required for AVO studies, (2) the effect of several imaging conditions on amplitudes, (3) non-elastic attenuation and approaches for recovering the amplitude and frequency, (4) the effect of aperture and illumination on imaging steep dips and on discriminating the velocities in the ultra-deep structures. All these effects were incorporated in the final imaging step of a real data set acquired specifically to address ultra-deep imaging issues, with large offsets (12,500 m) and long recording time (20 s).
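    The core of the wave-equation depth migration discussed above is downward continuation of the recorded wavefield. A toy single-frequency phase-shift step is sketched below, assuming constant velocity; the function name and interface are hypothetical, not the project's code:

```python
import cmath
import math

def phase_shift_step(p_kx, kx_values, omega, velocity, dz):
    """One downward-continuation step of phase-shift migration for a
    single angular frequency omega: each wavenumber component is
    multiplied by exp(i * kz * dz), with kz from the dispersion
    relation kz^2 = (omega/v)^2 - kx^2. Evanescent components
    (kx too large for this frequency) are zeroed out."""
    out = []
    for p, kx in zip(p_kx, kx_values):
        kz2 = (omega / velocity) ** 2 - kx ** 2
        if kz2 <= 0:
            out.append(0j)  # evanescent energy: do not propagate
        else:
            out.append(p * cmath.exp(1j * math.sqrt(kz2) * dz))
    return out
```

    Note that for propagating components the operator is a pure phase rotation, so amplitudes are preserved exactly at each step; this is one reason phase-shift methods are attractive for the amplitude-preserving (AVO-compatible) imaging the project targets.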

  19. U.V. repair in deep-sea bacteria

    International Nuclear Information System (INIS)

    Lutz, L.; Yayanos, A.A.

    1986-01-01

    Exposure of cells to light of wavelengths below 320 nanometers may lead to lethal lesions and perhaps carcinogenesis. Many organisms have evolved mechanisms to repair U.V. light-induced damage. Organisms such as deep-sea bacteria are presumably never exposed to U.V. light and perhaps only occasionally to visible light from bioluminescence. Thus, the repair of U.V. damage in deep-sea bacterial DNA might be inefficient and repair by photoreactivation unlikely. The bacteria utilized in this investigation are temperature sensitive and barophilic. Four deep-sea isolates were chosen for this study: PE-36 from 3584 m, CNPT-3 from 5782 m, HS-34 from 5682 m, and MT-41 from 10,476 m; all are from the North Pacific Ocean. The deep sea extends from 1100 m to depths greater than 7000 m. It is a region of relatively uniform conditions. The temperature ranges from 5 to -1 °C. There is no solar light in the deep sea. Deep-sea bacteria are sensitive to U.V. light; in fact, more sensitive than a variety of terrestrial and sea-surface bacteria. All four isolates demonstrate thymine dimer repair. Photoreactivation was observed only in MT-41. The other strains from shallower depths displayed no photoreactivation. The presence of DNA sequences homologous to the rec A, uvr A, B, and C and phr genes of E. coli has been examined by Southern hybridization techniques.

  20. Diabetic retinopathy screening using deep neural network.

    Science.gov (United States)

    Ramachandran, Nishanthan; Hong, Sheng Chiong; Sime, Mary J; Wilson, Graham A

    2017-09-07

    There is a burgeoning interest in the use of deep neural network in diabetic retinal screening. To determine whether a deep neural network could satisfactorily detect diabetic retinopathy that requires referral to an ophthalmologist from a local diabetic retinal screening programme and an international database. Retrospective audit. Diabetic retinal photos from Otago database photographed during October 2016 (485 photos), and 1200 photos from Messidor international database. Receiver operating characteristic curve to illustrate the ability of a deep neural network to identify referable diabetic retinopathy (moderate or worse diabetic retinopathy or exudates within one disc diameter of the fovea). Area under the receiver operating characteristic curve, sensitivity and specificity. For detecting referable diabetic retinopathy, the deep neural network had an area under receiver operating characteristic curve of 0.901 (95% confidence interval 0.807-0.995), with 84.6% sensitivity and 79.7% specificity for Otago and 0.980 (95% confidence interval 0.973-0.986), with 96.0% sensitivity and 90.0% specificity for Messidor. This study has shown that a deep neural network can detect referable diabetic retinopathy with sensitivities and specificities close to or better than 80% from both an international and a domestic (New Zealand) database. We believe that deep neural networks can be integrated into community screening once they can successfully detect both diabetic retinopathy and diabetic macular oedema. © 2017 Royal Australian and New Zealand College of Ophthalmologists.
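    The area under the ROC curve reported above equals the probability that a randomly chosen referable image receives a higher score than a randomly chosen non-referable one. A small rank-based sketch of that computation (illustrative only; the function name and inputs are assumptions):

```python
def roc_auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the fraction of (positive, negative) score pairs in which the
    positive scores higher, counting ties as half. O(n*m) pairwise
    form, fine for small illustrative inputs."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))
```

    An AUC of 0.5 corresponds to random ranking and 1.0 to perfect separation, which puts the 0.901 and 0.980 figures quoted above in context.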

  1. Extracting Databases from Dark Data with DeepDive.

    Science.gov (United States)

    Zhang, Ce; Shin, Jaeho; Ré, Christopher; Cafarella, Michael; Niu, Feng

    2016-01-01

    DeepDive is a system for extracting relational databases from dark data: the mass of text, tables, and images that are widely collected and stored but which cannot be exploited by standard relational tools. If the information in dark data - scientific papers, Web classified ads, customer service notes, and so on - were instead in a relational database, it would give analysts a massive and valuable new set of "big data." DeepDive is distinctive when compared to previous information extraction systems in its ability to obtain very high precision and recall at reasonable engineering cost; in a number of applications, we have used DeepDive to create databases with accuracy that meets that of human annotators. To date we have successfully deployed DeepDive to create data-centric applications for insurance, materials science, genomics, paleontology, law enforcement, and others. The data unlocked by DeepDive represents a massive opportunity for industry, government, and scientific researchers. DeepDive is enabled by an unusual design that combines large-scale probabilistic inference with a novel developer interaction cycle. This design is enabled by several core innovations around probabilistic training and inference.

  2. Deep groundwater chemistry

    International Nuclear Information System (INIS)

    Wikberg, P.; Axelsen, K.; Fredlund, F.

    1987-06-01

    Since 1977, a number of sites in Sweden have been investigated in order to collect the geological, hydrogeological and chemical data needed for safety analyses of repositories in deep bedrock systems. Only crystalline rock is considered; in many cases this has been gneisses of sedimentary origin, but granites and gabbros are also represented. Core-drilled holes have been made at nine sites. Up to 15 holes may be core drilled at one site, the deepest down to 1000 m. In addition, a number of boreholes are percussion drilled at each site to depths of about 100 m. When possible, drilling water is taken from percussion-drilled holes. The first objective is to survey the hydraulic conditions. Core-drilled boreholes and sections selected for sampling of deep groundwater are summarized. (orig./HP)

  3. Preface: Deep Slab and Mantle Dynamics

    Science.gov (United States)

    Suetsugu, Daisuke; Bina, Craig R.; Inoue, Toru; Wiens, Douglas A.

    2010-11-01

    We are pleased to publish this special issue of the journal Physics of the Earth and Planetary Interiors entitled "Deep Slab and Mantle Dynamics". This issue is an outgrowth of the international symposium "Deep Slab and Mantle Dynamics", which was held on February 25-27, 2009, in Kyoto, Japan. This symposium was organized by the "Stagnant Slab Project" (SSP) research group to present the results of the 5-year project and to facilitate intensive discussion with well-known international researchers in related fields. The SSP and the symposium were supported by a Grant-in-Aid for Scientific Research (16075101) from the Ministry of Education, Culture, Sports, Science and Technology of the Japanese Government. In the symposium, key issues discussed by participants included: transportation of water into the deep mantle and its role in slab-related dynamics; observational and experimental constraints on deep slab properties and the slab environment; modeling of slab stagnation to constrain its mechanisms in comparison with observational and experimental data; observational, experimental and modeling constraints on the fate of stagnant slabs; eventual accumulation of stagnant slabs on the core-mantle boundary and its geodynamic implications. This special issue is a collection of papers presented in the symposium and other papers related to the subject of the symposium. The collected papers provide an overview of the wide range of multidisciplinary studies of mantle dynamics, particularly in the context of subduction, stagnation, and the fate of deep slabs.

  4. Harnessing the Deep Web: Present and Future

    OpenAIRE

    Madhavan, Jayant; Afanasiev, Loredana; Antova, Lyublena; Halevy, Alon

    2009-01-01

    Over the past few years, we have built a system that has exposed large volumes of Deep-Web content to Google.com users. The content that our system exposes contributes to more than 1000 search queries per-second and spans over 50 languages and hundreds of domains. The Deep Web has long been acknowledged to be a major source of structured data on the web, and hence accessing Deep-Web content has long been a problem of interest in the data management community. In this paper, we report on where...

  5. Zooplankton at deep Red Sea brine pools

    KAUST Repository

    Kaartvedt, Stein

    2016-03-02

    The deep-sea anoxic brines of the Red Sea comprise unique, complex and extreme habitats. These environments are too harsh for metazoans, while the brine–seawater interface harbors dense microbial populations. We investigated the adjacent pelagic fauna at two brine pools using net tows, video records from a remotely operated vehicle and submerged echosounders. Waters just above the brine pool of Atlantis II Deep (2000 m depth) appeared depleted of macrofauna. In contrast, the fauna appeared to be enriched at the Kebrit Deep brine–seawater interface (1466 m).

  6. How to study deep roots - and why it matters

    OpenAIRE

    Maeght, Jean-Luc; Rewald, B.; Pierret, Alain

    2013-01-01

    The drivers underlying the development of deep root systems, whether genetic or environmental, are poorly understood but evidence has accumulated that deep rooting could be a more widespread and important trait among plants than commonly anticipated from their share of root biomass. Even though a distinct classification of "deep roots" is missing to date, deep roots provide important functions for individual plants such as nutrient and water uptake but can also shape plant communities by hydr...

  7. Benchmarking State-of-the-Art Deep Learning Software Tools

    OpenAIRE

    Shi, Shaohuai; Wang, Qiang; Xu, Pengfei; Chu, Xiaowen

    2016-01-01

    Deep learning has been shown as a successful machine learning method for a variety of tasks, and its popularity results in numerous open-source deep learning software tools. Training a deep network is usually a very time-consuming process. To address the computational challenge in deep learning, many tools exploit hardware features such as multi-core CPUs and many-core GPUs to shorten the training time. However, different tools exhibit different features and running performance when training ...

  8. High-Redshift Radio Galaxies from Deep Fields

    Indian Academy of Sciences (India)

    2016-01-27

    Jan 27, 2016 ... High-Redshift Radio Galaxies from Deep Fields ... Here we present results from the deep 150 MHz observations of LBDS-Lynx field, which has been imaged at 327, ...

  9. Deep Learning Microscopy

    KAUST Repository

    Rivenson, Yair; Gorocs, Zoltan; Gunaydin, Harun; Zhang, Yibo; Wang, Hongda; Ozcan, Aydogan

    2017-01-01

    regular optical microscope, without any changes to its design. We blindly tested this deep learning approach using various tissue samples that are imaged with low-resolution and wide-field systems, where the network rapidly outputs an image with remarkably

  10. Serotonin: is it a marker for the diagnosis of hepatocellular ...

    African Journals Online (AJOL)

    Hoda Aly Abd El Moety

    2013-04-19

    Apr 19, 2013 ... ... which is excreted in urine.19,20 ... partial hepatectomy in rodents.24–27 One resident of the hepa- ... to modulate all these factors renders it crucial in times of hepatic injury ... Data were fed to the computer using the predictive Analytics ..... incidence in Egypt.5,8 In general, almost all HCC cases are pre-.

  11. Deep-sea fungi

    Digital Repository Service at National Institute of Oceanography (India)

    Raghukumar, C; Damare, S.R.

    significant in terms of carbon sequestration (5, 8). In light of this, the diversity, abundance, and role of fungi in deep-sea sediments may form an important link in the global C biogeochemistry. This review focuses on issues related to collection...

  12. Deep Trawl Dataset

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Otter trawl (36' Yankee and 4-seam net deepwater gear) catches from mid-Atlantic slope and canyons at 200 - 800 m depth. Deep-sea (200-800 m depth) flat otter trawls...

  13. Deep learning with convolutional neural network in radiology.

    Science.gov (United States)

    Yasaka, Koichiro; Akai, Hiroyuki; Kunimatsu, Akira; Kiryu, Shigeru; Abe, Osamu

    2018-04-01

    Deep learning with a convolutional neural network (CNN) is gaining attention for its high performance in image recognition. Images themselves can be utilized in the learning process with this technique, and feature extraction in advance of the learning process is not required: important features are learned automatically. Thanks to developments in hardware and software, as well as in deep learning techniques, applications of this technique to radiological images for predicting clinically useful information, such as the detection and evaluation of lesions, are beginning to be investigated. This article illustrates basic technical knowledge regarding deep learning with CNNs along the actual course (collecting data, implementing CNNs, and training and testing phases). Pitfalls regarding this technique and how to manage them are also illustrated. We also describe some advanced topics of deep learning, results of recent clinical studies, and future directions for the clinical application of deep learning techniques.
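    The automatic feature learning described above is built from stacked convolution operations whose kernel weights are fitted to the data. As an illustrative sketch (not from the article), the NumPy snippet below implements a single valid-mode 2D convolution and applies a hand-built edge kernel; in a trained CNN the kernel values would be learned from the images rather than specified by hand:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D cross-correlation, the core operation of a CNN layer."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A vertical-edge kernel applied to a synthetic intensity step
# (a toy stand-in for a lesion boundary):
image = np.zeros((5, 5))
image[:, 2:] = 1.0                      # bright right half
sobel_x = np.array([[1, 0, -1],
                    [2, 0, -2],
                    [1, 0, -1]], dtype=float)
response = conv2d(image, sobel_x)       # strongest response at the step
```

    The filter responds only where the intensity changes, which is exactly the kind of feature a CNN discovers on its own during training.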

  14. Photon diffractive dissociation in deep inelastic scattering

    International Nuclear Information System (INIS)

    Ryskin, M.G.

    1990-01-01

    The new ep-collider HERA gives us the possibility to study the diffractive dissociation of the virtual photon in deep inelastic ep-collisions. The process of photon dissociation in deep inelastic scattering is the most direct way to measure the value of the triple-pomeron vertex G_3P. It was shown that the correct bare vertex G_3P may exceed by more than a factor of 4 its effective value measured in the triple-reggeon region, reaching about 40-50% of the elastic pp-pomeron vertex. By contrast, in deep inelastic processes the perpendicular momenta q_t of the secondary particles are large enough. Thus in deep inelastic reactions one can measure the absolute value of the G_3P vertex in the most direct way and compare its value and q_t dependence with the leading-log QCD predictions

  15. Ubiquitous healthy diatoms in the deep sea confirm deep carbon injection by the biological pump

    KAUST Repository

    Agusti, Susana

    2015-07-09

    The role of the ocean as a sink for CO2 is partially dependent on the downward transport of phytoplankton cells packaged within fast-sinking particles. However, whether such fast-sinking mechanisms deliver fresh organic carbon down to the deep bathypelagic sea and whether this mechanism is prevalent across the ocean requires confirmation. Here we report the ubiquitous presence of healthy photosynthetic cells, dominated by diatoms, down to 4,000 m in the deep dark ocean. Decay experiments with surface phytoplankton suggested that the large proportion (18%) of healthy photosynthetic cells observed, on average, in the dark ocean, requires transport times from a few days to a few weeks, corresponding to sinking rates (124–732 m d⁻¹) comparable to those of fast-sinking aggregates and faecal pellets. These results confirm the expectation that fast-sinking mechanisms inject fresh organic carbon into the deep sea and that this is a prevalent process operating across the global oligotrophic ocean.
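    The transport times quoted in the abstract follow directly from depth divided by sinking rate; a quick check of the reported numbers:

```python
# Depth divided by sinking rate gives the transport time implied by the
# abstract (4,000 m depth, sinking rates of 124-732 m per day).
depth_m = 4000.0
rate_slow_m_per_day = 124.0
rate_fast_m_per_day = 732.0

t_slow_days = depth_m / rate_slow_m_per_day   # ~32 days ("a few weeks")
t_fast_days = depth_m / rate_fast_m_per_day   # ~5.5 days ("a few days")
```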

  16. Ubiquitous healthy diatoms in the deep sea confirm deep carbon injection by the biological pump

    KAUST Repository

    Agusti, Susana; Gonzá lez-Gordillo, J. I.; Vaqué , D.; Estrada, M.; Cerezo, M. I.; Salazar, G.; Gasol, J. M.; Duarte, Carlos M.

    2015-01-01

    The role of the ocean as a sink for CO2 is partially dependent on the downward transport of phytoplankton cells packaged within fast-sinking particles. However, whether such fast-sinking mechanisms deliver fresh organic carbon down to the deep bathypelagic sea and whether this mechanism is prevalent across the ocean requires confirmation. Here we report the ubiquitous presence of healthy photosynthetic cells, dominated by diatoms, down to 4,000 m in the deep dark ocean. Decay experiments with surface phytoplankton suggested that the large proportion (18%) of healthy photosynthetic cells observed, on average, in the dark ocean, requires transport times from a few days to a few weeks, corresponding to sinking rates (124–732 m d⁻¹) comparable to those of fast-sinking aggregates and faecal pellets. These results confirm the expectation that fast-sinking mechanisms inject fresh organic carbon into the deep sea and that this is a prevalent process operating across the global oligotrophic ocean.

  17. The deep universe

    CERN Document Server

    Sandage, AR; Longair, MS

    1995-01-01

    Discusses the concept of the deep universe from two conflicting theoretical viewpoints: firstly as a theory embracing the evolution of the universe from the Big Bang to the present; and secondly through observations gleaned over the years on stars, galaxies and clusters.

  18. Deep Vein Thrombosis

    Centers for Disease Control (CDC) Podcasts

    2012-04-05

    This podcast discusses the risk for deep vein thrombosis in long-distance travelers and ways to minimize that risk.  Created: 4/5/2012 by National Center for Emerging and Zoonotic Infectious Diseases (NCEZID).   Date Released: 4/5/2012.

  19. Deep inelastic scattering

    International Nuclear Information System (INIS)

    Aubert, J.J.

    1982-01-01

    Deep inelastic lepton-nucleon interaction experiments are reviewed. Singlet and non-singlet structure functions are measured and the consistency of the different results is checked. A detailed analysis of the scaling violation is performed in terms of the quantum chromodynamics predictions

  20. Outcomes of the DeepWind conceptual design

    NARCIS (Netherlands)

    Paulsen, US; Borg, M.; Madsen, HA; Pedersen, TF; Hattel, J; Ritchie, E.; Simao Ferreira, C.; Svendsen, H.; Berthelsen, P.A.; Smadja, C.

    2015-01-01

    DeepWind has been presented as a novel floating offshore wind turbine concept with cost reduction potentials. Twelve international partners developed a Darrieus type floating turbine with new materials and technologies for deep-sea offshore environment. This paper summarizes results of the 5 MW

  1. Earthquakes - a danger to deep-lying repositories?

    International Nuclear Information System (INIS)

    2012-03-01

    This booklet issued by the Swiss National Cooperative for the Disposal of Radioactive Waste NAGRA takes a look at geological factors concerning earthquakes and the safety of deep-lying repositories for nuclear waste. The geological processes involved in the occurrence of earthquakes are briefly looked at and the definitions for magnitude and intensity of earthquakes are discussed. Examples of damage caused by earthquakes are given. The earthquake situation in Switzerland is looked at and the effects of earthquakes on sub-surface structures and deep-lying repositories are discussed. Finally, the ideas proposed for deep-lying geological repositories for nuclear wastes are discussed

  2. DRREP: deep ridge regressed epitope predictor.

    Science.gov (United States)

    Sher, Gene; Zhi, Degui; Zhang, Shaojie

    2017-10-03

    The ability to predict epitopes plays an enormous role in vaccine development in terms of our ability to zero in on where to do a more thorough in-vivo analysis of the protein in question. Though for the past decade there have been numerous advancements and improvements in epitope prediction, on average the best benchmark prediction accuracies are still only around 60%. New machine learning algorithms have arisen within the domains of deep learning, text mining, and convolutional networks. This paper presents a novel analytically trained deep neural network using string kernels, tailored for continuous epitope prediction, called the Deep Ridge Regressed Epitope Predictor (DRREP). DRREP was tested on long protein sequences from the following datasets: SARS, Pellequer, HIV, AntiJen, and SEQ194. DRREP was compared to numerous state-of-the-art epitope predictors, including the most recently published predictors, LBtope and DMNLBE. Using area under the ROC curve (AUC), DRREP achieved a performance improvement over the best performing predictors on SARS (13.7%), HIV (8.9%), Pellequer (1.5%), and SEQ194 (3.1%), with its performance being matched only on the AntiJen dataset by the LBtope predictor, where both DRREP and LBtope achieved an AUC of 0.702. DRREP is an analytically trained deep neural network, thus capable of learning in a single step through regression. By combining the features of deep learning, string kernels, and convolutional networks, the system is able to perform residue-by-residue prediction of continuous epitopes with higher accuracy than the current state-of-the-art predictors.
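    The "analytically trained ... in a single step through regression" property refers to ridge regression, whose solution w = (XᵀX + λI)⁻¹Xᵀy is available in closed form, so no iterative gradient descent is needed. A minimal NumPy sketch of that closed-form fit on toy data (the variable names and toy problem are illustrative, not from the paper):

```python
import numpy as np

def ridge_fit(X, y, lam=1.0):
    """Closed-form ridge regression: w = (X^T X + lam*I)^{-1} X^T y.
    This single-step solve is what makes a 'ridge regressed' output
    layer analytically trainable instead of gradient-descent trained."""
    n_features = X.shape[1]
    A = X.T @ X + lam * np.eye(n_features)
    return np.linalg.solve(A, X.T @ y)

# Toy example: recover a linear scoring rule from noisy features.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
true_w = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ true_w + 0.01 * rng.normal(size=200)
w = ridge_fit(X, y, lam=1e-3)   # close to true_w
```

    In DRREP the design matrix comes from string-kernel features of the protein sequence; the regression step itself is the same single linear solve.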

  3. Ploughing the deep sea floor.

    Science.gov (United States)

    Puig, Pere; Canals, Miquel; Company, Joan B; Martín, Jacobo; Amblas, David; Lastras, Galderic; Palanques, Albert

    2012-09-13

    Bottom trawling is a non-selective commercial fishing technique whereby heavy nets and gear are pulled along the sea floor. The direct impact of this technique on fish populations and benthic communities has received much attention, but trawling can also modify the physical properties of seafloor sediments, water–sediment chemical exchanges and sediment fluxes. Most of the studies addressing the physical disturbances of trawl gear on the seabed have been undertaken in coastal and shelf environments, however, where the capacity of trawling to modify the seafloor morphology coexists with high-energy natural processes driving sediment erosion, transport and deposition. Here we show that on upper continental slopes, the reworking of the deep sea floor by trawling gradually modifies the shape of the submarine landscape over large spatial scales. We found that trawling-induced sediment displacement and removal from fishing grounds causes the morphology of the deep sea floor to become smoother over time, reducing its original complexity as shown by high-resolution seafloor relief maps. Our results suggest that in recent decades, following the industrialization of fishing fleets, bottom trawling has become an important driver of deep seascape evolution. Given the global dimension of this type of fishery, we anticipate that the morphology of the upper continental slope in many parts of the world’s oceans could be altered by intensive bottom trawling, producing comparable effects on the deep sea floor to those generated by agricultural ploughing on land.

  4. Parallel Distributed Processing Theory in the Age of Deep Networks.

    Science.gov (United States)

    Bowers, Jeffrey S

    2017-12-01

    Parallel distributed processing (PDP) models in psychology are the precursors of deep networks used in computer science. However, only PDP models are associated with two core psychological claims, namely that all knowledge is coded in a distributed format and cognition is mediated by non-symbolic computations. These claims have long been debated in cognitive science, and recent work with deep networks speaks to this debate. Specifically, single-unit recordings show that deep networks learn units that respond selectively to meaningful categories, and researchers are finding that deep networks need to be supplemented with symbolic systems to perform some tasks. Given the close links between PDP and deep networks, it is surprising that research with deep networks is challenging PDP theory. Copyright © 2017. Published by Elsevier Ltd.

  5. DEEP VADOSE ZONE TREATABILITY TEST PLAN

    International Nuclear Information System (INIS)

    Chronister, G.B.; Truex, M.J.

    2009-01-01

    - Treatability test plan published in 2008
    - Outlines technology treatability activities for evaluating the application of in situ technologies and surface barriers to deep vadose zone contamination (technetium and uranium)
    - Key elements:
      - Desiccation testing
      - Testing of gas-delivered reactants for in situ treatment of uranium
      - Evaluating surface barrier application to the deep vadose zone
      - Evaluating in situ grouting and soil flushing

  6. Deep Learning in Visual Computing and Signal Processing

    OpenAIRE

    Xie, Danfeng; Zhang, Lei; Bai, Li

    2017-01-01

    Deep learning is a subfield of machine learning, which aims to learn a hierarchy of features from input data. Nowadays, researchers have intensively investigated deep learning algorithms for solving challenging problems in many areas such as image classification, speech recognition, signal processing, and natural language processing. In this study, we not only review typical deep learning algorithms in computer vision and signal processing but also provide detailed information on how to apply...

  7. Deep Visual Attention Prediction

    Science.gov (United States)

    Wang, Wenguan; Shen, Jianbing

    2018-05-01

    In this work, we aim to predict human eye fixation with view-free scenes based on an end-to-end deep learning architecture. Although Convolutional Neural Networks (CNNs) have made substantial improvements on human attention prediction, CNN-based attention models still need to be improved by efficiently leveraging multi-scale features. Our visual attention network is proposed to capture hierarchical saliency information from deep, coarse layers with global saliency information to shallow, fine layers with local saliency response. Our model is based on a skip-layer network structure, which predicts human attention from multiple convolutional layers with various receptive fields. Final saliency prediction is achieved via the cooperation of those global and local predictions. Our model is learned in a deep supervision manner, where supervision is directly fed into multi-level layers, instead of previous approaches of providing supervision only at the output layer and propagating this supervision back to earlier layers. Our model thus incorporates multi-level saliency predictions within a single network, which significantly decreases the redundancy of previous approaches of learning multiple network streams with different input scales. Extensive experimental analysis on various challenging benchmark datasets demonstrates that our method yields state-of-the-art performance with competitive inference time.
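    The skip-layer idea of combining coarse global maps with fine local maps can be sketched without any deep-learning framework: upsample each level's saliency map to a common resolution and fuse. The snippet below uses plain NumPy with a simple average as a stand-in for the paper's learned fusion (names and fusion rule are illustrative):

```python
import numpy as np

def fuse_saliency(maps, out_size):
    """Fuse multi-level saliency maps (coarse-to-fine) into one prediction
    by nearest-neighbour upsampling each map to out_size x out_size and
    averaging.  Assumes out_size is divisible by each map's size."""
    fused = np.zeros((out_size, out_size))
    for m in maps:
        factor = out_size // m.shape[0]
        fused += np.kron(m, np.ones((factor, factor)))  # upsample and add
    return fused / len(maps)

coarse = np.array([[0.0, 1.0],
                   [0.0, 0.0]])       # global layer: salient top-right region
fine = np.zeros((8, 8))
fine[1, 6] = 1.0                      # local layer: one sharp peak
saliency = fuse_saliency([coarse, fine], 8)
```

    Where the global and local predictions agree (here, at the peak inside the salient region), the fused map is strongest, which is the cooperation the abstract describes.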

  8. Deep processes in non-relativistic confining potentials

    International Nuclear Information System (INIS)

    Fishbane, P.M.; Grisaru, M.T.

    1978-01-01

    The authors study deep inelastic and hard scattering processes for non-relativistic particles confined in deep potentials. The mechanisms by which the effects of confinement disappear and the particles scatter as if free are useful in understanding the analogous results for a relativistic field theory. (Auth.)

  9. Deep Learning in Medical Image Analysis.

    Science.gov (United States)

    Shen, Dinggang; Wu, Guorong; Suk, Heung-Il

    2017-06-21

    This review covers computer-assisted analysis of images in the field of medical imaging. Recent advances in machine learning, especially with regard to deep learning, are helping to identify, classify, and quantify patterns in medical images. At the core of these advances is the ability to exploit hierarchical feature representations learned solely from data, instead of features designed by hand according to domain-specific knowledge. Deep learning is rapidly becoming the state of the art, leading to enhanced performance in various medical applications. We introduce the fundamentals of deep learning methods and review their successes in image registration, detection of anatomical and cellular structures, tissue segmentation, computer-aided disease diagnosis and prognosis, and so on. We conclude by discussing research issues and suggesting future directions for further improvement.

  10. pathways to deep decarbonization - 2014 report

    International Nuclear Information System (INIS)

    Sachs, Jeffrey; Guerin, Emmanuel; Mas, Carl; Schmidt-Traub, Guido; Tubiana, Laurence; Waisman, Henri; Colombier, Michel; Bulger, Claire; Sulakshana, Elana; Zhang, Kathy; Barthelemy, Pierre; Spinazze, Lena; Pharabod, Ivan

    2014-09-01

    The Deep Decarbonization Pathways Project (DDPP) is a collaborative initiative to understand and show how individual countries can transition to a low-carbon economy and how the world can meet the internationally agreed target of limiting the increase in global mean surface temperature to less than 2 degrees Celsius (deg. C). Achieving the 2 deg. C limit will require that global net emissions of greenhouse gases (GHG) approach zero by the second half of the century. This will require a profound transformation of energy systems by mid-century through steep declines in carbon intensity in all sectors of the economy, a transition we call 'deep decarbonization.' Successfully transitioning to a low-carbon economy will require unprecedented global cooperation, including a global cooperative effort to accelerate the development and diffusion of some key low-carbon technologies. As underscored throughout this report, the results of the DDPP analyses remain preliminary and incomplete. The DDPP proceeds in two phases. This 2014 report describes the DDPP's approach to deep decarbonization at the country level and presents preliminary findings on technically feasible pathways to deep decarbonization, utilizing technology assumptions and timelines provided by the DDPP Secretariat. At this stage we have not yet considered the economic and social costs and benefits of deep decarbonization, which will be the topic for the next report. The DDPP is issuing this 2014 report to the UN Secretary-General Ban Ki-moon in support of the Climate Leaders' Summit at the United Nations on September 23, 2014. This 2014 report by the Deep Decarbonization Pathway Project (DDPP) summarizes preliminary findings of the technical pathways developed by the DDPP Country Research Partners with the objective of achieving emission reductions consistent with limiting global warming to less than 2 deg. C, without, at this stage, consideration of economic and social costs and benefits.
The DDPP is a knowledge

  11. Evolutionary process of deep-sea bathymodiolus mussels.

    Directory of Open Access Journals (Sweden)

    Jun-Ichi Miyazaki

    Full Text Available BACKGROUND: Since the discovery of deep-sea chemosynthesis-based communities, much work has been done to clarify their organismal and environmental aspects. However, major topics remain to be resolved, including when and how organisms invade and adapt to deep-sea environments; whether strategies for invasion and adaptation are shared by different taxa or unique to each taxon; how organisms extend their distribution and diversity; and how they become isolated to speciate in continuous waters. Deep-sea mussels are one of the dominant organisms in chemosynthesis-based communities, thus investigations of their origin and evolution contribute to resolving questions about life in those communities. METHODOLOGY/PRINCIPAL FINDINGS: We investigated worldwide phylogenetic relationships of deep-sea Bathymodiolus mussels and their mytilid relatives by analyzing nucleotide sequences of the mitochondrial cytochrome c oxidase subunit I (COI and NADH dehydrogenase subunit 4 (ND4 genes. Phylogenetic analysis of the concatenated sequence data showed that mussels of the subfamily Bathymodiolinae from vents and seeps were divided into four groups, and that mussels of the subfamily Modiolinae from sunken wood and whale carcasses assumed the outgroup position and shallow-water modioline mussels were positioned more distantly to the bathymodioline mussels. We provisionally hypothesized the evolutionary history of Bathymodiolus mussels by estimating evolutionary time under a relaxed molecular clock model. Diversification of bathymodioline mussels was initiated in the early Miocene, and subsequently diversification of the groups occurred in the early to middle Miocene. CONCLUSIONS/SIGNIFICANCE: The phylogenetic relationships support the "Evolutionary stepping stone hypothesis," in which mytilid ancestors exploited sunken wood and whale carcasses in their progressive adaptation to deep-sea environments. 
This hypothesis is also supported by the evolutionary transition of

  12. Deep water recycling through time.

    Science.gov (United States)

    Magni, Valentina; Bouilhol, Pierre; van Hunen, Jeroen

    2014-11-01

    We investigate the dehydration processes in subduction zones and their implications for the water cycle throughout Earth's history. We use a numerical tool that combines thermo-mechanical models with a thermodynamic database to examine slab dehydration for present-day and early Earth settings and its consequences for deep water recycling. We investigate the reactions responsible for releasing water from the crust and the hydrated lithospheric mantle and how they change with subduction velocity (v_s), slab age (a) and mantle temperature (T_m). Our results show that faster slabs dehydrate over a wider area: they start dehydrating shallower and they carry water deeper into the mantle. We parameterize the amount of water that can be carried deep into the mantle, W (×10^5 kg/m^2), as a function of v_s (cm/yr), a (Myr), and T_m (°C): [Formula: see text]. We generally observe that 1) a 100°C increase in mantle temperature, 2) a ∼15 Myr decrease in plate age, or 3) a decrease in subduction velocity of ∼2 cm/yr all have the same effect on the amount of water retained in the slab at depth, corresponding to a decrease of ∼2.2×10^5 kg/m^2 of H2O. We estimate that under present-day conditions ∼26% of the global influx of water, or 7×10^8 Tg/Myr of H2O, is recycled into the mantle. Using a realistic distribution of subduction parameters, we illustrate that deep water recycling might still be possible under early Earth conditions, although its efficiency would generally decrease: 0.5-3.7×10^8 Tg/Myr of H2O could still be recycled into the mantle at 2.8 Ga. Deep water recycling might be possible even in early Earth conditions. We provide a scaling law to estimate the H2O flux deep into the mantle. Subduction velocity has a major control on the crustal dehydration pattern.
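    The three equivalent perturbations stated in the abstract imply a simple linearized sensitivity around a reference subduction setting. The sketch below encodes only those stated equivalences; the paper's full scaling law is not reproduced in the abstract, and the function name and linear form here are illustrative:

```python
# Linearized sensitivities stated in the abstract: each of
#   +100 degC mantle temperature,
#   -15 Myr plate age,
#   -2 cm/yr subduction velocity
# lowers the water carried to depth, W, by ~2.2e5 kg/m^2.
STEP_W = 2.2e5  # kg/m^2 per equivalent step

def delta_W(dT_m=0.0, d_age=0.0, d_vs=0.0):
    """Approximate change in W (kg/m^2) for small changes in mantle
    temperature dT_m (degC), plate age d_age (Myr) and subduction
    velocity d_vs (cm/yr).  Hotter mantle lowers W; older, faster
    slabs raise it, per the abstract's equivalences."""
    return STEP_W * (-dT_m / 100.0 + d_age / 15.0 + d_vs / 2.0)

dw_hot = delta_W(dT_m=100.0)   # 100 degC hotter mantle: ~ -2.2e5 kg/m^2
```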

  13. Stratification-Based Outlier Detection over the Deep Web

    OpenAIRE

    Xian, Xuefeng; Zhao, Pengpeng; Sheng, Victor S.; Fang, Ligang; Gu, Caidong; Yang, Yuanfeng; Cui, Zhiming

    2016-01-01

    For many applications, finding rare instances or outliers can be more interesting than finding common patterns. Existing work in outlier detection never considers the context of deep web. In this paper, we argue that, for many scenarios, it is more meaningful to detect outliers over deep web. In the context of deep web, users must submit queries through a query interface to retrieve corresponding data. Therefore, traditional data mining methods cannot be directly applied. The primary contribu...

  14. Deep neural networks to enable real-time multimessenger astrophysics

    Science.gov (United States)

    George, Daniel; Huerta, E. A.

    2018-02-01

    Gravitational wave astronomy has set in motion a scientific revolution. To further enhance the science reach of this emergent field of research, there is a pressing need to increase the depth and speed of the algorithms used to enable these ground-breaking discoveries. We introduce Deep Filtering—a new scalable machine learning method for end-to-end time-series signal processing. Deep Filtering is based on deep learning with two deep convolutional neural networks, which are designed for classification and regression, to detect gravitational wave signals in highly noisy time-series data streams and also estimate the parameters of their sources in real time. Acknowledging that some of the most sensitive algorithms for the detection of gravitational waves are based on implementations of matched filtering, and that a matched filter is the optimal linear filter in Gaussian noise, the application of Deep Filtering using whitened signals in Gaussian noise is investigated in this foundational article. The results indicate that Deep Filtering outperforms conventional machine learning techniques and achieves similar performance compared to matched filtering, while being several orders of magnitude faster, allowing real-time signal processing with minimal resources. Furthermore, we demonstrate that Deep Filtering can detect and characterize waveform signals emitted from new classes of eccentric or spin-precessing binary black holes, even when trained with data sets of only quasicircular binary black hole waveforms. The results presented in this article, and the recent use of deep neural networks for the identification of optical transients in telescope data, suggest that deep learning can facilitate real-time searches of gravitational wave sources and their electromagnetic and astroparticle counterparts. In the subsequent article, the framework introduced herein is directly applied to identify and characterize gravitational wave events in real LIGO data.
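    Since a matched filter is the optimal linear filter for a known signal in white Gaussian noise, a minimal time-domain version makes the baseline that Deep Filtering is compared against concrete. The following NumPy sketch (toy waveform and names, not the LIGO pipeline) slides a normalized template over noisy data and locates an injected signal:

```python
import numpy as np

def matched_filter(data, template):
    """Correlate a unit-norm, zero-mean template against every offset of
    the data; in white Gaussian noise this is the optimal linear detector."""
    t = template - template.mean()
    t = t / np.linalg.norm(t)
    m = len(t)
    scores = np.array([np.dot(data[i:i + m], t)
                       for i in range(len(data) - m + 1)])
    return int(scores.argmax()), scores

rng = np.random.default_rng(1)
template = np.sin(2 * np.pi * 5 * np.linspace(0.0, 1.0, 64))  # toy waveform
data = 0.1 * rng.normal(size=512)                             # Gaussian noise
data[200:264] += template                                     # inject at 200
peak, scores = matched_filter(data, template)                 # peak ~ 200
```

    Deep Filtering replaces this explicit template bank with learned convolutional filters, trading a small loss in optimality for orders-of-magnitude faster inference.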

  15. Avalanches of sediment form deep-marine depositions

    NARCIS (Netherlands)

    Pohl, Florian|info:eu-repo/dai/nl/34309424X

    2017-01-01

    The deep ocean is the largest sedimentary basin on the planet. It serves as the primary storage point for all terrestrially weathered sediment that makes it beyond the near-shore environment. These deep-marine offshore deposits have become a focus of attention in exploration due to the

  16. Equivalent drawbead performance in deep drawing simulations

    NARCIS (Netherlands)

    Meinders, Vincent T.; Geijselaers, Hubertus J.M.; Huetink, Han

    1999-01-01

    Drawbeads are applied in the deep drawing process to improve the control of the material flow during the forming operation. In simulations of the deep drawing process these drawbeads can be replaced by an equivalent drawbead model. In this paper the usage of an equivalent drawbead model in the

  17. Deep web search: an overview and roadmap

    NARCIS (Netherlands)

    Tjin-Kam-Jet, Kien; Trieschnigg, Rudolf Berend; Hiemstra, Djoerd

    2011-01-01

    We review the state-of-the-art in deep web search and propose a novel classification scheme to better compare deep web search systems. The current binary classification (surfacing versus virtual integration) hides a number of implicit decisions that must be made by a developer. We make these

  18. Unravelling networks in local public health policymaking in three European countries

    DEFF Research Database (Denmark)

    Spitters, Hilde P.E.M.; Lau, Cathrine J; Sandu, Petru

    2017-01-01

    ... of these European country cases. Methods: A systems analysis of the local HEPA policymaking process was performed in three European countries involved in the 'REsearch into POlicy to enhance Physical Activity' (REPOPA) project, resulting in three schematic models showing the main stakeholders involved and their position and relations in the policymaking process. The Netherlands and Denmark were the most similar and both differed most from Romania, especially at the level of accountability of the local public authorities for local HEPA policymaking. The categories of driving forces underlying the relations between stakeholders were formal relations, informal interaction and knowledge exchange. Conclusions: A systems analysis providing detailed descriptions of positions and relations in the stakeholder network in local-level HEPA policymaking is rather unique...

  19. Filter testing and development for prolonged transuranic service and waste reduction

    International Nuclear Information System (INIS)

    Geer, J.A.; Buttedahl, O.I.; Skaats, C.D.; Terada, K.; Woodard, R.W.

    1977-02-01

    The life of High Efficiency Particulate Air (HEPA) filters used in transuranic service is influenced greatly by the gaseous and particulate matter to which the filters are exposed. The most severe conditions encountered at Rocky Flats are at the ventilation systems serving the plutonium recovery operations in Bldg. 771. A project of filter testing and development for prolonged transuranic service and waste reduction was formally initiated at Rocky Flats on July 1, 1975. The project is directed toward improving filtration methods which will prolong the life of HEPA filter systems without sacrificing effectiveness. Another important aspect of the project is to reduce the volume of HEPA filter waste shipped from the plant for long-term storage. Progress to September 30, 1976, is reported

  20. Age-dependent mixing of deep-sea sediments

    International Nuclear Information System (INIS)

    Smith, C.R.; Maggaard, L.; Pope, R.H.; DeMaster, D.J.

    1993-01-01

    Rates of bioturbation measured in deep-sea sediments commonly are tracer dependent; in particular, shorter-lived radiotracers (such as ²³⁴Th) often yield markedly higher diffusive mixing coefficients than their longer-lived counterparts (e.g., ²¹⁰Pb). At a single station in the 1,240-m deep Santa Catalina Basin, the authors document a strong negative correlation between bioturbation rate and tracer half-life. Sediment profiles of ²³⁴Th (half-life = 24 days) yield an average mixing coefficient (60 cm² y⁻¹) two orders of magnitude greater than that for ²¹⁰Pb (half-life = 22 y, mean mixing coefficient = 0.4 cm² y⁻¹). A similar negative relationship between mixing rate and tracer time scale is observed at thirteen other deep-sea sites in which multiple radiotracers have been used to assess diffusive mixing rates. This relationship holds across a variety of radiotracer types and time scales. The authors hypothesize that this negative relationship results from age-dependent mixing, a process in which recently sedimented, food-rich particles are ingested and mixed at higher rates by deposit feeders than are older, food-poor particles. Results from an age-dependent mixing model demonstrate that this process indeed can yield the bioturbation-rate vs. tracer-time-scale correlations observed in deep-sea sediments. Field data on mixing rates of recently sedimented particles, as well as the radiotracer activity of deep-sea deposit feeders, provide strong support for the age-dependent mixing model. The presence of age-dependent mixing in deep-sea sediments may have major implications for diagenetic modeling, requiring a match between the characteristic time scales of mixing tracers and modeled reactants. 102 refs., 6 figs., 5 tabs
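
The apparent mixing coefficients quoted in this abstract are conventionally inferred from a steady-state diffusion-decay model, in which tracer activity decays exponentially with depth. The sketch below illustrates that inversion on synthetic data; the function name and profile values are ours, and this is the classical constant-coefficient model, not the authors' age-dependent model:

```python
import numpy as np

# Steady-state diffusion-decay: A(z) = A(0) * exp(-sqrt(lam/Db) * z),
# so a log-linear fit of activity vs. depth gives slope m and Db = lam / m**2.
def mixing_coefficient(depth_cm, activity, half_life_y):
    lam = np.log(2) / half_life_y                  # decay constant (1/y)
    slope, _ = np.polyfit(depth_cm, np.log(activity), 1)
    return lam / slope ** 2                        # Db in cm^2/y

# Synthetic 210Pb-like profile generated with Db = 0.4 cm^2/y.
lam = np.log(2) / 22.0
z = np.linspace(0.0, 8.0, 9)
A = 10.0 * np.exp(-np.sqrt(lam / 0.4) * z)
Db = mixing_coefficient(z, A, 22.0)   # recovers ~0.4 cm^2/y
```

Fitting the same synthetic profile with a shorter-lived tracer's decay constant would return a proportionally larger Db, which is the half-life dependence the abstract describes.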

  1. Deep-sea Hexactinellida (Porifera) of the Weddell Sea

    Science.gov (United States)

    Janussen, Dorte; Tabachnick, Konstantin R.; Tendal, Ole S.

    2004-07-01

    New Hexactinellida from the deep Weddell Sea are described. This moderately diverse hexactinellid fauna includes 14 species belonging to 12 genera, of which five species and one subgenus are new to science: Periphragella antarctica n. sp., Holascus pseudostellatus n. sp., Caulophacus (Caulophacus) discohexactinus n. sp., C. (Caulodiscus) brandti n. sp., C. (Oxydiscus) weddelli n. sp., and C. (Oxydiscus) n. subgen. So far, 20 hexactinellid species have been reported from the deep Weddell Sea: 15 are known from the northern part, 10 only from there, while 10 come from the southern area, and five of these only from there. However, this apparent high "endemism" of Antarctic hexactinellid sponges is most likely the result of severe undersampling of the deep-sea fauna. We find no reason to believe that a division between an oceanic and a more continental group of species exists. The current poor database indicates that a substantial part of the deep hexactinellid fauna of the Weddell Sea is shared with other deep-sea regions, but it does not indicate a special biogeographic relationship with any other ocean.

  2. Deepwater Program: Lophelia II, continuing ecological research on deep-sea corals and deep-reef habitats in the Gulf of Mexico

    Science.gov (United States)

    Demopoulos, Amanda W.J.; Ross, Steve W.; Kellogg, Christina A.; Morrison, Cheryl L.; Nizinski, Martha S.; Prouty, Nancy G.; Bourque, Jill R.; Galkiewicz, Julie P.; Gray, Michael A.; Springmann, Marcus J.; Coykendall, D. Katharine; Miller, Andrew; Rhode, Mike; Quattrini, Andrea; Ames, Cheryl L.; Brooke, Sandra D.; McClain Counts, Jennifer; Roark, E. Brendan; Buster, Noreen A.; Phillips, Ryan M.; Frometa, Janessy

    2017-12-11

    The deep sea is a rich environment composed of diverse habitat types. While deep-sea coral habitats have been discovered within each ocean basin, knowledge about the ecology of these habitats and associated inhabitants continues to grow. This report presents information and results from the Lophelia II project that examined deep-sea coral habitats in the Gulf of Mexico. The Lophelia II project focused on Lophelia pertusa habitats along the continental slope, at depths ranging from 300 to 1,000 meters. The chapters are authored by several scientists from the U.S. Geological Survey, National Oceanic and Atmospheric Administration, University of North Carolina Wilmington, and Florida State University who examined the community ecology (from microbes to fishes), deep-sea coral age, growth, and reproduction, and population connectivity of deep-sea corals and inhabitants. Data from these studies are presented in the chapters and appendixes of the report as well as in journal publications. This study was conducted by the Ecosystems Mission Area of the U.S. Geological Survey to meet information needs identified by the Bureau of Ocean Energy Management.

  3. Deep learning quick reference useful hacks for training and optimizing deep neural networks with TensorFlow and Keras

    CERN Document Server

    Bernico, Michael

    2018-01-01

    This book is a practical guide to applying deep neural networks, including MLPs, CNNs, LSTMs, and more, in Keras and TensorFlow. Packed with useful hacks to solve real-world challenges along with the supporting math and theory around each topic, this book will be a quick reference for training and optimizing your deep neural networks.

  4. Deep inelastic scattering

    International Nuclear Information System (INIS)

    Zakharov, V.I.

    1977-01-01

    The present status of the quark-parton-gluon picture of deep inelastic scattering is reviewed. The general framework is mostly theoretical and covers investigations since 1970. Predictions of the parton model and of the asymptotically free field theories are compared with experimental data available. The valence quark approximation is concluded to be valid in most cases, but fails to account for the data on the total momentum transfer. On the basis of gluon corrections introduced to the parton model certain predictions concerning both the deep inelastic structure functions and form factors are made. The contributions of gluon exchanges and gluon bremsstrahlung are highlighted. Asymptotic freedom is concluded to be very attractive and provide qualitative explanation to some experimental observations (scaling violations, breaking of the Drell-Yan-West type relations). Lepton-nuclear scattering is pointed out to be helpful in probing the nature of nuclear forces and studying the space-time picture of the parton model

  5. The DEEP-South: Scheduling and Data Reduction Software System

    Science.gov (United States)

    Yim, Hong-Suh; Kim, Myung-Jin; Bae, Youngho; Moon, Hong-Kyu; Choi, Young-Jun; Roh, Dong-Goo; the DEEP-South Team

    2015-08-01

    The DEep Ecliptic Patrol of the Southern sky (DEEP-South), started in October 2012, is currently in test runs with the first Korea Microlensing Telescope Network (KMTNet) 1.6 m wide-field telescope located at CTIO in Chile. While the primary objective of the DEEP-South is physical characterization of small bodies in the Solar System, it is expected to discover a large number of such bodies, many of them previously unknown. An automatic observation planning and data reduction software subsystem called "The DEEP-South Scheduling and Data reduction System" (the DEEP-South SDS) is currently being designed and implemented for observation planning, data reduction and analysis of huge amounts of data with minimum human interaction. The DEEP-South SDS consists of three software subsystems: the DEEP-South Scheduling System (DSS), the Local Data Reduction System (LDR), and the Main Data Reduction System (MDR). The DSS manages observation targets, makes decisions on target priority and observation methods, schedules nightly observations, and archives data using the Database Management System (DBMS). The LDR is designed to detect moving objects from CCD images, while the MDR conducts photometry and reconstructs lightcurves. Based on analyses made at the LDR and the MDR, the DSS schedules follow-up observations to be conducted at other KMTNet stations. By the end of 2015, we expect the DEEP-South SDS to achieve stable operation. We also plan to improve the SDS to accomplish a finely tuned observation strategy and more efficient data reduction in 2016.
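
Priority-driven nightly scheduling of the kind the DSS performs can be pictured as a priority queue over candidate targets. The toy below is a hypothetical sketch: the target names, priorities, and selection policy are invented, and the real DSS also weighs visibility, observing conditions, and observation methods:

```python
import heapq

def schedule_night(targets, n_slots):
    """targets: {name: priority}; return the n_slots highest-priority targets."""
    heap = [(-priority, name) for name, priority in targets.items()]
    heapq.heapify(heap)
    return [heapq.heappop(heap)[1] for _ in range(min(n_slots, len(heap)))]

# Two observing slots tonight: the two highest-priority targets are chosen.
plan = schedule_night({"2012 DA14": 8, "2015 XA": 5, "Ceres": 2, "Vesta": 1}, 2)
```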

  6. A Modeling Study of Deep Water Renewal in the Red Sea

    Science.gov (United States)

    Yao, F.; Hoteit, I.

    2016-02-01

    Deep water renewal processes in the Red Sea are examined in this study using a 50-year numerical simulation from 1952 to 2001. The deep water in the Red Sea below the thermocline (~200 m) exhibits a near-uniform vertical structure in temperature and salinity, but geochemical tracer distributions, such as ¹⁴C and ³He, and dissolved oxygen concentrations indicate that the deep water is renewed on time scales as short as 36 years. The renewal process is accomplished through a deep overturning cell that consists of a southward bottom current and a northward returning current at depths of 400-600 m. Three source regions are proposed for the formation of the deep water, including two deep outflows from the Gulfs of Aqaba and Suez and winter deep convection in the northern Red Sea. The MITgcm (MIT general circulation model), which has been used to simulate the shallow overturning circulations in the Red Sea, is configured in this study with increased resolution in the deep water. During the 50 years of simulation, artificial passive tracers added in the model indicate that the deep water in the Red Sea was only episodically renewed during some anomalously cold years; two significant episodes of deep water renewal are reproduced in the winters of 1983 and 1992, in accordance with reported historical hydrographic observations. During these renewal events, deep convection reaching the bottom of the basin occurred, which further facilitated deep sinking of the outflows from the Gulfs of Aqaba and Suez. Ensuing spreading of the newly formed deep water along the bottom caused upward displacements of the thermocline, which may have profound effects on the water exchanges in the Strait of Bab el Mandeb between the Red Sea and the Gulf of Aden and on the functioning of the ecosystem in the Red Sea by changing the vertical distributions of nutrients.

  7. Deep Borehole Field Test Research Activities at LBNL

    Energy Technology Data Exchange (ETDEWEB)

    Dobson, Patrick [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Tsang, Chin-Fu [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Kneafsey, Timothy [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Borglin, Sharon [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Piceno, Yvette [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Andersen, Gary [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Nakagawa, Seiji [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Nihei, Kurt [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Rutqvist, Jonny [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Doughty, Christine [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Reagan, Matthew [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-08-19

    The goal of the U.S. Department of Energy Used Fuel Disposition’s (UFD) Deep Borehole Field Test is to drill two 5 km large-diameter boreholes: a characterization borehole with a bottom-hole diameter of 8.5 inches and a field test borehole with a bottom-hole diameter of 17 inches. These boreholes will be used to demonstrate the ability to drill such holes in crystalline rocks, effectively characterize the bedrock repository system using geophysical, geochemical, and hydrological techniques, and emplace and retrieve test waste packages. These studies will be used to test the deep borehole disposal concept, which requires a hydrologically isolated environment characterized by low permeability, stable fluid density, reducing fluid chemistry conditions, and an effective borehole seal. During FY16, Lawrence Berkeley National Laboratory scientists conducted a number of research studies to support the UFD Deep Borehole Field Test effort. This work included providing supporting data for the Los Alamos National Laboratory geologic framework model for the proposed deep borehole site, conducting an analog study using an extensive suite of geoscience data and samples from a deep (2.5 km) research borehole in Sweden, conducting laboratory experiments and coupled process modeling related to borehole seals, and developing a suite of potential techniques that could be applied to the characterization and monitoring of the deep borehole environment. The results of these studies are presented in this report.

  8. Deep Borehole Field Test Research Activities at LBNL

    International Nuclear Information System (INIS)

    Dobson, Patrick; Tsang, Chin-Fu; Kneafsey, Timothy; Borglin, Sharon; Piceno, Yvette; Andersen, Gary; Nakagawa, Seiji; Nihei, Kurt; Rutqvist, Jonny; Doughty, Christine; Reagan, Matthew

    2016-01-01

    The goal of the U.S. Department of Energy Used Fuel Disposition's (UFD) Deep Borehole Field Test is to drill two 5 km large-diameter boreholes: a characterization borehole with a bottom-hole diameter of 8.5 inches and a field test borehole with a bottom-hole diameter of 17 inches. These boreholes will be used to demonstrate the ability to drill such holes in crystalline rocks, effectively characterize the bedrock repository system using geophysical, geochemical, and hydrological techniques, and emplace and retrieve test waste packages. These studies will be used to test the deep borehole disposal concept, which requires a hydrologically isolated environment characterized by low permeability, stable fluid density, reducing fluid chemistry conditions, and an effective borehole seal. During FY16, Lawrence Berkeley National Laboratory scientists conducted a number of research studies to support the UFD Deep Borehole Field Test effort. This work included providing supporting data for the Los Alamos National Laboratory geologic framework model for the proposed deep borehole site, conducting an analog study using an extensive suite of geoscience data and samples from a deep (2.5 km) research borehole in Sweden, conducting laboratory experiments and coupled process modeling related to borehole seals, and developing a suite of potential techniques that could be applied to the characterization and monitoring of the deep borehole environment. The results of these studies are presented in this report.

  9. The biomass of the deep-sea benthopelagic plankton

    Science.gov (United States)

    Wishner, K. F.

    1980-04-01

    Deep-sea benthopelagic plankton samples were collected with a specially designed opening-closing net system 10 to 100 m above the bottom in five different oceanic regions at depths from 1000 to 4700 m. Benthopelagic plankton biomasses decrease exponentially with depth. At 1000 m the biomass is about 1% that of the surface zooplankton, at 5000 m about 0.1%. Effects of differences in surface primary productivity on deep-sea plankton biomass are much less than the effect of depth and are detectable only in a few comparisons of extreme oceanic regions. The biomass at 10 m above the bottom is greater than that at 100 m above the bottom (in a three-sample comparison), which could be a consequence of an enriched near-bottom environment. The deep-sea plankton biomass in the Red Sea is anomalously low. This may be due to increased decomposition rates in the warm (22°C) deep Red Sea water, which prevent much detritus from reaching the deep sea. A model of organic carbon utilization in the benthic boundary layer (bottom 100 m), incorporating results from deep-sea sediment trap and respiration studies, indicates that the benthopelagic plankton use only a small amount of the organic carbon flux. A large fraction of the flux is unaccounted for by present estimates of benthic and benthopelagic respiration.

  10. Evolving Deep Networks Using HPC

    Energy Technology Data Exchange (ETDEWEB)

    Young, Steven R. [ORNL, Oak Ridge]; Rose, Derek C. [ORNL, Oak Ridge]; Johnston, Travis [ORNL, Oak Ridge]; Heller, William T. [ORNL, Oak Ridge]; Karnowski, Thomas P. [ORNL, Oak Ridge]; Potok, Thomas E. [ORNL, Oak Ridge]; Patton, Robert M. [ORNL, Oak Ridge]; Perdue, Gabriel [Fermilab]; Miller, Jonathan [Santa Maria U., Valparaiso]

    2017-01-01

    While a large number of deep learning networks have been studied and published that produce outstanding results on natural image datasets, these datasets only make up a fraction of those to which deep learning can be applied. These datasets include text data, audio data, and arrays of sensors that have very different characteristics than natural images. As these “best” networks for natural images have been largely discovered through experimentation and cannot be proven optimal on some theoretical basis, there is no reason to believe that they are the optimal networks for these drastically different datasets. Hyperparameter search is thus often a very important process when applying deep learning to a new problem. In this work we present an evolutionary approach to searching the possible space of network hyperparameters and construction that can scale to 18,000 nodes. This approach is applied to datasets of varying types and characteristics where we demonstrate the ability to rapidly find the best hyperparameters in order to enable practitioners to quickly iterate between idea and result.
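
A minimal caricature of such an evolutionary hyperparameter search is sketched below. Everything here is illustrative: the `loss` function is a synthetic stand-in for actual network training, and the mutation operators and constants are invented, not those of the paper:

```python
import random

# Synthetic "validation loss": minimized at lr = 0.01 and 4 layers.
def loss(cfg):
    return (cfg["lr"] - 0.01) ** 2 * 1e4 + abs(cfg["layers"] - 4) * 0.5

def mutate(cfg, rng):
    out = dict(cfg)
    out["lr"] = max(1e-5, out["lr"] * rng.choice([0.5, 0.8, 1.25, 2.0]))
    out["layers"] = max(1, out["layers"] + rng.choice([-1, 0, 1]))
    return out

def evolve(pop_size=20, generations=30, seed=0):
    rng = random.Random(seed)
    pop = [{"lr": 10 ** rng.uniform(-4, -1), "layers": rng.randint(1, 8)}
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=loss)                       # rank by fitness
        parents = pop[: pop_size // 4]           # elitist truncation selection
        pop = parents + [mutate(rng.choice(parents), rng)
                         for _ in range(pop_size - len(parents))]
    return min(pop, key=loss)

best = evolve()   # typically converges near lr = 0.01, layers = 4
```

In the paper's setting each fitness evaluation is a full network training run, which is why the search is distributed across thousands of nodes rather than run serially as here.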

  11. Desalination Economic Evaluation Program (DEEP). User's manual

    International Nuclear Information System (INIS)

    2000-01-01

    DEEP (formerly named the ''Co-generation and Desalination Economic Evaluation'' Spreadsheet, CDEE) was originally developed by General Atomics under contract, and has been used in the IAEA's feasibility studies. For further confidence in the software, it was validated in March 1998. After that, a user-friendly version was issued under the name DEEP at the end of 1998. DEEP output includes the levelised cost of water and power, a breakdown of cost components, energy consumption and net saleable power for each selected option. Specific power plants can be modelled by adjustment of input data including design power, power cycle parameters and costs

  12. Characterization of majority and minority carrier deep levels in p-type GaN:Mg grown by molecular beam epitaxy using deep level optical spectroscopy

    International Nuclear Information System (INIS)

    Armstrong, A.; Caudill, J.; Ringel, S. A.; Corrion, A.; Poblenz, C.; Mishra, U. K.; Speck, J. S.

    2008-01-01

    Deep level defects in p-type GaN:Mg grown by molecular beam epitaxy were characterized using steady-state photocapacitance and deep level optical spectroscopy (DLOS). Low frequency capacitance measurements were used to alleviate dispersion effects stemming from the deep Mg acceptor. Use of DLOS enabled a quantitative survey of both deep acceptor and deep donor levels, the latter being particularly important due to the limited understanding of minority carrier states for p-type GaN. Simultaneous electron and hole photoemissions resulted in a convoluted deep level spectrum that was decoupled by emphasizing either majority or minority carrier optical emission through control of the thermal filling time conditions. In this manner, DLOS was able to resolve and quantify the properties of deep levels residing near both the conduction and valence bandedges in the same sample. Bandgap states through hole photoemission were observed at E_v + 3.05 eV, E_v + 3.22 eV and E_v + 3.26 eV. Additionally, DLOS revealed levels at E_c - 3.24 eV and E_c - 2.97 eV through electron emission to the conduction band, with the former attributed to the Mg acceptor itself. The detected deep donor concentration is less than 2% of the activated [Mg] and demonstrates the excellent quality of the film

  13. DeepFlavour in CMS

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Flavour-tagging of jets is an important task in collider-based high energy physics and a field where machine learning tools are applied by all major experiments. A new tagger (DeepFlavour) was developed and commissioned in CMS that is based on an advanced machine learning procedure. A deep neural network is used for multi-classification of jets that originate from a b-quark, two b-quarks, a c-quark, two c-quarks or light colored particles (u, d, s-quark or gluon). The performance was measured in both data and simulation. The talk will also include the measured performance of all taggers in CMS. The different taggers and results will be discussed and compared, with some focus on details of the newest tagger.

  14. Deep bite malocclusion: exploration of the skeletal and dental factors

    International Nuclear Information System (INIS)

    Bhateja, N.K.; Fida, M.; Shaikh, A.

    2016-01-01

    Correction of deep bite is crucial for maintenance of dental hard and soft tissue structures and for prevention of temporomandibular joint disorders. Exploration of underlying skeletal and dental factors is essential for efficient and individualized treatment planning. To date, the etiological factors of dental and skeletal deep bite have not been explored in Pakistani orthodontic patients. The objectives of this study were to explore the frequencies of dental and skeletal etiological factors in deep bite patients and to determine correlations amongst dental and skeletal etiological factors of deep bite. Methods: The study included a total of 113 subjects (males=35; females=78) with no craniofacial syndromes or prior orthodontic treatment. Pre-treatment orthodontic records were used to evaluate various dental and skeletal parameters. Descriptive statistics of each parameter were calculated. The various study parameters were correlated using Pearson's correlation. Results: A deep curve of Spee was the most frequently seen factor of dental deep bite (72.6%), followed by increased coronal length of the upper incisors (28.3%), retroclined upper incisors (17.7%), retroclined lower incisors (8%) and increased coronal length of the lower incisors (5.3%). A decreased gonial angle was the most commonly found factor of skeletal deep bite (43.4%), followed by a decreased mandibular plane angle (27.4%) and clockwise rotation of the maxillary plane (26.5%). The Frankfort mandibular plane angle and the gonial angle showed a strong positive correlation (r=0.66, p=0.000). Conclusions: A reduced gonial angle is the most frequently seen skeletal factor, signifying the importance of the angulation and growth of the ramus in the development of deep bite. A deep curve of Spee is the most frequently seen dental etiological component in deep bite subjects, signifying the importance of intruding the lower anterior teeth. (author)

  15. Preliminary discussion on deep-sourced uranium metallogenesis and deep prospecting

    International Nuclear Information System (INIS)

    Huang Shijie

    2006-01-01

    Prospecting for hydrothermal-type uranium deposits should be aimed at medium- to large-sized deposits, and be guided by mantle-sourced, superimposed, deep-sourced metallogenic theory and the establishment of a multifactor, composite, deep-sourced metallogenic model. The author suggests that hydrothermal uranium deposits may be classified into three genetic types, i.e. hydrothermal circulation concentration, postmagmatic hydrothermal and mantle fluid concentration. These types of uranium deposits are characterized by their own metallogenic features and are concentrated in the same mineralization-concentrated area, forming a metallogenic series. Large-sized uranium ore fields and rich, large uranium deposits are usually closely associated with mantle-sourced metallogenesis, and the formation of such uranium ore fields and deposits is characterized by specific and unique regional geologic environments. Recognition criteria for mantle-sourced metallogenesis are preliminarily proposed in the paper. It is pointed out that future prospecting should follow the metallogenic model appropriate to the specific genetic type, and establish an operable prospecting model to realize model-guided prospecting. (authors)

  16. Deep Ocean Contribution to Sea Level Rise

    Science.gov (United States)

    Chang, L.; Sun, W.; Tang, H.; Wang, Q.

    2017-12-01

    The ocean temperature and salinity change in the upper 2000 m can be detected by Argo floats, so we can determine the steric height change of the ocean. But the ocean layers above 2000 m represent only 50% of the total ocean volume. Although the temperature and salinity changes are small compared to the upper ocean, the deep ocean contribution to sea level might be significant because of its large volume. Previous research on the deep ocean has relied on very sparse in situ observations and is limited to decadal and longer-term rates of change. The available observational data in the deep ocean are too sparse to determine the temporal variability, and the long-term changes may have a bias. We will use the Argo data and combine the in situ data and topographic data to estimate the temperature and salinity of the sea water below 2000 m, so we can obtain a monthly dataset. We will analyze the seasonal and annual change of the steric height due to the deep ocean between 2005 and 2016. And we will evaluate the result by combining the present-day satellite and in situ observing systems. The deep ocean contribution can be inferred indirectly as the difference between the altimetry minus GRACE and the Argo-based steric sea level.
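
The indirect inference described in this abstract's last sentence is simple budget arithmetic: total sea level change (altimetry) minus the mass component (GRACE) minus the upper-ocean steric component (Argo) leaves the deep-ocean steric residual. The numbers below are hypothetical, for illustration only:

```python
# Sea level budget residual: deep steric = altimetry - ocean mass - upper steric.
def deep_steric(altimetry, grace_mass, argo_upper_steric):
    return altimetry - grace_mass - argo_upper_steric

# Hypothetical trend values in mm/yr (not observed estimates).
residual = deep_steric(altimetry=3.3, grace_mass=2.0, argo_upper_steric=1.1)
```

In practice the uncertainties of all three observing systems propagate into the residual, which is why the abstract stresses evaluating the result against the combined systems.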

  17. Deep learning for SAR image formation

    Science.gov (United States)

    Mason, Eric; Yonel, Bariscan; Yazici, Birsen

    2017-04-01

    The recent success of deep learning has led to growing interest in applying these methods to signal processing problems. This paper explores the application of deep learning to synthetic aperture radar (SAR) image formation. We review deep learning from a perspective relevant to SAR image formation. Our objective is to address SAR image formation in the presence of uncertainties in the SAR forward model. We present a recurrent auto-encoder network architecture based on the iterative shrinkage thresholding algorithm (ISTA) that incorporates SAR modeling. We then present an off-line training method using stochastic gradient descent and discuss the challenges and key steps of learning. Lastly, we show experimentally that our method can be used to form focused images in the presence of phase uncertainties. We demonstrate that the resulting algorithm has faster convergence and lower reconstruction error than ISTA.
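
The ISTA iteration that such an unrolled network builds on can be sketched in plain NumPy on a generic sparse-recovery problem (the random matrix, sparsity pattern, and parameters below are illustrative and stand in for a SAR forward model):

```python
import numpy as np

def ista(A, y, lam=0.05, n_iter=1000):
    """ISTA for min_x 0.5*||Ax - y||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = x - A.T @ (A @ x - y) / L          # gradient step on the data-fit term
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return x

# Recover a 3-sparse vector from noiseless random measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((80, 40))
x_true = np.zeros(40)
x_true[[3, 17, 30]] = [1.5, -2.0, 0.7]
x_hat = ista(A, A @ x_true)
```

An unrolled ("learned") variant replaces the fixed matrices and threshold in each iteration with trainable parameters, which is the design idea behind recurrent auto-encoder architectures of this kind.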

  18. The Pleating of History: Weaving the Threads of Nationhood

    Directory of Open Access Journals (Sweden)

    Martin Ball

    2013-08-01

    Full Text Available As any etymologist knows, the word ‘text’ is derived from the past participle of the Latin verb texere, to weave. Text is therefore something that is ‘woven’. It’s a persuasive metaphor, to imagine writing in terms of the warp and weft of ideas and words, of narrative threads woven together to become a piece of fabric. The idea of history as fabric brings together a whole different set of tropes, not just of weaving, but of the very materiality of fabric. Does the fabric have a nap, or a pattern? Is it cut with the grain, or on the bias? What of its folds, its seams? All these qualities of fabric have application in the interpretation of history, and some of these images are already familiar in historical discourse.

  19. Sentimen Analisis Tweet Berbahasa Indonesia Dengan Deep Belief Network

    Directory of Open Access Journals (Sweden)

    Ira zulfa

    2017-07-01

    Full Text Available Sentiment analysis is computational research into the opinions, sentiments and emotions expressed in text. Twitter has become the most popular communication medium among internet users. Deep Learning is a new area of machine learning research. It aims to move machine learning closer to its main goal, artificial intelligence. The purpose of deep learning is to replace manual feature engineering with learning. As it has grown, deep learning has developed arrangements of algorithms that focus on non-linear data representation. One such machine learning method is the Deep Belief Network (DBN). The Deep Belief Network (DBN), a Deep Learning method, is a stack of several algorithms with extraction features that optimally utilize all resources. This study has two aims. First, it classifies positive, negative, and neutral sentiments in the test data. Second, it determines the accuracy of the classification model built with the Deep Belief Network method, so that it can be applied to tweet classification to highlight the sentiment class of training-data tweets in Bahasa Indonesia. Based on the experimental results, it can be concluded that the best method for handling the tweet data is the DBN method, with an accuracy of 93.31%, compared with the Naive Bayes method, which has an accuracy of 79.10%, and the SVM (Support Vector Machine) method, with an accuracy of 92.18%.

  20. Deep Recurrent Convolutional Neural Network: Improving Performance For Speech Recognition

    OpenAIRE

    Zhang, Zewang; Sun, Zheng; Liu, Jiaqi; Chen, Jingwen; Huo, Zhao; Zhang, Xiao

    2016-01-01

    A deep learning approach has been widely applied in sequence modeling problems. In terms of automatic speech recognition (ASR), its performance has been significantly improved by larger speech corpora and deeper neural networks. In particular, recurrent neural networks and deep convolutional neural networks have been applied in ASR successfully. Given the rising problem of training speed, we build a novel deep recurrent convolutional network for acoustic modeling and then apply deep resid...

  1. Research Proposal for Distributed Deep Web Search

    NARCIS (Netherlands)

    Tjin-Kam-Jet, Kien

    2010-01-01

    This proposal identifies two main problems related to deep web search, and proposes a step-by-step solution for each of them. The first problem is about searching deep web content by means of a simple free-text interface (with just one input field, instead of a complex interface with many input

  2. Short-term Memory of Deep RNN

    OpenAIRE

    Gallicchio, Claudio

    2018-01-01

    The extension of deep learning towards temporal data processing is gaining increasing research interest. In this paper we investigate the properties of state dynamics developed in successive levels of deep recurrent neural networks (RNNs) in terms of short-term memory abilities. Our results reveal interesting insights that shed light on the nature of layering as a factor of RNN design. Noticeably, higher layers in a hierarchically organized RNN architecture turn out to be inherently biased ...

  3. Biodiversity loss from deep-sea mining

    OpenAIRE

    C. L. Van Dover; J. A. Ardron; E. Escobar; M. Gianni; K. M. Gjerde; A. Jaeckel; D. O. B. Jones; L. A. Levin; H. Niner; L. Pendleton; C. R. Smith; T. Thiele; P. J. Turner; L. Watling; P. P. E. Weaver

    2017-01-01

    The emerging deep-sea mining industry is seen by some to be an engine for economic development in the maritime sector. The International Seabed Authority (ISA) – the body that regulates mining activities on the seabed beyond national jurisdiction – must also protect the marine environment from harmful effects that arise from mining. The ISA is currently drafting a regulatory framework for deep-sea mining that includes measures for environmental protection. Responsible mining increasingly stri...

  4. Deep Predictive Models in Interactive Music

    OpenAIRE

    Martin, Charles P.; Ellefsen, Kai Olav; Torresen, Jim

    2018-01-01

    Automatic music generation is a compelling task where much recent progress has been made with deep learning models. In this paper, we ask how these models can be integrated into interactive music systems; how can they encourage or enhance the music making of human users? Musical performance requires prediction to operate instruments, and perform in groups. We argue that predictive models could help interactive systems to understand their temporal context, and ensemble behaviour. Deep learning...

  5. Recent changes in the deep-water fish populations of Lake Michigan

    Science.gov (United States)

    Moffett, James W.

    1957-01-01

    The deep-water fish fauna of Lake Michigan consisted of lake trout (Salvelinus namaycush), burbot (Lota lota maculosa), seven species of chubs or deep-water ciscoes (Leucichthys spp.), and the deep-water sculpin (Myoxocephalus quadricornis). Other species occupied the deep-water zone but were not typically part of the fauna.

  6. Potential radionuclide emissions from stacks on the Hanford Site. Part 1: Dose assessment

    International Nuclear Information System (INIS)

    Davis, W.E.; Barnett, J.M.

    1994-06-01

    On February 3, 1993, the US Department of Energy, Richland Operations Office (RL) received a Compliance Order and Information Request from the Director of the Air and Toxics Division of the US Environmental Protection Agency (EPA), Region 10. The Compliance Plan specified that a dose assessment be performed for 84 Westinghouse Hanford Company (WHC) stacks registered with the Washington State Department of Health (WAC 246-247) on the Hanford Site. Stacks whose potential emissions could cause an effective dose equivalent (EDE) to a maximally exposed individual (MEI) greater than 0.1 mrem y⁻¹ must be monitored continuously for radionuclide emissions. Five methods were approved by EPA, Region 10 for performing the assessments: release fractions from Appendix D of 40 CFR 61, back calculations using a HEPA filtration factor, nondestructive assay of HEPA filters, a spill release fraction, and upstream-of-HEPA-filter air concentrations. The first two methods were extremely conservative for estimating releases. The third method, which used a state-of-the-art portable gamma spectrometer, yielded surprising results on the distribution of radionuclides on the HEPA filters. All five methods are described.

  7. Deep learning beyond cats and dogs: recent advances in diagnosing breast cancer with deep neural networks.

    Science.gov (United States)

    Burt, Jeremy R; Torosdagli, Neslisah; Khosravan, Naji; RaviPrakash, Harish; Mortazi, Aliasghar; Tissavirasingham, Fiona; Hussein, Sarfaraz; Bagci, Ulas

    2018-04-10

    Deep learning has demonstrated tremendous revolutionary changes in the computing industry, and its effects in radiology and imaging sciences have begun to dramatically change screening paradigms. Specifically, these advances have influenced the development of computer-aided detection and diagnosis (CAD) systems. These technologies have long been thought of as "second-opinion" tools for radiologists and clinicians. However, with significant improvements in deep neural networks, the diagnostic capabilities of learning algorithms are approaching levels of human expertise (radiologists, clinicians, etc.), shifting the CAD paradigm from a "second-opinion" tool to a more collaborative utility. This paper reviews recently developed CAD systems based on deep learning technologies for breast cancer diagnosis, explains their advantages over previously established systems, describes the methodologies behind the improved achievements, including algorithmic developments, and outlines remaining challenges in breast cancer screening and diagnosis. We also discuss possible future directions for new CAD models that continue to change as artificial intelligence algorithms evolve.

  8. Stable isotope geochemistry of deep sea cherts

    Energy Technology Data Exchange (ETDEWEB)

    Kolodny, Y; Epstein, S [California Inst. of Tech., Pasadena (USA). Div. of Geological Sciences

    1976-10-01

    Seventy-four samples of DSDP (Deep Sea Drilling Project) recovered cherts of Jurassic to Miocene age from varying locations, and 27 samples of on-land exposed cherts, were analyzed for the isotopic composition of their oxygen and hydrogen. These studies were accompanied by mineralogical analyses and some isotopic analyses of the coexisting carbonates. δ18O of chert ranges between 27 and 39 parts per thousand relative to SMOW; δ18O of porcellanite, between 30 and 42 parts per thousand. The consistent enrichment in 18O of opal-CT in porcellanites with respect to coexisting microcrystalline quartz in chert is probably a reflection of a different temperature (depth) of diagenesis of the two phases. δ18O of deep sea cherts generally decreases with increasing age, indicating an overall cooling of the ocean bottom during the last 150 m.y. A comparison of this trend with that recorded by benthonic foraminifera (Douglas et al., Initial Reports of the Deep Sea Drilling Project, 32:509 (1975)) indicates that δ18O in deep sea cherts may not be frozen in until several tens of millions of years after deposition. Cherts of any age show a spread of δ18O values, with increasing diagenesis reflected in a lowering of δ18O; drusy quartz has the lowest δ18O values. On-land exposed cherts are consistently depleted in 18O in comparison to their deep sea time-equivalent cherts. Water extracted from deep sea cherts ranges between 0.5 and 1.4 wt%; its δD ranges between -78 and -95 parts per thousand and is not a function of the δ18O of the cherts (or the temperature of their formation).

  9. Distributed deep learning networks among institutions for medical imaging.

    Science.gov (United States)

    Chang, Ken; Balachandar, Niranjan; Lam, Carson; Yi, Darvin; Brown, James; Beers, Andrew; Rosen, Bruce; Rubin, Daniel L; Kalpathy-Cramer, Jayashree

    2018-03-29

    Deep learning has become a promising approach for automated support for clinical diagnosis. When medical data samples are limited, collaboration among multiple institutions is necessary to achieve high algorithm performance. However, sharing patient data often has limitations due to technical, legal, or ethical concerns. In this study, we propose methods of distributing deep learning models as an attractive alternative to sharing patient data. We simulate the distribution of deep learning models across 4 institutions using various training heuristics and compare the results with a deep learning model trained on centrally hosted patient data. The training heuristics investigated include ensembling single institution models, single weight transfer, and cyclical weight transfer. We evaluated these approaches for image classification in 3 independent image collections (retinal fundus photos, mammography, and ImageNet). We find that cyclical weight transfer resulted in a performance that was comparable to that of centrally hosted patient data. We also found that there is an improvement in the performance of cyclical weight transfer heuristic with a high frequency of weight transfer. We show that distributing deep learning models is an effective alternative to sharing patient data. This finding has implications for any collaborative deep learning study.
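The cyclical weight transfer heuristic, which this record reports as matching centrally hosted training, can be sketched as a loop in which one set of weights visits each simulated institution in turn. The toy logistic-regression "model", synthetic per-site data, and all hyperparameters below are stand-ins, not the paper's architecture or datasets.

```python
# Sketch of cyclical weight transfer: weights travel from site to site,
# each site trains on its own local data, and only the weights move.
import numpy as np

rng = np.random.default_rng(0)
true_w = np.array([1.5, -2.0, 0.5])  # hidden ground-truth separator

def make_institution(n=200):
    # each simulated institution holds its own private data
    X = rng.normal(size=(n, 3))
    y = (X @ true_w + rng.normal(scale=0.1, size=n) > 0).astype(float)
    return X, y

institutions = [make_institution() for _ in range(4)]

def local_update(w, X, y, lr=0.1, epochs=20):
    # plain logistic-regression gradient descent on one site's data
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))
        w = w - lr * X.T @ (p - y) / len(y)
    return w

w = np.zeros(3)
for cycle in range(5):            # several transfer cycles (short local stints)
    for X, y in institutions:     # the weight vector is handed to each site
        w = local_update(w, X, y)

# pooled accuracy (evaluation only; raw data never left any site)
acc = np.mean([((1.0 / (1.0 + np.exp(-(X @ w))) > 0.5) == y).mean()
               for X, y in institutions])
print(round(acc, 2))
```

The design point the paper makes is visible in the loop: only `w` crosses institutional boundaries, while each `(X, y)` stays local.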

  10. Deep learning for studies of galaxy morphology

    Science.gov (United States)

    Tuccillo, D.; Huertas-Company, M.; Decencière, E.; Velasco-Forero, S.

    2017-06-01

    Establishing accurate morphological measurements of galaxies in a reasonable amount of time for future big-data surveys such as EUCLID, the Large Synoptic Survey Telescope or the Wide Field Infrared Survey Telescope is a challenge. Because of its high level of abstraction with little human intervention, deep learning appears to be a promising approach. Deep learning is a rapidly growing discipline that models high-level patterns in data as complex multilayered networks. In this work we test the ability of deep convolutional networks to provide parametric properties of Hubble Space Telescope-like galaxies (half-light radii, Sérsic indices, total flux, etc.). We simulate a set of galaxies, including the point spread function and realistic noise from the CANDELS survey, and try to recover the main galaxy parameters using deep learning. We compare the results with those obtained with the commonly used profile-fitting software GALFIT, showing that our method obtains results at least as good as GALFIT's but, once trained, a factor of five hundred faster.

  11. Deep-Learning-Based Drug-Target Interaction Prediction.

    Science.gov (United States)

    Wen, Ming; Zhang, Zhimin; Niu, Shaoyu; Sha, Haozhi; Yang, Ruihan; Yun, Yonghuan; Lu, Hongmei

    2017-04-07

    Identifying interactions between known drugs and targets is a major challenge in drug repositioning. In silico prediction of drug-target interactions (DTIs) can speed up the expensive and time-consuming experimental work by providing the most potent DTIs. It can also provide insights into potential drug-drug interactions and promote the exploration of drug side effects. Traditionally, the performance of DTI prediction depends heavily on the descriptors used to represent the drugs and the target proteins. In this paper, to accurately predict new DTIs between approved drugs and targets without separating the targets into different classes, we developed a deep-learning-based algorithmic framework named DeepDTIs. It first abstracts representations from raw input descriptors using unsupervised pretraining and then applies known label pairs of interaction to build a classification model. Compared with other methods, DeepDTIs is found to match or outperform other state-of-the-art methods. DeepDTIs can further be used to predict whether a new drug targets some existing targets or whether a new target interacts with some existing drugs.

  12. Deep inelastic scattering near the Coulomb barrier

    International Nuclear Information System (INIS)

    Gehring, J.; Back, B.; Chan, K.

    1995-01-01

    Deep inelastic scattering was recently observed in heavy-ion reactions at incident energies near and below the Coulomb barrier. Traditional models of this process are based on frictional forces and are designed to predict the features of deep inelastic processes at energies above the barrier. They cannot be applied at energies below the barrier, where the nuclear overlap is small and friction is negligible. The presence of deep inelastic scattering at these energies requires a different explanation. The first observation of deep inelastic scattering near the barrier was in the systems 124,112Sn + 58,64Ni by Wolfs et al. We previously extended these measurements to the system 136Xe + 64Ni and have now measured the system 124Xe + 58Ni. We obtained better statistics, better mass and energy resolution, and more complete angular coverage in the Xe + Ni measurements. The cross sections and angular distributions are similar in all of the Sn + Ni and Xe + Ni systems. The data are currently being analyzed and compared with new theoretical calculations. They will form part of the thesis of J. Gehring.

  13. Deep Phenotyping: Deep Learning For Temporal Phenotype/Genotype Classification

    OpenAIRE

    Najafi, Mohammad; Namin, Sarah; Esmaeilzadeh, Mohammad; Brown, Tim; Borevitz, Justin

    2017-01-01

    High-resolution, high-throughput genotype-to-phenotype studies in plants are underway to accelerate the breeding of climate-ready crops. Complex developmental phenotypes are observed by imaging a variety of accessions in different environmental conditions; however, extracting the genetically heritable traits is challenging. In recent years, deep learning techniques and in particular Convolutional Neural Networks (CNNs), Recurrent Neural Networks (RNNs) and Long-Short Term Memories (LSTMs), h...

  14. Deep imitation learning for 3D navigation tasks.

    Science.gov (United States)

    Hussein, Ahmed; Elyan, Eyad; Gaber, Mohamed Medhat; Jayne, Chrisina

    2018-01-01

    Deep learning techniques have shown success in learning from raw high-dimensional data in various applications. While deep reinforcement learning is recently gaining popularity as a method to train intelligent agents, utilizing deep learning in imitation learning has been scarcely explored. Imitation learning can be an efficient method to teach intelligent agents by providing a set of demonstrations to learn from. However, generalizing to situations that are not represented in the demonstrations can be challenging, especially in 3D environments. In this paper, we propose a deep imitation learning method to learn navigation tasks from demonstrations in a 3D environment. The supervised policy is refined using active learning in order to generalize to unseen situations. This approach is compared to two popular deep reinforcement learning techniques: deep Q-networks and asynchronous advantage actor-critic (A3C). The proposed method, as well as the reinforcement learning methods, employs deep convolutional neural networks and learns directly from raw visual input. Methods for combining learning from demonstrations and experience are also investigated. This combination aims to join the generalization ability of learning by experience with the efficiency of learning by imitation. The proposed methods are evaluated on 4 navigation tasks in a 3D simulated environment. Navigation tasks are a typical problem that is relevant to many real applications. They pose the challenge of requiring demonstrations of long trajectories to reach the target and only providing delayed rewards (usually terminal) to the agent. The experiments show that the proposed method can successfully learn navigation tasks from raw visual input, while learning-from-experience methods fail to learn an effective policy. Moreover, it is shown that active learning can significantly improve the performance of the initially learned policy using a small number of active samples.

  15. Childminders, Parents and Policy: Testing the Triangle of Care

    Science.gov (United States)

    Brooker, Liz

    2016-01-01

    Childminders in England have historically been seen as marginal providers of childcare, fulfilling Bruner's description of the service as an "accordion pleat" in provision. This article outlines the history and current position of childminders in English early childhood policy, and then reports on the views on this role of childminders…

  16. Detecting atrial fibrillation by deep convolutional neural networks.

    Science.gov (United States)

    Xia, Yong; Wulan, Naren; Wang, Kuanquan; Zhang, Henggui

    2018-02-01

    Atrial fibrillation (AF) is the most common cardiac arrhythmia. The incidence of AF increases with age, causing high risks of stroke and increased morbidity and mortality. Efficient and accurate diagnosis of AF based on the ECG is valuable in clinical settings and remains challenging. In this paper, we proposed a novel method with high reliability and accuracy for AF detection via deep learning. The short-term Fourier transform (STFT) and stationary wavelet transform (SWT) were used to analyze ECG segments to obtain two-dimensional (2-D) matrix input suitable for deep convolutional neural networks. Then, two different deep convolutional neural network models corresponding to STFT output and SWT output were developed. Our new method did not require detection of P or R peaks, nor feature designs for classification, in contrast to existing algorithms. Finally, the performances of the two models were evaluated and compared with those of existing algorithms. Our proposed method demonstrated favorable performances on ECG segments as short as 5 s. The deep convolutional neural network using input generated by STFT, presented a sensitivity of 98.34%, specificity of 98.24% and accuracy of 98.29%. For the deep convolutional neural network using input generated by SWT, a sensitivity of 98.79%, specificity of 97.87% and accuracy of 98.63% was achieved. The proposed method using deep convolutional neural networks shows high sensitivity, specificity and accuracy, and, therefore, is a valuable tool for AF detection. Copyright © 2017 Elsevier Ltd. All rights reserved.
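The preprocessing step described above, turning a 1-D ECG segment into a 2-D time-frequency matrix suitable for a convolutional network, can be sketched with SciPy's STFT. The synthetic signal, sampling rate, and window length below are illustrative assumptions, not the paper's parameters.

```python
# Sketch: short-time Fourier transform of a 1-D segment into a 2-D
# magnitude spectrogram, the kind of matrix input a 2-D CNN consumes.
import numpy as np
from scipy.signal import stft

fs = 300                       # assumed sampling rate (Hz)
t = np.arange(0, 5, 1 / fs)    # a 5 s segment, the shortest the paper uses
# stand-in "ECG": a slow rhythm plus a faster component
x = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.sin(2 * np.pi * 25 * t)

f, frames, Zxx = stft(x, fs=fs, nperseg=128)
image = np.abs(Zxx)            # 2-D matrix: frequency bins x time frames
print(image.shape)
```

The analogous SWT input could be built with a wavelet library such as PyWavelets (`pywt.swt`), stacking the coefficient arrays into the second 2-D representation.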

  17. Deep Space Habitat Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The Deep Space Habitat was closed out at the end of Fiscal Year 2013 (September 30, 2013). Results and select content have been incorporated into the new Exploration...

  18. Deep Water Survey Data

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The deep water biodiversity surveys explore and describe the biodiversity of the bathy- and bentho-pelagic nekton using Midwater and bottom trawls centered in the...

  19. Benchmarking Deep Learning Models on Large Healthcare Datasets.

    Science.gov (United States)

    Purushotham, Sanjay; Meng, Chuizheng; Che, Zhengping; Liu, Yan

    2018-06-04

    Deep learning models (aka Deep Neural Networks) have revolutionized many fields including computer vision, natural language processing, and speech recognition, and are increasingly being used in clinical healthcare applications. However, few works exist which have benchmarked the performance of deep learning models against state-of-the-art machine learning models and prognostic scoring systems on publicly available healthcare datasets. In this paper, we present benchmarking results for several clinical prediction tasks, such as mortality prediction, length-of-stay prediction, and ICD-9 code group prediction, using deep learning models, an ensemble of machine learning models (the Super Learner algorithm), and the SAPS II and SOFA scores. We used the Medical Information Mart for Intensive Care III (MIMIC-III) (v1.4) publicly available dataset, which includes all patients admitted to an ICU at the Beth Israel Deaconess Medical Center from 2001 to 2012, for the benchmarking tasks. Our results show that deep learning models consistently outperform all the other approaches, especially when the 'raw' clinical time series data is used as input features to the models. Copyright © 2018 Elsevier Inc. All rights reserved.

  20. Investigation of deep levels in GaInNAs

    International Nuclear Information System (INIS)

    Abulfotuh, F.; Balcioglu, A.; Friedman, D.; Geisz, J.; Kurtz, S.

    1999-01-01

    This paper presents and discusses the first deep-level transient spectroscopy (DLTS) data obtained from measurements carried out on both Schottky barriers and homojunction devices of GaInNAs. The effect of N and In doping on the electrical properties of the GaInNAs devices, which results in structural defects and interface states, has been investigated. Moreover, the locations and densities of deep levels related to the presence of N, In, and N+In are identified and correlated with device performance. The data confirmed that the presence of N alone creates a high density of shallow hole traps related to the N atom and structural defects in the device. Doping by In, if present alone, also creates low-density deep traps (related to the In atom and structural defects) and extremely deep interface states. On the other hand, the co-presence of In and N eliminates both the interface states and the levels related to structural defects. However, the device still has a high density of the shallow and deep traps that are responsible for the photocurrent loss in the GaInNAs device, together with a possibly short diffusion length. copyright 1999 American Institute of Physics