WorldWideScience

Sample records for source processes derived

  1. Xiphoid Process-Derived Chondrocytes: A Novel Cell Source for Elastic Cartilage Regeneration

    Science.gov (United States)

    Nam, Seungwoo; Cho, Wheemoon; Cho, Hyunji; Lee, Jungsun

    2014-01-01

    Reconstruction of elastic cartilage requires a source of chondrocytes that display a reliable differentiation tendency. Predetermined tissue progenitor cells are ideal candidates for meeting this need; however, it is difficult to obtain donor elastic cartilage tissue because most elastic cartilage serves important functions or forms external structures, making these tissues indispensable. We found vestigial cartilage tissue in xiphoid processes and characterized it as hyaline cartilage in the proximal region and elastic cartilage in the distal region. Xiphoid process-derived chondrocytes (XCs) showed superb in vitro expansion ability based on colony-forming unit fibroblast assays, cell yield, and cumulative cell growth. On induction of differentiation into mesenchymal lineages, XCs showed a strong tendency toward chondrogenic differentiation. An examination of the tissue-specific regeneration capacity of XCs in a subcutaneous-transplantation model and autologous chondrocyte implantation model confirmed reliable regeneration of elastic cartilage regardless of the implantation environment. On the basis of these observations, we conclude that xiphoid process cartilage, the only elastic cartilage tissue source that can be obtained without destroying external shape or function, is a source of elastic chondrocytes that show superb in vitro expansion and reliable differentiation capacity. These findings indicate that XCs could be a valuable cell source for reconstruction of elastic cartilage. PMID:25205841

  2. Heat source reconstruction from noisy temperature fields using an optimised derivative Gaussian filter

    Science.gov (United States)

    Delpueyo, D.; Balandraud, X.; Grédiac, M.

    2013-09-01

    The aim of this paper is to present a post-processing technique based on a derivative Gaussian filter to reconstruct heat source fields from temperature fields measured by infrared thermography. Heat sources can be deduced from temperature variations through the heat diffusion equation. Filtering and differentiating are closely related key issues here because the processed temperature fields are unavoidably noisy. We focus only on the diffusion term because it is the most difficult term to estimate in the procedure: it involves spatial second derivatives (a Laplacian for isotropic materials). This quantity can be reasonably estimated by convolving the temperature variation fields with second derivatives of a Gaussian function. The study is first based on synthetic temperature variation fields corrupted by added noise. The filter is optimised to reconstruct the heat source fields as accurately as possible. The influence of both the size and the level of a localised heat source is discussed. The results are also compared with another type of processing based on an averaging filter. The second part of the study presents an application to experimental temperature fields measured with an infrared camera on a thin aluminium-alloy plate. Heat sources are generated with an electric heating patch glued to the specimen surface. Heat source fields reconstructed from the measured temperature fields are compared with the imposed heat sources. The results illustrate the relevance of the derivative Gaussian filter for reliably extracting heat sources from noisy temperature fields in the experimental thermomechanics of materials.
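
    The central numerical step described above, estimating the Laplacian of a noisy temperature field by convolution with second derivatives of a Gaussian, can be illustrated with a minimal Python sketch. This is not the authors' optimised filter; the array size, filter width and pixel size are illustrative assumptions.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def gaussian_laplacian(theta, sigma, pixel_size):
            """Estimate the Laplacian of a noisy 2-D temperature field by
            convolving it with second derivatives of a Gaussian kernel."""
            # order=(0, 2) / (2, 0): second derivative along one axis, smoothing along the other
            d2x = gaussian_filter(theta, sigma=sigma, order=(0, 2)) / pixel_size**2
            d2y = gaussian_filter(theta, sigma=sigma, order=(2, 0)) / pixel_size**2
            return d2x + d2y

        # Synthetic demo: a smooth hot spot plus added noise (illustrative values)
        rng = np.random.default_rng(0)
        x, y = np.meshgrid(np.linspace(-1, 1, 128), np.linspace(-1, 1, 128))
        theta = np.exp(-(x**2 + y**2) / 0.05) + 0.01 * rng.standard_normal(x.shape)
        laplacian = gaussian_laplacian(theta, sigma=3.0, pixel_size=2 / 127)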

  3. Repairing business process models as retrieved from source code

    NARCIS (Netherlands)

    Fernández-Ropero, M.; Reijers, H.A.; Pérez-Castillo, R.; Piattini, M.; Nurcan, S.; Proper, H.A.; Soffer, P.; Krogstie, J.; Schmidt, R.; Halpin, T.; Bider, I.

    2013-01-01

    The static analysis of source code has become a feasible way to obtain the underlying business process models from existing information systems. Because not all information can be automatically derived from source code (e.g., consider manual activities), such business process models

  4. Source Rupture Process of the 2016 Kumamoto Prefecture, Japan, Earthquake Derived from Near-Source Strong-Motion Records

    Science.gov (United States)

    Zheng, A.; Zhang, W.

    2016-12-01

    On 15 April 2016, a large earthquake of magnitude Mw 7.1 occurred in Kumamoto prefecture, Japan. The focal mechanism solution released by F-net located the hypocenter at 130.7630°E, 32.7545°N, at a depth of 12.45 km, with strike, dip, and rake angles of the fault of N226°E, 84°, and -142°, respectively. The epicenter distribution and focal mechanisms of aftershocks implied that the mechanism of the mainshock might have changed during the source rupture process, so a single focal mechanism was not enough to explain the observed data adequately. In this study, based on the inversion results of GNSS and InSAR surface deformation and with active structures as reference, we construct a finite fault model with focal mechanism changes and derive the source rupture process by a multi-time-window linear waveform inversion method using the strong-motion data (0.05-1.0 Hz) obtained by K-NET and KiK-net of Japan. Our result shows that the Kumamoto earthquake was a right-lateral strike-slip rupture event along the Futagawa-Hinagu fault zone, with the seismogenic fault divided into a northern segment and a southern one. The strike and dip of the northern segment are N235°E and 60°, respectively; for the southern segment they are N205°E and 72°. The depth range of the fault model is consistent with the depth distribution of aftershocks, and the slip on the fault plane is concentrated mainly on the northern segment, where the maximum slip is about 7.9 m. The rupture of the whole fault lasts approximately 18 s, and the total seismic moment released is 5.47×10^19 N·m (Mw 7.1). In addition, the essential features of the PGV and PGA distributions synthesized from the inversion result are similar to those of the observed PGA and seismic intensity.
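
    As a quick consistency check (not stated in the abstract), the quoted seismic moment and moment magnitude agree with the standard moment-magnitude relation, with M0 expressed in N·m:

        M_w = \tfrac{2}{3}\left(\log_{10} M_0 - 9.1\right)
            = \tfrac{2}{3}\left(\log_{10}(5.47\times 10^{19}) - 9.1\right)
            \approx \tfrac{2}{3}(19.74 - 9.1) \approx 7.1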

  5. How organic carbon derived from multiple sources contributes to carbon sequestration processes in a shallow coastal system?

    Science.gov (United States)

    Watanabe, Kenta; Kuwae, Tomohiro

    2015-04-16

    Carbon captured by marine organisms helps sequester atmospheric CO₂, especially in shallow coastal ecosystems, where rates of primary production and burial of organic carbon (OC) from multiple sources are high. However, linkages between the dynamics of OC derived from multiple sources and carbon sequestration are poorly understood. We investigated the origin (terrestrial, phytobenthos derived, and phytoplankton derived) of particulate OC (POC) and dissolved OC (DOC) in the water column and sedimentary OC using elemental, isotopic, and optical signatures in Furen Lagoon, Japan. Based on these analyses, we explored how OC from multiple sources contributes to sequestration via storage in sediments, water column sequestration, and air-sea CO₂ exchanges, and how the contributions vary with salinity in a shallow seagrass meadow. The relative contribution of terrestrial POC in the water column decreased with increasing salinity, whereas autochthonous POC increased in the salinity range 10-30. Phytoplankton-derived POC dominated the water column POC (65-95%) within this salinity range; however, it was minor in the sediments (3-29%). In contrast, terrestrial and phytobenthos-derived POC were relatively minor contributors in the water column but were major contributors in the sediments (49-78% and 19-36%, respectively), indicating that terrestrial and phytobenthos-derived POC were selectively stored in the sediments. Autochthonous DOC, part of which can contribute to long-term carbon sequestration in the water column, accounted for >25% of the total water column DOC pool in the salinity range 15-30. Autochthonous OC production decreased the concentration of dissolved inorganic carbon in the water column and thereby contributed to atmospheric CO₂ uptake, except in the low-salinity zone. Our results indicate that shallow coastal ecosystems function not only as transition zones between land and ocean but also as carbon sequestration filters. They

  6. Source-based neurofeedback methods using EEG recordings: training altered brain activity in a functional brain source derived from blind source separation

    Science.gov (United States)

    White, David J.; Congedo, Marco; Ciorciari, Joseph

    2014-01-01

    A developing literature explores the use of neurofeedback in the treatment of a range of clinical conditions, particularly ADHD and epilepsy, whilst neurofeedback also provides an experimental tool for studying the functional significance of endogenous brain activity. A critical component of any neurofeedback method is the underlying physiological signal which forms the basis for the feedback. While the past decade has seen the emergence of fMRI-based protocols training spatially confined BOLD activity, traditional neurofeedback has utilized a small number of electrode sites on the scalp. As scalp EEG at a given electrode site reflects a linear mixture of activity from multiple brain sources and artifacts, efforts to successfully acquire some level of control over the signal may be confounded by these extraneous sources. Further, in the event of successful training, these traditional neurofeedback methods are likely influencing multiple brain regions and processes. The present work describes the use of source-based signal processing methods in EEG neurofeedback. The feasibility and potential utility of such methods were explored in an experiment training increased theta oscillatory activity in a source derived from Blind Source Separation (BSS) of EEG data obtained during completion of a complex cognitive task (spatial navigation). Learned increases in theta activity were observed in two of the four participants to complete 20 sessions of neurofeedback targeting this individually defined functional brain source. Source-based EEG neurofeedback methods using BSS may offer important advantages over traditional neurofeedback, by targeting the desired physiological signal in a more functionally and spatially specific manner. Having provided preliminary evidence of the feasibility of these methods, future work may study a range of clinically and experimentally relevant brain processes where individual brain sources may be targeted by source-based EEG neurofeedback. PMID

  7. REE enrichment in granite-derived regolith deposits of the southeast United States: Prospective source rocks and accumulation processes

    Science.gov (United States)

    Foley, Nora K.; Ayuso, Robert A.; Simandl, G.J.; Neetz, M.

    2015-01-01

    The Southeastern United States contains numerous anorogenic, or A-type, granites, which constitute promising source rocks for REE-enriched ion adsorption clay deposits due to their inherently high concentrations of REE. These granites have undergone a long history of chemical weathering, resulting in thick granite-derived regoliths akin to those of South China, which supply virtually all heavy REE and Y, and a significant portion of light REE, to global markets. Detailed comparisons of granite regolith profiles formed on the Stewartsville and Striped Rock plutons and the Robertson River batholith (Virginia) indicate that REE are mobile and can attain grades comparable to those of deposits currently mined in China. A REE-enriched parent, either A-type or I-type (highly fractionated igneous type) granite, is thought to be critical for generating the high concentrations of REE in regolith profiles. One prominent feature we recognize in many granites and mineralized regoliths is the tetrad behaviour displayed in REE chondrite-normalized patterns. Tetrad patterns in granite and regolith result from processes that promote the redistribution, enrichment, and fractionation of REE, such as late- to post-magmatic alteration of granite and silicate hydrolysis in the regolith. Thus, REE patterns showing tetrad effects may be a key to discriminating highly prospective source rocks and regoliths with potential for REE ion adsorption clay deposits.

  8. Deriving profiles of incident and scattered neutrons for TOF experiments with the spallation sources

    International Nuclear Information System (INIS)

    Watanabe, Hidehiro

    1993-01-01

    A formula that closely matches the incident profile of epi-thermal and thermal neutrons for time of flight experiments carried out with a spallation neutron source and moderator scheme is derived based on the slowing-down and diffusing-out processes in a moderator. This analytical description also enables us to predict burst-function profiles; these profiles are verified by a comparison with a diffraction pattern. The limits of the analytical model are discussed through the predictable peak position shift brought about by the slowing-down process. (orig.)

  9. Controlled Carbon Source Addition to an Alternating Nitrification-Denitrification Wastewater Treatment Process Including Biological P Removal

    DEFF Research Database (Denmark)

    Isaacs, Steven Howard; Henze, Mogens

    1995-01-01

    The paper investigates the effect of adding an external carbon source on the rate of denitrification in an alternating activated sludge process including biological P removal. Two carbon sources were examined, acetate and hydrolysate derived from biologically hydrolyzed sludge. Preliminary batch ...

  10. Heuristic derivation of the Rossi-alpha formula for a pulsed neutron source

    International Nuclear Information System (INIS)

    Baeten, P.

    2004-01-01

    Expressions for the Rossi-alpha distribution for a pulsed neutron source were derived using a heuristic derivation based on the method of joint detection probability. This heuristic technique was chosen over the more rigorous master-equation method because of its simplicity and the complementarity of the two techniques. The derived equations also take into account the presence of delayed neutrons and intrinsic neutron sources, which often cannot be neglected in source-driven subcritical cores. The obtained expressions showed that the ratio of the correlated to the uncorrelated signal in the Rossi-Alpha distribution for a Pulsed Source (RAPS) is strongly increased compared with the standard Rossi-alpha distribution for a continuous source. It was also demonstrated that the RAPS technique allows four independent measurement quantities to be determined, instead of three with the standard Rossi-alpha technique. Hence, it is no longer necessary to combine the Rossi-alpha technique with another method to measure the reactivity expressed in dollars. Both properties, the increased signal-to-noise ratio of the correlated signal and the measurement of a fourth quantity, make the RAPS technique an excellent candidate for the measurement of kinetic parameters in source-driven subcritical assemblies.
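
    For context (this is standard reactor kinetics background, not reproduced from the abstract), the conventional Rossi-alpha distribution for a continuous source, against which the RAPS expressions are compared, is usually written as an uncorrelated background plus a correlated exponential term; in one common notation:

        p(\tau) = C_{\mathrm{uncorr}} + A\, e^{-\alpha \tau},
        \qquad \alpha = \frac{\beta_{\mathrm{eff}} - \rho}{\Lambda}

    where τ is the time interval between detection pairs, α is the prompt-neutron decay constant, ρ the reactivity, and Λ the neutron generation time.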

  11. The Chandra Source Catalog: Processing and Infrastructure

    Science.gov (United States)

    Evans, Janet; Evans, Ian N.; Glotfelty, Kenny J.; Hain, Roger; Hall, Diane M.; Miller, Joseph B.; Plummer, David A.; Zografou, Panagoula; Primini, Francis A.; Anderson, Craig S.; Bonaventura, Nina R.; Chen, Judy C.; Davis, John E.; Doe, Stephen M.; Fabbiano, Giuseppina; Galle, Elizabeth C.; Gibbs, Danny G., II; Grier, John D.; Harbo, Peter N.; He, Xiang Qun (Helen); Houck, John C.; Karovska, Margarita; Kashyap, Vinay L.; Lauer, Jennifer; McCollough, Michael L.; McDowell, Jonathan C.; Mitschang, Arik W.; Morgan, Douglas L.; Mossman, Amy E.; Nichols, Joy S.; Nowak, Michael A.; Refsdal, Brian L.; Rots, Arnold H.; Siemiginowska, Aneta L.; Sundheim, Beth A.; Tibbetts, Michael S.; van Stone, David W.; Winkelman, Sherry L.

    2009-09-01

    Chandra Source Catalog processing recalibrates each observation using the latest available calibration data, and employs a wavelet-based source detection algorithm to identify all the X-ray sources in the field of view. Source properties are then extracted from each detected source that is a candidate for inclusion in the catalog. Catalog processing is completed by matching sources across multiple observations, merging common detections, and applying quality assurance checks. The Chandra Source Catalog processing system shares a common processing infrastructure and utilizes much of the functionality that is built into the Standard Data Processing (SDP) pipeline system that provides calibrated Chandra data to end-users. Other key components of the catalog processing system have been assembled from the portable CIAO data analysis package. Minimal new software tool development has been required to support the science algorithms needed for catalog production. Since processing pipelines must be instantiated for each detected source, the number of pipelines that are run during catalog construction is roughly a factor of 100 larger than for SDP. The increased computational load, and inherent parallel nature of the processing, is handled by distributing the workload across a multi-node Beowulf cluster. Modifications to the SDP automated processing application to support catalog processing, and extensions to Chandra Data Archive software to ingest and retrieve catalog products, complete the upgrades to the infrastructure to support catalog processing.

  12. Algorithms for the process management of sealed source brachytherapy

    International Nuclear Information System (INIS)

    Engler, M.J.; Ulin, K.; Sternick, E.S.

    1996-01-01

    Incidents and misadministrations suggest that brachytherapy may benefit from clarification of the quality management program and other mandates of the US Nuclear Regulatory Commission. To that end, flowcharts of step-by-step subprocesses were developed and formatted with dedicated software. The overall process was similarly organized in a complex flowchart termed a general process map. Procedural and structural indicators associated with each flowchart and map were critiqued and pre-existing documentation was revised. "Step-regulation tables" were created to refer steps and subprocesses to Nuclear Regulatory Commission rules and recommendations in their sequences of applicability. Brachytherapy algorithms were specified as programmable, recursive processes, including therapeutic dose determination and monitoring doses to the public. These algorithms are embodied in flowcharts and step-regulation tables. A general algorithm is suggested as a template from which other facilities may derive tools to facilitate process management of sealed source brachytherapy. 11 refs., 9 figs., 2 tabs

  13. Hydrodeoxygenation processes: advances on catalytic transformations of biomass-derived platform chemicals into hydrocarbon fuels.

    Science.gov (United States)

    De, Sudipta; Saha, Basudeb; Luque, Rafael

    2015-02-01

    Lignocellulosic biomass provides an attractive source of renewable carbon that can be sustainably converted into chemicals and fuels. Hydrodeoxygenation (HDO) processes have recently received considerable attention for upgrading biomass-derived feedstocks into liquid transportation fuels. The selection and design of HDO catalysts play an important role in determining the success of the process. This review aims to emphasize recent developments in HDO catalysts for the effective transformation of biomass-derived platform molecules into hydrocarbon fuels with reduced oxygen content and improved H/C ratios. Liquid hydrocarbon fuels can be obtained by combining oxygen removal processes (e.g. dehydration, hydrogenation, hydrogenolysis, decarbonylation, etc.) with molecular-weight increase via C-C coupling reactions (e.g. aldol condensation, ketonization, oligomerization, hydroxyalkylation, etc.). Fundamentals and mechanistic aspects of the use of HDO catalysts in deoxygenation reactions are also discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.

  14. Physical processes in EUV sources for microlithography

    International Nuclear Information System (INIS)

    Banine, V Y; Swinkels, G H P M; Koshelev, K N

    2011-01-01

    The source is an integral part of an extreme ultraviolet lithography (EUVL) tool. Such a source, like the EUVL tool itself, has to satisfy very demanding technical and cost requirements. The EUVL tool operates at a wavelength of 13.5 nm, which requires the following new developments. - The light production mechanism changes from conventional lamps and lasers to relatively high-temperature emitting plasmas. - The light transport, mainly refractive for deep ultraviolet (DUV), should be reflective for EUV. - The source specifications derived from the customer requirements on wafer throughput mean that the output EUV source power has to be hundreds of watts; this in turn means that tens to hundreds of kilowatts of dissipated power have to be managed in a relatively small volume. - In order to keep lithography costs as low as possible, the lifetime of the components should be as long as possible and at least of the order of thousands of hours. This poses a challenge for the sources: how to design and manufacture components robust enough to withstand the intense environment of high heat dissipation, flows of several-keV ions, and the atomic and particulate debris within the source vessel. - As with all lithography tools, the imaging requirements demand a narrow illumination bandwidth. Absorption of materials at EUV wavelengths is extreme, with extinction lengths of the order of tens of nanometres, so the balance between high transmission and spectral purity requires careful engineering. Altogether, EUV lithography sources present technological challenges in various fields of physics such as plasma, optics and material science. These challenges are being tackled by the source manufacturers and investigated extensively in research facilities around the world. An overview of the published results on the topic, as well as analyses of the physical processes behind the proposed solutions, is presented in this paper. (topical review)

  15. Automatic landslide detection from LiDAR DTM derivatives by geographic-object-based image analysis based on open-source software

    Science.gov (United States)

    Knevels, Raphael; Leopold, Philip; Petschko, Helene

    2017-04-01

    With high-resolution airborne Light Detection and Ranging (LiDAR) data becoming more commonly available, many studies have been performed to exploit the detailed information on the earth's surface that such data provide and to analyse their limitations. Specifically in the field of natural hazards, digital terrain models (DTM) have been used to map hazardous processes such as landslides, mainly by visual interpretation of LiDAR DTM derivatives. However, new approaches are striving towards automatic detection of landslides to speed up the generation of landslide inventories. These studies usually use a combination of optical imagery and terrain data, and are implemented in commercial software packages such as ESRI ArcGIS, Definiens eCognition, or MathWorks MATLAB. The objective of this study was to investigate the potential of open-source software for automatic landslide detection based only on high-resolution LiDAR DTM derivatives in a study area within the federal state of Burgenland, Austria. The study area is very prone to landslides, which have been mapped with different methodologies in recent years. The free development environment R was used to integrate open-source geographic information system (GIS) software such as SAGA (System for Automated Geoscientific Analyses), GRASS (Geographic Resources Analysis Support System), and TauDEM (Terrain Analysis Using Digital Elevation Models). The implemented geographic-object-based image analysis (GEOBIA) consisted of (1) derivation of land surface parameters, such as slope, surface roughness, curvature, or flow direction, (2) finding the optimal scale parameter by the use of an objective function, (3) multi-scale segmentation, (4) classification of landslide parts (main scarp, body, flanks) by k-means thresholding, (5) assessment of the classification performance using a pre-existing landslide inventory, and (6) post-processing for further use in landslide inventories. The results of the developed open-source approach demonstrated good
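
    The study itself uses R together with SAGA, GRASS, and TauDEM; purely as an illustration of what steps (1) and (4) of the GEOBIA workflow involve, the following Python sketch derives a slope layer from a DTM and groups its pixels by k-means. The array sizes and parameters are illustrative assumptions, and the real workflow additionally performs multi-scale segmentation and inventory-based validation.

        import numpy as np
        from sklearn.cluster import KMeans

        def slope_degrees(dtm, cell_size):
            """Slope (degrees) from a DTM array via finite-difference gradients."""
            dzdy, dzdx = np.gradient(dtm, cell_size)
            return np.degrees(np.arctan(np.hypot(dzdx, dzdy)))

        def kmeans_threshold(layer, n_classes=3, seed=0):
            """Group pixels of a derivative layer into classes by k-means,
            a simple stand-in for the thresholding step of the workflow."""
            labels = KMeans(n_clusters=n_classes, random_state=seed, n_init=10)
            labels = labels.fit_predict(layer.reshape(-1, 1))
            return labels.reshape(layer.shape)

        # Illustrative use on a synthetic 1 m resolution DTM tile
        dtm = np.cumsum(np.random.default_rng(1).normal(size=(200, 200)), axis=0)
        classes = kmeans_threshold(slope_degrees(dtm, cell_size=1.0))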

  16. Sources of present Chernobyl-derived caesium concentrations in surface air and deposition samples

    International Nuclear Information System (INIS)

    Hoetzl, H.; Rosner, G.; Winkler, R.; Gesellschaft fuer Strahlen- und Umweltforschung mbH Muenchen, Neuherberg

    1992-01-01

    The sources of Chernobyl-derived caesium concentrations in air and deposition samples collected from mid-1986 to end-1990 at Munich-Neuherberg, Germany, were investigated. Local resuspension has been found to be the main source. By comparison with deposition data from other locations it is estimated that within a range from 20 Bq m⁻² to 60 kBq m⁻² of initially deposited ¹³⁷Cs activity ∼2% is re-deposited by the process of local resuspension in Austria, Germany, Japan and United Kingdom, while significantly higher total resuspension is to be expected for Denmark and Finland. Stratospheric contribution to present concentrations is shown to be negligible. This is confirmed by cross correlation analysis between the time series of ¹³⁷Cs in air and precipitation before and after the Chernobyl accident and the respective time series of cosmogenic ⁷Be, which is an indicator of stratospheric input. Seasonal variations of caesium concentrations with maxima in winter were observed. (author). 32 refs.; 5 figs.; 1 tab

  17. BRIEF COMMENTS REGARDING THE INDIRECT (OR DERIVED) SOURCES OF LABOR LAW

    OpenAIRE

    Brîndușa Vartolomei

    2015-01-01

    In the field of the law governing legal work relations, one of the features that contributes to defining the autonomy of labor law is the existence of specific sources of law, consisting of regulations on the functioning of the employer, internal regulations, the collective labor agreement, and instructions regarding labor health and safety. In addition, in the practical field of labor relations, some indirect (or derived) sources of law have also been pointed out ...

  18. Marine-derived fungi as a source of proteases

    Digital Repository Service at National Institute of Oceanography (India)

    Kamat, T.; Rodrigues, C.; Naik, C.G.

    , of marine-derived fungi in order to identify the potential sources. Sponges and corals were collected by SCUBA diving, from a depth of 8 to 10 m, from the coastal waters of Mandapam, Tamil Nadu (9°16' N; 79°11' E). The samples comprised a soft coral Sinularia... pieces of approximately 2x2 cm were cut out aseptically. These fourteen pieces of each organism were subjected to two different treatments [23]. In the first case seven pieces were vortexed four times, for 20 seconds each, with sterile seawater while...

  19. Processing methods, characteristics and adsorption behavior of tire derived carbons: a review.

    Science.gov (United States)

    Saleh, Tawfik A; Gupta, Vinod Kumar

    2014-09-01

    The remarkable increase in the number of vehicles worldwide and the lack of both technical and economical disposal mechanisms make waste tires a serious source of pollution. One potential recycling route is pyrolysis followed by a chemical activation process to produce porous activated carbons. Many researchers have recently demonstrated the capability of such carbons as adsorbents to remove various types of pollutants, including organic and inorganic species. This review attempts to compile relevant knowledge about the production methods of carbon from waste rubber tires. The effects of various process parameters, including temperature and heating rate on the pyrolysis stage, and activation temperature and time, activation agent and activating gas, are reviewed. The review also highlights the use of waste-tire-derived carbon to remove various types of pollutants such as heavy metals, dyes, pesticides and others from aqueous media. Copyright © 2014 Elsevier B.V. All rights reserved.

  20. System and method for deriving a process-based specification

    Science.gov (United States)

    Hinchey, Michael Gerard (Inventor); Rash, James Larry (Inventor); Rouff, Christopher A. (Inventor)

    2009-01-01

    A system and method for deriving a process-based specification for a system is disclosed. The process-based specification is mathematically inferred from a trace-based specification. The trace-based specification is derived from a non-empty set of traces or natural language scenarios. The process-based specification is mathematically equivalent to the trace-based specification. Code is generated, if applicable, from the process-based specification. A process, or phases of a process, using the features disclosed can be reversed and repeated to allow for an interactive development and modification of legacy systems. The process is applicable to any class of system, including, but not limited to, biological and physical systems, electrical and electro-mechanical systems in addition to software, hardware and hybrid hardware-software systems.

  1. Sources of present Chernobyl-derived caesium concentrations in surface air and deposition samples

    Energy Technology Data Exchange (ETDEWEB)

    Hoetzl, H.; Rosner, G.; Winkler, R. (Gesellschaft fuer Strahlen-und Umweltforschung Munich, Neuherberg (Germany). Forschungszentrum fuer Umwelt und Gesundheit Gesellschaft fuer Strahlen- und Umweltforschung mbH Muenchen, Neuherberg (Germany). Inst. fuer Strahlenschutz)

    1992-06-01

    The sources of Chernobyl-derived caesium concentrations in air and deposition samples collected from mid-1986 to end-1990 at Munich-Neuherberg, Germany, were investigated. Local resuspension has been found to be the main source. By comparison with deposition data from other locations it is estimated that within a range from 20 Bq m⁻² to 60 kBq m⁻² of initially deposited ¹³⁷Cs activity ∼2% is re-deposited by the process of local resuspension in Austria, Germany, Japan and United Kingdom, while significantly higher total resuspension is to be expected for Denmark and Finland. Stratospheric contribution to present concentrations is shown to be negligible. This is confirmed by cross correlation analysis between the time series of ¹³⁷Cs in air and precipitation before and after the Chernobyl accident and the respective time series of cosmogenic ⁷Be, which is an indicator of stratospheric input. Seasonal variations of caesium concentrations with maxima in winter were observed. (author). 32 refs.; 5 figs.; 1 tab.

  2. Bone marrow-derived stromal cells are more beneficial cell sources for tooth regeneration compared with adipose-derived stromal cells.

    Science.gov (United States)

    Ye, Lanfeng; Chen, Lin; Feng, Fan; Cui, Junhui; Li, Kaide; Li, Zhiyong; Liu, Lei

    2015-10-01

    Tooth loss is presently a global epidemic and tooth regeneration is thought to be a feasible and ideal treatment approach. Choice of cell source is a primary concern in tooth regeneration. In this study, the odontogenic differentiation potential of two non-dental-derived stem cells, adipose-derived stromal cells (ADSCs) and bone marrow-derived stromal cells (BMSCs), were evaluated both in vitro and in vivo. ADSCs and BMSCs were induced in vitro in the presence of tooth germ cell-conditioned medium (TGC-CM) prior to implantation into the omentum majus of rats, in combination with inactivated dentin matrix (IDM). Real-time quantitative polymerase chain reaction (RT-qPCR) was used to detect the mRNA expression levels of odontogenic-related genes. Immunofluorescence and immunohistochemical assays were used to detect the protein levels of odontogenic-specific genes, such as DSP and DMP-1 both in vitro and in vivo. The results suggest that both ADSCs and BMSCs have odontogenic differentiation potential. However, the odontogenic potential of BMSCs was greater compared with ADSCs, showing that BMSCs are a more appropriate cell source for tooth regeneration. © 2015 International Federation for Cell Biology.

  3. Natural and anthropogenic sources and processes affecting water chemistry in two South Korean streams

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Woo-Jin [Division of Earth and Environmental Sciences, Korea Basic Science Institute, Cheongwon-gun, Chungbuk 363-883 (Korea, Republic of); Department of Geoscience, University of Calgary, Calgary, Alberta T2N 1N4 (Canada); Ryu, Jong-Sik [Division of Earth and Environmental Sciences, Korea Basic Science Institute, Cheongwon-gun, Chungbuk 363-883 (Korea, Republic of); Mayer, Bernhard [Department of Geoscience, University of Calgary, Calgary, Alberta T2N 1N4 (Canada); Lee, Kwang-Sik, E-mail: kslee@kbsi.re.kr [Division of Earth and Environmental Sciences, Korea Basic Science Institute, Cheongwon-gun, Chungbuk 363-883 (Korea, Republic of); Lee, Sin-Woo [Division of Earth and Environmental Sciences, Korea Basic Science Institute, Cheongwon-gun, Chungbuk 363-883 (Korea, Republic of); Department of Geology, Chungnam National University, Yuseong-gu, Daejeon 305-764 (Korea, Republic of)

    2014-07-01

    Acid mine drainage (AMD) in a watershed provides potential sources of pollutants for surface and subsurface waters that can deteriorate water quality. Between March and early August 2011, water samples were collected from two streams in South Korea, one dominantly draining a watershed with carbonate bedrock affected by coal mines and another draining a watershed with silicate bedrock and a relatively undisturbed catchment area. The objective of the study was to identify the sources and processes controlling water chemistry, which was dependent on bedrock and land use. In the Odae stream (OS), the stream in the silicate-dominated catchment, Ca, Na, and HCO₃ were the dominant ions and total dissolved solids (TDS) was low (26.1–165 mg/L). In the Jijang stream (JS), in the carbonate-dominated watershed, TDS (224–434 mg/L) and ion concentrations were typically higher, and Ca and SO₄ were the dominant ions due to carbonate weathering and oxidation of pyrite exposed at coal mines. Dual isotopic compositions of sulfate (δ³⁴S and δ¹⁸O of SO₄) verified that the SO₄ in JS is derived mainly from sulfide mineral oxidation in coal mines. Cl in JS was highest upstream and decreased progressively downstream, which implies that pollutants from recreational facilities in the uppermost part of the catchment are the major source governing Cl concentrations within the discharge basin. Dual isotopic compositions of nitrate (δ¹⁵N and δ¹⁸O of NO₃) indicated that NO₃ in JS is attributable to nitrification of soil organic matter but that NO₃ in OS is derived mostly from manure. Additionally, the contributions of potential anthropogenic sources to the two streams were estimated in more detail by using a plot of δ³⁴S of SO₄ versus δ¹⁵N of NO₃. This study suggests that the dual isotope approach for sulfate and nitrate is an excellent additional tool for elucidating the sources and processes

  4. Natural and anthropogenic sources and processes affecting water chemistry in two South Korean streams

    International Nuclear Information System (INIS)

    Shin, Woo-Jin; Ryu, Jong-Sik; Mayer, Bernhard; Lee, Kwang-Sik; Lee, Sin-Woo

    2014-01-01

    Acid mine drainage (AMD) in a watershed provides potential sources of pollutants for surface and subsurface waters that can deteriorate water quality. Between March and early August 2011, water samples were collected from two streams in South Korea, one dominantly draining a watershed with carbonate bedrock affected by coal mines and another draining a watershed with silicate bedrock and a relatively undisturbed catchment area. The objective of the study was to identify the sources and processes controlling water chemistry, which was dependent on bedrock and land use. In the Odae stream (OS), the stream in the silicate-dominated catchment, Ca, Na, and HCO₃ were the dominant ions and total dissolved solids (TDS) was low (26.1–165 mg/L). In the Jijang stream (JS), in the carbonate-dominated watershed, TDS (224–434 mg/L) and ion concentrations were typically higher, and Ca and SO₄ were the dominant ions due to carbonate weathering and oxidation of pyrite exposed at coal mines. Dual isotopic compositions of sulfate (δ³⁴S and δ¹⁸O of SO₄) verified that the SO₄ in JS is derived mainly from sulfide mineral oxidation in coal mines. Cl in JS was highest upstream and decreased progressively downstream, which implies that pollutants from recreational facilities in the uppermost part of the catchment are the major source governing Cl concentrations within the discharge basin. Dual isotopic compositions of nitrate (δ¹⁵N and δ¹⁸O of NO₃) indicated that NO₃ in JS is attributable to nitrification of soil organic matter but that NO₃ in OS is derived mostly from manure. Additionally, the contributions of potential anthropogenic sources to the two streams were estimated in more detail by using a plot of δ³⁴S of SO₄ versus δ¹⁵N of NO₃. This study suggests that the dual isotope approach for sulfate and nitrate is an excellent additional tool for elucidating the sources and processes controlling the water chemistry of streams draining watersheds having different

  5. Innovative Process to Enrich Carbon Content of EFB-Derived Biochar as an Alternative Energy Source in Ironmaking

    Directory of Open Access Journals (Sweden)

    Hadi Purwanto

    2018-01-01

    This paper describes the mechanism of a developed process—an integrated pyrolysis-tar decomposition process—to produce oil palm empty fruit bunch (EFB)-derived biochar with additional solid carbon within the biochar bodies, produced by decomposition of tar vapor on its pore surface using the chemical vapor infiltration (CVI) method. The product, carbon-infiltrated biochar, was characterized to investigate the possibility of using it as a partial replacement for coke breeze in ironmaking. Carbon-infiltrated biochar is proposed to be utilized in a sintering process, which could reduce coke consumption and CO2 emissions in the iron and steel industry.

  6. Radioactive sealed sources production process for industrial radiography

    International Nuclear Information System (INIS)

    Santos, Paulo de S.; Ngunga, Daniel M.G.; Camara, Julio R.; Vasquez, Pablo A.S.

    2017-01-01

    providing products and services to the private and governmental Brazilian users of industrial radiography and nucleonic control systems. Radioactive sealed sources are commonly used in nondestructive tests such as radiography, to inspect and verify the internal structure and integrity of materials, and in nucleonic gauges to control level, density, viscosity, etc. in on-line industrial processes. One of the most important activities carried out by this laboratory is the inspection of source projector devices used in industrial radiography and their constituent parts, as well as the remote handling control assembly, drive cable and guide tube systems. The laboratory also provides users with iridium-192, cobalt-60 and selenium-75 sealed sources and performs quality control tests, replacing spent or contaminated radioactive sources. All discarded radioactive sources are treated as radioactive waste. Additionally, administrative and commercial processes and protocols for the exportation and transport of radioactive material are developed by specialized departments. This work presents the main processes and procedures used by the Sealed Source Production Laboratory, such as the arrival of the radioactive material at the laboratory and of the source projectors, mechanical inspections, source loading, source leak tests, etc. (author)

  7. Radioactive sealed sources production process for industrial radiography

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Paulo de S.; Ngunga, Daniel M.G.; Camara, Julio R.; Vasquez, Pablo A.S., E-mail: psantos@ipen.br, E-mail: hobeddaniel@gmail.com, E-mail: jrcamara@ipen.br, E-mail: pavsalva@ipen.br [Instituto de Pesquisas Energética s e Nucleares (IPEN/CNEN-SP), São Paulo, SP (Brazil)

    2017-07-01

    providing products and services to the private and governmental Brazilian users of industrial radiography and nucleonic control systems. Radioactive sealed sources are commonly used in nondestructive tests such as radiography, to inspect and verify the internal structure and integrity of materials, and in nucleonic gauges to control level, density, viscosity, etc. in on-line industrial processes. One of the most important activities carried out by this laboratory is the inspection of source projector devices used in industrial radiography and their constituent parts, as well as the remote handling control assembly, drive cable and guide tube systems. The laboratory also provides users with iridium-192, cobalt-60 and selenium-75 sealed sources and performs quality control tests, replacing spent or contaminated radioactive sources. All discarded radioactive sources are treated as radioactive waste. Additionally, administrative and commercial processes and protocols for the exportation and transport of radioactive material are developed by specialized departments. This work presents the main processes and procedures used by the Sealed Source Production Laboratory, such as the arrival of the radioactive material at the laboratory and of the source projectors, mechanical inspections, source loading, source leak tests, etc. (author)

  8. MATLAB-based algorithm to estimate depths of isolated thin dike-like sources using higher-order horizontal derivatives of magnetic anomalies.

    Science.gov (United States)

    Ekinci, Yunus Levent

    2016-01-01

    This paper presents an easy-to-use open source computer algorithm (code) for estimating the depths of isolated single thin dike-like source bodies by using numerical second-, third-, and fourth-order horizontal derivatives computed from observed magnetic anomalies. The approach does not require a priori information and uses filters of successive graticule spacings. The computed higher-order horizontal derivative datasets are used to solve nonlinear equations for depth determination. The solutions are independent of the magnetization and ambient field directions. The practical usability of the developed code, designed in MATLAB R2012b (MathWorks Inc.), was successfully examined using synthetic simulations with and without noise. The algorithm was then used to estimate the depths of some ore bodies buried in different regions (USA, Sweden, and Canada). Real data tests clearly indicated that the obtained depths are in good agreement with those of previous studies and drilling information. Additionally, a state-of-the-art inversion scheme based on particle swarm optimization produced results comparable to those of the higher-order horizontal derivative analyses in both the synthetic and real anomaly cases. Accordingly, the proposed code is verified to be useful in interpreting isolated single thin dike-like magnetized bodies and may be an alternative processing technique. The open source code can be easily modified and adapted to suit the needs of other researchers.
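
    The published code is MATLAB-based and is not reproduced here; purely as an illustration of what computing successive higher-order horizontal derivatives of an anomaly profile involves, a minimal Python sketch follows. The synthetic profile, station spacing and amplitudes are assumptions for demonstration only, and repeated numerical differentiation amplifies noise, which is why the paper works with filters of successive graticule spacings.

        import numpy as np

        def horizontal_derivatives(anomaly, spacing, max_order=4):
            """Numerical horizontal derivatives (orders 1..max_order) of a
            magnetic-anomaly profile sampled at a constant station spacing."""
            derivs = {}
            current = np.asarray(anomaly, dtype=float)
            for order in range(1, max_order + 1):
                current = np.gradient(current, spacing)
                derivs[order] = current
            return derivs

        # Synthetic bell-shaped anomaly (illustrative shape, not a physical dike model)
        x = np.linspace(-500.0, 500.0, 401)            # station positions, m
        profile = 100.0 * 50.0 / (x**2 + 50.0**2)      # anomaly amplitude, nT
        d = horizontal_derivatives(profile, spacing=x[1] - x[0])
        second, third, fourth = d[2], d[3], d[4]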

  9. Application for approval of derived authorized limits for the release of the 190-C trenches and 105-C process water tunnels at the Hanford Site: Volume 2 - source term development

    International Nuclear Information System (INIS)

    Denham, D.H.; Winslow, S.L.; Moeller, M.P.; Kennedy, W.E. Jr.

    1997-03-01

    As part of environmental restoration activities at the Hanford Site, Bechtel Hanford, Inc. is conducting a series of evaluations to determine appropriate release conditions for specific facilities following the completion of decontamination and decommissioning projects. The release conditions, with respect to the residual volumetric radioactive contamination, are termed authorized limits. This report presents the summary of the supporting information and the final application for approval of derived authorized limits for the release of the 190-C trenches and the 105-C process water tunnels. This document contains two volumes; this volume (Vol. 2) contains the radiological characterization data, spreadsheet analyses, and radiological source terms

  10. Semiconductor processing apparatus with compact free radical source

    NARCIS (Netherlands)

    Kovalgin, Alexeij Y.; Aarnink, Antonius A.I.

    2013-01-01

    A semiconductor processing apparatus (1), comprising: a substrate processing chamber (158), defining a substrate support location (156) at which a generally planar semiconductor substrate (300) is supportable; and at least one free radical source (200), including: a precursor gas source (250); an

  11. On data processing required to derive mobility patterns from passively-generated mobile phone data

    Science.gov (United States)

    Wang, Feilong; Chen, Cynthia

    2018-01-01

    Passively-generated mobile phone data are emerging as a potential data source for transportation research and applications. Despite the large number of studies based on mobile phone data, only a few have reported the properties of such data and documented how they processed them. In this paper, we describe two common types of mobile phone data, Call Detail Record (CDR) data and sightings data, and propose a data processing framework and associated algorithms to address two key issues of the sightings data: locational uncertainty and oscillation. We show the effectiveness of our proposed methods in addressing these two issues compared to state-of-the-art algorithms in the field. We also demonstrate that without proper processing applied to the data, the statistical regularity of human mobility patterns—a key, significant trait identified for human mobility—is over-estimated. We hope this study will stimulate more studies examining the properties of such data and developing methods to address them. Though not as glamorous as work that directly derives insights on mobility patterns (such as statistical regularity), understanding the properties of such data and developing methods to address them is a fundamental research topic on which such insights rest. PMID:29398790
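
    The abstract does not spell out the oscillation algorithm; as a hedged illustration of the kind of issue being addressed, the Python sketch below flags a common "ping-pong" pattern, where a record jumps to a neighbouring cell and immediately back within a short time window. The Sighting structure, window length and rule are illustrative assumptions, not the authors' method.

        from dataclasses import dataclass
        from typing import List

        @dataclass
        class Sighting:
            timestamp: float   # seconds since start of day
            cell_id: str       # tower / antenna identifier

        def flag_oscillations(sightings: List[Sighting], window: float = 300.0) -> List[bool]:
            """Flag records forming an A-B-A 'ping-pong' pattern within a short
            time window -- one simple oscillation heuristic, not the paper's method."""
            flags = [False] * len(sightings)
            for i in range(len(sightings) - 2):
                a, b, c = sightings[i], sightings[i + 1], sightings[i + 2]
                if (a.cell_id == c.cell_id and a.cell_id != b.cell_id
                        and c.timestamp - a.timestamp <= window):
                    flags[i + 1] = True   # the middle record is the likely oscillation
            return flags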

  12. Yttrium recovery from primary and secondary sources: A review of main hydrometallurgical processes

    Energy Technology Data Exchange (ETDEWEB)

    Innocenzi, Valentina, E-mail: valentina.innocenzi1@univaq.it [Department of Industrial Engineering, of Information and Economy, University of L’Aquila, Via Giovanni Gronchi 18, Zona industriale di Pile, 67100 L’Aquila (Italy); De Michelis, Ida [Department of Industrial Engineering, of Information and Economy, University of L’Aquila, Via Giovanni Gronchi 18, Zona industriale di Pile, 67100 L’Aquila (Italy); Kopacek, Bernd [SAT, Austrian Society for Systems Engineering and Automation, Gurkasse 43/2, A-1140 Vienna (Austria); Vegliò, Francesco [Department of Industrial Engineering, of Information and Economy, University of L’Aquila, Via Giovanni Gronchi 18, Zona industriale di Pile, 67100 L’Aquila (Italy)

    2014-07-15

    Highlights: • Review of the main hydrometallurgical processes to recover yttrium. • Recovery of yttrium from primary sources. • Recovery of yttrium from e-waste and other types of waste. - Abstract: Yttrium is an important rare earth (RE) element used in numerous fields, mainly in phosphor powders for low-energy lighting. The uses of these elements, especially in high-tech products, have increased in recent years; combined with the scarcity of the resources and the environmental impact of the technologies used to extract them from ores, this makes the recycling of wastes that contain Y and other REs a priority. The present review summarizes the main hydrometallurgical technologies to extract Y from ores, contaminated solutions, WEEE and generic wastes. Before discussing the work on the treatment of wastes, the processes to recover Y from ores are discussed, since the processes are similar and derived from those already developed for extraction from primary sources. Particular attention is given to the recovery of Y from WEEE, because recycling it is important not only from an economic point of view, considering its value, but also because of the environmental impact that could be generated if it is not properly disposed of.

  13. Data processing with microcode designed with source coding

    Science.gov (United States)

    McCoy, James A; Morrison, Steven E

    2013-05-07

    Programming for a data processor to execute a data processing application is provided using microcode source code. The microcode source code is assembled to produce microcode that includes digital microcode instructions with which to signal the data processor to execute the data processing application.

  14. The Enterprise Derivative Application: Flexible Software for Optimizing Manufacturing Processes

    Energy Technology Data Exchange (ETDEWEB)

    Ward, Richard C [ORNL; Allgood, Glenn O [ORNL; Knox, John R [ORNL

    2008-11-01

    The Enterprise Derivative Application (EDA) implements the enterprise-derivative analysis for optimization of an industrial process (Allgood and Manges, 2001). It is a tool to help industry planners choose the most productive way of manufacturing their products while minimizing their cost. Developed in MS Access, the application allows users to input initial data ranging from raw material to variable costs and enables the tracking of specific information as material is passed from one process to another. Energy-derivative analysis is based on calculation of sensitivity parameters. For the specific application to a steel production process these include: the cost to product sensitivity, the product to energy sensitivity, the energy to efficiency sensitivity, and the efficiency to cost sensitivity. Using the EDA, for all processes the user can display a particular sensitivity or all sensitivities can be compared for all processes. Although energy-derivative analysis was originally designed for use by the steel industry, it is flexible enough to be applied to many other industrial processes. Examples of processes where energy-derivative analysis would prove useful are wireless monitoring of processes in the petroleum cracking industry and wireless monitoring of motor failure for determining the optimum time to replace motor parts. One advantage of the MS Access-based application is its flexibility in defining the process flow and establishing the relationships between parent and child process and products resulting from a process. Due to the general design of the program, a process can be anything that occurs over time with resulting output (products). So the application can be easily modified to many different industrial and organizational environments. Another advantage is the flexibility of defining sensitivity parameters. Sensitivities can be determined between all possible variables in the process flow as a function of time. Thus the dynamic development of the
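
    As a rough sketch of what a chain of sensitivity parameters between successive process variables might look like numerically, the following Python fragment computes finite-difference ratios between paired time series. The variable names, the direction of each ratio and the finite-difference formulation are illustrative assumptions; the EDA itself is an MS Access application and its exact definitions are not given in the abstract.

        import numpy as np

        def chain_sensitivities(cost, product, energy, efficiency):
            """Finite-difference sensitivities between successive variables in a
            process chain, in the spirit of the derivative analysis described above.
            Inputs are equal-length time series; names and pairings are illustrative."""
            def sensitivity(y, x):
                # ratio of changes dy/dx between consecutive time steps
                dy, dx = np.diff(np.asarray(y, float)), np.diff(np.asarray(x, float))
                with np.errstate(divide="ignore", invalid="ignore"):
                    return np.where(dx != 0, dy / dx, np.nan)
            return {
                "cost_to_product": sensitivity(product, cost),
                "product_to_energy": sensitivity(energy, product),
                "energy_to_efficiency": sensitivity(efficiency, energy),
                "efficiency_to_cost": sensitivity(cost, efficiency),
            }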

  15. THE USE OF MULTIPLE DATA SOURCES IN THE PROCESS OF TOPOGRAPHIC MAPS UPDATING

    Directory of Open Access Journals (Sweden)

    A. Cantemir

    2016-06-01

    The methods used in the process of updating maps have evolved and become more complex, especially with the development of digital technology. At the same time, the development of technology has led to an abundance of available data that can be used in the updating process. The data sources come in a great variety of forms and formats from different acquisition sensors. Satellite images provided by certain satellite missions are now available on space agencies' portals. Images stored in the archives of satellite missions such as Sentinel, Landsat and others can be downloaded free of charge. The main advantages are the large coverage area and rather good spatial resolution, which enable the use of these images for map updating at an appropriate scale. In our study we focused our research of these images on the 1:50,000 scale map. DEMs that are globally available could represent an appropriate input for watershed delineation and stream network generation, which can be used as support for updating the hydrography thematic layer. If, in addition to remote sensing, aerial photogrammetry and LiDAR data are used, the accuracy of the data sources is enhanced. Orthophotoimages and Digital Terrain Models are the main products that can be used for feature extraction and update. On the other hand, the use of georeferenced analogue basemaps represents a significant addition to the process. Concerning the thematic maps, the classic representation of the terrain by contour lines derived from the DTM remains the best method of depicting the earth's surface on a map; nevertheless, correlation with other layers such as hydrography is mandatory. In the context of the current national coverage of the Digital Terrain Model, one of the main concerns of the National Center of Cartography, through the Cartography and Photogrammetry Department, is the exploitation of the available data in order to update the layers of the Topographic Reference Map 1:5000, known as

  16. Can pancreatic duct-derived progenitors be a source of islet regeneration?

    International Nuclear Information System (INIS)

    Xia, Bing; Zhan, Xiao-Rong; Yi, Ran; Yang, Baofeng

    2009-01-01

    The regenerative process of the pancreas is of interest because the main pathogenesis of diabetes mellitus is an inadequate number of insulin-producing β-cells. The functional mass of β-cells is decreased in type 1 diabetes, so replacing missing β-cells or triggering their regeneration may allow for improved type 1 diabetes treatment. Therefore, expansion of the β-cell mass from endogenous sources, either in vivo or in vitro, represents an area of increasing interest. The mechanism of islet regeneration remains poorly understood, but the identification of islet progenitor sources is critical for understanding β-cell regeneration. One potential source is the islet proper, via the dedifferentiation, proliferation, and redifferentiation of facultative progenitors residing within the islet. Neogenesis, in which new pancreatic islets derive from progenitor cells present within the ducts, has also been reported, but the existence and identity of the progenitor cells have been debated. In this review, we focus on pancreatic ductal cells, which are islet progenitors capable of differentiating into islet β-cells. Islet neogenesis, seen as budding of hormone-positive cells from the ductal epithelium, is considered to be one mechanism for normal islet growth after birth and in regeneration, and has suggested the presence of pancreatic stem cells. Numerous results support the neogenesis hypothesis; the evidence for the hypothesis in the adult comes primarily from morphological studies that have in common the production of damage to all or part of the pancreas, with consequent inflammation and repair. Although numerous studies support a ductal origin for new islets after birth, lineage-tracing experiments are considered the 'gold standard' of proof. Lineage-tracing experiments show that pancreatic duct cells act as progenitors, giving rise to new islets after birth and after injury. The identification of differentiated pancreatic ductal cells as an in vivo progenitor for

  17. Can pancreatic duct-derived progenitors be a source of islet regeneration?

    Energy Technology Data Exchange (ETDEWEB)

    Xia, Bing [Department of Endocrinology, First Hospital of Harbin Medical University, Harbin, Hei Long Jiang Province 150001 (China); Zhan, Xiao-Rong, E-mail: xiaorongzhan@sina.com [Department of Endocrinology, First Hospital of Harbin Medical University, Harbin, Hei Long Jiang Province 150001 (China); Yi, Ran [Department of Endocrinology, First Hospital of Harbin Medical University, Harbin, Hei Long Jiang Province 150001 (China); Yang, Baofeng [Department of Pharmacology, State Key Laboratory of Biomedicine and Pharmacology, Harbin Medical University, Harbin, Hei Long Jiang Province 150001 (China)

    2009-06-12

    The regenerative process of the pancreas is of interest because the main pathogenesis of diabetes mellitus is an inadequate number of insulin-producing β-cells. The functional mass of β-cells is decreased in type 1 diabetes, so replacing missing β-cells or triggering their regeneration may allow for improved type 1 diabetes treatment. Therefore, expansion of the β-cell mass from endogenous sources, either in vivo or in vitro, represents an area of increasing interest. The mechanism of islet regeneration remains poorly understood, but the identification of islet progenitor sources is critical for understanding β-cell regeneration. One potential source is the islet proper, via the dedifferentiation, proliferation, and redifferentiation of facultative progenitors residing within the islet. Neogenesis, in which new pancreatic islets derive from progenitor cells present within the ducts, has been reported, but the existence and identity of the progenitor cells have been debated. In this review, we focus on pancreatic ductal cells, which are islet progenitors capable of differentiating into islet β-cells. Islet neogenesis, seen as budding of hormone-positive cells from the ductal epithelium, is considered to be one mechanism for normal islet growth after birth and in regeneration, and has suggested the presence of pancreatic stem cells. Numerous results support the neogenesis hypothesis; the evidence for the hypothesis in the adult comes primarily from morphological studies that have in common the production of damage to all or part of the pancreas, with consequent inflammation and repair. Although numerous studies support a ductal origin for new islets after birth, lineage-tracing experiments are considered the 'gold standard' of proof. Lineage-tracing experiments show that pancreatic duct cells act as progenitors, giving rise to new islets after birth and after injury. The identification of differentiated pancreatic ductal

  18. The algorithms for calculating synthetic seismograms from a dipole source using the derivatives of Green's function

    Science.gov (United States)

    Pavlov, V. M.

    2017-07-01

    The problem of calculating complete synthetic seismograms from a point dipole with an arbitrary seismic moment tensor in a plane parallel medium composed of homogeneous elastic isotropic layers is considered. It is established that the solutions of the system of ordinary differential equations for the motion-stress vector have a reciprocity property, which allows obtaining a compact formula for the derivative of the motion vector with respect to the source depth. The reciprocity theorem for Green's functions with respect to the interchange of the source and receiver is obtained for a medium with a cylindrical boundary. The differentiation of Green's functions with respect to the coordinates of the source leads to the same calculation formulas as the algorithm developed in the previous work (Pavlov, 2013). A new algorithm appears when the derivatives with respect to the horizontal coordinates of the source are replaced by the derivatives with respect to the horizontal coordinates of the receiver (with the minus sign). This algorithm is more transparent, compact, and economical than the previous one. It requires calculating the wavenumbers associated with the roots of Bessel functions of order 0 and order 1, whereas the previous algorithm additionally requires the order-2 roots.
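    As background (not quoted from the paper), the replacement described above presumably follows from the horizontal translation invariance of a plane-layered medium: the Green's function depends on the horizontal source and receiver coordinates only through their offset, so a horizontal derivative at the source equals minus the same derivative at the receiver. A minimal sketch in our own notation:

```latex
% Illustrative notation (not taken from Pavlov, 2017): G_{ij} is the Green's
% function for a receiver at x_r and a source at x_s in a plane-layered medium.
\[
  G_{ij}(\mathbf{x}_r,\mathbf{x}_s) = G_{ij}(\mathbf{x}_r - \mathbf{x}_s,\, z_r,\, z_s)
  \;\;\Longrightarrow\;\;
  \frac{\partial G_{ij}}{\partial x_s^{\alpha}} = -\,\frac{\partial G_{ij}}{\partial x_r^{\alpha}},
  \qquad \alpha = 1,2 ,
\]
\[
  u_i(\mathbf{x}_r,t) \;=\; M_{jk}\,\frac{\partial}{\partial x_s^{k}}\,G_{ij}(\mathbf{x}_r,\mathbf{x}_s,t),
\]
% so the dipole seismogram u_i can be assembled from receiver-side horizontal
% derivatives (with a minus sign) plus the source-depth derivative mentioned above.
```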

  19. Tracking and Quantifying Developmental Processes in C. elegans Using Open-source Tools.

    Science.gov (United States)

    Dutta, Priyanka; Lehmann, Christina; Odedra, Devang; Singh, Deepika; Pohl, Christian

    2015-12-16

    Quantitatively capturing developmental processes is crucial to derive mechanistic models and key to identify and describe mutant phenotypes. Here protocols are presented for preparing embryos and adult C. elegans animals for short- and long-term time-lapse microscopy and methods for tracking and quantification of developmental processes. The methods presented are all based on C. elegans strains available from the Caenorhabditis Genetics Center and on open-source software that can be easily implemented in any laboratory independently of the microscopy system used. A reconstruction of a 3D cell-shape model using the modelling software IMOD, manual tracking of fluorescently-labeled subcellular structures using the multi-purpose image analysis program Endrov, and an analysis of cortical contractile flow using PIVlab (Time-Resolved Digital Particle Image Velocimetry Tool for MATLAB) are shown. It is discussed how these methods can also be deployed to quantitatively capture other developmental processes in different models, e.g., cell tracking and lineage tracing, tracking of vesicle flow.

  20. M7 germplasm release: A tetraploid clone derived from Solanum infundibuliforme for use in expanding the germplasm base for french fry processing

    Science.gov (United States)

    A new source of russet germplasm has been identified as a parent for processing and fresh market breeding programs. It was derived via bilateral sexual polyploidization following a cross between a diploid cultivated potato and the diploid wild species Solanum infundibuliforme. This clone, designated...

  1. Ultrasound-assisted liposuction provides a source for functional adipose-derived stromal cells.

    Science.gov (United States)

    Duscher, Dominik; Maan, Zeshaan N; Luan, Anna; Aitzetmüller, Matthias M; Brett, Elizabeth A; Atashroo, David; Whittam, Alexander J; Hu, Michael S; Walmsley, Graham G; Houschyar, Khosrow S; Schilling, Arndt F; Machens, Hans-Guenther; Gurtner, Geoffrey C; Longaker, Michael T; Wan, Derrick C

    2017-12-01

    Regenerative medicine employs human mesenchymal stromal cells (MSCs) for their multi-lineage plasticity and their pro-regenerative cytokine secretome. Adipose-derived mesenchymal stromal cells (ASCs) are concentrated in fat tissue, and the ease of harvest via liposuction makes them a particularly interesting cell source. However, there are various liposuction methods, and few have been assessed regarding their impact on ASC functionality. Here we study the impact of the two most popular ultrasound-assisted liposuction (UAL) devices currently in clinical use, VASER (Solta Medical) and Lysonix 3000 (Mentor), on ASCs. After lipoaspirate harvest and processing, we sorted for ASCs using fluorescent-assisted cell sorting based on an established surface marker profile (CD34+ CD31− CD45−). ASC yield, viability, osteogenic and adipogenic differentiation capacity and in vivo regenerative performance were assessed. Both UAL samples demonstrated equivalent ASC yield and viability. VASER UAL ASCs showed higher osteogenic and adipogenic marker expression, but a comparable differentiation capacity was observed. Soft tissue healing and neovascularization were significantly enhanced via both UAL-derived ASCs in vivo, and there was no significant difference between the cell therapy groups. Taken together, our data suggest that UAL allows safe and efficient harvesting of the mesenchymal stromal cellular fraction of adipose tissue and that cells harvested via this approach are suitable for cell therapy and tissue engineering applications. Copyright © 2017 International Society for Cellular Therapy. Published by Elsevier Inc. All rights reserved.

  2. Investigating source processes of isotropic events

    Science.gov (United States)

    Chiang, Andrea

    This dissertation demonstrates the utility of the complete waveform regional moment tensor inversion for nuclear event discrimination. I explore the source processes and associated uncertainties for explosions and earthquakes under the effects of limited station coverage, compound seismic sources, assumptions in velocity models and the corresponding Green's functions, and the effects of shallow source depth and free-surface conditions. The motivation to develop better techniques to obtain reliable source mechanisms and assess uncertainties is not limited to nuclear monitoring; these techniques also provide quantitative information about the characteristics of seismic hazards, local and regional tectonics, and in-situ stress fields of the region. This dissertation begins with the analysis of three sparsely recorded events: the 14 September 1988 US-Soviet Joint Verification Experiment (JVE) nuclear test at the Semipalatinsk test site in Eastern Kazakhstan, and two nuclear explosions at the Chinese Lop Nor test site. We utilize a regional distance seismic waveform method fitting long-period, complete, three-component waveforms jointly with first-motion observations from regional stations and teleseismic arrays. The combination of long period waveforms and first motion observations provides unique discrimination of these sparsely recorded events in the context of the Hudson et al. (1989) source-type diagram. We examine the effects of the free surface on the moment tensor via synthetic testing, and apply the moment tensor based discrimination method to well-recorded chemical explosions. These shallow chemical explosions represent rather severe source-station geometry in terms of the vanishing traction issues. We show that the combined waveform and first motion method enables the unique discrimination of these events, even though the data include unmodeled single force components resulting from the collapse and blowout of the quarry face immediately following the initial
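    For orientation (a standard formulation, not quoted from the dissertation), the regional moment tensor inversion referred to above is usually posed as a linear least-squares problem in the six independent moment-tensor components:

```latex
% Standard linear forward problem and least-squares inversion; symbols are ours.
\[
  u_n(t) \;=\; \sum_{i,j} M_{ij}\, G_{n,ij}(t)
  \quad\Longrightarrow\quad
  \hat{\mathbf{m}} \;=\; \arg\min_{\mathbf{m}}\; \lVert \mathbf{d} - \mathbf{G}\,\mathbf{m} \rVert^{2},
\]
% where d stacks the observed long-period waveforms, the columns of G hold the
% Green's-function responses for each moment-tensor element, and m collects the
% six independent M_ij; first-motion polarities add sign constraints on top.
```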

  3. Novel family of quasi-Z-source DC/DC converters derived from current-fed push-pull converters

    DEFF Research Database (Denmark)

    Chub, Andrii; Husev, Oleksandr; Vinnikov, Dmitri

    2014-01-01

    This paper is devoted to the step-up quasi-Z-source dc/dc push-pull converter family. The topologies in the family are derived from the isolated boost converter family by replacing input inductors with the quasi-Z-source network. Two new topologies are proposed, analyzed and compared. Theoretical...

  4. Sources of Information as Determinants of Product and Process Innovation.

    Science.gov (United States)

    Gómez, Jaime; Salazar, Idana; Vargas, Pilar

    2016-01-01

    In this paper we use a panel of manufacturing firms in Spain to examine the extent to which they use internal and external sources of information (customers, suppliers, competitors, consultants and universities) to generate product and process innovation. Our results show that, although internal sources are influential, external sources of information are key to achieve innovation performance. These results are in line with the open innovation literature because they show that firms that are opening up their innovation process and that use different information sources have a greater capacity to generate innovations. We also find that the importance of external sources of information varies depending on the type of innovation (product or process) considered. To generate process innovation, firms mainly rely on suppliers while, to generate product innovation, the main contribution is from customers. The potential simultaneity between product and process innovation is also taken into consideration. We find that the generation of both types of innovation is not independent.

  5. Sources of Information as Determinants of Product and Process Innovation.

    Directory of Open Access Journals (Sweden)

    Jaime Gómez

    Full Text Available In this paper we use a panel of manufacturing firms in Spain to examine the extent to which they use internal and external sources of information (customers, suppliers, competitors, consultants and universities) to generate product and process innovation. Our results show that, although internal sources are influential, external sources of information are key to achieve innovation performance. These results are in line with the open innovation literature because they show that firms that are opening up their innovation process and that use different information sources have a greater capacity to generate innovations. We also find that the importance of external sources of information varies depending on the type of innovation (product or process) considered. To generate process innovation, firms mainly rely on suppliers while, to generate product innovation, the main contribution is from customers. The potential simultaneity between product and process innovation is also taken into consideration. We find that the generation of both types of innovation is not independent.

  6. Sources of Information as Determinants of Product and Process Innovation

    Science.gov (United States)

    2016-01-01

    In this paper we use a panel of manufacturing firms in Spain to examine the extent to which they use internal and external sources of information (customers, suppliers, competitors, consultants and universities) to generate product and process innovation. Our results show that, although internal sources are influential, external sources of information are key to achieve innovation performance. These results are in line with the open innovation literature because they show that firms that are opening up their innovation process and that use different information sources have a greater capacity to generate innovations. We also find that the importance of external sources of information varies depending on the type of innovation (product or process) considered. To generate process innovation, firms mainly rely on suppliers while, to generate product innovation, the main contribution is from customers. The potential simultaneity between product and process innovation is also taken into consideration. We find that the generation of both types of innovation is not independent. PMID:27035456

  7. Modeling Aerobic Carbon Source Degradation Processes using Titrimetric Data and Combined Respirometric-Titrimetric Data: Structural and Practical Identifiability

    DEFF Research Database (Denmark)

    Gernaey, Krist; Petersen, B.; Dochain, D.

    2002-01-01

    The structural and practical identifiability of a model for description of respirometric-titrimetric data derived from aerobic batch substrate degradation experiments of a CxHyOz carbon source with activated sludge was evaluated. The model processes needed to describe titrimetric data included su...... the initial substrate concentration S-S(O) is known. The values found correspond to values reported in literature, but, interestingly, also seem able to reflect the occurrence of storage processes when pulses of acetate and dextrose are added. (C) 2002 Wiley Periodicals, Inc....

  8. Levels-of-processing effect on internal source monitoring in schizophrenia.

    Science.gov (United States)

    Ragland, J Daniel; McCarthy, Erin; Bilker, Warren B; Brensinger, Colleen M; Valdez, Jeffrey; Kohler, Christian; Gur, Raquel E; Gur, Ruben C

    2006-05-01

    Recognition can be normalized in schizophrenia by providing patients with semantic organizational strategies through a levels-of-processing (LOP) framework. However, patients may rely primarily on familiarity effects, making recognition less sensitive than source monitoring to the strength of the episodic memory trace. The current study investigates whether providing semantic organizational strategies can also normalize patients' internal source-monitoring performance. Sixteen clinically stable medicated patients with schizophrenia and 15 demographically matched healthy controls were asked to identify the source of remembered words following an LOP-encoding paradigm in which they alternated between processing words on a 'shallow' perceptual versus a 'deep' semantic level. A multinomial analysis provided orthogonal measures of item recognition and source discrimination, and bootstrapping generated variance to allow for parametric analyses. LOP and group effects were tested by contrasting recognition and source-monitoring parameters for words that had been encoded during deep versus shallow processing conditions. As in a previous study there were no group differences in LOP effects on recognition performance, with patients and controls benefiting equally from deep versus shallow processing. Although there were no group differences in internal source monitoring, only controls had significantly better performance for words processed during the deep encoding condition. Patient performance did not correlate with clinical symptoms or medication dose. Providing a deep processing semantic encoding strategy significantly improved patients' recognition performance only. The lack of a significant LOP effect on internal source monitoring in patients may reflect subtle problems in the relational binding of semantic information that are independent of strategic memory processes.

  9. Rapid Automatic Lighting Control of a Mixed Light Source for Image Acquisition using Derivative Optimum Search Methods

    Directory of Open Access Journals (Sweden)

    Kim HyungTae

    2015-01-01

    Full Text Available Automatic lighting (auto-lighting) is a function that maximizes the image quality of a vision inspection system by adjusting the light intensity and color. In most inspection systems, a single-color light source is used, and an equal-step search is employed to determine the maximum image quality. However, when a mixed light source is used, the number of iterations becomes large, and therefore a rapid search method must be applied to reduce it. Derivative optimum search methods follow the tangential direction of a function and are usually faster than other methods. In this study, multi-dimensional forms of derivative optimum search methods are applied to obtain the maximum image quality considering a mixed light source. The auto-lighting algorithms were derived from the steepest descent and conjugate gradient methods, which have N-size inputs of driving voltage and one output of image quality. Experiments in which the proposed algorithm was applied to semiconductor patterns showed that a reduced number of iterations is required to determine the locally maximized image quality.
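    A minimal sketch of the multi-dimensional steepest-descent idea described above, with a hypothetical image-quality function standing in for the camera and focus-measure pipeline (all names, step sizes and voltage ranges are illustrative, not taken from the paper):

```python
import numpy as np

def auto_light_steepest_descent(quality, v0, step=0.5, delta=0.05,
                                tol=1e-3, max_iter=100):
    """Adjust an N-channel driving-voltage vector v to maximize the scalar
    image-quality score returned by quality(v), by ascending the gradient."""
    v = np.asarray(v0, dtype=float)
    for _ in range(max_iter):
        # Estimate the gradient by central finite differences, one channel at a time.
        grad = np.array([
            (quality(v + delta * e) - quality(v - delta * e)) / (2 * delta)
            for e in np.eye(len(v))
        ])
        if np.linalg.norm(grad) < tol:           # local maximum reached
            break
        v = np.clip(v + step * grad, 0.0, 10.0)  # ascend; keep voltages in range
    return v

# Toy stand-in for the inspection system: quality peaks at a "best" RGB setting.
target = np.array([3.0, 5.0, 2.0])
quality = lambda v: -np.sum((v - target) ** 2)
print(auto_light_steepest_descent(quality, v0=[1.0, 1.0, 1.0]))
```

A conjugate-gradient variant only changes how the ascent direction is updated between iterations; the finite-difference gradient estimation stays the same.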

  10. Identifying the source, transport path and sinks of sewage derived organic matter

    International Nuclear Information System (INIS)

    Mudge, Stephen M.; Duce, Caroline E.

    2005-01-01

    Since sewage discharges can significantly contribute to the contaminant loadings in coastal areas, it is important to identify sources, pathways and environmental sinks. Sterol and fatty alcohol biomarkers were quantified in source materials, suspended sediments and settling matter from the Ria Formosa Lagoon. Simple ratios between key biomarkers including 5β-coprostanol, cholesterol and epi-coprostanol were able to identify the sewage sources and affected deposition sites. Multivariate methods (PCA) were used to identify co-varying sites. PLS analysis using the sewage discharge as the signature indicated ∼ 25% of the variance in the sites could be predicted by the sewage signature. A new source of sewage-derived organic matter was found with a high sewage-predictable signature. The suspended sediments had relatively low sewage signatures as the material was diluted with other organic matter from in situ production. From a management viewpoint, PLS provides a useful tool in identifying the pathways and accumulation sites for such contaminants. - Multivariate statistical analysis was used to identify pathways and accumulation sites for contaminants in coastal waters

  11. Catalytic Processes for Utilizing Carbohydrates Derived from Algal Biomass

    Directory of Open Access Journals (Sweden)

    Sho Yamaguchi

    2017-05-01

    Full Text Available The high productivity of oil biosynthesized by microalgae has attracted increasing attention in recent years. Due to the application of such oils in jet fuels, the algal biosynthetic pathway toward oil components has been extensively researched. However, the utilization of the residue from algal cells after oil extraction has been overlooked. This residue is mainly composed of carbohydrates (starch), and so we herein describe the novel processes available for the production of useful chemicals from algal biomass-derived sugars. In particular, this review highlights our latest research in generating lactic acid and levulinic acid derivatives from polysaccharides and monosaccharides using homogeneous catalysts. Furthermore, based on previous reports, we discuss the potential of heterogeneous catalysts for application in such processes.

  12. New efficient hydrogen process production from organosilane hydrogen carriers derivatives

    Energy Technology Data Exchange (ETDEWEB)

    Brunel, Jean Michel [Unite URMITE, UMR 6236 CNRS, Faculte de Medecine et de Pharmacie, Universite de la Mediterranee, 27 boulevard Jean Moulin, 13385 Marseille 05 (France)

    2010-04-15

    While the source of hydrogen constitutes a significant scientific challenge, addressing issues of hydrogen storage, transport, and delivery is equally important. None of the current hydrogen storage options (liquefied or high-pressure H2 gas, metal hydrides, etc.) satisfies the criteria of size, cost, kinetics, and safety for use in transportation. In this context, we have discovered a methodology for the production of hydrogen on demand, in high yield, under kinetic control, from organosilane hydrogen carrier derivatives and methanol as co-reagent under mild conditions, catalyzed by a cheap ammonium fluoride salt. Finally, the silicon by-products can be efficiently recycled, leading to an environmentally friendly source of energy. (author)

  13. 40 CFR 74.17 - Application requirements for process sources. [Reserved

    Science.gov (United States)

    2010-07-01

    Title 40, Protection of Environment, Section 74.17: Application requirements for process sources. [Reserved] Environmental Protection Agency, edition of 2010-07-01.

  14. Binaural Processing of Multiple Sound Sources

    Science.gov (United States)

    2016-08-18

    AFRL-AFOSR-VA-TR-2016-0298. Final performance report, 15 Jul 2012 to 14 Jul 2016: Binaural Processing of Multiple Sound Sources. William Yost, Arizona State University, Tempe, AZ. The three topics cited above are entirely within the scope of the AFOSR grant. Subject terms: binaural hearing, sound localization, interaural signal

  15. Computing Pathways in Bio-Models Derived from Bio-Science Text Sources

    DEFF Research Database (Denmark)

    Andreasen, Troels; Bulskov, Henrik; Nilsson, Jørgen Fischer

    2015-01-01

    This paper outlines a system, OntoScape, serving to accomplish complex inference tasks on knowledge bases and bio-models derived from life-science text corpora. The system applies so-called natural logic, a form of logic which is readable for humans. This logic affords ontological representations...... of complex terms appearing in the text sources. Along with logical propositions, the system applies a semantic graph representation facilitating calculation of bio-pathways. More generally, the system affords means of query answering appealing to general and domain-specific inference rules....

  16. Recent updates on lignocellulosic biomass derived ethanol - A review

    Directory of Open Access Journals (Sweden)

    Rajeev Kumar

    2016-03-01

    Full Text Available Lignocellulosic (or cellulosic) biomass-derived ethanol is the most promising near/long-term fuel candidate. In addition, cellulosic biomass-derived ethanol may serve as a precursor to other fuels and chemicals that are currently derived from unsustainable sources and/or are proposed to be derived from cellulosic biomass. However, the processing cost for second-generation ethanol is still too high to make the process commercially profitable and replicable. In this review, recent trends in cellulosic biomass ethanol derived via the biochemical route are reviewed, with the main focus on current research efforts that are being undertaken to realize high product yields/titers and bring the overall cost down.

  17. Open Source Web Based Geospatial Processing with OMAR

    Directory of Open Access Journals (Sweden)

    Mark Lucas

    2009-01-01

    Full Text Available The availability of geospatial data sets is exploding. New satellites, aerial platforms, video feeds, global positioning system tagged digital photos, and traditional GIS information are dramatically increasing across the globe. These raw materials need to be dynamically processed, combined and correlated to generate value added information products to answer a wide range of questions. This article provides an overview of OMAR web based geospatial processing. OMAR is part of the Open Source Software Image Map project under the Open Source Geospatial Foundation. The primary contributors of OSSIM make their livings by providing professional services to US Government agencies and programs. OMAR provides one example that open source software solutions are increasingly being deployed in US government agencies. We will also summarize the capabilities of OMAR and its plans for near term development.

  18. Nuclear heat source design for an advanced HTGR process heat plant

    International Nuclear Information System (INIS)

    McDonald, C.F.; O'Hanlon, T.W.

    1983-01-01

    A high-temperature gas-cooled reactor (HTGR) coupled with a chemical process facility could produce synthetic fuels (i.e., oil, gasoline, aviation fuel, methanol, hydrogen, etc.) in the long term using low-grade carbon sources (e.g., coal, oil shale, etc.). The ultimate high-temperature capability of an advanced HTGR variant is being studied for nuclear process heat. This paper discusses a process heat plant with a 2240-MW(t) nuclear heat source, a reactor outlet temperature of 950°C, and a direct reforming process. The nuclear heat source outputs principally hydrogen-rich synthesis gas that can be used as a feedstock for synthetic fuel production. This paper emphasizes the design of the nuclear heat source and discusses the major components and a deployment strategy to realize an advanced HTGR process heat plant concept

  19. Warm Cleanup of Coal-Derived Syngas: Multicontaminant Removal Process Demonstration

    Energy Technology Data Exchange (ETDEWEB)

    Spies, Kurt A.; Rainbolt, James E.; Li, Xiaohong S.; Braunberger, Beau; Li, Liyu; King, David L.; Dagle, Robert A.

    2017-02-15

    Warm cleanup of coal- or biomass-derived syngas requires sorbent and catalytic beds to protect downstream processes and catalysts from fouling. Sulfur is particularly harmful because even parts-per-million amounts are sufficient to poison downstream synthesis catalysts. Zinc oxide (ZnO) is a conventional sorbent for sulfur removal; however, its operational performance using real gasifier-derived syngas and in an integrated warm cleanup process is not well reported. In this paper, we report the optimal temperature for bulk desulfurization to be 450°C, while removal of sulfur to parts-per-billion levels requires a lower temperature of approximately 350°C. Under these conditions, we found that sulfur in the form of both hydrogen sulfide and carbonyl sulfide could be absorbed equally well using ZnO. For long-term operation, sorbent regeneration is desirable to minimize process costs. Over the course of five sulfidation and regeneration cycles, a ZnO bed lost about a third of its initial sulfur capacity; however, the sorbent capacity then stabilized. Here, we also demonstrate, at the bench scale, a process and materials used for warm cleanup of coal-derived syngas using five operations: 1) Na2CO3 for HCl removal, 2) regenerable ZnO beds for bulk sulfur removal, 3) a second ZnO bed for trace sulfur removal, 4) a Ni-Cu/C sorbent for multi-contaminant inorganic removal, and 5) an Ir-Ni/MgAl2O4 catalyst employed for ammonia decomposition and tar and light hydrocarbon steam reforming. Syngas cleanup was demonstrated through successful long-term performance of a poison-sensitive, Cu-based, water-gas-shift catalyst placed downstream of the cleanup process train. The tar reformer is an important and necessary operation with this particular gasification system; its inclusion was the difference between deactivating the water-gas-shift catalyst with carbon deposition and successful 100-hour testing using 1 LPM of coal-derived syngas.

  20. Effect of interpolation error in pre-processing codes on calculations of self-shielding factors and their temperature derivatives

    International Nuclear Information System (INIS)

    Ganesan, S.; Gopalakrishnan, V.; Ramanadhan, M.M.; Cullen, D.E.

    1986-01-01

    We investigate the effect of interpolation error in the pre-processing codes LINEAR, RECENT and SIGMA1 on calculations of self-shielding factors and their temperature derivatives. We consider the 2.0347 to 3.3546 keV energy region for 238U capture, which is the NEACRP benchmark exercise on unresolved parameters. The calculated values of temperature derivatives of self-shielding factors are significantly affected by interpolation error. The sources of problems in both evaluated data and codes are identified and eliminated in the 1985 version of these codes. This paper helps to (1) inform code users to use only 1985 versions of LINEAR, RECENT, and SIGMA1 and (2) inform designers of other code systems where they may have problems and what to do to eliminate their problems. (author)
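    The following toy calculation (all numbers hypothetical, not from the benchmark) illustrates why the temperature derivatives are the sensitive quantity: a finite-difference derivative divides a small difference of self-shielding factors by the temperature step, so even a 0.1% interpolation error in the factors themselves can change the derivative by tens of percent:

```python
import numpy as np

def f_true(T):
    """Hypothetical smooth self-shielding factor versus temperature (illustrative)."""
    return 0.80 + 2.0e-5 * (T - 300.0)

def dfdT(f, T, dT=100.0):
    """Finite-difference temperature derivative of a self-shielding factor."""
    return (f(T + dT) - f(T)) / dT

T, exact = 300.0, 2.0e-5
eps = 1e-3 * f_true(T)                      # a 0.1% interpolation error on f
f_noisy = lambda T_: f_true(T_) + (eps if T_ > T else -eps)
print(f"exact df/dT = {exact:.2e}, with interpolation error = {dfdT(f_noisy, T):.2e}")
# prints ~3.6e-05 instead of 2.0e-05: a ~0.1% error in f becomes a ~80% error in df/dT
```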

  1. Effect of interpolation error in pre-processing codes on calculations of self-shielding factors and their temperature derivatives

    International Nuclear Information System (INIS)

    Ganesan, S.; Gopalakrishnan, V.; Ramanadhan, M.M.; Cullen, D.E.

    1985-01-01

    The authors investigate the effect of interpolation error in the pre-processing codes LINEAR, RECENT and SIGMA1 on calculations of self-shielding factors and their temperature derivatives. They consider the 2.0347 to 3.3546 keV energy region for 238U capture, which is the NEACRP benchmark exercise on unresolved parameters. The calculated values of temperature derivatives of self-shielding factors are significantly affected by interpolation error. The sources of problems in both evaluated data and codes are identified and eliminated in the 1985 version of these codes. This paper helps to (1) inform code users to use only 1985 versions of LINEAR, RECENT, and SIGMA1 and (2) inform designers of other code systems where they may have problems and what to do to eliminate their problems

  2. Source apportionment of PM10 and PM2.5 in major urban Greek agglomerations using a hybrid source-receptor modeling process.

    Science.gov (United States)

    Argyropoulos, G; Samara, C; Diapouli, E; Eleftheriadis, K; Papaoikonomou, K; Kungolos, A

    2017-12-01

    A hybrid source-receptor modeling process was assembled to apportion and infer source locations of PM10 and PM2.5 in three heavily-impacted urban areas of Greece during the warm period of 2011 and the cold period of 2012. The assembled process involved application of an advanced computational procedure, the so-called Robotic Chemical Mass Balance (RCMB) model. Source locations were inferred using two well-established probability functions: (a) the Conditional Probability Function (CPF), to correlate the output of RCMB with local wind directional data, and (b) the Potential Source Contribution Function (PSCF), to correlate the output of RCMB with 72-h air-mass back-trajectories arriving at the receptor sites during sampling. Regarding CPF, a higher-level conditional probability function was defined as well, from the common locus of CPF sectors derived for neighboring receptor sites. With respect to PSCF, a non-parametric bootstrapping method was applied to discriminate the statistically significant values. RCMB modeling showed that resuspended dust is actually one of the main barriers for attaining the European Union (EU) limit values in Mediterranean urban agglomerations, where the drier climate favors build-up. The shift in the energy mix of Greece (caused by the economic recession) was also evidenced, since biomass burning was found to contribute more significantly to the sampling sites belonging to the coldest climatic zone, particularly during the cold period. The CPF analysis showed that short-range transport of anthropogenic emissions from urban traffic to urban background sites was very likely to have occurred within all the examined urban agglomerations. The PSCF analysis confirmed that long-range transport of primary and/or secondary aerosols may indeed be possible, even from distances over 1000 km away from the study areas. Copyright © 2017 Elsevier B.V. All rights reserved.
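    A minimal sketch of the conditional probability function as it is commonly computed (the sector width, threshold quantile and synthetic inputs below are illustrative assumptions, not values from the study):

```python
import numpy as np

def conditional_probability_function(wind_dir_deg, contributions,
                                     sector_width=30.0, threshold_quantile=0.75):
    """For each wind sector, the fraction of samples in that sector whose source
    contribution exceeds a high threshold (here the 75th percentile)."""
    threshold = np.quantile(contributions, threshold_quantile)
    edges = np.arange(0.0, 360.0 + sector_width, sector_width)
    cpf = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_sector = (wind_dir_deg >= lo) & (wind_dir_deg < hi)
        n_total = in_sector.sum()
        n_high = (in_sector & (contributions > threshold)).sum()
        cpf.append(n_high / n_total if n_total else np.nan)
    return edges[:-1], np.array(cpf)

# Synthetic example: contributions inflated for winds from roughly 200-260 degrees.
rng = np.random.default_rng(1)
wd = rng.uniform(0, 360, 500)
contrib = rng.lognormal(1.0, 0.5, 500) * (1 + ((wd > 200) & (wd < 260)))
sectors, cpf = conditional_probability_function(wd, contrib)
print(dict(zip(sectors.astype(int), np.round(cpf, 2))))
```

In the study the contributions would be the RCMB source contributions per sample and the wind directions those measured at the receptor; PSCF replaces the wind sectors with grid cells crossed by the back-trajectories.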

  3. Tissue Source and Cell Expansion Condition Influence Phenotypic Changes of Adipose-Derived Stem Cells

    Science.gov (United States)

    Mangum, Lauren H.; Stone, Randolph; Wrice, Nicole L.; Larson, David A.; Florell, Kyle F.; Christy, Barbara A.; Herzig, Maryanne C.; Cap, Andrew P.

    2017-01-01

    Stem cells derived from the subcutaneous adipose tissue of debrided burned skin represent an appealing source of adipose-derived stem cells (ASCs) for regenerative medicine. Traditional tissue culture uses fetal bovine serum (FBS), which complicates utilization of ASCs in human medicine. Human platelet lysate (hPL) is one potential xeno-free, alternative supplement for use in ASC culture. In this study, adipogenic and osteogenic differentiation in media supplemented with 10% FBS or 10% hPL was compared in human ASCs derived from abdominoplasty (HAP) or from adipose associated with debrided burned skin (BH). Most (95–99%) cells cultured in FBS were stained positive for CD73, CD90, CD105, and CD142. FBS supplementation was associated with increased triglyceride content and expression of adipogenic genes. Culture in hPL significantly decreased surface staining of CD105 by 31% and 48% and CD142 by 27% and 35% in HAP and BH, respectively (p < 0.05). Culture of BH-ASCs in hPL also increased expression of markers of osteogenesis and increased ALP activity. These data indicate that application of ASCs for wound healing may be influenced by ASC source as well as culture conditions used to expand them. As such, these factors must be taken into consideration before ASCs are used for regenerative purposes. PMID:29138638

  4. Tissue Source and Cell Expansion Condition Influence Phenotypic Changes of Adipose-Derived Stem Cells

    Directory of Open Access Journals (Sweden)

    Lauren H. Mangum

    2017-01-01

    Full Text Available Stem cells derived from the subcutaneous adipose tissue of debrided burned skin represent an appealing source of adipose-derived stem cells (ASCs) for regenerative medicine. Traditional tissue culture uses fetal bovine serum (FBS), which complicates utilization of ASCs in human medicine. Human platelet lysate (hPL) is one potential xeno-free, alternative supplement for use in ASC culture. In this study, adipogenic and osteogenic differentiation in media supplemented with 10% FBS or 10% hPL was compared in human ASCs derived from abdominoplasty (HAP) or from adipose associated with debrided burned skin (BH). Most (95–99%) cells cultured in FBS were stained positive for CD73, CD90, CD105, and CD142. FBS supplementation was associated with increased triglyceride content and expression of adipogenic genes. Culture in hPL significantly decreased surface staining of CD105 by 31% and 48% and CD142 by 27% and 35% in HAP and BH, respectively (p < 0.05). Culture of BH-ASCs in hPL also increased expression of markers of osteogenesis and increased ALP activity. These data indicate that application of ASCs for wound healing may be influenced by ASC source as well as culture conditions used to expand them. As such, these factors must be taken into consideration before ASCs are used for regenerative purposes.

  5. Induced pluripotent stem cells (iPSCs) derived from different cell sources and their potential for regenerative and personalized medicine.

    Science.gov (United States)

    Shtrichman, R; Germanguz, I; Itskovitz-Eldor, J

    2013-06-01

    Human induced pluripotent stem cells (hiPSCs) have great potential as a robust source of progenitors for regenerative medicine. The novel technology also enables the derivation of patient-specific cells for applications to personalized medicine, such as for personal drug screening and toxicology. However, the biological characteristics of iPSCs are not yet fully understood and their similarity to human embryonic stem cells (hESCs) is still unresolved. Variations among iPSCs, resulting from their original tissue or cell source, and from the experimental protocols used for their derivation, significantly affect epigenetic properties and differentiation potential. Here we review the potential of iPSCs for regenerative and personalized medicine, and assess their expression pattern, epigenetic memory and differentiation capabilities in relation to their parental tissue source. We also summarize the patient-specific iPSCs that have been derived for applications in biological research and drug discovery; and review risks that must be overcome in order to use iPSC technology for clinical applications.

  6. Crystallographic data processing for free-electron laser sources

    International Nuclear Information System (INIS)

    White, Thomas A.; Barty, Anton; Stellato, Francesco; Holton, James M.; Kirian, Richard A.; Zatsepin, Nadia A.; Chapman, Henry N.

    2013-01-01

    A processing pipeline for diffraction data acquired using the ‘serial crystallography’ methodology with a free-electron laser source is described with reference to the crystallographic analysis suite CrystFEL and the pre-processing program Cheetah. A detailed analysis of the nature and impact of indexing ambiguities is presented. Simulations of the Monte Carlo integration scheme, which accounts for the partially recorded nature of the diffraction intensities, are presented and show that the integration of partial reflections could be made to converge more quickly if the bandwidth of the X-rays were to be increased by a small amount or if a slight convergence angle were introduced into the incident beam
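    The Monte Carlo integration mentioned above can be illustrated with a toy merge (this is only the principle, not CrystFEL's actual implementation): each snapshot contributes a randomly partial intensity per reflection, and averaging many snapshots converges to the true intensity times the mean partiality, so relative intensities are recovered:

```python
import numpy as np
from collections import defaultdict

def monte_carlo_merge(observations):
    """Average all observations of each Miller index (hkl) across snapshots."""
    sums, counts = defaultdict(float), defaultdict(int)
    for hkl, intensity in observations:      # one (hkl, I) pair per spot per frame
        sums[hkl] += intensity
        counts[hkl] += 1
    return {hkl: sums[hkl] / counts[hkl] for hkl in sums}

# Synthetic demonstration: one reflection of true intensity 100, observed with a
# random partiality on each of 10,000 snapshots.
rng = np.random.default_rng(0)
obs = [((1, 2, 3), 100.0 * rng.uniform(0.0, 1.0)) for _ in range(10_000)]
print(monte_carlo_merge(obs)[(1, 2, 3)])     # ~50 = true intensity x mean partiality
```

Widening the bandwidth or adding a slight beam convergence, as simulated in the paper, makes each observation capture more of the full reflection, which is consistent with the faster convergence reported.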

  7. Crystallographic data processing for free-electron laser sources

    Energy Technology Data Exchange (ETDEWEB)

    White, Thomas A., E-mail: taw@physics.org; Barty, Anton; Stellato, Francesco [DESY, Notkestrasse 85, 22607 Hamburg (Germany); Holton, James M. [University of California, San Francisco, CA 94158 (United States); Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Kirian, Richard A. [DESY, Notkestrasse 85, 22607 Hamburg (Germany); Arizona State University, Tempe, AZ 85287 (United States); Zatsepin, Nadia A. [Arizona State University, Tempe, AZ 85287 (United States); Chapman, Henry N. [DESY, Notkestrasse 85, 22607 Hamburg (Germany); University of Hamburg, Luruper Chaussee 149, 22761 Hamburg (Germany)

    2013-07-01

    A processing pipeline for diffraction data acquired using the ‘serial crystallography’ methodology with a free-electron laser source is described with reference to the crystallographic analysis suite CrystFEL and the pre-processing program Cheetah. A detailed analysis of the nature and impact of indexing ambiguities is presented. Simulations of the Monte Carlo integration scheme, which accounts for the partially recorded nature of the diffraction intensities, are presented and show that the integration of partial reflections could be made to converge more quickly if the bandwidth of the X-rays were to be increased by a small amount or if a slight convergence angle were introduced into the incident beam.

  8. Bone marrow-derived versus parenchymal sources of inducible nitric oxide synthase in experimental autoimmune encephalomyelitis

    DEFF Research Database (Denmark)

    Zehntner, Simone P; Bourbonniere, Lyne; Hassan-Zahraee, Mina

    2004-01-01

    These discrepancies may reflect a balance between immunoregulatory and neurocytopathologic roles for NO. We investigated selective effects of bone marrow-derived versus CNS parenchymal sources of iNOS in EAE in chimeric mice. Chimeras that selectively expressed or ablated iNOS in leukocytes both showed significant...

  9. Forms, Sources and Processes of Trust

    NARCIS (Netherlands)

    Nooteboom, B.

    2006-01-01

    This chapter reviews some key points in the analysis of trust, based on Nooteboom (2002). The following questions are addressed. What can we have trust in? What is the relation between trust and control? What are the sources of trust? And what are its limits? By what process is trust built up and broken

  10. 26 CFR 1.863-8 - Source of income derived from space and ocean activity under section 863(d).

    Science.gov (United States)

    2010-04-01

    Department of the Treasury (continued), Income Tax (continued), Income Taxes, Regulations Applicable to Taxable ... from sources without the United States to the extent the income, based on all the facts and ... income derived by a CFC is income from sources without the United States to the extent the income, based...

  11. Study of the Release Process of Open Source Software: Case Study

    OpenAIRE

    Eide, Tor Erik

    2007-01-01

    This report presents the results of a case study focusing on the release process of open source projects initiated with commercial motives. The purpose of the study is to gain an increased understanding of the release process, how a community can be attracted to the project, and how the interaction with the community evolves in commercial open source initiatives. Data has been gathered from four distinct sources to form the basis of this thesis. A thorough review of the open source literatu...

  12. Processing and characterization of diatom nanoparticles and microparticles as potential source of silicon for bone tissue engineering

    Energy Technology Data Exchange (ETDEWEB)

    Le, Thi Duy Hanh [Department of Industrial Engineering, University of Trento, Trento (Italy); BIOtech Research Center and European Institute of Excellence on Tissue Engineering and Regenerative Medicine, Trento (Italy); Bonani, Walter [Department of Industrial Engineering, University of Trento, Trento (Italy); BIOtech Research Center and European Institute of Excellence on Tissue Engineering and Regenerative Medicine, Trento (Italy); Interuniversity Consortium for Science and Technology of Materials, Trento Research Unit, Trento (Italy); Speranza, Giorgio [Center for Materials and Microsystems, PAM-SE, Fondazione Bruno Kessler, Trento (Italy); Sglavo, Vincenzo; Ceccato, Riccardo [Department of Industrial Engineering, University of Trento, Trento (Italy); Maniglio, Devid; Motta, Antonella [Department of Industrial Engineering, University of Trento, Trento (Italy); BIOtech Research Center and European Institute of Excellence on Tissue Engineering and Regenerative Medicine, Trento (Italy); Interuniversity Consortium for Science and Technology of Materials, Trento Research Unit, Trento (Italy); Migliaresi, Claudio, E-mail: claudio.migliaresi@unitn.it [Department of Industrial Engineering, University of Trento, Trento (Italy); BIOtech Research Center and European Institute of Excellence on Tissue Engineering and Regenerative Medicine, Trento (Italy); Interuniversity Consortium for Science and Technology of Materials, Trento Research Unit, Trento (Italy)

    2016-02-01

    Silicon plays an important role in bone formation and maintenance, improving osteoblast cell function and inducing mineralization. Often, bone deformation and long bone abnormalities have been associated with silica/silicon deficiency. Diatomite, a natural deposit of diatom skeletons, is a cheap and abundant source of biogenic silica. The aim of the present study is to validate the potential of diatom particles derived from diatom skeletons as silicon-donor materials for bone tissue engineering applications. Raw diatomite (RD) and calcined diatomite (CD) powders were purified by acid treatments, and diatom microparticles (MPs) and nanoparticles (NPs) were produced by fragmentation of purified diatoms under alkaline conditions. The influence of processing on the surface chemical composition of purified diatomites was evaluated by X-ray photoelectron spectroscopy (XPS). Diatom NPs were also characterized in terms of morphology and size distribution by transmission electron microscopy (TEM) and dynamic light scattering (DLS), while diatom MP morphology was analyzed by scanning electron microscopy (SEM). Surface area and microporosity of the diatom particles were evaluated by nitrogen physisorption methods. Release of silicon ions from diatom-derived particles was demonstrated using inductively coupled plasma optical emission spectrometry (ICP/OES); furthermore, the silicon release kinetics were found to be influenced by the diatomite purification method and particle size. Diatom-derived microparticles (MPs) and nanoparticles (NPs) showed limited or no cytotoxic effect in vitro depending on the administration conditions. - Highlights: • Diatomite is a natural source of silica and has potential as a silicon donor for bone regenerative applications. • Diatom particles derived from purified diatom skeletons were prepared by fragmentation under extreme alkaline conditions. • Dissolution of diatom particles derived from diatom skeletons in DI water depends on purification method

  13. Language identification using excitation source features

    CERN Document Server

    Rao, K Sreenivasa

    2015-01-01

    This book discusses the contribution of excitation source information in discriminating language. The authors focus on the excitation source component of speech for enhancement of language identification (LID) performance. Language-specific features are extracted using two different modes: (i) implicit processing of the linear prediction (LP) residual and (ii) explicit parameterization of the linear prediction residual. The book discusses how, in the implicit processing approach, excitation source features are derived from the LP residual, the Hilbert envelope (magnitude) of the LP residual, and the phase of the LP residual; and how, in the explicit parameterization approach, the LP residual signal is processed in the spectral domain to extract the relevant language-specific features. The authors further extract source features from these modes, which are combined to enhance the performance of LID systems. The proposed excitation source features are also investigated for LID in noisy background environments. Each chapter of this book provides the motivatio...
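    A minimal sketch of the implicit mode described above, using a synthetic voiced-speech stand-in so it runs without an audio file (the signal parameters and the LP order of 10 are illustrative assumptions, not the book's settings):

```python
import numpy as np
from scipy.signal import hilbert, lfilter
import librosa

# Synthetic "voiced speech": an 8 kHz impulse train (100 Hz pitch) shaped by a
# simple all-pole filter standing in for the vocal tract.
sr, f0 = 8000, 100
excitation = np.zeros(sr)
excitation[::sr // f0] = 1.0
signal = lfilter([1.0], [1.0, -1.3, 0.9], excitation)

order = 10
a = librosa.lpc(signal, order=order)        # LP coefficients [1, a1, ..., a_order]
residual = lfilter(a, [1.0], signal)        # inverse filtering gives the LP residual
analytic = hilbert(residual)
hilbert_envelope = np.abs(analytic)         # magnitude of the analytic residual
residual_phase = np.cos(np.angle(analytic)) # cosine of the analytic phase
print(hilbert_envelope.shape, residual_phase.shape)
```

Frame-level statistics of these two sequences (and, in the explicit mode, spectral parameters of the residual) would then feed the language-identification back end.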

  14. Infrapatellar Fat Pad: An Alternative Source of Adipose-Derived Mesenchymal Stem Cells

    Directory of Open Access Journals (Sweden)

    P. Tangchitphisut

    2016-01-01

    Full Text Available Introduction. The Infrapatellar fat pad (IPFP) represents an emerging alternative source of adipose-derived mesenchymal stem cells (ASCs). We compared the characteristics and differentiation capacity of ASCs isolated from IPFP and SC. Materials and Methods. ASCs were harvested from either IPFP or SC. IPFPs were collected from patients undergoing total knee arthroplasty (TKA), whereas subcutaneous tissues were collected from patients undergoing lipoaspiration. Immunophenotypes of surface antigens were evaluated. Their ability to form colony-forming units (CFUs) and their differentiation potential were determined. The ASC karyotype was evaluated. Results. There was no difference in the number of CFUs and size of CFUs between IPFP and SC sources. ASCs isolated from both sources had a normal karyotype. The mesenchymal stem cell (MSC) markers on flow cytometry were equivalent. IPFP-ASCs demonstrated significantly higher expression of SOX-9 and RUNX-2 over ASCs isolated from SC (6.19 ± 5.56-, 0.47 ± 0.62-fold; p value = 0.047, and 17.33 ± 10.80-, 1.56 ± 1.31-fold; p value = 0.030, resp.). Discussion and Conclusion. CFU assay of IPFP-ASCs and SC-ASCs harvested by lipoaspiration technique was equivalent. The expression of key chondrogenic and osteogenic genes was increased in cells isolated from IPFP. IPFP should be considered a high quality alternative source of ASCs.

  15. A stochastic post-processing method for solar irradiance forecasts derived from NWPs models

    Science.gov (United States)

    Lara-Fanego, V.; Pozo-Vazquez, D.; Ruiz-Arias, J. A.; Santos-Alamillos, F. J.; Tovar-Pescador, J.

    2010-09-01

    Solar irradiance forecasting is an important area of research for the future of solar-based renewable energy systems. Numerical Weather Prediction models (NWPs) have proved to be a valuable tool for solar irradiance forecasting with lead times up to a few days. Nevertheless, these models show low skill in forecasting the solar irradiance under cloudy conditions. Additionally, climatic (seasonally averaged) aerosol loadings are usually considered in these models, leading to considerable errors in the Direct Normal Irradiance (DNI) forecasts during high aerosol load conditions. In this work we propose a post-processing method for the Global Irradiance (GHI) and DNI forecasts derived from NWPs. Particularly, the method is based on the use of Autoregressive Moving Average with External Explanatory Variables (ARMAX) stochastic models. These models are applied to the residuals of the NWP forecasts and use as external variables the measured cloud fraction and aerosol loading of the day previous to the forecast. The method is evaluated on a set of one-month-long, three-day-ahead forecasts of GHI and DNI, obtained with the WRF mesoscale atmospheric model, for several locations in Andalusia (Southern Spain). The cloud fraction is derived from MSG satellite estimates and the aerosol loading from MODIS platform estimates. Both sources of information are readily available at the time of the forecast. Results showed a considerable improvement in the forecasting skill of the WRF model using the proposed post-processing method. Particularly, the relative improvement (in terms of the RMSE) for the DNI during summer is about 20%. A similar value is obtained for the GHI during the winter.
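    A minimal sketch of the residual-correction idea with an ARMAX model (implemented here via statsmodels' SARIMAX with exogenous regressors; the ARMA order, variable names and synthetic data are illustrative assumptions, not the paper's configuration):

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Synthetic stand-ins for measured GHI, the raw WRF forecast, and the
# previous-day satellite cloud fraction and MODIS aerosol optical depth.
n = 200
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "ghi_obs":    rng.uniform(200, 800, n),
    "ghi_wrf":    rng.uniform(200, 800, n),
    "cloud_frac": rng.uniform(0, 1, n),
    "aod":        rng.uniform(0, 0.6, n),
})
residual = df["ghi_obs"] - df["ghi_wrf"]          # NWP forecast error to be modelled

# ARMA(1,1) on the residuals with the two exogenous explanatory variables.
fit = SARIMAX(residual, exog=df[["cloud_frac", "aod"]], order=(1, 0, 1)).fit(disp=False)

# Corrected forecast = raw NWP forecast + modelled residual.
ghi_corrected = df["ghi_wrf"] + fit.fittedvalues
print(ghi_corrected.head())
```

For an operational three-day-ahead correction, the model would be refit (or its state updated) each day, with the forecast step using the exogenous values observed the day before.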

  16. Hydrocarbons from algal bodies and vegetal sources - a prognosticated assessment

    International Nuclear Information System (INIS)

    Sen, Subhasis; Sen, Meera; Sen, Nandita.

    1992-01-01

    Hydrogen-rich vegetal matter and other similar plant-derived sources are highlighted as a potential renewable source of hydrocarbons via a different route, i.e. low-temperature carbonization of the processed material followed by hydrogenation of the tar; the subsequent processing and fractionation of the products are also discussed. (P.R.K.)

  17. Radiation processing of chitosan derivative and its characteristics

    International Nuclear Information System (INIS)

    Kamarudin Bahari; Kamarolzaman Hussein; Kamaruddin Hashim; Khairul Zaman Mohd Dahlan

    2002-01-01

    Chitosan is a natural polymer derived from chitin, a polysaccharide found in the exoskeletons of shrimps, crabs, fungi and others. Chitosan is a naturally occurring substance that is chemically similar to cellulose. Chitosan possesses a positive ionic charge, which gives it the ability to chemically bond with negatively charged fats. Chitosan is soluble in organic acid but insoluble in water. Carboxymethyl-chitosan (cm-chitosan) is a derivative of chitosan which is water-soluble; it was prepared by a carboxymethylation process of chitosan produced from local shrimp shell. A simple method for the synthesis of cm-chitosan has been developed at 55°C in aqueous sodium hydroxide/propanol with chloroacetic acid (CAA) or sodium chloroacetate salt (SCA). The modification of chitosan to water-soluble chitosan can be used in hydrogels as an anti-bacterial agent, and it overcomes the problem of the bad smell associated with using acetic acid. (Author)

  18. Early and long-term mantle processing rates derived from xenon isotopes

    Science.gov (United States)

    Mukhopadhyay, S.; Parai, R.; Tucker, J.; Middleton, J. L.; Langmuir, C. H.

    2015-12-01

    Noble gases, particularly xenon (Xe), in mantle-derived basalts provide a rich portrait of mantle degassing and surface-interior volatile exchange. The combination of extinct and extant radioactive species in the I-Pu-U-Xe systems sheds light on the degassing history of the early Earth throughout accretion, as well as the long-term degassing of the Earth's interior in association with plate tectonics. The ubiquitous presence of shallow-level air contamination, however, frequently obscures the mantle Xe signal. In a majority of samples, shallow air contamination dominates the Xe budget. For example, in the gas-rich popping rock 2ΠD43, 129Xe/130Xe ratios reach 7.7±0.23 in individual step-crushes, but the bulk composition of the sample is close to air (129Xe/130Xe of 6.7). Thus, the extent of variability in mantle source Xe composition is not well constrained. Here, we present new MORB Xe data and explore the constraints placed on mantle processing rates by the Xe data. Ten step-crushes were obtained on a depleted popping glass that was sealed in ultrapure N2 after dredge retrieval from between the Kane and Atlantis Fracture Zones of the Mid-Atlantic Ridge in May 2012. Nine steps yielded 129Xe/130Xe of 7.50-7.67 and one yielded 7.3. The bulk 129Xe/130Xe of the sample is 7.6, nearly identical to the estimated mantle source value of 7.7 for the sample. Hence, the sample is virtually free of shallow-level air contamination. Because sealing the sample in N2 upon dredge retrieval largely eliminated air contamination, for many samples, contamination must be added after sample retrieval from the ocean bottom. Our new high-precision Xe isotopic measurements in upper mantle-derived samples provide improved constraints on the Xe isotopic composition of the mantle source. We developed a forward model of mantle volatile evolution to identify solutions that satisfy our Xe isotopic data. We find that accretion timescales of ~10±5 Myr are consistent with I-Pu-Xe constraints, and the last

  19. Dispersal from deep ocean sources: physical and related scientific processes

    International Nuclear Information System (INIS)

    Robinson, A.R.; Kupferman, S.L.

    1985-02-01

    This report presents the results of the workshop "Dispersal from Deep Ocean Sources: Physical and Related Scientific Processes," together with subsequent developments and syntheses of the material discussed there. The project was undertaken to develop usable predictive descriptions of dispersal from deep oceanic sources. Relatively simple theoretical models embodying modern ocean physics were applied, and observational and experimental data bases were exploited. All known physical processes relevant to the dispersal of passive, conservative tracers were discussed, and contact points for inclusion of nonconservative processes (biological and chemical) were identified. Numerical estimates of the amplitude, space, and time scales of dispersion were made for various mechanisms that control the evolution of the dispersal as the material spreads from a bottom point source to small-, meso-, and world-ocean scales. Recommendations for additional work are given. The volume is presented as a handbook of dispersion processes. It is intended to be updated as new results become available

  20. Coda-derived source spectra, moment magnitudes and energy-moment scaling in the western Alps

    Science.gov (United States)

    Morasca, P.; Mayeda, K.; Malagnini, L.; Walter, William R.

    2005-01-01

    A stable estimate of the earthquake source spectra in the western Alps is obtained using an empirical method based on coda envelope amplitude measurements described by Mayeda et al. for events ranging between MW ~1.0 and ~5.0. Path corrections for consecutive narrow frequency bands ranging between 0.3 and 25.0 Hz were included using a simple 1-D model for five three-component stations of the Regional Seismic network of Northwestern Italy (RSNI). The 1-D assumption performs well, even though the region is characterized by a complex structural setting involving strong lateral variations in the Moho depth. For frequencies less than 1.0 Hz, we tied our dimensionless, distance-corrected coda amplitudes to an absolute scale in units of dyne cm by using independent moment magnitudes from long-period waveform modelling for three moderate magnitude events in the region. For the higher frequencies, we used small events as empirical Green's functions, with corner frequencies above 25.0 Hz. For each station, the procedure yields frequency-dependent corrections that account for site effects, including those related to fmax, as well as S-to-coda transfer function effects. After the calibration was completed, the corrections were applied to the entire data set composed of 957 events. Our findings using the coda-derived source spectra are summarized as follows: (i) we derived stable estimates of seismic moment, M0 (and hence MW), as well as radiated S-wave energy, ES, from waveforms recorded by as few as one station, for events that were too small to be waveform modelled (i.e. events less than MW ~3.5); (ii) the source spectra were used to derive an equivalent local magnitude, ML(coda), that is in excellent agreement with the network-averaged values using direct S waves; (iii) scaled energy, ER/M0, where ER is the radiated seismic energy, is comparable to results from other tectonically active regions (e.g. western USA, Japan) and supports the idea that there is a fundamental
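    For reference, the conversions presumably behind the quantities quoted above are the standard Hanks-Kanamori moment magnitude (with M0 in dyne cm, as stated in the abstract) and the usual definition of scaled energy; the notation below is ours:

```latex
\[
  M_W \;=\; \tfrac{2}{3}\,\log_{10} M_0 \;-\; 10.7 ,
  \qquad
  \tilde{e} \;=\; \frac{E_R}{M_0},
\]
% where M_0 is the seismic moment in dyne cm and E_R the radiated S-wave energy
% obtained by integrating the coda-derived source spectrum.
```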

  1. Processes for converting biomass-derived feedstocks to chemicals and liquid fuels

    Science.gov (United States)

    Held, Andrew; Woods, Elizabeth; Cortright, Randy; Gray, Matthew

    2018-04-17

    The present invention provides processes, methods, and systems for converting biomass-derived feedstocks to liquid fuels and chemicals. The method generally includes the reaction of a hydrolysate from a biomass deconstruction process with hydrogen and a catalyst to produce a reaction product comprising one or more oxygenated compounds. The process also includes reacting the reaction product with a condensation catalyst to produce C.sub.4+ compounds useful as fuels and chemicals.

  2. Processes for converting biomass-derived feedstocks to chemicals and liquid fuels

    Science.gov (United States)

    Held, Andrew; Woods, Elizabeth; Cortright, Randy; Gray, Matthew

    2017-05-23

    The present invention provides processes, methods, and systems for converting biomass-derived feedstocks to liquid fuels and chemicals. The method generally includes the reaction of a hydrolysate from a biomass deconstruction process with hydrogen and a catalyst to produce a reaction product comprising one or more oxygenated compounds. The process also includes reacting the reaction product with a condensation catalyst to produce C.sub.4+ compounds useful as fuels and chemicals.

  3. Application of large radiation sources in chemical processing industry

    International Nuclear Information System (INIS)

    Krishnamurthy, K.

    1977-01-01

    Large radiation sources and their application in the chemical processing industry are described. Reference has also been made to present developments in this field in India. Radioactive sources, notably 60Co, are employed in the production of wood-plastic and concrete-polymer composites, vulcanised rubbers, polymers, sulfochlorinated paraffin hydrocarbons and in a number of other applications which require deep penetration and high reliability of the source. Machine sources of electrons are used in the production of heat-shrinkable plastics, insulation materials for cables, curing of paints etc. Radiation sources have also been used for sewage hygienisation. As for the scene in India, 60Co sources, gamma chambers and batch irradiators are manufactured. A list of the on-going R and D projects and organisations engaged in research in this field is given. (M.G.B.)

  4. An open source Bayesian Monte Carlo isotope mixing model with applications in Earth surface processes

    Science.gov (United States)

    Arendt, Carli A.; Aciego, Sarah M.; Hetland, Eric A.

    2015-05-01

    The implementation of isotopic tracers as constraints on source contributions has become increasingly relevant to understanding Earth surface processes. Interpretation of these isotopic tracers has become more accessible with the development of Bayesian Monte Carlo (BMC) mixing models, which allow for uncertainty in mixing end-members and provide methodology for systems with multicomponent mixing. This study presents an open source multiple-isotope BMC mixing model that is applicable to Earth surface environments with sources exhibiting distinct end-member isotopic signatures. Our model is first applied to new δ18O and δD measurements from the Athabasca Glacier, which showed expected seasonal melt evolution trends, and the statistical relevance of the resulting fraction estimates was rigorously assessed. To highlight the broad applicability of our model to a variety of Earth surface environments and relevant isotopic systems, we expand our model to two additional case studies: deriving melt sources from δ18O, δD, and 222Rn measurements of Greenland Ice Sheet bulk water samples and assessing nutrient sources from ɛNd and 87Sr/86Sr measurements of Hawaiian soil cores. The model produces results for the Greenland Ice Sheet and Hawaiian soil data sets that are consistent with the originally published fractional contribution estimates. The advantage of this method is that it quantifies the error induced by variability in the end-member compositions, which was not captured by the models previously applied to the above case studies. Results from all three case studies demonstrate the broad applicability of this statistical BMC isotopic mixing model for estimating source contribution fractions in a variety of Earth surface systems.
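
    As an illustration of the BMC mixing idea described above (a sketch only, with made-up end-member values, not the published model or its data), the snippet below propagates end-member uncertainty through a two-component δ18O mass balance by Monte Carlo sampling and reports the resulting distribution of source fractions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    # Hypothetical two end-member mixing in d18O (illustrative values only):
    # seasonal snow melt versus glacial ice contributing to bulk meltwater.
    d18O_snow = rng.normal(-22.0, 1.0, n)   # end-member A with its uncertainty
    d18O_ice  = rng.normal(-30.0, 1.5, n)   # end-member B with its uncertainty
    d18O_obs  = rng.normal(-26.0, 0.5, n)   # observed mixture with measurement error

    # Two-component mass balance: d_obs = f*d_A + (1-f)*d_B  ->  f = (d_obs - d_B)/(d_A - d_B)
    f = (d18O_obs - d18O_ice) / (d18O_snow - d18O_ice)
    f = f[(f >= 0.0) & (f <= 1.0)]          # keep physically meaningful mixtures

    print(f"snow fraction: median {np.median(f):.2f}, "
          f"68% interval [{np.percentile(f, 16):.2f}, {np.percentile(f, 84):.2f}]")
    ```

    The width of the reported interval is exactly the error induced by end-member variability that the abstract emphasises.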

  5. Evaluation of the influence of source and spatial resolution of DEMs on derivative products used in landslide mapping

    Directory of Open Access Journals (Sweden)

    Rubini Mahalingam

    2016-11-01

    Full Text Available Landslides are a major geohazard, which result in significant human, infrastructure, and economic losses. Landslide susceptibility mapping can help communities plan and prepare for these damaging events. Digital elevation models (DEMs) are one of the most important data-sets used in landslide hazard assessment. Despite their frequent use, limited research has been completed to date on how the DEM source and spatial resolution can influence the accuracy of the produced landslide susceptibility maps. The aim of this paper is to analyse the influence of the spatial resolution and source of DEMs on landslide susceptibility mapping. For this purpose, Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), National Elevation Dataset (NED), and Light Detection and Ranging (LiDAR) DEMs were obtained for two study sections of approximately 140 km2 in north-west Oregon. Each DEM was resampled to 10, 30, and 50 m, and slope and aspect grids were derived for each resolution. A set of nine spatial databases was constructed using geoinformation science (GIS) for each spatial resolution and source. Additional factors such as distance-to-river and fault maps were included. An analytical hierarchical process (AHP), a fuzzy logic model, and a likelihood ratio-AHP, representing qualitative, quantitative, and hybrid landslide mapping techniques, were used for generating landslide susceptibility maps. The results from each of the techniques were verified with Cohen's kappa index, a confusion matrix, and a validation index based on agreement with detailed landslide inventory maps. The 10 m resolution DEM derived from the LiDAR data-set showed higher predictive accuracy in all three techniques used for producing landslide susceptibility maps. At a resolution of 10 m, the output maps based on NED and ASTER had higher misclassification compared to the LiDAR-based outputs. Further, the 30-m LiDAR output showed improved results over the 10-m NED and 10-m
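
    For readers unfamiliar with the AHP step mentioned above, the sketch below shows the standard way factor weights are derived from a pairwise comparison matrix (principal eigenvector plus a consistency check). The matrix entries and factor names are hypothetical, not the study's actual judgements.

    ```python
    import numpy as np

    # Illustrative pairwise comparison matrix for four susceptibility factors:
    # slope, aspect, distance-to-river, distance-to-fault (values are invented).
    # Entry [i, j] states how much more important factor i is than factor j.
    A = np.array([
        [1.0,  3.0,  5.0, 5.0],
        [1/3., 1.0,  3.0, 3.0],
        [1/5., 1/3., 1.0, 1.0],
        [1/5., 1/3., 1.0, 1.0],
    ])

    # Factor weights = normalised principal eigenvector of the comparison matrix.
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()

    # Saaty consistency ratio (random index RI = 0.90 for a 4x4 matrix).
    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)
    print("weights:", np.round(weights, 3), " consistency ratio:", round(ci / 0.90, 3))
    ```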

  6. C-188 Co-60 sources installation and source rack loading optimization processes in a gamma irradiation facility

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Paulo de S.; Vasquez, Pablo A.S., E-mail: psantos@ipen.br, E-mail: pavsalva@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2017-11-01

    Since 2004, the Multipurpose Gamma Facility at the Nuclear and Energy Research Institute has been providing radiation processing services for the disinfection and sterilization of health care and disposable medical products, as well as supporting research studies on the modification of physical, chemical and biological properties of several materials. Recently, there has been an increase in the irradiation of cultural heritage objects. This facility uses C-188 double-encapsulated radioactive Cobalt-60 sources, known as pencils, from manufacturers outside the country. The Cobalt-60 decays into a stable nickel isotope with a half-life of around 5.27 years, which corresponds to an activity loss of 12.3% annually. Additional Cobalt-60 pencils are therefore added periodically to the source rack to maintain the required capacity, or installed activity, of the facility. The manufacturer ships the radioactive sources by sea inside a high-density type B(U) container; this shipment involves many administrative, transport and radiation safety procedures. Once in the facility, the container is opened inside a deep water pool to remove the pencils. The required source geometry of the facility is obtained by loading these source pencils into predetermined positions in source modules and distributing these modules over the source rack of the facility. The dose variation can be reduced by placing the higher-activity source pencils near the periphery of the source rack. This work presents the procedures for performing the boiling leaching tests applied to the container, the Cobalt-60 source installation, the loading processes and the source rack loading optimization. (author)
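
    A quick numerical check of the decay figures quoted above (the example installed load is hypothetical, not the facility's actual inventory):

    ```python
    import math

    T_HALF_YEARS = 5.27                       # Co-60 half-life, as stated in the record
    annual_loss = 1.0 - 2.0 ** (-1.0 / T_HALF_YEARS)
    print(f"annual activity loss: {annual_loss * 100:.1f}%")   # ~12.3%, as stated

    def remaining_activity(a0, years):
        """Source activity after 'years' of decay, given an initial activity a0."""
        return a0 * 2.0 ** (-years / T_HALF_YEARS)

    # Hypothetical example: a 1,000,000 Ci installed load after 3 years.
    print(f"after 3 years: {remaining_activity(1.0e6, 3):,.0f} Ci")
    ```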

  7. Cadmium isotope fractionation of materials derived from various industrial processes

    Energy Technology Data Exchange (ETDEWEB)

    Martinková, Eva, E-mail: eva.cadkova@geology.cz [Czech Geological Survey, Geologická 6, 152 00 Prague 5 (Czech Republic); Chrastný, Vladislav, E-mail: chrastny@fzp.czu.cz [Faculty of Environmental Sciences, Czech University of Life Sciences Prague, Kamýcká 129, 165 21 Prague 6 (Czech Republic); Francová, Michaela, E-mail: michaela.francova@fzp.czu.cz [Faculty of Environmental Sciences, Czech University of Life Sciences Prague, Kamýcká 129, 165 21 Prague 6 (Czech Republic); Šípková, Adéla, E-mail: adela.sipkova@geology.cz [Czech Geological Survey, Geologická 6, 152 00 Prague 5 (Czech Republic); Čuřík, Jan, E-mail: jan.curik@geology.cz [Czech Geological Survey, Geologická 6, 152 00 Prague 5 (Czech Republic); Myška, Oldřich, E-mail: oldrich.myska@geology.cz [Czech Geological Survey, Geologická 6, 152 00 Prague 5 (Czech Republic); Mižič, Lukáš, E-mail: lukas.mizic@geology.cz [Czech Geological Survey, Geologická 6, 152 00 Prague 5 (Czech Republic)

    2016-01-25

    Highlights: • All studied industrial processes were accompanied by Cd isotope fractionation. • ε114/110Cd values of the waste materials were discernible from primary sources. • The technology in use plays an important role in Cd isotope fractionation. - Abstract: Our study presents ε114/110CdNIST3108 values of materials resulting from anthropogenic activities such as coal burning, smelting, refining, metal coating, and the glass industry. Additionally, primary sources (ore samples, pigment, coal) processed in the industrial premises were studied. Two sphalerites, galena, coal and pigment samples exhibited ε114/110CdNIST3108 values of 1.0 ± 0.2, 0.2 ± 0.2, 1.3 ± 0.1, −2.3 ± 0.2 and −0.1 ± 0.3, respectively. In general, all studied industrial processes were accompanied by Cd isotope fractionation. Most of the industrial materials studied were clearly distinguishable from the samples used as a primary source based on ε114/110CdNIST3108 values. The heaviest ε114/110CdNIST3108 value of 58.6 ± 0.9 was found for slag resulting from coal combustion, and the lightest ε114/110CdNIST3108 value of −23 ± 2.5 was observed for waste material after Pb refinement. It is evident that ε114/110CdNIST3108 values depend on the technological processes, and in the case of incomplete Cd transfer from source to final waste material, every industrial activity creates differences in Cd isotope composition. Our results show that Cd isotope analysis is a promising tool for tracking the origins of industrial waste products.

  8. Derivation of the source term, dose results and associated radiological consequences for the Greek Research Reactor – 1

    Energy Technology Data Exchange (ETDEWEB)

    Pappas, Charalampos, E-mail: chpappas@ipta.demokritos.gr; Ikonomopoulos, Andreas; Sfetsos, Athanasios; Andronopoulos, Spyros; Varvayanni, Melpomeni; Catsaros, Nicolas

    2014-07-01

    Highlights: • Source term derivation of postulated accident sequences in a research reactor. • Various containment ventilation scenarios considered for source term calculations. • Source term parametric analysis performed in case of lack of ventilation. • JRODOS employed for dose calculations under eighteen modeled scenarios. • Estimation of radiological consequences during typical and adverse weather scenarios. - Abstract: The estimated source term, dose results and radiological consequences of selected accident sequences in the Greek Research Reactor – 1 are presented and discussed. A systematic approach has been adopted to perform the necessary calculations in accordance with the latest computational developments and IAEA recommendations. Loss-of-coolant, reactivity insertion and fuel channel blockage accident sequences have been selected to derive the associated source terms under three distinct containment ventilation scenarios. Core damage has been conservatively assessed for each accident sequence while the ventilation has been assumed to function within the efficiency limits defined at the Safety Analysis Report. In case of lack of ventilation a parametric analysis is also performed to examine the dependency of the source term on the containment leakage rate. A typical as well as an adverse meteorological scenario have been defined in the JRODOS computational platform in order to predict the effective, lung and thyroid doses within a region defined by a 15 km radius downwind from the reactor building. The radiological consequences of the eighteen scenarios associated with the accident sequences are presented and discussed.

  9. Photoperiod Regulates vgf-Derived Peptide Processing in Siberian Hamsters.

    Directory of Open Access Journals (Sweden)

    Barbara Noli

    Full Text Available VGF mRNA is induced in specific hypothalamic areas of the Siberian hamster upon exposure to short photoperiods, which is associated with a seasonal decrease in appetite and weight loss. Processing of VGF generates multiple bioactive peptides, so the objective of this study was to determine the profile of the VGF-derived peptides in the brain, pituitary and plasma from Siberian hamsters, and to establish whether differential processing might occur in the short-day lean state versus the long-day fat state. Antisera against short sequences at the C- or N-termini of proVGF, as well as against NERP-1, TPGH and TLQP peptides, were used for analyses of tissues, and both immunohistochemistry and enzyme-linked immunosorbent assay (ELISA) coupled with high-performance liquid chromatography (HPLC) or gel chromatography were carried out. VGF peptide immunoreactivity was found within cortex cholinergic perikarya, in multiple hypothalamic nuclei, including those containing vasopressin, and in pituitary gonadotrophs. ELISA revealed that exposure to a short-day photoperiod led to a down-regulation of VGF immunoreactivity in the cortex, and a less pronounced decrease in the hypothalamus and pituitary, while plasma VGF levels were not affected by the photoperiod. HPLC and gel chromatography both confirmed the presence of multiple VGF-derived peptides in these tissues, while gel chromatography showed the presence of the VGF precursor in all tissues tested except for the cortex. These observations are consistent with the view that VGF-derived peptides have pleiotropic actions related to changing photoperiod, possibly by regulating cholinergic systems in the cortex, vasopressin hypothalamic pathways, and the reproductive axis.

  10. A bio-based ‘green’ process for catalytic adipic acid production from lignocellulosic biomass using cellulose and hemicellulose derived γ-valerolactone

    International Nuclear Information System (INIS)

    Han, Jeehoon

    2016-01-01

    Highlights: • A bio-based ‘green’ process for the catalytic conversion of corn stover to adipic acid (ADA) is studied. • New separations for effective recovery of biomass derivatives are developed. • Separations are integrated with cellulose/hemicellulose-to-ADA conversions. • The proposed process can compete economically with the current petro-based process. - Abstract: A bio-based ‘green’ process is presented for the catalytic conversion of corn stover to adipic acid (ADA), based on experimental studies. ADA is used for bio-based nylon 6,6 manufacturing, with lignocellulosics as the carbon and energy source. In this process, the cellulose and hemicellulose fractions are catalytically converted to γ-valerolactone (GVL), using cellulose- and hemicellulose-derived GVL as a solvent, and subsequently upgraded to ADA. Experimental studies showed maximal carbon yields (biomass-to-GVL: 41% and GVL-to-ADA: 46%) at low concentrations (below 16 wt% solids) using large volumes of GVL solvent, while requiring efficient interstage separations and product recovery. This work presents an integrated process, including catalytic conversion and separation subsystems for GVL and ADA production and recovery, and designs a heat exchanger network to satisfy the total energy requirements of the integrated process via combustion of biomass residues (lignin and humins). Finally, an economic analysis shows that processing 2000 metric tonnes (Mt) per day of corn stover feedstock results in a minimum selling price of $633 per Mt when using the best possible parameters.

  11. Computer program for source distribution process in radiation facility

    International Nuclear Information System (INIS)

    Al-Kassiri, H.; Abdul Ghani, B.

    2007-08-01

    Computer simulation of dose distribution using Visual Basic has been carried out according to the arrangement and activities of Co-60 sources. The program provides the dose distribution in treated products depending on the product density and the desired dose, and is useful for optimization of the source distribution during the loading process. There is good agreement between the data calculated by the program and experimental data. (Author)

  12. Modelling of H.264 MPEG2 TS Traffic Source

    Directory of Open Access Journals (Sweden)

    Stanislav Klucik

    2013-01-01

    Full Text Available This paper deals with IPTV traffic source modelling. Traffic sources are used for simulation, emulation and real network testing. The model is derived from known recorded traffic sources that are analysed and statistically processed. As the results show, when used in a simulated network the proposed model produces network traffic parameters very similar to those of the known traffic source.
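
    A minimal sketch of the general approach described above (analyse a recorded trace statistically, then regenerate synthetic traffic from it), using stand-in numbers rather than a real H.264 MPEG2-TS capture: an empirical inter-arrival histogram is built from a recorded trace and then sampled to drive a simulated source.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Stand-in for packet inter-arrival gaps (ms) extracted from a recorded capture.
    recorded_gaps_ms = np.abs(rng.normal(1.2, 0.4, 10_000))

    # Empirical distribution of the recorded gaps.
    hist, edges = np.histogram(recorded_gaps_ms, bins=50, density=True)
    probs = hist * np.diff(edges)
    probs /= probs.sum()

    def synthetic_gaps(n):
        """Sample inter-arrival gaps from the empirical distribution (bin choice plus uniform jitter within the bin)."""
        idx = rng.choice(len(probs), size=n, p=probs)
        return rng.uniform(edges[idx], edges[idx + 1])

    print("recorded mean gap :", recorded_gaps_ms.mean())
    print("synthetic mean gap:", synthetic_gaps(10_000).mean())
    ```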

  13. Broad beam ion sources and some surface processes

    International Nuclear Information System (INIS)

    Neumann, H.; Scholze, F.; Tarz, M.; Schindler, A.; Wiese, R.; Nestler, M.; Blum, T.

    2005-01-01

    Modern broad-beam multi-aperture ion sources are widely used in material and surface technology applications. Customizing the generated ion beam properties (i.e. the ion current density profile) for the specific demands of the application is a main challenge in the improvement of ion beam technologies. First, we briefly introduce ion sources based on different plasma excitation principles. An overview of source plasma and ion beam measurement methods delivers input data for modelling. This beam profile modelling, using numerical trajectory codes, and the validation of the results by Faraday cup measurements as a basis for ion beam profile design are described. Furthermore, possibilities for ex situ and in situ beam profile control are demonstrated, such as a special method for in situ control of a linear ion source beam profile, a grid modification for circular beam profile design, and a cluster principle for broad beam sources. By means of these methods, the beam shape may be adapted to specific technological demands. Examples of broad beam source application in ion beam figuring of optical surfaces, modification of stainless steel, photovoltaic processes and deposition of EUVL multilayer stacks are finally presented. (Author)

  14. Optimization of industrial processes using radiation sources

    International Nuclear Information System (INIS)

    Salles, Claudio G.; Silva Filho, Edmundo D. da; Toribio, Norberto M.; Gandara, Leonardo A.

    1996-01-01

    Aiming to enhance staff protection against radiation in operational areas, SAMARCO Mineracao S.A. carried out a reevaluation and analysis of the real necessity of the densimeters/radioactive sources in the operational area, developed an alternative control process for measuring the ore pulp, and introduced advanced equipment for sample chemical analysis.

  15. Tritium labelled steroids, preparation process and application to synthesis of tritium labelled estrane derivatives

    International Nuclear Information System (INIS)

    1978-01-01

    Process for preparing new steroids labelled with tritium at the 6,7-positions and comprising at the 3-position a ketone group blocked as a ketal, thioketal or derivatives. Application of these products to the synthesis of tritium-labelled estrane derivatives [fr]

  16. Bioaccumulation of hydrocarbons derived from terrestrial and anthropogenic sources in the Asian clam, Potamocorbula amurensis, in San Francisco Bay estuary

    Science.gov (United States)

    Pereira, Wilfred E.; Hostettler, Frances D.; Rapp, John B.

    1992-01-01

    An assessment was made in Suisun Bay, California, of the distributions of hydrocarbons in estuarine bed and suspended sediments and in the recently introduced Asian clam, Potamocorbula amurensis. Sediments and clams were contaminated with hydrocarbons derived from petrogenic and pyrogenic sources. Distributions of alkanes and of hopane and sterane biomarkers in sediments and clams were similar, indicating that petroleum hydrocarbons associated with sediments are bioavailable to Potamocorbula amurensis. Polycyclic aromatic hydrocarbons in the sediments and clams were derived mainly from combustion sources. Potamocorbula amurensis is therefore a useful bioindicator of hydrocarbon contamination, and may be used as a biomonitor of hydrocarbon pollution in San Francisco Bay.

  17. Evaluating sources and processing of nonpoint source nitrate in a small suburban watershed in China

    Science.gov (United States)

    Han, Li; Huang, Minsheng; Ma, Minghai; Wei, Jinbao; Hu, Wei; Chouhan, Seema

    2018-04-01

    Identifying nonpoint sources of nitrate has been a long-term challenge in mixed land-use watersheds. In the present study, we combine dual nitrate isotopes with runoff and stream water monitoring to elucidate the nonpoint nitrate sources across land uses, and to determine the relative importance of biogeochemical processes for nitrate export in a small suburban watershed, Longhongjian watershed, China. Our study suggested that NH4+ fertilizer, soil NH4+, litter fall and groundwater were the main nitrate sources in Longhongjian Stream. There were large changes in nitrate sources in response to season and land use. Runoff analysis illustrated that the tea plantation and forest areas contributed a dominant proportion of the TN export. Spatial analysis illustrated that the NO3- concentration was high in the tea plantation and forest areas, and that δ15N-NO3 and δ18O-NO3 were enriched in the step ponds. Temporal analysis showed high NO3- levels in spring, and nitrate isotopes were enriched in summer. The study also showed that the step ponds played an important role in mitigating nitrate pollution. Nitrification and plant uptake were the significant biogeochemical processes contributing to nitrogen transformation, and denitrification hardly occurred in the stream.

  18. Sources of Information as Determinants of Product and Process Innovation

    OpenAIRE

    Gómez, Jaime; Salazar, Idana; Vargas, Pilar

    2016-01-01

    In this paper we use a panel of manufacturing firms in Spain to examine the extent to which they use internal and external sources of information (customers, suppliers, competitors, consultants and universities) to generate product and process innovation. Our results show that, although internal sources are influential, external sources of information are key to achieve innovation performance. These results are in line with the open innovation literature because they show that firms that are ...

  19. Pure sources and efficient detectors for optical quantum information processing

    Science.gov (United States)

    Zielnicki, Kevin

    Over the last sixty years, classical information theory has revolutionized the understanding of the nature of information, and how it can be quantified and manipulated. Quantum information processing extends these lessons to quantum systems, where the properties of intrinsic uncertainty and entanglement fundamentally defy classical explanation. This growing field has many potential applications, including computing, cryptography, communication, and metrology. As inherently mobile quantum particles, photons are likely to play an important role in any mature large-scale quantum information processing system. However, the available methods for producing and detecting complex multi-photon states place practical limits on the feasibility of sophisticated optical quantum information processing experiments. In a typical quantum information protocol, a source first produces an interesting or useful quantum state (or set of states), perhaps involving superposition or entanglement. Then, some manipulations are performed on this state, perhaps involving quantum logic gates which further manipulate or entangle the initial state. Finally, the state must be detected, obtaining some desired measurement result, e.g., for secure communication or computationally efficient factoring. The work presented here concerns the first and last stages of this process as they relate to photons: sources and detectors. Our work on sources is based on the need for optimized non-classical states of light delivered at high rates, particularly of single photons in a pure quantum state. We seek to better understand the properties of spontaneous parametric downconversion (SPDC) sources of photon pairs, and in doing so, produce such an optimized source. We report an SPDC source which produces pure heralded single photons with little or no spectral filtering, allowing a significant rate enhancement. Our work on detectors is based on the need to reliably measure single-photon states. We have focused on

  20. Radiation sources and process

    International Nuclear Information System (INIS)

    Honious, H.B.; Janzow, E.F.; Malson, H.A.; Moyer, S.E.

    1980-01-01

    The invention relates to radiation sources comprising a substrate having an electrically-conductive non-radioactive metal surface; a layer of a metal radioactive isotope of the scandium group, which in addition to scandium, yttrium, lanthanum and actinium includes all the lanthanide and actinide series of elements (with the actinide series usually being preferred because of the nature of the radioactive isotopes therein, particularly americium-241, curium-244, plutonium-238, californium-252 and promethium-147); and a non-radioactive bonding metal codeposited on the surface by electroplating the isotope and bonding metal from an electrolytic solution, the isotope being present in the layer in minor amount as compared to the bonding metal, with or without a non-radioactive protective metal coating covering the isotope and bonding metal on the surface, the coating being sufficiently thin to permit radiation to pass through it. The invention also relates to a process for providing such radiation sources, comprising codepositing a layer of the metal radioactive isotope with a non-radioactive bonding metal from an electrolytic solution, in which the isotope is present in minor molar amount as compared to the bonding metal, by electroplating on an electrically-conductive non-radioactive metal surface of a cathode substrate, such that the codeposited layer contains a minor molar amount of the isotope compared to the bonding metal, with or without depositing a non-radioactive protective metal coating over the isotope and bonding metal on the surface, the coating being sufficiently thin to permit radiation to pass through it.

  1. Source Water Protection Contaminant Sources

    Data.gov (United States)

    Iowa State University GIS Support and Research Facility — Simplified aggregation of potential contaminant sources used for Source Water Assessment and Protection. The data is derived from IDNR, IDALS, and US EPA program...

  2. Binary stars as sources of iron and of s-process isotopes

    International Nuclear Information System (INIS)

    Iben, Icko Jr.; Bologna Univ.; Sussex Univ., Brighton

    1986-01-01

    Sources of elements and isotopes in stars, during the development of stars, are examined. The paper was presented to the conference on ''The early universe and its evolution'', Erice, Italy, 1986. Intermediate mass stars in their asymptotic giant branch phase of evolution as sources of carbon, merging white dwarfs as sources of iron, and helium star cataclysmics as sources of s-process elements are all discussed. (U.K.)

  3. The Chandra Source Catalog: Algorithms

    Science.gov (United States)

    McDowell, Jonathan; Evans, I. N.; Primini, F. A.; Glotfelty, K. J.; McCollough, M. L.; Houck, J. C.; Nowak, M. A.; Karovska, M.; Davis, J. E.; Rots, A. H.; Siemiginowska, A. L.; Hain, R.; Evans, J. D.; Anderson, C. S.; Bonaventura, N. R.; Chen, J. C.; Doe, S. M.; Fabbiano, G.; Galle, E. C.; Gibbs, D. G., II; Grier, J. D.; Hall, D. M.; Harbo, P. N.; He, X.; Lauer, J.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Plummer, D. A.; Refsdal, B. L.; Sundheim, B. A.; Tibbetts, M. S.; van Stone, D. W.; Winkelman, S. L.; Zografou, P.

    2009-09-01

    Creation of the Chandra Source Catalog (CSC) required adjustment of existing pipeline processing, adaptation of existing interactive analysis software for automated use, and development of entirely new algorithms. Data calibration was based on the existing pipeline, but more rigorous data cleaning was applied and the latest calibration data products were used. For source detection, a local background map was created including the effects of ACIS source readout streaks. The existing wavelet source detection algorithm was modified and a set of post-processing scripts used to correct the results. To analyse the source properties we ran the SAOTrace ray-trace code for each source to generate a model point spread function, allowing us to find encircled energy correction factors and estimate source extent. Further algorithms were developed to characterize the spectral, spatial and temporal properties of the sources and to estimate the confidence intervals on count rates and fluxes. Finally, sources detected in multiple observations were matched, and best estimates of their merged properties derived. In this paper we present an overview of the algorithms used, with more detailed treatment of some of the newly developed algorithms presented in companion papers.

  4. Categorisation of Practices and Sources- A Key Issue in Licensing Process

    International Nuclear Information System (INIS)

    Janzekovic, H.; Krizman, M.; Vokal, B.; Petrovic, Z.

    2004-01-01

    The analysis of the radioactive source inventory in countries with a nuclear programme usually comprises nearly all possible man-made sources available today, from sources related to nuclear power plants to calibration sources used for educational purposes. The risk-based licensing process for radiation sources and exposures is a demanding task which could be internationally harmonised by introducing source- and practice-related categorisation. The detailed categorisation of radioisotopes, replacing [1], was recently published [2]. The activity ratio (A/D ratio) is used as a basic parameter, which is proportional to the risk involved in the use of a radioisotope. Radioisotopes as well as related practices are categorised. No categorisation of ionising sources related to electrical apparatus producing ionising radiation without radioisotopes has been given in the literature. In addition, licensees usually perform many different activities with a specific source, so the categorisation of a practice should be done based on the risk involved with that specific practice. The risk is related to the probability of a specific event as well as to the consequences of that event, and it is strongly related to the categorisation of the source. The main issues related to the licensing process for sources and practices are presented. A review of possible categorisations of radioisotopes and related practices is given, and a proposal for a combined harmonised approach to the categorisation of sources and practices, based on risk, is made. (Author) 19 refs

  5. Deriving consumer-facing disease concepts for family health histories using multi-source sampling.

    Science.gov (United States)

    Hulse, Nathan C; Wood, Grant M; Haug, Peter J; Williams, Marc S

    2010-10-01

    The family health history has long been recognized as an effective way of understanding individuals' susceptibility to familial disease; yet electronic tools to support the capture and use of these data have been characterized as inadequate. As part of an ongoing effort to build patient-facing tools for entering detailed family health histories, we have compiled a set of concepts specific to familial disease using multi-source sampling. These concepts were abstracted by analyzing family health history data patterns in our enterprise data warehouse, collection patterns of consumer personal health records, analyses from the local state health department, a healthcare data dictionary, and concepts derived from genetic-oriented consumer education materials. Collectively, these sources yielded a set of more than 500 unique disease concepts, represented by more than 2500 synonyms for supporting patients in entering coded family health histories. We expect that these concepts will be useful in providing meaningful data and education resources for patients and providers alike.

  6. Source-Modeling Auditory Processes of EEG Data Using EEGLAB and Brainstorm

    Directory of Open Access Journals (Sweden)

    Maren Stropahl

    2018-05-01

    Full Text Available Electroencephalography (EEG) source localization approaches are often used to disentangle the spatial patterns mixed up in scalp EEG recordings. However, approaches differ substantially between experiments, may be strongly parameter-dependent, and results are not necessarily meaningful. In this paper we provide a pipeline for EEG source estimation, from raw EEG data pre-processing using EEGLAB functions up to source-level analysis as implemented in Brainstorm. The pipeline is tested using a data set of 10 individuals performing an auditory attention task. The analysis approach estimates sources of 64-channel EEG data without the prerequisite of individual anatomies or individually digitized sensor positions. First, we show advanced EEG pre-processing using EEGLAB, which includes artifact attenuation using independent component analysis (ICA). ICA is a linear decomposition technique that aims to reveal the underlying statistical sources of mixed signals and is, further, a powerful tool to attenuate stereotypical artifacts (e.g., eye movements or heartbeat). Data submitted to ICA are pre-processed to facilitate good-quality decompositions. Aiming toward an objective approach to component identification, the semi-automatic CORRMAP algorithm is applied for the identification of components representing prominent and stereotypic artifacts. Second, we present a step-wise approach to estimate active sources of auditory cortex event-related processing, on a single-subject level. The presented approach assumes that no individual anatomy is available and therefore the default anatomy ICBM152, as implemented in Brainstorm, is used for all individuals. Individual noise modeling in this dataset is based on the pre-stimulus baseline period. For EEG source modeling we use the OpenMEEG algorithm as the underlying forward model, based on the symmetric Boundary Element Method (BEM). We then apply the method of dynamical statistical parametric mapping (dSPM) to obtain

  7. Simulating variable source problems via post processing of individual particle tallies

    International Nuclear Information System (INIS)

    Bleuel, D.L.; Donahue, R.J.; Ludewigt, B.A.; Vujic, J.

    2000-01-01

    Monte Carlo is an extremely powerful method of simulating complex, three-dimensional environments without excessive problem simplification. However, it is often time-consuming to simulate models in which the source can be highly varied. Similarly difficult are optimization studies involving sources in which many input parameters are variable, such as particle energy, angle, and spatial distribution. Such studies are often approached using brute-force methods or intelligent guesswork. One field in which these problems are often encountered is accelerator-driven Boron Neutron Capture Therapy (BNCT) for the treatment of cancers. Solving the reverse problem of determining the best neutron source for optimal BNCT treatment can be accomplished by separating the time-consuming particle-tracking process of a full Monte Carlo simulation from the calculation of the source weighting factors, which is typically performed at the beginning of a Monte Carlo simulation. By post-processing these weighting factors on a recorded file of individual particle tally information, the effect of changing source variables can be realized in a matter of seconds, instead of requiring hours or days for additional complete simulations. By intelligent source biasing, any number of different source distributions can be calculated quickly from a single Monte Carlo simulation. The source description can be treated as variable and the effect of changing multiple interdependent source variables on the problem's solution can be determined. Though the focus of this study is on BNCT applications, this procedure may be applicable to any problem that involves a variable source.
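
    The post-processing idea described above amounts to re-weighting each recorded history by the ratio of the new source probability density to the density actually sampled during transport. The sketch below illustrates this with a hypothetical tally file and invented source spectra (not the authors' code or data).

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 50_000

    # Hypothetical recorded per-history tallies: the source energy each history was
    # born with (sampled from the original source spectrum) and its tally contribution.
    E_born = rng.uniform(0.0, 2.0, n)                           # MeV, original source: uniform on [0, 2]
    dose   = np.exp(-np.abs(E_born - 0.5)) * rng.random(n)      # fake per-history tally scores

    def p_orig(E):
        """Probability density the transport run actually sampled from (uniform on [0, 2] MeV)."""
        return np.full_like(E, 0.5)

    def p_new(E):
        """Candidate source spectrum to evaluate: exponential, truncated to [0, 2] MeV."""
        return np.exp(-E) / (1.0 - np.exp(-2.0))

    # Post-processing step: re-weight each history instead of re-running the transport.
    w = p_new(E_born) / p_orig(E_born)
    print("original-source tally :", dose.mean())
    print("re-weighted tally     :", np.mean(dose * w))
    ```

    Evaluating a different candidate spectrum only requires swapping p_new, which is why many source options can be screened from a single transport run.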

  8. Electron cyclotron resonance microwave ion sources for thin film processing

    International Nuclear Information System (INIS)

    Berry, L.A.; Gorbatkin, S.M.

    1990-01-01

    Plasmas created by microwave absorption at the electron cyclotron resonance (ECR) are increasingly used for a variety of plasma processes, including both etching and deposition. ECR sources efficiently couple energy to electrons and use magnetic confinement to maximize the probability of an electron creating an ion or free radical in pressure regimes where the mean free path for ionization is comparable to the ECR source dimensions. The general operating principles of ECR sources are discussed with special emphasis on their use for thin film etching. Data on source performance during Cl-based etching of Si using an ECR system are presented. 32 refs., 5 figs

  9. ACToR Chemical Structure processing using Open Source ...

    Science.gov (United States)

    ACToR (Aggregated Computational Toxicology Resource) is a centralized database repository developed by the National Center for Computational Toxicology (NCCT) at the U.S. Environmental Protection Agency (EPA). Free and open source tools were used to compile toxicity data from over 1,950 public sources. ACToR contains chemical structure information and toxicological data for over 558,000 unique chemicals. The database primarily includes data from NCCT research programs, in vivo toxicity data from ToxRef, human exposure data from ExpoCast, high-throughput screening data from ToxCast and high quality chemical structure information from the EPA DSSTox program. The DSSTox database is a chemical structure inventory for the NCCT programs and currently has about 16,000 unique structures. Also included are data from PubChem, ChemSpider, USDA, FDA, NIH and several other public data sources. ACToR has been a resource to various international and national research groups. Most of our recent efforts on ACToR are focused on improving the structural identifiers and physico-chemical properties of the chemicals in the database. Organizing this huge collection of data and improving the chemical structure quality of the database has posed some major challenges. Workflows have been developed to process structures, calculate chemical properties and identify relationships between CAS numbers. The structure processing workflow integrates web services (PubChem and NIH NCI Cactus) to d

  10. Heat source model for welding process

    International Nuclear Information System (INIS)

    Doan, D.D.

    2006-10-01

    One of the major industrial stakes of welding simulation is the control of the mechanical effects of the process (residual stresses, distortions, fatigue strength...). These effects are directly dependent on the temperature evolutions imposed during the welding process. To model this thermal loading, an original method is proposed instead of the usual approaches such as the equivalent heat source approach or the multi-physical approach. This method is based on the estimation of the weld pool shape together with the heat flux crossing the liquid/solid interface, from experimental data measured in the solid part. Its originality consists in solving an inverse Stefan problem specific to the welding process, and it is shown how to estimate the parameters of the weld pool shape. To solve the heat transfer problem, the liquid/solid interface is modelled by a Bezier curve (2-D) or a Bezier surface (3-D). This approach is well adapted to the wide diversity of weld pool shapes met for the majority of current welding processes (TIG, MIG-MAG, laser, FE, hybrid). The number of parameters to be estimated is small: from 2 to 5 in 2-D and from 7 to 16 in 3-D for the cases considered. A sensitivity study specifies the location of the sensors, their number and the set of measurements required for a good estimate. The application of the method to TIG welding test results on thin stainless steel sheets, in fully penetrating and partially penetrating configurations, shows that only one measurement point is enough to estimate the various weld pool shapes in 2-D, and two points in 3-D, whether or not the penetration is full. In the last part of the work, a methodology is developed for the transient analysis. It is based on Duvaut's transformation, which overcomes the discontinuity at the liquid metal interface and therefore gives a continuous variable over the whole spatial domain. Moreover, it allows working on a fixed mesh grid, and the new inverse problem is equivalent to identifying a source

  11. "Remember" source memory ROCs indicate recollection is a continuous process.

    Science.gov (United States)

    Slotnick, Scott D

    2010-01-01

    The dual process model assumes memory is based on recollection (retrieval with specific detail) or familiarity (retrieval without specific detail). A current debate is whether recollection is a threshold process or, like familiarity, is a continuous process. In the present study two continuous models and two threshold models of recollection were evaluated using receiver operating characteristic (ROC) analysis. These models included the continuous signal detection unequal variance model and the threshold dual process model. In the study phase of three experiments, objects were presented to the right or left of fixation. At test, participants made either remember-know responses or item confidence responses followed by source memory (spatial location) confidence ratings. Recollection-based ROCs were generated from source memory confidence ratings associated with "remember" responses (in Experiments 1-2) or the highest item confidence responses (in Experiment 3). Neither threshold model adequately fit any of the recollection-based ROCs. By contrast, one or both of the continuous models adequately fit all of the recollection-based ROCs. The present results indicate recollection and familiarity are both continuous processes.

  12. Derivation of Markov processes that violate detailed balance

    Science.gov (United States)

    Lee, Julian

    2018-03-01

    Time-reversal symmetry of the microscopic laws dictates that the equilibrium distribution of a stochastic process must obey the condition of detailed balance. However, cyclic Markov processes that do not admit equilibrium distributions with detailed balance are often used to model systems driven out of equilibrium by external agents. I show that for a Markov model without detailed balance, an extended Markov model can be constructed, which explicitly includes the degrees of freedom for the driving agent and satisfies the detailed balance condition. The original cyclic Markov model for the driven system is then recovered as an approximation at early times by summing over the degrees of freedom for the driving agent. I also show that the widely accepted expression for the entropy production in a cyclic Markov model is actually a time derivative of an entropy component in the extended model. Further, I present an analytic expression for the entropy component that is hidden in the cyclic Markov model.
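
    As a concrete illustration of the kind of model discussed above (an invented toy example, not the paper's construction), the snippet below builds a three-state cyclic Markov chain with a net clockwise bias, computes its stationary distribution, and verifies numerically that the detailed balance condition pi_i P_ij = pi_j P_ji fails.

    ```python
    import numpy as np

    # Three-state cyclic Markov chain: clockwise hops are more probable than
    # counter-clockwise ones, so there is a net probability current around the cycle.
    P = np.array([
        [0.2, 0.6, 0.2],
        [0.2, 0.2, 0.6],
        [0.6, 0.2, 0.2],
    ])

    # Stationary distribution pi: left eigenvector of P with eigenvalue 1.
    eigvals, eigvecs = np.linalg.eig(P.T)
    pi = np.abs(eigvecs[:, np.argmax(eigvals.real)].real)
    pi /= pi.sum()

    # Detailed balance would require pi_i * P_ij == pi_j * P_ji for every pair (i, j).
    flux = pi[:, None] * P
    print("stationary distribution:", np.round(pi, 3))
    print("detailed balance holds :", np.allclose(flux, flux.T))   # False: net cyclic flux
    ```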

  13. A One Line Derivation of DCC: Application of a Vector Random Coefficient Moving Average Process

    NARCIS (Netherlands)

    C.M. Hafner (Christian); M.J. McAleer (Michael)

    2014-01-01

    One of the most widely-used multivariate conditional volatility models is the dynamic conditional correlation (or DCC) specification. However, the underlying stochastic process to derive DCC has not yet been established, which has made problematic the derivation of

  14. Fusion as a source of synthetic fuels

    International Nuclear Information System (INIS)

    Powell, J.R.; Fillo, J.A.; Steinberg, M.

    1981-01-01

    In the near term, coal-derived synthetic fuels will be used; but in the long term, resource depletion and environmental effects will mandate synthetic fuels from inexhaustible sources - fission, fusion, and solar. Of the three sources, fusion appears uniquely suited for the efficient production of hydrogen-based fuels, due to its ability to directly generate very high process temperatures (up to approximately 2000°C) for water-splitting reactions. Fusion-based water-splitting reactions include high temperature electrolysis (HTE) of steam, thermochemical cycles, hybrid electrochemical/thermochemical cycles, and direct thermal decomposition. HTE appears to be the simplest and most efficient process, with efficiencies of 50 to 70% (fusion to hydrogen chemical energy), depending on process conditions.

  15. Decon2LS: An open-source software package for automated processing and visualization of high resolution mass spectrometry data.

    Science.gov (United States)

    Jaitly, Navdeep; Mayampurath, Anoop; Littlefield, Kyle; Adkins, Joshua N; Anderson, Gordon A; Smith, Richard D

    2009-03-17

    Data generated from liquid chromatography coupled to high-resolution mass spectrometry (LC-MS)-based studies of a biological sample can contain large amounts of biologically significant information in the form of proteins, peptides, and metabolites. Interpreting these data involves inferring the masses and abundances of the biomolecules injected into the instrument. Because of the inherent complexity of the mass spectral patterns produced by these biomolecules, the analysis is significantly enhanced by using visualization capabilities to inspect and confirm results. In this paper we describe Decon2LS, an open-source software package for automated processing and visualization of high-resolution MS data. Drawing extensively on algorithms developed over the last ten years for ICR2LS, Decon2LS packages the algorithms as a rich set of modular, reusable processing classes for performing diverse functions such as reading raw data, routine peak finding, theoretical isotope distribution modelling, and deisotoping. Because the source code is openly available, these functionalities can now be used to build derivative applications in a relatively fast manner. In addition, Decon2LS provides an extensive set of visualization tools, such as high-performance chart controls. With a variety of options that include peak processing, deisotoping, isotope composition, etc., Decon2LS supports processing of multiple raw data formats. Deisotoping can be performed on an individual scan, an individual dataset, or on multiple datasets using batch processing. Other processing options include creating a two-dimensional view of mass and liquid chromatography (LC) elution time features, generating spectrum files for tandem MS data, creating total intensity chromatograms, and visualizing theoretical peptide profiles. Application of Decon2LS to deisotope different datasets obtained across different instruments yielded a high number of features that can be used to identify and quantify peptides in the

  16. Derived heuristics-based consistent optimization of material flow in a gold processing plant

    Science.gov (United States)

    Myburgh, Christie; Deb, Kalyanmoy

    2018-01-01

    Material flow in a chemical processing plant often follows complicated control laws and involves plant capacity constraints. Importantly, the process involves discrete scenarios which, when modelled in a programming format, involve if-then-else statements. Therefore, the formulation of an optimization problem for such processes becomes complicated, with nonlinear and non-differentiable objective and constraint functions. In handling such problems using classical point-based approaches, users often have to resort to modifications and indirect ways of representing the problem to suit the restrictions associated with classical methods. For a particular gold processing plant optimization problem, these facts are demonstrated by showing results from MATLAB®'s well-known fmincon routine. Thereafter, a customized evolutionary optimization procedure capable of handling all the complexities offered by the problem is developed. Although the evolutionary approach produced results with comparatively less variance over multiple runs, the performance is further enhanced by introducing derived heuristics associated with the problem. In this article, the development and usage of derived heuristics in a practical problem are presented, and their importance for quick convergence of the overall algorithm is demonstrated.
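
    To illustrate why such if-then-else structure defeats gradient-based solvers and suits population-based search (a toy example with invented plant parameters, not the plant model or heuristics from the article), the sketch below optimises a small non-differentiable "profit" function with a simple (mu + lambda) evolution strategy.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def cost(x):
        """Toy plant objective with an if-then-else capacity rule (non-differentiable)."""
        feed, reagent = x
        recovery = 0.9 if feed <= 50.0 else 0.9 - 0.004 * (feed - 50.0)   # capacity penalty
        gold = feed * recovery * min(reagent / 5.0, 1.0)                  # recovered metal
        return -(gold - 0.2 * reagent)        # maximise profit -> minimise its negative

    # Tiny (mu + lambda) evolution strategy over bounds feed in [0, 100], reagent in [0, 20].
    pop = rng.uniform([0, 0], [100, 20], size=(20, 2))
    for _ in range(200):
        kids = np.clip(pop + rng.normal(0.0, 2.0, pop.shape), [0, 0], [100, 20])
        both = np.vstack([pop, kids])
        pop = both[np.argsort([cost(x) for x in both])[:20]]

    print("best setting (feed, reagent):", np.round(pop[0], 2), " profit:", -cost(pop[0]))
    ```

    Problem-specific heuristics of the kind the article derives would typically be injected as repair or seeding rules inside such a loop to speed up convergence.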

  17. The Simple Concurrent Online Processing System (SCOPS) - An open-source interface for remotely sensed data processing

    Science.gov (United States)

    Warren, M. A.; Goult, S.; Clewley, D.

    2018-06-01

    Advances in technology allow remotely sensed data to be acquired with increasingly higher spatial and spectral resolutions. These data may then be used to influence government decision making and to address a number of research- and application-driven questions. However, such large volumes of data can be difficult to handle on a single personal computer or on older machines with slower components. Often the software required to process the data is varied and can be highly technical and too advanced for the novice user to fully understand. This paper describes an open-source tool, the Simple Concurrent Online Processing System (SCOPS), which forms part of an airborne hyperspectral data processing chain and allows users accessing the tool over a web interface to submit jobs and process data remotely. It is demonstrated using Natural Environment Research Council Airborne Research Facility (NERC-ARF) instruments together with other free and open-source tools to take radiometrically corrected data from sensor geometry into geocorrected form and to generate simple or complex band ratio products. The final processed data products are acquired via an HTTP download. SCOPS can cut data processing times and introduce complex processing software to novice users by distributing jobs across a network using a simple-to-use web interface.

  18. Nuclear heat source component design considerations for HTGR process heat reactor plant concept

    International Nuclear Information System (INIS)

    McDonald, C.F.; Kapich, D.; King, J.H.; Venkatesh, M.C.

    1982-01-01

    Using alternate energy sources abundant in the U.S.A. to help curb foreign oil imports is vitally important from both national security and economic standpoints. Perhaps the most forward-looking opportunity to realize national energy goals involves the integrated use of two energy sources that have an established technology base in the U.S.A., namely nuclear energy and coal. The coupling of a high-temperature gas-cooled reactor (HTGR) and a chemical process facility has the potential for long-term synthetic fuel production (i.e., oil, gasoline, aviation fuel, hydrogen, etc.) using coal as the carbon source. Studies are in progress to exploit the high-temperature capability of an advanced HTGR variant for nuclear process heat. The process heat plant discussed in this paper has a 1170-MW(t) reactor as the heat source, and the concept is based on indirect reforming, i.e., the high-temperature nuclear thermal energy is transported (via an intermediate heat exchanger (IHX)) to the externally located process plant by a secondary helium transport loop. Emphasis is placed on design considerations for the major nuclear heat source (NHS) components, and discussions are presented for the reactor core, prestressed concrete reactor vessel (PCRV), rotating machinery, and heat exchangers.

  19. A bioactive molecule in a complex wound healing process: platelet-derived growth factor.

    Science.gov (United States)

    Kaltalioglu, Kaan; Coskun-Cevher, Sule

    2015-08-01

    Wound healing is considered to be particularly important after surgical procedures, and the most important wounds related to surgical procedures are incisional, excisional, and punch wounds. Research is ongoing to identify methods to heal non-closed wounds or to accelerate wound healing; however, wound healing is a complex process that includes many biological and physiological events, and it is affected by various local and systemic factors, including diabetes mellitus, infection, ischemia, and aging. Different cell types (such as platelets, macrophages, and neutrophils) release growth factors during the healing process, and platelet-derived growth factor is a particularly important mediator in most stages of wound healing. This review explores the relationship between platelet-derived growth factor and wound healing. © 2014 The International Society of Dermatology.

  20. Weldability of general purpose heat source new-process iridium

    International Nuclear Information System (INIS)

    Kanne, W.R.

    1987-01-01

    Weldability tests on General Purpose Heat Source (GPHS) iridium capsules showed that a new iridium fabrication process reduced susceptibility to underbead cracking. Seventeen capsules were welded (a total of 255 welds) in four categories and the number of cracks in each weld was measured

  1. Source and Processes of Dissolved Organic Matter in a Bangladesh Groundwater

    Science.gov (United States)

    McKnight, D. M.; Simone, B. E.; Mladenov, N.; Zheng, Y.; Legg, T. M.; Nemergut, D.

    2010-12-01

    Arsenic contamination of groundwater is a global health crisis, especially in Bangladesh where an estimated 40 million people are at risk. The release of geogenic arsenic bound to sediments into groundwater is thought to be influenced by dissolved organic matter (DOM) through several biogeochemical processes. Abiotically, DOM can promote the release of sediment-bound As through the formation of DOM-As complexes and competitive interactions between As and DOM for sorption sites on the sediment. Additionally, the labile portion of groundwater DOM can serve as an electron donor to support microbial growth, and the more recalcitrant humic DOM may serve as an electron shuttle, facilitating the eventual reduction of ferric iron present as iron oxides in sediments and consequently the mobilization of sorbed As and organic material. The goal of this study is to understand the source of DOM in representative Bangladesh groundwaters and the DOM sorption processes that occur at depth. We report chemical characteristics of representative DOM from a surface water, a shallow low-As groundwater, and a mid-depth high-As groundwater from the Araihazar region of Bangladesh. The humic DOM from groundwater displayed a more terrestrial chemical signature, indicative of being derived from plant and soil precursor materials, while the surface water humic DOM had a more microbial signature, suggesting an anthropogenic influence. In terms of biogeochemical processes occurring in the groundwater system, there is evidence from a diverse set of chemical characteristics, ranging from 13C-NMR spectroscopy to the analysis of lignin phenols, for preferential sorption onto iron oxides influencing the chemistry and reactivity of humic DOM in high-As groundwater in Bangladesh. Taken together, these results provide chemical evidence for anthropogenic influence and for the importance of sorption reactions at depth in controlling the water quality of high-As groundwater in Bangladesh.

  2. Source placement for equalization in small enclosures

    DEFF Research Database (Denmark)

    Stefanakis, Nick; Sarris, J.; Cambourakis, G.

    2008-01-01

    ) but not with those that will deteriorate it (the "undesired" modes). Simulation results in rectangular rooms and in a car cavity show the benefits of source placement in terms of reduced overall error and increased spatial robustness in the equalization process. Additional benefits, which can be derived by proper...

  3. Nuclear heat source component design considerations for HTGR process heat reactor plant concept

    International Nuclear Information System (INIS)

    McDonald, C.F.; Kapich, D.; King, J.H.; Venkatesh, M.C.

    1982-05-01

    The coupling of a high-temperature gas-cooled reactor (HTGR) and a chemical process facility has the potential for long-term synthetic fuel production (i.e., oil, gasoline, aviation fuel, hydrogen, etc.) using coal as the carbon source. Studies are in progress to exploit the high-temperature capability of an advanced HTGR variant for nuclear process heat. The process heat plant discussed in this paper has a 1170-MW(t) reactor as the heat source and the concept is based on indirect reforming, i.e., the high-temperature nuclear thermal energy is transported [via an intermediate heat exchanger (IHX)] to the externally located process plant by a secondary helium transport loop. Emphasis is placed on design considerations for the major nuclear heat source (NHS) components, and discussions are presented for the reactor core, prestressed concrete reactor vessel (PCRV), rotating machinery, and heat exchangers.

  4. Effect of calcium source on structure and properties of sol-gel derived bioactive glasses.

    Science.gov (United States)

    Yu, Bobo; Turdean-Ionescu, Claudia A; Martin, Richard A; Newport, Robert J; Hanna, John V; Smith, Mark E; Jones, Julian R

    2012-12-18

    The aim was to determine the most effective calcium precursor for synthesis of sol-gel hybrids and for improving homogeneity of sol-gel bioactive glasses. Sol-gel derived bioactive calcium silicate glasses are one of the most promising materials for bone regeneration. Inorganic/organic hybrid materials, which are synthesized by incorporating a polymer into the sol-gel process, have also recently been produced to improve toughness. Calcium nitrate is conventionally used as the calcium source, but it has several disadvantages. Calcium nitrate causes inhomogeneity by forming calcium-rich regions, and it requires high temperature treatment (>400 °C) for calcium to be incorporated into the silicate network. Nitrates are also toxic and need to be burnt off. Calcium nitrate therefore cannot be used in the synthesis of hybrids as the highest temperature used in the process is typically 40-60 °C. Therefore, a different precursor is needed that can incorporate calcium into the silica network and enhance the homogeneity of the glasses at low (room) temperature. In this work, calcium methoxyethoxide (CME) was used to synthesize sol-gel bioactive glasses with a range of final processing temperatures from 60 to 800 °C. Comparison is made between the use of CME and calcium chloride and calcium nitrate. Using advanced probe techniques, the temperature at which Ca is incorporated into the network was identified for 70S30C (70 mol % SiO(2), 30 mol % CaO) for each of the calcium precursors. When CaCl(2) was used, the Ca did not seem to enter the network at any of the temperatures used. In contrast, Ca from CME entered the silica network at room temperature, as confirmed by X-ray diffraction, (29)Si magic angle spinning nuclear magnetic resonance spectroscopy, and dissolution studies. CME should be used in preference to calcium salts for hybrid synthesis and may improve homogeneity of sol-gel glasses.

  5. Fundamental atomic collisional processes in negative ion sources for H-

    International Nuclear Information System (INIS)

    Crandall, D.H.; Barnett, C.F.

    1977-01-01

    The basic collision processes which create or destroy H- in gas-phase collisions like those which occur in ion sources are discussed. Cross sections are presented which show that, for known processes, destruction is generally more likely than production. One possible production mechanism (on which there is no data) is suggested, and isotope effects between hydrogen and deuterium are discussed.

  6. Variance analysis of the Monte Carlo perturbation source method in inhomogeneous linear particle transport problems. Derivation of formulae

    International Nuclear Information System (INIS)

    Noack, K.

    1981-01-01

    The perturbation source method is used in the Monte Carlo method for calculating small effects in a particle field. It offers promising possibilities for introducing positive correlation between estimates to be subtracted, even in cases where other methods fail, such as geometrical variations of a given arrangement. The perturbation source method is formulated on the basis of integral equations for the particle fields. The formulae for the second moment of the difference of events are derived. Explicitly, a certain class of transport games and different procedures for generating the so-called perturbation particles are considered.

  7. Inferring time derivatives including cell growth rates using Gaussian processes

    Science.gov (United States)

    Swain, Peter S.; Stevenson, Keiran; Leary, Allen; Montano-Gutierrez, Luis F.; Clark, Ivan B. N.; Vogel, Jackie; Pilizota, Teuta

    2016-12-01

    Often the time derivative of a measured variable is of as much interest as the variable itself. For a growing population of biological cells, for example, the population's growth rate is typically more important than its size. Here we introduce a non-parametric method to infer first and second time derivatives as a function of time from time-series data. Our approach is based on Gaussian processes and applies to a wide range of data. In tests, the method is at least as accurate as others, but has several advantages: it estimates errors both in the inference and in any summary statistics, such as lag times, and allows interpolation with the corresponding error estimation. As illustrations, we infer growth rates of microbial cells, the rate of assembly of an amyloid fibril and both the speed and acceleration of two separating spindle pole bodies. Our algorithm should thus be broadly applicable.
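
    As an illustration of the general idea (a minimal sketch with fixed, hand-picked hyperparameters, not the authors' published algorithm, which also estimates errors and propagates uncertainty), a Gaussian process with a squared-exponential kernel yields posterior means of both the fitted curve and its time derivative, because the kernel can be differentiated analytically:

        import numpy as np

        def sqexp(a, b, s=1.0, l=1.0):
            """Squared-exponential covariance k(a, b)."""
            d = a[:, None] - b[None, :]
            return s**2 * np.exp(-0.5 * (d / l)**2)

        def gp_and_derivative(x, y, xs, s=1.0, l=1.0, noise=0.05):
            """Posterior means of f(xs) and f'(xs) from noisy samples (x, y)."""
            K = sqexp(x, x, s, l) + noise**2 * np.eye(x.size)
            alpha = np.linalg.solve(K, y)
            Ks = sqexp(xs, x, s, l)                          # cov(f(xs), f(x))
            dKs = -(xs[:, None] - x[None, :]) / l**2 * Ks    # cov(f'(xs), f(x))
            return Ks @ alpha, dKs @ alpha

        # toy growth curve: the derivative of log(OD) is the specific growth rate
        t = np.linspace(0.0, 10.0, 60)
        od = np.exp(0.3 * t) * (1.0 + 0.02 * np.random.randn(t.size))
        fit, growth_rate = gp_and_derivative(t, np.log(od), t, l=2.0)

    In practice the kernel hyperparameters would be optimised (for example by maximising the marginal likelihood) rather than fixed as above.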

  8. 26 CFR 1.863-9 - Source of income derived from communications activity under section 863(a), (d), and (e).

    Science.gov (United States)

    2010-04-01

    ... business within the United States is income from sources within the United States to the extent the income... taxpayer is paid to transmit the communication. Income derived by a United States or foreign person from...

  9. Power System Oscillatory Behaviors: Sources, Characteristics, & Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Follum, James D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Tuffner, Francis K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dosiek, Luke A. [Union College, Schenectady, NY (United States); Pierre, John W. [Univ. of Wyoming, Laramie, WY (United States)

    2017-05-17

    This document is intended to provide a broad overview of the sources, characteristics, and analyses of natural and forced oscillatory behaviors in power systems. These aspects are necessarily linked. Oscillations appear in measurements with distinguishing characteristics derived from the oscillation’s source. These characteristics determine which analysis methods can be appropriately applied, and the results from these analyses can only be interpreted correctly with an understanding of the oscillation’s origin. To describe oscillations both at their source within a physical power system and within measurements, a perspective from the boundary between power system and signal processing theory has been adopted.

  10. Photon statistics characterization of a single-photon source

    International Nuclear Information System (INIS)

    Alleaume, R; Treussart, F; Courty, J-M; Roch, J-F

    2004-01-01

    In a recent experiment, we reported the time-domain intensity noise measurement of a single-photon source relying on single-molecule fluorescence control. In this paper, we present data processing starting from photocount timestamps. The theoretical analytical expression of the time-dependent Mandel parameter Q(T) of an intermittent single-photon source is derived from ON↔OFF dynamics. Finally, source intensity noise analysis, using the Mandel parameter, is quantitatively compared with the usual approach relying on the time autocorrelation function, both methods yielding the same molecular dynamical parameters
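
    The Mandel parameter mentioned above compares the photocount variance in a counting window of duration T with that of Poissonian light, Q(T) = (Var n - <n>)/<n>; the short sketch below (an illustrative reconstruction from this definition, not the authors' code) estimates Q(T) directly from photocount timestamps:

        import numpy as np

        def mandel_q(timestamps, T):
            """Estimate Q(T) by binning photon arrival times into windows of length T.
            Q = 0 for Poissonian light; Q < 0 indicates sub-Poissonian (antibunched) emission."""
            t = np.asarray(timestamps, dtype=float)
            n_bins = max(1, int((t[-1] - t[0]) // T))
            counts, _ = np.histogram(t, bins=n_bins, range=(t[0], t[0] + n_bins * T))
            mean = counts.mean()
            return (counts.var() - mean) / mean

        # typical use: scan the window length to obtain the curve Q(T)
        # q_curve = [mandel_q(stamps, T) for T in np.logspace(-7, -3, 25)]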

  11. Staphylococcus aureus utilizes host-derived lipoprotein particles as sources of exogenous fatty acids.

    Science.gov (United States)

    Delekta, Phillip C; Shook, John C; Lydic, Todd A; Mulks, Martha H; Hammer, Neal D

    2018-03-26

    Methicillin-resistant Staphylococcus aureus (MRSA) is a threat to global health. Consequently, much effort has focused on the development of new antimicrobials that target novel aspects of S. aureus physiology. Fatty acids are required to maintain cell viability, and bacteria synthesize fatty acids using the type II fatty acid synthesis pathway (FASII). FASII is significantly different from human fatty acid synthesis, underscoring the therapeutic potential of inhibiting this pathway. However, many Gram-positive pathogens incorporate exogenous fatty acids, bypassing FASII inhibition and leaving the clinical potential of FASII inhibitors uncertain. Importantly, the source(s) of fatty acids available to pathogens within the host environment remains unclear. Fatty acids are transported throughout the body by lipoprotein particles in the form of triglycerides and esterified cholesterol. Thus, lipoproteins, such as low-density lipoprotein (LDL) represent a potentially rich source of exogenous fatty acids for S. aureus during infection. We sought to test the ability of LDLs to serve as a fatty acid source for S. aureus and show that cells cultured in the presence of human LDLs demonstrate increased tolerance to the FASII inhibitor, triclosan. Using mass spectrometry, we observed that host-derived fatty acids present in the LDLs are incorporated into the staphylococcal membrane and that tolerance to triclosan is facilitated by the fatty acid kinase A, FakA, and Geh, a triacylglycerol lipase. Finally, we demonstrate that human LDLs support the growth of S. aureus fatty acid auxotrophs. Together, these results suggest that human lipoprotein particles are a viable source of exogenous fatty acids for S. aureus during infection. IMPORTANCE Inhibition of bacterial fatty acid synthesis is a promising approach to combating infections caused by S. aureus and other human pathogens. However, S. aureus incorporates exogenous fatty acids into its phospholipid bilayer. Therefore, the

  12. Source Rupture Process of the 2016 Kumamoto, Japan, Earthquake Inverted from Strong-Motion Records

    Science.gov (United States)

    Zhang, Wenbo; Zheng, Ao

    2017-04-01

    On 15 April 2016, a large earthquake of magnitude Mw 7.1 occurred in Kumamoto prefecture, Japan. The focal mechanism solution released by F-net located the hypocenter at 130.7630°E, 32.7545°N, at a depth of 12.45 km, and the strike, dip, and rake angle of the fault were N226°E, 84° and -142°, respectively. The epicenter distribution and focal mechanisms of aftershocks implied that the mechanism of the mainshock might have changed during the source rupture process, so a single focal mechanism was not enough to explain the observed data adequately. In this study, based on the inversion result of GNSS and InSAR surface deformation and with active structures for reference, we construct a finite fault model with focal mechanism changes and derive the source rupture process by a multi-time-window linear waveform inversion method using the strong-motion data (0.05-1.0 Hz) obtained by K-NET and KiK-net of Japan. Our result shows that the Kumamoto earthquake was a right-lateral strike-slip rupture event along the Futagawa-Hinagu fault zone, and that the seismogenic fault is divided into a northern segment and a southern one. The strike and dip of the northern segment are N235°E and 60°, respectively; for the southern one they are N205°E and 72°. The depth range of the fault model is consistent with the depth distribution of aftershocks, and the slip on the fault plane is mainly concentrated on the northern segment, where the maximum slip is about 7.9 m. The rupture process of the whole fault lasts approximately 18 s, and the total seismic moment released is 5.47×10^19 N·m (Mw 7.1). In addition, the essential features of the distribution of PGV and PGA synthesized from the inversion result are similar to those of the observed PGA and seismic intensity.

  13. Development and Validation of an Acid Mine Drainage Treatment Process for Source Water

    Energy Technology Data Exchange (ETDEWEB)

    Lane, Ann [Battelle Memorial Institute, Columbus, OH (United States)

    2016-03-01

    Throughout Northern Appalachia and surrounding regions, hundreds of abandoned mine sites exist which frequently are the source of Acid Mine Drainage (AMD). AMD typically contains metal ions in solution with sulfate ions which have been leached from the mine. These large volumes of water, if treated to a minimum standard, may be of use in Hydraulic Fracturing (HF) or other industrial processes. This project’s focus is to evaluate an AMD water treatment technology for the purpose of providing treated AMD as an alternative source of water for HF operations. The HydroFlex™ technology allows the conversion of a previous environmental liability into an asset while reducing stress on potable water sources. The technology achieves greater than 95% water recovery, while removing sulfate to concentrations below 100 mg/L and common metals (e.g., iron and aluminum) below 1 mg/L. The project is intended to demonstrate the capability of the process to provide AMD as alternative source water for HF operations. The second budget period of the project has been completed during which Battelle conducted two individual test campaigns in the field. The first test campaign demonstrated the ability of the HydroFlex system to remove sulfate to levels below 100 mg/L, meeting the requirements indicated by industry stakeholders for use of the treated AMD as source water. The second test campaign consisted of a series of focused confirmatory tests aimed at gathering additional data to refine the economic projections for the process. Throughout the project, regular communications were held with a group of project stakeholders to ensure alignment of the project objectives with industry requirements. Finally, the process byproduct generated by the HydroFlex process was evaluated for the treatment of produced water against commercial treatment chemicals. It was found that the process byproduct achieved similar results for produced water treatment as the chemicals currently in use. Further

  14. An industrial radiation source for food processing

    International Nuclear Information System (INIS)

    Sadat, R.

    1986-01-01

    The scientific linacs realized by CGR MeV in France have been installed in several research centers, the medical accelerators of CGR MeV have been installed in radiotherapy centers all over the world, and the industrial linacs have been used for radiography in heavy industries. Based on the experience for 30 years, CGR MeV has realized a new industrial radiation source for food processing. CARIC is going to install a new machine of CGR MeV, CASSITRON, as the demand for radiation increased. This machine has been devised specially for industrial irradiation purpose. Its main features are security, simplicity and reliability, and it is easy to incorporate it into a production line. The use of CASSITRON for food industry, the ionizing effect on mechanically separated poultry meat, the capital and processing cost and others are explained. Only 10 % of medical disposable supplies is treated by ionizing energy in France. The irradiation for food decontamination, and that for industrial treatment are demanded. Therefore, CARIC is going to increase the capacity by installing a CASSITRON for sterilization. The capital and processing cost are shown. The start of operation is expected in March, 1986. At present, a CASSITRON is being installed in the SPI food processing factory, and starts operation in a few weeks. (Kako, I.)

  15. Nitrous Oxide Production in an Eastern Corn Belt Soil: Sources and Redox Range

    Science.gov (United States)

    Nitrous oxide (N2O) derived from soils is a main contributor to the greenhouse effect and a precursor to ozone-depleting substances; however, the source processes and interacting controls are not well established. This study was conducted to estimate the magnitude and source (nitrification vs. denitrification) ...

  16. Multicharged heavy ion production process and ion sources in impulse regime allowing the operation of the process

    International Nuclear Information System (INIS)

    Jacquot, B.

    1985-01-01

    The present invention concerns a process for producing multicharged ions of elements chosen from the group carbon, nitrogen, oxygen, neon and argon in an ion source in impulse regime; the process is characterized in that the gas introduced into the ion source enclosure is a mixture, in a non-critical proportion (about 50% in partial pressure), of a first gas chosen among helium, nitrogen and oxygen and a second gas chosen from the group comprising carbon, nitrogen, oxygen, neon and argon. This process increases the current intensity of heavy ions by more than a factor of ten. The invention also concerns an ion source in impulse regime; it is characterized in that it comprises an enclosure connected to two gas inlets and provided with a valve controlled by the pressure measured in the enclosure.

  17. Fire Danger of Interaction Processes of Local Sources with a Limited Energy Capacity and Condensed Substances

    OpenAIRE

    Glushkov, Dmitry Olegovich; Strizhak, Pavel Alexandrovich; Vershinina, Kseniya Yurievna

    2015-01-01

    Numerical investigation of the flammable interaction of local energy sources with liquid condensed substances has been carried out. The basic integral characteristic of the process – the ignition delay time – has been determined for different energy source parameters. Recommendations have been formulated to ensure the fire safety of technological processes characterized by possible local heat source formation (cutting, welding, friction, metal grinding, etc.) in the vicinity of storage areas, transportation, transfer and processing of flammable liquids (gasoline, kerosene, diesel fuel).

  18. Hydroponic potato production on nutrients derived from anaerobically-processed potato plant residues

    Science.gov (United States)

    Mackowiak, C. L.; Stutte, G. W.; Garland, J. L.; Finger, B. W.; Ruffe, L. M.

    1997-01-01

    Bioregenerative methods are being developed for recycling plant minerals from harvested inedible biomass as part of NASA's Advanced Life Support (ALS) research. Anaerobic processing produces secondary metabolites, a food source for yeast production, while providing a source of water soluble nutrients for plant growth. Since NH4-N is the nitrogen product, processing the effluent through a nitrification reactor was used to convert this to NO3-N, a more acceptable form for plants. Potato (Solanum tuberosum L.) cv. Norland plants were used to test the effects of anaerobically-produced effluent after processing through a yeast reactor or nitrification reactor. These treatments were compared to a mixed-N treatment (75:25, NO3:NH4) or a NO3-N control, both containing only reagent-grade salts. Plant growth and tuber yields were greatest in the NO3-N control and yeast reactor effluent treatments, which is noteworthy, considering the yeast reactor treatment had high organic loading in the nutrient solution and concomitant microbial activity.

  19. Experimentally testing the dependence of momentum transport on second derivatives using Gaussian process regression

    Science.gov (United States)

    Chilenski, M. A.; Greenwald, M. J.; Hubbard, A. E.; Hughes, J. W.; Lee, J. P.; Marzouk, Y. M.; Rice, J. E.; White, A. E.

    2017-12-01

    It remains an open question to explain the dramatic change in intrinsic rotation induced by slight changes in electron density (White et al 2013 Phys. Plasmas 20 056106). One proposed explanation is that momentum transport is sensitive to the second derivatives of the temperature and density profiles (Lee et al 2015 Plasma Phys. Control. Fusion 57 125006), but it is widely considered to be impossible to measure these higher derivatives. In this paper, we show that it is possible to estimate second derivatives of electron density and temperature using a nonparametric regression technique known as Gaussian process regression. This technique avoids over-constraining the fit by not assuming an explicit functional form for the fitted curve. The uncertainties, obtained rigorously using Markov chain Monte Carlo sampling, are small enough that it is reasonable to explore hypotheses which depend on second derivatives. It is found that the differences in the second derivatives of n_e and T_e between the peaked and hollow rotation cases are rather small, suggesting that changes in the second derivatives are not likely to explain the experimental results.

  20. Fire Danger of Interaction Processes of Local Sources with a Limited Energy Capacity and Condensed Substances

    Directory of Open Access Journals (Sweden)

    Glushkov Dmitrii O.

    2015-01-01

    Full Text Available Numerical investigation of the flammable interaction of local energy sources with liquid condensed substances has been carried out. The basic integral characteristic of the process – the ignition delay time – has been determined for different energy source parameters. Recommendations have been formulated to ensure the fire safety of technological processes characterized by possible local heat source formation (cutting, welding, friction, metal grinding, etc.) in the vicinity of storage areas, transportation, transfer and processing of flammable liquids (gasoline, kerosene, diesel fuel).

  1. Blind Time-Frequency Analysis for Source Discrimination in Multisensor Array Processing

    National Research Council Canada - National Science Library

    Amin, Moeness

    1999-01-01

    .... We have clearly demonstrated, through analysis and simulations, the offerings of time-frequency distributions in solving key problems in sensor array processing, including direction finding, source...

  2. Cell-derived microparticles in haemostasis and vascular medicine.

    Science.gov (United States)

    Burnier, Laurent; Fontana, Pierre; Kwak, Brenda R; Angelillo-Scherrer, Anne

    2009-03-01

    Considerable interest for cell-derived microparticles has emerged, pointing out their essential role in haemostatic response and their potential as disease markers, but also their implication in a wide range of physiological and pathological processes. They derive from different cell types including platelets - the main source of microparticles - but also from red blood cells, leukocytes and endothelial cells, and they circulate in blood. Despite difficulties encountered in analyzing them and disparities of results obtained with a wide range of methods, microparticle generation processes are now better understood. However, a generally admitted definition of microparticles is currently lacking. For all these reasons we decided to review the literature regarding microparticles in their widest definition, including ectosomes and exosomes, and to focus mainly on their role in haemostasis and vascular medicine.

  3. Crystallization processes derived from the interaction of urine and dolostone

    Science.gov (United States)

    Cámara, Beatriz; Alvarez de Buergo, Monica; Fort, Rafael

    2015-04-01

    The increase in the number of pets (mostly dogs) and homeless people, and the more recent open-air drinking sessions organized by young people in the historical centers of European cities, lead to an increase in urination on the stone façades of the built cultural heritage. Up to now this process has been considered only from the point of view of the undesirable aesthetic and insalubrious conditions it creates, together with the cleaning costs that local governments have to assume. This study aims to confirm urine as a real source of soluble salts that can trigger the decay of building materials, especially those of the built cultural heritage in the historical centers of cities experiencing the new social scenario described above. For this purpose, an experimental setup was designed and run in the laboratory to simulate this process. Cubic dolostone specimens of 5 cm side were subjected to 100 testing cycles of urine absorption by capillarity. The necessary amount of urine was collected from donors and stored under clinical protocol conditions. Each cycle consisted of immersion of the specimens in a 3 mm deep urine layer for 3 hours, drying at 40°C in an oven for 20 hours and 1 hour of cooling in a desiccator. At the end of the 100 cycles, small pieces of the specimens were cut, observed and analyzed with an environmental scanning electron microscope, which has the advantage of requiring no sample preparation. The sampled pieces were selected considering the different sections in height in the specimens: a) a bottom section corresponding to the part that had been immersed in the urine solution (3 mm); b) an interface section, immediately above the immersed area, which is the area most affected by the urine capillarity process and is characterized by a strong yellowish color; c) the section that we have named the section of influence, which is subjected to capillary absorption, although not as strongly as the interface section

  4. Decon2LS: An open-source software package for automated processing and visualization of high resolution mass spectrometry data

    Directory of Open Access Journals (Sweden)

    Anderson Gordon A

    2009-03-01

    Full Text Available Abstract Background Data generated from liquid chromatography coupled to high-resolution mass spectrometry (LC-MS)-based studies of a biological sample can contain large amounts of biologically significant information in the form of proteins, peptides, and metabolites. Interpreting this data involves inferring the masses and abundances of biomolecules injected into the instrument. Because of the inherent complexity of mass spectral patterns produced by these biomolecules, the analysis is significantly enhanced by using visualization capabilities to inspect and confirm results. In this paper we describe Decon2LS, an open-source software package for automated processing and visualization of high-resolution MS data. Drawing extensively on algorithms developed over the last ten years for ICR2LS, Decon2LS packages the algorithms as a rich set of modular, reusable processing classes for performing diverse functions such as reading raw data, routine peak finding, theoretical isotope distribution modelling, and deisotoping. Because the source code is openly available, these functionalities can now be used to build derivative applications in a relatively fast manner. In addition, Decon2LS provides an extensive set of visualization tools, such as high-performance chart controls. Results With a variety of options that include peak processing, deisotoping, isotope composition, etc., Decon2LS supports processing of multiple raw data formats. Deisotoping can be performed on an individual scan, an individual dataset, or on multiple datasets using batch processing. Other processing options include creating a two-dimensional view of mass and liquid chromatography (LC) elution time features, generating spectrum files for tandem MS data, creating total intensity chromatograms, and visualizing theoretical peptide profiles. Application of Decon2LS to deisotope different datasets obtained across different instruments yielded a high number of features that can be used to

  5. Microwave Plasma Sources for Gas Processing

    International Nuclear Information System (INIS)

    Mizeraczyk, J.; Jasinski, M.; Dors, M.; Zakrzewski, Z.

    2008-01-01

    In this paper atmospheric pressure microwave discharge methods and devices used for producing the non-thermal plasmas for processing of gases are presented. The main part of the paper concerns the microwave plasma sources (MPSs) for environmental protection applications. A few types of the MPSs, i.e. waveguide-based surface wave sustained MPS, coaxial-line-based and waveguide-based nozzle-type MPSs, waveguide-based nozzleless cylinder-type MPS and MPS for microdischarges are presented. Also, results of the laboratory experiments on the plasma processing of several highly-concentrated (up to several tens of percent) volatile organic compounds (VOCs), including Freon-type refrigerants, in the moderate (200-400 W) waveguide-based nozzle-type MPS (2.45 GHz) are presented. The results showed that the microwave discharge plasma fully decomposed the VOCs at relatively low energy cost. The energy efficiency of VOCs decomposition reached 1000 g/kWh. This suggests that the microwave discharge plasma can be a useful tool for environmental protection applications. In this paper also results of the use of the waveguide-based nozzleless cylinder-type MPS for methane reforming into hydrogen are presented.

  6. Competing sound sources reveal spatial effects in cortical processing.

    Directory of Open Access Journals (Sweden)

    Ross K Maddox

    Full Text Available Why is spatial tuning in auditory cortex weak, even though location is important to object recognition in natural settings? This question continues to vex neuroscientists focused on linking physiological results to auditory perception. Here we show that the spatial locations of simultaneous, competing sound sources dramatically influence how well neural spike trains recorded from the zebra finch field L (an analog of mammalian primary auditory cortex) encode source identity. We find that the location of a birdsong played in quiet has little effect on the fidelity of the neural encoding of the song. However, when the song is presented along with a masker, spatial effects are pronounced. For each spatial configuration, a subset of neurons encodes song identity more robustly than others. As a result, competing sources from different locations dominate responses of different neural subpopulations, helping to separate neural responses into independent representations. These results help elucidate how cortical processing exploits spatial information to provide a substrate for selective spatial auditory attention.

  7. Precision Orbit Derived Atmospheric Density: Development and Performance

    Science.gov (United States)

    McLaughlin, C.; Hiatt, A.; Lechtenberg, T.; Fattig, E.; Mehta, P.

    2012-09-01

    derived density estimates. However, major variations in density are observed in the POE derived densities. These POE derived densities in combination with other data sources can be assimilated into physics based general circulation models of the thermosphere and ionosphere with the possibility of providing improved density forecasts for satellite drag analysis. POE derived density estimates were initially developed using CHAMP and GRACE data so comparisons could be made with accelerometer derived density estimates. This paper presents the results of the most extensive calibration of POE derived densities compared to accelerometer derived densities and provides the reasoning for selecting certain parameters in the estimation process. The factors taken into account for these selections are the cross correlation and RMS performance compared to the accelerometer derived densities and the output of the ballistic coefficient estimation that occurs simultaneously with the density estimation. This paper also presents the complete data set of CHAMP and GRACE results and shows that the POE derived densities match the accelerometer densities better than empirical models or DCA. This paves the way to expand the POE derived densities to include other satellites with quality GPS and/or satellite laser ranging observations.

  8. A triphenylamine substituted quinacridone derivative for solution processed organic light emitting diodes

    NARCIS (Netherlands)

    Pilz da Cunha, M.; Do, T.T.; Yambem, S.D.; Pham, H.D.; Chang, S.; Manzhos, S.; Katoh, R.; Sonar, P.

    2018-01-01

    We report on a novel quinacridone derivative design, namely, 2,9-bis(4-(bis(4-methoxyphenyl)amino)phenyl)-5,12-bis(2-ethylhexyl)-5,12-dihydroquinolino[2,3-b]acridine-7,14-dione (TPA-QA-TPA) for possible use as a solution processable emissive layer in organic light emitting diodes (OLEDs). TPA-QA-TPA

  9. CONSIDERATIONS FOR THE MASS AND ENERGY INTEGRATION IN THE SUGAR PROCESS PRODUCTION AND DERIVATIVE PROCESS

    Directory of Open Access Journals (Sweden)

    Dennis Abel Clavelo Sierra

    2015-04-01

    Full Text Available Today's society needs, more than ever, industries that develop new forms and methods in which the saving of energy and materials is a fundamental aspect. For this reason, the present investigation presents a scheme with considerations for the integration of sugar production and derivative processes within a biorefinery, with the objective of achieving efficient processes with appropriate use of material resources, efficient use of energy, and minimum operating and investment costs. The scheme taken as the basis for the study considers sugarcane as the basic input of the integrated complex; the variation of product prices in the market is also considered. The article outlines the steps for the development of a methodology that allows the processes involved in the biorefinery scheme to be analyzed and, in this way, identifies the common material and energy resources that the processes exchange. A heuristic diagram is presented that guides the strategy to be followed.

  10. Strategies to Automatically Derive a Process Model from a Configurable Process Model Based on Event Data

    Directory of Open Access Journals (Sweden)

    Mauricio Arriagada-Benítez

    2017-10-01

    Full Text Available Configurable process models are frequently used to represent business workflows and other discrete event systems among different branches of large organizations: they unify commonalities shared by all branches and describe their differences, at the same time. The configuration of such models is usually done manually, which is challenging. On the one hand, when the number of configurable nodes in the configurable process model grows, the size of the search space increases exponentially. On the other hand, the person performing the configuration may lack the holistic perspective to make the right choice for all configurable nodes at the same time, since choices influence each other. Nowadays, information systems that support the execution of business processes create event data reflecting how processes are performed. In this article, we propose three strategies (based on exhaustive search, genetic algorithms and a greedy heuristic that use event data to automatically derive a process model from a configurable process model that better represents the characteristics of the process in a specific branch. These strategies have been implemented in our proposed framework and tested in both business-like event logs as recorded in a higher educational enterprise resource planning system and a real case scenario involving a set of Dutch municipalities.
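
    The greedy heuristic among the three strategies can be pictured roughly as follows; the node list, the option sets and the quality function (for example, replay fitness of the configured model against the event log) are hypothetical placeholders, not the authors' implementation:

        def greedy_configure(configurable_nodes, options, quality):
            """Choose one option per configurable node, locally maximising model quality.

            configurable_nodes: ordered list of configurable node ids (hypothetical)
            options:            dict mapping node id -> list of allowed choices
            quality:            callable taking a partial configuration (dict) and
                                returning a score such as replay fitness on the log
            """
            config = {}
            for node in configurable_nodes:
                best_choice, best_score = None, float("-inf")
                for choice in options[node]:
                    score = quality({**config, node: choice})
                    if score > best_score:
                        best_choice, best_score = choice, score
                config[node] = best_choice
            return config

    Unlike the exhaustive strategy, this visits only the sum of the option counts over all nodes rather than their product, at the price of possibly missing the global optimum.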

  11. Diagnostics of microdischarge-integrated plasma sources for display and materials processing

    International Nuclear Information System (INIS)

    Tachibana, K; Kishimoto, Y; Kawai, S; Sakaguchi, T; Sakai, O

    2005-01-01

    Two different types of microdischarge-integrated plasma sources have been operated at around the atmospheric pressure range. The discharge characteristics were diagnosed by optical emission spectroscopy (OES), laser absorption spectroscopy (LAS) and microwave transmission (MT) techniques. The dynamic spatiotemporal behaviour of excited atoms was analysed using OES and LAS and the temporal behaviour of the electron density was estimated using the MT method. In Ar and Xe/Ne gases, waveforms of the MT signal followed the current waveform in the rise period and lasted longer according to the recombination losses. However, in He the waveform followed the density of metastable atoms, reflecting the production of a large amount of electrons by the Penning ionization process with impurities. The estimated peak electron density in those plasma sources is of the order of 10^12 cm^-3, and the metastable atom density can reach 10^13 cm^-3. Thus, it is suggested that these sources can potentially be applied as convenient large-area material processing tools operated stably at atmospheric pressure.

  12. By-products of fruits processing as a source of phytochemicals

    Directory of Open Access Journals (Sweden)

    Sonja Djilas

    Full Text Available The processing of fruits results in high amounts of waste materials such as peels, seeds, stones, and oilseed meals. The disposal of these materials usually represents a problem that is further aggravated by legal restrictions. Thus, the use of these wastes as by-products for the production of food additives or supplements with high nutritional value has gained increasing interest, because these are high-value products and their recovery may be economically attractive. It is well known that by-products represent an important source of sugars, minerals, organic acids, dietary fibre and phenolics, which have a wide range of actions, including antitumoral, antiviral, antibacterial, cardioprotective and antimutagenic activities. This review discusses the potential of the most important by-products of apple, grape and citrus fruit processing as a source of valuable compounds. The relevance of this topic is illustrated by a number of references.

  13. Analysis of the emission characteristics of ion sources for high-value optical counting processes

    International Nuclear Information System (INIS)

    Beermann, Nils

    2009-01-01

    The production of complex high-quality thin film systems requires a detailed understanding of all partial processes. One of the most relevant partial processes is the condensation of the coating material on the substrate surface. The optical and mechanical material properties can be adjusted by the well-defined impingement of energetic ions during deposition. Thus, in the past, a variety of different ion sources were developed. With respect to the present and future challenges in the production of precisely fabricated high-performance optical coatings, the ion emission of the sources has so far not been characterized sufficiently. This question is addressed within the framework of this work, which is thematically situated in the field of process development and control for ion-assisted deposition processes. In a first step, a Faraday cup measurement system was developed which allows the spatially resolved determination of the ion energy distribution as well as the ion current distribution. Subsequently, the ion emission profiles of six ion sources were determined depending on the relevant operating parameters. Consequently, a data pool for process planning and supplementary process analysis is made available. On the basis of the acquired results, the basic correlations between the operating parameters and the ion emission are demonstrated. The specific properties of the individual sources as well as the respective control strategies are pointed out with regard to the thin film properties and production yield. Finally, a synthesis of the results and perspectives for future activities are given. (orig.)

  14. Electricity derivative markets: Investment valuation, production planning and hedging

    International Nuclear Information System (INIS)

    Naesaekkaelae, E.

    2005-01-01

    This thesis studies electricity derivative markets from the viewpoint of an electricity producer. The traditionally used asset pricing methods, based on the no arbitrage principle, are extended to take into account electricity specific features: the non storability of electricity and the variability in the load process. The sources of uncertainty include electricity forward curve, prices of resources used to generate electricity, and the size of the future production. Also the effects of competitors' actions are considered. The thesis illustrates how the information in the derivative prices can be used in investment and production planning. In addition, the use of derivatives as a tool to stabilize electricity dependent cash flows is considered. The results indicate that the information about future electricity prices and their uncertainty, obtained from derivative markets, is important in investment analysis and production planning. (orig.)

  15. Electricity derivative markets: Investment valuation, production planning and hedging

    Energy Technology Data Exchange (ETDEWEB)

    Naesaekkaelae, E.

    2005-07-01

    This thesis studies electricity derivative markets from the viewpoint of an electricity producer. The traditionally used asset pricing methods, based on the no arbitrage principle, are extended to take into account electricity specific features: the non storability of electricity and the variability in the load process. The sources of uncertainty include electricity forward curve, prices of resources used to generate electricity, and the size of the future production. Also the effects of competitors' actions are considered. The thesis illustrates how the information in the derivative prices can be used in investment and production planning. In addition, the use of derivatives as a tool to stabilize electricity dependent cash flows is considered. The results indicate that the information about future electricity prices and their uncertainty, obtained from derivative markets, is important in investment analysis and production planning. (orig.)

  16. Pseudo-dynamic source modelling with 1-point and 2-point statistics of earthquake source parameters

    KAUST Repository

    Song, S. G.

    2013-12-24

    Ground motion prediction is an essential element in seismic hazard and risk analysis. Empirical ground motion prediction approaches have been widely used in the community, but efficient simulation-based ground motion prediction methods are needed to complement empirical approaches, especially in the regions with limited data constraints. Recently, dynamic rupture modelling has been successfully adopted in physics-based source and ground motion modelling, but it is still computationally demanding and many input parameters are not well constrained by observational data. Pseudo-dynamic source modelling keeps the form of kinematic modelling with its computational efficiency, but also tries to emulate the physics of source process. In this paper, we develop a statistical framework that governs the finite-fault rupture process with 1-point and 2-point statistics of source parameters in order to quantify the variability of finite source models for future scenario events. We test this method by extracting 1-point and 2-point statistics from dynamically derived source models and simulating a number of rupture scenarios, given target 1-point and 2-point statistics. We propose a new rupture model generator for stochastic source modelling with the covariance matrix constructed from target 2-point statistics, that is, auto- and cross-correlations. Our sensitivity analysis of near-source ground motions to 1-point and 2-point statistics of source parameters provides insights into relations between statistical rupture properties and ground motions. We observe that larger standard deviation and stronger correlation produce stronger peak ground motions in general. The proposed new source modelling approach will contribute to understanding the effect of earthquake source on near-source ground motion characteristics in a more quantitative and systematic way.
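
    A minimal sketch of the covariance-matrix construction described above (assuming, purely for illustration, an exponential auto-correlation along strike rather than the correlation models calibrated in the paper): a spatially correlated perturbation of a source parameter can be drawn by factorising the target covariance.

        import numpy as np

        def correlated_perturbation(x, corr_len, sigma, seed=0):
            """1-D zero-mean random field with target standard deviation sigma and
            exponential auto-correlation of length corr_len (the 2-point statistics)."""
            rng = np.random.default_rng(seed)
            r = np.abs(x[:, None] - x[None, :])
            C = sigma**2 * np.exp(-r / corr_len)                 # target covariance
            L = np.linalg.cholesky(C + 1e-10 * np.eye(x.size))   # jitter for stability
            return L @ rng.standard_normal(x.size)

        # e.g. slip along a 40 km fault, 1-point statistics: mean 2 m, std 0.6 m
        x_km = np.linspace(0.0, 40.0, 200)
        slip = 2.0 + correlated_perturbation(x_km, corr_len=5.0, sigma=0.6)

    Cross-correlation between parameters (for example slip and rupture velocity) enters in the same way through off-diagonal blocks of the covariance matrix.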

  17. 40 CFR 74.48 - Transfer of allowances from the replacement of thermal energy-process sources. [Reserved

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 16 2010-07-01 2010-07-01 false Transfer of allowances from the replacement of thermal energy-process sources. [Reserved] 74.48 Section 74.48 Protection of Environment... energy—process sources. [Reserved] ...

  18. A unified approach to the design of advanced proportional-integral-derivative controllers for time-delay processes

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Moonyong [Yeungnam University, Gyeongsan (Korea, Republic of); Vu, Truong Nguyen Luan [University of Technical Education of Ho Chi Minh City, Ho Chi Minh (Viet Nam)]

    2013-03-15

    A unified approach for the design of proportional-integral-derivative (PID) controllers cascaded with first-order lead-lag filters is proposed for various time-delay processes. The proposed controller’s tuning rules are directly derived using the Padé approximation on the basis of internal model control (IMC) for enhanced stability against disturbances. A two-degrees-of-freedom (2DOF) control scheme is employed to cope with both regulatory and servo problems. Simulation is conducted for a broad range of stable, integrating, and unstable processes with time delays. Each simulated controller is tuned to have the same degree of robustness in terms of maximum sensitivity (Ms). The results demonstrate that the proposed controller provides superior disturbance rejection and set-point tracking when compared with recently published PID-type controllers. Controllers’ robustness is investigated through the simultaneous introduction of perturbation uncertainties to all process parameters to obtain worst-case process-model mismatch. The process-model mismatch simulation results demonstrate that the proposed method consistently affords superior robustness.
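
    A discrete-time sketch of the controller structure discussed above – a parallel PID cascaded with a first-order lead-lag filter, driving a first-order-plus-dead-time process – is given below; the tuning values are illustrative placeholders, not the IMC-derived rules of the paper.

        import numpy as np

        # first-order-plus-dead-time process: tau*dy/dt = -y + Kp*u(t - theta)
        Kp, tau, theta, dt, n = 1.0, 10.0, 2.0, 0.05, 4000
        kc, ti, td = 2.0, 8.0, 1.0        # PID settings (illustrative only)
        a, b = 1.5, 0.3                   # lead-lag filter (1 + a*s)/(1 + b*s)

        y, u_pid, u_f = np.zeros(n), np.zeros(n), np.zeros(n)
        integ, e_prev, delay, sp = 0.0, 0.0, int(theta / dt), 1.0   # unit set-point step

        for k in range(1, n):
            e = sp - y[k - 1]
            integ += e * dt
            u_pid[k] = kc * (e + integ / ti + td * (e - e_prev) / dt)      # parallel PID
            # backward-Euler discretisation of the lead-lag filter
            u_f[k] = (b * u_f[k - 1] + (dt + a) * u_pid[k] - a * u_pid[k - 1]) / (b + dt)
            u_delayed = u_f[k - delay] if k >= delay else 0.0
            y[k] = y[k - 1] + dt * (-y[k - 1] + Kp * u_delayed) / tau      # explicit Euler
            e_prev = e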

  19. Analysis of social relations among organizational units derived from process models and redesign of organization structure

    NARCIS (Netherlands)

    Choi, I.; Song, M.S.; Kim, K.M.; Lee, Y-H.

    2007-01-01

    Despite surging interest in analyzing business processes, there are few scientific approaches to analysis and redesign of organizational structures which can greatly affect the performance of business processes. This paper presents a method for deriving and analyzing organizational relations from

  20. The Role of External Knowledge Sources and Organizational Design in the Process of Opportunity Exploitation

    DEFF Research Database (Denmark)

    Foss, Nicolai Juul; Lyngsie, Jacob; A. Zahra, Shaker

    Research highlights the role of external knowledge sources in the recognition of strategic opportunities, but is less forthcoming with respect to the role of such sources during the process of exploiting or realizing opportunities. We build on the knowledge-based view to propose that realizing opportunities often involves significant interactions with external knowledge sources. Organizational design can facilitate a firm’s interactions with these sources, while achieving coordination among organizational members engaged in opportunity exploitation. Our analysis of a double-respondent survey involving 536 Danish firms shows that the use of external knowledge sources is positively associated with opportunity exploitation, but the strength of this association is significantly influenced by organizational designs that enable the firm to access external knowledge during the process of exploiting opportunities.

  1. The Role of External Knowledge Sources and Organizational Design in the Process of Opportunity Exploitation

    DEFF Research Database (Denmark)

    Foss, Nicolai Juul; Lyngsie, Jacob; Zahra, Shaker A.

    2013-01-01

    Research highlights the role of external knowledge sources in the recognition of strategic opportunities but is less forthcoming with respect to the role of such sources during the process of exploiting or realizing opportunities. We build on the knowledge-based view to propose that realizing opportunities often involves significant interactions with external knowledge sources. Organizational design can facilitate a firm's interactions with these sources, while achieving coordination among organizational members engaged in opportunity exploitation. Our analysis of a double-respondent survey involving 536 Danish firms shows that the use of external knowledge sources is positively associated with opportunity exploitation, but the strength of this association is significantly influenced by organizational designs that enable the firm to access external knowledge during the process of exploiting opportunities.

  2. Electronic processes in TTF-derived complexes studied by IR spectroscopy

    Science.gov (United States)

    Graja, Andrzej

    2001-09-01

    We focus our attention on the plasma-edge-like dispersion of the reflectance spectra of selected bis(ethylenedithio)tetrathiafulvalene (BEDT-TTF)-derived organic conductors. The standard procedure to determine the electron transport parameters in low-dimensional organic conductors consists of fitting the appropriate theoretical models to the experimental reflectance data. This procedure provides us with basic information such as the plasma frequency, the optical effective mass of the charge carriers, their number, mean free path and damping constant. Therefore, it is concluded that the spectroscopy is a powerful tool to study the electronic processes in conducting organic solids.

  3. The processing and characterization of animal-derived bone to yield materials with biomedical applications. Part II: milled bone powders, reprecipitated hydroxyapatite and the potential uses of these materials.

    Science.gov (United States)

    Johnson, G S; Mucalo, M R; Lorier, M A; Gieland, U; Mucha, H

    2000-11-01

    Further studies on the processing and use of animal-bone-derived calcium phosphate materials in biomedical applications are presented. Bone powders sourced either from the direct crushing and milling of bovine, ovine and cervine bone or after being subjected to defatting and acid digestion/NaOH reprecipitation and sodium hypochlorite hydrogen peroxide treatment of animal bones were characterized using Fourier transform infra-red (FTIR) spectroscopy, 13C solid state magic angle spinning (MAS) nuclear magnetic resonance (NMR) spectroscopy, atomic absorption (AA) and inductively coupled plasma (ICP) spectrometric techniques. Bone powders were trialled for their potential use as a substrate for phosphine coupling and enzyme immobilization as well as a feedstock powder for plasma spraying on titanium metal substrates. Results indicated that enzyme immobilization by phosphine coupling could be successfully achieved on milled cervine bone with the immobilized enzyme retaining some activity. It was found that the presence of impurities normally carried down with the processing of the bone materials (viz., fat and collagen) played an important role in influencing the adsorbency and reactivity of the powders. Plasma spraying studies using reprecipitated bovine-derived powders produced highly adherent coatings on titanium metal, the composition of which was mostly hydroxyapatite (Ca10(PO4)6(OH)2) with low levels of alpha-tricalcium phosphate (alpha-Ca3(PO4)2) and tetracalcium phosphate (Ca4P2O9) also detected. In general, animal derived calcium phosphate materials constitute a potentially cheaper source of calcium phosphate materials for biomedical applications and make use of a largely under-utilized resource from abattoir wastes. Copyright 2000 Kluwer Academic Publishers

  4. Sources of salmonellae in an uninfected commercially-processed broiler flock.

    Science.gov (United States)

    Rigby, C E; Pettit, J R; Baker, M F; Bentley, A H; Salomons, M O; Lior, H

    1980-07-01

    Cultural monitoring was used to study the incidence and sources of salmonellae in a 4160-bird broiler flock during the growing period, transport and processing in a commercial plant. No salmonellae were isolated from any of 132 litter samples or 189 chickens cultured during the seven-week growing period, even though nest litter samples from four of the eight parent flocks yielded salmonellae and Salmonella worthington was isolated from the meat meal component of the grower ration. On arrival at the plant, 2/23 birds sampled carried S. infantis on their feathers, although intestinal cultures failed to yield salmonellae. Three of 18 processed carcass samples yielded salmonellae (S. infantis, S. heidelberg, S. typhimurium var copenhagen). The most likely source of these salmonellae was the plastic transport crates, since 15/107 sampled before the birds were loaded yielded salmonellae (S. infantis, S. typhimurium). The crate washer at the plant did not reduce the incidence of Salmonella-contaminated crates, since 16/116 sampled after washing yielded salmonellae (S. infantis, S. typhimurium, S. heidelberg, S. schwarzengrund, S. albany).

  5. PhysioSpace: relating gene expression experiments from heterogeneous sources using shared physiological processes.

    Directory of Open Access Journals (Sweden)

    Michael Lenz

    Full Text Available Relating expression signatures from different sources such as cell lines, in vitro cultures from primary cells and biopsy material is an important task in drug development and translational medicine as well as for tracking of cell fate and disease progression. Especially the comparison of large scale gene expression changes to tissue or cell type specific signatures is of high interest for the tracking of cell fate in (trans)differentiation experiments and for cancer research, which increasingly focuses on shared processes and the involvement of the microenvironment. These signature relation approaches require robust statistical methods to account for the high biological heterogeneity in clinical data and must cope with small sample sizes in lab experiments and common patterns of co-expression in ubiquitous cellular processes. We describe a novel method, called PhysioSpace, to position dynamics of time series data derived from cellular differentiation and disease progression in a genome-wide expression space. The PhysioSpace is defined by a compendium of publicly available gene expression signatures representing a large set of biological phenotypes. The mapping of gene expression changes onto the PhysioSpace leads to a robust ranking of physiologically relevant signatures, as rigorously evaluated via sample-label permutations. A spherical transformation of the data improves the performance, leading to stable results even in the case of small sample sizes. Using PhysioSpace with clinical cancer datasets reveals that such data exhibits large heterogeneity in the number of significant signature associations. This behavior was closely associated with the classification endpoint and cancer type under consideration, indicating shared biological functionalities in disease associated processes. Even though the time series data of cell line differentiation exhibited responses in larger clusters covering several biologically related patterns, top scoring
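    The mapping step can be pictured, in very reduced form, as scoring a gene-level change vector against each signature of a compendium and ranking the signatures; the sketch below uses a plain Pearson correlation and random placeholder data, and is not the actual PhysioSpace algorithm (which adds rank statistics, the spherical transformation and permutation-based significance).

        import numpy as np

        def rank_signatures(log_fold_change, compendium):
            """Rank reference signatures (genes x signatures) by their Pearson
            correlation with a gene-level log fold-change vector."""
            scores = np.array([np.corrcoef(log_fold_change, compendium[:, j])[0, 1]
                               for j in range(compendium.shape[1])])
            order = np.argsort(scores)[::-1]
            return order, scores[order]

        # placeholder data: 5000 genes, 40 reference signatures
        rng = np.random.default_rng(1)
        lfc = rng.standard_normal(5000)
        sigs = rng.standard_normal((5000, 40))
        best, best_scores = rank_signatures(lfc, sigs)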

  6. RADIATION CHEMICAL CONVERSION OF OIL DERIVED FROM OIL-BITUMEN ROCK

    Directory of Open Access Journals (Sweden)

    Lala Jabbarova

    2014-06-01

    Full Text Available The results of research in the radiation processing of synthetic oil derived from oil–bitumen rock of the Balakhany deposit in Azerbaijan are presented. The study has been conducted on a 60Co gamma-source at a dose rate of P = 0.5 Gy/s and various absorbed doses of D = 43–216 kGy. Samples of synthetic oil from natural bitumen rocks have been analyzed by chromatography, gas chromatography–mass spectrometry, and IR-spectroscopy, and their radiation resistance has been evaluated. The results of the study allow for both assessment of the feasibility of manufacturing petrochemicals for various applications by radiation processing and use of these materials for isolating radioactive sources to preclude their impact on the environment.

  7. Preparation of Biocolorant and Eco-Dyeing Derived from Polyphenols Based on Laccase-Catalyzed Oxidative Polymerization

    Directory of Open Access Journals (Sweden)

    Fubang Wang

    2018-02-01

    Full Text Available Natural products have long been regarded as a promising source of ecological dyes and pigments. Plant polyphenols are an important class of natural compounds, and tea provides a rich source of polyphenols. In this study, a biocolorant derived from phenolic compounds was generated by laccase-catalyzed oxidative polymerization, and eco-dyeing of silk and wool fabrics with pigments derived from tea was investigated under the influence of pH variation. The work demonstrated that dyeing performance was better under acidic conditions than under alkaline conditions, and that the fixation rate was highest at pH 3. Furthermore, the breaking strength of dyed fabrics decreased sharply at pH 11. Overall, the dyeing method is an eco-friendly, bioconversion-based process in which no mordant is added during dyeing.

  8. MzJava: An open source library for mass spectrometry data processing.

    Science.gov (United States)

    Horlacher, Oliver; Nikitin, Frederic; Alocci, Davide; Mariethoz, Julien; Müller, Markus; Lisacek, Frederique

    2015-11-03

    Mass spectrometry (MS) is a widely used and evolving technique for the high-throughput identification of molecules in biological samples. The need for sharing and reuse of code among bioinformaticians working with MS data prompted the design and implementation of MzJava, an open-source Java Application Programming Interface (API) for MS-related data processing. MzJava provides data structures and algorithms for representing and processing mass spectra and their associated biological molecules, such as metabolites, glycans and peptides. MzJava includes functionality to perform mass calculation, peak processing (e.g. centroiding, filtering, transforming), spectrum alignment and clustering, protein digestion, fragmentation of peptides and glycans, as well as scoring functions for spectrum-spectrum and peptide/glycan-spectrum matches. For data import and export, MzJava implements readers and writers for commonly used data formats. For many classes, support for the Hadoop MapReduce (hadoop.apache.org) and Apache Spark (spark.apache.org) frameworks for cluster computing has been implemented. The library has been developed applying best practices of software engineering. To ensure that MzJava contains code that is correct and easy to use, the library's API was carefully designed and thoroughly tested. MzJava is an open-source project distributed under the AGPL v3.0 licence. MzJava requires Java 1.7 or higher. Binaries, source code and documentation can be downloaded from http://mzjava.expasy.org and https://bitbucket.org/sib-pig/mzjava. This article is part of a Special Issue entitled: Computational Proteomics. Copyright © 2015 Elsevier B.V. All rights reserved.
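
    As a concrete illustration of one of the listed peak-processing steps, the sketch below shows a naive centroiding pass that collapses contiguous runs of above-threshold profile points into intensity-weighted peaks. It is written in Python purely for illustration and does not reproduce MzJava's Java API; the function name and threshold are hypothetical.

      import numpy as np

      def centroid(mz, intensity, threshold=0.0):
          # Collapse contiguous runs of above-threshold profile points into (centroid m/z, summed intensity).
          peaks, i, n = [], 0, len(mz)
          above = intensity > threshold
          while i < n:
              if above[i]:
                  j = i
                  while j < n and above[j]:
                      j += 1
                  seg_mz, seg_int = mz[i:j], intensity[i:j]
                  peaks.append((np.average(seg_mz, weights=seg_int), seg_int.sum()))
                  i = j
              else:
                  i += 1
          return np.array(peaks)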

  9. Modeling the Influence of Process Parameters and Additional Heat Sources on Residual Stresses in Laser Cladding

    Science.gov (United States)

    Brückner, F.; Lepski, D.; Beyer, E.

    2007-09-01

    In laser cladding thermal contraction of the initially liquid coating during cooling causes residual stresses and possibly cracks. Preweld or postweld heating using inductors can reduce the thermal strain difference between coating and substrate and thus reduce the resulting stress. The aim of this work is to better understand the influence of various thermometallurgical and mechanical phenomena on stress evolution and to optimize the induction-assisted laser cladding process to get crack-free coatings of hard materials at high feed rates. First, an analytical one-dimensional model is used to visualize the most important features of stress evolution for a Stellite coating on a steel substrate. For more accurate studies, laser cladding is simulated including the powder-beam interaction, the powder catchment by the melt pool, and the self-consistent calculation of temperature field and bead shape. A three-dimensional finite element model and the required equivalent heat sources are derived from the results and used for the transient thermomechanical analysis, taking into account phase transformations and the elastic-plastic material behavior with strain hardening. Results are presented for the influence of process parameters such as feed rate, heat input, and inductor size on the residual stresses at a single bead of Stellite coatings on steel.
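
    As a rough orientation only (this is the textbook estimate that the paper's one-dimensional model refines, not the model itself), the driving quantity can be pictured as the thermal mismatch stress in a thin coating constrained by a massive substrate during cooling by \Delta T:

      \sigma_{res} \approx \frac{E_c\,(\alpha_c - \alpha_s)\,\Delta T}{1 - \nu_c}

    where E_c and \nu_c are the coating's Young's modulus and Poisson ratio and \alpha_c, \alpha_s the thermal expansion coefficients of coating and substrate; inductive pre- or postheating reduces the effective \Delta T and therefore the residual stress.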

  10. Inverse Source Data-Processing Strategies for Radio-Frequency Localization in Indoor Environments

    Directory of Open Access Journals (Sweden)

    Gianluca Gennarelli

    2017-10-01

    Full Text Available Indoor positioning of mobile devices plays a key role in many aspects of our daily life. These include real-time people tracking and monitoring, activity recognition, emergency detection, navigation, and numerous location-based services. Although many wireless technologies and data-processing algorithms have been developed in recent years, indoor positioning is still a subject of intensive research. This paper deals with active radio-frequency (RF) source localization in indoor scenarios. The localization task is carried out at the physical layer thanks to receiving sensor arrays which are deployed on the border of the surveillance region to record the signal emitted by the source. The localization problem is formulated as an imaging one by taking advantage of the inverse source approach. Different measurement configurations and data-processing/fusion strategies are examined to investigate their effectiveness in terms of localization accuracy under both line-of-sight (LOS) and non-line-of-sight (NLOS) conditions. Numerical results based on full-wave synthetic data are reported to support the analysis.
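
    A minimal sketch of the imaging formulation is given below (single frequency, scalar free-space Green's function, adjoint backprojection); it is a simplified stand-in for the full-wave inverse-source processing examined in the paper, and all names and parameters are illustrative.

      import numpy as np

      def localize(rx_xy, measurements, grid_x, grid_y, wavelength):
          # rx_xy: list of receiver (x, y) positions; measurements: complex field samples at the receivers
          k = 2 * np.pi / wavelength
          X, Y = np.meshgrid(grid_x, grid_y)
          image = np.zeros(X.shape, dtype=complex)
          for (xr, yr), m in zip(rx_xy, measurements):
              r = np.hypot(X - xr, Y - yr) + 1e-9        # avoid division by zero at the receiver
              green = np.exp(-1j * k * r) / r            # scalar free-space Green's function
              image += m * np.conj(green)                # adjoint (matched) backprojection
          iy, ix = np.unravel_index(np.abs(image).argmax(), image.shape)
          return X[iy, ix], Y[iy, ix], np.abs(image)     # estimated source position and the image map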

  11. Derivative interactions and perturbative UV contributions in N Higgs doublet models

    Energy Technology Data Exchange (ETDEWEB)

    Kikuta, Yohei [KEK Theory Center, KEK, Tsukuba (Japan); The Graduate University for Advanced Studies, Department of Particle and Nuclear Physics, Tsukuba (Japan); Yamamoto, Yasuhiro [Universidad de Granada, Deportamento de Fisica Teorica y del Cosmos, Facultad de Ciencias and CAFPE, Granada (Spain)

    2016-05-15

    We study Higgs derivative interactions in models including an arbitrary number of Higgs doublets. These interactions are generated in two ways. One is higher-order corrections in composite Higgs models, and the other is the integration of heavy scalars and vectors. In the latter case, three-point couplings between the Higgs doublets and these heavy states are the sources of the derivative interactions, and the representations of the heavy states are constrained by the requirement that they couple to the doublets. We explicitly calculate all derivative interactions generated by integrating out such states. Their degrees of freedom and the conditions to impose the custodial symmetry are discussed. We also study vector boson scattering processes in a couple of two-Higgs-doublet models to see experimental signals of the derivative interactions; they are affected differently by each heavy field. (orig.)
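
    As a textbook illustration of the second mechanism (this specific example is not quoted from the paper), integrating out a heavy real singlet scalar S with mass M and three-point coupling \kappa\,S\,H^\dagger H generates, at tree level,

      \mathcal{L}_{eff} \supset \frac{\kappa^2}{2M^2}\,(H^\dagger H)^2 \;-\; \frac{\kappa^2}{2M^4}\,(H^\dagger H)\,\Box\,(H^\dagger H) + \dots

    the second operator being a two-derivative Higgs interaction of the kind whose multi-doublet generalizations are classified in the paper.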

  12. Acoustic sources of opportunity in the marine environment - Applied to source localization and ocean sensing

    Science.gov (United States)

    Verlinden, Christopher M.

    Controlled acoustic sources have typically been used for imaging the ocean. These sources can either be used to locate objects or characterize the ocean environment. The processing involves signal extraction in the presence of ambient noise, with shipping being a major component of the latter. With the advent of the Automatic Identification System (AIS) which provides accurate locations of all large commercial vessels, these major noise sources can be converted from nuisance to beacons or sources of opportunity for the purpose of studying the ocean. The source localization method presented here is similar to traditional matched field processing, but differs in that libraries of data-derived measured replicas are used in place of modeled replicas. In order to account for differing source spectra between library and target vessels, cross-correlation functions are compared instead of comparing acoustic signals directly. The library of measured cross-correlation function replicas is extrapolated using waveguide invariant theory to fill gaps between ship tracks, fully populating the search grid with estimated replicas allowing for continuous tracking. In addition to source localization, two ocean sensing techniques are discussed in this dissertation. The feasibility of estimating ocean sound speed and temperature structure, using ship noise across a drifting volumetric array of hydrophones suspended beneath buoys, in a shallow water marine environment is investigated. Using the attenuation of acoustic energy along eigenray paths to invert for ocean properties such as temperature, salinity, and pH is also explored. In each of these cases, the theory is developed, tested using numerical simulations, and validated with data from acoustic field experiments.
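
    The replica-matching step can be sketched as follows (illustrative Python, not the dissertation's processing chain): each library entry is a cross-correlation function (CCF) measured while an AIS-tracked ship occupied a known grid cell, and a target vessel's CCF is localized by picking the library entry with the largest normalized inner product.

      import numpy as np

      def match_ccf(target_ccf, library):
          # library: dict mapping grid position -> measured replica CCF (same length as target_ccf)
          t = target_ccf / np.linalg.norm(target_ccf)
          best_pos, best_score = None, -np.inf
          for pos, replica in library.items():
              score = float(np.dot(t, replica / np.linalg.norm(replica)))
              if score > best_score:
                  best_pos, best_score = pos, score
          return best_pos, best_score                    # most likely grid cell and its match score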

  13. Calculating the Price for Derivative Financial Assets of Bessel Processes Using the Sturm-Liouville Theory

    Directory of Open Access Journals (Sweden)

    Burtnyak Ivan V.

    2017-06-01

    Full Text Available In the paper we apply spectral theory to find the price of derivatives of financial assets, assuming that the underlying processes are Markov processes that can be treated in the Hilbert space L^2 using the Sturm-Liouville theory. Bessel diffusion processes are used in studying Asian options. We consider the financial flows generated by the Bessel diffusions by expressing them in terms of the system of Bessel functions of the first kind, provided that they take into account the linear combination of the flow and its spatial derivative. Such an expression enables calculating the size of the market portfolio, provides a measure of the amount of internal volatility in the market at any given moment, and allows investigating the dynamics of the equity market. The expansion of the Green function in terms of the system of Bessel functions is expressed by an analytic formula that is convenient for calculating the volume of financial flows. All assumptions are natural, result in analytic formulas that are consistent with the empirical data and, when applied in practice, adequately reflect the processes in equity markets.
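
    Schematically, and in generic notation assumed here rather than taken from the paper, the spectral machinery behind such pricing formulas expands the pricing kernel of the diffusion in the eigenfunctions of the associated Sturm-Liouville problem:

      p(t;x,y) = \sum_n e^{-\lambda_n t}\,\varphi_n(x)\,\varphi_n(y)\,m(y), \qquad V(x,t) = \sum_n e^{-\lambda_n t}\,\varphi_n(x)\int f(y)\,\varphi_n(y)\,m(y)\,dy

    where (\lambda_n, \varphi_n) solve the Sturm-Liouville (here Bessel-type) eigenvalue problem, m is the speed density and f the payoff; truncating the sum yields the analytic approximations referred to above.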

  14. Variability in physical contamination assessment of source segregated biodegradable municipal waste derived composts.

    Science.gov (United States)

    Echavarri-Bravo, Virginia; Thygesen, Helene H; Aspray, Thomas J

    2017-01-01

    Physical contaminants (glass, metal, plastic and 'other') and stones were isolated and categorised from three finished commercial composts derived from source-segregated biodegradable municipal waste (BMW). A subset of the identified physical contaminant fragments was subsequently reintroduced into the cleaned compost samples and sent to three commercial laboratories for testing in an inter-laboratory trial using the current PAS100:2011 method (AfOR MT PC&S). The trial showed that the 'other' category caused difficulty for all three laboratories, with under-reporting, particularly of the most common 'other' contaminants (paper and cardboard), and over-reporting of non-man-made fragments. One laboratory under-reported metal contaminant fragments (spiked as silver foil) in three samples. Glass, plastic and stones were variably under-reported due to misclassification or over-reported due to contamination with compost (organic) fragments. The results are discussed in the context of global physical contaminant test methods and compost quality assurance schemes. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Yeast derived from lignocellulosic biomass as a sustainable feed resource for use in aquaculture.

    Science.gov (United States)

    Øverland, Margareth; Skrede, Anders

    2017-02-01

    The global expansion in aquaculture production implies an emerging need for suitable and sustainable protein sources. Currently, the fish feed industry is dependent on high-quality protein sources of marine and plant origin. Yeast derived from processing of low-value and non-food lignocellulosic biomass is a potential sustainable source of protein in fish diets. Following enzymatic hydrolysis, the hexose and pentose sugars of lignocellulosic substrates and supplementary nutrients can be converted into protein-rich yeast biomass by fermentation. Studies have shown that yeasts such as Saccharomyces cerevisiae, Candida utilis and Kluyveromyces marxianus have favourable amino acid composition and excellent properties as protein sources in diets for fish, including carnivorous species such as Atlantic salmon and rainbow trout. Suitable downstream processing of the biomass to disrupt cell walls is required to secure high nutrient digestibility. A number of studies have shown various immunological and health benefits from feeding fish low levels of yeast and yeast-derived cell wall fractions. This review summarises current literature on the potential of yeast from lignocellulosic biomass as an alternative protein source for the aquaculture industry. It is concluded that further research and development within yeast production can be important to secure the future sustainability and economic viability of intensive aquaculture. © 2016 Society of Chemical Industry.

  16. Post-processing of Monte Carlo simulations for rapid BNCT source optimization studies

    International Nuclear Information System (INIS)

    Bleuel, D.L.; Chu, W.T.; Donahue, R.J.; Ludewigt, B.A.; Vujic, J.

    2000-01-01

    A great advantage of some neutron sources, such as accelerator-produced sources, is that they can be tuned to produce different spectra. Unfortunately, optimization studies are often time-consuming and difficult, as they require a lengthy Monte Carlo simulation for each source. When multiple characteristics, such as energy, angle, and spatial distribution of a neutron beam are allowed to vary, an overwhelming number of simulations may be required. Many optimization studies, therefore, suffer from a small number of datapoints, restrictive treatment conditions, or poor statistics. By scoring pertinent information from every particle tally in a Monte Carlo simulation, then applying appropriate source variable weight factors in a post-processing algorithm, a single simulation can be used to model any number of multiple sources. Through this method, the response to a new source can be modeled in minutes or seconds, rather than hours or days, allowing for the analysis of truly variable source conditions of much greater resolution than is normally possible when a new simulation must be run for each datapoint in a study. This method has been benchmarked and used to recreate optimization studies in a small fraction of the time spent in the original studies
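
    A minimal sketch of the post-processing idea is given below (illustrative only: the original work scores energy, angular and spatial source variables, whereas this toy version re-weights by source energy alone). The point is that per-particle tallies stored together with their birth variables can be re-weighted to emulate any candidate source spectrum without rerunning the Monte Carlo.

      import numpy as np

      def reweight_tally(contributions, birth_energy, new_pdf, sampled_pdf):
          # contributions: (N,) per-particle tally scores from a single Monte Carlo run
          # birth_energy:  (N,) energies at which the source particles were born
          # new_pdf, sampled_pdf: callables returning the source energy pdf at the given energies
          w = new_pdf(birth_energy) / sampled_pdf(birth_energy)    # importance ratios
          return np.sum(w * contributions) / np.sum(w)             # self-normalised estimate for the new source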

  17. Deriving Process Congruences from Reaction Rules

    DEFF Research Database (Denmark)

    Sobocinski, Pawel

    This thesis is concerned with the development of a theory which, given a formalism with a reduction semantics, allows the derivation of a canonical labelled transition system on which bisimilarity as well as other equivalences are congruences; provided that the contexts of the formalism f...

  18. Natural Radionuclides and Isotopic Signatures for Determining Carbonaceous Aerosol Sources, Aerosol Lifetimes, and Washout Processes

    International Nuclear Information System (INIS)

    Gaffney, Jeffrey

    2012-01-01

    This is the final technical report. The project description is as follows: to determine the role of aerosol radiative forcing on climate, the processes that control their atmospheric concentrations must be understood, and aerosol sources need to be determined for mitigation. Measurements of naturally occurring radionuclides and stable isotopic signatures allow the sources, removal and transport processes, as well as atmospheric lifetimes of fine carbonaceous aerosols, to be evaluated.

  19. Natural Radionuclides and Isotopic Signatures for Determining Carbonaceous Aerosol Sources, Aerosol Lifetimes, and Washout Processes

    Energy Technology Data Exchange (ETDEWEB)

    Gaffney, Jeffrey [Univ. of Arkansas, Little Rock, AR (United States)

    2012-12-12

    This is the final technical report. The project description is as follows: to determine the role of aerosol radiative forcing on climate, the processes that control their atmospheric concentrations must be understood, and aerosol sources need to be determined for mitigation. Measurements of naturally occurring radionuclides and stable isotopic signatures allow the sources, removal and transport processes, as well as atmospheric lifetimes of fine carbonaceous aerosols, to be evaluated.

  20. Plasma Processing of Metallic and Semiconductor Thin Films in the Fisk Plasma Source

    Science.gov (United States)

    Lampkin, Gregory; Thomas, Edward, Jr.; Watson, Michael; Wallace, Kent; Chen, Henry; Burger, Arnold

    1998-01-01

    The use of plasmas to process materials has become widespread throughout the semiconductor industry. Plasmas are used to modify the morphology and chemistry of surfaces. We report on initial plasma processing experiments using the Fisk Plasma Source. Metallic and semiconductor thin films deposited on a silicon substrate have been exposed to argon plasmas. Results of microscopy and chemical analyses of processed materials are presented.

  1. BioSig: the free and open source software library for biomedical signal processing.

    Science.gov (United States)

    Vidaurre, Carmen; Sander, Tilmann H; Schlögl, Alois

    2011-01-01

    BioSig is an open source software library for biomedical signal processing. The aim of the BioSig project is to foster research in biomedical signal processing by providing free and open source software tools for many different application areas. Some of the areas where BioSig can be employed are neuroinformatics, brain-computer interfaces, neurophysiology, psychology, cardiovascular systems, and sleep research. Moreover, the analysis of biosignals such as the electroencephalogram (EEG), electrocorticogram (ECoG), electrocardiogram (ECG), electrooculogram (EOG), electromyogram (EMG), or respiration signals is a very relevant element of the BioSig project. Specifically, BioSig provides solutions for data acquisition, artifact processing, quality control, feature extraction, classification, modeling, and data visualization, to name a few. In this paper, we highlight several methods to help students and researchers to work more efficiently with biomedical signals.

  2. A Derivation of Source-based Kinetics Equation with Time Dependent Fission Kernel for Reactor Transient Analyses

    International Nuclear Information System (INIS)

    Kim, Song Hyun; Woo, Myeong Hyun; Shin, Chang Ho; Pyeon, Cheol Ho

    2015-01-01

    In this study, a new balance equation to overcome the problems generated by the previous methods is proposed using a source-based balance equation, and a simple problem is then analyzed with the proposed method. A source-based balance equation with a time-dependent fission kernel was derived to simplify the kinetics equation. To analyze partial variations of reactor characteristics, two representative methods were introduced in previous studies: (1) the quasi-statics method and (2) the multipoint technique. The main idea of the quasi-statics method is to use a low-order approximation for large integration times. To realize the quasi-statics method, the time-dependent flux is first separated into shape and amplitude functions, and the shape function is calculated. The method has good accuracy; however, it can be computationally expensive because the shape function must be fully recalculated to obtain accurate results. To improve the calculation efficiency, the multipoint method was proposed. The multipoint method is based on the classic kinetics equation, using Green's functions to analyze the flight probability from region r' to r. Those previous methods have been used for reactor kinetics analysis; however, they have some limitations. First, three group variables (r_g, E_g, t_g) should be considered to solve the time-dependent balance equation, which severely limits the application to large systems with good accuracy. Second, energy-group neutrons must be used to analyze reactor kinetics problems; in a time-dependent problem the neutron energy distribution can change with time, which affects the group cross sections and can therefore lead to accuracy problems. Third, the neutrons in a space-time region continually affect the other space-time regions; however, this is not properly considered in the previous methods. Using birth history of the neutron sources
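
    For reference, the flux factorization underlying the quasi-statics method mentioned above is conventionally written (standard notation assumed here, not quoted from the paper) as

      \phi(\mathbf{r},E,t) = A(t)\,\psi(\mathbf{r},E,t), \qquad \frac{d}{dt}\int \frac{1}{v}\,\phi_0^{\dagger}(\mathbf{r},E)\,\psi(\mathbf{r},E,t)\,d\mathbf{r}\,dE = 0

    where A(t) is the amplitude, \psi the slowly varying shape function, and the constraint with the adjoint flux \phi_0^{\dagger} makes the split unique; having to recalculate \psi at every step is exactly the cost that the source-based formulation seeks to avoid.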

  3. Preliminary investigation of processes that affect source term identification

    International Nuclear Information System (INIS)

    Wickliff, D.S.; Solomon, D.K.; Farrow, N.D.

    1991-09-01

    Solid Waste Storage Area (SWSA) 5 is known to be a significant source of contaminants, especially tritium (3H), to the White Oak Creek (WOC) watershed. For example, Solomon et al. (1991) estimated the total 3H discharge in Melton Branch (most of which originates in SWSA 5) for the 1988 water year to be 1210 Ci. A critical issue for making decisions concerning remedial actions at SWSA 5 is knowing whether the annual contaminant discharge is increasing or decreasing. Because (1) the magnitude of the annual contaminant discharge is highly correlated to the amount of annual precipitation (Solomon et al., 1991) and (2) a significant lag may exist between the time of peak contaminant release from primary sources (i.e., waste trenches) and the time of peak discharge into streams, short-term stream monitoring by itself is not sufficient for predicting future contaminant discharges. In this study we use 3H to examine the link between contaminant release from primary waste sources and contaminant discharge into streams. By understanding and quantifying subsurface transport processes, realistic predictions of future contaminant discharge, along with an evaluation of the effectiveness of remedial action alternatives, will be possible. The objectives of this study are (1) to characterize the subsurface movement of contaminants (primarily 3H) with an emphasis on the effects of matrix diffusion; (2) to determine the relative strength of primary vs secondary sources; and (3) to establish a methodology capable of determining whether the 3H discharge from SWSA 5 to streams is increasing or decreasing

  4. Diagnosing Soil Moisture Anomalies and Neglected Soil Moisture Source/Sink Processes via a Thermal Infrared-based Two-Source Energy Balance Model

    Science.gov (United States)

    Hain, C.; Crow, W. T.; Anderson, M. C.; Yilmaz, M. T.

    2014-12-01

    Atmospheric processes, especially those that occur in the surface and boundary layer, are significantly impacted by soil moisture (SM). Due to the observational gaps in ground-based monitoring of SM, methodologies have been developed to monitor SM from satellite platforms. While many have focused on microwave methods, observations of thermal infrared land surface temperature (LST) also provide a means of obtaining SM information. One particular TIR SM method exploits surface flux predictions retrieved from the Atmosphere Land Exchange Inverse (ALEXI) model. ALEXI uses a time-differential measurement of morning LST rise to diagnose the partitioning of net radiation into surface energy fluxes. Here an analysis will be presented to study relationships between three SM products during a multi-year period (2000-2013) from an active/passive microwave dataset (ECV), a TIR-based model (ALEXI), and a land surface model (Noah) over the CONUS. Additionally, all three will be compared against in-situ SM observations from the North American Soil Moisture Database. The second analysis will focus on the use of ALEXI towards diagnosing SM source/sink processes. Traditional soil water balance modeling is based on one-dimensional (vertical-only) water flow, free drainage at the bottom of the soil column, and neglect of ancillary inputs due to processes such as irrigation. However, recent work has highlighted the importance of secondary water source (e.g., irrigation, groundwater extraction, inland wetlands, lateral flows) and sink (e.g., tile drainage in agricultural areas) processes on the partitioning of evaporative and sensible heat fluxes. ALEXI offers a top-down approach for mapping areas where SM source/sink processes have a significant impact on the surface energy balance. Here we present an index, ASSET, that is based on comparisons between ALEXI latent heat flux (LE) estimates and LE predicted by a free-drainage prognostic LSM lacking irrigation, groundwater and tile

  5. Post-processing of Monte Carlo simulations for rapid BNCT source optimization studies

    International Nuclear Information System (INIS)

    Bleuel, D.L.; Chu, W.T.; Donahue, R.J.; Ludewigt, B.A.; Vujic, J.

    2000-01-01

    A great advantage of some neutron sources, such as accelerator-produced sources, is that they can be tuned to produce different spectra. Unfortunately, optimization studies are often time-consuming and difficult, as they require a lengthy Monte Carlo simulation for each source. When multiple characteristics, such as energy, angle, and spatial distribution of a neutron beam are allowed to vary, an overwhelming number of simulations may be required. Many optimization studies, therefore, suffer from a small number of data points, restrictive treatment conditions, or poor statistics. By scoring pertinent information from every particle tally in a Monte Carlo simulation, then applying appropriate source variable weight factors in a post-processing algorithm, a single simulation can be used to model any number of multiple sources. Through this method, the response to a new source can be modeled in minutes or seconds, rather than hours or days, allowing for the analysis of truly variable source conditions of much greater resolution than is normally possible when a new simulation must be run for each data point in a study. This method has been benchmarked and used to recreate optimization studies in a small fraction of the time spent in the original studies. (author)

  6. Continuous-Flow Processes in Heterogeneously Catalyzed Transformations of Biomass Derivatives into Fuels and Chemicals

    Directory of Open Access Journals (Sweden)

    Antonio A. Romero

    2012-07-01

    Full Text Available Continuous-flow chemical processes offer several advantages as compared to batch chemistries. These are particularly relevant in the case of heterogeneously catalyzed transformations of biomass-derived platform molecules into valuable chemicals and fuels. This work aims to provide an overview of key continuous-flow processes developed to date dealing with a series of transformations of platform chemicals, including alcohols, furanics, organic acids and polyols, using a wide range of heterogeneous catalysts based on supported metals, solid acids and bifunctional (metal + acidic) materials.

  7. Strong-motion characteristics and source process during the Suruga Bay earthquake in 2009 through observed records on rock sites

    International Nuclear Information System (INIS)

    Shiba, Yoshiaki; Sato, Hiroaki; Kuriyama, Masayuki

    2010-01-01

    On 11 August 2009, a moderate earthquake of M 6.5 occurred in the Suruga Bay region, south of Shizuoka prefecture. During this event, the JMA seismic intensity reached 6 lower in several cities around the hypocenter, and at the Hamaoka nuclear power plant of Chubu Electric Power the reactors were automatically shut down due to the large ground motions. Although the epicenter is located at the eastern edge of the source area of the assumed great Tokai earthquake of M 8, this event is classified as an intra-plate (intra-slab) earthquake, because its focal depth lies below the plate boundary and because of the fault geometry inferred from the moment tensor solution. A dense strong-motion observation network has been deployed by our institute around the source area, mainly on rock outcrops, and waveform data of the main shock and several aftershocks were obtained at 13 stations within 100 km of the hypocenter. The observed peak ground motions and velocity response spectral amplitudes are both clearly larger than predicted by empirical attenuation relations derived from inland and plate-boundary earthquake data, which reflects the characteristics of intra-slab earthquake faulting. The estimated acceleration source spectra of the main shock also exhibit a short-period level about 1.7 times larger than the average of those for past events, which corresponds to the additional term in the attenuation curve of peak ground acceleration for intra-plate earthquakes. The detailed source process of the main shock is inferred using an inversion technique. The initial source model is assumed to be composed of two distinct fault planes according to the minute aftershock distribution. The estimated source model shows that large slip occurred near the hypocenter and at the boundary region between the two fault planes, where the rupture transfers from the primary to the secondary fault. Furthermore, the broadband source inversion using velocity motions at frequencies up to 5 Hz demonstrates the high effective

  8. Thermoelastic stress due to an instantaneous finite line heat source in an infinite medium

    International Nuclear Information System (INIS)

    Claesson, J.; Hellstroem, G.

    1995-09-01

    The problem originates from studies of nuclear waste repositories in rock. The problem is by superposition reduced to the case of a single, infinite, antisymmetric, instantaneous line heat source. The dimensionless problem turns out to depend on the dimensionless radial and axial coordinates only, although the original time-dependent problem contains several parameters. An exact analytical solution is derived. The solution is surprisingly handy, considering the complexity of the original problem. The stress and strain field are readily obtained from derivatives of the displacement components. These fields are studied and presented in detail. Asymptotic behaviour, field of principal stresses, regions of compression and tension, and largest values of compression and tension of the components are given as exact formulas. The solution may be used to test numerical models for coupled thermoelastic processes. It may also be used in more detailed numerical simulations of the process near the heat sources as boundary conditions to account for the three-dimensional global process. 7 refs
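
    For orientation, the temperature kernel from which such thermoelastic solutions are assembled is the classical instantaneous infinite line heat source in an infinite medium (standard notation assumed here; the paper builds the finite, antisymmetric case from this kernel by superposition):

      \Delta T(r,t) = \frac{q'}{4\pi\lambda t}\,e^{-r^{2}/(4 a t)}, \qquad a = \frac{\lambda}{\rho c}

    where q' is the heat released per unit length, \lambda the thermal conductivity, a the thermal diffusivity and r the radial distance from the line; displacements, strains and stresses then follow from spatial integrals and derivatives of this field.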

  9. Sources, Ages, and Alteration of Organic Matter in Estuaries.

    Science.gov (United States)

    Canuel, Elizabeth A; Hardison, Amber K

    2016-01-01

    Understanding the processes influencing the sources and fate of organic matter (OM) in estuaries is important for quantifying the contributions of carbon from land and rivers to the global carbon budget of the coastal ocean. Estuaries are sites of high OM production and processing, and understanding biogeochemical processes within these regions is key to quantifying organic carbon (Corg) budgets at the land-ocean margin. These regions provide vital ecological services, including nutrient filtration and protection from floods and storm surge, and provide habitat and nursery areas for numerous commercially important species. Human activities have modified estuarine systems over time, resulting in changes in the production, respiration, burial, and export of Corg. Corg in estuaries is derived from aquatic, terrigenous, and anthropogenic sources, with each source exhibiting a spectrum of ages and lability. The complex source and age characteristics of Corg in estuaries complicate our ability to trace OM along the river-estuary-coastal ocean continuum. This review focuses on the application of organic biomarkers and compound-specific isotope analyses to estuarine environments and on how these tools have enhanced our ability to discern natural sources of OM, trace their incorporation into food webs, and enhance understanding of the fate of Corg within estuaries and their adjacent waters.

  10. Sources of CO2 efflux from soil and review of partitioning methods

    International Nuclear Information System (INIS)

    Kuzyakov, Y.

    2006-01-01

    Five main biogenic sources of CO2 efflux from soils have been distinguished and described according to their turnover rates and the mean residence time of carbon. They are root respiration, rhizomicrobial respiration, decomposition of plant residues, the priming effect induced by root exudation or by addition of plant residues, and basal respiration by microbial decomposition of soil organic matter (SOM). These sources can be grouped in several combinations to summarize CO2 efflux from the soil including: root-derived CO2, plant-derived CO2, SOM-derived CO2, rhizosphere respiration, heterotrophic microbial respiration (respiration by heterotrophs), and respiration by autotrophs. These distinctions are important because without separation of SOM-derived CO2 from plant-derived CO2, measurements of total soil respiration have very limited value for evaluation of the soil as a source or sink of atmospheric CO2 and for interpreting the sources of CO2 and the fate of carbon within soils and ecosystems. Additionally, the processes linked to the five sources of CO2 efflux from soil have various responses to environmental variables and consequently to global warming. This review describes the basic principles and assumptions of the following methods which allow SOM-derived and root-derived CO2 efflux to be separated under laboratory and field conditions: root exclusion techniques, shading and clipping, tree girdling, regression, component integration, excised roots and in situ root respiration; continuous and pulse labeling, 13C natural abundance and FACE, and radiocarbon dating and bomb-14C. Short sections cover the separation of the respiration of autotrophs and that of heterotrophs, i.e. the separation of actual root respiration from microbial respiration, as well as methods allowing the amount of CO2 evolved by decomposition of plant residues and by priming effects to be estimated. All these methods have been evaluated according to their inherent

  11. Approaches of multilayer overlay process control for 28nm FD-SOI derivative applications

    Science.gov (United States)

    Duclaux, Benjamin; De Caunes, Jean; Perrier, Robin; Gatefait, Maxime; Le Gratiet, Bertrand; Chapon, Jean-Damien; Monget, Cédric

    2018-03-01

    Derivative technologies like embedded Non-Volatile Memories (eNVM) are raising new types of challenges on the "more than Moore" path. By construction, overlay is critical across multiple layers; by operating mode, the use of high voltages stresses leakage and breakdown; and the targeted markets (automotive, industry automation, secure transactions...) all request high device reliability (typically below the 1 ppm level). As a consequence, overlay specifications are tight, not only between one layer and its reference, but also among the critical layers sharing the same reference. This work describes a broad picture of the key points for multilayer overlay process control in the case of a 28nm FD-SOI technology and its derivative flows. First, the alignment trees of the different flow options have been optimized using a realistic process-assumptions calculation for indirect overlay. Then, in the case of a complex alignment tree involving a heterogeneous scanner toolset, the criticality of tool matching between the reference layer and the critical layers of the flow has been highlighted. Improving the APC control loops of these multilayer dependencies has been studied with simulations of feed-forward control as well as by implementing a new rework algorithm based on multiple measurements. Finally, the management of these measurement steps raises some issues for inline support, and using calculations or "virtual overlay" could help to gain some tool capacity. A first step towards multilayer overlay process control has been taken.

  12. Neural correlates of encoding processes predicting subsequent cued recall and source memory.

    Science.gov (United States)

    Angel, Lucie; Isingrini, Michel; Bouazzaoui, Badiâa; Fay, Séverine

    2013-03-06

    In this experiment, event-related potentials were used to examine whether the neural correlates of encoding processes predicting subsequent successful recall differed from those predicting successful source memory retrieval. During encoding, participants studied lists of words and were instructed to memorize each word and the list in which it occurred. At test, they had to complete stems (the first four letters) with a studied word and then make a judgment of the initial temporal context (i.e. list). Event-related potentials recorded during encoding were segregated according to subsequent memory performance to examine subsequent memory effects (SMEs) reflecting successful cued recall (cued recall SME) and successful source retrieval (source memory SME). Data showed a cued recall SME at parietal electrode sites from 400 to 1200 ms and a late reversed cued recall SME at frontal sites in the 1200-1400 ms period. Moreover, a source memory SME was observed from 400 to 1400 ms over frontal areas. These findings indicate that patterns of encoding-related activity predicting successful recall and source memory are clearly dissociated.

  13. An Open Source-Based Real-Time Data Processing Architecture Framework for Manufacturing Sustainability

    Directory of Open Access Journals (Sweden)

    Muhammad Syafrudin

    2017-11-01

    Full Text Available Currently, the manufacturing industry is experiencing a data-driven revolution. There are multiple processes in the manufacturing industry that will eventually generate a large amount of data. Collecting, analyzing and storing a large amount of data is one of the key elements of the smart manufacturing industry. To ensure that all processes within the manufacturing industry are functioning smoothly, big data processing is needed. Thus, in this study an open source-based real-time data processing (OSRDP) architecture framework is proposed. The OSRDP architecture framework consists of several open source technologies, including Apache Kafka, Apache Storm and NoSQL MongoDB, that are effective and cost-efficient for real-time data processing. Several experiments and an impact analysis for manufacturing sustainability are provided. The results showed that the proposed system is capable of processing massive sensor data efficiently as the number of sensors and devices increases. In addition, data mining based on Random Forest is presented to predict the quality of products given the sensor data as input. The Random Forest successfully classifies defective and non-defective products, and achieves high accuracy compared to other data mining algorithms. This study is expected to support management in decision-making for product quality inspection and to support manufacturing sustainability.
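
    A minimal sketch of the defect-prediction step described above is shown below; it uses scikit-learn's RandomForestClassifier, and the file name and sensor feature columns are hypothetical placeholders rather than the study's actual dataset.

      import pandas as pd
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import accuracy_score

      df = pd.read_csv("sensor_readings.csv")            # hypothetical dump of streamed sensor data
      X = df[["temperature", "humidity", "vibration"]]   # hypothetical sensor features
      y = df["defect"]                                   # 1 = defective product, 0 = non-defective

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=42)
      model = RandomForestClassifier(n_estimators=200, random_state=42).fit(X_tr, y_tr)
      print("hold-out accuracy:", accuracy_score(y_te, model.predict(X_te)))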

  14. Derivative Process Model of Development Power in Industry: Empirical Research and Forecast for Chinese Software Industry and US Economy

    OpenAIRE

    Feng Dai; Bao-hua Sun; Jie Sun

    2004-01-01

    Based on the concept and theory of Development Power [1], this paper analyzes the transferability and diffusibility of industrial development power, points out that chaos is the extreme of DP release and order the highest degree of DP accumulation, puts forward the A-C strength (the index of adjusting and controlling strength), and sets up the derivative process model for industrial development power based on the Partial Distribution [2]-[4]. By the derivative process model, a kind of time seri...

  15. Identification of naturally processed hepatitis C virus-derived major histocompatibility complex class I ligands.

    Directory of Open Access Journals (Sweden)

    Benno Wölk

    Full Text Available Fine mapping of human cytotoxic T lymphocyte (CTL) responses against hepatitis C virus (HCV) is based on external loading of target cells with synthetic peptides which are either derived from prediction algorithms or from overlapping peptide libraries. These strategies do not address putative host and viral mechanisms which may alter processing as well as presentation of CTL epitopes. Therefore, the aim of this proof-of-concept study was to identify naturally processed HCV-derived major histocompatibility complex (MHC) class I ligands. To this end, continuous human cell lines were engineered to inducibly express HCV proteins and to constitutively express high levels of functional HLA-A2. These cell lines were recognized in an HLA-A2-restricted manner by HCV-specific CTLs. Ligands eluted from HLA-A2 molecules isolated from large-scale cultures of these cell lines were separated by high performance liquid chromatography and further analyzed by electrospray ionization quadrupole time of flight mass spectrometry (MS/tandem MS). These analyses allowed the identification of two HLA-A2-restricted epitopes derived from HCV nonstructural proteins (NS) 3 and 5B (NS3₁₄₀₆₋₁₄₁₅ and NS5B₂₅₉₄₋₂₆₀₂). In conclusion, we describe a general strategy that may be useful to investigate HCV pathogenesis and may contribute to the development of preventive and therapeutic vaccines in the future.

  16. A global catalogue of large SO2 sources and emissions derived from the Ozone Monitoring Instrument

    Directory of Open Access Journals (Sweden)

    V. E. Fioletov

    2016-09-01

    Full Text Available Sulfur dioxide (SO2) measurements from the Ozone Monitoring Instrument (OMI) satellite sensor processed with the new principal component analysis (PCA) algorithm were used to detect large point emission sources or clusters of sources. The total of 491 continuously emitting point sources releasing from about 30 kt yr−1 to more than 4000 kt yr−1 of SO2 per year have been identified and grouped by country and by primary source origin: volcanoes (76 sources); power plants (297); smelters (53); and sources related to the oil and gas industry (65). The sources were identified using different methods, including through OMI measurements themselves applied to a new emission detection algorithm, and their evolution during the 2005–2014 period was traced by estimating annual emissions from each source. For volcanic sources, the study focused on continuous degassing, and emissions from explosive eruptions were excluded. Emissions from degassing volcanic sources were measured, many for the first time, and collectively they account for about 30 % of total SO2 emissions estimated from OMI measurements, but that fraction has increased in recent years given that cumulative global emissions from power plants and smelters are declining while emissions from the oil and gas industry remained nearly constant. Anthropogenic emissions from the USA declined by 80 % over the 2005–2014 period as did emissions from western and central Europe, whereas emissions from India nearly doubled, and emissions from other large SO2-emitting regions (South Africa, Russia, Mexico, and the Middle East) remained fairly constant. In total, OMI-based estimates account for about a half of total reported anthropogenic SO2 emissions; the remaining half is likely related to sources emitting less than 30 kt yr−1 and not detected by OMI.

  17. A Global Catalogue of Large SO2 Sources and Emissions Derived from the Ozone Monitoring Instrument

    Science.gov (United States)

    Fioletov, Vitali E.; McLinden, Chris A.; Krotkov, Nickolay; Li, Can; Joiner, Joanna; Theys, Nicolas; Carn, Simon; Moran, Mike D.

    2016-01-01

    Sulfur dioxide (SO2) measurements from the Ozone Monitoring Instrument (OMI) satellite sensor processed with the new principal component analysis (PCA) algorithm were used to detect large point emission sources or clusters of sources. The total of 491 continuously emitting point sources releasing from about 30 kt yr−1 to more than 4000 kt yr−1 of SO2 per year have been identified and grouped by country and by primary source origin: volcanoes (76 sources); power plants (297); smelters (53); and sources related to the oil and gas industry (65). The sources were identified using different methods, including through OMI measurements themselves applied to a new emission detection algorithm, and their evolution during the 2005–2014 period was traced by estimating annual emissions from each source. For volcanic sources, the study focused on continuous degassing, and emissions from explosive eruptions were excluded. Emissions from degassing volcanic sources were measured, many for the first time, and collectively they account for about 30% of total SO2 emissions estimated from OMI measurements, but that fraction has increased in recent years given that cumulative global emissions from power plants and smelters are declining while emissions from the oil and gas industry remained nearly constant. Anthropogenic emissions from the USA declined by 80% over the 2005–2014 period as did emissions from western and central Europe, whereas emissions from India nearly doubled, and emissions from other large SO2-emitting regions (South Africa, Russia, Mexico, and the Middle East) remained fairly constant. In total, OMI-based estimates account for about a half of total reported anthropogenic SO2 emissions; the remaining half is likely related to sources emitting less than 30 kt yr−1 and not detected by OMI.

  18. Gaussian process based independent analysis for temporal source separation in fMRI

    DEFF Research Database (Denmark)

    Hald, Ditte Høvenhoff; Henao, Ricardo; Winther, Ole

    2017-01-01

    Functional Magnetic Resonance Imaging (fMRI) gives us a unique insight into the processes of the brain, and opens up for analyzing the functional activation patterns of the underlying sources. Task-inferred supervised learning with restrictive assumptions in the regression set-up restricts the exploratory nature of the analysis. Fully unsupervised independent component analysis (ICA) algorithms, on the other hand, can struggle to detect clear classifiable components on single-subject data. We attribute this shortcoming to inadequate modeling of the fMRI source signals by failing to incorporate its...

  19. Direct quantitation of the preservatives benzoic and sorbic acid in processed foods using derivative spectrophotometry combined with micro dialysis.

    Science.gov (United States)

    Fujiyoshi, Tomoharu; Ikami, Takahito; Kikukawa, Koji; Kobayashi, Masato; Takai, Rina; Kozaki, Daisuke; Yamamoto, Atsushi

    2018-02-01

    The preservatives benzoic acid and sorbic acid are generally quantified with separation techniques, such as HPLC or GC. Here we describe a new method for determining these compounds in processed food samples, based on the narrowing of the UV-visible spectral band width achieved by derivative processing, which permits more selective identification and determination of target analytes in complex matrices. After a sample is purified by micro dialysis, UV spectra of the sample solutions were measured and fourth-order derivatives of the spectrum were calculated. The amplitude between the maximum and minimum values in the high-order derivative spectrum was used for the determination of benzoic acid and sorbic acid. Benzoic acid and sorbic acid levels in several commercially available processed foods were measured by HPLC and the proposed spectrometric method. The levels obtained by the two methods were highly correlated (r² > 0.97) for both preservatives. Copyright © 2017 Elsevier Ltd. All rights reserved.
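
    The read-out described above can be sketched as follows (illustrative Python; the smoothing window, polynomial order and wavelength band are hypothetical choices, not values from the paper): the measured spectrum is differentiated four times with a Savitzky-Golay filter and the peak-to-trough amplitude within the analyte's band is taken as the analytical signal.

      import numpy as np
      from scipy.signal import savgol_filter

      def fourth_derivative_amplitude(wavelength_nm, absorbance, band=(220.0, 240.0)):
          # Smooth and differentiate in one pass; assumes evenly spaced wavelengths (pass delta= otherwise).
          d4 = savgol_filter(absorbance, window_length=21, polyorder=5, deriv=4)
          sel = (wavelength_nm >= band[0]) & (wavelength_nm <= band[1])
          return d4[sel].max() - d4[sel].min()           # max-min amplitude used for calibration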

  20. Derivational morphology and Serbian EFL learners: Three perspectives on the acquisition process

    Directory of Open Access Journals (Sweden)

    Dimitrijević-Savić Jovana V.

    2011-01-01

    Full Text Available Although it has long been an under-researched topic in the field of applied linguistics, morphological knowledge is nowadays regarded as a key component of vocabulary acquisition. The past two decades have witnessed a proliferation of studies of both L1 and L2 learning contexts which shed light on various issues, ranging from morphological processing to receptive/productive knowledge of derivational and inflectional morphology. However, investigations into the acquisition of English morphology by Serbian EFL learners have, to our knowledge, been scarce. The purpose of this paper is, therefore, to explore the productive derivational knowledge of upper-intermediate Serbian EFL learners by means of three different instruments: a test focusing on the knowledge of the four main word family members (nouns, verbs, adjectives, adverbs), a test of cognate and non-cognate derivatives employing six cognate English-Serbian suffixes (-ous/-oz(an), -ize/-izovati, -ation/-acija, -ism/-iz(am), -ist/-ist(a), -ity/-itet), and a contextualized word-formation skill test. A combination of a qualitative and quantitative approach to data analysis has revealed the difficulties Serbian EFL learners have been experiencing in their morphology/vocabulary classes and it has enabled us to identify common mistakes and weak spots. Our results have pedagogical implications and could be put to use in curriculum design and methodology.

  1. Photobleaching Response of Different Sources of Chromophoric Dissolved Organic Matter Exposed to Natural Solar Radiation Using Absorption and Excitation?Emission Matrix Spectra

    OpenAIRE

    Zhang, Yunlin; Liu, Xiaohan; Osburn, Christopher L.; Wang, Mingzhu; Qin, Boqiang; Zhou, Yongqiang

    2013-01-01

    The CDOM biogeochemical cycle is driven by several physical and biological processes, such as river input, biogeneration and photobleaching, that act as primary sinks and sources of CDOM. Watershed-derived allochthonous (WDA) and phytoplankton-derived autochthonous (PDA) CDOM were exposed to 9 days of natural solar radiation to assess the photobleaching response of different CDOM sources, using absorption and fluorescence (excitation-emission matrix) spectroscopy. Our results showed a marked decrea...

  2. Laser scanner data processing and 3D modeling using a free and open source software

    International Nuclear Information System (INIS)

    Gabriele, Fatuzzo; Michele, Mangiameli; Giuseppe, Mussumeci; Salvatore, Zito

    2015-01-01

    Laser scanning is a technology that allows the geometry of objects to be surveyed in a short time with a high level of detail and completeness, based on the signal emitted by the laser and the corresponding return signal. When the incident laser radiation hits the object to be detected, the radiation is reflected. The purpose is to build a three-dimensional digital model that allows the reality of the object to be reconstructed and studies to be conducted regarding its design, restoration and/or conservation. When the laser scanner is equipped with a digital camera, the result of the measurement process is a set of points in XYZ coordinates of high density and accuracy, with radiometric and RGB tones. In this case, the set of measured points is called a “point cloud” and allows the reconstruction of the Digital Surface Model. The post-processing is usually performed with closed-source software, whose copyright restricts free use, whereas free and open source software can greatly improve performance, since it can be used freely and offers the possibility to inspect and even customize the source code. The experience started at the Faculty of Engineering in Catania is aimed at finding a valuable free and open source tool, MeshLab (an Italian software for data processing), to be compared with a reference closed-source software for data processing, i.e. RapidForm. In this work, we compare the results obtained with MeshLab and RapidForm through the planning of the survey and the acquisition of the point cloud of a morphologically complex statue

  3. Laser scanner data processing and 3D modeling using a free and open source software

    Energy Technology Data Exchange (ETDEWEB)

    Gabriele, Fatuzzo [Dept. of Industrial and Mechanical Engineering, University of Catania (Italy); Michele, Mangiameli, E-mail: amichele.mangiameli@dica.unict.it; Giuseppe, Mussumeci; Salvatore, Zito [Dept. of Civil Engineering and Architecture, University of Catania (Italy)

    2015-03-10

    Laser scanning is a technology that allows the geometry of objects to be surveyed in a short time with a high level of detail and completeness, based on the signal emitted by the laser and the corresponding return signal. When the incident laser radiation hits the object to be detected, the radiation is reflected. The purpose is to build a three-dimensional digital model that allows the reality of the object to be reconstructed and studies to be conducted regarding its design, restoration and/or conservation. When the laser scanner is equipped with a digital camera, the result of the measurement process is a set of points in XYZ coordinates of high density and accuracy, with radiometric and RGB tones. In this case, the set of measured points is called a “point cloud” and allows the reconstruction of the Digital Surface Model. The post-processing is usually performed with closed-source software, whose copyright restricts free use, whereas free and open source software can greatly improve performance, since it can be used freely and offers the possibility to inspect and even customize the source code. The experience started at the Faculty of Engineering in Catania is aimed at finding a valuable free and open source tool, MeshLab (an Italian software for data processing), to be compared with a reference closed-source software for data processing, i.e. RapidForm. In this work, we compare the results obtained with MeshLab and RapidForm through the planning of the survey and the acquisition of the point cloud of a morphologically complex statue.

  4. Effects of grain source, grain processing, and protein degradability on rumen kinetics and microbial protein synthesis in Boer kids.

    Science.gov (United States)

    Brassard, M-E; Chouinard, P Y; Berthiaume, R; Tremblay, G F; Gervais, R; Martineau, R; Cinq-Mars, D

    2015-11-01

    Microbial protein synthesis in the rumen would be optimized when dietary carbohydrates and proteins have synchronized rates and extents of degradation. The aim of this study was to evaluate the effect of varying the ruminal degradation rate of energy and nitrogen sources on intake, nitrogen balance, microbial protein yield, and kinetics of nutrients in the rumen of growing kids. Eight Boer goats (38.2 ± 3.0 kg) were used. The treatments were arranged in a split-plot Latin square design with grain sources (barley or corn) forming the main plots (squares). Grain processing methods and levels of protein degradability formed the subplots in a 2 × 2 factorial arrangement for a total of 8 dietary treatments. The grain processing method was rolling for barley and cracking for corn. Levels of protein degradability were obtained by feeding untreated soybean meal (SBM) or heat-treated soybean meal (HSBM). Each experimental period lasted 21 d, consisting of a 10-d adaptation period, a 7-d digestibility determination period, and a 4-d rumen evacuation and sampling period. Kids fed with corn had higher purine derivative (PD) excretion when coupled with SBM compared with HSBM, and the opposite occurred with barley-fed kids (P ≤ 0.01). Unprocessed grain offered with SBM led to higher PD excretion than with HSBM, whereas protein degradability had no effect when processed grain was fed (P ≤ 0.03). Results of the current experiment with high-concentrate diets showed that microbial N synthesis could be maximized in goat kids by combining slowly fermented grains (corn or unprocessed grains) with a highly degradable protein supplement (SBM). With barley, a more rapidly fermented grain, greater microbial N synthesis was observed when supplementing a low-degradability protein (HSBM).

  5. Sources of Law: Approach in the Light of Disciplinary Process Right

    Directory of Open Access Journals (Sweden)

    Alexandre dos Santos Lopes

    2016-10-01

    Full Text Available This article aims to analyze the sources of law that are correlated with disciplinary procedural law, especially in view of the influence of principles and axiological values arising from the constitution on this procedural species. Outlining the sources of law related to this kind of administrative process is a significant challenge, insofar as its structure, especially in the new (post-positivist) constitutional order, allows one, starting from a constitutional perspective and filter, to define more precisely their standing, features and densification within the Brazilian legal system, enabling a better framing of the disciplinary procedural legal relationship.

  6. microRNA expression profile in human coronary smooth muscle cell-derived microparticles is a source of biomarkers.

    Science.gov (United States)

    de Gonzalo-Calvo, David; Cenarro, Ana; Civeira, Fernando; Llorente-Cortes, Vicenta

    2016-01-01

    microRNA (miRNA) expression profile of extracellular vesicles is a potential tool for clinical practice. Despite the key role of vascular smooth muscle cells (VSMC) in cardiovascular pathology, there is limited information about the presence of miRNAs in microparticles secreted by this cell type, including human coronary artery smooth muscle cells (HCASMC). Here, we tested whether HCASMC-derived microparticles contain miRNAs and the value of these miRNAs as biomarkers. HCASMC and explants from atherosclerotic or non-atherosclerotic areas were obtained from coronary arteries of patients undergoing heart transplant. Plasma samples were collected from: normocholesterolemic controls (N=12) and familial hypercholesterolemia (FH) patients (N=12). Both groups were strictly matched for age, sex and cardiovascular risk factors. Microparticle (0.1-1μm) isolation and characterization was performed using standard techniques. VSMC-enriched miRNA expression (miR-21-5p, -143-3p, -145-5p, -221-3p and -222-3p) was analyzed using RT-qPCR. Total RNA isolated from HCASMC-derived microparticles contained small RNAs, including VSMC-enriched miRNAs. Exposure of HCASMC to pathophysiological conditions, such as hypercholesterolemia, induced a decrease in the expression level of miR-143-3p and miR-222-3p in microparticles, but not in cells. Expression levels of miR-222-3p were lower in circulating microparticles from FH patients compared to normocholesterolemic controls. Microparticles derived from atherosclerotic plaque areas showed a decreased level of miR-143-3p and miR-222-3p compared to non-atherosclerotic areas. We demonstrated for the first time that microparticles secreted by HCASMC contain microRNAs. Hypercholesterolemia alters the microRNA profile of HCASMC-derived microparticles. The miRNA signature of HCASMC-derived microparticles is a source of cardiovascular biomarkers. Copyright © 2016 Sociedad Española de Arteriosclerosis. Publicado por Elsevier España, S.L.U. All rights reserved.

  7. Estimating Source Duration for Moderate and Large Earthquakes in Taiwan

    Science.gov (United States)

    Chang, Wen-Yen; Hwang, Ruey-Der; Ho, Chien-Yin; Lin, Tzu-Wei

    2017-04-01

    Estimating Source Duration for Moderate and Large Earthquakes in Taiwan Wen-Yen Chang1, Ruey-Der Hwang2, Chien-Yin Ho3 and Tzu-Wei Lin4 1 Department of Natural Resources and Environmental Studies, National Dong Hwa University, Hualien, Taiwan, ROC 2Department of Geology, Chinese Culture University, Taipei, Taiwan, ROC 3Department of Earth Sciences, National Cheng Kung University, Tainan, Taiwan, ROC 4Seismology Center, Central Weather Bureau, Taipei, Taiwan, ROC ABSTRACT Constructing a relationship between seismic moment (M0) and source duration (t) is important for seismic hazard assessment in Taiwan, where earthquakes are quite active. In this study, we used a proposed inversion process based on teleseismic P-waves to derive the M0-t relationship in the Taiwan region for the first time. Fifteen earthquakes with MW 5.5-7.1 and focal depths of less than 40 km were adopted. The inversion process could simultaneously determine source duration, focal depth, and pseudo radiation patterns of the direct P-wave and two depth phases, from which M0 and fault plane solutions were estimated. Results showed that the estimated t, ranging from 2.7 to 24.9 sec, varied with the one-third power of M0. That is, M0 is proportional to t**3, and the relationship between them is M0 = 0.76*10**23*(t)**3, where M0 is in dyne-cm and t in seconds. The M0-t relationship derived from this study is very close to those determined from global moderate to large earthquakes. To further assess the validity of the derived M0-t relationship, we used it to infer the source duration of the 1999 Chi-Chi (Taiwan) earthquake with M0 = 2-5*10**27 dyne-cm (corresponding to Mw = 7.5-7.7) to be approximately 29-40 sec, in agreement with many previous studies of source duration (28-42 sec).
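
    A minimal numerical sketch of the reported scaling (not the authors' inversion code): inverting M0 = 0.76*10**23*t**3 for t reproduces the quoted Chi-Chi duration range of roughly 29-40 sec. The coefficient and moment values come from the abstract; everything else is illustrative.

```python
# Minimal sketch: apply the reported scaling M0 = 0.76e23 * tau**3
# (M0 in dyne-cm, tau in seconds) to estimate source duration from moment.

def source_duration(m0_dyne_cm, coeff=0.76e23):
    """Invert M0 = coeff * tau**3 for the source duration tau (seconds)."""
    return (m0_dyne_cm / coeff) ** (1.0 / 3.0)

if __name__ == "__main__":
    # Reported range for the 1999 Chi-Chi earthquake: M0 = 2-5 x 10**27 dyne-cm
    for m0 in (2e27, 5e27):
        print(f"M0 = {m0:.1e} dyne-cm -> tau ~ {source_duration(m0):.1f} s")
```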

  8. Process performance and modelling of anaerobic digestion using source-sorted organic household waste

    DEFF Research Database (Denmark)

    Khoshnevisan, Benyamin; Tsapekos, Panagiotis; Alvarado-Morales, Merlin

    2018-01-01

    Three distinctive start-up strategies of biogas reactors fed with source-sorted organic fraction of municipal solid waste were investigated to reveal the most reliable procedure for rapid process stabilization. Moreover, the experimental results were compared with mathematical modeling outputs. The combination of both experimental and modelling/simulation work succeeded in optimizing the start-up process for anaerobic digestion of biopulp under mesophilic conditions.

  9. Concepts and Criteria for Blind Quantum Source Separation and Blind Quantum Process Tomography

    Directory of Open Access Journals (Sweden)

    Alain Deville

    2017-07-01

    Full Text Available Blind Source Separation (BSS is an active domain of Classical Information Processing, with well-identified methods and applications. The development of Quantum Information Processing has made possible the appearance of Blind Quantum Source Separation (BQSS, with a recent extension towards Blind Quantum Process Tomography (BQPT. This article investigates the use of several fundamental quantum concepts in the BQSS context and establishes properties already used without justification in that context. It mainly considers a pair of electron spins initially separately prepared in a pure state and then submitted to an undesired exchange coupling between these spins. Some consequences of the existence of the entanglement phenomenon, and of the probabilistic aspect of quantum measurements, upon BQSS solutions, are discussed. An unentanglement criterion is established for the state of an arbitrary qubit pair, expressed first with probability amplitudes and secondly with probabilities. The interest of using the concept of a random quantum state in the BQSS context is presented. It is stressed that the concept of statistical independence of the sources, widely used in classical BSS, should be used with care in BQSS, and possibly replaced by some disentanglement principle. It is shown that the coefficients of the development of any qubit pair pure state over the states of an orthonormal basis can be expressed with the probabilities of results in the measurements of well-chosen spin components.
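
    As an illustration of the kind of amplitude-based criterion mentioned above, the sketch below uses the textbook condition for a two-qubit pure state a|00> + b|01> + c|10> + d|11> to be a product (unentangled) state, namely ad - bc = 0; this is the standard result and may differ in form from the article's own expression.

```python
# Illustrative sketch of the standard amplitude-based unentanglement test
# for a two-qubit pure state |psi> = a|00> + b|01> + c|10> + d|11>:
# the state is a product state iff a*d - b*c = 0 (the concurrence is 2|ad - bc|).
import numpy as np

def is_unentangled(a, b, c, d, tol=1e-12):
    """Return True if the pure two-qubit state is (numerically) a product state."""
    return abs(a * d - b * c) < tol

# Product state (|0> + |1>)/sqrt(2) tensor |0>  ->  unentangled
print(is_unentangled(1 / np.sqrt(2), 0, 1 / np.sqrt(2), 0))   # True
# Bell state (|00> + |11>)/sqrt(2)              ->  entangled
print(is_unentangled(1 / np.sqrt(2), 0, 0, 1 / np.sqrt(2)))   # False
```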

  10. Ion source techniques for high-speed processing of material surface by ion beams

    International Nuclear Information System (INIS)

    Ishikawa, Junzo

    1990-01-01

    The present paper discusses some key or candidate techniques for future ion source development and such ion sources developed by the author. Several types of microwave ion sources for producing low charge state ions have been developed in Japan. When a microwave plasma cathode developed by the author is adapted to a Kaufman type ion source, the electron emission currents are found to be 2.5 A for argon gas and 0.5-0.9 A for oxygen gas. An alternative ionization method for metal atoms is strongly required for high-speed processing of material surface by metal-ion beams. Detailed discussion is made of collisional ionization of vaporized atoms, and negative-ion production (secondary negative-ion emission by sputtering). An impregnated electrode type liquid-metal ion source developed by the author, which has a porous tip structure, is described. The negative-ion production efficiency is quite high. The report also presents a neutral and ionized alkaline-metal bombardment type heavy negative-ion source, which consists of a cesium plasma ion source, suppressor, target electrode, negative-ion extraction electrode, and einzel lens. (N.K.)

  11. Path spectra derived from inversion of source and site spectra for earthquakes in Southern California

    Science.gov (United States)

    Klimasewski, A.; Sahakian, V. J.; Baltay, A.; Boatwright, J.; Fletcher, J. B.; Baker, L. M.

    2017-12-01

    A large source of epistemic uncertainty in Ground Motion Prediction Equations (GMPEs) is derived from the path term, currently represented as a simple geometric spreading and intrinsic attenuation term. Including additional physical relationships between the path properties and predicted ground motions would produce more accurate and precise, region-specific GMPEs by reclassifying some of the random, aleatory uncertainty as epistemic. This study focuses on regions of Southern California, using data from the Anza network and Southern California Seismic network to create a catalog of events magnitude 2.5 and larger from 1998 to 2016. The catalog encompasses regions of varying geology and therefore varying path and site attenuation. Within this catalog of events, we investigate several collections of event region-to-station pairs, each of which share similar origin locations and stations so that all events have similar paths. Compared with a simple regional GMPE, these paths consistently have high or low residuals. By working with events that have the same path, we can isolate source and site effects, and focus on the remaining residual as path effects. We decompose the recordings into source and site spectra for each unique event and site in our greater Southern California regional database using the inversion method of Andrews (1986). This model represents each natural log record spectra as the sum of its natural log event and site spectra, while constraining each record to a reference site or Brune source spectrum. We estimate a regional, path-specific anelastic attenuation (Q) and site attenuation (t*) from the inversion site spectra and corner frequency from the inversion event spectra. We then compute the residuals between the observed record data, and the inversion model prediction (event*site spectra). This residual is representative of path effects, likely anelastic attenuation along the path that varies from the regional median attenuation. We examine the
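
    The following sketch illustrates an Andrews (1986)-style decomposition at a single frequency, with each log record spectrum modeled as the sum of a log event term and a log site term and one reference site pinned to zero; the data layout and the plain least-squares solver are assumptions, not the authors' implementation.

```python
# Hedged sketch (hypothetical data layout) of a single-frequency spectral
# decomposition: log_record[k] ~ event[event_idx[k]] + site[site_idx[k]],
# with the reference site constrained to zero so the system is not degenerate.
import numpy as np

def decompose(log_records, event_idx, site_idx, n_events, n_sites, ref_site=0):
    n_rec = len(log_records)
    A = np.zeros((n_rec, n_events + n_sites))
    for k in range(n_rec):
        A[k, event_idx[k]] = 1.0              # event term for this record
        A[k, n_events + site_idx[k]] = 1.0    # site term for this record
    # Drop the reference-site column, which fixes its coefficient at 0
    keep = [j for j in range(n_events + n_sites) if j != n_events + ref_site]
    x, *_ = np.linalg.lstsq(A[:, keep], np.asarray(log_records), rcond=None)
    events = x[:n_events]
    sites = np.insert(x[n_events:], ref_site, 0.0)
    return events, sites
```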

  12. Groundwater sources and geochemical processes in a crystalline fault aquifer

    Science.gov (United States)

    Roques, Clément; Aquilina, Luc; Bour, Olivier; Maréchal, Jean-Christophe; Dewandel, Benoît; Pauwels, Hélène; Labasque, Thierry; Vergnaud-Ayraud, Virginie; Hochreutener, Rebecca

    2014-11-01

    The origin of water flowing in faults and fractures at great depth is poorly known in crystalline media. This paper describes a field study designed to characterize the geochemical compartmentalization of a deep aquifer system constituted by a graben structure where a permeable fault zone was identified. Analyses of the major chemical elements, trace elements, dissolved gases and stable water isotopes reveal the origin of dissolved components for each permeable domain and provide information on various water sources involved during different seasonal regimes. The geochemical response induced by performing a pumping test in the fault-zone is examined, in order to quantify mixing processes and contribution of different permeable domains to the flow. Reactive processes enhanced by the pumped fluxes are also identified and discussed. The fault zone presents different geochemical responses related to changes in hydraulic regime. They are interpreted as different water sources related to various permeable structures within the aquifer. During the low water regime, results suggest mixing of recent water with a clear contribution of older water of inter-glacial origin (recharge temperature around 7 °C), suggesting the involvement of water trapped in a local low-permeability matrix domain or the contribution of large scale circulation loops. During the high water level period, due to inversion of the hydraulic gradient between the major permeable fault zone and its surrounding domains, modern water predominantly flows down to the deep bedrock and ensures recharge at a local scale within the graben. Pumping in a permeable fault zone induces hydraulic connections with storage-reservoirs. The overlaid regolith domain ensures part of the flow rate for long term pumping (around 20% in the present case). During late-time pumping, orthogonal fluxes coming from the fractured domains surrounding the major fault zone are dominant. Storage in the connected fracture network within the

  13. The Chandra Source Catalog 2.0: Combining Data for Processing (or How I learned 17 different words for "group")

    Science.gov (United States)

    Hain, Roger; Allen, Christopher E.; Anderson, Craig S.; Budynkiewicz, Jamie A.; Burke, Douglas; Chen, Judy C.; Civano, Francesca Maria; D'Abrusco, Raffaele; Doe, Stephen M.; Evans, Ian N.; Evans, Janet D.; Fabbiano, Giuseppina; Gibbs, Danny G., II; Glotfelty, Kenny J.; Graessle, Dale E.; Grier, John D.; Hall, Diane M.; Harbo, Peter N.; Houck, John C.; Lauer, Jennifer L.; Laurino, Omar; Lee, Nicholas P.; Martínez-Galarza, Juan Rafael; McCollough, Michael L.; McDowell, Jonathan C.; Miller, Joseph; McLaughlin, Warren; Morgan, Douglas L.; Mossman, Amy E.; Nguyen, Dan T.; Nichols, Joy S.; Nowak, Michael A.; Paxson, Charles; Plummer, David A.; Primini, Francis Anthony; Rots, Arnold H.; Siemiginowska, Aneta; Sundheim, Beth A.; Tibbetts, Michael; Van Stone, David W.; Zografou, Panagoula

    2018-01-01

    The Second Chandra Source Catalog (CSC2.0) combines data at multiple stages to improve detection efficiency, enhance source region identification, and match observations of the same celestial source taken with significantly different point spread functions on Chandra's detectors. The need to group data for different reasons at different times in processing results in a hierarchy of groups to which individual sources belong. Source data are initially identified as belonging to each Chandra observation ID and number (an "obsid"). Data from each obsid whose pointings are within sixty arcseconds of each other are reprojected to the same aspect reference coordinates and grouped into stacks. Detection is performed on all data in the same stack, and individual sources are identified. Finer source position and region data are determined by further processing sources whose photons may be commingled together, grouping such sources into bundles. Individual stacks which overlap to any extent are grouped into ensembles, and all stacks in the same ensemble are later processed together to identify master sources and determine their properties.We discuss the basis for the various methods of combining data for processing and precisely define how the groups are determined. We also investigate some of the issues related to grouping data and discuss what options exist and how groups have evolved from prior releases.This work has been supported by NASA under contract NAS 8-03060 to the Smithsonian Astrophysical Observatory for operation of the Chandra X-ray Center.
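
    A toy sketch of the stack-grouping rule described above (not CSC pipeline code): observations whose aim points lie within sixty arcseconds are linked, and linked observations are collected into stacks with a union-find pass; the transitive closure and the small-angle separation formula are simplifying assumptions.

```python
# Illustrative grouping of observation pointings into "stacks".
import math

def angular_sep_arcsec(p1, p2):
    """Small-angle separation between two (ra_deg, dec_deg) pointings."""
    dra = (p1[0] - p2[0]) * math.cos(math.radians(0.5 * (p1[1] + p2[1])))
    ddec = p1[1] - p2[1]
    return 3600.0 * math.hypot(dra, ddec)

def group_into_stacks(pointings, max_sep=60.0):
    parent = list(range(len(pointings)))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i
    for i in range(len(pointings)):
        for j in range(i + 1, len(pointings)):
            if angular_sep_arcsec(pointings[i], pointings[j]) <= max_sep:
                parent[find(i)] = find(j)
    stacks = {}
    for i in range(len(pointings)):
        stacks.setdefault(find(i), []).append(i)
    return list(stacks.values())
```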

  14. Processing of pro-opiomelanocortin-derived amidated joining peptide and glycine-extended precursor in monkey pituitary

    DEFF Research Database (Denmark)

    Fenger, M

    1991-01-01

    The molecular forms of proopiomelanocortin (POMC)-derived amidated and C-terminal glycine-extended joining peptide from monkey (Macaca mulatta) pituitary were determined. The predominant forms of joining peptide found were the low molecular weight peptides POMC(76-105) and POMC(76-106), respectively. ... Not only is the sequence of monkey and human POMC extremely conserved, but also the processing patterns are similar. The monkey therefore serves as a suitable model for studying regulation of the processing of POMC and the hypothalamus-pituitary-adrenal axis in man.

  15. How Reservoirs Alter DOM Amount and Composition: Sources, Sinks, and Transformations

    Science.gov (United States)

    Kraus, T. E.; Bergamaschi, B. A.; Hernes, P. J.; Doctor, D. H.; Kendall, C.; Losee, R. F.; Downing, B. D.

    2011-12-01

    Reservoirs are critical components of many water supply systems as they allow the storage of water when supply exceeds demand. However, during water storage biogeochemical processes can alter both the amount and composition of dissolved organic matter (DOM), which can in turn affect water quality. While the balance between production and loss determines whether a reservoir is a net sink or source of DOM, changes in chemical composition are also relevant as they affect DOM reactivity (e.g. persistence in the environment, removability during coagulation treatment, and potential to form toxic compounds during drinking water treatment). The composition of the DOM pool also provides information about the DOM sources and processing, which can inform reservoir management. We examined the concentration and composition of DOM in San Luis Reservoir (SLR), a large off-stream impoundment of the California State Water Project. We used an array of DOM chemical tracers including dissolved organic carbon (DOC) concentration, optical properties, isotopic composition, lignin phenol content, and structural groupings determined by 13C NMR. There were periods when the reservoir was i) a net source of DOM due to the predominance of algal production (summer), ii) a net sink due to the predominance of degradation (fall/winter), and iii) balanced between production and consumption (spring). Despite only moderate variation in bulk DOC concentration (3.0-3.6 mg C/L), substantial changes in DOM composition indicated that terrestrial-derived material entering the reservoir was being degraded and replaced by aquatic-derived DOM produced within the reservoir. Results suggest reservoirs have the potential to reduce DOM amount and reactivity via degradative processes, however, these benefits can be decreased or even negated by the production of algal-derived DOM.

  16. A Derivation of Source-based Kinetics Equation with Time Dependent Fission Kernel for Reactor Transient Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Song Hyun; Woo, Myeong Hyun; Shin, Chang Ho [Hanyang University, Seoul (Korea, Republic of); Pyeon, Cheol Ho [Kyoto University, Osaka (Japan)

    2015-10-15

    In this study, a new source-based balance equation is proposed to overcome the problems generated by the previous methods, and a simple problem is then analyzed with the proposed method. A source-based balance equation with a time dependent fission kernel was derived to simplify the kinetics equation. To analyze the partial variations of reactor characteristics, two representative methods were introduced in previous studies: (1) the quasi-statics method and (2) the multipoint technique. The main idea of the quasi-statics method is to use a low-order approximation for large integration times. To realize the quasi-statics method, the time dependent flux is first separated into shape and amplitude functions, and the shape function is calculated. The method has good accuracy; however, it can be computationally expensive because the shape function should be fully recalculated to obtain accurate results. To improve the calculation efficiency, the multipoint method was proposed. The multipoint method is based on the classic kinetics equation, using a Green's function to analyze the flight probability from region r' to r. Those previous methods have been used for reactor kinetics analysis; however, they can have some limitations. First, three group variables (r_g, E_g, t_g) should be considered to solve the time dependent balance equation, which severely limits application to large-system problems with good accuracy. Second, energy-group neutrons should be used to analyze reactor kinetics problems. In time dependent problems, the neutron energy distribution can change over time, which affects the group cross sections and can therefore cause accuracy problems. Third, the neutrons in one space-time region continually affect the other space-time regions; however, this is not properly considered in the previous methods. Using birth history of the

  17. Technological yields of sources for radiation processing

    International Nuclear Information System (INIS)

    Zagorski, Z.P.

    1990-01-01

    The present report is prepared for planners of radiation processing of any material. Sources with cobalt-60 are treated marginally, because most probably, there will be no installation of technically meaningful activity in Poland before the year 2000. Calculations are focused on accelerators of electrons, divided into two groups: versatile linacs of energy up to 13 MeV and accelerators of lower energy, below 2 MeV, with better energy yield but more limited applications. The calculations are connected with the confrontation of the author's technological expectations during the preparation of the linac project in the late '60s, with the results of twenty years of exploitation of the machine. One has to realize that from the 150 kW of input power drawn from the mains, only 5 kW of bent and scanned beam power is recovered on the conveyor. That power is only partially used for radiation induced phenomena, sometimes only a few percent, because of the demanded homogeneity of the dose, of the mode of packing of the object and its shape, of losses at the edges of the scanned area and in the spaces between boxes, and of losses during the dead time due to the tuning of the machine and dosimetric operations. The use of lower energy accelerators may be more economical in case of objects of optimum type. At the first stage, that is of the conversion of electrical power into that of the low energy electron beam, the yield is 2-3 times better than in the case of linacs. Attention has been paid to the technological aspects of electron beam conversion into the more penetrating bremsstrahlung similar to gamma radiation. The advantages of these technologies, which make it possible to control the shape of the processed object are stressed. Ten parameters necessary for a proper calculation of technological yields of radiation processing are listed. Additional conditions which must be taken into account in the comparison of the cost of radiation processing with the cost of other technologies are also

  18. Partial podocyte replenishment in experimental FSGS derives from nonpodocyte sources.

    Science.gov (United States)

    Kaverina, Natalya V; Eng, Diana G; Schneider, Remington R S; Pippin, Jeffrey W; Shankland, Stuart J

    2016-06-01

    The current studies used genetic fate mapping to prove that adult podocytes can be partially replenished following depletion. Inducible NPHS2-rtTA/tetO-Cre/RS-ZsGreen-R reporter mice were generated to permanently label podocytes with the ZsGreen reporter. Experimental focal segmental glomerulosclerosis (FSGS) was induced with a cytotoxic podocyte antibody. On FSGS day 7, immunostaining for the podocyte markers p57, synaptopodin, and podocin was markedly decreased by 44%, and this was accompanied by a decrease in ZsGreen fluorescence. The nuclear stain DAPI was absent in segments of reduced ZsGreen and podocyte marker staining, which is consistent with podocyte depletion. Staining for p57, synaptopodin, podocin, and DAPI increased at FSGS day 28 and was augmented by the ACE inhibitor enalapril, which is consistent with a partial replenishment of podocytes. In contrast, ZsGreen fluorescence did not return and remained significantly low at day 28, indicating replenishment was from a nonpodocyte origin. Despite administration of bromodeoxyuridine (BrdU) thrice weekly throughout the course of disease, BrdU staining was not detected in podocytes, which is consistent with an absence of proliferation. Although ZsGreen reporting was reduced in the tuft at FSGS day 28, labeled podocytes were detected along the Bowman's capsule in a subset of glomeruli, which is consistent with migration from the tuft. Moreover, more than half of the migrated podocytes coexpressed the parietal epithelial cell (PEC) proteins claudin-1, SSeCKS, and PAX8. These results show that although podocytes can be partially replenished following abrupt depletion, a process augmented by ACE inhibition, the source or sources are nonpodocyte in origin and are independent of proliferation. Furthermore, a subset of podocytes migrate to the Bowman's capsule and begin to coexpress PEC markers. Copyright © 2016 the American Physiological Society.

  19. Use of the discriminant Fourier-derived cepstrum with feature-level post-processing for surface electromyographic signal classification

    International Nuclear Information System (INIS)

    Chen, Xinpu; Zhu, Xiangyang; Zhang, Dingguo

    2009-01-01

    Myoelectrical pattern classification is a crucial part in multi-functional prosthesis control. This paper investigates a discriminant Fourier-derived cepstrum (DFC) and feature-level post-processing (FLPP) to discriminate hand and wrist motions using the surface electromyographic signal. The Fourier-derived cepstrum takes advantage of the Fourier magnitude or sub-band power energy of signals directly and provides flexible use of spectral information changing with different motions. Appropriate cepstral coefficients are selected by a proposed separability criterion to construct DFC features. For the post-processing, FLPP which combines features from several analysis windows is used to improve the feature performance further. In this work, two classifiers (a linear discriminant classifier and quadratic discriminant classifier) without hyper-parameter optimization are employed to simplify the training procedure and avoid the possible bias of feature evaluation. Experimental results of the 11-motion problem show that the proposed DFC feature outperforms traditional features such as time-domain statistics and autoregressive-derived cepstrum in terms of the classification accuracy, and it is a promising method for the multi-functionality and high-accuracy control of myoelectric prostheses
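
    A hedged sketch of the two ingredients named in the abstract: a Fourier-derived cepstrum feature for one analysis window, and a simple feature-level post-processing step that combines several windows. Window length, coefficient count, and averaging as the FLPP rule are assumptions rather than the paper's exact settings.

```python
# Illustrative Fourier-derived cepstrum feature for one surface EMG window,
# followed by a simple feature-level post-processing (FLPP) combination step.
import numpy as np

def fourier_cepstrum(window, n_coeffs=6):
    """Return the first n_coeffs real cepstral coefficients of the window."""
    spectrum = np.fft.rfft(window * np.hanning(len(window)))
    log_mag = np.log(np.abs(spectrum) + 1e-12)   # log magnitude spectrum
    cepstrum = np.fft.irfft(log_mag)             # back to the quefrency domain
    return cepstrum[:n_coeffs]

def flpp(feature_windows):
    """Combine features from several consecutive analysis windows (here: mean)."""
    return np.mean(np.asarray(feature_windows), axis=0)
```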

  20. TOXICOLOGICAL EVALUATION OF REALISTIC EMISSIONS OF SOURCE AEROSOLS (TERESA): APPLICATION TO POWER PLANT-DERIVED PM2.5

    Energy Technology Data Exchange (ETDEWEB)

    Annette Rohr

    2006-03-01

    TERESA (Toxicological Evaluation of Realistic Emissions of Source Aerosols) involves exposing laboratory rats to realistic coal-fired power plant and mobile source emissions to help determine the relative toxicity of these PM sources. There are three coal-fired power plants in the TERESA program; this report describes the results of fieldwork conducted at the first plant, located in the Upper Midwest. The project was technically challenging by virtue of its novel design and requirement for the development of new techniques. By examining aged, atmospherically transformed aerosol derived from power plant stack emissions, we were able to evaluate the toxicity of PM derived from coal combustion in a manner that more accurately reflects the exposure of concern than existing methodologies. TERESA also involves assessment of actual plant emissions in a field setting--an important strength since it reduces the question of representativeness of emissions. A sampling system was developed and assembled to draw emissions from the stack; stack sampling conducted according to standard EPA protocol suggested that the sampled emissions are representative of those exiting the stack into the atmosphere. Two mobile laboratories were then outfitted for the study: (1) a chemical laboratory in which the atmospheric aging was conducted and which housed the bulk of the analytical equipment; and (2) a toxicological laboratory, which contained animal caging and the exposure apparatus. Animal exposures were carried out from May-November 2004 to a number of simulated atmospheric scenarios. Toxicological endpoints included (1) pulmonary function and breathing pattern; (2) bronchoalveolar lavage fluid cytological and biochemical analyses; (3) blood cytological analyses; (4) in vivo oxidative stress in heart and lung tissue; and (5) heart and lung histopathology. Results indicated no differences between exposed and control animals in any of the endpoints examined. Exposure concentrations for the

  1. Source terms derived from analyses of hypothetical accidents, 1950-1986

    International Nuclear Information System (INIS)

    Stratton, W.R.

    1987-01-01

    This paper reviews the history of reactor accident source term assumptions. After the Three Mile Island accident, a number of theoretical and experimental studies re-examined possible accident sequences and source terms. Some of these results are summarized in this paper

  2. Disambiguating past events: Accurate source memory for time and context depends on different retrieval processes.

    Science.gov (United States)

    Persson, Bjorn M; Ainge, James A; O'Connor, Akira R

    2016-07-01

    Current animal models of episodic memory are usually based on demonstrating integrated memory for what happened, where it happened, and when an event took place. These models aim to capture the testable features of the definition of human episodic memory which stresses the temporal component of the memory as a unique piece of source information that allows us to disambiguate one memory from another. Recently though, it has been suggested that a more accurate model of human episodic memory would include contextual rather than temporal source information, as humans' memory for time is relatively poor. Here, two experiments were carried out investigating human memory for temporal and contextual source information, along with the underlying dual process retrieval processes, using an immersive virtual environment paired with a 'Remember-Know' memory task. Experiment 1 (n=28) showed that contextual information could only be retrieved accurately using recollection, while temporal information could be retrieved using either recollection or familiarity. Experiment 2 (n=24), which used a more difficult task, resulting in reduced item recognition rates and therefore less potential for contamination by ceiling effects, replicated the pattern of results from Experiment 1. Dual process theory predicts that it should only be possible to retrieve source context from an event using recollection, and our results are consistent with this prediction. That temporal information can be retrieved using familiarity alone suggests that it may be incorrect to view temporal context as analogous to other typically used source contexts. This latter finding supports the alternative proposal that time since presentation may simply be reflected in the strength of memory trace at retrieval - a measure ideally suited to trace strength interrogation using familiarity, as is typically conceptualised within the dual process framework. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. Separation and purification of hemicellulose-derived saccharides from wood hydrolysate by combined process.

    Science.gov (United States)

    Wang, Xiaojun; Zhuang, Jingshun; Jiang, Jungang; Fu, Yingjuan; Qin, Menghua; Wang, Zhaojiang

    2015-11-01

    Prehydrolysis of wood biomass prior to kraft cooking provides a stream containing hemicellulose-derived saccharides (HDSs) but also undesired non-saccharide compounds (NSCs) that resulted from lignin depolymerization and carbohydrate degradation. In this study, a combined process consisting of lime treatment, resin adsorption, and gel filtration was developed to separate HDSs from NSCs. The macro-lignin impurities that accounted for 32.2% of NSCs were removed by lime treatment at 1.2% dosage with negligible HDSs loss. The majority of NSCs, lignin-derived phenolics, were eliminated by mixed bed ion exchange resin, elevating NSCs removal to 94.0%. The remaining NSCs, furfural and hydroxymethylfurfural, were excluded from HDSs by gel filtration. Chemical composition analysis showed that xylooligosaccharides (XOS) with a degree of polymerization from 2 to 6 accounted for 28% of the total purified HDSs. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Foetal stem cell derivation & characterization for osteogenic lineage

    Directory of Open Access Journals (Sweden)

    A Mangala Gowri

    2013-01-01

    Full Text Available Background & objectives: Mesenchymal stem cells (MSCs) derived from foetal tissues present a multipotent progenitor cell source for application in tissue engineering and regenerative medicine. The present study was carried out to derive foetal mesenchymal stem cells from an ovine source and analyze their differentiation to the osteogenic lineage, to serve as an animal model to predict human applications. Methods: Isolation and culture of sheep foetal bone marrow cells were performed, and a uniform clonally derived MSC population was collected. The cells were characterized using cytochemical, immunophenotyping, biochemical and molecular analyses. The cells with defined characteristics were differentiated into osteogenic lineages and analysis of the differentiated cell types was done. The cells were analyzed for cell surface marker expression, and gene expression in undifferentiated and differentiated osteoblasts was checked by reverse transcriptase PCR (RT-PCR) analysis and confirmed by sequencing using a genetic analyzer. Results: Ovine foetal samples were processed to obtain mononuclear (MNC) cells, which on culture showed spindle morphology with a characteristic oval body and flattened ends. The CD45-/CD14- MSC population was cultured by limiting dilution to arrive at cells of uniform spindle morphology and colony forming units. The cells were shown to be positive for surface markers such as CD44, CD54 and integrin β1, and for intracellular collagen type I/III and fibronectin. The osteogenically induced MSCs were analyzed for alkaline phosphatase (ALP) activity and mineral deposition. The undifferentiated MSCs expressed RAB3B, a candidate marker for stemness in MSCs. Both osteogenically induced and uninduced MSCs expressed collagen type I, while the MMP13 gene was expressed in osteogenically induced cells. Interpretation & conclusions: The protocol for isolation of ovine foetal bone marrow derived MSCs was simple to perform, and the culture method for obtaining cells of pure spindle morphology was established

  5. Potential applications of plant based derivatives as fat replacers, antioxidants and antimicrobials in fresh and processed meat products.

    Science.gov (United States)

    Hygreeva, Desugari; Pandey, M C; Radhakrishna, K

    2014-09-01

    Growing concern about diet and health has led to the development of healthier food products. In general, consumers perceive the intake of meat and meat products as unhealthy because it may increase the risk of diseases such as cardiovascular disease, obesity and cancer, owing to its high fat content (especially saturated fat) and added synthetic antioxidants and antimicrobials. The addition to meat products of plant derivatives having antioxidant components, including vitamins A, C and E, minerals, polyphenols, flavonoids and terpenoids, may decrease the risk of several degenerative diseases. To change consumer attitudes towards meat consumption, the meat industry is undergoing major transformations through the addition of nonmeat ingredients as animal fat replacers, natural antioxidants and antimicrobials, preferably derived from plant sources. Copyright © 2014 Elsevier Ltd. All rights reserved.

  6. Microseismic imaging using a source-independent full-waveform inversion method

    KAUST Repository

    Wang, Hanchen

    2016-09-06

    Using full waveform inversion (FWI) to locate microseismic and image microseismic events allows for an automatic process (free of picking) that utilizes the full wavefield. However, waveform inversion of microseismic events faces incredible nonlinearity due to the unknown source location (space) and function (time). We develop a source independent FWI of microseismic events to invert for the source image, source function and the velocity model. It is based on convolving reference traces with the observed and modeled data to mitigate the effect of an unknown source ignition time. The adjoint-state method is used to derive the gradient for the source image, source function and velocity updates. The extended image for source wavelet in z axis is extracted to check the accuracy of the inverted source image and velocity model. Also the angle gather is calculated to see if the velocity model is correct. By inverting for all the source image, source wavelet and the velocity model, the proposed method produces good estimates of the source location, ignition time and the background velocity for part of the SEG overthrust model.
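
    A minimal sketch of the convolution-based, source-independent residual described above: each observed trace is convolved with a reference trace from the modeled data and vice versa, so the unknown source ignition time and wavelet cancel. The reference-trace choice and scaling are assumptions, not the authors' implementation.

```python
# Illustrative source-independent residual and misfit built from trace convolutions.
import numpy as np

def source_independent_residual(d_obs, d_syn, ref_idx=0):
    """d_obs, d_syn: arrays of shape (n_receivers, n_samples)."""
    ref_obs, ref_syn = d_obs[ref_idx], d_syn[ref_idx]
    residuals = []
    for i in range(d_obs.shape[0]):
        lhs = np.convolve(d_obs[i], ref_syn)   # observed trace * modeled reference
        rhs = np.convolve(d_syn[i], ref_obs)   # modeled trace  * observed reference
        residuals.append(lhs - rhs)
    return np.asarray(residuals)

def misfit(d_obs, d_syn, ref_idx=0):
    r = source_independent_residual(d_obs, d_syn, ref_idx)
    return 0.5 * np.sum(r ** 2)
```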

  8. The Chandra Source Catalog : Automated Source Correlation

    Science.gov (United States)

    Hain, Roger; Evans, I. N.; Evans, J. D.; Glotfelty, K. J.; Anderson, C. S.; Bonaventura, N. R.; Chen, J. C.; Davis, J. E.; Doe, S. M.; Fabbiano, G.; Galle, E.; Gibbs, D. G.; Grier, J. D.; Hall, D. M.; Harbo, P. N.; He, X.; Houck, J. C.; Karovska, M.; Lauer, J.; McCollough, M. L.; McDowell, J. C.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Nowak, M. A.; Plummer, D. A.; Primini, F. A.; Refsdal, B. L.; Rots, A. H.; Siemiginowska, A. L.; Sundheim, B. A.; Tibbetts, M. S.; Van Stone, D. W.; Winkelman, S. L.; Zografou, P.

    2009-01-01

    Chandra Source Catalog (CSC) master source pipeline processing seeks to automatically detect sources and compute their properties. Since Chandra is a pointed mission and not a sky survey, different sky regions are observed for a different number of times at varying orientations, resolutions, and other heterogeneous conditions. While this provides an opportunity to collect data from a potentially large number of observing passes, it also creates challenges in determining the best way to combine different detection results for the most accurate characterization of the detected sources. The CSC master source pipeline correlates data from multiple observations by updating existing cataloged source information with new data from the same sky region as they become available. This process sometimes leads to relatively straightforward conclusions, such as when single sources from two observations are similar in size and position. Other observation results require more logic to combine, such as one observation finding a single, large source and another identifying multiple, smaller sources at the same position. We present examples of different overlapping source detections processed in the current version of the CSC master source pipeline. We explain how they are resolved into entries in the master source database, and examine the challenges of computing source properties for the same source detected multiple times. Future enhancements are also discussed. This work is supported by NASA contract NAS8-03060 (CXC).

  9. On the possibility of the multiple inductively coupled plasma and helicon plasma sources for large-area processes

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jin-Won; Lee, Yun-Seong, E-mail: leeeeys@kaist.ac.kr; Chang, Hong-Young [Low-temperature Plasma Laboratory, Department of Physics, Korea Advanced Institute of Science and Technology, Daejeon 305-701 (Korea, Republic of); An, Sang-Hyuk [Agency of Defense Development, Yuseong-gu, Daejeon 305-151 (Korea, Republic of)

    2014-08-15

    In this study, we attempted to determine the feasibility of multiple inductively coupled plasma (ICP) and helicon plasma sources for large-area processes. Experiments were performed with one and two coils to measure plasma and electrical parameters, and a circuit simulation was performed to estimate the current at each coil in the 2-coil experiment. Based on the results, we could confirm the feasibility of multiple ICP sources, owing to the direct change of impedance with current and the saturation of impedance caused by the skin-depth effect. However, a helicon plasma source is difficult to adapt to a multiple-source configuration, because of the continual change of real impedance with mode transition and the low uniformity of the B-field confinement. As a result, it is expected that ICP can be adapted to multiple sources for large-area processes.

  10. A systematic quantification of the sources of variation of process analytical measurements in the steel industry

    NARCIS (Netherlands)

    Jellema, R.H.; Louwerse, D.J.; Smilde, A.K.; Gerritsen, M.J.P.; Guldemond, D.; Voet, van der H.; Vereijken, P.F.G.

    2003-01-01

    A strategy is proposed for the identification and quantification of sources of variation in a manufacturing process. The strategy involves six steps: identification and selection of factors, model selection, design of the experiments, performing the experiments, estimation of sources of variation,

  11. Forecasting and evaluations of crude oil processing and oil derivatives consumption in Republic of Macedonia up to 2000 year

    International Nuclear Information System (INIS)

    Janevski, Risto

    1998-01-01

    The elaboration of various analyses in the energy field is a common but indispensable procedure and investigation. In the field of crude oil processing and oil derivatives consumption, such analyses form the basis for a wide range of forecasts and evaluations. How credible these forecasts and evaluations will be depends mostly on diligent, precise and accurate data and on the experience of previous years. This paper presents forecasts and evaluations of crude oil processing and oil derivatives consumption in the Republic of Macedonia over the short term, up to the year 2000. (Author)

  12. Real-time speckle variance swept-source optical coherence tomography using a graphics processing unit.

    Science.gov (United States)

    Lee, Kenneth K C; Mariampillai, Adrian; Yu, Joe X Z; Cadotte, David W; Wilson, Brian C; Standish, Beau A; Yang, Victor X D

    2012-07-01

    Advances in swept source laser technology continue to increase the imaging speed of swept-source optical coherence tomography (SS-OCT) systems. These fast imaging speeds are ideal for microvascular detection schemes, such as speckle variance (SV), where interframe motion can cause severe imaging artifacts and loss of vascular contrast. However, full utilization of the laser scan speed has been hindered by the computationally intensive signal processing required by SS-OCT and SV calculations. Using a commercial graphics processing unit that has been optimized for parallel data processing, we report a complete high-speed SS-OCT platform capable of real-time data acquisition, processing, display, and saving at 108,000 lines per second. Subpixel image registration of structural images was performed in real-time prior to SV calculations in order to reduce decorrelation from stationary structures induced by the bulk tissue motion. The viability of the system was successfully demonstrated in a high bulk tissue motion scenario of human fingernail root imaging where SV images (512 × 512 pixels, n = 4) were displayed at 54 frames per second.
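
    A hedged sketch of the core speckle-variance computation (not the GPU implementation): the per-pixel variance of intensity across a gate of N registered structural frames highlights moving scatterers; the gate length of four matches the figure quoted above.

```python
# Illustrative interframe speckle-variance map from a gate of registered frames.
import numpy as np

def speckle_variance(frames):
    """frames: array of shape (N, rows, cols) of registered OCT intensities."""
    return np.var(frames, axis=0)   # per-pixel variance across the N frames

# Example with a hypothetical gate of N = 4 frames of random data
frames = np.random.rand(4, 512, 512)
sv_map = speckle_variance(frames)
```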

  13. Energy and greenhouse gas profiles of polyhydroxybutyrates derived from corn grain: a life cycle perspective.

    Science.gov (United States)

    Kim, Seungdo; Dale, Bruce E

    2008-10-15

    Polyhydroxybutyrates (PHB) are well-known biopolymers derived from sugars or vegetable oils. Cradle-to-gate environmental performance of PHB derived from corn grain is evaluated through life cycle assessment (LCA), particularly nonrenewable energy consumption and greenhouse gas emissions. Site-specific process information on the corn wet milling and PHB fermentation and recovery processes was obtained from Telles. Most of the energy used in the corn wet milling and PHB fermentation and recovery processes is generated in a cogeneration power plant in which corn stover, assumed to be representative of a variety of biomass sources that could be used, is burned to generate electricity and steam. County level agricultural information is used in estimating the environmental burdens associated with both corn grain and corn stover production. Results show that PHB derived from corn grain offers environmental advantages over petroleum-derived polymers in terms of nonrenewable energy consumption and greenhouse gas emissions. Furthermore, PHB provides greenhouse gas credits, and thus PHB use reduces greenhouse gas emissions compared to petroleum-derived polymers. Corn cultivation is one of the environmentally sensitive areas in the PHB production system. More sustainable practices in corn cultivation (e.g., using no-tillage and winter cover crops) could reduce the environmental impacts of PHB by up to 72%.

  14. Effect of Source, Surfactant, and Deposition Process on Electronic Properties of Nanotube Arrays

    Directory of Open Access Journals (Sweden)

    Dheeraj Jain

    2011-01-01

    Full Text Available The electronic properties of arrays of carbon nanotubes from several different sources, which differ in the manufacturing process used and in average properties such as length, diameter, and chirality, are studied. We used several common surfactants to disperse each of these nanotubes and then deposited them on Si wafers from their aqueous solutions using dielectrophoresis. Transport measurements were performed to compare and determine the effect of different surfactants, deposition processes, and synthesis processes on nanotubes synthesized using CVD, CoMoCAT, laser ablation, and HiPCO.

  15. DEVELOPMENT OF CONTINUOUS SOLVENT EXTRACTION PROCESSES FOR COAL DERIVED CARBON PRODUCTS

    Energy Technology Data Exchange (ETDEWEB)

    Elliot B. Kennel; Philip L. Biedler; Chong Chen; Dady Dadyburjor; Liviu Magean; Peter G. Stansberry; Alfred H. Stiller; John W. Zondlo

    2005-04-13

    The purpose of this DOE-funded effort is to develop continuous processes for solvent extraction of coal for the production of carbon products. These carbon products include materials used in metals smelting, especially in the aluminum and steel industries, as well as porous carbon structural material referred to as “carbon foam” and carbon fibers. A process has been developed which results in high quality binder pitch suitable for use in graphite electrodes or carbon anodes. A detailed description of the protocol is given by Clendenin. Briefly, aromatic heavy oils are hydro-treated under mild conditions in order to increase their ability to dissolve coal. An example of an aromatic heavy oil is Koppers Carbon Black Base (CBB) oil. CBB oil has been found to be an effective solvent and acceptably low cost (i.e., significantly below the market price for binder pitch, or about $280 per ton at the time of this writing). It is also possible to use solvents derived from hydrotreated coal and avoid reliance on coke oven recovery products completely if so desired.

  16. A multilevel reuse system with source separation process for printing and dyeing wastewater treatment: A case study.

    Science.gov (United States)

    Wang, Rui; Jin, Xin; Wang, Ziyuan; Gu, Wantao; Wei, Zhechao; Huang, Yuanjie; Qiu, Zhuang; Jin, Pengkang

    2018-01-01

    This paper proposes a new system of multilevel reuse with source separation in printing and dyeing wastewater (PDWW) treatment in order to dramatically improve the water reuse rate to 35%. By analysing the characteristics of the sources and concentrations of pollutants produced in different printing and dyeing processes, special, highly, and less contaminated wastewaters (SCW, HCW, and LCW, respectively) were collected and treated separately. Specifically, a large quantity of LCW was sequentially reused at multiple levels to meet the water quality requirements for different production processes. Based on this concept, a multilevel reuse system with a source separation process was established in a typical printing and dyeing enterprise. The water reuse rate increased dramatically to 62%, and the reclaimed water was reused in different printing and dyeing processes based on the water quality. This study provides promising leads in water management for wastewater reclamation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Python Open source Waveform ExtractoR (POWER): an open source, Python package to monitor and post-process numerical relativity simulations

    Science.gov (United States)

    Johnson, Daniel; Huerta, E. A.; Haas, Roland

    2018-01-01

    Numerical simulations of Einstein’s field equations provide unique insights into the physics of compact objects moving at relativistic speeds, and which are driven by strong gravitational interactions. Numerical relativity has played a key role to firmly establish gravitational wave astrophysics as a new field of research, and it is now paving the way to establish whether gravitational wave radiation emitted from compact binary mergers is accompanied by electromagnetic and astro-particle counterparts. As numerical relativity continues to blend in with routine gravitational wave data analyses to validate the discovery of gravitational wave events, it is essential to develop open source tools to streamline these studies. Motivated by our own experience as users and developers of the open source, community software, the Einstein Toolkit, we present an open source, Python package that is ideally suited to monitor and post-process the data products of numerical relativity simulations, and compute the gravitational wave strain at future null infinity in high performance environments. We showcase the application of this new package to post-process a large numerical relativity catalog and extract higher-order waveform modes from numerical relativity simulations of eccentric binary black hole mergers and neutron star mergers. This new software fills a critical void in the arsenal of tools provided by the Einstein Toolkit consortium to the numerical relativity community.
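
    The strain computation itself is commonly done by fixed-frequency integration of the Weyl scalar psi4; the sketch below shows that standard recipe for illustration only and is not the POWER package's API.

```python
# Hedged sketch of fixed-frequency integration: in the frequency domain,
# h(f) = -psi4(f) / (2*pi*max(|f|, f0))**2, with f0 a low-frequency cutoff
# that suppresses the spurious drift from dividing by small frequencies.
import numpy as np

def psi4_to_strain(psi4, dt, f0):
    """psi4: complex time series; dt: sampling step; f0 > 0: cutoff frequency."""
    n = len(psi4)
    freqs = np.fft.fftfreq(n, d=dt)
    psi4_f = np.fft.fft(psi4)
    omega = 2.0 * np.pi * np.maximum(np.abs(freqs), f0)
    h_f = -psi4_f / omega ** 2
    return np.fft.ifft(h_f)   # complex strain h = h_plus - i*h_cross
```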

  18. Meat and fermented meat products as a source of bioactive peptides.

    Science.gov (United States)

    Stadnik, Joanna; Kęska, Paulina

    2015-01-01

    Bioactive peptides are short amino acid sequences that, upon release from the parent protein, may play different physiological roles, including antioxidant, antihypertensive, antimicrobial, and other bioactivities. They have been identified from a range of foods, including those of animal origin, e.g., milk and muscle sources (with pork, beef, or chicken and various species of fish and marine organisms). Bioactive peptides are encrypted within the sequence of the parent protein molecule and remain latent until released and activated by enzymatic proteolysis, e.g. during gastrointestinal digestion or food processing. Bioactive peptides derived from food sources have the potential for incorporation into functional foods and nutraceuticals. The aim of this paper is to present an overview of the muscle-derived bioactive peptides, especially those of fermented meats, and the potential benefits of these bioactive compounds to human health.

  19. Sources and diagenet..

    African Journals Online (AJOL)

    The inter-site differences in δ13C and δ15N could be related to the sources of OM. In Msasani Bay the material is derived from seagrasses, while in the Dar es Salaam harbour and Msimbazi micro-bay a large proportion is derived from the continent. Other parameters of OC, nitrogen and C/N ratios for these three sites show ...

  20. Study of a new source for positive and negative ions. Final report

    International Nuclear Information System (INIS)

    Freedman, A.; Davidovits, P.

    1985-05-01

    This study has focused on the feasibility of a novel ion source based on the technique of photodissociation, which could provide both positive and negative ions at considerably higher intensities (potentially 10^15 cm^-3) than are currently available. Ions are produced by irradiating a sample of a gaseous thallium halide salt with an argon fluoride excimer laser operating at 193 nm. At this wavelength, both thallium bromide and iodide will produce atomic ion pairs in a single photon process and molecular positive ions and an electron in a two-photon induced process. The potential traits of such an excimer-laser pumped thallium salt ion source include the following: high intensity and pulse rate, good spatial and temporal resolution, low temperature, good focusing properties, and production of heavy ions. This report describes a Phase I effort investigating the efficacy of this approach. A review of the relevant photophysics pertaining to laser excitation of thallium halide salts is presented, followed by a description of both experimental and theoretical efforts involving thallium bromide in particular. The last section will summarize the basic conclusions derived from these studies, as well as discuss potential advantages of an ion source derived from photolyzing thallium halide salts.

  1. The Chandra Source Catalog: Spectral Properties

    Science.gov (United States)

    Doe, Stephen; Siemiginowska, Aneta L.; Refsdal, Brian L.; Evans, Ian N.; Anderson, Craig S.; Bonaventura, Nina R.; Chen, Judy C.; Davis, John E.; Evans, Janet D.; Fabbiano, Giuseppina; Galle, Elizabeth C.; Gibbs, Danny G., II; Glotfelty, Kenny J.; Grier, John D.; Hain, Roger; Hall, Diane M.; Harbo, Peter N.; He, Xiang Qun (Helen); Houck, John C.; Karovska, Margarita; Kashyap, Vinay L.; Lauer, Jennifer; McCollough, Michael L.; McDowell, Jonathan C.; Miller, Joseph B.; Mitschang, Arik W.; Morgan, Douglas L.; Mossman, Amy E.; Nichols, Joy S.; Nowak, Michael A.; Plummer, David A.; Primini, Francis A.; Rots, Arnold H.; Sundheim, Beth A.; Tibbetts, Michael S.; van Stone, David W.; Winkelman, Sherry L.; Zografou, Panagoula

    2009-09-01

    The first release of the Chandra Source Catalog (CSC) contains all sources identified from eight years' worth of publicly accessible observations. The vast majority of these sources have been observed with the ACIS detector and have spectral information in 0.5-7 keV energy range. Here we describe the methods used to automatically derive spectral properties for each source detected by the standard processing pipeline and included in the final CSC. Hardness ratios were calculated for each source between pairs of energy bands (soft, medium and hard) using the Bayesian algorithm (BEHR, Park et al. 2006). The sources with high signal to noise ratio (exceeding 150 net counts) were fit in Sherpa (the modeling and fitting application from the Chandra Interactive Analysis of Observations package, developed by the Chandra X-ray Center; see Freeman et al. 2001). Two models were fit to each source: an absorbed power law and a blackbody emission. The fitted parameter values for the power-law and blackbody models were included in the catalog with the calculated flux for each model. The CSC also provides the source energy flux computed from the normalizations of predefined power-law and black-body models needed to match the observed net X-ray counts. In addition, we provide access to data products for each source: a file with source spectrum, the background spectrum, and the spectral response of the detector. This work is supported by NASA contract NAS8-03060 (CXC).
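
    For orientation, the hardness ratio between two energy bands is simply the normalized difference of their net counts. The sketch below shows that point estimate only, whereas the catalog itself derives hardness ratios with the Bayesian BEHR algorithm; the counts used here are hypothetical.

```python
# Simple point-estimate hardness ratios between pairs of energy bands.
def hardness_ratio(counts_hi, counts_lo):
    """HR = (H - S) / (H + S) from net counts in two energy bands."""
    total = counts_hi + counts_lo
    return (counts_hi - counts_lo) / total if total > 0 else 0.0

bands = {"soft": 120.0, "medium": 310.0, "hard": 95.0}   # hypothetical net counts
print(hardness_ratio(bands["hard"], bands["soft"]))      # hard-vs-soft HR
print(hardness_ratio(bands["medium"], bands["soft"]))    # medium-vs-soft HR
```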

  2. Cathode R&D for Future Light Sources

    Energy Technology Data Exchange (ETDEWEB)

    Dowell, D.H.; /SLAC; Bazarov, I.; Dunham, B.; /Cornell U., CLASSE; Harkay, K.; /Argonne; Hernandez-Garcia; /Jefferson Lab; Legg, R.; /Wisconsin U., SRC; Padmore, H.; /LBL, Berkeley; Rao, T.; Smedley, J.; /Brookhaven; Wan, W.; /LBL, Berkeley

    2010-05-26

    This paper reviews the requirements and current status of cathodes for accelerator applications, and proposes a research and development plan for advancing cathode technology. Accelerator cathodes need to have long operational lifetimes and produce electron beams with a very low emittance. The two principal emission processes to be considered are thermionic emission and photoemission, with the photocathodes being further subdivided into metals and semiconductors. Field emission cathodes are not included in this analysis. The thermal emittance is derived and the resulting formulas are used to compare the various cathode materials. To date, there is no cathode which provides all the requirements needed for the proposed future light sources. Therefore a three-part research plan is described to develop cathodes for these future light source applications.
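
    For reference, a commonly quoted expression for the intrinsic (thermal) emittance of a photocathode is eps_n = sigma_x * sqrt((h*nu - phi_eff) / (3*m*c^2)). The sketch below evaluates it for assumed values of laser spot size, photon energy, and effective work function; it is a generic illustration, not the derivation given in the paper.

    ```python
    import math

    # Hedged sketch: photocathode thermal (intrinsic) emittance from the
    # excess-energy formula eps_n = sigma_x * sqrt((h*nu - phi_eff) / (3 m c^2)).
    # All input values below are assumptions chosen for illustration.

    MC2_EV = 0.511e6  # electron rest energy in eV

    def thermal_emittance(sigma_x_mm, photon_energy_ev, eff_work_function_ev):
        """Normalized emittance in mm*mrad for an rms laser spot size in mm."""
        excess = photon_energy_ev - eff_work_function_ev
        if excess <= 0:
            return 0.0
        return sigma_x_mm * math.sqrt(excess / (3.0 * MC2_EV)) * 1e3  # mm*mrad

    # Example: roughly copper-like numbers, 0.5 mm rms spot, 4.9 eV photons,
    # 4.6 eV effective work function -> about 0.22 mm*mrad.
    print(thermal_emittance(sigma_x_mm=0.5, photon_energy_ev=4.9,
                            eff_work_function_ev=4.6))
    ```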

  3. EEG source reconstruction reveals frontal-parietal dynamics of spatial conflict processing

    OpenAIRE

    Cohen, M.X.; Ridderinkhof, K.R.

    2013-01-01

    Cognitive control requires the suppression of distracting information in order to focus on task-relevant information. We applied EEG source reconstruction via time-frequency linearly constrained minimum variance beamforming to help elucidate the neural mechanisms involved in spatial conflict processing. Human subjects performed a Simon task, in which conflict was induced by incongruence between spatial location and response hand. We found an early (~200 ms post-stimulus) conflict modulation in ...
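
    As background on the beamforming step mentioned in this abstract, a linearly constrained minimum variance (LCMV) spatial filter for one source location can be written as W = (L^T C^-1 L)^-1 L^T C^-1, where L is the leadfield and C the sensor covariance. The sketch below computes such weights for synthetic data; the matrices and dimensions are purely illustrative and do not reproduce the study's pipeline.

    ```python
    import numpy as np

    # Hedged sketch of an LCMV beamformer spatial filter, the generic technique
    # named in the abstract. The leadfield and data below are synthetic.
    rng = np.random.default_rng(0)

    n_sensors, n_samples = 32, 2000
    data = rng.standard_normal((n_sensors, n_samples))   # fake EEG (sensors x time)
    leadfield = rng.standard_normal((n_sensors, 3))      # fake leadfield for one voxel

    # Regularized sensor covariance and its inverse.
    cov = data @ data.T / n_samples + 1e-6 * np.eye(n_sensors)
    cov_inv = np.linalg.inv(cov)

    # LCMV weights: W = (L^T C^-1 L)^-1 L^T C^-1   (shape 3 x n_sensors)
    weights = np.linalg.solve(leadfield.T @ cov_inv @ leadfield,
                              leadfield.T @ cov_inv)

    source_timecourse = weights @ data                   # 3 x n_samples
    print(source_timecourse.shape)
    ```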

  4. Alpha-root Processes for Derivatives pricing

    OpenAIRE

    Balakrishna, BS

    2010-01-01

    A class of mean reverting positive stochastic processes driven by alpha-stable distributions, referred to here as alpha-root processes in analogy to the square root process (Cox-Ingersoll-Ross process), is a subclass of affine processes, in particular continuous state branching processes with immigration (CBI processes). Being affine, they provide semi-analytical results for the implied term structures as well as for the characteristic exponents for their associated distributions. Their use h...
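
    For orientation, the ordinary square-root (Cox-Ingersoll-Ross) process that these alpha-root processes generalize is dX = k(theta - X) dt + sigma*sqrt(X) dW. The sketch below simulates it with a simple full-truncation Euler scheme; the parameter values are arbitrary and the alpha-stable generalization itself is not implemented.

    ```python
    import numpy as np

    # Hedged sketch: full-truncation Euler simulation of the square-root (CIR)
    # process dX = k*(theta - X) dt + sigma*sqrt(X) dW, the Gaussian special
    # case that alpha-root processes generalize. Parameters are illustrative.
    rng = np.random.default_rng(1)

    k, theta, sigma = 1.5, 0.04, 0.2
    x0, T, n_steps = 0.03, 1.0, 252
    dt = T / n_steps

    x = np.empty(n_steps + 1)
    x[0] = x0
    for i in range(n_steps):
        xi = max(x[i], 0.0)          # full truncation keeps the diffusion term real
        dw = rng.normal(0.0, np.sqrt(dt))
        x[i + 1] = x[i] + k * (theta - xi) * dt + sigma * np.sqrt(xi) * dw

    print("terminal value:", x[-1])
    ```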

  5. Quantification of source-term profiles from near-field geochemical models

    International Nuclear Information System (INIS)

    McKinley, I.G.

    1985-01-01

    A geochemical model of the near-field is described which quantitatively treats the processes of engineered barrier degradation, buffering of aqueous chemistry by solid phases, nuclide solubilization and transport through the near-field and release to the far-field. The radionuclide source-terms derived from this model are compared with those from a simpler model used for repository safety analysis. 10 refs., 2 figs., 2 tabs

  6. A Gaussian process and derivative spectral-based algorithm for red blood cell segmentation

    Science.gov (United States)

    Xue, Yingying; Wang, Jianbiao; Zhou, Mei; Hou, Xiyue; Li, Qingli; Liu, Hongying; Wang, Yiting

    2017-07-01

    As an imaging technology used in remote sensing, hyperspectral imaging can provide more information than traditional optical imaging of blood cells. In this paper, an AOTF-based microscopic hyperspectral imaging system is used to capture hyperspectral images of blood cells. In order to achieve the segmentation of red blood cells, a Gaussian process using a squared exponential kernel function is first applied after the data preprocessing to make a preliminary segmentation. The derivative spectrum with a spectral angle mapping algorithm is then applied to the original image to segment the boundaries of cells, and the boundaries are used to cut the cells obtained from the Gaussian process so as to separate adjacent cells. Morphological processing, including closing, erosion and dilation, is then applied to keep adjacent cells apart, and by applying median filtering to remove noise points and filling holes inside the cells, the final segmentation result is obtained. The experimental results show that this method achieves a good segmentation effect on human red blood cells.
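
    Two ingredients of the pipeline above can be stated concretely: the squared exponential kernel used by the Gaussian process, k(x, x') = exp(-||x - x'||^2 / (2 l^2)), and the spectral angle between two spectra, theta = arccos(x.y / (||x|| ||y||)). The sketch below implements both for hypothetical spectra; it is not the authors' full segmentation code.

    ```python
    import numpy as np

    # Hedged sketch of two building blocks mentioned in the abstract:
    # the squared exponential GP kernel and the spectral angle mapper (SAM).
    # The spectra below are synthetic.

    def squared_exponential_kernel(x1, x2, length_scale=1.0):
        """k(x1, x2) = exp(-||x1 - x2||^2 / (2 l^2)) for two 1-D spectra."""
        d2 = np.sum((np.asarray(x1, float) - np.asarray(x2, float)) ** 2)
        return np.exp(-d2 / (2.0 * length_scale ** 2))

    def spectral_angle(x, y):
        """Angle in radians between two spectra; smaller means more similar."""
        x, y = np.asarray(x, float), np.asarray(y, float)
        cos = np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))
        return np.arccos(np.clip(cos, -1.0, 1.0))

    cell_pixel = [0.31, 0.42, 0.55, 0.47, 0.30]   # hypothetical pixel spectrum
    reference  = [0.30, 0.40, 0.57, 0.45, 0.28]   # hypothetical cell reference

    print("kernel similarity:", squared_exponential_kernel(cell_pixel, reference, 0.5))
    print("spectral angle   :", spectral_angle(cell_pixel, reference))
    ```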

  7. Rf power sources

    International Nuclear Information System (INIS)

    Allen, M.A.

    1988-05-01

    This paper covers RF power sources for accelerator applications. The approach has been with particular customers in mind. These customers are high energy physicists who use accelerators as experimental tools in the study of the nucleus of the atom, and synchrotron light sources derived from electron or positron storage rings. This paper is confined to electron-positron linear accelerators since the RF sources have always defined what is possible to achieve with these accelerators. 11 refs., 13 figs

  8. Radiation processing of polysaccharide derivatives

    International Nuclear Information System (INIS)

    Yoshii, Fumio

    2004-01-01

    Carboxymethylcellulose (CMC), carboxymethylstarch (CMS), carboxymethylchitin (CM-chitin) and carboxymethylchitosan (CM-chitosan) form gels when irradiated in a paste-like condition. A bedsore-prevention mat filled with CMC hydrogel crosslinked by irradiation in the paste-like condition has been practically applied as a health-care product. It was found that CM-chitosan hydrogels have anti-microbial activity and are effective as absorbents for removing metal ions. When crosslinked gel sheets of CM-chitin and CM-chitosan were immersed in copper(II) aqueous solution, the absorption of Cu(II) was 161 mg/g and 172 mg/g, respectively. Radiation crosslinking of cellulose derivatives such as hydroxypropyl methylcellulose phthalate (HPMCP), kneaded with aqueous alkali solution and methanol, was achieved with EB irradiation in the paste-like condition. The HPMCP gel absorbed organic solvents such as chloroform and pyridine. (author)

  9. Nitrogen deposition in precipitation to a monsoon-affected eutrophic embayment: Fluxes, sources, and processes

    Science.gov (United States)

    Wu, Yunchao; Zhang, Jingping; Liu, Songlin; Jiang, Zhijian; Arbi, Iman; Huang, Xiaoping; Macreadie, Peter Ian

    2018-06-01

    Daya Bay in the South China Sea (SCS) has experienced rapid nitrogen pollution and intensified eutrophication in the past decade due to economic development. Here, we estimated the deposition fluxes of nitrogenous species, clarified the contribution of nitrogen from precipitation, and measured ions and the isotopic composition (δ15N and δ18O) of nitrate in precipitation over a one-year period to trace its sources and formation processes among different seasons. We found that the deposition fluxes of total dissolved nitrogen (TDN), NO3-, NH4+, NO2-, and dissolved organic nitrogen (DON) to Daya Bay were 132.5, 64.4, 17.5, 1.0, and 49.6 mmol m-2 yr-1, respectively. DON was a significant contributor to nitrogen deposition (37% of TDN), and NO3- accounted for 78% of the DIN in precipitation. The nitrogen deposition fluxes were higher in spring and summer, and lower in winter. Nitrogen from precipitation contributed nearly 38% of the total input of nitrogen (point-source input and dry and wet deposition) to Daya Bay. The δ15N-NO3- abundance, ion compositions, and air mass backward trajectories indicated that coal combustion, vehicle exhaust, and dust from mainland China delivered by the northeast monsoon were the main sources in winter, while fossil fuel combustion (coal combustion and vehicle exhaust) and dust from the PRD and southeast Asia transported by the southwest monsoon were the main sources in spring; marine sources, vehicle exhaust and lightning could be the potential sources in summer. δ18O results showed that the OH pathway was dominant in the chemical formation of nitrate in summer, while the N2O5 and DMS/HC pathways dominated in winter and spring.

  10. Characterization of emission factors related to source activity for trichloroethylene degreasing and chrome plating processes.

    Science.gov (United States)

    Wadden, R A; Hawkins, J L; Scheff, P A; Franke, J E

    1991-09-01

    A study at an automotive parts fabrication plant evaluated four metal surface treatment processes during production conditions. The evaluation provides examples of how to estimate process emission factors from activity and air concentration data. The processes were open-tank and enclosed-tank degreasing with trichloroethylene (TCE), chromium conversion coating, and chromium electroplating. Area concentrations of TCE and chromium (Cr) were monitored for 1-hr periods at three distances from each process. Source activities at each process were recorded during each sampling interval. Emission rates were determined by applying appropriate mass balance models to the concentration patterns around each source. The emission factors obtained from regression analysis of the emission rate and activity data were 16.9 g TCE/basket of parts for the open-top degreaser; 1.0 g TCE/1000 parts for the enclosed degreaser; 1.48-1.64 mg Cr/1000 parts processed in the hot CrO3/HNO3 tank for the chrome conversion coating; and 5.35-9.17 mg Cr/rack of parts for chrome electroplating. The factors were also used to determine the collection efficiency of the local exhaust systems serving each process. Although the number of observations was limited, these factors may be useful for providing initial estimates of emissions from similar processes in other settings.
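
    The regression step described above can be illustrated simply: if each hourly emission rate E_i is modeled as E_i = f * A_i, with A_i the source activity (for example, baskets of parts per hour), then the emission factor f is the slope of a through-origin least-squares fit. The sketch below does this for hypothetical data; the numbers are not taken from the study.

    ```python
    import numpy as np

    # Hedged sketch: estimate a process emission factor as the slope of a
    # through-origin regression of hourly emission rate on source activity.
    # The data below are hypothetical, not the study's measurements.

    activity = np.array([3.0, 5.0, 2.0, 6.0, 4.0, 7.0])              # baskets of parts per hour
    emission_rate = np.array([52.0, 83.0, 35.0, 99.0, 70.0, 118.0])  # g TCE per hour

    # Least-squares slope for the model  emission_rate = factor * activity
    factor = np.sum(activity * emission_rate) / np.sum(activity ** 2)
    print(f"emission factor ~ {factor:.1f} g TCE per basket of parts")
    ```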

  11. The Effects of Different External Carbon Sources on Nitrous Oxide Emissions during Denitrification in Biological Nutrient Removal Processes

    Science.gov (United States)

    Hu, Xiang; Zhang, Jing; Hou, Hongxun

    2018-01-01

    The aim of this study was to investigate the effects of two different external carbon sources (acetate and ethanol) on nitrous oxide (N2O) emissions during denitrification in biological nutrient removal processes. Results showed that the external carbon source significantly influenced N2O emissions during the denitrification process. When acetate served as the external carbon source, 0.49 mg N/L and 0.85 mg N/L of N2O were produced during the denitrification processes in the anoxic and anaerobic/anoxic experiments, giving ratios of N2O-N production to TN removal of 2.37% and 4.96%, respectively. Compared with acetate, the amount of N2O produced was negligible when ethanol was used as the external carbon addition. This suggests that ethanol is a potential alternative to acetate as an external carbon source from the point of view of N2O emissions.

  12. External Carbon Source Addition as a Means to Control an Activated Sludge Nutrient Removal Process

    DEFF Research Database (Denmark)

    Isaacs, Steven Howard; Henze, Mogens; Søeberg, Henrik

    1994-01-01

    In alternating type activated sludge nutrient removal processes, the denitrification rate can be limited by the availability of readily-degradable carbon substrate. A control strategy is proposed by which an easily metabolizable COD source is added directly to that point in the process at which d...

  13. How reservoirs alter drinking water quality: Organic matter sources, sinks, and transformations

    Science.gov (United States)

    Kraus, Tamara E.C.; Bergamaschi, Brian A.; Hernes, Peter J.; Doctor, Daniel H.; Kendall, Carol; Downing, Bryan D.; Losee, Richard F.

    2011-01-01

    Within reservoirs, production, transformation, and loss of dissolved organic matter (DOM) occur simultaneously. While the balance between production and loss determines whether a reservoir is a net sink or source of DOM, changes in chemical composition are also important because they affect DOM reactivity with respect to disinfection by-product (DBP) formation. The composition of the DOM pool also provides insight into DOM sources and processing, which can inform reservoir management. We examined the concentration and composition of DOM in San Luis Reservoir, a large off-stream impoundment of the California State Water Project. We used a wide array of DOM chemical tracers including dissolved organic carbon (DOC) concentration, trihalomethane and haloacetic acid formation potentials (THMFP and HAAFP, respectively), absorbance properties, isotopic composition, lignin phenol content, and structural groupings determined by 13C nuclear magnetic resonance (NMR). There were periods when the reservoir was a net source of DOC due to the predominance of algal production (summer), a net sink due to the predominance of degradation (fall–winter), and balanced between production and consumption (spring). Despite only moderate variation in bulk DOC concentration (3.0–3.6 mg C/L), changes in DOM composition indicated that terrestrial-derived material entering the reservoir was being degraded and replaced by aquatic-derived DOM produced within the reservoir. Substantial changes in the propensity of the DOM pool to form THMs and HAAs illustrate that the DBP precursor pool was not directly coupled to bulk DOC concentration and indicate that algal production is an important source of DBP precursors. Results suggest reservoirs have the potential to attenuate DOM amount and reactivity with respect to DBP precursors via degradative processes; however, these benefits can be decreased or even negated by the production of algal-derived DOM.

  14. Parkin Mutations Reduce the Complexity of Neuronal Processes in iPSC-derived Human Neurons

    Science.gov (United States)

    Ren, Yong; Jiang, Houbo; Hu, Zhixing; Fan, Kevin; Wang, Jun; Janoschka, Stephen; Wang, Xiaomin; Ge, Shaoyu; Feng, Jian

    2015-01-01

    Parkinson’s disease (PD) is characterized by the degeneration of nigral dopaminergic (DA) neurons and non-DA neurons in many parts of the brain. Mutations of parkin, an E3 ubiquitin ligase that strongly binds to microtubules, are the most frequent cause of recessively inherited Parkinson’s disease. The lack of robust PD phenotype in parkin knockout mice suggests a unique vulnerability of human neurons to parkin mutations. Here, we show that the complexity of neuronal processes as measured by total neurite length, number of terminals, number of branch points and Sholl analysis, was greatly reduced in induced pluripotent stem cell (iPSC)-derived TH+ or TH− neurons from PD patients with parkin mutations. Consistent with these, microtubule stability was significantly decreased by parkin mutations in iPSC-derived neurons. Overexpression of parkin, but not its PD-linked mutant nor GFP, restored the complexity of neuronal processes and the stability of microtubules. Consistent with these, the microtubule-depolymerizing agent colchicine mimicked the effect of parkin mutations by decreasing neurite length and complexity in control neurons while the microtubule-stabilizing drug taxol mimicked the effect of parkin overexpression by enhancing the morphology of parkin-deficient neurons. The results suggest that parkin maintains the morphological complexity of human neurons by stabilizing microtubules. PMID:25332110

  15. Economic analysis of novel synergistic biofuel (H2Bioil) processes

    International Nuclear Information System (INIS)

    Singh, Navneet R.; Mallapragada, Dharik S.; Agrawal, Rakesh; Tyner, Wallace E.

    2012-01-01

    Fast-pyrolysis based processes can be built on small-scale and have higher process carbon and energy efficiency as compared to other options. H2Bioil is a novel process based on biomass fast-hydropyrolysis and subsequent hydrodeoxygenation (HDO) and can potentially provide high yields of high energy density liquid fuel at relatively low hydrogen consumption. This paper contains a comprehensive financial analysis of the H2Bioil process with hydrogen derived from different sources. Three different carbon tax scenarios are analyzed: no carbon tax, $55/metric ton carbon tax and $110/metric ton carbon tax. The break-even crude oil price for a delivered biomass cost of $94/metric ton when hydrogen is derived from coal, natural gas or nuclear energy ranges from $103 to $116/bbl for no carbon tax and even lower ($99-$111/bbl) for the carbon tax scenarios. This break-even crude oil price compares favorably with the literature estimated prices of fuels from alternate biochemical and thermochemical routes. The impact of the chosen carbon tax is found to be limited relative to the impact of the H2 source on the H2Bioil break-even price. The economic robustness of the processes for hydrogen derived from coal, natural gas, or nuclear energy is seen by an estimated break-even crude oil price of $114-$126/bbl when biomass cost is increased to $121/metric ton. (orig.)

  16. Merged/integrated Bathymetric Data Derived from Multibeam Sonar, LiDAR, and Satellite-derived Bathymetry

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Gridded multibeam bathymetry is integrated with derived bathymetry from alternate sources to provide a GIS layer with expanded spatial coverage. Integrated products...

  17. Adopted levels and derived limits for Ra-226 and the decision making processes concerning TENORM releases

    International Nuclear Information System (INIS)

    Paschoa, A.S.

    2002-01-01

    A fraction of a primary dose limit can, in general, be agreed upon as a dose-related level to be adopted in decision-making processes. In the case of TENORM releases, fractions of primary dose levels for 226 Ra, 228 Ra, and 210 Po may be of particular importance for establishing adopted levels and derived limits to guide decision-making processes. Thus, for example, a registration level for 226 Ra could be adopted at the upper end of the natural background variation. Above such a level, intervention and remedial action levels could also be adopted. All those levels would be fractions of the primary level, but translated into derived limits expressed in practical units. Derived limits would then be calculated by using environmental models. In such an approach, 'critical groups' would have to be carefully defined and identified. In addition, the size of a critical group would be chosen for use in the environmental modeling. Site-specific environmental models and parameters are desirable, though unavailable, or very difficult to obtain, in most cases. Thus, mathematical models and parameters of a more generic nature are often used. A parametric sensitivity analysis can rank the parameters used in a model, showing how important each parameter is for the model output. The paper points out that, when using adopted levels and derived limits as suggested above, the uncertainties and importance of the parameters entering an environmental model can make the difference between decision makers taking the right or the wrong decision, as far as radiological protection is concerned. (author)
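
    As a schematic illustration of how a derived limit follows from an adopted dose fraction and an environmental (here, ingestion) model, the sketch below converts an assumed annual dose level into a drinking-water activity concentration using a hypothetical dose coefficient and consumption rate; every number is a placeholder, not a recommended limit.

    ```python
    # Hedged sketch: turning an adopted annual dose level into a derived
    # activity-concentration limit with a deliberately simple ingestion model.
    # Every value here is an illustrative assumption, not a recommended limit.

    adopted_dose_fraction = 0.1            # fraction of the primary dose limit for this pathway
    primary_dose_limit_msv = 1.0           # mSv per year (illustrative public dose limit)
    dose_coefficient_msv_per_bq = 2.8e-4   # assumed ingestion dose coefficient for 226Ra, mSv/Bq
    water_intake_l_per_year = 730.0        # assumed drinking-water consumption, L/year

    adopted_level_msv = adopted_dose_fraction * primary_dose_limit_msv
    derived_limit_bq_per_l = adopted_level_msv / (
        dose_coefficient_msv_per_bq * water_intake_l_per_year)

    print(f"derived limit ~ {derived_limit_bq_per_l:.2f} Bq/L "
          f"for an adopted level of {adopted_level_msv} mSv/y")
    ```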

  18. Using ISO/IEC 12207 to analyze open source software development processes: an E-learning case study

    OpenAIRE

    Krishnamurthy, Aarthy; O'Connor, Rory

    2013-01-01

    To date, there is no comprehensive study of the open source software development process (OSSDP) carried out for open source (OS) e-learning systems. This paper presents work which objectively analyzes the open source software development (OSSD) practices carried out by e-learning systems development communities; the results are represented using DEMO models. These results are compared using ISO/IEC 12207:2008. The comparison of DEMO models with ISO/IEC...

  19. Sources of CO{sub 2} efflux from soil and review of partitioning methods

    Energy Technology Data Exchange (ETDEWEB)

    Kuzyakov, Y. [University of Hohenheim, Stuttgart (Germany). Institute of Soil Science and Land Evaluation

    2006-03-15

    Five main biogenic sources of CO{sub 2} efflux from soils have been distinguished and described according to their turnover rates and the mean residence time of carbon. They are root respiration, rhizomicrobial respiration, decomposition of plant residues, the priming effect induced by root exudation or by addition of plant residues, and basal respiration by microbial decomposition of soil organic matter (SOM). These sources can be grouped in several combinations to summarize CO{sub 2} efflux from the soil including: root-derived CO{sub 2}, plant-derived CO{sub 2}, SOM-derived CO{sub 2}, rhizosphere respiration, heterotrophic microbial respiration (respiration by heterotrophs), and respiration by autotrophs. These distinctions are important because without separation of SOM-derived CO{sub 2} from plant-derived CO{sub 2}, measurements of total soil respiration have very limited value for evaluation of the soil as a source or sink of atmospheric CO{sub 2} and for interpreting the sources of CO{sub 2} and the fate of carbon within soils and ecosystems. Additionally, the processes linked to the five sources of CO{sub 2} efflux from soil have various responses to environmental variables and consequently to global warming. This review describes the basic principles and assumptions of the following methods which allow SOM-derived and root-derived CO{sub 2} efflux to be separated under laboratory and field conditions: root exclusion techniques, shading and clipping, tree girdling, regression, component integration, excised roots and in situ root respiration; continuous and pulse labeling, {sup 13}C natural abundance and FACE, and radiocarbon dating and bomb-{sup 14}C. Short sections cover the separation of the respiration of autotrophs and that of heterotrophs, i.e. the separation of actual root respiration from microbial respiration, as well as methods allowing the amount of CO{sub 2} evolved by decomposition of plant residues and by priming effects to be estimated. All

  20. Transformational and derivational strategies in analogical problem solving.

    Science.gov (United States)

    Schelhorn, Sven-Eric; Griego, Jacqueline; Schmid, Ute

    2007-03-01

    Analogical problem solving is mostly described as transfer of a source solution to a target problem based on the structural correspondences (mapping) between source and target. Derivational analogy (Carbonell, Machine learning: an artificial intelligence approach Los Altos. Morgan Kaufmann, 1986) proposes an alternative view: a target problem is solved by replaying a remembered problem-solving episode. Thus, the experience with the source problem is used to guide the search for the target solution by applying the same solution technique rather than by transferring the complete solution. We report an empirical study using the path finding problems presented in Novick and Hmelo (J Exp Psychol Learn Mem Cogn 20:1296-1321, 1994) as material. We show that both transformational and derivational analogy are problem-solving strategies realized by human problem solvers. Which strategy is evoked in a given problem-solving context depends on the constraints guiding object-to-object mapping between source and target problem. Specifically, if constraints facilitating mapping are available, subjects are more likely to employ a transformational strategy, otherwise they are more likely to use a derivational strategy.

  1. Cavities as the sources of acid mine process in the Niwka-Modrzejow Coal Mine (Poland)

    International Nuclear Information System (INIS)

    Pluta, I.; Mazurkiewicz, M.

    2005-01-01

    The acid mine process is one of the most significant sources of surface water pollution. An intensive acid mine process was discovered in the Niwka-Modrzejow Coal Mine at the 100-130 m level. In this paper a method of prevention based on filling the cavities with wastes from energy plants is proposed. (authors)

  2. The source term and waste optimization of molten salt reactors with processing

    International Nuclear Information System (INIS)

    Gat, U.; Dodds, H.L.

    1993-01-01

    The source term of a molten salt reactor (MSR) with fuel processing is reduced by the ratio of processing time to refueling time as compared to solid fuel reactors. The reduction, which can be one to two orders of magnitude, is due to removal of the long-lived fission products. The waste from MSRs can be optimized with respect to its chemical composition, concentration, mixture, shape, and size. The actinides and long-lived isotopes can be separated out and returned to the reactor for transmutation. These features make MSRs more acceptable and simpler in operation and handling

  3. Classical-processing and quantum-processing signal separation methods for qubit uncoupling

    Science.gov (United States)

    Deville, Yannick; Deville, Alain

    2012-12-01

    The Blind Source Separation problem consists in estimating a set of unknown source signals from their measured combinations. It was only investigated in a non-quantum framework up to now. We propose its first quantum extensions. We thus introduce the Quantum Source Separation field, investigating both its blind and non-blind configurations. More precisely, we show how to retrieve individual quantum bits (qubits) only from the global state resulting from their undesired coupling. We consider cylindrical-symmetry Heisenberg coupling, which e.g. occurs when two electron spins interact through exchange. We first propose several qubit uncoupling methods which typically measure repeatedly the coupled quantum states resulting from individual qubits preparations, and which then statistically process the classical data provided by these measurements. Numerical tests prove the effectiveness of these methods. We then derive a combination of quantum gates for performing qubit uncoupling, thus avoiding repeated qubit preparations and irreversible measurements.

  4. High-etch-rate bottom-antireflective coating and gap-fill materials using dextrin derivatives in via first dual-Damascene lithography process

    Science.gov (United States)

    Takei, Satoshi; Sakaida, Yasushi; Shinjo, Tetsuya; Hashimoto, Keisuke; Nakajima, Yasuyuki

    2008-03-01

    The present paper describes a novel class of bottom antireflective coating (BARC) and gap-fill materials using dextrin derivatives. The general trend in interconnect fabrication for such high-performance LSI is to apply copper (Cu)/low-dielectric-constant (low-k) interconnects to reduce RC delay. A via-first dual-damascene process is one of the most promising processes for fabricating Cu/low-k interconnects due to its wide misalignment margin. Sacrificial materials containing dextrin derivatives, applied under the resist for lithography, were developed for the via-first dual-damascene process. The dextrin derivatives in this study were obtained by esterification of the hydroxyl groups of dextrin, resulting in improved solubility in resist solvents such as propylene glycol monomethyl ether, propylene glycol monomethyl ether acetate, and ethyl lactate, thereby avoiding the defects that would otherwise be caused by incompatibility. The etch rate of our developed BARC and gap-fill materials using dextrin derivatives was more than two times faster than that of the ArF resists evaluated under a CF4 gas condition using reactive ion etching. The improved etch performance was also verified by comparison with poly(hydroxystyrene), acrylate-type materials and the latest low-k materials as references. In addition to superior etch performance, these materials showed good resist profiles and via-filling performance without voids in via holes.

  5. TOXICOLOGICAL EVALUATION OF REALISTIC EMISSIONS OF SOURCE AEROSOLS (TERESA): APPLICATION TO POWER PLANT-DERIVED PM2.5

    Energy Technology Data Exchange (ETDEWEB)

    Annette C. Rohr; Petros Koutrakis; John Godleski

    2011-03-31

    Determining the health impacts of different sources and components of fine particulate matter (PM2.5) is an important scientific goal, because PM is a complex mixture of both inorganic and organic constituents that likely differ in their potential to cause adverse health outcomes. The TERESA (Toxicological Evaluation of Realistic Emissions of Source Aerosols) study focused on two PM sources - coal-fired power plants and mobile sources - and sought to investigate the toxicological effects of exposure to realistic emissions from these sources. The DOE-EPRI Cooperative Agreement covered the performance and analysis of field experiments at three power plants. The mobile source component consisted of experiments conducted at a traffic tunnel in Boston; these activities were funded through the Harvard-EPA Particulate Matter Research Center and will be reported separately in the peer-reviewed literature. TERESA attempted to delineate health effects of primary particles, secondary (aged) particles, and mixtures of these with common atmospheric constituents. The study involved withdrawal of emissions directly from power plant stacks, followed by aging and atmospheric transformation of emissions in a mobile laboratory in a manner that simulated downwind power plant plume processing. Secondary organic aerosol (SOA) derived from the biogenic volatile organic compound {alpha}-pinene was added in some experiments, and in others ammonia was added to neutralize strong acidity. Specifically, four scenarios were studied at each plant: primary particles (P); secondary (oxidized) particles (PO); oxidized particles + secondary organic aerosol (SOA) (POS); and oxidized and neutralized particles + SOA (PONS). Extensive exposure characterization was carried out, including gas-phase and particulate species. Male Sprague Dawley rats were exposed for 6 hours to filtered air or different atmospheric mixtures. Toxicological endpoints included (1) breathing pattern; (2) bronchoalveolar lavage

  6. Source Energy Spectrum of the 17 May 2012 GLE

    Science.gov (United States)

    Pérez-Peraza, Jorge; Márquez-Adame, Juan C.; Miroshnichenko, Leonty; Velasco-Herrera, Victor

    2018-05-01

    Among the several GLEs (ground level enhancements) that presumably occurred in the period 2012-2015, the 17 May 2012 event is the one most widely accepted to be a GLE, in view of the high number of high-latitude neutron monitor stations that registered it. In spite of its small amplitude, it was the most prominent of the predicted GLEs of the present decade (Pérez-Peraza & Juárez-Zuñiga, 2015, https://doi.org/10.1088/0004-637X/803/1/27). However, the lack of a latitude effect makes it difficult to study the characteristics of this event in the high-energy extreme of the spectrum. Nevertheless, several outstanding works have been able to derive observational spectra at the top of the Earth's atmosphere for this peculiar GLE. Some of these works find that the proton flux is characterized by two components. A great number of works have been published on the observational features obtained with different instrumentation, but the source phenomena, namely the generation processes and the physical parameters of the sources, have not been scrutinized. The main goal of this work is to examine such aspects by confronting the different approaches to the observational spectra with our analytical theoretical spectra based on stochastic acceleration and electric field acceleration from reconnection processes. In this way, we derive a set of parameters which characterize the sources of these two GLE components, leading us to propose possible scenarios for the generation of particles in this particular GLE event.

  7. Distinct transmissibility features of TSE sources derived from ruminant prion diseases by the oral route in a transgenic mouse model (TgOvPrP4) overexpressing the ovine prion protein.

    Directory of Open Access Journals (Sweden)

    Jean-Noël Arsac

    Transmissible spongiform encephalopathies (TSEs) are a group of fatal neurodegenerative diseases associated with a misfolded form of the host-encoded prion protein (PrP). Some of them, such as classical bovine spongiform encephalopathy (BSE) in cattle, transmissible mink encephalopathy (TME), kuru and variant Creutzfeldt-Jakob disease in humans, are acquired by oral-route exposure to infected tissues. We investigated the possible transmission by the oral route of a panel of strains derived from ruminant prion diseases in a transgenic mouse model (TgOvPrP4) overexpressing the ovine prion protein (A136R154Q171) under the control of the neuron-specific enolase promoter. Sources derived from Nor98, CH1641 or 87V scrapie, as well as sources derived from L-type BSE or cattle-passaged TME, failed to transmit by the oral route, whereas those derived from classical BSE and classical scrapie were successfully transmitted. Apart from a possible effect of the passage history of the TSE agent in the inocula, this implies the occurrence of subtle molecular changes in the protease-resistant prion protein (PrPres) following oral transmission, which raises concerns about our ability to correctly identify sheep that might be orally infected by the BSE agent in the field. Our results provide proof of principle that transgenic mouse models can be used to examine the transmissibility of TSE agents by the oral route, providing novel insights regarding the pathogenesis of prion diseases.

  8. Relaxation dynamics in the presence of pulse multiplicative noise sources with different correlation properties

    Science.gov (United States)

    Kargovsky, A. V.; Chichigina, O. A.; Anashkina, E. I.; Valenti, D.; Spagnolo, B.

    2015-10-01

    The relaxation dynamics of a system described by a Langevin equation with pulse multiplicative noise sources with different correlation properties is considered. The solution of the corresponding Fokker-Planck equation is derived for Gaussian white noise. Moreover, two pulse processes with regulated periodicity are considered as a noise source: the dead-time-distorted Poisson process and the process with fixed time intervals, which is characterized by an infinite correlation time. We find that the steady state of the system is dependent on the correlation properties of the pulse noise. An increase of the noise correlation causes the decrease of the mean value of the solution at the steady state. The analytical results are in good agreement with the numerical ones.
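
    As a concrete, much simplified illustration of relaxation under multiplicative noise, the sketch below integrates the Langevin equation dx = -gamma*x dt + sigma*x dW with Gaussian white noise using an Euler-Maruyama scheme; the pulse noise sources with controlled periodicity studied in the paper are not implemented here, and the parameters are arbitrary.

    ```python
    import numpy as np

    # Hedged sketch: Euler-Maruyama integration of a Langevin equation with
    # Gaussian white multiplicative noise, dx = -gamma*x dt + sigma*x dW.
    # The paper's pulse noise sources are not reproduced here.
    rng = np.random.default_rng(2)

    gamma, sigma = 1.0, 0.3
    dt, n_steps, n_paths = 1e-3, 20000, 500

    x = np.ones(n_paths)                       # all paths start at x = 1
    for _ in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt), n_paths)
        x += -gamma * x * dt + sigma * x * dw

    print("ensemble mean after relaxation:", x.mean())   # decays toward zero
    ```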

  9. Quasi-homogeneous partial coherent source modeling of multimode optical fiber output using the elementary source method

    Science.gov (United States)

    Fathy, Alaa; Sabry, Yasser M.; Khalil, Diaa A.

    2017-10-01

    Multimode fibers (MMF) have many applications in illumination, spectroscopy, sensing and even in optical communication systems. In this work, we present a model for the MMF output field assuming the fiber end to be a quasi-homogenous source. The fiber end is modeled by a group of partially coherent elementary sources, spatially shifted and uncorrelated with each other. The elementary source distribution is derived from the far-field intensity measurement, while the weighting function of the sources is derived from the fiber-end intensity measurement. The model is compared with practical measurements for fibers with different core/cladding diameters at different propagation distances and for different input excitations: laser, white light and LED. The obtained results show a normalized root mean square error of less than 8% in the intensity profile in most cases, even when the fiber end surface is not perfectly cleaved. Also, the comparison with the Gaussian-Schell model results shows a better agreement with the measurements. In addition, the complex degree of coherence derived from the model results is compared with the theoretical predictions of the modified van Cittert-Zernike equation, showing very good agreement, which strongly supports the assumption that a large-core MMF can be considered a quasi-homogenous source.

  10. Significance of Joint Features Derived from the Modified Group Delay Function in Speech Processing

    Directory of Open Access Journals (Sweden)

    Murthy Hema A

    2007-01-01

    This paper investigates the significance of combining cepstral features derived from the modified group delay function with those derived from the short-time spectral magnitude, such as the MFCC. The conventional group delay function fails to capture the resonant structure and the dynamic range of the speech spectrum, primarily due to pitch periodicity effects. The group delay function is modified to suppress these spikes and to restore the dynamic range of the speech spectrum. Cepstral features are derived from the modified group delay function and are called the modified group delay feature (MODGDF). The complementarity and robustness of the MODGDF when compared to the MFCC are also analyzed using spectral reconstruction techniques. Combination of several spectral magnitude-based features and the MODGDF using feature fusion and likelihood combination is described. These features are then used for three speech processing tasks, namely syllable, speaker, and language recognition. Results indicate that combining the MODGDF with the MFCC at the feature level gives significant improvements for speech recognition tasks in noise. Combining the MODGDF and the spectral magnitude-based features gives a significant increase in recognition performance of 11% at best, while combining any two features derived from the spectral magnitude does not give any significant improvement.
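
    For readers unfamiliar with the underlying quantity, the (unmodified) group delay of a frame x[n] can be computed without phase unwrapping as tau(w) = (X_R*Y_R + X_I*Y_I) / |X|^2, where Y is the DFT of n*x[n]. The sketch below implements this basic form; the cepstral smoothing and compression exponents that define the full MODGDF are omitted.

    ```python
    import numpy as np

    # Hedged sketch: the basic group delay function of a speech frame, computed as
    # tau(w) = (X_R*Y_R + X_I*Y_I) / |X|^2 with Y = DFT of n*x[n].
    # The modified group delay feature (MODGDF) additionally smooths the denominator
    # cepstrally and applies compression exponents, which are omitted here.

    def group_delay(frame, n_fft=512, eps=1e-8):
        n = np.arange(len(frame))
        X = np.fft.rfft(frame, n_fft)
        Y = np.fft.rfft(n * frame, n_fft)
        return (X.real * Y.real + X.imag * Y.imag) / (np.abs(X) ** 2 + eps)

    # Toy frame: a decaying oscillation plus a little noise.
    rng = np.random.default_rng(3)
    t = np.arange(400)
    frame = np.exp(-t / 200.0) * np.sin(2 * np.pi * 0.05 * t) \
            + 0.01 * rng.standard_normal(400)

    tau = group_delay(frame)
    print(tau.shape, tau[:5])
    ```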

  11. Source estimation for propagation processes on complex networks with an application to delays in public transportation systems

    NARCIS (Netherlands)

    Manitz, J. (Juliane); Harbering, J. (Jonas); M.E. Schmidt (Marie); T. Kneib (Thomas); A. Schöbel (Anita)

    2017-01-01

    textabstractThe correct identification of the source of a propagation process is crucial in many research fields. As a specific application, we consider source estimation of delays in public transportation networks. We propose two approaches: an effective distance median and a backtracking method.

  12. Multiple criteria decision-making process to derive consensus desired genetic gains for a dairy cattle breeding objective for diverse production systems.

    Science.gov (United States)

    Kariuki, C M; van Arendonk, J A M; Kahi, A K; Komen, H

    2017-06-01

    Dairy cattle industries contribute to food and nutrition security and are a source of income for numerous households in many developing countries. Selective breeding can enhance efficiency in these industries. Developing dairy industries are characterized by diverse production and marketing systems. In this paper, we use a weighted goal aggregating procedure to derive consensus trait preferences for different producer categories and processors. We based the study on the dairy industry in Kenya. The analytic hierarchy process was used to derive individual preferences for milk yield (MY), calving interval (CIN), production lifetime (PLT), mature body weight (MBW), and fat yield (FY). Results show that the classical classification of production systems into large-scale and smallholder systems does not capture all differences in trait preferences. These differences became apparent when classification was based on productivity at the individual animal level, with high-intensity and low-intensity producers and processors as the most important groups. High-intensity producers had the highest preferences for PLT and MY, whereas low-intensity producers had the highest preference for CIN and PLT; processors preferred MY and FY the most. The highest disagreements between the groups were observed for FY, PLT, and MY. Individual and group preferences were aggregated into consensus preferences using weighted goal programming. Desired gains were obtained as the product of consensus preferences and percentage genetic gains (G%). These were 2.42, 0.22, 2.51, 0.15, and 0.87 for MY, CIN, PLT, MBW, and FY, respectively. Consensus preferences can be used to derive a single compromise breeding objective for situations where the same genetic resources are used in diverse production and marketing circumstances.

  13. Effects of irradiation source and dose level on quality characteristics of processed meat products

    Science.gov (United States)

    Ham, Youn-Kyung; Kim, Hyun-Wook; Hwang, Ko-Eun; Song, Dong-Heon; Kim, Yong-Jae; Choi, Yun-Sang; Song, Beom-Seok; Park, Jong-Heum; Kim, Cheon-Jei

    2017-01-01

    The effect of irradiation source (gamma-ray, electron-beam, and X-ray) and dose level on the physicochemical, organoleptic and microbial properties of cooked beef patties and pork sausages was studied during 10 days of storage at 30±1 °C. The processed meat products were irradiated at 0, 2.5, 5, 7.5, and 10 kGy by the three different irradiation sources. The pH of cooked beef patties and pork sausages was unaffected by irradiation source or dose. The redness of beef patties linearly decreased with increasing dose level (P<0.05). No changes in overall acceptability were observed for pork sausages regardless of irradiation source (P>0.05), while gamma-ray irradiated beef patties showed significantly decreased overall acceptability in a dose-dependent manner (P<0.05). Lipid oxidation of the samples was accelerated by irradiation depending on irradiation source and dose level during storage at 30 °C. E-beam irradiation reduced the total aerobic bacteria of beef patties more effectively, while gamma-ray irradiation considerably decreased microbes in pork sausages as the irradiation dose increased. The results of this study indicate that quality attributes of meat products, in particular color, lipid oxidation, and microbial properties, are significantly influenced by the irradiation source.

  14. The development of control processes supporting source memory discrimination as revealed by event-related potentials.

    Science.gov (United States)

    de Chastelaine, Marianne; Friedman, David; Cycowicz, Yael M

    2007-08-01

    Improvement in source memory performance throughout childhood is thought to be mediated by the development of executive control. As postretrieval control processes may be better time-locked to the recognition response rather than the retrieval cue, the development of processes underlying source memory was investigated with both stimulus- and response-locked event-related potentials (ERPs). These were recorded in children, adolescents, and adults during a recognition memory exclusion task. Green- and red-outlined pictures were studied, but were tested in black outline. The test requirement was to endorse old items shown in one study color ("targets") and to reject new items along with old items shown in the alternative study color ("nontargets"). Source memory improved with age. All age groups retrieved target and nontarget memories as reflected by reliable parietal episodic memory (EM) effects, a stimulus-locked ERP correlate of recollection. Response-locked ERPs to targets and nontargets diverged in all groups prior to the response, although this occurred at an increasingly earlier time point with age. We suggest these findings reflect the implementation of attentional control mechanisms to enhance target memories and facilitate response selection with the greatest and least success, respectively, in adults and children. In adults only, response-locked ERPs revealed an early-onsetting parietal negativity for nontargets, but not for targets. This was suggested to reflect adults' ability to consistently inhibit prepotent target responses for nontargets. The findings support the notion that the development of source memory relies on the maturation of control processes that serve to enhance accurate selection of task-relevant memories.

  15. Application of random-point processes to the detection of radiation sources

    International Nuclear Information System (INIS)

    Woods, J.W.

    1978-01-01

    In this report the mathematical theory of random-point processes is reviewed and it is shown how use of the theory can obtain optimal solutions to the problem of detecting radiation sources. As noted, the theory also applies to image processing in low-light-level or low-count-rate situations. Paralleling Snyder's work, the theory is extended to the multichannel case of a continuous, two-dimensional (2-D), energy-time space. This extension essentially involves showing that the data are doubly stochastic Poisson (DSP) point processes in energy as well as time. Further, a new 2-D recursive formulation is presented for the radiation-detection problem with large computational savings over nonrecursive techniques when the number of channels is large (greater than or equal to 30). Finally, some adaptive strategies for on-line ''learning'' of unknown, time-varying signal and background-intensity parameters and statistics are presented and discussed. These adaptive procedures apply when a complete statistical description is not available a priori
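
    A minimal version of the detection problem treated in the report is deciding, from a count record, between background only (rate lam_b) and background plus source (rate lam_b + lam_s); for simple Poisson counts the log-likelihood ratio is monotone in the total count. The sketch below illustrates this fixed-interval detector with assumed rates; the doubly stochastic and recursive formulations of the report are not reproduced.

    ```python
    import math

    # Hedged sketch: fixed-interval likelihood-ratio detection of a radiation
    # source from Poisson counts, background-only rate lam_b versus
    # source-present rate lam_b + lam_s. All values are illustrative assumptions.

    lam_b, lam_s = 5.0, 3.0        # counts per second: background, source contribution
    T = 10.0                       # observation interval, seconds
    observed_counts = 72

    mu0 = lam_b * T                # expected counts under H0 (background only)
    mu1 = (lam_b + lam_s) * T      # expected counts under H1 (source present)

    # Log-likelihood ratio for Poisson counts.
    log_lr = observed_counts * math.log(mu1 / mu0) - (mu1 - mu0)

    threshold = 0.0                # log of prior/cost ratio; 0 means equal priors
    decision = "source present" if log_lr > threshold else "background only"
    print(f"log-likelihood ratio = {log_lr:.2f} -> {decision}")
    ```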

  16. Economic analysis of novel synergistic biofuel (H{sub 2}Bioil) processes

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Navneet R.; Mallapragada, Dharik S.; Agrawal, Rakesh [Purdue University, School of Chemical Engineering, West Lafayette, IN (United States); Tyner, Wallace E. [Purdue University, Department of Agricultural Economics, West Lafayette, IN (United States)

    2012-06-15

    Fast-pyrolysis based processes can be built on small-scale and have higher process carbon and energy efficiency as compared to other options. H{sub 2}Bioil is a novel process based on biomass fast-hydropyrolysis and subsequent hydrodeoxygenation (HDO) and can potentially provide high yields of high energy density liquid fuel at relatively low hydrogen consumption. This paper contains a comprehensive financial analysis of the H{sub 2}Bioil process with hydrogen derived from different sources. Three different carbon tax scenarios are analyzed: no carbon tax, $55/metric ton carbon tax and $110/metric ton carbon tax. The break-even crude oil price for a delivered biomass cost of $94/metric ton when hydrogen is derived from coal, natural gas or nuclear energy ranges from $103 to $116/bbl for no carbon tax and even lower ($99-$111/bbl) for the carbon tax scenarios. This break-even crude oil price compares favorably with the literature estimated prices of fuels from alternate biochemical and thermochemical routes. The impact of the chosen carbon tax is found to be limited relative to the impact of the H{sub 2} source on the H{sub 2}Bioil break-even price. The economic robustness of the processes for hydrogen derived from coal, natural gas, or nuclear energy is seen by an estimated break-even crude oil price of $114-$126/bbl when biomass cost is increased to $121/metric ton. (orig.)

  17. A Thermodynamic Library for Simulation and Optimization of Dynamic Processes

    DEFF Research Database (Denmark)

    Ritschel, Tobias Kasper Skovborg; Gaspar, Jozsef; Jørgensen, John Bagterp

    2017-01-01

    Process system tools, such as simulation and optimization of dynamic systems, are widely used in the process industries for development of operational strategies and control for process systems. These tools rely on thermodynamic models and many thermodynamic models have been developed for different compounds and mixtures. However, rigorous thermodynamic models are generally computationally intensive and not available as open-source libraries for process simulation and optimization. In this paper, we describe the application of a novel open-source rigorous thermodynamic library, ThermoLib, which is designed for dynamic simulation and optimization of vapor-liquid processes. ThermoLib is implemented in Matlab and C and uses cubic equations of state to compute vapor and liquid phase thermodynamic properties. The novelty of ThermoLib is that it provides analytical first and second order derivatives...
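
    To make the cubic equation-of-state machinery concrete, the sketch below evaluates the Peng-Robinson pressure P = R*T/(v - b) - a*alpha/(v^2 + 2*b*v - b^2) for pure propane at assumed conditions. It is a generic textbook calculation, not ThermoLib's implementation or API.

    ```python
    import math

    # Hedged sketch: Peng-Robinson cubic equation of state for a pure component,
    # P = R*T/(v - b) - a*alpha / (v^2 + 2*b*v - b^2).
    # Generic textbook formulation; this is not ThermoLib's actual interface.

    R = 8.314462618  # J/(mol K)

    def peng_robinson_pressure(T, v, Tc, Pc, omega):
        """Pressure in Pa at temperature T (K) and molar volume v (m^3/mol)."""
        a = 0.45724 * R**2 * Tc**2 / Pc
        b = 0.07780 * R * Tc / Pc
        kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega**2
        alpha = (1.0 + kappa * (1.0 - math.sqrt(T / Tc)))**2
        return R * T / (v - b) - a * alpha / (v**2 + 2.0 * b * v - b**2)

    # Propane (approximate constants): Tc = 369.8 K, Pc = 4.25 MPa, omega = 0.152.
    print(peng_robinson_pressure(T=350.0, v=1.0e-3, Tc=369.8, Pc=4.25e6, omega=0.152))
    ```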

  18. Assessment of source material (U and Th) in exploration, mining, processing of zircon sand in Central Kalimantan

    International Nuclear Information System (INIS)

    Dedi Hermawan; Pandu Dewanto; Sudarto

    2011-01-01

    From 2004 to 2008, according to data released by the Commerce Department, the volume of zircon sand and concentrate exports increased sharply. One of the locations in Indonesia where zircon sand is widely available is the island of Kalimantan. For example, Central Kalimantan Province exported about 51,000 to 79,000 tonnes of zircon sand annually in 2007 and 2008. The concentration of source material in zircon sand is important to know because the presence of natural radioactive U and Th in zircon sand poses a potential radiation hazard. It is therefore necessary to assess the potential reserves of source material contained in zircon sand and the radiation safety measures applied in the mining and processing of zircon sand. In this paper the locations of zircon sand mining and processing considered are restricted to the province of Central Kalimantan. The assessment found that the source material carried in zircon sand exported from the province of Central Kalimantan has the potential to exceed the limits set by BAPETEN Chairman Decree No. 9 of 2006 on the Implementation of the Additional Protocol to the Accountability System and Control of Nuclear Materials. In terms of compliance with radiation safety, increased occupational safety (K3) surveillance is required during mining and processing by workers, supervisors and the company's site management in order to achieve compliance with the provisions for the management of materials and the safety of radiation sources based on national (BAPETEN) and international regulations. (author)

  19. Image Harvest: an open-source platform for high-throughput plant image processing and analysis

    Science.gov (United States)

    Knecht, Avi C.; Campbell, Malachy T.; Caprez, Adam; Swanson, David R.; Walia, Harkamal

    2016-01-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable to processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. PMID:27141917

  20. Estrogen-related receptor gamma disruption of source water and drinking water treatment processes extracts.

    Science.gov (United States)

    Li, Na; Jiang, Weiwei; Rao, Kaifeng; Ma, Mei; Wang, Zijian; Kumaran, Satyanarayanan Senthik

    2011-01-01

    Environmental chemicals in drinking water can impact human health through nuclear receptors. Additionally, estrogen-related receptors (ERRs) are vulnerable to endocrine-disrupting effects. To date, however, the ERR-disrupting potency of drinking water has not been reported. We used an ERRgamma two-hybrid yeast assay to screen for ERRgamma-disrupting activities in a drinking water treatment plant (DWTP) located in north China and in source water from a reservoir, focusing on agonistic, antagonistic, and inverse agonistic activity with respect to 4-hydroxytamoxifen (4-OHT). Water treatment processes in the DWTP consisted of pre-chlorination, coagulation, coal and sand filtration, activated carbon filtration, and secondary chlorination. Samples were extracted by solid phase extraction. Results showed that ERRgamma antagonistic activities were found in all sample extracts, but agonistic and inverse agonistic activity with respect to 4-OHT was not found. When calibrated with the toxic equivalent of 4-OHT, antagonistic effluent effects ranged from 3.4 to 33.1 microg/L. In the treatment processes, secondary chlorination was effective in removing ERRgamma antagonists, but the coagulation process led to significantly increased ERRgamma antagonistic activity. The drinking water treatment processes removed 73.5% of the ERRgamma antagonists. To our knowledge, the occurrence of ERRgamma-disrupting activities in source and drinking water in vitro has not been reported previously. It is vital, therefore, to increase our understanding of ERRgamma-disrupting activities in drinking water.

  1. Integrated report on the toxicological mitigation of coal liquids by hydrotreatment and other processes. [Petroleum and coal-derived products

    Energy Technology Data Exchange (ETDEWEB)

    Guerin, M.R.; Griest, W.H.; Ho, C.H.; Smith, L.H.; Witschi, H.P.

    1986-06-01

    Research here on the toxicological properties of coal-derived liquids focuses on characterizing the refining process and refined products. Principal attention is given to the potential tumorigenicity of coal-derived fuels and to the identification of means to further reduce tumorigenicity should this be found necessary. Hydrotreatment is studied most extensively because it will almost certainly be required to produce commercial products and because it is likely to also greatly reduce tumorigenic activity relative to that of crude coal-liquid feedstocks. This report presents the results of a lifetime C3H mouse skin tumorigenicity assay of an H-Coal series of oils and considers the relationships between tumorigenicity, chemistry, and processing. Lifetime assay results are reported for an H-Coal syncrude mode light oil/heavy oil blend, a low-severity hydrotreatment product, a high-severity hydrotreatment product, a naphtha reformate, a heating oil, a petroleum-derived reformate, and a petroleum-derived heating oil. Data are compared with those for an earlier study of an SRC-II blend and products of its hydrotreatment. Adequate data are presented to allow an independent qualitative assessment of the conclusions while statistical evaluation of the data is being completed. The report also documents the physical and chemical properties of the oils tested. 33 refs., 14 figs., 53 tabs.

  2. Cathode R and D for future light sources

    Energy Technology Data Exchange (ETDEWEB)

    Dowell, D.H., E-mail: dowell@slac.stanford.ed [SLAC National Accelerator Laboratory, 2575 Sand Hill Road, Menlo Park, CA 94025 (United States); Bazarov, I.; Dunham, B. [Cornell University, Cornell Laboratory for Accelerator-Based Sciences and Education (CLASSE) Wilson Laboratory, Cornell University, Ithaca, NY 14853 (United States); Harkay, K. [Argonne National Laboratory, 9700 S. Cass Avenue, Argonne, Il 60439 (United States); Hernandez-Garcia, C. [Thomas Jefferson Laboratory, 12000 Jefferson Ave, Free Electron Laser Suite 19 Newport News, VA 23606 (United States); Legg, R. [University of Wisconsin, SRC, 3731 Schneider Dr., Stoughton, WI 53589 (United States); Padmore, H. [Lawrence Berkeley National Laboratory, 1 Cyclotron Rd, Berkeley, CA 94720 (United States); Rao, T.; Smedley, J. [Brookhaven National Laboratory, 20 Technology Street, Bldg. 535B, Brookhaven National Laboratory Upton, NY 11973 (United States); Wan, W. [Lawrence Berkeley National Laboratory, 1 Cyclotron Rd, Berkeley, CA 94720 (United States)

    2010-10-21

    This paper reviews the requirements and current status of cathodes for accelerator applications, and proposes a research and development plan for advancing cathode technology. Accelerator cathodes need to have long operational lifetimes and produce electron beams with a very low emittance. The two principal emission processes to be considered are thermionic emission and photoemission, with the photocathodes being further subdivided into metals and semiconductors. Field emission cathodes are not included in this analysis. The thermal emittance is derived and the resulting formulas are used to compare the various cathode materials. To date, there is no cathode which provides all the requirements needed for the proposed future light sources. Therefore a three-part research plan is described to develop cathodes for these future light source applications.

  3. Study of surface ionization and LASER ionization processes using the SOMEIL ion source: application to the Spiral 2 laser ion source development

    Energy Technology Data Exchange (ETDEWEB)

    Bajeat, O., E-mail: bajeat@ganil.fr; Lecesne, N.; Leroy, R.; Maunoury, L.; Osmond, B.; Sjodin, M. [GANIL (France); Maitre, A.; Pradeilles, N. [Laboratoire Science des Procedes Ceramiques et de Traitements de Surface (SPCTS) 12 (France)

    2013-04-15

    SPIRAL2 is the new project under construction at GANIL to provide radioactive ion beams, in particular neutron-rich ion beams, to the nuclear physics community. For the production of condensable radioactive elements, a resonant ionization laser ion source is under development at GANIL. In order to generate the ions of interest with good selectivity and purity, our group is studying ways to minimize the surface ionization process by using refractory materials with a low work function as the ionizer tube. For these investigations a dedicated ion source, called SOMEIL (Source Optimisee pour les Mesures d'Efficacite d'Ionisation Laser), is used. Numerous types of ionizer tubes made of various materials and geometries are tested. Surface ionization and laser ionization efficiencies can be measured for each of them.

  4. Numerical modeling of optical coherent transient processes with complex configurations-III: Noisy laser source

    International Nuclear Information System (INIS)

    Chang Tiejun; Tian Mingzhen

    2007-01-01

    A previously developed numerical model based on Maxwell-Bloch equations was modified to simulate optical coherent transient and spectral hole burning processes with noisy laser sources. Random walk phase noise was simulated using laser-phase sequences generated numerically according to the normal distribution of the phase shift. The noise model was tested by comparing the simulated spectral hole burning effect with the analytical solution. The noise effects on a few typical optical coherent transient processes were investigated using this numerical tool. Flicker and random walk frequency noises were considered in the accumulation process.

  5. Exploring information-seeking processes by business: analyzing source and channel choices in business-to-government service interactions

    NARCIS (Netherlands)

    van den Boer, Yvon; Pieterson, Willem Jan; van Dijk, Johannes A.G.M.; Arendsen, R.

    2016-01-01

    With the rise of electronic channels it has become easier for businesses to consult various types of information sources in information-seeking processes. Governments are urged to rethink their role as reliable information source and the roles of their (electronic) service channels to provide

  6. Kinetic parameters for source driven systems

    International Nuclear Information System (INIS)

    Dulla, S.; Ravetto, P.; Carta, M.; D'Angelo, A.

    2006-01-01

    The definition of the characteristic kinetic parameters of a subcritical source-driven system constitutes an interesting problem in reactor physics with important consequences for practical applications. Consistent and physically meaningful values of the parameters make it possible to obtain accurate results from kinetic simulation tools and to correctly interpret kinetic experiments. For subcritical systems a preliminary problem arises in the choice of a suitable weighting function to be used in the projection procedure to derive a point model. The present work illustrates a consistent factorization-projection procedure which leads to the definition of the kinetic parameters in a straightforward manner. The reactivity term is introduced consistently with the generalized perturbation theory applied to the source multiplication factor ks, which is thus given a physical role in the kinetic model. The effective prompt lifetime is introduced on the assumption that a neutron generation can be initiated by both the fission process and the source emission. Results are presented for simplified configurations, to fully illustrate the physical features, and for a more complicated, highly decoupled system treated in transport theory. (authors)
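    As background for the factorization-projection procedure described above, the generic point-kinetics equations for a subcritical system driven by an external source can be written as follows; this is the textbook form with a projected source term, not necessarily the exact parameter definitions adopted by the authors:

    \[
    \frac{dP}{dt} = \frac{\rho - \beta}{\Lambda}\,P + \sum_i \lambda_i C_i + S,
    \qquad
    \frac{dC_i}{dt} = \frac{\beta_i}{\Lambda}\,P - \lambda_i C_i,
    \]

    where \(P\) is the amplitude function, \(\rho\) the reactivity, \(\beta_i\) and \(\lambda_i\) the delayed-neutron fractions and decay constants (\(\beta = \sum_i \beta_i\)), \(\Lambda\) the effective prompt generation time, and \(S\) the projected external source.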

  7. Dementia and Depression: A Process Model for Differential Diagnosis.

    Science.gov (United States)

    Hill, Carrie L.; Spengler, Paul M.

    1997-01-01

    Delineates a process model for mental-health counselors to follow in formulating a differential diagnosis of dementia and depression in adults 65 years and older. The model is derived from empirical, theoretical, and clinical sources of evidence. Explores components of the clinical interview, of hypothesis formation, and of hypothesis testing.…

  8. EVALUATING SOIL EROSION PARAMETER ESTIMATES FROM DIFFERENT DATA SOURCES

    Science.gov (United States)

    Topographic factors and soil loss estimates that were derived from three data sources (STATSGO, 30-m DEM, and 3-arc second DEM) were compared. Slope magnitudes derived from the three data sources were consistently different. Slopes from the DEMs tended to provide a flattened sur...

  9. Role of environmental chemicals, processed food derivatives, and nutrients in the induction of carcinogenesis.

    Science.gov (United States)

    Persano, Luca; Zagoura, Dimitra; Louisse, Jochem; Pistollato, Francesca

    2015-10-15

    In recent years it has been hypothesized that cancer stem cells (CSCs) are the actual driving force of tumor formation, highlighting the need to specifically target CSCs to successfully eradicate cancer growth and recurrence. Particularly, the deregulation of physiological signaling pathways controlling stem cell proliferation, self-renewal, differentiation, and metabolism is currently considered as one of the leading determinants of cancer formation. Given their peculiar, slow-dividing phenotype and their ability to respond to multiple microenvironmental stimuli, stem cells appear to be more susceptible to genetic and epigenetic carcinogens, possibly undergoing mutations resulting in tumor formation. In particular, some animal-derived bioactive nutrients and metabolites known to affect the hormonal milieu, and also chemicals derived from food processing and cooking, have been described as possible carcinogenic factors. Here, we review most recent literature in this field, highlighting how some environmental toxicants, some specific nutrients and their secondary products can induce carcinogenesis, possibly impacting stem cells and their niches, thus causing tumor growth.

  10. Diversity of natural self-derived ligands presented by different HLA class I molecules in transporter antigen processing-deficient cells.

    Directory of Open Access Journals (Sweden)

    Elena Lorente

    The transporter associated with antigen processing (TAP) translocates the cytosol-derived proteolytic peptides to the endoplasmic reticulum lumen where they complex with nascent human leukocyte antigen (HLA) class I molecules. Non-functional TAP complexes and viral or tumoral blocking of these transporters leads to reduced HLA class I surface expression and a drastic change in the available peptide repertoire. Using mass spectrometry to analyze complex HLA-bound peptide pools isolated from large numbers of TAP-deficient cells, we identified 334 TAP-independent ligands naturally presented by four different HLA-A, -B, and -C class I molecules with very different TAP dependency from the same cell line. The repertoire of TAP-independent peptides examined favored increased peptide lengths and a lack of strict binding motifs for all four HLA class I molecules studied. The TAP-independent peptidome arose from 182 parental proteins, the majority of which yielded one HLA ligand. In contrast, TAP-independent antigen processing of very few cellular proteins generated multiple HLA ligands. Comparison between the TAP-independent peptidome and the proteome of several subcellular locations suggests that the secretory vesicle-like organelles could be a relevant source of parental proteins for TAP-independent HLA ligands. Finally, a predominant endoproteolytic peptidase specificity for Arg/Lys or Leu/Phe residues in the P1 position of the scissile bond was found for the TAP-independent ligands. These data draw a new and intricate picture of TAP-independent pathways.

  11. Source rupture process of the 2016 Kaikoura, New Zealand earthquake estimated from the kinematic waveform inversion of strong-motion data

    Science.gov (United States)

    Zheng, Ao; Wang, Mingfeng; Yu, Xiangwei; Zhang, Wenbo

    2018-03-01

    On 2016 November 13, an Mw 7.8 earthquake occurred in the northeast of the South Island of New Zealand near Kaikoura. The earthquake caused severe damage and had a great impact on the local natural environment and society. Referring to the tectonic environment and defined active faults, the field investigation and geodetic evidence reveal that at least 12 fault sections ruptured in the earthquake, and the focal mechanism is one of the most complicated in historical earthquakes. On account of the complexity of the source rupture, we propose a multisegment fault model based on the distribution of surface ruptures and active tectonics. We derive the source rupture process of the earthquake using the kinematic waveform inversion method with the multisegment fault model from strong-motion data of 21 stations (0.05-0.35 Hz). The inversion result suggests the rupture initiates in the epicentral area near the Humps fault, and then propagates northeastward along several faults, until the offshore Needles fault. The Mw 7.8 event is a mixture of right-lateral strike-slip and reverse slip, and the maximum slip is approximately 19 m. The synthetic waveforms reproduce the characteristics of the observed ones well. In addition, we synthesize the coseismic offset distribution of the ruptured region from the slips of the upper subfaults in the fault model, which is roughly consistent with the surface breaks observed in the field survey.

  12. The Impact of the Brain-Derived Neurotrophic Factor Gene on Trauma and Spatial Processing

    Directory of Open Access Journals (Sweden)

    Jessica K. Miller

    2017-11-01

    The influence of genes and the environment on the development of Post-Traumatic Stress Disorder (PTSD) continues to motivate neuropsychological research, with one consistent focus being the Brain-Derived Neurotrophic Factor (BDNF) gene, given its impact on the integrity of the hippocampal memory system. Research into human navigation also considers the BDNF gene in relation to hippocampal dependent spatial processing. This speculative paper brings together trauma and spatial processing for the first time and presents exploratory research into their interactions with BDNF. We propose that quantifying the impact of BDNF on trauma and spatial processing is critical and may well explain individual differences in clinical trauma treatment outcomes and in navigation performance. Research has already shown that the BDNF gene influences PTSD severity and prevalence as well as navigation behaviour. However, more data are required to demonstrate the precise hippocampal dependent processing mechanisms behind these influences in different populations and environmental conditions. This paper provides insight from recent studies and calls for further research into the relationship between allocentric processing, trauma processing and BDNF. We argue that research into these neural mechanisms could transform PTSD clinical practice and professional support for individuals in trauma-exposing occupations such as emergency response, law enforcement and the military.

  13. The Impact of the Brain-Derived Neurotrophic Factor Gene on Trauma and Spatial Processing.

    Science.gov (United States)

    Miller, Jessica K; McDougall, Siné; Thomas, Sarah; Wiener, Jan

    2017-11-27

    The influence of genes and the environment on the development of Post-Traumatic Stress Disorder (PTSD) continues to motivate neuropsychological research, with one consistent focus being the Brain-Derived Neurotrophic Factor (BDNF) gene, given its impact on the integrity of the hippocampal memory system. Research into human navigation also considers the BDNF gene in relation to hippocampal dependent spatial processing. This speculative paper brings together trauma and spatial processing for the first time and presents exploratory research into their interactions with BDNF. We propose that quantifying the impact of BDNF on trauma and spatial processing is critical and may well explain individual differences in clinical trauma treatment outcomes and in navigation performance. Research has already shown that the BDNF gene influences PTSD severity and prevalence as well as navigation behaviour. However, more data are required to demonstrate the precise hippocampal dependent processing mechanisms behind these influences in different populations and environmental conditions. This paper provides insight from recent studies and calls for further research into the relationship between allocentric processing, trauma processing and BDNF. We argue that research into these neural mechanisms could transform PTSD clinical practice and professional support for individuals in trauma-exposing occupations such as emergency response, law enforcement and the military.

  14. Comparative Evaluation of Pulsewidth Modulation Strategies for Z-Source Neutral-Point-Clamped Inverter

    DEFF Research Database (Denmark)

    Loh, P.C.; Blaabjerg, Frede; Wong, C.P.

    2007-01-01

    Z-source neutral-point-clamped (NPC) inverter has recently been proposed as an alternative three-level buck-boost power conversion solution with an improved output waveform quality. In principle, the designed Z-source inverter functions by selectively "shooting through" its power sources, coupled to the inverter using two unique Z-source impedance networks, to boost the inverter three-level output waveform. Proper modulation of the new inverter would therefore require careful integration of the selective shoot-through process to the basic switching concepts to achieve maximal voltage-boost, minimal ... The paper presents pulsewidth modulation (PWM) strategies for controlling the Z-source NPC inverter. While developing the PWM techniques, attention has been devoted to carefully derive them from a common generic basis for improved portability, easier implementation, and most importantly, assisting readers in understanding all concepts ...

  15. Pathway computation in models derived from bio-science text sources

    DEFF Research Database (Denmark)

    Andreasen, Troels; Bulskov, Henrik; Jensen, Per Anker

    2017-01-01

    This paper outlines a system, OntoScape, serving to accomplish complex inference tasks on knowledge bases and bio-models derived from life-science text corpora. The system applies so-called natural logic, a form of logic which is readable for humans. This logic affords ontological representations...

  16. Minimally processed beetroot waste as an alternative source to obtain functional ingredients.

    Science.gov (United States)

    Costa, Anne Porto Dalla; Hermes, Vanessa Stahl; Rios, Alessandro de Oliveira; Flôres, Simone Hickmann

    2017-06-01

    Large amounts of waste are generated by the minimally processed vegetables industry, such as the waste from beetroot processing. The aim of this study was to determine the best method to obtain flour from minimally processed beetroot waste dried at different temperatures, besides producing a colorant from such waste and assessing its stability over 45 days. Beetroot waste dried at 70 °C yields flour with significant antioxidant activity and higher betalain content than flour produced from waste dried at 60 and 80 °C, while chlorination had no impact on the process since microbiological results were consistent for its application. The colorant obtained from beetroot waste showed color stability for 20 days and potential antioxidant activity over the analysis period, thus it can be used as a functional additive to improve nutritional characteristics and appearance of food products. These results are promising since minimally processed beetroot waste can be used as an alternative source of natural and functional ingredients with high antioxidant activity and betalain content.

  17. Preparation of functional composite materials based on chemically derived graphene using solution process

    International Nuclear Information System (INIS)

    Kim, M; Hyun, W J; Mun, S C; Park, O O

    2015-01-01

    Chemically derived graphenes were assembled into functional composite materials using a solution process from stable solvent dispersions. We have developed foldable electronic circuits on paper substrates using vacuum filtration of a graphene nanoplate dispersion and a selective transfer process without the need for special equipment. The electronic circuits on paper substrates revealed only a small change in conductance under various folding angles and maintained an electronic path after repetitive folding and unfolding. We also prepared flexible, binder-free graphene paper-like materials by addition of graphene oxide as a film stabilizer. These graphene papers showed outstanding electrical conductivity up to 26,000 S/m and high charge capacity as an anode in a lithium-ion battery without any post-treatments. In the last case, multi-functional thin film structures of graphene nanoplates were fabricated by using a layer-by-layer assembly technique, showing optical transparency, electrical conductivity and enhanced gas barrier properties. (paper)

  18. Calculating depths to shallow magnetic sources using aeromagnetic data from the Tucson Basin

    Science.gov (United States)

    Casto, Daniel W.

    2001-01-01

    Using gridded high-resolution aeromagnetic data, the performance of several automated 3-D depth-to-source methods was evaluated over shallow control sources based on how close their depth estimates came to the actual depths to the tops of the sources. For all three control sources, only the simple analytic signal method, the local wavenumber method applied to the vertical integral of the magnetic field, and the horizontal gradient method applied to the pseudo-gravity field provided median depth estimates that were close (-11% to +14% error) to the actual depths. Careful attention to data processing was required in order to calculate a sufficient number of depth estimates and to reduce the occurrence of false depth estimates. For example, to eliminate sampling bias, high-frequency noise and interference from deeper sources, it was necessary to filter the data before calculating derivative grids and subsequent depth estimates. To obtain smooth spatial derivative grids using finite differences, the data had to be gridded at intervals less than one percent of the anomaly wavelength. Before finding peak values in the derived signal grids, it was necessary to remove calculation noise by applying a low-pass filter in the grid-line directions and to re-grid at an interval that enabled the search window to encompass only the peaks of interest. Using the methods that worked best over the control sources, depth estimates over geologic sites of interest suggested the possible occurrence of volcanics nearly 170 meters beneath a city landfill. Also, a throw of around 2 kilometers was determined for a detachment fault that has a displacement of roughly 6 kilometers.
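    As a minimal illustration of the simple analytic signal method mentioned above, the sketch below computes the 3-D analytic signal amplitude of a gridded total-field anomaly, taking horizontal derivatives by finite differences and the vertical derivative in the wavenumber domain. The pre-filtering, re-gridding, peak picking and depth-estimation steps discussed in the abstract are deliberately omitted, and the grid spacings and array layout are assumptions.

```python
import numpy as np

def analytic_signal_amplitude(field, dx, dy):
    """Amplitude of the 3-D analytic signal of a gridded magnetic anomaly.

    Horizontal derivatives use finite differences (np.gradient); the vertical
    derivative is obtained in the wavenumber domain by multiplying the
    spectrum with the radial wavenumber |k|.
    """
    ny, nx = field.shape
    dT_dy, dT_dx = np.gradient(field, dy, dx)              # horizontal derivatives

    kx = 2.0 * np.pi * np.fft.fftfreq(nx, d=dx)
    ky = 2.0 * np.pi * np.fft.fftfreq(ny, d=dy)
    KX, KY = np.meshgrid(kx, ky)
    k = np.sqrt(KX**2 + KY**2)                              # radial wavenumber

    dT_dz = np.real(np.fft.ifft2(np.fft.fft2(field) * k))   # vertical derivative

    return np.sqrt(dT_dx**2 + dT_dy**2 + dT_dz**2)
```

    Peaks of this amplitude grid lie over the edges of magnetic sources; relating the width or decay of a peak to the depth of the source top is the step that required the careful filtering and re-gridding described in the abstract.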

  19. Green Processing of Lignocellulosic Biomass and Its Derivatives in Deep Eutectic Solvents.

    Science.gov (United States)

    Tang, Xing; Zuo, Miao; Li, Zheng; Liu, Huai; Xiong, Caixia; Zeng, Xianhai; Sun, Yong; Hu, Lei; Liu, Shijie; Lei, Tingzhou; Lin, Lu

    2017-07-10

    The scientific community has been seeking cost-competitive and green solvents with good dissolving capacity for the valorization of lignocellulosic biomass. At this point, deep eutectic solvents (DESs) are currently emerging as a new class of promising solvents that are generally liquid eutectic mixtures formed by self-association (or hydrogen-bonding interaction) of two or three components. DESs are attractive solvents for the fractionation (or pretreatment) of lignocellulose and the valorization of lignin, owing to the high solubility of lignin in DESs. DESs are also employed as effective media for the modification of cellulose to afford functionalized cellulosic materials, such as cellulose nanocrystals. More interestingly, biomass-derived carbohydrates, such as fructose, can be used as one of the constituents of DESs and then dehydrated to 5-hydroxymethylfurfural in high yield. In this review, a comprehensive summary of recent contributions of DESs to the processing of lignocellulosic biomass and its derivatives is provided. Moreover, further discussion of the challenges of the application of DESs in biomass processing is presented. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Bioactive Carbohydrates and Peptides in Foods: An Overview of Sources, Downstream Processing Steps and Associated Bioactivities.

    Science.gov (United States)

    Hayes, Maria; Tiwari, Brijesh K

    2015-09-17

    Bioactive peptides and carbohydrates are sourced from a myriad of plants, animals and insects and have huge potential for use as food ingredients and pharmaceuticals. However, downstream processing bottlenecks hinder the potential use of these natural bioactive compounds and add cost to production processes. This review discusses the health benefits and bioactivities associated with peptides and carbohydrates of natural origin, the downstream processing methodologies involved, and novel processes which may be used to overcome these bottlenecks.

  1. Compact laser-produced plasma EUV sources for processing polymers and nanoimaging

    International Nuclear Information System (INIS)

    Fiedorowicz, H.; Bartnik, A.; Jarocki, R.; Kostecki, J.; Szczurek, M.; Wachulak, P.

    2010-01-01

    Extreme ultraviolet (EUV) radiation can be produced from a high-temperature plasma generated by the interaction of high-power laser pulses with matter. Laser plasma EUV sources are being considered for various applications in physics, material science, biomedicine, and technology. In this paper new compact laser plasma EUV sources developed for processing polymers and for imaging are presented. The sources are based on a gas puff target formed by pulsed injection of a small amount of gas under high pressure into the laser focus region. The use of the gas puff target instead of a solid target allows for efficient generation of EUV radiation without debris production. The compact laser plasma EUV source based on a gas puff target was developed for metrology applications. The EUV source developed for processing polymers is equipped with a grazing-incidence axisymmetrical ellipsoidal mirror to focus EUV radiation in a relatively broad spectral range with a strong maximum near 10 nm. The size of the focal spot is about 1.3 mm in diameter with a maximum fluence up to 70 mJ/cm2. EUV radiation in the wavelength range of about 5 to 50 nm is produced by irradiation of a xenon or krypton gas puff target with a Nd:YAG laser operating at 10 Hz and delivering 4 ns pulses of energy up to 0.8 J per pulse. Experiments on EUV irradiation of various polymers have been performed. Modification of polymer surfaces was achieved, primarily due to direct photo-etching with EUV photons and formation of micro- and nanostructures on the surface. The mechanism of the interaction is similar to UV laser ablation, where energetic photons cause chemical bonds of the polymer chain to be broken. However, because of the very low penetration depth of EUV radiation, the interaction region is limited to a very thin surface layer (<100 nm). This makes it possible to avoid degradation of the bulk material caused by deeply penetrating UV radiation. The results of the studies

  2. Process for carbonizing, distilling, and vaporizing of coal from any source

    Energy Technology Data Exchange (ETDEWEB)

    Limberg, T

    1916-10-15

    A process is described for carbonizing, distilling, and vaporizing coal from any source, especially humid and bituminous coals as well as bituminous shale and peat, for recovering an especially light tar with a large aliphatic hydrocarbon content. The process is characterized in that the material is exposed to internal heating under vacuum at a temperature below dull-red heat. The distillation products of the material are washed away by the heating gases along the whole length of the furnace and are removed immediately and carried into separate condensers.

  3. Topography, power, and current source density of θ oscillations during reward processing as markers for alcohol dependence.

    Science.gov (United States)

    Kamarajan, Chella; Rangaswamy, Madhavi; Manz, Niklas; Chorlian, David B; Pandey, Ashwini K; Roopesh, Bangalore N; Porjesz, Bernice

    2012-05-01

    Recent studies have linked alcoholism with a dysfunctional neural reward system. Although several electrophysiological studies have explored reward processing in healthy individuals, such studies in alcohol-dependent individuals are quite rare. The present study examines theta oscillations during reward processing in abstinent alcoholics. The electroencephalogram (EEG) was recorded in 38 abstinent alcoholics and 38 healthy controls as they performed a single outcome gambling task, which involved outcomes of either loss or gain of an amount (10 or 50¢) that was bet. Event-related theta band (3.0-7.0 Hz) power following each outcome stimulus was computed using the S-transform method. Theta power at the time window of the outcome-related negativity (ORN) and positivity (ORP) (200-500 ms) was compared across groups and outcome conditions. Additionally, behavioral data of impulsivity and task performance were analyzed. The alcoholic group showed significantly decreased theta power during reward processing compared to controls. Current source density (CSD) maps of alcoholics revealed weaker and diffuse source activity for all conditions and weaker bilateral prefrontal sources during the Loss 50 condition when compared with controls who manifested stronger and focused midline sources. Furthermore, alcoholics exhibited increased impulsivity and risk-taking on the behavioral measures. A strong association between reduced anterior theta power and impulsive task-performance was observed. It is suggested that decreased power and weaker and diffuse CSD in alcoholics may be due to dysfunctional neural reward circuitry. The relationship among alcoholism, theta oscillations, reward processing, and impulsivity could offer clues to understand brain circuitries that mediate reward processing and inhibitory control. Copyright © 2011 Wiley-Liss, Inc.
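    To make the spectral-power step concrete, the sketch below computes trial-averaged theta-band (3-7 Hz) power in a 200-500 ms post-outcome window from epoched EEG at one electrode. A band-pass filter plus Hilbert envelope is used here instead of the S-transform employed in the study, and the sampling rate, epoch layout and variable names are assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def theta_power(trials, fs, band=(3.0, 7.0), window=(0.2, 0.5)):
    """Mean theta-band power in a post-stimulus window, averaged over trials.

    `trials` is an (n_trials, n_samples) array time-locked to the outcome
    stimulus, with t = 0 at sample 0 and sampling rate `fs` in Hz.
    """
    nyq = fs / 2.0
    b, a = butter(4, [band[0] / nyq, band[1] / nyq], btype="band")
    filtered = filtfilt(b, a, trials, axis=1)          # zero-phase band-pass
    power = np.abs(hilbert(filtered, axis=1)) ** 2     # instantaneous power
    i0, i1 = int(window[0] * fs), int(window[1] * fs)
    return power[:, i0:i1].mean()                      # scalar summary

# Example: 38 trials of 1-s epochs sampled at 256 Hz (synthetic data).
rng = np.random.default_rng(0)
print(theta_power(rng.standard_normal((38, 256)), fs=256))
```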

  4. Atmospheric processing of combustion aerosols as a source of soluble iron to the open ocean

    OpenAIRE

    伊藤, 彰記; ITO, Akinori

    2015-01-01

    The majority of bioavailable iron (Fe) from the atmosphere is delivered from arid and semiarid regions to the oceans because the global deposition of iron from combustion sources is small compared with that from mineral dust. Atmospheric processing of mineral aerosols by inorganic and organic acids from anthropogenic and natural sources has been shown to increase the iron solubility of soils (initially < 0.5%) up to about 10%. On the other hand, atmospheric observations have shown that iron i...

  5. Image Harvest: an open-source platform for high-throughput plant image processing and analysis.

    Science.gov (United States)

    Knecht, Avi C; Campbell, Malachy T; Caprez, Adam; Swanson, David R; Walia, Harkamal

    2016-05-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable for processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. © The Author 2016. Published by Oxford University Press on behalf of the Society for Experimental Biology.
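    For a rough sense of what a "digital trait" can look like, the sketch below segments a single phenotyping image and reports two simple traits (projected shoot area and vertical plant extent, in pixels). This is not the Image Harvest API; the file name, the greenness index and the Otsu thresholding choice are illustrative assumptions.

```python
import numpy as np
from skimage import io, filters

# Hypothetical input image from a phenotyping platform.
img = io.imread("plant_side_view.png").astype(float)

# Simple greenness index to separate plant pixels from background.
greenness = img[..., 1] - 0.5 * (img[..., 0] + img[..., 2])
mask = greenness > filters.threshold_otsu(greenness)

rows = np.nonzero(mask)[0]
shoot_area_px = int(mask.sum())                            # trait 1: projected shoot area
plant_height_px = int(np.ptp(rows)) if rows.size else 0    # trait 2: vertical extent

print(shoot_area_px, plant_height_px)
```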

  6. Semiclassical derivation of a local optical potential for heavy-ion elastic scattering. [Coupling to other processes

    Energy Technology Data Exchange (ETDEWEB)

    Donangelo, R; Canto, L F [Rio de Janeiro Univ. (Brazil). Inst. de Fisica; Hussein, M S [Sao Paulo Univ. (Brazil). Inst. de Fisica

    1979-05-21

    A semiclassical method to determine the contribution to the optical potential in the elastic channel due to the coupling to other processes taking place in heavy-ion collisions is developed. An application is made to the case of Coulomb excitation. The lowest-order term of the potential used is shown to be identical to the potential derived by Baltz et al.

  7. Dispersion Measure Variation of Repeating Fast Radio Burst Sources

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Yuan-Pei; Zhang, Bing, E-mail: yypspore@gmail.com, E-mail: zhang@physics.unlv.edu [Kavli Institute for Astronomy and Astrophysics, Peking University, Beijing 100871 (China)

    2017-09-20

    The repeating fast radio burst (FRB) 121102 was recently localized in a dwarf galaxy at a cosmological distance. The dispersion measure (DM) derived for each burst from FRB 121102 so far has not shown significant evolution, even though an apparent increase was recently seen with newly detected VLA bursts. It is expected that more repeating FRB sources may be detected in the future. In this work, we investigate a list of possible astrophysical processes that might cause DM variation of a particular FRB source. The processes include (1) cosmological scale effects such as Hubble expansion and large-scale structure fluctuations; (2) FRB local effects such as gas density fluctuation, expansion of a supernova remnant (SNR), a pulsar wind nebula, and an H ii region; and (3) the propagation effect due to plasma lensing. We find that the DM variations contributed by the large-scale structure are extremely small, and any observable DM variation is likely caused by the plasma local to the FRB source. In addition to mechanisms that decrease DM over time, we suggest that an FRB source in an expanding SNR around a nearly neutral ambient medium during the deceleration (Sedov–Taylor and snowplow) phases or in a growing H ii region can increase DM. Some effects (e.g., an FRB source moving in an H ii region or plasma lensing) can produce either positive or negative DM variations. Future observations of DM variations of FRB 121102 and other repeating FRB sources can provide important clues regarding the physical origin of these sources.
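    For reference, the dispersion measure discussed above is the line-of-sight column density of free electrons, and it fixes the frequency-dependent arrival-time delay of a burst; these are the standard definitions rather than results specific to this paper:

    \[
    \mathrm{DM} = \int_0^{d} n_e \,\mathrm{d}l,
    \qquad
    \Delta t \simeq 4.15\ \mathrm{ms}\,
    \left(\frac{\mathrm{DM}}{\mathrm{pc\,cm^{-3}}}\right)
    \left(\frac{\nu}{\mathrm{GHz}}\right)^{-2},
    \]

    so a change in the local electron column (for example an expanding supernova remnant or a growing H ii region along the line of sight) appears directly as a change in the measured DM.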

  8. Dispersion Measure Variation of Repeating Fast Radio Burst Sources

    International Nuclear Information System (INIS)

    Yang, Yuan-Pei; Zhang, Bing

    2017-01-01

    The repeating fast radio burst (FRB) 121102 was recently localized in a dwarf galaxy at a cosmological distance. The dispersion measure (DM) derived for each burst from FRB 121102 so far has not shown significant evolution, even though an apparent increase was recently seen with newly detected VLA bursts. It is expected that more repeating FRB sources may be detected in the future. In this work, we investigate a list of possible astrophysical processes that might cause DM variation of a particular FRB source. The processes include (1) cosmological scale effects such as Hubble expansion and large-scale structure fluctuations; (2) FRB local effects such as gas density fluctuation, expansion of a supernova remnant (SNR), a pulsar wind nebula, and an H ii region; and (3) the propagation effect due to plasma lensing. We find that the DM variations contributed by the large-scale structure are extremely small, and any observable DM variation is likely caused by the plasma local to the FRB source. In addition to mechanisms that decrease DM over time, we suggest that an FRB source in an expanding SNR around a nearly neutral ambient medium during the deceleration (Sedov–Taylor and snowplow) phases or in a growing H ii region can increase DM. Some effects (e.g., an FRB source moving in an H ii region or plasma lensing) can produce either positive or negative DM variations. Future observations of DM variations of FRB 121102 and other repeating FRB sources can provide important clues regarding the physical origin of these sources.

  9. Interspecific transfer of pyrrolizidine alkaloids: An unconsidered source of contaminations of phytopharmaceuticals and plant derived commodities.

    Science.gov (United States)

    Nowak, Melanie; Wittke, Carina; Lederer, Ines; Klier, Bernhard; Kleinwächter, Maik; Selmar, Dirk

    2016-12-15

    Many plant derived commodities contain traces of toxic pyrrolizidine alkaloids (PAs). The main source of these contaminations seems to be the accidental co-harvest of PA-containing weeds. Yet, based on the insights of the newly described phenomenon of the horizontal transfer of natural products, it is very likely that the PA-contaminations may also be due to an uptake of the alkaloids from the soil, previously being leached out from rotting PA-plants. The transfer of PAs was investigated using various herbs, which had been mulched with dried plant material from Senecio jacobaea. All of the acceptor plants exhibited marked concentrations of PAs. The extent and the composition of the imported PAs was dependent on the acceptor plant species. These results demonstrate that PAs indeed are leached out from dried Senecio material into the soil and confirm their uptake by the roots of the acceptor plants and the translocation into the leaves. Copyright © 2016. Published by Elsevier Ltd.

  10. STORMSeq: an open-source, user-friendly pipeline for processing personal genomics data in the cloud.

    Science.gov (United States)

    Karczewski, Konrad J; Fernald, Guy Haskin; Martin, Alicia R; Snyder, Michael; Tatonetti, Nicholas P; Dudley, Joel T

    2014-01-01

    The increasing public availability of personal complete genome sequencing data has ushered in an era of democratized genomics. However, read mapping and variant calling software is constantly improving and individuals with personal genomic data may prefer to customize and update their variant calls. Here, we describe STORMSeq (Scalable Tools for Open-Source Read Mapping), a graphical interface cloud computing solution that does not require a parallel computing environment or extensive technical experience. This customizable and modular system performs read mapping, read cleaning, and variant calling and annotation. At present, STORMSeq costs approximately $2 and 5-10 hours to process a full exome sequence and $30 and 3-8 days to process a whole genome sequence. We provide this open-access and open-source resource as a user-friendly interface in Amazon EC2.

  11. Stellar neutron sources and s-process in massive stars

    Science.gov (United States)

    Talwar, Rashi

    The s-process or the slow neutron capture process is a nucleosynthesis process taking place at relatively low neutron densities in stars. It runs along the valley of beta stability since the neutron capture rate is much slower compared to the beta decay rate. The s-process occurs mainly during the core helium burning and shell carbon burning phases in massive stars and during the thermally pulsing helium burning phase in asymptotic giant-branch stars. The potential stellar neutron source for the s-process is associated with alpha-capture reactions on light nuclei. The capture-reaction rates provide the reaction flow for the build-up of the 22Ne neutron source during the helium-burning phase in these stars. The low energy 26Mg resonances at stellar energies below 800 keV are predicted to have a critical influence on the alpha-capture rates on 22Ne. Some of these resonances may also correspond to pronounced alpha cluster structure near the alpha-threshold. However, these resonances have remained elusive during direct alpha-capture measurements owing to the high Coulomb barrier and background from cosmic rays and beam induced reactions. Hence, in the present work, alpha-inelastic scattering and alpha-transfer measurements have been performed to probe the level structure of the 26Mg nucleus in order to determine the 22Ne+alpha capture rates. Both experiments have been performed using the high-resolution Grand Raiden Spectrometer at the Research Center for Nuclear Physics (RCNP), Osaka, Japan. For the alpha-inelastic scattering measurement, a self-supporting solid 26Mg target was used, and for the alpha-transfer study via the (6Li,d) reaction, 22Ne gas enclosed in a gas cell with Aramid windows was used. The reaction products were momentum analysed by the spectrometer and detected at the focal plane equipped with two multi-wire drift chambers and two plastic-scintillation detectors. The focal plane detection system provided information on the position, the angle, the time of flight and

  12. Derivation and characterization of human fetal MSCs: an alternative cell source for large-scale production of cardioprotective microparticles.

    Science.gov (United States)

    Lai, Ruenn Chai; Arslan, Fatih; Tan, Soon Sim; Tan, Betty; Choo, Andre; Lee, May May; Chen, Tian Sheng; Teh, Bao Ju; Eng, John Kun Long; Sidik, Harwin; Tanavde, Vivek; Hwang, Wei Sek; Lee, Chuen Neng; El Oakley, Reida Menshawe; Pasterkamp, Gerard; de Kleijn, Dominique P V; Tan, Kok Hian; Lim, Sai Kiang

    2010-06-01

    The therapeutic effects of mesenchymal stem cell (MSC) transplantation are increasingly thought to be mediated by MSC secretion. We have previously demonstrated that human ESC-derived MSCs (hESC-MSCs) produce cardioprotective microparticles in a pig model of myocardial ischemia/reperfusion (MI/R) injury. As the safety and availability of clinical grade human ESCs remain a concern, MSCs from fetal tissue sources were evaluated as alternatives. Here we derived five MSC cultures from limb, kidney and liver tissues of three first trimester aborted fetuses and, like our previously described hESC-derived MSCs, they were highly expandable and had similar telomerase activities. Each line has the potential to generate at least 10^16-10^19 cells or 10^7-10^10 doses of cardioprotective secretion for a pig model of MI/R injury. Unlike previously described fetal MSCs, they did not express pluripotency-associated markers such as Oct4, Nanog or Tra1-60. They displayed a typical MSC surface antigen profile and differentiated into adipocytes, osteocytes and chondrocytes in vitro. Global gene expression analysis by microarray and qRT-PCR revealed a typical MSC gene expression profile that was highly correlated among the five fetal MSC cultures and with that of hESC-MSCs (r^2 > 0.90). Like hESC-MSCs, they produced secretion that was cardioprotective in a mouse model of MI/R injury. HPLC analysis of the secretion revealed the presence of a population of microparticles with a hydrodynamic radius of 50-65 nm. This purified population of microparticles was cardioprotective at approximately 1/10 the dosage of the crude secretion. (c) 2009 Elsevier Ltd. All rights reserved.

  13. Processing watershed-derived nitrogen in a well-flushed New England estuary

    Science.gov (United States)

    Tobias, C.R.; Cieri, M.; Peterson, B.J.; Deegan, Linda A.; Vallino, J.; Hughes, J.

    2003-01-01

    Isotopically labeled nitrate (15NO3-) was added continuously to the Rowley estuary, Massachusetts, for 22 d to assess the transport, uptake, and cycling of terrestrially derived nitrogen during a period of high river discharge and low phytoplankton activity. Isotopic enrichment of the 3.5-km tidal prism (150,000 m3) was achieved for the 3 weeks and allowed us to construct a nitrogen mass balance model for the upper estuary. Mean δ15NO3- in the estuary ranged from 300‰ to 600‰, and approximately 75%-80% of the 15N was exported conservatively as 15NO3- to the coastal ocean. Essentially all of the 20%-25% of the 15N processed in the estuary occurred in the benthos and was evenly split between direct denitrification and autotrophic assimilation. The lack of water-column 15N uptake was attributed to low phytoplankton stocks and short water residence times (1.2-1.4 d). Uptake of water-column NO3- by benthic autotrophs (enriched in excess of 100‰) was a function of NO3- concentration and satisfied up to 15% and 25% of the total nitrogen demand for benthic microalgae and macroalgae, respectively. Approximately 10% of the tracer assimilated by benthic autotrophs was mineralized and released back to the water column as 15NH4+. By the end of the study, 15N storage in sediments and marsh macrophytes accounted for 50%-70% of the 15N assimilated in the estuary. These compartments may sequester watershed-derived nitrogen in the estuary for time scales of months to years.

  14. Use of Context in Video Processing

    Science.gov (United States)

    Wu, Chen; Aghajan, Hamid

    Interpreting an event or a scene based on visual data often requires additional contextual information. Contextual information may be obtained from different sources. In this chapter, we discuss two broad categories of contextual sources: environmental context and user-centric context. Environmental context refers to information derived from domain knowledge or from concurrently sensed effects in the area of operation. User-centric context refers to information obtained and accumulated from the user. Both types of context can include static or dynamic contextual elements. Examples from a smart home environment are presented to illustrate how different types of contextual data can be applied to aid the decision-making process.

  15. Fissile material detection and control facility with pulsed neutron sources and digital data processing

    International Nuclear Information System (INIS)

    Romodanov, V.L.; Chernikova, D.N.; Afanasiev, V.V.

    2010-01-01

    In connection with possible nuclear terrorism, there is a long-felt need for devices for effective control of radioactive and fissile materials at the key points of crossing state borders (airports, seaports, etc.), as well as various customs check-points. In International Science and Technology Center Projects No. 596 and No. 2978, a new physical method and digital technology have been developed for the detection of fissile and radioactive materials in models of customs facilities with a graphite moderator, a pulsed neutron source and digital processing of responses from scintillation PSD detectors. Detectability of fissile materials, even those shielded with various radiation-absorbing screens, has been shown. The use of digital processing of scintillation signals in this facility is a necessary element, as neutrons and photons are discriminated in the time dependence of the fissile material responses at such loads on the electronic channels that standard types of spectrometers are inapplicable. Digital processing of neutron and photon responses practically resolves the problem of dead time and allows implementing devices in which various energy groups of neutrons exist for some time after a pulse of source neutrons. Thus, it is possible to detect fissile materials deliberately concealed with shields having a large cross-section of absorption of photons and thermal neutrons. Two models for the detection and control of fissile materials were advanced: 1. a model based on a graphite neutron moderator and PSD scintillators, with digital separation of neutron and photon responses; 2. a model based on plastic scintillators and digital detection of time coincidences of fission particles. Facilities that count time coincidences of neutrons and photons occurring in the fission of fissile materials can use an Am-Li source of neutrons, e.g. that is the case with the AWCC system. The disadvantages of the facility are related to the issues

  16. Cesium glass irradiation sources

    International Nuclear Information System (INIS)

    Plodinec, M.J.

    1982-01-01

    The precipitation process for the decontamination of soluble SRP wastes produces a material whose radioactivity is dominated by 137Cs. Potentially, this material could be vitrified to produce irradiation sources similar to the Hanford CsCl sources. In this report, the process steps necessary for the production of cesium glass irradiation sources (CGS), and the nature of the sources produced, are examined. Three options are considered in detail: direct vitrification of precipitation process waste; direct vitrification of this waste after organic destruction; and vitrification of cesium separated from the precipitation process waste. Direct vitrification is compatible with DWPF equipment, but process rates may be limited by high levels of combustible materials in the off-gas. Organic destruction would allow more rapid processing. In both cases, the source produced has a dose rate of 2 x 10^4 rads/hr at the surface. Cesium separation produces a source with a dose rate of 4 x 10^5 rads/hr at the surface, which is nearer that of the Hanford sources (2 x 10^6 rads/hr). Additional processing steps would be required, as well as R and D to demonstrate that DWPF equipment is compatible with this intensely radioactive material.

  17. Optimization of industrial processes using radiation sources; Otimizacao dos trabalhos envolvendo radiacao industrial

    Energy Technology Data Exchange (ETDEWEB)

    Salles, Claudio G.; Silva Filho, Edmundo D. da; Toribio, Norberto M.; Gandara, Leonardo A. [SAMARCO Mineracao S.A., Mariana, MG (Brazil). Mina de Germano

    1996-12-31

    Aiming to enhance staff protection against radiation in operational areas, SAMARCO Mineracao S.A. carried out a reevaluation and analysis of the real necessity of the densimeters/radioactive sources in the operational area, developed an alternative control process for measuring the ore pulp, and introduced advanced equipment for sample chemical analysis. 8 figs., 1 tab.

  18. Novel sources of Flavor Changed Neutral Currents in the 331RHN model

    International Nuclear Information System (INIS)

    Cogollo, D.; Vital de Andrade, A.; Queiroz, F.S.; Teles, P.R.

    2012-01-01

    Sources of Flavor Changed Neutral Currents (FCNC) emerge naturally from a well motivated framework called the 3-3-1 with right-handed neutrinos model, 331 RHN for short, mediated by an extra neutral gauge boson Z'. Following previous work we calculate these sources and in addition we derive new ones coming from CP-even and -odd neutral scalars which appear due to their non-diagonal interactions with the physical standard quarks. Furthermore, by using 4 texture zeros for the quark mass matrices, we derive the mass difference terms for the neutral meson systems K0-anti-K0, D0-anti-D0 and B0-anti-B0 and show that, though one can discern that the Z' contribution is the most relevant one for meson oscillation purposes, scalar contributions also play a role in these processes and hence it is worthwhile to investigate them and derive new bounds on the space of parameters. In particular, studying the B0-anti-B0 system we set the bounds M_Z' > or similar 4.2 TeV and M_S2, M_I3 > or similar 7.5 TeV in order to be consistent with the current measurements. (orig.)

  19. Contribution of Satellite Gravimetry to Understanding Seismic Source Processes of the 2011 Tohoku-Oki Earthquake

    Science.gov (United States)

    Han, Shin-Chan; Sauber, Jeanne; Riva, Riccardo

    2011-01-01

    The 2011 great Tohoku-Oki earthquake, apart from shaking the ground, perturbed the motions of satellites orbiting some hundreds of kilometers above the ground, such as GRACE, due to the coseismic change in the gravity field. Significant changes in inter-satellite distance were observed after the earthquake. These unconventional satellite measurements were inverted to examine the earthquake source processes from a radically different perspective that complements the analyses of seismic and geodetic ground recordings. We found the average slip located up-dip of the hypocenter but within the lower crust, as characterized by a limited range of bulk and shear moduli. The GRACE data constrained a group of earthquake source parameters that yield increasing dip (7-16 degrees, plus or minus 2 degrees) and, simultaneously, decreasing moment magnitude (9.17-9.02, plus or minus 0.04) with increasing source depth (15-24 kilometers). The GRACE solution includes the cumulative moment released over a month and demonstrates a unique view of the long-wavelength gravimetric response to all mass redistribution processes associated with the dynamic rupture and short-term postseismic mechanisms, improving our understanding of the physics of megathrusts.

  20. Coupling individual kernel-filling processes with source-sink interactions into GREENLAB-Maize.

    Science.gov (United States)

    Ma, Yuntao; Chen, Youjia; Zhu, Jinyu; Meng, Lei; Guo, Yan; Li, Baoguo; Hoogenboom, Gerrit

    2018-02-13

    Failure to account for the variation of kernel growth in a cereal crop simulation model may cause serious deviations in the estimates of crop yield. The goal of this research was to revise the GREENLAB-Maize model to incorporate source- and sink-limited allocation approaches to simulate the dry matter accumulation of individual kernels of an ear (GREENLAB-Maize-Kernel). The model used potential individual kernel growth rates to characterize the individual potential sink demand. The remobilization of non-structural carbohydrates from reserve organs to kernels was also incorporated. Two years of field experiments were conducted to determine the model parameter values and to evaluate the model using two maize hybrids with different plant densities and pollination treatments. Detailed observations were made on the dimensions and dry weights of individual kernels and other above-ground plant organs throughout the seasons. Three basic traits characterizing an individual kernel were compared on simulated and measured individual kernels: (1) final kernel size; (2) kernel growth rate; and (3) duration of kernel filling. Simulations of individual kernel growth closely corresponded to experimental data. The model was able to reproduce the observed dry weight of plant organs well. Then, the source-sink dynamics and the remobilization of carbohydrates for kernel growth were quantified to show that remobilization processes accompanied source-sink dynamics during the kernel-filling process. We conclude that the model may be used to explore options for optimizing plant kernel yield by matching maize management to the environment, taking into account responses at the level of individual kernels. © The Author(s) 2018. Published by Oxford University Press on behalf of the Annals of Botany Company. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
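    As a toy illustration of the source/sink logic described above (not the GREENLAB-Maize-Kernel implementation), the sketch below distributes a daily assimilate supply over kernels in proportion to their potential sink demand and draws on a reserve pool of non-structural carbohydrate when the supply falls short; all names and numbers are assumptions.

```python
import numpy as np

def fill_kernels(demand, supply_per_day, reserve, days):
    """Toy sink-limited allocation of assimilate to individual kernels.

    demand         : per-kernel potential growth rate (mg/day), shape (n_kernels,)
    supply_per_day : assimilate available for grain filling each day (mg)
    reserve        : remobilizable carbohydrate pool (mg)
    """
    kernel_weight = np.zeros_like(demand)
    for _ in range(days):
        total_demand = demand.sum()
        available = supply_per_day
        shortfall = max(total_demand - available, 0.0)
        remobilized = min(shortfall, reserve)          # reserves buffer the deficit
        reserve -= remobilized
        available += remobilized
        fraction = min(available / total_demand, 1.0)  # < 1 means source-limited
        kernel_weight += demand * fraction             # pro-rata kernel growth
    return kernel_weight, reserve

weights, leftover = fill_kernels(np.full(500, 6.0), supply_per_day=2500.0,
                                 reserve=4000.0, days=30)
print(weights.mean(), leftover)
```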

  1. Environmental assessment for radioisotope heat source fuel processing and fabrication

    International Nuclear Information System (INIS)

    1991-07-01

    DOE has prepared an Environmental Assessment (EA) for radioisotope heat source fuel processing and fabrication involving existing facilities at the Savannah River Site (SRS) near Aiken, South Carolina and the Los Alamos National Laboratory (LANL) near Los Alamos, New Mexico. The proposed action is needed to provide Radioisotope Thermoelectric Generators (RTG) to support the National Aeronautics and Space Administration's (NASA) CRAF and Cassini Missions. Based on the analysis in the EA, DOE has determined that the proposed action does not constitute a major Federal action significantly affecting the quality of the human environment within the meaning of the National Environmental Policy Act (NEPA) of 1969. Therefore, an Environmental Impact Statement is not required. 30 refs., 5 figs

  2. Lignocellulosic Biomass: A Sustainable Bioenergy Source for the Future.

    Science.gov (United States)

    Fatma, Shabih; Hameed, Amir; Noman, Muhammad; Ahmed, Temoor; Shahid, Muhammad; Tariq, Mohsin; Sohail, Imran; Tabassum, Romana

    2018-01-01

    Increasing population and industrialization are continuously straining existing energy resources and depleting the global fuel reserves. The elevated pollution from the continuous consumption of non-renewable fossil fuels is also seriously contaminating the surrounding environment. The use of alternative energy sources can be an environment-friendly solution to cope with these challenges. Among the renewable energy sources, biofuels (biomass-derived fuels) can serve as a better alternative to reduce the reliance on non-renewable fossil fuels. Bioethanol is one of the most widely consumed biofuels of today's world. The main objective of this review is to highlight the significance of lignocellulosic biomass as a potential source for the production of biofuels like bioethanol, biodiesel or biogas. We discuss the application of various methods for the bioconversion of lignocellulosic biomass to end products, i.e. biofuels. The lignocellulosic biomass must be pretreated to disintegrate lignocellulosic complexes and to expose its chemical components for downstream processes. After pretreatment, the lignocellulosic biomass is subjected to saccharification via either acidic or enzymatic hydrolysis. Thereafter, the monomeric sugars resulting from the hydrolysis step are further processed into biofuel, i.e. bioethanol, biodiesel or butanol etc., through the fermentation process. The fermented impure product is then purified through the distillation process to obtain pure biofuel. Renewable energy sources represent potential fuel alternatives to overcome the global energy crisis in a sustainable and eco-friendly manner. In the future, biofuels may replace the conventional non-renewable energy resources due to their renewability and several other advantages. Lignocellulosic biomass offers the most economical biomass to generate biofuels. However, extensive research is required for the commercial production of an efficient integrated biotransformation process for the production of

  3. A new method to estimate heat source parameters in gas metal arc welding simulation process

    International Nuclear Information System (INIS)

    Jia, Xiaolei; Xu, Jie; Liu, Zhaoheng; Huang, Shaojie; Fan, Yu; Sun, Zhi

    2014-01-01

    Highlights: •A new method for accurate simulation of heat source parameters was presented. •The partial least-squares regression analysis was recommended in the method. •The welding experiment results verified the accuracy of the proposed method. -- Abstract: Heat source parameters were usually recommended by experience in the welding simulation process, which induced errors in simulation results (e.g. temperature distribution and residual stress). In this paper, a new method was developed to accurately estimate heat source parameters in welding simulation. In order to reduce the simulation complexity, a sensitivity analysis of heat source parameters was carried out. The relationships between heat source parameters and weld pool characteristics (fusion width (W), penetration depth (D) and peak temperature (Tp)) were obtained with both the multiple regression analysis (MRA) and the partial least-squares regression analysis (PLSRA). Different regression models were employed in each regression method. Comparisons of both methods were performed. A welding experiment was carried out to verify the method. The results showed that both the MRA and the PLSRA were feasible and accurate for the prediction of heat source parameters in welding simulation. However, the PLSRA was recommended for its advantages of requiring less simulation data
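    A minimal sketch of the PLSRA step, assuming a small table of simulated weld pool characteristics (W, D, Tp) paired with the heat source parameters that produced them; the numbers and the choice of a four-parameter (double-ellipsoid-style) source are illustrative, not values from the paper.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Simulated weld pool characteristics: fusion width W (mm), penetration
# depth D (mm), peak temperature Tp (K).  Hypothetical values.
X = np.array([[6.2, 2.1, 1790.0],
              [7.0, 2.4, 1860.0],
              [7.9, 2.8, 1935.0],
              [8.5, 3.1, 2010.0],
              [9.2, 3.4, 2080.0]])

# Heat source parameters used in those simulations (e.g. front/rear lengths,
# half-width and depth of a double-ellipsoid source, in mm).  Hypothetical.
Y = np.array([[2.0, 4.0, 2.2, 2.0],
              [2.3, 4.6, 2.5, 2.3],
              [2.6, 5.2, 2.8, 2.6],
              [2.9, 5.8, 3.1, 2.9],
              [3.2, 6.4, 3.4, 3.2]])

pls = PLSRegression(n_components=2)
pls.fit(X, Y)

# Given a measured weld pool, predict the heat source parameters to use.
measured_pool = np.array([[7.4, 2.6, 1900.0]])
print(pls.predict(measured_pool))
```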

  4. Large Scale Production of Stem Cells and Their Derivatives

    Science.gov (United States)

    Zweigerdt, Robert

    Stem cells have been envisioned to become an unlimited cell source for regenerative medicine. Notably, the interest in stem cells lies beyond direct therapeutic applications. They might also provide a previously unavailable source of valuable human cell types for screening platforms, which might facilitate the development of more efficient and safer drugs. The heterogeneity of stem cell types as well as the numerous areas of application suggests that differential processes are mandatory for their in vitro culture. Many of the envisioned applications would require the production of large numbers of stem cells and their derivatives in a scalable, well-defined and potentially clinically compliant manner under current good manufacturing practice (cGMP). In this review we provide an overview of recent strategies to develop bioprocesses for the expansion, differentiation and enrichment of stem cells and their progenies, presenting examples for adult and embryonic stem cells alike.

  5. Documenting open source migration processes for re-use

    CSIR Research Space (South Africa)

    Gerber, A

    2010-10-01

    Full Text Available There are several sources that indicate a remarkable increase in the adoption of open source software (OSS) into the technology infrastructure of organizations. In fact, the number of medium to large organizations without some OSS installations...

  6. STORMSeq: an open-source, user-friendly pipeline for processing personal genomics data in the cloud.

    Directory of Open Access Journals (Sweden)

    Konrad J Karczewski

    Full Text Available The increasing public availability of personal complete genome sequencing data has ushered in an era of democratized genomics. However, read mapping and variant calling software is constantly improving and individuals with personal genomic data may prefer to customize and update their variant calls. Here, we describe STORMSeq (Scalable Tools for Open-Source Read Mapping), a graphical-interface cloud computing solution that does not require a parallel computing environment or extensive technical experience. This customizable and modular system performs read mapping, read cleaning, and variant calling and annotation. At present, STORMSeq costs approximately $2 and takes 5-10 hours to process a full exome sequence, and costs $30 and takes 3-8 days to process a whole genome sequence. We provide this open-access and open-source resource as a user-friendly interface in Amazon EC2.

  7. Tumor-Derived Microvesicles Modulate Antigen Cross-Processing via Reactive Oxygen Species-Mediated Alkalinization of Phagosomal Compartment in Dendritic Cells

    Directory of Open Access Journals (Sweden)

    Federico Battisti

    2017-09-01

    Full Text Available Dendritic cells (DCs) are the only antigen-presenting cells able to prime naïve T cells and cross-prime antigen-specific CD8+ T cells. Their functionality is a requirement for the induction and maintenance of long-lasting cancer immunity. Albeit intensively investigated, the in vivo mechanisms underlying efficient antigen cross-processing and presentation are not fully understood. Several pieces of evidence indicate that antigen transfer to DCs mediated by microvesicles (MVs) enhances antigen immunogenicity. This mechanism is also relevant for cross-presentation of those tumor-associated glycoproteins such as MUC1 that are blocked in the HLA class II compartment when internalized by DCs as soluble molecules. Here, we present pieces of evidence that the internalization of tumor-derived MVs modulates the antigen-processing machinery of DCs. Employing MVs derived from ovarian cancer ascites fluid and established tumor cell lines, we show that MV uptake modifies the DC phagosomal microenvironment, triggering reactive oxygen species (ROS) accumulation and early alkalinization. Indeed, tumor MVs carry radical species and the MV uptake by DCs counteracts the chemically mediated acidification of the phagosomal compartment. Further pieces of evidence suggest that efficacious antigen cross-priming of the MUC1 antigen carried by the tumor MVs results from the early signaling induced by MV internalization and the function of the antigen-processing machinery of DCs. These results strongly support the hypothesis that tumor-derived MVs impact antigen immunogenicity by tuning the antigen-processing machinery of DCs, besides being carriers of tumor antigens. Furthermore, these findings have important implications for the exploitation of MVs as antigenic cell-free immunogen for DC-based therapeutic strategies.

  8. Source terms: an investigation of uncertainties, magnitudes, and recommendations for research. [PWR; BWR

    Energy Technology Data Exchange (ETDEWEB)

    Levine, S.; Kaiser, G. D.; Arcieri, W. C.; Firstenberg, H.; Fulford, P. J.; Lam, P. S.; Ritzman, R. L.; Schmidt, E. R.

    1982-03-01

    The purpose of this document is to assess the state of knowledge and expert opinions that exist about fission product source terms from potential nuclear power plant accidents. This is so that recommendations can be made for research and analyses which have the potential to reduce the uncertainties in these estimated source terms and to derive improved methods for predicting their magnitudes. The main reasons for writing this report are to indicate the major uncertainties involved in defining realistic source terms that could arise from severe reactor accidents, to determine which factors would have the most significant impact on public risks and emergency planning, and to suggest research and analyses that could result in the reduction of these uncertainties. Source terms used in the conventional consequence calculations in the licensing process are not explicitly addressed.

  9. Process modelling on a canonical basis

    Energy Technology Data Exchange (ETDEWEB)

    Siepmann, Volker

    2006-12-20

    Based on an equation-oriented solving strategy, this thesis investigates a new approach to process modelling. Homogeneous thermodynamic state functions represent consistent mathematical models of thermodynamic properties. Such state functions of solely extensive canonical state variables are the basis of this work, as they are natural objective functions in optimisation nodes to calculate thermodynamic equilibrium regarding phase-interaction and chemical reactions. Analytical state function derivatives are utilised within the solution process as well as interpreted as physical properties. By this approach, only a limited range of imaginable process constraints are considered, namely linear balance equations of state variables. A second-order update of source contributions to these balance equations is obtained by an additional constitutive equation system. These equations are generally dependent on state variables and first-order sensitivities, and therefore cover practically all potential process constraints. Symbolic computation technology efficiently provides sparsity and derivative information of active equations to avoid performance problems regarding robustness and computational effort. A benefit of detaching the constitutive equation system is that the structure of the main equation system remains unaffected by these constraints, and a priori information allows an efficient solving strategy and a concise error diagnosis to be implemented. A tailor-made linear algebra library handles the sparse recursive block structures efficiently. The optimisation principle for single modules of thermodynamic equilibrium is extended to host entire process models. State variables of different modules interact through balance equations, representing material flows from one module to the other. To account for reusability and encapsulation of process module details, modular process modelling is supported by a recursive module structure. The second-order solving algorithm makes it
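
    The equation-oriented idea sketched above (balance equations plus a detached constitutive system, with analytical derivatives supplied symbolically) can be illustrated in a few lines. The toy equations, variable names and values below are invented for illustration and are unrelated to the thesis; sympy stands in for the tailor-made symbolic and linear-algebra machinery described there.

      # Hedged sketch: build residual equations symbolically, obtain the analytical
      # Jacobian, and solve the coupled system with Newton's method.
      import sympy as sp

      n1, n2, Q = sp.symbols('n1 n2 Q')          # extensive state variables (illustrative)
      x = sp.Matrix([n1, n2, Q])

      residuals = sp.Matrix([
          n1 + n2 - 10,        # toy mole balance
          Q - 2*n1 - 3*n2,     # toy energy balance
          n1 - 4*n2,           # toy equilibrium-like constitutive relation
      ])
      J = residuals.jacobian(x)                  # analytical derivative (sparsity) information

      xk = sp.Matrix([5.0, 5.0, 20.0])           # initial guess
      for _ in range(5):                         # Newton iteration
          subs = dict(zip(x, xk))
          xk = xk - J.subs(subs).inv() * residuals.subs(subs)
      print("Converged state variables:", list(xk))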

  10. Leveraging human oversight and intervention in large-scale parallel processing of open-source data

    Science.gov (United States)

    Casini, Enrico; Suri, Niranjan; Bradshaw, Jeffrey M.

    2015-05-01

    The popularity of cloud computing, along with the increased availability of cheap storage, has led to the necessity of elaborating and transforming large volumes of open-source data in parallel. One way to handle such extensive volumes of information properly is to take advantage of distributed computing frameworks like Map-Reduce. Unfortunately, an entirely automated approach that excludes human intervention is often unpredictable and error-prone. Highly accurate data processing and decision-making can be achieved by supporting an automatic process through human collaboration, in a variety of environments such as warfare, cyber security and threat monitoring. Although this mutual participation seems easily exploitable, human-machine collaboration in the field of data analysis presents several challenges. First, due to the asynchronous nature of human intervention, it is necessary to verify that once a correction is made, all the necessary reprocessing is carried out down the chain. Second, it is often necessary to minimize the amount of reprocessing in order to optimize the usage of resources, given their limited availability. In order to improve on these strict requirements, this paper introduces improvements to an innovative approach for human-machine collaboration in the processing of large amounts of open-source data in parallel.

  11. Prebiotic Synthesis of Autocatalytic Products From Formaldehyde-Derived Sugars as the Carbon and Energy Source

    Science.gov (United States)

    Weber, Arthur L.

    2003-01-01

    Our research objective is to understand and model the chemical processes on the primitive Earth that generated the first autocatalytic molecules and microstructures involved in the origin of life. Our approach involves: (a) investigation of a model origin-of-life process named the Sugar Model that is based on the reaction of formaldehyde-derived sugars (trioses and tetroses) with ammonia, and (b) elucidation of the constraints imposed on the chemistry of the origin of life by the fixed energies and rates of C,H,O-organic reactions under mild aqueous conditions. Recently, we demonstrated that under mild aqueous conditions the Sugar Model process yields autocatalytic products, and generates organic microspherules (2-20 micron dia.) that exhibit budding, size uniformity, and chain formation. We also discovered that the sugar substrates of the Sugar Model are capable of reducing nitrite to ammonia under mild aqueous conditions. In addition, studies done in collaboration with Sandra Pizzarello (Arizona State University) revealed that chiral amino acids (including meteoritic isovaline) catalyze both the synthesis and specific handedness of chiral sugars. Our systematic survey of the energies and rates of reactions of C,H,O-organic substrates under mild aqueous conditions revealed several general principles (rules) that govern the direction and rate of organic reactions. These reactivity principles constrain the structure of chemical pathways used in the origin of life, and in modern and primitive metabolism.

  12. Process strategies for ultra-deep x-ray lithography at the Advanced Photon Source

    International Nuclear Information System (INIS)

    Mancini, D.C.; Moldovan, N.; Divan, R.; De Carlo, F.; Yaeger, J.

    2001-01-01

    For the past five years, we have been investigating and advancing processing capabilities for deep x-ray lithography (DXRL) using synchrotron radiation from a bending magnet at the Advanced Photon Source (APS), with an emphasis on ultra-deep structures (1 mm to 1 cm thick). The use of higher-energy x-rays has presented many challenges in developing optimal lithographic techniques for high-aspect-ratio structures: mask requirements, resist preparation, exposure, development, and post-processing. Many problems are more severe for high-energy exposure of thicker films than for sub-millimeter structures and affect resolution, processing time, adhesion, damage, and residue. A number of strategies have been created to overcome the challenges and limitations of ultra-deep x-ray lithography (UDXRL) that have resulted in the current choices for mask, substrate, and process flow at the APS. We describe our current process strategies for UDXRL, how they address the challenges presented, and their current limitations. We note especially the importance of the process parameters for use of the positive-tone resist PMMA for UDXRL, and compare it to the use of negative-tone resists such as SU-8 regarding throughput, resolution, adhesion, damage, and post-processing.

  13. Automatic Service Derivation from Business Process Model Repositories via Semantic Technology

    NARCIS (Netherlands)

    Leopold, H.; Pittke, F.; Mendling, J.

    2015-01-01

    Although several approaches for service identification have been defined in research and practice, there is a notable lack of fully automated techniques. In this paper, we address the problem of manual work in the context of service derivation and present an approach for automatically deriving

  14. DEVELOPMENT OF CONTINUOUS SOLVENT EXTRACTION PROCESSES FOR COAL DERIVED CARBON PRODUCTS

    Energy Technology Data Exchange (ETDEWEB)

    Elliot B. Kennel; R. Michael Bergen; Stephen P. Carpenter; Dady Dadyburjor; Manoj Katakdaunde; Liviu Magean; Alfred H. Stiller; W. Morgan Summers; John W. Zondlo

    2006-05-12

    The purpose of this DOE-funded effort is to develop continuous processes for solvent extraction of coal for the production of carbon products. The largest applications are those which support metals smelting, such as anodes for aluminum smelting and electrodes for arc furnaces. Other carbon products include materials used in creating fuels for the Direct Carbon Fuel Cell, metals smelting, especially in the aluminum and steel industries, as well as porous carbon structural material referred to as "carbon foam" and carbon fibers. During this reporting period, coking and composite fabrication continued using coal-derived samples. These samples were tested in direct carbon fuel cells. Methodology was refined for determining the aromatic character of hydrotreated liquid, based on Nuclear Magnetic Resonance (NMR) and Fourier Transform Infrared (FTIR) spectroscopy. Tests at GrafTech International showed that binder pitches produced using the WVU solvent extraction protocol can result in acceptable graphite electrodes for use in arc furnaces. These tests were made at the pilot scale.

  15. Anti-aging effects of vitamin C on human pluripotent stem cell-derived cardiomyocytes.

    Science.gov (United States)

    Kim, Yoon Young; Ku, Seung-Yup; Huh, Yul; Liu, Hung-Ching; Kim, Seok Hyun; Choi, Young Min; Moon, Shin Yong

    2013-10-01

    Human pluripotent stem cells (hPSCs) have arisen as a source of cells for biomedical research due to their developmental potential. Stem cells possess the promise of providing clinicians with novel treatments for disease as well as allowing researchers to generate human-specific cellular metabolism models. Aging is a natural process of living organisms, yet aging in human heart cells is difficult to study due to the ethical considerations regarding human experimentation as well as a current lack of alternative experimental models. hPSC-derived cardiomyocytes (CMs) bear a resemblance to human cardiac cells and thus hPSC-derived CMs are considered to be a viable alternative model to study human heart cell aging. In this study, we used hPSC-derived CMs as an in vitro aging model. We generated cardiomyocytes from hPSCs and demonstrated the process of aging in both human embryonic stem cell (hESC)- and induced pluripotent stem cell (hiPSC)-derived CMs. Aging in hESC-derived CMs correlated with reduced membrane potential in mitochondria, the accumulation of lipofuscin, a slower beating pattern, and the downregulation of human telomerase RNA (hTR) and cell cycle regulating genes. Interestingly, the expression of hTR in hiPSC-derived CMs was not significantly downregulated, unlike in hESC-derived CMs. In order to delay aging, vitamin C was added to the cultured CMs. When cells were treated with 100 μM of vitamin C for 48 h, anti-aging effects, specifically on the expression of telomere-related genes and their functionality in aging cells, were observed. Taken together, these results suggest that hPSC-derived CMs can be used as a unique human cardiomyocyte aging model in vitro and that vitamin C shows anti-aging effects in this model.

  16. Topological Derivatives in Shape Optimization

    CERN Document Server

    Novotny, Antonio André

    2013-01-01

    The topological derivative is defined as the first term (correction) of the asymptotic expansion of a given shape functional with respect to a small parameter that measures the size of singular domain perturbations, such as holes, inclusions, defects, source-terms and cracks. Over the last decade, topological asymptotic analysis has become a broad, rich and fascinating research area from both theoretical and numerical standpoints. It has applications in many different fields such as shape and topology optimization, inverse problems, imaging processing and mechanical modeling including synthesis and/or optimal design of microstructures, sensitivity analysis in fracture mechanics and damage evolution modeling. Since there is no monograph on the subject at present, the authors provide here the first account of the theory which combines classical sensitivity analysis in shape optimization with asymptotic analysis by means of compound asymptotic expansions for elliptic boundary value problems. This book is intende...
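
    The expansion referred to above is conventionally written as follows (standard notation from the topological-asymptotics literature, stated here for orientation rather than quoted from the book):

      \psi(\Omega_\varepsilon) = \psi(\Omega) + f(\varepsilon)\, D_T\psi(\hat{x}) + o(f(\varepsilon)),
      \qquad
      D_T\psi(\hat{x}) = \lim_{\varepsilon \to 0} \frac{\psi(\Omega_\varepsilon) - \psi(\Omega)}{f(\varepsilon)},

    where Ω_ε denotes the domain Ω perturbed by a small hole (or inclusion, crack, source term) of size ε centred at x̂, f(ε) is a positive function vanishing with ε, and D_Tψ is the topological derivative of the shape functional ψ.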

  17. LCCT-derived three-level three-phase inverters

    DEFF Research Database (Denmark)

    Shults, Tatiana; Husev, Oleksandr; Blaabjerg, Frede

    2017-01-01

    Solutions for a family of the novel three-level neutral-point-clamped (NPC) inductor-capacitor-capacitor-transformer (LCCT)-derived three-phase inverters are described and compared. Component design guidelines and steady state analysis, current and voltage waveforms are given. The authors' simulation results confirm the theoretical predictions. It was found that an asymmetrical three-level NPC LCCT-derived inverter with a single diode in the impedance source network is the most promising solution. Experimental results for an asymmetrical three-level NPC LCCT-derived inverter with a single...

  18. Algae Derived Biofuel

    Energy Technology Data Exchange (ETDEWEB)

    Jahan, Kauser [Rowan Univ., Glassboro, NJ (United States)

    2015-03-31

    One of the most promising fuel alternatives is algae biodiesel. Algae reproduce quickly, produce oils more efficiently than crop plants, and require relatively few nutrients for growth. These nutrients can potentially be derived from inexpensive waste sources such as flue gas and wastewater, providing a mutual benefit of helping to mitigate carbon dioxide waste. Algae can also be grown on land unsuitable for agricultural purposes, eliminating competition with food sources. This project focused on cultivating select algae species under various environmental conditions to optimize oil yield. Membrane studies were also conducted to transfer carbon dioxide more efficiently. An LCA study was also conducted to investigate the energy-intensive steps in algae cultivation.

  19. EEG source reconstruction reveals frontal-parietal dynamics of spatial conflict processing.

    Science.gov (United States)

    Cohen, Michael X; Ridderinkhof, K Richard

    2013-01-01

    Cognitive control requires the suppression of distracting information in order to focus on task-relevant information. We applied EEG source reconstruction via time-frequency linear constrained minimum variance beamforming to help elucidate the neural mechanisms involved in spatial conflict processing. Human subjects performed a Simon task, in which conflict was induced by incongruence between spatial location and response hand. We found an early (∼200 ms post-stimulus) conflict modulation in stimulus-contralateral parietal gamma (30-50 Hz), followed by a later alpha-band (8-12 Hz) conflict modulation, suggesting an early detection of spatial conflict and inhibition of spatial location processing. Inter-regional connectivity analyses assessed via cross-frequency coupling of theta (4-8 Hz), alpha, and gamma power revealed conflict-induced shifts in cortical network interactions: Congruent trials (relative to incongruent trials) had stronger coupling between frontal theta and stimulus-contrahemifield parietal alpha/gamma power, whereas incongruent trials had increased theta coupling between medial frontal and lateral frontal regions. These findings shed new light into the large-scale network dynamics of spatial conflict processing, and how those networks are shaped by oscillatory interactions.
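
    A minimal sketch of the kind of power-power cross-frequency coupling measure used above, assuming band-limited envelopes are obtained with SciPy; the filter bands, sampling rate and correlation-based coupling metric are illustrative choices and not the authors' exact beamforming pipeline.

      # Hedged sketch: theta-gamma power coupling between a frontal and a parietal signal.
      import numpy as np
      from scipy.signal import butter, filtfilt, hilbert

      fs = 500.0                               # sampling rate [Hz], illustrative
      t = np.arange(0, 10, 1/fs)
      rng = np.random.default_rng(0)
      frontal = rng.standard_normal(t.size)    # stand-ins for source-reconstructed time series
      parietal = rng.standard_normal(t.size)

      def band_envelope(x, lo, hi):
          b, a = butter(4, [lo/(fs/2), hi/(fs/2)], btype='band')
          return np.abs(hilbert(filtfilt(b, a, x)))

      theta_env = band_envelope(frontal, 4, 8)      # frontal theta power envelope
      gamma_env = band_envelope(parietal, 30, 50)   # parietal gamma power envelope

      coupling = np.corrcoef(theta_env, gamma_env)[0, 1]
      print(f"theta-gamma power coupling: {coupling:.3f}")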

  20. EEG source reconstruction reveals frontal-parietal dynamics of spatial conflict processing.

    Directory of Open Access Journals (Sweden)

    Michael X Cohen

    Full Text Available Cognitive control requires the suppression of distracting information in order to focus on task-relevant information. We applied EEG source reconstruction via time-frequency linear constrained minimum variance beamforming to help elucidate the neural mechanisms involved in spatial conflict processing. Human subjects performed a Simon task, in which conflict was induced by incongruence between spatial location and response hand. We found an early (∼200 ms post-stimulus) conflict modulation in stimulus-contralateral parietal gamma (30-50 Hz), followed by a later alpha-band (8-12 Hz) conflict modulation, suggesting an early detection of spatial conflict and inhibition of spatial location processing. Inter-regional connectivity analyses assessed via cross-frequency coupling of theta (4-8 Hz), alpha, and gamma power revealed conflict-induced shifts in cortical network interactions: Congruent trials (relative to incongruent trials) had stronger coupling between frontal theta and stimulus-contrahemifield parietal alpha/gamma power, whereas incongruent trials had increased theta coupling between medial frontal and lateral frontal regions. These findings shed new light into the large-scale network dynamics of spatial conflict processing, and how those networks are shaped by oscillatory interactions.

  1. EEG Source Reconstruction Reveals Frontal-Parietal Dynamics of Spatial Conflict Processing

    Science.gov (United States)

    Cohen, Michael X; Ridderinkhof, K. Richard

    2013-01-01

    Cognitive control requires the suppression of distracting information in order to focus on task-relevant information. We applied EEG source reconstruction via time-frequency linear constrained minimum variance beamforming to help elucidate the neural mechanisms involved in spatial conflict processing. Human subjects performed a Simon task, in which conflict was induced by incongruence between spatial location and response hand. We found an early (∼200 ms post-stimulus) conflict modulation in stimulus-contralateral parietal gamma (30–50 Hz), followed by a later alpha-band (8–12 Hz) conflict modulation, suggesting an early detection of spatial conflict and inhibition of spatial location processing. Inter-regional connectivity analyses assessed via cross-frequency coupling of theta (4–8 Hz), alpha, and gamma power revealed conflict-induced shifts in cortical network interactions: Congruent trials (relative to incongruent trials) had stronger coupling between frontal theta and stimulus-contrahemifield parietal alpha/gamma power, whereas incongruent trials had increased theta coupling between medial frontal and lateral frontal regions. These findings shed new light into the large-scale network dynamics of spatial conflict processing, and how those networks are shaped by oscillatory interactions. PMID:23451201

  2. Source processes of strong earthquakes in the North Tien-Shan region

    Science.gov (United States)

    Kulikova, G.; Krueger, F.

    2013-12-01

    The Tien-Shan region attracts attention from scientists worldwide due to its complexity and tectonic uniqueness. A series of very strong, destructive earthquakes occurred in Tien-Shan at the turn of the 19th and 20th centuries. Such large intraplate earthquakes are rare in seismology, which increases the interest in the Tien-Shan region. The presented study focuses on the source processes of large earthquakes in Tien-Shan. The amount of seismic data is limited for those early times. In 1889, when a major earthquake occurred in Tien-Shan, seismic instruments were installed in very few locations in the world, and these analog records have not survived to the present day. Although around a hundred seismic stations were operating worldwide at the beginning of the 20th century, it is not always possible to obtain high-quality analog seismograms. Digitizing seismograms is a very important step in the work with analog seismic records. While working with historical seismic records, one has to take into account all the aspects and uncertainties of manual digitizing as well as the lack of accurate timing and instrument characteristics. In this study, we develop an easy-to-handle and fast digitization program on the basis of existing software, which speeds up the digitizing process and accounts for all the recording-system uncertainties. Owing to the lack of absolute timing for the historical earthquakes (due to the absence of a universal clock at that time), we used time differences between P and S phases to relocate the earthquakes in North Tien-Shan and the body-wave amplitudes to estimate their magnitudes. Combining our results with geological data, five earthquakes in North Tien-Shan were precisely relocated. The digitizing of records can introduce steps into the seismograms, which makes restitution (removal of instrument response) undesirable. To avoid the restitution, we simulated historic seismograph recordings with given values for damping and free period of the respective instrument and

  3. Implementing Pollution Source Control—Learning from the Innovation Process in English and Welsh Water Companies

    NARCIS (Netherlands)

    Spiller, M.; McIntosh, B.S.; Seaton, R.A.F.; Jeffrey, P.

    2013-01-01

    Improving the stimulation and management of innovation by water utilities is a key mechanism through which the challenges of securing sustainable water and wastewater services will be achieved. This paper describes the process of adopting source control interventions (SCIs) by water and sewerage

  4. Performance analyses of Z-source and quasi Z-source inverter for photovoltaic applications

    Science.gov (United States)

    Himabind, S.; Priya, T. Hari; Manjeera, Ch.

    2018-04-01

    This paper presents a comparative analysis of the Z-source and quasi Z-source converters for renewable energy applications. Because renewable energy sources depend on external weather conditions, their output voltage and current change accordingly, which affects the performance of traditional voltage source and current source inverters connected across them. To overcome the drawbacks of the VSI and CSI, the Z-source inverter (ZSI) and quasi Z-source inverter (QZSI) are used, which can perform multiple conversion tasks (ac-to-dc, dc-to-ac, ac-to-ac, dc-to-dc). They can be used for both buck and boost operation by utilizing the shoot-through zero state. The QZSI is derived from the ZSI topology, with a slight change in the impedance network, and it overcomes the drawbacks of the ZSI; in particular, the QZSI draws a constant current from the source compared to the ZSI. A comparative analysis is performed between the Z-source and quasi Z-source inverters; simulation is performed in the MATLAB/Simulink environment.
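
    For orientation, the boost behaviour that both topologies obtain from the shoot-through zero state is conventionally summarized as follows (a standard single-stage ZSI/QZSI result from the general literature, not a formula quoted from this paper):

      B = \frac{1}{1 - 2\,T_0/T}, \qquad \hat{v}_{dc} = B\, V_{in}, \qquad \hat{v}_{ac} = M\,\frac{\hat{v}_{dc}}{2},

    where T_0/T < 0.5 is the shoot-through duty ratio, B the boost factor, V_in the dc source voltage and M the modulation index.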

  5. Primary and secondary aerosols in Beijing in winter: sources, variations and processes

    Science.gov (United States)

    Sun, Yele; Du, Wei; Fu, Pingqing; Wang, Qingqing; Li, Jie; Ge, Xinlei; Zhang, Qi; Zhu, Chunmao; Ren, Lujie; Xu, Weiqi; Zhao, Jian; Han, Tingting; Worsnop, Douglas R.; Wang, Zifa

    2016-07-01

    Winter has the worst air pollution of the year in the megacity of Beijing. Despite extensive winter studies in recent years, our knowledge of the sources, formation mechanisms and evolution of aerosol particles is not complete. Here we have a comprehensive characterization of the sources, variations and processes of submicron aerosols that were measured by an Aerodyne high-resolution aerosol mass spectrometer from 17 December 2013 to 17 January 2014 along with offline filter analysis by gas chromatography/mass spectrometry. Our results suggest that submicron aerosols composition was generally similar across the winter of different years and was mainly composed of organics (60 %), sulfate (15 %) and nitrate (11 %). Positive matrix factorization of high- and unit-mass resolution spectra identified four primary organic aerosol (POA) factors from traffic, cooking, biomass burning (BBOA) and coal combustion (CCOA) emissions as well as two secondary OA (SOA) factors. POA dominated OA, on average accounting for 56 %, with CCOA being the largest contributor (20 %). Both CCOA and BBOA showed distinct polycyclic aromatic hydrocarbons (PAHs) spectral signatures, indicating that PAHs in winter were mainly from coal combustion (66 %) and biomass burning emissions (18 %). BBOA was highly correlated with levoglucosan, a tracer compound for biomass burning (r2 = 0.93), and made a considerable contribution to OA in winter (9 %). An aqueous-phase-processed SOA (aq-OOA) that was strongly correlated with particle liquid water content, sulfate and S-containing ions (e.g. CH2SO2+) was identified. On average aq-OOA contributed 12 % to the total OA and played a dominant role in increasing oxidation degrees of OA at high RH levels (> 50 %). Our results illustrate that aqueous-phase processing can enhance SOA production and oxidation states of OA as well in winter. Further episode analyses highlighted the significant impacts of meteorological parameters on aerosol composition, size
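
    The factor analysis referred to above decomposes the organic mass-spectral time series into non-negative factor profiles and contributions. The sketch below illustrates the idea with scikit-learn's non-negative matrix factorization as a stand-in for the dedicated PMF solvers (e.g. PMF2/ME-2) normally used in such studies; the matrix shapes, number of factors and random data are purely illustrative.

      # Hedged sketch: PMF-style factorization of a (time x m/z) organic aerosol matrix.
      import numpy as np
      from sklearn.decomposition import NMF

      rng = np.random.default_rng(1)
      n_times, n_mz, n_factors = 500, 120, 6       # e.g. four POA factors plus two SOA factors
      X = rng.random((n_times, n_mz))              # stand-in for the non-negative measured spectra

      model = NMF(n_components=n_factors, init='nndsvda', max_iter=500, random_state=0)
      G = model.fit_transform(X)                   # factor time series (contributions)
      F = model.components_                        # factor mass-spectral profiles

      total = (G @ F).sum()
      for k in range(n_factors):
          share = np.outer(G[:, k], F[k]).sum() / total
          print(f"factor {k}: {share:.1%} of reconstructed organic mass")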

  6. Primary and secondary aerosols in Beijing in winter: sources, variations and processes

    Directory of Open Access Journals (Sweden)

    Y. Sun

    2016-07-01

    Full Text Available Winter has the worst air pollution of the year in the megacity of Beijing. Despite extensive winter studies in recent years, our knowledge of the sources, formation mechanisms and evolution of aerosol particles is not complete. Here we have a comprehensive characterization of the sources, variations and processes of submicron aerosols that were measured by an Aerodyne high-resolution aerosol mass spectrometer from 17 December 2013 to 17 January 2014 along with offline filter analysis by gas chromatography/mass spectrometry. Our results suggest that submicron aerosols composition was generally similar across the winter of different years and was mainly composed of organics (60 %), sulfate (15 %) and nitrate (11 %). Positive matrix factorization of high- and unit-mass resolution spectra identified four primary organic aerosol (POA) factors from traffic, cooking, biomass burning (BBOA) and coal combustion (CCOA) emissions as well as two secondary OA (SOA) factors. POA dominated OA, on average accounting for 56 %, with CCOA being the largest contributor (20 %). Both CCOA and BBOA showed distinct polycyclic aromatic hydrocarbons (PAHs) spectral signatures, indicating that PAHs in winter were mainly from coal combustion (66 %) and biomass burning emissions (18 %). BBOA was highly correlated with levoglucosan, a tracer compound for biomass burning (r2 = 0.93), and made a considerable contribution to OA in winter (9 %). An aqueous-phase-processed SOA (aq-OOA) that was strongly correlated with particle liquid water content, sulfate and S-containing ions (e.g. CH2SO2+) was identified. On average aq-OOA contributed 12 % to the total OA and played a dominant role in increasing oxidation degrees of OA at high RH levels (> 50 %). Our results illustrate that aqueous-phase processing can enhance SOA production and oxidation states of OA as well in winter. Further episode analyses highlighted the significant impacts of meteorological parameters on

  7. Software Engineering Program: Software Process Improvement Guidebook

    Science.gov (United States)

    1996-01-01

    The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.

  8. Comparative economic factors on the use of radionuclide or electrical sources for food processing with ionizing radiation

    International Nuclear Information System (INIS)

    Lagunas-Solar, M.C.

    1985-01-01

    Food irradiation is a promising addition to conventional food processing techniques. However, as is the case with most new technologies, its economic suitability will be determined by comparison to current methods. Assuming that current food processing facilities are adaptable to the incorporation of a food irradiation capability, an analysis of cost for several different optional systems able to process up to 100 Mrad ton/day (1 MGy ton/day; or 1,000 ton/day at 100 krad) will be made. Both radionuclide and electrical accelerators will be compared as sources of ionizing radiation. The cost of irradiation will be shown to be competitive with most other treatments including fumigation, low-temperature storage, and controlled atmosphere. A proper figure-of-merit for comparing the different sources will be defined and used as a basis for an economic evaluation of food irradiation. (author)
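
    As a rough, illustrative check of the throughput figure quoted above (not a calculation taken from the paper), delivering 100 krad (1 kGy) to 1,000 t of product per day corresponds to an absorbed-dose power of roughly 12 kW, before any allowance for source-utilization efficiency:

      # Hedged back-of-the-envelope estimate of the quoted processing capacity.
      dose_gray = 1.0e3              # 100 krad = 1 kGy = 1000 J/kg absorbed dose
      mass_kg_per_day = 1.0e6        # 1,000 metric tons per day
      seconds_per_day = 86400.0

      absorbed_power_kw = dose_gray * mass_kg_per_day / seconds_per_day / 1e3
      print(f"Absorbed-dose power: {absorbed_power_kw:.1f} kW")          # about 11.6 kW

      efficiency = 0.30              # assumed overall dose-utilization efficiency (illustrative)
      print(f"Required source/beam power at {efficiency:.0%} efficiency: "
            f"{absorbed_power_kw / efficiency:.0f} kW")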

  9. Jenkins-CI, an Open-Source Continuous Integration System, as a Scientific Data and Image-Processing Platform

    Science.gov (United States)

    Moutsatsos, Ioannis K.; Hossain, Imtiaz; Agarinis, Claudia; Harbinski, Fred; Abraham, Yann; Dobler, Luc; Zhang, Xian; Wilson, Christopher J.; Jenkins, Jeremy L.; Holway, Nicholas; Tallarico, John; Parker, Christian N.

    2016-01-01

    High-throughput screening generates large volumes of heterogeneous data that require a diverse set of computational tools for management, processing, and analysis. Building integrated, scalable, and robust computational workflows for such applications is challenging but highly valuable. Scientific data integration and pipelining facilitate standardized data processing, collaboration, and reuse of best practices. We describe how Jenkins-CI, an “off-the-shelf,” open-source, continuous integration system, is used to build pipelines for processing images and associated data from high-content screening (HCS). Jenkins-CI provides numerous plugins for standard compute tasks, and its design allows the quick integration of external scientific applications. Using Jenkins-CI, we integrated CellProfiler, an open-source image-processing platform, with various HCS utilities and a high-performance Linux cluster. The platform is web-accessible, facilitates access and sharing of high-performance compute resources, and automates previously cumbersome data and image-processing tasks. Imaging pipelines developed using the desktop CellProfiler client can be managed and shared through a centralized Jenkins-CI repository. Pipelines and managed data are annotated to facilitate collaboration and reuse. Limitations with Jenkins-CI (primarily around the user interface) were addressed through the selection of helper plugins from the Jenkins-CI community. PMID:27899692

  10. Jenkins-CI, an Open-Source Continuous Integration System, as a Scientific Data and Image-Processing Platform.

    Science.gov (United States)

    Moutsatsos, Ioannis K; Hossain, Imtiaz; Agarinis, Claudia; Harbinski, Fred; Abraham, Yann; Dobler, Luc; Zhang, Xian; Wilson, Christopher J; Jenkins, Jeremy L; Holway, Nicholas; Tallarico, John; Parker, Christian N

    2017-03-01

    High-throughput screening generates large volumes of heterogeneous data that require a diverse set of computational tools for management, processing, and analysis. Building integrated, scalable, and robust computational workflows for such applications is challenging but highly valuable. Scientific data integration and pipelining facilitate standardized data processing, collaboration, and reuse of best practices. We describe how Jenkins-CI, an "off-the-shelf," open-source, continuous integration system, is used to build pipelines for processing images and associated data from high-content screening (HCS). Jenkins-CI provides numerous plugins for standard compute tasks, and its design allows the quick integration of external scientific applications. Using Jenkins-CI, we integrated CellProfiler, an open-source image-processing platform, with various HCS utilities and a high-performance Linux cluster. The platform is web-accessible, facilitates access and sharing of high-performance compute resources, and automates previously cumbersome data and image-processing tasks. Imaging pipelines developed using the desktop CellProfiler client can be managed and shared through a centralized Jenkins-CI repository. Pipelines and managed data are annotated to facilitate collaboration and reuse. Limitations with Jenkins-CI (primarily around the user interface) were addressed through the selection of helper plugins from the Jenkins-CI community.

  11. Effects of the addition of different nitrogen sources in the tequila fermentation process at high sugar concentration.

    Science.gov (United States)

    Arrizon, J; Gschaedler, A

    2007-04-01

    To study the effect of the addition of different nitrogen sources at high sugar concentration in the tequila fermentation process. Fermentations were performed at high sugar concentration (170 g l-1) using Agave tequilana Weber blue variety with and without added nitrogen from different sources (ammonium sulfate; glutamic acid; a mixture of ammonium sulfate and amino acids) during the exponential phase of growth. All the additions increased the fermentation rate and alcohol efficiency. The level of synthesis of volatile compounds depended on the source added. The concentrations of amyl alcohols and isobutanol decreased, while propanol and acetaldehyde concentrations increased. The most efficient nitrogen sources for fermentation rate were ammonium sulfate and the mixture of ammonium sulfate and amino acids. The level of volatile compounds produced depended upon the type of nitrogen source. The synthesis of some volatile compounds increased while others decreased with nitrogen addition. The addition of nitrogen could be a strategy for improving the fermentation rate and efficiency in the tequila fermentation process at high Agave tequilana sugar concentrations. Furthermore, the sensory quality of the final product may change because the synthesis of the volatile compounds is modified.

  12. Carotenoids Functionality, Sources, and Processing by Supercritical Technology: A Review

    Directory of Open Access Journals (Sweden)

    Natália Mezzomo

    2016-01-01

    Full Text Available Carotenoid is a group of pigments naturally present in vegetal raw materials that have biological properties. These pigments have been used mainly in food, pharmaceutical, and cosmetic industries. Currently, the industrial production is executed through chemical synthesis, but natural alternatives of carotenoid production/attainment are in development. The carotenoid extraction occurs generally with vegetal oil and organic solvents, but supercritical technology is an alternative technique to the recovery of these compounds, presenting many advantages when compared to conventional process. Brazil has an ample diversity of vegetal sources inadequately investigated and, then, a major development of optimization and validation of carotenoid production/attainment methods is necessary, so that the benefits of these pigments can be delivered to the consumer.

  13. A nutribusiness strategy for processing and marketing animal-source foods for children.

    Science.gov (United States)

    Mills, Edward W; Seetharaman, Koushik; Maretzki, Audrey N

    2007-04-01

    Nutritional benefits of animal source foods in the diets of children in developing countries indicate a need to increase the availability of such foods to young children. A nutribusiness strategy based on a dried meat and starch product could be used to increase children's access to such foods. The "Chiparoo" was developed at The Pennsylvania State University with this objective in mind. Plant-based and meat ingredients of the Chiparoo are chosen based on regional availability and cultural acceptability. Chiparoo processing procedures, including solar drying, are designed to ensure product safety and to provide product properties that allow them to be eaten as a snack or crumbled into a weaning porridge. Continued work is needed to develop formulation and processing variations that accommodate the needs of cultures around the world.

  14. Constraints on equivalent elastic source models from near-source data

    International Nuclear Information System (INIS)

    Stump, B.

    1993-01-01

    A phenomenologically based seismic source model is important in quantifying the important physical processes that affect the observed seismic radiation in the linear-elastic regime. Representations such as these were used to assess yield effects on seismic waves under a Threshold Test Ban Treaty and to help transport seismic coupling experience at one test site to another. These same characterizations in a non-proliferation environment find applications in understanding the generation of the different types of body and surface waves from nuclear explosions, single chemical explosions, arrays of chemical explosions used in mining, rock bursts and earthquakes. Seismologists typically begin with an equivalent elastic representation of the source which, when convolved with the propagation path effects, produces a seismogram. The Representation Theorem replaces the true source with an equivalent set of body forces, boundary conditions or initial conditions. An extension of this representation shows the equivalence of the body forces, boundary conditions and initial conditions and replaces the source with a set of force moments, the first-degree moment tensor for a point source representation. The difficulty with this formulation, which can completely describe the observed waveforms when the propagation path effects are known, is in the physical interpretation of the actual physical processes acting in the source volume. Observational data from within the source region, where processes are often nonlinear, linked to numerical models of the important physical processes in this region are critical to a unique physical understanding of the equivalent elastic source function
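
    The point-source equivalence mentioned above is conventionally written via the representation theorem; in standard index notation (a textbook form, not quoted from this report),

      u_n(\mathbf{x}, t) = M_{pq}(t) * \frac{\partial G_{np}(\mathbf{x}, t; \boldsymbol{\xi})}{\partial \xi_q},

    i.e. the observed displacement u_n is the temporal convolution of the moment-tensor components M_pq with the derivatives of the elastodynamic Green's function G_np taken with respect to the source coordinates ξ_q; the propagation path effects enter entirely through G.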

  15. Biomechanical ToolKit: Open-source framework to visualize and process biomechanical data.

    Science.gov (United States)

    Barre, Arnaud; Armand, Stéphane

    2014-04-01

    C3D file format is widely used in the biomechanical field by companies and laboratories to store motion capture systems data. However, few software packages can visualize and modify the integrality of the data in the C3D file. Our objective was to develop an open-source and multi-platform framework to read, write, modify and visualize data from any motion analysis systems using standard (C3D) and proprietary file formats (used by many companies producing motion capture systems). The Biomechanical ToolKit (BTK) was developed to provide cost-effective and efficient tools for the biomechanical community to easily deal with motion analysis data. A large panel of operations is available to read, modify and process data through C++ API, bindings for high-level languages (Matlab, Octave, and Python), and standalone application (Mokka). All these tools are open-source and cross-platform and run on all major operating systems (Windows, Linux, MacOS X). Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
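
    A short sketch of the Python-binding usage described above. The file name and marker label are hypothetical, and the calls follow the commonly documented BTK Python API (btkAcquisitionFileReader and related accessors); exact method names should be checked against the installed version.

      # Hedged sketch: read a C3D acquisition with the Biomechanical ToolKit (BTK)
      # Python bindings and inspect one marker trajectory.
      import btk

      reader = btk.btkAcquisitionFileReader()
      reader.SetFilename("gait_trial.c3d")       # hypothetical input file
      reader.Update()
      acq = reader.GetOutput()

      print("Point frequency [Hz]:", acq.GetPointFrequency())
      print("Number of frames:", acq.GetPointFrameNumber())

      marker = acq.GetPoint("LASI")              # hypothetical marker label
      print("First-frame position:", marker.GetValues()[0])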

  16. Persistence of attitude change and attitude-behavior correspondence based on extensive processing of source information

    NARCIS (Netherlands)

    Pierro, Antonio; Mannetti, Lucia; Kruglanski, Arie W.; Klein, Kristen; Orehek, Edward

    A three-phase longitudinal study (spread over a month's time) was carried out to investigate attitude's persistence and linkage to behavior as it may be affected by the processing of information about the communication source. The following three independent variables were manipulated: (i) contents

  17. Wavefield dependency on virtual shifts in the source location

    KAUST Repository

    Alkhalifah, Tariq

    2011-02-14

    The wavefield dependence on a virtual shift in the source location can provide information helpful in velocity estimation and interpolation. However, the second-order partial differential equation (PDE) that relates changes in the wavefield form (or shape) to lateral perturbations in the source location depends explicitly on lateral derivatives of the velocity field. For velocity models that include lateral velocity discontinuities this is problematic as such derivatives in their classical definition do not exist. As a result, I derive perturbation partial differential wave equations that are independent of direct velocity derivatives and thus provide possibilities for wavefield shape extrapolation in complex media. These PDEs have the same structure as the wave equation with a source function that depends on the background (original source) wavefield. The solutions of the perturbation equations provide the coefficients of a Taylor-series-type expansion for the wavefield. The new formulas introduce changes to the background wavefield only in the presence of lateral velocity variation or, in general terms, velocity variations in the perturbation direction. The accuracy of the representation, as demonstrated on the Marmousi model, is generally good. © 2011 European Association of Geoscientists & Engineers.
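
    In illustrative notation (not the paper's exact symbols), the source-location expansion described above reads

      U(\mathbf{x}; s_0 + \Delta s) \approx U(\mathbf{x}; s_0) + \Delta s \left.\frac{\partial U}{\partial s}\right|_{s_0} + \frac{(\Delta s)^2}{2} \left.\frac{\partial^2 U}{\partial s^2}\right|_{s_0},

    where each derivative wavefield satisfies a wave equation of the same structure as the background field, with a source term built from the lower-order (background) wavefield, so no explicit lateral velocity derivatives are required.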

  18. Aqueous organic chemistry in the atmosphere: sources and chemical processing of organic aerosols.

    Science.gov (United States)

    McNeill, V Faye

    2015-02-03

    Over the past decade, it has become clear that aqueous chemical processes occurring in cloud droplets and wet atmospheric particles are an important source of organic atmospheric particulate matter. Reactions of water-soluble volatile (or semivolatile) organic gases (VOCs or SVOCs) in these aqueous media lead to the formation of highly oxidized organic particulate matter (secondary organic aerosol; SOA) and key tracer species, such as organosulfates. These processes are often driven by a combination of anthropogenic and biogenic emissions, and therefore their accurate representation in models is important for effective air quality management. Despite considerable progress, mechanistic understanding of some key aqueous processes is still lacking, and these pathways are incompletely represented in 3D atmospheric chemistry and air quality models. In this article, the concepts, historical context, and current state of the science of aqueous pathways of SOA formation are discussed.

  19. Reaction probability derived from an interpolation formula for diffusion processes with an absorptive boundary condition

    International Nuclear Information System (INIS)

    Misawa, T.; Itakura, H.

    1995-01-01

    The present article focuses on a dynamical simulation of molecular motion in liquids. In the simulation involving diffusion-controlled reaction with discrete time steps, lack of information regarding the trajectory within the time step may result in a failure to count the number of reactions of the particles within the step. In order to rectify this, an interpolated diffusion process is used. The process is derived from a stochastic interpolation formula recently developed by the first author [J. Math. Phys. 34, 775 (1993)]. In this method, the probability that reaction has occurred during the time step given the initial and final positions of the particles is calculated. Some numerical examples confirm that the theoretical result corresponds to an improvement over the Clifford-Green work [Mol. Phys. 57, 123 (1986)] on the same matter
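
    The per-step quantity being estimated, the probability that a diffusing particle touched an absorbing boundary between two recorded positions, has a well-known closed form for free one-dimensional diffusion. The sketch below uses that standard Brownian-bridge result as a stand-in for the authors' interpolation formula; all parameter values are illustrative.

      # Hedged sketch: probability that a freely diffusing particle crossed the
      # absorbing boundary at x = 0 during a step of length dt, given the (positive)
      # start and end positions, and its use as a per-step reaction correction.
      import numpy as np

      def crossing_probability(x0, x1, D, dt):
          """Brownian-bridge estimate of P(boundary touched | x0 > 0, x1 > 0)."""
          return np.exp(-x0 * x1 / (D * dt))

      rng = np.random.default_rng(0)
      D, dt, x0 = 1.0, 1.0e-3, 0.05
      reacted = 0
      n_trials = 10000
      for _ in range(n_trials):
          x1 = x0 + np.sqrt(2 * D * dt) * rng.standard_normal()
          if x1 <= 0 or rng.random() < crossing_probability(x0, x1, D, dt):
              reacted += 1     # counts reactions that endpoint checking alone would miss
      print("estimated per-step reaction probability:", reacted / n_trials)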

  20. The interprocess NIR sampling as an alternative approach to multivariate statistical process control for identifying sources of product-quality variability.

    Science.gov (United States)

    Marković, Snežana; Kerč, Janez; Horvat, Matej

    2017-03-01

    We present a new approach for identifying sources of variability within a manufacturing process by NIR measurements of samples of intermediate material after each consecutive unit operation (interprocess NIR sampling technique). In addition, we summarize the development of a multivariate statistical process control (MSPC) model for the production of an enteric-coated pellet product of the proton-pump inhibitor class. By developing provisional NIR calibration models, the identification of critical process points yields results comparable to the established MSPC modeling procedure. Both approaches are shown to lead to the same conclusion, identifying parameters of extrusion/spheronization and characteristics of lactose that have the greatest influence on the end-product's enteric coating performance. The proposed approach enables quicker and easier identification of variability sources during the manufacturing process, especially in cases when historical process data are not readily available. In the presented case, changes in lactose characteristics influence the performance of the extrusion/spheronization process step. The pellet cores produced using one (considered less suitable) lactose source were on average larger and more fragile, leading to breakage of the cores during subsequent fluid bed operations. These results were confirmed by additional experimental analyses illuminating the underlying mechanism by which oblong pellets fracture during the pellet coating process, leading to compromised film coating.
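
    For context, a conventional MSPC chart of the kind referred to above can be built from a PCA model of in-control batches and a Hotelling T² statistic for new batches. The sketch below uses scikit-learn with invented data shapes and limits; nothing here reproduces the study's actual model.

      # Hedged sketch: PCA-based multivariate statistical process control (MSPC).
      import numpy as np
      from sklearn.decomposition import PCA
      from scipy.stats import f as f_dist

      rng = np.random.default_rng(0)
      X_ref = rng.standard_normal((40, 12))        # 40 in-control batches x 12 process variables
      mu, sd = X_ref.mean(axis=0), X_ref.std(axis=0)
      pca = PCA(n_components=3).fit((X_ref - mu) / sd)

      def hotelling_t2(X):
          scores = pca.transform((X - mu) / sd)
          return np.sum(scores**2 / pca.explained_variance_, axis=1)

      n, a = X_ref.shape[0], pca.n_components_     # 95 % control limit for new observations
      limit = a * (n - 1) * (n + 1) / (n * (n - a)) * f_dist.ppf(0.95, a, n - a)

      X_new = rng.standard_normal((5, 12)) + np.r_[0, 0, 2, np.zeros(9)]  # one shifted variable
      print("T2 of new batches:", np.round(hotelling_t2(X_new), 2), "  limit:", round(limit, 2))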

  1. Shoot-derived abscisic acid promotes root growth.

    Science.gov (United States)

    McAdam, Scott A M; Brodribb, Timothy J; Ross, John J

    2016-03-01

    The phytohormone abscisic acid (ABA) plays a major role in regulating root growth. Most work to date has investigated the influence of root-sourced ABA on root growth during water stress. Here, we tested whether foliage-derived ABA could be transported to the roots, and whether this foliage-derived ABA had an influence on root growth under well-watered conditions. Using both application studies of deuterium-labelled ABA and reciprocal grafting between wild-type and ABA-biosynthetic mutant plants, we show that both ABA levels in the roots and root growth in representative angiosperms are controlled by ABA synthesized in the leaves rather than sourced from the roots. Foliage-derived ABA was found to promote root growth relative to shoot growth but to inhibit the development of lateral roots. Increased root auxin (IAA) levels in plants with ABA-deficient scions suggest that foliage-derived ABA inhibits root growth through the root growth-inhibitor IAA. These results highlight the physiological and morphological importance, beyond the control of stomata, of foliage-derived ABA. The use of foliar ABA as a signal for root growth has important implications for regulating root to shoot growth under normal conditions and suggests that leaf rather than root hydration is the main signal for regulating plant responses to moisture. © 2015 John Wiley & Sons Ltd.

  2. Placenta-an alternative source of stem cells

    International Nuclear Information System (INIS)

    Matikainen, Tiina; Laine, Jarmo

    2005-01-01

    The two most promising practical applications of human stem cells are cellular replacement therapies in human disease and toxicological screening of candidate drug molecules. Both require a source of human stem cells that can be isolated, purified, expanded in number and differentiated into the cell type of choice in a controlled manner. Currently, uses of both embryonic and adult stem cells are investigated. While embryonic stem cells are pluripotent and can differentiate into any specialised cell type, their use requires establishment of embryonic stem cell lines using the inner cell mass of an early pre-implantation embryo. As the blastocyst is destroyed during the process, ethical issues need to be carefully considered. The use of embryonic stem cells is also limited by the difficulties in growing large numbers of the cells without inducing spontaneous differentiation, and the problems in controlling directed differentiation of the cells. The use of adult stem cells, typically derived from bone marrow, but also from other tissues, is ethically non-controversial but their differentiation potential is more limited than that of the embryonic stem cells. Since human cord blood, umbilical cord, placenta and amnion are normally discarded at birth, they provide an easily accessible alternative source of stem cells. We review the potential and current status of the use of adult stem cells derived from the placenta or umbilical cord in therapeutic and toxicological applications

  3. Chemical etching of GaAs with a novel low energy ion beam source: a low damage process for device fabrication

    International Nuclear Information System (INIS)

    Beckerman, J.; Jackman, R.B.

    1993-01-01

    If the advantages of physics (anisotropy) can be combined with the advantages of chemistry (damage-free perturbation of the lattice) then an excellent, near damage-free, etching reaction can result. In this context, ultra-low energy ion beam sources hold considerable promise, and chemical etching of GaAs was achieved here with such a source, albeit at a low rate. The source does, however, give rise to a coating, derived from the source liner, which must be washed from all etched samples. The presence of such a coating is likely to be the origin of the slow etch rate achieved. After removal of the coating, smooth, mirror-like etched surfaces are apparent. These surfaces perform very well when Schottky diodes are constructed from them, showing no deviation from the behaviour of control samples. (author)

  4. Evaluation of Novel Polyunsaturated Fatty Acid Derived Lipid Mediators of Inflammation to Ameliorate the Deleterious Effects of Blast Overpressure on Eye and Brain Visual Processing Centers in Rats

    Science.gov (United States)

    2014-10-01

    Award Number: W81XWH-12-2-0082. The study evaluates lipid mediators derived from docosahexaenoic acid (DHA; 22:6ω-3) and eicosapentaenoic acid (EPA; 20:5ω-3), including lipoxin A4, resolvin E1, protectin DX and resolvin D1, as candidates to ameliorate the deleterious effects of blast overpressure on eye and brain visual processing centers.

  5. ACCELERATING THE ADOPTION PROCESS OF RENEWABLE ENERGY SOURCES AMONG SMES

    Directory of Open Access Journals (Sweden)

    Mirjam Leloux

    2015-07-01

    Full Text Available By 2020, intermittent small-scale renewable energy sources (e.g. wind and solar energy) are expected to represent about 17% of the EU's total electricity consumption. The overriding objective of national energy policy is to ensure competitive, secure and sustainable energy for the economy and for society. Renewable energy, allied with energy efficiency, is widely regarded as crucial to meeting these goals of secure, sustainable and competitive energy supplies, reducing dependency on expensive fossil-fuel imports and underpinning the move towards a low-carbon economy while delivering green jobs. This all contributes to national competitiveness and to the jobs and economic growth agenda. However, a straightforward implementation of renewable energy options is not easy, due to various barriers and obstacles. For most SMEs, the concept of generating their own renewable energy is still of more academic than genuine interest. In general, several barriers are experienced, such as high capital investment, slow return on investment, and a lack of knowledge of the benefits. There is a need for education on the benefits and drawbacks of sustainable energy, as well as a greater contribution to costs, for this to work. In this paper we describe the intermediate outcomes of a European partnership under the name of GREAT (Growing Renewable Energy Applications and Technologies), funded under the INTERREG IVB NWE Programme. GREAT aims to encourage communities and small to medium-sized enterprises (SMEs) in Ireland, the United Kingdom, Belgium and The Netherlands to develop technological solutions for Smart Grid, Renewable Energy and Distributive Generation; to research and develop policy issues for regulatory authorities; and to provide structured co-operation opportunities between SMEs and research institutes / technology developers. We developed GREAT spreadsheets to help SMEs in each country calculate the return on investment of renewable energy sources, such as
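
    The return-on-investment logic that such spreadsheets implement can be illustrated in a few lines; all figures below (capital cost, yield, tariff, discount rate) are made-up example inputs, not values from the GREAT tools.

      # Hedged sketch: simple payback and net present value for an SME solar PV investment.
      capital_cost = 25_000.0        # EUR, illustrative turnkey cost of a ~25 kWp array
      annual_yield_kwh = 22_000.0    # kWh generated per year, illustrative
      electricity_price = 0.18       # EUR per kWh avoided or exported
      opex_per_year = 300.0          # EUR per year for maintenance and insurance
      discount_rate = 0.05
      lifetime_years = 20

      annual_saving = annual_yield_kwh * electricity_price - opex_per_year
      simple_payback = capital_cost / annual_saving
      npv = -capital_cost + sum(annual_saving / (1 + discount_rate) ** y
                                for y in range(1, lifetime_years + 1))

      print(f"Simple payback: {simple_payback:.1f} years")
      print(f"{lifetime_years}-year NPV at {discount_rate:.0%}: EUR {npv:,.0f}")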

  6. The Chandra Source Catalog 2.0: Estimating Source Fluxes

    Science.gov (United States)

    Primini, Francis Anthony; Allen, Christopher E.; Miller, Joseph; Anderson, Craig S.; Budynkiewicz, Jamie A.; Burke, Douglas; Chen, Judy C.; Civano, Francesca Maria; D'Abrusco, Raffaele; Doe, Stephen M.; Evans, Ian N.; Evans, Janet D.; Fabbiano, Giuseppina; Gibbs, Danny G., II; Glotfelty, Kenny J.; Graessle, Dale E.; Grier, John D.; Hain, Roger; Hall, Diane M.; Harbo, Peter N.; Houck, John C.; Lauer, Jennifer L.; Laurino, Omar; Lee, Nicholas P.; Martínez-Galarza, Juan Rafael; McCollough, Michael L.; McDowell, Jonathan C.; McLaughlin, Warren; Morgan, Douglas L.; Mossman, Amy E.; Nguyen, Dan T.; Nichols, Joy S.; Nowak, Michael A.; Paxson, Charles; Plummer, David A.; Rots, Arnold H.; Siemiginowska, Aneta; Sundheim, Beth A.; Tibbetts, Michael; Van Stone, David W.; Zografou, Panagoula

    2018-01-01

    The Second Chandra Source Catalog (CSC2.0) will provide information on approximately 316,000 point or compact extended X-ray sources, derived from over 10,000 ACIS and HRC-I imaging observations available in the public archive at the end of 2014. As in the previous catalog release (CSC1.1), fluxes for these sources will be determined separately from source detection, using a Bayesian formalism that accounts for background, spatial resolution effects, and contamination from nearby sources. However, the CSC2.0 procedure differs from that used in CSC1.1 in three important aspects. First, for sources in crowded regions in which photometric apertures overlap, fluxes are determined jointly, using an extension of the CSC1.1 algorithm, as discussed in Primini & Kashyap (2014ApJ...796...24P). Second, an MCMC procedure is used to estimate marginalized posterior probability distributions for source fluxes. Finally, for sources observed in multiple observations, a Bayesian Blocks algorithm (Scargle et al. 2013ApJ...764..167S) is used to group observations into blocks of constant source flux. In this poster we present details of the CSC2.0 photometry algorithms and illustrate their performance on actual CSC2.0 datasets. This work has been supported by NASA under contract NAS 8-03060 to the Smithsonian Astrophysical Observatory for operation of the Chandra X-ray Center.
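
    The Bayesian Blocks grouping step mentioned above can be illustrated with the implementation available in astropy: the sketch below segments a set of per-observation flux measurements into blocks of constant flux. The observation times, fluxes and errors are invented for the example; this is not the CSC2.0 pipeline code.

```python
# Illustrative grouping of multi-epoch flux measurements into constant-flux blocks
# using the Bayesian Blocks algorithm (Scargle et al. 2013) as implemented in astropy.
import numpy as np
from astropy.stats import bayesian_blocks

t = np.array([0., 30., 65., 120., 180., 240., 300.])      # observation epochs (days)
flux = np.array([1.1, 1.0, 1.2, 3.1, 2.9, 3.2, 1.1])       # hypothetical fluxes
flux_err = np.full_like(flux, 0.15)                        # hypothetical 1-sigma errors

# The 'measures' fitness function is appropriate for point measurements with Gaussian errors.
edges = bayesian_blocks(t, flux, flux_err, fitness='measures')
print("block edges (days):", edges)
```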

  7. Modeling the influence of coupled mass transfer processes on mass flux downgradient of heterogeneous DNAPL source zones.

    Science.gov (United States)

    Yang, Lurong; Wang, Xinyu; Mendoza-Sanchez, Itza; Abriola, Linda M

    2018-04-01

    Sequestered mass in low permeability zones has been increasingly recognized as an important source of organic chemical contamination that acts to sustain downgradient plume concentrations above regulated levels. However, few modeling studies have investigated the influence of this sequestered mass and associated (coupled) mass transfer processes on plume persistence in complex dense nonaqueous phase liquid (DNAPL) source zones. This paper employs a multiphase flow and transport simulator (a modified version of the modular transport simulator MT3DMS) to explore the two- and three-dimensional evolution of source zone mass distribution and near-source plume persistence for two ensembles of highly heterogeneous DNAPL source zone realizations. Simulations reveal the strong influence of subsurface heterogeneity on the complexity of DNAPL and sequestered (immobile/sorbed) mass distribution. Small zones of entrapped DNAPL are shown to serve as a persistent source of low concentration plumes, difficult to distinguish from other (sorbed and immobile dissolved) sequestered mass sources. Results suggest that the presence of DNAPL tends to control plume longevity in the near-source area; for the examined scenarios, a substantial fraction (43.3-99.2%) of plume life was sustained by DNAPL dissolution processes. The presence of sorptive media and the extent of sorption non-ideality are shown to greatly affect predictions of near-source plume persistence following DNAPL depletion, with plume persistence varying one to two orders of magnitude with the selected sorption model. Results demonstrate the importance of sorption-controlled back diffusion from low permeability zones and reveal the importance of selecting the appropriate sorption model for accurate prediction of plume longevity. Large discrepancies for both DNAPL depletion time and plume longevity were observed between 2-D and 3-D model simulations. Differences between 2- and 3-D predictions increased in the presence of

  8. Modeling the influence of coupled mass transfer processes on mass flux downgradient of heterogeneous DNAPL source zones

    Science.gov (United States)

    Yang, Lurong; Wang, Xinyu; Mendoza-Sanchez, Itza; Abriola, Linda M.

    2018-04-01

    Sequestered mass in low permeability zones has been increasingly recognized as an important source of organic chemical contamination that acts to sustain downgradient plume concentrations above regulated levels. However, few modeling studies have investigated the influence of this sequestered mass and associated (coupled) mass transfer processes on plume persistence in complex dense nonaqueous phase liquid (DNAPL) source zones. This paper employs a multiphase flow and transport simulator (a modified version of the modular transport simulator MT3DMS) to explore the two- and three-dimensional evolution of source zone mass distribution and near-source plume persistence for two ensembles of highly heterogeneous DNAPL source zone realizations. Simulations reveal the strong influence of subsurface heterogeneity on the complexity of DNAPL and sequestered (immobile/sorbed) mass distribution. Small zones of entrapped DNAPL are shown to serve as a persistent source of low concentration plumes, difficult to distinguish from other (sorbed and immobile dissolved) sequestered mass sources. Results suggest that the presence of DNAPL tends to control plume longevity in the near-source area; for the examined scenarios, a substantial fraction (43.3-99.2%) of plume life was sustained by DNAPL dissolution processes. The presence of sorptive media and the extent of sorption non-ideality are shown to greatly affect predictions of near-source plume persistence following DNAPL depletion, with plume persistence varying one to two orders of magnitude with the selected sorption model. Results demonstrate the importance of sorption-controlled back diffusion from low permeability zones and reveal the importance of selecting the appropriate sorption model for accurate prediction of plume longevity. Large discrepancies for both DNAPL depletion time and plume longevity were observed between 2-D and 3-D model simulations. Differences between 2- and 3-D predictions increased in the presence of

  9. Dealing with Feeling: A Meta-Analysis of the Effectiveness of Strategies Derived from the Process Model of Emotion Regulation

    Science.gov (United States)

    Webb, Thomas L.; Miles, Eleanor; Sheeran, Paschal

    2012-01-01

    The present meta-analysis investigated the effectiveness of strategies derived from the process model of emotion regulation in modifying emotional outcomes as indexed by experiential, behavioral, and physiological measures. A systematic search of the literature identified 306 experimental comparisons of different emotion regulation (ER)…

  10. Upper Bounds for the Rate Distortion Function of Finite-Length Data Blocks of Gaussian WSS Sources

    Directory of Open Access Journals (Sweden)

    Jesús Gutiérrez-Gutiérrez

    2017-10-01

    Full Text Available In this paper, we present upper bounds for the rate distortion function (RDF) of finite-length data blocks of Gaussian wide sense stationary (WSS) sources and we propose coding strategies to achieve such bounds. In order to obtain those bounds, we first derive new results on the discrete Fourier transform (DFT) of WSS processes.
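
    The sketch below is not the finite-length bound from the paper; it is only the classical reverse water-filling computation of the RDF for independent Gaussian components, with the component variances approximated by the DFT (circulant) eigenvalues of a periodised autocovariance, an asymptotic approximation for WSS sources. The autocovariance sequence and target distortion are arbitrary illustrative choices.

```python
# Reverse water-filling for a Gaussian source with independent components whose
# variances are approximated by the DFT of a (periodised) autocovariance sequence.
import numpy as np

def rdf_reverse_waterfilling(eigvals, D, tol=1e-12):
    """Rate (bits/sample) needed to achieve mean-squared distortion D."""
    lo, hi = tol, float(max(eigvals))
    while hi - lo > tol:                          # bisect on the water level theta
        theta = 0.5 * (lo + hi)
        if np.mean(np.minimum(theta, eigvals)) > D:
            hi = theta
        else:
            lo = theta
    theta = 0.5 * (lo + hi)
    return float(np.mean(0.5 * np.log2(np.maximum(eigvals / theta, 1.0))))

n = 64
k = np.arange(n)
r = 0.9 ** np.minimum(k, n - k)                   # hypothetical periodised autocovariance
eigvals = np.abs(np.fft.fft(r))                   # circulant-approximation eigenvalues
print("R(D=0.1) ≈", rdf_reverse_waterfilling(eigvals, 0.1), "bits/sample")
```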

  11. Derivation and characterisation of hESC lines from supernumerary embryos, experience from Odense, Denmark

    DEFF Research Database (Denmark)

    Harkness, Linda; Rasmussen, Iben Anne; Erb, Karin

    2010-01-01

    The derivation and characterisation of human embryonic stem cells provides a source of pluripotent stem cells with potential for clinical applications. Utilising locally sourced embryos from two IVF clinics, we derived and characterised five new cell lines for use in a non-clinical setting. Analy...

  12. On the derivation of approximations to cellular automata models and the assumption of independence.

    Science.gov (United States)

    Davies, K J; Green, J E F; Bean, N G; Binder, B J; Ross, J V

    2014-07-01

    Cellular automata are discrete agent-based models, generally used in cell-based applications. There is much interest in obtaining continuum models that describe the mean behaviour of the agents in these models. Previously, continuum models have been derived for agents undergoing motility and proliferation processes; however, these models only hold under restricted conditions. In order to narrow down the reason for these restrictions, we explore three possible sources of error in deriving the model. These sources are the choice of limiting arguments, the use of a discrete-time model as opposed to a continuous-time model, and the assumption of independence between the states of sites. We present a rigorous analysis in order to gain a greater understanding of the significance of these three issues. By finding a limiting regime that accurately approximates the conservation equation for the cellular automata, we are able to conclude that the inaccuracy between our approximation and the cellular automata is completely based on the assumption of independence. Copyright © 2014 Elsevier Inc. All rights reserved.
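
    As a hedged illustration of the kind of comparison the paper formalises, the sketch below runs a simple one-dimensional exclusion-process cellular automaton with motility and proliferation and compares the averaged lattice density with the mean-field logistic approximation that follows from assuming independence between sites. The update rules and rates are generic choices, not the specific model analysed in the paper.

```python
# Toy 1-D cellular automaton (exclusion process with proliferation) versus the
# mean-field logistic prediction obtained under the independence assumption.
import numpy as np

rng = np.random.default_rng(0)
L, steps, p_prolif, c0, realisations = 100, 100, 0.05, 0.05, 20

def ca_mean_density():
    densities = np.zeros(steps + 1)
    for _ in range(realisations):
        occ = rng.random(L) < c0
        densities[0] += occ.mean()
        for t in range(1, steps + 1):
            for i in rng.permutation(L):              # random sequential update
                if not occ[i]:
                    continue
                j = (i + rng.choice((-1, 1))) % L     # pick a random neighbour
                if not occ[j]:
                    if rng.random() < p_prolif:
                        occ[j] = True                 # proliferate into the empty neighbour
                    else:
                        occ[i], occ[j] = False, True  # otherwise move there
            densities[t] += occ.mean()
    return densities / realisations

t = np.arange(steps + 1)
logistic = c0 / (c0 + (1 - c0) * np.exp(-p_prolif * t))   # mean-field (independence) solution
ca = ca_mean_density()
print("density at t=100  CA:", round(ca[-1], 3), "  mean-field:", round(logistic[-1], 3))
```

    The gap between the two final densities is precisely the kind of discrepancy the paper attributes to the independence assumption.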

  13. Drinking Water Sources with Surface Intakes from LDHH source data, Geographic NAD83, LOSCO (1999) [drinking_water_surface_intakes_LDHH_1999

    Data.gov (United States)

    Louisiana Geographic Information Center — This is a point dataset for 87 public drinking water sources with surface intakes. It was derived from a larger statewide general drinking water source dataset...

  14. Vitamin B12-Containing Plant Food Sources for Vegetarians

    Science.gov (United States)

    Watanabe, Fumio; Yabuta, Yukinori; Bito, Tomohiro; Teng, Fei

    2014-01-01

    The usual dietary sources of Vitamin B12 are animal-derived foods, although a few plant-based foods contain substantial amounts of Vitamin B12. To prevent Vitamin B12 deficiency in high-risk populations such as vegetarians, it is necessary to identify plant-derived foods that contain high levels of Vitamin B12. A survey of naturally occurring plant-derived food sources with high Vitamin B12 contents suggested that dried purple laver (nori) is the most suitable Vitamin B12 source presently available for vegetarians. Furthermore, dried purple laver also contains high levels of other nutrients that are lacking in vegetarian diets, such as iron and n-3 polyunsaturated fatty acids. Dried purple laver is a natural plant product and it is suitable for most people in various vegetarian groups. PMID:24803097

  15. Vitamin B12-Containing Plant Food Sources for Vegetarians

    Directory of Open Access Journals (Sweden)

    Fumio Watanabe

    2014-05-01

    Full Text Available The usual dietary sources of Vitamin B12 are animal-derived foods, although a few plant-based foods contain substantial amounts of Vitamin B12. To prevent Vitamin B12 deficiency in high-risk populations such as vegetarians, it is necessary to identify plant-derived foods that contain high levels of Vitamin B12. A survey of naturally occurring plant-derived food sources with high Vitamin B12 contents suggested that dried purple laver (nori) is the most suitable Vitamin B12 source presently available for vegetarians. Furthermore, dried purple laver also contains high levels of other nutrients that are lacking in vegetarian diets, such as iron and n-3 polyunsaturated fatty acids. Dried purple laver is a natural plant product and it is suitable for most people in various vegetarian groups.

  16. The source regime for irradiation plant operated with fuel elements

    International Nuclear Information System (INIS)

    Suckow, W.

    1976-11-01

    The rapid and irregular decay of the gamma radiation from reactor fuel elements requires the establishment of an optimal source regime in order to utilise reactor fuel elements as radiation sources on a technological basis. Critical values have been derived which enable the determination of optimal conditions. In this context all technologically interesting types of source regimes have been examined. Methods to achieve a high gamma yield and a satisfactory dose consistency with time have been developed and important values for these two aspects have been derived. The conditions for optimal radiation source regimes are described in the final conclusions. (author)

  17. SIproc: an open-source biomedical data processing platform for large hyperspectral images.

    Science.gov (United States)

    Berisha, Sebastian; Chang, Shengyuan; Saki, Sam; Daeinejad, Davar; He, Ziqi; Mankar, Rupali; Mayerich, David

    2017-04-10

    There has recently been significant interest within the vibrational spectroscopy community to apply quantitative spectroscopic imaging techniques to histology and clinical diagnosis. However, many of the proposed methods require collecting spectroscopic images that have a similar region size and resolution to the corresponding histological images. Since spectroscopic images contain significantly more spectral samples than traditional histology, the resulting data sets can approach hundreds of gigabytes to terabytes in size. This makes them difficult to store and process, and the tools available to researchers for handling large spectroscopic data sets are limited. Fundamental mathematical tools, such as MATLAB, Octave, and SciPy, are extremely powerful but require that the data be stored in fast memory. This memory limitation becomes impractical for even modestly sized histological images, which can be hundreds of gigabytes in size. In this paper, we propose an open-source toolkit designed to perform out-of-core processing of hyperspectral images. By taking advantage of graphical processing unit (GPU) computing combined with adaptive data streaming, our software alleviates common workstation memory limitations while achieving better performance than existing applications.
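
    The paper's out-of-core strategy can be caricatured with numpy's memory-mapped arrays: the sketch below streams a hyperspectral cube from disk in fixed-size slabs and accumulates a per-band mean without ever loading the whole file into RAM. The file name, cube dimensions and chunk size are hypothetical, and SIproc itself uses GPU streaming rather than this simple CPU loop.

```python
# Chunked (out-of-core) computation of a per-band mean spectrum from a hyperspectral
# cube stored on disk, using a numpy memmap so only one slab is in RAM at a time.
import numpy as np

bands, rows, cols = 64, 512, 512                 # hypothetical (small) cube shape
path = "hyperspectral_cube.dat"                  # hypothetical raw float32 file

# Create a small demo file so the sketch is self-contained; a real cube would already exist.
demo = np.memmap(path, dtype=np.float32, mode="w+", shape=(bands, rows, cols))
demo[:] = np.linspace(0.0, 1.0, bands, dtype=np.float32)[:, None, None]
demo.flush()
del demo

cube = np.memmap(path, dtype=np.float32, mode="r", shape=(bands, rows, cols))
chunk = 64                                       # rows processed per pass
band_sum = np.zeros(bands, dtype=np.float64)
for r0 in range(0, rows, chunk):
    slab = np.asarray(cube[:, r0:r0 + chunk, :])  # only this slab is read into memory
    band_sum += slab.sum(axis=(1, 2))
band_mean = band_sum / (rows * cols)
print("per-band mean of first 5 bands:", band_mean[:5])
```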

  18. MANGROVE-DERIVED NUTRIENTS AND CORAL REEFS

    Science.gov (United States)

    Understanding the consequences of the declining global cover of mangroves due to anthropogenic disturbance necessitates consideration of how mangrove-derived nutrients contribute to threatened coral reef systems. We sampled potential sources of organic matter and a suite of sessi...

  19. Advanced monitoring with complex stream processing

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Making sense of metrics and logs for service monitoring can be a complicated task. Valuable information is normally scattered across several streams of monitoring data, requiring aggregation, correlation and time-based analysis to promptly detect problems and failures. This presentation shows a solution which is used to support the advanced monitoring of the messaging services provided by the IT Department. It uses Esper, an open-source software product for Complex Event Processing (CEP), which analyses series of events in order to derive conclusions from them.

  20. Modular High Temperature Gas-Cooled Reactor heat source for coal conversion

    International Nuclear Information System (INIS)

    Schleicher, R.W. Jr.; Lewis, A.C.

    1992-09-01

    In the industrial nations, transportable fuels in the form of natural gas and petroleum derivatives constitute a primary energy source nearly equivalent to that consumed for generating electric power. Nations with large coal deposits have the option of coal conversion to meet their transportable fuel demands. But these processes themselves consume huge amounts of energy and produce undesirable combustion by-products. Therefore, this represents a major opportunity to apply nuclear energy, for both environmental and energy-conservation reasons. Because the most desirable coal conversion processes take place at 800°C or higher, only High Temperature Gas-Cooled Reactors (HTGRs) have the potential to be adapted to coal conversion processes. This report provides a discussion of this utilization of HTGR reactors.

  1. The function of advanced treatment process in a drinking water treatment plant with organic matter-polluted source water.

    Science.gov (United States)

    Lin, Huirong; Zhang, Shuting; Zhang, Shenghua; Lin, Wenfang; Yu, Xin

    2017-04-01

    To understand the relationship between chemical and microbial treatment at each treatment step, as well as the relationship between microbial community structure in biofilms in biofilters and their ecological functions, a drinking water plant with severely organic-matter-polluted source water was investigated. The bacterial community dynamics of two drinking water supply systems (traditional and advanced treatment processes) in this plant were studied from the source to the product water. Analysis by 454 pyrosequencing was conducted to characterize the bacterial diversity in each step of the treatment processes. The bacterial communities in these two treatment processes were highly diverse. Proteobacteria, mainly beta-proteobacteria, was the dominant phylum. The two treatment processes used in the plant could effectively remove organic pollutants and microbial pollution, especially the advanced treatment process. Significant differences in the detection of the major groups were observed in the product water samples of the two treatment processes. The treatment processes, particularly the biological pretreatment and O3-biological activated carbon steps in the advanced treatment process, strongly influenced the microbial community composition and the water quality. Some opportunistic pathogens were found in the water. Nitrogen-related microorganisms found in the biofilm of the filters may play an important role in shaping the microbial community composition and improving water quality.
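
    As a small illustration of how community diversity is typically summarised from pyrosequencing output of the kind described above, the sketch below computes relative abundances and the Shannon diversity index from a hypothetical OTU count table; the counts are invented and the study's actual analysis pipeline is not reproduced here.

```python
# Shannon diversity index from a hypothetical OTU count table (one sample per column).
import numpy as np

otu_counts = np.array([
    [120, 40],    # OTU 1 counts in (source water, finished water)
    [ 80, 10],
    [ 30,  5],
    [  5,  1],
], dtype=float)

p = otu_counts / otu_counts.sum(axis=0)                   # relative abundances per sample
shannon = -(np.where(p > 0, p * np.log(p), 0.0)).sum(axis=0)
print("Shannon H' (source, finished):", shannon)
```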

  2. Palynofacies characterization for hydrocarbon source rock ...

    Indian Academy of Sciences (India)

    source rock potential of the Subathu Formation in the area. Petroleum geologists are well aware of the fact that dispersed organic matter, derived either from marine or non-marine sediments, on reaching its maturation level over an extended period of time contributes as source material for the production of hydrocarbons.

  3. Preliminary study of reasonableness of important parameters used in deriving OILs for PWR accidents

    International Nuclear Information System (INIS)

    Yongsheng, L.; Shongqi, S.

    2004-01-01

    Institute of Nuclear Energy Technology, Tsinghua University, Beijing 100084, China. This paper first introduces the definition of the operational intervention level (OIL) and the derivation process of the default OILs recommended by the IAEA. It then focuses on the reasonableness of two parameters, R1 and R2, which are assumed in the derivation of the default OIL1 and OIL2 for a reactor accident. The values of R1 and R2 were calculated with the InterRas program. The source terms used in the calculations include the PWR accidents described in WASH-1400 and French severe-accident source terms, and the meteorological conditions are classified into three classes (D, A and F stability, with a mixing height of 400 metres and 4-hour exposure to the plume), with corresponding wind speeds of 3 m/s, 2 m/s and 1 m/s. The results show that the average values of R1 and R2 derived with InterRas for the same accident series under different meteorological conditions are close to the presumed values, which supports the reasonableness of the default OIL1 and OIL2. On the other hand, the results for individual accidents differ considerably from the presumed values at different distances and under different meteorological conditions, although the trends with distance and meteorology are regular. The default OILs recommended by the IAEA are therefore applicable only under certain specified conditions. Finally, the paper presents a method for revising the default OILs on the basis of measurement results. (Author)

  4. One-Step Synthesis of Microporous Carbon Monoliths Derived from Biomass with High Nitrogen Doping Content for Highly Selective CO2 Capture

    OpenAIRE

    Geng, Zhen; Xiao, Qiangfeng; Lv, Hong; Li, Bing; Wu, Haobin; Lu, Yunfeng; Zhang, Cunman

    2016-01-01

    A high-efficiency, one-step method for synthesizing nitrogen-doped microporous carbon monoliths from biomass is developed using a novel ammonia (NH3)-assisted activation process, in which NH3 serves as both activating agent and nitrogen source. Pore formation and nitrogen doping proceed simultaneously during the process, which is clearly superior to conventional chemical activation. The as-prepared nitrogen-doped active carbons exhibit rich micropores with high surface area and high nitrog...

  5. Silica-enriched mantle sources of subalkaline picrite-boninite-andesite island arc magmas

    Science.gov (United States)

    Bénard, A.; Arculus, R. J.; Nebel, O.; Ionov, D. A.; McAlpine, S. R. B.

    2017-02-01

    Primary arc melts may form through fluxed or adiabatic decompression melting in the mantle wedge, or via a combination of both processes. Major limitations to our understanding of the formation of primary arc melts stem from the fact that most arc lavas are aggregated blends of individual magma batches, further modified by differentiation processes in the sub-arc mantle lithosphere and overlying crust. Primary melt generation is thus masked by these types of second-stage processes. Magma-hosted peridotites sampled as xenoliths in subduction zone magmas are possible remnants of sub-arc mantle and magma generation processes, but are rarely sampled in active arcs. Published studies have emphasised the predominantly harzburgitic lithologies with particularly high modal orthopyroxene in these xenoliths; the former characteristic reflects the refractory nature of these materials consequent to extensive melt depletion of a lherzolitic protolith whereas the latter feature requires additional explanation. Here we present major and minor element data for pristine, mantle-derived, lava-hosted spinel-bearing harzburgite and dunite xenoliths and associated primitive melts from the active Kamchatka and Bismarck arcs. We show that these peridotite suites, and other mantle xenoliths sampled in circum-Pacific arcs, are a distinctive peridotite type not found in other tectonic settings, and are melting residues from hydrous melting of silica-enriched mantle sources. We explore the ability of experimental studies allied with mantle melting parameterisations (pMELTS, Petrolog3) to reproduce the compositions of these arc peridotites, and present a protolith ('hybrid mantle wedge') composition that satisfies the available constraints. The composition of peridotite xenoliths recovered from erupted arc magmas plausibly requires their formation initially via interaction of slab-derived components with refractory mantle prior to or during the formation of primary arc melts. The liquid

  6. BioPepDB: an integrated data platform for food-derived bioactive peptides.

    Science.gov (United States)

    Li, Qilin; Zhang, Chao; Chen, Hongjun; Xue, Jitong; Guo, Xiaolei; Liang, Ming; Chen, Ming

    2018-03-12

    Food-derived bioactive peptides play critical roles in regulating most biological processes and have considerable biological, medical and industrial importance. However, a large body of active-peptide data, including sequences, functions, sources, commercial product information and references, remains poorly integrated. BioPepDB is a searchable database of food-derived bioactive peptides and their related articles, comprising more than four thousand bioactive peptide entries. Moreover, BioPepDB provides modules for prediction and hydrolysis-simulation to support the discovery of novel peptides. It can serve as a reference database for investigating the function of different bioactive peptides. BioPepDB is available at http://bis.zju.edu.cn/biopepdbr/ . The web page utilises Apache, PHP5 and MySQL to provide the user interface for accessing the database and predicting novel peptides. The database itself is operated on a specialised server.
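
    BioPepDB's own hydrolysis-simulation module is not reproduced here; as a hedged illustration of the general idea, the sketch below performs an in-silico tryptic digest of a protein sequence using the textbook rule (cleave after K or R unless followed by P) and lists the resulting candidate peptides. The sequence is arbitrary and the rule set is far simpler than what a real tool would use.

```python
# In-silico tryptic digestion: cleave after K or R, except when the next residue is P.
def tryptic_digest(sequence):
    peptides, start = [], 0
    for i, aa in enumerate(sequence):
        next_aa = sequence[i + 1] if i + 1 < len(sequence) else ""
        if aa in "KR" and next_aa != "P":
            peptides.append(sequence[start:i + 1])
            start = i + 1
    if start < len(sequence):
        peptides.append(sequence[start:])
    return peptides

if __name__ == "__main__":
    seq = "MKWVTFISLLLLFSSAYSRGVFRRDTHKSEIAHRFKDLGE"   # arbitrary example sequence
    for pep in tryptic_digest(seq):
        print(pep)
```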

  7. VMStools: Open-source software for the processing, analysis and visualisation of fisheries logbook and VMS data

    NARCIS (Netherlands)

    Hintzen, N.T.; Bastardie, F.; Beare, D.J.; Piet, G.J.; Ulrich, C.; Deporte, N.; Egekvist, J.; Degel, H.

    2012-01-01

    VMStools is a package of open-source software, built using the freeware environment R, specifically developed for the processing, analysis and visualisation of landings (logbooks) and vessel location data (VMS) from commercial fisheries. Analyses start with standardized data formats for logbook

  8. Impedance Source Power Electronic Converters

    DEFF Research Database (Denmark)

    Liu, Yushan; Abu-Rub, Haitham; Ge, Baoming

    Impedance Source Power Electronic Converters brings together state of the art knowledge and cutting edge techniques in various stages of research related to the ever more popular impedance source converters/inverters. Significant research efforts are underway to develop commercially viable...... and technically feasible, efficient and reliable power converters for renewable energy, electric transportation and for various industrial applications. This book provides a detailed understanding of the concepts, designs, controls, and application demonstrations of the impedance source converters/inverters. Key...... features: Comprehensive analysis of the impedance source converter/inverter topologies, including typical topologies and derived topologies. Fully explains the design and control techniques of impedance source converters/inverters, including hardware design and control parameter design for corresponding...
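
    For readers unfamiliar with why these topologies are called "impedance source", the classic Z-source inverter relation below (standard in the literature, not quoted from this book) shows how the shoot-through duty ratio D_0 boosts the peak dc-link voltage above the input voltage:

```latex
% Peak dc-link voltage of a Z-source inverter with shoot-through duty ratio D_0 = T_0/T:
\hat{v}_{dc} = B\,V_{in}, \qquad
B = \frac{T}{T - 2T_0} = \frac{1}{1 - 2D_0}, \qquad 0 \le D_0 < \tfrac{1}{2}.
```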

  9. High speed imaging of dynamic processes with a switched source x-ray CT system

    International Nuclear Information System (INIS)

    Thompson, William M; Lionheart, William R B; Morton, Edward J; Cunningham, Mike; Luggar, Russell D

    2015-01-01

    Conventional x-ray computed tomography (CT) scanners are limited in their scanning speed by the mechanical constraints of their rotating gantries and as such do not provide the necessary temporal resolution for imaging of fast-moving dynamic processes, such as moving fluid flows. The Real Time Tomography (RTT) system is a family of fast cone beam CT scanners which instead use multiple fixed discrete sources and complete rings of detectors in an offset geometry. We demonstrate the potential of this system for use in the imaging of such high speed dynamic processes and give results using simulated and real experimental data. The unusual scanning geometry results in some challenges in image reconstruction, which are overcome using algebraic iterative reconstruction techniques and explicit regularisation. Through the use of a simple temporal regularisation term and by optimising the source firing pattern, we show that temporal resolution of the system may be increased at the expense of spatial resolution, which may be advantageous in some situations. Results are given showing temporal resolution of approximately 500 µs with simulated data and 3 ms with real experimental data. (paper)
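
    The reconstruction approach described above can be sketched in miniature: the code below solves a tiny two-frame tomography-like least-squares problem by gradient descent with a Tikhonov penalty on the frame-to-frame difference, the same flavour of temporal regularisation mentioned in the abstract. The system matrix, frame data and regularisation weight are synthetic toy values, not the RTT geometry.

```python
# Toy algebraic reconstruction of two time frames with a temporal smoothness penalty:
# minimise sum_t ||A x_t - b_t||^2 + lam * ||x_2 - x_1||^2 by gradient descent.
import numpy as np

rng = np.random.default_rng(1)
m, n = 40, 25
A = rng.standard_normal((m, n)) / np.sqrt(m)       # synthetic projection matrix
x_true = np.stack([np.ones(n), np.ones(n)])
x_true[1, :5] += 1.0                               # small change between frames
b = x_true @ A.T + 0.01 * rng.standard_normal((2, m))

lam, step = 0.5, 0.2
x = np.zeros((2, n))
for _ in range(2000):
    grad = (x @ A.T - b) @ A                       # data-fidelity gradient per frame
    grad[0] += lam * (x[0] - x[1])                 # temporal regularisation
    grad[1] += lam * (x[1] - x[0])
    x -= step * grad
print("reconstruction error per frame:", np.linalg.norm(x - x_true, axis=1))
```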

  10. Derivative pricing based on local utility maximization

    OpenAIRE

    Jan Kallsen

    2002-01-01

    This paper discusses a new approach to contingent claim valuation in general incomplete market models. We determine the neutral derivative price which occurs if investors maximize their local utility and if derivative demand and supply are balanced. We also introduce the sensitivity process of a contingent claim. This process quantifies the reliability of the neutral derivative price and it can be used to construct price bounds. Moreover, it allows to calibrate market models in order to be co...

  11. High-Energy Compton Scattering Light Sources

    CERN Document Server

    Hartemann, Fred V; Barty, C; Crane, John; Gibson, David J; Hartouni, E P; Tremaine, Aaron M

    2005-01-01

    No monochromatic, high-brightness, tunable light sources currently exist above 100 keV. Important applications that would benefit from such new hard x-ray sources include: nuclear resonance fluorescence spectroscopy, time-resolved positron annihilation spectroscopy, and MeV flash radiography. The peak brightness of Compton scattering light sources is derived for head-on collisions and found to scale with the electron beam brightness and the drive laser pulse energy. This gamma 2

  12. Instantaneous and Frequency-Warped Signal Processing Techniques for Auditory Source Separation.

    Science.gov (United States)

    Wang, Avery Li-Chun

    This thesis summarizes several contributions to the areas of signal processing and auditory source separation. The philosophy of Frequency-Warped Signal Processing is introduced as a means for separating the AM and FM contributions to the bandwidth of a complex-valued, frequency-varying sinusoid p(n), transforming it into a signal with slowly-varying parameters. This transformation facilitates the removal of p(n) from an additive mixture while minimizing the amount of damage done to other signal components. The average winding rate of a complex-valued phasor is explored as an estimate of the instantaneous frequency. Theorems are provided showing the robustness of this measure. To implement frequency tracking, a Frequency-Locked Loop algorithm is introduced which uses the complex winding error to update its frequency estimate. The input signal is dynamically demodulated and filtered to extract the envelope. This envelope may then be remodulated to reconstruct the target partial, which may be subtracted from the original signal mixture to yield a new, quickly-adapting form of notch filtering. Enhancements to the basic tracker are made which, under certain conditions, attain the Cramér-Rao bound for the instantaneous frequency estimate. To improve tracking, the novel idea of Harmonic-Locked Loop tracking, using N harmonically constrained trackers, is introduced for tracking signals, such as voices and certain musical instruments. The estimated fundamental frequency is computed from a maximum-likelihood weighting of the N tracking estimates, making it highly robust. The result is that harmonic signals, such as voices, can be isolated from complex mixtures in the presence of other spectrally overlapping signals. Additionally, since phase information is preserved, the resynthesized harmonic signals may be removed from the original mixtures with relatively little damage to the residual signal. Finally, a new methodology is given for designing linear-phase FIR filters
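
    A minimal sketch of the winding-rate idea: the instantaneous frequency of a complex sinusoid can be read off as the angle between consecutive samples, and a first-order loop can smooth and track it. The signal, loop gain and sample rate below are arbitrary choices for illustration and do not reproduce the thesis's full tracker.

```python
# Frequency tracking from the complex "winding rate": the angle of x[n]*conj(x[n-1])
# is the per-sample phase advance; a first-order loop smooths it into a frequency estimate.
import numpy as np

fs = 8000.0
n = np.arange(4000)
f_true = 440.0 + 40.0 * np.sin(2 * np.pi * 0.5 * n / fs)        # slowly varying frequency (Hz)
phase = 2 * np.pi * np.cumsum(f_true) / fs
rng = np.random.default_rng(2)
x = np.exp(1j * phase) + 0.05 * (rng.standard_normal(len(n)) + 1j * rng.standard_normal(len(n)))

mu, f_est = 0.05, 400.0                                          # loop gain, initial guess (Hz)
estimates = []
for k in range(1, len(x)):
    winding = np.angle(x[k] * np.conj(x[k - 1])) * fs / (2 * np.pi)  # instantaneous estimate
    f_est += mu * (winding - f_est)                                  # first-order loop update
    estimates.append(f_est)
print("final estimate (Hz):", round(estimates[-1], 1), " true (Hz):", round(f_true[-1], 1))
```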

  13. Systems biology derived source-sink mechanism of BMP gradient formation.

    Science.gov (United States)

    Zinski, Joseph; Bu, Ye; Wang, Xu; Dou, Wei; Umulis, David; Mullins, Mary C

    2017-08-09

    A morphogen gradient of Bone Morphogenetic Protein (BMP) signaling patterns the dorsoventral embryonic axis of vertebrates and invertebrates. The prevailing view in vertebrates for BMP gradient formation is through a counter-gradient of BMP antagonists, often along with ligand shuttling to generate peak signaling levels. To delineate the mechanism in zebrafish, we precisely quantified the BMP activity gradient in wild-type and mutant embryos and combined these data with a mathematical model-based computational screen to test hypotheses for gradient formation. Our analysis ruled out a BMP shuttling mechanism and a bmp transcriptionally-informed gradient mechanism. Surprisingly, rather than supporting a counter-gradient mechanism, our analyses support a fourth model, a source-sink mechanism, which relies on a restricted BMP antagonist distribution acting as a sink that drives BMP flux dorsally and gradient formation. We measured Bmp2 diffusion and found that it supports the source-sink model, suggesting a new mechanism to shape BMP gradients during development.
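
    A hedged, one-dimensional caricature of a source-sink morphogen gradient: ligand is produced on one side of the domain, diffuses, and is absorbed by an antagonist "sink" restricted to the other side. The parameter values and boundary conditions are arbitrary illustrative choices, not the calibrated zebrafish model from the paper.

```python
# 1-D source-sink gradient: dB/dt = D d2B/dx2 + s(x) - k(x) B, explicit finite differences.
import numpy as np

L, nx, D = 100.0, 201, 5.0                   # domain length (um), grid points, diffusivity
x = np.linspace(0.0, L, nx)
dx = x[1] - x[0]
dt = 0.2 * dx**2 / D                         # stable explicit time step

s = np.where(x < 20.0, 1.0, 0.0)             # production restricted to the "ventral" end
k = np.where(x > 80.0, 0.5, 0.01)            # strong degradation (sink) at the "dorsal" end

B = np.zeros(nx)
for _ in range(100000):
    lap = np.zeros(nx)
    lap[1:-1] = (B[2:] - 2.0 * B[1:-1] + B[:-2]) / dx**2
    lap[0] = 2.0 * (B[1] - B[0]) / dx**2      # no-flux boundaries (ghost-node form)
    lap[-1] = 2.0 * (B[-2] - B[-1]) / dx**2
    B = B + dt * (D * lap + s - k * B)
print("steady-state ratio B(ventral)/B(dorsal):", round(B[0] / B[-1], 1))
```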

  14. Study of temperature distribution of pipes heated by moving rectangular gauss distribution heat source. Development of pipe outer surface irradiated laser stress improvement process (L-SIP)

    International Nuclear Information System (INIS)

    Ohta, Takahiro; Kamo, Kazuhiko; Asada, Seiji; Terasaki, Toshio

    2009-01-01

    The new process called L-SIP (outer surface irradiated Laser Stress Improvement Process) is developed to convert the tensile residual stress near the inner surface of butt-welded pipe joints into compressive stress. A temperature gradient forms through the pipe thickness when the outer surface is heated rapidly by a laser beam; the thermal expansion difference between the inner and outer surfaces then produces compressive stress near the inner surface of the pipe. In this paper, the theoretical equation for the temperature distribution of pipes heated by a moving rectangular Gaussian heat source on the outer surface is derived. The temperature histories of pipes calculated by the theoretical equation agree well with FEM analysis results. According to the theoretical equation, the controlling parameters of the temperature distributions and histories are q/(2a_y), vh, a_x/h and a_y/h, where q is the total heat input, a_y is the heat source length in the axial direction, a_x is the Gaussian radius of the heat source in the hoop direction, v is the moving velocity, and h is the thickness of the pipe. The essential variables for L-SIP, which are defined on the basis of the measured temperature histories on the outer surface of the pipe, are Tmax, F_0 = kτ_0/h², vh, W_Q and L_Q, where Tmax is the maximum temperature at the monitor point on the outer surface, k is the thermal diffusivity coefficient, τ_0 is the temperature rise time from 100°C to the maximum temperature at the monitor point, W_Q is τ_0 × v, and L_Q is the uniform-temperature length in the axial direction. It is verified that the essential variables for L-SIP match the controlling parameters of the theoretical equation. (author)
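
    As an illustration of how the essential variables defined above could be extracted from a measured outer-surface temperature history, the sketch below computes Tmax, τ_0, F_0 = kτ_0/h² and W_Q = τ_0·v from a synthetic trace. The thermal properties, wall thickness and traverse speed are placeholder numbers, not values from the paper.

```python
# Extract L-SIP "essential variables" from a (synthetic) outer-surface temperature history.
import numpy as np

t = np.linspace(0.0, 30.0, 3001)                      # time (s)
T = 20.0 + 600.0 * np.exp(-((t - 12.0) / 4.0) ** 2)   # synthetic monitor-point temperature (degC)

k = 4.0e-6        # thermal diffusivity (m^2/s), placeholder
h = 0.01          # wall thickness (m), placeholder
v = 0.002         # traverse speed (m/s), placeholder

i_max = int(np.argmax(T))
Tmax = T[i_max]
above_100 = np.nonzero(T[:i_max] >= 100.0)[0]         # first crossing of 100 degC before the peak
tau0 = t[i_max] - t[above_100[0]]
F0 = k * tau0 / h**2
WQ = tau0 * v
print(f"Tmax = {Tmax:.0f} degC, tau0 = {tau0:.2f} s, F0 = {F0:.3f}, W_Q = {WQ * 1000:.1f} mm")
```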

  15. Downstream mixing of sediment and tracers in agricultural catchments: Evidence of changing sediment sources and fluvial processes?

    Science.gov (United States)

    Ralph, Timothy; Wethered, Adam; Smith, Hugh; Heijnis, Henk

    2014-05-01

    Land clearance, soil tillage and grazing in agricultural catchments have liberated sediment and altered hydrological connectivity between hillslopes and channels, leading to increased sediment availability, mobilisation and delivery to rivers. The type and amount of sediment supplied to rivers is critical for fluvial geomorphology and aquatic ecosystem health. Contemporary sediment dynamics are routinely investigated using environmental radionuclides such as caesium-137 (Cs-137) and excess lead-210 (Pb-210ex), which can provide information regarding sediment source types and fluvial processes if sediment sources can be distinguished from one another and mixing models applied to representative samples. However, downstream transport, mixing and dilution of radionuclide-labelled sediment (especially from sources with low initial concentrations) can obliterate the tracer signal; sometimes before anything of geomorphological importance happens in the catchment. Can these findings be used as evidence of sediment source variations and fluvial processes when the limits of detection (of Cs-137 in particular) are being exceeded so rapidly downstream? Sediment sources and downstream sediment dynamics were investigated in Coolbaggie Creek, a major supplier of sediment to the Macquarie River in an agricultural catchment with temperate to semi-arid climate in Australia. Radionuclides were used to discriminate between the banks and gullies (Cs-137 1.45 +/- 0.47 Bq/kg; Pb-210ex 4.67 +/- 1.93 Bq/kg). Within the trunk stream, suspended sediment, organic matter and Cs-137 and Pb-210ex concentrations declined downstream. Results from a mixing model suggest that agricultural topsoils account for 95% of fine sediment entering the channel in the upper reach (200 m2) downstream, with channel expansion and gullies contributing fine sediment to the system. A lack of topsoil being supplied to the channel suggests minimal lateral connectivity between the catchment and the trunk stream in all
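
    As a minimal illustration of the two-source mixing calculation behind such apportionment figures, the sketch below estimates the topsoil proportion of a sediment mixture from Cs-137 and Pb-210ex activities by constrained least squares. The channel bank/gully end-member values are those quoted in the abstract; the topsoil end-member and mixture values are hypothetical placeholders.

```python
# Two end-member unmixing: find the fraction f of "topsoil" source that best explains
# the tracer activities of a sediment mixture (least squares, constrained to [0, 1]).
import numpy as np

topsoil = np.array([9.0, 60.0])     # hypothetical Cs-137, Pb-210ex activities (Bq/kg)
banks   = np.array([1.45, 4.67])    # channel bank/gully values quoted in the abstract (Bq/kg)
mixture = np.array([8.2, 52.0])     # hypothetical suspended-sediment sample (Bq/kg)

d = topsoil - banks
f = float(np.clip(np.dot(mixture - banks, d) / np.dot(d, d), 0.0, 1.0))
print(f"estimated topsoil contribution: {100 * f:.0f}%")
```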

  16. Power source roadmaps using bibliometrics and database tomography

    International Nuclear Information System (INIS)

    Kostoff, R.N.; Tshiteya, R.; Pfeil, K.M.; Humenik, J.A.; Karypis, G.

    2005-01-01

    Database Tomography (DT) is a textual database analysis system consisting of two major components: (1) algorithms for extracting multi-word phrase frequencies and phrase proximities (physical closeness of the multi-word technical phrases) from any type of large textual database, to augment (2) interpretative capabilities of the expert human analyst. DT was used to derive technical intelligence from a Power Sources database derived from the Science Citation Index. Phrase frequency analysis by the technical domain experts provided the pervasive technical themes of the Power Sources database, and the phrase proximity analysis provided the relationships among the pervasive technical themes. Bibliometric analysis of the Power Sources literature supplemented the DT results with author/journal/institution/country publication and citation data
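
    The phrase-frequency component of Database Tomography can be caricatured in a few lines: count multi-word (here, two-word) phrases across a set of abstracts and report the most frequent ones. The toy corpus below is invented; the real system also computes phrase proximity and works over much larger bibliographic databases.

```python
# Toy multi-word phrase (bigram) frequency extraction, a simplified Database Tomography step.
import re
from collections import Counter

abstracts = [
    "lithium ion battery cathode materials for high energy density",
    "solid oxide fuel cell performance and degradation",
    "lithium ion battery anode materials and energy density limits",
]

bigrams = Counter()
for text in abstracts:
    words = re.findall(r"[a-z]+", text.lower())
    bigrams.update(zip(words, words[1:]))

for (w1, w2), count in bigrams.most_common(5):
    print(f"{w1} {w2}: {count}")
```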

  17. Understanding Hydrological Processes in Variable Source Areas in the Glaciated Northeastern US Watersheds under Variable Climate Conditions

    Science.gov (United States)

    Steenhuis, T. S.; Azzaino, Z.; Hoang, L.; Pacenka, S.; Worqlul, A. W.; Mukundan, R.; Stoof, C.; Owens, E. M.; Richards, B. K.

    2017-12-01

    The New York City source watersheds in the Catskill Mountains' humid, temperate climate have long-term hydrological and water quality monitoring data. The area is one of the few catchments where implementation of source and landscape management practices has led to decreased phosphorus concentration in the receiving surface waters. One of the reasons is that landscape measures correctly targeted the saturated variable source runoff areas (VSA) in the valley bottoms as the location where most of the runoff and other nonpoint pollutants originated. Measures targeting these areas were instrumental in lowering phosphorus concentration. Further improvements in water quality can be made based on a better understanding of the flow processes and water table fluctuations in the VSA. For that reason, we instrumented a self-contained upland variable source watershed with a landscape characteristic of a soil underlain by glacial till at shallow depth, similar to the Catskill watersheds. In this presentation, we will discuss our experimental findings and present a mathematical model. Variable source areas have a small slope, making gravity the driving force for the flow and greatly simplifying the simulation of the flow processes. The experimental data and the model simulations agreed for both outflow and water table fluctuations. We found that while the flows to the outlet were similar throughout the year, the discharge of the VSA varies greatly. This was due to transpiration by the plants, which became active when soil temperatures were above 10°C. We found that shortly after the temperature increased above 10°C the baseflow stopped and only surface runoff occurred when rainstorms exceeded the storage capacity of the soil in at least a portion of the variable source area. Since plant growth in the variable source area was a major variable determining the base flow behavior, changes in temperature in the future - affecting the duration of the growing season - will affect baseflow and

  18. The Impact of Processing Instruction on the Recognition and Production of English Derivational Affixes Among EFL Learners

    Directory of Open Access Journals (Sweden)

    Sasan Baleghizadeh

    2014-10-01

    Full Text Available In this study, we investigated the effectiveness of processing instruction (PI) as opposed to traditional deductive exercise-based intervention (TI) in teaching English derivational affixes. There was also a comparison non-intervention (NI) group, and the groups were posttested. To teach the target affixes via PI, new structured input tasks were developed. In all, 101 adult male and female lower-intermediate participants initially took part in the study, but this number was later reduced to 71 as a result of the pretest, and so on. The results were analyzed through MANOVA and paired t tests. In recognition, PI and TI outperformed non-intervention, while PI and TI did not outperform one another. In production, TI outperformed the other groups, while the other groups did not outperform one another. More studies must be carried out before drawing any conclusions about the transferability of PI to output activities for teaching derivational affixes. The students were also interviewed to survey their attitudes. Affectively, PI created self-confidence and an enjoyable atmosphere among the learners. Cognitively, PI was the only group that satisfied the participants in their ability to recognize and produce derivational affixes. We found PI a highly effective and positive approach for teaching the recognition of derivational affixes. We also believe it possesses high potential for teaching their production, as it gave the participants a good sense of self-confidence in producing the affixes.

  19. Leukocyte- and endothelial-derived microparticles: a circulating source for fibrinolysis

    Science.gov (United States)

    Lacroix, Romaric; Plawinski, Laurent; Robert, Stéphane; Doeuvre, Loïc; Sabatier, Florence; Martinez de Lizarrondo, Sara; Mezzapesa, Anna; Anfosso, Francine; Leroyer, Aurelie S.; Poullin, Pascale; Jourde, Noémie; Njock, Makon-Sébastien; Boulanger, Chantal M.; Anglés-Cano, Eduardo; Dignat-George, Françoise

    2012-01-01

    Background We recently assigned a new fibrinolytic function to cell-derived microparticles in vitro. In this study we explored the relevance of this novel property of microparticles to the in vivo situation. Design and Methods Circulating microparticles were isolated from the plasma of patients with thrombotic thrombocytopenic purpura or cardiovascular disease and from healthy subjects. Microparticles were also obtained from purified human blood cell subpopulations. The plasminogen activators on microparticles were identified by flow cytometry and enzyme-linked immunosorbent assays; their capacity to generate plasmin was quantified with a chromogenic assay and their fibrinolytic activity was determined by zymography. Results Circulating microparticles isolated from patients generate a range of plasmin activity at their surface. This property was related to a variable content of urokinase-type plasminogen activator and/or tissue plasminogen activator. Using distinct microparticle subpopulations, we demonstrated that plasmin is generated on endothelial and leukocyte microparticles, but not on microparticles of platelet or erythrocyte origin. Leukocyte-derived microparticles bear urokinase-type plasminogen activator and its receptor whereas endothelial microparticles carry tissue plasminogen activator and tissue plasminogen activator/inhibitor complexes. Conclusions Endothelial and leukocyte microparticles, bearing respectively tissue plasminogen activator or urokinase-type plasminogen activator, support a part of the fibrinolytic activity in the circulation which is modulated in pathological settings. Awareness of this blood-borne fibrinolytic activity conveyed by microparticles provides a more comprehensive view of the role of microparticles in the hemostatic equilibrium. PMID:22733025

  20. Application of hierarchical Bayesian unmixing models in river sediment source apportionment

    Science.gov (United States)

    Blake, Will; Smith, Hugh; Navas, Ana; Bodé, Samuel; Goddard, Rupert; Zou Kuzyk, Zou; Lennard, Amy; Lobb, David; Owens, Phil; Palazon, Leticia; Petticrew, Ellen; Gaspar, Leticia; Stock, Brian; Boeckx, Pacsal; Semmens, Brice

    2016-04-01

    Fingerprinting and unmixing concepts are used widely across environmental disciplines for forensic evaluation of pollutant sources. In aquatic and marine systems, this includes tracking the source of organic and inorganic pollutants in water and linking problem sediment to soil erosion and land use sources. It is, however, the particular complexity of ecological systems that has driven creation of the most sophisticated mixing models, primarily to (i) evaluate diet composition in complex ecological food webs, (ii) inform population structure and (iii) explore animal movement. In the context of the new hierarchical Bayesian unmixing model, MixSIAR, developed to characterise intra-population niche variation in ecological systems, we evaluate the linkage between the ecological 'prey' and 'consumer' concepts and river basin sediment 'source' and sediment 'mixture' concepts to exemplify the value of ecological modelling tools to river basin science. Recent studies have outlined advantages presented by Bayesian unmixing approaches in handling complex source and mixture datasets while dealing appropriately with uncertainty in parameter probability distributions. MixSIAR is unique in that it allows individual fixed and random effects associated with mixture hierarchy, i.e. factors that might exert an influence on model outcome for mixture groups, to be explored within the source-receptor framework. This offers new and powerful ways of interpreting river basin apportionment data. In this contribution, key components of the model are evaluated in the context of common experimental designs for sediment fingerprinting studies, namely simple, nested and distributed catchment sampling programmes. Illustrative examples using geochemical and compound-specific stable isotope datasets are presented and used to discuss best practice with specific attention to (1) the tracer selection process, (2) incorporation of fixed effects relating to sample timeframe and sediment type in the modelling
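
    MixSIAR itself is an R/JAGS package and is not reproduced here; as a hedged sketch of the underlying Bayesian idea, the code below computes a posterior distribution over the proportion of one of two sediment sources from a single tracer, given Gaussian source summaries, on a simple grid. All numbers are hypothetical, and real applications use multiple tracers, hierarchical structure and proper MCMC.

```python
# Grid-based Bayesian estimate of a two-source mixing proportion from one tracer.
import numpy as np

src_mean  = np.array([10.0, 2.0])    # hypothetical tracer means for source A and source B
src_sd    = np.array([1.5, 0.8])     # hypothetical tracer standard deviations
mix_value = 7.0                      # hypothetical tracer value measured in the mixture

f = np.linspace(0.0, 1.0, 1001)                               # proportion of source A
pred_mean = f * src_mean[0] + (1 - f) * src_mean[1]
pred_sd = np.sqrt((f * src_sd[0])**2 + ((1 - f) * src_sd[1])**2)
log_like = -0.5 * ((mix_value - pred_mean) / pred_sd)**2 - np.log(pred_sd)
post = np.exp(log_like - log_like.max())
post /= post.sum()                                            # flat prior on f

mean_f = float(np.sum(f * post))
ci = f[np.searchsorted(np.cumsum(post), [0.025, 0.975])]
print(f"posterior mean proportion of source A: {mean_f:.2f}, 95% interval ≈ [{ci[0]:.2f}, {ci[1]:.2f}]")
```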

  1. Observational evidence of competing source, loss, and transport processes for relativistic electrons in Earth's outer radiation belt

    Science.gov (United States)

    Turner, Drew; Mann, Ian; Usanova, Maria; Rodriguez, Juan; Henderson, Mike; Angelopoulos, Vassilis; Morley, Steven; Claudepierre, Seth; Li, Wen; Kellerman, Adam; Boyd, Alexander; Kim, Kyung-Chan

    Earth’s outer electron radiation belt is a region of extreme variability, with relativistic electron intensities changing by orders of magnitude over time scales ranging from minutes to years. Extreme variations of outer belt electrons ultimately result from the relative impacts of various competing source (and acceleration), loss, and transport processes. Most of these processes involve wave-particle interactions between outer belt electrons and different types of plasma waves in the inner magnetosphere, and in turn, the activity of these waves depends on different solar wind and magnetospheric driving conditions and thus can vary drastically from event to event. Using multipoint analysis with data from NASA’s Van Allen Probes, THEMIS, and SAMPEX missions, NOAA’s GOES and POES constellations, and ground-based observatories, we present results from case studies revealing how different source/acceleration and loss mechanisms compete during active periods to result in drastically different distributions of outer belt electrons. By using a combination of low-Earth orbiting and high-altitude-equatorial orbiting satellites, we briefly review how it is possible to get a much more complete picture of certain wave activity and electron losses over the full range of MLTs and L-shells throughout the radiation belt. We then show example cases highlighting the importance of particular mechanisms, including: substorm injections and whistler-mode chorus waves for the source and acceleration of relativistic electrons; magnetopause shadowing and wave-particle interactions with EMIC waves for sudden losses; and ULF wave activity for driving radial transport, a process which is important for redistributing relativistic electrons, contributing both to acceleration and loss processes. We show how relativistic electron enhancement events involve local acceleration that is consistent with wave-particle interactions between a seed population of 10s to 100s of keV electrons, with a

  2. Spatially Resolved Isotopic Source Signatures of Wetland Methane Emissions

    Science.gov (United States)

    Ganesan, A. L.; Stell, A. C.; Gedney, N.; Comyn-Platt, E.; Hayman, G.; Rigby, M.; Poulter, B.; Hornibrook, E. R. C.

    2018-04-01

    We present the first spatially resolved wetland δ13C(CH4) source signature map based on data characterizing wetland ecosystems and demonstrate good agreement with wetland signatures derived from atmospheric observations. The source signature map resolves a latitudinal difference of 10‰ between northern high-latitude (mean -67.8‰) and tropical (mean -56.7‰) wetlands and shows significant regional variations on top of the latitudinal gradient. We assess the errors in inverse modeling studies aiming to separate CH4 sources and sinks by comparing atmospheric δ13C(CH4) derived using our spatially resolved map against the common assumption of globally uniform wetland δ13C(CH4) signature. We find a larger interhemispheric gradient, a larger high-latitude seasonal cycle, and smaller trend over the period 2000-2012. The implication is that erroneous CH4 fluxes would be derived to compensate for the biases imposed by not utilizing spatially resolved signatures for the largest source of CH4 emissions. These biases are significant when compared to the size of observed signals.

  3. Processing summary report: Fabrication of cesium and strontium heat and radiation sources

    International Nuclear Information System (INIS)

    Holton, L.K. Jr.; Surma, J.E.; Allen, R.P.

    1989-02-01

    The Pacific Northwest Laboratory (PNL) has produced 30 isotopic heat sources (canisters) for the Federal Republic of Germany (FRG) to be used as part of a repository testing program in the Asse Salt Mine. PNL program work involved the filling, closure, and decontamination of the 30 canisters. The canisters were fabricated (filled) in three separate processing campaigns using the radioactive liquid-fed ceramic melter to produce a borosilicate glass. Within the borosilicate glass matrix, the radiochemical constituents (137Cs and 90Sr) were immobilized to yield a product with a predetermined decay heat and surface radiation exposure rate

  4. Wavefield dependency on virtual shifts in the source location

    KAUST Repository

    Alkhalifah, Tariq

    2011-01-01

    shape) to lateral perturbations in the source location depends explicitly on lateral derivatives of the velocity field. For velocity models that include lateral velocity discontinuities this is problematic as such derivatives in their classical

  5. Good Manufacturing Practices and Microbial Contamination Sources in Orange Fleshed Sweet Potato Puree Processing Plant in Kenya.

    Science.gov (United States)

    Malavi, Derick Nyabera; Muzhingi, Tawanda; Abong', George Ooko

    2018-01-01

    Limited information exists on the status of hygiene and probable sources of microbial contamination in Orange Fleshed Sweet Potato (OFSP) puree processing. The current study is aimed at determining the level of compliance to Good Manufacturing Practices (GMPs), hygiene, and microbial quality in OFSP puree processing plant in Kenya. Intensive observation and interviews using a structured GMPs checklist, environmental sampling, and microbial analysis by standard microbiological methods were used in data collection. The results indicated low level of compliance to GMPs with an overall compliance score of 58%. Microbial counts on food equipment surfaces, installations, and personnel hands and in packaged OFSP puree were above the recommended microbial safety and quality legal limits. Steaming significantly (P contamination. Total counts, yeasts and molds, Enterobacteriaceae, total coliforms, and E. coli and S. aureus counts in OFSP puree were 8.0, 4.0, 6.6, 5.8, 4.8, and 5.9 log10 cfu/g, respectively. In conclusion, equipment surfaces, personnel hands, and processing water were major sources of contamination in OFSP puree processing and handling. Plant hygiene inspection, environmental monitoring, and food safety trainings are recommended to improve hygiene, microbial quality, and safety of OFSP puree.

  6. Good Manufacturing Practices and Microbial Contamination Sources in Orange Fleshed Sweet Potato Puree Processing Plant in Kenya

    Science.gov (United States)

    Abong', George Ooko

    2018-01-01

    Limited information exists on the status of hygiene and probable sources of microbial contamination in Orange Fleshed Sweet Potato (OFSP) puree processing. The current study is aimed at determining the level of compliance to Good Manufacturing Practices (GMPs), hygiene, and microbial quality in OFSP puree processing plant in Kenya. Intensive observation and interviews using a structured GMPs checklist, environmental sampling, and microbial analysis by standard microbiological methods were used in data collection. The results indicated low level of compliance to GMPs with an overall compliance score of 58%. Microbial counts on food equipment surfaces, installations, and personnel hands and in packaged OFSP puree were above the recommended microbial safety and quality legal limits. Steaming significantly (P contamination. Total counts, yeasts and molds, Enterobacteriaceae, total coliforms, and E. coli and S. aureus counts in OFSP puree were 8.0, 4.0, 6.6, 5.8, 4.8, and 5.9 log10 cfu/g, respectively. In conclusion, equipment surfaces, personnel hands, and processing water were major sources of contamination in OFSP puree processing and handling. Plant hygiene inspection, environmental monitoring, and food safety trainings are recommended to improve hygiene, microbial quality, and safety of OFSP puree. PMID:29808161

  7. Dye Sensitized Solar Cell with Conventionally Annealed and Post-Hydrothermally Treated Nanocrystalline Semiconductor Oxide TiO2 Derived from Sol-Gel Process

    Directory of Open Access Journals (Sweden)

    Akhmad Yuwono

    2011-05-01

    Full Text Available The dye-sensitized solar cell (DSSC) is one of the most promising alternative renewable energy sources for anticipating the decline in fossil fuel reserves in the next few decades and for making use of the abundant, intensive sunlight energy in tropical countries like Indonesia. In the present study, TiO2 nanoparticles of different nanocrystallinity were synthesized via a sol-gel process with various water to inorganic precursor ratios (Rw of 0.85, 2.00 and 3.50) upon sol preparation, followed by subsequent drying, conventional annealing and post-hydrothermal treatments. The resulting nanoparticles were integrated into the DSSC prototype and sensitized with an organic dye made from the extract of red onion. The basic performance of the fabricated DSSC was examined and correlated to the crystallite size and band gap energy of the TiO2 nanoparticles. It was found that post-hydrothermally treated TiO2 nanoparticles derived from the sol with Rw = 2.00, with the largest crystallite size of 12.46 nm and the lowest band gap energy of 3.48 eV, showed the highest open circuit voltage (Voc) of 69.33 mV.

  8. Assessing Pyrite-Derived Sulfate in the Mississippi River with Four Years of Sulfur and Triple-Oxygen Isotope Data.

    Science.gov (United States)

    Killingsworth, Bryan A; Bao, Huiming; Kohl, Issaku E

    2018-05-17

    Riverine dissolved sulfate (SO4^2-) sulfur and oxygen isotope variations reflect their controls, such as SO4^2- reduction and reoxidation, and source mixing. However, unconstrained temporal variability of riverine SO4^2- isotope compositions due to short sampling durations may lead to mischaracterization of SO4^2- sources, particularly for the pyrite-derived sulfate load. We measured the sulfur and triple-oxygen isotopes (δ34S, δ18O, and Δ'17O) of Mississippi River SO4^2- with biweekly sampling between 2009 and 2013 to test isotopic variability and constrain sources. Sulfate δ34S and δ18O ranged from -6.3‰ to -0.2‰ and -3.6‰ to +8.8‰, respectively. Our sampling period captured the most severe flooding and drought in the Mississippi River basin since 1927 and 1956, respectively, and a first year of sampling that was unrepresentative of long-term average SO4^2-. The δ34S_SO4 data indicate pyrite-derived SO4^2- sources are 74 ± 10% of the Mississippi River sulfate budget. Furthermore, pyrite oxidation is implicated as the dominant process supplying SO4^2- to the Mississippi River, whereas the Δ'17O_SO4 data show that 18 ± 9% of oxygen in this sulfate is sourced from air O2.

  9. Henry's law and accumulation of weak source for crust-derived helium: A case study of Weihe Basin, China

    Directory of Open Access Journals (Sweden)

    Yuhong Li

    2017-12-01

    Full Text Available Crust-derived helium is generated by the radioactive decay of uranium, thorium and other radioactive elements in geological bodies. Compared with conventional natural gas, helium is a typical weak-source gas because of its extremely slow generation rate and the absence of a helium-generating peak. It is frequently associated with methane or carbon dioxide reservoirs and is closely related to groundwater. Helium meets the industrial standard at a volume fraction of 0.1%. To study the accumulation mechanism of helium, previous research on the Henry's coefficients and solubilities of helium, nitrogen and methane is summarized, and the key roles of Henry's Law in helium migration, accumulation and preservation are discussed through simulation calculations, taking the Weihe Basin as an example. According to the Law, the solubility of a gas in dilute solution is controlled by its partial pressure and its Henry's coefficient. Compared with the carrier gases, the Henry's constant of helium is high, with a striking difference between low and high temperatures. In addition, the helium partial pressure differs greatly between helium source rocks and gas reservoirs, resulting in great differences in helium solubility between the two settings. The accumulation process is as follows. First, helium dissolves into water and migrates out of the helium source rocks owing to the high helium solubility there, caused by the high helium partial pressure and high temperature in the source rock. Second, when the dissolved helium is transported to a shallow gas reservoir, it tends to come out of solution and into the reservoir because of the extremely low partial pressure and low temperature there. Meanwhile, part of the carrier gases dissolves into the water, as if the helium were "replaced" out. Furthermore, a low-concentration funnel of dissolved helium forms near the gas reservoir, so that other dissolved helium continues to migrate towards the gas reservoir, which greatly improves the helium accumulation
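
    Henry's Law itself is compact enough to state directly: the equilibrium dissolved mole fraction of a gas is its partial pressure divided by the Henry coefficient, x = p/k_H, so a high helium partial pressure in the hot source rock favours dissolution while a near-zero helium partial pressure in a shallow, cool gas reservoir forces exsolution. The sketch below only illustrates that contrast; the numbers are rough, generic values and are not taken from the Weihe Basin study.

```python
# Minimal illustration of the Henry's-law contrast described in the abstract:
# x = p / k_H. The Henry coefficient and pressures are rough, generic values,
# not basin-specific data.

K_H_HELIUM_MPA = 14000.0   # assumed Henry coefficient for He, MPa per mole fraction

def dissolved_mole_fraction(partial_pressure_mpa, k_h_mpa=K_H_HELIUM_MPA):
    return partial_pressure_mpa / k_h_mpa

# Deep source rock: appreciable He partial pressure -> He dissolves and migrates.
x_source = dissolved_mole_fraction(5.0)
# Shallow gas reservoir: He partial pressure near zero -> dissolved He exsolves.
x_reservoir = dissolved_mole_fraction(0.01)
print(x_source, x_reservoir, x_source / x_reservoir)  # dissolved capacity drops ~500-fold
```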

  10. I12: the Joint Engineering, Environment and Processing (JEEP) beamline at Diamond Light Source.

    Science.gov (United States)

    Drakopoulos, Michael; Connolley, Thomas; Reinhard, Christina; Atwood, Robert; Magdysyuk, Oxana; Vo, Nghia; Hart, Michael; Connor, Leigh; Humphreys, Bob; Howell, George; Davies, Steve; Hill, Tim; Wilkin, Guy; Pedersen, Ulrik; Foster, Andrew; De Maio, Nicoletta; Basham, Mark; Yuan, Fajin; Wanelik, Kaz

    2015-05-01

    I12 is the Joint Engineering, Environmental and Processing (JEEP) beamline, constructed during Phase II of the Diamond Light Source. I12 is located on a short (5 m) straight section of the Diamond storage ring and uses a 4.2 T superconducting wiggler to provide polychromatic and monochromatic X-rays in the energy range 50-150 keV. The beam energy enables good penetration through large or dense samples, combined with a large beam size (1 mrad horizontally × 0.3 mrad vertically). The beam characteristics permit the study of materials and processes inside environmental chambers without unacceptable attenuation of the beam and without the need to use sample sizes which are atypically small for the process under study. X-ray techniques available to users are radiography, tomography, energy-dispersive diffraction, monochromatic and white-beam two-dimensional diffraction/scattering and small-angle X-ray scattering. Since commencing operations in November 2009, I12 has established a broad user community in materials science and processing, chemical processing, biomedical engineering, civil engineering, environmental science, palaeontology and physics.

  11. Designing neural networks that process mean values of random variables

    International Nuclear Information System (INIS)

    Barber, Michael J.; Clark, John W.

    2014-01-01

    We develop a class of neural networks derived from probabilistic models posed in the form of Bayesian networks. Making biologically and technically plausible assumptions about the nature of the probabilistic models to be represented in the networks, we derive neural networks exhibiting standard dynamics that require no training to determine the synaptic weights, that perform accurate calculation of the mean values of the relevant random variables, that can pool multiple sources of evidence, and that deal appropriately with ambivalent, inconsistent, or contradictory evidence. - Highlights: • High-level neural computations are specified by Bayesian belief networks of random variables. • Probability densities of random variables are encoded in activities of populations of neurons. • Top-down algorithm generates specific neural network implementation of given computation. • Resulting “neural belief networks” process mean values of random variables. • Such networks pool multiple sources of evidence and deal properly with inconsistent evidence

  12. Designing neural networks that process mean values of random variables

    Energy Technology Data Exchange (ETDEWEB)

    Barber, Michael J. [AIT Austrian Institute of Technology, Innovation Systems Department, 1220 Vienna (Austria); Clark, John W. [Department of Physics and McDonnell Center for the Space Sciences, Washington University, St. Louis, MO 63130 (United States); Centro de Ciências Matemáticas, Universidade de Madeira, 9000-390 Funchal (Portugal)

    2014-06-13

    We develop a class of neural networks derived from probabilistic models posed in the form of Bayesian networks. Making biologically and technically plausible assumptions about the nature of the probabilistic models to be represented in the networks, we derive neural networks exhibiting standard dynamics that require no training to determine the synaptic weights, that perform accurate calculation of the mean values of the relevant random variables, that can pool multiple sources of evidence, and that deal appropriately with ambivalent, inconsistent, or contradictory evidence. - Highlights: • High-level neural computations are specified by Bayesian belief networks of random variables. • Probability densities of random variables are encoded in activities of populations of neurons. • Top-down algorithm generates specific neural network implementation of given computation. • Resulting “neural belief networks” process mean values of random variables. • Such networks pool multiple sources of evidence and deal properly with inconsistent evidence.

  13. The microbial fermentation characteristics depend on both carbohydrate source and heat processing: a model experiment with ileo-cannulated pigs

    DEFF Research Database (Denmark)

    Nielsen, Tina Skau; Jørgensen, Henry Johs. Høgh; Knudsen, Knud Erik Bach

    2017-01-01

    The effects of carbohydrate (CHO) source and processing (extrusion cooking) on large intestinal fermentation products were studied in ileo-cannulated pigs as a model for humans. Pigs were fed diets containing barley, pea or a mixture of potato starch:wheat bran (PSWB), either raw or extrusion cooked. Extrusion cooking reduced the amount of starch fermented in the large intestine by 52–96% depending on the CHO source, and reduced the total pool of butyrate in the distal small intestine + large intestine by on average 60% across diets. Overall, extrusion cooking caused a shift in the composition of the short-chain fatty acids (SCFA) produced towards more acetate and less propionate and butyrate. The CHO source and processing strongly affected the fermentation characteristics, and extrusion cooking generally reduced large intestinal fermentation and resulted in a less desirable composition of the fermentation products.

  14. Devices, materials, and processes for nano-electronics: characterization with advanced X-ray techniques using lab-based and synchrotron radiation sources

    International Nuclear Information System (INIS)

    Zschech, E.; Wyon, C.; Murray, C.E.; Schneider, G.

    2011-01-01

    Future nano-electronics manufacturing at extraordinary length scales, new device structures, and advanced materials will provide challenges to process development and engineering but also to process control and physical failure analysis. Advanced X-ray techniques, using lab systems and synchrotron radiation sources, will play a key role for the characterization of thin films, nano-structures, surfaces, and interfaces. The development of advanced X-ray techniques and tools will reduce risk and time for the introduction of new technologies. Eventually, time-to-market for new products will be reduced by the timely implementation of the best techniques for process development and process control. The development and use of advanced methods at synchrotron radiation sources will be increasingly important, particularly for research and development in the field of advanced processes and new materials but also for the development of new X-ray components and procedures. The application of advanced X-ray techniques, in-line, in out-of-fab analytical labs and at synchrotron radiation sources, for research, development, and manufacturing in the nano-electronics industry is reviewed. The focus of this paper is on the study of nano-scale device and on-chip interconnect materials, and materials for 3D IC integration as well. (authors)

  15. Analysis of potential combustion source impacts on acid deposition using an independently derived inventory. Volume II, appendices

    Energy Technology Data Exchange (ETDEWEB)

    1983-12-01

    This document contains 2 appendices. The first documents the methodologies used to calculate production, unit energy consumption, fuel type and emission estimates for 16 industries and 35 types of facilities utilizing direct-fired industrial combustion processes, located in 26 states (and the District of Columbia) east of the Mississippi River. As discussed in the text of this report, a U.S. total of 16 industries and 45 types of facilities utilizing direct-fired combustion processes were identified by an elimination-type method that was developed based on evaluation of fuel use in industrial SIC codes 20-39 to identify pollutant sources contributing to acid rain. The final population included only plants that have direct-fired fuel consumption greater than or equal to 100 × 10⁹ Btu/yr of equivalent energy consumption. The goal for this analysis was to provide at least a 1980 base year for the data. This was achieved for all of the industries and in fact, 1981 data were used for a number of the industries evaluated. The second contains an analysis of all consumption of major fossil fuels to: (1) identify all fuel usage categories, and (2) identify the kinds of combustion equipment used within each category. This analysis provides a frame of reference for the balance of the study and permits using an energy accounting methodology to quantify the degree to which the inventoried sources in individual consuming sectors are complete and representative of the total population for the sector.

  16. Chloride/bromide ratios in leachate derived from farm-animal waste

    International Nuclear Information System (INIS)

    Hudak, P.F.

    2003-01-01

    Results have important implications for identifying animal sources of contaminated groundwater. - Ratios of conservative chemicals have been used to identify sources of groundwater contamination. While chloride/bromide ratios have been reported for several common sources of groundwater contamination, little work has been done on leachate derived from farm-animal waste. In this study, chloride/bromide ratios were measured in leachate derived from longhorn-cattle, quarter-horse, and pygmy-goat waste at a farm in Abilene, Texas, USA. (Minimum, median, and maximum) chloride/bromide ratios of (66.5, 85.6, and 167), (119, 146, and 156), and (35.4, 57.8, and 165) were observed for cattle, horses, and goats, respectively. These ratios are below typical values for domestic wastewater and within the range commonly observed for oilfield brine. Results of this study have important implications for identifying sources of contaminated groundwater in settings with significant livestock and/or oil production
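
    In practice such ratios are used by checking where a sample falls relative to published ranges for candidate sources; the sketch below wires in the animal-waste range reported here and purely illustrative placeholder ranges for the other two sources.

```python
# Illustrative Cl/Br ratio screening. The farm-animal-waste range (~35-167) comes
# from this study; the other ranges are placeholders for demonstration only and
# should be replaced with literature values appropriate to the site.

SOURCE_RANGES = {
    "farm-animal waste": (35.0, 167.0),
    "domestic wastewater": (300.0, 600.0),   # assumed placeholder
    "oilfield brine": (50.0, 300.0),         # assumed placeholder
}

def candidate_sources(cl_mg_per_l, br_mg_per_l):
    ratio = cl_mg_per_l / br_mg_per_l
    matches = [name for name, (lo, hi) in SOURCE_RANGES.items() if lo <= ratio <= hi]
    return ratio, matches

# ratio 80 -> consistent with animal waste or oilfield brine, illustrating the ambiguity
print(candidate_sources(cl_mg_per_l=120.0, br_mg_per_l=1.5))
```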

  17. A phosphine mediated sequential annulation process of 2-tosylaminochalcones with MBH carbonates to construct functionalized aza-benzobicyclo[4.3.0] derivatives.

    Science.gov (United States)

    Zhang, Qinglong; Zhu, Yannan; Jin, Hongxing; Huang, You

    2017-04-04

    A novel phosphine-mediated sequential annulation process to construct functionalized aza-benzobicyclo[4.3.0] derivatives has been developed, involving a one-pot sequential catalytic and stoichiometric process that generates a series of benzobicyclo[4.3.0] compounds containing one quaternary center with up to 94% yield and 20:1 dr. In this reaction, MBH carbonates act as 1,2,3-C3 synthons.

  18. Design and optimization of components and processes for plasma sources in advanced material treatments

    OpenAIRE

    Rotundo, Fabio

    2012-01-01

    The research activities described in the present thesis have been oriented to the design and development of components and technological processes aimed at optimizing the performance of plasma sources in advanced material treatments. Consumable components for high definition plasma arc cutting (PAC) torches were studied and developed. Experimental activities have in particular focussed on the modifications of the emissive insert with respect to the standard electrode configuration, whi...

  19. THE ARECIBO LEGACY FAST ALFA SURVEY: THE α.40 H I SOURCE CATALOG, ITS CHARACTERISTICS AND THEIR IMPACT ON THE DERIVATION OF THE H I MASS FUNCTION

    Energy Technology Data Exchange (ETDEWEB)

    Haynes, Martha P.; Giovanelli, Riccardo; Martin, Ann M.; Adams, Elizabeth A. K.; Hallenbeck, Gregory; Huang Shan; Papastergis, Emmanouil, E-mail: haynes@astro.cornell.edu, E-mail: riccardo@astro.cornell.edu, E-mail: amartin@astro.cornell.edu, E-mail: betsey@astro.cornell.edu, E-mail: ghallenbeck@astro.cornell.edu, E-mail: shan@astro.cornell.edu [Center for Radiophysics and Space Research, Space Sciences Building, Cornell University, Ithaca, NY 14853 (United States); and others

    2011-11-15

    We present a current catalog of 21 cm H I line sources extracted from the Arecibo Legacy Fast Arecibo L-band Feed Array (ALFALFA) survey over ~2800 deg² of sky: the α.40 catalog. Covering 40% of the final survey area, the α.40 catalog contains 15,855 sources in the regions 07h30m < R.A. < 16h30m, +04° < decl. < +16°, and +24° < decl. < +28° and 22h < R.A. < 03h, +14° < decl. < +16°, and +24° < decl. < +32°. Of those, 15,041 are certainly extragalactic, yielding a source density of 5.3 galaxies per deg², a factor of 29 improvement over the catalog extracted from the H I Parkes All-Sky Survey. In addition to the source centroid positions, H I line flux densities, recessional velocities, and line widths, the catalog includes the coordinates of the most probable optical counterpart of each H I line detection, and a separate compilation provides a cross-match to identifications given in the photometric and spectroscopic catalogs associated with the Sloan Digital Sky Survey Data Release 7. Fewer than 2% of the extragalactic H I line sources cannot be identified with a feasible optical counterpart; some of those may be rare OH megamasers at 0.16 < z < 0.25. A detailed analysis is presented of the completeness, width-dependent sensitivity function and bias inherent of the α.40 catalog. The impact of survey selection, distance errors, current volume coverage, and local large-scale structure on the derivation of the H I mass function is assessed. While α.40 does not yet provide a completely representative sampling of cosmological volume, derivations of the H I mass function using future data releases from ALFALFA will further improve both statistical and systematic uncertainties.

  20. Deriving site-specific soil clean-up values for metals and metalloids: rationale for including protection of soil microbial processes.

    Science.gov (United States)

    Kuperman, Roman G; Siciliano, Steven D; Römbke, Jörg; Oorts, Koen

    2014-07-01

    Although it is widely recognized that microorganisms are essential for sustaining soil fertility, structure, nutrient cycling, groundwater purification, and other soil functions, soil microbial toxicity data were excluded from the derivation of Ecological Soil Screening Levels (Eco-SSL) in the United States. Among the reasons for such exclusion were claims that microbial toxicity tests were too difficult to interpret because of the high variability of microbial responses, uncertainty regarding the relevance of the various endpoints, and functional redundancy. Since the release of the first draft of the Eco-SSL Guidance document by the US Environmental Protection Agency in 2003, soil microbial toxicity testing and its use in ecological risk assessments have substantially improved. A wide range of standardized and nonstandardized methods became available for testing chemical toxicity to microbial functions in soil. Regulatory frameworks in the European Union and Australia have successfully incorporated microbial toxicity data into the derivation of soil threshold concentrations for ecological risk assessments. This article provides the 3-part rationale for including soil microbial processes in the development of soil clean-up values (SCVs): 1) presenting a brief overview of relevant test methods for assessing microbial functions in soil, 2) examining data sets for Cu, Ni, Zn, and Mo that incorporated soil microbial toxicity data into regulatory frameworks, and 3) offering recommendations on how to integrate the best available science into the method development for deriving site-specific SCVs that account for bioavailability of metals and metalloids in soil. Although the primary focus of this article is on the development of the approach for deriving SCVs for metals and metalloids in the United States, the recommendations provided in this article may also be applicable in other jurisdictions that aim at developing ecological soil threshold values for protection of

  1. CHROMOPHORIC DISSOLVED ORGANIC MATTER (CDOM) DERIVED FROM DECOMPOSITION OF VARIOUS VASCULAR PLANT AND ALGAL SOURCES

    Science.gov (United States)

    Chromophoric dissolved organic matter (CDOM) in aquatic environments is derived from the microbial decomposition of terrestrial and microbial organic matter. Here we present results of studies of the spectral properties and photoreactivity of the CDOM derived from several organic matter...

  2. The effect of feature-based attention on flanker interference processing: An fMRI-constrained source analysis.

    Science.gov (United States)

    Siemann, Julia; Herrmann, Manfred; Galashan, Daniela

    2018-01-25

    The present study examined whether feature-based cueing affects early or late stages of flanker conflict processing using EEG and fMRI. Feature cues either directed participants' attention to the upcoming colour of the target or were neutral. Validity-specific modulations during interference processing were investigated using the N200 event-related potential (ERP) component and BOLD signal differences. Additionally, both data sets were integrated using an fMRI-constrained source analysis. Finally, the results were compared with a previous study in which spatial instead of feature-based cueing was applied to an otherwise identical flanker task. Feature-based and spatial attention recruited a common fronto-parietal network during conflict processing. Irrespective of attention type (feature-based; spatial), this network responded to focussed attention (valid cueing) as well as context updating (invalid cueing), hinting at domain-general mechanisms. However, spatially and non-spatially directed attention also demonstrated domain-specific activation patterns for conflict processing that were observable in distinct EEG and fMRI data patterns as well as in the respective source analyses. Conflict-specific activity in visual brain regions was comparable between both attention types. We assume that the distinction between spatially and non-spatially directed attention types primarily applies to temporal differences (domain-specific dynamics) between signals originating in the same brain regions (domain-general localization).

  3. Synthesis of hydroxy derivatives of limonene

    International Nuclear Information System (INIS)

    Ardashov, O V; Volcho, K P; Salakhutdinov, N F

    2014-01-01

    Synthetic routes to mono-, di- and trihydroxy derivatives of limonene are presented. Emphasis is given to the problems of regio- and stereoselectivity of transformations. Data on the isolation from natural sources and on the biological activities of the title compounds are given. The bibliography includes 107 references

  4. The Potential for Synovium-derived Stem Cells in Cartilage Repair

    DEFF Research Database (Denmark)

    Kubosch, Eva Johanna; Lang, Gernot Michael; Fürst, David

    2018-01-01

    for the treatment of large, isolated, full thickness cartilage defects. Several disadvantages, such as the need for two surgical procedures or hypertrophic regenerative cartilage, underline the need for alternative cell sources. OBJECTIVE: Mesenchymal stem cells, particularly synovium-derived mesenchymal stem cells, represent a promising cell source. Synovium-derived mesenchymal stem cells have attracted considerable attention since they display great chondrogenic potential and less hypertrophic differentiation than mesenchymal stem cells derived from bone marrow. The aim of this review was to summarize the current knowledge on the chondrogenic potential of synovial stem cells for cartilage repair purposes. RESULTS: A literature search was carried out, identifying 260 articles in the databases up to January 2017. Several in vitro and initial animal in vivo studies of cartilage repair using synovial stem cell

  5. DensToolKit: A comprehensive open-source package for analyzing the electron density and its derivative scalar and vector fields

    Science.gov (United States)

    Solano-Altamirano, J. M.; Hernández-Pérez, Julio M.

    2015-11-01

    DensToolKit is a suite of cross-platform, optionally parallelized, programs for analyzing the molecular electron density (ρ) and several fields derived from it. Scalar and vector fields, such as the gradient of the electron density (∇ρ), electron localization function (ELF) and its gradient, localized orbital locator (LOL), region of slow electrons (RoSE), reduced density gradient, localized electrons detector (LED), information entropy, molecular electrostatic potential, kinetic energy densities K and G, among others, can be evaluated on zero, one, two, and three dimensional grids. The suite includes a program for searching critical points and bond paths of the electron density, under the framework of Quantum Theory of Atoms in Molecules. DensToolKit also evaluates the momentum space electron density on spatial grids, and the reduced density matrix of order one along lines joining two arbitrary atoms of a molecule. The source code is distributed under the GNU-GPLv3 license, and we release the code with the intent of establishing an open-source collaborative project. The style of DensToolKit's code follows some of the guidelines of an object-oriented program. This allows us to supply the user with a simple means of implementing new scalar or vector fields, provided they are derived from any of the fields already implemented in the code. In this paper, we present some of the most salient features of the programs contained in the suite, some examples of how to run them, and the mathematical definitions of the implemented fields along with hints of how we optimized their evaluation. We benchmarked our suite against both a freely-available program and a commercial package. Speed-ups of ~2×, and up to 12×, were obtained using a non-parallel compilation of DensToolKit for the evaluation of fields. DensToolKit takes similar times for finding critical points, compared to a commercial package. Finally, we present some perspectives for the future development and

  6. The Chandra Source Catalog 2.0: Spectral Properties

    Science.gov (United States)

    McCollough, Michael L.; Siemiginowska, Aneta; Burke, Douglas; Nowak, Michael A.; Primini, Francis Anthony; Laurino, Omar; Nguyen, Dan T.; Allen, Christopher E.; Anderson, Craig S.; Budynkiewicz, Jamie A.; Chen, Judy C.; Civano, Francesca Maria; D'Abrusco, Raffaele; Doe, Stephen M.; Evans, Ian N.; Evans, Janet D.; Fabbiano, Giuseppina; Gibbs, Danny G., II; Glotfelty, Kenny J.; Graessle, Dale E.; Grier, John D.; Hain, Roger; Hall, Diane M.; Harbo, Peter N.; Houck, John C.; Lauer, Jennifer L.; Lee, Nicholas P.; Martínez-Galarza, Juan Rafael; McDowell, Jonathan C.; Miller, Joseph; McLaughlin, Warren; Morgan, Douglas L.; Mossman, Amy E.; Nichols, Joy S.; Paxson, Charles; Plummer, David A.; Rots, Arnold H.; Sundheim, Beth A.; Tibbetts, Michael; Van Stone, David W.; Zografou, Panagoula; Chandra Source Catalog Team

    2018-01-01

    The second release of the Chandra Source Catalog (CSC) contains all sources identified from sixteen years' worth of publicly accessible observations. The vast majority of these sources have been observed with the ACIS detector and have spectral information in the 0.5-7 keV energy range. Here we describe the methods used to automatically derive spectral properties for each source detected by the standard processing pipeline and included in the final CSC. The sources with high signal-to-noise ratio (exceeding 150 net counts) were fit in Sherpa (the modeling and fitting application from the Chandra Interactive Analysis of Observations package) using wstat as the fit statistic and the Bayesian draws method to determine errors. Three models were fit to each source: an absorbed power-law, blackbody, and Bremsstrahlung emission. The fitted parameter values for the power-law, blackbody, and Bremsstrahlung models were included in the catalog, along with the calculated flux for each model. The CSC also provides the source energy fluxes computed from the normalizations of predefined absorbed power-law, blackbody, Bremsstrahlung, and APEC models needed to match the observed net X-ray counts. For sources that have been observed multiple times, a Bayesian Blocks analysis was performed (see the Primini et al. poster) and the most significant block was given a joint fit for the spectral models mentioned above. In addition, we provide access to data products for each source: a file with the source spectrum, the background spectrum, and the spectral response of the detector. Hardness ratios were calculated for each source between pairs of energy bands (soft, medium and hard). This work has been supported by NASA under contract NAS 8-03060 to the Smithsonian Astrophysical Observatory for operation of the Chandra X-ray Center.
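
    The hardness ratios mentioned at the end are conventionally formed from net counts in two bands; one common convention (assumed here, since the CSC documentation defines its own exact form and error treatment) is HR = (H − S)/(H + S), as sketched below.

```python
# Sketch of a band-pair hardness ratio using the common (H - S)/(H + S)
# convention. The Chandra Source Catalog's exact definition, band choices, and
# Bayesian error treatment are specified in the CSC documentation and may differ.

def hardness_ratio(net_counts_hard, net_counts_soft):
    total = net_counts_hard + net_counts_soft
    if total == 0:
        raise ValueError("no net counts in either band")
    return (net_counts_hard - net_counts_soft) / total

print(hardness_ratio(net_counts_hard=40, net_counts_soft=160))  # -0.6, a soft source
```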

  7. A green approach to the production of 2-pyridone derivatives promoted by infrared irradiation

    International Nuclear Information System (INIS)

    Hernandez, F.; De la Cruz, F.; Lopez, J.; Pena, E.; Vazquez, M. A.; Delgado, F.; Alcaraz, Y.; FRobles, J.; Martinez A, M.

    2014-01-01

    An alternative route promoted by infrared irradiation is presented to obtain various 4-aryl-3-cyano-5-ethoxycarbonyl-6-methyl-2-pyridone derivatives 9a-k. The process was carried out with a green approach from the corresponding 4H-pyrans, using mild reaction conditions and infrared irradiation as the energy source. In the first stage, the reaction produced 1,2,3,4-tetrahydropyridine-2-one derivatives 8a-k, followed by an oxidative step to afford the target molecules in good yields. The structure of products 9a-k was confirmed by FT-IR, 1H NMR and 13C NMR spectroscopic techniques and X-ray diffraction. It was found that the efficiency of the reaction depends on the catalyst and the solvent, as well as on the aldehyde substituents. (Author)

  8. Light Source Estimation with Analytical Path-tracing

    OpenAIRE

    Kasper, Mike; Keivan, Nima; Sibley, Gabe; Heckman, Christoffer

    2017-01-01

    We present a novel algorithm for light source estimation in scenes reconstructed with a RGB-D camera based on an analytically-derived formulation of path-tracing. Our algorithm traces the reconstructed scene with a custom path-tracer and computes the analytical derivatives of the light transport equation from principles in optics. These derivatives are then used to perform gradient descent, minimizing the photometric error between one or more captured reference images and renders of our curre...

  9. The influence of humic acids derived from earthworm-processed organic wastes on plant growth

    Energy Technology Data Exchange (ETDEWEB)

    Atiyeh, R.M.; Lee, S.; Edwards, C.A.; Arancon, N.Q.; Metzger, J.D. [Ohio State University, Columbus, OH (United States). Soil Ecology Lab.

    2002-08-01

    Some effects of humic acids, formed during the breakdown of organic wastes by earthworms (vermicomposting), on plant growth were evaluated. In the first experiment, humic acids were extracted from pig manure vermicompost using the classic alkali/acid fractionation procedure and mixed with a soilless container medium (Metro-Mix 360), to provide a range of 0, 50, 100, 150, 200, 250, 500, 1000, 2000 and 4000 mg of humate per kg of dry weight of container medium, and tomato seedlings were grown in the mixtures. In the second experiment, humates extracted from pig manure and food wastes vermicomposts were mixed with vermiculite to provide a range of 0, 50, 125, 250, 500, 1000 and 4000 mg of humate per kg of dry weight of the container medium, and cucumber seedlings were grown in the mixtures. Both tomato and cucumber seedlings were watered daily with a solution containing all nutrients required to ensure that any differences in growth responses were not nutrient-mediated. The incorporation of both types of vermicompost-derived humic acids, into either type of soilless plant growth media, increased the growth of tomato and cucumber plants significantly, in terms of plant heights, leaf areas, shoot and root dry weights. Plant growth increased with increasing concentrations of humic acids incorporated into the medium up to a certain proportion, but this differed according to the plant species, the source of the vermicompost, and the nature of the container medium. Plant growth tended to be increased by treatments of the plants with 50-500 mg/kg humic acids, but often decreased significantly when the concentrations of humic acids derived in the container medium exceeded 500-1000 mg/kg. These growth responses were most probably due to hormone-like activity of humic acids from the vermicomposts or could have been due to plant growth hormones adsorbed onto the humates. (author)

  10. Bistatic High Frequency Radar Ocean Surface Cross Section for an FMCW Source with an Antenna on a Floating Platform

    Directory of Open Access Journals (Sweden)

    Yue Ma

    2016-01-01

    Full Text Available The first- and second-order bistatic high frequency radar cross sections of the ocean surface with an antenna on a floating platform are derived for a frequency-modulated continuous wave (FMCW source. Based on previous work, the derivation begins with the general bistatic electric field in the frequency domain for the case of a floating antenna. Demodulation and range transformation are used to obtain the range information, distinguishing the process from that used for a pulsed radar. After Fourier-transforming the autocorrelation and comparing the result with the radar range equation, the radar cross sections are derived. The new first- and second-order antenna-motion-incorporated bistatic radar cross section models for an FMCW source are simulated and compared with those for a pulsed source. Results show that, for the same radar operating parameters, the first-order radar cross section for the FMCW waveform is a little lower than that for a pulsed source. The second-order radar cross section for the FMCW waveform reduces to that for the pulsed waveform when the scattering patch limit approaches infinity. The effect of platform motion on the radar cross sections for an FMCW waveform is investigated for a variety of sea states and operating frequencies and, in general, is found to be similar to that for a pulsed waveform.
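
    The demodulation and range transformation mentioned above exploit the fact that, for a linear FMCW sweep, mixing the received signal with the transmitted chirp produces a beat frequency proportional to range. A generic, textbook version of that relation (not the paper's full derivation, which also carries the platform-motion and bistatic-geometry terms) is sketched below with made-up numbers.

```python
# Generic FMCW range transformation: for a linear sweep of slope S = B/T, the
# beat frequency after demodulation maps to range as R = c * f_b / (2 * S).
# This is the textbook relation only; example values are assumptions.

C = 3.0e8  # speed of light, m/s

def range_from_beat(beat_freq_hz, sweep_bandwidth_hz, sweep_duration_s):
    slope = sweep_bandwidth_hz / sweep_duration_s
    return C * beat_freq_hz / (2.0 * slope)

# Assumed HF example: 100 kHz sweep over 0.25 s, 50 Hz beat -> ~18.8 km range.
print(range_from_beat(beat_freq_hz=50.0, sweep_bandwidth_hz=100e3, sweep_duration_s=0.25))
```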

  11. Plant derived antioxidants and antifibrotic drugs: past, present and future

    Directory of Open Access Journals (Sweden)

    Devaraj Ezhilarasan

    2014-09-01

    Full Text Available Hepatic fibrosis occurs as a wound-healing process after several forms of chronic hepatic injury. Activation and proliferation of hepatic stellate cells play a pivotal role in the pathogenesis of hepatic fibrosis. From a therapeutic perspective, many researchers have focused their attention on searching for novel agents that inhibit hepatic stellate cell proliferation and activation in order to prevent hepatic fibrogenesis, and a number of plant-derived antioxidants have been tested as anti-fibrogenic agents; they generally suppress proliferation and collagen synthesis. Plants remain an important source of novel drugs, novel drug leads and new chemical entities. Plant-based drug discovery has resulted primarily in the development of antioxidant, anti-cancer and other anti-infectious agents and continues to contribute new leads in clinical trials. This review summarizes some of the most important plant-derived anti-fibrotic drugs and their beneficial effects on experimentally induced hepatic fibrosis in vitro and in vivo. The plant-derived antioxidant compounds described herein are curcumin, silymarin, silibinin, baicalein, resveratrol, salvianolic acids, tetrandrine, quercetin and berberine. Through our own studies and those of previous workers, much information has been accumulated over the past two decades from in vivo and in vitro work. In light of those studies, it has been confirmed that plant-derived antioxidants, particularly flavonoids, significantly inhibit hepatic fibrosis regardless of etiology. This review outlines recent progress in the use of plant-derived drugs against experimentally induced liver fibrosis in vitro and in vivo and summarizes the possible mechanisms of the anti-fibrotic effects of these compounds.

  12. High frequency longitudinal profiling reveals hydrologic controls on solute sourcing, transport and processing in a karst river

    Science.gov (United States)

    Hensley, R. T.; Cohen, M. J.; Spangler, M.; Gooseff, M. N.

    2017-12-01

    The lower Santa Fe River is a large, karst river of north Florida, fed by numerous artesian springs and also containing multiple sink-rise systems. We performed repeated longitudinal profiles collecting very high frequency measurements of multiple stream parameters including temperature, dissolved oxygen, carbon dioxide, pH, dissolved organic matter, nitrate, ammonium, phosphate and turbidity. This high frequency dataset provided a spatially explicit understanding of solute sources and coupled biogeochemical processing rates along the 25 km study reach. We noted marked changes in river profiles as the river transitioned from low to high flow during the onset of the wet season. The role of lateral inflow from springs as the primary solute source was greatly reduced under high flow conditions. Effects of sink-rise systems, which under low flow conditions allow the majority of flow to bypass several kilometer long sections of the main channel, virtually disappeared under high flow conditions. Impeded light transmittance at high flow reduced primary production and by extension assimilatory nutrient uptake. This study demonstrates how high frequency longitudinal profiling can be used to observe how hydrologic conditions can alter groundwater-surface water interactions and modulate the sourcing, transport and biogeochemical processing of stream solutes.

  13. A Series of Radiation Processed Nanostructural Chitosan Derivatives for Biomedicine, Agriculture, and Bioplastics

    International Nuclear Information System (INIS)

    Pasanphan, W.; Rattanawongwiboon, T.; Huajaikaew, E.; Kongkaoroptham, P.; Guven, O.; Suwanmala, P.; Hemvichian, K.

    2014-01-01

    The work includes a series of biopolymeric chitosan (CS) nanostructures prepared by irradiation techniques. The radiation-processed nanostructural CS materials were designed, synthesized, and characterized to demonstrate the progress of radiation technology in developing value-added natural products for advanced biomedical, agricultural and bioplastic applications. The idea of creating CS nanoparticles (CSNPs) using radiation progressed from simple radiation-induced, non-chemical modification to advanced radiation-induced functionalization of CSNPs. The CS nanostructures developed so far are water-soluble CSNPs as a green antioxidant and reducing agent, an amphiphilic core-shell CS nanocarrier as an anticancer delivery system, a CS nanogel for controlled release of fungicides and fertilizers, and a CS nanofiller for biodegradable PLA blends. The irradiation techniques, chemical structures and nanostructural morphologies, as well as the performance of the nanostructural CS derivatives in the relevant applications, are demonstrated. This developing approach offers an alternative route for the nanoscale-controlled synthesis of natural polymers.

  14. Use of advanced chemical fingerprinting in PAH source identification and allocation at a coal tar processing site

    International Nuclear Information System (INIS)

    Brown, J.S.; Boehm, P.D.; Douglas, G.S.

    1995-01-01

    Advanced chemical fingerprinting analyses were used to determine source allocation at a former coal tar processing facility which had been converted to a petroleum recycling site. Soil samples from the site had high petroleum hydrocarbon concentrations and elevated levels of polynuclear aromatic hydrocarbons (PAH). Comparisons of PAH distributions were used to differentiate the coal tar hydrocarbons from the petroleum hydrocarbons in soil samples. A more specific technique was needed to accurately allocate the contribution of the two sources to the observed PAH contamination in the soil. Petroleum biomarkers (steranes and triterpanes) which are present in crude oils and many refined petroleum products but are absent in coal tar were used to quantitatively allocate the source of the PAH contamination based on the relative ratio of the PAH to the biomarkers in soil samples. Using the resulting coal tar/petroleum source ratio the contribution of petroleum to the overall PAH contamination at the site was calculated. A multivariate statistical technique (principal component analysis or PCA) was used to provide an independent validation of the source allocation. The results of the source allocation provided a foundation for the site clean-up and remediation costs

  15. Behind Linus's Law: Investigating Peer Review Processes in Open Source

    Science.gov (United States)

    Wang, Jing

    2013-01-01

    Open source software has revolutionized the way people develop software, organize collaborative work, and innovate. The numerous open source software systems that have been created and adopted over the past decade are influential and vital in all aspects of work and daily life. The understanding of open source software development can enhance its…

  16. Good Manufacturing Practices and Microbial Contamination Sources in Orange Fleshed Sweet Potato Puree Processing Plant in Kenya

    OpenAIRE

    Malavi, Derick Nyabera; Muzhingi, Tawanda; Abong’, George Ooko

    2018-01-01

    Limited information exists on the status of hygiene and probable sources of microbial contamination in Orange Fleshed Sweet Potato (OFSP) puree processing. The current study is aimed at determining the level of compliance to Good Manufacturing Practices (GMPs), hygiene, and microbial quality in OFSP puree processing plant in Kenya. Intensive observation and interviews using a structured GMPs checklist, environmental sampling, and microbial analysis by standard microbiological methods were use...

  17. Multiple-predators-based capture process on complex networks

    International Nuclear Information System (INIS)

    Sharafat, Rajput Ramiz; Pu Cunlai; Li Jie; Chen Rongbin; Xu Zhongqi

    2017-01-01

    The predator/prey (capture) problem is a prototype of many network-related applications. We study the capture process on complex networks by considering multiple predators from multiple sources. In our model, some lions start from multiple sources simultaneously to capture the lamb by biased random walks, which are controlled with a free parameter α. We derive the distribution of the lamb's lifetime and the expected lifetime 〈T〉. Through simulation, we find that the expected lifetime drops substantially with the increasing number of lions. Moreover, we study how the underlying topological structure affects the capture process, and find that locating on small-degree nodes is better than on large-degree nodes for prolonging the lifetime of the lamb. Dense or homogeneous network structures work against the survival of the lamb. We also discuss how to improve the capture efficiency in our model. (paper)
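
    A degree-biased walk of this kind is usually implemented by weighting each neighbour j of the current node by k_j^α; a small Monte Carlo sketch of the multi-predator capture time under that assumption (with a stationary lamb) is given below. The paper's exact bias rule and analytical treatment may differ.

```python
# Monte Carlo sketch of a multiple-predator capture process on a network, with
# lions performing degree-biased random walks, p(i -> j) proportional to k_j**alpha,
# and a stationary lamb. This is an assumed, simplified reading of the model.
import random
import networkx as nx

def capture_time(graph, n_lions, alpha, max_steps=10_000, rng=random):
    nodes = list(graph.nodes())
    lamb = rng.choice(nodes)
    lions = [rng.choice(nodes) for _ in range(n_lions)]
    for step in range(max_steps):
        if lamb in lions:                      # captured when any lion reaches the lamb
            return step
        moved = []
        for lion in lions:
            nbrs = list(graph.neighbors(lion))
            weights = [graph.degree(n) ** alpha for n in nbrs]
            moved.append(rng.choices(nbrs, weights=weights, k=1)[0])
        lions = moved
    return max_steps

g = nx.barabasi_albert_graph(500, 3, seed=1)
lifetimes = [capture_time(g, n_lions=5, alpha=1.0) for _ in range(50)]
print(sum(lifetimes) / len(lifetimes))         # expected lifetime drops as n_lions grows
```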

  18. Sources of mutagenic activity in urban fine particles

    International Nuclear Information System (INIS)

    Stevens, R.K.; Lewis, C.W.; Dzubay, T.G.; Cupitt, L.T.; Lewtas, J.

    1990-01-01

    Samples were collected during the winter of 1984-1985 in the cities of Albuquerque, NM and Raleigh NC as part of a US Environmental Protection Agency study to evaluate methods to determine the emission sources contributing to the mutagenic properties of extractable organic matter (EOM) present in fine particles. Data derived from the analysis of the composition of these fine particles served as input to a multi-linear regression (MLR) model used to calculate the relative contribution of wood burning and motor vehicle sources to mutagenic activity observed in the extractable organic matter. At both sites the mutagenic potency of EOM was found to be greater (3-5 times) for mobile sources when compared to wood smoke extractable organics. Carbon-14 measurements which give a direct determination of the amount of EOM that originated from wood burning were in close agreement with the source apportionment results derived from the MLR model
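
    An MLR apportionment of this kind regresses the measured mutagenic activity on tracer species attributed to each source, so the fitted coefficients (times the mean tracer levels) give each source's share. The sketch below uses entirely synthetic data and generic tracer names; it shows the arithmetic, not the study's actual model or inputs.

```python
# Schematic multi-linear-regression source apportionment with synthetic data.
# Mutagenic activity is modeled as a linear combination of a wood-smoke tracer
# and a motor-vehicle tracer; all numbers are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 40
wood_tracer = rng.uniform(0.5, 5.0, n)       # hypothetical wood-smoke tracer level
vehicle_tracer = rng.uniform(0.2, 3.0, n)    # hypothetical motor-vehicle tracer level
mutagenicity = 2.0 * wood_tracer + 6.0 * vehicle_tracer + rng.normal(0.0, 0.5, n)

X = np.column_stack([wood_tracer, vehicle_tracer])
coef, *_ = np.linalg.lstsq(X, mutagenicity, rcond=None)
shares = coef * X.mean(axis=0)
print(coef, shares / shares.sum())            # per-source contribution fractions
```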

  19. Algorithms for biomagnetic source imaging with prior anatomical and physiological information

    Energy Technology Data Exchange (ETDEWEB)

    Hughett, Paul William [Univ. of California, Berkeley, CA (United States). Dept. of Electrical Engineering and Computer Sciences

    1995-12-01

    This dissertation derives a new method for estimating current source amplitudes in the brain and heart from external magnetic field measurements and prior knowledge about the probable source positions and amplitudes. The minimum mean square error estimator for the linear inverse problem with statistical prior information was derived and is called the optimal constrained linear inverse method (OCLIM). OCLIM includes as special cases the Shim-Cho weighted pseudoinverse and Wiener estimators but allows more general priors and thus reduces the reconstruction error. Efficient algorithms were developed to compute the OCLIM estimate for instantaneous or time series data. The method was tested in a simulated neuromagnetic imaging problem with five simultaneously active sources on a grid of 387 possible source locations; all five sources were resolved, even though the true sources were not exactly at the modeled source positions and the true source statistics differed from the assumed statistics.
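
    For a linear measurement model b = A x + n with Gaussian priors on the sources and the noise, the minimum mean square error estimate has the familiar closed form x̂ = C_x Aᵀ (A C_x Aᵀ + C_n)⁻¹ b. The sketch below implements only that textbook form; OCLIM as described in the dissertation generalizes the priors and organizes the computation efficiently, which is not reproduced here.

```python
# Textbook MMSE solution of a linear inverse problem with prior source covariance
# C_x and noise covariance C_n. This illustrates the estimator family OCLIM
# belongs to; it is not the dissertation's algorithm or its efficient implementation.
import numpy as np

def mmse_estimate(A, b, C_x, C_n):
    gain = C_x @ A.T @ np.linalg.inv(A @ C_x @ A.T + C_n)
    return gain @ b

rng = np.random.default_rng(3)
n_sensors, n_sources = 32, 100
A = rng.normal(size=(n_sensors, n_sources))        # forward (lead-field) matrix
C_x = np.diag(rng.uniform(0.1, 1.0, n_sources))    # prior source variances
C_n = 0.05 * np.eye(n_sensors)                     # sensor noise covariance
x_true = rng.multivariate_normal(np.zeros(n_sources), C_x)
b = A @ x_true + rng.multivariate_normal(np.zeros(n_sensors), C_n)
x_hat = mmse_estimate(A, b, C_x, C_n)
print(np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))  # relative error
```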

  20. Measurement of circulating cell-derived microparticles by flow cytometry: sources of variability within the assay.

    Science.gov (United States)

    Ayers, Lisa; Kohler, Malcolm; Harrison, Paul; Sargent, Ian; Dragovic, Rebecca; Schaap, Marianne; Nieuwland, Rienk; Brooks, Susan A; Ferry, Berne

    2011-04-01

    Circulating cell-derived microparticles (MPs) have been implicated in several disease processes and elevated levels are found in many pathological conditions. The detection and accurate measurement of MPs, although attracting widespread interest, is hampered by a lack of standardisation. The aim of this study was to establish a reliable flow cytometric assay to measure distinct subtypes of MPs in disease and to identify any significant causes of variability in MP quantification. Circulating MPs within plasma were identified by their phenotype (platelet, endothelial, leukocyte) and annexin-V positivity (AnnV+). The influence of key variables (i.e. time between venepuncture and centrifugation, washing steps, the number of centrifugation steps, freezing/long-term storage and temperature of thawing) on MP measurement was investigated. Increasing time between venepuncture and centrifugation leads to increased MP levels. Washing samples results in decreased AnnV+MPs (P=0.002) and platelet-derived MPs (PMPs) (P=0.002). Double centrifugation of MPs prior to freezing decreases numbers of AnnV+MPs (P=0.0004) and PMPs (P=0.0004). A single freeze-thaw cycle of samples led to an increase in AnnV+MPs (P=0.0020) and PMPs (P=0.0039). Long-term storage of MP samples at -80 °C resulted in decreased MP levels. This study found that minor protocol changes significantly affected MP levels. This is one of the first studies attempting to standardise a method for obtaining and measuring circulating MPs. Standardisation will be essential for successful development of MP technologies, allowing direct comparison of results between studies and leading to a greater understanding of MPs in disease. Crown Copyright © 2010. Published by Elsevier Ltd. All rights reserved.

  1. Derivation of uranium residual radioactive material guidelines for the Shpack site

    International Nuclear Information System (INIS)

    Cheng, J.J.; Yu, C.; Monette, F.; Jones, L.

    1991-08-01

    Residual radioactive material guidelines for uranium were derived for the Shpack site in Norton, Massachusetts. This site has been identified for remedial action under the Formerly Utilized Sites Remedial Action Program (FUSRAP) of the US Department of Energy (DOE). The uranium guidelines were derived on the basis of the requirement that the 50-year committed effective dose equivalent to a hypothetical individual who lives or works in the immediate vicinity of the Shpack site should not exceed a dose of 100 mrem/yr following decontamination. The DOE residual radioactive material guideline computer code, RESRAD, which implements the methodology described in the DOE manual for implementing residual radioactive material guidelines, was used in this evaluation. Three potential scenarios were considered for the site; the scenarios vary with regard to time spent at the site, sources of water used, and sources of food consumed. The results of the evaluation indicate that the basic dose limit of 100 mrem/yr will not be exceeded for uranium (including uranium-234, uranium-235, and uranium-238) within 1000 years, provided that the soil concentration of combined uranium (uranium-234 and uranium-238) at the Shpack site does not exceed the following levels: 2500 pCi/g for Scenario A (recreationist: the expected scenario); 1100 pCi/g for Scenario B (industrial worker: a plausible scenario); and 53 pCi/g for Scenario C (resident farmer using a well water as the only water source: a possible but unlikely scenario). The uranium guidelines derived in this report apply to the combined activity concentration of uranium-234 and uranium-238 and were calculated on the basis of a dose of 100 mrem/yr. In setting the actual uranium guidelines for the Shpack site, DOE will apply the as low as reasonably achievable (ALARA) policy to the decision-making process, along with other factors, such as whether a particular scenario is reasonable and appropriate. 8 refs., 2 figs., 8 tabs
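
    The scenario-specific guideline values quoted above follow the usual pattern of dividing the 100 mrem/yr dose limit by a scenario's dose-to-source ratio (dose per unit soil concentration) computed by RESRAD. The sketch below only shows that final division; the dose-to-source ratios are back-calculated from the guideline values in the abstract purely for illustration and are not RESRAD output.

```python
# Illustration of how a soil guideline follows from a dose limit and a
# scenario-specific dose-to-source ratio (DSR, in mrem/yr per pCi/g). The DSRs
# below are back-calculated from the guidelines quoted in the abstract just to
# show the arithmetic; the real values come from a RESRAD pathway analysis.

DOSE_LIMIT_MREM_PER_YR = 100.0

scenario_dsr = {
    "Scenario A, recreationist": 0.040,
    "Scenario B, industrial worker": 0.091,
    "Scenario C, resident farmer (well water)": 1.9,
}

for scenario, dsr in scenario_dsr.items():
    print(f"{scenario}: {DOSE_LIMIT_MREM_PER_YR / dsr:.0f} pCi/g combined U-234 + U-238")
```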

  2. Unexpected source of Fukushima-derived radiocesium to the coastal ocean of Japan

    Science.gov (United States)

    Sanial, Virginie; Buesseler, Ken O.; Charette, Matthew A.; Nagao, Seiya

    2017-12-01

    Synthesizing published data, we provide a quantitative summary of the global biogeochemical cycle of vanadium (V), including both human-derived and natural fluxes. Through mining of V ores (130 × 109 g V/y) and extraction and combustion of fossil fuels (600 × 109 g V/y), humans are the predominant force in the geochemical cycle of V at Earth’s surface. Human emissions of V to the atmosphere are now likely to exceed background emissions by as much as a factor of 1.7, and, presumably, we have altered the deposition of V from the atmosphere by a similar amount. Excessive V in air and water has potential, but poorly documented, consequences for human health. Much of the atmospheric flux probably derives from emissions from the combustion of fossil fuels, but the magnitude of this flux depends on the type of fuel, with relatively low emissions from coal and higher contributions from heavy crude oils, tar sands bitumen, and petroleum coke. Increasing interest in petroleum derived from unconventional deposits is likely to lead to greater emissions of V to the atmosphere in the near future. Our analysis further suggests that the flux of V in rivers has been incremented by about 15% from human activities. Overall, the budget of dissolved V in the oceans is remarkably well balanced—with about 40 × 109 g V/y to 50 × 109 g V/y inputs and outputs, and a mean residence time for dissolved V in seawater of about 130,000 y with respect to inputs from rivers.

  3. Steam gasification of tyre waste, poplar, and refuse-derived fuel: A comparative analysis

    International Nuclear Information System (INIS)

    Galvagno, S.; Casciaro, G.; Casu, S.; Martino, M.; Mingazzini, C.; Russo, A.; Portofino, S.

    2009-01-01

    In the field of waste management, thermal disposal is a treatment option able to recover resources from 'end of life' products. Pyrolysis and gasification are emerging thermal treatments that work under less drastic conditions in comparison with classic direct combustion, providing for reduced gaseous emissions of heavy metals. Moreover, they allow better recovery efficiency since the process by-products can be used as fuels (gas, oils), for both conventional (classic engines and heaters) and high efficiency apparatus (gas turbines and fuel cells), or alternatively as chemical sources or as raw materials for other processes. This paper presents a comparative study of a steam gasification process applied to three different waste types (refuse-derived fuel, poplar wood and scrap tyres), with the aim of comparing the corresponding yields and product compositions and exploring the most valuable uses of the by-products

  4. Identification of hydrogeochemical processes and pollution sources of groundwater nitrate in Leiming Basin of Hainan island, Southern China

    Science.gov (United States)

    Shaowen, Y.; Zhan, Y., , Dr; Li, Q.

    2017-12-01

    Identifying the evolution of groundwater quality is important for the control and management of groundwater resources. The main aims of the present study are to identify the major factors affecting the hydrogeochemistry of groundwater resources and to evaluate the potential sources of groundwater nitrate in Leiming basin using chemical and isotopic methods. The majority of samples belong to the Na-Cl water type, followed by Ca-HCO3 and mixed Ca-Na-HCO3. The δ18O and δ2H values in groundwater indicate that the shallow fissure groundwater is mainly recharged by rainfall. Evaporated surface water is another significant origin of groundwater. The weathering and dissolution of different rocks and minerals, input of precipitation, evaporation, ion exchange and anthropogenic activities, especially agricultural activities, influence the hydrogeochemistry of the study area. NO₃⁻ concentration in the groundwater varies from 0.7 to 51.7 mg/L, and high values mainly occur in the densely populated area. The combined use of isotopic values and hydrochemical data suggests that the NO₃⁻ load in Leiming basin is derived not only from agricultural activities but also from other sources such as waste water and atmospheric deposition. Fertilizer is considered the major source of NO₃⁻ in the groundwater in Leiming basin.

  5. Source term derivation and radiological safety analysis for the TRICO II research reactor in Kinshasa

    International Nuclear Information System (INIS)

    Muswema, J.L.; Ekoko, G.B.; Lukanda, V.M.; Lobo, J.K.-K.; Darko, E.O.; Boafo, E.K.

    2015-01-01

    Highlights: • Atmospheric dispersion modeling for two credible accidents of the TRIGA Mark II research reactor in Kinshasa (TRICO II) was performed. • Radiological safety analysis after the postulated initiating events (PIE) was also carried out. • The Karlsruhe KORIGEN and the HotSpot Health Physics codes were used to achieve the objectives of this study. • All the values of effective dose obtained following the accident scenarios were below the regulatory limits for reactor staff members and the public, respectively. - Abstract: The source term from the 1 MW TRIGA Mark II research reactor core of the Democratic Republic of the Congo was derived in this study. An atmospheric dispersion modeling followed by radiation dose calculation were performed based on two possible postulated accident scenarios. This derivation was made from an inventory of peak radioisotope activities released in the core by using the Karlsruhe version of isotope generation code KORIGEN. The atmospheric dispersion modeling was performed with HotSpot code, and its application yielded to radiation dose profile around the site using meteorological parameters specific to the area under study. The two accident scenarios were picked from possible accident analyses for TRIGA and TRIGA-fueled reactors, involving the case of destruction of the fuel element with highest activity release and a plane crash on the reactor building as the worst case scenario. Deterministic effects of these scenarios are used to update the Safety Analysis Report (SAR) of the reactor, and for its current version, these scenarios are not yet incorporated. Site-specific meteorological conditions were collected from two meteorological stations: one installed within the Atomic Energy Commission and another at the National Meteorological Agency (METTELSAT), which is not far from the site. Results show that in both accident scenarios, radiation doses remain within the limits, far below the recommended maximum effective

  6. Source term derivation and radiological safety analysis for the TRICO II research reactor in Kinshasa

    Energy Technology Data Exchange (ETDEWEB)

    Muswema, J.L., E-mail: jeremie.muswem@unikin.ac.cd [Faculty of Science, University of Kinshasa, P.O. Box 190, KIN XI (Congo, The Democratic Republic of the); Ekoko, G.B. [Faculty of Science, University of Kinshasa, P.O. Box 190, KIN XI (Congo, The Democratic Republic of the); Lukanda, V.M. [Faculty of Science, University of Kinshasa, P.O. Box 190, KIN XI (Congo, The Democratic Republic of the); Democratic Republic of the Congo' s General Atomic Energy Commission, P.O. Box AE1 (Congo, The Democratic Republic of the); Lobo, J.K.-K. [Faculty of Science, University of Kinshasa, P.O. Box 190, KIN XI (Congo, The Democratic Republic of the); Darko, E.O. [Radiation Protection Institute, Ghana Atomic Energy Commission, P.O. Box LG 80, Legon, Accra (Ghana); Boafo, E.K. [University of Ontario Institute of Technology, 2000 Simcoe St. North, Oshawa, ONL1 H7K4 (Canada)

    2015-01-15

    Highlights: • Atmospheric dispersion modeling for two credible accidents of the TRIGA Mark II research reactor in Kinshasa (TRICO II) was performed. • Radiological safety analysis after the postulated initiating events (PIE) was also carried out. • The Karlsruhe KORIGEN and the HotSpot Health Physics codes were used to achieve the objectives of this study. • All the values of effective dose obtained following the accident scenarios were below the regulatory limits for reactor staff members and the public, respectively. - Abstract: The source term from the 1 MW TRIGA Mark II research reactor core of the Democratic Republic of the Congo was derived in this study. An atmospheric dispersion modeling followed by radiation dose calculation were performed based on two possible postulated accident scenarios. This derivation was made from an inventory of peak radioisotope activities released in the core by using the Karlsruhe version of isotope generation code KORIGEN. The atmospheric dispersion modeling was performed with HotSpot code, and its application yielded to radiation dose profile around the site using meteorological parameters specific to the area under study. The two accident scenarios were picked from possible accident analyses for TRIGA and TRIGA-fueled reactors, involving the case of destruction of the fuel element with highest activity release and a plane crash on the reactor building as the worst case scenario. Deterministic effects of these scenarios are used to update the Safety Analysis Report (SAR) of the reactor, and for its current version, these scenarios are not yet incorporated. Site-specific meteorological conditions were collected from two meteorological stations: one installed within the Atomic Energy Commission and another at the National Meteorological Agency (METTELSAT), which is not far from the site. Results show that in both accident scenarios, radiation doses remain within the limits, far below the recommended maximum effective
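
    HotSpot's dispersion estimates rest on the Gaussian plume model; the stripped-down ground-level, centerline form of that model is sketched below for orientation. This is the generic textbook relation with placeholder dispersion coefficients, not HotSpot's stability-class parameterization, and it omits plume rise, depletion, decay and the dose-conversion step.

```python
# Generic ground-level, centerline Gaussian plume concentration,
# C = Q / (pi * sigma_y * sigma_z * u) * exp(-H^2 / (2 * sigma_z^2)),
# the relation underlying codes such as HotSpot. The sigma power laws and the
# example numbers are placeholders, not HotSpot's parameterization.
import math

def plume_concentration(q_bq_per_s, wind_m_per_s, x_m, release_height_m):
    sigma_y = 0.08 * x_m ** 0.90   # assumed horizontal dispersion coefficient, m
    sigma_z = 0.06 * x_m ** 0.85   # assumed vertical dispersion coefficient, m
    return (q_bq_per_s / (math.pi * sigma_y * sigma_z * wind_m_per_s)
            * math.exp(-release_height_m ** 2 / (2.0 * sigma_z ** 2)))

# Assumed example: 1e9 Bq/s release, 3 m/s wind, receptor 500 m downwind, 10 m height.
print(plume_concentration(1e9, 3.0, 500.0, 10.0), "Bq/m^3")
```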

  7. Impedance source power electronic converters

    CERN Document Server

    Liu, Yushan; Ge, Baoming; Blaabjerg, Frede; Ellabban, Omar; Loh, Poh Chiang

    2016-01-01

    Impedance Source Power Electronic Converters brings together state of the art knowledge and cutting edge techniques in various stages of research related to the ever more popular impedance source converters/inverters. Significant research efforts are underway to develop commercially viable and technically feasible, efficient and reliable power converters for renewable energy, electric transportation and for various industrial applications. This book provides a detailed understanding of the concepts, designs, controls, and application demonstrations of the impedance source converters/inverters. Key features: Comprehensive analysis of the impedance source converter/inverter topologies, including typical topologies and derived topologies. Fully explains the design and control techniques of impedance source converters/inverters, including hardware design and control parameter design for corresponding control methods. Presents the latest power conversion solutions that aim to advance the role of pow...

  8. Baseliner: An open-source, interactive tool for processing sap flux data from thermal dissipation probes

    Directory of Open Access Journals (Sweden)

    A. Christopher Oishi

    2016-01-01

    Estimating transpiration from woody plants using thermal dissipation sap flux sensors requires careful data processing. Currently, researchers accomplish this using spreadsheets or by personally writing scripts for statistical software programs (e.g., R, SAS). We developed the Baseliner software to help establish a standardized protocol for processing sap flux data. Baseliner enables users to QA/QC and process data using a combination of automated steps, visualization, and manual editing. Data processing requires establishing a zero-flow reference value, or “baseline”, which varies among sensors and with time. Since no set of algorithms currently exists to reliably QA/QC and estimate the zero-flow baseline, Baseliner provides a graphical user interface to allow visual inspection and manipulation of data. Data are first automatically processed using a set of user-defined parameters. The user can then view the data for additional, manual QA/QC and baseline identification using mouse and keyboard commands. The open-source software allows for user customization of data processing algorithms as improved methods are developed.
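
    Thermal dissipation (Granier-type) probes convert the temperature difference between a heated and a reference needle into sap flux density once a zero-flow baseline is known. The sketch below is a minimal illustration of that calculation using the widely cited Granier calibration; it is not Baseliner's code, and the naive nightly-maximum baseline stands in for the combination of automated and manual baselining described in the abstract.

```python
import numpy as np

def sap_flux_density(dT, dT_max):
    """Granier-style sap flux density from thermal dissipation probes.
    dT     : measured temperature difference (deg C) between heated and reference probes
    dT_max : zero-flow baseline, i.e. the temperature difference when sap flow is zero
    Returns flux density in m^3 m^-2 s^-1 using the generic published Granier
    coefficients, which users would normally replace with their own calibration."""
    K = (dT_max - dT) / dT
    return 119e-6 * np.maximum(K, 0.0) ** 1.231

# toy record for one day: dT is largest at night when flow is assumed to be zero
dT = np.array([10.0, 10.1, 10.2, 9.0, 7.5, 6.8, 7.2, 8.5, 9.8, 10.1])
dT_max = dT.max()                     # naive nightly-maximum baseline
print(sap_flux_density(dT, dT_max))
```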

  9. High frequency source localization in a shallow ocean sound channel using frequency difference matched field processing.

    Science.gov (United States)

    Worthmann, Brian M; Song, H C; Dowling, David R

    2015-12-01

    Matched field processing (MFP) is an established technique for source localization in known multipath acoustic environments. Unfortunately, in many situations, particularly those involving high frequency signals, imperfect knowledge of the actual propagation environment prevents accurate propagation modeling and source localization via MFP fails. For beamforming applications, this actual-to-model mismatch problem was mitigated through a frequency downshift, made possible by a nonlinear array-signal-processing technique called frequency difference beamforming [Abadi, Song, and Dowling (2012). J. Acoust. Soc. Am. 132, 3018-3029]. Here, this technique is extended to conventional (Bartlett) MFP using simulations and measurements from the 2011 Kauai Acoustic Communications MURI experiment (KAM11) to produce ambiguity surfaces at frequencies well below the signal bandwidth where the detrimental effects of mismatch are reduced. Both the simulation and experimental results suggest that frequency difference MFP can be more robust against environmental mismatch than conventional MFP. In particular, signals of frequency 11.2 kHz-32.8 kHz were broadcast 3 km through a 106-m-deep shallow ocean sound channel to a sparse 16-element vertical receiving array. Frequency difference MFP unambiguously localized the source in several experimental data sets with average peak-to-side-lobe ratio of 0.9 dB, average absolute-value range error of 170 m, and average absolute-value depth error of 10 m.
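
    The core manipulation in frequency difference MFP is to form a quadratic product of the received field at two in-band frequencies so that usable phase structure appears at the much lower difference frequency, and then to beamform that product against replicas computed at the difference frequency with a conventional Bartlett processor. The sketch below shows only this bookkeeping with a toy free-space propagation model; the array geometry, frequencies and replica model are assumptions and bear no relation to the KAM11 waveguide modeling.

```python
import numpy as np

c = 1500.0                               # sound speed (m/s), assumed constant
array_z = np.linspace(10, 100, 16)       # 16-element vertical array depths (m)

def field(src_r, src_z, f):
    """Toy single-path 'propagation model': spherical spreading only.
    Real MFP would use a normal-mode or PE model of the waveguide."""
    r = np.hypot(src_r, array_z - src_z)
    k = 2 * np.pi * f / c
    return np.exp(1j * k * r) / r

def bartlett_df_mfp(p_f1, p_f2, df, cand_r, cand_z):
    """Bartlett ambiguity surface built from the frequency-difference
    autoproduct p(f2) * conj(p(f1)), beamformed against replicas at df."""
    ap = p_f2 * np.conj(p_f1)            # autoproduct, carries phase at df
    amb = np.zeros((len(cand_r), len(cand_z)))
    for i, r in enumerate(cand_r):
        for j, z in enumerate(cand_z):
            w = field(r, z, df)          # replica at the difference frequency
            w /= np.linalg.norm(w)
            amb[i, j] = np.abs(np.vdot(w, ap)) ** 2 / np.vdot(ap, ap).real
    return amb

# "measured" fields at two in-band frequencies from a source at 3 km range, 40 m depth
f1, f2 = 11200.0, 11700.0
p1, p2 = field(3000.0, 40.0, f1), field(3000.0, 40.0, f2)
amb = bartlett_df_mfp(p1, p2, f2 - f1,
                      np.linspace(2000, 4000, 41), np.linspace(5, 105, 21))
i, j = np.unravel_index(amb.argmax(), amb.shape)
print("ambiguity peak at range index", i, "and depth index", j)
```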

  10. Contribution of electric energy to the process of elimination of low emission sources in Cracow

    Energy Technology Data Exchange (ETDEWEB)

    Lach, J.; Mejer, T.; Wybranski, A. [Power Distribution Plant, Cracow (Poland)

    1995-12-31

    At present, energy supply is one of the most important global problems. A significant part of energy is consumed for residential heating purposes. Depending on climatic conditions, fuel distribution and the level of technological development, the share of energy consumed for these purposes ranges between ca. 50% (Poland) and ca. 12% (Spain). The power engineering structure in Poland is based almost exclusively upon solid fuels, i.e. hard and brown coal. Chemical compounds (carbon dioxide, sulfur dioxide and nitrogen oxides) produced in the combustion process negatively affect the natural environment, and the contribution of residential heating to this effect is significant. Because the resources of fossil fuels (currently the most important source of energy) are limited and their influence on the natural environment is negative, efforts are made to find more effective ways of consuming energy and to reduce pollutant emission from heating sources. This problem is a topical issue in Cracow, especially during the heating season, because the coal-fired stoves situated in the central part of the town remain the most important source of pollutant emission. These sources pose a serious menace to the health of the inhabitants; furthermore, the pollutants damage Cracow monuments entered on the UNESCO World Heritage List.

  11. Source inversion in the full-wave tomography; Full wave tomography ni okeru source inversion

    Energy Technology Data Exchange (ETDEWEB)

    Tsuchiya, T [DIA Consultants Co. Ltd., Tokyo (Japan)

    1997-10-22

    In order to account for the characteristics of the vibration source in full-wave tomography (FWT), a method to invert vibration source parameters together with the V(p)/V(s) distribution was studied. The study expanded an analysis method that uses as its basis the gradient method invented by Tarantola and the partial space method invented by Sambridge, and conducted numerical experiments. Experiment No. 1 performed inversion of only the vibration source parameters, and experiment No. 2 executed simultaneous inversion of the V(p)/V(s) distribution and the vibration source parameters. The discussion revealed that an effective analytical procedure would be as follows: in order to predict maximum stress, the average vibration source parameters and the property parameters are first inverted simultaneously; in order to estimate each vibration source parameter with high accuracy, the property parameters are fixed and each vibration source parameter is inverted individually; finally, the derived vibration source parameters are fixed and the property parameters are again inverted from the initial values. 5 figs., 2 tabs.

  12. Probing the heat sources during thermal runaway process by thermal analysis of different battery chemistries

    Science.gov (United States)

    Zheng, Siqi; Wang, Li; Feng, Xuning; He, Xiangming

    2018-02-01

    Safety is a very important issue for lithium ion batteries used in electric vehicles and other applications. This paper probes the heat sources in the thermal runaway processes of lithium ion batteries composed of different chemistries using accelerating rate calorimetry (ARC) and differential scanning calorimetry (DSC). The adiabatic thermal runaway features of the 4 types of commercial lithium ion batteries are tested using ARC, whereas the reaction characteristics of the component materials, including the cathode, the anode and the separator, inside the 4 types of batteries are measured using DSC. The peaks and valleys of the critical component reactions measured by DSC can match the fluctuations in the temperature rise rate measured by ARC; therefore, the correspondence between the DSC curves and the ARC curves is utilized to probe the heat sources in the thermal runaway process and reveal the thermal runaway mechanisms. The results and analysis indicate that internal short circuit is not the only path to thermal runaway, but it can lead to extra electrical heat, which is comparable with the heat released by chemical reactions. The analytical approach to the thermal runaway mechanisms in this paper can guide the safety design of commercial lithium ion batteries.

  13. Full–waveform inversion using the excitation representation of the source wavefield

    KAUST Repository

    Kalita, Mahesh

    2016-09-06

    Full waveform inversion (FWI) is an iterative method of data-fitting, aiming at high resolution recovery of the unknown model parameters. However, it is a cumbersome process, requiring a long computational time and large memory space/disc storage. One of the reasons for this computational limitation is the gradient calculation step. Based on the adjoint state method, it involves the temporal cross-correlation of the forward propagated source wavefield with the backward propagated residuals, in which we usually need to store the source wavefield, or include an extra extrapolation step to propagate the source wavefield from its storage at the boundary. We propose, alternatively, an amplitude excitation gradient calculation based on the excitation imaging condition concept that represents the source wavefield history by a single, specifically the most energetic arrival. An excitation based Born modeling allows us to derive the adjoint operation. In this case, the source wavelet is injected by a cross-correlation step applied to the data residual directly. Representing the source wavefield through the excitation amplitude and time, we reduce the large requirements for both storage and the computational time. We demonstrate the application of this approach on a 2-layer model with an anomaly and the Marmousi II model.
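
    The saving described above comes from replacing the stored source wavefield with two maps: the amplitude and the time of the most energetic arrival at every grid point, which then enter the gradient as a single product instead of a full temporal cross-correlation. The sketch below illustrates that reduction on arbitrary toy wavefields; the array shapes, names and the simple zero-lag product are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def excitation_representation(src_wavefield):
    """src_wavefield: array of shape (nt, nz, nx) from forward modeling.
    Returns, per grid point, the amplitude and time index of the most
    energetic arrival, the only information kept in this approach."""
    t_exc = np.abs(src_wavefield).argmax(axis=0)                      # (nz, nx)
    a_exc = np.take_along_axis(src_wavefield, t_exc[None], axis=0)[0]
    return a_exc, t_exc

def excitation_gradient(a_exc, t_exc, receiver_wavefield):
    """Approximate gradient: the temporal cross-correlation of source and
    receiver wavefields collapses to one product per grid point, evaluated
    at the excitation time."""
    nz, nx = t_exc.shape
    grad = np.empty((nz, nx))
    for iz in range(nz):
        for ix in range(nx):
            grad[iz, ix] = a_exc[iz, ix] * receiver_wavefield[t_exc[iz, ix], iz, ix]
    return grad

# toy wavefields, purely to exercise the bookkeeping
nt, nz, nx = 200, 30, 40
rng = np.random.default_rng(0)
u_src = rng.standard_normal((nt, nz, nx))
u_rec = rng.standard_normal((nt, nz, nx))
a, t = excitation_representation(u_src)
print(excitation_gradient(a, t, u_rec).shape)
```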

  14. Full–waveform inversion using the excitation representation of the source wavefield

    KAUST Repository

    Kalita, Mahesh; Alkhalifah, Tariq Ali

    2016-01-01

    Full waveform inversion (FWI) is an iterative method of data-fitting, aiming at high resolution recovery of the unknown model parameters. However, it is a cumbersome process, requiring a long computational time and large memory space/disc storage. One of the reasons for this computational limitation is the gradient calculation step. Based on the adjoint state method, it involves the temporal cross-correlation of the forward propagated source wavefield with the backward propagated residuals, in which we usually need to store the source wavefield, or include an extra extrapolation step to propagate the source wavefield from its storage at the boundary. We propose, alternatively, an amplitude excitation gradient calculation based on the excitation imaging condition concept that represents the source wavefield history by a single, specifically the most energetic arrival. An excitation based Born modeling allows us to derive the adjoint operation. In this case, the source wavelet is injected by a cross-correlation step applied to the data residual directly. Representing the source wavefield through the excitation amplitude and time, we reduce the large requirements for both storage and the computational time. We demonstrate the application of this approach on a 2-layer model with an anomaly and the Marmousi II model.

  15. Short-memory linear processes and econometric applications

    CERN Document Server

    Mynbaev, Kairat T

    2011-01-01

    This book serves as a comprehensive source of asymptotic results for econometric models with deterministic exogenous regressors. Such regressors include linear (more generally, piece-wise polynomial) trends, seasonally oscillating functions, and slowly varying functions including logarithmic trends, as well as some specifications of spatial matrices in the theory of spatial models. The book begins with central limit theorems (CLTs) for weighted sums of short memory linear processes. This part contains the analysis of certain operators in Lp spaces and their employment in the derivation of CLTs

  16. Interactions between barley grain processing and source of supplemental dietary fat on nitrogen metabolism and urea-nitrogen recycling in dairy cows.

    Science.gov (United States)

    Gozho, G N; Hobin, M R; Mutsvangwa, T

    2008-01-01

    The objective of this study was to determine the effects of methods of barley grain processing and source of supplemental fat on urea-N transfer to the gastrointestinal tract (GIT) and the utilization of this recycled urea-N in lactating dairy cows. Four ruminally cannulated Holstein cows (656.3 +/- 27.7 kg of BW; 79.8 +/- 12.3 d in milk) were used in a 4 x 4 Latin square design with 28-d periods and a 2 x 2 factorial arrangement of dietary treatments. Experimental diets contained dry-rolled barley or pelleted barley in combination with whole canola or whole flaxseed as supplemental fat sources. Nitrogen balance was measured from d 15 to 19, with concurrent measurements of urea-N kinetics using continuous intrajugular infusions of [15N 15N]-urea. Dry matter intake and N intake were higher in cows fed dry-rolled barley compared with those fed pelleted barley. Nitrogen retention was not affected by diet, but fecal N excretion was higher in cows fed dry-rolled barley than in those fed pelleted barley. Actual and energy-corrected milk yield were not affected by diet. Milk fat content and milk fat yield were higher in cows fed dry-rolled barley compared with those fed pelleted barley. Source of supplemental fat did not affect urea-N kinetics. Urea-N production was higher (442.2 vs. 334.3 g of N/d), and urea-N entering the GIT tended to be higher (272.9 vs. 202.0 g of N/d), in cows fed dry-rolled barley compared with those fed pelleted barley. The amount of urea-N entry into the GIT that was returned to the ornithine cycle was higher (204.1 vs. 159.5 g of N/d) in cows fed dry-rolled barley than in pelleted barley-fed cows. The amount of urea-N recycled to the GIT and used for anabolic purposes, and the amounts lost in the urine or feces were not affected by dietary treatment. Microbial nonammonia N supply, estimated using total urinary excretion of purine derivatives, was not affected by diet. These results show that even though barley grain processing altered urea

  17. Emission sources and quantities

    International Nuclear Information System (INIS)

    Heinen, B.

    1991-01-01

    The paper examines emission sources and quantities for SO2 and NOx. Natural SO2 is released from volcanic sources and to a much lower extent from marsh gases. In nature NOx is mainly produced in the course of the chemical and bacterial denitrification processes going on in the soil. Manmade pollutants are produced in combustion processes. The paper concentrates on manmade pollution. Aspects discussed include: mechanism of pollution development; manmade emission sources (e.g. industry, traffic, power plants and domestic sources); and emission quantities and forecasts. 11 refs., 2 figs., 5 tabs

  18. Adaptive frequency-difference matched field processing for high frequency source localization in a noisy shallow ocean.

    Science.gov (United States)

    Worthmann, Brian M; Song, H C; Dowling, David R

    2017-01-01

    Remote source localization in the shallow ocean at frequencies significantly above 1 kHz is virtually impossible for conventional array signal processing techniques due to environmental mismatch. A recently proposed technique called frequency-difference matched field processing (Δf-MFP) [Worthmann, Song, and Dowling (2015). J. Acoust. Soc. Am. 138(6), 3549-3562] overcomes imperfect environmental knowledge by shifting the signal processing to frequencies below the signal's band through the use of a quadratic product of frequency-domain signal amplitudes called the autoproduct. This paper extends these prior Δf-MFP results to various adaptive MFP processors found in the literature, with particular emphasis on minimum variance distortionless response, multiple constraint method, multiple signal classification, and matched mode processing at signal-to-noise ratios (SNRs) from -20 to +20 dB. Using measurements from the 2011 Kauai Acoustic Communications Multiple University Research Initiative experiment, the localization performance of these techniques is analyzed and compared to Bartlett Δf-MFP. The results show that a source broadcasting a frequency sweep from 11.2 to 26.2 kHz through a 106-m-deep sound channel over a distance of 3 km and recorded on a 16-element sparse vertical array can be localized using Δf-MFP techniques within average range and depth errors of 200 and 10 m, respectively, at SNRs down to 0 dB.
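
    Adaptive processors such as MVDR differ from the Bartlett processor only in how replica vectors are combined with the data covariance matrix estimated from (auto)product snapshots. The sketch below shows a minimal, diagonally loaded MVDR evaluation of candidate replicas; the snapshot construction, replica modeling and all numbers are placeholders rather than the processing applied to the KAM11 data.

```python
import numpy as np

def mvdr_surface(snapshots, replicas, loading=1e-3):
    """Minimum variance distortionless response (MVDR) ambiguity values.
    snapshots: (n_snap, n_elem) complex array of (auto)product data vectors
    replicas : (n_cand, n_elem) complex array of modeled replica vectors
    Diagonal loading stabilizes the covariance inverse at low SNR."""
    n_snap, n_elem = snapshots.shape
    R = snapshots.conj().T @ snapshots / n_snap              # sample covariance
    R += loading * np.trace(R).real / n_elem * np.eye(n_elem)
    R_inv = np.linalg.inv(R)
    out = np.empty(len(replicas))
    for i, w in enumerate(replicas):
        w = w / np.linalg.norm(w)
        out[i] = 1.0 / np.real(w.conj() @ R_inv @ w)         # MVDR output power
    return out

# toy example: 16-element array, 3 candidate replicas, 50 noisy snapshots
rng = np.random.default_rng(1)
true = np.exp(1j * 2 * np.pi * rng.random(16))
snaps = true[None, :] + 0.5 * (rng.standard_normal((50, 16))
                               + 1j * rng.standard_normal((50, 16)))
cands = np.stack([true,
                  np.exp(1j * 2 * np.pi * rng.random(16)),
                  np.exp(1j * 2 * np.pi * rng.random(16))])
print(mvdr_surface(snaps, cands))   # the first (matched) candidate should dominate
```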

  19. CHARACTERIZING THE HEAVY ELEMENTS IN GLOBULAR CLUSTER M22 AND AN EMPIRICAL s-PROCESS ABUNDANCE DISTRIBUTION DERIVED FROM THE TWO STELLAR GROUPS

    International Nuclear Information System (INIS)

    Roederer, I. U.; Marino, A. F.; Sneden, C.

    2011-01-01

    We present an empirical s-process abundance distribution derived with explicit knowledge of the r-process component in the low-metallicity globular cluster M22. We have obtained high-resolution, high signal-to-noise spectra for six red giants in M22 using the Magellan Inamori Kyocera Echelle spectrograph on the Magellan-Clay Telescope at Las Campanas Observatory. In each star we derive abundances for 44 species of 40 elements, including 24 elements heavier than zinc (Z = 30) produced by neutron-capture reactions. Previous studies determined that three of these stars (the 'r+s group') have an enhancement of s-process material relative to the other three stars (the 'r-only group'). We confirm that the r+s group is moderately enriched in Pb relative to the r-only group. Both groups of stars were born with the same amount of r-process material, but s-process material was also present in the gas from which the r+s group formed. The s-process abundances are inconsistent with predictions for asymptotic giant branch (AGB) stars with M ≤ 3 M☉ and suggest an origin in more massive AGB stars capable of activating the 22Ne(α,n)25Mg reaction. We calculate the s-process 'residual' by subtracting the r-process pattern in the r-only group from the abundances in the r+s group. In contrast to previous r- and s-process decompositions, this approach makes no assumptions about the r- and s-process distributions in the solar system and provides a unique opportunity to explore s-process yields in a metal-poor environment.
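
    The "residual" calculation described above is a subtraction carried out in linear abundance space: the r-process pattern measured in the r-only stars is removed from the r+s stars, and the remainder is attributed to the s-process. The sketch below shows that arithmetic; the element names and log epsilon values are placeholders, not the M22 measurements.

```python
import numpy as np

def s_process_residual(log_eps_rs, log_eps_r):
    """Empirical s-process residual: subtract the r-process pattern (r-only
    group) from the r+s group in linear abundance space.
    Inputs and output are log eps = log10(N_X / N_H) + 12."""
    n_s = 10.0 ** log_eps_rs - 10.0 ** log_eps_r
    return np.log10(n_s)

# placeholder abundances for two heavy elements (not the paper's measurements)
elements   = ["Ba", "Pb"]
log_eps_rs = np.array([0.60, 0.90])   # r+s group
log_eps_r  = np.array([0.10, 0.20])   # r-only group
print(dict(zip(elements, np.round(s_process_residual(log_eps_rs, log_eps_r), 2))))
```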

  20. Frequency-domain inversion using the amplitude of the derivative wavefield with respect to the angular frequency

    KAUST Repository

    Choi, Yun Seok

    2012-01-01

    The instantaneous-traveltime-based inversion was developed to solve the phase wrapping problem, thus generating long-wavelength structures even for a single high frequency. However, it required aggressive damping to ensure proper convergence. One reason is the potential for unstable division in the calculation of the instantaneous traveltime for low damping factors. Thus, we propose an inversion algorithm using the amplitude of the derivative wavefield to avoid the unstable division process. Since the amplitude of the derivative wavefield contains the unwrapped-phase information, its inversion has the potential to provide robust inversion results. On the other hand, the damping term rapidly diminishes the amplitude of the derivative wavefield at far source-receiver offsets. As an alternative, we suggest using the logarithmic amplitude of the derivative wavefield. The gradient of this inversion algorithm is obtained by the back-propagation approach, based on the adjoint-state technique. Numerical examples show that the logarithmic-amplitude approach yields better convergent results than the instantaneous traveltime inversion, whereas the pure-amplitude approach does not show much convergence.
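
    A minimal way to picture the proposed objective function is a least-squares misfit between the logarithmic amplitudes of the derivative wavefields (the derivative of the monochromatic wavefield with respect to angular frequency) at the receivers. The sketch below evaluates only that misfit; the finite-difference frequency derivative, the small stabilization constant and the toy data are assumptions, and the adjoint-state gradient computation is omitted.

```python
import numpy as np

def log_amp_misfit(dudw_mod, dudw_obs, eps=1e-30):
    """Least-squares misfit between logarithmic amplitudes of the derivative
    wavefields du/d(omega) (complex, one value per receiver)."""
    r = np.log(np.abs(dudw_mod) + eps) - np.log(np.abs(dudw_obs) + eps)
    return 0.5 * np.sum(r ** 2)

def dudw_finite_difference(u_w, u_w_plus, d_omega):
    """Derivative of the monochromatic wavefield with respect to angular
    frequency, approximated here from two nearby frequencies."""
    return (u_w_plus - u_w) / d_omega

# toy single-frequency data at 64 receivers
rng = np.random.default_rng(2)
u_obs = rng.standard_normal(64) + 1j * rng.standard_normal(64)
u_mod = rng.standard_normal(64) + 1j * rng.standard_normal(64)
obs = dudw_finite_difference(u_obs, 1.01 * u_obs, 0.1)
mod = dudw_finite_difference(u_mod, 1.02 * u_mod, 0.1)
print(log_amp_misfit(mod, obs))
```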

  1. Effects of the curing methods on the fabrication of polycarbosilane derived SiCf/SiC composite

    International Nuclear Information System (INIS)

    Park, Ji Yeon; Kim, Weon Ju; Ryu, Woo Seog; Woo, Chang Hyun; Han, Bum Soo

    2005-01-01

    Silicon carbide has potential advantages for structural applications in next-generation energy systems (VHTR, GFR and the fusion reactor) due to its unique properties, such as good irradiation resistance and thermo-mechanical properties, less severe waste generation from neutron activation, and improved plant conversion efficiencies through higher operating temperatures. Among the several fabrication processes for SiCf/SiC composites, the polymer impregnation and pyrolysis (PIP) process is the only method derived from polymeric precursors. In the PIP process, careful control of the oxygen content is important to avoid property degradation at high temperature, because polymeric precursors are used as the source materials of the SiC ceramics. During the polymer precursor conversion process, unintended oxygen may be introduced during cross-linking, producing Si-O-Si bonds at the curing step. A high oxygen content degrades the high-temperature stability of SiC ceramics. Therefore, a decrease of the oxygen content is desirable to obtain SiC ceramics with high-temperature stability. One of the methods to reduce the oxygen content of polymer-derived SiC ceramics is the irradiation curing process by gamma ray or electron beam. Polymer-derived SiC ceramics with low oxygen content prepared by electron beam curing showed improved thermal stability at higher temperature. In this study, the electron beam (EB) and thermal oxidation curing methods were applied to fabricate SiCf/SiC composites from a polymer precursor, polycarbosilane (PCS), by the PIP process. The curing effects, pyrolysis behavior and high-temperature stability were evaluated.

  2. Trimethylsilyl derivatives of organic compounds in source samples and in atmospheric fine particulate matter.

    Science.gov (United States)

    Nolte, Christopher G; Schauer, James J; Cass, Glen R; Simoneit, Bernd R T

    2002-10-15

    Source sample extracts of vegetative detritus, motor vehicle exhaust, tire dust, paved road dust, and cigarette smoke have been silylated and analyzed by GC-MS to identify polar organic compounds that may serve as tracers for those specific emission sources of atmospheric fine particulate matter. Candidate molecular tracers were also identified in atmospheric fine particle samples collected in the San Joaquin Valley of California. A series of normal primary alkanols, dominated by even carbon-numbered homologues from C26 to C32, the secondary alcohol 10-nonacosanol, and some phytosterols are prominent polar compounds in the vegetative detritus source sample. No new polar organic compounds are found in the motor vehicle exhaust samples. Several hydrogenated resin acids are present in the tire dust sample, which might serve as useful tracers for those sources in areas that are heavily impacted by motor vehicle traffic. Finally, the alcohol and sterol emission profiles developed for all the source samples examined in this project are scaled according to the ambient fine particle mass concentrations attributed to those sources by a chemical mass balance receptor model that was previously applied to the San Joaquin Valley to compute the predicted atmospheric concentrations of individual alcohols and sterols. The resulting underprediction of alkanol concentrations at the urban sites suggests that alkanols may be more sensitive tracers for natural background from vegetative emissions (i.e., waxes) than the high molecular weight alkanes, which have been the best previously available tracers for that source.
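
    The scaling step mentioned at the end of the abstract follows the usual chemical mass balance logic: predicted ambient tracer concentrations are the product of per-source emission profiles (mass of tracer per mass of fine-particle emissions) and the source contributions to ambient fine-particle mass apportioned by the receptor model. The sketch below shows that matrix product with placeholder profile and contribution values, not the study's measured profiles.

```python
import numpy as np

# rows: tracer compounds, columns: sources; entries are ng of compound per
# microgram of fine-particle mass emitted (placeholder profile values)
profiles = np.array([[4.0, 0.0, 0.1],    # e.g. a C28 alkanol
                     [0.5, 0.0, 2.0],    # e.g. a phytosterol
                     [0.0, 1.5, 0.0]])   # e.g. a hydrogenated resin acid
sources = ["vegetative detritus", "tire dust", "cigarette smoke"]

# receptor-model contributions of each source to ambient PM2.5 (ug/m3),
# placeholder values standing in for the CMB output
contrib = np.array([1.2, 0.4, 0.8])

predicted = profiles @ contrib           # ng/m3 of each tracer compound
for name, val in zip(["alkanol", "phytosterol", "resin acid"], predicted):
    print(f"predicted ambient {name}: {val:.2f} ng/m3")
```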

  3. Bone marrow-derived mesenchymal stem cells versus adipose-derived mesenchymal stem cells for peripheral nerve regeneration

    Directory of Open Access Journals (Sweden)

    Marcela Fernandes

    2018-01-01

    Studies have confirmed that bone marrow-derived mesenchymal stem cells (MSCs) can be used for treatment of several nervous system diseases. However, isolation of bone marrow-derived MSCs (BMSCs) is an invasive and painful process and the yield is very low. Therefore, there is a need to search for other alternative stem cell sources. Adipose-derived MSCs (ADSCs) have phenotypic and gene expression profiles similar to those of BMSCs. The production of ADSCs is greater than that of BMSCs, and ADSCs proliferate faster than BMSCs. To compare the effects of venous grafts containing BMSCs or ADSCs on sciatic nerve injury, in this study, rats were randomly divided into four groups: sham (only sciatic nerve exposed), Matrigel (MG; sciatic nerve injury + intravenous transplantation of MG vehicle), ADSCs (sciatic nerve injury + intravenous MG containing ADSCs), and BMSCs (sciatic nerve injury + intravenous MG containing BMSCs). The sciatic functional index was calculated to evaluate the function of the injured sciatic nerve. Morphologic characteristics of nerves distal to the lesion were observed by toluidine blue staining. Spinal motor neurons labeled with Fluoro-Gold were quantitatively assessed. Compared with sham-operated rats, the sciatic functional index was lower, the density of small-diameter fibers was significantly increased, and the number of motor neurons significantly decreased in rats with sciatic nerve injury. Neither ADSCs nor BMSCs significantly improved the sciatic nerve function of rats with sciatic nerve injury, or increased fiber density, fiber diameters, axonal diameters, myelin sheath thickness, or G ratios (axonal diameter/fiber diameter) in the sciatic nerve distal to the lesion site. There was no significant difference in the number of spinal motor neurons among the ADSCs, BMSCs and MG groups. These results suggest that neither BMSCs nor ADSCs provide satisfactory results for peripheral nerve repair when using MG as the conductor for

  4. Derivation of uranium residual radioactive material guidelines for the Elza Gate Site

    International Nuclear Information System (INIS)

    Cheng, J.J.; Yu, C.; Devgun, J.S.

    1991-02-01

    Residual radioactive material guidelines for uranium were derived for a large, homogeneously contaminated area at the Elza Gate Site in Oak Ridge, Tennessee. The derivation of the single-nuclide and total uranium guidelines was based on the requirement that the 50-year committed effective dose equivalent to a hypothetical individual who lives or works in the immediate vicinity of the Elza Gate Site should not exceed a dose of 100 mrem/yr following decontamination. The DOE residual radioactive guideline computer code RESRAD was used in this evaluation. Four potential scenarios were considered for the site; the scenarios vary with regard to time spent at the site, sources of water used, and sources of food consumed. The results of the evaluation indicate that the basic dose limit of 100 mrem/yr will not be exceeded for uranium within 1000 years, provided that the soil concentration of uranium at the Elza Gate Site does not exceed the following levels: 1800 pCi/g for Scenario A (industrial worker: the expected scenario); 4000 pCi/g for Scenario B (recreationist: a plausible scenario); 470 pCi/g for Scenario C (resident farmer using pond water as the only water source: a possible but unlikely scenario); and 120 pCi/g for Scenario D (resident farmer using well water as the only water source: a possible but unlikely scenario). The uranium guideline applies to the total activity concentration of uranium isotopes in their natural activity concentration ratio of 1:1:0.046. These guidelines are calculated on the basis of a dose of 100 mrem/yr. In setting the actual uranium guideline for the Elza Gate Site, the DOE will apply the as low as reasonably achievable (ALARA) policy to the decision-making process, along with other factors, such as determining whether a particular scenario is reasonable and appropriate. 10 refs., 3 figs., 7 tabs
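
    Once a pathway code such as RESRAD has produced a scenario-specific dose-to-source ratio (dose rate per unit soil concentration), the guideline concentration is simply the dose limit divided by that ratio. The sketch below reproduces this arithmetic; the dose-to-source ratios are hypothetical values chosen only so that the quotients land near the guideline levels quoted in the abstract, and they are not RESRAD output.

```python
DOSE_LIMIT = 100.0   # mrem/yr, basic dose limit used in the derivation

# hypothetical dose-to-source ratios (mrem/yr per pCi/g of uranium in soil)
# for the four exposure scenarios; real values would come from RESRAD runs
dsr = {"A (industrial worker)":    0.056,
       "B (recreationist)":        0.025,
       "C (farmer, pond water)":   0.21,
       "D (farmer, well water)":   0.83}

for scenario, ratio in dsr.items():
    guideline = DOSE_LIMIT / ratio   # pCi/g of uranium that just meets the limit
    print(f"Scenario {scenario}: guideline ~ {guideline:.0f} pCi/g")
```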

  5. Neutron fluctuations a treatise on the physics of branching processes

    CERN Document Server

    Pazsit, Imre; Pzsit, Imre

    2007-01-01

    The transport of neutrons in a multiplying system is an area of branching processes with a clear formalism. This book presents an account of the mathematical tools used in describing branching processes, which are then used to derive a large number of properties of the neutron distribution in multiplying systems with or without an external source. In the second part of the book, the theory is applied to the description of the neutron fluctuations in nuclear reactor cores as well as in small samples of fissile material. The question of how to extract information about the system under study is discussed. In particular the measurement of the reactivity of subcritical cores, driven with various Poisson and non-Poisson (pulsed) sources, and the identification of fissile material samples, is illustrated. The book gives pragmatic information for those planning and executing and evaluating experiments on such systems. - Gives a complete treatise of the mathematics of branching particle processes, and in particular n...

  6. Transuranic radionuclides in the Columbia River: sources, inventories, and geochemical behavior

    International Nuclear Information System (INIS)

    Beasley, T.M.

    1987-01-01

    The sources, inventories, and geochemical behavior of transuranic and other long-lived radionuclides in the lower Columbia River are summarized. Inventories have been estimated from the measured activities of the different radionuclides in 50 cores raised in 1977 and 1978, while annual export of transuranic radionuclides was determined from monthly water collections in the estuary. Continental shelf inventories of Pu and Am isotopes have been estimated using excess 210Pb inventories and the mean 210Pb/239,240Pu inventory ratio of 100 +/- 19 observed in representative cores raised from the shelf. Despite the substantial past addition of radioactivity to the river from operation of the plutonium production reactors at Hanford, the amounts of reactor-derived radionuclides in river sediments are small relative to fallout-derived nuclides. Erosional processes have mobilized both fallout-derived 239,240Pu and 137Cs from the landscape to the river, but the quantities involved represent <1% of their fallout inventories within the river's drainage basin. 36 references, 6 figures, 2 tables

  7. Powder Metallurgy Processing of a WxTaTiVCr High-Entropy Alloy and Its Derivative Alloys for Fusion Material Applications.

    Science.gov (United States)

    Waseem, Owais Ahmed; Ryu, Ho Jin

    2017-05-16

    The WxTaTiVCr high-entropy alloy with 32 at.% of tungsten (W) and its derivative alloys with 42 to 90 at.% of W with in-situ TiC were prepared via the mixing of elemental W, Ta, Ti, V and Cr powders followed by spark plasma sintering for the development of reduced-activation alloys for fusion plasma-facing materials. Characterization of the sintered samples revealed a BCC lattice and a multi-phase structure. The selected-area diffraction patterns confirmed the formation of TiC in the high-entropy alloy and its derivative alloys. It revealed the development of C15 (cubic) Laves phases as well in alloys with 71 to 90 at.% W. A mechanical examination of the samples revealed a more than twofold improvement in the hardness and strength due to solid-solution strengthening and dispersion strengthening. This study explored the potential of powder metallurgy processing for the fabrication of a high-entropy alloy and other derived compositions with enhanced hardness and strength.

  8. Sources for charged particles; Les sources de particules chargees

    Energy Technology Data Exchange (ETDEWEB)

    Arianer, J.

    1997-09-01

    This document is a basic course on charged particle sources for post-graduate students and thematic schools on large facilities and accelerator physics. A simple but precise description of the creation and the emission of charged particles is presented. This course relies on reference documents that are updated every year. The following relevant topics are considered: electronic emission processes, technological and practical considerations on electron guns, positron sources, production of neutral atoms, ionization, plasma and discharge, different types of positive and negative ion sources, polarized particle sources, materials for the construction of ion sources, low energy beam production and transport. (N.T.).

  9. Source-to-incident flux relation for a tokamak fusion test reactor blanket module

    International Nuclear Information System (INIS)

    Imel, G.R.

    1982-01-01

    The source-to-incident 14-MeV flux relation for a blanket module on the Tokamak Fusion Test Reactor is derived. It is shown that assumptions can be made that allow an analytical expression to be derived, using point kernel methods. In addition, the effect of a nonuniform source distribution is derived, again by relatively simple point kernel methods. It is thought that the methodology developed is valid for a variety of blanket modules on tokamak reactors
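
    The point-kernel approach named in the abstract sums, over a discretized source region, the uncollided contribution S_i * exp(-mu*r_i) / (4*pi*r_i^2) of each source cell at the detector point, which also makes it easy to examine the effect of a non-uniform source distribution. The sketch below applies this to a toy ring of source cells standing in for the plasma; the geometry, source strength and attenuation coefficient are assumptions, not TFTR parameters.

```python
import numpy as np

def point_kernel_flux(src_pos, src_strength, det_pos, mu=0.0):
    """Uncollided flux at det_pos from a set of discretized point sources.
    src_pos      : (n, 3) source-cell positions (cm)
    src_strength : (n,) neutrons/s emitted by each cell
    mu           : effective attenuation coefficient (1/cm), 0 for vacuum."""
    r = np.linalg.norm(src_pos - det_pos, axis=1)
    return np.sum(src_strength * np.exp(-mu * r) / (4.0 * np.pi * r ** 2))

# toy ring of source cells standing in for a toroidal plasma (assumed geometry)
theta = np.linspace(0.0, 2.0 * np.pi, 100, endpoint=False)
ring = np.stack([250.0 * np.cos(theta), 250.0 * np.sin(theta),
                 np.zeros_like(theta)], axis=1)        # major radius 250 cm
strength = np.full(100, 1.0e17 / 100)                  # total 1e17 n/s, uniform here
detector = np.array([350.0, 0.0, 0.0])                 # point on a blanket module
print(f"incident 14-MeV flux ~ {point_kernel_flux(ring, strength, detector):.3e} n/cm^2/s")
```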

  10. A green approach to the production of 2-pyridone derivatives promoted by infrared irradiation

    Energy Technology Data Exchange (ETDEWEB)

    Hernandez, F.; De la Cruz, F.; Lopez, J.; Pena, E.; Vazquez, M. A. [Universidad de Guanajuato, Dapartamento de Quimica, Noria Alta s/n, 36050 Guanajuato, Gto. (Mexico); Delgado, F. [IPN, Escuela Nacional de Ciencias Biologicas, Departamento de Quimica Organica, Prol. Carpio y Plan de Ayala s/n, 11340 Mexico D. F. (Mexico); Alcaraz, Y.; FRobles, J.; Martinez A, M., E-mail: mvazquez@ugto.mx [Universidad de Guanajuato, Departamento de Farmacia, Noria Alta s/n, 36050 Guanajuato, Gto. (Mexico)

    2014-10-01

    An alternative is presented by promoting a reaction with infrared irradiation to obtain different 4-aryl-3-cyano-5-ethoxycarbonyl-6-methyl-2-pyridone derivatives 9 a-k. The process was carried out with a green approach from the corresponding 4H-pyrans, using mild reaction conditions and infrared irradiation as the energy source. In the first stage, the reaction produced 1,2,3,4-tetrahydropyridine-2-one derivatives 8 a-k, followed by an oxidative step to afford the target molecules in good yields. The structure of products 9 a-k was confirmed by FT-IR, 1H NMR and 13C NMR spectroscopic techniques and X-ray diffraction. It was found that the efficiency of the reaction depends on the catalyst and the solvent, as well as on the aldehyde substituents. (Author)

  11. Characterization of coal-derived hydrocarbons and source-rock potential of coal beds, San Juan Basin, New Mexico and Colorado, U.S.A.

    Science.gov (United States)

    Rice, D.D.; Clayton, J.L.; Pawlewicz, M.J.

    1989-01-01

    Coal beds are considered to be a major source of nonassociated gas in the Rocky Mountain basins of the United States. In the San Juan basin of northwestern New Mexico and southwestern Colorado, significant quantities of natural gas are being produced from coal beds of the Upper Cretaceous Fruitland Formation and from adjacent sandstone reservoirs. Analysis of gas samples from the various gas-producing intervals provided a means of determining their origin and of evaluating coal beds as source rocks. The rank of coal beds in the Fruitland Formation in the central part of the San Juan basin, where major gas production occurs, increases to the northeast and ranges from high-volatile B bituminous coal to medium-volatile bituminous coal (Rm values range from 0.70 to 1.45%). On the basis of chemical, isotopic and coal-rank data, the gases are interpreted to be thermogenic. Gases from the coal beds show little isotopic variation (δ13C1 values range -43.6 to -40.5 ppt), are chemically dry (C1/C1-5 values are > 0.99), and contain significant amounts of CO2 (as much as 6%). These gases are interpreted to have resulted from devolatilization of the humic-type bituminous coal that is composed mainly of vitrinite. The primary products of this process are CH4, CO2 and H2O. The coal-generated, methane-rich gas is usually contained in the coal beds of the Fruitland Formation, and has not been expelled and has not migrated into the adjacent sandstone reservoirs. In addition, the coal-bed reservoirs produce a distinctive bicarbonate-type connate water and have higher reservoir pressures than adjacent sandstones. The combination of these factors indicates that coal beds are a closed reservoir system created by the gases, waters, and associated pressures in the micropore coal structure. In contrast, gases produced from overlying sandstones in the Fruitland Formation and underlying Pictured Cliffs Sandstone have a wider range of isotopic values (δ13C1 values range from -43.5 to -38

  12. Automatic classification of time-variable X-ray sources

    Energy Technology Data Exchange (ETDEWEB)

    Lo, Kitty K.; Farrell, Sean; Murphy, Tara; Gaensler, B. M. [Sydney Institute for Astronomy, School of Physics, The University of Sydney, Sydney, NSW 2006 (Australia)

    2014-05-01

    To maximize the discovery potential of future synoptic surveys, especially in the field of transient science, it will be necessary to use automatic classification to identify some of the astronomical sources. The data mining technique of supervised classification is suitable for this problem. Here, we present a supervised learning method to automatically classify variable X-ray sources in the Second XMM-Newton Serendipitous Source Catalog (2XMMi-DR2). Random Forest is our classifier of choice since it is one of the most accurate learning algorithms available. Our training set consists of 873 variable sources and their features are derived from time series, spectra, and other multi-wavelength contextual information. The 10 fold cross validation accuracy of the training data is ∼97% on a 7 class data set. We applied the trained classification model to 411 unknown variable 2XMM sources to produce a probabilistically classified catalog. Using the classification margin and the Random Forest derived outlier measure, we identified 12 anomalous sources, of which 2XMM J180658.7–500250 appears to be the most unusual source in the sample. Its X-ray spectrum is suggestive of an ultraluminous X-ray source, but its variability makes it highly unusual. Machine-learned classification and anomaly detection will facilitate scientific discoveries in the era of all-sky surveys.
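
    The workflow described above (a Random Forest trained on 873 labeled sources, 10-fold cross-validation, then probabilistic classification of 411 unknown sources) maps directly onto standard supervised-learning tooling. The sketch below reproduces the workflow with scikit-learn on synthetic stand-in features and labels; the feature matrix, labels and forest size are placeholders, not the 2XMMi-DR2 data or the authors' settings.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)

# synthetic stand-in for the training set: 873 sources, a handful of
# time-series/spectral/contextual features, 7 classes
X_train = rng.standard_normal((873, 10))
y_train = rng.integers(0, 7, size=873)

clf = RandomForestClassifier(n_estimators=500, random_state=0)
scores = cross_val_score(clf, X_train, y_train, cv=10)   # 10-fold CV accuracy
print(f"mean cross-validation accuracy: {scores.mean():.2f}")

# probabilistic classification of the unknown variable sources
X_unknown = rng.standard_normal((411, 10))
clf.fit(X_train, y_train)
class_probs = clf.predict_proba(X_unknown)               # shape (411, 7)
print(class_probs[0])
```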

  13. Automatic classification of time-variable X-ray sources

    International Nuclear Information System (INIS)

    Lo, Kitty K.; Farrell, Sean; Murphy, Tara; Gaensler, B. M.

    2014-01-01

    To maximize the discovery potential of future synoptic surveys, especially in the field of transient science, it will be necessary to use automatic classification to identify some of the astronomical sources. The data mining technique of supervised classification is suitable for this problem. Here, we present a supervised learning method to automatically classify variable X-ray sources in the Second XMM-Newton Serendipitous Source Catalog (2XMMi-DR2). Random Forest is our classifier of choice since it is one of the most accurate learning algorithms available. Our training set consists of 873 variable sources and their features are derived from time series, spectra, and other multi-wavelength contextual information. The 10 fold cross validation accuracy of the training data is ∼97% on a 7 class data set. We applied the trained classification model to 411 unknown variable 2XMM sources to produce a probabilistically classified catalog. Using the classification margin and the Random Forest derived outlier measure, we identified 12 anomalous sources, of which 2XMM J180658.7–500250 appears to be the most unusual source in the sample. Its X-ray spectrum is suggestive of an ultraluminous X-ray source, but its variability makes it highly unusual. Machine-learned classification and anomaly detection will facilitate scientific discoveries in the era of all-sky surveys.

  14. Bright and durable field emission source derived from refractory taylor cones

    Science.gov (United States)

    Hirsch, Gregory

    2016-12-20

    A method of producing field emitters having improved brightness and durability relying on the creation of a liquid Taylor cone from electrically conductive materials having high melting points. The method calls for melting the end of a wire substrate with a focused laser beam, while imposing a high positive potential on the material. The resulting molten Taylor cone is subsequently rapidly quenched by cessation of the laser power. Rapid quenching is facilitated in large part by radiative cooling, resulting in structures having characteristics closely matching that of the original liquid Taylor cone. Frozen Taylor cones thus obtained yield desirable tip end forms for field emission sources in electron beam applications. Regeneration of the frozen Taylor cones in-situ is readily accomplished by repeating the initial formation procedures. The high temperature liquid Taylor cones can also be employed as bright ion sources with chemical elements previously considered impractical to implement.

  15. The effect of processing history on physical behavior and cellular response for tyrosine-derived polyarylates

    International Nuclear Information System (INIS)

    Doddi, S; Patlolla, A; Shanumunsgarundum, S; Jaffe, M; Collins, G; Arinzeh, T Livingston

    2009-01-01

    Polyarylates have shown promise as fully degradable polymers for drug delivery as well as for structural implant applications due to their range of physicomechanical properties. Processing history, however, could have a significant impact on their overall performance in biologically relevant environments. More specifically, structural changes at the molecular level can occur that will affect a polymer's physical properties and subsequent cell attachment and growth. The present study was aimed at comparing cell growth on tyrosine-derived polyarylates with that of polylactic acid (PLLA) in their original state and after processing (i.e. undrawn and drawn forms). Two polyarylates having distinct molecular structures were chosen. Strictly amorphous poly(DTE adipate), denoted as poly(DT 2,4), and poly(DTD dodecandioate), denoted as poly(DT 12,10), having a more complex, non-crystalline organization, were compared with semi-crystalline PLLA. The degree of shrinkage, thermal characterization, air-water contact angle and surface morphology were determined for each polymer in its undrawn and drawn states. Poly(DT 2,4) and PLLA after processing resulted in greater shrinkage and a slight decrease in hydrophilicity whereas poly(DT 12,10) had minimal shrinkage and became slightly more hydrophilic in its drawn state. Surface morphology or roughness was also altered by processing. In turn, the rate of cell growth and overall cell numbers were reduced significantly on drawn forms of poly(DT 2,4) and PLLA, whereas more favorable growth rates were supported on drawn poly(DT 12,10). These findings indicate that processing effects in amorphous as well as oriented polymeric structures can significantly alter their biological performance.

  16. Constraining Biomarkers of Dissolved Organic Matter Sourcing Using Microbial Incubations of Vascular Plant Leachates of the California landscape

    Science.gov (United States)

    Harfmann, J.; Hernes, P.; Chuang, C. Y.; Kaiser, K.; Spencer, R. G.; Guillemette, F.

    2017-12-01

    Source origin of dissolved organic matter (DOM) is crucial in determining reactivity, driving chemical and biological processing of carbon. DOM source biomarkers such as lignin (a vascular plant marker) and D-amino acids (bacterial markers) are well-established tools in tracing DOM origin and fate. The development of high-resolution mass spectrometry and optical studies has expanded our toolkit; yet despite these advances, our understanding of DOM sources and fate remains largely qualitative. Quantitative data on DOM pools and fluxes become increasingly necessary as we refine our comprehension of its composition. In this study, we aim to calibrate and quantify DOM source endmembers by performing microbial incubations of multiple vascular plant leachates, where total DOM is constrained by initial vascular plant input and microbial production. Derived endmembers may be applied to endmember mixing models to quantify DOM source contributions in aquatic systems.

  17. The rules of drug taking: wine and poppy derivatives in the ancient world. VII. A ritual use of poppy derivatives?

    Science.gov (United States)

    Nencini, P

    1997-08-01

    Besides fertility, poppies have been used to symbolize sleep, night, and death. Consistent with the agrarian origin of their ritual use, poppies also became a symbol of reincarnation. Several literary and iconographic sources, in particular of the early Roman imperial age, are here interpreted as evidence that poppy derivatives were ingested during mystery rites. The reversible narcotic effects of poppy derivatives should have allowed a "realistic" representation of death and reincarnation, as intended by the Orphic belief of the transmigration of souls.

  18. Chemical composition, sources and secondary processes of aerosols in Baoji city of northwest China

    Science.gov (United States)

    Wang, Y. C.; Huang, R.-J.; Ni, H. Y.; Chen, Y.; Wang, Q. Y.; Li, G. H.; Tie, X. X.; Shen, Z. X.; Huang, Y.; Liu, S. X.; Dong, W. M.; Xue, P.; Fröhlich, R.; Canonaco, F.; Elser, M.; Daellenbach, K. R.; Bozzetti, C.; El Haddad, I.; Prévôt, A. S. H.; Canagaratna, M. R.; Worsnop, D. R.; Cao, J. J.

    2017-06-01

    Particulate air pollution is a severe environmental problem in China, affecting visibility, air quality, climate and human health. However, previous studies focus mainly on large cities such as Beijing, Shanghai, and Guangzhou. In this study, an Aerodyne Aerosol Chemical Speciation Monitor was deployed in Baoji, a middle-sized inland city in northwest China, from 26 February to 27 March 2014. The non-refractory submicron aerosol (NR-PM1) was dominated by organics (55%), followed by sulfate (16%), nitrate (15%), ammonium (11%) and chloride (3%). A source apportionment of the organic aerosol (OA) was performed with the Sofi (Source Finder) interface of ME-2 (Multilinear Engine), and six main sources/factors were identified and classified as hydrocarbon-like OA (HOA), cooking OA (COA), biomass burning OA (BBOA), coal combustion OA (CCOA), less oxidized oxygenated OA (LO-OOA) and more oxidized oxygenated OA (MO-OOA), which contributed 20%, 14%, 13%, 9%, 23% and 21% of total OA, respectively. The contribution of secondary components shows increasing trends from clean days to polluted days, indicating the importance of secondary aerosol formation processes in driving particulate air pollution. The formation of LO-OOA and MO-OOA is mainly driven by photochemical reactions, but significantly influenced by aqueous-phase chemistry during periods of low atmospheric oxidative capacity.

  19. Lipids Reprogram Metabolism to Become a Major Carbon Source for Histone Acetylation

    DEFF Research Database (Denmark)

    McDonnell, Eoin; Crown, Scott B; Fox, Douglas B

    2016-01-01

    Cells integrate nutrient sensing and metabolism to coordinate proper cellular responses to a particular nutrient source. For example, glucose drives a gene expression program characterized by activating genes involved in its metabolism, in part by increasing glucose-derived histone acetylation. ... Here, we find that lipid-derived acetyl-CoA is a major source of carbon for histone acetylation. Using 13C-carbon tracing combined with acetyl-proteomics, we show that up to 90% of acetylation on certain histone lysines can be derived from fatty acid carbon, even in the presence of excess glucose...

  20. Inverse modelling of fluvial sediment connectivity identifies characteristics and spatial distribution of sediment sources in a large river network.

    Science.gov (United States)

    Schmitt, R. J. P.; Bizzi, S.; Kondolf, G. M.; Rubin, Z.; Castelletti, A.

    2016-12-01

    Field and laboratory evidence indicates that the spatial distribution of transport in both alluvial and bedrock rivers is an adaptation to sediment supply. Sediment supply, in turn, depends on spatial distribution and properties (e.g., grain sizes and supply rates) of individual sediment sources. Analyzing the distribution of transport capacity in a river network could hence clarify the spatial distribution and properties of sediment sources. Yet, challenges include a) identifying magnitude and spatial distribution of transport capacity for each of multiple grain sizes being simultaneously transported, and b) estimating source grain sizes and supply rates, both at network scales. Herein, we approach the problem of identifying the spatial distribution of sediment sources and the resulting network sediment fluxes in a major, poorly monitored tributary (80,000 km2) of the Mekong. To this end, we apply the CASCADE modeling framework (Schmitt et al., 2016). CASCADE calculates transport capacities and sediment fluxes for multiple grain sizes on the network scale based on remotely-sensed morphology and modelled hydrology. CASCADE is run in an inverse Monte Carlo approach for 7500 random initializations of source grain sizes. In all runs, supply of each source is inferred from the minimum downstream transport capacity for the source grain size. Results for each realization are compared to sparse available sedimentary records. Only 1% of initializations reproduced the sedimentary record. Results for these realizations revealed a spatial pattern in source supply rates, grain sizes, and network sediment fluxes that correlated well with map-derived patterns in lithology and river-morphology. Hence, we propose that observable river hydro-morphology contains information on upstream source properties that can be back-calculated using an inverse modeling approach. Such an approach could be coupled to more detailed models of hillslope processes in future to derive integrated models
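
    The inverse Monte Carlo loop described above can be pictured as: draw random grain sizes for each source, set each source's supply to the minimum transport capacity it encounters downstream, route the resulting fluxes through the network, and keep only the realizations that reproduce the observed sediment flux. The sketch below shows that loop on a toy network; the capacities, the trivial routing, the acceptance tolerance and the observed flux are placeholders, not CASCADE itself or the Mekong-tributary data.

```python
import numpy as np

rng = np.random.default_rng(4)

n_sources, n_downstream_reaches = 5, 8
grain_bins = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])   # mm
# toy transport capacities (t/yr) per reach and grain size: coarser grains
# move less easily, and each reach has its own capacity
capacity = rng.uniform(1e3, 1e5, (n_downstream_reaches, grain_bins.size))
capacity *= grain_bins.max() / grain_bins

observed_flux = 1.5e5      # t/yr at a downstream gauge (placeholder "record")
accepted = []

for _ in range(7500):      # same number of random initializations as the study
    # draw one grain-size class per source
    d_idx = rng.integers(0, grain_bins.size, n_sources)
    # supply of each source = minimum downstream capacity for its grain size
    supply = np.array([capacity[:, i].min() for i in d_idx])
    flux_at_gauge = supply.sum()                 # toy routing: simple sum
    if abs(flux_at_gauge - observed_flux) / observed_flux < 0.05:
        accepted.append((grain_bins[d_idx], supply))

print(f"{len(accepted)} of 7500 realizations match the toy sedimentary record")
```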

  1. Good Manufacturing Practices and Microbial Contamination Sources in Orange Fleshed Sweet Potato Puree Processing Plant in Kenya

    Directory of Open Access Journals (Sweden)

    Derick Nyabera Malavi

    2018-01-01

    Limited information exists on the status of hygiene and probable sources of microbial contamination in Orange Fleshed Sweet Potato (OFSP) puree processing. The current study aims at determining the level of compliance with Good Manufacturing Practices (GMPs), hygiene, and microbial quality in an OFSP puree processing plant in Kenya. Intensive observation and interviews using a structured GMPs checklist, environmental sampling, and microbial analysis by standard microbiological methods were used in data collection. The results indicated a low level of compliance with GMPs, with an overall compliance score of 58%. Microbial counts on food equipment surfaces, installations, and personnel hands and in packaged OFSP puree were above the recommended microbial safety and quality legal limits. Steaming significantly (P<0.05) reduced the microbial load in OFSP cooked roots, but the counts significantly (P<0.05) increased in the puree due to postprocessing contamination. Total counts, yeasts and molds, Enterobacteriaceae, total coliforms, and E. coli and S. aureus counts in OFSP puree were 8.0, 4.0, 6.6, 5.8, 4.8, and 5.9 log10 cfu/g, respectively. In conclusion, equipment surfaces, personnel hands, and processing water were major sources of contamination in OFSP puree processing and handling. Plant hygiene inspection, environmental monitoring, and food safety trainings are recommended to improve the hygiene, microbial quality, and safety of OFSP puree.

  2. Potential of chicken by-products as sources of useful biological resources

    International Nuclear Information System (INIS)

    Lasekan, Adeseye; Abu Bakar, Fatimah; Hashim, Dzulkifly

    2013-01-01

    By-products from different animal sources are currently being utilised for beneficial purposes. Chicken processing plants all over the world generate large amounts of solid by-products in the form of heads, legs, bones, viscera and feathers. These wastes are often processed into livestock feed, fertilizers and pet foods or totally discarded. Inappropriate disposal of these wastes causes environmental pollution, diseases and loss of useful biological resources like protein, enzymes and lipids. Utilisation methods that make use of these biological components for producing value-added products, rather than the direct use of the actual waste material, might be another viable option for dealing with these wastes. This line of thought has consequently led to research on these wastes as sources of protein hydrolysates, enzymes and polyunsaturated fatty acids. Due to the multiple applications of protein hydrolysates in various branches of science and industry, and the large body of literature reporting the conversion of animal wastes to hydrolysates, a large section of this review was devoted to this subject. Thus, this review reports the known functional and bioactive properties of hydrolysates derived from chicken by-products as well as their utilisation as a source of peptone in microbiological media. Methods of producing these hydrolysates, including their microbiological safety, are discussed. Based on the few references available in the literature, the potential of some chicken by-products as sources of proteases and polyunsaturated fatty acids is pointed out, along with some other future applications

  3. Potential of chicken by-products as sources of useful biological resources

    Energy Technology Data Exchange (ETDEWEB)

    Lasekan, Adeseye [Faculty of Food Science and Technology, Universiti Putra Malaysia, 43400 UPM Serdang, Selangor (Malaysia); Abu Bakar, Fatimah, E-mail: fatim@putra.upm.edu.my [Faculty of Food Science and Technology, Universiti Putra Malaysia, 43400 UPM Serdang, Selangor (Malaysia); Halal Products Research Institute, Universiti Putra Malaysia, 43400 UPM Serdang, Selangor (Malaysia); Hashim, Dzulkifly [Faculty of Food Science and Technology, Universiti Putra Malaysia, 43400 UPM Serdang, Selangor (Malaysia); Halal Products Research Institute, Universiti Putra Malaysia, 43400 UPM Serdang, Selangor (Malaysia)

    2013-03-15

    By-products from different animal sources are currently being utilised for beneficial purposes. Chicken processing plants all over the world generate large amounts of solid by-products in the form of heads, legs, bones, viscera and feathers. These wastes are often processed into livestock feed, fertilizers and pet foods or totally discarded. Inappropriate disposal of these wastes causes environmental pollution, diseases and loss of useful biological resources like protein, enzymes and lipids. Utilisation methods that make use of these biological components for producing value-added products, rather than the direct use of the actual waste material, might be another viable option for dealing with these wastes. This line of thought has consequently led to research on these wastes as sources of protein hydrolysates, enzymes and polyunsaturated fatty acids. Due to the multiple applications of protein hydrolysates in various branches of science and industry, and the large body of literature reporting the conversion of animal wastes to hydrolysates, a large section of this review was devoted to this subject. Thus, this review reports the known functional and bioactive properties of hydrolysates derived from chicken by-products as well as their utilisation as a source of peptone in microbiological media. Methods of producing these hydrolysates, including their microbiological safety, are discussed. Based on the few references available in the literature, the potential of some chicken by-products as sources of proteases and polyunsaturated fatty acids is pointed out, along with some other future applications.

  4. Characterizing the Heavy Elements in Globular Cluster M22 and an Empirical s-process Abundance Distribution Derived from the Two Stellar Groups

    Science.gov (United States)

    Roederer, I. U.; Marino, A. F.; Sneden, C.

    2011-11-01

    We present an empirical s-process abundance distribution derived with explicit knowledge of the r-process component in the low-metallicity globular cluster M22. We have obtained high-resolution, high signal-to-noise spectra for six red giants in M22 using the Magellan Inamori Kyocera Echelle spectrograph on the Magellan-Clay Telescope at Las Campanas Observatory. In each star we derive abundances for 44 species of 40 elements, including 24 elements heavier than zinc (Z = 30) produced by neutron-capture reactions. Previous studies determined that three of these stars (the "r+s group") have an enhancement of s-process material relative to the other three stars (the "r-only group"). We confirm that the r+s group is moderately enriched in Pb relative to the r-only group. Both groups of stars were born with the same amount of r-process material, but s-process material was also present in the gas from which the r+s group formed. The s-process abundances are inconsistent with predictions for asymptotic giant branch (AGB) stars with M ≤ 3 M⊙ and suggest an origin in more massive AGB stars capable of activating the ²²Ne(α,n)²⁵Mg reaction. We calculate the s-process "residual" by subtracting the r-process pattern in the r-only group from the abundances in the r+s group. In contrast to previous r- and s-process decompositions, this approach makes no assumptions about the r- and s-process distributions in the solar system and provides a unique opportunity to explore s-process yields in a metal-poor environment. This paper includes data gathered with the 6.5 m Magellan Telescopes located at Las Campanas Observatory, Chile.
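
    The subtraction described above can be made concrete with a short sketch. The element names and log ε values below are invented placeholders, not the paper's measurements: log-scale abundances are converted to linear number densities, the r-only pattern is subtracted from the r+s pattern, and the residual is reported back on the log scale.

    ```python
    import math

    # Hypothetical log-epsilon abundances, log10(N_X/N_H) + 12; values are
    # illustrative only and do not come from the M22 measurements.
    r_plus_s = {"Ba": 1.10, "La": 0.35, "Pb": 0.90}   # stars enriched in r+s material
    r_only   = {"Ba": 0.55, "La": 0.05, "Pb": 0.20}   # stars with r-process material only

    def s_process_residual(logeps_rs, logeps_r):
        """Subtract the r-only pattern from the r+s pattern in linear number
        space and return the s-process residual as log epsilon."""
        residual = {}
        for elem, logeps in logeps_rs.items():
            n_rs = 10.0 ** logeps                 # linear abundance, r+s group
            n_r = 10.0 ** logeps_r[elem]          # linear abundance, r-only group
            n_s = n_rs - n_r                      # material attributable to the s-process
            residual[elem] = math.log10(n_s) if n_s > 0 else float("-inf")
        return residual

    print(s_process_residual(r_plus_s, r_only))
    ```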

  5. By-product from decoction process of Hibiscus sabdariffa L. calyces as a source of polyphenols and dietary fiber.

    Science.gov (United States)

    Sáyago-Ayerdi, Sonia G; Velázquez-López, Carolina; Montalvo-González, Efigenia; Goñi, Isabel

    2014-03-30

    Dietary fiber (DF) and antioxidant compounds are widely used as functional ingredients. The market in this field is competitive and the search for new types of quality ingredients for the food industry is intensifying. The aim of this study was to evaluate the composition and antioxidant activity of by-products generated during the decoction of calyces of four Mexican Hibiscus sabdariffa L. cultivars ('Criolla', 'China', 'Rosalis' and 'Tecoanapa') in order to assess them as a source of functional ingredients. Some calyx components were partially transferred to the beverage during the decoction process, while most were retained in the decoction residues. These by-products proved to be a good source of DF (407.4-457.0 g kg⁻¹ dry matter) and natural antioxidants (50.7-121.8 µmol Trolox equivalent g⁻¹ dry matter). The decoction process extracted some soluble carbohydrates, ash and some extractable polyphenols. The DF content changed in the dried residues, which could be considered as high-DF materials with a high proportion of soluble DF (∼20% of total DF) and considerable antioxidant capacity. These by-products could be used as an antioxidant DF source. © 2013 Society of Chemical Industry.

  6. Electron beam processing technology for modification of different types of cellulose pulps for production of derivatives

    International Nuclear Information System (INIS)

    Iller, E.; Kukielka, A.; Mikolajczyk, W.; Starostka, P.; Stupinska, H.

    2002-01-01

    The Institute of Nuclear Chemistry and Technology, the Pulp and Paper Research Institute and the Institute of Chemical Fibers are carrying out a joint research project to develop radiation methods for modifying cellulose pulps for the production of cellulose derivatives such as cellulose carbamate (CC), carboxymethyl cellulose (CMC) and methylcellulose (MC). Three different types of textile pulps, Alicell (A), Borregaard (B) and Ketchikan (K), together with Kraft softwood (PSS) and hardwood (PSB) pulps, were irradiated with a 10 MeV electron beam from the LAE 13/9 linear accelerator at doses of 5, 10, 15, 20, 25 and 50 kGy. After electron beam treatment the cellulose pulp samples were examined using structural and physico-chemical methods. Electron paramagnetic resonance spectroscopy (EPR), gel permeation chromatography (GPC) and infrared spectroscopy (IRS) were applied to determine structural changes in the irradiated cellulose pulps. Analytical methods were used to evaluate parameters such as viscosity, average degree of polymerization (DP) and α-cellulose content. Based on the EPR and GPC investigations, the relationship between the concentration of free radicals and the decrease in degree of polymerization in electron-beam-treated pulps was confirmed. Carboxymethyl cellulose, methylcellulose and cellulose carbamate were prepared from the radiation-modified pulps. These positive results will make it possible to determine optimum conditions for electron beam modification of selected cellulose paper and textile pulps. Such a procedure limits the amounts of chemical activators used in the preparation of cellulose derivatives. The proposed electron beam technology is a new technical and economic approach to the preparation of cellulose derivatives. (author)

  7. Adopted levels and derived limits for Ra-226 and the decision making processes concerning TENORM releases

    International Nuclear Information System (INIS)

    Paschoa, A.S.

    2002-01-01

    A fraction of a primary dose limit can, in general, be agreed upon as a dose-related level to be adopted in decision-making processes. In the case of TENORM releases, fractions of the primary dose levels for ²²⁶Ra, ²²⁸Ra and ²¹⁰Po may be of particular importance for establishing adopted levels. An adopted level for ²²⁶Ra, for example, could be set at the highest portion of the natural background variation; above such a level, intervention and remedial action levels could also be adopted. All these levels would be fractions of the primary level, but translated into derived limits expressed in practical units. Derived limits would then be calculated using environmental models. In such an approach, 'critical groups' would have to be carefully defined and identified, and the size of the critical group to be used in the environmental modeling would have to be chosen. Site-specific environmental models and parameters are desirable, but in most cases they are unavailable or very difficult to obtain, so mathematical models and parameters of a more generic nature are often used. A sensitivity analysis can rank the parameters used in a model, allowing one to judge how important each parameter is for the model output. The paper points out that, when the adopted levels and derived limits suggested above are used, the uncertainties and relative importance of the parameters entering an environmental model can make the difference between decision makers taking the right or the wrong decision as far as radiological protection is concerned. (author)
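
    To illustrate how an adopted dose fraction can be translated into a derived limit in practical units, the sketch below converts a fraction of a primary annual dose limit into an activity concentration in drinking water through a simple ingestion-pathway model. The dose limit, fraction, dose coefficient and intake rate are nominal placeholder values chosen for the example, not figures from the paper.

    ```python
    # Illustrative derivation of a concentration limit for 226Ra in drinking water.
    # All numbers are placeholder assumptions for the sake of the example.

    primary_dose_limit_mSv = 1.0            # primary annual dose limit (mSv/yr), assumed
    adopted_fraction = 0.1                  # fraction of the primary limit adopted for this pathway
    dose_coefficient_mSv_per_Bq = 2.8e-4    # nominal ingestion dose coefficient for 226Ra (mSv/Bq)
    annual_water_intake_L = 730.0           # assumed annual drinking-water intake (L/yr)

    def derived_concentration_limit():
        """Translate the adopted dose fraction into a derived activity
        concentration (Bq/L) using a simple ingestion-pathway model."""
        allowed_dose = adopted_fraction * primary_dose_limit_mSv          # mSv/yr
        allowed_intake_Bq = allowed_dose / dose_coefficient_mSv_per_Bq    # Bq/yr
        return allowed_intake_Bq / annual_water_intake_L                  # Bq/L

    print(f"Derived limit: {derived_concentration_limit():.2f} Bq/L")
    ```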

  8. Isotopes as tracers of the sources of the lunar material and processes of lunar origin.

    Science.gov (United States)

    Pahlevan, Kaveh

    2014-09-13

    Ever since the Apollo programme, isotopic abundances have been used as tracers to study lunar formation, in particular to study the sources of the lunar material. In the past decade, increasingly precise isotopic data have been reported that give strong indications that the Moon and the Earth's mantle have a common heritage. To reconcile these observations with the origin of the Moon via the collision of two distinct planetary bodies, it has been proposed (i) that the Earth-Moon system underwent convective mixing into a single isotopic reservoir during the approximately 10³ year molten disc epoch after the giant impact but before lunar accretion, or (ii) that a high angular momentum impact injected a silicate disc into orbit sourced directly from the mantle of the proto-Earth and the impacting planet in the right proportions to match the isotopic observations. Recently, it has also become recognized that liquid-vapour fractionation in the energetic aftermath of the giant impact is capable of generating measurable mass-dependent isotopic offsets between the silicate Earth and Moon, rendering isotopic measurements sensitive not only to the sources of the lunar material, but also to the processes accompanying lunar origin. Here, we review the isotopic evidence that the silicate Earth-Moon system represents a single planetary reservoir. We then discuss the development of new isotopic tracers sensitive to processes in the melt-vapour lunar disc and how theoretical calculations of their behaviour and sample observations can constrain scenarios of post-impact evolution in the earliest history of the Earth-Moon system. © 2014 The Author(s) Published by the Royal Society. All rights reserved.

  9. Identifying Cases of Type 2 Diabetes in Heterogeneous Data Sources: Strategy from the EMIF Project.

    Directory of Open Access Journals (Sweden)

    Giuseppe Roberto

    Full Text Available Due to the heterogeneity of existing European sources of observational healthcare data, data source-tailored choices are needed to execute multi-data source, multi-national epidemiological studies. This makes transparent documentation paramount. In this proof-of-concept study, a novel standard data derivation procedure was tested in a set of heterogeneous data sources. Identification of subjects with type 2 diabetes (T2DM) was the test case. We included three primary care data sources (PCDs), three record linkage of administrative and/or registry data sources (RLDs), one hospital and one biobank. Overall, data from 12 million subjects from six European countries were extracted. Based on a shared event definition, sixteen standard algorithms (components) useful to identify T2DM cases were generated through a top-down/bottom-up iterative approach. Each component was based on a single data domain among diagnoses, drugs, diagnostic test utilization and laboratory results. Diagnoses-based components were subclassified considering the healthcare setting (primary, secondary, inpatient care). The Unified Medical Language System was used for semantic harmonization within data domains. Individual components were extracted and the proportion of the population identified was compared across data sources. Drug-based components performed similarly in RLDs and PCDs, unlike diagnoses-based components. Using components as building blocks, logical combinations with AND, OR, AND NOT were tested and local experts recommended their preferred data source-tailored combination. The population identified per data source by the resulting algorithms varied from 3.5% to 15.7%; however, age-specific results were fairly comparable. The impact of individual components was assessed: diagnoses-based components identified the majority of cases in PCDs (93-100%), while drug-based components were the main contributors in RLDs (81-100%). The proposed data derivation procedure allowed the
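
    The component logic described above can be sketched with simple set algebra: each component is the set of subject identifiers it flags, and data source-tailored algorithms are built by combining components with OR, AND and AND NOT. The component names and identifiers below are invented for illustration and are not the study's actual components.

    ```python
    # Case-finding components represented as sets of subject IDs (all invented).
    components = {
        "t2dm_diagnosis":      {1, 2, 3, 5, 8},
        "antidiabetic_drug":   {2, 3, 4, 5, 9},
        "elevated_lab_result": {3, 5, 6, 8},
        "t1dm_diagnosis":      {4, 9},
    }

    def combine(include_any=(), require_all=(), exclude=()):
        """OR together the 'include_any' components, intersect with every
        'require_all' component, then remove subjects flagged by 'exclude'."""
        if include_any:
            cases = set().union(*(components[c] for c in include_any))
        elif require_all:
            cases = set(components[require_all[0]])
        else:
            return set()
        for c in require_all:
            cases &= components[c]
        for c in exclude:
            cases -= components[c]
        return cases

    # Example algorithm: (diagnosis OR drug) AND NOT type 1 diabetes diagnosis.
    t2dm_cases = combine(include_any=("t2dm_diagnosis", "antidiabetic_drug"),
                         exclude=("t1dm_diagnosis",))
    print(sorted(t2dm_cases))   # -> [1, 2, 3, 5, 8]
    ```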

  10. Open-source colorimeter.

    Science.gov (United States)

    Anzalone, Gerald C; Glover, Alexandra G; Pearce, Joshua M

    2013-04-19

    The high cost of what have historically been sophisticated research-related sensors and tools has limited their adoption to a relatively small group of well-funded researchers. This paper provides a methodology for applying an open-source approach to design and development of a colorimeter. A 3-D printable, open-source colorimeter utilizing only open-source hardware and software solutions and readily available discrete components is discussed and its performance compared to a commercial portable colorimeter. Performance is evaluated with commercial vials prepared for the closed reflux chemical oxygen demand (COD) method. This approach reduced the cost of reliable closed reflux COD by two orders of magnitude making it an economic alternative for the vast majority of potential users. The open-source colorimeter demonstrated good reproducibility and serves as a platform for further development and derivation of the design for other, similar purposes such as nephelometry. This approach promises unprecedented access to sophisticated instrumentation based on low-cost sensors by those most in need of it, under-developed and developing world laboratories.
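
    The measurement principle behind such a colorimeter can be sketched in a few lines: the light level through a sample vial is compared with that through a blank to give absorbance, and a linear calibration maps absorbance to COD. The detector counts and calibration constants below are hypothetical placeholders, not values from the paper.

    ```python
    import math

    def absorbance(sample_reading, blank_reading):
        """Absorbance from raw detector readings (e.g. ADC counts), A = -log10(I/I0)."""
        return -math.log10(sample_reading / blank_reading)

    def cod_from_absorbance(a, slope=2500.0, intercept=0.0):
        """Convert absorbance to chemical oxygen demand (mg/L) with a linear
        calibration; slope and intercept are hypothetical placeholder values."""
        return slope * a + intercept

    blank = 10230      # detector counts through a blank vial (illustrative)
    sample = 7890      # detector counts through a digested COD vial (illustrative)

    a = absorbance(sample, blank)
    print(f"A = {a:.3f}, COD ~ {cod_from_absorbance(a):.0f} mg/L")
    ```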

  11. Open-Source Colorimeter

    Science.gov (United States)

    Anzalone, Gerald C.; Glover, Alexandra G.; Pearce, Joshua M.

    2013-01-01

    The high cost of what have historically been sophisticated research-related sensors and tools has limited their adoption to a relatively small group of well-funded researchers. This paper provides a methodology for applying an open-source approach to design and development of a colorimeter. A 3-D printable, open-source colorimeter utilizing only open-source hardware and software solutions and readily available discrete components is discussed and its performance compared to a commercial portable colorimeter. Performance is evaluated with commercial vials prepared for the closed reflux chemical oxygen demand (COD) method. This approach reduced the cost of reliable closed reflux COD by two orders of magnitude making it an economic alternative for the vast majority of potential users. The open-source colorimeter demonstrated good reproducibility and serves as a platform for further development and derivation of the design for other, similar purposes such as nephelometry. This approach promises unprecedented access to sophisticated instrumentation based on low-cost sensors by those most in need of it, under-developed and developing world laboratories. PMID:23604032

  12. Extracellular matrix-derived hydrogels for dental stem cell delivery

    OpenAIRE

    Viswanath, Aiswarya; Vanacker, Julie; Germain, Loic; Leprince, Julien G.; Diogenes, Anibal; Shakesheff, Kevin M.; White, Lisa J.; des Rieux, Anne

    2016-01-01

    Decellularised mammalian extracellular matrices (ECM) have been widely accepted as an ideal substrate for repair and remodelling of numerous tissues in clinical and pre-clinical studies. Recent studies have demonstrated the ability of ECM scaffolds derived from site-specific homologous tissues to direct cell differentiation. The present study investigated the suitability of hydrogels derived from different source tissues: bone, spinal cord and dentine, as suitable carriers to deliver human ap...

  13. Convolutive Blind Source Separation Methods

    DEFF Research Database (Denmark)

    Pedersen, Michael Syskind; Larsen, Jan; Kjems, Ulrik

    2008-01-01

    During the past decades, much attention has been given to the separation of mixed sources, in particular for the blind case where both the sources and the mixing process are unknown and only recordings of the mixtures are available. In several situations it is desirable to recover all sources from the recorded mixtures, or at least to segregate a particular source. Furthermore, it may be useful to identify the mixing process itself to reveal information about the physical mixing system. In some simple mixing models each recording consists of a sum of differently weighted source signals. However, in many real-world applications, such as in acoustics, the mixing process is more complex. In such systems, the mixtures are weighted and delayed, and each source contributes to the sum with multiple delays corresponding to the multiple paths by which an acoustic signal propagates to a microphone...
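
    A minimal forward model of the convolutive mixing described here can clarify the idea: each microphone signal is a sum of source signals, each filtered by a short impulse response that encodes the weights and delays of the propagation paths. The filters and signals below are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Two illustrative source signals, 1000 samples each.
    n = 1000
    sources = rng.standard_normal((2, n))          # s_i(t)

    # Invented mixing filters a_ji(k): each source reaches each microphone through
    # a short impulse response (direct path plus a delayed, attenuated echo).
    mixing_filters = np.array([
        [[1.0, 0.0, 0.5], [0.6, 0.3, 0.0]],        # microphone 0: filters for s_0, s_1
        [[0.7, 0.0, 0.2], [1.0, 0.4, 0.1]],        # microphone 1: filters for s_0, s_1
    ])

    def convolutive_mix(s, filters):
        """Forward model of convolutive mixing: every source is convolved with the
        corresponding microphone filter and the filtered contributions are summed."""
        n_mics = filters.shape[0]
        mixtures = np.zeros((n_mics, s.shape[1]))
        for j in range(n_mics):
            for i in range(s.shape[0]):
                mixtures[j] += np.convolve(s[i], filters[j, i], mode="full")[: s.shape[1]]
        return mixtures

    x = convolutive_mix(sources, mixing_filters)
    print(x.shape)   # (2, 1000): two weighted, delayed recordings of the two sources
    ```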

  14. Soybean-derived biofuels and home heating fuels.

    Science.gov (United States)

    Mushrush, George W; Wynne, James H; Willauer, Heather D; Lloyd, Christopher L

    2006-01-01

    It is environmentally enticing to consider replacing or blending petroleum-derived heating fuels with biofuels for many reasons. Major considerations include the soaring worldwide price of petroleum products, especially home heating oil, the toxicity of petroleum-derived fuels and the environmental damage caused by leaking petroleum tanks. For these reasons, it has been suggested that domestic renewable energy sources be considered as replacements, or at least as blending stocks, for home heating fuels. If recycled soy restaurant cooking oils could be employed for this purpose, this would represent an environmental advantage. Renewable plant sources of energy tend to be less toxic than their petroleum counterparts, an important consideration when tank leakage occurs. Home fuel oil storage tanks practically always contain some bottom water, and this water environment has a pH value that factors into heating fuel stability. Therefore, the question is: would the biofuel help or exacerbate fuel stability and furnace maintenance issues?

  15. A time-dependent neutron transport method of characteristics formulation with time derivative propagation

    Energy Technology Data Exchange (ETDEWEB)

    Hoffman, Adam J., E-mail: adamhoff@umich.edu; Lee, John C., E-mail: jcl@umich.edu

    2016-02-15

    A new time-dependent Method of Characteristics (MOC) formulation for nuclear reactor kinetics was developed utilizing angular flux time-derivative propagation. This method avoids the requirement of storing the angular flux at previous points in time to represent a discretized time derivative; instead, an equation for the angular flux time derivative along 1D spatial characteristics is derived and solved concurrently with the 1D transport characteristic equation. This approach allows the angular flux time derivative to be recast principally in terms of the neutron source time derivatives, which are approximated to high-order accuracy using the backward differentiation formula (BDF). This approach, called Source Derivative Propagation (SDP), drastically reduces the memory requirements of time-dependent MOC relative to methods that require storing the angular flux. An SDP method was developed for 2D and 3D applications and implemented in the computer code DeCART in 2D. DeCART was used to model two reactor transient benchmarks: a modified TWIGL problem and a C5G7 transient. The SDP method accurately and efficiently replicated the solution of the conventional time-dependent MOC method using two orders of magnitude less memory.
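
    The role of the backward differentiation formula (BDF) in this scheme can be illustrated with a small sketch: the time derivative of a quantity at the newest time step is approximated from its current value and a few stored previous values, which is how SDP avoids storing the full angular flux history. The coefficients are the standard BDF coefficients; the source history values are invented for the example.

    ```python
    # Backward differentiation formula (BDF) approximation of a time derivative,
    # as used conceptually in SDP to approximate the neutron source time derivative
    # from stored previous source values.  The source history below is invented.

    BDF_COEFFS = {
        1: ([1.0, -1.0], 1.0),               # (q_n - q_{n-1}) / dt
        2: ([3.0, -4.0, 1.0], 2.0),          # (3q_n - 4q_{n-1} + q_{n-2}) / (2 dt)
        3: ([11.0, -18.0, 9.0, -2.0], 6.0),  # (11q_n - 18q_{n-1} + 9q_{n-2} - 2q_{n-3}) / (6 dt)
    }

    def bdf_time_derivative(history, dt, order=2):
        """Approximate dq/dt at the newest time step from `history`,
        ordered newest first: [q_n, q_{n-1}, ...]."""
        coeffs, denom = BDF_COEFFS[order]
        if len(history) < len(coeffs):
            raise ValueError("not enough stored time levels for this BDF order")
        return sum(c * q for c, q in zip(coeffs, history)) / (denom * dt)

    # Hypothetical neutron source values at the last three time steps (newest first).
    source_history = [1.052, 1.031, 1.012]
    print(bdf_time_derivative(source_history, dt=1.0e-3, order=2))
    ```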

  16. A time-dependent neutron transport method of characteristics formulation with time derivative propagation

    International Nuclear Information System (INIS)

    Hoffman, Adam J.; Lee, John C.

    2016-01-01

    A new time-dependent Method of Characteristics (MOC) formulation for nuclear reactor kinetics was developed utilizing angular flux time-derivative propagation. This method avoids the requirement of storing the angular flux at previous points in time to represent a discretized time derivative; instead, an equation for the angular flux time derivative along 1D spatial characteristics is derived and solved concurrently with the 1D transport characteristic equation. This approach allows the angular flux time derivative to be recast principally in terms of the neutron source time derivatives, which are approximated to high-order accuracy using the backward differentiation formula (BDF). This approach, called Source Derivative Propagation (SDP), drastically reduces the memory requirements of time-dependent MOC relative to methods that require storing the angular flux. An SDP method was developed for 2D and 3D applications and implemented in the computer code DeCART in 2D. DeCART was used to model two reactor transient benchmarks: a modified TWIGL problem and a C5G7 transient. The SDP method accurately and efficiently replicated the solution of the conventional time-dependent MOC method using two orders of magnitude less memory.

  17. CDApps: integrated software for experimental planning and data processing at beamline B23, Diamond Light Source.

    Science.gov (United States)

    Hussain, Rohanah; Benning, Kristian; Javorfi, Tamas; Longo, Edoardo; Rudd, Timothy R; Pulford, Bill; Siligardi, Giuliano

    2015-03-01

    The B23 Circular Dichroism beamline at Diamond Light Source has been operational since 2009 and has seen visits from more than 200 user groups, who have generated large amounts of data. Based on the experience of overseeing the users' progress at B23, four key areas requiring the most assistance are identified: planning of experiments and note-keeping; designing titration experiments; processing and analysis of the collected data; and production of experimental reports. To streamline these processes, an integrated software package has been developed and made available to users. This article summarizes the main features of the software.

  18. Autologous Adipose-Derived Tissue Matrix Part I: Biologic Characteristics.

    Science.gov (United States)

    Schendel, Stephen A

    2017-10-01

    Autologous collagen is an ideal soft tissue filler and may serve as a matrix for stem cell implantation and growth. Procurement of autologous collagen has been limited, though, by the lack of a sufficient source. Liposuction is widely performed and could be a source of autologous collagen, but the amount of collagen and its composition in liposuctioned fat remain unknown. The purpose of this research was to characterize an adipose-derived tissue-based product created using ultrasonic cavitation and cryo-grinding. This study evaluated the cellular and protein composition of the final product. Fat was obtained from individuals undergoing routine liposuction and was processed by a two-step process to obtain only the connective tissue. The tissue was then evaluated by scanning electron microscopy, Western blot analysis and flow cytometry. Liposuctioned fat was obtained from 10 individuals, with an average of 298 mL per subject. After processing, an average of 1 mL of collagen matrix was obtained from each 100 mL of fat. Significant viable cell markers were present, in descending order, for adipocytes > CD90+ > CD105+ > CD45+ > CD19+ > CD144+ > CD34+. Western blot analysis showed collagen types II, III and IV and other proteins. Scanning electron microscopy showed a regular pattern of cross-linked, helical collagen. Additionally, vital staining demonstrated that the cells were still viable after processing. Collagen and cells can be easily obtained from liposuctioned fat by ultrasonic separation without altering the overall cellular composition of the tissue. Implantation results in new collagen and cellular growth. Collagen matrix with viable cells for autologous use can be obtained from liposuctioned fat and may provide long-term results. © 2017 The American Society for Aesthetic Plastic Surgery, Inc.

  19. The Commercial Open Source Business Model

    Science.gov (United States)

    Riehle, Dirk

    Commercial open source software projects are open source software projects that are owned by a single firm that derives a direct and significant revenue stream from the software. Commercial open source at first glance represents an economic paradox: How can a firm earn money if it is making its product available for free as open source? This paper presents the core properties of commercial open source business models and discusses how they work. Using a commercial open source approach, firms can get to market faster with a superior product at lower cost than possible for traditional competitors. The paper shows how these benefits accrue from an engaged and self-supporting user community. Lacking any prior comprehensive reference, this paper is based on an analysis of public statements by practitioners of commercial open source. It forges the various anecdotes into a coherent description of revenue generation strategies and relevant business functions.

  20. First space-based derivation of the global atmospheric methanol emission fluxes

    Directory of Open Access Journals (Sweden)

    T. Stavrakou

    2011-05-01

    is unaccounted for in the MEGANv2.1 inventory. The most significant error reductions achieved by the optimization concern the derived biogenic emissions over the Amazon and over the Former Soviet Union. The robustness of the derived fluxes to changes in convective updraft fluxes, in methanol removal processes, and in the choice of the biogenic a priori inventory is assessed through sensitivity inversions. Detailed comparisons of the model with a number of aircraft and surface observations of methanol, as well as new methanol measurements in Europe and on Reunion Island, show that the satellite-derived methanol emissions significantly improve the agreement with the independent data, thus giving credence to the IASI dataset.
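
    The flavour of such a top-down optimization can be conveyed with a toy least-squares inversion: emission scaling factors are adjusted so that modelled columns best match the observed columns, subject to an a priori constraint. The Jacobian, observations and uncertainties below are invented and bear no relation to the study's actual inversion system.

    ```python
    import numpy as np

    # Toy top-down emission inversion: find flux scaling factors x that minimise
    # ||H x - y||^2 + ||x - x_a||^2 / sigma_a^2, with an invented forward model H.
    H = np.array([[0.8, 0.1],          # sensitivity of observed columns to two
                  [0.2, 0.9],          # emission regions (invented Jacobian)
                  [0.5, 0.5]])
    y = np.array([1.4, 1.1, 1.3])      # "observed" methanol columns (invented)
    x_a = np.array([1.0, 1.0])         # a priori flux scaling factors
    sigma_a = 0.5                      # a priori uncertainty on the scalings

    # Solve the regularised least-squares problem analytically.
    A = H.T @ H + np.eye(2) / sigma_a**2
    b = H.T @ y + x_a / sigma_a**2
    x_post = np.linalg.solve(A, b)
    print("posterior flux scalings:", x_post)
    ```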