WorldWideScience

Sample records for battelle coal-cleaning process

  1. Recent Advances in Precombustion Coal Cleaning Processes

    Institute of Scientific and Technical Information of China (English)

    Shiao-Hung Chiang; Daxin He

    1994-01-01

    The mineral matter in coal constitutes a major impediment to the direct use of coal in power plants. A concerted effort has been mounted to reduce the ash/sulfur contents in product coal to meet ever more stringent environmental regulations. In recent years, significant advances have taken place in fine coal cleaning technologies. A review of recent developments in advanced physical, chemical and biological processes for deep-cleaning of fine coal is presented.

  2. Application of microorganisms in coal cleaning processes

    International Nuclear Information System (INIS)

    A secure energy supply is one of the basic prerequisites for a sound economic system, a sustained standard and quality of life and, eventually, the social well-being of each individual. For a progressive country like Pakistan, it is obligatory that all energy options be pursued vigorously, including coal utilization, which, given the relatively large resources available, is considered one of the major options for the next few hundred years. Bioprocessing of coal is an emerging technology which has started to receive considerable research attention. Recent research activities involving coal cleaning, direct coal conversion, and indirect conversion of coal-derived materials have generated a plethora of facts regarding the biochemistry, chemistry, and thermodynamic behavior of coal, such that its bioprocessing is on the verge of becoming an acceptable means of treating coals. In this research report, investigations pertaining to the various aspects of coal bioprocessing, including desulfurization and depyritization, are discussed. Bituminous coals varying in total sulfur content from 3-6% were depyritized by more than 90% by mesophilic acidophiles like Thiobacillus ferrooxidans and Thiobacillus thiooxidans and the thermophilic Sulfolobus brierleyi. The archaebacterium Sulfolobus brierleyi was found to desulfurize both inorganic and organic sulfur components of the coal. Conditions were established under which it can remove more than 30% of the organic sulfur present in the coals. Heterotrophic microorganisms, including axenic and soil isolates, were also employed for studying desulfurization. A soil isolate, Oil-2, was found to remove more than 70% of the dibenzothiophenic sulfur present in an oil-water emulsion (1:20 ratio). Pseudomonas putida and the bacterium Oil-2 also removed 60-70% of the organic sulfur present in shale oil. Preliminary results indicate the presence of the putatively known Kodama's pathway in Oil-2. The mass balance for sulfate indicated the possibility of the presence

  3. Analysis of chemical coal cleaning processes. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1980-06-01

    Six chemical coal cleaning processes were examined. Conceptual designs and costs were prepared for these processes and for coal preparation facilities, including physical cleaning and size reduction. Transportation of fine coal in agglomerated and unagglomerated forms was also discussed. The chemical cleaning processes were: Pittsburgh Energy Technology Center, Ledgemont, Ames Laboratory, Jet Propulsion Laboratory (two versions), and the Guth Process (KVB). Three of the chemical cleaning processes are similar in concept: PETC, Ledgemont, and Ames. Each of these is based on the reaction of sulfur with pressurized oxygen, with the controlling factor being the partial pressure of oxygen in the reactor. All of the processes appear technically feasible. Economic feasibility is less certain. The recovery of process chemicals is vital to the JPL and Guth processes. All of the processes consume significant amounts of energy in the form of electric power and coal. Energy recovery and increased efficiency are potential areas for study in future, more detailed designs. The Guth process (formerly designated KVB) appears to be the simplest of the systems evaluated. All of the processes require further engineering to better determine methods for scaling laboratory designs/results to commercial-scale operations. A major area for future engineering is to resolve problems related to handling, feeding, and flow control of the fine and often hot coal.

  4. Separation of mercury in industrial processes of Polish hard steam coals cleaning

    Directory of Open Access Journals (Sweden)

    Wierzchowski Krzysztof

    2016-01-01

    Coal use is regarded as one of the main sources of anthropogenic propagation of mercury in the environment. Coal cleaning is listed among the methods of mercury emission reduction. The article concerns the statistical assessment of mercury separation between coal cleaning products. Two industrial processes employed in Polish coal preparation plants are analysed: coal cleaning in heavy media vessels and coal cleaning in jigs. It was found that the arithmetic mean mercury content in coarse and medium coal size fractions of clean coal from heavy media vessels amounts to 68.9 μg/kg, with most of the results lying below the mean value, while for rejects it amounts to 95.5 μg/kg, around 25 μg/kg greater than in the clean coal. The arithmetic mean mercury content in raw coal smalls amounts to around 118 μg/kg. The cleaning of smalls in jigs results in clean coal and steam coal blends characterized by a mean mercury content of 96.8 μg/kg and rejects with a mean mercury content of 184.5 μg/kg.
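    The reported means imply a simple separation arithmetic between clean coal and rejects; a minimal sketch, using only the values quoted in the abstract, illustrates the difference and the rejects-to-clean enrichment ratio:

```python
# Mean mercury contents reported in the abstract (micrograms per kg).
CLEAN_HM_VESSEL = 68.9    # clean coal from heavy media vessels
REJECTS_HM_VESSEL = 95.5  # rejects from heavy media vessels
CLEAN_JIG = 96.8          # clean coal / steam coal blends from jigs
REJECTS_JIG = 184.5       # rejects from jigs

def enrichment(rejects_ug_kg: float, clean_ug_kg: float) -> float:
    """Ratio of mean mercury content in rejects to that in clean coal."""
    return rejects_ug_kg / clean_ug_kg

# Difference behind the "around 25 ug/kg greater" statement:
print(round(REJECTS_HM_VESSEL - CLEAN_HM_VESSEL, 1))  # prints 26.6
# Jig rejects carry nearly twice the mercury of the jig clean coal:
print(round(enrichment(REJECTS_JIG, CLEAN_JIG), 2))   # prints 1.91
```

    The heavy-media figure shows why the abstract rounds to "around 25 μg/kg": the exact gap between the two reported means is 26.6 μg/kg.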

  5. Bench-scale testing of a micronized magnetite, fine-coal cleaning process

    Energy Technology Data Exchange (ETDEWEB)

    Suardini, P.J. [Custom Coals, International, Pittsburgh, PA (United States)]

    1995-11-01

    Custom Coals, International has installed and is presently testing a 500 lb/hr micronized-magnetite, fine-coal cleaning circuit at PETC's Process Research Facility (PRF). The cost-shared project was awarded as part of the Coal Preparation Program's High Efficiency Preparation Subprogram. The project includes design, construction, testing, and decommissioning of a fully integrated, bench-scale circuit, complete with feed coal classification to remove the minus 30 micron slimes, dense medium cycloning of the 300 by 30 micron feed coal using a nominal minus 10 micron size magnetite medium, and medium recovery using drain-and-rinse screens and various stages and types of magnetic separators. This paper describes the project circuit and goals, including a description of the current project status and the sources of coal and magnetite which are being tested.

  6. Advanced physical fine coal cleaning spherical agglomeration. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1990-09-01

    The project included process development, engineering, construction, and operation of a 1/3 tph proof-of-concept (POC) spherical agglomeration test module. The POC tests demonstrated that physical cleaning of ultrafine coal by agglomeration using heptane can achieve: (1) pyritic sulfur reductions beyond those possible with conventional coal cleaning methods; (2) coal ash contents below those which can be obtained by conventional coal cleaning methods at comparable energy recoveries; (3) energy recoveries of 80 percent or greater measured against the raw coal energy content; (4) complete recovery of the heptane bridging liquid from the agglomerates; and (5) production of agglomerates of 3/8-inch size with less than 30 percent moisture. Test results met or exceeded all of the program objectives. Nominal 3/8-inch size agglomerates with less than 20 percent moisture were produced. The clean coal ash content varied between 1.5 and 5.5 percent by weight (dry basis), depending on feed coal type. Ash reductions of the run-of-mine (ROM) coal were 77 to 83 percent. ROM pyritic sulfur reductions varied from 86 to 90 percent for the three test coals, equating to total sulfur reductions of 47 to 72 percent.

  7. Coal cleaning: a viable strategy for reduced carbon emissions and improved environment in China?

    International Nuclear Information System (INIS)

    China is a dominant energy consumer in the global context, and current energy forecasts emphasise that China's future energy consumption will also rely heavily on coal. Coal use is the major source of the greenhouse gas CO2 and of particles causing serious health damage. This paper looks into the question of whether coal washing might work as a low-cost strategy for both CO2 and particle emission reductions. Coal washing removes dirt and rock from raw coal, resulting in a coal product with higher thermal energy and fewer air pollutants. Coal cleaning capacity has so far not been developed in line with the market potential. In this paper an emerging market for cleaned coal is studied within a CGE model for China. The macro approach captures the repercussions of coal cleaning through increased energy efficiency, lower coal transportation costs and the crowding-out effect of investments in coal washing plants. Coal cleaning stimulates economic growth and reduces particle emissions, but total energy use, coal use and CO2 emissions increase through a rebound effect supported by the vast reserve of underemployed labourers. A carbon tax on fossil fuel combustion has a limited effect on total emissions. The reason is a coal leakage to tax-exempted processing industries.

  8. Aspen Process Flowsheet Simulation Model of a Battelle Biomass-Based Gasification, Fischer-Tropsch Liquefaction and Combined-Cycle Power Plant

    Energy Technology Data Exchange (ETDEWEB)

    None

    1998-10-30

    This study was done to support the research and development program of the National Renewable Energy Laboratory (NREL) in the thermochemical conversion of biomass to liquid transportation fuels using current state-of-the-art technology. The Mitretek study investigated the use of two biomass gasifiers: the RENUGAS gasifier being developed by the Institute of Gas Technology, and the indirectly heated gasifier being developed by Battelle Columbus. The Battelle Memorial Institute of Columbus, Ohio indirectly heated biomass gasifier was selected for this model development because the syngas it produces is better suited for Fischer-Tropsch synthesis with an iron-based catalyst, for which a large amount of experimental data is available. Bechtel, with Amoco as a subcontractor, developed a conceptual baseline design and several alternative designs for indirect coal liquefaction facilities. In addition, ASPEN Plus process flowsheet simulation models were developed for each of the designs. These models were used to perform several parametric studies to investigate various alternatives for improving the economics of indirect coal liquefaction.

  9. Engineering Development of Advanced Physical Fine Coal Cleaning for Premium Fuel Applications

    Energy Technology Data Exchange (ETDEWEB)

    Smit, Frank J.; Shields, Gene L.; Jha, Mahesh C.; Moro, Nick

    1997-09-26

    The primary objective was to develop the design base for commercial fine coal cleaning facilities for producing ultra-clean coals which can be converted into coal-water slurry premium fuel. The coal cleaning technologies to be developed were advanced column flotation and selective agglomeration. The ash in six common bituminous coals, Taggart, Winifrede, Elkhorn No. 3, Indiana VII, Sunnyside and Hiawatha, could be liberated by fine grinding to allow preparation of clean coal meeting premium fuel specifications (<1-2 lb/MBtu ash and <0.6 lb/MBtu sulfur) by laboratory and bench-scale column flotation or selective agglomeration. Over 2,100 tons of coal were cleaned in the PDU at feed rates between 2,500 and 6,000 lb/h by Microcel™ column flotation and by selective agglomeration using recycled heptane as the bridging liquid. Parametric testing of each process and 72-hr production runs were completed on each of the three test coals. The following results were achieved after optimization of the operating parameters:

  10. Engineering development of advanced physical fine coal cleaning technologies: Froth flotation

    Energy Technology Data Exchange (ETDEWEB)

    1990-01-01

    A study conducted by the Pittsburgh Energy Technology Center of sulfur emissions from about 1300 United States coal-fired utility boilers indicated that half of the emissions were the result of burning coals having greater than 1.2 pounds of SO{sub 2} per million BTU. This was mainly attributed to the high pyritic sulfur content of the boiler fuel. A significant reduction in SO{sub 2} emissions could be accomplished by removing the pyrite from the coals by advanced physical fine coal cleaning. An engineering development project was prepared to build upon the basic research effort conducted under a solicitation for research into Fine Coal Surface Control. The engineering development project is intended to use general plant design knowledge and conceptualize a plant to utilize advanced froth flotation technology to process coal and produce a product having maximum practical pyritic sulfur reduction consistent with maximum practical BTU recovery. This document is the eighth quarterly report prepared in accordance with the project reporting requirements, covering the period from July 1, 1990 to September 30, 1990. The overall project scope of the engineering development project is to conceptually develop a commercial flowsheet to maximize pyritic sulfur reduction at practical energy recovery values. The data from the basic research on coal surfaces, bench-scale testing and proof-of-concept scale testing will be utilized to design a final conceptual flowsheet. The economics of the flowsheet will be determined to enable industry to assess the feasibility of incorporating the advanced fine coal cleaning technology into the production of clean coal for generating electricity. 22 figs., 11 tabs.

  11. Engineering development of advanced physical fine coal cleaning for premium fuel applications

    International Nuclear Information System (INIS)

    Bechtel, together with Amax Research and Development Center (Amax R&D), has prepared this study, which provides conceptual cost estimates for the production of premium quality coal-water slurry fuel (CWF) in a commercial plant. Two scenarios are presented, one using column flotation technology and the other selective agglomeration to clean the coal to the required quality specifications. This study forms part of the US Department of Energy program Engineering Development of Advanced Physical Fine Coal Cleaning for Premium Fuel Applications (Contract No. DE-AC22-92PC92208), under Task 11, Project Final Report. The primary objective of the Department of Energy program is to develop the design base for prototype commercial advanced fine coal cleaning facilities capable of producing ultra-clean coals suitable for conversion to stable and highly loaded CWF. The fuels should contain less than 2 lb ash/MBtu (860 grams ash/GJ) of HHV and preferably less than 1 lb ash/MBtu (430 grams ash/GJ). The advanced fine coal cleaning technologies to be employed are advanced column froth flotation and selective agglomeration. It is further stipulated that operating conditions during the advanced cleaning process should recover not less than 80 percent of the carbon content (heating value) in the run-of-mine source coal. These goals for ultra-clean coal quality are to be met under the constraint that annualized coal production costs do not exceed $2.5/MBtu ($2.37/GJ), including the mine-mouth cost of the raw coal. A further objective of the program is to determine the distribution of a selected suite of eleven toxic trace elements between the product CWF and the refuse stream of the cleaning processes. Laboratory, bench-scale and Process Development Unit (PDU) tests to evaluate advanced column flotation and selective agglomeration were completed earlier under this program with selected coal samples. A PDU with a capacity of 2 st/h was designed by Bechtel and installed at
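    The paired ash targets (2 lb/MBtu given as 860 g/GJ, 1 lb/MBtu as 430 g/GJ) can be cross-checked with a short unit conversion, assuming the standard factors 1 lb = 453.592 g and 1 MBtu = 1.05506 GJ:

```python
LB_TO_G = 453.592     # grams per pound (standard conversion factor)
MBTU_TO_GJ = 1.05506  # gigajoules per million Btu (standard conversion factor)

def lb_per_mbtu_to_g_per_gj(lb_per_mbtu: float) -> float:
    """Convert an ash loading from lb/MBtu to grams/GJ."""
    return lb_per_mbtu * LB_TO_G / MBTU_TO_GJ

print(round(lb_per_mbtu_to_g_per_gj(2.0)))  # prints 860
print(round(lb_per_mbtu_to_g_per_gj(1.0)))  # prints 430
```

    The same factor explains the cost constraint: $2.5/MBtu divided by 1.05506 GJ/MBtu gives the quoted $2.37/GJ.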

  12. POC-SCALE TESTING OF A DRY TRIBOELECTROSTATIC SEPARATOR FOR FINE COAL CLEANING

    Energy Technology Data Exchange (ETDEWEB)

    R.H. Yoon; G.H. Luttrell; E.S. Yan; A.D. Walters

    2001-04-30

    Numerous advanced coal cleaning processes have been developed in recent years that are capable of substantially reducing both ash- and sulfur-forming minerals in coal. However, most of the processes involve fine grinding and use water as the cleaning medium; therefore, the clean coal products must be dewatered before they can be transported and burned. Unfortunately, dewatering fine coal is costly, which makes it difficult to deploy advanced coal cleaning processes for commercial applications. As a means of avoiding the problems associated with fine coal dewatering, the National Energy Technology Laboratory (NETL) developed a dry coal cleaning process in which mineral matter is separated from coal without using water. In this process, pulverized coal is subjected to triboelectrification before being placed in an electric field for electrostatic separation. The triboelectrification is accomplished by passing the pulverized coal through an in-line mixer made of copper. Copper has a work function that lies between that of carbonaceous material (coal) and mineral matter. Thus, coal particles impinging on the copper wall lose electrons to the metal, thereby acquiring positive charges, while mineral matter impinging on the wall gains electrons to acquire negative charges. The charged particles then pass through an electric field where they are separated according to their charges into two or more products, depending on the configuration of the separator. The results obtained at NETL showed that the process is capable of removing more than 90% of the pyritic sulfur and 70% of the ash-forming minerals from a number of eastern U.S. coals. However, the BTU recoveries were less than desirable. The laboratory-scale batch triboelectrostatic separator (TES) used by NETL relied on adhering charged particles to parallel electrode surfaces and scraping them off. Therefore, its throughput will be proportional to the electrode surface area. If this laboratory device is scaled up as is, it would

  13. POC-scale testing of a dry triboelectrostatic separator for fine coal cleaning

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, R.H.; Luttrell, G.H.; Adel, G.T. [Virginia Polytechnic Institute and State Univ., Blacksburg, VA (United States)

    1995-11-01

    Numerous advanced coal cleaning processes have been developed in recent years that are capable of substantially reducing both the ash and sulfur contents of run-of-mine coals. The extent of cleaning depends on the liberation characteristics of the coal, which generally improve with decreasing particle size. However, since most of the advanced technologies are wet processes, the clean coal product must be dewatered before it can be transported and burned in conventional boilers. This additional treatment step significantly increases the processing cost and makes the industrial applicability of these advanced technologies much less attractive. In order to avoid the problems associated with fine coal dewatering, researchers at the Pittsburgh Energy Technology Center (PETC) developed a novel triboelectrostatic separation (TES) process that can remove mineral matter from dry coal. In this technique, finely pulverized coal is brought into contact with a material (such as copper) having a work function intermediate to those of the carbonaceous material and the associated mineral matter. Carbonaceous particles, having a relatively low work function, become positively charged, while particles of mineral matter, having significantly higher work functions, become negatively charged. Once the particles become selectively charged, a separation can be achieved by passing the particle stream through an electric field. Details of the triboelectrostatic charging phenomenon have been discussed elsewhere (Inculet, 1984).
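    The work-function logic described in this and the preceding record can be sketched as a toy model: a particle contacting the copper wall charges positive if its work function is below copper's and negative if above. The numeric work-function values here are illustrative assumptions for the sketch, not data from either abstract:

```python
# Toy model of triboelectric charging against a copper mixer wall.
# Work-function values (eV) are illustrative assumptions only.
COPPER_WF = 4.6  # assumed wall work function, intermediate between the two classes

PARTICLES = {
    "carbonaceous (coal)": 4.0,  # assumed below copper -> loses electrons
    "pyrite": 5.0,               # assumed above copper -> gains electrons
    "clay mineral": 5.2,         # assumed above copper -> gains electrons
}

def charge_sign(particle_wf_ev: float, wall_wf_ev: float = COPPER_WF) -> str:
    """Lower work function loses electrons to the wall and charges positive;
    higher work function gains electrons and charges negative."""
    if particle_wf_ev < wall_wf_ev:
        return "+"
    if particle_wf_ev > wall_wf_ev:
        return "-"
    return "0"

for name, wf in PARTICLES.items():
    print(f"{name}: {charge_sign(wf)}")  # coal charges +, minerals charge -
```

    With the charges selective in this way, a downstream electric field deflects coal and mineral particles toward opposite electrodes, which is the separation principle both records describe.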

  14. BOUND PERIODICAL HOLDINGS BATTELLE - NORTHWEST LIBRARY

    Energy Technology Data Exchange (ETDEWEB)

    None

    1967-05-01

    This report lists the bound periodicals in the Technical Library at the Pacific Northwest Laboratory, operated by Battelle Memorial Institute. It was prepared from a computer program and is arranged in two parts. Part one is an alphabetical list of journals by title; part two is an arrangement of the journals by subject. The list headings are self-explanatory, with the exception of the title code, which is necessary in the machine processing. The listing is complete through June, 1966 and updates an earlier publication issued in March, 1965.

  15. Engineering development of advanced physical fine coal cleaning for premium fuel applications

    Energy Technology Data Exchange (ETDEWEB)

    Jha, M.C.; Smit, F.J.; Shields, G.L. [AMAX R&D Center/ENTECH Global Inc., Golden, CO (United States)]

    1995-11-01

    The objective of this project is to develop the engineering design base for prototype fine coal cleaning plants based on Advanced Column Flotation and Selective Agglomeration processes for premium fuel and near-term applications. Removal of toxic trace elements is also being investigated. The scope of the project includes laboratory research and bench-scale testing of each process on six coals followed by design, construction, and operation of a 2 tons/hour process development unit (PDU). Three coals will be cleaned in tonnage quantity and provided to DOE and its contractors for combustion evaluation. Amax R&D (now a subsidiary of Cyprus Amax Mineral Company) is the prime contractor. Entech Global is managing the project and performing most of the research and development work as an on-site subcontractor. Other participants in the project are Cyprus Amax Coal Company, Arcanum, Bechtel, TIC, University of Kentucky and Virginia Tech. Drs. Keller of Syracuse and Dooher of Adelphi University are consultants.

  16. Coal surface control for advanced physical fine coal cleaning technologies. Final report, September 19, 1988--August 31, 1992

    Energy Technology Data Exchange (ETDEWEB)

    Morsi, B.I.; Chiang, S.H.; Sharkey, A.; Blachere, J.; Klinzing, G.; Araujo, G.; Cheng, Y.S.; Gray, R.; Streeter, R.; Bi, H.; Campbell, P.; Chiarlli, P.; Ciocco, M.; Hittle, L.; Kim, S.; Kim, Y.; Perez, L.; Venkatadri, R.

    1992-12-31

    This final report presents the research work carried out on the Coal Surface Control for Advanced Physical Fine Coal Cleaning Technologies project, sponsored by the US Department of Energy, Pittsburgh Energy Technology Center (DOE/PETC). The project was to support the engineering development of the selective agglomeration technology in order to reduce the sulfur content of US coals for controlling SO{sub 2} emissions (i.e., acid rain precursors). The overall effort was a part of the DOE/PETC's Acid Rain Control Initiative (ARCI). The overall objective of the project is to develop techniques for coal surface control prior to the advanced physical fine coal cleaning process of selective agglomeration in order to achieve 85% pyritic sulfur rejection at an energy recovery greater than 85% based on run-of-mine coal. Surface control is meant to encompass surface modification during grinding and laboratory beneficiation testing. The project includes the following tasks: project planning; methods for analysis of samples; development of a standard beneficiation test; grinding studies; modification of particle surfaces; and exploratory R&D and support. The coal samples used in this project include three base coals, Upper Freeport - Indiana County, PA, Pittsburgh No. 8 - Belmont County, OH, and Illinois No. 6 - Randolph County, IL, and three additional coals, Upper Freeport - Grant County, WV, Kentucky No. 9 - Hopkins County, KY, and Wyodak - Campbell County, WY. A total of 149 drums of coal were received.

  17. Damage and deterioration mechanism and curing technique of concrete structure in main coal cleaning plants

    Institute of Scientific and Technical Information of China (English)

    LV Heng-lin; ZHAO Cheng-ming; SONG Lei; MA Ying; XU Chun-hua

    2009-01-01

    Concrete structures in main coal cleaning plants have been rebuilt and reinforced in the coal mines of the Shanghai Datun Energy Sources Co. Ltd., the first colliery of the Pingdingshan Coal Co. Ltd. and the Sanhejian mine of the Xuzhou Mining Group Co. Ltd. In these projects, the operating environment and reliability of concrete structures in the main plants of the three companies were investigated and the safety of the structures was inspected. Qualitative and quantitative analyses were made of the special natural, technological and mechanical environments around the structures. On the basis of these analyses, we discuss the long-term, combined actions of the harsh natural (corrosive gases, liquids and solids) and mechanical environments on concrete structures and further investigate the damage and deterioration mechanisms and curing techniques of concrete structures in main coal cleaning plants. Our study can provide a theoretical basis for ensuring the reliability of concrete structures in main coal cleaning plants.

  18. Nuclear materials transportation at Battelle

    International Nuclear Information System (INIS)

    Battelle-Columbus has been a pioneer in designing and developing shipping containers for its own needs and to meet the requirements of the nuclear industry. It has participated in the design and testing of approximately 80 licensed shipping casks. Its involvement has included cask design and testing and the preparation and updating of safety analysis reports. Battelle's capabilities also include all the computer codes needed for thermal, shielding, criticality, and structural analyses, as well as a drop test facility for validating codes and obtaining data to supplement structural analyses. These facilities have also been used in the design and licensing of Battelle's four shipping containers, all of which are currently in service. These casks are used principally to transport radioactive sources, surveillance capsules, and spent research reactor fuel. Battelle-Columbus designed, licensed, built, and maintains four shipping casks, primarily to support its Hot Laboratory postirradiation programs on highly irradiated structural and spent fuel materials. These casks vary in size and shipping capacity. Weights range from 1200 to 23,000 pounds. Internal cavities range from 4-1/2 in. I.D. x 5 in. deep to 15-1/2 in. I.D. x 54 in. deep. Each is licensed by the U.S. NRC for Type fissile quantities and each has an IAEA Competent Authority Permit. Although they are used primarily for Battelle's own purposes, the casks are available for lease to industry and the government. Battelle-Columbus averages about 150 outgoing and incoming shipments of radioactive material a year, in packages that range from 50,000-pound spent fuel casks to small 5-gallon cans. The regulatory requirements for each shipment are becoming more detailed and restrictive every day; thus each shipment can almost be considered a major project in itself. Three years ago, a truckload of radioactive waste leaving the site required the generation of only two documents; now 13 internal and external documents are required. We

  19. Coal Cleaning Using Resonance Disintegration for Mercury and Sulfur Reduction Prior to Combustion

    Energy Technology Data Exchange (ETDEWEB)

    Andrew Lucero

    2005-04-01

    Coal-cleaning processes have been utilized to increase the heating value of coal by extracting ash-forming minerals from the coal. These processes involve the crushing or grinding of raw coal followed by physical separation processes that take advantage of the density difference between carbonaceous particles and mineral particles. In addition to the desired increase in the heating value of the coal, a significant reduction of the sulfur content of the coal fed to a combustion unit is effected by the removal of pyrite and other sulfides found in the mineral matter. WRI is assisting PulseWave to develop an alternate, more efficient method of liberating and separating the undesirable mineral matter from the carbonaceous matter in coal. The approach is based on PulseWave's patented resonance disintegration technology, which reduces the particle size of materials by the application of destructive resonance, shock waves, and vortex-generating forces. An Illinois No. 5 coal, a Wyodak coal, and a Pittsburgh No. 8 coal were processed using the resonance disintegration apparatus and then subjected to conventional density separations. Initial microscopic results indicate that up to 90% of the pyrite could be liberated from the coal in the machine, but limitations in the density separations reduced the overall effectiveness of contaminant removal. Approximately 30-80% of the pyritic sulfur and 30-50% of the mercury was removed from the coal. The three coals (both with and without the pyritic phase separated out) were tested in WRI's 250,000 Btu/hr Combustion Test Facility, designed to replicate a coal-fired utility boiler. The flue gases were characterized for elemental, particle-bound, and total mercury in addition to sulfur. The results indicated that pre-combustion cleaning could reduce a large fraction of the mercury emissions.

  20. Engineering development of advanced physical fine coal cleaning technologies - froth flotation

    International Nuclear Information System (INIS)

    In 1988, ICF Kaiser Engineers was awarded DOE Contract No. DE-AC22-88PC88881 to research, develop, engineer and design a commercially acceptable advanced froth flotation coal cleaning technology. The DOE initiative is in support of the continued utilization of our most abundant energy resource. Besides the goal of commercial viability, coal cleaning performance and product quality goals were established by the DOE for this and similar projects. Primary among these were the goals of 85 percent energy recovery and 85 percent pyrite rejection. Three nationally important coal resources were used for this project: the Pittsburgh No. 8 coal, the Upper Freeport coal, and the Illinois No. 6 coal. Following is a summary of the key findings of this project.

  1. Engineering development of advanced physical fine coal cleaning technologies - froth flotation

    Energy Technology Data Exchange (ETDEWEB)

    Ferris, D.D.; Bencho, J.R. [ICF Kaiser Engineers, Inc., Pittsburgh, PA (United States)

    1995-11-01

    In 1988, ICF Kaiser Engineers was awarded DOE Contract No. DE-AC22-88PC88881 to research, develop, engineer and design a commercially acceptable advanced froth flotation coal cleaning technology. The DOE initiative is in support of the continued utilization of our most abundant energy resource. Besides the goal of commercial viability, coal cleaning performance and product quality goals were established by the DOE for this and similar projects. Primary among these were the goals of 85 percent energy recovery and 85 percent pyrite rejection. Three nationally important coal resources were used for this project: the Pittsburgh No. 8 coal, the Upper Freeport coal, and the Illinois No. 6 coal. Following is a summary of the key findings of this project.

  2. POC-SCALE TESTING OF A DRY TRIBOELECTROSTATIC SEPARATOR FOR FINE COAL CLEANING

    International Nuclear Information System (INIS)

    The objective of the project is to further develop the triboelectrostatic separation (TES) process developed at the Federal Energy Technology Center (FETC) and to test the process at proof-of-concept (POC) scale. This process has a distinct advantage over other coal cleaning processes in that it does not entail the costly step of dewatering. The POC-scale unit is to be developed based on (i) the charging characteristics of coal and mineral matter, which can be determined using the novel on-line tribocharge measuring device developed at Virginia Tech, and (ii) the results obtained from bench-scale TES tests conducted on three different coals. During the past quarter, most of the personnel assigned to this project have been performing work elements associated with the engineering design (Task 3) of the TES process. This activity has been subdivided into three subtasks, i.e., Charger Tests (Subtask 3.1), Separator Tests (Subtask 3.2), and Final POC Design (Subtask 3.3). In Subtask 3.1, several different tribocharging devices have been constructed using materials of various work functions. They are currently being tested to establish the best materials for designing and manufacturing optimum tribochargers that can maximize the charge differences between coal and mineral matter. In Subtask 3.2, bench-scale cleaning tests have been conducted to study the effects of the various operating and design parameters on the performance of the electrostatic separator. Two different TES units have been tested to date. One uses drum-type electrodes to separate charged particles, while the other uses plate-type electrodes for the separation. The test results showed that a major improvement in separation efficiency can be achieved by recycling the middlings back to the feed stream. It has also been established that the major source of inefficiency arises from the difficulty of separating ultrafine particles. Understanding the behavior of the ultrafine particles and finding

  3. Engineering development of advanced physical fine coal cleaning for premium fuel applications. Task 6 -- Selective agglomeration laboratory research and engineering development for premium fuels

    Energy Technology Data Exchange (ETDEWEB)

    Moro, N.; Jha, M.C.

    1997-06-27

    The primary goal of this project is the engineering development of two advanced physical fine coal cleaning processes, column flotation and selective agglomeration, for premium fuel applications. The project scope included laboratory research and bench-scale testing on six coals to optimize these processes, followed by the design, construction, and operation of a 2 t/hr process development unit (PDU). The project began in October 1992 and is scheduled for completion by September 1997. This report presents the findings of Subtask 6.5, Selective Agglomeration Bench-Scale Testing and Process Scale-up. During this work, six project coals, namely Winifrede, Elkhorn No. 3, Sunnyside, Taggart, Indiana VII, and Hiawatha, were processed in a 25 lb/hr continuous selective agglomeration bench-scale test unit.

  4. Engineering development of advanced physical fine coal cleaning for premium fuel applications. Quarterly technical progress report 15, April--June 1996

    Energy Technology Data Exchange (ETDEWEB)

    Moro, N.; Shields, G.L.; Smit, F.J.; Jha, M.C.

    1996-07-25

    Goal is engineering development of two advanced physical fine coal cleaning processes, column flotation and selective agglomeration, for premium fuel applications. Scope includes laboratory research and bench-scale testing on 6 coals to optimize these processes, followed by design/construction/operation of a 2-t/hr PDU. During this quarter, parametric testing of the 30-in. Microcel{trademark} flotation column at the Lady Dunn plant was completed and clean coal samples submitted for briquetting. A study of a novel hydrophobic dewatering process continued at Virginia Tech. Benefits of slurry PSD (particle size distribution) modification and pH adjustment were evaluated for the Taggart and Hiawatha coals; they were found to be small. Agglomeration bench-scale test results were positive, meeting product ash specifications. PDU Flotation Module operations continued; work was performed with Taggart coal to determine scaleup similitude between the 12-in. and 6-ft Microcel{trademark} columns. Construction of the PDU selective agglomeration module continued.

  5. Development, testing, and demonstration of an optimal fine coal cleaning circuit

    Energy Technology Data Exchange (ETDEWEB)

    Mishra, M.; Placha, M.; Bethell, P. [and others]

    1995-11-01

    The overall objective of this project is to improve the efficiency of fine coal cleaning. The project will be completed in two phases: bench-scale testing and demonstration of four advanced flotation cells; and in-plant proof-of-concept (POC) pilot-plant testing of two flotation cells, individually and in two-stage combinations. The goal is to ascertain whether a two-stage circuit can reduce capital and operating costs while achieving improved separation efficiency. The plant selected for this project, the Cyprus Emerald Coal Preparation Plant, cleans 1200 tph of raw coal. The plant produces approximately 4 million tonnes of clean coal per year at an average as-received energy content of 30.2 MJ/kg (13,000 Btu/lb).
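    The quoted heating value can be cross-checked by unit conversion, using the standard factors 1 Btu = 1.055056 kJ and 1 lb = 0.45359237 kg.

```python
# Cross-check of the stated heating value: 13,000 Btu/lb vs 30.2 MJ/kg.
BTU_TO_KJ = 1.055056      # kJ per Btu
LB_TO_KG = 0.45359237     # kg per lb

def btu_per_lb_to_mj_per_kg(q):
    return q * BTU_TO_KJ / LB_TO_KG / 1000.0

mj = btu_per_lb_to_mj_per_kg(13000)
print(f"{mj:.1f} MJ/kg")   # ~30.2 MJ/kg, matching the text
```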

  6. POC-scale testing of a dry triboelectrostatic separator for fine coal cleaning. Second quarterly technical progress report, January 1, 1996--March 31, 1996

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, R.-H.; Luttrell, G.H.; Adel, G.T.

    1996-08-01

    The Pittsburgh Energy Technology Center (PETC) developed a triboelectrostatic separation (TES) process which is capable of removing mineral matter from coal without using water. A distinct advantage of this dry coal cleaning process is that it does not entail costly steps of dewatering, which is a common problem associated with conventional fine coal cleaning processes. It is the objective of this project to conduct a series of proof-of-concept (POC) scale tests at a throughput of 200--250 kg/hr and obtain scale-up information. Prior to the POC testing, bench-scale test work will be conducted with the objective of increasing the separation efficiency and throughput, for which changes in the basic designs for the charger and the separator may be necessary. The bench- and POC-scale test work will be carried out to evaluate various operating parameters and establish a reliable scale-up procedure. The scale-up data will be used to analyze the economic merits of the TES process. All required documents associated with project planning were completed and submitted to DOE for approval during the second quarter of this project. Approval of the project work plan is still pending at this time subject to additional review by DOE of requested modifications to the statement of work. Accomplishments during this reporting period include the set-up of an apparatus for assessing tribocharger performance, continued construction of the bench-scale (1 kg/hr) triboelectrostatic separator and initial development of a fundamental model for predicting the motion of charged particles in a non-uniform electrostatic field.

  7. Research on Clean Coal Technology with Computer Automatic Control

    Institute of Scientific and Technical Information of China (English)

    杨荣光

    2013-01-01

      In the course of coal utilization, large quantities of harmful gases, dust, and other pollutants are produced; in developing countries in particular, this pollution is severe. In today's society, public environmental awareness is steadily growing, and the environmental problems caused by coal use are receiving increasing international attention. Scientists and engineers are investigating coal cleaning and pollution-reduction technologies ever more deeply, and many new purification methods and application technologies have emerged. Applying computer automatic control technology to develop new coal chemical processes can, on the one hand, improve economic returns more effectively and, on the other, achieve effective coal cleaning and thereby protect the environment.

  8. Engineering Development of Advanced Physical Fine Coal Cleaning for Premium Fuel Applications: Task 9 - Selective agglomeration Module Testing and Evaluation.

    Energy Technology Data Exchange (ETDEWEB)

    Moro, N.; Jha, M.C.

    1997-09-29

    The primary goal of this project was the engineering development of two advanced physical fine coal cleaning processes, column flotation and selective agglomeration, for premium fuel applications. The project scope included laboratory research and bench-scale testing of both processes on six coals to optimize the processes, followed by the design, construction, and operation of a 2 t/hr process development unit (PDU). The project began in October, 1992, and is scheduled for completion by September 1997. This report summarizes the findings of all the selective agglomeration (SA) test work performed with emphasis on the results of the PDU SA Module testing. Two light hydrocarbons, heptane and pentane, were tested as agglomerants in the laboratory research program which investigated two reactor design concepts: a conventional two-stage agglomeration circuit and a unitized reactor that combined the high- and low-shear operations in one vessel. The results were used to design and build a 25 lb/hr bench-scale unit with two-stage agglomeration. The unit also included a steam stripping and condensation circuit for recovery and recycle of heptane. It was tested on six coals to determine the optimum grind and other process conditions that resulted in the recovery of about 99% of the energy while producing low ash (1-2 lb/MBtu) products. The fineness of the grind was the most important variable with the D80 (80% passing size) varying in the 12 to 68 micron range. All the clean coals could be formulated into coal-water-slurry-fuels with acceptable properties. The bench-scale results were used for the conceptual and detailed design of the PDU SA Module which was integrated with the existing grinding and dewatering circuits. The PDU was operated for about 9 months. During the first three months, the shakedown testing was performed to fine tune the operation and control of various equipment. 
This was followed by parametric testing, optimization/confirmatory testing, and finally a

  9. Engineering development of advanced physical fine coal cleaning for premium fuel applications. Quarterly technical progress report 11, April--June, 1995

    Energy Technology Data Exchange (ETDEWEB)

    Moro, N.; Shields, G.L.; Smit, F.J.; Jha, M.C.

    1995-07-31

    The primary goal of this project is the engineering development of two advanced physical fine coal cleaning processes, column flotation and selective agglomeration, for premium fuel applications. The project scope includes laboratory research and bench-scale testing on six coals to optimize these processes, followed by design and construction of a 2-t/hr process development unit (PDU). The PDU will then be operated to generate 200 tons of each of three project coals by each process. During Quarter 11 (April--June, 1995), work continued on the Subtask 3.2 in-plant testing of the Microcel{trademark} flotation column at the Lady Dunn Preparation Plant with the installation and calibration of a refurbished 30-inch diameter column. The evaluation of toxic trace element data for column flotation samples continued, with preliminary analysis indicating that reasonably good mass balances were achieved for most elements, and that significant reductions in the concentration of many elements were observed from raw coal, to flotation feed, to flotation product samples. Significant progress was made on Subtask 6.5 selective agglomeration bench-scale testing. Data from this work indicate that project ash specifications can be met for all coals evaluated, and that the bulk of the bridging liquid (heptane) can be removed from the product for recycle to the process. The detailed design of the 2 t/hr selective agglomeration module progressed this quarter with the completion of several revisions of both the process flow, and the process piping and instrument diagrams. Procurement of coal for PDU operation began with the purchase of 800 tons of Taggart coal. Construction of the 2 t/hr PDU continued through this reporting quarter and is currently approximately 60% complete.

  10. Decommissioning and Decontamination Program: Battelle Plutonium Facility, Environmental assessment

    International Nuclear Information System (INIS)

    This assessment describes the decontamination of the Battelle-Columbus Plutonium Facility and removal from the site of all contaminated material associated with or produced by the Plutonium Facility. Usable uncontaminated material will be disposed of by procedures normally employed in scrap declaration and transfer. Contaminated waste will be transported to approved radioactive waste storage sites. 5 refs., 1 fig

  11. COAL CLEANING VIA LIQUID-FLUIDIZED BED CLASSIFICATION (LFBC) WITH SELECTIVE SOLVENT SWELLING

    Energy Technology Data Exchange (ETDEWEB)

    J. M. Calo

    2000-12-01

    The concept of coal beneficiation due to particle segregation in water-fluidized beds, and its improvement via selective solvent-swelling of organic material-rich coal particles, was investigated in this study. Particle size distributions and their behavior were determined using image analysis techniques, and beneficiation effects were explored via measurements of the ash content of segregated particle samples collected from different height locations in a 5 cm diameter liquid-fluidized bed column (LFBC). Both acetone and phenol were found to be effective swelling agents for both Kentucky No. 9 and Illinois No. 6 coals, considerably increasing mean particle diameters, and shifting particle size distributions to larger sizes. Acetone was a somewhat more effective swelling solvent than phenol. The use of phenol was investigated, however, to demonstrate that low cost, waste solvents can be effective as well. For unswollen coal particles, the trend of increasing particle size from top to bottom in the LFBC was observed in all cases. Since the organic matter in the coal tends to concentrate in the smaller particles, the larger particles are typically denser. Consequently, the LFBC naturally tends to separate coal particles according to mineral matter content, both due to density and size. The data for small (40-100 µm), solvent-swollen particles clearly showed improved beneficiation with respect to segregation in the water-fluidized bed than was achieved with the corresponding unswollen particles. This size range is quite similar to that used in pulverized coal combustion. The original process concept was amply demonstrated in this project. Additional work remains to be done, however, in order to develop this concept into a full-scale process.
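    The combined density-and-size segregation described above can be sketched with Stokes' law for terminal settling velocity, v = (ρp − ρf) g d² / (18 μ). The densities and diameters below are illustrative values for organic-rich and mineral-rich grains, not data from the study, and the sketch ignores the small density decrease that accompanies solvent swelling.

```python
# Stokes terminal settling velocity in water, illustrating why the LFBC
# segregates particles by both density and size. Illustrative values only.
G = 9.81            # gravitational acceleration, m/s^2
RHO_WATER = 1000.0  # water density, kg/m^3
MU_WATER = 1.0e-3   # water viscosity, Pa.s

def stokes_velocity(rho_p, d):
    # rho_p: particle density (kg/m^3); d: particle diameter (m)
    return (rho_p - RHO_WATER) * G * d**2 / (18.0 * MU_WATER)

organic_rich = stokes_velocity(1300.0, 70e-6)   # low-density 70 um grain
mineral_rich = stokes_velocity(2000.0, 70e-6)   # denser grain, same size
swollen      = stokes_velocity(1300.0, 100e-6)  # organic grain after swelling
print(organic_rich, mineral_rich, swollen)      # m/s
```

    Mineral-rich grains of the same size settle faster, so they collect lower in the column; swelling enlarges the organic-rich grains and thus shifts their settling behavior, which is the handle the solvent-swelling step exploits.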

  12. Alkalis in Coal and Coal Cleaning Products / Alkalia W Węglu I Productach Jego Wzbogacania

    Science.gov (United States)

    Bytnar, Krzysztof; Burmistrz, Piotr

    2013-09-01

    In the coking process, the prevailing part of the alkalis contained in the coal charge reports to the coke. The alkali content of coal (and hence of coke) is determined mainly by the content of two elements: sodium and potassium. The presence of these elements in coal is connected with their occurrence in the mineral matter and moisture of the coal. In the mineral matter and moisture of the coals used for coke production, the determinable sodium content is 26.6 to 62.9 per cent, and the potassium content 37.1 to 73.4 per cent, of the total alkalis. The major carriers of alkalis are clay minerals; occasionally alkalis are found in micas and feldspars. Alkalis contained in the moisture of coals used for coke production account for 17.8 to 62.0 per cent of the total alkalis present. The presence of sodium and potassium in the coal moisture is strictly connected with the presence of chloride ions. Analysis of the water drained during dewatering of the flotation concentrate showed that the Na to K mass ratio in the coal moisture is 20:1. An increased amount of alkalis in the coal blend results in an increased alkali content in the coke, which in turn increases coke reactivity (CRI index) and decreases coke strength after reaction (CSR index), determined with the Nippon Steel Co. method.
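    The reported 20:1 Na:K mass ratio in coal moisture can be restated as a molar ratio using standard atomic masses; the conversion below is an added illustration, not a figure from the paper.

```python
# Converting the reported Na:K mass ratio of 20:1 in coal moisture to a
# molar ratio, using standard atomic masses.
M_NA, M_K = 22.990, 39.098   # g/mol

def mass_to_molar_ratio(mass_na, mass_k):
    return (mass_na / M_NA) / (mass_k / M_K)

ratio = mass_to_molar_ratio(20.0, 1.0)
print(f"Na:K molar ratio ~ {ratio:.0f}:1")   # ~34:1
```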

  13. Comparison of COMPARE and BEACON subcompartment analyses of Battelle-Frankfurt containment tests

    International Nuclear Information System (INIS)

    This report presents the results of computations performed with the COMPARE/MOD1 and BEACON/MOD3 computer codes for selected Battelle-Frankfurt loss-of-coolant accident experiments. COMPARE is used widely to perform nuclear power plant containment subcompartment analyses, and BEACON is an advanced multiphase, multidimensional best-estimate code. The objective of this study was to evaluate the margins of COMPARE calculations by comparing them with BEACON calculations and test data. The calculations were performed for the Battelle-Frankfurt D3, D6, and C9 tests. Descriptions of the two codes and the Battelle-Frankfurt experiments are included. Comparisons of the codes' calculations and experimental data for absolute pressure, differential pressure, and temperature are presented for margin evaluation. Evaluations of the sensitivity of BEACON calculations to variations in model noding, form loss, and vent area modeling are presented. Conclusions summarizing the results of the COMPARE margin evaluation and BEACON sensitivity studies are given as well

  14. Technical and economic assessment of producing hydrogen by reforming syngas from the Battelle indirectly heated biomass gasifier

    International Nuclear Information System (INIS)

    The technical and economic feasibility of producing hydrogen from biomass by means of indirectly heated gasification and steam reforming was studied. A detailed process model was developed in ASPEN Plus{trademark} to perform material and energy balances. The results of this simulation were used to size and cost major pieces of equipment from which the determination of the necessary selling price of hydrogen was made. A sensitivity analysis was conducted on the process to study hydrogen price as a function of biomass feedstock cost and hydrogen production efficiency. The gasification system used for this study was the Battelle Columbus Laboratory (BCL) indirectly heated gasifier. The heat necessary for the endothermic gasification reactions is supplied by circulating sand from a char combustor to the gasification vessel. Hydrogen production was accomplished by steam reforming the product synthesis gas (syngas) in a process based on that used for natural gas reforming. Three process configurations were studied. Scheme 1 is the full reforming process, with a primary reformer similar to a process furnace, followed by a high temperature shift reactor and a low temperature shift reactor. Scheme 2 uses only the primary reformer, and Scheme 3 uses the primary reformer and the high temperature shift reactor. A pressure swing adsorption (PSA) system is used in all three schemes to produce a hydrogen product pure enough to be used in fuel cells. Steam is produced through detailed heat integration and is intended to be sold as a by-product
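    The difference between the three schemes can be seen with idealized stoichiometric bookkeeping: reforming converts CH4 in the syngas (CH4 + H2O → CO + 3 H2), and each shift stage converts a further fraction of the CO (CO + H2O → CO2 + H2). The syngas composition and shift conversions below are hypothetical, chosen only to illustrate why Scheme 1 (reformer plus both shift reactors) yields the most hydrogen.

```python
# Idealized H2 yield per mole basis of syngas for the three schemes.
# Reforming: CH4 + H2O -> CO + 3 H2;  shift: CO + H2O -> CO2 + H2.
def h2_yield(n_h2, n_co, n_ch4, shift_conversion):
    co_total = n_co + n_ch4              # reforming turns each CH4 into CO
    h2 = n_h2 + 3.0 * n_ch4              # plus 3 H2 per CH4 reformed
    h2 += shift_conversion * co_total    # shift converts a fraction of the CO
    return h2

syngas = dict(n_h2=1.0, n_co=1.0, n_ch4=0.5)          # hypothetical moles
scheme1 = h2_yield(**syngas, shift_conversion=0.95)   # HT + LT shift
scheme2 = h2_yield(**syngas, shift_conversion=0.0)    # reformer only
scheme3 = h2_yield(**syngas, shift_conversion=0.75)   # HT shift only
print(scheme1, scheme2, scheme3)
```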

  15. BATTELLE ENERGY ALLIANCE, LLC (BEA) 2014 Annual Report for Idaho National Laboratory (INL)

    Energy Technology Data Exchange (ETDEWEB)

    Juan Alvarez; Todd Allen

    2014-10-01

    This Fiscal Year (FY) 2014 annual report provides the Department of Energy (DOE) with BEA’s self-assessment of performance managing and operating the INL for the period ending September 30, 2014. After considering all of the information related to INL performance during the rating period against the Goals, Objectives and Notable Outcomes in the FY 2014 Performance Evaluation and Measurement Plan (PEMP), BEA believes it earned an overall grade closest to an A. The paragraphs below highlight how INL excelled in delivering innovative and impactful research across the three mission areas; how INL has successfully positioned itself for future growth and sustainment; and how, through strong leadership, INL has set and implemented a strategic direction to ensure we meet and exceed the expectations of DOE and other customers. Attachments 1 through 5 provide additional detail on FY 2014 mission accomplishments, outline corporate contributions for success, highlight national and international awards and recognitions at the organization and individual levels, and describe the performance issues and challenges faced in FY 2014. • Attachment 1, “Self-Assessed PEMP Ratings” • Attachment 2, “INL Mission Accomplishments” • Attachment 3, “Battelle Energy Alliance, LLC Contributions to INL Success” • Attachment 4, “FY 2014 Awards, Recognition, Professional Roles and Certifications” • Attachment 5, “Performance Issues and Challenges.”

  16. Simulation of helium release in the Battelle Model Containment facility using OpenFOAM

    International Nuclear Information System (INIS)

    Highlights: • The HYJET Jx7 hydrogen release experiment at BMC facility is studied using OpenFOAM. • The SST model and 2nd order numerics for momentum and species concentration are used. • The behaviour is captured well but helium concentration is generally over-predicted. • OpenFOAM needs smaller time steps, higher resolution, more CPU time compared to CFX. • The study shows the potential of open source CFD codes in some nuclear applications. - Abstract: The open source CFD code OpenFOAM has been validated against an experiment of jet release phenomena in the Battelle Model Containment facility (BMC), and benchmarked with the Ansys CFX5.7 results. In the selected test, HYJET Jx7, helium was released into the containment at a speed of 42 m/s over a time of 200 s. The SST turbulence model was applied to model helium release and dispersion with both codes. The overall behaviour is captured adequately. However, there are still some noticeable differences between the CFX and OpenFOAM solutions. The study confirms the potential of using open source codes like OpenFOAM in some nuclear applications. Nevertheless further investigations and improvements are needed.

  17. Simulation of helium release in the Battelle Model Containment facility using OpenFOAM

    Energy Technology Data Exchange (ETDEWEB)

    Wilkening, Heinz; Ammirabile, Luca, E-mail: luca.ammirabile@ec.europa.eu

    2013-12-15

    Highlights: • The HYJET Jx7 hydrogen release experiment at BMC facility is studied using OpenFOAM. • The SST model and 2nd order numerics for momentum and species concentration are used. • The behaviour is captured well but helium concentration is generally over-predicted. • OpenFOAM needs smaller time steps, higher resolution, more CPU time compared to CFX. • The study shows the potential of open source CFD codes in some nuclear applications. - Abstract: The open source CFD code OpenFOAM has been validated against an experiment of jet release phenomena in the Battelle Model Containment facility (BMC), and benchmarked with the Ansys CFX5.7 results. In the selected test, HYJET Jx7, helium was released into the containment at a speed of 42 m/s over a time of 200 s. The SST turbulence model was applied to model helium release and dispersion with both codes. The overall behaviour is captured adequately. However, there are still some noticeable differences between the CFX and OpenFOAM solutions. The study confirms the potential of using open source codes like OpenFOAM in some nuclear applications. Nevertheless further investigations and improvements are needed.

  18. Finding of no significant impact, decontamination and decommissioning of Battelle Columbus Laboratories in Columbus and West Jefferson, Ohio

    International Nuclear Information System (INIS)

    This Environmental Assessment has been developed by the Department of Energy in accordance with the requirements of the National Environmental Policy Act of 1969 for the proposed decommissioning of contaminated areas at the Battelle Memorial Institute, Columbus, Ohio. The discussions in Section 1.0 provide general background information on the proposed action. Section 2.0 describes the existing radiological and non-radiological condition of the Battelle Columbus Laboratories. Section 3.0 identifies the alternatives considered for the proposed action and describes in detail the proposed decommissioning project. Section 4.0 evaluates the potential risks the project poses to human health and the environment. Section 5.0 presents the Department of Energy's proposed action. As a result of nuclear research and development activities conducted over a period of approximately 43 years, performed for the Department of Energy, its predecessor agencies, and under commercial contracts, 15 buildings became contaminated with varying amounts of radioactive material. The Department of Energy no longer has a need to utilize the facilities and is contractually obligated to remove that contamination such that the facilities can be used by their owners without radiological restrictions. This Environmental Assessment for the Battelle Columbus Laboratories Decommissioning Project is consistent with the direction from the Secretary of Energy that public awareness and participation be considered in sensitive projects and is an appropriate document to determine action necessary to satisfy the requirements of the National Environmental Policy Act. 30 refs., 6 figs., 9 tabs

  19. Non-intrusive measurement of particle charge: Electrostatic dry coal cleaning. Technical progress report No. 8, April 1, 1993--June 30, 1993

    Energy Technology Data Exchange (ETDEWEB)

    1993-09-01

    As we reported in Technical Progress Report No. 7, there are surges of electric current in the charging loop during triboelectrification of all particles. A high-speed data acquisition and analysis system was developed to monitor and record the current pattern. There is no known report on such charge-discharge surges in the literature, and the mechanism is yet to be understood. The on-line computerized electric current measurement also led to an observation of charging effects as a function of particle feed rate. It is shown that feed rate greatly alters particle charge. This effect is mostly overlooked by researchers, and it could have an important role in process design where the feed rate would be maximized. The initial results for coal and mineral particles demonstrated that the average charge was lower when the feed rate was increased. Further investigation is scheduled to identify potential controlling factors; e.g., the solid volume fraction and particle number density could be important process factors. The study of charging velocity and particle size was continued. It was found that particle charge was linearly dependent on the charging velocity for all samples investigated. However, the slope of this linear dependence varied for particles having different diameters. In addition, the charge-velocity relationships were dependent on feeding rates. Hence, the data discussed below include these interrelationships.
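    A linear charge-velocity relationship of the kind reported here, q = a·v + b, is typically recovered from noisy measurements by least squares. The data below are synthetic with hypothetical coefficient values; the report only states that the dependence is linear with a slope that varies with particle diameter and feed rate.

```python
# Least-squares recovery of a linear charge-velocity relationship from
# synthetic noisy measurements (illustrative coefficients, not report data).
import numpy as np

rng = np.random.default_rng(0)
true_slope, true_intercept = 2.5e-12, 1.0e-13      # C/(m/s), C; hypothetical
v = np.linspace(5.0, 25.0, 20)                     # charging velocities, m/s
q = true_slope * v + true_intercept + rng.normal(0, 1e-14, v.size)

slope, intercept = np.polyfit(v, q, 1)             # degree-1 fit
print(slope, intercept)
```

    Repeating such fits at several feed rates and particle sizes would yield the family of slopes whose variation the report describes.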

  20. The plant ecology of Amchita Island, Alaska: Report on a research contract between the Department of Botany, the University of Tennessee and Battelle Memorial Institute, Columbus Laboratories for the period 1 August 1967 through 30 June 1968

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — The Department of Botany of The University of Tennessee is conducting a study of the plant ecology of Amchitka Island, Alaska, as a subcontractor for Battelle...

  1. Flotation process diagnostics and modelling by coal grain analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ofori, P.; O'Brien, G.; Firth, B.; Jenkins, B. [CSIRO Energy Technology, Brisbane, Qld. (Australia)

    2006-05-15

    In coal flotation, particles of different components of the coal such as maceral groups and mineral matter and their associations have different hydrophobicities and therefore different flotation responses. By using a new coal grain analysis method for characterising individual grains, more detailed flotation performance analysis and modelling approaches have been developed. The method involves the use of microscopic imaging techniques to obtain estimates of size, compositional and density information on individual grains of fine coal. The density and composition partitioning of coal processed through different flotation systems provides an avenue to pinpoint the actual cause of poor process performance so that corrective action may be initiated. The information on grain size, density and composition is being used as input data to develop more detailed flotation process models to provide better predictions of process performance for both mechanical and column flotation devices. A number of approaches may be taken to flotation modelling such as the probability approach and the kinetic model approach or a combination of the two. In the work reported here, a simple probability approach has been taken, which will be further refined in due course. The use of grain data to map the responses of different types of coal grains through various fine coal cleaning processes provided a more advanced diagnostic capability for fine coal cleaning circuits. This enabled flotation performance curves analogous to partition curves for density separators to be produced for flotation devices.
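    A minimal version of the probability approach mentioned above treats each grain class as having a per-cell recovery probability, so that overall recovery through N cells is R = 1 − (1 − p)^N. The grain classes, ash contents, and probabilities below are hypothetical, chosen only to show how class-wise recoveries combine into product yield and ash.

```python
# Minimal probability-style flotation model over a bank of N cells.
def overall_recovery(p, n_cells):
    # Probability a grain reports to concentrate in at least one of N cells.
    return 1.0 - (1.0 - p) ** n_cells

# (mass fraction of feed, ash %, per-cell recovery probability) - hypothetical
grain_classes = [
    (0.70, 5.0, 0.60),    # liberated, hydrophobic coal grains
    (0.20, 30.0, 0.30),   # coal/mineral composite grains
    (0.10, 85.0, 0.05),   # liberated mineral matter (entrainment only)
]

n = 4
mass = sum(f * overall_recovery(p, n) for f, _, p in grain_classes)
ash = sum(f * a * overall_recovery(p, n) for f, a, p in grain_classes) / mass
print(f"yield to concentrate: {mass:.3f}, product ash: {ash:.1f}%")
```

    Feeding such a model with measured grain-by-grain composition data, as the coal grain analysis method provides, is what allows flotation performance curves analogous to density-separator partition curves to be constructed.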

  2. Evaluation, engineering and development of advanced cyclone processes

    Energy Technology Data Exchange (ETDEWEB)

    Durney, T.E.; Cook, A. [Coal Technology Corporation, Bristol, VA (United States); Ferris, D.D. [ICF Kaiser Engineers, Inc., Pittsburgh, PA (United States)] [and others]

    1995-11-01

    This research and development project is one of three seeking to develop advanced, cost-effective coal cleaning processes to help industry comply with 1990 Clean Air Act regulations. The specific goal for this project is to develop a cycloning technology that will beneficiate coal to a level approaching 85% pyritic sulfur rejection while retaining 85% of the parent coal's heating value. A clean coal ash content of less than 6% and a moisture content, for both clean coal and reject, of less than 30% are targeted. The process under development is a physical, gravimetric-based cleaning system that removes ash-bearing mineral matter and pyritic sulfur. Since a large portion of the Nation's coal reserves contain significant amounts of pyrite, physical beneficiation is viewed as a potential near-term, cost-effective means of producing an environmentally acceptable fuel.

  3. IWTU Process Sample Analysis Report

    Energy Technology Data Exchange (ETDEWEB)

    Nick Soelberg

    2013-04-01

    CH2M-WG Idaho (CWI) requested that Battelle Energy Alliance (BEA) analyze various samples collected during June--August 2012 at the Integrated Waste Treatment Unit (IWTU). Samples of IWTU process materials were collected from various locations in the process; none of these samples were radioactive. The samples were analyzed to provide a better understanding of the compositions of various materials in the process at the time of the process shutdown that occurred on June 16, 2012, while the IWTU was undergoing nonradioactive startup.

  4. Survey and evaluation of current and potential coal beneficiation processes

    Energy Technology Data Exchange (ETDEWEB)

    Singh, S. P.N.; Peterson, G. R.

    1979-03-01

    Coal beneficiation is a generic term used for processes that prepare run-of-mine coal for specific end uses. It is also referred to as coal preparation or coal cleaning and is a means of reducing the sulfur and the ash contents of coal. Information is presented regarding current and potential coal beneficiation processes. Several of the processes reviewed, though not yet commercial, are at various stages of experimental development. Process descriptions are provided for these processes commensurate with the extent of information and time available to perform the evaluation of these processes. Conceptual process designs, preliminary cost estimates, and economic evaluations are provided for the more advanced (from a process development hierarchy viewpoint) processes based on production levels of 1500 and 15,000 tons/day (maf) of cleaned product coal. Economic evaluations of the coal preparation plants are conducted for several project financing schemes and at 12 and 15% annual after-tax rates of return on equity capital. A 9% annual interest rate is used on the debt fraction of the plant capital. Cleaned product coal prices are determined using the discounted cash flow procedure. The study is intended to provide information on publicly known coal beneficiation processes and to indicate the relative costs of various coal beneficiation processes. Because of severe time constraints, several potential coal beneficiation processes are not evaluated in great detail. It is recommended that an additional study be conducted to complement this study and to more fully appreciate the potentially significant role of coal beneficiation in the clean burning of coal.
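    The discounted-cash-flow price determination mentioned above amounts to solving for the product price at which the project's net present value is zero at the target rate of return. The sketch below uses a simple level-annuity form with entirely hypothetical cost figures, not numbers from the report.

```python
# Required product price via discounted cash flow: the price at which
# project NPV is zero at the target rate of return. Figures hypothetical.
def npv(price, capex, opex_per_ton, tons_per_year, rate, years):
    cash = (price - opex_per_ton) * tons_per_year        # level annual cash flow
    return -capex + sum(cash / (1 + rate) ** t for t in range(1, years + 1))

def required_price(capex, opex_per_ton, tons_per_year, rate, years):
    # Annuity factor A = sum 1/(1+r)^t; solve (p - opex) * tons * A = capex.
    a = sum(1.0 / (1 + rate) ** t for t in range(1, years + 1))
    return opex_per_ton + capex / (tons_per_year * a)

p = required_price(capex=50e6, opex_per_ton=20.0,
                   tons_per_year=450000.0, rate=0.12, years=20)
print(f"required price: ${p:.2f}/ton")
```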

  5. Coal cleaning: A viable strategy for reduced carbon emissions and improved environment in China?

    OpenAIRE

    Glomsrød, Solveig; Taoyuan, Wei

    2003-01-01

    Abstract: China is a dominant energy consumer in a global context and current energy forecasts emphasise that China’s future energy consumption also will rely heavily on coal. The coal use is the major source of the greenhouse gas CO2 and particles causing serious health damage. This paper looks into the question if coal washing might work as low cost strategy for both CO2 and particle emission reductions. Coal washing removes dirt and rock from raw coal, resulting in a coal pr...

  6. Bench-scale testing of the micronized magnetite process. Third quarterly technical progress report, January 1995--March 1995

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-04-29

    The major focus of the project, which is scheduled to occur through December 1995, will be to install and test a 500-lb/hr fine-coal cleaning circuit at DOE's Process Research Facility (PRF), located at the Pittsburgh Energy Technology Center (PETC). The circuit will utilize an extremely fine, micron-sized magnetite media and small diameter cyclones to make efficient density separations on minus 28-mesh coal. The overall objectives of the project are to: determine the effects of operating time on the characteristics of the recirculating medium in a continuous integrated processing circuit, and subsequently, the sensitivity of cyclone separation performance to the quality of the recirculating medium; and determine the technical and economic feasibility of various unit operations and systems in optimizing the separation and recovery of the micronized magnetite from the coal products. This report contains a short discussion of the project description, objectives, budget, schedule, and teaming arrangement. The final section contains an outline of the specific project goals for the next quarterly reporting period.

  7. Mixing of process heels, process solutions, and recycle streams: Results of the small-scale radioactive tests

    International Nuclear Information System (INIS)

    Various recycle streams will be combined with the low-activity waste (LAW) or the high-level waste (HLW) feed solutions during the processing of the Hanford tank wastes by BNFL, Inc. In addition, the LAW and HLW feed solutions will also be mixed with heels present in the processing equipment. This report describes the results of a test conducted by Battelle to assess the effects of mixing specific process streams. Observations were made regarding adverse reactions (mainly precipitation) and effects on the Tc oxidation state (as indicated by Kd measurements with SuperLig® 639). The work was conducted according to test plan BNFL-TP-29953-023, Rev. 0, Small Scale Mixing of Process Heels, Solutions, and Recycle Streams. The test went according to plan, with only minor deviations, which are discussed in the experimental section

  8. Use of the GranuFlow Process in Coal Preparation Plants to Improve Energy Recovery and Reduce Coal Processing Wastes

    Energy Technology Data Exchange (ETDEWEB)

    Glenn A. Shirey; David J. Akers

    2005-12-31

    With the increasing use of screen-bowl centrifuges in today's fine coal cleaning circuits, a significant amount of low-ash, high-Btu coal can be lost during the dewatering step due to the difficulty in capturing coal of this size consist (< 100 mesh or 0.15 mm). The GranuFlow™ technology, developed and patented by an in-house research group at DOE-NETL, involves the addition of an emulsified mixture of high-molecular-weight hydrocarbons to a slurry of fine-sized coal before cleaning and/or mechanical dewatering. The binder selectively agglomerates the coal, but not the clays or other mineral matter. In practice, the binder is applied so as to contact the finest possible size fraction first (for example, froth flotation product), as agglomeration of this fraction produces the best result for a given concentration of binder. Increasing the size consist of the fine-sized coal stream reduces the loss of coal solids to the waste effluent streams from the screen-bowl centrifuge circuit. In addition, the agglomerated coal dewaters better and is less dusty. The binder can also serve as a flotation conditioner and may provide freeze protection. The overall objective of the project is to generate all necessary information and data required to commercialize the GranuFlow™ technology. The technology was evaluated under full-scale operating conditions at three commercial coal preparation plants to determine operating performance and economics. The handling, storage, and combustion properties of the coal produced by this process were compared to untreated coal during a power plant combustion test.

  9. GEOTECHNICAL/GEOCHEMICAL CHARACTERIZATION OF ADVANCED COAL PROCESS WASTE STREAMS

    Energy Technology Data Exchange (ETDEWEB)

    Edwin S. Olson; Charles J. Moretti

    1999-11-01

    Thirteen solid wastes, six coals and one unreacted sorbent produced from seven advanced coal utilization processes were characterized for task three of this project. The advanced processes from which samples were obtained included a gas-reburning sorbent injection process, a pressurized fluidized-bed coal combustion process, a coal-reburning process, a SOx/NOx/ROx/BOx process, an advanced flue gas desulfurization process, and an advanced coal cleaning process. The waste samples ranged from coarse materials, such as bottom ashes and spent bed materials, to fine materials such as fly ashes and cyclone ashes. Based on the results of the waste characterizations, an analysis of appropriate waste management practices for the advanced process wastes was done. The analysis indicated that using conventional waste management technology should be possible for disposal of all the advanced process wastes studied for task three. However, some wastes did possess properties that could present special problems for conventional waste management systems. Several task three wastes were self-hardening materials and one was self-heating. Self-hardening is caused by cementitious and pozzolanic reactions that occur when water is added to the waste. All of the self-hardening wastes set up slowly (in a matter of hours or days rather than minutes). Thus these wastes can still be handled with conventional management systems if care is taken not to allow them to set up in storage bins or transport vehicles. Waste self-heating is caused by the exothermic hydration of lime when the waste is mixed with conditioning water. If enough lime is present, the temperature of the waste will rise until steam is produced. It is recommended that self-heating wastes be conditioned in a controlled manner so that the heat will be safely dissipated before the material is transported to an ultimate disposal site. Waste utilization is important because an advanced process waste will not require
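
    The self-heating mechanism described above, exothermic hydration of free lime (CaO + H2O → Ca(OH)2), can be illustrated with a rough adiabatic energy balance. The heat of hydration (~1160 kJ per kg of CaO) and the heat capacity below are textbook-order approximations, not values from the report, and real conditioning is far from adiabatic:

    ```python
    def adiabatic_temperature_rise(cao_mass_frac, cp_kj_per_kg_k=1.2):
        """Upper-bound temperature rise (K) of a conditioned waste whose
        free-lime mass fraction hydrates completely, with no heat loss."""
        HYDRATION_HEAT = 1160.0  # kJ released per kg of CaO hydrated (approximate)
        return cao_mass_frac * HYDRATION_HEAT / cp_kj_per_kg_k
    ```

    Even 10% free lime gives a bound of nearly 100 K, which is why a waste conditioned too quickly can flash its conditioning water to steam, and why controlled conditioning with time for heat dissipation is recommended.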

  10. West Valley demonstration project: alternative processes for solidifying the high-level wastes

    Energy Technology Data Exchange (ETDEWEB)

    Holton, L.K.; Larson, D.E.; Partain, W.L.; Treat, R.L.

    1981-10-01

    In 1980, the US Department of Energy (DOE) established the West Valley Solidification Project as the result of legislation passed by the US Congress. The purpose of this project was to carry out a high level nuclear waste management demonstration project at the Western New York Nuclear Service Center in West Valley, New York. The DOE authorized the Pacific Northwest Laboratory (PNL), which is operated by Battelle Memorial Institute, to assess alternative processes for treatment and solidification of the WNYNSC high-level wastes. The Process Alternatives Study is the subject of this report. Two pretreatment approaches and several waste form processes were selected for evaluation in this study. The two waste treatment approaches were the salt/sludge separation process and the combined waste process. Both terminal and interim waste form processes were studied.

  11. Upgrading low-rank coals using the liquids from coal (LFC) process

    Energy Technology Data Exchange (ETDEWEB)

    Nickell, R.E.; Hoften, S.A. van

    1993-12-31

    Three unmistakable trends characterize national and international coal markets today that help to explain coal's continuing and, in some cases, increasing share of the world's energy mix: the downward trend in coal prices is primarily influenced by an excess of increasing supply relative to increasing demand. Associated with this trend are the availability of capital to expand coal supplies when prices become firm and the role of coal exports in international trade, especially for developing nations; the global trend toward reducing the transportation cost component relative to the market price preserves or enhances the producer's profit margins in the face of lower prices. The strong influence of transportation costs is due to the geographic relationships between coal producers and coal users. The trend toward upgrading low grade coals, including subbituminous and lignite coals, that have favorable environmental characteristics, such as low sulfur, compensates in some measure for decreasing coal prices and helps to reduce transportation costs. The upgrading of low grade coal includes a variety of precombustion clean coal technologies, such as deep coal cleaning. Also included in this grouping are the coal drying and mild pyrolysis (or mild gasification) technologies that remove most of the moisture and a substantial portion of the volatile matter, including organic sulfur, while producing two or more saleable coproducts with considerable added value. SGI International's Liquids From Coal (LFC) process falls into this category. In the following sections, the LFC process is described and the coproducts of the mild pyrolysis are characterized. Since the process can be applied widely to low rank coals all around the world, the characteristics of coproducts from three different regions around the Pacific Rim, namely the Powder River Basin of Wyoming, the Beluga Field in Alaska near the Cook Inlet, and the Bukit Asam region in south Sumatra, Indonesia, are compared.

  12. Design Fuels Corporation (DFC)-Apache, Inc. coal reclamation system for the plant of the future for processing clean coal

    International Nuclear Information System (INIS)

    The mechanical washing, processing and drying portion of the DFC process offers an efficient method for removing pyritic sulfur-bearing compounds, representing a 25% sulfur reduction from original run-of-mine coal quality. This reduction can be augmented with the use of calcium- and sodium-based compounds to reduce the sulfur in many coals to produce compliance quality coal. The use of mechanical/physical methods for the removal of the pyritic material found in coal is used by the DFC process as a first step to the final application of a complete coal refuse clean-up technology based on site specific conditions of the parent coal. The paper discusses the use of the DFC process to remediate slurry ponds and tailings piles and to improve coal cleaning by gravity separation methods, flotation, hydrocyclones and spiral separators, dense media separation, water only cyclones, and oil/solvent agglomeration. A typical DFC project is the Rosa Coal Reclamation Project, which involves the development of a bituminous coal waste impoundment reclamation and washery system. The plant would be located adjacent to a coal fines pond or tailings pond and refuse pile or gob pile at a former coal strip mine in Oneonta, Alabama. Design Fuels would provide a development program by which coal waste at the Rosa Mine could be reclaimed, cleaned and sold profitably. This feedstock could be furnished from recovered coal for direct use in blast furnaces, or as feedstock for coke ovens, at 250,000 tons per year at an attractive price on a 10-year contract basis. The site has an old coal washing facility on the property that will be dismantled. Some equipment salvage has been considered, and removal of the existing plant would be the responsibility of Design Fuels. The paper briefly discusses the market potential of the process

  13. Clean Processing and Utilization of Coal Energy

    Institute of Scientific and Technical Information of China (English)

    陈如清; 王海峰

    2006-01-01

    The dominant status of coal in the energy production and consumption structure of China will not change before the middle of this century. Realizing highly efficient, low-pollution, and low-cost utilization of coal is therefore a great and pressing task. These problems can largely be resolved by establishing large-scale pithead power stations that use a two-stage, highly efficient dry coal-cleaning system before combustion, which is a highly efficient, clean and economical strategy given the current energy and environmental status of China. All of this is discussed in detail in this paper.

  14. Evaluation of the effect of coal cleaning of fugitive elements. Part II. Analytical methods. Final report, Phase II

    Energy Technology Data Exchange (ETDEWEB)

    Bosshart, R.E.; Price, A.A.; Ford, C.T.

    1980-03-01

    This report contains the analytical and test methods which were used routinely at Bituminous Coal Research, Inc. during the project. The procedures contained herein should aid coal industry laboratories and others, including commercial laboratories, who might be required to determine trace elements in coal. Some of the procedures have been presented in previous BCR reports; however, this report includes additional procedures which are described in greater detail. Also presented are many of the more basic coal methods which have been in use at BCR for many years, or which have been adapted or refined from other standard reference sources for coal and water. The basis for choosing specific analytical procedures for trace elements in coal is somewhat complex. At BCR, atomic absorption was selected as the basic method in the development of these procedures. The choice was based on sensitivity, selectivity, accuracy, precision, practicability, and economy. Whenever possible, the methods developed had to be both adequate and amenable for use by coal industry laboratories by virtue of relative simplicity and cost. This is not to imply that the methods described are simple or inexpensive; however, atomic absorption techniques do meet these criteria in relation to more complex and costly methods such as neutron activation, mass spectrometry, and x-ray fluorescence, some of which require highly specialized personnel as well as access to sophisticated nuclear and computational facilities. Many of the analytical procedures for trace elements in coal have been developed or adapted specifically for the BCR studies. Their presentation is the principal purpose of this report.

  15. Engineering development of advanced physical fine coal cleaning for premium fuel applications. Quarterly technical progress report No. 4

    Energy Technology Data Exchange (ETDEWEB)

    Smit, F.J.; Hogsett, R.F.; Jha, M.C.

    1993-11-04

    This project is a major step in the Department of Energy's program to show that ultra-clean coal-water slurry fuel (CWF) can be produced from selected coals and that this premium fuel will be a cost-effective replacement for oil and natural gas now fueling some of the industrial and utility boilers in the United States. The replacement of oil and gas with CWF can only be realized if retrofit costs are kept to a minimum and retrofit boiler emissions meet national goals for clean air. These concerns establish the specifications for maximum ash and sulfur levels and combustion properties of the CWF. This cost-share contract is a 48-month program which started on September 30, 1992. This report discusses the technical progress made during the 4th quarter of the project from July 1 to September 30, 1993.

  16. Bench-Scale Testing of the Micronized Magnetite Process

    Energy Technology Data Exchange (ETDEWEB)

    Edward R. Torak; Peter J. Suardini

    1997-11-01

    A recent emphasis of the Department of Energy's (DOE's) Coal Preparation Program has been the development of high-efficiency technologies that offer near-term, low-cost improvements in the ability of coal preparation plants to address problems associated with coal fines. In 1992, three cost-shared contracts were awarded to industry under the first High-Efficiency Preparation (HEP I) solicitation. All three projects involved bench-scale testing of various emerging technologies at the Federal Energy Technology Center's (FETC's) Process Research Facility (PRF). The first HEP I project, completed in mid-1993, was conducted by Process Technology, Inc., with the objective of developing a computerized, on-line system for monitoring and controlling the operation of a column flotation circuit. The second HEP I project, completed in mid-1994, was conducted by a team led by Virginia Polytechnic Institute to test the Mozley Multi-Gravity Separator in combination with the Microcel Flotation Column, for improved removal of mineral matter and pyritic sulfur from fine coal. The last HEP I project, of which the findings are contained in this report, was conducted by Custom Coals Corporation to evaluate and advance a micronized-magnetite-based, fine-coal cycloning technology. The micronized-magnetite coal cleaning technology, also known as the Micro-Mag process, is based on widely used conventional dense-medium cyclone applications, in that it utilizes a finely ground magnetite/water suspension as a separating medium for cleaning fine coal, by density, in a cyclone. However, the micronized-magnetite cleaning technology differs from conventional systems in several ways: it utilizes significantly finer magnetite (about 5 to 10 micron mean particle size), as compared to normal mean particle sizes of 20 microns; and it can effectively beneficiate coal particles down to 500 mesh in size, as compared to the most advanced, existing conventional systems that are limited to a
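
    The density separation in a dense-medium cyclone is set by the relative density of the magnetite/water suspension, which under ideal mixing follows directly from the magnetite volume fraction. A minimal sketch of the medium make-up calculation; the magnetite specific gravity of 5.0 is a typical handbook value, not one quoted in the report:

    ```python
    def magnetite_loading(target_sg, magnetite_sg=5.0, water_sg=1.0):
        """Volume fraction and mass loading of magnetite needed to give a
        suspension of the target specific gravity (ideal mixing assumed)."""
        vol_frac = (target_sg - water_sg) / (magnetite_sg - water_sg)
        grams_per_litre = vol_frac * magnetite_sg * 1000.0
        return vol_frac, grams_per_litre
    ```

    For a separation density of 1.45 SG, typical of coal cleaning, this gives a magnetite volume fraction of about 0.11 (roughly 560 g of magnetite per litre of medium); finer magnetite does not change this balance but keeps the suspension stable and reduces medium losses on fine coal surfaces.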

  17. Battelle-Northwest monthly activities report, March 1965

    Energy Technology Data Exchange (ETDEWEB)

    1965-04-15

    This report covers progress in the following areas: production reactor support; plutonium recycle program; PRTR HPD core; corrosion and water quality; PRTR pressure tubes; reactor components development; plutonium ceramics research; ceramics (uranium) fuel research; swelling studies; irradiation damage to reactor materials; ATR gas loop studies; graphite studies; metallic fuel development; plutonium and U-233 fueling of a fast compact reactor; FFTF studies; radiation effects on metals; customer work (support of HTLTR and EBWR); physics and instruments; chemistry; biology; radiation protection; and technical and other services.

  18. 76 FR 65696 - Battelle Energy Alliance, et al.;

    Science.gov (United States)

    2011-10-24

    ...: Electron Microscope. Manufacturer: FEI Company, the Netherlands. Intended Use: See notice at 76 FR 56156.... Instrument: Electron Microscope. Manufacturer: FEI Company, Czech Republic. Intended Use: See notice at 76 FR... notice at 76 FR 56156, September 12, 2011. Comments: None received. Decision: Approved. No instrument...

  19. The FY 1998 Battelle performance evaluation and incentive fee agreement

    Energy Technology Data Exchange (ETDEWEB)

    Davis, T.L.

    1998-01-07

    Fiscal Year 1998 represents the second full year utilizing a results-oriented, performance-based contract. This document describes the critical outcomes, objectives, performance indicators, expected levels of performance, and the basis for the evaluation of the Contractor's performance for the period October 1, 1997 through September 30, 1998, as required by Articles entitled Use of Objective Standards of Performance, Self Assessment and Performance Evaluation and Critical Outcomes Review of the Contract DE-AC06-76RL01830. In partnership with the Contractor and other key customers, the Department of Energy (DOE) Richland Operations Office has defined six critical outcomes that serve as the core for the Contractor's performance evaluation. The Contractor also utilizes these outcomes as a basis for overall management of the Laboratory. The Critical Outcome system focuses all of the customer desires into specific objectives and performance indicators, with supporting measures to track and foster continued improvement in meeting the needs (outcomes) of the Laboratory's customers. Section 1 provides information on how the overall performance rating for the Contractor will be determined. Section 2 provides the detailed information concerning critical outcomes, objectives, performance indicators and expectations of performance. Section 3 describes the commitments for documenting and reporting the Laboratory's self-evaluation.

  20. Fiscal year 1998 Battelle performance evaluation agreement revision 1

    Energy Technology Data Exchange (ETDEWEB)

    DAVIS, T.L.

    1998-10-22

    Fiscal Year 1998 represents the second full year utilizing a results-oriented, performance-based contract. This document describes the critical outcomes, objectives, performance indicators, expected levels of performance, and the basis for the evaluation of the Contractor's performance for the period October 1, 1997 through September 30, 1998, as required by Articles entitled Use of Objective Standards of Performance, Self Assessment and Performance Evaluation and Critical Outcomes Review of the Contract DE-AC08-76RLO1830. In partnership with the Contractor and other key customers, the Department of Energy (DOE) Richland Operations Office has defined six critical outcomes that serve as the core for the Contractor's performance evaluation. The Contractor also utilizes these outcomes as a basis for overall management of the Laboratory. As stated above, six critical outcomes have been established for FY 1998. These outcomes are based on the following needs identified by DOE-HQ, RL and other customers of the Laboratory. Our Energy Research customer desires relevant, quality and cost effective science. Our Environmental Management customer wants technology developed, demonstrated, and deployed to solve environmental cleanup issues. To ensure the diversification and viability of the Laboratory as a National asset, RL and HQ alike want to increase the science and technical contributions of PNNL related to its core capabilities. RL wants improved leadership/management, cost-effective operations, and maintenance of a work environment which fosters innovative thinking and high morale.
RL and HQ alike desire compliance with environment, safety and health (ES and H) standards and disciplined conduct of operations for protection of the worker, the environment, and the public. As with all of Hanford, DOE expects contribution of the Laboratory to the economic development of the Tri-Cities community, and the region, to build a new local economy that is less reliant on the Hanford mission, as well as enhancing the status of the Laboratory as a valued corporate citizen of the Northwest Region. The Critical Outcome system focuses all of these customer desires into specific objectives and performance indicators, with supporting measures to track and foster continued improvement in meeting the needs (outcomes) of the Laboratory's customers.

  1. Battelle-Northwest monthly activities report, February 1965

    Energy Technology Data Exchange (ETDEWEB)

    1964-03-15

    Activities for each of the following departments are discussed in this report: Reactor and Materials Technology Dept.; Physics and Instruments Dept.; Chemistry Dept.; Biology Dept.; Applied Mathematics Dept.; Radiation Protection Dept.; and the Test Reactor and Engineering Services Dept. Activities are in support of Hanford reactors (production reactors, N-reactor, PRTR reactor, etc.) and reprocessing and radioactive waste management efforts at Hanford.

  2. Fiscal year 1999 Battelle performance evaluation and fee agreement

    Energy Technology Data Exchange (ETDEWEB)

    DAVIS, T.L.

    1998-10-22

    Fiscal Year 1999 represents the third full year utilizing a results-oriented, performance-based evaluation for the Contractor's operations and management of the DOE Pacific Northwest National Laboratory (hereafter referred to as the Laboratory). However, this is the first year that the Contractor's fee is totally performance-based utilizing the same Critical Outcomes. This document describes the critical outcomes, objectives, performance indicators, expected levels of performance, and the basis for the evaluation of the Contractor's performance for the period October 1, 1998 through September 30, 1999, as required by Clauses entitled "Use of Objective Standards of Performance, Self Assessment and Performance Evaluation" and "Performance Measures Review" of the Contract DE-ACO6-76RL01830. Furthermore, it documents the distribution of the total available performance-based fee and the methodology set forth for determining the amount of fee earned by the Contractor as stipulated within the clauses entitled "Estimated Cost and Annual Fee," "Total Available Fee" and "Allowable Costs and Fee." In partnership with the Contractor and other key customers, the Department of Energy (DOE) Headquarters (HQ) and Richland Operations Office (RL) has defined four critical outcomes that serve as the core for the Contractor's performance-based evaluation and fee determination. The Contractor also utilizes these outcomes as a basis for overall management of the Laboratory.

  3. Catalytic Two-Stage Liquefaction (CTSL{trademark}) process bench studies and PDU scale-up with sub-bituminous coal. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Comolli, A.G.; Johanson, E.S.; Karolkiewicz, W.F.; Lee, L.K.T.; Stalzer, R.H.; Smith, T.O.

    1993-03-01

    Reported are the details and results of Laboratory and Bench-Scale experiments using sub-bituminous coal conducted at Hydrocarbon Research, Inc., under DOE Contract No. DE-AC22-88PC88818 during the period October 1, 1988 to December 31, 1992. The work described is primarily concerned with testing of the baseline Catalytic Two-Stage Liquefaction (CTSL™) process with comparisons with other two stage process configurations, catalyst evaluations and unit operations such as solid separation, pretreatments, on-line hydrotreating, and an examination of new concepts. In the overall program, three coals were evaluated: bituminous Illinois No. 6 Burning Star, and sub-bituminous Wyoming Black Thunder and New Mexico McKinley Mine seams. The results from a total of 16 bench-scale runs are reported and analyzed in detail. The runs (experiments) concern process variables, variable reactor volumes, catalysts (both supported, dispersed and rejuvenated), coal cleaned by agglomeration, hot slurry treatments, reactor sequence, on-line hydrotreating, dispersed catalyst with pretreatment reactors and CO2/coal effects. The tests involving the Wyoming and New Mexico coals are reported herein, and the tests involving the Illinois coal are described in Topical Report No. 2. On a laboratory scale, microautoclave tests evaluating coal, start-up oils, catalysts, thermal treatment, CO2 addition and sulfur compound effects were conducted and reported in Topical Report No. 3. Other microautoclave tests are described in the Bench Run sections to which they refer, such as: rejuvenated catalyst, coker liquids and cleaned coals. The microautoclave tests conducted for modelling the CTSL™ process are described in the CTSL™ Modelling section of Topical Report No. 3 under this contract.

  4. Evaluation of gasification and novel thermal processes for the treatment of municipal solid waste

    Energy Technology Data Exchange (ETDEWEB)

    Niessen, W.R.; Marks, C.H.; Sommerlad, R.E. [Camp Dresser and McKee, Inc., Cambridge, MA (United States)

    1996-08-01

    This report identifies seven developers whose gasification technologies can be used to treat the organic constituents of municipal solid waste: Energy Products of Idaho; TPS Termiska Processer AB; Proler International Corporation; Thermoselect Inc.; Battelle; Pedco Incorporated; and ThermoChem, Incorporated. Their processes recover heat directly, produce a fuel product, or produce a feedstock for chemical processes. The technologies are on the brink of commercial availability. This report evaluates, for each technology, several kinds of issues. Technical considerations were material balance, energy balance, plant thermal efficiency, and effect of feedstock contaminants. Environmental considerations were the regulatory context, and such things as composition, mass rate, and treatability of pollutants. Business issues were related to likelihood of commercialization. Finally, cost and economic issues such as capital and operating costs, and the refuse-derived fuel preparation and energy conversion costs, were considered. The final section of the report reviews and summarizes the information gathered during the study.

  5. Coal liquefaction process streams characterization and evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, J.A.; Linehan, J.C.; Robins, W.H. (Battelle Pacific Northwest Lab., Richland, WA (United States))

    1992-07-01

    Under contract from the DOE, and in association with CONSOL Inc., Battelle, Pacific Northwest Laboratory (PNL) evaluated four principal and several complementary techniques for the analysis of non-distillable direct coal liquefaction materials in support of process development. Field desorption mass spectrometry (FDMS) and nuclear magnetic resonance (NMR) spectroscopic methods were examined for potential usefulness as techniques to elucidate the chemical structure of residual (nondistillable) direct coal liquefaction derived materials. Supercritical fluid extraction (SFE) and supercritical fluid chromatography/mass spectrometry (SFC/MS) were evaluated for effectiveness in compound-class separation and identification of residual materials. Liquid chromatography (including microcolumn) separation techniques, gas chromatography/mass spectrometry (GC/MS), mass spectrometry/mass spectrometry (MS/MS), and GC/Fourier transform infrared (FTIR) spectroscopy methods were applied to supercritical fluid extracts. The full report authored by the PNL researchers is presented here. The following assessment briefly highlights the major findings of the project, and evaluates the potential of the methods for application to coal liquefaction materials. These results will be incorporated by CONSOL into a general overview of the application of novel analytical techniques to coal-derived materials at the conclusion of CONSOL's contract.

  6. Example process hazard analysis of a Department of Energy water chlorination process

    Energy Technology Data Exchange (ETDEWEB)

    1993-09-01

    On February 24, 1992, the Occupational Safety and Health Administration (OSHA) released a revised version of Section 29 Code of Federal Regulations CFR Part 1910 that added Section 1910.119, entitled "Process Safety Management of Highly Hazardous Chemicals" (the PSM Rule). Because US Department of Energy (DOE) Orders 5480.4 and 5483.1A prescribe OSHA 29 CFR 1910 as a standard in DOE, the PSM Rule is mandatory in the DOE complex. A major element in the PSM Rule is the process hazard analysis (PrHA), which is required for all chemical processes covered by the PSM Rule. The PrHA element of the PSM Rule requires the selection and application of appropriate hazard analysis methods to systematically identify hazards and potential accident scenarios associated with processes involving highly hazardous chemicals (HHCs). The analysis in this report is an example PrHA performed to meet the requirements of the PSM Rule. The PrHA method used in this example is the hazard and operability (HAZOP) study, and the process studied is the new Hanford 300-Area Water Treatment Facility chlorination process, which is currently in the design stage. The HAZOP study was conducted on May 18--21, 1993, by a team from the Westinghouse Hanford Company (WHC), Battelle-Columbus, the DOE, and Pacific Northwest Laboratory (PNL). The chlorination process was chosen as the example process because it is common to many DOE sites, and because quantities of chlorine at those sites generally exceed the OSHA threshold quantities (TQs).
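
    The core mechanic of a HAZOP study is systematic: each process parameter is crossed with a set of standard guide words to generate candidate deviations, which the study team then screens for credible causes and consequences. A minimal sketch of that cross-product; the parameter list below is illustrative, not taken from the chlorination study:

    ```python
    # Standard HAZOP guide words (per common practice, e.g. IEC 61882)
    GUIDE_WORDS = ["No", "More", "Less", "Reverse", "Part of", "As well as", "Other than"]

    def hazop_deviations(parameters, guide_words=None):
        """Enumerate candidate deviations by crossing guide words with
        process parameters, e.g. 'No' x 'Flow' -> 'No Flow'."""
        gws = guide_words if guide_words is not None else GUIDE_WORDS
        return [f"{gw} {param}" for param in parameters for gw in gws]

    # Hypothetical parameters for a chlorination node
    deviations = hazop_deviations(["Flow", "Pressure", "Chlorine concentration"])
    ```

    Many generated pairs are physically meaningless and are discarded during team review; the value of the method is that the exhaustive enumeration makes it hard to overlook a credible deviation such as "No Flow" or "More Chlorine concentration".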

  7. Process and equipment development for hot isostatic pressing treatability study

    Energy Technology Data Exchange (ETDEWEB)

    Bateman, Ken; Wahlquist, Dennis; Malewitz, Tim

    2015-03-01

    Battelle Energy Alliance (BEA), LLC, has developed processes and equipment for a pilot-scale hot isostatic pressing (HIP) treatability study to stabilize and volume reduce radioactive calcine stored at Idaho National Laboratory (INL). In 2009, the U.S. Department of Energy signed a Record of Decision with the state of Idaho selecting HIP technology as the method to treat 5,800 yd^3 (4,400 m^3) of granular zirconia and alumina calcine produced between 1953 and 1992 as a waste byproduct of spent nuclear fuel reprocessing. Since the 1990s, a variety of radioactive and hazardous waste forms have been remotely treated using HIP within INL hot cells. To execute the remote process at INL, waste is loaded into a stainless-steel or aluminum can, which is evacuated, sealed, and placed into a HIP furnace. The HIP simultaneously heats and pressurizes the waste, reducing its volume and increasing its durability. Two 1 gal cans of calcine waste currently stored in a shielded cask were identified as candidate materials for a treatability study involving the HIP process. Equipment and materials for cask-handling and calcine transfer into INL hot cells, as well as remotely operated equipment for waste can opening, particle sizing, material blending, and HIP can loading have been designed and successfully tested. These results demonstrate BEA’s readiness for treatment of INL calcine.

  8. West Valley demonstration project: alternative processes for solidifying the high-level wastes

    International Nuclear Information System (INIS)

    In 1980, the US Department of Energy (DOE) established the West Valley Solidification Project as the result of legislation passed by the US Congress. The purpose of this project was to carry out a high level nuclear waste management demonstration project at the Western New York Nuclear Service Center in West Valley, New York. The DOE authorized the Pacific Northwest Laboratory (PNL), which is operated by Battelle Memorial Institute, to assess alternative processes for treatment and solidification of the WNYNSC high-level wastes. The Process Alternatives Study is the subject of this report. Two pretreatment approaches and several waste form processes were selected for evaluation in this study. The two waste treatment approaches were the salt/sludge separation process and the combined waste process. Both terminal and interim waste form processes were studied. The terminal waste form processes considered were: borosilicate glass, low-alkali glass, marbles-in-lead matrix, and a crystalline waste form.

  9. Evaluation of selected chemical processes for production of low-cost silicon (Phase II). Silicon Material Task, Low-Cost Silicon Solar Array Project. Fifth--sixth quarterly progress report, October 1, 1976--March 31, 1977. [Zinc reduction of silicon tetrachloride in fluidized bed

    Energy Technology Data Exchange (ETDEWEB)

    Blocher, J.M. Jr.; Browning, M.F.; Wilson, W.J.; Carmichael, D.C.

    1977-04-29

    The results of experimental work and economic analyses carried out during the first 12 months of this contract (Phase I) have led to Battelle's concentration on development of the zinc reduction of silicon tetrachloride on seed particles in a fluidized bed. A second year program (Phase II) has been initiated which consists of the design of a 25 MT/year experimental facility and a supporting experimental effort. During this quarter, the effort in the plant design portion of the program has been devoted to the (1) preparation of a detailed process schematic diagram; (2) determination of material flow and energy requirements; (3) conceptual design of major equipment items, including those unique to the facility; (4) contacts with industrial companies on equipment and processes for which experience is available; and (5) initiation of contacts with Battelle pilot plant design specialists, a distillation consultant, and engineering firms. The effort in the experimental support portion of the program has included a continuation of the following studies: (1) operating parameter optimization in the miniplant, (2) reactor design, and (3) condenser system design, including supplemental condensation experiments. In addition, a new zinc feed system has been devised and evaluated, and the construction of a system sufficiently large to obtain meaningful data on the electrolytic recovery of zinc from zinc chloride has been initiated.
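    The feed requirements implied by the zinc-reduction chemistry, SiCl4 + 2 Zn -> Si + 2 ZnCl2, can be sketched with back-of-the-envelope stoichiometry; the per-kilogram figures below are illustrative estimates, not values from the report, and ignore yield losses and the zinc recycled by the electrolytic recovery step.

```python
# Sketch: stoichiometry of the zinc reduction step,
#   SiCl4 + 2 Zn -> Si + 2 ZnCl2,
# to estimate feed requirements per kilogram of silicon product.
# Molar masses in g/mol; figures are illustrative, not from the report.
M_SI, M_ZN, M_SICL4 = 28.09, 65.38, 169.90

def feed_per_kg_si():
    """Return kg of Zn and SiCl4 consumed per kg of Si produced."""
    mol_si = 1000.0 / M_SI              # mol of Si in 1 kg
    return {
        "Zn_kg": 2 * mol_si * M_ZN / 1000.0,
        "SiCl4_kg": mol_si * M_SICL4 / 1000.0,
    }

print(feed_per_kg_si())
# roughly 4.7 kg Zn and 6.0 kg SiCl4 per kg Si, before zinc recycle
```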

  10. Structurally Integrated Coatings for Wear and Corrosion (SICWC): Arc Lamp, InfraRed (IR) Thermal Processing

    Energy Technology Data Exchange (ETDEWEB)

    Mackiewicz-Ludtka, G.; Sebright, J. [Caterpillar Corp.]

    2007-12-15

    The primary goal of this Cooperative Research and Development Agreement (CRADA) between UT-Battelle (Contractor) and Caterpillar Inc. (Participant) was to develop the plasma arc lamp (PAL), infrared (IR) thermal processing technology 1.) to enhance surface coating performance by improving the interfacial bond strength between selected coatings and substrates; and 2.) to extend this technology base for transitioning of the arc lamp processing to the industrial Participant. Completion of the following three key technical tasks (described below) was necessary in order to accomplish this goal. First, thermophysical property data sets were successfully determined for composite coatings applied to 1010 steel substrates, with a more limited data set successfully measured for free-standing coatings. These data are necessary for the computer modeling simulations and parametric studies to: A.) simulate PAL IR processing, facilitating the development of the initial processing parameters; and B.) help develop a better understanding of the basic PAL IR fusing process fundamentals, including predicting the influence of melt pool stirring and heat transfer characteristics introduced during plasma arc lamp infrared (IR) processing. Second, a methodology and a set of procedures were successfully developed and the plasma arc lamp (PAL) power profiles were successfully mapped as a function of PAL power level for the ORNL PAL. The latter data also are necessary input for the computer model to accurately simulate PAL processing during process modeling simulations, and to facilitate a better understanding of the fusing process fundamentals. Third, several computer modeling codes have been evaluated as to their capabilities and accuracy in being able to capture and simulate convective mixing that may occur during PAL thermal processing. The results from these evaluation efforts are summarized in this report. The intention of this project was to extend the technology base and provide for

  11. Process Accounting

    OpenAIRE

    Gilbertson, Keith

    2002-01-01

    Standard utilities can help you collect and interpret your Linux system's process accounting data. Describes the uses of process accounting, standard process accounting commands, and example code that makes use of process accounting utilities.
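    The kind of interpretation the article describes, summarizing process accounting records per command, can be sketched in a few lines. The sample text and field layout below merely imitate `lastcomm`-style output and are illustrative assumptions, not data captured from a real system.

```python
# Sketch: summarize per-command CPU usage from `lastcomm`-style output.
# The sample text and field layout are illustrative assumptions.
from collections import defaultdict

SAMPLE = """\
grep    root   pts/0   0.03 secs
make    alice  pts/1   1.20 secs
grep    alice  pts/1   0.05 secs
"""

def cpu_by_command(text):
    """Total the CPU seconds recorded for each command name."""
    totals = defaultdict(float)
    for line in text.splitlines():
        parts = line.split()
        if len(parts) >= 4 and parts[-1] == "secs":
            command, secs = parts[0], float(parts[-2])
            totals[command] += secs
    return dict(totals)

print(cpu_by_command(SAMPLE))  # totals per command, in CPU seconds
```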

  12. A review of state-of-the-art processing operations in coal preparation

    Institute of Scientific and Technical Information of China (English)

    Noble Aaron; Luttrell Gerald H.

    2015-01-01

    Coal preparation is an integral part of the coal commodity supply chain. This stage of post-mining, pre-utilization beneficiation uses low-cost separation technologies to remove unwanted mineral matter and moisture which hinder the value of the coal product. Coal preparation plants typically employ several parallel circuits of cleaning and dewatering operations, with each circuit designed to optimally treat a specific size range of coal. Recent innovations in coal preparation have increased the efficiency and capacity of individual unit operations while reinforcing the standard parallel cleaning approach. This article, which describes the historical influences and state-of-the-art design for the various coal preparation unit operations, is organized to distinguish between coarse/intermediate coal cleaning and fine/ultrafine coal cleaning. Size reduction, screening, classification, cleaning, dewatering, and waste disposal unit operations are particularly highlighted, with a special focus on the U.S. design philosophy. Notable differences between the U.S. and international operations are described as appropriate.
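    The parallel-circuit design philosophy, one cleaning circuit per particle size range, can be sketched as a simple size-based routing rule. The size cutoffs and the example unit operations in the comments are illustrative assumptions, not values from the article.

```python
# Sketch: route feed particles to parallel cleaning circuits by size.
# Cutoffs (in mm) and example unit operations are illustrative.
CIRCUITS = [                # (circuit name, lower size bound in mm)
    ("coarse", 10.0),       # e.g. dense-medium vessel
    ("intermediate", 1.0),  # e.g. dense-medium cyclone
    ("fine", 0.15),         # e.g. spiral concentrator
    ("ultrafine", 0.0),     # e.g. froth flotation
]

def route(size_mm):
    """Return the circuit that treats a particle of the given size."""
    for name, lower in CIRCUITS:
        if size_mm >= lower:
            return name
    return "ultrafine"

print([route(s) for s in (50, 5, 0.5, 0.05)])
# → ['coarse', 'intermediate', 'fine', 'ultrafine']
```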

  13. Sixth annual coal preparation, utilization, and environmental control contractors conference

    Energy Technology Data Exchange (ETDEWEB)

    1990-01-01

    A conference was held on coal preparation, utilization and environmental control. Topics included: combustion of fuel slurries; combustor performance; desulfurization chemically and by biodegradation; coal cleaning; pollution control of sulfur oxides and nitrogen oxides; particulate control; and flue gas desulfurization. Individual projects are processed separately for the databases. (CBS).

  14. Tenth annual coal preparation, utilization, and environmental control contractors conference: Proceedings. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    1994-08-01

    Volume I contains papers presented at the following sessions: high efficiency preparation; advanced physical coal cleaning; superclean emission systems; air toxics and mercury measurement and control workshop; and mercury measurement and control workshop. Selected papers have been processed for inclusion in the Energy Science and Technology Database.

  15. Semiconductor electrochemistry of coal pyrite. Final technical report, September 1990--September 1995

    Energy Technology Data Exchange (ETDEWEB)

    Osseo-Asare, K.; Wei, D.

    1996-01-01

    This project is concerned with the physicochemical processes occurring at the pyrite/aqueous interface, in the context of coal cleaning, desulfurization, and acid mine drainage. Synthetic pyrite particles are employed as model electrodes to investigate the semiconductor electrochemistry of pyrite.

  16. Environmental control implications of generating electric power from coal. 1977 technology status report. Appendix A, Part 1. Coal preparation and cleaning assessment study

    Energy Technology Data Exchange (ETDEWEB)

    None

    1977-12-01

    This report evaluates the state of the art and effectiveness of physical coal cleaning as a potential strategy for controlling SOx emissions in coal-fired power generation. Coal properties which are significantly altered by physical coal cleaning were determined. The effects of the changes in properties as they relate to pulverized coal firing, fluidized bed combustion, and low-Btu gasification for combined-cycle power generation were studied. Available coal washability data were integrated by computer with U.S. coal reserve data. Approximately 18% of the demonstrated coal reserve was matched with washability data. Integrated data appear in the Appendix. Current coal preparation practices were reviewed. Future trends were determined. Five process flow sheets representing increasing levels of cleaning sophistication were prepared. The clean product from each flow sheet will meet U.S. EPA New Source Performance Standards. Capital and operating costs for each case were estimated. Environmental control technology and environmental impact associated with current coal preparation and cleaning operations were assessed. Physical coal cleaning is widely practiced today. Where applicable it represents the least expensive method of coal sulfur reduction. Developmental physical and chemical coal cleaning processes were studied. The chemical methods have the advantage of being able to remove both pyritic sulfur and organic sulfur present in the coal matrix. Further R and D efforts will be required before commercialization of these processes.
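    The compliance question behind studies like this rests on a standard conversion from coal sulfur content and heating value to SO2 emitted per unit heat input; the original 1971 federal NSPS limit for coal-fired units was 1.2 lb of SO2 per million Btu. A minimal sketch, assuming complete conversion of sulfur to SO2 and using illustrative coal properties (not figures from the report):

```python
# Sketch: estimate SO2 emissions per million Btu from coal sulfur
# content, assuming all sulfur converts to SO2 (a conservative,
# standard simplification). Coal property values are illustrative.
SO2_PER_S = 64.07 / 32.07  # mass ratio of SO2 produced per unit sulfur (~2)

def lb_so2_per_mmbtu(sulfur_pct, hhv_btu_per_lb):
    """lb of SO2 emitted per million Btu of heat input."""
    sulfur_frac = sulfur_pct / 100.0
    return sulfur_frac * SO2_PER_S * 1_000_000 / hhv_btu_per_lb

# A 2% sulfur, 12,000 Btu/lb raw coal vs. a cleaned 0.9% sulfur product:
raw = lb_so2_per_mmbtu(2.0, 12_000)    # ~3.33 lb SO2/MMBtu
clean = lb_so2_per_mmbtu(0.9, 12_500)  # ~1.44 lb SO2/MMBtu
print(round(raw, 2), round(clean, 2))
```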

  17. Pretreatment status report on the identification and evaluation of alternative processes. Milestone Report No. C064

    Energy Technology Data Exchange (ETDEWEB)

    Sutherland, D.G. [Westinghouse Hanford Co., Richland, WA (United States); Brothers, A.J. [Pacific Northwest Lab., Richland, WA (United States); Beary, M.M.; Nicholson, G.A. [Science Applications International Corp., San Diego, CA (United States)]

    1993-09-01

    The purpose of this report is to support the development and demonstration of a pretreatment system that will (1) destroy organic materials and ferrocyanide in tank wastes so that the wastes can be stored safely, (2) separate the high-activity and low-activity fractions, (3) remove radionuclides and remove or destroy hazardous chemicals in LLW as necessary to meet waste form feed requirements, (4) support development and demonstration of vitrification technology by providing representative feeds to the bench-scale glass melter, (5) support full-scale HLW vitrification operations, including near-term operation, by providing feed that meets specifications, and (6) design and develop pretreatment processes that accomplish the above objectives and ensure compliance with environmental regulations. This report is a presentation of candidate technologies for pretreatment of Hanford Site tank waste. Included are descriptions of studies by the Pacific Northwest Laboratory of Battelle Memorial Institute; Science Applications International Corporation, an independent consultant; BNFL, Inc. representing British technologies; Numatec, representing French technologies; and brief accounts of other relevant activities.

  18. Pretreatment status report on the identification and evaluation of alternative processes

    International Nuclear Information System (INIS)

    The purpose of this report is to support the development and demonstration of a pretreatment system that will (1) destroy organic materials and ferrocyanide in tank wastes so that the wastes can be stored safely, (2) separate the high-activity and low-activity fractions, (3) remove radionuclides and remove or destroy hazardous chemicals in LLW as necessary to meet waste form feed requirements, (4) support development and demonstration of vitrification technology by providing representative feeds to the bench-scale glass melter, (5) support full-scale HLW vitrification operations, including near-term operation, by providing feed that meets specifications, and (6) design and develop pretreatment processes that accomplish the above objectives and ensure compliance with environmental regulations. This report is a presentation of candidate technologies for pretreatment of Hanford Site tank waste. Included are descriptions of studies by the Pacific Northwest Laboratory of Battelle Memorial Institute; Science Applications International Corporation, an independent consultant; BNFL, Inc. representing British technologies; Numatec, representing French technologies; and brief accounts of other relevant activities

  19. Primary Processing

    NARCIS (Netherlands)

    Mulder, W.J.; Harmsen, P.F.H.; Sanders, J.P.M.; Carre, P.; Kamm, B.; Schoenicke, P.

    2012-01-01

    Primary processing of oil-containing material involves pre-treatment processes, oil recovery processes and the extraction and valorisation of valuable compounds from waste streams. Pre-treatment processes, e.g. thermal, enzymatic, electrical and radio frequency, have an important effect on the oil recovery.

  20. Qualification testing and full-scale demonstration of titanium-treated zeolite for sludge wash processing

    Energy Technology Data Exchange (ETDEWEB)

    Dalton, W.J.

    1997-06-30

    Titanium-treated zeolite is a new ion-exchange material that is a variation of UOP (formerly Union Carbide) IONSIV IE-96 zeolite (IE-96) that has been treated with an aqueous titanium solution in a proprietary process. IE-96 zeolite, without the titanium treatment, has been used since 1988 in the West Valley Demonstration Project's (WVDP) Supernatant Treatment System (STS) ion-exchange columns to remove Cs-137 from the liquid supernatant solution. The titanium-treated zeolite (TIE-96) was developed by Battelle-Pacific Northwest Laboratory (PNL). Following successful lab-scale testing of the PNL-prepared TIE-96, UOP was selected as a commercial supplier of the TIE-96 zeolite. Extensive laboratory tests conducted by both the WVDP and PNL indicate that the TIE-96 will successfully remove comparable quantities of Cs-137 from Tank 8D-2 high-level radioactive liquid as was done previously with IE-96. In addition to removing Cs-137, TIE-96 also removes trace quantities of Pu, as well as Sr-90, from the liquid being processed over a wide range of operating conditions: temperature, pH, and dilution. The exact mechanism responsible for the Pu removal is not fully understood. However, the Pu that is removed by the TIE-96 remains on the ion-exchange column under anticipated sludge wash processing conditions. From May 1988 to November 1990, the WVDP processed 560,000 gallons of liquid high-level radioactive supernatant waste stored in Tank 8D-2. Supernatant is an aqueous salt solution comprised primarily of soluble sodium salts. The second stage of the high-level waste treatment process began November 1991 with the initiation of sludge washing. Sludge washing involves the mixing of Tank 8D-2 contents, both sludge and liquid, to dissolve the sulfate salts present in the sludge. Two sludge washes were required to remove sulfates from the sludge.

  1. Elektrokemiske Processer

    DEFF Research Database (Denmark)

    Bech-Nielsen, Gregers

    1997-01-01

    Electrochemical processes in: Power sources, Electrosynthesis, Corrosion. Pourbaix diagrams. Decontamination of industrial waste water for heavy metals.

  2. Anaerobic Process.

    Science.gov (United States)

    Yang, Qian; Ju, Mei-Ting; Li, Wei-Zun; Liu, Le; Wang, Yan-Nan; Chang, Chein-Chi

    2016-10-01

    A review of the literature published in 2015 on anaerobic processes. It is divided into the following sections: Pretreatment; Organic Waste; Multiple-Stage Co-digestion; and Process Methodology and Technology. PMID:27620085

  3. Process mining

    DEFF Research Database (Denmark)

    van der Aalst, W.M.P.; Rubin, V.; Verbeek, H.M.W.;

    2010-01-01

    Process mining includes the automated discovery of processes from event logs. Based on observed events (e.g., activities being executed or messages being exchanged) a process model is constructed. One of the essential problems in process mining is that one cannot assume to have seen all possible behavior. At best, one has seen a representative subset. Therefore, classical synthesis techniques are not suitable as they aim at finding a model that is able to exactly reproduce the log. Existing process mining techniques try to avoid such “overfitting” by generalizing the model to allow for more...
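    The discovery step described above can be sketched by extracting a directly-follows relation from an event log, the starting point of many discovery algorithms; the log and activity names below are illustrative assumptions. A model generalized from these pairs can admit behavior never literally observed, which is exactly the overfitting/generalization trade-off the abstract raises.

```python
# Sketch: derive a directly-follows relation from an event log,
# the first step in many process-discovery algorithms.
# The log and activity names are illustrative assumptions.

def directly_follows(log):
    """Return the set of (a, b) pairs where b directly follows a."""
    pairs = set()
    for trace in log:
        for a, b in zip(trace, trace[1:]):
            pairs.add((a, b))
    return pairs

log = [
    ["register", "check", "approve"],
    ["register", "check", "reject"],
]
print(sorted(directly_follows(log)))
# → [('check', 'approve'), ('check', 'reject'), ('register', 'check')]
```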

  4. Data processing

    CERN Document Server

    Fry, T F

    2013-01-01

    Data Processing discusses the principles, practices, and associated tools in data processing. The book is comprised of 17 chapters that are organized into three parts. The first part covers the characteristics, systems, and methods of data processing. Part 2 deals with the data processing practice; this part discusses the data input, output, and storage. The last part discusses topics related to systems and software in data processing, which include checks and controls, computer language and programs, and program elements and structures. The text will be useful to practitioners of computer-rel

  5. Comment on 'Critical materials assessment program' (S. A. Smith and R. Watts, Battelle Memorial Institute)

    Energy Technology Data Exchange (ETDEWEB)

    Zweibel, K.; Jackson, B.; Hermann, A.

    1986-01-15

    In the article "Critical materials assessment program" (1984) Smith and Watts examine the availability of indium for use in CuInSe2/CdS solar cells and conclude that indium's use should be "decreased or eliminated". Our response to this article is an update of the status of indium usage in CuInSe2 cells. It will be shown that the use of indium is being significantly reduced and that pressures on the world indium supply can be minimal. Our conclusion is that the CdS/CuInSe2 solar cell technology is viable, not limited by indium availability or cost.

  6. Experimental evaluation of the Battelle accelerated test design for the solar array at Mead, Nebraska

    Energy Technology Data Exchange (ETDEWEB)

    Frickland, P.O.; Repar, J.

    1982-04-06

    A previously developed test design for accelerated aging of photovoltaic modules was experimentally evaluated. The studies included a review of relevant field experience, environmental chamber cycling of full-size modules, and electrical and physical evaluation of the effects of accelerated aging during and after the tests. The test results indicated that thermally induced fatigue of the interconnects was the primary mode of module failure as measured by normalized power output. No chemical change in the silicone encapsulant was detectable after 360 test cycles.

  7. Final report on ARPA fission yield project work at Battelle-Northwest, April 1970--April 1973

    International Nuclear Information System (INIS)

    The overall objective has been to measure the independent and cumulative fission yields of selected halogen and rare gas nuclides for application to characterization of underground nuclear detonations. The studies have included fission yield measurements for thermal, fission spectrum, and 15 MeV neutron-induced fission events. Target materials included 235U, 238U and 239Pu. The research effort was divided into two basic parts. In one part, the nuclides of interest were separated radiochemically and determined by gamma-ray spectrometry. This approach provides information on the independent and cumulative yields of nuclides with half-lives of a few seconds or greater. The second part of our effort involved the use of on-line mass separation techniques. This approach yields information on independent fission yields of nuclides with half-lives ranging down to fractions of a second and provides data on all significant isotopes of a given fission product element in one set of measurements. The main effort in the radiochemistry program was centered on measurements of the cumulative fission yield of 89Kr. Cumulative fission yields of 89Kr were measured for thermal-neutron fission of 239Pu and for fission-spectrum and 15-MeV neutron fission of 235U, 238U and 239Pu. In addition, cumulative fission yields of the other rare gas radionuclides, 85mKr, 87Kr, 88Kr, 137Xe, 138Xe, were measured for the same fission type events. Fractional independent yields of 89Rb and 138Cs were also measured for a limited number of fission systems. On-line mass spectrometer facilities were established at a Van de Graaff accelerator and at a nuclear reactor. Measurements were made of relative independent fission yields of rubidium isotopes of masses 89 through 97 and of cesium isotopes of masses 139 through 145. (U.S.)

  8. Stochastic processes

    CERN Document Server

    Parzen, Emanuel

    2015-01-01

    Well-written and accessible, this classic introduction to stochastic processes and related mathematics is appropriate for advanced undergraduate students of mathematics with a knowledge of calculus and continuous probability theory. The treatment offers examples of the wide variety of empirical phenomena for which stochastic processes provide mathematical models, and it develops the methods of probability model-building. Chapter 1 presents precise definitions of the notions of a random variable and a stochastic process and introduces the Wiener and Poisson processes. Subsequent chapters examine
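    One of the two basic processes the book introduces, the Poisson process, can be sketched as a simulation built from exponential interarrival times; the rate and horizon values below are illustrative.

```python
# Sketch: simulate a Poisson process of rate `lam` by summing
# exponentially distributed interarrival times.
import random

def poisson_arrivals(lam, horizon, seed=0):
    """Arrival times of a rate-`lam` Poisson process on (0, horizon]."""
    rng = random.Random(seed)
    t, arrivals = 0.0, []
    while True:
        t += rng.expovariate(lam)  # exponential interarrival time
        if t > horizon:
            return arrivals
        arrivals.append(t)

times = poisson_arrivals(lam=2.0, horizon=10.0)
print(len(times))  # the count is Poisson-distributed with mean lam * horizon
```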

  9. Data processing

    International Nuclear Information System (INIS)

    The 1988 progress report of the Data Processing Laboratory (Polytechnic School, France) is presented. The laboratory research fields are: semantics, testing and semantic analysis of codes, formal calculus, software applications, algorithms, neural networks and VLSI (Very Large Scale Integration). The investigations concerning polynomial rings are performed by means of the standard basis approach. Research topics also include Pascal codes, parallel processing, the combinatorial, statistical and asymptotic properties of fundamental data processing tools, signal processing, and pattern recognition. The published papers, the congress communications and the theses are also included.

  10. Validation of New Process Models for Large Injection-Molded Long-Fiber Thermoplastic Composite Structures

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, Ba Nghiep; Jin, Xiaoshi; Wang, Jin; Kunc, Vlastimil; Tucker III, Charles L.

    2012-02-23

    This report describes the work conducted under the CRADA Nr. PNNL/304 between Battelle PNNL and Autodesk, whose objective is to validate the new process models developed under the previous CRADA for large injection-molded LFT composite structures. To this end, the ARD-RSC and fiber length attrition models implemented in the 2013 research version of Moldflow were used to simulate the injection molding of 600-mm x 600-mm x 3-mm plaques from 40% glass/polypropylene (Dow Chemical DLGF9411.00) and 40% glass/polyamide 6,6 (DuPont Zytel 75LG40HSL BK031) materials. The injection molding was performed by Injection Technologies, Inc. in Windsor, Ontario (under a subcontract by Oak Ridge National Laboratory, ORNL) using the mold offered by the Automotive Composite Consortium (ACC). Two fill speeds under the same back pressure were used to produce plaques under slow-fill and fast-fill conditions. Also, two gating options were used to achieve the following desired flow patterns: flows in edge-gated plaques and in center-gated plaques. After molding, ORNL performed measurements of fiber orientation and length distributions for process model validations. The structure of this report is as follows. After the Introduction (Section 1), Section 2 provides a summary of the ARD-RSC and fiber length attrition models. A summary of model implementations in the latest research version of Moldflow is given in Section 3. Section 4 provides the key processing conditions and parameters for molding of the ACC plaques. The validations of the ARD-RSC and fiber length attrition models are presented and discussed in Section 5. The conclusions will be drawn in Section 6.

  11. Kreative processer

    DEFF Research Database (Denmark)

    Schoch, Odilo

    2010-01-01

    Explaining the way of understanding processes and the development of visionary goals, linked to the daily business of architectural practice.

  12. Peat Processing

    Science.gov (United States)

    1986-01-01

    Humics, Inc. already had patented their process for separating wet peat into components and processing it when they consulted NERAC regarding possible applications. The NERAC search revealed numerous uses for humic acid extracted from peat. The product improves seed germination, stimulates root development, and improves crop yields. There are also potential applications in sewage disposal and horticultural peat, etc.

  13. Sustainable processing

    DEFF Research Database (Denmark)

    Kristensen, Niels Heine

    2004-01-01

    Kristensen NH and Beck A: Sustainable processing. In Otto Schmid, Alexander Beck and Ursula Kretzschmar (Editors) (2004): Underlying Principles in Organic and "Low-Input Food" Processing - Literature Survey. Research Institute of Organic Agriculture FiBL, CH-5070 Frick, Switzerland. ISBN 3-906081-58-3

  14. Design Processes

    DEFF Research Database (Denmark)

    Ovesen, Nis

    2009-01-01

    Inspiration for most research and optimisation of design processes still seems to focus within the narrow field of traditional design practice. The focus in this study turns to associated businesses of the design professions in order to learn from their development processes. Through interviews...

  15. Process automation

    International Nuclear Information System (INIS)

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs

  16. Organizing Process

    DEFF Research Database (Denmark)

    Hull Kristensen, Peer; Bojesen, Anders

    This paper invites discussion of the processes of individualization and organizing being carried out under what we might see as an emerging regime of change. The underlying argumentation is that in certain processes of change, competence becomes questionable at all times. The hazy characteristics of this regime of change are pursued through a discussion of competencies as opposed to qualifications, illustrated by distinct cases from the Danish public sector in the search for repetitive mechanisms. The cases are put into a general perspective by drawing upon experiences from similar change processes...

  17. Grants Process

    Science.gov (United States)

    The NCI Grants Process provides an overview of the end-to-end lifecycle of grant funding. Learn about the types of funding available and the basics for application, review, award, and on-going administration within the NCI.

  18. Membrane Processes.

    Science.gov (United States)

    Pellegrin, Marie-Laure; Sadler, Mary E; Greiner, Anthony D; Aguinaldo, Jorge; Min, Kyungnan; Zhang, Kai; Arabi, Sara; Burbano, Marie S; Kent, Fraser; Shoaf, Robert

    2015-10-01

    This review, for literature published in 2014, contains information related to membrane processes for municipal and industrial applications. This review is a subsection of the Treatment Systems section of the annual Water Environment Federation literature review and covers the following topics: pretreatment, membrane bioreactor (MBR) configuration, design, nutrient removal, operation, industrial treatment, fixed film and anaerobic membrane systems, reuse, microconstituents removal, membrane technology advances, membrane fouling, and modeling. Other sub-sections of the Treatment Systems section that might relate to this literature review include: Biological Fixed-Film Systems, Activated Sludge and Other Aerobic Suspended Culture Processes, Anaerobic Processes, Water Reclamation and Reuse. The following sections might also have related information on membrane processes: Industrial Wastes, Hazardous Wastes, and Fate and Effects of Pollutants. PMID:26420079

  19. Membrane Processes.

    Science.gov (United States)

    Pellegrin, Marie-Laure; Burbano, Marie S; Sadler, Mary E; Diamond, Jason; Baker, Simon; Greiner, Anthony D; Arabi, Sara; Wong, Joseph; Doody, Alexandra; Padhye, Lokesh P; Sears, Keith; Kistenmacher, Peter; Kent, Fraser; Tootchi, Leila; Aguinaldo, Jorge; Saddredini, Sara; Schilling, Bill; Min, Kyungnan; McCandless, Robert; Danker, Bryce; Gamage, Neranga P; Wang, Sunny; Aerts, Peter

    2016-10-01

    This review, for literature published in 2015, contains information related to membrane processes for municipal and industrial applications. This review is a subsection of the Treatment Systems section of the annual Water Environment Federation literature review and covers the following topics: pretreatment, membrane bioreactor (MBR) configuration, design, nutrient removal, operation, industrial treatment, anaerobic membrane systems, reuse, microconstituents removal, membrane technology advances, membrane fouling, and modeling. Other sub-sections of the Treatment Systems section that might relate to this literature review include: Biological Fixed-Film Systems, Activated Sludge and Other Aerobic Suspended Culture Processes, Anaerobic Processes, Water Reclamation and Reuse. The following sections might also have related information on membrane processes: Industrial Wastes, Hazardous Wastes, and Fate and Effects of Pollutants. PMID:27620084

  20. Electrochemical Processes

    DEFF Research Database (Denmark)

    Bech-Nielsen, Gregers

    1997-01-01

    The notes describe in detail primary and secondary galvanic cells, fuel cells, electrochemical synthesis and electroplating processes, corrosion: measurments, inhibitors, cathodic and anodic protection, details of metal dissolution reactions, Pourbaix diagrams and purification of waste water from...

  1. Sewer Processes

    DEFF Research Database (Denmark)

    Hvitved-Jacobsen, Thorkild; Vollertsen, Jes; Nielsen, Asbjørn Haaning

    Since the first edition was published over a decade ago, advancements have been made in the design, operation, and maintenance of sewer systems, and new problems have emerged. For example, sewer processes are now integrated in computer models, and simultaneously, odor and corrosion problems caused by hydrogen sulfide and other volatile organic compounds, as well as other potential health issues, have caused environmental concerns to rise. Reflecting the most current developments, Sewer Processes: Microbial and Chemical Process Engineering of Sewer Networks, Second Edition, offers the reader updated...... microbial and chemical processes and demonstrates how this knowledge can be applied for the design, operation, and the maintenance of wastewater collection systems. The authors add chemical and microbial dimensions to the design and management of sewer networks with an overall aim of improved sustainability......

  2. Processing Proteases

    DEFF Research Database (Denmark)

    Ødum, Anders Sebastian Rosenkrans

    Processing proteases are proteases which proteolytically activate proteins and peptides into their biologically active form. Processing proteases play an important role in biotechnology as tools in protein fusion technology. Fusion strategies where helper proteins or peptide tags are fused to the protein of interest are an elaborate method to optimize expression or purification systems. It is however critical that fusion proteins can be removed, and processing proteases can facilitate this in a highly specific manner. The commonly used proteases all have substrate specificities to the N...... of few known proteases to have substrate specificity for the C-terminal side of the scissile bond. LysN exhibits specificity for lysine, and has primarily been used to complement trypsin in proteomic studies. A working hypothesis during this study was the potential of LysN as a processing protease...

  3. Renewal processes

    CERN Document Server

    Mitov, Kosto V

    2014-01-01

    This monograph serves as an introductory text to classical renewal theory and some of its applications for graduate students and researchers in mathematics and probability theory. Renewal processes play an important part in modeling many phenomena in insurance, finance, queuing systems, inventory control and other areas. In this book, an overview of univariate renewal theory is given and renewal processes in the non-lattice and lattice case are discussed. A pre-requisite is a basic knowledge of probability theory.
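
    As a minimal illustration of the renewal-counting idea described above, the sketch below draws i.i.d. inter-arrival times and counts renewals up to a time horizon. The function names are our own, and the exponential example is chosen only because it reduces to the familiar Poisson-process special case:

```python
import random

def renewal_count(interarrival_sampler, horizon, rng):
    """Count renewals N(t) up to time `horizon` for i.i.d. inter-arrival times."""
    t, n = 0.0, 0
    while True:
        t += interarrival_sampler(rng)  # next i.i.d. inter-arrival time
        if t > horizon:
            return n
        n += 1

rng = random.Random(42)
# Exponential inter-arrivals (rate 2) make the renewal process a Poisson process,
# so the expected count over a horizon of 10 is 2 * 10 = 20.
count = renewal_count(lambda r: r.expovariate(2.0), horizon=10.0, rng=rng)
```

    Any other non-negative inter-arrival sampler (lattice or non-lattice, in the book's terminology) can be plugged in unchanged.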

  4. Extraction process

    International Nuclear Information System (INIS)

    A process is described for extracting at least two desired constituents from a mineral, using a liquid reagent which produces the constituents, or compounds thereof, in separable form and independently extracting those constituents, or compounds. The process is especially valuable for the extraction of phosphoric acid and metal values from acidulated phosphate rock, the slurry being contacted with selective extractants for phosphoric acid and metal (e.g. uranium) values. In an example, uranium values are oxidized to uranyl form and extracted using an ion exchange resin. (U.K.)

  5. Offshoring Process

    DEFF Research Database (Denmark)

    Slepniov, Dmitrij; Sørensen, Brian Vejrum; Katayama, Hiroshi

    2011-01-01

    The purpose of this chapter is to contribute to the knowledge on how production offshoring and international operations management vary across cultural contexts. The chapter attempts to shed light on how companies approach the process of offshoring in different cultural contexts. In order...... of globalisation. Yet there are clear differences in how offshoring is conducted in Denmark and Japan. The main differences are outlined in a framework and explained employing cultural variables. The findings lead to a number of propositions suggesting that the process of offshoring is not simply a uniform......

  6. BENTONITE PROCESSING

    Directory of Open Access Journals (Sweden)

    Anamarija Kutlić

    2012-07-01

    Full Text Available Bentonite has a wide variety of uses. A special use of bentonite, in which its absorbing properties are employed to provide water-tight sealing, is for underground repositories in granite. In this paper, bentonite processing and beneficiation are described.

  7. Causticizing process

    Energy Technology Data Exchange (ETDEWEB)

    Engdal, H.

    1987-03-10

    This invention seeks to provide a method in which the soda lye obtained as a result of the cellulose cooking process and unslaked lime are used for producing white liquor which can be re-used in the cooking process. In this method, the heat released by the slaking of lime with soda lye is recovered by a high pressure slaking process wherein the heat is transferred, either to the steam separating from the lye, which steam is then led to the desired application, or to some other medium to be heated. The invention is characterized in that the soda lye to be causticized is divided into two parts, one of which is used for the slaking of lime by adding to it all the unslaked lime needed for the causticizing process, and that, following slaking, the two volumes are brought together for the actual causticizing reaction involving the total amount of lye needed. This invention provides the advantage that the amount of lye needed is smaller, and so the temperature can be increased.

  8. Innovation process

    DEFF Research Database (Denmark)

    Kolodovski, A.

    2006-01-01

    Purpose of this report: This report was prepared for the RISO team involved in the design of the innovation system. The report provides an innovation methodology to establish a common understanding of the process concepts and related terminology. The report does not include RISO- or Denmark-specific cultural, econom...

  9. Development and Validation of an Acid Mine Drainage Treatment Process for Source Water

    Energy Technology Data Exchange (ETDEWEB)

    Lane, Ann [Battelle Memorial Institute, Columbus, OH (United States)

    2016-03-01

    Throughout Northern Appalachia and surrounding regions, hundreds of abandoned mine sites exist which frequently are the source of Acid Mine Drainage (AMD). AMD typically contains metal ions in solution with sulfate ions which have been leached from the mine. These large volumes of water, if treated to a minimum standard, may be of use in Hydraulic Fracturing (HF) or other industrial processes. This project’s focus is to evaluate an AMD water treatment technology for the purpose of providing treated AMD as an alternative source of water for HF operations. The HydroFlex™ technology allows the conversion of a previous environmental liability into an asset while reducing stress on potable water sources. The technology achieves greater than 95% water recovery, while removing sulfate to concentrations below 100 mg/L and common metals (e.g., iron and aluminum) below 1 mg/L. The project is intended to demonstrate the capability of the process to provide AMD as alternative source water for HF operations. The second budget period of the project has been completed during which Battelle conducted two individual test campaigns in the field. The first test campaign demonstrated the ability of the HydroFlex system to remove sulfate to levels below 100 mg/L, meeting the requirements indicated by industry stakeholders for use of the treated AMD as source water. The second test campaign consisted of a series of focused confirmatory tests aimed at gathering additional data to refine the economic projections for the process. Throughout the project, regular communications were held with a group of project stakeholders to ensure alignment of the project objectives with industry requirements. Finally, the process byproduct generated by the HydroFlex process was evaluated for the treatment of produced water against commercial treatment chemicals. It was found that the process byproduct achieved similar results for produced water treatment as the chemicals currently in use. Further

  10. Boolean process

    Institute of Scientific and Technical Information of China (English)

    闵应骅; 李忠诚; 赵著行

    1997-01-01

    Boolean algebra successfully describes the logical behavior of a digital circuit and has been widely used in electronic circuit design and test. With the development of high-speed VLSIs, it is a drawback that Boolean algebra is unable to describe circuit timing behavior. Therefore, a Boolean process is defined as a family of Boolean variables relevant to the time parameter t. A real-valued sample of a Boolean process is a waveform. Waveform functions can be manipulated formally by using mathematical tools. The distance, difference, and limit of a waveform polynomial are defined, and a sufficient and necessary condition for the existence of the limit is presented. Based on this, the concept of sensitization is redefined precisely to demonstrate its potential and wide applicability. The new definition is very different from the traditional one, and has an impact on determining the sensitizable paths with maximum or minimum length, as well as false paths, and hence on designing and testing high-performance circuits.
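
    One hypothetical way to make the waveform notion concrete is to treat a waveform as a piecewise-constant 0/1 function of t and to measure disagreement between two waveforms on a sampling grid. The representation and the distance below are illustrative assumptions of ours, not the paper's formal definitions:

```python
def waveform(transitions):
    """Return w(t) for a piecewise-constant 0/1 waveform.
    `transitions` is a sorted list of (time, value) pairs; each value holds
    until the next transition time."""
    def w(t):
        value = 0
        for time, v in transitions:
            if t >= time:
                value = v
            else:
                break
        return value
    return w

def disagreement(w1, w2, t0, t1, steps=1000):
    """Fraction of [t0, t1] (sampled on a uniform grid) where the waveforms
    differ -- an illustrative distance, not the paper's definition."""
    hits = sum(w1(t0 + k * (t1 - t0) / steps) != w2(t0 + k * (t1 - t0) / steps)
               for k in range(steps))
    return hits / steps

a = waveform([(0.0, 0), (1.0, 1)])   # rises at t = 1.0
b = waveform([(0.0, 0), (1.5, 1)])   # rises at t = 1.5
d = disagreement(a, b, 0.0, 2.0)     # the waveforms differ on [1.0, 1.5)
```

    In this toy model, a timing skew between two otherwise identical signals shows up directly as a nonzero distance, which is the kind of timing information plain Boolean algebra cannot express.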

  11. Restoration Process

    Science.gov (United States)

    1979-01-01

    In the accompanying photos, a laboratory technician is restoring the once-obliterated serial number of a revolver. The four-photo sequence shows the gradual progression from total invisibility to clear readability. The technician is using a new process developed in an applications engineering project conducted by NASA's Lewis Research Center in conjunction with Chicago State University. Serial numbers and other markings are frequently eliminated from metal objects to prevent tracing ownership of guns, motor vehicles, bicycles, cameras, appliances and jewelry. To restore obliterated numbers, crime laboratory investigators most often employ a chemical etching technique. It is effective, but it may cause metal corrosion and it requires extensive preparatory grinding and polishing. The NASA-Chicago State process is advantageous because it can be applied without variation to any kind of metal, it needs no preparatory work and number recovery can be accomplished without corrosive chemicals; the liquid used is water.

  12. Ceramic Processing

    Energy Technology Data Exchange (ETDEWEB)

    EWSUK,KEVIN G.

    1999-11-24

    Ceramics represent a unique class of materials that are distinguished from common metals and plastics by their: (1) high hardness, stiffness, and good wear properties (i.e., abrasion resistance); (2) ability to withstand high temperatures (i.e., refractoriness); (3) chemical durability; and (4) electrical properties that allow them to be electrical insulators, semiconductors, or ionic conductors. Ceramics can be broken down into two general categories, traditional and advanced ceramics. Traditional ceramics include common household products such as clay pots, tiles, pipe, and bricks, porcelain china, sinks, and electrical insulators, and thermally insulating refractory bricks for ovens and fireplaces. Advanced ceramics, also referred to as "high-tech" ceramics, include products such as spark plug bodies, piston rings, catalyst supports, and water pump seals for automobiles, thermally insulating tiles for the space shuttle, sodium vapor lamp tubes in streetlights, and the capacitors, resistors, transducers, and varistors in the solid-state electronics we use daily. The major differences between traditional and advanced ceramics are in the processing tolerances and cost. Traditional ceramics are manufactured with inexpensive raw materials, are relatively tolerant of minor process deviations, and are relatively inexpensive. Advanced ceramics are typically made with more refined raw materials and processing to optimize a given property or combination of properties (e.g., mechanical, electrical, dielectric, optical, thermal, physical, and/or magnetic) for a given application. Advanced ceramics generally have improved performance and reliability over traditional ceramics, but are typically more expensive. Additionally, advanced ceramics are typically more sensitive to the chemical and physical defects present in the starting raw materials, or those that are introduced during manufacturing.

  13. Implementation of Paste Backfill Mining Technology in Chinese Coal Mines

    OpenAIRE

    Qingliang Chang; Jianhang Chen; Huaqiang Zhou; Jianbiao Bai

    2014-01-01

    Implementation of clean mining technology at coal mines is crucial to protect the environment and maintain balance among energy resources, consumption, and ecology. After reviewing present coal clean mining technology, we introduce the technology principles and technological process of paste backfill mining in coal mines and discuss the components and features of backfill materials, the constitution of the backfill system, and the backfill process. Specific implementation of this technology a...

  14. Lithospheric processes

    Energy Technology Data Exchange (ETDEWEB)

    Baldridge, W. [and others]

    2000-12-01

    The authors used geophysical, geochemical, and numerical modeling to study selected problems related to Earth's lithosphere. We interpreted seismic waves to better characterize the thickness and properties of the crust and lithosphere. In the southwestern US and Tien Shan, crust of high elevation is dynamically supported above buoyant mantle. In California, mineral fabrics in the mantle correlate with regional strain history. Although plumes of buoyant mantle may explain surface deformation and magmatism, our geochemical work does not support this mechanism for Iberia. Generation and ascent of magmas remains puzzling. Our work in Hawaii constrains the residence time of magma beneath Hualalai to a few hundred to about 1000 years. In the crust, heat drives fluid and mass transport. Numerical modeling yielded robust and accurate predictions of these processes. This work is important fundamental science, and applies to mitigation of volcanic and earthquake hazards, Test Ban Treaties, nuclear waste storage, environmental remediation, and hydrothermal energy.

  15. Image Processing

    Science.gov (United States)

    1993-01-01

    Electronic Imagery, Inc.'s ImageScale Plus software, developed through a Small Business Innovation Research (SBIR) contract with Kennedy Space Flight Center for use on space shuttle Orbiter in 1991, enables astronauts to conduct image processing, prepare electronic still camera images in orbit, display them and downlink images to ground based scientists for evaluation. Electronic Imagery, Inc.'s ImageCount, a spin-off product of ImageScale Plus, is used to count trees in Florida orange groves. Other applications include x-ray and MRI imagery, textile designs and special effects for movies. As of 1/28/98, company could not be located, therefore contact/product information is no longer valid.

  16. Crystallization process

    Science.gov (United States)

    Adler, Robert J.; Brown, William R.; Auyang, Lun; Liu, Yin-Chang; Cook, W. Jeffrey

    1986-01-01

    An improved crystallization process is disclosed for separating a crystallizable material and an excluded material which is at least partially excluded from the solid phase of the crystallizable material obtained upon freezing a liquid phase of the materials. The solid phase is more dense than the liquid phase, and it is separated therefrom by relative movement with the formation of a packed bed of solid phase. The packed bed is continuously formed adjacent its lower end and passed from the liquid phase into a countercurrent flow of backwash liquid. The packed bed extends through the level of the backwash liquid to provide a drained bed of solid phase adjacent its upper end which is melted by a condensing vapor.

  17. Welding process

    International Nuclear Information System (INIS)

    This invention relates to a process for making a large number of weld beads as separate contours, spaced out from each other, by means of an automatic welding head. Under this invention, after striking the arc in the prescribed manner and positioning the torch on the first contour to be welded and having made the first weld bead, the torch current is reduced to bring about a part fade out of the arc. The torch is then moved to the starting position on a second contour to be welded where a static timed pre-fusion is effected by resumption of the welding current to carry out the second weld bead by following the second welding contour in the same manner and so forth. The invention particularly applies to the welding of tube ends to a tube plate

  18. Information Processing - Administrative Data Processing

    Science.gov (United States)

    Bubenko, Janis

    A three semester, 60-credit course package in the topic of Administrative Data Processing (ADP), offered in 1966 at Stockholm University (SU) and the Royal Institute of Technology (KTH) is described. The package had an information systems engineering orientation. The first semester focused on datalogical topics, while the second semester focused on the infological topics. The third semester aimed to deepen the students’ knowledge in different parts of ADP and at writing a bachelor thesis. The concluding section of this paper discusses various aspects of the department’s first course effort. The course package led to a concretisation of our discipline and gave our discipline an identity. Our education seemed modern, “just in time”, and well adapted to practical needs. The course package formed the first concrete activity of a group of young teachers and researchers. In a forty-year perspective, these people have further developed the department and the topic to an internationally well-reputed body of knowledge and research. The department has produced more than thirty professors and more than one hundred doctoral degrees.

  19. Coal liquefaction process streams characterization and evaluation. Characterization of coal-derived materials by field desorption mass spectrometry, two-dimensional nuclear magnetic resonance, supercritical fluid extraction, and supercritical fluid chromatography/mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, J.A.; Linehan, J.C.; Robins, W.H. [Battelle Pacific Northwest Lab., Richland, WA (United States)

    1992-07-01

    Under contract from the DOE, and in association with CONSOL Inc., Battelle Pacific Northwest Laboratory (PNL) evaluated four principal and several complementary techniques for the analysis of non-distillable direct coal liquefaction materials in support of process development. Field desorption mass spectrometry (FDMS) and nuclear magnetic resonance (NMR) spectroscopic methods were examined for potential usefulness as techniques to elucidate the chemical structure of residual (nondistillable) direct coal liquefaction derived materials. Supercritical fluid extraction (SFE) and supercritical fluid chromatography/mass spectrometry (SFC/MS) were evaluated for effectiveness in compound-class separation and identification of residual materials. Liquid chromatography (including microcolumn) separation techniques, gas chromatography/mass spectrometry (GC/MS), mass spectrometry/mass spectrometry (MS/MS), and GC/Fourier transform infrared (FTIR) spectroscopy methods were applied to supercritical fluid extracts. The full report authored by the PNL researchers is presented here. The following assessment briefly highlights the major findings of the project, and evaluates the potential of the methods for application to coal liquefaction materials. These results will be incorporated by CONSOL into a general overview of the application of novel analytical techniques to coal-derived materials at the conclusion of CONSOL's contract.

  20. AN ADVANCED OXIDATION PROCESS : FENTON PROCESS

    Directory of Open Access Journals (Sweden)

    Engin GÜRTEKİN

    2008-03-01

    Full Text Available Biological wastewater treatment is not an effective treatment method if raw wastewater contains toxic and refractory organics. Advanced oxidation processes are applied before or after biological treatment for the detoxification and reclamation of such wastewaters. The advanced oxidation processes are based on the formation of powerful hydroxyl radicals. Among advanced oxidation processes, the Fenton process is one of the most promising methods, because its application is simple and cost-effective and the reaction occurs in a short time period. The Fenton process is applied for many different purposes. In this study, the Fenton process was evaluated as an advanced oxidation process in wastewater treatment.

  1. AN ADVANCED OXIDATION PROCESS : FENTON PROCESS

    OpenAIRE

    Engin GÜRTEKİN; Nusret ŞEKERDAĞ

    2008-01-01

    Biological wastewater treatment is not an effective treatment method if raw wastewater contains toxic and refractory organics. Advanced oxidation processes are applied before or after biological treatment for the detoxification and reclamation of such wastewaters. The advanced oxidation processes are based on the formation of powerful hydroxyl radicals. Among advanced oxidation processes, the Fenton process is one of the most promising methods, because the application of the Fenton process is simple ...

  2. Auditory Processing Disorders

    Science.gov (United States)

    Auditory Processing Disorders Auditory processing disorders (APDs) are referred to by many names: central auditory processing disorders , auditory perceptual disorders , and central auditory disorders . APDs ...

  3. Extensible packet processing architecture

    Energy Technology Data Exchange (ETDEWEB)

    Robertson, Perry J.; Hamlet, Jason R.; Pierson, Lyndon G.; Olsberg, Ronald R.; Chun, Guy D.

    2013-08-20

    A technique for distributed packet processing includes sequentially passing packets associated with packet flows between a plurality of processing engines along a flow through data bus linking the plurality of processing engines in series. At least one packet within a given packet flow is marked by a given processing engine to signify by the given processing engine to the other processing engines that the given processing engine has claimed the given packet flow for processing. A processing function is applied to each of the packet flows within the processing engines and the processed packets are output on a time-shared, arbitered data bus coupled to the plurality of processing engines.
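
    The flow-claiming scheme can be sketched as a toy software model. The class names, the shared claims table, and the per-engine capacity rule are assumptions of ours for illustration; the actual architecture operates on hardware data buses, not Python objects:

```python
class Engine:
    def __init__(self, name, process_fn, capacity):
        self.name = name
        self.process_fn = process_fn
        self.capacity = capacity
        self.claimed = set()              # flow ids this engine has claimed

    def handle(self, packet, claims):
        flow = packet["flow"]
        # Mark an unclaimed flow as ours if we still have capacity --
        # the "marking" that signals the claim to the other engines.
        if flow not in claims and len(self.claimed) < self.capacity:
            claims[flow] = self.name
            self.claimed.add(flow)
        # Only the engine that claimed the flow processes its packets.
        if claims.get(flow) == self.name:
            packet["payload"] = self.process_fn(packet["payload"])
            packet["processed_by"] = self.name
        return packet

def run_pipeline(engines, packets):
    claims = {}                           # claim marks visible to every engine
    out = []
    for pkt in packets:                   # each packet passes every engine in series
        for eng in engines:
            pkt = eng.handle(pkt, claims)
        out.append(pkt)                   # shared, arbitrated output bus
    return out

engines = [Engine("e1", str.upper, capacity=1),
           Engine("e2", str.upper, capacity=1)]
packets = [{"flow": "A", "payload": "a1"},
           {"flow": "B", "payload": "b1"},
           {"flow": "A", "payload": "a2"}]
out = run_pipeline(engines, packets)
```

    Running this pipeline, the first engine with spare capacity claims flow A, the second claims flow B, and every later packet of a flow is processed only by its claiming engine, which is the load-distribution effect the abstract describes.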

  4. Annual progress report to Battelle Pacific Northwest National Laboratories on prediction of phase separation of simulated nuclear waste glasses

    International Nuclear Information System (INIS)

    The objective of this research is to predict the immiscibility boundaries of multi-component borosilicate glasses, on which many nuclear waste glass compositions are based. The method used is similar to the prediction method of immiscibility boundaries of multi-component silicate glass systems successfully made earlier and is based upon the superposition of immiscibility boundaries of simple systems using an appropriate parameter. This method is possible because many immiscibility boundaries have similar shapes and can be scaled by a parameter. In the alkali and alkaline earth binary silicate systems, for example, the critical temperature and compositions were scaled using the Debye-Hueckel theory. In the present study on borosilicate systems, first, immiscibility boundaries of various binary alkali and alkaline-earth borate glass systems (e.g. BaO-B2O3) were examined and their critical temperatures were evaluated in terms of Debye-Hueckel theory. The mixing effects of two alkali and alkaline-earth borate systems on the critical temperature were also explored. Next, immiscibility boundaries of ternary borosilicate glasses (e.g. Na2O-SiO2-B2O3, K2O-SiO2-B2O3, Rb2O-SiO2-B2O3, and Cs2O-SiO2-B2O3) were examined. Their mixing effects are currently under investigation.

  5. Hydrogeologic Evaluation of a Ground-Source Cooling System at the BSF/CSF on the Battelle Campus: Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Freedman, Vicky L.; Mackley, Rob D.; Waichler, Scott R.; Horner, Jacob A.; Moon, Thomas W.; Newcomer, Darrell R.; DeSmet, Darrell J.; Lindsey, K. A.; Porcello, J. J.

    2010-05-12

    This report documents both the field characterization activities and the numerical modeling effort at the BSF/CSF site to determine the viability of an open-loop ground source heat pump (GSHP). The primary purpose of the integrated field and modeling study was to determine far-field impacts related to a non-consumptive use water right for the well field containing four extraction and four injection wells. In the field, boreholes were logged and used to develop the geologic conceptual model. Hydraulic testing was performed to identify hydraulic properties and determine sustainable pumping rates. Estimates of the Ringold hydraulic conductivity (60-150 m/d) at the BSF/CSF site were consistent with the local and regional hydrogeology as well as estimates previously published by other investigators. Sustainable pumping rates at the extraction wells were variable (100 – 700 gpm), and confirmed field observations of aquifer heterogeneity. Field data were used to develop a numerical model of the site. Simulations assessed the potential of the well field to impact nearby contaminant plumes, neighboring water rights, and the thermal regime of nearby surface water bodies. Using steady-state flow scenarios in conjunction with particle tracking, a radius of influence of 400–600 m was identified around the well field. This distance was considerably shorter than the distance to the closest contaminant plume (~1.2 km northwest to the DOE Horn Rapids Landfill) and the nearest water right holder (~1.2 km southeast to the City of Richland Well Field). Results demonstrated that current trajectories for nearby contaminant plumes will not be impacted by the operation of the GSHP well field. The objective of the energy transport analysis was to identify potential thermal impacts to the Columbia River under likely operational scenarios for the BSF/CSF well field. 
Estimated pumping rates and injection temperatures were used to simulate heat transport for a range of hydraulic conductivity estimates for the Ringold Formation. Two different operational scenarios were simulated using conservative assumptions, such as the absence of river water intrusion in the near shore groundwater. When seasonal injection of warm and cool water occurred, temperature impacts were insignificant at the Columbia River (< +0.2°C), irrespective of the hydraulic conductivity estimate. The second operational scenario simulated continuous heat rejection, a condition anticipated once the BSF/CSF is fully loaded with laboratory and computer equipment. For the continuous heat rejection case, where hourly peak conditions were simulated as month-long peaks, the maximum change in temperature along the shoreline was ~1°C. If this were to be interpreted as an absolute change in a static river temperature, it could be considered significant. However, the warmer-than-ambient groundwater flux that would potentially discharge to the Columbia River is very small relative to the flow in the river. For temperatures greater than 17.0°C, the flow relative to a low-flow condition in the river is only 0.012%. Moreover, field data has shown that diurnal fluctuations in temperature are as high as 5°C along the shoreline.

  6. BWIP [Battelle Waste Isolation Program] Repository Project: Interim fiscal profile, Benton and Franklin counties, Washington: Working draft

    International Nuclear Information System (INIS)

    This report presents a fiscal profile of Benton and Franklin counties, and of the cities of Richland, Kennewick, and Pasco. Overall, changes in operating revenues and expenditures in these jurisdictions have corresponded with changes in the local economy. The combined operating expenditures of Benton County, Franklin County, Kennewick, Pasco, and Richland, expressed in current dollars, tripled between 1975 and 1985, increasing from $18.1 million to $55.0 million, an annual average increase of 11.8 percent. During this time, the population of the Benton-Franklin MSA increased from 100,000 to 140,900 people, and the national all-items price index for urban consumers doubled, increasing from 161.2 to 322.2. Adjusted for inflation, per capita expenditures by these governments increased only slightly during this period, from $361.8 in 1975 to $390.3 in 1985. Employment in the Benton-Franklin MSA rose from 40,080 workers in 1970 to a peak of 75,900 in 1981 before declining to 61,100 in 1985, primarily due to the loss of 9,928 jobs in the Washington Public Power Supply System after 1981. The MSA's population followed a similar trend, with a slight lag. In 1970, total population in the Benton-Franklin MSA was 93,356 people. The MSA's population grew rapidly during the late 1970s, reached a peak of 147,900 persons in 1982, and then declined to 139,300 in 1986. 23 refs., 16 figs., 14 tabs

  7. Refactoring Process Models in Large Process Repositories.

    NARCIS (Netherlands)

    Weber, B.; Reichert, M.U.

    2008-01-01

    With the increasing adoption of process-aware information systems (PAIS), large process model repositories have emerged. Over time respective models have to be re-aligned to the real-world business processes through customization or adaptation. This bears the risk that model redundancies are introdu

  8. Thinning spatial point processes into Poisson processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Schoenberg, Frederic Paik

    2010-01-01

    are identified, and where we simulate backwards and forwards in order to obtain the thinned process. In the case of a Cox process, a simple independent thinning technique is proposed. In both cases, the thinning results in a Poisson process if and only if the true Papangelou conditional intensity is used, and...
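
    The independent thinning mentioned for the Cox-process case is easy to sketch in the simplest, homogeneous Poisson setting: keeping each point with probability p turns a Poisson(rate) process into a Poisson(rate x p) process. The dependent thinning driven by the Papangelou conditional intensity, the paper's main subject, is not attempted here; the function names are our own:

```python
import random

def simulate_poisson(rate, horizon, rng):
    """Homogeneous Poisson process on [0, horizon] via exponential gaps."""
    points, t = [], 0.0
    while True:
        t += rng.expovariate(rate)
        if t > horizon:
            return points
        points.append(t)

def independent_thin(points, p, rng):
    """Keep each point independently with probability p. Applied to a
    Poisson(rate) process, this yields a Poisson(rate * p) process."""
    return [x for x in points if rng.random() < p]

rng = random.Random(7)
pts = simulate_poisson(rate=5.0, horizon=20.0, rng=rng)
kept = independent_thin(pts, p=0.3, rng=rng)
```

    For more general point processes, the retention probability must vary with the point and the remaining configuration, which is exactly where the Papangelou conditional intensity enters in the paper.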

  10. Process Intensification: A Perspective on Process Synthesis

    DEFF Research Database (Denmark)

    Lutze, Philip; Gani, Rafiqul; Woodley, John

    2010-01-01

    In recent years, process intensification (PI) has attracted considerable academic interest as a potential means for process improvement, to meet the increasing demands for sustainable production. A variety of intensified operations developed in academia and industry creates a large number of options to potentially improve the process, but to identify the set of feasible solutions for PI in which the optimal can be found takes considerable resources. Hence, a process synthesis tool to achieve PI would potentially assist in the generation and evaluation of PI options. Currently, several process ... options is presented. For each step, different tools and methods will be needed. In this paper, a knowledge base tool storing and retrieving necessary information/data about intensified processes/equipment has been highlighted, including metrics for performance evaluation. The application...

  11. The permanental process

    DEFF Research Database (Denmark)

    McCullagh, Peter; Møller, Jesper

    2006-01-01

    We extend the boson process first to a large class of Cox processes and second to an even larger class of infinitely divisible point processes. Density and moment results are studied in detail. These results are obtained in closed form as weighted permanents, so the extension is called a permanental process. Temporal extensions and a particularly tractable case of the permanental process are also studied. Extensions of the fermion process along similar lines, leading to so-called determinantal processes, are discussed.

  12. Modeling of the Biotransformation Processes

    OpenAIRE

    Vrsalovic Presecki, A.; Findrik, Z.; Zelic, B.

    2006-01-01

    Modeling and simulation of biotransformation processes have a large potential in searching for optimal process conditions, development and process design, control, scale-up, identification of the process cost structure, and comparing process alternatives. Modeling and simulation lead to better understanding and quantification of the investigated process and could lead to significant material and cost savings, especially in the early phases of process development. In this review modeling and...

  13. Food Processing and Allergenicity

    NARCIS (Netherlands)

    Verhoeckx, K.; Vissers, Y.; Baumert, J.L.; Faludi, R.; Fleys, M.; Flanagan, S.; Herouet-Guicheney, C.; Holzhauser, T.; Shimojo, R.; Bolt, van der Nieke; Wichers, H.J.; Kimber, I.

    2015-01-01

    Food processing can have many beneficial effects. However, processing may also alter the allergenic properties of food proteins. A wide variety of processing methods is available and their use depends largely on the food to be processed.

    In this review, the impact of processing (heat and non...

  14. Special parallel processing workshop

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-12-01

    This report contains viewgraphs from the Special Parallel Processing Workshop. These viewgraphs deal with topics such as parallel processing performance, message passing, queue structure, and other basic concepts dealing with parallel processing.

  15. Contracting process innovation

    OpenAIRE

    Nissen, Mark E.

    2001-01-01

    Process innovation pertains to making dramatic improvements in the performance of enterprise processes. Stemming from total quality management, business process reengineering, and other widely accepted approaches to performance improvement in the business enterprise, process innovation has largely been practiced without a systematic method, requiring expertise possessed by only a few highly talented people. Now, as the result of research in this area, the process of process innovation has a...

  16. Product Development Process Modeling

    Institute of Scientific and Technical Information of China (English)

    1999-01-01

    The use of Concurrent Engineering and other modern methods of product development and maintenance requires that a large number of time-overlapped "processes" be performed by many people. However, successfully describing and optimizing these processes is becoming ever more difficult. The perspective of industrial process theory (the definition of process) and the perspective of process implementation (process transition, accumulation, and inter-operations between processes) are used to survey the method used to build one multi-view base process model.

  17. Integrated Process Capability Analysis

    Institute of Scientific and Technical Information of China (English)

    Chen, H.T.; Huang, M.L.; Hung, Y.H.; Chen, K.S.

    2002-01-01

    Process Capability Analysis (PCA) is a powerful tool to assess the ability of a process to manufacture product that meets specifications. A larger process capability index implies a higher process yield, and a larger process capability index also indicates a lower expected process loss. Chen et al. (2001) applied the indices Cpu, Cpl, and Cpk for evaluating the process capability of a multi-process product with smaller-the-better, larger-the-better, and nominal-the-best spec...
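
    As a brief illustration of the indices named in this abstract (standard textbook definitions, not code from the cited paper): Cp compares the specification width to the process spread, while Cpk takes the worse of the one-sided indices Cpu and Cpl and so penalizes an off-center process.

```python
def capability_indices(mu, sigma, lsl, usl):
    """Standard process capability indices for a process with mean mu and
    standard deviation sigma, given lower/upper specification limits."""
    cp = (usl - lsl) / (6.0 * sigma)   # potential capability (ignores centering)
    cpu = (usl - mu) / (3.0 * sigma)   # upper one-sided index
    cpl = (mu - lsl) / (3.0 * sigma)   # lower one-sided index
    cpk = min(cpu, cpl)                # actual capability (penalizes off-center)
    return cp, cpu, cpl, cpk

# A perfectly centered process: Cp == Cpk
print(capability_indices(mu=7.0, sigma=1.0, lsl=4.0, usl=10.0))
# → (1.0, 1.0, 1.0, 1.0)
```

    Shifting the mean off-center leaves Cp unchanged but lowers Cpk, which is why the abstract links the larger index to higher yield and lower expected loss.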

  18. From Process Understanding to Process Control

    NARCIS (Netherlands)

    Streefland, M.

    2010-01-01

    A licensed pharmaceutical process is required to be executed within the validated ranges throughout the lifetime of product manufacturing. Changes to the process usually require the manufacturer to demonstrate that the safety and efficacy of the product remain unchanged. Recent changes in the regulatory...

  19. Business Process Customization using Process Merging Techniques

    NARCIS (Netherlands)

    Bulanov, Pavel; Lazovik, Alexander; Aiello, Marco

    2012-01-01

    One of the important applications of service composition techniques lies in the field of business process management. Essentially, a business process can be considered as a composition of services, which is usually prepared by domain experts, and many tasks still have to be performed manually. These i...

  20. Idaho Chemical Processing Plant Process Efficiency improvements

    Energy Technology Data Exchange (ETDEWEB)

    Griebenow, B.

    1996-03-01

    In response to decreasing funding levels available to support activities at the Idaho Chemical Processing Plant (ICPP) and a desire to be cost competitive, the Department of Energy Idaho Operations Office (DOE-ID) and Lockheed Idaho Technologies Company have increased their emphasis on cost-saving measures. The ICPP Effectiveness Improvement Initiative involves many activities to improve cost effectiveness and competitiveness. This report documents the methodology and results of one of those cost-cutting measures, the Process Efficiency Improvement Activity. The Process Efficiency Improvement Activity performed a systematic review of major work processes at the ICPP to increase productivity and to identify nonvalue-added requirements. A two-phase approach was selected for the activity to allow for near-term implementation of relatively easy process modifications in the first phase while obtaining long-term continuous improvement in the second phase and beyond. Phase I of the initiative included a concentrated review of processes that had a high potential for cost savings with the intent of realizing savings in Fiscal Year 1996 (FY-96). Phase II consists of implementing long-term strategies too complex for Phase I implementation and evaluation of processes not targeted for Phase I review. The Phase II effort is targeted for realizing cost savings in FY-97 and beyond.

  1. Idaho Chemical Processing Plant Process Efficiency improvements

    International Nuclear Information System (INIS)

    In response to decreasing funding levels available to support activities at the Idaho Chemical Processing Plant (ICPP) and a desire to be cost competitive, the Department of Energy Idaho Operations Office (DOE-ID) and Lockheed Idaho Technologies Company have increased their emphasis on cost-saving measures. The ICPP Effectiveness Improvement Initiative involves many activities to improve cost effectiveness and competitiveness. This report documents the methodology and results of one of those cost-cutting measures, the Process Efficiency Improvement Activity. The Process Efficiency Improvement Activity performed a systematic review of major work processes at the ICPP to increase productivity and to identify nonvalue-added requirements. A two-phase approach was selected for the activity to allow for near-term implementation of relatively easy process modifications in the first phase while obtaining long-term continuous improvement in the second phase and beyond. Phase I of the initiative included a concentrated review of processes that had a high potential for cost savings with the intent of realizing savings in Fiscal Year 1996 (FY-96). Phase II consists of implementing long-term strategies too complex for Phase I implementation and evaluation of processes not targeted for Phase I review. The Phase II effort is targeted for realizing cost savings in FY-97 and beyond

  2. Spatial Process Generation

    OpenAIRE

    Kroese, Dirk P.; Botev, Zdravko I.

    2013-01-01

    The generation of random spatial data on a computer is an important tool for understanding the behavior of spatial processes. In this paper we describe how to generate realizations from the main types of spatial processes, including Gaussian and Markov random fields, point processes, spatial Wiener processes, and Levy fields. Concrete MATLAB code is provided.
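
    The simplest case covered by this paper, a homogeneous Poisson point process, can be generated in two steps: draw the point count from a Poisson distribution with mean λ × area, then scatter the points uniformly. The paper provides MATLAB code; the Python sketch below is our own illustration using only the standard library (Knuth's method for the Poisson draw, adequate for moderate λ).

```python
import math
import random

def sample_poisson(lam, rng):
    # Knuth's multiplicative method: multiply uniforms until the
    # running product drops below exp(-lam).
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def homogeneous_poisson_process(lam, rng):
    """One realization of a homogeneous Poisson point process with
    intensity lam on the unit square: Poisson count, uniform locations."""
    n = sample_poisson(lam * 1.0, rng)  # area of the unit square is 1
    return [(rng.random(), rng.random()) for _ in range(n)]

points = homogeneous_poisson_process(50.0, random.Random(7))
```

    Gaussian or Markov random fields and Lévy fields need considerably more machinery (e.g. circulant embedding), which is what the paper surveys.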

  3. Thin film processes

    CERN Document Server

    Vossen, John L

    1978-01-01

    Remarkable advances have been made in recent years in the science and technology of thin film processes for deposition and etching. It is the purpose of this book to bring together tutorial reviews of selected film deposition and etching processes from a process viewpoint. Emphasis is placed on the practical use of the processes to provide working guidelines for their implementation, a guide to the literature, and an overview of each process.

  4. Hyperspectral processing in graphical processing units

    Science.gov (United States)

    Winter, Michael E.; Winter, Edwin M.

    2011-06-01

    With the advent of the commercial 3D video card in the mid-1990s, we have seen an order-of-magnitude performance increase with each generation of new video cards. While these cards were designed primarily for visualization and video games, it became apparent after a short while that they could be used for scientific purposes. These Graphical Processing Units (GPUs) are rapidly being incorporated into data processing tasks usually reserved for general-purpose computers. It has been found that many image processing problems scale well to modern GPU systems. We have implemented four popular hyperspectral processing algorithms (N-FINDR, linear unmixing, Principal Components, and the RX anomaly detection algorithm). These algorithms show an across-the-board speedup of at least a factor of 10, with some special cases showing extreme speedups of a hundred times or more.

  5. Extension of the semi-empirical correlation for the effects of pipe diameter and internal surface roughness on the decompression wave speed to include High Heating Value Processed Gas mixtures

    International Nuclear Information System (INIS)

    The decompression wave speed, which is used throughout the pipeline industry in connection with the Battelle two-curve method for the control of propagating ductile fracture, is typically calculated using GASDECOM (GASDECOMpression). GASDECOM, developed in the 1970s, idealizes the decompression process as isentropic and one-dimensional, taking no account of pipe wall frictional effects or pipe diameter. Previous shock tube tests showed that decompression wave speeds in smaller diameter and rough pipes are consistently slower than those predicted by GASDECOM for the same conditions of mixture composition and initial pressure and temperature. Previous analysis based on perturbation theory and the fundamental momentum equation revealed a correction term to be subtracted from the ‘idealized’ value of the decompression speed calculated by GASDECOM. One parameter in this correction term involves a dynamic spatial pressure gradient of the outflow at the rupture location. While this is difficult to obtain without a shock tube or actual rupture test, data from 14 shock tube tests, as well as from 14 full-scale burst tests involving a variety of gas mixture compositions, were analyzed to correlate the variation of this pressure gradient with two characteristics of the gas mixture, namely: the molecular weight and the higher heating value (HHV). For lean to moderately rich gas mixes, the developed semi-empirical correlation was found to fit the experimentally determined decompression wave speed curve very well. For extremely rich gas mixes, such as High Heating Value Processed Gas (HHVPG) mixtures of HHV up to 58 MJ/m3, it was found to overestimate the correction term. Therefore, additional shock tube tests were conducted on HHVPG mixes, and the previously developed semi-empirical correlation was extended (revised) to account for such extremity in the richness of the gas mixtures. The newly developed semi-empirical correlation covers a wider range of natural gas

  6. Process innovation laboratory

    DEFF Research Database (Denmark)

    Møller, Charles

    2007-01-01

    Most organizations today are required not only to operate effective business processes but also to allow for changing business conditions at an increasing rate. Today nearly every business relies on its enterprise information systems (EIS) for process integration, and future generations of EIS will increasingly be driven by business process models. Consequently, business process modelling and improvement are becoming a serious challenge. The aim of this paper is to establish a conceptual framework for business process innovation (BPI) in the supply chain based on advanced EIS. The challenge is thus to create a new methodology for developing and exploring process models and applications. The paper outlines the process innovation laboratory as a new approach to BPI. The process innovation laboratory is a comprehensive framework and a collaborative workspace for experimenting with process models...

  7. Digital image processing.

    Science.gov (United States)

    Seeram, Euclid

    2004-01-01

    Digital image processing is now commonplace in radiology, nuclear medicine and sonography. This article outlines underlying principles and concepts of digital image processing. After completing this article, readers should be able to: List the limitations of film-based imaging. Identify major components of a digital imaging system. Describe the history and application areas of digital image processing. Discuss image representation and the fundamentals of digital image processing. Outline digital image processing techniques and processing operations used in selected imaging modalities. Explain the basic concepts and visualization tools used in 3-D and virtual reality imaging. Recognize medical imaging informatics as a new area of specialization for radiologic technologists. PMID:15352557

  8. Metallurgical process engineering

    Energy Technology Data Exchange (ETDEWEB)

    Yin, Ruiyu [Central Iron and Steel Research Institute (CISRI), Beijing (China)

    2011-07-01

    ''Metallurgical Process Engineering'' discusses large-scale integrated theory on the level of manufacturing production processes, putting forward concepts for exploring non-equilibrium and irreversible complex system. It emphasizes the dynamic and orderly operation of the steel plant manufacturing process, the major elements of which are the flow, process network and program. The book aims at establishing a quasi-continuous and continuous process system for improving several techno-economic indices, minimizing dissipation and enhancing the market competitiveness and sustainability of steel plants. The book is intended for engineers, researchers and managers in the fields of metallurgical engineering, industrial design, and process engineering. (orig.)

  9. The Integrated Renovation Process

    DEFF Research Database (Denmark)

    Galiotto, Nicolas; Heiselberg, Per; Knudstrup, Mary-Ann

    The Integrated Renovation Process (IRP) is a user customized methodology based on judiciously selected constructivist and interactive multi-criteria decision making methods (Galiotto, Heiselberg, & Knudstrup, 2014 (expected)). When applied for home renovation, the Integrated Renovation Process...

  10. Towards better process understanding

    DEFF Research Database (Denmark)

    Matero, Sanni Elina; van der Berg, Franciscus Winfried J; Poutiainen, Sami;

    2013-01-01

    The manufacturing of tablets involves many unit operations that possess multivariate and complex characteristics. The interactions between the material characteristics and process-related variation are presently not comprehensively analyzed due to univariate detection methods. As a consequence, current best practice to control a typical process is to not allow process-related factors to vary, i.e. to lock the production parameters. The problem related to the lack of sufficient process understanding is still there: the variation within process and material properties is an intrinsic feature and cannot be compensated for with constant process parameters. Instead, a more comprehensive approach based on the use of multivariate tools for investigating processes should be applied. In the pharmaceutical field these methods are referred to as Process Analytical Technology (PAT) tools that aim...

  11. Infrared processing of foods

    Science.gov (United States)

    Infrared (IR) processing of foods has been gaining popularity over conventional processing in several unit operations, including drying, peeling, baking, roasting, blanching, pasteurization, sterilization, disinfection, disinfestation, cooking, and popping. It has shown advantages over conventional...

  12. News: Process intensification

    Science.gov (United States)

    Conservation of materials and energy is a major objective of the philosophy of sustainability. Where production processes can be intensified to support these objectives, significant advances have been developed that benefit conservation as well as cost. Process intensification (PI) h...

  13. Dairy processing, Improving quality

    NARCIS (Netherlands)

    Smit, G.

    2003-01-01

    This book discusses raw milk composition, production and quality, and reviews developments in processing from hygiene and HACCP systems to automation, high-pressure processing and modified atmosphere packaging.

  14. Flavor changing lepton processes

    International Nuclear Information System (INIS)

    The flavor changing lepton processes, or in other words the lepton flavor changing processes, are described with emphasis on the updated theoretical motivations and the ongoing experimental progress on a new high-intensity muon source. (author)

  15. Drug Development Process

    Science.gov (United States)

    Step 1: Discovery and Development. Research for a new drug ...

  16. Modeling Design Process

    OpenAIRE

    TAKEDA, Hideaki; Veerkamp, Paul; Yoshikawa, Hiroyuki

    1990-01-01

    This article discusses building a computable design process model, which is a prerequisite for realizing intelligent computer-aided design systems. First, we introduce general design theory, from which a descriptive model of design processes is derived. In this model, the concept of metamodels plays a crucial role in describing the evolutionary nature of design. Second, we show a cognitive design process model obtained by observing design processes using a protocol analysis method. We then di...

  17. Modeling multiphase materials processes

    CERN Document Server

    Iguchi, Manabu

    2010-01-01

    "Modeling Multiphase Materials Processes: Gas-Liquid Systems" describes the methodology and application of physical and mathematical modeling to multi-phase flow phenomena in materials processing. The book focuses on systems involving gas-liquid interaction, the most prevalent in current metallurgical processes. The performance characteristics of these processes are largely dependent on transport phenomena. This volume covers the inherent characteristics that complicate the modeling of transport phenomena in such systems, including complex multiphase structure, intense turbulence, opacity of

  18. Grind hardening process

    CERN Document Server

    Salonitis, Konstantinos

    2015-01-01

    This book presents the grind-hardening process and the main studies published since it was introduced in 1990s.  The modelling of the various aspects of the process, such as the process forces, temperature profile developed, hardness profiles, residual stresses etc. are described in detail. The book is of interest to the research community working with mathematical modeling and optimization of manufacturing processes.

  19. Software Process Improvement Defined

    DEFF Research Database (Denmark)

    Aaen, Ivan

    2002-01-01

    This paper argues in favor of the development of explanatory theory on software process improvement. The last one or two decades' commitment to prescriptive approaches in software process improvement theory may contribute to the emergence of a gulf dividing theorists and practitioners. It is proposed that this divide be met by the development of theory evaluating prescriptive approaches and informing practice, with a focus on the software process policymaking and process control aspects of improvement efforts...

  20. Business Process Modeling Notation

    OpenAIRE

    Robert Flowers; Charles Edeki

    2013-01-01

    Models as representations of real world entities may appear to the novice to be an unnecessary abstraction. Indeed, in small organizations where there are relatively few processes, there is little need to abstract activities. When it comes to large organizations, with hundreds or thousands of processes, the creation of models becomes an essential activity. Even if the process itself does not change in the form of new process re-engineering efforts or new applications, there are new employees a...

  1. Dosimetry for radiation processing

    DEFF Research Database (Denmark)

    Miller, Arne

    1986-01-01

    During the past few years significant advances have taken place in the different areas of dosimetry for radiation processing, mainly stimulated by the increased interest in radiation for food preservation, plastic processing and sterilization of medical products. Reference services both ... and sterilization dosimetry, optichromic dosimeters in the shape of small tubes for food processing, and ESR spectroscopy of alanine for reference dosimetry. In this paper the special features of radiation processing dosimetry are discussed, several commonly used dosimeters are reviewed, and factors leading...

  2. Hawkes processes in finance

    OpenAIRE

    Emmanuel Bacry; Iacopo Mastromatteo; Jean-François Muzy

    2015-01-01

    In this paper we propose an overview of the recent academic literature devoted to the applications of Hawkes processes in finance. Hawkes processes constitute a particular class of multivariate point processes that has become very popular in empirical high-frequency finance over the last decade. After a reminder of the main definitions and properties that characterize Hawkes processes, we review their main empirical applications to address many different problems in high-frequency finance. Becaus...
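
    As a hedged sketch of the kind of model surveyed here, consider a univariate exponential-kernel Hawkes process with intensity λ(t) = μ + Σ_{t_i < t} α·e^{−β(t − t_i)}, simulated with Ogata's thinning algorithm. The details are standard, not taken from this paper: between events the intensity only decays, so its value just after the last jump is a valid dominating rate for rejection sampling.

```python
import math
import random

def simulate_hawkes(mu, alpha, beta, horizon, rng):
    """Ogata's thinning for a univariate Hawkes process with intensity
    lambda(t) = mu + sum over past events t_i of alpha * exp(-beta * (t - t_i)).
    Stationarity requires alpha / beta < 1."""
    events, t = [], 0.0
    while True:
        # The intensity decays between events, so its value at the current
        # time (just after any jump) bounds it until the next candidate.
        lam_bar = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        t += rng.expovariate(lam_bar)       # candidate from the dominating rate
        if t >= horizon:
            return events
        lam_t = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        if rng.random() <= lam_t / lam_bar:
            events.append(t)                # accept the candidate point

times = simulate_hawkes(mu=0.5, alpha=0.8, beta=1.2, horizon=100.0,
                        rng=random.Random(3))
```

    The self-exciting jump of size α after every accepted event is what produces the event clustering that makes these processes popular in high-frequency finance.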

  3. Stochastic Processes in Finance

    OpenAIRE

    Madan, Dilip B.

    2010-01-01

    Stochastic processes arising in the description of the risk-neutral evolution of equity prices are reviewed. Starting with Brownian motion, I review extensions to Lévy and Sato processes. These processes have independent increments; the former are homogeneous in time, whereas the latter are inhomogeneous. One-dimensional Markov processes such as local volatility and local Lévy are discussed next. Finally, I take up two forms of stochastic volatility that are due to either space scaling or tim...

  4. Mining processes in dentistry

    OpenAIRE

    Mans, R.S.; Reijers, H.A.; van Genuchten, M.J.I.M.; Wismeijer, D.

    2012-01-01

    Business processes in dentistry are quickly evolving towards "digital dentistry". This means that many steps in the dental process will increasingly deal with computerized information or computerized half products. A complicating factor in the improvement of process performance in dentistry, however, is the large number of independent dental professionals that are involved in the entire process. In order to reap the benefits of digital dentistry, it is essential to obtain an accurate view on ...

  5. Lithography process control

    CERN Document Server

    Levinson, Harry J

    1999-01-01

    This text covers lithography process control at several levels, from fundamental through advanced topics. The book is a self-contained tutorial that works both as an introduction to the technology and as a reference for the experienced lithographer. It reviews the foundations of statistical process control as background for advanced topics such as complex processes and feedback. In addition, it presents control methodologies that may be applied to process development pilot lines.

  6. Semisolid Metal Processing Consortium

    Energy Technology Data Exchange (ETDEWEB)

    Apelian,Diran

    2002-01-10

    Mathematical modeling and simulation of semisolid filling processes remains a critical issue in understanding and optimizing the process. Semisolid slurries are non-Newtonian materials that exhibit complex rheological behavior. Thus, the way these slurries flow in cavities is very different from the way liquid fills cavities in classical casting. In fact, filling in semisolid processing is often counter-intuitive.

  7. Multiphoton processes: conference proceedings

    International Nuclear Information System (INIS)

    The chapters of this volume represent the invited papers delivered at the conference. They are arranged according to thematic proximity, beginning with atoms and continuing with molecules and surfaces. Section headings include multiphoton processes in atoms, field fluctuations and collisions in multiphoton processes, and multiphoton processes in molecules and surfaces. Abstracts of individual items from the conference were prepared separately for the database.

  8. Auditory processing models

    DEFF Research Database (Denmark)

    Dau, Torsten

    2008-01-01

    The Handbook of Signal Processing in Acoustics will compile the techniques and applications of signal processing as they are used in the many varied areas of Acoustics. The Handbook will emphasize the interdisciplinary nature of signal processing in acoustics. Each Section of the Handbook will pr...

  9. How yogurt is processed

    Science.gov (United States)

    This month’s Processing column on the theme of “How Is It Processed?” focuses on yogurt. Yogurt is known for its health-promoting properties. This column will provide a brief overview of the history of yogurt and the current market. It will also unveil both traditional and modern yogurt processing t...

  10. Silicon production process evaluations

    Science.gov (United States)

    1982-01-01

    Chemical engineering analyses involving the preliminary process design of a plant (1,000 metric tons/year capacity) to produce silicon via the technology under consideration were accomplished. Major activities in the chemical engineering analyses included base case conditions, reaction chemistry, process flowsheet, material balance, energy balance, property data, equipment design, major equipment list, production labor, and forwarding of results for economic analysis. The process design package provided detailed data for raw materials, utilities, major process equipment and production labor requirements necessary for polysilicon production in each process.

  11. Characterization of concurrent processing

    Science.gov (United States)

    Utku, S.; Melosh, R.; Salama, M.

    1985-01-01

    Computer architectures designed for concurrent processing are characterized by the number of processing elements, ensemble speed, random access memory, input/output routes, and modes of operation. The important attributes of processing tasks are then identified, and some processing stratagems are examined. It is shown that the greater the complexity of a given task, the wider the range of possible stratagems which can accomplish the task. For relatively simple tasks, the optimum stratagem can be found by analytical reasoning. For more complex tasks, however, optimum scheduling techniques may have to be employed for the assignment of segments of the task to the available processing elements.

  12. Biomass process handbook

    Energy Technology Data Exchange (ETDEWEB)

    1983-01-01

    Descriptions are given of 42 processes which use biomass to produce chemical products. Marketing and economic background, process description, flow sheets, costs, major equipment, and availability of technology are given for each of the 42 processes. Some of the chemicals discussed are: ethanol, ethylene, acetaldehyde, butanol, butadiene, acetone, citric acid, gluconates, itaconic acid, lactic acid, xanthan gum, sorbitol, starch polymers, fatty acids, fatty alcohols, glycerol, soap, azelaic acid, pelargonic acid, nylon-11, jojoba oil, furfural, furfuryl alcohol, tetrahydrofuran, cellulose polymers, products from pulping wastes, and methane. Processes include acid hydrolysis, enzymatic hydrolysis, fermentation, distillation, the Purox process, and anaerobic digestion.

  13. Data processing made simple

    CERN Document Server

    Wooldridge, Susan

    2013-01-01

    Data Processing: Made Simple, Second Edition presents discussions of a number of trends and developments in the world of commercial data processing. The book covers the rapid growth of micro- and mini-computers for both home and office use; word processing and the 'automated office'; the advent of distributed data processing; and the continued growth of database-oriented systems. The text also discusses modern digital computers; fundamental computer concepts; information and data processing requirements of commercial organizations; and the historical perspective of the computer industry. The

  14. Evaluation of steelmaking processes

    Energy Technology Data Exchange (ETDEWEB)

    Fruehan, R.J. [Carnegie-Mellon Univ., Pittsburgh, PA (United States)

    1994-01-01

    The objective of the AISI Direct Steelmaking Program is to develop a process for producing steel directly from ore and coal; the process should be less capital intensive, consume less energy, and have higher productivity. A task force was formed to examine available processes: trough, posthearth, IRSID, Electric Arc Furnace, energy optimizing furnace. It is concluded that there is insufficient incentive to replace a working BOF with any of these processes to refine hot metal; however, if new steelmaking capacity is required, IRSID and EOF should be considered. A fully continuous process should not be considered until direct ironmaking and continuous refining are perfected.

  15. Colloid process engineering

    CERN Document Server

    Peukert, Wolfgang; Rehage, Heinz; Schuchmann, Heike

    2015-01-01

    This book deals with colloidal systems in technical processes and the influence of technical processes on colloidal systems. It explores how new measurement capabilities can offer the potential for a dynamic development of science and engineering, and examines the origin of colloidal systems and their use for new products. The future challenges to colloidal process engineering are the development of appropriate equipment and processes for the production and obtainment of multi-phase structures and energetic interactions in market-relevant quantities. The book explores the relevant processes for controlled production and how they can be used across all scales.

  16. Thin film processes II

    CERN Document Server

    Kern, Werner

    1991-01-01

    This sequel to the 1978 classic, Thin Film Processes, gives a clear, practical exposition of important thin film deposition and etching processes that have not yet been adequately reviewed. It discusses selected processes in tutorial overviews with implementation guidelines and an introduction to the literature. Though edited to stand alone, when taken together, Thin Film Processes II and its predecessor present a thorough grounding in modern thin film techniques. Key Features: * Provides an all-new sequel to the 1978 classic, Thin Film Processes * Introduces new topics, and sever

  17. Theory of Markov processes

    CERN Document Server

    Dynkin, E B

    1960-01-01

    Theory of Markov Processes provides information pertinent to the logical foundations of the theory of Markov random processes. This book discusses the properties of the trajectories of Markov processes and their infinitesimal operators.Organized into six chapters, this book begins with an overview of the necessary concepts and theorems from measure theory. This text then provides a general definition of Markov process and investigates the operations that make possible an inspection of the class of Markov processes corresponding to a given transition function. Other chapters consider the more c

  18. Dynamical laser spike processing

    CERN Document Server

    Shastri, Bhavin J; Tait, Alexander N; Rodriguez, Alejandro W; Wu, Ben; Prucnal, Paul R

    2015-01-01

    Novel materials and devices in photonics have the potential to revolutionize optical information processing, beyond conventional binary-logic approaches. Laser systems offer a rich repertoire of useful dynamical behaviors, including the excitable dynamics also found in the time-resolved "spiking" of neurons. Spiking reconciles the expressiveness and efficiency of analog processing with the robustness and scalability of digital processing. We demonstrate that graphene-coupled laser systems offer a unified low-level spike optical processing paradigm that goes well beyond previously studied laser dynamics. We show that this platform can simultaneously exhibit logic-level restoration, cascadability and input-output isolation---fundamental challenges in optical information processing. We also implement low-level spike-processing tasks that are critical for higher level processing: temporal pattern detection and stable recurrent memory. We study these properties in the context of a fiber laser system, but the addit...

  19. Advanced coal-using community systems. Task 1A. Technology characteristics. Volume 1. Fuel- and energy-production systems

    Energy Technology Data Exchange (ETDEWEB)

    Tison, R.R.; Blazek, C.F.; Biederman, N.P.; Malik, N.J.; Gamze, M.G.; Wetterstrom, D.; Diskant, W.; Malfitani, L.

    1979-03-01

    This report is presented in 2 volumes. It contains descriptions of engineering characterizations and equipment used in coal processing, fuel and energy distribution, storage, and end-use utilization. Volume 1 contains 4 chapters dealing with: coal conversion processes (high- and low-Btu gas from coal and coal-to-liquid fuels); coal cleaning and direct combustion (pretreating, direct combustion, and stack gas cleaning); electricity production (compression-ignition engines, turbines, combined-cycle, fuel cells, alternative Rankine cycles, Stirling cycles, and closed Brayton cycles); and thermal generating processes (steam plants, direct-contact steam-heated hot water systems, thermal liquid plants, absorption chillers, and centrifugal chillers). (DMC)

  20. Improving the performance of conventional and column froth flotation cells

    Energy Technology Data Exchange (ETDEWEB)

    Arnold, B.J. [CQ Inc., Homer City, PA (United States)

    1995-11-01

    Many existing mining operations hover on the brink of producing competitively priced fuel with marginally acceptable sulfur levels. To remain competitive, these operations need to improve the yield of their coal processing facilities, lower the sulfur content of their clean coal, or lower the ash content of their clean coal. Fine coal cleaning processes offer the best opportunity for coal producers to increase their yield of high quality product. Over 200 coal processing plants in the U.S. already employ some type of conventional or column flotation device to clean fines. An increase in efficiency in these existing circuits could be the margin required to make these coal producers competitive.

  1. Process monitor gratings

    Science.gov (United States)

    Brunner, T. A.; Ausschnitt, C. P.

    2007-03-01

    Despite the increasing use of advanced imaging methods to pattern chip features, process windows continue to shrink with decreasing critical dimensions. Controlling the manufacturing process within these shrinking windows requires monitor structures designed to maximize both sensitivity and robustness. In particular, monitor structures must exhibit a large, measurable response to dose and focus changes over the entire range of the critical features process window. Any process variations present fundamental challenges to the effectiveness of OPC methods, since the shape compensation assumes a repeatable process. One particular process parameter which is under increasing scrutiny is focus blur, e.g. from finite laser bandwidth, which can cause such OPC instability, and thereby damage pattern fidelity. We introduce a new type of test target called the Process Monitor Grating (PMG) which is designed for extreme sensitivity to process variation. The PMG design principle is to use assist features to zero out higher diffraction orders. We show via simulation and experiment that such structures are indeed very sensitive to process variation. In addition, PMG targets have other desirable attributes such as mask manufacturability, robustness to pattern collapse, and compatibility with standard CD metrology methods such as scatterometry. PMG targets are applicable to the accurate determination of dose and focus deviations, and in combination with an isofocal grating target, allow the accurate determination of focus blur. The methods shown in this paper are broadly applicable to the characterization of process deviations using test wafers or to the control of product using kerf structures.

  2. Future Steelmaking Processes

    Energy Technology Data Exchange (ETDEWEB)

    Prof. R. J. Fruehan

    2004-09-20

    There is an increasing demand for an ironmaking process with lower capital cost, energy consumption and emissions than a blast furnace. It is the hypothesis of the present work that an optimized combination of two reasonably proven technologies will greatly enhance the overall process. An example is a rotary hearth furnace (RHF) linked to a smelter (e.g., AISI, HIsmelt). The objective of this research is to select promising process combinations, develop energy, materials balance and productivity models for the individual processes, conduct a limited amount of basic research on the processes and evaluate the process combinations. Three process combinations were selected with input from the industrial partners. The energy-materials and productivity models for the RHF, smelter, submerged arc furnace and CIRCOFER were developed. Since utilization of volatiles in coal is critical for energy and CO2 emission reduction, basic research on this topic was also conducted. The process models developed are a major product of this research. These models can be used for process evaluation by the industry. The process combinations of an RHF-Smelter and a simplified CIRCOFER-Smelter appear to be promising. Energy consumption is reduced and productivity increased. Work on this project is continuing using funds from other sources.

  3. Badge Office Process Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Haurykiewicz, John Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Dinehart, Timothy Grant [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Parker, Robert Young [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-12

    The purpose of this process analysis was to analyze the Badge Offices’ current processes from a systems perspective and consider ways of pursuing objectives set forth by SEC-PS, namely increased customer flow (throughput) and reduced customer wait times. Information for the analysis was gathered for the project primarily through Badge Office Subject Matter Experts (SMEs), and in-person observation of prevailing processes. Using the information gathered, a process simulation model was constructed to represent current operations and allow assessment of potential process changes relative to factors mentioned previously. The overall purpose of the analysis was to provide SEC-PS management with information and recommendations to serve as a basis for additional focused study and areas for potential process improvements in the future.

  4. Spitzer Telemetry Processing System

    Science.gov (United States)

    Stanboli, Alice; Martinez, Elmain M.; McAuley, James M.

    2013-01-01

    The Spitzer Telemetry Processing System (SirtfTlmProc) was designed to address objectives of JPL's Multi-mission Image Processing Lab (MIPL) in processing spacecraft telemetry and distributing the resulting data to the science community. To minimize costs and maximize operability, the software design focused on automated error recovery, performance, and information management. The system processes telemetry from the Spitzer spacecraft and delivers Level 0 products to the Spitzer Science Center. SirtfTlmProc is a unique system with automated error notification and recovery, with a real-time continuous service that can go quiescent after periods of inactivity. The software can process 2 GB of telemetry and deliver Level 0 science products to the end user in four hours. It provides analysis tools so the operator can manage the system and troubleshoot problems. It automates telemetry processing in order to reduce staffing costs.

  5. Linearity in Process Languages

    DEFF Research Database (Denmark)

    Nygaard, Mikkel; Winskel, Glynn

    2002-01-01

    The meaning and mathematical consequences of linearity (managing without a presumed ability to copy) are studied for a path-based model of processes which is also a model of affine-linear logic. This connection yields an affine-linear language for processes, automatically respecting open-map bisimulation, in which a range of process operations can be expressed. An operational semantics is provided for the tensor fragment of the language. Different ways to make assemblies of processes lead to different choices of exponential, some of which respect bisimulation.

  6. Multimodal Processes Rescheduling

    DEFF Research Database (Denmark)

    Bocewicz, Grzegorz; Banaszak, Zbigniew A.; Nielsen, Peter;

    2013-01-01

    Cyclic scheduling problems concerning multimodal processes are usually observed in FMSs producing multi-type parts where the Automated Guided Vehicles System (AGVS) plays the role of a material handling system. Schedulability analysis of concurrently flowing cyclic processes (SCCP) executed in these kinds of systems can be considered using a declarative modeling framework. The proposed representation provides a unified way for performance evaluation of local cyclic processes as well as the multimodal processes supported by them. The main question regards the reachability of a SCCP cyclic behavior. In this context, sufficient conditions guaranteeing the reachability of both local and multimodal processes' cyclic steady state spaces are discussed.

  7. NASA Hazard Analysis Process

    Science.gov (United States)

    Deckert, George

    2010-01-01

    This viewgraph presentation reviews the NASA Hazard Analysis process. The contents include: 1) Significant Incidents and Close Calls in Human Spaceflight; 2) Subsystem Safety Engineering Through the Project Life Cycle; 3) The Risk Informed Design Process; 4) Types of NASA Hazard Analysis; 5) Preliminary Hazard Analysis (PHA); 6) Hazard Analysis Process; 7) Identify Hazardous Conditions; 8) Consider All Interfaces; 9) Work a Preliminary Hazard List; 10) NASA Generic Hazards List; and 11) Final Thoughts

  8. Basic digital signal processing

    CERN Document Server

    Lockhart, Gordon B

    1985-01-01

    Basic Digital Signal Processing describes the principles of digital signal processing and experiments with BASIC programs involving the fast Fourier transform (FFT). The book reviews the fundamentals of the BASIC program, continuous and discrete time signals including analog signals, Fourier analysis, discrete Fourier transform, signal energy, and power. The text also explains digital signal processing involving digital filters, linear time-invariant systems, discrete time unit impulse, discrete-time convolution, and the alternative structure for second order infinite impulse response (IIR) sections.

  9. Jointly Poisson processes

    CERN Document Server

    Johnson, D H

    2009-01-01

    What constitutes jointly Poisson processes remains an unresolved issue. This report reviews the current state of the theory and indicates how the accepted but unproven model equals that resulting from the small time-interval limit of jointly Bernoulli processes. One intriguing consequence of these models is that jointly Poisson processes can only be positively correlated as measured by the correlation coefficient defined by cumulants of the probability generating functional.

  10. Process Improvement Essentials

    CERN Document Server

    Persse, James R

    2006-01-01

    Process Improvement Essentials combines the foundation needed to understand process improvement theory with the best practices to help individuals implement process improvement initiatives in their organization. The three leading programs: ISO 9001:2000, CMMI, and Six Sigma--amidst the buzz and hype--tend to get lumped together under a common label. This book delivers a combined guide to all three programs, compares their applicability, and then sets the foundation for further exploration.

  11. Hierarchical Dirichlet Scaling Process

    OpenAIRE

    Kim, Dongwoo; Oh, Alice

    2014-01-01

    We present the hierarchical Dirichlet scaling process (HDSP), a Bayesian nonparametric mixed membership model. The HDSP generalizes the hierarchical Dirichlet process (HDP) to model the correlation structure between metadata in the corpus and mixture components. We construct the HDSP based on the normalized gamma representation of the Dirichlet process, and this construction allows incorporating a scaling function that controls the membership probabilities of the mixture components. ...

  12. Operating System Process Schedulers

    OpenAIRE

    ŠEKORANJA, MATEJ

    2016-01-01

    Process scheduling is one of the key tasks of every operating system. Proper implementation of a scheduler is reflected in a system's responsiveness, especially when processes require execution in real-time. Multimedia playback is one of these processes, and also one of the most common operating system tasks nowadays. In the beginning of this thesis, I present the theoretical basics of scheduling: its goals, different scheduling types and basic algorithms. I cover scheduling in single-proces...

  13. Polygon mesh processing

    CERN Document Server

    Botsch, Mario; Pauly, Mark; Alliez, Pierre; Levy, Bruno

    2010-01-01

    Geometry processing, or mesh processing, is a fast-growing area of research that uses concepts from applied mathematics, computer science, and engineering to design efficient algorithms for the acquisition, reconstruction, analysis, manipulation, simulation, and transmission of complex 3D models. Applications of geometry processing algorithms already cover a wide range of areas from multimedia, entertainment, and classical computer-aided design, to biomedical computing, reverse engineering, and scientific computing. Over the last several years, triangle meshes have become increasingly popular,

  14. Business process transformation

    CERN Document Server

    Grover, Varun

    2015-01-01

    Featuring contributions from prominent thinkers and researchers, this volume in the ""Advances in Management Information Systems"" series provides a rich set of conceptual, empirical, and introspective studies that epitomize fundamental knowledge in the area of Business Process Transformation. Processes are interpreted broadly to include operational and managerial processes within and between organizations, as well as those involved in knowledge generation. Transformation includes radical and incremental change, its conduct, management, and outcome. The editors and contributing authors pay clo

  15. Multiphoton processes: conference proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Lambropoulos, P.; Smith, S.J. (eds.)

    1984-01-01

    The chapters of this volume represent the invited papers delivered at the conference. They are arranged according to thematic proximity, beginning with atoms and continuing with molecules and surfaces. Section headings include multiphoton processes in atoms, field fluctuations and collisions in multiphoton processes, and multiphoton processes in molecules and surfaces. Abstracts of individual items from the conference were prepared separately for the data base. (GHT)

  16. The Recruitment Process:

    DEFF Research Database (Denmark)

    Holm, Anna

    The aim of this research was to determine whether the introduction of e-recruitment has an impact on the process and underlying tasks, subtasks and activities of recruitment. Three large organizations with well-established e-recruitment practices were included in the study. The three case studies, which were carried out in Denmark in 2008-2009 using qualitative research methods, revealed changes in the sequence, divisibility and repetitiveness of a number of recruitment tasks and subtasks. The new recruitment process design was identified and presented in the paper. The study concluded that the main task of the process shifted from processing applications to communicating with candidates.

  17. The Critical Design Process

    DEFF Research Database (Denmark)

    Brunsgaard, Camilla; Knudstrup, Mary-Ann; Heiselberg, Per

    2014-01-01

    within Danish tradition of architecture and construction. The objective of the research presented in this paper is to compare the different design processes behind the making of passive houses in a Danish context. We evaluated the process with regard to the integrated and traditional design process. Data analysis showed that the majority of the consortiums worked in an integrated manner, though there was room for improvement. Additionally, the paper discusses the challenges of implementing the integrated design process in practice and suggests ways of overcoming some of the barriers. In doing so...

  18. Living olefin polymerization processes

    Science.gov (United States)

    Schrock, Richard R.; Baumann, Robert

    1999-01-01

    Processes for the living polymerization of olefin monomers with terminal carbon-carbon double bonds are disclosed. The processes employ initiators that include a metal atom and a ligand having two group 15 atoms and a group 16 atom or three group 15 atoms. The ligand is bonded to the metal atom through two anionic or covalent bonds and a dative bond. The initiators are particularly stable under reaction conditions in the absence of olefin monomer. The processes provide polymers having low polydispersities, especially block copolymers having low polydispersities. It is an additional advantage of these processes that, during block copolymer synthesis, a relatively small amount of homopolymer is formed.

  19. Financial information processing

    Institute of Scientific and Technical Information of China (English)

    Shuo BAI; Shouyang WANG; Lean YU; Aoying ZHOU

    2009-01-01

    The rapid growth in financial data volume has made financial information processing more and more difficult due to the increase in complexity, which has forced businesses and academics alike to turn to sophisticated information processing technologies for better solutions. A typical feature is that high-performance computers and advanced computational techniques play ever-increasingly important roles for business and industries to have competitive advantages. Accordingly, financial information processing has emerged as a new cross-disciplinary field integrating computer science, mathematics, financial economics, intelligent techniques, and computer simulations to make different decisions based on processed financial information.

  20. Process modeling style

    CERN Document Server

    Long, John

    2014-01-01

    Process Modeling Style focuses on aspects of process modeling beyond notation that are very important to practitioners. Many people who model processes focus on the specific notation used to create their drawings. While that is important, there are many other aspects to modeling, such as naming, creating identifiers, descriptions, interfaces, patterns, and creating useful process documentation. Experienced author John Long focuses on those non-notational aspects of modeling, which practitioners will find invaluable. Gives solid advice for creating roles, work produ

  1. Acoustic Signal Processing

    Science.gov (United States)

    Hartmann, William M.; Candy, James V.

    Signal processing refers to the acquisition, storage, display, and generation of signals - also to the extraction of information from signals and the re-encoding of information. As such, signal processing in some form is an essential element in the practice of all aspects of acoustics. Signal processing algorithms enable acousticians to separate signals from noise, to perform automatic speech recognition, or to compress information for more efficient storage or transmission. Signal processing concepts are the building blocks used to construct models of speech and hearing. Now, in the 21st century, all signal processing is effectively digital signal processing. Widespread access to high-speed processing, massive memory, and inexpensive software make signal processing procedures of enormous sophistication and power available to anyone who wants to use them. Because advanced signal processing is now accessible to everybody, there is a need for primers that introduce basic mathematical concepts that underlie the digital algorithms. The present handbook chapter is intended to serve such a purpose.

  2. Cooperative internal conversion process

    CERN Document Server

    Kálmán, Péter

    2015-01-01

    A new phenomenon, called the cooperative internal conversion process, in which the coupling of bound-free electron and neutron transitions due to the dipole term of their Coulomb interaction permits cooperation of two nuclei leading to neutron exchange if it is allowed by energy conservation, is discussed theoretically. A general expression for the cross section of the process is reported in one-particle nuclear and spherical shell models as well as in the case of free atoms (e.g. noble gases). A half-life characteristic of the process is also determined. The case of Ne is investigated numerically. The process may have significance in the fields of nuclear waste disposal and nuclear energy production.

  3. Fuels Processing Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — NETL’s Fuels Processing Laboratory in Morgantown, WV, provides researchers with the equipment they need to thoroughly explore the catalytic issues associated with...

  4. Chemical process hazards analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-02-01

    The Office of Worker Health and Safety (EH-5) under the Assistant Secretary for the Environment, Safety and Health of the US Department of Energy (DOE) has published two handbooks for use by DOE contractors managing facilities and processes covered by the Occupational Safety and Health Administration (OSHA) Rule for Process Safety Management of Highly Hazardous Chemicals (29 CFR 1910.119), herein referred to as the PSM Rule. The PSM Rule contains an integrated set of chemical process safety management elements designed to prevent chemical releases that can lead to catastrophic fires, explosions, or toxic exposures. The purpose of the two handbooks, "Process Safety Management for Highly Hazardous Chemicals" and "Chemical Process Hazards Analysis," is to facilitate implementation of the provisions of the PSM Rule within the DOE. The purpose of this handbook, "Chemical Process Hazards Analysis," is to facilitate, within the DOE, the performance of chemical process hazards analyses (PrHAs) as required under the PSM Rule. It provides basic information for the performance of PrHAs, and should not be considered a complete resource on PrHA methods. Likewise, to determine if a facility is covered by the PSM Rule, the reader should refer to the handbook "Process Safety Management for Highly Hazardous Chemicals" (DOE-HDBK-1101-96). Promulgation of the PSM Rule has heightened the awareness of chemical safety management issues within the DOE. This handbook is intended for use by DOE facilities and processes covered by the PSM Rule to facilitate contractor implementation of the PrHA element of the PSM Rule. However, contractors whose facilities and processes are not covered by the PSM Rule may also use this handbook as a basis for conducting process hazards analyses as part of their good management practices. This handbook explains the minimum requirements for PrHAs outlined in the PSM Rule. Nowhere have requirements been added beyond what is specifically required by the rule.

  5. EARSEC SAR processing system

    Science.gov (United States)

    Protheroe, Mark; Sloggett, David R.; Sieber, Alois J.

    1994-12-01

    Traditionally, the production of high quality Synthetic Aperture Radar imagery has been an area where a potential user would have to expend large amounts of money on either the bespoke development of a processing chain dedicated to his requirements or the purchase of a dedicated hardware platform adapted using accelerator boards and enhanced memory management. Whichever option the user adopted, there were limitations based on the desire for a realistic throughput in data load and time. The user had a choice, made early in the purchase, between a system that adopted innovative algorithmic manipulation to limit the processing time, or the purchase of expensive hardware. The former limits the quality of the product, while the latter excludes the user from any visibility into the processing chain. Clearly there was a need for a SAR processing architecture that gave the user a choice of methodology for a particular processing sequence, allowing him to decide on either a quick (lower quality) product or a detailed slower (high quality) product, without having to change the algorithmic base of his processor or the hardware platform. The European Commission, through the Advanced Techniques unit of the Joint Research Centre (JRC) Institute for Remote Sensing at Ispra in Italy, realizing the limitations on current processing abilities, initiated its own program to build airborne SAR and Electro-Optical (EO) sensor systems. This program is called the European Airborne Remote Sensing Capabilities (EARSEC) program. This paper describes the processing system developed for the airborne SAR sensor system. The paper considers the requirements for the system and the design of the EARSEC Airborne SAR Processing System. It highlights the development of an open SAR processing architecture where users have full access to intermediate products that arise from each of the major processing stages. It also describes the main processing stages in the overall

  6. Facilitated exclusion process and Pfaffian Schur processes

    OpenAIRE

    Baik, Jinho; Barraquand, Guillaume; Corwin, Ivan; Suidan, Toufic

    2016-01-01

    We study the Facilitated TASEP, an interacting particle system on the one dimensional integer lattice. We prove that starting from step initial condition, the position of the rightmost particle has Tracy Widom GSE statistics on a cube root time scale, while the statistics in the bulk of the rarefaction fan are GUE. This uses a mapping with last-passage percolation in a half-quadrant, which we study using the formalism of Pfaffian Schur processes. For the model with exponential weights, we pro...

  7. Modeling soil processes

    NARCIS (Netherlands)

    Vereecken, H.; Schnepf, A.; Hopmans, J.W.; Javaux, M.; Or, D.; Roose, T.; Vanderborght, J.; Young, M.H.; Amelung, W.; Aitkenhead, M.; Allison, S.D.; Assouline, S.; Baveye, P.; Berli, M.; Brüggemann, N.; Finke, P.; Flury, M.; Gaiser, T.; Govers, G.; Ghezzehei, T.; Hallett, P.; Hendricks Franssen, H.J.; Heppell, J.; Horn, R.; Huisman, J.A.; Jacques, D.; Jonard, F.; Kollet, S.; Lafolie, F.; Lamorski, K.; Leitner, D.; Mcbratney, A.; Minasny, B.; Montzka, C.; Nowak, W.; Pachepsky, Y.; Padarian, J.; Romano, N.; Roth, K.; Rothfuss, Y.; Rowe, E.C.; Schwen, A.; Šimůnek, J.; Tiktak, A.; Dam, van Jos; Zee, van der S.E.A.T.M.; Vogel, H.J.; Vrugt, J.A.; Wöhling, T.; Wöhling, T.; Young, I.M.

    2016-01-01

    The remarkable complexity of soil and its importance to a wide range of ecosystem services presents major challenges to the modeling of soil processes. Although major progress in soil models has occurred in the last decades, models of soil processes remain disjointed between disciplines or ecosys

  8. Shell Higher Olefins Process.

    Science.gov (United States)

    Lutz, E. F.

    1986-01-01

    Shows how olefin isomerization and the exotic olefin metathesis reaction can be harnessed in industrial processes. Indicates that the Shell Higher Olefins Process makes use of organometallic catalysts to manufacture alpha-olefins and internal carbon-11 through carbon-14 alkenes in a flexible fashion that can be adjusted to market needs. (JN)

  9. Kuhlthau's Information Search Process.

    Science.gov (United States)

    Shannon, Donna

    2002-01-01

    Explains Kuhlthau's Information Search Process (ISP) model which is based on a constructivist view of learning and provides a framework for school library media specialists for the design of information services and instruction. Highlights include a shift from library skills to information skills; attitudes; process approach; and an interview with…

  10. Ultrahigh bandwidth signal processing

    DEFF Research Database (Denmark)

    Oxenløwe, Leif Katsuo

    2016-01-01

    Optical time lenses have proven to be very versatile for advanced optical signal processing. Based on a controlled interplay between dispersion and phase-modulation by e.g. four-wave mixing, the processing is phase-preserving, and hence useful for all types of data signals including coherent multi...

  11. The Analytical Hierarchy Process

    DEFF Research Database (Denmark)

    Barfod, Michael Bruhn

    2007-01-01

    The technical note gathers the theory behind the Analytical Hierarchy Process (AHP) and presents its advantages and disadvantages in practical use.
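
    For readers unfamiliar with AHP, the core computation is small: priority weights are the principal eigenvector of a positive reciprocal pairwise-comparison matrix, checked with Saaty's consistency ratio. The sketch below is a standard illustration of that computation, not code from the technical note; the function name and the example matrix are our own.

```python
def ahp_weights(matrix, iters=100):
    """Priority weights via power iteration on a pairwise-comparison
    matrix, plus Saaty's consistency ratio (CR = CI / RI)."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]              # normalize to sum 1
    aw = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam = sum(aw[i] / w[i] for i in range(n)) / n   # principal eigenvalue
    ci = (lam - n) / (n - 1)                        # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.0)    # Saaty's random indices
    return w, ci / ri

# Example: 3 criteria, matrix[i][j] = importance of i relative to j
A = [[1, 3, 5],
     [1/3, 1, 3],
     [1/5, 1/3, 1]]
weights, cr = ahp_weights(A)   # cr < 0.1 indicates acceptable consistency
```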

  12. Dosimetry in process control

    International Nuclear Information System (INIS)

    Measurement of absorbed dose and dose distribution in irradiated medical products relies on the use of quality dosimetry systems, trained personnel and a thorough understanding of the energy deposition process. The interrelationship of these factors will be discussed with emphasis on the current and future practices of process control dosimetry. (author)

  13. Contaminated nickel scrap processing

    International Nuclear Information System (INIS)

    The DOE will soon choose between treating contaminated nickel scrap as a legacy waste and developing high-volume nickel decontamination processes. In addition to reducing the volume of legacy wastes, a decontamination process could make 200,000 tons of this strategic metal available for domestic use. Contaminants in DOE nickel scrap include 234Th, 234Pa, 137Cs, 239Pu (trace), 60Co, U, 99Tc, and 237Np (trace). This report reviews several industrial-scale processes -- electrorefining, electrowinning, vapormetallurgy, and leaching -- used for the purification of nickel. Conventional nickel electrolysis processes are particularly attractive because they use side-stream purification of process solutions to improve the purity of nickel metal. Additionally, nickel purification by electrolysis is effective in a variety of electrolyte systems, including sulfate, chloride, and nitrate. Conventional electrorefining processes typically use a mixed electrolyte which includes sulfate, chloride, and borate. The use of an electrorefining or electrowinning system for scrap nickel recovery could be combined effectively with a variety of processes, including cementation, solvent extraction, ion exchange, complex-formation, and surface sorption, developed for uranium and transuranic purification. Selected processes were reviewed and evaluated for use in nickel side-stream purification. 80 refs

  14. Mineral Processing Technology Roadmap

    Energy Technology Data Exchange (ETDEWEB)

    none,

    2000-09-01

    This document represents the roadmap for Processing Technology Research in the US Mining Industry. It was developed based on the results of a Processing Technology Roadmap Workshop sponsored by the National Mining Association in conjunction with the US Department of Energy, Office of Energy Efficiency and Renewable Energy, Office of Industrial Technologies. The Workshop was held January 24 - 25, 2000.

  15. Food processing and allergenicity.

    Science.gov (United States)

    Verhoeckx, Kitty C M; Vissers, Yvonne M; Baumert, Joseph L; Faludi, Roland; Feys, Marcel; Flanagan, Simon; Herouet-Guicheney, Corinne; Holzhauser, Thomas; Shimojo, Ryo; van der Bolt, Nieke; Wichers, Harry; Kimber, Ian

    2015-06-01

    Food processing can have many beneficial effects. However, processing may also alter the allergenic properties of food proteins. A wide variety of processing methods is available and their use depends largely on the food to be processed. In this review the impact of processing (heat and non-heat treatment) on the allergenic potential of proteins, and on the antigenic (IgG-binding) and allergenic (IgE-binding) properties of proteins has been considered. A variety of allergenic foods (peanuts, tree nuts, cows' milk, hens' eggs, soy, wheat and mustard) have been reviewed. The overall conclusion drawn is that processing does not completely abolish the allergenic potential of allergens. Currently, only fermentation and hydrolysis may have potential to reduce allergenicity to such an extent that symptoms will not be elicited, while other methods might be promising but need more data. Literature on the effect of processing on allergenic potential and the ability to induce sensitisation is scarce. This is an important issue since processing may impact on the ability of proteins to cause the acquisition of allergic sensitisation, and the subject should be a focus of future research. Also, there remains a need to develop robust and integrated methods for the risk assessment of food allergenicity.

  16. Relational Processing Following Stroke

    Science.gov (United States)

    Andrews, Glenda; Halford, Graeme S.; Shum, David; Maujean, Annick; Chappell, Mark; Birney, Damian

    2013-01-01

    The research examined relational processing following stroke. Stroke patients (14 with frontal, 30 with non-frontal lesions) and 41 matched controls completed four relational processing tasks: sentence comprehension, Latin square matrix completion, modified Dimensional Change Card Sorting, and n-back. Each task included items at two or three…

  17. Hybrid quantum information processing

    Energy Technology Data Exchange (ETDEWEB)

    Furusawa, Akira [Department of Applied Physics, School of Engineering, The University of Tokyo (Japan)

    2014-12-04

    I will briefly explain the definition and advantage of hybrid quantum information processing, which is hybridization of qubit and continuous-variable technologies. The final goal would be realization of universal gate sets both for qubit and continuous-variable quantum information processing with the hybrid technologies. For that purpose, qubit teleportation with a continuous-variable teleporter is one of the most important ingredients.

  18. Hybrid quantum information processing

    Science.gov (United States)

    Furusawa, Akira

    2014-12-01

    I will briefly explain the definition and advantage of hybrid quantum information processing, which is hybridization of qubit and continuous-variable technologies. The final goal would be realization of universal gate sets both for qubit and continuous-variable quantum information processing with the hybrid technologies. For that purpose, qubit teleportation with a continuous-variable teleporter is one of the most important ingredients.

  19. Image processing mini manual

    Science.gov (United States)

    Matthews, Christine G.; Posenau, Mary-Anne; Leonard, Desiree M.; Avis, Elizabeth L.; Debure, Kelly R.; Stacy, Kathryn; Vonofenheim, Bill

    1992-01-01

    The intent is to provide an introduction to the image processing capabilities available at the Langley Research Center (LaRC) Central Scientific Computing Complex (CSCC). Various image processing software components are described. Information is given concerning the use of these components in the Data Visualization and Animation Laboratory at LaRC.

  20. Hyperspectral image processing methods

    Science.gov (United States)

    Hyperspectral image processing refers to the use of computer algorithms to extract, store and manipulate both spatial and spectral information contained in hyperspectral images across the visible and near-infrared portion of the electromagnetic spectrum. A typical hyperspectral image processing work...

  1. Technologies for Optical Processing

    DEFF Research Database (Denmark)

    Stubkjær, Kristian

    2008-01-01

    The article consists of a Powerpoint presentation on technologies for optical processing. The paper concludes that the nonlinear elements based on SOA, fibers and waveguide structures have capabilities of simple processing at data rates of 100-600 Gb/s. Switching powers comparable to electronics...

  2. Uranium processing and properties

    CERN Document Server

    2013-01-01

    Covers a broad spectrum of topics and applications that deal with uranium processing and the properties of uranium Offers extensive coverage of both new and established practices for dealing with uranium supplies in nuclear engineering Promotes the documentation of the state-of-the-art processing techniques utilized for uranium and other specialty metals

  3. Microsystem process networks

    Science.gov (United States)

    Wegeng, Robert S.; TeGrotenhuis, Ward E.; Whyatt, Greg A.

    2006-10-24

    Various aspects and applications of microsystem process networks are described. The design of many types of microsystems can be improved by ortho-cascading mass, heat, or other unit process operations. Microsystems having exergetically efficient microchannel heat exchangers are also described. Detailed descriptions of numerous design features in microcomponent systems are also provided.

  4. Laser processing of plastics

    Science.gov (United States)

    Atanasov, Peter A.

    1995-03-01

    CO2-laser processing of plastics has been studied experimentally and theoretically. Welding of cylindrical parts made from polycarbonate and polypropylene, cutting of polymethyl-methacrylate plates, and drilling of holes in polypropylene are presented as examples. Good agreement between theoretical and experimental results has been found in the case of laser welding. Some practical aspects of laser processing of plastics are also given.

  5. Heavy oils processing materials requirements crude processing

    Energy Technology Data Exchange (ETDEWEB)

    Sloley, Andrew W. [CH2M Hill, Englewood, CO (United States)

    2012-07-01

    Over time, recommended best practices for crude unit materials selection have evolved to accommodate new operating requirements, feed qualities, and product qualities. The shift to heavier oil processing is one of the major changes in crude feed quality occurring over the last 20 years. The three major types of crude unit corrosion include sulfidation attack, naphthenic acid attack, and corrosion resulting from hydrolyzable chlorides. Heavy oils processing makes all three areas worse. Heavy oils have higher sulfur content; higher naphthenic acid content; and are more difficult to desalt, leading to higher chloride corrosion rates. Materials selection involves two major criteria, meeting required safety standards, and optimizing economics of the overall plant. Proper materials selection is only one component of a plant integrity approach. Materials selection cannot eliminate all corrosion. Proper materials selection requires appropriate support from other elements of an integrity protection program. The elements of integrity preservation include: materials selection (type and corrosion allowance); management limits on operating conditions allowed; feed quality control; chemical additives for corrosion reduction; and preventive maintenance and inspection (PMI). The following discussion must be taken in the context of the application of required supporting work in all the other areas. Within that context, specific materials recommendations are made to minimize corrosion due to the most common causes in the crude unit. (author)

  6. Posttranslational processing of progastrin

    DEFF Research Database (Denmark)

    Bundgaard, Jens René; Rehfeld, Jens F.

    2010-01-01

    Gastrin and cholecystokinin (CCK) are homologous hormones with important functions in the brain and the gut. Gastrin is the main regulator of gastric acid secretion and gastric mucosal growth, whereas cholecystokinin regulates gall bladder emptying, pancreatic enzyme secretion and besides acts... processing of progastrin is often greatly disturbed in neoplastic cells. The posttranslational phase of the biogenesis of gastrin and the various progastrin products in gastrin gene-expressing tissues is reviewed here. In addition, the individual contributions of the processing enzymes are discussed, as are structural features of progastrin that are involved in the precursor activation process. Thus, the review describes how the processing depends on the cell-specific expression of the processing enzymes and kinetics in the secretory pathway.

  7. Business process support

    Energy Technology Data Exchange (ETDEWEB)

    Carle, Adriana; Fiducia, Daniel [Transportadora de Gas del Sur S.A. (TGS), Buenos Aires (Argentina)

    2005-07-01

    This paper is about the in-house development of business support software. The developed applications are used to support two business processes: gas transportation and natural gas processing. The software has interfaces with the ERP SAP system, SCADA software and online gas transportation simulation software. The main functionalities of the applications are: online real-time entry of clients' transport nominations, transport programming, allocation of the clients' transport nominations, transport control, measurements, pipeline balancing, allocation of gas volume to the gas processing plants, and calculation of the product tons processed in each plant and the tons of product distributed to clients. All the developed software generates information for internal staff, regulatory authorities and clients. (author)

  8. Beryllium chemistry and processing

    CERN Document Server

    Walsh, Kenneth A

    2009-01-01

    This book introduces beryllium; its history, its chemical, mechanical, and physical properties including nuclear properties. The 29 chapters include the mineralogy of beryllium and the preferred global sources of ore bodies. The identification and specifics of the industrial metallurgical processes used to form oxide from the ore and then metal from the oxide are thoroughly described. The special features of beryllium chemistry are introduced, including analytical chemical practices. Beryllium compounds of industrial interest are identified and discussed. Alloying, casting, powder processing, forming, metal removal, joining and other manufacturing processes are covered. The effect of composition and process on the mechanical and physical properties of beryllium alloys assists the reader in material selection. The physical metallurgy chapter brings conformity between chemical and physical metallurgical processing of beryllium, metal, alloys, and compounds. The environmental degradation of beryllium and its all...

  9. Managing Software Process Evolution

    DEFF Research Database (Denmark)

    This book focuses on the design, development, management, governance and application of evolving software processes that are aligned with changing business objectives, such as expansion to new domains or shifting to global production. In the context of an evolving business world, it examines the complete software process lifecycle, from the initial definition of a product to its systematic improvement. In doing so, it addresses difficult problems, such as how to implement processes in highly regulated domains or where to find a suitable notation system for documenting processes, and provides essential insights and tips to help readers manage process evolutions. And last but not least, it provides a wealth of examples and cases on how to deal with software evolution in practice. Reflecting these topics, the book is divided into three parts. Part 1 focuses on software business transformation...

  10. Organic food processing

    DEFF Research Database (Denmark)

    Kahl, Johannes; Alborzi, Farnaz; Beck, Alexander;

    2014-01-01

    In 2007 EU Regulation (EC) 834/2007 introduced principles and criteria for organic food processing. These regulations have been analysed and discussed in several scientific publications and research project reports. Recently, organic food quality was described by principles, aspects and criteria. These principles from organic agriculture were verified and adapted for organic food processing. Different levels for evaluation were suggested. In another document, underlying paradigms and consumer perception of organic food were reviewed against functional food, resulting in identifying integral product... to evaluate processing methods. Therefore the goal of this paper is to describe and discuss the topic of organic food processing to make it operational. A conceptual background for organic food processing is given by verifying the underlying paradigms and principles of organic farming and organic food as well...

  11. Food Process Engineering

    DEFF Research Database (Denmark)

    Friis, Alan; Jensen, Bo Boye Busk; Risum, Jørgen

    introduced make it possible to get an overview of processes. We have included a chapter on how to make block-diagrams and material balances. This should help in analysing a production and to isolate the most important processing steps. Most processes in a food production are of physical nature, thus we have included a chapter on relevant physical properties of foods and especially a part on rheological properties of food products. The remaining text is focused on the processes: transport of fluids, transport of heat and transport of energy. The textbook ends with a chapter on heat exchangers and how to calculate the requirements of heat processing. Our goal is to put food engineering into a production context. Other courses teach food chemistry, food microbiology and food technology. Topics of great importance and all have to be seen in a broader context of producing good and safe food in a large scale...

  12. Nonhomogeneous fractional Poisson processes

    Energy Technology Data Exchange (ETDEWEB)

    Wang Xiaotian [School of Management, Tianjin University, Tianjin 300072 (China)]. E-mail: swa001@126.com; Zhang Shiying [School of Management, Tianjin University, Tianjin 300072 (China); Fan Shen [Computer and Information School, Zhejiang Wanli University, Ningbo 315100 (China)

    2007-01-15

    In this paper, we propose a class of non-Gaussian stationary increment processes, named nonhomogeneous fractional Poisson processes W_H^(j)(t), which permit the study of the effects of long-range dependence in a large number of fields including quantum physics and finance. The processes W_H^(j)(t) are self-similar in a wide sense, exhibit fatter tails than Gaussian processes, and converge to Gaussian processes in distribution in some cases. In addition, we also show that the intensity function λ(t) strongly influences the existence of the highest finite moment of W_H^(j)(t) and the behaviour of the tail probability of W_H^(j)(t).
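
    The counting process underlying such models is a nonhomogeneous Poisson process with intensity λ(t). A standard way to sample its event times is Lewis-Shedler thinning; the sketch below illustrates only this building block, not the fractional construction W_H^(j)(t) itself, and the function name is our own.

```python
import random

def sample_nhpp(lam, lam_max, t_end, seed=0):
    """Lewis-Shedler thinning: sample the event times of a nonhomogeneous
    Poisson process with intensity lam(t) <= lam_max on [0, t_end].

    Candidates are drawn from a homogeneous rate-lam_max process and
    accepted with probability lam(t) / lam_max."""
    rng = random.Random(seed)
    t, events = 0.0, []
    while True:
        t += rng.expovariate(lam_max)        # next candidate event time
        if t > t_end:
            return events
        if rng.random() < lam(t) / lam_max:  # thinning step
            events.append(t)
```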

  13. New Processes for Annulation

    Institute of Scientific and Technical Information of China (English)

    Liu Hsing-Jang

    2004-01-01

    Making use of the high propensity of 2-cyano-2-cycloalkenones to undergo conjugate addition with various carbanions and the high reactivity of the ensuing α-cyano ketone system, a number of new annulation processes have been developed recently in our laboratories. As shown in Eq. 1 (n=1) with a specific example, one such process involves the addition of 3-butenylmagnesium bromide, followed by a palladium(II) acetate mediated oxidative cyclization, to facilitate methylenecyclopentane ring formation. This annulation process could be readily extended to effect methylenecyclohexane ring formation (Eq. 1, n=2), using 4-pentenylmagnesium bromide as the initial reagent, and to install the carbomethoxy-substituted methylenecyclopentane and methylenecyclohexane rings, using the carbanions derived from methyl 4-pentenoate and methyl 5-hexenoate, respectively (Eq. 2). In another annulation process, the addition of the enolate of methyl 5-chloropentanoate is involved initially, and the ring formation is readily effected by an intramolecular alkylation process. A specific example is given in Eq. 3.

  14. Logistics Innovation Process Revisited

    DEFF Research Database (Denmark)

    Gammelgaard, Britta; Su, Shong-Iee Ivan; Yang, Su-Lan

    2011-01-01

    that was triggered by the practical needs of new ways of handling material flows of a hospital. This approach made it possible to revisit theory on the logistics innovation process. Findings - Apart from the tangible benefits reported to the case hospital, five findings can be extracted from this study: the logistics... on internal stakeholders as on external relationships; and the logistics innovation process may start out as a dialectic, conflict-ridden process and end up in a well-ordered, goal-oriented teleological process. Research limitations/implications - In general, the study contributes to the knowledge base... on an existing model by methodological triangulation in order to learn more about the qualities of actual processes and their implications for theory and practice.

  15. Manufacturing processes 4 forming

    CERN Document Server

    Klocke, Fritz

    2013-01-01

    This book provides essential information on metal forming, utilizing a practical distinction between bulk and sheet metal forming. In the field of bulk forming, it examines processes of cold, warm and hot bulk forming, as well as rolling and a new addition, the process of thixoforming. As for the field of sheet metal working, on the one hand it deals with sheet metal forming processes (deep drawing, flange forming, stretch drawing, metal spinning and bending). In terms of special processes, the chapters on internal high-pressure forming and high rate forming have been revised and refined. On the other, the book elucidates and presents the state of the art in sheet metal separation processes (shearing and fineblanking). Furthermore, joining by forming has been added to the new edition as a new chapter describing mechanical methods for joining sheet metals. The new chapter “Basic Principles” addresses both sheet metal and bulk forming, in addition to metal physics, plastomechanics and computational basics; ...

  16. In Process Beam Monitoring

    Science.gov (United States)

    Steen, W. M.; Weerasinghe, V. M.

    1986-11-01

    The industrial future of lasers in material processing lies in the combination of the laser with automatic machinery. One possible form of such a combination is an intelligent workstation which monitors the process as it occurs and adjusts itself accordingly, either by self teaching or by comparison to a process data bank or algorithm. In order to achieve this attractive goal in-process signals are required. Two devices are described in this paper. One is the Laser Beam Analyser which is now maturing into a second generation with computerised output. The other is the Acoustic Mirror, a totally novel analytic technique, not yet fully understood, but which nevertheless can act as a very effective process monitor.

  17. Quantum independent increment processes

    CERN Document Server

    Franz, Uwe

    2006-01-01

    This is the second of two volumes containing the revised and completed notes of lectures given at the school "Quantum Independent Increment Processes: Structure and Applications to Physics". This school was held at the Alfried-Krupp-Wissenschaftskolleg in Greifswald in March, 2003, and supported by the Volkswagen Foundation. The school gave an introduction to current research on quantum independent increment processes aimed at graduate students and non-specialists working in classical and quantum probability, operator algebras, and mathematical physics. The present second volume contains the following lectures: "Random Walks on Finite Quantum Groups" by Uwe Franz and Rolf Gohm, "Quantum Markov Processes and Applications in Physics" by Burkhard Kümmerer, "Classical and Free Infinite Divisibility and Lévy Processes" by Ole E. Barndorff-Nielsen and Steen Thorbjornsen, and "Lévy Processes on Quantum Groups and Dual Groups" by Uwe Franz.

  18. Beyond the search process

    DEFF Research Database (Denmark)

    Hyldegård, Jette

    2009-01-01

    This paper reports on the findings from a longitudinal case study exploring Kuhlthau's information search process (ISP)-model in a group based academic setting. The research focus is on group members' activities and cognitive and emotional experiences during the task process of writing an assignment. It is concluded that the ISP-model does not fully comply with group members' problem solving process and the involved information seeking behavior. Further, complex academic problem solving seems to be even more complex when it is performed in a group based setting. The study contributes with a new conceptual...

  19. Formed HIP Can Processing

    Energy Technology Data Exchange (ETDEWEB)

    Clarke, Kester Diederik [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-07-27

    The intent of this report is to document a procedure used at LANL for HIP bonding aluminum cladding to U-10Mo fuel foils using a formed HIP can for the Domestic Reactor Conversion program in the NNSA Office of Material Management and Minimization, and to provide some details that may not have been published elsewhere. The HIP process is based on the procedures that have been used to develop the formed HIP can process, including the baseline process developed at Idaho National Laboratory (INL). The HIP bonding cladding process development is summarized in the listed references. Further iterations with Babcock & Wilcox (B&W) to refine the process to meet production and facility requirements are expected.

  20. Branching processes in biology

    CERN Document Server

    Kimmel, Marek

    2015-01-01

    This book provides a theoretical background of branching processes and discusses their biological applications. Branching processes are a well-developed and powerful set of tools in the field of applied probability. The range of applications considered includes molecular biology, cellular biology, human evolution and medicine. The branching processes discussed include Galton-Watson, Markov, Bellman-Harris, Multitype, and General Processes. As an aid to understanding specific examples, two introductory chapters, and two glossaries are included that provide background material in mathematics and in biology. The book will be of interest to scientists who work in quantitative modeling of biological systems, particularly probabilists, mathematical biologists, biostatisticians, cell biologists, molecular biologists, and bioinformaticians. The authors are a mathematician and cell biologist who have collaborated for more than a decade in the field of branching processes in biology for this new edition. This second ex...
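
    The simplest of the processes listed, the Galton-Watson process, is easy to simulate directly. The sketch below estimates the extinction probability empirically; it is an illustration of the process itself, not code from the book, and the function name and parameters are our own.

```python
import random

def extinction_probability(p_offspring, n_gen=50, trials=500, cap=1000, seed=0):
    """Estimate the extinction probability of a single-type Galton-Watson
    process by simulation; p_offspring[k] is the probability of k offspring.

    Each trial starts from one ancestor; a lineage that exceeds `cap`
    individuals is treated as having escaped extinction."""
    rng = random.Random(seed)
    ks = list(range(len(p_offspring)))
    extinct = 0
    for _ in range(trials):
        z = 1                                   # one founding individual
        for _ in range(n_gen):
            # total offspring of the current generation
            z = sum(rng.choices(ks, weights=p_offspring, k=z))
            if z == 0:
                extinct += 1
                break
            if z > cap:                         # effectively supercritical path
                break
    return extinct / trials
```

    For a supercritical example with offspring distribution P(0)=0.2, P(1)=0.2, P(2)=0.6 (mean 1.4), the extinction probability solves q = 0.2 + 0.2q + 0.6q², giving q = 1/3, which the simulation should approximate.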

  1. Revealing the programming process

    DEFF Research Database (Denmark)

    Bennedsen, Jens; Caspersen, Michael Edelgaard

    2005-01-01

    One of the most important goals of an introductory programming course is that the students learn a systematic approach to the development of computer programs. Revealing the programming process is an important part of this; however, textbooks do not address the issue -- probably because the textb......One of the most important goals of an introductory programming course is that the students learn a systematic approach to the development of computer programs. Revealing the programming process is an important part of this; however, textbooks do not address the issue -- probably because...... the textbook medium is static and therefore ill-suited to expose the process of programming. We have found that process recordings in the form of captured narrated programming sessions are a simple, cheap, and efficient way of providing the revelation.We identify seven different elements of the programming...... process for which process recordings are a valuable communication media in order to enhance the learning process. Student feedback indicates both high learning outcome and superior learning potential compared to traditional classroom teaching....

  2. Welding processes handbook

    CERN Document Server

    Weman, Klas

    2003-01-01

    Deals with the main commercially significant and commonly used welding processes. This title takes the student or novice welder through the individual steps involved in each process in an easily understood way. It covers many of the requirements referred to in European Standards including EN719, EN 729, EN 729 and EN 287.$bWelding processes handbook is a concise, explanatory guide to the main commercially significant and commonly-used welding processes. It takes the novice welder or student through the individual steps involved in each process in a clear and easily understood way. It is intended to provide an up-to-date reference to the major applications of welding as they are used in industry. The contents have been arranged so that it can be used as a textbook for European welding courses in accordance with guidelines from the European Welding Federation. Welding processes and equipment necessary for each process are described so that they can be applied to all instruction levels required by the EWF and th...

  3. Monitoring of Microalgal Processes.

    Science.gov (United States)

    Havlik, Ivo; Scheper, Thomas; Reardon, Kenneth F

    2016-01-01

    Process monitoring, which can be defined as the measurement of process variables with the smallest possible delay, is combined with process models to form the basis for successful process control. Minimizing the measurement delay leads inevitably to employing online, in situ sensors where possible, preferably using noninvasive measurement methods with stable, low-cost sensors. Microalgal processes have similarities to traditional bioprocesses but also have unique monitoring requirements. In general, variables to be monitored in microalgal processes can be categorized as physical, chemical, and biological, and they are measured in gaseous, liquid, and solid (biological) phases. Physical and chemical process variables can usually be monitored online using standard industrial sensors. The monitoring of biological process variables, however, relies mostly on sensors developed and validated using laboratory-scale systems or uses offline methods because of difficulties in developing suitable online sensors. Here, we review current technologies for online, in situ monitoring of all types of process parameters of microalgal cultivations, with a focus on monitoring of biological parameters. We discuss newly introduced methods for measuring biological parameters that could possibly be adapted for routine online use, are preferably noninvasive, and are based on approaches that have been proven in other bioprocesses. New sensor types for measuring physicochemical parameters using optical methods or ion-specific field effect transistor (ISFET) sensors are also discussed. Reviewed methods with online implementation or online potential include measurement of irradiance, biomass concentration by optical density and image analysis, cell count, chlorophyll fluorescence, growth rate, lipid concentration by infrared spectrophotometry, dielectric scattering, and nuclear magnetic resonance.
Future perspectives are discussed, especially in the field of image analysis using in situ
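
    As an illustration of the calibration step behind biomass-by-optical-density monitoring, the sketch below fits a linear OD-to-dry-weight curve by least squares. The data shape and numbers are hypothetical; real calibrations are strain- and instrument-specific.

```python
def fit_od_to_biomass(od, dw):
    """Least-squares fit dw ≈ a*od + b for an OD -> dry-weight (g/L)
    calibration curve, returning slope a and intercept b."""
    n = len(od)
    mx = sum(od) / n
    my = sum(dw) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(od, dw))
         / sum((x - mx) ** 2 for x in od))
    b = my - a * mx
    return a, b

# Hypothetical calibration points: OD readings vs measured dry weight
od_readings = [0.1, 0.2, 0.4, 0.8]
dry_weights = [0.07, 0.12, 0.22, 0.42]
slope, intercept = fit_od_to_biomass(od_readings, dry_weights)
```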

  4. Getting Started with Processing

    CERN Document Server

    Reas, Casey

    2010-01-01

    Learn computer programming the easy way with Processing, a simple language that lets you use code to create drawings, animation, and interactive graphics. Programming courses usually start with theory, but this book lets you jump right into creative and fun projects. It's ideal for anyone who wants to learn basic programming, and serves as a simple introduction to graphics for people with some programming skills. Written by the founders of Processing, this book takes you through the learning process one step at a time to help you grasp core programming concepts. You'll learn how to sketch wi

  5. Transnational Learning Processes

    DEFF Research Database (Denmark)

    Nedergaard, Peter

    This paper analyses and compares the transnational learning processes in the employment field in the European Union and among the Nordic countries. Based theoretically on a social constructivist model of learning and methodologically on a questionnaire distributed to the relevant participants, a number of hypotheses concerning transnational learning processes are tested. The paper closes with a number of suggestions regarding an optimal institutional setting for facilitating transnational learning processes. Key words: Transnational learning, Open Method of Coordination, Learning, Employment, European Employment Strategy, European Union, Nordic countries.

  6. Lasers in chemical processing

    International Nuclear Information System (INIS)

    The high cost of laser energy is the crucial issue in any potential laser-processing application. It is expensive relative to other forms of energy and to most bulk chemicals. We show those factors that have previously frustrated attempts to find commercially viable laser-induced processes for the production of materials. Having identified the general criteria to be satisfied by an economically successful laser process and shown how these imply the laser-system requirements, we present a status report on the uranium laser isotope separation (LIS) program at the Lawrence Livermore National Laboratory

  7. Digital Differential Geometry Processing

    Institute of Scientific and Technical Information of China (English)

    Xin-Guo Liu; Hu-Jun Bao; Qun-Sheng Peng

    2006-01-01

    The theory and methods of digital geometry processing have been a hot research area in computer graphics, as geometric models serve as the core data for 3D graphics applications. The purpose of this paper is to introduce some recent advances in digital geometry processing, particularly mesh fairing, surface parameterization and mesh editing, that heavily use differential geometry quantities. Some related concepts from differential geometry, such as normal, curvature, gradient, Laplacian and their counterparts on digital geometry, are also reviewed for understanding the strengths and weaknesses of various digital geometry processing methods.
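
    The discrete Laplacian named here is the workhorse of mesh fairing. A minimal sketch, on a hypothetical two-triangle mesh, of the uniform ("umbrella") Laplacian and one explicit smoothing step:

```python
import numpy as np

# Hypothetical test mesh: 4 vertices, 2 triangles; vertex 2 is lifted out of plane
vertices = np.array([[0.0, 0.0, 0.0],
                     [1.0, 0.0, 0.0],
                     [1.0, 1.0, 0.5],
                     [0.0, 1.0, 0.0]])
triangles = [(0, 1, 2), (0, 2, 3)]

# One-ring vertex neighborhoods derived from the triangle list
neighbors = {i: set() for i in range(len(vertices))}
for a, b, c in triangles:
    neighbors[a] |= {b, c}
    neighbors[b] |= {a, c}
    neighbors[c] |= {a, b}

def umbrella_laplacian(v):
    """Uniform discrete Laplacian: average of one-ring neighbors minus the vertex."""
    return np.array([v[sorted(neighbors[i])].mean(axis=0) - v[i]
                     for i in range(len(v))])

# One explicit fairing step v <- v + lam * L(v) pulls vertices toward their neighbors
lam = 0.5
smoothed = vertices + lam * umbrella_laplacian(vertices)
```

    The lifted vertex moves toward the average of its one-ring neighbors; cotangent-weighted Laplacians refine the weights but use the same update.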

  8. Lapis SOI Pixel Process

    OpenAIRE

    Okihara, Masao; Kasai, Hiroki; Miura, Noriyuki; Kuriyama, Naoya; Nagatomo, Yoshiki

    2015-01-01

    0.2 um fully-depleted SOI technology has been developed for X-ray pixel detectors. To improve detector performance, some advanced process technologies are being developed continuously. To utilize high-resistivity FZ-SOI, slow ramp-up and ramp-down recipes are applied in the thermal processes of both SOI wafer fabrication and the pixel detector process. A suitable backside treatment is also applied to prevent an increase in leakage current at the backside damaged layer in the case of full depl...

  9. Plasma processing for VLSI

    CERN Document Server

    Einspruch, Norman G

    1984-01-01

    VLSI Electronics: Microstructure Science, Volume 8: Plasma Processing for VLSI (Very Large Scale Integration) discusses the utilization of plasmas for general semiconductor processing. It also includes expositions on advanced deposition of materials for metallization, lithographic methods that use plasmas as exposure sources and for multiple resist patterning, and device structures made possible by anisotropic etching.This volume is divided into four sections. It begins with the history of plasma processing, a discussion of some of the early developments and trends for VLSI. The second section

  10. Quantum Information Processing

    CERN Document Server

    Leuchs, Gerd

    2005-01-01

    Quantum processing and communication are emerging as challenging technologies at the beginning of the new millennium. This is an up-to-date insight into the current research on quantum superposition, entanglement, and the quantum measurement process, the key ingredients of quantum information processing. The authors further address quantum protocols and algorithms. Complementary to similar programmes in other countries and at the European level, the German Research Foundation (DFG) started a focused research program on quantum information in 1999. The contributions, written by leading experts, bring together the latest results in quantum information as well as addressing all the relevant questions

  11. Irreversible processes kinetic theory

    CERN Document Server

    Brush, Stephen G

    2013-01-01

    Kinetic Theory, Volume 2: Irreversible Processes deals with the kinetic theory of gases and the irreversible processes they undergo. It includes the two papers by James Clerk Maxwell and Ludwig Boltzmann in which the basic equations for transport processes in gases are formulated, together with the first derivation of Boltzmann's "H-theorem" and a discussion of this theorem, along with the problem of irreversibility. Comprised of 10 chapters, this volume begins with an introduction to the fundamental nature of heat and of gases, along with Boltzmann's work on the kinetic theory of gases and s

  12. Radioactive waste processing method

    International Nuclear Information System (INIS)

    When granular materials comprising radioactive wastes containing phosphorus are processed in a fluidized-bed furnace, if the granular materials are phosphorus-containing activated carbon, granular materials comprising alkali compounds such as calcium hydroxide and barium hydroxide are used as fluidizing media. Even granular materials with a slow burning speed can be burnt stably in a fluidized state by the high-temperature heat of the fluidizing media, thereby enabling a long burning processing time. Accordingly, radioactive activated carbon wastes can be processed by burning treatment. (T.M.)

  13. Process of performance assessment

    International Nuclear Information System (INIS)

    Performance assessment is the process used to evaluate the environmental consequences of disposal of radioactive waste in the biosphere. An introductory review of the subject is presented. Emphasis is placed on the process of performance assessment from the standpoint of defining the process. Performance assessment, from evolving experience at DOE sites, has short-term and long-term subprograms, the components of which are discussed. The role of mathematical modeling in performance assessment is addressed including the pros and cons of current approaches. Finally, the system/site/technology issues as the focal point of this symposium are reviewed

  14. Advanced uranium enrichment processes

    International Nuclear Information System (INIS)

    Three advanced uranium enrichment processes are dealt with in the report: AVLIS (Atomic Vapour LASER Isotope Separation), MLIS (Molecular LASER Isotope Separation) and PSP (Plasma Separation Process). The description of the physical and technical features of the processes constitutes a major part of the report. It further presents comparisons with existing industrially used enrichment technologies, gives information on actual development programmes and budgets, and ends with a chapter on perspectives and conclusions. An extensive bibliography of the relevant open literature is added for the different subjects discussed. The report was drawn up by the nuclear research centre (CEA) Saclay on behalf of the Commission of the European Communities

  15. Study on Glulam Process

    Institute of Scientific and Technical Information of China (English)

    PENG Limin; WANG Haiqing; HE Weili

    2006-01-01

    This paper selected lumber of Manchurian ash (Fraxinus mandshurica), Manchurian walnut (Juglans mandshurica) and spruce (Picea jezoensis var. komarovii) for manufacturing glulam with a water-borne polymeric-isocyanate adhesive to determine process variables. The process variables influencing the shear strength of the glulam, namely specific pressure, pressing time and adhesive application amount, were investigated through an orthogonal test. The results indicated that the optimum process variables for glulam manufacturing were as follows: specific pressure of 1.5 MPa for spruce and 2.0 MPa for both Manchurian ash and Manchurian walnut, pressing time of 60 min and adhesive application amount of 250 g/m2.

  16. The image processing handbook

    CERN Document Server

    Russ, John C

    2006-01-01

    Now in its fifth edition, John C. Russ's monumental image processing reference is an even more complete, modern, and hands-on tool than ever before. The Image Processing Handbook, Fifth Edition is fully updated and expanded to reflect the latest developments in the field. Written by an expert with unequalled experience and authority, it offers clear guidance on how to create, select, and use the most appropriate algorithms for a specific application. What's new in the Fifth Edition? A new chapter on the human visual process that explains which visual cues elicit a response from the vie

  17. Nano integrated circuit process

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Yung Sup

    2004-02-15

    This book contains nine chapters: an introduction to semiconductor chip manufacture; oxidation, covering dry oxidation, wet oxidation, the oxidation model and oxide films; diffusion, including the diffusion process, diffusion equation, diffusion coefficient and diffusion systems; ion implantation, including ion distribution, channeling, multi-implantation, masking and implantation systems; sputtering, covering CVD and PVD; lithography; wet and dry etching; interconnection and planarization, including metal-silicon contacts, silicides, multilayer metal processes and planarization; and the integrated circuit process, including MOSFET and CMOS.

  18. Semi-Markov processes

    CERN Document Server

    Grabski

    2014-01-01

    Semi-Markov Processes: Applications in System Reliability and Maintenance is a modern view of discrete state space and continuous time semi-Markov processes and their applications in reliability and maintenance. The book explains how to construct semi-Markov models and discusses the different reliability parameters and characteristics that can be obtained from those models. The book is a useful resource for mathematicians, engineering practitioners, and PhD and MSc students who want to understand the basic concepts and results of semi-Markov process theory. Clearly defines the properties and
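
    As a toy instance of the reliability quantities such models deliver, the long-run availability of a two-state (up/down) semi-Markov system can be estimated by simulation. The Weibull/exponential sojourn-time choices and all parameter values below are illustrative assumptions, not taken from the book:

```python
import random

def availability(mean_up=10.0, mean_down=2.0, horizon=20000.0, seed=42):
    """Simulate a two-state semi-Markov process (up/down) and estimate
    long-run availability = up time / total time. Up sojourns are
    Weibull (shape 2), down sojourns exponential; values are illustrative."""
    rng = random.Random(seed)
    t, up_time, state_up = 0.0, 0.0, True
    while t < horizon:
        if state_up:
            # Scale chosen so the Weibull(shape=2) sojourn mean equals mean_up
            stay = rng.weibullvariate(mean_up / 0.8862, 2.0)
            up_time += min(stay, horizon - t)
        else:
            stay = rng.expovariate(1.0 / mean_down)
        t += stay
        state_up = not state_up
    return up_time / horizon

a = availability()  # should approach mean_up / (mean_up + mean_down) = 10/12
```

    Because sojourn times need not be exponential, the process is semi-Markov rather than Markov, yet the alternating-renewal availability formula still applies in the long run.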

  19. CAPSULE REPORT: EVAPORATION PROCESS

    Science.gov (United States)

    Evaporation has been an established technology in the metal finishing industry for many years. In this process, wastewaters containing reusable materials, such as copper, nickel, or chromium compounds are heated, producing a water vapor that is continuously removed and condensed....

  20. Business Model Process Configurations

    DEFF Research Database (Denmark)

    Taran, Yariv; Nielsen, Christian; Thomsen, Peter;

    2015-01-01

    Purpose – The paper aims: 1) To develop systematically a structural list of various business model process configurations and to group (deductively) these selected configurations in a structured typological categorization list. 2) To facilitate companies in the process of BM innovation, by developing (inductively) an ontological classification framework, in view of the BM process configurations typology developed. Design/methodology/approach – Given the inconsistencies found in the business model studies (e.g. definitions, configurations, classifications) we adopted the analytical induction... Practical implications – This paper aimed at strengthening researchers' and, particularly, practitioners' perspectives on the field of business model process configurations, by ensuring an [abstracted] alignment between... strategic preference, as part of their business model innovation activity planned.

  1. IT Project Prioritization Process

    DEFF Research Database (Denmark)

    Shollo, Arisa; Constantiou, Ioanna

    2013-01-01

    In most large companies the IT project prioritization process is designed based on principles of evidence-based management. We investigate a case of IT project prioritization in a financial institution, and in particular, how managers practice evidence-based management during this process. We use a rich dataset built from a longitudinal study of the prioritization process for the IT projects. Our findings indicate that managers reach a decision not only by using evidence but from the interplay between the evidence and the judgment devices that managers employ. The interplay between evidence and judgment devices is manifested in three ways: supplementing, substituting, and interpreting evidence. We show that while evidence does not fully determine the decision, it plays a central role in discussions, reflections, and negotiations during the IT prioritization process.

  2. Assessing Process and Product

    DEFF Research Database (Denmark)

    Bennedsen, Jens; Caspersen, Michael E.

    2006-01-01

    The final assessment of a course must reflect its goals and contents. An important goal of our introductory programming course is that the students learn a systematic approach for the development of computer programs. Having the programming process as a learning objective naturally raises the question of how to include it in assessments. Traditional assessments (e.g. oral, written, or multiple choice) are unsuitable for testing the programming process. We describe and evaluate a practical lab examination that assesses the students' programming process as well as the developed programs...

  3. Cooperative processing data bases

    Science.gov (United States)

    Hasta, Juzar

    1991-01-01

    Cooperative processing for the 1990's using client-server technology is addressed. The main theme is concepts of downsizing from mainframes and minicomputers to workstations on a local area network (LAN). This document is presented in view graph form.

  4. Ultrasonic Processing of Materials

    Science.gov (United States)

    Han, Qingyou

    2015-08-01

    Irradiation of high-energy ultrasonic vibration in metals and alloys generates oscillating strain and stress fields in solids, and introduces nonlinear effects such as cavitation, acoustic streaming, and radiation pressure in molten materials. These nonlinear effects can be utilized to assist conventional material processing. This article describes recent research at Oak Ridge National Laboratory and Purdue University on using high-intensity ultrasonic vibrations for degassing molten aluminum, processing particulate-reinforced metal matrix composites, refining metals and alloys during solidification and welding, and producing bulk nanostructures in solid metals and alloys. Research results suggest that high-intensity ultrasonic vibration is capable of degassing and dispersing small particles in molten alloys, reducing grain size during alloy solidification, and inducing nanostructures in solid metals.

  5. Processed Products Database System

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Collection of annual data on processed seafood products. The Division provides authoritative advice, coordination and guidance on matters related to the collection,...

  6. Organ Donation: The Process

    Science.gov (United States)

    Organ Donation: The Process. Enrolling as a Donor: The ...

  7. Radiation processing in Japan

    International Nuclear Information System (INIS)

    Economic scale of radiation application in the field of industry, agriculture and medicine in Japan in 1997 was investigated to compare its economic impacts with that of nuclear energy industry. Total production value of radiation application accounted for 54% of nuclear industry including nuclear energy industry and radiation applications in three fields above. Industrial radiation applications were further divided into five groups, namely nondestructive test, RI instruments, radiation facilities, radiation processing and ion beam processing. More than 70% of the total production value was brought about by ion beam processing for use with IC and semiconductors. Future economic prospect of radiation processing of polymers, for example cross-linking, EB curing, graft polymerization and degradation, is reviewed. Particular attention was paid to radiation vulcanization of natural rubber latex and also to degradation of natural polymers. (S. Ohno)

  8. Catalyzing alignment processes

    DEFF Research Database (Denmark)

    Lauridsen, Erik Hagelskjær; Jørgensen, Ulrik

    2004-01-01

    This paper describes how environmental management systems (EMS) spur the circulation of processes that support the constitution of environmental issues as specific environmental objects and objectives. EMS catalyzes alignment processes that produce coherence among the different elements involved ...

  9. Processing NOAA Spectroradiometric Data

    OpenAIRE

    Broenkow, William W.; Greene, Nancy, T.; Feinholz, Michael, E.

    1993-01-01

    This report outlines the NOAA spectroradiometer data processing system implemented by the MLML_DBASE programs. This is done by presenting the algorithms and graphs showing the effects of each step in the algorithms. [PDF contains 32 pages]

  10. George: Gaussian Process regression

    Science.gov (United States)

    Foreman-Mackey, Daniel

    2015-11-01

    George is a fast and flexible library, implemented in C++ with Python bindings, for Gaussian Process regression useful for accounting for correlated noise in astronomical datasets, including those for transiting exoplanet discovery and characterization and stellar population modeling.
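
    The regression George accelerates can be sketched from first principles. The snippet below is a generic Gaussian-process posterior-mean computation with a squared-exponential kernel and synthetic data; it is not a demonstration of George's own API:

```python
import numpy as np

def rbf_kernel(x1, x2, length=1.0, amp=1.0):
    """Squared-exponential covariance, a common default kernel in GP regression."""
    d = x1[:, None] - x2[None, :]
    return amp * np.exp(-0.5 * (d / length) ** 2)

# Synthetic noisy observations of sin(x)
rng = np.random.default_rng(0)
x = np.linspace(0.0, 5.0, 20)
y = np.sin(x) + 0.1 * rng.standard_normal(20)

# GP posterior mean at test points: K_*^T (K + sigma^2 I)^{-1} y
sigma2 = 0.01                                   # assumed observation noise variance
K = rbf_kernel(x, x) + sigma2 * np.eye(len(x))
xs = np.array([1.0, 2.5, 4.0])
Ks = rbf_kernel(x, xs)
mean = Ks.T @ np.linalg.solve(K, y)
```

    The linear solve is the expensive step at large n; libraries such as George exist precisely to make that step fast while the mathematics stays the same.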

  11. Enrichment: centrifuge process

    International Nuclear Information System (INIS)

    This short course is divided into three sections devoted respectively to the physics of the process, some practical problems raised by the design of a centrifuge and the present situation of centrifugation in the World. 31 figs., 18 refs

  12. Technology or Process First?

    DEFF Research Database (Denmark)

    Siurdyban, Artur Henryk; Svejvig, Per; Møller, Charles

    Enterprise Systems Management (ESM) and Business Process Management (BPM), although highly correlated, have evolved as alternative and mutually exclusive approaches to corporate infrastructure. As a result, companies struggle to find the right balance between technology and process factors...

  13. Processer i undervisningen (Processes in Teaching)

    DEFF Research Database (Denmark)

    Bundsgaard, Jeppe

    The study focuses on processes in teaching, and thereby on how digital learning materials can support or be integrated into typical processes. The study rests on participant observation at Abildgårdskolen in Odense. Through the observations, a number of examples of challenges in carrying out the teaching processes have been identified, and suggestions are given for digital learning materials that are expected to support the processes. The study also shows how a focus on processes can serve as a method for user-driven innovation....

  14. Desalination processes and technologies

    International Nuclear Information System (INIS)

    Reasons of the development of desalination processes, the modern desalination technologies, such as multi-stage flash evaporation, multi-effect distillation, reverse osmosis, and the prospects of using nuclear power for desalination purposes are discussed. 9 refs

  15. Radiation processing in Japan

    Energy Technology Data Exchange (ETDEWEB)

    Makuuchi, Keizo [Japan Atomic Energy Research Inst., Takasaki, Gunma (Japan). Takasaki Radiation Chemistry Research Establishment

    2001-03-01

    Economic scale of radiation application in the field of industry, agriculture and medicine in Japan in 1997 was investigated to compare its economic impacts with that of nuclear energy industry. Total production value of radiation application accounted for 54% of nuclear industry including nuclear energy industry and radiation applications in three fields above. Industrial radiation applications were further divided into five groups, namely nondestructive test, RI instruments, radiation facilities, radiation processing and ion beam processing. More than 70% of the total production value was brought about by ion beam processing for use with IC and semiconductors. Future economic prospect of radiation processing of polymers, for example cross-linking, EB curing, graft polymerization and degradation, is reviewed. Particular attention was paid to radiation vulcanization of natural rubber latex and also to degradation of natural polymers. (S. Ohno)

  16. Reconfigurable network processing platforms

    NARCIS (Netherlands)

    Kachris, C.

    2007-01-01

    This dissertation presents our investigation of how to efficiently exploit reconfigurable hardware to design flexible, high-performance, and power-efficient network devices capable of adapting to the varying processing requirements of network applications and traffic. The proposed reconfigurable network pr

  17. Markovian risk process

    Institute of Scientific and Technical Information of China (English)

    WANG Han-xing; YAN Yun-zhi; ZHAO Fei; FANG Da-fan

    2007-01-01

    A Markovian risk process is considered in this paper, which is a generalization of the classical risk model. It is appropriate that a risk process with large claims be modelled as the Markovian risk model. In such a model, the occurrence of claims is described by a point process {N(t)}, t ≥ 0, with N(t) being the number of jumps during the interval (0, t] for a Markov jump process. The ruin probability Ψ(u) of a company facing such a risk model is mainly studied. An integral equation satisfied by the ruin probability function Ψ(u) is obtained, and bounds for the convergence rate of the ruin probability Ψ(u) are given by using a generalized renewal technique developed in the paper.
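
    For intuition, the ruin probability Ψ(u) of the classical compound Poisson special case that this model generalizes can be estimated by Monte Carlo. Exponential claim sizes and all parameter values below are illustrative assumptions, not from the paper:

```python
import random

def ruin_probability(u, lam=1.0, claim_mean=1.0, premium_rate=1.5,
                     horizon=100.0, n_sims=10000, seed=1):
    """Monte Carlo estimate of the finite-horizon ruin probability Psi(u)
    for the classical risk model: reserve(t) = u + premium_rate * t - claims,
    with Poisson claim arrivals and exponential claim sizes (illustrative)."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(n_sims):
        t, claims = 0.0, 0.0
        while True:
            t += rng.expovariate(lam)            # next claim arrival time
            if t >= horizon:
                break                            # survived the horizon
            claims += rng.expovariate(1.0 / claim_mean)
            if u + premium_rate * t - claims < 0:
                ruined += 1                      # reserve went negative: ruin
                break
    return ruined / n_sims
```

    With a positive safety loading (premium_rate > lam * claim_mean), the estimated Ψ(u) decreases as the initial reserve u grows, in line with the exponential bounds the renewal argument provides.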

  18. Dissolution processes. [224 references

    Energy Technology Data Exchange (ETDEWEB)

    Silver, G.L.

    1976-10-22

    This review contains more than 100 observations and 224 references on the dissolution phenomenon. The dissolution processes are grouped into three categories: methods of aqueous attack, fusion methods, and miscellaneous observations on phenomena related to dissolution problems. (DLC)

  19. Electron-attachment processes

    International Nuclear Information System (INIS)

    Topics covered include: (1) modes of production of negative ions, (2) techniques for the study of electron attachment processes, (3) dissociative electron attachment to ground-state molecules, (4) dissociative electron attachment to hot molecules (effects of temperature on dissociative electron attachment), (5) molecular parent negative ions, and (6) negative ions formed by ion-pair processes and by collisions of molecules with ground state and Rydberg atoms

  20. Biomedical signal processing

    CERN Document Server

    Akay, Metin

    1994-01-01

    Sophisticated techniques for signal processing are now available to the biomedical specialist! Written in an easy-to-read, straightforward style, Biomedical Signal Processing presents techniques to eliminate background noise, enhance signal detection, and analyze computer data, making results easy to comprehend and apply. In addition to examining techniques for electrical signal analysis, filtering, and transforms, the author supplies an extensive appendix with several computer programs that demonstrate techniques presented in the text.

  1. Hydrogen production processes

    International Nuclear Information System (INIS)

    The goals of this first Gedepeon workshop on hydrogen production processes are: to stimulate the information exchange about research programs and research advances in the domain of hydrogen production processes, to indicate the domains of interest of these processes and the potentialities linked with the coupling of a nuclear reactor, to establish the actions of common interest for the CEA, the CNRS, and eventually EDF, that can be funded in the framework of the Gedepeon research group. This document gathers the slides of the 17 presentations given at this workshop and dealing with: the H2 question and the international research programs (Lucchese P.); the CEA's research program (Lucchese P., Anzieu P.); processes based on the iodine/sulfur cycle: efficiency of a facility - flow-sheets, efficiencies, hard points (Borgard J.M.), R and D about the I/S cycle: Bunsen reaction (Colette S.), R and D about the I/S cycle: the HI/I2/H2O system (Doizi D.), demonstration loop/chemical engineering (Duhamet J.), materials and corrosion (Terlain A.); other processes under study: the Westinghouse cycle (Eysseric C.), other processes under study at the CEA (UT3, plasma,...) (Lemort F.), database about thermochemical cycles (Abanades S.), Zn/ZnO cycle (Broust F.), H2 production by cracking, high temperature reforming with carbon trapping (Flamant G.), membrane technology (De Lamare J.); high-temperature electrolysis: SOFC used as electrolyzers (Grastien R.); generic aspects linked with hydrogen production: technical-economical evaluation of processes (Werkoff F.), thermodynamic tools (Neveu P.), the reactor-process coupling (Aujollet P.). (J.S.)

  2. Radiopharmaceutical drug review process

    International Nuclear Information System (INIS)

    To ensure proper radioactive drug use (such as quality, diagnostic improvement, and minimal radioactive exposure), the Food and Drug Administration evaluates new drugs with respect to safety, effectiveness, and accuracy and adequacy of the labeling. The IND or NDA process is used for this purpose. A brief description of the process, including the Chemical Classification System and the therapeutic potential classification, is presented as it applies to radiopharmaceuticals. Also, the status of the IND or NDA review of radiopharmaceuticals is given

  3. Reward Processing in Autism

    OpenAIRE

    Scott-Van Zeeland, Ashley A.; DAPRETTO, MIRELLA; Ghahremani, Dara G.; Poldrack, Russell A.; Bookheimer, Susan Y.

    2010-01-01

    The social motivation hypothesis of autism posits that infants with autism do not experience social stimuli as rewarding, thereby leading to a cascade of potentially negative consequences for later development. While possible downstream effects of this hypothesis such as altered face and voice processing have been examined, there has not been a direct investigation of social reward processing in autism. Here we use functional magnetic resonance imaging to examine social and monetary rewarded ...

  4. Processing chicken at slaughter

    OpenAIRE

    POŽÁRKOVÁ, Radka

    2012-01-01

    The composition of poultry meat and its role in human nutrition are described in this work. Quality and the factors that affect quality are described further. The HACCP system also plays an important role. The end of this thesis focuses on poultry meat markets. The aim of this thesis was to study and describe the chicken slaughtering process and the processing of the chicken carcass, and to determine the major share of the fleshy parts of the broiler chicken carcass, that is, the shares of the breast muscles and thigh mu...

  5. Digital signal processing: Handbook

    Science.gov (United States)

    Goldenberg, L. M.; Matiushkin, B. D.; Poliak, M. N.

    The fundamentals of the theory and design of systems and devices for the digital processing of signals are presented. Particular attention is given to algorithmic methods of synthesis and digital processing equipment in communication systems (e.g., selective digital filtering, spectral analysis, and variation of the signal discretization frequency). Programs for the computer-aided analysis of digital filters are described. Computational examples are presented, along with tables of transfer function coefficients for recursive and nonrecursive digital filters.
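
    A minimal instance of the recursive digital filters the handbook tabulates transfer-function coefficients for is the first-order low-pass y[n] = alpha * x[n] + (1 - alpha) * y[n-1]; the smoothing coefficient alpha below is an illustrative choice:

```python
def recursive_lowpass(x, alpha=0.2):
    """First-order recursive (IIR) low-pass filter:
    y[n] = alpha * x[n] + (1 - alpha) * y[n-1], y[-1] = 0.
    alpha trades responsiveness against smoothing (illustrative value)."""
    y, prev = [], 0.0
    for sample in x:
        prev = alpha * sample + (1 - alpha) * prev
        y.append(prev)
    return y

# Step input: the filtered output rises smoothly toward 1 instead of jumping
step = [0.0] * 5 + [1.0] * 20
out = recursive_lowpass(step)
```

    Higher-order recursive and nonrecursive designs generalize this single feedback coefficient to the coefficient tables the handbook provides.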

  6. Poststroke neuroplasticity processes

    Directory of Open Access Journals (Sweden)

    I. V. Damulin

    2014-01-01

    The paper considers different aspects of neuroplasticity in patients with stroke. It underlines the dynamism of this process and the ambiguity of involvement of the structures of the contralateral cerebral hemisphere in the restorative process. It considers the periods after onset of stroke and the activation of different brain regions (of both the involved and intact hemispheres) in the poststroke period. Particular emphasis is placed on the issues of neurorehabilitation in this category of patients. Delay in rehabilitation measures leads to a worse outcome, and patients must stay in hospital longer. It is emphasized that neurorehabilitation measures should use strategies aimed at improving plasticity processes at the level of synaptic transmission and neuronal communications. At the same time, of great importance are the processes of structural and functional remodeling of neuronal communications involving surviving neurons that are located in the peri-infarct area and were partially damaged during ischemia. To recover motor functions lost to stroke, measures are implemented to modulate the ipsilateral motor cortex, the contralateral motor cortex, and sensory afferentation. Remodeling processes, one of the manifestations of neuroplasticity, vary with the size and location of an ischemic focus. The specific features of this process with subcortical and cortical foci are considered. It is stressed that there are genetically determined neurotrophic factors that may enhance remodeling processes in the peri-infarct area, as well as factors that inhibit these processes. The sensory system is noted to have a high potential for compensation, which is appreciably associated with the considerable extent of sensory fibers even at the level of the cerebral cortex.

  7. Differentially Private Gaussian Processes

    OpenAIRE

    Smith, Michael Thomas; Zwiessele, Max; Lawrence, Neil D.

    2016-01-01

    A major challenge for machine learning is increasing the availability of data while respecting the privacy of individuals. Differential privacy is a framework which allows algorithms to have provable privacy guarantees. Gaussian processes are a widely used approach for dealing with uncertainty in functions. This paper explores differentially private mechanisms for Gaussian processes. We compare binning and adding noise before regression with adding noise post-regression. For the former we dev...

  8. Compactifications for Dual Processes

    OpenAIRE

    Glover, Joseph

    1980-01-01

    We develop a general theory of duality for Markov processes satisfying Meyer's hypothesis (L) and possessing an excessive reference measure. We make use of a compactification introduced by Walsh which allows a right process and its moderate dual to have strong Markov versions on an enlarged state space. The representation theory for potentials of additive functionals due to Revuz and Sharpe can be extended to this setting. Using this theory, we show that the conatural additive functionals int...

  9. Cognitive Processes in Writing

    Institute of Scientific and Technical Information of China (English)

    李莹

    2009-01-01

    Writing has become one of the important topics for discussion in the new age. Its theories can be learnt in general terms, but its nature needs to be handled in specific contexts. In other words, everyone who can write must engage his/her thinking or cognitive processes. Because writing is a meaningful activity, writing problems can be solved by managing the cognitive process.

  10. Bank Record Processing

    Science.gov (United States)

    1982-01-01

    Barnett Banks of Florida, Inc. operates 150 banking offices in 80 Florida cities. Banking offices have computerized systems for processing deposits or withdrawals in checking/savings accounts, and for handling commercial and installment loan transactions. In developing a network engineering design for the terminals used in record processing, an affiliate, Barnett Computing Company, used COSMIC's STATCOM program. This program provided a reliable network design tool and avoided the cost of developing new software.

  11. Language and Speech Processing

    CERN Document Server

    Mariani, Joseph

    2008-01-01

    Speech processing addresses various scientific and technological areas. It includes speech analysis and variable rate coding, in order to store or transmit speech. It also covers speech synthesis, especially from text, speech recognition, including speaker and language identification, and spoken language understanding. This book covers the following topics: how to realize speech production and perception systems, how to synthesize and understand speech using state-of-the-art methods in signal processing, pattern recognition, stochastic modelling computational linguistics and human factor studi

  12. Analyzing business process management

    OpenAIRE

    Skjæveland, Børge

    2013-01-01

    Within the Oil & Gas Industry, the market is constantly growing more competitive, forcing companies to continually adapt to changes. Companies need to cut costs and improve the business efficiency. One way of successfully managing these challenges is to implement business process management in the organization. This thesis will analyze how Oceaneering Asset Integrity AS handled the implementation of a Business Process Management System and the effects it had on the employees. The main goal...

  13. Novel food processing techniques

    OpenAIRE

    Vesna Lelas

    2006-01-01

    Recently, a lot of investigations have been focused on the development of novel mild food processing techniques with the aim to obtain high-quality food products. It is presumed also that they could substitute some of the traditional processes in the food industry. The investigations are primarily directed to the usage of high hydrostatic pressure, ultrasound, tribomechanical micronization, microwaves and pulsed electrical fields. The results of the scientific researches refer to the fact that a...

  14. Image Processing Software

    Science.gov (United States)

    1992-01-01

    To convert raw data into environmental products, the National Weather Service and other organizations use the Global 9000 image processing system marketed by Global Imaging, Inc. The company's GAE software package is an enhanced version of the TAE, developed by Goddard Space Flight Center to support remote sensing and image processing applications. The system can be operated in three modes and is combined with HP Apollo workstation hardware.

  15. Catalytic Processes in Biorefinery

    OpenAIRE

    Vitiello, Rosa

    2015-01-01

    The biorefinery is a system that uses biomasses as feedstocks and recovers energy, fuels and chemicals from them. There are many processes considered in the biorefinery system, but in this thesis the biorefinery that uses oil as feedstock, in particular dedicated crops and waste vegetable oils, was considered. In the first part of this thesis the biodiesel production process was studied. One possible route to produce biodiesel from waste oils (characterized by high concentrations of Fr...

  16. Fractional Pure Birth Processes

    CERN Document Server

    Orsingher, Enzo; 10.3150/09-BEJ235

    2010-01-01

    We consider a fractional version of the classical non-linear birth process of which the Yule-Furry model is a particular case. Fractionality is obtained by replacing the first-order time derivative in the difference-differential equations which govern the probability law of the process, with the Dzherbashyan-Caputo fractional derivative. We derive the probability distribution of the number $ \\mathcal{N}_\
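As the abstract indicates, fractionality enters through the governing difference-differential equations. In the notation commonly used for this model, the state probabilities $p_k^{\nu}(t)$ of the fractional non-linear birth process satisfy (sketch; $\lambda_k = \lambda k$ recovers the Yule-Furry case):

```latex
\frac{\mathrm{d}^{\nu}}{\mathrm{d}t^{\nu}}\, p_k^{\nu}(t)
  = -\lambda_k\, p_k^{\nu}(t) + \lambda_{k-1}\, p_{k-1}^{\nu}(t),
  \qquad k \ge 1,\; 0 < \nu \le 1,
```

where the first-order time derivative of the classical equations has been replaced by the Dzherbashyan-Caputo fractional derivative

```latex
\frac{\mathrm{d}^{\nu}}{\mathrm{d}t^{\nu}}\, u(t)
  = \frac{1}{\Gamma(1-\nu)} \int_0^{t} \frac{u'(s)}{(t-s)^{\nu}}\, \mathrm{d}s,
  \qquad 0 < \nu < 1,
```

which reduces to the ordinary derivative as $\nu \to 1$.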

  17. Apple Image Processing Educator

    Science.gov (United States)

    Gunther, F. J.

    1981-01-01

    A software system design is proposed and demonstrated with pilot-project software. The system permits the Apple II microcomputer to be used for personalized computer-assisted instruction in the digital image processing of LANDSAT images. The programs provide data input, menu selection, graphic and hard-copy displays, and both general and detailed instructions. The pilot-project results are considered to be successful indicators of the capabilities and limits of microcomputers for digital image processing education.

  18. Granger Independent Martingale Processes

    OpenAIRE

    Cherubini, Umberto; Gobbi, Fabio; Mulinacci, Sabrina; Romagnoli, Silvia

    2016-01-01

    We introduce a new class of processes for the evaluation of multivariate equity derivatives. The proposed setting is well suited for the application of the standard copula function theory to processes, rather than variables, and easily enables to enforce the martingale pricing requirement. The martingale condition is imposed in a general multidimensional Markov setting to which we only add the restriction of no-Granger-causality of the increments (Granger-independent increments). We call this...

  19. Helium process cycle

    Science.gov (United States)

    Ganni, Venkatarao

    2007-10-09

    A unique process cycle and apparatus design separates the consumer (cryogenic) load return flow from most of the recycle return flow of a refrigerator and/or liquefier process cycle. The refrigerator and/or liquefier process recycle return flow is recompressed by a multi-stage compressor set and the consumer load return flow is recompressed by an independent consumer load compressor set that maintains a desirable constant suction pressure using a consumer load bypass control valve and the consumer load return pressure control valve that controls the consumer load compressor's suction pressure. The discharge pressure of this consumer load compressor is thereby allowed to float at the intermediate pressure in between the first and second stage recycle compressor sets. Utilizing the unique gas management valve regulation, the unique process cycle and apparatus design in which the consumer load return flow is separate from the recycle return flow, the pressure ratios of each recycle compressor stage and all main pressures associated with the recycle return flow are allowed to vary naturally, thus providing a naturally regulated and balanced floating pressure process cycle that maintains optimal efficiency at design and off-design process cycle capacity and conditions automatically.
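The constant-suction-pressure regulation described above can be illustrated with a toy proportional valve law. The numbers and the control law are purely hypothetical; the patent's actual gas-management regulation is considerably more involved:

```python
def bypass_valve_opening(suction_pressure, setpoint, gain=0.5):
    """Hypothetical proportional law: open the consumer-load bypass valve
    further as suction pressure falls below its set point, recycling flow
    so the consumer load compressor sees a constant suction pressure.
    Pressures in bar; returns a valve fraction clamped to [0, 1]."""
    error = setpoint - suction_pressure
    return min(1.0, max(0.0, gain * error))

# At the set point the bypass stays shut; as suction sags, it opens.
at_setpoint = bypass_valve_opening(1.05, 1.05)   # 0.0
sagging = bypass_valve_opening(0.85, 1.05)       # 0.5 * 0.2 = 0.1
```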

  20. Customer Innovation Process Leadership

    DEFF Research Database (Denmark)

    Lindgren, Peter; Jørgensen, Jacob Høj; Goduscheit, René Chester

    2007-01-01

    Innovation leadership has traditionally been focused on leading the companies' product development fast, cost-effectively and with an optimal performance driven by technological inventions or by customers' needs. To improve the efficiency of the product development process, focus has been on different types of organisational setup to the product development model and process. The globalization and enhanced competitive markets are, however, changing the innovation game and the challenge to innovation leadership. Excellent product development innovation and leadership seem no longer to be enough ... another outlook to future innovation leadership - Customer Innovation Process Leadership - CIP-leadership. CIP-leadership moves the company's innovation process closer to the customer innovation process and discusses how companies can be involved and innovate in customers' future needs and lead...

  1. Helium process cycle

    Science.gov (United States)

    Ganni, Venkatarao

    2008-08-12

    A unique process cycle and apparatus design separates the consumer (cryogenic) load return flow from most of the recycle return flow of a refrigerator and/or liquefier process cycle. The refrigerator and/or liquefier process recycle return flow is recompressed by a multi-stage compressor set and the consumer load return flow is recompressed by an independent consumer load compressor set that maintains a desirable constant suction pressure using a consumer load bypass control valve and the consumer load return pressure control valve that controls the consumer load compressor's suction pressure. The discharge pressure of this consumer load compressor is thereby allowed to float at the intermediate pressure in between the first and second stage recycle compressor sets. Utilizing the unique gas management valve regulation, the unique process cycle and apparatus design in which the consumer load return flow is separate from the recycle return flow, the pressure ratios of each recycle compressor stage and all main pressures associated with the recycle return flow are allowed to vary naturally, thus providing a naturally regulated and balanced floating pressure process cycle that maintains optimal efficiency at design and off-design process cycle capacity and conditions automatically.

  2. Radiation processing of polysaccharides

    International Nuclear Information System (INIS)

    Radiation processing is a very convenient tool for imparting desirable effects in polymeric materials and it has been an area of enormous interest in the last few decades. The success of radiation technology for processing of synthetic polymers can be attributed to two reasons namely, their ease of processing in various shapes and sizes, and secondly, most of these polymers undergo crosslinking reaction upon exposure to radiation. In recent years, natural polymers are being looked at with renewed interest because of their unique characteristics, such as inherent biocompatibility, biodegradability and easy availability. Traditionally, the commercial exploitation of natural polymers like carrageenans, alginates or starch etc. has been based, to a large extent, on empirical knowledge. But now, the applications of natural polymers are being sought in knowledge - demanding areas such as pharmacy and biotechnology, which is acting as a locomotive for further scientific research in their structure-function relationship. Selected success stories concerning radiation processed natural polymers and application of their derivatives in the health care products industries and agriculture are reported. This publication will be of interest to individuals at nuclear institutions worldwide that have programmes of R and D and applications in radiation processing technologies. New developments in radiation processing of polymers and other natural raw materials give insight into converting them into useful products for every day life, human health and environmental remediation. The book will also be of interest to other field specialists, readers including managers and decision makers in industry (health care, food and agriculture) helping them to understand the important role of radiation processing technology in polysaccharides

  3. Novel food processing techniques

    Directory of Open Access Journals (Sweden)

    Vesna Lelas

    2006-12-01

    Full Text Available Recently, a lot of investigations have been focused on the development of novel mild food processing techniques with the aim to obtain high-quality food products. It is presumed also that they could substitute some of the traditional processes in the food industry. The investigations are primarily directed to the usage of high hydrostatic pressure, ultrasound, tribomechanical micronization, microwaves and pulsed electrical fields. The results of the scientific researches refer to the fact that application of some of these processes in a particular food industry can result in lots of benefits. Significant energy savings, shortening of process duration, mild thermal conditions, and food products with better sensory characteristics and higher nutritional values can be achieved. As some of these techniques act also on the molecular level, changing the conformation, structure and electrical potential of organic as well as inorganic materials, the improvement of some functional properties of these components may occur. Common characteristics of all of these techniques are treatment at ambient or insignificantly higher temperatures and a short time of processing (1 to 10 minutes). High hydrostatic pressure applied to various foodstuffs can destroy some microorganisms, successfully modify molecule conformation and consequently improve functional properties of foods. At the same time it acts positively on food products intended for freezing. Tribomechanical treatment causes micronization of various solid materials that results in nanoparticles and changes in the structure and electrical potential of molecules. Therefore, a significant improvement of some rheological and functional properties of materials occurs. Ultrasound treatment proved to be a potentially very successful technique of food processing. It can be used as a pretreatment to drying (it decreases drying time and improves functional properties of food), as an extraction process of various components

  4. Process Correlation Analysis Model for Process Improvement Identification

    OpenAIRE

    Su-jin Choi; Dae-Kyoo Kim; Sooyong Park

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by process assessment identifying strengths and weaknesses and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what to be carried out in process areas in terms of goals and practice...

  5. Carbon dioxide reducing processes; Koldioxidreducerande processer

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, Fredrik

    1999-12-01

    This thesis discusses different technologies to reduce or eliminate the carbon dioxide emissions when a fossil fuel is used for energy production. Emission reduction can be accomplished by separating the carbon dioxide for storage or reuse. There are three different ways of doing the separation. The carbon dioxide can be separated before the combustion, the process can be designed so that the carbon dioxide can be separated without any energy consumption and costly systems, or the carbon dioxide can be separated from the flue gas stream. Two different concepts of separating the carbon dioxide from a combined cycle are compared, from the performance and economic points of view, with a standard natural gas fired combined cycle where no attempt is made to reduce the carbon dioxide emissions. One concept is to use absorption technologies to separate the carbon dioxide from the flue gas stream. The other concept is based on a semi-closed gas turbine cycle using carbon dioxide as working fluid and combustion with pure oxygen, generated in an air-separating unit. The calculations show that the efficiency (power) drop is smaller for the first concept than for the second, 8.7 percentage points compared to 13.7 percentage points, when power is produced. When both heat and power are produced, the relation concerning the efficiency (power) remains. Regarding the overall efficiency (heat and power), the opposite relation is present. A possible carbon dioxide tax must exceed 0.21 SEK/kg CO{sub 2} for it to be profitable to separate carbon dioxide with any of these technologies.
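The quoted efficiency penalties are straightforward to apply. Assuming, purely for illustration, a reference combined-cycle electrical efficiency of 58% (a value not given in the abstract):

```python
# Worked comparison using the thesis figures; the baseline is an assumption.
base_power_eff = 0.58      # assumed efficiency of the reference combined cycle
absorption_drop = 0.087    # 8.7 percentage points (flue-gas absorption concept)
oxyfuel_drop = 0.137       # 13.7 percentage points (semi-closed CO2/O2 cycle)

eff_absorption = base_power_eff - absorption_drop   # -> 0.493
eff_oxyfuel = base_power_eff - oxyfuel_drop         # -> 0.443
```

On these numbers the absorption concept retains about 5 percentage points more electrical efficiency than the semi-closed oxyfuel cycle, matching the relation stated in the abstract.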

  6. VLSI signal processing technology

    CERN Document Server

    Swartzlander, Earl

    1994-01-01

    This book is the first in a set of forthcoming books focussed on state-of-the-art development in the VLSI Signal Processing area. It is a response to the tremendous research activities taking place in that field. These activities have been driven by two factors: the dramatic increase in demand for high speed signal processing, especially in consumer electronics, and the evolving microelectronic technologies. The available technology has always been one of the main factors in determining algorithms, architectures, and design strategies to be followed. With every new technology, signal processing systems go through many changes in concepts, design methods, and implementation. The goal of this book is to introduce the reader to the main features of VLSI Signal Processing and the ongoing developments in this area. The focus of this book is on: • Current developments in Digital Signal Processing (DSP) processors and architectures - several examples and case studies of existing DSP chips are discussed in...

  7. Cantilever epitaxial process

    Science.gov (United States)

    Ashby, Carol I.; Follstaedt, David M.; Mitchell, Christine C.; Han, Jung

    2003-07-29

    A process of growing a material on a substrate, particularly growing a Group II-VI or Group III-V material, by a vapor-phase growth technique where the growth process eliminates the need for utilization of a mask or removal of the substrate from the reactor at any time during the processing. A nucleation layer is first grown upon which a middle layer is grown to provide surfaces for subsequent lateral cantilever growth. The lateral growth rate is controlled by altering the reactor temperature, pressure, reactant concentrations or reactant flow rates. Semiconductor materials, such as GaN, can be produced with dislocation densities less than 10^7/cm^2.

  8. Laser Processing and Chemistry

    CERN Document Server

    Bäuerle, Dieter

    2011-01-01

    This book gives an overview of the fundamentals and applications of laser-matter interactions, in particular with regard to laser material processing. Special attention is given to laser-induced physical and chemical processes at gas-solid, liquid-solid, and solid-solid interfaces. Starting with the background physics, the book proceeds to examine applications of lasers in “standard” laser machining and laser chemical processing (LCP), including the patterning, coating, and modification of material surfaces. This fourth edition has been enlarged to cover the rapid advances in the understanding of the dynamics of materials under the action of ultrashort laser pulses, and to include a number of new topics, in particular the increasing importance of lasers in various different fields of surface functionalizations and nanotechnology. In two additional chapters, recent developments in biotechnology, medicine, art conservation and restoration are summarized. Graduate students, physicists, chemists, engineers, a...

  9. Cryogenic process simulation

    International Nuclear Information System (INIS)

    Combining accurate fluid property databases with a commercial equation-solving software package running on a desktop computer allows simulation of cryogenic processes without extensive computer programming. Computer simulation can be a powerful tool for process development or optimization. Most engineering simulations to date have required extensive programming skills in languages such as Fortran, Pascal, etc. Authors of simulation code have also usually been responsible for choosing and writing the particular solution algorithm. This paper describes a method of simulating cryogenic processes with a commercial software package on a desktop personal computer that does not require these traditional programming tasks. Applications include modeling of cryogenic refrigerators, heat exchangers, vapor-cooled power leads, vapor pressure thermometers, and various other engineering problems
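As an illustration of the approach, a generic equation solver applied to a small cryogenic balance in place of the commercial package, a hypothetical radiative heat-load balance can be solved by bisection (all numbers invented for the example):

```python
def bisect(f, lo, hi, tol=1e-9):
    """Generic bisection root finder, standing in for the equation-solving
    package; assumes f(lo) and f(hi) bracket a root."""
    flo = f(lo)
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        fmid = f(mid)
        if abs(fmid) < tol or hi - lo < tol:
            return mid
        if (flo < 0) == (fmid < 0):
            lo, flo = mid, fmid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical balance: radiative load onto a 4.2 K surface equals the
# 5 W of refrigeration supplied; solve for the shield temperature T.
sigma = 5.670e-8                    # Stefan-Boltzmann constant, W/m^2/K^4
area, emissivity, t_cold = 1.0, 0.05, 4.2
f = lambda T: emissivity * sigma * area * (T**4 - t_cold**4) - 5.0
T_shield = bisect(f, 4.2, 400.0)    # roughly 205 K for these inputs
```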

  10. Product and Process Modelling

    DEFF Research Database (Denmark)

    Cameron, Ian T.; Gani, Rafiqul

    This book covers the area of product and process modelling via a case study approach. It addresses a wide range of modelling applications with emphasis on modelling methodology and the subsequent in-depth analysis of mathematical models to gain insight via structural aspects of the models. These approaches are put into the context of life cycle modelling, where multiscale and multiform modelling is increasingly prevalent in the 21st century. The book commences with a discussion of modern product and process modelling theory and practice followed by a series of case studies drawn from a variety of process industries. The book builds on the extensive modelling experience of the authors, who have developed models for both research and industrial purposes. It complements existing books by the authors in the modelling area. Those areas include the traditional petroleum and petrochemical industries...

  11. Processes for xanthomonas biopolymers

    Energy Technology Data Exchange (ETDEWEB)

    Engelskirchen, K.; Stein, W.; Bahn, M.; Schieferstein, L.; Schindler, J.

    1984-03-27

    A process is described for producing xanthan gum in which the use of a stable, water-in-oil emulsion in the fermentation medium markedly lowers the viscosity of the medium, resulting in lower energy requirements for the process, and also resulting in enhanced yields of the biopolymer. In such an emulsion, the aqueous fermentation phase, with its microbial growth and metabolic processes, takes place in a finely dispersed homogeneous oil phase. The viscosity increase in each droplet of the aqueous nutrient solution will not noticeably affect this mixture in the fermenter because the viscosity of the reaction mixture in the fermenter is determined primarily by the viscosity of the oil phase. 45 claims

  12. Process window metrology

    Science.gov (United States)

    Ausschnitt, Christopher P.; Chu, William; Hadel, Linda M.; Ho, Hok; Talvi, Peter

    2000-06-01

    This paper is the third of a series that defines a new approach to in-line lithography control. The first paper described the use of optically measurable line-shortening targets to enhance signal-to-noise and reduce measurement time. The second described the dual-tone optical critical dimension (OCD) measurement and analysis necessary to distinguish dose and defocus. Here we describe the marriage of dual-tone OCD to SEM-CD metrology that comprises what we call 'process window metrology' (PWM), the means to locate each measured site in dose and focus space relative to the allowed process window. PWM provides in-line process tracking and control essential to the successful implementation of low-k lithography.
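The dual-tone idea, recovering dose and defocus from two CD readings with opposite dose sensitivity, can be sketched with hypothetical Bossung-like response models. The coefficients are invented for illustration; the actual PWM calibration differs:

```python
# Hypothetical calibrated responses for the two tones: CD depends linearly
# on dose and quadratically on defocus, and the two tones respond to dose
# with opposite sign, which is what lets the two effects be separated.
def cd_clear(dose, focus):
    return 100.0 - 2.0 * dose + 30.0 * focus**2

def cd_opaque(dose, focus):
    return 100.0 + 2.0 * dose + 30.0 * focus**2

def invert(cd_c, cd_o):
    """Recover (dose, |defocus|) from a dual-tone CD measurement pair
    by solving the two model equations; only |defocus| is observable
    because focus enters quadratically."""
    dose = (cd_o - cd_c) / 4.0
    focus_sq = (cd_c + cd_o - 200.0) / 60.0
    return dose, max(0.0, focus_sq) ** 0.5

# Round trip: simulate a site at dose 1.5, defocus 0.1, then invert.
dose, focus = invert(cd_clear(1.5, 0.1), cd_opaque(1.5, 0.1))
```

Locating each measured site in (dose, focus) space this way is what allows it to be compared against the allowed process window.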

  13. Business Process Requirement Engineering

    Directory of Open Access Journals (Sweden)

    Atsa Etoundi Roger,

    2010-12-01

    Full Text Available Requirements engineering is an increasingly important discipline for supporting business process and workflow modeling, as these are designed to satisfy diverse customer needs and increase the productivity of the enterprise. Moreover, most customers hesitate to adopt a given product or service if the added value does not conform to their desires. Dealing with customers with a wide range of perspectives within an enterprise is very complex. These perspectives are grounded in differences in skills, responsibility, knowledge and expertise of stakeholders. This holds all the more in the domain of business processes and workflows, where the satisfaction of the customers is a must if these enterprises wish to deal with the pressure of the network economy. Based on requirements engineering, we present in this paper an integration of the RE approach into the modeling of business processes and workflows.

  14. Lapis SOI Pixel Process

    CERN Document Server

    Okihara, Masao; Miura, Noriyuki; Kuriyama, Naoya; Nagatomo, Yoshiki

    2015-01-01

    0.2 um fully-depleted SOI technology has been developed for X-ray pixel detectors. To improve detector performance, several advanced process technologies are being developed continuously. To utilize high-resistivity FZ-SOI, slow ramp-up and ramp-down recipes are applied for the thermal processes in both SOI wafer fabrication and the pixel detector process. A suitable backside treatment is also applied to prevent an increase of leakage current at the backside damaged layer when the substrate is fully depleted. A large detector chip, about 66 mm wide and 30 mm high, can be obtained by a stitching exposure technique. To improve cross-talk and radiation tolerance, a nested-well structure and double-SOI wafers are now under investigation for advanced pixel structures.

  15. Foundations of signal processing

    CERN Document Server

    Vetterli, Martin; Goyal, Vivek K

    2014-01-01

    This comprehensive and engaging textbook introduces the basic principles and techniques of signal processing, from the fundamental ideas of signals and systems theory to real-world applications. Students are introduced to the powerful foundations of modern signal processing, including the basic geometry of Hilbert space, the mathematics of Fourier transforms, and essentials of sampling, interpolation, approximation and compression. The authors discuss real-world issues and hurdles to using these tools, and ways of adapting them to overcome problems of finiteness and localisation, the limitations of uncertainty and computational costs. Standard engineering notation is used throughout, making mathematical examples easy for students to follow, understand and apply. It includes over 150 homework problems and over 180 worked examples, specifically designed to test and expand students' understanding of the fundamentals of signal processing, and is accompanied by extensive online materials designed to aid learning, ...
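The sampling-and-interpolation fundamentals the book covers can be illustrated with a minimal Shannon reconstruction. Because the sum here is finite, the result is only approximate near the edges of the sample window:

```python
import math

def sinc(x):
    """Normalized sinc, sin(pi x)/(pi x), with the removable singularity at 0."""
    return 1.0 if x == 0 else math.sin(math.pi * x) / (math.pi * x)

def reconstruct(samples, T, t):
    """Shannon reconstruction x(t) = sum_n x[n] * sinc((t - nT)/T),
    truncated to the available samples."""
    return sum(x_n * sinc((t - n * T) / T) for n, x_n in enumerate(samples))

# Sample a 1 Hz sine at 8 Hz (well above the Nyquist rate of 2 Hz),
# then rebuild the signal at an off-grid instant near the window centre.
T = 1.0 / 8.0
samples = [math.sin(2 * math.pi * n * T) for n in range(64)]
value = reconstruct(samples, T, 4.05)   # close to sin(2*pi*4.05)
```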

  16. Quantum independent increment processes

    CERN Document Server

    Franz, Uwe

    2005-01-01

    This volume is the first of two volumes containing the revised and completed notes of lectures given at the school "Quantum Independent Increment Processes: Structure and Applications to Physics". This school was held at the Alfried-Krupp-Wissenschaftskolleg in Greifswald during the period March 9 – 22, 2003, and supported by the Volkswagen Foundation. The school gave an introduction to current research on quantum independent increment processes aimed at graduate students and non-specialists working in classical and quantum probability, operator algebras, and mathematical physics. The present first volume contains the following lectures: "Lévy Processes in Euclidean Spaces and Groups" by David Applebaum, "Locally Compact Quantum Groups" by Johan Kustermans, "Quantum Stochastic Analysis" by J. Martin Lindsay, and "Dilations, Cocycles and Product Systems" by B.V. Rajarama Bhat.

  17. Topological signal processing

    CERN Document Server

    Robinson, Michael

    2014-01-01

    Signal processing is the discipline of extracting information from collections of measurements. To be effective, the measurements must be organized and then filtered, detected, or transformed to expose the desired information.  Distortions caused by uncertainty, noise, and clutter degrade the performance of practical signal processing systems. In aggressively uncertain situations, the full truth about an underlying signal cannot be known.  This book develops the theory and practice of signal processing systems for these situations that extract useful, qualitative information using the mathematics of topology -- the study of spaces under continuous transformations.  Since the collection of continuous transformations is large and varied, tools which are topologically-motivated are automatically insensitive to substantial distortion. The target audience comprises practitioners as well as researchers, but the book may also be beneficial for graduate students.

  18. Stochastic processes inference theory

    CERN Document Server

    Rao, Malempati M

    2014-01-01

    This is the revised and enlarged 2nd edition of the authors’ original text, which was intended to be a modest complement to Grenander's fundamental memoir on stochastic processes and related inference theory. The present volume gives a substantial account of regression analysis, both for stochastic processes and measures, and includes recent material on Ridge regression with some unexpected applications, for example in econometrics. The first three chapters can be used for a quarter or semester graduate course on inference on stochastic processes. The remaining chapters provide more advanced material on stochastic analysis suitable for graduate seminars and discussions, leading to dissertation or research work. In general, the book will be of interest to researchers in probability theory, mathematical statistics and electrical and information theory.

  19. Modern coating processes

    International Nuclear Information System (INIS)

    Articles collected in this volume explain both the present state of the art and current developments and problems in the following coating processes: - Hardfacing welding and soldering; - Thermal spraying; - Thin film technique (CVD, PVD); - Galvanising. Apart from a basic presentation of the conventional use of the different processes, new technological and material developments are to the fore. In this context, the purposeful post-treatment of coatings and the combination of different processes to achieve special coating properties should be mentioned. Examples of this include the hot isostatic pressing or laser melting of sprayed coatings, simultaneous spraying and shot-blasting, and the combination of galvanic and thin film techniques for the manufacture of hybrid systems. A further important group of subjects concerns the testing of various coatings. (orig.)

  20. AERONET Version 3 processing

    Science.gov (United States)

    Holben, B. N.; Slutsker, I.; Giles, D. M.; Eck, T. F.; Smirnov, A.; Sinyuk, A.; Schafer, J.; Rodriguez, J.

    2014-12-01

    The Aerosol Robotic Network (AERONET) database has evolved in measurement accuracy, data quality products, and availability to the scientific community over the course of 21 years with the support of NASA, PHOTONS and all federated partners. This evolution is periodically manifested as a new data version release, by carefully reprocessing the entire database with the most current algorithms that fundamentally change the database and ultimately the data products used by the community. The newest processing, Version 3, will be released in 2015 after the entire database is reprocessed and real-time data processing becomes operational. All V3 algorithms have been developed and individually vetted, and represent four main categories: aerosol optical depth (AOD) processing, inversion processing, database management and new products. The primary trigger for release of V3 lies with cloud screening of the direct sun observations and computation of AOD, which will fundamentally change all data available for analysis and all subsequent retrieval products. This presentation will illustrate the innovative approach used for cloud screening and assess the elements of V3 AOD relative to the current version. We will also present the advances in the inversion product processing with emphasis on the random and systematic uncertainty estimates. This processing will be applied to the new hybrid measurement scenario intended to provide inversion retrievals for all solar zenith angles. We will introduce automatic quality assurance criteria that will allow near-real-time quality-assured aerosol products necessary for real-time satellite and model validation and assimilation. Lastly, we will introduce the new management structure that will improve access to the database. The current Version 2 will be supported for at least two years after the initial release of V3 to maintain continuity for ongoing investigations.
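A much-simplified sketch of triplet-based cloud screening of direct-sun AOD follows; the threshold and logic are illustrative only, not the operational V3 criteria:

```python
# Illustrative (not the operational AERONET algorithm): screen a direct-sun
# AOD "triplet" by its within-minute variability, since clouds typically
# produce a much larger short-term spread than aerosol.
def triplet_screen(aod_triplet, max_range=0.02):
    """Return True (keep as cloud-free) if the spread of the three AOD
    values is below a threshold; 0.02 here is a hypothetical choice."""
    spread = max(aod_triplet) - min(aod_triplet)
    return spread <= max_range

clean = triplet_screen([0.151, 0.153, 0.150])   # stable triplet -> keep
cloudy = triplet_screen([0.15, 0.42, 0.27])     # erratic triplet -> reject
```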

  1. NTP comparison process

    Science.gov (United States)

    Corban, Robert

    The systems engineering process for the concept definition phase of the program involves requirements definition, system definition, and consistent concept definition. The requirements definition process involves obtaining a complete understanding of the system requirements based on customer needs, mission scenarios, and nuclear thermal propulsion (NTP) operating characteristics. A system functional analysis is performed to provide comprehensive traceability and verification of top-level requirements down to detailed system specifications, and provides significant insight into the measures of system effectiveness to be utilized in system evaluation. The second key element in the process is the definition of system concepts to meet the requirements. This part of the process involves engine system and reactor contractor teams developing alternative NTP system concepts that can be evaluated against specific attributes, as well as a reference configuration against which to compare system benefits and merits. Quality function deployment (QFD), as an excellent tool within Total Quality Management (TQM) techniques, can provide the required structure and a link to the voice of the customer in establishing critical system qualities and their relationships. The third element of the process is the consistent performance comparison. The comparison process involves validating developed concept data and quantifying system merits through analysis, computer modeling, simulation, and rapid prototyping of the proposed high-risk NTP subsystems. The maximum amount possible of quantitative data will be developed and/or validated to be utilized in the QFD evaluation matrix. If, upon evaluation, a new concept or its associated subsystems are determined to have substantial merit, those features will be incorporated into the reference configuration for subsequent system definition and comparison efforts.

  2. Orchestrator Telemetry Processing Pipeline

    Science.gov (United States)

    Powell, Mark; Mittman, David; Joswig, Joseph; Crockett, Thomas; Norris, Jeffrey

    2008-01-01

    Orchestrator is a software application infrastructure for telemetry monitoring, logging, processing, and distribution. The architecture has been applied to support operations of a variety of planetary rovers. Built in Java with the Eclipse Rich Client Platform, Orchestrator can run on most commonly used operating systems. The pipeline supports configurable parallel processing that can significantly reduce the time needed to process a large volume of data products. Processors in the pipeline implement a simple Java interface and declare their required input from upstream processors. Orchestrator is programmatically constructed by specifying a list of Java processor classes that are initiated at runtime to form the pipeline. Input dependencies are checked at runtime. Fault tolerance can be configured to attempt continuation of processing in the event of an error or failed input dependency if possible, or to abort further processing when an error is detected. This innovation also provides support for Java Message Service broadcasts of telemetry objects to clients and provides file system and relational database logging of telemetry. Orchestrator supports remote monitoring and control of the pipeline using browser-based JMX controls and provides several integration paths for pre-compiled legacy data processors. At the time of this reporting, the Orchestrator architecture has been used by four NASA customers to build telemetry pipelines to support field operations. Example applications include high-volume stereo image capture and processing, and simultaneous data monitoring and logging from multiple vehicles. Example telemetry processors used in field test operations support include vehicle position, attitude, articulation, GPS location, power, and stereo images.
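
    The pipeline pattern described above (processors implementing a simple interface, declaring their required inputs, assembled from a list of classes at runtime, with dependency checks and configurable fault tolerance) can be sketched as follows. This is an illustrative Python sketch, not Orchestrator's actual Java API; all class and method names are hypothetical.

```python
# Hypothetical sketch of the pipeline pattern described above.  Processors
# declare the upstream products they need; the pipeline is built from a list
# of classes, checks dependencies at runtime, and can either abort or
# continue on a failed input dependency.

class Processor:
    requires = ()    # names of upstream products this processor needs
    provides = None  # name of the product this processor emits

    def process(self, inputs):
        raise NotImplementedError

class PositionDecoder(Processor):
    """Toy telemetry processor: extracts a position from raw telemetry."""
    requires = ("raw_telemetry",)
    provides = "position"

    def process(self, inputs):
        x, y = inputs["raw_telemetry"]["pos"]
        return {"x": x, "y": y}

class Pipeline:
    def __init__(self, processor_classes, continue_on_error=False):
        self.processors = [cls() for cls in processor_classes]
        self.continue_on_error = continue_on_error

    def run(self, products):
        for proc in self.processors:
            missing = [r for r in proc.requires if r not in products]
            if missing:  # failed input dependency
                if self.continue_on_error:
                    continue
                raise KeyError(f"missing inputs for {type(proc).__name__}: {missing}")
            products[proc.provides] = proc.process(products)
        return products

pipeline = Pipeline([PositionDecoder])
out = pipeline.run({"raw_telemetry": {"pos": (3.0, 4.0)}})
print(out["position"])  # → {'x': 3.0, 'y': 4.0}
```

    A real pipeline would add parallel execution and logging; the sequential loop above only illustrates the dependency-declaration idea.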

  3. Biomedical Image Processing

    CERN Document Server

    Deserno, Thomas Martin

    2011-01-01

    In modern medicine, imaging is the most effective tool for diagnostics, treatment planning and therapy. Almost all modalities have moved to direct digital acquisition techniques, and processing of this image data has become an important option for health care in the future. This book is written by a team of internationally recognized experts from all over the world. It provides a brief but complete overview of medical image processing and analysis, highlighting recent advances that have been made in academia. Color figures are used extensively to illustrate the methods and help the reader to understand the complex topics.

  4. Radiation processed polysaccharide products

    International Nuclear Information System (INIS)

    Radiation crosslinking, degradation and grafting techniques for modification of polymeric materials including natural polysaccharides have been providing many unique products. In this communication, typical products from radiation processed polysaccharides particularly plant growth promoter from alginate, plant protector and elicitor from chitosan, super water absorbent containing starch, hydrogel sheet containing carrageenan/CM-chitosan as burn wound dressing, metal ion adsorbent from partially deacetylated chitin were described. The procedures for producing those above products were also outlined. Future development works on radiation processing of polysaccharides were briefly presented. (author)

  5. Image Processing System

    Science.gov (United States)

    1986-01-01

    Mallinckrodt Institute of Radiology (MIR) is using a digital image processing system which employs NASA-developed technology. MIR's computer system is the largest radiology system in the world. It is used in diagnostic imaging. Blood vessels are injected with x-ray dye, and the images which are produced indicate whether arteries are hardened or blocked. A computer program developed by Jet Propulsion Laboratory known as Mini-VICAR/IBIS was supplied to MIR by COSMIC. The program provides the basis for developing the computer imaging routines for data processing, contrast enhancement and picture display.

  6. Semantic and Process Interoperability

    Directory of Open Access Journals (Sweden)

    Félix Oscar Fernández Peña

    2010-05-01

    Full Text Available Knowledge management systems support education at different levels. This is very important for the process in which higher education in Cuba is involved. Structural transformations of teaching are focused on supporting the foundation of the information society in the country. This paper describes technical aspects of the design of a model for the integration of multiple knowledge management tools supporting teaching. The proposal is based on the definition of an ontology for the explicit formal description of the semantics of the motivations of students and teachers in the learning process. Its target is to facilitate knowledge spreading.

  7. Genomic signal processing

    CERN Document Server

    Shmulevich, Ilya

    2007-01-01

    Genomic signal processing (GSP) can be defined as the analysis, processing, and use of genomic signals to gain biological knowledge, and the translation of that knowledge into systems-based applications that can be used to diagnose and treat genetic diseases. Situated at the crossroads of engineering, biology, mathematics, statistics, and computer science, GSP requires the development of both nonlinear dynamical models that adequately represent genomic regulation, and diagnostic and therapeutic tools based on these models. This book facilitates these developments by providing rigorous mathema

  8. An Integrated Design Process

    DEFF Research Database (Denmark)

    Petersen, Mads Dines; Knudstrup, Mary-Ann

    2010-01-01

    Present paper is placed in the discussion about how sustainable measures are integrated in the design process by architectural offices. It presents results from interviews with four leading Danish architectural offices working with sustainable architecture and their experiences with it, as well...... as the requirements they meet in terms of how to approach the design process – especially focused on the early stages like a competition. The interviews focus on their experiences with working in multidisciplinary teams and using digital tools to support their work with sustainable issues. The interviews show...

  10. Research Planning Process

    Science.gov (United States)

    Lofton, Rodney

    2010-01-01

    This presentation describes the process used to collect, review, integrate, and assess the research requirements desired to be part of the research and payload activities conducted on the ISS. The presentation provides a description of where the requirements originate, to whom they are submitted, how they are integrated into a requirements plan, and how that integrated plan is formulated and approved. It is hoped that, after reviewing this presentation, one will have an understanding of the planning process that formulates payload requirements into an integrated plan used for specifying the research activities to take place on the ISS.

  11. Mutually Testing Processes

    OpenAIRE

    Bernardi, Giovanni; Hennessy, Matthew

    2015-01-01

    In the standard testing theory of DeNicola-Hennessy one process is considered to be a refinement of another if every test guaranteed by the former is also guaranteed by the latter. In the domain of web services this has been recast, with processes viewed as servers and tests as clients. In this way the standard refinement preorder between servers is determined by their ability to satisfy clients. But in this setting there is also a natural refinement preorder between clients, determined by th...

  12. Food irradiation processing

    International Nuclear Information System (INIS)

    An international symposium on food irradiation processing dealing with issues which affect the commercial introduction of the food irradiation process was held in Vienna in 1985. The symposium, which attracted close to 300 participants, was planned to interest not only scientists and food technologists, but also representatives of government agencies, the food industry, trade associations and consumer organizations. The symposium included a discussion of the technological and economic feasibility of applying ionizing energy for the preservation of food, and focused on the specific needs of developing countries. Separate abstracts were prepared for the various presentations at this meeting

  13. WWTP Process Tank Modelling

    DEFF Research Database (Denmark)

    Laursen, Jesper

    investigated individually, with the purpose of obtaining a better understanding before the final integrated model is setup. In the sub-process investigations focus is addressed especially at aeration by bottom mounted diffusers and mechanical mixing of the activated sludge suspension via slowly rotating...... zones in mind, its special construction poses strict demands to the hydrodynamic model. In case study three the model is extended to a three-phase model where also the injection of air bubbles during the aeration process is modeled. The aeration of sludge is controlled through a simple expression...

  14. Reversible brazing process

    Science.gov (United States)

    Pierce, Jim D.; Stephens, John J.; Walker, Charles A.

    1999-01-01

    A method of reversibly brazing surfaces together. An interface is affixed to each surface. The interfaces can be affixed by processes such as mechanical joining, welding, or brazing. The two interfaces are then brazed together using a brazing process that does not defeat the surface to interface joint. Interfaces of materials such as Ni-200 can be affixed to metallic surfaces by welding or by brazing with a first braze alloy. The Ni-200 interfaces can then be brazed together using a second braze alloy. The second braze alloy can be chosen so that it minimally alters the properties of the interfaces to allow multiple braze, heat and disassemble, rebraze cycles.

  15. Advanced Polymer Processing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Muenchausen, Ross E. [Los Alamos National Laboratory

    2012-07-25

    Some conclusions of this presentation are: (1) Radiation-assisted nanotechnology applications will continue to grow; (2) The APPF will provide a unique focus for radiolytic processing of nanomaterials in support of DOE-DP, other DOE and advanced manufacturing initiatives; (3) Gamma, X-ray, e-beam and ion beam processing will increasingly be applied for 'green' manufacturing of nanomaterials and nanocomposites; and (4) Biomedical science and engineering may ultimately be the biggest application area for radiation-assisted nanotechnology development.

  16. Solar industrial process heat

    Energy Technology Data Exchange (ETDEWEB)

    Lumsdaine, E.

    1981-04-01

    The aim of the assessment reported is to candidly examine the contribution that solar industrial process heat (SIPH) is realistically able to make in the near- and long-term energy futures of the United States. The performance history of government and privately funded SIPH demonstration programs, 15 of which are briefly summarized, and the present status of SIPH technology are discussed. The technical and performance characteristics of solar industrial process heat plants and equipment are reviewed, and the influence of operating experience from over a dozen SIPH demonstration projects on institutional acceptance and economic projections is evaluated. Implications for domestic energy policy and international implications are briefly discussed. (LEW)

  17. Thermal stir welding process

    Science.gov (United States)

    Ding, R. Jeffrey (Inventor)

    2012-01-01

    A welding method is provided for forming a weld joint between first and second elements of a workpiece. The method includes heating the first and second elements to form an interface of material in a plasticized or melted state between the elements. The interface material is then allowed to cool to a plasticized state if previously in a melted state. The interface material, while in the plasticized state, is then mixed, for example, using a grinding/extruding process, to remove any dendritic-type weld microstructures introduced into the interface material during the heating process.

  18. NITRIC ACID PICKLING PROCESS

    Science.gov (United States)

    Boller, E.R.; Eubank, L.D.

    1958-08-19

    An improved process is described for the treatment of metallic uranium surfaces preparatory to being given hot dip coatings. The process consists of first pickling the uranium surface with aqueous 50% to 70% nitric acid, at 60 to 70 deg C, for about 5 minutes, rinsing the acid solution from the uranium article, promptly drying it, and then passing it through a molten alkali-metal halide flux consisting of 42% LiCl, 53% KCl and 5% NaCl into a molten metal bath consisting of 85 parts by weight of zinc and 15 parts by weight of aluminum.

  19. Hard exclusive QCD processes

    Energy Technology Data Exchange (ETDEWEB)

    Kugler, W.

    2007-01-15

    Hard exclusive processes in high energy electron proton scattering offer the opportunity to get access to a new generation of parton distributions, the so-called generalized parton distributions (GPDs). These functions provide more detailed information about the structure of the nucleon than the usual PDFs obtained from DIS. In this work we present a detailed analysis of exclusive processes, especially of hard exclusive meson production. We investigated the influence of exclusively produced mesons on the semi-inclusive production of mesons at fixed target experiments like HERMES. Furthermore, we give a detailed analysis of higher order corrections (NLO) for the exclusive production of mesons in a very broad range of kinematics. (orig.)

  20. The process of entrepreneurship:

    DEFF Research Database (Denmark)

    Neergaard, Helle

    2003-01-01

    between organisational growth and managerial role transformation in technology-based new ventures. The chapter begins by reviewing existing literature on organisational growth patterns and establishing a link to managerial roles in order to elucidate the basic premises of the study. The chapter...... for understanding the link between organisational growth and managerial role transformation.......Growing a technology-based new venture is a complex process because these ventures are embedded in turbulent environments that require fast organisational and managerial transformation. This chapter addresses the evolutionary process of such ventures. It seeks to provide insight into the link...

  1. Statecharts Via Process Algebra

    Science.gov (United States)

    Luttgen, Gerald; vonderBeeck, Michael; Cleaveland, Rance

    1999-01-01

    Statecharts is a visual language for specifying the behavior of reactive systems. The language extends finite-state machines with concepts of hierarchy, concurrency, and priority. Despite its popularity as a design notation for embedded systems, precisely defining its semantics has proved extremely challenging. In this paper, a simple process algebra, called Statecharts Process Language (SPL), is presented, which is expressive enough for encoding Statecharts in a structure-preserving and semantics-preserving manner. It is established that the behavioral relation bisimulation, when applied to SPL, preserves Statecharts semantics.

  2. Power plant process computer

    International Nuclear Information System (INIS)

    The concept of instrumentation and control in nuclear power plants incorporates the use of process computers for tasks which are on-line in respect to real-time requirements but not closed-loop in respect to closed-loop control. The general scope of tasks is: - alarm annunciation on CRT's - data logging - data recording for post trip reviews and plant behaviour analysis - nuclear data computation - graphic displays. Process computers are used additionally for dedicated tasks such as the aeroball measuring system, the turbine stress evaluator. Further applications are personal dose supervision and access monitoring. (orig.)

  3. Exoplanet atmospheres physical processes

    CERN Document Server

    Seager, Sara

    2010-01-01

    Over the past twenty years, astronomers have identified hundreds of extrasolar planets--planets orbiting stars other than the sun. Recent research in this burgeoning field has made it possible to observe and measure the atmospheres of these exoplanets. This is the first textbook to describe the basic physical processes--including radiative transfer, molecular absorption, and chemical processes--common to all planetary atmospheres, as well as the transit, eclipse, and thermal phase variation observations that are unique to exoplanets. In each chapter, Sara Seager offers a conceptual introdu

  4. STUDY ON THE SEPARATION AND UTILIZATION TECHNOLOGY OF MAGNETIC BEAD IN FLY ASH

    Institute of Scientific and Technical Information of China (English)

    边炳鑫; 李哲; 吕一波; 石宪奎; 韦鲁滨

    2000-01-01

    On the basis of a study of the physical and chemical properties of magnetic bead (MB) in fly ash (FA), the paper presents the separation methods for MB and the results of three separation processes. A comparative test of size, density, stability, magnetic material content, specific magnetic susceptibility (SMS), medium recovery, oxidation resistance and wear resistance between MB and the magnetic fines currently used in dense medium separation shows that using MB recovered from fly ash as medium solids in coal cleaning, instead of magnetic fines, not only has no adverse influence on the targets of separation, but can also bring good economic and social benefits.

  5. Total Process Surveillance (TOPS)

    International Nuclear Information System (INIS)

    In order to operate a plant safely and economically, an operator requires a complete knowledge of the plant's operating state. Only a limited amount of information is generally available regarding the plant's current state, this being determined by a finite number of transducers measuring key process parameters. It is the responsibility of the operator to assimilate and interpret this, possibly conflicting, raw measurement data. An operator accomplishes this data interpretation task by utilising his knowledge of the operation of the process and his experience of its behaviour under certain well defined conditions. Under normal operating conditions the operator may use only a fairly basic mental model of the process to facilitate his understanding. However under off-normal plant conditions and especially those associated with a severe accident, this mental model may not be sufficient to understand the behaviour of the process. This is highly likely to be the case where the process system has undergone a structural change as a result of a severe accident. As with any management task, the prime components of dealing with a severe accident condition are monitoring and control. This implies the need to obtain information on the process state, assimilate this information, interpret and understand what it means in the context of the process, leading to a control decision and thus a control action. A key task here is the gathering and assimilation of all the available plant data. The transducers installed on a plant represent a diverse range of information sources. Traditionally these are considered individually or in functional groups, such as fuel channel outlet temperatures. However each transducer measurement is almost invariably linked to other different transducer measurements via the physics of the process. This leads to the concept of analytical redundancy among a given set of diverse transducers. In an off-normal plant condition, the ability to exploit this
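
    The analytical redundancy mentioned above (physically related transducer readings cross-checking one another) can be illustrated with a toy residual test. The flow/differential-pressure relation, gain and threshold below are invented for illustration and do not model any actual plant.

```python
# Toy analytical-redundancy check: two transducer readings linked by an
# assumed physical relation (flow = k * sqrt(dp)) validate each other via
# the residual of that relation.  All numbers here are illustrative.

def residual(flow_reading, dp_reading, k=2.0):
    """Residual of the assumed relation flow = k * sqrt(dp)."""
    return flow_reading - k * dp_reading ** 0.5

def consistent(flow_reading, dp_reading, tol=0.5):
    """Flag whether the two readings agree to within the tolerance."""
    return abs(residual(flow_reading, dp_reading)) < tol

print(consistent(4.0, 4.0))  # 4.0 == 2*sqrt(4.0) → True
print(consistent(9.0, 4.0))  # residual of 5.0 → False
```

    A plant-scale scheme would run such residual tests across many transducer pairs and combine them to isolate faulty sensors; the point here is only the cross-check idea.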

  6. Conceptualizing operations strategy processes

    DEFF Research Database (Denmark)

    Rytter, Niels Gorm; Boer, Harry; Koch, Christian

    2007-01-01

    Purpose - The purpose of this paper is to present insights into operations strategy (OS) in practice. It outlines a conceptualization and model of OS processes and, based on findings from an in-depth and longitudinal case study, contributes to further development of extant OS models and methods...

  7. Information Processing of Trauma.

    Science.gov (United States)

    Hartman, Carol R.; Burgess, Ann W.

    1993-01-01

    This paper presents a neuropsychosocial model of information processing to explain a victimization experience, specifically child sexual abuse. It surveys the relation of sensation, perception, and cognition as a systematic way to provide a framework for studying human behavior and describing human response to traumatic events. (Author/JDD)

  8. Advanced Biosignal Processing

    CERN Document Server

    Nait-Ali, Amine

    2009-01-01

    Presents the principles of many advanced biosignal processing techniques. This title introduces the main biosignal properties and the acquisition techniques. It covers some of the most intensively used biosignals in clinical routine, namely the electrocardiogram, the electroencephalogram, the electromyogram and evoked potentials.

  9. Uranium recovery process

    International Nuclear Information System (INIS)

    A process of recovering uranium from an aqueous medium containing both it and sulfuric acid which comprises contacting the medium with an anion exchange resin having tertiary amine groups, said resin being the product of (a) the reaction of polyethyleneimine and a dihaloalkane and (b) the subsequent reductive alkylation of the product of (a)

  10. Pattern evaporation process

    Directory of Open Access Journals (Sweden)

    Z. Żółkiewicz

    2007-04-01

    Full Text Available The paper discusses the process of thermal evaporation of a foundry pattern. At several research-development centres, studies have been carried out to examine the physico-chemical phenomena that take place in a foundry mould filled with a polystyrene pattern when it is poured with molten metal. In the technique of evaporative patterns, the process of mould filling with molten metal (the said mould holding inside a polystyrene pattern) is interrelated with the process of thermal decomposition of this pattern. The transformation of an evaporative pattern (e.g. made from foamed polystyrene) from the solid into the liquid and then gaseous state occurs as a result of the thermal effect that the liquid metal exerts onto this pattern. Consequently, at the liquid metal-pattern-mould phase boundary some physico-chemical phenomena take place, which until now have not been fully explained. When the pattern is evaporating, some solid and gaseous products are evolved, e.g. CO, CO2, H2, N2, and hydrocarbons, e.g. styrene, toluene, ethane, methane, benzene [16, 23]. The process of polystyrene pattern evaporation in a foundry mould under the effect of molten metal is of a very complex nature and depends on many different factors, still not fully investigated. The kinetics of pattern evaporation is also affected by the technological properties of the foundry mould, e.g. permeability, thermophysical properties, parameters of the gating system, temperature of pouring, properties of the pattern material, and the size of the pattern-liquid metal contact surface.

  11. Instruction sequence processing operators

    NARCIS (Netherlands)

    J.A. Bergstra; C.A. Middelburg

    2009-01-01

    This paper concerns instruction sequences whose execution involves the processing of instructions by an execution environment that offers a family of services and may yield a Boolean value at termination. We introduce a composition operator for families of services and three operators that have a di

  12. Anaerobic Digestion: Process

    DEFF Research Database (Denmark)

    Angelidaki, Irini; Batstone, Damien J.

    2011-01-01

    with very little dry matter may also be called a digest. The digest should not be termed compost unless it specifically has been composted in an aerated step. This chapter describes the basic processes of anaerobic digestion. Chapter 9.5 describes the anaerobic treatment technologies, and Chapter 9...

  13. The Serendipitous Research Process

    Science.gov (United States)

    Nutefall, Jennifer E.; Ryder, Phyllis Mentzell

    2010-01-01

    This article presents the results of an exploratory study asking faculty in the first-year writing program and instruction librarians about their research process focusing on results specifically related to serendipity. Steps to prepare for serendipity are highlighted as well as a model for incorporating serendipity into a first-year writing…

  14. Automated process planning system

    Science.gov (United States)

    Mann, W.

    1978-01-01

    Program helps process engineers set up manufacturing plans for machined parts. The system allows one to develop and store a library of similar parts' characteristics, as related to a particular facility. This information is then used in an interactive system to help develop manufacturing plans that meet required standards.

  15. Agriculture and food processing

    International Nuclear Information System (INIS)

    This chapter discusses the application of nuclear technology in the agriculture sector. Nuclear technology has helped agriculture and food processing to develop tremendously. Two techniques widely used in both clusters are ionizing radiation and radioisotopes. Among the techniques for ionizing radiation are plant mutation breeding, SIT and food preservation. Meanwhile, radioisotopes are used as tracers for animal research, plant-soil relations and water sedimentology.

  16. Ultrahigh bandwidth signal processing

    Science.gov (United States)

    Oxenløwe, Leif Katsuo

    2016-04-01

    Optical time lenses have proven to be very versatile for advanced optical signal processing. Based on a controlled interplay between dispersion and phase-modulation by e.g. four-wave mixing, the processing is phase-preserving, and hence useful for all types of data signals including coherent multi-level modulation formats. This has enabled processing of phase-modulated spectrally efficient data signals, such as orthogonal frequency division multiplexed (OFDM) signals. In that case, a spectral telescope system was used, using two time lenses with different focal lengths (chirp rates), yielding a spectral magnification of the OFDM signal. Utilising such telescopic arrangements, it has become possible to perform a number of interesting functionalities, which will be described in the presentation. This includes conversion from OFDM to Nyquist WDM, compression of WDM channels to a single Nyquist channel and WDM regeneration. These operations require a broad bandwidth nonlinear platform, and novel photonic integrated nonlinear platforms like aluminum gallium arsenide nano-waveguides used for 1.28 Tbaud optical signal processing will be described.
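
    Under the space-time duality picture behind the time-lens work above, one common way to write the telescope relation is the following; the exact signs and ratio direction vary between references, so treat this as a hedged sketch rather than the authors' formula:

```latex
\varphi_j(t) = \tfrac{1}{2} K_j t^2 \quad (j = 1, 2),
\qquad
M \sim \frac{K_2}{K_1}
```

    Here \varphi_j is the quadratic temporal phase imposed by the j-th time lens, K_j its chirp rate (the analogue of a focal length), and M the resulting spectral magnification of the telescope.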

  17. Diasporic Relationships and Processes

    DEFF Research Database (Denmark)

    Singla, Rashmi

    2010-01-01

    How does moving across the geographical borders affect the relationships of diaspora members both here – in the country of residence and there- in the country of origin? The article delineates some of the processes through gendered experiences of the young adults perceived as active actors based...

  18. Stochastic conditional intensity processes

    DEFF Research Database (Denmark)

    Bauwens, Luc; Hautsch, Nikolaus

    2006-01-01

    model allows for a wide range of (cross-)autocorrelation structures in multivariate point processes. The model is estimated by simulated maximum likelihood (SML) using the efficient importance sampling (EIS) technique. By modeling price intensities based on NYSE trading, we provide significant evidence...

  19. Authenticizing the Research Process

    Directory of Open Access Journals (Sweden)

    Nora Elizondo-Schmelkes, MA, Ph.D. Candidate

    2011-06-01

    Full Text Available This study reflects the main concern of students (national and international) who are trying to get a postgraduate degree in a third world (or "in means of development") country. The emergent problem found is that students have to finish their thesis or dissertation but they do not really know how to accomplish this goal. They resolve this problem by authenticizing the process as their own. The theory of authenticizing involves compassing their way to solve the problem of advancing in the research process. Compassing allows the student to authenticize his/her research process, making it a personal and "owned" process. The main categories of compassing are the intellectual, physical and emotional dimension patterns that the student has, learns and follows in order to finish the project and get a degree. Authenticizing implies authoring their thesis or dissertation with authenticity. Compassing allows them to do this in their own way, at their own pace or time and with their own internal resources, strengths and weaknesses.

  20. Self-normalized processes

    CERN Document Server

    Heyde, C C

    2008-01-01

    Self-normalized processes are of common occurrence in probabilistic and statistical studies. This volume covers developments in the area, including self-normalized large and moderate deviations, and laws of the iterated logarithms for self-normalized martingales. It treats the theory and applications of self-normalization.
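A defining feature of the self-normalized statistics treated in this volume is scale invariance: because the normalizer is built from the data itself, rescaling the observations leaves the statistic unchanged. A minimal sketch (the classic self-normalized sum S_n/V_n, with hypothetical variable names) illustrates this:

```python
import math
import random

def self_normalized_sum(xs):
    """Classic self-normalized statistic S_n / V_n, where V_n^2 = sum of x_i^2."""
    s = sum(xs)
    v = math.sqrt(sum(x * x for x in xs))
    return s / v

rng = random.Random(0)
xs = [rng.gauss(0.0, 1.0) for _ in range(100)]
t1 = self_normalized_sum(xs)
t2 = self_normalized_sum([10.0 * x for x in xs])  # scale cancels: same statistic
```

Since the scale parameter cancels, limit results for such statistics can hold without moment conditions on the scale, which is part of what makes self-normalized large and moderate deviations tractable.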

  1. Cascaded Poisson processes

    Science.gov (United States)

    Matsuo, Kuniaki; Saleh, Bahaa E. A.; Teich, Malvin Carl

    1982-12-01

    We investigate the counting statistics for stationary and nonstationary cascaded Poisson processes. A simple equation is obtained for the variance-to-mean ratio in the limit of long counting times. Explicit expressions for the forward-recurrence and inter-event-time probability density functions are also obtained. The results are expected to be of use in a number of areas of physics.
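For intuition, a simple two-stage cascade (a Neyman Type-A process: Poisson primaries, each spawning a Poisson number of observed events) can be simulated; in the long-counting-time limit the variance-to-mean ratio of such a cascade is 1 plus the mean number of secondaries per primary. The sketch below uses illustrative parameters and is not the paper's derivation:

```python
import math
import random

def poisson(rng, lam):
    """Knuth's multiplicative Poisson sampler (adequate for moderate lam)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def cascade_counts(rate, mean_secondary, T, trials, seed=0):
    """Window counts for a cascaded (Neyman Type-A) Poisson process:
    Poisson(rate * T) primaries, each producing Poisson(mean_secondary) events."""
    rng = random.Random(seed)
    counts = []
    for _ in range(trials):
        primaries = poisson(rng, rate * T)
        counts.append(sum(poisson(rng, mean_secondary) for _ in range(primaries)))
    return counts

counts = cascade_counts(rate=2.0, mean_secondary=3.0, T=10.0, trials=20000)
mean = sum(counts) / len(counts)
var = sum((c - mean) ** 2 for c in counts) / len(counts)
fano = var / mean  # long-time theory: 1 + mean_secondary = 4
```

The simulated variance-to-mean ratio sits well above 1, the signature of clustering that distinguishes a cascaded process from an ordinary Poisson process.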

  2. Sustainability of abrasive processes

    DEFF Research Database (Denmark)

    Aurich, J.C.; Linke, B.; Hauschild, Michael Zwicky;

    2013-01-01

    This paper presents an overview of research on sustainability of abrasive processes. It incorporates results from a round robin study on ‘‘energy-efficiency of abrasive processes’’ which has been carried out within the scientific technical committee ‘‘abrasive processes’’ (STC G) of CIRP...

  3. Perspective of radiation processing

    International Nuclear Information System (INIS)

    The area of the applications of radiation techniques is very wide. This paper only relates to the applications of radiation techniques in industries including radiation chemical industry, radiation processing of foods and environmental protection by radiation, but the nuclear instruments and the instrumentations of radiation are out-side of our study. (author)

  4. Electrochemical Discharge Machining Process

    Directory of Open Access Journals (Sweden)

    Anjali V. Kulkarni

    2007-09-01

Full Text Available Electrochemical discharge machining process is evolving as a promising micromachining process. The experimental investigations in the present work substantiate this trend. In the present work, in situ, synchronised, transient temperature and current measurements have been carried out. The need for the transient measurements arose due to the time-varying nature of the discharge formation and the time-varying circuit current. Synchronised and transient measurements revealed the discrete nature of the process. They also helped in formulating the basic mechanism for the discharge formation and the material removal in the process. The temperature profile on the workpiece and in the electrochemical discharge machining cell is experimentally measured using a pyrometer and two varieties of K-type thermocouples. Surface topography of the discharge-affected zones on the workpiece has been studied using a scanning electron microscope. Measurements and surface topographical studies reveal the potential use of this process for machining in the micron regime. With careful experimental set-up design and suitable supply voltage and polarity, the process can be applied for both micromachining and micro-deposition. It can be extended for machining and/or deposition of a wide range of materials.

  5. Industrial Information Processing

    DEFF Research Database (Denmark)

    Svensson, Carsten

    2002-01-01

This paper demonstrates how cross-functional business processes may be aligned with product specification systems in an intra-organizational environment by integrating planning systems and expert systems, thereby providing an end-to-end integrated and automated solution to the “build-to-order...

  6. BUSINESS PROCESS REENGINEERING

    Directory of Open Access Journals (Sweden)

Magdalena LUCA (DEDIU)

    2014-06-01

Full Text Available Business process reengineering determines the change of organizational functions from an orientation focused on operations towards a multidimensional approach. Former employees who were mere executors are now expected to take their own decisions, and as a result the functional departments lose their reason to exist. Managers no longer act as supervisors but mainly as mentors, while employees focus more attention on customer needs and less on their superiors'. Under these conditions, new organizational paradigms are required, the most important being that of learning organizations. Information technology plays a decisive role in implementing the reengineering of economic processes and in promoting a new organizational paradigm. The article presents some results obtained in a research theme ANSTI funded by contract no. 501/2000. Economic and financial analysis is performed in order to know the current situation and to achieve better results in the future. One of its objectives is production, analyzed as a labour process, and the interaction of the elements of this process. The indicators investigated in the analysis of the financial and economic activity of production reflect the development directions, the means and resources for accomplishing predetermined objectives, and express the results and effectiveness of what is expected.

  7. Food processing in action

    Science.gov (United States)

    Radio frequency (RF) heating is a commonly used food processing technology that has been applied for drying and baking as well as thawing of frozen foods. Its use in pasteurization, as well as for sterilization and disinfection of foods, is more limited. This column will review various RF heating ap...

  8. Revitalizing stagnated policy processes

    NARCIS (Netherlands)

    Termeer, C.J.A.M.; Kessener, B.

    2007-01-01

    Many complex policy processes face stagnations. People involved sense that continuing along existing paths will not produce those outcomes that are desired and deemed necessary. But changing and developing new action strategies is difficult. This article presents the results of an action research th

  9. Udfordringer for transkulturelle processer

    DEFF Research Database (Denmark)

    Petersen, Karen Bjerg

    2013-01-01

to restrict the space of possibility for transcultural processes and for learning from a terra nullius position. The focus is on empirical studies of views of culture in the 2010 legislation on residence permits and the 2006 legislation on citizenship, as well as the view of culture in the 2003-introduced compulsory...

  10. Antecedents of institutional process

    Directory of Open Access Journals (Sweden)

    Emilio Díez de Castro

    2015-03-01

Full Text Available This research is an attempt to advance the understanding of why organizations are responsive to the institutionalization process. To this end, we describe key elements that help explain the origin of this process. Furthermore, this research has followed a qualitative research methodology, using the ‘concept mapping’ technique, and grouping the different construct items that act as motivating factors in the transformation of organizations into institutions. Methodologically, we have tried to set aside the differentiation between old and new institutionalism; this approach follows the ideas of those researchers who question the suitability of drawing a line between the “old” and the “new” theory. We consider that the role of the CEO is essential in driving the institutionalization process, though usually their decisions are supported by, or have passed through the filter of, the organization governance staff or the board of directors. Any progress that the organization makes depends fundamentally on the capabilities, perceptions, training and mindset of the CEO. The research results reinforce several key topics suggested in the literature on institutional theory. In particular, we have proposed a classification of the motives that give rise to institutional initiatives: institutional authority; advantage in management; and social involvement. This classification is consistent, to a large extent, with the pillars of institutionalization that have been defined in the institutional theory literature, helping to understand, in more detail, the origin of business processes and the background or motivations that generate and guide them.

  11. Thermal radiation processes

    NARCIS (Netherlands)

    Kaastra, J.S.; Paerels, F.; Durret, F.; Schindler, S.; Richter, P.

    2008-01-01

    We discuss the different physical processes that are important to understand the thermal X-ray emission and absorption spectra of the diffuse gas in clusters of galaxies and the warm-hot intergalactic medium. The ionisation balance, line and continuum emission and absorption properties are reviewed

  12. Supplier Evaluation Processes

    DEFF Research Database (Denmark)

    Hald, Kim Sundtoft; Ellegaard, Chris

    2011-01-01

    Purpose – The purpose of this paper is to illuminate how supplier evaluation practices are linked to supplier performance improvements. Specifically, the paper investigates how performance information travelling between the evaluating buyer and the evaluated suppliers is shaped and reshaped in th...... supplier evaluation process as its unit of analysis....

  13. Process in Humanistic Education.

    Science.gov (United States)

    Underhill, Adrian

    1989-01-01

    Outlines the themes and purposes of humanistic education in the instruction of English-as-a-Second-Language, from the perspective of teacher, trainer, student, colleague, parent, and observer, focusing on the processes, values, and attitudes that underpin humanistic education and that are drawn from humanistic psychology. (Author/CB)

  14. Governing Knowledge Processes

    DEFF Research Database (Denmark)

    Foss, Nicolai Juul; Husted, Kenneth; Michailova, Snejina;

    2003-01-01

An under-researched issue in work within the 'knowledge movement' is the relation between organizational issues and knowledge processes (i.e., sharing and creating knowledge). We argue that managers can shape formal organization structure and organization forms and can influence the more informal...

  15. Pragmatics and Information Processing.

    Science.gov (United States)

    Snyder, Lynn Sebestyen; Downey, Doris C.

    1983-01-01

    Findings from studies of attention, semantic memory, and the pragmatics of language are reviewed and implications for intervention with children whose language is disordered are discussed. Selectivity and resource allocation are the attention topics considered while schemata, frames, inferences, and narrative discourse processing are addressed…

  16. Normal modified stable processes

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Shephard, N.

    2002-01-01

    Gaussian (NGIG) laws. The wider framework thus established provides, in particular, for added flexibility in the modelling of the dynamics of financial time series, of importance especially as regards OU based stochastic volatility models for equities. In the special case of the tempered stable OU process...

  17. Image-Processing Program

    Science.gov (United States)

    Roth, D. J.; Hull, D. R.

    1994-01-01

IMAGEP manipulates digital image data to effect various processing, analysis, and enhancement functions. It is a keyboard-driven program organized into nine subroutines. Within the subroutines are sub-subroutines, also selected via keyboard. The algorithm has possible scientific, industrial, and biomedical applications in the study of flows in materials, the analysis of steels and ores, and pathology, respectively.

  18. Image Processing for Teaching.

    Science.gov (United States)

    Greenberg, R.; And Others

    1993-01-01

    The Image Processing for Teaching project provides a powerful medium to excite students about science and mathematics, especially children from minority groups and others whose needs have not been met by traditional teaching. Using professional-quality software on microcomputers, students explore a variety of scientific data sets, including…

  19. Matchmaking for business processes

    NARCIS (Netherlands)

    Wombacher, Andreas; Fankhauser, Peter; Mahleko, Bendick; Neuhold, Erich

    2003-01-01

    Web services have a potential to enhance B2B ecommerce over the Internet by allowing companies and organizations to publish their business processes on service directories where potential trading partners can find them. This can give rise to new business paradigms based on ad-hoc trading relations a

  20. Asymmetric inclusion process

    Science.gov (United States)

    Reuveni, Shlomi; Eliazar, Iddo; Yechiali, Uri

    2011-10-01

    We introduce and explore the asymmetric inclusion process (ASIP), an exactly solvable bosonic counterpart of the fermionic asymmetric exclusion process (ASEP). In both processes, random events cause particles to propagate unidirectionally along a one-dimensional lattice of n sites. In the ASEP, particles are subject to exclusion interactions, whereas in the ASIP, particles are subject to inclusion interactions that coalesce them into inseparable clusters. We study the dynamics of the ASIP, derive evolution equations for the mean and probability generating function (PGF) of the sites’ occupancy vector, obtain explicit results for the above mean at steady state, and describe an iterative scheme for the computation of the PGF at steady state. We further obtain explicit results for the load distribution in steady state, with the load being the total number of particles present in all lattice sites. Finally, we address the problem of load optimization, and solve it under various criteria. The ASIP model establishes bridges between statistical physics and queueing theory as it represents a tandem array of queueing systems with (unlimited) batch service, and a tandem array of growth-collapse processes.
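The tandem-queue picture in the abstract invites a direct numerical check: in steady state the particle flux through every site must equal the arrival rate λ, so with equal gate rates μ each site holds λ/μ particles on average and the mean total load is nλ/μ. The event-driven sketch below uses illustrative parameter values and is not the paper's code:

```python
import random

def asip_mean_load(n_sites, lam, mu, t_max, seed=1):
    """Event-driven simulation of the asymmetric inclusion process (ASIP):
    particles arrive at site 1 at rate lam; the gate of site k opens at rate mu,
    moving the entire content of site k into site k+1 (out of the lattice for
    the last site). Returns the time-averaged total load."""
    rng = random.Random(seed)
    x = [0] * n_sites
    t, area = 0.0, 0.0
    total_rate = lam + n_sites * mu
    while t < t_max:
        dt = rng.expovariate(total_rate)
        area += sum(x) * min(dt, t_max - t)  # load is constant between events
        t += dt
        if rng.random() < lam / total_rate:
            x[0] += 1                        # arrival at the first site
        else:
            k = rng.randrange(n_sites)       # a uniformly chosen gate opens
            if k < n_sites - 1:
                x[k + 1] += x[k]             # cluster coalesces into the next site
            x[k] = 0                         # site k empties completely
    return area / t_max

load = asip_mean_load(n_sites=4, lam=1.0, mu=1.0, t_max=50000.0)  # flux balance: 4.0
```

The time-averaged load converges to the flux-balance value nλ/μ, consistent with the explicit steady-state means derived in the paper.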

  1. Rethinking lessons learned processes

    NARCIS (Netherlands)

    Buttler, T.; Lukosch, S.G.; Kolfschoten, G.L.; Verbraeck, A.

    2012-01-01

Lessons learned are one way to retain experience and knowledge in project-based organizations, helping them to prevent reinventing the wheel or repeating past mistakes. However, there are several challenges that make these lessons learned processes a challenging endeavor. These include capturing k

  2. Behavioural hybrid process calculus

    NARCIS (Netherlands)

    Brinksma, H.; Krilavicius, T.

    2005-01-01

Process algebra is a theoretical framework for the modelling and analysis of the behaviour of concurrent discrete event systems that has been developed within computer science in the past quarter century. It has generated a deeper understanding of the nature of concepts such as observable behaviour in th

  3. Gaia Data Processing Architecture

    CERN Document Server

    O'Mullane, W; Bailer-Jones, C; Bastian, U; Brown, A; Drimmel, R; Eyer, L; Huc, C; Jansen, F; Katz, D; Lindegren, L; Pourbaix, D; Luri, X; Mignard, F; Torra, J; van Leeuwen, F

    2006-01-01

Gaia is ESA's ambitious space astrometry mission, the main objective of which is to astrometrically and spectro-photometrically map 1000 million celestial objects (mostly in our galaxy) with unprecedented accuracy. The announcement of opportunity for the data processing will be issued by ESA late in 2006. The Gaia Data Processing and Analysis Consortium (DPAC) has been formed recently and is preparing an answer. The satellite will downlink close to 100 TB of raw telemetry data over 5 years. To achieve its required accuracy of a few tens of microarcseconds, a highly involved processing of this data is required. In addition to the main astrometric instrument, Gaia will host a Radial Velocity instrument, two low-resolution dispersers for multi-color photometry, and two Star Mappers. Gaia is a flying Giga Pixel camera. The various instruments each require relatively complex processing while at the same time being interdependent. We describe the overall composition of the DPAC and the envisaged overall archi...

  4. A visual analysis of the process of process modeling

    OpenAIRE

    Claes, J Jan; Vanderfeesten, ITP Irene; Pinggera, J.; Reijers, HA Hajo; Weber, B.; Poels, G

    2015-01-01

    The construction of business process models has become an important requisite in the analysis and optimization of processes. The success of the analysis and optimization efforts heavily depends on the quality of the models. Therefore, a research domain emerged that studies the process of process modeling. This paper contributes to this research by presenting a way of visualizing the different steps a modeler undertakes to construct a process model, in a so-called process of process modeling C...

  5. Visualizing the Process of Process Modeling with PPMCharts

    OpenAIRE

    Claes, Jan; Vanderfeesten, Irene; Pinggera, Jakob; Reijers, Hajo A.; Weber, Barbara; Poels, Geert

    2015-01-01

    In the quest for knowledge about how to make good process models, recent research focus is shifting from studying the quality of process models to studying the process of process modeling (often abbreviated as PPM) itself. This paper reports on our efforts to visualize this specific process in such a way that relevant characteristics of the modeling process can be observed graphically. By recording each modeling operation in a modeling process, one can build an event log that can be used as i...

  6. Managing Process Variants in the Process Life Cycle

    OpenAIRE

    Hallerbach, A.; Bauer, Th.; Reichert, M.U.

    2007-01-01

    When designing process-aware information systems, often variants of the same process have to be specified. Each variant then constitutes an adjustment of a particular process to specific requirements building the process context. Current Business Process Management (BPM) tools do not adequately support the management of process variants. Usually, the variants have to be kept in separate process models. This leads to huge modeling and maintenance efforts. In particular, more fundamental proces...

  7. Biosphere Process Model Report

    Energy Technology Data Exchange (ETDEWEB)

    J. Schmitt

    2000-05-25

To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor.

  8. Cassini science planning process

    Science.gov (United States)

    Paczkowski, Brian G.; Ray, Trina L.

    2004-01-01

The mission design for Cassini-Huygens calls for a four-year orbital survey of the Saturnian system and the descent into the Titan atmosphere and eventual soft-landing of the Huygens probe. The Cassini orbiter tour consists of 76 orbits around Saturn with 44 close Titan flybys and 8 targeted icy satellite flybys. The Cassini orbiter spacecraft carries twelve scientific instruments that will perform a wide range of observations on a multitude of designated targets. The science opportunities, frequency of encounters, the length of the Tour, and the use of distributed operations pose significant challenges for developing the science plan for the orbiter mission. The Cassini Science Planning Process is the process used to develop and integrate the science and engineering plan that incorporates an acceptable level of science required to meet the primary mission objectives for the orbiter. The bulk of the integrated science and engineering plan will be developed prior to Saturn Orbit Insertion (SOI). The Science Planning Process consists of three elements: 1) the creation of the Tour Atlas, which identifies the science opportunities in the tour, 2) the development of the Science Operations Plan (SOP), which is the conflict-free timeline of all science observations and engineering activities, a constraint-checked spacecraft pointing profile, and data volume allocations to the science instruments, and 3) an Aftermarket and SOP Update process, which is used to update the SOP while in tour with the latest information on spacecraft performance, science opportunities, and ephemerides. This paper will discuss the various elements of the Science Planning Process used on the Cassini Mission to integrate, implement, and adapt the science and engineering activity plans for the Tour.

  9. Biosphere Process Model Report

    International Nuclear Information System (INIS)

To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor.

  10. HAPs-Rx: Precombustion Removal of Hazardous Air Pollutant Precursors

    Energy Technology Data Exchange (ETDEWEB)

    David J. Akers; Clifford E. Raleigh

    1998-03-16

    CQ Inc. and its project team members--Howard University, PrepTech Inc., Fossil Fuel Sciences, the United States Geological Survey (USGS), and industry advisors--are applying mature coal cleaning and scientific principles to the new purpose of removing potentially hazardous air pollutants from coal. The team uniquely combines mineral processing, chemical engineering, and geochemical expertise. This project meets more than 11 goals of the U.S. Department of Energy (DOE), the National Energy Strategy, and the 1993 Climate Change Action Plan. During this project: (1) Equations were developed to predict the concentration of trace elements in as-mined and cleaned coals. These equations, which address both conventional and advanced cleaning processes, can be used to increase the removal of hazardous air pollutant precursors (HAPs) by existing cleaning plants and to improve the design of new cleaning plants. (2) A promising chemical method of removing mercury and other HAPs was developed. At bench-scale, mercury reductions of over 50 percent were achieved on coal that had already been cleaned by froth flotation. The processing cost of this technology is projected to be less than $3.00 per ton ($3.30 per tonne). (3) Projections were made of the average trace element concentration in cleaning plant solid waste streams from individual states. Average concentrations were found to be highly variable. (4) A significantly improved understanding of how trace elements occur in coal was gained, primarily through work at the USGS during the first systematic development of semiquantitative data for mode of occurrence. In addition, significant improvement was made in the laboratory protocol for mode of occurrence determination. (5) Team members developed a high-quality trace element washability database. For example, the poorest mass balance closure for the uncrushed size and washability data for mercury on all four coals is 8.44 percent and the best is 0.46 percent. 
This indicates an

  11. 5 CFR 1653.13 - Processing legal processes.

    Science.gov (United States)

    2010-01-01

... TSP is notified in writing that the legal process has been appealed, and that the effect of the filing... 5 Administrative Personnel 3 2010-01-01 Processing legal processes. 1653.13... PROCESSES AFFECTING THRIFT SAVINGS PLAN ACCOUNTS Legal Process for the Enforcement of a Participant's...

  12. Modeling of biopharmaceutical processes. Part 2: Process chromatography unit operation

    DEFF Research Database (Denmark)

    Kaltenbrunner, Oliver; McCue, Justin; Engel, Philip;

    2008-01-01

Process modeling can be a useful tool to aid in process development, process optimization, and process scale-up. When modeling a chromatography process, one must first select the appropriate models that describe the mass transfer and adsorption that occur within the porous adsorbent
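One of the simplest candidate models of this kind is the plate (tanks-in-series) model with a linear adsorption isotherm q = K·c, in which only the equilibrium mobile fraction 1/(1 + K) of the material in each plate advances per step. The sketch below uses illustrative parameters and is not the model selected in the paper; it shows how stronger adsorption delays the elution peak:

```python
def elution_centroid(n_tanks, k_eq, n_steps=400):
    """Plate (tanks-in-series) chromatography model with a linear isotherm:
    at each step the equilibrium mobile fraction 1/(1 + k_eq) of the material
    in a tank moves one tank downstream. Returns the elution-time centroid."""
    f = 1.0 / (1.0 + k_eq)            # mobile-phase fraction at equilibrium
    total = [0.0] * n_tanks
    total[0] = 1.0                    # unit pulse injected into the first tank
    t_weighted, mass_out = 0.0, 0.0
    for step in range(1, n_steps + 1):
        moving = [f * a for a in total]
        total = [total[i] - moving[i] for i in range(n_tanks)]
        for i in range(n_tanks - 1):
            total[i + 1] += moving[i]
        t_weighted += step * moving[-1]   # material leaving the last tank elutes
        mass_out += moving[-1]
    return t_weighted / mass_out

t_fast = elution_centroid(n_tanks=20, k_eq=0.5)   # centroid near 20 * (1 + 0.5) = 30
t_slow = elution_centroid(n_tanks=20, k_eq=2.0)   # centroid near 20 * (1 + 2.0) = 60
```

Each plate delays a unit of material by a geometric number of steps with mean 1 + k_eq, so the centroid scales as n_tanks * (1 + k_eq), the discrete analogue of the familiar retention-factor relation.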

  13. Process and Post-Process: A Discursive History.

    Science.gov (United States)

    Matsuda, Paul Kei

    2003-01-01

    Examines the history of process and post-process in composition studies, focusing on ways in which terms, such as "current-traditional rhetoric,""process," and "post-process" have contributed to the discursive construction of reality. Argues that use of the term post-process in the context of second language writing needs to be guided by a…

  14. Managing Process Variants in the Process Life Cycle

    NARCIS (Netherlands)

    Hallerbach, A.; Bauer, Th.; Reichert, M.U.

    2007-01-01

    When designing process-aware information systems, often variants of the same process have to be specified. Each variant then constitutes an adjustment of a particular process to specific requirements building the process context. Current Business Process Management (BPM) tools do not adequately supp

  15. Meadow enriched ACP process algebras

    OpenAIRE

    J.A. Bergstra; Middelburg, C.A.

    2009-01-01

    We introduce the notion of an ACP process algebra. The models of the axiom system ACP are the origin of this notion. ACP process algebras have to do with processes in which no data are involved. We also introduce the notion of a meadow enriched ACP process algebra, which is a simple generalization of the notion of an ACP process algebra to processes in which data are involved. In meadow enriched ACP process algebras, the mathematical structure for data is a meadow.

  16. Clinical Process Intelligence

    DEFF Research Database (Denmark)

    Vilstrup Pedersen, Klaus

    2006-01-01

Ideally, clinical guidelines are created from evidence based medicine. Translating the narrative semi-structured article format of the clinical guidelines into a computable language makes it possible to utilize this information in IT-supported clinical reasoning, and thereby bring the relevant knowledge to the point of care in a proactive manner. Attention has been on the translation of guidelines and the integration with the electronic medical record (EMR). Less attention has been paid to the evaluation of clinical practice in relation to clinical guidelines. I claim that single patient cases, generalized processes deducted from many patient cases, and clinical guidelines are all guidelines that may be used in clinical reasoning. The difference is the validity of the suggested actions and decisions. In the case of generalized processes deducted from EMRs, we have "what we usually do" practice i...

  17. Youpi: YOUr processing PIpeline

    Science.gov (United States)

    Monnerville, Mathias; Sémah, Gregory

    2012-03-01

    Youpi is a portable, easy to use web application providing high level functionalities to perform data reduction on scientific FITS images. Built on top of various open source reduction tools released to the community by TERAPIX (http://terapix.iap.fr), Youpi can help organize data, manage processing jobs on a computer cluster in real time (using Condor) and facilitate teamwork by allowing fine-grain sharing of results and data. Youpi is modular and comes with plugins which perform, from within a browser, various processing tasks such as evaluating the quality of incoming images (using the QualityFITS software package), computing astrometric and photometric solutions (using SCAMP), resampling and co-adding FITS images (using SWarp) and extracting sources and building source catalogues from astronomical images (using SExtractor). Youpi is useful for small to medium-sized data reduction projects; it is free and is published under the GNU General Public License.

  18. Integral Politics as Process

    Directory of Open Access Journals (Sweden)

    Tom Atlee

    2010-03-01

Full Text Available Using the definition proposed here, integral politics can be a process of integrating diverse perspectives into wholesome guidance for a community or society. Characteristics that follow from this definition have ramifications for understanding what such political processes involve. Politics becomes integral as it transcends partisan battle and nurtures generative conversation toward the common good. Problems, conflicts and crises become opportunities for new (or renewed) social coherence. Conversational methodologies abound that can help citizen awareness temporarily expand during policy-making, thus helping raise society’s manifested developmental stage. Convening archetypal stakeholders or randomly selected citizens in conversations designed to engage the broader public enhances democratic legitimacy. With minimal issue- and candidate-advocacy, integral political leaders would develop society’s capacity to use integral conversational tools to improve its health, resilience, and collective intelligence. This both furthers and manifests evolution becoming conscious of itself.

  19. Posttranslational processing of progastrin

    DEFF Research Database (Denmark)

    Bundgaard, Jens René; Rehfeld, Jens F.

    2010-01-01

    Gastrin and cholecystokinin (CCK) are homologous hormones with important functions in the brain and the gut. Gastrin is the main regulator of gastric acid secretion and gastric mucosal growth, whereas cholecystokinin regulates gall bladder emptying, pancreatic enzyme secretion and besides acts...... as a major neurotransmitter in the central and peripheral nervous systems. The tissue-specific expression of the hormones is regulated at the transcriptional level, but the posttranslational phase is also decisive and is highly complex in order to ensure accurate maturation of the prohormones in a cell...... processing progastrin is often greatly disturbed in neoplastic cells.The posttranslational phase of the biogenesis of gastrin and the various progastrin products in gastrin gene-expressing tissues is now reviewed here. In addition, the individual contributions of the processing enzymes are discussed...

  20. Speech processing standards

    Science.gov (United States)

    Ince, A. Nejat

    1990-05-01

Speech processing standards are given for 64, 32, 16 kb/s and lower rate speech and, more generally, speech-band signals which are or will be promulgated by CCITT and NATO. The International Telegraph and Telephone Consultative Committee (CCITT) is the international body which deals, among other things, with speech processing within the context of ISDN. Within NATO there are also bodies promulgating standards which make interoperability possible without complex and expensive interfaces. Some of the applications for low-bit-rate voice, and the related work undertaken by CCITT Study Groups which are responsible for developing standards in terms of encoding algorithms, codec design objectives as well as standards on the assessment of speech quality, are highlighted.

  1. The aluminum smelting process.

    Science.gov (United States)

    Kvande, Halvor

    2014-05-01

    This introduction to the industrial primary aluminum production process presents a short description of the electrolytic reduction technology, the history of aluminum, and the importance of this metal and its production process to modern society. Aluminum's special qualities have enabled advances in technologies coupled with energy and cost savings. Aircraft capabilities have been greatly enhanced, and increases in size and capacity are made possible by advances in aluminum technology. The metal's flexibility for shaping and extruding has led to architectural advances in energy-saving building construction. The high strength-to-weight ratio has meant a substantial reduction in energy consumption for trucks and other vehicles. The aluminum industry is therefore a pivotal one for ecological sustainability and strategic for technological development.

  2. PREFACE: Quantum information processing

    Science.gov (United States)

    Briggs, Andrew; Ferry, David; Stoneham, Marshall

    2006-05-01

    Microelectronics and the classical information technologies transformed the physics of semiconductors. Photonics has given optical materials a new direction. Quantum information technologies, we believe, will have immense impact on condensed matter physics. The novel systems of quantum information processing need to be designed and made. Their behaviours must be manipulated in ways that are intrinsically quantal and generally nanoscale. Both in this special issue and in previous issues (see e.g., Spiller T P and Munro W J 2006 J. Phys.: Condens. Matter 18 V1-10) we see the emergence of new ideas that link the fundamentals of science to the pragmatism of market-led industry. We hope these papers will be followed by many others on quantum information processing in the Journal of Physics: Condensed Matter.

  3. Nested Hierarchical Dirichlet Processes.

    Science.gov (United States)

    Paisley, John; Wang, Chong; Blei, David M; Jordan, Michael I

    2015-02-01

    We develop a nested hierarchical Dirichlet process (nHDP) for hierarchical topic modeling. The nHDP generalizes the nested Chinese restaurant process (nCRP) to allow each word to follow its own path to a topic node according to a per-document distribution over the paths on a shared tree. This alleviates the rigid, single-path formulation assumed by the nCRP, allowing documents to easily express complex thematic borrowings. We derive a stochastic variational inference algorithm for the model, which enables efficient inference for massive collections of text documents. We demonstrate our algorithm on 1.8 million documents from The New York Times and 2.7 million documents from Wikipedia. PMID:26353240
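The nested CRP that the nHDP generalizes can be illustrated with a small simulation. In the sketch below (plain Python; the function names and dict-based tree are our own illustration, not the authors' code), each tree node runs its own Chinese restaurant process and a document's word follows a root-to-leaf path by choosing a child at every level:

```python
import random

def crp_draw(counts, alpha):
    """Draw a table index from a Chinese restaurant process.

    counts: customer counts per existing table.
    alpha:  concentration parameter; larger values favour new tables.
    Returns an index into counts, or len(counts) to open a new table.
    """
    total = sum(counts) + alpha
    r = random.uniform(0, total)
    for i, c in enumerate(counts):
        if r < c:
            return i
        r -= c
    return len(counts)  # new table = new child node in the tree

def ncrp_path(tree, alpha, depth):
    """Sample one root-to-leaf path of length `depth` in a nested CRP.

    tree: dict mapping a path tuple to the child counts at that node.
    """
    path = ()
    for _ in range(depth):
        counts = tree.setdefault(path, [])
        k = crp_draw(counts, alpha)
        if k == len(counts):
            counts.append(0)   # open a brand-new child
        counts[k] += 1
        path = path + (k,)
    return path

random.seed(0)
tree = {}
paths = [ncrp_path(tree, alpha=1.0, depth=3) for _ in range(100)]
```

Popular paths get reinforced ("rich get richer"), which is exactly the rigidity the nHDP relaxes by giving each document its own distribution over paths.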

  4. The Player Engagement Process

    DEFF Research Database (Denmark)

    Schoenau-Fog, Henrik

    2011-01-01

    Engagement is an essential element of the player experience, and the concept is described in various ways in the literature. To gain a more detailed comprehension of this multifaceted concept, and in order to better understand what aspects can be used to evaluate engaging game play and to design...... engaging user experiences, this study investigates one dimension of player engagement by empirically identifying the components associated with the desire to continue playing. Based on a description of the characteristics of player engagement, a series of surveys were developed to discover the components......, categories and triggers involved in this process. By applying grounded theory to the analysis of the responses, a process-oriented player engagement framework was developed and four main components consisting of objectives, activities, accomplishments and affects as well as the corresponding categories...

  5. Processing of Video Records

    OpenAIRE

    Čerešňák, Michal

    2013-01-01

This thesis presents a system for video processing focused on face detection and recognition. The system processes a video stream from a camera in real time, using the Viola-Jones detector for face detection and the SURF method for face recognition. It is implemented in C# and uses the OpenCV library and the Emgu CV wrapper.

  6. Process Improvement: Customer Service.

    Science.gov (United States)

    Cull, Donald

    2015-01-01

    Utilizing the comment section of patient satisfaction surveys, Clark Memorial Hospital in Jeffersonville, IN went through a thoughtful process to arrive at an experience that patients said they wanted. Two Lean Six Sigma tools were used--the Voice of the Customer (VoC) and the Affinity Diagram. Even when using these tools, a facility will not be able to accomplish everything the patient may want. Guidelines were set and rules were established for the Process Improvement Team in order to lessen frustration, increase focus, and ultimately be successful. The project's success is driven by the team members carrying its message back to their areas. It's about ensuring that everyone is striving to improve the patients' experience by listening to what they say is being done right and what they say can be done better. And then acting on it.

  7. Yeast nuclear RNA processing

    Institute of Scientific and Technical Information of China (English)

Jade Bernstein; Eric A. Toth

    2012-01-01

Nuclear RNA processing requires dynamic and intricately regulated machinery composed of multiple enzymes and their cofactors. In this review, we summarize recent experiments using Saccharomyces cerevisiae as a model system that have yielded important insights regarding the conversion of pre-RNAs to functional RNAs, and the elimination of aberrant RNAs and unneeded intermediates from the nuclear RNA pool. Much progress has been made recently in describing the 3D structure of many elements of the nuclear degradation machinery and its cofactors. Similarly, the regulatory mechanisms that govern RNA processing are gradually coming into focus. Such advances invariably generate many new questions, which we highlight in this review.

  8. Instabilities in sensory processes

    Science.gov (United States)

    Balakrishnan, J.

    2014-07-01

    In any organism there are different kinds of sensory receptors for detecting the various, distinct stimuli through which its external environment may impinge upon it. These receptors convey these stimuli in different ways to an organism's information processing region enabling it to distinctly perceive the varied sensations and to respond to them. The behavior of cells and their response to stimuli may be captured through simple mathematical models employing regulatory feedback mechanisms. We argue that the sensory processes such as olfaction function optimally by operating in the close proximity of dynamical instabilities. In the case of coupled neurons, we point out that random disturbances and fluctuations can move their operating point close to certain dynamical instabilities triggering synchronous activity.
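The claim that sensory processes operate close to dynamical instabilities can be illustrated with a toy feedback model (all parameters below are illustrative, not taken from the paper). The sketch integrates the normal form of a pitchfork bifurcation, dx/dt = a·x − x³, with small noise: below the instability (a < 0) the state hovers near zero, while just past it (a > 0) fluctuations push the system to a new operating point near ±√a:

```python
import math
import random

def simulate(a, noise, steps=50000, dt=1e-3, seed=1):
    """Euler-Maruyama integration of dx/dt = a*x - x**3 plus noise.

    For a < 0 the origin is stable; for a > 0 it is unstable and the
    state settles near +/- sqrt(a): a pitchfork instability.
    """
    rng = random.Random(seed)
    x = 0.0
    for _ in range(steps):
        x += (a * x - x ** 3) * dt + noise * math.sqrt(dt) * rng.gauss(0, 1)
    return x

below = abs(simulate(a=-1.0, noise=0.05))  # stable regime: hovers near 0
above = abs(simulate(a=+1.0, noise=0.05))  # past the instability: near 1
```

Near a = 0 a tiny change in the control parameter, or a random kick, produces a qualitatively different response, which is the sense in which operating near an instability maximizes sensitivity.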

  9. Thin film interconnect processes

    Science.gov (United States)

    Malik, Farid

    Interconnects and associated photolithography and etching processes play a dominant role in the feature shrinkage of electronic devices. Most interconnects are fabricated by use of thin film processing techniques. Planarization of dielectrics and novel metal deposition methods are the focus of current investigations. Spin-on glass, polyimides, etch-back, bias-sputtered quartz, and plasma-enhanced conformal films are being used to obtain planarized dielectrics over which metal films can be reliably deposited. Recent trends have been towards chemical vapor depositions of metals and refractory metal silicides. Interconnects of the future will be used in conjunction with planarized dielectric layers. Reliability of devices will depend to a large extent on the quality of the interconnects.

  10. Improving staff selection processes.

    Science.gov (United States)

    Cerinus, Marie; Shannon, Marina

    2014-11-11

    This article, the second in a series of articles on Leading Better Care, describes the actions undertaken in recent years in NHS Lanarkshire to improve selection processes for nursing, midwifery and allied health professional (NMAHP) posts. This is an area of significant interest to these professions, management colleagues and patients given the pivotal importance of NMAHPs to patient care and experience. In recent times the importance of selecting staff not only with the right qualifications but also with the right attributes has been highlighted to ensure patients are well cared for in a safe, effective and compassionate manner. The article focuses on NMAHP selection processes, tracking local, collaborative development work undertaken to date. It presents an overview of some of the work being implemented, highlights a range of important factors, outlines how evaluation is progressing and concludes by recommending further empirical research.

  11. Soft Pion Processes

    Science.gov (United States)

    Nambu, Y.

    1968-01-01

    My talk is concerned with a review, not necessarily of the latest theoretical developments, but rather of an old idea which has contributed to recent theoretical activities. By soft pion processes I mean processes in which low energy pions are emitted or absorbed or scattered, just as we use the word soft photon in a similar context. Speaking more quantitatively, we may call a pion soft if its energy is small compared to a natural scale in the reaction. This scale is determined by the particular dynamics of pion interaction, and one may roughly say that a pion is soft if its energy is small compared to the energies of the other individual particles that participate in the reaction. It is important to note at this point that pion is by far the lightest member of all the hadrons, and much of the success of the soft pion formulas depends on this fact.

  12. Plutonium dissolution process

    Science.gov (United States)

    Vest, Michael A.; Fink, Samuel D.; Karraker, David G.; Moore, Edwin N.; Holcomb, H. Perry

    1996-01-01

A two-step process for dissolving plutonium metal, which two steps can be carried out sequentially or simultaneously. Plutonium metal is exposed to a first mixture containing approximately 1.0M-1.67M sulfamic acid and 0.0025M-0.1M fluoride, the mixture having been heated to a temperature between 45.degree. C. and 70.degree. C. The mixture will dissolve a first portion of the plutonium metal but leave a portion of the plutonium in an oxide residue. Then, a mineral acid and additional fluoride are added to dissolve the residue. Alternatively, nitric acid in a concentration between approximately 0.05M and 0.067M is added to the first mixture to dissolve the residue as it is produced. Hydrogen released during the dissolution process is diluted with nitrogen.

  13. Attentional processes and meditation.

    Science.gov (United States)

    Hodgins, Holley S; Adair, Kathryn C

    2010-12-01

    Visual attentional processing was examined in adult meditators and non-meditators on behavioral measures of change blindness, concentration, perspective-shifting, selective attention, and sustained inattentional blindness. Results showed that meditators (1) noticed more changes in flickering scenes and noticed them more quickly, (2) counted more accurately in a challenging concentration task, (3) identified a greater number of alternative perspectives in multiple perspectives images, and (4) showed less interference from invalid cues in a visual selective attention task, but (5) did not differ on a measure of sustained inattentional blindness. Together, results show that regular meditation is associated with more accurate, efficient, and flexible visual attentional processing across diverse tasks that have high face validity outside of the laboratory. Furthermore, effects were assessed in a context separate from actual meditation practice, suggesting that meditators' better visual attention is not just immediate, but extends to contexts separate from meditation practice.

  15. A Logical Process Calculus

    Science.gov (United States)

    Cleaveland, Rance; Luettgen, Gerald; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

This paper presents the Logical Process Calculus (LPC), a formalism that supports heterogeneous system specifications containing both operational and declarative subspecifications. Syntactically, LPC extends Milner's Calculus of Communicating Systems with operators from the alternation-free linear-time mu-calculus (LT(mu)). Semantically, LPC is equipped with a behavioral preorder that generalizes Hennessy's and DeNicola's must-testing preorder as well as LT(mu)'s satisfaction relation, while being compositional for all LPC operators. From a technical point of view, the new calculus is distinguished by the inclusion of: (1) both minimal and maximal fixed-point operators and (2) an unimplementability predicate on process terms, which tags inconsistent specifications. The utility of LPC is demonstrated by means of an example highlighting the benefits of heterogeneous system specification.
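Minimal and maximal fixed-point operators of the kind LPC inherits from the mu-calculus can be computed by naive iteration over a finite state space. A hedged sketch (the transition system and formulas below are invented for illustration, not taken from the paper):

```python
def lfp(f):
    """Least fixed point of a monotone set operator, iterating up from {}."""
    s = frozenset()
    while f(s) != s:
        s = f(s)
    return s

def gfp(f, universe):
    """Greatest fixed point, iterating down from the full set."""
    s = frozenset(universe)
    while f(s) != s:
        s = f(s)
    return s

# Invented 4-state transition system for illustration.
states = {0, 1, 2, 3}
succ = {0: {1}, 1: {2}, 2: {2}, 3: {3}}
goal = {2}

def diamond(s):
    """<>s: states with at least one successor in s."""
    return frozenset(q for q in states if succ[q] & s)

# mu X. goal \/ <>X : states from which `goal` is reachable.
reach = lfp(lambda s: frozenset(goal) | diamond(s))
# nu X. <>X : states admitting an infinite run.
live = gfp(diamond, states)
```

Monotonicity guarantees both iterations terminate on a finite universe (Knaster-Tarski); the mu-formula stabilizes at {0, 1, 2} here, since state 3 can never reach the goal.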

  16. Process gas solidification system

    International Nuclear Information System (INIS)

    A process for withdrawing gaseous UF6 from a first system and directing same into a second system for converting the gas to liquid UF6 at an elevated temperature, additionally including the step of withdrawing the resulting liquid UF6 from the second system, subjecting it to a specified sequence of flash-evaporation, cooling and solidification operations, and storing it as a solid in a plurality of storage vessels. (author)

  17. Business processes optimization possibilities

    OpenAIRE

    Mitreva, Elizabeta; Taskov, Nako; Filiposki, Oliver; Boskov, Tatjana

    2013-01-01

In this paper we propose a model and opportunities for better performance, greater efficiency and effectiveness of companies through optimization of business processes, change of corporate culture and ways of taking full advantage of the business potential. The benefits of implementing this model not only increase companies' business results and serve as a driving force for continuous improvement, but also increase the commitment of the top management and...

  18. A catalytic cracking process

    Energy Technology Data Exchange (ETDEWEB)

    Degnan, T.F.; Helton, T.E.

    1995-07-20

Heavy oils are subjected to catalytic cracking in the absence of added hydrogen using a catalyst containing a zeolite having the structure of ZSM-12 and a large-pore crystalline zeolite having a Constraint Index less than about 1. The process is able to effect a bulk conversion of the oil while at the same time yielding a higher-octane gasoline and an increased light-olefin content. (author)

  19. Disability and Identity Processes

    Directory of Open Access Journals (Sweden)

    Karin Garzón D.

    2007-09-01

Full Text Available This article presents a reflection on disability as a human phenomenon, one that needs not only new definitions but also a critical point of view as a concept, with the aim of indicating forms of representation not only for people with disabilities but for citizens in general. It considers discourses that identify arguments from the social sciences, presenting disability as a differentiating condition between human beings, produced mainly by an exclusion process that shapes social behaviour in relations with persons with disabilities.

  20. Exact Numerical Processing

    OpenAIRE

    García Chamizo, Juan Manuel; Mora Pascual, Jerónimo Manuel; Mora Mora, Higinio

    2003-01-01

A model of exact arithmetic processing is presented. We describe a representation format that gives us a greater expressive capability and covers a wider numerical set. The rational numbers are represented by means of fractional notation and explicit codification of their periodic part. We also give a brief description of exact arithmetic operations on the proposed format. This model constitutes a good alternative to symbolic arithmetic, especially when exact numerical values are required.
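The "explicit codification of the periodic part" can be sketched with Python's exact `Fraction` type. The long-division routine below is our own illustration of how a rational's terminating and repeating decimal digits can be recovered; the function name and return format are invented, not the authors' representation:

```python
from fractions import Fraction

def decimal_expansion(frac):
    """Return (integer_part, non_periodic_digits, periodic_digits) of a
    non-negative rational, by long division with remainder tracking.

    A remainder repeating marks the start of the periodic part; a zero
    remainder means the expansion terminates.
    """
    n, d = frac.numerator, frac.denominator
    integer, n = divmod(n, d)
    digits, seen = [], {}
    while n and n not in seen:
        seen[n] = len(digits)
        n *= 10
        q, n = divmod(n, d)
        digits.append(str(q))
    if not n:
        return integer, "".join(digits), ""
    start = seen[n]
    return integer, "".join(digits[:start]), "".join(digits[start:])

# 1/6 = 0.1(6): one non-periodic digit, then "6" repeating.
expansion = decimal_expansion(Fraction(1, 6))
# Exact arithmetic: no rounding error accumulates.
exact = Fraction(1, 3) + Fraction(1, 6) == Fraction(1, 2)
```

Because the remainder is bounded by the denominator, the loop always terminates, so every rational gets a finite (integer, prefix, period) code.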

  1. Karst processes and time

    OpenAIRE

    Bosák, Pavel

    2008-01-01

    Karst evolution depends particularly on the time available for process evolution and on the geographical and geological conditions of the exposure of the rock. The longer the time, the higher the hydraulic gradient and the larger the amount of solvent water entering the karst system, the more evolved is the karst. In general, stratigraphic discontinuities directly influence the intensity and extent of karstification. Unconformities influence the stratigraphy of the karst through the time-span...

  2. The Thermox Process

    International Nuclear Information System (INIS)

The Thermox process is a process developed by AB Atomenergi for the decladding and dissolution of irradiated Zircaloy-2 clad uranium dioxide fuel elements and consists of the following stages: 1. Decladding by means of thermal oxidation of the Zircaloy-2 with oxygen and water vapour at 825 C using nitrogen as a catalyst. 2. Oxidation of the uranium dioxide pellets with air and oxygen to U3O8 at a temperature of 450 - 650 C. 3. Dissolving and leaching the uranium oxides with dilute nitric acid leaving the insoluble zirconium oxide as a residue. 4. Filtering the solution and washing the residues of the cladding. The work has included the following parts: the laboratory scale investigation of the conditions for the oxidation of Zircaloy-2 in various gas mixtures and of the conditions for oxidizing and dissolving sintered UO2 pellets; the development on a pilot plant scale of suitable apparatus and process techniques for the safe and reproducible treatment of half length inactive fuel elements; and studies of some special operation and handling problems, which have to be solved before the method can be applied in full scale. Five half length fuel elements have been treated, and the results have been satisfactory. The pilot plant experiments have proved that inactive fuel elements can be decanned, oxidized and dissolved by means of the Thermox process. Solutions and canning residues are easy to filter, separate, and handle and are free from corroding agents. The uranium losses can be kept very low. The zirconium dioxide is obtained in a form suitable for permanent disposal.

  3. Topology and mental processes.

    Science.gov (United States)

    McLeay, H

    2000-08-01

    The study reported here considers the effect of rotation on the decision time taken to compare nonrigid objects, presented as like and unlike pairs of knots and unknots. The results for 48 subjects, 21 to 45 years old, support the notion that images which have a characteristic 'foundation part' are more easily stored and accessed in the brain. Also, there is evidence that the comparison of deformable objects is processed by mental strategies other than self-evident mental rotation.

  4. Electrochemical Discharge Machining Process

    OpenAIRE

    Anjali V. Kulkarni

    2007-01-01

Electrochemical discharge machining process is evolving as a promising micromachining process. The experimental investigations in the present work substantiate this trend. In the present work, in situ, synchronised, transient temperature and current measurements have been carried out. The need for the transient measurements arose due to the time-varying nature of the discharge formation and the time-varying circuit current. Synchronised and transient measurements revealed the discrete nature of the pr...

  5. Thermal radiation processes

    OpenAIRE

    Kaastra, J. S.; Paerels, F.; Durret, F; S. Schindler; Richter, P.

    2008-01-01

    We discuss the different physical processes that are important to understand the thermal X-ray emission and absorption spectra of the diffuse gas in clusters of galaxies and the warm-hot intergalactic medium. The ionisation balance, line and continuum emission and absorption properties are reviewed and several practical examples are given that illustrate the most important diagnostic features in the X-ray spectra.

  6. Image processing occupancy sensor

    Energy Technology Data Exchange (ETDEWEB)

    Brackney, Larry J.

    2016-09-27

    A system and method of detecting occupants in a building automation system environment using image based occupancy detection and position determinations. In one example, the system includes an image processing occupancy sensor that detects the number and position of occupants within a space that has controllable building elements such as lighting and ventilation diffusers. Based on the position and location of the occupants, the system can finely control the elements to optimize conditions for the occupants, optimize energy usage, among other advantages.
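Detecting the number and position of occupants from an image reduces, in the simplest case, to thresholding and connected-component labeling. The stdlib-only sketch below is a hedged illustration of that idea, not the patented method; the toy intensity grid and function name are invented:

```python
from collections import deque

def find_blobs(frame, threshold):
    """Return connected bright regions ('occupants') in a 2D intensity grid.

    Pixels above `threshold` are foreground; 4-connected components are
    gathered with a breadth-first search. Each blob is a list of (row, col)
    cells, from which a position (e.g. the centroid) could be derived.
    """
    rows, cols = len(frame), len(frame[0])
    mask = [[v > threshold for v in row] for row in frame]
    seen = [[False] * cols for _ in range(rows)]
    blobs = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                queue, cells = deque([(r, c)]), []
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    cells.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                           and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                blobs.append(cells)
    return blobs

# Toy 3x6 "frame" with two bright regions.
frame = [
    [0, 9, 9, 0, 0, 0],
    [0, 9, 9, 0, 8, 8],
    [0, 0, 0, 0, 8, 8],
]
blobs = find_blobs(frame, threshold=5)
```

The blob count drives the occupancy decision, and blob positions could be mapped to the zones served by individual lighting or ventilation elements.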

  7. Field Geology/Processes

    OpenAIRE

    Allen, Carlton C.; Jakeš, Petr; Jaumann, Ralf; Moses, Stewart; Marshall, John; Ryder, Graham; Saunders, Stephen; Singer, Robert B.

    1995-01-01

    The field geology/processes group examined the basic operations of a terrestrial field geologist and the manner in which these operations could be transferred to a planetary lander. We determined four basic requirements for robotic field geology: geologic context, surface vision, mobility, and manipulation. Geologic context requires a combination of orbital and descent imaging. Surface vision requirements include range, resolution, stereo, and multispectral imaging. The minimum mobility for u...

  8. Fastdata processing with Spark

    CERN Document Server

    Karau, Holden

    2013-01-01

This book will be a basic, step-by-step tutorial, which will help readers take advantage of all that Spark has to offer. Fastdata Processing with Spark is for software developers who want to learn how to write distributed programs with Spark. It will help developers who have had to deal with problems that were too large for a single computer. No previous experience with distributed programming is necessary. This book assumes knowledge of either Java, Scala, or Python.
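The word-count example that usually introduces Spark follows a flatMap → map → reduceByKey pipeline. As a hedged sketch, the same dataflow can be mimicked in plain Python (no Spark required; the input lines are invented, and the PySpark one-liner in the comment is the conventional equivalent):

```python
from itertools import groupby

# In PySpark the same pipeline is typically written as:
#   lines.flatMap(str.split).map(lambda w: (w, 1)).reduceByKey(lambda a, b: a + b)
lines = ["spark makes fast data processing", "fast data wins"]

# flatMap: split every line into words
words = [w for line in lines for w in line.split()]
# map: pair each word with an initial count of 1
pairs = [(w, 1) for w in words]
# "shuffle": bring equal keys together, then reduce each group
counts = {key: sum(n for _, n in group)
          for key, group in groupby(sorted(pairs), key=lambda kv: kv[0])}
```

In Spark each stage runs partitioned across the cluster, with the shuffle moving equal keys to the same machine; the single-machine version above only illustrates the dataflow.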

  9. Lie detection: Cognitive Processes

    OpenAIRE

    Street, C. N. H.

    2013-01-01

    How do we make decisions when we are uncertain? In more real-world settings there is often a vast array of information available to guide the decision, from an understanding of the social situation, to prior beliefs and experience, to information available in the current environment. Yet much of the research into uncertain decision-making has typically studied the process by isolating it from this rich source of information that decision-makers usually have available to them. This thesis take...

  10. Femtosecond laser materials processing

    Energy Technology Data Exchange (ETDEWEB)

    Stuart, B. C., LLNL

    1998-06-02

Femtosecond lasers enable materials processing of most any material with extremely high precision and negligible shock or thermal loading to the surrounding area. Applications ranging from drilling teeth to cutting explosives to making high-aspect-ratio cuts in metals with no heat-affected zone are made possible by this technology. For material removal at reasonable rates, we developed a fully computer-controlled 15-Watt average power, 100-fs laser machining system.

  11. Femtosecond laser materials processing

    Energy Technology Data Exchange (ETDEWEB)

    Stuart, B

    1998-08-05

    Femtosecond lasers enable materials processing of most any material with extremely high precision and negligible shock or thermal loading to the surrounding area. Applications ranging from drilling teeth to cutting explosives to precision cuts in composites are possible by using this technology. For material removal at reasonable rates, we have developed a fully computer-controlled 15-Watt average power, 100-fs laser machining system.

  12. Sustainable friendly processing

    OpenAIRE

    Kristensen, Niels Heine

    2004-01-01

A comparison of the sustainability approach and the organic agriculture movement is made. Organic agriculture and processors of organic foods focused on developing new (innovative) production methods, whereas sustainability strategies focused more on improving or adjusting existing production technologies. Today a convergence between sustainability strategies and organic strategies is more likely to be found in organic agriculture and food processing. But basically the strategies em...

  13. Radiation processing in Hungary

    International Nuclear Information System (INIS)

    Hungary has 10.7 million population in 100,000 km2 territory. The gross national product is about $3,000 per capita per year. Hungary is a country with highly developed agriculture and medium degree developed industries. The Hungarian economy is an open economy because more than 40% of the national income is earned by export. The research and development works on various radiation processing have been performed for 25 years. In the Central Research Institute for Physics of the Hungarian Academy of Sciences, a laboratory was organized for the basic research of radiation chemistry and the moderator materials for nuclear reactors. Also the activities in the Central Research Institute for Chemistry, the Institute of Isotopes, the Research Institute for Plastics Industry, and the Central Research Institute for Food Industry are briefly reported. The largest radiation processing unit in Hungary is the automatic sterilization plant of Medicor Works in Debrecen with 350 kCi Co-60 source. The second important field of radiation processing is the irradiation of foods and agricultural products, and the radiation unit with 150 kCi Co-60 source is in the Central Research Institute for Food Industry. Radiation cross-linked polyethylene production, the production of wood-unsaturated polyester composite and enzyme immobilization are performed. (Kako, I.)

  14. Advanced microwave processing concepts

    Energy Technology Data Exchange (ETDEWEB)

    Lauf, R.J.; McMillan, A.D.; Paulauskas, F.L. [Oak Ridge National Lab., TN (United States)

    1997-04-01

    The purpose of this work is to explore the feasibility of several advanced microwave processing concepts to develop new energy-efficient materials and processes. The project includes two tasks: (1) commercialization of the variable-frequency microwave furnace; and (2) microwave curing of polymeric materials. The variable frequency microwave furnace, whose initial conception and design was funded by the AIM Materials Program, allows the authors, for the first time, to conduct microwave processing studies over a wide frequency range. This novel design uses a high-power traveling wave tube (TWT) originally developed for electronic warfare. By using this microwave source, one can not only select individual microwave frequencies for particular experiments, but also achieve uniform power densities over a large area by the superposition of many different frequencies. Microwave curing of various thermoset resins will be studied because it holds the potential of in-situ curing of continuous-fiber composites for strong, lightweight components or in-situ curing of adhesives, including metal-to-metal. Microwave heating can shorten curing times, provided issues of scaleup, uniformity, and thermal management can be adequately addressed.

  15. Process measuring techniques; Prozessmesstechnik

    Energy Technology Data Exchange (ETDEWEB)

    Freudenberger, A.

    2000-07-01

This introduction into measurement techniques for chemical and process-technical plant in science and industry describes in detail the methods used to measure basic quantities. Most prominent are modern measuring techniques by means of ultrasound, microwaves and the Coriolis effect. Alongside physical and measuring technique fundamentals, the practical applications of measuring devices are described. Calculation examples are given to illustrate the subject matter. The book addresses students of physical engineering, process engineering and environmental engineering at technical schools as well as engineers of other disciplines wishing to familiarize themselves with the subject of process measurement techniques. (orig.)

  16. Laser processing of materials

    Indian Academy of Sciences (India)

    J Dutta Majumdar; I Manna

    2003-06-01

Light amplification by stimulated emission of radiation (laser) is a coherent and monochromatic beam of electromagnetic radiation that can propagate in a straight line with negligible divergence and occurs over a wide range of wavelength, energy/power and beam-modes/configurations. As a result, lasers find wide applications in the mundane to the most sophisticated devices, in commercial to purely scientific purposes, and in life-saving as well as life-threatening causes. In the present contribution, we provide an overview of the application of lasers for material processing. The processes covered are broadly divided into four major categories, namely, laser-assisted forming, joining, machining and surface engineering. Apart from briefly introducing the fundamentals of these operations, we present an updated review of the relevant literature to highlight the recent advances and open questions. We begin our discussion with the general applications of lasers, fundamentals of laser-matter interaction and classification of laser material processing. A major part of the discussion focuses on laser surface engineering, which has attracted a good deal of attention from the scientific community for its technological significance and scientific challenges. In this regard, special mention is made of laser surface vitrification or amorphization, which remains a very attractive but unaccomplished proposition.

  17. Processing of lateritic ores

    International Nuclear Information System (INIS)

    Highly weathered or lateritic ores that contain high proportions of fine clay minerals present specific problems when they are processed to extract uranium. Of perhaps the greatest significance is the potential of the fine minerals to adsorb dissolved uranium (preg-robbing) from leach liquors produced by processing laterites or blends of laterite and primary ores. These losses can amount to 25% of the readily soluble uranium. The clay components can also restrict practical slurry densities to relatively low values in order to avoid rheology problems in pumping and agitation. The fine fractions also contribute to relatively poor solid-liquid separation characteristics in settling and/or filtration. Studies at ANSTO have characterised the minerals believed to be responsible for these problems and quantified the effects of the fines in these types of ores. Processing strategies were also examined, including roasting, resin-in-leach and separate leaching of the laterite fines to overcome potential problems. The incorporation of the preferred treatment option into an existing mill circuit is discussed. (author)

  18. Advanced microwave processing concepts

    Energy Technology Data Exchange (ETDEWEB)

    Lauf, R.J.; McMillan, A.D.; Paulauskas, F.L. [Oak Ridge National Laboratory, TN (United States)

    1995-05-01

The purpose of this work is to explore the feasibility of several advanced microwave processing concepts to develop new energy-efficient materials and processes. The project includes two tasks: (1) commercialization of the variable-frequency microwave furnace; and (2) microwave curing of polymer composites. The variable frequency microwave furnace, whose initial conception and design was funded by the AIC Materials Program, will allow us, for the first time, to conduct microwave processing studies over a wide frequency range. This novel design uses a high-power traveling wave tube (TWT) originally developed for electronic warfare. By using this microwave source, one can not only select individual microwave frequencies for particular experiments, but also achieve uniform power densities over a large area by the superposition of many different frequencies. Microwave curing of thermoset resins will be studied because it holds the potential of in-situ curing of continuous-fiber composites for strong, lightweight components. Microwave heating can shorten curing times, provided issues of scaleup, uniformity, and thermal management can be adequately addressed.
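The uniformity claim above (that superposing many frequencies flattens the power density) can be illustrated with a toy one-dimensional cavity model; the geometry and mode range below are illustrative assumptions, not the actual furnace design.

```python
import numpy as np

# Illustrative 1D model (not the real furnace geometry): the standing-wave
# power density of cavity mode n is proportional to sin^2(n*pi*x/L).
# Averaging many modes (i.e., sweeping many frequencies) flattens the pattern.
L = 1.0
x = np.linspace(0.0, L, 1001)

def power_density(modes):
    """Mean normalized power density over the given mode numbers."""
    patterns = [np.sin(n * np.pi * x / L) ** 2 for n in modes]
    return np.mean(patterns, axis=0)

single = power_density([5])            # one fixed frequency: strong nodes
swept = power_density(range(5, 55))    # 50 superposed modes: nearly uniform

# Uniformity metric: ratio of min to max power density away from the walls
interior = slice(50, -50)
print(round(single[interior].min() / single[interior].max(), 3))
print(round(swept[interior].min() / swept[interior].max(), 3))
```

With one fixed mode the interior power density dips to zero at the nodes; averaged over fifty modes it is nearly flat, which is the intuition behind the frequency-sweeping design.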

  19. Spacelab Ground Processing

    Science.gov (United States)

    Scully, Edward J.; Gaskins, Roger B.

    1982-02-01

Spacelab (SL) ground processing is active at the Kennedy Space Center (KSC). The palletized payload for the second Shuttle launch is staged and integrated, with interface verification active. The SL Engineering Model is being assembled for subsequent test and checkout activities. After delivery of SL flight elements from Europe, prelaunch operations for the first SL flight start with receipt of the flight experiment packages and staging of the SL hardware. Experiment operations consist of integrating the various experiment elements into the SL racks, floors and pallets. Rack and floor assemblies, with the experiments installed, are integrated into the flight module. Aft end-cone installation, pallet connections, and SL subsystem interface verifications are accomplished, and SL-Orbiter interfaces verified. The Spacelab cargo is then transferred to the Orbiter Processing Facility (OPF) in a controlled environment using a canister/transporter. After the SL is installed into the Orbiter payload bay, the physical and functional integrity of all payload-to-Orbiter interfaces is verified and final close-out operations conducted. Spacelab payload activities at the launch pad are minimal, with the payload bay doors remaining closed. Limited access is available to the module through the Spacelab Transfer Tunnel. After mission completion, the SL is removed from the Orbiter in the OPF and returned to the SL processing facility for experiment equipment removal and reconfiguration for the subsequent mission.

  20. Multivariate Statistical Process Control

    DEFF Research Database (Denmark)

    Kulahci, Murat

    2013-01-01

As sensor and computer technology continues to improve, it has become a normal occurrence that we are confronted with high dimensional data sets. As in many areas of industrial statistics, this brings forth various challenges in statistical process control (SPC) and monitoring, for which the aim is to identify the “out-of-control” state of a process using control charts in order to reduce the excessive variation caused by so-called assignable causes. In practice, the most common method of monitoring multivariate data is through a statistic akin to Hotelling’s T2. For high dimensional data with excessive ... in conjunction with image data are plagued with various challenges beyond the usual ones encountered in current applications. In this presentation we will introduce the basic ideas of SPC and the multivariate control charts commonly used in industry. We will further discuss the challenges the practitioners ...
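The Hotelling's T2 statistic mentioned above can be sketched in a few lines; this is a generic textbook construction (simulated data, chi-square control limit as a large-sample approximation), not the presenter's specific method.

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(0)

# Phase I: estimate the in-control mean and covariance from reference data.
p = 3                                   # number of monitored variables
reference = rng.normal(size=(500, p))   # simulated in-control observations
mu = reference.mean(axis=0)
S_inv = np.linalg.inv(np.cov(reference, rowvar=False))

def t_squared(x):
    """Hotelling's T^2 distance of one observation from the in-control mean."""
    d = x - mu
    return float(d @ S_inv @ d)

# Upper control limit: chi-square approximation for a large reference sample.
ucl = chi2.ppf(0.9973, df=p)            # roughly a 3-sigma false-alarm rate

in_control = rng.normal(size=p)
shifted = rng.normal(size=p) + 4.0      # large mean shift in every variable
print(t_squared(in_control) > ucl, t_squared(shifted) > ucl)
```

A new observation signals "out of control" when its T2 value exceeds the control limit; the shifted point does, while in-control points rarely do.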

  1. The Caroline interrogatory process

    Energy Technology Data Exchange (ETDEWEB)

    Degagne, D. [Alberta Energy and Utilities Board, Calgary, AB (Canada); Gibson, T. [Gecko Management, Calgary, AB (Canada)

    1999-11-01

Using the specific case study of the Caroline interrogatory process, an example is given of how an effective communications and public involvement program can re-establish trust and credibility levels within a community after an incident. The public is nervous about sour gas, especially about blowouts of gas from a pipeline. The post-approval period was marked by high expectations and a community consultation program which included a community advisory board, an emergency planning committee, socio-economic factors, and environmental monitoring and studies. Information and education involves newspaper articles, newsletters, tours, public consultation meetings, and weekly e-mail. Mercury was detected as a potential hazard at the site, and company actions are illustrated. Overall lessons learned included: starting early paid off, face to face resident contacts were the most effective, the willingness to make changes was the key to success, the community helped, knowing all the answers is not essential, and there is a need for empathy. The interrogatory process includes a hybrid technique that comprises four stages: 1) process review and public input, 2) identification and clarification of issues, 3) responses by industry and government, and 4) a public forum and follow-up action.

  3. FCC resid processing

    International Nuclear Information System (INIS)

This paper narrows the definition of resid processors to those FCCs which run feedstocks containing 1 wt.% or more Conradson carbon residue. With this new definition, the resid survey is revisited to see if any new conclusions can be drawn from the data. The authors break down the numbers and geography of resid processors, feed types, operating variables, and yields. After examining the state of resid processing in the FCC, the paper focuses on the design of cracking catalysts for handling resid feeds. There are important considerations involved in processing resid, including high levels of contaminants such as vanadium and nickel, the impact on heat balance, and diffusion effects. Catalysts can be designed to improve the profitability of a resid operation. A clear picture of the roles of zeolite and matrix is presented, along with a discussion of the different types of zeolites which are commonly used. The paper demonstrates how zeolite and matrix are best combined to meet objectives within a given set of constraints while processing resid.

  4. Basic Social Processes

    Directory of Open Access Journals (Sweden)

    Barney G. Glaser, PhD, Hon. PhD

    2005-06-01

The goal of grounded theory is to generate a theory that accounts for a pattern of behavior that is relevant and problematic for those involved. The goal is not voluminous description, nor clever verification. As with all grounded theory, the generation of a basic social process (BSP) theory occurs around a core category. While a core category is always present in a grounded research study, a BSP may not be. BSPs are ideally suited to generation by grounded theory from qualitative research because qualitative research can pick up process through fieldwork that continues over a period of time. BSPs are a delight to discover and formulate since they give so much movement and scope to the analyst’s perception of the data. BSPs such as cultivating, defaulting, centering, highlighting or becoming give the feeling of process, change and movement over time. They also have clear, amazing general implications; so much so that it is hard to contain them within the confines of a single substantive study. The tendency is to refer to them as a formal theory without the necessary comparative development of formal theory. They are labeled by a “gerund” (“-ing”) which both stimulates their generation and the tendency to over-generalize them.

  5. Priority in Process Algebras

    Science.gov (United States)

    Cleaveland, Rance; Luettgen, Gerald; Natarajan, V.

    1999-01-01

This paper surveys the semantic ramifications of extending traditional process algebras with notions of priority that allow for some transitions to be given precedence over others. These enriched formalisms allow one to model system features such as interrupts, prioritized choice, or real-time behavior. Approaches to priority in process algebras can be classified according to whether the induced notion of pre-emption on transitions is global or local and whether priorities are static or dynamic. Early work in the area concentrated on global pre-emption and static priorities and led to formalisms for modeling interrupts and aspects of real-time, such as maximal progress, in centralized computing environments. More recent research has investigated localized notions of pre-emption in which the distribution of systems is taken into account, as well as dynamic priority approaches, i.e., those where priority values may change as systems evolve. The latter allows one to model behavioral phenomena such as scheduling algorithms and also enables the efficient encoding of real-time semantics. Technically, this paper studies the different models of priorities by presenting extensions of Milner's Calculus of Communicating Systems (CCS) with static and dynamic priority as well as with notions of global and local pre-emption. In each case the operational semantics of CCS is modified appropriately, behavioral theories based on strong and weak bisimulation are given, and related approaches for different process-algebraic settings are discussed.
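A minimal sketch of static priority with global pre-emption, as described above, is a labelled transition system in which only the maximal-priority enabled transitions may fire; the class and example below are illustrative assumptions, not the paper's formal CCS semantics.

```python
from collections import defaultdict

class PrioritizedLTS:
    """Toy labelled transition system with static priorities and
    global pre-emption: in each state, only transitions carrying the
    maximal priority among the enabled ones survive."""

    def __init__(self):
        # state -> list of (label, priority, next_state)
        self.transitions = defaultdict(list)

    def add(self, state, label, priority, next_state):
        self.transitions[state].append((label, priority, next_state))

    def enabled(self, state):
        """Transitions that survive global pre-emption in `state`."""
        candidates = self.transitions[state]
        if not candidates:
            return []
        top = max(prio for _, prio, _ in candidates)
        return [(lbl, nxt) for lbl, prio, nxt in candidates if prio == top]

# An interrupt (priority 2) pre-empts ordinary progress (priority 1).
lts = PrioritizedLTS()
lts.add("run", "step", 1, "run")
lts.add("run", "interrupt", 2, "handler")
print(lts.enabled("run"))   # only the interrupt survives pre-emption
```

Local pre-emption or dynamic priorities would refine `enabled` to compare priorities only within a location, or recompute them as the system evolves.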

  6. Element geochemistry and cleaning potential of the No. 11 coal seam from Antaibao mining district

    Institute of Scientific and Technical Information of China (English)

    WANG; Wenfeng; QIN; Yong; SONG; Dangyu; SANG; Shuxun; JIAN

    2005-01-01

Based on the analyses of sulfur and 41 other elements in 8 channel samples of the No. 11 coal seam from the Antaibao surface mine, Shanxi, China, and 4 samples from the coal preparation plant of this mine, the distribution of the elements in the seam profile, their geochemical partitioning behavior during coal cleaning, and the genetic relationships between the two are studied. The conclusions are drawn as follows. The coal-forming environment was probably invaded by sea water during the post-stage of peatification, so that the contents of As, Fe, S, etc., associated closely with sea water, tend to increase toward the top of the seam, and the kaolinite changes into illite and montmorillonite in the coal sublayer near the roof. The elements studied are dominantly associated with kaolinite, pyrite, illite, montmorillonite, etc., of which As, Pb, Mn, Cs, Co, Ni, etc. are mainly associated with sulfides; Mo, V, Nb, Hf, REEs, Ta, etc. mainly with kaolinite; Mg, Al, etc. mainly with epigenetic montmorillonite; and Rb, Cr, Ba, Cu, K, Hg, etc. mainly with epigenetic illite. Physical coal cleaning is effective not only in the removal of ash and sulfur, but also in reducing the concentration of most major and trace elements. The elements Be, U, Sb, W, Br, Se, P, etc. are largely or partly organically bound, showing a relatively low removability, while the removability of the other elements studied is more than 20%, of which Mg, Mn, Hg, Fe, As, K, Al, Cs, and Cr, associated mostly with the coarser or epigenetic minerals, show a higher removability than that of ash. The distribution of the elements in the seam profile controls their partitioning behavior to a great degree during the coal cleaning processes.
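The removability figures above can be illustrated with one common definition (an assumption here; the abstract does not state the paper's exact formula): the fraction of an element rejected with the refuse, computed from feed and clean-coal concentrations and the clean-coal mass yield.

```python
# Hedged sketch: one common definition of an element's removability in
# physical coal cleaning (an assumption, not necessarily this paper's
# exact formula) is the mass fraction rejected with the refuse:
#   removability = 1 - (C_clean * yield) / C_feed
def removability(c_feed, c_clean, clean_yield):
    """Fraction of an element removed, from feed and clean-coal
    concentrations (same units) and the clean-coal mass yield (0-1)."""
    return 1.0 - (c_clean * clean_yield) / c_feed

# Hypothetical numbers: an element at 10 ppm in feed, 4 ppm in clean coal,
# with 80% of the feed mass reporting to clean coal.
print(round(removability(10.0, 4.0, 0.80), 2))  # → 0.68
```

Under this definition, an element whose clean-coal concentration drops faster than the mass yield shows a removability higher than that of ash, which is the pattern reported for Mg, Mn, Hg, Fe, As, K, Al, Cs, and Cr.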

  7. The Process Design Courses at Pennsylvania: Impact of Process Simulators.

    Science.gov (United States)

    Seider, Warren D.

    1984-01-01

    Describes the use and impact of process design simulators in process design courses. Discusses topics covered, texts used, computer design simulations, and how they are integrated into the process survey course as well as in plant design projects. (JM)

  8. Fundamentals of process intensification: A process systems engineering view

    DEFF Research Database (Denmark)

    Babi, Deenesh Kavi; Sales Cruz, Alfonso Mauricio; Gani, Rafiqul

    2016-01-01

This chapter gives an overview of the fundamentals of process intensification from a process systems engineering point of view. The concept of process intensification, including process integration, is explained together with the drivers for applying process intensification, which can be achieved at different scales of size, that is, the unit operation scale, the task scale, and the phenomena scale. The roles of process intensification with respect to process improvements and the generation of more sustainable process designs are discussed, and questions related to when to apply process intensification and how to apply process intensification are answered through illustrative examples. The main issues and needs for generation of more sustainable process alternatives through process intensification are discussed in terms of the need for a systematic computer-aided framework and the methods and tools...

  9. Innovation process management

    Directory of Open Access Journals (Sweden)

    K. Pałucha

    2012-11-01

Purpose: The purpose of this article is to present selected problems of innovation management, to indicate the growing interdisciplinarity and complexity of this issue and the extensive management knowledge it requires, and to give a synthetic presentation of the differences in approaches to managing innovation. An additional purpose is to present the benefits of efficient innovation management, described on the basis of an examination of production preparation processes. The topicality of the subject matter results from the fact that it bears to a considerable degree on the competitiveness of businesses, while the high variability of the environment forces them to seek new organizational solutions. Design/methodology/approach: To achieve these purposes, a literature study on innovation management was undertaken, with special attention paid to the development processes of new products. Pilot research (interviews and conversations with the management staff of technical divisions) was also conducted in selected manufacturing companies. This research made it possible to identify approaches to the discussed subject matter as well as the solutions applied in production preparation processes and the effects obtained from them. Findings: The literature study shows that there are a number of innovation management models, as well as models for managing the development processes of new products. It is hard to single out better solutions and recommend them for individual enterprises; all of them have certain merits and demerits, though demand-driven models predominate. The complexity of the suggested solutions grows. The enterprises examined more and more often apply their own models, embracing outside organizations, stakeholders and consumers, which at the same time requires a modern organization. Research limitations/implications: The preliminary findings introduced in

  10. BUSINESS PROCESS REENGINEERING AS THE METHOD OF PROCESS MANAGEMENT

    OpenAIRE

    O. Honcharova

    2013-01-01

The article is devoted to the analysis of the process management approach. The main understandings of the process management approach have been researched in the article. Definitions of process and process management are given. The methods of business process improvement are also analyzed, among them fast-analysis solution technology (FAST), benchmarking, reprojecting and reengineering. The main results of using business process improvement have been described in figures of reducing cy...

  11. CONVERGENCE TO PROCESS ORGANIZATION BY MODEL OF PROCESS MATURITY

    Directory of Open Access Journals (Sweden)

    Blaženka Piuković Babičković

    2015-06-01

Modern business process orientation is bound up primarily with process thinking and a process organizational structure. Although business processes are increasingly written and spoken about, a major problem in the business world, especially in countries in transition, is a lack of understanding of the concept of business process management. The aim of this paper is to make a specific contribution to overcoming this problem by pointing out the significance of the concept of business process management, and by presenting a model for reviewing process maturity together with the tools recommended for use in process management.

  12. Time processing in dyscalculia

    Directory of Open Access Journals (Sweden)

Marinella Cappelletti

    2011-12-01

To test whether atypical number development may affect other types of quantity processing, we investigated temporal discrimination in adults with developmental dyscalculia (DD). This also allowed us to test whether (1) number and time may be sub-served by a common quantity system or decision mechanisms, in which case they may both be impaired, or (2) whether number and time are distinct, and therefore may dissociate. Participants judged which of two successively presented horizontal lines was longer in duration, the first line being preceded by either a small or a large number prime (‘1’ or ‘9’) or by a neutral symbol (‘#’), or in a third task decided which of two Arabic numbers (‘1’, ‘5’, ‘9’) lasted longer. Results showed that (i) DD participants’ temporal discriminability was normal as long as numbers were not part of the experimental design, even as task-irrelevant stimuli; however (ii) task-irrelevant numbers dramatically disrupted DD participants’ temporal discriminability, the more so as their salience increased, though the actual magnitude of the numbers had no effect; in contrast (iii) controls’ time perception was robust to the presence of numbers but modulated by numerical quantity, such that small number primes or numerical stimuli made durations appear shorter than veridical, and the opposite for larger number primes or numerical stimuli. This study is the first to investigate a continuous quantity such as time in a population with a congenital number impairment, and to show that atypical development of numerical competence leaves continuous quantity processing spared. Our data support the idea of a partially shared quantity system across numerical and temporal dimensions, which allows dissociations and interactions among dimensions; furthermore, they suggest that impaired number in DD is unlikely to originate from systems initially dedicated to continuous quantity processing, like time.

  13. Solar Flares: Magnetohydrodynamic Processes

    Directory of Open Access Journals (Sweden)

    Kazunari Shibata

    2011-12-01

This paper outlines the current understanding of solar flares, mainly focused on magnetohydrodynamic (MHD) processes responsible for producing a flare. Observations show that flares are one of the most explosive phenomena in the atmosphere of the Sun, releasing a huge amount of energy up to about 10^32 erg on the timescale of hours. Flares involve the heating of plasma, mass ejection, and particle acceleration that generates high-energy particles. The key physical processes for producing a flare are: the emergence of magnetic field from the solar interior to the solar atmosphere (flux emergence), local enhancement of electric current in the corona (formation of a current sheet), and rapid dissipation of electric current (magnetic reconnection) that causes shock heating, mass ejection, and particle acceleration. The evolution toward the onset of a flare is rather quasi-static while free energy is accumulated in the form of coronal electric current (field-aligned current, more precisely), whereas the dissipation of coronal current proceeds rapidly, producing various dynamic events that affect lower atmospheres such as the chromosphere and photosphere. Flares manifest such rapid dissipation of coronal current, and their theoretical modeling has been developed in accordance with observations, in which numerical simulations proved to be a strong tool reproducing the time-dependent, nonlinear evolution of a flare. We review the models proposed to explain the physical mechanism of flares, giving a comprehensive explanation of the key processes mentioned above. We start with basic properties of flares, then go into the details of energy build-up, release and transport in flares, where magnetic reconnection works as the central engine to produce a flare.

  14. Discovery as a process

    Energy Technology Data Exchange (ETDEWEB)

    Loehle, C.

    1994-05-01

The three great myths, which form a sort of triumvirate of misunderstanding, are the Eureka! myth, the hypothesis myth, and the measurement myth. These myths are prevalent among scientists as well as among observers of science. The Eureka! myth asserts that discovery occurs as a flash of insight, and as such is not subject to investigation. This leads to the perception that discovery or deriving a hypothesis is a moment or event rather than a process. Events are singular and not subject to description. The hypothesis myth asserts that proper science is motivated by testing hypotheses, and that if something is not experimentally testable then it is not scientific. This myth leads to absurd posturing by some workers conducting empirical descriptive studies, who dress up their study with a "hypothesis" to obtain funding or get it published. Methods papers are often rejected because they do not address a specific scientific problem. The fact is that many of the great breakthroughs in science involve methods and not hypotheses, or arise from largely descriptive studies. Those captured by this myth also try to block funding for those developing methods. The third myth is the measurement myth, which holds that determining what to measure is straightforward, so one doesn't need a lot of introspection to do science. As one ecologist put it to me, "Don't give me any of that philosophy junk, just let me out in the field. I know what to measure." These myths lead to difficulties for scientists who must face peer review to obtain funding and to get published. They also inhibit the study of science as a process. Finally, they inhibit creativity and suppress innovation. In this paper I first explore these myths in more detail and then propose a new model of discovery that opens the supposedly miraculous process of discovery to closer scrutiny.

  15. Coupled Diffusion Processes

    Institute of Scientific and Technical Information of China (English)

    章复熹

    2004-01-01

Coupled diffusion processes (or CDPs for short) model the systems of molecular motors, which have attracted much interest from physicists and biologists in recent years [1,2,9,14,4,7,21]. The protein moves along a filament called the track, and it is crucial that there are several inner states of the protein: the underlying chemical reaction causes transitions among the different inner states, while chemical energy can be converted to mechanical energy by ratchet effects [5,3,2,14,12].

  16. Cascade hydrodewaxing process

    Energy Technology Data Exchange (ETDEWEB)

    Yen, J.H.

    1986-07-08

A cascade catalytic hydrodewaxing process is described, comprising: (a) passing a hydrocarbon feedstock containing waxy components selected from the group of normal paraffins and slightly branched-chain paraffins over a hydroisomerization catalyst comprising a crystalline silicate zeolite having the structure of ZSM-12 in admixture with a crystalline silicate zeolite having the structure of ZSM-23, the admixture having hydrogenation/dehydrogenation activity to hydroisomerize the feedstock; and (b) passing at least a majority of the normally liquid hydrocarbon recovered from step (a) over a dewaxing catalyst comprising a crystalline silicate zeolite having the structure of ZSM-5, the zeolite of step (b) having hydrogenation/dehydrogenation activity to dewax the recovered hydrocarbon.

  17. Hyperspectral image processing

    CERN Document Server

    Wang, Liguo

    2016-01-01

    Based on the authors’ research, this book introduces the main processing techniques in hyperspectral imaging. In this context, SVM-based classification, distance comparison-based endmember extraction, SVM-based spectral unmixing, spatial attraction model-based sub-pixel mapping, and MAP/POCS-based super-resolution reconstruction are discussed in depth. Readers will gain a comprehensive understanding of these cutting-edge hyperspectral imaging techniques. Researchers and graduate students in fields such as remote sensing, surveying and mapping, geosciences and information systems will benefit from this valuable resource.
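The SVM-based classification mentioned above can be sketched per pixel: flatten the hyperspectral cube to a (pixels x bands) matrix and fit a support vector classifier. The synthetic two-material cube below is an illustrative assumption, not the authors' data or method details.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
bands, rows, cols = 20, 8, 8

# Two synthetic "materials" with distinct mean spectra plus sensor noise.
spectrum_a = np.linspace(0.2, 0.8, bands)
spectrum_b = np.linspace(0.8, 0.2, bands)
labels = rng.integers(0, 2, size=(rows, cols))
cube = np.where(labels[None], spectrum_b[:, None, None], spectrum_a[:, None, None])
cube = cube + rng.normal(scale=0.02, size=(bands, rows, cols))

# Flatten to (pixels, bands): each pixel's spectrum is one feature vector.
X = cube.reshape(bands, -1).T
y = labels.ravel()

# Train on half the pixels, classify the rest.
clf = SVC(kernel="rbf").fit(X[::2], y[::2])
accuracy = (clf.predict(X[1::2]) == y[1::2]).mean()
print(accuracy)
```

Real hyperspectral work adds the steps the book covers next, such as endmember extraction and spectral unmixing, for pixels that mix several materials.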

  18. Process Principle of Information

    Institute of Scientific and Technical Information of China (English)

    张高锋; 任君

    2006-01-01

Ⅰ. Introduction. Information structure is the organization model of given and new information in the course of information transmission. A discourse contains a variety of information, and not all the information listed in the discourse is necessary and useful to us. When we decode a discourse, usually we do not need to read every word in the discourse or text, but skim or scan the discourse or text to search for what we think is important or useful to us as quickly as possible. Ⅱ. Process Principles of Informati...

  19. Phonocardiography Signal Processing

    CERN Document Server

    Abbas, Abbas K

    2009-01-01

The auscultation method is an important diagnostic indicator for hemodynamic anomalies. Heart sound classification and analysis play an important role in auscultative diagnosis. The term phonocardiography refers to the tracing technique of heart sounds and the recording of cardiac acoustic vibrations by means of a microphone-transducer. Understanding the nature and source of this signal is therefore important for developing a competent tool for further analysis and processing, in order to enhance and optimize the cardiac clinical diagnostic approach. This book gives the

  20. Introduction to information processing

    CERN Document Server

Deitel, Harvey M

    2014-01-01

    An Introduction to Information Processing provides an informal introduction to the computer field. This book introduces computer hardware, which is the actual computing equipment.Organized into three parts encompassing 12 chapters, this book begins with an overview of the evolution of personal computing and includes detailed case studies on two of the most essential personal computers for the 1980s, namely, the IBM Personal Computer and Apple's Macintosh. This text then traces the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. Other chapte

  1. Computers and data processing

    CERN Document Server

    Deitel, Harvey M

    1985-01-01

    Computers and Data Processing provides information pertinent to the advances in the computer field. This book covers a variety of topics, including the computer hardware, computer programs or software, and computer applications systems.Organized into five parts encompassing 19 chapters, this book begins with an overview of some of the fundamental computing concepts. This text then explores the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. Other chapters consider how computers present their results and explain the storage and retrieval of

  2. FHR Process Instruments

    Energy Technology Data Exchange (ETDEWEB)

    Holcomb, David Eugene [ORNL

    2015-01-01

    Fluoride salt-cooled High temperature Reactors (FHRs) are entering into early phase engineering development. Initial candidate technologies have been identified to measure all of the required process variables. The purpose of this paper is to describe the proposed measurement techniques in sufficient detail to enable assessment of the proposed instrumentation suite and to support development of the component technologies. This paper builds upon the instrumentation chapter of the recently published FHR technology development roadmap. Locating instruments outside of the intense core radiation and high-temperature fluoride salt environment significantly decreases their environmental tolerance requirements. Under operating conditions, FHR primary coolant salt is a transparent, low-vapor-pressure liquid. Consequently, FHRs can employ standoff optical measurements from above the salt pool to assess in-vessel conditions. For example, the core outlet temperature can be measured by observing the fuel's blackbody emission. Similarly, the intensity of the core's Cerenkov glow indicates the fission power level. Short-lived activation of the primary coolant provides another means for standoff measurements of process variables. The primary coolant flow and neutron flux can be measured using gamma spectroscopy along the primary coolant piping. FHR operation entails a number of process measurements. Reactor thermal power and core reactivity are the most significant variables for process control. Thermal power can be determined by measuring the primary coolant mass flow rate and temperature rise across the core. The leading candidate technologies for primary coolant temperature measurement are Au-Pt thermocouples and Johnson noise thermometry. Clamp-on ultrasonic flow measurement, which includes high-temperature-tolerant standoffs, is a potential coolant flow measurement technique. Also, the salt redox condition will be monitored as an indicator of its corrosiveness. Both
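
    The thermal-power determination described in the record is the simple heat balance Q = m_dot · c_p · ΔT across the core. A minimal numeric sketch follows; the FLiBe heat capacity and the flow/temperature figures are assumed round numbers for illustration, not values from the paper.

    ```python
    # Heat balance Q = m_dot * c_p * dT across the reactor core.
    # CP_FLIBE and the flow/temperature figures are illustrative assumptions,
    # not values taken from the record.

    CP_FLIBE = 2416.0  # J/(kg*K), assumed heat capacity of FLiBe primary salt

    def core_thermal_power(m_dot_kg_s, t_in_c, t_out_c, cp=CP_FLIBE):
        """Thermal power (W) from coolant mass flow rate and core temperature rise."""
        return m_dot_kg_s * cp * (t_out_c - t_in_c)

    # 1000 kg/s of salt heated from 600 C to 700 C across the core
    p_watts = core_thermal_power(1000.0, 600.0, 700.0)
    print(f"core power = {p_watts / 1e6:.1f} MW(th)")  # -> 241.6 MW(th)
    ```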

  3. Entrepreneurship and Process Studies

    DEFF Research Database (Denmark)

    Hjorth, Daniel; Holt, Robin; Steyaert, Chris

    2015-01-01

    Process studies put movement, change and flow first; to study processually is to consider the world as restless, something underway, becoming and perishing, without end. To understand firms processually is to accept but also – and this is harder perhaps – to absorb this fluidity, to treat...... and potential of processual approaches to studying, researching and practising entrepreneurship. The articles in the issue attest to an increasing sensitivity to processual thinking. We argue that appreciating entrepreneurial phenomena processually opens up the field to an understanding of entrepreneurship...

  4. Investigating the Design Process

    DEFF Research Database (Denmark)

    Kautz, Karlheinz

    2011-01-01

    /methodology/approach – Based on an integrated framework for user participation derived from the participatory design literature the research was performed as a case study and semi-structured, open-ended interviews were conducted with about a third of the development team and with a representative sample of key players...... to understand what participatory design is and how, when and where it can be performed as an instance of a design process in agile development. As such the paper contributes to an analytical and a design theory of participatory design in agile development. Furthermore the paper explicates why participatory...

  5. Digital signal processing laboratory

    CERN Document Server

    Kumar, B Preetham

    2011-01-01

    INTRODUCTION TO DIGITAL SIGNAL PROCESSING: Brief Theory of DSP Concepts; Problem Solving; Computer Laboratory: Introduction to MATLAB®/SIMULINK®; Hardware Laboratory: Working with Oscilloscopes, Spectrum Analyzers, Signal Sources; Digital Signal Processors (DSPs); References. DISCRETE-TIME LTI SIGNALS AND SYSTEMS: Brief Theory of Discrete-Time Signals and Systems; Problem Solving; Computer Laboratory: Simulation of Continuous Time and Discrete-Time Signals and Systems; References. TIME AND FREQUENCY ANALYSIS OF COMMUNICATION SIGNALS: Brief Theory of Discrete-Time Fourier Transform (DTFT), Discrete Fourier Transform

  6. Digital signal processing

    CERN Document Server

    O'Shea, Peter; Hussain, Zahir M

    2011-01-01

    In three parts, this book contributes to the advancement of engineering education and serves as a general reference on digital signal processing. Part I presents the basics of analog and digital signals and systems in the time and frequency domain. It covers the core topics: convolution, transforms, filters, and random signal analysis. It also treats important applications including signal detection in noise, radar range estimation for airborne targets, binary communication systems, channel estimation, banking and financial applications, and audio effects production. Part II considers sel

  7. Process for treating biomass

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, Timothy J; Teymouri, Farzaneh

    2015-11-04

    This invention is directed to a process for treating biomass. The biomass is treated with a biomass swelling agent within the vessel to swell or rupture at least a portion of the biomass. A portion of the swelling agent is removed from a first end of the vessel following the treatment. Then steam is introduced into a second end of the vessel different from the first end to further remove swelling agent from the vessel in such a manner that the swelling agent exits the vessel at a relatively low water content.

  8. Simpler radioactive wastewater processing.

    Science.gov (United States)

    Rodríguez, José Canga; Luh, Volker

    2011-11-01

    José Canga Rodríguez, key account manager, Pharmaceutical and Life Sciences, EnviroChemie, and Volker Luh, CEO of EnviroDTS, describe the development, and recent successful application, of a new technology for dealing safely and effectively with the radioactive "wastewater" generated by patients who have undergone radiotherapy in nuclear medicine facilities. The BioChroma process provides what is reportedly not only a more flexible means than traditional "delay and decay" systems of dealing with this "by-product" of medical treatment, but also one that requires less plant space, affords less risk of leakage or cross-contamination, and is easier to install. PMID:22368885

  9. Medical image processing

    CERN Document Server

    Dougherty, Geoff

    2011-01-01

    This book is designed for end users in the field of digital imaging, who wish to update their skills and understanding with the latest techniques in image analysis. This book emphasizes the conceptual framework of image analysis and the effective use of image processing tools. It uses applications in a variety of fields to demonstrate and consolidate both specific and general concepts, and to build intuition, insight and understanding. Although the chapters are essentially self-contained they reference other chapters to form an integrated whole. Each chapter employs a pedagogical approach to e

  10. Simulation of Cavitation Processes

    OpenAIRE

    Jover Herrero, Álvaro

    2010-01-01

    Cavitation is the formation of empty cavities in a liquid by high forces and their immediate implosion. It occurs when a liquid is subjected to rapid changes of pressure, causing the formation of cavities in the lower-pressure regions of the liquid. Cavitation processes can produce high temperatures (10,000 °C) and pressures (1,000 atm) inside the bubbles and also generate micro jets produced by the asymmetric implosions of the bubbles. This phenomenon has been investigated duri...

  11. High processivity polymerases

    Energy Technology Data Exchange (ETDEWEB)

    Shamoo, Yousif; Sun, Siyang

    2014-06-10

    Chimeric proteins comprising a sequence nonspecific single-stranded nucleic-acid-binding domain joined to a catalytic nucleic-acid-modifying domain are provided. Methods comprising contacting a nucleic acid molecule with a chimeric protein, as well as systems comprising a nucleic acid molecule, a chimeric protein, and an aqueous solution are also provided. The joining of sequence nonspecific single-stranded nucleic-acid-binding domain and a catalytic nucleic-acid-modifying domain in chimeric proteins, among other things, may prevent the separation of the two domains due to their weak association and thereby enhances processivity while maintaining fidelity.

  12. Process for treating biomass

    Science.gov (United States)

    Campbell, Timothy J.; Teymouri, Farzaneh

    2015-08-11

    This invention is directed to a process for treating biomass. The biomass is treated with a biomass swelling agent within the vessel to swell or rupture at least a portion of the biomass. A portion of the swelling agent is removed from a first end of the vessel following the treatment. Then steam is introduced into a second end of the vessel different from the first end to further remove swelling agent from the vessel in such a manner that the swelling agent exits the vessel at a relatively low water content.

  13. Process-based tolerance assessment of connecting rod machining process

    Science.gov (United States)

    Sharma, G. V. S. S.; Rao, P. Srinivasa; Surendra Babu, B.

    2016-01-01

    Process tolerancing based on the process capability studies is the optimistic and pragmatic approach of determining the manufacturing process tolerances. On adopting the define-measure-analyze-improve-control approach, the process potential capability index ( C p) and the process performance capability index ( C pk) values of identified process characteristics of connecting rod machining process are achieved to be greater than the industry benchmark of 1.33, i.e., four sigma level. The tolerance chain diagram methodology is applied to the connecting rod in order to verify the manufacturing process tolerances at various operations of the connecting rod manufacturing process. This paper bridges the gap between the existing dimensional tolerances obtained via tolerance charting and process capability studies of the connecting rod component. Finally, the process tolerancing comparison has been done by adopting a tolerance capability expert software.
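
    The Cp and Cpk indices at the heart of this study can be computed directly from sample data and compared against the 1.33 (four-sigma) benchmark the paper cites. A hedged sketch; the bore dimensions and specification limits below are invented for illustration, not taken from the study.

    ```python
    # Process capability indices: Cp measures spread against the spec window,
    # Cpk additionally penalizes an off-center mean. Sample data and spec
    # limits are hypothetical.

    import statistics

    def capability_indices(samples, lsl, usl):
        """Return (Cp, Cpk) for a sample against lower/upper specification limits."""
        mu = statistics.mean(samples)
        sigma = statistics.stdev(samples)  # sample standard deviation
        cp = (usl - lsl) / (6 * sigma)
        cpk = min(usl - mu, mu - lsl) / (3 * sigma)
        return cp, cpk

    # Hypothetical crank-pin bore diameters (mm) against a 50.00 +/- 0.05 mm spec
    bores = [50.012, 49.998, 50.005, 49.991, 50.008, 50.002, 49.995, 50.010]
    cp, cpk = capability_indices(bores, lsl=49.95, usl=50.05)
    print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}, capable = {cp > 1.33 and cpk > 1.33}")
    ```

    When the sample mean sits exactly at the spec-window center, Cp and Cpk coincide; any off-center shift makes Cpk strictly smaller.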

  14. Multivariate Statistical Process Control Process Monitoring Methods and Applications

    CERN Document Server

    Ge, Zhiqiang

    2013-01-01

      Given their key position in the process control industry, process monitoring techniques have been extensively investigated by industrial practitioners and academic control researchers. Multivariate statistical process control (MSPC) is one of the most popular data-based methods for process monitoring and is widely used in various industrial areas. Effective routines for process monitoring can help operators run industrial processes efficiently at the same time as maintaining high product quality. Multivariate Statistical Process Control reviews the developments and improvements that have been made to MSPC over the last decade, and goes on to propose a series of new MSPC-based approaches for complex process monitoring. These new methods are demonstrated in several case studies from the chemical, biological, and semiconductor industrial areas.   Control and process engineers, and academic researchers in the process monitoring, process control and fault detection and isolation (FDI) disciplines will be inter...
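
    One classical ingredient of the MSPC toolbox surveyed here is the Hotelling T² statistic, which flags observations that are far from the training data in the metric of the sample covariance. A minimal two-variable sketch with invented data; real MSPC practice adds PCA models, SPE charts, and statistically derived control limits.

    ```python
    # Hotelling T^2 for two correlated process variables. The training data
    # and the test observations are invented for illustration.

    import statistics

    def hotelling_t2(train, x):
        """T^2 of observation x = (a, b) against two-variable training data."""
        a = [p[0] for p in train]
        b = [p[1] for p in train]
        ma, mb = statistics.mean(a), statistics.mean(b)
        saa = statistics.variance(a)
        sbb = statistics.variance(b)
        sab = sum((ai - ma) * (bi - mb) for ai, bi in train) / (len(train) - 1)
        det = saa * sbb - sab * sab  # invert the 2x2 covariance matrix
        inv = ((sbb / det, -sab / det), (-sab / det, saa / det))
        da, db = x[0] - ma, x[1] - mb
        return (da * (inv[0][0] * da + inv[0][1] * db)
                + db * (inv[1][0] * da + inv[1][1] * db))

    normal = [(10.0, 20.1), (10.2, 20.4), (9.9, 19.8), (10.1, 20.2), (9.8, 19.9)]
    print(hotelling_t2(normal, (10.0, 20.0)))  # small: in control
    print(hotelling_t2(normal, (10.0, 23.0)))  # large: flagged as abnormal
    ```

    The second observation breaks the correlation structure learned from the training set, so its T² is orders of magnitude larger even though each variable alone is not wildly out of range.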

  15. Development of innovative iron making process, Hi-QIP process

    Energy Technology Data Exchange (ETDEWEB)

    Takeda Kanji [JFE Steel Corp. (Japan)

    2005-07-01

    The direct reduction smelting process of iron is explained on the basis of the tasks facing the conventional blast furnace process. Furthermore, the Hi-QIP (High Quality Iron Pebble) method, under development as a new iron making process that converts powdery iron ores and pulverized coal directly into metallic iron, is explained with respect to its principle and development status. The Midrex process, the fusion reduction process, and the rotary hearth furnace method that supplement the blast furnace process are explained. The Hi-QIP process is a coal-based bed-type reduction and melting process. The whole process is conducted continuously in a rotary hearth furnace. Tests are conducted to confirm the feasibility of the Hi-QIP process with a bench-scale furnace. A pilot plant has been constructed to verify the feasibility of the process with a continuous-type large furnace. 3 refs., 11 figs., 5 tabs.

  16. Koenigs function and branching processes

    CERN Document Server

    Chikilev, O G

    2001-01-01

    An explicit solution of time-homogeneous pure birth branching processes is described. It gives alternative extensions for the negative binomial distribution (branching processes with immigration) and for the Furry-Yule distribution (branching processes without immigration).
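
    The pure-birth process without immigration mentioned here (the Furry-Yule case) is straightforward to simulate: starting from one individual, each of the n current individuals splits at rate λ, so the next birth arrives after an Exp(nλ) wait. A hedged sketch with arbitrary parameters; for a unit start the population at time t is geometrically distributed with mean e^{λt}.

    ```python
    # Simulation of a time-homogeneous pure-birth (Yule) process.
    # LAM and T are arbitrary illustration parameters.

    import math
    import random

    def yule_population(lam, t, rng):
        """Population size at time t, starting from one individual at time 0."""
        n, clock = 1, 0.0
        while True:
            clock += rng.expovariate(n * lam)  # next birth in a population of n
            if clock > t:
                return n
            n += 1

    rng = random.Random(0)
    lam, t = 1.0, 1.0
    samples = [yule_population(lam, t, rng) for _ in range(20000)]
    mean = sum(samples) / len(samples)
    print(mean, math.exp(lam * t))  # empirical mean vs. theoretical e^(lam*t)
    ```

    With immigration added, the same construction yields a negative binomial population distribution, which is the extension the record refers to.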

  17. Neuroscientific Model of Motivational Process

    OpenAIRE

    Kim, Sung-Il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three subprocesses: a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous subprocesses, namely reward-driven approach, value-based decision-making, and goal-directed control. Rewa...

  18. Aquarius Digital Processing Unit

    Science.gov (United States)

    Forgione, Joshua; Winkert, George; Dobson, Norman

    2009-01-01

    Three documents provide information on a digital processing unit (DPU) for the planned Aquarius mission, in which a radiometer aboard a spacecraft orbiting Earth is to measure radiometric temperatures from which data on sea-surface salinity are to be deduced. The DPU is the interface between the radiometer and an instrument-command-and-data system aboard the spacecraft. The DPU cycles the radiometer through a programmable sequence of states, collects and processes all radiometric data, and collects all housekeeping data pertaining to operation of the radiometer. The documents summarize the DPU design, with emphasis on innovative aspects that include mainly the following: a) In the radiometer and the DPU, conversion from analog voltages to digital data is effected by means of asynchronous voltage-to-frequency converters in combination with a frequency-measurement scheme implemented in field-programmable gate arrays (FPGAs). b) A scheme to compensate for aging and changes in the temperature of the DPU in order to provide an overall temperature-measurement accuracy within 0.01 K includes a high-precision, inexpensive DC temperature measurement scheme and a drift-compensation scheme that was used on the Cassini radar system. c) An interface among multiple FPGAs in the DPU guarantees setup and hold times.
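
    The asynchronous voltage-to-frequency scheme in item (a) can be illustrated as a count-to-voltage conversion: the converter emits pulses at a frequency proportional to the input voltage, and the FPGA recovers the voltage by counting pulses over a fixed gate interval. The scale factor and gate time below are invented for illustration, not Aquarius DPU specifications.

    ```python
    # Toy model of a voltage-to-frequency converter read out by pulse counting.
    # K_HZ_PER_VOLT and GATE_S are assumed illustration values, not DPU
    # parameters.

    K_HZ_PER_VOLT = 100_000.0  # assumed VFC scale factor: output Hz per volt
    GATE_S = 0.5               # assumed gate (counting) interval in seconds

    def voltage_from_count(pulse_count, gate_s=GATE_S, k=K_HZ_PER_VOLT):
        """Recover the VFC input voltage from a pulse count over one gate interval."""
        freq_hz = pulse_count / gate_s
        return freq_hz / k

    print(voltage_from_count(125000))  # 125000 pulses in 0.5 s -> 250 kHz -> 2.5 V
    ```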

  19. Ambiguity in sentence processing.

    Science.gov (United States)

    Altmann, G T

    1998-04-01

    As listeners and readers, we rarely notice the ambiguities that pervade our everyday language. When we hear the proverb 'Time flies like an arrow' we might ponder its meaning, but not the fact that there are almost 100 grammatically permissible interpretations of this short sentence. On occasion, however, we do notice sentential ambiguity: headlines, such as 'Two Sisters Reunited After 18 Years in Checkout Counter', are amusing because they so consistently lead to the unintended interpretation (presumably, the sisters did not spend 18 years at the checkout). It is this consistent preference for one interpretation-and one grammatical structure-rather than another that has fuelled research into sentence processing for more than 20 years. Until relatively recently, the dominant belief had been that these preferences arise from general principles that underlie our use of grammar, with certain grammatical constructions being preferred over others. There has now accrued, however, a considerable body of evidence demonstrating that these preferences are not absolute, but can change in particular circumstances. With this evidence have come new theories of sentence processing, some of which, at first glance, radically question the standard notions of linguistic representation, grammar and understanding.

  20. Hybrid microcircuit intraconnection processes

    Energy Technology Data Exchange (ETDEWEB)

    Bonham, H.B.

    1976-08-01

    Hybrid intraconnections join thin film networks and applique components into an electrically functional hybrid microcircuit (HMC). Applique components were intraconnected with thermocompression (TC) bonds to chromium/gold metallized thin film networks. The project determined critical processes, material parameters, quality criteria, and characterization techniques. The program began on July 1, 1970, and initial efforts consisted of organizing the development program and forecasting needed equipment and manpower. The total time schedule was then PERT-charted. Gold beam-lead devices, gold wire, and chromium/gold films had already been selected for these intraconnection technologies. Thermocompression bonding was developed to provide intraconnections between the HMC gold conductor metallization and gold beam-lead active devices, gold plated external leads, and gold terminated applique devices, such as capacitors. An optimum combination of bonding parameters based on the fundamental physics of the process was developed. This philosophy required bonding equipment to be thoroughly characterized so that the value of bonding parameters (such as force and temperature) indicated on the bonder could be translated into measurable values of the force and temperature within the bond zone.

  1. The anaerobic digestion process

    Energy Technology Data Exchange (ETDEWEB)

    Rivard, C.J. [National Renewable Energy Lab., Golden, CO (United States); Boone, D.R. [Oregon Graduate Inst., Portland, OR (United States)

    1996-01-01

    The microbial process of converting organic matter into methane and carbon dioxide is so complex that anaerobic digesters have long been treated as "black boxes." Research into this process during the past few decades has gradually unraveled this complexity, but many questions remain. The major biochemical reactions for forming methane by methanogens are largely understood, and evolutionary studies indicate that these microbes are as different from bacteria as they are from plants and animals. In anaerobic digesters, methanogens are at the terminus of a metabolic web, in which the reactions of myriads of other microbes produce a very limited range of compounds - mainly acetate, hydrogen, and formate - on which the methanogens grow and from which they form methane. "Interspecies hydrogen-transfer" and "interspecies formate-transfer" are major mechanisms by which methanogens obtain their substrates and by which volatile fatty acids are degraded. Present understanding of these reactions and other complex interactions among the bacteria involved in anaerobic digestion is only now to the point where anaerobic digesters need no longer be treated as black boxes.

  2. Grants Process Overview Infographic

    Science.gov (United States)

    This infographic shows the steps in the National Institutes of Health and National Cancer Institute Grants Process. The graphic shows which steps are done by the Principal Investigator, Grantee Institution, and by NIH. The process is represented by a circular flow of steps. Starting from the top and reading clockwise: The Principal Investigator “Initiates Research Idea and Prepares Application” The Grantee Institution “Submits Application” NIH “NIH Center For Scientific Review, Assigns To NCI And To Study Section” NIH “Scientific Review Group (NCI OR CSR) Evaluates for Scientific Merit” NIH “National Cancer Advisory Board Recommends Action” NIH “NCI Evaluates Program Relevance And Need” NIH “NCI Makes Funding Selections And Issues Grant Awards” NIH “NCI Monitors Programmatic and Business Management Performance of the Grant” The Grantee Institution “Manages Funds” The Principal Investigator “Conducts Research” Source: www.cancer.gov Icons made by Freepik from http://www.flaticon.com are licensed by CC BY 3.0.

  3. Hot rolling joining process

    International Nuclear Information System (INIS)

    In the case of incorporating nonferrous metal equipment in fuel reprocessing processes, from the viewpoint of reducing maintenance work on the piping connections to peripheral equipment, it is desirable to adopt pipe joints between dissimilar materials that have high reliability against leakage. In order to meet this demand, the development of a manufacturing technology for such pipe joints by a hot rolling process has been carried out. As for the structure of this pipe joint, a small-diameter nonferrous metal pipe and a large-diameter stainless steel pipe are joined by hot rolling using an inserted material. The materials are Ti-5% Ta, Ti and Zr for the nonferrous metals, SUS 304L for the stainless steel, and Ta foil for the inserted material. The merits and demerits of these pipe joints are shown. The control of the interface structure in the joining of different materials was carried out by using the inserted material. The method of manufacturing the pipe joints and the proper conditions for rolling joining are explained. As for the performance of the pipe joints, evaluations of defects in the joining interface, the strength of the joint, the corrosion resistance and the susceptibility to stress corrosion cracking are reported. (K.I.)

  4. Multidimensional diffusion processes

    CERN Document Server

    Stroock, Daniel W

    1997-01-01

    From the reviews: "… Both the Markov-process approach and the Itô approach … have been immensely successful in diffusion theory. The Stroock-Varadhan book, developed from the historic 1969 papers by its authors, presents the martingale-problem approach as a more powerful - and, in certain regards, more intrinsic - means of studying the foundations of the subject. […] … the authors make the uncompromising decision not "to proselytise by intimidating the reader with myriad examples demonstrating the full scope of the techniques", but rather to persuade the reader "with a careful treatment of just one problem to which they apply". […] Most of the main tools of stochastic-processes theory are used, … but it is the formidable combination of probability theory with analysis … which is the core of the work. […] I have emphasized the great importance of the Stroock-Varadhan book. It contains a lot more than I have indicated; in particular, its many exercises contain much interesting material. For immediat...

  5. Mastering the diesel process

    Energy Technology Data Exchange (ETDEWEB)

    Antila, E.; Kaario, O.; Lahtinen, T. (and others)

    2004-07-01

    This is the final report of the research project 'Mastering the Diesel Process'. The project has been a joint research effort of the Helsinki University of Technology, the Tampere University of Technology, the Technical Research Centre of Finland, and the Aabo Akademi University. Moreover, the contribution of the Michigan Technological University has been important. The project 'Mastering the Diesel Process' has been a computational research project on the physical phenomena of diesel combustion. The theoretical basis of the project lies in computational fluid dynamics. Various submodels for computational fluid dynamics have been developed or tested within engine simulation. Various model combinations in three diesel engines of different sizes have been studied. The most important submodels comprise fuel spray drop breakup, fuel evaporation, gas-fuel interaction in the spray, the mixing model of combustion, heat transfer, and emission mechanisms. The boundary conditions and flow field modelling have been studied as well. The main simulation tool has been Star-CD; the KIVA code has been used in the model development as well. With the help of simulation, we are able to investigate the effect of various design or operational parameters on diesel combustion and emission formation. (orig.)

  6. ARM Mentor Selection Process

    Energy Technology Data Exchange (ETDEWEB)

    Sisterson, D. L. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-10-01

    The Atmospheric Radiation Measurement (ARM) Program was created in 1989 with funding from the U.S. Department of Energy (DOE) to develop several highly instrumented ground stations to study cloud formation processes and their influence on radiative transfer. In 2003, the ARM Program became a national scientific user facility, known as the ARM Climate Research Facility. This scientific infrastructure provides for fixed sites, mobile facilities, an aerial facility, and a data archive available for use by scientists worldwide through the ARM Climate Research Facility—a scientific user facility. The ARM Climate Research Facility currently operates more than 300 instrument systems that provide ground-based observations of the atmospheric column. To keep ARM at the forefront of climate observations, the ARM infrastructure depends heavily on instrument scientists and engineers, also known as lead mentors. Lead mentors must have an excellent understanding of in situ and remote-sensing instrumentation theory and operation and have comprehensive knowledge of critical scale-dependent atmospheric processes. They must also possess the technical and analytical skills to develop new data retrievals that provide innovative approaches for creating research-quality data sets. The ARM Climate Research Facility is seeking the best overall qualified candidate who can fulfill lead mentor requirements in a timely manner.

  7. Mindfulness and psychological process.

    Science.gov (United States)

    Williams, J Mark G

    2010-02-01

    The author reviews the articles in the Special Section on Mindfulness, starting from the assumption that emotions evolved as signaling systems that need to be sensitive to environmental contingencies. Failure to switch off emotion is due to the activation of mental representations of present, past, and future that are created independently of external contingencies. Mindfulness training can be seen as one way to teach people to discriminate such "simulations" from objects and contingencies as they actually are. The articles in this Special Section show how even brief laboratory training can have effects on processing affective stimuli; that long-term meditation practitioners show distinct reactions to pain; that longer meditation training is associated with differences in brain structure; that 8 weeks' mindfulness practice brings about changes in the way emotion is processed showing that participants can learn to uncouple the sensory, directly experienced self from the "narrative" self; that mindfulness training can affect working memory capacity, and enhance the ability of participants to talk about past crises in a way that enables them to remain specific and yet not be overwhelmed. The implications of these findings for understanding emotion and for further research are discussed.

  8. Turbulence and Stochastic Processes

    Science.gov (United States)

    Celani, Antonio; Mazzino, Andrea; Pumir, Alain

    In 1931 the monograph Analytical Methods in Probability Theory appeared, in which A.N. Kolmogorov laid the foundations for the modern theory of Markov processes [1]. According to Gnedenko: "In the history of probability theory it is difficult to find other works that changed the established points of view and basic trends in research work in such a decisive way". Ten years later, his article on fully developed turbulence provided the framework within which most, if not all, of the subsequent theoretical investigations have been conducted [2] (see e.g. the review by Biferale et al. in this volume [3]. Remarkably, the greatest advances made in the last few years towards a thorough understanding of turbulence developed from the successful marriage between the theory of stochastic processes and the phenomenology of turbulent transport of scalar fields. In this article we will summarize these recent developments which expose the direct link between the intermittency of transported fields and the statistical properties of particle trajectories advected by the turbulent flow (see also [4], and, for a more thorough review, [5]. We also discuss the perspectives of the Lagrangian approach beyond passive scalars, especially for the modeling of hydrodynamic turbulence.

  9. [In Process Citation].

    Science.gov (United States)

    Böthin, Elke

    2015-01-01

    The discussion on systematization and methodology, combined with the question of state interference and professional political interests, has accompanied Continuing Medical Education (CME) from its beginnings until today. The development of CME for the period 1871 to 1945 shows a process of systematization at the organizational, administrative and structural levels, with the participation of government and medical professional policy. In the time of the German Empire the foundation for a structured CME was created by establishing a system of charge-free and decentralized training courses. For the first time, thought was given to distinguishing different types of medical education. During the period of the Weimar Republic the structure of decentralized CME was supported by developing new teaching and learning methods. The medical professional representatives refused any state participation except financial assistance. In the Nazi era CME was brought into line with Nazi ideology concerning structure, organization and administration. CME obligations were mandatory for all physicians and trained them especially in Nazi ideology. CME as a lifelong process, centrally guided in combination with decentralized structures, is an appropriate way to offer training to all physicians. State support is important and necessary, but shaping the contents should be a duty of the professional associations of physicians. The quality of patient care is no longer ensured as soon as the aspect of cost-efficiency is transformed into a pressure factor concerning political interests and social insurance. PMID:26790195

  11. NANOSCALE PROCESS ENGINEERING

    Institute of Scientific and Technical Information of China (English)

    Qixiang Wang; Fei Wei

    2003-01-01

    Research in nanoscale process engineering (NPE) builds on the interdisciplinary nature of nanoscale science and technology. NPE deals mainly with the transformation of materials and energy into nanostructured materials and nanodevices, and it synergizes the multidisciplinary convergence between materials science and technology, biotechnology, and information technology. The core technologies of NPE concern all aspects of nanodevice construction and operation, such as the manufacture of nanomaterials "by design", the conception and design of nanoarchitectures, and the manufacture and control of customizable nanodevices. At present, the two main targets of NPE are nanoscale manufacture and the conceptual design of nanodevices. Research progress in nanoscale manufacturing processes, focused on creating nanostructures and assembling them into nanosystems and larger-scale architectures, has established NPE as an interdiscipline. The conception and design of smart, multi-functional, environmentally compatible and customizable nanodevice prototypes, built from nanostructured systems of nanocrystalline, nanoporous and microemulsion materials, are the most challenging tasks of NPE. The development of NPE may also impel us to reconsider the curriculum and educational reform of chemical engineering in universities.

  12. Beryllium Manufacturing Processes

    Energy Technology Data Exchange (ETDEWEB)

    Goldberg, A

    2006-06-30

    This report is one of a number of reports that will be combined into a handbook on beryllium. Each report covers a specific topic. To date, the following reports have been published: (1) Consolidation and Grades of Beryllium; (2) Mechanical Properties of Beryllium and the Factors Affecting these Properties; (3) Corrosion and Corrosion Protection of Beryllium; (4) Joining of Beryllium; (5) Atomic, Crystal, Elastic, Thermal, Nuclear, and other Properties of Beryllium; and (6) Beryllium Coating (Deposition) Processes and the Influence of Processing Parameters on Properties and Microstructure. The conventional method of using ingot-cast material is unsuitable for manufacturing a beryllium product. Beryllium is a highly reactive metal with a high melting point, making it susceptible to reaction with mold-wall materials, forming beryllium compounds (BeO, etc.) that become entrapped in the solidified metal. In addition, the grain size is excessively large, being 50 to 100 µm in diameter, while grain sizes of 15 µm or less are required to meet acceptable strength and ductility requirements. Attempts at refining the as-cast grain size have been unsuccessful. Because of the large grain size and limited slip systems, the casting will invariably crack during a hot-working step, which is an important step in the microstructural-refining process. The high reactivity of beryllium together with its high viscosity (even with substantial superheat) also makes it an unsuitable candidate for precision casting. In order to overcome these problems, alternative methods have been developed for the manufacturing of beryllium. The vast majority of these methods involve the use of beryllium powders. The powders are consolidated under pressure in vacuum at an elevated temperature to produce vacuum hot-pressed (VHP) blocks and vacuum hot-isostatic-pressed (HIP) forms and billets. The blocks (typically cylindrical), which are produced over a wide range of sizes (up to 183 cm dia. by 61

  13. EDITORIAL: Industrial Process Tomography

    Science.gov (United States)

    West, Robert M.

    2004-07-01

    Industrial process tomography remains a multidisciplinary field with considerable interest for many varied participants. Indeed this adds greatly to its appeal. It is a pleasure and a privilege to once again act as guest editor for a special feature issue of Measurement Science and Technology on industrial process tomography, the last being in December 2002. Those involved in the subject appreciate the efforts of Measurement Science and Technology in producing another issue and I thank the journal on their behalf. It can be seen that there are considerable differences in the composition of material covered in this issue compared with previous publications. The dominance of electrical impedance and electrical capacitance techniques is reduced and there is increased emphasis on general utility of tomographic methods. This is encompassed in the papers of Hoyle and Jia (visualization) and Dierick et al (Octopus). Electrical capacitance tomography has been a core modality for industrial applications. This issue includes new work in two very interesting aspects of image reconstruction: pattern matching (Takei and Saito) and simulated annealing (Ortiz-Aleman et al). It is important to take advantage of knowledge of the process such as the presence of only two components, and then to have robust reconstruction methods provided by pattern matching and by simulated annealing. Although crude reconstruction methods such as approximation by linear back projection were utilized for initial work on electrical impedance tomography, the techniques published here are much more advanced. The paper by Kim et al includes modelling of a two-component system permitting an adaption-related approach; the paper by Tossavainen et al models free surface boundaries to enable the estimation of shapes of objects within the target. There are clear improvements on the previous crude and blurred reconstructions, where boundaries were merely inferred rather than estimated as in these new developments.

  14. Poultry Slaughtering and Processing Facilities

    Data.gov (United States)

    Department of Homeland Security — Agriculture Production: Poultry Slaughtering and Processing in the United States. This dataset consists of facilities which engage in slaughtering, processing, and/or...

  15. Particle processing technology

    Science.gov (United States)

    Sakka, Yoshio

    2014-02-01

    In recent years, there has been strong demand for the development of novel devices and equipment that support advanced industries including IT/semiconductors, the environment, energy and aerospace along with the achievement of higher efficiency and reduced environmental impact. Many studies have been conducted on the fabrication of innovative inorganic materials with novel individual properties and/or multifunctional properties including electrical, dielectric, thermal, optical, chemical and mechanical properties through the development of particle processing. The fundamental technologies that are key to realizing such materials are (i) the synthesis of nanoparticles with uniform composition and controlled crystallite size, (ii) the arrangement/assembly and controlled dispersion of nanoparticles with controlled particle size, (iii) the precise structural control at all levels from micrometer to nanometer order and (iv) the nanostructural design based on theoretical/experimental studies of the correlation between the local structure and the functions of interest. In particular, it is now understood that the application of an external stimulus, such as magnetic energy, electrical energy and/or stress, to a reaction field is effective in realizing advanced particle processing [1-3]. This special issue comprises 12 papers including three review papers. Among them, seven papers are concerned with phosphor particles, such as silicon, metals, Si3N4-related nitrides, rare-earth oxides, garnet oxides, rare-earth sulfur oxides and rare-earth hydroxides. In these papers, the effects of particle size, morphology, dispersion, surface states, dopant concentration and other factors on the optical properties of phosphor particles and their applications are discussed. These nanoparticles are classified as zero-dimensional materials. Carbon nanotubes (CNT) and graphene are well-known one-dimensional (1D) and two-dimensional (2D) materials, respectively. This special issue also

  16. Radiation in industrial processes

    International Nuclear Information System (INIS)

    The uses of ionizing radiation can be divided into two broad categories. First, it can be used as a tool of investigation, measurement and testing, and secondly, it can be a direct agent in inducing chemical processes. For example, radiation can help in the detecting and locating of malignant tumours, and it can be employed also for the destruction of those tumours. Again, it can reveal intricate processes of plant growth and, at the same time, can initiate certain processes which result in the growth of new varieties of plants. Similarly in industry, radiation is both a tool of detection, testing and measurement and an active agent for the initiation of useful chemical reactions. The initiation of chemical reactions usually requires larger and more powerful sources of radiation. Such radiation can be provided by substances like cobalt-60 and caesium-137 or by machines which accelerate nuclear particles to very high energies. Of the particle-accelerating machines, the most useful in this field are those which accelerate electrons to energies considerably higher than those possessed by the electrons (beta particles) emitted by radioactive substances. These high-energy radiations produce interesting reactions both in organic life and in materials for industry. Several of the papers presented at the Warsaw conference were devoted to the application of ionizing radiation to polymerization and other useful reactions in the manufacture and treatment of plastics. The polymerization of the ethylene series of hydrocarbons was discussed from various angles and the technical characteristics and requirements were described. It was pointed out by some experts that the cross-linking effect of radiation resulted in a superior product, opening the way to new applications of polyethylene. Irradiated polyethylene film has been sold for several years, and electrical wire has been made with irradiated polyethylene as the insulating jacket. Other reactions discussed included the cross

  17. Titan's global geologic processes

    Science.gov (United States)

    Malaska, Michael; Lopes, Rosaly M. C.; Schoenfeld, Ashley; Birch, Samuel; Hayes, Alexander; Williams, David A.; Solomonidou, Anezina; Janssen, Michael A.; Le Gall, Alice; Soderblom, Jason M.; Neish, Catherine; Turtle, Elizabeth P.; Cassini RADAR Team

    2016-10-01

    We have mapped the Cassini SAR imaged areas of Saturn's moon Titan in order to determine the geological properties that modify the surface [1]. We used the SAR dataset for mapping, but incorporated data from radiometry, VIMS, ISS, and SARTopo for terrain unit determination. This work extends our analyses of the mid-latitude/equatorial Afekan Crater region [2] and of the southern and northern polar regions [3]. We placed Titan terrains into six broad terrain classes: craters, mountain/hummocky, labyrinth, plains, dunes, and lakes. We also extended the fluvial mapping done by Burr et al. [4], and defined areas as potential cryovolcanic features [5]. We found that hummocky/mountainous and labyrinth areas are the oldest units on Titan, and that lakes and dunes are among the youngest. Plains units are the largest unit in terms of surface area, followed by the dunes unit. Radiometry data suggest that most of Titan's surface is covered in high-emissivity materials, consistent with organic materials, with only minor exposures of low-emissivity materials that are consistent with water ice, primarily in the mountain and hummocky areas and crater rims and ejecta [6, 7]. From examination of terrain orientation, we find that landscape evolution in the mid-latitude and equatorial regions is driven by aeolian processes, while polar landscapes are shaped by fluvial, lacustrine, and possibly dissolution or volatilization processes involving cycling organic materials [3, 8]. Although important in deciphering Titan's terrain evolution, impact processes play a very minor role in the modification of Titan's landscape [9]. We find no evidence for large-scale aqueous cryovolcanic deposits.References: [1] Lopes, R.M.C. et al. (2010) Icarus, 205, 540–558. [2] Malaska, M.J. et al. (2016) Icarus, 270, 130–161. [3] Birch et al., in revision. [4] Burr et al. (2013) GSA Bulletin 125, 299–321. [5] Lopes et al. JGR: Planets, 118, 1–20. [6] Janssen et al., (2009) Icarus, 200, 222–239. [7

  18. Sensors in Spray Processes

    Science.gov (United States)

    Fauchais, P.; Vardelle, M.

    2010-06-01

    This paper presents our current knowledge of the sensors used, in the harsh environment of spray booths, to improve the reproducibility and reliability of coatings sprayed with hot or cold gases. First, the different sensors that follow the in-flight parameters of hot particles (trajectories, temperatures, velocities, sizes, and shapes) are described, together with their limitations and precision. A few comments are also made about techniques, still under development in laboratories, to improve our understanding of coating formation, such as plasma-jet temperature measurements in non-symmetrical conditions, hot-gas heat flux, particle flattening and splat formation, and particle evaporation. Then the techniques of illumination by laser flash are described, applied to either cold particles (those injected into hot gases, or in a cold spray gun) or liquids injected into hot gases (suspensions or solutions). The possibilities they open to determine the flux and velocities of cold particles, or to visualize liquid penetration into the core of hot gases, are discussed. Afterwards, sensors are presented that follow, when spraying hot particles, the evolution of substrate and coating temperature, the stress development within coatings during the spray process, and the coating thickness. The different uses of these sensors are then described, successively: (i) measurements limited to particle trajectories, velocities, temperatures, and sizes in different spray conditions: plasma (including transient conditions due to arc-root fluctuations in d.c. plasma jets), HVOF, wire arc, and cold spray; it is then discussed how such sensor data can be used to achieve a better understanding of the different spray processes, compare experiments to calculations, and improve the reproducibility and reliability of the spray conditions; (ii) coating monitoring through in-flight measurements coupled with those devoted to coating formation. This is achieved by either maintaining at their set point both in-flight and

  19. Composable Data Processing in Environmental Science - A Process View

    OpenAIRE

    Wombacher, A.

    2008-01-01

    Data processing in environmental science is essential for doing science. The heterogeneity of data sources, data-processing operations and infrastructures results in a great deal of manual data- and process-integration work done by each scientist individually. This is very inefficient and time-consuming. The aim is to provide a view-based approach to accessing and processing data, supporting a more generic infrastructure to integrate processing steps from different organizations, systems and libraries...

  20. CONVERGENCE TO PROCESS ORGANIZATION BY MODEL OF PROCESS MATURITY

    OpenAIRE

    Blaženka Piuković Babičković; Željko Vojinović

    2015-01-01

    Modern business process orientation is bound up primarily with process thinking and a process-based organizational structure. Although business processes are increasingly written and spoken about, they remain a major problem in the business world, especially in countries in transition, where it has been found that there is a lack of understanding of the concept of business process management. The aim of this paper is to make a specific contribution to overcoming the identified problem, by poin...