WorldWideScience

Sample records for hydrogasification process analysis

  1. Fiscal 1991 survey report. Coal hydrogasification technology development; 1991 nendo sekitan suiten gaska gijutsu kaihatsu chosa hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1992-03-01

    Surveys and studies were conducted to establish a practical SNG (substitute natural gas) production process. The survey of basic studies on hydrogasification dealt with the effect of gasification conditions, the mechanism of tar decomposition, model-based estimation and assessment of the reaction heat, and so forth. In the reactor development effort, the current status was studied and future tasks were identified for the one-through type and the internal circulation type entrained bed hydrogasification furnaces. In the study of practical application of the coal hydrogasification process, it was found that the gas cooling efficiency would be increased from last fiscal year's 75.2% to approximately 78% by optimizing the process configuration. An ARCH (advanced rapid coal hydrogasification) process featuring a novel reactor was proposed and, toward its commercialization, guidelines for scaling up the process were worked out and the tasks to be addressed at each development stage were identified. As for pilot tests, an efficient development program, comprising in particular ARCH-1 and ARCH-2, was deliberated. (NEDO)
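
    The "gas cooling efficiency" quoted throughout these NEDO records is presumably the cold gas efficiency of the gasification process; under that reading it would be conventionally defined as

      \eta_{cg} = \frac{\dot{m}_{gas}\,HHV_{gas}}{\dot{m}_{coal}\,HHV_{coal}}

    i.e. the heating value carried by the cold product gas divided by the heating value of the feed coal, so the approximately 78% above would mean that about 78% of the coal's heating value is retained in the product gas.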

  2. Production of Fischer–Tropsch fuels and electricity from bituminous coal based on steam hydrogasification

    International Nuclear Information System (INIS)

    Lu, Xiaoming; Norbeck, Joseph M.; Park, Chan S.

    2012-01-01

    A new thermochemical process for coproduction of Fischer–Tropsch (FT) fuels and electricity based on steam hydrogasification is presented and evaluated in this study. The core parts are the Steam Hydrogasification Reactor (SHR), the Steam Methane Reformer (SMR) and the Fischer–Tropsch Reactor (FTR). A key feature of the SHR is the enhanced conversion of carbon into methane in a high-steam environment with hydrogen, with no need for a catalyst or the use of oxygen. Facilities utilizing bituminous coal for coproduction of FT fuels and electricity with carbon dioxide sequestration are designed in detail. Cases with a design capacity of either 400 or 4000 TPD (tonnes per day, dry basis) are investigated with process modeling and cost estimation. A cash flow analysis is performed to determine the fuels production cost (PC). The analysis shows that the 400 TPD case, with an FT fuels PC of 5.99 $/gallon diesel equivalent, results in a plant design that is totally uneconomic. The 4000 TPD plant design is expected to produce 7143 bbl/day of FT liquids with a PC of 2.02 $/gallon and 2.27 $/gallon diesel equivalent at overall carbon capture ratios of 65% and 90%, respectively. Prospective commercial economics benefit from increasing plant size and from improvements gained through large-scale demonstration of steam hydrogasification. -- Highlights: ► We develop a new thermochemical method for synthetic fuels production. ► Detailed plant design and process modeling for the Coal-to-Liquid facilities are performed. ► Economic analysis has been carried out to determine the fuel production cost and IRR. ► The fuels produced in this study can compete with petroleum when the crude oil price is 100 $/bbl. ► Further economic benefit comes with plant scale-up and process commercial demonstration efforts.
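
    As a rough illustration of the cash-flow screening behind the production costs quoted above, the Python sketch below levelizes capital, O&M, and coal feed costs over annual fuel output. Except for the 4000 TPD feed rate and the 7143 bbl/day FT liquids output taken from the abstract, every number is a hypothetical placeholder (and byproduct electricity revenue, which the study credits, is ignored), so this is a sketch of the method, not a reproduction of the study's result.

      # Levelized production-cost screening sketch. Placeholder economics;
      # only the feed rate and fuel output echo the abstract.
      def production_cost(capex_usd, fixed_charge_rate, annual_opex_usd,
                          feed_tpd, feed_usd_per_tonne,
                          fuel_bbl_per_day, capacity_factor=0.9):
          """Levelized fuel production cost in $/gallon (42 gal per bbl)."""
          days = 365 * capacity_factor
          annual_cost = (capex_usd * fixed_charge_rate            # capital charge
                         + annual_opex_usd                        # fixed O&M
                         + feed_tpd * days * feed_usd_per_tonne)  # coal feed
          annual_gallons = fuel_bbl_per_day * days * 42
          return annual_cost / annual_gallons

      # Hypothetical inputs: $0.8B capex, 15% charge rate, $50M/yr O&M, $20/t coal.
      pc = production_cost(8.0e8, 0.15, 5.0e7, 4000, 20.0, 7143)
      print(f"screening production cost: {pc:.2f} $/gallon")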

  3. Fiscal 1993 survey report. Coal hydrogasification technology development; 1993 nendo sekitan suiten gaska gijutsu kaihatsu chosa hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-03-01

    Surveys and studies were conducted to establish a practical SNG (substitute natural gas) production process. The hydrogasification process that Japan should develop is a flexible process that operates in three modes: maximum SNG yield, maximum heat efficiency, and maximum BTX (benzene, toluene, xylene) yield. Accordingly, an ARCH (advanced rapid coal hydrogasification) process was proposed, provided with a reactor capable of ARCH-1 type operation for maximum gas cooling efficiency and ARCH-2 type operation for maximum liquid yield. As for the details of the ARCH process development, the timing and priority of development were determined for each item in consideration of the technical content and the development steps in the flow from a bench plant to a demonstration plant. The technology of char cooling and extraction was specified as the first item to be dealt with immediately. As for the development of the hydrogasification reactor, it was concluded that it was appropriate to begin with the development of an injector. According to the development plan, the cost required up to a pilot plant test was estimated at 2 billion yen. (NEDO)

  4. Fiscal 1992 survey report. Coal hydrogasification technology development; 1992 nendo sekitan suiten gaska gijutsu kaihatsu chosa hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1993-03-01

    Surveys and studies were conducted to establish a practical SNG (substitute natural gas) production process. In the study of coal hydrogasification, a mathematical simulation was implemented to estimate the distribution of products with the pyrolytic reaction and the hydrogenolytic reaction controlled independently, these two reactions representing the key concepts of the ARCH-2 (advanced rapid coal hydrogasification-2) process. It was found that a two-stage reaction control would increase the liquid yield. Also, a tentative calculation was made of the gas cooling efficiency and cost performance of a process capable of achieving the target liquid yield. It was found that BTX (benzene, toluene, xylene) production up to approximately 15% in terms of carbon was feasible and that the SNG price would be 29.03 yen/Nm{sup 3} with benzene priced at 90 yen/kg, both promising better results than the ARCH-1 process. However, the gas cooling efficiency of the ARCH-2 process was only 72.0% or less, which demanded improvement. Based on the results of the studies in progress since fiscal 1990, discussions were held on what hydrogasification process Japan should develop. (NEDO)

  5. Exergy analysis of a coal/biomass co-hydrogasification based chemical looping power generation system

    International Nuclear Information System (INIS)

    Yan, Linbo; Yue, Guangxi; He, Boshu

    2015-01-01

    Power generation from co-utilization of coal and biomass is very attractive since this technology can not only save coal resources but also make efficient use of biomass. In addition, with this concept, the net carbon discharge per unit of electric power generated can be sharply reduced. In this work, a coal/biomass co-hydrogasification based chemical looping power generation system is presented and analyzed with the assistance of Aspen Plus. The effects of different operating conditions, including the biomass mass fraction (R_b), the hydrogen recycle ratio (R_hr), the hydrogasification pressure (P_hg), the iron to fuel mole ratio (R_if), the reducer temperature (T_re), the oxidizer temperature (T_ox), and the fuel utilization factor (U_f) of the SOFC (solid oxide fuel cell), on the system operation results, including the energy efficiency (η_e), the total energy efficiency (η_te), the exergy efficiency (η_ex), the total exergy efficiency (η_tex) and the carbon capture rate (η_cc), are analyzed. The energy and exergy balances of the whole system are also calculated and the corresponding Sankey and Grassmann diagrams are drawn. Under the benchmark condition, the exergy efficiencies of the different units in the system are calculated. η_te, η_tex and η_cc of the system are found to be 43.6%, 41.2% and 99.1%, respectively. - Highlights: • A coal/biomass co-hydrogasification based chemical looping power generation system is set up. • Sankey and Grassmann diagrams are presented based on the energy and exergy balance calculations. • Sensitivity analysis is done to understand the system operation characteristics. • Total energy and exergy efficiencies of this system can be 43.6% and 41.2%, respectively. • About 99.1% of the carbon contained in the coal and biomass can be captured in this system.
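
    As a minimal illustration of how the quoted system-level efficiencies are computed from energy and exergy balances, the Python sketch below does the stream bookkeeping. The stream values are invented placeholders; the study obtains them from an Aspen Plus flowsheet of the co-hydrogasification chemical looping plant.

      # Energy/exergy efficiency bookkeeping sketch with placeholder streams.
      from dataclasses import dataclass

      @dataclass
      class Stream:
          name: str
          energy_mw: float  # energy flow (LHV basis for fuels)
          exergy_mw: float  # flow exergy

      inputs = [Stream("coal", 250.0, 262.0),        # placeholder values
                Stream("biomass", 100.0, 108.4)]
      outputs = [Stream("net power", 152.6, 152.6)]  # for electricity, energy = exergy

      eta_e = sum(s.energy_mw for s in outputs) / sum(s.energy_mw for s in inputs)
      eta_ex = sum(s.exergy_mw for s in outputs) / sum(s.exergy_mw for s in inputs)
      print(f"energy efficiency {eta_e:.1%}, exergy efficiency {eta_ex:.1%}")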

  6. Pilot plant for hydrogasification of coal with nuclear heat

    International Nuclear Information System (INIS)

    Falkenhain, G.; Velling, G.

    1976-01-01

    In the framework of a research and development programme sponsored by the Ministry of Research and Technology of the Federal Republic of Germany, two process variants for hydrogasification of coal by means of nuclear heat have been developed by Rheinische Braunkohlenwerke AG, Cologne. For testing these process variants, a semi-technical pilot plant for gasification of coal under pressure in a fluidized bed was constructed. The pilot plant, in which the gasification of lignite and hard coal is planned, is designed for a throughput of 100 kg of carbon per hour, corresponding to 400 kg of raw lignite per hour or 150 kg of hard coal per hour. The plant should provide data on the influence of the most essential process parameters (pressure, temperature, residence time of gas and coal, type and pre-treatment of feed coal) on the performance of gasification and the raw gas composition. Different plant components will also be tested. Since the pilot plant will permit testing of both process variants of hydrogasification, it was designed in such a way that a great number of process parameters can be varied. Thus, for instance, the pressure can be chosen in a range up to 100 bar, and pure hydrogen or mixtures of hydrogen, carbon monoxide and steam can be applied as gasification agents. The gasifier is an internally insulated fluidized bed reactor with an inner diameter of 200 mm and a height of about 8 m, to which an internally insulated cyclone for separation of the entrained fines is attached. The raw gas is then cooled down by direct water scrubbing. (author)

  7. The behavior of catalysts in hydrogasification of sub-bituminous coal in a pressurized fluidized bed

    International Nuclear Information System (INIS)

    Yan, Shuai; Bi, Jicheng; Qu, Xuan

    2017-01-01

    Highlights: • CCHG in a pressurized fluidized bed achieved a 77.3 wt.% CH4 yield in 30 min. • Co-Ca and Ni-Ca triggered catalytic coal pyrolysis and char hydrogasification. • The reason for the better catalytic performance of 5%Co-1%Ca was elucidated. • Sintered catalyst blocked the reactive sites and suppressed coal conversion. • Co-Ca made the catalyzed coal char rich in mesopore structures and reactive sites. -- Abstract: The catalytic hydrogasification of sub-bituminous coal was carried out in a lab-scale pressurized fluidized bed with Co-Ca, Ni-Ca and Fe-Ca as catalysts at 850 °C and 3 MPa. The effect of the different catalysts on the characteristics of the gasification products was investigated, and the behavior of the catalysts was also explored by means of X-ray diffraction (XRD), FT-Raman spectroscopy, Brunauer–Emmett–Teller (BET) analysis, etc. Experimental results showed that all the catalysts promoted carbon conversion in catalytic coal hydrogasification (CCHG), and the catalytic activity was in the order: 5%Co-1%Ca > 5%Ni-1%Ca > 5%Fe-1%Ca. Compared with raw coal hydrogasification, the carbon conversion increased from 43.4 wt.% to 91.3 wt.%, and the CH4 yield increased from 23.7 wt.% to 77.3 wt.% within 30 min after adding the 5%Co-1%Ca catalyst to the coal. Co-Ca and Ni-Ca had a catalytic effect on both the pyrolysis of coal and the hydrogasification of coal char in CCHG, whereby the graphitization of the coal was suppressed and the methane formation rate was significantly accelerated. Fe/Co/Ni-Ca could penetrate into the interior of the coal during CCHG, so that the catalytic production of CH4 took place within the pore structures. The activity difference among the catalysts was owing to their different abilities to rupture the amorphous C−C bonds in the coal structure. The incomplete carbon conversion of the 5%Co-1%Ca loaded coal was due to agglomeration of the catalyst and blockage of the reactive sites by the sintered catalyst. This work will provide

  8. Development of a Hydrogasification Process for Co-Production of Substitute Natural Gas (SNG) and Electric Power from Western Coals

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Xiaolei [Arizona Public Service Company, Phoenix, AZ (United States); Rink, Nancy [Arizona Public Service Company, Phoenix, AZ (United States)

    2011-04-30

    This report presents the results of the research and development conducted on an Advanced Hydrogasification Process (AHP) conceived and developed by Arizona Public Service Company (APS) under U.S. Department of Energy (DOE) contract DE-FC26-06NT42759 for Substitute Natural Gas (SNG) production from western coal. A double-wall (i.e., a hydrogasification reactor contained within a pressure shell) down-flow hydrogasification reactor was designed, engineered, constructed, commissioned and operated by APS, Phoenix, AZ. The reactor is ASME-certified under Section VIII with a rating of 1150 pounds per square inch gage (psig) maximum allowable working pressure at 1950 degrees Fahrenheit (°F). The reaction zone had a 1.75 inch inner diameter and a 13 foot length. Initial testing of a sub-bituminous coal demonstrated ~50% carbon conversion and ~10% methane yield in the product gas at 1625°F and 1000 psig pressure, with an 11 second (s) residence time and a 0.4 hydrogen-to-coal mass ratio. Liquid by-products mainly contained benzene, toluene, xylene (BTX) and tar. Char collected from the bottom of the reactor had a 9000 British thermal units per pound (Btu/lb) heating value. A three-dimensional (3D) computational fluid dynamics model simulation of the hydrodynamics around the reactor head was utilized to design the nozzles for injecting the hydrogen into the gasifier, optimizing gas-solid mixing to achieve improved carbon conversion. The report also presents the evaluation of using algae for carbon dioxide (CO2) management and biofuel production. Nannochloropsis, Selenastrum and Scenedesmus were determined to be the best algae strains for the project purpose and were studied in an outdoor system which included a 6-meter (6M) radius cultivator with a total surface area of 113 square meters (m2) and a total culture volume between 10,000 and 15,000 liters (L); a CO2 on-demand feeding system; an on-line data collection system for temperature, p

  9. Achievement report for fiscal 1997 on investigative research on the social compatibility of coal hydrogasification technology development; 1997 nendo sekitan suiso tenka gas ka gijutsu kaihatsu shakai tekigosei ni kansuru chosa kenkyu seika hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-03-01

    In view of the possibility of future tightness in natural gas supply, the final objective was set as the establishment of a coal gasification technology that can cheaply and stably supply high-quality substitute natural gas, using coal, which exists abundantly around the world, as the raw material. Investigative research is being carried out under a five-year plan on the social compatibility required to assess the possibility of its practical application. In fiscal 1997, the 'survey on process level elevation' and the 'survey on social compatibility' were continued from the previous year. This report summarizes the achievements. In the investigative research on process level elevation, the Shell methane synthesis process based on an oxygen-blown, dry-feed coal gasifier was evaluated, and the calculation procedure for the material balance in a hydrogasification reactor, as performed in the 'survey on developing the coal hydrogasification technology', was pursued and its reasonableness verified. In the survey on the social compatibility of the process, a survey was carried out on natural gas (including non-conventional methane hydrate and coal bed methane) and coals as raw materials for hydrogasification. (NEDO)

  10. Prototype plant for nuclear process heat (PNP) - operation of the pilot plant for hydrogasification of coal

    International Nuclear Information System (INIS)

    Bruengel, N.; Dehms, G.; Fiedler, P.; Gerigk, H.P.; Ruddeck, W.; Schrader, L.; Schumacher, H.J.

    1988-04-01

    Rheinische Braunkohlenwerke AG developed the process of hydrogasification of coal in a fluidized bed for the generation of SNG. On the basis of test results obtained in a semi-technical pilot plant with a throughput of 250 kg/h of dried coal, a large pilot plant processing 10 t/h of dried brown coal was erected. This plant was on stream for about 14,700 h, of which about 7,800 h were with gasifier operation; during this time about 38,000 t of dried brown coal from the Rhenish district, containing 4 to 25% ash, were processed. At pressures of 60 to 120 bar and temperatures of 800 to 935°C, carbon conversion rates of up to 81 percent and methane outputs of 5,000 m3(STP)/h were reached. The decisive parameter for methane generation was the hydrogen/coal ratio. Even at high moisture contents, which usually diminish the methane yield from the coal substantially, high methane yields could be obtained with high hydrogen/coal ratios. The gasifier itself caused no trouble during the total operating time. Difficulties with the original design of the residual char cooler could be overcome by changing over from water injection to liquid carbon dioxide. The design of the heat recovery system proved sound. Altogether, the scale-up of the gasifier from the semi-technical to the large pilot plant, as well as the harmonization of gas generation and gas refining, was thus proven. (orig.) With 20 refs., 20 tabs., 81 figs [de

  11. Fiscal 1998 New Sunshine Plan auxiliary project. Report on results on development of coal hydrogasification technology (Support research); 1998 nendo New Sunshine keikaku hojo jigyo seika hokokusho. Sekitan suiso tenka gaska gijutsu kaihatsu shien kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    R and D was conducted for the purpose of developing an alternative natural gas manufacturing process that, using coal as the raw material, is highly efficient as well as environmentally superior; the fiscal 1998 results are reported. A hydrogasification test was conducted on Taiheiyo coal at a temperature of 1,173 K and a pressure of 7 MPa, which showed that evolution of all gaseous products other than methane ceased roughly during the heat-up process, while methane continued to evolve and showed the highest yield. In a reactivity comparison of various types of coal, coals with a carbon content of 80% or below were high in reactivity and considered suitable as hydrogasification feedstock. It was also suggested that the hydrogasification reactivity of low rank coals, including sub-bituminous coals and below, might be greatly affected by the presence or absence of the catalytic effects of ion-exchanged metals. Experiments on the behavior of sulfur and nitrogen in coal during the hydrogenation reaction were carried out using a continuous free-fall type reactor, which elucidated the effects of hydrogen pressure and gas residence time among the various operational parameters. (NEDO)

  12. Appearance of rapid carbon in hydrogasification of coal; Suiten gas ka ni okeru kokassei tanso no hatsugen

    Energy Technology Data Exchange (ETDEWEB)

    Soneda, Y.; Makino, M. [National Institute for Resources and Environment, Tsukuba (Japan)]; Xu, W. [New Energy and Industrial Technology Development Organization, Tokyo (Japan)

    1998-09-20

    The technology of hydrogasification of coal, now under development in a national project, aims to produce light oil as well as methane-rich high-calorie gas through the direct reaction between coal and hydrogen, and is expected to cope with the difficult natural gas supply/demand situation anticipated for the future. Although a full understanding of the reaction between coal and hydrogen is mandatory for the completion of this technology, many tasks remain to be fulfilled. One of these is the elucidation of the behavior of what is termed rapid carbon, a highly active carbon that appears upon the rapid heating of coal in a high-pressure hydrogen environment. In this paper, some interesting findings about the appearance of rapid carbon are reported. When coal is placed in such an environment, the volatile components are lost first, and then the active carbon reaction occurs. Observation of the behavior of the active carbon in the reaction shows that it is not small in quantity, and observation of its appearance and deactivation during the reaction supports the inference that this reaction is one of the primary reactions in the hydrogasification process. Accordingly, systematic studies of its physical and chemical features from various viewpoints are necessary. 5 refs., 3 figs., 1 tab.

  13. Commissioned operation report for fiscal 1991 on commissioning of surveying high-level development and effective utilization of natural gas, and development of coal hydrogasification technology; 1991 nendo tennen gas kodo kaihatsu yuko riyo chosa tou itaku gyomu hokokusho. Sekitan suiten gaska gijutsu kaihatsu chosa

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1992-03-01

    With the objective of establishing a practically usable process to manufacture substitute natural gas, discussions were held on the related technical, economic, and developmental problems. This paper summarizes the achievements in fiscal 1991. The summary of the surveys in the current fiscal year is as follows: the coal hydrogasification process is regarded as highly necessary for the gas industry because of its high thermal efficiency and low gas cost; in order to evaluate the reaction heat of the hydrogasification reaction, a flexible mathematical model was constructed, and a large number of findings were derived, including the performance of the reactor and the optimum operating conditions; in addition to clarifying the conditions for an entrained bed hydrogasification reactor, comparisons and discussions were made of the internally circulating reactor and the one-through reactor; studies were performed on thermal efficiency and gas cost for the optimized process configuration on the ARCH-1 process basis; and a proposal was made for testing a new reactor having a two-step reaction zone, which could be expected to increase the liquid yield and contribute to reducing the gas cost. (NEDO)

  14. Fiscal 1996 New Sunshine Plan auxiliary project. Report on results on development of coal hydrogasification technology (Support research); 1996 nendo New Sunshine keikaku hojo jigyo seika hokokusho. Sekitan suiso tenka gaska gijutsu kaihatsu shien kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    R and D was conducted for the purpose of developing an alternative natural gas manufacturing process that, using coal as the raw material, is highly efficient as well as environmentally superior; the fiscal 1996 results are reported. Test research was carried out on the coal hydrogasification process, which had been selected as the optimum process based on the studies of previous years. On the basis of the results of those studies, a free-fall pyrolyzer was employed as the test equipment for research on the behavior of hetero-elements. The basic specifications were set based on a hydrogen/coal ratio of 0.2, a temperature of 950 degrees C and a pressure of 7 MPa, which are the reaction conditions of the ARCH reactor, and the basic design was drafted accordingly. In a preliminary examination of the mechanisms of the coal hydrogasification reaction, 20 representative kinds of coal were analyzed, and for six of them the gasification behavior was studied using a constant-heating-rate pyrolyzer. As a result, from the temperature dependence of the methane formation rate, it was inferred that three components of the coal had a bearing on the gasification behavior. (NEDO)

  15. Fiscal 1992 report. Overseas surveys out of surveys for coal hydrogasification technology development; 1992 nendo sekitan suiten gaska gijutsu kaihatsu chosa ni okeru kaigai chosa hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1992-12-01

    As part of the coal hydrogasification technology development survey project, overseas surveys were carried out as in the preceding fiscal year. With emphasis placed on process materials and resources and on product utilization technologies, surveys and studies were conducted on trends in the development of coal and natural gas resources, and information was collected on energy-related matters in Indonesia and Australia. The need for hydrogasification technology was investigated from the viewpoint of natural resources. Moreover, Japanese engineers were dispatched to the APEC (Asia-Pacific Economic Cooperation) New Energy Seminar in Indonesia. Visits were made for information on natural gas resources at an LNG base in East Kalimantan, Indonesia; on coal gasification, energy, and other matters at CSIRO (Commonwealth Scientific and Industrial Research Organisation), Australia; on coal bed methane resources at the Warren Centre, University of Sydney, Australia; on coal bed methane resources at the Brisbane office of Mitsubishi Gas Chemical Company, Inc.; and on coal resources at the coal mines of Idemitsu South Queensland Coal Pty Ltd. (NEDO)

  16. Fiscal 1994 survey report. Coal hydrogasification technology development; 1994 nendo sekitan suiten gaska gijutsu kaihatsu chosa hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-03-01

    Surveys and studies were conducted to establish a practical SNG (substitute natural gas) production process. For the development of an ARCH (advanced rapid coal hydrogasification) process, a plan was prepared covering the basic concept of the process, the overall development program, a hydrogen/oxygen burner, and an injector. The overall development program comprises element studies (4 years) and operation of a 50 tons/day pilot plant (8 years), and deals with the development of a reactor and peripheral equipment. Next comes a total system verification effort using a 200 tons/day verification plant in combination with a hydrogen production process, aiming to achieve commercialization at 3 million Nm{sup 3}/day. As for the hydrogen/oxygen burner, a structure was proposed after surveys of the literature and patents on burner structures, ignition methods, and monitoring methods. For the development of an injector, a plan was prepared for testing and improving the performance of a specimen incorporating the proposed hydrogen/oxygen burner in cold and hot models. Basic studies to be carried out include simulation-aided performance prediction. (NEDO)

  17. Achievement report for fiscal 1997 (edition B) on auxiliary New Sunshine Project. Development of coal hydrogasification technology (Research by using experimental device); 1997 nendo New Sunshine keikaku hojo jigyo seika hokokusho (B ban). Sekitan suiso tenka gaska gijutsu kaihatsu - Jikken sochi ni yoru kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-03-01

    With the objective of putting the coal hydrogasification technology (the ARCH process) to practical use, developmental research has been performed. This paper summarizes the achievements in fiscal 1997. In the research using a small testing device, Taiheiyo coal was used to derive hydrogasification data (distribution and yield of the reaction products) while varying the temperature, residence time and H{sub 2}/coal ratio at a pressure of 7.0 MPa. In the developmental research on the injector, a test to verify mixing performance was performed by simulating the coal/hydrogen system with gas/gas and coal/gas systems at normal temperature and pressure. Furthermore, discussions were held on the heat conduction analysis and the cooling structure, and an injector was designed and fabricated. With respect to the hot model test to verify the performance of the injector, detailed design and partial fabrication of the test device were carried out. In addition, a coal/gas mixing simulation to model the states of dispersion and mixing of the coal was developed as the first phase of developing the mixing and temperature rise simulation. (NEDO)

  18. Fiscal 1994 entrusted task report. Surveys of advanced natural gas development and efficient utilization (Survey of coal hydrogasification technology development); 1994 nendo tennen gas kodo kaihatsu yuko riyo chosa tou itaku gyomu hokokusho. Sekitan suiten gaska gijutsu kaihatsu chosa

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-03-01

    For the establishment of a practical process for substitute natural gas (SNG) production, technological and economic assessments were made, and the tasks to be addressed for development were discussed. In this fiscal year, the results of the surveys conducted in the past five-year period were compiled, and studies were made to prepare for a smooth transition to the element research stage. The findings obtained are described below. SNG production technologies need to be developed, with the demand for SNG increasing sharply, to further stabilize the base of SNG supply; coal, which is abundantly available, should be used as the material for SNG; and coal hydrogasification is, among the various methods for producing SNG from coal, the most suitable in view of efficiency and cost performance. It was also found, after a prolonged study on improving efficiency and cost performance, that the yield of BTX (benzene, toluene, xylene) would very probably increase and cost performance improve. In addition, a basic plan and an element technology research plan were prepared for the development of the ARCH (advanced rapid coal hydrogasification) process. (NEDO)

  19. Development of a Hydrogasification Process for Co-Production of Substitute Natural Gas (SNG) and Electric Power from Western Coals-Phase I

    Energy Technology Data Exchange (ETDEWEB)

    Raymond Hobbs

    2007-05-31

    The Advanced Hydrogasification Process (AHP)--conversion of coal to methane--is being developed through NETL with a DOE grant and has successfully completed its first phase of development. The results so far are encouraging and have led to a commitment by DOE/NETL to begin a second phase--bench-scale reactor vessel testing, expanded engineering analysis and economic perspective review. During the next decade, new means of generating electricity and other forms of energy will be introduced. The members of the AHP team envision a need for expanded sources of natural gas, or substitutes for natural gas, to fuel power generating plants. The initial work the team has completed on a process to use hydrogen to convert coal to methane (pipeline-ready gas) shows promising potential. The team has intentionally slanted its efforts toward the needs of US electric utilities, particularly toward fuels that can be used near urban centers, where the greatest need for new electric generation is found. The process, as it has evolved, would produce methane from coal by adding hydrogen. The process appears to be efficient using western coals for conversion to a highly sought-after fuel with significantly reduced CO{sub 2} emissions. Utilities have a natural interest in the preservation of their industry, which will require a dramatic reduction in stack emissions and an increase in sustainable technologies. Utilities tend to rank long-term stable supplies of fuel higher than most industries and are willing to trade some ratio of cost for stability. The needs for sustainability, stability and environmentally compatible production are key drivers in the formation and progression of the AHP development. In Phase II, the team will add a focus on water conservation to determine how the basic gasification process can best be integrated with all the plant components to minimize water consumption during SNG production. The process allows for several CO{sub 2} reduction options including consumption of

  20. Fiscal 1991 report. Overseas surveys out of surveys for coal hydrogasification technology development; 1991 nendo sekitan suiten gaska gijutsu kaihatsu chosa ni okeru kaigai chosa hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1991-11-01

    For the selection and evaluation of coal gasification processes suitable for substitute natural gas (SNG) production, visits were made to business corporations, research institutes, etc., engaged in the development of coal gasification technology abroad; the development status overseas was surveyed and information was collected. Visits were made and information was collected on the Lurgi process, a commercial SNG plant, and others at Dakota Gasification Company, U.S.; the U-Gas process and others at the Institute of Gas Technology; energy-related matters at the U.S. Department of Energy; the coal hydrogasification process and others at the Midlands Research Station, British Gas plc; the Shell coal gasification process and others at the Amsterdam Research Institute, Royal Dutch Shell; coal gasification, high-temperature desulfurization, and others at KEMA, Holland; and the IGCC (integrated gasification combined cycle) verification plant incorporating the Shell coal gasification process, now under construction at Demkolec. (NEDO)

  1. Catalysis of metal-clay intercalation compound in the low temperature coal hydrogasification

    Energy Technology Data Exchange (ETDEWEB)

    Fuda, Kiyoshi; Kimura, Mitsuhiko; Miyamoto, Norimitsu; Matsunaga, Toshiaki

    1986-10-23

    Focusing on hydrogenating methanation via gas-phase catalytic reactions of low-temperature volatile components, the catalytic effects of Ni metal and the effects of carriers, which strongly influence the catalytic activity of Ni metal, were studied. Sample coals were prepared from Shin-Yubari coal, and two kinds of catalysts were prepared: Ni hydride-montmorillonite complex catalysts, and catalysts produced by supporting Ni nitrate on alumina and firing it in flowing hydrogen. The hydrogasification was carried out in a reaction tube. As a result, the montmorillonite-Ni compound catalysts showed high catalytic effects and a high conversion ratio of 90% or more in low-temperature coal gasification. The catalytic effect of supported Ni metal strongly depended on the carrier substance, and the ranking of the carriers was montmorillonite>zeolite>TiO/sub 2/>alpha-Al/sub 2/O/sub 3/>MgO>SiO/sub 2/=gamma-Al/sub 2/O/sub 3/. (3 figs, 3 tabs, 3 refs)

  2. Development of processes for the utilization of Brazilian coal using nuclear process heat and/or nuclear process steam

    International Nuclear Information System (INIS)

    Bamert, H.; Niessen, H.F.; Walbeck, M.; Wasrzik, U.; Mueller, R.; Schiffers, U.; Strauss, W.

    1980-01-01

    Status of the project: end of the project definition phase and preparation of the planned conceptual phase. Objective of the project: development of processes for the utilization of nuclear process heat and/or nuclear process steam for the gasification of coal with high ash content, in particular coal from Brazil. Results: with the data for Brazilian coal of high ash content (Leao mine; 43% ash in mine-mouth quality, 20% ash after preparation), proposals were worked out for the mine planning and for a number of processes. On the basis of these proposals, and in consideration of the main data specified by the Brazilian working group, two processes were chosen and worked out in conceptual designs: 1) pressurized water reactor + LURGI pressure gasifier/hydrogasification for the production of SNG, and 2) high temperature reactor + steam gasification for the production of town gas. The economic evaluation showed that the two processes are not substantially different in their cost efficiency and that they are economical on a long-term basis. For more specific design work, the implementation of an experimental programme using the semi-technical plants for 'hydrogasification' in Wesseling and 'steam gasification' in Essen has been planned as the conceptual phase. (orig.) [de

  3. FY 2000 report on the results of the project supplementary to the New Sunshine Project - Feasibility of coal hydrogasification technology in China. II - Final report. Investigational study of the social adaptability (Feasibility study of the international cooperation - Report of Beijing Research Institute of Coal Chemistry); 2000 nendo New Sunshine keikaku hojo jigyo (Bessatsu kokusai kyoryoku kanosei chosa Pekin Bai kagaku kenkyujo) hokokusho. Sekitan suiso tenka gaska gijutsu kaihatsu - Shakai tekigo sei ni kansuru chosa kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-03-01

    For the purpose of establishing a coal hydrogasification technology with the potential to produce high-quality substitute natural gas in quantity and at low cost, an investigational study of its social adaptability was made. In this fiscal year, surveys were carried out on natural gas resources and utilization plans in China, the actual state of the town gas business and future plans, etc. As part of the study, the Beijing Research Institute of Coal Chemistry, China Coal Research Institute, made a survey under the research contract. As a result of the survey, the following was found: in Xinjiang and Urumchi, Uigur Autonomous Region, there are abundant coal resources suitable for coal hydrogasification, natural gas transportation pipelines have been constructed, and public facilities are in place; both cities are therefore suitable for the construction of coal hydrogasification plants. Datong, Shanxi Province, is the largest coal-producing city, can ensure long-term coal supply for coal hydrogasification, and has a plan for remodeling old facilities and constructing new ones for the introduction of natural gas; this city is therefore also suitable for the construction of a coal hydrogasification plant. (NEDO)

  4. Fiscal 1990 report. Overseas surveys out of surveys for coal hydrogasification technology development; 1990 nendo sekitan suiten gaska gijutsu kaihatsu chosa ni okeru kaigai chosa hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1990-12-01

    For the selection and evaluation of coal gasification processes suitable for substitute natural gas (SNG) production, visits were made to overseas business corporations, research institutes, etc., engaged in the development of coal gasification technology; the status of development abroad was surveyed and information was collected. Visited were the Westfield Development Centre, British Gas plc; the Midlands Research Station, British Gas plc; IEA Coal Research; IFP (Institut Francais du Petrole); and DMT-FP (DMT-Gesellschaft für Forschung und Prüfung mbH). The Westfield Development Centre uses coal from nearby open-cut mines and supplies town gas to the Scottish region. The slagging Lurgi process, etc., were investigated there. At the Midlands Research Station, where a coal hydrogasification process is under development, the history of the development and the cold model test were summarized, a test plan using a 5 tons/day pilot plant and the modification of test facilities were explained, and the 5 tons/day pilot plant was visited for study. (NEDO)

  5. Achievement report for fiscal 1999 on auxiliary New Sunshine Project. Development of coal hydrogasification technology (Survey and research on its social acceptability); 1999 nendo New Sunshine keikaku hojo jigyo seika hokokusho. Sekitan suiso tenka gaska gijutsu kaihatsu - Shakai tekigo sei ni kansuru chosa kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    With the objective of evaluating the feasibility of practical use and the economics of the coal hydrogasification technology (the ARCH process), survey and research have been performed. This paper summarizes the achievements in fiscal 1999. In the survey on social acceptability, surveys were made of future trends in the supply, demand, and prices of LNG, LPG, and coal for hydrogasification. As a result, it was found that the price of LNG imported into Japan is determined as if linked to the crude oil price, and that Saudi Arabia is the price leader for LPG. With respect to the survey on the possibility of international cooperation, surveys were conducted on the prospects for long-term supply and demand in China, and on natural gas resources and their supply and demand. The feasibility study estimated the product gas manufacturing cost after process improvements. In the trial calculation of the costs of the three modes, it was found that, although the profit from byproducts is large, the BTX-maximized mode results in a manufacturing cost as much as 2 to 3 yen per Nm{sup 3} higher than the other modes because of higher unit consumption of raw materials and higher construction cost. (NEDO)

  6. Achievement report for fiscal 1998 (edition B) on auxiliary New Sunshine Project. Development of coal hydrogasification technology (Research by using experimental device); 1998 nendo New Sunshine keikaku hojo jigyo seika hokokusho (B ban). Sekitan suiso tenka gaska gijutsu kaihatsu - Jikken sochi ni yoru kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    With the objective of putting the coal hydrogasification technology (the ARCH process) to practical use, developmental research has been performed on important elementary technologies using different experimental devices. This paper summarizes the achievements in fiscal 1998. In the research using a small testing apparatus, Taiheiyo coal was used to derive hydrogasification data (distribution and yield of the reaction products) while varying the reaction pressure, heating rate, and H{sub 2}/coal ratio, and to verify the possibility of increasing the BTX yield by installing a two-step temperature zone. In the developmental research on the injector, a combustion test and a coal feeding test were performed on the injector designed and fabricated in the previous fiscal year, to verify its basic performance and evaluate its heat resistance and durability. With respect to the hot model test, a test installation with the injector mounted was completed, and trial operation and testing were conducted. In addition, a coal temperature rise simulation was developed as the second phase of developing the simulation of the mixing of coal with high-temperature hydrogen and the resulting temperature rise. (NEDO)

  7. A FEASIBILITY STUDY FOR THE COPROCESSING OF FOSSIL FUELS WITH BIOMASS BY THE HYDROCARB PROCESS

    Science.gov (United States)

    The report describes and gives results of an assessment of a new process concept for the production of carbon and methanol from fossil fuels. The Hydrocarb Process consists of the hydrogasification of carbonaceous material to produce methane, which is subsequently thermally decom...

  8. Fuel production from coal by the Mobil Oil process using nuclear high-temperature process heat

    International Nuclear Information System (INIS)

    Hoffmann, G.

    1982-01-01

    Two processes for the production of liquid hydrocarbons are presented: Direct conversion of coal into fuel (coal hydrogenation) and indirect conversion of coal into fuel (syngas production, methanol synthesis, Mobil Oil process). Both processes have several variants in which nuclear process heat may be used; in most cases, the nuclear heat is introduced in the gas production stage. The following gas production processes are compared: LURGI coal gasification process; steam reformer methanation, with and without coal hydrogasification and steam gasification of coal. (orig./EF) [de

  9. FY 1999 report on the results of the project supplementary to the New Sunshine Project - Feasibility of coal hydrogasification technology in China. Investigational study of the social adaptability (Feasibility study of the international cooperation - Report of Beijing Research Institute of Coal Chemistry); 1999 nendo New Sunshine keikaku hojo jigyo (Bessatsu, Kokusai kyoryoku kanosei chosa Pekin Bai kagaku kenkyujo) hokokusho. Sekitan suiso tenka gaska gijutsu kaihatsu - Shakai tekigo sei ni kansuru chosa kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    For the purpose of establishing a coal hydrogasification technology with the potential to produce high-quality substitute natural gas in quantity and at low cost, an investigational study of its social adaptability was made. In this fiscal year, surveys were carried out on the outlook for energy supply/demand in China and the associated problems, natural gas resources and utilization plans, the actual state of the town gas business and future plans, etc. As part of the study, the Beijing Research Institute of Coal Chemistry, China Coal Research Institute, made a survey under the research contract. As to the general situation of natural gas in China, reports were made on the present situation of natural gas resource development in China, the present situation of town gas in large Chinese cities, the present situation and outlook of coal development and utilization in China, assessment of coal mine areas suited to coal hydrogasification, etc. In the survey of areas suitable for coal hydrogasification, reports were made on the present situation and future of energy supply/demand in Shanghai, Shanxi, Shenhua and Xinjiang, the present situation and future of town gas supply, etc. Surveys and reports were also made on coal hydrogasification technology and its applicability. (NEDO)

  10. Prototype plant for nuclear process heat (PNP), reference phase

    International Nuclear Information System (INIS)

    Fladerer, R.; Schrader, L.

    1982-07-01

    The coal gasification processes using nuclear process heat being developed within the framework of the PNP project have the advantages of saving feed coal, improving efficiency, reducing emissions, and stabilizing energy costs. One major gasification process is the hydrogasification of coal for producing SNG or a gas mixture of carbon monoxide and hydrogen; this process can also be applied in a conventional route. The first steps in developing this process were the planning, construction and operation of a semi-technical pilot plant for hydrogasification of coal in a fluidized bed with an input of 100 kg C/h. Before the completion of the development phase (reference phase) described here, several components were tested, for some of which no operational experience had so far been gained; these were newly developed devices, e.g. the inclined tube for feeding coal into the fluidized bed, and the raw gas/hydrogenation gas heat exchanger for utilizing the waste heat of the raw gas leaving the gasifier. Concept optimization of the thoroughly tested equipment parts led to improved operational behaviour. Between 1976 and 1980, the semi-technical pilot plant was operated for about 19,400 hours under test conditions, of which more than 7,400 hours were under gasification conditions. During this time approx. 1,100 metric tons of dry brown coal and more than 13 metric tons of hard coal were gasified. The longest continuous operational phase under gasification conditions was 748 hours, in which 85.4 metric tons of dry brown coal were gasified. Carbon gasification rates of up to 82% and methane contents in the dry raw gas (free of N2) of up to 48 vol.% were obtained. A detailed evaluation of the test results confirmed the results obtained previously. For the completion of the tests - primarily long-term tests - the operation of the semi-technical pilot plant for hydrogasification of coal is to be continued up to September 1982. (orig.) [de

  11. Commissioned operation report for fiscal 1992 on commissioning of surveying high-level development and effective utilization of natural gas, and development of coal hydrogasification technology; 1992 nendo tennen gas kodo kaihatsu yuko riyo chosa tou itaku gyomu hokokusho. Sekitan suiten gaska gijutsu kaihatsu chosa

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1993-03-01

    With the objective of establishing a practically usable process to manufacture substitute natural gas, discussions were held on the related technical, economic, and developmental problems. This paper summarizes the achievements in fiscal 1992. With respect to the possibility of using coal SNG as a power generation fuel, a power generation system combining coal SNG, pipeline transportation and natural gas was recognized as significant for technological development because of its capability of raising the power plant utilization rate and its potential economic superiority. In the study on enhancement of the liquid yield, the performance of the ARCH-2 reactor was examined by simulation using a mathematical model, and it was found that the benzene yield could be raised to 15% in terms of carbon conversion. As the targets for the hydrogasification process to be developed by Japan, based on the study results of the current fiscal year, three modes were indicated, consisting of SNG yield maximization, gas cooling efficiency maximization, and BTX yield maximization, and it was proposed that a process having flexibility in product yield be developed. (NEDO)

  12. Gasification of coal using nuclear process heat. Chapter D

    International Nuclear Information System (INIS)

    Schilling, H.-D.; Bonn, B.; Krauss, U.

    1979-01-01

    In the light of the high price of coal and the enormous advances made recently in nuclear engineering, the possibility of using heat from high-temperature nuclear reactors for gasification processes was discussed as early as the 1960s. The advantages of this technology are summarized. A joint programme of development work is described, in which the Nuclear Research Centre at Juelich is aiming to develop a high-temperature reactor which will supply process heat at as high a temperature as possible, while other organizations are working on the hydrogasification of lignites and hard coals, and steam gasification. Experiments are at present being carried out on a semi-technical scale, and no operational data for large-scale plants are available as yet. (author)

  13. High temperature reactor and application to nuclear process heat

    Energy Technology Data Exchange (ETDEWEB)

    Schulten, R.; Kugeler, K. [Kernforschungsanlage Juelich G.m.b.H. (Germany, F.R.)]

    1976-01-01

    The principle of high temperature nuclear process heat is explained, and the main applications (hydrogasification of coal, the nuclear chemical heat pipe, direct reduction of iron ore, coal gasification by steam, and water splitting) are described in more detail. The motivation for the introduction of nuclear process heat to the market, and questions of cost, raw material resources and environmental aspects, are discussed next. The new technological questions concerning the nuclear reactor and the status of its development are described; in particular, information about the fuel elements, the hot gas ducts and contamination is given, and some design considerations are added. Furthermore, the status of development of helium-heated steam reformers, the main results of the work to date and further activities in this field are explained.

  14. Report for fiscal 1997 by gasification technology subcommittee, Coal Gasification Committee; 1997 nendo sekitan gas ka iinkai gas ka gijutsu bukai hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-03-01

    The 1st meeting of the subcommittee was held on August 1, 1997, and the 2nd meeting on February 24, 1998. Research plans for developing coal hydrogasification technology were reported and the achievements were deliberated. The Coal Gasification Committee met in a plenary session on March 10, 1998, where reports were delivered and deliberations were held on the progress of coal gasification technology development. Reported in relation to studies using an experimental coal hydrogasification system were findings obtained with a small test unit, development of an injector, a hot model test, a cold model test, development of a cooled char extraction technology, development of a concentrated coal transportation technology, etc. Reported in relation to support studies were the basic study of the coal hydrogasification reaction, the structure of and materials for a hydrogasification furnace, etc. Reports were also delivered on the survey and research on the social acceptability of coal hydrogasification technology development and on the study of coal gasification for fuel cells. (NEDO)

  15. Hydrogen production from coal using a nuclear heat source

    International Nuclear Information System (INIS)

    Quade, R.N.

    1977-01-01

    A strong candidate for hydrogen production in the intermediate time frame of 1990 to 1995 is a coal-based process using a high-temperature gas-cooled reactor (HTGR) as a heat source. Expected process efficiencies in the range of 60 to 70% are considerably higher than those of all other hydrogen production processes except steam reforming of natural gas - a feedstock which may not be available in large quantities in this time frame. The process involves the preparation of a coal liquid, hydrogasification of that liquid, and steam reforming of the resulting gaseous or light liquid product. Bench-scale experimental work on the hydrogasification of coal liquids is being carried out. A study showing process efficiency and cost of hydrogen vs. nuclear reactor core outlet temperature has been completed and shows diminishing returns at process temperatures above about 1500°F. (author)

  16. Integrated Energy System with Beneficial Carbon Dioxide (CO2) Use - Final Scientific/Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Xiaolei; Rink, Nancy T

    2011-04-29

    This report presents an integrated energy system that combines the production of substitute natural gas through coal hydrogasification with an algae process for beneficial carbon dioxide (CO2) use and biofuel production (funded under Department of Energy (DOE) contract DE-FE0001099). The project planned to develop, test, operate and evaluate a 2 ton-per-day coal hydrogasification plant and 25-acre algae farm at the Arizona Public Service (APS) 1000 Megawatt (MW) Cholla coal-fired power plant in Joseph City, Arizona. Conceptual design of the integrated system was undertaken with APS partners Air Liquide (AL) and Parsons. The process engineering was separated into five major areas: flue gas preparation and CO2 delivery, algae farming, water management, hydrogasification, and biofuel production. The process flow diagrams, energy and material balances, and preliminary major equipment needs for each major area were prepared to reflect integrated process considerations and site infrastructure design basis. The total project also included research and development on a bench-scale hydrogasifier, one-dimensional (1-D) kinetic-model simulation, extensive algae stressing, oil extraction, lipid analysis and a half-acre algae farm demonstration at APS's Redhawk testing facility. During the project, a two-acre algae testing facility with a half-acre algae cultivation area was built at the APS Redhawk 1000 MW natural gas combined cycle power plant located 55 miles west of Phoenix. The test site integrated flue gas delivery, CO2 capture and distribution, algae cultivation, algae nursery, algae harvesting, dewatering and onsite storage as well as water treatment. The site environmental, engineering, and biological parameters for the cultivators were monitored remotely. Direct biodiesel production from biomass through an acid-catalyzed transesterification reaction and a supercritical methanol transesterification reaction were evaluated. The highest oil-to-biodiesel conversion of 79

  17. Achievement report for fiscal 1999 (edition B) on auxiliary New Sunshine Project. Development of coal hydrogasification technology (Research by using experimental device); 1999 nendo New Sunshine keikaku hojo jigyo seika hokokusho (B ban). Sekitan suiso tenka gaska gijutsu kaihatsu - Jikken sochi ni yoru kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    With the objective of putting the coal hydrogasification technology (the ARCH process) to practical use, developmental research has been performed on important elementary technologies using different experimental devices. This paper summarizes the achievements in fiscal 1999. In the research using a small testing apparatus, Taiheiyo coal was used for demonstration operation of the replacement natural gas maximized case, the heat efficiency maximized case, and the BTX maximized case. The results for all three cases were nearly as anticipated by simulation, and the replacement natural gas maximized case achieved the targeted whole-coal conversion rate of 60% or more. However, the BTX maximized case gave a value lower than the targeted BTX yield of 12%. In the developmental research on the injector, the injector fabricated for the hot model test was given another combustion test, in which a focal temperature of 1,200°C or higher was obtained. The hot model test verified the non-agglomerating performance of coal using focal temperature, coal cross-sectional area load, coal type, and injector as parameters. It was verified that the Taiheiyo and Shin Mu coals do not agglomerate excessively. (NEDO)

  18. The hydrogasification of lignite and sub-bituminous coals

    Science.gov (United States)

    Bhatt, B.; Fallon, P. T.; Steinberg, M.

    1981-02-01

    A North Dakota lignite and a New Mexico sub-bituminous coal have been hydrogenated at up to 900°C and 2500 psi hydrogen pressure. Yields of gaseous hydrocarbons and aromatic liquids have been studied as a function of temperature, pressure, residence time, feed rate and H2/coal ratio. Coal feed rates in excess of 10 lb/hr have been achieved in the 1 in. I.D. × 8 ft reactor, and methane concentrations as high as 55% have been observed. A four-step reaction model was developed for the production and decomposition of the hydrocarbon products. A single objective function, formulated from the weighted errors for the four dependent process variables (CH4, C2H6, BTX, and oil yields), was minimized using a program containing three independent iterative techniques. The results of the nonlinear regression analysis for lignite show that a chemical reaction model first-order in carbon conversion satisfactorily describes the dilute-phase hydrogenation. The activation energy for initial product formation was estimated to be 42,700 cal/gmol, and the power on hydrogen partial pressure was found to be +0.14. The overall correlation coefficient was 0.83. The mechanism, rate expressions, and design curves developed can be used for scale-up and reactor design.
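
    The regression results quoted above imply a rate expression that can be written out explicitly. Below is a minimal sketch, assuming the common form dX/dt = k0·exp(-E/RT)·P_H2^n·(1-X) with the reported E and n; the pre-exponential factor K0 is a hypothetical placeholder, since the abstract does not report it.

    ```python
    # Minimal sketch of the first-order hydrogasification rate form implied by
    # the record: dX/dt = k0 * exp(-E/RT) * P_H2**n * (1 - X), with
    # E = 42,700 cal/gmol and n = +0.14 as reported. K0 is a hypothetical
    # placeholder; the abstract does not give the pre-exponential factor.
    import numpy as np

    R = 1.987          # gas constant, cal/(gmol*K)
    E = 42_700.0       # activation energy, cal/gmol (from the record)
    N_H2 = 0.14        # power on hydrogen partial pressure (from the record)
    K0 = 1.0e3         # hypothetical pre-exponential factor

    def rate(T_K: float, p_h2_atm: float, conversion: float) -> float:
        """First-order rate of carbon conversion, dX/dt."""
        return K0 * np.exp(-E / (R * T_K)) * p_h2_atm**N_H2 * (1.0 - conversion)

    # Example: relative rates at 800 C vs 900 C under ~170 atm (2500 psi) H2
    r1 = rate(1073.15, 170.0, 0.2)
    r2 = rate(1173.15, 170.0, 0.2)
    print(f"rate ratio 900C/800C: {r2 / r1:.2f}")
    ```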

  19. Report for fiscal 1998 by gasification technology subcommittee, Coal Gasification Committee; 1998 nendo sekitan gas ka iinkai gas ka gijutsu bukai hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    The gasification technology subcommittee met on August 4 and November 17, 1998, and on March 10, 1999. Reported for deliberation were the research plan for coal hydrogasification technology development, its progress, and its achievements. The fuel cell-oriented coal gasification subcommittee met on July 23, 1998, and February 26, 1999, when studies were reported for deliberation concerning the development of a coal gasification technology for fuel cells, research plans, and research achievements. Reported in relation to studies using experimental units were findings acquired using a small test unit, development of an injector, tests using hot and cold models, development of a cooled char flow extraction technology, development of a highly concentrated powder transportation technology, and conceptual designs of next-generation facilities. A report was also delivered on survey and research on the community acceptance of coal hydrogasification technology development. Furthermore, a plan for reinforcing the system for evaluating the development of coal hydrogasification technologies was brought under deliberation. (NEDO)

  20. Performance simulations for Co-gasification of coal and methane

    Energy Technology Data Exchange (ETDEWEB)

    Niksa, Stephen [Niksa Energy Associates LLC, Belmont, CA (United States); Lim, J.P.; Del Rio Diaz Jara, D.; Eckstrom, D.; Steele, D.; Malhotra, R.; Wilson, R.B. [SRI International, Menlo Park, CA (United States). Chemistry and Chemical Engineering Dept.

    2013-07-01

    In the process under development, coal suspended in mixtures of CH4, H2, and steam is rapidly heated to temperatures above 1,400°C under 5-7 MPa for at least 1 s. The coal first decomposes into volatiles and char while CH4 is converted into CO/H2 mixtures. Then the char is converted into CO/H2 mixtures via steam gasification on longer time scales, and into CH4 via hydrogasification. Throughout all stages, homogeneous chemistry reforms all intermediate fuel components into the syngas feedstock for methanol synthesis. Fully validated reaction mechanisms for each chemical process were used to quantitatively interpret a co-gasification test series in SRI's lab-scale gasification facility. Homogeneous reforming chemistry generates equilibrium gas compositions at 1,500°C in the available transit time of 1.4 s, but not at any of the lower temperatures. Methane conversion in the gas phase increases for progressively hotter temperatures, in accord with the data. But the strong predicted dependence on steam concentration was not evident in the measured CH4 conversions, even when steam concentration was the subject test variable. Char hydrogasification adds CH4 to the product gas stream, but this process probably converts no more than 15-20% of the char in the lab-scale tests, and the bulk of the char is converted by steam gasification. The correlation coefficient between predicted and measured char conversions exceeded 0.8 and the std. dev. was 3.4%, which is comparable to the measurement uncertainties. The evaluation of the predicted CH4 conversions gave a std. dev. greater than 20%. Simulations of commercial conditions with realistic suspension loadings and no diluents in the feed gave slightly lower conversions of both CH4 and coal, because hydrogasification accounts for more of the char conversion, and occurs at rates slower than for steam gasification.
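
    Where the reforming chemistry reaches equilibrium, as reported at 1,500°C, the equilibrium syngas composition can be estimated with a standard chemical-equilibrium package. A minimal sketch using Cantera with the GRI-Mech 3.0 mechanism; the feed ratios are illustrative assumptions, not the SRI test matrix.

    ```python
    # Hedged sketch: equilibrium composition of a CH4/H2/steam mixture at
    # co-gasification conditions (1,500 C, 5 MPa). The feed mole ratios are
    # illustrative assumptions, not the SRI test conditions.
    import cantera as ct

    gas = ct.Solution("gri30.yaml")            # GRI-Mech 3.0: C/H/O chemistry
    gas.TPX = 1773.15, 5.0e6, "CH4:1.0, H2O:2.0, H2:1.0"
    gas.equilibrate("TP")                      # minimize Gibbs energy at fixed T, P

    for species in ("CO", "H2", "CH4", "H2O", "CO2"):
        print(f"{species:>4s}: {gas[species].X[0]:.3f}")
    ```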

  1. Reports on 1977 result of Sunshine Project. Research for detailed design of coal gasification plant (studies on operating conditions 'pressurized hydro-fluidized gasification method', dissolution of research equipment); 1977 nendo sekitan gas ka plant no shosai sekkei no tame no shiken kenkyu seika hokokusho. Unten joken no kenkyu 'kaatsu suiten ryudo gas ka hoshiki' kenkyu kaitai

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1978-03-31

    The pressurized hydro-fluidized gasification method is one in which coal is hydrogasified under a pressure of 30 kg/cm² to produce a methane-rich, high-calorie gas. This method is a combination of three independent processes, namely, (1) thermal cracking of coal, (2) hydrogasification of the thermally decomposed product, and (3) water gasification for a self-contained hydrogen supply, and is a continuous gasification method in which all processes are operated in a fluidized bed. In fiscal 1976, an operation technique was found in which conditions suitable for each reaction purpose were given to separate thermal cracking and hydrogasification furnaces while continuous gasification was still maintained without adverse interaction between the two furnaces. This year, the continuous gasification test will be continued, determining suitable operating conditions for the purpose of increasing methane concentration and improving gasification efficiency, and concurrently identifying device-related problems that impair continuous operation. With the aim of obtaining a self-contained hydrogen system, water gasification tests will be conducted, with the optimum conditions determined for the water gasification. An operation test will be carried out for an internal heat type control box, so that its functions under pressurization, durability and operation criteria can be determined. (NEDO)

  2. Achievement report for fiscal 1981 on research under Sunshine Program. Basic research on high-calorie gas production technology; 1981 nendo sunshine keikaku kenkyu seika hokokusho. Kokarori gas seizo gijutsu no kiso kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1982-03-01

    Tests and research are conducted to acquire basic data on coal gasification. In the basic research for the development of a high-calorie gas production process, a small moving-bed gasification furnace is improved, and a high-calorie gas with a heating value of approximately 5,100 kcal/m³ is stably produced at a rate of approximately 20 m³ per day. The carbon conversion rate turns out to be approximately 37% to the gas, approximately 53% to the residual char, and approximately 10% to a mixture of tar, naphthalene, benzene, phenol, etc. Hydrogen is produced from the residual char when a small amount of coal is added, and the first experiment for the development of a moving-bed hydrogasification process is deemed to have been successfully completed. In the study of the mechanism of the hydrogasification reaction, the result of a preliminary experiment in a fixed-bed pressurized gasification furnace is compared with the data from a continuous gasification experiment, and the relationship between the coal feed rate and the reaction rate is determined. Also conducted are a basic study of problems relating to operation, basic research on the reaction mechanism (carbonization by rapid heating), and estimation of the equilibrium composition of gas generated by coal gasification. (NEDO)

  3. DOE Coal Gasification Multi-Test Facility: fossil fuel processing technical/professional services

    Energy Technology Data Exchange (ETDEWEB)

    Hefferan, J.K.; Lee, G.Y.; Boesch, L.P.; James, R.B.; Rode, R.R.; Walters, A.B.

    1979-07-13

    A conceptual design, including process descriptions, heat and material balances, process flow diagrams, utility requirements, schedule, capital and operating cost estimate, and alternative design considerations, is presented for the DOE Coal Gasification Multi-Test Facility (GMTF). The GMTF, an engineering scale facility, is to provide a complete plant into which different types of gasifiers and conversion/synthesis equipment can be readily integrated for testing in an operational environment at relatively low cost. The design allows for operation of several gasifiers simultaneously at a total coal throughput of 2500 tons/day; individual gasifiers operate at up to 1200 tons/day and 600 psig using air or oxygen. Ten different test gasifiers can be in place at the facility, but only three can be operated at one time. The GMTF can produce a spectrum of saleable products, including low Btu, synthesis and pipeline gases, hydrogen (for fuel cells or hydrogasification), methanol, gasoline, diesel and fuel oils, organic chemicals, and electrical power (potentially). In 1979 dollars, the base facility requires a $288 million capital investment for common-use units, $193 million for four gasification units and four synthesis units, and $305 million for six years of operation. Critical reviews of detailed vendor designs are appended for a methanol synthesis unit, three entrained flow gasifiers, a fluidized bed gasifier, and a hydrogasifier/slag-bath gasifier.

  4. Badge Office Process Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Haurykiewicz, John Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Dinehart, Timothy Grant [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Parker, Robert Young [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-12

    The purpose of this process analysis was to analyze the Badge Offices’ current processes from a systems perspective and consider ways of pursuing objectives set forth by SEC-PS, namely increased customer flow (throughput) and reduced customer wait times. Information for the analysis was gathered for the project primarily through Badge Office Subject Matter Experts (SMEs), and in-person observation of prevailing processes. Using the information gathered, a process simulation model was constructed to represent current operations and allow assessment of potential process changes relative to factors mentioned previously. The overall purpose of the analysis was to provide SEC-PS management with information and recommendations to serve as a basis for additional focused study and areas for potential process improvements in the future.
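
    The abstract does not detail the simulation model; as a hedged illustration of the kind of steady-state calculation such a throughput and wait-time study might start from, here is an M/M/c (Erlang C) estimate with invented parameters.

    ```python
    # Hypothetical illustration of a badge-office wait-time estimate using a
    # steady-state M/M/c queue (Erlang C). The report's actual simulation model
    # is not described in the abstract; all parameters below are invented.
    from math import factorial

    def erlang_c(c: int, lam: float, mu: float) -> float:
        """Probability an arriving customer must wait (Erlang C formula)."""
        a = lam / mu                         # offered load in Erlangs
        rho = a / c                          # server utilization (must be < 1)
        top = a**c / (factorial(c) * (1 - rho))
        bottom = sum(a**k / factorial(k) for k in range(c)) + top
        return top / bottom

    lam, mu, c = 20.0, 12.0, 2               # arrivals/hr, service rate/hr, clerks
    pw = erlang_c(c, lam, mu)
    wq = pw / (c * mu - lam)                 # mean wait in queue, hours
    print(f"P(wait) = {pw:.2f}, mean wait = {60 * wq:.1f} min")
    ```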

  5. Hynol: An economic process for methanol production from biomass and natural gas with reduced CO2 emission

    Science.gov (United States)

    Steinberg, M.; Dong, Yuanji

    1993-10-01

    The Hynol process is proposed to meet the demand for an economical process for methanol production with reduced CO2 emission. This new process consists of three reaction steps: (1) hydrogasification of biomass, (2) steam reforming of the produced gas with additional natural gas feedstock, and (3) methanol synthesis from the hydrogen and carbon monoxide produced during the previous two steps. The H2-rich gas remaining after methanol synthesis is recycled to gasify the biomass in an energy-neutral reactor, so there is no need for an expensive oxygen plant as required by commercial steam gasifiers. Recycling gas allows the methanol synthesis reactor to operate at a lower pressure than conventional designs while the plant still maintains a high methanol yield. Energy recovery designed into the process minimizes heat loss and increases the process thermal efficiency. If the Hynol methanol is used as an alternative and more efficient automotive fuel, an overall 41% reduction in CO2 emission can be achieved compared to the use of conventional gasoline fuel. A preliminary economic estimate shows that the total capital investment for a Hynol plant is 40% lower than that for a conventional biomass gasification plant. The methanol production cost is $0.43/gal for a 1085 million gal/yr Hynol plant, which is competitive with current U.S. methanol and equivalent gasoline prices. Process flowsheet and simulation data using biomass and natural gas as cofeedstocks are presented. The Hynol process can convert any condensed carbonaceous material, especially municipal solid waste (MSW), to methanol.
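
    As a rough illustration of the three reaction steps named in the abstract, the idealized textbook stoichiometry can be written down and net-balanced; the equations below are generic forms, not the actual Hynol flowsheet reactions.

    ```python
    # Idealized (textbook) stoichiometry for the three Hynol steps; these are
    # illustrative forms, not the actual Hynol flowsheet reactions.
    #   (1) hydrogasification:   C   + 2 H2  -> CH4
    #   (2) steam reforming:     CH4 +  H2O  -> CO + 3 H2
    #   (3) methanol synthesis:  CO  + 2 H2  -> CH3OH
    # Summing (1)-(3): C + H2O + H2 -> CH3OH, i.e. the recycled H2-rich gas
    # closes the loop and biomass carbon leaves as methanol rather than CO2.
    from collections import Counter

    steps = [
        ({"C": 1, "H2": 2}, {"CH4": 1}),
        ({"CH4": 1, "H2O": 1}, {"CO": 1, "H2": 3}),
        ({"CO": 1, "H2": 2}, {"CH3OH": 1}),
    ]

    net = Counter()
    for reactants, products in steps:
        for s, n in reactants.items():
            net[s] -= n
        for s, n in products.items():
            net[s] += n

    print({s: n for s, n in net.items() if n})
    # -> {'C': -1, 'H2': -1, 'H2O': -1, 'CH3OH': 1}
    ```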

  6. Production of synthesis gas and methane via coal gasification utilizing nuclear heat

    International Nuclear Information System (INIS)

    van Heek, K.H.; Juentgen, H.

    1982-01-01

    The steam gasification of coal requires a large amount of energy for the endothermic gasification, as well as for production and heating of the steam and for electricity generation. In hydrogasification processes, heat is required primarily for the production of hydrogen and for preheating the reactants. Current developments in nuclear energy enable a gas-cooled high-temperature nuclear reactor (HTR) to be the energy source, the heat produced being withdrawn from the system by means of a helium loop. There is a prospect of converting coal, in optimal yield, into a commercial gas by employing the process heat from a gas-cooled HTR. The advantages of this process are: (1) conservation of coal reserves via more efficient gas production; (2) because of this coal conservation, lower emissions, especially of CO2, but also of dust, SO2, NOx, and other harmful substances; (3) process engineering advantages, such as omission of an oxygen plant and reduction in the number of gas scrubbers; (4) lower gas manufacturing costs compared to conventional processes. The main problems involved in using nuclear energy for the industrial gasification of coal are: (1) development of HTRs with helium outlet temperatures of at least 950°C; (2) heat transfer from the core of the reactor to the gas generator, methane reforming oven, or heater for the hydrogenation gas; (3) development of a suitable allothermal gas generator for the steam gasification; and (4) development of a helium-heated methane reforming oven and adaptation of the hydrogasification process for operation in combination with the reactor. In summary, processes for gasifying coal that employ heat from an HTR have good economic and technical prospects of being realized in the future. However, time will be required for research and development before industrial application can take place. 23 figures, 4 tables. (DP)

  7. Process energy analysis

    International Nuclear Information System (INIS)

    Kaiser, V.

    1993-01-01

    In Chapter 2 process energy cost analysis for chemical processing is treated in a general way, independent of the specific form of energy and power production. Especially, energy data collection and data treatment, energy accounting (metering, balance setting), specific energy input, and utility energy costs and prices are discussed. (R.P.) 14 refs., 4 figs., 16 tabs

  8. The use of wire mesh reactors to characterise solid fuels and provide improved understanding of larger scale thermochemical processes

    Energy Technology Data Exchange (ETDEWEB)

    Lu Gao; Long Wu; Nigel Paterson; Denis Dugwell; Rafael Kandiyoti [Imperial College London, London (United Kingdom). Department of Chemical Engineering

    2008-07-01

    Most reaction products from the pyrolysis and the early stages of gasification of solid fuels are chemically reactive. Secondary reactions between primary products and with heated fuel particles tend to affect the final product distributions. The extents and pathways of these secondary reactions are determined mostly by the heating rate and the size and shape of the reaction zone and of the sample itself. The wire-mesh reactor (WMR) configuration discussed in this paper allows products to be separated from reactants and enables the rapid quenching of products, allowing suppression of secondary reactions. This paper presents an overview of the development of wire-mesh reactors, describing several diverse applications. The first of these involves an analysis of the behaviour of injectant coal particles in blast furnace tuyeres and raceways. The data offer explanations that help in understanding why, at high coal injection rates, problems can be encountered in the operation of blast furnaces. Another project focused on determining the extents of pyrolysis and gasification reactivities of a suite of Chinese coals under intense reaction conditions. The results showed variations in coal reactivities that were related to the C content. In another project demonstrating the versatility of the WMR configuration, the high-pressure version of the reactor is being used for developing the Zero Emission Coal Alliance (ZECA) concept. The work aims to examine and explain the chemical and transport mechanisms underlying the pyrolysis, hydropyrolysis and hydrogasification stages of the process. The results obtained to date have shown the effects of the operating conditions on the extent of hydropyrolysis/gasification of a bituminous coal and two lignites. The lignites were more reactive than the coal, and the data suggest that high levels of conversion will be achievable under the anticipated ZECA process conditions. 29 refs., 15 figs., 7 tabs.

  9. Nuclear process heat at high temperature: Application, realization and development programme

    International Nuclear Information System (INIS)

    Sammeck, K.H.; Fischer, R.

    1976-01-01

    Studies in the Federal Republic of Germany (FRG), the USA and the United Kingdom have shown that high-temperature helium energy from an HTR can advantageously be utilized for coal gasification and other fossil fuel conversion processes, and that a substantial demand for substitute natural gas (SNG) can be expected in the future. These results are based on plant design studies, economic assessments and basic development efforts in the field of coal gasification with nuclear heat, which in the FRG were carried out by Arbeitsgemeinschaft Nukleare Prozesswaerme (ANP)-members, HRB and KFA Juelich. Nuclear process plants are based on different gasification processes, resulting in different concepts of the nuclear heat system. In the case of hydro-gasification it is expected that steam reformers, arranged within the primary circuit of the reactor, will be heated directly by the primary helium. In the case of steam gasification, the high-temperature energy must be transferred to the gasification process via an intermediate circuit which is coupled to a gasifier outside the containment. In both cases the design of the nuclear reactor resembles an HTR for electricity generation. The main objectives of the development of nuclear process heat are to increase the helium outlet temperature of the reactor up to 950°C, to develop metallic alloys for high-temperature components such as heat exchangers, to design and construct a hot-gas duct, a steam reformer and a helium-helium heat exchanger and to develop the gasification processes. The nuclear safety regulations and the interface problems between the reactor, the process plant and the electricity generating plant have to be considered thoroughly. The Arbeitsgemeinschaft Nukleare Prozesswaerme and HRB started a development programme, in close collaboration with KFA Juelich, which will lead to the construction of a prototype plant for coal gasification with nuclear heat within 5 to 5 1/2 years. A survey of the main objectives

  10. Control system for technological processes in tritium processing plants with process analysis

    International Nuclear Information System (INIS)

    Retevoi, Carmen Maria; Stefan, Iuliana; Balteanu, Ovidiu; Stefan, Liviu; Bucur, Ciprian

    2005-01-01

    Integration of a large variety of installations and equipment into a unitary system for controlling the technological process in tritium processing nuclear facilities is a rather complex task, particularly when experimental or new technologies are developed. Ensuring a high degree of versatility allowing easy modifications of configurations and process parameters is a major requirement imposed on experimental installations. The large amount of data which must be processed, stored and easily accessed for subsequent analyses imposes the development of a large information network based on a highly integrated system containing the acquisition, control and technological process analysis data as well as a database system. On such a basis, integrated computation and control systems able to conduct the technological process can be developed, as well as protection systems for cases of failure or breakdown. The integrated system meets the control and security requirements, including emergency cases, of the technological processes specific to industries that process radioactive or toxic substances with severe consequences in case of technological failure, as in a tritium processing nuclear plant. In order to lower the risk of technological failure of these processes, an integrated software, database and process analysis system is developed which, based on an algorithm identifying the parameters important for the protection and security systems, displays the process evolution trend. The system was checked on an existing plant that includes a tritium removal unit, ultimately to be used in a nuclear power plant, by simulating failure events as well as the process. The system will also include a complete database monitoring all the parameters, and process analysis software for the main modules of the tritium processing plant, namely isotope separation, catalytic purification and cryogenic distillation.

  11. Formal analysis of design process dynamics

    NARCIS (Netherlands)

    Bosse, T.; Jonker, C.M.; Treur, J.

    2010-01-01

    This paper presents a formal analysis of design process dynamics. Such a formal analysis is a prerequisite to come to a formal theory of design and for the development of automated support for the dynamics of design processes. The analysis was geared toward the identification of dynamic design

  13. Hydrogen production from coal using a nuclear heat source

    Science.gov (United States)

    Quade, R. N.

    1976-01-01

    A strong candidate for hydrogen production in the intermediate time frame of 1985 to 1995 is a coal-based process using a high-temperature gas-cooled reactor (HTGR) as a heat source. Expected process efficiencies in the range of 60 to 70% are considerably higher than those of all other hydrogen production processes except steam reforming of natural gas. The process involves the preparation of a coal liquid, hydrogasification of that liquid, and steam reforming of the resulting gaseous or light liquid product. A study showing process efficiency and cost of hydrogen vs nuclear reactor core outlet temperature has been completed, and shows diminishing returns at process temperatures above about 1,500°F. A possible scenario combining the relatively abundant and low-cost Western coal deposits with the Gulf Coast hydrogen users is presented which provides high-energy-density transportation utilizing coal liquids and uranium.

  14. Trends in business process analysis: from verification to process mining

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Cardoso, J.; Cordeiro, J.; Filipe, J.

    2007-01-01

    Business process analysis ranges from model verification at design-time to the monitoring of processes at runtime. Much progress has been achieved in process verification. Today we are able to verify the entire reference model of SAP without any problems. Moreover, more and more processes leave

  15. Gaussian process regression analysis for functional data

    CERN Document Server

    Shi, Jian Qing

    2011-01-01

    Gaussian Process Regression Analysis for Functional Data presents nonparametric statistical methods for functional regression analysis, specifically the methods based on a Gaussian process prior in a functional space. The authors focus on problems involving functional response variables and mixed covariates of functional and scalar variables. Covering the basics of Gaussian process regression, the first several chapters discuss functional data analysis, theoretical aspects based on the asymptotic properties of Gaussian process regression models, and new methodological developments for high-dimensional data.
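
    As a concrete illustration of the book's subject (not an example drawn from the book itself), here is a minimal Gaussian process regression fit on synthetic data with an RBF-plus-noise kernel via scikit-learn.

    ```python
    # Minimal illustration of Gaussian process regression, in the spirit of
    # the book's topic; synthetic data, scikit-learn API.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(0)
    X = rng.uniform(0, 10, size=(40, 1))                     # input locations
    y = np.sin(X).ravel() + 0.1 * rng.standard_normal(40)    # noisy response

    kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.01)
    gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

    X_new = np.linspace(0, 10, 5).reshape(-1, 1)
    mean, std = gpr.predict(X_new, return_std=True)          # posterior mean, std
    print(np.round(mean, 2), np.round(std, 2))
    ```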

  16. The investigation for attaining the optimal yield of oil shale by integrating high temperature reactors

    International Nuclear Information System (INIS)

    Bhattacharyya, A.T.

    1984-03-01

    This work presents a system-analytical investigation showing how far a high-temperature reactor can be integrated to achieve the optimal yield of kerogen from oil shale. About one third of the produced components must be burnt in order to supply the required high-temperature process heat. The work of IGT shows that hydrogasification of oil shale not only produces shale oil of higher quality but also achieves a higher extraction quantity. For this reason, a hydro-gasification process has been calculated in this work in which hydrogen is used as the gasification medium and two high-temperature reactors are integrated as the source of high-temperature heat. (orig.)

  17. Process correlation analysis model for process improvement identification.

    Science.gov (United States)

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment identifying strengths and weaknesses, and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the software process improvement effort as the base. CMMI defines a set of process areas involved in software development and what is to be carried out in those process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort required and the lack of expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.
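
    A minimal sketch of the underlying idea, computing rank correlations between assessment ratings of process areas; the area names and ratings below are invented placeholders, and the paper's actual model operates on CMMI goals and practices with empirical improvement data.

    ```python
    # Hedged sketch: identifying correlated process elements from assessment
    # scores, as the paper's model does conceptually. The area names and
    # ratings below are invented for illustration, not CMMI data.
    import pandas as pd

    scores = pd.DataFrame(
        {
            "req_mgmt":      [2, 3, 1, 3, 2, 1],
            "proj_planning": [2, 3, 1, 3, 3, 1],
            "verification":  [1, 3, 2, 2, 3, 1],
        }
    )  # rows = assessed projects, columns = process areas, values = ratings

    corr = scores.corr(method="spearman")        # rank correlation of ratings
    print(corr.round(2))

    # Pairs above a threshold would be flagged for joint improvement planning.
    strong = corr.where(lambda c: (c.abs() > 0.7) & (c < 1.0)).stack()
    print(strong)
    ```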

  18. Moments analysis of concurrent Poisson processes

    International Nuclear Information System (INIS)

    McBeth, G.W.; Cross, P.

    1975-01-01

    A moments analysis of concurrent Poisson processes has been carried out. Equations are given which relate combinations of distribution moments to sums of products involving the number of counts associated with the processes and the mean rate of the processes. Elimination of background is discussed and equations suitable for processing random radiation, parent-daughter pairs in the presence of background, and triple and double correlations in the presence of background are given. The theory of identification of the four principal radioactive series by moments analysis is discussed. (Auth.)
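
    For a single Poisson process, the kinds of moment relations the paper builds on are easy to state and check numerically: the count N over an interval t satisfies E[N] = Var[N] = λt and E[N(N-1)] = (λt)². A quick simulation check follows (illustrative only; the paper treats concurrent processes with background).

    ```python
    # Numerical check of basic Poisson moment relations:
    # E[N] = Var[N] = rate*t, and the second factorial moment
    # E[N(N-1)] = (rate*t)**2.
    import numpy as np

    rng = np.random.default_rng(1)
    rate, t = 4.0, 2.5                      # events per unit time, interval
    counts = rng.poisson(rate * t, size=200_000)

    m = rate * t
    print("mean      :", counts.mean(), "expected:", m)
    print("variance  :", counts.var(), "expected:", m)      # mean = variance
    print("E[N(N-1)] :", (counts * (counts - 1)).mean(), "expected:", m**2)
    ```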

  19. Semantic multimedia analysis and processing

    CERN Document Server

    Spyrou, Evaggelos; Mylonas, Phivos

    2014-01-01

    Broad in scope, Semantic Multimedia Analysis and Processing provides a complete reference of techniques, algorithms, and solutions for the design and the implementation of contemporary multimedia systems. Offering a balanced, global look at the latest advances in semantic indexing, retrieval, analysis, and processing of multimedia, the book features the contributions of renowned researchers from around the world. Its contents are based on four fundamental thematic pillars: 1) information and content retrieval, 2) semantic knowledge exploitation paradigms, 3) multimedia personalization, and 4)

  20. Chemical process hazards analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-02-01

    The Office of Worker Health and Safety (EH-5) under the Assistant Secretary for the Environment, Safety and Health of the US Department of Energy (DOE) has published two handbooks for use by DOE contractors managing facilities and processes covered by the Occupational Safety and Health Administration (OSHA) Rule for Process Safety Management of Highly Hazardous Chemicals (29 CFR 1910.119), herein referred to as the PSM Rule. The PSM Rule contains an integrated set of chemical process safety management elements designed to prevent chemical releases that can lead to catastrophic fires, explosions, or toxic exposures. The purpose of the two handbooks, "Process Safety Management for Highly Hazardous Chemicals" and "Chemical Process Hazards Analysis," is to facilitate implementation of the provisions of the PSM Rule within the DOE. The purpose of this handbook, "Chemical Process Hazards Analysis," is to facilitate, within the DOE, the performance of chemical process hazards analyses (PrHAs) as required under the PSM Rule. It provides basic information for the performance of PrHAs, and should not be considered a complete resource on PrHA methods. Likewise, to determine if a facility is covered by the PSM Rule, the reader should refer to the handbook "Process Safety Management for Highly Hazardous Chemicals" (DOE-HDBK-1101-96). Promulgation of the PSM Rule has heightened the awareness of chemical safety management issues within the DOE. This handbook is intended for use by DOE facilities and processes covered by the PSM Rule to facilitate contractor implementation of the PrHA element of the PSM Rule. However, contractors whose facilities and processes are not covered by the PSM Rule may also use this handbook as a basis for conducting process hazards analyses as part of their good management practices. This handbook explains the minimum requirements for PrHAs outlined in the PSM Rule. Nowhere have requirements been added beyond what is specifically required by the rule.

  1. Two-stage process analysis using the process-based performance measurement framework and business process simulation

    NARCIS (Netherlands)

    Han, K.H.; Kang, J.G.; Song, M.S.

    2009-01-01

    Many enterprises have recently been pursuing process innovation or improvement to attain their performance goals. To align a business process with enterprise performances, this study proposes a two-stage process analysis for process (re)design that combines the process-based performance measurement framework and business process simulation.

  2. High pressure hydropyrolysis of coals by using a continuous free-fall reactor

    Energy Technology Data Exchange (ETDEWEB)

    W.-C. Xu; K. Matsuoka; H. Akiho; M. Kumagai; A. Tomita [Institute of Research and Innovation, Kashiwa (Japan)

    2003-04-01

    Rapid hydropyrolysis of coal was carried out at temperatures ranging from 923 to 1123 K and H2 pressures up to 7 MPa using a continuous free-fall pyrolyzer. The effects of the reaction conditions on product yields were investigated. The carbon mass balance was fairly good. It was revealed that a large amount of methane was produced due to the hydrogenolysis of higher hydrocarbons and the hydrogasification of char. The influence of pyrolysis temperature was significant for both reactions, while H2 pressure mainly affected the latter. A considerable amount of reactive carbon was formed during hydropyrolysis of coal. It was converted to methane at high temperatures and high H2 pressures, while the hydrogasification of reactive carbon took place relatively slowly at low temperatures and low H2 pressures, resulting in a low overall carbon conversion. The coal conversions observed in the present study were much higher than those obtained using reactors where the contact between coal particles and H2 is insufficient. 25 refs., 6 figs., 6 tabs.

  3. Process Integration Analysis of an Industrial Hydrogen Production Process

    OpenAIRE

    Stolten, Detlef; Grube, Thomas; Tock, Laurence; Maréchal, François; Metzger, Christian; Arpentinier, Philippe

    2010-01-01

    The energy efficiency of an industrial hydrogen production process using steam methane reforming (SMR) combined with the water gas shift reaction (WGS) is analyzed using process integration techniques based on heat cascade calculation and pinch analysis with the aim of identifying potential measures to enhance the process performance. The challenge is to satisfy the high temperature heat demand of the SMR reaction by minimizing the consumption of natural gas to feed the combustion and to expl...
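
    The heat cascade calculation underlying the pinch analysis mentioned here can be sketched compactly. Below is a minimal problem-table implementation; the stream data and ΔTmin are invented placeholders, not the SMR/WGS plant figures from the paper.

    ```python
    # Hedged sketch of the heat-cascade (problem-table) step of pinch analysis.
    # Each stream: (supply T, target T, heat capacity flowrate CP in kW/K).
    # Stream data are invented placeholders, not the SMR/WGS plant figures.
    hot_streams = [(400.0, 120.0, 2.0), (340.0, 160.0, 4.0)]
    cold_streams = [(140.0, 380.0, 3.0), (100.0, 300.0, 2.5)]
    DT_MIN = 10.0  # minimum approach temperature, K

    # Shifted temperatures: hot streams down by DT_MIN/2, cold up by DT_MIN/2
    bounds = sorted(
        {t - DT_MIN / 2 for s in hot_streams for t in s[:2]}
        | {t + DT_MIN / 2 for s in cold_streams for t in s[:2]},
        reverse=True,
    )

    def cp_in_interval(streams, shift, hi, lo):
        """Total CP of streams spanning the whole shifted interval (hi, lo)."""
        total = 0.0
        for ts, tt, cp in streams:
            top, bot = max(ts, tt) + shift, min(ts, tt) + shift
            if top >= hi and bot <= lo:
                total += cp
        return total

    # Cascade surplus/deficit down the intervals to find the minimum hot utility.
    cascade, heat = [0.0], 0.0
    for hi, lo in zip(bounds, bounds[1:]):
        dcp = cp_in_interval(hot_streams, -DT_MIN / 2, hi, lo) \
            - cp_in_interval(cold_streams, DT_MIN / 2, hi, lo)
        heat += dcp * (hi - lo)
        cascade.append(heat)

    q_hot_min = -min(cascade)            # minimum hot utility requirement, kW
    print(f"minimum hot utility: {q_hot_min:.1f} kW")
    ```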

  4. Identifying Organizational Inefficiencies with Pictorial Process Analysis (PPA)

    Directory of Open Access Journals (Sweden)

    David John Patrishkoff

    2013-11-01

    Pictorial Process Analysis (PPA) was created by the author in 2004. PPA is a unique methodology which offers ten layers of additional analysis when compared to standard process mapping techniques. The goal of PPA is to identify and eliminate waste, inefficiencies and risk in manufacturing or transactional business processes at five levels in an organization. The highest level assessed is process management, followed by the process work environment, detailed work habits, process performance metrics and general attitudes towards the process. This detailed process assessment and analysis is carried out during process improvement brainstorming efforts and Kaizen events. PPA creates a detailed visual efficiency rating for each step of the process under review. A selection of 54 pictorial Inefficiency Icons (cards) is available to highlight major inefficiencies and risks present in the business process under review. These inefficiency icons were identified during the author's independent research on the topic of why things go wrong in business. This paper will highlight how PPA was developed and show the steps required to conduct Pictorial Process Analysis on a sample manufacturing process. The author has successfully used PPA to dramatically improve business processes in over 55 different industries since 2004.

  5. Materials, process, product analysis of coal process technology. Phase I final report

    Energy Technology Data Exchange (ETDEWEB)

    Saxton, J. C.; Roig, R. W.; Loridan, A.; Leggett, N. E.; Capell, R. G.; Humpstone, C. C.; Mudry, R. N.; Ayres, E.

    1976-02-01

    The purpose of materials-process-product analysis is a systematic evaluation of alternative manufacturing processes--in this case processes for converting coal into energy and material products that can supplement or replace petroleum-based products. The methodological steps in the analysis include: Definition of functional operations that enter into coal conversion processes, and modeling of alternative, competing methods to accomplish these functions; compilation of all feasible conversion processes that can be assembled from combinations of competing methods for the functional operations; systematic, iterative evaluation of all feasible conversion processes under a variety of economic situations, environmental constraints, and projected technological advances; and aggregative assessments (economic and environmental) of various industrial development scenarios. An integral part of the present project is additional development of the existing computer model to include: A data base for coal-related materials and coal conversion processes; and an algorithmic structure that facilitates the iterative, systematic evaluations in response to exogenously specified variables, such as tax policy, environmental limitations, and changes in process technology and costs. As an analytical tool, the analysis is intended to satisfy the needs of an analyst working at the process selection level, for example, with respect to the allocation of RD&D funds to competing technologies.

  6. Gas reactor international cooperative program interim report: German Pebble Bed Reactor design and technology review

    International Nuclear Information System (INIS)

    1978-09-01

    This report describes and evaluates several gas-cooled reactor plant concepts under development within the Federal Republic of Germany (FRG). The concepts, based upon the use of a proven Pebble Bed Reactor (PBR) fuel element design, include nuclear heat generation for chemical processes and electrical power generation. Processes under consideration for the nuclear process heat plant (PNP) include hydrogasification of coal, steam gasification of coal, combined process, and long-distance chemical heat transportation. The electric plant emphasized in the report is the steam turbine cycle (HTR-K), although the gas turbine cycle (HHT) is also discussed. The study is a detailed description and evaluation of the nuclear portion of the various plants. The general conclusions are that the PBR technology is sound and that the HTR-K and PNP plant concepts appear to be achievable through appropriate continuing development programs, most of which are either under way or planned

  7. Root cause analysis with enriched process logs

    NARCIS (Netherlands)

    Suriadi, S.; Ouyang, C.; Aalst, van der W.M.P.; Hofstede, ter A.H.M.; La Rosa, M.; Soffer, P.

    2013-01-01

    In the field of process mining, the use of event logs for the purpose of root cause analysis is increasingly studied. In such an analysis, the availability of attributes/features that may explain the root cause of some phenomena is crucial. Currently, the process of obtaining these attributes from

  8. Mesh Processing in Medical Image Analysis

    DEFF Research Database (Denmark)

    The following topics are dealt with: mesh processing; medical image analysis; interactive freeform modeling; statistical shape analysis; clinical CT images; statistical surface recovery; automated segmentation; cerebral aneurysms; and real-time particle-based representation.

  9. Preliminary hazards analysis -- vitrification process

    International Nuclear Information System (INIS)

    Coordes, D.; Ruggieri, M.; Russell, J.; TenBrook, W.; Yimbo, P.

    1994-06-01

    This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during and provides input to project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR then leads to performance of the Final Safety Analysis Report performed during the facility's construction and testing. It should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry, et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment

  11. Risk analysis: opening the process

    International Nuclear Information System (INIS)

    Hubert, Ph.; Mays, C.

    1998-01-01

    This conference on risk analysis took place in Paris, 11-14 October 1999. Over 200 papers were presented in the following seven sessions: perception; environment and health; persuasive risks; objects and products; personal and collective involvement; assessment and valuation; and management. A rational approach to risk analysis has been developed in the last three decades. Techniques for risk assessment have been thoroughly enhanced, risk management approaches have been developed, decision-making processes have been clarified, and the social dimensions of risk perception and management have been investigated. Nevertheless, this construction is being challenged by recent events which reveal how deficits in stakeholder involvement, openness and democratic procedures can undermine risk management actions. Indeed, the global process and most components of risk analysis may be radically called into question. Food safety has lately been a prominent issue, but new debates appear, or old debates are revisited, in the domains of public health, consumer product safety, waste management, environmental risks, nuclear installations, automobile safety and pollution. To meet the growing pressures for efficiency, openness, accountability, and multi-partner communication in risk analysis, institutional changes are underway in many European countries. However, the need for stakeholders to develop better insight into the process may lead to an evolution of all the components of risk analysis, even in its most 'technical' steps. For stakeholders of different professional backgrounds, political projects, and responsibilities, risk identification procedures must be rendered understandable, quantitative risk assessment must be intelligible and accommodated in action proposals, ranging from countermeasures to educational programs to insurance mechanisms. Management formats must be open to local and political input and other types of operational feedback. (authors)

  12. Conducting Qualitative Data Analysis: Qualitative Data Analysis as a Metaphoric Process

    Science.gov (United States)

    Chenail, Ronald J.

    2012-01-01

    In the second of a series of "how-to" essays on conducting qualitative data analysis, Ron Chenail argues the process can best be understood as a metaphoric process. From this orientation he suggests researchers follow Kenneth Burke's notion of metaphor and see qualitative data analysis as the analyst systematically considering the "this-ness" of…

  13. IRB Process Improvements: A Machine Learning Analysis.

    Science.gov (United States)

    Shoenbill, Kimberly; Song, Yiqiang; Cobb, Nichelle L; Drezner, Marc K; Mendonca, Eneida A

    2017-06-01

    Clinical research involving humans is critically important, but it is a lengthy and expensive process. Most studies require institutional review board (IRB) approval. Our objective is to identify predictors of delays or accelerations in the IRB review process and apply this knowledge to inform process change in an effort to improve IRB efficiency, transparency, consistency and communication. We analyzed timelines of protocol submissions to determine protocol or IRB characteristics associated with different processing times. Our evaluation included single variable analysis to identify significant predictors of IRB processing time and machine learning methods to predict processing times through the IRB review system. Based on initial identified predictors, changes to IRB workflow and staffing procedures were instituted and we repeated our analysis. Our analysis identified several predictors of delays in the IRB review process including type of IRB review to be conducted, whether a protocol falls under Veteran's Administration purview and specific staff in charge of a protocol's review. We have identified several predictors of delays in IRB protocol review processing times using statistical and machine learning methods. Application of this knowledge to process improvement efforts in two IRBs has led to increased efficiency in protocol review. The workflow and system enhancements that are being made support our four-part goal of improving IRB efficiency, consistency, transparency, and communication.
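
    A hedged sketch of the kind of model the study describes, predicting review processing time from protocol attributes; the features, coding, and data below are invented, not the study's dataset.

    ```python
    # Hedged sketch: predicting protocol review time from submission
    # attributes. Feature names and data are invented; the paper's actual
    # features and methods are not reproduced here.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(42)
    n = 500
    X = np.column_stack([
        rng.integers(0, 3, n),      # review type: 0=exempt, 1=expedited, 2=full
        rng.integers(0, 2, n),      # VA purview flag
        rng.integers(0, 5, n),      # assigned staff member (coded)
    ])
    days = 10 + 15 * X[:, 0] + 8 * X[:, 1] + rng.normal(0, 5, n)  # synthetic

    model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, days)
    print(dict(zip(["review_type", "va_purview", "staff"],
                   model.feature_importances_.round(2))))
    ```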

  14. CPAS Preflight Drop Test Analysis Process

    Science.gov (United States)

    Englert, Megan E.; Bledsoe, Kristin J.; Romero, Leah M.

    2015-01-01

    Throughout the Capsule Parachute Assembly System (CPAS) drop test program, the CPAS Analysis Team has developed a simulation and analysis process to support drop test planning and execution. This process includes multiple phases focused on developing test simulations and communicating results to all groups involved in the drop test. CPAS Engineering Development Unit (EDU) series drop test planning begins with the development of a basic operational concept for each test. Trajectory simulation tools include the Flight Analysis and Simulation Tool (FAST) for single bodies, and the Automatic Dynamic Analysis of Mechanical Systems (ADAMS) simulation for the mated vehicle. Results are communicated to the team at the Test Configuration Review (TCR) and Test Readiness Review (TRR), as well as at Analysis Integrated Product Team (IPT) meetings in earlier and intermediate phases of the pre-test planning. The ability to plan and communicate efficiently with rapidly changing objectives and tight schedule constraints is a necessity for safe and successful drop tests.

  15. Computerization of the safeguards analysis decision process

    International Nuclear Information System (INIS)

    Ehinger, M.H.

    1990-01-01

    This paper reports that safeguards regulations are evolving to meet new demands for timeliness and sensitivity in detecting the loss or unauthorized use of sensitive nuclear materials. The opportunities to meet new rules, particularly in bulk processing plants, involve developing techniques which use modern, computerized process control and information systems. Using these computerized systems in the safeguards analysis involves all the challenges of the man-machine interface experienced in the typical process control application and adds new dimensions to accuracy requirements, data analysis, and alarm resolution in the regulatory environment

  16. Retro-Techno-Economic Analysis: Using (Bio)Process Systems Engineering Tools to Attain Process Target Values

    DEFF Research Database (Denmark)

    Furlan, Felipe F.; Costa, Caliane B B; Secchi, Argimiro R.

    2016-01-01

    Economic analysis, allied to process systems engineering tools, can provide useful insights about process techno-economic feasibility. More interestingly, rather than being used to evaluate specific process conditions, this techno-economic analysis can be turned upside down to achieve target values.

  17. Fourier analysis and stochastic processes

    CERN Document Server

    Brémaud, Pierre

    2014-01-01

    This work is unique in that it provides a uniform treatment of the Fourier theories of functions (Fourier transforms and series, z-transforms), finite measures (characteristic functions, convergence in distribution), and stochastic processes (including ARMA series and point processes). It emphasises the links between these three themes. The chapter on the Fourier theory of point processes and signals structured by point processes is a novel addition to the literature on Fourier analysis of stochastic processes. It also connects the theory with recent lines of research such as biological spike signals and ultra-wideband communications. Although the treatment is mathematically rigorous, the convivial style makes the book accessible to a large audience. In particular, it will be interesting to anyone working in electrical engineering and communications, biology (point process signals) and econometrics (ARMA models). A careful review of the prerequisites (integration and probability theory in the appendix, Hilbert spaces) is included.

  18. Processability analysis of candidate waste forms

    International Nuclear Information System (INIS)

    Gould, T.H. Jr.; Dunson, J.B. Jr.; Eisenberg, A.M.; Haight, H.G. Jr.; Mello, V.E.; Schuyler, R.L. III.

    1982-01-01

    A quantitative merit evaluation, or processability analysis, was performed to assess the relative difficulty of remote processing of Savannah River Plant high-level wastes for seven alternative waste form candidates. The reference borosilicate glass process was rated as the simplest, followed by FUETAP concrete, glass marbles in a lead matrix, high-silica glass, crystalline ceramics (SYNROC-D and tailored ceramics), and coated ceramic particles. Cost estimates for the borosilicate glass, high-silica glass, and ceramic waste form processing facilities are also reported

  19. Iterated Process Analysis over Lattice-Valued Regular Expressions

    DEFF Research Database (Denmark)

    Midtgaard, Jan; Nielson, Flemming; Nielson, Hanne Riis

    2016-01-01

    We present an iterated approach to statically analyze programs of two processes communicating by message passing. Our analysis operates over a domain of lattice-valued regular expressions, and computes increasingly better approximations of each process's communication behavior. Overall the work extends traditional semantics-based program analysis techniques to automatically reason about message passing in a manner that can simultaneously analyze both values of variables as well as message order, message content, and their interdependencies.

  20. Example process hazard analysis of a Department of Energy water chlorination process

    Energy Technology Data Exchange (ETDEWEB)

    1993-09-01

    On February 24, 1992, the Occupational Safety and Health Administration (OSHA) released a revised version of Section 29 Code of Federal Regulations CFR Part 1910 that added Section 1910.119, entitled "Process Safety Management of Highly Hazardous Chemicals" (the PSM Rule). Because US Department of Energy (DOE) Orders 5480.4 and 5483.1A prescribe OSHA 29 CFR 1910 as a standard in DOE, the PSM Rule is mandatory in the DOE complex. A major element in the PSM Rule is the process hazard analysis (PrHA), which is required for all chemical processes covered by the PSM Rule. The PrHA element of the PSM Rule requires the selection and application of appropriate hazard analysis methods to systematically identify hazards and potential accident scenarios associated with processes involving highly hazardous chemicals (HHCs). The analysis in this report is an example PrHA performed to meet the requirements of the PSM Rule. The PrHA method used in this example is the hazard and operability (HAZOP) study, and the process studied is the new Hanford 300-Area Water Treatment Facility chlorination process, which is currently in the design stage. The HAZOP study was conducted on May 18-21, 1993, by a team from the Westinghouse Hanford Company (WHC), Battelle-Columbus, the DOE, and Pacific Northwest Laboratory (PNL). The chlorination process was chosen as the example process because it is common to many DOE sites, and because quantities of chlorine at those sites generally exceed the OSHA threshold quantities (TQs).

  1. A core ontology for business process analysis

    NARCIS (Netherlands)

    Pedrinaci, C.; Domingue, J.; Alves De Medeiros, A.K.; Bechhofer, S.; Hauswirth, M.; Hoffmann, J.; Koubarakis, M.

    2008-01-01

    Business Process Management (BPM) aims at supporting the whole life-cycle necessary to deploy and maintain business processes in organisations. An important step of the BPM life-cycle is the analysis of the processes deployed in companies. However, the degree of automation currently achieved cannot

  2. Software quality testing process analysis

    OpenAIRE

    Mera Paz, Julián

    2016-01-01

    Introduction: This article is the result of reading, review, analysis of books, magazines and articles well known for their scientific and research quality, which have addressed the software quality testing process. The author, based on his work experience in software development companies, teaching and other areas, has compiled and selected information to argue and substantiate the importance of the software quality testing process. Methodology: the existing literature on the software qualit...

  3. Parallel processing of structural integrity analysis codes

    International Nuclear Information System (INIS)

    Swami Prasad, P.; Dutta, B.K.; Kushwaha, H.S.

    1996-01-01

    Structural integrity analysis forms an important role in assessing and demonstrating the safety of nuclear reactor components. This analysis is performed using analytical tools such as Finite Element Method (FEM) with the help of digital computers. The complexity of the problems involved in nuclear engineering demands high speed computation facilities to obtain solutions in reasonable amount of time. Parallel processing systems such as ANUPAM provide an efficient platform for realising the high speed computation. The development and implementation of software on parallel processing systems is an interesting and challenging task. The data and algorithm structure of the codes plays an important role in exploiting the parallel processing system capabilities. Structural analysis codes based on FEM can be divided into two categories with respect to their implementation on parallel processing systems. The first category codes such as those used for harmonic analysis, mechanistic fuel performance codes need not require the parallelisation of individual modules of the codes. The second category of codes such as conventional FEM codes require parallelisation of individual modules. In this category, parallelisation of equation solution module poses major difficulties. Different solution schemes such as domain decomposition method (DDM), parallel active column solver and substructuring method are currently used on parallel processing systems. Two codes, FAIR and TABS belonging to each of these categories have been implemented on ANUPAM. The implementation details of these codes and the performance of different equation solvers are highlighted. (author). 5 refs., 12 figs., 1 tab

  4. Application of statistical process control and process capability analysis procedures in orbiter processing activities at the Kennedy Space Center

    Science.gov (United States)

    Safford, Robert R.; Jackson, Andrew E.; Swart, William W.; Barth, Timothy S.

    1994-01-01

    Successful ground processing at KSC requires that flight hardware and ground support equipment conform to specifications at tens of thousands of checkpoints. Knowledge of conformance is an essential requirement for launch. That knowledge of conformance at every requisite point does not, however, enable identification of past problems with equipment, or potential problem areas. This paper describes how the introduction of Statistical Process Control and Process Capability Analysis identification procedures into existing shuttle processing procedures can enable identification of potential problem areas and candidates for improvements to increase processing performance measures. Results of a case study describing application of the analysis procedures to Thermal Protection System processing are used to illustrate the benefits of the approaches described in the paper.
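
    As a sketch of the capability indices behind such an analysis (the checkpoint data and specification limits below are hypothetical, not KSC values):

```python
import numpy as np

# Process capability from a sample of a quality characteristic, assuming
# approximate normality and a stable (in-control) process.
def capability(sample, lsl, usl):
    mu, sigma = np.mean(sample), np.std(sample, ddof=1)
    cp = (usl - lsl) / (6 * sigma)                   # potential capability
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)      # accounts for centering
    return cp, cpk

rng = np.random.default_rng(42)
measurements = rng.normal(loc=10.02, scale=0.05, size=200)  # hypothetical checkpoint data
cp, cpk = capability(measurements, lsl=9.85, usl=10.15)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")  # Cp, Cpk >= 1.33 is a common benchmark
```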

  5. Rapid, low-cost, image analysis through video processing

    International Nuclear Information System (INIS)

    Levinson, R.A.; Marrs, R.W.; Grantham, D.G.

    1976-01-01

    Remote Sensing now provides the data necessary to solve many resource problems. However, many of the complex image processing and analysis functions used in analysis of remotely-sensed data are accomplished using sophisticated image analysis equipment. The high cost of this equipment places many of these techniques beyond the means of most users. A new, more economical, video system capable of performing complex image analysis has now been developed. This report describes the functions, components, and operation of that system. Processing capability of the new video image analysis system includes many of the tasks previously accomplished with optical projectors and digital computers. Video capabilities include: color separation, color addition/subtraction, contrast stretch, dark level adjustment, density analysis, edge enhancement, scale matching, image mixing (addition and subtraction), image ratioing, and construction of false-color composite images. Rapid input of non-digital image data, instantaneous processing and display, relatively low initial cost, and low operating cost give the video system a competitive advantage over digital equipment. Complex pre-processing, pattern recognition, and statistical analyses must still be handled through digital computer systems. The video system at the University of Wyoming has undergone extensive testing and comparison to other systems, and has been used successfully in practical applications ranging from analysis of x-rays and thin sections to production of color composite ratios of multispectral imagery. Potential applications are discussed, including uranium exploration, petroleum exploration, tectonic studies, geologic mapping, hydrology, sedimentology and petrography, anthropology, and studies on vegetation and wildlife habitat.

  6. Development of advanced spent fuel management process. System analysis of advanced spent fuel management process

    International Nuclear Information System (INIS)

    Ro, S.G.; Kang, D.S.; Seo, C.S.; Lee, H.H.; Shin, Y.J.; Park, S.W.

    1999-03-01

    The system analysis of an advanced spent fuel management process to establish a non-proliferation model for long-term spent fuel management is performed by comparing several dry processes: a salt transport process, a lithium process, the IFR process developed in America, and the DDP developed in Russia. In our system analysis, the non-proliferation concept is focused on the separation factor between uranium and plutonium and the decontamination factors of the products in each process, and a non-proliferation model for long-term spent fuel management has finally been introduced. (Author). 29 refs., 17 tabs., 12 figs

  7. Summary of process research analysis efforts

    Science.gov (United States)

    Burger, D. R.

    1985-01-01

    A summary of solar-cell process research analysis efforts is presented. Process design and cell design are interactive efforts where technology from integrated circuit processes and other processes is blended. The primary factors that control cell efficiency are: (1) the bulk parameters of the available sheet material, (2) the retention and enhancement of these bulk parameters, and (3) the cell design and the cost to produce versus the finished cell's performance. The process sequences need to be tailored to be compatible with the sheet form, the cell shape form, and the processing equipment. New process options that require further evaluation and utilization are lasers, robotics, thermal pulse techniques, and new materials. There are numerous process control techniques that can be adapted and used to improve product uniformity and reduce costs. Two factors that can lead to longer-life modules are the use of solar cell diffusion barriers and improved encapsulation.

  8. Spectroscopic analysis technique for arc-welding process control

    Science.gov (United States)

    Mirapeix, Jesús; Cobo, Adolfo; Conde, Olga; Quintela, María Ángeles; López-Higuera, José-Miguel

    2005-09-01

    The spectroscopic analysis of the light emitted by thermal plasmas has found many applications, from chemical analysis to the monitoring and control of industrial processes. In particular, it has been demonstrated that the analysis of the thermal plasma generated during arc or laser welding can supply information about the process and, thus, about the quality of the weld. In some critical applications (e.g. the aerospace sector), an early, real-time detection of defects in the weld seam (oxidation, porosity, lack of penetration, ...) is highly desirable as it can reduce expensive non-destructive testing (NDT). Among other techniques, full spectroscopic analysis of the plasma emission is known to offer rich information about the process itself, but it is also very demanding in terms of real-time implementation. In this paper, we propose a technique for the analysis of the plasma emission spectrum that is able to detect, in real time, changes in the process parameters that could lead to the formation of defects in the weld seam. It is based on the estimation of the electronic temperature of the plasma through the analysis of the emission peaks from multiple atomic species. Unlike traditional techniques, which usually involve peak fitting to Voigt functions using the Levenberg-Marquardt recursive method, we employ the LPO (Linear Phase Operator) sub-pixel algorithm to accurately estimate the central wavelength of the peaks (allowing an automatic identification of each atomic species) and cubic-spline interpolation of the noisy data to obtain the intensity and width of the peaks. Experimental tests on TIG welding using fiber-optic capture of light and a low-cost CCD-based spectrometer show that some typical defects can be easily detected and identified with this technique, whose typical processing time for multiple peak analysis is less than 20 ms running on a conventional PC.
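
    For orientation, a minimal two-line (Boltzmann) estimate of the plasma electronic temperature from the intensities of two emission lines of the same species; this is a simplification of the multi-peak procedure in the paper, and the line data below are placeholders rather than real spectroscopic constants.

```python
import numpy as np

K_B = 8.617333e-5  # Boltzmann constant in eV/K

def electron_temperature(i1, i2, line1, line2):
    """Two-line Boltzmann estimate of the plasma electronic temperature.

    Each line is (wavelength_nm, g, A_per_s, E_upper_eV) for an optically
    thin emission line of the same species and ionisation stage.
    """
    lam1, g1, a1, e1 = line1
    lam2, g2, a2, e2 = line2
    # I ~ (g*A/lam) * exp(-E/(kT))  =>  solve the intensity ratio for T
    c = (g1 * a1 * lam2) / (g2 * a2 * lam1)
    return (e2 - e1) / (K_B * np.log((i1 / i2) / c))

# Hypothetical line data (placeholders, not NIST values):
#        (lambda_nm, g, A[1/s], E_upper[eV])
line1 = (700.0, 5, 4.0e7, 13.0)
line2 = (650.0, 3, 2.0e7, 14.5)
t_kelvin = electron_temperature(i1=8.5, i2=1.0, line1=line1, line2=line2)
print(f"T_e ~ {t_kelvin:.0f} K")  # ~1.7e4 K for these placeholder values
```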

  9. AN ENHANCED HAZARD ANALYSIS PROCESS FOR THE HANFORD TANK FARMS

    International Nuclear Information System (INIS)

    SHULTZ MV

    2008-01-01

    CH2M HILL Hanford Group, Inc., has expanded the scope and increased the formality of the process hazards analyses performed on new or modified Tank Farm facilities, designs, and processes. The CH2M HILL process hazard analysis emphasis has been altered to reflect its use as a fundamental part of the engineering and change control process instead of simply being a nuclear safety analysis tool. The scope has been expanded to include identification of accidents/events that impact the environment or require emergency response, in addition to those with significant impact to the facility worker, the offsite receptor, and the 100-meter receptor. Also, there is now an expectation that controls will be identified to address all types of consequences. To ensure that the process has an appropriate level of rigor and formality, a new engineering standard for process hazards analysis was created. This paper discusses the role of process hazards analysis as an information source not only for nuclear safety, but also for worker-safety management programs, emergency management, and environmental programs. This paper also discusses the role of process hazards analysis in the change control process, including identifying when and how it should be applied to changes in design or process.

  10. Self-similar analysis of the spherical implosion process

    International Nuclear Information System (INIS)

    Ishiguro, Yukio; Katsuragi, Satoru.

    1976-07-01

    The implosion processes caused by laser-heating ablation have been studied by self-similarity analysis. Attention is paid to the possible existence of a self-similar solution that reproduces an implosion process of high compression. Details of the self-similar analysis are reproduced, and quantitative conclusions are drawn on gas compression by a single shock. The compression process driven by a sequence of shocks is discussed within the self-similar framework. The gas motion followed by a homogeneous isentropic compression is represented by a self-similar motion. (auth.)

  11. Computer-Aided Sustainable Process Synthesis-Design and Analysis

    DEFF Research Database (Denmark)

    Kumar Tula, Anjan

    Process synthesis involves the investigation of chemical reactions needed to produce the desired product, selection of the separation techniques needed for downstream processing, as well as taking decisions on sequencing the involved separation operations. For an effective, efficient and flexible ... This thesis focuses on the development and application of a computer-aided framework for sustainable synthesis-design and analysis of process flowsheets by generating feasible alternatives covering the entire search space, and includes analysis tools for sustainability, LCA and economics. The synthesis method is based ... A key feature of the process-groups is that the performance of the entire process can be evaluated from the contributions of the individual process-groups towards the selected flowsheet property (for example, energy consumed). The developed flowsheet property models include energy consumption, carbon footprint, product recovery, product ...

  12. Process synthesis, design and analysis using a process-group contribution method

    DEFF Research Database (Denmark)

    Kumar Tula, Anjan; Eden, Mario R.; Gani, Rafiqul

    2015-01-01

    This paper describes the development and application of a process-group contribution method to model, simulate and synthesize chemical processes. Process flowsheets are generated in the same way as atoms or groups of atoms are combined to form molecules in computer aided molecular design (CAMD) techniques. The fundamental pillars of this framework are the definition and use of functional process-groups (building blocks) representing a wide range of process operations, flowsheet connectivity rules to join the process-groups to generate all the feasible flowsheet alternatives, and flowsheet property models like energy consumption, atom efficiency, and environmental impact to evaluate the performance of the generated alternatives. In this way, a list of feasible flowsheets is quickly generated, screened and selected for further analysis. Since the flowsheet is synthesized and the operations ...
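
    A toy illustration of the group-contribution idea described above, with invented process-groups and contribution values (the paper's actual property models are far richer):

```python
# Toy process-group contribution estimate: a flowsheet property (here,
# energy consumption) is approximated as the sum of contributions from the
# process-groups that make up the flowsheet. Group names and contribution
# values are invented for illustration.
ENERGY_CONTRIBUTION_MJ_PER_KG = {   # hypothetical property model
    "reactor": 1.8,
    "distillation": 3.2,
    "flash": 0.6,
    "membrane": 0.9,
}

def flowsheet_energy(groups):
    return sum(ENERGY_CONTRIBUTION_MJ_PER_KG[g] for g in groups)

alternatives = [
    ["reactor", "distillation", "distillation"],
    ["reactor", "flash", "membrane"],
]
# Rank the generated alternatives by the estimated property, cheapest first.
for flowsheet in sorted(alternatives, key=flowsheet_energy):
    print(flowsheet, flowsheet_energy(flowsheet), "MJ/kg")
```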

  13. Systemic analysis of the caulking assembly process

    Directory of Open Access Journals (Sweden)

    Rodean Claudiu

    2017-01-01

    Full Text Available The present paper highlights the importance of the caulking process, which is currently little studied in comparison with its growing use in the automotive industry. Because the caulking operation is used in domains of high importance, such as shock absorbers and brake systems, this paper details the parameters that characterize the process, viewed as input data and output data, and the requirements placed on the final product. The paper presents the measurement methods currently used to analyze the performance of the caulking assembly. These parameters lead to a performance analysis algorithm established for the caulking process, which is used later in the paper for experimental research. The study is a basis from which further research can proceed in order to optimize the process.

  14. Three-dimensional model analysis and processing

    CERN Document Server

    Yu, Faxin; Luo, Hao; Wang, Pinghui

    2011-01-01

    This book focuses on five hot research directions in 3D model analysis and processing in computer science:  compression, feature extraction, content-based retrieval, irreversible watermarking and reversible watermarking.

  15. Combined process automation for large-scale EEG analysis.

    Science.gov (United States)

    Sfondouris, John L; Quebedeaux, Tabitha M; Holdgraf, Chris; Musto, Alberto E

    2012-01-01

    Epileptogenesis is a dynamic process producing increased seizure susceptibility. Electroencephalography (EEG) data provides information critical in understanding the evolution of epileptiform changes throughout epileptic foci. We designed an algorithm to facilitate efficient large-scale EEG analysis via linked automation of multiple data processing steps. Using EEG recordings obtained from electrical stimulation studies, the following steps of EEG analysis were automated: (1) alignment and isolation of pre- and post-stimulation intervals, (2) generation of user-defined band frequency waveforms, (3) spike-sorting, (4) quantification of spike and burst data and (5) power spectral density analysis. This algorithm allows for quicker, more efficient EEG analysis. Copyright © 2011 Elsevier Ltd. All rights reserved.
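
    Two of the automated steps above, (2) generation of a band-frequency waveform and (5) power spectral density analysis, can be sketched with SciPy as follows (the sampling rate and the synthetic trace are assumptions for illustration, not details from the study):

```python
import numpy as np
from scipy import signal

fs = 1000.0                                   # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)
eeg = np.sin(2 * np.pi * 8 * t) + 0.5 * np.random.randn(t.size)  # synthetic trace

# (2) band-pass to a user-defined band (here 4-12 Hz) with a zero-phase filter
sos = signal.butter(4, [4, 12], btype="bandpass", fs=fs, output="sos")
band = signal.sosfiltfilt(sos, eeg)

# (5) Welch power spectral density of the filtered interval
freqs, psd = signal.welch(band, fs=fs, nperseg=2048)
print("peak frequency:", freqs[np.argmax(psd)], "Hz")  # ~8 Hz
```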

  16. System and method for producing substitute natural gas from coal

    Science.gov (United States)

    Hobbs, Raymond [Avondale, AZ

    2012-08-07

    The present invention provides a system and method for producing substitute natural gas and electricity, while mitigating production of any greenhouse gasses. The system includes a hydrogasification reactor, to form a gas stream including natural gas and a char stream, and an oxygen burner to combust the char material to form carbon oxides. The system also includes an algae farm to convert the carbon oxides to hydrocarbon material and oxygen.

  17. POST-PROCESSING ANALYSIS FOR THC SEEPAGE

    International Nuclear Information System (INIS)

    SUN, Y.

    2004-01-01

    This report describes the selection of water compositions for the total system performance assessment (TSPA) model of results from the thermal-hydrological-chemical (THC) seepage model documented in "Drift-Scale THC Seepage Model" (BSC 2004 [DIRS 169856]). The selection has been conducted in accordance with "Technical Work Plan for: Near-Field Environment and Transport: Coupled Processes (Mountain-Scale TH/THC/THM, Drift-Scale THC Seepage, and Post-Processing Analysis for THC Seepage) Report Integration" (BSC 2004 [DIRS 171334]). This technical work plan (TWP) was prepared in accordance with AP-2.27Q, "Planning for Science Activities". Section 1.2.3 of the TWP describes planning information pertaining to the technical scope, content, and management of this report. The post-processing analysis for THC seepage (THC-PPA) documented in this report provides a methodology for evaluating the near-field compositions of water and gas around a typical waste emplacement drift as these relate to the chemistry of seepage, if any, into the drift. The THC-PPA inherits the conceptual basis of the THC seepage model, but is an independently developed process. The relationship between the post-processing analysis and other closely related models, together with their main functions in providing seepage chemistry information for the Total System Performance Assessment for the License Application (TSPA-LA), are illustrated in Figure 1-1. The THC-PPA provides a data selection concept and direct input to the physical and chemical environment (P and CE) report that supports the TSPA model. The purpose of the THC-PPA is further discussed in Section 1.2. The data selection methodology of the post-processing analysis (Section 6.2.1) was initially applied to results of the THC seepage model as presented in "Drift-Scale THC Seepage Model" (BSC 2004 [DIRS 169856]). Other outputs from the THC seepage model (DTN: LB0302DSCPTHCS.002 [DIRS 161976]) used in the P and CE (BSC 2004 [DIRS 169860

  18. Advanced exergetic analysis of five natural gas liquefaction processes

    International Nuclear Information System (INIS)

    Vatani, Ali; Mehrpooya, Mehdi; Palizdar, Ali

    2014-01-01

    Highlights: • Advanced exergetic analysis was investigated for five LNG processes. • Avoidable/unavoidable and endogenous/exogenous irreversibilities were calculated. • Advanced exergetic analysis identifies the potentials for improving the system. - Abstract: Conventional exergy analysis cannot identify the portion of inefficiencies that can be avoided. Nor can it calculate the portion of exergy destruction produced by the performance of a component alone. In this study, advanced exergetic analysis was performed for five mixed-refrigerant LNG processes, and the four parts of irreversibility (avoidable/unavoidable and endogenous/exogenous) were calculated for the components with high inefficiencies. The results showed that the portion of endogenous exergy destruction in the components is higher than the exogenous one; in fact, interactions among the components do not affect the inefficiencies significantly. The analysis also showed that structural optimization cannot be useful in decreasing the overall process irreversibilities. In compressors, a high portion of the exergy destruction is avoidable, so they have high potential for improvement. But in multi-stream heat exchangers and air coolers, unavoidable inefficiencies were higher than the other parts. Advanced exergetic analysis can identify the potentials and strategies to improve the thermodynamic performance of energy-intensive processes.

  19. Analysis of briquetting process of sewage sludge with coal to combustion process

    Directory of Open Access Journals (Sweden)

    Kosturkiewicz Bogdan

    2016-01-01

    Full Text Available Energy recovery from sewage sludge can be achieved by several thermal technologies, but before those processes sewage sludge requires special pretreatment. The paper presents an investigation of sewage sludge-with-coal briquettes as a fuel for the combustion process. Research is conducted at the Department of Manufacturing Systems and the Department of Thermal Engineering and Environmental Protection, AGH University of Science and Technology, to develop a technology for briquette preparation. The obtained results showed the possibility of briquetting municipal sewage sludge with coal in roll presses equipped with an asymmetric thickening gravity feed system. The following properties were determined for the obtained briquettes: density, drop strength and compressive strength. Based on physical and chemical analysis of the prepared briquettes, it was confirmed that the briquettes have good fuel properties for the combustion process. The thermal behaviour of the studied sewage sludge and the prepared mixture was investigated by thermogravimetric analysis (TG). For the TG analysis, the samples were heated in an alumina crucible from ambient temperature up to 1000 °C at constant rates of 10 °C/min, 40 °C/min and 100 °C/min in a 40 ml/min flow of air.

  20. Profitability Analysis of Soybean Oil Processes.

    Science.gov (United States)

    Cheng, Ming-Hsun; Rosentrater, Kurt A

    2017-10-07

    Soybean oil production is the basic process for soybean applications. Cash flow analysis is used to estimate the profitability of a manufacturing venture. Besides capital investments, operating costs, and revenues, the interest rate is a key factor in estimating the net present value (NPV), break-even points, and payback time, which are benchmarks for profitability evaluation. A positive NPV and a reasonable payback time indicate a profitable process and provide an acceptable projection for real operation. Additionally, the capacity of the process is another critical factor. The extruding-expelling process and hexane extraction are the two typical approaches used in industry. When the capacities of annual oil production are larger than 12 and 173 million kg, respectively, these two processes are profitable. The solvent-free approach, known as the enzyme assisted aqueous extraction process (EAEP), is profitable when the capacity is larger than 17 million kg of annual oil production.
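
    A minimal sketch of the cash-flow metrics named above, NPV and payback time, with invented plant figures (not the paper's data):

```python
# Cash-flow profitability metrics: net present value (NPV) and payback time.
def npv(rate, cash_flows):
    # cash_flows[0] is the initial (negative) investment at year 0
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(cash_flows))

def payback_year(cash_flows):
    total = 0.0
    for year, cf in enumerate(cash_flows):
        total += cf
        if total >= 0:
            return year
    return None  # never pays back within the horizon

flows = [-12e6] + [2.1e6] * 15          # hypothetical plant: 15 years of net revenue
print(f"NPV @ 8%: {npv(0.08, flows):,.0f} USD")   # positive => profitable
print("payback in year:", payback_year(flows))    # year 6 for these figures
```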

  1. Profitability Analysis of Soybean Oil Processes

    Directory of Open Access Journals (Sweden)

    Ming-Hsun Cheng

    2017-10-01

    Full Text Available Soybean oil production is the basic process for soybean applications. Cash flow analysis is used to estimate the profitability of a manufacturing venture. Besides capital investments, operating costs, and revenues, the interest rate is a key factor in estimating the net present value (NPV), break-even points, and payback time, which are benchmarks for profitability evaluation. A positive NPV and a reasonable payback time indicate a profitable process and provide an acceptable projection for real operation. Additionally, the capacity of the process is another critical factor. The extruding-expelling process and hexane extraction are the two typical approaches used in industry. When the capacities of annual oil production are larger than 12 and 173 million kg, respectively, these two processes are profitable. The solvent-free approach, known as the enzyme assisted aqueous extraction process (EAEP), is profitable when the capacity is larger than 17 million kg of annual oil production.

  2. Data near processing support for climate data analysis

    Science.gov (United States)

    Kindermann, Stephan; Ehbrecht, Carsten; Hempelmann, Nils

    2016-04-01

    Climate data repositories grow in size exponentially. Scalable data-near processing capabilities are required to meet future data analysis requirements and to replace current "data download and process at home" workflows and approaches. On the one hand, these processing capabilities should be accessible via standardized interfaces (e.g. OGC WPS); on the other hand, a large variety of processing tools, toolboxes and deployment alternatives have to be supported and maintained at the data/processing center. We present a community approach: a modular and flexible system supporting the development, deployment and maintenance of OGC-WPS based web processing services. This approach is organized in an open source GitHub project (called "bird-house") supporting individual processing services ("birds", e.g. climate index calculations, model data ensemble calculations), which rely on basic common infrastructural components (e.g. installation and deployment recipes, analysis code dependency management). To support easy deployment at data centers as well as home institutes (e.g. for testing and development), the system supports the management of the often very complex package dependency chains of climate data analysis packages as well as Docker based packaging and installation. We present a concrete deployment scenario at the German Climate Computing Center (DKRZ). DKRZ on the one hand hosts a multi-petabyte climate archive which is integrated, e.g., into the European ENES and worldwide ESGF data infrastructures, and on the other hand hosts an HPC center supporting (model) data production and data analysis. The deployment scenario also includes OpenStack based data cloud services to support data import and data distribution for bird-house based WPS web processing services. Current challenges for inter-institutional deployments of web processing services supporting the European and international climate modeling communities as well as the climate impact community are highlighted.
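
    A hedged sketch of how such an OGC WPS endpoint might be queried from Python with OWSLib; the endpoint URL, process identifier and inputs below are hypothetical placeholders, not part of the bird-house documentation:

```python
from owslib.wps import WebProcessingService

# Hypothetical endpoint of a bird-house style WPS deployment.
wps = WebProcessingService("http://localhost:8093/wps", skip_caps=True)
wps.getcapabilities()
for process in wps.processes:          # advertised via GetCapabilities
    print(process.identifier, "-", process.title)

# Execute a (hypothetical) process with key/value inputs and poll its status.
execution = wps.execute("subset_bbox",
                        inputs=[("variable", "tas"), ("bbox", "5,45,15,55")])
print(execution.status)
```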

  3. Ergonomic analysis of radiopharmaceuticals samples preparation process

    International Nuclear Information System (INIS)

    Gomes, Luciene Betzler C.; Santos, Isaac Luquetti dos; Fonseca, Antonio Carlos C. da; Pellini, Marcos Pinto; Rebelo, Ana Maria

    2005-01-01

    The doses of radioisotopes to be administered to patients for diagnosis or therapy are prepared in the radiopharmacy sector. The preparation process adopts techniques aimed at reducing the exposure time of the professionals and the absorption of excessive doses by patients. The ergonomic analysis of this process contributes to the prevention of occupational illnesses and of accident risks during the routines, providing welfare and security to the users involved and conferring an adequate working standard on the process. In this context, studies that analyze the factors pointing toward the solution of problems, and that establish proposals to minimize risks in the exercise of the activities, are clearly relevant. Through a methodology based on the concepts of ergonomics, improvements in effectiveness and quality and a reduction of the difficulties experienced by the workers are sought. The prescribed work, established through codified norms and procedures, is compared with the work as actually carried out (the real work) in order to reach a correct appreciation, with focus on the activities. The objective of this work is to discuss an ergonomic analysis of the radioisotope sample preparation process in the Setor de Radiofarmacia do Hospital Universitario Clementino Fraga Filho da Universidade Federal do Rio de Janeiro (UFRJ). (author)

  4. Vygotsky's Analysis of Children's Meaning Making Processes

    Science.gov (United States)

    Mahn, Holbrook

    2012-01-01

    Vygotsky's work is extensive and covers many aspects of the development of children's meaning-making processes in social and cultural contexts. However, his main focus is on the examination of the unification of speaking and thinking processes. His investigation centers on the analysis of the entity created by this unification--an internal…

  5. Big Data Analysis of Manufacturing Processes

    Science.gov (United States)

    Windmann, Stefan; Maier, Alexander; Niggemann, Oliver; Frey, Christian; Bernardi, Ansgar; Gu, Ying; Pfrommer, Holger; Steckel, Thilo; Krüger, Michael; Kraus, Robert

    2015-11-01

    The high complexity of manufacturing processes and the continuously growing amount of data lead to excessive demands on the users with respect to process monitoring, data analysis and fault detection. For these reasons, problems and faults are often detected too late, maintenance intervals are chosen too short and optimization potential for higher output and increased energy efficiency is not sufficiently used. A possibility to cope with these challenges is the development of self-learning assistance systems, which identify relevant relationships by observation of complex manufacturing processes so that failures, anomalies and need for optimization are automatically detected. The assistance system developed in the present work accomplishes data acquisition, process monitoring and anomaly detection in industrial and agricultural processes. The assistance system is evaluated in three application cases: Large distillation columns, agricultural harvesting processes and large-scale sorting plants. In this paper, the developed infrastructures for data acquisition in these application cases are described as well as the developed algorithms and initial evaluation results.

  6. Big Data Analysis of Manufacturing Processes

    International Nuclear Information System (INIS)

    Windmann, Stefan; Maier, Alexander; Niggemann, Oliver; Frey, Christian; Bernardi, Ansgar; Gu, Ying; Pfrommer, Holger; Steckel, Thilo; Krüger, Michael; Kraus, Robert

    2015-01-01

    The high complexity of manufacturing processes and the continuously growing amount of data lead to excessive demands on the users with respect to process monitoring, data analysis and fault detection. For these reasons, problems and faults are often detected too late, maintenance intervals are chosen too short and optimization potential for higher output and increased energy efficiency is not sufficiently used. A possibility to cope with these challenges is the development of self-learning assistance systems, which identify relevant relationships by observation of complex manufacturing processes so that failures, anomalies and need for optimization are automatically detected. The assistance system developed in the present work accomplishes data acquisition, process monitoring and anomaly detection in industrial and agricultural processes. The assistance system is evaluated in three application cases: Large distillation columns, agricultural harvesting processes and large-scale sorting plants. In this paper, the developed infrastructures for data acquisition in these application cases are described as well as the developed algorithms and initial evaluation results. (paper)

  7. BUSINESS PROCESS MANAGEMENT SYSTEMS TECHNOLOGY COMPONENTS ANALYSIS

    Directory of Open Access Journals (Sweden)

    Andrea Giovanni Spelta

    2007-05-01

    Full Text Available The information technology that supports the implementation of the business process management approach is called a Business Process Management System (BPMS). The main components of the BPMS solution framework are the process definition repository, process instances repository, transaction manager, connectors framework, process engine and middleware. In this paper we define and characterize the role and importance of the components of the BPMS framework. The research method adopted was the case study, through the analysis of the implementation of a BPMS solution in an insurance company called Chubb do Brasil. In the case study, the process "Manage Coinsured Events" is described and characterized, as well as the components of the BPMS solution adopted and implemented by Chubb do Brasil for managing this process.

  8. Gasification with nuclear reactor heat

    International Nuclear Information System (INIS)

    Weisbrodt, I.A.

    1977-01-01

    The ultimate energy-policy aims for the introduction of nuclear coal gasification and the present state of technology concerning the HTR reactor, gasification, and heat-exchanging components are outlined. Plans are presented a) for hydro-gasification of lignite and for steam gasification of hard coal for the production of synthetic natural gas, and b) for the introduction of a nuclear heat system. The safety and environmental problems to be expected are portrayed. The main points of development, the planned prototype plant and the schedule of the project Prototype Plant Nuclear Process Heat (PNP) are specified. In a market and economic viability study of nuclear coal gasification, the application potential of SNG, the possible construction programme for the FRG, as well as the costs and profitability of SNG production are estimated. (GG) [de]

  9. Second-order analysis of semiparametric recurrent event processes.

    Science.gov (United States)

    Guan, Yongtao

    2011-09-01

    A typical recurrent event dataset consists of an often large number of recurrent event processes, each of which contains multiple event times observed from an individual during a follow-up period. Such data have become increasingly available in medical and epidemiological studies. In this article, we introduce novel procedures to conduct second-order analysis for a flexible class of semiparametric recurrent event processes. Such an analysis can provide useful information regarding the dependence structure within each recurrent event process. Specifically, we will use the proposed procedures to test whether the individual recurrent event processes are all Poisson processes and to suggest sensible alternative models for them if they are not. We apply these procedures to a well-known recurrent event dataset on chronic granulomatous disease and an epidemiological dataset on meningococcal disease cases in Merseyside, United Kingdom to illustrate their practical value. © 2011, The International Biometric Society.
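
    One classical check of the Poisson hypothesis, sketched on simulated data (a generic textbook device for orientation, not the second-order procedure proposed in the article): for a homogeneous Poisson process on [0, T], the event times given their number are i.i.d. uniform, which a Kolmogorov-Smirnov test can probe.

```python
import numpy as np
from scipy import stats

# Conditional on the number of events, a homogeneous Poisson process on
# [0, T] has i.i.d. Uniform(0, T) event times. With real data, small
# p-values across many individuals would argue against the Poisson model.
rng = np.random.default_rng(7)
T = 100.0
n = rng.poisson(0.3 * T)                      # simulate one individual's process
event_times = np.sort(rng.uniform(0, T, n))

stat, p = stats.kstest(event_times / T, "uniform")
print(f"KS statistic = {stat:.3f}, p = {p:.3f}")  # large p: consistent with Poisson
```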

  10. Multiobjective Optimization of ELID Grinding Process Using Grey Relational Analysis Coupled with Principal Component Analysis

    Directory of Open Access Journals (Sweden)

    S. Prabhu

    2014-06-01

    Full Text Available A carbon nanotube (CNT)-mixed grinding wheel has been used in the electrolytic in-process dressing (ELID) grinding process to analyze the surface characteristics of AISI D2 tool steel. The CNT grinding wheel has excellent thermal conductivity and good mechanical properties, which are used to improve the surface finish of the workpiece. Multiobjective optimization using grey relational analysis coupled with principal component analysis has been used to optimize the process parameters of the ELID grinding process. Based on the Taguchi design of experiments, an L9 orthogonal array table was chosen for the experiments. The confirmation experiment verifies that the proposed grey-based Taguchi method has the ability to find the optimal process parameters with multiple quality characteristics of surface roughness and metal removal rate. Analysis of variance (ANOVA) has been used to verify and validate the model. An empirical model for the prediction of output parameters has been developed using regression analysis, and the results were compared with and without using the CNT grinding wheel in the ELID grinding process.
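
    A minimal grey relational grade computation for the two quality characteristics named above, with invented L9 response values (equal weights stand in for the PCA-derived weights used in the paper):

```python
import numpy as np

# Grey relational analysis over two responses: surface roughness
# (smaller-is-better) and metal removal rate (larger-is-better).
responses = np.array([
    # Ra (um), MRR (mm^3/min)  -- invented L9 results
    [0.42, 110], [0.38, 95],  [0.55, 140],
    [0.31, 90],  [0.47, 150], [0.36, 120],
    [0.52, 100], [0.29, 85],  [0.44, 130],
])

def normalize(col, larger_is_better):
    lo, hi = col.min(), col.max()
    return (col - lo) / (hi - lo) if larger_is_better else (hi - col) / (hi - lo)

norm = np.column_stack([
    normalize(responses[:, 0], larger_is_better=False),  # roughness
    normalize(responses[:, 1], larger_is_better=True),   # removal rate
])

delta = 1.0 - norm                      # deviation from the ideal sequence
zeta = 0.5                              # distinguishing coefficient
coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
grade = coeff.mean(axis=1)              # equal weights; PCA could supply weights
print(f"best run: {int(np.argmax(grade)) + 1}, grade: {grade.max():.3f}")
```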

  11. Analysis of hard inclusive processes in quantum chromodynamics

    International Nuclear Information System (INIS)

    Radyushkin, A.V.

    1983-01-01

    An approach to the investigation of hard processes in QCD based on a regular usage of α-representation analysis of Feynman diagram asymptotics is described. The analysis is exemplified by the two simplest inclusive processes: e+e- annihilation into hadrons and deep inelastic lepton-hadron scattering. The procedure for separating (factorizing) the contributions due to short- and long-range particle interactions is reported. The relation between expansion operators and methods based on direct analysis of diagrams, as well as between field-theoretical approaches and the parton model, is discussed. Specific features of the factorization of short- and long-range contributions in non-Abelian gauge theories are investigated.

  12. Artificial intelligence applied to process signal analysis

    Science.gov (United States)

    Corsberg, Dan

    1988-01-01

    Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge base approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.

  13. Pinch analysis for bioethanol production process from lignocellulosic biomass

    International Nuclear Information System (INIS)

    Fujimoto, S.; Yanagida, T.; Nakaiwa, M.; Tatsumi, H.; Minowa, T.

    2011-01-01

    Bioethanol produced from carbon-neutral and renewable biomass resources is attractive for the mitigation of greenhouse gases from vehicle exhaust. This study investigated energy utilization during bioethanol production from lignocellulose, which avoids competition with food production from corn, considering the potential mitigation of greenhouse gases. Process design and simulations were performed for bioethanol production using concentrated sulfuric acid. Mass and heat balances were obtained by process simulations, and the heat recovery ratio was determined by pinch analysis. An energy saving of 38% was achieved. However, energy supply and demand were not effectively utilized in the temperature range from 95 to 100 °C. Therefore, a heat pump was used to improve the temperature range of efficient energy supply and demand. Results showed that the energy required for the process could be supplied by heat released during the process. Additionally, the power required was supplied by surplus power generated during the process. Thus, pinch analysis was used to improve the energy efficiency of the process. - Highlights: → Effective energy utilization of bioethanol production was studied using pinch analysis. → It was found that energy was not effectively utilized in the temperature range from 95 to 100 °C. → Use of a heat pump was considered to improve this ineffective utilization. → Remarkable energy savings could thus be achieved. → Pinch analysis effectively improved the energy efficiency of bioethanol production.
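
    For orientation, a compact "problem table" pinch calculation on invented stream data (not the bioethanol flowsheet of the study): it cascades interval heat balances at a chosen ΔTmin to obtain minimum hot/cold utility targets and the pinch temperature.

```python
import numpy as np

# Each stream: (T_supply, T_target, CP) with CP = m*cp in kW/K.
streams = [
    (180.0, 60.0, 3.0),   # hot stream (cools down)
    (150.0, 30.0, 1.5),   # hot stream
    (20.0, 135.0, 2.0),   # cold stream (heats up)
    (80.0, 160.0, 4.0),   # cold stream
]
dt_min = 10.0

# Shift hot streams down and cold streams up by dt_min/2.
shifted = []
for ts, tt, cp in streams:
    shift = -dt_min / 2 if ts > tt else dt_min / 2
    shifted.append((ts + shift, tt + shift, cp))

bounds = sorted({t for ts, tt, _ in shifted for t in (ts, tt)}, reverse=True)

# Net heat surplus/deficit in every shifted-temperature interval.
surpluses = []
for hi, lo in zip(bounds, bounds[1:]):
    net_cp = 0.0
    for ts, tt, cp in shifted:
        if min(ts, tt) <= lo and max(ts, tt) >= hi:   # stream spans interval
            net_cp += cp if ts > tt else -cp          # hot adds, cold removes
    surpluses.append(net_cp * (hi - lo))

# Cascade: hot utility offsets the worst deficit; pinch where the cascade is 0.
cascade = np.concatenate([[0.0], np.cumsum(surpluses)])
q_hot = -min(cascade.min(), 0.0)
feasible = cascade + q_hot
print("hot utility  :", q_hot, "kW")                          # 70 kW here
print("cold utility :", feasible[-1], "kW")                   # 60 kW here
print("pinch (shifted T):", bounds[int(np.argmin(feasible))], "°C")  # 85 °C
```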

  14. What carries a mediation process? Configural analysis of mediation.

    Science.gov (United States)

    von Eye, Alexander; Mun, Eun Young; Mair, Patrick

    2009-09-01

    Mediation is a process that links a predictor and a criterion via a mediator variable. Mediation can be full or partial. This well-established definition operates at the level of variables even if they are categorical. In this article, two new approaches to the analysis of mediation are proposed. Both of these approaches focus on the analysis of categorical variables. The first involves mediation analysis at the level of configurations instead of variables. Thus, mediation can be incorporated into the arsenal of methods of analysis for person-oriented research. Second, it is proposed that Configural Frequency Analysis (CFA) can be used for both exploration and confirmation of mediation relationships among categorical variables. The implications of using CFA are first that mediation hypotheses can be tested at the level of individual configurations instead of variables. Second, this approach leaves the door open for different types of mediation processes to exist within the same set. Using a data example, it is illustrated that aggregate-level analysis can overlook mediation processes that operate at the level of individual configurations.

  15. Flame analysis using image processing techniques

    Science.gov (United States)

    Her Jie, Albert Chang; Zamli, Ahmad Faizal Ahmad; Zulazlan Shah Zulkifli, Ahmad; Yee, Joanne Lim Mun; Lim, Mooktzeng

    2018-04-01

    This paper presents image processing techniques with the use of fuzzy logic and a neural network approach to perform flame analysis. Flame diagnostics are important in industry for extracting relevant information from flame images. Experimental tests are carried out in a model industrial burner with different flow rates. Flame features such as luminous and spectral parameters are extracted using image processing and the Fast Fourier Transform (FFT). Flame images are acquired using a FLIR infrared camera. Non-linearities such as thermo-acoustic oscillations and background noise affect the stability of the flame. Flame velocity is one of the important characteristics that determine flame stability. In this paper, an image processing method is proposed to determine flame velocity. A power spectral density (PSD) graph is a good tool for vibration analysis, from which flame stability can be approximated. However, a more intelligent diagnostic system is needed to automatically determine flame stability. In this paper, flame features at different flow rates are compared and analyzed. The selected flame features are used as inputs to the proposed fuzzy inference system to determine flame stability. A neural network is used to test the performance of the fuzzy inference system.

  16. Improvement of product design process by knowledge value analysis

    OpenAIRE

    XU, Yang; BERNARD, Alain; PERRY, Nicolas; LAROCHE, Florent

    2013-01-01

    Nowadays, design activities remain the core issue for global product development. As knowledge is more and more integrated, effective analysis of knowledge value becomes very useful for the improvement of product design processes. This paper aims at proposing a framework of knowledge value analysis in the context of product design process. By theoretical analysis and case study, the paper illustrates how knowledge value can be calculated and how the results can help the improvement of product...

  17. Carbon dioxide capture processes: Simulation, design and sensitivity analysis

    DEFF Research Database (Denmark)

    Zaman, Muhammad; Lee, Jay Hyung; Gani, Rafiqul

    2012-01-01

    Carbon dioxide is the main greenhouse gas and its major source is combustion of fossil fuels for power generation. The objective of this study is to carry out the steady-state sensitivity analysis for chemical absorption of carbon dioxide capture from flue gas using monoethanolamine solvent. First, ... equilibrium and associated property models are used. Simulations are performed to investigate the sensitivity of the process variables to changes in the design variables, including process inputs and disturbances in the property model parameters. Results of the sensitivity analysis of the steady-state performance of the process to the L/G ratio to the absorber, CO2 lean solvent loadings, and stripper pressure are presented in this paper. Based on the sensitivity analysis, process optimization problems have been defined and solved, and a preliminary control structure selection has been made.

  18. A Knowledge-based Environment for Software Process Performance Analysis

    Directory of Open Access Journals (Sweden)

    Natália Chaves Lessa Schots

    2015-08-01

    Full Text Available Background: Process performance analysis is a key step for implementing continuous improvement in software organizations. However, the knowledge needed to execute such analysis is not trivial, and the person responsible for executing it must be provided with appropriate support. Aim: This paper presents a knowledge-based environment, named SPEAKER, proposed for supporting software organizations during the execution of process performance analysis. SPEAKER comprises a body of knowledge and a set of activities and tasks for software process performance analysis, along with supporting tools for executing these activities and tasks. Method: We conducted an informal literature review and a systematic mapping study, which provided basic requirements for the proposed environment. We implemented the SPEAKER environment by integrating supporting tools for the execution of activities and tasks of performance analysis with the knowledge necessary to execute them, in order to meet the variability presented by the characteristics of these activities. Results: In this paper, we describe each SPEAKER module and the individual evaluations of these modules, and also present an example of use comprising how the environment can guide the user through a specific performance analysis activity. Conclusion: Although we only conducted individual evaluations of SPEAKER's modules, the example of use indicates the feasibility of the proposed environment. Therefore, the environment as a whole will be further evaluated to verify whether it attains its goal of assisting in the execution of process performance analysis by non-specialist people.

  19. Laplace-Laplace analysis of the fractional Poisson process

    OpenAIRE

    Gorenflo, Rudolf; Mainardi, Francesco

    2013-01-01

    We generate the fractional Poisson process by subordinating the standard Poisson process to the inverse stable subordinator. Our analysis is based on application of the Laplace transform with respect to both arguments of the evolving probability densities.
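
    For orientation, the transforms in question take a particularly clean form; the relations below are standard in the fractional Poisson literature (with E_β the Mittag-Leffler function and 0 < β ≤ 1), stated here only to fix notation:

```latex
% Waiting-time survival function of the fractional Poisson process and the
% Laplace transform of the waiting-time density; beta = 1 recovers the
% classical exponential case lambda/(s + lambda).
\Psi(t) = E_\beta(-\lambda t^\beta), \qquad
E_\beta(z) = \sum_{n=0}^{\infty} \frac{z^n}{\Gamma(\beta n + 1)},
\qquad
\widetilde{f}(s) = \int_0^\infty e^{-st} f(t)\,dt = \frac{\lambda}{s^\beta + \lambda},
\quad f(t) = -\frac{d\Psi}{dt}.
```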

  20. Analysis of the logistics processes in the wine distribution

    OpenAIRE

    Slavkovský, Matúš

    2011-01-01

    This Master's thesis addresses the importance of logistics in the retail business and the importance of reducing logistics costs. It includes both theoretical background and an analysis of the relevant markets that produce and consume wine in the largest quantities. The thesis focuses on an analysis of the logistics processes and costs of an e-shop. Based on this analysis, measures to improve the company's logistics processes are proposed. The goal of the Master's thesis is...

  1. Thermodynamic analysis applied to a food-processing plant

    Energy Technology Data Exchange (ETDEWEB)

    Ho, J C; Chandratilleke, T T

    1987-01-01

    Two production lines of a multi-product, food-processing plant are selected for energy auditing and analysis. Thermodynamic analysis showed that the first-law and second-law efficiencies are 81.5% and 26.1% for the instant-noodles line and 23.6% and 7.9% for the malt-beverage line. These efficiency values are dictated primarily by the major energy-consuming sub-processes of each production line. Improvements in both first-law and second-law efficiencies are possible for the plants if the use of steam for heating is replaced by gaseous or liquid fuels, the steam ejectors for creating vacuum are replaced by a mechanical pump, and employing the cooler surroundings to assist in the cooling process.

  2. A software tool for design of process monitoring and analysis systems

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2009-01-01

    A well designed process monitoring and analysis system is necessary to consistently achieve any predefined end product quality. Systematic computer aided methods and tools provide the means to design the necessary process monitoring and analysis systems and/or to validate any existing monitoring...... and analysis system. A software to achieve this has been developed. Two developed supporting tools for the design, a knowledge base (consisting of the process knowledge as well as the knowledge on measurement methods & tools) and a model library (consisting of the process operational models) have been extended...... rigorously and integrated with the user interface, which made the software more generic and applicable to a wide range of problems. The software for the design of a process monitoring and analysis system is presented and illustrated with a tablet manufacturing process example....

  3. Radar signal analysis and processing using Matlab

    CERN Document Server

    Mahafza, Bassem R

    2008-01-01

    Offering radar-related software for the analysis and design of radar waveforms and signal processing, this book provides comprehensive coverage of radar signals and signal processing techniques and algorithms. It contains numerous graphical plots, common radar-related functions, table-format outputs, and end-of-chapter problems. The complete set of MATLAB® functions and routines is available for download online.

  4. Mathematical principles of signal processing Fourier and wavelet analysis

    CERN Document Server

    Brémaud, Pierre

    2002-01-01

    Fourier analysis is one of the most useful tools in many applied sciences. The recent developments of wavelet analysis indicates that in spite of its long history and well-established applications, the field is still one of active research. This text bridges the gap between engineering and mathematics, providing a rigorously mathematical introduction of Fourier analysis, wavelet analysis and related mathematical methods, while emphasizing their uses in signal processing and other applications in communications engineering. The interplay between Fourier series and Fourier transforms is at the heart of signal processing, which is couched most naturally in terms of the Dirac delta function and Lebesgue integrals. The exposition is organized into four parts. The first is a discussion of one-dimensional Fourier theory, including the classical results on convergence and the Poisson sum formula. The second part is devoted to the mathematical foundations of signal processing - sampling, filtering, digital signal proc...

  5. Neyman, Markov processes and survival analysis.

    Science.gov (United States)

    Yang, Grace

    2013-07-01

    J. Neyman used stochastic processes extensively in his applied work. One example is the Fix and Neyman (F-N) competing risks model (1951) that uses finite homogeneous Markov processes to analyse clinical trials with breast cancer patients. We revisit the F-N model, and compare it with the Kaplan-Meier (K-M) formulation for right censored data. The comparison offers a way to generalize the K-M formulation to include risks of recovery and relapses in the calculation of a patient's survival probability. The generalization is to extend the F-N model to a nonhomogeneous Markov process. Closed-form solutions of the survival probability are available in special cases of the nonhomogeneous processes, like the popular multiple decrement model (including the K-M model) and Chiang's staging model, but these models do not consider recovery and relapses while the F-N model does. An analysis of sero-epidemiology current status data with recurrent events is illustrated. Fix and Neyman used Neyman's RBAN (regular best asymptotic normal) estimates for the risks, and provided a numerical example showing the importance of considering both the survival probability and the length of time of a patient living a normal life in the evaluation of clinical trials. The said extension would result in a complicated model and it is unlikely to find analytical closed-form solutions for survival analysis. With ever increasing computing power, numerical methods offer a viable way of investigating the problem.
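
    A small numerical sketch in the spirit of this extension (states and rates are invented, not the Fix-Neyman clinical values): with a finite-state homogeneous Markov model that includes recovery and relapse, the survival probability is the sum of transient-state occupation probabilities obtained from the matrix exponential.

```python
import numpy as np
from scipy.linalg import expm

# Toy illness-death model: states 0 = healthy, 1 = relapse, 2 = dead
# (absorbing). Transition rates (per year) are invented for illustration;
# each row of the generator Q sums to zero.
Q = np.array([
    [-0.30,  0.25, 0.05],   # healthy -> relapse, healthy -> dead
    [ 0.40, -0.60, 0.20],   # relapse -> recovery, relapse -> dead
    [ 0.00,  0.00, 0.00],   # dead is absorbing
])

t = 5.0                                  # years
P = expm(Q * t)                          # transition probability matrix P(t)
survival = P[0, 0] + P[0, 1]             # alive = healthy or relapsed
print(f"P(alive at {t:g} y | healthy at 0) = {survival:.3f}")
```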

  6. Tolerance analysis in manufacturing using process capability ratio with measurement uncertainty

    DEFF Research Database (Denmark)

    Mahshid, Rasoul; Mansourvar, Zahra; Hansen, Hans Nørgaard

    2017-01-01

    Tolerance analysis provides valuable information regarding the performance of a manufacturing process. It allows determining the maximum possible variation of a quality feature in production. Previous research has focused on the application of tolerance analysis to the design of mechanical assemblies. In this paper, a new statistical analysis was applied to manufactured products to assess achieved tolerances when the process is known, using the capability ratio and expanded uncertainty. The analysis has benefits for process planning, determining actual precision limits, process optimization, and troubleshooting a malfunctioning existing part. The capability measure is based on a number of measurements performed on the part's quality variable. Since the ratio relies on measurements, any unaccounted measurement error has a notable negative impact on the results. Therefore, measurement uncertainty was used in combination with the process...

  7. Reachability for Finite-State Process Algebras Using Static Analysis

    DEFF Research Database (Denmark)

    Skrypnyuk, Nataliya; Nielson, Flemming

    2011-01-01

    In this work we present an algorithm for solving the reachability problem in finite systems that are modelled with process algebras. Our method uses Static Analysis, in particular, Data Flow Analysis, of the syntax of a process algebraic system with multi-way synchronisation. The results of the Data Flow Analysis are used in order to “cut off” some of the branches in the reachability analysis that are not important for determining whether or not a state is reachable. In this way, it is possible for our reachability algorithm to avoid building large parts of the system altogether and still solve the reachability problem in a precise way.

  8. MetaboLab - advanced NMR data processing and analysis for metabolomics

    Directory of Open Access Journals (Sweden)

    Günther Ulrich L

    2011-09-01

    Full Text Available Abstract Background: Despite wide-spread use of Nuclear Magnetic Resonance (NMR) in metabolomics for the analysis of biological samples, there is a lack of graphically driven, publicly available software to process large one- and two-dimensional NMR data sets for statistical analysis. Results: Here we present MetaboLab, a MATLAB based software package that facilitates NMR data processing by providing automated algorithms for processing series of spectra in a reproducible fashion. A graphical user interface provides easy access to all steps of data processing via a script builder to generate MATLAB scripts, providing an option to alter code manually. The analysis of two-dimensional (1H,13C-HSQC) spectra is facilitated by the use of a spectral library derived from publicly available databases, which can be extended readily. The software allows the user to display specific metabolites in small regions of interest where signals can be picked. To facilitate the analysis of series of two-dimensional spectra, different spectra can be overlaid and assignments can be transferred between spectra. The software includes mechanisms to account for overlapping signals by highlighting neighboring and ambiguous assignments. Conclusions: The MetaboLab software is an integrated software package for NMR data processing and analysis, closely linked to the previously developed NMRLab software. It includes tools for batch processing and gives access to a wealth of algorithms available in the MATLAB framework. Algorithms within MetaboLab help to optimize the flow of metabolomics data preparation for statistical analysis. The combination of an intuitive graphical user interface along with advanced data processing algorithms facilitates the use of MetaboLab in a broader metabolomics context.

  9. A cost analysis: processing maple syrup products

    Science.gov (United States)

    Neil K. Huyler; Lawrence D. Garrett

    1979-01-01

    A cost analysis of processing maple sap to syrup for three fuel types, oil-, wood-, and LP gas-fired evaporators, indicates that: (1) fuel, capital, and labor are the major cost components of processing sap to syrup; (2) wood-fired evaporators show a slight cost advantage over oil- and LP gas-fired evaporators; however, as the cost of wood approaches $50 per cord, wood...

  10. Digital image processing and analysis human and computer vision applications with CVIPtools

    CERN Document Server

    Umbaugh, Scott E

    2010-01-01

    Section I, Introduction to Digital Image Processing and Analysis: Digital Image Processing and Analysis; Overview; Image Analysis and Computer Vision; Image Processing and Human Vision; Key Points; Exercises; References; Further Reading. Computer Imaging Systems: Imaging Systems Overview; Image Formation and Sensing; CVIPtools Software; Image Representation; Key Points; Exercises; Supplementary Exercises; References; Further Reading. Section II, Digital Image Analysis and Computer Vision: Introduction to Digital Image Analysis; Introduction; Preprocessing; Binary Image Analysis; Key Points; Exercises; Supplementary Exercises; References; Further Read...

  11. Energy analysis handbook. CAC document 214. [Combining process analysis with input-output analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bullard, C. W.; Penner, P. S.; Pilati, D. A.

    1976-10-01

    Methods are presented for calculating the energy required, directly and indirectly, to produce all types of goods and services. Procedures for combining process analysis with input-output analysis are described. This enables the analyst to focus data acquisition cost-effectively, and to achieve a specified degree of accuracy in the results. The report presents sample calculations and provides the tables and charts needed to perform most energy cost calculations, including the cost of systems for producing or conserving energy.
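
    The core computation can be illustrated with a short sketch (a hypothetical 3-sector economy; all numbers are invented): the total direct-plus-indirect energy intensities eps solve eps = e + eps A, i.e. eps = e (I - A)^-1, after which the energy embodied in any final-demand bundle y is eps y:

      import numpy as np

      # Hypothetical 3-sector economy. A[i, j] = dollars of sector i's output
      # needed per dollar of sector j's output; e[j] = direct energy use (MJ/$).
      A = np.array([[0.10, 0.20, 0.05],
                    [0.15, 0.05, 0.10],
                    [0.05, 0.10, 0.02]])
      e = np.array([2.0, 0.5, 1.2])

      # Total energy intensities: eps = e + eps @ A  =>  eps = e @ inv(I - A)
      eps = e @ np.linalg.inv(np.eye(3) - A)

      # Energy embodied in a final-demand bundle y (dollars per sector)
      y = np.array([100.0, 50.0, 25.0])
      print("total intensities (MJ/$):", eps.round(3))
      print("embodied energy (MJ):", float(eps @ y))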

  12. Emotion processing in the visual brain: a MEG analysis.

    Science.gov (United States)

    Peyk, Peter; Schupp, Harald T; Elbert, Thomas; Junghöfer, Markus

    2008-06-01

    Recent functional magnetic resonance imaging (fMRI) and event-related brain potential (ERP) studies provide empirical support for the notion that emotional cues guide selective attention. Extending this line of research, whole head magneto-encephalogram (MEG) was measured while participants viewed in separate experimental blocks a continuous stream of either pleasant and neutral or unpleasant and neutral pictures, presented for 330 ms each. Event-related magnetic fields (ERF) were analyzed after intersubject sensor coregistration, complemented by minimum norm estimates (MNE) to explore neural generator sources. Both streams of analysis converge by demonstrating the selective emotion processing in an early (120-170 ms) and a late time interval (220-310 ms). ERF analysis revealed that the polarity of the emotion difference fields was reversed across early and late intervals suggesting distinct patterns of activation in the visual processing stream. Source analysis revealed the amplified processing of emotional pictures in visual processing areas with more pronounced occipito-parieto-temporal activation in the early time interval, and a stronger engagement of more anterior, temporal, regions in the later interval. Confirming previous ERP studies showing facilitated emotion processing, the present data suggest that MEG provides a complementary look at the spread of activation in the visual processing stream.

  13. 40 CFR 68.67 - Process hazard analysis.

    Science.gov (United States)

    2010-07-01

    ...) Hazard and Operability Study (HAZOP); (5) Failure Mode and Effects Analysis (FMEA); (6) Fault Tree...) The hazards of the process; (2) The identification of any previous incident which had a likely...

  14. Explaining discontinuity in organizational learning : a process analysis

    NARCIS (Netherlands)

    Berends, J.J.; Lammers, I.S.

    2010-01-01

    This paper offers a process analysis of organizational learning as it unfolds in a social and temporal context. Building upon the 4I framework (Crossan et al. 1999), we examine organizational learning processes in a longitudinal case study of an implementation of knowledge management in an

  15. Sampling and sample processing in pesticide residue analysis.

    Science.gov (United States)

    Lehotay, Steven J; Cook, Jo Marie

    2015-05-13

    Proper sampling and sample processing in pesticide residue analysis of food and soil have always been essential to obtain accurate results, but the subject is becoming a greater concern as approximately 100 mg test portions are being analyzed with automated high-throughput analytical methods by agrochemical industry and contract laboratories. As global food trade and the importance of monitoring increase, the food industry and regulatory laboratories are also considering miniaturized high-throughput methods. In conjunction with a summary of the symposium "Residues in Food and Feed - Going from Macro to Micro: The Future of Sample Processing in Residue Analytical Methods" held at the 13th IUPAC International Congress of Pesticide Chemistry, this is an opportune time to review sampling theory and sample processing for pesticide residue analysis. If collected samples and test portions do not adequately represent the actual lot from which they came and provide meaningful results, then all costs, time, and efforts involved in implementing programs using sophisticated analytical instruments and techniques are wasted and can actually yield misleading results. This paper is designed to briefly review the often-neglected but crucial topic of sample collection and processing and put the issue into perspective for the future of pesticide residue analysis. It also emphasizes that analysts should demonstrate the validity of their sample processing approaches for the analytes/matrices of interest and encourages further studies on sampling and sample mass reduction to produce a test portion.

  16. NeoAnalysis: a Python-based toolbox for quick electrophysiological data processing and analysis.

    Science.gov (United States)

    Zhang, Bo; Dai, Ji; Zhang, Tao

    2017-11-13

    In a typical electrophysiological experiment, especially one that includes studying animal behavior, the data collected normally contain spikes, local field potentials, behavioral responses and other associated data. In order to obtain informative results, the data must be analyzed simultaneously with the experimental settings. However, most open-source toolboxes currently available for data analysis were developed to handle only a portion of the data and did not take into account the sorting of experimental conditions. Additionally, these toolboxes require that the input data be in a specific format, which can be inconvenient to users. Therefore, the development of a highly integrated toolbox that can process multiple types of data regardless of input data format and perform basic analysis for general electrophysiological experiments is incredibly useful. Here, we report the development of a Python based open-source toolbox, referred to as NeoAnalysis, to be used for quick electrophysiological data processing and analysis. The toolbox can import data from different data acquisition systems regardless of their formats and automatically combine different types of data into a single file with a standardized format. In cases where additional spike sorting is needed, NeoAnalysis provides a module to perform efficient offline sorting with a user-friendly interface. Then, NeoAnalysis can perform regular analog signal processing, spike train, and local field potentials analysis, behavioral response (e.g. saccade) detection and extraction, with several options available for data plotting and statistics. Particularly, it can automatically generate sorted results without requiring users to manually sort data beforehand. In addition, NeoAnalysis can organize all of the relevant data into an informative table on a trial-by-trial basis for data visualization. Finally, NeoAnalysis supports analysis at the population level. With the multitude of general-purpose functions provided
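
    As a generic illustration of one step such a toolbox automates (plain NumPy, deliberately not the NeoAnalysis API; spike and onset times are synthetic), the following sketch aligns spike times to stimulus onsets and builds a trial-averaged peri-stimulus time histogram:

      import numpy as np

      # Hypothetical data: spike times (s) for one unit, and stimulus onsets (s)
      rng = np.random.default_rng(1)
      spikes = np.sort(rng.uniform(0, 60, size=3000))
      onsets = np.arange(5.0, 55.0, 5.0)

      window = (-0.5, 1.0)   # analysis window around each onset (s)
      bin_w = 0.05           # 50 ms bins
      edges = np.arange(window[0], window[1] + bin_w, bin_w)

      # Align spikes to each trial and accumulate counts per bin
      counts = np.zeros(len(edges) - 1)
      for t0 in onsets:
          rel = spikes[(spikes >= t0 + window[0]) & (spikes < t0 + window[1])] - t0
          counts += np.histogram(rel, bins=edges)[0]

      rate = counts / (len(onsets) * bin_w)   # trial-averaged firing rate (Hz)
      print(rate.round(1))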

  17. Parallel factor analysis PARAFAC of process affected water

    Energy Technology Data Exchange (ETDEWEB)

    Ewanchuk, A.M.; Ulrich, A.C.; Sego, D. [Alberta Univ., Edmonton, AB (Canada). Dept. of Civil and Environmental Engineering; Alostaz, M. [Thurber Engineering Ltd., Calgary, AB (Canada)

    2010-07-01

    A parallel factor analysis (PARAFAC) of oil sands process-affected water was presented. Naphthenic acids (NA) are traditionally described as monobasic carboxylic acids. Research has indicated that oil sands NA do not fit classical definitions of NA. Oil sands organic acids have toxic and corrosive properties. When analyzed by fluorescence technology, oil sands process-affected water displays a characteristic peak at 290 nm excitation and approximately 346 nm emission. In this study, a parallel factor analysis (PARAFAC) was used to decompose process-affected water multi-way data into components representing analytes, chemical compounds, and groups of compounds. Water samples from various oil sands operations were analyzed in order to obtain excitation-emission matrices (EEMs). The EEMs were then arranged into a large matrix in decreasing process-affected water content for PARAFAC. Data were divided into 5 components. A comparison with commercially prepared NA samples suggested that oil sands NA is fundamentally different. Further research is needed to determine what each of the 5 components represents. tabs., figs.
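
    A hedged sketch of the decomposition step (assuming the TensorLy package is available; the EEM stack is synthetic, and recent TensorLy versions return a CPTensor that unpacks into weights and factor matrices): a rank-5 PARAFAC mirroring the five components reported in the record:

      import numpy as np
      import tensorly as tl
      from tensorly.decomposition import parafac

      # Hypothetical EEM stack: 20 water samples x 30 excitation x 40 emission
      rng = np.random.default_rng(2)
      true_A = rng.random((20, 5))   # sample loadings
      true_B = rng.random((30, 5))   # excitation spectra
      true_C = rng.random((40, 5))   # emission spectra
      X = np.einsum('ir,jr,kr->ijk', true_A, true_B, true_C)
      X += 0.01 * rng.standard_normal(X.shape)

      # Rank-5 CP/PARAFAC decomposition of the 3-way data
      weights, factors = parafac(tl.tensor(X), rank=5, n_iter_max=200)
      A, B, C = factors
      print([f.shape for f in factors])   # [(20, 5), (30, 5), (40, 5)]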

  18. A report on digital image processing and analysis

    International Nuclear Information System (INIS)

    Singh, B.; Alex, J.; Haridasan, G.

    1989-01-01

    This report presents developments in software connected with digital image processing and analysis at the Centre. In image processing, one resorts either to alteration of grey level values, so as to enhance features in the image, or to transform domain operations for restoration or filtering. Typical transform domain operations like Karhunen-Loeve transforms are statistical in nature and are used for good registration of images or template matching. Image analysis procedures segment grey level images into images contained within selectable windows, for the purpose of estimating geometrical features in the image, like area, perimeter, projections etc. In short, in image processing both the input and output are images, whereas in image analysis the input is an image and the output is a set of numbers and graphs. (author). 19 refs

  19. Optimization of cryogenic cooled EDM process parameters using grey relational analysis

    International Nuclear Information System (INIS)

    Kumar, S Vinoth; Kumar, M Pradeep

    2014-01-01

    This paper presents an experimental investigation on cryogenic cooling of a copper electrode with liquid nitrogen (LN2) in the electrical discharge machining (EDM) process. The optimization of the EDM process parameters, such as the electrode environment (conventional electrode and cryogenically cooled electrode in EDM), discharge current, pulse on time, and gap voltage, with respect to material removal rate, electrode wear, and surface roughness in machining of an AlSiCp metal matrix composite, was investigated using multiple performance characteristics in grey relational analysis. The L18 orthogonal array was utilized to examine the process parameters, and the optimal levels of the process parameters were identified through grey relational analysis. Experimental data were analyzed through analysis of variance. Scanning electron microscopy analysis was conducted to study the characteristics of the machined surface.
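
    The grey relational calculation itself is compact; the following sketch (hypothetical response data, distinguishing coefficient zeta = 0.5) normalizes the responses, forms grey relational coefficients and ranks the runs by their grey relational grade:

      import numpy as np

      # Hypothetical responses for 6 experimental runs:
      # columns = MRR (larger is better), electrode wear and Ra (smaller is better)
      X = np.array([[12.1, 0.30, 3.2],
                    [14.8, 0.42, 3.8],
                    [11.2, 0.25, 2.9],
                    [16.0, 0.55, 4.1],
                    [13.4, 0.33, 3.0],
                    [15.1, 0.40, 3.5]])
      larger_is_better = np.array([True, False, False])

      # Step 1: grey relational normalization to [0, 1]
      lo, hi = X.min(axis=0), X.max(axis=0)
      N = np.where(larger_is_better, (X - lo) / (hi - lo), (hi - X) / (hi - lo))

      # Step 2: grey relational coefficients, distinguishing coefficient zeta
      delta = 1.0 - N   # deviation from the ideal sequence
      zeta = 0.5
      xi = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

      # Step 3: grey relational grade = mean coefficient per run; rank the runs
      grade = xi.mean(axis=1)
      print("grades:", grade.round(3), "best run:", int(grade.argmax()) + 1)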

  20. Pedagogical issues for effective teaching of biosignal processing and analysis.

    Science.gov (United States)

    Sandham, William A; Hamilton, David J

    2010-01-01

    Biosignal processing and analysis is generally perceived by many students to be a challenging topic to understand, and to become adept with the necessary analytical skills. This is a direct consequence of the high mathematical content involved, and the many abstract features of the topic. The MATLAB and Mathcad software packages offer an excellent algorithm development environment for teaching biosignal processing and analysis modules, and can also be used effectively in many biosignal, and indeed bioengineering, research areas. In this paper, traditional introductory and advanced biosignal processing (and analysis) syllabi are reviewed, and the use of MATLAB and Mathcad for teaching and research is illustrated with a number of examples.

  1. Book: Marine Bioacoustic Signal Processing and Analysis

    Science.gov (United States)

    2011-09-30

    physicists, and mathematicians. However, more and more biologists and psychologists are starting to use advanced signal processing techniques and... chapters than it should be, since the project must be finished by Dec. 31. I have started setting aside 2 hours of uninterrupted time per workday to work

  2. Software Integration of Life Cycle Assessment and Economic Analysis for Process Evaluation

    DEFF Research Database (Denmark)

    Kalakula, Sawitree; Malakula, Pomthong; Siemanonda, Kitipat

    2013-01-01

    This study is focused on the sustainable process design of bioethanol production from cassava rhizome. The study includes: process simulation, sustainability analysis, economic evaluation and life cycle assessment (LCA). A steady state process simulation is performed to generate a base case design of the bioethanol conversion process using cassava rhizome as a feedstock. The sustainability analysis is performed to analyze the relevant indicators in sustainability metrics and to define design/retrofit targets for process improvements. Economic analysis is performed to evaluate the profitability of the process. Also, simultaneously with the sustainability analysis, the life cycle assessment of the environmental impact associated with bioethanol production is performed. Finally, candidate alternative designs are generated and compared with the base case design in terms of LCA, economics, waste, energy usage and environmental impact...

  3. The Design-to-Analysis Process at Sandia National Laboratories Observations and Recommendations; TOPICAL

    International Nuclear Information System (INIS)

    BURNS, SHAWN P.; HARRISON, RANDY J.; DOBRANICH, DEAN

    2001-01-01

    The efficiency of the design-to-analysis process for translating solid-model-based design data to computational analysis model data plays a central role in the application of computational analysis to engineering design and certification. A review of the literature from within Sandia as well as from industry shows that the design-to-analysis process involves a number of complex organizational and technological issues. This study focuses on the design-to-analysis process from a business process standpoint and is intended to generate discussion regarding this important issue. Observations obtained from Sandia staff member and management interviews suggest that the current Sandia design-to-analysis process is not mature and that this cross-organizational issue requires committed high-level ownership. A key recommendation of the study is that additional resources should be provided to the computer aided design organizations to support design-to-analysis. A robust community of practice is also needed to continuously improve the design-to-analysis process and to provide a corporate perspective

  4. Frames and operator theory in analysis and signal processing

    CERN Document Server

    Larson, David R; Nashed, Zuhair; Nguyen, Minh Chuong; Papadakis, Manos

    2008-01-01

    This volume contains articles based on talks presented at the Special Session Frames and Operator Theory in Analysis and Signal Processing, held in San Antonio, Texas, in January of 2006. Recently, the field of frames has undergone tremendous advancement. Most of the work in this field is focused on the design and construction of more versatile frames and frames tailored towards specific applications, e.g., finite dimensional uniform frames for cellular communication. In addition, frames are now becoming a hot topic in mathematical research as a part of many engineering applications, e.g., matching pursuits and greedy algorithms for image and signal processing. Topics covered in this book include: Application of several branches of analysis (e.g., PDEs; Fourier, wavelet, and harmonic analysis; transform techniques; data representations) to industrial and engineering problems, specifically image and signal processing. Theoretical and applied aspects of frames and wavelets. Pure aspects of operator theory empha...

  5. Identifying influential factors of business process performance using dependency analysis

    Science.gov (United States)

    Wetzstein, Branimir; Leitner, Philipp; Rosenberg, Florian; Dustdar, Schahram; Leymann, Frank

    2011-02-01

    We present a comprehensive framework for identifying influential factors of business process performance. In particular, our approach combines monitoring of process events and Quality of Service (QoS) measurements with dependency analysis to effectively identify influential factors. The framework uses data mining techniques to construct tree structures to represent dependencies of a key performance indicator (KPI) on process and QoS metrics. These dependency trees allow business analysts to determine how process KPIs depend on lower-level process metrics and QoS characteristics of the IT infrastructure. The structure of the dependencies enables a drill-down analysis of single factors of influence to gain a deeper knowledge why certain KPI targets are not met.
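
    A minimal sketch of the dependency-tree idea (using scikit-learn decision trees on invented monitoring data, not the authors' framework): fit a shallow regression tree of the KPI on process and QoS metrics and read off the splits and feature importances:

      import numpy as np
      from sklearn.tree import DecisionTreeRegressor, export_text

      # Hypothetical monitoring data: per-instance process and QoS metrics
      rng = np.random.default_rng(3)
      n = 500
      approval_time = rng.uniform(1, 20, n)       # process metric (h)
      service_latency = rng.uniform(50, 500, n)   # QoS metric (ms)
      retries = rng.integers(0, 4, n)             # QoS metric
      # KPI: order fulfilment time, driven mostly by approval time and retries
      kpi = approval_time * 2 + retries * 5 + service_latency / 200 + rng.normal(0, 1, n)

      X = np.column_stack([approval_time, service_latency, retries])
      names = ["approval_time", "service_latency", "retries"]

      tree = DecisionTreeRegressor(max_depth=3).fit(X, kpi)
      print(export_text(tree, feature_names=names))
      print(dict(zip(names, tree.feature_importances_.round(2))))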

  6. A thermo-economic analysis of the separation process of an ethylene plant

    International Nuclear Information System (INIS)

    Yi, T.; Jan-Min, S.

    1989-01-01

    This study has established a model of thermo-economic balance for chemical processes. General rules for forming the exergy-price constraint equation and the equations of some major types of units have been proposed. With this model, a thermo-economic analysis of the separation process of an Ethylene Plant has been carried out. The paper analyses the effects of different boundary exergy prices on the process evaluation. The result shows that a thermo-economic analysis of a process, using the method advanced here, depends only on the process structure and the ratio of supplied exergy prices. As soon as the ratio is well matched, an analogous analysis may be set up for the same type of process in different economic environments

  7. The Digital Image Processing And Quantitative Analysis In Microscopic Image Characterization

    International Nuclear Information System (INIS)

    Ardisasmita, M. Syamsa

    2000-01-01

    Although many electron microscopes produce digital images, not all of them are equipped with a supporting unit to process and analyse image data quantitatively. Generally the analysis of an image has to be made visually and the measurement is carried out manually. The development of mathematical methods for geometric analysis and pattern recognition allows automatic microscopic image analysis by computer. An image processing program can be used for image texture and periodic structure analysis by application of the Fourier transform. Because of the development of composite materials, Fourier analysis in the frequency domain has become important for measuring crystallographic orientation. Periodic structure analysis and crystal orientation are the key to understanding many material properties like mechanical strength, stress, heat conductivity, resistance, capacitance and other electric and magnetic properties of materials. This paper shows the application of digital image processing in microscopic image characterization and analysis.
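
    A short sketch of the Fourier approach (synthetic image, standard NumPy only): the dominant off-centre peak of the 2-D power spectrum yields the period and orientation of a regular structure:

      import numpy as np

      # Synthetic "micrograph": a periodic pattern with period 16 px along x
      y, x = np.mgrid[0:256, 0:256]
      img = np.sin(2 * np.pi * x / 16) + 0.2 * np.random.default_rng(4).standard_normal((256, 256))

      # Power spectrum; the dominant off-centre peak encodes period and orientation
      F = np.fft.fftshift(np.fft.fft2(img - img.mean()))
      P = np.abs(F) ** 2
      P[128, 128] = 0   # suppress any residual DC component

      ky, kx = np.unravel_index(P.argmax(), P.shape)
      fy, fx = (ky - 128) / 256, (kx - 128) / 256   # cycles per pixel
      period = 1 / np.hypot(fx, fy)
      angle = np.degrees(np.arctan2(fy, fx))
      print(f"period ~ {period:.1f} px at {angle:.0f} deg")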

  8. Energy and exergy analysis of the silicon production process

    International Nuclear Information System (INIS)

    Takla, M.; Kamfjord, N.E.; Tveit, Halvard; Kjelstrup, S.

    2013-01-01

    We used energy and exergy analysis to evaluate two industrial and one ideal (theoretical) production process for silicon. The industrial processes were considered in the absence and presence of power production from waste heat in the off-gas. The theoretical process, with pure reactants and no side-reactions, was used to provide a more realistic upper limit of performance for the others. The energy analysis documented the large thermal energy source in the off-gas system, while the exergy analysis documented the potential for efficiency improvement. We found an exergetic efficiency equal to 0.33 ± 0.02 for the process without power production. The value increased to 0.41 ± 0.03 when waste heat was utilized. For the ideal process, we found an exergetic efficiency of 0.51. Utilization of thermal exergy in an off-gas of 800 °C increased this exergetic efficiency to 0.71. Exergy destructed due to combustion of by-product gases and exergy lost with the furnace off-gas were the largest contributors to the thermodynamic inefficiency of all processes. - Highlights: • The exergetic efficiency for an industrial silicon production process when silicon is the only product was estimated to 0.33. • With additional power production from thermal energy in the off-gas we estimated the exergetic efficiency to 0.41. • The theoretical silicon production process is established as the reference case. • Exergy lost with the off-gas and exergy destructed due to combustion account for roughly 75% of the total losses. • With utilization of the thermal exergy in the off-gas at a temperature of 800 °C the exergetic efficiency was 0.71

  9. A comprehensive sensitivity and uncertainty analysis of a milk drying process

    DEFF Research Database (Denmark)

    Ferrari, A.; Gutiérrez, S.; Sin, G.

    2015-01-01

    A simple steady state model of a milk drying process was built to help process understanding. It involves a spray chamber and also internal/external fluid beds. The model was subjected to a statistical analysis for quality assurance using sensitivity analysis (SA) of inputs/parameters and an identifiability analysis (IA) technique. SA results provide evidence towards over-parameterization in the model, and the chamber inlet dry bulb air temperature was the variable (input) with the highest sensitivity. IA results indicated that at most 4 parameters are identifiable: two from the spray chamber and one from each fluid bed dryer...
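
    The flavour of such an analysis can be sketched as follows (a toy surrogate model and invented input distributions, not the authors' dryer model): sample the uncertain inputs by Monte Carlo, propagate them, and rank the inputs by standardized regression coefficients:

      import numpy as np

      # Toy surrogate for outlet powder moisture (stand-in for the real model):
      def outlet_moisture(T_in, feed, solids):
          return 8.0 - 0.04 * T_in + 0.015 * feed - 0.05 * solids

      rng = np.random.default_rng(5)
      n = 2000
      T_in = rng.normal(180, 8, n)    # inlet dry bulb air temperature (deg C)
      feed = rng.normal(60, 5, n)     # feed rate (kg/h)
      solids = rng.normal(48, 2, n)   # feed solids (% w/w)
      ymod = outlet_moisture(T_in, feed, solids)

      # Standardized regression coefficients (SRC) as a sensitivity measure
      X = np.column_stack([T_in, feed, solids])
      Xs = (X - X.mean(0)) / X.std(0)
      ys = (ymod - ymod.mean()) / ymod.std()
      src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
      for name, s in zip(["T_in", "feed", "solids"], src):
          print(f"SRC({name}) = {s:+.2f}")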

  10. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    Science.gov (United States)

    Joshi, D. M.; Patel, H. K.

    2015-10-01

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate the special requirements and therefore can be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure which is below the critical pressure. Cryogenic processes require special attention in terms of the integration of various components like heat exchangers, Joule-Thompson Valve, Turbo expander and Compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for getting the maximum liquefaction of the plant considering different constraints of other parameters. The analysis result so obtained gives clear idea in deciding various parameter values before implementation of the actual plant in the field. It also gives an idea about the productivity and profitability of the given configuration plant which leads to the design of an efficient productive plant.
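
    Alongside a simulator, the ideal Linde-Hampson liquid yield can be checked with a one-line enthalpy balance, y = (h1 - h2)/(h1 - hf); the enthalpy values below are hypothetical placeholders that would normally be read from property tables or from the simulator:

      # Energy balance around the Linde-Hampson cold box (steady state, ideal):
      # feed enters at state 2 (high pressure), liquid leaves at state f,
      # unliquefied gas returns at state 1 (low pressure, ambient temperature).
      # Per unit feed:  h2 = y * hf + (1 - y) * h1  =>  y = (h1 - h2) / (h1 - hf)

      h1 = 462.0   # kJ/kg, return gas at 1 bar, ~300 K (hypothetical value)
      h2 = 432.0   # kJ/kg, compressed feed at 200 bar, ~300 K (hypothetical)
      hf = 29.0    # kJ/kg, saturated liquid (hypothetical)

      y = (h1 - h2) / (h1 - hf)
      print(f"liquid yield per kg of compressed feed: {y:.3f} kg")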

  12. Maintenance Process Strategic Analysis

    Science.gov (United States)

    Jasiulewicz-Kaczmarek, M.; Stachowiak, A.

    2016-08-01

    The performance and competitiveness of manufacturing companies depend on the availability, reliability and productivity of their production facilities. Low productivity, downtime, and poor machine performance are often linked to inadequate plant maintenance, which in turn can lead to reduced production levels, increasing costs, lost market opportunities, and lower profits. These pressures have given firms worldwide the motivation to explore and embrace proactive maintenance strategies over the traditional reactive firefighting methods. The traditional view of maintenance has shifted into an overall view that encompasses Overall Equipment Efficiency, Stakeholders Management and Life Cycle assessment. From a practical point of view it requires changes in the approach to maintenance represented by managers and changes in the actions performed within the maintenance area. Managers have to understand that maintenance is not only about repairs and conservation of machines and devices, but also about actions striving for more efficient resources management and care for the safety and health of employees. The purpose of the work is to present a strategic analysis based on SWOT analysis to identify the opportunities and strengths of the maintenance process, to benefit from them as much as possible, as well as to identify weaknesses and threats, so that they could be eliminated or minimized.

  13. Preliminary Hazards Analysis Plasma Hearth Process

    International Nuclear Information System (INIS)

    Aycock, M.; Coordes, D.; Russell, J.; TenBrook, W.; Yimbo, P.

    1993-11-01

    This Preliminary Hazards Analysis (PHA) for the Plasma Hearth Process (PHP) follows the requirements of United States Department of Energy (DOE) Order 5480.23 (DOE, 1992a), DOE Order 5480.21 (DOE, 1991d), DOE Order 5480.22 (DOE, 1992c), DOE Order 5481.1B (DOE, 1986), and the guidance provided in DOE Standards DOE-STD-1027-92 (DOE, 1992b). Consideration is given to the proposed regulations published as 10 CFR 830 (DOE, 1993) and DOE Safety Guide SG 830.110 (DOE, 1992b). The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during, and provides input to, project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title I and II design. The PSAR then leads to performance of the Final Safety Analysis Report, performed during construction, testing, and acceptance and completed before routine operation. Radiological assessments indicate that a PHP facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant effect on offsite personnel or the environment. Hazardous material assessments indicate that a PHP facility will be a Low Hazard facility having no significant impacts either onsite or offsite to personnel and the environment

  14. Applied Fourier analysis from signal processing to medical imaging

    CERN Document Server

    Olson, Tim

    2017-01-01

    The first of its kind, this focused textbook serves as a self-contained resource for teaching from scratch the fundamental mathematics of Fourier analysis and illustrating some of its most current, interesting applications, including medical imaging and radar processing. Developed by the author from extensive classroom teaching experience, it provides a breadth of theory that allows students to appreciate the utility of the subject, but at as accessible a depth as possible. With myriad applications included, this book can be adapted to a one or two semester course in Fourier Analysis or serve as the basis for independent study. Applied Fourier Analysis assumes no prior knowledge of analysis from its readers, and begins by making the transition from linear algebra to functional analysis. It goes on to cover basic Fourier series and Fourier transforms before delving into applications in sampling and interpolation theory, digital communications, radar processing, medical imaging, and heat and wave equations. Fo...

  15. Inferring Group Processes from Computer-Mediated Affective Text Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Schryver, Jack C [ORNL; Begoli, Edmon [ORNL; Jose, Ajith [Missouri University of Science and Technology; Griffin, Christopher [Pennsylvania State University

    2011-02-01

    Political communications in the form of unstructured text convey rich connotative meaning that can reveal underlying group social processes. Previous research has focused on sentiment analysis at the document level, but we extend this analysis to sub-document levels through a detailed analysis of affective relationships between entities extracted from a document. Instead of pure sentiment analysis, which is just positive or negative, we explore nuances of affective meaning in 22 affect categories. Our affect propagation algorithm automatically calculates and displays extracted affective relationships among entities in graphical form in our prototype (TEAMSTER), starting with seed lists of affect terms. Several useful metrics are defined to infer underlying group processes by aggregating affective relationships discovered in a text. Our approach has been validated with annotated documents from the MPQA corpus, achieving a performance gain of 74% over comparable random guessers.

  16. Model based process-product design and analysis

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    This paper gives a perspective on modelling and the important role it has within product-process design and analysis. Different modelling issues related to the development and application of systematic model-based solution approaches for product-process design are discussed, and the need for a hybrid model-based framework is highlighted. This framework should be able to manage knowledge-data, models, and associated methods and tools integrated with design work-flows and data-flows for specific product-process design problems. In particular, the framework needs to manage models of different types, forms and complexity, together with their associated parameters. An example of a model-based system for the design of chemicals-based formulated products is also given.

  17. Process analysis in a THTR trial reprocessing plant

    International Nuclear Information System (INIS)

    Brodda, B.G.; Filss, P.; Kirchner, H.; Kroth, K.; Lammertz, H.; Schaedlich, W.; Brocke, W.; Buerger, K.; Halling, H.; Watzlawik, K.H.

    1979-01-01

    The demands on an analytical control system for a THTR trial reprocessing plant are specified. In a rather detailed example, a typical sampling, sample monitoring and measuring process is described. Analytical control is partly automated. Data acquisition and evaluation by computer are described for some important, largely automated processes. Sample management and recording of in-line and off-line data are carried out by a data processing system. Some important experiments on sample taking, sample transport and on special analysis are described. (RB) [de

  18. Data Farming Process and Initial Network Analysis Capabilities

    Directory of Open Access Journals (Sweden)

    Gary Horne

    2016-01-01

    Full Text Available Data Farming, network applications and approaches to integrate network analysis and processes to the data farming paradigm are presented as approaches to address complex system questions. Data Farming is a quantified approach that examines questions in large possibility spaces using modeling and simulation. It evaluates whole landscapes of outcomes to draw insights from outcome distributions and outliers. Social network analysis and graph theory are widely used techniques for the evaluation of social systems. Incorporation of these techniques into the data farming process provides analysts examining complex systems with a powerful new suite of tools for more fully exploring and understanding the effect of interactions in complex systems. The integration of network analysis with data farming techniques provides modelers with the capability to gain insight into the effect of network attributes, whether the network is explicitly defined or emergent, on the breadth of the model outcome space and the effect of model inputs on the resultant network statistics.

  19. A meta-analysis and review of holistic face processing.

    Science.gov (United States)

    Richler, Jennifer J; Gauthier, Isabel

    2014-09-01

    The concept of holistic processing is a cornerstone of face recognition research, yet central questions related to holistic processing remain unanswered, and debates have thus far failed to reach a resolution despite accumulating empirical evidence. We argue that a considerable source of confusion in this literature stems from a methodological problem. Specifically, 2 measures of holistic processing based on the composite paradigm (complete design and partial design) are used in the literature, but they often lead to qualitatively different results. First, we present a comprehensive review of the work that directly compares the 2 designs, and which clearly favors the complete design over the partial design. Second, we report a meta-analysis of holistic face processing according to both designs and use this as further evidence for one design over the other. The meta-analysis effect size of holistic processing in the complete design is nearly 3 times that of the partial design. Effect sizes were not correlated between measures, consistent with the suggestion that they do not measure the same thing. Our meta-analysis also examines the correlation between conditions in the complete design of the composite task, and suggests that in an individual differences context, little is gained by including a misaligned baseline. Finally, we offer a comprehensive review of the state of knowledge about holistic processing based on evidence gathered from the measure we favor based on the 1st sections of our review-the complete design-and outline outstanding research questions in that new context. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  20. RISK ANALYSIS IN MILK PROCESSING

    Directory of Open Access Journals (Sweden)

    I. PIRVUTOIU

    2008-05-01

    Full Text Available This paper aimed to evaluate bankruptcy risk using the “Score Method” based on Canon and Holder’s Model. The data were collected from the Balance Sheet and Profit and Loss Account for the period 2005-2007, recorded by a meat processing plant (Rador Commercial Company). The study has put in evidence the financial situation of the company and the level of the main financial ratios underlying the calculation of the Z score function value in the three years. The low values of the Z score function recorded every year reflect that the company is still facing bankruptcy. However, the worst situation was recorded in the years 2005 and 2006, when bankruptcy risk was ranging between 70-80%. In the year 2007, the bankruptcy risk was lower, ranging between 50-70%, as the Z function recorded a value lower than 4. For meat processing companies such an analysis is compulsory at present, as long as the business environment is very risky in our country.

  1. Use of safety analysis results to support process operation

    International Nuclear Information System (INIS)

    Karvonen, I.; Heino, P.

    1990-01-01

    Safety and risk analysis carried out during the design phase of a process plant produces useful knowledge about the behavior and the disturbances of the system. This knowledge, however, often remains with the designer, though it would be of benefit to the operators and supervisors of the process plant, too. At the Technical Research Centre of Finland a project has been started to plan and construct a prototype of an information system to make use of the analysis knowledge during the operation phase. The project belongs to a Nordic KRM project (Knowledge Based Risk Management System). The information system is planned to be based on safety and risk analysis carried out during the design phase and completed with operational experience. The safety analysis includes knowledge about potential disturbances, their causes and consequences in the form of Hazard and Operability Study, fault trees and/or event trees. During operation, disturbances can, however, occur which are not included in the safety analysis, or the causes or consequences of which have been incompletely identified. Thus the information system must also have an interface for the documentation of the operational knowledge missing from the analysis results. The main tasks of the system when supporting the management of a disturbance are to identify it (or the most important of the coexistent ones) from the stored knowledge and to present it in a proper form (for example as a deviation graph). The information system may also be used to transfer knowledge from one shift to another and to train process personnel

  2. Introducing uncertainty analysis of nucleation and crystal growth models in Process Analytical Technology (PAT) system design of crystallization processes

    DEFF Research Database (Denmark)

    Abdul Samad, Noor Asma Fazli Bin; Sin, Gürkan; Gernaey, Krist

    2013-01-01

    This paper presents the application of uncertainty and sensitivity analysis as part of a systematic model-based process monitoring and control (PAT) system design framework for crystallization processes. For the uncertainty analysis, the Monte Carlo procedure is used to propagate input uncertainty...
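
    A hedged sketch of such Monte Carlo input-uncertainty propagation (a textbook power-law growth expression stands in for the paper's kinetic model; the distributions are invented):

      import numpy as np

      # Power-law crystal growth kinetics (a common textbook form, used here
      # only as a stand-in for the paper's model):  G = kg * dC**g
      rng = np.random.default_rng(6)
      n = 5000

      # Input uncertainty: lognormal/normal spread around nominal parameters
      kg = rng.lognormal(np.log(2.0e-7), 0.1, n)   # growth rate constant (m/s)
      g  = rng.normal(1.5, 0.05, n)                # growth order
      dC = 5.0                                     # supersaturation (kg/m3), fixed

      G = kg * dC ** g                             # growth rate samples (m/s)

      p5, p50, p95 = np.percentile(G, [5, 50, 95])
      print(f"G median {p50:.2e} m/s, 90% interval [{p5:.2e}, {p95:.2e}]")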

  3. The process system analysis for advanced spent fuel management technology (I)

    International Nuclear Information System (INIS)

    Lee, H. H.; Lee, J. R.; Kang, D. S.; Seo, C. S.; Shin, Y. J.; Park, S. W.

    1997-12-01

    Various pyrochemical processes were evaluated, and viable options were selected in consideration of proliferation safety, technological feasibility and compatibility with the domestic nuclear power system. Detailed technical analyses were then carried out on the selected options, covering unit process flowsheets including the physico-chemical characteristics of the process systems, preliminary concept development, process design criteria and materials for equipment. Supplementary analyses were also carried out on the support technologies, including sampling and transport technologies for molten salt, design criteria and equipment for glove box systems, and remote operation technologies. (author). 40 refs., 49 tabs., 37 figs

  4. Exergy analysis of the LFC process

    International Nuclear Information System (INIS)

    Li, Qingsong; Lin, Yuankui

    2016-01-01

    Highlights: • Mengdong lignite was upgraded by the liquids from coal (LFC) process at laboratory scale. • True boiling point distillation of tar was performed. • Based on experimental data, the LFC process was simulated in Aspen Plus. • Amounts of exergy destruction and efficiencies of blocks were calculated. • Potential measures for improving the LFC process are suggested. - Abstract: Liquids from coal (LFC) is a pyrolysis technology for upgrading lignite. LFC is close to viability as a large-scale commercial technology and is strongly promoted by the Chinese government. This paper presents an exergy analysis of the LFC process producing semicoke and tar, simulated in Aspen Plus. The simulation included the drying unit, pyrolysis unit, tar recovery unit and combustion unit. To obtain the data required for the simulation, Mengdong lignite was upgraded using a laboratory-scale experimental facility based on LFC technology. True boiling point distillation of tar was performed. Based on thermodynamic data obtained from the simulation, chemical exergy and physical exergy were determined for process streams and exergy destruction was calculated. The exergy budget of the LFC process is presented as a Grassmann flow diagram. The overall exergy efficiency was 76.81%, with the combustion unit causing the highest exergy destruction. The study found that overall exergy efficiency can be increased by reducing moisture in lignite and making full use of physical exergy of pyrolysates. A feasible method for making full use of physical exergy of semicoke was suggested.

  5. Image processing and analysis software development

    International Nuclear Information System (INIS)

    Shahnaz, R.

    1999-01-01

    The work presented in this project is aimed at developing a software package, 'IMAGE GALLERY', to investigate various image processing and analysis techniques. The work was divided into two parts, namely image processing techniques and pattern recognition, the latter comprising character and face recognition. Various image enhancement techniques including negative imaging, contrast stretching, dynamic range compression, neon, diffuse, emboss etc. have been studied. Segmentation techniques including point detection, line detection and edge detection have been studied. Also some of the smoothing and sharpening filters have been investigated. All these imaging techniques have been implemented in a window-based computer program written in Visual Basic. Neural network techniques based on the perceptron model have been applied for face and character recognition. (author)
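
    Two of the listed enhancements, negative imaging and linear contrast stretching, take only a few NumPy lines (a generic sketch, unrelated to the Visual Basic implementation described):

      import numpy as np

      def negative(img):
          """Negative of an 8-bit greyscale image."""
          return 255 - img

      def contrast_stretch(img, lo_pct=2, hi_pct=98):
          """Linear contrast stretch between two percentiles of the histogram."""
          lo, hi = np.percentile(img, [lo_pct, hi_pct])
          out = (img.astype(float) - lo) / max(hi - lo, 1e-9) * 255.0
          return np.clip(out, 0, 255).astype(np.uint8)

      # Hypothetical low-contrast image
      rng = np.random.default_rng(7)
      img = rng.integers(100, 156, size=(64, 64), dtype=np.uint8)
      out = contrast_stretch(img)
      print(img.min(), img.max(), "->", out.min(), out.max())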

  6. Software Process Improvement Using Force Field Analysis ...

    African Journals Online (AJOL)

    An improvement plan is then drawn and implemented. This paper studied the state of Nigerian software development organizations based on selected attributes. Force field analysis is used to partition the factors obtained into driving and restraining forces. An attempt was made to improve the software development process ...

  7. Mathematical methods in time series analysis and digital image processing

    CERN Document Server

    Kurths, J; Maass, P; Timmer, J

    2008-01-01

    The aim of this volume is to bring together research directions in theoretical signal and imaging processing developed rather independently in electrical engineering, theoretical physics, mathematics and the computer sciences. In particular, mathematically justified algorithms and methods, the mathematical analysis of these algorithms and methods, as well as the investigation of connections between methods from time series analysis and image processing are reviewed. An interdisciplinary comparison of these methods, drawing upon common sets of test problems from medicine and geophysical/environmental sciences, is also addressed. This volume coherently summarizes work carried out in the field of theoretical signal and image processing. It focuses on non-linear and non-parametric models for time series as well as on adaptive methods in image processing.

  8. Safety analysis of tritium processing system based on PHA

    International Nuclear Information System (INIS)

    Fu Wanfa; Luo Deli; Tang Tao

    2012-01-01

    A safety analysis of the primary confinement of a tritium processing system for the TBM was carried out using Preliminary Hazard Analysis. Firstly, the basic PHA procedure is given. Then the functions and safety measures of the multiply-confined tritium system are described and briefly analyzed, distinguishing two kinds of boundaries through which tritium transfers: the multiple confinement systems division and the fluid loops division. The analysis of tritium release is the key part of the PHA. A PHA table for tritium release was drawn up, the causes and harmful results were analyzed, and safety measures were put forward. On the basis of the PHA, several kinds of typical accidents were selected for further analysis, and 8 factors influencing tritium safety were analyzed, laying the foundation for quantitatively evaluating the safety grade of various nuclear facilities. (authors)

  9. Processing and analysis techniques involving in-vessel material generation

    Science.gov (United States)

    Schabron, John F [Laramie, WY; Rovani, Jr., Joseph F.

    2012-09-25

    In at least one embodiment, the inventive technology relates to in-vessel generation of a material from a solution of interest as part of a processing and/or analysis operation. Preferred embodiments of the in-vessel material generation (e.g., in-vessel solid material generation) include precipitation; in certain embodiments, analysis and/or processing of the solution of interest may include dissolution of the material, perhaps as part of a successive dissolution protocol using solvents of increasing ability to dissolve. Applications include, but are by no means limited to estimation of a coking onset and solution (e.g., oil) fractionating.

  10. Scattering analysis of point processes and random measures

    International Nuclear Information System (INIS)

    Hanisch, K.H.

    1984-01-01

    In the present paper scattering analysis of point processes and random measures is studied. Known formulae which connect the scattering intensity with the pair distribution function of the studied structures are proved in a rigorous manner with tools of the theory of point processes and random measures. For some special fibre processes the scattering intensity is computed. For a class of random measures, namely for 'grain-germ-models', a new formula is proved which yields the pair distribution function of the 'grain-germ-model' in terms of the pair distribution function of the underlying point process (the 'germs') and of the mean structure factor and the mean squared structure factor of the particles (the 'grains'). (author)
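
    For the stationary isotropic case, the kind of formula referred to here is the textbook relation between the scattering intensity (structure factor) S(q), the point-process intensity rho and the pair correlation function g(r); this is the standard form, not the paper's more general grain-germ result:

      % structure factor of a stationary, isotropic point process
      % with intensity \rho and pair correlation function g(r)
      S(q) = 1 + 4\pi\rho \int_0^{\infty} \bigl( g(r) - 1 \bigr)\, \frac{\sin(qr)}{qr}\, r^2 \, dr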

  11. Quantitative analysis of geomorphic processes using satellite image data at different scales

    Science.gov (United States)

    Williams, R. S., Jr.

    1985-01-01

    When aerial and satellite photographs and images are used in the quantitative analysis of geomorphic processes, either through direct observation of active processes or by analysis of landforms resulting from inferred active or dormant processes, a number of limitations in the use of such data must be considered. Active geomorphic processes work at different scales and rates. Therefore, the capability of imaging an active or dormant process depends primarily on the scale of the process and the spatial-resolution characteristic of the imaging system. Scale is an important factor in recording continuous and discontinuous active geomorphic processes, because what is not recorded will not be considered or even suspected in the analysis of orbital images. If the geomorphic process, or the landform change caused by the process, is less than 200 m in x-to-y dimension, then it will not be recorded. Although the scale factor is critical in the recording of discontinuous active geomorphic processes, the repeat interval of orbital-image acquisition of a planetary surface also is a consideration in order to capture a recurring short-lived geomorphic process or to record changes caused by either a continuous or a discontinuous geomorphic process.

  12. Fit Gap Analysis – The Role of Business Process Reference Models

    Directory of Open Access Journals (Sweden)

    Dejan Pajk

    2013-12-01

    Full Text Available Enterprise resource planning (ERP systems support solutions for standard business processes such as financial, sales, procurement and warehouse. In order to improve the understandability and efficiency of their implementation, ERP vendors have introduced reference models that describe the processes and underlying structure of an ERP system. To select and successfully implement an ERP system, the capabilities of that system have to be compared with a company’s business needs. Based on a comparison, all of the fits and gaps must be identified and further analysed. This step usually forms part of ERP implementation methodologies and is called fit gap analysis. The paper theoretically overviews methods for applying reference models and describes fit gap analysis processes in detail. The paper’s first contribution is its presentation of a fit gap analysis using standard business process modelling notation. The second contribution is the demonstration of a process-based comparison approach between a supply chain process and an ERP system process reference model. In addition to its theoretical contributions, the results can also be practically applied to projects involving the selection and implementation of ERP systems.

  13. Process based analysis of manually controlled drilling processes for bone

    Science.gov (United States)

    Teicher, Uwe; Achour, Anas Ben; Nestler, Andreas; Brosius, Alexander; Lauer, Günter

    2018-05-01

    The machining operation drilling is part of the standard repertoire for medical applications. This machining cycle, which is usually a multi-stage process, generates the geometric element for the subsequent integration of implants, which are screwed into the bone in subsequent processes. In addition to the form, shape and position of the generated drill hole, it is also necessary to use a technology that ensures an operation with minimal damage. A surface damaged by excessive mechanical and thermal energy input shows a deterioration in the healing capacity of implants and represents a structure with complications for inflammatory reactions. The resulting loads are influenced by the material properties of the bone, the used technology and the tool properties. An important aspect of the process analysis is the fact that machining of bone is in most of the cases a manual process that depends mainly on the skills of the operator. This includes, among other things, the machining time for the production of a drill hole, since manual drilling is a force-controlled process. Experimental work was carried out on the bone of a porcine mandible in order to investigate the interrelation of the applied load during drilling. It can be shown that the load application can be subdivided according to the working feed direction. The entire drilling process thus consists of several time domains, which can be divided into the geometry-generating feed motion and a retraction movement of the tool. It has been shown that the removal of the tool from the drill hole has a significant influence on the mechanical load input. This fact is proven in detail by a new evaluation methodology. The causes of this characteristic can also be identified, as well as possible ways of reducing the load input.

  14. Applied Behavior Analysis and Statistical Process Control?

    Science.gov (United States)

    Hopkins, B. L.

    1995-01-01

    Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.…

  15. Statistical process control methods allow the analysis and improvement of anesthesia care.

    Science.gov (United States)

    Fasting, Sigurd; Gisvold, Sven E

    2003-10-01

    Quality aspects of the anesthetic process are reflected in the rate of intraoperative adverse events. The purpose of this report is to illustrate how the quality of the anesthesia process can be analyzed using statistical process control methods, and exemplify how this analysis can be used for quality improvement. We prospectively recorded anesthesia-related data from all anesthetics for five years. The data included intraoperative adverse events, which were graded into four levels, according to severity. We selected four adverse events, representing important quality and safety aspects, for statistical process control analysis. These were: inadequate regional anesthesia, difficult emergence from general anesthesia, intubation difficulties and drug errors. We analyzed the underlying process using 'p-charts' for statistical process control. In 65,170 anesthetics we recorded adverse events in 18.3%; mostly of lesser severity. Control charts were used to define statistically the predictable normal variation in problem rate, and then used as a basis for analysis of the selected problems with the following results: Inadequate plexus anesthesia: stable process, but unacceptably high failure rate; Difficult emergence: unstable process, because of quality improvement efforts; Intubation difficulties: stable process, rate acceptable; Medication errors: methodology not suited because of low rate of errors. By applying statistical process control methods to the analysis of adverse events, we have exemplified how this allows us to determine if a process is stable, whether an intervention is required, and if quality improvement efforts have the desired effect.
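
    The p-chart construction used in such an analysis is brief (illustrative monthly counts; the 3-sigma limits vary with each month's caseload):

      import numpy as np

      # Monthly counts of anesthetics (n) and of a graded adverse event (x)
      n = np.array([1050, 980, 1120, 1005, 990, 1075, 1010, 1060])
      x = np.array([  48,  39,   55,   41,  44,   75,   43,   45])

      p = x / n
      p_bar = x.sum() / n.sum()   # centre line

      # 3-sigma control limits vary with the monthly denominator n_i
      sigma = np.sqrt(p_bar * (1 - p_bar) / n)
      ucl = p_bar + 3 * sigma
      lcl = np.maximum(p_bar - 3 * sigma, 0)

      for i, (pi, u, l) in enumerate(zip(p, ucl, lcl), 1):
          flag = "OUT" if (pi > u or pi < l) else "in control"
          print(f"month {i}: p={pi:.3f} (LCL {l:.3f}, UCL {u:.3f}) {flag}")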

  16. A Comparative Analysis of Extract, Transformation and Loading (ETL) Process

    Science.gov (United States)

    Runtuwene, J. P. A.; Tangkawarow, I. R. H. T.; Manoppo, C. T. M.; Salaki, R. J.

    2018-02-01

    Data and information currently grow rapidly in varying amounts and media. This development will eventually produce large volumes of data, better known as Big Data. Business Intelligence (BI) utilizes large volumes of data and information for analysis so that one can obtain important information. This type of information can be used to support the decision-making process. In practice, a process integrating existing data and information into a data warehouse is needed. This data integration process is known as Extract, Transformation and Loading (ETL). In practice, many applications have been developed to carry out the ETL process, but selecting which applications are more time-, cost- and power-effective and efficient may become a challenge. Therefore, the objective of the study was to provide a comparative analysis of the ETL process using Microsoft SQL Server Integration Services (SSIS) and Pentaho Data Integration (PDI).
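
    The ETL pattern itself is independent of SSIS or PDI; a minimal Python sketch (in-memory source and SQLite target, invented records) shows the three stages:

      import csv, io, sqlite3

      # Extract: a CSV source (inlined here; normally a file or database export)
      src = io.StringIO("id,name,amount\n1,alice,10.5\n2,bob,\n3,carol,7")

      # Transform: type conversion, default for missing values, name cleanup
      rows = []
      for rec in csv.DictReader(src):
          rows.append((int(rec["id"]),
                       rec["name"].strip().title(),
                       float(rec["amount"] or 0.0)))

      # Load: into a warehouse-style target table
      db = sqlite3.connect(":memory:")
      db.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, name TEXT, amount REAL)")
      db.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)
      print(db.execute("SELECT COUNT(*), SUM(amount) FROM sales").fetchone())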

  17. Modeling and analysis of power processing systems: Feasibility investigation and formulation of a methodology

    Science.gov (United States)

    Biess, J. J.; Yu, Y.; Middlebrook, R. D.; Schoenfeld, A. D.

    1974-01-01

    A review is given of future power processing systems planned for the next 20 years, and the state-of-the-art of power processing design modeling and analysis techniques used to optimize power processing systems. A methodology of modeling and analysis of power processing equipment and systems has been formulated to fulfill future tradeoff studies and optimization requirements. Computer techniques were applied to simulate power processor performance and to optimize the design of power processing equipment. A program plan to systematically develop and apply the tools for power processing systems modeling and analysis is presented so that meaningful results can be obtained each year to aid the power processing system engineer and power processing equipment circuit designers in their conceptual and detail design and analysis tasks.

  18. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC)

    Energy Technology Data Exchange (ETDEWEB)

    Gerard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent; Kafrouni, Hanna; Husson, Francois; Aletti, Pierre [Research Center for Automatic Control (CRAN), Nancy University, CNRS, 54516 Vandoeuvre-les-Nancy (France); Department of Medical Physics, Alexis Vautrin Cancer Center, 54511 Vandoeuvre-les-Nancy Cedex (France) and DOSIsoft SA, 94230 Cachan (France); Research Laboratory for Innovative Processes (ERPI), Nancy University, EA 3767, 5400 Nancy Cedex (France); Department of Medical Physics, Alexis Vautrin Cancer Center, 54511 Vandoeuvre-les-Nancy Cedex (France); DOSIsoft SA, 94230 Cachan (France); Research Center for Automatic Control (CRAN), Nancy University, CNRS, 54516 Vandoeuvre-les-Nancy, France and Department of Medical Physics, Alexis Vautrin Cancer Center, 54511 Vandoeuvre-les-Nancy Cedex (France)

    2009-04-15

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: Statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center (±4% of deviation between the calculated and measured doses) by calculating a control process capability (C(pc)) index. The C(pc) index was found superior to 4, which implies that the observed variability of the dose delivery process is not biased by the short term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should

  19. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC).

    Science.gov (United States)

    Gérard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent; Kafrouni, Hanna; Husson, François; Aletti, Pierre

    2009-04-01

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: Statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short-term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center (+/- 4% of deviation between the calculated and measured doses) by calculating a control process capability (C(pc)) index. The C(pc) index was found superior to 4, which implies that the observed variability of the dose delivery process is not biased by the short-term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should improve the
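
    Of the three control charts named, the EWMA chart is the least self-explanatory; the sketch below (invented measurements and in-control parameters, not the study's data) shows how the smoothed statistic and its time-varying limits detect a slow drift:

```python
# EWMA control-chart sketch for QC dose deviations (invented measurements).
import numpy as np

x = np.array([0.5, -1.2, 0.8, 1.5, 2.1, 2.4, 2.8, 3.1])  # % deviation per QC
lam, L = 0.2, 3.0                  # typical smoothing constant and limit width
mu0, sigma = 0.0, 1.5              # assumed in-control mean and std deviation

z = np.zeros_like(x)
prev = mu0
for i, xi in enumerate(x):
    z[i] = lam * xi + (1 - lam) * prev   # exponentially weighted moving average
    prev = z[i]

k = np.arange(1, len(x) + 1)
half_width = L * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * k)))
for i in range(len(x)):
    flag = "DRIFT" if abs(z[i] - mu0) > half_width[i] else "ok"
    print(f"QC {i+1}: z={z[i]:+.2f}  limit=±{half_width[i]:.2f}  {flag}")
```

    Because the EWMA accumulates information across successive controls, it flags the sustained upward drift above before any single measurement exceeds a ±4% clinical tolerance.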

  20. Warpage analysis in injection moulding process

    Science.gov (United States)

    Hidayah, M. H. N.; Shayfull, Z.; Nasir, S. M.; Fathullah, M.; Hazwan, M. H. M.

    2017-09-01

    This study concentrated on the effects of process parameters on the warpage problem in the plastic injection moulding process, using Autodesk Moldflow Insight (AMI) software for the simulation. A plastic dental-floss dispenser was analysed, with the thermoplastic polypropylene (PP) as the moulded material, and the detailed properties of an 80 Tonne Nessei NEX 1000 injection moulding machine were also used in this study. The variable process parameters are packing pressure, packing time, melt temperature and cooling time. Minimization of warpage was obtained from the optimization and analysis of data in the Design Expert software. The method used in this study integrates Response Surface Methodology (RSM) and Central Composite Design (CCD) with polynomial models obtained from Design of Experiments (DOE). The results show that packing pressure is the main factor contributing to the formation of warpage in the x-axis and y-axis, while in the z-axis the main factor is melt temperature; packing time is the least significant of the four parameters in the x-, y- and z-axes. With the optimal processing parameters, the warpage in the x-, y- and z-axes was improved by 21.60%, 26.45% and 24.53%, respectively.

  1. Theoretical analysis of radiographic images by nonstationary Poisson processes

    International Nuclear Information System (INIS)

    Tanaka, Kazuo; Uchida, Suguru; Yamada, Isao.

    1980-01-01

    This paper deals with the noise analysis of radiographic images obtained in the usual fluorescent screen-film system. The theory of nonstationary Poisson processes is applied to the analysis of radiographic images containing object information. The ensemble averages, the autocorrelation functions, and the Wiener spectrum densities of the light-energy distribution at the fluorescent screen and of the film optical-density distribution are obtained. The detection characteristics of the system are evaluated theoretically. Numerical examples for a one-dimensional image are shown, and the results are compared with those obtained under the assumption that the object image is related to the background noise by an additive process. (author)
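
    A nonstationary Poisson process of the kind analyzed here can be simulated by thinning (Lewis-Shedler); the sketch below uses an invented sinusoidal intensity function purely for illustration:

```python
# Thinning sketch for simulating a nonstationary Poisson process
# (the intensity function is invented, not the paper's model).
import math, random

def rate(t):                      # hypothetical time/space-varying intensity
    return 5.0 + 4.0 * math.sin(2 * math.pi * t)

def simulate(T, lam_max):
    events, t = [], 0.0
    while True:
        t += random.expovariate(lam_max)         # candidate from homogeneous process
        if t > T:
            return events
        if random.random() < rate(t) / lam_max:  # accept with prob rate(t)/lam_max
            events.append(t)

random.seed(1)
pts = simulate(T=2.0, lam_max=9.0)               # lam_max must bound rate(t)
print(len(pts), "events;", [round(p, 3) for p in pts[:5]])
```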

  2. Clinical process analysis and activity-based costing at a heart center.

    Science.gov (United States)

    Ridderstolpe, Lisa; Johansson, Andreas; Skau, Tommy; Rutberg, Hans; Ahlfeldt, Hans

    2002-08-01

    Cost studies; productivity, efficiency, and quality-of-care measures; and the links between resources and patient outcomes are fundamental issues for hospital management today. This paper describes the implementation of a model for process analysis and activity-based costing (ABC)/management at a Heart Center in Sweden as a tool for administrative cost information, strategic decision-making, quality improvement, and cost reduction. A commercial software package (QPR) containing two interrelated parts, "ProcessGuide and CostControl," was used. All processes at the Heart Center were mapped and graphically outlined. Processes and activities such as health care procedures, research, and education were identified together with their causal relationship to costs and products/services. The construction of the ABC model in CostControl was time-consuming. However, after the ABC/management system was created, it opened the way for new possibilities, including process and activity analysis, simulation, and price calculations. Cost analysis showed large variations in the cost obtained for individual patients undergoing coronary artery bypass grafting (CABG) surgery. We conclude that a process-based costing system is applicable and has the potential to be useful in hospital management.

  3. A generic Transcriptomics Reporting Framework (TRF) for 'omics data processing and analysis.

    Science.gov (United States)

    Gant, Timothy W; Sauer, Ursula G; Zhang, Shu-Dong; Chorley, Brian N; Hackermüller, Jörg; Perdichizzi, Stefania; Tollefsen, Knut E; van Ravenzwaay, Ben; Yauk, Carole; Tong, Weida; Poole, Alan

    2017-12-01

    A generic Transcriptomics Reporting Framework (TRF) is presented that lists parameters that should be reported in 'omics studies used in a regulatory context. The TRF encompasses the transcriptome profiling process from data generation to a processed list of differentially expressed genes (DEGs) ready for interpretation. Included within the TRF is a reference baseline analysis (RBA) that encompasses raw data selection; data normalisation; recognition of outliers; and statistical analysis. The TRF itself does not dictate the methodology for data processing, but deals with what should be reported. Its principles are also applicable to sequencing data and other 'omics. In contrast, the RBA specifies a simple data processing and analysis methodology that is designed to provide a comparison point for other approaches and is exemplified here by a case study. By providing transparency on the steps applied during 'omics data processing and analysis, the TRF will increase confidence in the processing of 'omics data and in their regulatory use. Applicability of the TRF is ensured by its simplicity and generality. The TRF can be applied to all types of regulatory 'omics studies, and it can be executed using different commonly available software tools.

  4. Analysis of Variance in Statistical Image Processing

    Science.gov (United States)

    Kurz, Ludwik; Hafed Benteftifa, M.

    1997-04-01

    A key problem in practical image processing is the detection of specific features in a noisy image. Analysis of variance (ANOVA) techniques can be very effective in such situations, and this book gives a detailed account of the use of ANOVA in statistical image processing. The book begins by describing the statistical representation of images in the various ANOVA models. The authors present a number of computationally efficient algorithms and techniques to deal with such problems as line, edge, and object detection, as well as image restoration and enhancement. By describing the basic principles of these techniques, and showing their use in specific situations, the book will facilitate the design of new algorithms for particular applications. It will be of great interest to graduate students and engineers in the field of image processing and pattern recognition.
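
    As a small illustration of ANOVA-based detection (a generic sketch, not an algorithm taken from the book), the following tests whether two image regions separated by a hypothesized edge have significantly different means:

```python
# One-way ANOVA sketch for detecting an intensity step (edge) in a noisy image.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(0)
img = rng.normal(100, 5, size=(64, 64))   # background noise
img[:, 32:] += 8                          # inject a vertical edge at column 32

left, right = img[:, :32].ravel(), img[:, 32:].ravel()
F, p = f_oneway(left, right)              # test: do the two region means differ?
print(f"F = {F:.1f}, p = {p:.2e} -> edge detected" if p < 0.01 else "no edge")
```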

  5. Computer Aided Design and Analysis of Separation Processes with Electrolyte Systems

    DEFF Research Database (Denmark)

    A methodology for computer aided design and analysis of separation processes involving electrolyte systems is presented. The methodology consists of three main parts. The thermodynamic part "creates" the problem specific property model package, which is a collection of pure component and mixture property models. The design and analysis part generates process (flowsheet) alternatives, evaluates/analyses feasibility of separation and provides a visual operation path for the desired separation. The simulation part consists of a simulation/calculation engine that allows the screening and validation of process alternatives. For the simulation part, a general multi-purpose, multi-phase separation model has been developed and integrated to an existing computer aided system. Application of the design and analysis methodology is highlighted through two illustrative case studies.

  6. Computer Aided Design and Analysis of Separation Processes with Electrolyte Systems

    DEFF Research Database (Denmark)

    Takano, Kiyoteru; Gani, Rafiqul; Kolar, P.

    2000-01-01

    A methodology for computer aided design and analysis of separation processes involving electrolyte systems is presented. The methodology consists of three main parts. The thermodynamic part 'creates' the problem specific property model package, which is a collection of pure component and mixture property models. The design and analysis part generates process (flowsheet) alternatives, evaluates/analyses feasibility of separation and provides a visual operation path for the desired separation. The simulation part consists of a simulation/calculation engine that allows the screening and validation of process alternatives. For the simulation part, a general multi-purpose, multi-phase separation model has been developed and integrated to an existing computer aided system. Application of the design and analysis methodology is highlighted through two illustrative case studies.

  7. Thermodynamics and process analysis for future economic scenarios

    International Nuclear Information System (INIS)

    Ayres, R.U.

    1995-01-01

    Economists are increasingly interested in forecasting future costs and benefits of policies for dealing with materials/energy fluxes, polluting emissions and environmental impacts on various scales, from sectoral to global. Computable general equilibrium (CGE) models are currently popular because they project demand and industrial structure into the future, along an equilibrium path. But they are applicable only to the extent that structural changes occur in or near equilibrium, independent of radical technological (or social) change. The alternative tool for analyzing economic implications of scenario assumptions is to use Leontief-type Input-Output (I-O) models. I-O models are unable to endogenize structural shifts (changing I-O coefficients). However, this can be a virtue when considering radical rather than incremental shifts. Postulated I-O tables can be used independently to check the internal consistency of scenarios. Or I-O models can be used to generate scenarios by linking them to econometric 'macro-drivers' (which can, in principle, be CGE models). Explicit process analysis can be integrated, in principle, with I-O models. This hybrid scheme provides a natural means of satisfying physical constraints, especially the first and second laws of thermodynamics. This is important, to avoid constructing scenarios based on physically impossible processes. Process analysis is really the only available tool for constructing physically plausible alternative future I-O tables, and generating materials/energy and waste emissions coefficients. Explicit process analysis also helps avoid several problems characteristic of 'pure' CGE or I-O models, viz. (1) aggregation errors (2) inability to handle arbitrary combinations of co-product and co-input relationships and (3) inability to reflect certain non-linearities such as internal feedback loops. 4 figs., 2 tabs., 38 refs
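
    The Leontief quantity model at the core of I-O analysis reduces to a single linear solve; a numeric sketch with an invented two-sector coefficient matrix:

```python
# Leontief I-O sketch: gross output x solves (I - A) x = d for a
# technical-coefficient matrix A and final demand d (numbers invented).
import numpy as np

A = np.array([[0.10, 0.30],     # inputs of sector 1 per unit output of each sector
              [0.25, 0.05]])    # inputs of sector 2 per unit output of each sector
d = np.array([100.0, 50.0])     # final demand by sector

x = np.linalg.solve(np.eye(2) - A, d)
print("gross outputs:", x.round(2))   # total activity needed to meet demand
```

    Replacing A with a postulated future table, as the text suggests, immediately yields the implied materials/energy throughputs, which can then be checked against thermodynamic constraints.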

  8. Cross-Sectional Analysis of Longitudinal Mediation Processes.

    Science.gov (United States)

    O'Laughlin, Kristine D; Martin, Monica J; Ferrer, Emilio

    2018-01-01

    Statistical mediation analysis can help to identify and explain the mechanisms behind psychological processes. Examining a set of variables for mediation effects is a ubiquitous process in the social sciences literature; however, despite evidence suggesting that cross-sectional data can misrepresent the mediation of longitudinal processes, cross-sectional analyses continue to be used in this manner. Alternative longitudinal mediation models, including those rooted in a structural equation modeling framework (cross-lagged panel, latent growth curve, and latent difference score models) are currently available and may provide a better representation of mediation processes for longitudinal data. The purpose of this paper is twofold: first, we provide a comparison of cross-sectional and longitudinal mediation models; second, we advocate using models to evaluate mediation effects that capture the temporal sequence of the process under study. Two separate empirical examples are presented to illustrate differences in the conclusions drawn from cross-sectional and longitudinal mediation analyses. Findings from these examples yielded substantial differences in interpretations between the cross-sectional and longitudinal mediation models considered here. Based on these observations, researchers should use caution when attempting to use cross-sectional data in place of longitudinal data for mediation analyses.
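
    For concreteness, here is a minimal sketch of the cross-sectional mediation estimate the paper cautions about: the product-of-coefficients indirect effect a*b with a bootstrap confidence interval, on synthetic data (a generic sketch, not the authors' models):

```python
# Cross-sectional mediation sketch: indirect effect a*b with a bootstrap CI.
import numpy as np

rng = np.random.default_rng(42)
n = 300
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)            # X -> M (true a = 0.5)
y = 0.4 * m + 0.2 * x + rng.normal(size=n)  # M -> Y (true b = 0.4), direct c' = 0.2

def indirect(ix):
    xs, ms, ys = x[ix], m[ix], y[ix]
    a = np.polyfit(xs, ms, 1)[0]                       # slope of M on X
    Xmat = np.column_stack([ms, xs, np.ones_like(xs)])
    b = np.linalg.lstsq(Xmat, ys, rcond=None)[0][0]    # slope of Y on M given X
    return a * b

boot = [indirect(rng.integers(0, n, n)) for _ in range(2000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect a*b: {indirect(np.arange(n)):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```

    The longitudinal alternatives discussed in the paper replace these two regressions with lagged or latent-change structural equations, so that a and b respect the temporal ordering of the process.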

  9. Integrating human factors into process hazard analysis

    International Nuclear Information System (INIS)

    Kariuki, S.G.; Loewe, K.

    2007-01-01

    A comprehensive process hazard analysis (PHA) needs to address human factors. This paper describes an approach that systematically identifies human error in process design and the human factors that influence its production and propagation. It is deductive in nature and therefore considers human error as a top event. The combinations of different factors that may lead to this top event are analysed. It is qualitative in nature and is used in combination with other PHA methods. The method has an advantage because it does not look at the operator error as the sole contributor to the human failure within a system but a combination of all underlying factors

  10. Marketing Mix Formulation for Higher Education: An Integrated Analysis Employing Analytic Hierarchy Process, Cluster Analysis and Correspondence Analysis

    Science.gov (United States)

    Ho, Hsuan-Fu; Hung, Chia-Chi

    2008-01-01

    Purpose: The purpose of this paper is to examine how a graduate institute at National Chiayi University (NCYU), by using a model that integrates analytic hierarchy process, cluster analysis and correspondence analysis, can develop effective marketing strategies. Design/methodology/approach: This is primarily a quantitative study aimed at…

  11. Image processing analysis of traditional Gestalt vision experiments

    Science.gov (United States)

    McCann, John J.

    2002-06-01

    In the late 19th century, Gestalt Psychology rebelled against the popular new science of Psychophysics. The Gestalt revolution used many fascinating visual examples to illustrate that the whole is greater than the sum of all the parts. Color constancy was an important example. The physical interpretation of sensations and their quantification by JNDs and Weber fractions were met with innumerable examples in which two 'identical' physical stimuli did not look the same. The fact that large changes in the color of the illumination failed to change color appearance in real scenes demanded something more than quantifying the psychophysical response of a single pixel. The debate continues today with proponents of both physical, pixel-based colorimetry and perceptual, image-based cognitive interpretations. Modern instrumentation has made colorimetric pixel measurement universal. As well, new examples of unconscious inference continue to be reported in the literature. Image processing provides a new way of analyzing familiar Gestalt displays. Since the pioneering experiments by Fergus Campbell and Land, we know that human vision has independent spatial channels and independent color channels. Color matching data from color constancy experiments agree with spatial comparison analysis. In this analysis, simple spatial processes can explain the different appearances of 'identical' stimuli by analyzing the multiresolution spatial properties of their surrounds. Benary's Cross, White's Effect, the Checkerboard Illusion and the Dungeon Illusion can all be understood by the analysis of their low-spatial-frequency components. Just as with color constancy, these Gestalt images are most simply described by the analysis of spatial components. Simple spatial mechanisms account for the appearance of 'identical' stimuli in complex scenes. It does not require complex, cognitive processes to calculate appearances in familiar Gestalt experiments.
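
    The low-spatial-frequency analysis referred to above can be approximated with a simple Gaussian low-pass filter; the sketch below (a synthetic display, not one of the cited illusions) shows how two physically identical targets acquire different low-frequency surrounds:

```python
# Low-spatial-frequency sketch: identical grey targets, different coarse surrounds.
import numpy as np
from scipy.ndimage import gaussian_filter

# Synthetic coarse square-wave pattern with two physically identical grey patches.
img = (np.indices((128, 128)).sum(axis=0) % 32 < 16).astype(float)
img[40:48, 40:48] = 0.5                              # target 1
img[80:88, 80:88] = 0.5                              # target 2 (identical grey)

low = gaussian_filter(img, sigma=12)                 # low-frequency component
print("local surround of target 1:", low[44, 44].round(3))
print("local surround of target 2:", low[84, 84].round(3))
# Different low-frequency surrounds predict different appearances of equal greys.
```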

  12. Applications Associated With Morphological Analysis And Generation In Natural Language Processing

    Directory of Open Access Journals (Sweden)

    Neha Yadav

    2017-08-01

    Full Text Available Natural Language Processing is one of the fastest-developing research fields. In most applications related to Natural Language Processing, the findings of Morphological Analysis and Morphological Generation can be considered very important, as morphological analysis is the technique for recognising a word and its output can be used in later processing stages. Keeping this importance in view, this paper describes how Morphological Analysis and Morphological Generation can serve as an important part of various Natural Language Processing applications such as spell checkers and machine translation.

  13. A software package for biomedical image processing and analysis

    International Nuclear Information System (INIS)

    Goncalves, J.G.M.; Mealha, O.

    1988-01-01

    The decreasing cost of computing power and the introduction of low cost imaging boards justify the increasing number of applications of digital image processing techniques in the area of biomedicine. There is, however, a large software gap to be filled between the application and the equipment. The requirements to bridge this gap are twofold: good knowledge of the hardware provided and its interface to the host computer, and expertise in digital image processing and analysis techniques. A software package incorporating these two requirements was developed using the C programming language, in order to create a user friendly image processing programming environment. The software package can be considered in two different ways: as a data structure adapted to image processing and analysis, which acts as the backbone and the standard of communication for all the software; and as a set of routines implementing the basic algorithms used in image processing and analysis. Hardware dependency is restricted to a single module upon which all hardware calls are based. The data structure that was built has four main features: it is hierarchical, open, object oriented, and has object dependent dimensions. Considering the vast amount of memory needed by imaging applications and the memory available in small imaging systems, an effective image memory management scheme was implemented. This software package has been used for more than one and a half years by users with different applications. It proved to be an excellent tool for helping people adapt to the system, and for standardizing and exchanging software, while preserving the flexibility that allows for users' specific implementations. The philosophy of the software package is discussed and the data structure that was built is described in detail.

  14. Engineering analysis of the two-stage trifluoride precipitation process

    International Nuclear Information System (INIS)

    Luerkens, D.w.W.

    1984-06-01

    An engineering analysis of two-stage trifluoride precipitation processes is developed. Precipitation kinetics are modeled using consecutive reactions to represent fluoride complexation. Material balances across the precipitators are used to model the time dependent concentration profiles of the main chemical species. The results of the engineering analysis are correlated with previous experimental work on plutonium trifluoride and cerium trifluoride
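
    The consecutive-reaction kinetics mentioned in the abstract follow the classic A → B → C scheme; here is a sketch with assumed first-order rate constants (not the report's parameters):

```python
# Consecutive-reaction kinetics sketch (A -> B -> C), as used to model
# fluoride complexation; rate constants are assumed for illustration.
import numpy as np
from scipy.integrate import solve_ivp

k1, k2 = 1.0, 0.4   # assumed first-order rate constants

def rhs(t, y):
    A, B, C = y
    return [-k1 * A, k1 * A - k2 * B, k2 * B]

sol = solve_ivp(rhs, (0.0, 10.0), [1.0, 0.0, 0.0], t_eval=np.linspace(0, 10, 6))
for t, (A, B, C) in zip(sol.t, sol.y.T):
    print(f"t={t:4.1f}  A={A:.3f}  B={B:.3f}  C={C:.3f}")
```

    Coupling such rate expressions to material balances across the precipitators yields the time dependent concentration profiles the analysis correlates with the experiments.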

  15. Analysis of electrochemical disintegration process of graphite matrix

    International Nuclear Information System (INIS)

    Tian Lifang; Wen Mingfen; Chen Jing

    2010-01-01

    The electrochemical method with ammonium nitrate as the electrolyte was studied as a means to disintegrate the graphite matrix of simulated fuel elements for the high temperature gas-cooled reactor. The influences of process parameters, including salt concentration, system temperature and current density, on the disintegration rate of graphite fragments were investigated in the present work. The experimental results showed that the disintegration rate depended only slightly on the temperature and salt concentration, while the current density strongly affected it. Furthermore, the content of introduced oxygen in the final graphite fragments was independent of the current density and the concentration of the electrolyte. Moreover, the structural evolution of the graphite was analyzed before and after the disintegration process, based on the microstructural parameters determined by X-ray diffraction profile fitting analysis using MAUD (material analysis using diffraction). It may safely be concluded that the graphite disintegration can be ascribed to the combined influences of the intercalation of foreign molecules between the crystal planes and the partial oxidation involved. The disintegration process is thus described as an intercalation step and a further partial oxidation of carbon, which act together to cause the collapse of the graphite crystals.

  16. Sensorial analysis of peanuts processed by e-beam

    International Nuclear Information System (INIS)

    Silva, Priscila V.; Furgeri, Camilo; Salum, Debora C.; Rogovschi, Vladimir D.; Villavicencio, Anna Lucia C.H.

    2007-01-01

    The development of sensorial analysis was influenced by frequent changes in the technology of food production and distribution. Currently, sensorial analysis plays a decisive part in some sectors of the food industry, with the purpose of improving the quality of its products. Food irradiation aims to improve product quality by eliminating the various microorganisms that can spoil food. The irradiation process at the recommended doses causes very few chemical alterations in foods, the nutritional losses are considered insignificant, and the known alterations found in irradiated foods are not harmful or dangerous. The present study evaluated the sensorial characteristics of peanuts processed by an electron beam machine, using an acceptance test with a hedonic scale. Peanut samples were processed at doses of 0, 5 and 7 kGy. Thirty volunteer panelists participated in the acceptance study. The evaluated parameters were appearance, odor and flavor. The results showed that consumers approved of the peanuts irradiated at 5 and 7 kGy, with no significant difference between the control and irradiated samples. (author)

  17. Sensorial analysis of peanuts processed by e-beam

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Priscila V.; Furgeri, Camilo; Salum, Debora C.; Rogovschi, Vladimir D.; Villavicencio, Anna Lucia C.H. [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)]. E-mail: villavic@ipen.br

    2007-07-01

    The development of sensorial analysis was influenced by frequent changes in the technology of food production and distribution. Currently, sensorial analysis plays a decisive part in some sectors of the food industry, with the purpose of improving the quality of its products. Food irradiation aims to improve product quality by eliminating the various microorganisms that can spoil food. The irradiation process at the recommended doses causes very few chemical alterations in foods, the nutritional losses are considered insignificant, and the known alterations found in irradiated foods are not harmful or dangerous. The present study evaluated the sensorial characteristics of peanuts processed by an electron beam machine, using an acceptance test with a hedonic scale. Peanut samples were processed at doses of 0, 5 and 7 kGy. Thirty volunteer panelists participated in the acceptance study. The evaluated parameters were appearance, odor and flavor. The results showed that consumers approved of the peanuts irradiated at 5 and 7 kGy, with no significant difference between the control and irradiated samples. (author)

  18. Emotional Processing, Interaction Process, and Outcome in Clarification-Oriented Psychotherapy for Personality Disorders: A Process-Outcome Analysis.

    Science.gov (United States)

    Kramer, Ueli; Pascual-Leone, Antonio; Rohde, Kristina B; Sachse, Rainer

    2016-06-01

    It is important to understand the change processes involved in psychotherapies for patients with personality disorders (PDs). One patient process that promises to be useful in relation to the outcome of psychotherapy is emotional processing. In the present process-outcome analysis, we examine this question by using a sequential model of emotional processing and by additionally taking into account a therapist's appropriate responsiveness to a patient's presentation in clarification-oriented psychotherapy (COP), a humanistic-experiential form of therapy. The present study involved 39 patients with a range of PDs undergoing COP. Session 25 was assessed as part of the working phase of each therapy by external raters in terms of emotional processing using the Classification of Affective-Meaning States (CAMS) and in terms of the overall quality of therapist-patient interaction using the Process-Content-Relationship Scale (BIBS). Treatment outcome was assessed pre- and post-therapy using the Global Severity Index (GSI) of the SCL-90-R and the BDI. Results indicate that the good outcome cases showed more self-compassion, more rejecting anger, and a higher quality of therapist-patient interaction compared to poorer outcome cases. For good outcome cases, emotional processing predicted 18% of symptom change at the end of treatment, which was not found for poor outcome cases. These results are discussed within the framework of an integrative understanding of emotional processing as an underlying mechanism of change in COP, and perhaps in other effective therapy approaches for PDs.

  19. Synthesis and analysis of a closed cycle methane-fueled marine energy process

    International Nuclear Information System (INIS)

    Teich, C.I.

    1983-01-01

    A marine energy system has been synthesized from state-of-the-art technology to convert nuclear derived electricity into liquefied methane. In the first part of the process, the on-board process, liquid methane is burned in a combined gas turbine-steam turbine system to provide propulsion power and the carbon dioxide created during combustion recovered. In the second part of the process, the fuel regeneration process, the methane is regenerated in a centralized land-based facility by the reaction of the recovered carbon dioxide with hydrogen obtained from nuclear-powered electrolysis of water. The system was analyzed by combining thermodynamic available energy analysis and an approximate preliminary design. The available energy analysis of the combined system established the thermodynamic feasibility of the methane-carbon dioxide cycle and resulted in various process improvements because of the inefficiencies disclosed by the analysis. The more critical on-board process was analyzed and developed further by a capital cost optimization and ranking alternate process options by their available energy consumptions. The optimal on-board process, whose capital cost is 16% less than the preliminary design, has an effectiveness of 47% and the fuel regeneration process an effectiveness of 56%. It was also found that the process cost was proportional to the horsepower raised to the seven-tenths power

  20. A 'Toolbox' Equivalent Process for Safety Analysis Software

    International Nuclear Information System (INIS)

    O'Kula, K.R.; Eng, Tony

    2004-01-01

    Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2002-1 (Quality Assurance for Safety-Related Software) identified a number of quality assurance issues on the use of software in Department of Energy (DOE) facilities for analyzing hazards, and designing and operating controls that prevent or mitigate potential accidents. The development and maintenance of a collection, or 'toolbox', of multiple-site use, standard solution, Software Quality Assurance (SQA)-compliant safety software is one of the major improvements identified in the associated DOE Implementation Plan (IP). The DOE safety analysis toolbox will contain a set of appropriately quality-assured, configuration-controlled, safety analysis codes, recognized for DOE-broad, safety basis applications. Currently, six widely applied safety analysis computer codes have been designated for toolbox consideration. While the toolbox concept considerably reduces SQA burdens among DOE users of these codes, many users of unique, single-purpose, or single-site software may still have sufficient technical justification to continue use of their computer code of choice, but are thwarted by the multiple-site condition on toolbox candidate software. The process discussed here provides a roadmap for an equivalency argument, i.e., establishing satisfactory SQA credentials for single-site software that can be deemed ''toolbox-equivalent''. The process is based on the model established to meet IP Commitment 4.2.1.2: Establish SQA criteria for the safety analysis ''toolbox'' codes. Implementing criteria that establish the set of prescriptive SQA requirements are based on implementation plan/procedures from the Savannah River Site, also incorporating aspects of those from the Waste Isolation Pilot Plant (SNL component) and the Yucca Mountain Project. The major requirements are met with evidence of a software quality assurance plan, software requirements and design documentation, user's instructions, test report, a

  1. Development of Probabilistic Structural Analysis Integrated with Manufacturing Processes

    Science.gov (United States)

    Pai, Shantaram S.; Nagpal, Vinod K.

    2007-01-01

    An effort has been initiated to integrate manufacturing process simulations with probabilistic structural analyses in order to capture the important impacts of manufacturing uncertainties on component stress levels and life. Two physics-based manufacturing process models (one for powdered metal forging and the other for annular deformation resistance welding) have been linked to the NESSUS structural analysis code. This paper describes the methodology developed to perform this integration including several examples. Although this effort is still underway, particularly for full integration of a probabilistic analysis, the progress to date has been encouraging and a software interface that implements the methodology has been developed. The purpose of this paper is to report this preliminary development.
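
    The essence of such an integration is propagating manufacturing scatter through a structural model; here is a generic Monte Carlo sketch (invented distributions and a simple axial-stress model, not the NESSUS linkage itself):

```python
# Monte Carlo sketch: propagate manufacturing uncertainty into stress and
# failure probability (all distributions and the stress model are invented).
import numpy as np

rng = np.random.default_rng(7)
N = 100_000
thickness = rng.normal(5.0, 0.15, N)        # mm, scatter from the forming process
load = rng.normal(2000.0, 100.0, N)         # N, service load scatter
width = 20.0                                # mm, assumed deterministic

stress = load / (width * thickness)         # MPa, simple axial-stress model
strength = rng.normal(25.0, 1.5, N)         # MPa, material strength scatter

p_fail = np.mean(stress > strength)
print(f"mean stress {stress.mean():.2f} MPa, P(failure) ~ {p_fail:.4f}")
```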

  2. PRO-ELICERE: A Hazard Analysis Automation Process Applied to Space Systems

    Directory of Open Access Journals (Sweden)

    Tharcius Augusto Pivetta

    2016-07-01

    Full Text Available In the last decades, critical systems have increasingly been developed using computers and software even in space area, where the project approach is usually very conservative. In the projects of rockets, satellites and its facilities, like ground support systems, simulators, among other critical operations for the space mission, it must be applied a hazard analysis. The ELICERE process was created to perform a hazard analysis mainly over computer critical systems, in order to define or evaluate its safety and dependability requirements, strongly based on Hazards and Operability Study and Failure Mode and Effect Analysis techniques. It aims to improve the project design or understand the potential hazards of existing systems improving their functions related to functional or non-functional requirements. Then, the main goal of the ELICERE process is to ensure the safety and dependability goals of a space mission. The process, at the beginning, was created to operate manually in a gradual way. Nowadays, a software tool called PRO-ELICERE was developed, in such a way to facilitate the analysis process and store the results for reuse in another system analysis. To understand how ELICERE works and its tool, a small example of space study case was applied, based on a hypothetical rocket of the Cruzeiro do Sul family, developed by the Instituto de Aeronáutica e Espaço in Brazil.

  3. Contingency Analysis Post-Processing With Advanced Computing and Visualization

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yousu; Glaesemann, Kurt; Fitzhenry, Erin

    2017-07-01

    Contingency analysis is a critical function widely used in energy management systems to assess the impact of power system component failures. Its outputs are important for power system operation for improved situational awareness, power system planning studies, and power market operations. With the increased complexity of power system modeling and simulation caused by increased energy production and demand, the penetration of renewable energy and fast deployment of smart grid devices, and the trend of operating grids closer to their capacity for better efficiency, more and more contingencies must be executed and analyzed quickly in order to ensure grid reliability and accuracy for the power market. Currently, many researchers have proposed different techniques to accelerate the computational speed of contingency analysis, but not much work has been published on how to post-process the large amount of contingency outputs quickly. This paper proposes a parallel post-processing function that can analyze contingency analysis outputs faster and display them in a web-based visualization tool to help power engineers improve their work efficiency by fast information digestion. Case studies using an ESCA-60 bus system and a WECC planning system are presented to demonstrate the functionality of the parallel post-processing technique and the web-based visualization tool.
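
    A minimal sketch of the parallel post-processing idea follows (the contingency records, limits and worker function below are all invented, not the paper's implementation):

```python
# Parallel post-processing sketch: scan contingency outputs for limit
# violations in worker processes (data format and limits invented).
import multiprocessing as mp

LIMIT = 1.05  # per-unit voltage limit, assumed

def scan(contingency):
    name, voltages = contingency
    worst = max(voltages)
    return (name, worst) if worst > LIMIT else None

if __name__ == "__main__":
    results = [("line-12 out", [1.01, 1.07, 0.99]),
               ("gen-3 out",  [1.00, 1.02, 1.03]),
               ("xfmr-7 out", [1.06, 1.04, 1.08])]
    with mp.Pool(processes=4) as pool:
        violations = [v for v in pool.map(scan, results) if v is not None]
    print("violations:", violations)
```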

  4. Multivariate statistical analysis of a multi-step industrial processes

    DEFF Research Database (Denmark)

    Reinikainen, S.P.; Høskuldsson, Agnar

    2007-01-01

    Monitoring and quality control of industrial processes often produce information on how the data have been obtained. In batch processes, for instance, the process is carried out in stages; some process or control parameters are set at each stage. However, the obtained data might not be utilized...... efficiently, even if this information may reveal significant knowledge about process dynamics or ongoing phenomena. When studying the process data, it may be important to analyse the data in the light of the physical or time-wise development of each process step. In this paper, a unified approach to analyse...... multivariate multi-step processes, where results from each step are used to evaluate future results, is presented. The methods presented are based on Priority PLS Regression. The basic idea is to compute the weights in the regression analysis for given steps, but adjust all data by the resulting score vectors...

  5. The MicroAnalysis Toolkit: X-ray Fluorescence Image Processing Software

    International Nuclear Information System (INIS)

    Webb, S. M.

    2011-01-01

    The MicroAnalysis Toolkit is an analysis suite designed for the processing of x-ray fluorescence microprobe data. The program contains a wide variety of analysis tools, including image maps, correlation plots, simple image math, image filtering, multiple energy image fitting, semi-quantitative elemental analysis, x-ray fluorescence spectrum analysis, principle component analysis, and tomographic reconstructions. To be as widely useful as possible, data formats from many synchrotron sources can be read by the program with more formats available by request. An overview of the most common features will be presented.

  6. Process Equipment Failure Mode Analysis in a Chemical Industry

    Directory of Open Access Journals (Sweden)

    J. Nasl Seraji

    2008-04-01

    Full Text Available Background and aims: Prevention of potential accidents and promotion of safety in chemical processes require systematic safety management. The main objective of this study was the analysis of the failure modes and effects of important process equipment components in the process of isolating H2S and CO2 from extracted natural gas. Methods: This study was done in the sweetening unit of an Iranian gas refinery. Failure Mode and Effect Analysis (FMEA) was used for the identification of process equipment failures. Results: In total, 30 failures were identified and evaluated using FMEA. Breaking of the P-1 blower's blades and tight movement of the sour-gas pressure control valve bearing had the maximum Risk Priority Numbers (RPN); corrosion of the P-1 body and the increasing lower-side plug angle of the rich-DEA level control valve in tower-1 had the minimum calculated RPNs. Conclusion: By providing a reliable documentation system for recording equipment failures and incidents, the basic information for later safety assessments would be maintained. Also, the probability of failures and their effects could be minimized by conducting preventive maintenance.
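
    The ranking behind these results is simple arithmetic, RPN = Severity × Occurrence × Detection; a sketch with invented 1-10 scores for failure modes echoing the abstract:

```python
# RPN arithmetic behind the FMEA ranking (scores are invented, not the study's).
failure_modes = [
    ("P-1 blower blade breaking",        9, 4, 6),   # (name, S, O, D)
    ("Sour-gas PCV bearing tight move",  7, 5, 7),
    ("P-1 body corrosion",               5, 2, 3),
    ("Rich-DEA LCV plug angle increase", 4, 2, 4),
]

ranked = sorted(((s * o * d, name) for name, s, o, d in failure_modes), reverse=True)
for rpn, name in ranked:
    print(f"RPN {rpn:3d}  {name}")
```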

  7. Data processing of X-ray fluorescence analysis using an electronic computer

    International Nuclear Information System (INIS)

    Yakubovich, A.L.; Przhiyalovskij, S.M.; Tsameryan, G.N.; Golubnichij, G.V.; Nikitin, S.A.

    1979-01-01

    Considered are problems of data processing of multi-element (for 17 elements) X-ray fluorescence analysis of tungsten and molybdenum ores. The analysis was carried out using silicon-lithium spectrometer with the energy resolution of about 300 eV and a 1024-channel analyzer. A characteristic radiation of elements was excited with two 109 Cd radioisotope sources, their general activity being 10 mCi. The period of measurements was 400 s. The data obtained were processed with a computer using the ''Proba-1'' and ''Proba-2'' programs. Data processing algorithms and computer calculation results are presented

  8. Surplus analysis of Sparre Andersen insurance risk processes

    CERN Document Server

    Willmot, Gordon E

    2017-01-01

    This carefully written monograph covers the Sparre Andersen process in an actuarial context using the renewal process as the model for claim counts. A unified reference on Sparre Andersen (renewal risk) processes is included, often missing from existing literature. The authors explore recent results and analyse various risk theoretic quantities associated with the event of ruin, including the time of ruin and the deficit of ruin. Particular attention is given to the explicit identification of defective renewal equation components, which are needed to analyse various risk theoretic quantities and are also relevant in other subject areas of applied probability such as dams and storage processes, as well as queuing theory. Aimed at researchers interested in risk/ruin theory and related areas, this work will also appeal to graduate students in classical and modern risk theory and Gerber-Shiu analysis.
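
    The central object of the monograph, the Sparre Andersen surplus process, is easy to simulate; here is a finite-horizon Monte Carlo estimate of the ruin probability, with assumed gamma interarrival times and exponential claims:

```python
# Monte Carlo sketch of a Sparre Andersen (renewal risk) surplus process:
# gamma interarrivals, exponential claims, premium rate c (parameters assumed).
import random

def ruin_before(u, c, T, rng):
    """One path of U(t) = u + c*t - S(t); ruin can only occur at claim epochs."""
    t, claims = 0.0, 0.0
    while True:
        t += rng.gammavariate(2.0, 0.5)    # renewal interarrival (mean 1.0)
        if t > T:
            return False
        claims += rng.expovariate(1.0)     # claim size (mean 1.0)
        if u + c * t - claims < 0:
            return True

rng = random.Random(3)
n = 10_000
psi = sum(ruin_before(u=5.0, c=1.2, T=100.0, rng=rng) for _ in range(n)) / n
print(f"estimated finite-horizon ruin probability: {psi:.4f}")
```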

  9. A dataflow analysis tool for parallel processing of algorithms

    Science.gov (United States)

    Jones, Robert L., III

    1993-01-01

    A graph-theoretic design process and software tool is presented for selecting a multiprocessing scheduling solution for a class of computational problems. The problems of interest are those that can be described using a dataflow graph and are intended to be executed repetitively on a set of identical parallel processors. Typical applications include signal processing and control law problems. Graph analysis techniques are introduced and shown to effectively determine performance bounds, scheduling constraints, and resource requirements. The software tool is shown to facilitate the application of the design process to a given problem.
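
    One of the graph-theoretic bounds such a tool computes is the critical (longest) path, which lower-bounds the schedule length no matter how many processors are available; a sketch on an invented dataflow graph:

```python
# Critical-path sketch for a dataflow graph (node weights and edges invented).
from functools import lru_cache

# node -> (execution time, successors); a small synthetic dataflow DAG
graph = {"A": (2, ["B", "C"]), "B": (3, ["D"]), "C": (1, ["D"]), "D": (2, [])}

@lru_cache(maxsize=None)
def longest_from(node):
    time, succs = graph[node]
    return time + max((longest_from(s) for s in succs), default=0)

print("critical path length:", max(longest_from(n) for n in graph))  # A-B-D = 7
```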

  10. DIII-D Thomson Scattering Diagnostic Data Acquisition, Processing and Analysis Software

    International Nuclear Information System (INIS)

    Middaugh, K.R.; Bray, B.D.; Hsieh, C.L.; McHarg, B.B.Jr.; Penaflor, B.G.

    1999-01-01

    One of the diagnostic systems critical to the success of the DIII-D tokamak experiment is the Thomson scattering diagnostic. This diagnostic is unique in that it measures local electron temperature and density: (1) at multiple locations within the tokamak plasma; and (2) at different times throughout the plasma duration. Thomson ''raw'' data are digitized signals of scattered light, measured at different times and locations, from the laser beam paths fired into the plasma. Real-time acquisition of this data is performed by specialized hardware. Once obtained, the raw data are processed into meaningful temperature and density values which can be analyzed for measurement quality. This paper will provide an overview of the entire Thomson scattering diagnostic software and will focus on the data acquisition, processing, and analysis software implementation. The software falls into three general categories: (1) Set-up and Control: Initializes and controls all Thomson hardware and software, synchronizes with other DIII-D computers, and invokes other Thomson software as appropriate. (2) Data Acquisition and Processing: Obtains raw measured data from memory and processes it into temperature and density values. (3) Analysis: Provides a graphical user interface in which to perform analysis and sophisticated plotting of analysis parameters

  11. Exergetic analysis of a biodiesel production process from Jatropha curcas

    International Nuclear Information System (INIS)

    Blanco-Marigorta, A.M.; Suárez-Medina, J.; Vera-Castellano, A.

    2013-01-01

    Highlights: ► Exergetic analysis of a biodiesel production process from Jatropha curcas. ► About 95% of the inefficiencies are located in the transesterification reactor. ► The exergetic efficiency of the steam generator amounts to 37.6%. ► Chemical reactions cause most of the irreversibilities of the process. ► The exergetic efficiency of the overall process is over 63%. -- Abstract: As fossil fuels are depleting day by day, it is necessary to find alternative fuels to fulfill the energy demand of the world. Biodiesel is considered an environmentally friendly renewable alternative to diesel fuel. Interest in using Jatropha curcas as a feedstock for the production of biodiesel is growing rapidly. On the one hand, J. curcas' oil does not compete with the food sector, due to its toxic nature and to the fact that it must be cultivated in marginal/poor soil. On the other, its price is low and stable. In the last decade, investigation of biodiesel production centered on the choice of suitable raw materials and on the optimization of the process operating conditions. Nowadays, research is focused on improving the energetic performance and on diminishing the inefficiencies in the different process components. The method of exergy analysis is well suited for furthering this goal, for it is a powerful tool for developing, evaluating and improving an energy conversion system. In this work, we identify the location, magnitude and sources of thermodynamic inefficiencies in a biodiesel production process from J. curcas by means of an exergy analysis. The thermodynamic properties were calculated from existing databases or estimated when necessary. The highest exergy destruction takes place in the transesterification reactor, due to chemical reactions; almost 95% of the exergy of the fuel is destroyed in this reactor. The exergetic efficiency of the overall process is 63%.

  12. Global processing takes time: A meta-analysis on local-global visual processing in ASD.

    Science.gov (United States)

    Van der Hallen, Ruth; Evers, Kris; Brewaeys, Katrien; Van den Noortgate, Wim; Wagemans, Johan

    2015-05-01

    What does an individual with autism spectrum disorder (ASD) perceive first: the forest or the trees? In spite of 30 years of research and influential theories like the weak central coherence (WCC) theory and the enhanced perceptual functioning (EPF) account, the interplay of local and global visual processing in ASD remains only partly understood. Research findings vary in indicating a local processing bias or a global processing deficit, and often contradict each other. We have applied a formal meta-analytic approach and combined 56 articles that tested about 1,000 ASD participants and used a wide range of stimuli and tasks to investigate local and global visual processing in ASD. Overall, results show no enhanced local visual processing nor a deficit in global visual processing. Detailed analysis reveals a difference in the temporal pattern of the local-global balance, that is, slow global processing in individuals with ASD. Whereas task-dependent interaction effects are obtained, gender, age, and IQ of either participant groups seem to have no direct influence on performance. Based on the overview of the literature, suggestions are made for future research.
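
    The pooling machinery behind a formal meta-analysis of this kind is typically a random-effects model; here is a DerSimonian-Laird sketch on invented effect sizes (not the 56 studies analyzed in the paper):

```python
# DerSimonian-Laird random-effects pooling sketch (effect sizes invented).
import numpy as np

g = np.array([0.30, -0.10, 0.45, 0.05, 0.20])   # per-study effect sizes (Hedges g)
v = np.array([0.04, 0.06, 0.05, 0.03, 0.08])    # per-study sampling variances

w = 1 / v
q = np.sum(w * (g - np.sum(w * g) / w.sum()) ** 2)              # Cochran's Q
df = len(g) - 1
tau2 = max(0.0, (q - df) / (w.sum() - np.sum(w**2) / w.sum()))  # between-study var

w_re = 1 / (v + tau2)                            # random-effects weights
pooled = np.sum(w_re * g) / w_re.sum()
se = np.sqrt(1 / w_re.sum())
print(f"tau^2={tau2:.4f}  pooled g={pooled:.3f} "
      f"(95% CI {pooled - 1.96*se:.3f}..{pooled + 1.96*se:.3f})")
```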

  13. PRECLOSURE CRITICALITY ANALYSIS PROCESS REPORT

    International Nuclear Information System (INIS)

    Danise, A.E.

    2004-01-01

    This report describes a process for performing preclosure criticality analyses for a repository at Yucca Mountain, Nevada. These analyses will be performed from the time of receipt of fissile material until permanent closure of the repository (preclosure period). The process describes how criticality safety analyses will be performed for various configurations of waste in or out of waste packages that could occur during preclosure as a result of normal operations or event sequences. The criticality safety analysis considers those event sequences resulting in unanticipated moderation, loss of neutron absorber, geometric changes, or administrative errors in waste form placement (loading) of the waste package. The report proposes a criticality analysis process for preclosure to allow a consistent transition from preclosure to postclosure, thereby possibly reducing potential cost increases and delays in licensing of Yucca Mountain. The proposed approach provides the advantage of using a parallel regulatory framework for evaluation of preclosure and postclosure performance and is consistent with the U.S. Nuclear Regulatory Commission's approach of supporting risk-informed, performance-based regulation for fuel cycle facilities, ''Yucca Mountain Review Plan, Final Report'', and 10 CFR Part 63. The criticality-related criteria for ensuring subcriticality are also described, as well as which guidance documents will be utilized. Preclosure operations and facilities have significant similarities to existing facilities and operations currently regulated by the U.S. Nuclear Regulatory Commission; therefore, the design approach for preclosure criticality safety will be dictated by existing regulatory requirements while using a risk-informed approach with burnup credit for in-package operations.

  14. Performance analysis of solar energy integrated with natural-gas-to-methanol process

    International Nuclear Information System (INIS)

    Yang, Sheng; Liu, Zhiqiang; Tang, Zhiyong; Wang, Yifan; Chen, Qianqian; Sun, Yuhan

    2017-01-01

    Highlights: • Solar energy integrated with the natural-gas-to-methanol process is proposed. • The two processes are modeled and simulated. • Performance analyses of the two processes are conducted. • The proposed process can cut down greenhouse gas emissions. • The proposed process can save natural gas consumption. - Abstract: Methanol is an important platform chemical. Methanol production using natural gas as the raw material has a short processing route and well-developed equipment and technology. However, natural gas reserves are not large in China. A solar energy power generation system integrated with the natural-gas-to-methanol (NGTM) process is developed, which may provide a technical route for methanol production in the future. The solar power generation produces electricity for the reforming unit and for system consumption in the solar-energy-integrated natural-gas-to-methanol (SGTM) system. Performance analyses of the conventional natural-gas-to-methanol process and the solar-energy-integrated process are presented based on simulation results, considering carbon efficiency, production cost, solar energy price, natural gas price, and carbon tax. Results indicate that the solar-energy-integrated process is able to cut down greenhouse gas (GHG) emissions. In addition, solar energy can replace natural gas as fuel, reducing natural gas consumption by an amount equal to 9.2% of the total consumed natural gas. However, it is not economical at the current technology readiness level, compared with the conventional natural-gas-to-methanol process.

  15. Formal concept analysis applied to the prediction of additives for galvanizing process

    Directory of Open Access Journals (Sweden)

    J. Klimeš

    2010-04-01

    Full Text Available Formal concept analysis is a new mathematical approach to data analysis, data mining and the discovery of patterns in data. The result of the application of the formal concept analysis method to the behavior of the galvanizing of rimmed steel is presented. Effects of additives in the galvanizing process have been correlated to the chemical element properties of the additives. This model may also help to design new alloys as additives in the galvanizing process.
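
    A formal concept in FCA is a closed pair of objects and their shared attributes; the sketch below enumerates the concepts of a tiny invented galvanizing context (the additives and properties are placeholders, not the paper's data):

```python
# FCA sketch: enumerate formal concepts of a small binary context
# (objects and attributes are invented placeholders).
from itertools import combinations

objects = {"Zn+Al", "Zn+Ni", "Zn+Pb"}
context = {                      # object -> set of attributes it has
    "Zn+Al": {"bright", "thin"},
    "Zn+Ni": {"thin", "corrosion-resistant"},
    "Zn+Pb": {"bright", "fluid"},
}
attributes = set().union(*context.values())

def intent(objs):   # attributes common to all objects in objs
    return set.intersection(*(context[o] for o in objs)) if objs else set(attributes)

def extent(attrs):  # objects having all attributes in attrs
    return {o for o in objects if attrs <= context[o]}

concepts = set()
for r in range(len(objects) + 1):
    for objs in combinations(sorted(objects), r):
        ext = extent(intent(set(objs)))          # closure of the object set
        concepts.add((frozenset(ext), frozenset(intent(ext))))

for ext, inten in sorted(concepts, key=lambda c: len(c[0])):
    print(sorted(ext), "<->", sorted(inten))
```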

  16. Development of system analysis code for pyrochemical process using molten salt electrorefining

    International Nuclear Information System (INIS)

    Tozawa, K.; Matsumoto, T.; Kakehi, I.

    2000-04-01

    This report describes the development of a cathode processor calculation code that simulates the mass and heat transfer phenomena in the distillation process, and the development of an analytical model for the cooling behavior of the pyrochemical process cell, both on personal computers. The pyrochemical process using molten salt electrorefining would introduce new technologies for new fuels of particle oxide, particle nitride and metallic type. The cathode processor calculation code with the distillation process was developed, and a code validation calculation was conducted on the basis of the benchmark problem for natural convection in a square cavity. Results obtained with the present code agreed well with the published benchmark solution for the velocity-temperature fields and for the maximum velocity and its location. Functions were then added to make the simulation more realistic and its use more efficient. A test run was conducted with the modified code for an axisymmetric enclosed vessel simulating a cathode processor, and the capability of simulating the distillation process with the code was confirmed. An analytical model for the cooling behavior of the pyrochemical process cell was developed; the model was selected by comparing benchmark analyses with detailed analyses on an engineering workstation. Flow and temperature distributions were confirmed by the results of the steady-state analysis. In the transient cooling analysis, an initial transient temperature peak occurred under the heat conditions balanced in the steady-state analysis, and the final gas temperature distribution depended on the gas circulation flow in the transient condition. Different final gas temperature distributions were therefore obtained from the steady-state starting conditions, indicating that the system has a potential metastable state of its own. It was therefore necessary to design a gas cooling flow pattern without cooling-gas circulation.

  17. Automics: an integrated platform for NMR-based metabonomics spectral processing and data analysis

    Directory of Open Access Journals (Sweden)

    Qu Lijia

    2009-03-01

    Full Text Available Abstract Background Spectral processing and post-experimental data analysis are the major tasks in NMR-based metabonomics studies. While there are commercial and free licensed software tools available to assist these tasks, researchers usually have to use multiple software packages for their studies because software packages generally focus on specific tasks. It would be beneficial to have a highly integrated platform, in which these tasks can be completed within one package. Moreover, with open source architecture, newly proposed algorithms or methods for spectral processing and data analysis can be implemented much more easily and accessed freely by the public. Results In this paper, we report an open source software tool, Automics, which is specifically designed for NMR-based metabonomics studies. Automics is a highly integrated platform that provides functions covering almost all the stages of NMR-based metabonomics studies. Automics provides high throughput automatic modules with most recently proposed algorithms and powerful manual modules for 1D NMR spectral processing. In addition to spectral processing functions, powerful features for data organization, data pre-processing, and data analysis have been implemented. Nine statistical methods can be applied to analyses including: feature selection (Fisher's criterion), data reduction (PCA, LDA, ULDA), unsupervised clustering (K-Mean) and supervised regression and classification (PLS/PLS-DA, KNN, SIMCA, SVM). Moreover, Automics has a user-friendly graphical interface for visualizing NMR spectra and data analysis results. The functional ability of Automics is demonstrated with an analysis of a type 2 diabetes metabolic profile. Conclusion Automics facilitates high throughput 1D NMR spectral processing and high dimensional data analysis for NMR-based metabonomics applications. Using Automics, users can complete spectral processing and data analysis within one software package in most cases.

  18. Automics: an integrated platform for NMR-based metabonomics spectral processing and data analysis.

    Science.gov (United States)

    Wang, Tao; Shao, Kang; Chu, Qinying; Ren, Yanfei; Mu, Yiming; Qu, Lijia; He, Jie; Jin, Changwen; Xia, Bin

    2009-03-16

    Spectral processing and post-experimental data analysis are the major tasks in NMR-based metabonomics studies. While there are commercial and free licensed software tools available to assist these tasks, researchers usually have to use multiple software packages for their studies because software packages generally focus on specific tasks. It would be beneficial to have a highly integrated platform, in which these tasks can be completed within one package. Moreover, with open source architecture, newly proposed algorithms or methods for spectral processing and data analysis can be implemented much more easily and accessed freely by the public. In this paper, we report an open source software tool, Automics, which is specifically designed for NMR-based metabonomics studies. Automics is a highly integrated platform that provides functions covering almost all the stages of NMR-based metabonomics studies. Automics provides high throughput automatic modules with most recently proposed algorithms and powerful manual modules for 1D NMR spectral processing. In addition to spectral processing functions, powerful features for data organization, data pre-processing, and data analysis have been implemented. Nine statistical methods can be applied to analyses including: feature selection (Fisher's criterion), data reduction (PCA, LDA, ULDA), unsupervised clustering (K-Mean) and supervised regression and classification (PLS/PLS-DA, KNN, SIMCA, SVM). Moreover, Automics has a user-friendly graphical interface for visualizing NMR spectra and data analysis results. The functional ability of Automics is demonstrated with an analysis of a type 2 diabetes metabolic profile. Automics facilitates high throughput 1D NMR spectral processing and high dimensional data analysis for NMR-based metabonomics applications. Using Automics, users can complete spectral processing and data analysis within one software package in most cases. Moreover, with its open source architecture, interested researchers can implement newly proposed algorithms or methods for spectral processing and data analysis and make them freely accessible to the public.
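
    As a minimal illustration of the data-reduction step that tools like Automics provide, the Python sketch below runs PCA on simulated 1D NMR spectra using scikit-learn; the spectra, class structure, and peak location are synthetic stand-ins, and the code is not taken from Automics itself.

        import numpy as np
        from sklearn.decomposition import PCA

        # Simulated 1D NMR data: 20 samples x 500 spectral bins (stand-in for real spectra).
        rng = np.random.default_rng(0)
        group = np.repeat([0, 1], 10)                 # two hypothetical sample classes
        spectra = rng.normal(0.0, 0.05, size=(20, 500))
        spectra[group == 1, 100:110] += 0.5           # class-dependent peak

        # Mean-center (a common pre-processing choice), then reduce to 2 PCs.
        X = spectra - spectra.mean(axis=0)
        scores = PCA(n_components=2).fit_transform(X)

        print(scores[:5])   # PC scores; the classes should separate along PC1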

  19. A Qualitative Analysis of the Turkish Gendarmerie Assignment Process

    National Research Council Canada - National Science Library

    Soylemez, Kadir

    2005-01-01

    ...; this number increases to 43 million (65% of the population) in the summer months. This study is an organizational analysis of the current assignment process of the Turkish General Command of the Gendarmerie...

  20. Very-high-temperature gas reactor environmental impacts assessment

    International Nuclear Information System (INIS)

    Baumann, C.D.; Barton, C.J.; Compere, E.L.; Row, T.H.

    1977-08-01

    The operation of a Very High Temperature Reactor (VHTR), a slightly modified General Atomic type High Temperature Gas-Cooled Reactor (HTGR) with 1600°F primary coolant, as a source of process heat for the 1400°F steam-methanation reformer step in a hydrogen producing plant (via hydrogasification of coal liquids) was examined. It was found that: (a) from the viewpoint of product contamination by fission and activation products, an Intermediate Heat Exchanger (IHX) is probably not necessary; and (b) long term steam corrosion of the core support posts may require increasing their diameter (a relatively minor design adjustment). However, the hydrogen contaminant in the primary coolant which permeates the reformer may reduce steam corrosion but may produce other problems which have not as yet been resolved. An IHX in parallel with both the reformer and steam generator would solve these problems, but probably at greater cost than that of increasing the size of the core support posts. It is recommended that this corrosion problem be examined in more detail, especially by investigating the performance of current fossil fuel heated reformers in industry. Detailed safety analysis of the VHTR would be required to establish definitively whether the IHX can be eliminated. Water and hydrogen ingress into the reactor system are potential problems which can be alleviated by an IHX. These problems will require analysis, research and development within the program required for development of the VHTR.

  1. Processing of pulse oximeter data using discrete wavelet analysis.

    Science.gov (United States)

    Lee, Seungjoon; Ibey, Bennett L; Xu, Weijian; Wilson, Mark A; Ericson, M Nance; Coté, Gerard L

    2005-07-01

    A wavelet-based signal processing technique was employed to improve an implantable blood perfusion monitoring system. Data were acquired from both in vitro and in vivo sources: a perfusion model and the proximal jejunum of an adult pig. Results showed that wavelet analysis could isolate perfusion signals from raw, periodic, in vitro data as effectively as fast Fourier transform (FFT) methods. However, for the quasi-periodic in vivo data segments, wavelet analysis provided more consistent results than the FFT analysis for data segments of 50, 10, and 5 s in length. Wavelet analysis has thus been shown to require fewer data points for quasi-periodic data than FFT analysis, making it a good choice for an indwelling perfusion monitor where power consumption and reaction time are paramount.
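
    A minimal sketch of the kind of discrete wavelet processing the record describes, assuming the PyWavelets package; the sampling rate, wavelet choice (db4), decomposition depth, and simulated quasi-periodic perfusion signal are illustrative, not the authors' parameters.

        import numpy as np
        import pywt

        # Simulated quasi-periodic "perfusion" signal, 5 s at 100 Hz, plus noise.
        fs = 100
        t = np.arange(0, 5, 1 / fs)
        clean = np.sin(2 * np.pi * 1.2 * t * (1 + 0.05 * np.sin(2 * np.pi * 0.2 * t)))
        noisy = clean + 0.4 * np.random.default_rng(1).normal(size=t.size)

        # Multilevel DWT; keep the approximation and the coarsest detail band
        # (roughly < 1.6 Hz, which contains the ~1.2 Hz perfusion component),
        # zero the finer detail bands, then reconstruct.
        coeffs = pywt.wavedec(noisy, "db4", level=5)
        for i in range(2, len(coeffs)):
            coeffs[i] = np.zeros_like(coeffs[i])
        recovered = pywt.waverec(coeffs, "db4")[: t.size]

        print(float(np.corrcoef(recovered, clean)[0, 1]))  # similarity to the clean signal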

  2. Nonlinear Process Fault Diagnosis Based on Serial Principal Component Analysis.

    Science.gov (United States)

    Deng, Xiaogang; Tian, Xuemin; Chen, Sheng; Harris, Chris J

    2018-03-01

    Many industrial processes contain both linear and nonlinear parts, and kernel principal component analysis (KPCA), widely used in nonlinear process monitoring, may not offer the most effective means for dealing with these nonlinear processes. This paper proposes a new hybrid linear-nonlinear statistical modeling approach for nonlinear process monitoring by closely integrating linear principal component analysis (PCA) and nonlinear KPCA using a serial model structure, which we refer to as serial PCA (SPCA). Specifically, PCA is first applied to extract PCs as linear features, and to decompose the data into the PC subspace and residual subspace (RS). Then, KPCA is performed in the RS to extract the nonlinear PCs as nonlinear features. Two monitoring statistics are constructed for fault detection, based on both the linear and nonlinear features extracted by the proposed SPCA. To effectively perform fault identification after a fault is detected, an SPCA similarity factor method is built for fault recognition, which fuses both the linear and nonlinear features. Unlike PCA and KPCA, the proposed method takes into account both linear and nonlinear PCs simultaneously, and therefore, it can better exploit the underlying process's structure to enhance fault diagnosis performance. Two case studies involving a simulated nonlinear process and the benchmark Tennessee Eastman process demonstrate that the proposed SPCA approach is more effective than the existing state-of-the-art approach based on KPCA alone, in terms of nonlinear process fault detection and identification.
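
    The serial structure is straightforward to sketch: fit linear PCA, form the residual subspace, then apply KPCA to the residuals. The Python sketch below, using scikit-learn on synthetic data, shows only the feature-extraction chain; the monitoring statistics and similarity factor of the paper are omitted, and all data and kernel settings are illustrative assumptions.

        import numpy as np
        from sklearn.decomposition import PCA, KernelPCA

        # Hypothetical process data: 200 samples x 6 variables with a linear part
        # and a nonlinear part. An illustrative stand-in, not Tennessee Eastman.
        rng = np.random.default_rng(2)
        t = rng.normal(size=(200, 2))
        X = np.hstack([t @ rng.normal(size=(2, 4)), t[:, :1] ** 2, np.sin(t[:, 1:])])
        X += 0.05 * rng.normal(size=X.shape)
        X = (X - X.mean(0)) / X.std(0)

        # Step 1: linear PCA extracts linear PCs; the residual subspace keeps
        # what PCA cannot explain.
        pca = PCA(n_components=2).fit(X)
        linear_scores = pca.transform(X)
        residual = X - pca.inverse_transform(linear_scores)

        # Step 2: KPCA on the residual subspace extracts nonlinear PCs.
        kpca = KernelPCA(n_components=2, kernel="rbf", gamma=0.5)
        nonlinear_scores = kpca.fit_transform(residual)

        print(linear_scores.shape, nonlinear_scores.shape)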

  3. Production yield analysis in the poultry processing industry

    NARCIS (Netherlands)

    Somsen, D.J.; Capelle, A.; Tramper, J.

    2004-01-01

    The paper outlines a case study where the PYA-method (production yield analysis) was implemented at a poultry-slaughtering line, processing 9000 broiler chicks per hour. It was shown that the average live weight of a flock of broilers could be used to predict the maximum production yield of the

  4. Functional analysis, harmonic analysis, and image processing: a collection of papers in honor of Björn Jawerth

    CERN Document Server

    Cwikel, Michael

    2017-01-01

    This volume is dedicated to the memory of Björn Jawerth. It contains original research contributions and surveys in several of the areas of mathematics to which Björn made important contributions. Those areas include harmonic analysis, image processing, and functional analysis, which are of course interrelated in many significant and productive ways. Among the contributors are some of the world's leading experts in these areas. With its combination of research papers and surveys, this book may become an important reference and research tool. This book should be of interest to advanced graduate students and professional researchers in the areas of functional analysis, harmonic analysis, image processing, and approximation theory. It combines articles presenting new research with insightful surveys written by foremost experts.

  5. Textural Analysis of Fatigue Crack Surfaces: Image Pre-processing

    Directory of Open Access Journals (Sweden)

    H. Lauschmann

    2000-01-01

    Full Text Available For the reconstitution of fatigue crack history, new methods of quantitative microfractography are being developed based on image processing and textural analysis. SEM magnifications between micro- and macrofractography are used. Two image pre-processing operations were suggested and proven to prepare the crack surface images for analytical treatment: 1. Normalization is used to transform the image to a stationary form. Compared to the generally used equalization, it conserves the shape of the brightness distribution and preserves the character of the texture. 2. Binarization is used to transform the grayscale image to a system of thick fibres. An objective criterion for the threshold brightness value was found: the value resulting in the maximum number of objects. Both methods were successfully applied together with the subsequent textural analysis.
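
    The threshold criterion lends itself to a direct implementation: scan candidate thresholds and keep the one that labels the most objects. A sketch under assumed tooling (NumPy/SciPy) and a synthetic image follows; it is not the authors' code.

        import numpy as np
        from scipy import ndimage

        # Synthetic grayscale "fracture surface" image; illustrative only.
        rng = np.random.default_rng(3)
        img = ndimage.gaussian_filter(rng.random((128, 128)), sigma=2)

        # Normalization: shift/scale to zero mean, unit variance (keeps the shape
        # of the brightness distribution, unlike histogram equalization).
        img = (img - img.mean()) / img.std()

        # Binarization: pick the threshold that maximizes the number of objects,
        # the objective criterion the record describes.
        def count_objects(threshold):
            _, n = ndimage.label(img > threshold)
            return n

        thresholds = np.linspace(img.min(), img.max(), 64)
        best = max(thresholds, key=count_objects)
        print("threshold:", float(best), "objects:", count_objects(best))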

  6. Systems analysis as a tool for optimal process strategy

    International Nuclear Information System (INIS)

    Ditterich, K.; Schneider, J.

    1975-09-01

    For the description and optimal treatment of complex processes, the methods of systems analysis have recently emerged as the most promising approach. In general, every process should be optimised with respect to reliability, safety, economy and environmental pollution. In this paper the complex relations between these general optimisation postulates are established in qualitative form. These general trend relations have to be quantified for every particular system studied in practice

  7. STREAM PROCESSING ALGORITHMS FOR DYNAMIC 3D SCENE ANALYSIS

    Science.gov (United States)

    2018-02-15

    Technical report (contract FA8750-14-2-0072, program element 62788F) on stream processing algorithms for dynamic 3D scene analysis. The recoverable front matter describes a 3D processing pipeline whose key modules include a structure-from-motion and bundle adjustment algorithm and the fusion of depth masks of the scene.

  8. Vulnerability analysis of process plants subject to domino effects

    International Nuclear Information System (INIS)

    Khakzad, Nima; Reniers, Genserik; Abbassi, Rouzbeh; Khan, Faisal

    2016-01-01

    In the context of domino effects, vulnerability analysis of chemical and process plants aims to identify and protect installations which are relatively more susceptible to damage and thus contribute more to the initiation or propagation of domino effects. In the present study, we have developed a methodology based on graph theory for domino vulnerability analysis of hazardous installations within process plants, where owing to the large number of installations or complex interdependencies, the application of sophisticated reasoning approaches such as Bayesian networks is limited. We have taken advantage of a hypothetical chemical storage plant to develop the methodology and validated the results using a dynamic Bayesian network approach. The efficacy and out-performance of the developed methodology have been demonstrated via a real-life complex case study. - Highlights: • Graph theory is a reliable tool for vulnerability analysis of chemical plants as to domino effects. • The all-closeness centrality score can be used to identify the most vulnerable installations. • For complex chemical plants, the methodology outperforms Bayesian networks.
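
    A minimal sketch of a graph-theoretic vulnerability ranking in the spirit of the record, using networkx closeness centrality on a hypothetical escalation graph; the plant layout, escalation probabilities, and probability-to-distance conversion are invented for illustration and are not the paper's exact all-closeness score.

        import networkx as nx

        # Hypothetical plant: nodes are storage tanks, a directed edge (u, v)
        # means "an accident at u can escalate to v" with the given probability.
        G = nx.DiGraph()
        G.add_weighted_edges_from([
            ("T1", "T2", 0.4), ("T2", "T1", 0.3), ("T2", "T3", 0.5),
            ("T3", "T4", 0.2), ("T1", "T3", 0.1),
        ])

        # Convert escalation probability to a "distance" (higher probability =
        # shorter distance) so closeness centrality ranks escalation potential.
        for u, v, d in G.edges(data=True):
            d["dist"] = 1.0 / d["weight"]

        # networkx closeness uses inward distances; reversing the graph ranks
        # nodes by how easily damage propagates outward from them.
        closeness = nx.closeness_centrality(G.reverse(), distance="dist")
        for node, score in sorted(closeness.items(), key=lambda kv: -kv[1]):
            print(node, round(score, 3))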

  9. Rapid process development of chromatographic process using direct analysis in real time mass spectrometry as a process analytical technology tool.

    Science.gov (United States)

    Yan, Binjun; Chen, Teng; Xu, Zhilin; Qu, Haibin

    2014-06-01

    The concept of quality by design (QbD) is widely applied in the process development of pharmaceuticals. However, the additional cost and time have caused some resistance to QbD implementation. To show a possible solution, this work proposed a rapid process development method which used direct analysis in real time mass spectrometry (DART-MS) as a process analytical technology (PAT) tool, taking the chromatographic process of Ginkgo biloba L. as an example. The breakthrough curves were rapidly determined by DART-MS at-line. A high correlation coefficient of 0.9520 was found between the concentrations of ginkgolide A determined by DART-MS and by HPLC. Based on the PAT tool, the impacts of process parameters on the adsorption capacity were discovered rapidly, showing that the adsorption capacity decreased as the flow rate increased. This work has shown the feasibility and advantages of integrating PAT into QbD implementation for rapid process development. Copyright © 2014 Elsevier B.V. All rights reserved.

  10. Technical and economic data biomass-based energy conversion systems for the production of gaseous and/or liquid energy carriers

    International Nuclear Information System (INIS)

    2000-02-01

    The objectives of this study are: (1) to give an indication of the expected development of the currently mainly fossil fuel based Dutch energy supply system towards a future CO₂-emission 'free' energy supply system, and (2) to present the main technological, economic, and environmental characteristics of three promising renewable energy based technologies for the production of gaseous and/or liquid secondary energy carriers and/or electricity and/or heat, viz.: (a) biomass hydrogasification for SNG (synthetic natural gas) production; (b) trigeneration of methanol and CHP (combined heat and power) from biomass by integrating a 'once-through' LPMEOH (liquid phase methanol) process into a 'conventional' BIG/CC (Biomass-Integrated-Gasifier/Combined Cycle) system; and (c) trigeneration of Fischer-Tropsch derived transportation fuels and CHP from biomass by integrating a 'once-through' FT (Fischer-Tropsch) process into a 'conventional' BIG/CC system. Biomass conversion systems, for the production of CHP, transportation fuels, and as biofeedstock for the petrochemical industry, will play a substantial role in meeting the future Dutch renewable energy policy goals. In case fossil fuel prices remain low, additional policies are needed to reach these goals. Biomass will also play a significant role in reaching significant CO₂ emission reduction in Western Europe. In which sector the limited amount of available/contractable biomass can best be applied is still unclear and therefore needs further research. By biomass hydrogasification it is possible to produce SNG with more or less the same composition as Groningen natural gas. In case relatively cheap hydrogen-rich waste gas streams are used in the short term, the SNG production costs will be in the same order of magnitude as the market price for Dutch natural gas for small consumers (fl 0.6/Nm³). The calculated minimum production costs for the 'green' fuels (methanol: 15 Euroct/l or 9 Euro/GJ, and FT-fuels: 27 Euroct/l or 9 Euro

  11. Process simulation and uncertainty analysis of plasma arc mixed waste treatment

    International Nuclear Information System (INIS)

    Ferrada, J.J.; Welch, T.D.

    1994-01-01

    Innovative mixed waste treatment subsystems have been analyzed for performance, risk, and life-cycle cost as part of the U.S. Department of Energy's (DOE's) Mixed Waste Integrated Program (MWIP) treatment alternatives development and evaluation process. This paper concerns the analysis of mixed waste treatment system performance. Performance systems analysis includes approximate material and energy balances and assessments of operability, effectiveness, and reliability. Preliminary material and energy balances of innovative processes have been analyzed using FLOW, an object-oriented process simulator for waste management systems under development at Oak Ridge National Laboratory. The preliminary models developed for FLOW provide rough order-of-magnitude calculations useful for sensitivity analysis. The insight gained from this early approximate modeling of these technologies will ease the transition to more sophisticated simulators as adequate performance and property data become available. Such models are being developed in ASPEN by DOE's Mixed Waste Treatment Project (MWTP) for baseline and alternative flow sheets based on commercial technologies. One alternative to the baseline developed by the MWIP support groups is plasma arc treatment. This process offers a noticeable reduction in the number of process operations compared with the baseline process, because a plasma arc melter is capable of accepting a wide variety of waste streams as direct inputs (without sorting or preprocessing). This innovative process for treating mixed waste replaces several units from the baseline process and thus promises an economic advantage. The performance of the plasma arc furnace will directly affect the quality of the waste form and the requirements of the off-gas treatment units. The ultimate objective of MWIP is to reduce the amount of final waste produced, the cost, and the environmental impact.

  12. Flotation process diagnostics and modelling by coal grain analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ofori, P; O'Brien, G; Firth, B; Jenkins, B. [CSIRO Energy Technology, Brisbane, Qld. (Australia)

    2006-05-15

    In coal flotation, particles of different components of the coal such as maceral groups and mineral matter and their associations have different hydrophobicities and therefore different flotation responses. By using a new coal grain analysis method for characterising individual grains, more detailed flotation performance analysis and modelling approaches have been developed. The method involves the use of microscopic imaging techniques to obtain estimates of size, compositional and density information on individual grains of fine coal. The density and composition partitioning of coal processed through different flotation systems provides an avenue to pinpoint the actual cause of poor process performance so that corrective action may be initiated. The information on grain size, density and composition is being used as input data to develop more detailed flotation process models to provide better predictions of process performance for both mechanical and column flotation devices. A number of approaches may be taken to flotation modelling such as the probability approach and the kinetic model approach or a combination of the two. In the work reported here, a simple probability approach has been taken, which will be further refined in due course. The use of grain data to map the responses of different types of coal grains through various fine coal cleaning processes provided a more advanced diagnostic capability for fine coal cleaning circuits. This enabled flotation performance curves analogous to partition curves for density separators to be produced for flotation devices.

  13. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-03-01

    Full Text Available Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations-COSO, 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957) lament that although the introduction of risk quantification to enhance the degree of objectivity in finance, for instance, developed quite in parallel with its development in the manufacturing industry, the same is not true of Higher Education Institutions (HEIs). In this regard, the objective of the paper was to demonstrate the method and process of Quantitative Risk Analysis (QRA) through the likelihood of occurrence of risk (phase I). This paper serves as the first of a two-phased study, which sampled one hundred (100) risk analysts at a university in the greater Eastern Cape Province of South Africa. The analysis of likelihood of occurrence of risk by logistic regression and percentages was conducted to investigate whether or not there was a significant difference between groups (analysts) in respect of QRA. The Hosmer and Lemeshow test was non-significant, with a chi-square (X² = 8.181; p = 0.300), which indicated a good model fit, since the data did not deviate significantly from the model. The study concluded that to derive an overall likelihood rating indicating the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1) threat source motivation and capability, (2) nature of the vulnerability, and (3) existence and effectiveness of current controls (methods and process).
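
    As a sketch of the likelihood-of-occurrence modelling step, the Python fragment below fits a logistic regression to hypothetical survey scores for the three governing factors named in the record; the data, coefficients, and 1-5 scoring scale are fabricated for illustration, and the Hosmer-Lemeshow fit test is omitted.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # Fabricated survey: columns are 1-5 scores for threat motivation/capability,
        # vulnerability severity, and control effectiveness; y = "likely to occur".
        rng = np.random.default_rng(4)
        X = rng.integers(1, 6, size=(100, 3)).astype(float)
        logit = 0.8 * X[:, 0] + 0.6 * X[:, 1] - 0.9 * X[:, 2] - 1.0
        y = (rng.random(100) < 1 / (1 + np.exp(-logit))).astype(int)

        model = LogisticRegression().fit(X, y)
        print("coefficients:", model.coef_.round(2))
        # Likelihood rating for a risk scored (motivation=4, vulnerability=3, controls=2):
        print("P(occurrence):", round(float(model.predict_proba([[4, 3, 2]])[0, 1]), 3))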

  14. THE ANALYSIS OF RISK MANAGEMENT PROCESS WITHIN MANAGEMENT

    Directory of Open Access Journals (Sweden)

    ROMANESCU MARCEL LAURENTIU

    2016-10-01

    Full Text Available This article highlights risk analysis within management, focusing on how a company could practically integrate risk management into its existing leadership process. It then illustrates how to manage risk effectively, which gives numerous advantages to all firms, including improving their decision-making process. All this leads to the conclusion that the degree of risk specific to companies is very high, but managers who make the best decisions can diminish it, so that business activity and income are not influenced by disruptive factors.

  15. On process capability and system availability analysis of the inverse Rayleigh distribution

    Directory of Open Access Journals (Sweden)

    Sajid Ali

    2015-04-01

    Full Text Available In this article, process capability and system availability analysis is discussed for the inverse Rayleigh lifetime distribution. A Bayesian approach with a conjugate gamma prior distribution is adopted for the analysis. Different types of loss functions are considered to find Bayes estimates of the process capability and system availability. A simulation study is conducted to compare the different loss functions.
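
    Under a common parameterization of the inverse Rayleigh density, f(x|λ) = 2λx⁻³exp(−λ/x²), the likelihood is λⁿexp(−λΣxᵢ⁻²), so a Gamma(a, b) prior on λ yields a Gamma(a + n, b + Σxᵢ⁻²) posterior. The sketch below draws simulated lifetimes and computes Bayes estimates under two common loss functions; the parameterization, prior values, and loss functions are assumptions for illustration, not necessarily those of the article.

        import numpy as np

        # Simulated lifetimes from an inverse Rayleigh distribution with parameter
        # lam (pdf 2*lam*x**-3*exp(-lam/x**2)); all numbers are illustrative.
        rng = np.random.default_rng(5)
        lam_true, n = 2.0, 50
        x = np.sqrt(lam_true / rng.exponential(1.0, size=n))  # since 1/X^2 ~ Exp(lam)

        # Conjugate update: Gamma(a, b) prior on lam -> Gamma(a + n, b + sum 1/x_i^2).
        a, b = 1.0, 1.0
        alpha, beta = a + n, b + np.sum(1.0 / x**2)

        # Bayes estimates under two loss functions:
        sq_error = alpha / beta                              # squared error: posterior mean
        precautionary = np.sqrt(alpha * (alpha + 1)) / beta  # precautionary: sqrt(E[lam^2])
        print(round(float(sq_error), 3), round(float(precautionary), 3))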

  16. Semantic orchestration of image processing services for environmental analysis

    Science.gov (United States)

    Ranisavljević, Élisabeth; Devin, Florent; Laffly, Dominique; Le Nir, Yannick

    2013-09-01

    In order to analyze environmental dynamics, a major process is the classification of the different phenomena of the site (e.g. ice and snow for a glacier). When using in situ pictures, this classification requires data pre-processing. Not all the pictures need the same sequence of processes, depending on the disturbances. Until now, these sequences have been composed manually, which restricts the processing of large amounts of data. In this paper, we present how to realize a semantic orchestration to automate the sequencing for the analysis. It combines two advantages: it solves the problem of the volume of processing, and it diversifies the possibilities in the data processing. We define a BPEL description to express the sequences. This BPEL uses web services to run the data processing. Each web service is semantically annotated using an ontology of image processing. The dynamic modification of the BPEL is done using SPARQL queries on these annotated web services. The results obtained by a prototype implementing this method validate the construction of the different workflows, which can be applied to a large number of pictures.

  17. Cost analysis of simulated base-catalyzed biodiesel production processes

    International Nuclear Information System (INIS)

    Tasić, Marija B.; Stamenković, Olivera S.; Veljković, Vlada B.

    2014-01-01

    Highlights: • Two semi-continuous biodiesel production processes from sunflower oil are simulated. • Simulations were based on the kinetics of base-catalyzed methanolysis reactions. • The total energy consumption was influenced by the kinetic model. • The heterogeneous base-catalyzed process is the preferable industrial technology. - Abstract: The simulation and economic feasibility evaluation of semi-continuous biodiesel production from sunflower oil were based on the kinetics of homogeneously (Process I) and heterogeneously (Process II) base-catalyzed methanolysis reactions. The plant's annual capacity was determined to be 8356 tonnes of biodiesel. The total energy consumption was influenced by the unit model describing the methanolysis reaction kinetics. The energy consumption of Process II was more than 2.5 times lower than that of Process I. Also, the simulation showed that Process I had more and larger process equipment units compared with Process II. Based on lower total capital investment costs and a lower biodiesel selling price, Process II was economically more feasible than Process I. Sensitivity analysis was conducted using variable sunflower oil and biodiesel prices. Using a biodiesel selling price of 0.990 $/kg, Processes I and II were shown to be economically profitable if the sunflower oil price was 0.525 $/kg and 0.696 $/kg, respectively.

  18. Rapid Automated Dissolution and Analysis Techniques for Radionuclides in Recycle Process Streams

    International Nuclear Information System (INIS)

    Sudowe, Ralf; Roman, Audrey; Dailey, Ashlee; Go, Elaine

    2013-01-01

    The analysis of process samples for radionuclide content is an important part of current procedures for material balance and accountancy in the different process streams of a recycling plant. The destructive sample analysis techniques currently available necessitate a significant amount of time. It is therefore desirable to develop new sample analysis procedures that allow for a quick turnaround time and increased sample throughput with a minimum of deviation between samples. In particular, new capabilities for rapid sample dissolution and radiochemical separation are required. Most of the radioanalytical techniques currently employed for sample analysis are based on manual laboratory procedures. Such procedures are time- and labor-intensive, and not well suited for situations in which a rapid sample analysis is required and/or large numbers of samples need to be analyzed. To address this issue we are currently investigating radiochemical separation methods based on extraction chromatography that have been specifically optimized for the analysis of process stream samples. The influence of potential interferences present in the process samples as well as mass loading, flow rate and resin performance is being studied. In addition, the potential to automate these procedures utilizing a robotic platform is evaluated. Initial studies have been carried out using the commercially available DGA resin. This resin shows an affinity for Am, Pu, U, and Th and is also exhibiting signs of a possible synergistic effect in the presence of iron.

  19. Multivariate Analysis for the Processing of Signals

    Directory of Open Access Journals (Sweden)

    Beattie J.R.

    2014-01-01

    Full Text Available Real-world experiments are becoming increasingly complex, needing techniques capable of tracking this complexity. Signal based measurements are often used to capture this complexity, where a signal is a record of a sample's response to a parameter (e.g. time, displacement, voltage, wavelength) that is varied over a range of values. In signals, the responses at each value of the varied parameter are related to each other, depending on the composition or state of the sample being measured. Since signals contain multiple information points, they have rich information content but are generally complex to comprehend. Multivariate Analysis (MA) has profoundly transformed their analysis by allowing gross simplification of the tangled web of variation. In addition, MA has the advantage of being much more robust to the influence of noise than univariate methods of analysis. In recent years, there has been a growing awareness that the nature of the multivariate methods allows exploitation of their benefits for purposes other than data analysis, such as pre-processing of signals with the aim of eliminating irrelevant variations prior to analysis of the signal of interest. It has been shown that exploiting multivariate data reduction in an appropriate way can allow high fidelity denoising (removal of irreproducible non-signals), consistent and reproducible noise-insensitive correction of baseline distortions (removal of reproducible non-signals), accurate elimination of interfering signals (removal of reproducible but unwanted signals) and the standardisation of signal amplitude fluctuations. At present, the field is relatively small but the possibilities for much wider application are considerable. Where signal properties are suitable for MA (such as the signal being stationary along the x-axis), these signal based corrections have the potential to be highly reproducible and highly adaptable, and are applicable in situations where the data is noisy or

  20. Recommended practice for process sampling for partial pressure analysis

    International Nuclear Information System (INIS)

    Blessing, James E.; Ellefson, Robert E.; Raby, Bruce A.; Brucker, Gerardo A.; Waits, Robert K.

    2007-01-01

    This Recommended Practice describes and recommends various procedures and types of apparatus for obtaining representative samples of process gases from >10⁻² Pa (10⁻⁴ Torr) for partial pressure analysis using a mass spectrometer. The document was prepared by a subcommittee of the Recommended Practices Committee of the American Vacuum Society. The subcommittee was comprised of vacuum users and manufacturers of mass spectrometer partial pressure analyzers who have practical experience in the sampling of process gas atmospheres.

  1. Transient flow analysis of integrated valve opening process

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Xinming; Qin, Benke; Bo, Hanliang, E-mail: bohl@tsinghua.edu.cn; Xu, Xingxing

    2017-03-15

    Highlights: • The control rod hydraulic driving system (CRHDS) is a new type of built-in control rod drive technology and the integrated valve (IV) is the key control component. • The transient flow experiment induced by the IV is conducted and the test results are analyzed to elucidate its working mechanism. • The theoretical model of the IV opening process is established and applied to obtain the variation of the transient flow characteristic parameters. - Abstract: The control rod hydraulic driving system (CRHDS) is a new type of built-in control rod drive technology and the IV is the key control component. The working principle of the integrated valve (IV) is analyzed and the IV hydraulic experiment is conducted. A transient flow phenomenon occurs in the valve opening process. The theoretical model of the IV opening process is established from the loop system control equations and boundary conditions. The valve opening boundary condition equation is established based on the IV three-dimensional flow field analysis results and the dynamic analysis of the valve core movement. The model calculation results are in good agreement with the experimental results. On this basis, the model is used to analyze the transient flow under high temperature conditions. The peak pressure head is consistent with that at room temperature, and the pressure fluctuation period is longer than at room temperature. Furthermore, the variation of the pressure transients with the fluid and loop structure parameters is analyzed. The peak pressure increases with the flow rate and decreases with increasing valve opening time. The pressure fluctuation period increases with the loop pipe length, and the fluctuation amplitude remains largely unchanged under different equilibrium pressure conditions. The research results lay the basis for the vibration reduction analysis of the CRHDS.

  2. Software development processes and analysis software: a mismatch and a novel framework

    International Nuclear Information System (INIS)

    Kelly, D.; Harauz, J.

    2011-01-01

    This paper discusses the salient characteristics of analysis software and the impact of those characteristics on its development. From this discussion, it can be seen that mainstream software development processes, usually characterized as Plan Driven or Agile, are built upon assumptions that are mismatched to the development and maintenance of analysis software. We propose a novel software development framework that would match the process normally observed in the development of analysis software. In the discussion of this framework, we suggest areas of research and directions for future work. (author)

  3. Simulation and Analysis of Complex Biological Processes: an Organisation Modelling Perspective

    NARCIS (Netherlands)

    Bosse, T.; Jonker, C.M.; Treur, J.

    2005-01-01

    This paper explores how the dynamics of complex biological processes can be modelled and simulated as an organisation of multiple agents. This modelling perspective identifies organisational structure occurring in complex decentralised processes and handles complexity of the analysis of the dynamics

  4. Screening Analysis of Criticality Features, Events, and Processes for License Application

    International Nuclear Information System (INIS)

    J.A. McClure

    2004-01-01

    This report documents the screening analysis of postclosure criticality features, events, and processes. It addresses the probability of criticality events resulting from degradation processes as well as disruptive events (i.e., seismic, rock fall, and igneous). Probability evaluations are performed utilizing the configuration generator described in ''Configuration Generator Model'', a component of the methodology from ''Disposal Criticality Analysis Methodology Topical Report''. The total probability per package of criticality is compared against the regulatory probability criterion for inclusion of events established in 10 CFR 63.114(d) (consider only events that have at least one chance in 10,000 of occurring over 10,000 years). The total probability of criticality accounts for the evaluation of identified potential critical configurations of all baselined commercial and U.S. Department of Energy spent nuclear fuel waste form and waste package combinations, both internal and external to the waste packages. This criticality screening analysis utilizes available information for the 21-Pressurized Water Reactor Absorber Plate, 12-Pressurized Water Reactor Absorber Plate, 44-Boiling Water Reactor Absorber Plate, 24-Boiling Water Reactor Absorber Plate, and the 5-Defense High-Level Radioactive Waste/U.S. Department of Energy Short waste package types. Where defensible, assumptions have been made for the evaluation of the following waste package types in order to perform a complete criticality screening analysis: 21-Pressurized Water Reactor Control Rod, 5-Defense High-Level Radioactive Waste/U.S. Department of Energy Long, and 2-Multi-Canister Overpack/2-Defense High-Level Radioactive Waste package types. The inputs used to establish probabilities for this analysis report are based on information and data generated for the Total System Performance Assessment for the License Application, where available. This analysis report determines whether criticality is to be

  5. Off-line data processing and analysis for the GERDA experiment

    International Nuclear Information System (INIS)

    Agostini, M; Pandola, L; Zavarise, P

    2012-01-01

    Gerda is an experiment designed to look for the neutrinoless double beta decay of ⁷⁶Ge. The experiment uses an array of high-purity germanium detectors (enriched in ⁷⁶Ge) directly immersed in liquid argon. Gerda is presently operating eight enriched coaxial detectors (approximately 15 kg of ⁷⁶Ge) and about 25 new custom-made enriched BEGe detectors will be deployed in the next phase (an additional 20 kg of ⁷⁶Ge). The paper describes the Gerda off-line analysis of the high-purity germanium detector data. Firstly we present the signal processing flow, focusing on the digital filters and on the algorithms used. Secondly we discuss the rejection of non-physical events and the data quality monitoring. The analysis is performed completely with the Gerda software framework (Gelatio), designed to support multi-channel processing and to perform a modular analysis of digital signals.

  6. Probabilistic sensitivity analysis of system availability using Gaussian processes

    International Nuclear Information System (INIS)

    Daneshkhah, Alireza; Bedford, Tim

    2013-01-01

    The availability of a system under a given failure/repair process is a function of time which can be determined through a set of integral equations and usually calculated numerically. We focus here on the issue of carrying out sensitivity analysis of availability to determine the influence of the input parameters. The main purpose is to study the sensitivity of the system availability with respect to the changes in the main parameters. In the simplest case that the failure repair process is (continuous time/discrete state) Markovian, explicit formulae are well known. Unfortunately, in more general cases availability is often a complicated function of the parameters without closed form solution. Thus, the computation of sensitivity measures would be time-consuming or even infeasible. In this paper, we show how Sobol and other related sensitivity measures can be cheaply computed to measure how changes in the model inputs (failure/repair times) influence the outputs (availability measure). We use a Bayesian framework, called the Bayesian analysis of computer code output (BACCO) which is based on using the Gaussian process as an emulator (i.e., an approximation) of complex models/functions. This approach allows effective sensitivity analysis to be achieved by using far smaller numbers of model runs than other methods. The emulator-based sensitivity measure is used to examine the influence of the failure and repair densities' parameters on the system availability. We discuss how to apply the methods practically in the reliability context, considering in particular the selection of parameters and prior distributions and how we can ensure these may be considered independent—one of the key assumptions of the Sobol approach. The method is illustrated on several examples, and we discuss the further implications of the technique for reliability and maintenance analysis
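
    A minimal sketch of the emulator idea: train a Gaussian process on a few runs of a stand-in availability model, then estimate first-order Sobol indices by Monte Carlo on the cheap emulator (a Saltelli-style pick-freeze estimator). The availability function, input ranges, kernel, and sample sizes are illustrative assumptions, not the BACCO implementation.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        # Stand-in "availability model": steady-state availability mu/(mu + lam)
        # of a one-unit repairable system, as a function of failure and repair rates.
        def availability(z):
            lam, mu = z[:, 0], z[:, 1]
            return mu / (mu + lam)

        lo, hi = [0.01, 0.1], [0.5, 2.0]          # assumed input ranges
        rng = np.random.default_rng(6)

        # Train the emulator on a small design (the "expensive" model runs).
        Z = rng.uniform(lo, hi, size=(30, 2))
        gp = GaussianProcessRegressor(kernel=RBF([0.1, 0.5])).fit(Z, availability(Z))

        # First-order Sobol indices via the pick-freeze (Saltelli 2010) estimator,
        # evaluated on the cheap emulator instead of the model itself.
        N = 5000
        A = rng.uniform(lo, hi, size=(N, 2))
        B = rng.uniform(lo, hi, size=(N, 2))
        fA, fB = gp.predict(A), gp.predict(B)
        for i, name in enumerate(["failure rate", "repair rate"]):
            ABi = A.copy()
            ABi[:, i] = B[:, i]                   # A with column i taken from B
            S_i = np.mean(fB * (gp.predict(ABi) - fA)) / fA.var()
            print(name, round(float(S_i), 3))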

  7. Comprehensive NMR analysis of compositional changes of black garlic during thermal processing.

    Science.gov (United States)

    Liang, Tingfu; Wei, Feifei; Lu, Yi; Kodani, Yoshinori; Nakada, Mitsuhiko; Miyakawa, Takuya; Tanokura, Masaru

    2015-01-21

    Black garlic is a processed food product obtained by subjecting whole raw garlic to thermal processing that causes chemical reactions, such as the Maillard reaction, which change the composition of the garlic. In this paper, we report a nuclear magnetic resonance (NMR)-based comprehensive analysis of raw garlic and black garlic extracts to determine the compositional changes resulting from thermal processing. ¹H NMR spectra with a detailed signal assignment showed that 38 components were altered by thermal processing of raw garlic. For example, the contents of 11 L-amino acids increased during the first step of thermal processing over 5 days and then decreased. Multivariate data analysis revealed changes in the contents of fructose, glucose, acetic acid, formic acid, pyroglutamic acid, cycloalliin, and 5-(hydroxymethyl)furfural (5-HMF). Our results provide comprehensive information on changes in NMR-detectable components during thermal processing of whole garlic.

  8. System Analysis of Flat Grinding Process with Wheel Face

    Directory of Open Access Journals (Sweden)

    T. N. Ivanova

    2014-01-01

    Full Text Available The paper presents a system analysis of flat grinding with the wheel face, considering the state parameters and the input and output variables of the subsystems, namely the machine tool, workpiece, grinding wheel, cutting fluid, and contact area. It reveals the factors influencing the temperature and power conditions of the grinding process. Aim: the system analysis of the flat grinding process with the wheel face is expected to enable the development of a system of grinding process parameters as a technical system, which will make it possible to evaluate each parameter individually and to optimize the entire system. One of the most important criteria in defining the optimal process conditions is the grinding temperature, which, to avoid the appearance of defects on the surface of the component, should not exceed critical temperature values to be determined experimentally. The temperature criterion can be useful for choosing the conditions for maximum defect-free performance of mechanical face grinding. Other criteria can also be used to define the maximum performance of defect-free grinding, such as a critical power density, indirectly reflecting the allowable thermal stress of the grinding process; the structure of the ground surface, which reflects the presence or absence of a defect layer and is determined after a large number of experiments; and the consumption of the diamond layer. Optimal conditions should not exceed those of defect-free grinding. It is found that maximum performance depends on the characteristics of the wheels and the grade of the processed material, as well as on the contact area and grinding conditions. Optimal performance depends on the diamond value (cost) and the specific consumption of diamonds in a wheel. The above criteria require formalization as functions of the variable parameters of the grinding process. There is an option for the compromise of inter-criteria optimality, thereby providing a set of acceptable solutions, from

  9. Yucca Mountain Feature, Event, and Process (FEP) Analysis

    International Nuclear Information System (INIS)

    Freeze, G.

    2005-01-01

    A Total System Performance Assessment (TSPA) model was developed for the U.S. Department of Energy (DOE) Yucca Mountain Project (YMP) to help demonstrate compliance with applicable postclosure regulatory standards and support the License Application (LA). Two important precursors to the development of the TSPA model were (1) the identification and screening of features, events, and processes (FEPs) that might affect the Yucca Mountain disposal system (i.e., FEP analysis), and (2) the formation of scenarios from screened in (included) FEPs to be evaluated in the TSPA model (i.e., scenario development). YMP FEP analysis and scenario development followed a five-step process: (1) Identify a comprehensive list of FEPs potentially relevant to the long-term performance of the disposal system. (2) Screen the FEPs using specified criteria to identify those FEPs that should be included in the TSPA analysis and those that can be excluded from the analysis. (3) Form scenarios from the screened in (included) FEPs. (4) Screen the scenarios using the same criteria applied to the FEPs to identify any scenarios that can be excluded from the TSPA, as appropriate. (5) Specify the implementation of the scenarios in the computational modeling for the TSPA, and document the treatment of included FEPs. This paper describes the FEP analysis approach (Steps 1 and 2) for YMP, with a brief discussion of scenario formation (Step 3). Details of YMP scenario development (Steps 3 and 4) and TSPA modeling (Step 5) are beyond the scope of this paper. The identification and screening of the YMP FEPs was an iterative process based on site-specific information, design, and regulations. The process was iterative in the sense that there were multiple evaluation and feedback steps (e.g., separate preliminary, interim, and final analyses). The initial YMP FEP list was compiled from an existing international list of FEPs from other radioactive waste disposal programs and was augmented by YMP site- and design

  10. Laminar flow and convective transport processes scaling principles and asymptotic analysis

    CERN Document Server

    Brenner, Howard

    1992-01-01

    Laminar Flow and Convective Transport Processes: Scaling Principles and Asymptotic Analysis presents analytic methods for the solution of fluid mechanics and convective transport processes, all in the laminar flow regime. This book brings together the results of almost 30 years of research on the use of nondimensionalization, scaling principles, and asymptotic analysis into a comprehensive form suitable for presentation in a core graduate-level course on fluid mechanics and the convective transport of heat. A considerable amount of material on viscous-dominated flows is covered. A unique feature

  11. Role of thermal analysis in uranium oxide fuel fabrication process

    International Nuclear Information System (INIS)

    Balaji Rao, Y.; Yadav, R.B.

    2006-01-01

    The present paper discusses the application of thermal analysis, particularly differential thermal analysis (DTA), at various stages of the fuel fabrication process. The useful role of DTA in determining the decomposition pattern and calcination temperature of ADU, along with the de-nitration temperature, is explained. The decomposition pattern depends upon the type of drying process adopted for the wet ADU cake (ADU C). The paper also highlights the utility of DTA in determining the APS and SSA of UO₂₊ₓ and U₃O₈ powders as an alternative technique. Further, the temperature difference (ΔTmax) between the two exothermic peaks obtained in UO₂₊ₓ powder oxidation is related to the sintered density of UO₂ pellets. (author)

  12. Determinants of job stress in chemical process industry: A factor analysis approach.

    Science.gov (United States)

    Menon, Balagopal G; Praveensal, C J; Madhu, G

    2015-01-01

    Job stress is one of the active research domains in industrial safety research. Job stress can result in accidents and health related issues for workers in chemical process industries. Hence it is important to measure the level of job stress in workers so that it can be mitigated, avoiding worker safety problems in these industries. The objective of this study is to determine the job stress factors in the chemical process industry in Kerala state, India. This study also aims to propose a comprehensive model and an instrument framework for measuring job stress levels in the chemical process industries in Kerala, India. The data were collected through a questionnaire survey conducted in chemical process industries in Kerala. The data collected from 1197 surveys were subjected to principal component and confirmatory factor analysis to develop the job stress factor structure. The factor analysis revealed eight factors that influence job stress in process industries. It was also found that job stress in employees is most influenced by role ambiguity and least by work environment. The study has developed an instrument framework towards measuring job stress utilizing exploratory factor analysis and structural equation modeling.
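
    A sketch of the factor-extraction step using scikit-learn's FactorAnalysis on fabricated Likert-scale responses; the item count, latent structure, and loadings are invented to mirror the study's setup, not its data.

        import numpy as np
        from sklearn.decomposition import FactorAnalysis

        # Fabricated questionnaire: 1197 respondents x 12 Likert items generated
        # from two latent stress factors, purely for illustration.
        rng = np.random.default_rng(7)
        latent = rng.normal(size=(1197, 2))
        loadings = np.zeros((2, 12))
        loadings[0, :6] = rng.uniform(0.6, 0.9, 6)   # items 1-6 load on factor 1
        loadings[1, 6:] = rng.uniform(0.6, 0.9, 6)   # items 7-12 load on factor 2
        items = latent @ loadings + 0.5 * rng.normal(size=(1197, 12))

        fa = FactorAnalysis(n_components=2, rotation="varimax").fit(items)
        print(np.round(fa.components_.T, 2))         # recovered loading matrix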

  13. Development of an expert system for analysis of plutonium processing operations

    International Nuclear Information System (INIS)

    Boeringter, S.T.; Fasel, J.H.; Kornreich, D.E.

    2001-01-01

    At Los Alamos National Laboratory (LANL) an expert system has been developed for the analysis and assessment of plutonium processing operations. This system is based upon an object-oriented simulation environment specifically developed for the needs of nuclear material processing. The simulation environment, called the ''Process Modeling System'' (ProMoS), contains a library of over 250 plutonium-based unit process operations ranging from analytical chemistry, oxide operations, recycle and recovery, waste management, and component fabrication. (author)

  14. Statistical Analysis of the First Passage Path Ensemble of Jump Processes

    Science.gov (United States)

    von Kleist, Max; Schütte, Christof; Zhang, Wei

    2018-02-01

    The transition mechanism of jump processes between two different subsets in state space reveals important dynamical information of the processes and therefore has attracted considerable attention in the past years. In this paper, we study the first passage path ensemble of both discrete-time and continuous-time jump processes on a finite state space. The main approach is to divide each first passage path into nonreactive and reactive segments and to study them separately. The analysis can be applied to jump processes which are non-ergodic, as well as continuous-time jump processes where the waiting time distributions are non-exponential. In the particular case that the jump processes are both Markovian and ergodic, our analysis elucidates the relations between the study of the first passage paths and the study of the transition paths in transition path theory. We provide algorithms to numerically compute statistics of the first passage path ensemble. The computational complexity of these algorithms scales with the complexity of solving a linear system, for which efficient methods are available. Several examples demonstrate the wide applicability of the derived results across research areas.
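
    For a Markovian, ergodic chain the record's linear-system observation is easy to make concrete: mean first passage times into a target set solve (I − P_QQ)m = 1 over the non-target states Q. A small NumPy sketch with an illustrative transition matrix:

        import numpy as np

        # Discrete-time Markov jump process on 4 states; row-stochastic
        # transition matrix (illustrative numbers).
        P = np.array([
            [0.5, 0.3, 0.2, 0.0],
            [0.2, 0.5, 0.2, 0.1],
            [0.1, 0.2, 0.5, 0.2],
            [0.0, 0.1, 0.3, 0.6],
        ])
        target = [3]                                  # the target subset B
        Q = [i for i in range(4) if i not in target]

        # Mean first passage times solve (I - P_QQ) m = 1, which is the
        # "solving a linear system" cost the record mentions.
        P_QQ = P[np.ix_(Q, Q)]
        m = np.linalg.solve(np.eye(len(Q)) - P_QQ, np.ones(len(Q)))
        for state, mfpt in zip(Q, m):
            print(f"E[time to reach state 3 from state {state}] = {mfpt:.2f}")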

  15. XbD Video 3, The SEEing process of qualitative data analysis

    DEFF Research Database (Denmark)

    2013-01-01

    This is the third video in the Experience-based Designing series. It presents a live classroom demonstration of a nine-step qualitative data analysis process called SEEing. The process is useful for uncovering or discovering deeper layers of 'meaning' and meaning structures in an experience

  16. UHPLC-MS/MS Quantification Combined with Chemometrics for Comparative Analysis of Different Batches of Raw, Wine-Processed, and Salt-Processed Radix Achyranthis Bidentatae

    Directory of Open Access Journals (Sweden)

    Liu Yang

    2018-03-01

    Full Text Available An accurate and reliable method using ultra-high performance liquid chromatography combined with triple quadrupole tandem mass spectrometry (UHPLC–MS/MS) was established for simultaneous quantification of five major bioactive analytes in raw, wine-processed, and salt-processed Radix Achyranthis bidentatae (RAB). The results showed that this method exhibited desirable sensitivity, precision, stability, and repeatability. The overall intra-day and inter-day variations (RSD) were in the range of 1.57–2.46 and 1.51–3.00%, respectively. The overall recoveries were 98.58–101.48%, with a relative standard deviation (RSD) of 0.01–1.86%. In addition, the developed approach was applied to 21 batches of raw, wine-processed, and salt-processed samples of RAB. Hierarchical clustering analysis (HCA), principal component analysis (PCA), heat map, and boxplot analysis were performed to evaluate the quality of raw, wine-processed, and salt-processed RAB collected from different regions. The chemometrics combined with the quantitative analysis based on UHPLC–MS/MS results indicated that the content of the five analytes increased significantly in processed RAB compared to raw RAB.

  17. Sustainable process design & analysis of hybrid separations

    DEFF Research Database (Denmark)

    Kumar Tula, Anjan; Befort, Bridgette; Garg, Nipun

    2016-01-01

    Distillation is an energy-intensive operation in chemical process industries. There are around 40,000 distillation columns in operation in the US, requiring approximately 40% of the total energy consumption in US chemical process industries. However, analysis of separations by distillation has shown that more than 50% of energy is spent in purifying the last 5-10% of the distillate product. Membrane modules, on the other hand, can achieve high purity separations at lower energy costs, but if the flux is high, a large membrane area is required. A hybrid scheme where distillation and membrane modules are combined such that each operates at its highest efficiency has the potential for significant energy reduction without a significant increase in capital costs. This paper presents a method for sustainable design of hybrid distillation-membrane schemes with guaranteed reduction of energy...

  18. Modeling and Analysis of Power Processing Systems (MAPPS). Volume 1: Technical report

    Science.gov (United States)

    Lee, F. C.; Rahman, S.; Carter, R. A.; Wu, C. H.; Yu, Y.; Chang, R.

    1980-01-01

    Computer aided design and analysis techniques were applied to power processing equipment. Topics covered include: (1) discrete time domain analysis of switching regulators for performance analysis; (2) design optimization of power converters using augmented Lagrangian penalty function technique; (3) investigation of current-injected multiloop controlled switching regulators; and (4) application of optimization for Navy VSTOL energy power system. The generation of the mathematical models and the development and application of computer aided design techniques to solve the different mathematical models are discussed. Recommendations are made for future work that would enhance the application of the computer aided design techniques for power processing systems.

  19. Analysis of the resolution processes of three modeling tasks

    Directory of Open Access Journals (Sweden)

    Cèsar Gallart Palau

    2017-08-01

    Full Text Available In this paper we present a comparative analysis of the resolution process of three modeling tasks performed by secondary education students (13-14 years), designed from three different points of view: Model-Eliciting Activities, the LEMA project, and Realistic Mathematical Problems. The purpose of this analysis is to obtain a methodological characterization of the tasks in order to provide secondary education teachers with a proper selection and sequencing of tasks for implementation in their classrooms.

  20. IMPRINT Analysis of an Unmanned Air System Geospatial Information Process

    National Research Council Canada - National Science Library

    Hunn, Bruce P; Schweitzer, Kristin M; Cahir, John A; Finch, Mary M

    2008-01-01

    ... intelligence, geospatial analysis cell. The Improved Performance Research Integration Tool (IMPRINT) modeling program was used to understand this process and to assess crew workload during several test scenarios...

  1. Data acquisition and processing system for reactor noise analysis

    International Nuclear Information System (INIS)

    Costa Oliveira, J.; Morais Da Veiga, C.; Forjaz Trigueiros, D.; Pombo Duarte, J.

    1975-01-01

A data acquisition and processing system for reactor noise analysis by time correlation methods is described, consisting of one to four data feeding channels (transducer, associated electronics and V/f converter), a sampling unit, a landline transmission system and a PDP-15 computer. The system is being applied to study the kinetic parameters of the 'Reactor Portugues de Investigacao', a swimming-pool 1 MW reactor. The main features that make such a data acquisition and processing system a useful tool for noise analysis are: the improved characteristics of the analog-to-digital converters employed to quantize the signals; the use of an on-line computer, which allows a great accumulation and rapid treatment of data together with an easy check of the correctness of the experiments; and the adoption of the two-detector time cross-correlation technique, which by-passes the limitation of low-efficiency detectors. (author)
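
A minimal sketch of the two-detector time cross-correlation idea, on synthetic synchronously sampled signals (the data and rates are illustrative): the local detector noises are uncorrelated and average out of the cross-correlation, while the shared component survives.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 1000.0                                  # sampling rate, Hz (assumed)
n = 2000
common = rng.standard_normal(n)              # shared, reactor-related fluctuation
x = common + 0.5 * rng.standard_normal(n)    # detector 1: signal + local noise
y = common + 0.5 * rng.standard_normal(n)    # detector 2: signal + local noise
x -= x.mean()
y -= y.mean()

cxy = np.correlate(x, y, mode="full") / n    # biased cross-correlation estimate
lags = np.arange(-(n - 1), n)

keep = np.abs(lags) <= 100                   # inspect lags within +/- 0.1 s
peak = lags[keep][np.argmax(cxy[keep])]
print(peak / fs)                             # ~0 s: the common component is simultaneous
```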

  2. Integrating Process Mining and Cognitive Analysis to Study EHR Workflow.

    Science.gov (United States)

    Furniss, Stephanie K; Burton, Matthew M; Grando, Adela; Larson, David W; Kaufman, David R

    2016-01-01

There are numerous methods to study workflow. However, few produce the kinds of in-depth analyses needed to understand EHR-mediated workflow. Here we investigated variations in clinicians' EHR workflow by integrating quantitative analysis of patterns of users' EHR-interactions with in-depth qualitative analysis of user performance. We characterized 6 clinicians' patterns of information-gathering using a sequential process-mining approach. The analysis revealed 519 different screen transition patterns performed across 1569 patient cases. No one pattern was followed for more than 10% of patient cases, the 15 most frequent patterns accounted for over half of patient cases (53%), and 27% of cases exhibited unique patterns. By triangulating quantitative and qualitative analyses, we found that participants' EHR-interactive behavior was associated with their routine processes, patient case complexity, and EHR default settings. The proposed approach has significant potential to inform resource allocation for observation and training. In-depth observations helped us to explain variation across users.
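
The pattern-counting core of such a sequential process-mining step can be sketched compactly; the event log below is hypothetical (one ordered list of screens per patient case), not the study's data.

```python
# A minimal sketch: count how often each full screen-transition pattern occurs.
from collections import Counter

case_logs = {
    "case-001": ["chart", "labs", "notes", "orders"],
    "case-002": ["chart", "labs", "notes", "orders"],
    "case-003": ["chart", "notes", "labs"],
}

# Each case's ordered screen sequence is treated as one transition pattern
patterns = Counter(tuple(screens) for screens in case_logs.values())

total = sum(patterns.values())
for pattern, count in patterns.most_common():
    print(f"{count / total:.0%}  {' -> '.join(pattern)}")
```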

  3. Second-order analysis of structured inhomogeneous spatio-temporal point processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Ghorbani, Mohammad

Statistical methodology for spatio-temporal point processes is in its infancy. We consider second-order analysis based on pair correlation functions and K-functions, first for general inhomogeneous spatio-temporal point processes and second for inhomogeneous spatio-temporal Cox processes. Assuming...... spatio-temporal separability of the intensity function, we clarify different meanings of second-order spatio-temporal separability. One is second-order spatio-temporal independence and relates e.g. to log-Gaussian Cox processes with an additive covariance structure of the underlying spatio......-temporal Gaussian process. Another concerns shot-noise Cox processes with a separable spatio-temporal covariance density. We propose diagnostic procedures for checking hypotheses of second-order spatio-temporal separability, which we apply to simulated and real data (the UK 2001 epidemic foot and mouth disease data).

  4. Process integration and pinch analysis in sugarcane industry

    Energy Technology Data Exchange (ETDEWEB)

    Prado, Adelk de Carvalho; Pinheiro, Ricardo Brant [UFMG, Departamento de Engenharia Nuclear, Programa de Pos-Graduacao em Ciencias e Tecnicas Nucleares, Belo Horizonte, MG (Brazil)], E-mail: rbp@nuclear.ufmg.br

    2010-07-01

Process integration techniques were applied, particularly through the Pinch Analysis method, to the sugarcane industry. Research was performed upon harvest data from an agroindustrial complex which processes in excess of 3.5 million metric tons of sugarcane per year, producing motor fuel grade ethanol, standard quality sugar, and delivering excess electric power to the grid. Pinch Analysis was used in assessing internal heat recovery as well as external utility demand targets, while keeping the lowest economically achievable targets for entropy increase. Efficiency of energy use was evaluated for the plant as found (the base case) as well as for five selected process and/or plant design modifications, always under guidance of the method. The first alternative design (case 2) was proposed to evaluate equipment mean idle time in the base case, to support subsequent comparisons. Cases 3 and 4 were used to estimate the upper limits of combined heat and power generation while the raw material supply of the base case is kept; neither case proved worth implementing. Cases 5 and 6 were devised to deal with the bottleneck of the plant, namely boiler capacity, in order to allow for some production increment. The inexpensive, minor modifications considered in case 5 were found unable to produce a reasonable outcome gain. Nevertheless, proper changes in the cane juice evaporation section (case 6) could allow combined sugar and ethanol production to rise by up to 9.1% relative to the base case, without dropping cogenerated power. (author)
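
As a concrete illustration of the utility-targeting step at the heart of Pinch Analysis, here is a minimal problem-table sketch in Python; the stream data and dTmin are invented, not taken from the sugarcane plant.

```python
# A minimal problem-table sketch of Pinch Analysis utility targeting.
# Stream format: (supply T in C, target T in C, CP in kW/K); values assumed.
hot = [(250, 40, 0.15), (200, 80, 0.25)]
cold = [(20, 180, 0.20), (140, 230, 0.30)]
dTmin = 10.0

# Shift hot streams down and cold streams up by dTmin/2
shifted = ([(ts - dTmin / 2, tt - dTmin / 2, cp) for ts, tt, cp in hot]
           + [(ts + dTmin / 2, tt + dTmin / 2, cp) for ts, tt, cp in cold])

bounds = sorted({t for ts, tt, _ in shifted for t in (ts, tt)}, reverse=True)

def net(lo, hi):
    """Net heat surplus (+) or deficit (-) in the shifted interval [lo, hi]."""
    q = 0.0
    for ts, tt, cp in shifted:
        if min(ts, tt) <= lo and max(ts, tt) >= hi:     # stream spans interval
            q += (cp if ts > tt else -cp) * (hi - lo)   # hot releases, cold absorbs
    return q

surplus = [net(bounds[i + 1], bounds[i]) for i in range(len(bounds) - 1)]

cascade, running = [0.0], 0.0
for q in surplus:
    running += q
    cascade.append(running)

qh_min = -min(cascade)                        # minimum hot utility, kW
qc_min = qh_min + cascade[-1]                 # minimum cold utility, kW
pinch = bounds[cascade.index(min(cascade))]   # shifted pinch temperature, C
print(qh_min, qc_min, pinch)
```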

  5. Methodological aspects of fuel performance system analysis at raw hydrocarbon processing plants

    Science.gov (United States)

    Kulbjakina, A. V.; Dolotovskij, I. V.

    2018-01-01

The article discusses the methodological aspects of fuel performance system analysis at raw hydrocarbon (RH) processing plants. Modern RH processing facilities are major consumers of energy resources (ER) for their own needs. Reducing ER consumption, including fuel consumption, and developing a rational fuel system structure are complex and relevant scientific tasks that can only be accomplished using system analysis and complex system synthesis. In accordance with the principles of system analysis, the hierarchical structure of the fuel system, the block scheme for synthesizing the most efficient fuel system alternative using mathematical models, and the set of performance criteria have been developed for the main stages of the study. Results from the introduction of specific engineering solutions for developing on-site energy supply sources at RH processing facilities are provided.

  6. On-road anomaly detection by multimodal sensor analysis and multimedia processing

    Science.gov (United States)

    Orhan, Fatih; Eren, P. E.

    2014-03-01

The use of smartphones in Intelligent Transportation Systems is gaining popularity, yet many challenges exist in developing functional applications. Due to the dynamic nature of transportation, vehicular social applications face complexities such as developing robust sensor management, performing signal and image processing tasks, and sharing information among users. This study utilizes a multimodal sensor analysis framework which enables joint analysis of multiple sensor modalities. It also provides plugin-based analysis interfaces for developing sensor and image processing based applications, and connects its users via a centralized application as well as to social networks to facilitate communication and socialization. Using this framework, an on-road anomaly detector is being developed and tested. The detector utilizes the sensors of a mobile device and is able to identify anomalies such as hard braking, pothole crossing, and speed bump crossing. Upon such detection, the video portion containing the anomaly is automatically extracted in order to enable further image processing analysis. The detection results are shared on a central portal application for online traffic condition monitoring.
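
To make the detection idea concrete, here is a minimal sketch assuming a longitudinal-acceleration stream from the phone; the axis convention, sampling rate, and threshold are illustrative assumptions, not the framework's actual parameters.

```python
# A minimal hard-brake detector: smooth the longitudinal acceleration to
# reject jitter, then flag sustained deceleration below a threshold.
import numpy as np

rng = np.random.default_rng(1)
fs = 50                                       # samples per second (assumed)
ax = rng.normal(0, 0.3, 60 * fs)              # synthetic acceleration, m/s^2
ax[1500:1540] = -6.0                          # injected 0.8 s hard-brake event

window = fs // 2                              # 0.5 s moving average
smooth = np.convolve(ax, np.ones(window) / window, mode="same")

HARD_BRAKE = -4.0                             # deceleration threshold, m/s^2 (assumed)
events = np.flatnonzero(smooth < HARD_BRAKE)
if events.size:
    print(f"hard brake near t = {events[0] / fs:.1f} s")
```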

  7. Design and analysis of nuclear processes with the APROS

    International Nuclear Information System (INIS)

    Haenninen, M.; Puska, E.K.; Nystroem, P.

    1987-01-01

APROS (Advanced Process Simulator) is the product being developed in the process simulator project of Imatran Voima Co. and the Technical Research Centre of Finland. The aim is to design and construct an efficient and easy-to-use computer simulation system for process and automation system design, evaluation, analysis, testing and training purposes. At the halfway point of this project, a working system exists with a large number of proven routines and models. However, considerable development is still foreseen before the project is finished. This article gives an overview of APROS in general and of its nuclear features in particular. The calculational capabilities of the system are presented with the help of one example. (orig.)

  8. The digital storytelling process: A comparative analysis from various experts

    Science.gov (United States)

    Hussain, Hashiroh; Shiratuddin, Norshuhada

    2016-08-01

Digital Storytelling (DST) is a method of delivering information to an audience. It combines narrative and digital media content infused with multimedia elements. To help educators (i.e. the designers) create a compelling digital story, experts have introduced various sets of processes; however, the processes they suggest vary, and some are redundant. The main aim of this study is to propose a single guiding process for the creation of DST. A comparative analysis is employed in which ten DST models from various experts are analysed. The resulting process can also be applied to other multimedia materials that use the concept of DST.

  9. Optimizing Endoscope Reprocessing Resources Via Process Flow Queuing Analysis.

    Science.gov (United States)

    Seelen, Mark T; Friend, Tynan H; Levine, Wilton C

    2018-05-04

    The Massachusetts General Hospital (MGH) is merging its older endoscope processing facilities into a single new facility that will enable high-level disinfection of endoscopes for both the ORs and Endoscopy Suite, leveraging economies of scale for improved patient care and optimal use of resources. Finalized resource planning was necessary for the merging of facilities to optimize staffing and make final equipment selections to support the nearly 33,000 annual endoscopy cases. To accomplish this, we employed operations management methodologies, analyzing the physical process flow of scopes throughout the existing Endoscopy Suite and ORs and mapping the future state capacity of the new reprocessing facility. Further, our analysis required the incorporation of historical case and reprocessing volumes in a multi-server queuing model to identify any potential wait times as a result of the new reprocessing cycle. We also performed sensitivity analysis to understand the impact of future case volume growth. We found that our future-state reprocessing facility, given planned capital expenditures for automated endoscope reprocessors (AERs) and pre-processing sinks, could easily accommodate current scope volume well within the necessary pre-cleaning-to-sink reprocessing time limit recommended by manufacturers. Further, in its current planned state, our model suggested that the future endoscope reprocessing suite at MGH could support an increase in volume of at least 90% over the next several years. Our work suggests that with simple mathematical analysis of historic case data, significant changes to a complex perioperative environment can be made with ease while keeping patient safety as the top priority.
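
For readers who want to see the shape of such a multi-server queuing model, below is a minimal Erlang-C sketch; the arrival rate, reprocessing rate, and AER counts are illustrative assumptions, not MGH's figures.

```python
# A minimal M/M/c sketch: probability of waiting (Erlang C) and mean wait.
from math import factorial

def erlang_c(lam, mu, c):
    """lam: arrivals/h, mu: service rate per server/h, c: servers (rho < 1)."""
    a = lam / mu                       # offered load in Erlangs
    rho = a / c                        # utilization per server
    p0_inv = sum(a**k / factorial(k) for k in range(c)) + \
             a**c / (factorial(c) * (1 - rho))
    pw = (a**c / (factorial(c) * (1 - rho))) / p0_inv   # P(arriving scope waits)
    wq = pw / (c * mu - lam)           # mean wait in queue, hours
    return pw, wq

lam = 12.0   # scopes arriving per hour (assumed)
mu = 2.0     # scopes one AER reprocesses per hour (assumed)
for c in (7, 8, 9):
    pw, wq = erlang_c(lam, mu, c)
    print(f"{c} AERs: P(wait) = {pw:.2f}, mean wait = {60 * wq:.1f} min")
```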

  10. Analysis of business process maturity and organisational performance relations

    Directory of Open Access Journals (Sweden)

    Kalinowski T. Bartosz

    2016-12-01

The paper aims to present results of a study on business process maturity in relation to organisational performance. A two-phase methodology, based on a literature review and a survey, was used. The literature is a source of knowledge about business process maturity and organisational performance, whereas the research on process maturity vs organisational performance in Polish enterprises provides findings based on 84 surveyed companies. The main areas of the research covered: identification and analysis of maturity-related variables, and identification of organisational performance perspectives and their relation to process maturity. The study shows that there is a significant positive relation between process maturity and organisational performance. Although studies on such a relation are available, they are scarce and have significant limitations in terms of research sample or the scope of maturity or organisational performance covered. This publication is part of a project funded by the National Science Centre awarded by decision number DEC-2011/01/D/HS4/04070.

  11. A Review of Literature on analysis of JIG Grinding Process

    DEFF Research Database (Denmark)

    Sudheesh, P. K.; Puthumana, Govindan

    2016-01-01

in jig grinding, because of their uniformity and purity. In this paper, a brief review of the analysis of the jig grinding process considering various research trends is presented. The areas highlighted are: optimization, selection of abrasives, selection of processing conditions, and practical considerations....... The optimization of parameters in the jig grinding process is important to maximize productivity and to improve quality. The abrasives of hard jig grinding wheels become blunt quickly, so these are recommended for grinding workpieces of low hardness, while soft grinding wheels are recommended for hard workpiece materials. The jig...

  12. A new microcomputer program for processing data in neutron activation analysis

    International Nuclear Information System (INIS)

    Beeley, P.A.; Page, J.A.; Heimlich, M.S.; Queen's Univ., Kingston, ON; Edward, J.B.; Bennett, L.G.I.

    1993-01-01

A new utility program for processing data in neutron activation analysis (NAA) has been developed for use on MS-DOS microcomputers. Peak areas are read from ASCII data files of gamma-ray spectra which have been processed by a Gaussian peak fitting program, GAMANAL-PC. Elemental concentrations are then calculated by this new program, QUACANAL, via a semi-absolute algorithm that uses pre-determined activation constants. User-defined ASCII library files are employed to specify the elements of interest required for analysis, and (n, p) and (n, α) interferences are taken into account. The program has been written in Turbo Pascal, is menu driven and contains options for processing data from cyclic NAA. An interactive philosophy has been used in designing the program. (author) 12 refs.; 2 figs.; 1 tab

  13. Global sensitivity analysis of Alkali-Surfactant-Polymer enhanced oil recovery processes

    Energy Technology Data Exchange (ETDEWEB)

    Carrero, Enrique; Queipo, Nestor V.; Pintos, Salvador; Zerpa, Luis E. [Applied Computing Institute, Faculty of Engineering, University of Zulia, Zulia (Venezuela)

    2007-08-15

After conventional waterflooding processes, the residual oil in the reservoir remains as a discontinuous phase in the form of oil drops trapped by capillary forces and is likely to be around 70% of the original oil in place (OOIP). The EOR method known as Alkaline-Surfactant-Polymer (ASP) flooding has been proven effective in reducing the residual oil saturation in laboratory experiments and field projects through reduction of interfacial tension and mobility ratio between oil and water phases. A critical step for the optimal design and control of ASP recovery processes is to find the relative contributions of design variables, such as slug size and chemical concentrations, to the variability of given performance measures (e.g., net present value, cumulative oil recovery), considering a heterogeneous and multiphase petroleum reservoir (sensitivity analysis). Previously reported works using reservoir numerical simulation have been limited to local sensitivity analyses, because a global sensitivity analysis may require hundreds or even thousands of computationally expensive evaluations (field-scale numerical simulations). To overcome this issue, a surrogate-based approach is suggested. Surrogate-based analysis/optimization refers to the idea of constructing an alternative fast model (surrogate) from numerical simulation data and using it for analysis/optimization purposes. This paper presents an efficient global sensitivity approach based on Sobol's method and multiple surrogates (i.e., Polynomial Regression, Kriging, Radial Basis Functions and a Weighted Adaptive Model), with the multiple surrogates used to address the uncertainty in the analysis derived from plausible alternative surrogate-modeling schemes. The proposed approach was evaluated in the context of the global sensitivity analysis of a field-scale Alkali-Surfactant-Polymer flooding process. The design variables and the performance measure in the ASP process were selected as slug size
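
The surrogate-plus-Sobol workflow can be sketched compactly. Below is a minimal Python illustration of first-order Sobol indices computed on a cheap stand-in surrogate; the response function and input ranges are invented, not the reservoir model.

```python
# A minimal first-order Sobol estimator (Saltelli-style pick-and-freeze) run
# against a hypothetical fast surrogate of a performance measure such as NPV.
import numpy as np

rng = np.random.default_rng(1)

def surrogate(x):
    # Stand-in for the expensive field-scale simulation; purely illustrative.
    slug, surf, poly = x[:, 0], x[:, 1], x[:, 2]
    return 3 * slug + slug * surf + 0.2 * poly**2

n, d = 10_000, 3
A = rng.random((n, d))          # inputs scaled to [0, 1] for simplicity
B = rng.random((n, d))
fA, fB = surrogate(A), surrogate(B)
var = np.var(np.concatenate([fA, fB]))

for i, name in enumerate(["slug size", "surfactant conc.", "polymer conc."]):
    ABi = A.copy()
    ABi[:, i] = B[:, i]         # matrix A with column i taken from B
    Si = np.mean(fB * (surrogate(ABi) - fA)) / var   # first-order index
    print(f"S1[{name}] = {Si:.2f}")
```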

  14. Spatial Analysis of Depots for Advanced Biomass Processing

    Energy Technology Data Exchange (ETDEWEB)

    Hilliard, Michael R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Brandt, Craig C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Webb, Erin [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Sokhansanj, Shahabaddine [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Eaton, Laurence M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Martinez Gonzalez, Maria I. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2014-09-01

    The objective of this work was to perform a spatial analysis of the total feedstock cost at the conversion reactor for biomass supplied by a conventional system and an advanced system with depots to densify biomass into pellets. From these cost estimates, the conditions (feedstock cost and availability) for which advanced processing depots make it possible to achieve cost and volume targets can be identified.

  15. Stochastic Analysis of Gaussian Processes via Fredholm Representation

    Directory of Open Access Journals (Sweden)

    Tommi Sottinen

    2016-01-01

We show that every separable Gaussian process with integrable variance function admits a Fredholm representation with respect to a Brownian motion. We extend the Fredholm representation to a transfer principle and develop stochastic analysis by using it. We show the convenience of the Fredholm representation by giving applications to equivalence in law, bridges, series expansions, stochastic differential equations, and maximum likelihood estimations.
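
Schematically, the representation in question can be written as follows; this is a standard form consistent with the abstract, with the kernel notation ours rather than necessarily the paper's.

```latex
% X a separable Gaussian process with integrable variance on [0,T],
% W a Brownian motion, K_T a kernel in L^2([0,T]^2):
X_t = \int_0^T K_T(t,s)\,\mathrm{d}W_s, \qquad t \in [0,T],
\qquad
\operatorname{Cov}(X_t, X_u) = \int_0^T K_T(t,s)\,K_T(u,s)\,\mathrm{d}s .
```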

  16. Accessibility analysis in manufacturing processes using visibility cones

    Institute of Scientific and Technical Information of China (English)

    尹周平; 丁汉; 熊有伦

    2002-01-01

Accessibility is an important design feature of products, and accessibility analysis has been acknowledged as a powerful tool for solving computational manufacturing problems arising from different manufacturing processes. After exploring the relations among approachability, accessibility and visibility, a general method for accessibility analysis using visibility cones (VC) is proposed. With the definition of the VC of a point, three kinds of visibility of a feature, namely the complete visibility cone (CVC), partial visibility cone (PVC) and local visibility cone (LVC), are defined. A novel approach to computing VCs is formulated by identifying C-obstacles in the C-space, for which a general and efficient algorithm is proposed and implemented by making use of visibility culling. Lastly, we discuss briefly how to realize accessibility analysis in numerically controlled (NC) machining planning, coordinate measuring machine (CMM) inspection planning and assembly sequence planning with the proposed methods.

  17. Definition of distance for nonlinear time series analysis of marked point process data

    Energy Technology Data Exchange (ETDEWEB)

    Iwayama, Koji, E-mail: koji@sat.t.u-tokyo.ac.jp [Research Institute for Food and Agriculture, Ryukoku Univeristy, 1-5 Yokotani, Seta Oe-cho, Otsu-Shi, Shiga 520-2194 (Japan); Hirata, Yoshito; Aihara, Kazuyuki [Institute of Industrial Science, The University of Tokyo, 4-6-1 Komaba, Meguro-ku, Tokyo 153-8505 (Japan)

    2017-01-30

Marked point process data are time series of discrete events accompanied by some values, such as economic trades, earthquakes, and lightning strikes. A distance for marked point process data allows us to apply nonlinear time series analysis to such data. We propose a distance for marked point process data which can be calculated much faster than the existing distance when the number of marks is small. Furthermore, under some assumptions, the Kullback–Leibler divergences between posterior distributions for neighbors defined by this distance are small. We performed numerical simulations showing that analysis based on the proposed distance is effective. - Highlights: • A new distance for marked point process data is proposed. • The distance can be computed fast enough for a small number of marks. • A method to optimize the parameter values of the distance is also proposed. • Numerical simulations indicate that analysis based on the distance is effective.

  18. Multifractal detrended fluctuation analysis of analog random multiplicative processes

    Energy Technology Data Exchange (ETDEWEB)

    Silva, L.B.M.; Vermelho, M.V.D. [Instituto de Fisica, Universidade Federal de Alagoas, Maceio - AL, 57072-970 (Brazil); Lyra, M.L. [Instituto de Fisica, Universidade Federal de Alagoas, Maceio - AL, 57072-970 (Brazil)], E-mail: marcelo@if.ufal.br; Viswanathan, G.M. [Instituto de Fisica, Universidade Federal de Alagoas, Maceio - AL, 57072-970 (Brazil)

    2009-09-15

We investigate non-Gaussian statistical properties of stationary stochastic signals generated by an analog circuit that simulates a random multiplicative process with weak additive noise. The random noises originate from thermal shot noise and avalanche processes, while the multiplicative process is generated by a fully analog circuit. The resulting signal describes stochastic time series of current interest in several areas such as turbulence, finance, biology and the environment, which exhibit power-law distributions. Specifically, we study the correlation properties of the signal by employing a detrended fluctuation analysis and explore its multifractal nature. The singularity spectrum is obtained and analyzed as a function of the control circuit parameter that tunes the asymptotic power-law form of the probability distribution function.
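
The fluctuation-function computation behind such a multifractal analysis can be sketched briefly; the synthetic signal, scales, and q values below are illustrative, not the circuit's output.

```python
# A minimal multifractal DFA sketch: a q-dependent generalized Hurst exponent
# h(q) signals multifractality (q = 0 needs special handling and is omitted).
import numpy as np

rng = np.random.default_rng(0)
x = np.exp(rng.normal(0, 0.5, 2**14)) * rng.normal(0, 1, 2**14)  # toy signal

profile = np.cumsum(x - x.mean())
scales = np.unique(np.logspace(2, 3.5, 12).astype(int))
qs = [-4.0, -2.0, 2.0, 4.0]

h = {}
for q in qs:
    logF = []
    for s in scales:
        nseg = profile.size // s
        segs = profile[:nseg * s].reshape(nseg, s)
        t = np.arange(s)
        # Variance around a local linear trend in each segment
        f2 = np.array([np.var(seg - np.polyval(np.polyfit(t, seg, 1), t))
                       for seg in segs])
        logF.append(np.log(np.mean(f2 ** (q / 2)) ** (1 / q)))
    h[q] = np.polyfit(np.log(scales), logF, 1)[0]

print(h)
```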

  19. A novel process for recovery of fermentation-derived succinic acid: process design and economic analysis.

    Science.gov (United States)

    Orjuela, Alvaro; Orjuela, Andrea; Lira, Carl T; Miller, Dennis J

    2013-07-01

    Recovery and purification of organic acids produced in fermentation constitutes a significant fraction of total production cost. In this paper, the design and economic analysis of a process to recover succinic acid (SA) via dissolution and acidification of succinate salts in ethanol, followed by reactive distillation to form succinate esters, is presented. Process simulation was performed for a range of plant capacities (13-55 million kg/yr SA) and SA fermentation titers (50-100 kg/m(3)). Economics were evaluated for a recovery system installed within an existing fermentation facility producing succinate salts at a cost of $0.66/kg SA. For a SA processing capacity of 54.9 million kg/yr and a titer of 100 kg/m(3) SA, the model predicts a capital investment of $75 million and a net processing cost of $1.85 per kg SA. Required selling price of diethyl succinate for a 30% annual return on investment is $1.57 per kg. Copyright © 2013 Elsevier Ltd. All rights reserved.
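
As a rough illustration of how a required selling price follows from capital and processing cost, here is a deliberately simplified sketch; it ignores taxes, depreciation, and the SA-to-ester mass conversion, so it will not reproduce the paper's exact figure.

```python
# A minimal cash-flow sketch of the break-even-price idea; only the capital,
# capacity, and unit cost come from the abstract, the rest is simplification.
capital = 75e6          # total capital investment, $ (from the abstract)
capacity = 54.9e6       # kg succinic acid processed per year (from the abstract)
unit_cost = 1.85        # net processing cost, $/kg SA (from the abstract)
target_roi = 0.30       # 30% annual return on investment

# Selling price that recovers the processing cost plus the target annual return
price = unit_cost + target_roi * capital / capacity
print(f"required price ~ ${price:.2f}/kg")
```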

  20. From perceptual to lexico-semantic analysis--cortical plasticity enabling new levels of processing.

    Science.gov (United States)

    Schlaffke, Lara; Rüther, Naima N; Heba, Stefanie; Haag, Lauren M; Schultz, Thomas; Rosengarth, Katharina; Tegenthoff, Martin; Bellebaum, Christian; Schmidt-Wilcke, Tobias

    2015-11-01

Certain kinds of stimuli can be processed on multiple levels. While the neural correlates of different levels of processing (LOPs) have been investigated to some extent, most of the studies involve skills and/or knowledge already present when performing the task. In this study we specifically sought to identify neural correlates of an evolving skill that allows the transition from perceptual to lexico-semantic stimulus analysis. Eighteen participants were trained to decode 12 letters of Morse code that were presented acoustically inside and outside of the scanner environment. Morse code was presented in trains of three letters while brain activity was assessed with fMRI. Participants either attended to the stimulus length (perceptual analysis), or evaluated its meaning, distinguishing words from nonwords (lexico-semantic analysis). Perceptual and lexico-semantic analyses shared a mutual network comprising the left premotor cortex, the supplementary motor area (SMA) and the inferior parietal lobule (IPL). Perceptual analysis was associated with strong brain activation in the SMA and the superior temporal gyrus (STG) bilaterally, which remained unaltered from pre- to post-training. In the lexico-semantic analysis post learning, study participants showed additional activation in the left inferior frontal cortex (IFC) and in the left occipitotemporal cortex (OTC), regions known to be critically involved in lexical processing. Our data provide evidence for cortical plasticity evolving with a learning process, enabling the transition from perceptual to lexico-semantic stimulus analysis. Importantly, the activation pattern remains tied to the task-defined LOP and is thus the result of a decision process as to which LOP to engage in. © 2015 The Authors. Human Brain Mapping Published by Wiley Periodicals, Inc.

  1. Simulation, integration, and economic analysis of gas-to-liquid processes

    International Nuclear Information System (INIS)

    Bao, Buping; El-Halwagi, Mahmoud M.; Elbashir, Nimir O.

    2010-01-01

Gas-to-liquid (GTL) involves the chemical conversion of natural gas into synthetic crude that can be upgraded and separated into different useful hydrocarbon fractions, including liquid transportation fuels. Such technology can also be used to convert other abundant natural resources such as coal and biomass to fuels and value-added chemicals (referred to as coal-to-liquid (CTL) and biomass-to-liquid (BTL)). A leading GTL technology is the Fischer-Tropsch (FT) process. The objective of this work is to provide a techno-economic analysis of the GTL process and to identify optimization and integration opportunities for cost saving and reduction of energy usage while accounting for the environmental impact. First, a base-case flowsheet is synthesized to include the key processing steps of the plant. Then, a computer-aided process simulation is carried out to determine the key mass and energy flows, performance criteria, and equipment specifications. Next, energy and mass integration studies are performed to address the following items: (a) heating and cooling utilities, (b) combined heat and power (process cogeneration), (c) management of process water, (d) optimization of tail gas allocation, and (e) recovery of catalyst-supporting hydrocarbon solvents. These integration studies are conducted and the results are documented in terms of conserving energy and mass resources as well as their economic impact. Finally, an economic analysis is undertaken to determine the plant capacity needed to achieve the break-even point and to estimate the return on investment for the base-case study. (author)

  2. Analysis of the packet formation process in packet-switched networks

    Science.gov (United States)

    Meditch, J. S.

    Two new queueing system models for the packet formation process in packet-switched telecommunication networks are developed, and their applications in process stability, performance analysis, and optimization studies are illustrated. The first, an M/M/1 queueing system characterization of the process, is a highly aggregated model which is useful for preliminary studies. The second, a marked extension of an earlier M/G/1 model, permits one to investigate stability, performance characteristics, and design of the packet formation process in terms of the details of processor architecture, and hardware and software implementations with processor structure and as many parameters as desired as variables. The two new models together with the earlier M/G/1 characterization span the spectrum of modeling complexity for the packet formation process from basic to advanced.
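
A minimal numerical contrast of the two characterizations, with illustrative rates (not measurements of any real packet processor): the M/G/1 model's Pollaczek-Khinchine formula exposes the effect of service-time variability that the M/M/1 aggregate hides.

```python
# M/M/1 mean queueing delay vs. M/G/1 (Pollaczek-Khinchine) mean delay.
lam = 800.0            # packets arriving per second (assumed)
mu = 1000.0            # packets served per second, i.e. mean service 1 ms
rho = lam / mu         # utilization; must stay below 1 for stability

wq_mm1 = rho / (mu - lam)              # M/M/1 mean wait in queue, s

es = 1 / mu                            # mean service time
cs2 = 4.0                              # squared coeff. of variation (bursty service)
es2 = (cs2 + 1) * es**2                # second moment of service time
wq_mg1 = lam * es2 / (2 * (1 - rho))   # Pollaczek-Khinchine formula

print(f"M/M/1: {1e3 * wq_mm1:.2f} ms, M/G/1: {1e3 * wq_mg1:.2f} ms")
```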

  3. Partial wave analysis using graphics processing units

    Energy Technology Data Exchange (ETDEWEB)

    Berger, Niklaus; Liu Beijiang; Wang Jike, E-mail: nberger@ihep.ac.c [Institute of High Energy Physics, Chinese Academy of Sciences, 19B Yuquan Lu, Shijingshan, 100049 Beijing (China)

    2010-04-01

    Partial wave analysis is an important tool for determining resonance properties in hadron spectroscopy. For large data samples however, the un-binned likelihood fits employed are computationally very expensive. At the Beijing Spectrometer (BES) III experiment, an increase in statistics compared to earlier experiments of up to two orders of magnitude is expected. In order to allow for a timely analysis of these datasets, additional computing power with short turnover times has to be made available. It turns out that graphics processing units (GPUs) originally developed for 3D computer games have an architecture of massively parallel single instruction multiple data floating point units that is almost ideally suited for the algorithms employed in partial wave analysis. We have implemented a framework for tensor manipulation and partial wave fits called GPUPWA. The user writes a program in pure C++ whilst the GPUPWA classes handle computations on the GPU, memory transfers, caching and other technical details. In conjunction with a recent graphics processor, the framework provides a speed-up of the partial wave fit by more than two orders of magnitude compared to legacy FORTRAN code.

  4. An Intelligent System for Modelling, Design and Analysis of Chemical Processes

    DEFF Research Database (Denmark)

    Gani, Rafiqul

ICAS, Integrated Computer Aided System, is a software package consisting of a number of intelligent tools which are very suitable, among others, for computer-aided modelling, sustainable design of chemical and biochemical processes, and design-analysis of product-process monitoring systems. Each...... the computer aided modelling tool will illustrate how to generate a desired process model, how to analyze the model equations, how to extract data and identify the model and make it ready for various types of application. In sustainable process design, the example will highlight the issue of integration...

  5. Process development and exergy cost sensitivity analysis of a hybrid molten carbonate fuel cell power plant and carbon dioxide capturing process

    Science.gov (United States)

    Mehrpooya, Mehdi; Ansarinasab, Hojat; Moftakhari Sharifzadeh, Mohammad Mehdi; Rosen, Marc A.

    2017-10-01

An integrated power plant with a net electrical power output of 3.71 × 10⁵ kW is developed and investigated. The electrical efficiency of the process is found to be 60.1%. The process includes three main sub-systems: the molten carbonate fuel cell system, the heat recovery section and the cryogenic carbon dioxide capturing process. Conventional and advanced exergoeconomic methods are used for analyzing the process. Advanced exergoeconomic analysis is a comprehensive evaluation tool which combines an exergetic approach with economic analysis procedures. With this method, investment and exergy destruction costs of the process components are divided into endogenous/exogenous and avoidable/unavoidable parts. Results of the conventional exergoeconomic analyses demonstrate that the combustion chamber has the largest exergy destruction rate (182 MW) and cost rate (13,100 $/h). Also, the total process cost rate can be decreased by reducing the cost rate of the fuel cell and improving the efficiency of the combustion chamber and heat recovery steam generator. Based on the total avoidable endogenous cost rate, the priority for modification is the heat recovery steam generator, a compressor and a turbine of the power plant, in rank order. A sensitivity analysis is done to investigate the exergoeconomic factor parameters by varying the effective parameters.

  6. SDI-based business processes: A territorial analysis web information system in Spain

    Science.gov (United States)

    Béjar, Rubén; Latre, Miguel Á.; Lopez-Pellicer, Francisco J.; Nogueras-Iso, Javier; Zarazaga-Soria, F. J.; Muro-Medrano, Pedro R.

    2012-09-01

Spatial Data Infrastructures (SDIs) provide access to geospatial data and operations through interoperable Web services. These data and operations can be chained to set up specialized geospatial business processes, and these processes can give support to different applications. End users can benefit from these applications, while experts can integrate the Web services in their own business processes and developments. This paper presents an SDI-based territorial analysis Web information system for Spain, which gives access to land cover, topography and elevation data, as well as to a number of interoperable geospatial operations by means of a Web Processing Service (WPS). Several examples illustrate how different territorial analysis business processes are supported. The system has been established by the Spanish National SDI (Infraestructura de Datos Espaciales de España, IDEE) both as an experimental platform for geoscientists and geoinformation system developers, and as a mechanism to contribute to the Spanish citizens' knowledge about their territory.

  7. Modeling and flow analysis of pure nylon polymer for injection molding process

    International Nuclear Information System (INIS)

    Nuruzzaman, D M; Kusaseh, N; Basri, S; Hamedon, Z; Oumer, A N

    2016-01-01

In the production of complex plastic parts, injection molding is one of the most popular industrial processes. This paper addresses the modeling and analysis of the flow process of nylon (polyamide) polymer for the injection molding process. To determine the best molding conditions, a series of simulations are carried out using Autodesk Moldflow Insight software and the processing parameters are adjusted. This mold-filling commercial software simulates the cavity filling pattern along with temperature and pressure distributions in the mold cavity. In the modeling, during the plastic's flow inside the mold cavity, different flow parameters such as fill time, pressure, temperature, shear rate and warp at different locations in the cavity are analyzed. Overall, Moldflow is able to perform a relatively sophisticated analysis of the flow process of pure nylon. Thus the prediction of the filling of a mold cavity is very important, and it becomes useful before a nylon plastic part is manufactured. (paper)

  8. Modeling and flow analysis of pure nylon polymer for injection molding process

    Science.gov (United States)

    Nuruzzaman, D. M.; Kusaseh, N.; Basri, S.; Oumer, A. N.; Hamedon, Z.

    2016-02-01

In the production of complex plastic parts, injection molding is one of the most popular industrial processes. This paper addresses the modeling and analysis of the flow process of nylon (polyamide) polymer for the injection molding process. To determine the best molding conditions, a series of simulations are carried out using Autodesk Moldflow Insight software and the processing parameters are adjusted. This mold-filling commercial software simulates the cavity filling pattern along with temperature and pressure distributions in the mold cavity. In the modeling, during the plastic's flow inside the mold cavity, different flow parameters such as fill time, pressure, temperature, shear rate and warp at different locations in the cavity are analyzed. Overall, Moldflow is able to perform a relatively sophisticated analysis of the flow process of pure nylon. Thus the prediction of the filling of a mold cavity is very important, and it becomes useful before a nylon plastic part is manufactured.

  9. Mathematical foundations of image processing and analysis

    CERN Document Server

    Pinoli, Jean-Charles

    2014-01-01

Mathematical Imaging is currently a rapidly growing field in applied mathematics, with an increasing need for theoretical mathematics. This book, the second of two volumes, emphasizes the role of mathematics as a rigorous basis for imaging sciences. It provides a comprehensive and convenient overview of the key mathematical concepts, notions, tools and frameworks involved in the various fields of gray-tone and binary image processing and analysis, by proposing a large, but coherent, set of symbols and notations, a complete list of subjects and a detailed bibliography. It establishes a bridge...

  10. Static Analysis for Event-Based XML Processing

    DEFF Research Database (Denmark)

    Møller, Anders

    2008-01-01

Event-based processing of XML data - as exemplified by the popular SAX framework - is a powerful alternative to using W3C's DOM or similar tree-based APIs. The event-based approach processes documents in a streaming fashion with minimal memory consumption. This paper discusses challenges for creating program analyses...... for SAX applications. In particular, we consider the problem of statically guaranteeing that a given SAX program always produces only well-formed and valid XML output. We propose an analysis technique based on existing analyses of Servlets, string operations, and XML graphs....
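
For concreteness, here is a minimal event-based (SAX-style) handler using Python's standard xml.sax module; it illustrates the streaming model the analysis targets, not the paper's analysis itself.

```python
# A minimal SAX handler: react to parse events instead of building a tree.
import xml.sax

class ItemCounter(xml.sax.ContentHandler):
    """Counts <item> elements without holding the document in memory."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def startElement(self, name, attrs):
        if name == "item":
            self.count += 1

handler = ItemCounter()
xml.sax.parseString(b"<list><item/><item/></list>", handler)
print(handler.count)  # 2
```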

  11. Liveness and Reachability Analysis of BPMN Process Models

    Directory of Open Access Journals (Sweden)

    Anass Rachdi

    2016-06-01

Business processes are usually defined by business experts who require intuitive and informal graphical notations such as BPMN (Business Process Model and Notation) for documenting and communicating their organization's activities and behavior. However, BPMN has not been provided with a formal semantics, which limits the analysis of BPMN models to using solely informal techniques such as simulation. In order to address this limitation and use formal verification, it is necessary to define a certain "mapping" between BPMN and a formal language such as Communicating Sequential Processes (CSP) or Petri Nets (PN). This paper proposes a method for the verification of BPMN models by defining a formal semantics of BPMN in terms of a mapping to Time Petri Nets (TPN), which are equipped with very efficient analytical techniques. After the translation of BPMN models to TPN, verification is done to ensure that some functional properties are satisfied by the model under investigation, namely liveness and reachability properties. The main advantage of our approach over existing ones is that it takes into account the time component in modeling business process models. An example is used throughout the paper to illustrate the proposed method.
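
To illustrate the kind of property being checked after translation, here is a minimal sketch of reachability analysis over the markings of a plain (untimed) Petri net; the two-transition net is a toy, not a TPN produced by the proposed mapping.

```python
# Breadth-first exploration of the reachable markings of a small Petri net.
from collections import deque

# Transitions: name -> (consumed tokens, produced tokens), over places p0..p2
transitions = {
    "t1": ({0: 1}, {1: 1}),
    "t2": ({1: 1}, {2: 1}),
}
initial = (1, 0, 0)          # one token in the start place
goal = (0, 0, 1)             # token in the end place (process completed)

def fire(marking, pre, post):
    if any(marking[p] < n for p, n in pre.items()):
        return None          # transition not enabled in this marking
    m = list(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] += n
    return tuple(m)

seen, queue = {initial}, deque([initial])
while queue:
    m = queue.popleft()
    for pre, post in transitions.values():
        nxt = fire(m, pre, post)
        if nxt is not None and nxt not in seen:
            seen.add(nxt)
            queue.append(nxt)

print(goal in seen)          # True: the end marking is reachable
```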

  12. An introduction to audio content analysis applications in signal processing and music informatics

    CERN Document Server

    Lerch, Alexander

    2012-01-01

    "With the proliferation of digital audio distribution over digital media, audio content analysis is fast becoming a requirement for designers of intelligent signal-adaptive audio processing systems. Written by a well-known expert in the field, this book provides quick access to different analysis algorithms and allows comparison between different approaches to the same task, making it useful for newcomers to audio signal processing and industry experts alike. A review of relevant fundamentals in audio signal processing, psychoacoustics, and music theory, as well as downloadable MATLAB files are also included"--

  13. Aspects of second-order analysis of structured inhomogeneous spatio-temporal processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Ghorbani, Mohammad

    2012-01-01

    Statistical methodology for spatio-temporal point processes is in its infancy. We consider second-order analysis based on pair correlation functions and K-functions for general inhomogeneous spatio-temporal point processes and for inhomogeneous spatio-temporal Cox processes. Assuming spatio......-temporal separability of the intensity function, we clarify different meanings of second-order spatio-temporal separability. One is second-order spatio-temporal independence and relates to log-Gaussian Cox processes with an additive covariance structure of the underlying spatio-temporal Gaussian process. Another...... concerns shot-noise Cox processes with a separable spatio-temporal covariance density. We propose diagnostic procedures for checking hypotheses of second-order spatio-temporal separability, which we apply on simulated and real data....

  14. Development of safety analysis and constraint detection techniques for process interaction errors

    Energy Technology Data Exchange (ETDEWEB)

    Fan, Chin-Feng, E-mail: csfanc@saturn.yzu.edu.tw [Computer Science and Engineering Dept., Yuan-Ze University, Taiwan (China); Tsai, Shang-Lin; Tseng, Wan-Hui [Computer Science and Engineering Dept., Yuan-Ze University, Taiwan (China)

    2011-02-15

Among the new failure modes introduced by computers into safety systems, the process interaction error is the most unpredictable and complicated failure mode, which may cause disastrous consequences. This paper presents safety analysis and constraint detection techniques for process interaction errors among hardware, software, and human processes. Among interaction errors, the most dreadful ones are those that involve run-time misinterpretation from a logic process. We call them the 'semantic interaction errors'. Such abnormal interaction is not adequately emphasized in current research. In our static analysis, we provide a fault tree template focusing on semantic interaction errors by checking conflicting pre-conditions and post-conditions among interacting processes. Thus, far-fetched, but highly risky, interaction scenarios involving interpretation errors can be identified. For run-time monitoring, a range of constraint types is proposed for checking abnormal signs at run time. We extend current constraints to a broader relational level and a global level, considering process/device dependencies and physical conservation rules in order to detect process interaction errors. The proposed techniques can reduce abnormal interactions; they can also be used to assist in safety-case construction.

  15. Development of safety analysis and constraint detection techniques for process interaction errors

    International Nuclear Information System (INIS)

    Fan, Chin-Feng; Tsai, Shang-Lin; Tseng, Wan-Hui

    2011-01-01

Among the new failure modes introduced by computers into safety systems, the process interaction error is the most unpredictable and complicated failure mode, which may cause disastrous consequences. This paper presents safety analysis and constraint detection techniques for process interaction errors among hardware, software, and human processes. Among interaction errors, the most dreadful ones are those that involve run-time misinterpretation from a logic process. We call them the 'semantic interaction errors'. Such abnormal interaction is not adequately emphasized in current research. In our static analysis, we provide a fault tree template focusing on semantic interaction errors by checking conflicting pre-conditions and post-conditions among interacting processes. Thus, far-fetched, but highly risky, interaction scenarios involving interpretation errors can be identified. For run-time monitoring, a range of constraint types is proposed for checking abnormal signs at run time. We extend current constraints to a broader relational level and a global level, considering process/device dependencies and physical conservation rules in order to detect process interaction errors. The proposed techniques can reduce abnormal interactions; they can also be used to assist in safety-case construction.

  16. Introducing uncertainty analysis of nucleation and crystal growth models in Process Analytical Technology (PAT) system design of crystallization processes.

    Science.gov (United States)

    Samad, Noor Asma Fazli Abdul; Sin, Gürkan; Gernaey, Krist V; Gani, Rafiqul

    2013-11-01

    This paper presents the application of uncertainty and sensitivity analysis as part of a systematic model-based process monitoring and control (PAT) system design framework for crystallization processes. For the uncertainty analysis, the Monte Carlo procedure is used to propagate input uncertainty, while for sensitivity analysis, global methods including the standardized regression coefficients (SRC) and Morris screening are used to identify the most significant parameters. The potassium dihydrogen phosphate (KDP) crystallization process is used as a case study, both in open-loop and closed-loop operation. In the uncertainty analysis, the impact on the predicted output of uncertain parameters related to the nucleation and the crystal growth model has been investigated for both a one- and two-dimensional crystal size distribution (CSD). The open-loop results show that the input uncertainties lead to significant uncertainties on the CSD, with appearance of a secondary peak due to secondary nucleation for both cases. The sensitivity analysis indicated that the most important parameters affecting the CSDs are nucleation order and growth order constants. In the proposed PAT system design (closed-loop), the target CSD variability was successfully reduced compared to the open-loop case, also when considering uncertainty in nucleation and crystal growth model parameters. The latter forms a strong indication of the robustness of the proposed PAT system design in achieving the target CSD and encourages its transfer to full-scale implementation. Copyright © 2013 Elsevier B.V. All rights reserved.
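
To give a feel for the two analysis ingredients named here, below is a minimal Monte Carlo propagation plus standardized-regression-coefficient (SRC) sketch in Python; the stand-in output model and parameter ranges are invented, not the KDP crystallization model.

```python
# Monte Carlo input-uncertainty propagation followed by SRC screening.
import numpy as np

rng = np.random.default_rng(7)
n = 5000

# Uncertain kinetic inputs: nucleation order b, growth order g, rate constant kg
b = rng.uniform(1.5, 2.5, n)
g = rng.uniform(1.0, 1.6, n)
kg = rng.uniform(0.8, 1.2, n)

# Hypothetical scalar output, e.g. a mean-crystal-size surrogate
y = 50 + 8 * g - 5 * b + 3 * kg + rng.normal(0, 0.5, n)

X = np.column_stack([b, g, kg])
Xs = (X - X.mean(0)) / X.std(0)          # standardize inputs
ys = (y - y.mean()) / y.std()            # standardize output
src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
for name, s in zip(["b", "g", "kg"], src):
    print(f"SRC[{name}] = {s:+.2f}")     # magnitude ranks parameter importance
```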

  17. The knowledge conversion SECI process as innovation indicator analysis factor

    OpenAIRE

    Silva, Elaine da [UNESP; Valentim, Marta Lígia Pomim [UNESP

    2013-01-01

It highlights the importance of innovation in current society and presents innovation indicators applied in 125 countries. We analysed the 80 variables distributed across the seven GII pillars, trying to identify direct, indirect or null incidences of the knowledge conversion modes described by the SECI process. The research revealed that knowledge management, in this case specifically the knowledge conversion SECI process, is present in the variables that, according to ...

  18. Biometric Attendance and Big Data Analysis for Optimizing Work Processes.

    Science.gov (United States)

    Verma, Neetu; Xavier, Teenu; Agrawal, Deepak

    2016-01-01

Although biometric attendance management is available, large healthcare organizations have difficulty with big data analysis for optimization of work processes. The aim of this project was to assess the implementation of a biometric attendance system and its utility following big data analysis. In this prospective study, the implementation of the biometric system was evaluated over a 3-month period at our institution. Software integration with other existing systems for data analysis was also evaluated. Implementation of the biometric system could be successfully done over a two-month period with enrollment of 10,000 employees into the system. However, generating reports and taking action for this large number of staff was a challenge. For this purpose, software was developed to capture the duty roster of each employee, integrate it with the biometric system, and add an SMS gateway. This helped in automating the process of sending SMSs to each employee who had not signed in. Standalone biometric systems have limited functionality in large organizations unless they are meshed with the employee duty roster.

  19. The effects of pre-processing strategies in sentiment analysis of online movie reviews

    Science.gov (United States)

    Zin, Harnani Mat; Mustapha, Norwati; Murad, Masrah Azrifah Azmi; Sharef, Nurfadhlina Mohd

    2017-10-01

With the ever-increasing use of internet applications and social networking sites, people nowadays can easily express their feelings towards any products and services. These online reviews act as an important source for further analysis and improved decision making. The reviews are mostly unstructured by nature and thus need processing, such as sentiment analysis and classification, to provide meaningful information for future uses. In text analysis tasks, the appropriate selection of words/features has a huge impact on the effectiveness of the classifier. Thus, this paper explores the effect of pre-processing strategies on the sentiment analysis of online movie reviews. Supervised machine learning was used to classify the reviews, with the support vector machine (SVM) with linear and non-linear kernels considered as the classifier. The performance of the classifier is critically examined based on precision, recall, F-measure, and accuracy. Two different feature representations were used: term frequency and term frequency-inverse document frequency. Results show that the pre-processing strategies have a significant impact on the classification process.
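
The compared feature representations are easy to sketch with standard tooling; below is a minimal illustration (toy two-review corpus) of term-frequency vs. TF-IDF features feeding a linear SVM. Stop-word removal stands in for the pre-processing step.

```python
# Term frequency vs. TF-IDF features for a linear-SVM sentiment classifier.
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

reviews = ["a brilliant, moving film", "dull plot and wooden acting"]
labels = [1, 0]                      # 1 = positive, 0 = negative

for vec in (CountVectorizer(stop_words="english"),     # term frequency
            TfidfVectorizer(stop_words="english")):    # TF-IDF
    clf = make_pipeline(vec, LinearSVC())
    clf.fit(reviews, labels)
    print(type(vec).__name__, clf.predict(["a moving, brilliant plot"]))
```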

  20. Analysis of the mixing processes in the subtropical Advancetown Lake, Australia

    Science.gov (United States)

    Bertone, Edoardo; Stewart, Rodney A.; Zhang, Hong; O'Halloran, Kelvin

    2015-03-01

    This paper presents an extensive investigation of the mixing processes occurring in the subtropical monomictic Advancetown Lake, which is the main water body supplying the Gold Coast City in Australia. Meteorological, chemical and physical data were collected from weather stations, laboratory analysis of grab samples and an in-situ Vertical Profiling System (VPS), for the period 2008-2012. This comprehensive, high frequency dataset was utilised to develop a one-dimensional model of the vertical transport and mixing processes occurring along the water column. Multivariate analysis revealed that air temperature and rain forecasts enabled a reliable prediction of the strength of the lake stratification. Vertical diffusion is the main process driving vertical mixing, particularly during winter circulation. However, a high reservoir volume and warm winters can limit the degree of winter mixing, causing only partial circulation to occur, as was the case in 2013. This research study provides a comprehensive approach for understanding and predicting mixing processes for similar lakes, whenever high-frequency data are available from VPS or other autonomous water monitoring systems.

[A SAS macro program for batch processing of univariate Cox regression analysis for large databases].

    Science.gov (United States)

    Yang, Rendong; Xiong, Jie; Peng, Yangqin; Peng, Xiaoning; Zeng, Xiaomin

    2015-02-01

To realize batch processing of univariate Cox regression analysis for large databases with a SAS macro program. We wrote a SAS macro program in SAS 9.2 which can filter and integrate results and export P values to Excel. The program was used for screening survival-correlated RNA molecules of ovarian cancer. The SAS macro program could finish the batch processing of univariate Cox regression analysis and the selection and export of the results. The SAS macro program has potential applications in reducing the workload of statistical analysis and providing a basis for batch processing of univariate Cox regression analysis.
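
The batch pattern the macro implements can be sketched outside SAS as well. Below is a minimal Python illustration using the lifelines package as a stand-in (the survival table, column names, and output file are hypothetical): loop univariate Cox fits over many molecule columns and export the p values.

```python
# Batch univariate Cox regression over many candidate covariates.
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical survival table: follow-up time, event flag, molecule columns
df = pd.DataFrame({
    "time": [5, 8, 12, 3, 9, 15],
    "event": [1, 0, 1, 1, 0, 1],
    "RNA_1": [2.1, 1.4, 3.3, 0.9, 2.8, 1.1],
    "RNA_2": [0.5, 1.9, 0.7, 2.2, 1.0, 1.6],
})

results = {}
for gene in [c for c in df.columns if c.startswith("RNA_")]:
    cph = CoxPHFitter()
    cph.fit(df[["time", "event", gene]], duration_col="time", event_col="event")
    results[gene] = cph.summary.loc[gene, "p"]   # univariate p value

pd.Series(results).to_excel("univariate_cox.xlsx")  # export, as the macro does
```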

  2. An Ibm PC/AT-Based Image Acquisition And Processing System For Quantitative Image Analysis

    Science.gov (United States)

    Kim, Yongmin; Alexander, Thomas

    1986-06-01

    In recent years, a large number of applications have been developed for image processing systems in the area of biological imaging. We have already finished the development of a dedicated microcomputer-based image processing and analysis system for quantitative microscopy. The system's primary function has been to facilitate and ultimately automate quantitative image analysis tasks such as the measurement of cellular DNA contents. We have recognized from this development experience, and interaction with system users, biologists and technicians, that the increasingly widespread use of image processing systems, and the development and application of new techniques for utilizing the capabilities of such systems, would generate a need for some kind of inexpensive general purpose image acquisition and processing system specially tailored for the needs of the medical community. We are currently engaged in the development and testing of hardware and software for a fairly high-performance image processing computer system based on a popular personal computer. In this paper, we describe the design and development of this system. Biological image processing computer systems have now reached a level of hardware and software refinement where they could become convenient image analysis tools for biologists. The development of a general purpose image processing system for quantitative image analysis that is inexpensive, flexible, and easy-to-use represents a significant step towards making the microscopic digital image processing techniques more widely applicable not only in a research environment as a biologist's workstation, but also in clinical environments as a diagnostic tool.

  3. Economic analysis of locust bean processing and marketing in Iwo ...

    African Journals Online (AJOL)

Economic analysis of locust bean processing and marketing in Iwo local government, Osun state. ... The majority (78.3%) of the processors and marketers were making a profit; 95.0% operate ...

  4. Computer-Aided Model Based Analysis for Design and Operation of a Copolymerization Process

    DEFF Research Database (Denmark)

    Lopez-Arenas, Maria Teresa; Sales-Cruz, Alfonso Mauricio; Gani, Rafiqul

    2006-01-01

The advances in computer science and computational algorithms for process modelling, process simulation, numerical methods and design/synthesis algorithms make it advantageous and helpful to employ computer-aided modelling systems and tools for integrated process analysis. This is illustrated...... In this work, through the computer-aided modeling system ICAS-MoT, two first-principles models have been investigated with respect to design and operational issues for solution copolymerization reactors in general, and for the methyl methacrylate/vinyl acetate system in particular. This will allow analysis of the process behaviour, contribute to a better understanding of the polymerization process, help to avoid unsafe conditions of operation, and to develop operational and optimizing control strategies. The Model 1 is taken from literature and is commonly used for the low conversion region, while the Model 2 has...

  5. Evidential analytic hierarchy process dependence assessment methodology in human reliability analysis

    International Nuclear Information System (INIS)

    Chen, Lu Yuan; Zhou, Xinyi; Xiao, Fuyuan; Deng, Yong; Mahadevan, Sankaran

    2017-01-01

    In human reliability analysis, dependence assessment is an important issue in risky large complex systems, such as operation of a nuclear power plant. Many existing methods depend on an expert's judgment, which contributes to the subjectivity and restrictions of results. Recently, a computational method, based on the Dempster-Shafer evidence theory and analytic hierarchy process, has been proposed to handle the dependence in human reliability analysis. The model can deal with uncertainty in an analyst's judgment and reduce the subjectivity in the evaluation process. However, the computation is heavy and complicated to some degree. The most important issue is that the existing method is in a positive aspect, which may cause an underestimation of the risk. In this study, a new evidential analytic hierarchy process dependence assessment methodology, based on the improvement of existing methods, has been proposed, which is expected to be easier and more effective
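
As a point of reference for the analytic-hierarchy-process ingredient of such methods, here is a minimal sketch of the classical AHP step of deriving priority weights and a consistency ratio from a pairwise comparison matrix; the judgments in the matrix are invented, and this is not the paper's evidential extension.

```python
# Classical AHP: principal-eigenvector weights plus Saaty's consistency ratio.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],     # hypothetical pairwise judgments of
              [1/3, 1.0, 2.0],     # three dependence-influencing factors
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                       # priority weights

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)
cr = ci / 0.58                     # random index RI = 0.58 for n = 3
print(w, cr)                       # CR < 0.1 means acceptably consistent
```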

  6. Evidential Analytic Hierarchy Process Dependence Assessment Methodology in Human Reliability Analysis

    Directory of Open Access Journals (Sweden)

    Luyuan Chen

    2017-02-01

    In human reliability analysis, dependence assessment is an important issue in risky large complex systems, such as operation of a nuclear power plant. Many existing methods depend on an expert's judgment, which contributes to the subjectivity and restrictions of results. Recently, a computational method, based on the Dempster–Shafer evidence theory and analytic hierarchy process, has been proposed to handle the dependence in human reliability analysis. The model can deal with uncertainty in an analyst's judgment and reduce the subjectivity in the evaluation process. However, the computation is heavy and complicated to some degree. The most important issue is that the existing method is in a positive aspect, which may cause an underestimation of the risk. In this study, a new evidential analytic hierarchy process dependence assessment methodology, based on the improvement of existing methods, has been proposed, which is expected to be easier and more effective.

  7. Evidential analytic hierarchy process dependence assessment methodology in human reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Lu Yuan; Zhou, Xinyi; Xiao, Fuyuan; Deng, Yong [School of Computer and Information Science, Southwest University, Chongqing (China); Mahadevan, Sankaran [School of Engineering, Vanderbilt University, Nashville (United States)

    2017-02-15

    In human reliability analysis, dependence assessment is an important issue in risky large complex systems, such as operation of a nuclear power plant. Many existing methods depend on an expert's judgment, which contributes to the subjectivity and restrictions of results. Recently, a computational method, based on the Dempster-Shafer evidence theory and analytic hierarchy process, has been proposed to handle the dependence in human reliability analysis. The model can deal with uncertainty in an analyst's judgment and reduce the subjectivity in the evaluation process. However, the computation is heavy and complicated to some degree. The most important issue is that the existing method is in a positive aspect, which may cause an underestimation of the risk. In this study, a new evidential analytic hierarchy process dependence assessment methodology, based on the improvement of existing methods, has been proposed, which is expected to be easier and more effective.

  8. Profitability Analysis of Rice Processing and Marketing in Kano State ...

    African Journals Online (AJOL)

    Profitability Analysis of Rice Processing and Marketing in Kano State, Nigeria. ... added to the commodity at each stage in the study area and determine the most efficient services produce. ...

  9. Global site-specific analysis of glycoprotein N-glycan processing.

    Science.gov (United States)

    Cao, Liwei; Diedrich, Jolene K; Ma, Yuanhui; Wang, Nianshuang; Pauthner, Matthias; Park, Sung-Kyu Robin; Delahunty, Claire M; McLellan, Jason S; Burton, Dennis R; Yates, John R; Paulson, James C

    2018-06-01

    N-glycans contribute to the folding, stability and functions of the proteins they decorate. They are produced by transfer of the glycan precursor to the sequon Asn-X-Thr/Ser, followed by enzymatic trimming to a high-mannose-type core and sequential addition of monosaccharides to generate complex-type and hybrid glycans. This process, mediated by the concerted action of multiple enzymes, produces a mixture of related glycoforms at each glycosite, making analysis of glycosylation difficult. To address this analytical challenge, we developed a robust semiquantitative mass spectrometry (MS)-based method that determines the degree of glycan occupancy at each glycosite and the proportion of N-glycans processed from high-mannose type to complex type. It is applicable to virtually any glycoprotein, and a complete analysis can be conducted with 30 μg of protein. Here, we provide a detailed description of the method that includes procedures for (i) proteolytic digestion of glycoprotein(s) with specific and nonspecific proteases; (ii) denaturation of proteases by heating; (iii) sequential treatment of the glycopeptide mixture with two endoglycosidases, Endo H and PNGase F, to create unique mass signatures for the three glycosylation states; (iv) LC-MS/MS analysis; and (v) data analysis for identification and quantitation of peptides for the three glycosylation states. Full coverage of site-specific glycosylation of glycoproteins is achieved, with up to thousands of high-confidence spectra hits for each glycosite. The protocol can be performed by an experienced technician or student/postdoc with basic skills for proteomics experiments and takes ∼7 d to complete.

  10. On the Need for Quantitative Bias Analysis in the Peer-Review Process.

    Science.gov (United States)

    Fox, Matthew P; Lash, Timothy L

    2017-05-15

    Peer review is central to the process through which epidemiologists generate evidence to inform public health and medical interventions. Reviewers thereby act as critical gatekeepers to high-quality research. They are asked to carefully consider the validity of the proposed work or research findings by paying careful attention to the methodology and critiquing the importance of the insight gained. However, although many have noted problems with the peer-review system for both manuscripts and grant submissions, few solutions have been proposed to improve the process. Quantitative bias analysis encompasses all methods used to quantify the impact of systematic error on estimates of effect in epidemiologic research. Reviewers who insist that quantitative bias analysis be incorporated into the design, conduct, presentation, and interpretation of epidemiologic research could substantially strengthen the process. In the present commentary, we demonstrate how quantitative bias analysis can be used by investigators and authors, reviewers, funding agencies, and editors. By utilizing quantitative bias analysis in the peer-review process, editors can potentially avoid unnecessary rejections, identify key areas for improvement, and improve discussion sections by shifting from speculation on the impact of sources of error to quantification of the impact those sources of bias may have had.
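
    For readers unfamiliar with the mechanics, the record's central idea can be illustrated with the simplest form of quantitative bias analysis: correcting a 2x2 table for nondifferential exposure misclassification. The sketch below is a generic textbook calculation in Python, with invented counts and assumed sensitivity/specificity, not an example taken from the commentary itself.

      # Simple bias analysis sketch: back-correct observed exposure counts
      # given assumed sensitivity (se) and specificity (sp). All numbers are
      # hypothetical placeholders.
      def correct_counts(a_obs, b_obs, se, sp):
          """Return estimated true exposed/unexposed counts in one row."""
          n = a_obs + b_obs
          a_true = (a_obs - n * (1 - sp)) / (se + sp - 1)
          return a_true, n - a_true

      se, sp = 0.85, 0.95                          # assumed classification parameters
      a_cases, b_cases = correct_counts(215, 1449, se, sp)
      a_ctrls, b_ctrls = correct_counts(668, 4296, se, sp)

      or_observed = (215 * 4296) / (1449 * 668)
      or_corrected = (a_cases * b_ctrls) / (b_cases * a_ctrls)
      print(round(or_observed, 2), round(or_corrected, 2))  # biased vs corrected OR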

  11. Kinematic analysis of in situ measurement during chemical mechanical planarization process

    Energy Technology Data Exchange (ETDEWEB)

    Li, Hongkai; Wang, Tongqing; Zhao, Qian; Meng, Yonggang; Lu, Xinchun, E-mail: xclu@tsinghua.edu.cn [State Key Laboratory of Tribology, Tsinghua University, Beijing 100084 (China)

    2015-10-15

    Chemical mechanical planarization (CMP) is the most widely used planarization technique in semiconductor manufacturing presently. With the aid of in situ measurement technology, CMP tools can achieve good performance and stable productivity. However, the in situ measurement has remained unexplored from a kinematic standpoint, and the available related resources for kinematic analysis are very limited owing to the complexity of the process and trade secrecy. In this paper, a comprehensive kinematic analysis of in situ measurement is provided, including the analysis model, the measurement trajectory, and the measurement time of each zone of the wafer surface during the practical CMP process. In addition, extensive numerical calculations are performed to study the influences of the main parameters on the measurement trajectory and the measurement velocity variation of the probe during the measurement process. All these efforts are expected to improve the in situ measurement system and promote the advancement of CMP control systems.
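
    The relative-motion calculation underlying such a trajectory analysis can be sketched compactly. The Python snippet below is a generic kinematic toy model, not the paper's model: all radii, speeds and the wafer size are invented, and the probe path is simply transformed from the lab frame into the rotating wafer frame.

      # Toy CMP probe-trajectory kinematics; all parameters are hypothetical.
      import numpy as np

      w_platen, w_wafer = 6.0, 6.3      # rad/s: platen and wafer angular speeds
      R_probe = 0.15                    # m: probe radius on the platen
      d = 0.15                          # m: platen-center-to-wafer-center distance

      t = np.linspace(0.0, 10.0, 5000)
      # Probe position in the lab frame (platen centered at the origin)
      px, py = R_probe * np.cos(w_platen * t), R_probe * np.sin(w_platen * t)
      # Shift to the wafer center, then rotate by -w_wafer*t into the wafer frame
      qx, qy = px - d, py
      c, s = np.cos(-w_wafer * t), np.sin(-w_wafer * t)
      xw, yw = c * qx - s * qy, s * qx + c * qy
      inside = np.hypot(xw, yw) <= 0.10          # samples landing on a 200 mm wafer
      print(f"{inside.mean():.1%} of samples lie on the wafer")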

  12. Process parameter optimization based on principal components analysis during machining of hardened steel

    Directory of Open Access Journals (Sweden)

    Suryakant B. Chandgude

    2015-09-01

    The optimum selection of process parameters plays an important role in improving surface finish, minimizing tool wear, increasing the material removal rate and reducing the machining time of any machining process. In this paper, optimum parameters for machining AISI D2 hardened steel using a solid carbide TiAlN-coated end mill have been investigated. For the optimization of process parameters with respect to multiple quality characteristics, the principal components analysis method has been adopted in this work. The confirmation experiments revealed that, for improving cutting performance, the principal components analysis method is a useful tool.

  13. Implementation of hazard analysis and critical control point (HACCP) in dried anchovy production process

    Science.gov (United States)

    Citraresmi, A. D. P.; Wahyuni, E. E.

    2018-03-01

    The aim of this study was to inspect the implementation of Hazard Analysis and Critical Control Point (HACCP) for the identification and prevention of potential hazards in the production process of dried anchovy at PT. Kelola Mina Laut (KML), Lobuk unit, Sumenep. Cold storage is needed at each anchovy processing step in order to maintain the product's physical and chemical condition. In addition, a quality assurance system should be implemented to maintain product quality. The research was conducted using a survey method, following the whole anchovy production process from the receiving of raw materials to the packaging of the final product. The method of data analysis used was descriptive analysis. Implementation of HACCP at PT. KML, Lobuk unit, Sumenep was conducted by applying Pre-Requisite Programs (PRP) and a preparation stage consisting of the 5 initial steps and the 7 principles of HACCP. The results showed that CCPs were found in the boiling step, with the significant hazard of Listeria monocytogenes, and in the final sorting step, with the significant hazard of foreign material contamination of the product. The actions taken were controlling the boiling temperature at 100–105°C for 3–5 minutes and training the sorting process employees.

  14. Blind Time-Frequency Analysis for Source Discrimination in Multisensor Array Processing

    National Research Council Canada - National Science Library

    Amin, Moeness

    1999-01-01

    … We have clearly demonstrated, through analysis and simulations, the offerings of time-frequency distributions in solving key problems in sensor array processing, including direction finding, source …

  15. Effective Thermal Analysis of Using Peltier Module for Desalination Process

    OpenAIRE

    Hayder Al-Madhhachi

    2018-01-01

    The key objective of this study is to analyse the heat transfer processes involved in the evaporation and condensation of water in a water distillation system employing a thermoelectric module. This analysis can help to increase the water production and to enhance the system performance. For the analysis, a water distillation unit prototype integrated with a thermoelectric module was designed and fabricated. A theoretical model is developed to study the effect of the heat added, transferred a...

  16. Enhancing Safety of Artificially Ventilated Patients Using Ambient Process Analysis.

    Science.gov (United States)

    Lins, Christian; Gerka, Alexander; Lüpkes, Christian; Röhrig, Rainer; Hein, Andreas

    2018-01-01

    In this paper, we present an approach for enhancing the safety of artificially ventilated patients using ambient process analysis. We propose to use an analysis system consisting of low-cost ambient sensors such as power sensor, RGB-D sensor, passage detector, and matrix infrared temperature sensor to reduce risks for artificially ventilated patients in both home and clinical environments. We describe the system concept and our implementation and show how the system can contribute to patient safety.

  17. An analysis of post-event processing in social anxiety disorder.

    Science.gov (United States)

    Brozovich, Faith; Heimberg, Richard G

    2008-07-01

    Research has demonstrated that self-focused thoughts and negative affect have a reciprocal relationship [Mor, N., Winquist, J. (2002). Self-focused attention and negative affect: A meta-analysis. Psychological Bulletin, 128, 638-662]. In the anxiety disorder literature, post-event processing has emerged as a specific construction of repetitive self-focused thoughts that pertain to social anxiety disorder. Post-event processing can be defined as an individual's repeated consideration and potential reconstruction of his performance following a social situation. Post-event processing can also occur when an individual anticipates a social or performance event and begins to brood about other, past social experiences. The present review examined the post-event processing literature in an attempt to organize and highlight the significant results. The methodologies employed to study post-event processing have included self-report measures, daily diaries, social or performance situations created in the laboratory, and experimental manipulations of post-event processing or anticipation of an upcoming event. Directions for future research on post-event processing are discussed.

  18. Modeling and techno-economic analysis of shale-to-liquid and coal-to-liquid fuels processes

    International Nuclear Information System (INIS)

    Zhou, Huairong; Yang, Siyu; Xiao, Honghua; Yang, Qingchun; Qian, Yu; Gao, Li

    2016-01-01

    To alleviate the conflict between oil supply and demand, the Chinese government has accelerated the exploration and exploitation of alternative oil production routes. STL (shale-to-liquid) processes and CTL (coal-to-liquid) processes are promising choices for oil supply. However, few analyses have been made of their energy efficiency and economic performance. This paper conducts a detailed analysis of an STL process and a CTL process based on mathematical modeling and simulation. The analysis shows that the low efficiency of the STL process is due to the low oil yield of the Fushun-type retorting technology. For the CTL process, the utility system accounts for nearly 34% of the total energy consumption. This is because CTL technologies are in early development and no heat integration between units is implemented. Economic analysis reveals that the total capital investment of the CTL process is higher than that of the STL process, while the production cost of the CTL process is on the same level as that of the STL process. For better techno-economic performance, it is suggested to develop a new retorting technology with high oil yield for the STL process. The remaining retorting gas should be converted to hydrogen and then used for shale oil hydrogenation. For the CTL process, developing an appropriate heat network is an efficient way to apply heat integration. In addition, the CTL process should be integrated with hydrogen-rich gas to adjust the H_2/CO ratio for better resource utilization. - Highlights: • Aspen Plus software is used for modeling and simulation of shale-to-liquid (STL) and coal-to-liquid (CTL) processes. • A techno-economic analysis of the STL and CTL processes is conducted. • Suggestions are given for improving the energy efficiency and economic performance of the STL and CTL processes.

  19. LANL Institutional Decision Support By Process Modeling and Analysis Group (AET-2)

    Energy Technology Data Exchange (ETDEWEB)

    Booth, Steven Richard [Los Alamos National Laboratory

    2016-04-04

    AET-2 has expertise in process modeling, economics, business case analysis, risk assessment, Lean/Six Sigma tools, and decision analysis to provide timely decision support to LANS leading to continuous improvement. This capability is critical during the current tight budgetary environment as LANS pushes to identify potential areas of cost savings and efficiencies. An important arena is business systems and operations, where processes can impact most or all laboratory employees. Lab-wide efforts are needed to identify and eliminate inefficiencies to accomplish Director McMillan’s charge of “doing more with less.” LANS faces many critical and potentially expensive choices that require sound decision support to ensure success. AET-2 is available to provide this analysis support to expedite the decisions at hand.

  20. Image processing. Volumetric analysis with a digital image processing system. [GAMMA]. Bildverarbeitung. Volumetrie mittels eines digitalen Bildverarbeitungssystems

    Energy Technology Data Exchange (ETDEWEB)

    Kindler, M; Radtke, F; Demel, G

    1986-01-01

    The book is arranged in seven sections, describing various applications of volumetric analysis using image processing systems, and various methods of diagnostic evaluation of images obtained by gamma scintigraphy, cardiac catheterisation, and echocardiography. A dynamic ventricular phantom is explained that has been developed for checking and calibration, for the safe examination of patients; the phantom allows extensive simulation of the volumetric and hemodynamic conditions of the human heart. One section discusses program development for image processing, referring to a number of different computer systems. The equipment described includes a small, inexpensive PC system, as well as a standardized nuclear medical diagnostic system, and a computer system especially suited to image processing.

  1. The Analysis of Barriers in Succession Processes of Family Business with the Use of Grey Incidence Analysis (Polish Perspective)

    Directory of Open Access Journals (Sweden)

    Więcek-Janka Ewa

    2016-06-01

    The article presents results of research on the identification and evaluation of barriers faced by successors in family businesses during the first process of succession. The analysis of empirical material used grey systems theory, which was considered an equivalent for the analysis of small samples and qualitative research. While conducting the literature review and empirical study, the authors concentrated on (a) the identification of barriers in the development of family firms and (b) eliciting the perspective of the new generation of owners in family firms entering the succession process, through an empirical analysis of the assessed level of risk in relationships with family and business.

  2. Astro-H Data Analysis, Processing and Archive

    Science.gov (United States)

    Angelini, Lorella; Terada, Yukikatsu; Loewenstein, Michael; Miller, Eric D.; Yamaguchi, Hiroya; Yaqoob, Tahir; Krimm, Hans; Harrus, Ilana; Takahashi, Hiromitsu; Nobukawa, Masayoshi; et al.

    2016-01-01

    Astro-H (Hitomi) is an X-ray and gamma-ray mission led by Japan with international participation, launched on February 17, 2016. The payload consists of four different instruments (SXS, SXI, HXI and SGD) that operate simultaneously to cover the energy range from 0.3 keV up to 600 keV. This paper presents the analysis software and the data processing pipeline created to calibrate and analyze the Hitomi science data, along with the plan for the archive and user support. These activities have been a collaborative effort shared between scientists and software engineers working in several institutes in Japan and the USA.

  3. Thermodynamic analysis of combined Solid Oxide Electrolyzer and Fischer–Tropsch processes

    International Nuclear Information System (INIS)

    Stempien, Jan Pawel; Ni, Meng; Sun, Qiang; Chan, Siew Hwa

    2015-01-01

    In this paper a thermodynamic analysis and simple optimization of combined Solid Oxide Electrolyzer Cell and Fischer–Tropsch Synthesis processes for sustainable hydrocarbon fuel production is reported. Comprehensive models are employed to describe the effects of temperature, pressure, reactant composition, molar flux and flow on the system efficiency and final product distribution. The electrolyzer model was developed in-house and validated with experimental data of a typical Solid Oxide Electrolyzer. The Fischer–Tropsch Synthesis model employed lumped kinetics of syngas utilization, which includes the inhibiting effect of water content and the kinetics of the Water–Gas Shift reaction. The product distribution model incorporated olefin re-adsorption and the variation of physisorption and solubility of hydrocarbons with their carbon number. The results were compared with those reported by Becker et al. with a simplified analysis of such a process. In the present study an opposite effect of operation at elevated pressure was observed. The proposed optimized system achieved an overall efficiency of 66.67% and an almost equal spread of light (31 wt%), mid (36 wt%) and heavy hydrocarbons (33 wt%). Paraffins contributed the majority of the yield. - Highlights: • Analysis of a Solid Oxide Electrolyzer combined with the Fischer–Tropsch process. • Efficiency of converting water and carbon dioxide into synthetic fuels above 66%. • Effects of process temperature, pressure, gas flux and compositions were analyzed
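
    As a hedged illustration of how a lumped product-distribution model turns a single chain-growth probability into light/mid/heavy fractions, the snippet below evaluates the classical Anderson-Schulz-Flory weight distribution in Python; the value of alpha and the carbon-number cut points are arbitrary choices for illustration, not values from the paper.

      # Anderson-Schulz-Flory sketch: w_n = n * (1 - alpha)^2 * alpha^(n - 1).
      alpha = 0.85                     # hypothetical chain-growth probability
      w = {n: n * (1 - alpha) ** 2 * alpha ** (n - 1) for n in range(1, 101)}

      light = sum(v for n, v in w.items() if n <= 4)        # C1-C4 gases
      mid = sum(v for n, v in w.items() if 5 <= n <= 12)    # naphtha/kerosene range
      heavy = sum(v for n, v in w.items() if n > 12)        # waxes and heavier
      print(f"light {light:.0%}, mid {mid:.0%}, heavy {heavy:.0%}")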

  4. Thermodynamic analysis of tar reforming through auto-thermal reforming process

    Energy Technology Data Exchange (ETDEWEB)

    Nurhadi, N., E-mail: nurhadi@tekmira.esdm.go.id; Diniyati, Dahlia; Efendi, M. Ade Andriansyah [R&D Centre for Mineral and Coal Technology, Jln. Jend. Sudirman no. 623, Bandung. Telp. 022-6030483 (Indonesia)]; Istadi, I. [Department of Chemical Engineering, Diponegoro University, Jl. Prof. Soedarto, SH, Semarang (Indonesia)]

    2015-12-29

    Fixed bed gasification is a simple technology suitable for small-scale power generation. One of the disadvantages of this technology is that it produces tar. So far, tar has not been utilized and remains a waste that should be converted into a more useful product. This paper presents a thermodynamic analysis of tar conversion into producer gas through non-catalytic auto-thermal reforming technology. Tar was decomposed into the components C, H, O, N and S, and then reacted with an oxidant such as air or pure oxygen. The reaction thus proceeded auto-thermally and reached chemical equilibrium. The sensitivity analysis showed that the most promising process performance occurred at an air flow rate of 43% of stoichiometry, a process temperature of 1100°C, a pure oxygen addition of 40% and oxidant preheating to 250°C. The yield at the most promising process performance was between 11.15 and 11.17 kmol/h, and the cold gas efficiency was between 73.8 and 73.9%. The results of this study indicate that, thermodynamically, the conversion of tar into producer gas through non-catalytic auto-thermal reforming is promising.

  5. System and Analysis for Low Latency Video Processing using Microservices

    OpenAIRE

    VASUKI BALASUBRAMANIAM, KARTHIKEYAN

    2017-01-01

    The evolution of big data processing and analysis has led to data-parallel frameworks such as Hadoop, MapReduce, Spark, and Hive, which are capable of analyzing large streams of data such as server logs, web transactions, and user reviews. Videos are one of the biggest sources of data and dominate the Internet traffic. Video processing on a large scale is critical and challenging as videos possess spatial and temporal features, which are not taken into account by the existing data-parallel fr...

  6. The Analysis of the Customer Request Processing in a Financial Institution

    Directory of Open Access Journals (Sweden)

    Maria NEAGU

    2013-03-01

    This paper presents the numerical simulation of the processing of customer requests by generalists and specialists in a financial institution using ARENA software. The model considers three types of requests: standard requests, direct special requests and special requests received by telephone or e-mail. The processing times and costs receive a detailed analysis: the processing time, the waiting time and the total time, as well as the dependence of the number of requests and the request cost on the incoming frequency of standard requests, are presented.
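
    The dependence of waiting time on the incoming request frequency that the simulation studies can also be approximated analytically. The sketch below is not the ARENA model: it is a standard M/M/c (Erlang C) calculation in Python with made-up arrival and service rates, shown only to illustrate the kind of relationship the article reports.

      # Erlang C sketch: mean queueing delay vs. request arrival rate.
      import math

      def mmc_wait(lam, mu, c):
          """Mean waiting time in queue for an M/M/c system (Erlang C)."""
          rho = lam / (c * mu)          # server utilization, must be < 1
          a = lam / mu                  # offered load in Erlangs
          p0 = 1.0 / (sum(a**k / math.factorial(k) for k in range(c))
                      + a**c / (math.factorial(c) * (1 - rho)))
          erlang_c = (a**c / (math.factorial(c) * (1 - rho))) * p0
          return erlang_c / (c * mu - lam)

      for lam in (2.0, 4.0, 6.0):       # standard requests per hour (hypothetical)
          print(lam, round(mmc_wait(lam, mu=2.0, c=4), 3))  # 4 agents, 2 req/h each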

  7. Application of noise analysis to investigate core degradation process during PHEBUS-FPT1 test

    International Nuclear Information System (INIS)

    Oguma, Ritsuo

    1997-01-01

    Noise analysis has been performed for measurement data obtained during PHEBUS-FPT1 test. The purpose of the study is to evaluate the applicability of the noise analysis to the following problems: To get more knowledge about the physical processes going on during severe core conditions; To better understand the core melting process; To establish appropriate on-line shut-down data. Results of the study indicate that the noise analysis is quite promising as a tool for investigating physical processes during the experiment. Compared with conventional approach of evaluating the signal's mean value behaviour, the noise analysis can provide additional, more detailed information: It was found that the neutron flux signal is subjected to additional reactivity perturbations in conjunction with fuel melting and relocation. This can easily be detected by applying noise analysis for the neutron flux signal. It has been demonstrated that the method developed in the present study can provide more accurate estimates of the onset of fuel relocation than using temperature signals from thermocouples in the thermal shroud. Moreover, the result suggests a potential of the present method for tracking the whole process of relocation. The result of the data analysis suggests a possibility of sensor diagnostics which may be important for confirming the quality and reliability of the recorded data. Based on the results achieved it is believed that the combined use of noise analysis and thermocouple signals will provide reliable shut-down criteria for the experiment. 8 refs

  8. Hyperplane distance neighbor clustering based on local discriminant analysis for complex chemical processes monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Chunhong; Xiao, Shaoqing; Gu, Xiaofeng [Jiangnan University, Wuxi (China)

    2014-11-15

    The collected training data often include both normal and faulty samples for complex chemical processes. However, some monitoring methods, such as partial least squares (PLS), principal component analysis (PCA), independent component analysis (ICA) and Fisher discriminant analysis (FDA), require fault-free data to build the normal operation model. These techniques are applicable after the preliminary step of data clustering is applied. We here propose a novel hyperplane distance neighbor clustering (HDNC) based on the local discriminant analysis (LDA) for chemical process monitoring. First, faulty samples are separated from normal ones using the HDNC method. Then, the optimal subspace for fault detection and classification can be obtained using the LDA approach. The proposed method takes the multimodality within the faulty data into account, and thus improves the capability of process monitoring significantly. The HDNC-LDA monitoring approach is applied to two simulation processes and then compared with the conventional FDA based on the K-nearest neighbor (KNN-FDA) method. The results obtained in two different scenarios demonstrate the superiority of the HDNC-LDA approach in terms of fault detection and classification accuracy.

  9. Hyperplane distance neighbor clustering based on local discriminant analysis for complex chemical processes monitoring

    International Nuclear Information System (INIS)

    Lu, Chunhong; Xiao, Shaoqing; Gu, Xiaofeng

    2014-01-01

    The collected training data often include both normal and faulty samples for complex chemical processes. However, some monitoring methods, such as partial least squares (PLS), principal component analysis (PCA), independent component analysis (ICA) and Fisher discriminant analysis (FDA), require fault-free data to build the normal operation model. These techniques are applicable after the preliminary step of data clustering is applied. We here propose a novel hyperplane distance neighbor clustering (HDNC) based on the local discriminant analysis (LDA) for chemical process monitoring. First, faulty samples are separated from normal ones using the HDNC method. Then, the optimal subspace for fault detection and classification can be obtained using the LDA approach. The proposed method takes the multimodality within the faulty data into account, and thus improves the capability of process monitoring significantly. The HDNC-LDA monitoring approach is applied to two simulation processes and then compared with the conventional FDA based on the K-nearest neighbor (KNN-FDA) method. The results obtained in two different scenarios demonstrate the superiority of the HDNC-LDA approach in terms of fault detection and classification accuracy

  10. Beyond Discourse: A Genealogical Analysis of an Intersubjective Transformation Process of Gender

    Directory of Open Access Journals (Sweden)

    Patricia Amigot Leache

    2007-05-01

    This article presents a genealogical analysis of a specific process of transformation of gender. It analyses the shifts of a group of Spanish working-class women in the context of the final years of Franco's dictatorship and the transition to democracy. These women took part in the activities of the so-called Centros de Promoción de la Mujer y Cultura Popular [Centres for Women's Promotion and Popular Culture]. I identify the elements of the transformation processes through in-depth interviews with a sample of women. Through these narrations and the documents of the Centres I show the road that goes from a state of domination (in FOUCAULT's words to a more flexible and mobile situation in which not only are the practical possibilities for these women wider, but also we can witness the transformation of the institution itself. The analysis is made with a FOUCAULTian toolbox, using especially FOUCAULT's analysis of power, his theoretical construction of practices of self and the link he posits between such practices and identity games of truth. The analysis shows the intersubjective nature of the processes and of the practices involved in such transformations. URN: urn:nbn:de:0114-fqs070295

  11. About numerical analysis of electromagnetic field induce in gear wheels during hardening process

    Directory of Open Access Journals (Sweden)

    Gabriel Cheregi

    2008-05-01

    The paper presents the results of a numerical simulation using finite element analysis for a coupled magneto-thermal problem specific to induction hardening processes. The analysis takes into account the relative movement between the inductor and the heated part. Numerical simulation allows the thermal regime of the induction heating process, and the optimal parameters which offer maximum efficiency, to be determined accurately. Therefore the number of experiments in the design process can be decreased and a better knowledge of the process can be obtained.

  12. Application of Statistical Model in Wastewater Treatment Process Modeling Using Data Analysis

    Directory of Open Access Journals (Sweden)

    Alireza Raygan Shirazinezhad

    2015-06-01

    Background: Wastewater treatment includes very complex and interrelated physical, chemical and biological processes which, using data analysis techniques, can be rigorously modeled by non-complex mathematical models. Materials and Methods: In this study, data on wastewater treatment processes from the water and wastewater company of Kohgiluyeh and Boyer-Ahmad were used. A total of 3306 data points for COD, TSS, pH and turbidity were collected, then analyzed with SPSS-16 (descriptive statistics) and IBM SPSS Modeler 14.2 (data analysis), using 9 algorithms. Results: The logistic regression, neural network, Bayesian network, discriminant analysis, C5 decision tree, C&R tree, CHAID, QUEST and SVM algorithms had accuracies of 90.16, 94.17, 81.37, 70.48, 97.89, 96.56, 96.46, 96.84 and 88.92, respectively. Discussion and conclusion: The C5 algorithm was chosen as the best and most applicable algorithm for modeling wastewater treatment processes, with an accuracy of 97.89, and the most influential variables in this model were pH, COD, TSS and turbidity.
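
    The modeling step the study describes (training a decision tree on water-quality features and scoring its accuracy) can be sketched as follows. Note the stand-ins: scikit-learn provides CART-style trees rather than the exact C5 algorithm, and the data here are synthetic placeholders for the pH, COD, TSS and turbidity records.

      # Decision-tree classification sketch on synthetic water-quality data.
      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.tree import DecisionTreeClassifier

      rng = np.random.default_rng(1)
      X = rng.normal(size=(3306, 4))    # stand-ins for pH, COD, TSS, turbidity
      y = (X[:, 1] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=3306) > 0).astype(int)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
      clf = DecisionTreeClassifier(max_depth=5, random_state=0).fit(X_tr, y_tr)
      print(f"holdout accuracy: {clf.score(X_te, y_te):.3f}")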

  13. Viability analysis of heat recovery solution for industrial process of roasting coffee

    Directory of Open Access Journals (Sweden)

    Kljajić Miroslav V.

    2016-01-01

    Every industrial heat recovery solution is a specific engineering challenge, not because of the predicted energy rationalization or achieved energy savings, but because of potentially unavoidable technological deviations, consequences for related processes and, certainly, the high investment demanded by delicate design and construction. Often, the energy savings in a particular segment of the industrial process are the main goal. However, in the food industry, and especially in coffee roasting, additional criteria have to be strictly observed and fulfilled. Such criteria may include prescribed and uniform product quality, compliance with food safety standards, stability of the processes etc., all in the presence of variability of key process parameters, inconsistency of raw material composition and quality, complexity of measurement and analytical methods etc. The paper respects all these circumstances and checks the viability of the proposed recovery solution. The paper analyzes the possibility of using waste heat from the roasting process to shorten the roasting cycle, reduce fuel consumption and increase the capacity of the roasting lines on a daily basis. The analysis concludes that the effects are valuable and substantial, although the complete solution is on the threshold of economic sustainability, with numerous opportunities to improve both technical and economic indicators. The analysis combines measuring and analytical methods with a standard cost-benefit analysis. Conclusions are derived from measurements and calculations of key parameters under operating conditions and checked by experimental methods. Test results deviate by 10 to 15% from the parameters of the main production line.

  14. Quantitative Risk Analysis of a Pervaporation Process for Concentrating Hydrogen Peroxide

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Ho Jin; Yoon, Ik Keun [Korea Gas Corporation, Ansan (Korea, Republic of); Choi, Soo Hyoung [Chonbuk National University, Jeonju (Korea, Republic of)

    2014-12-15

    Quantitative risk analysis has been performed for a pervaporation process for the production of high test peroxide. The potential main accidents are explosion and fire caused by a decomposition reaction. As the target process is at laboratory scale, the consequence is considered to belong to Category 3. An event tree has been developed as a model for the occurrence of a decomposition reaction in the target process. The probability functions of the accident causes have been established based on the frequency data of similar events. Using the constructed model, the failure rate has been calculated. The result indicates that additional safety devices are required in order to achieve an acceptable risk level, i.e. an accident frequency less than 10^-4/yr. Therefore, a layer of protection analysis has been applied. As a result, it is suggested to introduce an inherently safer design to avoid catalytic reaction, a safety instrumented function to prevent overheating, and a relief system that prevents explosion even if a decomposition reaction occurs. The proposed method is expected to contribute to developing safety management systems for various chemical processes, including the concentration of hydrogen peroxide.
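
    The arithmetic behind an event-tree frequency estimate is multiplication along each branch. The Python sketch below uses invented branch probabilities purely to show how a sequence frequency is compared against the 10^-4/yr acceptance criterion mentioned in the record.

      # Event-tree branch arithmetic; all probabilities are hypothetical.
      initiating_freq = 0.1         # /yr: decomposition-onset frequency (invented)
      p_overheat_undetected = 0.05  # temperature monitoring fails (invented)
      p_relief_fails = 0.01         # relief system fails on demand (invented)

      explosion_freq = initiating_freq * p_overheat_undetected * p_relief_fails
      print(f"explosion frequency ~ {explosion_freq:.1e}/yr")  # 5.0e-05 < 1e-4/yr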

  15. A hazard and probabilistic safety analysis of a high-level waste transfer process

    International Nuclear Information System (INIS)

    Bott, T.F.; Sasser, M.K.

    1996-01-01

    This paper describes a safety analysis of a transfer process for high-level radioactive and toxic waste. The analysis began with a hazard assessment that used elements of What If, Checklist, Failure Modes and Effects Analysis, and Hazards and Operability Study (HAZOP) techniques to identify and rough-in accident sequences. Based on this preliminary analysis, the most significant accident sequences were developed further using event trees. Quantitative frequency estimates for the accident sequences were based on operational data taken from the historical record of the site where the process is performed. Several modeling challenges were encountered in the course of the study. These included linked initiating and accident progression events, fire propagation modeling, accounting for administrative control violations, and handling mission-phase effects

  16. Site Suitability Analysis for Beekeeping via Analytical Hierarchy Process, Konya Example

    Science.gov (United States)

    Sarı, F.; Ceylan, D. A.

    2017-11-01

    Over the past decade, the importance of beekeeping activities has been emphasized in the fields of biodiversity, ecosystems, agriculture and human health. Efficient management and the correct siting of beekeeping activities therefore seem essential to maintain and improve productivity and efficiency. Given this importance, and considering the economic contributions to rural areas, the need for a suitability analysis becomes apparent. At this point, the integration of Multi Criteria Decision Analysis (MCDA) and Geographical Information Systems (GIS) provides efficient solutions to the complex structure of the decision-making process for beekeeping activities. In this study, a site suitability analysis via the Analytical Hierarchy Process (AHP) was carried out for the Konya province in Turkey. Slope, elevation, aspect, distance to water resources, roads and settlements, precipitation and flora criteria were included to determine suitability. The requirements, expectations and limitations of beekeeping activities were specified with the participation of experts and stakeholders. The final suitability map was validated against 117 existing beekeeping locations and the Turkish Statistical Institute's 2016 beekeeping statistics for the Konya province.
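
    After AHP produces criterion weights, the GIS step is typically a weighted overlay of normalized criterion rasters. The sketch below shows that step in Python on tiny synthetic grids; the layer names and weights are hypothetical, not the study's values.

      # Weighted-overlay suitability sketch on synthetic rasters.
      import numpy as np

      rng = np.random.default_rng(7)
      layers = {name: rng.random((4, 4)) for name in
                ("slope", "elevation", "water_dist", "flora")}   # normalized 0..1
      weights = {"slope": 0.20, "elevation": 0.15,
                 "water_dist": 0.35, "flora": 0.30}              # AHP-style (invented)

      suitability = sum(weights[k] * layers[k] for k in layers)
      print(np.round(suitability, 2))   # 0..1 map; higher = more suitable site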

  17. Analysis of Business Process at PT XYZ by Using SCOR Thread Diagram

    Science.gov (United States)

    Sembiring, M. T.; Rambe, H. C.

    2017-03-01

    Supply Chain Operations Reference (SCOR) is a standard supply chain performance evaluation model proposed by the Supply Chain Council (SCC). SCOR enables companies to analyse and evaluate their supply chain performance. SCOR includes a Thread Diagram, which describes the business process simply and systematically to support the analysis of a company's business process. This research takes place at PT XYZ, a company in the Crude Palm Oil (CPO) industry. PT XYZ used to be the market leader of the CPO industry, but nowadays it struggles to compete with new entrants. The purpose of this study is to provide input for PT XYZ's business process improvement, to enhance the company's competitiveness. The results obtained show that two performance metrics are not met. The analysis of the business process reveals PT XYZ's weak control over both the supplier and customer sides, which becomes the suggested focus of improvement.

  18. The SeaDAS Processing and Analysis System: SeaWiFS, MODIS, and Beyond

    Science.gov (United States)

    MacDonald, M. D.; Ruebens, M.; Wang, L.; Franz, B. A.

    2005-12-01

    The SeaWiFS Data Analysis System (SeaDAS) is a comprehensive software package for the processing, display, and analysis of ocean data from a variety of satellite sensors. Continuous development and user support by programmers and scientists for more than a decade have helped to make SeaDAS the most widely used software package in the world for ocean color applications, with a growing base of users from the land and sea surface temperature community. Full processing support for past (CZCS, OCTS, MOS) and present (SeaWiFS, MODIS) sensors, and anticipated support for future missions such as NPP/VIIRS, enable end users to reproduce the standard ocean archive product suite distributed by NASA's Ocean Biology Processing Group (OBPG), as well as a variety of evaluation and intermediate ocean, land, and atmospheric products. Availability of the processing algorithm source codes and a software build environment also provides users with the tools to implement custom algorithms. Recent SeaDAS enhancements include synchronization of MODIS processing with the latest code and calibration updates from the MODIS Calibration Support Team (MCST), support for all levels of MODIS processing including Direct Broadcast, a port to the Macintosh OS X operating system, release of the display/analysis-only SeaDAS-Lite, and an extremely active web-based user support forum.

  19. Analysis of hazardous substances released during CFRP laser processing

    Science.gov (United States)

    Hustedt, Michael; Walter, Juergen; Bluemel, Sven; Jaeschke, Peter; Kaierle, Stefan

    2017-02-01

    Due to their outstanding mechanical properties, in particular their high specific strength parallel to the carbon fibers, carbon fiber reinforced plastics (CFRP) have a high potential regarding resource-efficient lightweight construction. Consequently, these composite materials are increasingly finding application in important industrial branches such as the aircraft, automotive and wind energy industries. However, the processing of these materials is highly demanding. On the one hand, mechanical processing methods such as milling or drilling are sometimes rather slow, and they are connected with notable tool wear. On the other hand, thermal processing methods are critical as the two components, matrix and reinforcement, have widely differing thermophysical properties, possibly leading to damages of the composite structure in terms of pores or delamination. An emerging innovative method for processing of CFRP materials is laser technology. As a principally thermal method, laser processing is connected with the release of potentially hazardous, gaseous and particulate substances. Detailed knowledge of these process emissions is the basis for ensuring the protection of man and the environment, according to the existing legal regulations. This knowledge will help to realize adequate protective measures and thus strengthen the development of CFRP laser processing. In this work, selected measurement methods and results of the analysis of the exhaust air and the air at the workplace during different laser processes with CFRP materials are presented. The investigations have been performed in the course of different cooperative projects, funded by the German Federal Ministry of Education and Research (BMBF) in the course of the funding initiative "Photonic Processes and Tools for Resource-Efficient Lightweight Structures".

  20. An ontological knowledge based system for selection of process monitoring and analysis tools

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2010-01-01

    … monitoring and analysis tools for a wide range of operations has made their selection a difficult, time-consuming and challenging task. Therefore, an efficient and systematic knowledge base coupled with an inference system is necessary to support the optimal selection of process monitoring and analysis tools, satisfying the process and user constraints. A knowledge base consisting of the process knowledge as well as knowledge on measurement methods and tools has been developed. An ontology has been designed for knowledge representation and management. The developed knowledge base has a dual feature. On the one … procedures has been developed to retrieve the data/information stored in the knowledge base.

  1. Semi-on-line analysis for fast and precise monitoring of bioreaction processes

    DEFF Research Database (Denmark)

    Christensen, L.H.; Marcher, J.; Schulze, Ulrik

    1996-01-01

    Monitoring of substrates and products during fermentation processes can be achieved either by on-line, in situ sensors or by semi-on-line analysis consisting of an automatic sampling step followed by an ex situ analysis of the retrieved sample. The potential risk of introducing time delays …

  2. Estimation of CO2 emission for each process in the Japanese steel industry: a process analysis

    International Nuclear Information System (INIS)

    Sakamoto, Y.; Tonooka, Y.

    2000-01-01

    The CO2 emission for each process in the Japanese steel industry is estimated by a process analysis using statistical data, in order to evaluate the possibility of reducing CO2 emissions. The CO2 emission factor for each product, and for crude steel produced via an integrated steel plant route and via an electric arc furnace route, is estimated and compared. The CO2 emissions can be estimated from the production amounts of the products of each process and of crude steel. The CO2 emission of blast furnaces is the largest, followed by that of rolling and piping. The CO2 emission factor of crude steel produced via the integrated steel plant route is approximately 3.8 times as high as that produced via the electric arc furnace route. (Author)
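
    The bottom-up accounting the record describes reduces to multiplying each process's production by an emission factor and summing. The Python sketch below demonstrates the structure of such a calculation with placeholder figures; none of the numbers are the paper's estimates.

      # Process-level CO2 accounting sketch; quantities and factors are invented.
      processes = {   # Mt product/yr, t CO2 per t product (hypothetical)
          "blast furnace":      (80.0, 1.20),
          "basic oxygen steel": (75.0, 0.15),
          "electric arc steel": (25.0, 0.45),
          "rolling and piping": (95.0, 0.10),
      }
      emissions = {name: q * f for name, (q, f) in processes.items()}
      total = sum(emissions.values())
      for name, e in sorted(emissions.items(), key=lambda kv: -kv[1]):
          print(f"{name:20s} {e:7.1f} Mt CO2 ({e / total:.0%})")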

  3. Techno-economic analysis of organosolv pretreatment process from lignocellulosic biomass

    DEFF Research Database (Denmark)

    Rodrigues Gurgel da Silva, Andrè; Errico, Massimiliano; Rong, Ben-Guang

    2018-01-01

    … data, we propose a feasible process flowsheet for organosolv pretreatment. Simulation of the pretreatment process provided mass and energy balances for a techno-economic analysis, and the values were compared with the most prevalent and mature pretreatment method: diluted acid. Organosolv pretreatment required more energy, 578.1 versus 213.8 MW for diluted acid pretreatment, but resulted in a higher ethanol concentration after the biomass fermentation, 11.1% compared to 5.4%. Total annual cost (TAC) calculations showed advantages for diluted acid pretreatment, but future improvements explored …

  4. Finite Element Analysis for Bending Process of U-Bending Specimens

    Energy Technology Data Exchange (ETDEWEB)

    Park, Won Dong; Bahn, Chi Bum [Pusan National University, Busan (Korea, Republic of)

    2015-10-15

    ASTM G30 suggests that the applied strain can be calculated by dividing the thickness by the bend radius. It should be noted, however, that this formula is reliable under the assumption that the ratio of thickness to bend radius is less than 0.2. Typically, to increase the applied stress/strain, the ratio of thickness to bend radius becomes larger than 0.2. This suggests that strain values estimated per ASTM G30 are not reliable predictors of the actual residual strain state of a highly deformed U-bend specimen. For this reason, finite element analysis (FEA) of the bending process of U-bend specimens was conducted using the commercial finite element software ABAQUS ver. 6.14-2 (2014). From the results of the FEA, the PWSCC initiation time and the U-bend specimen size can be determined more exactly. Since local stress and strain have a significant effect on the initiation of PWSCC, it is inappropriate to apply the ASTM G30 estimates to the PWSCC test directly. According to the FEA results, elastic relaxation (spring-back) can cause inaccuracy in the intended final residual stress. To correct this inaccuracy, an additional process reducing the spring-back is required; however, this additional process may itself introduce uncertainty in the stress/strain state. Therefore, a U-bend specimen size that does not create this uncertainty should be selected and optimized. With a bending radius of 8.3 mm, a thickness of 3 mm and a roller distance of 32.6 mm, the calculated maximum stress and strain were 670 MPa and 0.21, respectively.
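
    A quick check of the thin-specimen estimate is instructive. The commonly quoted ASTM G30-style approximation for the outer-fiber strain of a U-bend is t/(2R) (a hedged reading of the abstract's "thickness divided by bend radius"); the sketch below applies it to the record's own geometry and shows why the paper resorts to FEA once t/R exceeds 0.2.

      # Thin-wall strain estimate vs. the record's FEA result (geometry from the record).
      t, R = 3.0, 8.3                  # mm: thickness and bend radius
      eps_thin = t / (2 * R)           # common thin-specimen approximation
      print(f"t/R = {t / R:.2f} (> 0.2, outside the formula's validity range)")
      print(f"thin-wall strain = {eps_thin:.2f} vs FEA maximum strain 0.21")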

  5. Hierarchically structured exergetic and exergoeconomic analysis and evaluation of energy conversion processes

    International Nuclear Information System (INIS)

    Hebecker, Dietrich; Bittrich, Petra; Riedl, Karsten

    2005-01-01

    Evaluation of the efficiency and economic benefit of energy conversion processes and technologies requires a scientifically based analysis. The hierarchically structured exergetic analysis provides a detailed characterization of complex technical systems. By defining corresponding evaluation coefficients, the exergetic efficiency can be assessed for units within the whole system. Based on this exergetic analysis, a thermoeconomic evaluation method is developed. A cost function is defined for all units, subsystems and the total plant, so that the cost flow in the system can be calculated. Three dimensionless coefficients, the Pauer factor, the loss coefficient and the cost factor, enable pinpointing cost intensive process units, allocating cost in cases of co-production and gaining insight for future design improvements. The methodology is demonstrated by a biomass gasification plant producing electricity, heat and cold

  6. Principal Component Analysis of Process Datasets with Missing Values

    Directory of Open Access Journals (Sweden)

    Kristen A. Severson

    2017-07-01

    Datasets with missing values arising from causes such as sensor failure, inconsistent sampling rates, and merging data from different systems are common in the process industry. Methods for handling missing data typically operate during data pre-processing, but can also occur during model building. This article considers missing data within the context of principal component analysis (PCA), a method originally developed for complete data that has widespread industrial application in multivariate statistical process control. Due to the prevalence of missing data and the success of PCA for handling complete data, several PCA algorithms that can act on incomplete data have been proposed. Here, algorithms for applying PCA to datasets with missing values are reviewed. A case study is presented to demonstrate the performance of the algorithms, and suggestions are made with respect to choosing which algorithm is most appropriate for particular settings. An alternating algorithm based on the singular value decomposition achieved the best results in the majority of test cases involving process datasets.
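
    A minimal version of an alternating SVD-based imputation of the kind the case study favors can be written in a few lines. The sketch below (Python/NumPy) iteratively fills missing entries with a fixed-rank reconstruction; the rank, tolerance and synthetic data are assumptions for illustration, not the article's settings.

      # Alternating SVD imputation sketch for PCA with missing data.
      import numpy as np

      def pca_impute(X, k, n_iter=200, tol=1e-8):
          """Iteratively fill missing entries with a rank-k SVD reconstruction."""
          mask = np.isnan(X)
          Xf = np.where(mask, np.nanmean(X, axis=0), X)   # init with column means
          for _ in range(n_iter):
              U, s, Vt = np.linalg.svd(Xf, full_matrices=False)
              Xk = (U[:, :k] * s[:k]) @ Vt[:k]            # rank-k approximation
              delta = np.linalg.norm(Xf[mask] - Xk[mask]) # change in imputed cells
              Xf[mask] = Xk[mask]                         # update only missing cells
              if delta < tol:
                  break
          return Xf

      rng = np.random.default_rng(0)
      X = rng.normal(size=(50, 8)); X[rng.random(X.shape) < 0.1] = np.nan
      print(np.isnan(pca_impute(X, k=3)).any())           # False: all cells filled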

  7. Image analysis for ophthalmological diagnosis image processing of Corvis ST images using Matlab

    CERN Document Server

    Koprowski, Robert

    2016-01-01

    This monograph focuses on the use of analysis and processing methods for images from the Corvis® ST tonometer. The presented analysis is associated with the quantitative, repeatable and fully automatic evaluation of the response of the eye, eyeball and cornea to an air-puff. All the described algorithms were practically implemented in MATLAB®. The monograph also describes and provides the full source code designed to perform the discussed calculations. As a result, this monograph is intended for scientists, graduate students and students of computer science and bioengineering as well as doctors wishing to expand their knowledge of modern diagnostic methods assisted by various image analysis and processing methods.

  8. Thermodynamic analysis of a milk pasteurization process assisted by geothermal energy

    International Nuclear Information System (INIS)

    Yildirim, Nurdan; Genc, Seda

    2015-01-01

    Renewable energy systems are an important concern for the sustainable development of the world, and thermodynamic analysis, especially exergy analysis, is a powerful tool for assessing the sustainability of such systems. The food processing industry is one of the energy-intensive sectors, and the dairy industry consumes a substantial amount of energy among the food industry segments. Therefore, in this study, the thermodynamic analysis of a milk pasteurization process assisted by geothermal energy was studied. In the system, a water–ammonia VAC (vapor absorption cycle), a cooling section, a pasteurizer and a regenerator were used for milk pasteurization. Exergetic efficiencies of each component and of the whole system were calculated separately. A parametric study was undertaken. In this regard, first the effect of the geothermal resource temperature on (i) the total exergy destruction of the absorption cycle and the whole system, (ii) the efficiency of the VAC and the whole system and the COP (coefficient of performance) of the VAC, and (iii) the flow rate of the pasteurized milk was investigated. Then, the effect of the geothermal resource flow rate on the pasteurization load was analyzed. The exergetic efficiency of the whole system was calculated as 56.81%, with a total exergy destruction rate of 13.66 kW. The exergetic results are also illustrated through a Grassmann diagram. - Highlights: • A geothermal-energy-assisted milk pasteurization system was studied thermodynamically. • The first study on the exergetic analysis of a milk pasteurization process with a VAC. • The thermodynamic properties of the water–ammonia mixture were calculated using EES. • Energetic and exergetic efficiencies were calculated as 71.05% and 56.81%, respectively.
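
    To connect the reported numbers, a back-of-envelope exergy balance can be written down. The sketch below assumes, purely for illustration, that all non-product exergy is accounted as destruction (no separate external losses), and back-calculates the exergy input implied by the record's 56.81% efficiency and 13.66 kW destruction figure.

      # Exergy bookkeeping sketch under the stated lumped-loss assumption.
      ex_destroyed = 13.66                 # kW, total exergy destruction (from record)
      eta_ex = 0.5681                      # exergetic efficiency (from record)
      ex_in = ex_destroyed / (1 - eta_ex)  # implied input if all losses = destruction
      print(f"implied exergy input ~ {ex_in:.1f} kW, "
            f"product exergy ~ {eta_ex * ex_in:.1f} kW")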

  9. Thermal analysis of LOFT waste gas processing system nitrogen supply for process line purge and blower seal

    International Nuclear Information System (INIS)

    Tatar, G.A.

    1979-01-01

    The LOFT Waste Gas Processing System uses gaseous nitrogen (GN2) to purge the main process line and to supply pressure on the blower labyrinth seal. The purpose of this analysis was to determine the temperature of the GN2 at the blower seals and the main process line. Since these temperatures were below 32°F, the heat rate necessary to raise them was calculated. This report shows that the GN2 temperatures at the points mentioned above were below 10°F. A heat rate into the GN2 of 389 W, added at the point where the supply line enters the vault, would raise the GN2 temperature above 32°F.
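
    The heating requirement behind the 389 W figure follows from the sensible-heat relation Q = m_dot * cp * dT. The sketch below inverts that relation for an assumed nitrogen heat capacity; the report does not state the flow rate, so the computed value is only the flow that 389 W could heat over the 10°F to 32°F rise.

      # Sensible-heat check; cp is a textbook value, the flow rate is derived.
      cp_n2 = 1040.0                    # J/(kg K), nitrogen near ambient conditions
      dT = (32.0 - 10.0) * 5.0 / 9.0    # required rise, degrees F converted to K
      m_dot = 389.0 / (cp_n2 * dT)      # kg/s that 389 W could heat by dT
      print(f"dT = {dT:.1f} K, supported flow ~ {m_dot * 1000:.0f} g/s")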

  10. Remote sensing, airborne radiometric survey and aeromagnetic survey data processing and analysis

    International Nuclear Information System (INIS)

    Dong Xiuzhen; Liu Dechang; Ye Fawang; Xuan Yanxiu

    2009-01-01

    Taking remote sensing data, airborne radiometric data and aeromagnetic survey data as an example, the authors elaborate on the basic approach to remote sensing data processing, spectral feature analysis and the adopted processing methods, explore the combined processing of remote sensing data with airborne radiometric and aeromagnetic survey data, and analyze the geological significance of the processed images. This is not only useful for geological environment research and uranium prospecting in the study area, but also serves as a reference for applications in other areas. (authors)

  11. Hillslope Discharge Analysis - Threshold Behavior and Mixing Processes

    Science.gov (United States)

    Dusek, J.; Vogel, T. N.

    2017-12-01

    Reliable quantitative prediction of temporal changes of both the soil water storage and the shallow subsurface runoff for natural forest hillslopes exhibiting high degree of subsurface heterogeneity remains a challenge. The intensity of stormflow determines to a large extent the residence time of water in a hillslope segment, thus also influencing biogeochemical processes and mass fluxes of nutrients. Stormflow, as one of the most important runoff mechanisms in headwater catchments, usually develops above the soil-bedrock interface during prominent rainfall-runoff events as saturated flow. In this study, one- and two-dimensional numerical models were used to analyze hydrological processes at an experimental forest site located in a small headwater catchment under humid temperate climate. The models are based on dual-continuum approach reflecting water flow and isotope transport through the soil matrix and preferential pathways. The threshold relationship between rainfall and stormflow as well as hysteresis in the hillslope stormflow-storage relationship were examined. The hillslope storage analysis was performed for selected individual rainfall-runoff events over the period of several consecutive growing seasons. Furthermore, temporal and spatial variations of pre-event and event water contributions to hillslope stormflow were evaluated using a two-component mass balance approach based on the synthetic oxygen-18 signatures. The results of this analysis showed a mutual interplay of components of hillslope water balance exposing a nonlinear character of the hillslope hydrological response. The results also suggested significant mixing processes in a hillslope segment, in particular mixing of pre-event and event water as well as water exchanged between the soil matrix and preferential pathways. Despite the dominant control of preferential stormflow on overall hillslope runoff response, a rapid and substantial contribution of pre-event water to hillslope runoff was

  12. A Review of Literature on analysis of JIG Grinding Process

    DEFF Research Database (Denmark)

    Sudheesh, P. K.; Puthumana, Govindan

    2016-01-01

    Jig grinding is a process practically used by tool and die makers in the creation of jigs or mating holes and pegs on dies.The abrasives normally used in jig grinding are divided into Natural Abrasives and Artificial Abrasives. Artificial Abrasiveare preferred in manufacturing of grinding wheels...... in jig grinding, because of their uniformity and purity. In this paper, abrief review of the analysis of jig grinding process considering various research trends is presented. The areas highlighted are: optimization, selection of abrasives, selection of processing conditions and practical considerations....... The optimization of parameters in jig grinding process is important to maximize productivity and to improve quality. The abrasives of hard jig grinding wheels get blunt quickly so these are recommended to grind workpiece of low hardness and soft grinding wheels are recommended for hard material workpieces. The jig...

  13. Qualitative Analysis of Films: Cultural Processes in the Mirror of Film

    Directory of Open Access Journals (Sweden)

    Gloria Dahl

    2004-05-01

    A special qualitative psychological analysis of movies developed by Wilhelm SALBER has been practiced at the Psychological Institute of the University of Cologne for more than 40 years. This kind of film analysis is not an end in itself, but also serves as an access point for researching cultural structures. In this respect movies are seismographs of cultural trends, expressing general visions and images of future development. They indicate the status of society, in its genesis and complexity, as well as its developmental perspectives, providing information about crises, narrowing scopes of action and immanent self-healing powers. Comparable to the process of dream interpretation, the "manifest" film narration is expanded with the associations and in-depth descriptions of the audience in order to reconstruct the latent "Komplexentwicklung," the development of psychological lines. Suspense and fascination rest on the activation of a meaningful transformational experience; only movies that touch the hearts of viewers stimulate such a process. The psychological analysis works out the morphological dramaturgy of the film experience, which is shaped into a specific dynamic figure. Paradoxical, insoluble problem-constellations are the driving forces in this moving process. The mere examination of the screenplay or the film story does not take into consideration that the audience is always part of the scene. Viewers modify the story in a characteristic way while they are watching it, according to the dynamics of the psychological process they are going through. A combination of joining in and maintaining an observing distance, as in therapy, advertising or education, is an integral part of this interplay. Because the significant factors work unconsciously, a specific qualitative method is necessary to grasp them. Short exemplary analyses of the movies The Piano, Fight Club, Dogville, Punch-Drunk Love, Catch Me If You Can, The Hours

  14. Economics of coal conversion processing. Advances in coal gasification: support research. Advances in coal gasification: process development and analysis

    Energy Technology Data Exchange (ETDEWEB)

    1978-01-01

    The fall meeting of the American Chemical Society, Division of Fuel Chemistry, was held at Miami Beach, Florida, September 10-15, 1978. Papers involved the economics of coal conversion processing and advances in coal gasification, especially support research and process development and analysis. Fourteen papers have been entered individually into EDB and ERA; three papers had been entered previously from other sources. (LTN)

  15. Thermoreflectance spectroscopy—Analysis of thermal processes in semiconductor lasers

    Science.gov (United States)

    Pierścińska, D.

    2018-01-01

    This review covers the theoretical foundations, experimental implementation and an overview of experimental results of thermoreflectance spectroscopy as a powerful technique for temperature monitoring and analysis of thermal processes in semiconductor lasers. This is an optical, non-contact, high spatial resolution technique providing high temperature resolution and mapping capabilities. Thermoreflectance is a thermometric technique based on measuring the relative change in reflectivity of the laser facet surface, which provides thermal images useful in hot spot detection and reliability studies. In this paper, the principles and experimental implementation of the technique as a thermography tool are discussed. Some exemplary applications of thermoreflectance to various types of lasers are presented, proving that the technique provides new insight into heat management problems in semiconductor lasers and, in particular, that it allows studying thermal degradation processes occurring at laser facets. Additionally, thermal processes and basic mechanisms of degradation of semiconductor lasers are discussed.

  16. Analysis of architect’s performance indicators in project delivery process

    Science.gov (United States)

    Marisa, A.

    2018-03-01

    An architect, as a professional in the construction industry, should perform well in the project delivery process. As a design professional, the architect has an important role in ensuring that the process is well conducted by delivering a high-quality product to the clients. Thus, analyzing architect performance indicators is crucial in the project delivery process. This study aims to analyze the relative importance of architect performance indicators in the project delivery process among registered architects in North Sumatera, Indonesia. A total of five indicators that measure architect performance in the project delivery process were identified, and 110 completed questionnaires were obtained and used for data analysis. A relative importance index is used to rank the indicators. Results indicate that focus on the clients is the most important indicator of architect performance in the project delivery process. This study also identifies project communication as a crucial indicator perceived by architects for measuring their performance, and fills a knowledge gap, overlooked by previous studies, on identifying the most important indicators for measuring architect performance from the architects' own perspective.

  17. Computer aided analysis, simulation and optimisation of thermal sterilisation processes.

    Science.gov (United States)

    Narayanan, C M; Banerjee, Arindam

    2013-04-01

    Although thermal sterilisation is a widely employed industrial process, little work is reported in the available literature, including patents, on the mathematical analysis and simulation of these processes. In the present work, software packages have been developed for computer aided optimum design of thermal sterilisation processes. Systems involving steam sparging, jacketed heating/cooling, helical coils submerged in agitated vessels and systems that employ external heat exchangers (double pipe, shell and tube, and plate exchangers) have been considered. Both batch and continuous operations have been analysed and simulated. The dependence of the del factor on system and operating parameters such as mass or volume of substrate to be sterilised per batch, speed of agitation, helix diameter, substrate to steam ratio, rate of substrate circulation through the heat exchanger and through the holding tube has been analysed separately for each mode of sterilisation. Axial dispersion in the holding tube has also been adequately accounted for through an appropriately defined axial dispersion coefficient. The effect of exchanger characteristics/specifications on system performance has also been analysed. The multiparameter computer aided design (CAD) software packages prepared are thus highly versatile in nature and permit the optimum choice of operating variables for the processes selected. The computed results have been compared with extensive data collected from a number of industries (distilleries, food processing and pharmaceutical industries) and pilot plants, and satisfactory agreement has been observed between the two, thereby ascertaining the accuracy of the CAD software developed. No simplifying assumptions have been made during the analysis, and the design of the associated heating/cooling equipment has been performed using the most up-to-date design correlations and computer software.
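
    The del factor these packages compute is ∇ = ln(N0/N), accumulated over the temperature history of the substrate. A minimal sketch of one common formulation (first-order death kinetics with an Arrhenius rate); the kinetic constants and heating curve are illustrative, not taken from the paper:

```python
# Del (sterilisation) factor: del = ln(N0/N) = integral of k(T) dt,
# with k following an Arrhenius law. Constants are illustrative only.
import numpy as np

A = 1.0e36          # 1/s, frequency factor (typical order for spore death)
E = 2.83e5          # J/mol, activation energy (illustrative)
R = 8.314           # J/(mol*K)

t = np.linspace(0.0, 1800.0, 2000)                 # 30 min profile, s
T = 300.0 + 94.0 * (1.0 - np.exp(-t / 300.0))      # heating curve, K (assumed)

k = A * np.exp(-E / (R * T))                       # first-order death rate, 1/s
del_factor = np.trapz(k, t)                        # ln(N0/N) achieved
print(f"del factor = {del_factor:.1f}")
```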

  18. IR spectroscopy together with multivariate data analysis as a process analytical tool for in-line monitoring of crystallization process and solid-state analysis of crystalline product

    DEFF Research Database (Denmark)

    Pöllänen, Kati; Häkkinen, Antti; Reinikainen, Satu-Pia

    2005-01-01

    Crystalline product should exist in the optimal polymorphic form, so a robust and reliable method for polymorph characterization is of great importance. In this work, infra red (IR) spectroscopy is applied for monitoring of the crystallization process in situ. The results show that attenuated total reflection Fourier transform infra red (ATR-FTIR) spectroscopy provides valuable information on the process, which can be utilized for more controlled crystallization processes. Diffuse reflectance Fourier transform infra red (DRIFT-IR) is applied for polymorphic characterization of the crystalline product, using X-ray powder diffraction (XRPD) as a reference technique. In order to fully utilize DRIFT, the application of multivariate techniques is needed, e.g., multivariate statistical process control (MSPC), principal component analysis (PCA) and partial least squares (PLS). The results demonstrate that multivariate…
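
    As a rough illustration of the multivariate step described here, the sketch below fits a PCA model to in-control spectra and scores a new spectrum with a Hotelling T2 statistic; the spectra are simulated stand-ins, not crystallization data:

```python
# PCA-based multivariate monitoring sketch (one common MSPC building block).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
normal_spectra = rng.normal(size=(50, 200))      # 50 in-control spectra
new_spectrum = rng.normal(size=(1, 200)) + 0.5   # shifted, possibly abnormal

pca = PCA(n_components=3).fit(normal_spectra)

def hotelling_t2(model: PCA, x: np.ndarray) -> float:
    scores = model.transform(x)
    return float(np.sum(scores**2 / model.explained_variance_, axis=1)[0])

print("T2 of new spectrum:", round(hotelling_t2(pca, new_spectrum), 2))
# Compare against a control limit fitted from the normal batches (e.g. an
# F-distribution based limit) to flag polymorph or process deviations.
```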

  19. Development of Process Automation in the Neutron Activation Analysis Facility in Malaysian Nuclear Agency

    International Nuclear Information System (INIS)

    Yussup, N.; Azman, A.; Ibrahim, M.M.; Rahman, N.A.A.; Che Sohashaari, S.; Atan, M.N.; Hamzah, M.A.; Mokhtar, M.; Khalid, M.A.; Salim, N.A.A.; Hamzah, M.S.

    2018-01-01

    Neutron Activation Analysis (NAA) has been established at the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s. Most of the established procedures, from sample registration to analysis, are performed manually. These manual procedures carried out by the NAA laboratory personnel are time consuming and inefficient. Hence, system automation was developed in order to provide an effective method to replace redundant manual data entries and produce a faster sample analysis and calculation process. This report explains the NAA process at Nuclear Malaysia and describes the automation development in detail, which includes sample registration software, an automatic sample changer system consisting of hardware and software, and sample analysis software. (author)

  20. Statistical Analysis of Deep Drilling Process Conditions Using Vibrations and Force Signals

    Directory of Open Access Journals (Sweden)

    Syafiq Hazwan

    2016-01-01

    Cooling systems are a key element in the hot forming process of Ultra High Strength Steels (UHSS). Normally, cooling systems are made using deep drilling techniques. Although the deep twist drill outperforms other drilling techniques in terms of productivity, its main problem is premature tool breakage, which affects production quality. In this paper, an analysis of deep twist drill process parameters such as cutting speed, feed rate and depth of cut, using statistical analysis to identify the tool condition, is presented. Comparisons between two different tool geometries are also studied. Measured data from vibration and force sensors are analyzed through several statistical parameters such as root mean square (RMS), mean, kurtosis, standard deviation and skewness. It was found that kurtosis and skewness are the most appropriate parameters to represent deep twist drill tool condition from vibration and force data. The condition of the deep twist drill process was classified as good, blunt or fractured. It was also found that the tool geometry parameters affect the performance of the drill. The results of this study are believed to be useful in determining a suitable analysis method for developing an online tool condition monitoring system to identify the tertiary tool life stage and help avoid premature tool fracture during the drilling process.
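
    The statistical features named in the abstract are straightforward to compute; a short sketch using NumPy/SciPy on a simulated vibration trace:

```python
# Feature extraction for tool-condition monitoring: the five statistics
# named in the abstract, computed from a (simulated) raw vibration trace.
import numpy as np
from scipy import stats

signal = np.random.default_rng(1).normal(0.0, 0.8, size=4096)  # stand-in trace

features = {
    "rms":      float(np.sqrt(np.mean(signal**2))),
    "mean":     float(np.mean(signal)),
    "std":      float(np.std(signal)),
    "kurtosis": float(stats.kurtosis(signal)),   # excess kurtosis
    "skewness": float(stats.skew(signal)),
}
print(features)
# Tool-condition classes (good / blunt / fracture) would then be assigned
# from thresholds or a classifier trained on these features.
```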

  1. Economic analysis of locust beans processing and marketing in ilorin, kwara state, Nigeria

    Directory of Open Access Journals (Sweden)

    C.O. Farayola

    2012-12-01

    This study was designed to provide an economic analysis of locust bean processing and marketing in Ilorin, Kwara State, Nigeria. Primary data were used, and a purposive sampling technique was adopted to select the respondents for the study. A total of 60 respondents were interviewed. The data collected were analyzed using inferential statistical tools such as regression analysis. A budgetary analysis technique was also used to analyze the profitability of locust bean processing and marketing in the study area. The majority of the processors and marketers are making profits: 68.3% operate above the breakeven point, 26.7% operate at the breakeven point, neither profiting nor losing, and the remaining 5% operate below it. The regression analysis shows that quantity processed, family size and years of processing experience are significant at 1%, 5% and 10% respectively, while education level and stall rent are negative and significant at 1% and 5% respectively. An F-test also shows that the independent variables are jointly significant at the 1% probability level, with an adjusted R2 of 78.9%. The overall rate of return on investment is 0.5 (50%), which is positive. It is therefore concluded that the profit made by the processors and marketers can be improved by increasing the quantity of locust bean processed, through the adoption of newly discovered processing methods and improved methods of preservation, packaging and marketing of the product to international standards, reducing the odour of the product without loss of essential nutrients and palatability, in order to generate foreign exchange. Also, rules and regulations against the cutting of economic trees for alternative uses should be enforced to maximize their value.
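
    The budgetary indicators used here reduce to simple arithmetic. A toy sketch of the rate-of-return and breakeven check, with invented figures:

```python
# Back-of-envelope versions of the budgetary indicators in the study:
# rate of return on investment and the breakeven check. Figures invented.

def rate_of_return(revenue: float, total_cost: float) -> float:
    return (revenue - total_cost) / total_cost

revenue, total_cost = 90_000.0, 60_000.0   # hypothetical naira per season
ror = rate_of_return(revenue, total_cost)
print(f"rate of return: {ror:.2f}")        # 0.50, i.e. the 50% scale reported
print("above breakeven" if ror > 0 else "at/below breakeven")
```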

  2. Analysis of launch site processing effectiveness for the Space Shuttle 26R payload

    Science.gov (United States)

    Flores, Carlos A.; Heuser, Robert E.; Pepper, Richard E., Jr.; Smith, Anthony M.

    1991-01-01

    A trend analysis study has been performed on problem reports recorded during the Space Shuttle 26R payload's processing cycle at NASA-Kennedy, using the defect-flow analysis (DFA) methodology; DFA gives attention to the characteristics of the problem-report 'population' as a whole. It is established that the problem reports contain data which distract from pressing problems, and that fully 60 percent of such reports were caused during processing at NASA-Kennedy. The second major cause of problem reports was design defects.

  3. Laser apparatus and method for microscopic and spectroscopic analysis and processing of biological cells

    Science.gov (United States)

    Gourley, P.L.; Gourley, M.F.

    1997-03-04

    An apparatus and method are disclosed for microscopic and spectroscopic analysis and processing of biological cells. The apparatus comprises a laser having an analysis region within the laser cavity for containing one or more biological cells to be analyzed. The presence of a cell within the analysis region in superposition with an activated portion of a gain medium of the laser acts to encode information about the cell upon the laser beam, the cell information being recoverable by an analysis means that preferably includes an array photodetector such as a CCD camera and a spectrometer. The apparatus and method may be used to analyze biomedical cells including blood cells and the like, and may include processing means for manipulating, sorting, or eradicating cells after analysis. 20 figs.

  4. Energy analysis of a desalination process of sea water with nuclear energy

    International Nuclear Information System (INIS)

    Martinez L, G.; Valle H, J.

    2016-09-01

    In the present work, it is theoretically shown that the residual heat removed by the chillers in the stage prior to compression in the recuperative Brayton cycle, with which nuclear power plants with high temperature gas reactors (HTGR) operate, can be used to produce steam and desalinate seawater. The desalination process selected for the analysis, based on its operating characteristics, is Multi-Effect Distillation (MED). The MED process uses as its energy source, for the flash evaporation process in the flash trap, the residual heat that the reactor coolant dissipates to the environment in order to increase the compression efficiency; the energy dissipated depends on the operating conditions of the reactor. The MED distillation process requires saturated steam at low pressure, which can be obtained by means of a heat exchanger taking advantage of the residual heat; the relatively low temperatures at which the process operates make nuclear plants with HTGR reactors ideal for seawater desalination, because they do not require major modifications to their design or operation. In this work the energy analysis of a six-stage MED module coupled to the chillers of an HTGR reactor of the Pebble Bed Modular Reactor type is presented. The mathematical model was obtained from differential equations of mass and energy balances in the system. The results of the analysis are presented in a table for each distillation stage, estimating the pure water obtained as a function of the heat supplied. (Author)
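
    A first-cut version of the kind of energy balance involved: distillate production scales with the supplied heat, the latent heat of vaporisation and the gained output ratio of the effect train. The sketch below uses assumed values, not the paper's reactor-specific numbers:

```python
# First-cut energy balance for a multi-effect distillation (MED) train:
# distillate flow ~ GOR * Q_supplied / latent heat. Values are assumptions.

H_FG = 2.33e6        # J/kg, latent heat of vaporisation near 70 C

def distillate_kg_s(q_watts: float, n_effects: int,
                    gor_per_effect: float = 0.8) -> float:
    """Approximate distillate production for a given heat input."""
    gor = gor_per_effect * n_effects          # crude gained-output-ratio model
    return gor * q_watts / H_FG

q_residual = 5.0e6    # W of chiller waste heat (hypothetical)
print(f"{distillate_kg_s(q_residual, n_effects=6):.2f} kg/s of fresh water")
```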

  5. Boiling process modelling peculiarities analysis of the vacuum boiler

    Science.gov (United States)

    Slobodina, E. N.; Mikhailov, A. G.

    2017-06-01

    An analysis of the development of low and medium powered boiler equipment was carried out, and possible development directions for boiler units aimed at improving energy efficiency were identified. Engineering arguments for applying vacuum boilers are presented. Heat-exchange processes in the vacuum boiler, where boiling water is the working body, are considered. A method of heat-exchange intensification under boiling at the maximum heat-transfer coefficient is examined. As a result of the calculation studies, curves of the heat-transfer coefficient as a function of pressure, calculated by analytical and numerical methodologies, were obtained. It is concluded that the numerical method implemented via the RPI model in ANSYS CFX can be applied to describe the boiling process in the boiler vacuum volume.

  6. Integrated system for design and analysis of industrial processes with electrolyte system

    DEFF Research Database (Denmark)

    Takano, Kiyoteru; Gani, Rafiqul

    1999-01-01

    An algorithm for the design and analysis of crystallization processes with electrolyte systems is presented. The algorithm consists of a thermodynamic part, a synthesis part and a design part. The three parts are integrated through a simulation engine. The main feature of the algorithm is the use of thermodynamic insights not only to generate process alternatives but also to obtain good initial estimates for the simulation engine and for visualization of process synthesis/design. The main steps of the algorithm are highlighted through a case study involving an industrial crystallization process.

  7. A Sensitivity Analysis Method to Study the Behavior of Complex Process-based Models

    Science.gov (United States)

    Brugnach, M.; Neilson, R.; Bolte, J.

    2001-12-01

    The use of process-based models as a tool for scientific inquiry is becoming increasingly relevant in ecosystem studies. Process-based models are artificial constructs that simulate the system by mechanistically mimicking the functioning of its component processes. Structurally, a process-based model can be characterized in terms of its processes and the relationships established among them. Each process comprises a set of functional relationships among several model components (e.g., state variables, parameters and input data). While not encoded explicitly, the dynamics of the model emerge from this set of components and interactions organized in terms of processes. It is the task of the modeler to guarantee that the dynamics generated are appropriate and semantically equivalent to the phenomena being modeled. Despite the availability of techniques to characterize and understand model behavior, they do not suffice to completely and easily understand how a complex process-based model operates. For example, sensitivity analysis studies model behavior by determining the rate of change in model output as parameters or input data are varied. One of the problems with this approach is that it considers the model as a "black box", and it focuses on explaining model behavior by analyzing the input-output relationship. Since these models have a high degree of non-linearity, understanding how the input affects an output can be an extremely difficult task. Operationally, the application of this technique may constitute a challenging task because complex process-based models are generally characterized by a large parameter space. In order to overcome some of these difficulties, we propose a method of sensitivity analysis applicable to complex process-based models. This method focuses sensitivity analysis at the process level, and it aims to determine how sensitive the model output is to variations in the processes. Once the processes that exert the major influence in
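
    A minimal sketch of the proposed idea, one-at-a-time perturbation applied at the process level rather than to individual parameters, using a toy two-process model:

```python
# Process-level one-at-a-time sensitivity: perturb a whole process term
# and record the normalised response. The toy model is illustrative only.

def model(growth: float, mortality: float) -> float:
    """Toy ecosystem output: equilibrium biomass from two processes."""
    return growth / mortality

base = {"growth": 2.0, "mortality": 0.5}
y0 = model(**base)

for process in base:
    perturbed = dict(base)
    perturbed[process] *= 1.10                    # +10% to the whole process
    y1 = model(**perturbed)
    sensitivity = ((y1 - y0) / y0) / 0.10         # normalised sensitivity index
    print(f"{process}: S = {sensitivity:+.2f}")
```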

  8. Introduction of male circumcision for HIV prevention in Uganda: analysis of the policy process.

    Science.gov (United States)

    Odoch, Walter Denis; Kabali, Kenneth; Ankunda, Racheal; Zulu, Joseph Mumba; Tetui, Moses

    2015-06-20

    Health policy analysis is important for all health policies especially in fields with ever changing evidence-based interventions such as HIV prevention. However, there are few published reports of health policy analysis in sub-Saharan Africa in this field. This study explored the policy process of the introduction of male circumcision (MC) for HIV prevention in Uganda in order to inform the development processes of similar health policies. Desk review of relevant documents was conducted between March and May 2012. Thematic analysis was used to analyse the data. Conceptual frameworks that demonstrate the interrelationship within the policy development processes and influence of actors in the policy development processes guided the analysis. Following the introduction of MC on the national policy agenda in 2007, negotiation and policy formulation preceded its communication and implementation. Policy proponents included academic researchers in the early 2000s and development partners around 2007. Favourable contextual factors that supported the development of the policy included the rising HIV prevalence, adoption of MC for HIV prevention in other sub-Saharan African countries, and expertise on MC. Additionally, the networking capability of proponents facilitated the change in position of non-supportive or neutral actors. Non-supportive and neutral actors in the initial stages of the policy development process included the Ministry of Health, traditional and Muslim leaders, and the Republican President. Using political authority, legitimacy, and charisma, actors who opposed the policy tried to block the policy development process. Researchers' initial disregard of the Ministry of Health in the research process of MC and the missing civil society advocacy arm contributed to delays in the policy development process. This study underscores the importance of securing top political leadership as well as key implementing partners' support in policy development processes

  9. FE-Analysis of Stretch-Blow Moulded Bottles Using an Integrative Process Simulation

    Science.gov (United States)

    Hopmann, C.; Michaeli, W.; Rasche, S.

    2011-05-01

    The two-stage stretch-blow moulding process has been established for the large-scale production of high quality PET containers with excellent mechanical and optical properties. Material costs account for a significant share of the total production costs of a bottle. Due to this dominant share, the PET industry is interested in reducing total production costs through optimised material efficiency. However, a reduced material inventory means decreasing wall thicknesses and thereby a reduction of the bottle properties (e.g. mechanical properties, barrier properties). Therefore, there is often a trade-off between a minimal bottle weight and adequate properties of the bottle. Computer Aided Engineering (CAE) techniques can assist the designer of new stretch-blow moulded containers in achieving these objectives. Hence, tools such as process simulation and structural analysis have become important in the blow moulding sector. The Institute of Plastics Processing (IKV) at RWTH Aachen University, Germany, has developed an integrative three-dimensional process simulation which models the complete path of a preform through a stretch-blow moulding machine. At first, the reheating of the preform is calculated by a thermal simulation. Afterwards, the inflation of the preform to a bottle is calculated by finite element analysis (FEA). The results of this step are e.g. the local wall thickness distribution and the local biaxial stretch ratios. Not only the material distribution but also the material properties that result from the deformation history of the polymer have a significant influence on the bottle properties. Therefore, a correlation between the material properties and stretch ratios is considered in an integrative simulation approach developed at IKV. The results of the process simulation (wall thickness, stretch ratios) are transferred to a further simulation program and mapped onto the bottle's FE mesh. This approach allows a local

  10. Unraveling cell processes: interference imaging interwoven with data analysis

    DEFF Research Database (Denmark)

    Brazhe, Nadezda; Brazhe, Alexey; Pavlov, A N

    2006-01-01

    The paper presents results on the application of interference microscopy and wavelet analysis for cell visualization and studies of cell dynamics. We demonstrate that interference imaging of erythrocytes can reveal reorganization of the cytoskeleton and inhomogeneity in the distribution of hemoglobin. … properties differ from cell type to cell type and depend on the cellular compartment. Our results suggest that low frequency variations (0.1-0.6 Hz) result from plasma membrane processes and that higher frequency variations (20-26 Hz) are related to the movement of vesicles. Using double-wavelet analysis, we study the modulation of the 1 Hz rhythm in neurons and reveal its changes under depolarization and hyperpolarization of the plasma membrane. We conclude that interference microscopy combined with wavelet analysis is a useful technique for non-invasive cell studies, cell visualization, and investigation

  11. Memory systems, processes, and tasks: taxonomic clarification via factor analysis.

    Science.gov (United States)

    Bruss, Peter J; Mitchell, David B

    2009-01-01

    The nature of various memory systems was examined using factor analysis. We reanalyzed data from 11 memory tasks previously reported in Mitchell and Bruss (2003). Four well-defined factors emerged, closely resembling episodic and semantic memory and conceptual and perceptual implicit memory, in line with both memory systems and transfer-appropriate processing accounts. To explore taxonomic issues, we ran separate analyses on the implicit tasks. Using a cross-format manipulation (pictures vs. words), we identified 3 prototypical tasks. Word fragment completion and picture fragment identification tasks were "factor pure," tapping perceptual processes uniquely. Category exemplar generation revealed its conceptual nature, yielding both cross-format priming and a picture superiority effect. In contrast, word stem completion and picture naming were more complex, revealing attributes of both processes.

  12. Conversion of paper sludge to ethanol, II: process design and economic analysis.

    Science.gov (United States)

    Fan, Zhiliang; Lynd, Lee R

    2007-01-01

    Process design and economics are considered for conversion of paper sludge to ethanol. A particular site, a bleached kraft mill operated in Gorham, NH by Fraser Papers (15 tons dry sludge processed per day), is considered. In addition, profitability is examined for a larger plant (50 dry tons per day) and sensitivity analysis is carried out with respect to capacity, tipping fee, and ethanol price. Conversion based on simultaneous saccharification and fermentation with intermittent feeding is examined, with ethanol recovery provided by distillation and molecular sieve adsorption. It was found that the Fraser plant achieves positive cash flow with or without xylose conversion and mineral recovery. Sensitivity analysis indicates economics are very sensitive to ethanol selling price and scale; significant but less sensitive to the tipping fee, and rather insensitive to the prices of cellulase and power. Internal rates of return exceeding 15% are projected for larger plants at most combinations of scale, tipping fee, and ethanol price. Our analysis lends support to the proposition that paper sludge is a leading point-of-entry and proving ground for emergent industrial processes featuring enzymatic hydrolysis of cellulosic biomass.

  13. Innovation and decision-making process in reverse logistics: a bibliometric analysis

    Directory of Open Access Journals (Sweden)

    Fernando Antonio Guimarães Tenório

    2014-05-01

    This work provides a bibliometric analysis of empirical studies focusing on the reverse logistics process. Papers published in two major management and production engineering conferences during the years 2007-2012 were collected. The concepts of innovation and decision-making were adopted as assumptions for the analysis. Forty-three articles were analyzed, and it was found that, in most cases, organizations choose to deploy reverse logistics as a means of solving problems related to environmental laws and regulations; after its implementation, decision-making related to the network of companies that perform reverse logistics remains restricted to the adopting company, thus becoming a centralized decision-making process. It was also found that reverse logistics is, in most cases, an innovation in the supply chain: it provides a new way to manage and operate the return and recycling of waste products, generating competitive advantages in the form of increased net income and a better image of the organization with its partners and customers.

  14. Process Measurement Deviation Analysis for Flow Rate due to Miscalibration

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Eunsuk; Kim, Byung Rae; Jeong, Seog Hwan; Choi, Ji Hye; Shin, Yong Chul; Yun, Jae Hee [KEPCO Engineering and Construction Co., Deajeon (Korea, Republic of)

    2016-10-15

    An analysis was initiated to identify the root cause; the exemption of the high static line pressure correction for differential pressure (DP) transmitters was one of the major deviation factors, and the miscalibrated DP transmitter range was identified as another. This paper presents considerations to be incorporated in the calibration of process flow measurement instrumentation. The analysis identified that the DP flow transmitter electrical output decreased by 3%, and the flow rate indication consequently decreased by 1.9%, resulting from the high static line pressure correction exemption and the measurement range miscalibration. After re-calibration, the flow rate indication increased by 1.9%, which is consistent with the analysis result. This paper presents the brief calibration procedures for the Rosemount DP flow transmitter and analyzes three possible cases of measurement deviation, including error and cause. Generally, a DP transmitter must be calibrated over the precise process input range according to the calibration procedure provided for the specific DP transmitter. In particular, for a DP transmitter installed at high static line pressure, it is important to correct for the high static line pressure effect to avoid the inherent systematic error of the Rosemount DP transmitter; failure to apply the correction may lead to the indication deviating from the actual value.
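
    Because DP flowmeters infer flow from the square root of the differential pressure, a span error in DP does not map one-to-one into indicated flow. A generic sketch of that propagation (the paper's exact 3% to 1.9% figures also reflect the static-pressure correction, which is omitted here):

```python
# Square-root propagation of a DP span error into indicated flow.
import math

def indicated_flow(dp_fraction_of_span: float, full_scale_flow: float) -> float:
    return full_scale_flow * math.sqrt(dp_fraction_of_span)

true_dp = 0.80                      # operating point, fraction of DP span
flow_ok = indicated_flow(true_dp, 100.0)
flow_low = indicated_flow(true_dp * 0.97, 100.0)   # 3% low DP signal
print(f"{(flow_ok - flow_low) / flow_ok:.1%} low in flow")  # ~1.5%
```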

  15. Direct analysis in real time mass spectrometry, a process analytical technology tool for real-time process monitoring in botanical drug manufacturing.

    Science.gov (United States)

    Wang, Lu; Zeng, Shanshan; Chen, Teng; Qu, Haibin

    2014-03-01

    A promising process analytical technology (PAT) tool has been introduced for batch processes monitoring. Direct analysis in real time mass spectrometry (DART-MS), a means of rapid fingerprint analysis, was applied to a percolation process with multi-constituent substances for an anti-cancer botanical preparation. Fifteen batches were carried out, including ten normal operations and five abnormal batches with artificial variations. The obtained multivariate data were analyzed by a multi-way partial least squares (MPLS) model. Control trajectories were derived from eight normal batches, and the qualification was tested by R(2) and Q(2). Accuracy and diagnosis capability of the batch model were then validated by the remaining batches. Assisted with high performance liquid chromatography (HPLC) determination, process faults were explained by corresponding variable contributions. Furthermore, a batch level model was developed to compare and assess the model performance. The present study has demonstrated that DART-MS is very promising in process monitoring in botanical manufacturing. Compared with general PAT tools, DART-MS offers a particular account on effective compositions and can be potentially used to improve batch quality and process consistency of samples in complex matrices. Copyright © 2014 Elsevier B.V. All rights reserved.
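
    As a rough sketch of the latent-variable modelling behind MPLS monitoring (true MPLS unfolds batch-wise trajectories; plain PLS on simulated fingerprints is shown instead):

```python
# PLS sketch: relate batch fingerprints to a quality variable and report
# the R2 qualification statistic. Data are simulated stand-ins.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(7)
X = rng.normal(size=(10, 300))          # 10 batches x 300 m/z intensities
y = X[:, :5].sum(axis=1) + rng.normal(scale=0.1, size=10)  # latent quality

pls = PLSRegression(n_components=2).fit(X, y)
r2 = pls.score(X, y)                    # the R2 qualification statistic
print(f"R2 = {r2:.2f}")
# A new batch whose prediction drifts outside the control trajectories
# (with Q residuals checked as well) would be flagged as abnormal.
```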

  16. Experimental software for modeling and interpreting educational data analysis processes

    Directory of Open Access Journals (Sweden)

    Natalya V. Zorina

    2017-12-01

    Problems, tasks and processes of educational data mining are considered in this article. The objective is to create a fundamentally new information system for the university using the results of educational data analysis. One of the functions of such a system is knowledge extraction from the data accumulated during operation. The creation of a national system of this type is an iterative and time-consuming process requiring preliminary studies and incremental prototyping of modules. Since few systems have been developed using this methodology, a number of experiments were carried out in order to collect data, choose appropriate methods for the study and interpret the results. The data sources available to the authors in the information environment of their home university were used for the experiment. The data were taken from semester performance records obtained from the information system of the training department of the Institute of IT MTU MIREA, from the results of students' independent work, and from specially designed Google Forms. To automate the collection and analysis of educational data, an experimental software package was created. As the development methodology for the experimental software complex, the methodologies of rational-empirical complexes (REX) and single-experimentation program technologies (TPEI) were adopted. The program implementation of the complex is described in detail, conclusions are given about the availability of the data sources used, and conclusions are drawn about the prospects for further development.

  17. Japan's sunshine project. 17. 1992 annual summary of coal liquefaction and gasification

    Energy Technology Data Exchange (ETDEWEB)

    1993-09-01

    This report describes the achievement of coal liquefaction and gasification technology development in the Sunshine Project for FY 1992. It presents the research and development of coal liquefaction which includes studies on reaction mechanism of coal liquefaction and catalysts for coal liquefaction, the research and development of coal gasification technologies which includes studies on gasification characteristics of various coals and improvement of coal gasification efficiency, the development of bituminous coal liquefaction which includes engineering, construction and operation of a bituminous coal liquefaction pilot plant and research by a process supporting unit (PSU), the development of brown coal liquefaction which includes research on brown coal liquefaction with a pilot plant and development of techniques for upgrading coal oil from brown coal, the development of common base technologies which includes development of slurry letdown valves and study on upgrading technology of coal-derived distillates, the development of coal-based hydrogen production technology with a pilot plant, the development of technology for entrained flow coal gasification, the assessment of coal hydrogasification, and the international co-operation. 4 refs., 125 figs., 39 tabs.

  18. SITE SUITABILITY ANALYSIS FOR BEEKEEPING VIA ANALYTICAL HIERARCHY PROCESS, KONYA EXAMPLE

    Directory of Open Access Journals (Sweden)

    F. Sarı

    2017-11-01

    Over the past decade, the importance of beekeeping activities has been emphasized in the fields of biodiversity, ecosystems, agriculture and human health. Efficient management and correct siting of beekeeping activities thus seem essential to maintain and improve productivity and efficiency. Given this importance, and considering the economic contribution to rural areas, the need for a suitability analysis concept has emerged. At this point, the integration of Multi Criteria Decision Analysis (MCDA) and Geographical Information Systems (GIS) provides efficient solutions to the complex structure of the decision-making process for beekeeping activities. In this study, a site suitability analysis via the Analytical Hierarchy Process (AHP) was carried out for the city of Konya, Turkey. Slope, elevation, aspect, distance to water resources, roads and settlements, precipitation and flora criteria were included to determine suitability. The requirements, expectations and limitations of beekeeping activities were specified with the participation of experts and stakeholders. The final suitability map was validated against 117 existing beekeeping locations and the Turkish Statistical Institute's 2016 beekeeping statistics for Konya province.
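
    The core AHP computation behind such suitability studies is the derivation of criteria weights from a pairwise comparison matrix, together with a consistency check. A sketch with an illustrative three-criteria matrix, not the study's actual judgements:

```python
# AHP weights from the principal eigenvector, plus the consistency ratio.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])        # pairwise judgements (Saaty scale)

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()               # priority vector

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)   # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]   # random index (Saaty)
print("weights:", np.round(weights, 3), "CR =", round(ci / ri, 3))
# CR < 0.10 is the usual acceptance threshold for the judgement matrix.
```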

  19. An Evidential Reasoning-Based CREAM to Human Reliability Analysis in Maritime Accident Process.

    Science.gov (United States)

    Wu, Bing; Yan, Xinping; Wang, Yang; Soares, C Guedes

    2017-10-01

    This article proposes a modified cognitive reliability and error analysis method (CREAM) for estimating the human error probability in the maritime accident process on the basis of an evidential reasoning approach. This modified CREAM is developed to precisely quantify the linguistic variables of the common performance conditions and to overcome the problem of ignoring the uncertainty caused by incomplete information in the existing CREAM models. Moreover, this article views maritime accident development from the sequential perspective, where a scenario- and barrier-based framework is proposed to describe the maritime accident process. This evidential reasoning-based CREAM approach together with the proposed accident development framework are applied to human reliability analysis of a ship capsizing accident. It will facilitate subjective human reliability analysis in different engineering systems where uncertainty exists in practice. © 2017 Society for Risk Analysis.
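
    For orientation, the screening step of basic CREAM maps the net effect of the common performance conditions (CPCs) to a control mode with an associated human error probability (HEP) interval. The sketch below is a deliberately coarse simplification of that mapping (the published method uses a two-dimensional region diagram), with intervals as given in Hollnagel's basic method:

```python
# Coarse sketch of basic CREAM screening: CPC counts -> control mode -> HEP.

HEP_INTERVALS = {               # control mode: (lower HEP, upper HEP)
    "strategic":     (0.5e-5, 1e-2),
    "tactical":      (1e-3, 0.1),
    "opportunistic": (1e-2, 0.5),
    "scrambled":     (0.1, 1.0),
}

def control_mode(n_reduced: int, n_improved: int) -> str:
    """Very coarse stand-in for the published region diagram (screening only)."""
    if n_reduced >= 7:
        return "scrambled"
    if n_reduced >= 4:
        return "opportunistic"
    if n_reduced >= 1 and n_improved < n_reduced:
        return "tactical"
    return "strategic"

mode = control_mode(n_reduced=3, n_improved=1)   # hypothetical assessment
print(mode, HEP_INTERVALS[mode])
```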

  20. A meta-analysis of multicultural competencies and psychotherapy process and outcome.

    Science.gov (United States)

    Tao, Karen W; Owen, Jesse; Pace, Brian T; Imel, Zac E

    2015-07-01

    For decades, psychologists have emphasized the provision of multiculturally competent psychotherapy to reduce racial and ethnic disparities in mental health treatment. However, the relationship between multicultural competencies (MC) and other measures of clinical process and treatment outcome has shown heterogeneity in effect sizes. This meta-analysis tested the association of client ratings of therapist MC with measures of therapeutic processes and outcome, including: (a) working alliance, (b) client satisfaction, (c) general counseling competence, (d) session impact, and (e) symptom improvement. Among 18 studies (20 independent samples) included in the analysis, the correlation between therapist MC and outcome (r = .29) was much smaller than the association with process measures (r = .75), but there were no significant differences in correlations across different types of MC or clinical process measures. Providing some evidence of publication bias, effect sizes from published studies (r = .67) were larger than those from unpublished dissertations (r = .28). Moderator analyses indicated that client age, gender, the representation of racial-ethnic minority (R-EM) clients, and clinical setting were not associated with effect size variability. Based on these findings, we discuss implications and recommendations for future research that might lead to a better understanding of the effects of therapist MC on treatment process and outcome. Primary needs in future research include the development and evaluation of observer ratings of therapist MC and the implementation of longitudinal research designs. (c) 2015 APA, all rights reserved.
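
    The pooling machinery behind such a meta-analysis of correlations is standard: Fisher z-transformation, inverse-variance weighting and a DerSimonian-Laird estimate of between-study variance. A sketch with invented (r, n) pairs:

```python
# Random-effects pooling of correlations (Fisher z, DerSimonian-Laird tau^2).
import numpy as np

r = np.array([0.75, 0.60, 0.29, 0.40, 0.68])      # per-study correlations
n = np.array([120, 85, 200, 60, 150])             # per-study sample sizes

z = np.arctanh(r)                                  # Fisher z
v = 1.0 / (n - 3)                                  # within-study variance
w = 1.0 / v

z_fixed = np.sum(w * z) / np.sum(w)
q = np.sum(w * (z - z_fixed) ** 2)                 # heterogeneity statistic
df = len(r) - 1
c = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (q - df) / c)                      # DerSimonian-Laird

w_re = 1.0 / (v + tau2)
z_re = np.sum(w_re * z) / np.sum(w_re)
print(f"pooled r (random effects) = {np.tanh(z_re):.2f}, tau^2 = {tau2:.3f}")
```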

  1. Analysis of the Requirements Generation Process for the Logistics Analysis and Wargame Support Tool

    Science.gov (United States)

    2017-06-01

    This thesis analyzes the requirements generation process for the Logistics Analysis and Wargame Support Tool, covering impacts from strategic logistics operations down to energy demands at the company level, the force structure involved, and system requirements such as determining the efficiency of the logistics network with respect to an estimated cost of fuel used for delivery. (Thesis by Jonathan M. Swan, June 2017.)

  2. Research and analysis of modernization processes in food industry enterprises of Ukraine

    Directory of Open Access Journals (Sweden)

    Buzhimska K.O.

    2017-08-01

    The modernization of domestic enterprises is a prerequisite for the integration of Ukraine into the European Union; first of all this concerns food industry enterprises, because they have the greatest potential for access to European markets with their own products. Accelerated modernization will provide an opportunity to improve the quality and safety of domestic food products and bring them closer to world standards. The methods and methodology of economic and statistical analysis remain the focus of scholars. The analysis of trends, directions of development and results of activities creates a basis for quality management decisions at both the strategic and operational levels. The study of the modernization process is impossible without using methods of economic and statistical analysis for a general evaluation of its state and efficiency. The paper proposes relative indicators of the dynamics of asset value, residual value of fixed assets, sales volume, financial results before taxation and net profit for a generalized assessment of the modernization process. It is substantiated that the modernization process is effective if the growth rate of asset value is greater than one, the growth rate of the residual value of fixed assets exceeds the growth rate of assets, the growth rate of sales exceeds the growth rate of the residual value of fixed assets, the growth rate of financial results before taxation exceeds the growth rate of sales, and the growth rate of net profit exceeds the growth rate of the financial result before taxation. Using the Spearman coefficient, the authors obtained the following results: the modernization process was most effective in 2011-2012; modernization processes in the food industry slowed sharply during 2013-2015, but due to the potential already formed they continue, confirming the integral indices of the state and efficiency of
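
    The article's effectiveness criterion is a chain of growth-rate inequalities, and the integral assessment uses rank correlation. A toy check of both, with invented year-over-year ratios:

```python
# Check the chain of growth-rate inequalities, plus a Spearman rank
# correlation against the ideal ordering. Ratios are invented examples.
from scipy.stats import spearmanr

rates = {                      # growth ratios, current year / previous year
    "assets": 1.06,
    "fixed_assets_residual": 1.09,
    "sales": 1.12,
    "pretax_result": 1.15,
    "net_profit": 1.18,
}
order = list(rates)
effective = rates["assets"] > 1.0 and all(
    rates[order[i + 1]] > rates[order[i]] for i in range(len(order) - 1)
)
print("modernisation effective:", effective)

rho = spearmanr(range(len(order)), [rates[k] for k in order]).correlation
print("Spearman rho vs ideal ordering:", round(rho, 2))
```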

  3. Finite element analysis of the combined fine blanking and extrusion process

    Science.gov (United States)

    Zheng, Peng-Fei

    The combined fine blanking and extrusion process is a metal forming process in which fine blanking and forward extrusion are carried out on sheet metal material at the same time. There are two typical characteristics of this process: one is fine blanking, whose deformation mechanism is different from conventional blanking; the other is sheet metal extrusion, which is different from conventional extrusion. Even though fine blanking has been used in industry for many years, only limited literature deals with its theoretical analysis, and no publications on the theoretical analysis of sheet metal extrusion have been found. Intensive work is needed to reveal the mechanism of the fine blanking process and the sheet metal extrusion process, and further of the combined fine blanking and extrusion process. The scope of this thesis is to study the mechanics of fine blanking, sheet metal extrusion, and the combined fine blanking and extrusion process, one by one, with the rigid-plastic finite element method. All of the above are typical unsteady processes, especially fine blanking, in which extremely severe and localized deformation occurs; therefore, commercial programs could not be used to solve these problems. For this reason, a rigid-plastic finite element program was developed for simulating these processes, in which remeshing and mesh tracing techniques as well as the golden section method were adopted according to the characteristics of the processes. Moreover, a kinematically admissible velocity field was adopted as the initial velocity field for successfully simulating the extrusion process. Results from the simulation include the distorted mesh, the material flow field, and the stress and strain distributions at various moments of deformation. Results under different deformation conditions such as different blanking clearances, different diameters of the extrusion punch and

  4. The analysis of anode sludges, and their process solutions and beneficiation products

    International Nuclear Information System (INIS)

    Dixon, K.; Russell, G.M.; Wall, G.J.; Eddy, B.T.; Mallett, R.C.; Royal, S.J.

    1979-01-01

    As previous methods for the analysis of anode slimes have required lengthy separations, instrumental procedures were developed that require no preparation of the sample or only simple procedures such as acid digestion and fusion. Comparative values for various techniques are given. Methods for the analysis of process solutions and beneficiation products are examined and the procedures that have been adopted together with their relative merits and applicability are discussed. Methods of analysis include: atomic-absorption spectrophotometry, x-ray-fluorescence spectrophotometry, x-ray-fluorescence spectrometry, instrumental neutron-activation analysis and optical emission spectrometry

  5. Methodology to analysis of aging processes of containment spray system

    International Nuclear Information System (INIS)

    Borges, D. da Silva; Lava, D.D.; Moreira, M. de L.; Ferreira Guimarães, A.C.; Fernandes da Silva, L.

    2015-01-01

    This paper presents a contribution to the study of the aging process of components in commercial Pressurized Water Reactor (PWR) plants. The motivation for this work emerged from the current nuclear perspective: numerous nuclear power plants worldwide have reached an advanced operating time. This situation requires a process to ensure the reliability of the operating systems of these plants, and therefore methodologies capable of estimating the failure probability of components and systems are necessary. In addition to the safety factors involved, such methodologies can be used to seek ways to extend the life cycle of nuclear plants, which would otherwise undergo decommissioning after an operating time of 40 years; decommissioning negatively affects power generation and demands an enormous investment. Thus, this paper presents modeling techniques and sensitivity analysis which together can generate an estimate of how components that are more sensitive to the aging process will behave during the normal operation cycle of a nuclear power plant. (authors)

  6. Designing discovery learning environments: process analysis and implications for designing an information system

    NARCIS (Netherlands)

    Pieters, Julius Marie; Limbach, R.; de Jong, Anthonius J.M.

    2004-01-01

    A systematic analysis of the design process of authors of (simulation based) discovery learning environments was carried out. The analysis aimed at identifying the design activities of authors and categorising knowledge gaps that they experience. First, five existing studies were systematically

  7. Comprehensive Mass Analysis for Chemical Processes, a Case Study on L-Dopa Manufacture

    Science.gov (United States)

    To evaluate the “greenness” of chemical processes in route selection and process development, we propose a comprehensive mass analysis to inform the stakeholders from different fields. This is carried out by characterizing the mass intensity for each contributing chemical or wast...
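
    Mass intensity, the metric used in such assessments, is simply the total mass of inputs per unit mass of product. A toy sketch with hypothetical stream masses, not the L-Dopa case-study values:

```python
# Mass intensity: MI = total mass of inputs / mass of product (MI = 1 ideal).

inputs_kg = {"reactants": 120.0, "solvents": 450.0, "water": 900.0, "other": 30.0}
product_kg = 100.0

mi_total = sum(inputs_kg.values()) / product_kg
by_stream = {k: v / product_kg for k, v in inputs_kg.items()}
print(f"overall MI = {mi_total:.1f} kg inputs per kg product")
print("contributions:", {k: round(v, 2) for k, v in by_stream.items()})
```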

  8. Leadership in nursing: analysis of the process of choosing the heads.

    Science.gov (United States)

    de Moura, Gisela Maria Schebella Souto; de Magalhaes, Ana Maria Müller; Dall'agnol, Clarice Maria; Juchem, Beatriz Cavalcanti; Marona, Daniela dos Santos

    2010-01-01

    The process of choosing heads can be strategic to achieve desired results in nursing care. This study presents an exploratory and descriptive research that aims to analyze the process of choosing heads for the ward, in the nursing area of a teaching hospital in Porto Alegre. Data was collected from registered nurses, technicians and nursing auxiliaries through a semi-structured interview technique and free choice of words. Three theme categories emerged from content analysis: process of choosing heads, managerial competences of the head-to-be and team articulation. Leadership was the word most frequently associated with the process of choosing heads. The consultation process for the choice of the leader also contributes to the success of the manager, as it makes the team members feel co-responsible for the results achieved and legitimizes the head-to-be in their group.

  9. The PWI [plutonium waste incinerator] expert system: Real time, PC-based process analysis

    International Nuclear Information System (INIS)

    Brown, K.G.; Smith, F.G.

    1987-01-01

    A real time, microcomputer-based expert system is being developed for a prototype plutonium waste incinerator (PWI) process at Du Pont's Savannah River Laboratory. The expert system will diagnose instrumentation problems, assist operator training, serve as a repository for engineering knowledge about the process, and provide continuous operation and performance information. A set of necessary operational criteria was developed from process and engineering constraints; it was used to define hardware and software needs. The most important criterion is operating speed because the analysis operates in real time. TURBO PROLOG by Borland International was selected. The analysis system is divided into three sections: the user-system interface, the inference engine and rule base, and the files representing the blackboard information center

  10. High Performance Processing and Analysis of Geospatial Data Using CUDA on GPU

    Directory of Open Access Journals (Sweden)

    STOJANOVIC, N.

    2014-11-01

    In this paper, the high-performance processing of massive geospatial data on a many-core GPU (Graphics Processing Unit) is presented. We use the CUDA (Compute Unified Device Architecture) programming framework to implement parallel processing of common Geographic Information Systems (GIS) algorithms, such as viewshed analysis and map-matching. Experimental evaluation indicates an improvement in performance with respect to CPU-based solutions and shows the feasibility of using GPU and CUDA for parallel implementation of GIS algorithms over large-scale geospatial datasets.
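
    For reference, the sketch below is a serial Python version of the line-of-sight test that viewshed implementations typically parallelise on the GPU (one thread per ray or cell); the elevation profile is invented:

```python
# Serial reference for the viewshed line-of-sight test: a cell is visible
# if its elevation angle exceeds the running maximum along the ray.

def visible_along_profile(elev, observer_height=1.7):
    """Visibility flags for cells along a single ray from cell 0."""
    eye = elev[0] + observer_height
    out, max_tan = [], float("-inf")
    for dist, h in enumerate(elev[1:], start=1):
        tan_angle = (h - eye) / dist       # distance in cell units
        out.append(tan_angle > max_tan)
        max_tan = max(max_tan, tan_angle)
    return out

print(visible_along_profile([100, 101, 105, 103, 108, 104]))
# GPU versions assign each ray (or each cell) to a thread, which is what
# yields the reported speed-ups over CPU-based GIS implementations.
```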

  11. Bayesian analysis of log Gaussian Cox processes for disease mapping

    DEFF Research Database (Denmark)

    Benes, Viktor; Bodlák, Karel; Møller, Jesper

    We consider a data set of locations where people in Central Bohemia have been infected by tick-borne encephalitis, and where population census data and covariates concerning vegetation and altitude are available. The aims are to estimate the risk map of the disease and to study the dependence of the risk on the covariates. Instead of using the common area level approaches we consider a Bayesian analysis for a log Gaussian Cox point process with covariates. Posterior characteristics for a discretized version of the log Gaussian Cox process are computed using Markov chain Monte Carlo methods…

  12. Data Processing and Analysis Systems for JT-60U

    International Nuclear Information System (INIS)

    Matsuda, T.; Totsuka, T.; Tsugita, T.; Oshima, T.; Sakata, S.; Sato, M.; Iwasaki, K.

    2002-01-01

    The JT-60U data processing system is a large computer complex gradually modernized by utilizing progressive computer and network technology. A main computer using state-of-the-art CMOS technology can handle ∼550 MB of data per discharge. A gigabit ethernet switch with FDDI ports has been introduced to cope with the increase of handling data. Workstation systems with VMEbus serial highway drivers for CAMAC have been developed and used to replace many minicomputer systems. VMEbus-based fast data acquisition systems have also been developed to enlarge and replace a minicomputer system for mass data.The JT-60U data analysis system is composed of a JT-60U database server and a JT-60U analysis server, which are distributed UNIX servers. The experimental database is stored in the 1TB RAID disk of the JT-60U database server and is composed of ZENKEI and diagnostic databases. Various data analysis tools are available on the JT-60U analysis server. For the remote collaboration, technical features of the data analysis system have been applied to the computer system to access JT-60U data via the Internet. Remote participation in JT-60U experiments has been successfully conducted since 1996

  13. Process and technoeconomic analysis of leading pretreatment technologies for lignocellulosic ethanol production using switchgrass.

    Science.gov (United States)

    Tao, Ling; Aden, Andy; Elander, Richard T; Pallapolu, Venkata Ramesh; Lee, Y Y; Garlock, Rebecca J; Balan, Venkatesh; Dale, Bruce E; Kim, Youngmi; Mosier, Nathan S; Ladisch, Michael R; Falls, Matthew; Holtzapple, Mark T; Sierra, Rocio; Shi, Jian; Ebrik, Mirvat A; Redmond, Tim; Yang, Bin; Wyman, Charles E; Hames, Bonnie; Thomas, Steve; Warner, Ryan E

    2011-12-01

    Six biomass pretreatment processes to convert switchgrass to fermentable sugars and ultimately to cellulosic ethanol are compared on a consistent basis in this technoeconomic analysis. The six pretreatment processes are ammonia fiber expansion (AFEX), dilute acid (DA), lime, liquid hot water (LHW), soaking in aqueous ammonia (SAA), and sulfur dioxide-impregnated steam explosion (SO(2)). Each pretreatment process is modeled in the framework of an existing biochemical design model so that systematic variations of process-related changes are consistently captured. The pretreatment area process design and simulation are based on the research data generated within the Biomass Refining Consortium for Applied Fundamentals and Innovation (CAFI) 3 project. Overall ethanol production, total capital investment, and minimum ethanol selling price (MESP) are reported along with selected sensitivity analysis. The results show limited differentiation between the projected economic performances of the pretreatment options, except for processes that exhibit significantly lower monomer sugar and resulting ethanol yields.

  14. The Numerical and Experimental Analysis of Ballizing Process of Steel Tubes

    Directory of Open Access Journals (Sweden)

    Dyl T.

    2017-06-01

    Full Text Available This paper presents selected results of experimental and numerical research on the ballizing process for steel tubes. Ballizing is a burnishing method for finishing an internal diameter by forcing a precisely sized ball through a slightly undersized, pre-machined tube. It is a fast, low-cost process for sizing and finishing tubes: a slightly oversized ball is pressed through an unfinished tube to quickly bring the hole to the desired size. The ball is typically made from a very hard material such as tungsten carbide or bearing steel. Ballizing cold-works the surface by plastic forming, leaving a harder surface layer and reducing roughness. Theoretical and experimental analysis showed that the smaller the ball diameter, the greater the intensity of stress, strain, and strain rate. The paper presents the influence of the ballizing process on the strain and stress state and on the surface roughness reduction of the steel tubes.

  15. Efficiency analysis of wood processing industry in China during 2006-2015

    Science.gov (United States)

    Zhang, Kun; Yuan, Baolong; Li, Yanxuan

    2018-03-01

    The wood processing industry is an important industry that affects the national economy and social development. Data envelopment analysis (DEA) is a quantitative evaluation method for studying industrial efficiency. In this paper, the wood processing industry of 8 provinces in southern China is taken as the study object; the efficiency of each province from 2006 to 2015 was measured with the DEA method, and efficiency changes, technological changes, and the Malmquist index were analyzed dynamically. The empirical results show a widening gap in the efficiency of the wood processing industry across the 8 provinces, and that technological progress has lagged in promoting the industry. Based on these conclusions, and given the state of wood processing industry development at home and abroad, the government should introduce policies to strengthen the technology innovation policy system and the coordinated development system of the wood processing industry.
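
    As an illustration of the DEA building block behind such efficiency scores, the sketch below solves the input-oriented CCR envelopment linear program with scipy; the two-input, one-output data are hypothetical, and the Malmquist index used in the paper (which compares scores across periods) is not reproduced.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(X, Y, k):
    """Input-oriented CCR efficiency of DMU k (envelopment form).

    X: (m inputs  x n DMUs) input matrix
    Y: (s outputs x n DMUs) output matrix
    Solves: min theta  s.t.  X @ lam <= theta * X[:, k],
                             Y @ lam >= Y[:, k],  lam >= 0.
    """
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                 # decision vector [theta, lam]
    A_in = np.hstack([-X[:, [k]], X])           # X @ lam - theta * x_k <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])   # -Y @ lam <= -y_k
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[:, k]],
                  bounds=[(0, None)] * (n + 1))
    return res.x[0]

# Hypothetical data: 2 inputs, 1 output, 4 provinces (DMUs)
X = np.array([[4.0, 6.0, 8.0, 5.0],
              [3.0, 2.0, 4.0, 6.0]])
Y = np.array([[2.0, 3.0, 4.0, 2.5]])
for k in range(X.shape[1]):
    print(f"province {k}: technical efficiency = {dea_ccr_efficiency(X, Y, k):.3f}")
```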

  16. Modeling a production scale milk drying process: parameter estimation, uncertainty and sensitivity analysis

    DEFF Research Database (Denmark)

    Ferrari, A.; Gutierrez, S.; Sin, Gürkan

    2016-01-01

    A steady state model for a production scale milk drying process was built to help process understanding and optimization studies. It involves a spray chamber and also internal/external fluid beds. The model was subjected to a comprehensive statistical analysis for quality assurance using...

  17. Formal Specification and Automatic Analysis of Business Processes under Authorization Constraints: An Action-Based Approach

    Science.gov (United States)

    Armando, Alessandro; Giunchiglia, Enrico; Ponta, Serena Elisa

    We present an approach to the formal specification and automatic analysis of business processes under authorization constraints based on the action language C. The use of C allows for a natural and concise modeling of the business process and the associated security policy, and for the automatic analysis of the resulting specification by using the Causal Calculator (CCALC). Our approach improves upon previous work by greatly simplifying the specification step while retaining the ability to perform a fully automatic analysis. To illustrate the effectiveness of the approach we describe its application to a version of a business process taken from the banking domain and use CCALC to determine resource allocation plans complying with the security policy.

  18. Energetic analysis of fruit juice processing operations in Nigeria

    Energy Technology Data Exchange (ETDEWEB)

    Waheed, M.A.; Imeokparia, O.E. [Ladoke Akintola University of Technology, Ogbomoso, Oyo State (Nigeria). Mechanical Engineering Department; Jekayinfa, S.O.; Ojediran, J.O. [Ladoke Akintola University of Technology, Ogbomoso, Oyo State (Nigeria). Agricultural Engineering Department

    2008-01-15

    Energy and exergy studies were conducted in an orange juice manufacturing plant in Nigeria to determine the energy consumption pattern and methods of energy optimization for the company. An adaptation of the process analysis method of energy accounting was used to evaluate the energy requirement for each of the eight defined unit operations. The types of energy used in the manufacturing of orange juice were electrical, steam and manual, with respective proportions of 18.51%, 80.91% and 0.58% of the total energy. It was estimated that an average energy intensity of 1.12 MJ/kg was required for the manufacturing of orange juice. The most energy-intensive operation was the pasteurizer, followed by the packaging unit, with energy intensities of 0.932 and 0.119 MJ/kg, respectively. The exergy analysis revealed that the pasteurizer was responsible for most of the inefficiency (over 90%), followed by packaging (6.60%). It was suggested that the capacity of the pasteurizer be increased to reduce the level of inefficiency of the plant. The suggestion is limited to equipment modification rather than process alteration, since the latter constitutes additional investment cost and may not be economical from an energy savings perspective. (author)

  19. An analysis of the process and results of manual geocode correction

    Science.gov (United States)

    McDonald, Yolanda J.; Schwind, Michael; Goldberg, Daniel W.; Lampley, Amanda; Wheeler, Cosette M.

    2018-01-01

    Geocoding is the science and process of assigning geographical coordinates (i.e. latitude, longitude) to a postal address. The quality of the geocode can vary dramatically depending on several variables, including incorrect input address data, missing address components, and spelling mistakes. A dataset with a considerable number of geocoding inaccuracies can potentially result in an imprecise analysis and invalid conclusions. There has been little quantitative analysis of the amount of effort (i.e. time) to perform geocoding correction, and how such correction could improve geocode quality type. This study used a low-cost and easy to implement method to improve geocode quality type of an input database (i.e. addresses to be matched) through the processes of manual geocode intervention, and it assessed the amount of effort to manually correct inaccurate geocodes, reported the resulting match rate improvement between the original and the corrected geocodes, and documented the corresponding spatial shift by geocode quality type resulting from the corrections. Findings demonstrated that manual intervention of geocoding resulted in a 90% improvement of geocode quality type, took 42 hours to process, and the spatial shift ranged from 0.02 to 151,368 m. This study provides evidence to inform research teams considering the application of manual geocoding intervention that it is a low-cost and relatively easy process to execute. PMID:28555477

  20. An analysis of the process and results of manual geocode correction

    Directory of Open Access Journals (Sweden)

    Yolanda J. McDonald

    2017-05-01

    Full Text Available Geocoding is the science and process of assigning geographical coordinates (i.e. latitude, longitude) to a postal address. The quality of the geocode can vary dramatically depending on several variables, including incorrect input address data, missing address components, and spelling mistakes. A dataset with a considerable number of geocoding inaccuracies can potentially result in an imprecise analysis and invalid conclusions. There has been little quantitative analysis of the amount of effort (i.e. time) to perform geocoding correction, and how such correction could improve geocode quality type. This study used a low-cost and easy to implement method to improve geocode quality type of an input database (i.e. addresses to be matched) through the processes of manual geocode intervention, and it assessed the amount of effort to manually correct inaccurate geocodes, reported the resulting match rate improvement between the original and the corrected geocodes, and documented the corresponding spatial shift by geocode quality type resulting from the corrections. Findings demonstrated that manual intervention of geocoding resulted in a 90% improvement of geocode quality type, took 42 hours to process, and the spatial shift ranged from 0.02 to 151,368 m. This study provides evidence to inform research teams considering the application of manual geocoding intervention that it is a low-cost and relatively easy process to execute.

  1. Analysis of ex situ processes of CO2 sequestration. Final report

    International Nuclear Information System (INIS)

    Touze, S.; Bourgeois, F.; Baranger, P.; Durst, P.

    2004-01-01

    The aim of this study is to provide quantitative elements for evaluating the viability of CO2 mineral sequestration as a means of limiting greenhouse gas emissions. The analysis computes the CO2 balance of the system (internal energy production against the energy expended), i.e. CO2 sequestered versus CO2 produced. The first part details the possible experimental solutions; two carbonation processes, direct and indirect, were then chosen for the analysis. (A.L.B.)

  2. On-Line GIS Analysis and Image Processing for Geoportal Kielce/poland Development

    Science.gov (United States)

    Hejmanowska, B.; Głowienka, E.; Florek-Paszkowski, R.

    2016-06-01

    GIS databases are widely available on the Internet, but mainly for visualization with limited functionality; only very simple queries are possible, e.g. attribute query, coordinate readout, line and area measurements, or pathfinding. Slightly more complex analyses (e.g. buffering or intersection) are rarely offered. This paper addresses the development of Geoportal functionality in the field of GIS analysis. Multi-Criteria Evaluation (MCE) is planned to be implemented in a web application, with OGC services used for data acquisition from the server and for visualization of results, and advanced GIS analysis implemented in PostGIS and Python. In the paper an example of MCE analysis based on Geoportal Kielce is presented. Another field where the Geoportal can be developed is the processing of newly available free satellite imagery (Sentinel-2, Landsat 8, ASTER, WV-2). We are now witnessing a revolution in free access to satellite imagery, which should increase interest in the use of these data in various fields by a larger number of users, not necessarily specialists in remote sensing. It therefore seems reasonable to expand the functionality of Internet tools for data processing by non-specialists, by automating data collection and preparing predefined analyses.
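
    As a minimal sketch of the Multi-Criteria Evaluation step, the NumPy code below normalizes several criterion rasters and combines them as a weighted sum; the criteria, weights, and benefit/cost flags are hypothetical and stand in for whatever layers the Geoportal would expose.

```python
import numpy as np

def minmax_normalize(layer, benefit=True):
    """Rescale a (non-constant) criterion raster to [0, 1]; invert for costs."""
    lo, hi = layer.min(), layer.max()
    scaled = (layer - lo) / (hi - lo)
    return scaled if benefit else 1.0 - scaled

def weighted_overlay(layers, weights, benefit_flags):
    """Weighted-sum multi-criteria evaluation over raster criteria."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                      # make the weights sum to 1
    score = np.zeros_like(layers[0], dtype=float)
    for layer, wi, benefit in zip(layers, w, benefit_flags):
        score += wi * minmax_normalize(layer, benefit)
    return score

# Hypothetical criteria rasters: slope (cost), distance to roads (cost),
# population density (benefit)
rng = np.random.default_rng(3)
slope, dist, pop = (rng.uniform(0, 1, (100, 100)) for _ in range(3))
suitability = weighted_overlay([slope, dist, pop],
                               weights=[0.5, 0.3, 0.2],
                               benefit_flags=[False, False, True])
print("best cell:", np.unravel_index(suitability.argmax(), suitability.shape))
```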

  3. ON-LINE GIS ANALYSIS AND IMAGE PROCESSING FOR GEOPORTAL KIELCE/POLAND DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    B. Hejmanowska

    2016-06-01

    Full Text Available GIS databases are widely available on the Internet, but mainly for visualization with limited functionality; only very simple queries are possible, e.g. attribute query, coordinate readout, line and area measurements, or pathfinding. Slightly more complex analyses (e.g. buffering or intersection) are rarely offered. This paper addresses the development of Geoportal functionality in the field of GIS analysis. Multi-Criteria Evaluation (MCE) is planned to be implemented in a web application, with OGC services used for data acquisition from the server and for visualization of results, and advanced GIS analysis implemented in PostGIS and Python. In the paper an example of MCE analysis based on Geoportal Kielce is presented. Another field where the Geoportal can be developed is the processing of newly available free satellite imagery (Sentinel-2, Landsat 8, ASTER, WV-2). We are now witnessing a revolution in free access to satellite imagery, which should increase interest in the use of these data in various fields by a larger number of users, not necessarily specialists in remote sensing. It therefore seems reasonable to expand the functionality of Internet tools for data processing by non-specialists, by automating data collection and preparing predefined analyses.

  4. Diffusing wave spectroscopy applied to material analysis and process control

    International Nuclear Information System (INIS)

    Lloyd, Christopher James

    1997-01-01

    Diffusing Wave Spectroscopy (DWS) was studied as a method of laboratory analysis of sub-micron particles, and developed as a prospective in-line industrial process control sensor capable of near real-time feedback. No sample pre-treatment was required and measurement was via a non-invasive, flexible, dip-in probe. DWS relies on the concept of the diffusive migration of light, as opposed to the ballistic scatter model used in conventional dynamic light scattering. The specific requirements of the optoelectronic hardware, data analysis methods and light scattering model were studied experimentally and, where practical, theoretically, resulting in a novel technique for the analysis of particle suspensions and emulsions with volume fractions between 0.01 and 0.4. Operation at high concentrations made the technique insensitive to dust and contamination. The pure homodyne (autodyne) experimental arrangement described was resilient to environmental disturbances, unlike many other systems which utilise optical fibres or heterodyne operation. Pilot and subsequent prototype development led to a highly accurate method of size ranking, suitable for analysis of a wide range of suspensions and emulsions. The technique was shown to operate on real industrial samples with statistical variance as low as 0.3% and minimal software processing. Whilst the application studied was the analysis of TiO2 suspensions, a diverse range of materials including polystyrene beads, cell pastes and industrial cutting fluid emulsions were tested. Results suggest that, whilst all sizing should be comparative to suitable standards, concentration effects may be minimised and even completely modelled out in many applications. Adhesion to the optical probe was initially a significant problem but was minimised after the evaluation and use of suitable non-stick coating materials. Unexpected behaviour in the correlation in the region of short decay times led to consideration of the effects of rotational diffusion.

  5. Processing methods for differential analysis of LC/MS profile data

    Directory of Open Access Journals (Sweden)

    Orešič Matej

    2005-07-01

    Full Text Available Abstract Background Liquid chromatography coupled to mass spectrometry (LC/MS) has been widely used in proteomics and metabolomics research. In this context, the technology has been increasingly used for differential profiling, i.e. broad screening of biomolecular components across multiple samples in order to elucidate the observed phenotypes and discover biomarkers. One of the major challenges in this domain remains development of better solutions for processing of LC/MS data. Results We present a software package MZmine that enables differential LC/MS analysis of metabolomics data. This software is a toolbox containing methods for all data processing stages preceding differential analysis: spectral filtering, peak detection, alignment and normalization. Specifically, we developed and implemented a new recursive peak search algorithm and a secondary peak picking method for improving already aligned results, as well as a normalization tool that uses multiple internal standards. Visualization tools enable comparative viewing of data across multiple samples. Peak lists can be exported into other data analysis programs. The toolbox has already been utilized in a wide range of applications. We demonstrate its utility on an example of metabolic profiling of Catharanthus roseus cell cultures. Conclusion The software is freely available under the GNU General Public License and it can be obtained from the project web page at: http://mzmine.sourceforge.net/.
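
    MZmine's recursive peak search is not reproduced here, but the sketch below illustrates the same processing stages on a synthetic chromatogram trace: smoothing, peak detection, and normalization against an internal standard. Function names and thresholds are assumptions for illustration only.

```python
import numpy as np
from scipy.signal import find_peaks, savgol_filter

def pick_peaks(intensity, min_height=5.0, min_prominence=3.0):
    """Smooth a chromatogram trace, then detect peaks (a simple stand-in
    for MZmine's recursive peak search, which is not reproduced here)."""
    smooth = savgol_filter(intensity, window_length=11, polyorder=3)
    peaks, _ = find_peaks(smooth, height=min_height, prominence=min_prominence)
    return peaks, smooth[peaks]

def normalize_by_standard(peak_heights, standard_height):
    """Normalize peak heights to an internal standard."""
    return peak_heights / standard_height

# Synthetic trace: three Gaussian peaks plus noise
t = np.linspace(0, 10, 500)
trace = (20 * np.exp(-(t - 2) ** 2 / 0.02) +
         12 * np.exp(-(t - 5) ** 2 / 0.05) +
         30 * np.exp(-(t - 8) ** 2 / 0.01) +
         np.random.default_rng(0).normal(0, 0.5, t.size))
idx, heights = pick_peaks(trace)
print("retention times:", np.round(t[idx], 2))
print("normalized heights:", np.round(normalize_by_standard(heights, heights.max()), 2))
```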

  6. Analysis and control of harmful emissions from combustion processes

    OpenAIRE

    Jafari, Ahmad

    2000-01-01

    This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University. The harmful effects of air pollutants on human beings and the environment have been the major reason for efforts in sampling, analysis and control of their sources. The major pollutants emitted to the atmosphere from stationary combustion processes are nitrogen oxides, inorganic acids, carbon dioxide, carbon monoxide, hydrocarbons and soot. In the current work two methods are developed for sampl...

  7. Data co-processing for extreme scale analysis level II ASC milestone (4745).

    Energy Technology Data Exchange (ETDEWEB)

    Rogers, David; Moreland, Kenneth D.; Oldfield, Ron A.; Fabian, Nathan D.

    2013-03-01

    Exascale supercomputing will embody many revolutionary changes in the hardware and software of high-performance computing. A particularly pressing issue is gaining insight into the science behind the exascale computations. Power and I/O speed constraints will fundamentally change current visualization and analysis workflows. A traditional post-processing workflow involves storing simulation results to disk and later retrieving them for visualization and data analysis. However, at exascale, scientists and analysts will need a range of options for moving data to persistent storage, as the current offline or post-processing pipelines will not be able to capture the data necessary for data analysis of these extreme scale simulations. This Milestone explores two alternate workflows, characterized as in situ and in transit, and compares them. We find each to have its own merits and faults, and we provide information to help pick the best option for a particular use.

  8. A method of signal transmission path analysis for multivariate random processes

    International Nuclear Information System (INIS)

    Oguma, Ritsuo

    1984-04-01

    A method for noise analysis called STP (signal transmission path) analysis is presented as a tool to identify noise sources and their propagation paths in multivariate random processes. The basic idea of the analysis is to identify, via time series analysis, the effective network for signal power transmission among variables in the system and to use this information in the noise analysis. In the present paper, we accomplish this through two steps of signal processing: first, we estimate, using noise power contribution analysis, the variables which contribute most to the power spectrum of interest; we then evaluate the STPs for each pair of variables to identify those which play a significant role in transmitting the generated noise to the variable under evaluation. The latter part of the analysis is executed through comparison of the partial coherence function and the newly introduced partial noise power contribution function. This paper presents the procedure of the STP analysis and demonstrates, using simulation data as well as Borssele PWR noise data, its effectiveness for the investigation of noise generation and propagation mechanisms. (author)
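
    The paper's partial coherence and partial noise power contribution functions require a fitted multivariate model and are not reproduced here; as a minimal sketch of the pairwise building block, the code below estimates ordinary coherence between two synthetic process signals with scipy. The signal construction and all parameters are illustrative assumptions.

```python
import numpy as np
from scipy.signal import coherence

# Two synthetic process signals: y is x low-pass filtered and delayed,
# plus independent noise, so the pair shares power mainly at low frequencies.
rng = np.random.default_rng(42)
fs = 100.0                                              # sampling rate, Hz
x = rng.standard_normal(8192)
y = np.convolve(x, np.ones(20) / 20, mode="same")       # crude low-pass
y = np.roll(y, 5) + 0.5 * rng.standard_normal(x.size)   # delay + noise

f, Cxy = coherence(x, y, fs=fs, nperseg=512)
print("mean coherence below 5 Hz:", round(float(Cxy[f < 5.0].mean()), 2))
print("mean coherence above 20 Hz:", round(float(Cxy[f > 20.0].mean()), 2))
```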

  9. Temporal Information Processing and Stability Analysis of the MHSN Neuron Model in DDF

    Directory of Open Access Journals (Sweden)

    Saket Kumar Choudhary

    2016-12-01

    Full Text Available Implementing a neuron-like information processing structure at the hardware level is an active research problem. In this article, we analyze the modified hybrid spiking neuron model (the MHSN model) in the distributed delay framework (DDF) from the point of view of hardware-level implementation. We investigate its temporal information processing capability in terms of the inter-spike-interval (ISI) distribution. We also perform a stability analysis of the MHSN model, in which we compute the nullclines, steady-state solutions, and eigenvalues of the model. During phase plane analysis, we observe that the MHSN model generates limit cycle oscillations, an important phenomenon in many biological processes. The qualitative behavior of these limit cycles does not change with variation in the applied input stimulus; however, delay affects the spiking activity and alters the cycle duration.

  10. Towards a New Approach of the Economic Intelligence Process: Basic Concepts, Analysis Methods and Informational Tools

    Directory of Open Access Journals (Sweden)

    Sorin Briciu

    2009-04-01

    Full Text Available One of the obvious trends in the current business environment is increased competition. In this context, organizations are becoming more and more aware of the importance of knowledge as a key factor in obtaining competitive advantage. A possible solution in knowledge management is Economic Intelligence (EI), which involves the collection, evaluation, processing, analysis, and dissemination of economic data (about products, clients, competitors, etc.) inside organizations. The availability of massive quantities of data, correlated with advances in information and communication technology allowing for the filtering and processing of these data, provides new tools for the production of economic intelligence. The research is focused on innovative aspects of the economic intelligence process (models of analysis, activities, methods and informational tools) and provides practical guidelines for initiating this process. In this paper, we try: (a) to contribute to a coherent view of the economic intelligence process (approaches, stages, fields of application); (b) to describe the most important models of analysis related to this process; (c) to analyze the activities, methods and tools associated with each stage of an EI process.

  11. Proposal of Optimization of the Process of Credit Analysis to the Self-Employment Segment

    Directory of Open Access Journals (Sweden)

    Armando Pereira-López

    2017-06-01

    Full Text Available As part of the banking system transformation process in Cuba, since 2011 credit has been granted to persons authorized to engage in self-employment and other forms of non-state management. These new types of credit are subject to strict risk-analysis rules by the banking institution, which has made the granting process slow and complicated. Accordingly, an analysis of the credit granted to this segment was carried out at Branch 8312 of the Popular Saving Bank, identifying the main limitations and making proposals to improve the process using the process-mapping tool proposed by the MicroSave methodology, so as to reduce the response time to clients' requests and contribute to the provision of more efficient services.

  12. The process of NPP refuelling outage analysis and follow-up

    International Nuclear Information System (INIS)

    Nemec, T.; Savli, S.; Cernilogar Radez, M.; Persic, A.; Pecek, V.; Stritar, A.

    2007-01-01

    Following the outages in 2004 and 2006, the Slovenian Nuclear Safety Administration (SNSA) started the practice of independent outage analysis in the form of an internal report. It includes a comparison of performed activities against the planned schedule, an evaluation of the implementation of design modifications, and an analysis of significant events. The main result of the outage analysis is a list of recommendations and open issues that have been identified. These findings are the basis for an action plan of SNSA activities until the next outage, aimed at eliminating deficiencies found during the outage and further improving outage activities. The established system of outage supervision, together with the final analysis and the long-term action plan, represents an effective continuous safety supervision process by which the regulatory body independently contributes to a higher level of safety culture, both at the licensee and among its own staff. (author)

  13. A process insight repository supporting process optimization

    OpenAIRE

    Vetlugin, Andrey

    2012-01-01

    Existing solutions for the analysis and optimization of manufacturing processes, such as online analytical processing or statistical calculations, have shortcomings that limit continuous process improvement. In particular, they lack means of storing and integrating the results of analysis, so valuable information that could be used for process optimization is used only once and then discarded. The goal of the Advanced Manufacturing Analytics (AdMA) research project is to design an integrate...

  14. Stochastic Analysis of a Queue Length Model Using a Graphics Processing Unit

    Czech Academy of Sciences Publication Activity Database

    Přikryl, Jan; Kocijan, J.

    2012-01-01

    Roč. 5, č. 2 (2012), s. 55-62 ISSN 1802-971X R&D Projects: GA MŠk(CZ) MEB091015 Institutional support: RVO:67985556 Keywords : graphics processing unit * GPU * Monte Carlo simulation * computer simulation * modeling Subject RIV: BC - Control Systems Theory http://library.utia.cas.cz/separaty/2012/AS/prikryl-stochastic analysis of a queue length model using a graphics processing unit.pdf
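
    The record carries only bibliographic metadata, but its keywords (Monte Carlo simulation of a queue length model on a GPU) suggest the following minimal sketch: independent replications of an M/M/1 queue simulated to a fixed horizon, with the mean queue length estimated across replications. On a GPU, each replication would typically map to one thread; the rates, horizon, and structure below are hypothetical illustrations, not the paper's model.

```python
import numpy as np

def mm1_queue_length(lmbda, mu, horizon, n_rep, seed=0):
    """Monte Carlo estimate of the mean M/M/1 queue length at time `horizon`.

    Each replication steps through exponential event times; on a GPU,
    one thread would run one independent replication.
    """
    rng = np.random.default_rng(seed)
    lengths = np.empty(n_rep)
    for r in range(n_rep):
        t, q = 0.0, 0
        while True:
            rate = lmbda + (mu if q > 0 else 0.0)
            t += rng.exponential(1.0 / rate)
            if t >= horizon:
                break
            # Next event is an arrival with probability lmbda / rate;
            # at q == 0 that probability is 1, so q never goes negative.
            if rng.random() < lmbda / rate:
                q += 1
            else:
                q -= 1
        lengths[r] = q
    return lengths.mean()

# Utilisation rho = 0.8; stationary theory gives mean length rho/(1-rho) = 4
print("simulated:", round(mm1_queue_length(0.8, 1.0, horizon=500.0, n_rep=2000), 2))
```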

  15. Optimization of casting defects analysis with supply chain in cast iron foundry process

    Directory of Open Access Journals (Sweden)

    C. Narayanaswamy

    2016-10-01

    Full Text Available Some foundries, under pressure to meet production targets, ignore rejections. The objective of this paper is to analyze the various defects [1] arising from the molding process in a cast iron foundry. Failure Mode and Effects Analysis (FMEA) in quality control [2-6], with a suitable supply chain for the mold-making process, is applied: rejection rates are identified and analyzed in terms of the Risk Priority Number (RPN) to prioritize attention for each problem. The optimum levels of the selected parameters [7] are obtained in this analysis.
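
    The RPN ranking at the heart of FMEA is simple arithmetic: each failure mode is scored for severity, occurrence, and detection (conventionally on a 1-10 scale) and the product prioritizes attention. The defect names and ratings in the sketch below are hypothetical.

```python
# Minimal FMEA sketch: RPN = Severity x Occurrence x Detection (each rated 1-10).
# Defect names and ratings are hypothetical illustrations.
defects = [
    # (defect, severity, occurrence, detection)
    ("blow hole",      7, 6, 5),
    ("sand inclusion", 5, 8, 4),
    ("cold shut",      8, 3, 6),
    ("shrinkage",      9, 4, 3),
]

ranked = sorted(defects, key=lambda d: d[1] * d[2] * d[3], reverse=True)
for name, s, o, d in ranked:
    print(f"{name:15s} RPN = {s * o * d:4d}")
```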

  16. Software systems for processing and analysis at the NOVA high-energy laser facility

    International Nuclear Information System (INIS)

    Auerbach, J.M.; Montgomery, D.S.; McCauley, E.W.; Stone, G.F.

    1986-01-01

    A typical laser interaction experiment at the NOVA high-energy laser facility produces in excess of 20 Mbytes of digitized data. Extensive processing and analysis of this raw data from a wide variety of instruments is necessary to produce results that can be readily used to interpret the experiment. Using VAX-based computer hardware, software systems have been set up to convert the digitized instrument output into physics quantities describing the experiment. A relational database management system is used to coordinate all levels of processing and analysis. Software development emphasizes structured design, flexibility, automation, and ease of use.

  17. Inverse Analysis to Formability Design in a Deep Drawing Process

    Science.gov (United States)

    Buranathiti, Thaweepat; Cao, Jian

    The deep drawing process is an important process that adds value to flat sheet metal in many industries, and an important concern in its design is formability. This paper presents the connection between formability and inverse analysis (IA), a systematic means of determining an optimal blank configuration for a deep drawing process. IA is presented and explored using a commercial finite element software package. A number of numerical studies on the effect of blank configuration on the quality of a part produced by deep drawing were conducted and analyzed, with the quality of the drawing process evaluated numerically using an explicit incremental nonlinear finite element code. The minimum distance between elemental principal strains and the strain-based forming limit curve (FLC) is defined as the tearing margin, the key performance index (KPI) indicating the quality of the part. The initial blank configuration is shown to play a highly important role in the quality of the product; moreover, if a blank configuration does not deviate greatly from the one obtained from IA, it can still yield a good product. The strain history around the bottom fillet of the part is also observed. The paper concludes that IA is an important part of the design methodology for deep drawing processes.
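
    The tearing-margin KPI described above reduces to a point-to-curve distance computation. The NumPy sketch below evaluates it against a piecewise-linear FLC; the FLC breakpoints and strain states are hypothetical, and the sign convention (negative margin for elements above the curve) is an assumption made for illustration.

```python
import numpy as np

def tearing_margin(strains, flc_minor, flc_major):
    """Minimum distance from element principal strains to a piecewise-linear
    forming limit curve (FLC); a larger margin indicates a safer part.

    strains:   (n, 2) array of (minor, major) principal strains per element
    flc_minor: FLC minor-strain breakpoints (increasing)
    flc_major: FLC major-strain values at those breakpoints
    """
    # Densely resample the FLC so point-to-curve distance is well approximated
    minor = np.linspace(flc_minor.min(), flc_minor.max(), 2000)
    major = np.interp(minor, flc_minor, flc_major)
    curve = np.column_stack([minor, major])
    d = np.linalg.norm(strains[:, None, :] - curve[None, :, :], axis=-1)
    per_element = d.min(axis=1)
    # Elements above the FLC have failed: report a negative margin for them
    failed = strains[:, 1] > np.interp(strains[:, 0], flc_minor, flc_major)
    per_element[failed] *= -1.0
    return per_element.min()

# Hypothetical FLC and strain states from a deep drawing simulation
flc_minor = np.array([-0.2, 0.0, 0.2, 0.4])
flc_major = np.array([0.45, 0.30, 0.38, 0.50])
strains = np.array([[-0.05, 0.18], [0.10, 0.25], [0.02, 0.28]])
print("tearing margin:", round(float(tearing_margin(strains, flc_minor, flc_major)), 3))
```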

  18. Thermal analysis on x-ray tube for exhaust process

    Science.gov (United States)

    Kumar, Rakesh; Rao Ratnala, Srinivas; Veeresh Kumar, G. B.; Shivakumar Gouda, P. S.

    2018-02-01

    It is of great importance in the use of X-rays for medical purposes that the dose given to both the patient and the operator is carefully controlled. There are many types of X-ray tubes used for different applications, based on their capacity and power supply. In the present work the maxi ray 165 tube is analysed for the thermal exhaust process to within ±5% accuracy. The exhaust process is done to remove all air particles and to degas the insert under high vacuum at 2e-05 Torr. The tube envelope is made of Pyrex glass; the target material is 95% tungsten and 5% rhenium, with a melting point of 3350 °C. Various materials are used for the various parts; during operation of the X-ray tube, waste gases are released from these materials at high temperature, which in turn disturbs the flow of electrons. Thus, before the X-ray tube is used in practical applications it has to undergo the exhaust process. We first build the MX 165 model to carry out the thermal analysis, then simulate the bearing temperature profiles with the FE model to match test results to within ±5% accuracy. Finally, the critical protocols required for manufacturing processes such as MF heating, E-beam, seasoning and FT are implemented.

  19. Rapid DNA analysis for automated processing and interpretation of low DNA content samples.

    Science.gov (United States)

    Turingan, Rosemary S; Vasantgadkar, Sameer; Palombo, Luke; Hogan, Catherine; Jiang, Hua; Tan, Eugene; Selden, Richard F

    2016-01-01

    Short tandem repeat (STR) analysis of casework samples with low DNA content includes those resulting from the transfer of epithelial cells from the skin to an object (e.g., cells on a water bottle or the brim of a cap), blood spatter stains, and small bone and tissue fragments. Low DNA content (LDC) samples are important in a wide range of settings, including disaster response teams to assist in victim identification and family reunification, military operations to identify friend or foe, criminal forensics to identify suspects and exonerate the innocent, and medical examiner and coroner offices to identify missing persons. Processing LDC samples requires experienced laboratory personnel, isolated workstations, sophisticated equipment and transport time, and involves complex procedures. We present a rapid DNA analysis system designed specifically to generate STR profiles from LDC samples in field-forward settings by non-technical operators. By performing STR analysis in the field, close to the site of collection, rapid DNA analysis has the potential to increase throughput and to provide actionable information in real time. A Low DNA Content BioChipSet (LDC BCS) was developed and manufactured by injection molding. It was designed to function in the fully integrated Accelerated Nuclear DNA Equipment (ANDE) instrument previously designed for analysis of buccal swab and other high DNA content samples (Investigative Genet. 4(1):1-15, 2013). The LDC BCS performs efficient DNA purification followed by microfluidic ultrafiltration of the purified DNA, maximizing the quantity of DNA available for subsequent amplification and electrophoretic separation and detection of amplified fragments. The system demonstrates accuracy, precision, resolution, signal strength, and peak height ratios appropriate for casework analysis. The LDC rapid DNA analysis system is effective for the generation of STR profiles from a wide range of sample types. The technology broadens the range of sample types...

  20. V-SIPAL - A VIRTUAL LABORATORY FOR SATELLITE IMAGE PROCESSING AND ANALYSIS

    Directory of Open Access Journals (Sweden)

    K. M. Buddhiraju

    2012-09-01

    Full Text Available In this paper a virtual laboratory for Satellite Image Processing and Analysis (v-SIPAL), being developed at the Indian Institute of Technology Bombay, is described. v-SIPAL comprises a set of experiments that are normally carried out by students learning digital processing and analysis of satellite images using commercial software. Currently, the experiments available on the server include Image Viewer, Image Contrast Enhancement, Image Smoothing, Edge Enhancement, Principal Component Transform, Texture Analysis by the Co-occurrence Matrix method, Image Indices, Color Coordinate Transforms, Fourier Analysis, Mathematical Morphology, Unsupervised Image Classification, Supervised Image Classification and Accuracy Assessment. The virtual laboratory includes a theory module for each option of every experiment, a description of the procedure to perform each experiment, the menu to choose and perform the experiment, a module on interpretation of results when performed with a given image and pre-specified options, a bibliography, links to useful internet resources and user feedback. The user can upload his/her own images for performing the experiments and can also reuse outputs of one experiment in another experiment where applicable. Experiments currently under development include georeferencing of images, data fusion, feature evaluation by divergence and J-M distance, image compression, wavelet image analysis and change detection. Additions to the theory module include self-assessment quizzes, audio-video clips on selected concepts, and a discussion of elements of visual image interpretation. v-SIPAL is at the stage of internal evaluation within IIT Bombay and will soon be open to selected educational institutions in India for evaluation.

  1. Neutron-activation analysis of routine mineral-processing samples

    International Nuclear Information System (INIS)

    Watterson, J.; Eddy, B.; Pearton, D.

    1974-01-01

    Instrumental neutron-activation analysis was applied to a suite of typical mineral-processing samples to establish which elements can be rapidly determined in them by this technique. A total of 35 elements can be determined with precisions (from the counting statistics) ranging from better than 1 per cent to approximately 20 per cent. The elements that can be determined have been tabulated together with the experimental conditions, the precision from the counting statistics, and the estimated number of analyses possible per day. With an automated system, this number can be as high as 150 in the most favourable cases.

  2. A digital processing method for the analysis of complex nuclear spectra

    International Nuclear Information System (INIS)

    Madan, V.K.; Abani, M.C.; Bairi, B.R.

    1994-01-01

    This paper describes a digital processing method using frequency power spectra for the analysis of complex nuclear spectra. The power spectra were estimated by employing a modified discrete Fourier transform. The method was applied to observed spectral envelopes. The results for separating closely spaced doublets in nuclear spectra of low statistical precision compared favorably with those obtained using the popular peak-fitting program SAMPO. The paper also describes the limitations of peak-fitting methods, and the advantages of digital processing techniques for type II digital signals, including nuclear spectra. A compact computer program occupying less than 2.5 kBytes of memory was written in BASIC for the processing of observed spectral envelopes. (orig.)
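
    The paper's modified discrete Fourier transform is not reproduced here, but as a minimal sketch of the frequency-domain view it exploits, the code below computes a plain-FFT power spectrum of a synthetic spectral envelope containing a closely spaced doublet. The window choice and peak parameters are illustrative assumptions.

```python
import numpy as np

def power_spectrum(envelope):
    """Plain-FFT power spectrum of a mean-removed, tapered spectral envelope.

    The paper's modified DFT is not reproduced; this is the textbook estimate.
    """
    x = envelope - envelope.mean()
    x = x * np.hanning(x.size)            # taper to reduce spectral leakage
    return np.abs(np.fft.rfft(x)) ** 2

# Synthetic envelope: two closely spaced Gaussian peaks (a doublet) plus noise
ch = np.arange(256)
env = (np.exp(-(ch - 120) ** 2 / 18.0) + 0.8 * np.exp(-(ch - 132) ** 2 / 18.0)
       + np.random.default_rng(7).normal(0, 0.03, ch.size))
ps = power_spectrum(env)
print("dominant non-DC frequency bin:", ps[1:].argmax() + 1)
```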

  3. Waste Analysis Plan for the Waste Receiving and Processing (WRAP) Facility

    International Nuclear Information System (INIS)

    TRINER, G.C.

    1999-01-01

    The purpose of this waste analysis plan (WAP) is to document the waste acceptance process, sampling methodologies, analytical techniques, and overall processes that are undertaken for dangerous, mixed, and radioactive waste accepted for confirmation, nondestructive examination (NDE) and nondestructive assay (NDA), repackaging, certification, and/or storage at the Waste Receiving and Processing Facility (WRAP). Mixed and/or radioactive waste is treated at WRAP, which is located in the 200 West Area of the Hanford Facility, Richland, Washington. Because dangerous waste does not include the source, special nuclear, and by-product material components of mixed waste, radionuclides are not within the scope of this documentation; the information on radionuclides is provided only for general knowledge.

  4. Experimental analysis of influence of different lubricants types on the multi-phase ironing process

    Directory of Open Access Journals (Sweden)

    Milan Djordjević

    2013-05-01

    Full Text Available This paper presents the results of an experimental analysis of the influence of different lubricant types on the multi-phase ironing process. A special tribological model based on the sliding of a metal strip between two contact elements was adopted. The subject of the experimental investigation was the variation of the drawing force, contact pressure and friction coefficient for each of the applied lubricants. The ironing process was conducted in three phases at a constant sliding velocity. The objective of this analysis was to compare all the applied lubricants in order to estimate their quality from the point of view of their applicability in the multi-phase ironing process.

  5. EXPERIMENTAL ANALYSIS OF INFLUENCE OF DIFFERENT LUBRICANTS TYPES ON THE MULTI-PHASE IRONING PROCESS

    Directory of Open Access Journals (Sweden)

    Milan Djordjević

    2013-09-01

    Full Text Available This paper presents the results of an experimental analysis of the influence of different lubricant types on the multi-phase ironing process. A special tribological model based on the sliding of a metal strip between two contact elements was adopted. The subject of the experimental investigation was the variation of the drawing force, contact pressure and friction coefficient for each of the applied lubricants. The ironing process was conducted in three phases at a constant sliding velocity. The objective of this analysis was to compare all the applied lubricants in order to estimate their quality from the point of view of their applicability in the multi-phase ironing process.

  6. Proteomics analysis in frozen horse mackerel previously high-pressure processed.

    Science.gov (United States)

    Pazos, Manuel; Méndez, Lucía; Vázquez, Manuel; Aubourg, Santiago P

    2015-10-15

    The effect of high-pressure processing (HPP) (150, 300 and 450 MPa for 0, 2.5 and 5 min) on total sodium dodecyl sulphate (SDS)-soluble and sarcoplasmic proteins in frozen (-10 °C for 3 months) horse mackerel (Trachurus trachurus) was evaluated. Proteomics tools based on image analysis of SDS-PAGE protein gels and protein identification by tandem mass spectrometry (MS/MS) were applied. Although the total SDS-soluble fraction indicated no important changes induced by HPP, the processing modified the 1-D SDS-PAGE sarcoplasmic patterns in a pressure-dependent manner and exerted a selective effect on particular proteins depending on processing conditions. Thus, application of the highest pressure (450 MPa) provoked a significant degradation of phosphoglycerate mutase 2, glycogen phosphorylase muscle form, pyruvate kinase muscle isozyme, beta-enolase, triosephosphate isomerase and phosphoglucomutase-1. Conversely, protein bands assigned to tropomyosin alpha-1 chain, fast myotomal muscle troponin T and parvalbumin beta 2 increased in intensity after 450-MPa processing.

  7. Uncertainty and sensitivity analysis: Mathematical model of coupled heat and mass transfer for a contact baking process

    DEFF Research Database (Denmark)

    Feyissa, Aberham Hailu; Gernaey, Krist; Adler-Nissen, Jens

    2012-01-01

    Similar to other processes, the modelling of heat and mass transfer during food processing involves uncertainty in the values of input parameters (heat and mass transfer coefficients, evaporation rate parameters, thermo-physical properties, initial and boundary conditions), which leads to uncertainty in the model predictions. The aim of the current paper is to address this uncertainty challenge in the modelling of food production processes using a combination of uncertainty and sensitivity analysis, where the uncertainty analysis and global sensitivity analysis were applied to a heat and mass...
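
    As a generic sketch of the workflow named above, the code below propagates input-parameter uncertainty through a toy model by Monte Carlo sampling and computes standardized regression coefficients (SRC) as a simple global sensitivity measure. The toy model, the parameter distributions, and the choice of SRC rather than the paper's actual methods are all assumptions.

```python
import numpy as np

def toy_model(h, k, T_air):
    """Stand-in for a heat/mass transfer model: predicted centre temperature.
    The real baking model is a PDE system; this algebraic toy keeps the
    uncertainty/sensitivity workflow visible."""
    return T_air * (1.0 - np.exp(-h * k))

rng = np.random.default_rng(5)
n = 5000
# Input uncertainty: heat transfer coefficient, a conductivity-like parameter,
# and air temperature (all distributions are hypothetical)
h = rng.normal(40.0, 4.0, n)
k = rng.normal(0.05, 0.005, n)
T_air = rng.normal(180.0, 5.0, n)
y = toy_model(h, k, T_air)

print(f"prediction: {y.mean():.1f} +/- {y.std():.1f} (Monte Carlo uncertainty)")

# Global sensitivity via standardized regression coefficients
X = np.column_stack([h, k, T_air])
Z = (X - X.mean(0)) / X.std(0)
beta, *_ = np.linalg.lstsq(Z, (y - y.mean()) / y.std(), rcond=None)
for name, b in zip(["h", "k", "T_air"], beta):
    print(f"SRC({name}) = {b:+.2f}")
```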

  8. Trans individuals' facilitative coping: An analysis of internal and external processes.

    Science.gov (United States)

    Budge, Stephanie L; Chin, Mun Yuk; Minero, Laura P

    2017-01-01

    Existing research on trans individuals has primarily focused on their negative experiences and has disproportionately examined coming-out processes and identity development stages. Using a grounded theory approach, this qualitative study sought to examine facilitative coping processes among trans-identified individuals. Facilitative coping was operationalized as processes whereby individuals seek social support, learn new skills, change behaviors to positively adapt, and find alternative means to seek personal growth and acceptance. The sample included 15 participants who self-identified with a gender identity that was different from their assigned sex at birth. Results yielded a total of nine overarching themes: Accepting Support from Others, Actions to Increase Protection, Active Engagement Throughout the Transition Process, Actively Seeking Social Interactions, Engaging in Exploration, Internal Processes Leading to Self-Acceptance, Self-Efficacy, Shifts Leading to Embracing Change and Flexibility, and Utilization of Agency. Based on the analysis, a theoretical model emerged that highlighted the importance of internal and external coping processes in facilitating gender identity development and navigating stressors among trans individuals. Clinical implications focusing on how to implement facilitative coping processes are discussed.

  9. Analysis of pulse-shape discrimination techniques for BC501A using GHz digital signal processing

    International Nuclear Information System (INIS)

    Rooney, B.D.; Dinwiddie, D.R.; Nelson, M.A.; Rawool-Sullivan, Mohini W.

    2001-01-01

    A comparison study of pulse-shape analysis techniques was conducted for a BC501A scintillator using digital signal processing (DSP). In this study, output signals from a preamplifier were input directly into a 1 GHz analog-to-digital converter. The digitized data obtained with this method was post-processed for both pulse-height and pulse-shape information. Several different analysis techniques were evaluated for neutron and gamma-ray pulse-shape discrimination. It was surprising that one of the simplest and fastest techniques resulted in some of the best pulse-shape discrimination results. This technique, referred to here as the Integral Ratio technique, was able to effectively process several thousand detector pulses per second. This paper presents the results and findings of this study for various pulse-shape analysis techniques with digitized detector signals.
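
    As a hedged sketch of the charge-integration idea behind the Integral Ratio technique, the code below computes the tail-to-total integral of each digitized pulse; in BC501A, neutron events carry a larger slow-scintillation component, so their ratio is higher. The pulse shapes, gate position, and noise level are synthetic assumptions, not the study's actual settings.

```python
import numpy as np

def integral_ratio(pulse, gate_start=40, baseline_n=15):
    """Tail-to-total charge ratio of a digitized pulse (Integral Ratio PSD).

    gate_start: sample index where the 'tail' gate opens (detector-specific)
    baseline_n: number of pre-trigger samples used for baseline subtraction
    """
    p = pulse - pulse[:baseline_n].mean()
    return p[gate_start:].sum() / p.sum()

def synth_pulse(tail_fraction, rng, n=220, t0=20):
    """Synthetic two-component scintillation pulse (fast + slow decay),
    starting at sample t0, with a flat pre-trigger baseline."""
    t = np.arange(n, dtype=float) - t0
    td = np.maximum(t, 0.0)
    shape = np.where(t >= 0,
                     (1 - tail_fraction) * np.exp(-td / 4.0)
                     + tail_fraction * np.exp(-td / 40.0),
                     0.0)
    return shape + rng.normal(0.0, 0.002, n)

rng = np.random.default_rng(11)
gammas = [integral_ratio(synth_pulse(0.05, rng)) for _ in range(1000)]
neutrons = [integral_ratio(synth_pulse(0.25, rng)) for _ in range(1000)]
print("gamma-like mean ratio:  ", round(float(np.mean(gammas)), 3))
print("neutron-like mean ratio:", round(float(np.mean(neutrons)), 3))
```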

  10. Performance Analysis of the United States Marine Corps War Reserve Materiel Program Process Flow

    Science.gov (United States)

    2016-12-01

    Acquiring and supplying materiel to deployed forces is a complicated process. This thesis reviews DoD logistics and analyzes the process flow of the United States Marine Corps War Reserve Materiel Program, including a cost/benefit analysis of maintaining inventory, transportation considerations, and whether certain item types prove to be more prone to delays or incur proportionally...

  11. Vibration analysis and vibration damage assessment in nuclear and process equipment

    International Nuclear Information System (INIS)

    Pettigrew, M.J.; Taylor, C.E.; Fisher, N.J.; Yetisir, M.; Smith, B.A.W.

    1997-01-01

    Component failures due to excessive flow-induced vibration still affect the performance and reliability of process and nuclear components. The purpose of this paper is to discuss flow-induced vibration analysis and vibration damage prediction. Vibration excitation mechanisms are described, with particular emphasis on fluidelastic instability. The dynamic characteristics of process and power equipment are explained, and the statistical nature of some parameters, in particular support conditions, is discussed. The prediction of fretting-wear damage is approached from several points of view, and an energy approach to formulating fretting-wear damage is proposed. (author)
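
    Energy approaches of this kind are commonly expressed, in Archard-like form, as a wear volume rate proportional to the normal work-rate at the tube/support interface. The arithmetic sketch below uses that relation with purely hypothetical coefficient and work-rate values; it illustrates the general formulation, not the paper's calibrated model.

```python
# Archard-like "energy" (work-rate) formulation of fretting wear, as a
# hedged sketch: V_dot = K * W_dot_N, where W_dot_N is the normal work-rate
# (contact force times sliding velocity) at the tube/support interface.
# Both numerical values below are hypothetical, not recommended values.

K_FW = 20e-15          # wear coefficient, m^2/N (hypothetical)
work_rate = 1e-3       # normal work-rate at the support, W (= N*m/s)
years = 10.0
seconds = years * 3.156e7

wear_volume = K_FW * work_rate * seconds      # m^3
print(f"wear volume after {years:.0f} years: {wear_volume * 1e9:.2f} mm^3")
```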

  12. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    Science.gov (United States)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: library design, model construction, simulation, and experimentation and analysis. The library design module supports the building of library knowledge, including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode-dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events, in a manner that includes selective inherency of characteristics, through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics and includes the ability to compare log files.

  13. Numerical analysis of stress fields generated by quenching process

    Directory of Open Access Journals (Sweden)

    A. Bokota

    2011-04-01

    Full Text Available The numerical models of tool steel hardening processes presented in this work take into account mechanical phenomena generated by thermal phenomena and phase transformations. In the model of mechanical phenomena, apart from thermal, plastic and structural strains, transformation plasticity was also taken into account. The stress and strain fields are obtained from a Finite Element Method solution of the equilibrium equations in rate form. The thermophysical constants occurring in the constitutive relations depend on temperature and phase composition. For the determination of plastic strain, the Huber-Mises condition with isotropic strengthening was applied, whereas for the determination of transformation plasticity a modified Leblond model was used. In order to evaluate the quality and usefulness of the presented models, a numerical analysis of the stresses and strains associated with the hardening process of a cone-shaped lathe component made of tool steel was carried out.

  14. Computer-aided analysis of cutting processes for brittle materials

    Science.gov (United States)

    Ogorodnikov, A. I.; Tikhonov, I. N.

    2017-12-01

    This paper is focused on 3D computer simulation of cutting processes for brittle materials and silicon wafers. Computer-aided analysis of wafer scribing and dicing is carried out with the ANSYS CAE (computer-aided engineering) software, and a parametric model of the processes is created by means of the internal ANSYS APDL programming language. Different types of tool tip geometry are analyzed to obtain the internal stresses, such as a four-sided pyramid with an included angle of 120° and a tool inclination angle to the normal axis of 15°. The quality of the workpieces after cutting is studied by optical microscopy to verify the FE (finite-element) model. The disruption of the material structure during scribing occurs near the scratch and propagates into the wafer or over its surface at short range. The deformation area along the scratch looks like a ragged band, but the stress width is rather low. The theory of cutting brittle semiconductor and optical materials is developed on the basis of the advanced theory of metal turning. The fall of stress intensity along the normal from the tip point to the scribe line can be predicted using the developed theory and the verified FE model. The crystal quality and dimensions of defects are determined by the mechanics of scratching, which depends on the shape of the diamond tip, the scratching direction, the velocity of the cutting tool and the applied force loads. The disruption is a rate-sensitive process, and it depends on the cutting thickness. The application of numerical techniques, such as FE analysis, to cutting problems enhances understanding and promotes the further development of existing machining technologies.

  15. Digital signal processing and spectral analysis for scientists concepts and applications

    CERN Document Server

    Alessio, Silvia Maria

    2016-01-01

    This book covers the basics of processing and spectral analysis of monovariate discrete-time signals. The approach is practical, the aim being to acquaint the reader with the indications for and drawbacks of the various methods and to highlight possible misuses. The book is rich in original ideas, visualized in new and illuminating ways, and is structured so that parts can be skipped without loss of continuity. Many examples are included, based on synthetic data and real measurements from the fields of physics, biology, medicine, macroeconomics etc., and a complete set of MATLAB exercises requiring no previous experience of programming is provided. Prior advanced mathematical skills are not needed in order to understand the contents: a good command of basic mathematical analysis is sufficient. Where more advanced mathematical tools are necessary, they are included in an Appendix and presented in an easy-to-follow way. With this book, digital signal processing leaves the domain of engineering to address the ne...

  16. Analysis of Work Design in Rubber Processing Plant

    Directory of Open Access Journals (Sweden)

    Wahyuni Dini

    2018-01-01

    Full Text Available Work design describes how jobs, tasks, and roles are structured, defined, and modified, and their impact on individuals, groups, and organizations. If work is not designed well, the company must pay greater costs for workers' health, longer production processes or even penalties for failing to meet the delivery schedule. This is visible in the conditions at a rubber processing factory in North Sumatra. Work design aspects such as layout, machinery and equipment, the workers' physical working environment, work methods, and organizational policies have not been well organized. Machines grinding coagulum into sheets are often damaged, resulting in four product delivery delays in 2016; complaints of heat exposure submitted by workers and improperly arranged workstations are further indications of the need for work design. The research data were collected through field observation and the distribution of questionnaires on aspects of work design. The result of the analysis depends on the respondents' answers to the distributed questionnaire regarding the six aspects studied.

  17. Analysis of Work Design in Rubber Processing Plant

    Science.gov (United States)

    Wahyuni, Dini; Nasution, Harmein; Budiman, Irwan; Wijaya, Khairini

    2018-02-01

    Work design describes how jobs, tasks, and roles are structured, defined, and modified, and their impact on individuals, groups, and organizations. If work is not designed well, the company must pay greater costs for workers' health, longer production processes or even penalties for failing to meet the delivery schedule. This is visible in the conditions at a rubber processing factory in North Sumatra. Work design aspects such as layout, machinery and equipment, the workers' physical working environment, work methods, and organizational policies have not been well organized. Machines grinding coagulum into sheets are often damaged, resulting in four product delivery delays in 2016; complaints of heat exposure submitted by workers and improperly arranged workstations are further indications of the need for work design. The research data were collected through field observation and the distribution of questionnaires on aspects of work design. The result of the analysis depends on the respondents' answers to the distributed questionnaire regarding the six aspects studied.

  18. ACTINIDE REMOVAL PROCESS SAMPLE ANALYSIS, CHEMICAL MODELING, AND FILTRATION EVALUATION

    Energy Technology Data Exchange (ETDEWEB)

    Martino, C.; Herman, D.; Pike, J.; Peters, T.

    2014-06-05

    Filtration within the Actinide Removal Process (ARP) currently limits the throughput in interim salt processing at the Savannah River Site. In this process, batches of salt solution with Monosodium Titanate (MST) sorbent are concentrated by crossflow filtration. The filtrate is subsequently processed to remove cesium in the Modular Caustic Side Solvent Extraction Unit (MCU) followed by disposal in saltstone grout. The concentrated MST slurry is washed and sent to the Defense Waste Processing Facility (DWPF) for vitrification. During recent ARP processing, there has been a degradation of filter performance manifested as the inability to maintain high filtrate flux throughout a multi-batch cycle. The objectives of this effort were to characterize the feed streams, to determine if solids (in addition to MST) are precipitating and causing the degraded performance of the filters, and to assess the particle size and rheological data to address potential filtration impacts. Equilibrium modelling with OLI Analyzer™ and OLI ESP™ was performed to determine chemical components at risk of precipitation and to simulate the ARP process. The performance of ARP filtration was evaluated to review potential causes of the observed filter behavior. Task activities for this study included extensive physical and chemical analysis of samples from the Late Wash Pump Tank (LWPT) and the Late Wash Hold Tank (LWHT) within ARP as well as samples of the tank farm feed from Tank 49H. The samples from the LWPT and LWHT were obtained from several stages of processing of Salt Batch 6D, Cycle 6, Batch 16.

  19. Are breast biopsies adequately funded? A process cost and revenue analysis

    International Nuclear Information System (INIS)

    Hahn, M.; Fischbach, E.; Fehm, T.

    2011-01-01

    Purpose: The objective of the study was to determine whether the various breast biopsy procedures specified in the S3 guidelines are sensibly represented within the current German health system as considered from a cost evaluation perspective. Materials and Methods: This prospectively designed multicenter study analyzed 221 breast biopsies at 7 institutions from 04/2006 to 01/2007. Core needle biopsies, vacuum-assisted biopsies and surgical open biopsies under sonographic or mammographic guidance were evaluated. In an analysis of process costs, the individual process steps were recorded in diagrammatic form and assigned to the true consumption of resources, and the actual resource consumption costs were entered. A process-related breakeven analysis was conducted to check whether the reimbursement of individual biopsy types covers the costs. Results: Only sonographically guided core needle biopsy and surgical open biopsy are adequately reimbursed in the current German health system. All other breast biopsies show a negative profit margin. The principal reasons for underfunding are found in the area of reimbursement of investment and non-personnel costs. Conclusion: The reimbursement of breast biopsies must be improved in order to guarantee nationwide care of the population using the breast biopsy methods recommended in the S3 guidelines and to avoid disincentives with respect to breast biopsy indications. (orig.)

  20. Sensorial analysis evaluation in cereal bars preserved by ionizing radiation processing

    International Nuclear Information System (INIS)

    Villavicencio, A.L.C.H.; Araujo, M.M.; Fanaro, G.B.; Rela, P.R.; Mancini-Filho, J.

    2007-01-01

    The use of gamma rays as a food-processing treatment to eliminate insect contamination is well established in the food industry. Recent problems in the commercialization of Brazilian cereal bars demand special attention from consumers, because some products were contaminated by insects. To solve the problem, food irradiation was employed as a safe and effective treatment, and the final product was free of insect contamination. The aim of this study was to determine the best processing radiation dose for disinfestation and to detect any changes in sensorial characteristics of the cereal bars by sensorial analysis. In this study, three different kinds of cereal bars were purchased in supermarkets in Sao Paulo (Brazil) and irradiated with 1.0, 2.0 and 3.0 kGy at the 'Instituto de Pesquisas Energeticas e Nucleares' (IPEN-CNEN/SP). The samples were treated with ionizing radiation using a 60 Co gamma-ray facility (Gammacell 220, A.E.C.L.). These radiation doses were used successfully as an anti-insect treatment in the cereal bars, since in some food industries doses up to 3.0 kGy are used to guarantee a dose of at least 1.0 kGy inside the cereal bar package. Sensorial analysis was necessary since cereal bars contain ingredients that are very sensitive to the ionizing radiation process

  1. The Recording and Quantification of Event-Related Potentials: II. Signal Processing and Analysis

    Directory of Open Access Journals (Sweden)

    Paniz Tavakoli

    2015-06-01

    Full Text Available Event-related potentials are an informative method for measuring the extent of information processing in the brain. The voltage deflections in an ERP waveform reflect the processing of sensory information as well as higher-level processing that involves selective attention, memory, semantic comprehension, and other types of cognitive activity. ERPs provide a non-invasive method of studying, with exceptional temporal resolution, cognitive processes in the human brain. ERPs are extracted from scalp-recorded electroencephalography by a series of signal processing steps. The present tutorial will highlight several of the analysis techniques required to obtain event-related potentials. Some methodological issues that may be encountered will also be discussed.
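
    A minimal sketch of the extraction pipeline the tutorial describes (band-pass filtering, epoching around stimulus events, baseline correction, averaging); all data and parameter values here are synthetic stand-ins, not from the tutorial:

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt

    fs = 500.0                                     # sampling rate, Hz (assumed)
    eeg = np.random.randn(60 * int(fs))            # 1 min of synthetic single-channel EEG
    events = np.arange(500, len(eeg) - 500, 1000)  # hypothetical stimulus onsets (samples)

    # Band-pass filter; 0.1-30 Hz is a common choice for ERP work
    b, a = butter(4, [0.1 / (fs / 2), 30.0 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, eeg)

    # Epoch from -100 ms to +600 ms around each event, baseline-correct, then average
    pre, post = int(0.1 * fs), int(0.6 * fs)
    epochs = np.array([filtered[e - pre:e + post] for e in events])
    epochs -= epochs[:, :pre].mean(axis=1, keepdims=True)  # remove pre-stimulus baseline
    erp = epochs.mean(axis=0)  # averaging suppresses activity not time-locked to the event
    ```

    The averaging step is what isolates the event-related signal: background EEG that is not time-locked to the stimulus tends to cancel across epochs.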

  2. The governance of higher education regionalisation: comparative analysis of the Bologna Process and MERCOSUR-Educativo

    NARCIS (Netherlands)

    Verger, A.; Hermo, J.P.

    2010-01-01

    The article analyses two processes of higher education regionalisation, MERCOSUR‐Educativo in Latin America and the Bologna Process in Europe, from a comparative perspective. The comparative analysis is centered on the content and the governance of both processes and, specifically, on the reasons of

  3. Funding for the 2ND IAEA technical meeting on fusion data processing, validation and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Greenwald, Martin

    2017-06-02

    The International Atomic Energy Agency (IAEA) will organize the second Technical Meeting on Fusion Data Processing, Validation and Analysis from 30 May to 02 June, 2017, in Cambridge, MA, USA. The meeting will be hosted by the MIT Plasma Science and Fusion Center (PSFC). The objective of the meeting is to provide a platform where a set of topics relevant to fusion data processing, validation and analysis are discussed with a view to the extrapolation needs of next-step fusion devices such as ITER. The validation and analysis of experimental data obtained from diagnostics used to characterize fusion plasmas are crucial for a knowledge-based understanding of the physical processes governing the dynamics of these plasmas. The meeting will aim at fostering, in particular, discussions of research and development results that set out or underline trends observed in the current major fusion confinement devices. General information on the IAEA, including its mission and organization, can be found at the IAEA website. Topics include: uncertainty quantification (UQ); model selection, validation, and verification (V&V); probability theory and statistical analysis; inverse problems and equilibrium reconstruction; integrated data analysis; real-time data analysis; machine learning; signal/image processing and pattern recognition; experimental design and synthetic diagnostics; and data management

  4. Effective Thermal Analysis of Using Peltier Module for Desalination Process

    Directory of Open Access Journals (Sweden)

    Hayder Al-Madhhachi

    2018-01-01

    Full Text Available The key objective of this study is to analyse the heat transfer processes involved in the evaporation and condensation of water in a water distillation system employing a thermoelectric module. This analysis can help to increase the water production and to enhance the system performance. For the analysis, a water distillation unit prototype integrated with a thermoelectric module was designed and fabricated. A theoretical model is developed to study the effect of the heat added, transferred and removed, in forced convection and laminar flow, during the evaporation and condensation processes. The thermoelectric module is used to convert electricity into heat under Peltier effect and control precisely the absorbed and released heat at the cold and hot sides of the module, respectively. Temperatures of water, vapour, condenser, cold and hot sides of the thermoelectric module and water production have been measured experimentally under steady state operation. The theoretical and experimental water production were found to be in agreement. The amount of heat that needs to be evaporated from water-vapour interface and transferred through the condenser surface to the thermoelectric module is crucial for the design and optimization of distillation systems.
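
    The heat the module absorbs and releases can be approximated with the standard thermoelectric relations; the sketch below uses hypothetical module parameters, not values from the study:

    ```python
    # Standard steady-state Peltier module approximation (illustrative parameters)
    S = 0.05                 # module Seebeck coefficient, V/K (assumed)
    R = 2.0                  # module electrical resistance, ohm (assumed)
    K = 0.5                  # module thermal conductance, W/K (assumed)
    I = 3.0                  # drive current, A
    T_c, T_h = 288.0, 313.0  # cold/hot side temperatures, K

    # Heat absorbed at the cold side: Peltier pumping minus Joule and conduction losses
    Q_c = S * T_c * I - 0.5 * I**2 * R - K * (T_h - T_c)
    # Heat released at the hot side, available here to drive evaporation/condensation
    Q_h = S * T_h * I + 0.5 * I**2 * R - K * (T_h - T_c)
    print(f"Q_c = {Q_c:.1f} W, Q_h = {Q_h:.1f} W")
    ```

    Note that Q_h = Q_c plus the electrical power input, which is why precise control of the drive current translates directly into control of the absorbed and released heat.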

  5. The MetabolomeExpress Project: enabling web-based processing, analysis and transparent dissemination of GC/MS metabolomics datasets

    Directory of Open Access Journals (Sweden)

    Carroll Adam J

    2010-07-01

    Full Text Available Abstract Background Standardization of analytical approaches and reporting methods via community-wide collaboration can work synergistically with web-tool development to result in rapid community-driven expansion of online data repositories suitable for data mining and meta-analysis. In metabolomics, the inter-laboratory reproducibility of gas chromatography/mass spectrometry (GC/MS) makes it an obvious target for such development. While a number of web-tools offer access to datasets and/or tools for raw data processing and statistical analysis, none of these systems are currently set up to act as a public repository by easily accepting, processing and presenting publicly submitted GC/MS metabolomics datasets for public re-analysis. Description Here, we present MetabolomeExpress, a new File Transfer Protocol (FTP) server and web-tool for the online storage, processing, visualisation and statistical re-analysis of publicly submitted GC/MS metabolomics datasets. Users may search a quality-controlled database of metabolite response statistics from publicly submitted datasets by a number of parameters (e.g. metabolite, species, organ/biofluid, etc.). Users may also perform meta-analysis comparisons of multiple independent experiments or re-analyse public primary datasets via user-friendly tools for t-test, principal components analysis, hierarchical cluster analysis and correlation analysis. They may interact with chromatograms, mass spectra and peak detection results via an integrated raw data viewer. Researchers who register for a free account may upload (via FTP) their own data to the server for online processing via a novel raw data processing pipeline. Conclusions MetabolomeExpress https://www.metabolome-express.org provides a new opportunity for the general metabolomics community to transparently present online the raw and processed GC/MS data underlying their metabolomics publications. Transparent sharing of these data will allow researchers to

  6. Risk analysis of a biomass combustion process using MOSAR and FMEA methods.

    Science.gov (United States)

    Thivel, P-X; Bultel, Y; Delpech, F

    2008-02-28

    Thermal and chemical conversion processes that convert sewage sludge, pasty waste and other pre-processed waste into energy are increasingly common, for economic and ecological reasons. Fluidized bed combustion is currently one of the most promising methods of energy conversion, since it burns biomass very efficiently, and produces only very small quantities of sulphur and nitrogen oxides. The hazards associated with biomass combustion processes are fire, explosion and poisoning from the combustion gases (CO, etc.). The risk analysis presented in this paper uses the MADS-MOSAR methodology, applied to a semi-industrial pilot scheme comprising a fluidization column, a conventional cyclone, two natural gas burners and a continuous supply of biomass. The methodology uses a generic approach, with an initial macroscopic stage where hazard sources are identified, scenarios for undesired events are recognized and ranked using a grid of Severity x Probability, and safety barriers are suggested. A microscopic stage then analyzes in detail the major risks identified during the first stage. This analysis may use various different tools, such as HAZOP, FMEA, etc.: our analysis is based on FMEA. Using MOSAR, we were able to identify five subsystems: the reactor (fluidized bed and centrifuge), the fuel and biomass supply lines, the operator and the environment. When we drew up scenarios based on these subsystems, we found that malfunction of the gas supply burners was a common trigger in many scenarios. Our subsequent microscopic analysis, therefore, focused on the burners, looking at the ways they failed, and at the effects and criticality of those failures (FMEA). We were, thus, able to identify a number of critical factors such as the incoming gas lines and the ignition electrode.

  7. Risk analysis of a biomass combustion process using MOSAR and FMEA methods

    International Nuclear Information System (INIS)

    Thivel, P.-X.; Bultel, Y.; Delpech, F.

    2008-01-01

    Thermal and chemical conversion processes that convert sewage sludge, pasty waste and other pre-processed waste into energy are increasingly common, for economic and ecological reasons. Fluidized bed combustion is currently one of the most promising methods of energy conversion, since it burns biomass very efficiently, and produces only very small quantities of sulphur and nitrogen oxides. The hazards associated with biomass combustion processes are fire, explosion and poisoning from the combustion gases (CO, etc.). The risk analysis presented in this paper uses the MADS-MOSAR methodology, applied to a semi-industrial pilot scheme comprising a fluidization column, a conventional cyclone, two natural gas burners and a continuous supply of biomass. The methodology uses a generic approach, with an initial macroscopic stage where hazard sources are identified, scenarios for undesired events are recognized and ranked using a grid of Severity x Probability, and safety barriers are suggested. A microscopic stage then analyzes in detail the major risks identified during the first stage. This analysis may use various different tools, such as HAZOP, FMEA, etc.: our analysis is based on FMEA. Using MOSAR, we were able to identify five subsystems: the reactor (fluidized bed and centrifuge), the fuel and biomass supply lines, the operator and the environment. When we drew up scenarios based on these subsystems, we found that malfunction of the gas supply burners was a common trigger in many scenarios. Our subsequent microscopic analysis, therefore, focused on the burners, looking at the ways they failed, and at the effects and criticality of those failures (FMEA). We were, thus, able to identify a number of critical factors such as the incoming gas lines and the ignition electrode

  8. An application of image processing techniques in computed tomography image analysis

    DEFF Research Database (Denmark)

    McEvoy, Fintan

    2007-01-01

    Given the number of animals and image slices, automation of the process was desirable. The open-source and free image analysis program ImageJ was used. A macro procedure was created that provided the required functionality. The macro performs a number of basic image processing procedures. These include an initial process designed to remove the scanning table from the image and to center the animal in the image. This is followed by placement of a vertical line segment from the mid point of the upper border of the image to the image center. Measurements are made between automatically detected outer and inner boundaries of subcutaneous adipose tissue along this line segment. This process was repeated as the image was rotated (with the line position remaining unchanged) so that measurements around the complete circumference were obtained. Additionally, an image was created showing all detected boundary points so...
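
    The original macro was written in ImageJ's macro language; below is a rough Python analogue of its rotate-and-measure idea, assuming a pre-thresholded binary slice in which adipose-tissue pixels are 1 (everything here is an illustrative reconstruction, not the study's code):

    ```python
    import numpy as np
    from scipy.ndimage import rotate

    def fat_depths(binary_slice, angle_step=10):
        """Thickness of the subcutaneous fat layer along a ray from the top edge
        to the image centre, repeated as the image is rotated."""
        depths = []
        cx = binary_slice.shape[1] // 2
        for angle in range(0, 360, angle_step):
            img = rotate(binary_slice, angle, reshape=False, order=0)
            column = img[: img.shape[0] // 2, cx]   # vertical ray, top edge -> centre
            hits = np.flatnonzero(column > 0)       # pixels flagged as adipose tissue
            if hits.size:
                depths.append(hits[-1] - hits[0] + 1)  # outer minus inner boundary
        return depths
    ```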

  9. Process control analysis of IMRT QA: implications for clinical trials

    International Nuclear Information System (INIS)

    Pawlicki, Todd; Rice, Roger K; Yoo, Sua; Court, Laurence E; McMillan, Sharon K; Russell, J Donald; Pacyniak, John M; Woo, Milton K; Basran, Parminder S; Boyer, Arthur L; Bonilla, Claribel

    2008-01-01

    The purpose of this study is two-fold: first, to investigate the process of IMRT QA using control charts and, second, to compare control chart limits to limits calculated using the standard deviation (σ). Head and neck and prostate IMRT QA cases from seven institutions in both academic and community settings are considered. The percent difference between the point dose measurement in phantom and the corresponding result from the treatment planning system (TPS) is used for analysis. The average of the percent difference calculations defines the accuracy of the process and is called the process target. This represents the degree to which the process meets the clinical goal of 0% difference between the measurements and the TPS. IMRT QA process ability defines the ability of the process to meet clinical specifications (e.g. 5% difference between the measurement and the TPS). The process ability is defined in two ways: (1) the half-width of the control chart limits, and (2) the half-width of the ±3σ limits. Process performance is characterized as being in one of four possible states that describe the stability of the process and its ability to meet clinical specifications. For the head and neck cases, the average process target across institutions was 0.3% (range: -1.5% to 2.9%). The average process ability using control chart limits was 7.2% (range: 5.3% to 9.8%) compared to 6.7% (range: 5.3% to 8.2%) using standard deviation limits. For the prostate cases, the average process target across the institutions was 0.2% (range: -1.8% to 1.4%). The average process ability using control chart limits was 4.4% (range: 1.3% to 9.4%) compared to 5.3% (range: 2.3% to 9.8%) using standard deviation limits. Using the standard deviation to characterize IMRT QA process performance resulted in processes being preferentially placed in one of the four states. This is in contrast to using control charts for process characterization, where the IMRT QA processes were spread over three of the
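
    For intuition on why the two kinds of limits differ, an individuals (XmR) chart, one common control chart choice for data like these, derives its limits from the average moving range rather than from σ. A sketch with made-up QA percent differences (the paper does not state that this exact chart type was used at every institution):

    ```python
    import numpy as np

    # Per-plan % differences between phantom measurement and TPS (made-up values)
    qa = np.array([0.4, -1.2, 2.1, 0.8, -0.5, 1.9, 0.1, -2.3, 1.1, 0.6])

    target = qa.mean()                       # process target
    mr_bar = np.abs(np.diff(qa)).mean()      # average moving range
    ucl = target + 2.66 * mr_bar             # XmR chart limits (2.66 = 3/d2, d2 = 1.128)
    lcl = target - 2.66 * mr_bar
    sigma3 = 3 * qa.std(ddof=1)              # conventional +/- 3-sigma half-width

    print(f"chart limits [{lcl:.2f}, {ucl:.2f}], 3-sigma limits "
          f"[{target - sigma3:.2f}, {target + sigma3:.2f}]")
    ```

    The moving-range limits respond only to short-term, point-to-point variation, while the σ limits absorb any drift in the series, which is one reason the two characterizations can place the same process in different states.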

  10. Introducing buffer inventories in the RBD analysis of process production systems

    International Nuclear Information System (INIS)

    Macchi, Marco; Kristjanpoller, Fredy; Garetti, Marco; Arata, Adolfo; Fumagalli, Luca

    2012-01-01

    The throughput analysis is an important issue for the design and operations management of process production lines. The throughput of a line depends on the availability and nominal throughput of its machines. Furthermore, it is influenced by the accumulation of production material stocked in buffers along the line; hence, the buffer inventory level is also a relevant variable that has to be considered when assessing the throughput of the line. The present paper is particularly concerned with using such an assessment for supporting maintenance decisions. The buffer inventory level should provide a proper isolation time before the buffer becomes empty, so that, during this time, a maintenance intervention can be carried out on a failed machine upstream without the effect of the failure propagating to the machines downstream (the so-called 'material starvation'). Likewise, it should guarantee a proper isolation time before the buffer capacity is completely used, so that during this time a maintenance intervention is also possible on a failed machine downstream without the effect of the failure propagating to the machines upstream (the so-called 'blocking of production'). This strategy is particularly interesting in the process industry, where the capital cost of equipment is high and the holding cost of material is low. Hence, the isolation times before reaching 'material starvation' or 'blocking of production' have to be properly studied in order to make an accurate analysis of their effect on the throughput of the line. The present paper provides a model to this end, derived by extending the well-known Reliability Block Diagram (RBD) method, currently used in the normal duties of a maintenance engineer. The usual RBD availability analysis is integrated with a state space analysis through which isolation times can be analysed. Besides, an empirical study – the case of a production line taken out from the
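
    As a toy illustration of the buffer effect (the paper's extended RBD and state-space model is far richer): without a buffer, a two-machine series line stops whenever either machine is down, while a sufficiently large buffer decouples the machines so that the slowest one limits throughput. Availabilities and rates below are invented:

    ```python
    # Two machines in series, equal nominal rate; availabilities are invented
    A1, A2 = 0.92, 0.88
    nominal = 100.0                          # units per hour

    no_buffer = nominal * A1 * A2            # any failure immediately stops the line
    large_buffer = nominal * min(A1, A2)     # buffer decouples machines; slowest limits

    print(f"no buffer: {no_buffer:.1f} u/h, large-buffer bound: {large_buffer:.1f} u/h")
    ```

    Real lines fall between these two classical bounds, and the isolation-time analysis in the paper is what locates a given buffer size within that range.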

  11. Structural analysis of advanced spent fuel conditioning process

    International Nuclear Information System (INIS)

    Gu, J. H.; Jung, W. M.; Jo, I. J.; Gug, D. H.; Yoo, K. S.

    2003-01-01

    An advanced spent fuel conditioning process (ACP) is being developed for the safe and effective management of spent fuel arising from domestic nuclear power plants, and its demonstration facility is under design. This facility will be prepared by modifying the IMEF's reserve hot cell, which was set aside for future use, to suit the characteristics of the ACP. This study presents the basic structural architecture design and analysis results for the ACP hot cell, including the modification of the IMEF. The results of this study will be used for the detailed design of the ACP demonstration facility and utilized as basic data for the licensing of the ACP facility

  12. Analysis of Hazards Associated with a Process Involving Uranium Metal and Uranium Hydride Powders

    Energy Technology Data Exchange (ETDEWEB)

    Bullock, J.S.

    2000-05-01

    An analysis of the reaction chemistry and operational factors associated with processing uranium and uranium hydride powders is presented, focusing on a specific operation in the Development Division which was subjected to the Job Hazard Analysis (JHA) process. Primary emphasis is on the thermodynamic factors leading to pyrophoricity in common atmospheres. The discussion covers feed powders, cold-pressed and hot-pressed materials, and stray material resulting from the operations. The sensitivity of the various forms of material to pyrophoricity in common atmospheres is discussed. Operational recommendations for performing the work described are given.

  13. Influence of oral processing on appetite and food intake - A systematic review and meta-analysis.

    Science.gov (United States)

    Krop, Emma M; Hetherington, Marion M; Nekitsing, Chandani; Miquel, Sophie; Postelnicu, Luminita; Sarkar, Anwesha

    2018-06-01

    Food delivers energy, nutrients and a pleasurable experience. Slow eating and prolonged oro-sensory exposure to food during consumption can enhance the processes that promote satiation. This systematic review and meta-analysis investigated the effects of oral processing on subjective measures of appetite (hunger, desire to eat) and objectively measured food intake. The aim was to investigate the influence of oral processing characteristics, specifically "chewing" and "lubrication", on "appetite" and "food intake". A literature search of six databases (Cochrane library, PubMed, Medline, Food Science and Technology Abstracts, Web of Science, Scopus), yielded 12161 articles which were reduced to a set of 40 articles using pre-specified inclusion and exclusion criteria. A further two articles were excluded from the meta-analysis due to missing relevant data. From the remaining 38 papers, detailing 40 unique studies with 70 subgroups, raw data were extracted for meta-analysis (food intake n = 65, hunger n = 22 and desire to eat ratings n = 15) and analyzed using random effects modelling. Oral processing parameters, such as number of chews, eating rate and texture manipulation, appeared to influence food intake markedly but appetite ratings to a lesser extent. Meta-analysis confirmed a significant effect of the direct and indirect aspects of oral processing that were related to chewing on both self-reported hunger (-0.20 effect size, 95% confidence interval CI: -0.30, -0.11), and food intake (-0.28 effect size, 95% CI: -0.36, -0.19). Although lubrication is an important aspect of oral processing, few studies on its effects on appetite have been conducted. Future experiments using standardized approaches should provide a clearer understanding of the role of oral processing, including both chewing and lubrication, in promoting satiety. Copyright © 2018 Elsevier Ltd. All rights reserved.
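
    A compact sketch of the random-effects pooling step behind such results, using the DerSimonian-Laird estimator; the per-study effect sizes and variances below are invented for illustration, not extracted from the review:

    ```python
    import numpy as np

    y = np.array([-0.31, -0.15, -0.42, -0.20])   # per-study effect sizes (hypothetical)
    v = np.array([0.010, 0.020, 0.015, 0.012])   # per-study variances (hypothetical)

    w = 1.0 / v                                  # fixed-effect weights
    mu_fe = np.sum(w * y) / w.sum()
    q = np.sum(w * (y - mu_fe) ** 2)             # Cochran's Q
    tau2 = max(0.0, (q - (len(y) - 1)) / (w.sum() - np.sum(w**2) / w.sum()))

    w_re = 1.0 / (v + tau2)                      # random-effects weights
    pooled = np.sum(w_re * y) / w_re.sum()
    se = np.sqrt(1.0 / w_re.sum())
    print(f"pooled effect {pooled:.2f}, 95% CI "
          f"[{pooled - 1.96 * se:.2f}, {pooled + 1.96 * se:.2f}]")
    ```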

  14. MTpy - Python Tools for Magnetotelluric Data Processing and Analysis

    Science.gov (United States)

    Krieger, Lars; Peacock, Jared; Thiel, Stephan; Inverarity, Kent; Kirkby, Alison; Robertson, Kate; Soeffky, Paul; Didana, Yohannes

    2014-05-01

    We present the Python package MTpy, which provides functions for the processing, analysis, and handling of magnetotelluric (MT) data sets. MT is a relatively immature and not widely applied geophysical method in comparison to other geophysical techniques such as seismology. As a result, data processing within the academic MT community is not thoroughly standardised and is often based on a loose collection of software adapted to the respective local specifications. We have developed MTpy to overcome problems that arise from missing standards, and to simplify the general handling of MT data. MTpy is written in Python, and the open-source code is freely available from a GitHub repository. The setup follows the modular approach of successful geoscience software packages such as GMT or Obspy. It contains sub-packages and modules for the various tasks within the standard workflow of MT data processing and interpretation. In order to allow the inclusion of already existing and well-established software, MTpy not only provides pure Python classes and functions, but also wrapping command-line scripts to run standalone tools, e.g. modelling and inversion codes. Our aim is to provide a flexible framework which is open for future dynamic extensions. MTpy has the potential to promote the standardisation of processing procedures and at the same time be a versatile supplement for existing algorithms. Here, we introduce the concept and structure of MTpy, and we illustrate the workflow of MT data processing, interpretation, and visualisation utilising MTpy on example data sets collected over different regions of Australia and the USA.

  15. The optimization of gamma spectra processing in prompt gamma neutron activation analysis (PGNAA)

    Energy Technology Data Exchange (ETDEWEB)

    Pinault, Jean-Louis [IAEA Expert, 96 rue du Port David, 45370 Dry (France)], E-mail: jeanlouis_pinault@hotmail.fr; Solis, Jose [Instituto Peruano de Energia Nuclear, Av. Canada No. 1470, San Borja, Lima 41 (Peru)

    2009-04-15

    The uncertainty of the elemental analysis is one of the major factors governing the utility of on-line Prompt Gamma Neutron Activation Analysis (PGNAA) in the blending and sorting of bulk materials. In this paper, a general method applicable to gamma spectra processing is presented and applied to PGNAA in the mineral industry. Based on the Fourier transform of the spectra and their de-correlation in Fourier space (improving the conditioning of the correlation matrix), the processing of overlapping characteristic peaks minimizes the propagation of random errors, which optimizes the accuracy and decreases the detection limits of elemental analyses. In comparison with classical methods based on linear combinations of relevant regions of the spectra, the improvement may be considerable, especially when several elements interfere. The method is applied to four case studies covering both borehole logging and on-line analysis of raw materials on a conveyor belt.

  16. Evaluation and analysis method for natural gas hydrate storage and transportation processes

    International Nuclear Information System (INIS)

    Hao Wenfeng; Wang Jinqu; Fan Shuanshi; Hao Wenbin

    2008-01-01

    An evaluation and analysis method is presented to investigate an approach to scale up a hydration reactor and to solve some economic problems by looking at the natural gas hydrate storage and transportation process as a whole. Experiments with the methane hydration process are used to evaluate the whole natural gas hydrate storage and transportation process. The specific contents and conclusions are as follows. First, batch stirring effects and load coefficients were studied in a semi-continuous stirred-tank reactor. Results indicate that batch stirring and appropriate load coefficients are effective in improving hydrate storage capacity; in the experiments, appropriate values for stirring velocity, stirring time and load coefficient were found to be 320 rpm, 30 min and 0.289, respectively. Second, the throughput and energy consumption of the reactor for producing methane hydrates were calculated by mass and energy balance. Results show that the throughput of the reactor is 1.06 kg/d, with a product containing 12.4% methane gas, and that 0.19 kJ of energy is consumed for every 1 kJ of heat contained in the methane hydrates produced. Third, an energy consumption evaluation parameter is introduced to provide a single energy consumption evaluation rule for different hydration reactors. Parameter analyses indicate that process simplification or process integration can decrease energy consumption: if the experimental gas comes from a small-scale natural gas field and the energy consumption is 0.02 kJ per 1 kJ of heat in the produced methane hydrates, the decrease is 87.9%. Moreover, the energy consumption evaluation parameter, used as an economic criterion, is converted into a process evaluation parameter. Analyses indicate that the process evaluation parameter is relevant to the technology level and resource consumption of a system, which makes it applicable to economic analysis and venture forecasting for optimal capital utilization

  17. The specifics of the aplication of social and structural approach to electoral processes analysis

    Directory of Open Access Journals (Sweden)

    V F Kovrov

    2009-06-01

    Full Text Available The analysis of a number of problems in the investigation of the electoral process, viewed as a social phenomenon, contributes to overcoming several theoretical and methodological obstacles in its sociological cognition. The complexity and delicacy of the electoral process entail the application of a set of distinct approaches and research and description techniques. The article provides the rationale for gaining the most complete insight into the social component of the electoral process through the application of the social and structural approach. This approach enables one to give concrete expression to the subject matter of electoral sociology at the strategic, operational and tactical levels; to define the elements of the electoral process and the outcomes of electoral practice; to reveal the invariants of electoral consciousness and the psychological attitudes and values of different groups of voters; to analyze the evolution of objective-transforming practice and the electoral activity of the population; and to provide an extensive analysis of the results of the material and cultural assimilation of electoral practice.

  18. Image analysis and mathematical modelling for the supervision of the dough fermentation process

    Science.gov (United States)

    Zettel, Viktoria; Paquet-Durand, Olivier; Hecker, Florian; Hitzmann, Bernd

    2016-10-01

    The fermentation (proof) process of dough is one of the quality-determining steps in the production of baked goods. Besides the fluffiness, whose foundations are laid during fermentation, the flavour of the final product is strongly influenced during this production stage. However, until now no on-line measurement system has been available that can supervise this important process step. In this investigation the potential of an image analysis system is evaluated that enables the determination of the volume of fermenting dough pieces. The camera moves around the fermenting pieces and collects images of the objects from different angles (360° range). Using image analysis algorithms, the volume increase of individual dough pieces is determined. Based on a detailed mathematical description of the volume increase, which is based on the Bernoulli equation, the carbon dioxide production rate of the yeast cells and the diffusion of carbon dioxide, the fermentation process is supervised. Important process parameters, such as the carbon dioxide production rate of the yeast cells and the dough viscosity, can be estimated after just 300 s of proofing. The mean percentage error for forecasting the further evolution of the relative volume of the dough pieces is just 2.3%. Therefore, a forecast of the further evolution can be performed and used for fault detection.
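
    The paper couples the Bernoulli equation with CO2 production and diffusion; as a much simpler stand-in, one can fit a logistic curve to the measured relative volume and extrapolate, which conveys the forecasting idea. All numbers below are illustrative, not measured values from the study:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def logistic(t, v_max, k, t0):
        """Relative dough volume modelled as a logistic curve in time."""
        return 1 + (v_max - 1) / (1 + np.exp(-k * (t - t0)))

    t = np.array([0, 200, 400, 600, 800, 1000, 1200], dtype=float)  # s
    v = np.array([1.00, 1.10, 1.35, 1.80, 2.20, 2.45, 2.55])        # made-up volumes

    params, _ = curve_fit(logistic, t, v, p0=[2.6, 0.005, 600.0])
    print("forecast relative volume at t = 1800 s:", logistic(1800.0, *params))
    ```

    A supervisor comparing such a forecast against incoming measurements can flag a fault (for example, inactive yeast) when the observed volume falls persistently below the predicted curve.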

  19. Study on determination of durability analysis process and fatigue damage parameter for rubber component

    International Nuclear Information System (INIS)

    Moon, Seong In; Cho, Il Je; Woo, Chang Su; Kim, Wan Doo

    2011-01-01

    Rubber components, which have been widely used in the automotive industry as anti-vibration components for many years, are subjected to fluctuating loads, often failing due to the nucleation and growth of defects or cracks. To prevent such failures, it is necessary to understand the fatigue failure mechanism of rubber materials and to evaluate the fatigue life of rubber components. The objective of this study is to develop a durability analysis process for vulcanized rubber components that can predict fatigue life at the initial product design step. A method for determining the nonlinear material constants for FE analysis is proposed. Also, to investigate the applicability of commonly used damage parameters, fatigue tests and corresponding finite element analyses were carried out, and normal and shear strain were proposed as the fatigue damage parameters for rubber components. Fatigue analysis for automotive rubber components was performed and the durability analysis process was reviewed

  20. Multiscale analysis of the correlation of processing parameters on viscidity of composites fabricated by automated fiber placement

    Science.gov (United States)

    Han, Zhenyu; Sun, Shouzheng; Fu, Yunzhong; Fu, Hongya

    2017-10-01

    Viscidity is an important physical indicator for assessing the fluidity of resin, which helps the resin contact the fibers effectively and reduces manufacturing defects during the automated fiber placement (AFP) process. However, the effect of processing parameters on viscidity evolution has rarely been studied for the AFP process. In this paper, viscidities at different scales are analyzed based on a multi-scale analysis method. Firstly, the viscous dissipation energy (VDE) within a meso-unit under different processing parameters is assessed by using the finite element method (FEM). According to a multi-scale energy transfer model, the meso-unit energy is used as the boundary condition for the microscopic analysis. Furthermore, the molecular structure of the micro-system is built by the molecular dynamics (MD) method, and viscosity curves are then obtained by integrating the stress autocorrelation function (SACF) with time. Finally, the correlation of processing parameters with viscosity is revealed by using the gray relational analysis method (GRAM). A group of processing parameters is found that achieves stable viscosity and better fluidity of the resin.
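
    Integrating the stress autocorrelation function over time is the Green-Kubo route to viscosity; a sketch assuming the SACF has already been extracted from an MD trajectory (the constants and the synthetic exponential SACF below are assumptions for illustration):

    ```python
    import numpy as np

    kB = 1.380649e-23        # J/K
    T = 450.0                # K, assumed processing temperature
    V = 1e-25                # m^3, assumed MD cell volume
    dt = 1e-15               # s, SACF sampling interval

    # Synthetic stand-in for the shear-stress autocorrelation function (Pa^2)
    times = np.arange(0.0, 5e-12, dt)
    sacf = 1e14 * np.exp(-times / 5e-13)

    # Green-Kubo: eta = V / (kB * T) * integral of the SACF over time
    eta = V / (kB * T) * np.trapz(sacf, dx=dt)
    print(f"viscosity ~ {eta:.3e} Pa*s")
    ```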

  1. Utilization of Integrated Process Control, Data Capture, and Data Analysis in Construction of Accelerator Systems

    International Nuclear Information System (INIS)

    Bonnie Madre; Charles Reece; Joseph Ozelis; Valerie Bookwalter

    2003-01-01

    Jefferson Lab has developed a web-based system that integrates commercial database, data analysis, document archiving and retrieval, and user interface software into a coherent knowledge management product (Pansophy). This product provides important tools for the successful pursuit of major projects such as accelerator system development and construction, by offering elements of process and procedure control, data capture and review, and data mining and analysis. After a period of initial development, Pansophy is now being used in Jefferson Lab's SNS superconducting linac construction effort, as a means for structuring and implementing the QA program, for process control and tracking, and for cryomodule test data capture and presentation/analysis. Development of Pansophy is continuing, in particular of the data query and analysis functions that are the cornerstone of its utility

  2. Thermochemical production of liquid fuels from biomass: Thermo-economic modeling, process design and process integration analysis

    International Nuclear Information System (INIS)

    Tock, Laurence; Gassner, Martin; Marechal, Francois

    2010-01-01

    A detailed thermo-economic model combining thermodynamics with economic analysis and considering different technological alternatives for the thermochemical production of liquid fuels from lignocellulosic biomass is presented. Energetic and economic models for the production of Fischer-Tropsch fuel (FT), methanol (MeOH) and dimethyl ether (DME) by means of biomass drying with steam or flue gas, directly or indirectly heated fluidized bed or entrained flow gasification, hot or cold gas cleaning, fuel synthesis and upgrading are reviewed and developed. The process is integrated and the optimal utility system is computed. The competitiveness of the different process options is compared systematically with regard to energetic, economic and environmental considerations. Several examples highlight that process integration is a key element that allows for considerably increasing the performance by optimal utility integration and energy conversion. The performance computations of some exemplary technology scenarios of integrated plants yield overall energy efficiencies of 59.8% (crude FT-fuel), 52.5% (MeOH) and 53.5% (DME), and production costs of 89, 128 and 113 EUR/MWh on a fuel basis. The applied process design approach allows the economic competitiveness compared to fossil fuels to be evaluated, the influence of the biomass and electricity price to be studied, and projections to be made for different plant capacities. Process integration reveals, in particular, potential energy savings and waste heat valorization. Based on this work, the most promising options for the polygeneration of fuel, power and heat will be determined in a future thermo-economic optimization.
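
    To make such cost figures concrete, a back-of-envelope levelized production cost can be assembled from annualized capital, feedstock and O&M terms; every number below is an assumption for illustration and none comes from the paper:

    ```python
    capex = 250e6          # EUR, assumed plant investment
    i, n = 0.08, 20        # assumed discount rate and lifetime
    crf = i * (1 + i) ** n / ((1 + i) ** n - 1)   # capital recovery factor

    fuel_out = 350e3       # MWh of fuel per year (assumed)
    biomass = 9.5e6        # EUR/yr, assumed feedstock bill
    o_and_m = 7.0e6        # EUR/yr, assumed operation and maintenance

    lcof = (capex * crf + biomass + o_and_m) / fuel_out
    print(f"levelized production cost ~ {lcof:.0f} EUR/MWh")   # ~120 EUR/MWh here
    ```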

  3. Probabilistic Design in a Sheet Metal Stamping Process under Failure Analysis

    International Nuclear Information System (INIS)

    Buranathiti, Thaweepat; Cao, Jian; Chen, Wei; Xia, Z. Cedric

    2005-01-01

    Sheet metal stamping processes have been widely implemented in many industries due to their repeatability and productivity. In general, the simulation of a sheet metal forming process involves nonlinearity, complex material behavior and tool-material interaction. Instabilities in terms of tearing and wrinkling are major concerns in many sheet metal stamping processes. In this work, a sheet metal stamping process for a mild steel wheelhouse used in the automobile industry is studied by using an explicit nonlinear finite element code and incorporating failure analysis (tearing and wrinkling) and design under uncertainty. Margins of tearing and wrinkling are quantitatively defined via stress-based criteria for system-level design. The forming process utilizes drawbeads instead of blank holder force to restrain the blank. The main parameters of interest in this work are friction conditions, drawbead configurations, sheet metal properties, and numerical errors. A robust design model is created to conduct a probabilistic design, which is made possible for this complex engineering process via an efficient uncertainty propagation technique. The method, called the weighted three-point-based method, estimates the statistical characteristics (mean and variance) of the responses of interest (margins of failure), and provides a systematic approach to designing a sheet metal forming process under the framework of design under uncertainty
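
    A weighted three-point scheme of the kind referred to can be sketched with Gauss-Hermite-style nodes: a normal input is evaluated at its mean and at mean ± √3·σ with weights 2/3 and 1/6, and the response statistics are assembled from those runs. The margin function below is a stand-in for the FE simulation, and the exact weighting used in the paper may differ:

    ```python
    import numpy as np

    def margin(friction):
        """Stand-in for the FE-computed tearing margin vs. friction coefficient."""
        return 1.0 - 2.5 * (friction - 0.10) ** 2 - 0.8 * friction

    mu, sigma = 0.10, 0.02                      # assumed input statistics
    nodes = [mu, mu - np.sqrt(3) * sigma, mu + np.sqrt(3) * sigma]
    weights = [2 / 3, 1 / 6, 1 / 6]             # 3-point Gauss-Hermite weights

    vals = np.array([margin(x) for x in nodes])
    mean = np.dot(weights, vals)
    var = np.dot(weights, (vals - mean) ** 2)
    print(f"margin mean {mean:.4f}, std {np.sqrt(var):.4f}")
    ```

    Three model evaluations per input replace thousands of Monte Carlo runs, which is what makes probabilistic design affordable when each evaluation is a full stamping simulation.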

  4. Process Simulation Analysis of HF Stripping

    Directory of Open Access Journals (Sweden)

    Thaer A. Abdulla

    2015-02-01

    Full Text Available HYSYS process simulator is used for the analysis of an existing HF stripping column in a LAB plant (Arab Detergent Company, Baiji, Iraq). Simulated column performance and profile curves are constructed. The variables considered are the thermodynamic model option, bottom temperature, feed temperature, and the column profiles for temperature, vapor flow rate, liquid flow rate and composition. The five thermodynamic model options used (Margules, UNIQUAC, Van Laar, Antoine, and Zudkevitch-Joffee) affect the results within 0.1-58% variation in most cases. The simulated results show that about 4% of the paraffin (C10 & C11) is present in the top stream, which may cause a problem in the LAB production plant. The major variations were noticed in the total top vapor flow rate with bottom temperature and with feed composition. The column profiles remain fairly constant from tray 5 to tray 18. The study gives evidence of a successful simulation with HYSYS, because the results correspond with the real plant operation data.

  5. Numerical analysis of mixing process of two component gases in vertical fluid layer

    International Nuclear Information System (INIS)

    Hatori, Hirofumi; Takeda, Tetsuaki; Funatani, Shumpei

    2015-01-01

    When a depressurization accident occurs in the Very-High-Temperature Reactor (VHTR), it is expected that air will enter the reactor core. Therefore, it is important to know the mixing process of different kinds of gases in a stable or unstable stratified fluid layer. It is especially important to examine the influence of localized natural convection and molecular diffusion on the mixing process from a viewpoint of safety. In order to research the mixing process of two component gases and the flow characteristics of the localized natural convection, we have carried out a numerical analysis using a three-dimensional CFD code. The numerical model consisted of a storage tank and a reverse U-shaped vertical slot, separated by a partition plate. One side of the left vertical fluid layer was heated and the other side was cooled. The right vertical fluid layer was also cooled. The procedure of the numerical analysis is as follows. Firstly, the storage tank was filled with heavy gas and the reverse U-shaped vertical slot was filled with light gas. In the left vertical fluid layer, localized natural convection was generated by the temperature difference between the vertical walls. The flow characteristics were obtained by a steady state analysis. The unsteady state analysis was started when the partition plate was opened. The gases were mixed by molecular diffusion and natural convection. As time elapsed, natural circulation occurred. The results obtained in this numerical analysis are as follows. The temperature difference of the left vertical fluid layer was set to 100 K. The combination of mixed gases was nitrogen and argon. After 76 minutes had elapsed, natural circulation occurred. (author)

  6. Stacker’s Crane Position Fixing Based on Real Time Image Processing and Analysis

    Directory of Open Access Journals (Sweden)

    Kmeid Saad

    2015-06-01

    Full Text Available This study illustrates the usage of stacker cranes and image processing in automated warehouse systems. The aim is to use real-time image processing and analysis for a stacker crane's position fixing, in order to use it as a pick-up and delivery system (P/D) controlled by a programmable logic controller unit (PLC).

  7. Processing of gamma-ray spectra employing a Fourier deconvolver for the analysis of complex spectra

    International Nuclear Information System (INIS)

    Madan, V.K.; Rattan, S.S.

    1996-01-01

    Processing of a nuclear spectrum, e.g. a gamma-ray spectrum, is concerned with the estimation of the energies and intensities of radiation. The processing involves filtering, peak detection and its significance, baseline delineation, and the qualitative and quantitative analysis of the singlets and multiplets present in the spectrum. The methodology for the analysis of singlets is well established. The analysis of multiplets, however, poses a challenge and is an extremely difficult problem. This report incorporates a Fourier deconvolver for the quantitative analysis of doublets separated by more than a full width at half maximum. The method is easy to implement. The report discusses the methodology, the mathematical analysis, and the results obtained by analyzing both synthetic and observed spectra. A computer program, developed for the analysis of a nuclear spectrum, was verified by analyzing a 152 Eu gamma-ray spectrum. The proposed technique compared favourably with SAMPO and the MDFT method. (author). 16 refs., 3 tabs
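
    A minimal illustration of Fourier-space deconvolution resolving a doublet: divide the spectrum's transform by that of the detector line shape, with a small regularization term to keep noise from blowing up. Peak positions, widths and the regularization constant are all synthetic choices, not taken from the report:

    ```python
    import numpy as np

    ch = np.arange(512)
    gauss = lambda x, mu, sig: np.exp(-0.5 * ((x - mu) / sig) ** 2)

    response = gauss(ch, 256, 4.0)           # detector line shape, FWHM ~ 9.4 channels
    response /= response.sum()
    doublet = 100 * gauss(ch, 250, 4.0) + 60 * gauss(ch, 262, 4.0)  # separation > FWHM

    # Fourier-space deconvolution with Tikhonov-style regularization
    F_d = np.fft.fft(doublet)
    F_r = np.fft.fft(np.fft.ifftshift(response))   # kernel centred at index 0
    eps = 1e-3
    deconv = np.real(np.fft.ifft(F_d * np.conj(F_r) / (np.abs(F_r) ** 2 + eps)))
    # 'deconv' now shows two sharpened components near channels 250 and 262
    ```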

  8. Progress in Analysis to Remote Sensed Thermal Abnormity with Fault Activity and Seismogenic Process

    Directory of Open Access Journals (Sweden)

    WU Lixin

    2017-10-01

    Full Text Available Research on remote sensed thermal abnormity associated with fault activity and the seismogenic process is a vital topic in Earth observation and remote sensing application. A systematic review is presented of the international research on this topic during the past 30 years, in the respects of remote sensing data applications, anomaly analysis methods, and mechanism understanding. Firstly, the outlines of remote sensing data applications are given, including infrared brightness temperature, microwave brightness temperature, outgoing longwave radiation, and assimilated data from multiple earth observations. Secondly, three development phases are summarized: qualitative analysis based on visual interpretation, quantitative analysis based on image processing, and multi-parameter spatio-temporal correlation analysis. Thirdly, the theoretical hypotheses presented for mechanism understanding are introduced, including earth degassing, stress-induced heat, crustal rock battery conversion, and latent heat release due to radon decay, as well as multi-sphere coupling effects. Finally, three key directions of future research on this topic are proposed: anomaly recognition by remote sensing monitoring and data analysis for typical tectonic activity areas; anomaly mechanism understanding based on earthquake-related earth system responses; and spatio-temporal correlation analysis of air-based, space-based and ground-based stereoscopic observations.

  9. Process hazards analysis (PrHA) program, bridging accident analyses and operational safety

    International Nuclear Information System (INIS)

    Richardson, J.A.; McKernan, S.A.; Vigil, M.J.

    2003-01-01

    Recently the Final Safety Analysis Report (FSAR) for the Plutonium Facility at Los Alamos National Laboratory, Technical Area 55 (TA-55), was revised and submitted to the U.S. Department of Energy (DOE). As a part of this effort, over seventy Process Hazards Analyses (PrHAs) were written and/or revised over the six years prior to the FSAR revision. TA-55 is a research, development, and production nuclear facility that primarily supports U.S. defense and space programs. Nuclear fuels and material research; material recovery, refining and analyses; and the casting, machining and fabrication of plutonium components are some of the activities conducted at TA-55. These operations involve a wide variety of industrial, chemical and nuclear hazards. Operational personnel along with safety analysts work as a team to prepare the PrHA. PrHAs describe the process; identify the hazards; and analyze hazards including determining hazard scenarios, their likelihood, and consequences. In addition, the interaction of the process with facility systems, structures and operation-specific protective features is part of the PrHA. This information is rolled up to determine bounding accidents and mitigating systems and structures. Further detailed accident analysis is performed for the bounding accidents and included in the FSAR. The FSAR is part of the Documented Safety Analysis (DSA) that defines the safety envelope for all facility operations in order to protect the worker, the public, and the environment. The DSA is in compliance with the U.S. Code of Federal Regulations, 10 CFR 830, Nuclear Safety Management, and is approved by DOE. The DSA sets forth the bounding conditions necessary for the safe operation of the facility and is essentially a 'license to operate.' Safety of day-to-day operations is based on Hazard Control Plans (HCPs). Hazards are initially identified in the PrHA for the specific operation and act as input to the HCP. Specific protective features important to worker

  10. Beer fermentation: monitoring of process parameters by FT-NIR and multivariate data analysis.

    Science.gov (United States)

    Grassi, Silvia; Amigo, José Manuel; Lyndgaard, Christian Bøge; Foschino, Roberto; Casiraghi, Ernestina

    2014-07-15

    This work investigates the capability of Fourier-transform near infrared (FT-NIR) spectroscopy to monitor and assess process parameters in beer fermentation under different operating conditions. For this purpose, the fermentation of wort with two different yeast strains and at different temperatures was monitored for nine days by FT-NIR. To correlate the collected spectra with °Brix, pH and biomass, different multivariate data methodologies were applied. Principal component analysis (PCA), partial least squares (PLS) and locally weighted regression (LWR) were used to assess the relationship between the FT-NIR spectra and the abovementioned process parameters that define the beer fermentation. The accuracy and robustness of the obtained results clearly show the suitability of FT-NIR spectroscopy, combined with multivariate data analysis, as a quality control tool in the beer fermentation process. FT-NIR spectroscopy, when combined with LWR, proves to be a perfectly suitable quantitative method for implementation in the production of beer. Copyright © 2014 Elsevier Ltd. All rights reserved.
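
    A hedged sketch of the PLS calibration step using scikit-learn (a generic implementation, not the software used in the paper); the spectra and °Brix values below are random stand-ins:

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 200))          # 60 spectra x 200 wavenumbers (synthetic)
    brix = 12.0 - 0.4 * X[:, 50] + rng.normal(scale=0.1, size=60)   # toy target

    pls = PLSRegression(n_components=5)
    pred = cross_val_predict(pls, X, brix, cv=10).ravel()   # cross-validated predictions
    rmsecv = np.sqrt(np.mean((pred - brix) ** 2))
    print(f"RMSECV = {rmsecv:.2f} deg Brix")
    ```

    Cross-validated error (RMSECV) rather than training error is the usual basis for choosing the number of PLS components in spectroscopic calibration.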

  11. Tube Bulge Process : Theoretical Analysis and Finite Element Simulations

    International Nuclear Information System (INIS)

    Velasco, Raphael; Boudeau, Nathalie

    2007-01-01

    This paper is focused on the determination of the mechanical characteristics of tubular materials, using the tube bulge process. A comparative study is made between two different models: a theoretical model and a finite element analysis. The theoretical model is completely developed, based first on a geometrical analysis of the tube profile during bulging, which is assumed to strain in arcs of circles. Strain and stress analysis completes the theoretical model, which allows the tube thickness and state of stress to be evaluated at any point of the free bulge region. Free bulging of a 304L stainless steel is simulated using Ls-Dyna 970. To validate the FE simulation approach, a comparison between the theoretical and finite element models is carried out on several parameters such as: thickness variation at the free bulge region pole with bulge height, tube thickness variation with the z axial coordinate, and von Mises stress variation with plastic strain. Finally, the influence of deviations in the geometrical parameters on the flow stress curve is observed using the analytical model: deviations of the tube outer diameter, its initial thickness and the bulge height measurement are taken into account to obtain the resulting error on plastic strain and von Mises stress

  12. Methodology for systematic analysis and improvement of manufacturing unit process life-cycle inventory (UPLCI)—CO2PE! initiative (cooperative effort on process emissions in manufacturing). Part 1: Methodology description

    DEFF Research Database (Denmark)

    Kellens, Karel; Dewulf, Wim; Overcash, Michael

    2012-01-01

    This report proposes a life-cycle analysis (LCA)-oriented methodology for systematic inventory analysis of the use phase of manufacturing unit processes, providing unit process datasets to be used in life-cycle inventory (LCI) databases and libraries. The methodology has been developed ... to support the provision of high-quality data for LCA studies of products using these unit process datasets for the manufacturing processes, as well as the in-depth analysis of individual manufacturing unit processes ... and resource efficiency improvements of the manufacturing unit process. In addition, data are accruing for a range of similar machines (same process, different ...). To ensure optimal reproducibility and applicability, documentation guidelines for data and metadata are included in both approaches. Guidance on definition of functional unit and reference flow as well as on determination of system ...

  13. Application of LES for Analysis of Unsteady Effects on Combustion Processes and Misfires in DISI Engine

    Directory of Open Access Journals (Sweden)

    Goryntsev D.

    2013-10-01

    Full Text Available Cycle-to-cycle variations of combustion processes strongly affect the emissions, specific fuel consumption and work output. Internal combustion engines such as Direct Injection Spark-Ignition (DISI) engines are very sensitive to the cyclic fluctuations of the flow, mixing and combustion processes. Multi-cycle Large Eddy Simulation (LES) analysis has been used in order to characterize unsteady effects of combustion processes and misfires in a realistic DISI engine. A qualitative analysis of the intensity of cyclic variations of in-cylinder pressure, temperature and fuel mass fraction is presented. The effect of ignition probability and an analysis of misfires are pointed out. Finally, the fuel history effects, along with the effect of residual gas on in-cylinder pressure and temperature as well as on misfires, are discussed.

  14. From the Analysis of Work-Processes to Designing Competence-Based Occupational Standards and Vocational Curricula

    Science.gov (United States)

    Tutlys, Vidmantas; Spöttl, Georg

    2017-01-01

    Purpose: This paper aims to explore methodological and institutional challenges on application of the work-process analysis approach in the design and development of competence-based occupational standards for Lithuania. Design/methodology/approach: The theoretical analysis is based on the review of scientific literature and the analysis of…

  15. Initiating an ergonomic analysis. A process for jobs with highly variable tasks.

    Science.gov (United States)

    Conrad, K M; Lavender, S A; Reichelt, P A; Meyer, F T

    2000-09-01

    Occupational health nurses play a vital role in addressing ergonomic problems in the workplace. Describing and documenting exposure to ergonomic risk factors is a relatively straightforward process in jobs in which the work is repetitive. In other types of work, the analysis becomes much more challenging because tasks may be repeated infrequently, or at irregular time intervals, or under different environmental and temporal conditions, thereby making it difficult to observe a "representative" sample of the work performed. This article describes a process used to identify highly variable job tasks for ergonomic analyses. The identification of tasks for ergonomic analysis was a two step process involving interviews and a survey of firefighters and paramedics from a consortium of 14 suburban fire departments. The interviews were used to generate a list of frequently performed, physically strenuous job tasks and to capture clear descriptions of those tasks and associated roles. The goals of the survey were to confirm the interview findings across the entire target population and to quantify the frequency and degree of strenuousness of each task. In turn, the quantitative results from the survey were used to prioritize job tasks for simulation. Although this process was used to study firefighters and paramedics, the approach is likely to be suitable for many other types of occupations in which the tasks are highly variable in content and irregular in frequency.

  16. Progress in Root Cause and Fault Propagation Analysis of Large-Scale Industrial Processes

    Directory of Open Access Journals (Sweden)

    Fan Yang

    2012-01-01

    Full Text Available In large-scale industrial processes, a fault can easily propagate between process units due to the interconnections of material and information flows. Thus, the problem of fault detection and isolation for these processes is more concerned with the root cause and fault propagation before applying quantitative methods in local models. Process topology and causality, as the key features of the process description, need to be captured from process knowledge and process data. The modelling methods from these two aspects are overviewed in this paper. From process knowledge, structural equation modelling, various causal graphs, rule-based models, and ontological models are summarized. From process data, cross-correlation analysis, Granger causality and its extensions, frequency domain methods, information-theoretical methods, and Bayesian nets are introduced. Based on these models, inference methods are discussed to find root causes and fault propagation paths under abnormal situations. Some future work is proposed at the end.
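
    On the data-driven side, a pairwise Granger causality test is one of the surveyed methods, and statsmodels ships an implementation. The sketch below builds synthetic series in which x drives y and tests that direction; in practice one would sweep over pairs of process measurements to hypothesize fault propagation paths:

    ```python
    import numpy as np
    from statsmodels.tsa.stattools import grangercausalitytests

    rng = np.random.default_rng(1)
    n = 500
    x = rng.normal(size=n)
    y = np.zeros(n)
    for t in range(2, n):            # y depends on past x, so x "Granger-causes" y
        y[t] = 0.6 * y[t - 1] + 0.8 * x[t - 2] + 0.1 * rng.normal()

    # Tests whether the second column helps predict the first, up to 3 lags
    results = grangercausalitytests(np.column_stack([y, x]), maxlag=3)
    ```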

  17. Use of electronic computers for processing of spectrometric data in instrument neutron activation analysis

    International Nuclear Information System (INIS)

    Vyropaev, V.Ya.; Zlokazov, V.B.; Kul'kina, L.I.; Maslov, O.D.; Fefilov, B.V.

    1977-01-01

    A computer program is described for processing gamma spectra in the instrumental activation analysis of multicomponent objects. Structural diagrams of various variants of connection with the computer are presented. The possibility of using a mini-computer as an analyser and for preliminary processing of gamma spectra is considered
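    The original program and its algorithms are not available, but the kind of preliminary spectrum processing described can be sketched with modern tools. A minimal photopeak search on a synthetic gamma spectrum might look as follows; the peak positions, prominence threshold and calibration are all assumptions.

    ```python
    # Illustrative sketch: locate photopeaks in a gamma spectrum, the sort of
    # preliminary processing an analyser/mini-computer pipeline performs
    # (synthetic spectrum; channel-to-energy calibration omitted).
    import numpy as np
    from scipy.signal import find_peaks

    channels = np.arange(2048)
    # Synthetic spectrum: smooth Compton-like background plus two Gaussian peaks.
    background = 200.0 * np.exp(-channels / 800.0)
    peak1 = 150.0 * np.exp(-0.5 * ((channels - 662) / 4.0) ** 2)
    peak2 = 90.0 * np.exp(-0.5 * ((channels - 1173) / 5.0) ** 2)
    counts = np.random.default_rng(1).poisson(background + peak1 + peak2)

    # Require candidate peaks to stand out above the local background.
    peaks, props = find_peaks(counts, prominence=40, width=2)
    print("peak channels:", peaks)
    ```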

  18. Site-specific analysis of the cobbly soils at the Grand Junction processing site

    International Nuclear Information System (INIS)

    1992-06-01

    This report describes a recent site-specific analysis to evaluate the necessity of a recommendation to install a slurry trench around the Grand Junction processing site. The following analysis addresses the cobbly nature of the site's radiologically contaminated foundation soil, reassesses the excavation depths based on bulk radionuclide concentrations, and presents data-based arguments that support the elimination of the initially proposed slurry trench. The slurry trench around the processing site was proposed by the Remedial Action Contractor (RAC) to minimize the amount of water encountered during excavation. The initial depths of excavation developed during conceptual design, which indicated the need for a slurry wall, were reexamined as part of this analysis. This reanalysis, based on bulk concentrations of a cobbly subsoil, supports decreasing the original excavation depth, limiting the dewatering quantities to those which can be dissipated by normal construction activities. This eliminates the need for a slurry trench and separate water treatment prior to permitted discharge

  19. Energy, Exergy and Advanced Exergy Analysis of a Milk Processing Factory

    DEFF Research Database (Denmark)

    Bühler, Fabian; Nguyen, Tuong-Van; Jensen, Jonas Kjær

    2016-01-01

    …integration, an exergy analysis pinpoints the locations, causes and magnitudes of thermodynamic losses. The advanced exergy analysis further identifies the real potential for thermodynamic improvements of the system by splitting exergy destruction into its avoidable and unavoidable parts, which are related…, cream and milk powder. The results show the optimisation potential based on 1st and 2nd law analyses. An evaluation and comparison of the applicability of exergy methods, including advanced exergy methods, to the dairy industry is made. The comparison includes typical energy mappings conducted onsite…, and discusses the benefits and challenges of applying advanced thermodynamic methods to industrial processes…
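    The splitting referred to in the abstract is conventionally written as a decomposition of the exergy destruction rate in component k; this is standard advanced-exergy notation added here for clarity, not reproduced from the paper itself.

    ```latex
    \dot{E}_{D,k} = \dot{E}_{D,k}^{\mathrm{AV}} + \dot{E}_{D,k}^{\mathrm{UN}}
    ```

    The avoidable part \(\dot{E}_{D,k}^{\mathrm{AV}}\) is the share of the destruction that improved design or operation could in principle remove, while the unavoidable part \(\dot{E}_{D,k}^{\mathrm{UN}}\) persists even under the best achievable operating conditions.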

  20. The Development and Numerical Analysis of the Conical Radiator Extrusion Process

    Directory of Open Access Journals (Sweden)

    Michalczyk J.

    2017-12-01

    Full Text Available The article presents a newly developed method for single-operation extrusion of conical radiators. This is the author's radiator manufacturing method and the subject of a patent application. The proposed method enables the manufacture of radiators either with or without an inner opening and with an integral plate. Selected results of numerical computations made within Forge®3D, a finite element method (FEM)-based software program, are presented in the analysis of the process. A comparative analysis of the proposed manufacturing method and the double-sided extrusion method was also made.

  1. Process Design and Techno-economic Analysis for Materials to Treat Produced Waters.

    Energy Technology Data Exchange (ETDEWEB)

    Heimer, Brandon Walter [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Paap, Scott M [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Sasan, Koroush [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Brady, Patrick Vane. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Nenoff, Tina M. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2017-09-01

    Significant quantities of water are produced during enhanced oil recovery, making these “produced water” streams attractive candidates for treatment and reuse. However, high concentrations of dissolved silica raise the propensity for fouling. In this paper, we report the design and economic analysis for a new ion exchange process using calcined hydrotalcite (HTC) to remove silica from water. This process improves upon known technologies by minimizing sludge product, reducing process fouling, and lowering energy use. Process modeling outputs included raw material requirements, energy use, and the minimum water treatment price (MWTP). Monte Carlo simulations quantified the impact of uncertainty and variability in process inputs on MWTP. These analyses showed that cost can be significantly reduced if the HTC materials are optimized. Specifically, R&D improving HTC reusability, silica binding capacity, and raw material price can reduce MWTP by 40%, 13%, and 20%, respectively. Optimizing geographic deployment further improves cost competitiveness.
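    The Monte Carlo step can be sketched in a few lines. Every distribution, cost term and parameter below is a hypothetical placeholder, not a value from the Sandia report; the point is only to show how input uncertainty propagates to a treatment-price estimate.

    ```python
    # Illustrative sketch: Monte Carlo propagation of input uncertainty to a
    # minimum water treatment price (MWTP). All numbers are hypothetical.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000

    # Uncertain inputs (hypothetical ranges):
    htc_price = rng.triangular(1.0, 1.5, 2.5, n)   # $/kg calcined hydrotalcite
    binding_cap = rng.uniform(20.0, 40.0, n)       # g silica per kg HTC
    reuse_cycles = rng.integers(5, 21, n)          # regenerations before disposal

    silica_load = 0.15  # kg silica removed per m^3 of produced water (assumed)

    # HTC consumed per m^3 treated, amortized over its reuse life.
    htc_per_m3 = silica_load * 1000.0 / binding_cap / reuse_cycles   # kg/m^3
    mwtp = htc_price * htc_per_m3 + 0.30   # + fixed O&M/energy ($/m^3, assumed)

    print(f"median MWTP: {np.median(mwtp):.2f} $/m^3")
    print(f"90% interval: {np.percentile(mwtp, 5):.2f}-{np.percentile(mwtp, 95):.2f} $/m^3")
    ```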

  2. Numerical Analysis of Heat Transfer During Quenching Process

    Science.gov (United States)

    Madireddi, Sowjanya; Krishnan, Krishnan Nambudiripad; Reddy, Ammana Satyanarayana

    2018-04-01

    A numerical model is developed to simulate the immersion quenching process of metals. The time of quench plays an important role if the process involves a defined step-quenching schedule to obtain the desired characteristics. The lumped heat capacity analysis used for this purpose requires the value of the heat transfer coefficient, whose evaluation requires extensive experimental data. Experimentation on a sample work piece may not represent the actual component, which may vary in dimension. A fluid-structure interaction technique with a coupled interface between the solid (metal) and liquid (quenchant) is used for the simulations. The initial period of quenching shows boiling heat transfer phenomena with high values of heat transfer coefficients (5000-2.5 × 10⁵ W/m²K). The shape of the work piece, for equal dimensions, shows little influence on the cooling rate. Non-uniformity in hardness at the sharp corners can be reduced by rounding off the edges. For a square piece of 20 mm thickness with a 3 mm fillet radius, this difference is reduced by 73%. The model can be used for any metal-quenchant combination to obtain time-temperature data without the necessity of experimentation.
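    For contrast with the full numerical model, the lumped heat capacity analysis the abstract mentions reduces to a single exponential cooling curve once h is known. The sketch below uses assumed properties for a small steel part; nothing here is taken from the paper.

    ```python
    # Illustrative sketch: lumped-capacitance cooling curve for an immersion
    # quench, the analysis the record says hinges on the heat transfer
    # coefficient h. All property values are assumptions for a small steel part.
    import numpy as np

    h = 5000.0                 # W/(m^2 K): low end of the boiling-regime range cited
    T_inf, T0 = 60.0, 850.0    # quenchant and initial part temperatures, deg C (assumed)

    L = 0.02                   # 20 mm cube (assumed geometry)
    V, A = L**3, 6.0 * L**2
    rho, c = 7850.0, 490.0     # steel density (kg/m^3) and specific heat (J/(kg K))

    # Lumped analysis is strictly valid only for small Biot number; at
    # boiling-regime h this is marginal, which is one reason the paper
    # resolves the temperature field numerically instead.
    tau = rho * V * c / (h * A)            # time constant, s
    t = np.linspace(0.0, 5.0 * tau, 6)
    T = T_inf + (T0 - T_inf) * np.exp(-t / tau)

    for ti, Ti in zip(t, T):
        print(f"t = {ti:5.1f} s   T = {Ti:6.1f} C")
    ```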

  3. Processing of spectral X-ray data with principal components analysis

    CERN Document Server

    Butler, A P H; Cook, N J; Butzer, J; Schleich, N; Tlustos, L; Scott, N; Grasset, R; de Ruiter, N; Anderson, N G

    2011-01-01

    The goal of the work was to develop a general method for processing spectral x-ray image data. Principal component analysis (PCA) is a well understood technique for multivariate data analysis and so was investigated. To assess this method, spectral (multi-energy) computed tomography (CT) data were obtained using a Medipix2 detector in a MARS-CT (Medipix All Resolution System). PCA was able to separate bone (calcium) from two elements with k-edges in the X-ray spectrum used (iodine and barium) within a mouse. This has potential clinical application in dual-energy CT systems and future Medipix3-based spectral imaging, where up to eight energies can be recorded simultaneously with excellent energy resolution.
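    The basic manipulation is easy to reproduce on stand-in data. The sketch below applies scikit-learn's PCA to synthetic eight-bin "spectral" voxels; the material spectra and noise level are invented, not Medipix measurements.

    ```python
    # Illustrative sketch: PCA on multi-energy (spectral) CT-like data to
    # separate material signatures. Random synthetic data stands in for the
    # Medipix measurements described in the record.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(7)
    n_voxels, n_energies = 10_000, 8   # e.g. eight energy bins (Medipix3-style)

    # Synthetic mixing: each voxel is a mix of two material spectra plus noise.
    spectrum_a = rng.uniform(0.5, 1.5, n_energies)   # stand-in for calcium
    spectrum_b = rng.uniform(0.5, 1.5, n_energies)   # stand-in for iodine
    weights = rng.uniform(0.0, 1.0, (n_voxels, 2))
    X = weights @ np.vstack([spectrum_a, spectrum_b])
    X += 0.05 * rng.standard_normal(X.shape)

    pca = PCA(n_components=3)
    scores = pca.fit_transform(X)
    print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 3))
    ```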

  4. MiToBo - A Toolbox for Image Processing and Analysis

    Directory of Open Access Journals (Sweden)

    Birgit Möller

    2016-04-01

    Full Text Available MiToBo is a toolbox and Java library for solving basic as well as advanced image processing and analysis tasks. It features a rich collection of fundamental, intermediate and high-level image processing operators and algorithms as well as a couple of sophisticated tools for specific biological and biomedical applications. These tools include operators for elucidating cellular morphology and locomotion as well as operators for the characterization of certain intracellular particles and structures. MiToBo builds upon and integrates into the widely-used image analysis software packages ImageJ and Fiji [11, 10], and all of its operators can easily be run in ImageJ and Fiji via a generic operator runner plugin. Alternatively, MiToBo operators can be run directly from the command line, and using its functionality as a library for developing one's own applications is also supported. Thanks to the Alida library [8], which forms the base of MiToBo, all operators share unified APIs that foster reusability, and graphical as well as command-line user interfaces for operators are generated automatically. MiToBo is available from its website http://www.informatik.uni-halle.de/mitobo, on Github, via an Apache Archiva Maven repository server, and it can easily be activated in Fiji via its own update site.

  5. Dynamical analysis of yeast protein interaction network during the sake brewing process.

    Science.gov (United States)

    Mirzarezaee, Mitra; Sadeghi, Mehdi; Araabi, Babak N

    2011-12-01

    Proteins interact with each other for performing essential functions of an organism. They change partners to get involved in various processes at different times or locations. Studying variations of protein interactions within a specific process would help better understand the dynamic features of the protein interactions and their functions. We studied the protein interaction network of Saccharomyces cerevisiae (yeast) during the brewing of Japanese sake. In this process, yeast cells are exposed to several stresses. Analysis of protein interaction networks of yeast during this process helps to understand how protein interactions of yeast change during the sake brewing process. We used gene expression profiles of yeast cells for this purpose. Results of our experiments revealed some characteristics and behaviors of yeast hubs and non-hubs and their dynamical changes during the brewing process. We found that just a small portion of the proteins (12.8 to 21.6%) is responsible for the functional changes of the proteins in the sake brewing process. The number of edges and hubs of the yeast protein interaction networks increases in the first stages of the process and then decreases in the final stages.
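    The hub/non-hub distinction used in such analyses is typically degree-based. The sketch below flags high-degree nodes in a toy random graph; the graph, the 10% cutoff and the library choice are illustrative assumptions, not the paper's actual data or criterion.

    ```python
    # Illustrative sketch: flag hubs in a protein interaction network by degree
    # (toy random graph standing in for a yeast PIN).
    import networkx as nx

    g = nx.erdos_renyi_graph(n=200, p=0.03, seed=3)

    degrees = dict(g.degree())
    # Take roughly the top 10% of nodes by degree as hub candidates.
    cutoff = sorted(degrees.values(), reverse=True)[len(degrees) // 10]
    hubs = [node for node, d in degrees.items() if d >= cutoff]

    print(f"{len(hubs)} hub candidates out of {g.number_of_nodes()} proteins")
    ```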

  6. Electrophoresis gel image processing and analysis using the KODAK 1D software.

    Science.gov (United States)

    Pizzonia, J

    2001-06-01

    The present article reports on the performance of the KODAK 1D Image Analysis Software for the acquisition of information from electrophoresis experiments and highlights the utility of several mathematical functions for subsequent image processing, analysis, and presentation. Digital images of Coomassie-stained polyacrylamide protein gels containing molecular weight standards and ethidium bromide-stained agarose gels containing DNA mass standards are acquired using the KODAK Electrophoresis Documentation and Analysis System 290 (EDAS 290). The KODAK 1D software is used to optimize lane and band identification using features such as isomolecular weight lines. Mathematical functions for mass standard representation are presented, and two methods for estimation of unknown band mass are compared. Given the progressive transition of electrophoresis data acquisition and daily reporting in peer-reviewed journals to digital formats ranging from 8-bit systems such as EDAS 290 to more expensive 16-bit systems, the utility of algorithms such as Gaussian modeling, which can correct geometric aberrations such as clipping due to signal saturation common at lower bit depth levels, is discussed. Finally, image-processing tools that can facilitate image preparation for presentation are demonstrated.

  7. The process of life-cycle cost analysis on the Fernald Environmental Management Project

    International Nuclear Information System (INIS)

    Chang, D.Y.; Jacoboski, J.A.; Fisher, L.A.; Beirne, P.J.

    1993-01-01

    The Estimating Services Department of the Fernald Environmental Restoration Management Corporation (FERMCO) is formalizing the process of life-cycle cost analysis (LCCA) for the Fernald Environmental Management Project (FEMP). The LCCA process is based on the concepts, principles, and guidelines described by applicable Department of Energy (DOE) orders, pertinent published literature, and National Bureau of Standards Handbook 135. LCC analyses will be performed on the FEMP following a ten-step process, at the earliest possible decision point, to support the selection of the least-cost alternatives for achieving the FERMCO mission
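    At its core, an LCCA of this kind compares alternatives on discounted present value, per the approach of NBS Handbook 135. The sketch below shows that comparison with entirely hypothetical cash flows and discount rate; it is not FERMCO's ten-step procedure, only the present-value arithmetic underlying it.

    ```python
    # Illustrative sketch: discounted life-cycle cost comparison of two
    # hypothetical remediation alternatives (all cash flows assumed).
    def life_cycle_cost(initial, annual_costs, discount_rate):
        """Present value of the initial cost plus discounted future annual costs."""
        pv_future = sum(
            cost / (1.0 + discount_rate) ** year
            for year, cost in enumerate(annual_costs, start=1)
        )
        return initial + pv_future

    # Alternative A: cheaper to build, costlier to operate; B: the reverse.
    alt_a = life_cycle_cost(5.0e6, [0.8e6] * 10, 0.04)
    alt_b = life_cycle_cost(8.0e6, [0.3e6] * 10, 0.04)
    print(f"Alternative A: ${alt_a / 1e6:.2f} M   Alternative B: ${alt_b / 1e6:.2f} M")
    ```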

  8. Modern licensing approaches for analysis of important to safety processes in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Andreeva, M.; Groudev, P.; Pavlova, M.; Stoyanov, S.

    2008-01-01

    The paper presents modern approaches for the analysis of important-to-safety processes in Nuclear Power Plants, including the Bulgarian Regulatory Agency's requirements for quantitative assessment of these processes, applying deterministic and probabilistic approaches to establish and confirm the design basis and the effectiveness of defence-in-depth. (authors)

  9. Contact traction analysis for profile change during coining process

    International Nuclear Information System (INIS)

    Kim, Hyung Kyu; Yoon, Kyung Ho; Kang, Heung Seok; Song, Kee Nam

    2002-01-01

    Contact tractions are analysed for the change in contact profile that occurs during the coining process of a thin strip material. Based on actual measurements of the coined material, the changed profile is assumed to be a concave circular arc in the central part of the contact region, smoothly connected with convex circular arcs on both sides. The profile is discretized and known solutions of singular integral equations are used. Since the contact profile affects the contact traction and the relevant tribological behaviour (e.g. wear) as well, an accurate definition of the profile is necessary in the analysis of material failure. A parametric study is conducted by varying the radii and distance of the arcs, which define the height difference between the summits of the arcs. The contact profile that gives a negligible variation of the traction, in comparison with that before the coining process, is identified

  10. Process development

    Energy Technology Data Exchange (ETDEWEB)

    Schuegerl, K

    1984-01-01

    The item 'process development' comprises the production of acetone/butanol with C. acetobutylicum and the fermentation of potato waste. The target is to increase productivity by taking the following measures: optimization of media, on-line process analysis, reaction analysis, mathematical modelling and parameter identification, process simulation, development of a state estimator with the help of the on-line process analysis and the model, and optimization and adaptive control.

  11. A real time analysis of the self-assembly process using thermal analysis inside the differential scanning calorimeter instrument.

    Science.gov (United States)

    Roy, Debmalya; Shastri, Babita; Mukhopadhyay, K

    2012-07-12

    The supramolecular assembly of the regioregular poly-3-hexylthiophene (rr-P3HT) in solution has been investigated thoroughly in the past. In the current study, our focus is on the enthalpy of nanofiber formation using thermal analysis techniques by performing the self-assembly process inside the differential scanning calorimetry (DSC) instrument. Thermogravimetric analysis (TGA) was carried out to check the concentration of the solvent during the self-assembly process of P3HT in p-xylene. Ultraviolet-visible (UV-vis) spectrophotometry, small-angle X-ray scattering (SAXS) experiments, atomic force microscopy (AFM), and scanning electron microscopy (SEM) images were used to characterize the different experimental yields generated by cooling the reaction mixture at desired temperatures. Comparison of the morphologies of self-assembled products at different fiber formation temperatures gives us an idea about the possible crystallization parameters which could affect the P3HT nanofiber morphology.
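    Extracting an enthalpy from a DSC trace amounts to integrating the heat-flow peak over time after baseline subtraction. The sketch below does this on a synthetic exotherm; the peak shape, baseline and sample mass are invented, not data from the study.

    ```python
    # Illustrative sketch: estimate a transition enthalpy by integrating a
    # DSC heat-flow peak over time after baseline subtraction (synthetic data).
    import numpy as np

    t = np.linspace(0.0, 600.0, 1201)                          # time, s
    baseline = 0.02                                            # mW, assumed flat
    exotherm = 0.8 * np.exp(-0.5 * ((t - 300.0) / 40.0) ** 2)  # mW, synthetic peak
    heat_flow = baseline + exotherm

    # Trapezoid-free integration over the uniform grid: mW * s = mJ.
    dt = t[1] - t[0]
    enthalpy_mJ = float((heat_flow - baseline).sum() * dt)

    sample_mass_mg = 5.0                                       # assumed
    print(f"transition enthalpy: {enthalpy_mJ / sample_mass_mg:.1f} J/g")
    ```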

  12. Automated analysis of heterogeneous carbon nanostructures by high-resolution electron microscopy and on-line image processing

    International Nuclear Information System (INIS)

    Toth, P.; Farrer, J.K.; Palotas, A.B.; Lighty, J.S.; Eddings, E.G.

    2013-01-01

    High-resolution electron microscopy is an efficient tool for characterizing heterogeneous nanostructures; however, currently the analysis is a laborious and time-consuming manual process. In order to be able to accurately and robustly quantify heterostructures, one must obtain a statistically high number of micrographs showing images of the appropriate sub-structures. The second step of analysis is usually the application of digital image processing techniques in order to extract meaningful structural descriptors from the acquired images. In this paper it will be shown that by applying on-line image processing and basic machine vision algorithms, it is possible to fully automate the image acquisition step; therefore, the number of acquired images in a given time can be increased drastically without the need for additional human labor. The proposed automation technique works by computing fields of structural descriptors in situ and thus outputs sets of the desired structural descriptors in real-time. The merits of the method are demonstrated by using combustion-generated black carbon samples. - Highlights: ► The HRTEM analysis of heterogeneous nanostructures is a tedious manual process. ► Automatic HRTEM image acquisition and analysis can improve data quantity and quality. ► We propose a method based on on-line image analysis for the automation of HRTEM image acquisition. ► The proposed method is demonstrated using HRTEM images of soot particles
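    The automation idea, computing descriptor values as frames arrive, can be sketched generically. The acquisition function below is a hypothetical stand-in for microscope I/O, and mean gradient magnitude is only a generic contrast proxy, not one of the paper's structural descriptors.

    ```python
    # Illustrative sketch: compute a simple per-frame descriptor in an
    # on-line acquisition loop (random images stand in for HRTEM frames).
    import numpy as np
    from scipy import ndimage

    def acquire_frame(rng):
        # Hypothetical stand-in for microscope frame acquisition.
        return rng.random((256, 256))

    rng = np.random.default_rng(0)
    for i in range(3):   # real systems would loop until acquisition stops
        frame = acquire_frame(rng)
        gx = ndimage.sobel(frame, axis=0)
        gy = ndimage.sobel(frame, axis=1)
        descriptor = float(np.hypot(gx, gy).mean())   # mean gradient magnitude
        print(f"frame {i}: descriptor = {descriptor:.3f}")
    ```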

  13. FOREWORD: Focus on Materials Analysis and Processing in Magnetic Fields Focus on Materials Analysis and Processing in Magnetic Fields

    Science.gov (United States)

    Sakka, Yoshio; Hirota, Noriyuki; Horii, Shigeru; Ando, Tsutomu

    2009-03-01

    Recently, interest in the applications of feeble (diamagnetic and paramagnetic) magnetic materials has grown, whereas the popularity of ferromagnetic materials remains steady and high. This trend is due to the progress of superconducting magnet technology, particularly liquid-helium-free superconducting magnets that can generate magnetic fields of 10 T and higher. As the magnetic energy is proportional to the square of the applied magnetic field, the magnetic energy of such 10 T magnets is in excess of 10 000 times that of conventional 0.1 T permanent magnets. Consequently, many interesting phenomena have been observed over the last decade, such as the Moses effect, magnetic levitation and the alignment of feeble magnetic materials. Researchers in this area are widely spread around the world, but their number in Japan is relatively high, which might explain the success of magnetic field science and technology in Japan. Processing in magnetic fields is a rapidly expanding research area with a wide range of promising applications in materials science. The 3rd International Workshop on Materials Analysis and Processing in Magnetic Fields (MAP3), which was held on 14-16 May 2008 at the University of Tokyo, Japan, focused on various topics including magnetic field effects on chemical, physical, biological, electrochemical, thermodynamic and hydrodynamic phenomena; magnetic field effects on the crystal growth and processing of materials; diamagnetic levitation, the magneto-Archimedes effect, spin chemistry, magnetic orientation, control of structure by magnetic fields, magnetic separation and purification, magnetic-field-induced phase transitions, properties of materials in high magnetic fields, the development of NMR and MRI, medical applications of magnetic fields, novel magnetic phenomena, physical property measurement by magnetic fields, and the generation of high magnetic fields. This focus issue compiles 13 key papers selected from the proceedings of MAP3. Other
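    The quoted factor follows from the quadratic dependence of magnetic energy density on field strength; this is elementary magnetostatics, spelled out here for clarity rather than taken from the workshop papers.

    ```latex
    u = \frac{B^2}{2\mu_0}, \qquad
    \frac{u(10\,\mathrm{T})}{u(0.1\,\mathrm{T})} = \left(\frac{10}{0.1}\right)^{2} = 10^{4}
    ```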

  14. Analysis of multiparty mediation processes

    NARCIS (Netherlands)

    Vuković, Siniša

    2013-01-01

    Crucial challenges for multiparty mediation processes include the achievement of adequate cooperation among the mediators and consequent coordination of their activities in the mediation process. Existing literature goes only as far as to make it clear that successful mediation requires necessary

  15. A Qualitative Analysis Framework Using Natural Language Processing and Graph Theory

    Science.gov (United States)

    Tierney, Patrick J.

    2012-01-01

    This paper introduces a method of extending natural language-based processing of qualitative data analysis with the use of a very quantitative tool--graph theory. It is not an attempt to convert qualitative research to a positivist approach with a mathematical black box, nor is it a "graphical solution". Rather, it is a method to help qualitative…
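    One concrete way to combine coded text with graph theory, in the spirit of the abstract, is a code co-occurrence graph. The coded segments, edge weighting and centrality choice below are illustrative assumptions, not the paper's actual framework.

    ```python
    # Illustrative sketch: build a co-occurrence graph from coded qualitative
    # segments and read structure off it with graph theory (toy data).
    import itertools
    import networkx as nx

    # Hypothetical coded interview excerpts: codes assigned to each segment.
    segments = [
        {"trust", "communication"},
        {"communication", "workload"},
        {"trust", "communication", "leadership"},
        {"workload", "leadership"},
    ]

    g = nx.Graph()
    for codes in segments:
        for a, b in itertools.combinations(sorted(codes), 2):
            w = g.get_edge_data(a, b, {"weight": 0})["weight"]
            g.add_edge(a, b, weight=w + 1)   # count co-occurrences as weights

    # Centrality then suggests which codes organize the qualitative data.
    centrality = nx.degree_centrality(g)
    print(sorted(centrality.items(), key=lambda kv: -kv[1]))
    ```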

  16. [Research on the method of copper converting process determination based on emission spectrum analysis].

    Science.gov (United States)

    Li, Xian-xin; Liu, Wen-qing; Zhang, Yu-jun; Si, Fu-qi; Dou, Ke; Wang, Feng-ping; Huang, Shu-hua; Fang, Wu; Wang, Wei-qiang; Huang, Yong-feng

    2012-05-01

    A method for determining the stage of the copper converting process based on PbO/PbS emission spectrum analysis is described. By comparison with the known emission spectra of gas molecules, the presence of PbO and PbS was confirmed in the measured spectrum. Field experiments established that the main emission during the slag stage comes from PbS, while the main emission during the copper stage comes from PbO. Relative changes in the PbO/PbS emission spectra therefore provide a means of determining the converting stage: using the relative intensities of the PbO and PbS bands, the copper smelting process can be divided into two stages, the slag stage (S phase) and the copper stage (B phase). Over a complete copper smelting cycle, with a receiving telescope of appropriate viewing angle aimed at the converter flame and noise filtering applied to the PbO/PbS emission spectrum, the stage determination agrees with actual production. Both theory and experiment show that determining the copper converting stage from emission spectrum analysis is feasible.
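    The decision rule reduces to a band-intensity ratio with a threshold. The sketch below follows the paper's criterion in spirit, but the intensity values and the threshold are hypothetical placeholders.

    ```python
    # Illustrative sketch: classify the converting stage from the relative
    # intensity of PbS vs. PbO emission bands (hypothetical values/threshold).
    def converting_stage(i_pbs: float, i_pbo: float, threshold: float = 1.0) -> str:
        """Return the stage label: PbS-dominated emission means slag stage."""
        ratio = i_pbs / i_pbo
        return "slag stage (S phase)" if ratio > threshold else "copper stage (B phase)"

    print(converting_stage(i_pbs=3.2, i_pbo=0.9))   # PbS-dominated -> slag stage
    print(converting_stage(i_pbs=0.4, i_pbo=2.7))   # PbO-dominated -> copper stage
    ```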

  17. Evaluation of transport safety analysis processes of radioactive material performed by a regulatory body

    International Nuclear Information System (INIS)

    Mattar, Patricia Morais

    2017-01-01

    Radioactive substances have many beneficial applications, ranging from power generation to uses in medicine, industry and agriculture. As a rule, they are produced in different places from where they are used, needing to be transported. In order for transport to take place safely and efficiently, national and international standards must be complied with. This research aims to assess the safety analysis processes for the transport of radioactive material carried out by the regulatory body in Brazil, from the point of view of their compliance with the International Atomic Energy Agency (IAEA) standards. The self-assessment methodology named SARIS, developed by the IAEA, was used. The following steps were carried out: evaluation of the Diagnosis and Processes Mapping; responses to the SARIS Question Set and complementary questions; SWOT analysis; interviews with stakeholders and evaluation of a TranSAS mission conducted by the IAEA in 2002. Considering only SARIS questions, processes are 100% adherent. The deepening of the research, however, led to the development of twenty-two improvement proposals and the identification of nine good practices. The results showed that the safety analysis processes of the transport of radioactive material are being carried out in a structured, safe and reliable way, but also that there is much opportunity for improvement. The formulation of an action plan, based on the presented proposals, can bring many benefits to the regulatory body. This would be an important step towards convening an external evaluation, providing greater reliability and transparency to the regulatory body's processes. (author)

  18. Design of sustainable chemical processes: Systematic retrofit analysis, generation and evaluation alternatives

    DEFF Research Database (Denmark)

    Carvalho, Ana; Gani, Rafiqul; Matos, Henrique

    2008-01-01

    …eliminating the need to identify trade-off-based solutions. These indicators are also able to reduce (where feasible) a set of safety indicators. An indicator sensitivity analysis algorithm has been added to the methodology to define design targets and to generate sustainable process alternatives. A computer-aided… tool has been developed to facilitate the calculations needed for the application of the methodology. The application of the indicator-based methodology and the developed software are highlighted through a process flowsheet for the production of vinyl chloride monomer (VCM)…

  19. A signal processing analysis of Purkinje cells in vitro

    Directory of Open Access Journals (Sweden)

    Ze'ev R Abrams

    2010-05-01

    Full Text Available Cerebellar Purkinje cells in vitro fire recurrent sequences of Sodium and Calcium spikes. Here, we analyze the Purkinje cell using harmonic analysis, and our experiments reveal that its output signal comprises three distinct frequency bands, which are combined using Amplitude and Frequency Modulation (AM/FM). We find that the three characteristic frequencies - Sodium, Calcium and Switching - occur in various combinations in all waveforms observed using whole-cell current clamp recordings. We found that the Calcium frequency can display a doubling of its frequency mode, and the Switching frequency can act as a possible generator of pauses that are typically seen in Purkinje output recordings. Using a reversibly photo-switchable kainate receptor agonist, we demonstrate the external modulation of the Calcium and Switching frequencies. These experiments and Fourier analysis suggest that the Purkinje cell can be understood as a harmonic signal oscillator, enabling a higher level of interpretation of Purkinje signaling based on modern signal processing techniques.
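    The band separation the abstract describes is easy to reproduce on a synthetic AM/FM signal with a Fourier transform. All three frequencies below are made up for illustration; only the structure (a slow switching gate, an amplitude-modulating band, and a fast carrier) mirrors the paper's description.

    ```python
    # Illustrative sketch: recover distinct frequency bands from a synthetic
    # AM-combined signal, mirroring the harmonic analysis of Purkinje output.
    import numpy as np

    fs = 2000.0                      # sampling rate, Hz
    t = np.arange(0.0, 4.0, 1.0 / fs)

    f_na, f_ca, f_sw = 80.0, 8.0, 0.5   # stand-ins for Sodium, Calcium, Switching
    # Switching gates the signal on/off; Calcium amplitude-modulates Sodium.
    gate = (np.sin(2 * np.pi * f_sw * t) > 0).astype(float)
    signal = gate * (1.0 + 0.5 * np.sin(2 * np.pi * f_ca * t)) \
                  * np.sin(2 * np.pi * f_na * t)

    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    top = freqs[np.argsort(spectrum)[-5:]]   # carrier plus modulation sidebands
    print("dominant frequencies (Hz):", np.round(np.sort(top), 2))
    ```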

  20. Stochastic analysis in production process and ecology under uncertainty

    CERN Document Server

    Bieda, Bogusław

    2014-01-01

    The monograph addresses the problem of stochastic analysis based on uncertainty assessment by simulation, and the application of this method in ecology and the steel industry under uncertainty. The first chapter defines the Monte Carlo (MC) method and random variables in stochastic models. Chapter two deals with contamination transport in porous media; a stochastic approach to modeling Municipal Solid Waste transit-time contaminants using MC simulation has been worked out. The third chapter describes the risk analysis of the waste-to-energy facility proposal for Konin city, including the financial aspects. An environmental impact assessment of the ArcelorMittal Steel Power Plant in Kraków is given in chapter four, where four scenarios of energy mix production processes were studied. Chapter five contains examples of using ecological Life Cycle Assessment (LCA) - a relatively new method of environmental impact assessment - which helps in preparing a pro-ecological strategy and which can lead to reducing t...