WorldWideScience

Sample records for hydrogasification process analysis

  1. Exergy analysis of a coal/biomass co-hydrogasification based chemical looping power generation system

    International Nuclear Information System (INIS)

    Yan, Linbo; Yue, Guangxi; He, Boshu

    2015-01-01

    Power generation from co-utilization of coal and biomass is attractive since this technology not only saves coal resources but also makes full use of biomass. In addition, with this concept, the net carbon discharge per unit of electric power generated can be sharply reduced. In this work, a coal/biomass co-hydrogasification based chemical looping power generation system is presented and analyzed with the assistance of Aspen Plus. The effects of different operating conditions, including the biomass mass fraction R_b, the hydrogen recycle ratio R_hr, the hydrogasification pressure P_hg, the iron-to-fuel mole ratio R_if, the reducer temperature T_re, the oxidizer temperature T_ox, and the fuel utilization factor U_f of the SOFC (solid oxide fuel cell), on the system operation results, including the energy efficiency η_e, the total energy efficiency η_te, the exergy efficiency η_ex, the total exergy efficiency η_tex and the carbon capture rate η_cc, are analyzed. The energy and exergy balances of the whole system are also calculated and the corresponding Sankey and Grassmann diagrams are drawn. Under the benchmark condition, the exergy efficiencies of the different units in the system are calculated. η_te, η_tex and η_cc of the system are found to be 43.6%, 41.2% and 99.1%, respectively. - Highlights: • A coal/biomass co-hydrogasification based chemical looping power generation system is set up. • Sankey and Grassmann diagrams are presented based on the energy and exergy balance calculations. • Sensitivity analysis is done to understand the system operation characteristics. • Total energy and exergy efficiencies of this system can be 43.6% and 41.2%, respectively. • About 99.1% of the carbon contained in the coal and biomass can be captured in this system.
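
    The headline metrics in this record reduce to simple ratios over the system's energy and exergy balances. A minimal sketch of those ratios follows, in plain Python; the stream values are placeholders for illustration, not figures from the paper.

```python
# Hedged illustration (not the paper's Aspen Plus model): total energy/exergy
# efficiencies and carbon capture rate as ratios over stream-level balances.
# All numeric inputs below are placeholders, not values from the study.

def total_energy_efficiency(p_net_mw, q_useful_mw, fuel_lhv_mw):
    """eta_te = (net power + useful heat) / fuel energy input (LHV basis)."""
    return (p_net_mw + q_useful_mw) / fuel_lhv_mw

def total_exergy_efficiency(p_net_mw, ex_heat_mw, fuel_exergy_mw):
    """eta_tex = (net power + exergy of useful heat) / fuel exergy input."""
    return (p_net_mw + ex_heat_mw) / fuel_exergy_mw

def carbon_capture_rate(c_captured_kg_s, c_in_fuel_kg_s):
    """eta_cc = carbon routed to the CO2 stream / carbon fed with coal+biomass."""
    return c_captured_kg_s / c_in_fuel_kg_s

if __name__ == "__main__":
    print(f"eta_te  = {total_energy_efficiency(95.0, 14.0, 250.0):.3f}")
    print(f"eta_tex = {total_exergy_efficiency(95.0, 6.0, 245.0):.3f}")
    print(f"eta_cc  = {carbon_capture_rate(9.9, 10.0):.3f}")
```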

  2. Prototype plant for nuclear process heat (PNP) - operation of the pilot plant for hydrogasification of coal

    International Nuclear Information System (INIS)

    Bruengel, N.; Dehms, G.; Fiedler, P.; Gerigk, H.P.; Ruddeck, W.; Schrader, L.; Schumacher, H.J.

    1988-04-01

    The Rheinische Braunkohlenwerke AG developed the process of hydrogasification of coal in a fluidized bed for generation of SNG. On the basis of test results obtained in a semi-technical pilot plant with a throughput of 250 kg/h of dried coal, a large pilot plant processing 10 t/h of dried brown coal was erected. This plant was on stream for about 14700 h, of which about 7800 h were with gasifier operation; during this time about 38000 t of dried brown coal from the Rhenish district, containing 4 to 25% ash, were processed. At pressures of 60 to 120 bar and temperatures of 800 to 935 °C, carbon conversion rates of up to 81 percent and methane outputs of 5000 m³ (STP)/h were reached. The decisive parameter for methane generation was the hydrogen/coal ratio. Even at high moisture contents, which usually diminish the methane yield from the coal considerably, high methane yields could be obtained by using high hydrogen/coal ratios. The gasifier itself caused no trouble during the entire operating time. Difficulties with the original design of the residual char cooler could be overcome by changing over from water injection to liquid carbon dioxide. The design of the heat recovery system proved sound. Altogether, the scale-up of the gasifier from the semi-technical to the large pilot plant, as well as the harmonization of gas generation and gas refining, was thus proven. (orig.) With 20 refs., 20 tabs., 81 figs [de
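
    A quick back-of-envelope check relates the reported feed and methane rates. The sketch below assumes a carbon mass fraction of about 0.65 for dried Rhenish brown coal (an assumption, not a figure from the report); the remaining converted carbon leaves in other gaseous products, consistent with conversion rates of up to 81 percent.

```python
# Back-of-envelope check on the pilot-plant figures above. The carbon content
# of dried brown coal is assumed (~0.65 mass fraction), not taken from the report.

COAL_FEED_KG_H = 10_000        # 10 t/h dried brown coal (reported)
CARBON_FRACTION = 0.65         # assumed carbon mass fraction of the dried coal
CH4_STP_M3_H = 5_000           # reported methane output, m3 (STP)/h
MOL_VOL_STP = 22.414e-3        # molar volume at STP, m3/mol
M_C = 12.011e-3                # molar mass of carbon, kg/mol

carbon_in = COAL_FEED_KG_H * CARBON_FRACTION            # kg C/h fed
carbon_to_ch4 = CH4_STP_M3_H / MOL_VOL_STP * M_C        # kg C/h leaving as CH4

print(f"carbon fed:        {carbon_in:8.0f} kg/h")
print(f"carbon in methane: {carbon_to_ch4:8.0f} kg/h")
print(f"fraction of feed carbon in CH4: {carbon_to_ch4 / carbon_in:.2f}")
```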

  3. Development of a Hydrogasification Process for Co-Production of Substitute Natural Gas (SNG) and Electric Power from Western Coals

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Xiaolei [Arizona Public Service Company, Phoenix, AZ (United States); Rink, Nancy [Arizona Public Service Company, Phoenix, AZ (United States)

    2011-04-30

    This report presents the results of the research and development conducted on an Advanced Hydrogasification Process (AHP) conceived and developed by Arizona Public Service Company (APS) under U.S. Department of Energy (DOE) contract DE-FC26-06NT42759 for Substitute Natural Gas (SNG) production from western coal. A double-wall (i.e., a hydrogasification reactor contained within a pressure shell) down-flow hydrogasification reactor was designed, engineered, constructed, commissioned and operated by APS, Phoenix, AZ. The reactor is ASME-certified under Section VIII with a rating of 1150 pounds per square inch gauge (psig) maximum allowable working pressure at 1950 degrees Fahrenheit (°F). The reaction zone had a 1.75 inch inner diameter and a length of 13 feet. The initial testing of a sub-bituminous coal demonstrated ~50% carbon conversion and ~10% methane yield in the product gas at 1625°F and 1000 psig pressure, with an 11 second (s) residence time and a 0.4 hydrogen-to-coal mass ratio. Liquid by-products mainly contained benzene, toluene, xylene (BTX) and tar. Char collected from the bottom of the reactor had a heating value of 9000 British thermal units per pound (Btu/lb). A three-dimensional (3D) computational fluid dynamics model simulation of the hydrodynamics around the reactor head was utilized to design the nozzles for injecting the hydrogen into the gasifier, optimizing gas-solid mixing to achieve improved carbon conversion. The report also presents the evaluation of using algae for carbon dioxide (CO2) management and biofuel production. Nannochloropsis, Selenastrum and Scenedesmus were determined to be the best algae strains for the project purpose and were studied in an outdoor system which included a 6-meter (6M) radius cultivator with a total surface area of 113 square meters (m2) and a total culture volume between 10,000 and 15,000 liters (L); a CO2 on-demand feeding system; an on-line data collection system for temperature, p
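
    The reported 11 s residence time, together with the stated reactor geometry, fixes the implied mean axial velocity under a plug-flow picture. A minimal consistency check, assuming the solids travel near the mean gas velocity:

```python
# Rough plug-flow consistency check on the reactor geometry and the reported
# 11 s residence time. Assumes solids move near the mean gas velocity.
import math

ID_M = 1.75 * 0.0254      # reactor inner diameter, m (1.75 in, reported)
LEN_M = 13 * 0.3048       # reaction zone length, m (13 ft, reported)
TAU_S = 11.0              # reported residence time, s

velocity = LEN_M / TAU_S                 # implied mean axial velocity, m/s
area = math.pi * (ID_M / 2) ** 2         # cross-sectional area, m2
vol_flow = velocity * area               # implied volumetric flow at reactor T, P

print(f"implied mean velocity:   {velocity:.2f} m/s")
print(f"implied volumetric flow: {vol_flow * 1000:.2f} L/s at reactor conditions")
```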

  4. Development of a Hydrogasification Process for Co-Production of Substitute Natural Gas (SNG) and Electric Power from Western Coals-Phase I

    Energy Technology Data Exchange (ETDEWEB)

    Raymond Hobbs

    2007-05-31

    The Advanced Hydrogasification Process (AHP)--conversion of coal to methane--is being developed through NETL with a DOE grant and has successfully completed its first phase of development. The results so far are encouraging and have led to a commitment by DOE/NETL to begin a second phase--bench-scale reactor vessel testing, expanded engineering analysis and economic perspective review. During the next decade new means of generating electricity, and other forms of energy, will be introduced. The members of the AHP Team envision a need for expanded sources of natural gas, or substitutes for natural gas, to fuel power generating plants. The initial work the team has completed on a process to use hydrogen to convert coal to methane (pipeline-ready gas) shows promising potential. The Team has intentionally slanted its efforts toward the needs of US electric utilities, particularly toward fuels that can be used near urban centers where the greatest need for new electric generation is found. The process, as it has evolved, would produce methane from coal by adding hydrogen. The process appears to be efficient using western coals for conversion to a highly sought-after fuel with significantly reduced CO{sub 2} emissions. Utilities have a natural interest in the preservation of their industry, which will require a dramatic reduction in stack emissions and an increase in sustainable technologies. Utilities tend to rank long-term stable supplies of fuel higher than most industries and are willing to trade some ratio of cost for stability. The need for sustainability, stability and environmentally compatible production are key drivers in the formation and progression of the AHP development. In Phase II, the team will add a focus on water conservation to determine how the basic gasification process can be best integrated with all the plant components to minimize water consumption during SNG production. The process allows for several CO{sub 2} reduction options including consumption of

  5. Production of Fischer–Tropsch fuels and electricity from bituminous coal based on steam hydrogasification

    International Nuclear Information System (INIS)

    Lu, Xiaoming; Norbeck, Joseph M.; Park, Chan S.

    2012-01-01

    A new thermochemical process for coproduction of Fischer–Tropsch (FT) fuels and electricity based on steam hydrogasification is presented and evaluated in this study. The core units are the steam hydrogasification reactor (SHR), the steam methane reformer (SMR) and the Fischer–Tropsch reactor (FTR). A key feature of the SHR is the enhanced conversion of carbon into methane in a high-steam environment with hydrogen, with no need for a catalyst or the use of oxygen. Facilities utilizing bituminous coal for coproduction of FT fuels and electricity with carbon dioxide sequestration are designed in detail. Cases with a design capacity of either 400 or 4000 tonnes per day (TPD, dry basis) are investigated with process modeling and cost estimation. A cash flow analysis is performed to determine the fuels production cost (PC). The analysis shows that the 400 TPD case, with an FT fuels PC of 5.99 $/gallon diesel equivalent, results in a plant design that is wholly uneconomic. The 4000 TPD plant design is expected to produce 7143 bbl/day of FT liquids with a PC of 2.02 $/gallon and 2.27 $/gallon diesel equivalent at overall carbon capture ratios of 65% and 90%, respectively. Prospective commercial economics benefit from increasing plant size and from improvements gained through large-scale demonstration of steam hydrogasification. -- Highlights: ► We develop a new thermochemical method for synthetic fuels production. ► Detailed plant design and process modeling for the coal-to-liquid facilities are performed. ► Economic analysis has been carried out to determine the fuel production cost and IRR. ► The fuels produced in this study can compete with petroleum when the crude oil price is 100 $/bbl. ► Further economic benefit comes with plant scale-up and commercial-scale demonstration efforts.
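
    The cash flow analysis itself is not reproduced in the abstract; as a hedged stand-in, a levelized production cost can be sketched with a capital recovery factor. The capital cost, O&M and stream factor below are placeholders; only the 7143 bbl/day output figure comes from the study.

```python
# Minimal levelized-production-cost sketch in the spirit of the cash flow
# analysis described above. Capex, O&M and stream factor are placeholders,
# not the study's actual inputs; only the 7143 bbl/day output is reported.

def levelized_cost(capex, annual_opex, annual_gallons, rate=0.10, years=30):
    """Levelized $/gallon: annualize capex with a capital recovery factor."""
    crf = rate * (1 + rate) ** years / ((1 + rate) ** years - 1)
    return (capex * crf + annual_opex) / annual_gallons

# 7143 bbl/day of FT liquids (reported), 42 gal/bbl, ~90% stream factor.
annual_gal = 7143 * 42 * 365 * 0.90
print(f"PC = {levelized_cost(1.5e9, 1.2e8, annual_gal):.2f} $/gallon")
```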

  6. Pilot plant for hydrogasification of coal with nuclear heat

    International Nuclear Information System (INIS)

    Falkenhain, G.; Velling, G.

    1976-01-01

    In the framework of a research and development programme sponsored by the Ministry of Research and Technology of the Federal Republic of Germany, two process variants for hydrogasification of coal by means of nuclear heat have been developed by the Rheinische Braunkohlenwerke AG, Cologne. For testing these process variants, a semi-technical pilot plant for gasification of coal under pressure in a fluidized bed was constructed. The pilot plant, in which gasification of lignite and hard coal is planned, is designed for a throughput of 100 kg carbon per hour, corresponding to 400 kg raw lignite per hour or 150 kg hard coal per hour. The plant should provide data on the influence of the most essential process parameters (pressure, temperature, residence time of gas and coal, type and pre-treatment of feed coal) on the performance of gasification and the raw gas composition. Different plant components will also be tested. Since the pilot plant will permit testing of both process variants of hydrogasification, it was designed in such a way that a great number of process parameters can be varied. Thus, for instance, the pressure can be chosen in a range up to 100 bar, and pure hydrogen or mixtures of hydrogen, carbon monoxide and steam can be applied as gasification agents. The gasifier is an internally insulated fluidized bed reactor with an inner diameter of 200 mm and a height of about 8 m, to which an internally insulated cyclone for separation of the entrained fines is attached. The raw gas is then cooled down by direct water scrubbing. (author)
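
    The stated throughput equivalences imply the carbon fractions assumed for each feed, which can be checked directly:

```python
# The abstract's throughput equivalences (100 kg C/h = 400 kg/h raw lignite
# = 150 kg/h hard coal) imply the carbon fraction assumed for each feed.

CARBON_KG_H = 100
feeds = {"raw lignite": 400, "hard coal": 150}  # kg/h feed per 100 kg C/h

for name, rate in feeds.items():
    print(f"{name}: implied carbon fraction = {CARBON_KG_H / rate:.2f}")
# raw lignite: 0.25, hard coal: ~0.67
```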

  7. Fiscal 1991 survey report. Coal hydrogasification technology development; 1991 nendo sekitan suiten gaska gijutsu kaihatsu chosa hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1992-03-01

    Surveys and studies were conducted to establish a practical SNG (substitute natural gas) production process. The survey of basic studies on hydrogasification dealt with the effects of gasification conditions, the mechanism of tar decomposition, model-based estimation and assessment of the reaction heat, and so forth. In the effort to develop a reactor, the current status was reviewed and future tasks were identified for the one-through type and the internal circulation type entrained bed hydrogasification furnaces. In the study of practical application of the coal hydrogasification process, it was found that the cold gas efficiency could be increased from the previous fiscal year's 75.2% to approximately 78% by optimizing the process configuration. An ARCH (advanced rapid coal hydrogasification) process with a novel reactor was proposed and, for its commercialization, guidelines for scaling up the process were worked out and the tasks to be addressed at each development stage were identified. With regard to pilot tests, an efficient development program comprising ARCH-1 and ARCH-2 was deliberated. (NEDO)

  8. The behavior of catalysts in hydrogasification of sub-bituminous coal in pressured fluidized bed

    International Nuclear Information System (INIS)

    Yan, Shuai; Bi, Jicheng; Qu, Xuan

    2017-01-01

    Highlights: • CCHG in a pressurized fluidized bed achieved a CH4 yield of 77.3 wt.% in 30 min. • Co-Ca and Ni-Ca triggered catalytic coal pyrolysis and char hydrogasification. • The reason for the better catalytic performance of 5%Co-1%Ca was elucidated. • Sintered catalyst blocked the reactive sites and suppressed coal conversion. • Co-Ca made the catalyzed coal char rich in mesopore structures and reactive sites. -- Abstract: The catalytic hydrogasification of a sub-bituminous coal was carried out in a lab-scale pressurized fluidized bed with Co-Ca, Ni-Ca and Fe-Ca as catalysts at 850 °C and 3 MPa. The effect of the different catalysts on the characteristics of the gasification products was investigated, and the behavior of the catalysts was explored by means of X-ray diffraction (XRD), FT-Raman spectroscopy, Brunauer–Emmett–Teller (BET) analysis, etc. Experimental results showed that all the catalysts promoted carbon conversion in catalytic coal hydrogasification (CCHG), with catalytic activity in the order 5%Co-1%Ca > 5%Ni-1%Ca > 5%Fe-1%Ca. Compared with raw coal hydrogasification, the carbon conversion increased from 43.4 wt.% to 91.3 wt.%, and the CH4 yield increased from 23.7 wt.% to 77.3 wt.% within 30 min after adding the 5%Co-1%Ca catalyst to the coal. Co-Ca and Ni-Ca catalyzed both coal pyrolysis and coal-char hydrogasification in CCHG, whereby graphitization of the coal was suppressed and the methane formation rate was significantly accelerated. Fe/Co/Ni-Ca could penetrate into the interior of the coal during CCHG, so that catalytic CH4 production took place within the pore structures. The activity difference among the catalysts was owing to their differing abilities to rupture the amorphous C−C bonds in the coal structure. The incomplete carbon conversion of the 5%Co-1%Ca loaded coal was due to agglomeration of the catalyst and blockage of the reactive sites by the sintered catalyst. This work will provide
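
    For a rough sense of what the reported conversions imply kinetically, apparent first-order rate constants can be backed out of the 30 min conversions. This single-exponential treatment is an illustration only, not the paper's kinetic analysis.

```python
# Hedged illustration: apparent first-order rate constants implied by the
# reported 30 min carbon conversions, assuming dX/dt = k(1 - X). This is a
# simplification for comparison only, not the paper's kinetic treatment.
import math

T_MIN = 30.0
conversions = {"raw coal": 0.434, "5%Co-1%Ca": 0.913}  # reported, as fractions

for case, x in conversions.items():
    k = -math.log(1.0 - x) / T_MIN
    print(f"{case}: k = {k:.3f} 1/min")
# The catalyst raises the apparent rate constant by roughly a factor of 4.
```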

  9. Fiscal 1992 survey report. Coal hydrogasification technology development; 1992 nendo sekitan suiten gaska gijutsu kaihatsu chosa hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1993-03-01

    Surveys and studies were conducted to establish a practical SNG (substitute natural gas) production process. In the study of coal hydrogasification, a mathematical simulation was carried out to estimate the product distribution when the pyrolysis reaction and the hydrogenolysis reaction, the two key concepts of the ARCH-2 (advanced rapid coal hydrogasification-2) process, are controlled independently. It showed that two-stage reaction control would increase the liquid yield. A tentative calculation was also made of the cold gas efficiency and cost performance of a process capable of achieving the target liquid yield. It was found that BTX (benzene, toluene, xylene) production of up to approximately 15% in terms of carbon was feasible, and that the SNG price would be 29.03 yen/Nm{sup 3} with benzene priced at 90 yen/kg, both promising better results than the ARCH-1 process. The cold gas efficiency of the ARCH-2 process was only 72.0% or less, however, and demanded improvement. Based on the results of the studies in progress since fiscal 1990, consideration was given to what form the hydrogasification process to be developed by Japan should take. (NEDO)
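
    The 29.03 yen/Nm{sup 3} figure reflects a by-product credit: BTX revenue offsets the gross SNG manufacturing cost. A minimal sketch of that logic follows; the gross cost and specific benzene yield are assumptions for illustration, not the survey's figures.

```python
# Sketch of the by-product credit logic behind an SNG price net of BTX
# revenue. The gross cost and specific yield below are assumed placeholders.

def sng_price(gross_cost_yen_nm3, benzene_kg_per_nm3, benzene_yen_kg):
    """Net SNG price = gross manufacturing cost less by-product credit."""
    return gross_cost_yen_nm3 - benzene_kg_per_nm3 * benzene_yen_kg

# With assumed gross cost 33.5 yen/Nm3 and 0.05 kg benzene credited per Nm3
# of SNG at the quoted 90 yen/kg, the net price lands near the reported value.
print(f"{sng_price(33.5, 0.05, 90.0):.2f} yen/Nm3")
```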

  10. Fiscal 1993 survey report. Coal hydrogasification technology development; 1993 nendo sekitan suiten gaska gijutsu kaihatsu chosa hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-03-01

    Surveys and studies were conducted to establish a practical SNG (substitute natural gas) production process. The hydrogasification process that Japan should develop is a flexible one that operates in three modes: maximum SNG yield, maximum heat efficiency, and maximum BTX (benzene, toluene, xylene) yield. Accordingly, an ARCH (advanced rapid coal hydrogasification) process was proposed, provided with a reactor capable of ARCH-1 type operation for maximum cold gas efficiency and ARCH-2 type operation for maximum liquid yield. As for the details of the ARCH process development, the timing and priority of development were determined for each item in consideration of the technical content and the development steps in the progression from a bench plant to a demonstration plant. The technology of char cooling and extraction was identified as the first item to be dealt with immediately. As for the development of the hydrogasification reactor, it was concluded that it was appropriate to begin with the development of an injector. According to the development plan, the cost required up to a pilot plant test was estimated at 2 billion yen. (NEDO)

  11. Fiscal 1996 New Sunshine Plan auxiliary project. Report on results on development of coal hydrogasification technology (Support research); 1996 nendo New Sunshine keikaku hojo jigyo seika hokokusho. Sekitan suiso tenka gaska gijutsu kaihatsu shien kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    R and D was conducted for the purpose of developing a substitute natural gas manufacturing process that uses coal as the raw material and is both highly efficient and environmentally superior; the fiscal 1996 results are reported. Test research was carried out on the coal hydrogasification process, which was selected as the optimum process based on the studies of previous years. On the basis of those results, a free-fall pyrolyzer was adopted as the test equipment for research on the behavior of hetero-elements. Its basic specifications were set based on a hydrogen/coal ratio of 0.2, a temperature of 950 degrees C and a pressure of 7 MPa, the reaction conditions of the ARCH reactor, and the basic design was drafted accordingly. In a preliminary examination of the mechanisms of the coal hydrogasification reaction, 20 representative kinds of coal were analyzed, and the gasification behavior of six of them was studied using a constant-heating-rate pyrolyzer. As a result, from the temperature dependence of the methane formation rate, it was inferred that three components of the coal have a bearing on the gasification behavior. (NEDO)

  12. Appearance of rapid carbon on hydrogasification of coal; Suiten gas ka ni okeru kokassei tanso no hatsugen

    Energy Technology Data Exchange (ETDEWEB)

    Soneda, Y.; Makino, M. [National Institute for Resources and Environment, Tsukuba (Japan)]; Xu, W. [New Energy and Industrial Technology Development Organization, Tokyo (Japan)

    1998-09-20

    The technology of hydrogasification of coal, now under development in a national project, aims to produce light oil as well as methane-rich high-calorie gas through the direct reaction between coal and hydrogen, and is expected to cope with the difficult natural gas supply/demand situation anticipated for the future. Although a full understanding of the reaction between coal and hydrogen is essential to complete this technology, many tasks remain. One of them is elucidation of the behavior of what is termed rapid carbon, which appears upon rapid heating of coal in a high-pressure hydrogen environment. In this paper, some interesting findings about the appearance of rapid carbon are reported. When coal is placed in such an environment, the volatile components are lost first of all, and then the active carbon reaction occurs. Observation of the behavior of the active carbon shows that its quantity is not small, and observation of its appearance and deactivation during the reaction justifies regarding it as one of the primary reactions in the hydrogasification process. Accordingly, systematic studies of its physical and chemical features from various viewpoints are necessary. 5 refs., 3 figs., 1 tab.

  13. Achievement report for fiscal 1997 (edition B) on auxiliary New Sunshine Project. Development of coal hydrogasification technology (Research by using experimental device); 1997 nendo New Sunshine keikaku hojo jigyo seika hokokusho (B ban). Sekitan suiso tenka gaska gijutsu kaihatsu - Jikken sochi ni yoru kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-03-01

    With the objective of putting the coal hydrogasification technology (the ARCH process) to practical use, developmental research has been performed. This paper summarizes the achievements in fiscal 1997. In the research using a small testing device, Taiheiyo coal was used to derive hydrogasification data (distribution and yield of the reaction products) while varying the temperature, residence time and H{sub 2}/coal ratio at a pressure of 7.0 MPa. In the developmental research on the injector, a test to verify mixing performance was performed by simulating the coal/hydrogen system with gas/gas and coal/gas systems at normal temperature and pressure. Furthermore, heat conduction analysis and the cooling structure were discussed, and an injector was designed and fabricated. With respect to the hot model test to verify the performance of the injector, detailed design and partial fabrication of the test device were carried out. In addition, a coal/gas mixing simulation, modeling the dispersion and mixing states of the coal, was developed as the first phase of a mixing and temperature rise simulation. (NEDO)

  14. Achievement report for fiscal 1997 on investigative research on society compatibility of development of coal hydrogasification technology; 1997 nendo sekitan suiso tenka gas ka gijutsu kaihatsu shakai tekigosei ni kansuru chosa kenkyu seika hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-03-01

    In view of the possibility of future tightness in natural gas supply, the final objective was set as establishment of a coal gasification technology that can cheaply and stably supply high-quality substitute natural gas using coal, which exists abundantly around the world, as the raw material. Investigative research is being carried out under a five-year plan on the social compatibility required to assess the possibility of its practical application. In fiscal 1997, the 'survey on process level elevation' and the 'survey on social compatibility' were continued from the previous year. This report summarizes the achievements. In the investigative research on process level elevation, Shell's methane synthesis process based on an oxygen-blown, dry-feed coal gasifier was evaluated, and the material balance calculation for a hydrogasification reactor performed in the 'survey on developing the coal hydrogasification technology' was followed through and its reasonableness verified. In the survey on the social compatibility of the process, a survey was carried out on natural gas (including non-conventional methane hydrate and coal bed methane) and coals as raw materials for hydrogasification. (NEDO)

  15. Fiscal 1998 New Sunshine Plan auxiliary project. Report on results on development of coal hydrogasification technology (Support research); 1998 nendo New Sunshine keikaku hojo jigyo seika hokokusho. Sekitan suiso tenka gaska gijutsu kaihatsu shien kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    R and D was conducted for the purpose of developing a substitute natural gas manufacturing process that uses coal as the raw material and is both highly efficient and environmentally superior; the fiscal 1998 results are reported. A hydrogasification test was conducted on Taiheiyo coal at a temperature of 1,173 K and a pressure of 7 MPa, which showed that evolution of all gaseous products other than methane ceased roughly during the heat-up process, while methane continued to evolve and showed the highest yield. In a reactivity comparison of various types of coal, coals with a carbon content of 80% or below were highly reactive and considered suitable as hydrogasification feedstock. It was also suggested that the hydrogasification reactivity of low-rank coals, including sub-bituminous coals and below, might be greatly affected by the presence or absence of catalytic effects of ion-exchanged metals. Experiments on the behavior of sulfur and nitrogen in the coal during the hydrogenation reaction were carried out using a continuous free-fall reactor, which elucidated the effects of hydrogen pressure and gas residence time among the various operating parameters. (NEDO)

  16. Fiscal 1992 report. Overseas surveys out of surveys for coal hydrogasification technology development; 1992 nendo sekitan suiten gaska gijutsu kaihatsu chosa ni okeru kaigai chosa hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1992-12-01

    As part of the coal hydrogasification technology development survey project, overseas surveys were carried out as in the preceding fiscal year. With emphasis placed on process materials and resources and on product utilization technologies, surveys and studies were conducted on trends in the development of coal and natural gas resources, and information was collected on energy-related matters in Indonesia and Australia. The need for hydrogasification technology was investigated from the viewpoint of natural resources. Moreover, Japanese engineers were dispatched to the APEC (Asia-Pacific Economic Cooperation) New Energy Seminar in Indonesia. Visits were made for information on natural gas resources at an LNG base in East Kalimantan, Indonesia; on coal gasification, energy, and other matters at CSIRO (Commonwealth Scientific and Industrial Research Organisation), Australia; on coal bed methane resources at the Warren Centre, University of Sydney, Australia; on coal bed methane resources at the Brisbane office of Mitsubishi Gas Chemical Company, Inc.; and on coal resources at the mines of Idemitsu South Queensland Coal Pty Ltd. (NEDO)

  17. Fiscal 1991 report. Overseas surveys out of surveys for coal hydrogasification technology development; 1991 nendo sekitan suiten gaska gijutsu kaihatsu chosa ni okeru kaigai chosa hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1991-11-01

    For the selection and evaluation of coal gasification processes suitable for substitute natural gas (SNG) production, visits were made to business corporations, research institutes, etc., engaged in the development of coal gasification technology abroad; the status of development overseas was surveyed and information was collected. Visits were made and information collected on the Lurgi process, a commercial SNG plant, and related matters at Dakota Gasification Company, U.S.; the U-Gas process and others at the Institute of Gas Technology; energy-related matters at the U.S. Department of Energy; the coal hydrogasification process and others at the Midlands Research Station, British Gas plc; the Shell coal gasification process and others at the Amsterdam Research Institute, Royal Dutch Shell; coal gasification, high-temperature desulfurization, and others at KEMA, the Netherlands; and the IGCC (integrated gasification combined cycle) demonstration plant incorporating the Shell coal gasification process, then under construction at Demkolec. (NEDO)

  18. Fiscal 1994 survey report. Coal hydrogasification technology development; 1994 nendo sekitan suiten gaska gijutsu kaihatsu chosa hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-03-01

    Surveys and studies were conducted to establish a practical SNG (substitute natural gas) production process. For development of the ARCH (advanced rapid coal hydrogasification) process, a plan was prepared covering the basic concept of the process, the overall development program, the hydrogen/oxygen burner, and the injector. The overall development program comprises element studies (4 years) and operation of a 50 tons/day pilot plant (8 years), and deals with the development of the reactor and peripheral equipment. Next comes total system verification using a 200 tons/day verification plant in combination with a hydrogen production process, aiming at commercialization at 3 million Nm{sup 3}/day. As for the hydrogen/oxygen burner, a structure was proposed after surveys of the literature and patents on burner structures, ignition methods, and monitoring methods. For the development of the injector, a plan was prepared for testing and improving the performance, in cold and hot models, of a specimen incorporating the proposed hydrogen/oxygen burner. Basic studies to be carried out include simulation-aided performance prediction. (NEDO)

  19. Commissioned operation report for fiscal 1991 on commissioning of surveying high-level development and effective utilization of natural gas, and development of coal hydrogasification technology; 1991 nendo tennen gas kodo kaihatsu yuko riyo chosa tou itaku gyomu hokokusho. Sekitan suiten gaska gijutsu kaihatsu chosa

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1992-03-01

    With the objective of establishing a practically usable process for manufacturing substitute natural gas, the technical, economic, and developmental problems involved were discussed. This paper summarizes the achievements in fiscal 1991. The surveys of the current fiscal year are summarized as follows: the coal hydrogasification process is regarded as highly necessary to the gas industry because of its high thermal efficiency and low gas cost; to evaluate the reaction heat of the hydrogasification reaction, a flexible mathematical model was constructed, and a large number of findings were derived, including reactor performance and optimum operating conditions; in addition to clarifying the conditions for an entrained bed hydrogasification reactor, the internally circulating reactor and the one-through reactor were compared and discussed; thermal efficiency and gas cost were studied for the optimized process configuration based on the ARCH-1 process; and a test was proposed for a new reactor with a two-step reaction zone that could be expected to increase the liquid yield and contribute to reducing the gas cost. (NEDO)

  20. The hydrogasification of lignite and sub-bituminous coals

    Science.gov (United States)

    Bhatt, B.; Fallon, P. T.; Steinberg, M.

    1981-02-01

    A North Dakota lignite and a New Mexico sub-bituminous coal have been hydrogenated at up to 900°C and 2500 psi hydrogen pressure. Yields of gaseous hydrocarbons and aromatic liquids have been studied as a function of temperature, pressure, residence time, feed rate and H2/coal ratio. Coal feed rates in excess of 10 lb/hr have been achieved in the 1 in. I.D. × 8 ft reactor, and methane concentrations as high as 55% have been observed. A four-step reaction model was developed for the production and decomposition of the hydrocarbon products. A single objective function, formulated from the weighted errors for the four dependent process variables (CH4, C2H6, BTX, and oil yields), was minimized using a program containing three independent iterative techniques. The results of the nonlinear regression analysis for lignite show that a chemical reaction model first-order with respect to carbon conversion satisfactorily describes the dilute-phase hydrogenation. The activation energy for initial product formation was estimated to be 42,700 cal/gmole, and the power of the hydrogen partial pressure was found to be +0.14. The overall correlation coefficient was 0.83. The mechanism, rate expressions, and design curves developed can be used for scale-up and reactor design.
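
    The reported rate form (first order in unreacted carbon, Arrhenius temperature dependence with an activation energy of 42,700 cal/gmole, hydrogen partial pressure to the +0.14 power) can be sketched directly. The pre-exponential factor below is a placeholder chosen for illustration, since the abstract does not give one.

```python
# Sketch of the reported lignite kinetics: first order in unreacted carbon,
# Arrhenius activation energy 42,700 cal/gmole, hydrogen partial pressure to
# the +0.14 power. The pre-exponential factor A is an assumed placeholder.
import math

R_CAL = 1.987          # gas constant, cal/(gmol K)
E_A = 42_700.0         # activation energy, cal/gmole (reported)
N_H2 = 0.14            # hydrogen partial-pressure exponent (reported)
A = 5.0e6              # pre-exponential factor, 1/s (assumed, not reported)

def rate_constant(temp_k, p_h2_atm):
    """k = A * exp(-E/RT) * P_H2^0.14."""
    return A * math.exp(-E_A / (R_CAL * temp_k)) * p_h2_atm ** N_H2

def conversion(temp_k, p_h2_atm, t_s):
    """X(t) = 1 - exp(-k t) for first-order consumption of carbon."""
    return 1.0 - math.exp(-rate_constant(temp_k, p_h2_atm) * t_s)

# Roughly 900 C (1173 K) and 2500 psi (~170 atm) H2, 10 s residence time.
print(f"X = {conversion(1173.0, 170.0, 10.0):.2f}")
```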

  1. Fiscal 1994 entrusted task report. Surveys of advanced natural gas development and efficient utilization (Survey of coal hydrogasification technology development); 1994 nendo tennen gas kodo kaihatsu yuko riyo chosa tou itaku gyomu hokokusho. Sekitan suiten gaska gijutsu kaihatsu chosa

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-03-01

    For the establishment of a practical process for substitute natural gas (SNG) production, technological and economic assessments were made, and the tasks to be addressed in the development were discussed. In this fiscal year, the results of the surveys conducted over the past five years were compiled, and studies were made to prepare for a smooth transition to the element research stage. The findings are as follows. SNG production technologies need to be developed, with demand for SNG increasing sharply, to further stabilize the base of SNG supply; coal, which is abundantly available, should be used as the raw material for SNG; and coal hydrogasification is, among the various methods for producing SNG from coal, the most suitable in view of efficiency and cost performance. A prolonged study on improving efficiency and cost performance also found a high probability that the yield of BTX (benzene, toluene, xylene) could be increased and cost performance thereby improved. In addition, a basic plan and an element technology research plan were prepared for development of the ARCH (advanced rapid coal hydrogasification) process. (NEDO)

  2. Fiscal 1990 report. Overseas surveys out of surveys for coal hydrogasification technology development; 1990 nendo sekitan suiten gaska gijutsu kaihatsu chosa ni okeru kaigai chosa hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1990-12-01

    For the selection and evaluation of coal gasification processes suitable for substitute natural gas (SNG) production, visits were made to overseas business corporations, research institutes, etc., engaged in the development of coal gasification technology; the status of development abroad was surveyed and information was collected. Visited were the Westfield Development Center, British Gas plc; the Midlands Research Station, British Gas plc; IEA Coal Research; IFP (Institut Francais du Petrole); and DMT-FP (DMT-Gesellschaft für Forschung und Prüfung mbH). The Westfield Development Center uses coal from nearby open-cut mines and supplies town gas to the Scottish region; the slagging Lurgi process, among others, was investigated there. At the Midlands Research Station, where a coal hydrogasification process is under development, the history of the development and the cold model test were summarized, a test plan using a 5 tons/day pilot plant and the modification of test facilities were explained, and the 5 tons/day pilot plant was visited for study. (NEDO)

  3. Catalysis of metal-clay intercalation compound in the low temperature coal hydrogasification

    Energy Technology Data Exchange (ETDEWEB)

    Fuda, Kiyoshi; Kimura, Mitsuhiko; Miyamoto, Norimitsu; Matsunaga, Toshiaki

    1986-10-23

    Focusing on hydrogenating methanation of low-temperature volatile components by gas-phase catalytic reactions, the catalytic effects of Ni metal and the effects of the carriers, which strongly influence the catalytic activity of Ni metal, were studied. Sample coals were prepared from Shin-Yubari coal, and two types of catalyst were prepared: Ni hydride-montmorillonite intercalation compound catalysts, and catalysts produced by loading Ni nitrate on alumina and calcining in a hydrogen gas flow. The hydrogasification was carried out in a reaction tube. As a result, the montmorillonite-Ni compound catalysts showed high catalytic effects, with high conversion ratios of 90% or more in low-temperature coal gasification. The catalytic effect of supported Ni metal depended strongly on the carrier substance, and the ranking of the carriers was montmorillonite > zeolite > TiO/sub 2/ > alpha-Al/sub 2/O/sub 3/ > MgO > SiO/sub 2/ = gamma-Al/sub 2/O/sub 3/. (3 figs, 3 tabs, 3 refs)

  4. Achievement report for fiscal 1999 on auxiliary New Sunshine Project. Development of coal hydrogasification technology (Survey and research on its social acceptability); 1999 nendo New Sunshine keikaku hojo jigyo seika hokokusho. Sekitan suiso tenka gaska gijutsu kaihatsu - Shakai tekigo sei ni kansuru chosa kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    With the objective of evaluating the feasibility of practical use and the economics of the coal hydrogasification technology (the ARCH process), survey and research have been performed. This paper summarizes the achievements in fiscal 1999. In the survey on social acceptability, future trends in the supply, demand and prices of LNG, LPG, and coal for hydrogasification were surveyed. It was found that the price of LNG imported into Japan is effectively linked to the crude oil price, and that Saudi Arabia is the price leader for LPG. With respect to the survey on the possibility of international cooperation, the prospects for long-term supply and demand in China, natural gas resources, and their supply and demand were surveyed. The feasibility study estimated the product gas manufacturing cost after process improvement. In the trial calculation of the three-mode costs, it was found that, although the profit from by-products is large, the BTX maximized mode results in a manufacturing cost higher by as much as 2 to 3 yen per Nm{sup 3} than the other modes because of higher unit consumption of raw materials and higher construction cost. (NEDO)

  5. Achievement report for fiscal 1998 (edition B) on auxiliary New Sunshine Project. Development of coal hydrogasification technology (Research by using experimental device); 1998 nendo New Sunshine keikaku hojo jigyo seika hokokusho (B ban). Sekitan suiso tenka gaska gijutsu kaihatsu - Jikken sochi ni yoru kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    With the objective of putting the coal hydrogasification technology (the ARCH process) to practical use, developmental research has been performed on important elementary technologies using various experimental devices. This paper summarizes the achievements in fiscal 1998. In the research using a small testing apparatus, Taiheiyo coal was used to derive hydrogasification data (distribution and yield of the reaction products) while varying the reaction pressure, heating rate, and H{sub 2}/coal ratio, and to verify the possibility of increasing the BTX yield by installing a two-stage temperature zone. In the developmental research on the injector, a combustion test and a coal feeding test were performed on the injector designed and fabricated in the previous fiscal year, to verify its basic performance and to evaluate its heat resistance and durability. With respect to the hot model test, the test installation was completed with the injector mounted, and trial operation and testing were conducted. In addition, a coal temperature rise simulation was developed as the second phase of the simulation of the mixing of coal with high-temperature hydrogen and the resulting temperature rise. (NEDO)

  6. Badge Office Process Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Haurykiewicz, John Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Dinehart, Timothy Grant [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Parker, Robert Young [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-12

    The purpose of this process analysis was to analyze the Badge Offices’ current processes from a systems perspective and consider ways of pursuing objectives set forth by SEC-PS, namely increased customer flow (throughput) and reduced customer wait times. Information for the analysis was gathered for the project primarily through Badge Office Subject Matter Experts (SMEs), and in-person observation of prevailing processes. Using the information gathered, a process simulation model was constructed to represent current operations and allow assessment of potential process changes relative to factors mentioned previously. The overall purpose of the analysis was to provide SEC-PS management with information and recommendations to serve as a basis for additional focused study and areas for potential process improvements in the future.
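
    A process simulation model of this kind can be illustrated with a minimal single-queue, multi-server sketch; the arrival and service rates below are invented for illustration and are not Badge Office data.

```python
# Minimal sketch of the kind of process-simulation model described above:
# one queue feeding several badge stations, exponential arrivals and service.
# Rates are invented placeholders, not observed Badge Office parameters.
import random
import statistics

def simulate(n_customers=5000, n_servers=3, arrival_rate=0.9, service_rate=0.4):
    """Return mean wait (minutes) for a first-come-first-served multi-server queue."""
    random.seed(1)
    free_at = [0.0] * n_servers          # time each station next becomes free
    t, waits = 0.0, []
    for _ in range(n_customers):
        t += random.expovariate(arrival_rate)            # next arrival time
        s = min(range(n_servers), key=lambda i: free_at[i])  # earliest-free station
        start = max(t, free_at[s])
        waits.append(start - t)
        free_at[s] = start + random.expovariate(service_rate)
    return statistics.mean(waits)

print(f"mean wait: {simulate():.1f} min")
# Re-running with different n_servers shows the throughput/wait trade-off
# that such a model lets management assess before changing the real process.
```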

  7. A FEASIBILITY STUDY FOR THE COPROCESSING OF FOSSIL FUELS WITH BIOMASS BY THE HYDROCARB PROCESS

    Science.gov (United States)

    The report describes and gives results of an assessment of a new process concept for the production of carbon and methanol from fossil fuels. The Hydrocarb Process consists of the hydrogasification of carbonaceous material to produce methane, which is subsequently thermally decom...

  8. Process energy analysis

    International Nuclear Information System (INIS)

    Kaiser, V.

    1993-01-01

    In Chapter 2, process energy cost analysis for chemical processing is treated in a general way, independent of the specific form of energy and power production. In particular, energy data collection and data treatment, energy accounting (metering, balance setting), specific energy input, and utility energy costs and prices are discussed. (R.P.) 14 refs., 4 figs., 16 tabs
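
    The 'specific energy input' accounting mentioned here is simply metered utility consumption normalized per unit of product; a minimal sketch with placeholder values, not figures from the chapter:

```python
# Illustration of specific-energy-input accounting: metered utility inputs
# normalized per tonne of product. All values are assumed placeholders.

utilities = {                 # metered inputs over one accounting period, GJ
    "steam_GJ": 820.0,
    "electricity_GJ": 310.0,
    "fuel_gas_GJ": 540.0,
}
product_tonnes = 1_250.0      # product output over the same period

total_GJ = sum(utilities.values())
print(f"specific energy input: {total_GJ / product_tonnes:.2f} GJ/t product")
for name, gj in utilities.items():
    print(f"  {name}: {gj / product_tonnes:.2f} GJ/t")
```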

  9. Chemical process hazards analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-02-01

    The Office of Worker Health and Safety (EH-5) under the Assistant Secretary for the Environment, Safety and Health of the US Department of Energy (DOE) has published two handbooks for use by DOE contractors managing facilities and processes covered by the Occupational Safety and Health Administration (OSHA) Rule for Process Safety Management of Highly Hazardous Chemicals (29 CFR 1910.119), herein referred to as the PSM Rule. The PSM Rule contains an integrated set of chemical process safety management elements designed to prevent chemical releases that can lead to catastrophic fires, explosions, or toxic exposures. The purpose of the two handbooks, "Process Safety Management for Highly Hazardous Chemicals" and "Chemical Process Hazards Analysis," is to facilitate implementation of the provisions of the PSM Rule within the DOE. The purpose of this handbook, "Chemical Process Hazards Analysis," is to facilitate, within the DOE, the performance of chemical process hazards analyses (PrHAs) as required under the PSM Rule. It provides basic information for the performance of PrHAs, and should not be considered a complete resource on PrHA methods. Likewise, to determine if a facility is covered by the PSM Rule, the reader should refer to the handbook "Process Safety Management for Highly Hazardous Chemicals" (DOE-HDBK-1101-96). Promulgation of the PSM Rule has heightened the awareness of chemical safety management issues within the DOE. This handbook is intended for use by DOE facilities and processes covered by the PSM Rule to facilitate contractor implementation of the PrHA element of the PSM Rule. However, contractors whose facilities and processes are not covered by the PSM Rule may also use this handbook as a basis for conducting process hazards analyses as part of their good management practices. This handbook explains the minimum requirements for PrHAs outlined in the PSM Rule. Nowhere have requirements been added beyond what is specifically required by the rule.

  10. Maintenance Process Strategic Analysis

    Science.gov (United States)

    Jasiulewicz-Kaczmarek, M.; Stachowiak, A.

    2016-08-01

    The performance and competitiveness of manufacturing companies depend on the availability, reliability and productivity of their production facilities. Low productivity, downtime, and poor machine performance are often linked to inadequate plant maintenance, which in turn can lead to reduced production levels, increased costs, lost market opportunities, and lower profits. These pressures have given firms worldwide the motivation to explore and embrace proactive maintenance strategies over traditional reactive firefighting methods. The traditional view of maintenance has shifted to an overall view that encompasses overall equipment efficiency, stakeholder management and life cycle assessment. From a practical point of view, this requires changes in the approach to maintenance taken by managers and changes in the actions performed within the maintenance area. Managers have to understand that maintenance is not only about repair and conservation of machines and devices, but also about actions striving for more efficient resource management and care for the safety and health of employees. The purpose of this work is to present a strategic analysis based on SWOT analysis to identify the opportunities and strengths of the maintenance process, so as to benefit from them as much as possible, as well as to identify weaknesses and threats, so that they can be eliminated or minimized.

  11. Commissioned operation report for fiscal 1992 on commissioning of surveying high-level development and effective utilization of natural gas, and development of coal hydrogasification technology; 1992 nendo tennen gas kodo kaihatsu yuko riyo chosa tou itaku gyomu hokokusho. Sekitan suiten gaska gijutsu kaihatsu chosa

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1993-03-01

    With the objective of establishing a practically usable process for manufacturing substitute natural gas, the technical, economic, and developmental problems involved were discussed. This paper summarizes the achievements in fiscal 1992. With respect to the possibility of using coal SNG as a power generation fuel, a power generation system composed of coal SNG, pipeline transportation and natural gas was recognized as significant for technological development because of its capability to raise the power plant utilization rate and its potential economic superiority. In the study on enhancing the liquid yield, the performance of the ARCH-2 reactor was examined through simulation forecasts using a mathematical model, and it was found that the benzene yield could be raised to 15% in terms of carbon conversion. As targets for the hydrogasification process to be developed by Japan, three modes, consisting of SNG yield maximization, cold gas efficiency maximization, and BTX yield maximization, were indicated based on the results of the current fiscal year's studies, and it was proposed that a process with flexibility in product yield be developed. (NEDO)

  12. Semantic multimedia analysis and processing

    CERN Document Server

    Spyrou, Evaggelos; Mylonas, Phivos

    2014-01-01

    Broad in scope, Semantic Multimedia Analysis and Processing provides a complete reference of techniques, algorithms, and solutions for the design and the implementation of contemporary multimedia systems. Offering a balanced, global look at the latest advances in semantic indexing, retrieval, analysis, and processing of multimedia, the book features the contributions of renowned researchers from around the world. Its contents are based on four fundamental thematic pillars: 1) information and content retrieval, 2) semantic knowledge exploitation paradigms, 3) multimedia personalization, and 4)

  13. Achievement report for fiscal 1999 (edition B) on auxiliary New Sunshine Project. Development of coal hydrogasification technology (Research by using experimental device); 1999 nendo New Sunshine keikaku hojo jigyo seika hokokusho (B ban). Sekitan suiso tenka gaska gijutsu kaihatsu - Jikken sochi ni yoru kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    With the objective of putting the coal hydrogasification technology (the ARCH process) to practical use, developmental research has been performed on important elementary technologies using various experimental devices. This paper summarizes the achievements in fiscal 1999. In the research using a small testing apparatus, Taiheiyo coal was used for demonstration operation of the substitute natural gas maximized case, the heat efficiency maximized case, and the BTX maximized case. The three cases performed nearly as anticipated in the simulations, and the substitute natural gas maximized case achieved the targeted overall coal conversion of 60% or more. However, the BTX maximized case gave a value lower than the targeted BTX yield of 12%. In the developmental research on the injector, the injector fabricated for the hot model test was given another combustion test, in which a focal temperature of 1,200 degrees C or higher was attained. The hot model test verified the non-agglomeration performance of the coal, using the focal temperature, coal cross-sectional area load, coal type, and injector as parameters. It was verified that the Taiheiyo and Shin Mu coals do not agglomerate excessively. (NEDO)

  14. Fourier analysis and stochastic processes

    CERN Document Server

    Brémaud, Pierre

    2014-01-01

    This work is unique as it provides a uniform treatment of the Fourier theories of functions (Fourier transforms and series, z-transforms), finite measures (characteristic functions, convergence in distribution), and stochastic processes (including arma series and point processes). It emphasises the links between these three themes. The chapter on the Fourier theory of point processes and signals structured by point processes is a novel addition to the literature on Fourier analysis of stochastic processes. It also connects the theory with recent lines of research such as biological spike signals and ultrawide-band communications. Although the treatment is mathematically rigorous, the convivial style makes the book accessible to a large audience. In particular, it will be interesting to anyone working in electrical engineering and communications, biology (point process signals) and econometrics (arma models). A careful review of the prerequisites (integration and probability theory in the appendix, Hilbert spa...

  15. Software quality testing process analysis

    OpenAIRE

    Mera Paz, Julián

    2016-01-01

    Introduction: This article is the result of reading, reviewing and analyzing books, magazines and articles well known for their scientific and research quality, which have addressed the software quality testing process. The author, based on his work experience in software development companies, teaching and other areas, has compiled and selected information to argue for and substantiate the importance of the software quality testing process. Methodology: the existing literature on the software qualit...

  16. Fuel production from coal by the Mobil Oil process using nuclear high-temperature process heat

    International Nuclear Information System (INIS)

    Hoffmann, G.

    1982-01-01

    Two processes for the production of liquid hydrocarbons are presented: direct conversion of coal into fuel (coal hydrogenation) and indirect conversion of coal into fuel (syngas production, methanol synthesis, Mobil Oil process). Both processes have several variants in which nuclear process heat may be used; in most cases, the nuclear heat is introduced at the gas production stage. The following gas production processes are compared: the LURGI coal gasification process; steam reformer methanation, with and without coal hydrogasification; and steam gasification of coal. (orig./EF) [de

  17. Preliminary hazards analysis -- vitrification process

    International Nuclear Information System (INIS)

    Coordes, D.; Ruggieri, M.; Russell, J.; TenBrook, W.; Yimbo, P.

    1994-06-01

    This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during and provides input to project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR then leads to performance of the Final Safety Analysis Report performed during the facility's construction and testing. It should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry, et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment

  19. Risk analysis: opening the process

    International Nuclear Information System (INIS)

    Hubert, Ph.; Mays, C.

    1998-01-01

    This conference on risk analysis took place in Paris, 11-14 October 1999. Over 200 papers were presented in the following seven sessions: perception; environment and health; pervasive risks; objects and products; personal and collective involvement; assessment and valuation; and management. A rational approach to risk analysis has been developed over the last three decades. Techniques for risk assessment have been thoroughly enhanced, risk management approaches have been developed, decision-making processes have been clarified, and the social dimensions of risk perception and management have been investigated. Nevertheless, this construction is being challenged by recent events which reveal how deficits in stakeholder involvement, openness and democratic procedures can undermine risk management actions. Indeed, the global process and most components of risk analysis may be radically called into question. Food safety has lately been a prominent issue, but new debates appear, or old debates are revisited, in the domains of public health, consumer product safety, waste management, environmental risks, nuclear installations, automobile safety and pollution. To meet the growing pressures for efficiency, openness, accountability, and multi-partner communication in risk analysis, institutional changes are underway in many European countries. However, the need for stakeholders to develop better insight into the process may lead to an evolution of all the components of risk analysis, even in its most 'technical' steps. For stakeholders of different professional backgrounds, political projects, and responsibilities, risk identification procedures must be rendered understandable, and quantitative risk assessment must be intelligible and accommodated in action proposals, ranging from countermeasures to educational programs to insurance mechanisms. Management formats must be open to local and political input and other types of operational feedback. (authors)

  20. FY 2000 report on the results of the project supplementary to the New Sunshine Project - Feasibility of coal hydrogasification technology in China. II - Final report. Investigational study of the social adaptability (Feasibility study of the international cooperation - Report of Beijing Research Institute of Coal Chemistry); 2000 nendo New Sunshine keikaku hojo jigyo (Bessatsu kokusai kyoryoku kanosei chosa Pekin Bai kagaku kenkyujo) hokokusho. Sekitan suiso tenka gaska gijutsu kaihatsu - Shakai tekigo sei ni kansuru chosa kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-03-01

    For the purpose of establishing coal hydrogasification technology, which has the potential to produce high-quality substitute natural gas in quantity and at low cost, an investigational study of its social adaptability was made. In this fiscal year, the following were surveyed: natural gas resources and plans for their use in China, the actual state of the town gas business and future plans, etc. As a part of the study, Beijing Research Institute of Coal Chemistry, China Coal Research Institute, made a survey under a research contract. The survey found the following: In Xinjiang and Urumchi, Uigur Autonomous Region, there is an abundant coal resource suitable for coal hydrogasification, a natural gas transportation pipeline has been constructed, and public facilities are in place; both locations are therefore suitable for the construction of a coal hydrogasification plant. Datong, Shanxi Province, is the largest coal-producing city, can supply coal for hydrogasification over the long term, and plans to remodel old facilities and construct new ones for the introduction of natural gas; the city is therefore also suitable for the construction of a coal hydrogasification plant. (NEDO)

  1. RISK ANALYSIS IN MILK PROCESSING

    Directory of Open Access Journals (Sweden)

    I. PIRVUTOIU

    2008-05-01

    This paper aimed to evaluate bankruptcy risk using the 'Score Method' based on Canon and Holder's model. The data were collected from the Balance Sheet and Profit and Loss Account for the period 2005-2007, recorded by a meat processing plant (Rador Commercial Company). The study puts in evidence the financial situation of the company and the level of the main financial ratios underlying the calculation of the Z score function value in the three years. The low values of the Z score function recorded every year reflect that the company is still facing bankruptcy. However, the worst situation was recorded in the years 2005 and 2006, when bankruptcy risk ranged between 70 and 80%. In the year 2007, the bankruptcy risk was lower, ranging between 50 and 70%, as the Z function recorded a value lower than 4. For meat processing companies such an analysis is compulsory at present, as long as the business environment is very risky in our country.
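
    A minimal Python sketch of the score-to-risk mapping described above. The abstract gives neither the ratio formula nor the exact band boundaries, so the thresholds and yearly scores below are hypothetical, chosen only to mirror the reported pattern (a lower Z means a higher bankruptcy risk, with Z below 4 corresponding to roughly 50-70% risk or worse):

      # Hypothetical Canon/Holder-style risk banding; the thresholds are NOT
      # from the paper, they only mirror the qualitative pattern in the abstract.
      def risk_band(z_score: float) -> str:
          """Map a Z score to an illustrative bankruptcy-risk band."""
          if z_score < 4:        # abstract: Z below 4 meant 50-70% risk or worse
              return "high risk (50-80%)"
          if z_score < 9:        # assumed intermediate band
              return "moderate risk (30-50%)"
          return "low risk (under 30%)"

      # Hypothetical yearly scores, for illustration only.
      for year, z in {2005: 2.1, 2006: 2.4, 2007: 3.8}.items():
          print(year, risk_band(z))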

  2. PRECLOSURE CRITICALITY ANALYSIS PROCESS REPORT

    International Nuclear Information System (INIS)

    Danise, A.E.

    2004-01-01

    This report describes a process for performing preclosure criticality analyses for a repository at Yucca Mountain, Nevada. These analyses will be performed from the time of receipt of fissile material until permanent closure of the repository (preclosure period). The process describes how criticality safety analyses will be performed for various configurations of waste in or out of waste packages that could occur during preclosure as a result of normal operations or event sequences. The criticality safety analysis considers those event sequences resulting in unanticipated moderation, loss of neutron absorber, geometric changes, or administrative errors in waste form placement (loading) of the waste package. The report proposes a criticality analysis process for preclosure to allow a consistent transition from preclosure to postclosure, thereby possibly reducing potential cost increases and delays in licensing of Yucca Mountain. The proposed approach provides the advantage of using a parallel regulatory framework for evaluation of preclosure and postclosure performance and is consistent with the U.S. Nuclear Regulatory Commission's approach of supporting risk-informed, performance-based regulation for fuel cycle facilities, the ''Yucca Mountain Review Plan, Final Report'', and 10 CFR Part 63. The criticality-related criteria for ensuring subcriticality are described, as well as which guidance documents will be utilized. Preclosure operations and facilities have significant similarities to existing facilities and operations currently regulated by the U.S. Nuclear Regulatory Commission; therefore, the design approach for preclosure criticality safety will be dictated by existing regulatory requirements while using a risk-informed approach with burnup credit for in-package operations.

  3. Trends in business process analysis: from verification to process mining

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Cardoso, J.; Cordeiro, J.; Filipe, J.

    2007-01-01

    Business process analysis ranges from model verification at design-time to the monitoring of processes at runtime. Much progress has been achieved in process verification. Today we are able to verify the entire reference model of SAP without any problems. Moreover, more and more processes leave

  4. Development of processes for the utilization of Brazilian coal using nuclear process heat and/or nuclear process steam

    International Nuclear Information System (INIS)

    Bamert, H.; Niessen, H.F.; Walbeck, M.; Wasrzik, U.; Mueller, R.; Schiffers, U.; Strauss, W.

    1980-01-01

    Status of the project: end of the project definition phase and preparation of the planned conceptual phase. Objective of the project: development of processes for the utilization of nuclear process heat and/or nuclear process steam for the gasification of coal with high ash content, in particular coal from Brazil. Results: With the data for Brazilian coal of high ash content (Leao mine: 43% ash in mine-mouth quality, 20% ash after preparation), proposals were worked out for the mine planning and for a number of processes. On the basis of these proposals, and taking into account the main data specified by the Brazilian working group, two processes were chosen and worked out as conceptual designs: 1) pressurized water reactor + LURGI pressure gasifier/hydrogasification for the production of SNG, and 2) high-temperature reactor + steam gasification for the production of town gas. The economic evaluation showed that the two processes do not differ substantially in their cost efficiency and that they are economical on a long-term basis. For more specific design work, an experimental programme using the semi-technical plants 'hydrogasification' in Wesseling and 'steam gasification' in Essen has been planned as the conceptual phase. (orig.)

  5. Gaussian process regression analysis for functional data

    CERN Document Server

    Shi, Jian Qing

    2011-01-01

    Gaussian Process Regression Analysis for Functional Data presents nonparametric statistical methods for functional regression analysis, specifically the methods based on a Gaussian process prior in a functional space. The authors focus on problems involving functional response variables and mixed covariates of functional and scalar variables. Covering the basics of Gaussian process regression, the first several chapters discuss functional data analysis, theoretical aspects based on the asymptotic properties of Gaussian process regression models, and new methodological developments for high-dimensional data.
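
    Since the book's subject is Gaussian process regression, a self-contained numpy sketch of the basic posterior computation may help fix ideas; the squared-exponential kernel, the toy data and all hyperparameters here are illustrative choices, not taken from the book:

      import numpy as np

      def rbf_kernel(a, b, length_scale=1.0, variance=1.0):
          """Squared-exponential covariance between two sets of 1-D inputs."""
          d = a[:, None] - b[None, :]
          return variance * np.exp(-0.5 * (d / length_scale) ** 2)

      # Noisy training data and test inputs (hypothetical).
      rng = np.random.default_rng(0)
      x = np.linspace(0, 5, 20)
      y = np.sin(x) + 0.1 * rng.standard_normal(x.size)
      x_star = np.linspace(0, 5, 100)

      noise = 0.1 ** 2
      K = rbf_kernel(x, x) + noise * np.eye(x.size)
      K_s = rbf_kernel(x, x_star)

      # Posterior mean and covariance of the GP at the test inputs.
      L = np.linalg.cholesky(K)
      alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
      mean = K_s.T @ alpha
      v = np.linalg.solve(L, K_s)
      cov = rbf_kernel(x_star, x_star) - v.T @ v
      print(mean[:5].round(3))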

  6. Analysis of multiparty mediation processes

    NARCIS (Netherlands)

    Vuković, Siniša

    2013-01-01

    Crucial challenges for multiparty mediation processes include the achievement of adequate cooperation among the mediators and consequent coordination of their activities in the mediation process. Existing literature goes only as far as to make it clear that successful mediation requires necessary

  7. Experimental analysis of armouring process

    Science.gov (United States)

    Lamberti, Alberto; Paris, Ennio

    Preliminary results from an experimental investigation on armouring processes are presented. In particular, the process of development and formation of the armour layer under different steady flow conditions has been analyzed in terms of grain size variations and the sediment transport rate associated with each size fraction.

  8. Formal analysis of design process dynamics

    NARCIS (Netherlands)

    Bosse, T.; Jonker, C.M.; Treur, J.

    2010-01-01

    This paper presents a formal analysis of design process dynamics. Such a formal analysis is a prerequisite to come to a formal theory of design and for the development of automated support for the dynamics of design processes. The analysis was geared toward the identification of dynamic design

  9. Formal Analysis of Design Process Dynamics

    NARCIS (Netherlands)

    Bosse, T.; Jonker, C.M.; Treur, J.

    2010-01-01

    This paper presents a formal analysis of design process dynamics. Such a formal analysis is a prerequisite to come to a formal theory of design and for the development of automated support for the dynamics of design processes. The analysis was geared toward the identification of dynamic design

  10. FY 1999 report on the results of the project supplementary to the New Sunshine Project - Feasibility of coal hydrogasification technology in China. Investigational study of the social adaptability (Feasibility study of the international cooperation - Report of Beijing Research Institute of Coal Chemistry); 1999 nendo New Sunshine keikaku hojo jigyo (Bessatsu, Kokusai kyoryoku kanosei chosa Pekin Bai kagaku kenkyujo) hokokusho. Sekitan suiso tenka gaska gijutsu kaihatsu - Shakai tekigo sei ni kansuru chosa kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    For the purpose of establishing coal hydrogasification technology, which has the potential to produce high-quality substitute natural gas in quantity and at low cost, an investigational study of its social adaptability was made. In this fiscal year, the following were surveyed: the outlook for energy supply/demand in China and its problems, natural gas resources and plans for their use, the actual state of the town gas business and future plans, etc. As a part of the study, Beijing Research Institute of Coal Chemistry, China Coal Research Institute, made a survey under a research contract. On the general situation of natural gas in China, reports were made on the following: the present situation of natural gas resource development in China, the present situation of town gas in large Chinese cities, the present situation and outlook of coal development and utilization in China, and assessment of the coal mine areas adaptable to coal hydrogasification. In the survey of areas suitable for coal hydrogasification, reports were made on the present situation and future of energy supply/demand in Shanghai, Shanxi, Shenhua and Xinjiang, and the present situation and future of town gas supply. Surveys and reports were also made on coal hydrogasification technology and its applicability. (NEDO)

  11. Mesh Processing in Medical Image Analysis

    DEFF Research Database (Denmark)

    The following topics are dealt with: mesh processing; medical image analysis; interactive freeform modeling; statistical shape analysis; clinical CT images; statistical surface recovery; automated segmentation; cerebral aneurysms; and real-time particle-based representation.

  12. Process correlation analysis model for process improvement identification.

    Science.gov (United States)

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment identifying strengths and weaknesses, and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what is to be carried out in these process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort required and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.
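
    The paper's model is defined on CMMI assessment data; a toy numpy sketch of the underlying idea (correlating process elements across assessments) is given below. The element names and score matrix are hypothetical, and plain Pearson correlation stands in for the paper's model:

      import numpy as np

      # Hypothetical assessment scores (rows: assessed projects, columns:
      # process elements, e.g. CMMI practices) on an ordinal capability scale.
      scores = np.array([
          [3, 2, 3, 1],
          [4, 3, 4, 2],
          [2, 2, 1, 1],
          [5, 4, 4, 3],
          [3, 3, 2, 2],
      ], dtype=float)

      # Pearson correlation between process elements; highly correlated pairs
      # are candidates for being improved together in one plan.
      corr = np.corrcoef(scores, rowvar=False)
      elements = ["REQM.SP1.1", "PP.SP1.1", "PMC.SP1.1", "CM.SP1.1"]  # invented
      for i in range(len(elements)):
          for j in range(i + 1, len(elements)):
              if corr[i, j] > 0.8:
                  print(f"{elements[i]} <-> {elements[j]}: r = {corr[i, j]:.2f}")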

  13. Process Integration Analysis of an Industrial Hydrogen Production Process

    OpenAIRE

    Stolten, Detlef; Grube, Thomas; Tock, Laurence; Maréchal, François; Metzger, Christian; Arpentinier, Philippe

    2010-01-01

    The energy efficiency of an industrial hydrogen production process using steam methane reforming (SMR) combined with the water gas shift reaction (WGS) is analyzed using process integration techniques based on heat cascade calculation and pinch analysis with the aim of identifying potential measures to enhance the process performance. The challenge is to satisfy the high temperature heat demand of the SMR reaction by minimizing the consumption of natural gas to feed the combustion and to expl...
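
    A compact sketch of the heat cascade (problem table) calculation that pinch analysis rests on. The streams and the minimum approach temperature (DT_min) below are hypothetical, not the SMR/WGS data of the study:

      # Minimal heat-cascade (problem table) sketch in the spirit of pinch analysis.
      DT_MIN = 10.0  # K
      # (kind, supply T [C], target T [C], heat capacity flowrate CP [kW/K])
      streams = [
          ("hot", 400.0, 120.0, 2.0),
          ("hot", 250.0, 100.0, 4.0),
          ("cold", 80.0, 300.0, 3.0),
          ("cold", 150.0, 380.0, 2.5),
      ]

      # Shift temperatures by +/- DT_min/2 and build temperature intervals.
      shifted = []
      for kind, ts, tt, cp in streams:
          s = -DT_MIN / 2 if kind == "hot" else DT_MIN / 2
          shifted.append((kind, ts + s, tt + s, cp))
      bounds = sorted({t for _, ts, tt, _ in shifted for t in (ts, tt)}, reverse=True)

      # Net heat surplus/deficit per interval, then cascade from the top down.
      cascade, heat = [0.0], 0.0
      for hi, lo in zip(bounds, bounds[1:]):
          net = 0.0
          for kind, ts, tt, cp in shifted:
              top, bot = max(ts, tt), min(ts, tt)
              if top >= hi and bot <= lo:  # stream spans this interval
                  net += cp * (hi - lo) * (1 if kind == "hot" else -1)
          heat += net
          cascade.append(heat)

      q_hot_min = -min(0.0, min(cascade))          # minimum hot utility [kW]
      q_cold_min = cascade[-1] + q_hot_min         # minimum cold utility [kW]
      pinch = bounds[cascade.index(min(cascade))]  # shifted pinch temperature
      print(q_hot_min, q_cold_min, pinch)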

  14. Preclosure Criticality Analysis Process Report

    International Nuclear Information System (INIS)

    Thomas, D.A.

    1999-01-01

    The design approach for criticality of the disposal container and waste package will be dictated by existing regulatory requirements. This conclusion is based on the fact that preclosure operations and facilities have significant similarities to existing facilities and operations currently regulated by the NRC. The major difference would be the use of a risk-informed approach with burnup credit. This approach could reduce licensing delays and costs of the repository. The probability of success for this proposed seamless licensing strategy is increased, since there is regulatory precedent (10 CFR Part 63 and NUREG 1520) and commercial precedent for allowing burnup credit at sites similar to Yucca Mountain during preclosure. While NUREG 1520 is not directly applicable to a facility for handling spent nuclear fuel, the risk-informed approach to criticality analysis in NUREG 1520 is considered indicative of how the NRC will approach risk-informed criticality analysis at spent fuel facilities in the future. The types of design basis events which must be considered during the criticality safety analysis portion of the Integrated Safety Analysis (ISA) are those events which result in unanticipated moderation, loss of neutron absorber, geometric changes in the critical system, or administrative errors in waste form placement (loading) of the disposal container. The specific events to be considered must be based on the review of the system's design, as discussed in Section 3.2. A transition of licensing approach (e.g., deterministic versus risk-informed, performance-based) is not obvious and will require analysis. For commercial spent nuclear fuel, the probability of interspersed moderation may be low enough to allow nearly the same Critical Limit for both preclosure and postclosure, though an administrative margin will be applied to preclosure and possibly not to postclosure. Similarly the Design Basis Events for the waste package may be incredible and therefore not

  15. Preclosure Criticality Analysis Process Report

    International Nuclear Information System (INIS)

    Thomas, D.A.

    1999-01-01

    The design approach for criticality of the disposal container and waste package will be dictated by existing regulatory requirements. This conclusion is based on the fact that preclosure operations and facilities have significant similarities to existing facilities and operations currently regulated by the NRC. The major difference would be the use of a risk-informed approach with burnup credit. This approach could reduce licensing delays and costs of the repository. The probability of success for this proposed seamless licensing strategy is increased, since there is regulatory precedent (10 CFR Part 63 and NUREG 1520) and commercial precedent for allowing burnup credit at sites similar to Yucca Mountain during preclosure. While NUREG 1520 is not directly applicable to a facility for handling spent nuclear fuel, the risk-informed approach to criticality analysis in NUREG 1520 is considered indicative of how the NRC will approach risk-informed criticality analysis at spent fuel facilities in the future. The types of design basis events which must be considered during the criticality safety analysis portion of the Integrated Safety Analysis (ISA) are those events which result in unanticipated moderation, loss of neutron absorber, geometric changes in the critical system, or administrative errors in waste form placement (loading) of the disposal container. The specific events to be considered must be based on the review of the system's design, as discussed in Section 3.2. A transition of licensing approach (e.g., deterministic versus risk-informed, performance-based) is not obvious and will require analysis. For commercial spent nuclear fuel, the probability of interspersed moderation may be low enough to allow nearly the same Critical Limit for both preclosure and postclosure, though an administrative margin will be applied to preclosure and possibly not to postclosure. Similarly the Design Basis Events for the waste package may be incredible and therefore not

  16. Root cause analysis with enriched process logs

    NARCIS (Netherlands)

    Suriadi, S.; Ouyang, C.; Aalst, van der W.M.P.; Hofstede, ter A.H.M.; La Rosa, M.; Soffer, P.

    2013-01-01

    In the field of process mining, the use of event logs for the purpose of root cause analysis is increasingly studied. In such an analysis, the availability of attributes/features that may explain the root cause of some phenomena is crucial. Currently, the process of obtaining these attributes from

  17. A core ontology for business process analysis

    NARCIS (Netherlands)

    Pedrinaci, C.; Domingue, J.; Alves De Medeiros, A.K.; Bechhofer, S.; Hauswirth, M.; Hoffmann, J.; Koubarakis, M.

    2008-01-01

    Business Process Management (BPM) aims at supporting the whole life-cycle necessary to deploy and maintain business processes in organisations. An important step of the BPM life-cycle is the analysis of the processes deployed in companies. However, the degree of automation currently achieved cannot

  18. Moments analysis of concurrent Poisson processes

    International Nuclear Information System (INIS)

    McBeth, G.W.; Cross, P.

    1975-01-01

    A moments analysis of concurrent Poisson processes has been carried out. Equations are given which relate combinations of distribution moments to sums of products involving the number of counts associated with the processes and the mean rate of the processes. Elimination of background is discussed, and equations suitable for processing random radiation, parent-daughter pairs in the presence of background, and triple and double correlations in the presence of background are given. The theory of identification of the four principal radioactive series by moments analysis is discussed. (Auth.)
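
    A small numpy illustration of the moment relations for Poisson counting: the mean and variance of Poisson counts coincide, and a separately measured background run allows a moment-based background subtraction. The rates and counting times are invented for the example and are not from the paper:

      import numpy as np

      rng = np.random.default_rng(1)
      T = 10.0                       # counting time per measurement [s]
      lam_src, lam_bkg = 5.0, 2.0    # hypothetical source and background rates [1/s]

      counts = rng.poisson((lam_src + lam_bkg) * T, size=10_000)
      bkg_counts = rng.poisson(lam_bkg * T, size=10_000)  # separate background run

      m1, v1 = counts.mean(), counts.var()    # should agree for a Poisson process
      lam_total = m1 / T
      lam_src_est = lam_total - bkg_counts.mean() / T
      print(f"mean={m1:.1f} var={v1:.1f}  source rate ~ {lam_src_est:.2f}/s")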

  19. Three-dimensional model analysis and processing

    CERN Document Server

    Yu, Faxin; Luo, Hao; Wang, Pinghui

    2011-01-01

    This book focuses on five hot research directions in 3D model analysis and processing in computer science:  compression, feature extraction, content-based retrieval, irreversible watermarking and reversible watermarking.

  20. Artificial intelligence applied to process signal analysis

    Science.gov (United States)

    Corsberg, Dan

    1988-01-01

    Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge base approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.
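
    A toy Python sketch of the knowledge-based alarm filtering idea: consequence alarms are suppressed while their known cause is itself active, leaving the operator with the root disturbance. The alarm names and cause table are hypothetical:

      # Minimal knowledge-base alarm filter; CAUSE_OF maps each alarm to the
      # alarm that is known to produce it (hypothetical plant knowledge).
      CAUSE_OF = {
          "LOW_FLOW_PUMP_A": "PUMP_A_TRIP",
          "LOW_PRESSURE_LOOP_1": "PUMP_A_TRIP",
          "HIGH_TEMP_LOOP_1": "LOW_FLOW_PUMP_A",
      }

      def filter_alarms(active: set[str]) -> list[str]:
          """Return only alarms whose cause is not itself an active alarm."""
          return sorted(a for a in active if CAUSE_OF.get(a) not in active)

      print(filter_alarms({"PUMP_A_TRIP", "LOW_FLOW_PUMP_A", "HIGH_TEMP_LOOP_1"}))
      # -> ['PUMP_A_TRIP']  (the root cause only)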

  1. Summary of process research analysis efforts

    Science.gov (United States)

    Burger, D. R.

    1985-01-01

    A summary of solar-cell process research analysis efforts was presented. Process design and cell design are interactive efforts where technology from integrated circuit processes and other processes are blended. The primary factors that control cell efficiency are: (1) the bulk parameters of the available sheet material, (2) the retention and enhancement of these bulk parameters, and (3) the cell design and the cost to produce versus the finished cells performance. The process sequences need to be tailored to be compatible with the sheet form, the cell shape form, and the processing equipment. New process options that require further evaluation and utilization are lasers, robotics, thermal pulse techniques, and new materials. There are numerous process control techniques that can be adapted and used that will improve product uniformity and reduced costs. Two factors that can lead to longer life modules are the use of solar cell diffusion barriers and improved encapsulation.

  2. Radar signal analysis and processing using Matlab

    CERN Document Server

    Mahafza, Bassem R

    2008-01-01

    Offering radar-related software for the analysis and design of radar waveform and signal processing, this book provides comprehensive coverage of radar signals and signal processing techniques and algorithms. It contains numerous graphical plots, common radar-related functions, table format outputs, and end-of-chapter problems. The complete set of MATLAB[registered] functions and routines are available for download online.

  3. Vygotsky's Analysis of Children's Meaning Making Processes

    Science.gov (United States)

    Mahn, Holbrook

    2012-01-01

    Vygotsky's work is extensive and covers many aspects of the development of children's meaning-making processes in social and cultural contexts. However, his main focus is on the examination of the unification of speaking and thinking processes. His investigation centers on the analysis of the entity created by this unification--an internal…

  4. IRB Process Improvements: A Machine Learning Analysis.

    Science.gov (United States)

    Shoenbill, Kimberly; Song, Yiqiang; Cobb, Nichelle L; Drezner, Marc K; Mendonca, Eneida A

    2017-06-01

    Clinical research involving humans is critically important, but it is a lengthy and expensive process. Most studies require institutional review board (IRB) approval. Our objective is to identify predictors of delays or accelerations in the IRB review process and apply this knowledge to inform process change in an effort to improve IRB efficiency, transparency, consistency and communication. We analyzed timelines of protocol submissions to determine protocol or IRB characteristics associated with different processing times. Our evaluation included single variable analysis to identify significant predictors of IRB processing time and machine learning methods to predict processing times through the IRB review system. Based on initial identified predictors, changes to IRB workflow and staffing procedures were instituted and we repeated our analysis. Our analysis identified several predictors of delays in the IRB review process including type of IRB review to be conducted, whether a protocol falls under Veteran's Administration purview and specific staff in charge of a protocol's review. We have identified several predictors of delays in IRB protocol review processing times using statistical and machine learning methods. Application of this knowledge to process improvement efforts in two IRBs has led to increased efficiency in protocol review. The workflow and system enhancements that are being made support our four-part goal of improving IRB efficiency, consistency, transparency, and communication.
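
    A hedged sketch of the kind of analysis the study describes: a tree-based regressor trained on protocol features to predict review time, with feature importances as candidate predictors of delay. The feature names, synthetic data and choice of random forest are assumptions for illustration, not the authors' actual pipeline:

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor

      rng = np.random.default_rng(2)
      n = 500
      X = np.column_stack([
          rng.integers(0, 3, n),   # review type: 0=exempt, 1=expedited, 2=full board
          rng.integers(0, 2, n),   # falls under VA purview (0/1)
          rng.integers(0, 8, n),   # staff reviewer id
      ])
      # Synthetic target: review time in days, driven by the first two features.
      days = 10 + 15 * X[:, 0] + 12 * X[:, 1] + rng.normal(0, 5, n)

      model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, days)
      print(dict(zip(["review_type", "va_purview", "staff_id"],
                     model.feature_importances_.round(2))))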

  5. Computerization of the safeguards analysis decision process

    International Nuclear Information System (INIS)

    Ehinger, M.H.

    1990-01-01

    This paper reports that safeguards regulations are evolving to meet new demands for timeliness and sensitivity in detecting the loss or unauthorized use of sensitive nuclear materials. The opportunities to meet new rules, particularly in bulk processing plants, involve developing techniques which use modern, computerized process control and information systems. Using these computerized systems in the safeguards analysis involves all the challenges of the man-machine interface experienced in the typical process control application and adds new dimensions to accuracy requirements, data analysis, and alarm resolution in the regulatory environment

  6. BUSINESS PROCESS MANAGEMENT SYSTEMS TECHNOLOGY COMPONENTS ANALYSIS

    Directory of Open Access Journals (Sweden)

    Andrea Giovanni Spelta

    2007-05-01

    The information technology that supports the implementation of the business process management approach is called a Business Process Management System (BPMS). The main components of the BPMS solution framework are the process definition repository, process instances repository, transaction manager, connectors framework, process engine and middleware. In this paper we define and characterize the role and importance of the components of the BPMS framework. The research method adopted was the case study, through the analysis of the implementation of a BPMS solution in an insurance company called Chubb do Brasil. In the case study, the process "Manage Coinsured Events" is described and characterized, as well as the components of the BPMS solution adopted and implemented by Chubb do Brasil for managing this process.
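
    A toy Python sketch of how the named components fit together: a process definition repository, a process instance repository, and a process engine that advances an instance through its tasks. The task names for "Manage Coinsured Events" are invented, purely for illustration:

      from dataclasses import dataclass, field

      # Process definition repository (hypothetical task sequence).
      definitions = {"ManageCoinsuredEvents": ["register", "assess", "settle"]}

      @dataclass
      class Instance:
          process: str
          step: int = 0
          log: list = field(default_factory=list)

      instances: list[Instance] = []  # process instance repository

      def start(process: str) -> Instance:
          inst = Instance(process)
          instances.append(inst)
          return inst

      def advance(inst: Instance) -> bool:
          """Process engine: execute the next task of the instance, if any."""
          tasks = definitions[inst.process]
          if inst.step >= len(tasks):
              return False
          inst.log.append(f"executed {tasks[inst.step]}")
          inst.step += 1
          return True

      inst = start("ManageCoinsuredEvents")
      while advance(inst):
          pass
      print(inst.log)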

  7. Processability analysis of candidate waste forms

    International Nuclear Information System (INIS)

    Gould, T.H. Jr.; Dunson, J.B. Jr.; Eisenberg, A.M.; Haight, H.G. Jr.; Mello, V.E.; Schuyler, R.L. III.

    1982-01-01

    A quantitative merit evaluation, or processability analysis, was performed to assess the relative difficulty of remote processing of Savannah River Plant high-level wastes for seven alternative waste form candidates. The reference borosilicate glass process was rated as the simplest, followed by FUETAP concrete, glass marbles in a lead matrix, high-silica glass, crystalline ceramics (SYNROC-D and tailored ceramics), and coated ceramic particles. Cost estimates for the borosilicate glass, high-silica glass, and ceramic waste form processing facilities are also reported

  8. CPAS Preflight Drop Test Analysis Process

    Science.gov (United States)

    Englert, Megan E.; Bledsoe, Kristin J.; Romero, Leah M.

    2015-01-01

    Throughout the Capsule Parachute Assembly System (CPAS) drop test program, the CPAS Analysis Team has developed a simulation and analysis process to support drop test planning and execution. This process includes multiple phases focused on developing test simulations and communicating results to all groups involved in the drop test. CPAS Engineering Development Unit (EDU) series drop test planning begins with the development of a basic operational concept for each test. Trajectory simulation tools include the Flight Analysis and Simulation Tool (FAST) for single bodies, and the Automatic Dynamic Analysis of Mechanical Systems (ADAMS) simulation for the mated vehicle. Results are communicated to the team at the Test Configuration Review (TCR) and Test Readiness Review (TRR), as well as at Analysis Integrated Product Team (IPT) meetings in earlier and intermediate phases of the pre-test planning. The ability to plan and communicate efficiently with rapidly changing objectives and tight schedule constraints is a necessity for safe and successful drop tests.

  9. Profitability Analysis of Soybean Oil Processes.

    Science.gov (United States)

    Cheng, Ming-Hsun; Rosentrater, Kurt A

    2017-10-07

    Soybean oil production is the basic process for soybean applications. Cash flow analysis is used to estimate the profitability of a manufacturing venture. Besides capital investments, operating costs, and revenues, the interest rate is the factor used to estimate the net present value (NPV), break-even points, and payback time, which are benchmarks for profitability evaluation. A positive NPV and a reasonable payback time represent a profitable process and provide an acceptable projection for real operation. Additionally, the capacity of the process is another critical factor. The extruding-expelling process and hexane extraction are the two typical approaches used in industry. When the capacities of annual oil production are larger than 12 and 173 million kg, respectively, these two processes are profitable. The solvent-free approach, known as the enzyme-assisted aqueous extraction process (EAEP), is profitable when the capacity is larger than 17 million kg of annual oil production.
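
    A minimal Python sketch of the profitability metrics named in the abstract (NPV and payback time) on a hypothetical cash-flow series; the investment, revenues and 8% interest rate are invented numbers, not the paper's data:

      def npv(rate: float, cash_flows: list[float]) -> float:
          """Net present value; cash_flows[0] is the year-0 investment (negative)."""
          return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

      def payback_years(cash_flows: list[float]) -> int | None:
          """First year in which cumulative (undiscounted) cash flow turns positive."""
          total = 0.0
          for t, cf in enumerate(cash_flows):
              total += cf
              if total >= 0:
                  return t
          return None

      # Hypothetical plant: 25 M$ investment, 4.5 M$/y net revenue for 10 years.
      flows = [-25e6] + [4.5e6] * 10
      print(f"NPV @ 8%: {npv(0.08, flows)/1e6:.1f} M$, payback: {payback_years(flows)} y")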

  10. Profitability Analysis of Soybean Oil Processes

    Directory of Open Access Journals (Sweden)

    Ming-Hsun Cheng

    2017-10-01

    Soybean oil production is the basic process for soybean applications. Cash flow analysis is used to estimate the profitability of a manufacturing venture. Besides capital investments, operating costs, and revenues, the interest rate is the factor used to estimate the net present value (NPV), break-even points, and payback time, which are benchmarks for profitability evaluation. A positive NPV and a reasonable payback time represent a profitable process and provide an acceptable projection for real operation. Additionally, the capacity of the process is another critical factor. The extruding-expelling process and hexane extraction are the two typical approaches used in industry. When the capacities of annual oil production are larger than 12 and 173 million kg, respectively, these two processes are profitable. The solvent-free approach, known as the enzyme-assisted aqueous extraction process (EAEP), is profitable when the capacity is larger than 17 million kg of annual oil production.

  11. Systemic analysis of the caulking assembly process

    Directory of Open Access Journals (Sweden)

    Rodean Claudiu

    2017-01-01

    The present paper highlights the importance of the caulking process, which is nowadays little studied in comparison with its growing usage in the automotive industry. Because the caulking operation is used in domains of high importance, such as shock absorbers and brake systems, this paper details the parameters which characterize the process, viewed as input data and output data, and the requirements placed on the final product. The paper presents the actual measurement methods used for analysing the performance of the caulking assembly. All these parameters lead to a performance analysis algorithm established for the caulking process, which is used later in the paper for experimental research. The study is a basis for further research aimed at optimizing the subsequent processing.

  12. Book: Marine Bioacoustic Signal Processing and Analysis

    Science.gov (United States)

    2011-09-30

    ... physicists, and mathematicians. However, more and more biologists and psychologists are starting to use advanced signal processing techniques and ... chapters than it should be, since the project must be finished by Dec. 31. I have started setting aside 2 hours of uninterrupted time per workday to work ...

  13. A cost analysis: processing maple syrup products

    Science.gov (United States)

    Neil K. Huyler; Lawrence D. Garrett

    1979-01-01

    A cost analysis of processing maple sap to syrup for three fuel types, oil-, wood-, and LP gas-fired evaporators, indicates that: (1) fuel, capital, and labor are the major cost components of processing sap to syrup; (2) wood-fired evaporators show a slight cost advantage over oil- and LP gas-fired evaporators; however, as the cost of wood approaches $50 per cord, wood...

  14. Parallel processing of structural integrity analysis codes

    International Nuclear Information System (INIS)

    Swami Prasad, P.; Dutta, B.K.; Kushwaha, H.S.

    1996-01-01

    Structural integrity analysis plays an important role in assessing and demonstrating the safety of nuclear reactor components. This analysis is performed using analytical tools such as the Finite Element Method (FEM) with the help of digital computers. The complexity of the problems involved in nuclear engineering demands high-speed computation facilities to obtain solutions in a reasonable amount of time. Parallel processing systems such as ANUPAM provide an efficient platform for realising high-speed computation. The development and implementation of software on parallel processing systems is an interesting and challenging task. The data and algorithm structure of the codes plays an important role in exploiting the parallel processing system's capabilities. Structural analysis codes based on FEM can be divided into two categories with respect to their implementation on parallel processing systems. The first category, codes such as those used for harmonic analysis and mechanistic fuel performance codes, does not require parallelisation of the individual modules of the codes. The second category, such as conventional FEM codes, requires parallelisation of individual modules. In this category, parallelisation of the equation solution module poses major difficulties. Different solution schemes, such as the domain decomposition method (DDM), parallel active column solvers and substructuring methods, are currently used on parallel processing systems. Two codes, FAIR and TABS, belonging to each of these categories, have been implemented on ANUPAM. The implementation details of these codes and the performance of different equation solvers are highlighted. (author). 5 refs., 12 figs., 1 tab
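
    For the first category, the parallelism is at the level of independent analysis cases rather than inside the solver. A minimal Python sketch of that pattern, with a stand-in linear solve instead of a real FEM module (the model and frequencies are invented):

      from multiprocessing import Pool
      import numpy as np

      def solve_case(freq: float) -> float:
          """Solve one independent analysis case (a stand-in linear solve)."""
          n = 200
          K = np.eye(n) * (1.0 + freq)   # stand-in for a stiffness-like matrix
          f = np.ones(n)
          u = np.linalg.solve(K, f)
          return float(u.max())

      if __name__ == "__main__":
          freqs = [0.5 * i for i in range(1, 17)]   # independent load cases
          with Pool(4) as pool:                     # four worker processes
              results = pool.map(solve_case, freqs)
          print(results[:3])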

  15. Process based analysis of manually controlled drilling processes for bone

    Science.gov (United States)

    Teicher, Uwe; Achour, Anas Ben; Nestler, Andreas; Brosius, Alexander; Lauer, Günter

    2018-05-01

    The machining operation of drilling is part of the standard repertoire for medical applications. This machining cycle, which is usually a multi-stage process, generates the geometric element for the subsequent integration of implants, which are screwed into the bone in later processes. In addition to the form, shape and position of the generated drill hole, it is also necessary to use a technology that ensures an operation with minimal damage. A surface damaged by excessive mechanical and thermal energy input shows a deterioration in the healing capacity of implants and represents a structure prone to complications from inflammatory reactions. The resulting loads are influenced by the material properties of the bone, the technology used and the tool properties. An important aspect of the process analysis is the fact that machining of bone is in most cases a manual process that depends mainly on the skills of the operator. This includes, among other things, the machining time for the production of a drill hole, since manual drilling is a force-controlled process. Experimental work was carried out on the bone of a porcine mandible in order to investigate the interrelation of the applied load during drilling. It can be shown that the load application can be subdivided according to the working feed direction. The entire drilling process thus consists of several time domains, which can be divided into the geometry-generating feed motion and a retraction movement of the tool. It has been shown that the removal of the tool from the drill hole has a significant influence on the mechanical load input. This fact is demonstrated in detail by a new evaluation methodology. The causes of this characteristic are also identified, as well as possible ways of reducing the load input.
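
    A hedged numpy sketch of the time-domain split described above: the force record of a drilling cycle is segmented into feed and retraction phases from the sign of the feed displacement rate. The signals are synthetic placeholders, not the porcine mandible measurements:

      import numpy as np

      t = np.linspace(0.0, 4.0, 4001)   # time [s]
      # Synthetic feed displacement [mm]: advance until t = 3 s, then retract.
      depth = np.where(t < 3.0, 4.0 * t / 3.0, 4.0 - 8.0 * (t - 3.0))
      force = 20.0 + 5.0 * np.sin(8 * np.pi * t)   # stand-in thrust force [N]

      feed = np.gradient(depth, t) > 0             # True while the tool advances
      for name, mask in [("feed", feed), ("retraction", ~feed)]:
          print(f"{name}: {t[mask][0]:.2f}-{t[mask][-1]:.2f} s, "
                f"mean force {force[mask].mean():.1f} N")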

  16. Software Process Improvement Using Force Field Analysis ...

    African Journals Online (AJOL)

    An improvement plan is then drawn and implemented. This paper studied the state of Nigerian software development organizations based on selected attributes. Force field analysis is used to partition the factors obtained into driving and restraining forces. An attempt was made to improve the software development process ...

  17. Applied Behavior Analysis and Statistical Process Control?

    Science.gov (United States)

    Hopkins, B. L.

    1995-01-01

    Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.…
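
    For concreteness, a minimal individuals control chart, one of the standard SPC methods at issue in this debate: points outside mean +/- 3 sigma-hat (estimated from the moving range) flag special-cause variation. The session-by-session counts are hypothetical:

      import numpy as np

      x = np.array([12, 14, 11, 13, 12, 15, 13, 12, 26, 13, 14, 12], dtype=float)
      mr = np.abs(np.diff(x))            # moving ranges of consecutive points
      sigma_hat = mr.mean() / 1.128      # d2 constant for subgroups of size 2
      ucl = x.mean() + 3 * sigma_hat
      lcl = x.mean() - 3 * sigma_hat

      for i, v in enumerate(x):
          if v > ucl or v < lcl:
              print(f"session {i}: {v} outside [{lcl:.1f}, {ucl:.1f}]")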

  18. Ergonomic analysis of radiopharmaceuticals samples preparation process

    International Nuclear Information System (INIS)

    Gomes, Luciene Betzler C.; Santos, Isaac Luquetti dos; Fonseca, Antonio Carlos C. da; Pellini, Marcos Pinto; Rebelo, Ana Maria

    2005-01-01

    The doses of radioisotopes to be administrated in patients for diagnostic effect or therapy are prepared in the radiopharmacological sector. The preparation process adopts techniques that are aimed to reduce the exposition time of the professionals and the absorption of excessive doses for patients. The ergonomic analysis of this process contributes in the prevention of occupational illnesses and to prevent risks of accidents during the routines, providing welfare and security to the involved users and conferring to the process an adequate working standard. In this context it is perceived relevance of studies that deal with the analysis of factors that point with respect to the solution of problems and for establishing proposals that minimize risks in the exercise of the activities. Through a methodology that considers the application of the concepts of Ergonomics, it is searched the improvement of the effectiveness or the quality and reduction of the difficulties lived for the workers. The work prescribed, established through norms and procedures codified will be faced with the work effectively carried through, the real work, shaped to break the correct appreciation, with focus in the activities. This work has as objective to argue an ergonomic analysis of samples preparation process of radioisotopes in the Setor de Radiofarmacia do Hospital Universitario Clementino Fraga Filho da Universidade Federal do Rio de Janeiro (UFRJ). (author)

  19. Integrating human factors into process hazard analysis

    International Nuclear Information System (INIS)

    Kariuki, S.G.; Loewe, K.

    2007-01-01

    A comprehensive process hazard analysis (PHA) needs to address human factors. This paper describes an approach that systematically identifies human error in process design and the human factors that influence its production and propagation. It is deductive in nature and therefore considers human error as a top event. The combinations of different factors that may lead to this top event are analysed. It is qualitative in nature and is used in combination with other PHA methods. The method has an advantage in that it does not regard operator error as the sole contributor to human failure within a system, but rather considers a combination of all underlying factors.
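
    A toy Python sketch of the deductive idea: the human-error top event is computed from combinations of underlying factors through AND/OR gates, assuming independent events. The tree structure and probabilities are invented for illustration, not taken from the paper:

      def or_gate(*p):   # at least one input event occurs (independent events)
          q = 1.0
          for pi in p:
              q *= (1.0 - pi)
          return 1.0 - q

      def and_gate(*p):  # all input events occur (independent events)
          q = 1.0
          for pi in p:
              q *= pi
          return q

      # Hypothetical performance-shaping factors and probabilities.
      p_fatigue, p_time_pressure, p_bad_labels, p_no_check = 0.05, 0.10, 0.02, 0.20
      slip = and_gate(or_gate(p_fatigue, p_time_pressure), p_bad_labels)
      top = and_gate(or_gate(slip, 0.01), p_no_check)  # error made AND not caught
      print(f"P(top event) = {top:.4f}")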

  20. Control system for technological processes in tritium processing plants with process analysis

    International Nuclear Information System (INIS)

    Retevoi, Carmen Maria; Stefan, Iuliana; Balteanu, Ovidiu; Stefan, Liviu; Bucur, Ciprian

    2005-01-01

    Integration of a large variety of installations and equipment into a unitary system for controlling the technological process in tritium processing nuclear facilities is a rather complex task, particularly when experimental or new technologies are developed. Ensuring a high degree of versatility that allows easy modification of configurations and process parameters is a major requirement imposed on experimental installations. The large amount of data which must be processed, stored and easily accessed for subsequent analyses requires the development of a large information network based on a highly integrated system containing the acquisition, control and technological process analysis data, as well as a database system. On such a basis, integrated computation and control systems able to conduct the technological process can be developed, as well as protection systems for cases of failure or breakdown. The integrated system responds to the control and security requirements in case of emergency for the technological processes specific to industries that process radioactive or toxic substances with severe consequences in case of technological failure, as in the case of a tritium processing nuclear plant. In order to lower the risk of technological failure of these processes, an integrated software, database and process analysis system is developed which, based on an algorithm identifying the parameters important for the protection and security systems, will display the process evolution trend. The system was checked on an existing plant that includes a tritium removal unit, ultimately to be used in a nuclear power plant, by simulating failure events as well as the process. The system will also include a complete database monitoring all the parameters and process analysis software for the main modules of the tritium processing plant, namely isotope separation, catalytic purification and cryogenic distillation.

  1. Big Data Analysis of Manufacturing Processes

    Science.gov (United States)

    Windmann, Stefan; Maier, Alexander; Niggemann, Oliver; Frey, Christian; Bernardi, Ansgar; Gu, Ying; Pfrommer, Holger; Steckel, Thilo; Krüger, Michael; Kraus, Robert

    2015-11-01

    The high complexity of manufacturing processes and the continuously growing amount of data lead to excessive demands on the users with respect to process monitoring, data analysis and fault detection. For these reasons, problems and faults are often detected too late, maintenance intervals are chosen too short and optimization potential for higher output and increased energy efficiency is not sufficiently used. A possibility to cope with these challenges is the development of self-learning assistance systems, which identify relevant relationships by observation of complex manufacturing processes so that failures, anomalies and need for optimization are automatically detected. The assistance system developed in the present work accomplishes data acquisition, process monitoring and anomaly detection in industrial and agricultural processes. The assistance system is evaluated in three application cases: Large distillation columns, agricultural harvesting processes and large-scale sorting plants. In this paper, the developed infrastructures for data acquisition in these application cases are described as well as the developed algorithms and initial evaluation results.
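
    A hedged sketch of the self-learning pattern described: learn normal behaviour from observed process data, then flag deviations automatically. scikit-learn's IsolationForest stands in for the (unspecified) algorithms of the assistance system, and the two-variable data are synthetic:

      import numpy as np
      from sklearn.ensemble import IsolationForest

      rng = np.random.default_rng(3)
      # Synthetic normal operation: temperature [C] and flow [kg/s].
      normal = rng.normal([80.0, 1.2], [2.0, 0.05], size=(1000, 2))
      model = IsolationForest(random_state=0).fit(normal)

      new = np.array([[80.5, 1.21], [95.0, 0.6]])  # second point is anomalous
      print(model.predict(new))                     # 1 = normal, -1 = anomaly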

  2. Big Data Analysis of Manufacturing Processes

    International Nuclear Information System (INIS)

    Windmann, Stefan; Maier, Alexander; Niggemann, Oliver; Frey, Christian; Bernardi, Ansgar; Gu, Ying; Pfrommer, Holger; Steckel, Thilo; Krüger, Michael; Kraus, Robert

    2015-01-01

    The high complexity of manufacturing processes and the continuously growing amount of data lead to excessive demands on the users with respect to process monitoring, data analysis and fault detection. For these reasons, problems and faults are often detected too late, maintenance intervals are chosen too short and optimization potential for higher output and increased energy efficiency is not sufficiently used. A possibility to cope with these challenges is the development of self-learning assistance systems, which identify relevant relationships by observation of complex manufacturing processes so that failures, anomalies and need for optimization are automatically detected. The assistance system developed in the present work accomplishes data acquisition, process monitoring and anomaly detection in industrial and agricultural processes. The assistance system is evaluated in three application cases: Large distillation columns, agricultural harvesting processes and large-scale sorting plants. In this paper, the developed infrastructures for data acquisition in these application cases are described as well as the developed algorithms and initial evaluation results. (paper)

  3. Exergy analysis of the LFC process

    International Nuclear Information System (INIS)

    Li, Qingsong; Lin, Yuankui

    2016-01-01

    Highlights: • Mengdong lignite was upgraded by the liquids from coal (LFC) process at laboratory scale. • True boiling point distillation of the tar was performed. • Based on experimental data, the LFC process was simulated in Aspen Plus. • Amounts of exergy destruction and efficiencies of blocks were calculated. • Potential measures for improving the LFC process are suggested. - Abstract: Liquids from coal (LFC) is a pyrolysis technology for upgrading lignite. LFC is close to viability as a large-scale commercial technology and is strongly promoted by the Chinese government. This paper presents an exergy analysis of the LFC process producing semicoke and tar, simulated in Aspen Plus. The simulation included the drying unit, pyrolysis unit, tar recovery unit and combustion unit. To obtain the data required for the simulation, Mengdong lignite was upgraded using a laboratory-scale experimental facility based on LFC technology. True boiling point distillation of the tar was performed. Based on thermodynamic data obtained from the simulation, chemical exergy and physical exergy were determined for the process streams, and exergy destruction was calculated. The exergy budget of the LFC process is presented as a Grassmann flow diagram. The overall exergy efficiency was 76.81%, with the combustion unit causing the highest exergy destruction. The study found that the overall exergy efficiency can be increased by reducing the moisture in the lignite and making full use of the physical exergy of the pyrolysates. A feasible method for making full use of the physical exergy of the semicoke is suggested.
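
    A minimal Python sketch of the stream-exergy bookkeeping behind such an analysis: physical exergy from enthalpy and entropy relative to the dead state, and an overall efficiency as exergy out over exergy in. All property values below are invented placeholders, not the paper's Aspen Plus results:

      T0 = 298.15  # dead-state temperature [K]

      def physical_exergy(h, h0, s, s0):
          """e_ph = (h - h0) - T0 * (s - s0), per unit mass [kJ/kg]."""
          return (h - h0) - T0 * (s - s0)

      # Hypothetical product streams (enthalpies in kJ/kg, entropies in kJ/kg/K).
      e_semicoke = physical_exergy(h=850.0, h0=30.0, s=1.90, s0=0.15)
      e_tar = physical_exergy(h=620.0, h0=25.0, s=1.40, s0=0.10)
      print(f"e_semicoke = {e_semicoke:.0f} kJ/kg, e_tar = {e_tar:.0f} kJ/kg")

      # Overall efficiency = total product exergy / total feed exergy (assumed
      # per-kg-of-lignite values, chemical + physical, purely illustrative).
      exergy_in = 2.0e4
      exergy_out = 0.7e4 + 0.8e4
      print(f"exergy efficiency = {exergy_out / exergy_in:.1%}")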

  4. Analysis of Variance in Statistical Image Processing

    Science.gov (United States)

    Kurz, Ludwik; Hafed Benteftifa, M.

    1997-04-01

    A key problem in practical image processing is the detection of specific features in a noisy image. Analysis of variance (ANOVA) techniques can be very effective in such situations, and this book gives a detailed account of the use of ANOVA in statistical image processing. The book begins by describing the statistical representation of images in the various ANOVA models. The authors present a number of computationally efficient algorithms and techniques to deal with such problems as line, edge, and object detection, as well as image restoration and enhancement. By describing the basic principles of these techniques, and showing their use in specific situations, the book will facilitate the design of new algorithms for particular applications. It will be of great interest to graduate students and engineers in the field of image processing and pattern recognition.
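
    A tiny illustration of the ANOVA idea applied to detection: test whether pixel intensities in two adjacent windows share a common mean (no edge) or differ (edge present). The windows are synthetic, and scipy's one-way ANOVA stands in for the book's more elaborate models:

      import numpy as np
      from scipy.stats import f_oneway

      rng = np.random.default_rng(4)
      left = rng.normal(100.0, 5.0, 64)    # window on one side of a candidate edge
      right = rng.normal(130.0, 5.0, 64)   # window on the other side

      f_stat, p_value = f_oneway(left, right)
      print(f"F = {f_stat:.1f}, p = {p_value:.2g} -> edge" if p_value < 0.01
            else "no evidence of an edge")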

  5. Neyman, Markov processes and survival analysis.

    Science.gov (United States)

    Yang, Grace

    2013-07-01

    J. Neyman used stochastic processes extensively in his applied work. One example is the Fix and Neyman (F-N) competing risks model (1951) that uses finite homogeneous Markov processes to analyse clinical trials with breast cancer patients. We revisit the F-N model, and compare it with the Kaplan-Meier (K-M) formulation for right censored data. The comparison offers a way to generalize the K-M formulation to include risks of recovery and relapses in the calculation of a patient's survival probability. The generalization is to extend the F-N model to a nonhomogeneous Markov process. Closed-form solutions of the survival probability are available in special cases of the nonhomogeneous processes, like the popular multiple decrement model (including the K-M model) and Chiang's staging model, but these models do not consider recovery and relapses while the F-N model does. An analysis of sero-epidemiology current status data with recurrent events is illustrated. Fix and Neyman used Neyman's RBAN (regular best asymptotic normal) estimates for the risks, and provided a numerical example showing the importance of considering both the survival probability and the length of time of a patient living a normal life in the evaluation of clinical trials. The said extension would result in a complicated model and it is unlikely to find analytical closed-form solutions for survival analysis. With ever increasing computing power, numerical methods offer a viable way of investigating the problem.
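
    A minimal numpy implementation of the Kaplan-Meier estimator for right-censored data, the formulation the abstract contrasts with the Fix-Neyman model; the follow-up times and censoring indicators are hypothetical:

      import numpy as np

      times = np.array([3, 5, 5, 8, 10, 12, 15])   # follow-up times
      event = np.array([1, 1, 0, 1, 0, 1, 1])      # 1 = event observed, 0 = censored

      surv = 1.0
      for t in np.unique(times[event == 1]):       # distinct event times, ascending
          at_risk = np.sum(times >= t)
          deaths = np.sum((times == t) & (event == 1))
          surv *= 1.0 - deaths / at_risk           # product-limit update
          print(f"t={t:>2}: S(t) = {surv:.3f}")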

  6. Flame analysis using image processing techniques

    Science.gov (United States)

    Her Jie, Albert Chang; Zamli, Ahmad Faizal Ahmad; Zulazlan Shah Zulkifli, Ahmad; Yee, Joanne Lim Mun; Lim, Mooktzeng

    2018-04-01

    This paper presents image processing techniques using fuzzy logic and a neural network approach to perform flame analysis. Flame diagnostics are important in industry for extracting relevant information from flame images. Experimental tests were carried out in a model industrial burner at different flow rates. Flame features such as luminous and spectral parameters are extracted using image processing and the Fast Fourier Transform (FFT). Flame images are acquired using a FLIR infrared camera. Non-linearities such as thermo-acoustic oscillations and background noise affect the stability of the flame. Flame velocity is one of the important characteristics that determine flame stability. In this paper, an image processing method is proposed to determine flame velocity. A power spectral density (PSD) graph is a good tool for vibration analysis, from which flame stability can be approximated. However, a more intelligent diagnostic system is needed to determine flame stability automatically. In this paper, flame features at different flow rates are compared and analyzed. The selected flame features are used as inputs to the proposed fuzzy inference system to determine flame stability. A neural network is used to test the performance of the fuzzy inference system.
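
    A short sketch of the PSD step mentioned above: estimate the power spectral density of a luminosity signal with Welch's method and read off the dominant oscillation frequency. The signal is synthetic (a 120 Hz tone plus noise), not camera data:

      import numpy as np
      from scipy.signal import welch

      fs = 2000.0                              # sampling rate [Hz]
      t = np.arange(0, 2.0, 1.0 / fs)
      rng = np.random.default_rng(5)
      signal = np.sin(2 * np.pi * 120 * t) + 0.5 * rng.standard_normal(t.size)

      freqs, psd = welch(signal, fs=fs, nperseg=1024)
      print(f"dominant frequency ~ {freqs[np.argmax(psd)]:.0f} Hz")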

  7. Preliminary Hazards Analysis Plasma Hearth Process

    International Nuclear Information System (INIS)

    Aycock, M.; Coordes, D.; Russell, J.; TenBrook, W.; Yimbo, P.

    1993-11-01

    This Preliminary Hazards Analysis (PHA) for the Plasma Hearth Process (PHP) follows the requirements of United States Department of Energy (DOE) Order 5480.23 (DOE, 1992a), DOE Order 5480.21 (DOE, 1991d), DOE Order 5480.22 (DOE, 1992c), DOE Order 5481.1B (DOE, 1986), and the guidance provided in DOE Standard DOE-STD-1027-92 (DOE, 1992b). Consideration is given to the proposed regulations published as 10 CFR 830 (DOE, 1993) and DOE Safety Guide SG 830.110 (DOE, 1992b). The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during, and provides input to, project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title I and II design. The PSAR in turn leads to the Final Safety Analysis Report, performed during construction, testing, and acceptance and completed before routine operation. Radiological assessments indicate that a PHP facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous material assessments indicate that a PHP facility will be a Low Hazard facility having no significant impacts either onsite or offsite to personnel and the environment.

  8. POST-PROCESSING ANALYSIS FOR THC SEEPAGE

    International Nuclear Information System (INIS)

    SUN, Y.

    2004-01-01

    This report describes the selection of water compositions for the total system performance assessment (TSPA) model of results from the thermal-hydrological-chemical (THC) seepage model documented in ''Drift-Scale THC Seepage Model'' (BSC 2004 [DIRS 169856]). The selection has been conducted in accordance with ''Technical Work Plan for: Near-Field Environment and Transport: Coupled Processes (Mountain-Scale TH/THC/THM, Drift-Scale THC Seepage, and Post-Processing Analysis for THC Seepage) Report Integration'' (BSC 2004 [DIRS 171334]). This technical work plan (TWP) was prepared in accordance with AP-2.27Q, ''Planning for Science Activities''. Section 1.2.3 of the TWP describes planning information pertaining to the technical scope, content, and management of this report. The post-processing analysis for THC seepage (THC-PPA) documented in this report provides a methodology for evaluating the near-field compositions of water and gas around a typical waste emplacement drift as these relate to the chemistry of seepage, if any, into the drift. The THC-PPA inherits the conceptual basis of the THC seepage model, but is an independently developed process. The relationship between the post-processing analysis and other closely related models, together with their main functions in providing seepage chemistry information for the Total System Performance Assessment for the License Application (TSPA-LA), are illustrated in Figure 1-1. The THC-PPA provides a data selection concept and direct input to the physical and chemical environment (P and CE) report that supports the TSPA model. The purpose of the THC-PPA is further discussed in Section 1.2. The data selection methodology of the post-processing analysis (Section 6.2.1) was initially applied to results of the THC seepage model as presented in ''Drift-Scale THC Seepage Model'' (BSC 2004 [DIRS 169856]). Other outputs from the THC seepage model (DTN: LB0302DSCPTHCS.002 [DIRS 161976]) used in the P and CE (BSC 2004 [DIRS 169860

  9. Sustainable process design & analysis of hybrid separations

    DEFF Research Database (Denmark)

    Kumar Tula, Anjan; Befort, Bridgette; Garg, Nipun

    2016-01-01

    Distillation is an energy intensive operation in chemical process industries. There are around 40,000 distillation columns in operation in the US, requiring approximately 40% of the total energy consumption in US chemical process industries. However, analysis of separations by distillation has...... shown that more than 50% of energy is spent in purifying the last 5-10% of the distillate product. Membrane modules on the other hand can achieve high purity separations at lower energy costs, but when the flux is low they require a large membrane area. A hybrid scheme where distillation and membrane...... modules are combined such that each operates at its highest efficiency, has the potential for significant energy reduction without significant increase of capital costs. This paper presents a method for sustainable design of hybrid distillation-membrane schemes with guaranteed reduction of energy...

  10. Image processing and analysis software development

    International Nuclear Information System (INIS)

    Shahnaz, R.

    1999-01-01

    The work presented in this project is aimed at developing a software 'IMAGE GALLERY' to investigate various image processing and analysis techniques. The work was divided into two parts, namely image processing techniques and pattern recognition, the latter comprising character and face recognition. Various image enhancement techniques, including negative imaging, contrast stretching, dynamic range compression, neon, diffuse, emboss etc., have been studied. Segmentation techniques including point detection, line detection and edge detection have been studied, and some of the smoothing and sharpening filters have been investigated. All these imaging techniques have been implemented in a window-based computer program written in Visual Basic. Neural network techniques based on the Perceptron model have been applied for face and character recognition. (author)
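
    As an illustration of two of the enhancement techniques named above, the following is a minimal NumPy sketch of negative imaging and linear contrast stretching. It is not taken from the 'IMAGE GALLERY' program (which was written in Visual Basic); the synthetic img array stands in for any 8-bit grayscale image.

```python
import numpy as np

def negative(img: np.ndarray) -> np.ndarray:
    """Negative imaging: invert every 8-bit pixel value."""
    return 255 - img

def contrast_stretch(img: np.ndarray) -> np.ndarray:
    """Linear contrast stretching: map [min, max] of the image onto [0, 255]."""
    lo, hi = int(img.min()), int(img.max())
    if hi == lo:                              # flat image, nothing to stretch
        return img.copy()
    out = (img.astype(np.float64) - lo) * 255.0 / (hi - lo)
    return out.astype(np.uint8)

# Synthetic low-contrast image: values confined to [100, 150)
img = np.random.randint(100, 150, size=(64, 64), dtype=np.uint8)
print(negative(img).min())          # bright pixels become dark (here 255 - 149 = 106)
print(contrast_stretch(img).max())  # 255: full dynamic range restored
```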

  11. Warpage analysis in injection moulding process

    Science.gov (United States)

    Hidayah, M. H. N.; Shayfull, Z.; Nasir, S. M.; Fathullah, M.; Hazwan, M. H. M.

    2017-09-01

    This study concentrated on the effects of process parameters in the plastic injection moulding process on the warpage problem, using Autodesk Moldflow Insight (AMI) software for the simulation. A plastic dental floss dispenser was analysed, with the thermoplastic Polypropylene (PP) as the moulded material, and the detailed properties of an 80-tonne Nessei NEX 1000 injection moulding machine were also used in this study. The variable parameters of the process are packing pressure, packing time, melt temperature and cooling time. Minimization of warpage was obtained from the optimization and analysis of data in the Design Expert software. The method used integrates Response Surface Methodology (RSM) and Central Composite Design (CCD) with polynomial models obtained from Design of Experiments (DOE). The results show that packing pressure is the main factor contributing to the formation of warpage in the x-axis and y-axis, while in the z-axis the main factor is melt temperature; packing time is the least significant of the four parameters in the x-, y- and z-axes. With the optimal processing parameters, warpage in the x-, y- and z-axes was reduced by 21.60%, 26.45% and 24.53%, respectively.
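
    To make the RSM/CCD step concrete, here is a small Python sketch that builds a two-factor central composite design and fits a full quadratic response surface by ordinary least squares. The coded factors and the synthetic warpage response are illustrative assumptions, not data from the study.

```python
import numpy as np
from itertools import product

def ccd_two_factors(alpha: float = np.sqrt(2)) -> np.ndarray:
    """Central composite design in coded units: 4 factorial, 4 axial, 1 center point."""
    factorial = np.array(list(product([-1.0, 1.0], repeat=2)))
    axial = np.array([[alpha, 0.0], [-alpha, 0.0], [0.0, alpha], [0.0, -alpha]])
    center = np.zeros((1, 2))
    return np.vstack([factorial, axial, center])

def fit_quadratic(X: np.ndarray, y: np.ndarray) -> np.ndarray:
    """Least-squares fit of y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2."""
    x1, x2 = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones_like(y), x1, x2, x1 * x2, x1**2, x2**2])
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coeffs

X = ccd_two_factors()                    # x1 ~ packing pressure, x2 ~ melt temperature (coded)
rng = np.random.default_rng(1)
y = 0.9 - 0.25 * X[:, 0] + 0.10 * X[:, 1] + 0.05 * X[:, 0]**2   # hypothetical warpage, mm
y = y + rng.normal(0.0, 0.01, len(y))    # measurement noise
print(fit_quadratic(X, y))               # recovers the coefficients up to noise
```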

  12. Mathematical foundations of image processing and analysis

    CERN Document Server

    Pinoli, Jean-Charles

    2014-01-01

    Mathematical Imaging is currently a rapidly growing field in applied mathematics, with an increasing need for theoretical mathematics. This book, the second of two volumes, emphasizes the role of mathematics as a rigorous basis for imaging sciences. It provides a comprehensive and convenient overview of the key mathematical concepts, notions, tools and frameworks involved in the various fields of gray-tone and binary image processing and analysis, by proposing a large, but coherent, set of symbols and notations, a complete list of subjects and a detailed bibliography. It establishes a bridge...

  13. Multivariate Analysis for the Processing of Signals

    Directory of Open Access Journals (Sweden)

    Beattie J.R.

    2014-01-01

    Full Text Available Real-world experiments are becoming increasingly more complex, needing techniques capable of tracking this complexity. Signal based measurements are often used to capture this complexity, where a signal is a record of a sample's response to a parameter (e.g. time, displacement, voltage, wavelength) that is varied over a range of values. In signals the responses at each value of the varied parameter are related to each other, depending on the composition or state of the sample being measured. Since signals contain multiple information points, they have rich information content but are generally complex to comprehend. Multivariate Analysis (MA) has profoundly transformed their analysis by allowing gross simplification of the tangled web of variation. In addition, MA has provided the advantage of being much more robust to the influence of noise than univariate methods of analysis. In recent years, there has been a growing awareness that the nature of the multivariate methods allows exploitation of its benefits for purposes other than data analysis, such as pre-processing of signals with the aim of eliminating irrelevant variations prior to analysis of the signal of interest. It has been shown that exploiting multivariate data reduction in an appropriate way can allow high fidelity denoising (removal of irreproducible non-signals), consistent and reproducible noise-insensitive correction of baseline distortions (removal of reproducible non-signals), accurate elimination of interfering signals (removal of reproducible but unwanted signals) and the standardisation of signal amplitude fluctuations. At present, the field is relatively small but the possibilities for much wider application are considerable. Where signal properties are suitable for MA (such as the signal being stationary along the x-axis), these signal based corrections have the potential to be highly reproducible, and highly adaptable and are applicable in situations where the data is noisy or
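
    A minimal sketch of one such multivariate pre-processing step, assuming truncated principal-component reconstruction (computed here via the SVD) as the data-reduction technique and a synthetic set of noisy signals:

```python
import numpy as np

def pca_denoise(signals: np.ndarray, n_components: int) -> np.ndarray:
    """Reconstruct the signal matrix (rows = measurements) from its leading
    principal components; the discarded low-variance directions mostly
    carry irreproducible noise."""
    mean = signals.mean(axis=0)
    U, s, Vt = np.linalg.svd(signals - mean, full_matrices=False)
    s[n_components:] = 0.0                      # keep only the leading components
    return U @ np.diag(s) @ Vt + mean

# Synthetic example: 50 spectra = varying amounts of one band shape + noise
x = np.linspace(0.0, 1.0, 400)
band = np.exp(-(x - 0.4) ** 2 / 0.003)
rng = np.random.default_rng(0)
amounts = rng.uniform(0.5, 1.5, size=(50, 1))
noisy = amounts * band + rng.normal(0.0, 0.2, size=(50, x.size))
denoised = pca_denoise(noisy, n_components=1)
print(np.abs(denoised - amounts * band).mean())   # residual well below the 0.2 noise level
```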

  14. Partial wave analysis using graphics processing units

    Energy Technology Data Exchange (ETDEWEB)

    Berger, Niklaus; Liu Beijiang; Wang Jike, E-mail: nberger@ihep.ac.c [Institute of High Energy Physics, Chinese Academy of Sciences, 19B Yuquan Lu, Shijingshan, 100049 Beijing (China)

    2010-04-01

    Partial wave analysis is an important tool for determining resonance properties in hadron spectroscopy. For large data samples however, the un-binned likelihood fits employed are computationally very expensive. At the Beijing Spectrometer (BES) III experiment, an increase in statistics compared to earlier experiments of up to two orders of magnitude is expected. In order to allow for a timely analysis of these datasets, additional computing power with short turnover times has to be made available. It turns out that graphics processing units (GPUs) originally developed for 3D computer games have an architecture of massively parallel single instruction multiple data floating point units that is almost ideally suited for the algorithms employed in partial wave analysis. We have implemented a framework for tensor manipulation and partial wave fits called GPUPWA. The user writes a program in pure C++ whilst the GPUPWA classes handle computations on the GPU, memory transfers, caching and other technical details. In conjunction with a recent graphics processor, the framework provides a speed-up of the partial wave fit by more than two orders of magnitude compared to legacy FORTRAN code.
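
    GPUPWA itself is not reproduced here, but the toy NumPy sketch below illustrates why unbinned likelihood fits parallelize so well: each event contributes an independent term to the log-likelihood, so the sum maps directly onto massively parallel floating point units. The Breit-Wigner-style intensity and its crude numerical normalisation are illustrative assumptions only.

```python
import numpy as np

def negative_log_likelihood(theta, events):
    """Toy unbinned NLL: every event contributes an independent log term,
    so the evaluation is embarrassingly data-parallel (vectorized here;
    on a GPU each event would occupy one SIMD lane)."""
    mass, width = theta
    bw = lambda m: 1.0 / ((m**2 - mass**2) ** 2 + mass**2 * width**2)
    grid = np.linspace(events.min(), events.max(), 2000)
    norm = np.trapz(bw(grid), grid)           # crude normalisation, illustration only
    return -np.sum(np.log(bw(events) / norm))

events = np.random.default_rng(2).normal(0.77, 0.08, 100_000)  # fake resonance sample
print(negative_log_likelihood((0.77, 0.15), events))
```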

  15. Investigating the Process of Process Modeling with Eye Movement Analysis

    OpenAIRE

    Pinggera, Jakob; Furtner, Marco; Martini, Markus; Sachse, Pierre; Reiter, Katharina; Zugal, Stefan; Weber, Barbara

    2015-01-01

    Research on quality issues of business process models has recently begun to explore the process of creating process models by analyzing the modeler's interactions with the modeling environment. In this paper we aim to complement previous insights on the modeler's modeling behavior with data gathered by tracking the modeler's eye movements when engaged in the act of modeling. We present preliminary results and outline directions for future research to triangulate toward a more comprehensive un...

  16. Prototype plant for nuclear process heat (PNP), reference phase

    International Nuclear Information System (INIS)

    Fladerer, R.; Schrader, L.

    1982-07-01

    The coal gasification processes using nuclear process heat, being developed within the framework of the PNP project, have the advantages of saving feed coal, improving efficiency, reducing emissions, and stabilizing energy costs. One major gasification process is the hydrogasification of coal for producing SNG or a gas mixture of carbon monoxide and hydrogen; this process can also be applied in a conventional route. The first steps in developing this process were the planning, construction and operation of a semi-technical pilot plant for hydrogasification of coal in a fluidized bed with an input of 100 kg C/h. Before the completion of the development phase (reference phase) described here, several components were tested, for some of which no operational experience had so far been gained; these were the newly developed devices, e.g. the inclined tube for feeding coal into the fluidized bed, and the raw gas/hydrogenation gas heat exchanger for utilizing the waste heat of the raw gas leaving the gasifier. Design optimization of the thoroughly tested equipment parts led to improved operational behaviour. Between 1976 and 1980, the semi-technical pilot plant was operated for about 19,400 hours under test conditions, more than 7,400 hours of which were under gasification conditions. During this time approx. 1,100 metric tons of dry brown coal and more than 13 metric tons of hard coal were gasified. The longest coherent operational phase under gasification conditions was 748 hours, in which 85.4 metric tons of dry brown coal were gasified. Carbon gasification rates up to 82% and methane contents in the dry raw gas (free of N2) up to 48 vol.% were obtained. A detailed evaluation of the test results confirmed the information obtained previously. For the completion of the tests, primarily the long-term tests, the operation of the semi-technical pilot plant for hydrogasification of coal is to be continued until September 1982. (orig.) [de
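
    The carbon gasification rate quoted above is in essence a simple mass balance; a minimal sketch with illustrative numbers (not pilot-plant measurements):

```python
def carbon_conversion(carbon_in_kg: float, carbon_in_residue_kg: float) -> float:
    """Carbon gasification rate: fraction of the feed carbon converted to gas."""
    return (carbon_in_kg - carbon_in_residue_kg) / carbon_in_kg

# Illustrative only: 100 kg feed carbon, 18 kg left in the residual char
print(f"{carbon_conversion(100.0, 18.0):.0%}")   # -> 82%
```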

  17. Traffic analysis and control using image processing

    Science.gov (United States)

    Senthilkumar, K.; Ellappan, Vijayan; Arun, A. R.

    2017-11-01

    This paper describes work on traffic analysis and control to date. It shows an approach to regulating traffic using image processing and MATLAB. The concept uses computed images that are compared with original images of the street, taken in order to determine the traffic level percentage and to set the timing of the traffic signal accordingly, which reduces stoppage time at traffic lights. The concept proposes to solve real-life scenarios in the streets, enriching the traffic lights by adding image receivers like HD cameras and image processors. The input is then imported into MATLAB to be used as a method for calculating the traffic on roads. The results are computed in order to adjust the traffic light timings on a particular street, also with respect to other similar proposals, but with the added value of solving a real, big instance.
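
    A minimal sketch of the frame-differencing idea, written with OpenCV in Python rather than MATLAB; the threshold and timing constants are illustrative assumptions, and synthetic arrays stand in for camera frames:

```python
import cv2
import numpy as np

def traffic_level(reference: np.ndarray, frame: np.ndarray) -> float:
    """Fraction of pixels that differ significantly between an empty-road
    reference image and the current frame (both 8-bit grayscale)."""
    diff = cv2.absdiff(reference, frame)
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    return np.count_nonzero(mask) / mask.size

def green_time(level: float, base_s: float = 10.0, max_extra_s: float = 50.0) -> float:
    """Scale the green-light duration with the measured traffic level."""
    return base_s + max_extra_s * level

# Synthetic stand-ins for camera frames (real use: cv2.imread on HD camera stills)
empty = np.zeros((480, 640), dtype=np.uint8)
frame = empty.copy()
frame[200:280, 100:540] = 180                   # a "row of vehicles"
level = traffic_level(empty, frame)
print(f"traffic {level:.0%}, green for {green_time(level):.0f} s")
```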

  18. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-03-01

    Full Text Available Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations-COSO, 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957) lament that although the introduction of quantifying risk to enhance the degree of objectivity in finance, for instance, was quite parallel to its development in the manufacturing industry, the same is not true in Higher Education Institutions (HEIs). In this regard, the objective of the paper was to demonstrate the methods and process of Quantitative Risk Analysis (QRA) through the likelihood of occurrence of risk (phase I). This paper serves as the first of a two-phased study, which sampled one hundred (100) risk analysts in a university in the greater Eastern Cape Province of South Africa. The analysis of the likelihood of occurrence of risk by logistic regression and percentages was conducted to investigate whether or not there was a significant difference between groups (analysts) in respect of QRA. The Hosmer and Lemeshow test was non-significant with a chi-square (X2 = 8.181; p = 0.300), which indicated a good model fit, since the data did not significantly deviate from the model. The study concluded that to derive an overall likelihood rating indicating the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1) threat source motivation and capability, (2) nature of the vulnerability, and (3) existence and effectiveness of current controls (methods and process).
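
    For reference, the Hosmer and Lemeshow statistic reported above can be computed as follows; this is a generic sketch on synthetic data, not the study's own code:

```python
import numpy as np
from scipy.stats import chi2

def hosmer_lemeshow(y_true, p_pred, groups=10):
    """Hosmer-Lemeshow goodness-of-fit statistic: observations are sorted by
    predicted probability, split into groups, and observed vs expected event
    counts are compared; a non-significant p-value indicates a good fit."""
    order = np.argsort(p_pred)
    y = np.asarray(y_true, dtype=float)[order]
    p = np.asarray(p_pred, dtype=float)[order]
    stat = 0.0
    for idx in np.array_split(np.arange(y.size), groups):
        n_g, obs, exp = idx.size, y[idx].sum(), p[idx].sum()
        stat += (obs - exp) ** 2 / (exp * (1.0 - exp / n_g))
    return stat, chi2.sf(stat, groups - 2)

# Synthetic check: data generated from the model itself should fit well
rng = np.random.default_rng(3)
x = rng.normal(size=500)
p_true = 1.0 / (1.0 + np.exp(-(0.5 + 1.2 * x)))
y = rng.binomial(1, p_true)
stat, p_value = hosmer_lemeshow(y, p_true)
print(f"chi-square = {stat:.3f}, p = {p_value:.3f}")
```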

  19. Integrated Process Design, Control and Analysis of Intensified Chemical Processes

    DEFF Research Database (Denmark)

    Mansouri, Seyed Soheil

    chemical processes; for example, intensified processes such as reactive distillation. Most importantly, it identifies and eliminates potentially promising design alternatives that may have controllability problems later. To date, a number of methodologies have been proposed and applied on various problems......, that the same principles that apply to a binary non-reactive compound system are valid also for a binary-element or a multi-element system. Therefore, it is advantageous to employ the element based method for multicomponent reaction-separation systems. It is shown that the same design-control principles...

  20. Entrepreneurship Learning Process by using SWOT Analysis

    Directory of Open Access Journals (Sweden)

    Jajat Sudrajat

    2016-03-01

    Full Text Available The research objective was to produce a model of learning entrepreneurship by using SWOT analysis, which was currently being run with the concept of large classes and small classes. The benefits of this study were expected to be useful for the Binus Entrepreneurship Center (BEC) unit to create a development map for entrepreneurship learning. The influences generated by using SWOT analysis were very wide, as were the benefits of the implementation of large classes and small classes for students and faculty. Participants of this study were Binus students of various majors who were taking courses EN001 and EN002. This study used research and development, examining the theoretical learning components of entrepreneurship education (the teaching and learning dimension), where six dimensions of the survey formed the fundamental element in determining the framework of entrepreneurship education. The research finds at least eight strategies, based on a matrix of factors, for improving the learning process of entrepreneurship. One of these eight strategies is to increase BEC's collaboration with family support. This strategy is supported by the survey results from the three majors taking EN001 and EN002, where more than 85% of the students are willing to take an aptitude test to determine the advantages and disadvantages for self-development, and more than 54% of the students are not willing to accept the wishes of their parents because these do not correspond to their own ideals. Based on the above results, it is suggested for further research to develop entrepreneurship research by analyzing other dimensions.

  1. The use of wire mesh reactors to characterise solid fuels and provide improved understanding of larger scale thermochemical processes

    Energy Technology Data Exchange (ETDEWEB)

    Lu Gao; Long Wu; Nigel Paterson; Denis Dugwell; Rafael Kandiyoti [Imperial College London, London (United Kingdom). Department of Chemical Engineering

    2008-07-01

    Most reaction products from the pyrolysis and the early stages of gasification of solid fuels are chemically reactive. Secondary reactions between primary products and with heated fuel particles tend to affect the final product distributions. The extents and pathways of these secondary reactions are determined mostly by the heating rate and the size and shape of the reaction zone and of the sample itself. The wire-mesh reactor (WMR) configuration discussed in this paper allows products to be separated from reactants and enables the rapid quenching of products, allowing suppression of secondary reactions. This paper presents an overview of the development of wire-mesh reactors, describing several diverse applications. The first of these involves an analysis of the behaviour of injectant coal particles in blast furnace tuyeres and raceways. The data offered explanations that help to understand why, at high coal injection rates, problems can be encountered in the operation of blast furnaces. Another project focused on determining the extents of pyrolysis and gasification reactivities of a suite of Chinese coals under intense reaction conditions. The results showed variations in coal reactivities that were related to the C content. In another project demonstrating the versatility of the WMR configuration, the high pressure version of the reactor is being used for developing the Zero Emission Coal Alliance (ZECA) concept. The work aims to examine and explain the chemical and transport mechanisms underlying the pyrolysis, hydropyrolysis and hydrogasification stages of the process. The results obtained to date have shown the effects of the operating conditions on the extent of hydropyrolysis/gasification of a bituminous coal and two lignites. The lignites were more reactive than the coal, and the data suggest that high levels of conversion will be achievable under the anticipated ZECA process conditions. 29 refs., 15 figs., 7 tabs.

  2. Process Simulation Analysis of HF Stripping

    Directory of Open Access Journals (Sweden)

    Thaer A. Abdulla

    2015-02-01

    Full Text Available The HYSYS process simulator is used for the analysis of an existing HF stripping column in the LAB plant (Arab Detergent Company, Baiji-Iraq). Simulated column performance and profile curves are constructed. The variables considered are the thermodynamic model option, bottom temperature, feed temperature, and the column profiles of temperature, vapor flow rate, liquid flow rate and composition. The five thermodynamic model options used (Margules, UNIQUAC, van Laar, Antoine, and Zudkevitch-Joffee) affect the results within 0.1-58% variation for most cases. The simulated results show that about 4% of paraffin (C10 & C11) is present at the top stream, which may cause a problem in the LAB production plant. The major variations were noticed for the total top vapor flow rate with bottom temperature and with feed composition. The column profiles remain fairly constant from tray 5 to tray 18. The study gives evidence of a successful simulation with HYSYS, because the results correspond to the real plant operating data.
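
    Of the thermodynamic options listed, the Antoine correlation is simple enough to sketch directly. The constants below are the classical coefficients for water (P in mmHg, T in °C, roughly 1-100 °C) and serve only to illustrate the model form, not the LAB-plant components:

```python
def antoine_pressure(A: float, B: float, C: float, T_celsius: float) -> float:
    """Antoine vapor-pressure correlation: log10(P) = A - B / (C + T)."""
    return 10 ** (A - B / (C + T_celsius))

# Classical Antoine constants for water (P in mmHg, T in deg C)
print(antoine_pressure(8.07131, 1730.63, 233.426, 100.0))   # ~760 mmHg at the boiling point
```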

  3. Natural and professional help: a process analysis.

    Science.gov (United States)

    Tracey, T J; Toro, P A

    1989-08-01

    Differences in the helping interactions formed by mental health professionals, divorce lawyers, and mutual help group leaders were examined. Fourteen members of each of these three helper groups (N = 42) met independently with a coached client presenting marital difficulties. Using ratings of ability to ameliorate the personal and emotional problems presented, the 42 helpers were divided (using a median split) into successful and less successful outcome groups. The responses of each of the pairs were coded using the Hill Counselor Verbal Response Category System. The sequences of client-helper responses were examined using log-linear analysis as they varied by type of helper and outcome. Results indicated that successful helpers (regardless of type of helper) tended to use directives (e.g., guidance and approval-reassurance) differently from less successful helpers. Successful helpers used directives following client emotional expression and not following factual description. In addition, clear differences in helper responses by helper type and outcome were found. Each helper type had unique patterns of responses that differentiated successful from less successful outcomes. Client responses were found to vary across helper type even when given the same helper preceding response. Results are discussed with respect to the unique goals of each helping relationship and the different shaping processes involved in each.

  4. Second order analysis for spatial Hawkes processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Torrisi, Giovanni Luca

    We derive summary statistics for stationary Hawkes processes which can be considered as spatial versions of classical Hawkes processes. Particularly, we derive the intensity, the pair correlation function and the Bartlett spectrum. Our results for Gaussian fertility rates and the extension...... to marked Hawkes processes are discussed....
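
    As background to such summary statistics, the classical temporal Hawkes process (of which the paper treats spatial versions) can be simulated with Ogata's thinning algorithm. The exponential fertility rate below is an assumption chosen for illustration:

```python
import numpy as np

def simulate_hawkes(mu, alpha, beta, T, seed=0):
    """Ogata thinning for a temporal Hawkes process with conditional intensity
    lambda(t) = mu + sum_{t_i < t} alpha*beta*exp(-beta*(t - t_i)).
    The exponential kernel integrates to alpha, so alpha < 1 keeps it stable."""
    rng = np.random.default_rng(seed)
    events = []

    def lam(time):
        past = np.asarray(events)
        return mu + alpha * beta * np.exp(-beta * (time - past)).sum()

    t = 0.0
    while True:
        lam_bar = lam(t)                       # valid bound: intensity only decays until the next event
        t += rng.exponential(1.0 / lam_bar)
        if t >= T:
            break
        if rng.uniform() <= lam(t) / lam_bar:  # thinning (accept/reject) step
            events.append(t)
    return np.array(events)

ts = simulate_hawkes(mu=0.5, alpha=0.6, beta=2.0, T=200.0)
print(len(ts) / 200.0, "events per unit time; theory: mu/(1-alpha) =", 0.5 / (1 - 0.6))
```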

  5. Gasification of coal using nuclear process heat. Chapter D

    International Nuclear Information System (INIS)

    Schilling, H.-D.; Bonn, B.; Krauss, U.

    1979-01-01

    In the light of the high price of coal and the enormous advances made recently in nuclear engineering, the possibility of using heat from high-temperature nuclear reactors for gasification processes was discussed as early as the 1960s. The advantages of this technology are summarized. A joint programme of development work is described, in which the Nuclear Research Centre at Juelich is aiming to develop a high-temperature reactor which will supply process heat at as high a temperature as possible, while other organizations are working on the hydrogasification of lignites and hard coals, and steam gasification. Experiments are at present being carried out on a semi-technical scale, and no operational data for large-scale plants are available as yet. (author)

  6. High temperature reactor and application to nuclear process heat

    Energy Technology Data Exchange (ETDEWEB)

    Schulten, R; Kugeler, K [Kernforschungsanlage Juelich G.m.b.H. (Germany, F.R.)

    1976-01-01

    The principle of high temperature nuclear process heat is explained and the main applications (hydrogasification of coal, nuclear chemical heat pipe, direct reduction of iron ore, coal gasification by steam, and water splitting) are described in more detail. The motivation for the introduction of nuclear process heat to the market, and questions of cost, raw material resources and environmental aspects, are the next points of discussion. The new technological questions concerning the nuclear reactor and the status of development are described; in particular, information about the fuel elements, the hot gas ducts, the contamination and some design considerations is added. Furthermore, the status of development of helium-heated steam reformers, the main results of the work to date and further activities in this field are explained.

  7. Delineated Analysis of Robotic Process Automation Tools

    OpenAIRE

    Ruchi Isaac; Riya Muni; Kenali Desai

    2017-01-01

    In this age and time, when celerity is expected of all sectors of the country, the speed of execution of various processes, and hence efficiency, becomes a prominent factor. To meet the speed demands of these diverse platforms, Robotic Process Automation (RPA) is used. Robotic Process Automation can expedite back-office tasks in commercial industries, remote management tasks in IT industries and conservation of resources in multiple sectors. To implement RPA, many software ...

  8. SCHEME ANALYSIS TREE DIMENSIONS AND TOLERANCES PROCESSING

    OpenAIRE

    Constanta RADULESCU; Liviu Marius CÎRŢÎNĂ; Constantin MILITARU

    2011-01-01

    This paper presents one of the steps that help us to determine the optimal tolerances depending on the technological capability of the processing equipment. To determine the tolerances in this way, it is necessary to undertake the study and to represent schematically the operations used in the technological process of making a piece. Also in this phase, the tree diagram of the dimensions and machining tolerances will be made, with the dimensions and tolerances shown in the design execution. Determination processes, and ...

  9. Bayesian analysis of Markov point processes

    DEFF Research Database (Denmark)

    Berthelsen, Kasper Klitgaard; Møller, Jesper

    2006-01-01

    Recently Møller, Pettitt, Berthelsen and Reeves introduced a new MCMC methodology for drawing samples from a posterior distribution when the likelihood function is only specified up to a normalising constant. We illustrate the method in the setting of Bayesian inference for Markov point processes...... a partially ordered Markov point process as the auxiliary variable. As the method requires simulation from the "unknown" likelihood, perfect simulation algorithms for spatial point processes become useful....

  10. Amorphous silicon batch process cost analysis

    International Nuclear Information System (INIS)

    Whisnant, R.A.; Sherring, C.

    1993-08-01

    This report describes the development of baseline manufacturing cost data to assist PVMaT monitoring teams in assessing current and future subcontracts, with an emphasis on commercialization and production. A process for the manufacture of a single-junction, large-area, a-Si module was modeled using an existing Research Triangle Institute (RTI) computer model. The model estimates a required, or breakeven, price for the module based on its production process and the financial structure of the company operating the process. Sufficient detail on cost drivers is presented so that the process features and business characteristics can be related to the estimated required price
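
    The RTI model is not reproduced in this record, but the notion of a required (breakeven) price can be sketched with a standard annualized-cost calculation; the capital recovery factor and every number below are illustrative assumptions:

```python
def breakeven_price(capex, annual_opex, annual_output_m2, discount_rate, lifetime_years):
    """Required (breakeven) module price per m^2: annualized capital cost plus
    operating cost, divided by the annual production volume."""
    # capital recovery factor converts upfront capex into an equivalent annuity
    crf = (discount_rate * (1 + discount_rate) ** lifetime_years
           / ((1 + discount_rate) ** lifetime_years - 1))
    return (capex * crf + annual_opex) / annual_output_m2

# Illustrative numbers only: $20M plant, $4M/yr opex, 200,000 m^2/yr, 12%, 10 years
print(f"${breakeven_price(20e6, 4e6, 2e5, 0.12, 10):.2f} per m^2")
```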

  11. Two-stage process analysis using the process-based performance measurement framework and business process simulation

    NARCIS (Netherlands)

    Han, K.H.; Kang, J.G.; Song, M.S.

    2009-01-01

    Many enterprises have recently been pursuing process innovation or improvement to attain their performance goals. To align a business process with enterprise performances, this study proposes a two-stage process analysis for process (re)design that combines the process-based performance measurement

  12. Social network analysis in software process improvement

    DEFF Research Database (Denmark)

    Nielsen, Peter Axel; Tjørnehøj, Gitte

    2010-01-01

    Software process improvement in small organisations is often problematic, and communication and knowledge sharing are more informal. To improve software processes we need to understand how they communicate and share knowledge. In this article we have studied the company SmallSoft through action research...

  13. Shielding analysis of the advanced voloxidation process

    Energy Technology Data Exchange (ETDEWEB)

    Park, Chang Je; Park, J. J.; Lee, J. W.; Shin, J. M.; Park, G. I.; Song, K. C

    2008-09-15

    This report describes how much shielding benefit can be obtained by the advanced voloxidation process. The calculation was performed with the MCNPX code, and a simple problem was modeled with a spent fuel source surrounded by a concrete wall. The source terms were estimated with the ORIGEN-ARP code, and the gamma spectrum and the neutron spectrum were also obtained. The thickness of the concrete wall was estimated before and after the voloxidation process. From the results, the gamma source after the voloxidation process was estimated at a 67% reduction compared with that before the voloxidation process, due to the removal of several gamma-emitting elements such as cesium and rubidium. The MCNPX calculations showed that the thickness of a general concrete wall could be reduced by 12% after the voloxidation process, while a heavy concrete wall provided a 28% reduction in the shielding of the source term after the voloxidation process. This is explained by the fact that many gamma-emitting isotopes, such as Pu-241, Y-90, and Sr-90, still exist after the advanced voloxidation process, since they are unaffected by it.
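
    A rough way to see why a weaker gamma source term permits a thinner wall is narrow-beam exponential attenuation (ignoring build-up, which the real MCNPX analysis accounts for); the attenuation coefficient below is an illustrative assumption:

```python
import math

def wall_thickness(attenuation_factor: float, mu_per_cm: float) -> float:
    """Narrow-beam shielding: thickness x such that I/I0 = exp(-mu*x) meets the target factor."""
    return math.log(1.0 / attenuation_factor) / mu_per_cm

mu = 0.15                                        # illustrative gamma attenuation in concrete, 1/cm
before = wall_thickness(1e-4, mu)                # wall sized for the raw source term
after = wall_thickness(1e-4 / (1 - 0.67), mu)    # same dose target, source reduced by 67%
print(f"{before:.1f} cm -> {after:.1f} cm, i.e. {1 - after / before:.0%} thinner")
```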

  14. Probabilistic analysis of a thermosetting pultrusion process

    NARCIS (Netherlands)

    Baran, Ismet; Tutum, Cem C.; Hattel, Jesper H.

    2016-01-01

    In the present study, the effects of uncertainties in the material properties of the processing composite material and the resin kinetic parameters, as well as process parameters such as pulling speed and inlet temperature, on product quality (exit degree of cure) are investigated for a pultrusion

  15. Probabilistic analysis of a thermosetting pultrusion process

    DEFF Research Database (Denmark)

    Baran, Ismet; Tutum, Cem C.; Hattel, Jesper Henri

    2016-01-01

    In the present study, the effects of uncertainties in the material properties of the processing composite material and the resin kinetic parameters, as well as process parameters such as pulling speed and inlet temperature, on product quality (exit degree of cure) are investigated for a pultrusio...

  16. Sequential spatial processes for image analysis

    NARCIS (Netherlands)

    M.N.M. van Lieshout (Marie-Colette); V. Capasso

    2009-01-01

    We give a brief introduction to sequential spatial processes. We discuss their definition, formulate a Markov property, and indicate why such processes are natural tools in tackling high level vision problems. We focus on the problem of tracking a variable number of moving objects

  17. Sequential spatial processes for image analysis

    NARCIS (Netherlands)

    Lieshout, van M.N.M.; Capasso, V.

    2009-01-01

    We give a brief introduction to sequential spatial processes. We discuss their definition, formulate a Markov property, and indicate why such processes are natural tools in tackling high level vision problems. We focus on the problem of tracking a variable number of moving objects through a video

  18. On statistical analysis of compound point process

    Czech Academy of Sciences Publication Activity Database

    Volf, Petr

    2006-01-01

    Roč. 35, 2-3 (2006), s. 389-396 ISSN 1026-597X R&D Projects: GA ČR(CZ) GA402/04/1294 Institutional research plan: CEZ:AV0Z10750506 Keywords: counting process * compound process * hazard function * Cox-model Subject RIV: BB - Applied Statistics, Operational Research

  19. Construction Analysis during the Design Process

    NARCIS (Netherlands)

    Vries, de B.; Harink, J.M.J.; Martens, B.; Brown, A.

    2005-01-01

    4D CAD systems are used by contractors for visually checking the construction process. To enable simulation of the construction process, the construction planner links building components from a CAD model with the activities from a project planning. In this paper we describe a method to generate a

  20. A program for activation analysis data processing

    International Nuclear Information System (INIS)

    Janczyszyn, J.; Loska, L.; Taczanowski, S.

    1978-01-01

    An ALGOL program for activation analysis data handling is presented. The program may be used either for single-channel spectrometry data or for multichannel spectrometry. The calculation of the instrumental error and of the analysis standard deviation is carried out. The outliers are tested, and the regression line diagram with the related observations is plotted by the program. (author)

  1. SCHEME ANALYSIS TREE DIMENSIONS AND TOLERANCES PROCESSING

    Directory of Open Access Journals (Sweden)

    Constanta RADULESCU

    2011-07-01

    Full Text Available This paper presents one of the steps that help us to determine the optimal tolerances depending on the technological capability of the processing equipment. To determine the tolerances in this way, it is necessary to undertake the study and to represent schematically the operations used in the technological process of making a piece. Also in this phase, the tree diagram of the dimensions and machining tolerances will be made, with the dimensions and tolerances shown in the design execution. The determination of the processes and operations in the dimensions and tolerances tree scheme will be made for a machined piece on both inner and outer surfaces.

  2. Energetic Analysis of Poultry Processing Operations

    OpenAIRE

    Simeon Olatayo JEKAYINFA

    2007-01-01

    Energy audit of three poultry processing plants was conducted in southwestern Nigeria. The plants were grouped into three different categories based on their production capacities. The survey involved all the five easily defined unit operations utilized by the poultry processing industry and the experimental design allowed the energy consumed in each unit operation to be measured. The results of the audit revealed that scalding & defeathering is the most energy intensive unit operation in al...

  3. Comparative analysis of accelerogram processing methods

    International Nuclear Information System (INIS)

    Goula, X.; Mohammadioun, B.

    1986-01-01

    The work described hereinafter is a short presentation of an on-going research project concerning high-quality processing of strong-motion recordings of earthquakes. Several processing procedures have been tested, applied to synthetic signals simulating ground motion designed for this purpose. The correction methods operating in the time domain are seen to be strongly dependent upon the sampling rate. Two methods of low-frequency filtering followed by an integration of accelerations yielded satisfactory results [fr
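
    A minimal sketch of one such correction chain, assuming a zero-phase Butterworth high-pass filter (SciPy) followed by trapezoidal integration; the corner frequency and the synthetic record are illustrative:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def correct_and_integrate(acc, dt, corner_hz=0.1):
    """High-pass filter an accelerogram to suppress low-frequency drift,
    then integrate once (trapezoidal rule) to obtain velocity."""
    b, a = butter(4, corner_hz, btype="highpass", fs=1.0 / dt)
    acc_f = filtfilt(b, a, acc)                       # zero-phase filtering
    vel = np.concatenate(([0.0], np.cumsum((acc_f[1:] + acc_f[:-1]) * dt / 2)))
    return acc_f, vel

# Synthetic record: 5 Hz motion plus an artificial baseline drift
dt = 0.005                                            # 200 Hz sampling rate
t = np.arange(0.0, 20.0, dt)
acc = np.sin(2 * np.pi * 5 * t) + 0.05 * t            # the drift alone would integrate to ~10
acc_f, vel = correct_and_integrate(acc, dt)
print(abs(vel[-1]))                                   # stays near zero after correction
```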

  4. Electromagnetic heating processes: analysis and simulations

    OpenAIRE

    Calay, Rajnish Kaur

    1994-01-01

    Electromagnetic heating (EMH) processes are being increasingly used in the industrial and domestic sectors, yet they receive relatively little attention in the thermal engineering domain. Time-temperature characteristics in EMH are qualitatively different from those in conventional heating techniques due to the additional parameters (viz dielectric properties of the material, size and shape of the product and process frequency). From a unified theory perspective, a multi-...

  5. Residual analysis for spatial point processes

    DEFF Research Database (Denmark)

    Baddeley, A.; Turner, R.; Møller, Jesper

    We define residuals for point process models fitted to spatial point pattern data, and propose diagnostic plots based on these residuals. The techniques apply to any Gibbs point process model, which may exhibit spatial heterogeneity, interpoint interaction and dependence on spatial covariates. Ou...... or covariate effects. Q-Q plots of the residuals are effective in diagnosing interpoint interaction. Some existing ad hoc statistics of point patterns (quadrat counts, scan statistic, kernel smoothed intensity, Berman's diagnostic) are recovered as special cases....

  6. Energetic Analysis of Poultry Processing Operations

    Directory of Open Access Journals (Sweden)

    Simeon Olatayo JEKAYINFA

    2007-01-01

    Full Text Available Energy audit of three poultry processing plants was conducted in southwestern Nigeria. The plants were grouped into three different categories based on their production capacities. The survey involved all the five easily defined unit operations utilized by the poultry processing industry, and the experimental design allowed the energy consumed in each unit operation to be measured. The results of the audit revealed that scalding & defeathering is the most energy intensive unit operation in all three plant categories, accounting on average for about 44% of the total energy consumption in the processing plants. The other processing operations consume energy in the following order: eviscerating (17.5%), slaughtering (17%), washing & chilling (16%) and packing (6%). The results of the study clearly indicated that the least mechanized of the plants consumed the most energy (50.36 MJ), followed by the semi-mechanized plant (28.04 MJ) and the most mechanized plant (17.83 MJ). The energy audits have provided the baseline information needed for carrying out budgeting, forecasting energy requirements and planning plant expansion in the poultry processing industries in the study area.

  7. Thermal analysis of a glass bending process

    International Nuclear Information System (INIS)

    Buonanno, G.; Dell'Isola, M.; Frattolillo, A.; Giovinco, G.

    2005-01-01

    The paper presents the thermal simulation of naturally ventilated ovens used in the hot forming of glass sheets for windscreen production. The determination of the thermal and flow conditions in the oven and, consequently, of the windshield temperature distribution is necessary both to optimise the production process and to assure beforehand, without any iterative tuning process, the required characteristics of the product. To this purpose, the authors carried out a 3D numerical simulation of the thermal interaction between the glass and the oven internal surfaces during the whole heating process inside the oven. In particular, a finite volume method was used to take into account the convective, conductive and radiative heat transfer in the oven. The numerical temperature distribution in the glass was validated through comparison with data obtained from an experimental apparatus designed and built for the purpose
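
    The study's 3D finite-volume model is beyond a snippet, but the underlying idea can be illustrated in one dimension: an explicit finite-difference march across the sheet thickness with convective boundaries toward the oven air. All material and process values below are illustrative assumptions:

```python
import numpy as np

# 1D transient conduction across the thickness of a glass sheet with a
# convective boundary toward the oven air; explicit finite differences.
k, rho, cp = 1.0, 2500.0, 840.0        # glass conductivity W/mK, density kg/m^3, heat capacity J/kgK
h, T_oven, T0 = 30.0, 650.0, 20.0      # convection W/m^2K, oven and initial temperatures (deg C)
L, n = 0.004, 21                       # 4 mm sheet, grid nodes
dx = L / (n - 1)
alpha = k / (rho * cp)
dt = 0.4 * dx**2 / alpha               # respect the explicit stability limit r <= 0.5

T = np.full(n, T0)
for _ in range(int(600.0 / dt)):       # simulate 10 minutes
    Tn = T.copy()
    Tn[1:-1] = T[1:-1] + alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    # convective boundaries: energy balance on the half-cells at each face
    Tn[0] = T[0] + 2 * dt / (rho * cp * dx) * (h * (T_oven - T[0]) + k * (T[1] - T[0]) / dx)
    Tn[-1] = T[-1] + 2 * dt / (rho * cp * dx) * (h * (T_oven - T[-1]) + k * (T[-2] - T[-1]) / dx)
    T = Tn
print(f"mid-plane temperature after 10 min: {T[n // 2]:.0f} deg C")
```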

  8. Laser processing and analysis of materials

    CERN Document Server

    Duley, W W

    1983-01-01

    It has often been said that the laser is a solution searching for a problem. The rapid development of laser technology over the past dozen years has led to the availability of reliable, industrially rated laser sources with a wide variety of output characteristics. This, in turn, has resulted in new laser applications as the laser becomes a familiar processing and analytical tool. The field of materials science, in particular, has become a fertile one for new laser applications. Laser annealing, alloying, cladding, and heat treating were all but unknown 10 years ago. Today, each is a separate, dynamic field of research activity with many of the early laboratory experiments resulting in the development of new industrial processing techniques using laser technology. Ten years ago, chemical processing was in its infancy awaiting, primarily, the development of reliable tunable laser sources. Now, with tunability over the entire spectrum from the vacuum ultraviolet to the far infrared, photochemistry is undergo...

  9. Encapsulation Processing and Manufacturing Yield Analysis

    Science.gov (United States)

    Willis, P.

    1985-01-01

    Evaluation of the ethyl vinyl acetate (EVA) encapsulation system is presented. This work is part of the materials baseline needed to demonstrate a 30-year module lifetime capability. Process and compound variables are both being studied, along with various module materials. Results have shown that EVA should be stored rolled up and enclosed in a plastic bag to retard loss of peroxide curing agents. The TBEC curing agent has superior shelf life and processing characteristics compared with the earlier Lupersol-101 curing agent. Analytical methods were developed to test for peroxide content, and experimental methodologies were formalized.

  10. Radionuclides for process analysis problems and examples

    International Nuclear Information System (INIS)

    Otto, R.; Koennecke, H.G.; Luther, D.; Hecht, P.

    1986-01-01

    Both practical problems of the application of the tracer techniques for residence time measurements and the advantages of the methods are discussed. In this paper selected examples for tracer experiments carried out in a drinking water generator, a caprolactam production plant and a cokery are given. In all cases the efficiency of the processes investigated could be improved. (author)
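
    The standard quantity extracted from such tracer experiments is the mean residence time, i.e. the first moment of the measured concentration curve; a minimal sketch on a synthetic pulse response:

```python
import numpy as np

def mean_residence_time(t, c):
    """First moment of a tracer response curve: t_bar = int(t*C dt) / int(C dt)."""
    return np.trapz(t * c, t) / np.trapz(c, t)

# Synthetic pulse-tracer response of a well-mixed vessel with tau = 120 s
t = np.linspace(0.0, 1200.0, 600)
c = np.exp(-t / 120.0)
print(f"{mean_residence_time(t, c):.0f} s")   # ~120 s, as expected for this vessel
```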

  11. Fingerprint Analysis with Marked Point Processes

    DEFF Research Database (Denmark)

    Forbes, Peter G. M.; Lauritzen, Steffen; Møller, Jesper

    We present a framework for fingerprint matching based on marked point process models. An efficient Monte Carlo algorithm is developed to calculate the marginal likelihood ratio for the hypothesis that two observed prints originate from the same finger against the hypothesis that they originate from...... different fingers. Our model achieves good performance on an NIST-FBI fingerprint database of 258 matched fingerprint pairs....

  12. Representative process sampling for reliable data analysis

    DEFF Research Database (Denmark)

    Julius, Lars Petersen; Esbensen, Kim

    2005-01-01

    (sampling variances) can be reduced greatly however, and sampling biases can be eliminated completely, by respecting a simple set of rules and guidelines provided by TOS. A systematic approach for description of process heterogeneity furnishes in-depth knowledge about the specific variability of any 1-D lot...

  13. Exergy analysis in industrial food processing

    NARCIS (Netherlands)

    Zisopoulos, F.K.

    2016-01-01

    The sustainable provision of food on a global scale in the near future is a very serious challenge. This thesis focuses on the assessment and design of sustainable industrial food production chains and processes by using the concept of exergy which is an objective metric based on the first and

  14. Entrepreneurship Learning Process by using SWOT Analysis

    OpenAIRE

    Jajat Sudrajat; Muhammad Ali Rahman; Antonius Sianturi; Vendy Vendy

    2016-01-01

    The research objective was to produce a model of learning entrepreneurship by using SWOT analysis, which was currently being run with the concept of large classes and small classes. The benefits of this study were expected to be useful for the Binus Entrepreneurship Center (BEC) unit to create a development map for entrepreneurship learning. The influences generated by using SWOT analysis were very wide, as were the benefits of the implementation of large classes and small classes for students...

  15. Knee joint vibroarthrographic signal processing and analysis

    CERN Document Server

    Wu, Yunfeng

    2015-01-01

    This book presents the cutting-edge technologies of knee joint vibroarthrographic signal analysis for the screening and detection of knee joint injuries. It describes a number of effective computer-aided methods for analysis of the nonlinear and nonstationary biomedical signals generated by complex physiological mechanics. This book also introduces several popular machine learning and pattern recognition algorithms for biomedical signal classifications. The book is well-suited for all researchers looking to better understand knee joint biomechanics and the advanced technology for vibration arthrometry. Dr. Yunfeng Wu is an Associate Professor at the School of Information Science and Technology, Xiamen University, Xiamen, Fujian, China.

  16. Laplace-Laplace analysis of the fractional Poisson process

    OpenAIRE

    Gorenflo, Rudolf; Mainardi, Francesco

    2013-01-01

    We generate the fractional Poisson process by subordinating the standard Poisson process to the inverse stable subordinator. Our analysis is based on application of the Laplace transform with respect to both arguments of the evolving probability densities.

  17. Processing Cost Analysis for Biomass Feedstocks

    Energy Technology Data Exchange (ETDEWEB)

    Badger, P.C.

    2002-11-20

    The receiving, handling, storing, and processing of woody biomass feedstocks is an overlooked component of biopower systems. The purpose of this study was twofold: (1) to identify and characterize all the receiving, handling, storing, and processing steps required to make woody biomass feedstocks suitable for use in direct combustion and gasification applications, including small modular biopower (SMB) systems, and (2) to estimate the capital and operating costs at each step. Since biopower applications can be varied, a number of conversion systems and feedstocks required evaluation. In addition to limiting this study to woody biomass feedstocks, the boundaries of this study were from the power plant gate to the feedstock entry point into the conversion device. Although some power plants are sited at a source of wood waste fuel, it was assumed for this study that all wood waste would be brought to the power plant site. This study was also confined to the following three feedstocks (1) forest residues, (2) industrial mill residues, and (3) urban wood residues. Additionally, the study was confined to grate, suspension, and fluidized bed direct combustion systems; gasification systems; and SMB conversion systems. Since scale can play an important role in types of equipment, operational requirements, and capital and operational costs, this study examined these factors for the following direct combustion and gasification system size ranges: 50, 20, 5, and 1 MWe. The scope of the study also included: Specific operational issues associated with specific feedstocks (e.g., bark and problems with bridging); Opportunities for reducing handling, storage, and processing costs; How environmental restrictions can affect handling and processing costs (e.g., noise, commingling of treated wood or non-wood materials, emissions, and runoff); and Feedstock quality issues and/or requirements (e.g., moisture, particle size, presence of non-wood materials). The study found that over the

  18. IMPRINT Analysis of an Unmanned Air System Geospatial Information Process

    National Research Council Canada - National Science Library

    Hunn, Bruce P; Schweitzer, Kristin M; Cahir, John A; Finch, Mary M

    2008-01-01

    ... intelligence, geospatial analysis cell. The Improved Performance Research Integration Tool (IMPRINT) modeling program was used to understand this process and to assess crew workload during several test scenarios...

  19. Analysis of Americium in Transplutonium Process Solutions

    International Nuclear Information System (INIS)

    Ferguson, R.B.

    2001-01-01

    One of the more difficult analyses in the transplutonium field is the determination of americium at trace levels in a complex matrix such as a process dissolver solution. Because of these conditions a highly selective separation must precede the measurement of americium. The separation technique should be mechanically simple to permit remote operation with master-slave manipulators. For subsequent americium measurement by the mass spectroscopic isotopic-dilution technique, plutonium and curium interferences must also have been removed

  20. Analysis and improvement of last warehousing processes

    OpenAIRE

    Kumetytė, Indrė

    2017-01-01

    The efficiency and productivity are among the most significant factors for every manufacturing company that wants to maintain competitiveness and leadership in the market. To keep them, an enterprise has to pay a lot of attention and effort to internal logistics and the management of its inventory. The analysis of the selected Lithuanian production company's activity and production principles addresses a major present-day problem: inventory management and time reduction in warehousing processes. Thus, in this...

  1. Advanced Color Image Processing and Analysis

    CERN Document Server

    2013-01-01

    This volume does much more than survey modern advanced color processing. Starting with a historical perspective on ways we have classified color, it sets out the latest numerical techniques for analyzing and processing colors, the leading edge in our search to accurately record and print what we see. The human eye perceives only a fraction of available light wavelengths, yet we live in a multicolor world of myriad shining hues. Colors rich in metaphorical associations make us “purple with rage” or “green with envy” and cause us to “see red.” Defining colors has been the work of centuries, culminating in today’s complex mathematical coding that nonetheless remains a work in progress: only recently have we possessed the computing capacity to process the algebraic matrices that reproduce color more accurately. With chapters on dihedral color and image spectrometers, this book provides technicians and researchers with the knowledge they need to grasp the intricacies of today’s color imaging.

  2. Process Analysis of the CV Group's Operation

    CERN Document Server

    Wilhelmsson, M

    2000-01-01

    This report explains the internal reorganization that was carried out because of the need to optimize operation in the cooling and ventilation group. The basic structure for the group was defined at the end of 1998. We understood then that change was needed to accommodate the increased workload due to the LHC project. In addition, we face a relatively large turnover of personnel (retirements and some recruitment) with related integration issues to consider. We would also like to implement new approaches in the management of both operations and maintenance. After some running-in problems during the first half of 1999, we realized that much more could be gained by analysing, defining and documenting each single function and generic activity within the group. The authors explain how this analysis was carried out and give some feedback on the outcome so far.

  3. Safety analysis of SISL process module

    International Nuclear Information System (INIS)

    1983-05-01

    This report provides an assessment of various postulated accidental occurrences within an experimental process module which is part of a Special Isotope Separation Laboratory (SISL) currently under construction at the Lawrence Livermore National Laboratory (LLNL). The process module will contain large amounts of molten uranium and various water-cooled structures within a vacuum vessel. Special emphasis is therefore given to potential accidental interactions of molten uranium with water leading to explosive and/or rapid steam formation, as well as uranium oxidation and the potential for combustion. Considerations are also given to the potential for vessel melt-through. Evaluations include mechanical and thermal interactions and design implications both in terms of design basis as well as once-in-a-lifetime accident scenarios. These scenarios include both single- and multiple-failure modes leading to various contact modes and locations within the process module for possible thermal interactions. The evaluations show that a vacuum vessel design based upon nominal operating conditions would appear sufficient to meet safety requirements in connection with both design basis as well as once-in-a-lifetime accidents. Controlled venting requirements for removal of steam and hydrogen in order to avoid possible long-term pressurization events are recommended. Depending upon the resulting accident conditions, the vacuum system (i.e., the roughing system) could also serve this purpose. Finally, based upon accident evaluations of this study, immediate shut-off of all coolant water following an incident leak is not recommended, as such action may have adverse effects in terms of cool-down requirements for the melt crucibles etc. These requirements have not been assessed as part of this study

  4. The analysis of thermally stimulated processes

    CERN Document Server

    Chen, R; Pamplin, Brian

    1981-01-01

    Thermally stimulated processes include a number of phenomena - either physical or chemical in nature - in which a certain property of a substance is measured during controlled heating from a 'low' temperature. Workers and graduate students in a wide spectrum of fields require an introduction to methods of extracting information from such measurements. This book gives an interdisciplinary approach to various methods which may be applied to analytical chemistry including radiation dosimetry and determination of archaeological and geological ages. In addition, recent advances are included, such

  5. Identifying Organizational Inefficiencies with Pictorial Process Analysis (PPA

    Directory of Open Access Journals (Sweden)

    David John Patrishkoff

    2013-11-01

    Full Text Available Pictorial Process Analysis (PPA) was created by the author in 2004. PPA is a unique methodology which offers ten layers of additional analysis when compared to standard process mapping techniques. The goal of PPA is to identify and eliminate waste, inefficiencies and risk in manufacturing or transactional business processes at 5 levels in an organization. The highest level assessed is the process management, followed by the process work environment, detailed work habits, process performance metrics and general attitudes towards the process. This detailed process assessment and analysis is carried out during process improvement brainstorming efforts and Kaizen events. PPA creates a detailed visual efficiency rating for each step of the process under review. A selection of 54 pictorial Inefficiency Icons (cards) is available for use to highlight major inefficiencies and risks that are present in the business process under review. These inefficiency icons were identified during the author's independent research on the topic of why things go wrong in business. This paper will highlight how PPA was developed and show the steps required to conduct Pictorial Process Analysis on a sample manufacturing process. The author has successfully used PPA to dramatically improve business processes in over 55 different industries since 2004.

  6. Analysis of sensory processing in preterm infants.

    Science.gov (United States)

    Cabral, Thais Invenção; da Silva, Louise Gracelli Pereira; Martinez, Cláudia Maria Simões; Tudella, Eloisa

    2016-12-01

    Premature birth suggests a condition of biological vulnerability, predisposing to neurological injuries and requiring hospitalization in Neonatal Intensive Care Units, which, while contributing to increased survival rates, expose infants to sensory stimuli harmful to the immature organism. The aim was to evaluate sensory processing at 4 and 6 months' corrected age. This was a descriptive cross-sectional study with a sample of 30 infants divided into an experimental group composed of preterm infants (n=15) and a control group composed of full-term infants (n=15). The infants were assessed using the Test of Sensory Functions in Infants. The preterm infants showed poor performance in the total score of the test, in reactivity to tactile deep pressure and in reactivity to vestibular stimulation. When the groups were compared, significant differences were found in the total score (p=0.0113) and in the reactivity to tactile deep pressure, indicating alterations in sensory processing. These changes were most evident in reactivity to tactile deep pressure and vestibular stimulation. Copyright © 2016. Published by Elsevier Ireland Ltd.

  7. Digital image sequence processing, compression, and analysis

    CERN Document Server

    Reed, Todd R

    2004-01-01

    Introduction (Todd R. Reed); Content-Based Image Sequence Representation (Pedro M. Q. Aguiar, Radu S. Jasinschi, José M. F. Moura, and Charnchai Pluempitiwiriyawej); The Computation of Motion (Christoph Stiller, Sören Kammel, Jan Horn, and Thao Dang); Motion Analysis and Displacement Estimation in the Frequency Domain (Luca Lucchese and Guido Maria Cortelazzo); Quality of Service Assessment in New Generation Wireless Video Communications (Gaetano Giunta); Error Concealment in Digital Video (Francesco G.B. De Natale); Image Sequence Restoration: A Wider Perspective (Anil Kokaram); Video Summarization (Cuneyt M. Taskiran and Edward ...

  8. Energy analysis in sterilization process of food

    International Nuclear Information System (INIS)

    Lee, Dong Sun; Pyun, Yu Ryang

    1986-01-01

    A procedure was developed for predicting the energy consumption of batch-type thermal processing of food. From mass and energy balance equations, various energy usages and losses were estimated for the steam sterilization of a model food system in a No. 301-7 can (Φ74.1 x 113.0 mm) at three different temperatures. The selected models were a 5% bentonite solution for conductive food and tap water for convective food. Total steam or energy consumption was higher at 110 deg C than at the two higher temperatures (121 deg C and 130 deg C). The high energy consumption at the low sterilization temperature was mainly due to the energy of bleeding steam and to convective and radiative heat losses. Thermal energy efficiency is also discussed. (Author)
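
    A sketch of the kind of sensible-heat term that enters such a mass and energy balance, with illustrative property values rather than the paper's data:

```python
def steam_for_heating(m_kg, cp_kj_per_kgK, t_initial, t_process, h_fg_kj_per_kg=2200.0):
    """Steam needed to raise the product to process temperature:
    sensible heat divided by the latent heat released by condensing steam."""
    q = m_kg * cp_kj_per_kgK * (t_process - t_initial)    # sensible heat, kJ
    return q / h_fg_kj_per_kg                             # kg of steam

# Illustrative: 0.4 kg of product per can, heated from 20 to 121 deg C
print(f"{steam_for_heating(0.4, 3.9, 20.0, 121.0):.3f} kg steam per can")
```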

  9. ANALYSIS ON TECHNOLOGICAL PROCESSES CLEANING OIL PIPELINES

    Directory of Open Access Journals (Sweden)

    Mariana PǍTRAŞCU

    2015-05-01

    Full Text Available In this paper, research is presented concerning the technological processes of cleaning oil pipelines. Several technologies and materials are known for cleaning sludge deposits, iron and manganese oxides, dross, stone, etc. deposited on the inner walls of drinking water or industrial pipes. For the oil industry, the removal of waste materials from pipes and from liquid and gas transport networks has long been known to be a tedious and expensive operation. The main methods and associated problems can be summarized as follows: 1) blowing with compressed air; 2) manual or mechanical brushing, or sanding with water or dry; 3) washing with a high-pressure water jet, solvent or chemical solution to remove stone and hard deposits; 4) combined methods using cleaning machines with water jets, cutters, chains, rotary cutter heads, etc.

  10. Development of advanced spent fuel management process. System analysis of advanced spent fuel management process

    International Nuclear Information System (INIS)

    Ro, S.G.; Kang, D.S.; Seo, C.S.; Lee, H.H.; Shin, Y.J.; Park, S.W.

    1999-03-01

    The system analysis of an advanced spent fuel management process to establish a non-proliferation model for long-term spent fuel management is performed by comparing several dry processes: a salt transport process, a lithium process, the IFR process developed in America, and the DDP developed in Russia. In our system analysis, the non-proliferation concept is focused on the separation factor between uranium and plutonium and the decontamination factors of the products in each process, and a non-proliferation model for long-term spent fuel management has finally been introduced. (Author). 29 refs., 17 tabs., 12 figs
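
    The record leans on two figures of merit it does not define; for orientation, standard (assumed) definitions are

        \[
        SF_{U/Pu} = \frac{(x_{U}/x_{Pu})_{\text{product}}}{(x_{U}/x_{Pu})_{\text{feed}}},
        \qquad
        DF_{i} = \frac{A_{i,\text{feed}}}{A_{i,\text{product}}}
        \]

    where $x$ is a mole fraction and $A_i$ the activity of contaminant $i$. On this reading, a dry process is more proliferation-resistant when $SF_{U/Pu}$ stays near unity (uranium and plutonium are not cleanly separated) and the decontamination factors stay low enough that the product remains radioactively self-protecting.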

  11. Analysis of thermal process of pozzolan production

    Directory of Open Access Journals (Sweden)

    Mejía De Gutiérrez, R.

    2004-06-01

    Full Text Available The objective of this study was to evaluate the effect of heat treatment parameters on the pozzolanic activity of natural kaolin clays. The experimental design included three factors: kaolin type, temperature and time. Five types of Colombian kaolin clays were thermally treated from 400 to 1000 °C for 1, 2, and 3 hours. The raw materials and the products obtained were characterized by X-Ray Diffraction (XRD), Fourier Transform Infrared Spectroscopy (FTIR) and Differential Thermal/Thermogravimetric Analysis (DTA/TGA). The pozzolanic activity of the thermally treated samples was investigated using chemical and mechanical tests.

    The objective of this study was to characterize the production variables of a metakaolin of high pozzolanic reactivity. The experimental design used a factorial model considering three factors: kaolin type (C), temperature and time. Based on knowledge of the kaolin sources and on contact with suppliers and distributors of the product at the national level, five representative samples of kaolinitic clays were selected and subjected to thermal treatment between 400 and 1,000 °C (six temperature levels) for three exposure times (1, 2 and 3 hours). The source kaolins and the products obtained from each thermal process were evaluated by physical and chemical techniques: X-ray diffraction, FTIR infrared spectroscopy, and differential thermal analysis (DTA, TGA). The pozzolanic activity, both chemical and mechanical, of the product obtained at the different study temperatures was also evaluated.

  12. Improvement of product design process by knowledge value analysis

    OpenAIRE

    XU, Yang; BERNARD, Alain; PERRY, Nicolas; LAROCHE, Florent

    2013-01-01

    Nowadays, design activities remain the core issue for global product development. As knowledge is more and more integrated, effective analysis of knowledge value becomes very useful for the improvement of product design processes. This paper aims at proposing a framework of knowledge value analysis in the context of product design process. By theoretical analysis and case study, the paper illustrates how knowledge value can be calculated and how the results can help the improvement of product...

  13. Analysis of the logistics processes in the wine distribution

    OpenAIRE

    Slavkovský, Matúš

    2011-01-01

    This Master's thesis addresses the importance of logistics in the retail business and the importance of reducing logistics costs. It includes theoretical knowledge as well as an analysis of the relevant markets that produce and consume wine in the largest quantities. The thesis focuses on an analysis of the logistics processes and costs of an e-shop. Based on this analysis, measures to improve the company's logistics processes are proposed. The goal of the Master's thesis is...

  14. AN ENHANCED HAZARD ANALYSIS PROCESS FOR THE HANFORD TANK FARMS

    International Nuclear Information System (INIS)

    SHULTZ MV

    2008-01-01

    CH2M HILL Hanford Group, Inc., has expanded the scope and increased the formality of the process hazards analyses performed on new or modified Tank Farm facilities, designs, and processes. The CH2M HILL process hazard analysis emphasis has been altered to reflect its use as a fundamental part of the engineering and change control process instead of simply being a nuclear safety analysis tool. The scope has been expanded to include identification of accidents/events that impact the environment or require emergency response, in addition to those with significant impact to the facility worker, the offsite receptor, and the 100-meter receptor. Also, there is now an expectation that controls will be identified to address all types of consequences. To ensure that the process has an appropriate level of rigor and formality, a new engineering standard for process hazards analysis was created. This paper discusses the role of process hazards analysis as an information source not only for nuclear safety, but also for worker-safety management programs, emergency management, and environmental programs. This paper also discusses the role of process hazards analysis in the change control process, including identifying when and how it should be applied to changes in design or process.

  15. Energy analysis handbook. CAC document 214. [Combining process analysis with input-output analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bullard, C. W.; Penner, P. S.; Pilati, D. A.

    1976-10-01

    Methods are presented for calculating the energy required, directly and indirectly, to produce all types of goods and services. Procedures for combining process analysis with input-output analysis are described. This enables the analyst to focus data acquisition cost-effectively, and to achieve a specified degree of accuracy in the results. The report presents sample calculations and provides the tables and charts needed to perform most energy cost calculations, including the cost of systems for producing or conserving energy.
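
    The core of combining process analysis with input-output analysis is the Leontief identity for total (direct plus indirect) energy intensities, eps = e (I - A)^-1. A minimal numpy sketch with invented coefficients:

        import numpy as np

        # A[i, j]: dollars of sector i's output needed per dollar of sector j's
        # output; e[i]: direct energy use per dollar of sector i's output.
        # Both are invented for illustration.
        A = np.array([[0.10, 0.20],
                      [0.30, 0.05]])
        e = np.array([5.0, 1.2])          # MJ per dollar (assumed)

        # Total embodied energy intensities: eps = e (I - A)^-1
        eps = e @ np.linalg.inv(np.eye(2) - A)
        print(eps)   # MJ embodied per dollar of final demand from each sector

    In the hybrid approach, detailed process data can replace the coarse input-output coefficients wherever they are available, which is how a specified accuracy can be reached while focusing data acquisition cost-effectively.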

  16. Pedagogical issues for effective teaching of biosignal processing and analysis.

    Science.gov (United States)

    Sandham, William A; Hamilton, David J

    2010-01-01

    Biosignal processing and analysis is generally perceived by many students as a topic that is challenging to understand and in which it is hard to become adept with the necessary analytical skills. This is a direct consequence of the high mathematical content involved and the many abstract features of the topic. The MATLAB and Mathcad software packages offer an excellent algorithm development environment for teaching biosignal processing and analysis modules, and can also be used effectively in many biosignal, and indeed bioengineering, research areas. In this paper, traditional introductory and advanced biosignal processing (and analysis) syllabi are reviewed, and the use of MATLAB and Mathcad for teaching and research is illustrated with a number of examples.
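
    A small Python/scipy equivalent of the kind of classroom exercise the paper builds in MATLAB or Mathcad (the signal and rates below are invented):

        import numpy as np
        from scipy.signal import butter, filtfilt

        fs = 250.0                                    # sampling rate, Hz (assumed)
        t = np.arange(0, 10, 1 / fs)
        ecg_like = np.sin(2 * np.pi * 1.2 * t)        # ~72 bpm fundamental
        ecg_like += 0.5 * np.sin(2 * np.pi * 50 * t)  # mains interference
        ecg_like += 0.1 * np.random.randn(t.size)     # broadband noise

        # Zero-phase band-pass filtering, a staple of biosignal syllabi.
        b, a = butter(4, [0.5, 40.0], btype="bandpass", fs=fs)
        clean = filtfilt(b, a, ecg_like)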

  17. Reachability for Finite-State Process Algebras Using Static Analysis

    DEFF Research Database (Denmark)

    Skrypnyuk, Nataliya; Nielson, Flemming

    2011-01-01

    In this work we present an algorithm for solving the reachability problem in finite systems that are modelled with process algebras. Our method uses Static Analysis, in particular, Data Flow Analysis, of the syntax of a process algebraic system with multi-way synchronisation. The results of the Data Flow Analysis are used in order to "cut off" some of the branches in the reachability analysis that are not important for determining whether or not a state is reachable. In this way, it is possible for our reachability algorithm to avoid building large parts of the system altogether and still solve the reachability problem in a precise way.
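
    The abstract's idea reduces to a search that consults a cheap static over-approximation before expanding a branch. A generic sketch (not the authors' algorithm; `successors` and `may_reach` are hypothetical callables):

        from collections import deque

        def reachable(start, target, successors, may_reach):
            # BFS over an implicit state space; may_reach(s, target) is a sound
            # over-approximation from static analysis, so any branch it rules
            # out can be skipped without losing precision.
            seen, frontier = {start}, deque([start])
            while frontier:
                state = frontier.popleft()
                if state == target:
                    return True
                for nxt in successors(state):
                    if nxt not in seen and may_reach(nxt, target):
                        seen.add(nxt)
                        frontier.append(nxt)
            return False

    Because the over-approximation is sound, pruned branches provably cannot contain the target, which is why the algorithm stays precise while building only part of the system.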

  18. 40 CFR 68.67 - Process hazard analysis.

    Science.gov (United States)

    2010-07-01

    ...) Hazard and Operability Study (HAZOP); (5) Failure Mode and Effects Analysis (FMEA); (6) Fault Tree...) The hazards of the process; (2) The identification of any previous incident which had a likely...

  19. A Qualitative Analysis of the Turkish Gendarmerie Assignment Process

    National Research Council Canada - National Science Library

    Soylemez, Kadir

    2005-01-01

    ...; this number increases to 43 million (65% of the population) in the summer months. This study is an organizational analysis of the current assignment process of the Turkish General Command of the Gendarmerie...

  20. Economic analysis of locust bean processing and marketing in Iwo ...

    African Journals Online (AJOL)

    Economic analysis of locust bean processing and marketing in Iwo local government, Osun state. ... Majority (78.3%) of the processors and marketers were making profit; 95.0% operate ...

  1. Profitability Analysis of Rice Processing and Marketing in Kano State ...

    African Journals Online (AJOL)

    Profitability Analysis of Rice Processing and Marketing in Kano State, Nigeria. ... added to the commodity at each stage in the study area and determine the most efficient services produced. ...

  2. Application of statistical process control and process capability analysis procedures in orbiter processing activities at the Kennedy Space Center

    Science.gov (United States)

    Safford, Robert R.; Jackson, Andrew E.; Swart, William W.; Barth, Timothy S.

    1994-01-01

    Successful ground processing at KSC requires that flight hardware and ground support equipment conform to specifications at tens of thousands of checkpoints. Knowledge of conformance is an essential requirement for launch. That knowledge of conformance at every requisite point does not, however, enable identification of past problems with equipment, or potential problem areas. This paper describes how the introduction of Statistical Process Control and Process Capability Analysis identification procedures into existing shuttle processing procedures can enable identification of potential problem areas and candidates for improvements to increase processing performance measures. Results of a case study describing application of the analysis procedures to Thermal Protection System processing are used to illustrate the benefits of the approaches described in the paper.
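
    For concreteness, a sketch of the individuals control chart and capability index behind such an analysis (measurements and spec limits invented):

        import numpy as np

        x = np.array([9.8, 10.1, 10.0, 9.7, 10.3, 9.9, 10.2, 10.0, 9.6, 10.4])

        center = x.mean()
        sigma_hat = np.abs(np.diff(x)).mean() / 1.128   # moving range, d2 for n=2
        ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat
        print("out-of-control points:", np.flatnonzero((x > ucl) | (x < lcl)))

        usl, lsl = 11.0, 9.0                  # spec limits (assumed)
        cp = (usl - lsl) / (6 * sigma_hat)    # process capability index
        print("Cp =", round(cp, 2))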

  3. Multivariate statistical analysis of a multi-step industrial processes

    DEFF Research Database (Denmark)

    Reinikainen, S.P.; Høskuldsson, Agnar

    2007-01-01

    Monitoring and quality control of industrial processes often produce information on how the data have been obtained. In batch processes, for instance, the process is carried out in stages; some process or control parameters are set at each stage. However, the obtained data might not be utilized efficiently, even if this information may reveal significant knowledge about process dynamics or ongoing phenomena. When studying the process data, it may be important to analyse the data in the light of the physical or time-wise development of each process step. In this paper, a unified approach to analyse multivariate multi-step processes, where results from each step are used to evaluate future results, is presented. The methods presented are based on Priority PLS Regression. The basic idea is to compute the weights in the regression analysis for given steps, but adjust all data by the resulting score vectors...

  4. Explaining discontinuity in organizational learning : a process analysis

    NARCIS (Netherlands)

    Berends, J.J.; Lammers, I.S.

    2010-01-01

    This paper offers a process analysis of organizational learning as it unfolds in a social and temporal context. Building upon the 4I framework (Crossan et al. 1999), we examine organizational learning processes in a longitudinal case study of an implementation of knowledge management in an

  5. Engineering analysis of the two-stage trifluoride precipitation process

    International Nuclear Information System (INIS)

Luerkens, D.W.W.

    1984-06-01

    An engineering analysis of two-stage trifluoride precipitation processes is developed. Precipitation kinetics are modeled using consecutive reactions to represent fluoride complexation. Material balances across the precipitators are used to model the time dependent concentration profiles of the main chemical species. The results of the engineering analysis are correlated with previous experimental work on plutonium trifluoride and cerium trifluoride
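
    The kinetic device the abstract names (consecutive reactions standing in for stepwise fluoride complexation) integrates readily; a sketch with invented rate constants:

        import numpy as np
        from scipy.integrate import solve_ivp

        k1, k2 = 0.8, 0.3          # 1/min, assumed rate constants

        def rhs(t, y):
            a, b, c = y            # consecutive scheme A -> B -> C
            return [-k1 * a, k1 * a - k2 * b, k2 * b]

        sol = solve_ivp(rhs, (0.0, 20.0), [1.0, 0.0, 0.0], dense_output=True)
        print(sol.sol(np.linspace(0.0, 20.0, 5)).round(3))  # concentration profiles

    Coupling kinetics of this shape to time-dependent material balances across the two precipitators is what yields the concentration profiles the paper correlates with experiment.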

  6. Self-similar analysis of the spherical implosion process

    International Nuclear Information System (INIS)

    Ishiguro, Yukio; Katsuragi, Satoru.

    1976-07-01

    The implosion process caused by laser-heating ablation has been studied by self-similarity analysis. Attention is paid to the possible existence of a self-similar solution that reproduces the implosion process of high compression. Details of the self-similar analysis are presented and quantitative conclusions are drawn on the gas compression by a single shock. The compression process by a sequence of shocks is discussed in terms of self-similarity. The gas motion followed by a homogeneous isentropic compression is represented by a self-similar motion. (auth.)

  7. Analysis of hard inclusive processes in quantum chromodynamics

    International Nuclear Information System (INIS)

    Radyushkin, A.V.

    1983-01-01

    An approach to the investigation of hard processes in QCD based on a regular usage of the α-representation analysis of Feynman diagram asymptotics is described. The analysis is exemplified by the two simplest inclusive processes: e+e- annihilation into hadrons and deep inelastic lepton-hadron scattering. The procedure for separating (factorizing) the contributions of short- and long-range particle interactions is reported. The relation between operator expansion methods and methods based on direct analysis of diagrams, as well as between field-theoretical approaches and the parton model, is discussed. Specific features of the factorization of short- and long-range contributions in non-Abelian gauge theories are investigated.

  8. Iterated Process Analysis over Lattice-Valued Regular Expressions

    DEFF Research Database (Denmark)

    Midtgaard, Jan; Nielson, Flemming; Nielson, Hanne Riis

    2016-01-01

    We present an iterated approach to statically analyze programs of two processes communicating by message passing. Our analysis operates over a domain of lattice-valued regular expressions, and computes increasingly better approximations of each process's communication behavior. Overall the work extends traditional semantics-based program analysis techniques to automatically reason about message passing in a manner that can simultaneously analyze both values of variables as well as message order, message content, and their interdependencies.

  9. Carbon dioxide capture processes: Simulation, design and sensitivity analysis

    DEFF Research Database (Denmark)

    Zaman, Muhammad; Lee, Jay Hyung; Gani, Rafiqul

    2012-01-01

    Carbon dioxide is the main greenhouse gas and its major source is combustion of fossil fuels for power generation. The objective of this study is to carry out a steady-state sensitivity analysis for chemical absorption of carbon dioxide capture from flue gas using monoethanolamine solvent. First, equilibrium and associated property models are used. Simulations are performed to investigate the sensitivity of the process variables to changes in the design variables, including process inputs and disturbances in the property model parameters. Results of the sensitivity analysis of the steady-state performance of the process to the L/G ratio to the absorber, CO2 lean solvent loadings, and stripper pressure are presented in this paper. Based on the sensitivity analysis, process optimization problems have been defined and solved and a preliminary control structure selection has been made.

  10. Pinch analysis for bioethanol production process from lignocellulosic biomass

    International Nuclear Information System (INIS)

    Fujimoto, S.; Yanagida, T.; Nakaiwa, M.; Tatsumi, H.; Minowa, T.

    2011-01-01

    Bioethanol produced from carbon-neutral and renewable biomass resources is attractive for the mitigation of greenhouse gases from vehicle exhaust. This study investigated energy utilization during bioethanol production from lignocellulose, avoiding competition with food production from corn and considering the potential mitigation of greenhouse gases. Process design and simulations were performed for bioethanol production using concentrated sulfuric acid. Mass and heat balances were obtained by process simulations, and the heat recovery ratio was determined by pinch analysis. An energy saving of 38% was achieved. However, energy supply and demand were not effectively utilized in the temperature range from 95 to 100 °C. Therefore, a heat pump was used to improve the temperature range of efficient energy supply and demand. Results showed that the energy required for the process could be supplied by heat released during the process. Additionally, the power required was supplied by surplus power generated during the process. Thus, pinch analysis was used to improve the energy efficiency of the process. - Highlights: → Effective energy utilization of bioethanol production was studied using pinch analysis. → It was found that energy was not effectively utilized in the temperature range from 95 to 100 °C. → Use of a heat pump was considered to improve this ineffective utilization. → Remarkable energy savings could be achieved with it. → Pinch analysis effectively improved the energy efficiency of bioethanol production.
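
    A compact sketch of the problem-table (heat cascade) step at the heart of pinch analysis, with two invented streams:

        # Streams: (supply T, target T, heat-capacity flowrate CP in kW/K).
        DT_MIN = 10.0
        hot = [(150.0, 60.0, 2.0)]        # to be cooled (assumed)
        cold = [(20.0, 125.0, 1.8)]       # to be heated (assumed)

        # Shift by DT_MIN/2 so a feasible driving force is guaranteed.
        shifted = [(ts - DT_MIN / 2, tt - DT_MIN / 2, cp) for ts, tt, cp in hot]
        shifted += [(ts + DT_MIN / 2, tt + DT_MIN / 2, cp) for ts, tt, cp in cold]

        bounds = sorted({t for ts, tt, _ in shifted for t in (ts, tt)}, reverse=True)
        surplus = []
        for hi_t, lo_t in zip(bounds, bounds[1:]):
            net_cp = sum((cp if ts > tt else -cp)       # hot adds, cold removes
                         for ts, tt, cp in shifted
                         if min(ts, tt) <= lo_t and max(ts, tt) >= hi_t)
            surplus.append(net_cp * (hi_t - lo_t))

        # Cascade the interval surpluses; the most negative cumulative value
        # fixes the minimum hot utility, the closing balance the cold utility.
        cascade, running = [0.0], 0.0
        for q in surplus:
            running += q
            cascade.append(running)
        q_hot = -min(cascade)
        q_cold = cascade[-1] + q_hot
        print(q_hot, q_cold)   # kW of minimum hot and cold utility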

  11. Mathematical principles of signal processing Fourier and wavelet analysis

    CERN Document Server

    Brémaud, Pierre

    2002-01-01

    Fourier analysis is one of the most useful tools in many applied sciences. The recent developments of wavelet analysis indicates that in spite of its long history and well-established applications, the field is still one of active research. This text bridges the gap between engineering and mathematics, providing a rigorously mathematical introduction of Fourier analysis, wavelet analysis and related mathematical methods, while emphasizing their uses in signal processing and other applications in communications engineering. The interplay between Fourier series and Fourier transforms is at the heart of signal processing, which is couched most naturally in terms of the Dirac delta function and Lebesgue integrals. The exposition is organized into four parts. The first is a discussion of one-dimensional Fourier theory, including the classical results on convergence and the Poisson sum formula. The second part is devoted to the mathematical foundations of signal processing - sampling, filtering, digital signal proc...

  12. Materials, process, product analysis of coal process technology. Phase I final report

    Energy Technology Data Exchange (ETDEWEB)

    Saxton, J. C.; Roig, R. W.; Loridan, A.; Leggett, N. E.; Capell, R. G.; Humpstone, C. C.; Mudry, R. N.; Ayres, E.

    1976-02-01

    The purpose of materials-process-product analysis is a systematic evaluation of alternative manufacturing processes--in this case, processes for converting coal into energy and material products that can supplement or replace petroleum-based products. The methodological steps in the analysis include: definition of the functional operations that enter into coal conversion processes, and modeling of alternative, competing methods to accomplish these functions; compilation of all feasible conversion processes that can be assembled from combinations of competing methods for the functional operations; systematic, iterative evaluation of all feasible conversion processes under a variety of economic situations, environmental constraints, and projected technological advances; and aggregative assessments (economic and environmental) of various industrial development scenarios. An integral part of the present project is additional development of the existing computer model to include: a data base for coal-related materials and coal conversion processes; and an algorithmic structure that facilitates the iterative, systematic evaluations in response to exogenously specified variables, such as tax policy, environmental limitations, and changes in process technology and costs. As an analytical tool, the analysis is intended to satisfy the needs of an analyst working at the process selection level, for example, with respect to the allocation of RD&D funds to competing technologies.

  13. Rapid, low-cost, image analysis through video processing

    International Nuclear Information System (INIS)

    Levinson, R.A.; Marrs, R.W.; Grantham, D.G.

    1976-01-01

    Remote Sensing now provides the data necessary to solve many resource problems. However, many of the complex image processing and analysis functions used in the analysis of remotely-sensed data are accomplished using sophisticated image analysis equipment. The high cost of this equipment places many of these techniques beyond the means of most users. A new, more economical video system capable of performing complex image analysis has now been developed. This report describes the functions, components, and operation of that system. The processing capability of the new video image analysis system includes many of the tasks previously accomplished with optical projectors and digital computers. Video capabilities include: color separation, color addition/subtraction, contrast stretch, dark level adjustment, density analysis, edge enhancement, scale matching, image mixing (addition and subtraction), image ratioing, and construction of false-color composite images. Rapid input of non-digital image data, instantaneous processing and display, relatively low initial cost, and low operating cost give the video system a competitive advantage over digital equipment. Complex pre-processing, pattern recognition, and statistical analyses must still be handled through digital computer systems. The video system at the University of Wyoming has undergone extensive testing and comparison to other systems, and has been used successfully in practical applications ranging from analysis of x-rays and thin sections to production of color composite ratios of multispectral imagery. Potential applications are discussed, including uranium exploration, petroleum exploration, tectonic studies, geologic mapping, hydrology, sedimentology and petrography, anthropology, and studies on vegetation and wildlife habitat.

  14. A Knowledge-based Environment for Software Process Performance Analysis

    Directory of Open Access Journals (Sweden)

    Natália Chaves Lessa Schots

    2015-08-01

    Full Text Available Background: Process performance analysis is a key step for implementing continuous improvement in software organizations. However, the knowledge needed to execute such analysis is not trivial, and the person responsible for executing it must be provided with appropriate support. Aim: This paper presents a knowledge-based environment, named SPEAKER, proposed for supporting software organizations during the execution of process performance analysis. SPEAKER comprises a body of knowledge and a set of activities and tasks for software process performance analysis, along with supporting tools for executing these activities and tasks. Method: We conducted an informal literature review and a systematic mapping study, which provided the basic requirements for the proposed environment. We implemented the SPEAKER environment by integrating supporting tools for the execution of the activities and tasks of performance analysis with the knowledge necessary to execute them, in order to accommodate the variability presented by the characteristics of these activities. Results: In this paper, we describe each SPEAKER module and the individual evaluations of these modules, and also present an example of use showing how the environment can guide the user through a specific performance analysis activity. Conclusion: Although we only conducted individual evaluations of SPEAKER's modules, the example of use indicates the feasibility of the proposed environment. Therefore, the environment as a whole will be further evaluated to verify whether it attains its goal of assisting non-specialists in the execution of process performance analysis.

  15. Advanced exergetic analysis of five natural gas liquefaction processes

    International Nuclear Information System (INIS)

    Vatani, Ali; Mehrpooya, Mehdi; Palizdar, Ali

    2014-01-01

    Highlights: • Advanced exergetic analysis was investigated for five LNG processes. • Avoidable/unavoidable and endogenous/exogenous irreversibilities were calculated. • Advanced exergetic analysis identifies the potentials for improving the system. - Abstract: Conventional exergy analysis cannot identify portion of inefficiencies which can be avoided. Also this analysis does not have ability to calculate a portion of exergy destruction which has been produced through performance of a component alone. In this study advanced exergetic analysis was performed for five mixed refrigerant LNG processes and four parts of irreversibility (avoidable/unavoidable) and (endogenous/exogenous) were calculated for the components with high inefficiencies. The results showed that portion of endogenous exergy destruction in the components is higher than the exogenous one. In fact interactions among the components do not affect the inefficiencies significantly. Also this analysis showed that structural optimization cannot be useful to decrease the overall process irreversibilities. In compressors high portion of the exergy destruction is related to the avoidable one, thus they have high potential to improve. But in multi stream heat exchangers and air coolers, unavoidable inefficiencies were higher than the other parts. Advanced exergetic analysis can identify the potentials and strategies to improve thermodynamic performance of energy intensive processes
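
    The four-way split the highlights refer to is conventionally written (standard advanced-exergy notation, assumed here) as

        \[
        \dot{E}_{D,k} = \dot{E}_{D,k}^{AV} + \dot{E}_{D,k}^{UN}
                      = \dot{E}_{D,k}^{EN} + \dot{E}_{D,k}^{EX}
        \]

    where the avoidable/unavoidable split ($AV$/$UN$) bounds what better design can recover, and the endogenous/exogenous split ($EN$/$EX$) attributes each component's irreversibility to itself versus the rest of the plant; the combination $\dot{E}_{D,k}^{AV,EN}$ is the practically useful improvement target.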

  16. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    Science.gov (United States)

    Joshi, D. M.; Patel, H. K.

    2015-10-01

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate the special requirements and therefore can be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure which is below the critical pressure. Cryogenic processes require special attention in terms of the integration of various components like heat exchangers, Joule-Thompson Valve, Turbo expander and Compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for getting the maximum liquefaction of the plant considering different constraints of other parameters. The analysis result so obtained gives clear idea in deciding various parameter values before implementation of the actual plant in the field. It also gives an idea about the productivity and profitability of the given configuration plant which leads to the design of an efficient productive plant.

  17. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    International Nuclear Information System (INIS)

    Joshi, D.M.; Patel, H.K.

    2015-01-01

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate the special requirements and therefore can be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure which is below the critical pressure. Cryogenic processes require special attention in terms of the integration of various components like heat exchangers, Joule-Thompson Valve, Turbo expander and Compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for getting the maximum liquefaction of the plant considering different constraints of other parameters. The analysis result so obtained gives clear idea in deciding various parameter values before implementation of the actual plant in the field. It also gives an idea about the productivity and profitability of the given configuration plant which leads to the design of an efficient productive plant

  18. Second-order analysis of semiparametric recurrent event processes.

    Science.gov (United States)

    Guan, Yongtao

    2011-09-01

    A typical recurrent event dataset consists of an often large number of recurrent event processes, each of which contains multiple event times observed from an individual during a follow-up period. Such data have become increasingly available in medical and epidemiological studies. In this article, we introduce novel procedures to conduct second-order analysis for a flexible class of semiparametric recurrent event processes. Such an analysis can provide useful information regarding the dependence structure within each recurrent event process. Specifically, we will use the proposed procedures to test whether the individual recurrent event processes are all Poisson processes and to suggest sensible alternative models for them if they are not. We apply these procedures to a well-known recurrent event dataset on chronic granulomatous disease and an epidemiological dataset on meningococcal disease cases in Merseyside, United Kingdom to illustrate their practical value. © 2011, The International Biometric Society.

  19. A Comparative Analysis of Extract, Transformation and Loading (ETL) Process

    Science.gov (United States)

    Runtuwene, J. P. A.; Tangkawarow, I. R. H. T.; Manoppo, C. T. M.; Salaki, R. J.

    2018-02-01

    The current growth of data and information occurs rapidly and across a variety of media. This development will eventually produce very large collections of data, better known as Big Data. Business Intelligence (BI) utilizes large amounts of data and information for analysis so that one can obtain important information. This type of information can be used to support the decision-making process. In practice, a process integrating existing data and information into a data warehouse is needed. This data integration process is known as Extract, Transformation and Loading (ETL). In practice, many applications have been developed to carry out the ETL process, but selecting which application is more time-, cost- and power-efficient may become a challenge. Therefore, the objective of the study was to provide a comparative analysis between the ETL process using Microsoft SQL Server Integration Services (SSIS) and one using Pentaho Data Integration (PDI).
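
    Independent of the two tools compared, the ETL contract itself is small; a tool-agnostic Python sketch (file and table names are hypothetical):

        import csv
        import sqlite3

        def etl(csv_path="sales.csv", db_path="warehouse.db"):
            # Extract rows from a CSV, normalize them, load into SQLite.
            con = sqlite3.connect(db_path)
            con.execute("CREATE TABLE IF NOT EXISTS sales (region TEXT, amount REAL)")
            with open(csv_path, newline="") as f:
                for row in csv.DictReader(f):               # extract
                    region = row["region"].strip().upper()  # transform
                    amount = float(row["amount"])
                    con.execute("INSERT INTO sales VALUES (?, ?)", (region, amount))
            con.commit()
            con.close()

    Benchmarking SSIS against PDI roughly amounts to measuring how each engine schedules, parallelizes and buffers these three phases at scale.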

  20. Computer-Aided Sustainable Process Synthesis-Design and Analysis

    DEFF Research Database (Denmark)

    Kumar Tula, Anjan

    Process synthesis involves the investigation of chemical reactions needed to produce the desired product, selection of the separation techniques needed for downstream processing, as well as taking decisions on sequencing the involved separation operations. For an effective, efficient and flexible synthesis, this work focuses on the development and application of a computer-aided framework for sustainable synthesis-design and analysis of process flowsheets by generating feasible alternatives covering the entire search space, and includes analysis tools for sustainability, LCA and economics. The synthesis method is based on process-groups; the advantage of process-groups is that the performance of the entire process can be evaluated from the contributions of the individual process-groups towards the selected flowsheet property (for example, energy consumed). The developed flowsheet property models include energy consumption, carbon footprint, product recovery, product...

  1. Identifying influential factors of business process performance using dependency analysis

    Science.gov (United States)

    Wetzstein, Branimir; Leitner, Philipp; Rosenberg, Florian; Dustdar, Schahram; Leymann, Frank

    2011-02-01

    We present a comprehensive framework for identifying influential factors of business process performance. In particular, our approach combines monitoring of process events and Quality of Service (QoS) measurements with dependency analysis to effectively identify influential factors. The framework uses data mining techniques to construct tree structures to represent dependencies of a key performance indicator (KPI) on process and QoS metrics. These dependency trees allow business analysts to determine how process KPIs depend on lower-level process metrics and QoS characteristics of the IT infrastructure. The structure of the dependencies enables a drill-down analysis of single factors of influence to gain a deeper knowledge why certain KPI targets are not met.
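
    A sketch of the dependency-tree idea with scikit-learn; the KPI, metrics and coefficients are all invented:

        import numpy as np
        from sklearn.tree import DecisionTreeRegressor, export_text

        rng = np.random.default_rng(0)
        n = 200
        service_latency = rng.uniform(50, 500, n)    # ms, a QoS metric
        queue_length = rng.integers(0, 20, n)        # a process metric
        kpi = (2.0 + 0.01 * service_latency + 0.5 * queue_length
               + rng.normal(0, 0.5, n))              # fulfillment time, hours

        X = np.column_stack([service_latency, queue_length])
        tree = DecisionTreeRegressor(max_depth=3).fit(X, kpi)
        print(export_text(tree, feature_names=["service_latency", "queue_length"]))

    Reading the printed splits top-down is the drill-down the authors describe: each path explains under which metric ranges a KPI target is missed.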

  2. The digital storytelling process: A comparative analysis from various experts

    Science.gov (United States)

    Hussain, Hashiroh; Shiratuddin, Norshuhada

    2016-08-01

    Digital Storytelling (DST) is a method of delivering information to an audience. It combines narrative and digital media content infused with multimedia elements. In order for educators (i.e. the designers) to create a compelling digital story, sets of processes have been introduced by experts. Nevertheless, the experts suggest a variety of processes to guide them, some of which are redundant. The main aim of this study is to propose a single guiding process for the creation of DST. A comparative analysis is employed in which ten DST models from various experts are analysed. The process can also be implemented in other multimedia materials that use the concept of DST.

  3. Cost analysis of simulated base-catalyzed biodiesel production processes

    International Nuclear Information System (INIS)

    Tasić, Marija B.; Stamenković, Olivera S.; Veljković, Vlada B.

    2014-01-01

    Highlights: • Two semi-continuous biodiesel production processes from sunflower oil are simulated. • Simulations were based on the kinetics of base-catalyzed methanolysis reactions. • The total energy consumption was influenced by the kinetic model. • Heterogeneous base-catalyzed process is a preferable industrial technology. - Abstract: The simulation and economic feasibility evaluation of semi-continuous biodiesel production from sunflower oil were based on the kinetics of homogeneously (Process I) and heterogeneously (Process II) base-catalyzed methanolysis reactions. The annual plant’s capacity was determined to be 8356 tonnes of biodiesel. The total energy consumption was influenced by the unit model describing the methanolysis reaction kinetics. The energy consumption of the Process II was more than 2.5 times lower than that of the Process I. Also, the simulation showed the Process I had more and larger process equipment units, compared with the Process II. Based on lower total capital investment costs and biodiesel selling price, the Process II was economically more feasible than the Process I. Sensitivity analysis was conducted using variable sunflower oil and biodiesel prices. Using a biodiesel selling price of 0.990 $/kg, Processes I and II were shown to be economically profitable if the sunflower oil price was 0.525 $/kg and 0.696 $/kg, respectively

  4. Applied Fourier analysis from signal processing to medical imaging

    CERN Document Server

    Olson, Tim

    2017-01-01

    The first of its kind, this focused textbook serves as a self-contained resource for teaching from scratch the fundamental mathematics of Fourier analysis and illustrating some of its most current, interesting applications, including medical imaging and radar processing. Developed by the author from extensive classroom teaching experience, it provides a breadth of theory that allows students to appreciate the utility of the subject, but at as accessible a depth as possible. With myriad applications included, this book can be adapted to a one or two semester course in Fourier Analysis or serve as the basis for independent study. Applied Fourier Analysis assumes no prior knowledge of analysis from its readers, and begins by making the transition from linear algebra to functional analysis. It goes on to cover basic Fourier series and Fourier transforms before delving into applications in sampling and interpolation theory, digital communications, radar processing, medical imaging, and heat and wave equations. Fo...

  5. Data near processing support for climate data analysis

    Science.gov (United States)

    Kindermann, Stephan; Ehbrecht, Carsten; Hempelmann, Nils

    2016-04-01

    Climate data repositories grow in size exponentially. Scalable data-near processing capabilities are required to meet future data analysis requirements and to replace current "download and process at home" workflows and approaches. On the one hand, these processing capabilities should be accessible via standardized interfaces (e.g. OGC WPS); on the other, a large variety of processing tools, toolboxes and deployment alternatives have to be supported and maintained at the data/processing center. We present a community approach of a modular and flexible system supporting the development, deployment and maintenance of OGC WPS based web processing services. This approach is organized in an open source github project (called "bird-house") supporting individual processing services ("birds", e.g. climate index calculations, model data ensemble calculations) which rely on common infrastructural components (e.g. installation and deployment recipes, analysis code dependency management). To support easy deployment at data centers as well as at home institutes (e.g. for testing and development), the system supports the management of the often very complex package dependency chains of climate data analysis packages, as well as docker based packaging and installation. We present a concrete deployment scenario at the German Climate Computing Center (DKRZ). The DKRZ on the one hand hosts a multi-petabyte climate archive which is integrated e.g. into the European ENES and worldwide ESGF data infrastructures, and on the other hand hosts an HPC center supporting (model) data production and data analysis. The deployment scenario also includes openstack based data cloud services to support data import and data distribution for bird-house based WPS web processing services. Current challenges for inter-institutional deployments of web processing services supporting the European and international climate modeling community, as well as the climate impact community, are highlighted.

  6. Frames and operator theory in analysis and signal processing

    CERN Document Server

    Larson, David R; Nashed, Zuhair; Nguyen, Minh Chuong; Papadakis, Manos

    2008-01-01

    This volume contains articles based on talks presented at the Special Session Frames and Operator Theory in Analysis and Signal Processing, held in San Antonio, Texas, in January of 2006. Recently, the field of frames has undergone tremendous advancement. Most of the work in this field is focused on the design and construction of more versatile frames and frames tailored towards specific applications, e.g., finite dimensional uniform frames for cellular communication. In addition, frames are now becoming a hot topic in mathematical research as a part of many engineering applications, e.g., matching pursuits and greedy algorithms for image and signal processing. Topics covered in this book include: Application of several branches of analysis (e.g., PDEs; Fourier, wavelet, and harmonic analysis; transform techniques; data representations) to industrial and engineering problems, specifically image and signal processing. Theoretical and applied aspects of frames and wavelets. Pure aspects of operator theory empha...

  7. Mathematical methods in time series analysis and digital image processing

    CERN Document Server

    Kurths, J; Maass, P; Timmer, J

    2008-01-01

    The aim of this volume is to bring together research directions in theoretical signal and imaging processing developed rather independently in electrical engineering, theoretical physics, mathematics and the computer sciences. In particular, mathematically justified algorithms and methods, the mathematical analysis of these algorithms, and methods as well as the investigation of connections between methods from time series analysis and image processing are reviewed. An interdisciplinary comparison of these methods, drawing upon common sets of test problems from medicine and geophysical/enviromental sciences, is also addressed. This volume coherently summarizes work carried out in the field of theoretical signal and image processing. It focuses on non-linear and non-parametric models for time series as well as on adaptive methods in image processing.

  8. Nonlinear Process Fault Diagnosis Based on Serial Principal Component Analysis.

    Science.gov (United States)

    Deng, Xiaogang; Tian, Xuemin; Chen, Sheng; Harris, Chris J

    2018-03-01

    Many industrial processes contain both linear and nonlinear parts, and kernel principal component analysis (KPCA), widely used in nonlinear process monitoring, may not offer the most effective means for dealing with these nonlinear processes. This paper proposes a new hybrid linear-nonlinear statistical modeling approach for nonlinear process monitoring by closely integrating linear principal component analysis (PCA) and nonlinear KPCA using a serial model structure, which we refer to as serial PCA (SPCA). Specifically, PCA is first applied to extract PCs as linear features, and to decompose the data into the PC subspace and residual subspace (RS). Then, KPCA is performed in the RS to extract the nonlinear PCs as nonlinear features. Two monitoring statistics are constructed for fault detection, based on both the linear and nonlinear features extracted by the proposed SPCA. To effectively perform fault identification after a fault is detected, an SPCA similarity factor method is built for fault recognition, which fuses both the linear and nonlinear features. Unlike PCA and KPCA, the proposed method takes into account both linear and nonlinear PCs simultaneously, and therefore, it can better exploit the underlying process's structure to enhance fault diagnosis performance. Two case studies involving a simulated nonlinear process and the benchmark Tennessee Eastman process demonstrate that the proposed SPCA approach is more effective than the existing state-of-the-art approach based on KPCA alone, in terms of nonlinear process fault detection and identification.
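
    A minimal sketch of the serial structure (linear PCA first, kernel PCA on the residual subspace), with synthetic data:

        import numpy as np
        from sklearn.decomposition import PCA, KernelPCA

        rng = np.random.default_rng(1)
        X = rng.normal(size=(500, 10))
        X[:, 3] += np.sin(X[:, 0])        # inject a nonlinear dependency

        pca = PCA(n_components=4).fit(X)               # linear features
        scores = pca.transform(X)
        residual = X - pca.inverse_transform(scores)   # residual subspace

        kpca = KernelPCA(n_components=3, kernel="rbf", gamma=0.1).fit(residual)
        nonlinear_scores = kpca.transform(residual)    # nonlinear features

        # Monitoring statistics (e.g. T^2 on `scores`, SPE on the KPCA
        # features) would be thresholded on fault-free training data.

    The component counts, kernel and gamma above are placeholders; the paper's contribution is the serial composition itself, not these particular settings.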

  9. Process control analysis of IMRT QA: implications for clinical trials

    International Nuclear Information System (INIS)

    Pawlicki, Todd; Rice, Roger K; Yoo, Sua; Court, Laurence E; McMillan, Sharon K; Russell, J Donald; Pacyniak, John M; Woo, Milton K; Basran, Parminder S; Boyer, Arthur L; Bonilla, Claribel

    2008-01-01

    The purpose of this study is two-fold: first is to investigate the process of IMRT QA using control charts and second is to compare control chart limits to limits calculated using the standard deviation (σ). Head and neck and prostate IMRT QA cases from seven institutions in both academic and community settings are considered. The percent difference between the point dose measurement in phantom and the corresponding result from the treatment planning system (TPS) is used for analysis. The average of the percent difference calculations defines the accuracy of the process and is called the process target. This represents the degree to which the process meets the clinical goal of 0% difference between the measurements and TPS. IMRT QA process ability defines the ability of the process to meet clinical specifications (e.g. 5% difference between the measurement and TPS). The process ability is defined in two ways: (1) the half-width of the control chart limits, and (2) the half-width of ±3σ limits. Process performance is characterized as being in one of four possible states that describes the stability of the process and its ability to meet clinical specifications. For the head and neck cases, the average process target across institutions was 0.3% (range: -1.5% to 2.9%). The average process ability using control chart limits was 7.2% (range: 5.3% to 9.8%) compared to 6.7% (range: 5.3% to 8.2%) using standard deviation limits. For the prostate cases, the average process target across the institutions was 0.2% (range: -1.8% to 1.4%). The average process ability using control chart limits was 4.4% (range: 1.3% to 9.4%) compared to 5.3% (range: 2.3% to 9.8%) using standard deviation limits. Using the standard deviation to characterize IMRT QA process performance resulted in processes being preferentially placed in one of the four states. This is in contrast to using control charts for process characterization where the IMRT QA processes were spread over three of the

  10. Spectroscopic analysis technique for arc-welding process control

    Science.gov (United States)

    Mirapeix, Jesús; Cobo, Adolfo; Conde, Olga; Quintela, María Ángeles; López-Higuera, José-Miguel

    2005-09-01

    The spectroscopic analysis of the light emitted by thermal plasmas has found many applications, from chemical analysis to monitoring and control of industrial processes. In particular, it has been demonstrated that the analysis of the thermal plasma generated during arc or laser welding can supply information about the process and, thus, about the quality of the weld. In some critical applications (e.g. the aerospace sector), early, real-time detection of defects in the weld seam (oxidation, porosity, lack of penetration, ...) is highly desirable as it can reduce expensive non-destructive testing (NDT). Among other techniques, full spectroscopic analysis of the plasma emission is known to offer rich information about the process itself, but it is also very demanding in terms of real-time implementation. In this paper, we propose a technique for the analysis of the plasma emission spectrum that is able to detect, in real time, changes in the process parameters that could lead to the formation of defects in the weld seam. It is based on the estimation of the electronic temperature of the plasma through the analysis of the emission peaks from multiple atomic species. Unlike traditional techniques, which usually involve peak fitting to Voigt functions using the Levenberg-Marquardt recursive method, we employ the LPO (Linear Phase Operator) sub-pixel algorithm to accurately estimate the central wavelength of the peaks (allowing automatic identification of each atomic species) and cubic-spline interpolation of the noisy data to obtain the intensity and width of the peaks. Experimental tests on TIG welding, using fiber-optic capture of light and a low-cost CCD-based spectrometer, show that some typical defects can be easily detected and identified with this technique, whose typical processing time for multiple peak analysis is less than 20 ms running on a conventional PC.
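
    The electronic temperature estimate described is typically obtained from a Boltzmann plot over several emission lines of one species (standard plasma diagnostics, assumed here rather than quoted from the paper):

        \[
        \ln\!\left(\frac{I_{mn}\,\lambda_{mn}}{g_{m} A_{mn}}\right)
          = -\frac{E_{m}}{k_{B} T_{e}} + C
        \]

    where $I_{mn}$ is the measured line intensity, $\lambda_{mn}$ the wavelength, $g_m$ and $E_m$ the upper-level degeneracy and energy, and $A_{mn}$ the transition probability; fitting a straight line through the identified peaks yields $T_e$ from the slope, and sudden shifts in $T_e$ flag perturbations that correlate with weld defects.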

  11. Recommended practice for process sampling for partial pressure analysis

    International Nuclear Information System (INIS)

    Blessing, James E.; Ellefson, Robert E.; Raby, Bruce A.; Brucker, Gerardo A.; Waits, Robert K.

    2007-01-01

    This Recommended Practice describes and recommends various procedures and types of apparatus for obtaining representative samples of process gases at pressures above 10^-2 Pa (10^-4 Torr) for partial pressure analysis using a mass spectrometer. The document was prepared by a subcommittee of the Recommended Practices Committee of the American Vacuum Society. The subcommittee was comprised of vacuum users and manufacturers of mass spectrometer partial pressure analyzers who have practical experience in the sampling of process gas atmospheres

  12. The knowledge conversion SECI process as innovation indicator analysis factor

    OpenAIRE

    Silva, Elaine da [UNESP; Valentim, Marta Lígia Pomim [UNESP

    2013-01-01

    It highlights the importance of innovation in today's society and presents innovation indicators applied in 125 countries. We analysed the 80 variables distributed across the seven GII pillars, seeking to identify direct, indirect or null occurrences of the knowledge conversion modes described by the SECI process. The research revealed that knowledge management, in this case specifically the knowledge conversion SECI process, is present in the variables that, according to ...

  13. Systems analysis as a tool for optimal process strategy

    International Nuclear Information System (INIS)

    Ditterich, K.; Schneider, J.

    1975-09-01

    In recent times, the methods of Systems Analysis have emerged as the most promising approach for describing and optimally managing complex processes. In general, every process should be optimised with respect to reliability, safety, economy and environmental pollution. In this paper, the complex relations between these general optimisation postulates are established in qualitative form. These general trend relations have to be quantified for every particular system studied in practice

  14. STREAM PROCESSING ALGORITHMS FOR DYNAMIC 3D SCENE ANALYSIS

    Science.gov (United States)

    2018-02-15

    [Report documentation residue; recoverable details: contract number FA8750-14-2-0072, program element 62788F. Figure captions: "The 3D processing pipeline flowchart showing key modules" and "Overall view (data flow) of the proposed pipeline". Abstract fragment: the pipeline builds on structure from motion and bundle adjustment, with fusion of depth masks of the scene obtained from 3D...]

  15. Processing and analysis techniques involving in-vessel material generation

    Science.gov (United States)

    Schabron, John F [Laramie, WY; Rovani, Jr., Joseph F.

    2012-09-25

    In at least one embodiment, the inventive technology relates to in-vessel generation of a material from a solution of interest as part of a processing and/or analysis operation. Preferred embodiments of the in-vessel material generation (e.g., in-vessel solid material generation) include precipitation; in certain embodiments, analysis and/or processing of the solution of interest may include dissolution of the material, perhaps as part of a successive dissolution protocol using solvents of increasing ability to dissolve. Applications include, but are by no means limited to estimation of a coking onset and solution (e.g., oil) fractionating.

  16. Theoretical analysis of radiographic images by nonstationary Poisson processes

    International Nuclear Information System (INIS)

    Tanaka, Kazuo; Uchida, Suguru; Yamada, Isao.

    1980-01-01

    This paper deals with the noise analysis of radiographic images obtained in the usual fluorescent screen-film system. The theory of nonstationary Poisson processes is applied to the analysis of the radiographic images containing the object information. The ensemble averages, the autocorrelation functions, and the Wiener spectral densities of the light-energy distribution at the fluorescent screen and of the film optical-density distribution are obtained. The detection characteristics of the system are evaluated theoretically. Numerical examples for a one-dimensional image are shown, and the results are compared with those obtained under the assumption that the object image is related to the background noise by an additive process. (author)
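
    For orientation, the second-moment structure that makes such an analysis tractable follows from Campbell's theorem for a (possibly nonstationary) Poisson process with intensity $\lambda(x)$; this is the standard result, assumed here rather than quoted from the paper:

        \[
        \mathbb{E}[dN(x)] = \lambda(x)\,dx, \qquad
        \operatorname{Cov}\!\big(dN(x_1), dN(x_2)\big)
          = \lambda(x_1)\,\delta(x_1 - x_2)\,dx_1\,dx_2
        \]

    so the autocorrelation of the recorded image decomposes into a deterministic part plus a signal-dependent shot-noise part.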

  17. Inferring Group Processes from Computer-Mediated Affective Text Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Schryver, Jack C [ORNL; Begoli, Edmon [ORNL; Jose, Ajith [Missouri University of Science and Technology; Griffin, Christopher [Pennsylvania State University

    2011-02-01

    Political communications in the form of unstructured text convey rich connotative meaning that can reveal underlying group social processes. Previous research has focused on sentiment analysis at the document level, but we extend this analysis to sub-document levels through a detailed analysis of affective relationships between entities extracted from a document. Instead of pure sentiment analysis, which is just positive or negative, we explore nuances of affective meaning in 22 affect categories. Our affect propagation algorithm automatically calculates and displays extracted affective relationships among entities in graphical form in our prototype (TEAMSTER), starting with seed lists of affect terms. Several useful metrics are defined to infer underlying group processes by aggregating affective relationships discovered in a text. Our approach has been validated with annotated documents from the MPQA corpus, achieving a performance gain of 74% over comparable random guessers.

  18. Combined process automation for large-scale EEG analysis.

    Science.gov (United States)

    Sfondouris, John L; Quebedeaux, Tabitha M; Holdgraf, Chris; Musto, Alberto E

    2012-01-01

    Epileptogenesis is a dynamic process producing increased seizure susceptibility. Electroencephalography (EEG) data provides information critical in understanding the evolution of epileptiform changes throughout epileptic foci. We designed an algorithm to facilitate efficient large-scale EEG analysis via linked automation of multiple data processing steps. Using EEG recordings obtained from electrical stimulation studies, the following steps of EEG analysis were automated: (1) alignment and isolation of pre- and post-stimulation intervals, (2) generation of user-defined band frequency waveforms, (3) spike-sorting, (4) quantification of spike and burst data and (5) power spectral density analysis. This algorithm allows for quicker, more efficient EEG analysis. Copyright © 2011 Elsevier Ltd. All rights reserved.
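
    A toy version of steps (2)-(5) of that chain, in Python with scipy (the recording, band and threshold are invented; step (1), interval alignment, is omitted):

        import numpy as np
        from scipy.signal import butter, filtfilt, welch

        fs = 1000.0                                   # Hz, assumed
        t = np.arange(0, 30, 1 / fs)
        eeg = np.random.default_rng(2).normal(0, 10, t.size)   # synthetic trace

        b, a = butter(4, [4.0, 12.0], btype="bandpass", fs=fs)
        theta = filtfilt(b, a, eeg)                   # (2) band waveform

        thr = 4 * np.median(np.abs(theta)) / 0.6745   # robust sigma estimate
        spikes = np.flatnonzero((theta[1:] >= thr) & (theta[:-1] < thr))
        print("spike count:", spikes.size)            # (3)-(4) detect and count

        freqs, psd = welch(eeg, fs=fs, nperseg=2048)  # (5) spectral density

    Linking such steps in one script rather than running them by hand is precisely the automation gain the authors report.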

  19. Process synthesis, design and analysis using a process-group contribution method

    DEFF Research Database (Denmark)

    Kumar Tula, Anjan; Eden, Mario R.; Gani, Rafiqul

    2015-01-01

    This paper describes the development and application of a process-group contribution method to model, simulate and synthesize chemical processes. Process flowsheets are generated in the same way as atoms or groups of atoms are combined to form molecules in computer aided molecular design (CAMD) techniques. The fundamental pillars of this framework are the definition and use of functional process-groups (building blocks) representing a wide range of process operations, flowsheet connectivity rules to join the process-groups to generate all the feasible flowsheet alternatives, and flowsheet property models like energy consumption, atom efficiency and environmental impact to evaluate the performance of the generated alternatives. In this way, a list of feasible flowsheets is quickly generated, screened and selected for further analysis. Since the flowsheet is synthesized and the operations...

  20. Emotion processing in the visual brain: a MEG analysis.

    Science.gov (United States)

    Peyk, Peter; Schupp, Harald T; Elbert, Thomas; Junghöfer, Markus

    2008-06-01

    Recent functional magnetic resonance imaging (fMRI) and event-related brain potential (ERP) studies provide empirical support for the notion that emotional cues guide selective attention. Extending this line of research, whole-head magneto-encephalogram (MEG) was measured while participants viewed, in separate experimental blocks, a continuous stream of either pleasant and neutral or unpleasant and neutral pictures, presented for 330 ms each. Event-related magnetic fields (ERF) were analyzed after intersubject sensor coregistration, complemented by minimum norm estimates (MNE) to explore neural generator sources. Both streams of analysis converge by demonstrating selective emotion processing in an early (120-170 ms) and a late time interval (220-310 ms). ERF analysis revealed that the polarity of the emotion difference fields was reversed across early and late intervals, suggesting distinct patterns of activation in the visual processing stream. Source analysis revealed amplified processing of emotional pictures in visual processing areas, with more pronounced occipito-parieto-temporal activation in the early time interval and a stronger engagement of more anterior, temporal regions in the later interval. Confirming previous ERP studies showing facilitated emotion processing, the present data suggest that MEG provides a complementary look at the spread of activation in the visual processing stream.

  1. Analysis of the resolution processes of three modeling tasks

    Directory of Open Access Journals (Sweden)

    Cèsar Gallart Palau

    2017-08-01

    In this paper we present a comparative analysis of the resolution process of three modeling tasks performed by secondary education students (13-14 years), designed from three different points of view: Model-Eliciting Activities, the LEMA project, and Realistic Mathematical Problems. The purpose of this analysis is to obtain a methodological characterization of the tasks in order to provide secondary education teachers with a proper selection and sequencing of tasks for their implementation in the classroom.

  2. Enhancing Safety of Artificially Ventilated Patients Using Ambient Process Analysis.

    Science.gov (United States)

    Lins, Christian; Gerka, Alexander; Lüpkes, Christian; Röhrig, Rainer; Hein, Andreas

    2018-01-01

    In this paper, we present an approach for enhancing the safety of artificially ventilated patients using ambient process analysis. We propose an analysis system consisting of low-cost ambient sensors, such as a power sensor, an RGB-D sensor, a passage detector, and a matrix infrared temperature sensor, to reduce risks for artificially ventilated patients in both home and clinical environments. We describe the system concept and our implementation, and show how the system can contribute to patient safety.

  3. Effective Thermal Analysis of Using Peltier Module for Desalination Process

    OpenAIRE

    Hayder Al-Madhhachi

    2018-01-01

    The key objective of this study is to analyse the heat transfer processes involved in the evaporation and condensation of water in a water distillation system employing a thermoelectric module. This analysis can help to increase the water production and to enhance the system performance. For the analysis, a water distillation unit prototype integrated with a thermoelectric module was designed and fabricated. A theoretical model is developed to study the effect of the heat added, transferred a...

  4. What carries a mediation process? Configural analysis of mediation.

    Science.gov (United States)

    von Eye, Alexander; Mun, Eun Young; Mair, Patrick

    2009-09-01

    Mediation is a process that links a predictor and a criterion via a mediator variable. Mediation can be full or partial. This well-established definition operates at the level of variables even if they are categorical. In this article, two new approaches to the analysis of mediation are proposed. Both of these approaches focus on the analysis of categorical variables. The first involves mediation analysis at the level of configurations instead of variables. Thus, mediation can be incorporated into the arsenal of methods of analysis for person-oriented research. Second, it is proposed that Configural Frequency Analysis (CFA) can be used for both exploration and confirmation of mediation relationships among categorical variables. The implications of using CFA are first that mediation hypotheses can be tested at the level of individual configurations instead of variables. Second, this approach leaves the door open for different types of mediation processes to exist within the same set. Using a data example, it is illustrated that aggregate-level analysis can overlook mediation processes that operate at the level of individual configurations.

  5. Use of safety analysis results to support process operation

    International Nuclear Information System (INIS)

    Karvonen, I.; Heino, P.

    1990-01-01

    Safety and risk analysis carried out during the design phase of a process plant produces useful knowledge about the behavior and the disturbances of the system. This knowledge, however, often remains with the designer, though it would also be of benefit to the operators and supervisors of the process plant. At the Technical Research Centre of Finland a project has been started to plan and construct a prototype of an information system that makes use of the analysis knowledge during the operation phase. The project belongs to the Nordic KRM project (Knowledge Based Risk Management System). The information system is planned to be based on safety and risk analysis carried out during the design phase, completed with operational experience. The safety analysis includes knowledge about potential disturbances, their causes and consequences, in the form of Hazard and Operability Studies, fault trees and/or event trees. During operation, however, disturbances can occur which are not included in the safety analysis, or whose causes or consequences have been incompletely identified. Thus the information system must also have an interface for documenting the operational knowledge missing from the analysis results. The main tasks of the system when supporting the management of a disturbance are to identify it (or the most important of the coexistent ones) from the stored knowledge and to present it in a proper form (for example, as a deviation graph). The information system may also be used to transfer knowledge from one shift to another and to train process personnel.

  6. A Review of Literature on analysis of JIG Grinding Process

    DEFF Research Database (Denmark)

    Sudheesh, P. K.; Puthumana, Govindan

    2016-01-01

    Jig grinding is a process practically used by tool and die makers in the creation of jigs or mating holes and pegs on dies. The abrasives normally used in jig grinding are divided into Natural Abrasives and Artificial Abrasives. Artificial Abrasives are preferred in the manufacturing of grinding wheels for jig grinding because of their uniformity and purity. In this paper, a brief review of the analysis of the jig grinding process considering various research trends is presented. The areas highlighted are: optimization, selection of abrasives, selection of processing conditions and practical considerations. The optimization of parameters in the jig grinding process is important to maximize productivity and to improve quality. The abrasives of hard jig grinding wheels get blunt quickly, so these are recommended to grind workpieces of low hardness, and soft grinding wheels are recommended for hard material workpieces. The jig...

  7. A dataflow analysis tool for parallel processing of algorithms

    Science.gov (United States)

    Jones, Robert L., III

    1993-01-01

    A graph-theoretic design process and software tool is presented for selecting a multiprocessing scheduling solution for a class of computational problems. The problems of interest are those that can be described using a dataflow graph and are intended to be executed repetitively on a set of identical parallel processors. Typical applications include signal processing and control law problems. Graph analysis techniques are introduced and shown to effectively determine performance bounds, scheduling constraints, and resource requirements. The software tool is shown to facilitate the application of the design process to a given problem.
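
    As a sketch of the graph-analysis idea, the following fragment computes two classic bounds for a dataflow graph: the critical-path latency and the minimum processor count for a desired iteration period. The graph, execution times and period are hypothetical, not from the paper.

```python
import math
from functools import lru_cache

# Hypothetical dataflow graph: node -> (execution time, successor list).
graph = {
    "read":  (2.0, ["fir"]),
    "fir":   (5.0, ["ctrl"]),
    "gain":  (1.0, ["ctrl"]),
    "ctrl":  (3.0, ["write"]),
    "write": (1.0, []),
}

@lru_cache(maxsize=None)
def longest_path(node):
    """Critical-path time from node to any sink: a latency lower bound."""
    t, succs = graph[node]
    return t + max((longest_path(s) for s in succs), default=0.0)

sources = set(graph) - {s for _, succs in graph.values() for s in succs}
critical_path = max(longest_path(src) for src in sources)

# Processor (resource) bound: total work divided by the iteration period.
total_work = sum(t for t, _ in graph.values())
period = 6.0                                  # desired iteration period, assumed
min_processors = math.ceil(total_work / period)
print(critical_path, min_processors)          # 11.0, 2
```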

  8. Process analysis in a THTR trial reprocessing plant

    International Nuclear Information System (INIS)

    Brodda, B.G.; Filss, P.; Kirchner, H.; Kroth, K.; Lammertz, H.; Schaedlich, W.; Brocke, W.; Buerger, K.; Halling, H.; Watzlawik, K.H.

    1979-01-01

    The demands on an analytical control system for a THTR trial reprocessing plant are specified. In a rather detailed example, a typical sampling, sample monitoring and measuring process is described. Analytical control is partly automated. Data acquisition and evaluation by computer are described for some important, largely automated processes. Sample management and recording of in-line and off-line data are carried out by a data processing system. Some important experiments on sample taking, sample transport and on special analysis are described. (RB) [de

  9. Emotional Processing, Interaction Process, and Outcome in Clarification-Oriented Psychotherapy for Personality Disorders: A Process-Outcome Analysis.

    Science.gov (United States)

    Kramer, Ueli; Pascual-Leone, Antonio; Rohde, Kristina B; Sachse, Rainer

    2016-06-01

    It is important to understand the change processes involved in psychotherapies for patients with personality disorders (PDs). One patient process that promises to be useful in relation to the outcome of psychotherapy is emotional processing. In the present process-outcome analysis, we examine this question by using a sequential model of emotional processing and by additionally taking into account a therapist's appropriate responsiveness to a patient's presentation in clarification-oriented psychotherapy (COP), a humanistic-experiential form of therapy. The present study involved 39 patients with a range of PDs undergoing COP. Session 25 was assessed as part of the working phase of each therapy by external raters in terms of emotional processing using the Classification of Affective-Meaning States (CAMS) and in terms of the overall quality of therapist-patient interaction using the Process-Content-Relationship Scale (BIBS). Treatment outcome was assessed pre- and post-therapy using the Global Severity Index (GSI) of the SCL-90-R and the BDI. Results indicate that the good outcome cases showed more self-compassion, more rejecting anger, and a higher quality of therapist-patient interaction compared to poorer outcome cases. For good outcome cases, emotional processing predicted 18% of symptom change at the end of treatment, which was not found for poor outcome cases. These results are discussed within the framework of an integrative understanding of emotional processing as an underlying mechanism of change in COP, and perhaps in other effective therapy approaches for PDs.

  10. Conducting Qualitative Data Analysis: Qualitative Data Analysis as a Metaphoric Process

    Science.gov (United States)

    Chenail, Ronald J.

    2012-01-01

    In the second of a series of "how-to" essays on conducting qualitative data analysis, Ron Chenail argues the process can best be understood as a metaphoric process. From this orientation he suggests researchers follow Kenneth Burke's notion of metaphor and see qualitative data analysis as the analyst systematically considering the "this-ness" of…

  11. Energy and exergy analysis of the silicon production process

    International Nuclear Information System (INIS)

    Takla, M.; Kamfjord, N.E.; Tveit, Halvard; Kjelstrup, S.

    2013-01-01

    We used energy and exergy analysis to evaluate two industrial and one ideal (theoretical) production process for silicon. The industrial processes were considered in the absence and presence of power production from waste heat in the off-gas. The theoretical process, with pure reactants and no side-reactions, was used to provide a more realistic upper limit of performance for the others. The energy analysis documented the large thermal energy source in the off-gas system, while the exergy analysis documented the potential for efficiency improvement. We found an exergetic efficiency equal to 0.33 ± 0.02 for the process without power production. The value increased to 0.41 ± 0.03 when waste heat was utilized. For the ideal process, we found an exergetic efficiency of 0.51. Utilization of thermal exergy in an off-gas of 800 °C increased this exergetic efficiency to 0.71. Exergy destroyed due to combustion of by-product gases and exergy lost with the furnace off-gas were the largest contributors to the thermodynamic inefficiency of all processes. - Highlights: • The exergetic efficiency for an industrial silicon production process when silicon is the only product was estimated at 0.33. • With additional power production from thermal energy in the off-gas, we estimated the exergetic efficiency at 0.41. • The theoretical silicon production process is established as the reference case. • Exergy lost with the off-gas and exergy destroyed due to combustion account for roughly 75% of the total losses. • With utilization of the thermal exergy in the off-gas at a temperature of 800 °C, the exergetic efficiency was 0.71
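
    For readers unfamiliar with the metric, exergetic efficiency is the useful outgoing exergy divided by the total incoming exergy. The tiny sketch below applies that definition; the exergy flows are illustrative numbers chosen only to land near the reported 0.41, not values from the paper.

```python
# Hypothetical exergy flows for a silicon furnace (MW); not the paper's data.
flows_in = {"carbon_materials": 40.0, "quartz": 1.0, "electric_power": 45.0}
flows_out = {"silicon": 28.0, "recovered_power": 7.0}

# Exergetic efficiency: useful outgoing exergy over total incoming exergy.
eta_ex = sum(flows_out.values()) / sum(flows_in.values())
print(f"exergetic efficiency = {eta_ex:.2f}")  # 0.41 with these illustrative numbers
```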

  12. Production yield analysis in the poultry processing industry

    NARCIS (Netherlands)

    Somsen, D.J.; Capelle, A.; Tramper, J.

    2004-01-01

    The paper outlines a case study where the PYA-method (production yield analysis) was implemented at a poultry-slaughtering line, processing 9000 broiler chicks per hour. It was shown that the average live weight of a flock of broilers could be used to predict the maximum production yield of the

  13. Processing of pulse oximeter data using discrete wavelet analysis.

    Science.gov (United States)

    Lee, Seungjoon; Ibey, Bennett L; Xu, Weijian; Wilson, Mark A; Ericson, M Nance; Coté, Gerard L

    2005-07-01

    A wavelet-based signal processing technique was employed to improve an implantable blood perfusion monitoring system. Data was acquired from both in vitro and in vivo sources: a perfusion model and the proximal jejunum of an adult pig. Results showed that wavelet analysis could isolate perfusion signals from raw, periodic, in vitro data as well as fast Fourier transform (FFT) methods. However, for the quasi-periodic in vivo data segments, wavelet analysis provided more consistent results than the FFT analysis for data segments of 50, 10, and 5 s in length. Wavelet analysis has thus been shown to require less data points for quasi-periodic data than FFT analysis making it a good choice for an indwelling perfusion monitor where power consumption and reaction time are paramount.
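
    A minimal sketch of the idea with PyWavelets: decompose the raw signal with a multilevel DWT, zero out all but the detail levels whose frequency bands bracket the pulse rate, and reconstruct. The wavelet, level choices and sampling rate are assumptions, not the authors' settings.

```python
import numpy as np
import pywt  # PyWavelets

fs = 250.0                                   # sampling rate (Hz), assumed
t = np.arange(0, 10, 1 / fs)
# Synthetic quasi-periodic perfusion signal (~1.5 Hz) buried in drift and noise.
raw = (np.sin(2 * np.pi * 1.5 * t) + 0.5 * np.sin(2 * np.pi * 0.05 * t)
       + 0.3 * np.random.default_rng(2).normal(size=t.size))

# Multilevel DWT; coeffs[0] is the approximation (slow drift), then d7..d1.
coeffs = pywt.wavedec(raw, "db4", level=7)
for i in range(len(coeffs)):
    keep = i in (1, 2)                       # detail levels spanning roughly 1-4 Hz here
    if not keep:
        coeffs[i] = np.zeros_like(coeffs[i])
perfusion = pywt.waverec(coeffs, "db4")[: raw.size]
print(perfusion[:5])
```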

  14. Sampling and sample processing in pesticide residue analysis.

    Science.gov (United States)

    Lehotay, Steven J; Cook, Jo Marie

    2015-05-13

    Proper sampling and sample processing in pesticide residue analysis of food and soil have always been essential to obtain accurate results, but the subject is becoming a greater concern as approximately 100 mg test portions are being analyzed with automated high-throughput analytical methods by agrochemical industry and contract laboratories. As global food trade and the importance of monitoring increase, the food industry and regulatory laboratories are also considering miniaturized high-throughput methods. In conjunction with a summary of the symposium "Residues in Food and Feed - Going from Macro to Micro: The Future of Sample Processing in Residue Analytical Methods" held at the 13th IUPAC International Congress of Pesticide Chemistry, this is an opportune time to review sampling theory and sample processing for pesticide residue analysis. If collected samples and test portions do not adequately represent the actual lot from which they came and provide meaningful results, then all costs, time, and efforts involved in implementing programs using sophisticated analytical instruments and techniques are wasted and can actually yield misleading results. This paper is designed to briefly review the often-neglected but crucial topic of sample collection and processing and put the issue into perspective for the future of pesticide residue analysis. It also emphasizes that analysts should demonstrate the validity of their sample processing approaches for the analytes/matrices of interest and encourages further studies on sampling and sample mass reduction to produce a test portion.

  15. Thermodynamic analysis applied to a food-processing plant

    Energy Technology Data Exchange (ETDEWEB)

    Ho, J C; Chandratilleke, T T

    1987-01-01

    Two production lines of a multi-product, food-processing plant are selected for energy auditing and analysis. Thermodynamic analysis showed that the first-law and second-law efficiencies are 81.5% and 26.1% for the instant-noodles line and 23.6% and 7.9% for the malt-beverage line. These efficiency values are dictated primarily by the major energy-consuming sub-processes of each production line. Improvements in both first-law and second-law efficiencies are possible for the plants if the use of steam for heating is replaced by gaseous or liquid fuels, the steam ejectors for creating vacuum are replaced by a mechanical pump, and the cooler surroundings are employed to assist in the cooling process.

  16. A report on digital image processing and analysis

    International Nuclear Information System (INIS)

    Singh, B.; Alex, J.; Haridasan, G.

    1989-01-01

    This report presents developments in software connected with digital image processing and analysis in the Centre. In image processing, one resorts either to alteration of grey level values, so as to enhance features in the image, or to transform domain operations for restoration or filtering. Typical transform domain operations like Karhunen-Loeve transforms are statistical in nature and are used for good registration of images or template matching. Image analysis procedures segment grey level images into images contained within selectable windows, for the purpose of estimating geometrical features in the image, like area, perimeter, projections etc. In short, in image processing both the input and output are images, whereas in image analysis the input is an image and the output is a set of numbers and graphs. (author). 19 refs
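
    To make the image-analysis half concrete, the sketch below segments a grey-level image inside a selectable window and estimates area, perimeter and projections with NumPy/SciPy; the thresholding scheme and window coordinates are illustrative assumptions, not the report's implementation.

```python
import numpy as np
from scipy import ndimage

# Synthetic grey-level image with one bright object.
img = np.zeros((64, 64))
img[20:40, 15:45] = 1.0

window = img[10:50, 10:50]                   # "image contained within a window"
binary = window > 0.5                        # segmentation by global threshold

labels, n = ndimage.label(binary)
area = int((labels == 1).sum())              # pixel count of the first object

# Perimeter estimate: object pixels with at least one background neighbour.
eroded = ndimage.binary_erosion(labels == 1)
perimeter = int(((labels == 1) & ~eroded).sum())

# Projections onto the axes, as used for geometrical feature estimation.
proj_x, proj_y = binary.sum(axis=0), binary.sum(axis=1)
print(n, area, perimeter, proj_x.max(), proj_y.max())
```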

  17. Safety analysis of tritium processing system based on PHA

    International Nuclear Information System (INIS)

    Fu Wanfa; Luo Deli; Tang Tao

    2012-01-01

    Safety analysis of the primary confinement of the tritium processing system for the TBM was carried out using Preliminary Hazard Analysis (PHA). First, the basic PHA process is given. Then the function and safety measures of the multiple confinements of the tritium system are described and analyzed briefly, distinguishing the two kinds of boundaries through which tritium transfers: the multiple confinement system divisions and the fluid loop divisions. Analysis of tritium release is the key part of the PHA. A PHA table for tritium release is also presented, the causes and harmful consequences are analyzed, and safety measures are put forward. On the basis of the PHA, several kinds of typical accidents are selected for further analysis, and 8 factors influencing tritium safety are analyzed, laying the foundation for quantitatively evaluating the safety grade of various nuclear facilities. (authors)

  18. Data Farming Process and Initial Network Analysis Capabilities

    Directory of Open Access Journals (Sweden)

    Gary Horne

    2016-01-01

    Data Farming, network applications and approaches to integrate network analysis and processes to the data farming paradigm are presented as approaches to address complex system questions. Data Farming is a quantified approach that examines questions in large possibility spaces using modeling and simulation. It evaluates whole landscapes of outcomes to draw insights from outcome distributions and outliers. Social network analysis and graph theory are widely used techniques for the evaluation of social systems. Incorporation of these techniques into the data farming process provides analysts examining complex systems with a powerful new suite of tools for more fully exploring and understanding the effect of interactions in complex systems. The integration of network analysis with data farming techniques provides modelers with the capability to gain insight into the effect of network attributes, whether the network is explicitly defined or emergent, on the breadth of the model outcome space and the effect of model inputs on the resultant network statistics.

  19. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC)

    Energy Technology Data Exchange (ETDEWEB)

    Gerard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent; Kafrouni, Hanna; Husson, Francois; Aletti, Pierre [Research Center for Automatic Control (CRAN), Nancy University, CNRS, 54516 Vandoeuvre-les-Nancy (France); Department of Medical Physics, Alexis Vautrin Cancer Center, 54511 Vandoeuvre-les-Nancy Cedex (France) and DOSIsoft SA, 94230 Cachan (France); Research Laboratory for Innovative Processes (ERPI), Nancy University, EA 3767, 5400 Nancy Cedex (France); Department of Medical Physics, Alexis Vautrin Cancer Center, 54511 Vandoeuvre-les-Nancy Cedex (France); DOSIsoft SA, 94230 Cachan (France); Research Center for Automatic Control (CRAN), Nancy University, CNRS, 54516 Vandoeuvre-les-Nancy, France and Department of Medical Physics, Alexis Vautrin Cancer Center, 54511 Vandoeuvre-les-Nancy Cedex (France)

    2009-04-15

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: Statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short-term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center (±4% of deviation between the calculated and measured doses) by calculating a control process capability (Cpc) index. The Cpc index was found superior to 4, which implies that the observed variability of the dose delivery process is not biased by the short-term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should

  20. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC).

    Science.gov (United States)

    Gérard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent; Kafrouni, Hanna; Husson, François; Aletti, Pierre

    2009-04-01

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: Statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short-term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center (+/- 4% of deviation between the calculated and measured doses) by calculating a control process capability (C(pc)) index. The C(pc) index was found superior to 4, which implies that the observed variability of the dose delivery process is not biased by the short-term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should improve the
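
    A compact Python sketch of the SPC tooling described above: individual/moving-range control limits, EWMA values, and a capability-style index relating the clinical tolerance to the process spread. The QC deviations and the lambda value are hypothetical, not the authors' data.

```python
import numpy as np

# Hypothetical chamber QC results: deviation (%) between measured and planned dose.
dev = np.array([0.8, -1.2, 0.5, 1.9, -0.3, 1.1, -2.0, 0.4, 1.5, -0.9, 2.2, 0.1])
tol = 4.0                                     # clinical tolerance: +/- 4 %

# Individual / moving-range charts (limits from the average moving range).
mr = np.abs(np.diff(dev))
sigma_hat = mr.mean() / 1.128                 # d2 constant for subgroups of size 2
ucl, lcl = dev.mean() + 3 * sigma_hat, dev.mean() - 3 * sigma_hat

# EWMA chart values (lambda = 0.2 is a common choice).
lam, ewma = 0.2, [dev.mean()]
for x in dev:
    ewma.append(lam * x + (1 - lam) * ewma[-1])

# Capability-style index: tolerance width over process spread.
cp = (2 * tol) / (6 * sigma_hat)
print(f"I-chart limits: [{lcl:.2f}, {ucl:.2f}]  Cp = {cp:.2f}")
```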

  1. Semantic orchestration of image processing services for environmental analysis

    Science.gov (United States)

    Ranisavljević, Élisabeth; Devin, Florent; Laffly, Dominique; Le Nir, Yannick

    2013-09-01

    In order to analyze environmental dynamics, a major process is the classification of the different phenomena of the site (e.g. ice and snow for a glacier). When using in situ pictures, this classification requires data pre-processing. Not all pictures need the same sequence of processes, depending on the disturbances. Until now, these sequences have been done manually, which restricts the processing of large amounts of data. In this paper, we present how to realize a semantic orchestration to automate the sequencing for the analysis. It combines two advantages: solving the problem of the amount of processing, and diversifying the possibilities in the data processing. We define a BPEL description to express the sequences. This BPEL uses web services to run the data processing. Each web service is semantically annotated using an ontology of image processing. The dynamic modification of the BPEL is done using SPARQL queries on these annotated web services. The results obtained by a prototype implementing this method validate the construction of the different workflows, which can be applied to a large number of pictures.

  2. Inverse Analysis to Formability Design in a Deep Drawing Process

    Science.gov (United States)

    Buranathiti, Thaweepat; Cao, Jian

    The deep drawing process is an important process adding value to flat sheet metals in many industries. An important concern in the design of a deep drawing process is generally formability. This paper aims to present the connection between formability and inverse analysis (IA), a systematic means for determining an optimal blank configuration for a deep drawing process. In this paper, IA is presented and explored by using a commercial finite element software package. A number of numerical studies on the effect of blank configurations on the quality of a part produced by a deep drawing process were conducted and analyzed. The quality of the drawing processes is numerically analyzed by using an explicit incremental nonlinear finite element code. The minimum distance between elemental principal strains and the strain-based forming limit curve (FLC) is defined as the tearing margin, the key performance index (KPI) implying the quality of the part. The initial blank configuration is shown to play a highly important role in the quality of the product made via the deep drawing process. In addition, it is observed that if a blank configuration does not deviate greatly from the one obtained from IA, the blank can still yield a good product. The strain history around the bottom fillet of the part is also observed. The paper concludes that IA is an important part of the design methodology for deep drawing processes.
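
    The tearing-margin KPI reduces to a nearest-distance computation in strain space. A minimal NumPy sketch under assumed data follows; the FLC shape and the elemental strains are invented for illustration, not taken from the paper.

```python
import numpy as np

# Hypothetical strain-based forming limit curve (FLC): major strain as a
# function of minor strain, sampled at discrete points.
flc_minor = np.linspace(-0.2, 0.3, 200)
flc_major = 0.35 - 0.5 * np.minimum(flc_minor, 0) + 0.4 * np.maximum(flc_minor, 0) ** 2

# Elemental principal strains from an (assumed) FE simulation of the drawn part.
elem = np.array([[0.05, 0.18], [-0.10, 0.30], [0.12, 0.33]])  # (minor, major)

# Tearing margin = minimum distance from any element to the FLC in strain space.
d = np.hypot(elem[:, 0, None] - flc_minor[None, :],
             elem[:, 1, None] - flc_major[None, :])
print(f"tearing margin (KPI) = {d.min():.3f}")
```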

  3. Example process hazard analysis of a Department of Energy water chlorination process

    Energy Technology Data Exchange (ETDEWEB)

    1993-09-01

    On February 24, 1992, the Occupational Safety and Health Administration (OSHA) released a revised version of Section 29 Code of Federal Regulations CFR Part 1910 that added Section 1910.119, entitled "Process Safety Management of Highly Hazardous Chemicals" (the PSM Rule). Because US Department of Energy (DOE) Orders 5480.4 and 5483.1A prescribe OSHA 29 CFR 1910 as a standard in DOE, the PSM Rule is mandatory in the DOE complex. A major element in the PSM Rule is the process hazard analysis (PrHA), which is required for all chemical processes covered by the PSM Rule. The PrHA element of the PSM Rule requires the selection and application of appropriate hazard analysis methods to systematically identify hazards and potential accident scenarios associated with processes involving highly hazardous chemicals (HHCs). The analysis in this report is an example PrHA performed to meet the requirements of the PSM Rule. The PrHA method used in this example is the hazard and operability (HAZOP) study, and the process studied is the new Hanford 300-Area Water Treatment Facility chlorination process, which is currently in the design stage. The HAZOP study was conducted on May 18--21, 1993, by a team from the Westinghouse Hanford Company (WHC), Battelle-Columbus, the DOE, and Pacific Northwest Laboratory (PNL). The chlorination process was chosen as the example process because it is common to many DOE sites, and because quantities of chlorine at those sites generally exceed the OSHA threshold quantities (TQs).

  4. Exergetic analysis of a biodiesel production process from Jatropha curcas

    International Nuclear Information System (INIS)

    Blanco-Marigorta, A.M.; Suárez-Medina, J.; Vera-Castellano, A.

    2013-01-01

    Highlights: ► Exergetic analysis of a biodiesel production process from Jatropha curcas. ► 95% of the inefficiencies are located in the transesterification reactor. ► Exergetic efficiency of the steam generator amounts to 37.6%. ► Chemical reactions cause most of the irreversibilities of the process. ► Exergetic efficiency of the overall process is over 63%. -- Abstract: As fossil fuels are depleting day by day, it is necessary to find an alternative fuel to fulfill the energy demand of the world. Biodiesel is considered an environmentally friendly renewable diesel fuel alternative. The interest in using Jatropha curcas as a feedstock for the production of biodiesel is rapidly growing. On the one hand, J. curcas' oil does not compete with the food sector due to its toxic nature and to the fact that it must be cultivated in marginal/poor soil. On the other, its price is low and stable. In the last decade, the investigation of biodiesel production was centered on the choice of the suitable raw material and on the optimization of the process operation conditions. Nowadays, research is focused on the improvement of the energetic performance and on diminishing the inefficiencies in the different process components. The method of exergy analysis is well suited for furthering this goal, for it is a powerful tool for developing, evaluating and improving an energy conversion system. In this work, we identify the location, magnitude and sources of thermodynamic inefficiencies in a biodiesel production process from J. curcas by means of an exergy analysis. The thermodynamic properties were calculated from existing databases or estimated when necessary. The highest exergy destruction takes place in the transesterification reactor due to chemical reactions. Almost 95% of the exergy of the fuel is destroyed in this reactor. The exergetic efficiency of the overall process is 63%.

  5. Surplus analysis of Sparre Andersen insurance risk processes

    CERN Document Server

    Willmot, Gordon E

    2017-01-01

    This carefully written monograph covers the Sparre Andersen process in an actuarial context using the renewal process as the model for claim counts. A unified reference on Sparre Andersen (renewal risk) processes is included, often missing from existing literature. The authors explore recent results and analyse various risk theoretic quantities associated with the event of ruin, including the time of ruin and the deficit of ruin. Particular attention is given to the explicit identification of defective renewal equation components, which are needed to analyse various risk theoretic quantities and are also relevant in other subject areas of applied probability such as dams and storage processes, as well as queuing theory. Aimed at researchers interested in risk/ruin theory and related areas, this work will also appeal to graduate students in classical and modern risk theory and Gerber-Shiu analysis.

  6. A Review of Literature on analysis of JIG Grinding Process

    DEFF Research Database (Denmark)

    Sudheesh, P. K.; Puthumana, Govindan

    2016-01-01

    Jig grinding is a process practically used by tool and die makers in the creation of jigs or mating holes and pegs on dies. The abrasives normally used in jig grinding are divided into Natural Abrasives and Artificial Abrasives. Artificial Abrasives are preferred in the manufacturing of grinding wheels for jig grinding because of their uniformity and purity. In this paper, a brief review of the analysis of the jig grinding process considering various research trends is presented. The areas highlighted are: optimization, selection of abrasives, selection of processing conditions and practical considerations. The optimization of parameters in the jig grinding process is important to maximize productivity and to improve quality. The abrasives of hard jig grinding wheels get blunt quickly, so these are recommended to grind workpieces of low hardness, and soft grinding wheels are recommended for hard material workpieces. The jig...

  7. Scattering analysis of point processes and random measures

    International Nuclear Information System (INIS)

    Hanisch, K.H.

    1984-01-01

    In the present paper scattering analysis of point processes and random measures is studied. Known formulae which connect the scattering intensity with the pair distribution function of the studied structures are proved in a rigorous manner with tools of the theory of point processes and random measures. For some special fibre processes the scattering intensity is computed. For a class of random measures, namely for 'grain-germ-models', a new formula is proved which yields the pair distribution function of the 'grain-germ-model' in terms of the pair distribution function of the underlying point process (the 'germs') and of the mean structure factor and the mean squared structure factor of the particles (the 'grains'). (author)

  8. System Analysis of Flat Grinding Process with Wheel Face

    Directory of Open Access Journals (Sweden)

    T. N. Ivanova

    2014-01-01

    The paper presents a system analysis of the flat grinding process with the wheel face, considering the state parameters and the input and output variables of the subsystems, namely: machine tool, workpiece, grinding wheel, cutting fluids, and the contact area. It reveals the factors influencing the temperature and power conditions of the grinding process. Aim: the system analysis of the flat grinding process with the wheel face is expected to enable the development of a system of grinding process parameters as a technical system, which will make it possible to evaluate each parameter individually and to optimize the entire system. One of the most important criteria in defining the optimal process conditions is the grinding temperature, which, to avoid the appearance of defects on the surface of the component, should not exceed critical temperature values that are to be determined experimentally. The temperature criterion can be useful for choosing the conditions for the maximum defect-free performance of mechanical face grinding. Other criteria can also be used to define the maximum performance of defect-free grinding, such as a critical power density, indirectly reflecting the allowable thermal stress of the grinding process; the structure of the ground surface, which reflects the presence or absence of a defect layer and is determined after a large number of experiments; and the consumption of the diamond layer. Optimal conditions should not exceed those of defect-free grinding. It is found that maximum performance depends on the characteristics of the wheels and the grade of the processed material, as well as on the contact area and the grinding conditions. Optimal performance depends on the diamond value (cost) and the specific consumption of diamonds in a wheel. The above criteria require formalization as functions of the variable parameters of the grinding process. There is an option for a compromise of inter-criteria optimality, thereby providing a set of acceptable solutions, from

  9. A meta-analysis and review of holistic face processing.

    Science.gov (United States)

    Richler, Jennifer J; Gauthier, Isabel

    2014-09-01

    The concept of holistic processing is a cornerstone of face recognition research, yet central questions related to holistic processing remain unanswered, and debates have thus far failed to reach a resolution despite accumulating empirical evidence. We argue that a considerable source of confusion in this literature stems from a methodological problem. Specifically, 2 measures of holistic processing based on the composite paradigm (complete design and partial design) are used in the literature, but they often lead to qualitatively different results. First, we present a comprehensive review of the work that directly compares the 2 designs, and which clearly favors the complete design over the partial design. Second, we report a meta-analysis of holistic face processing according to both designs and use this as further evidence for one design over the other. The meta-analysis effect size of holistic processing in the complete design is nearly 3 times that of the partial design. Effect sizes were not correlated between measures, consistent with the suggestion that they do not measure the same thing. Our meta-analysis also examines the correlation between conditions in the complete design of the composite task, and suggests that in an individual differences context, little is gained by including a misaligned baseline. Finally, we offer a comprehensive review of the state of knowledge about holistic processing based on evidence gathered from the measure we favor based on the 1st sections of our review-the complete design-and outline outstanding research questions in that new context. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  10. Flotation process diagnostics and modelling by coal grain analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ofori, P; O'Brien, G; Firth, B; Jenkins, B. [CSIRO Energy Technology, Brisbane, Qld. (Australia)

    2006-05-15

    In coal flotation, particles of different components of the coal such as maceral groups and mineral matter and their associations have different hydrophobicities and therefore different flotation responses. By using a new coal grain analysis method for characterising individual grains, more detailed flotation performance analysis and modelling approaches have been developed. The method involves the use of microscopic imaging techniques to obtain estimates of size, compositional and density information on individual grains of fine coal. The density and composition partitioning of coal processed through different flotation systems provides an avenue to pinpoint the actual cause of poor process performance so that corrective action may be initiated. The information on grain size, density and composition is being used as input data to develop more detailed flotation process models to provide better predictions of process performance for both mechanical and column flotation devices. A number of approaches may be taken to flotation modelling such as the probability approach and the kinetic model approach or a combination of the two. In the work reported here, a simple probability approach has been taken, which will be further refined in due course. The use of grain data to map the responses of different types of coal grains through various fine coal cleaning processes provided a more advanced diagnostic capability for fine coal cleaning circuits. This enabled flotation performance curves analogous to partition curves for density separators to be produced for flotation devices.

  11. Bayesian analysis of log Gaussian Cox processes for disease mapping

    DEFF Research Database (Denmark)

    Benes, Viktor; Bodlák, Karel; Møller, Jesper

    We consider a data set of locations where people in Central Bohemia have been infected by tick-borne encephalitis, and where population census data and covariates concerning vegetation and altitude are available. The aims are to estimate the risk map of the disease and to study the dependence of the risk on the covariates. Instead of using the common area-level approaches, we consider a Bayesian analysis for a log Gaussian Cox point process with covariates. Posterior characteristics for a discretized version of the log Gaussian Cox process are computed using Markov chain Monte Carlo methods...

  12. System and Analysis for Low Latency Video Processing using Microservices

    OpenAIRE

    VASUKI BALASUBRAMANIAM, KARTHIKEYAN

    2017-01-01

    The evolution of big data processing and analysis has led to data-parallel frameworks such as Hadoop, MapReduce, Spark, and Hive, which are capable of analyzing large streams of data such as server logs, web transactions, and user reviews. Videos are one of the biggest sources of data and dominate the Internet traffic. Video processing on a large scale is critical and challenging as videos possess spatial and temporal features, which are not taken into account by the existing data-parallel fr...

  13. THE ANALYSIS OF RISK MANAGEMENT PROCESS WITHIN MANAGEMENT

    Directory of Open Access Journals (Sweden)

    ROMANESCU MARCEL LAURENTIU

    2016-10-01

    This article highlights risk analysis within management, focusing on how a company could practically integrate risk management into its existing management process. Subsequently, it illustrates how to manage risk effectively, which gives numerous advantages to all firms, including improving their decision-making process. All this leads to the conclusion that the degree of risk specific to companies is very high, but if managers make the best decisions, they can diminish it, so that business activity and income are not influenced by factors that could disturb them in a negative way.

  14. Global processing takes time: A meta-analysis on local-global visual processing in ASD.

    Science.gov (United States)

    Van der Hallen, Ruth; Evers, Kris; Brewaeys, Katrien; Van den Noortgate, Wim; Wagemans, Johan

    2015-05-01

    What does an individual with autism spectrum disorder (ASD) perceive first: the forest or the trees? In spite of 30 years of research and influential theories like the weak central coherence (WCC) theory and the enhanced perceptual functioning (EPF) account, the interplay of local and global visual processing in ASD remains only partly understood. Research findings vary in indicating a local processing bias or a global processing deficit, and often contradict each other. We have applied a formal meta-analytic approach and combined 56 articles that tested about 1,000 ASD participants and used a wide range of stimuli and tasks to investigate local and global visual processing in ASD. Overall, results show neither enhanced local visual processing nor a deficit in global visual processing. Detailed analysis reveals a difference in the temporal pattern of the local-global balance, that is, slow global processing in individuals with ASD. Whereas task-dependent interaction effects are obtained, gender, age, and IQ of either participant group seem to have no direct influence on performance. Based on the overview of the literature, suggestions are made for future research. (c) 2015 APA, all rights reserved.

  15. Analysis of hazardous substances released during CFRP laser processing

    Science.gov (United States)

    Hustedt, Michael; Walter, Juergen; Bluemel, Sven; Jaeschke, Peter; Kaierle, Stefan

    2017-02-01

    Due to their outstanding mechanical properties, in particular their high specific strength parallel to the carbon fibers, carbon fiber reinforced plastics (CFRP) have a high potential regarding resource-efficient lightweight construction. Consequently, these composite materials are increasingly finding application in important industrial branches such as aircraft, automotive and wind energy industry. However, the processing of these materials is highly demanding. On the one hand, mechanical processing methods such as milling or drilling are sometimes rather slow, and they are connected with notable tool wear. On the other hand, thermal processing methods are critical as the two components matrix and reinforcement have widely differing thermophysical properties, possibly leading to damages of the composite structure in terms of pores or delamination. An emerging innovative method for processing of CFRP materials is the laser technology. As principally thermal method, laser processing is connected with the release of potentially hazardous, gaseous and particulate substances. Detailed knowledge of these process emissions is the basis to ensure the protection of man and the environment, according to the existing legal regulations. This knowledge will help to realize adequate protective measures and thus strengthen the development of CFRP laser processing. In this work, selected measurement methods and results of the analysis of the exhaust air and the air at the workplace during different laser processes with CFRP materials are presented. The investigations have been performed in the course of different cooperative projects, funded by the German Federal Ministry of Education and Research (BMBF) in the course of the funding initiative "Photonic Processes and Tools for Resource-Efficient Lightweight Structures".

  16. Role of thermal analysis in uranium oxide fuel fabrication process

    International Nuclear Information System (INIS)

    Balaji Rao, Y.; Yadav, R.B.

    2006-01-01

    The present paper discusses the application of thermal analysis, particularly differential thermal analysis (DTA), at various stages of the fuel fabrication process. The useful role of DTA in establishing the decomposition pattern and calcination temperature of ADU, along with the de-nitration temperature, is explained. The decomposition pattern depends upon the type of drying process adopted for the wet ADU cake (ADU C). The paper also highlights the utility of DTA in determining the APS and SSA of UO2+x and U3O8 powders as an alternate technique. Further, the temperature difference (ΔTmax) between the two exothermic peaks obtained in UO2+x powder oxidation is related to the sintered density of UO2 pellets. (author)

  17. Textural Analysis of Fatigue Crack Surfaces: Image Pre-processing

    Directory of Open Access Journals (Sweden)

    H. Lauschmann

    2000-01-01

    For the fatigue crack history reconstitution, new methods of quantitative microfractography are being developed based on image processing and textural analysis. SEM magnifications between micro- and macrofractography are used. Two image pre-processing operations were suggested and proved to prepare the crack surface images for analytical treatment: 1. Normalization is used to transform the image to a stationary form. Compared to the generally used equalization, it conserves the shape of the brightness distribution and preserves the character of the texture. 2. Binarization is used to transform the grayscale image to a system of thick fibres. An objective criterion for the threshold brightness value was found as that resulting in the maximum number of objects. Both methods were successfully applied together with the subsequent textural analysis.
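
    Both pre-processing operations translate directly into code. The sketch below (an interpretation, not the authors' implementation) normalizes brightness while preserving the distribution shape, then scans thresholds and keeps the one that maximizes the number of labelled objects; all parameter values are assumptions.

```python
import numpy as np
from scipy import ndimage

def normalize(img, target_mean=0.5, target_std=0.15):
    """Shift/scale brightness to a stationary form, keeping the shape of the
    brightness distribution (unlike histogram equalization)."""
    return (img - img.mean()) / (img.std() + 1e-12) * target_std + target_mean

def binarize_max_objects(img, thresholds=np.linspace(0.1, 0.9, 81)):
    """Pick the threshold yielding the maximum number of labelled objects,
    the objective criterion suggested in the abstract."""
    best_t, best_n = thresholds[0], -1
    for t in thresholds:
        _, n = ndimage.label(img > t)
        if n > best_n:
            best_t, best_n = t, n
    return img > best_t, best_t, best_n

rng = np.random.default_rng(3)
crack = normalize(rng.normal(size=(128, 128)))   # stand-in for an SEM image
binary, t, n = binarize_max_objects(crack)
print(f"threshold = {t:.2f}, objects = {n}")
```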

  18. Data acquisition and processing system for reactor noise analysis

    International Nuclear Information System (INIS)

    Costa Oliveira, J.; Morais Da Veiga, C.; Forjaz Trigueiros, D.; Pombo Duarte, J.

    1975-01-01

    A data acquisition and processing system for reactor noise analysis by time correlation methods is described, consisting of one to four data feeding channels (transducer, associated electronics and V/f converter), a sampling unit, a landline transmission system and a PDP 15 computer. This system is being applied to study the kinetic parameters of the 'Reactor Portugues de Investigacao', a swimming-pool 1 MW reactor. The main features that make such a data acquisition and processing system a useful tool for noise analysis are: the improved characteristics of the analog-to-digital converters employed to quantize the signals; the use of an on-line computer, which allows a great accumulation and rapid treatment of data together with an easy check of the correctness of the experiments; and the adoption of the two-detector time cross-correlation technique, which by-passes the limitation of low-efficiency detectors. (author)
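
    The two-detector technique rests on the fact that uncorrelated detection noise averages out of the cross-correlation while the shared reactor-kinetics component survives. A minimal NumPy sketch with synthetic channels follows; all parameters are assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)
common = rng.normal(size=20000)               # shared reactor-noise component

# Two detector channels: common kinetics signal plus independent detection noise.
det1 = common + 2.0 * rng.normal(size=common.size)
det2 = common + 2.0 * rng.normal(size=common.size)

def cross_correlation(x, y, max_lag=100):
    """Biased estimator of the cross-correlation function for lags 0..max_lag."""
    x = x - x.mean()
    y = y - y.mean()
    return np.array([np.dot(x[: x.size - k], y[k:]) / x.size
                     for k in range(max_lag + 1)])

cxy = cross_correlation(det1, det2)
print(cxy[:5])  # the uncorrelated detector noise averages out at all lags
```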

  19. Development of Probabilistic Structural Analysis Integrated with Manufacturing Processes

    Science.gov (United States)

    Pai, Shantaram S.; Nagpal, Vinod K.

    2007-01-01

    An effort has been initiated to integrate manufacturing process simulations with probabilistic structural analyses in order to capture the important impacts of manufacturing uncertainties on component stress levels and life. Two physics-based manufacturing process models (one for powdered metal forging and the other for annular deformation resistance welding) have been linked to the NESSUS structural analysis code. This paper describes the methodology developed to perform this integration including several examples. Although this effort is still underway, particularly for full integration of a probabilistic analysis, the progress to date has been encouraging and a software interface that implements the methodology has been developed. The purpose of this paper is to report this preliminary development.

  20. Rapid process development of chromatographic process using direct analysis in real time mass spectrometry as a process analytical technology tool.

    Science.gov (United States)

    Yan, Binjun; Chen, Teng; Xu, Zhilin; Qu, Haibin

    2014-06-01

    The concept of quality by design (QbD) is widely applied in the process development of pharmaceuticals. However, the additional cost and time have caused some resistance about QbD implementation. To show a possible solution, this work proposed a rapid process development method, which used direct analysis in real time mass spectrometry (DART-MS) as a process analytical technology (PAT) tool for studying the chromatographic process of Ginkgo biloba L., as an example. The breakthrough curves were fast determined by DART-MS at-line. A high correlation coefficient of 0.9520 was found between the concentrations of ginkgolide A determined by DART-MS and HPLC. Based on the PAT tool, the impacts of process parameters on the adsorption capacity were discovered rapidly, which showed a decreased adsorption capacity with the increase of the flow rate. This work has shown the feasibility and advantages of integrating PAT into QbD implementation for rapid process development. Copyright © 2014 Elsevier B.V. All rights reserved.

  1. Spatial Analysis of Depots for Advanced Biomass Processing

    Energy Technology Data Exchange (ETDEWEB)

    Hilliard, Michael R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Brandt, Craig C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Webb, Erin [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Sokhansanj, Shahabaddine [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Eaton, Laurence M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Martinez Gonzalez, Maria I. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2014-09-01

    The objective of this work was to perform a spatial analysis of the total feedstock cost at the conversion reactor for biomass supplied by a conventional system and an advanced system with depots to densify biomass into pellets. From these cost estimates, the conditions (feedstock cost and availability) for which advanced processing depots make it possible to achieve cost and volume targets can be identified.

  2. Analysis and control of harmful emissions from combustion processes

    OpenAIRE

    Jafari, Ahmad

    2000-01-01

    This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University. The harmful effects of air pollutants on human beings and the environment have been the major reason for efforts in sampling, analysis and control of their sources. The major pollutants emitted to the atmosphere from stationary combustion processes are nitrogen oxides, inorganic acids, carbon dioxide, carbon monoxide, hydrocarbons and soot. In the current work two methods are developed for sampl...

  3. Thermodynamic analysis of CO2 capture processes for power plants

    OpenAIRE

    Biyouki, Zeinab Amrollahi

    2014-01-01

    This thesis work presents an evaluation of various processes for reducing CO2 emissions from natural-gas-fired combined cycle (NGCC) power plants. The scope of the thesis is to focus mainly on post-combustion chemical absorption for NGCC. For the post-combustion capture plant, an important interface is the steam extraction from the steam turbine in order to supply the heat for solvent regeneration. The steam extraction imposes a power production penalty. The thesis includes analysis and compa...

  4. Stochastic Analysis of Gaussian Processes via Fredholm Representation

    Directory of Open Access Journals (Sweden)

    Tommi Sottinen

    2016-01-01

    We show that every separable Gaussian process with integrable variance function admits a Fredholm representation with respect to a Brownian motion. We extend the Fredholm representation to a transfer principle and develop stochastic analysis by using it. We show the convenience of the Fredholm representation by giving applications to equivalence in law, bridges, series expansions, stochastic differential equations, and maximum likelihood estimations.
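
    In the notation usually attached to this result (assumed here, since the abstract gives no formulas), the Fredholm representation and the induced covariance factorization read:

```latex
% Fredholm representation of a centred separable Gaussian process X on [0,T]
% with \int_0^T \mathrm{E}[X_t^2]\,\mathrm{d}t < \infty:
X_t = \int_0^T K_T(t,s)\,\mathrm{d}W_s ,
% where W is a Brownian motion and K_T \in L^2([0,T]^2), so that the
% covariance factorizes as
R(t,s) = \int_0^T K_T(t,u)\,K_T(s,u)\,\mathrm{d}u .
```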

  5. A software package for biomedical image processing and analysis

    International Nuclear Information System (INIS)

    Goncalves, J.G.M.; Mealha, O.

    1988-01-01

    The decreasing cost of computing power and the introduction of low-cost imaging boards justify the increasing number of applications of digital image processing techniques in the area of biomedicine. There is, however, a large software gap to be filled between the application and the equipment. The requirements to bridge this gap are twofold: good knowledge of the hardware provided and its interface to the host computer, and expertise in digital image processing and analysis techniques. A software package incorporating these two requirements was developed using the C programming language, in order to create a user-friendly image processing programming environment. The software package can be considered in two different ways: as a data structure adapted to image processing and analysis, which acts as the backbone and the standard of communication for all the software; and as a set of routines implementing the basic algorithms used in image processing and analysis. Hardware dependency is restricted to a single module upon which all hardware calls are based. The data structure that was built has four main features: it is hierarchical, open, object oriented, and has object-dependent dimensions. Considering the vast amount of memory needed by imaging applications and the memory available in small imaging systems, an effective image memory management scheme was implemented. This software package has been used for more than one and a half years by users with different applications. It proved to be an excellent tool for helping people get adapted to the system, and for standardizing and exchanging software, while preserving flexibility and allowing for users' specific implementations. The philosophy of the software package is discussed and the data structure that was built is described in detail.

  6. Contingency Analysis Post-Processing With Advanced Computing and Visualization

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yousu; Glaesemann, Kurt; Fitzhenry, Erin

    2017-07-01

    Contingency analysis is a critical function widely used in energy management systems to assess the impact of power system component failures. Its outputs are important for power system operation for improved situational awareness, power system planning studies, and power market operations. With the increased complexity of power system modeling and simulation caused by increased energy production and demand, the penetration of renewable energy and fast deployment of smart grid devices, and the trend of operating grids closer to their capacity for better efficiency, more and more contingencies must be executed and analyzed quickly in order to ensure grid reliability and accuracy for the power market. Currently, many researchers have proposed different techniques to accelerate the computational speed of contingency analysis, but little work has been published on how to post-process the large number of contingency outputs quickly. This paper proposes a parallel post-processing function that can analyze contingency analysis outputs faster and display them in a web-based visualization tool, helping power engineers improve their work efficiency through faster information digestion. Case studies using an ESCA-60 bus system and a WECC planning system are presented to demonstrate the functionality of the parallel post-processing technique and the web-based visualization tool.
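
    To make the idea concrete, here is a minimal sketch (ours, not the paper's implementation) of parallel screening of contingency outputs with Python's multiprocessing; the record format, field names and limits are hypothetical.

      from multiprocessing import Pool

      def screen(result):
          # Flag branches whose post-contingency flow exceeds its limit.
          violations = [(branch, flow, limit)
                        for branch, (flow, limit) in result["branch_flows"].items()
                        if abs(flow) > limit]
          return result["contingency_id"], violations

      def post_process(results, workers=4):
          # Screen all contingency outputs in parallel, keep only violators.
          with Pool(workers) as pool:
              screened = pool.map(screen, results)
          return {cid: v for cid, v in screened if v}

      if __name__ == "__main__":
          demo = [{"contingency_id": "line_1_2",
                   "branch_flows": {"2-3": (1.10, 1.00)}},
                  {"contingency_id": "line_4_5",
                   "branch_flows": {"2-3": (0.80, 1.00)}}]
          print(post_process(demo))   # {'line_1_2': [('2-3', 1.1, 1.0)]}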

  7. Model based process-product design and analysis

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    This paper gives a perspective on modelling and the important role it has within product-process design and analysis. Different modelling issues related to development and application of systematic model-based solution approaches for product-process design are discussed and the need for a hybrid model-based framework is highlighted. This framework should be able to manage knowledge-data, models, and associated methods and tools integrated with design work-flows and data-flows for specific product-process design problems. In particular, the framework needs to manage models of different types, forms and complexity, together with their associated parameters. An example of a model-based system for design of chemicals based formulated products is also given.

  8. Memory systems, processes, and tasks: taxonomic clarification via factor analysis.

    Science.gov (United States)

    Bruss, Peter J; Mitchell, David B

    2009-01-01

    The nature of various memory systems was examined using factor analysis. We reanalyzed data from 11 memory tasks previously reported in Mitchell and Bruss (2003). Four well-defined factors emerged, closely resembling episodic and semantic memory and conceptual and perceptual implicit memory, in line with both memory systems and transfer-appropriate processing accounts. To explore taxonomic issues, we ran separate analyses on the implicit tasks. Using a cross-format manipulation (pictures vs. words), we identified 3 prototypical tasks. Word fragment completion and picture fragment identification tasks were "factor pure," tapping perceptual processes uniquely. Category exemplar generation revealed its conceptual nature, yielding both cross-format priming and a picture superiority effect. In contrast, word stem completion and picture naming were more complex, revealing attributes of both processes.
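
    For readers who want to try the technique, here is a minimal sketch (ours, on synthetic data, not the authors' reanalysis) of a four-factor exploratory factor analysis in scikit-learn:

      import numpy as np
      from sklearn.decomposition import FactorAnalysis

      rng = np.random.default_rng(0)
      scores = rng.random((100, 11))     # stand-in for 11 memory-task scores

      # Fit four factors, as in the study above, with a varimax rotation.
      fa = FactorAnalysis(n_components=4, rotation="varimax").fit(scores)
      loadings = fa.components_.T        # one row per task, one column per factor
      print(np.round(loadings, 2))       # a "factor pure" task loads on one column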

  9. Multifractal detrended fluctuation analysis of analog random multiplicative processes

    Energy Technology Data Exchange (ETDEWEB)

    Silva, L.B.M.; Vermelho, M.V.D. [Instituto de Fisica, Universidade Federal de Alagoas, Maceio - AL, 57072-970 (Brazil); Lyra, M.L. [Instituto de Fisica, Universidade Federal de Alagoas, Maceio - AL, 57072-970 (Brazil)], E-mail: marcelo@if.ufal.br; Viswanathan, G.M. [Instituto de Fisica, Universidade Federal de Alagoas, Maceio - AL, 57072-970 (Brazil)

    2009-09-15

    We investigate non-Gaussian statistical properties of stationary stochastic signals generated by an analog circuit that simulates a random multiplicative process with weak additive noise. The random noises are originated by thermal shot noise and avalanche processes, while the multiplicative process is generated by a fully analog circuit. The resulting signal describes stochastic time series of current interest in several areas such as turbulence, finance, biology and environment, which exhibit power-law distributions. Specifically, we study the correlation properties of the signal by employing a detrended fluctuation analysis and explore its multifractal nature. The singularity spectrum is obtained and analyzed as a function of the control circuit parameter that tunes the asymptotic power-law form of the probability distribution function.
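
    As a companion to the method, here is a compact sketch of the standard MFDFA recipe (our code, not the authors'): integrate the signal, detrend it segment-wise with a polynomial, and average the segment fluctuations with a q-dependent weight; a q-dependent slope h(q) of log F_q(s) versus log s signals multifractality.

      import numpy as np

      def mfdfa(signal, scales, qs, order=1):
          # Multifractal detrended fluctuation analysis, minimal version.
          profile = np.cumsum(signal - np.mean(signal))   # integrated profile
          Fq = np.empty((len(qs), len(scales)))
          for j, s in enumerate(scales):
              segs = len(profile) // s
              f2 = np.empty(segs)                         # per-segment variance
              t = np.arange(s)
              for v in range(segs):
                  seg = profile[v * s:(v + 1) * s]
                  trend = np.polyval(np.polyfit(t, seg, order), t)
                  f2[v] = np.mean((seg - trend) ** 2)
              for i, q in enumerate(qs):
                  if q == 0:                              # logarithmic average
                      Fq[i, j] = np.exp(0.5 * np.mean(np.log(f2)))
                  else:
                      Fq[i, j] = np.mean(f2 ** (q / 2.0)) ** (1.0 / q)
          return Fq

      x = np.random.standard_normal(10000)                # monofractal test signal
      scales = np.array([16, 32, 64, 128, 256])
      qs = np.array([-2.0, 0.0, 2.0])
      Fq = mfdfa(x, scales, qs)
      # Generalized Hurst exponents: slope of log Fq(s) vs log s for each q.
      hq = [np.polyfit(np.log(scales), np.log(Fq[i]), 1)[0] for i in range(len(qs))]
      print(np.round(hq, 2))                              # all near 0.5 here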

  10. Design and analysis of nuclear processes with the APROS

    International Nuclear Information System (INIS)

    Haenninen, M.; Puska, E.K.; Nystroem, P.

    1987-01-01

    APROS (Advanced Process Simulator) is the product being developed in the process simulators project of Imatran Voima Co. and the Technical Research Centre of Finland. The aim is to design and construct an efficient and easy to use computer simulation system for process and automation system design, evaluation, analysis, testing and training purposes. At the halfway point of this project, a working system exists with a large number of proven routines and models. However, a lot of development is still foreseen before the project is finished. This article gives an overview of the APROS in general and of the nuclear features in particular. The calculational capabilities of the system are presented with the help of one example. (orig.)

  11. Thermoreflectance spectroscopy—Analysis of thermal processes in semiconductor lasers

    Science.gov (United States)

    Pierścińska, D.

    2018-01-01

    This review focuses on the theoretical foundations, experimental implementation and an overview of experimental results of thermoreflectance spectroscopy as a powerful technique for temperature monitoring and analysis of thermal processes in semiconductor lasers. This is an optical, non-contact, high spatial resolution technique providing high temperature resolution and mapping capabilities. Thermoreflectance is a thermometric technique based on measuring the relative change of reflectivity of the laser facet surface, which provides thermal images useful in hot spot detection and reliability studies. In this paper, the principles and experimental implementation of the technique as a thermography tool are discussed. Some exemplary applications of TR to various types of lasers are presented, showing that the thermoreflectance technique provides new insight into heat management problems in semiconductor lasers and, in particular, that it allows studying thermal degradation processes occurring at laser facets. Additionally, thermal processes and basic mechanisms of degradation of the semiconductor laser are discussed.
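
    For context (standard background, not specific to this review), the technique rests on the linear relation between the relative reflectivity change and the facet temperature rise,

      \frac{\Delta R}{R} \;=\; \frac{1}{R}\frac{\partial R}{\partial T}\,\Delta T \;=\; \kappa\,\Delta T,

    where the thermoreflectance coefficient \kappa is typically of order 10^{-6}-10^{-4} K^{-1} for semiconductors and must be calibrated per material and probe wavelength.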

  12. Parallel factor analysis PARAFAC of process affected water

    Energy Technology Data Exchange (ETDEWEB)

    Ewanchuk, A.M.; Ulrich, A.C.; Sego, D. [Alberta Univ., Edmonton, AB (Canada). Dept. of Civil and Environmental Engineering; Alostaz, M. [Thurber Engineering Ltd., Calgary, AB (Canada)

    2010-07-01

    A parallel factor analysis (PARAFAC) of oil sands process-affected water was presented. Naphthenic acids (NA) are traditionally described as monobasic carboxylic acids. Research has indicated that oil sands NA do not fit classical definitions of NA. Oil sands organic acids have toxic and corrosive properties. When analyzed by fluorescence technology, oil sands process-affected water displays a characteristic peak at 290 nm excitation and approximately 346 nm emission. In this study, a parallel factor analysis (PARAFAC) was used to decompose process-affected water multi-way data into components representing analytes, chemical compounds, and groups of compounds. Water samples from various oil sands operations were analyzed in order to obtain excitation-emission matrices (EEMs). The EEMs were then arranged into a large matrix, in decreasing order of process-affected water content, for PARAFAC. The data were decomposed into 5 components. A comparison with commercially prepared NA samples suggested that oil sands NA are fundamentally different. Further research is needed to determine what each of the 5 components represents. tabs., figs.
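
    A minimal sketch of such a decomposition (ours, using the tensorly library on synthetic data; the 12x40x60 cube stands in for a stack of measured EEMs):

      import numpy as np
      import tensorly as tl
      from tensorly.decomposition import parafac

      rng = np.random.default_rng(0)
      eem = tl.tensor(rng.random((12, 40, 60)))   # samples x excitation x emission

      # Decompose into 5 trilinear components, matching the study above.
      weights, factors = parafac(eem, rank=5, n_iter_max=200)
      scores, excitation, emission = factors      # loadings per mode
      print(scores.shape, excitation.shape, emission.shape)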

  13. Process Equipment Failure Mode Analysis in a Chemical Industry

    Directory of Open Access Journals (Sweden)

    J. Nasl Seraji

    2008-04-01

    Background and aims: Prevention of potential accidents and safety promotion in chemical processes requires systematic safety management. The main objective of this study was the analysis of important process equipment failure modes and their effects in the process of H2S and CO2 isolation from extracted natural gas. Methods: This study was done in the sweetening unit of an Iranian gas refinery. Failure Mode and Effect Analysis (FMEA) was used for the identification of process equipment failures. Results: In total, 30 failures were identified and evaluated using FMEA. Breaking of the P-1 blower's blade and tight moving of the sour gas pressure control valve bearing had the maximum risk priority numbers (RPN); P-1 body corrosion and an increasing plug lower side angle of the rich DEA level control valve in tower 1 had the minimum calculated RPNs. Conclusion: By providing a reliable documentation system for recording equipment failures and incidents, basic information for later safety assessments can be maintained. Also, the probability of failures and their effects can be minimized by conducting preventive maintenance.
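
    For reference, the risk priority number used in FMEA is simply the product of three standard ratings; a minimal sketch with hypothetical ratings (not the study's actual scores):

      def rpn(severity, occurrence, detection):
          # Each factor is typically rated on a 1-10 scale.
          return severity * occurrence * detection

      # Hypothetical ratings for one failure mode, e.g. a blower blade breaking:
      print(rpn(severity=9, occurrence=4, detection=5))   # -> 180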

  14. ACTINIDE REMOVAL PROCESS SAMPLE ANALYSIS, CHEMICAL MODELING, AND FILTRATION EVALUATION

    Energy Technology Data Exchange (ETDEWEB)

    Martino, C.; Herman, D.; Pike, J.; Peters, T.

    2014-06-05

    Filtration within the Actinide Removal Process (ARP) currently limits the throughput in interim salt processing at the Savannah River Site. In this process, batches of salt solution with Monosodium Titanate (MST) sorbent are concentrated by crossflow filtration. The filtrate is subsequently processed to remove cesium in the Modular Caustic Side Solvent Extraction Unit (MCU) followed by disposal in saltstone grout. The concentrated MST slurry is washed and sent to the Defense Waste Processing Facility (DWPF) for vitrification. During recent ARP processing, there has been a degradation of filter performance manifested as the inability to maintain high filtrate flux throughout a multi-batch cycle. The objectives of this effort were to characterize the feed streams, to determine if solids (in addition to MST) are precipitating and causing the degraded performance of the filters, and to assess the particle size and rheological data to address potential filtration impacts. Equilibrium modelling with OLI Analyzer™ and OLI ESP™ was performed to determine chemical components at risk of precipitation and to simulate the ARP process. The performance of ARP filtration was evaluated to review potential causes of the observed filter behavior. Task activities for this study included extensive physical and chemical analysis of samples from the Late Wash Pump Tank (LWPT) and the Late Wash Hold Tank (LWHT) within ARP as well as samples of the tank farm feed from Tank 49H. The samples from the LWPT and LWHT were obtained from several stages of processing of Salt Batch 6D, Cycle 6, Batch 16.

  15. Accessibility analysis in manufacturing processes using visibility cones

    Institute of Scientific and Technical Information of China (English)

    尹周平; 丁汉; 熊有伦

    2002-01-01

    Accessibility is an important design feature of products, and accessibility analysis has been acknowledged as a powerful tool for solving computational manufacturing problems arising from different manufacturing processes. After exploring the relations among approachability, accessibility and visibility, a general method for accessibility analysis using visibility cones (VC) is proposed. With the definition of the VC of a point, three kinds of visibility of a feature, namely the complete visibility cone (CVC), the partial visibility cone (PVC) and the local visibility cone (LVC), are defined. A novel approach to computing VCs is formulated by identifying C-obstacles in the C-space, for which a general and efficient algorithm is proposed and implemented by making use of visibility culling. Lastly, we discuss briefly how to realize accessibility analysis in numerically controlled (NC) machining planning, coordinate measuring machine (CMM) inspection planning and assembly sequence planning with the proposed methods.

  16. Analysis of briquetting process of sewage sludge with coal to combustion process

    Directory of Open Access Journals (Sweden)

    Kosturkiewicz Bogdan

    2016-01-01

    Energy recovery from sewage sludge can be achieved by several thermal technologies, but before those processes sewage sludge requires special pretreatment. The paper presents an investigation of sewage sludge with coal briquettes as a fuel for the combustion process. Research is conducted at the Department of Manufacturing Systems and the Department of Thermal Engineering and Environmental Protection, AGH University of Science and Technology, to develop a technology of briquette preparation. The obtained results showed the possibility of briquetting municipal sewage sludge with coal in roll presses equipped with an asymmetric thickening gravity feed system. The following properties were determined for the obtained briquettes: density, drop strength and compressive strength. Based on physical and chemical analysis of the prepared briquettes it was confirmed that the briquettes have good fuel properties for the combustion process. The thermal behaviour of the studied sewage sludge and the prepared mixture was investigated by thermogravimetric analysis (TG). For the TG analysis the samples were heated in an alumina crucible from ambient temperature up to 1000 °C at constant rates of 10 °C/min, 40 °C/min and 100 °C/min in a 40 ml/min flow of air.

  17. Estimation of CO2 emission for each process in the Japanese steel industry: a process analysis

    International Nuclear Information System (INIS)

    Sakamoto, Y.; Tonooka, Y.

    2000-01-01

    The CO2 emission for each process in the Japanese steel industry is estimated by a process analysis using statistical data in order to evaluate the possibility of reducing CO2 emissions. The emission factor of CO2 for each product, and also for crude steel produced via an integrated steel plant route and via an electric arc furnace route, is estimated and compared. The CO2 emissions can be estimated from production amounts of products for each process and for crude steel. The CO2 emission of blast furnaces is the largest and that of rolling and piping follows. The emission factor of CO2 of crude steel produced via an integrated steel plant route is approximately 3.8 times as high as that produced via an electric arc furnace route. (Author)

  18. Analysis of business process maturity and organisational performance relations

    Directory of Open Access Journals (Sweden)

    Kalinowski T. Bartosz

    2016-12-01

    The paper aims to present the results of a study on business process maturity in relation to organisational performance. A two-phase methodology, based on a literature review and a survey, was used. The literature is a source of knowledge about business process maturity and organisational performance, whereas the research on process maturity vs organisational performance in Polish enterprises provides findings based on 84 surveyed companies. The main areas of the research covered: identification and analysis of maturity related variables, and identification of organisational performance perspectives and their relation to process maturity. The study shows that there is a significant positive relation between process maturity and organisational performance. Although studies on such a relation are available, they are scarce and have some significant limitations in terms of research sample or the scope of maturity or organisational performance covered. This publication is part of a project funded by the National Science Centre awarded by decision number DEC-2011/01/D/HS4/04070.

  19. Cross-Sectional Analysis of Longitudinal Mediation Processes.

    Science.gov (United States)

    O'Laughlin, Kristine D; Martin, Monica J; Ferrer, Emilio

    2018-01-01

    Statistical mediation analysis can help to identify and explain the mechanisms behind psychological processes. Examining a set of variables for mediation effects is a ubiquitous process in the social sciences literature; however, despite evidence suggesting that cross-sectional data can misrepresent the mediation of longitudinal processes, cross-sectional analyses continue to be used in this manner. Alternative longitudinal mediation models, including those rooted in a structural equation modeling framework (cross-lagged panel, latent growth curve, and latent difference score models) are currently available and may provide a better representation of mediation processes for longitudinal data. The purpose of this paper is twofold: first, we provide a comparison of cross-sectional and longitudinal mediation models; second, we advocate using models to evaluate mediation effects that capture the temporal sequence of the process under study. Two separate empirical examples are presented to illustrate differences in the conclusions drawn from cross-sectional and longitudinal mediation analyses. Findings from these examples yielded substantial differences in interpretations between the cross-sectional and longitudinal mediation models considered here. Based on these observations, researchers should use caution when attempting to use cross-sectional data in place of longitudinal data for mediation analyses.

  20. Economic analysis of novel synergistic biofuel (H2Bioil) processes

    International Nuclear Information System (INIS)

    Singh, Navneet R.; Mallapragada, Dharik S.; Agrawal, Rakesh; Tyner, Wallace E.

    2012-01-01

    Fast-pyrolysis based processes can be built on small-scale and have higher process carbon and energy efficiency as compared to other options. H2Bioil is a novel process based on biomass fast-hydropyrolysis and subsequent hydrodeoxygenation (HDO) and can potentially provide high yields of high energy density liquid fuel at relatively low hydrogen consumption. This paper contains a comprehensive financial analysis of the H2Bioil process with hydrogen derived from different sources. Three different carbon tax scenarios are analyzed: no carbon tax, $55/metric ton carbon tax and $110/metric ton carbon tax. The break-even crude oil price for a delivered biomass cost of $94/metric ton when hydrogen is derived from coal, natural gas or nuclear energy ranges from $103 to $116/bbl for no carbon tax and even lower ($99-$111/bbl) for the carbon tax scenarios. This break-even crude oil price compares favorably with the literature estimated prices of fuels from alternate biochemical and thermochemical routes. The impact of the chosen carbon tax is found to be limited relative to the impact of the H2 source on the H2Bioil break-even price. The economic robustness of the processes for hydrogen derived from coal, natural gas, or nuclear energy is seen by an estimated break-even crude oil price of $114-$126/bbl when biomass cost is increased to $121/metric ton. (orig.)

  1. Methodology to analysis of aging processes of containment spray system

    International Nuclear Information System (INIS)

    Borges, D. da Silva; Lava, D.D.; Moreira, M. de L.; Ferreira Guimarães, A.C.; Fernandes da Silva, L.

    2015-01-01

    This paper presents a contribution to the study of the aging process of components in commercial Pressurized Water Reactor (PWR) plants. The motivation for this work emerged from the current nuclear perspective: numerous nuclear power plants worldwide are at an advanced operating age. This situation requires a process to ensure the reliability of the operative systems of these plants, and therefore methodologies capable of estimating the failure probability of components and systems are necessary. In addition to the safety factors involved, such methodologies can be used to search for ways to extend the life cycle of nuclear plants, which would otherwise undergo decommissioning after an operating time of 40 years. Decommissioning negatively affects power generation and demands an enormous investment. Thus, this paper presents modeling techniques and sensitivity analysis which, together, can generate an estimate of how components that are more sensitive to the aging process will behave during the normal operation cycle of a nuclear power plant. (authors)

  2. Thermodynamics and process analysis for future economic scenarios

    International Nuclear Information System (INIS)

    Ayres, R.U.

    1995-01-01

    Economists are increasingly interested in forecasting future costs and benefits of policies for dealing with materials/energy fluxes, polluting emissions and environmental impacts on various scales, from sectoral to global. Computable general equilibrium (CGE) models are currently popular because they project demand and industrial structure into the future, along an equilibrium path. But they are applicable only to the extent that structural changes occur in or near equilibrium, independent of radical technological (or social) change. The alternative tool for analyzing economic implications of scenario assumptions is to use Leontief-type Input-Output (I-O) models. I-O models are unable to endogenize structural shifts (changing I-O coefficients). However, this can be a virtue when considering radical rather than incremental shifts. Postulated I-O tables can be used independently to check the internal consistency of scenarios. Or I-O models can be used to generate scenarios by linking them to econometric 'macro-drivers' (which can, in principle, be CGE models). Explicit process analysis can be integrated, in principle, with I-O models. This hybrid scheme provides a natural means of satisfying physical constraints, especially the first and second laws of thermodynamics. This is important to avoid constructing scenarios based on physically impossible processes. Process analysis is really the only available tool for constructing physically plausible alternative future I-O tables, and for generating materials/energy and waste emissions coefficients. Explicit process analysis also helps avoid several problems characteristic of 'pure' CGE or I-O models, viz. (1) aggregation errors, (2) inability to handle arbitrary combinations of co-product and co-input relationships, and (3) inability to reflect certain non-linearities such as internal feedback loops. 4 figs., 2 tabs., 38 refs
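
    The Leontief machinery referred to here reduces to a single linear solve; a minimal sketch with a hypothetical two-sector economy:

      import numpy as np

      # x = (I - A)^(-1) d: gross output x needed to satisfy final demand d,
      # given the matrix A of technical (input-output) coefficients.
      A = np.array([[0.2, 0.3],
                    [0.1, 0.4]])          # hypothetical coefficients
      d = np.array([100.0, 50.0])         # hypothetical final demand

      x = np.linalg.solve(np.eye(2) - A, d)
      print(x)                            # gross sectoral outputs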

  3. Image processing analysis of traditional Gestalt vision experiments

    Science.gov (United States)

    McCann, John J.

    2002-06-01

    In the late 19th century, Gestalt psychology rebelled against the popular new science of psychophysics. The Gestalt revolution used many fascinating visual examples to illustrate that the whole is greater than the sum of all the parts. Color constancy was an important example. The physical interpretation of sensations and their quantification by JNDs and Weber fractions were met with innumerable examples in which two 'identical' physical stimuli did not look the same. The fact that large changes in the color of the illumination failed to change color appearance in real scenes demanded something more than quantifying the psychophysical response of a single pixel. The debate continues today with proponents of both physical, pixel-based colorimetry and perceptual, image-based cognitive interpretations. Modern instrumentation has made colorimetric pixel measurement universal. As well, new examples of unconscious inference continue to be reported in the literature. Image processing provides a new way of analyzing familiar Gestalt displays. Since the pioneering experiments by Fergus Campbell and Land, we know that human vision has independent spatial channels and independent color channels. Color matching data from color constancy experiments agree with spatial comparison analysis. In this analysis, simple spatial processes can explain the different appearances of 'identical' stimuli by analyzing the multiresolution spatial properties of their surrounds. Benary's Cross, White's Effect, the Checkerboard Illusion and the Dungeon Illusion can all be understood by the analysis of their low-spatial-frequency components. Just as with color constancy, these Gestalt images are most simply described by the analysis of spatial components. Simple spatial mechanisms account for the appearance of 'identical' stimuli in complex scenes. It does not require complex, cognitive processes to calculate appearances in familiar Gestalt experiments.

  4. Probabilistic sensitivity analysis of system availability using Gaussian processes

    International Nuclear Information System (INIS)

    Daneshkhah, Alireza; Bedford, Tim

    2013-01-01

    The availability of a system under a given failure/repair process is a function of time which can be determined through a set of integral equations and usually calculated numerically. We focus here on the issue of carrying out sensitivity analysis of availability to determine the influence of the input parameters. The main purpose is to study the sensitivity of the system availability with respect to the changes in the main parameters. In the simplest case that the failure repair process is (continuous time/discrete state) Markovian, explicit formulae are well known. Unfortunately, in more general cases availability is often a complicated function of the parameters without closed form solution. Thus, the computation of sensitivity measures would be time-consuming or even infeasible. In this paper, we show how Sobol and other related sensitivity measures can be cheaply computed to measure how changes in the model inputs (failure/repair times) influence the outputs (availability measure). We use a Bayesian framework, called the Bayesian analysis of computer code output (BACCO) which is based on using the Gaussian process as an emulator (i.e., an approximation) of complex models/functions. This approach allows effective sensitivity analysis to be achieved by using far smaller numbers of model runs than other methods. The emulator-based sensitivity measure is used to examine the influence of the failure and repair densities' parameters on the system availability. We discuss how to apply the methods practically in the reliability context, considering in particular the selection of parameters and prior distributions and how we can ensure these may be considered independent—one of the key assumptions of the Sobol approach. The method is illustrated on several examples, and we discuss the further implications of the technique for reliability and maintenance analysis
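
    A minimal sketch of the emulator idea (ours, not BACCO itself): fit a Gaussian-process emulator on a few runs of an expensive availability model, then compute Sobol indices on the cheap emulator. It assumes the scikit-learn and SALib packages; the toy model and parameter names are hypothetical.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from SALib.sample import saltelli
      from SALib.analyze import sobol

      def expensive_model(X):
          # Stand-in for a costly availability computation.
          return np.sin(X[:, 0]) + 0.3 * X[:, 1] ** 2

      problem = {"num_vars": 2,
                 "names": ["failure_rate", "repair_rate"],
                 "bounds": [[0.0, 1.0], [0.0, 1.0]]}

      rng = np.random.default_rng(1)
      X_train = rng.random((40, 2))               # only 40 expensive runs
      gp = GaussianProcessRegressor().fit(X_train, expensive_model(X_train))

      X_big = saltelli.sample(problem, 1024)      # large design, emulator only
      Si = sobol.analyze(problem, gp.predict(X_big))
      print(Si["S1"], Si["ST"])                   # first-order and total indices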

  5. Computer aided analysis, simulation and optimisation of thermal sterilisation processes.

    Science.gov (United States)

    Narayanan, C M; Banerjee, Arindam

    2013-04-01

    Although thermal sterilisation is a widely employed industrial process, little work, including patents, is reported in the available literature on the mathematical analysis and simulation of these processes. In the present work, software packages have been developed for computer aided optimum design of thermal sterilisation processes. Systems involving steam sparging, jacketed heating/cooling, helical coils submerged in agitated vessels, and systems that employ external heat exchangers (double pipe, shell and tube, and plate exchangers) have been considered. Both batch and continuous operations have been analysed and simulated. The dependence of the del factor on system/operating parameters such as the mass or volume of substrate to be sterilised per batch, speed of agitation, helix diameter, substrate to steam ratio, rate of substrate circulation through the heat exchanger and that through the holding tube has been analysed separately for each mode of sterilisation. Axial dispersion in the holding tube has also been adequately accounted for through an appropriately defined axial dispersion coefficient. The effect of exchanger characteristics/specifications on system performance has also been analysed. The multiparameter computer aided design (CAD) software packages prepared are thus highly versatile in nature and permit the optimum choice of operating variables for the processes selected. The computed results have been compared with extensive data collected from a number of industries (distilleries, food processing and pharmaceutical industries) and pilot plants, and satisfactory agreement has been observed between the two, thereby ascertaining the accuracy of the CAD software developed. No simplifying assumptions have been made during the analysis, and the design of the associated heating/cooling equipment has been performed using the most up-to-date design correlations and computer software.
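
    As an illustration of the central quantity, a minimal sketch (ours) of the del factor, ∇ = ln(N0/N), obtained by integrating an Arrhenius death rate over a hypothetical batch heating profile; the kinetic constants below are merely of a typical spore-like magnitude, not values from the paper.

      import numpy as np

      A = 1.0e36    # frequency factor, 1/s (illustrative magnitude)
      E = 2.9e5     # activation energy, J/mol (illustrative)
      R = 8.314     # gas constant, J/(mol K)

      t = np.linspace(0.0, 1800.0, 2000)            # 30 min batch cycle, s
      T = 300.0 + 100.0 * (1.0 - np.exp(-t / 600))  # hypothetical heating curve, K

      k = A * np.exp(-E / (R * T))                  # specific death rate, 1/s
      del_factor = np.trapz(k, t)                   # ln(N0/N) achieved by the cycle
      print(del_factor)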

  6. Data Processing and Analysis Systems for JT-60U

    International Nuclear Information System (INIS)

    Matsuda, T.; Totsuka, T.; Tsugita, T.; Oshima, T.; Sakata, S.; Sato, M.; Iwasaki, K.

    2002-01-01

    The JT-60U data processing system is a large computer complex gradually modernized by utilizing progressive computer and network technology. A main computer using state-of-the-art CMOS technology can handle ∼550 MB of data per discharge. A gigabit ethernet switch with FDDI ports has been introduced to cope with the increase in data volume. Workstation systems with VMEbus serial highway drivers for CAMAC have been developed and used to replace many minicomputer systems. VMEbus-based fast data acquisition systems have also been developed to enlarge and replace a minicomputer system for mass data. The JT-60U data analysis system is composed of a JT-60U database server and a JT-60U analysis server, which are distributed UNIX servers. The experimental database is stored on the 1 TB RAID disk of the JT-60U database server and is composed of the ZENKEI and diagnostic databases. Various data analysis tools are available on the JT-60U analysis server. For remote collaboration, technical features of the data analysis system have been applied to the computer system to access JT-60U data via the Internet. Remote participation in JT-60U experiments has been successfully conducted since 1996.

  7. Analysis of electrochemical disintegration process of graphite matrix

    International Nuclear Information System (INIS)

    Tian Lifang; Wen Mingfen; Chen Jing

    2010-01-01

    The electrochemical method with ammonium nitrate as electrolyte was studied to disintegrate the graphite matrix from simulated fuel elements for the high temperature gas-cooled reactor. The influences of process parameters, including salt concentration, system temperature and current density, on the disintegration rate of graphite fragments were investigated in the present work. The experimental results showed that the disintegration rate depended only slightly on the temperature and salt concentration, whereas the current density strongly affected it. Furthermore, the content of introduced oxygen in the final graphite fragments was independent of the current density and the concentration of electrolyte. Moreover, the structural evolution of graphite was analyzed based on the microstructural parameters determined by X-ray diffraction profile fitting analysis using MAUD (material analysis using diffraction) before and after the disintegration process. It may safely be concluded that the graphite disintegration can be ascribed to the intercalation of foreign molecules between crystal planes and the partial oxidation involved. The disintegration process is thus composed of an intercalation part and a further carbon oxidation part, which act together to cause the collapse of the graphite crystals.

  8. Process integration and pinch analysis in sugarcane industry

    Energy Technology Data Exchange (ETDEWEB)

    Prado, Adelk de Carvalho; Pinheiro, Ricardo Brant [UFMG, Departamento de Engenharia Nuclear, Programa de Pos-Graduacao em Ciencias e Tecnicas Nucleares, Belo Horizonte, MG (Brazil)], E-mail: rbp@nuclear.ufmg.br

    2010-07-01

    Process integration techniques were applied, particularly through the Pinch Analysis method, to the sugarcane industry. Research was performed upon harvest data from an agroindustrial complex which processes in excess of 3.5 million metric tons of sugarcane per year, producing motor fuel grade ethanol, standard quality sugar, and delivering excess electric power to the grid. Pinch Analysis was used in assessing internal heat recovery as well as external utility demand targets, while keeping the lowest economically achievable targets for entropy increase. Efficiency in the use of energy was evaluated for the plant as it was found (the base case) as well as for five selected process and/or plant design modifications, always with guidance from the method. The first alternative design (case 2) was proposed to evaluate equipment mean idle time in the base case, to support subsequent comparisons. Cases 3 and 4 were used to estimate the upper limits of combined heat and power generation while the raw material supply of the base case is kept; neither case proved worth implementing. Cases 5 and 6 were devised to deal with the bottleneck of the plant, namely boiler capacity, in order to allow for some production increment. The inexpensive, minor modifications considered in case 5 were found unable to produce a reasonable gain. Nevertheless, proper changes in the cane juice evaporation section (case 6) could allow combined sugar and ethanol production to rise by up to 9.1% relative to the base case, without dropping cogenerated power. (author)

  9. Sensorial analysis of peanuts processed by e-beam

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Priscila V.; Furgeri, Camilo; Salum, Debora C.; Rogovschi, Vladimir D.; Villavicencio, Anna Lucia C.H. [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)]. E-mail: villavic@ipen.br

    2007-07-01

    The development of sensorial analysis was influenced by frequent changes in the technology of production and distribution of foods. Currently sensorial analysis plays a decisive part in some sectors of the food industry with the purpose of improving the quality of its products. Food irradiation aims to improve product quality by eliminating the diverse microorganisms that can spoil food. The irradiation process at the recommended doses causes very few chemical alterations in foods, the nutritional losses are considered insignificant, and none of the known alterations found in irradiated foods is harmful or dangerous. The present study evaluated the sensorial characteristics of peanuts processed by an electron beam machine, using an acceptance test with a hedonic scale. Samples of peanuts were processed at doses of 0, 5 and 7 kGy. Thirty volunteer panelists participated in the acceptance study. The evaluated parameters were: appearance, odor and flavor. The results showed that the consumers accepted the peanuts irradiated at 5 and 7 kGy, with no significant difference between the control and irradiated samples. (author)

  10. Sensorial analysis of peanuts processed by e-beam

    International Nuclear Information System (INIS)

    Silva, Priscila V.; Furgeri, Camilo; Salum, Debora C.; Rogovschi, Vladimir D.; Villavicencio, Anna Lucia C.H.

    2007-01-01

    The development of sensorial analysis was influenced by frequent changes in the technology of production and distribution of foods. Currently sensorial analysis plays a decisive part in some sectors of the food industry with the purpose of improving the quality of its products. Food irradiation aims to improve product quality by eliminating the diverse microorganisms that can spoil food. The irradiation process at the recommended doses causes very few chemical alterations in foods, the nutritional losses are considered insignificant, and none of the known alterations found in irradiated foods is harmful or dangerous. The present study evaluated the sensorial characteristics of peanuts processed by an electron beam machine, using an acceptance test with a hedonic scale. Samples of peanuts were processed at doses of 0, 5 and 7 kGy. Thirty volunteer panelists participated in the acceptance study. The evaluated parameters were: appearance, odor and flavor. The results showed that the consumers accepted the peanuts irradiated at 5 and 7 kGy, with no significant difference between the control and irradiated samples. (author)

  11. Process Measurement Deviation Analysis for Flow Rate due to Miscalibration

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Eunsuk; Kim, Byung Rae; Jeong, Seog Hwan; Choi, Ji Hye; Shin, Yong Chul; Yun, Jae Hee [KEPCO Engineering and Construction Co., Deajeon (Korea, Republic of)

    2016-10-15

    An analysis was initiated to identify the root cause, and the omission of the high static line pressure correction for differential pressure (DP) transmitters was identified as one of the major deviation factors; the miscalibrated DP transmitter range was identified as another. This paper presents considerations to be incorporated in the calibration of process flow measurement instrumentation. The analysis identified that the DP flow transmitter electrical output decreased by 3%, after which the flow rate indication decreased by 1.9%, resulting from the omitted high static line pressure correction and the measurement range miscalibration. After re-calibration, the flow rate indication increased by 1.9%, which is consistent with the analysis result. This paper presents the brief calibration procedure for the Rosemount DP flow transmitter and analyzes three possible cases of measurement deviation, including error and cause. Generally, a DP transmitter is required to be calibrated with a precise process input range according to the calibration procedure provided for the specific DP transmitter. Especially in the case of a DP transmitter installed at high static line pressure, it is important to correct for the high static line pressure effect to avoid the inherent systematic error of the Rosemount DP transmitter. Otherwise, failure to apply the correction may lead to the indication deviating from the actual value.
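
    As general background (our note, not taken from the paper), the reason a DP-side error maps to a roughly halved relative error in indicated flow is the square-root extraction used in DP flow measurement:

      Q = C\sqrt{\Delta P} \;\Longrightarrow\; \frac{\delta Q}{Q} \approx \frac{1}{2}\,\frac{\delta(\Delta P)}{\Delta P},

    where Q is the volumetric flow, \Delta P the measured differential pressure and C a calibration constant; the symbols are ours, and the approximation holds for small relative errors in the DP reading.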

  12. Integrating Process Mining and Cognitive Analysis to Study EHR Workflow.

    Science.gov (United States)

    Furniss, Stephanie K; Burton, Matthew M; Grando, Adela; Larson, David W; Kaufman, David R

    2016-01-01

    There are numerous methods to study workflow. However, few produce the kinds of in-depth analyses needed to understand EHR-mediated workflow. Here we investigated variations in clinicians' EHR workflow by integrating quantitative analysis of patterns of users' EHR-interactions with in-depth qualitative analysis of user performance. We characterized 6 clinicians' patterns of information-gathering using a sequential process-mining approach. The analysis revealed 519 different screen transition patterns performed across 1569 patient cases. No one pattern was followed for more than 10% of patient cases, the 15 most frequent patterns accounted for over half of patient cases (53%), and 27% of cases exhibited unique patterns. By triangulating quantitative and qualitative analyses, we found that participants' EHR-interactive behavior was associated with their routine processes, patient case complexity, and EHR default settings. The proposed approach has significant potential to inform resource allocation for observation and training. In-depth observations helped us to explain variation across users.
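
    A minimal sketch of the counting step in such a sequential analysis (ours, with a hypothetical interaction log): group the log by clinician and case, then tally the ordered screen-transition patterns.

      from collections import Counter

      # Hypothetical log: (clinician, case, screen) in chronological order.
      log = [("dr_a", 1, "results"), ("dr_a", 1, "notes"), ("dr_a", 1, "orders"),
             ("dr_a", 2, "results"), ("dr_a", 2, "notes"), ("dr_a", 2, "orders"),
             ("dr_b", 3, "notes"), ("dr_b", 3, "results")]

      sequences = {}
      for user, case, screen in log:
          sequences.setdefault((user, case), []).append(screen)

      patterns = Counter(tuple(seq) for seq in sequences.values())
      for pattern, n in patterns.most_common():
          print(n, " -> ".join(pattern))    # e.g. 2 results -> notes -> orders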

  13. Vulnerability analysis of process plants subject to domino effects

    International Nuclear Information System (INIS)

    Khakzad, Nima; Reniers, Genserik; Abbassi, Rouzbeh; Khan, Faisal

    2016-01-01

    In the context of domino effects, vulnerability analysis of chemical and process plants aims to identify and protect installations which are relatively more susceptible to damage and thus contribute more to the initiation or propagation of domino effects. In the present study, we have developed a methodology based on graph theory for domino vulnerability analysis of hazardous installations within process plants, where owing to the large number of installations or complex interdependencies, the application of sophisticated reasoning approaches such as Bayesian network is limited. We have taken advantage of a hypothetical chemical storage plant to develop the methodology and validated the results using a dynamic Bayesian network approach. The efficacy and out-performance of the developed methodology have been demonstrated via a real-life complex case study. - Highlights: • Graph theory is a reliable tool for vulnerability analysis of chemical plants with respect to domino effects. • The all-closeness centrality score can be used to identify the most vulnerable installations. • For complex chemical plants, the methodology outperforms Bayesian network.
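
    A minimal sketch of the graph-theoretic scoring (ours, not the authors' exact metric) using networkx closeness centrality on a hypothetical escalation graph, where an edge u -> v means a fire or explosion at unit u can damage unit v:

      import networkx as nx

      G = nx.DiGraph()
      G.add_edges_from([("tank1", "tank2"), ("tank2", "tank3"),
                        ("tank2", "tank4"), ("tank3", "tank4")])

      # For directed graphs networkx computes closeness from incoming distances,
      # so a high score marks a unit easily reached by escalating events.
      scores = nx.closeness_centrality(G)
      print(sorted(scores.items(), key=lambda kv: -kv[1]))   # most exposed first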

  14. Biometric Attendance and Big Data Analysis for Optimizing Work Processes.

    Science.gov (United States)

    Verma, Neetu; Xavier, Teenu; Agrawal, Deepak

    2016-01-01

    Although biometric attendance management is available, large healthcare organizations have difficulty with big data analysis for the optimization of work processes. The aim of this project was to assess the implementation of a biometric attendance system and its utility following big data analysis. In this prospective study the implementation of a biometric system was evaluated over a 3-month period at our institution. Software integration with other existing systems for data analysis was also evaluated. Implementation of the biometric system was successfully completed over a two-month period with enrollment of 10,000 employees into the system. However, generating reports and taking action for this large number of staff was a challenge. For this purpose software was developed for capturing the duty roster of each employee, integrating it with the biometric system, and adding an SMS gateway. This helped in automating the process of sending an SMS to each employee who had not signed in. Standalone biometric systems have limited functionality in large organizations unless they are meshed with the employee duty roster.

  15. Trace analysis for 300 MM wafers and processes with TXRF

    International Nuclear Information System (INIS)

    Nutsch, A.; Erdmann, V.; Zielonka, G.; Pfitzner, L.; Ryssel, H.

    2000-01-01

    Efficient fabrication of semiconductor devices is linked with increasing silicon wafer size, while the contamination level of processes, media, and equipment has to decrease continuously. A new test laboratory for 300 mm wafers was installed in view of the above mentioned aspects. Aside from numerous processing tools, this platform comprises electrical test methods, particle detection, vapor phase decomposition (VPD) preparation, and TXRF. The equipment is installed in a cleanroom. It is common to perform process or equipment control, development, evaluation and qualification with monitor wafers. The evaluation and qualification of 300 mm equipment require direct TXRF on 300 mm wafers; a new TXRF setup was therefore installed for the 300 mm wafer size. The 300 mm TXRF is equipped with tungsten and molybdenum anodes. This combination allows sensitive detection of elements with fluorescence energies below 10 keV using tungsten excitation, while molybdenum excitation enables the detection of a wider variety of elements. The detection sensitivity with the tungsten anode is ten times higher than with the molybdenum anode. The system is calibrated with 1 ng Ni. This calibration shows stability within 5% when monitored to control system stability. Decreasing the amount of Ni results in a linear decrease of the measured Ni signal. This result is verified for a range of elements by multielement samples. New designs demand new processes and materials, e.g. ferroelectric layers and copper. The trace analysis of many of these materials is supported by the higher excitation energy of the molybdenum anode. Reclaim and recycling of 300 mm wafers demand accurate contamination control of the processes to avoid cross contamination. Polishing or etching results in modified surfaces. TXRF as a non-destructive test method allows the simultaneous detection of a variety of elements on differing surfaces in view of contamination control and process

  16. Do compensation processes impair mental health? A meta-analysis.

    Science.gov (United States)

    Elbers, Nieke A; Hulst, Liesbeth; Cuijpers, Pim; Akkermans, Arno J; Bruinvels, David J

    2013-05-01

    Victims who are involved in a compensation process generally have more health complaints compared to victims who are not involved in a compensation process. Previous research regarding the effect of compensation processes has concentrated on the effect on physical health. This meta-analysis focuses on the effect of compensation processes on mental health. Prospective cohort studies addressing compensation and mental health after traffic accidents, occupational accidents or medical errors were identified using PubMed, EMBASE, PsycInfo, CINAHL, and the Cochrane Library. Relevant studies published between January 1966 and 10 June 2011 were selected for inclusion. Ten studies were included. The first finding was that the compensation group already had more mental health complaints at baseline compared to the non-compensation group (standardised mean difference (SMD)=-0.38; 95% confidence interval (CI) -0.66 to -0.10; p=.01). The second finding was that mental health between baseline and post measurement improved less in the compensation group compared to the non-compensation group (SMD=-0.35; 95% CI -0.70 to -0.01; p=.05). However, the quality of evidence was limited, mainly because of low quality study design and heterogeneity. Being involved in a compensation process is associated with more mental health complaints, but three-quarters of the difference appeared to be already present at baseline. The findings of this study should be interpreted with caution because of the limited quality of evidence. The difference at baseline may be explained by a selection bias or more anger and blame about the accident in the compensation group. The difference between baseline and follow-up may be explained by secondary gain and secondary victimisation. Future research should involve assessment of exposure to compensation processes, should analyse and correct for baseline differences, and could examine the effect of time, compensation scheme design, and claim settlement on

  17. Contact traction analysis for profile change during coining process

    International Nuclear Information System (INIS)

    Kim, Hyung Kyu; Yoon, Kyung Ho; Kang, Heung Seok; Song, Kee Nam

    2002-01-01

    Contact tractions are analysed in the case of the change in contact profile occurring during the coining process of a thin strip material. The changed profile is assumed to be a concave circular arc in the central part of the contact region, smoothly connected with convex circular arcs at both sides, following actual measurements of the coined material. The profile is discretized and the known solutions of singular integral equations are used. Since the contact profile affects the contact traction and the relevant tribological behaviour (e.g. wear) as well, an accurate definition of the profile is necessary in the analysis of material failure. A parametric study is conducted with variation of the radii and distance of the arcs, which defines the height difference between the summits of the arcs. Also considered is the contact profile that gives negligible variation of the traction in comparison with that before the coining process.

  18. Numerical analysis of stress fields generated by quenching process

    Directory of Open Access Journals (Sweden)

    A. Bokota

    2011-04-01

    The numerical models of tool steel hardening processes presented in this work take into account mechanical phenomena generated by thermal phenomena and phase transformations. In the model of mechanical phenomena, apart from thermal, plastic and structural strain, transformation plasticity was also taken into account. The stress and strain fields are obtained by a Finite Element Method solution of the equilibrium equations in rate form. The thermophysical constants occurring in the constitutive relation depend on temperature and phase composition. For the determination of plastic strain the Huber-Mises condition with isotropic strengthening was applied, whereas for the determination of transformation plasticity a modified Leblond model was used. In order to evaluate the quality and usefulness of the presented models, a numerical analysis of stresses and strains associated with the hardening process of a cone-shaped lathe fang made of tool steel was carried out.

  19. Boiling process modelling peculiarities analysis of the vacuum boiler

    Science.gov (United States)

    Slobodina, E. N.; Mikhailov, A. G.

    2017-06-01

    An analysis of the development of low and medium powered boiler equipment was carried out, and possible development directions for boiler units aimed at improving energy efficiency were identified. Engineering arguments for applying vacuum boilers are presented. Heat-exchange processes in the vacuum boiler, where boiling water is the working body, are considered. A method of heat-exchange intensification under boiling at the maximum heat-transfer coefficient is examined. As a result of the calculation studies, curves of the heat-transfer coefficient as a function of pressure, calculated by analytical and numerical methodologies, were obtained. It was concluded that the numerical approach implemented in ANSYS CFX (the RPI boiling model) can be applied to describe the boiling process in the boiler vacuum volume.

  20. A 'Toolbox' Equivalent Process for Safety Analysis Software

    International Nuclear Information System (INIS)

    O'Kula, K.R.; Eng, Tony

    2004-01-01

    Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2002-1 (Quality Assurance for Safety-Related Software) identified a number of quality assurance issues on the use of software in Department of Energy (DOE) facilities for analyzing hazards, and designing and operating controls that prevent or mitigate potential accidents. The development and maintenance of a collection, or "toolbox", of multiple-site use, standard solution, Software Quality Assurance (SQA)-compliant safety software is one of the major improvements identified in the associated DOE Implementation Plan (IP). The DOE safety analysis toolbox will contain a set of appropriately quality-assured, configuration-controlled, safety analysis codes, recognized for DOE-broad, safety basis applications. Currently, six widely applied safety analysis computer codes have been designated for toolbox consideration. While the toolbox concept considerably reduces SQA burdens among DOE users of these codes, many users of unique, single-purpose, or single-site software may still have sufficient technical justification to continue use of their computer code of choice, but are thwarted by the multiple-site condition on toolbox candidate software. The process discussed here provides a roadmap for an equivalency argument, i.e., establishing satisfactory SQA credentials for single-site software that can be deemed "toolbox-equivalent". The process is based on the model established to meet IP Commitment 4.2.1.2: Establish SQA criteria for the safety analysis "toolbox" codes. Implementing criteria that establish the set of prescriptive SQA requirements are based on implementation plans/procedures from the Savannah River Site, also incorporating aspects of those from the Waste Isolation Pilot Plant (SNL component) and the Yucca Mountain Project. The major requirements are met with evidence of a software quality assurance plan, software requirements and design documentation, user's instructions, test report, a

  1. Thermal analysis on x-ray tube for exhaust process

    Science.gov (United States)

    Kumar, Rakesh; Rao Ratnala, Srinivas; Veeresh Kumar, G. B.; Shivakumar Gouda, P. S.

    2018-02-01

    It is of great importance in the use of X-rays for medical purposes that the dose given to both the patient and the operator is carefully controlled. There are many types of X-ray tubes used for different applications based on their capacity and power supply. In the present thesis the Maxiray 165 tube is analysed for the thermal exhaust process to within ±5% accuracy. The exhaust process is done to remove all air particles and to degas the insert under high vacuum at 2e-05 Torr. The tube glass is made of Pyrex; the target material is 95% tungsten and 5% rhenium, for which the melting point temperature is 3350 °C. Various materials are used for the various parts; during the operation of the X-ray tube, waste gases are released due to the high temperature, which in turn disturb the flow of electrons. Thus, before using the X-ray tube for practical applications it has to undergo the exhaust process. Initially we build the MX 165 model to carry out the thermal analysis, and then we simulate the bearing temperature profiles with the FE model to match the test results to within ±5% accuracy. Finally, the critical protocols required for manufacturing processes such as MF heating, E-beam, seasoning and FT are implemented.

  2. Liveness and Reachability Analysis of BPMN Process Models

    Directory of Open Access Journals (Sweden)

    Anass Rachdi

    2016-06-01

    Business processes are usually defined by business experts who require intuitive and informal graphical notations such as BPMN (Business Process Model and Notation) for documenting and communicating their organization's activities and behavior. However, BPMN has not been provided with a formal semantics, which limits the analysis of BPMN models to using solely informal techniques such as simulation. In order to address this limitation and use formal verification, it is necessary to define a certain "mapping" between BPMN and a formal language such as Communicating Sequential Processes (CSP) or Petri Nets (PN). This paper proposes a method for the verification of BPMN models by defining a formal semantics of BPMN in terms of a mapping to Time Petri Nets (TPN), which are equipped with very efficient analytical techniques. After the translation of BPMN models to TPN, verification is done to ensure that certain functional properties are satisfied by the model under investigation, namely liveness and reachability properties. The main advantage of our approach over existing ones is that it takes into account the time components in modeling business process models. An example is used throughout the paper to illustrate the proposed method.
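
    To illustrate what reachability checking means after such a mapping, here is a minimal sketch (ours) of breadth-first reachability analysis on a tiny time-abstracted Petri net encoding a toy start -> task -> end fragment; the net structure and names are hypothetical.

      from collections import deque

      # Each transition: (tokens consumed per place, tokens produced per place).
      transitions = {
          "start_task": ({"start": 1}, {"running": 1}),
          "end_task":   ({"running": 1}, {"done": 1}),
      }

      def fire(marking, pre, post):
          # Return the successor marking, or None if the transition is disabled.
          if any(marking.get(p, 0) < n for p, n in pre.items()):
              return None
          m = dict(marking)
          for p, n in pre.items():
              m[p] -= n
          for p, n in post.items():
              m[p] = m.get(p, 0) + n
          return m

      def reachable(initial):
          # Breadth-first exploration of the marking graph.
          seen, queue = set(), deque([initial])
          while queue:
              m = queue.popleft()
              key = frozenset(m.items())
              if key in seen:
                  continue
              seen.add(key)
              for pre, post in transitions.values():
                  nxt = fire(m, pre, post)
                  if nxt is not None:
                      queue.append(nxt)
          return seen

      # Reachability queries then reduce to lookups over the marking set,
      # e.g. "is a marking with a token in 'done' reachable?"
      marks = reachable({"start": 1})
      print(any(dict(m).get("done") for m in marks))   # -> True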

  3. Principal Component Analysis of Process Datasets with Missing Values

    Directory of Open Access Journals (Sweden)

    Kristen A. Severson

    2017-07-01

    Datasets with missing values arising from causes such as sensor failure, inconsistent sampling rates, and merging data from different systems are common in the process industry. Methods for handling missing data typically operate during data pre-processing, but can also be applied during model building. This article considers missing data within the context of principal component analysis (PCA), which is a method originally developed for complete data that has widespread industrial application in multivariate statistical process control. Due to the prevalence of missing data and the success of PCA for handling complete data, several PCA algorithms that can act on incomplete data have been proposed. Here, algorithms for applying PCA to datasets with missing values are reviewed. A case study is presented to demonstrate the performance of the algorithms, and suggestions are made with respect to choosing which algorithm is most appropriate for particular settings. An alternating algorithm based on the singular value decomposition achieved the best results in the majority of test cases involving process datasets.
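
    A minimal sketch (ours) of the alternating SVD idea described above: fill the missing cells, fit a rank-k PCA model, re-impute the missing cells from the model, and iterate to convergence.

      import numpy as np

      def pca_missing(X, k, n_iter=100, tol=1e-8):
          X = np.array(X, dtype=float)
          mask = np.isnan(X)
          mu = np.nanmean(X, axis=0)
          X[mask] = np.take(mu, np.where(mask)[1])   # start from column means
          prev = np.inf
          for _ in range(n_iter):
              mu = X.mean(axis=0)
              U, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
              recon = mu + (U[:, :k] * s[:k]) @ Vt[:k]   # rank-k reconstruction
              err = np.linalg.norm(X[mask] - recon[mask])
              X[mask] = recon[mask]                      # update only missing cells
              if abs(prev - err) < tol:
                  break
              prev = err
          return X, Vt[:k]   # completed data and principal axes

      rng = np.random.default_rng(0)
      data = rng.random((50, 6))
      data[rng.random(data.shape) < 0.1] = np.nan      # knock out ~10% of entries
      completed, components = pca_missing(data, k=2)
      print(components.shape)                          # (2, 6)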

  4. Transient flow analysis of integrated valve opening process

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Xinming; Qin, Benke; Bo, Hanliang, E-mail: bohl@tsinghua.edu.cn; Xu, Xingxing

    2017-03-15

    Highlights: • The control rod hydraulic driving system (CRHDS) is a new type of built-in control rod drive technology and the integrated valve (IV) is its key control component. • The transient flow experiment induced by the IV is conducted and the test results are analyzed to determine its working mechanism. • The theoretical model of the IV opening process is established and applied to obtain the changing rule of the transient flow characteristic parameters. - Abstract: The control rod hydraulic driving system (CRHDS) is a new type of built-in control rod drive technology and the integrated valve (IV) is its key control component. The working principle of the IV is analyzed and the IV hydraulic experiment is conducted. There is a transient flow phenomenon in the valve opening process. The theoretical model of the IV opening process is established from the loop system control equations and boundary conditions. The valve opening boundary condition equation is established based on the IV three-dimensional flow field analysis results and the dynamic analysis of the valve core movement. The model calculation results are in good agreement with the experimental results. On this basis, the model is used to analyze the transient flow under high-temperature conditions. The peak pressure head is consistent with the one at room temperature and the pressure fluctuation period is longer than the one at room temperature. Furthermore, the variation of the pressure transients with the fluid and loop structure parameters is analyzed. The peak pressure increases with the flow rate and decreases with increasing valve opening time. The pressure fluctuation period increases with the loop pipe length and the fluctuation amplitude remains largely unchanged under different equilibrium pressure conditions. These results lay the basis for the vibration reduction analysis of the CRHDS.
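
    The record reports the trends without the underlying relations; for a single rigid pipe, the classical water-hammer estimates below (the Joukowsky surge ΔP = ρ·a·Δv and the fundamental period T = 4L/a) show why the peak grows with the flow change and the period with pipe length. All numbers are assumed for illustration; the paper's loop model is more elaborate than this textbook sketch.

```python
# Textbook water-hammer estimates for a single pipe (not the CRHDS loop model).
rho = 1000.0    # water density, kg/m^3 (room temperature)
a = 1200.0      # pressure wave speed in the pipe, m/s (assumed)
L = 10.0        # pipe length, m (assumed)
dv = 0.5        # velocity change caused by the valve, m/s (assumed)

dP = rho * a * dv        # Joukowsky pressure surge, Pa
T = 4 * L / a            # fundamental water-hammer period, s

print(f"surge ~ {dP / 1e5:.1f} bar, period ~ {T * 1e3:.0f} ms")
# A longer loop (larger L) lengthens the period, and a larger flow change
# raises the peak -- consistent with the trends reported in the record.
```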

  5. Experimental software for modeling and interpreting educational data analysis processes

    Directory of Open Access Journals (Sweden)

    Natalya V. Zorina

    2017-12-01

    Full Text Available Problems, tasks and processes of educational data mining are considered in this article. The objective is to create a fundamentally new information system for the university using the results of educational data analysis. One of the functions of such a system is the extraction of knowledge from the data accumulated during its operation. The creation of a system of this type is an iterative and time-consuming process requiring preliminary studies and incremental prototyping of modules. Because such systems are novel, there is no established development methodology for them; for this reason a number of experiments were carried out in order to collect data, choose appropriate methods for the study, and interpret the results. As a result of the experiments, the authors identified the data sources available for analysis in the information environment of the home university. The data were taken from semester performance records obtained from the information system of the training department of the Institute of IT MTU MIREA, from the results of the independent work of students, and from specially designed Google forms. To automate the collection of information and the analysis of educational data, an experimental software package was created. As the development methodology for the experimental software complex, a decision was made to use the methodologies of rational-empirical complexes (REX) and single-experimentation program technologies (TPEI). The program implementation of the complex is described in detail, conclusions are given about the availability of the data sources used, and conclusions are drawn about the prospects for further development.

  6. Optimizing Endoscope Reprocessing Resources Via Process Flow Queuing Analysis.

    Science.gov (United States)

    Seelen, Mark T; Friend, Tynan H; Levine, Wilton C

    2018-05-04

    The Massachusetts General Hospital (MGH) is merging its older endoscope processing facilities into a single new facility that will enable high-level disinfection of endoscopes for both the ORs and Endoscopy Suite, leveraging economies of scale for improved patient care and optimal use of resources. Finalized resource planning was necessary for the merging of facilities to optimize staffing and make final equipment selections to support the nearly 33,000 annual endoscopy cases. To accomplish this, we employed operations management methodologies, analyzing the physical process flow of scopes throughout the existing Endoscopy Suite and ORs and mapping the future state capacity of the new reprocessing facility. Further, our analysis required the incorporation of historical case and reprocessing volumes in a multi-server queuing model to identify any potential wait times as a result of the new reprocessing cycle. We also performed sensitivity analysis to understand the impact of future case volume growth. We found that our future-state reprocessing facility, given planned capital expenditures for automated endoscope reprocessors (AERs) and pre-processing sinks, could easily accommodate current scope volume well within the necessary pre-cleaning-to-sink reprocessing time limit recommended by manufacturers. Further, in its current planned state, our model suggested that the future endoscope reprocessing suite at MGH could support an increase in volume of at least 90% over the next several years. Our work suggests that with simple mathematical analysis of historic case data, significant changes to a complex perioperative environment can be made with ease while keeping patient safety as the top priority.
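
    The multi-server queuing model itself is not given in the record; a standard M/M/c (Erlang C) mean-wait estimate of the kind such an analysis rests on looks as follows, with all rates invented for illustration rather than taken from the MGH data:

```python
import math

def erlang_c_wait(lam, mu, c):
    """Mean time in queue for an M/M/c system (Erlang C formula).
    lam: arrival rate, mu: service rate per server, c: number of servers."""
    rho = lam / (c * mu)          # utilization
    if rho >= 1:
        raise ValueError("unstable: utilization >= 1")
    a = lam / mu                  # offered load
    p0_inv = (sum(a**k / math.factorial(k) for k in range(c))
              + a**c / (math.factorial(c) * (1 - rho)))
    p_wait = a**c / (math.factorial(c) * (1 - rho)) / p0_inv
    return p_wait / (c * mu - lam)

# Illustrative numbers only: ~33,000 scopes/yr over ~3,000 working hours
# gives ~11 arrivals/h; 30-minute AER cycles; 8 reprocessors assumed.
print(erlang_c_wait(lam=11.0, mu=2.0, c=8))   # mean queue wait, hours
```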

  7. Retro-Techno-Economic Analysis: Using (Bio)Process Systems Engineering Tools to Attain Process Target Values

    DEFF Research Database (Denmark)

    Furlan, Felipe F.; Costa, Caliane B B; Secchi, Argimiro R.

    2016-01-01

    Economic analysis, allied to process systems engineering tools, can provide useful insights about process techno-economic feasibility. More interestingly, rather than being used to evaluate specific process conditions, this techno-economic analysis can be turned upside down to achieve target valu...

  8. Structural analysis of advanced spent fuel conditioning process

    International Nuclear Information System (INIS)

    Gu, J. H.; Jung, W. M.; Jo, I. J.; Gug, D. H.; Yoo, K. S.

    2003-01-01

    An advanced spent fuel conditioning process (ACP) is being developed for the safe and effective management of the spent fuels arising from domestic nuclear power plants, and its demonstration facility is under design. This facility will be prepared by modifying IMEF's reserve hot cell, which had been reserved for future use, to suit the characteristics of the ACP. This study presents a basic structural architecture design and analysis results for the ACP hot cell, including the modification of the IMEF. The results of this study will be used for the detailed design of the ACP demonstration facility, and utilized as basic data for the licensing of the ACP facility

  9. Astro-H Data Analysis, Processing and Archive

    Science.gov (United States)

    Angelini, Lorella; Terada, Yukikatsu; Loewenstein, Michael; Miller, Eric D.; Yamaguchi, Hiroya; Yaqoob, Tahir; Krimm, Hans; Harrus, Ilana; Takahashi, Hiromitsu; Nobukawa, Masayoshi; hide

    2016-01-01

    Astro-H (Hitomi) is an X-ray/gamma-ray mission led by Japan with international participation, launched on February 17, 2016. The payload consists of four different instruments (SXS, SXI, HXI and SGD) that operate simultaneously to cover the energy range from 0.3 keV up to 600 keV. This paper presents the analysis software and the data processing pipeline created to calibrate and analyze the Hitomi science data along with the plan for the archive and user support. These activities have been a collaborative effort shared between scientists and software engineers working in several institutes in Japan and the USA.

  10. Analysis of Mental Processes Represented in Models of Artificial Consciousness

    Directory of Open Access Journals (Sweden)

    Luana Folchini da Costa

    2013-12-01

    Full Text Available The concept of Artificial Consciousness has been used in the engineering area as an evolution of Artificial Intelligence. However, consciousness is a complex subject and the term is often used without formalism. As its main contribution, this work proposes an analysis of four recent models of artificial consciousness published in the engineering area. The mental processes represented by these models are highlighted and correlations with the theoretical perspective of cognitive psychology are made. Finally, considerations about consciousness in such models are discussed.

  11. Neutron-activation analysis of routine mineral-processing samples

    International Nuclear Information System (INIS)

    Watterson, J.; Eddy, B.; Pearton, D.

    1974-01-01

    Instrumental neutron-activation analysis was applied to a suite of typical mineral-processing samples to establish which elements can be rapidly determined in them by this technique. A total of 35 elements can be determined with precisions (from the counting statistics) ranging from better than 1 per cent to approximately 20 per cent. The elements that can be determined have been tabulated together with the experimental conditions, the precision from the counting statistics, and the estimated number of analyses possible per day. With an automated system, this number can be as high as 150 in the most favourable cases [af
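
    The quoted precisions follow from Poisson counting statistics, where the relative standard deviation of N accumulated counts is 1/√N; a two-line check shows the count levels the 1 to 20 per cent range implies (a generic illustration, not the paper's data):

```python
# Poisson counting statistics: relative precision = 1/sqrt(N).
for target in (0.01, 0.05, 0.20):          # 1%, 5%, 20% relative precision
    counts_needed = (1.0 / target) ** 2
    print(f"{target:.0%} precision needs ~{counts_needed:,.0f} counts")
# 1% needs ~10,000 counts in the photopeak; 20% needs only ~25.
```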

  12. Static Analysis for Event-Based XML Processing

    DEFF Research Database (Denmark)

    Møller, Anders

    2008-01-01

    Event-based processing of XML data - as exemplified by the popular SAX framework - is a powerful alternative to using W3C's DOM or similar tree-based APIs. The event-based approach processes documents in a streaming fashion with minimal memory consumption. This paper discusses challenges for creating program analyses...... for SAX applications. In particular, we consider the problem of statically guaranteeing that a given SAX program always produces only well-formed and valid XML output. We propose an analysis technique based on existing analyses of Servlets, string operations, and XML graphs....

  13. Gamma camera image processing and graphical analysis mutual software system

    International Nuclear Information System (INIS)

    Wang Zhiqian; Chen Yongming; Ding Ailian; Ling Zhiye; Jin Yongjie

    1992-01-01

    GCCS gamma camera image processing and graphical analysis system is a special mutual software system. It is mainly used to analyse various patient data acquired from a gamma camera. The system runs on the IBM PC, PC/XT or PC/AT. It consists of several parts: system management, data management, device management, a program package and user programs. The system provides two kinds of user interface: command menus and command characters. It is easy to modify and extend this system because it is highly modularized. The user programs include almost all the clinical protocols in current use

  14. Image processing and analysis using neural networks for optometry area

    Science.gov (United States)

    Netto, Antonio V.; Ferreira de Oliveira, Maria C.

    2002-11-01

    In this work we describe the framework of a functional system for processing and analyzing images of the human eye acquired by the Hartmann-Shack (HS) technique, in order to extract information with which to formulate a diagnosis of eye refractive errors (astigmatism, hypermetropia and myopia). The analysis is to be carried out using an Artificial Intelligence system based on Neural Nets, Fuzzy Logic and Classifier Combination. The major goal is to establish the basis of a new technology to effectively measure ocular refractive errors that is based on methods alternative to those adopted in current patented systems. Moreover, analysis of images acquired with the Hartmann-Shack technique may enable the extraction of additional information on the health of an eye under examination from the same image used to detect refraction errors.

  15. Stochastic analysis in production process and ecology under uncertainty

    CERN Document Server

    Bieda, Bogusław

    2014-01-01

    The monograph addresses the problem of stochastic analysis based on uncertainty assessment by simulation, and the application of this method in ecology and the steel industry under uncertainty. The first chapter defines the Monte Carlo (MC) method and random variables in stochastic models. Chapter two deals with contaminant transport in porous media; a stochastic approach to modeling the transit time of Municipal Solid Waste contaminants using MC simulation has been worked out. The third chapter describes the risk analysis of the waste-to-energy facility proposal for Konin city, including the financial aspects. Environmental impact assessment of the ArcelorMittal Steel Power Plant in Kraków is given in chapter four; here, four scenarios of the energy mix production processes were studied. Chapter five contains examples of using ecological Life Cycle Assessment (LCA) - a relatively new method of environmental impact assessment - which help in preparing pro-ecological strategy, and which can lead to reducing t...
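
    As a generic illustration of the MC machinery the monograph applies (not its actual transport model), uncertainty in input parameters can be propagated to an output quantity by repeated sampling; here a Darcy-type transit-time estimate with assumed parameter distributions:

```python
import numpy as np

# Generic Monte Carlo uncertainty propagation (illustrative model only):
# contaminant transit time t = L / (k * i / n_e), a Darcy-type velocity.
rng = np.random.default_rng(42)
n = 100_000
L = 50.0                                                   # travel distance, m (assumed)
k = rng.lognormal(mean=np.log(1e-5), sigma=0.5, size=n)    # hydraulic conductivity, m/s
i = rng.uniform(0.001, 0.01, size=n)                       # hydraulic gradient
n_e = rng.uniform(0.2, 0.35, size=n)                       # effective porosity

t_years = L / (k * i / n_e) / (3600 * 24 * 365)
print(np.percentile(t_years, [5, 50, 95]))   # uncertainty band on transit time
```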

  16. Unraveling cell processes: interference imaging interwoven with data analysis

    DEFF Research Database (Denmark)

    Brazhe, Nadezda; Brazhe, Alexey; Pavlov, A N

    2006-01-01

    The paper presents results on the application of interference microscopy and wavelet analysis for cell visualization and studies of cell dynamics. We demonstrate that interference imaging of erythrocytes can reveal reorganization of the cytoskeleton and inhomogeneity in the distribution of hemoglo...... properties differ from cell type to cell type and depend on the cellular compartment. Our results suggest that low-frequency variations (0.1-0.6 Hz) result from plasma membrane processes and that higher-frequency variations (20-26 Hz) are related to the movement of vesicles. Using double-wavelet analysis, we study the modulation of the 1 Hz rhythm in neurons and reveal its changes under depolarization and hyperpolarization of the plasma membrane. We conclude that interference microscopy combined with wavelet analysis is a useful technique for non-invasive cell studies, cell visualization, and investigation...

  17. A novel process for recovery of fermentation-derived succinic acid: process design and economic analysis.

    Science.gov (United States)

    Orjuela, Alvaro; Orjuela, Andrea; Lira, Carl T; Miller, Dennis J

    2013-07-01

    Recovery and purification of organic acids produced in fermentation constitutes a significant fraction of total production cost. In this paper, the design and economic analysis of a process to recover succinic acid (SA) via dissolution and acidification of succinate salts in ethanol, followed by reactive distillation to form succinate esters, is presented. Process simulation was performed for a range of plant capacities (13-55 million kg/yr SA) and SA fermentation titers (50-100 kg/m(3)). Economics were evaluated for a recovery system installed within an existing fermentation facility producing succinate salts at a cost of $0.66/kg SA. For a SA processing capacity of 54.9 million kg/yr and a titer of 100 kg/m(3) SA, the model predicts a capital investment of $75 million and a net processing cost of $1.85 per kg SA. Required selling price of diethyl succinate for a 30% annual return on investment is $1.57 per kg. Copyright © 2013 Elsevier Ltd. All rights reserved.

  18. NeoAnalysis: a Python-based toolbox for quick electrophysiological data processing and analysis.

    Science.gov (United States)

    Zhang, Bo; Dai, Ji; Zhang, Tao

    2017-11-13

    In a typical electrophysiological experiment, especially one that includes studying animal behavior, the data collected normally contain spikes, local field potentials, behavioral responses and other associated data. In order to obtain informative results, the data must be analyzed simultaneously with the experimental settings. However, most open-source toolboxes currently available for data analysis were developed to handle only a portion of the data and did not take into account the sorting of experimental conditions. Additionally, these toolboxes require that the input data be in a specific format, which can be inconvenient to users. Therefore, the development of a highly integrated toolbox that can process multiple types of data regardless of input data format and perform basic analysis for general electrophysiological experiments is incredibly useful. Here, we report the development of a Python based open-source toolbox, referred to as NeoAnalysis, to be used for quick electrophysiological data processing and analysis. The toolbox can import data from different data acquisition systems regardless of their formats and automatically combine different types of data into a single file with a standardized format. In cases where additional spike sorting is needed, NeoAnalysis provides a module to perform efficient offline sorting with a user-friendly interface. Then, NeoAnalysis can perform regular analog signal processing, spike train, and local field potentials analysis, behavioral response (e.g. saccade) detection and extraction, with several options available for data plotting and statistics. Particularly, it can automatically generate sorted results without requiring users to manually sort data beforehand. In addition, NeoAnalysis can organize all of the relevant data into an informative table on a trial-by-trial basis for data visualization. Finally, NeoAnalysis supports analysis at the population level. With the multitude of general-purpose functions provided

  19. Diffusing wave spectroscopy applied to material analysis and process control

    International Nuclear Information System (INIS)

    Lloyd, Christopher James

    1997-01-01

    Diffusing Wave Spectroscopy (DWS) was studied as a method of laboratory analysis of sub-micron particles, and developed as a prospective in-line, industrial, process control sensor, capable of near real-time feedback. No sample pre-treatment was required and measurement was via a non-invasive, flexible, dip-in probe. DWS relies on the concept of the diffusive migration of light, as opposed to the ballistic scatter model used in conventional dynamic light scattering. The specific requirements of the optoelectronic hardware, data analysis methods and light scattering model were studied experimentally and, where practical, theoretically, resulting in a novel technique for the analysis of particle suspensions and emulsions of volume fractions between 0.01 and 0.4. Operation at high concentrations made the technique insensitive to dust and contamination. The pure homodyne (autodyne) experimental arrangement described was resilient to environmental disturbances, unlike many other systems which utilise optical fibres or heterodyne operation. Pilot and subsequent prototype development led to a highly accurate method of size ranking, suitable for analysis of a wide range of suspensions and emulsions. The technique was shown to operate on real industrial samples with statistical variance as low as 0.3% with minimal software processing. Whilst the application studied was the analysis of TiO2 suspensions, a diverse range of materials including polystyrene beads, cell pastes and industrial cutting fluid emulsions were tested. Results suggest that, whilst all sizing should be comparative to suitable standards, concentration effects may be minimised and even completely modelled-out in many applications. Adhesion to the optical probe was initially a significant problem but was minimised after the evaluation and use of suitable non-stick coating materials. Unexpected behaviour in the correlation in the region of short decay times led to consideration of the effects of rotational diffusion
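
    The thesis hardware is not reproducible from the record, but the quantity at the heart of such homodyne measurements, the normalized intensity autocorrelation g₂(τ), can be estimated from a recorded intensity trace as in the sketch below (synthetic data; in DWS the decay of g₂ is then mapped to particle dynamics):

```python
import numpy as np

def g2(intensity, max_lag):
    """Normalized intensity autocorrelation g2(tau) = <I(t)I(t+tau)> / <I>^2."""
    I = np.asarray(intensity, dtype=float)
    mean_sq = I.mean() ** 2
    return np.array([np.mean(I[: len(I) - lag] * I[lag:]) / mean_sq
                     for lag in range(1, max_lag + 1)])

# Synthetic example: exponentially correlated intensity fluctuations.
rng = np.random.default_rng(1)
noise = rng.normal(size=20_000)
sig = np.convolve(noise, np.exp(-np.arange(200) / 30.0), mode="same")
trace = np.clip(1.0 + 0.3 * (sig - sig.mean()) / sig.std(), 0.0, None)
print(g2(trace, max_lag=5))   # > 1 at short lags, decaying toward 1
```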

  20. MTpy - Python Tools for Magnetotelluric Data Processing and Analysis

    Science.gov (United States)

    Krieger, Lars; Peacock, Jared; Thiel, Stephan; Inverarity, Kent; Kirkby, Alison; Robertson, Kate; Soeffky, Paul; Didana, Yohannes

    2014-05-01

    We present the Python package MTpy, which provides functions for the processing, analysis, and handling of magnetotelluric (MT) data sets. MT is a relatively immature and not widely applied geophysical method in comparison to other geophysical techniques such as seismology. As a result, data processing within the academic MT community is not thoroughly standardised and is often based on a loose collection of software adapted to the respective local specifications. We have developed MTpy to overcome problems that arise from missing standards, and to simplify the general handling of MT data. MTpy is written in Python, and the open-source code is freely available from a GitHub repository. The setup follows the modular approach of successful geoscience software packages such as GMT or ObsPy. It contains sub-packages and modules for the various tasks within the standard workflow of MT data processing and interpretation. In order to allow the inclusion of already existing and well-established software, MTpy provides not only pure Python classes and functions, but also wrapping command-line scripts to run standalone tools, e.g. modelling and inversion codes. Our aim is to provide a flexible framework which is open for future dynamic extensions. MTpy has the potential to promote the standardisation of processing procedures and at the same time be a versatile supplement for existing algorithms. Here, we introduce the concept and structure of MTpy, and we illustrate the workflow of MT data processing, interpretation, and visualisation utilising MTpy on example data sets collected over different regions of Australia and the USA.

  1. Multiobjective Optimization of ELID Grinding Process Using Grey Relational Analysis Coupled with Principal Component Analysis

    Directory of Open Access Journals (Sweden)

    S. Prabhu

    2014-06-01

    Full Text Available A carbon nanotube (CNT) mixed grinding wheel has been used in the electrolytic in-process dressing (ELID) grinding process to analyze the surface characteristics of AISI D2 tool steel. The CNT grinding wheel has excellent thermal conductivity and good mechanical properties, which are used to improve the surface finish of the workpiece. Multiobjective optimization by grey relational analysis coupled with principal component analysis has been used to optimize the process parameters of the ELID grinding process. Based on the Taguchi design of experiments, an L9 orthogonal array was chosen for the experiments. The confirmation experiment verifies that the proposed grey-based Taguchi method has the ability to find the optimal process parameters with multiple quality characteristics of surface roughness and metal removal rate. Analysis of variance (ANOVA) has been used to verify and validate the model. An empirical model for the prediction of the output parameters has been developed using regression analysis, and the results were compared with and without the CNT grinding wheel in the ELID grinding process.
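
    The record names the method without the arithmetic; the core grey relational computation (normalize each response toward an ideal, convert deviations into grey relational coefficients, and average into a grade) is sketched below with invented L9-style response data, not the paper's measurements:

```python
import numpy as np

def grey_relational_grade(responses, larger_is_better, zeta=0.5):
    """Grey relational analysis for a Taguchi-style experiment.
    responses: (runs x criteria) matrix; zeta: distinguishing coefficient."""
    X = np.asarray(responses, dtype=float)
    norm = np.empty_like(X)
    for j, bigger in enumerate(larger_is_better):
        lo, hi = X[:, j].min(), X[:, j].max()
        norm[:, j] = (X[:, j] - lo) / (hi - lo) if bigger else (hi - X[:, j]) / (hi - lo)
    delta = 1.0 - norm                          # deviation from the ideal (=1)
    coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    return coeff.mean(axis=1)                   # equal weights over criteria

# Hypothetical L9 data: [surface roughness (smaller better), MRR (larger better)].
data = [[0.42, 12.1], [0.35, 10.4], [0.51, 14.0],
        [0.30,  9.8], [0.46, 13.2], [0.38, 11.5],
        [0.55, 15.1], [0.33, 10.9], [0.40, 12.6]]
grades = grey_relational_grade(data, larger_is_better=[False, True])
print("best run by grey relational grade:", grades.argmax() + 1)
```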

  2. Effective Thermal Analysis of Using Peltier Module for Desalination Process

    Directory of Open Access Journals (Sweden)

    Hayder Al-Madhhachi

    2018-01-01

    Full Text Available The key objective of this study is to analyse the heat transfer processes involved in the evaporation and condensation of water in a water distillation system employing a thermoelectric module. This analysis can help to increase the water production and to enhance the system performance. For the analysis, a water distillation unit prototype integrated with a thermoelectric module was designed and fabricated. A theoretical model is developed to study the effect of the heat added, transferred and removed, in forced convection and laminar flow, during the evaporation and condensation processes. The thermoelectric module is used to convert electricity into heat via the Peltier effect and to control precisely the heat absorbed and released at the cold and hot sides of the module, respectively. The temperatures of the water, vapour, condenser, and cold and hot sides of the thermoelectric module, together with the water production, have been measured experimentally under steady-state operation. The theoretical and experimental water production were found to be in agreement. The amount of heat that must be removed at the water-vapour interface and transferred through the condenser surface to the thermoelectric module is crucial for the design and optimization of distillation systems.

  3. Energetic analysis of fruit juice processing operations in Nigeria

    Energy Technology Data Exchange (ETDEWEB)

    Waheed, M.A.; Imeokparia, O.E. [Ladoke Akintola University of Technology, Ogbomoso, Oyo State (Nigeria). Mechanical Engineering Department; Jekayinfa, S.O.; Ojediran, J.O. [Ladoke Akintola University of Technology, Ogbomoso, Oyo State (Nigeria). Agricultural Engineering Department

    2008-01-15

    Energy and exergy studies were conducted in an orange juice manufacturing plant in Nigeria to determine the energy consumption pattern and methods of energy optimization in the company. An adaptation of the process analysis method of energy accounting was used to evaluate the energy requirement for each of the eight defined unit operations. The types of energy used in the manufacturing of orange juice were electrical, steam and manual, with respective proportions of 18.51%, 80.91% and 0.58% of the total energy. It was estimated that an average energy intensity of 1.12 MJ/kg was required for the manufacturing of orange juice. The most energy-intensive operation was identified as the pasteurizer, followed by the packaging unit, with energy intensities of 0.932 and 0.119 MJ/kg, respectively. The exergy analysis revealed that the pasteurizer was responsible for most of the inefficiency (over 90%), followed by packaging (6.60%). It was suggested that the capacity of the pasteurizer be increased to reduce the level of inefficiency of the plant. The suggestion has been limited to equipment modification rather than process alteration, since the latter constitutes additional investment cost and may not be economical from an energy savings perspective. (author)

  4. Techno-Economic Analysis of a Secondary Air Stripper Process

    Energy Technology Data Exchange (ETDEWEB)

    Heberle, J.R. [Electric Power Research Inst. (EPRI), Palo Alto, CA (United States); Nikolic, Heather [Center for Applied Energy Research, University of Kentucky, Lexington, KY (United States); Thompson, Jesse [Center for Applied Energy Research, University of Kentucky, Lexington, KY (United States); Liu, Kunlei [Center for Applied Energy Research, University of Kentucky, Lexington, KY (United States); Pinkerton, Lora L. [WorleyParsons, Reading, PA (United States); Brubaker, David [WorleyParsons, Reading, PA (United States); Simpson, James C. [WorleyParsons, Reading, PA (United States); Wu, Song [Mitsubishi Hitachi Power Systems America, Inc, Basking Ridge, NJ (United States); Bhown, Abhoyjit S. [Electric Power Research Inst. (EPRI), Palo Alto, CA (United States)

    2017-08-22

    We present results of an initial techno-economic assessment on a post-combustion CO2 capture process developed by the Center for Applied Energy Research (CAER) at the University of Kentucky using Mitsubishi Hitachi Power Systems’ H3-1 aqueous amine solvent. The analysis is based on data collected at a 0.7 MWe pilot unit combined with laboratory data and process simulations. The process adds a secondary air stripper to a conventional solvent process, which increases the cyclic loading of the solvent in two ways. First, air strips additional CO2 from the solvent downstream of the conventional steam-heated thermal stripper. This extra stripping of CO2 reduces the lean loading entering the absorber. Second, the CO2-enriched air is then sent to the boiler for use as secondary air. This recycling of CO2 results in a higher concentration of CO2 in the flue gas sent to the absorber, and hence a higher rich loading of the solvent exiting the absorber. A process model was incorporated into a full-scale supercritical pulverized coal power plant model to determine the plant performance and heat and mass balances. The performance and heat and mass balance data were used to size equipment and develop cost estimates for capital and operating costs. Lifecycle costs were considered through a levelized cost of electricity (LCOE) assessment based on the capital cost estimate and modeled performance. The results of the simulations show that the CAER process yields a regeneration energy of 3.12 GJ/t CO2, a $53.05/t CO2 capture cost, and an LCOE of $174.59/MWh. This compares to the U.S. Department of Energy’s projected costs (Case 10) of a regeneration energy of 3.58 GJ/t CO2, a $61.31/t CO2 capture cost, and an LCOE of $189.59/MWh. For H3-1, the CAER process results in a regeneration energy of 2.62 GJ/t CO2 with a stripper pressure of 5.2 bar, a capture cost of $46.93/t CO2, and an LCOE of $164.33/MWh.

  5. Yucca Mountain Feature, Event, and Process (FEP) Analysis

    International Nuclear Information System (INIS)

    Freeze, G.

    2005-01-01

    A Total System Performance Assessment (TSPA) model was developed for the U.S. Department of Energy (DOE) Yucca Mountain Project (YMP) to help demonstrate compliance with applicable postclosure regulatory standards and support the License Application (LA). Two important precursors to the development of the TSPA model were (1) the identification and screening of features, events, and processes (FEPs) that might affect the Yucca Mountain disposal system (i.e., FEP analysis), and (2) the formation of scenarios from screened-in (included) FEPs to be evaluated in the TSPA model (i.e., scenario development). YMP FEP analysis and scenario development followed a five-step process: (1) Identify a comprehensive list of FEPs potentially relevant to the long-term performance of the disposal system. (2) Screen the FEPs using specified criteria to identify those FEPs that should be included in the TSPA analysis and those that can be excluded from the analysis. (3) Form scenarios from the screened-in (included) FEPs. (4) Screen the scenarios using the same criteria applied to the FEPs to identify any scenarios that can be excluded from the TSPA, as appropriate. (5) Specify the implementation of the scenarios in the computational modeling for the TSPA, and document the treatment of included FEPs. This paper describes the FEP analysis approach (Steps 1 and 2) for YMP, with a brief discussion of scenario formation (Step 3). Details of YMP scenario development (Steps 3 and 4) and TSPA modeling (Step 5) are beyond the scope of this paper. The identification and screening of the YMP FEPs was an iterative process based on site-specific information, design, and regulations. The process was iterative in the sense that there were multiple evaluation and feedback steps (e.g., separate preliminary, interim, and final analyses). The initial YMP FEP list was compiled from an existing international list of FEPs from other radioactive waste disposal programs and was augmented by YMP site- and design

  6. Hillslope Discharge Analysis - Threshold Behavior and Mixing Processes

    Science.gov (United States)

    Dusek, J.; Vogel, T. N.

    2017-12-01

    Reliable quantitative prediction of temporal changes of both the soil water storage and the shallow subsurface runoff for natural forest hillslopes exhibiting a high degree of subsurface heterogeneity remains a challenge. The intensity of stormflow determines to a large extent the residence time of water in a hillslope segment, thus also influencing biogeochemical processes and mass fluxes of nutrients. Stormflow, one of the most important runoff mechanisms in headwater catchments, usually develops above the soil-bedrock interface as saturated flow during prominent rainfall-runoff events. In this study, one- and two-dimensional numerical models were used to analyze hydrological processes at an experimental forest site located in a small headwater catchment under a humid temperate climate. The models are based on a dual-continuum approach reflecting water flow and isotope transport through the soil matrix and preferential pathways. The threshold relationship between rainfall and stormflow as well as the hysteresis in the hillslope stormflow-storage relationship were examined. The hillslope storage analysis was performed for selected individual rainfall-runoff events over several consecutive growing seasons. Furthermore, temporal and spatial variations of pre-event and event water contributions to hillslope stormflow were evaluated using a two-component mass balance approach based on synthetic oxygen-18 signatures. The results of this analysis showed a mutual interplay of the components of the hillslope water balance, exposing the nonlinear character of the hillslope hydrological response. The results also suggested significant mixing processes in the hillslope segment, in particular mixing of pre-event and event water as well as of water exchanged between the soil matrix and preferential pathways. Despite the dominant control of preferential stormflow on the overall hillslope runoff response, a rapid and substantial contribution of pre-event water to hillslope runoff was
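
    The two-component mass balance mentioned reduces to a single expression: with tracer conservation Q·c_stream = Q_pre·c_pre + Q_event·c_event and water balance Q = Q_pre + Q_event, the pre-event fraction is (c_stream − c_event)/(c_pre − c_event). A minimal sketch with assumed δ¹⁸O signatures, not the study's data:

```python
def pre_event_fraction(c_stream, c_pre, c_event):
    """Two-component isotope hydrograph separation:
    Q*c_stream = Q_pre*c_pre + Q_event*c_event, with Q = Q_pre + Q_event."""
    return (c_stream - c_event) / (c_pre - c_event)

# Assumed delta-18O signatures (permil): soil water, rainfall, stormflow.
f_pre = pre_event_fraction(c_stream=-9.0, c_pre=-8.0, c_event=-12.0)
print(f"pre-event water fraction: {f_pre:.0%}")   # 75% of stormflow is old water
```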

  7. Computer-aided analysis of cutting processes for brittle materials

    Science.gov (United States)

    Ogorodnikov, A. I.; Tikhonov, I. N.

    2017-12-01

    This paper is focused on 3D computer simulation of cutting processes for brittle materials and silicon wafers. Computer-aided analysis of wafer scribing and dicing is carried out with the use of the ANSYS CAE (computer-aided engineering) software, and a parametric model of the processes is created by means of the internal ANSYS APDL programming language. Different types of tool tip geometry are analyzed to obtain the internal stresses, such as a four-sided pyramid with an included angle of 120° and a tool inclination angle to the normal axis of 15°. The quality of the workpieces after cutting is studied by optical microscopy to verify the FE (finite-element) model. The disruption of the material structure during scribing occurs near the scratch and propagates into the wafer or over its surface at short range. The deformation area along the scratch looks like a ragged band, but the stress width is rather low. The theory of cutting brittle semiconductor and optical materials is developed on the basis of the advanced theory of metal turning. The decay of stress intensity along the normal from the tip point to the scribe line can be predicted using the developed theory and the verified FE model. The crystal quality and the dimensions of defects are determined by the mechanics of scratching, which depends on the shape of the diamond tip, the scratching direction, the velocity of the cutting tool and the applied force loads. The disruption is a rate-sensitive process, and it depends on the cutting thickness. The application of numerical techniques, such as FE analysis, to cutting problems enhances understanding and promotes the further development of existing machining technologies.

  8. Simulated interprofessional education: an analysis of teaching and learning processes.

    Science.gov (United States)

    van Soeren, Mary; Devlin-Cop, Sandra; Macmillan, Kathleen; Baker, Lindsay; Egan-Lee, Eileen; Reeves, Scott

    2011-11-01

    Simulated learning activities are increasingly being used in health professions and interprofessional education (IPE). Specifically, IPE programs are frequently adopting role-play simulations as a key learning approach. Despite this widespread adoption, there is little empirical evidence exploring the teaching and learning processes embedded within this type of simulation. This exploratory study provides insight into the nature of these processes through the use of qualitative methods. A total of 152 clinicians, 101 students and 9 facilitators representing a range of health professions participated in video-recorded role-plays and debrief sessions. Videotapes were analyzed to explore emerging issues and themes related to teaching and learning processes in this type of interprofessional simulated learning experience. In addition, three focus groups were conducted with a subset of participants to explore perceptions of their educational experiences. Five key themes emerged from the data analysis: enthusiasm and motivation, professional role assignment, scenario realism, facilitator style and background, and team facilitation. Our findings suggest that program developers need to be mindful of these five themes when using role-plays in an interprofessional context, and they point to the importance of deliberate and skilled facilitation in meeting desired learning outcomes.

  9. Analysis of Evolutionary Processes of Species Jump in Waterfowl Parvovirus

    Science.gov (United States)

    Fan, Wentao; Sun, Zhaoyu; Shen, Tongtong; Xu, Danning; Huang, Kehe; Zhou, Jiyong; Song, Suquan; Yan, Liping

    2017-01-01

    Waterfowl parvoviruses are classified into goose parvovirus (GPV) and Muscovy duck parvovirus (MDPV) according to their antigenic features and host preferences. A novel duck parvovirus (NDPV), identified as a new variant of GPV, is currently infecting ducks, thus causing considerable economic loss. This study analyzed the molecular evolution and population dynamics of the emerging parvovirus capsid gene to investigate the evolutionary processes concerning the host shift of NDPV. Two important amino acid changes (Asn-489 and Asn-650) were identified in NDPV, which may be responsible for the host shift of NDPV. Phylogenetic analysis indicated that the currently circulating NDPV originated from the GPV lineage. The Bayesian Markov chain Monte Carlo tree indicated that the NDPV diverged from GPV approximately 20 years ago. Evolutionary rate analyses demonstrated that GPV evolved with 7.674 × 10⁻⁴ substitutions/site/year, the rate for MDPV was 5.237 × 10⁻⁴ substitutions/site/year, whereas the substitution rate in the NDPV branch was 2.25 × 10⁻³ substitutions/site/year. Meanwhile, viral population dynamics analysis revealed that the GPV major clade, including NDPV, grew exponentially at a rate of 1.717 year⁻¹. Selection pressure analysis showed that most sites are subject to strong purifying selection and no positively selected sites were found in NDPV. The unique immune epitopes in waterfowl parvovirus were also estimated, which may be helpful for the prediction of antibody binding sites against NDPV in ducks. PMID:28352261

  10. Tube Bulge Process : Theoretical Analysis and Finite Element Simulations

    International Nuclear Information System (INIS)

    Velasco, Raphael; Boudeau, Nathalie

    2007-01-01

    This paper is focused on the determination of the mechanical characteristics of tubular materials using the tube bulge process. A comparative study is made between two different models: a theoretical model and finite element analysis. The theoretical model is fully developed, based first on a geometrical analysis of the tube profile during bulging, which is assumed to deform in arcs of circles. Strain and stress analyses complete the theoretical model, which allows the tube thickness and state of stress to be evaluated at any point of the free bulge region. Free bulging of a 304L stainless steel is simulated using Ls-Dyna 970. To validate the FE simulation approach, a comparison between the theoretical and finite element models is carried out on several parameters such as: thickness variation at the pole of the free bulge region with bulge height, tube thickness variation with the z axial coordinate, and von Mises stress variation with plastic strain. Finally, the influence of deviations in the geometrical parameters on the flow stress curve is examined using the analytical model: deviations of the tube outer diameter, its initial thickness and the bulge height measurement are taken into account to obtain the resulting error on plastic strain and von Mises stress

  11. Identifying target processes for microbial electrosynthesis by elementary mode analysis.

    Science.gov (United States)

    Kracke, Frauke; Krömer, Jens O

    2014-12-30

    Microbial electrosynthesis and electro-fermentation are techniques that aim to optimize microbial production of chemicals and fuels by regulating the cellular redox balance via interaction with electrodes. While the concept has been known for decades, major knowledge gaps remain, which makes it hard to evaluate its biotechnological potential. Here we present an in silico approach to identify beneficial production processes for electro-fermentation by elementary mode analysis. Since the fundamentals of electron transport between electrodes and microbes have not been fully uncovered yet, we propose different options and discuss their impact on biomass and product yields. For the first time, 20 different valuable products were screened for their potential to show increased yields during anaerobic electrically enhanced fermentation. Surprisingly, we found that an increase in product formation by electrical enhancement is not necessarily dependent on the degree of reduction of the product but rather on the metabolic pathway it is derived from. We present a variety of beneficial processes with product yield increases of up to 36% in reductive and up to 84% in oxidative fermentations, and final theoretical product yields up to 100%. This includes compounds that are already produced at industrial scale, such as succinic acid, lysine and diaminopentane, as well as potential novel bio-commodities such as isoprene, para-hydroxybenzoic acid and para-aminobenzoic acid. Furthermore, it is shown that the mode of electron transport has a major impact on achievable biomass and product yields. The coupling of electron transport to energy conservation could be identified as crucial for most processes. This study introduces a powerful tool to determine beneficial substrate and product combinations for electro-fermentation. It also highlights that the maximal yield achievable by bioelectrochemical techniques depends strongly on the actual electron transport mechanisms. Therefore it is of great importance to

  12. Process Analysis in Chemical Plant by Means of Radioactive Tracers

    Energy Technology Data Exchange (ETDEWEB)

    Hirayama, T.; Hamada, K.; Osada, K. [Showa Denko K.K., Tokyo (Japan)

    1967-06-15

    Following the movement of solids and fluids is important in chemical processes to determine mixing efficiency and residence time. Since it is necessary to follow many complex substances such as raw materials, intermediates and reactants in plant investigations, it is often necessary to ascertain whether the behaviour of the radioisotope tracer and the substance to be traced are identical. The most difficult problem is to determine the best method of labelling, a factor which is a substantial key to the success of an experiment. Usually, there are three labelling techniques: radioisotope labelling, pre-activation of the material and post-activation of the material. This paper deals with practical examples of the double-tracer technique, a combination of conventional radioisotope labelling and post-activation methods by means of activation analysis. In process analysis by means of tracers, a practical measurement method should also be devised and developed for each experiment. Phosphorus-32 and gold (non-radioactive) were used to measure retention time in a carbon-black plant. The radioisotope was pumped into a feed-stock pipe positioned before the reactor and samples were taken from each process of the plant, including the bag filter, mixer and product tank. After sampling from each step of the process, {sup 32}P in a semi-infinite powder sample was measured in situ by beta counting, and the gold was measured by gamma counting after activating the sample in a reactor. The experiment showed that both tracers had the same residence time, which was shorter than expected. Useful data were also obtained from the dispersion pattern of the material flow for future operation controls, including the time required to change from one grade of product to another. Practical tracer techniques to measure mixing characteristics in high-speed gas flows using {sup 85}Kr have been developed. A study of the measurement method was conducted by calculating the differential values of

  13. Marketing Mix Formulation for Higher Education: An Integrated Analysis Employing Analytic Hierarchy Process, Cluster Analysis and Correspondence Analysis

    Science.gov (United States)

    Ho, Hsuan-Fu; Hung, Chia-Chi

    2008-01-01

    Purpose: The purpose of this paper is to examine how a graduate institute at National Chiayi University (NCYU), by using a model that integrates analytic hierarchy process, cluster analysis and correspondence analysis, can develop effective marketing strategies. Design/methodology/approach: This is primarily a quantitative study aimed at…

  14. Rapid Process to Generate Beam Envelopes for Optical System Analysis

    Science.gov (United States)

    Howard, Joseph; Seals, Lenward

    2012-01-01

    The task of evaluating obstructions in the optical throughput of an optical system requires the use of two disciplines, and hence, two models: optical models for the details of optical propagation, and mechanical models for determining the actual structure that exists in the optical system. Previous analysis methods for creating beam envelopes (or cones of light) for use in this obstruction analysis were found to be cumbersome to calculate and take significant time and resources to complete. A new process was developed that takes less time to complete beam envelope analysis, is more accurate and less dependent upon manual node tracking to create the beam envelopes, and eases the burden on the mechanical CAD (computer-aided design) designers to form the beam solids. This algorithm allows rapid generation of beam envelopes for optical system obstruction analysis. Ray trace information is taken from optical design software and used to generate CAD objects that represent the boundary of the beam envelopes for detailed analysis in mechanical CAD software. Matlab is used to call ray trace data from the optical model for all fields and entrance pupil points of interest. These are chosen to be the edge of each space, so that these rays produce the bounding volume for the beam. The x and y global coordinate data is collected on the surface planes of interest, typically an image of the field and entrance pupil internal of the optical system. This x and y coordinate data is then evaluated using a convex hull algorithm, which removes any internal points, which are unnecessary to produce the bounding volume of interest. At this point, tolerances can be applied to expand the size of either the field or aperture, depending on the allocations. Once this minimum set of coordinates on the pupil and field is obtained, a new set of rays is generated between the field plane and aperture plane (or vice-versa). These rays are then evaluated at planes between the aperture and field, at a
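
    The pruning step described (discarding interior points so that only the envelope-bounding rays remain) is a convex hull computation; below is a minimal sketch of that step, assuming the ray/plane intersection coordinates have already been exported from the optical model as x/y arrays:

```python
import numpy as np
from scipy.spatial import ConvexHull

def boundary_points(x, y):
    """Keep only the points on the convex hull of the beam footprint,
    discarding interior rays that do not bound the envelope."""
    pts = np.column_stack([x, y])
    hull = ConvexHull(pts)
    return pts[hull.vertices]          # hull vertices, ordered counter-clockwise

# Hypothetical footprint: 500 ray intersections with one plane of interest.
rng = np.random.default_rng(7)
x, y = rng.normal(size=500), rng.normal(size=500)
edge = boundary_points(x, y)
# Repeating this at successive planes gives the profiles that are lofted
# into the beam envelope solid handed to the mechanical CAD model.
print(len(edge), "boundary points out of 500 rays")
```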

  15. Processes that Inform Multicultural Supervision: A Qualitative Meta-Analysis.

    Science.gov (United States)

    Tohidian, Nilou B; Quek, Karen Mui-Teng

    2017-10-01

    As the fields of counseling and psychotherapy have become more cognizant that individuals, couples, and families bring with them a myriad of diversity factors into therapy, multicultural competency has also become a crucial component in the development of clinicians during clinical supervision and training. We employed a qualitative meta-analysis to provide a detailed and comprehensive description of similar themes identified in primary qualitative studies that have investigated supervisory practices with an emphasis on diversity. Findings revealed six meta-categories, namely: (a) Supervisor's Multicultural Stances; (b) Supervisee's Multicultural Encounters; (c) Competency-Based Content in Supervision; (d) Processes Surrounding Multicultural Supervision; (e) Culturally Attuned Interventions; and (f) Multicultural Supervisory Alliance. Implications for practice are discussed. © 2017 American Association for Marriage and Family Therapy.

  16. Low-level processing for real-time image analysis

    Science.gov (United States)

    Eskenazi, R.; Wilf, J. M.

    1979-01-01

    A system that detects object outlines in television images in real time is described. A high-speed pipeline processor transforms the raw image into an edge map and a microprocessor, which is integrated into the system, clusters the edges, and represents them as chain codes. Image statistics, useful for higher level tasks such as pattern recognition, are computed by the microprocessor. Peak intensity and peak gradient values are extracted within a programmable window and are used for iris and focus control. The algorithms implemented in hardware and the pipeline processor architecture are described. The strategy for partitioning functions in the pipeline was chosen to make the implementation modular. The microprocessor interface allows flexible and adaptive control of the feature extraction process. The software algorithms for clustering edge segments, creating chain codes, and computing image statistics are also discussed. A strategy for real time image analysis that uses this system is given.
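
    The record describes the pipeline at block level only; a plain-software sketch of the same two stages (a Sobel gradient edge map followed by 8-direction Freeman chain coding) is given below. It is illustrative and makes no claim to match the original hardware algorithms:

```python
import numpy as np

# 8-direction Freeman chain code offsets (dy, dx), starting east, counter-clockwise.
DIRS = [(0, 1), (-1, 1), (-1, 0), (-1, -1), (0, -1), (1, -1), (1, 0), (1, 1)]

def sobel_edge_map(img, thresh):
    """Binary edge map from the Sobel gradient magnitude."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    ky = kx.T
    pad = np.pad(img.astype(float), 1, mode="edge")
    gx = sum(kx[i, j] * pad[i:i + img.shape[0], j:j + img.shape[1]]
             for i in range(3) for j in range(3))
    gy = sum(ky[i, j] * pad[i:i + img.shape[0], j:j + img.shape[1]]
             for i in range(3) for j in range(3))
    return np.hypot(gx, gy) > thresh

def chain_code(edges, start):
    """Greedy walk along 8-connected edge pixels, emitting Freeman codes."""
    y, x = start
    visited, codes = {start}, []
    moved = True
    while moved:
        moved = False
        for code, (dy, dx) in enumerate(DIRS):
            ny, nx = y + dy, x + dx
            if (0 <= ny < edges.shape[0] and 0 <= nx < edges.shape[1]
                    and edges[ny, nx] and (ny, nx) not in visited):
                codes.append(code)
                visited.add((ny, nx))
                y, x, moved = ny, nx, True
                break
    return codes

# Toy usage: a bright square on a dark background.
img = np.zeros((32, 32)); img[8:24, 8:24] = 1.0
edges = sobel_edge_map(img, thresh=1.0)
start = tuple(int(v) for v in np.argwhere(edges)[0])
print(chain_code(edges, start)[:12])
```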

  17. Accident sequences and causes analysis in a hydrogen production process

    Energy Technology Data Exchange (ETDEWEB)

    Jae, Moo Sung; Hwang, Seok Won; Kang, Kyong Min; Ryu, Jung Hyun; Kim, Min Soo; Cho, Nam Chul; Jeon, Ho Jun; Jung, Gun Hyo; Han, Kyu Min; Lee, Seng Woo [Hanyang Univ., Seoul (Korea, Republic of)

    2006-03-15

    Since a hydrogen production facility using the IS process requires the high temperature of a nuclear power plant, a safety assessment should be performed to guarantee the safety of the facility. First of all, accident cases in hydrogen production and utilization were surveyed. Based on the results, the risk factors associated with a hydrogen production facility were identified, and the correlations between risk factors were schematized using an influence diagram. The initiating events of the hydrogen production facility were also identified, and accident scenario development and quantification were performed. PSA methodology was used for the identification of initiating events, with a master logic diagram used as the selection method, and event tree analysis was used for the quantification of accident scenarios. The sum of all the leakage frequencies is 1.22x10{sup -4}, which is similar to the value (1.0x10{sup -4}) for core damage frequency that the International Nuclear Safety Advisory Group of the IAEA suggested as a criterion.

  18. Process analysis and data driven optimization in the salmon industry

    DEFF Research Database (Denmark)

    Johansson, Gine Ørnholt

    Aquaculture supplies around 70% of the salmon in the world and the industry is thus an important player in meeting the increasing demand for salmon products. Such mass production calls for systems that can handle thousands of tonnes of salmon without compromising the welfare of the fish...... and the following product quality. Moreover, the requirement of increased profit performance for the industry should be met with sustainable production solutions. Optimization during the production of salmon fillets could be one feasible approach to increase the outcome from the same level of incoming raw material...... and analysis of data from the salmon industry could be utilized to extract information that will support the industry in their decision-making processes. Mapping of quality parameters, their fluctuations and influences on yield and texture has been investigated. Additionally, the ability to predict the texture

  19. Astro-H/Hitomi data analysis, processing, and archive

    Science.gov (United States)

    Angelini, Lorella; Terada, Yukikatsu; Dutka, Michael; Eggen, Joseph; Harrus, Ilana; Hill, Robert S.; Krimm, Hans; Loewenstein, Michael; Miller, Eric D.; Nobukawa, Masayoshi; Rutkowski, Kristin; Sargent, Andrew; Sawada, Makoto; Takahashi, Hiromitsu; Yamaguchi, Hiroya; Yaqoob, Tahir; Witthoeft, Michael

    2018-01-01

    Astro-H is the x-ray/gamma-ray mission led by Japan with international participation, launched on February 17, 2016. Soon after launch, Astro-H was renamed Hitomi. The payload consists of four different instruments (SXS, SXI, HXI, and SGD) that operate simultaneously to cover the energy range from 0.3 keV up to 600 keV. On March 27, 2016, JAXA lost contact with the satellite and, on April 28, they announced the cessation of the efforts to restore mission operations. Hitomi collected about one month's worth of data with its instruments. This paper presents the analysis software and the data processing pipeline created to calibrate and analyze the Hitomi science data, along with the plan for the archive. These activities have been a collaborative effort shared between scientists and software engineers working in several institutes in Japan and United States.

  20. Analysis of DIRAC's behavior using model checking with process algebra

    International Nuclear Information System (INIS)

    Remenska, Daniela; Templon, Jeff; Willemse, Tim; Bal, Henri; Verstoep, Kees; Fokkink, Wan; Charpentier, Philippe; Lanciotti, Elisa; Roiser, Stefan; Ciba, Krzysztof; Diaz, Ricardo Graciani

    2012-01-01

    DIRAC is the grid solution developed to support LHCb production activities as well as user data analysis. It consists of distributed services and agents delivering the workload to the grid resources. Services maintain database back-ends to store dynamic state information of entities such as jobs, queues, staging requests, etc. Agents use polling to check and possibly react to changes in the system state. Each agent's logic is relatively simple; the main complexity lies in their cooperation. Agents run concurrently, and collaborate using the databases as shared memory. The databases can be accessed directly by the agents if running locally or through a DIRAC service interface if necessary. This shared-memory model causes entities to occasionally get into inconsistent states. Tracing and fixing such problems becomes formidable due to the inherent parallelism present. We propose more rigorous methods to cope with this. Model checking is one such technique for analysis of an abstract model of a system. Unlike conventional testing, it allows full control over the execution of the parallel processes, and supports exhaustive state-space exploration. We used the mCRL2 language and toolset to model the behavior of two related DIRAC subsystems: the workload and storage management systems. Based on process algebra, mCRL2 allows defining custom data types as well as functions over these. This makes it suitable for modeling the data manipulations made by DIRAC's agents. By visualizing the state space and replaying scenarios with the toolkit's simulator, we have detected race conditions and deadlocks in these systems, which, in several cases, were confirmed to occur in reality. Several properties of interest were formulated and verified with the tool. Our future direction is automating the translation from DIRAC to a formal model.
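
    The mCRL2 models themselves are not part of the record, but the essence of the approach, exhaustive exploration of all interleavings of concurrent agents over shared state, fits in a few lines. The toy sketch below finds the classic lost-update race in an unprotected read-modify-write on a shared counter, the same family of inconsistency the authors hunted in DIRAC's shared-database agents (everything here is invented for illustration):

```python
# State: (shared value, ((pc1, local1), (pc2, local2))); pc 0=read, 1=write, 2=done.
# Each agent performs an unprotected read-modify-write on the shared counter.

def steps(state):
    shared, agents = state
    for i, (pc, local) in enumerate(agents):
        if pc == 0:                                   # read shared into local
            new = list(agents); new[i] = (1, shared)
            yield (shared, tuple(new))
        elif pc == 1:                                 # write back local + 1
            new = list(agents); new[i] = (2, local)
            yield (local + 1, tuple(new))

def explore(initial):
    """Exhaustive state-space exploration (the model-checking core)."""
    seen, stack = {initial}, [initial]
    while stack:
        state = stack.pop()
        for nxt in steps(state):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

final_values = {shared for shared, agents in explore((0, ((0, 0), (0, 0))))
                if all(pc == 2 for pc, _ in agents)}
print(final_values)   # {1, 2}: the lost-update race is reachable
```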

  1. Analysis of DIRAC's behavior using model checking with process algebra

    Science.gov (United States)

    Remenska, Daniela; Templon, Jeff; Willemse, Tim; Bal, Henri; Verstoep, Kees; Fokkink, Wan; Charpentier, Philippe; Graciani Diaz, Ricardo; Lanciotti, Elisa; Roiser, Stefan; Ciba, Krzysztof

    2012-12-01

    DIRAC is the grid solution developed to support LHCb production activities as well as user data analysis. It consists of distributed services and agents delivering the workload to the grid resources. Services maintain database back-ends to store dynamic state information of entities such as jobs, queues, staging requests, etc. Agents use polling to check and possibly react to changes in the system state. Each agent's logic is relatively simple; the main complexity lies in their cooperation. Agents run concurrently, and collaborate using the databases as shared memory. The databases can be accessed directly by the agents if running locally or through a DIRAC service interface if necessary. This shared-memory model causes entities to occasionally get into inconsistent states. Tracing and fixing such problems becomes formidable due to the inherent parallelism present. We propose more rigorous methods to cope with this. Model checking is one such technique for analysis of an abstract model of a system. Unlike conventional testing, it allows full control over the execution of the parallel processes, and supports exhaustive state-space exploration. We used the mCRL2 language and toolset to model the behavior of two related DIRAC subsystems: the workload and storage management systems. Based on process algebra, mCRL2 allows defining custom data types as well as functions over these. This makes it suitable for modeling the data manipulations made by DIRAC's agents. By visualizing the state space and replaying scenarios with the toolkit's simulator, we have detected race conditions and deadlocks in these systems, which, in several cases, were confirmed to occur in reality. Several properties of interest were formulated and verified with the tool. Our future direction is automating the translation from DIRAC to a formal model.

  2. Bony change of apical lesion healing process using fractal analysis

    International Nuclear Information System (INIS)

    Lee, Ji Min; Park, Hyok; Jeong, Ho Gul; Kim, Kee Deog; Park, Chang Seo

    2005-01-01

    To investigate the change of the bone healing process after endodontic treatment of teeth with an apical lesion by fractal analysis. Radiographic images of 35 teeth from 33 patients, taken on first diagnosis, 6 months, and 1 year after endodontic treatment, were selected. Radiographic images were taken by the JUPITER computerized Dental X-ray System. Fractal dimensions were calculated three times at each area by the Scion Image PC program. Rectangular regions of interest (30 x 30) were selected at the apical lesion and the normal apex of each image. The fractal dimension at the apical lesion on first diagnosis (L₀) is 0.940 ± 0.361 and that of the normal area (N₀) is 1.186 ± 0.727 (p<0.05). The fractal dimension at the apical lesion 6 months after endodontic treatment (L₁) is 1.076 ± 0.069 and that of the normal area (N₁) is 1.192 ± 0.055 (p<0.05). The fractal dimension at the apical lesion 1 year after endodontic treatment (L₂) is 1.163 ± 0.074 and that of the normal area (N₂) is 1.225 ± 0.079 (p<0.05). After endodontic treatment, the fractal dimensions at the apical lesions showed statistically significant differences over time. There were also statistically significant differences between the normal area and the apical lesion on first diagnosis, at 6 months, and at 1 year, but these differences grew smaller with time. The prognosis after endodontic treatment of an apical lesion is assessed by bone regeneration in the apical region. Fractal analysis was attempted to overcome the limits of subjective reading; as a result, the change of the bone during the healing process could be detected objectively and quantitatively.

  3. Bony change of apical lesion healing process using fractal analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Ji Min; Park, Hyok; Jeong, Ho Gul; Kim, Kee Deog; Park, Chang Seo [Yonsei University College of Medicine, Seoul (Korea, Republic of)

    2005-06-15

    To investigate the change of the bone healing process after endodontic treatment of teeth with an apical lesion by fractal analysis. Radiographic images of 35 teeth from 33 patients, taken on first diagnosis, 6 months, and 1 year after endodontic treatment, were selected. Radiographic images were taken by the JUPITER computerized Dental X-ray System. Fractal dimensions were calculated three times at each area by the Scion Image PC program. Rectangular regions of interest (30 x 30) were selected at the apical lesion and the normal apex of each image. The fractal dimension at the apical lesion on first diagnosis (L₀) is 0.940 ± 0.361 and that of the normal area (N₀) is 1.186 ± 0.727 (p<0.05). The fractal dimension at the apical lesion 6 months after endodontic treatment (L₁) is 1.076 ± 0.069 and that of the normal area (N₁) is 1.192 ± 0.055 (p<0.05). The fractal dimension at the apical lesion 1 year after endodontic treatment (L₂) is 1.163 ± 0.074 and that of the normal area (N₂) is 1.225 ± 0.079 (p<0.05). After endodontic treatment, the fractal dimensions at the apical lesions showed statistically significant differences over time. There were also statistically significant differences between the normal area and the apical lesion on first diagnosis, at 6 months, and at 1 year, but these differences grew smaller with time. The prognosis after endodontic treatment of an apical lesion is assessed by bone regeneration in the apical region. Fractal analysis was attempted to overcome the limits of subjective reading; as a result, the change of the bone during the healing process could be detected objectively and quantitatively.
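
    The fractal dimensions above come from an analysis of radiographic texture. As a hedged illustration, the following is a generic box-counting estimator, not the Scion Image pipeline used in the study, and the synthetic input merely stands in for a region of interest; the dimension is read off as the slope of log(box count) against log(1/box size):

        import numpy as np

        def fractal_dimension(patch):
            """patch: square 2-D boolean array (True = structure pixels)."""
            n = patch.shape[0]
            sizes, counts = [], []
            size = n // 2
            while size >= 2:
                count = 0
                for i in range(0, n, size):
                    for j in range(0, n, size):
                        if patch[i:i + size, j:j + size].any():
                            count += 1          # box contains structure
                sizes.append(size)
                counts.append(count)
                size //= 2
            # slope of log(count) vs log(1/size) is the box-counting dimension
            return np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)[0]

        rng = np.random.default_rng(0)
        demo = rng.random((64, 64)) > 0.5        # stand-in for a trabecular ROI
        print(round(fractal_dimension(demo), 3))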

  4. Analysis of Work Design in Rubber Processing Plant

    Directory of Open Access Journals (Sweden)

    Wahyuni Dini

    2018-01-01

    Full Text Available Work design illustrates how jobs, tasks, and roles are structured, defined and modified, and what impact they have on individuals, groups, and organizations. If the work is not designed well, the company must pay greater costs for workers' health, longer production processes or even penalties for not being able to meet the delivery schedule. This is visible in the conditions at a rubber processing factory in North Sumatra. Work design aspects such as layout, machinery and equipment, the workers' physical working environment, work methods, and organizational policies have not been well organized. Machines that grind coagulum into sheets are often damaged, which delayed product delivery four times in 2016; workers have submitted complaints about heat exposure; and workstations have not been properly arranged: all indications of the need for work design. The research data were collected through field observation and the distribution of questionnaires on aspects of work design. The results of the analysis are based on the respondents' answers to the distributed questionnaire regarding the six aspects studied.

  5. Numerical Analysis of Heat Transfer During Quenching Process

    Science.gov (United States)

    Madireddi, Sowjanya; Krishnan, Krishnan Nambudiripad; Reddy, Ammana Satyanarayana

    2018-04-01

    A numerical model is developed to simulate the immersion quenching process of metals. The time of quench plays an important role if the process involves a defined step-quenching schedule to obtain the desired characteristics. The lumped heat capacity analysis used for this purpose requires the value of the heat transfer coefficient, whose evaluation requires large amounts of experimental data. Experimentation on a sample workpiece may not represent the actual component, which may vary in dimension. A fluid-structure interaction technique with a coupled interface between the solid (metal) and liquid (quenchant) is used for the simulations. Initial times of quenching show a boiling heat transfer phenomenon with high values of heat transfer coefficients (5000-2.5 × 10⁵ W/m²K). The shape of the workpiece, for equal dimensions, shows little influence on the cooling rate. Non-uniformity in hardness at the sharp corners can be reduced by rounding off the edges. For a square piece of 20 mm thickness, with a 3 mm fillet radius, this difference is reduced by 73%. The model can be used for any metal-quenchant combination to obtain time-temperature data without the necessity of experimentation.
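
    The lumped-heat-capacity analysis mentioned above reduces the quench to a single first-order cooling law once a heat transfer coefficient h is assumed. A minimal sketch follows, with illustrative steel-like properties and a 20 mm cube; h is chosen at the low end of the range quoted above, and none of the numbers are the paper's results:

        import numpy as np

        # Lumped capacitance: dT/dt = -h*A*(T - T_inf) / (rho*c*V)
        rho, c = 7800.0, 490.0            # density (kg/m^3), heat capacity (J/kgK)
        V, A = 0.02 ** 3, 6 * 0.02 ** 2   # 20 mm cube: volume and surface area
        h = 5000.0                        # W/m^2K, assumed constant here
        T_inf, T0 = 300.0, 1100.0         # quenchant and initial temperatures (K)

        tau = rho * c * V / (h * A)       # time constant of the exponential decay
        for t in np.linspace(0.0, 15.0, 6):
            T = T_inf + (T0 - T_inf) * np.exp(-t / tau)
            print(f"t = {t:5.1f} s   T = {T:7.1f} K")

    In a real quench h varies strongly with surface temperature (film boiling, nucleate boiling, convection), which is exactly why the paper resorts to a fluid-structure interaction simulation instead of a constant-h curve like this one.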

  6. Analysis of the Internationalization Process of a Venezuelan Company

    Directory of Open Access Journals (Sweden)

    Rosângela Sarmento da Silva

    2013-04-01

    Full Text Available This paper aimed to analyze, from the perspectives of economic and organizational behavior theories, the case of the internationalization of Agrogilca, a Venezuelan company classified as a small and medium-sized business, which operates in the mining, agricultural materials and building materials segments. This is an instrumental case study, which had as sources of evidence the examination of documents provided directly by the company's directors and interviews with its owner and employees. For the data analysis and interpretation, the procedure of adaptation to theory was followed, since the data obtained empirically were contrasted with elements of already consolidated theories about the internationalization of companies. It is concluded that the strategy adopted by Agrogilca is very close to the Theory of Market Power proposed by Hymer, in which companies tend to intensify their position abroad and expand their operations. Its internationalization process did not occur according to the Uppsala model, in which the gradualism of relationships matches the gradualism of internationalization processes, but only through exports that make the company intensify its position abroad. Finally, it shows proximity to the internationalization strategy of the Multidomestic Theory of Hoskisson, because this theory considers as causes of internationalization the sectoral conditions, political structures and customer needs that demand greater administrative performance and competitiveness from the company.

  7. An Accident Precursor Analysis Process Tailored for NASA Space Systems

    Science.gov (United States)

    Groen, Frank; Stamatelatos, Michael; Dezfuli, Homayoon; Maggio, Gaspare

    2010-01-01

    Accident Precursor Analysis (APA) serves as the bridge between existing risk modeling activities, which are often based on historical or generic failure statistics, and system anomalies, which provide crucial information about the failure mechanisms that are actually operative in the system and which may differ in frequency or type from those in the various models. These discrepancies between the models (perceived risk) and the system (actual risk) provide the leading indication of an underappreciated risk. This paper presents an APA process developed specifically for NASA Earth-to-Orbit space systems. The purpose of the process is to identify and characterize potential sources of system risk as evidenced by anomalous events which, although not necessarily presenting an immediate safety impact, may indicate that an unknown or insufficiently understood risk-significant condition exists in the system. Such anomalous events are considered accident precursors because they signal the potential for severe consequences that may occur in the future, due to causes that are discernible from their occurrence today. Their early identification allows them to be integrated into the overall system risk model used to inform decisions relating to safety.

  8. Analysis of Work Design in Rubber Processing Plant

    Science.gov (United States)

    Wahyuni, Dini; Nasution, Harmein; Budiman, Irwan; Wijaya, Khairini

    2018-02-01

    Work design illustrates how jobs, tasks, and roles are structured, defined and modified, and what impact they have on individuals, groups, and organizations. If the work is not designed well, the company must pay greater costs for workers' health, longer production processes or even penalties for not being able to meet the delivery schedule. This is visible in the conditions at a rubber processing factory in North Sumatra. Work design aspects such as layout, machinery and equipment, the workers' physical working environment, work methods, and organizational policies have not been well organized. Machines that grind coagulum into sheets are often damaged, which delayed product delivery four times in 2016; workers have submitted complaints about heat exposure; and workstations have not been properly arranged: all indications of the need for work design. The research data were collected through field observation and the distribution of questionnaires on aspects of work design. The results of the analysis are based on the respondents' answers to the distributed questionnaire regarding the six aspects studied.

  9. Sensitivity analysis on parameters and processes affecting vapor intrusion risk

    KAUST Repository

    Picone, Sara

    2012-03-30

    A one-dimensional numerical model was developed and used to identify the key processes controlling vapor intrusion risks by means of a sensitivity analysis. The model simulates the fate of a dissolved volatile organic compound present below the ventilated crawl space of a house. In contrast to the vast majority of previous studies, this model accounts for vertical variation of soil water saturation and includes aerobic biodegradation. The attenuation factor (ratio between concentration in the crawl space and source concentration) and the characteristic time to approach maximum concentrations were calculated and compared for a variety of scenarios. These concepts allow an understanding of controlling mechanisms and aid in the identification of critical parameters to be collected for field situations. The relative distance of the source to the nearest gas-filled pores of the unsaturated zone is the most critical parameter because diffusive contaminant transport is significantly slower in water-filled pores than in gas-filled pores. Therefore, attenuation factors decrease and characteristic times increase with increasing relative distance of the contaminant dissolved source to the nearest gas diffusion front. Aerobic biodegradation may decrease the attenuation factor by up to three orders of magnitude. Moreover, the occurrence of water table oscillations is of importance. Dynamic processes leading to a retreating water table increase the attenuation factor by two orders of magnitude because of the enhanced gas phase diffusion. © 2012 SETAC.
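
    The abstract's key mechanism, much slower diffusion through water-filled than gas-filled pores, can be sanity-checked with a back-of-the-envelope estimate. This sketch assumes the Millington-Quirk tortuosity model and typical literature diffusivities for a volatile organic compound; neither the numbers nor the model choice come from the paper itself:

        # Millington-Quirk effective diffusivity for one phase of a porous soil.
        def d_eff(d_free, theta_phase, porosity):
            return d_free * theta_phase ** (10.0 / 3.0) / porosity ** 2

        porosity = 0.35
        d_gas = d_eff(8.0e-6, 0.25, porosity)    # VOC in mostly gas-filled pores
        d_wat = d_eff(9.0e-10, 0.35, porosity)   # VOC in water-saturated pores

        L = 1.0  # metres of soil between dissolved source and crawl space (assumed)
        for name, d in [("gas-filled", d_gas), ("water-filled", d_wat)]:
            tau_days = L ** 2 / d / 86400.0      # characteristic diffusion time
            print(f"{name:12s} D_eff = {d:.2e} m^2/s   t ~ L^2/D = {tau_days:,.0f} days")

    The several orders of magnitude between the two characteristic times illustrate why the distance from the source to the nearest gas-filled pores dominates the attenuation factor.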

  10. Radionuclides in Bayer process residues: previous analysis for radiological protection

    International Nuclear Information System (INIS)

    Cuccia, Valeria; Rocha, Zildete; Oliveira, Arno H. de

    2011-01-01

    Naturally occurring radionuclides are present in many natural resources. Human activities may enhance the concentrations of radionuclides and/or the potential of exposure to naturally occurring radioactive material (NORM). Industrial residues containing radionuclides have been receiving considerable global attention, because of the large amounts of NORM-containing wastes and the potential long-term risks of long-lived radionuclides. As part of this global concern, this work focuses on the characterization of radioactivity in the main residues of the Bayer process for alumina production: red mud and sand samples. Usually, the residues of the Bayer process are named red mud in their totality. However, in the industry where the samples were collected, there is an additional residue separation: sand and red mud. The analytical techniques used were gamma spectrometry (HPGe detector) and neutron activation analysis. The concentrations of radionuclides are higher in the red mud than in the sand. These solid residues present enhanced activity concentrations when compared to bauxite. Further uses of the residues as building material must be evaluated more thoroughly from the radiological point of view, due to their potential for enhancing radiological exposure, especially through radon emission. (author)

  11. Strategies for improving teacher competence with an Analytic Hierarchy Process approach

    Directory of Open Access Journals (Sweden)

    Reni Daharti

    2015-07-01

    Full Text Available A teacher, as an educator, is an important component in the educational process. This study aims to (1) analyze the teacher profile of SLTP Komwil 05 Kabupaten Tegal, (2) analyze the policy priorities in improving the competence of teachers in the study area, and (3) determine the strategies for enhancing the competence of teachers through the policy priorities that can be applied in the study area. The respondents were 33 junior high school teachers of SLTP Komwil 05 Kabupaten Tegal, selected by simple random sampling; 15 additional people served as key persons. Descriptive statistics and the Analytic Hierarchy Process were used to analyze the data. The results show that the teachers' pedagogical and professional competences are moderate, while their personality and social competences are high. What should be improved is teacher competence. The main priorities for improving teacher competence in Kabupaten Tegal are (1) selecting teacher candidates for morality, (2) screening teachers for quality, and (3) sending teachers to various trainings to strengthen their character.
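
    The Analytic Hierarchy Process step named above derives priority weights from a pairwise comparison matrix. A minimal sketch follows, with an illustrative 3 x 3 comparison matrix rather than the study's judgments; it computes the principal-eigenvector weights and Saaty's consistency ratio:

        import numpy as np

        # Pairwise comparisons of three policy options (illustrative values).
        A = np.array([[1.0,   3.0, 5.0],
                      [1/3.,  1.0, 2.0],
                      [1/5.,  1/2., 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)              # principal eigenvalue
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                             # normalized priority weights

        n = A.shape[0]
        ci = (eigvals.real[k] - n) / (n - 1)     # consistency index
        ri = 0.58                                # Saaty's random index for n = 3
        print("weights:", np.round(w, 3), "  CR:", round(ci / ri, 3))

    A consistency ratio below about 0.1 is the usual threshold for accepting the judgments as coherent.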

  12. A signal processing analysis of Purkinje cells in vitro

    Directory of Open Access Journals (Sweden)

    Ze'ev R Abrams

    2010-05-01

    Full Text Available Cerebellar Purkinje cells in vitro fire recurrent sequences of Sodium and Calcium spikes. Here, we analyze the Purkinje cell using harmonic analysis, and our experiments reveal that its output signal is comprised of three distinct frequency bands, which are combined using Amplitude and Frequency Modulation (AM/FM). We find that the three characteristic frequencies (Sodium, Calcium and Switching) occur in various combinations in all waveforms observed using whole-cell current clamp recordings. We found that the Calcium frequency can display a doubling of its frequency mode, and the Switching frequency can act as a possible generator of the pauses that are typically seen in Purkinje output recordings. Using a reversibly photo-switchable kainate receptor agonist, we demonstrate the external modulation of the Calcium and Switching frequencies. These experiments and Fourier analysis suggest that the Purkinje cell can be understood as a harmonic signal oscillator, enabling a higher level of interpretation of Purkinje signaling based on modern signal processing techniques.
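
    The harmonic-analysis viewpoint can be illustrated with a toy signal: a fast carrier amplitude-modulated by two slower components, loosely mirroring the three bands described above. All frequencies here are invented for illustration, not the cell's measured bands; an FFT then recovers the carrier and its AM sidebands:

        import numpy as np

        fs = 1000.0                                  # sampling rate (Hz)
        t = np.arange(0.0, 2.0, 1.0 / fs)
        f_fast, f_mid, f_slow = 80.0, 8.0, 0.5       # three illustrative bands
        signal = ((1 + 0.5 * np.sin(2 * np.pi * f_slow * t))
                  * (1 + 0.5 * np.sin(2 * np.pi * f_mid * t))
                  * np.sin(2 * np.pi * f_fast * t))  # double amplitude modulation

        spec = np.abs(np.fft.rfft(signal))
        freqs = np.fft.rfftfreq(len(t), 1.0 / fs)
        peaks = freqs[np.argsort(spec)[-5:]]         # five strongest components
        print(sorted(np.round(peaks, 2)))            # carrier plus AM sidebands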

  13. Seeking a fingerprint: analysis of point processes in actigraphy recording

    Science.gov (United States)

    Gudowska-Nowak, Ewa; Ochab, Jeremi K.; Oleś, Katarzyna; Beldzik, Ewa; Chialvo, Dante R.; Domagalik, Aleksandra; Fąfrowicz, Magdalena; Marek, Tadeusz; Nowak, Maciej A.; Ogińska, Halszka; Szwed, Jerzy; Tyburczyk, Jacek

    2016-05-01

    Motor activity of humans displays complex temporal fluctuations which can be characterised by scale-invariant statistics, thus demonstrating that structure and fluctuations of such kinetics remain similar over a broad range of time scales. Previous studies on humans regularly deprived of sleep or suffering from sleep disorders predicted a change in the invariant scale parameters with respect to those for healthy subjects. In this study we investigate the signal patterns from actigraphy recordings by means of characteristic measures of fractional point processes. We analyse spontaneous locomotor activity of healthy individuals recorded during a week of regular sleep and a week of chronic partial sleep deprivation. Behavioural symptoms of lack of sleep can be evaluated by analysing statistics of duration times during active and resting states, and alteration of behavioural organisation can be assessed by analysis of power laws detected in the event count distribution, distribution of waiting times between consecutive movements and detrended fluctuation analysis of recorded time series. We claim that among different measures characterising complexity of the actigraphy recordings and their variations implied by chronic sleep distress, the exponents characterising slopes of survival functions in resting states are the most effective biomarkers distinguishing between healthy and sleep-deprived groups.
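
    One of the measures named above, detrended fluctuation analysis, admits a compact implementation: integrate the series, detrend it window by window, and read the scaling exponent off a log-log fit. A minimal sketch on synthetic white noise (for which the exponent should come out near 0.5; this is not the study's actigraphy pipeline):

        import numpy as np

        def dfa(x, scales):
            y = np.cumsum(x - np.mean(x))            # integrated profile
            flucts = []
            for s in scales:
                rms = []
                for i in range(len(y) // s):
                    seg = y[i * s:(i + 1) * s]
                    t = np.arange(s)
                    trend = np.polyval(np.polyfit(t, seg, 1), t)
                    rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
                flucts.append(np.mean(rms))
            # scaling exponent alpha: slope of log F(s) versus log s
            return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

        rng = np.random.default_rng(1)
        print(round(dfa(rng.standard_normal(10000), [16, 32, 64, 128, 256]), 2))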

  14. Introducing uncertainty analysis of nucleation and crystal growth models in Process Analytical Technology (PAT) system design of crystallization processes

    DEFF Research Database (Denmark)

    Abdul Samad, Noor Asma Fazli Bin; Sin, Gürkan; Gernaey, Krist

    2013-01-01

    This paper presents the application of uncertainty and sensitivity analysis as part of a systematic modelbased process monitoring and control (PAT) system design framework for crystallization processes. For the uncertainty analysis, the Monte Carlo procedure is used to propagate input uncertainty...

  15. Analysis of the medication reconciliation process conducted at hospital admission

    Directory of Open Access Journals (Sweden)

    María Beatriz Contreras Rey

    2016-07-01

    Full Text Available Objective: To analyze the outcomes of a medication reconciliation process at admission in the hospital setting, and to assess the role of the Pharmacist in detecting reconciliation errors and preventing any adverse events entailed. Method: A retrospective study was conducted to analyze the medication reconciliation activity during the previous six months. The study included those patients for whom an apparently unjustified discrepancy was detected at admission, after comparing the hospital medication prescribed with the home treatment stated in their clinical hospital records. Those patients for whom the physician ordered the introduction of home medication without any specification were also considered. In order to conduct the reconciliation process, the Pharmacist prepared the best pharmacotherapeutical history possible, reviewing all available information about the medication the patient could be taking before admission, and completing the process with a clinical interview. The discrepancies requiring clarification were reported to the physician. It was considered that the reconciliation proposal had been accepted if the relevant modification was made in the next visit of the physician, or within 24-48 hours at most; such a case was then labeled as a reconciliation error. For the descriptive analysis, the Statistics® SPSS program, version 17.0, was used. Outcomes: 494 medications were reconciled in 220 patients, with a mean of 2.25 medications per patient. More than half of the patients (59.5%) had some discrepancy that required clarification; the most frequent was the omission of a medication that the patient was taking before admission (86.2%), followed by an unjustified modification in dosing or way of administration (5.9%). In total, 312 discrepancies required clarification; of these, 93 (29.8%) were accepted and considered reconciliation errors, 126 (40%) were not accepted, and in 93 cases (29.8%) acceptance was not relevant due to a change in

  16. A meta-analysis of motivational interviewing process: Technical, relational, and conditional process models of change.

    Science.gov (United States)

    Magill, Molly; Apodaca, Timothy R; Borsari, Brian; Gaume, Jacques; Hoadley, Ariel; Gordon, Rebecca E F; Tonigan, J Scott; Moyers, Theresa

    2018-02-01

    In the present meta-analysis, we test the technical and relational hypotheses of Motivational Interviewing (MI) efficacy. We also propose an a priori conditional process model in which heterogeneity of technical path effect sizes should be explained by interpersonal/relational (i.e., empathy, MI Spirit) and intrapersonal (i.e., client treatment seeking status) moderators. A systematic review identified k = 58 reports, describing 36 primary studies and 40 effect sizes (N = 3,025 participants). Statistical methods calculated the inverse variance-weighted pooled correlation coefficient for the therapist-to-client and the client-to-outcome paths across multiple target behaviors (i.e., alcohol use, other drug use, other behavior change). Therapist MI-consistent skills were correlated with more client change talk (r = .55), and the technical hypothesis was supported. Specifically, proportion MI consistency was related to higher proportion change talk (r = .11, p = .004), and higher proportion change talk was related to reductions in risk behavior at follow-up (r = -.16). Heterogeneity of the technical hypothesis path effect sizes was partially explained by inter- and intrapersonal moderators. This meta-analysis provides additional support for the technical hypothesis of MI efficacy; future research on the relational hypothesis should occur in the field rather than in the context of clinical trials. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
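
    The pooling strategy named in the abstract, an inverse variance-weighted correlation, is usually computed on Fisher z-transformed effect sizes. A hedged sketch with made-up study correlations and sample sizes, not the meta-analysis data:

        import numpy as np

        r = np.array([0.55, 0.40, 0.62, 0.48])       # per-study correlations
        n = np.array([60, 120, 45, 80])              # per-study sample sizes

        z = np.arctanh(r)                            # Fisher z-transform
        w = n - 3.0                                  # inverse variance weights (var = 1/(n-3))
        z_pooled = np.sum(w * z) / np.sum(w)
        se = 1.0 / np.sqrt(np.sum(w))                # standard error of pooled z
        lo, hi = z_pooled - 1.96 * se, z_pooled + 1.96 * se
        print("pooled r = %.3f  95%% CI [%.3f, %.3f]"
              % (np.tanh(z_pooled), np.tanh(lo), np.tanh(hi)))

    Back-transforming with tanh returns the pooled estimate and its confidence interval to the correlation scale.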

  17. Process-aware information systems : design, enactment and analysis

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Wah, B.W.

    2009-01-01

    Process-aware information systems support operational business processes by combining advances in information technology with recent insights from management science. Workflow management systems are typical examples of such systems. However, many other types of information systems are also "process

  18. HYBRID SULFUR PROCESS REFERENCE DESIGN AND COST ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Gorensek, M.; Summers, W.; Boltrunis, C.; Lahoda, E.; Allen, D.; Greyvenstein, R.

    2009-05-12

    This report documents a detailed study to determine the expected efficiency and product costs for producing hydrogen via water-splitting using energy from an advanced nuclear reactor. It was determined that the overall efficiency from nuclear heat to hydrogen is high, and the cost of hydrogen is competitive under a high energy cost scenario. It would require over 40% more nuclear energy to generate an equivalent amount of hydrogen using conventional water-cooled nuclear reactors combined with water electrolysis compared to the proposed plant design described herein. There is a great deal of interest worldwide in reducing dependence on fossil fuels, while also minimizing the impact of the energy sector on global climate change. One potential opportunity to contribute to this effort is to replace the use of fossil fuels for hydrogen production by the use of water-splitting powered by nuclear energy. Hydrogen production is required for fertilizer (e.g. ammonia) production, oil refining, synfuels production, and other important industrial applications. It is typically produced by reacting natural gas, naphtha or coal with steam, which consumes significant amounts of energy and produces carbon dioxide as a byproduct. In the future, hydrogen could also be used as a transportation fuel, replacing petroleum. New processes are being developed that would permit hydrogen to be produced from water using only heat or a combination of heat and electricity produced by advanced, high temperature nuclear reactors. The U.S. Department of Energy (DOE) is developing these processes under a program known as the Nuclear Hydrogen Initiative (NHI). The Republic of South Africa (RSA) also is interested in developing advanced high temperature nuclear reactors and related chemical processes that could produce hydrogen fuel via water-splitting. This report focuses on the analysis of a nuclear hydrogen production system that combines the Pebble Bed Modular Reactor (PBMR), under development by

  19. Focus on Materials Analysis and Processing in Magnetic Fields

    Directory of Open Access Journals (Sweden)

    Yoshio Sakka, Noriyuki Hirota, Shigeru Horii and Tsutomu Ando

    2009-01-01

    Full Text Available Recently, interest in the applications of feeble (diamagnetic and paramagnetic) magnetic materials has grown, whereas the popularity of ferromagnetic materials remains steady and high. This trend is due to the progress of superconducting magnet technology, particularly liquid-helium-free superconducting magnets that can generate magnetic fields of 10 T and higher. As the magnetic energy is proportional to the square of the applied magnetic field, the magnetic energy of such 10 T magnets is in excess of 10 000 times that of conventional 0.1 T permanent magnets. Consequently, many interesting phenomena have been observed over the last decade, such as the Moses effect, magnetic levitation and the alignment of feeble magnetic materials. Researchers in this area are widely spread around the world, but their number in Japan is relatively high, which might explain the success of magnetic field science and technology in Japan. Processing in magnetic fields is a rapidly expanding research area with a wide range of promising applications in materials science. The 3rd International Workshop on Materials Analysis and Processing in Magnetic Fields (MAP3), which was held on 14-16 May 2008 at the University of Tokyo, Japan, focused on various topics including magnetic field effects on chemical, physical, biological, electrochemical, thermodynamic and hydrodynamic phenomena; magnetic field effects on the crystal growth and processing of materials; diamagnetic levitation, the magneto-Archimedes effect, spin chemistry, magnetic orientation, control of structure by magnetic fields, magnetic separation and purification, magnetic-field-induced phase transitions, properties of materials in high magnetic fields, the development of NMR and MRI, medical applications of magnetic fields, novel magnetic phenomena, physical property measurement by magnetic fields, and the generation of high magnetic fields. This focus issue compiles 13 key papers selected from the proceedings

  20. Genotype-Specific Measles Transmissibility: A Branching Process Analysis.

    Science.gov (United States)

    Ackley, Sarah F; Hacker, Jill K; Enanoria, Wayne T A; Worden, Lee; Blumberg, Seth; Porco, Travis C; Zipprich, Jennifer

    2018-04-03

    Substantial heterogeneity in measles outbreak sizes may be due to genotype-specific transmissibility. Using a branching process analysis, we characterize differences in measles transmission by estimating the association between genotype and the reproduction number R among postelimination California measles cases during 2000-2015 (400 cases, 165 outbreaks). Assuming a negative binomial secondary case distribution, we fit a branching process model to the distribution of outbreak sizes using maximum likelihood and estimated the reproduction number R for a multigenotype model. Genotype B3 is found to be significantly more transmissible than other genotypes (P = .01) with an R of 0.64 (95% confidence interval [CI], .48-.71), while the R for all other genotypes combined is 0.43 (95% CI, .28-.54). This result is robust to excluding the 2014-2015 outbreak linked to Disneyland theme parks (referred to as "outbreak A" for conciseness and clarity) (P = .04) and modeling genotype as a random effect (P = .004 including outbreak A and P = .02 excluding outbreak A). This result was not accounted for by season of introduction, age of index case, or vaccination of the index case. The R for outbreaks with a school-aged index case is 0.69 (95% CI, .52-.78), while the R for outbreaks with a non-school-aged index case is 0.28 (95% CI, .19-.35), but this cannot account for differences between genotypes. Variability in measles transmissibility may have important implications for measles control; the vaccination threshold required for elimination may not be the same for all genotypes or age groups.
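
    Fitting R by maximum likelihood to final outbreak sizes relies on the outbreak-size distribution of a branching process with negative binomial offspring. The sketch below uses the standard closed form of that distribution (in the form popularized by Blumberg and Lloyd-Smith), with invented outbreak sizes and a fixed dispersion k; it illustrates the method and is not a reproduction of the California analysis:

        import numpy as np
        from scipy.special import gammaln
        from scipy.optimize import minimize_scalar

        # Log-likelihood of final outbreak sizes j for a subcritical branching
        # process with negative binomial offspring (reproduction number R,
        # dispersion k):  P(Y=j) = G(kj+j-1)/(G(kj)G(j+1)) (R/k)^(j-1)/(1+R/k)^(kj+j-1)
        def log_lik(R, sizes, k=0.5):
            j = np.asarray(sizes, dtype=float)
            return np.sum(gammaln(k*j + j - 1) - gammaln(k*j) - gammaln(j + 1)
                          + (j - 1) * np.log(R / k)
                          - (k*j + j - 1) * np.log(1 + R / k))

        # Invented outbreak-size data: many singletons, a few larger chains.
        sizes = [1] * 120 + [2] * 25 + [3] * 10 + [5] * 6 + [10] * 3 + [20]
        best = minimize_scalar(lambda R: -log_lik(R, sizes),
                               bounds=(0.01, 0.99), method="bounded")
        print("R_hat = %.2f" % best.x)

    A genotype comparison like the paper's would fit separate R values to the outbreak sizes of each genotype and compare the likelihoods.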

  1. Design of process displays based on risk analysis techniques

    International Nuclear Information System (INIS)

    Lundtang Paulsen, J.

    2004-05-01

    This thesis deals with the problems of designing display systems for process plants. We state the reasons why it is important to discuss information systems for operators in a control room, especially in view of the enormous amount of information available in computer-based supervision systems. The state of the art is discussed: How are supervision systems designed today, and why? Which strategies are used? What kind of research is going on? Four different plants and their display systems, designed by the author, are described and discussed. Next we outline different methods for eliciting knowledge of a plant, particularly its risks, which is necessary information for the display designer. A chapter presents an overview of the various types of operation references: constitutive equations, set points, design parameters, component characteristics etc., and their validity in different situations. On the basis of her experience with the design of display systems, with risk analysis methods, and from 8 years as an engineer-on-shift at a research reactor, the author developed a method to elicit the necessary information for the operator. The method, a combination of a Goal-Tree and a Fault-Tree, is described in some detail. Finally we address the problem of where to put the dots and the lines: when all information is on the table, how should it be presented most adequately? Included as an appendix is a paper concerning the analysis of maintenance reports and the visualization of their information; its purpose was to develop a software tool for maintenance supervision of components in a nuclear power plant. (au)

  2. An Analysis of Frame Semantics of Continuous Processes

    Science.gov (United States)

    2016-08-10

    continuous processes. Prior research mapped qualitative process elements onto English language constructions, but did not connect the… processes, and lays groundwork for systems that learn from and reason with natural language (McFate, Forbus, & Hinrichs, 2014). Kuehne (2004) developed… Qualitative process (QP) theory provides a formal language for representing mental models of continuous systems. QP theory is domain general and

  3. Analysis of the Education Program Approval Process: A Program Evaluation.

    Science.gov (United States)

    Fountaine, Charles A.; And Others

    A study of the education program approval process involving the Veterans Administration (VA) and the State Approving Agencies (SAAs) had the following objectives: to describe the present education program approval process; to determine time and costs associated with the education program approval process; to describe the approval process at…

  4. Statistical Analysis of CMC Constituent and Processing Data

    Science.gov (United States)

    Fornuff, Jonathan

    2004-01-01

    observed using statistical analysis software. The ultimate purpose of this study is to determine which variations in material processing can lead to the most critical changes in the material's properties. The work I have taken part in this summer explores, in general, the key properties needed. In this study, SiC/SiC composites of varying architectures, utilizing a boron-nitride (BN)

  5. Cost Risk Analysis Based on Perception of the Engineering Process

    Science.gov (United States)

    Dean, Edwin B.; Wood, Darrell A.; Moore, Arlene A.; Bogart, Edward H.

    1986-01-01

    Process parameters are elicited from the engineer/expert on the project and are based on that expert's technical knowledge. These are converted by a parametric cost model into a cost estimate. The method discussed makes no assumptions about the underlying distribution of possible costs, and is not tied to the analysis of previous projects, except through the expert calibrations performed by the parametric cost analyst.

  6. GOCI Level-2 Processing Improvements and Cloud Motion Analysis

    Science.gov (United States)

    Robinson, Wayne D.

    2015-01-01

    The Ocean Biology Processing Group has been working with the Korean Institute of Ocean Science and Technology (KIOST) to process geosynchronous ocean color data from the GOCI (Geostationary Ocean Color Instrument) aboard the COMS (Communications, Ocean and Meteorological Satellite). The level-2 processing program, l2gen, has GOCI processing as an option. Improvements made to that processing are discussed here, along with a discussion of cloud motion effects.

  7. Irreversibility analysis in the process of solar distillation

    International Nuclear Information System (INIS)

    Chávez, S; Terres, H; Lizardi, A; López, R; Lara, A

    2017-01-01

    In this work, an irreversibility analysis of the thermal process of solar distillation of three different substances is presented. A sloped solar still was employed, in which three experimental tests with 5.5 L of brine, river water and MgCl₂ were performed. Temperature data, principally for the glass cover, absorber plate, fluid and environment, as well as the incident solar radiation on the device, were obtained. From the measurements of temperature and solar radiation and an exergy balance, the irreversibilities in the device were found. The results show that the highest values of irreversibility are concentrated in the absorber plate, with averages of 321 W, 342 W and 276 W, followed by the cover glass, with averages of 75.8 W, 80.4 W and 86.7 W, and finally the fluid, with 15.3 W, 15.9 W and 16 W, for 5.5 L of brine, river water and MgCl₂, respectively. (paper)

  8. Image Post-Processing and Analysis. Chapter 17

    Energy Technology Data Exchange (ETDEWEB)

    Yushkevich, P. A. [University of Pennsylvania, Philadelphia (United States)

    2014-09-15

    For decades, scientists have used computers to enhance and analyse medical images. At first, they developed simple computer algorithms to enhance the appearance of interesting features in images, helping humans read and interpret them better. Later, they created more advanced algorithms, where the computer would not only enhance images but also participate in facilitating understanding of their content. Segmentation algorithms were developed to detect and extract specific anatomical objects in images, such as malignant lesions in mammograms. Registration algorithms were developed to align images of different modalities and to find corresponding anatomical locations in images from different subjects. These algorithms have made computer aided detection and diagnosis, computer guided surgery and other highly complex medical technologies possible. Nowadays, the field of image processing and analysis is a complex branch of science that lies at the intersection of applied mathematics, computer science, physics, statistics and biomedical sciences. This chapter will give a general overview of the most common problems in this field and the algorithms that address them.

  9. Field Experiments Aimed To The Analysis of Flood Generation Processes

    Science.gov (United States)

    Carriero, D.; Iacobellis, V.; Oliveto, G.; Romano, N.; Telesca, V.; Fiorentino, M.

    The study of the soil moisture dynamics and of the climate-soil-vegetation interaction is essential for the comprehension of possible climatic change phenomena, as well as for the analysis of the occurrence of extreme hydrological events. Along these lines, the theoretically-based distribution of floods recently derived by Fiorentino and Iacobellis ["New insights about the climatic and geologic control on the probability distribution of floods", Water Resources Research, 2001, 37: 721-730] demonstrated, by an application in some Southern Italy basins, that processes at the hillslope scale strongly influence the basin response by means of the different mechanisms of runoff generation produced by various distributions of partial contributing area. This area is considered a stochastic variable whose pdf position parameter showed a strong dependence on the climate, as can be seen in the behavior of the studied basins: in dry zones, where the infiltration excess (Horton) mechanism prevails, the basin water loss parameter decreases as the basin area increases and the flood peak source area depends on the permeability of soils; in humid zones, with the prevalence of the saturation excess (Dunne) process, the loss parameter seems independent of the basin area and very sensitive to a simple climatic index, while only a small portion of the area covered by the storm contributes to floods. The purpose of this work is to investigate the consistency of those interpretations by means of field experiments at the hillslope scale, to establish a parameterization accounting for soil physical and hydraulic properties, vegetation characteristics and land use. The research site is the catchment of River Fiumarella di Corleto, which is located in Basilicata Region, Italy, and has a drainage area of approximately 32 km2. The environment has a rather dynamic geomorphology and very interesting features from the soil-landscape modeling viewpoint [Santini A., A. Coppola, N. Romano, and

  10. Analysis of the Requirements Generation Process for the Logistics Analysis and Wargame Support Tool

    Science.gov (United States)

    2017-06-01

    The requirements impact everything from strategic logistics operations down to the energy demands at the company level. The analysis also looks at the force structure of the… One numbered requirement reads: "34. The system shall determine the efficiency of the logistics network with respect to an estimated cost of fuel used to deliver…" (Analysis of the Requirements Generation Process for the Logistics Analysis and Wargame Support Tool, by Jonathan M. Swan, June 2017.)

  11. FOREWORD: Focus on Materials Analysis and Processing in Magnetic Fields Focus on Materials Analysis and Processing in Magnetic Fields

    Science.gov (United States)

    Sakka, Yoshio; Hirota, Noriyuki; Horii, Shigeru; Ando, Tsutomu

    2009-03-01

    Recently, interest in the applications of feeble (diamagnetic and paramagnetic) magnetic materials has grown, whereas the popularity of ferromagnetic materials remains steady and high. This trend is due to the progress of superconducting magnet technology, particularly liquid-helium-free superconducting magnets that can generate magnetic fields of 10 T and higher. As the magnetic energy is proportional to the square of the applied magnetic field, the magnetic energy of such 10 T magnets is in excess of 10 000 times that of conventional 0.1 T permanent magnets. Consequently, many interesting phenomena have been observed over the last decade, such as the Moses effect, magnetic levitation and the alignment of feeble magnetic materials. Researchers in this area are widely spread around the world, but their number in Japan is relatively high, which might explain the success of magnetic field science and technology in Japan. Processing in magnetic fields is a rapidly expanding research area with a wide range of promising applications in materials science. The 3rd International Workshop on Materials Analysis and Processing in Magnetic Fields (MAP3), which was held on 14-16 May 2008 at the University of Tokyo, Japan, focused on various topics including magnetic field effects on chemical, physical, biological, electrochemical, thermodynamic and hydrodynamic phenomena; magnetic field effects on the crystal growth and processing of materials; diamagnetic levitation, the magneto-Archimedes effect, spin chemistry, magnetic orientation, control of structure by magnetic fields, magnetic separation and purification, magnetic-field-induced phase transitions, properties of materials in high magnetic fields, the development of NMR and MRI, medical applications of magnetic fields, novel magnetic phenomena, physical property measurement by magnetic fields, and the generation of high magnetic fields. This focus issue compiles 13 key papers selected from the proceedings of MAP3. Other

  12. Class A Network Dataring gauges - 1991 data processing and analysis

    OpenAIRE

    Shaw, S.M.

    1992-01-01

    This report presents a summary of still water level data processing for 1991 from 34 modernised dataring sites around the UK coast. Details of geographic position, reference levels, processing, statistics and analyses are included.

  13. An Analysis of the Credit Card Program Using Process Innovation

    National Research Council Canada - National Science Library

    Braney, Ronald

    1999-01-01

    This goes a long way toward improving and streamlining the contracting process. One of the key reform initiatives in streamlining the process is the implementation of the Government-wide credit card program…

  14. Process Knowledge Discovery Using Sparse Principal Component Analysis

    DEFF Research Database (Denmark)

    Gao, Huihui; Gajjar, Shriram; Kulahci, Murat

    2016-01-01

    As the goals of ensuring process safety and energy efficiency become ever more challenging, engineers increasingly rely on data collected from such processes for informed decision making. During recent decades, extracting and interpreting valuable process information from large historical data sets… SPCA approach that helps uncover the underlying process knowledge regarding variable relations. This approach systematically determines the optimal sparse loadings for each sparse PC while improving interpretability and minimizing information loss. The salient features of the proposed approach
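
    A sparse PCA keeps most loadings exactly zero, which is what makes the components readable as variable groupings. As a hedged illustration, the sketch below uses scikit-learn's SparsePCA on synthetic data; the paper's procedure for choosing optimal sparse loadings is not reproduced:

        import numpy as np
        from sklearn.decomposition import SparsePCA

        # Two latent drivers t1, t2 feed pairs of observed variables; two
        # columns are pure noise.  Sparse loadings should isolate the pairs.
        rng = np.random.default_rng(0)
        t1, t2 = rng.standard_normal((2, 500))
        X = np.column_stack([t1, 0.9 * t1, rng.standard_normal(500),
                             t2, -0.8 * t2, rng.standard_normal(500)])

        spca = SparsePCA(n_components=2, alpha=1.0, random_state=0)
        spca.fit(X)
        print(np.round(spca.components_, 2))  # zeros make the groupings readable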

  15. Discovery and analysis of e-mail-driven business processes

    NARCIS (Netherlands)

    Stuit, Marco; Wortmann, Hans

    E-mail is used as the primary tool for business communication and collaboration. This paper presents a novel e-mail interaction mining method to discover and analyze e-mail-driven business processes. An e-mail-driven business process is perceived as a human collaboration process that consists of

  16. Geometric anisotropic spatial point pattern analysis and Cox processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Toftaker, Håkon

    In particular, we study Cox process models with an elliptical pair correlation function, including shot noise Cox processes and log Gaussian Cox processes, and we develop estimation procedures using summary statistics and Bayesian methods. Our methodology is illustrated on real and synthetic datasets of spatial

  17. Reliability analysis of common hazardous waste treatment processes

    International Nuclear Information System (INIS)

    Waters, R.D.

    1993-05-01

    Five hazardous waste treatment processes are analyzed probabilistically using Monte Carlo simulation to elucidate the relationships between process safety factors and reliability levels. The treatment processes evaluated are packed tower aeration, reverse osmosis, activated sludge, upflow anaerobic sludge blanket, and activated carbon adsorption

  18. Reliability analysis of common hazardous waste treatment processes

    Energy Technology Data Exchange (ETDEWEB)

    Waters, Robert D. [Vanderbilt Univ., Nashville, TN (United States)

    1993-05-01

    Five hazardous waste treatment processes are analyzed probabilistically using Monte Carlo simulation to elucidate the relationships between process safety factors and reliability levels. The treatment processes evaluated are packed tower aeration, reverse osmosis, activated sludge, upflow anaerobic sludge blanket, and activated carbon adsorption.
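
    The safety factor/reliability relationship can be probed with a few lines of Monte Carlo. This sketch assumes a lognormal scatter on achieved process performance, an illustrative model rather than the paper's treatment-process models:

        import numpy as np

        rng = np.random.default_rng(42)
        target = 100.0      # required removal performance (arbitrary units)
        for sf in (1.0, 1.25, 1.5, 2.0):
            # design capacity = safety factor x target, with 25% lognormal scatter
            capacity = rng.lognormal(np.log(sf * target), 0.25, size=100_000)
            reliability = np.mean(capacity >= target)
            print(f"safety factor {sf:4.2f}   reliability = {reliability:.3f}")

    Sweeping the safety factor this way traces out exactly the kind of safety-factor-versus-reliability curve the abstract describes, for whatever performance distribution one assumes.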

  19. Analysis of therapeutic methods for treating vocal process granulomas.

    Science.gov (United States)

    Ma, Lijing; Xiao, Yang; Ye, Jingying; Yang, Qingwen; Wang, Jun

    2015-03-01

    The combination of laryngeal microsurgery and local injections of botulinum toxin type A (BTA) can increase the cure rate of patients with vocal process granulomas (VPGs). To analyze the therapeutic effects of conservative treatments, microsurgical resection with suturing and microsurgery in combination with local injections of BTA for the treatment of VPGs. A retrospective analysis of 168 cases of VPG was performed. All of the patients initially received a conservative treatment. Some of the patients who did not respond to the conservative treatments were treated using microsurgical resection and microsuturing using an 8-0 absorbable filament. Other patients additionally received a four-point injection of BTA into the thyroarytenoid muscle and the arytenoid muscle on the operated side. The lesions of 41.3% (71/168) of the patients who were given the conservative treatments (including acid suppression, vocal rest, and voice therapy) disappeared, and the lesions of 10.7% (18/168) of the patients were reduced. The conservative treatments were unsuccessful for 47% (79/168) of the patients. The cure rate was 78.4% (29/37) for the patients who were treated by microscope resection using a CO2 laser and microsuturing of the surrounding mucosa. Of the eight patients who experienced a recurrence, five of them had lesions that disappeared after 3 months of conservative treatment, whereas the other three patients recovered after a second operation. The cure rate of the 42 patients who were treated using microsurgery combined with local injections of BTA was 95.2% (40/42), with only 2 cases of recurrence at 2 months post-treatment.

  20. Analysis of clay particles behaviour during hydration-dehydration processes

    International Nuclear Information System (INIS)

    Maison, T.; Laouafa, F.; Delalain, P.; Fleureau, J.M.

    2010-01-01

    Document available in extended abstract form only. Knowledge of the physico-chemical processes at the local (micro) level during the shrinkage or swelling of clayey materials is an essential step in characterising the ability of such soils to shrink or swell. In order to better understand these phenomena, we performed research at the microscopic level, mainly using an Environmental Scanning Electron Microscope (ESEM). This apparatus allows exploring some features of the behaviour and physical properties of clays subjected to controlled hygrometry conditions. The observations were performed on a heterogeneous natural clay, the Romainville clay. This clay, which shows sensitive shrinkage and swelling behavior, was taken in situ from a drought-affected site that is well monitored. The clay was characterised by classical geotechnical laboratory tests (mercury porosimetry, X-ray diffraction, grain size analysis...). Microstructure observations were made on cubic samples with 1 cm sides. Swelling-shrinkage cycles were performed on clay powder with grain sizes between 63 μm and 125 μm. The microstructure shows a compact clayey matrix with small calcite and quartz grains. Calcite may be present in vein form, due to sedimentation or pressure-dissolution effects. At a high humidity value, around 98%, moulds are observed over the whole sample surface. During the swelling-shrinkage cycles, changes of the sample surface are followed in real time. Hydration-dehydration cycles are imposed with a period of 30 minutes (considered sufficient to reach a steady state). The sample deformation induced by swelling and shrinkage is calculated by analyzing 2D ESEM images and assuming isotropic behaviour for the out-of-plane strain. The result shows kinetics of swelling and shrinkage that can be decomposed into two successive phases. At each change of relative humidity, the first step is characterized by a discontinuity (jump) in the deformation, followed by a quite constant strain

  1. Individual differences in emotion word processing: A diffusion model analysis.

    Science.gov (United States)

    Mueller, Christina J; Kuchinke, Lars

    2016-06-01

    The exploratory study investigated individual differences in implicit processing of emotional words in a lexical decision task. A processing advantage for positive words was observed, and differences between happy and fear-related words in response times were predicted by individual differences in specific variables of emotion processing: Whereas more pronounced goal-directed behavior was related to a specific slowdown in processing of fear-related words, the rate of spontaneous eye blinks (indexing brain dopamine levels) was associated with a processing advantage of happy words. Estimating diffusion model parameters revealed that the drift rate (rate of information accumulation) captures unique variance of processing differences between happy and fear-related words, with highest drift rates observed for happy words. Overall emotion recognition ability predicted individual differences in drift rates between happy and fear-related words. The findings emphasize that a significant amount of variance in emotion processing is explained by individual differences in behavioral data.
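
    The drift rate in a diffusion model is the speed of evidence accumulation toward a decision bound. A minimal Euler-scheme sketch (illustrative parameter values, not the study's fitted ones) shows how a higher drift rate, as estimated for happy words, produces faster and more accurate responses:

        import numpy as np

        # Two-boundary drift-diffusion: evidence x starts at z*a and drifts
        # at rate v with Gaussian noise until it hits 0 or a.
        def simulate(v, a=1.0, z=0.5, dt=0.001, sigma=1.0, n=2000, seed=0):
            rng = np.random.default_rng(seed)
            rts, hits = [], []
            for _ in range(n):
                x, t = z * a, 0.0
                while 0.0 < x < a:
                    x += v * dt + sigma * np.sqrt(dt) * rng.standard_normal()
                    t += dt
                rts.append(t)
                hits.append(x >= a)        # upper boundary = correct response
            return np.mean(rts), np.mean(hits)

        for label, v in [("happy", 2.0), ("fear-related", 1.2)]:
            rt, acc = simulate(v)
            print(f"{label:13s} mean RT = {rt:.3f} s   accuracy = {acc:.3f}")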

  2. Thermodynamic analysis on theoretical models of cycle combined heat exchange process: The reversible heat exchange process

    International Nuclear Information System (INIS)

    Zhang, Chenghu; Li, Yaping

    2017-01-01

    The concept of a reversible heat exchange process, as a theoretical model of the cycle combined heat exchanger, can be useful for determining the thermodynamic characteristics and limiting values of an isolated heat exchange system. In this study, a classification of reversible heat exchange processes is presented and, using a numerical method, the medium temperature variation tendency and the useful work produced and used over the whole process are investigated through the construction and solution of the mathematical descriptions. Various values of the medium inlet temperatures and of the heat capacity ratio are considered in order to analyze the effects of process parameters on the outlet temperature lift/drop. The maximum process work transferred from the Carnot cycle region to the reverse cycle region is also researched. Moreover, the influence of the separating point between different sub-processes on the temperature variation profile and the process work production is analyzed. In addition, a heat-exchange-enhancement-factor is defined to study the enhancement effect of applying the idealized process in an isolated heat exchange system, and the variation of this factor with changes in the process parameters is obtained. The research results of this paper can provide theoretical guidance for constructing the cycle combined heat exchange process in a practical system. - Highlights: • A theoretical model of the cycle combined heat exchange process is proposed. • A classification of reversible heat exchange processes is presented. • Effects of inlet temperatures and heat capacity ratio on the process are analyzed. • Process work transmission through the whole process is studied. • The heat-exchange-enhancement-factor can be a criterion to express the application effect of the idealized process.

  3. Analysis of production flow process with lean manufacturing approach

    Science.gov (United States)

    Siregar, Ikhsan; Arif Nasution, Abdillah; Prasetio, Aji; Fadillah, Kharis

    2017-09-01

    This research was conducted in a company engaged in the production of Fast Moving Consumer Goods (FMCG). The production process in the company still contains several activities that cause waste. Non-value-added (NVA) activities are still widely found in the implementation, so the cycle time needed to make the product becomes longer. One form of improvement on the production line is to apply the lean manufacturing method to identify waste along the value stream and find non-value-added activities. Non-value-added activities can be eliminated and reduced by utilizing value stream mapping and identifying them with process activity mapping. According to the results obtained, 26% of activities are value-added and 74% are non-value-added. The current state map of the production process gives a process lead time of 678.11 minutes and a processing time of 173.94 minutes. In the proposed future state, value-added time is 41% of production process activities while non-value-added time is 59%, with a process lead time of 426.69 minutes and a processing time of 173.89 minutes.
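
    The value-added percentages reported above follow directly from the mapped times: processing time divided by process lead time. A two-line check using the abstract's own numbers:

        # Reported times (minutes) from the current and future state maps.
        for label, lead_time, va_time in [("current state", 678.11, 173.94),
                                          ("future state ", 426.69, 173.89)]:
            # value-added share of lead time; rounds to the reported 26% and 41%
            print(f"{label}: value-added = {va_time / lead_time:.1%} of lead time")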

  4. Methodology for Modeling and Analysis of Business Processes (MMABP

    Directory of Open Access Journals (Sweden)

    Vaclav Repa

    2015-10-01

    Full Text Available This paper introduces a methodology for modeling business processes. The creation of the methodology is described in terms of the Design Science Method. First, a gap in contemporary Business Process Modeling approaches is identified, and general modeling principles which can fill the gap are discussed. The way in which these principles have been implemented in the main features of the created methodology is then described. The most critical identified points of business process modeling are process states, process hierarchy and the granularity of process description. The methodology has been evaluated by use in a real project. Using examples from this project, the main methodology features are explained, together with the significant problems which were met during the project. Concluding from these problems and from the results of the methodology evaluation, the needed future development of the methodology is outlined.

  5. Analysis of Using Resources in Business Process Modeling and Simulation

    Directory of Open Access Journals (Sweden)

    Vasilecas Olegas

    2014-12-01

    Full Text Available One of the key purposes of Business Process Model and Notation (BPMN) is to support the graphical representation of the process model. However, such models lack support for the graphical representation of the resources that process instances use during simulation or execution. The paper analyzes different methods and their extensions for resource modeling. Further, this article presents a selected set of resource properties that are relevant for resource modeling. The paper proposes an approach that explains how to use the selected set of resource properties to extend process modeling using BPMN and simulation tools. The extensions are based on BPMN, where business process instances use resources concurrently.

  6. Economics of coal conversion processing. Advances in coal gasification: support research. Advances in coal gasification: process development and analysis

    Energy Technology Data Exchange (ETDEWEB)

    1978-01-01

    The fall meeting of the American Chemical Society, Division of Fuel Chemistry, was held at Miami Beach, Florida, September 10-15, 1978. Papers involved the economics of coal conversion processing and advances in coal gasification, especially support research and process development and analysis. Fourteen papers have been entered individually into EDB and ERA; three papers had been entered previously from other sources. (LTN)

  7. Thermomechanical analysis of an electrically assisted wire drawing process

    OpenAIRE

    Sánchez Egea, Antonio José; González Rojas, Hernan Alberto; Celentano, Diego Javier; Jorba Peiró, Jordi; Cao, Jia

    2017-01-01

    The electrically-assisted wire drawing process is a hybrid manufacturing process characterized by enhanced formability, ductility and elongation of the drawn wire specimen. A thermomechanical model describing the change in mechanical response due to the thermal contribution is proposed in this work. Additionally, a numerical simulation was conducted to study the potential and limitations of this hybrid process using two different hardening laws: a phenomenological and a dislocat...

  8. Simulation analysis of resource flexibility on healthcare processes.

    Science.gov (United States)

    Simwita, Yusta W; Helgheim, Berit I

    2016-01-01

    This paper uses discrete event simulation to explore the best resource flexibility scenario and examine the effect of implementing resource flexibility at different stages of the patient treatment process. Specifically, we investigate the effect of resource flexibility on patient waiting time and throughput in an orthopedic care process. We further explore how implementing resource flexibility in patient treatment processes affects patient access to healthcare services. We focus on two resources, namely the orthopedic surgeon and the operating room. An observational approach was used to collect process data. The developed model was validated by comparing the simulation output with actual patient data collected from the studied orthopedic care process. We developed different scenarios to identify the best resource flexibility scenario and to explore the effect of resource flexibility on patient waiting time, throughput, and future changes in demand. The scenarios focused on creating flexibility in the service capacity of this care process by altering the amount of additional human resource capacity at different stages of the patient care process and by extending the use of operating room capacity. The study found that resource flexibility can improve responsiveness to patient demand in the treatment process. Testing different scenarios showed that introducing resource flexibility reduces patient waiting time and improves throughput. The simulation results show that patient access to health services can be improved by implementing resource flexibility at different stages of the patient treatment process. This study contributes to the current healthcare literature by explaining how implementing resource flexibility at different stages of patient care processes can improve the ability to respond to increasing patient demand. The study was limited to a single patient process; studies focusing on additional processes are recommended.
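    A discrete event simulation of this kind can be sketched with the SimPy library. The arrival and service rates, and the two-surgeon flexibility scenario below, are hypothetical placeholders, not parameters from the study.

```python
import random
import simpy

N_PATIENTS = 50

def patient(env, surgeon, operating_room, waits):
    arrival = env.now
    with surgeon.request() as req:           # consultation with a surgeon
        yield req
        yield env.timeout(random.expovariate(1 / 0.5))   # ~30 min consult
    with operating_room.request() as req:    # surgery in the operating room
        yield req
        waits.append(env.now - arrival)      # wait until surgery starts
        yield env.timeout(random.expovariate(1 / 2.0))   # ~2 h procedure

def source(env, surgeon, operating_room, waits):
    for _ in range(N_PATIENTS):
        env.process(patient(env, surgeon, operating_room, waits))
        yield env.timeout(random.expovariate(1 / 1.0))   # ~1 arrival per hour

random.seed(42)
env = simpy.Environment()
surgeon = simpy.Resource(env, capacity=2)         # flexibility scenario: 2 surgeons
operating_room = simpy.Resource(env, capacity=1)  # single operating room
waits = []
env.process(source(env, surgeon, operating_room, waits))
env.run()  # run until all patients are treated
print(f"mean wait before surgery: {sum(waits) / len(waits):.2f} h "
      f"({len(waits)} patients treated)")
```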

  9. Image Segmentation and Processing for Efficient Parking Space Analysis

    OpenAIRE

    Tutika, Chetan Sai; Vallapaneni, Charan; R, Karthik; KP, Bharath; Muthu, N Ruban Rajesh Kumar

    2018-01-01

    In this paper, we develop a method to detect vacant parking spaces in an environment with unclear segments and contours with the help of MATLAB image processing capabilities. Due to anomalies present in the parking spaces, such as uneven illumination, distorted slot lines and overlapping cars, present-day conventional algorithms have difficulty processing the images for accurate results. The proposed algorithm uses a combination of image pre-processing and false contour detection ...

  10. Fundamental atomic plasma chemistry for semiconductor manufacturing process analysis

    International Nuclear Information System (INIS)

    Ventzek, P.L.G.; Zhang, D.; Stout, P.J.; Rauf, S.; Orlowski, M.; Kudrya, V.; Astapenko, V.; Eletskii, A.

    2002-01-01

    An absence of fundamental atomic plasma chemistry data (e.g. electron impact cross-sections) hinders the application of plasma process models in semiconductor manufacturing. Of particular importance is excited state plasma chemistry data for metallization applications. This paper describes important plasma chemistry processes in the context of high density plasmas for metallization application and methods for the calculation of data for the study of these processes. Also discussed is the development of model data sets that address computational tractability issues. Examples of model electron impact cross-sections for Ni reduced from multiple collision processes are presented

  11. FEM analysis of hollow hub forming in rolling extrusion process

    Directory of Open Access Journals (Sweden)

    J. Bartnicki

    2014-10-01

    Full Text Available This paper presents the results of numerical calculations of the rolling extrusion process of a hollow hub. As flanges must be manufactured at both sides of the product, a rear bumper was implemented in the analyzed rolling extrusion process as an additional tool limiting axial metal flow. Numerical calculations of the hub forming process were conducted on the basis of the finite element method, applying the Deform3D and Simufact software under a three-dimensional state of strain. The satisfactory results obtained show that it is possible to conduct further experimental research with the application of a modernized aggregate for the rolling extrusion process PO-2.

  12. Temporal Expectation and Information Processing: A Model-Based Analysis

    Science.gov (United States)

    Jepma, Marieke; Wagenmakers, Eric-Jan; Nieuwenhuis, Sander

    2012-01-01

    People are able to use temporal cues to anticipate the timing of an event, enabling them to process that event more efficiently. We conducted two experiments, using the fixed-foreperiod paradigm (Experiment 1) and the temporal-cueing paradigm (Experiment 2), to assess which components of information processing are speeded when subjects use such…

  13. Commentary: Mediation Analysis, Causal Process, and Cross-Sectional Data

    Science.gov (United States)

    Shrout, Patrick E.

    2011-01-01

    Maxwell, Cole, and Mitchell (2011) extended the work of Maxwell and Cole (2007), which raised important questions about whether mediation analyses based on cross-sectional data can shed light on longitudinal mediation process. The latest article considers longitudinal processes that can only be partially explained by an intervening variable, and…

  14. Analysis of profitability and poverty reduction of yoghurt processing ...

    African Journals Online (AJOL)

    The study assessed the profitability of yoghurt processing with a view of determining its potentials for reducing poverty in Maiduguri Metropolitan Area. Data were collected from a survey of 10 yoghurt processing firms in Maiduguri and analysed using profit model and descriptive statistics. Results revealed that yoghurt ...

  15. Guidelines and cost analysis for catalyst production in biocatalytic processes

    DEFF Research Database (Denmark)

    Tufvesson, Pär; Lima Ramos, Joana; Nordblad, Mathias

    2011-01-01

    Biocatalysis is an emerging area of technology, and to date few reports have documented the economics of such processes. As it is a relatively new technology, many processes do not immediately fulfill the economic requirements for commercial operation. Hence, early-stage economic assessment could...

  16. Modelling and analysis of CVD processes for ceramic membrane preparation

    NARCIS (Netherlands)

    Brinkman, H.W.; Cao, G.Z.; Meijerink, J.; de Vries, Karel Jan; Burggraaf, Anthonie

    1993-01-01

    A mathematical model is presented that describes the modified chemical vapour deposition (CVD) process (which takes place in advance of the electrochemical vapour deposition (EVD) process) to deposit ZrO2 inside porous media for the preparation and modification of ceramic membranes. The isobaric

  17. Semantics and analysis of business process models in BPMN

    NARCIS (Netherlands)

    Dijkman, R.M.; Dumas, M.; Ouyang, C.

    2008-01-01

    The Business Process Modelling Notation (BPMN) is a standard for capturing business processes in the early phases of systems development. The mix of constructs found in BPMN makes it possible to create models with semantic errors. Such errors are especially serious, because errors in the early

  18. A Practical Decision-Analysis Process for Forest Ecosystem Management

    Science.gov (United States)

    H. Michael Rauscher; F. Thomas Lloyd; David L. Loftis; Mark J. Twery

    2000-01-01

    Many authors have pointed out the need to firm up the 'fuzzy' ecosystem management paradigm and develop operationally practical processes to allow forest managers to accommodate more effectively the continuing rapid change in societal perspectives and goals. There are three spatial scales where clear, precise, practical ecosystem management processes are...

  19. PROV2R : Practical provenance analysis of unstructured processes

    NARCIS (Netherlands)

    Stamatogiannakis, Manolis; Athanasopoulos, Elias; Bos, Herbert; Groth, Paul

    2017-01-01

    Information produced by Internet applications is inherently a result of processes that are executed locally. Think of a web server that makes use of a CGI script, or a content management system where a post was first edited using a word processor. Given the impact of these processes to the content

  20. Plant operator performance evaluation based on cognitive process analysis experiment

    International Nuclear Information System (INIS)

    Ujita, H.; Fukuda, M.

    1990-01-01

    This paper reports on an experiment performed to clarify plant operators' cognitive processes, with the aim of improving the man-machine interface that supports their diagnoses and decisions. The cognitive processes under abnormal conditions were evaluated by protocol analyses, interviews, etc. in an experiment using a plant training simulator. A cognitive process model is represented by a stochastic network based on Rasmussen's decision making model. Each node of the network corresponds to an element of the cognitive process, such as observation, interpretation, execution, etc. Comparison of Monte Carlo simulation results with the experimental results yielded the following observations: a process of reconfirming the plant parameters after execution of a task, and feedback paths from this process to the observation and to the task definition of the next task, were observed. The feedback probability average and standard deviation should be determined for each incident type to correctly explain the individual differences in the cognitive processes. The operator's cognitive level tended to change from skill-based to knowledge-based via rule-based behavior during the feedback process.
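    The stochastic-network view lends itself to a small Monte Carlo sketch. The states and feedback probabilities below are illustrative assumptions, not the values estimated in the experiment.

```python
import random

# Assumed mean feedback probabilities after reconfirmation (hypothetical).
FEEDBACK_TO_OBSERVATION = 0.3
FEEDBACK_TO_TASK_DEF = 0.2

def diagnose_once(rng):
    """One pass through a Rasmussen-style cognitive network with feedback."""
    steps, state = 0, "observation"
    while state != "done":
        steps += 1
        if state == "observation":
            state = "interpretation"
        elif state == "interpretation":
            state = "task definition"
        elif state == "task definition":
            state = "execution"
        elif state == "execution":
            state = "reconfirmation"      # reconfirm plant parameters
        else:  # reconfirmation: feed back or finish
            u = rng.random()
            if u < FEEDBACK_TO_OBSERVATION:
                state = "observation"
            elif u < FEEDBACK_TO_OBSERVATION + FEEDBACK_TO_TASK_DEF:
                state = "task definition"
            else:
                state = "done"
    return steps

rng = random.Random(1)
lengths = [diagnose_once(rng) for _ in range(10_000)]
print(f"mean cognitive-path length: {sum(lengths) / len(lengths):.2f} elements")
```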

  1. Simple process capability analysis and quality validation of ...

    African Journals Online (AJOL)

    Many approaches can be applied to improve a process, and one of them is choosing the correct Six Sigma design of experiment (DOE). In this study, Taguchi's experimental design was applied to achieve a high percentage of cell viability in the fermentation experiment. The process capability of this study was later analyzed by ...
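    Process capability is conventionally summarized by the Cp and Cpk indices. The sketch below computes both for a hypothetical set of cell-viability measurements and specification limits; all numbers are assumptions for illustration, not data from the study.

```python
import numpy as np

def capability_indices(samples, lsl, usl):
    """Short-term process capability indices.

    Cp  = (USL - LSL) / (6 * sigma)
    Cpk = min(USL - mu, mu - LSL) / (3 * sigma)
    """
    mu, sigma = np.mean(samples), np.std(samples, ddof=1)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

# Hypothetical cell-viability measurements (%) from replicate fermentation runs
viability = np.array([91.2, 92.5, 90.8, 93.1, 91.9, 92.2, 90.5, 92.8])
cp, cpk = capability_indices(viability, lsl=88.0, usl=96.0)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")  # >1.33 is a common adequacy target
```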

  2. Economic analysis of fish processing and marketing in Ogun ...

    African Journals Online (AJOL)

    Despite the high profitability of the business, fish processors identified lack of collateral security for bank loan (96.5%), erratic power supply (92.0%) and lack of modern fish processing facilities (43.4%) as their most prevailing problems. With this high level of profitability and viability in fish processing and marketing, it is ...

  3. Transient analysis of reflected Lévy processes

    NARCIS (Netherlands)

    Kella, O.; Mandjes, M.R.H.

    2013-01-01

    In this paper we establish a formula for the joint Laplace-Stieltjes transform of a reflected Lévy process and its regulator at an independent exponentially distributed time, starting at an independent exponentially distributed state. The Lévy process is general, that is, it is not assumed that it
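    The record above concerns exact transform results. As a numerical companion, the sketch below simulates a reflected process and its regulator with a discrete-time Skorokhod (Lindley-type) recursion, using Brownian increments as a simple Lévy example; all parameters are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

def reflected_path(n_steps, dt, drift, sigma, w0):
    """Euler sketch of a process reflected at 0, with its regulator.

    Lindley-type recursion: W_{k+1} = max(W_k + dX_k, 0); the regulator L
    accumulates exactly the pushing needed to keep W nonnegative.
    """
    w, l = w0, 0.0
    for _ in range(n_steps):
        dx = drift * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        if w + dx < 0.0:
            l += -(w + dx)   # regulator increment
            w = 0.0
        else:
            w += dx
    return w, l

# Negative drift so the reflection at 0 is actually active (hypothetical values)
w_t, l_t = reflected_path(n_steps=10_000, dt=0.01, drift=-0.5, sigma=1.0, w0=1.0)
print(f"W(100) = {w_t:.3f}, regulator L(100) = {l_t:.3f}")
```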

  4. Transient analysis of reflected Lévy processes

    NARCIS (Netherlands)

    Kella, O.; Mandjes, M.

    2013-01-01

    In this paper, we establish a formula for the joint Laplace-Stieltjes transform of a reflected Lévy process and its regulator at an independent exponentially distributed time, starting at an independent exponentially distributed state. The Lévy process is general, that is, it is not assumed that it

  5. PRODIAG -- Dynamic qualitative analysis for process fault diagnosis

    International Nuclear Information System (INIS)

    Reifman, J.; Wei, T.Y.C.

    1995-01-01

    The authors present a method for handling the dynamic effects of process component malfunctions through time-independent rule-based diagnostic systems. The method's theory is discussed and a simplified version is implemented in the process diagnostic expert system PRODIAG. Simulation results from a full-scope operator training simulator of a nuclear power plant are used to illustrate the method

  6. Analysis of Alternatives (AoA) Process Improvement Study

    Science.gov (United States)

    2016-12-01

    ... analysis, cost analysis, sustainment considerations, early systems engineering analyses, threat projections, and market research. ... primarily the Equipping (EE), Sustaining (SS) and Training (TT) Program Evaluation Groups (PEGs) and Long-range Investment Requirements Analysis

  7. Chemical analysis of cyanide in cyanidation process: review of methods

    International Nuclear Information System (INIS)

    Nova-Alonso, F.; Elorza-Rodriguez, E.; Uribe-Salas, A.; Perez-Garibay, R.

    2007-01-01

    In cyanidation, the worldwide method for precious metals recovery, the chemical analysis of cyanide is a very important but complex operation. Cyanide can be present as different species, each with a different stability, toxicity, analysis method and elimination technique. For cyanide analysis there exists a wide selection of analytical methods, but most of them present difficulties because of the interference of species present in the solution. This paper presents the different available methods for the chemical analysis of cyanide: titration, specific electrode and distillation, giving special emphasis to the interference problem, with the aim of helping in the interpretation of the results. (Author)

  8. Warpage behavior analysis in package processes of embedded copper substrates

    Directory of Open Access Journals (Sweden)

    Hwang Yeong-Maw

    2017-01-01

    Full Text Available With the advance of the semiconductor industry and in response to the demands of ultra-thin products, packaging technology has been continuously developed. Thermal bonding process of copper pillar flip chip packages is a new bonding process in packaging technology, especially for substrates with embedded copper trace. During the packaging process, the substrate usually warps because of the heating process. In this paper, a finite element software ANSYS is used to model the embedded copper trace substrate and simulate the thermal and deformation behaviors of the substrate during the heating package process. A fixed geometric configuration equivalent to the real structure is duplicated to make the simulation of the warpage behavior of the substrate feasible. An empirical formula for predicting the warpage displacements is also established.

  9. STABILITY ANALYSIS OF RADIAL TURNING PROCESS FOR SUPERALLOYS

    Directory of Open Access Journals (Sweden)

    Alberto JIMÉNEZ

    2017-07-01

    Full Text Available Stability detection is an essential component of the design of efficient machining processes. Automatic methods are able to determine when instability is occurring and prevent possible machine failures. In this work, a variety of methods are proposed for detecting stability anomalies based on the measured forces in the radial turning process of superalloys. Two different methods are proposed to determine instabilities. Each one is tested on real data obtained in the machining of Waspaloy, Haynes 282 and Inconel 718. Experimental data, in both conventional and high pressure coolant (HPC) environments, are grouped into four different states depending on material grain size and hardness (LGA, LGS, SGA and SGS). The results reveal that the PCA method is useful for visualization of the process and detection of anomalies in online processes.
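    A common way to operationalize PCA-based anomaly detection on force features is to fit the model on stable cuts and flag windows with a large reconstruction error (Q statistic). The sketch below uses synthetic data and an assumed threshold rule; nothing here reproduces the paper's datasets or exact method.

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical force-feature matrix: rows = machining windows, columns =
# statistics of the measured cutting forces (mean, RMS, peak, ...).
rng = np.random.default_rng(0)
stable = rng.normal(0.0, 1.0, size=(200, 6))           # training: stable cuts
test = np.vstack([rng.normal(0.0, 1.0, size=(20, 6)),
                  rng.normal(4.0, 1.0, size=(5, 6))])  # last rows: anomalies

pca = PCA(n_components=2).fit(stable)

def q_statistic(x):
    """Squared reconstruction error; large values flag instability."""
    recon = pca.inverse_transform(pca.transform(x))
    return np.sum((x - recon) ** 2, axis=1)

threshold = np.percentile(q_statistic(stable), 99)   # assumed control limit
print("anomalous windows:", np.flatnonzero(q_statistic(test) > threshold))
```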

  10. Global processing takes time: A meta-analysis on local-global visual processing in ASD

    OpenAIRE

    Van der Hallen, Ruth; Evers, Kris; Brewaeys, K.; Van Den Noortgate, Wim; Wagemans, Johan

    2015-01-01

    What does an individual with autism spectrum disorder (ASD) perceive first: the forest or the trees? In spite of 30 years of research and influential theories like the weak central coherence (WCC) theory and the enhanced perceptual functioning (EPF) account, the interplay of local and global visual processing in ASD remains only partly understood. Research findings vary in indicating a local processing bias or a global processing deficit, and often contradict each other. We have applied a for...

  11. Analysis of Student Satisfaction in The Process of Teaching and Learning Using Importance Performance Analysis

    Science.gov (United States)

    Sembiring, P.; Sembiring, S.; Tarigan, G.; Sembiring, OD

    2017-12-01

    This study aims to determine the level of student satisfaction with the learning process at the University of Sumatera Utara, Indonesia. The sample consisted of 1204 students. Students' responses were measured through questionnaires adapted to a 5-point Likert scale and through direct interviews with the respondents. The SERVQUAL method was used to measure service quality along five dimensions of service characteristics, namely physical evidence, reliability, responsiveness, assurance and concern. The Importance Performance Analysis reveals that six service attributes must be corrected by the policy makers of the University of Sumatera Utara. The quality of service is still considered low by the students.
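    Importance-Performance Analysis places each attribute in one of four quadrants according to its mean importance and performance scores. A minimal sketch with hypothetical attribute means on the same 5-point scale (not the survey's actual results):

```python
# Hypothetical mean (importance, performance) scores per service attribute.
scores = {
    "physical evidence": (4.2, 3.1),
    "reliability":       (4.6, 3.0),
    "responsiveness":    (4.4, 3.9),
    "assurance":         (3.8, 4.1),
    "concern":           (3.5, 3.2),
}

imp_mean = sum(i for i, _ in scores.values()) / len(scores)
perf_mean = sum(p for _, p in scores.values()) / len(scores)

for attr, (imp, perf) in scores.items():
    if imp >= imp_mean and perf < perf_mean:
        quadrant = "concentrate here"       # high importance, low performance
    elif imp >= imp_mean:
        quadrant = "keep up the good work"
    elif perf < perf_mean:
        quadrant = "low priority"
    else:
        quadrant = "possible overkill"
    print(f"{attr:18s} -> {quadrant}")
```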

  12. Domain Endurants: An Analysis and Description Process Model

    DEFF Research Database (Denmark)

    Bjørner, Dines

    2014-01-01

    We present a summary, Sect. 2, of a structure of domain analysis and description concepts: techniques and tools. And we link, in Sect. 3, these concepts, embodied in domain analysis prompts and domain description prompts, in a model of how a diligent domain analyser cum describer would use them. We...

  13. Dynamic process analysis by moments of extreme orders

    Czech Academy of Sciences Publication Activity Database

    Šimberová, Stanislava; Suk, Tomáš

    2016-01-01

    Roč. 14, January (2016), s. 43-51 ISSN 2213-1337 R&D Projects: GA ČR GA15-16928S Institutional support: RVO:67985815 ; RVO:67985556 Keywords : high-order moments * principal component analysis * frequency analysis Subject RIV: BN - Astronomy, Celestial Mechanics, Astrophysics; BD - Theory of Information (UTIA-B) Impact factor: 2.010, year: 2016

  14. The Photoplethysmographic Signal Processed with Nonlinear Time Series Analysis Tools

    International Nuclear Information System (INIS)

    Hernandez Caceres, Jose Luis; Hong, Rolando; Garcia Lanz, Abel; Garcia Dominguez, Luis; Cabannas, Karelia

    2001-01-01

    Finger photoplethysmography (PPG) signals were submitted to nonlinear time series analysis. The applied analytical techniques were: (i) high-degree polynomial fitting for baseline estimation; (ii) FFT analysis for estimating power spectra; (iii) fractal dimension estimation via Higuchi's time-domain method; and (iv) kernel nonparametric estimation for reconstructing noise-free attractors and also for estimating the signal's stochastic components
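    Of the four techniques, Higuchi's fractal dimension is the most algorithmic: curve lengths of subsampled series are averaged per scale k, and the dimension is the slope in log-log coordinates. A sketch under that standard formulation, applied to a synthetic stand-in rather than real PPG data:

```python
import numpy as np

def higuchi_fd(x, k_max=10):
    """Higuchi's time-domain estimate of a signal's fractal dimension."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    lk = []
    for k in range(1, k_max + 1):
        lengths = []
        for m in range(k):
            idx = np.arange(m, n, k)
            if len(idx) < 2:
                continue
            # normalized curve length of the subsampled series
            lm = (np.sum(np.abs(np.diff(x[idx]))) * (n - 1)
                  / ((len(idx) - 1) * k) / k)
            lengths.append(lm)
        lk.append(np.mean(lengths))
    # FD is the slope of log(L(k)) against log(1/k)
    slope, _ = np.polyfit(np.log(1.0 / np.arange(1, k_max + 1)), np.log(lk), 1)
    return slope

# Noisy sinusoid standing in for a PPG trace (hypothetical signal)
t = np.linspace(0, 10, 2000)
ppg = np.sin(2 * np.pi * 1.2 * t) \
    + 0.1 * np.random.default_rng(0).standard_normal(t.size)
print(f"Higuchi fractal dimension: {higuchi_fd(ppg):.2f}")
```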

  15. Analysis of an innovative process for landfill gas quality improvement

    International Nuclear Information System (INIS)

    Lombardi, L.; Carnevale, E.A.

    2016-01-01

    Low methane content landfill gas is not suitable for feeding engines and is generally flared. This type of landfill gas may be enriched by removing the inert carbon dioxide. An innovative process, based on carbon dioxide capture by means of accelerated carbonation of bottom ash, was proposed and studied for this purpose. The process was investigated at laboratory scale, simulating different landfill gas compositions. The enrichment process is able to decrease the carbon dioxide concentration from 70–80% by volume to 60% by volume, requiring about 36 kg of bottom ash per Nm^3 of landfill gas. Using this result, it was estimated that an industrial scale plant processing 100–1000 Nm^3/h of low methane content landfill gas requires about 28,760–287,600 t of bottom ash for one year of operation. The specific cost of the studied enrichment process was evaluated as well and ranges from 0.052 to 0.241 Euro per Nm^3 of entering landfill gas. The energy balance showed that about 4–6% of the energy entering with the landfill gas is required for carrying out the enrichment, while the use of the enriched landfill gas in an engine producing electricity allows for negative carbon dioxide emissions. - Highlights: • The process uses a waste stream as the material to capture CO_2. • The process uses a simple gas/solid fixed bed contact reactor at ambient conditions. • The process captures CO_2 to enrich low-CH_4 landfill gas. • The specific cost ranges from 0.052 to 0.241 Euro per Nm^3 of entering landfill gas. • The process consumes about 4–6% of the entering energy and acts as a CO_2 sink.
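    The annual bottom-ash tonnage follows directly from the specific consumption. A back-of-envelope check, assuming roughly 8000 operating hours per year (the operating hours are not stated in the record):

```python
# Back-of-envelope check of the bottom-ash demand quoted in the abstract.
ash_per_nm3 = 36.0        # kg of bottom ash per Nm^3 of landfill gas
hours_per_year = 8000     # assumed annual operating hours

for flow in (100, 1000):  # Nm^3/h of low-methane landfill gas
    ash_t = ash_per_nm3 * flow * hours_per_year / 1000.0
    print(f"{flow:5d} Nm^3/h -> about {ash_t:,.0f} t of bottom ash per year")
# -> roughly 28,800 and 288,000 t, matching the quoted 28,760-287,600 t range
```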

  16. Comparative analysis of business rules and business process modeling languages

    Directory of Open Access Journals (Sweden)

    Audrius Rima

    2013-03-01

    Full Text Available When developing an information system, it is important to create clear models and choose suitable modeling languages. The article analyzes the SRML, SBVR, PRR, SWRL and OCL rule-specification languages and the UML, DFD, CPN, EPC, IDEF3 and BPMN business process modeling languages. The article presents a theoretical comparison of business rules and business process modeling languages. According to selected modeling aspects, the article compares different sets of business process modeling and business rule representation languages. Finally, the best-fitting language set is selected for a three-layer framework for business-rule-based software modeling.

  17. Analysis of Optimal Process Flow Diagrams of Light Naphtha Isomerization Process by Mathematic Modelling Method

    OpenAIRE

    Chuzlov, Vyacheslav Alekseevich; Molotov, Konstantin

    2016-01-01

    An approach to the simulation of catalytic reactors for hydrocarbon refining processes is presented. Kinetic and thermodynamic research on the light naphtha isomerization process was conducted. The kinetic parameters of the chemical conversion of the hydrocarbon feedstock on different types of platinum-containing catalysts were established. The efficiency of including different types of isomerization technologies in the oil refinery flow diagram was estimated.

  18. Analysis of Optimal Process Flow Diagrams of Light Naphtha Isomerization Process by Mathematic Modelling Method

    Directory of Open Access Journals (Sweden)

    Chuzlov Vjacheslav

    2016-01-01

    Full Text Available An approach to the simulation of catalytic reactors for hydrocarbon refining processes is presented. Kinetic and thermodynamic research on the light naphtha isomerization process was conducted. The kinetic parameters of the chemical conversion of the hydrocarbon feedstock on different types of platinum-containing catalysts were established. The efficiency of including different types of isomerization technologies in the oil refinery flow diagram was estimated.

  19. Application of Cortical Processing Theory to Acoustical Analysis

    National Research Council Canada - National Science Library

    Ghitza, Oded

    2007-01-01

    ... (TMC). Robustness against background noise is provided principally by the signal processing performed by the PAM, while insensitivity to time-scale variations is provided by properties of the TMC...

  20. Analysis of paper machine process waters; Paperikoneen prosessivesianalytiikka - MPKT 09

    Energy Technology Data Exchange (ETDEWEB)

    Knuutinen, J; Alen, R; Harjula, P; Kilpinen, J; Pallonen, R; Jurvela, V

    1999-12-31

    The closure of paper machine circuits demands a better knowledge of the chemical structures and behaviour of organic compounds in pulp mill process waters. Nonionic or negatively charged detrimental substances (anionic trash), which will eventually cause runnability and paper quality problems, are of special interest. The main purpose of the project was to develop routine 'fingerprint' analytical procedures to study various process waters. Our major interest was focused on low molecular weight carboxylic acids, carbohydrates and lignin-based material. The 'fingerprints' (chromatograms and electropherograms) can be used to differentiate various process waters or to find out changes in the composition of organic compounds at various stages of the papermaking process. Until now, the most characteristic 'fingerprints' were obtained by capillary zone electrophoresis (CZE) and by pyrolysis - gas chromatography - mass spectrometry (Py-GC/MS). Examples of using these techniques are briefly discussed. (orig.)

  1. Analysis of paper machine process waters; Paperikoneen prosessivesianalytiikka - MPKT 09

    Energy Technology Data Exchange (ETDEWEB)

    Knuutinen, J.; Alen, R.; Harjula, P.; Kilpinen, J.; Pallonen, R.; Jurvela, V.

    1998-12-31

    The closure of paper machine circuits demands a better knowledge of the chemical structures and behaviour of organic compounds in pulp mill process waters. Nonionic or negatively charged detrimental substances (anionic trash), which will eventually cause runnability and paper quality problems, are of special interest. The main purpose of the project was to develop routine 'fingerprint' analytical procedures to study various process waters. Our major interest was focused on low molecular weight carboxylic acids, carbohydrates and lignin-based material. The 'fingerprints' (chromatograms and electropherograms) can be used to differentiate various process waters or to find out changes in the composition of organic compounds at various stages of the papermaking process. Until now, the most characteristic 'fingerprints' were obtained by capillary zone electrophoresis (CZE) and by pyrolysis - gas chromatography - mass spectrometry (Py-GC/MS). Examples of using these techniques are briefly discussed. (orig.)

  2. A Review on Parametric Analysis of Magnetic Abrasive Machining Process

    Science.gov (United States)

    Khattri, Krishna; Choudhary, Gulshan; Bhuyan, B. K.; Selokar, Ashish

    2018-03-01

    The magnetic abrasive machining (MAM) process is a highly developed unconventional machining process. It is frequently used in manufacturing industries for nanometer-range surface finishing of workpieces with the help of magnetic abrasive particles (MAPs) and a magnetic force applied in the machining zone. It is more precise and faster than conventional methods and able to produce defect-free finished components. This paper provides a comprehensive review of the recent advancements in the MAM process reported by different researchers to date. The effects of different input parameters, such as rotational speed of the electromagnet, voltage, magnetic flux density, abrasive particle size and working gap, on the material removal rate (MRR) and surface roughness (Ra) are discussed. On the basis of the review, it is observed that the rotational speed of the electromagnet, voltage and mesh size of the abrasive particles have a significant impact on the MAM process.

  3. Simple process capability analysis and quality validation of ...

    African Journals Online (AJOL)

    GREGORY

    2011-12-16

    Dec 16, 2011 ... University Malaysia, Gombak, P.O. Box 10, 50728 Kuala Lumpur, Malaysia. Accepted 7 .... used in the manufacturing industry as a process performance indicator. ... Six Sigma for Electronics design and manufacturing.

  4. Automatic physiological waveform processing for FMRI noise correction and analysis.

    Directory of Open Access Journals (Sweden)

    Daniel J Kelley

    2008-03-01

    Full Text Available Functional MRI resting state and connectivity studies of the brain focus on neural fluctuations at low frequencies which share power with physiological fluctuations originating from the lungs and heart. Due to the lack of automated software to process physiological signals collected at high magnetic fields, a gap exists in the processing pathway between the acquisition of physiological data and its use in fMRI software, both for physiological noise correction and for functional analyses of brain activation and connectivity. To fill this gap, we developed an open-source physiological signal processing program, called PhysioNoise, in the Python language. We tested its automated processing algorithms and dynamic signal visualization on resting monkey cardiac and respiratory waveforms. PhysioNoise consistently identifies physiological fluctuations for fMRI noise correction and also generates covariates for subsequent analyses of brain activation and connectivity.
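    Identifying cardiac peaks is the core of this kind of automated processing. A minimal sketch using scipy.signal.find_peaks on a synthetic pulse waveform; the sampling rate, refractory period and threshold are assumptions, and the code is not taken from PhysioNoise.

```python
import numpy as np
from scipy.signal import find_peaks

# Synthetic stand-in for a cardiac (pulse) waveform sampled at 100 Hz.
fs = 100.0
t = np.arange(0, 30, 1 / fs)
cardiac = np.sin(2 * np.pi * 1.1 * t) ** 3            # ~66 beats per minute
cardiac += 0.1 * np.random.default_rng(0).standard_normal(t.size)

# Enforce a refractory period (>0.4 s between beats) and a minimum height.
peaks, _ = find_peaks(cardiac, distance=int(0.4 * fs), height=0.5)
rate_bpm = 60.0 * len(peaks) / (t[-1] - t[0])
print(f"detected {len(peaks)} beats, approx. {rate_bpm:.0f} bpm")
```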

  5. Socially Grounded Analysis of Knowledge Management Systems and Processes

    NARCIS (Netherlands)

    Guizzardi, R.S.S.; Perini, A.; Dignum, V.

    2008-01-01

    In the struggle to survive and compete in the face of constant technological change and unstable business environments, organizations recognize knowledge as their most valuable asset. Consequently, these organizations often invest in Knowledge Management (KM), seeking to enhance their internal processes

  6. Cognitive Modeling and Task Analysis: Basic Processes and Individual Differences

    National Research Council Canada - National Science Library

    Ackerman, Phillip

    1999-01-01

    ... in a complex-skill environment. The subset of task conditions selected were those that involve basic processes of working memory, task monitoring, and differential loads on spatial reasoning and speed of perceiving...

  7. Simulation and Flexibility Analysis of Milk Production Process

    DEFF Research Database (Denmark)

    Cheng, Hongyuan; Friis, Alan

    In this work, a process simulation method is used to simulate a pasteurised market milk production line. A commercial process simulation tool, Pro/II from Simulation Science Inc., is used in the simulation work, and a new model is used to calculate the thermal properties of milk. A simulator is thereby obtained for the milk production line. Using the simulator, different milk processing situations can be quantitatively investigated, such as the production of different products, capacity changes, fat content changes in the raw milk, and energy costs at different operating conditions. Such a flexible dairy production line can adjust its production pace when manufacturing different products without replacing the existing equipment in the line, and the simulator is applied to study this flexibility.

  8. Centralized processing of contact-handled TRU waste feasibility analysis

    International Nuclear Information System (INIS)

    1986-12-01

    This report presents a feasibility study of centralized processing of contact-handled TRU waste. Scenarios, transportation options, a summary of cost estimates, and institutional issues are among the subjects discussed

  9. Analysis and Improvement of Healthcare Delivery Processes in ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    ... delivery processes given the district's limited human and financial resources. ... Throughout this period of instability and unrest, Lacor Hospital has delivered critical ... and strengthen the hospital's decision-making around healthcare delivery ...

  10. Condition Monitoring of a Process Filter Applying Wireless Vibration Analysis

    Directory of Open Access Journals (Sweden)

    Pekka KOSKELA

    2011-05-01

    Full Text Available This paper presents a novel wireless vibration-based method for monitoring the degree of feed filter clogging. In the process industry, these filters are applied to prevent impurities from entering the process. During operation, the filters gradually become clogged, decreasing the feed flow and, in the worst case, preventing it. The cleaning of the filter should therefore be carried out predictively in order to avoid equipment damage and unnecessary process downtime. The degree of clogging is estimated by first calculating time-domain indices from low-frequency accelerometer samples and then taking the median of the processed values. Nine different statistical quantities are compared with respect to estimation accuracy and criteria for operating in resource-constrained environments, with a particular focus on energy efficiency. The initial results show that the method is able to detect the degree of clogging, and the approach may be applicable to filter clogging monitoring.
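    The median-of-indices idea can be sketched in a few lines. The three statistics below (RMS, crest factor, kurtosis) are generic examples standing in for the nine quantities compared in the paper, and the input signal is synthetic.

```python
import numpy as np
from scipy.stats import kurtosis

def clogging_indicator(samples, window=256):
    """Median of simple time-domain indices over windows of accelerometer data."""
    windows = samples[: len(samples) // window * window].reshape(-1, window)
    rms = np.sqrt(np.mean(windows ** 2, axis=1))          # energy per window
    crest = np.max(np.abs(windows), axis=1) / rms         # peak-to-RMS ratio
    kurt = kurtosis(windows, axis=1)                      # impulsiveness
    return {"rms": np.median(rms),
            "crest": np.median(crest),
            "kurtosis": np.median(kurt)}

# Hypothetical low-frequency accelerometer trace
rng = np.random.default_rng(0)
signal = rng.normal(0, 1.0, 8192) + 0.5 * np.sin(np.linspace(0, 300, 8192))
print(clogging_indicator(signal))
```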

  11. Automated defect spatial signature analysis for semiconductor manufacturing process

    Science.gov (United States)

    Tobin, Jr., Kenneth W.; Gleason, Shaun S.; Karnowski, Thomas P.; Sari-Sarraf, Hamed

    1999-01-01

    An apparatus and method for performing automated defect spatial signature analysis on a data set representing defect coordinates and wafer processing information includes categorizing data from the data set into a plurality of high level categories, classifying the categorized data contained in each high level category into user-labeled signature events, and correlating the categorized, classified signature events to a present or incipient anomalous process condition.

  12. SPARO - A system for process analysis of refinery operations

    Energy Technology Data Exchange (ETDEWEB)

    Kesler, M.G.; Graham, J.; Weissbrod, J.

    1987-01-01

    SPARO is a customized process simulator for the PC, designed to review as well as to guide operations of hydrocarbon processing units. It can be applied to: gas plants or refinery gas recovery units; crude/vacuum towers with associated heat exchange; light ends units, such as reforming, alkylation and isomerization; fractionation and heat exchange units of Ethylene plants; aromatics and styrene units, and others. The main uses of SPARO are discussed in this paper.

  13. Performance analysis of Java APIS for XML processing

    OpenAIRE

    Oliveira, Bruno; Santos, Vasco; Belo, Orlando

    2013-01-01

    Over time, the XML markup language has acquired considerable importance in application development, standards definition and the representation of large volumes of data, such as databases. Today, processing XML documents in a short period of time is a critical activity in a large range of applications, which imposes choosing the most appropriate mechanism to parse XML documents quickly and efficiently. When using a programming language for XML processing, such as ...

  14. Data processing and analysis with the autoPROC toolbox

    Energy Technology Data Exchange (ETDEWEB)

    Vonrhein, Clemens, E-mail: vonrhein@globalphasing.com; Flensburg, Claus; Keller, Peter; Sharff, Andrew; Smart, Oliver; Paciorek, Wlodek; Womack, Thomas; Bricogne, Gérard [Global Phasing Ltd, Sheraton House, Castle Park, Cambridge CB3 0AX (United Kingdom)

    2011-04-01

    Typical topics and problems encountered during data processing of diffraction experiments are discussed and the tools provided in the autoPROC software are described. A typical diffraction experiment will generate many images and data sets from different crystals in a very short time. This creates a challenge for the high-throughput operation of modern synchrotron beamlines as well as for the subsequent data processing. Novice users in particular may feel overwhelmed by the tables, plots and numbers that the different data-processing programs and software packages present to them. Here, some of the more common problems that a user has to deal with when processing a set of images that will finally make up a processed data set are shown, concentrating on difficulties that may often show up during the first steps along the path of turning the experiment (i.e. data collection) into a model (i.e. interpreted electron density). Difficulties such as unexpected crystal forms, issues in crystal handling and suboptimal choices of data-collection strategies can often be dealt with, or at least diagnosed, by analysing specific data characteristics during processing. In the end, one wants to distinguish problems over which one has no immediate control once the experiment is finished from problems that can be remedied a posteriori. A new software package, autoPROC, is also presented that combines third-party processing programs with new tools and an automated workflow script that is intended to provide users with both guidance and insight into the offline processing of data affected by the difficulties mentioned above, with particular emphasis on the automated treatment of multi-sweep data sets collected on multi-axis goniostats.

  15. Waste Receiving and Processing (WRAP) Weight Scale Analysis Results

    International Nuclear Information System (INIS)

    JOHNSON, M.D.

    2000-01-01

    Fairbanks weight scales are used at the Waste Receiving and Processing (WRAP) facility to determine the weight of waste drums as they are received, processed, and shipped. Due to recent problems discovered during calibration, the WRAP Engineering Department has completed this document, which outlines both the investigation of the infeed conveyor scale failure in September of 1999 and recommendations for calibration procedure modifications designed to correct deficiencies in the current procedures

  16. [Psychosocial analysis of the health-disease process].

    Science.gov (United States)

    Sawaia, B B

    1994-04-01

    This article reflects on the transdisciplinary paradigms of the health-illness process, noting the symbolic mediation between the reactions of the biological organism and socio-environmental factors, including pathogenic ones. The symbolic-affective mediation is analyzed from the perspective of Social Representation theory, allowing one to comprehend the frames of reference of individual and collective actions in the health-illness process.

  17. A Lean Six Sigma Analysis of Student In-Processing

    Science.gov (United States)

    2012-12-01

    improvements. 4. Lt Justin Whipple: NPS Student Services representative, primary stakeholder and key source of implementation and control. ... Internet connections took over a week. The entire move process took two weeks and had to be scheduled around required check-in procedures. CPT Johnson ... processing procedure which represents, as such, a very important stakeholder. They are, however, a type 4 mixed blessing stakeholder. Although they ...

  18. Analysis of Work Design in Rubber Processing Plant

    OpenAIRE

    Wahyuni Dini; Nasution Harmein; Budiman Irwan; Wijaya Khairini

    2018-01-01

    Work design describes how jobs, tasks, and roles are structured, defined, and modified, and what impact they have on individuals, groups, and organizations. If work is not designed well, the company must pay greater costs for workers' health, longer production processes or even penalties for not being able to meet the delivery schedule. This is visible in the conditions at a rubber processing factory in North Sumatra. Work design aspects such as layouts, machinery and equipment, and the workers' physica...

  19. Analysis of Uncertainty in Dynamic Processes Development of Banks Functioning

    Directory of Open Access Journals (Sweden)

    Aleksei V. Korovyakovskii

    2013-01-01

    Full Text Available The paper offers an approach to estimating the measure of uncertainty in the dynamic processes of bank functioning, using statistical data on the indicators of different banking operations. To calculate the measure of uncertainty in the dynamic processes of bank functioning, the phase images of the relevant sets of statistical data are considered. It is shown that the form of the phase image of a studied set of statistical data can act as a basis for estimating the measure of uncertainty in the dynamic processes of bank functioning. A set of analytical characteristics is offered to formalize the definition of the form of the phase image of the studied sets of statistical data. It is shown that the offered analytical characteristics account for the unevenness of changes in the values of the studied sets of statistical data, which is one of the ways uncertainty manifests itself in the development of dynamic processes. Invariant estimates of the measure of uncertainty in the dynamic processes of banks' functioning were obtained, accounting for significant differences in the absolute values of the same indicators across different banks. Examples of calculating the measure of uncertainty in the dynamic processes of specific banks' functioning are given.

  20. Aspects of the risk analysis in the process engineering industry

    International Nuclear Information System (INIS)

    Hennings, W.; Madjar, M.; Mock, R.; Reer, B.

    1996-01-01

    This document is the result of a multi-disciplinary working group within a project called Risk Analysis for Chemical Plants. Within the framework of the project, only selected methods and tools of risk analysis, that is, methodological aspects, could be discussed and developed further. Case examples from the chemical industry are dealt with in order to discuss the application of computer-assisted quantitative error analysis in this industrial sector. A comprehensive documentation of the data and results utilised in the examples is also included. figs., tabs., refs

  1. NHI economic analysis of candidate nuclear hydrogen processes

    International Nuclear Information System (INIS)

    Allen, D.; Pickard, P.; Patterson, M.; Sink, C.

    2010-01-01

    The DOE Nuclear Hydrogen Initiative (NHI) is investigating candidate technologies for large scale hydrogen production using high temperature gas-cooled reactors (HTGR) in concert with the Next Generation Nuclear Plant (NGNP) programme. The candidate processes include high temperature thermochemical and high temperature electrolytic processes, which are being investigated in a sequence of experimental and analytic studies to establish the most promising and cost effective means of hydrogen production with nuclear energy. Although these advanced processes are in an early development stage, it is important that their projected economic potential be evaluated to assist in the prioritisation of research activities, and ultimately in the selection of the most promising processes for demonstration and deployment. The projected cost of hydrogen produced is the most comprehensive metric for comparing candidate processes. Since these advanced processes are in the early stages of development and much of the technology is still unproven, the estimated production costs are also significantly uncertain. The programme approach has been to estimate the cost of hydrogen production from each process periodically, based on the best available data at that time, with the intent of increasing fidelity and reducing uncertainty as the research programme and system definition studies progress. These updated cost estimates establish comparative costs at each stage of development, but are also used as inputs to the evaluation of research priorities, and identify the key cost and risk (uncertainty) drivers for each process. The economic methodology used to assess the candidate processes is based on the H2A ground rules and modelling tool (discounted cash flow) developed by the DOE Office of Energy Efficiency and Renewable Energy (EERE). The figure of merit output from the calculation is the necessary selling price for hydrogen in dollars per kilogram that satisfies the cost
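    The necessary selling price is the hydrogen price at which the project's discounted cash flow breaks even. Below is a heavily simplified stand-in for the H2A calculation, collapsing the year-by-year model into a capital recovery factor; all plant parameters are hypothetical placeholders, not NHI results.

```python
def necessary_selling_price(capex, fixed_om, annual_kg, rate, years):
    """Simplified break-even hydrogen price ($/kg).

    Uses a capital recovery factor (CRF) in place of H2A's full
    year-by-year discounted cash flow:
        CRF = r(1+r)^n / ((1+r)^n - 1)
        price = (CAPEX * CRF + annual O&M) / annual production
    """
    crf = rate * (1 + rate) ** years / ((1 + rate) ** years - 1)
    return (capex * crf + fixed_om) / annual_kg

price = necessary_selling_price(
    capex=600e6,      # $ plant capital cost (assumed)
    fixed_om=30e6,    # $/yr operations and maintenance (assumed)
    annual_kg=70e6,   # kg H2/yr (assumed ~200 t/day at 95% capacity factor)
    rate=0.10,        # assumed discount rate
    years=30)         # assumed plant life
print(f"necessary selling price: ${price:.2f}/kg H2")
```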

  2. Statistical analysis of non-homogeneous Poisson processes. Statistical processing of a particle multidetector

    International Nuclear Information System (INIS)

    Lacombe, J.P.

    1985-12-01

    The statistical study of non-homogeneous and spatial Poisson processes forms the first part of this thesis. A Neyman-Pearson type test is defined concerning the intensity measurement of these processes. Conditions are given under which the consistency of the test is assured, and others giving the asymptotic normality of the test statistics. Then some techniques for the statistical processing of Poisson fields, and their applications to the study of a particle multidetector, are given. Quality tests of the device are proposed together with signal extraction methods [fr
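    Non-homogeneous Poisson processes are commonly simulated by Lewis-Shedler thinning, which is also a convenient check for intensity-estimation procedures. A minimal sketch with an arbitrary sinusoidal intensity, unrelated to the thesis data:

```python
import math
import random

def simulate_nhpp(intensity, t_end, lam_max, rng):
    """Lewis-Shedler thinning for a non-homogeneous Poisson process.

    Candidates are drawn from a homogeneous process with rate lam_max
    (an upper bound on the intensity) and kept with probability
    intensity(t) / lam_max.
    """
    t, events = 0.0, []
    while True:
        t += rng.expovariate(lam_max)         # next candidate arrival
        if t > t_end:
            return events
        if rng.random() < intensity(t) / lam_max:
            events.append(t)                  # accepted NHPP event

# Example: sinusoidally modulated intensity, bounded above by lam_max = 5
rate = lambda t: 3.0 + 2.0 * math.sin(t)
events = simulate_nhpp(rate, t_end=20.0, lam_max=5.0, rng=random.Random(0))
expected = 3.0 * 20.0 + 2.0 * (1 - math.cos(20.0))   # integral of the intensity
print(f"{len(events)} events; expected about {expected:.1f}")
```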

  3. Analysis of the Growth Process of Neural Cells in Culture Environment Using Image Processing Techniques

    Science.gov (United States)

    Mirsafianf, Atefeh S.; Isfahani, Shirin N.; Kasaei, Shohreh; Mobasheri, Hamid

    Here we present an approach for processing neural cell images to analyze their growth process in a culture environment. We have applied several image processing techniques for: (1) environmental noise reduction, (2) neural cell segmentation, (3) neural cell classification based on the growth conditions of their dendrites, and (4) neuron feature extraction and measurement (e.g., cell body area, number of dendrites, axon length, and so on). Due to the large amount of noise in the images, we have used feed-forward artificial neural networks to detect edges more precisely.
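    Steps 1, 2 and 4 of such a pipeline can be sketched with OpenCV. The denoising, Otsu thresholding and area filter below are generic choices, and the paper's ANN-based edge refinement is not reproduced; the input file name and minimum area are hypothetical.

```python
import cv2

def segment_cells(image_path, min_area=50):
    """Minimal segmentation sketch: denoise, threshold, label, measure."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)           # noise reduction
    _, mask = cv2.threshold(blurred, 0, 255,
                            cv2.THRESH_BINARY + cv2.THRESH_OTSU)  # segmentation
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
    cells = []
    for i in range(1, n):                                  # label 0 = background
        area = stats[i, cv2.CC_STAT_AREA]
        if area >= min_area:                               # drop speckle noise
            cells.append({"area": int(area),
                          "centroid": tuple(centroids[i])})
    return cells

# cells = segment_cells("neurons.png")   # hypothetical input image
```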

  4. Cognitive Task Analysis of the Battalion Level Visualization Process

    National Research Council Canada - National Science Library

    Leedom, Dennis K; McElroy, William; Shadrick, Scott B; Lickteig, Carl; Pokorny, Robet A; Haynes, Jacqueline A; Bell, James

    2007-01-01

    ... position or as a battalion Operations Officer or Executive Officer. Based on findings from the cognitive task analysis, 11 skill areas were identified as potential focal points for future training development...

  5. Upper Midwest Gap Analysis Program, Image Processing Protocol

    National Research Council Canada - National Science Library

    Lillesand, Thomas

    1998-01-01

    This document presents a series of technical guidelines by which land cover information is being extracted from Landsat Thematic Mapper data as part of the Upper Midwest Gap Analysis Program (UMGAP...

  6. Balboa: A Framework for Event-Based Process Data Analysis

    National Research Council Canada - National Science Library

    Cook, Jonathan E; Wolf, Alexander L

    1998-01-01

    .... We have built Balboa as a bridge between the data collection and the analysis tools, facilitating the gathering and management of event data, and simplifying the construction of tools to analyze the data...

  7. Task Analysis data Processing and Enhanced Representations (TAPER), Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Task Analysis (TA) is a fundamental part of NASA system design and validation. TAs are used to produce Master Task Lists that support engineering teams and...

  8. Light ion microbeam analysis / processing system and its improvement

    International Nuclear Information System (INIS)

    Koka, Masashi; Ishii, Yasuyuki; Yamada, Naoto; Ohkubo, Takeru; Kamiya, Tomihiro; Satoh, Takahiro; Kada, Wataru; Kitamura, Akane; Iwata, Yoshihiro

    2016-03-01

    A MeV-class light ion microbeam system has been developed for micro-analysis and micro-fabrication with high spatial resolution at the 3-MV single-ended accelerator in the Takasaki Ion Accelerators for Advanced Radiation Application of the Takasaki Advanced Radiation Research Institute, Sector of Nuclear Science Research, Japan Atomic Energy Agency. This report describes the technical improvements to the main apparatus (the accelerator, beam-transport lines, and microbeam system) and the auxiliary equipment/parts for ion beam applications such as Particle Induced X-ray/Gamma-ray Emission (PIXE/PIGE) analysis, 3-D element distribution analysis using PIXE Computed Tomography (CT), Ion Beam Induced Luminescence (IBIL) analysis, and Proton Beam Writing with microbeam scanning, together with a functional outline of these apparatus and equipment/parts. (author)

  9. Analysis of the ATR fuel element swaging process

    International Nuclear Information System (INIS)

    Richins, W.D.; Miller, G.K.

    1995-12-01

    This report documents a detailed evaluation of the swaging process used to connect fuel plates to side plates in Advanced Test Reactor (ATR) fuel elements. Swaging is a mechanical process that begins with fitting a fuel plate into grooves in the side plates. Once a fuel plate is positioned, a lip on each of two side plate grooves is pressed into the fuel plate using swaging wheels to form the joints. Each connection must have a specified strength (measured in terms of a pullout force capacity) to assure that these joints do not fail during reactor operation. The purpose of this study is to analyze the swaging process and associated procedural controls, and to provide recommendations to assure that the manufacturing process produces swaged connections that meet the minimum strength requirement. The current fuel element manufacturer, Babcock and Wilcox (B&W) of Lynchburg, Virginia, follows established procedures that include quality inspections and process controls in swaging these connections. The procedures have been approved by Lockheed Martin Idaho Technologies and are designed to assure repeatability of the process and the structural integrity of each joint. Prior to July 1994, ATR fuel elements were placed in the Hydraulic Test Facility (HTF) at the Idaho National Engineering Laboratory (INEL), Test Reactor Area (TRA), for application of a Boehmite (an aluminum oxide) film and for checking structural integrity before placement of the elements into the ATR. The results presented in this report demonstrate that the pullout strength of the swaged connections is assured by the current manufacturing process (with several recommended enhancements) without the need for testing each element in the HTF

  10. Process for computing geometric perturbations for probabilistic analysis

    Science.gov (United States)

    Fitch, Simeon H. K. [Charlottesville, VA; Riha, David S [San Antonio, TX; Thacker, Ben H [San Antonio, TX

    2012-04-10

    A method for computing geometric perturbations for probabilistic analysis. The probabilistic analysis is based on finite element modeling, in which uncertainties in the modeled system are represented by changes in the nominal geometry of the model, referred to as "perturbations". These changes are accomplished using displacement vectors, which are computed for each node of a region of interest and are based on mean-value coordinate calculations.

  11. Army Information Technology Procurement: A Business Process Analysis

    Science.gov (United States)

    2015-03-27

    as ‘Other’. For example, funding for a system administrator to perform upkeep on an existing SQL server meets three ‘Item’ criteria and is marked as ‘Other.’ Expected duties are then explained at great... Objective Decision Analysis Tool (AAMODAT) is a Value-Based Analysis (VBA) tool designed for weapon procurement that could serve as a model for

  12. Medical device innovation and the value analysis process.

    Science.gov (United States)

    Krantz, Heidi; Strain, Barbara; Torzewski, Jane

    2017-09-01

    Heidi A. Krantz, RN, BSN is the Director of Value Analysis at Johns Hopkins Bayview Medical Center in the Johns Hopkins Health System. Barbara Strain, MA, CVAHP is the Director of Value Management at the University of Virginia Health System. Jane Torzewski RN, MAN, MBA is a Senior Category Manager for the Mayo Clinic Physician Preference Contracting team. She previously was a Senior Clinical Value Analyst on the Mayo Clinic Value Analysis team. Copyright © 2018. Published by Elsevier Inc.

  13. Digital image processing and analysis human and computer vision applications with CVIPtools

    CERN Document Server

    Umbaugh, Scott E

    2010-01-01

    Section I. Introduction to Digital Image Processing and Analysis: Digital Image Processing and Analysis (Overview; Image Analysis and Computer Vision; Image Processing and Human Vision; Key Points; Exercises; References; Further Reading); Computer Imaging Systems (Imaging Systems Overview; Image Formation and Sensing; CVIPtools Software; Image Representation; Key Points; Exercises; Supplementary Exercises; References; Further Reading). Section II. Digital Image Analysis and Computer Vision: Introduction to Digital Image Analysis (Introduction; Preprocessing; Binary Image Analysis; Key Points; Exercises; Supplementary Exercises; References; Further Reading)

  14. Analysis of the migration process of Colombian families in Spain

    Directory of Open Access Journals (Sweden)

    Adelina Gimeno Collado

    2014-04-01

    Full Text Available This study analyses migration as a process centred on the transnational family, as told by its main characters: migrants – parents and children – and their families in Colombia. The study is based on the systemic model and the methodology of the Grounded Theory approach. The migration process is triggered by a combination of push and pull factors. The pioneers, mainly women, have very diverse profiles. We highlight the difficulty of their first experiences, which they overcome through personal tenacity and external support. Despite the difficulties of the acculturation process, the overall outcome is positive, especially regarding their expectations for their children, who wish to stay in Spain having overcome the initial challenges of adaptation. Children experience their own acculturation process, but there is no conflict between children and parents despite their different acculturation levels. Although they had hoped that their integration process in Spain would be better, they are thankful for the support received. Decisions are made and adaptation occurs in the private domain, i.e., the family; however, there is a lack of group awareness or joint social action to improve conditions in the country of origin or integration in the host country.

  15. Dynamic analysis of a guided projectile during engraving process

    Directory of Open Access Journals (Sweden)

    Tao Xue

    2014-06-01

    Full Text Available The reliability of the electronic components inside a guided projectile is highly affected by the projectile's launch dynamics. The engraving process plays a crucial role in determining the ballistic performance and projectile stability. This paper analyzes the dynamic response of a guided projectile during the engraving process. By considering the movement of the projectile's center of gravity during engraving, a dynamics model is established, coupled with the interior ballistic equations. The results detail the stress situation of the guided projectile's band during engraving. The axial dynamic response of the projectile in the several milliseconds following the engraving process is also researched. To further explore how different engraving band designs affect the dynamics of the guided projectile, this paper focuses on two aspects: (a) the effects caused by different band geometries; and (b) the effects caused by different band materials. The time domain and frequency domain responses show that the dynamics of the projectile are quite sensitive to the engraving band width. A material with a small modulus of elasticity is more stable than one with a high modulus of elasticity.

  16. Preliminary process simulation and analysis of GMODS: Processing of plutonium surplus materials

    International Nuclear Information System (INIS)

    Ferrada, J.J.; Nehls, J.W. Jr.; Welch, T.D.; Giardina, J.L.; Forsberg, C.W.; Maliyekkel, A.T.

    1996-01-01

    To address growing concerns in the areas of arms control, control of fissile materials, waste management, and environment and health, the US Department of Energy is studying and evaluating various options for the control and disposal of surplus fissile materials (SFMs). One of the options under consideration is the Glass Material Oxidation and Dissolution System (GMODS), which directly converts plutonium-bearing materials such as metals, ceramics, and organics into a durable, high-quality glass for long-term storage or a waste form for disposal. This study developed a computer simulation of the GMODS process using FLOW. The simulation was used to assess how GMODS would handle the treatment of plutonium-rich scrap (RS) and lead scrap (LS) and to identify critical process parameters. Among the key process parameters affecting glass formation were the processing temperatures, the additives, and the effects of varying them on the final product. The assessment examined the quantity of glass produced, the quality of the final glass form, and the effect of blending different groups of the feed streams on the glass produced. The model also provided a way to test the current process assumptions and determine in which areas more experimental studies are required. The simulation showed that the glass chemistry postulated in the models is workable. It is expected that the glass chemistry assumed during the modeling can be verified by the results of the laboratory experiments currently being conducted on the GMODS process. Further waste characterization, especially of the SFM waste streams not studied in this report, will provide more accurate results and a more detailed evaluation of the GMODS process

  17. Preliminary process simulation and analysis of GMODS: Processing of plutonium surplus materials

    Energy Technology Data Exchange (ETDEWEB)

    Ferrada, J.J.; Nehls, J.W. Jr.; Welch, T.D.; Giardina, J.L.; Forsberg, C.W. [Oak Ridge National Lab., TN (United States); Maliyekkel, A.T. [Oak Ridge Associated Universities, TN (United States)

    1996-01-02

    To address growing concerns in the areas of arms control, control of fissile materials, waste management, and environment and health, the US Department of Energy is studying and evaluating various options for the control and disposal of surplus fissile materials (SFMs). One of the options under consideration is the Glass Material Oxidation and Dissolution System (GMODS), which directly converts plutonium-bearing materials such as metals, ceramics, and organics into a durable, high-quality glass for long-term storage or a waste form for disposal. This study developed a computer simulation of the GMODS process using FLOW. The simulation was used to assess how GMODS would handle the treatment of plutonium-rich scrap (RS) and lead scrap (LS) and to identify critical process parameters. Among the key process parameters affecting glass formation were the processing temperatures, the additives, and the effects of varying them on the final product. The assessment examined the quantity of glass produced, the quality of the final glass form, and the effect of blending different groups of the feed streams on the glass produced. The model also provided a way to test the current process assumptions and determine in which areas more experimental studies are required. The simulation showed that the glass chemistry postulated in the models is workable. It is expected that the glass chemistry assumed during the modeling can be verified by the results of the laboratory experiments currently being conducted on the GMODS process. Further waste characterization, especially of the SFM waste streams not studied in this report, will provide more accurate results and a more detailed evaluation of the GMODS process.

  18. Silicon Solar Cell Process Development, Fabrication and Analysis, Phase 1

    Science.gov (United States)

    Yoo, H. I.; Iles, P. A.; Tanner, D. P.

    1979-01-01

    Solar cells from RTR ribbons, EFG (RF and RH) ribbons, dendritic webs, Silso wafers, cast silicon by HEM, silicon on ceramic, and continuous Czochralski ingots were fabricated using a standard process typical of those currently used in the silicon solar cell industry. Back surface field (BSF) processing and other process modifications were included to give preliminary indications of possible performance improvements. The parameters measured included open-circuit voltage, short-circuit current, curve fill factor, and conversion efficiency (all taken under AM0 illumination). Also measured for typical cells were spectral response, dark I-V characteristics, minority carrier diffusion length, and photoresponse by fine light-spot scanning. The results were compared to the properties of cells made from conventional single-crystalline Czochralski silicon, with an emphasis on statistical evaluation. Limited efforts were made to identify growth defects that influence solar cell performance.
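
    The figures of merit listed above follow directly from the illuminated I-V curve. Below is a minimal, hedged sketch (not from the report) of how open-circuit voltage, short-circuit current, curve fill factor and AM0 conversion efficiency can be extracted from a measured sweep; the diode parameters of the demo curve are invented for illustration.

```python
import numpy as np

def cell_metrics(v, i, area_cm2, irradiance_w_cm2=0.1353):
    """Figures of merit from an illuminated I-V sweep (v ascending, i in A)."""
    p_max = (v * i).max()                        # maximum power point
    i_sc = np.interp(0.0, v, i)                  # short-circuit current at V = 0
    v_oc = np.interp(0.0, -i, v)                 # open-circuit voltage where I crosses 0
    ff = p_max / (v_oc * i_sc)                   # curve fill factor
    eff = p_max / (irradiance_w_cm2 * area_cm2)  # conversion efficiency under AM0
    return v_oc, i_sc, ff, eff

# Toy single-diode curve, purely for demonstration:
v = np.linspace(0.0, 0.6, 500)
i = 0.15 - 1e-9 * (np.exp(v / 0.026) - 1.0)
print(cell_metrics(v, i, area_cm2=4.0))
```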

  19. Economic analysis of waste recycle process in perhydropolysiloxazane synthesis

    International Nuclear Information System (INIS)

    Yun, Huichan; Yeom, Seungjong; Yang, Dae Ryook

    2014-01-01

    The perhydropolysiloxazane (PHPS) solution is widely used in the spin-on-dielectric (SOD) process to form a silicon oxide layer on a wafer in the semiconductor industry. To reduce the overall semiconductor manufacturing cost, PHPS solution production requires high productivity as well as low production cost. A large portion of the PHPS solution production cost is attributable to the large usage of solvents (pyridine and xylene), because more than 20 times the product mass in solvent is required to produce a unit mass of high-purity PHPS solution. Therefore, we suggest several plausible regeneration processes for the organic solvent waste from PHPS solution production, and their economics are evaluated for comparison

  20. iamxt: Max-tree toolbox for image processing and analysis

    Directory of Open Access Journals (Sweden)

    Roberto Souza

    2017-01-01

    The iamxt is an array-based max-tree toolbox implemented in Python using the NumPy library for array processing. It has state-of-the-art methods for building and processing the max-tree, and a large set of visualization tools that allow viewing the tree and the contents of its nodes. The array-based programming style and the max-tree representation used in the toolbox make it simple to use. The intended audience of this toolbox includes mathematical morphology students and researchers who want to do research in the field, as well as image processing researchers who need a toolbox that is simple to use and easy to integrate into their applications.
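
    As a concrete illustration of the max-tree idea (the tree of connected components of the upper threshold sets of a grayscale image), here is a hedged, self-contained sketch of one classic max-tree filter, area opening, built from NumPy and SciPy primitives. It deliberately does not use the iamxt API, whose exact calls are not given in the abstract; a real max-tree computes the same result in a single pass rather than one labeling per gray level.

```python
import numpy as np
from scipy import ndimage

def area_opening(img, min_area):
    """Remove bright connected structures smaller than min_area pixels."""
    out = np.zeros_like(img)
    for level in np.unique(img):                 # ascend through gray levels
        labels, n = ndimage.label(img >= level)  # components of the threshold set
        sizes = ndimage.sum(np.ones_like(img), labels, index=range(1, n + 1))
        keep = np.isin(labels, 1 + np.flatnonzero(sizes >= min_area))
        out[keep] = level                        # surviving components keep this level
    return out
```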

  1. Framework for adaptive multiscale analysis of nonhomogeneous point processes.

    Science.gov (United States)

    Helgason, Hannes; Bartroff, Jay; Abry, Patrice

    2011-01-01

    We develop the methodology for hypothesis testing and model selection in nonhomogeneous Poisson processes, with an eye toward the application of modeling and variability detection in heart beat data. Modeling the process' non-constant rate function using templates of simple basis functions, we develop the generalized likelihood ratio statistic for a given template and a multiple testing scheme to model-select from a family of templates. A dynamic programming algorithm inspired by network flows is used to compute the maximum likelihood template in a multiscale manner. In a numerical example, the proposed procedure is nearly as powerful as the super-optimal procedures that know the true template size and the true partition, respectively. Extensions to general history-dependent point processes are discussed.
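
    To make the ingredients concrete, the following hedged sketch (not the authors' code) simulates a nonhomogeneous Poisson process by thinning and evaluates the generalized likelihood ratio of a piecewise-constant rate template against a constant-rate null; the rate function, template size and all numbers are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def thinning(rate_fn, rate_max, t_end):
    """Lewis thinning: homogeneous candidates accepted w.p. rate(t)/rate_max."""
    t, events = 0.0, []
    while True:
        t += rng.exponential(1.0 / rate_max)
        if t > t_end:
            return np.array(events)
        if rng.uniform() < rate_fn(t) / rate_max:
            events.append(t)

def glr_piecewise(events, t_end, n_bins):
    """2 log LR of a piecewise-constant rate (one rate per bin) vs a constant rate."""
    dt = t_end / n_bins
    counts, _ = np.histogram(events, bins=n_bins, range=(0.0, t_end))
    n = counts.sum()
    with np.errstate(divide="ignore", invalid="ignore"):
        # MLEs: lambda_k = N_k/dt (alternative), lambda = N/T (null)
        ll_alt = np.where(counts > 0, counts * np.log(counts / dt), 0.0).sum() - n
    ll_null = n * np.log(n / t_end) - n
    return 2.0 * (ll_alt - ll_null)   # asymptotically chi^2 with n_bins-1 dof

events = thinning(lambda t: 5.0 + 4.0 * np.sin(t), rate_max=9.0, t_end=50.0)
print(glr_piecewise(events, t_end=50.0, n_bins=10))
```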

  2. Ontological Analysis of Integrated Process Models: testing hypotheses

    Directory of Open Access Journals (Sweden)

    Michael Rosemann

    2001-11-01

    Integrated process modeling is achieving prominence in helping to document and manage business administration and IT processes in organizations. The ARIS framework is a popular example of a framework for integrated process modeling, not least because it underlies the 800 or more reference models embedded in the world's most popular ERP package, SAP R/3. This paper demonstrates the usefulness of the Bunge-Wand-Weber (BWW) representation model for evaluating modeling grammars such as those constituting ARIS. It reports some initial insights gained from pilot testing Green and Rosemann's (2000) evaluative propositions. Even when considering all five views of ARIS, modelers have problems representing business rules, the scope and boundary of systems, and decomposing models. However, even though it is completely ontologically redundant, users still find the function view useful in modeling.

  3. Thermo-fluid dynamic analysis of wet compression process

    International Nuclear Information System (INIS)

    Mohan, Abhay; Kim, Heuy Dong; Chidambaram, Palani Kumar; Suryan, Abhilash

    2016-01-01

    Wet compression systems increase the useful power output of a gas turbine by reducing the compressor work through the reduction of the air temperature inside the compressor. The actual wet compression process differs from the conventional single-phase compression process due to the latent heat absorbed by the evaporating water droplets; the wet compression process therefore cannot be assumed isentropic. In the current investigation, the gas-liquid two-phase mixture is modeled as air containing dispersed water droplets inside a simple cylinder-piston system. The piston moves in the axial direction inside the cylinder to achieve wet compression. Effects on thermodynamic properties such as temperature, pressure and relative humidity are investigated in detail for different parameters such as compression speed and overspray. An analytical model is derived and the requisite thermodynamic curves are generated. The deviations of the generated thermodynamic curves from the dry isentropic curves (PV^γ = constant) are analyzed
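
    The role of the latent-heat sink can be seen in a very small numerical model. The sketch below is an illustration under crude assumptions (constant properties, ideal gas, a fixed evaporation rate per step), not the paper's analytical model; setting the evaporation to zero recovers the dry isentropic curve PV^γ = constant.

```python
import numpy as np

GAMMA = 1.4        # ratio of specific heats for dry air
CV = 718.0         # J/(kg K), air at constant volume
H_FG = 2.26e6      # J/kg, latent heat of vaporization of water

def compress(volume_ratios, t0=300.0, p0=1.0e5, evap_per_step=0.0):
    """March T and P through small compression steps (r = V_new/V_old < 1)."""
    t, p, v = t0, p0, 1.0
    for r in volume_ratios:
        t *= r ** (GAMMA - 1.0)            # dry isentropic temperature rise
        t -= evap_per_step * H_FG / CV     # latent-heat sink from droplets
        v *= r
        p = p0 * (t / t0) / v              # ideal gas with fixed mass
    return t, p

steps = np.full(50, 0.99)                  # overall compression to ~0.6 V0
print("dry:", compress(steps))
print("wet:", compress(steps, evap_per_step=2e-5))
```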

  4. Entrepreneurship Processes and Small Farms Achievements: Empirical Analysis of Linkage

    Directory of Open Access Journals (Sweden)

    Temidayo Gabriel Apata

    2015-01-01

    The entrepreneurship process has been described as opportunity-driven, creative, and resource-efficient, and it can influence the income of small farmers who adopt entrepreneurial skills and innovation in their farming operations. This study examines the entrepreneurship process strategies employed by small farmers to increase income, with evidence from southwest Nigeria. The sampling procedure entailed three stages of sample selection covering 240 farmers, of which only 200 responses were usable. Descriptive and inferential statistics were used to analyze and describe the data. Respondents' ages ranged from 16 to 65 years, with a mean of 36.16 years. The study found that 5% of the sample had modest communication skills that aid the adoption of effective entrepreneurial processes, and about 83% had a strong belief in their ability to succeed. Successful farmers had multiple sources of related income-generating business ventures. Targeting the entrepreneurs for support could make them even more effective.

  5. Thermo-fluid dynamic analysis of wet compression process

    Energy Technology Data Exchange (ETDEWEB)

    Mohan, Abhay; Kim, Heuy Dong [School of Mechanical Engineering, Andong National University, Andong (Korea, Republic of); Chidambaram, Palani Kumar [FMTRC, Daejoo Machinery Co. Ltd., Daegu (Korea, Republic of); Suryan, Abhilash [Dept. of Mechanical Engineering, College of Engineering Trivandrum, Kerala (India)

    2016-12-15

    Wet compression systems increase the useful power output of a gas turbine by reducing the compressor work through the reduction of the air temperature inside the compressor. The actual wet compression process differs from the conventional single-phase compression process due to the latent heat absorbed by the evaporating water droplets; the wet compression process therefore cannot be assumed isentropic. In the current investigation, the gas-liquid two-phase mixture is modeled as air containing dispersed water droplets inside a simple cylinder-piston system. The piston moves in the axial direction inside the cylinder to achieve wet compression. Effects on thermodynamic properties such as temperature, pressure and relative humidity are investigated in detail for different parameters such as compression speed and overspray. An analytical model is derived and the requisite thermodynamic curves are generated. The deviations of the generated thermodynamic curves from the dry isentropic curves (PV^γ = constant) are analyzed.

  6. Analysis of possible free quarks production process at hadron colliders

    International Nuclear Information System (INIS)

    Boos, E.E.; Ermolov, P.F.; Golubkov, Yu.A.

    1990-01-01

    The authors consider the process of free b-quark production in proton-antiproton collisions at the energies of new colliders. It is suggested to trigger this process on a pair of unlike-sign leptons with transverse momenta in the range p_T > 5 GeV/c. Additionally, it is suggested to measure the weak ionization signal from the free s-quark produced in the b-quark decay. Calculations of the free bb̄-quark production cross sections have been made taking into account the quarks' energy losses in the strong colour field. It is shown that the most effective range of lepton transverse momenta for observation of the process does not depend on the threshold energy and is approximately equal to that for ordinary b mesons. 16 refs.; 10 figs.

  7. Process integrated modelling for steelmaking Life Cycle Inventory analysis

    International Nuclear Information System (INIS)

    Iosif, Ana-Maria; Hanrot, Francois; Ablitzer, Denis

    2008-01-01

    During recent years, strict environmental regulations have been imposed by governments on the steelmaking industry in order to reduce its environmental impact. Within the framework of the ULCOS project, we have developed a new methodological framework which combines the process integrated modelling approach with the Life Cycle Assessment (LCA) method in order to carry out the Life Cycle Inventory of steelmaking. In the current paper, this new concept has been applied to the sinter plant, which is the most polluting steelmaking process. It has been shown that this approach is a powerful tool to make the collection of data easier, to save time and to provide reliable information concerning the environmental diagnostics of steelmaking processes

  8. Analysis and optimisation of a mixed fluid cascade (MFC) process

    Science.gov (United States)

    Ding, He; Sun, Heng; Sun, Shoujun; Chen, Cheng

    2017-04-01

    A mixed fluid cascade (MFC) process that comprises three refrigeration cycles has great capacity for large-scale LNG production, which consumes a great amount of energy. Therefore, any performance enhancement of the liquefaction process will significantly reduce the energy consumption. The MFC process is simulated and analysed using the proprietary software Aspen HYSYS. The effects of feed gas pressure, LNG storage pressure, water-cooler outlet temperature, different pre-cooling regimes, and liquefaction and sub-cooling refrigerant composition on MFC performance are investigated and presented. The excellent numerical capabilities and user-friendly interface of MATLAB™ are combined with the powerful thermo-physical property package of Aspen HYSYS, and a genetic algorithm is then invoked to optimise the MFC process globally. After optimisation, the unit power consumption can be reduced to 4.655 kW h/kmol, or 4.366 kW h/kmol, on condition that the compressor adiabatic efficiency is 80% or 85%, respectively. Additionally, to further improve the thermodynamic efficiency of the process, configuration optimisation is conducted for the MFC process and several configurations are established. By analysing heat transfer and thermodynamic performance, the configuration entailing a pre-cooling cycle with three pressure levels, a liquefaction cycle, and a sub-cooling cycle with one pressure level is identified as the most efficient and thus optimal: its unit power consumption is 4.205 kW h/kmol. Additionally, the mechanism responsible for the weak performance of the suggested liquefaction cycle configuration lies in the unbalanced distribution of cold energy in the liquefaction temperature range.
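
    The optimisation loop can be outlined in a few lines. The sketch below is a minimal genetic algorithm over refrigerant mole fractions in which the Aspen HYSYS flowsheet evaluation is replaced by a made-up surrogate objective; only the loop structure, not the objective or any of the numbers, reflects the study.

```python
import numpy as np

rng = np.random.default_rng(1)

def surrogate_power(x):
    """Placeholder for the HYSYS-evaluated unit power [kWh/kmol]."""
    return 4.2 + ((x - 0.3) ** 2).sum()

def normalise(pop):
    return pop / pop.sum(axis=1, keepdims=True)   # mole fractions sum to 1

pop = normalise(rng.uniform(0.05, 1.0, size=(40, 4)))    # 4 mixture components
for generation in range(100):
    fitness = np.array([surrogate_power(x) for x in pop])
    parents = pop[np.argsort(fitness)[:20]]              # truncation selection
    a = parents[rng.integers(0, 20, 40)]
    b = parents[rng.integers(0, 20, 40)]
    mask = rng.uniform(size=a.shape) < 0.5               # uniform crossover
    children = np.where(mask, a, b)
    children += rng.normal(0.0, 0.01, children.shape)    # Gaussian mutation
    pop = normalise(np.clip(children, 1e-3, None))
print("best unit power:", min(surrogate_power(x) for x in pop))
```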

  9. Students’ views on the block evaluation process: A descriptive analysis

    Directory of Open Access Journals (Sweden)

    Ntefeleng E. Pakkies

    2016-03-01

    Background: Higher education institutions have implemented policies and practices intended to assess and promote good teaching. Students' evaluation of the teaching and learning process is seen as one measure of evaluating the quality and effectiveness of instruction and courses. Policies and procedures guiding this process are discernible in universities, but this is often not the case for nursing colleges. Objective: To analyse and describe the views of nursing students on block evaluation, and on how feedback obtained from this process was managed. Method: A quantitative descriptive study was conducted amongst nursing students (n = 177) in their second to fourth year of training from one nursing college in KwaZulu-Natal. A questionnaire was administered by the researcher and data were analysed using the Statistical Package for the Social Sciences Version 19.0. Results: The response rate was 145 (81.9%). The participants perceived the aim of block evaluation as improving the quality of teaching and enhancing their experiences as students. They questioned the significance of their input as stakeholders, given that they had never been consulted about the development or review of the evaluation tool or the administration process, and they often did not receive feedback from the evaluations they participated in. Conclusion: The college management should develop a clear organisational structure with supporting policies and operational guidelines for administering the evaluation process. The administration, implementation procedures, reporting of results and follow-up mechanisms should be made transparent and communicated to all concerned. Reports and actions related to these evaluations should provide feedback into the relevant courses or programmes. Keywords: Student evaluation of teaching; perceptions; undergraduate nursing students; evaluation process

  10. Extending the LWS Data Environment: Distributed Data Processing and Analysis

    Science.gov (United States)

    Narock, Thomas

    2005-01-01

    The final stages of this work saw changes to the original framework, as well as the completion and integration of several data processing services. Initially, it was thought that a peer-to-peer architecture was necessary to make this work possible. The peer-to-peer architecture provided many benefits, including the dynamic discovery of new services as they were continually added. A prototype was built, and while it showed promise, a major disadvantage was that it was not easily integrated into the existing data environment. While the peer-to-peer system worked well for finding and accessing distributed data processing services, its use was limited by the difficulty of calling it from existing tools and services. After collaborations with members of the data community, it was determined that our data processing system was of high value and that a new interface should be pursued so that the community could take full advantage of it. As such, the framework was modified from a peer-to-peer architecture to a more traditional web service approach. Following this change, multiple data processing services were added, including coordinate transformations and subsetting of data. A collaboration with the Virtual Heliospheric Observatory (VHO) assisted with integrating the new architecture into the VHO; this allows anyone using the VHO to search for data and then pass that data through our processing services prior to downloading it. As a second demonstration of the new system, a collaboration was established with the Collaborative Sun Earth Connector (CoSEC) group at Lockheed Martin. This group is working on a graphical user interface to the Virtual Observatories and data processing software. The intent is to provide a high-level, easy-to-use graphical interface that will allow access to the existing Virtual Observatories and data processing services from one convenient application. Working with the CoSEC group we provided access to our data

  11. Solid municipal waste processing plants: Cost benefit analysis

    International Nuclear Information System (INIS)

    Gerardi, V.

    1992-01-01

    This paper performs cost-benefit analyses of three solid municipal waste processing alternatives with plants of diverse daily throughputs. The processing schemes include: incineration of selected wastes with the production of refuse-derived fuel; incineration of selected wastes with the production of refuse-derived fuel and compost; and pyrolysis with energy recovery in the form of electric power. The plant daily throughputs range from 100 to 300 tonnes for the refuse-derived fuel alternatives, and from 200 to 800 tonnes for the pyrolysis/power generation scheme. The cost analyses assume fifteen-year investment periods and a 5% interest rate
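
    As a worked example of the discounting arithmetic behind such an analysis, the sketch below computes a net present value over a fifteen-year period at a 5% interest rate; the cash-flow figures are invented, and only the horizon and the rate come from the abstract.

```python
def npv(rate, cashflows):
    """Net present value; cashflows[0] is year 0 (the investment)."""
    return sum(cf / (1.0 + rate) ** year for year, cf in enumerate(cashflows))

investment = -30e6          # hypothetical year-0 plant cost
annual_net = 3.2e6          # hypothetical yearly revenue minus operating cost
flows = [investment] + [annual_net] * 15
print(f"NPV at 5% over 15 years: {npv(0.05, flows) / 1e6:.2f} million")
```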

  12. Analysis of logistic process : Measuring performance using balanced score card

    OpenAIRE

    Kamthunzi, Eleanor

    2014-01-01

    All industrial companies want to be productive, to make a profit, and to expand their markets. To accomplish this, organizations work with different principles and strategies to improve their processes, so as to have the right item in the right quantity at the right place, in the right condition, for the right customer at the right time. This is where logistics management comes in, to make all of this possible. This thesis aims at looking into logistics as a process to sh...

  13. ALSAN - A system for disturbance analysis by process computers

    International Nuclear Information System (INIS)

    Felkel, L.; Grumbach, R.

    1977-05-01

    The program system ALSAN has been developed to process the large number of signals arising from a disturbance in a complex technical process, to recognize the information that is important for settling the disturbance within a minimum amount of time, and to display it to the operators. By means of the results, clear decisions can be made on which counteractions to take. The system works in on-line open-loop mode, and analyses disturbances autonomously as well as in dialog with the operators. (orig.) [de

  14. Multi-fluid CFD analysis in Process Engineering

    Science.gov (United States)

    Hjertager, B. H.

    2017-12-01

    An overview of the modelling and simulation of flow processes in gas/particle and gas/liquid systems is presented. Particular emphasis is given to computational fluid dynamics (CFD) models that use multi-dimensional multi-fluid techniques. Turbulence modelling strategies for gas/particle flows based on the kinetic theory of granular flows are given. Submodels for the interfacial transfer processes and chemical kinetics modelling are presented. Examples are shown for gas/particle systems, including flow and chemical reaction in risers, as well as gas/liquid systems, including bubble columns and stirred tanks.

  15. Reflector antenna analysis using physical optics on Graphics Processing Units

    DEFF Research Database (Denmark)

    Borries, Oscar Peter; Sørensen, Hans Henrik Brandenborg; Dammann, Bernd

    2014-01-01

    The Physical Optics approximation is a widely used asymptotic method for calculating the scattering from electrically large bodies. It requires significant computational work and little memory, and is thus well suited for application on a Graphics Processing Unit. Here, we investigate the perform...

  16. Bifurcation and stability analysis of a nonlinear milling process

    Science.gov (United States)

    Weremczuk, Andrzej; Rusinek, Rafal; Warminski, Jerzy

    2018-01-01

    Numerical investigations of the dynamics of milling operations are presented in this paper. A two-degree-of-freedom nonlinear model is used to study workpiece-tool vibrations. The analyzed model takes into account the flexibility of both the tool and the workpiece. The dynamics of the milling process is described by a discontinuous ordinary differential equation with time delay, which can cause process instability. First, stability lobe diagrams are created on the basis of parameters determined in an impact test of an end mill and workpiece. Next, bifurcation diagrams are computed for different values of the rotational speed.
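
    The regenerative mechanism behind such instability can be demonstrated with a far simpler model. The following hedged sketch integrates a classic one-degree-of-freedom chatter model (not the paper's two-degree-of-freedom model) with a fixed-step semi-implicit Euler scheme and a history buffer for the tooth-passing delay; all parameter values are invented.

```python
import numpy as np

m, c, k = 0.05, 10.0, 2.0e6    # modal mass [kg], damping [Ns/m], stiffness [N/m]
Kc, b = 6.0e8, 1.0e-3          # cutting-force coefficient [N/m^2], depth of cut [m]
rev_per_s = 300.0              # spindle speed
tau = 1.0 / (2 * rev_per_s)    # tooth period of a two-flute tool [s]

dt = 1.0e-6
n_delay = int(round(tau / dt))
x = np.zeros(200_000)          # displacement history [m]
v = 0.0
x[0] = 1e-9                    # tiny perturbation to seed the motion
for i in range(1, len(x)):
    x_delayed = x[i - n_delay] if i > n_delay else 0.0
    f_cut = Kc * b * (x_delayed - x[i - 1])    # regenerative cutting force
    v += dt * (f_cut - c * v - k * x[i - 1]) / m
    x[i] = x[i - 1] + dt * v
print("growing (chatter)" if np.abs(x).max() > 1e-6 else "decaying (stable cut)")
```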

  17. Analysis of Drying Process Quality in Conventional Dry-Kilns

    OpenAIRE

    Sedlar Tomislav; Pervan Stjepan

    2010-01-01

    This paper presents testing results of drying quality in a conventional dry kiln. The testing is based on a new methodology that shows how successfully the drying process is managed, by analyzing the quality of the drying process in a conventional dry kiln using a scientifically improved version of the check list for everyday practical applications. A company that specializes in lamel and classic parquet production was chosen so as to verify the new testing methodology. A total of 56 m3 of...

  18. Thermomechanical Stresses Analysis of a Single Event Burnout Process

    Science.gov (United States)

    Tais, Carlos E.; Romero, Eduardo; Demarco, Gustavo L.

    2009-06-01

    This work analyzes the thermal and mechanical effects arising in a power Diffusion Metal Oxide Semiconductor (DMOS) device during a Single Event Burnout (SEB) process. For studying these effects we propose a more detailed simulation structure than those previously used by other authors, solving the mathematical models by means of the Finite Element Method. We use a cylindrical heat generation region with 5 W, 10 W, 50 W and 100 W for emulating the thermal phenomena occurring during SEB processes, avoiding the complexity of the mathematical treatment of the ion-semiconductor interaction.

  19. Quantitative analysis of resource-constrained business processes

    NARCIS (Netherlands)

    Oliveira, C.A.L.; Lima, R.M.F.; Reijers, H.A.; Ribeiro, J.T.S.

    2012-01-01

    To address the need for evaluation techniques for complex business processes, also known as workflows, this paper proposes an approach based on generalized stochastic Petri nets (GSPNs). We review ten related approaches published in the last fifteen years and compare them to our approach using a
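
    A hedged sketch of the kind of quantitative question such models answer: the expected waiting time of cases competing for a limited pool of resources, here estimated by discrete-event simulation of a single activity with exponential inter-arrival and service times (an M/M/c approximation, not the paper's GSPN formalism); the rates and resource count are invented.

```python
import heapq
import random

random.seed(0)

def mean_wait(arrival_rate, service_rate, n_resources, n_cases=100_000):
    free_at = [0.0] * n_resources     # when each resource next becomes free
    heapq.heapify(free_at)
    t, total_wait = 0.0, 0.0
    for _ in range(n_cases):
        t += random.expovariate(arrival_rate)    # next case arrives
        start = max(t, free_at[0])               # earliest free resource
        total_wait += start - t
        heapq.heapreplace(free_at, start + random.expovariate(service_rate))
    return total_wait / n_cases

print("mean wait:", mean_wait(arrival_rate=4.0, service_rate=1.5, n_resources=3))
```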

  20. Temporal expectation and information processing: A model-based analysis

    NARCIS (Netherlands)

    Jepma, M.; Wagenmakers, E.-J.; Nieuwenhuis, S.

    2012-01-01

    People are able to use temporal cues to anticipate the timing of an event, enabling them to process that event more efficiently. We conducted two experiments, using the fixed-foreperiod paradigm (Experiment 1) and the temporal-cueing paradigm (Experiment 2), to assess which components of information

  1. Economic Analysis of an Organosolv Process for Bioethanol Production

    Directory of Open Access Journals (Sweden)

    Jesse Kautto

    2014-08-01

    In a previous paper, the conceptual process design, simulation, and mass and energy balances were presented for an organosolv process with a hardwood feed of 2350 metric tons (MT) per day and ethanol, lignin, furfural, and acetic acid production rates of 459, 310, 6.6, and 30.3 MT/day, respectively. In this paper, the investment and operating costs of the process and the minimum ethanol selling price (MESP) required to make the process economically feasible were estimated. The total capital investment of the plant was approximately 720 million USD. The lignin price was found to affect the MESP considerably. With a base-case lignin price of 450 USD/MT, the MESP was approximately 3.1 USD per gallon (gal). A higher lignin price of 1000 USD/MT was required to bring the MESP down to the December 2013 ethanol market price (2.0 USD/gal). In addition to the lignin price, the MESP was found to be strongly affected by feedstock, enzyme, and investment costs. Variations in feedstock and investment costs affected the MESP by approximately 0.2 and 0.5 USD/gal, respectively. Changing the enzyme price and dosage from the base-case estimates of 5270 USD/MT and 0.02 g/g cellulose to the more conservative 3700 USD/MT and 0.06 g/g cellulose, respectively, increased the MESP by 0.59 USD/gal.
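
    The lignin-credit arithmetic can be reproduced approximately. The sketch below uses a simplified cost structure (a fixed gross production cost per gallon minus a per-gallon lignin credit) calibrated only loosely to the figures quoted above; it is not the paper's cost model, and the gross-cost constant is an assumption chosen so that the two quoted operating points roughly come out.

```python
GAL_PER_MT_ETHANOL = 333.0    # approx. gallons per metric ton of ethanol

def mesp(lignin_price_usd_mt, gross_cost_usd_gal=4.0,
         lignin_mt_day=310.0, ethanol_mt_day=459.0):
    """MESP = gross production cost minus the per-gallon lignin credit."""
    credit = (lignin_price_usd_mt * lignin_mt_day
              / (ethanol_mt_day * GAL_PER_MT_ETHANOL))
    return gross_cost_usd_gal - credit

for price in (450, 1000):
    print(f"lignin at {price} USD/MT -> MESP {mesp(price):.2f} USD/gal")
# -> roughly 3.1 USD/gal at 450 USD/MT and 2.0 USD/gal at 1000 USD/MT
```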

  2. Processing cassava into chips for industry and export: analysis of ...

    African Journals Online (AJOL)

    Data collected were analyzed with descriptive (frequency, percentage and means) and inferential statistics. Results of the study showed that more women (56.1%) were involved in cassava processing than men (43.9%), and that a substantial proportion of the smallholder processors were ageing (59.1%) and no ...

  3. Analysis of Time Discretization and its Effect on Simulation Processes

    Directory of Open Access Journals (Sweden)

    Gilbert-Rainer Gillich

    2006-10-01

    The paper presents the influence of time discretization on the results of simulations of technical systems. The systems are modeled using the SciLab/SCICOS environment, using different time intervals. The processes are then simulated and the results are compared.

  4. Analysis of profitability and poverty reduction of yoghurt processing

    African Journals Online (AJOL)

    Admin

    KEY WORDS: Profitability, poverty reduction, yoghurt, processing, employment ... 70 percent of the rural working population (Joshua, 1999). With about 76 out of every 120 people living ... traditionally the difference between total revenue and ... (70%) of the respondents were males while 30% were females. The age ...

  5. Electron beam additive manufacturing with wire - Analysis of the process

    Science.gov (United States)

    Weglowski, Marek St.; Błacha, Sylwester; Pilarczyk, Jan; Dutkiewicz, Jan; Rogal, Łukasz

    2018-05-01

    The electron beam additive manufacturing process with wire is part of a global trend to find fast and efficient methods for producing complex-shaped elements from costly metal alloys such as stainless steels, nickel alloys, titanium alloys, etc., whose production by conventional technologies is unprofitable or technically impossible. Demand for additive manufacturing is linked to the development of new technologies in the automotive, aerospace and machinery industries. The aim of the presented work was to carry out research on electron beam additive manufacturing with a wire as the deposited (filler) material. The scope of the work was to investigate the influence of selected technological parameters, such as wire feed rate, beam current, travelling speed and accelerating voltage, on the stability of the deposition process and the geometric dimensions of the padding welds. The research revealed that at low beam currents the deposition process is unstable: the padding weld reinforcement is non-uniform, and irregularity of the width, height and straightness of the padding welds can be observed. At too high an accelerating voltage and beam current, burn-through of the plate and excessive weld penetration can occur. The achieved results and the knowledge gained made it possible to produce a complete stainless steel structure with the wire-based EBAM process.

  6. Risk-based analysis of business process executions

    NARCIS (Netherlands)

    Alizadeh, M.; Zannone, N.

    2016-01-01

    Organizations need to monitor their business processes to ensure that what actually happens in the system is compliant with the prescribed behavior. Deviations from the prescribed behavior may correspond to violations of security requirements and expose organizations to severe risks. Thus, it is

  7. Analysis of the work-hardening process in spheroidized steels

    International Nuclear Information System (INIS)

    Pacheco, J.L.

    1981-07-01

    An elementary model for the work-hardening process in duplex-structure steels (ferrite-spheroidite) is proposed and tested on steels of low, medium and high carbon content. The model appears to give good results concerning the influence of the volume fraction and particle size of the second phase on the work-hardening behaviour. (Author) [pt

  8. Processes of preparation, deposition and analysis of thermionic emissive substances

    International Nuclear Information System (INIS)

    Romao, B.M. Verdelli; Muraro Junior, A.; Tessaroto, L.A.B.; Takahashi, J.

    1992-09-01

    This paper presents the results of the optimization of the processes of preparation and deposition of thermionic emissive substances used in the oxide cathodes of the electron gun of the IEAv linear electron accelerator. (author). 5 refs., 5 figs.

  9. Analysis of multi-stage open shop processing systems

    NARCIS (Netherlands)

    Eggermont, C.E.J.; Schrijver, A.; Woeginger, G.J.; Schwentick, T.; Dürr, C.

    2011-01-01

    We study algorithmic problems in multi-stage open shop processing systems that are centered around reachability and deadlock detection questions. We characterize safe and unsafe system states. We show that it is easy to recognize system states that can be reached from the initial state (where the

  10. Eye Movement Analysis of Information Processing under Different Testing Conditions.

    Science.gov (United States)

    Dillon, Ronna F.

    1985-01-01

    Undergraduates were given complex figural analogy items, and their eye movements were observed under three types of feedback: (1) elaborate feedback; (2) subjects verbalized their thinking and application of rules; and (3) no feedback. Both feedback conditions enhanced rule-governed information processing during inductive reasoning. (Author/GDC)

  11. On-line sample processing methods in flow analysis

    DEFF Research Database (Denmark)

    Miró, Manuel; Hansen, Elo Harald

    2008-01-01

    In this chapter, the state of the art of flow injection and related approaches for the automation and miniaturization of sample processing, regardless of the aggregate state of the sample medium, is reviewed. The potential of the various generations of flow injection for implementation of in...

  12. Design and Analysis of Elliptical Nozzle in AJM Process using ...

    African Journals Online (AJOL)

    Abrasive jet machining (AJM) is a micromachining process in which material is removed from the workpiece by the erosion effect of a high-speed stream of abrasive particles carried in a gas medium emerging from a nozzle. Abrasive machining includes grinding, superfinishing, honing, lapping, polishing, etc.

  13. Use of NESTLE computer code for NPP transition process analysis

    International Nuclear Information System (INIS)

    Gal'chenko, V.V.

    2001-01-01

    A newly created WWER-440 reactor model using the NESTLE code is discussed, and results of 'fast' and 'slow' transition processes based on it are presented. The model was developed for the Rovno NPP reactor and can also be used for the WWER-1000 reactor at the Zaporozhe NPP.

  14. Contract Management Process Maturity: Empirical Analysis of Organizational Assessments

    Science.gov (United States)

    2009-08-27

    with the National Contract Management Association (NCMA), a Certified Purchasing Manager (CPM) with the Institute for Supply Management (ISM), and a ... include advertising procurement opportunities, conducting industry and pre-proposal conferences, and amending solicitation documents as required. ... with organizational core processes include advertising procurement opportunities, conducting solicitation and pre-proposal conferences, and amending ...

  15. Yield analysis at a poultry processing plant in Harare, Zimbabwe ...

    African Journals Online (AJOL)

    This investigation was conducted to establish the yield of parts or organs of chickens brought for slaughter at a poultry processing plant in Harare. Results of the study will furnish management and other poultry farmers with information that will enable them to identify yield losses and sustainable ways of minimizing resultant ...

  16. Energy analysis of the conventional textile washing process.

    NARCIS (Netherlands)

    Mozes, E.; Cornelissen, R.L.; Hirs, G.G.; Boom, R.M.

    1998-01-01

    In this paper the efficiency of the conventional textile washing process is examined. This is done by using the cumulative exergy consumption concept as developed by Szargut et al. Exergy is the quantity of work that can be extracted from material or energy by reversible processes. Cumulative exergy

  17. Signal analysis and processing for SmartPET

    International Nuclear Information System (INIS)

    Scraggs, David; Boston, Andrew; Boston, Helen; Cooper, Reynold; Hall, Chris; Mather, Andy; Nolan, Paul; Turk, Gerard

    2007-01-01

    Measurement of the transient charges induced on spectator electrodes is a critical requirement of the SmartPET project. This task requires the precise measurement of small-amplitude pulses. The magnitudes of the induced charges on the SmartPET detectors were therefore studied, and the suitability of wavelet analysis for de-noising the signals was investigated. It was found that the absolute net maximum induced-charge magnitude on the two electrodes adjacent to the collecting electrode is 17% of the real charge magnitude for the AC side and 20% for the DC side. It was also found that wavelet analysis could identify induced charges of magnitude comparable to the system noise
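
    For readers unfamiliar with the technique, the following hedged sketch shows wavelet de-noising of a toy detector pulse with the PyWavelets package; the pulse shape, noise level, wavelet family and threshold rule are illustrative choices, not the project's actual settings.

```python
import numpy as np
import pywt

rng = np.random.default_rng(2)
t = np.linspace(0.0, 1.0, 1024)
pulse = np.exp(-((t - 0.3) / 0.02) ** 2)           # toy transient charge pulse
noisy = pulse + rng.normal(0.0, 0.1, t.size)       # add system noise

coeffs = pywt.wavedec(noisy, "db4", level=5)       # multilevel DWT
sigma = np.median(np.abs(coeffs[-1])) / 0.6745     # MAD noise estimate
thresh = sigma * np.sqrt(2.0 * np.log(noisy.size)) # universal threshold
denoised = pywt.waverec(
    [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]],
    "db4",
)
print("residual RMS:", np.sqrt(np.mean((denoised - pulse) ** 2)))
```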

  18. Implementing SCRUM using Business Process Management and Pattern Analysis Methodologies

    Directory of Open Access Journals (Sweden)

    Ron S. Kenett

    2013-11-01

    The National Institute of Standards and Technology in the US has estimated that software defects and problems cost the U.S. economy 59.5 billion USD annually (http://www.abeacha.com/NIST_press_release_bugs_cost.htm). The study is only one of many that demonstrate the need for significant improvements in software development processes and practices. US Federal agencies, which depend on IT to support their missions and spent at least $76 billion on IT in fiscal year 2011, experienced numerous examples of lengthy IT projects that incurred cost overruns and schedule delays while contributing little to mission-related outcomes (www.gao.gov/products/GAO-12-681). To reduce the risk of such problems, the US Office of Management and Budget recommended deploying agile software delivery, which calls for producing software in small, short increments (GAO, 2012). Consistent with this recommendation, this paper is about the application of Business Process Management to the improvement of software and system development through SCRUM or agile techniques. It focuses on how organizational behavior and process management techniques can be integrated with knowledge management approaches to deploy agile development. The context of this work is a global company developing software solutions for service operators such as cellular phone operators. For a related paper with a comprehensive overview of agile methods in project management see Stare (2013). Through this comprehensive case study we demonstrate how such an integration can be achieved. SCRUM is a paradigm shift in many organizations in that it results in a new balance between focus on results and focus on processes. In order to describe this new paradigm of business processes, this work refers to Enterprise Knowledge Development (EKD), a comprehensive approach to mapping and documenting organizational patterns. In that context, the paper emphasizes the concept of patterns, reviews the main elements of SCRUM and shows how

  19. Molecular Analysis of Phr Peptide Processing in Bacillus subtilis†

    Science.gov (United States)

    Stephenson, Sophie; Mueller, Christian; Jiang, Min; Perego, Marta

    2003-01-01

    In Bacillus subtilis, an export-import pathway regulates production of the Phr pentapeptide inhibitors of Rap proteins. Processing of the Phr precursor proteins into the active pentapeptide form is a key event in the initiation of sporulation and competence development. The PhrA (ARNQT) and PhrE (SRNVT) peptides inhibit the RapA and RapE phosphatases, respectively, whose activity is directed toward the Spo0F∼P intermediate response regulator of the sporulation phosphorelay. The PhrC (ERGMT) peptide inhibits the RapC protein acting on the ComA response regulator for competence with regard to DNA transformation. The structural organization of PhrA, PhrE, and PhrC suggested a role for type I signal peptidases in the processing of the Phr preinhibitor, encoded by the phr genes, into the proinhibitor form. The proinhibitor was then postulated to be cleaved to the active pentapeptide inhibitor by an additional enzyme. In this report, we provide evidence that Phr preinhibitor proteins are subject to only one processing event at the peptide bond on the amino-terminal end of the pentapeptide. This processing event is most likely independent of type I signal peptidase activity. In vivo and in vitro analyses indicate that none of the five signal peptidases of B. subtilis (SipS, SipT, SipU, SipV, and SipW) are indispensable for Phr processing. However, we show that SipV and SipT have a previously undescribed role in sporulation, competence, and cell growth. PMID:12897006

  20. Molecular analysis of Phr peptide processing in Bacillus subtilis.

    Science.gov (United States)

    Stephenson, Sophie; Mueller, Christian; Jiang, Min; Perego, Marta

    2003-08-01

    In Bacillus subtilis, an export-import pathway regulates production of the Phr pentapeptide inhibitors of Rap proteins. Processing of the Phr precursor proteins into the active pentapeptide form is a key event in the initiation of sporulation and competence development. The PhrA (ARNQT) and PhrE (SRNVT) peptides inhibit the RapA and RapE phosphatases, respectively, whose activity is directed toward the Spo0F approximately P intermediate response regulator of the sporulation phosphorelay. The PhrC (ERGMT) peptide inhibits the RapC protein acting on the ComA response regulator for competence with regard to DNA transformation. The structural organization of PhrA, PhrE, and PhrC suggested a role for type I signal peptidases in the processing of the Phr preinhibitor, encoded by the phr genes, into the proinhibitor form. The proinhibitor was then postulated to be cleaved to the active pentapeptide inhibitor by an additional enzyme. In this report, we provide evidence that Phr preinhibitor proteins are subject to only one processing event at the peptide bond on the amino-terminal end of the pentapeptide. This processing event is most likely independent of type I signal peptidase activity. In vivo and in vitro analyses indicate that none of the five signal peptidases of B. subtilis (SipS, SipT, SipU, SipV, and SipW) are indispensable for Phr processing. However, we show that SipV and SipT have a previously undescribed role in sporulation, competence, and cell growth.

  1. Analysis of social relations among organizational units derived from process models and redesign of organization structure

    NARCIS (Netherlands)

    Choi, I.; Song, M.S.; Kim, K.M.; Lee, Y-H.

    2007-01-01

    Despite surging interests in analyzing business processes, there are few scientific approaches to analysis and redesign of organizational structures which can greatly affect the performance of business processes. This paper presents a method for deriving and analyzing organizational relations from

  2. Modeling and analysis of power processing systems: Feasibility investigation and formulation of a methodology

    Science.gov (United States)

    Biess, J. J.; Yu, Y.; Middlebrook, R. D.; Schoenfeld, A. D.

    1974-01-01

    A review is given of future power processing systems planned for the next 20 years and of the state of the art of the power processing design, modeling and analysis techniques used to optimize power processing systems. A methodology of modeling and analysis of power processing equipment and systems has been formulated to fulfill future tradeoff studies and optimization requirements. Computer techniques were applied to simulate power processor performance and to optimize the design of power processing equipment. A program plan to systematically develop and apply the tools for power processing systems modeling and analysis is presented so that meaningful results can be obtained each year to aid the power processing system engineer and power processing equipment circuit designers in their conceptual and detail design and analysis tasks.

  3. Analysis of process parameters for a DCMS process of a rotating ceramic ITO target

    Energy Technology Data Exchange (ETDEWEB)

    Ries, Patrick; Wuttig, Matthias [Institute of Physics, RWTH Aachen University (Germany)

    2012-07-01

    ITO is the most commonly used, but at the same time rather expensive, transparent conducting oxide. This is due to the high indium-to-tin ratio of 90:10 that is necessary to obtain the best electrical conductivity. If another ratio with similar electrical properties but higher tin content could be found, this would be of great industrial relevance. To accomplish this goal and to check the hypothesis, an in-house developed serial co-sputtering system is employed. The tool consists of a rotating primary cathode and up to two secondary cathodes for co-sputtering processes. The process parameters of a DC-sputtered ceramic ITO target installed on the primary cathode are analyzed, and correlations with the thin-film properties, especially the resistance and the transmittance, are shown. The resistance behavior upon changing the tin content via a co-deposition process from a secondary cathode will be presented.

  4. Design of Process Displays based on Risk Analysis Techniques

    DEFF Research Database (Denmark)

    Paulsen, Jette Lundtang

    On the basis of her experience with the design of display systems, with risk analysis methods, and from 8 years as an engineer-on-shift at a research reactor, the author developed a method to elicit the information necessary to the operator. The method, a combination of a Goal-Tree and a Fault-Tree, is described...

  5. Stochastic processes analysis in nuclear reactor using ARMA models

    International Nuclear Information System (INIS)

    Zavaljevski, N.

    1990-01-01

    An analysis of the ARMA model derived from the general stochastic state equations of a nuclear reactor is given. The dependence of the ARMA model parameters on the main physical characteristics of the RB nuclear reactor in Vinca is presented. Preliminary identification results are presented, observed discrepancies between theory and experiment are explained, and possibilities for improving the identification are outlined. (author)
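
    As an illustration of the kind of identification step the abstract describes, the hedged sketch below simulates an ARMA(2,1) noise record and re-estimates its parameters with statsmodels; the orders and coefficients are invented and have no connection to the RB reactor data.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.arima_process import arma_generate_sample

np.random.seed(3)
# Lag polynomials with leading 1: y_t = 1.2 y_{t-1} - 0.5 y_{t-2} + e_t + 0.4 e_{t-1}
ar = [1.0, -1.2, 0.5]
ma = [1.0, 0.4]
y = arma_generate_sample(ar, ma, nsample=5000)

fit = ARIMA(y, order=(2, 0, 1), trend="n").fit()
print(fit.params)    # estimated AR/MA coefficients and innovation variance
```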

  6. Social Information Processing Analysis (SIPA): Coding Ongoing Human Communication.

    Science.gov (United States)

    Fisher, B. Aubrey; And Others

    1979-01-01

    The purpose of this paper is to present a new analytical system to be used in communication research. Unlike many existing systems devised ad hoc, this research tool, a system for interaction analysis, is embedded in a conceptual rationale based on modern systems theory. (Author)

  7. Mesh Processing in Medical-Image Analysis-a Tutorial

    DEFF Research Database (Denmark)

    Levine, Joshua A.; Paulsen, Rasmus Reinhold; Zhang, Yongjie

    2012-01-01

    Medical-image analysis requires an understanding of sophisticated scanning modalities, constructing geometric models, building meshes to represent domains, and downstream biological applications. These four steps form an image-to-mesh pipeline. For research in this field to progress, the imaging...

  8. Joint time frequency analysis in digital signal processing

    DEFF Research Database (Denmark)

    Pedersen, Flemming

    A problem with this technique is that the resolution is limited because of distortion. To overcome the resolution limitations of the Fourier spectrogram, many new distributions have been developed. In spite of this, the Fourier spectrogram is by far the primary method for the analysis of signals whose spectral content is varying...

  9. Rethinking the process of operational research & systems analysis

    CERN Document Server

    Tomlinson, R

    1984-01-01

    Invited contributions from distinguished practitioners and methodologists of operational research and applied systems analysis which represent a true state-of-the-art and which provide, perhaps for the first time, a coherent, interlocking, set of ideas which may be considered the foundations of the subject as a science in its own right.

  10. The National Health Educator Job Analysis 2010: Process and Outcomes

    Science.gov (United States)

    Doyle, Eva I.; Caro, Carla M.; Lysoby, Linda; Auld, M. Elaine; Smith, Becky J.; Muenzen, Patricia M.

    2012-01-01

    The National Health Educator Job Analysis 2010 was conducted to update the competencies model for entry- and advanced-level health educators. Qualitative and quantitative methods were used. Structured interviews, focus groups, and a modified Delphi technique were implemented to engage 59 health educators from diverse work settings and experience…

  11. Nutritional and toxicological composition analysis of selected cassava processed products

    Directory of Open Access Journals (Sweden)

    Kuda Dewage Supun Charuni Nilangeka Rajapaksha

    2017-01-01

    Cassava (Manihot esculenta Crantz) is an important food source in tropical countries, where it can withstand environmentally stressed conditions. Cassava and its processed products are in high demand in both the local and export markets of Sri Lanka. MU51 is one of the more common cassava varieties, and boiling is the main way cassava is consumed among Sri Lankans. The underutilization of cassava is due to the presence of cyanide, a toxic substance. This research was designed to analyse the nutritional composition and toxicological (cyanide) content of the cassava MU51 variety and selected processed products of cassava MU51 (boiled, starch, flour, chips, and two chip varieties purchased from the market) to identify the effect of processing on the MU51 variety. Nutritional composition was analysed by AOAC (2012) methods with modifications, and cyanide content was determined following the picric acid method of spectrophotometric determination. The flesh of the MU51 variety and its processed products had moisture contents of 3.18-61.94%, total fat of 0.31-23.30%, crude fiber of 0.94-2.15%, protein of 1.67-3.71% and carbohydrates of 32.68-84.20%; these varied significantly between the products and the MU51 flesh, while no significant difference (p > 0.05) was observed in ash content, which ranged from 1.02 to 1.91%. However, the boiled product and the MU51 flesh had very similar nutritional compositions, showing no significant difference in any of the nutrients analysed; thus boiling appears to have no significant effect on the nutrient composition of raw cassava. The cyanide content of the MU51 flesh and the selected products (boiled, starch, flour and chips) prepared using the MU51 variety showed wide variation, ranging from 4.68 mg.kg-1 to 33.92 mg.kg-1 on a dry basis. However, except for boiled cassava, all processed products had cyanide contents <10 mg.kg-1, which
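
    The spectrophotometric determination reduces to a linear calibration. Below is a hedged sketch of that arithmetic: fit absorbance against known cyanide standards by least squares and invert the line for an unknown sample. All concentrations and absorbance values are invented for illustration and do not come from the study.

```python
import numpy as np

std_conc = np.array([0.0, 2.0, 4.0, 8.0, 16.0])        # standards, mg/kg
absorbance = np.array([0.02, 0.11, 0.20, 0.39, 0.78])  # measured absorbances

slope, intercept = np.polyfit(std_conc, absorbance, 1) # Beer-Lambert line
sample_abs = 0.33
sample_conc = (sample_abs - intercept) / slope
print(f"cyanide in sample: {sample_conc:.1f} mg/kg")
```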

  12. Blind Time-Frequency Analysis for Source Discrimination in Multisensor Array Processing

    National Research Council Canada - National Science Library

    Amin, Moeness

    1999-01-01

    We have clearly demonstrated, through analysis and simulations, the offerings of time-frequency distributions in solving key problems in sensor array processing, including direction finding, source...

  13. The Analysis of a Real Life Declarative Process

    DEFF Research Database (Denmark)

    Debois, Søren; Slaats, Tijs

    2015-01-01

    This paper reports on a qualitative study of the use of declarative process notations in a commercial setting. Specifically, we investigate the actual use of a system implemented in terms of DCR graphs for the Danish "Dreyer Foundation" by our industry partner Exformatics A/S. The study... by the declarative model, and (2) use process discovery techniques to examine if a perfect-fitness flow-based model representing the main business constraints is in fact easy to come by. For (1), we find evidence in various forms, most notably an apparent change in best practices by end-users allowed by the model... For (2), we find no such model. We leave as a challenge to the community the construction of a flow-based model adequately representing the business constraints and supporting all observed behaviour by the users, whether by hand or by mining...

  14. High Energy Astronomical Data Processing and Analysis via the Internet

    Science.gov (United States)

    Valencic, Lynne A.; Snowden, S.; Pence, W.

    2012-01-01

    The HEASARC at NASA Goddard Space Flight Center and the US XMM-Newton GOF has developed Hera, a data processing facility for analyzing high energy astronomical data over the internet. Hera provides all the disk space and computing resources needed to do general processing of and advanced research on publicly available data from High Energy Astrophysics missions. The data and data products are kept on a server at GSFC and can be downloaded to a user's local machine. Further, the XMM-GOF has developed scripts to streamline XMM data reduction. These are available through Hera, and can also be downloaded to a user's local machine. These are free services provided to students, educators, and researchers for educational and research purposes.

  15. Process analysis of catalytic multi-stage hydropyrolysis of lignite

    Energy Technology Data Exchange (ETDEWEB)

    Li, W.; Wang, N.; Li, B. [Chinese Academy of Sciences, Taiyuan (China). Inst. of Coal Chemistry, State Key Laboratory of Coal Conversion

    2002-08-01

    The process and mechanism of the multi-stage hydropyrolysis (MHyPy) of coal were investigated by analyzing the products of different MHyPy processes in detail. The results showed that the suitable holding temperature is near the peak temperature (350-500°C), at which more free radicals are produced rapidly, so more oil is formed and the hydrogen utilization efficiency is increased. The cleavage of organic functional groups in the char from MHyPy was mostly affected by the pyrolysis temperature. The effect of retention was to change the product distribution through stabilization of the free radicals and hydrogenation of the heavier products. In the holding stage, the specific surface area and average pore volume of the char increased due to the escape of more hydrogenation products. 18 refs., 8 figs., 3 tabs.

  16. Analysis of reaction and transport processes in zinc air batteries

    CERN Document Server

    Schröder, Daniel

    2016-01-01

    This book contains a novel combination of experimental and model-based investigations elucidating the complex processes inside zinc air batteries. The work presented helps to answer which battery composition and which air composition should be adjusted to maintain stable and efficient charge/discharge cycling. In detail, electrochemical investigations and X-ray transmission tomography are applied to button-cell zinc air batteries and in-house set-ups. Moreover, model-based investigations of the battery anode and of the impact of relative humidity, active operation, carbon dioxide and oxygen on zinc air battery operation are presented. The techniques used in this work complement each other well and yield an unprecedented understanding of zinc air batteries. The methods applied are adaptable and can potentially be applied to gain further understanding of other metal air batteries. Contents: Introduction on Zinc Air Batteries; Characterizing Reaction and Transport Processes; Identifying Factors for Long-Term Stable O...

  17. Analysis of some methods for reduced rank Gaussian process regression

    DEFF Research Database (Denmark)

    Quinonero-Candela, J.; Rasmussen, Carl Edward

    2005-01-01

    While there is strong motivation for using Gaussian Processes (GPs) due to their excellent performance in regression and classification problems, their computational complexity makes them impractical when the size of the training set exceeds a few thousand cases. This has motivated the recent proliferation of a number of cost-effective approximations to GPs, both for classification and for regression. In this paper we analyze one popular approximation to GPs for regression: the reduced rank approximation. While generally GPs are equivalent to infinite linear models, we show that Reduced Rank Gaussian Processes (RRGPs) are equivalent to finite sparse linear models. We also introduce the concept of degenerate GPs and show that they correspond to inappropriate priors. We show how to modify the RRGP to prevent it from being degenerate at test time. Training RRGPs consists both in learning...
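
    As a concrete illustration of the finite-linear-model view, the following minimal numpy sketch implements the standard reduced-rank (subset-of-regressors) predictive mean over a support set of m points; the squared-exponential kernel, the support-set size, and the toy data are illustrative choices, not taken from the paper.

        import numpy as np

        def rbf(A, B, ell=1.0, sf=1.0):
            # Squared-exponential kernel matrix between row-stacked point sets A and B.
            d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
            return sf**2 * np.exp(-0.5 * d2 / ell**2)

        def rrgp_fit(X, y, Xm, noise=0.1):
            # Weights of the finite linear model f(x) = k(x, Xm) @ alpha.
            Kmn = rbf(Xm, X)
            Kmm = rbf(Xm, Xm) + 1e-8 * np.eye(len(Xm))  # jitter for stability
            A = Kmn @ Kmn.T + noise**2 * Kmm
            return np.linalg.solve(A, Kmn @ y)

        def rrgp_predict(Xs, Xm, alpha):
            return rbf(Xs, Xm) @ alpha

        # Toy usage: 500 noisy samples of sin(x), support set of m = 20 points.
        rng = np.random.default_rng(0)
        X = rng.uniform(-3, 3, (500, 1))
        y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(500)
        Xm = X[rng.choice(500, 20, replace=False)]
        alpha = rrgp_fit(X, y, Xm)
        print(rrgp_predict(np.array([[0.0]]), Xm, alpha))  # close to sin(0) = 0

    The payoff is cost: the linear solve involves only an m x m system, so training scales as O(nm^2) rather than the O(n^3) of an exact GP.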

  18. Energy and environmental analysis of a rapeseed biorefinery conversion process

    DEFF Research Database (Denmark)

    Boldrin, Alessio; Balzan, Alberto; Astrup, Thomas Fruergaard

    2013-01-01

    A life cycle assessment (LCA)-based environmental assessment of a Danish biorefinery system was carried out to thoroughly analyze and optimize the concept and address future research. The LCA study was based on case-specific mass and energy balances and inventory data, and was conducted using a consequential LCA approach to take into account the market mechanisms determining the fate of products, lost opportunities and marginal productions. The results show that the introduction of enzymatic transesterification and an improved oil extraction procedure results in environmental benefits compared to a traditional process. Utilization of rapeseed straw seems to have positive effects on the greenhouse gas (GHG) footprint of the biorefinery system, with improvements in the range of 9% to 29%, depending on the considered alternative. The mass and energy balances showed the potential for improvement of straw treatment processes (hydrothermal pre-treatment and dark...

  19. Image acquisitions, processing and analysis in the process of obtaining characteristics of horse navicular bone

    Science.gov (United States)

    Zaborowicz, M.; Włodarek, J.; Przybylak, A.; Przybył, K.; Wojcieszak, D.; Czekała, W.; Ludwiczak, A.; Boniecki, P.; Koszela, K.; Przybył, J.; Skwarcz, J.

    2015-07-01

    The aim of this study was to investigate the possibility of using methods of computer image analysis for the assessment and classification of morphological variability and the state of health of the horse navicular bone. The assumption was that classification could be based on the information contained in two-dimensional digital images of the navicular bone together with information on the horse's health. The first step in the research was to define the classes of analyzed bones, and then to use methods of computer image analysis to obtain characteristics from these images. These characteristics were correlated with data concerning the animal, such as: side of hooves, navicular syndrome grade (scale 0-3), type, sex, age, weight, information about lace, information about heel. This paper is an introduction to the study of the use of neural image analysis in the diagnosis of navicular bone syndrome. The prepared method can serve as an introduction to the study of a non-invasive way to assess the condition of the horse navicular bone.
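
    The pipeline the abstract describes (extract numeric characteristics from 2-D images, then classify them with a neural model) can be sketched as below; the hand-picked features, the network size, and the synthetic stand-in data are illustrative assumptions (using numpy and scikit-learn), not the authors' actual method.

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPClassifier

        def bone_features(img):
            # Toy descriptors for a 2-D grayscale image with values in [0, 1]:
            # mean/std intensity, mean gradient magnitude, coarse 8-bin histogram.
            gy, gx = np.gradient(img.astype(float))
            hist, _ = np.histogram(img, bins=8, range=(0.0, 1.0), density=True)
            return np.concatenate(([img.mean(), img.std(), np.hypot(gx, gy).mean()], hist))

        # Synthetic stand-in data: 200 random 64x64 "radiographs", 4 grades (0-3).
        rng = np.random.default_rng(42)
        imgs = rng.random((200, 64, 64))
        grades = rng.integers(0, 4, 200)

        X = np.array([bone_features(im) for im in imgs])
        Xtr, Xte, ytr, yte = train_test_split(X, grades, random_state=0)

        clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
        clf.fit(Xtr, ytr)
        print("held-out accuracy:", clf.score(Xte, yte))  # near chance on random data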

  20. Using Critical Path Analysis (CPA) in Place Marketing process

    OpenAIRE

    Metaxas, Theodore; Deffner, Alex

    2013-01-01

    The article advocates the use of CPA as a methodological tool in Place Marketing implementation. Taking into account that Place Marketing is a strategic process with a 'project' character, i.e. particular actions within a specific time horizon, the article proposes that CPA has the capacity to satisfy this hypothesis. For this reason, the article creates a hypothetical scenario of CPA in four phases (planning, programming, implementation and feedback), taking as a case study the city of Rostock in Germany...
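
    CPA itself reduces to a forward pass over a task network: each task's earliest finish time is its duration plus the largest earliest finish among its predecessors, and the critical path is read off by walking back through the maximizing predecessors. A minimal Python sketch follows; the four tasks mirror the phases named in the abstract, but the durations and dependencies are invented for illustration.

        from graphlib import TopologicalSorter

        # Hypothetical place-marketing plan: task -> (duration in weeks, dependencies)
        tasks = {
            "planning":       (4, []),
            "programming":    (6, ["planning"]),
            "implementation": (8, ["programming"]),
            "feedback":       (2, ["implementation"]),
        }

        # Visit tasks so that every task comes after its dependencies.
        order = TopologicalSorter({t: d for t, (_, d) in tasks.items()}).static_order()

        ef = {}  # forward pass: earliest finish time of each task
        for t in order:
            dur, deps = tasks[t]
            ef[t] = dur + max((ef[d] for d in deps), default=0)

        # Walk back from the latest-finishing task along maximizing predecessors.
        path, t = [], max(ef, key=ef.get)
        while t is not None:
            path.append(t)
            deps = tasks[t][1]
            t = max(deps, key=ef.get) if deps else None
        print("critical path:", " -> ".join(reversed(path)), "| duration:", ef[path[0]])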