WorldWideScience

Sample records for qplanar processing method

  1. Comparison of conventional, model-based quantitative planar, and quantitative SPECT image processing methods for organ activity estimation using In-111 agents

    International Nuclear Information System (INIS)

    He, Bin; Frey, Eric C

    2006-01-01

Accurate quantification of organ radionuclide uptake is important for patient-specific dosimetry. The quantitative accuracy of conventional conjugate view methods is limited by overlap of projections from different organs and background activity, as well as by attenuation and scatter. In this work, we propose and validate a quantitative planar (QPlanar) processing method based on maximum likelihood (ML) estimation of organ activities using 3D organ VOIs and a projector that models the image degrading effects. Both a physical phantom experiment and Monte Carlo simulation (MCS) studies were used to evaluate the new method. In these studies, the accuracies and precisions of organ activity estimates from the QPlanar method were compared with those from conventional planar (CPlanar) processing methods with various corrections for scatter, attenuation and organ overlap, and from a quantitative SPECT (QSPECT) processing method. Experimental planar and SPECT projections and registered CT data from an RSD Torso phantom were obtained using a GE Millennium VH/Hawkeye system. The MCS data were obtained from the 3D NCAT phantom with organ activity distributions that modelled the uptake of 111In ibritumomab tiuxetan. The simulations were performed using parameters appropriate for the same system used in the RSD torso phantom experiment. The organ activity estimates obtained from the CPlanar, QPlanar and QSPECT methods from both experiments were compared. The results of the MCS experiment showed that, even with ideal organ overlap correction and background subtraction, CPlanar methods provided limited quantitative accuracy. The QPlanar method, with accurate modelling of the physical factors, increased the quantitative accuracy at the cost of requiring estimates of the organ VOIs in 3D. The accuracy of QPlanar approached that of QSPECT, but required much less acquisition and computation time. Similar results were obtained from the physical phantom experiment. 
We conclude that the QPlanar method, based
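
As a point of reference, the conventional conjugate-view (CPlanar) quantification that QPlanar is compared against can be sketched as follows; the parameter names and values are illustrative, not taken from the paper:

```python
import math

def conjugate_view_activity(counts_ant, counts_post, mu, thickness_cm, cal_cps_per_mbq):
    """Estimate organ activity (MBq) from anterior/posterior planar counts
    using the geometric mean with a simple attenuation correction.
    All parameters here are illustrative, not from the paper."""
    geometric_mean = math.sqrt(counts_ant * counts_post)
    attenuation_correction = math.exp(mu * thickness_cm / 2.0)
    return geometric_mean * attenuation_correction / cal_cps_per_mbq

# Hypothetical example: 4000/2500 cps, mu = 0.12 /cm (approximate for tissue),
# 20 cm body thickness, system calibration 150 cps/MBq
activity = conjugate_view_activity(4000, 2500, 0.12, 20.0, 150.0)
```

The limitations the abstract lists (organ overlap, background, scatter) are exactly what this simple formula cannot separate, which is what motivates the model-based QPlanar approach.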

  2. EQPlanar: a maximum-likelihood method for accurate organ activity estimation from whole body planar projections

    International Nuclear Information System (INIS)

    Song, N; Frey, E C; He, B; Wahl, R L

    2011-01-01

    Optimizing targeted radionuclide therapy requires patient-specific estimation of organ doses. The organ doses are estimated from quantitative nuclear medicine imaging studies, many of which involve planar whole body scans. We have previously developed the quantitative planar (QPlanar) processing method and demonstrated its ability to provide more accurate activity estimates than conventional geometric-mean-based planar (CPlanar) processing methods using physical phantom and simulation studies. The QPlanar method uses the maximum likelihood-expectation maximization algorithm, 3D organ volume of interests (VOIs), and rigorous models of physical image degrading factors to estimate organ activities. However, the QPlanar method requires alignment between the 3D organ VOIs and the 2D planar projections and assumes uniform activity distribution in each VOI. This makes application to patients challenging. As a result, in this paper we propose an extended QPlanar (EQPlanar) method that provides independent-organ rigid registration and includes multiple background regions. We have validated this method using both Monte Carlo simulation and patient data. In the simulation study, we evaluated the precision and accuracy of the method in comparison to the original QPlanar method. For the patient studies, we compared organ activity estimates at 24 h after injection with those from conventional geometric mean-based planar quantification using a 24 h post-injection quantitative SPECT reconstruction as the gold standard. We also compared the goodness of fit of the measured and estimated projections obtained from the EQPlanar method to those from the original method at four other time points where gold standard data were not available. In the simulation study, more accurate activity estimates were provided by the EQPlanar method for all the organs at all the time points compared with the QPlanar method. Based on the patient data, we concluded that the EQPlanar method provided a
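
The ML-EM organ-activity estimation underlying QPlanar/EQPlanar can be illustrated with a toy sketch. Here the organ projection "templates" stand in for the paper's full physics model (attenuation, scatter, collimator response), and all names and data are hypothetical:

```python
import numpy as np

def mlem_organ_activities(proj, templates, n_iter=200):
    """ML-EM estimate of per-organ activities from a planar projection.
    proj: measured projection, flattened (pixels,)
    templates: (organs, pixels); row j is the modeled projection of unit
    activity in organ j (a stand-in for a full physics projector)."""
    a = np.ones(templates.shape[0])
    sens = templates.sum(axis=1)                  # per-organ sensitivity
    for _ in range(n_iter):
        est = templates.T @ a                     # forward project
        ratio = np.where(est > 0, proj / est, 0)  # measured / estimated
        a *= (templates @ ratio) / sens           # EM multiplicative update
    return a

# Toy example: two overlapping "organs" on a 1D strip of 6 pixels
templates = np.array([[1., 1., 1., 0., 0., 0.],
                      [0., 0., 1., 1., 1., 0.]])
true_a = np.array([2.0, 3.0])
proj = templates.T @ true_a                       # noiseless measurement
est_a = mlem_organ_activities(proj, templates)
```

With noiseless data and identifiable templates the iteration converges to the true activities, despite the overlap at the shared pixel.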

  3. The impact of 3D volume of interest definition on accuracy and precision of activity estimation in quantitative SPECT and planar processing methods

    Science.gov (United States)

    He, Bin; Frey, Eric C.

    2010-06-01

Accurate and precise estimation of organ activities is essential for treatment planning in targeted radionuclide therapy. We have previously evaluated the impact of processing methodology, statistical noise and variability in activity distribution and anatomy on the accuracy and precision of organ activity estimates obtained with quantitative SPECT (QSPECT) and planar (QPlanar) processing. Another important factor impacting the accuracy and precision of organ activity estimates is accuracy of and variability in the definition of organ regions of interest (ROI) or volumes of interest (VOI). The goal of this work was thus to systematically study the effects of VOI definition on the reliability of activity estimates. To this end, we performed Monte Carlo simulation studies using randomly perturbed and shifted VOIs to assess the impact on organ activity estimates. The 3D NCAT phantom was used with activities that modeled clinically observed 111In ibritumomab tiuxetan distributions. In order to study the errors resulting from misdefinitions due to manual segmentation errors, VOIs of the liver and left kidney were first manually defined. Each control point was then randomly perturbed to one of the nearest or next-nearest voxels in three ways: with no, inward or outward directional bias, resulting in random perturbation, erosion or dilation, respectively, of the VOIs. In order to study the errors resulting from the misregistration of VOIs, as would happen, e.g. in the case where the VOIs were defined using a misregistered anatomical image, the reconstructed SPECT images or projections were shifted by amounts ranging from -1 to 1 voxels in increments of 0.1 voxels in both the transaxial and axial directions. The activity estimates from the shifted reconstructions or projections were compared to those from the originals, and average errors were computed for the QSPECT and QPlanar methods, respectively. For misregistration, errors in organ activity estimations were
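
The erosion, dilation, and unbiased perturbation of VOIs described above can be sketched roughly as follows. This is a simplified voxel-level stand-in for the paper's control-point perturbation, using a 6-connected neighborhood:

```python
import numpy as np

def shift_sum(mask):
    """Count of 6-neighbors inside the VOI, same shape as mask."""
    p = np.pad(mask.astype(int), 1)
    return (p[:-2, 1:-1, 1:-1] + p[2:, 1:-1, 1:-1] +
            p[1:-1, :-2, 1:-1] + p[1:-1, 2:, 1:-1] +
            p[1:-1, 1:-1, :-2] + p[1:-1, 1:-1, 2:])

def perturb_voi(mask, mode, rng):
    """Flip boundary voxels of a binary VOI: 'out' grows it (dilation),
    'in' shrinks it (erosion), 'none' flips boundary voxels at random.
    A simplified stand-in for the paper's control-point perturbation."""
    n = shift_sum(mask)
    dilated = mask | ((~mask) & (n > 0))  # add outside voxels touching the VOI
    eroded = mask & (n == 6)              # drop voxels with an outside neighbor
    if mode == 'out':
        return dilated
    if mode == 'in':
        return eroded
    flip = rng.random(mask.shape) < 0.5   # unbiased: random mix of both
    return np.where(flip, dilated, eroded)

rng = np.random.default_rng(0)
voi = np.zeros((10, 10, 10), dtype=bool)
voi[3:7, 3:7, 3:7] = True                 # a 4x4x4 cubic "organ"
grown = perturb_voi(voi, 'out', rng)
shrunk = perturb_voi(voi, 'in', rng)
```

Re-running the activity estimation with such perturbed masks is what quantifies sensitivity to segmentation error.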

  4. The impact of 3D volume of interest definition on accuracy and precision of activity estimation in quantitative SPECT and planar processing methods

    Energy Technology Data Exchange (ETDEWEB)

    He Bin [Division of Nuclear Medicine, Department of Radiology, New York Presbyterian Hospital-Weill Medical College of Cornell University, New York, NY 10021 (United States); Frey, Eric C, E-mail: bih2006@med.cornell.ed, E-mail: efrey1@jhmi.ed [Russell H. Morgan Department of Radiology and Radiological Science, Johns Hopkins Medical Institutions, Baltimore, MD 21287-0859 (United States)

    2010-06-21

Accurate and precise estimation of organ activities is essential for treatment planning in targeted radionuclide therapy. We have previously evaluated the impact of processing methodology, statistical noise and variability in activity distribution and anatomy on the accuracy and precision of organ activity estimates obtained with quantitative SPECT (QSPECT) and planar (QPlanar) processing. Another important factor impacting the accuracy and precision of organ activity estimates is accuracy of and variability in the definition of organ regions of interest (ROI) or volumes of interest (VOI). The goal of this work was thus to systematically study the effects of VOI definition on the reliability of activity estimates. To this end, we performed Monte Carlo simulation studies using randomly perturbed and shifted VOIs to assess the impact on organ activity estimates. The 3D NCAT phantom was used with activities that modeled clinically observed 111In ibritumomab tiuxetan distributions. In order to study the errors resulting from misdefinitions due to manual segmentation errors, VOIs of the liver and left kidney were first manually defined. Each control point was then randomly perturbed to one of the nearest or next-nearest voxels in three ways: with no, inward or outward directional bias, resulting in random perturbation, erosion or dilation, respectively, of the VOIs. In order to study the errors resulting from the misregistration of VOIs, as would happen, e.g. in the case where the VOIs were defined using a misregistered anatomical image, the reconstructed SPECT images or projections were shifted by amounts ranging from -1 to 1 voxels in increments of 0.1 voxels in both the transaxial and axial directions. The activity estimates from the shifted reconstructions or projections were compared to those from the originals, and average errors were computed for the QSPECT and QPlanar methods, respectively. For misregistration, errors in organ activity estimations

  5. Evaluation of quantitative imaging methods for organ activity and residence time estimation using a population of phantoms having realistic variations in anatomy and uptake

    International Nuclear Information System (INIS)

    He Bin; Du Yong; Segars, W. Paul; Wahl, Richard L.; Sgouros, George; Jacene, Heather; Frey, Eric C.

    2009-01-01

Estimating organ residence times is an essential part of patient-specific dosimetry for radioimmunotherapy (RIT). Quantitative imaging methods for RIT are often evaluated using a single physical or simulated phantom but are intended to be applied clinically where there is variability in patient anatomy, biodistribution, and biokinetics. To provide a more relevant evaluation, the authors have thus developed a population of phantoms with realistic variations in these factors and applied it to the evaluation of quantitative imaging methods both to find the best method and to demonstrate the effects of these variations. Using whole body scans and SPECT/CT images, organ shapes and time-activity curves of 111In ibritumomab tiuxetan were measured in dosimetrically important organs in seven patients undergoing a high dose therapy regimen. Based on these measurements, we created a population of 3D NURBS-based cardiac-torso (NCAT) phantoms. SPECT and planar data at realistic count levels were then simulated using previously validated Monte Carlo simulation tools. The projections from the population were used to evaluate the accuracy and variation in accuracy of residence time estimation methods that used a time series of SPECT and planar scans. Quantitative SPECT (QSPECT) reconstruction methods were used that compensated for attenuation, scatter, and the collimator-detector response. Planar images were processed with a conventional (CPlanar) method that used geometric mean attenuation and triple-energy window scatter compensation and a quantitative planar (QPlanar) processing method that used model-based compensation for image degrading effects. Residence times were estimated from activity estimates made at each of five time points. The authors also evaluated hybrid methods that used CPlanar or QPlanar time-activity curves rescaled to the activity estimated from a single QSPECT image. 
The methods were evaluated in terms of mean relative error and standard deviation of the
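
The residence-time step described above, integrating a time-activity curve measured at five time points, can be sketched as follows. The mono-exponential tail extrapolation and all numbers are assumptions for illustration, not the paper's exact fitting procedure:

```python
import math

def residence_time_hours(times_h, activities_mbq, injected_mbq):
    """Residence time = time-integrated activity / injected activity.
    Trapezoidal integration over the measured points plus a single-exponential
    extrapolation of the tail (a commonly assumed model)."""
    auc = 0.0
    for i in range(1, len(times_h)):
        auc += 0.5 * (activities_mbq[i] + activities_mbq[i - 1]) * \
               (times_h[i] - times_h[i - 1])
    # tail: decay rate from the last two points, then integral of A_last*exp(-lam*t)
    lam = math.log(activities_mbq[-2] / activities_mbq[-1]) / \
          (times_h[-1] - times_h[-2])
    auc += activities_mbq[-1] / lam
    return auc / injected_mbq

# Five hypothetical time points over ~6 days for a 150 MBq injection
t = [1, 24, 72, 120, 144]          # hours post injection
a = [40.0, 35.0, 20.0, 11.0, 8.5]  # organ activity, MBq
tau = residence_time_hours(t, a, 150.0)
```
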

  6. The effect of volume-of-interest misregistration on quantitative planar activity and dose estimation

    International Nuclear Information System (INIS)

    Song, N; Frey, E C; He, B

    2010-01-01

In targeted radionuclide therapy (TRT), dose estimation is essential for treatment planning and tumor dose response studies. Dose estimates are typically based on a time series of whole-body conjugate view planar or SPECT scans of the patient acquired after administration of a planning dose. Quantifying the activity in the organs from these studies is an essential part of dose estimation. The quantitative planar (QPlanar) processing method involves accurate compensation for image degrading factors and correction for organ and background overlap via the combination of computational models of the image formation process and 3D volumes of interest defining the organs to be quantified. When the organ VOIs are accurately defined, the method intrinsically compensates for attenuation, scatter and partial volume effects, as well as overlap with other organs and the background. However, alignment between the 3D organ volumes of interest (VOIs) used in QPlanar processing and the true organ projections in the planar images is required. The aim of this research was to study the effects of VOI misregistration on the accuracy and precision of organ activity estimates obtained using the QPlanar method. In this work, we modeled the degree of residual misregistration that would be expected after an automated registration procedure by randomly misaligning 3D SPECT/CT images, from which the VOI information was derived, and planar images. Mutual information-based image registration was used to align the realistic simulated 3D SPECT images with the 2D planar images. The residual image misregistration was used to simulate realistic levels of misregistration and allow investigation of the effects of misregistration on the accuracy and precision of the QPlanar method. We observed that accurate registration is especially important for small organs or ones with low activity concentrations compared to neighboring organs. 
In addition, residual misregistration gave rise to a loss of precision
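
Mutual-information registration, used above to align the SPECT and planar images, scores alignment by the statistical dependence between image intensities. A minimal histogram-based sketch (generic, not the authors' implementation):

```python
import numpy as np

def mutual_information(img_a, img_b, bins=16):
    """Histogram-based mutual information between two images, the similarity
    measure behind MI registration."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()                 # joint intensity distribution
    px = pxy.sum(axis=1, keepdims=True)       # marginals
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(1)
base = rng.random((64, 64))
shifted = np.roll(base, 3, axis=0)            # misaligned copy
mi_aligned = mutual_information(base, base)
mi_shifted = mutual_information(base, shifted)
```

A registration routine would shift one image over a grid of offsets and keep the offset maximizing this score; MI drops when the images are misaligned.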

  7. Processing module operating methods, processing modules, and communications systems

    Science.gov (United States)

    McCown, Steven Harvey; Derr, Kurt W.; Moore, Troy

    2014-09-09

    A processing module operating method includes using a processing module physically connected to a wireless communications device, requesting that the wireless communications device retrieve encrypted code from a web site and receiving the encrypted code from the wireless communications device. The wireless communications device is unable to decrypt the encrypted code. The method further includes using the processing module, decrypting the encrypted code, executing the decrypted code, and preventing the wireless communications device from accessing the decrypted code. Another processing module operating method includes using a processing module physically connected to a host device, executing an application within the processing module, allowing the application to exchange user interaction data communicated using a user interface of the host device with the host device, and allowing the application to use the host device as a communications device for exchanging information with a remote device distinct from the host device.

  8. Methods of information processing

    Energy Technology Data Exchange (ETDEWEB)

    Kosarev, Yu G; Gusev, V D

    1978-01-01

Works are presented on automation systems for editing and publishing operations using methods of symbol-information processing and of information contained in a training sample (ranking of objects by promise, a classification algorithm for tones and noise). The book will be of interest to specialists in the automated processing of textual information, programming, and pattern recognition.

  9. Microencapsulation and Electrostatic Processing Method

    Science.gov (United States)

    Morrison, Dennis R. (Inventor); Mosier, Benjamin (Inventor)

    2000-01-01

Methods are provided for forming spherical multilamellar microcapsules having alternating hydrophilic and hydrophobic liquid layers, surrounded by flexible, semi-permeable hydrophobic or hydrophilic outer membranes which can be tailored specifically to control the diffusion rate. The methods of the invention rely on low-shear mixing and a liquid-liquid diffusion process and are particularly well suited for forming microcapsules containing both hydrophilic and hydrophobic drugs. These methods can be carried out in the absence of gravity and do not rely on density-driven phase separation, mechanical mixing or solvent evaporation phases. The methods include the process of forming, washing and filtering microcapsules. In addition, the methods contemplate coating microcapsules with ancillary coatings using an electrostatic field and free fluid electrophoresis of the microcapsules. The microcapsules produced by such methods are particularly useful in the delivery of pharmaceutical compositions.

  10. Digital signal processing with kernel methods

    CERN Document Server

    Rojo-Alvarez, José Luis; Muñoz-Marí, Jordi; Camps-Valls, Gustavo

    2018-01-01

    A realistic and comprehensive review of joint approaches to machine learning and signal processing algorithms, with application to communications, multimedia, and biomedical engineering systems Digital Signal Processing with Kernel Methods reviews the milestones in the mixing of classical digital signal processing models and advanced kernel machines statistical learning tools. It explains the fundamental concepts from both fields of machine learning and signal processing so that readers can quickly get up to speed in order to begin developing the concepts and application software in their own research. Digital Signal Processing with Kernel Methods provides a comprehensive overview of kernel methods in signal processing, without restriction to any application field. It also offers example applications and detailed benchmarking experiments with real and synthetic datasets throughout. Readers can find further worked examples with Matlab source code on a website developed by the authors. * Presents the necess...
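
As a flavor of the kernel machines the book applies to signal processing, here is a minimal Gaussian kernel ridge regression used to denoise a signal; the kernel width and regularization values are illustrative, not taken from the book:

```python
import numpy as np

def kernel_ridge_fit(x, y, gamma=100.0, lam=0.1):
    """Gaussian-kernel ridge regression: solve (K + lam*I) alpha = y,
    then predict with the kernel expansion."""
    K = np.exp(-gamma * (x[:, None] - x[None, :]) ** 2)
    alpha = np.linalg.solve(K + lam * np.eye(len(x)), y)
    return lambda xq: np.exp(-gamma * (xq[:, None] - x[None, :]) ** 2) @ alpha

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 200)
clean = np.sin(2 * np.pi * 3 * x)              # 3-cycle sinusoid
noisy = clean + 0.3 * rng.standard_normal(x.size)
predict = kernel_ridge_fit(x, noisy)
denoised = predict(x)
```

The regularizer `lam` suppresses the small-eigenvalue (high-frequency) components of the kernel matrix, which is what smooths the noise.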

  11. Finite Element Method in Machining Processes

    CERN Document Server

    Markopoulos, Angelos P

    2013-01-01

    Finite Element Method in Machining Processes provides a concise study on the way the Finite Element Method (FEM) is used in the case of manufacturing processes, primarily in machining. The basics of this kind of modeling are detailed to create a reference that will provide guidelines for those who start to study this method now, but also for scientists already involved in FEM and want to expand their research. A discussion on FEM, formulations and techniques currently in use is followed up by machining case studies. Orthogonal cutting, oblique cutting, 3D simulations for turning and milling, grinding, and state-of-the-art topics such as high speed machining and micromachining are explained with relevant examples. This is all supported by a literature review and a reference list for further study. As FEM is a key method for researchers in the manufacturing and especially in the machining sector, Finite Element Method in Machining Processes is a key reference for students studying manufacturing processes but al...

  12. Present status of processing method

    Energy Technology Data Exchange (ETDEWEB)

    Kosako, Kazuaki [Sumitomo Atomic Energy Industries Ltd., Tokyo (Japan)

    1998-11-01

The present status of processing methods for high-energy nuclear data files was examined. The NJOY94 code is the only one available for this processing. In Japan, present processing using NJOY94 is oriented toward the production of a traditional cross-section library, because no high-energy transport code that would use a high-energy cross-section library has been clearly established. (author)

  13. Methods of digital image processing

    International Nuclear Information System (INIS)

    Doeler, W.

    1985-01-01

The increasing use of computerized methods for diagnostic imaging of radiological problems will open up a wide field of applications for digital image processing. The requirements of routine diagnostics in medical radiology point to image data storage, documentation, and communication as the main areas of interest for digital image processing. As to purely radiological problems, the value of digital image processing lies in the improved interpretability of the image information in those cases where the expert's experience and image interpretation by human visual capacities do not suffice. There are many other imaging domains in medical physics where digital image processing and evaluation are very useful. The paper reviews the various methods available for a variety of problem solutions, and explains the hardware available for the tasks discussed. (orig.) [de

  14. Digital processing methods for bronchograms

    International Nuclear Information System (INIS)

    Mamilyaev, R.M.; Popova, N.P.; Matsulevich, T.V.

    1989-01-01

The technique of digital processing of bronchograms, with the aim of separating morphological details of the bronchi and increasing the clarity of the outlines of contrasted bronchi, is described. A block diagram of digital processing on an automated image-processing system is given. It is shown that digital processing of bronchograms makes it possible to outline bronchial walls clearly and makes measurements of bronchial diameters easier and more reliable. Considerable advantages of digital image processing over optical methods are demonstrated.
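
A generic edge-enhancement step of the kind described, unsharp masking, can be sketched as follows; this is illustrative only, since the paper does not specify its exact processing chain:

```python
import numpy as np

def unsharp_mask(img, amount=1.0):
    """Sharpen an image by adding back its high-pass residual
    (image minus a 3x3 box blur)."""
    p = np.pad(img, 1, mode='edge')
    blur = sum(p[1 + dy:p.shape[0] - 1 + dy, 1 + dx:p.shape[1] - 1 + dx]
               for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0
    return img + amount * (img - blur)

# A vertical step edge, like a contrasted bronchus wall against background
img = np.zeros((8, 8))
img[:, 4:] = 1.0
sharp = unsharp_mask(img)
```

Sharpening produces the characteristic overshoot on both sides of the edge, which is what makes a wall boundary easier to delineate and measure.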

  15. BUSINESS PROCESS REENGINEERING AS THE METHOD OF PROCESS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    O. Honcharova

    2013-09-01

The article is devoted to the analysis of the process management approach. The main concepts of the process management approach are reviewed, and definitions of process and process management are given. Methods of business process improvement are also analyzed, among them fast analysis solution technology (FAST), benchmarking, reprojecting, and reengineering. The main results of applying business process improvement are described in terms of reduced cycle time, costs, and errors. The tasks and main stages of business process reengineering are outlined, and its main efficiency results and success factors are determined.

  16. Three-dimensional image signals: processing methods

    Science.gov (United States)

    Schiopu, Paul; Manea, Adrian; Craciun, Anca-Ileana; Craciun, Alexandru

    2010-11-01

Over the years, extensive studies have been carried out to apply coherent optics methods to real-time processing, communications, and image transmission. This is especially true when a large amount of information needs to be processed, e.g., in high-resolution imaging. The recent progress in data-processing networks and communication systems has considerably increased the capacity of information exchange. We describe the results of a literature survey of processing methods for three-dimensional image signals. All commercially available 3D technologies today are based on stereoscopic viewing. 3D technology was once the exclusive domain of skilled computer-graphics developers with high-end machines and software. Images captured with an advanced 3D digital camera can be displayed on the screen of a 3D digital viewer with or without special glasses. This requires considerable processing power and memory to create and render the complex mix of colors, textures, and virtual lighting and perspective necessary to make figures appear three-dimensional. Also, using a standard digital camera and a technique called phase-shift interferometry, we can capture "digital holograms": holograms that can be stored on a computer and transmitted over conventional networks. We present some methods for processing digital holograms for Internet transmission, together with results.
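
The phase-shift interferometry mentioned above recovers the phase from several frames taken with known reference shifts. A sketch of the standard four-step formula, checked against a synthetic phase map:

```python
import numpy as np

def four_step_phase(i1, i2, i3, i4):
    """Recover the wrapped phase from four interferograms captured with
    0, 90, 180, 270 degree reference shifts (standard four-step formula)."""
    return np.arctan2(i4 - i2, i1 - i3)

# Synthetic check: build the four frames from a known phase map
phase_true = np.linspace(-1.2, 1.2, 100)          # stays within (-pi, pi)
frames = [1.0 + 0.8 * np.cos(phase_true + d)      # I_k = a + b*cos(phi + d_k)
          for d in (0.0, np.pi / 2, np.pi, 3 * np.pi / 2)]
phase_rec = four_step_phase(*frames)
```

Since I4 - I2 = 2b·sin(φ) and I1 - I3 = 2b·cos(φ), the arctangent cancels both the background `a` and the modulation `b`.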

  17. Calcification–carbonation method for red mud processing

    Energy Technology Data Exchange (ETDEWEB)

    Li, Ruibing [School of Metallurgy, Northeastern University, Shenyang 110819 (China); Laboratory for Simulation and Modelling of Particulate Systems, Department of Chemical Engineering, Monash University, Clayton, Victoria, 3800 (Australia); Zhang, Tingan, E-mail: zhangta@smm.neu.edu.cn [School of Metallurgy, Northeastern University, Shenyang 110819 (China); Liu, Yan; Lv, Guozhi; Xie, Liqun [School of Metallurgy, Northeastern University, Shenyang 110819 (China)

    2016-10-05

Highlights: • A new approach named the calcification–carbonation method for red mud processing is proposed. • The method can prevent emission of red mud from alumina production and is good for the environment. • Thermodynamic characteristics were investigated. • The method was verified experimentally using a jet-flow reactor. - Abstract: Red mud, the Bayer process residue, is generated by the alumina industry and causes environmental problems. In this paper, a novel calcification–carbonation method that utilizes a large amount of the Bayer process residue is proposed. Using this method, the red mud was calcified with lime to transform the silicon phase into hydrogarnet, and the alkali in the red mud was recovered. Then, the resulting hydrogarnet was decomposed by CO2 carbonation, affording calcium silicate, calcium carbonate, and aluminum hydroxide. Alumina was recovered using an alkaline solution at a low temperature. The effects of the new process were analyzed by thermodynamic analysis and experiments. The extraction efficiencies of the alumina and soda obtained from the red mud reached 49.4% and 96.8%, respectively. The new red mud, with <0.3% alkali, can be used in cement production. Combining this method with cement production, the Bayer process red mud can be completely utilized.
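
The reported extraction efficiencies translate into recovered mass per tonne of red mud as follows. Only the 49.4% and 96.8% efficiencies come from the abstract; the assumed Al2O3 and Na2O contents are hypothetical:

```python
def recovered_mass(feed_kg, content_frac, extraction_eff):
    """Mass recovered from red mud given a component's content fraction
    and its extraction efficiency."""
    return feed_kg * content_frac * extraction_eff

# Per tonne of red mud, assuming (hypothetically) 20% Al2O3 and 8% Na2O:
alumina = recovered_mass(1000.0, 0.20, 0.494)   # kg of Al2O3 recovered
soda = recovered_mass(1000.0, 0.08, 0.968)      # kg of Na2O recovered
```
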

  18. Methods in Astronomical Image Processing

    Science.gov (United States)

    Jörsäter, S.

A Brief Introductory Note; History of Astronomical Imaging; Astronomical Image Data: Images in Various Formats, Digitized Image Data, Digital Image Data; Philosophy of Astronomical Image Processing: Properties of Digital Astronomical Images, Human Image Processing, Astronomical vs. Computer Science Image Processing; Basic Tools of Astronomical Image Processing: Display Applications, Calibration of Intensity Scales, Calibration of Length Scales, Image Re-shaping, Feature Enhancement, Noise Suppression, Noise and Error Analysis; Image Processing Packages: Design of AIPS and MIDAS, AIPS, MIDAS; Reduction of CCD Data: Bias Subtraction, Clipping, Preflash Subtraction, Dark Subtraction, Flat Fielding, Sky Subtraction, Extinction Correction, Deconvolution Methods, Rebinning/Combining; Summary and Prospects for the Future
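
The CCD reduction steps in the outline (bias subtraction, dark subtraction, flat fielding) amount to a simple per-pixel chain. A minimal sketch with synthetic frames; the numbers are illustrative:

```python
import numpy as np

def reduce_ccd(raw, bias, dark, flat):
    """Standard CCD calibration chain: subtract bias and dark,
    then divide by the normalized flat field."""
    corrected = raw - bias - dark
    return corrected / (flat / flat.mean())

rng = np.random.default_rng(2)
truth = 100.0 * np.ones((32, 32))                  # uniform sky, arbitrary units
flat = 1.0 + 0.1 * rng.standard_normal((32, 32))   # pixel sensitivity map
bias = 300.0 * np.ones((32, 32))                   # readout offset
dark = 10.0 * np.ones((32, 32))                    # thermal signal
raw = truth * flat + bias + dark                   # what the detector records
science = reduce_ccd(raw, bias, dark, flat)        # flat pattern divides out
```
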

  19. Uranium manufacturing process employing the electrolytic reduction method

    International Nuclear Information System (INIS)

    Oda, Yoshio; Kazuhare, Manabu; Morimoto, Takeshi.

    1986-01-01

The present invention relates to a uranium manufacturing process that employs the electrolytic reduction method, and in particular to one requiring only a low voltage. The process in which uranium is obtained by the electrolytic method with uranyl acid as the raw material is prior art.

  20. Method and apparatus for processing algae

    Science.gov (United States)

    Chew, Geoffrey; Reich, Alton J.; Dykes, Jr., H. Waite; Di Salvo, Roberto

    2012-07-03

    Methods and apparatus for processing algae are described in which a hydrophilic ionic liquid is used to lyse algae cells. The lysate separates into at least two layers including a lipid-containing hydrophobic layer and an ionic liquid-containing hydrophilic layer. A salt or salt solution may be used to remove water from the ionic liquid-containing layer before the ionic liquid is reused. The used salt may also be dried and/or concentrated and reused. The method can operate at relatively low lysis, processing, and recycling temperatures, which minimizes the environmental impact of algae processing while providing reusable biofuels and other useful products.

  1. Comparative analysis of accelerogram processing methods

    International Nuclear Information System (INIS)

    Goula, X.; Mohammadioun, B.

    1986-01-01

The work described hereinafter is a brief account of an ongoing research project concerning high-quality processing of strong-motion recordings of earthquakes. Several processing procedures have been tested, applied to synthetic signals simulating ground motion designed for this purpose. The correction methods operating in the time domain are seen to be strongly dependent upon the sampling rate. Two methods of low-frequency filtering followed by an integration of accelerations yielded satisfactory results [fr]
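
The second class of methods, low-frequency filtering followed by integration of accelerations, can be sketched as follows. The FFT-based high-pass, the cutoff, and the synthetic signal are illustrative assumptions, not the paper's procedures:

```python
import numpy as np

def highpass_integrate(acc, dt, f_cut=0.1):
    """Zero out spectral components below f_cut (a crude high-pass),
    then integrate the filtered acceleration to velocity (trapezoidal rule)."""
    spec = np.fft.rfft(acc)
    freqs = np.fft.rfftfreq(len(acc), dt)
    spec[freqs < f_cut] = 0.0
    acc_f = np.fft.irfft(spec, n=len(acc))
    vel = np.concatenate(([0.0],
                          np.cumsum(0.5 * (acc_f[1:] + acc_f[:-1]) * dt)))
    return acc_f, vel

dt = 0.01                                   # 100 Hz sampling
t = np.arange(0, 20, dt)
motion = np.sin(2 * np.pi * 2.0 * t)        # 2 Hz "ground motion"
drift = 0.05 * t                            # slow baseline error
acc_f, vel = highpass_integrate(motion + drift, dt)
```

Without the filter, integrating the drifting record would produce a velocity ramp of about 10 units at t = 20 s; filtering first keeps the velocity bounded.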

  2. Multi-block methods in multivariate process control

    DEFF Research Database (Denmark)

    Kohonen, J.; Reinikainen, S.P.; Aaljoki, K.

    2008-01-01

In chemometric studies all predictor variables are usually collected in one data matrix X. This matrix is then analyzed by PLS regression or other methods. When data from several different sub-processes are collected in one matrix, there is a possibility that the effects of some sub-processes may be masked. With multi-block (MB) methods the effect of a sub-process can be seen, and an example with two blocks, near infra-red (NIR) and process data, is shown. The results show improvements in the modelling task when an MB-based approach is used. This way of working with data gives more information on the process than if all data are in one X-matrix. The procedure is demonstrated on an industrial continuous process, where knowledge about the sub-processes is available and the X-matrix can be divided into blocks between process variables and NIR spectra.
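
A minimal illustration of why blocking helps: fitting each block separately shows which sub-process carries the predictive information. This uses plain least squares per block as a stand-in for the multi-block PLS used in the paper; all data are synthetic:

```python
import numpy as np

def block_r2(X, y):
    """Fraction of variance in y explained by one block alone
    (ordinary least squares; a stand-in for a per-block PLS model)."""
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef
    return 1.0 - resid.var() / y.var()

rng = np.random.default_rng(3)
n = 100
process_vars = rng.standard_normal((n, 4))   # block 1: process data
nir = rng.standard_normal((n, 6))            # block 2: NIR spectra (here noise)
y = process_vars @ np.array([1.0, -2.0, 0.0, 0.0]) + 0.1 * rng.standard_normal(n)

r2_process = block_r2(process_vars, y)       # the block that drives y
r2_nir = block_r2(nir, y)                    # the uninformative block
```

In a single concatenated X this attribution is hidden inside one set of loadings; per-block scores make the sub-process contribution explicit.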

  3. Comparative Study of Different Processing Methods for the ...

    African Journals Online (AJOL)

Both processing methods reduced the cyanide concentration to the minimum level required by the World Health Organization (10 mg/kg). The mechanical pressing-fermentation method removed more cyanide than the fermentation processing method. Keywords: Cyanide, Fermentation, Manihot ...

  4. Method for pre-processing LWR spent fuel

    International Nuclear Information System (INIS)

    Otsuka, Katsuyuki; Ebihara, Hikoe.

    1986-01-01

Purpose: To facilitate the decladding of spent fuel, cladding tube processing, and waste gas recovery, and to enable efficient execution of the main reprocessing steps thereafter. Constitution: Spent fuel assemblies are sent to a cutting process, where they are cut into chips of easy-to-process size. In a thermal decladding process, the chips undergo thermal cycling in air, with the temperature raised and lowered within the range from 700 deg C to 1200 deg C, oxidizing the zircaloy of the cladding tubes into zirconia. The oxidized cladding tubes develop numerous fine cracks, become very brittle, and loosen from the fuel pellets when even a slight mechanical force is applied, turning into a powder. The processed products are then separated into zirconia sand and fuel pellets by gravitational selection or sifting, the zirconia sand being sent to waste processing and the fuel pellets to melting-refining. (Yoshino, Y.)

  5. Parametric methods for spatial point processes

    DEFF Research Database (Denmark)

    Møller, Jesper

(This text is submitted for the volume 'A Handbook of Spatial Statistics' edited by A.E. Gelfand, P. Diggle, M. Fuentes, and P. Guttorp, to be published by Chapman and Hall/CRC Press, and planned to appear as Chapter 4.4 with the title 'Parametric methods'.) 1 Introduction This chapter considers inference procedures for parametric spatial point process models. The widespread use of sensible but ad hoc methods based on summary statistics of the kind studied in Chapter 4.3 has through the last two decades been supplemented by likelihood-based methods for parametric spatial point process models; likelihood inference is studied in Section 4, and Bayesian inference in Section 5. On one hand, as the development in computer technology and computational statistics continues, computationally-intensive simulation-based methods for likelihood inference probably will play an increasing role for statistical analysis of spatial ...
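
The most basic parametric model in this setting, the homogeneous Poisson point process, can be simulated with the textbook two-step construction (Poisson-distributed count, then uniform locations):

```python
import numpy as np

def simulate_poisson_process(intensity, width, height, rng):
    """Simulate a homogeneous Poisson point process on [0,width]x[0,height]:
    draw N ~ Poisson(intensity * area), then place N points uniformly."""
    n = rng.poisson(intensity * width * height)
    xs = rng.uniform(0, width, n)
    ys = rng.uniform(0, height, n)
    return np.column_stack([xs, ys])

rng = np.random.default_rng(4)
pts = simulate_poisson_process(100.0, 1.0, 1.0, rng)  # ~100 points expected
```

Parametric inference then amounts to estimating the intensity (here constant; in richer models a function of location and covariates) from such point patterns.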

  6. Digital image processing mathematical and computational methods

    CERN Document Server

    Blackledge, J M

    2005-01-01

    This authoritative text (the second part of a complete MSc course) provides the mathematical methods required to describe images, image formation and different imaging systems, coupled with the principal techniques used for processing digital images. It is based on a course for postgraduates reading physics, electronic engineering, telecommunications engineering, information technology and computer science. The book relates the methods of processing and interpreting digital images to the 'physics' of imaging systems. Case studies reinforce the methods discussed, with examples drawn from current research.

  7. Methods Used to Deal with Peace Process Spoilers

    Directory of Open Access Journals (Sweden)

    MA. Bilbil Kastrati

    2014-06-01

    Full Text Available The conflicts of the past three decades have shown that the major problem which peace processes face is spoilers. Spoilers are warring parties and their leaders who believe that peaceful settlement of disputes threatens their interests, power and reputation; therefore, they use all means to undermine or completely spoil the process. Spoilers of peace processes can be inside or outside the process and are characterized as limited, greedy or total spoilers. Their motives for spoiling can differ: political, financial, ethnic, security-related, etc. Furthermore, it is important to emphasise that spoilers are not only rebels and insurgents; they can often be governments, diasporas, warlords, private military companies, etc. In order to counteract spoilers, the international community has adopted and implemented three methods: inducement, socialization and coercion. Often all three methods are used to convince spoilers to negotiate, accept and implement peace agreements. Hence, this paper examines the methods used to deal with peace process spoilers through an assessment of the strategies employed and their impact, successes and failures. The paper also argues that the success or failure of a peace process depends on the method(s) used to deal with spoilers. If the right method is chosen, with persistent engagement by the international community, the peace process will be successful; if not, the consequences will be devastating.

  8. Processing methods for temperature-dependent MCNP libraries

    International Nuclear Information System (INIS)

    Li Songyang; Wang Kan; Yu Ganglin

    2008-01-01

    In this paper, the NJOY processing method that converts ENDF files into ACE (A Compact ENDF) files (the point-wise cross-section files used by the MCNP program) is discussed. Temperatures covering the range for reactor design and operation are considered. Three benchmarks are used to test the method: the Jezebel benchmark, a 28 cm-thick slab core benchmark and an LWR benchmark with burnable absorbers. The calculation results demonstrate the precision of the neutron cross-section library and verify the correctness of the NJOY processing procedure. (authors)

  9. Application of finite-element-methods in food processing

    DEFF Research Database (Denmark)

    Risum, Jørgen

    2004-01-01

    Presentation of the possible use of finite-element methods in food processing. Examples from diffusion studies are given.

  10. Method for double-sided processing of thin film transistors

    Science.gov (United States)

    Yuan, Hao-Chih; Wang, Guogong; Eriksson, Mark A.; Evans, Paul G.; Lagally, Max G.; Ma, Zhenqiang

    2008-04-08

    This invention provides methods for fabricating thin film electronic devices with both front- and backside processing capabilities. Using these methods, high temperature processing steps may be carried out during both frontside and backside processing. The methods are well-suited for fabricating back-gate and double-gate field effect transistors, double-sided bipolar transistors and 3D integrated circuits.

  11. Geophysical methods for monitoring soil stabilization processes

    Science.gov (United States)

    Saneiyan, Sina; Ntarlagiannis, Dimitrios; Werkema, D. Dale; Ustra, Andréa

    2018-01-01

    Soil stabilization involves methods used to turn unconsolidated and unstable soil into a stiffer, consolidated medium that can support engineered structures, alter permeability, change subsurface flow, or immobilize contamination through mineral precipitation. Among the variety of available methods, carbonate precipitation is a very promising one, especially when it is induced through common soil-borne microbes (MICP - microbially induced carbonate precipitation). Such microbially mediated precipitation has the added benefit of not harming the environment, whereas other methods can be environmentally detrimental. Carbonate precipitation, typically in the form of calcite, is a naturally occurring process that can be manipulated to deliver the expected soil strengthening results or permeability changes. This study investigates the ability of spectral induced polarization (SIP) and shear-wave velocity measurements to monitor calcite-driven soil strengthening processes. The results support the use of these geophysical methods as soil strengthening characterization and long-term monitoring tools, which is a requirement for viable soil stabilization projects. Both tested methods are sensitive to calcite precipitation, with SIP offering additional information related to the long-term stability of the precipitated carbonate. Carbonate precipitation was confirmed with direct methods, such as direct sampling and scanning electron microscopy (SEM). This study advances our understanding of soil strengthening processes and permeability alterations, and is a crucial step towards the use of geophysical methods as monitoring tools in microbially induced soil alteration through carbonate precipitation.

  12. Process control and optimization with simple interval calculation method

    DEFF Research Database (Denmark)

    Pomerantsev, A.; Rodionova, O.; Høskuldsson, Agnar

    2006-01-01

    Methods of process control and optimization are presented and illustrated with a real world example. The optimization methods are based on PLS block modeling as well as on the simple interval calculation (SIC) methods of interval prediction and object status classification. It is proposed to employ a series of expanding PLS/SIC models in order to support on-line process improvements. This method helps to predict the effect of planned actions on product quality and thus enables passive quality control. We have also considered an optimization approach that proposes correcting actions for quality improvement in the course of production. The latter is an active quality optimization, which takes into account the actual history of the process. The advocated approach is allied to the conventional method of multivariate statistical process control (MSPC), as it also employs the historical process data.

  13. The statistical process control methods - SPC

    Directory of Open Access Journals (Sweden)

    Floreková Ľubica

    1998-03-01

    Full Text Available Methods of statistical evaluation of quality – SPC (item 20 of the documentation system of quality control of the ISO 9000 series of norms) for various processes, products and services belong among the basic qualitative methods that enable us to analyse and compare data pertaining to various quantitative parameters. They also enable us, based on the latter, to propose suitable interventions with the aim of improving these processes, products and services. The theoretical basis and applicability of the following principles are presented in the contribution: cause-and-effect diagnostics; Pareto analysis and the Lorenz curve; number distributions and frequency curves of random variable distributions; and Shewhart control charts.
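
    The Shewhart chart mentioned above can be sketched in a few lines. This is our own minimal illustration, not code from the contribution; the subgroup data are invented and the A2 constant for subgroups of five is a textbook value:

    ```python
    # Toy sketch of a Shewhart X-bar chart: control limits are derived from
    # in-control history, then subgroup means are checked against them.
    import statistics

    def shewhart_limits(subgroups, a2=0.577):  # A2 for subgroup size 5
        xbars = [statistics.mean(g) for g in subgroups]
        ranges = [max(g) - min(g) for g in subgroups]
        center = statistics.mean(xbars)
        rbar = statistics.mean(ranges)
        return center - a2 * rbar, center, center + a2 * rbar

    def out_of_control(subgroups, limits):
        lcl, _, ucl = limits
        return [i for i, g in enumerate(subgroups)
                if not lcl <= statistics.mean(g) <= ucl]

    data = [[9.9, 10.1, 10.0, 9.8, 10.2]] * 5 + [[11.5, 11.6, 11.4, 11.5, 11.6]]
    lims = shewhart_limits(data[:5])   # limits from the in-control history only
    print(out_of_control(data, lims))  # the shifted subgroup (index 5) is flagged
    ```

    The point of the chart is exactly this separation: limits come from historical, in-control variation, so a process shift shows up as points outside the limits.
    
    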

  14. Development of continuous pharmaceutical production processes supported by process systems engineering methods and tools

    DEFF Research Database (Denmark)

    Gernaey, Krist; Cervera Padrell, Albert Emili; Woodley, John

    2012-01-01

    The pharmaceutical industry is undergoing a radical transition towards continuous production processes. Systematic use of process systems engineering (PSE) methods and tools forms the key to achieving this transition in a structured and efficient way.

  15. Methods for Optimisation of the Laser Cutting Process

    DEFF Research Database (Denmark)

    Dragsted, Birgitte

    This thesis deals with the adaptation and implementation of various optimisation methods, in the field of experimental design, for the laser cutting process. The problem of optimising the laser cutting process has been defined, and a structure for a Decision Support System (DSS) for the optimisation of the laser cutting process has been suggested. The DSS consists of a database with the currently used and old parameter settings. One of the optimisation methods has also been implemented in the DSS in order to facilitate the optimisation procedure for the laser operator. The Simplex Method has been adapted in two versions: a qualitative one, which optimises the process by comparing the laser-cut items, and a quantitative one, which uses a weighted quality response in order to achieve a satisfactory quality and thereafter maximises the cutting speed, thus increasing the productivity of the process.

  16. Model based methods and tools for process systems engineering

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    Process systems engineering (PSE) provides means to solve a wide range of problems in a systematic and efficient manner. This presentation will give a perspective on model based methods and tools needed to solve a wide range of problems in product-process synthesis-design. These methods and tools need to be integrated with work-flows and data-flows for specific product-process synthesis-design problems within a computer-aided framework. The framework therefore should be able to manage knowledge-data, models and the associated methods and tools needed by specific synthesis-design work-flows. The development of model based methods and tools within a computer aided framework for product-process synthesis-design will be highlighted.

  17. Optoelectronic imaging of speckle using image processing method

    Science.gov (United States)

    Wang, Jinjiang; Wang, Pengfei

    2018-01-01

    A detailed image-processing treatment of laser speckle interferometry is proposed as an example for a postgraduate course. Several image processing methods are used together in the optoelectronic imaging system: partial differential equations (PDEs) are used to reduce the effect of noise; thresholding segmentation is also based on the heat equation with PDEs; the central line is extracted based on the image skeleton, with branches removed automatically; the phase level is calculated by a spline interpolation method; and the fringe phase is then unwrapped. Finally, the image processing method is used to automatically measure bubbles in rubber under negative pressure, which could be applied in tire inspection.
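
    As a hedged illustration of the PDE-based denoising step such an abstract describes (our own toy code, not the authors'), a few explicit heat-equation iterations diffuse an isolated speckle spike while leaving the image bounded:

    ```python
    # Toy explicit heat-equation smoothing: u_t = laplacian(u).
    # Boundary pixels are held fixed; dt = 0.2 keeps the scheme stable.
    def heat_step(img, dt=0.2):
        h, w = len(img), len(img[0])
        out = [row[:] for row in img]
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                lap = (img[y-1][x] + img[y+1][x] + img[y][x-1] + img[y][x+1]
                       - 4 * img[y][x])
                out[y][x] = img[y][x] + dt * lap
        return out

    def denoise(img, steps=10):
        for _ in range(steps):
            img = heat_step(img)
        return img

    noisy = [[0.0] * 5 for _ in range(5)]
    noisy[2][2] = 1.0                 # a single speckle spike
    smooth = denoise(noisy)
    print(smooth[2][2] < noisy[2][2])  # the spike has been diffused away
    ```

    Real speckle work would use an edge-preserving variant (e.g. nonlinear diffusion) rather than plain linear diffusion, but the update structure is the same.
    
    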

  18. Signal processing methods for MFE plasma diagnostics

    International Nuclear Information System (INIS)

    Candy, J.V.; Casper, T.; Kane, R.

    1985-02-01

    The application of various signal processing methods to extract energy storage information from plasma diamagnetism sensors during physics experiments on the Tandem Mirror Experiment-Upgrade (TMX-U) is discussed. We show how these processing techniques can be used to decrease the uncertainty in the corresponding sensor measurements. The algorithms suggested are implemented using SIG, an interactive signal processing package developed at LLNL

  19. Methods of process management in radiology

    International Nuclear Information System (INIS)

    Teichgraeber, U.K.M.; Gillessen, C.; Neumann, F.

    2003-01-01

    The main emphasis in health care has been on quality and availability, but increasing cost pressure has made cost efficiency ever more relevant for nurses, technicians, and physicians. Within a hospital, the radiologist considerably influences the patient's length of stay through the availability of service and diagnostic information. Therefore, coordinating and timing radiologic examinations becomes increasingly important. Physicians are not taught organizational management during their medical education and residency training, and the necessary expertise in economics is generally acquired through the literature or specialized courses. Beyond providing medical services, physicians are increasingly required to optimize their work flow according to economic factors. This review introduces various tools for process management and their application in radiology. By means of simple paper-based methods, the work flow of most processes can be analyzed. For more complex work flows, it is suggested to choose a method that allows an exact qualitative and quantitative prediction of the effect of variations. This review introduces the network planning technique and process simulation. (orig.) [de

  20. Method and apparatus for lysing and processing algae

    Science.gov (United States)

    Chew, Geoffrey; Reich, Alton J.; Dykes, Jr., H. Waite H.; Di Salvo, Roberto

    2013-03-05

    Methods and apparatus for processing algae are described in which a hydrophilic ionic liquid is used to lyse algae cells at lower temperatures than existing algae processing methods. A salt or salt solution is used as a separation agent and to remove water from the ionic liquid, allowing the ionic liquid to be reused. The used salt may be dried or concentrated and reused. The relatively low lysis temperatures and recycling of the ionic liquid and salt reduce the environmental impact of the algae processing while providing biofuels and other useful products.

  1. Process synthesis, design and analysis using a process-group contribution method

    DEFF Research Database (Denmark)

    Kumar Tula, Anjan; Eden, Mario R.; Gani, Rafiqul

    2015-01-01

    This paper describes the development and application of a process-group contribution method to model, simulate and synthesize chemical processes. Process flowsheets are generated in the same way as atoms or groups of atoms are combined to form molecules in computer aided molecular design (CAMD) techniques. The fundamental pillars of this framework are the definition and use of functional process-groups (building blocks) representing a wide range of process operations, flowsheet connectivity rules to join the process-groups to generate all the feasible flowsheet alternatives, and flowsheet property models, such as energy consumption, atom efficiency and environmental impact, to evaluate the performance of the generated alternatives. In this way, a list of feasible flowsheets is quickly generated, screened and selected for further analysis. Since the flowsheet is synthesized and the operations...

  2. An Automated Processing Method for Agglomeration Areas

    Directory of Open Access Journals (Sweden)

    Chengming Li

    2018-05-01

    Full Text Available Agglomeration operations are a core component of the automated generalization of aggregated area groups. However, because geographical elements that possess agglomeration features are relatively scarce, the current literature has not given sufficient attention to agglomeration operations. Furthermore, most reports on the subject are limited to the general conceptual level. Consequently, current agglomeration methods are highly reliant on subjective determinations and cannot support intelligent computer processing. This paper proposes an automated processing method for agglomeration areas. Firstly, the proposed method automatically identifies agglomeration areas based on the width of the striped bridging area, the distribution pattern index (DPI), the shape similarity index (SSI), and the overlap index (OI). Next, the progressive agglomeration operation is carried out, including the computation of the external boundary outlines and the extraction of agglomeration lines. The effectiveness and rationality of the proposed method have been validated using actual census data on Chinese geographical conditions in Jiangsu Province.

  3. A method to evaluate process performance by integrating time and resources

    Science.gov (United States)

    Wang, Yu; Wei, Qingjie; Jin, Shuang

    2017-06-01

    The purpose of process mining is to improve the existing processes of an enterprise, so how to measure process performance is particularly important. However, current research on performance evaluation methods is still insufficient: the main evaluation methods rely on time or resource statistics alone, and these basic statistics cannot evaluate process performance well. In this paper, a method for evaluating process performance based on both the time dimension and the resource dimension is proposed. The method can be used to measure the utilization and redundancy of resources in the process. The paper introduces the design principle and formula of the evaluation algorithm, then the design and implementation of the evaluation method. Finally, we use the evaluation method to analyse the event log from a telephone maintenance process and propose an optimization plan.
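
    The utilization/redundancy idea can be sketched roughly as follows; the function names, the threshold and the event log are our own inventions for illustration, not the paper's formulas:

    ```python
    # Sketch: per-resource utilization from an event log, where each event
    # is (resource, start, end). Utilization = busy time / total log span;
    # resources below a threshold are flagged as candidates for redundancy.
    from collections import defaultdict

    def utilization(events):
        busy = defaultdict(float)
        for res, start, end in events:
            busy[res] += end - start
        t0 = min(e[1] for e in events)   # earliest start in the log
        t1 = max(e[2] for e in events)   # latest end in the log
        span = t1 - t0
        return {res: b / span for res, b in busy.items()}

    def redundant(events, threshold=0.2):
        return sorted(r for r, u in utilization(events).items() if u < threshold)

    log = [("clerk_A", 0, 6), ("clerk_B", 1, 9), ("clerk_C", 8, 9)]
    print(utilization(log))   # clerk_C is busy only 1 of 9 time units
    print(redundant(log))
    ```

    A real evaluation would also weight by case throughput and waiting time, but this shows how combining the time and resource dimensions already says more than either alone.
    
    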

  4. Municipal solid waste processing methods: Technical-economic comparison

    International Nuclear Information System (INIS)

    Bertanza, G.

    1993-01-01

    This paper points out the advantages and disadvantages of municipal solid waste processing methods incorporating different energy and/or materials recovery techniques, i.e., those involving composting or incineration and those with a mix of composting and incineration. The various technologies employed are compared especially with regard to process reliability, flexibility, modularity, pollution control efficiency and cost effectiveness. As regards composting, biodigestors are examined, while for incineration the paper analyzes systems using combustion with complete recovery of vapour, combustion with total recovery of available electric energy, and combustion with cogeneration. Each of the processing methods examined includes an iron recovery cycle

  5. Novel welding image processing method based on fractal theory

    Institute of Scientific and Technical Information of China (English)

    陈强; 孙振国; 肖勇; 路井荣

    2002-01-01

    Computer vision has come into use in the fields of welding process control and automation. In order to improve the precision and speed of welding image processing, a novel method based on fractal theory is put forward in this paper. Compared with traditional methods, the image is first processed coarsely in macroscopic regions and then analyzed thoroughly in microscopic regions. The image is divided into regions according to the different fractal characteristics of the image edges, the fuzzy regions containing image edges are detected, and the edges are then identified with the Sobel operator and fitted by the least squares method (LSM). Since the amount of data to be processed is decreased and the image noise is reduced, experiments have verified that the edges of the weld seam or weld pool can be recognized correctly and quickly.
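
    A box-counting estimate of fractal dimension, the kind of per-region measure such a method could use to characterize edge regions, might look like this (our own sketch under stated assumptions, not the paper's algorithm):

    ```python
    # Box-counting dimension: count occupied boxes at several box sizes and
    # take the slope of log(count) against log(1/size) by least squares.
    import math

    def box_count(points, size):
        return len({(x // size, y // size) for x, y in points})

    def fractal_dimension(points, sizes=(1, 2, 4)):
        xs = [math.log(1.0 / s) for s in sizes]
        ys = [math.log(box_count(points, s)) for s in sizes]
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        # least-squares slope of ys vs xs
        return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
               sum((x - mx) ** 2 for x in xs)

    line = [(x, 0) for x in range(16)]        # a smooth, straight edge
    print(round(fractal_dimension(line), 2))  # close to 1.0 for a smooth curve
    ```

    A smooth edge scores near dimension 1, while a noisy or fuzzy edge region scores higher; that contrast is what lets a fractal measure single out the regions worth detailed (e.g. Sobel) processing.
    
    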

  6. Mathematical methods in time series analysis and digital image processing

    CERN Document Server

    Kurths, J; Maass, P; Timmer, J

    2008-01-01

    The aim of this volume is to bring together research directions in theoretical signal and image processing developed rather independently in electrical engineering, theoretical physics, mathematics and the computer sciences. In particular, mathematically justified algorithms and methods, the mathematical analysis of these algorithms and methods, and the investigation of connections between methods from time series analysis and image processing are reviewed. An interdisciplinary comparison of these methods, drawing upon common sets of test problems from medicine and the geophysical/environmental sciences, is also addressed. This volume coherently summarizes work carried out in the field of theoretical signal and image processing. It focuses on non-linear and non-parametric models for time series as well as on adaptive methods in image processing.

  7. Method for modeling social care processes for national information exchange.

    Science.gov (United States)

    Miettinen, Aki; Mykkänen, Juha; Laaksonen, Maarit

    2012-01-01

    Finnish social services include 21 service commissions of social welfare including Adoption counselling, Income support, Child welfare, Services for immigrants and Substance abuse care. This paper describes the method used for process modeling in the National project for IT in Social Services in Finland (Tikesos). The process modeling in the project aimed to support common national target state processes from the perspective of national electronic archive, increased interoperability between systems and electronic client documents. The process steps and other aspects of the method are presented. The method was developed, used and refined during the three years of process modeling in the national project.

  8. Apparatus and method for radiation processing of materials

    International Nuclear Information System (INIS)

    Neuberg, W.B.; Luniewski, R.

    1983-01-01

    A method and apparatus for radiation degradation processing of polytetrafluoroethylene make use of simultaneous irradiation, agitation and cooling. The apparatus is designed to make efficient use of radiation in the processing. (author)

  9. Method for processing spent nuclear reactor fuel

    International Nuclear Information System (INIS)

    Levenson, M.; Zebroski, E.L.

    1981-01-01

    A method and apparatus are claimed for processing spent nuclear reactor fuel wherein plutonium is continuously contaminated with radioactive fission products and diluted with uranium. Plutonium of sufficient purity to fabricate nuclear weapons cannot be produced by the process or in the disclosed reprocessing plant. Diversion of plutonium is prevented by radiation hazards and ease of detection

  10. Intelligent methods for the process parameter determination of plastic injection molding

    Science.gov (United States)

    Gao, Huang; Zhang, Yun; Zhou, Xundao; Li, Dequn

    2018-03-01

    Injection molding is one of the most widely used material processing methods for producing plastic products with complex geometries and high precision. The determination of process parameters is important for obtaining qualified products and maintaining product quality. This article reviews recent studies and developments in intelligent methods applied to the process parameter determination of injection molding. These intelligent methods are classified into three categories: case-based reasoning methods, expert system-based methods, and data fitting and optimization methods. A framework for process parameter determination is proposed after comprehensive discussion. Finally, conclusions and future research topics are discussed.
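
    A minimal sketch of the "data fitting and optimization" category (all numbers and names invented for illustration, not from the review): fit a quadratic surrogate of a quality measure to a handful of trial runs, then read the best setting off the fitted curve:

    ```python
    # Fit y = a*x^2 + b*x + c to trial data by least squares (normal
    # equations solved with Cramer's rule), then take the parabola vertex
    # as the recommended process parameter.
    def fit_quadratic(xs, ys):
        n = len(xs)
        sx = [sum(x ** k for x in xs) for k in range(5)]
        sy = [sum(y * x ** k for x, y in zip(xs, ys)) for k in range(3)]
        m = [[sx[4], sx[3], sx[2]],
             [sx[3], sx[2], sx[1]],
             [sx[2], sx[1], n]]
        rhs = [sy[2], sy[1], sy[0]]
        def det(a):
            return (a[0][0] * (a[1][1] * a[2][2] - a[1][2] * a[2][1])
                  - a[0][1] * (a[1][0] * a[2][2] - a[1][2] * a[2][0])
                  + a[0][2] * (a[1][0] * a[2][1] - a[1][1] * a[2][0]))
        d = det(m)
        coeffs = []
        for i in range(3):
            mi = [row[:] for row in m]
            for r in range(3):
                mi[r][i] = rhs[r]
            coeffs.append(det(mi) / d)
        return coeffs  # a, b, c

    temps = [180, 200, 220, 240, 260]        # melt temperature trials (deg C)
    warpage = [1.9, 1.2, 1.0, 1.3, 2.1]      # measured quality defect per trial
    a, b, c = fit_quadratic(temps, warpage)
    best_temp = -b / (2 * a)                 # vertex of the fitted parabola
    print(round(best_temp))
    ```

    This is the simplest member of the family; practical work replaces the quadratic with response surfaces, neural networks or kriging, and the vertex lookup with a numerical optimizer.
    
    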

  11. Image restoration and processing methods

    International Nuclear Information System (INIS)

    Daniell, G.J.

    1984-01-01

    This review will stress the importance of using image restoration techniques that deal with incomplete, inconsistent, and noisy data and do not introduce spurious features into the processed image. No single image is equally suitable for both the resolution of detail and the accurate measurement of intensities. A good general purpose technique is the maximum entropy method and the basis and use of this will be explained. (orig.)
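
    As a toy illustration of the maximum entropy principle (ours, not from the review): among images equally consistent with a measurement, the method prefers the one that introduces the fewest spurious features, i.e. the flattest, highest-entropy image:

    ```python
    # Among candidate images whose totals match the measurement, pick the
    # one with maximum (Shannon) entropy of the normalized intensities.
    import math

    def entropy(img):
        total = sum(img)
        return -sum((p / total) * math.log(p / total) for p in img if p > 0)

    def max_entropy_choice(candidates, measured_sum, tol=1e-6):
        feasible = [c for c in candidates if abs(sum(c) - measured_sum) < tol]
        return max(feasible, key=entropy)

    candidates = [
        [4.0, 0.0, 0.0, 0.0],   # spurious point source
        [2.0, 2.0, 0.0, 0.0],   # spurious double source
        [1.0, 1.0, 1.0, 1.0],   # flat image, no invented structure
    ]
    print(max_entropy_choice(candidates, 4.0))
    ```

    Real maximum entropy restoration maximizes entropy subject to a chi-squared fit to the data rather than searching a discrete candidate list, but the selection principle is the same.
    
    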

  12. Non-filtration method of processing uranium ores

    International Nuclear Information System (INIS)

    Laskorin, B.N.; Vodolazov, L.I.; Tokarev, N.N.; Vyalkov, V.I.; Goldobina, V.A.; Gosudarstvennyj Komitet po Ispol'zovaniyu Atomnoj Ehnergii SSSR, Moscow)

    1977-01-01

    The development of the non-filtration sorption method has led to procedures of sorption leaching and extraction desorption, which have made it possible to intensify the processing of uranium ores and to greatly improve the technical and economic indexes by eliminating the complex operations of multiple filtration and re-pulping of cakes. The method makes it possible to process poorer uranium raw materials while also extracting valuable components such as molybdenum, vanadium, copper, etc. Considerable industrial experience has been acquired in the sorption of dense pulp with a solid-to-liquid phase ratio of 1:1. This has led to a 1.5-3.0-fold increase in plant production, an increase of uranium extraction by 5-10%, a two- to three-fold increase in the labour productivity of the main workers, and a several-fold decrease in the consumption of reagents, auxiliary materials, electric energy and steam. The non-filtration method is a continuous process in all its phases thanks to the use of high-yield and high-power equipment for high-density pulps. (author)

  13. Evaluation of processing methods for static radioisotope scan images

    International Nuclear Information System (INIS)

    Oakberg, J.A.

    1976-12-01

    Radioisotope scanning in the field of nuclear medicine provides a method for the mapping of a radioactive drug in the human body to produce maps (images) which prove useful in detecting abnormalities in vital organs. At best, radioisotope scanning methods produce images with poor counting statistics. One solution to improving the body scan images is using dedicated small computers with appropriate software to process the scan data. Eleven methods for processing image data are compared

  14. Collaborative simulation method with spatiotemporal synchronization process control

    Science.gov (United States)

    Zou, Yisheng; Ding, Guofu; Zhang, Weihua; Zhang, Jian; Qin, Shengfeng; Tan, John Kian

    2016-10-01

    When designing a complex mechatronic system, such as a high speed train, it is relatively difficult to effectively simulate the entire system's dynamic behaviour because it involves multi-disciplinary subsystems. Currently, the most practical approach for multi-disciplinary simulation is the interface-based coupling simulation method, but it faces a twofold challenge: spatial and temporal desynchronization among the multi-directional coupling simulations of the subsystems. A new collaborative simulation method with spatiotemporal synchronization process control is proposed for coupling simulation of a given complex mechatronics system across multiple subsystems on different platforms. The method consists of 1) coupler-based coupling mechanisms to define the interfacing and interaction mechanisms among subsystems, and 2) a simulation process control algorithm to realize the coupling simulation in a spatiotemporally synchronized manner. The test results from a case study show that the proposed method 1) can be used to simulate subsystem interactions under different simulation conditions in an engineering system, and 2) effectively supports multi-directional coupling simulation among multi-disciplinary subsystems. The method has been successfully applied in Chinese high speed train design and development processes, demonstrating that it can be applied to a wide range of engineering system design and simulation tasks with improved efficiency and effectiveness.
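
    The temporal synchronization idea can be sketched as two subsystems that advance their clocks together and exchange interface values only at shared sync points (a minimal illustration of interface-based coupling in general, not the paper's algorithm):

    ```python
    # Two coupled subsystems (a harmonic oscillator split across A and B):
    # A integrates dx/dt = v, B integrates dv/dt = -x. Each uses the value
    # the other published at the last sync point, and both clocks advance
    # by the same sync step, so neither subsystem runs ahead.
    def simulate(sync_dt=0.1, t_end=1.0):
        t, x, v = 0.0, 1.0, 0.0
        while t < t_end - 1e-12:
            x_held, v_held = x, v        # interface values frozen at sync point
            x += sync_dt * v_held        # subsystem A step
            v += sync_dt * (-x_held)     # subsystem B step
            t += sync_dt                 # shared clock advances once
        return x, v

    x, v = simulate()
    # With held interface values this is explicit Euler, which multiplies the
    # energy x^2 + v^2 by exactly (1 + dt^2) per step.
    print(round(x * x + v * v, 2))  # prints 1.1 after ten 0.1-second steps
    ```

    Smaller sync steps (or iterating each sync interval to convergence) shrink the coupling error, which is the trade-off such co-simulation schemes manage.
    
    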

  15. Methods for the Evaluation of Waste Treatment Processes

    Directory of Open Access Journals (Sweden)

    Hans-Joachim Gehrmann

    2017-01-01

    Full Text Available Decision makers for waste management are confronted with the problem of selecting the most economic, environmental, and socially acceptable waste treatment process. This paper elucidates evaluation methods for waste treatment processes that allow the comparison of ecological and economic aspects, such as material flow analysis, statistical entropy analysis, energetic and exergetic assessment, cumulative energy demand, and life cycle assessment. The work is based on VDI guideline 3925. A comparison of two thermal waste treatment plants with different process designs and energy recovery systems was performed with the described evaluation methods. The results are mainly influenced by the type of energy recovery; the waste-to-energy plant providing district heat and process steam proved beneficial in most respects. Material recovery options from waste incineration were evaluated according to sustainability targets, such as resource saving and environmental protection.
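
    Statistical entropy analysis, one of the listed methods, can be illustrated with a toy partition entropy (our own simplification of the idea, with invented flows): a treatment that concentrates a substance into one output flow scores lower entropy than one that disperses it:

    ```python
    # Entropy of how a substance partitions across a plant's output flows.
    # Lower entropy = the substance is concentrated, which favours recovery
    # (for a resource) or controlled disposal (for a pollutant).
    import math

    def partition_entropy(substance_masses):
        total = sum(substance_masses)
        shares = [m / total for m in substance_masses if m > 0]
        return -sum(s * math.log2(s) for s in shares)

    dispersed    = [5.0, 5.0, 5.0, 5.0]      # metal spread over 4 output flows
    concentrated = [19.0, 0.5, 0.25, 0.25]   # metal mostly in one flow
    print(partition_entropy(dispersed) > partition_entropy(concentrated))
    ```

    Full statistical entropy analysis also weighs flow masses and concentrations, but the comparison above is the core signal the method extracts.
    
    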

  16. Mathematical methods for diffusion MRI processing

    International Nuclear Information System (INIS)

    Lenglet, C.; Lenglet, C.; Sapiro, G.; Campbell, J.S.W.; Pike, G.B.; Campbell, J.S.W.; Siddiqi, K.; Descoteaux, M.; Haro, G.; Wassermann, D.; Deriche, R.; Wassermann, D.; Anwander, A.; Thompson, P.M.

    2009-01-01

    In this article, we review recent mathematical models and computational methods for the processing of diffusion Magnetic Resonance Images, including state-of-the-art reconstruction of diffusion models, cerebral white matter connectivity analysis, and segmentation techniques. We focus on Diffusion Tensor Images (DTI) and Q-Ball Images (QBI). (authors)

  17. Study on Processing Method of Image Shadow

    Directory of Open Access Journals (Sweden)

    Wang Bo

    2014-07-01

    Full Text Available In order to effectively remove the disturbance of shadows and enhance the robustness of computer vision image processing, this paper studies the detection and removal of image shadows. It examines shadow removal algorithms based on integration, on the illumination surface, and on texture, introduces their working principles and implementation methods, and shows through tests that shadows can be processed effectively.

  18. Apparatus and method for X-ray image processing

    International Nuclear Information System (INIS)

    1984-01-01

    The invention relates to a method for X-ray image processing. The radiation passed through the object is transformed into an electric image signal, from which the logarithmic value is determined and displayed by a display device. The main objective is to provide a method and apparatus that render X-ray images or X-ray subtraction images with a strong reduction of stray radiation. (Auth.)

  19. Processing method and device for radioactive liquid waste

    International Nuclear Information System (INIS)

    Matsuo, Toshiaki; Nishi, Takashi; Matsuda, Masami; Yukita, Atsushi.

    1997-01-01

    When the COD components of radioactive washing liquid wastes comprise only suspended particulate ingredients, the liquid wastes are heated in a first process, for example an adsorption step, to adsorb the suspended particulate ingredients onto activated carbon, and the suspended particulate ingredients are then separated and removed by filtration. When both suspended particulate ingredients and soluble organic ingredients are contained, the suspended particulate ingredients are separated and removed by the first process and the soluble organic ingredients are then removed by another process, or both are removed by the first process. In the existing method of adding activated carbon and then filtering at normal temperature, the suspended particulate ingredients cover the layer of activated carbon formed on the filter paper or fabric and sometimes cause clogging. According to the method of the present invention, however, since no disturbance by suspended particulate ingredients occurs, the COD components can be separated and removed sufficiently without lowering the liquid waste processing speed. (T.M.)

  20. Device and method for shortening reactor process tubes

    Science.gov (United States)

    Frantz, Charles E.; Alexander, William K.; Lander, Walter E. B.

    1980-01-01

    This disclosure describes a device and method for in situ shortening of nuclear reactor zirconium alloy process tubes which have grown as a result of radiation exposure. An upsetting technique is utilized which involves inductively heating a short band of a process tube with simultaneous application of an axial load sufficient to cause upsetting with an attendant decrease in length of the process tube.

  1. A new method for defining and managing process alarms and for correcting process operation when an alarm occurs

    International Nuclear Information System (INIS)

    Brooks, Robin; Thorpe, Richard; Wilson, John

    2004-01-01

    A new mathematical treatment of alarms that considers them as multi-variable interactions between process variables has provided the first-ever method to calculate values for alarm limits. This has resulted in substantial reductions in false alarms and hence in alarm annunciation rates in field trials. It has also unified alarm management, process control and product quality control into a single mathematical framework so that operations improvement and hence economic benefits are obtained at the same time as increased process safety. Additionally, an algorithm has been developed that advises what changes should be made to manipulable process variables to clear an alarm. The multi-variable Best Operating Zone at the heart of the method is derived from existing historical data using equation-free methods. It does not require a first-principles process model or an expensive series of process identification experiments. Integral with the method is a new-format Process Operator Display that uses only existing variables to fully describe the multi-variable operating space. This combination of features makes it an affordable and maintainable solution for small plants and single items of equipment as well as for the largest plants. In many cases, it also provides the justification for the investments about to be made or already made in process historian systems. Field trials have been and are being conducted at IneosChlor and Mallinckrodt Chemicals, both in the UK, of the new geometric process control (GPC) method for improving the quality of both process operations and product by providing Process Alarms and Alerts of much higher quality than ever before. The paper describes the methods used, including a simple visual method for Alarm Rationalisation that quickly delivers large sets of Consistent Alarm Limits, and the extension to full Alert Management, with highlights from the Field Trials to indicate the overall effectiveness of the method in practice.
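The core idea of deriving limits from historical operating data rather than a process model can be illustrated with a much-simplified sketch. Note the assumptions: the actual GPC Best Operating Zone is a multi-variable geometric region, whereas the per-variable percentile rule below is only an illustrative stand-in:

```python
import numpy as np

def alarm_limits_from_history(data, lo_pct=1.0, hi_pct=99.0):
    """Derive per-variable alarm limits from historical operating data.

    Equation-free in the paper's sense: limits come from the empirical
    range of known-good operation, not from a first-principles model.
    """
    data = np.asarray(data, dtype=float)   # shape: (samples, variables)
    lo = np.percentile(data, lo_pct, axis=0)
    hi = np.percentile(data, hi_pct, axis=0)
    return lo, hi

def check_alarms(sample, lo, hi):
    """Return indices of variables outside their historical limits."""
    sample = np.asarray(sample, dtype=float)
    return np.where((sample < lo) | (sample > hi))[0].tolist()

# Synthetic history for two process variables (illustrative data only)
rng = np.random.default_rng(0)
history = rng.normal(loc=[50.0, 3.0], scale=[2.0, 0.1], size=(5000, 2))
lo, hi = alarm_limits_from_history(history)
alarms = check_alarms([50.5, 4.2], lo, hi)   # variable 1 is far above range
```

A real multi-variable zone would additionally flag combinations of values that are individually in range but jointly abnormal, which is exactly what the per-variable version cannot do.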

  4. Non-filtration method of processing of uranium ores

    International Nuclear Information System (INIS)

    Laskorin, B.N.; Vodolazov, L.I.; Tokarev, N.N.; Vyalkov, V.I.; Goldobina, V.A.; Gosudarstvennyj Komitet po Ispol'zovaniyu Atomnoj Ehnergii SSSR, Moscow)

    1977-01-01

    The development of the filterless sorption method has led to the sorption-leaching process and the extraction-desorption process, which have made it possible to intensify uranium ore processing and greatly improve the technical and economic indexes by eliminating the complex procedure of multiple filtration and repulping of cakes. The method makes it possible to process leaner uranium raw materials and at the same time to extract valuable components: molybdenum, vanadium, copper, etc. Extensive industrial experience has been accumulated in the sorption of dense pulps with a solid-to-liquid ratio of 1:1. This has increased the productivity of operating plants by 1.5-3.0 times, increased uranium extraction by 5-10%, increased the labour productivity of the main workers by 2-3 times, and decreased the consumption of reagents, auxiliary materials, electric energy and steam several times over. The developed technology is in fact continuous in all its steps, with comprehensive automation of the process using the simplest and most readily available means of regulation and control. The process is equipped with high-productivity, high-capacity apparatuses with mechanical and pneumatic mixing for high-density pulps, and with KDS, KDZS, KNSPR and PIK columns for regeneration of the saturated sorbent in counterflow. The use of fine-grained hydrophilic ion-exchange resins in a hydrophobized state is envisaged [ru

  5. Advances in Packaging Methods, Processes and Systems

    Directory of Open Access Journals (Sweden)

    Nitaigour Premchand Mahalik

    2014-10-01

    Full Text Available The food processing and packaging industry is becoming a multi-trillion dollar global business. The reason is that the recent increase in incomes in traditionally less economically developed countries has led to a rise in standards of living that includes a significantly higher consumption of packaged foods. As a result, food safety guidelines have become more stringent than ever. At the same time, the number of research and educational institutions—that is, the number of potential researchers and stakeholders—has increased in the recent past. This paper reviews recent developments in food processing and packaging (FPP, keeping in view the aforementioned advancements and bearing in mind that FPP is an interdisciplinary area in which materials, safety, systems, regulation, and supply chains play vital roles. In particular, the review covers processing and packaging principles, standards, interfaces, techniques, methods, and state-of-the-art technologies that are currently in use or in development. Recent advances such as smart packaging, non-destructive inspection methods, printing techniques, application of robotics and machinery, automation architecture, software systems and interfaces are reviewed.

  6. Processing of low-quality bauxite feedstock by thermochemistry-Bayer method

    Directory of Open Access Journals (Sweden)

    О. А. Дубовиков

    2016-11-01

    Full Text Available The modern production of aluminum, which by global output ranks first among the non-ferrous metals, includes three main stages: ore extraction, its processing into alumina and, finally, the production of primary aluminum. Alumina production from bauxites, the primary raw material of the alumina industry, is based on two main methods: the Bayer method and the sintering method developed in Russia under the lead of academician Nikolay Semenovich Kurnakov. Alumina production by the Bayer method is more cost-effective but places higher requirements on the quality of the bauxite feedstock. A great deal of research has been carried out on low-quality bauxites, focusing firstly on finding ways to enrich the feedstock, secondly on improving the combined sequential Bayer-sintering method, and thirdly on developing new hydrometallurgical ways of processing bauxites. Mechanical methods of bauxite enrichment have not yet brought any positive outcome, and the development of a new hydrometallurgical high-alkaline autoclave process faces significant hardware difficulties that have not been resolved so far. For efficient processing of such low-quality bauxite feedstock it is suggested to use the universal thermochemistry-Bayer method developed in St. Petersburg Mining University under the lead of Nikolay Ivanovich Eremin, which allows different substandard bauxite feedstocks to be processed and has competitive costs compared with the sintering method and combined methods. The main stages of the thermochemistry-Bayer method are thermal activation of the feedstock, its further desiliconization with alkaline solution, and leaching of the resultant bauxite product by the Bayer method. Despite high energy consumption at the baking stage, it allows conditioning of the low-quality bauxite feedstock by neutralizing a variety of technologically harmful impurities such as organic matter, sulfide sulfur, carbonates, and at the

  7. Method of parallel processing in SANPO real time system

    International Nuclear Information System (INIS)

    Ostrovnoj, A.I.; Salamatin, I.M.

    1981-01-01

    A method of parallel processing in the SANPO real-time system is described. Algorithms for data accumulation and preliminary processing in this system, implemented as parallel processes in a specialized high-level programming language, are described, as is the hierarchy of elementary processes. The approach provides synchronization of concurrent processes without semaphores. The developed means are applied to experiment-automation systems based on SM-3 minicomputers [ru
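Synchronizing an accumulation stage and a preliminary-processing stage without explicit semaphores can be sketched with a bounded queue, whose blocking put/get supplies the synchronization implicitly. This is only an illustrative analogue in Python, not the SANPO implementation:

```python
import threading
import queue

def accumulate(raw_events, buf):
    """Producer stage: push raw events into the shared buffer."""
    for event in raw_events:
        buf.put(event)          # blocks when the buffer is full
    buf.put(None)               # end-of-data sentinel

def preprocess(buf, results):
    """Consumer stage: preliminary processing of accumulated events."""
    while True:
        event = buf.get()       # blocks when the buffer is empty
        if event is None:
            break
        results.append(event * 2)   # placeholder "preliminary processing"

buf = queue.Queue(maxsize=8)    # bounded buffer replaces semaphore pairs
results = []
t1 = threading.Thread(target=accumulate, args=(range(100), buf))
t2 = threading.Thread(target=preprocess, args=(buf, results))
t1.start(); t2.start()
t1.join(); t2.join()
```

The bounded queue plays the role the abstract attributes to the language runtime: neither process ever touches a counter or lock directly.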

  8. Post-Processing in the Material-Point Method

    DEFF Research Database (Denmark)

    Andersen, Søren; Andersen, Lars Vabbersgaard

    The material-point method (MPM) is a numerical method for dynamic or static analysis of solids using a discretization in time and space. The method has shown to be successful in modelling physical problems involving large deformations, which are difficult to model with traditional numerical tools...... such as the finite element method. In the material-point method, a set of material points is utilized to track the problem in time and space, while a computational background grid is utilized to obtain spatial derivatives relevant to the physical problem. Currently, the research within the material-point method......-point method. The first idea involves associating a volume with each material point and displaying the deformation of this volume. In the discretization process, the physical domain is divided into a number of smaller volumes each represented by a simple shape; here quadrilaterals are chosen for the presented...

  9. Effect of thermal processing methods on the proximate composition ...

    African Journals Online (AJOL)

    The nutritive value of raw and thermal processed castor oil seed (Ricinus communis) was investigated using the following parameters; proximate composition, gross energy, mineral constituents and ricin content. Three thermal processing methods; toasting, boiling and soaking-and-boiling were used in the processing of the ...

  10. An object-oriented description method of EPMM process

    Science.gov (United States)

    Jiang, Zuo; Yang, Fan

    2017-06-01

    In order to use mature object-oriented tools and languages for software process models, and to make such models better conform to industrial standards, it is necessary to study object-oriented modelling of software processes. Based on the formal process definition in EPMM, and considering that Petri nets are chiefly a formal modelling tool, this paper combines Petri-net modelling with object-oriented modelling ideas and provides an implementation method for converting Petri-net-based EPMM models into object-oriented models.
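The general flavour of mapping a Petri net onto objects can be shown with a minimal sketch: places and transitions become classes, and firing a transition moves tokens. The paper's specific EPMM conversion rules are not reproduced here; this is a generic illustration:

```python
class Place:
    """A Petri-net place holding a token count."""
    def __init__(self, name, tokens=0):
        self.name, self.tokens = name, tokens

class Transition:
    """A Petri-net transition with input and output places."""
    def __init__(self, name, inputs, outputs):
        self.name, self.inputs, self.outputs = name, inputs, outputs

    def enabled(self):
        # Enabled iff every input place carries at least one token
        return all(p.tokens > 0 for p in self.inputs)

    def fire(self):
        if not self.enabled():
            raise RuntimeError(f"{self.name} is not enabled")
        for p in self.inputs:
            p.tokens -= 1
        for p in self.outputs:
            p.tokens += 1

# Two-step software process fragment: design -> implement
design_done = Place("design_done", tokens=1)
impl_done = Place("impl_done")
implement = Transition("implement", [design_done], [impl_done])
implement.fire()
```

After firing, the token has moved from `design_done` to `impl_done`, and `implement` is no longer enabled, mirroring the state change of the modelled process step.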

  11. Method of processing low-level radioactive liquid wastes

    International Nuclear Information System (INIS)

    Matsunaga, Ichiro; Sugai, Hiroshi.

    1984-01-01

    Purpose: To effectively reduce the radioactivity density of low-level radioactive liquid wastes discharged from enriched-uranium conversion processing steps or the like. Method: Hydrazine is added to the low-level radioactive liquid wastes, which are then brought into contact with iron hydroxide-cation exchange resins, prepared by treating strongly acidic cation exchange resins with ferric chloride and aqueous ammonia to form hydrolyzates of ferric ions in the resin. The hydrazine added here may be any of hydrazine hydrate, hydrazine hydrochloride and hydrazine sulfate. The preferred addition amount is more than 100 mg per litre of liquid waste; below 100 mg, the reduction ratio of the radioactivity density (processed liquid density/original liquid density) decreases. This method makes it possible to effectively reduce the radioactivity density of low-level radioactive liquid wastes containing trace amounts of radioactive nuclides. (Yoshihara, H.)

  12. A method for manufacturing a tool part for an injection molding process, a hot embossing process, a nano-imprint process, or an extrusion process

    DEFF Research Database (Denmark)

    2013-01-01

    The present invention relates to a method for manufacturing a tool part for an injection molding process, a hot embossing process, a nano-imprint process or an extrusion process. First, there is provided a master structure (10) with a surface area comprising nanometre-sized protrusions (11...

  13. Studies of neutron methods for process control and criticality surveillance of fissile material processing facilities

    International Nuclear Information System (INIS)

    Zoltowski, T.

    1988-01-01

    The development of radiochemical processes for fissile material processing and spent fuel handling requires new control procedures that improve plant throughput. This is closely related to implementing a continuous criticality control policy and developing reliable methods for monitoring the reactivity of radiochemical plant operations in the presence of process perturbations. Neutron methods appear applicable to fissile material control in some technological facilities. The measurement of epithermal neutron source multiplication, with heuristic evaluation of the measured data, enables surveillance of anomalous reactivity enhancement leading to unsafe states. 80 refs., 47 figs., 33 tabs. (author)

  14. Method of processing liquid wastes

    International Nuclear Information System (INIS)

    Naba, Katsumi; Oohashi, Takeshi; Kawakatsu, Ryu; Kuribayashi, Kotaro.

    1980-01-01

    Purpose: To process radioactive liquid wastes safely by distilling them while passing gases, treating the distillation fractions appropriately, adding a combustible, liquid synthetic resin material to the distillation residues, polymerizing it to solidify them, and then burning the solids. Method: Liquid wastes containing radioactive substances are distilled while gases are passed, and the distillation fractions, which contain substantially no radioactive substances, are treated by an adequate method. A synthetic resin material, which may be a mixture of polymer and monomer, is added together with a catalyst to the distillation residues containing almost all of the radioactive substances, to polymerize and solidify them. Water or solvent may remain to an extent that does not hinder solidification. The solidification products are burnt to facilitate treatment of the radioactive substances. The resin material can be selected suitably, methacrylate syrup (mainly a solution of polymethylmethacrylate and methylmethacrylate) being preferred. (Seki, T.)

  15. Improved Methods for Production Manufacturing Processes in Environmentally Benign Manufacturing

    Directory of Open Access Journals (Sweden)

    Yan-Yan Wang

    2011-09-01

    Full Text Available How to design a production process with low carbon emissions and low environmental impact as well as high manufacturing performance is a key factor in the success of low-carbon production. It is important to address concerns about climate change for the large carbon emission source manufacturing industries because of their high energy consumption and environmental impact during the manufacturing stage of the production life cycle. In this paper, methodology for determining a production process is developed. This methodology integrates process determination from three different levels: new production processing, selected production processing and batch production processing. This approach is taken within a manufacturing enterprise based on prior research. The methodology is aimed at providing decision support for implementing Environmentally Benign Manufacturing (EBM and low-carbon production to improve the environmental performance of the manufacturing industry. At the first level, a decision-making model for new production processes based on the Genetic Simulated Annealing Algorithm (GSAA is presented. The decision-making model considers not only the traditional factors, such as time, quality and cost, but also energy and resource consumption and environmental impact, which are different from the traditional methods. At the second level, a methodology is developed based on an IPO (Input-Process-Output model that integrates assessments of resource consumption and environmental impact in terms of a materials balance principle for batch production processes. At the third level, based on the above two levels, a method for determining production processes that focus on low-carbon production is developed based on case-based reasoning, expert systems and feature technology for designing the process flow of a new component. Through the above three levels, a method for determining the production process to identify, quantify, assess, and optimize the

  16. First-order Convex Optimization Methods for Signal and Image Processing

    DEFF Research Database (Denmark)

    Jensen, Tobias Lindstrøm

    2012-01-01

    In this thesis we investigate the use of first-order convex optimization methods applied to problems in signal and image processing. First we make a general introduction to convex optimization, first-order methods and their iteration complexity. Then we look at different techniques, which can...... be used with first-order methods such as smoothing, Lagrange multipliers and proximal gradient methods. We continue by presenting different applications of convex optimization and notable convex formulations with an emphasis on inverse problems and sparse signal processing. We also describe the multiple...
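A representative first-order proximal gradient method from this family is ISTA for the l1-regularized least-squares (sparse recovery) problem. The sketch below is a generic textbook version with synthetic data, offered as an illustration of the class of methods the thesis covers, not code from the thesis itself:

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * ||x||_1 (the 'shrinkage' step)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, b, lam, n_iter=3000):
    """Proximal gradient (ISTA) for min 0.5*||Ax - b||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)           # gradient of the smooth term
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Synthetic compressed-sensing instance: 3-sparse signal, 40 measurements
rng = np.random.default_rng(1)
A = rng.normal(size=(40, 100))
x_true = np.zeros(100)
x_true[[5, 42, 77]] = [1.0, -2.0, 1.5]
b = A @ x_true
x_hat = ista(A, b, lam=0.05)
```

Each iteration is one gradient step on the smooth term followed by the proximal (soft-thresholding) step on the nonsmooth l1 term, the structure shared by all proximal gradient methods.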

  17. Digital processing method for monitoring the radioactivity of stack releases

    International Nuclear Information System (INIS)

    Vialettes, H.; Leblanc, P.; Perotin, J.P.; Lazou, J.P.

    1978-01-01

    The digital processing method proposed is adapted for data supplied by a fixed-filter detector normally used for analogue processing (integrator system). On the basis of the raw data (pulses) from the detector, the technique makes it possible to determine the rate of activity released whereas analogue processing gives only the released activity. Furthermore, the method can be used to develop alarm systems on the basis of a possible exposure rate at the point of fall-out, and by including in the program a coefficient which allows for atmospheric diffusion conditions at any given time one can improve the accuracy of the results. In order to test the digital processing method and demonstrate its advantages over analogue processing, various atmospheric contamination situations were simulated in a glove-box and analysed simultaneously, using both systems, from the pulses transmitted by the same sampling and fixed-filter detection unit. The experimental results confirm the advantages foreseen in the theoretical research. (author)
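The distinction the abstract draws (rate of release vs. accumulated release) can be sketched numerically. With a fixed filter, the counts track the activity collected so far, so the release rate is recovered from the derivative of the counts plus a decay correction. The model and parameter values below are illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

def release_rate(counts, dt, efficiency, decay_const):
    """Estimate the activity deposition rate from cumulative pulse
    counts sampled every dt seconds on a fixed filter.

    Model assumption: dA/dt = S(t) - decay_const * A, where A is the
    activity on the filter and S is the release (deposition) rate.
    """
    activity = np.asarray(counts, dtype=float) / efficiency   # Bq on filter
    dA_dt = np.gradient(activity, dt)
    return dA_dt + decay_const * activity    # solve the balance for S(t)

# Constant deposition of 10 Bq/s of a long-lived nuclide (synthetic data)
dt, lam, eff = 1.0, 1e-6, 0.2
t = np.arange(0, 60, dt)
true_rate = 10.0
counts = eff * true_rate * t        # counts proportional to collected activity
rates = release_rate(counts, dt, eff, lam)
```

An analogue integrator reports only `activity` (what is on the filter); differentiating the digital pulse record is what exposes `rates`, the instantaneous release.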

  18. Gear hot forging process robust design based on finite element method

    International Nuclear Information System (INIS)

    Xuewen, Chen; Won, Jung Dong

    2008-01-01

    During the hot forging process, the shaping property and forging quality will fluctuate because of die wear, manufacturing tolerance, dimensional variation caused by temperature and the different friction conditions, etc. In order to control this variation in performance and to optimize the process parameters, a robust design method is proposed in this paper, based on the finite element method for the hot forging process. During the robust design process, the Taguchi method is the basic robust theory. The finite element analysis is incorporated in order to simulate the hot forging process. In addition, in order to calculate the objective function value, an orthogonal design method is selected to arrange experiments and collect sample points. The ANOVA method is employed to analyze the relationships of the design parameters and design objectives and to find the best parameters. Finally, a case study for the gear hot forging process is conducted. With the objective to reduce the forging force and its variation, the robust design mathematical model is established. The optimal design parameters obtained from this study indicate that the forging force has been reduced and its variation has been controlled
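The Taguchi/orthogonal-design step described above can be illustrated with a tiny worked example. The L4 array and the forging-force numbers below are made up for demonstration; only the procedure (smaller-the-better S/N ratio, then main effects per factor) reflects the method:

```python
import numpy as np

# L4 orthogonal array: 3 factors at 2 levels in 4 runs
L4 = np.array([[0, 0, 0],
               [0, 1, 1],
               [1, 0, 1],
               [1, 1, 0]])

# Replicated forging-force measurements (kN) for each run (illustrative)
forces = np.array([[520, 530], [480, 475], [610, 605], [545, 550]])

# Smaller-the-better S/N ratio: -10*log10(mean(y^2)); larger is better
sn = -10 * np.log10((forces ** 2).mean(axis=1))

# Main effect of each factor = mean S/N at level 1 minus mean at level 0
effects = np.array([sn[L4[:, f] == 1].mean() - sn[L4[:, f] == 0].mean()
                    for f in range(3)])

# Choose, per factor, the level with the higher S/N ratio
best_levels = (effects > 0).astype(int)
```

In the paper this screening is driven by finite element simulations of the forging rather than measurements, and ANOVA quantifies how much of the variation each factor explains; the level-selection logic is the same.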

  19. Nonaqueous processing methods

    International Nuclear Information System (INIS)

    Coops, M.S.; Bowersox, D.F.

    1984-09-01

    A high-temperature process utilizing molten salt extraction from molten metal alloys has been developed for purification of spent power reactor fuels. Experiments with laboratory-scale processing operations show that purification and throughput parameters comparable to the Barnwell Purex process can be achieved by pyrochemical processing in equipment one-tenth the size, with all wastes being discharged as stable metal alloys at greatly reduced volume and disposal cost. This basic technology can be developed for large-scale processing of spent reactor fuels. 13 references, 4 figures

  20. Recent developments in analytical detection methods for radiation processed foods

    International Nuclear Information System (INIS)

    Wu Jilan

    1993-01-01

    A short summary is given of the FAO/IAEA 'ADMIT' programme and of developments in analytical detection methods for radiation-processed foods. It is suggested that, to promote the commercialization of radiation-processed foods and to control their quality, more attention must be paid to the study of analytical detection methods for irradiated food

  1. Adoption of the Creative Process According to the Immersive Method

    Directory of Open Access Journals (Sweden)

    Sonja Vuk

    2015-09-01

    Full Text Available The immersive method is a new concept of visual education that is better suited to the needs of students in contemporary post-industrial society. The features of the immersive method are: (1) it emerges from interaction with visual culture; (2) it encourages understanding of contemporary art (as an integral part of visual culture); and (3) it implements the strategies and processes of the dominant tendencies in contemporary art (new media art and relational art), with the goal of adopting the creative process, expressing one’s thoughts and emotions, and communicating with the environment. The immersive method transfers the creative process from art to the process of creation by the students themselves. This occurs with the mediation of an algorithmic scheme that enables students to adopt ways to solve problems, to express thoughts and emotions, to develop ideas and to transfer these ideas to form, medium and material. The immersive method uses transfer in classes, the therapeutic aspect of art, the “flow state” (the optimal experience of being immersed in an activity), aesthetic experience (a total experience that has a beginning, a process and a conclusion) and immersive experience (comprehensive immersion in the present moment). This is a state leading to the sublimative effect of creation (identification with what has been expressed), as well as to self-actualisation. The immersive method teaches one to connect the context, social relations and the artwork as a whole in which one lives as an individual. The adopted creative process is applied in a critical manner to one’s surroundings through analysis, aesthetic interventions, and ecologically and socially aware inclusion in the life of a community. The students gain the crucial meta-competence of a creative thinking process.

  2. Research of Monte Carlo method used in simulation of different maintenance processes

    International Nuclear Information System (INIS)

    Zhao Siqiao; Liu Jingquan

    2011-01-01

    The paper introduces two kinds of Monte Carlo methods used in equipment life-process simulation under the minimal-maintenance condition: a method of generating the lifetime intervals, and a method of time-scale conversion. The paper also analyzes the characteristics and scope of application of the two methods. Using the concept of a service-age reduction factor, a model of the equipment life process under incomplete maintenance is established, and a life-process simulation method applicable to this situation is developed. (authors)
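The time-scale-conversion idea can be sketched for the standard minimal-repair model: under minimal repair, failure epochs form a nonhomogeneous Poisson process whose cumulative intensity is the cumulative hazard, so one simulates a unit-rate Poisson process and inverts the hazard. The Weibull parameters below are illustrative assumptions, not values from the paper:

```python
import random

def weibull_minimal_repair_failures(beta, eta, horizon, rng):
    """Failure times on [0, horizon] under minimal repair for a Weibull
    hazard, via time-scale conversion: generate unit-rate Poisson
    arrivals h and map them through the inverse cumulative hazard
    H(t) = (t/eta)**beta, i.e. t = eta * h**(1/beta)."""
    failures, h = [], 0.0
    while True:
        h += rng.expovariate(1.0)          # next arrival on the unit scale
        t = eta * h ** (1.0 / beta)        # convert back to real time
        if t > horizon:
            return failures
        failures.append(t)

# Check against theory: expected failures on [0, T] equal H(T)
rng = random.Random(42)
n_runs, horizon = 2000, 10.0
counts = [len(weibull_minimal_repair_failures(2.0, 5.0, horizon, rng))
          for _ in range(n_runs)]
mean_failures = sum(counts) / n_runs   # theory: H(10) = (10/5)**2 = 4
```

The alternative method the abstract mentions, directly sampling successive inter-failure intervals from the conditional lifetime distribution, produces the same process; the conversion route is often simpler to implement.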

  3. [Influence of different processing methods on Angelica sinensis polysaccharides from same origin].

    Science.gov (United States)

    Lv, Jieli; Chen, Hongli; Duan, Jinao; Yan, Hui; Tang, Yuping; Song, Bingsheng

    2011-04-01

    To study the influence of different processing methods on the content of Angelica sinensis polysaccharides (APS) from the same origin. The contents of neutral and acidic polysaccharides in various samples of A. sinensis were determined by the phenol-sulfuric acid and carbazole-sulfuric acid methods, respectively. The proliferation ability of lymphocytes was detected by the MTT method after the cells were cultured with different concentrations of APS from two samples processed by different methods. The different processing methods had different effects on the polysaccharide contents. The maximum content of APS (26.03%) was found in the sample processed by microwave drying at medium power, while the minimum content of APS (2.25%) was found in the sample processed by vacuum drying at 50 °C. Furthermore, the APS (high concentration group) affected the proliferation of lymphocytes. Different processing methods thus have different effects on the contents of APS and on the proliferation ability of lymphocytes.

  4. Method of electrolytic processing for radioactive liquid waste

    International Nuclear Information System (INIS)

    Otsuka, Katsuyuki; Takahashi, Yoshiharu; Tamai, Hideaki.

    1989-01-01

    Radioactive liquid wastes containing sodium compounds are electrolyzed using mercury as a cathode. As a result, they are separated into a sodium-containing metal amalgam and residues. Metals containing sodium are separated from the amalgam, purified and re-used, while the mercury is recycled to the electrolysis vessel. The method provides advantages such as: (1) the volume of the wastes to be processed can be reduced; (2) since processing is carried out at a relatively low temperature, low-boiling elements can be handled without evaporation; (3) useful elements can be recovered; and (4) methods other than glass solidification can easily be employed, so that remarkable volume reduction of the solidification products can be expected. (K.M.)

  5. Processing Methods of Alkaline Hydrolysate from Rice Husk

    Directory of Open Access Journals (Sweden)

    Olga D. Arefieva

    2017-07-01

    Full Text Available This paper is devoted to finding methods of processing the alkaline hydrolysate produced by rice husk pre-extraction, and discusses alkaline hydrolysate processing schemes and the separation of several products: amorphous silica of various quality, alkaline lignin, and water- and alkaline-extraction polysaccharides. Silica samples were characterized: crude (air-dried), burnt (no preliminary water treatment), washed in distilled water, and washed in distilled water and burnt. Waste-water parameters upon the extraction of solids from the alkaline hydrolysate dropped by factors of tens to thousands depending on the applied processing method: color decreased a few thousand times, turbidity was virtually eliminated, chemical oxygen demand dropped about 20–136 times, and polyphenol content decreased by 50% or was virtually eliminated. The most promising scheme yields the two following solid products from rice husk alkaline hydrolysate: amorphous silica and alkaline-extraction polysaccharide. The chemical oxygen demand of the remaining waste water decreased about 140 times compared to the silica-free solution.

  6. Integrated process development-a robust, rapid method for inclusion body harvesting and processing at the microscale level.

    Science.gov (United States)

    Walther, Cornelia; Kellner, Martin; Berkemeyer, Matthias; Brocard, Cécile; Dürauer, Astrid

    2017-10-21

    Escherichia coli stores large amounts of highly pure product within inclusion bodies (IBs). To take advantage of this beneficial feature, after cell disintegration, the first step to optimal product recovery is efficient IB preparation. This step is also important in evaluating upstream optimization and process development, due to the potential impact of bioprocessing conditions on product quality and on the nanoscale properties of IBs. Proper IB preparation is often neglected, because laboratory-scale methods require large amounts of material and labor. Miniaturization and parallelization can accelerate analyses of individual processing steps and provide a deeper understanding of up- and downstream processing interdependencies. Consequently, reproducible, predictive microscale methods are in demand. In the present study, we complemented a recently established high-throughput cell disruption method with a microscale method for preparing purified IBs. This preparation provided results comparable to laboratory-scale IB processing regarding impurity depletion and product loss. Furthermore, with this method, we performed a "design of experiments" study to demonstrate the influence of fermentation conditions on the performance of subsequent downstream steps and product quality. We showed that this approach provided a 300-fold reduction in material consumption for each fermentation condition and a 24-fold reduction in processing time for 24 samples.

  7. Research on interpolation methods in medical image processing.

    Science.gov (United States)

    Pan, Mei-Sen; Yang, Xiao-Li; Tang, Jing-Tian

    2012-04-01

    Image interpolation is widely used in the field of medical image processing. In this paper, interpolation methods are divided into three groups: filter interpolation, ordinary interpolation and general partial volume interpolation. Some commonly used filter methods for image interpolation are introduced, but their interpolation effects need to be further improved. In analyzing and discussing ordinary interpolation, many asymmetrical kernel interpolation methods are proposed; compared with symmetrical kernel ones, the former have some advantages. After analyzing the partial volume and generalized partial volume estimation interpolations, the new concept and constraint conditions of the general partial volume interpolation are defined, and several new partial volume interpolation functions are derived. By performing experiments on image scaling, rotation and self-registration, the interpolation methods mentioned in this paper are compared in terms of entropy, peak signal-to-noise ratio, cross entropy, normalized cross-correlation coefficient and running time. Among the filter interpolation methods, the median and B-spline filter interpolations have relatively better interpolating performance. Among the ordinary interpolation methods, on the whole, the symmetrical cubic kernel interpolations demonstrate a strong advantage, especially the symmetrical cubic B-spline interpolation; however, they are very time-consuming and have lower time efficiency. As for the general partial volume interpolation methods, in terms of the total error of image self-registration the symmetrical interpolations provide certain superiority, but considering processing efficiency the asymmetrical interpolations are better.
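    The comparison metrics above can be made concrete with a small stand-in experiment. The following sketch (not from the paper; a 1D toy signal rather than a real medical image, with `psnr` defined locally) upsamples a coarsely sampled signal by nearest-neighbour and linear interpolation and scores both by peak signal-to-noise ratio:

    ```python
    import numpy as np

    def psnr(ref, test, peak=1.0):
        """Peak signal-to-noise ratio in dB, one of the metrics used to rank interpolators."""
        mse = np.mean((ref - test) ** 2)
        return 10 * np.log10(peak ** 2 / mse)

    # Ground truth: a smooth signal sampled densely.
    x_fine = np.linspace(0, 1, 101)
    truth = np.sin(2 * np.pi * x_fine)

    # Coarse samples to interpolate back up from.
    x_coarse = np.linspace(0, 1, 11)
    samples = np.sin(2 * np.pi * x_coarse)

    # Nearest-neighbour interpolation: pick the closest coarse sample.
    idx = np.abs(x_fine[:, None] - x_coarse[None, :]).argmin(axis=1)
    nearest = samples[idx]

    # Linear interpolation between neighbouring coarse samples.
    linear = np.interp(x_fine, x_coarse, samples)

    print(psnr(truth, nearest), psnr(truth, linear))
    ```

    On smooth data the linear interpolant scores a markedly higher PSNR than the nearest-neighbour one, mirroring the kind of ranking the paper reports for higher-order kernels.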

  8. Radiometric method for the characterization of particulate processes in colloidal suspensions. II. Experimental verification of the method

    Energy Technology Data Exchange (ETDEWEB)

    Subotic, B. [Institut Rudjer Boskovic, Zagreb (Yugoslavia)

    1979-09-15

    A radiometric method for the characterization of particulate processes is verified using stable hydrosols of silver iodide. Silver iodide hydrosols satisfy the conditions required for the application of the proposed method. Comparison shows that the values for the change of particle size measured in silver iodide hydrosols by the proposed method are in excellent agreement with the values obtained by other methods on the same systems (electron microscopy, sedimentation analysis, light scattering). This shows that the proposed method is suitable for the characterization of particulate processes in colloidal suspensions. (Auth.).

  9. SKOCh modified parameters and data processing method

    International Nuclear Information System (INIS)

    Abramov, V.V.; Baldin, B.Yu.; Vasil'chenko, V.G.

    1986-01-01

    Characteristics of a modified variant of the Cherenkov ring spectrometer (SKOCH) are presented, and methods of experimental data processing are described. Different SKOCH optics variants are investigated. Multi-particle registering electronics for data read-out from SKOCH, which improve the conditions for registering multi-particle events, were used during measurements with proton beams. A system of data processing programs for the SKOCH spectrometer has been developed and experimentally tested, including an effective algorithm for calibrating Cherenkov ring spectrometers with a rather large angular and radial aperture. The on-line and off-line processing program complex provides complete control of SKOCH operation during statistics collection and particle (π, K, P) identification within the 5.5-30 GeV/c range

  10. Development of rupture process analysis method for great earthquakes using Direct Solution Method

    Science.gov (United States)

    Yoshimoto, M.; Yamanaka, Y.; Takeuchi, N.

    2010-12-01

    Conventional rupture process analysis methods using teleseismic body waves are based on ray theory. These methods therefore have the following problems when applied to great earthquakes such as the 2004 Sumatra earthquake: (1) difficulty in computing all later phases such as the PP reflection phase, (2) impossibility of computing the so-called "W phase", the long-period phase arriving before the S wave, and (3) implausibility of the hypothesis that the distance from the observation points to the hypocenter is large compared to the fault length. To solve the above problems, we have developed a new method which uses synthetic seismograms computed by the Direct Solution Method (DSM, e.g. Kawai et al. 2006) as Green's functions. We used the DSM software (http://www.eri.u-tokyo.ac.jp/takeuchi/software/) to compute the Green's functions up to 1 Hz for the IASP91 (Kennett and Engdahl, 1991) model, and determined the final slip distributions using the waveform inversion method (Kikuchi et al. 2003). First we confirmed that the Green's functions computed by DSM were accurate at frequencies up to 1 Hz. Next we applied the new method to the rupture process analysis of the Mw 8.0 (GCMT) Solomon Islands earthquake of April 1, 2007. We found that this earthquake consisted of two asperities and that the rupture propagated across the subducting Sinbo ridge. The obtained slip distribution correlates better with the aftershock distribution than that of the existing method. Furthermore, the new method retains the accuracy of the existing method with respect to the direct P wave and reflection phases near the source, and also accurately calculates later phases such as the PP wave.

  11. SELECTION OF NON-CONVENTIONAL MACHINING PROCESSES USING THE OCRA METHOD

    Directory of Open Access Journals (Sweden)

    Miloš Madić

    2015-04-01

    Full Text Available Selection of the most suitable nonconventional machining process (NCMP) for a given machining application can be viewed as a multi-criteria decision making (MCDM) problem with many conflicting and diverse criteria. To aid these selection processes, different MCDM methods have been proposed. This paper introduces the use of an almost unexplored MCDM method, operational competitiveness ratings analysis (OCRA), for solving NCMP selection problems. The applicability, suitability and computational procedure of the OCRA method are demonstrated by solving three case studies dealing with selection of the most suitable NCMP. In each case study the obtained rankings were compared with those derived by past researchers using different MCDM methods. The results obtained using the OCRA method correlate well with those derived by past researchers, which validates the usefulness of this method for solving complex NCMP selection problems.

  12. Method of processing radioactive wastes

    International Nuclear Information System (INIS)

    Nomura, Ichiro; Hashimoto, Yasuo.

    1984-01-01

    Purpose: To improve the volume-reduction effect, to enable simultaneous processing of wastes such as burnable solid wastes, waste resins and sludges, and further to convert the processed materials into glass-solidified products which are much less burnable and are chemically and thermally stable. Method: Auxiliaries mainly composed of SiO2, such as clays, together with wastes such as burnable solid wastes, waste resins and sludges, are charged through a waste hopper into an incinerating/melting furnace comprising an incinerating furnace and a melting furnace, while radioactive concentrated liquid wastes are sprayed from a spray nozzle. The wastes are burnt by the heat from the melting furnace and combustion air, and the sprayed concentrated wastes are dried by the hot combustion gas into solid components. The solids from the concentrated liquid wastes and the incineration ashes are melted together with the auxiliaries in the melting furnace and converted into glass-like matter, which is poured into a vessel and gradually cooled to solidify. (Horiuchi, T.)

  13. Methods and systems for the processing of physiological signals

    International Nuclear Information System (INIS)

    Cosnac, B. de; Gariod, R.; Max, J.; Monge, V.

    1975-01-01

    This note is a general survey of the processing of physiological signals. After an introduction about electrodes and their limitations, the physiological nature of the main signals is briefly recalled. Different methods (signal averaging, spectral analysis, morphological shape analysis) are described through their applications in magnetocardiography, electro-encephalography, cardiography and electronystagmography. Processing means (single portable instruments and programmable systems) are described through the example of their application to rheography and to the Plurimat'S general system. In conclusion, the methods of signal processing are dominated by the morphological analysis of curves and by the necessity of a wider introduction of statistical classification. As for the instruments, microprocessors will appear, but specific operators linked to a computer will certainly grow in importance [fr
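    Signal averaging, the first method listed above, can be sketched in a few lines: repeated epochs of a noisy physiological signal are averaged so that uncorrelated noise shrinks roughly as 1/sqrt(N). This is a generic illustration with synthetic data, not the Plurimat'S implementation:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # A repeating "evoked" waveform buried in unit-variance noise,
    # as in averaged ECG or evoked-potential recordings.
    t = np.linspace(0, 1, 200)
    template = np.exp(-((t - 0.5) ** 2) / 0.005)  # a single sharp peak

    n_epochs = 100
    epochs = template + rng.normal(0, 1.0, size=(n_epochs, t.size))

    def rms_noise(x):
        """Root-mean-square deviation from the known template."""
        return np.sqrt(np.mean((x - template) ** 2))

    single = rms_noise(epochs[0])
    averaged = rms_noise(epochs.mean(axis=0))
    print(single / averaged)  # roughly sqrt(100) = 10
    ```

    Averaging 100 epochs improves the noise level by about a factor of ten, which is why the technique is standard for weak evoked signals.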

  14. Extrusion Process by Finite Volume Method Using OpenFoam Software

    International Nuclear Information System (INIS)

    Matos Martins, Marcelo; Tonini Button, Sergio; Divo Bressan, Jose; Ivankovic, Alojz

    2011-01-01

    Computational codes are very important tools for solving engineering problems. The analysis of metal forming processes such as extrusion is no different, because computational codes allow analyzing the process at reduced cost. Traditionally, the Finite Element Method is used to solve solid mechanics problems; however, the Finite Volume Method (FVM) has been gaining ground in this field of applications. This paper presents the velocity field and friction coefficient variation results obtained by numerical simulation of an aluminum direct cold extrusion process, using the OpenFoam software and the FVM.

  15. Changing perspective on tissue processing - comparison of microwave histoprocessing method with the conventional method

    Directory of Open Access Journals (Sweden)

    G Shrestha

    2015-09-01

    Full Text Available Background: Histopathological examination of tissues requires a sliver of formalin-fixed tissue that has been chemically processed and then stained with Haematoxylin and Eosin. The time-honored conventional method of tissue processing, which requires 12 to 13 hours for completion, is employed at the majority of laboratories but is now seeing the

  16. Processing method of radioactive metal wastes

    International Nuclear Information System (INIS)

    Uetake, Naoto; Urata, Megumu; Sato, Masao.

    1985-01-01

    Purpose: To reduce the volume and increase the density of radioactive metal wastes easily, while preventing scattering of radioactivity, and to process them into a form suitable for storage and treatment. Method: Metal wastes mainly composed of zirconium are discharged from nuclear power plants and fuel reprocessing plants; metals such as zirconium and titanium react vigorously with hydrogen, which rapidly diffuses into them to form hydrides. Since the hydrides are extremely brittle and can be pulverized easily, they can be volume-reduced. However, since metal hydrides have no ductility, dehydrogenation is applied before molding in view of the subsequent storage and processing. The dehydrogenation is as easy as the hydrogenation, and the fine metal pieces can be molded in a small compression device. For the dehydrogenation, the temperature is slightly increased compared with that of the hydrogenation, the pressure is reduced through the vacuum evacuation system, and the removed hydrogen is purified for reuse. The upper limit for the hydrogenation temperature is 680 °C in order to prevent the scattering of radioactivity. (Kamimura, M.)

  17. Performance Analysis of Entropy Methods on K Means in Clustering Process

    Science.gov (United States)

    Dicky Syahputra Lubis, Mhd.; Mawengkang, Herman; Suwilo, Saib

    2017-12-01

    K-means is a non-hierarchical data clustering method that attempts to partition existing data into one or more clusters/groups, so that data with the same characteristics are grouped into the same cluster and data with different characteristics are grouped into other clusters. The purpose of this clustering is to minimize the objective function set in the clustering process, which generally attempts to minimize variation within a cluster and maximize the variation between clusters. However, a main disadvantage of this method is that the number k is often not known in advance; furthermore, a randomly chosen starting point may place two initial centroids close to each other. Therefore, for the determination of the starting point in K-means, the entropy method is used: a method that can be used to determine weights and make a decision from a set of alternatives. Entropy is able to investigate the harmony in discrimination among a multitude of data sets; criteria with the highest variation receive the highest weight. The entropy method can thus help the K-means process by determining the starting point, which is usually chosen at random, so that clustering converges faster than the standard K-means process. Using the postoperative patient dataset from the UCI Machine Learning Repository, with only 12 records as a calculation example, the entropy method obtained the desired end result with only 2 iterations.
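    The abstract does not spell out the exact seeding procedure, so the sketch below combines the standard entropy weighting method with a plain K-means loop: features with more variation receive higher entropy weights, and clustering then runs on the weighted data. The toy data and all names are illustrative assumptions, not the paper's dataset:

    ```python
    import numpy as np

    def entropy_weights(X):
        """Entropy weighting: features with more variation get higher weight."""
        P = X / X.sum(axis=0)                                  # column-wise proportions
        n = X.shape[0]
        e = -np.sum(P * np.log(P), axis=0) / np.log(n)         # entropy per feature
        d = 1 - e                                              # degree of diversification
        return d / d.sum()

    def kmeans(X, k, iters=50, seed=0):
        """Minimal K-means with random initial centroids."""
        rng = np.random.default_rng(seed)
        centers = X[rng.choice(len(X), k, replace=False)]
        for _ in range(iters):
            labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
            centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        return labels, centers

    # Two well-separated synthetic clusters with strictly positive features.
    X = np.vstack([np.random.default_rng(1).normal(0, 0.3, (20, 2)),
                   np.random.default_rng(2).normal(3, 0.3, (20, 2))]) + 5
    w = entropy_weights(X)
    labels, _ = kmeans(X * np.sqrt(w), k=2)
    ```

    Weighting the features by sqrt(w) makes squared Euclidean distances in K-means equal to the entropy-weighted distances, which is one common way to combine the two methods.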

  18. New Principles of Process Control in Geotechnics by Acoustic Methods

    OpenAIRE

    Leššo, I.; Flegner, P.; Pandula, B.; Horovčák, P.

    2007-01-01

    The contribution describes a new solution for the control of the rotary drilling process, an elementary process in geotechnics. The article presents the first results of research on the utilization of acoustic methods in the identification process for optimal control of rotary drilling.

  19. Ethnographic methods for process evaluations of complex health behaviour interventions.

    Science.gov (United States)

    Morgan-Trimmer, Sarah; Wood, Fiona

    2016-05-04

    This article outlines the contribution that ethnography could make to process evaluations for trials of complex health-behaviour interventions. Process evaluations are increasingly used to examine how health-behaviour interventions operate to produce outcomes and often employ qualitative methods to do this. Ethnography shares commonalities with the qualitative methods currently used in health-behaviour evaluations but has a distinctive approach over and above these methods. It is an overlooked methodology in trials of complex health-behaviour interventions that has much to contribute to the understanding of how interventions work. These benefits are discussed here with respect to three strengths of ethnographic methodology: (1) producing valid data, (2) understanding data within social contexts, and (3) building theory productively. The limitations of ethnography within the context of process evaluations are also discussed.

  20. A new decomposition method for parallel processing multi-level optimization

    International Nuclear Information System (INIS)

    Park, Hyung Wook; Kim, Min Soo; Choi, Dong Hoon

    2002-01-01

    In practical designs, most multidisciplinary problems have large and complicated design systems. Since multidisciplinary problems involve hundreds of analyses and thousands of variables, the grouping of the analyses and their order within each group affect the speed of the total design cycle. Therefore, it is very important to reorder and regroup the original design processes in order to minimize the total computational cost, by decomposing large multidisciplinary problems into several MultiDisciplinary Analysis SubSystems (MDASS) and processing them in parallel. In this study, a new decomposition method is proposed for parallel processing of multidisciplinary design optimization approaches such as Collaborative Optimization (CO) and the Individual Discipline Feasible (IDF) method. Numerical results for two example problems are presented to show the feasibility of the proposed method
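    The core idea of running decoupled subsystem analyses in parallel can be sketched with Python's standard executor. Here `aero_analysis` and `structure_analysis` are hypothetical placeholders for real MDASS solvers, not anything from the paper:

    ```python
    from concurrent.futures import ThreadPoolExecutor

    # Hypothetical independent subsystem analyses (stand-ins for expensive solvers).
    def aero_analysis(x):
        return x ** 2 + 1.0

    def structure_analysis(x):
        return 3.0 * x - 2.0

    def evaluate_parallel(x):
        """Run the decoupled subsystem analyses concurrently, then combine results."""
        with ThreadPoolExecutor() as pool:
            f_aero = pool.submit(aero_analysis, x)
            f_struct = pool.submit(structure_analysis, x)
            return f_aero.result() + f_struct.result()

    print(evaluate_parallel(2.0))  # 5.0 + 4.0 = 9.0
    ```

    With real solvers the analyses would run in separate processes or on separate nodes; the benefit appears only when the subsystems are genuinely independent, which is exactly what a decomposition method tries to arrange.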

  1. New Principles of Process Control in Geotechnics by Acoustic Methods

    Directory of Open Access Journals (Sweden)

    Leššo, I.

    2007-01-01

    Full Text Available The contribution describes a new solution for the control of the rotary drilling process, an elementary process in geotechnics. The article presents the first results of research on the utilization of acoustic methods in the identification process for optimal control of rotary drilling.

  2. A Multi-Objective Optimization Method to integrate Heat Pumps in Industrial Processes

    OpenAIRE

    Becker, Helen; Spinato, Giulia; Maréchal, François

    2011-01-01

    The aim of process integration methods is to increase the efficiency of industrial processes by using pinch analysis combined with process design methods. In this context, appropriately integrated utilities offer promising opportunities to reduce energy consumption, operating costs and pollutant emissions. Energy integration methods are able to integrate any type of predefined utility, but so far there is no systematic approach to generate potential utility models based on their technology limit...

  3. Methods of control the machining process

    Directory of Open Access Journals (Sweden)

    Yu.V. Petrakov

    2017-12-01

    Full Text Available This paper presents control methods differentiated by the time at which the information used is received: a priori, a posteriori and current. When a priori information is used, the cutting mode is determined by simulating the process of cutting the allowance, where the shapes of the workpiece and the part are represented as wireframes. Control using current information requires an adaptive control system and modernization of the CNC machine, whose input is computed using established optimization software. For control by a posteriori information, a method is proposed for correcting the shape-generating trajectory in the second pass based on measurement of the workpiece surface formed by the first pass. Programs have been developed that automatically generate the adjusted file for machining.

  4. A Situational Implementation Method for Business Process Management Systems

    NARCIS (Netherlands)

    R.L. Jansen; J.P.P. Ravensteyn

    For the integrated implementation of Business Process Management and supporting information systems many methods are available. Most of these methods, however, apply a one-size-fits-all approach and do not take into account the specific situation of the organization in which an information system is

  5. Psychophysical "blinding" methods reveal a functional hierarchy of unconscious visual processing.

    Science.gov (United States)

    Breitmeyer, Bruno G

    2015-09-01

    Numerous non-invasive experimental "blinding" methods exist for suppressing the phenomenal awareness of visual stimuli. Not all of these suppressive methods occur at, and thus index, the same level of unconscious visual processing. This suggests that a functional hierarchy of unconscious visual processing can in principle be established. The empirical results of extant studies that have used a number of different methods, along with additional reasonable theoretical considerations, suggest the following tentative hierarchy. At the highest level in this hierarchy is unconscious processing indexed by object-substitution masking. The functional levels indexed by crowding, the attentional blink (and other attentional blinding methods), backward pattern masking, metacontrast masking, continuous flash suppression, sandwich masking, and single-flash interocular suppression fall at progressively lower levels, while unconscious processing at the lowest levels is indexed by eye-based binocular-rivalry suppression. Although the unconscious processing levels indexed by additional blinding methods are yet to be determined, a tentative placement at lower levels in the hierarchy is also given for unconscious processing indexed by Troxler fading and adaptation-induced blindness, and at higher levels for attentional blinding effects in addition to the level indexed by the attentional blink. The full mapping of levels in the functional hierarchy onto cortical activation sites and levels is yet to be determined. The existence of such a hierarchy bears importantly on the search for, and the distinctions between, neural correlates of conscious and unconscious vision. Copyright © 2015 Elsevier Inc. All rights reserved.

  6. A Robust Photogrammetric Processing Method of Low-Altitude UAV Images

    Directory of Open Access Journals (Sweden)

    Mingyao Ai

    2015-02-01

    Full Text Available Low-altitude Unmanned Aerial Vehicle (UAV) images, which include distortion, illumination variance and large rotation angles, pose multiple challenges for image orientation and image processing. In this paper, a robust and convenient photogrammetric approach is proposed for processing low-altitude UAV images, involving a strip management method to automatically build a standardized regional aerial triangulation (AT) network, a parallel inner orientation algorithm, a ground control point (GCP) prediction method, and an improved Scale Invariant Feature Transform (SIFT) method to produce a large number of evenly distributed, reliable tie points for bundle adjustment (BA). A multi-view matching approach is improved to produce Digital Surface Models (DSM) and Digital Orthophoto Maps (DOM) for 3D visualization. Experimental results show that the proposed approach is robust and feasible for photogrammetric processing of low-altitude UAV images and 3D visualization of products.

  7. Two Undergraduate Process Modeling Courses Taught Using Inductive Learning Methods

    Science.gov (United States)

    Soroush, Masoud; Weinberger, Charles B.

    2010-01-01

    This manuscript presents a successful application of inductive learning in process modeling. It describes two process modeling courses that use inductive learning methods such as inquiry learning and problem-based learning, among others. The courses include a novel collection of multi-disciplinary complementary process modeling examples. They were…

  8. Research on the raw data processing method of the hydropower construction project

    Science.gov (United States)

    Tian, Zhichao

    2018-01-01

    In this paper, based on the characteristics of fixed quota data, various mathematical statistical analysis methods are compared and the improved Grubbs criterion is chosen to analyze the data, screening out unsuitable data through the processing. It is shown that this method can be applied to the processing of fixed raw data. This paper provides a reference for reasonably determining effective quota analysis data.
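    The classical Grubbs statistic behind the criterion mentioned above (the paper's "improved" variant is not specified) compares the largest deviation from the mean against the sample standard deviation; the critical value comes from a Grubbs table or a t-distribution. A minimal sketch with an illustrative data set:

    ```python
    import statistics

    def grubbs_statistic(data):
        """G = max deviation from the mean over the sample standard deviation."""
        mean = statistics.fmean(data)
        s = statistics.stdev(data)
        return max(abs(x - mean) for x in data) / s

    def grubbs_outlier(data, critical):
        """Return the suspect value if G exceeds the supplied critical value, else None."""
        if grubbs_statistic(data) > critical:
            mean = statistics.fmean(data)
            return max(data, key=lambda x: abs(x - mean))
        return None

    data = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 9.7, 10.1, 14.9]
    # The critical G for n = 9, alpha = 0.05 would come from a Grubbs table or scipy.
    print(grubbs_outlier(data, critical=2.21))  # flags 14.9
    ```

    In quota-data screening the test would be applied iteratively: remove the flagged value, recompute G on the remainder, and stop when no value exceeds the critical threshold.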

  9. Effect of processing methods on the mechanical properties of engineered bamboo

    OpenAIRE

    Sharma, Bhavna; Gatóo, Ana; Ramage, Michael H.

    2015-01-01

    Engineered bamboo is increasingly explored as a material with significant potential for structural applications. The material comprises raw bamboo processed into a laminated composite. Commercial methods vary due to the current primary use as an architectural surface material, with processing used to achieve different colours in the material. The present work investigates the effect of two types of processing methods, bleaching and caramelisation, to determine the effect on the mechanic...

  10. Minimal processing - preservation methods of the future: an overview

    International Nuclear Information System (INIS)

    Ohlsson, T.

    1994-01-01

    Minimal-processing technologies are modern techniques that provide sufficient shelf life to foods to allow their distribution, while also meeting the demands of the consumers for convenience and fresh-like quality. Minimal-processing technologies can be applied at various stages of the food distribution chain, in storage, in processing and/or in packaging. Examples of methods will be reviewed, including modified-atmosphere packaging, high-pressure treatment, sous-vide cooking and active packaging

  11. Operating cost budgeting methods: quantitative methods to improve the process

    Directory of Open Access Journals (Sweden)

    José Olegário Rodrigues da Silva

    Full Text Available Abstract Operating cost forecasts are used in economic feasibility studies of projects and in the budgeting process. Studies have pointed out that some companies are not satisfied with the budgeting process, and chief executive officers want updates more frequently. In these cases, the main problem lies in the costs versus the benefits. Companies seek simple and cheap forecasting methods that do not, at the same time, concede in terms of quality of the resulting information. This study aims to compare operating cost forecasting models to identify the ones that are relatively easy to implement and produce smaller deviations. For this purpose, we applied ARIMA (autoregressive integrated moving average) and distributed dynamic lag models to data from a Brazilian petroleum company. The results suggest that the models have potential application, and that the multivariate models fitted better and proved a better way to forecast costs than the univariate models.
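    In practice ARIMA models are fitted with a statistics package; as a dependency-free illustration of the same idea, the sketch below fits the simplest autoregressive model, AR(1), to a synthetic cost series by least squares and forecasts forward. The series and all parameters are invented for the example, not the company's data:

    ```python
    import numpy as np

    def fit_ar1(series):
        """Least-squares fit of x_t = c + phi * x_{t-1} + noise."""
        x_prev, x_next = series[:-1], series[1:]
        A = np.column_stack([np.ones_like(x_prev), x_prev])
        (c, phi), *_ = np.linalg.lstsq(A, x_next, rcond=None)
        return c, phi

    def forecast(series, steps, c, phi):
        """Iterate the fitted recursion forward from the last observation."""
        out, last = [], series[-1]
        for _ in range(steps):
            last = c + phi * last
            out.append(last)
        return out

    rng = np.random.default_rng(0)
    # Synthetic monthly operating costs reverting to a long-run mean of 100.
    costs = [100.0]
    for _ in range(199):
        costs.append(20 + 0.8 * costs[-1] + rng.normal(0, 2))
    costs = np.array(costs)

    c, phi = fit_ar1(costs)
    print(phi)  # close to the true 0.8
    ```

    Multi-step forecasts from this model decay geometrically toward the long-run mean c/(1-phi), which is the characteristic behaviour of stationary AR forecasts that the paper's univariate models share.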

  12. Possibilities of implementing nonthermal processing methods in the dairy industry

    OpenAIRE

    Irena Jeličić

    2010-01-01

    In the past two decades a lot of research in the field of food science has focused on new, non-thermal processing methods. This article describes the most intensively investigated new processing methods for implementation in the dairy industry, such as microfiltration, high hydrostatic pressure, ultrasound and pulsed electric fields. For each method an overview is given of the principle of microbial inactivation, the obtained results regarding reduction of microorganisms as well as the positive ...

  13. System and method for deriving a process-based specification

    Science.gov (United States)

    Hinchey, Michael Gerard (Inventor); Rash, James Larry (Inventor); Rouff, Christopher A. (Inventor)

    2009-01-01

    A system and method for deriving a process-based specification for a system is disclosed. The process-based specification is mathematically inferred from a trace-based specification. The trace-based specification is derived from a non-empty set of traces or natural language scenarios. The process-based specification is mathematically equivalent to the trace-based specification. Code is generated, if applicable, from the process-based specification. A process, or phases of a process, using the features disclosed can be reversed and repeated to allow for an interactive development and modification of legacy systems. The process is applicable to any class of system, including, but not limited to, biological and physical systems, electrical and electro-mechanical systems in addition to software, hardware and hybrid hardware-software systems.

  14. Efficient point cloud data processing in shipbuilding: Reformative component extraction method and registration method

    Directory of Open Access Journals (Sweden)

    Jingyu Sun

    2014-07-01

    Full Text Available To survive in the current shipbuilding industry, it is of vital importance for shipyards to have the accuracy of ship components evaluated efficiently during most of the manufacturing steps. Evaluating component accuracy by comparing each component's point cloud data, scanned by laser scanners, with the ship's design data in CAD format cannot be done efficiently when (1) the components extracted from the point cloud data include irregular obstacles, or (2) the registration of the two data sets has no clear direction setting. This paper presents reformative point cloud data processing methods to solve these problems. A k-d tree construction over the point cloud data speeds up the neighbor search for each point. A region growing method performed on the neighbor points of a seed point extracts the continuous part of a component, while curved surface fitting and B-spline curve fitting at the edge of the continuous part recognize neighboring domains of the same component divided by obstacles' shadows. The ICP (Iterative Closest Point) algorithm registers the two data sets after the proper registration direction is decided by principal component analysis. In experiments conducted at a shipyard, 200 curved shell plates were extracted from the scanned point cloud data and registered against the designed CAD data using the proposed methods for accuracy evaluation. Results show that the proposed methods efficiently support accuracy-evaluation-oriented point cloud data processing in practice.
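    The registration step described above rests on the SVD-based rigid alignment computed inside each ICP iteration. The sketch below (a generic Kabsch-style solver on synthetic data, not the authors' code) recovers a known rotation and translation when correspondences are given; in full ICP the correspondences would come from a nearest-neighbour search, e.g. over the k-d tree:

    ```python
    import numpy as np

    def best_rigid_transform(P, Q):
        """SVD-based rigid transform (R, t) minimising ||R @ p + t - q||,
        the core alignment step inside each ICP iteration."""
        cp, cq = P.mean(axis=0), Q.mean(axis=0)
        H = (P - cp).T @ (Q - cq)            # cross-covariance of centred sets
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:             # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = cq - R @ cp
        return R, t

    # A toy "scanned plate": rotate and shift a reference point set.
    rng = np.random.default_rng(0)
    P = rng.normal(size=(50, 3))
    theta = np.pi / 6
    Rz = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
    Q = P @ Rz.T + np.array([1.0, -2.0, 0.5])

    R, t = best_rigid_transform(P, Q)
    print(np.allclose(P @ R.T + t, Q))  # True: pose recovered exactly
    ```

    Iterating "find nearest neighbours, solve for (R, t), re-apply" until the error stops decreasing gives the full ICP loop used for the shell-plate registration.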

  15. Development of X-ray radiography examination technology by image processing method

    Energy Technology Data Exchange (ETDEWEB)

    Min, Duck Kee; Koo, Dae Seo; Kim, Eun Ka [Korea Atomic Energy Research Institute, Taejon (Korea)

    1998-06-01

    Because the dimensions of nuclear fuel rods can be measured rapidly and accurately by X-ray radiography examination, an image processing system composed of a 979 CCD-L camera, an image processing card and fluorescent lighting was set up, enabling image processing to be performed. An X-ray radiography examination technology enabling dimension measurement of nuclear fuel rods was developed using this image processing method. Dimension measurement of a standard fuel rod by the image processing method showed a 2% reduction in relative measuring error compared with X-ray radiography film, and was better by 100–200 μm in measuring accuracy. (author). 9 refs., 22 figs., 3 tabs.

  16. Method of processing radioactive rare gases

    International Nuclear Information System (INIS)

    Tagusagawa, Atsushi; Tuda, Kazuaki.

    1988-01-01

    Purpose: To obtain a safe processing method, without using mechanical pumps or pressure-proof containers and, accordingly, with no risk of leakage of radioactive rare gases. Method: A container filled with zeolite is placed, with its cover open, into an autoclave. Meanwhile, krypton-containing gases are supplied to an adsorption tower filled with adsorbents, cooled and adsorbed, and the tower is then heated to desorb the adsorbed krypton. The krypton-containing gases are introduced into the autoclave by the pressure difference, causing krypton to adsorb onto the zeolite at ambient temperature. Then the inside of the autoclave is heated to desorb krypton and adsorbed moisture from the zeolite, and the pressure is elevated. After sending the gases under pressure back to the adsorption tower, the zeolite-filled container is taken out of the autoclave, tightly closed and transferred to a predetermined site. (Takahashi, M.)

  17. Recovery process of elite athletes: A review of contemporary methods

    Directory of Open Access Journals (Sweden)

    Veljović Draško

    2012-01-01

    Numerous training stimuli, as well as competition, can reduce the level of athletes' abilities. This decline in performance can be a temporary phenomenon, lasting several minutes or several hours after a workout, or it can take much longer, even several days. Without an adequate recovery process, athletes may be unable to train at the desired intensity or to fully meet the tasks of the next training session. Chronic fatigue can lead to injuries; therefore, full recovery is necessary for achieving the optimal level of abilities that ensures better athletic performance. For these reasons, athletes often apply a variety of techniques and methods aimed at recovery after training or a match. These have become part of the training process, and their purpose is to reduce the stress and fatigue incurred through daily exposure to intense training stimuli. Numerous methods and techniques exist today that can accelerate the recovery process of athletes, so it is necessary to know the efficiency of each method applied in the training process. The aim of this review article is to present the methods currently in use and their effects on the recovery process after physical activity in elite sport.

  18. Bridging Technometric Method and Innovation Process: An Initial Study

    Science.gov (United States)

    Rumanti, A. A.; Reynaldo, R.; Samadhi, T. M. A. A.; Wiratmadja, I. I.; Dwita, A. C.

    2018-03-01

    The process of innovation is one of the ways used to increase the capability of a technology component so that it reflects the needs of an SME. The technometric method can be used to identify the level of technological advancement in an SME, and also which technology component needs to be maximized in order to deliver a significant innovation. This paper serves as an early study, laying out a conceptual framework that identifies and elaborates the principles of the innovation process from a well-established innovation model by Martin together with the technometric method, based on initial background research conducted at the SME Ira Silver in Jogjakarta, Indonesia.

  19. Method of processing radioactive liquid wastes

    International Nuclear Information System (INIS)

    Kurumada, Norimitsu; Shibata, Setsuo; Wakabayashi, Toshikatsu; Kuribayashi, Hiroshi.

    1984-01-01

    Purpose: To facilitate the processing of liquid wastes containing insoluble salts of boric acid and calcium in a process for the volume-reduction solidification of boron-containing radioactive liquid wastes. Method: A soluble calcium compound (such as calcium hydroxide, calcium oxide or calcium nitrate) is added to the liquid wastes, whose pH is adjusted to neutral or alkaline, such that the molar ratio of calcium to boron in the liquid wastes is at least 0.2. The mixture is then agitated at a temperature between 40 and 70 °C to form insoluble boron-containing calcium salts. Thereafter, the liquid is kept below the formation temperature to age the products, and is then evaporated into a concentrate containing 30-80% by weight of solid components. The concentrated liquid is mixed with cement to solidify. (Ikeda, J.)

  20. The OptD-multi method in LiDAR processing

    International Nuclear Information System (INIS)

    Błaszczak-Bąk, Wioleta; Sobieraj-Żłobińska, Anna; Kowalik, Michał

    2017-01-01

    New and constantly developing technologies for acquiring spatial data, such as LiDAR (light detection and ranging), are a source of large volumes of data. However, such an amount of data is not always needed for developing the most popular LiDAR products: the digital terrain model (DTM) or the digital surface model. Therefore, in many cases, the number of points is reduced in the pre-processing stage. The degree of reduction is determined by the algorithm used, which should enable the user to obtain a dataset appropriate and optimal for the planned purpose. The aim of this article is to propose a new Optimum Dataset method (OptD method) for the processing of LiDAR point clouds. The OptD method can reduce the number of points in a dataset according to specified optimization criteria concerning the characteristics of the generated DTM. The OptD method can be used in two variants: OptD-single (one optimization criterion) and OptD-multi (two or more optimization criteria). The OptD-single method has been thoroughly tested and presented by Błaszczak-Bąk (2016 Acta Geodyn. Geomater. 13/4 379-86). In this paper the authors discuss the OptD-multi method. (paper)

  1. OPTIMAL SIGNAL PROCESSING METHODS IN GPR

    Directory of Open Access Journals (Sweden)

    Saeid Karamzadeh

    2014-01-01

    In the past three decades, many applications of Ground Penetrating Radar (GPR) have emerged in real life. This radar faces important challenges in civil applications as well as in military applications. In this paper, the fundamentals of GPR systems are covered, and three important signal processing methods (the Wavelet Transform, the Matched Filter and the Hilbert-Huang transform) are compared with each other in order to obtain the most accurate information about objects in the subsurface or behind a wall.
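
    Of the three methods compared, the matched filter is the simplest to sketch. A minimal, illustrative example (synthetic trace and wavelet, not from the paper): correlate the received trace with the known transmitted wavelet; the correlation peak then marks the reflector's position.

```python
import numpy as np

# A noisy GPR-like trace: a known pulse buried in Gaussian noise.
rng = np.random.default_rng(1)
pulse = np.exp(-0.5 * ((np.arange(15) - 7) / 2.0) ** 2)   # reference wavelet
trace = rng.normal(scale=0.3, size=256)
true_pos = 100
trace[true_pos:true_pos + 15] += pulse

# Matched filtering = correlation of the trace with the known wavelet;
# the maximum of the output locates the buried pulse.
out = np.correlate(trace, pulse, mode="valid")
detected = int(np.argmax(out))       # should land near true_pos
```

The detected index is only approximately `true_pos` in general, since noise can shift the peak by a sample or two.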

  2. USING THE ANALYTIC HIERARCHY PROCESS (AHP) METHOD IN RURAL DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Tülay Cengiz

    2003-04-01

    Rural development is a body of economic and social policies aimed at improving living conditions in rural areas by enabling the rural population to enjoy the economic, social, cultural and technological blessings of city life in place, without migrating. As this description makes clear, rural development is a very broad concept. Therefore, in development efforts the problem should be stated clearly and analyzed, and many criteria should be evaluated by experts. The Analytic Hierarchy Process (AHP) method can be utilized at three stages of development efforts. The AHP method is one of the multi-criteria decision-making methods. After decomposing a problem into smaller pieces, the relative importance and level of importance of two compared elements are determined. It allows the evaluation of both qualitative and quantitative factors, and at the same time it permits the opinions of many experts to be gathered and used in the decision process. Because of these features, the AHP method can be used in rural development work. In this article, cultural factors, one of the important components of rural development that is often ignored in many studies, were evaluated as an example. As a result of these applications and evaluations, it is concluded that the AHP method can be helpful in rural development efforts.
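
    The AHP computation the article relies on — turning a reciprocal pairwise comparison matrix into priority weights — can be sketched as follows (the 3×3 matrix below is purely illustrative, not from the study):

```python
import numpy as np

# Illustrative pairwise comparison matrix for three criteria
# (e.g. economic, social, cultural): A[i, j] is the importance of
# criterion i over criterion j on Saaty's 1-9 scale, so A must be
# reciprocal (A[j, i] = 1 / A[i, j]).
A = np.array([[1.0,   3.0, 5.0],
              [1/3.0, 1.0, 2.0],
              [1/5.0, 1/2.0, 1.0]])

# AHP priority weights = normalized principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
k = int(np.argmax(eigvals.real))
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1); Saaty suggests
# CR = CI / RI < 0.1 as acceptable (RI = 0.58 for n = 3).
n = A.shape[0]
CI = (eigvals.real[k] - n) / (n - 1)
CR = CI / 0.58
```

For this near-consistent matrix the first criterion receives the largest weight, and CR stays well below the 0.1 threshold.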

  3. Implementation of a new rapid tissue processing method--advantages and challenges

    DEFF Research Database (Denmark)

    Munkholm, Julie; Talman, Maj-Lis; Hasselager, Thomas

    2008-01-01

    Conventional tissue processing of histologic specimens has been carried out in the same manner for many years. It is a time-consuming process involving batch production, resulting in a 1-day delay of the diagnosis. Microwave-assisted tissue processing enables a continuous high flow of histologic...... specimens through the processor with a processing time of as low as 1h. In this article, we present the effects of the automated microwave-assisted tissue processor on the histomorphologic quality and the turnaround time (TAT) for histopathology reports. We present a blind comparative study regarding...... the histomorphologic quality of microwave-processed and conventionally processed tissue samples. A total of 333 specimens were included. The microwave-assisted processing method showed a histomorphologic quality comparable to the conventional method for a number of tissue types, including skin and specimens from...

  4. Extension of moment projection method to the fragmentation process

    International Nuclear Information System (INIS)

    Wu, Shaohua; Yapp, Edward K.Y.; Akroyd, Jethro; Mosbach, Sebastian; Xu, Rong; Yang, Wenming; Kraft, Markus

    2017-01-01

    The method of moments is a simple but efficient method of solving the population balance equation which describes particle dynamics. Recently, the moment projection method (MPM) was proposed and validated for particle inception, coagulation, growth and, more importantly, shrinkage; here the method is extended to include the fragmentation process. The performance of MPM is tested for 13 different test cases for different fragmentation kernels, fragment distribution functions and initial conditions. Comparisons are made with the quadrature method of moments (QMOM), hybrid method of moments (HMOM) and a high-precision stochastic solution calculated using the established direct simulation algorithm (DSA) and advantages of MPM are drawn.

  5. Extension of moment projection method to the fragmentation process

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Shaohua [Department of Mechanical Engineering, National University of Singapore, Engineering Block EA, Engineering Drive 1, 117576 (Singapore); Yapp, Edward K.Y.; Akroyd, Jethro; Mosbach, Sebastian [Department of Chemical Engineering and Biotechnology, University of Cambridge, New Museums Site, Pembroke Street, Cambridge, CB2 3RA (United Kingdom); Xu, Rong [School of Chemical and Biomedical Engineering, Nanyang Technological University, 62 Nanyang Drive, 637459 (Singapore); Yang, Wenming [Department of Mechanical Engineering, National University of Singapore, Engineering Block EA, Engineering Drive 1, 117576 (Singapore); Kraft, Markus, E-mail: mk306@cam.ac.uk [Department of Chemical Engineering and Biotechnology, University of Cambridge, New Museums Site, Pembroke Street, Cambridge, CB2 3RA (United Kingdom); School of Chemical and Biomedical Engineering, Nanyang Technological University, 62 Nanyang Drive, 637459 (Singapore)

    2017-04-15

    The method of moments is a simple but efficient method of solving the population balance equation which describes particle dynamics. Recently, the moment projection method (MPM) was proposed and validated for particle inception, coagulation, growth and, more importantly, shrinkage; here the method is extended to include the fragmentation process. The performance of MPM is tested for 13 different test cases for different fragmentation kernels, fragment distribution functions and initial conditions. Comparisons are made with the quadrature method of moments (QMOM), hybrid method of moments (HMOM) and a high-precision stochastic solution calculated using the established direct simulation algorithm (DSA) and advantages of MPM are drawn.

  6. Standard CMMI℠ Appraisal Method for Process Improvement (SCAMPI℠), Version 1.1: Method Definition Document

    National Research Council Canada - National Science Library

    2001-01-01

    The Standard CMMI Appraisal Method for Process Improvement (SCAMPI℠) is designed to provide benchmark-quality ratings relative to Capability Maturity Model® Integration (CMMI℠) models...

  7. A digital processing method for the analysis of complex nuclear spectra

    International Nuclear Information System (INIS)

    Madan, V.K.; Abani, M.C.; Bairi, B.R.

    1994-01-01

    This paper describes a digital processing method using frequency power spectra for the analysis of complex nuclear spectra. The power spectra were estimated by employing a modified discrete Fourier transform, and the method was applied to observed spectral envelopes. The results for separating closely spaced doublets in nuclear spectra of low statistical precision compared favorably with those obtained using the popular peak-fitting program SAMPO. The paper also describes the limitations of peak-fitting methods, and the advantages of digital processing techniques for type II digital signals, including nuclear spectra. A compact computer program occupying less than 2.5 kByte of memory was written in BASIC for the processing of observed spectral envelopes. (orig.)

  8. The Influence of Different Processing Methods on Component Content of Sophora japonica

    Science.gov (United States)

    Ji, Y. B.; Zhu, H. J.; Xin, G. S.; Wei, C.

    2017-12-01

    The purpose of this experiment was to understand the effect of different processing methods on the content of active ingredients in Sophora japonica, by determining the content of rutin and quercetin in Sophora japonica under different processing methods using UV spectrophotometry, so as to compare the effect of the processing methods on the active ingredient content. The experiments showed the following order of rutin content: fried Sophora japonica > vinegar-processed Sophora > raw Sophora japonica > charred Sophora flower, with no obvious difference between vinegar-processed and fried Sophora japonica; and of quercetin content: charred Sophora flower > fried Sophora japonica > vinegar-processed Sophora > raw Sophora japonica. This demonstrates that there are differences in the content of active ingredients in Sophora japonica under different processing methods. The rutin content increased with processing temperature but decreased beyond a certain temperature, while the quercetin content increased gradually with time.

  9. Review of conventional and novel food processing methods on food allergens.

    Science.gov (United States)

    Vanga, Sai Kranthi; Singh, Ashutosh; Raghavan, Vijaya

    2017-07-03

    With the turn of this century, novel food processing techniques have become commercially very important because of their profound advantages over traditional methods. These novel processing methods tend to preserve the characteristic properties of food, including its organoleptic and nutritional qualities, better than conventional food processing methods. Over the same period, there has been a clear rise in the populations suffering from food allergies, especially infants and children. Although this trend is widely attributed to the changing lifestyles of populations in both developed and developing nations, and to the introduction of new food habits with the advent of novel foods and new processing techniques, their exact role is still uncertain. Under these circumstances, it is very important to understand the structural changes in proteins as food is processed, to comprehend whether a specific processing technique (conventional or novel) increases or mitigates allergenicity. Various modern means are now being employed to understand the conformational changes in proteins that can affect allergenicity. In this review, the effects of processing on protein structure and allergenicity are discussed, along with the implications of recent studies and techniques, to establish a platform for future pathways to reduce or eliminate allergenicity in the population.

  10. Systematic Development of Miniaturized (Bio)Processes using Process Systems Engineering (PSE) Methods and Tools

    DEFF Research Database (Denmark)

    Krühne, Ulrich; Larsson, Hilde; Heintz, Søren

    2014-01-01

    The focus of this work is on process systems engineering (PSE) methods and tools, and especially on how such PSE methods and tools can be used to accelerate and support systematic bioprocess development at a miniature scale. After a short presentation of the PSE methods and the bioprocess...... development drivers, three case studies are presented. In the first example it is demonstrated how experimental investigations of the bi-enzymatic production of lactobionic acid can be modeled with help of a new mechanistic mathematical model. The reaction was performed at lab scale and the prediction quality...

  11. Method of processing radioactive wastes

    International Nuclear Information System (INIS)

    Katada, Katsuo.

    1986-01-01

    Purpose: To improve the management of radioactive waste containers, thereby decreasing the amount of stored matter, by arranging the containers in the order of their radioactivity levels. Method: The radiation doses of the radioactive waste containers arranged in the storage area are measured with a dosemeter before volume-reducing treatment. A classifying machine then hoists the containers in the order of their radiation levels, and the containers are sent out through a conveyor, a surface contamination gauge, a weight measuring device and a switcher to a volume-reducing processing machine. The volume-reduced products are packed, several units at a time, into storage containers. Thus, the storage containers can be transferred in an assembled state after being stored for a certain period of time. (Kawakami, Y.)

  12. Decision Support Methods for Supply Processes in the Floral Industry

    Directory of Open Access Journals (Sweden)

    Kutyba Agata

    2017-12-01

    The aim of this paper was to show the application of the ABC analysis and the AHP (a multi-criteria method for hierarchical analysis of decision processes) as an important part of decision making in supply processes realized in the floral industry. The ABC analysis was performed in order to classify the product mix from the perspective of demand values. This in consequence enabled us to identify the most important products, which were then used as variants in the AHP method.

  13. Measurement of company effectiveness using analytic network process method

    Science.gov (United States)

    Goran, Janjić; Zorana, Tanasić; Borut, Kosec

    2017-07-01

    The sustainable development of an organisation is monitored through the organisation's performance, which incorporates all stakeholders' requirements into its strategy beforehand. The strategic management concept enables organisations to monitor and evaluate their effectiveness, along with their efficiency, by monitoring the implementation of the set strategic goals. In the process of monitoring and measuring effectiveness, an organisation can use multiple-criteria decision-making methods for help. This study uses the analytic network process (ANP) method to define the weight factors of the mutual influences of all the important elements of an organisation's strategy. The calculation of an organisation's effectiveness is based on the weight factors and the degree of fulfilment of the target values of the strategic map measures. New business conditions change the importance of particular elements of an organisation's business in relation to competitive advantage on the market, with increasing emphasis given to non-material resources in the process of selecting the organisation's most important measures.

  14. THE BASE OF THE METHODICAL DESIGN AND IMPLEMENTATION OF ENGINEERING EDUCATION PROCESS

    Directory of Open Access Journals (Sweden)

    Renata Lis

    2012-12-01

    The article is devoted to the methodology of implementing the European and national qualifications frameworks in the academic process. It consists of two parts: the methodology for designing degree programmes and classes, and the methodology of the teaching process.

  15. Data-driven fault detection for industrial processes: canonical correlation analysis and projection based methods

    CERN Document Server

    Chen, Zhiwen

    2017-01-01

    Zhiwen Chen aims to develop advanced fault detection (FD) methods for the monitoring of industrial processes. With the ever-increasing demands on reliability and safety in industrial processes, fault detection has become an important issue. Although model-based fault detection theory has been well studied in the past decades, its application to large-scale industrial processes is limited because it is difficult to build accurate models. Furthermore, motivated by the limitations of existing data-driven FD methods, novel canonical correlation analysis (CCA) and projection-based methods are proposed from the perspectives of process input and output data, less engineering effort and wide application scope. For performance evaluation of FD methods, a new index is also developed. Contents: A New Index for Performance Evaluation of FD Methods; CCA-based FD Method for the Monitoring of Stationary Processes; Projection-based FD Method for the Monitoring of Dynamic Processes; Benchmark Study and Real-Time Implementat...

  16. Nonparametric Inference of Doubly Stochastic Poisson Process Data via the Kernel Method.

    Science.gov (United States)

    Zhang, Tingting; Kou, S C

    2010-01-01

    Doubly stochastic Poisson processes, also known as the Cox processes, frequently occur in various scientific fields. In this article, motivated primarily by analyzing Cox process data in biophysics, we propose a nonparametric kernel-based inference method. We conduct a detailed study, including an asymptotic analysis, of the proposed method, and provide guidelines for its practical use, introducing a fast and stable regression method for bandwidth selection. We apply our method to real photon arrival data from recent single-molecule biophysical experiments, investigating proteins' conformational dynamics. Our result shows that conformational fluctuation is widely present in protein systems, and that the fluctuation covers a broad range of time scales, highlighting the dynamic and complex nature of proteins' structure.
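
    A kernel-based intensity estimate of the kind the article builds on can be sketched in a few lines (a plain Gaussian kernel estimator on made-up event times; the paper's regression-based bandwidth selection is not reproduced here):

```python
import math

def kernel_intensity(event_times, t, h):
    """Gaussian-kernel estimate of the intensity lambda(t) of a point
    process from observed event times (h is the bandwidth)."""
    norm = 1.0 / (h * math.sqrt(2.0 * math.pi))
    return sum(norm * math.exp(-0.5 * ((t - s) / h) ** 2)
               for s in event_times)

# Events cluster near t = 5, so the estimated intensity there should
# greatly exceed the estimate far from any event.
events = [4.2, 4.8, 5.1, 5.5, 9.7]
lam_near = kernel_intensity(events, 5.0, 1.0)
lam_far = kernel_intensity(events, 20.0, 1.0)
```

For Cox (doubly stochastic Poisson) data, this smoothed intensity is an estimate of the realized random intensity; the choice of `h` controls the bias-variance trade-off that the paper's bandwidth-selection method addresses.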

  17. Grasping devices and methods in automated production processes

    DEFF Research Database (Denmark)

    Fantoni, Gualtiero; Santochi, Marco; Dini, Gino

    2014-01-01

    In automated production processes, grasping devices and methods play a crucial role in the handling of many parts, components and products. This keynote paper starts with a classification of grasping phases and describes how different principles are adopted at different scales in different applications (from assembly to disassembly, from aerospace to the food industry, from textile to logistics). Finally, the most recent research is reviewed in order to introduce the new trends in grasping, providing an outlook on the future of both grippers and robotic hands in automated production processes.

  18. Process-tracing methods in decision making: on growing up in the 70s

    NARCIS (Netherlands)

    Schulte-Mecklenbeck, M.; Johnson, J.G.; Böckenholt, U.; Goldstein, D.G.; Russo, J.E.; Sullivan, N.J.; Willemsen, M.C.

    2017-01-01

    Decision research has experienced a shift from simple algebraic theories of choice to an appreciation of mental processes underlying choice. A variety of process-tracing methods has helped researchers test these process explanations. Here, we provide a survey of these methods, including specific

  19. Comparison of pre-processing methods for multiplex bead-based immunoassays.

    Science.gov (United States)

    Rausch, Tanja K; Schillert, Arne; Ziegler, Andreas; Lüking, Angelika; Zucht, Hans-Dieter; Schulz-Knappe, Peter

    2016-08-11

    High-throughput protein expression studies can be performed using bead-based protein immunoassays, such as the Luminex® xMAP® technology. Technical variability is inherent to these experiments and may lead to systematic bias and reduced power. To reduce technical variability, data pre-processing is performed; however, no recommendations exist for the pre-processing of Luminex® xMAP® data. We compared 37 different data pre-processing combinations of transformation and normalization methods in 42 samples on 384 analytes obtained from a multiplex immunoassay based on the Luminex® xMAP® technology. We evaluated the performance of each pre-processing approach with 6 different performance criteria, three of which were graphical (plots); all plots were evaluated by 15 independent and blinded readers. Four different combinations of transformation and normalization methods performed well as pre-processing procedures for this bead-based protein immunoassay. The following combinations were suitable for pre-processing Luminex® xMAP® data in this study: weighted Box-Cox transformation followed by quantile or robust spline normalization (rsn), asinh transformation followed by loess normalization, and Box-Cox transformation followed by rsn.
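
    Two of the ingredients named above — the asinh transformation and quantile normalization — can be sketched generically (toy data, cofactor chosen arbitrarily; this is not the authors' exact pipeline):

```python
import numpy as np

def asinh_transform(X, cofactor=5.0):
    """Variance-stabilizing asinh transform, as commonly applied to
    bead-array intensities (the cofactor is a tuning choice)."""
    return np.arcsinh(X / cofactor)

def quantile_normalize(X):
    """Force every sample (column) of X to share the same empirical
    distribution: replace each column's sorted values by the mean of
    the sorted values across columns."""
    order = np.argsort(X, axis=0)
    ranks = np.argsort(order, axis=0)            # rank of each entry per column
    mean_sorted = np.sort(X, axis=0).mean(axis=1)
    return mean_sorted[ranks]

# Toy data: 4 analytes (rows) x 3 samples (columns).
X = np.array([[5.0, 4.0, 3.0],
              [2.0, 1.0, 4.0],
              [3.0, 4.0, 6.0],
              [4.0, 2.0, 8.0]])
Xn = quantile_normalize(asinh_transform(X))
```

After normalization, the sorted values of every column are identical, which is exactly what removes sample-to-sample distributional shifts.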

  20. Methods of Complex Data Processing from Technical Means of Monitoring

    Directory of Open Access Journals (Sweden)

    Serhii Tymchuk

    2017-03-01

    The problem of processing information from different types of monitoring equipment is examined. As a possible solution, the use of generalized information processing methods is proposed, based on clustering of combined territorial information sources for monitoring and on the use of a frame model of the knowledge base for identification of monitoring objects. The clustering methods were formed on the basis of the Lance-Williams hierarchical agglomerative procedure using the Ward metric. The frame model of the knowledge base was built using the tools of object-oriented modeling.
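
    The clustering core named in the abstract — the Lance-Williams agglomerative recurrence with Ward's coefficients — can be sketched in plain Python (an illustrative implementation, not the authors' code):

```python
import itertools

def ward_cluster(points, n_clusters):
    """Agglomerative clustering with the Ward criterion, updating
    squared inter-cluster distances via the Lance-Williams recurrence."""
    size = {i: 1 for i in range(len(points))}
    members = {i: [i] for i in range(len(points))}
    d2 = {}  # squared distances, keyed by (smaller id, larger id)
    for i, j in itertools.combinations(range(len(points)), 2):
        d2[(i, j)] = sum((a - b) ** 2 for a, b in zip(points[i], points[j]))
    nxt = len(points)
    while len(size) > n_clusters:
        i, j = min(d2, key=d2.get)          # cheapest Ward merge
        dij = d2.pop((i, j))
        new = nxt
        nxt += 1
        for k in list(size):
            if k in (i, j):
                continue
            dki = d2.pop((min(i, k), max(i, k)))
            dkj = d2.pop((min(j, k), max(j, k)))
            ni, nj, nk = size[i], size[j], size[k]
            # Lance-Williams update with Ward coefficients.
            d2[(k, new)] = ((ni + nk) * dki + (nj + nk) * dkj
                            - nk * dij) / (ni + nj + nk)
        size[new] = size.pop(i) + size.pop(j)
        members[new] = members.pop(i) + members.pop(j)
    return [sorted(m) for m in members.values()]

# Two well-separated pairs of points collapse into two clusters.
clusters = ward_cluster([(0, 0), (0, 1), (10, 10), (10, 11)], 2)
```

Production code would normally use an optimized library routine; the point here is only that the Ward update is a one-line recurrence over squared distances.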

  1. A review of experiment data processing method for uranium mining and metallurgy in BRICEM

    International Nuclear Information System (INIS)

    Ye Guoqiang; Lu Kehong; Wang Congying

    1997-01-01

    The authors investigate the methods of experimental data processing used in the Beijing Research Institute of Chemical Engineering and Metallurgy (BRICEM). It turns out that error analysis is used to process experimental data, single-factor experimentation and orthogonal test design methods are adopted for arranging tests, and regression analysis and mathematical process simulation are applied to build mathematical models for uranium mining and metallurgy. The above-mentioned methods lay a foundation for the utilization of mathematical statistics in this field.

  2. Methods of gated-blood-pool-spect data processing

    International Nuclear Information System (INIS)

    Kosa, I.; Mester, J.; Tanaka, M.; Csernay, L.; Mate, E.; Szasz, K.

    1991-01-01

    Three techniques of gated SPECT were evaluated. The methods of integral SPECT (ISPECT), end-diastole/end-systole SPECT (ED-ES SPECT) and Fourier SPECT were adapted and developed on the Hungarian nuclear medicine data processing system microSEGAMS. The methods are based on data reduction before back projection, which results in processing times acceptable for clinical routine. The clinical performance of the introduced techniques was tested in 10 patients with old posterior myocardial infarction and in 5 patients without cardiac disease. The left ventricular ejection fraction (EF) determined by ISPECT correlated well with the planar values; the correlation coefficient was 0.89. The correlation coefficient between EF values determined by ED-ES SPECT and planar radionuclide ventriculography was lower (0.70). For the identification of left ventricular wall motion abnormalities, ED-ES SPECT and Fourier SPECT exhibited favourable performance, but ISPECT only moderate suitability. In the detection of regional phase delay, Fourier SPECT demonstrated higher sensitivity than planar radionuclide ventriculography. (author) 4 refs.; 3 figs.; 2 tabs

  3. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-03-01

    Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations (COSO), 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957) lament that although the introduction of risk quantification to enhance the degree of objectivity developed in finance largely in parallel with its development in the manufacturing industry, the same is not true in Higher Education Institutions (HEIs). In this regard, the objective of the paper is to demonstrate the methods and process of Quantitative Risk Analysis (QRA) through the likelihood of occurrence of risk (phase I). This paper serves as the first of a two-phase study, which sampled one hundred (100) risk analysts at a university in the greater Eastern Cape Province of South Africa. The analysis of the likelihood of occurrence of risk, by logistic regression and percentages, was conducted to investigate whether or not there was a significant difference between groups (analysts) in respect of QRA. The Hosmer-Lemeshow test was non-significant, with a chi-square of χ² = 8.181 (p = 0.300), which indicated a good model fit, since the data did not deviate significantly from the model. The study concluded that to derive an overall likelihood rating indicating the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1) threat-source motivation and capability, (2) the nature of the vulnerability, and (3) the existence and effectiveness of current controls (methods and process).

  4. Method for innovative synthesis-design of chemical process flowsheets

    DEFF Research Database (Denmark)

    Kumar Tula, Anjan; Gani, Rafiqul

    Chemical process synthesis-design involves the identification of the processing route to reach a desired product from a specified set of raw materials, the design of the operations involved in the processing route, the calculation of utility requirements, the calculation of waste and emissions to the surroundings, and much more. Different methods (knowledge-based [1], mathematical programming [2], hybrid, etc.) have been proposed and are currently employed to solve these synthesis-design problems. D'Anterroches [3] proposed a group contribution based approach to solve the synthesis-design problem of chemical processes, in which chemical process flowsheets can be synthesized in the same way as atoms or groups of atoms are combined to form molecules in computer aided molecular design (CAMD) techniques [4]: from a library of building blocks (functional process-groups) and a set of rules to join

  5. Assessing Commercial and Alternative Poultry Processing Methods using Microbiome Analyses

    Science.gov (United States)

    Assessing poultry processing methods/strategies has historically used culture-based methods to assess bacterial changes or reductions, both in terms of general microbial communities (e.g. total aerobic bacteria) or zoonotic pathogens of interest (e.g. Salmonella, Campylobacter). The advent of next ...

  6. Development of Auto-Seeding System Using Image Processing Technology in the Sapphire Crystal Growth Process via the Kyropoulos Method

    Directory of Open Access Journals (Sweden)

    Churl Min Kim

    2017-04-01

    Full Text Available The Kyropoulos (Ky) and Czochralski (Cz) methods of crystal growth are used for large-diameter single crystals. The seeding process in these methods must induce initial crystallization by initiating contact between the seed crystal and the surface of the melted material. In the Ky and Cz methods, the seeding process lays the foundation for ingot growth during the entire growth process. When any defect occurs in this process, it is likely to spread to the entire ingot. In this paper, a vision system was constructed for auto-seeding and for observing the surface of the melt in the Ky method. An algorithm was developed to detect the time when the internal convection of the melt is stabilized by observing the shape of the spoke pattern on the melt surface. Then, the vision system and algorithm were applied to the growth furnace, and the possibility of process automation was examined for sapphire growth. To confirm that the convection of the melt was stabilized, the position of the island (i.e., the center of a spoke pattern) was detected using the vision system and image processing. When the observed coordinates of the center of the island were compared with the coordinates detected by the image processing algorithm, there was an average error of 1.87 mm (based on an image of 1024 × 768 pixels).
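
    Locating the center of a bright feature in a melt image can, for instance, be done with simple intensity moments of a thresholded region. This is a hypothetical sketch on a synthetic frame, not the authors' algorithm:

```python
import numpy as np

def island_center(img, thresh):
    """Estimate the center of a bright 'island' in a grayscale frame
    as the centroid (first moments) of the thresholded region."""
    mask = (img > thresh).astype(float)
    total = mask.sum()
    if total == 0:
        return None
    ys, xs = np.indices(img.shape)
    cy = (ys * mask).sum() / total
    cx = (xs * mask).sum() / total
    return cx, cy

# synthetic frame (scaled-down stand-in for a 1024 x 768 image) with a
# Gaussian bright spot at a known position plus mild sensor noise
h, w = 192, 256
ys, xs = np.indices((h, w))
true_cx, true_cy = 140.0, 80.0
img = np.exp(-((xs - true_cx) ** 2 + (ys - true_cy) ** 2) / (2 * 15.0 ** 2))
img += 0.01 * np.random.default_rng(0).standard_normal((h, w))
cx, cy = island_center(img, thresh=0.5)
```

    With a roughly symmetric bright spot, the centroid recovers the true center to sub-pixel accuracy, consistent with the millimetre-level errors reported above.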

  7. Influence of harvesting and processing methods on organic viability of soybean seed

    Directory of Open Access Journals (Sweden)

    Đukanović Lana

    2000-01-01

    Full Text Available The organic viability of seed of three elite soybean varieties (Bosa, ZPS 015 and Nena), depending on the method of seed handling during the harvesting and processing phases, was determined in this paper. The trial was conducted at Zemun Polje during 1999; manual and mechanized harvesting and processing methods were applied. Seed germination was tested using ISTA methods (standard method and cold test). The following parameters were evaluated: germination viability, germination, rate (speed) of emergence, and length of the hypocotyl and main root. Rate of emergence was based on the number of emerged plants per day. The length of the hypocotyl or root and the percentage of germination determined the vigour index. Based on the results obtained, it may be concluded that the method of seed handling during the harvesting and processing phases influenced the soybean seed quality parameters evaluated: the methods examined reduced the organic viability of soybean seed by decreasing germination viability, total germination, and the length of the main root.

  8. METHOD OF DISPLAYING EXECUTABLE BUSINESS PROCESS MODELS INTO PETRI NETS

    Directory of Open Access Journals (Sweden)

    Igor G. Fedorov

    2013-01-01

    Full Text Available Executable business process models, like programs, require evidence that they complete without defects. Methods based on the formalism of Petri nets are widely used: a business process is mapped to a Petri net, and the properties of the process are established by analysing the properties of the net. The aim is to study methods of mapping an executable business process model into a Petri net. Analysis of the properties of the resulting model allows us to prove a number of important properties: that it is a free-choice net and is clean, without looping.
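
    The token-game semantics underlying such mappings can be sketched minimally; the toy net and its transition names below are hypothetical, not taken from the paper:

```python
# Minimal place/transition Petri net: a transition is enabled when every
# input place holds enough tokens; firing consumes input tokens and
# produces output tokens.
def enabled(marking, pre):
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

# toy business process: start -> task -> done
transitions = {
    "begin_task": ({"start": 1}, {"busy": 1}),  # (pre-places, post-places)
    "end_task":   ({"busy": 1},  {"done": 1}),
}
m = {"start": 1}
m = fire(m, *transitions["begin_task"])
m = fire(m, *transitions["end_task"])
```

    Soundness checks of a mapped business process (no stuck tokens, a reachable final marking) reduce to reachability questions over exactly this firing rule.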

  9. High-resolution imaging methods in array signal processing

    DEFF Research Database (Denmark)

    Xenaki, Angeliki

    in active sonar signal processing for detection and imaging of submerged oil contamination in sea water from a deep-water oil leak. The submerged oil field is modeled as a fluid medium exhibiting spatial perturbations in the acoustic parameters from their mean ambient values, which cause weak scattering...... of the incident acoustic energy. A high-frequency active sonar is selected to insonify the medium and receive the backscattered waves. High-frequency acoustic methods can both overcome the optical opacity of water (unlike methods based on electromagnetic waves) and resolve the small-scale structure...... of the submerged oil field (unlike low-frequency acoustic methods). The study shows that high-frequency acoustic methods are suitable not only for large-scale localization of the oil contamination in the water column but also for statistical characterization of the submerged oil field through inference

  10. Application of remote sensing methods and GIS in erosive process investigations

    Directory of Open Access Journals (Sweden)

    Mustafić Sanja

    2007-01-01

    Full Text Available Modern geomorphological investigations of the condition and changing intensity of erosive processes should be based on remote sensing methods, i.e., on the processing of aerial and satellite photographs. The use of these methods is important because it offers good possibilities for establishing the regional relations of the investigated phenomenon, as well as for estimating the spatial and temporal variability of all physical-geographical and anthropogenic factors influencing the given process. Understanding the process of land erosion as a whole is only possible by creating a universal database and by using appropriate software, that is, by establishing a uniform information system. A geographical information system, as the most effective, complex and integrated system of information about space, enables the unification as well as the analytical and synthetic processing of all data.

  11. Learning-based controller for biotechnology processing, and method of using

    Science.gov (United States)

    Johnson, John A.; Stoner, Daphne L.; Larsen, Eric D.; Miller, Karen S.; Tolle, Charles R.

    2004-09-14

    The present invention relates to process control where some of the controllable parameters are difficult or impossible to characterize. The present invention relates to process control in biotechnology of such systems, but is not limited to them. Additionally, the present invention relates to process control in biotechnological minerals processing. In the inventive method, an application of the present invention manipulates a minerals bioprocess to find local extrema (maxima or minima) for selected output variables/process goals by using a learning-based controller for bioprocess oxidation of minerals during hydrometallurgical processing. The learning-based controller operates with or without human supervision and works to find process optima without previously defined optima, due to the non-characterized nature of the process being manipulated.

  12. Application of PROMETHEE-GAIA method for non-traditional machining processes selection

    Directory of Open Access Journals (Sweden)

    Prasad Karande

    2012-10-01

    Full Text Available With ever increasing demand for manufactured products of hard alloys and metals with high surface finish and complex shape geometry, more interest is now being paid to non-traditional machining (NTM) processes, in which energy in its direct form is used to remove material from the workpiece surface. Compared to conventional machining processes, NTM processes possess almost unlimited capabilities, and there is a strong belief that the use of NTM processes will keep increasing in a diverse range of applications. The presence of a large number of NTM processes with complex characteristics and capabilities, together with a lack of experts in the NTM process selection domain, calls for the development of a structured approach to NTM process selection for a given machining application. Past researchers have attempted to solve NTM process selection problems using various complex mathematical approaches which often require profound knowledge of mathematics/artificial intelligence on the part of process engineers. In this paper, four NTM process selection problems are solved using an integrated PROMETHEE (preference ranking organization method for enrichment evaluation) and GAIA (geometrical analysis for interactive aid) method, which acts as a visual decision aid for process engineers. The observed results are quite satisfactory and exactly match the expected solutions.
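
    A minimal PROMETHEE II net-flow calculation can be sketched as follows, assuming the "usual" (strict dominance) preference function and an invented three-process, three-criterion table; the paper's actual selection problems and the GAIA plane projection are not reproduced here:

```python
import numpy as np

def promethee_ii(scores, weights, maximize):
    """PROMETHEE II net outranking flows with the 'usual' preference
    function (preference = 1 if a strictly beats b on a criterion)."""
    scores = np.asarray(scores, float)
    n, k = scores.shape
    w = np.asarray(weights, float) / np.sum(weights)
    pi = np.zeros((n, n))                           # aggregated preference matrix
    for j in range(k):
        col = scores[:, j] if maximize[j] else -scores[:, j]
        pi += w[j] * (col[:, None] > col[None, :])  # pairwise dominance on criterion j
    phi = (pi.sum(axis=1) - pi.sum(axis=0)) / (n - 1)   # net flow = leaving - entering
    return phi

# hypothetical NTM-selection table: rows = candidate processes,
# columns = criteria (e.g. surface finish, removal rate, cost)
scores = [[5, 200, 3],
          [7, 150, 2],
          [6, 180, 4]]
phi = promethee_ii(scores, weights=[0.5, 0.3, 0.2], maximize=[True, True, False])
best = int(np.argmax(phi))   # process with the highest net flow is ranked first
```

    GAIA then visualises the per-criterion flows by projecting them onto a plane via principal component analysis, which the sketch omits.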

  13. Pyrochemical and Dry Processing Methods Program. A selected bibliography

    Energy Technology Data Exchange (ETDEWEB)

    McDuffie, H.F.; Smith, D.H.; Owen, P.T.

    1979-03-01

    This selected bibliography with abstracts was compiled to provide information support to the Pyrochemical and Dry Processing Methods (PDPM) Program sponsored by DOE and administered by the Argonne National Laboratory. Objectives of the PDPM Program are to evaluate nonaqueous methods of reprocessing spent fuel as a route to the development of proliferation-resistant and diversion-resistant methods for widespread use in the nuclear industry. Emphasis was placed on the literature indexed in the ERDA--DOE Energy Data Base (EDB). The bibliography includes indexes to authors, subject descriptors, EDB subject categories, and titles.

  14. Pyrochemical and Dry Processing Methods Program. A selected bibliography

    International Nuclear Information System (INIS)

    McDuffie, H.F.; Smith, D.H.; Owen, P.T.

    1979-03-01

    This selected bibliography with abstracts was compiled to provide information support to the Pyrochemical and Dry Processing Methods (PDPM) Program sponsored by DOE and administered by the Argonne National Laboratory. Objectives of the PDPM Program are to evaluate nonaqueous methods of reprocessing spent fuel as a route to the development of proliferation-resistant and diversion-resistant methods for widespread use in the nuclear industry. Emphasis was placed on the literature indexed in the ERDA--DOE Energy Data Base (EDB). The bibliography includes indexes to authors, subject descriptors, EDB subject categories, and titles.

  15. Method of processing radioactive metallic sodium with recycling alcohols

    International Nuclear Information System (INIS)

    Sakai, Takuhiko; Mitsuzuka, Norimasa.

    1980-01-01

    Purpose: To employ highly safe alcohol processing and decrease the amount of wastes in the processing of radioactive metallic sodium discharged from LMFBR type reactors. Method: Radioactive metallic sodium containing long half-life nuclides such as cesium, strontium, barium, cerium, lanthanum or zirconium is dissolved in an alcohol of about 70% purity. After extracting the sodium alcoholate thus formed, hydrogen chloride gas is blown in to separate the sodium alcoholate into alcohol and sodium chloride, and the regenerated alcohol is used again for dissolving sodium metal. The sodium chloride thus separated is processed into solid wastes. (Furukawa, Y.)

  16. Method of processing spent ion exchange resins

    International Nuclear Information System (INIS)

    Mori, Kazuhide; Tamada, Shin; Kikuchi, Makoto; Matsuda, Masami; Aoyama, Yoshiyuki.

    1985-01-01

    Purpose: To decrease the amount of radioactive spent ion exchange resins generated from nuclear power plants, etc., and to process them into stable inorganic compounds through thermal decomposition. Method: Spent ion exchange resins are thermally decomposed in an inert atmosphere to selectively decompose only the ion exchange groups in the preceding step, while the polymer skeletons are completely decomposed in an oxidizing atmosphere in the succeeding step. In this way, gaseous sulfur oxides and nitrogen oxides are generated in the preceding step, while carbon dioxide and hydrogen gases requiring no off-gas processing are generated in the succeeding step. Accordingly, the amount of discharged gases requiring processing can be significantly reduced, and the residues can be converted into stable inorganic compounds. Further, if transition metals are ionically adsorbed as catalysts on the ion exchange resins, the ion exchange groups are decomposed at 130-300 °C, while the polymer skeletons are thermally decomposed at 240-300 °C. Thus, the temperature for thermal decomposition can be lowered to prevent degradation of the reactor materials. (Kawakami, Y.)

  17. Measurement of company effectiveness using analytic network process method

    Directory of Open Access Journals (Sweden)

    Goran Janjić

    2017-07-01

    Full Text Available The sustainable development of an organisation is monitored through the organisation's performance, whose strategy incorporates the requirements of all stakeholders beforehand. The strategic management concept enables organisations to monitor and evaluate their effectiveness and efficiency by monitoring the implementation of set strategic goals. In the process of monitoring and measuring effectiveness, an organisation can use multiple-criteria decision-making methods as an aid. This study uses the analytic network process (ANP) method to define the weight factors of the mutual influences of all the important elements of an organisation's strategy. The calculation of an organisation's effectiveness is based on the weight factors and the degree of fulfilment of the target values of the strategic map measures. New business conditions change the importance of certain elements of an organisation's business in relation to competitive advantage on the market, where increasing emphasis is given to non-material resources in the process of selecting the organisation's most important measures.
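
    The ANP weight factors derive from principal-eigenvector priorities of pairwise comparison matrices. A minimal sketch with a hypothetical 3-criteria matrix follows (the full ANP supermatrix and limit-matrix steps are omitted):

```python
import numpy as np

def priority_vector(A, iters=100):
    """Principal-eigenvector priority weights of a (reciprocal) pairwise
    comparison matrix, computed by power iteration."""
    A = np.asarray(A, float)
    w = np.ones(A.shape[0]) / A.shape[0]
    for _ in range(iters):
        w = A @ w          # power-iteration step toward the principal eigenvector
        w /= w.sum()       # renormalise so weights sum to 1
    return w

# hypothetical reciprocal comparison matrix: criterion 1 is judged 3x as
# important as criterion 2 and 5x as important as criterion 3, etc.
A = [[1,     3,   5],
     [1 / 3, 1,   2],
     [1 / 5, 1 / 2, 1]]
w = priority_vector(A)
```

    In full ANP these local priority vectors are assembled into a supermatrix that also captures inter-cluster dependencies, which is what distinguishes ANP from plain AHP.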

  18. ITERATION-FREE NUMERICAL METHOD OF MODELING ELECTROMECHANICAL PROCESSES IN ASYNCHRONOUS ENGINES

    Directory of Open Access Journals (Sweden)

    D. G. Patalakh

    2018-02-01

    Full Text Available Purpose. To develop a calculation of electromagnetic and electromechanical transients in asynchronous motors that requires no iterations. Methodology. Numerical methods for the integration of ordinary differential equations; programming. Findings. As the system of equations describing the dynamics of an asynchronous motor contains products of rotor and stator currents, and products of the rotor rotation frequency and currents, the system is nonlinear. The numerical solution of nonlinear differential equations normally involves an iterative process at every integration step. A time-consuming or poorly converging iterative process can slow down the calculation. An improvement of the numerical method that removes the iterative process is offered; as a result, the modeling time is reduced. The improved numerical method is applied to the integration of the differential equations describing the dynamics of an asynchronous motor. Originality. An improvement of the numerical method is offered which allows the numerical integration of differential equations containing products of functions while avoiding an iterative process at every integration step, thus shortening the modeling time. Practical value. On the basis of the offered methodology, a universal and fast program for modeling electromechanical processes in asynchronous motors could be developed.
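
    One common way to remove the per-step iteration caused by product nonlinearities, plausibly in the spirit of the abstract, is a linearly implicit (semi-implicit) update that lags one factor of each product by a step, so every implicit equation becomes linear and solvable in closed form. The toy bilinear system below is our own illustration, not the authors' motor model:

```python
# Toy system with a product nonlinearity, integrated without iteration:
#   dx/dt = -x*y        (product of state variables, like current products)
#   dy/dt =  x - y
# Lagging y in the product makes the implicit x-update linear in x_{n+1}.
def step(x, y, h):
    # x_{n+1} = x_n + h * (-x_{n+1} * y_n)  ->  closed-form solve for x_{n+1}
    x_new = x / (1.0 + h * y)
    # y_{n+1} = y_n + h * (x_{n+1} - y_{n+1})  ->  closed-form solve for y_{n+1}
    y_new = (y + h * x_new) / (1.0 + h)
    return x_new, y_new

x, y, h = 1.0, 2.0, 0.01
for _ in range(2000):        # integrate to t = 20; both states decay to 0
    x, y = step(x, y, h)
```

    Unlike a fully implicit scheme, no Newton or fixed-point loop runs inside the step, yet the scheme retains better stability than explicit Euler for stiff product terms.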

  19. Effect of the method of processing on quality and oxidative stability ...

    African Journals Online (AJOL)

    In this study four samn samples prepared from cow milk using two processing methods (traditional T1, T2 and factory processed T3, T4) were investigated for their physico-chemical properties, fatty acids composition, oxidative stability and sensory properties. The traditionally processed samples showed a significance ...

  20. A design method for process design kit based on an SMIC 65 nm process

    International Nuclear Information System (INIS)

    Luo Haiyan; Chen Lan; Yin Minghui

    2010-01-01

    The frame structure of a process design kit (PDK) is described in detail, and a practical design method for PDKs is presented. Based on this method, a useful SMIC 65 nm PDK has been successfully designed and realized, which is applicable to the native EDA software Zeni. The design process and difficulties of a PDK are introduced by developing and analyzing parameterized cell (Pcell) devices (MOS, resistor, etc.). A structured design method is proposed to implement Pcells, which makes the many thousands of lines of Pcell source code concise, readable, easy to maintain and portable. Moreover, a Pcase library for each Pcell is designed to verify the Pcells in batches. By this approach, the Pcells can be verified efficiently and the PDK becomes more reliable and stable. In addition, the component description format parameters and layouts of the Pcells are optimized by adding flexibility and improving performance, which helps analog and custom IC designers satisfy design demands. Finally, the SMIC 65 nm PDK was applied to IC design. The results indicate that the SMIC 65 nm PDK is capable of supporting IC design. (semiconductor integrated circuits)

  1. [Research on evolution and transition of processing method of fuzi in ancient and modern times].

    Science.gov (United States)

    Liu, Chan-Chan; Cheng, Ming-En; Duan, Hai-Yan; Peng, Hua-Sheng

    2014-04-01

    Fuzi is a medicine used for rescuing from collapse by restoring yang, as well as a famous toxic traditional Chinese medicine. In order to ensure efficacy and safe medication, Fuzi has mostly been applied after being processed. Different Fuzi processing methods have been recorded by doctors of previous generations, and there are also differences between the Fuzi processing methods recorded in modern pharmacopoeias and those in ancient medical books. In this study, the authors traced back to medical books from between the Han Dynasty and the period of the Republic of China, and summarized the Fuzi processing methods collected in ancient and modern literature. According to the results, Fuzi processing and using methods have changed along with the succession of dynasties, with differences between ancient and modern processing methods. Before the Tang Dynasty, Fuzi had mostly been processed and soaked. From the Tang to the Ming Dynasties, Fuzi had mostly been processed, soaked and stir-fried. During the Qing Dynasty, Fuzi had mostly been soaked and boiled. In modern times, Fuzi is mostly processed by being boiled and soaked. Before the Tang Dynasty, whole Fuzi herbs or their fragments had been applied in medicines, whereas their fragments are primarily used in modern times. Because different processing methods have great impacts on the toxicity of Fuzi, it is suggested to study Fuzi processing methods further.

  2. PROCESS CAPABILITY ESTIMATION FOR NON-NORMALLY DISTRIBUTED DATA USING ROBUST METHODS - A COMPARATIVE STUDY

    Directory of Open Access Journals (Sweden)

    Yerriswamy Wooluru

    2016-06-01

    Full Text Available Process capability indices are very important process quality assessment tools in the automotive industry. The common process capability indices (PCIs) Cp, Cpk and Cpm are widely used in practice. The use of these PCIs is based on the assumption that the process is in control and its output is normally distributed. In practice, normality is not always fulfilled. Indices developed under the normality assumption are very sensitive to non-normal processes. When the distribution of a product quality characteristic is non-normal, Cp and Cpk indices calculated using conventional methods often lead to erroneous interpretation of process capability. In the literature, various methods have been proposed as surrogate process capability indices under non-normality, but few literature sources offer a comprehensive evaluation and comparison of their ability to capture the true capability in non-normal situations. In this paper, five methods are reviewed and a capability evaluation is carried out for data on the resistivity of silicon wafers. The final results revealed that the Burr-based percentile method is better than the Clements method. Modelling of non-normal data and the Box-Cox transformation method using statistical software (Minitab 14) provide reasonably good results, as they are very promising methods for non-normal and moderately skewed data (skewness <= 1.5).
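
    A percentile-based capability estimate in the spirit of the Clements method can be sketched as follows. Empirical percentiles stand in for the Pearson-curve percentiles of the full Clements (or Burr) method, and the skewed data and spec limits are invented for illustration:

```python
import numpy as np

def percentile_cp_cpk(x, lsl, usl):
    """Percentile-based capability indices for non-normal data:
    the 6-sigma span is replaced by the 0.135th-99.865th percentile span,
    and the process center by the median (empirical percentiles here,
    rather than the fitted-distribution percentiles of Clements/Burr)."""
    p_lo, med, p_hi = np.percentile(x, [0.135, 50, 99.865])
    cp = (usl - lsl) / (p_hi - p_lo)
    cpk = min((usl - med) / (p_hi - med), (med - lsl) / (med - p_lo))
    return cp, cpk

# skewed synthetic 'resistivity' data with hypothetical spec limits
rng = np.random.default_rng(42)
x = rng.lognormal(mean=0.0, sigma=0.25, size=5000)
cp, cpk = percentile_cp_cpk(x, lsl=0.4, usl=2.5)
```

    For a right-skewed distribution like this, the conventional normal-theory Cpk would understate or overstate capability depending on which tail sits nearer a spec limit; the percentile form tracks the actual tails.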

  3. Radiation process control, study and acceptance of dosimetric methods

    International Nuclear Information System (INIS)

    Radak, B.B.

    1984-01-01

    The methods of primary dosimetric standardization and the calibration of dosimetric monitors suitable for radiation process control are outlined in the form of a logical pattern, as they are currently used on an industrial scale in Yugoslavia. The reliability of process control for the industrial sterilization of medical supplies over the last four years is discussed. The preparatory work for the intermittent use of electron beams in the cable industry is described. (author)

  4. Classical-processing and quantum-processing signal separation methods for qubit uncoupling

    Science.gov (United States)

    Deville, Yannick; Deville, Alain

    2012-12-01

    The Blind Source Separation problem consists in estimating a set of unknown source signals from their measured combinations. Until now it has only been investigated in a non-quantum framework. We propose its first quantum extensions. We thus introduce the Quantum Source Separation field, investigating both its blind and non-blind configurations. More precisely, we show how to retrieve individual quantum bits (qubits) only from the global state resulting from their undesired coupling. We consider cylindrical-symmetry Heisenberg coupling, which e.g. occurs when two electron spins interact through exchange. We first propose several qubit uncoupling methods which typically measure repeatedly the coupled quantum states resulting from individual qubit preparations, and then statistically process the classical data provided by these measurements. Numerical tests prove the effectiveness of these methods. We then derive a combination of quantum gates for performing qubit uncoupling, thus avoiding repeated qubit preparations and irreversible measurements.

  5. Fault Diagnosis for Rotating Machinery: A Method based on Image Processing.

    Directory of Open Access Journals (Sweden)

    Chen Lu

    Full Text Available Rotating machinery is one of the most typical types of mechanical equipment and plays a significant role in industrial applications. Condition monitoring and fault diagnosis of rotating machinery have gained wide attention for their significance in preventing catastrophic accidents and guaranteeing sufficient maintenance. With the development of science and technology, fault diagnosis methods based on multiple disciplines are becoming the focus in the field of fault diagnosis of rotating machinery. This paper presents a multi-discipline method based on image processing for fault diagnosis of rotating machinery. Different from traditional analysis methods in one-dimensional space, this study employs computing methods from the field of image processing to realize automatic feature extraction and fault diagnosis in a two-dimensional space. The proposed method mainly includes the following steps. First, the vibration signal is transformed into a bi-spectrum contour map utilizing bi-spectrum technology, which provides a basis for the subsequent image-based feature extraction. Then, an emerging image-processing approach for feature extraction, speeded-up robust features (SURF), is employed to automatically extract fault features from the transformed bi-spectrum contour map and form a high-dimensional feature vector. To reduce the dimensionality of the feature vector, thus highlighting the main fault features and reducing subsequent computing resources, t-distributed stochastic neighbor embedding (t-SNE) is adopted. Finally, a probabilistic neural network is introduced for fault identification. Two typical types of rotating machinery, an axial piston hydraulic pump and a self-priming centrifugal pump, are selected to demonstrate the effectiveness of the proposed method. Results show that the proposed method based on image processing achieves a high accuracy, thus providing a highly effective means of fault diagnosis for rotating machinery.

  6. Fault Diagnosis for Rotating Machinery: A Method based on Image Processing.

    Science.gov (United States)

    Lu, Chen; Wang, Yang; Ragulskis, Minvydas; Cheng, Yujie

    2016-01-01

    Rotating machinery is one of the most typical types of mechanical equipment and plays a significant role in industrial applications. Condition monitoring and fault diagnosis of rotating machinery have gained wide attention for their significance in preventing catastrophic accidents and guaranteeing sufficient maintenance. With the development of science and technology, fault diagnosis methods based on multiple disciplines are becoming the focus in the field of fault diagnosis of rotating machinery. This paper presents a multi-discipline method based on image processing for fault diagnosis of rotating machinery. Different from traditional analysis methods in one-dimensional space, this study employs computing methods from the field of image processing to realize automatic feature extraction and fault diagnosis in a two-dimensional space. The proposed method mainly includes the following steps. First, the vibration signal is transformed into a bi-spectrum contour map utilizing bi-spectrum technology, which provides a basis for the subsequent image-based feature extraction. Then, an emerging image-processing approach for feature extraction, speeded-up robust features (SURF), is employed to automatically extract fault features from the transformed bi-spectrum contour map and form a high-dimensional feature vector. To reduce the dimensionality of the feature vector, thus highlighting the main fault features and reducing subsequent computing resources, t-distributed stochastic neighbor embedding (t-SNE) is adopted. Finally, a probabilistic neural network is introduced for fault identification. Two typical types of rotating machinery, an axial piston hydraulic pump and a self-priming centrifugal pump, are selected to demonstrate the effectiveness of the proposed method. Results show that the proposed method based on image processing achieves a high accuracy, thus providing a highly effective means of fault diagnosis for rotating machinery.
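
    The first step of such a pipeline, the bi-spectrum, can be estimated directly by segment-averaged FFT triple products. This sketch, run on a synthetic phase-coupled signal, illustrates the quantity itself and why it reveals coupling a plain power spectrum misses; it is not the authors' contour-map construction:

```python
import numpy as np

def bispectrum(x, nfft=128):
    """Direct (FFT-based) bispectrum estimate averaged over segments:
    B(f1, f2) = E[ X(f1) * X(f2) * conj(X(f1 + f2)) ]."""
    segs = len(x) // nfft
    B = np.zeros((nfft // 2, nfft // 2), complex)
    for s in range(segs):
        X = np.fft.fft(x[s * nfft:(s + 1) * nfft])
        for f1 in range(nfft // 2):
            for f2 in range(nfft // 2):
                if f1 + f2 < nfft:
                    B[f1, f2] += X[f1] * X[f2] * np.conj(X[f1 + f2])
    return np.abs(B) / segs

# signal with quadratic phase coupling: the component at bin 21 has phase
# 0.7 + 1.1, i.e. the sum of the phases at bins 8 and 13, so the bispectrum
# should peak near (8, 13); uncoupled components average out
rng = np.random.default_rng(0)
n = np.arange(128 * 64)
x = (np.cos(2 * np.pi * 8 * n / 128 + 0.7)
     + np.cos(2 * np.pi * 13 * n / 128 + 1.1)
     + np.cos(2 * np.pi * 21 * n / 128 + 0.7 + 1.1)
     + 0.5 * rng.standard_normal(len(n)))
B = bispectrum(x, nfft=128)
```

    In the paper's pipeline the magnitude of such a map is rendered as a contour image, from which SURF keypoints are extracted as fault features.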

  7. [Monitoring method of extraction process for Schisandrae Chinensis Fructus based on near infrared spectroscopy and multivariate statistical process control].

    Science.gov (United States)

    Xu, Min; Zhang, Lei; Yue, Hong-Shui; Pang, Hong-Wei; Ye, Zheng-Liang; Ding, Li

    2017-10-01

    To establish an on-line monitoring method for the extraction process of Schisandrae Chinensis Fructus, a formula medicinal material of Yiqi Fumai lyophilized injection, near infrared spectroscopy was combined with multivariate data analysis technology. A multivariate statistical process control (MSPC) model was established based on 5 normal production batches, and 2 test batches were monitored by PC score, DModX and Hotelling T2 control charts. The results showed that the MSPC model had a good monitoring ability for the extraction process. Applying the MSPC model to the actual production process can effectively achieve on-line monitoring of the extraction process of Schisandrae Chinensis Fructus and reflect changes of material properties in the production process in real time. The established process monitoring method could serve as a reference for the application of process analytical technology in the process quality control of traditional Chinese medicine injections. Copyright© by the Chinese Pharmaceutical Association.
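
    The Hotelling T2 statistic used in such MSPC charts is the Mahalanobis distance of a new observation from the reference ("normal operating") model. A minimal sketch with synthetic data follows; in NIR-based MSPC the statistic is usually computed on PCA scores of the spectra rather than on raw variables, and the variable counts here are invented:

```python
import numpy as np

def hotelling_t2(X_ref, X_new):
    """Hotelling T^2 of new observations against a reference sample:
    T^2 = (x - mean)' S^{-1} (x - mean), with mean and covariance S
    estimated from the reference (normal operating) data."""
    mu = X_ref.mean(axis=0)
    S = np.cov(X_ref, rowvar=False)
    Sinv = np.linalg.inv(S)
    d = X_new - mu
    return np.einsum('ij,jk,ik->i', d, Sinv, d)   # one T^2 value per observation

# reference data pooled from 'normal' batches (3 hypothetical process
# variables); a test batch with a mean shift should drive T^2 up
rng = np.random.default_rng(7)
X_ref = rng.normal(0, 1, size=(1000, 3))
X_ok = rng.normal(0, 1, size=(200, 3))
X_bad = rng.normal(0, 1, size=(200, 3)) + np.array([3.0, 0.0, 0.0])
t2_ok = hotelling_t2(X_ref, X_ok)
t2_bad = hotelling_t2(X_ref, X_bad)
```

    Points whose T2 exceeds a control limit derived from the reference distribution trigger an alarm; DModX complements this by flagging observations that leave the model subspace altogether.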

  8. Analysis of Unit Process Cost for an Engineering-Scale Pyroprocess Facility Using a Process Costing Method in Korea

    Directory of Open Access Journals (Sweden)

    Sungki Kim

    2015-08-01

    Full Text Available Pyroprocessing, which is a dry recycling method, converts spent nuclear fuel into U (uranium)/TRU (transuranic) metal ingots in a high-temperature molten salt phase. This paper provides the unit process costs of a pyroprocess facility that can produce up to 10 tons of pyroprocessing product per year, utilizing the process costing method. Toward this end, the pyroprocess was classified into four unit processes: pretreatment, electrochemical reduction, electrorefining and electrowinning. The unit process cost was calculated by classifying the cost consumed in each process into raw material and conversion costs. The unit process costs of pretreatment, electrochemical reduction, electrorefining and electrowinning were calculated as 195 US$/kgU-TRU, 310 US$/kgU-TRU, 215 US$/kgU-TRU and 231 US$/kgU-TRU, respectively. Finally, the total pyroprocess cost was calculated as 951 US$/kgU-TRU. In addition, the cost driver for the raw material cost was identified as the cost of Li3PO4, needed for the LiCl-KCl purification process, and of platinum as an anode electrode in the electrochemical reduction process.
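
    The reported total is simply the sum of the four stage costs, which can be cross-checked directly:

```python
# Unit process costs reported in the abstract (US$/kgU-TRU); the total
# pyroprocess cost is the sum over the four unit processes.
costs = {
    "pretreatment": 195,
    "electrochemical reduction": 310,
    "electrorefining": 215,
    "electrowinning": 231,
}
total = sum(costs.values())   # 951 US$/kgU-TRU, matching the reported total
```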

  9. Literature Review on Processing and Analytical Methods for ...

    Science.gov (United States)

    Report The purpose of this report was to survey the open literature to determine the current state of the science regarding the processing and analytical methods currently available for recovery of F. tularensis from water and soil matrices, and to determine what gaps remain in the collective knowledge concerning F. tularensis identification from environmental samples.

  10. Dense Medium Machine Processing Method for Palm Kernel/ Shell ...

    African Journals Online (AJOL)

    ADOWIE PERE

    Cracked palm kernel is a mixture of kernels, broken shells, dusts and other impurities. In ... machine processing method using dense medium, a separator, a shell collector and a kernel .... efficiency, ease of maintenance and uniformity of.

  11. Validation Process Methods

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, John E. [National Renewable Energy Lab. (NREL), Golden, CO (United States); English, Christine M. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Gesick, Joshua C. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mukkamala, Saikrishna [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2018-01-04

    This report documents the validation process as applied to projects awarded through Funding Opportunity Announcements (FOAs) within the U.S. Department of Energy Bioenergy Technologies Office (DOE-BETO). It describes the procedures used to protect and verify project data, as well as the systematic framework used to evaluate and track performance metrics throughout the life of the project. This report also describes the procedures used to validate the proposed process design, cost data, analysis methodologies, and supporting documentation provided by the recipients.

  12. Desalination Processes Evaluation at Common Platform: A Universal Performance Ratio (UPR) Method

    KAUST Repository

    Wakil Shahzad, Muhammad

    2018-01-31

    The inevitable escalation of economic development has serious implications for the energy and environment nexus. The International Energy Outlook 2016 (IEO2016) predicted that the non-OECD (Organisation for Economic Co-operation and Development) countries will lead, with a 71% rise in energy demand from 2012 to 2040, in contrast with only 18% in developed countries. In Gulf Cooperation Council (GCC) countries, about 40% of primary energy is consumed by cogeneration-based power and desalination plants. The cogeneration-based plants struggle with unfair apportionment of primary fuel cost between electricity and desalination. Also, desalination process performance is conventionally evaluated on the basis of derived energy, leading to misleading process selection. There is a need for (i) an appropriate primary fuel cost apportionment method for multi-purpose plants and (ii) a desalination process performance evaluation method based on primary energy. As a solution, we proposed an exergetic analysis that apportions the primary fuel percentage to all components in the cycle according to the quality of the working fluid utilized. The proposed method showed that, under conventional energetic apportionment methods, the gas turbine was undercharged by 40%, the steam turbine was overcharged by 71%, and desalination was overcharged by 350%. We also proposed a new and most suitable desalination process performance evaluation method based on primary energy, called the universal performance ratio (UPR). Since UPR is based on primary energy, it can be used to evaluate any kind of desalination process (thermally driven, pressure driven, humidification-dehumidification, etc.) on a common platform. We showed that all desalination processes operate at only 10-13% of the thermodynamic limit (TL) of UPR. For future sustainability, desalination must achieve 25-30% of TL, which is possible only through hybridization of different processes or innovative membrane materials.

  13. Data Processing And Machine Learning Methods For Multi-Modal Operator State Classification Systems

    Science.gov (United States)

    Hearn, Tristan A.

    2015-01-01

    This document is intended as an introduction to a set of common signal processing and machine learning methods that may be used in the software portion of a functional crew state monitoring system. This includes overviews of the theory of the methods involved, as well as examples of implementation. Practical considerations are discussed for implementing modular, flexible, and scalable processing and classification software for a multi-modal, multi-channel monitoring system. Example source code is also given for all of the discussed processing and classification methods.

  14. Unconscious neural processing differs with method used to render stimuli invisible

    Directory of Open Access Journals (Sweden)

    Sergey Victor Fogelson

    2014-06-01

    Visual stimuli can be kept from awareness using various methods. The extent of processing that a given stimulus receives in the absence of awareness is typically used to make claims about the role of consciousness more generally. The neural processing elicited by a stimulus, however, may also depend on the method used to keep it from awareness, and not only on whether the stimulus reaches awareness. Here we report that the method used to render an image invisible has a dramatic effect on how category information about the unseen stimulus is encoded across the human brain. We collected fMRI data while subjects viewed images of faces and tools that were rendered invisible using either continuous flash suppression (CFS) or chromatic flicker fusion (CFF). In a third condition, we presented the same images under normal fully visible viewing conditions. We found that category information about visible images could be extracted from patterns of fMRI responses throughout areas of neocortex known to be involved in face or tool processing. However, category information about stimuli kept from awareness using CFS could be recovered exclusively within occipital cortex, whereas information about stimuli kept from awareness using CFF was also decodable within temporal and frontal regions. We conclude that unconsciously presented objects are processed differently depending on how they are rendered subjectively invisible. Caution should therefore be used in making generalizations on the basis of any one method about the neural basis of consciousness or the extent of information processing without consciousness.

  15. Unconscious neural processing differs with method used to render stimuli invisible.

    Science.gov (United States)

    Fogelson, Sergey V; Kohler, Peter J; Miller, Kevin J; Granger, Richard; Tse, Peter U

    2014-01-01

    Visual stimuli can be kept from awareness using various methods. The extent of processing that a given stimulus receives in the absence of awareness is typically used to make claims about the role of consciousness more generally. The neural processing elicited by a stimulus, however, may also depend on the method used to keep it from awareness, and not only on whether the stimulus reaches awareness. Here we report that the method used to render an image invisible has a dramatic effect on how category information about the unseen stimulus is encoded across the human brain. We collected fMRI data while subjects viewed images of faces and tools that were rendered invisible using either continuous flash suppression (CFS) or chromatic flicker fusion (CFF). In a third condition, we presented the same images under normal fully visible viewing conditions. We found that category information about visible images could be extracted from patterns of fMRI responses throughout areas of neocortex known to be involved in face or tool processing. However, category information about stimuli kept from awareness using CFS could be recovered exclusively within occipital cortex, whereas information about stimuli kept from awareness using CFF was also decodable within temporal and frontal regions. We conclude that unconsciously presented objects are processed differently depending on how they are rendered subjectively invisible. Caution should therefore be used in making generalizations on the basis of any one method about the neural basis of consciousness or the extent of information processing without consciousness.

  16. Method for treating a nuclear process off-gas stream

    International Nuclear Information System (INIS)

    Pence, D.T.; Chou, C.-C.

    1981-01-01

    A method is described for selectively removing and recovering the noble gas and other gaseous components typically emitted during nuclear process operations. The method is useful for treating dissolver off-gas effluents released during reprocessing of spent nuclear fuels to permit radioactive contaminant recovery prior to releasing the remaining off-gases to the atmosphere. The method involves a sequence of adsorption and desorption steps which are specified. Particular reference is made to the separation of xenon and krypton from the off-gas stream, and to the use of silver-exchanged mordenite as the adsorbent. (U.K.)

  17. TARGET CONTROLLING METHOD OF THE PRICING PROCESS IN THE TOURISM ENTERPRISES

    Directory of Open Access Journals (Sweden)

    N. Sagalakova

    2016-02-01

    Key stages of the pricing process in tourism enterprises are investigated: the subprocess of establishing the nominal value of a new tourism product's price, and the subprocess of adjusting the established price according to the situation in the tourism market. To establish the nominal price value, an optimization model is proposed that maximizes a utility function of the structural components of the tourism product price. To adjust the price as external conditions change, a target-setting procedure using process behavior charts of the pricing process is applied. The article presents a new methodology for controlling the pricing process in tourism enterprises, based on the combined application of statistical process control methods and dynamic programming, which fully accounts for one of the key features of the tourism sector: seasonal fluctuations of tourism product prices.

  18. Unsupervised process monitoring and fault diagnosis with machine learning methods

    CERN Document Server

    Aldrich, Chris

    2013-01-01

    This unique text/reference describes in detail the latest advances in unsupervised process monitoring and fault diagnosis with machine learning methods. Abundant case studies throughout the text demonstrate the efficacy of each method in real-world settings. The broad coverage examines such cutting-edge topics as the use of information theory to enhance unsupervised learning in tree-based methods, the extension of kernel methods to multiple kernel learning for feature extraction from data, and the incremental training of multilayer perceptrons to construct deep architectures for enhanced data

  19. Biological features produced by additive manufacturing processes using vat photopolymerization method

    DEFF Research Database (Denmark)

    Davoudinejad, Ali; Mendez Ribo, Macarena; Pedersen, David Bue

    2017-01-01

    of micro biological features by Additive Manufacturing (AM) processes. The study characterizes additive manufacturing processes for the production of polymeric micro parts using the vat photopolymerization method. A specifically designed vat photopolymerization AM machine suitable for precision printing

  20. Survey: interpolation methods for whole slide image processing.

    Science.gov (United States)

    Roszkowiak, L; Korzynska, A; Zak, J; Pijanowska, D; Swiderska-Chadaj, Z; Markiewicz, T

    2017-02-01

    Evaluating whole slide images of histological and cytological samples is used in pathology for diagnostics, grading and prognosis. It is often necessary to rescale whole slide images of a very large size. Image resizing is one of the most common applications of interpolation. We collect the advantages and drawbacks of nine interpolation methods, and as a result of our analysis, we try to select one interpolation method as the preferred solution. To compare the performance of interpolation methods, test images were scaled and then rescaled to the original size using the same algorithm. The modified image was compared to the original image in various aspects. The time needed for calculations and the results of quantification performance on modified images were also compared. For evaluation purposes, we used four general test images and 12 specialized biological immunohistochemically stained tissue sample images. The purpose of this survey is to determine which method of interpolation is the best to resize whole slide images, so they can be further processed using quantification methods. As a result, the interpolation method has to be selected depending on the task involving whole slide images. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.
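    The survey's round-trip protocol (scale an image, rescale it to the original size with the same algorithm, then compare against the original) can be sketched directly. The bilinear kernel, the toy 16x16 gradient image, and all names below are illustrative assumptions, not the authors' code:

```python
import math

def bilinear_resize(img, new_h, new_w):
    """Resize a 2D grayscale image (list of row lists) with bilinear
    interpolation, mapping output pixels back onto source coordinates."""
    h, w = len(img), len(img[0])
    out = [[0.0] * new_w for _ in range(new_h)]
    for i in range(new_h):
        for j in range(new_w):
            y = i * (h - 1) / (new_h - 1) if new_h > 1 else 0.0
            x = j * (w - 1) / (new_w - 1) if new_w > 1 else 0.0
            y0, x0 = int(math.floor(y)), int(math.floor(x))
            y1, x1 = min(y0 + 1, h - 1), min(x0 + 1, w - 1)
            dy, dx = y - y0, x - x0
            out[i][j] = (img[y0][x0] * (1 - dy) * (1 - dx)
                         + img[y0][x1] * (1 - dy) * dx
                         + img[y1][x0] * dy * (1 - dx)
                         + img[y1][x1] * dy * dx)
    return out

def mse(a, b):
    """Mean squared error between two equally sized images."""
    n = len(a) * len(a[0])
    return sum((a[i][j] - b[i][j]) ** 2
               for i in range(len(a)) for j in range(len(a[0]))) / n

img = [[float(i + j) for j in range(16)] for i in range(16)]  # toy gradient
small = bilinear_resize(img, 8, 8)          # downscale
restored = bilinear_resize(small, 16, 16)   # rescale to original size
error = mse(img, restored)
```

    Because bilinear interpolation reproduces linear intensity ramps exactly, the round-trip error on this toy gradient is essentially zero; on natural whole slide images the same protocol yields non-zero errors that rank the interpolation methods.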

  1. Real-Time Detection Methods to Monitor TRU Compositions in UREX+ Process Streams

    Energy Technology Data Exchange (ETDEWEB)

    McDeavitt, Sean; Charlton, William; Indacochea, J Ernesto; Taleyarkhan, Rusi; Pereira, Candido

    2013-03-01

    The U.S. Department of Energy has developed advanced methods for reprocessing spent nuclear fuel. The majority of this development was accomplished under the Advanced Fuel Cycle Initiative (AFCI), building on the strong legacy of process development R&D over the past 50 years. The most prominent processing method under development is named UREX+. The name refers to a family of processing methods that begin with the Uranium Extraction (UREX) process and incorporate a variety of other methods to separate uranium, selected fission products, and the transuranic (TRU) isotopes from dissolved spent nuclear fuel. It is important to consider issues such as safeguards strategies and materials control and accountability methods. Monitoring of higher actinides during aqueous separations is a critical research area. By providing on-line materials accountability for the processes, covert diversion of the materials streams becomes much more difficult. The importance of the nuclear fuel cycle continues to rise on national and international agendas. The U.S. Department of Energy is evaluating and developing advanced methods for safeguarding nuclear materials along with instrumentation in various stages of the fuel cycle, especially in material balance areas (MBAs) and during reprocessing of used nuclear fuel. One of the challenges related to the implementation of any type of MBA and/or reprocessing technology (e.g., PUREX or UREX) is the real-time quantification and control of the transuranic (TRU) isotopes as they move through the process. Monitoring of higher actinides from their neutron emission (including multiplicity) and alpha signatures during transit in MBAs and in aqueous separations is a critical research area. By providing on-line real-time materials accountability, diversion of the materials becomes much more difficult. 
The objective of this consortium was to develop real time detection methods to monitor the efficacy of the UREX+ process and to safeguard the separated

  2. Method of controlling radioactive waste processing systems

    International Nuclear Information System (INIS)

    Mikawa, Hiroji; Sato, Takao.

    1981-01-01

    Purpose: To minimize the pellet production amount, maximize the working life of the solidifying device, and maintain the mechanical strength of pellets at a predetermined value, irrespective of the type and cycle of occurrence of the secondary waste, in the secondary-waste solidifying device of radioactive waste processing systems in nuclear power plants. Method: Forecasting periods for the type, production amount and radioactivity level of the secondary wastes are set in input/output devices connected to a control system, and the resulting signals are sent to computing elements. The computing elements forecast the production amount of regenerated liquid wastes after a predetermined number of days based on the running conditions of a condensate desalter, and the production amounts of filter sludges and liquid resin wastes after a predetermined number of days based on the liquid waste processing amount in a processing device. Then the mass balance between the type and amount of the secondary wastes presently stored in a tank is calculated, and the composition and concentration of the processing liquid are set so as to obtain predetermined values for the strength of the pellets to be dried and solidified, the working life of the solidifying device itself, and the radioactivity level of the pellets. Thereafter, the running conditions for the solidifying device are determined so as to maximize its working life. (Horiuchi, T.)

  3. An alternative method to achieve metrological confirmation in measurement process

    Science.gov (United States)

    Villeta, M.; Rubio, E. M.; Sanz, A.; Sevilla, L.

    2012-04-01

    A metrological confirmation process must be designed and implemented to ensure that the metrological characteristics of the measurement system meet the metrological requirements of the measurement process. The aim of this paper is to present an alternative to the traditional metrological requirement relating tolerance to measurement uncertainty for developing such confirmation processes. The proposed approach considers a given inspection task of the measurement process within the manufacturing system and is based on the Index of Contamination of the Capability (ICC). The metrological confirmation process is then developed taking into account producer risks and economic considerations on this index. As a consequence, depending on the capability of the manufacturing process, the measurement system will or will not be in an adequate state of metrological confirmation for the measurement process.

  4. Composite media for fluid stream processing, a method of forming the composite media, and a related method of processing a fluid stream

    Science.gov (United States)

    Garn, Troy G; Law, Jack D; Greenhalgh, Mitchell R; Tranter, Rhonda

    2014-04-01

    A composite media including at least one crystalline aluminosilicate material in polyacrylonitrile. A method of forming a composite media is also disclosed. The method comprises dissolving polyacrylonitrile in an organic solvent to form a matrix solution. At least one crystalline aluminosilicate material is combined with the matrix solution to form a composite media solution. The organic solvent present in the composite media solution is diluted. The composite media solution is solidified. In addition, a method of processing a fluid stream is disclosed. The method comprises providing beads of a composite media comprising at least one crystalline aluminosilicate material dispersed in a polyacrylonitrile matrix. The beads of the composite media are contacted with a fluid stream comprising at least one constituent. The at least one constituent is substantially removed from the fluid stream.

  5. A method for automated processing of measurement information during mechanical drilling

    Energy Technology Data Exchange (ETDEWEB)

    Samonenko, V.I.; Belinkov, V.G.; Romanova, L.A.

    1984-01-01

    An algorithm is presented for a method of automated processing of measurement information during mechanical drilling. Its use under the operating conditions of an automated drilling control system (ASU) makes it possible to precisely identify changes in the lithology, in the physical, mechanical and abrasive properties, and in the stratum (pore) pressure of the rock being drilled, which, along with other methods for testing the drilling process, will increase the reliability of the decisions made.

  6. Analysis methods of stochastic transient electromagnetic processes in electric traction system

    Directory of Open Access Journals (Sweden)

    T. M. Mishchenko

    2013-04-01

    Purpose. To develop the essence and basic characteristics of calculation methods for transient electromagnetic processes in the elements and devices of nonlinear dynamic electric traction systems, taking into account the stochastic changes of voltages and currents in the traction networks of the power supply subsystem and in the power circuits of electric rolling stock. Methodology. The research uses classical methods and methods of nonlinear electrical engineering, together with methods of probability theory, in particular the theory of stationary ergodic and non-stationary stochastic processes. Findings. Using the above-mentioned methods, an equivalent circuit and a system of nonlinear integro-differential equations for the electromagnetic state of a double-track inter-substation zone of an alternating current electric traction system are drawn up. The calculations yield the electric traction current distribution in the feeder zones. Originality. The paper is of scientific interest primarily for its methods, which take into account the probabilistic character of changes in traction voltages and currents; it also develops efficient methods of nonlinear circuit analysis. Practical value. The practical value of the research lies in applying the methods to the analysis of electromagnetic and electric energy processes in the traction power supply system in the case of high-speed train traffic.

  7. Method for qualification of cementation processes and its application to a vibration mixer

    International Nuclear Information System (INIS)

    Vicente, R.; Rzyski, B.M.; Suarez, A.A.

    1987-01-01

    In this paper the definition of homogeneity is discussed and methods to measure the 'degree of heterogeneity' of waste forms are proposed. These measurements are important as aids for mixing-process qualification, as tools in quality assurance procedures, and in the development of waste management standards. Homogeneity is a basic quality requirement for waste forms to be accepted at final disposal sites. It does not depend on the immobilization matrix; rather, it is one means of qualifying the immobilization process. The proposed methods were applied to a vibration-assisted mixing process and proved a useful means of judging process improvements. There are many conceivable methods to evaluate the homogeneity of waste forms. Some were selected as screening tests aimed at quickly reaching a promising set of process variables; others were selected to evaluate the degree of excellence of the process with respect to product quality. The methods considered were: visual inspection, the use of cement dye as a tracer, scanning of radioactive tracers, and measurements of variations of density, water absorption, porosity and mechanical strength across the waste form sample. The process variables were: waste-cement and water-cement ratios, mixer geometry, mixing time and vibration intensity. Some details of the apparatus were changed during the experimental work in order to improve product quality. The experimental methods and results were statistically analysed and compared with data obtained from samples prepared with a planetary paddle mixer, which were adopted as the homogeneity standard. (Author) [pt

  8. A Block-Asynchronous Relaxation Method for Graphics Processing Units

    OpenAIRE

    Anzt, H.; Dongarra, J.; Heuveline, Vincent; Tomov, S.

    2011-01-01

    In this paper, we analyze the potential of asynchronous relaxation methods on Graphics Processing Units (GPUs). For this purpose, we developed a set of asynchronous iteration algorithms in CUDA and compared them with a parallel implementation of synchronous relaxation methods on CPU-based systems. For a set of test matrices taken from the University of Florida Matrix Collection we monitor the convergence behavior, the average iteration time and the total time-to-solution time. Analyzing the r...
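    The synchronous-versus-asynchronous comparison above can be illustrated with a minimal single-threaded model: component-wise relaxation updates applied in a random order, the way asynchronous GPU threads would apply them, always reading whatever values are currently in memory. The test system and all names below are assumptions, not code from the paper:

```python
import random

def async_relaxation(A, b, sweeps=400, seed=0):
    """Component-wise relaxation applied in a random order, emulating
    (on one thread) the chaotic update order of an asynchronous GPU
    implementation: each update reads whatever values are current."""
    rng = random.Random(seed)
    n = len(b)
    x = [0.0] * n
    for _ in range(sweeps):
        i = rng.randrange(n)  # which component a 'thread' updates next
        s = sum(A[i][j] * x[j] for j in range(n) if j != i)
        x[i] = (b[i] - s) / A[i][i]
    return x

# Strictly diagonally dominant system; the exact solution is [1, 1, 1].
A = [[4.0, 1.0, 0.0],
     [1.0, 5.0, 2.0],
     [0.0, 2.0, 6.0]]
b = [5.0, 8.0, 8.0]
x = async_relaxation(A, b)
```

    For strictly diagonally dominant matrices such chaotic update orders still converge, which is what makes asynchronous relaxation attractive on GPUs where synchronization is expensive.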

  9. A process for application of ATHEANA - a new HRA method

    International Nuclear Information System (INIS)

    Parry, G.W.; Bley, D.C.; Cooper, S.E.

    1996-01-01

    This paper describes the analytical process for the application of ATHEANA, a new approach to the performance of human reliability analysis as part of a PRA. This new method, unlike existing methods, is based upon an understanding of the reasons why people make errors, and was developed primarily to address the analysis of errors of commission.

  10. Simulation of multivariate stationary stochastic processes using dimension-reduction representation methods

    Science.gov (United States)

    Liu, Zhangjun; Liu, Zenghui; Peng, Yongbo

    2018-03-01

    In view of the Fourier-Stieltjes integral formula of multivariate stationary stochastic processes, a unified formulation accommodating the spectral representation method (SRM) and proper orthogonal decomposition (POD) is deduced. By introducing random functions as constraints correlating the orthogonal random variables involved in the unified formulation, the dimension-reduction spectral representation method (DR-SRM) and the dimension-reduction proper orthogonal decomposition (DR-POD) are addressed. The proposed schemes are capable of representing a multivariate stationary stochastic process with a few elementary random variables, bypassing the challenge of the high-dimensional random variables inherent in conventional Monte Carlo methods. In order to accelerate the numerical simulation, the technique of the Fast Fourier Transform (FFT) is integrated with the proposed schemes. For illustrative purposes, the simulation of the horizontal wind velocity field along the deck of a large-span bridge is carried out using the proposed methods with 2 and 3 elementary random variables. Numerical simulation reveals the usefulness of the dimension-reduction representation methods.
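    A minimal sketch of the conventional spectral representation method underlying the paper may help: a one-sided power spectral density (PSD) is discretized and cosines with independent random phases are superposed. This is the plain SRM with one random phase per frequency, not the dimension-reduction variant; the exponential PSD and all names are assumptions:

```python
import math
import random

def simulate_srm(psd, omega_max, n_freq, times, seed=0):
    """Conventional spectral representation method (SRM): synthesize one
    realization of a zero-mean stationary process from a one-sided PSD by
    superposing cosines with independent uniform random phases."""
    rng = random.Random(seed)
    d_omega = omega_max / n_freq
    comps = []
    for k in range(n_freq):
        omega_k = (k + 0.5) * d_omega              # midpoint frequencies
        amp = math.sqrt(2.0 * psd(omega_k) * d_omega)
        phi = rng.uniform(0.0, 2.0 * math.pi)      # elementary random variables
        comps.append((amp, omega_k, phi))
    return [sum(a * math.cos(w * t + p) for a, w, p in comps) for t in times]

# Hypothetical one-sided PSD S(w) = exp(-w) on [0, 5]; its integral
# (about 0.993) is the target variance of the process.
times = [0.1 * i for i in range(2000)]
x = simulate_srm(lambda w: math.exp(-w), 5.0, 64, times)
mean = sum(x) / len(x)
var = sum((v - mean) ** 2 for v in x) / len(x)
```

    A quick sanity check on the discretization is that the time-averaged variance of a long realization approaches the integral of the PSD.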

  11. A radiometric method for the characterization of particulate processes in colloidal suspensions. II

    International Nuclear Information System (INIS)

    Subotic, B.

    1979-01-01

    A radiometric method for the characterization of particulate processes is verified using stable hydrosols of silver iodide. Silver iodide hydrosols satisfy the conditions required for the application of the proposed method. Comparison shows that the values for the change of particle size measured in silver iodide hydrosols by the proposed method are in excellent agreement with the values obtained by other methods on the same systems (electron microscopy, sedimentation analysis, light scattering). This shows that the proposed method is suitable for the characterization of particulate processes in colloidal suspensions. (Auth.)

  12. CESAR cost-efficient methods and processes for safety-relevant embedded systems

    CERN Document Server

    Wahl, Thomas

    2013-01-01

    The book summarizes the findings and contributions of the European ARTEMIS project CESAR for improving and enabling interoperability of methods, tools, and processes to meet the demands of embedded systems development across four domains: avionics, automotive, automation, and rail. The contributions give insight into an improved engineering and safety process life-cycle for the development of safety-critical systems. They present a new concept of an engineering tool integration platform to improve the development of safety-critical embedded systems and illustrate the capacity of this framework for end-user instantiation to specific domain needs and processes. They also advance the state of the art in component-based development as well as component and system validation and verification, with tool support. Finally, they describe industrially relevant, evaluated processes and methods especially designed for the embedded systems sector, as well as easily adoptable common interoperability principles for software tool integration...

  13. New method of processing heat treatment experiments with numerical simulation support

    Science.gov (United States)

    Kik, T.; Moravec, J.; Novakova, I.

    2017-08-01

    In this work, the benefits of combining modern software for numerical simulation of welding processes with laboratory research are described. A new method of processing heat treatment experiments is proposed that yields relevant input data for numerical simulations of the heat treatment of large parts. Using experiments on small test samples, it is now possible to simulate cooling conditions comparable with the cooling of bigger parts. Results from this method of testing make the boundary conditions of the real cooling process more accurate, and can also be used to improve software databases and optimize computational models. The aim is to make the computation of temperature fields for large hardening parts more precise, based on a new method for determining the temperature dependence of the heat transfer coefficient into the hardening medium for a particular material, a defined maximal thickness of the processed part, and given cooling conditions. The paper also presents a comparison of the standard and modified (according to the newly suggested methodology) heat transfer coefficient data and their influence on the simulation results. It shows how even small changes influence mainly the distributions of temperature, metallurgical phases, hardness and stresses. The experiment also yields not only input data and data enabling optimization of the computational model but, at the same time, verification data. The greatest advantage of the described method is its independence of the type of cooling medium used.

  14. MULTIPLE CRITERIA METHODS WITH FOCUS ON ANALYTIC HIERARCHY PROCESS AND GROUP DECISION MAKING

    Directory of Open Access Journals (Sweden)

    Lidija Zadnik-Stirn

    2010-12-01

    Managing natural resources is a group multiple criteria decision making problem. In this paper the analytic hierarchy process is the chosen method for handling natural resource problems. The single decision maker problem is discussed, and three methods for deriving the priority vector are presented: the eigenvector method, the data envelopment analysis method, and the logarithmic least squares method. Further, the group analytic hierarchy process is discussed, and six methods for the aggregation of individual judgments or priorities are compared: the weighted arithmetic mean method, the weighted geometric mean method, and four methods based on data envelopment analysis. A case study on land use in Slovenia is presented. The conclusions review consistency, sensitivity analyses, and some future directions of research.
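    The eigenvector method mentioned above derives the priority vector as the normalized principal eigenvector of the pairwise comparison matrix, computable by power iteration. The 3x3 example matrix below is hypothetical and perfectly consistent (built from underlying weights 3 : 2 : 1), so those exact weights are recovered:

```python
def ahp_priorities(M, iters=100):
    """Eigenvector method for AHP: power iteration on the pairwise
    comparison matrix; the normalized principal eigenvector is the
    priority vector."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [vi / s for vi in v]  # renormalize so priorities sum to 1
    return w

# Hypothetical consistent comparison matrix with entries M[i][j] = w_i / w_j
# for underlying weights 3 : 2 : 1.
M = [[1.0, 3.0 / 2.0, 3.0],
     [2.0 / 3.0, 1.0, 2.0],
     [1.0 / 3.0, 1.0 / 2.0, 1.0]]
w = ahp_priorities(M)
```

    For inconsistent judgment matrices the same iteration still converges to the principal eigenvector, and the deviation of the principal eigenvalue from n yields Saaty's consistency index.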

  15. Sequential optimization and reliability assessment method for metal forming processes

    International Nuclear Information System (INIS)

    Sahai, Atul; Schramm, Uwe; Buranathiti, Thaweepat; Chen Wei; Cao Jian; Xia, Cedric Z.

    2004-01-01

    Uncertainty is inevitable in any design process. It may be due to variations in the geometry of the part, in material properties, or to a lack of knowledge about the phenomena being modeled. Deterministic design optimization does not take uncertainty into account, and worst-case assumptions lead to vastly over-conservative designs. Probabilistic design, such as reliability-based design and robust design, offers tools for making robust and reliable decisions in the presence of uncertainty in the design process. Probabilistic design optimization often involves a double-loop procedure of optimization and iterative probabilistic assessment, which results in high computational demand. This demand can be reduced by replacing computationally intensive simulation models with less costly surrogate models and by employing the Sequential Optimization and Reliability Assessment (SORA) method. The SORA method uses a single-loop strategy with a series of cycles of deterministic optimization and reliability assessment, decoupled within each cycle. This leads to quick improvement of the design from one cycle to the next and increased computational efficiency. This paper demonstrates the effectiveness of the SORA method when applied to designing a sheet metal flanging process. Surrogate models are used as less costly approximations to the computationally expensive finite element simulations.

  16. The Discovery of Processing Stages: Extension of Sternberg's Method

    NARCIS (Netherlands)

    Anderson, John R; Zhang, Qiong; Borst, Jelmer P; Walsh, Matthew M

    2016-01-01

    We introduce a method for measuring the number and durations of processing stages from the electroencephalographic signal and apply it to the study of associative recognition. Using an extension of past research that combines multivariate pattern analysis with hidden semi-Markov models, the approach

  17. Catalytic arylation methods from the academic lab to industrial processes

    CERN Document Server

    Burke, Anthony J

    2014-01-01

    This "hands-on" approach to the topic of arylation consolidates the body of key research over the last ten years (up to around 2014) on various catalytic methods that involve an arylation process. Clearly structured, the chapters in this one-stop resource are arranged according to reaction type and focus on novel, efficient and sustainable processes, rather than the well-known and established cross-coupling methods. The entire contents are written by two authors with academic and industrial expertise to ensure consistent coverage of the latest developments in the field, as well as industrial applications, such as C-H activation, iron- and gold-catalyzed coupling reactions, cycloadditions and novel methodologies using arylboron reagents. A cross-section of relevant tried-and-tested experimental protocols is included at the end of each chapter for putting into immediate practice, along with patent literature. Due to its emphasis on efficient, "green" methods and industrial applications of the products c...

  18. The Role of Attention in Somatosensory Processing: A Multi-Trait, Multi-Method Analysis

    Science.gov (United States)

    Wodka, Ericka L.; Puts, Nicolaas A. J.; Mahone, E. Mark; Edden, Richard A. E.; Tommerdahl, Mark; Mostofsky, Stewart H.

    2016-01-01

    Sensory processing abnormalities in autism have largely been described by parent report. This study used a multi-method (parent-report and measurement), multi-trait (tactile sensitivity and attention) design to evaluate somatosensory processing in ASD. Results showed multiple significant within-method (e.g., parent report of different…

  19. Measuring methods, registration and signal processing for magnetic field research

    International Nuclear Information System (INIS)

    Nagiello, Z.

    1981-01-01

    Some measuring methods and signal processing systems based on analogue and digital techniques, which have been applied in magnetic field research using magnetometers with ferromagnetic transducers, are presented. (author)

  20. Development of Process Analytical Technology (PAT) methods for controlled release pellet coating.

    Science.gov (United States)

    Avalle, P; Pollitt, M J; Bradley, K; Cooper, B; Pearce, G; Djemai, A; Fitzpatrick, S

    2014-07-01

    This work focused on the control of the manufacturing process for a controlled release (CR) pellet product, within a Quality by Design (QbD) framework. The manufacturing process was Wurster coating: first layering active pharmaceutical ingredient (API) onto sugar pellet cores, and then applying a controlled release (CR) coating. For each of these two steps, the development of a Process Analytical Technology (PAT) method is discussed, along with a novel application of automated microscopy as the reference method. Ultimately, PAT methods should link to product performance, and the two key Critical Quality Attributes (CQAs) for this CR product are assay and release rate, linked to the API and CR coating steps respectively. In this work, the link between near infra-red (NIR) spectra and those attributes was explored by chemometrics over the course of the coating process in a pilot-scale industrial environment. Correlations were built between the NIR spectra and coating weight (for API amount), CR coating thickness and dissolution performance. These correlations allow the coating process to be monitored at-line and so give better control of the product performance in line with QbD requirements. Copyright © 2014 Elsevier B.V. All rights reserved.

  1. Option pricing with COS method on Graphics Processing Units

    NARCIS (Netherlands)

    B. Zhang (Bo); C.W. Oosterlee (Kees)

    2009-01-01

In this paper, acceleration on the GPU for option pricing by the COS method is demonstrated. In particular, both European and Bermudan options will be discussed in detail. For Bermudan options, we consider both the Black-Scholes model and Levy processes of infinite activity. Moreover…
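The COS pricing step the abstract refers to can be sketched for the simplest case, a European call under Black-Scholes (the GPU acceleration and the Bermudan/Levy extensions are out of scope here). The sketch follows the standard Fourier-cosine expansion: a cumulant-based truncation range, the characteristic function of the log-return, and closed-form cosine coefficients of the call payoff. Parameter values are illustrative.

```python
import numpy as np

def bs_call(S0, K, r, sigma, T):
    """Black-Scholes closed form, used here only as a reference price."""
    from math import log, sqrt, erf, exp
    d1 = (log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    N = lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0)))
    return S0 * N(d1) - K * exp(-r * T) * N(d2)

def cos_call(S0, K, r, sigma, T, N=128, L=10.0):
    """European call via the COS (Fourier-cosine) expansion under Black-Scholes."""
    # Truncation range [a, b] from the first two cumulants of ln(S_T/S_0).
    c1, c2 = (r - 0.5 * sigma**2) * T, sigma**2 * T
    a, b = c1 - L * np.sqrt(c2), c1 + L * np.sqrt(c2)
    k = np.arange(N)
    u = k * np.pi / (b - a)
    # Characteristic function of ln(S_T/S_0) for geometric Brownian motion.
    phi = np.exp(1j * u * c1 - 0.5 * c2 * u**2)
    # Payoff cosine coefficients for a call: U_k = 2/(b-a) * K * (chi_k(0,b) - psi_k(0,b)),
    # where chi integrates e^x * cos(...) and psi integrates cos(...) over [0, b].
    chi = (np.cos(u * (b - a)) * np.exp(b) - np.cos(-u * a)
           + u * np.sin(u * (b - a)) * np.exp(b) - u * np.sin(-u * a)) / (1.0 + u**2)
    psi = np.empty(N)
    psi[0] = b
    psi[1:] = (np.sin(u[1:] * (b - a)) - np.sin(-u[1:] * a)) / u[1:]
    U = 2.0 / (b - a) * K * (chi - psi)
    x = np.log(S0 / K)
    terms = np.real(phi * np.exp(1j * u * (x - a))) * U
    terms[0] *= 0.5  # first term of the cosine series is weighted by 1/2
    return np.exp(-r * T) * terms.sum()
```

Because every strike/option evaluates the same dense vector expression, this formulation maps naturally onto a GPU; here it is plain NumPy.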

  2. A fast exact simulation method for a class of Markov jump processes.

    Science.gov (United States)

    Li, Yao; Hu, Lili

    2015-11-14

A new stochastic simulation algorithm (SSA), named the Hashing-Leaping method (HLM), for exact simulations of a class of Markov jump processes is presented in this paper. The HLM has a conditionally constant computational cost per event, independent of the number of exponential clocks in the Markov process. The main idea of the HLM is to repeatedly apply a hash-table-like bucket sort algorithm to all event times falling within a time step of length τ. This paper serves as an introduction to this new SSA method: we introduce the method, demonstrate its implementation, analyze its properties, and compare its performance with three other commonly used SSA methods in four examples. Our performance tests and CPU operation statistics show certain advantages of the HLM for large-scale problems.
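The bucket-sort idea behind the HLM can be sketched as follows: firing times that fall inside the current time step of length τ are hashed into equal-width buckets, so a single global sort is replaced by cheap within-bucket sorts, while later events are deferred to the next step. This is a toy illustration of the scheduling step only, not the full HLM; all names and parameters are invented.

```python
import random

def bucket_schedule(firing_times, t0, tau, n_buckets):
    """Hash each firing time in [t0, t0+tau) into a bucket (a hash-table-like
    bucket sort); events beyond the window are deferred to the next step.
    Input and deferred entries are (clock_id, time) pairs."""
    buckets = [[] for _ in range(n_buckets)]
    deferred = []
    width = tau / n_buckets
    for clock_id, t in firing_times:
        if t0 <= t < t0 + tau:
            buckets[int((t - t0) / width)].append((t, clock_id))
        else:
            deferred.append((clock_id, t))
    ordered = []
    for b in buckets:              # buckets are already coarsely time-ordered;
        ordered.extend(sorted(b))  # only within-bucket sorting remains
    return ordered, deferred

random.seed(1)
# N exponential clocks with rates r_i: next firing time drawn from Exp(r_i).
rates = [random.uniform(0.5, 2.0) for _ in range(1000)]
times = [(i, random.expovariate(r)) for i, r in enumerate(rates)]
ordered, deferred = bucket_schedule(times, 0.0, 1.0, 64)
```

In the real algorithm, each processed event resamples its clock and re-hashes the new firing time, which is where the constant per-event cost comes from.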

  3. A qualitative diagnosis method for a continuous process monitor system

    International Nuclear Information System (INIS)

    Lucas, B.; Evrard, J.M.; Lorre, J.P.

    1993-01-01

SEXTANT, an expert system for the analysis of transients, was built initially to study physical transients in nuclear reactors. It combines several knowledge bases concerning measurements, models and the qualitative behaviour of the plant with a generate-and-test mechanism and a set of numerical models of the physical process. The integration into SEXTANT of an improved diagnosis method using a mixed model, in order to take into account the small number of available sensors and their reliability, the knowledge of failures, and the possibility of unanticipated failures, is presented. This diagnosis method is based on two complementary qualitative models of the process and a methodology for building these models from a system description. 8 figs., 17 refs

  4. REVIEW OF MATHEMATICAL METHODS AND ALGORITHMS OF MEDICAL IMAGE PROCESSING ON THE EXAMPLE OF TECHNOLOGY OF MEDICAL IMAGE PROCESSING FROM WOLFRAM MATHEMATICA

    Directory of Open Access Journals (Sweden)

O. E. Prokopchenko

    2015-09-01

The article analyzes the basic methods and algorithms for the mathematical processing of medical images as objects of computer mathematics. The presented methods and algorithms are relevant and may find application in the field of medical imaging: automated image processing; measurement and determination of optical parameters; and the identification and building of medical image databases. The methods and algorithms presented in the article, based on Wolfram Mathematica, are also relevant to modern medical education, where appropriate Wolfram Mathematica demonstrations, such as the recognition of radiographs and morphological imaging, can be used. These methods improve the diagnostic significance and value of medical (clinical) research and can serve as interactive educational demonstrations. Implementation of the presented methods and algorithms in Wolfram Mathematica contributes, in general, to optimizing the practical processing and presentation of medical images.

  5. Memory Indexing: A Novel Method for Tracing Memory Processes in Complex Cognitive Tasks

    Science.gov (United States)

    Renkewitz, Frank; Jahn, Georg

    2012-01-01

    We validate an eye-tracking method applicable for studying memory processes in complex cognitive tasks. The method is tested with a task on probabilistic inferences from memory. It provides valuable data on the time course of processing, thus clarifying previous results on heuristic probabilistic inference. Participants learned cue values of…

  6. Effects of processing methods on nutritive values of Ekuru from two ...

    African Journals Online (AJOL)

Beans contain substantial amounts of protein, dietary fibre, B-vitamins and minerals, as well as anti-nutrients which limit their utilisation. Processing reduces the level of anti-nutrients in plant products, but little information exists on the effects of processing methods on the nutrient and anti-nutrient composition of bean products. This study was ...

  7. [Development of an automated processing method to detect coronary motion for coronary magnetic resonance angiography].

    Science.gov (United States)

    Asou, Hiroya; Imada, N; Sato, T

    2010-06-20

On coronary MR angiography (CMRA), cardiac motion degrades image quality. To improve image quality, detecting the motion of each individual coronary artery is very important. Usually, scan delay and duration are determined manually by the operator. We developed a new evaluation method that calculates the static (rest) time of each individual coronary artery. First, coronary cine MRI was acquired at a level about 3 cm below the aortic valve (80 images per R-R interval). Chronological signal changes were evaluated by Fourier transformation of each pixel of the images, and noise reduction by subtraction and extraction processing was performed. To extract strongly moving structures such as the coronary arteries, morphological filtering and labeling were added. Using these image processing steps, the motion of each coronary artery was extracted and its static time was calculated automatically. We compared images obtained with the ordinary manual method and the new automated method in 10 healthy volunteers. The coronary static time calculated with our method was shorter than that of the ordinary manual method, and scan time became about 10% longer than with the ordinary method, while image quality improved. Our automated detection method for coronary static time, based on chronological Fourier transformation, has the potential to improve the image quality of CMRA with simple processing.

  8. Calculations of the electromechanical transfer processes using implicit methods of numerical integration

    Energy Technology Data Exchange (ETDEWEB)

    Pogosyan, T A

    1983-01-01

The article is dedicated to the solution of the systems of differential equations which describe the transfer processes in an electric power system (EES) by implicit methods of numerical integration. The distinguishing feature of the implicit methods (the backward Euler method and the trapezoidal method) is their absolute stability and, consequently, the relatively small accumulation of error at each integration step. They are therefore very convenient for solving problems in electric power engineering, where the transfer processes are described by a stiff system of differential equations; the stiffness is associated with the wide range of time constants involved. The advantages of the implicit methods over explicit ones are shown in a specific example (calculation of the dynamic stability of the simplest electric power system), along with the field of use of the implicit methods and the expedience of their use in power engineering problems.
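The stability property described above can be seen on the standard stiff test equation y' = -λy, y(0) = 1. A minimal sketch (step size chosen deliberately outside the explicit method's stability region):

```python
# Stiff test equation y' = -lam * y, y(0) = 1, exact solution exp(-lam * t).
# Explicit Euler multiplies y by (1 - lam*h) each step and diverges whenever
# |1 - lam*h| > 1; backward (implicit) Euler multiplies by 1/(1 + lam*h),
# which is < 1 for any h > 0 -- the absolute stability the abstract refers to.
# (The trapezoidal rule, the other implicit method named, is also A-stable.)
lam, h, n = 1000.0, 0.01, 100   # lam*h = 10: far outside the explicit region

y_exp = 1.0
y_imp = 1.0
for _ in range(n):
    y_exp = y_exp * (1.0 - lam * h)   # explicit Euler update: blows up
    y_imp = y_imp / (1.0 + lam * h)   # backward Euler update: decays, as it should
```

For a real power-system model the implicit update requires solving a (nonlinear) system each step, but the error no longer accumulates catastrophically.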

  9. A method of creep rupture data extrapolation based on physical processes

    International Nuclear Information System (INIS)

    Leinster, M.G.

    2008-01-01

    There is a need for a reliable method to extrapolate generic creep rupture data to failure times in excess of the currently published times. A method based on well-understood and mathematically described physical processes is likely to be stable and reliable. Creep process descriptions have been developed based on accepted theory, to the extent that good fits with published data have been obtained. Methods have been developed to apply these descriptions to extrapolate creep rupture data to stresses below the published values. The relationship creep life parameter=f(ln(sinh(stress))) has been shown to be justifiable over the stress ranges of most interest, and gives realistic results at high temperatures and long times to failure. In the interests of continuity with past and present practice, the suggested method is intended to extend existing polynomial descriptions of life parameters at low stress. Where no polynomials exist, the method can be used to describe the behaviour of life parameters throughout the full range of a particular failure mode in the published data
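The suggested relationship can be illustrated with a small fitting sketch. All coefficients and the stress range below are invented for illustration (they are not real creep constants): with the constant C fixed, the life parameter is linear in ln(sinh(C·stress)), so it can be fitted by ordinary least squares and then evaluated at a stress below the fitted range, which is the extrapolation step the abstract describes.

```python
import numpy as np

# Synthetic "published" data: life parameter P = A + B * ln(sinh(C * stress)),
# with A, B, C chosen purely for illustration.
A_true, B_true, C = 25.0, -3.0, 0.01
stress = np.linspace(50.0, 300.0, 12)      # MPa, illustrative range
P = A_true + B_true * np.log(np.sinh(C * stress))

# With C fixed, the model is linear in x = ln(sinh(C * stress)).
x = np.log(np.sinh(C * stress))
B_fit, A_fit = np.polyfit(x, P, 1)         # slope, intercept

# Extrapolate the life parameter to a stress below the fitted range.
P_low = A_fit + B_fit * np.log(np.sinh(C * 30.0))
```

The sinh form matters at the extremes: ln(sinh(Cσ)) behaves like ln(Cσ) at low stress and like Cσ at high stress, which is what keeps the extrapolation physically reasonable.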

  10. A Bayesian MCMC method for point process models with intractable normalising constants

    DEFF Research Database (Denmark)

    Berthelsen, Kasper Klitgaard; Møller, Jesper

    2004-01-01

…to simulate from the "unknown distribution", perfect simulation algorithms become useful. We illustrate the method in cases where the likelihood is given by a Markov point process model. In particular, we consider semi-parametric Bayesian inference in connection with both inhomogeneous Markov point process models and pairwise interaction point processes…

  11. Indigenous processing methods and raw materials of borde, an ...

    African Journals Online (AJOL)

    A flow chart of borde production was constructed showing four major processing stages. The short shelf life of borde and the seasonal variations in production volume were identified as major problems for the vendors in the study areas. Keywords: indigenous methods; cereal fermentation; borde; beverage; Ethiopia J Food ...

  12. Radioactive waste processing method

    International Nuclear Information System (INIS)

    Sakuramoto, Naohiko.

    1992-01-01

When granular materials comprising radioactive wastes containing phosphorus are processed in a fluidized bed type furnace, and the granular materials are phosphorus-containing activated carbon, granular materials comprising alkali compounds such as calcium hydroxide and barium hydroxide are used as the fluidizing media. Even granular materials with a slow burning speed can then be burnt stably in the fluidized state by the high-temperature heat of the fluidizing media, allowing a long burning processing time. Accordingly, radioactive activated carbon wastes can be processed by combustion. (T.M.)

  13. IMF-Slices for GPR Data Processing Using Variational Mode Decomposition Method

    Directory of Open Access Journals (Sweden)

    Xuebing Zhang

    2018-03-01

Using traditional time-frequency analysis methods, it is possible to delineate the time-frequency structures of ground-penetrating radar (GPR) data, and a series of applications based on time-frequency analysis have been proposed for GPR data processing and imaging. From a signal processing point of view, however, GPR data are typically non-stationary, which limits the applicability of these methods. Empirical mode decomposition (EMD) provides an alternative solution with a fresh perspective: with EMD, GPR data are decomposed into a set of sub-components, the intrinsic mode functions (IMFs). However, the mode-mixing effect may introduce artifacts. To retain the benefits of the IMFs while avoiding the drawbacks of EMD, we introduce a new decomposition scheme, termed variational mode decomposition (VMD), for GPR data processing and imaging. Based on the decomposition results of the VMD, we propose a new method which we refer to as "the IMF-slice". In the proposed method, the IMFs are generated by the VMD trace by trace, and each IMF is then sorted and recorded into a different profile (an IMF-slice) according to its center frequency. In this way the GPR data are divided into several IMF-slices, each of which delineates a main vibration mode, so that subsurface layers and geophysical events can be identified more clearly. The effectiveness of the proposed method is tested on synthetic benchmark signals, laboratory data and a field dataset.
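The IMF-slice bookkeeping can be sketched as follows. Solving the actual VMD variational problem is beyond this sketch, so a simple FFT band split stands in for the decomposition; what the sketch does show is the routing step: decompose trace by trace, then record each mode into a slice profile ordered by its centre frequency. All function names and parameters are illustrative.

```python
import numpy as np

def decompose_trace(trace, n_modes, fs):
    """Stand-in for VMD: split the trace into n_modes FFT frequency bands and
    report each band's centre frequency. (A real implementation would solve
    the VMD variational problem instead.)"""
    spec = np.fft.rfft(trace)
    freqs = np.fft.rfftfreq(len(trace), d=1.0 / fs)
    edges = np.linspace(0.0, freqs[-1] + 1e-9, n_modes + 1)
    modes, centers = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        band = np.where((freqs >= lo) & (freqs < hi), spec, 0.0)
        modes.append(np.fft.irfft(band, n=len(trace)))
        power = np.abs(band) ** 2
        centers.append((freqs * power).sum() / power.sum()
                       if power.sum() > 0 else 0.5 * (lo + hi))
    return modes, centers

def imf_slices(profile, n_modes, fs):
    """Build IMF-slices: decompose trace by trace, then route each mode into
    a slice profile ordered by its centre frequency."""
    n_traces, n_samples = profile.shape
    slices = np.zeros((n_modes, n_traces, n_samples))
    for j in range(n_traces):
        modes, centers = decompose_trace(profile[j], n_modes, fs)
        for s, m in enumerate(np.argsort(centers)):   # low to high centre freq
            slices[s, j, :] = modes[m]
    return slices

# Toy profile: 8 identical traces mixing a 10 Hz and a 200 Hz component.
fs, nsamp = 512.0, 512
tt = np.arange(nsamp) / fs
profile = np.array([np.sin(2 * np.pi * 10 * tt)
                    + 0.5 * np.sin(2 * np.pi * 200 * tt) for _ in range(8)])
slices = imf_slices(profile, 2, fs)
```

Each slice isolates one vibration mode across all traces, which is exactly the per-mode profile the method inspects.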

  14. Information processing systems, reasoning modules, and reasoning system design methods

    Science.gov (United States)

    Hohimer, Ryan E; Greitzer, Frank L; Hampton, Shawn D

    2014-03-04

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.
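The dispatch idea in this claim, reasoning modules registered for different ontology classification types, each processing the matching abstractions of the semantic graph, can be sketched as a toy routine. All class and type names below are illustrative, not the patented system's API.

```python
# Toy sketch: abstractions carry an ontology classification type, and each
# reasoning module registers for the type(s) it knows how to process.
class Abstraction:
    def __init__(self, individual, classification_type):
        self.individual = individual                  # defined per the ontology
        self.classification_type = classification_type

class ReasoningSystem:
    def __init__(self):
        self.modules = {}          # classification type -> reasoning module

    def register(self, classification_type, handler):
        self.modules[classification_type] = handler

    def process(self, working_memory):
        """Route each abstraction in the semantic graph to the module
        registered for its classification type."""
        results = []
        for abstraction in working_memory:
            handler = self.modules.get(abstraction.classification_type)
            if handler is not None:
                results.append(handler(abstraction))
        return results

system = ReasoningSystem()
system.register("Person", lambda a: f"person-module:{a.individual}")
system.register("Event", lambda a: f"event-module:{a.individual}")
graph = [Abstraction("alice", "Person"), Abstraction("login", "Event")]
out = system.process(graph)
```

The two registered handlers play the role of the "first" and "second" reasoning modules in the claim, each seeing only abstractions of its own classification type.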

  15. Information processing systems, reasoning modules, and reasoning system design methods

    Energy Technology Data Exchange (ETDEWEB)

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2016-08-23

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  16. Information processing systems, reasoning modules, and reasoning system design methods

    Energy Technology Data Exchange (ETDEWEB)

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2015-08-18

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  17. Defect recognition in CFRP components using various NDT methods within a smart manufacturing process

    Science.gov (United States)

    Schumacher, David; Meyendorf, Norbert; Hakim, Issa; Ewert, Uwe

    2018-04-01

The manufacturing process of carbon fiber reinforced polymer (CFRP) components is gaining a more and more significant role given the increasing amount of CFRPs used in industry today. Monitoring the manufacturing process, and hence the reliability of the manufactured products, is one of the major challenges we need to face in the near future. Common defects which arise during the manufacturing process are, e.g., porosity and voids, which may lead to delaminations during operation and under load. Finding irregularities and classifying them as possible defects at an early stage of the manufacturing process is of high importance for the safety and reliability of the finished products, as well as of significant economic impact. In this study we compare various NDT methods which were applied to similar CFRP laminate samples in order to detect and characterize regions of defective volume. Besides ultrasound, thermography and eddy current, different X-ray methods like radiography, laminography and computed tomography are used to investigate the samples. These methods are compared with the intention of evaluating their capability to reliably detect and characterize defective volume. Beyond the detection and evaluation of defects, we also investigate possibilities to combine various NDT methods within a smart manufacturing process in which the decision of which method shall be applied is inherent in the process. Is it possible to design an in-line or at-line testing process which can recognize defects reliably while reducing testing time and costs? This study aims to highlight opportunities for designing a smart NDT process synchronized to production, based on the concepts of smart production (Industry 4.0). A set of defective CFRP laminate samples and different NDT methods were used to demonstrate how effectively defects are recognized and how communication between interconnected NDT sensors and the manufacturing process could be organized.

  18. Hyperspectral image processing methods

    Science.gov (United States)

    Hyperspectral image processing refers to the use of computer algorithms to extract, store and manipulate both spatial and spectral information contained in hyperspectral images across the visible and near-infrared portion of the electromagnetic spectrum. A typical hyperspectral image processing work...

  19. Dental ceramics: a review of new materials and processing methods.

    Science.gov (United States)

    Silva, Lucas Hian da; Lima, Erick de; Miranda, Ranulfo Benedito de Paula; Favero, Stéphanie Soares; Lohbauer, Ulrich; Cesar, Paulo Francisco

    2017-08-28

    The evolution of computerized systems for the production of dental restorations associated to the development of novel microstructures for ceramic materials has caused an important change in the clinical workflow for dentists and technicians, as well as in the treatment options offered to patients. New microstructures have also been developed by the industry in order to offer ceramic and composite materials with optimized properties, i.e., good mechanical properties, appropriate wear behavior and acceptable aesthetic characteristics. The objective of this literature review is to discuss the main advantages and disadvantages of the new ceramic systems and processing methods. The manuscript is divided in five parts: I) monolithic zirconia restorations; II) multilayered dental prostheses; III) new glass-ceramics; IV) polymer infiltrated ceramics; and V) novel processing technologies. Dental ceramics and processing technologies have evolved significantly in the past ten years, with most of the evolution being related to new microstructures and CAD-CAM methods. In addition, a trend towards the use of monolithic restorations has changed the way clinicians produce all-ceramic dental prostheses, since the more aesthetic multilayered restorations unfortunately are more prone to chipping or delamination. Composite materials processed via CAD-CAM have become an interesting option, as they have intermediate properties between ceramics and polymers and are more easily milled and polished.

  20. Key technologies of drilling process with raise boring method

    Directory of Open Access Journals (Sweden)

    Zhiqiang Liu

    2015-08-01

This study presents the concept of shafts constructed by raise boring in underground mines; the idea of inverse construction can be extended to other fields of underground engineering. The conventional raise boring methods, such as the wood support method, the hanging cage method, the creeping cage method, and the deep-hole blasting method, are analyzed and compared. In addition, raise boring machines are classified into different types and the characteristics of each type are described. The components of a raise boring machine, including the drill rig, the drill string and the auxiliary system, are also presented. Based on the analysis of the raise boring method, the rock mechanics problems arising during the raise boring process are put forward, including rock fragmentation, removal of cuttings, shaft wall stability, and borehole deviation control. Finally, the development trends of raise boring technology are described as follows: (i) improvement of rock-breaking modes to raise drilling efficiency, (ii) development of intelligent control techniques, and (iii) development of technology and equipment for nonlinear raise boring.

  1. REVIEW OF MATHEMATICAL METHODS AND ALGORITHMS OF MEDICAL IMAGE PROCESSING ON THE EXAMPLE OF TECHNOLOGY OF MEDICAL IMAGE PROCESSING FROM WOLFRAM MATHEMATICA

    Directory of Open Access Journals (Sweden)

    O. Ye. Prokopchenko

    2015-10-01

The article analyzes the basic methods and algorithms for the mathematical processing of medical images as objects of computer mathematics. The presented methods and algorithms are relevant and may find application in the field of medical imaging: automated image processing; measurement and determination of optical parameters; and the identification and building of medical image databases. The methods and algorithms presented in the article, based on Wolfram Mathematica, are also relevant to modern medical education, where appropriate Wolfram Mathematica demonstrations, such as the recognition of radiographs and morphological imaging, can be used. These methods improve the diagnostic significance and value of medical (clinical) research and can serve as interactive educational demonstrations. Implementation of the presented methods and algorithms in Wolfram Mathematica contributes, in general, to optimizing the practical processing and presentation of medical images.

  2. Methods of radioactive waste processing and disposal in the United Kingdom

    International Nuclear Information System (INIS)

    Tolstykh, V.D.

    1983-01-01

The results of investigations into radioactive waste processing and disposal in the United Kingdom are discussed. Methods for the solidification of metal and graphite radioactive wastes and of radioactive slime from the Magnox reactors are described, and specifications of the different installations used for radioactive waste disposal are given. Climatic and geological conditions in the United Kingdom are such that any deep waste repository will lie below the groundwater level, so dissolution and transport by groundwater will inevitably result in radionuclide mobility. In this connection an extended programme of investigations into the three main aspects of the disposal problem, namely radionuclide release from repositories, groundwater transport and radionuclide migration, is being carried out. The programme is divided into two parts. The first part deals with the retrieval of hydrological and geochemical data on geological formations and the development of the specialized investigation methods necessary for identifying sites for final waste disposal. The second part comprises theoretical and laboratory investigations into the processes of radionuclide transport in the 'repository-geological formation' system. It is concluded that vitrification based on borosilicate glass is the most advanced method of radioactive waste solidification.

  3. APPLICATION OF FTA AND FMEA METHOD TO IMPROVE SUGAR PRODUCTION PROCESS QUALITY

    Directory of Open Access Journals (Sweden)

Jojo Andriana

    2016-10-01

A defective product is a product of poor quality that does not meet the standard. Defective products can harm a company through high production costs and a damaged corporate image. Several methods that can be used to improve quality are the Six Sigma DMAIC methodology and the FTA and FMEA methods. This study was conducted to determine the sigma level of the sugar production process at PT. PG. Krebet Baru, to determine the factors that cause defective products in the sugar production process using the FTA method, and to propose suitable solutions based on the FMEA analysis of the defect causes. The sigma level of the process at PT. PG. Krebet Baru is 3.58, which indicates a company that is still growing and needs improvement. The primary causes of defects in the production process are operator and machine factors. The failure mode with the highest RPN, at 210, is an over-long steaming time, so equipment is needed that can detect the water content of the sugar. Once this equipment is installed, the exact drying time will be known and the number of defective products will decrease.
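The two quantities the study reports, the RPN and the sigma level, follow simple conventional formulas: RPN = severity × occurrence × detection, and the short-term sigma level is the standard-normal quantile of the yield plus the customary 1.5-sigma shift. The S/O/D split below is a hypothetical combination that reproduces the reported RPN of 210, not the study's actual scores.

```python
from statistics import NormalDist

def rpn(severity, occurrence, detection):
    """FMEA risk priority number: RPN = S x O x D (each scored 1-10)."""
    return severity * occurrence * detection

def sigma_level(defects, opportunities):
    """Short-term sigma level from defects per million opportunities (DPMO),
    including the conventional 1.5-sigma shift."""
    dpmo = 1e6 * defects / opportunities
    return NormalDist().inv_cdf(1.0 - dpmo / 1e6) + 1.5

# Hypothetical scores giving the reported RPN of 210, e.g. S=7, O=6, D=5.
example_rpn = rpn(7, 6, 5)
# A classic reference point: 308,538 DPMO corresponds to ~2 sigma.
example_sigma = sigma_level(308_538, 1_000_000)
```

A sigma level of 3.58 then corresponds to roughly 18,000 defects per million opportunities, which is why the abstract describes the process as "still growing".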

  4. Clustering method to process signals from a CdZnTe detector

    International Nuclear Information System (INIS)

    Zhang, Lan; Takahashi, Hiroyuki; Fukuda, Daiji; Nakazawa, Masaharu

    2001-01-01

The poor mobility of holes in a compound semiconductor detector results in imperfect collection of the primary charge deposited in the detector, and the fluctuation of the charge loss efficiency due to changes in the hole collection path length seriously degrades the energy resolution. Since the charge collection efficiency varies with the signal waveform, an improvement in energy resolution can be expected from a proper waveform signal processing method. We developed a new digital signal processing technique, a clustering method, which derives from the measured signals typical patterns containing information on the real situation inside the detector. The obtained typical patterns for the detector are then used for pattern matching. Measured signals are classified by analyzing the practical waveform variation due to charge trapping, the electric field, crystal defects, etc. Signals with similar shapes are placed in the same cluster, and for each cluster an average waveform is calculated as a reference pattern. Using the reference patterns obtained from all the clusters, other measured signal waveforms from the same detector can be classified. The signals are then processed independently according to their category and form corresponding spectra, which are finally merged into one spectrum by applying normalization coefficients. The effectiveness of this method was verified with a CdZnTe detector 2 mm thick and a 137Cs gamma-ray source; the energy resolution was improved to about 8 keV (FWHM). Because the clustering method depends only on the measured waveforms, it can be applied to any type and size of detector and is compatible with any type of filtering method. (author)
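The cluster-then-match scheme can be sketched on synthetic pulses: a minimal k-means groups waveforms of similar shape, the cluster means serve as reference patterns, and a new signal is classified by its nearest pattern. Pulse shapes, noise level and the initialization are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic detector pulses: two "charge collection" shapes plus noise.
t = np.linspace(0.0, 1.0, 64)
def pulse(rise):                     # toy pulse whose shape depends on rise time
    return 1.0 - np.exp(-t / rise)
waveforms = np.array([pulse(rise) + 0.02 * rng.standard_normal(64)
                      for rise in ([0.05] * 40 + [0.3] * 40)])

def kmeans(X, k=2, iters=20):
    """Minimal k-means; the cluster means serve as reference patterns.
    Initialized from two observed pulses (first and last) for this toy data."""
    centers = X[[0, -1]].astype(float).copy()
    for _ in range(iters):
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return centers, labels

refs, labels = kmeans(waveforms)

def classify(w, refs):
    """Pattern matching: assign a waveform to the nearest reference pattern."""
    return int(((refs - w) ** 2).sum(axis=1).argmin())
```

In the real method each category's events would then be histogrammed into its own spectrum before the normalized merge.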

  5. Simulation of ecological processes using response functions method

    International Nuclear Information System (INIS)

    Malkina-Pykh, I.G.; Pykh, Yu. A.

    1998-01-01

    The article describes further development and applications of the already well-known response functions method (MRF). The method is used as a basis for the development of mathematical models of a wide set of ecological processes. The model of radioactive contamination of the ecosystems is chosen as an example. The mathematical model was elaborated for the description of 90 Sr dynamics in the elementary ecosystems of various geographical zones. The model includes the blocks corresponding with the main units of any elementary ecosystem: lower atmosphere, soil, vegetation, surface water. Parameters' evaluation was provided on a wide set of experimental data. A set of computer simulations was done on the model to prove the possibility of the model's use for ecological forecasting

  6. Method of noncontacting ultrasonic process monitoring

    Science.gov (United States)

    Garcia, Gabriel V.; Walter, John B.; Telschow, Kenneth L.

    1992-01-01

    A method of monitoring a material during processing comprising the steps of (a) shining a detection light on the surface of a material; (b) generating ultrasonic waves at the surface of the material to cause a change in frequency of the detection light; (c) detecting a change in the frequency of the detection light at the surface of the material; (d) detecting said ultrasonic waves at the surface point of detection of the material; (e) measuring a change in the time elapsed from generating the ultrasonic waves at the surface of the material and return to the surface point of detection of the material, to determine the transit time; and (f) comparing the transit time to predetermined values to determine properties such as, density and the elastic quality of the material.
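The transit-time measurement in steps (e) and (f) can be sketched numerically: simulate the back-wall echo as a delayed copy of the generation pulse and estimate the transit time as the lag maximizing the cross-correlation, then convert it to a material property (here thickness, given an assumed wave speed). All material constants and sampling parameters are illustrative.

```python
import numpy as np

fs = 50e6                      # 50 MHz sampling, illustrative
c = 5900.0                     # m/s, typical longitudinal speed in steel
true_thickness = 0.010         # 10 mm plate, assumed for this sketch

# Simulate the generation pulse and its back-wall echo.
t = np.arange(2048) / fs
probe = (np.exp(-((t - 2e-6) ** 2) / (2 * (0.1e-6) ** 2))
         * np.cos(2 * np.pi * 5e6 * t))
transit = 2 * true_thickness / c               # down to the back wall and back
delay_samples = int(round(transit * fs))
echo = 0.5 * np.roll(probe, delay_samples)     # attenuated, delayed copy

# Estimate the transit time as the lag maximizing the cross-correlation.
lags = np.arange(-len(t) + 1, len(t))
xc = np.correlate(echo, probe, mode="full")
est_transit = lags[np.argmax(xc)] / fs
est_thickness = c * est_transit / 2
```

Comparing `est_transit` (or the derived thickness/velocity) against predetermined values is the final step the claim describes for inferring density and elastic quality.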

  7. Method and equipment of processing radioactive laundry wastes

    International Nuclear Information System (INIS)

    Shirai, Takamori; Suzuki, Takeo; Tabata, Masayuki; Takada, Takao; Yamaguchi, Shin-ichi; Noda, Tetsuya.

    1985-01-01

Purpose: To effectively process the radioactive laundry wastes generated by water-washing after the dry-cleaning of protective clothing worn in nuclear facilities. Method: Dry-cleaning soaps and ionic radioactive materials contained in the radioactive laundry wastes are selectively adsorbed onto adsorbents for decontamination. The adsorbents loaded with dry-cleaning soaps and ionic radioactive materials are then purified by removal of these radioactive materials, and the purified adsorbents are re-used. (Seki, T.)

  8. Classification-based comparison of pre-processing methods for interpretation of mass spectrometry generated clinical datasets

    Directory of Open Access Journals (Sweden)

    Hoefsloot Huub CJ

    2009-05-01

    Full Text Available Abstract Background Mass spectrometry is increasingly being used to discover proteins or protein profiles associated with disease. Experimental design of mass-spectrometry studies has come under close scrutiny and the importance of strict protocols for sample collection is now understood. However, the question of how best to process the large quantities of data generated is still unanswered. Main challenges for the analysis are the choice of proper pre-processing and classification methods. While these two issues have been investigated in isolation, we propose to use the classification of patient samples as a clinically relevant benchmark for the evaluation of pre-processing methods. Results Two in-house generated clinical SELDI-TOF MS datasets are used in this study as an example of high throughput mass-spectrometry data. We perform a systematic comparison of two commonly used pre-processing methods as implemented in Ciphergen ProteinChip Software and in the Cromwell package. With respect to reproducibility, Ciphergen and Cromwell pre-processing are largely comparable. We find that the overlap between peaks detected by either Ciphergen ProteinChip Software or Cromwell is large. This is especially the case for the more stringent peak detection settings. Moreover, similarity of the estimated intensities between matched peaks is high. We evaluate the pre-processing methods using five different classification methods. Classification is done in a double cross-validation protocol using repeated random sampling to obtain an unbiased estimate of classification accuracy. No pre-processing method significantly outperforms the other for all peak detection settings evaluated. Conclusion We use classification of patient samples as a clinically relevant benchmark for the evaluation of pre-processing methods. Both pre-processing methods lead to similar classification results on an ovarian cancer and a Gaucher disease dataset. 
However, the settings for pre-processing…
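The double cross-validation protocol described in this record can be sketched on synthetic data. Everything below is illustrative: a nearest-centroid classifier and a "number of retained peaks" hyperparameter stand in for the study's five classifiers and peak-detection settings, and the data are random numbers, not SELDI-TOF spectra.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for peak intensities: 60 samples x 40 peaks,
# with the first 5 peaks shifted in the "disease" class.
X = rng.normal(size=(60, 40))
y = np.repeat([0, 1], 30)
X[y == 1, :5] += 1.0

def nearest_centroid_fit(X, y):
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def nearest_centroid_predict(model, X, top):
    # 'top' = number of peaks kept, the hyperparameter tuned in the inner loop
    classes = sorted(model)
    d = np.stack([np.linalg.norm(X[:, :top] - model[c][:top], axis=1) for c in classes])
    return np.array(classes)[d.argmin(axis=0)]

def split(n, frac, rng):
    idx = rng.permutation(n)
    cut = int(frac * n)
    return idx[:cut], idx[cut:]

outer_acc = []
for _ in range(20):                      # repeated random sampling (outer loop)
    tr, te = split(len(y), 0.7, rng)
    best_top, best = None, -1.0          # inner loop: tune on training part only
    for top in (2, 5, 10, 40):
        inner_acc = []
        for _ in range(10):
            itr, ite = split(len(tr), 0.7, rng)
            m = nearest_centroid_fit(X[tr][itr], y[tr][itr])
            inner_acc.append((nearest_centroid_predict(m, X[tr][ite], top) == y[tr][ite]).mean())
        if np.mean(inner_acc) > best:
            best, best_top = np.mean(inner_acc), top
    m = nearest_centroid_fit(X[tr], y[tr])
    outer_acc.append((nearest_centroid_predict(m, X[te], best_top) == y[te]).mean())

print(round(float(np.mean(outer_acc)), 2))
```

The point of the inner loop is that the hyperparameter is chosen without ever seeing the outer test split, so the outer accuracy is an unbiased estimate of classification performance.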

  9. Total focusing method with correlation processing of antenna array signals

    Science.gov (United States)

    Kozhemyak, O. A.; Bortalevich, S. I.; Loginov, E. L.; Shinyakov, Y. A.; Sukhorukov, M. P.

    2018-03-01

    The article proposes a method of preliminary correlation processing of the complete set of antenna array signals used in the image reconstruction algorithm. The results of experimental studies of 3D reconstruction of various reflectors, with and without correlation processing, are presented in the article. The software ‘IDealSystem3D’ by IDeal-Technologies was used for the experiments. Copper wires of different diameters located in a water bath were used as reflectors. The use of correlation processing makes it possible to obtain a more accurate reconstruction of the image of the reflectors and to increase the signal-to-noise ratio. The experimental results were processed using an original program, which allows varying the parameters of the antenna array and the sampling frequency.
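A minimal sketch of the delay-and-sum reconstruction underlying the total focusing method, on synthetic full-matrix-capture data for one point reflector. The paper's correlation pre-processing stage is omitted here, and all geometry and signal parameters are invented for the example.

```python
import numpy as np

c = 1500.0                                  # wave speed, m/s (water)
fs = 2e7                                    # sampling rate, Hz
elems = np.linspace(-0.016, 0.016, 16)      # 16-element array, x positions (m)
refl = (0.004, 0.020)                       # reflector at x = 4 mm, z = 20 mm
t = np.arange(2048) / fs

def pulse(tt):                              # short Gaussian-windowed tone burst
    return np.exp(-(tt * 2e6) ** 2) * np.cos(2 * np.pi * 1e6 * tt)

def dist(x0, x, z):
    return np.hypot(x - x0, z)

# Full matrix capture: one A-scan per transmit/receive element pair
fmc = np.zeros((16, 16, t.size))
for i, xt in enumerate(elems):
    for j, xr in enumerate(elems):
        tau = (dist(xt, *refl) + dist(xr, *refl)) / c
        fmc[i, j] = pulse(t - tau)

# TFM: for each pixel, sum every pair's sample at that pixel's round-trip delay
xs = np.linspace(-0.01, 0.01, 41)
zs = np.linspace(0.010, 0.030, 41)
img = np.zeros((zs.size, xs.size))
ii = np.arange(16)
for a, z in enumerate(zs):
    for b, x in enumerate(xs):
        tau = (dist(elems[:, None], x, z) + dist(elems[None, :], x, z)) / c
        idx = np.minimum((tau * fs).astype(int), t.size - 1)
        img[a, b] = abs(fmc[ii[:, None], ii[None, :], idx].sum())

peak = np.unravel_index(img.argmax(), img.shape)
print(xs[peak[1]], zs[peak[0]])             # ≈ (0.004, 0.020)
```

All transmit/receive delays align only at the true reflector position, which is why the image peaks there; correlation pre-processing would be applied to `fmc` before this summation step.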

  10. Prediction of periodically correlated processes by wavelet transform and multivariate methods with applications to climatological data

    Science.gov (United States)

    Ghanbarzadeh, Mitra; Aminghafari, Mina

    2015-05-01

    This article studies the prediction of periodically correlated processes using wavelet transform and multivariate methods, with applications to climatological data. Periodically correlated processes can be reformulated as multivariate stationary processes. Considering this fact, two new prediction methods are proposed. In the first method, we use stepwise regression between the principal components of the multivariate stationary process and past wavelet coefficients of the process to obtain a prediction. In the second method, we propose its multivariate version without a prior principal component analysis. We also study a generalization of the prediction methods that deals with a deterministic trend using exponential smoothing. Finally, we illustrate the performance of the proposed methods on simulated and real climatological data (ozone amounts, flows of a river, solar radiation, and sea levels), compared with the multivariate autoregressive model. The proposed methods give good results, as we expected.
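The reformulation both methods rely on — a periodically correlated series of period T becomes a T-variate stationary series whose "time" index counts periods — can be sketched as follows. A per-component lag-1 regression stands in for the paper's stepwise regression on wavelet coefficients, and the data are simulated.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 12                                   # period of the correlation structure
years = 60

# A periodically correlated series, rewritten as a T-variate stationary one:
# row k holds period k, so the columns are the components of the vector series.
Y = np.zeros((years, T))
scale = 0.2 + 0.1 * np.arange(T) / T     # season-dependent noise level
for k in range(1, years):
    Y[k] = 0.8 * Y[k - 1] + scale * rng.normal(size=T)

# One lag-1 least-squares regression per component, last period held out.
X_fit, y_fit = Y[:-2], Y[1:-1]
beta = np.array([np.linalg.lstsq(X_fit[:, [j]], y_fit[:, j], rcond=None)[0][0]
                 for j in range(T)])

pred = beta * Y[-2]                      # predict the held-out final period
mse = float(np.mean((pred - Y[-1]) ** 2))
print(mse < np.var(Y))                   # well below the unconditional variance
```

Because each column of `Y` is stationary, standard multivariate prediction machinery applies even though the original scalar series is only periodically correlated.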

  11. Method of processing radiation-contaminated organic polymer materials

    International Nuclear Information System (INIS)

    Kobayashi, Yoshii.

    1980-01-01

    Purpose: To process radiation-contaminated organic high polymer materials without evolution of toxic gases, at low temperature and with safety, by a hot-acid immersion process using sulfuric acid and hydrogen peroxide. Method: Less flammable or easily flammable organic polymers contaminated with radioactive substances, particularly with long-lived actinoids, are heated and carbonized in concentrated sulfuric acid. Then, aqueous 30% H2O2 solution is continuously added dropwise as an oxidizing agent until the solution turns colourless. If the carbonization was insufficient, the addition of the H2O2 solution is stopped temporarily and the carbonization is conducted again. Thus, the organic polymers are completely decomposed by wet oxidation. The volume of the organic materials to be discharged is thereby decreased, and the radioactive substances contained are simultaneously concentrated and collected. (Seki, T.)

  12. An Efficient Quality-Related Fault Diagnosis Method for Real-Time Multimode Industrial Process

    Directory of Open Access Journals (Sweden)

    Kaixiang Peng

    2017-01-01

    Full Text Available Focusing on quality-related complex industrial process performance monitoring, a novel multimode process monitoring method is proposed in this paper. Firstly, principal component space clustering is implemented under the guidance of quality variables; through extraction of model tags, clustering information for the original training data can be acquired. Secondly, according to the multimode characteristics of process data, a monitoring model integrating the Gaussian mixture model with total projection to latent structures is built on the covariance description form. The multimode total projection to latent structures (MTPLS) model is the foundation for solving the problem of quality-related monitoring of multimode processes. Then, a comprehensive statistical index is defined, based on the posterior probability, in the Bayesian sense, that a monitored sample belongs to each Gaussian component, and from it a combined index is constructed for process monitoring. Finally, motivated by the application of the traditional contribution plot in fault diagnosis, a gradient contribution rate is applied for analyzing the variation of variable contribution rates along samples. The method enables online fault monitoring and diagnosis for multimode processes. The performance of the whole proposed scheme is verified on a real industrial hot strip mill process (HSMP) and compared with some existing methods.
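A toy version of the posterior-weighted combined index: with two known Gaussian operating modes, each sample's local Mahalanobis-type statistic is weighted by the posterior probability of mode membership. The fixed two-mode parameters and plain Mahalanobis distances below are stand-ins for the fitted Gaussian mixture and MTPLS statistics of the paper.

```python
import numpy as np

# Two known operating modes (Gaussian components) of a 2-D process
mu = np.array([[0.0, 0.0], [5.0, 5.0]])
cov = np.array([np.eye(2), 0.5 * np.eye(2)])
w = np.array([0.5, 0.5])                         # mode priors

def gauss_pdf(x, m, S):
    d = x - m
    return np.exp(-0.5 * d @ np.linalg.solve(S, d)) / (2 * np.pi * np.sqrt(np.linalg.det(S)))

def combined_index(x):
    like = np.array([w[k] * gauss_pdf(x, mu[k], cov[k]) for k in range(2)])
    post = like / like.sum()                     # posterior mode membership
    # local Mahalanobis statistic per mode, combined by posterior weight
    t2 = np.array([(x - mu[k]) @ np.linalg.solve(cov[k], x - mu[k]) for k in range(2)])
    return float(post @ t2)

normal_sample = np.array([0.2, -0.1])            # close to mode 1: small index
faulty_sample = np.array([2.5, 2.5])             # between modes: fits neither
print(combined_index(normal_sample) < combined_index(faulty_sample))
```

The posterior weighting is what makes a single threshold usable across modes: a sample is only flagged when it is unlikely under every mode it could plausibly belong to.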

  13. An Integrated Computational Materials Engineering Method for Woven Carbon Fiber Composites Preforming Process

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Weizhao; Ren, Huaqing; Wang, Zequn; Liu, Wing K.; Chen, Wei; Zeng, Danielle; Su, Xuming; Cao, Jian

    2016-10-19

    An integrated computational materials engineering method is proposed in this paper for analyzing the design and preforming process of woven carbon fiber composites. The goal is to reduce the cost and time needed for the mass production of structural composites. It integrates the simulation methods from the micro-scale to the macro-scale to capture the behavior of the composite material in the preforming process. In this way, the time consuming and high cost physical experiments and prototypes in the development of the manufacturing process can be circumvented. This method contains three parts: the micro-scale representative volume element (RVE) simulation to characterize the material; the metamodeling algorithm to generate the constitutive equations; and the macro-scale preforming simulation to predict the behavior of the composite material during forming. The results show the potential of this approach as a guidance to the design of composite materials and its manufacturing process.
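The metamodeling step in the middle of this chain can be sketched as fitting a cheap surrogate to a handful of expensive micro-scale runs. Here a made-up closed-form "RVE response" and a polynomial fit stand in for the real RVE simulations and the paper's (unspecified) metamodeling algorithm.

```python
import numpy as np

def rve_shear_response(angle_rad):
    """Stand-in for an expensive micro-scale RVE run: shear stress vs. angle."""
    return 5.0 * np.tan(angle_rad) ** 3 + 0.8 * angle_rad

# Run the "RVE" at a small number of design points only
angles = np.linspace(0.0, 0.9, 10)
stress = rve_shear_response(angles)

# Metamodel: a polynomial surrogate the macro-scale preforming simulation
# can query continuously at negligible cost
coeffs = np.polyfit(angles, stress, deg=4)

query = 0.53                                  # an angle not in the design points
surrogate = float(np.polyval(coeffs, query))
truth = float(rve_shear_response(query))
print(abs(surrogate - truth) / truth < 0.05)  # surrogate within 5% of the "RVE"
```

The design choice is the usual ICME trade-off: a few expensive micro-scale evaluations buy a constitutive relation the macro-scale forming simulation can call millions of times.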

  14. Place of modern imaging methods and their influence on the diagnostic process

    International Nuclear Information System (INIS)

    Petkov, D.; Lazarova, I.

    1991-01-01

    The main trends in the development of modern imaging diagnostic methods are presented: increasing the specificity of CT, nuclear magnetic resonance imaging, positron emission tomography, digital subtraction angiography, echography, etc., based on modern technical improvements; objective representation of the physiological and biochemical deviations in particular diseases; interventional radiology; integrated application of different methods; improving the sensitivity and specificity of the methods based on developments in pharmacology (new contrast media, pharmaceuticals influencing the function of the examined organs, etc.); and the possibilities for data compilation and further computerized processing of primary data. Personal experience with the use of these methods in Bulgaria is reported. Attention is also called to the unfavourable impacts of excessive technicization of the diagnostic and therapeutic process in health, deontological, economic and social respects. 15 refs

  15. Physiological justification for using an unconventional method for processing raw material in aquaculture

    Directory of Open Access Journals (Sweden)

    O. Honcharova

    2018-03-01

    Full Text Available Purpose. To study the influence of Spirulina platensis, previously treated by a non-traditional method, on physiological and biochemical processes in the organism of hydrobionts, and to analyze the growth rate of Ukrainian scaly carp and tilapia under the influence of the feeding factor. Methodology. Experimental studies were carried out in the laboratory of biological resources and aquaculture. The experiments included clinical examination of the hydrobionts; control of growth and survival rates, with the results recorded in a working journal; and physiological studies with analysis of morphological and functional blood indices, performed according to generally accepted methods. Findings. The proposed non-traditional method of processing the feed resource promoted the activation of metabolic processes in the hydrobionts and improved fish development indices. During the cultivation of Spirulina platensis, the use of plasma-chemically activated water had a positive effect on the dynamics of development. Originality. This article presents for the first time the results of a positive effect of pretreating a Spirulina platensis culture with plasma-chemically activated water before feeding it to hydrobionts. A positive effect of this method of feeding on the functional status of the fish organism and on adaptation-compensatory mechanisms in ontogenesis has been found. Practical value. The proposed method will make it possible to improve the development rate of hydrobionts and their physiological and biochemical processes, with maximum utilization of the potential of the fish organism at the stages of active growth, and also to reduce the cost of artificial feeds.

  16. Sequential method for the assessment of innovations in computer assisted industrial processes

    International Nuclear Information System (INIS)

    Suarez Antola R.

    1995-01-01

    A sequential method for the assessment of innovations in industrial processes is proposed, using suitable combinations of mathematical modelling and numerical simulation of dynamics. Some advantages and limitations of the proposed method are discussed. tabs

  17. Fast Fourier Transform Pricing Method for Exponential Lévy Processes

    KAUST Repository

    Crocce, Fabian; Happola, Juho; Kiessling, Jonas; Tempone, Raul

    2014-01-01

    We describe a set of partial integro-differential equations (PIDEs) whose solutions represent the prices of European options when the underlying asset is driven by an exponential Lévy process. Exploiting the Lévy-Khintchine formula, we give a Fourier-based method for solving this class of PIDEs. We present a novel L1 error bound for solving a range of PIDEs in asset pricing and use this bound to set parameters for numerical methods.
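The record does not spell out the scheme, but a standard Fourier pricing method of this family is the Carr-Madan FFT. The sketch below prices a European call in the Brownian special case of an exponential Lévy model (where the Lévy-Khintchine exponent is purely Gaussian) and checks it against the Black-Scholes closed form; the parameter choices are illustrative.

```python
import numpy as np
from math import erf, exp, log, sqrt, pi

def bs_call(S0, K, r, sigma, T):
    """Closed-form Black-Scholes call, used as the reference price."""
    d1 = (log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    Phi = lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0)))
    return S0 * Phi(d1) - K * exp(-r * T) * Phi(d2)

def fft_call(S0, K, r, sigma, T, alpha=1.5, N=2**14, eta=0.1):
    """Carr-Madan FFT pricing in the Gaussian (Brownian) Levy case."""
    # characteristic function of ln S_T; for a jump model, only this changes
    phi = lambda u: np.exp(1j * u * (log(S0) + (r - 0.5 * sigma**2) * T)
                           - 0.5 * sigma**2 * u**2 * T)
    v = np.arange(N) * eta
    psi = exp(-r * T) * phi(v - (alpha + 1) * 1j) \
          / (alpha**2 + alpha - v**2 + 1j * (2 * alpha + 1) * v)
    lam = 2 * pi / (N * eta)            # log-strike spacing
    b = 0.5 * N * lam                   # log-strike grid half-width
    weights = np.full(N, eta)           # trapezoid rule
    weights[0] = eta / 2
    fft_in = np.exp(1j * b * v) * psi * weights
    k = -b + lam * np.arange(N)         # log-strike grid
    calls = np.exp(-alpha * k) / pi * np.fft.fft(fft_in).real
    return float(np.interp(log(K), k, calls))

ref = bs_call(100, 100, 0.05, 0.2, 1.0)
print(abs(fft_call(100, 100, 0.05, 0.2, 1.0) - ref) < 1e-2)
```

One FFT yields prices on a whole grid of log-strikes at once, which is the practical appeal of Fourier methods; swapping in a jump model only requires replacing the characteristic function `phi`.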


  19. Numerical methods in image processing for applications in jewellery industry

    OpenAIRE

    Petrla, Martin

    2016-01-01

    The presented thesis deals with a problem from the field of image processing for application in multiple scanning of jewellery stones. The aim is to develop a method for preprocessing and subsequent mathematical registration of images in order to increase the effectiveness and reliability of the output quality control. For these purposes the thesis summarizes the mathematical definition of a digital image as well as the theoretical basis of image registration. It proposes a method adjusting every single image ...

  20. Cathodic processes in high-temperature molten salts for the development of new materials processing methods

    International Nuclear Information System (INIS)

    Schwandt, Carsten

    2017-01-01

    Molten salts play an important role in the processing of a range of commodity materials. This includes the large-scale production of iron, aluminium, magnesium and alkali metals as well as the refining of nuclear fuel materials. This presentation focuses on two more recent concepts in which the cathodic reactions in molten salt electrolytic cells are used to prepare high-value-added materials. Both were developed and advanced at the Department of Materials Science and Metallurgy at the University of Cambridge and are still actively being pursued. One concept is now generally known as the FFC-Cambridge process. The presentation will highlight the optimisation of the process towards high selectivities for tubes or particles, depict a modification of the method to synthesize tin-filled carbon nanomaterials, and illustrate the implementation of a novel type of process control that enables the preparation of gramme quantities of material within a few hours with simple laboratory equipment. Also discussed will be the testing of these materials in lithium ion batteries.

  1. Systematic methods for synthesis and design of sustainable chemical and biochemical processes

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    Chemical and biochemical process design consists of designing the process that can sustainably manufacture an identified chemical product through a chemical or biochemical route. The chemical product tree is potentially very large; starting from a set of basic raw materials (such as petroleum...... for process intensification, sustainable process design, identification of optimal biorefinery models as well as integrated process-control design, and chemical product design. The lecture will present the main concepts, the decomposition based solution approach, the developed methods and tools together...

  2. The Scientific Method and the Creative Process: Implications for the K-6 Classroom

    Science.gov (United States)

    Nichols, Amanda J.; Stephens, April H.

    2013-01-01

    Science and the arts might seem very different, but the processes that both fields use are very similar. The scientific method is a way to explore a problem, form and test a hypothesis, and answer questions. The creative process creates, interprets, and expresses art. Inquiry is at the heart of both of these methods. The purpose of this article is…

  3. The uranium waste fluid processing examination by liquid and liquid extraction method using the emulsion flow method

    International Nuclear Information System (INIS)

    Kanda, Nobuhiro; Daiten, Masaki; Endo, Yuji; Yoshida, Hideaki; Mita, Yutaka; Naganawa, Hirochika; Nagano, Tetsushi; Yanase, Nobuyuki

    2015-03-01

    Spent centrifuges which had been used for the development of uranium enrichment technology are stored in the uranium enrichment facility at the Ningyo-toge Environmental Center, Japan Atomic Energy Agency (JAEA). The centrifuge processing technology is intended to separate the radioactive material adhering to the surfaces of the inner parts of the centrifuges by a wet decontamination method using an ultrasonic bath filled with dilute sulfuric acid and water; processing of the radioactive waste fluid from this decontamination generates a neutralization sediment (sludge). JAEA has been considering whether the sludge processing can be streamlined and reduced by lowering the radioactive concentration of the sludge through removal of uranium from the radioactive waste fluid. As part of these considerations, JAEA has been developing uranium extraction separation using the Emulsion Flow Extraction Method (a theory propounded by the JAEA Nuclear Science and Engineering Center), in close coordination and cooperation between the Nuclear Science and Engineering Center and the Ningyo-toge Environmental Center, since the 2007 fiscal year. This report describes the outline of the application test, using actual waste fluid of dilute sulfuric acid and water, of the examination system developed around the emulsion flow extraction method. (author)

  4. The prioritization and categorization method (PCM) process evaluation at Ericsson : a case study

    NARCIS (Netherlands)

    Ohlsson, Jens; Han, Shengnan; Bouwman, W.A.G.A.

    2017-01-01

    Purpose: The purpose of this paper is to demonstrate and evaluate the prioritization and categorization method (PCM), which facilitates the active participation of process stakeholders (managers, owners, customers) in process assessments. Stakeholders evaluate processes in terms of effectiveness,

  5. 40 CFR 63.115 - Process vent provisions-methods and procedures for process vent group determination.

    Science.gov (United States)

    2010-07-01

    ... this section. (2) The gas volumetric flow rate shall be determined using Method 2, 2A, 2C, or 2D of 40... accepted chemical engineering principles, measurable process parameters, or physical or chemical laws or...)(3) of this section. (i) The vent stream volumetric flow rate (Qs), in standard cubic meters per...

  6. Dental ceramics: a review of new materials and processing methods

    Directory of Open Access Journals (Sweden)

    Lucas Hian da SILVA

    2017-08-01

    Full Text Available Abstract The evolution of computerized systems for the production of dental restorations, associated with the development of novel microstructures for ceramic materials, has caused an important change in the clinical workflow for dentists and technicians, as well as in the treatment options offered to patients. New microstructures have also been developed by the industry in order to offer ceramic and composite materials with optimized properties, i.e., good mechanical properties, appropriate wear behavior and acceptable aesthetic characteristics. The objective of this literature review is to discuss the main advantages and disadvantages of the new ceramic systems and processing methods. The manuscript is divided into five parts: (I) monolithic zirconia restorations; (II) multilayered dental prostheses; (III) new glass-ceramics; (IV) polymer-infiltrated ceramics; and (V) novel processing technologies. Dental ceramics and processing technologies have evolved significantly in the past ten years, with most of the evolution being related to new microstructures and CAD-CAM methods. In addition, a trend towards the use of monolithic restorations has changed the way clinicians produce all-ceramic dental prostheses, since the more aesthetic multilayered restorations unfortunately are more prone to chipping or delamination. Composite materials processed via CAD-CAM have become an interesting option, as they have intermediate properties between ceramics and polymers and are more easily milled and polished.

  7. Statistical process control methods allow the analysis and improvement of anesthesia care.

    Science.gov (United States)

    Fasting, Sigurd; Gisvold, Sven E

    2003-10-01

    Quality aspects of the anesthetic process are reflected in the rate of intraoperative adverse events. The purpose of this report is to illustrate how the quality of the anesthesia process can be analyzed using statistical process control methods, and exemplify how this analysis can be used for quality improvement. We prospectively recorded anesthesia-related data from all anesthetics for five years. The data included intraoperative adverse events, which were graded into four levels, according to severity. We selected four adverse events, representing important quality and safety aspects, for statistical process control analysis. These were: inadequate regional anesthesia, difficult emergence from general anesthesia, intubation difficulties and drug errors. We analyzed the underlying process using 'p-charts' for statistical process control. In 65,170 anesthetics we recorded adverse events in 18.3%; mostly of lesser severity. Control charts were used to define statistically the predictable normal variation in problem rate, and then used as a basis for analysis of the selected problems with the following results: Inadequate plexus anesthesia: stable process, but unacceptably high failure rate; Difficult emergence: unstable process, because of quality improvement efforts; Intubation difficulties: stable process, rate acceptable; Medication errors: methodology not suited because of low rate of errors. By applying statistical process control methods to the analysis of adverse events, we have exemplified how this allows us to determine if a process is stable, whether an intervention is required, and if quality improvement efforts have the desired effect.
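The p-chart analysis used in this study is simple to reproduce: the centre line is the pooled problem rate and the control limits are p ± 3·sqrt(p(1−p)/n) for each subgroup of size n. The monthly counts below are hypothetical, not the study's data.

```python
from math import sqrt

events = [55, 61, 48, 90, 52, 58]      # adverse events per month (hypothetical)
cases  = [300, 310, 290, 305, 295, 300]

p_bar = sum(events) / sum(cases)       # pooled rate = centre line
flags = []
for e, n in zip(events, cases):
    sigma = sqrt(p_bar * (1 - p_bar) / n)
    ucl = p_bar + 3 * sigma            # upper control limit
    lcl = max(0.0, p_bar - 3 * sigma)  # lower control limit
    flags.append(not (lcl <= e / n <= ucl))

print(flags)                           # only the 90/305 month is flagged
```

A point inside the limits reflects common-cause variation of a stable process; a point outside them (here the fourth month) signals special-cause variation worth investigating, which is exactly the distinction the authors use to decide whether an intervention is required.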

  8. Method of optimization of the natural gas refining process

    Energy Technology Data Exchange (ETDEWEB)

    Sadykh-Zade, E.S.; Bagirov, A.A.; Mardakhayev, I.M.; Razamat, M.S.; Tagiyev, V.G.

    1980-01-01

    The SATUM (automatic control system of technical operations) system introduced at the Shatlyk field should assure good quality of gas refining. In order to optimize the natural gas refining processes, an experimental-analytical method is used in compiling the mathematical descriptions. The program, written in Fortran, gives, in addition to the parameters of optimal conditions, information on the yield of condensate and water, the concentration and consumption of DEG, and the composition and characteristics of the gas and condensate. The algorithm for calculating optimum engineering conditions of gas refining is proposed for use in ''advice'' mode, and also for monitoring the progress of the gas refining process.

  9. Methods of assessing functioning of organizational and economic mechanism during innovation process implementation

    Directory of Open Access Journals (Sweden)

    Blinkov Maksim

    2017-01-01

    Full Text Available This paper proposes methods for assessing the efficiency of the organizational and economic mechanism of an industrial enterprise when implementing innovation processes. These methods allow continuous monitoring at all stages of innovation process implementation, lead to a reduction in the costs of innovation activity, and enable maximum use of the creative potential of enterprise personnel. The significance and attractiveness of the method lie in the fact that it can be applied by industrial companies in any market field, regardless of the lifecycle stage of the studied goods, company and/or innovative process, because the composition and number of specific indicators can be adjusted by the working group both before the study and in the course of the company's innovative activities (at any stage of their implementation). The multi-sided approach proposed for assessing the efficiency of the organizational and economic mechanism of an industrial enterprise when implementing innovation processes ensures a full and accurate assessment of the impact of particular factors on the final result.

  10. A fast all-in-one method for automated post-processing of PIV data.

    Science.gov (United States)

    Garcia, Damien

    2011-05-01

    Post-processing of PIV (particle image velocimetry) data typically comprises the following three stages: validation of the raw data, replacement of spurious and missing vectors, and some smoothing. A robust post-processing technique that carries out these steps simultaneously is proposed. The new all-in-one method (DCT-PLS), based on a penalized least squares approach (PLS), combines the use of the discrete cosine transform (DCT) and generalized cross-validation, thus allowing fast unsupervised smoothing of PIV data. The DCT-PLS was compared with conventional methods, including the normalized median test, for post-processing of simulated and experimental raw PIV velocity fields. The DCT-PLS was shown to be more efficient than the usual methods, especially in the presence of clustered outliers. It was also demonstrated that the DCT-PLS can easily deal with a large amount of missing data. Because the proposed algorithm works in any dimension, the DCT-PLS is also suitable for post-processing of volumetric three-component PIV data.
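The penalized least squares problem at the core of the method, including the weighting that handles missing data, can be written down directly: minimise ||W(y − z)||² + s||Dz||² with D the second-difference operator and zero weights marking missing samples. The DCT in DCT-PLS is essentially a fast solver for this same linear system; the 1-D sketch below solves it densely for clarity, with a fixed smoothing parameter instead of the paper's generalized cross-validation choice.

```python
import numpy as np

rng = np.random.default_rng(2)

n = 200
x = np.linspace(0, 4 * np.pi, n)
truth = np.sin(x)
y = truth + 0.3 * rng.normal(size=n)
missing = rng.random(n) < 0.2           # 20% of samples treated as missing
y[missing] = 0.0                        # their values never enter the fit

W = np.diag((~missing).astype(float))   # zero weight at missing points
D = np.diff(np.eye(n), 2, axis=0)       # (n-2) x n second-difference operator
s = 10.0                                # smoothing parameter (GCV-chosen in the paper)

# Normal equations of the penalized least squares problem
z = np.linalg.solve(W + s * D.T @ D, W @ y)

print(np.mean((z - truth) ** 2) < np.mean((y[~missing] - truth[~missing]) ** 2))
```

Because the roughness penalty interpolates across zero-weight points, gaps are filled smoothly for free, which is the property the abstract refers to as handling large amounts of missing data.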


  12. Method for processing powdery radioactive wastes

    International Nuclear Information System (INIS)

    Yasumura, Keijiro; Matsuura, Hiroyuki; Tomita, Toshihide; Nakayama, Yasuyuki.

    1978-01-01

    Purpose: To solidify radioactive wastes easily and safely, at a high reaction speed but without boiling, by impregnating the radioactive wastes with chlorostyrene. Method: Bead-like dried ion exchange resin, powdery ion exchange resin, filter sludges, concentrated dried waste liquor or the like are mixed or impregnated with a chlorostyrene monomer in which a polymerization initiator such as methyl ethyl ketone peroxide or benzoyl peroxide is dissolved. The mixed or impregnated products polymerize to a solid after a predetermined time through a curing reaction, producing solidified radioactive wastes. Since less flammable materials are used, the process has high safety. About 70% of wastes can be incorporated. The solidified products have a strength as high as 300-400 kg/cm² and are suitable for ocean disposal. The products have greater radiation resistance than other plastic solidification products. (Seki, T.)

  13. Processing methods for operation test data of radioactive aerosols monitor based on accumulation techniques

    International Nuclear Information System (INIS)

    Fu Cuiming; Xi Pingping; Ma Yinghao; Tan Linglong; Shen Fu

    2011-01-01

    This article introduces a continuous radioactive aerosol monitor based on accumulation sampling and measurement, and three methods for processing its operation data. The monitoring results are processed by the three methods, which are applied both under natural background conditions and at workplaces of a nuclear facility. How the monitoring results are assessed, and how the detection limit is calculated when using the three different methods, are explained. Moreover, the advantages and disadvantages of the three methods are discussed. (authors)

  14. Surface Nano Structures Manufacture Using Batch Chemical Processing Methods for Tooling Applications

    DEFF Research Database (Denmark)

    Tosello, Guido; Calaon, Matteo; Gavillet, J.

    2011-01-01

    The patterning of large surface areas with nano structures by using chemical batch processes, so as to avoid high-energy-intensive nano machining processes, was investigated. The capability of different surface treatment methods to create micro- and nano-structured adaptable mould inserts for subsequent…

  15. Analytical models approximating individual processes: a validation method.

    Science.gov (United States)

    Favier, C; Degallier, N; Menkès, C E

    2010-12-01

    Upscaling population models from fine to coarse resolutions, in space, time and/or level of description, allows the derivation of fast and tractable models based on a thorough knowledge of individual processes. The validity of such approximations is generally tested only on a limited range of parameter sets. A more general validation test, over a range of parameters, is proposed; this would estimate the error induced by the approximation, using the original model's stochastic variability as a reference. The method is illustrated by three examples taken from the field of epidemics transmitted by vectors that bite in a temporally cyclical pattern. These illustrate the use of the method: to estimate whether an approximation over- or under-fits the original model; to invalidate an approximation; and to rank possible approximations by their quality. As a result, the application of the validation method to this field emphasizes the need to account for the vectors' biology in epidemic prediction models and to validate these against finer-scale models. Copyright © 2010 Elsevier Inc. All rights reserved.
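The validation idea — compare the approximation's error to the original model's own stochastic variability — can be sketched on a toy pair of models: a stochastic Poisson birth process and its deterministic exponential approximation, with a 2-sigma acceptance band. Both models and all parameters are invented for the illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

def stochastic_growth(n0, r, steps, runs, rng):
    """Original fine-scale model: population with Poisson-distributed births."""
    n = np.full(runs, float(n0))
    for _ in range(steps):
        n = n + rng.poisson(r * n)
    return n

def deterministic_growth(n0, r, steps):
    """Coarse approximation: deterministic exponential growth."""
    return n0 * (1.0 + r) ** steps

final = stochastic_growth(10, 0.3, steps=15, runs=2000, rng=rng)
approx = deterministic_growth(10, 0.3, steps=15)

# Approximation error measured in units of the original model's own noise
zscore = abs(approx - final.mean()) / final.std()
print(zscore < 2.0)   # accept: error within the model's stochastic variability
```

The same comparison run over a grid of parameter values (rather than one point, as here) is what turns this into the general validation test the abstract proposes: an approximation is rejected wherever its z-score leaves the acceptance band.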

  16. METHODS FOR ORGANIZATION OF WORKING PROCESS FOR GAS-DIESEL ENGINE

    Directory of Open Access Journals (Sweden)

    G. A. Vershina

    2017-01-01

    Full Text Available Over the past few decades, reduction in pollutant emissions has become one of the main directions for further development of engine technology. The solution of such problems has led to the implementation of catalytic post-treatment systems, new fuel injection technologies, regulated gas-distribution phasing, regulated turbocharger systems and, lately, even systems for variable engine compression ratio. The use of gaseous fuel, in particular the gas-diesel process, may be one of the means to reduce air pollution caused by toxic substances and to meet growing environmental standards and regulations. In this regard, an analysis of methods for the organization of the working process of a gas-diesel engine has been conducted in the paper. The paper describes parameters that influence the nature of the gas-diesel process; it contains graphs of specific total heat consumption versus the ignition portion of diesel fuel, and the dependence of gas-diesel indices on the advance angle of injection of the ignition portion of diesel fuel. A modern fuel system of the gas-diesel engine ГД-243 is demonstrated in the paper. The gas-diesel engine has better environmental characteristics than engines running on diesel fuel or gasoline. According to the European Natural & bio Gas Vehicle Association, a significant reduction in emissions is reached at a 50% substitution level of diesel fuel by gas fuel (methane), and in such a case there is a tendency towards an even more significant emission decrease. In order to ensure widespread application of gaseous fuel in the gas-diesel process, it is necessary to develop a new working process, to improve the fuel equipment, and to enhance the injection strategy and fuel supply control. A method for the organization of the working process of a multi-fuel engine has been proposed on the basis of the performed analysis. A patent application has been submitted.

  17. Method for distributed agent-based non-expert simulation of manufacturing process behavior

    Science.gov (United States)

    Ivezic, Nenad; Potok, Thomas E.

    2004-11-30

    A method for distributed agent based non-expert simulation of manufacturing process behavior on a single-processor computer comprises the steps of: object modeling a manufacturing technique having a plurality of processes; associating a distributed agent with each the process; and, programming each the agent to respond to discrete events corresponding to the manufacturing technique, wherein each discrete event triggers a programmed response. The method can further comprise the step of transmitting the discrete events to each agent in a message loop. In addition, the programming step comprises the step of conditioning each agent to respond to a discrete event selected from the group consisting of a clock tick message, a resources received message, and a request for output production message.
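A minimal single-processor sketch of this scheme: each process is an agent object that reacts to the three discrete events named in the claim (clock tick, resources received, request for output production), dispatched in a message loop. The agent names, cycle times and production rule are invented for the example.

```python
class ProcessAgent:
    """An agent associated with one manufacturing process."""
    def __init__(self, name, cycle_time):
        self.name, self.cycle_time = name, cycle_time
        self.clock, self.stock, self.produced = 0, 0, 0

    def handle(self, event, payload=None):
        # each discrete event triggers a programmed response
        if event == "clock_tick":
            self.clock += 1
        elif event == "resources_received":
            self.stock += payload
        elif event == "request_output":
            # produce one unit if a resource is on hand and a cycle has elapsed
            if self.stock > 0 and self.clock % self.cycle_time == 0:
                self.stock -= 1
                self.produced += 1

agents = [ProcessAgent("milling", 2), ProcessAgent("assembly", 3)]
for a in agents:
    a.handle("resources_received", 5)

for tick in range(1, 13):               # the message loop
    for a in agents:
        a.handle("clock_tick")
        a.handle("request_output")

print([a.produced for a in agents])     # [5, 4]
```

Because all events flow through one loop, the whole multi-process model runs on a single processor with no simulation-specific expertise needed to extend it: adding a process means adding one more agent.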

  18. Energy-saving method for technogenic waste processing.

    Directory of Open Access Journals (Sweden)

    Bayandy Dikhanbaev

    Full Text Available Dumps of the mining-metallurgical complexes of the post-Soviet republics have accumulated a huge amount of technogenic waste products; Kazakhstan alone has stored about 20 billion tons. In the field of technogenic waste treatment there is still no technical solution that makes it a profitable process. Recent global trends have prompted scientists to focus on developing an energy-saving and highly efficient melting unit that can significantly reduce specific fuel consumption. This paper reports the development of a new technological method: a smelt layer of inversion phase. The introduced method is characterized by a combination of ideal stirring and ideal displacement regimes. Using the method of affine modelling, recalculation of the pilot plant's test results to an industrial-scale sample has been carried out. Experiments show that, in comparison with bubbling and boiling layers of smelt, the degree of zinc recovery increases in the layer of inversion phase. This indicates a reduced possibility of new formation of zinc silicates and ferrites from recombined molecules of ZnO, SiO2, and Fe2O3. Calculations show that in an industrial-scale version of the pilot plant the consumption of natural gas is reduced by approximately a factor of two in comparison with a fuming furnace, and the specific fuel consumption is reduced by approximately a factor of four in comparison with a Waelz kiln.

  19. Wet separation processes as method to separate limestone and oil shale

    Science.gov (United States)

    Nurme, Martin; Karu, Veiko

    2015-04-01

    The biggest oil shale industry is located in Estonia. Oil shale is used mainly for electricity generation, shale oil production and cement production. All these processes need oil shale of a certain quality. Oil shale seams contain limestone interlayers, so to use oil shale in production it is necessary to separate oil shale and limestone. A key challenge is to find a separation process that gives the best quality for all product types. Heavy-media separation has typically been used for oil shale; other types of separation processes, such as wet separation and pneumatic separation, have also been tested before. The oil shale industry is now moving towards oil production, and this requires innovative separation methods to ensure fuel quality and to handle changes in quality. Pilot unit tests with the Allmineral ALLJIG have shown that a suitable new approach for oil shale separation can be wet gravity separation, in which pulsating water forms layers of grains according to their density and subsequently separates the heavy material (limestone) from the stratified (oil shale) bed. The main aim of this research is to find a separation process for oil shale such that the products have the highest quality. The expected results can also be used for developing separation processes for phosphorite rock or other deposits where traditional separation processes do not work properly. This research is part of the study Sustainable and environmentally acceptable Oil shale mining No. 3.2.0501.11-0025 http://mi.ttu.ee/etp and the project B36 Extraction and processing of rock with selective methods - http://mi.ttu.ee/separation; http://mi.ttu.ee/miningwaste/

  20. THE ROLE OF QUALITY METHODS IN IMPROVING EDUCATION PROCESS: CASE STUDY

    Directory of Open Access Journals (Sweden)

    Dragan Pavlović

    2014-10-01

    Full Text Available This paper presents a methodology for applying the Lean Six Sigma method to the educational process. After defining the defects that negatively influence the final quality evaluation of higher education and how these defects can be remedied, a Pareto analysis is performed and used to identify the vital few exams that are critical for the faculty's examination results. The next step is a Statistical Process Control (SPC) analysis performed on the exams classified as the vital few in the Pareto analysis. An Ishikawa diagram shows the relation between the considered consequence (a small number of passed exams) and all factors that influence this consequence. Based on the results of implementing the Lean Six Sigma method in the educational process and all suggested improvements, a comparative overview of the Pareto analysis is given for the 2009/2010 and 2012/2013 academic years at the Faculty of Mechanical Engineering, University of Niš.
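The Pareto step described above (ranking exams and keeping the "vital few" that account for roughly 80% of failures) can be sketched as below. The course names and failure counts are illustrative, not from the study.

```python
# Minimal sketch of a Pareto analysis: rank exam courses by the number of
# failed attempts and keep the "vital few" that account for ~80% of all
# failures. Course names and counts are illustrative only.

def pareto_vital_few(failures, threshold=0.8):
    total = sum(failures.values())
    ranked = sorted(failures.items(), key=lambda kv: kv[1], reverse=True)
    vital, cumulative = [], 0
    for course, count in ranked:
        if cumulative / total >= threshold:
            break
        vital.append(course)
        cumulative += count
    return vital

failures = {"Mathematics 2": 120, "Mechanics": 90, "Thermodynamics": 60,
            "Materials": 20, "English": 10}
print(pareto_vital_few(failures))   # the few exams driving most failures
```

These vital-few courses would then be the targets of the SPC analysis and the Ishikawa cause-and-effect study.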

  1. Modelling dynamic processes in a nuclear reactor by state change modal method

    Science.gov (United States)

    Avvakumov, A. V.; Strizhov, V. F.; Vabishchevich, P. N.; Vasilev, A. O.

    2017-12-01

    Modelling of dynamic processes in nuclear reactors is carried out mainly using the multigroup neutron diffusion approximation. The basic model includes a multidimensional set of coupled parabolic equations and ordinary differential equations. Dynamic processes are modelled by a successive change of the reactor states, where the transition from one state to another is considered to occur promptly. In the modal method the approximate solution is represented as an eigenfunction expansion. The numerical-analytical method is based on the use of dominant time-eigenvalues of a group diffusion model taking into account delayed neutrons.
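In hedged notation (symbols illustrative, not the paper's), the eigenfunction expansion underlying such a modal method can be sketched as:

```latex
% Sketch of a modal expansion: the group neutron flux is expanded over the
% dominant time-eigenfunctions of the group diffusion operator.
\phi(\mathbf{r}, t) \approx \sum_{n=1}^{N} a_n \, e^{\lambda_n t} \, \psi_n(\mathbf{r}),
\qquad L \psi_n = \lambda_n M \psi_n ,
```

where the $\lambda_n$ are the dominant time-eigenvalues, the $\psi_n$ the corresponding eigenfunctions of the group diffusion model, and the coefficients $a_n$ are fixed by the state at the start of each interval.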

  2. Application of wavelet analysis to signal processing methods for eddy-current test

    International Nuclear Information System (INIS)

    Chen, G.; Yoneyama, H.; Yamaguchi, A.; Uesugi, N.

    1998-01-01

    This study deals with the application of wavelet analysis to the detection and characterization of defects from eddy-current and ultrasonic testing signals with a low signal-to-noise ratio. Presented in this paper are methods for processing eddy-current testing signals from heat exchanger tubes of a steam generator in a nuclear power plant. The results of processing eddy-current testing signals of tube testpieces with artificial flaws show that flaw signals corrupted by noise and/or non-defect signals can be effectively detected and characterized by using the wavelet methods. (author)
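The principle of wavelet-based denoising used on such low-SNR signals can be illustrated with a toy one-level Haar transform and soft thresholding. This is not the authors' procedure (real analyses use deeper multi-resolution decompositions and tuned wavelets); the signal values and threshold are invented for the example.

```python
import math

# Toy illustration of wavelet denoising: one-level Haar transform,
# soft-thresholding of the detail coefficients, inverse transform.

def haar_forward(x):
    s = [(a + b) / math.sqrt(2) for a, b in zip(x[0::2], x[1::2])]  # approximation
    d = [(a - b) / math.sqrt(2) for a, b in zip(x[0::2], x[1::2])]  # detail
    return s, d

def haar_inverse(s, d):
    x = []
    for a, b in zip(s, d):
        x.append((a + b) / math.sqrt(2))
        x.append((a - b) / math.sqrt(2))
    return x

def soft(v, t):
    """Soft threshold: shrink toward zero, kill small (noise) coefficients."""
    return math.copysign(max(abs(v) - t, 0.0), v)

signal = [0.1, -0.1, 4.0, 4.1, 0.05, -0.05, 0.0, 0.1]  # flaw "bump" + noise
s, d = haar_forward(signal)
d = [soft(v, 0.2) for v in d]          # suppress small details
denoised = haar_inverse(s, d)          # flaw preserved, noise attenuated
```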

  3. Standardization of a method to study the distribution of Americium in purex process

    International Nuclear Information System (INIS)

    Dapolikar, T.T.; Pant, D.K.; Kapur, H.N.; Kumar, Rajendra; Dubey, K.

    2017-01-01

    In the present work the distribution of Americium in the PUREX process is investigated in various process streams. For this purpose a method has been standardized for the determination of Am in process samples. The method involves extraction of Am with associated actinides using 30% TRPO-NPH at 0.3M HNO3, followed by selective stripping of Am from the organic phase into the aqueous phase at 6M HNO3. The assay of the aqueous phase for Am content is carried out by alpha radiometry. The investigation has revealed that 100% of the Am follows the HLLW route. (author)

  4. Comparison of the quasi-static method and the dynamic method for simulating fracture processes in concrete

    Science.gov (United States)

    Liu, J. X.; Deng, S. C.; Liang, N. G.

    2008-02-01

    Concrete is heterogeneous and usually described as a three-phase material in which matrix, aggregate and interface are distinguished. To take this heterogeneity into consideration, the Generalized Beam (GB) lattice model is adopted, which is much more computationally efficient than the beam lattice model. Numerical procedures for both the quasi-static method and the dynamic method are developed to simulate fracture processes in uniaxial tensile tests conducted on a concrete panel. Cases of different loading rates are compared with the quasi-static case. It is found that the inertia effect due to load increase becomes less important and can be ignored as the loading rate decreases, but the inertia effect due to unstable crack propagation remains considerable no matter how low the loading rate is. Therefore, an unrealistic result will be obtained if a fracture process including unstable cracking is simulated by the quasi-static procedure.

  5. Process qualification and control in electron beams--requirements, methods, new concepts and challenges

    International Nuclear Information System (INIS)

    Mittendorfer, J.; Gratzl, F.; Hanis, D.

    2004-01-01

    In this paper the status of process qualification and control in electron beam irradiation is analyzed in terms of requirements, concepts, methods and challenges for a state-of-the-art process control concept for medical device sterilization. Aspects from process qualification to routine process control are described together with the associated process variables. As a case study, the 10 MeV beams at Mediscan GmbH are considered. Process control concepts such as statistical process control (SPC) and a new concept to determine process capability are briefly discussed.

  6. A collaborative processes synchronization method with regard to system crashes and network failures

    NARCIS (Netherlands)

    Wang, Lei; Wombacher, Andreas; Ferreira Pires, Luis; van Sinderen, Marten J.; Chi, Chihung

    2014-01-01

    Processes can synchronize their states by exchanging messages. System crashes and network failures may cause message loss, so that state changes of a process may remain unnoticed by its partner processes, resulting in state inconsistency or deadlocks. In this paper we define a method to transform a

  7. Study of parachute inflation process using fluid–structure interaction method

    Directory of Open Access Journals (Sweden)

    Yu Li

    2014-04-01

    Full Text Available A direct numerical modelling method for parachutes is proposed first, and a model of the star-shaped folded parachute with detailed structures is established. The simplified arbitrary Lagrangian–Eulerian fluid–structure interaction (SALE/FSI) method is used to simulate the inflation process of a folded parachute, and the flow field calculation is mainly based on an operator splitting technique. By using this method, the dynamic variations of related parameters such as the flow field and structure are obtained, and the load jump appearing at the end of the initial inflation stage is captured. Numerical results including opening load, drag characteristics, swinging angle, etc. are well consistent with wind tunnel tests. In addition, this coupled method can provide more detailed space–time information such as geometry shape, structure, motion, and flow field. Compared with the previous inflation time method, this method is a completely theoretical analysis approach that does not rely on empirical coefficients, and it can provide a reference for material selection and performance optimization during parachute design.

  8. Advanced methods for processing ceramics

    Energy Technology Data Exchange (ETDEWEB)

    Carter, W.B. [Georgia Institute of Technology, Atlanta, GA (United States)

    1995-05-01

    Combustion chemical vapor deposition (CCVD) is a flame assisted, open air chemical vapor deposition (CVD) process. The process is capable of producing textured, epitaxial coatings on single crystal substrates using low cost reagents. Combustion chemical vapor deposition is a relatively inexpensive, alternative thin film deposition process with potential to replace conventional coating technologies for certain applications. The goals of this project are to develop the CCVD process to the point that potential industrial applications can be identified and reliably assessed.

  9. Optimal Selection Method of Process Patents for Technology Transfer Using Fuzzy Linguistic Computing

    Directory of Open Access Journals (Sweden)

    Gangfeng Wang

    2014-01-01

    Full Text Available Under the open innovation paradigm, technology transfer of process patents is one of the most important mechanisms for manufacturing companies to implement process innovation and enhance their competitive edge. To achieve promising technology transfers, we need to evaluate the feasibility of process patents and optimally select the most appropriate patent according to the actual manufacturing situation. Hence, this paper proposes an optimal selection method for process patents using multiple criteria decision-making and 2-tuple fuzzy linguistic computing to avoid information loss during evaluation integration. An evaluation index system for the technology transfer feasibility of process patents is designed initially. Then, a fuzzy linguistic computing approach is applied to aggregate the evaluations of criteria weights for each criterion and its corresponding subcriteria. Furthermore, performance ratings for subcriteria and fuzzy aggregated ratings of criteria are calculated. Thus, we obtain the overall technology transfer feasibility of the patent alternatives. Finally, a case study of aeroengine turbine manufacturing is presented to demonstrate the applicability of the proposed method.
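The core trick of 2-tuple fuzzy linguistic computing, aggregating numeric label indices and keeping the symbolic remainder so no information is lost in rounding, can be sketched as below. The linguistic scale and the expert ratings are illustrative, not taken from the paper.

```python
# Minimal sketch of 2-tuple fuzzy linguistic aggregation (Herrera-Martinez
# style): a value beta on the label axis is represented as (label, remainder)
# so that averaging does not lose the fractional part. Scale and ratings
# are illustrative.

SCALE = ["none", "very low", "low", "medium", "high", "very high", "perfect"]

def to_two_tuple(beta):
    """Translate beta in [0, len(SCALE)-1] into (label, symbolic remainder)."""
    i = round(beta)
    return SCALE[i], beta - i

def aggregate(ratings):
    """Aggregate numeric label indices by their mean, then re-translate."""
    beta = sum(ratings) / len(ratings)
    return to_two_tuple(beta)

# three hypothetical experts rate a patent criterion on the 7-label scale
label, alpha = aggregate([4, 5, 4])
# -> ("high", +1/3): between "high" and "very high", remainder preserved
```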

  10. Method and apparatus for surface characterization and process control utilizing radiation from desorbed particles

    International Nuclear Information System (INIS)

    Feldman, L.C.; Kraus, J.S.; Tolk, N.H.; Traum, M.M.; Tully, J.C.

    1983-01-01

    Emission of characteristic electromagnetic radiation in the infrared, visible, or UV from excited particles, typically ions, molecules, or neutral atoms, desorbed from solid surfaces by an incident beam of low-momentum probe radiation has been observed. Disclosed is a method for characterizing solid surfaces based on the observed effect, with low-momentum probe radiation consisting of electrons or photons. Further disclosed is a method for controlling manufacturing processes that is also based on the observed effect. The latter method can, for instance, be advantageously applied in integrated circuit-, integrated optics-, and magnetic bubble device manufacture. Specific examples of applications of the method are registering of masks, control of a direct-writing processing beam, end-point detection in etching, and control of a processing beam for laser- or electron-beam annealing or ion implantation

  11. Applying Process Improvement Methods to Clinical and Translational Research: Conceptual Framework and Case Examples.

    Science.gov (United States)

    Daudelin, Denise H; Selker, Harry P; Leslie, Laurel K

    2015-12-01

    There is growing appreciation that process improvement holds promise for improving quality and efficiency across the translational research continuum but frameworks for such programs are not often described. The purpose of this paper is to present a framework and case examples of a Research Process Improvement Program implemented at Tufts CTSI. To promote research process improvement, we developed online training seminars, workshops, and in-person consultation models to describe core process improvement principles and methods, demonstrate the use of improvement tools, and illustrate the application of these methods in case examples. We implemented these methods, as well as relational coordination theory, with junior researchers, pilot funding awardees, our CTRC, and CTSI resource and service providers. The program focuses on capacity building to address common process problems and quality gaps that threaten the efficient, timely and successful completion of clinical and translational studies. © 2015 The Authors. Clinical and Translational Science published by Wiley Periodicals, Inc.

  12. An Efficient Method to Search Real-Time Bulk Data for an Information Processing System

    International Nuclear Information System (INIS)

    Kim, Seong Jin; Kim, Jong Myung; Suh, Yong Suk; Keum, Jong Yong; Park, Heui Youn

    2005-01-01

    The Man Machine Interface System (MMIS) of the System-integrated Modular Advanced ReacTor (SMART) is designed with fully digitalized features. The Information Processing System (IPS) of the MMIS acquires and processes plant data from other systems. In addition, the IPS provides plant operation information to operators in the control room. The IPS is required to process bulk data in real time, so it is necessary to consider a special processing method with regard to flexibility and performance, because more than a few thousand plant information points converge on the IPS. Among other things, the time spent searching through the bulk data consumes much more than the other processing times. Thus, this paper explores an efficient method for the search and examines its feasibility
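The gain available from an efficient search structure can be illustrated generically (this is not the paper's method, just the standard contrast between an O(N) linear scan and an O(log N) binary search over sorted identifiers; the IDs are invented):

```python
import bisect

# Illustrative comparison: a linear scan over N plant data points costs O(N)
# per lookup, while keeping the point identifiers sorted allows O(log N)
# binary search. Identifiers are illustrative.

ids = sorted(range(0, 100000, 7))          # sorted plant-point identifiers

def linear_search(ids, target):
    for i, v in enumerate(ids):
        if v == target:
            return i
    return -1

def binary_search(ids, target):
    i = bisect.bisect_left(ids, target)
    return i if i < len(ids) and ids[i] == target else -1

assert linear_search(ids, 700) == binary_search(ids, 700)
```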

  13. Method for enhanced control of welding processes

    Science.gov (United States)

    Sheaffer, Donald A.; Renzi, Ronald F.; Tung, David M.; Schroder, Kevin

    2000-01-01

    Method and system for producing high quality welds in welding processes, in general, and gas tungsten arc (GTA) welding, in particular by controlling weld penetration. Light emitted from a weld pool is collected from the backside of a workpiece by optical means during welding and transmitted to a digital video camera for further processing, after the emitted light is first passed through a short wavelength pass filter to remove infrared radiation. By filtering out the infrared component of the light emitted from the backside weld pool image, the present invention provides for the accurate determination of the weld pool boundary. Data from the digital camera is fed to an imaging board which focuses on a 100×100 pixel portion of the image. The board performs a thresholding operation and provides this information to a digital signal processor to compute the backside weld pool dimensions and area. This information is used by a control system, in a dynamic feedback mode, to automatically adjust appropriate parameters of a welding system, such as the welding current, to control weld penetration and thus, create a uniform weld bead and high quality weld.
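The thresholding-and-area step described above can be sketched as follows. A 6×6 patch stands in for the 100×100 pixel region; the pixel values and threshold are illustrative.

```python
# Sketch of the thresholding step: binarise a small grayscale patch of the
# backside weld pool image and estimate the pool area as the count of
# bright pixels. Values and threshold are illustrative.

def weld_pool_area(image, threshold):
    binary = [[1 if px >= threshold else 0 for px in row] for row in image]
    return sum(sum(row) for row in binary)

image = [
    [10, 12, 11, 10, 12, 10],
    [11, 90, 95, 92, 12, 11],
    [10, 93, 99, 96, 90, 10],
    [12, 91, 97, 94, 11, 12],
    [10, 12, 90, 11, 10, 11],
    [11, 10, 12, 10, 12, 10],
]
area = weld_pool_area(image, threshold=80)   # pixels brighter than threshold
# 'area' is what the feedback controller compares against the target size
# when adjusting the welding current
```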

  14. Study on highly efficient seismic data acquisition and processing methods based on sparsity constraint

    Science.gov (United States)

    Wang, H.; Chen, S.; Tao, C.; Qiu, L.

    2017-12-01

    High-density, high-fold and wide-azimuth seismic data acquisition methods are widely used to address increasingly sophisticated exploration targets, but acquisition periods are becoming longer and acquisition costs higher. We carry out a study of highly efficient seismic data acquisition and processing methods based on sparse representation theory (or compressed sensing theory), and achieve some innovative results. The theoretical principles of highly efficient acquisition and processing are studied. We first present a sparse representation theory based on the wave equation. Then we study highly efficient seismic sampling methods and present an optimized piecewise-random sampling method based on sparsity prior information. Finally, a reconstruction strategy with the sparsity constraint is developed; a two-step recovery approach combining a sparsity-promoting method and the hyperbolic Radon transform is also put forward. These three aspects constitute the enhanced theory of highly efficient seismic data acquisition. The specific implementation strategies of highly efficient acquisition and processing are studied according to the theory expounded above. Firstly, we propose a highly efficient acquisition network designing method with the help of the optimized piecewise-random sampling method. Secondly, we propose two types of highly efficient seismic data acquisition methods based on (1) single sources and (2) blended (or simultaneous) sources. Thirdly, the reconstruction procedures corresponding to these two types of acquisition methods are proposed to obtain the seismic data on the regular acquisition network. The impact of blended shooting on the imaging result is discussed. In the end, we implement numerical tests based on the Marmousi model.
The achieved results show: (1) the theoretical framework of highly efficient seismic data acquisition and processing
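The piecewise-random sampling idea can be illustrated generically: split the survey line into equal segments and draw one receiver position at random inside each, which bounds the largest gap between receivers (a useful property for sparse reconstruction). This sketch is an assumption about the general technique, not the authors' optimized design; all sizes are illustrative.

```python
import random

# Sketch of piecewise-random sampling: one random pick per equal segment.
# Compared with fully random sampling, the largest receiver gap is bounded,
# which helps sparsity-constrained reconstruction. Sizes are illustrative.

def piecewise_random_sampling(n_positions, n_segments, rng):
    seg = n_positions // n_segments
    picks = [rng.randrange(k * seg, (k + 1) * seg) for k in range(n_segments)]
    return sorted(picks)

rng = random.Random(0)                 # seeded for repeatability
picks = piecewise_random_sampling(n_positions=120, n_segments=12, rng=rng)
max_gap = max(b - a for a, b in zip(picks, picks[1:]))
# with one pick per 10-sample segment, no gap can exceed 2*10 - 1 = 19
```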

  15. Housing decision making methods for initiation development phase process

    Science.gov (United States)

    Zainal, Rozlin; Kasim, Narimah; Sarpin, Norliana; Wee, Seow Ta; Shamsudin, Zarina

    2017-10-01

    Late delivery and sick housing project problems have been attributed to poor decision making. These problems stem from housing developers who prefer to create their own approaches based on their experience and expertise, taking the simplest route of just applying the available standards and rules in decision making. This paper seeks to identify the decision-making methods for housing development at the initiation phase in Malaysia. The research used the Delphi method with a questionnaire survey involving 50 developers as samples in the primary data-collection stage. However, only 34 developers contributed to the second stage of the information-gathering process, and at the last stage only 12 developers were left for the final data-collection process. The findings affirm that Malaysian developers prefer to make their investment decisions based on simple interpolation of historical data and use simple statistical or mathematical techniques to produce the required reports. It is suggested that they tend to skip several important decision-making functions at the primary development stage. These shortcomings were mainly due to time and financial constraints and the lack of statistical or mathematical expertise among the professional and management groups in the developer organisations.

  16. Apparatus and method for plasma processing of SRF cavities

    Science.gov (United States)

    Upadhyay, J.; Im, Do; Peshl, J.; Bašović, M.; Popović, S.; Valente-Feliciano, A.-M.; Phillips, L.; Vušković, L.

    2016-05-01

    An apparatus and a method are described for plasma etching of the inner surface of superconducting radio frequency (SRF) cavities. Accelerator SRF cavities are formed into a variable-diameter cylindrical structure made of bulk niobium, for resonant generation of the particle accelerating field. The etch rate non-uniformity due to depletion of the radicals has been overcome by the simultaneous movement of the gas flow inlet and the inner electrode. An effective shape of the inner electrode to reduce the plasma asymmetry for the coaxial cylindrical rf plasma reactor is determined and implemented in the cavity processing method. The processing was accomplished by moving axially the inner electrode and the gas flow inlet in a step-wise way to establish segmented plasma columns. The test structure was a pillbox cavity made of steel of similar dimension to the standard SRF cavity. This was adopted to experimentally verify the plasma surface reaction on cylindrical structures with variable diameter using the segmented plasma generation approach. The pill box cavity is filled with niobium ring- and disk-type samples and the etch rate of these samples was measured.

  17. Waste processing method

    International Nuclear Information System (INIS)

    Furukawa, Osamu; Shibata, Minoru.

    1996-01-01

    X-rays are irradiated from a predetermined direction onto solid wastes containing radioactive isotopes, packed in a bag, before the bag is charged into the inlet of an incinerator. Most of the waste is burnable material such as plastic test tubes and papers. Glasses such as chemical bottles and metals such as lead plates for radiation shielding are contained as a portion of the wastes. The X-rays have an intensity capable of discriminating metals and glasses from burnable materials. Irradiation images formed on the X-ray receiving portion are processed, and the total number of picture elements for which the gradation of the light-receiving portion lies within the range characteristic of metal is counted on the image. Then, bags having a total number of such picture elements not less than a predetermined number are separated from bags having fewer. Similar processing is conducted for glasses. With these procedures, bags containing lead and glasses not suitable for incineration are separated from bags not containing them, thereby preventing a loss of operating efficiency of the incinerator. (I.N.)

  18. Variational methods for high-order multiphoton processes

    International Nuclear Information System (INIS)

    Gao, B.; Pan, C.; Liu, C.; Starace, A.F.

    1990-01-01

    Methods for applying the variationally stable procedure for Nth-order perturbative transition matrix elements of Gao and Starace [Phys. Rev. Lett. 61, 404 (1988); Phys. Rev. A 39, 4550 (1989)] to multiphoton processes involving systems other than atomic H are presented. Three specific cases are discussed: one-electron ions or atoms in which the electron--ion interaction is described by a central potential; two-electron ions or atoms in which the electronic states are described by the adiabatic hyperspherical representation; and closed-shell ions or atoms in which the electronic states are described by the multiconfiguration Hartree--Fock representation. Applications are made to the dynamic polarizability of He and the two-photon ionization cross section of Ar

  19. Comparison of Statistical Post-Processing Methods for Probabilistic Wind Speed Forecasting

    Science.gov (United States)

    Han, Keunhee; Choi, JunTae; Kim, Chansoo

    2018-02-01

    In this study, statistical post-processing methods that produce bias-corrected and probabilistic forecasts of wind speed measured in PyeongChang, which is scheduled to host the 2018 Winter Olympics, are compared and analyzed to provide more accurate weather information. The six post-processing methods used in this study are as follows: mean bias-corrected forecast, mean and variance bias-corrected forecast, decaying averaging forecast, mean absolute bias-corrected forecast, and the alternative implementations of ensemble model output statistics (EMOS) and Bayesian model averaging (BMA) models, namely EMOS and BMA exchangeable models (which assume exchangeable ensemble members) and simplified versions of the EMOS and BMA models. Observations of wind speed were obtained from the 26 stations in PyeongChang, and 51 ensemble member forecasts derived from the European Centre for Medium-Range Weather Forecasts (ECMWF Directorate, 2012) were obtained between 1 May 2013 and 18 March 2016. Prior to applying the post-processing methods, a reliability analysis was conducted using rank histograms to identify the statistical consistency between the ensemble forecasts and the corresponding observations. Based on the results of our study, we found that the prediction skills of the probabilistic EMOS and BMA forecasts were superior to the bias-corrected forecasts in terms of deterministic prediction, whereas in probabilistic prediction the BMA models showed better prediction skill than EMOS. Even though the simplified version of the BMA model exhibited the best prediction skill among the six methods, the results showed that the differences in prediction skill between the versions of EMOS and BMA were negligible.
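Two of the simpler methods listed above, the mean bias-corrected forecast and the decaying-average bias estimate, can be sketched as below. The wind-speed numbers and the decay weight are illustrative, not from the study.

```python
# Sketch of two simple post-processing methods: a mean bias-corrected
# forecast and a decaying-average bias estimate (weight w on the newest
# error). Wind-speed numbers are illustrative.

def mean_bias_corrected(forecasts, observations, new_forecast):
    bias = sum(f - o for f, o in zip(forecasts, observations)) / len(forecasts)
    return new_forecast - bias

def decaying_average_bias(errors, w=0.1):
    bias = 0.0
    for e in errors:          # newest error last; older errors decay
        bias = (1 - w) * bias + w * e
    return bias

past_f = [6.0, 5.5, 7.0, 6.5]     # ensemble-mean wind forecasts (m/s)
past_o = [5.0, 5.0, 6.0, 5.5]     # observed wind speeds (m/s)
corrected = mean_bias_corrected(past_f, past_o, new_forecast=6.2)
# mean bias = +0.875 m/s, so the 6.2 m/s forecast is lowered to 5.325 m/s
```

EMOS and BMA go further by fitting a full predictive distribution rather than shifting a point forecast, which is why they score better probabilistically.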

  20. Nuclear pulse signal processing techniques based on blind deconvolution method

    International Nuclear Information System (INIS)

    Hong Pengfei; Yang Lei; Qi Zhong; Meng Xiangting; Fu Yanyan; Li Dongcang

    2012-01-01

    This article presents a method for the measurement and analysis of nuclear pulse signals. An FPGA controls a high-speed ADC to measure the nuclear radiation signals and drives high-speed USB transmission in Slave FIFO mode, while LabVIEW is used for online data processing and display. The blind deconvolution method is used to remove pulse pile-up from the acquired signal and to restore the nuclear pulse signal. Real-time measurements at high transmission speed demonstrate the advantages of the method. (authors)
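The deconvolution principle behind pile-up removal can be illustrated in a simplified, non-blind form. The paper uses blind deconvolution (the detector response is estimated, not known); here, as a labeled simplification, the response is assumed to be a known single-exponential tail, which makes the inverse filter a one-line recursion. All numbers are illustrative.

```python
import math

# Simplified (non-blind) illustration: with a known exponential detector
# response y[n] = x[n] + a*y[n-1], the inverse filter x[n] = y[n] - a*y[n-1]
# recovers the underlying impulse train even when pulses pile up.

def exp_convolve(impulses, a):
    y, acc = [], 0.0
    for x in impulses:
        acc = x + a * acc
        y.append(acc)
    return y

def exp_deconvolve(y, a):
    return [y[0]] + [y[n] - a * y[n - 1] for n in range(1, len(y))]

a = math.exp(-1.0 / 5.0)                  # decay per sample, tau = 5 samples
impulses = [0, 3.0, 0, 0, 2.0, 0, 0, 0]   # two pulses close enough to pile up
piled = exp_convolve(impulses, a)
recovered = exp_deconvolve(piled, a)      # equals impulses up to rounding
```

Blind deconvolution replaces the known `a` with an estimate obtained from the data itself, which is the harder problem the paper addresses.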

  1. Evaluation of silage-fed biogas process performance using microbiological and kinetic methods

    Energy Technology Data Exchange (ETDEWEB)

    Jarvis, Aa

    1996-10-01

    In this study, different kinetic and microbiological methods were used to evaluate the growth and activity of key groups of bacteria degrading ley silage in one-phase and two-phase biogas processes. Emphasis was placed on studying the dynamic behaviour of different trophic groups resulting from the initiation of liquid recirculation in the processes. The microbiological methods included microscopy and most probable number (MPN) counts with different substrates. The kinetic methods included measurements of specific methanogenic activity (SMA) with acetate and H2/CO2 as substrates, batch assays with trace element additions and measurement of conversion rates of mannitol and lactate in the digesters. In general, the initiation of liquid recirculation at first promoted the growth and/or activity of several trophic groups of bacteria, such as butyrate and propionate degraders and acetotrophic and hydrogenotrophic methanogens in the liquefaction/acidogenesis reactors of the two-phase processes. This was probably mainly due to the increased pH. However, after some time of liquid recirculation, an inhibition of some bacterial groups occurred, such as propionate degraders and methanogens in the methanogenic reactors of two-phase processes. This was probably due to increased concentrations of salts and free ammonia. The batch assays proved to be valuable tools in process optimization by the addition of trace elements. Here, the addition of cobalt significantly increased methane production from acetate. In this study, a more comprehensive understanding of the process behaviour in response to the initiation of liquid recirculation was achieved which could not have been obtained by only monitoring routine parameters such as pH, methane production and concentrations of organic acids and salts. 120 refs, 4 figs, 1 tab

  2. Effect of Processing Methods on Nutrient Contents of Six Sweet ...

    African Journals Online (AJOL)

    in rural communities and was often given low priority. Currently ... (p≤0.05) differences between varieties in protein, fat, reducing sugars, carbohydrates, total carotenoids, calcium, iron ... maturity (about 5 months, average maturity rate for ... 630-12) (method. 968.08). ..... processing sweet potato by either boiling, roasting or.

  3. Combined expert system/neural networks method for process fault diagnosis

    Science.gov (United States)

    Reifman, Jaques; Wei, Thomas Y. C.

    1995-01-01

    A two-level hierarchical approach for process fault diagnosis of an operating system employs a function-oriented approach at a first level and a component characteristic-oriented approach at a second level, where the decision-making procedure is structured in order of decreasing intelligence with increasing precision. At the first level, the diagnostic method is general and has knowledge of the overall process including a wide variety of plant transients and the functional behavior of the process components. An expert system classifies malfunctions by function to narrow the diagnostic focus to a particular set of possible faulty components that could be responsible for the detected functional misbehavior of the operating system. At the second level, the diagnostic method limits its scope to component malfunctions, using more detailed knowledge of component characteristics. Trained artificial neural networks are used to further narrow the diagnosis and to uniquely identify the faulty component by classifying the abnormal condition data as a failure of one of the hypothesized components through component characteristics. Once an anomaly is detected, the hierarchical structure is used to successively narrow the diagnostic focus from a function misbehavior, i.e., a function oriented approach, until the fault can be determined, i.e., a component characteristic-oriented approach.

  4. Combined expert system/neural networks method for process fault diagnosis

    Science.gov (United States)

    Reifman, J.; Wei, T.Y.C.

    1995-08-15

    A two-level hierarchical approach for process fault diagnosis of an operating system employs a function-oriented approach at a first level and a component characteristic-oriented approach at a second level, where the decision-making procedure is structured in order of decreasing intelligence with increasing precision. At the first level, the diagnostic method is general and has knowledge of the overall process including a wide variety of plant transients and the functional behavior of the process components. An expert system classifies malfunctions by function to narrow the diagnostic focus to a particular set of possible faulty components that could be responsible for the detected functional misbehavior of the operating system. At the second level, the diagnostic method limits its scope to component malfunctions, using more detailed knowledge of component characteristics. Trained artificial neural networks are used to further narrow the diagnosis and to uniquely identify the faulty component by classifying the abnormal condition data as a failure of one of the hypothesized components through component characteristics. Once an anomaly is detected, the hierarchical structure is used to successively narrow the diagnostic focus from a function misbehavior, i.e., a function oriented approach, until the fault can be determined, i.e., a component characteristic-oriented approach. 9 figs.
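
    The two-level narrowing described in these records can be sketched in miniature. In the illustrative Python fragment below (not the authors' system), a rule table stands in for the function-oriented expert system and a nearest-centroid classifier stands in for the trained neural networks; all symptom names, component names and signatures are invented:

```python
# Level 1: function-oriented knowledge maps a functional misbehavior
# to the set of components that could be responsible for it.
FUNCTION_RULES = {
    "low_coolant_flow": ["pump", "valve"],
    "high_outlet_temp": ["heat_exchanger", "temp_sensor"],
}

# Level 2: component-characteristic knowledge. A trained neural network
# would classify the abnormal-condition data; a nearest-centroid
# classifier over per-component feature signatures plays that role here.
SIGNATURES = {
    "pump":           [0.9, 0.1],
    "valve":          [0.2, 0.8],
    "heat_exchanger": [0.7, 0.6],
    "temp_sensor":    [0.1, 0.2],
}

def diagnose(symptom, features):
    candidates = FUNCTION_RULES[symptom]       # narrow the focus by function
    def distance(component):
        return sum((a - b) ** 2
                   for a, b in zip(SIGNATURES[component], features))
    return min(candidates, key=distance)       # identify the faulty component

print(diagnose("low_coolant_flow", [0.85, 0.15]))  # -> pump
```

    Even at this toy scale the point of the hierarchy is visible: level 1 never compares the observation against components outside the implicated function, so level 2 only has to discriminate within a small candidate set.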

  5. Characterizing Dynamic Walking Patterns and Detecting Falls with Wearable Sensors Using Gaussian Process Methods

    Directory of Open Access Journals (Sweden)

    Taehwan Kim

    2017-05-01

    Full Text Available By incorporating a growing number of sensors and adopting machine learning technologies, wearable devices have recently become a prominent health care application domain. Among the related research topics in this field, one of the most important issues is detecting falls while walking. Since such falls may lead to serious injuries, automatically and promptly detecting them during daily use of smartphones and/or smartwatches is a particular need. In this paper, we investigate the use of Gaussian process (GP) methods for characterizing dynamic walking patterns and detecting falls while walking with built-in wearable sensors in smartphones and/or smartwatches. For the task of characterizing dynamic walking patterns in a low-dimensional latent feature space, we propose a novel approach called the auto-encoded Gaussian process dynamical model, in which we combine a GP-based state space modeling method with a nonlinear dimensionality reduction method in a unique manner. GP methods are well suited to this task because one of their most important strengths is the capability to handle uncertainty in the model parameters. For detecting falls while walking, we propose to recycle the latent samples generated in training the auto-encoded Gaussian process dynamical model for GP-based novelty detection, which leads to an efficient and seamless solution to the detection task. Experimental results show that the combined use of these GP-based methods yields promising results for characterizing dynamic walking patterns and detecting falls while walking with wearable sensors.
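
    As a rough illustration of the GP-based novelty detection step (not the paper's auto-encoded GPDM), the sketch below fits plain GP regression to samples of a "normal" walking-like signal and flags a test observation whose standardized residual under the GP posterior exceeds a threshold. The RBF kernel, noise level and threshold are arbitrary choices:

```python
import math

def rbf(a, b, length_scale=1.0):
    # squared-exponential (RBF) kernel
    return math.exp(-0.5 * ((a - b) / length_scale) ** 2)

def solve(A, b):
    # solve A x = b by Gaussian elimination with partial pivoting
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c]
                              for c in range(r + 1, n))) / M[r][r]
    return x

def gp_novelty(train_x, train_y, test_x, test_y, noise=1e-2, thresh=3.0):
    # flag test points whose standardized residual under the GP posterior
    # (fitted to "normal" walking data) exceeds the threshold
    n = len(train_x)
    K = [[rbf(train_x[i], train_x[j]) + (noise if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    alpha = solve(K, train_y)
    flags = []
    for x, y in zip(test_x, test_y):
        k = [rbf(x, xi) for xi in train_x]
        mean = sum(k[i] * alpha[i] for i in range(n))
        var = rbf(x, x) + noise - sum(k[i] * v
                                      for i, v in enumerate(solve(K, k)))
        z = abs(y - mean) / math.sqrt(max(var, 1e-12))
        flags.append(z > thresh)
    return flags

walk_x = [0.3 * i for i in range(20)]
walk_y = [math.sin(x) for x in walk_x]   # stands in for a gait signal
print(gp_novelty(walk_x, walk_y,
                 [3.0, 3.0], [math.sin(3.0), math.sin(3.0) + 1.0]))
```

    The second test point, offset far from the learned gait pattern, is the one flagged; this mirrors the idea of reusing samples from the trained model as the "normal" reference for novelty detection.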

  6. A new method of knowledge processing for equipment diagnosis of nuclear power plants

    International Nuclear Information System (INIS)

    Fujii, M.; Fukumoto, A.; Tai, I.; Morioka, T.

    1987-01-01

    In this work, the authors complete the development of a new knowledge processing method and representation for equipment diagnosis of nuclear power plants and evaluate its functions by applying it to the maintenance and diagnosis support system of the reactor instrumentation. This knowledge processing method is based on the Cause Generation and Checking concept and performs well not only in the diagnosis function but also in the man-machine interfacing function. The maintenance and diagnosis support system based on this method enables users to diagnose, to a considerable extent, various phenomena occurring in a target piece of equipment by consulting the system, even if they do not have expert knowledge. With this system, it becomes easy for operators or plant engineers to take immediate action to counteract an abnormality. The maintainability of the equipment is improved, and the MTTR (Mean Time To Repair) is expected to be shorter. This new knowledge processing method proves well suited to fault diagnosis of the equipment of nuclear power plants

  7. Methods, media and systems for managing a distributed application running in a plurality of digital processing devices

    Science.gov (United States)

    Laadan, Oren; Nieh, Jason; Phung, Dan

    2012-10-02

    Methods, media and systems for managing a distributed application running in a plurality of digital processing devices are provided. In some embodiments, a method includes running one or more processes associated with the distributed application in virtualized operating system environments on a plurality of digital processing devices, suspending the one or more processes, and saving network state information relating to network connections among the one or more processes. The method further includes storing process information relating to the one or more processes, recreating the network connections using the saved network state information, and restarting the one or more processes using the stored process information.
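
    The suspend/save/restart cycle in the claim can be mimicked at toy scale. The sketch below is illustrative only: `Worker`, `suspend` and `resume` are invented names, and real checkpoint/restart of virtualized OS environments is far more involved. It serializes process state plus a stand-in for the saved connection graph, then resumes execution from the checkpoint:

```python
import os
import pickle
import tempfile

class Worker:
    """Toy stand-in for one process of a distributed application."""
    def __init__(self, name, peers=()):
        self.name = name
        self.peers = list(peers)  # stands in for saved network-connection state
        self.step = 0             # stands in for arbitrary process state

    def run(self, steps):
        for _ in range(steps):
            self.step += 1

def suspend(workers, path):
    # "suspend the processes": save process and network state in one checkpoint
    with open(path, "wb") as f:
        pickle.dump(workers, f)

def resume(path):
    # "restart the processes" from the stored state; a real system would also
    # recreate the network connections recorded in each worker's peer list
    with open(path, "rb") as f:
        return pickle.load(f)

ckpt = os.path.join(tempfile.mkdtemp(), "app.ckpt")
app = [Worker("a", peers=["b"]), Worker("b", peers=["a"])]
for w in app:
    w.run(3)
suspend(app, ckpt)
restored = resume(ckpt)
for w in restored:
    w.run(2)
print([w.step for w in restored])  # -> [5, 5]
```

    The restored workers continue counting from where the suspended ones stopped, which is the essence of the claimed method's migration/restart capability.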

  8. Method of volume-reducing processing for radioactive wastes

    International Nuclear Information System (INIS)

    Sato, Koei; Yamauchi, Noriyuki; Hirayama, Toshihiko.

    1985-01-01

    Purpose: To process the products of radioactive liquid wastes and burnable solid wastes produced from nuclear facilities into stable solidification products by heat melting. Method: First, glass fiber wastes from contaminated air filters are charged into a melting furnace. Then, waste products obtained through drying, sintering, incineration, etc. are mixed with a proper amount of glass fibers and charged into the melting furnace. The charge is heated to the temperature at which the glass fibers melt; the burnable materials burn out, yielding a highly volume-reduced product. When the product is further heated toward the melting point of the metals or metal oxides, which is higher than that of the glass fibers, the glass fibers and the metals or metal oxides fuse to each other and combine at the molecular level into a more stable product. The products are excellent in strength, stability, durability and leaching resistance at ambient temperature. (Kamimura, M.)

  9. Recent Advances in Computational Methods for Nuclear Magnetic Resonance Data Processing

    KAUST Repository

    Gao, Xin

    2013-01-11

    Although three-dimensional protein structure determination using nuclear magnetic resonance (NMR) spectroscopy is a computationally costly and tedious process that would benefit from advanced computational techniques, it has not garnered much research attention from specialists in bioinformatics and computational biology. In this paper, we review recent advances in computational methods for NMR protein structure determination. We summarize the advantages of and bottlenecks in the existing methods and outline some open problems in the field. We also discuss current trends in NMR technology development and suggest directions for research on future computational methods for NMR.

  10. Method of processing radioactive metal wastes

    International Nuclear Information System (INIS)

    Inoue, Yoichi; Kitagawa, Kazuo; Tsuzura, Katsuhiko.

    1980-01-01

    Purpose: To enable long-term, safe storage of radioactive metal wastes, such as spent fuel cans left after reprocessing or used pipes, instruments and the like contaminated with various radioactive substances, by compacting and solidifying them. Method: Metal wastes such as used fuel cans, which have been cut short and reprocessed, are pressed into generally hexagonal blocks. Each block is charged into a capsule of hexagonal cross section made of a gas-impermeable material such as mild steel or stainless steel. The capsule is then subjected to hydrostatic hot pressing, either as it is or after deaeration and sealing. While various combinations of temperature, pressure and time are possible as conditions for the hydrostatic hot pressing, dense blocks with no residual gas pores can be obtained, for example, at 900 °C and 1000 kg/cm² for one hour where the wastes are composed of zircaloy. (Kawakami, Y.)

  11. Possibilities of implementing nonthermal processing methods in the dairy industry

    Directory of Open Access Journals (Sweden)

    Irena Jeličić

    2010-06-01

    Full Text Available In the past two decades a lot of research in the field of food science has focused on new, non-thermal processing methods. This article describes the most intensively investigated new processing methods for implementation in the dairy industry, like microfiltration, high hydrostatic pressure, ultrasound and pulsed electric fields. For each method an overview is given of the principle of microbial inactivation, the results obtained regarding reduction of microorganisms, and the positive and undesirable effects on milk composition and characteristics. The most promising methods for implementation in the dairy industry appear to be the combination of moderate temperatures with high hydrostatic pressure or pulsed electric fields, and microfiltration, since these treatments did not result in any undesirable changes in the sensory properties of milk. Additionally, milk treatment with these methods resulted in better milk fat homogenization, faster rennet coagulation, shorter duration of milk fermentations, etc. Very good results regarding microbial inactivation were obtained by treating milk with a combination of moderate temperatures and high-intensity ultrasound, a process also called thermosonication. However, thermosonication treatments often result in undesirable changes in milk sensory properties, most probably due to ultrasound-induced milk fat oxidation. This article also briefly describes the use of natural compounds with antimicrobial effects such as bacteriocins, the lactoperoxidase system and lysozyme. However, their implementation is limited for reasons such as high costs, interaction with other food ingredients, poor solubility, narrow activity spectrum, spontaneous loss of bacteriocinogenicity, etc. In addition, the principles of the antimicrobial effect of microwaves and ultraviolet irradiation are described; their implementation in the dairy industry has failed mostly for technical and commercial reasons.

  12. Application of the microbiological method DEFT/APC and DNA comet assay to detect ionizing radiation processing of minimally processed vegetables

    International Nuclear Information System (INIS)

    Araujo, Michel Mozeika

    2008-01-01

    Marketing of minimally processed vegetables (MPV) is gaining impetus due to their convenience, freshness and apparent healthiness. However, minimal processing does not reduce pathogenic microorganisms to safe levels. Food irradiation is used to extend shelf life and inactivate food-borne pathogens; its combination with minimal processing could improve the safety and quality of MPV. Two different food irradiation detection methods, one microbiological, the DEFT/APC, and one biochemical, the DNA Comet Assay, were applied to MPV in order to test their applicability to detect irradiation treatment. DEFT/APC is a microbiological screening method based on the use of the direct epifluorescent filter technique (DEFT) and the aerobic plate count (APC). The DNA Comet Assay detects DNA damage due to ionizing radiation. Samples of lettuce, chard, watercress, dandelion, kale, chicory, spinach and cabbage from the retail market were irradiated at 0.5 kGy and 1.0 kGy using a 60Co facility. The irradiation treatment guaranteed at least a 2 log cycle reduction for aerobic and psychrotrophic microorganisms. In general, with increasing radiation doses, DEFT counts remained similar regardless of irradiation processing while APC counts decreased gradually. The difference between the two counts gradually increased with dose increment in all samples. It could be suggested that a DEFT/APC difference over 2.0 log would be a criterion to judge whether an MPV was treated by irradiation. The DNA Comet Assay allowed distinguishing non-irradiated samples from irradiated ones, which showed different types of comets owing to DNA fragmentation. Both the DEFT/APC method and the DNA Comet Assay could satisfactorily be used as screening methods for indicating irradiation processing. (author)
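
    The screening criterion suggested by this record is simple enough to state as code. In this sketch the function name and example counts are invented; the 2.0 log threshold is the one proposed in the abstract:

```python
import math

def deft_apc_screen(deft_count, apc_count, threshold=2.0):
    # DEFT counts viable plus radiation-inactivated cells, while APC counts
    # only cells that can still grow, so irradiation widens the gap between
    # the two log10 counts; a gap above ~2 log suggests irradiation treatment
    diff = math.log10(deft_count) - math.log10(apc_count)
    return diff, diff > threshold

print(deft_apc_screen(1e6, 1e3))  # wide gap: likely irradiated
print(deft_apc_screen(2e5, 1e5))  # narrow gap: not flagged
```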

  13. New signal processing methods for the evaluation of eddy current NDT data

    International Nuclear Information System (INIS)

    Anon.

    1987-01-01

    Signal processing and pattern recognition methods play a crucial role in a number of areas associated with nondestructive evaluation. Defect characterization schemes often involve mapping the signal onto an appropriate feature domain and using pattern recognition techniques for classification. In addition, signal processing methods are also used to acquire, enhance, restore, and compress data. EPRI Project RP 2673-4 is concerned with developing new signal processing and pattern recognition techniques for evaluating eddy current signals. Efforts under this project have focused on three closely related areas. The thrust has been to: (1) develop a scheme to compress eddy current signals for the purposes of storing them in a compact form, (2) develop a robust clustering algorithm capable of discarding feature vectors that fall in the gray areas between clusters, and (3) investigate the feasibility of designing and developing a digital eddyscope

  14. Application of psychodiagnostic methods in the recruitment process for factory workers

    Directory of Open Access Journals (Sweden)

    Klára Seitlová

    2014-01-01

    Full Text Available Human resources (HR) are the most valuable asset in any organization and many successful managers regard work with people as the most important aspect for the prosperity and health of the society. Work and organizational psychology is therefore facing a number of challenges in this area. The correct process of recruitment is just one of them. The selection process is a process which aims at recognizing the best candidate for a specific position. It does not always require that all procedures be carried out (preliminary interview, testing, selection interview, etc.); rather, the selection process requires a critical examination in relation to the position, number of applicants, etc. The present study focuses on the use of psychodiagnostic methods in the process of selecting workmen, highlighting their usefulness in the selection process. The Tower of Hanoi test (ToH) and the d2 Test of Attention (d2) were applied and further supplemented by a practical assessment examination of the candidate in the recruitment process. It was investigated whether the results of the ToH and d2 tests together with the result of the practical test may help predict the overall work quality of a future employee. The quality of work of employees was evaluated based on the following criteria: effort and performance, interoperability, performance of tasks, respect for rules, attendance, quality of work. The overall evaluation was the average of the partial results of the individual criteria. The data were collected between August 2013 and September 2014 in a production company with a focus on engineering production in the Moravian-Silesian region. The research group consisted of 30 people who applied for the position of a welder and, after having succeeded in the recruitment process, entered into a labor-law relationship with the employer. All respondents were acquainted with the ethical conditions of the study. The results show that the use of the above-described tests in the

  15. Application of psychodiagnostic methods in the recruitment process for factory workers

    Directory of Open Access Journals (Sweden)

    Klára Seitlová

    2014-12-01

    Full Text Available Human resources (HR) are the most valuable asset in any organization and many successful managers regard work with people as the most important aspect for the prosperity and health of the society. Work and organizational psychology is therefore facing a number of challenges in this area. The correct process of recruitment is just one of them. The selection process is a process which aims at recognizing the best candidate for a specific position. It does not always require that all procedures be carried out (preliminary interview, testing, selection interview, etc.); rather, the selection process requires a critical examination in relation to the position, number of applicants, etc. The present study focuses on the use of psychodiagnostic methods in the process of selecting workmen, highlighting their usefulness in the selection process. The Tower of Hanoi test (ToH) and the d2 Test of Attention (d2) were applied and further supplemented by a practical assessment examination of the candidate in the recruitment process. It was investigated whether the results of the ToH and d2 tests together with the result of the practical test may help predict the overall work quality of a future employee. The quality of work of employees was evaluated based on the following criteria: effort and performance, interoperability, performance of tasks, respect for rules, attendance, quality of work. The overall evaluation was the average of the partial results of the individual criteria. The data were collected between August 2013 and September 2014 in a production company with a focus on engineering production in the Moravian-Silesian region. The research group consisted of 30 people who applied for the position of a welder and, after having succeeded in the recruitment process, entered into a labor-law relationship with the employer. All respondents were acquainted with the ethical conditions of the study. The results show that the use of the above-described tests in the

  16. A method for energy and exergy analyses of product transformation processes in industry

    International Nuclear Information System (INIS)

    Abou Khalil, B.

    2008-12-01

    After a literature survey enabling the determination of the advantages and drawbacks of existing methods for assessing the potential energy gains of an industrial site, this research report presents a newly developed method, named Energy and Exergy Analysis of Transformation Processes (AEEP, for Analyse energetique et exergetique des procedes de transformation), applied to actual industrial operations in order to demonstrate its systematic character. The different steps of the method are presented and detailed; one of them, the process analysis, is critical to the application of the developed method. This particular step is then applied to several industrial unit operations in order to serve as a basis for future energy audits in the concerned industry sectors, as well as to demonstrate its generic and systematic character. The method is then applied in its entirety to a cheese manufacturing plant, covering all the steps of the AEEP. The author demonstrates that the AEEP is a systematic method and can be applied to all energy audit levels, particularly the lowest levels, which have a relatively low cost
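
    For readers unfamiliar with the exergy side of such an analysis, the core quantity is the specific flow exergy of a stream relative to a dead state, ex = (h - h0) - T0 (s - s0). A minimal sketch; the function name and the steam-like numbers are illustrative, not taken from the report:

```python
def specific_flow_exergy(h, s, h0, s0, t0):
    """Specific physical flow exergy (kJ/kg) relative to a dead state.

    h, s   : specific enthalpy (kJ/kg) and entropy (kJ/(kg.K)) of the stream
    h0, s0 : the same properties evaluated at the dead state
    t0     : dead-state temperature (K)
    """
    return (h - h0) - t0 * (s - s0)

# Illustrative numbers only (roughly superheated steam against a 25 degC
# dead state); an energy audit would take these from property tables.
print(specific_flow_exergy(3000.0, 6.5, 104.9, 0.367, 298.15))
```

    Comparing this exergy with the plain enthalpy difference is what lets the AEEP-style analysis distinguish energy that is merely present from energy that is actually convertible to useful work.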

  17. Possibilities of Particle Finite Element Methods in Industrial Forming Processes

    Science.gov (United States)

    Oliver, J.; Cante, J. C.; Weyler, R.; Hernandez, J.

    2007-04-01

    The work investigates the possibilities offered by the particle finite element method (PFEM) in the simulation of forming problems involving large deformations, multiple contacts, and new boundaries generation. The description of the most distinguishing aspects of the PFEM, and its application to simulation of representative forming processes, illustrate the proposed methodology.

  18. Complexity, Methodology and Method: Crafting a Critical Process of Research

    Science.gov (United States)

    Alhadeff-Jones, Michel

    2013-01-01

    This paper defines a theoretical framework aiming to support the actions and reflections of researchers looking for a "method" in order to critically conceive the complexity of a scientific process of research. First, it starts with a brief overview of the core assumptions framing Morin's "paradigm of complexity" and Le…

  19. Actively Teaching Research Methods with a Process Oriented Guided Inquiry Learning Approach

    Science.gov (United States)

    Mullins, Mary H.

    2017-01-01

    Active learning approaches have shown to improve student learning outcomes and improve the experience of students in the classroom. This article compares a Process Oriented Guided Inquiry Learning style approach to a more traditional teaching method in an undergraduate research methods course. Moving from a more traditional learning environment to…

  20. Tight Error Bounds for Fourier Methods for Option Pricing for Exponential Levy Processes

    KAUST Repository

    Crocce, Fabian

    2016-01-06

    Prices of European options whose underlying asset is driven by a Lévy process are solutions to partial integro-differential equations (PIDEs) that generalise the Black-Scholes equation by incorporating a non-local integral term to account for the discontinuities in the asset price. The Lévy-Khintchine formula provides an explicit representation of the characteristic function of a Lévy process (cf. [6]): one can derive an exact expression for the Fourier transform of the solution of the relevant PIDE. The rapid convergence of the trapezoid quadrature and the resulting speedup provide efficient methods for evaluating option prices, possibly for a range of parameter configurations simultaneously. A couple of works have been devoted to the error analysis and parameter selection for these transform-based methods. In [5] several payoff functions are considered for a rather general set of models whose characteristic function is assumed to be known. [4] presents the framework and theoretical approach for the error analysis, and establishes polynomial convergence rates for approximations of the option prices. [1] presents FT-related methods with a curved integration contour. The classical flat FT-methods have, on the other hand, been extended to option pricing problems beyond the European framework [3]. We present a methodology for studying and bounding the error committed when using FT methods to compute option prices. We also provide a systematic way of choosing the parameters of the numerical method, minimising the error bound and guaranteeing adherence to a prescribed error tolerance. We focus on exponential Lévy processes that may be of either diffusive or pure-jump type. Our contribution is to derive a tight error bound for a Fourier transform method when pricing options under risk-neutral Lévy dynamics. We present a simplified bound that separates the contributions of the payoff and of the process in an easily processed and extensible product form that
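
    As a concrete illustration of the transform-based pricing these works analyse, the sketch below evaluates a Lewis-type flat-contour Fourier representation of a European call with the trapezoid rule, specialized to the diffusive (Black-Scholes) case of an exponential Lévy model. The truncation point and step are chosen ad hoc here, not via the error bounds the abstract describes:

```python
import math

def bs_char_fn(u, sigma, t):
    # characteristic function of log(S_T/S_0) - r*T for geometric Brownian
    # motion (the diffusive special case of an exponential Levy process);
    # at the shifted argument u - i/2 it is real-valued:
    # phi(u - i/2) = exp(-sigma^2 t / 2 * (u^2 + 1/4))
    return math.exp(-0.5 * sigma * sigma * t * (u * u + 0.25))

def call_price_fourier(s, strike, r, sigma, t, u_max=50.0, n=5000):
    # Lewis-type representation of a European call on a flat contour:
    # C = S - sqrt(S K) e^{-rT/2}/pi * int_0^inf cos(u x) phi(u - i/2)
    #                                              / (u^2 + 1/4) du,
    # with x = log(S/K) + rT, evaluated by the trapezoid rule
    x = math.log(s / strike) + r * t
    du = u_max / n
    def integrand(u):
        return math.cos(u * x) * bs_char_fn(u, sigma, t) / (u * u + 0.25)
    total = 0.5 * (integrand(0.0) + integrand(u_max))
    total += sum(integrand(i * du) for i in range(1, n))
    total *= du
    return s - math.sqrt(s * strike) * math.exp(-0.5 * r * t) / math.pi * total

print(call_price_fourier(100.0, 100.0, 0.0, 0.2, 1.0))  # close to Black-Scholes
```

    Swapping in the shifted characteristic function of a jump model (taking the real part of e^{iux} phi(u - i/2) in the integrand, since it is no longer real there) prices the same payoff under that model, which is exactly where principled truncation and step-size selection become important.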

  1. Methods for Dissecting Motivation and Related Psychological Processes in Rodents.

    Science.gov (United States)

    Ward, Ryan D

    2016-01-01

    Motivational impairments are increasingly recognized as being critical to functional deficits and decreased quality of life in patients diagnosed with psychiatric disease. Accordingly, much preclinical research has focused on identifying psychological and neurobiological processes which underlie motivation. Inferring motivation from changes in overt behavioural responding in animal models, however, is complicated, and care must be taken to ensure that the observed change is accurately characterized as a change in motivation, and not due to some other, task-related process. This chapter discusses current methods for assessing motivation and related psychological processes in rodents. Using an example from work characterizing the motivational impairments in an animal model of the negative symptoms of schizophrenia, we highlight the importance of careful and rigorous experimental dissection of motivation and the related psychological processes when characterizing motivational deficits in rodent models. We suggest that such work is critical to the successful translation of preclinical findings to therapeutic benefits for patients.

  2. Temporal response methods for dynamic measurement of in-process inventory of dissolved nuclear materials

    International Nuclear Information System (INIS)

    Ziri, S.M.; Seefeldt, W.B.

    1977-08-01

    This analysis has demonstrated that a plant's temporal response to perturbation of feed isotope composition can be used to measure the in-process inventory, without suspending plant operations. The main advantages of the temporal response technique over the step-displacement method are (1) it (the temporal response method) obviates the need for large special feed batches, and (2) it obviates the requirement that all the in-process material have a uniform isotopic composition at the beginning of the measurement. The temporal response method holds promise for essentially continuous real-time determination of in-process SNM. However, the temporal response method requires the measurement of the isotopic composition of many samples, and it works best for a stationary random input time series of tracer perturbations. Both of these requirements appear amenable to satisfaction by practical equipment and procedures if the benefits are deemed sufficiently worthwhile

  3. Applying some methods to process the data coming from the nuclear reactions

    International Nuclear Information System (INIS)

    Suleymanov, M.K.; Abdinov, O.B.; Belashev, B.Z.

    2010-01-01

    Full text: Methods for a posteriori enhancement of spectral-line resolution are proposed for processing data coming from nuclear reactions, and have been applied to data from nuclear reactions at high energies. They make it possible to extract more detailed information on the structure of the spectra of particles emitted in nuclear reactions. Nuclear reactions are the main source of information on the structure and physics of atomic nuclei. The spectra of reaction fragments are usually complex, and extracting the information needed for an investigation is not simple. In this talk we discuss methods for a posteriori enhancement of spectral-line resolution, which can be useful for processing complex data from nuclear reactions. We consider the Fourier transformation method and the maximum entropy method. Complex structures were identified by these methods; at least two selected points are indicated. Recently we presented a talk showing the results of analysing the structure of the pseudorapidity spectra of charged relativistic particles with ≥ 0.7 measured in Au+Em and Pb+Em at AGS and SPS energies using the Fourier transformation and maximum entropy methods. The dependences of these spectra on the number of fast target protons were studied. These distributions visually show a plateau and a shoulder, i.e. at least three selected points on the distributions, and the plateaus become wider in Pb+Em reactions. The existence of a plateau is required by parton models, and the maximum entropy method could confirm the existence of the plateau and the shoulder on the distributions. The figure shows the results of applying the maximum entropy method: the method indicates several clearly selected points, some of which coincide with the visually observed ones. We would like to note that the Fourier transformation method could not
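
    A much simpler relative of the a posteriori resolution-enhancement methods discussed here is even-derivative peak sharpening: subtracting a scaled discrete second derivative narrows spectral lines, which can help separate overlapping structures. The sketch below shows only this simpler technique, not the Fourier-transform or maximum-entropy algorithms of the talk, and the sharpening coefficient is arbitrary:

```python
import math

def sharpen(spectrum, k=0.6):
    # even-derivative peak sharpening: subtracting a scaled discrete second
    # derivative raises each maximum and pulls in its flanks, narrowing lines
    out = list(spectrum)
    for i in range(1, len(spectrum) - 1):
        second = spectrum[i - 1] - 2.0 * spectrum[i] + spectrum[i + 1]
        out[i] = spectrum[i] - k * second
    return out

# a single Gaussian line; after sharpening its maximum grows and tails shrink
peak = [math.exp(-0.5 * ((i - 50) / 8.0) ** 2) for i in range(101)]
sharpened = sharpen(peak)
print(max(sharpened) > max(peak), sharpened[60] < peak[60])  # -> True True
```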

  4. A review of the uses and methods of processing banana and ...

    African Journals Online (AJOL)

    ... Journal of Agricultural Research and Development ... Different processing methods of Musa spp. into new food products, which include production of flour, preparation of jams and jellies, and the quality attributes of the products obtained from ...

  5. Soft-tissues Image Processing: Comparison of Traditional Segmentation Methods with 2D active Contour Methods

    Czech Academy of Sciences Publication Activity Database

    Mikulka, J.; Gescheidtová, E.; Bartušek, Karel

    2012-01-01

    Roč. 12, č. 4 (2012), s. 153-161 ISSN 1335-8871 R&D Projects: GA ČR GAP102/11/0318; GA ČR GAP102/12/1104; GA MŠk ED0017/01/01 Institutional support: RVO:68081731 Keywords : Medical image processing * image segmentation * liver tumor * temporomandibular joint disc * watershed method Subject RIV: JA - Electronics ; Optoelectronics, Electrical Engineering Impact factor: 1.233, year: 2012

  6. The Open Method of Coordination and the Implementation of the Bologna Process

    Science.gov (United States)

    Veiga, Amelia; Amaral, Alberto

    2006-01-01

    In this paper the authors argue that the use of the Open Method of Coordination (OMC) in the implementation of the Bologna process presents coordination problems that do not allow for the full coherence of the results. As the process is quite complex, involving three different levels (European, national and local) and as the final actors in the…

  7. Optimal and adaptive methods of processing hydroacoustic signals (review)

    Science.gov (United States)

    Malyshkin, G. S.; Sidel'nikov, G. B.

    2014-09-01

    Different methods of optimal and adaptive processing of hydroacoustic signals for multipath propagation and scattering are considered. Advantages and drawbacks of the classical adaptive (Capon, MUSIC, and Johnson) algorithms and "fast" projection algorithms are analyzed for the case of multipath propagation and scattering of strong signals. The classical optimal approaches to detecting multipath signals are presented. A mechanism of controlled normalization of strong signals is proposed to automatically detect weak signals. The results of simulating the operation of different detection algorithms for a linear equidistant array under multipath propagation and scattering are presented. An automatic detector is analyzed, which is based on classical or fast projection algorithms and estimates the background either by median filtering or by the method of bilateral spatial contrast.
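
    The final step mentioned, estimating the background by median filtering so that weak signals can be detected automatically, can be sketched simply. In this illustrative fragment (the function name, window and factor are arbitrary choices, not from the review), a bin is declared a detection when its power exceeds a multiple of the local median background:

```python
import statistics

def detect(power, window=9, factor=4.0):
    # flag bins whose power exceeds a median-filtered background estimate;
    # the median is robust to the narrow peaks we are trying to detect
    half = window // 2
    hits = []
    for i in range(len(power)):
        lo, hi = max(0, i - half), min(len(power), i + half + 1)
        background = statistics.median(power[lo:hi])
        if power[i] > factor * background:
            hits.append(i)
    return hits

spectrum = [1.0] * 64      # flat background
spectrum[20] = 10.0        # one strong arrival
print(detect(spectrum))    # -> [20]
```

    Because the median of each window is barely affected by a single spike, the spike stands out against its own neighborhood, which is the normalization idea behind such automatic detectors.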

  8. Comparison between two rheocasting processes of damper cooling tube method and low superheat casting

    Directory of Open Access Journals (Sweden)

    Zhang Xiaoli

    2014-09-01

    Full Text Available To produce a high quality semisolid slurry that consists of fine primary particles uniformly suspended in the liquid matrix for rheoforming, chemical refining and electromagnetic or mechanical stirring are the two methods commonly used. But these two methods either contaminate the melt or incur high cost. In this study, the damper cooling tube (DCT) method was designed to prepare semisolid slurry of A356 aluminum alloy, and was compared with the low superheat casting (LSC) method - a conventional process used to produce casting slab with equiaxed dendrite microstructure for the thixoforming route. A series of comparative experiments were performed at pouring temperatures of 650 °C, 638 °C and 622 °C. Metallographic observations of the casting samples were carried out using an optical microscope with image analysis software. Results show that the microstructure of semisolid slurry produced by the DCT process consists of spherical primary α-Al grains, while an equiaxed grain microstructure is found in the LSC process. The lower the pouring temperature, the smaller the grain size and the rounder the grain morphology in both methods. The copious nucleation, which can be generated in the DCT owing to the cooling and stirring effect, is the key to producing high quality semisolid slurry. The DCT method produces rounder and smaller α-Al grains, which are suitable for semisolid processing, and the equivalent grain size is no more than 60 μm when the pouring temperature is 622 °C.

  9. AUTOMR: An automatic processing program system for the molecular replacement method

    International Nuclear Information System (INIS)

    Matsuura, Yoshiki

    1991-01-01

    An automatic processing program system of the molecular replacement method AUTMR is presented. The program solves the initial model of the target crystal structure using a homologous molecule as the search model. It processes the structure-factor calculation of the model molecule, the rotation function, the translation function and the rigid-group refinement successively in one computer job. Test calculations were performed for six protein crystals and the structures were solved in all of these cases. (orig.)

  10. A standard curve based method for relative real time PCR data processing

    Directory of Open Access Journals (Sweden)

    Krause Andreas

    2005-03-01

    Background: Real-time PCR is currently the most precise method by which to measure gene expression. The method generates a large amount of raw numerical data, and processing may notably influence the final results. Data processing is based either on standard curves or on PCR efficiency assessment. At present, the PCR efficiency approach is preferred in relative PCR, whilst the standard curve is often used for absolute PCR. However, there are no barriers to employing standard curves for relative PCR. This article provides an implementation of the standard curve method and discusses its advantages and limitations in relative real-time PCR. Results: We designed a procedure for data processing in relative real-time PCR. The procedure completely avoids PCR efficiency assessment, minimizes operator involvement and provides a statistical assessment of intra-assay variation. It includes the following steps. (I) Noise is filtered from raw fluorescence readings by smoothing, baseline subtraction and amplitude normalization. (II) The optimal threshold is selected automatically from regression parameters of the standard curve. (III) Crossing points (CPs) are derived directly from the coordinates of points where the threshold line crosses the fluorescence plots obtained after noise filtering. (IV) The means and their variances are calculated for CPs in PCR replicates. (V) The final results are derived from the CPs' means, and the CPs' variances are traced to the results by the law of error propagation. A detailed description and analysis of this data processing is provided. The limitations associated with the use of parametric statistical methods and amplitude normalization are specifically analyzed and found fit for routine laboratory practice. Different options are discussed for aggregating data obtained from multiple reference genes. Conclusion: A standard curve based procedure for PCR data processing has been compiled and validated.
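
    Steps (IV)-(V) above can be sketched in a few lines: replicate crossing points are averaged, mapped to log-concentration through the standard-curve regression, and the CP variance is traced to the result by the law of error propagation. The slope, intercept, and replicate CP values below are illustrative assumptions, not data from the paper.

```python
import numpy as np

# Standard curve (assumed): CP = slope * log10(concentration) + intercept
slope, intercept = -3.32, 38.0
cps = np.array([24.1, 24.3, 24.2])      # CPs of PCR replicates (hypothetical)

cp_mean = cps.mean()
cp_var = cps.var(ddof=1) / len(cps)     # variance of the mean

# Invert the standard curve to get log-concentration.
log_conc = (cp_mean - intercept) / slope

# Law of error propagation: var(log_conc) = var(cp_mean) / slope**2
log_conc_var = cp_var / slope**2

print(round(log_conc, 3), round(np.sqrt(log_conc_var), 4))
```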

  11. Application of image processing methods to industrial radiography

    International Nuclear Information System (INIS)

    Goutte, R.; Odet, C.; Tuncer, T.; Bodson, F.; Varcin, E.

    1985-01-01

    This study was carried out with the financial support of the Commission of the European Communities as part of the CECA research program comprising IRSID, INSA de Lyon and the Framatome and Creusot Loire companies. Its purpose was to evaluate the possibility of using digital enhancement of radiographic images to improve defect visibility in industrial radiography, thereby providing assistance in defect detection and a method for automatic analysis of radiographs. This paper provides full results obtained from work on digital processing of radiographs showing real and artificial defects. Furthermore, work on simulated automatic defect detection is also presented. 2 refs

  12. MULTIAGENT TECHNOLOGIES’ METHOD IN MANAGING BUSINESS-PROCESSES OF THE TECHNICAL PREPARING FOR PRODUCTION

    Directory of Open Access Journals (Sweden)

    P.N. Pavlenko

    2005-02-01

    A method for managing the technological preparation process of extended production is presented. The method is used for integrating the automated systems of industrial assignment (CAD/CAM/SAPP) with ERP systems.

  13. Relativistic decay widths of autoionization processes: The relativistic FanoADC-Stieltjes method

    Energy Technology Data Exchange (ETDEWEB)

    Fasshauer, Elke, E-mail: Elke.Fasshauer@uit.no [Centre for Theoretical and Computational Chemistry, Department of Chemistry, University of Tromsø–The Arctic University of Norway, N-9037 Tromsø (Norway); Theoretische Chemie, Universität Heidelberg, Im Neuenheimer Feld 229, D-69120 Heidelberg (Germany); Kolorenč, Přemysl [Institute of Theoretical Physics, Faculty of Mathematics and Physics, Charles University in Prague, V Holešovičkách 2, 180 00 Prague (Czech Republic); Pernpointner, Markus [Theoretische Chemie, Universität Heidelberg, Im Neuenheimer Feld 229, D-69120 Heidelberg (Germany)

    2015-04-14

    Electronic decay processes of ionized systems include, for example, Auger decay and Interatomic/Intermolecular Coulombic Decay. In both processes, an energetically low lying vacancy is filled by an electron from an energetically higher lying orbital, and a secondary electron is instantaneously emitted to the continuum. Whether or not such a process occurs depends both on the energetic accessibility and on the corresponding lifetime compared to the lifetimes of competing decay mechanisms. We present a realization of the non-relativistically established FanoADC-Stieltjes method for the description of autoionization decay widths including relativistic effects. This procedure, based on the Algebraic Diagrammatic Construction (ADC), was adapted to the relativistic framework and implemented into the relativistic quantum chemistry program package Dirac. In contrast to other existing relativistic atomic codes, it is not limited to the description of autoionization lifetimes in spherically symmetric systems, but is also applicable to molecules and clusters. We apply this method to the Auger processes following Kr3d{sup −1}, Xe4d{sup −1}, and Rn5d{sup −1} ionization. Based on the results, we show a pronounced influence of mainly scalar-relativistic effects on the decay widths of autoionization processes.

  14. Nuclear pulse signal processing technique based on blind deconvolution method

    International Nuclear Information System (INIS)

    Hong Pengfei; Yang Lei; Fu Tingyan; Qi Zhong; Li Dongcang; Ren Zhongguo

    2012-01-01

    In this paper, we present a method for measurement and analysis of nuclear pulse signals, with which pile-up signals are removed, the signal baseline is restored, and the original signal is obtained. The data acquisition system comprises an FPGA, an ADC and USB. The FPGA controls the high-speed ADC to sample the nuclear radiation signal, and the USB controller operates in Slave FIFO mode to implement high-speed transmission. Using LabVIEW, the system performs online data processing with the blind deconvolution algorithm and displays the data. The simulation and experimental results demonstrate the advantages of the method. (authors)
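
    The paper's algorithm is blind; as a simplified, non-blind stand-in for the same idea, assume the pulse shape is a known single-pole exponential decay, so a pole-cancellation inverse filter y[n] = x[n] - a*x[n-1] recovers the original impulses from piled-up pulses. The decay constant, arrival times, and amplitudes below are illustrative assumptions.

```python
import numpy as np

a = np.exp(-1.0 / 20.0)          # per-sample decay of the exponential pulse tail

# Synthesize two piled-up pulses: amplitudes 1.0 and 0.6 arriving at n=10, n=18.
n = np.arange(100)
pulse = a ** n
x = np.zeros(100)
x[10:] += 1.0 * pulse[:90]
x[18:] += 0.6 * pulse[:82]

# Inverse (pole-cancellation) filter: cancels the tail, leaving the impulses.
y = np.empty_like(x)
y[0] = x[0]
y[1:] = x[1:] - a * x[:-1]

peaks = np.flatnonzero(y > 0.1)
print(peaks, y[peaks])           # the two original impulses, pile-up removed
```

    A truly blind method must additionally estimate the decay constant `a` from the data; this sketch only shows the deconvolution step once the pulse model is known.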

  15. Analysis of the overall energy intensity of alumina refinery process using unit process energy intensity and product ratio method

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Liru; Aye, Lu [International Technologies Center (IDTC), Department of Civil and Environmental Engineering,The University of Melbourne, Vic. 3010 (Australia); Lu, Zhongwu [Institute of Materials and Metallurgy, Northeastern University, Shenyang 110004 (China); Zhang, Peihong [Department of Municipal and Environmental Engineering, Shenyang Architecture University, Shenyang 110168 (China)

    2006-07-15

    Alumina refining is an energy-intensive industry. Traditional energy-saving methods have been single-equipment-orientated. Based on the two concepts of 'energy carrier' and 'system', this paper presents a method that analyzes the effects of unit process energy intensity (e) and product ratio (p) on the overall energy intensity of alumina. The important conclusion drawn from this method is that it is necessary to decrease both the unit process energy intensities and the product ratios in order to decrease the overall energy intensity of alumina, which may be taken as a future policy for energy saving. As a case study, the overall energy intensity of the Chinese Zhenzhou alumina refinery plant with the combined Bayer-sinter method between 1995 and 2000 was analyzed. The result shows that the overall energy intensity of alumina in this plant decreased by 7.36 GJ/t-Al{sub 2}O{sub 3} over this period; 49% of the total energy saving is due to direct energy saving, and 51% is due to indirect energy saving. The emphasis in this paper is on decreasing the product ratios of high-energy-consumption unit processes, such as evaporation, slurry sintering, aluminium trihydrate calcining and desilication. Energy savings can be made (1) by increasing the proportion of Bayer and indirect digestion, (2) by increasing the grade of ore by ore dressing or importing some rich gibbsite and (3) by promoting the advancement in technology. (author)
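
    The decomposition described above amounts to writing the overall energy intensity as E = Σ p_i·e_i over the unit processes, so that savings can come either from a lower unit intensity e_i (direct) or a lower product ratio p_i (indirect). A minimal sketch with invented numbers (not plant data from the study):

```python
# Each unit process: (e = unit energy intensity, GJ/t; p = product ratio,
# tonnes of intermediate product per tonne of alumina). Values are illustrative.
processes = {
    "digestion":   (1.8, 3.0),
    "evaporation": (2.5, 2.0),
    "sintering":   (4.0, 1.2),
    "calcining":   (3.1, 1.5),
}

def overall_intensity(proc):
    """Overall energy intensity E = sum_i p_i * e_i (GJ/t-Al2O3)."""
    return sum(e * p for e, p in proc.values())

base = overall_intensity(processes)

# Indirect saving: lower a product ratio (e.g. less slurry routed to sintering)
improved = dict(processes)
improved["sintering"] = (4.0, 1.0)

print(base, overall_intensity(improved))
```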

  16. Advanced methods for processing ceramics

    Energy Technology Data Exchange (ETDEWEB)

    Carter, W.B. [Georgia Institute of Technology, Atlanta, GA (United States)

    1997-04-01

    Combustion chemical vapor deposition (combustion CVD) is being developed for the deposition of high temperature oxide coatings. The process is being evaluated as an alternative to more capital intensive conventional coating processes. The thrusts during this reporting period were the development of the combustion CVD process for depositing lanthanum monazite, the determination of the influence of aerosol size on coating morphology, the incorporation of combustion CVD coatings into thermal barrier coatings (TBCs) and related oxidation research, and continued work on the deposition of zirconia-yttria coatings.

  17. Temporal response methods for dynamic measurement of in-process inventory of dissolved nuclear materials

    International Nuclear Information System (INIS)

    Zivi, S.M.; Seefeldt, W.B.

    1976-01-01

    This analysis demonstrated that a plant's temporal response to perturbations of feed isotope composition can be used to measure the in-process inventory without suspending plant operations. The main advantages of the temporal response technique over the step-displacement method are that (1) it obviates the need for large special feed batches and (2) it obviates the requirement that all the in-process material have a uniform isotopic composition at the beginning of the measurement. The temporal response method holds promise for essentially continuous, real-time determination of in-process SNM. Its main disadvantage is that it requires measuring the isotopic composition of a great many samples to moderately high accuracy. This requirement appears amenable to solution by a modest effort in instrument development.

  18. Comparing performances of Clements, Box-Cox, Johnson methods with Weibull distributions for assessing process capability

    Energy Technology Data Exchange (ETDEWEB)

    Senvar, O.; Sennaroglu, B.

    2016-07-01

    This study examines the Clements Approach (CA), Box-Cox transformation (BCT), and Johnson transformation (JT) methods for process capability assessment of Weibull-distributed data with different parameters, in order to determine the effects of tail behaviour on process capability, and compares their estimation performances in terms of accuracy and precision. Design/methodology/approach: The process performance index (PPI) Ppu is used for process capability analysis (PCA) because the comparisons are performed on generated Weibull data without subgroups. Box plots, descriptive statistics, the root-mean-square deviation (RMSD), used as a measure of error, and a radar chart are utilized together to evaluate the performances of the methods. In addition, the bias of the estimated values is as important as the efficiency measured by the mean square error; in this regard, the Relative Bias (RB) and the Relative Root Mean Square Error (RRMSE) are also considered. Findings: The results reveal that the performance of a method depends on its capability to fit the tail behaviour of the Weibull distribution and on the targeted values of the PPIs. The effect of tail behaviour is more significant when the process is more capable. Research limitations/implications: Some other methods, such as the Weighted Variance method, which also gives good results, were also evaluated. However, we later realized that it would be confusing in terms of comparison issues between the methods for consistent interpretations... (Author)
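
    The Box-Cox route compared above can be sketched as follows: Weibull data are transformed toward normality, the specification limit is transformed with the same lambda, and Ppu is computed on the transformed scale. The Weibull shape/scale, sample size, and upper specification limit are illustrative assumptions, not the study's parameter settings.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
data = stats.weibull_min.rvs(c=1.5, scale=2.0, size=2000, random_state=rng)
usl = 6.0                                   # upper specification limit (assumed)

# Box-Cox transform the data; scipy also returns the fitted lambda.
transformed, lam = stats.boxcox(data)

# Transform the specification limit with the same lambda.
usl_t = (usl**lam - 1) / lam if lam != 0 else np.log(usl)

# Ppu on the (approximately normal) transformed scale.
ppu = (usl_t - transformed.mean()) / (3 * transformed.std(ddof=1))
print(round(ppu, 3))
```

    Clements' approach would instead keep the original scale and replace the 3-sigma spread with percentiles of a fitted distribution; this sketch shows only the BCT variant.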

  19. [Research on the method of copper converting process determination based on emission spectrum analysis].

    Science.gov (United States)

    Li, Xian-xin; Liu, Wen-qing; Zhang, Yu-jun; Si, Fu-qi; Dou, Ke; Wang, Feng-ping; Huang, Shu-hua; Fang, Wu; Wang, Wei-qiang; Huang, Yong-feng

    2012-05-01

    A method of copper converting process determination based on PbO/PbS emission spectrum analysis is described. Based on the known emission spectra of gas molecules, the presence of PbO and PbS was confirmed in the measured spectrum. Field experiments determined that the main emission spectrum of the slag stage is from PbS, and the main emission spectrum of the copper stage is from PbO. The relative changes in the PbO/PbS emission spectra therefore provide a means of determining the copper converting process stage. Using the relative intensities of the PbO/PbS emission spectra, the copper smelting process can be divided into two stages, i.e., the slag stage (S phase) and the copper stage (B phase). In a complete copper smelting cycle, with a receiving telescope of appropriate view angle aimed at the converter flame and after noise filtering of the PbO/PbS emission spectrum, the process determination agrees with actual production. Both theory and experiment prove that the method of copper converting process determination based on emission spectrum analysis is feasible.

  20. Forest Service National Visitor Use Monitoring Process: Research Method Documentation

    Science.gov (United States)

    Donald B.K. English; Susan M. Kocis; Stanley J. Zarnoch; J. Ross Arnold

    2002-01-01

    In response to the need for improved information on recreational use of National Forest System lands, the authors have developed a nationwide, systematic monitoring process. This report documents the methods they used in estimating recreational use on an annual basis. The basic unit of measure is exiting volume of visitors from a recreation site on a given day. Sites...

  1. Mathematical Modeling and Simulation of SWRO Process Based on Simultaneous Method

    Directory of Open Access Journals (Sweden)

    Aipeng Jiang

    2014-01-01

    Reverse osmosis (RO) is one of the most efficient techniques for seawater desalination to address the shortage of freshwater. For prediction and analysis of the performance of the seawater reverse osmosis (SWRO) process, an accurate and detailed model based on solution-diffusion and mass transfer theory is established. Since the accurate formulation of the model includes many differential equations and strongly nonlinear equations (differential and algebraic equations, DAEs), the simultaneous method, through orthogonal collocation on finite elements and a large scale solver, was used to solve the problem efficiently. The model was fully discretized into an NLP (nonlinear programming) problem with a large number of variables and equations, and the NLP was then solved by the large scale solver IPOPT. Validation of the formulated model and solution method is verified by a case study on an SWRO plant. Simulation and analysis are then carried out to demonstrate the performance of the reverse osmosis process; operational conditions such as feed pressure, feed flow rate and feed temperature are also analyzed. This work is of significance for the detailed understanding of the RO process and for future energy saving through operational optimization.
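
    The solution-diffusion relations that such a model is built on can be sketched in a few lines: water flux Jw = A·(ΔP − Δπ) and salt flux Js = B·ΔC, with the permeate concentration given by their ratio. The permeability coefficients and operating conditions below are illustrative assumptions, not parameters of the cited plant model.

```python
# Solution-diffusion membrane fluxes (all values assumed for illustration).
A = 3.5e-12   # water permeability, m/(s*Pa)
B = 2.0e-7    # salt permeability, m/s
dP = 60e5     # applied transmembrane pressure, Pa
dPi = 28e5    # osmotic pressure difference, Pa
dC = 35.0     # salt concentration difference across the membrane, kg/m^3

Jw = A * (dP - dPi)   # volumetric water flux, m/s
Js = B * dC           # salt flux, kg/(m^2*s)
Cp = Js / Jw          # resulting permeate salt concentration, kg/m^3

print(Jw, Js, round(Cp, 3))
```

    In the full model these algebraic relations are coupled to mass balances along the membrane channel, which is what produces the DAE system that the collocation-plus-IPOPT approach discretizes and solves.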

  2. Processing methods for photoacoustic Doppler flowmetry with a clinical ultrasound scanner

    Science.gov (United States)

    Bücking, Thore M.; van den Berg, Pim J.; Balabani, Stavroula; Steenbergen, Wiendelt; Beard, Paul C.; Brunker, Joanna

    2018-02-01

    Photoacoustic flowmetry (PAF) based on time-domain cross correlation of photoacoustic signals is a promising technique for deep tissue measurement of blood flow velocity. Signal processing has previously been developed for single element transducers. Here, the processing methods for acoustic resolution PAF using a clinical ultrasound transducer array are developed and validated using a 64-element transducer array with a -6 dB detection band of 11 to 17 MHz. Measurements were performed on a flow phantom consisting of a tube (580 μm inner diameter) perfused with human blood flowing at physiological speeds ranging from 3 to 25 mm/s. The processing pipeline comprised: image reconstruction, filtering, displacement detection, and masking. High-pass filtering and background subtraction were found to be key preprocessing steps to enable accurate flow velocity estimates, which were calculated using a cross-correlation based method. In addition, the regions of interest in the calculated velocity maps were defined using a masking approach based on the amplitude of the cross-correlation functions. These developments enabled blood flow measurements using a transducer array, bringing PAF one step closer to clinical applicability.
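
    The displacement-detection step at the core of this pipeline can be sketched with a basic cross-correlation lag estimate between two successive signals: the lag maximizing the cross-correlation gives the inter-frame displacement, which divided by the frame interval yields velocity. The signals and the true shift below are synthetic assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 512
true_shift = 7                               # samples (assumed)

sig = rng.standard_normal(n)
frame1 = sig
frame2 = np.roll(sig, true_shift)            # second acquisition, shifted

# Background (mean) subtraction is a key preprocessing step.
f1 = frame1 - frame1.mean()
f2 = frame2 - frame2.mean()

# Full cross-correlation covers lags from -(n-1) to n-1.
xcorr = np.correlate(f2, f1, mode="full")
lag = np.argmax(xcorr) - (n - 1)
print(lag)                                   # estimated displacement in samples
```

    In the real pipeline the cross-correlation amplitude at the peak is additionally used to mask out unreliable regions of the velocity map.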

  3. Magnetic filter apparatus and method for generating cold plasma in semiconductor processing

    Science.gov (United States)

    Vella, Michael C.

    1996-01-01

    Disclosed herein is a system and method for providing a plasma flood having a low electron temperature to a semiconductor target region during an ion implantation process. The plasma generator providing the plasma is coupled to a magnetic filter which allows ions and low energy electrons to pass therethrough while retaining captive the primary or high energy electrons. The ions and low energy electrons form a "cold plasma" which is diffused in the region of the process surface while the ion implantation process takes place.

  4. Magnetic filter apparatus and method for generating cold plasma in semiconductor processing

    Science.gov (United States)

    Vella, M.C.

    1996-08-13

    Disclosed herein is a system and method for providing a plasma flood having a low electron temperature to a semiconductor target region during an ion implantation process. The plasma generator providing the plasma is coupled to a magnetic filter which allows ions and low energy electrons to pass therethrough while retaining captive the primary or high energy electrons. The ions and low energy electrons form a "cold plasma" which is diffused in the region of the process surface while the ion implantation process takes place. 15 figs.

  5. Metal Removal Process Optimisation using Taguchi Method - Simplex Algorithm (TM-SA) with Case Study Applications

    OpenAIRE

    Ajibade, Oluwaseyi A.; Agunsoye, Johnson O.; Oke, Sunday A.

    2018-01-01

    In the metal removal process industry, the current practice to optimise cutting parameters adopts a conventional method. It is based on trial and error, in which the machine operator uses experience, coupled with handbook guidelines, to determine optimal parametric values of choice. This method is not accurate, is time-consuming and costly. Therefore, there is a need for a method that is scientific, cost-effective and precise. Keeping this in mind, a different direction for process optimisation is ...

  6. Information theoretic methods for image processing algorithm optimization

    Science.gov (United States)

    Prokushkin, Sergey F.; Galil, Erez

    2015-01-01

    Modern image processing pipelines (e.g., those used in digital cameras) are full of advanced, highly adaptive filters that often have a large number of tunable parameters (sometimes > 100). This makes the calibration procedure for these filters very complex, and optimal results are barely achievable through manual calibration; thus an automated approach is a must. We discuss an information theory based metric for evaluation of algorithm adaptive characteristics ("adaptivity criterion") using noise reduction algorithms as an example. The method allows finding an "orthogonal decomposition" of the filter parameter space into the "filter adaptivity" and "filter strength" directions. This metric can be used as a cost function in automatic filter optimization. Since it is a measure of physical "information restoration" rather than perceived image quality, it helps to reduce the set of filter parameters to a smaller subset that is easier for a human operator to tune to achieve better subjective image quality. With appropriate adjustments, the criterion can be used for assessment of the whole imaging system (sensor plus post-processing).

  7. Process Research Methods and Their Application in the Didactics of Text Production and Translation

    DEFF Research Database (Denmark)

    Dam-Jensen, Helle; Heine, Carmen

    2009-01-01

    not only as learners, but also as thinkers and problem solvers. This can be achieved by systematically applying knowledge from process research as this can give insight into mental and physical processes of text production. This article provides an overview of methods commonly used in process research...

  8. Development and application of a probabilistic evaluation method for advanced process technologies

    Energy Technology Data Exchange (ETDEWEB)

    Frey, H.C.; Rubin, E.S.

    1991-04-01

    The objective of this work is to develop and apply a method for research planning for advanced process technologies. To satisfy requirements for research planning, it is necessary to: (1) identify robust solutions to process design questions in the face of uncertainty to eliminate inferior design options; (2) identify key problem areas in a technology that should be the focus of further research to reduce the risk of technology failure; (3) compare competing technologies on a consistent basis to determine the risks associated with adopting a new technology; and (4) evaluate the effects that additional research might have on comparisons with conventional technology. An important class of process technologies are electric power plants. In particular, advanced clean coal technologies are expected to play a key role in the energy and environmental future of the US, as well as in other countries. Research planning for advanced clean coal technology development is an important part of energy and environmental policy. Thus, the research planning method developed here is applied to case studies focusing on a specific clean coal technology. The purpose of the case studies is both to demonstrate the research planning method and to obtain technology-specific conclusions regarding research strategies.

  9. Estimation methods for process holdup of special nuclear materials

    International Nuclear Information System (INIS)

    Pillay, K.K.S.; Picard, R.R.; Marshall, R.S.

    1984-06-01

    The US Nuclear Regulatory Commission sponsored a research study at the Los Alamos National Laboratory to explore the possibilities of developing statistical estimation methods for materials holdup at highly enriched uranium (HEU)-processing facilities. Attempts at using historical holdup data from processing facilities and selected holdup measurements at two operating facilities confirmed the need for high-quality data and reasonable control over process parameters in developing statistical models for holdup estimations. A major effort was therefore directed at conducting large-scale experiments to demonstrate the value of statistical estimation models from experimentally measured data of good quality. Using data from these experiments, we developed statistical models to estimate residual inventories of uranium in large process equipment and facilities. Some of the important findings of this investigation are the following: prediction models for the residual holdup of special nuclear material (SNM) can be developed from good-quality historical data on holdup; holdup data from several of the equipment used at HEU-processing facilities, such as air filters, ductwork, calciners, dissolvers, pumps, pipes, and pipe fittings, readily lend themselves to statistical modeling of holdup; holdup profiles of process equipment such as glove boxes, precipitators, and rotary drum filters can change with time; therefore, good estimation of residual inventories in these types of equipment requires several measurements at the time of inventory; although measurement of residual holdup of SNM in large facilities is a challenging task, reasonable estimates of the hidden inventories of holdup to meet the regulatory requirements can be accomplished through a combination of good measurements and the use of statistical models. 44 references, 62 figures, 43 tables

  10. Testing of the Defense Waste Processing Facility Cold Chemical Dissolution Method in Sludge Batch 9 Qualification

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, T. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Pareizs, J. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Coleman, C. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Young, J. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Brown, L. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2016-05-10

    For each sludge batch that is processed in the Defense Waste Processing Facility (DWPF), the Savannah River National Laboratory (SRNL) tests the applicability of the digestion methods used by the DWPF Laboratory for elemental analysis of Sludge Receipt and Adjustment Tank (SRAT) Receipt samples and SRAT Product process control samples. DWPF SRAT samples are typically dissolved using a method referred to as the DWPF Cold Chemical or Cold Chem Method (CC) (see DWPF Procedure SW4-15.201). Testing indicates that the CC method produced mixed results. The CC method did not result in complete dissolution of either the SRAT Receipt or SRAT Product, with some fine, dark solids remaining. However, elemental analyses did not reveal extreme biases for the major elements in the sludge when compared with analyses obtained following dissolution by hot aqua regia (AR) or sodium peroxide fusion (PF) methods. The CC elemental analyses agreed with the AR and PF methods well enough that the CC method should be adequate for routine process control analyses in the DWPF after much more extensive side-by-side tests of the CC method and the PF method are performed on the first 10 SRAT cycles of the Sludge Batch 9 (SB9) campaign. The DWPF Laboratory should continue with their plans for further tests of the CC method during these 10 SRAT cycles.

  11. Signal processing methods for in-situ creep specimen monitoring

    Science.gov (United States)

    Guers, Manton J.; Tittmann, Bernhard R.

    2018-04-01

    Previous work investigated using guided waves for monitoring creep deformation during accelerated life testing. The basic objective was to relate observed changes in the time-of-flight to changes in the environmental temperature and specimen gage length. The work presented in this paper investigated several signal processing strategies for possible application in the in-situ monitoring system. Signal processing methods for both group velocity (wave-packet envelope) and phase velocity (peak tracking) time-of-flight were considered. Although the Analytic Envelope found via the Hilbert transform is commonly applied for group velocity measurements, erratic behavior in the indicated time-of-flight was observed when this technique was applied to the in-situ data. The peak tracking strategies tested had generally linear trends, and tracking local minima in the raw waveform ultimately showed the most consistent results.
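
    The two time-of-flight strategies contrasted above can be sketched side by side: the analytic-envelope (group velocity) estimate via the Hilbert transform versus tracking a peak of the raw waveform (phase velocity). The Gaussian-windowed tone burst below is a synthetic stand-in for a guided-wave packet; the sample rate, carrier frequency, and arrival time are assumptions.

```python
import numpy as np
from scipy.signal import hilbert

fs = 10e6                                    # sample rate, Hz (assumed)
t = np.arange(0, 200e-6, 1 / fs)
t0 = 80e-6                                   # true packet arrival time (assumed)

# Gaussian-windowed 500 kHz tone burst as a synthetic wave packet.
packet = np.exp(-((t - t0) / 8e-6) ** 2) * np.cos(2 * np.pi * 500e3 * (t - t0))

envelope = np.abs(hilbert(packet))           # analytic envelope
tof_group = t[np.argmax(envelope)]           # envelope peak: group-velocity ToF
tof_phase = t[np.argmax(packet)]             # raw-waveform peak: phase tracking

print(tof_group, tof_phase)
```

    On this clean synthetic signal both estimates agree; the paper's observation is that on noisy in-situ data the envelope peak can jump erratically between lobes, whereas tracking a local extremum of the raw waveform drifts smoothly with temperature and strain.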

  12. Extracellular Signatures as Indicators of Processing Methods

    Energy Technology Data Exchange (ETDEWEB)

    Wahl, Karen L.

    2012-01-09

    As described in other chapters within this volume, many aspects of microbial cells vary with culture conditions and therefore can potentially be analyzed as forensic signatures of growth conditions. In addition to changes or variations in components of the microbes themselves, extracellular materials indicative of production processes may remain associated with the final bacterial product. It is well recognized that even with considerable effort to make pure products such as fine chemicals or pharmaceuticals, trace impurities from components or synthesis steps associated with production processes can be detected in the final product. These impurities can be used as indicators of production source or methods, such as to help connect drugs of abuse to supply chains. Extracellular residue associated with microbial cells could similarly help to characterize production processes. For successful growth of microorganisms on culture media there must be an available source of carbon, nitrogen, inorganic phosphate and sulfur, trace metals, water and vitamins. The pH, temperature, and a supply of oxygen or other gases must also be appropriate for a given organism for successful culture. The sources of these components and the range in temperature, pH and other variables has adapted over the years with currently a wide range of possible combinations of media components, recipes and parameters to choose from for a given organism. Because of this wide variability in components, mixtures of components, and other parameters, there is the potential for differentiation of cultured organisms based on changes in culture conditions. The challenge remains how to narrow the field of potential combinations and be able to attribute variations in the final bacterial product and extracellular signatures associated with the final product to information about the culture conditions or recipe used in the production of that product.

  13. [Influence of different processing methods and mature stages on 3,29-dibenzoyl rarounitriol of Trichosanthes kirilowii seeds].

    Science.gov (United States)

    Liu, Jin-Na; Xie, Xiao-Liang; Yang, Tai-Xin; Zhang, Cun-Li; Jia, Dong-Sheng; Liu, Ming; Wen, Chun-Xiu

    2014-04-01

    To study the effect of different maturity stages and processing methods on the quality of Trichosanthes kirilowii seeds, the content of 3,29-dibenzoyl rarounitriol in the seeds was determined by HPLC. Samples at different maturity stages (immature, nearly mature and fully mature) and processed by different methods were studied. Fully mature Trichosanthes kirilowii seeds were better than immature ones, and the best processing method was drying at 60 °C, with which the content of 3,29-dibenzoyl rarounitriol reached 131.63 μg/mL. Different processing methods and maturity stages had a significant influence on the quality of Trichosanthes kirilowii seeds.

  14. A Data Pre-Processing Model for the Topsis Method

    Directory of Open Access Journals (Sweden)

    Kobryń Andrzej

    2016-12-01

    TOPSIS is one of the most popular methods of multi-criteria decision making (MCDM). Its fundamental role is to establish a ranking of the chosen alternatives based on their distances from the ideal and negative-ideal solutions. Three primary versions of the TOPSIS method are distinguished: classical, interval and fuzzy, whose calculation algorithms are adjusted to the character of the input ratings of the decision-making alternatives (real numbers, interval data or fuzzy numbers). Various specialist publications describe the use of particular versions of the TOPSIS method in the decision-making process; the fuzzy version is particularly popular. It should be noticed, however, that depending on the character of the accepted criteria, the ratings of alternatives can be heterogeneous. The present paper suggests how to proceed when the set of criteria includes criteria characteristic of each of the mentioned versions of TOPSIS, as a result of which the rating of the alternatives is heterogeneous. The calculation procedure is illustrated by a numerical example.
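
    The classical (real-number) TOPSIS variant referred to above can be sketched as: vector normalization, weighting, distances to the ideal and negative-ideal solutions, and the closeness coefficient. The decision matrix, weights, and criterion directions below are illustrative assumptions, not the paper's numerical example.

```python
import numpy as np

# Rows = alternatives, columns = criteria (illustrative data).
X = np.array([[250.0, 16.0, 12.0],
              [200.0, 16.0,  8.0],
              [300.0, 32.0, 16.0]])
weights = np.array([0.4, 0.3, 0.3])
benefit = np.array([False, True, True])      # criterion 1 is a cost criterion

R = X / np.linalg.norm(X, axis=0)            # vector normalization
V = R * weights                              # weighted normalized matrix

# Ideal solution: best value per criterion; negative-ideal: worst value.
ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
anti = np.where(benefit, V.min(axis=0), V.max(axis=0))

d_plus = np.linalg.norm(V - ideal, axis=1)   # distance to ideal
d_minus = np.linalg.norm(V - anti, axis=1)   # distance to negative-ideal
closeness = d_minus / (d_plus + d_minus)     # closeness coefficient in (0, 1)

ranking = np.argsort(-closeness)             # best alternative first
print(closeness.round(3), ranking)
```

    The interval and fuzzy versions replace the crisp ratings and distance computations with interval arithmetic or fuzzy-number operations, but keep this same overall scheme.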

  15. System and method for integrating hazard-based decision making tools and processes

    Science.gov (United States)

    Hodgin, C Reed [Westminster, CO

    2012-03-20

    A system and method for inputting, analyzing, and disseminating information necessary for identified decision-makers to respond to emergency situations. This system and method provides consistency and integration among multiple groups, and may be used for both initial consequence-based decisions and follow-on consequence-based decisions. The system and method in a preferred embodiment also provides tools for accessing and manipulating information that are appropriate for each decision-maker, in order to achieve more reasoned and timely consequence-based decisions. The invention includes processes for designing and implementing a system or method for responding to emergency situations.

  16. Application of the dual reciprocity boundary element method for numerical modelling of solidification process

    Directory of Open Access Journals (Sweden)

    E. Majchrzak

    2008-12-01

    Full Text Available The dual reciprocity boundary element method is applied for numerical modelling of solidification process. This variant of the BEM is connected with the transformation of the domain integral to the boundary integrals. In the paper the details of the dual reciprocity boundary element method are presented and the usefulness of this approach to solidification process modelling is demonstrated. In the final part of the paper the examples of computations are shown.

  17. Estimation of the Thermal Process in the Honeycomb Panel by a Monte Carlo Method

    Science.gov (United States)

    Gusev, S. A.; Nikolaev, V. N.

    2018-01-01

    A new Monte Carlo method for estimating the thermal state of heat insulation containing honeycomb panels is proposed in the paper. The heat transfer in a honeycomb panel is described by a boundary value problem for a parabolic equation with a discontinuous diffusion coefficient and boundary conditions of the third kind. To obtain an approximate solution, it is proposed to smooth the diffusion coefficient. The resulting problem is then solved on the basis of the probability representation, in which the solution is the expectation of a functional of the diffusion process corresponding to the boundary value problem. Solving the problem thus reduces to the numerical statistical modelling of a large number of trajectories of the diffusion process corresponding to the parabolic problem. The Euler method was used for this problem earlier, but it requires a large computational effort. In this paper the method is modified by combining the Euler method with the random walk on moving spheres method. The new approach allows us to significantly reduce the computation costs.
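    The probability representation underlying such methods can be illustrated in a much simpler setting than the paper's (free space, constant diffusion coefficient, no boundary conditions or smoothing step): by the Feynman-Kac formula, the solution of u_t = D u_xx is u(x,t) = E[u0(x + sqrt(2Dt) Z)] with Z standard normal, so it can be estimated by averaging over sampled trajectories' endpoints. All parameter values below are illustrative:

    ```python
    import math
    import random

    def heat_mc(x, t, diffusivity, u0, n_paths=200_000, seed=1):
        """Feynman-Kac Monte Carlo estimate of u(x, t) for u_t = D u_xx on the real line:
        u(x, t) = E[u0(x + sqrt(2 D t) Z)], Z ~ N(0, 1)."""
        rng = random.Random(seed)
        s = math.sqrt(2.0 * diffusivity * t)
        total = sum(u0(x + s * rng.gauss(0.0, 1.0)) for _ in range(n_paths))
        return total / n_paths

    # with u0 = cos, the exact solution is exp(-D t) * cos(x), so we can check the estimate
    D, x, t = 0.5, 0.3, 1.0
    est = heat_mc(x, t, D, math.cos)
    exact = math.exp(-D * t) * math.cos(x)
    ```

    The paper's setting additionally handles a discontinuous (smoothed) coefficient and third-kind boundary conditions, which is where the combined Euler / walk-on-moving-spheres trajectory simulation comes in.
    
    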

  18. Effect of different Processing Methods on the Vitamin A content of ...

    African Journals Online (AJOL)

    Objectives: This study was designed to identify commonly used vegetables and assess the effect of processing on the vitamin A content of four commonly used vegetables. Materials and methods: Data was collected from one hundred women systematically selected using structured, validated and pre-tested questionnaire.

  19. Method of processing titanium aluminium alloys modified by chromium and niobium

    International Nuclear Information System (INIS)

    Huang, S.C.

    1991-01-01

    This patent describes a method of processing a TiAl base alloy to impart desirable strength and ductility properties, which comprises providing a melt of the TiAl base alloy having the formula Ti 51-42 Al 46-50 Cr 1-3 Nb 1-5

  20. A Mixed-Methods Research Framework for Healthcare Process Improvement.

    Science.gov (United States)

    Bastian, Nathaniel D; Munoz, David; Ventura, Marta

    2016-01-01

    The healthcare system in the United States is spiraling out of control due to ever-increasing costs without significant improvements in quality, access to care, satisfaction, and efficiency. Efficient workflow is paramount to improving healthcare value while maintaining the utmost standards of patient care and provider satisfaction in high-stress environments. This article provides healthcare managers and quality engineers with a practical healthcare process improvement framework to assess, measure and improve clinical workflow processes. The proposed mixed-methods research framework integrates qualitative and quantitative tools to foster the improvement of processes and workflow in a systematic way. The framework consists of three distinct phases: 1) stakeholder analysis, 2a) survey design, 2b) time-motion study, and 3) process improvement. The proposed framework is applied to the pediatric intensive care unit of the Penn State Hershey Children's Hospital. The implementation of this methodology led to the identification and categorization of workflow tasks and activities into value-added and non-value-added categories, in an effort to provide more valuable and higher-quality patient care. Based upon the lessons learned from the case study, the three-phase methodology provides a better, broader, leaner, and holistic assessment of clinical workflow. The proposed framework can be implemented in various healthcare settings to support continuous improvement efforts in which complexity is a daily element that impacts workflow. We proffer a general methodology for process improvement in a healthcare setting, providing decision makers and stakeholders with a useful framework to help their organizations improve efficiency. Published by Elsevier Inc.

  1. Quantitative sociodynamics stochastic methods and models of social interaction processes

    CERN Document Server

    Helbing, Dirk

    1995-01-01

    Quantitative Sociodynamics presents a general strategy for interdisciplinary model building and its application to a quantitative description of behavioural changes based on social interaction processes. Originally, the crucial methods for the modeling of complex systems (stochastic methods and nonlinear dynamics) were developed in physics but they have very often proved their explanatory power in chemistry, biology, economics and the social sciences. Quantitative Sociodynamics provides a unified and comprehensive overview of the different stochastic methods, their interrelations and properties. In addition, it introduces the most important concepts from nonlinear dynamics (synergetics, chaos theory). The applicability of these fascinating concepts to social phenomena is carefully discussed. By incorporating decision-theoretical approaches a very fundamental dynamic model is obtained which seems to open new perspectives in the social sciences. It includes many established models as special cases, e.g. the log...

  2. Quantitative Sociodynamics Stochastic Methods and Models of Social Interaction Processes

    CERN Document Server

    Helbing, Dirk

    2010-01-01

    This new edition of Quantitative Sociodynamics presents a general strategy for interdisciplinary model building and its application to a quantitative description of behavioral changes based on social interaction processes. Originally, the crucial methods for the modeling of complex systems (stochastic methods and nonlinear dynamics) were developed in physics and mathematics, but they have very often proven their explanatory power in chemistry, biology, economics and the social sciences as well. Quantitative Sociodynamics provides a unified and comprehensive overview of the different stochastic methods, their interrelations and properties. In addition, it introduces important concepts from nonlinear dynamics (e.g. synergetics, chaos theory). The applicability of these fascinating concepts to social phenomena is carefully discussed. By incorporating decision-theoretical approaches, a fundamental dynamic model is obtained, which opens new perspectives in the social sciences. It includes many established models a...

  3. Method for processing coal-enrichment waste with solid and volatile fuel inclusions

    Science.gov (United States)

    Khasanova, A. V.; Zhirgalova, T. B.; Osintsev, K. V.

    2017-10-01

    The method relates to the field of industrial heat and power engineering and can be used in coal preparation plants for processing coal waste. The new process yields a loose ash residue, directed to the production of silicate products, and fuel gas, produced in rotary kilns. The proposed method concerns the industrial processing of brown coal beneficiation waste. The waste is obtained by flotation separation of rock particles up to 13 mm in size from coal particles and contains both solid and volatile fuel inclusions (components). Due to their high moisture content, significant rock content and low heat of combustion, these wastes are not used in power boilers; they are stored in dumps, polluting the environment.

  4. Application of Data Smoothing Method in Signal Processing for Vortex Flow Meters

    Directory of Open Access Journals (Sweden)

    Zhang Jun

    2017-01-01

    Full Text Available The vortex flow meter is a typical piece of flow measurement equipment. Its measurement output signals can easily be impaired by environmental conditions. In order to obtain an improved estimate of the time-averaged velocity from the vortex flow meter, a signal filtering method is applied in this paper. The method is based on a simple Savitzky-Golay smoothing filter algorithm. Following the algorithm, a numerical program was developed in Python with the scientific library NumPy. Two sample data sets were processed by the program. The results demonstrate that the processed data are acceptable compared with the original data, and an improved estimate of the time-averaged velocity is obtained from the smoothed curves. The simple data smoothing program proved usable and stable for this filter.
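    A Savitzky-Golay filter of the kind described replaces each sample by the value of a low-order polynomial fitted by least squares over a sliding window. A minimal NumPy sketch (the window length, polynomial order, and test signal are illustrative choices, not the paper's settings):

    ```python
    import numpy as np

    def savgol_smooth(y, window=11, order=3):
        """Savitzky-Golay smoothing: least-squares polynomial fit in a sliding window,
        evaluated at the window centre. `window` must be odd and > `order`."""
        half = window // 2
        offsets = np.arange(-half, half + 1)
        A = np.vander(offsets, order + 1, increasing=True)   # local polynomial design matrix
        coeffs = np.linalg.pinv(A)[0]    # row giving the fitted value at offset 0
        ypad = np.pad(y, half, mode="edge")                  # extend the ends
        return np.convolve(ypad, coeffs[::-1], mode="valid")

    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 1.0, 200)
    clean = np.sin(2 * np.pi * t)
    noisy = clean + 0.1 * rng.standard_normal(t.size)
    smooth = savgol_smooth(noisy)

    # the filter reproduces any polynomial up to the chosen order exactly (window interior)
    xs = np.arange(50.0)
    quad = 1.0 + 2.0 * xs - 0.05 * xs ** 2
    exact_interior = bool(np.allclose(savgol_smooth(quad)[5:-5], quad[5:-5]))
    ```

    Unlike a plain moving average, the polynomial fit preserves peak heights and widths while still suppressing high-frequency noise, which is why it suits the time-averaged velocity estimation described above.
    
    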

  5. Processing method for high resolution monochromator

    International Nuclear Information System (INIS)

    Kiriyama, Koji; Mitsui, Takaya

    2006-12-01

    A processing method for high resolution monochromator (HRM) has been developed at Japanese Atomic Energy Agency/Quantum Beam Science Directorate/Synchrotron Radiation Research unit at SPring-8. For manufacturing a HRM, a sophisticated slicing machine and X-ray diffractometer have been installed for shaping a crystal ingot and orienting precisely the surface of a crystal ingot, respectively. The specification of the slicing machine is following; Maximum size of a diamond blade is φ 350mm in diameter, φ 38.1mm in the spindle diameter, and 2mm in thickness. A large crystal such as an ingot with 100mm in diameter, 200mm in length can be cut. Thin crystal samples such as a wafer can be also cut using by another sample holder. Working distance of a main shaft with the direction perpendicular to working table in the machine is 350mm at maximum. Smallest resolution of the main shaft with directions of front-and-back and top-and-bottom are 0.001mm read by a digital encoder. 2mm/min can set for cutting samples in the forward direction. For orienting crystal faces relative to the blade direction adjustment, a one-circle goniometer and 2-circle segment are equipped on the working table in the machine. A rotation and a tilt of the stage can be done by manual operation. Digital encoder in a turn stage is furnished and has angle resolution of less than 0.01 degrees. In addition, a hand drill as a supporting device for detailed processing of crystal is prepared. Then, an ideal crystal face can be cut from crystal samples within an accuracy of about 0.01 degrees. By installation of these devices, a high energy resolution monochromator crystal for inelastic x-ray scattering and a beam collimator are got in hand and are expected to be used for nanotechnology studies. (author)

  6. An integrated condition-monitoring method for a milling process using reduced decomposition features

    International Nuclear Information System (INIS)

    Liu, Jie; Wu, Bo; Hu, Youmin; Wang, Yan

    2017-01-01

    Complex and non-stationary cutting chatter affects productivity and quality in the milling process. Developing an effective condition-monitoring approach is critical to accurately identify cutting chatter. In this paper, an integrated condition-monitoring method is proposed, where reduced features are used to efficiently recognize and classify machine states in the milling process. In the proposed method, vibration signals are decomposed into multiple modes with variational mode decomposition, and Shannon power spectral entropy is calculated to extract features from the decomposed signals. Principal component analysis is adopted to reduce feature size and computational cost. With the extracted feature information, the probabilistic neural network model is used to recognize and classify the machine states, including stable, transition, and chatter states. Experimental studies are conducted, and results show that the proposed method can effectively detect cutting chatter during different milling operation conditions. This monitoring method is also efficient enough to satisfy fast machine state recognition and classification. (paper)
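    Two of the building blocks named in the abstract, Shannon power spectral entropy as a feature and principal component analysis for feature reduction, can be sketched with NumPy. The signals and feature matrix below are synthetic stand-ins, not milling data, and this is not the authors' implementation:

    ```python
    import numpy as np

    def shannon_spectral_entropy(signal):
        """Shannon entropy of the normalized power spectrum of a signal."""
        psd = np.abs(np.fft.rfft(signal)) ** 2
        p = psd / psd.sum()
        p = p[p > 0]                         # avoid log(0)
        return float(-(p * np.log(p)).sum())

    def pca_reduce(features, var_kept=0.95):
        """Project feature vectors onto the fewest principal components
        explaining at least `var_kept` of the total variance."""
        X = features - features.mean(axis=0)
        _, s, Vt = np.linalg.svd(X, full_matrices=False)
        explained = np.cumsum(s ** 2) / np.sum(s ** 2)
        k = int(np.searchsorted(explained, var_kept) + 1)
        return X @ Vt[:k].T

    rng = np.random.default_rng(0)
    t = np.arange(1024) / 1024.0
    ent_tone = shannon_spectral_entropy(np.sin(2 * np.pi * 50 * t))  # narrow spectrum
    ent_noise = shannon_spectral_entropy(rng.standard_normal(1024))  # broad spectrum
    reduced = pca_reduce(rng.standard_normal((30, 6)))               # toy feature matrix
    ```

    A concentrated spectrum (stable cutting of one decomposed mode) yields low entropy, while broadband chatter-like content yields high entropy; PCA then compresses the per-mode entropy features before they are fed to the classifier.
    
    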

  7. Singularity Processing Method of Microstrip Line Edge Based on LOD-FDTD

    Directory of Open Access Journals (Sweden)

    Lei Li

    2014-01-01

    Full Text Available In order to improve the accuracy and efficiency of analyzing microstrip structures, a singularity processing method is proposed, theoretically and experimentally, based on the fundamental locally one-dimensional finite difference time domain (LOD-FDTD scheme with second-order temporal accuracy (denoted FLOD2-FDTD. The proposed method can greatly improve the performance of the FLOD2-FDTD scheme, even when the conductor is embedded into more than half of a cell, by means of a coordinate transformation. The experimental results show that the proposed method achieves higher accuracy for time step sizes up to 5 times the limit allowed by the Courant-Friedrichs-Lewy (CFL condition. In comparison with previously reported methods, the proposed method for calculating the electromagnetic field near a microstrip line edge not only improves efficiency but also provides higher accuracy.
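    For reference, the CFL limit against which the "5 times" claim is measured has, for an explicit FDTD update on a uniform 3-D Yee grid, the standard textbook form dt <= 1 / (c sqrt(dx^-2 + dy^-2 + dz^-2)); this is the generic limit, not anything specific to the FLOD2 scheme (LOD schemes are unconditionally stable, which is what permits exceeding it). The 1 mm cell size is an arbitrary example:

    ```python
    import math

    def cfl_dt(dx, dy, dz, c=299_792_458.0):
        """Courant-Friedrichs-Lewy time-step limit for an explicit 3-D Yee FDTD grid."""
        return 1.0 / (c * math.sqrt(dx ** -2 + dy ** -2 + dz ** -2))

    dt_max = cfl_dt(1e-3, 1e-3, 1e-3)   # 1 mm uniform cell: about 1.93 ps
    dt_lod = 5.0 * dt_max               # the 5x CFL step size discussed in the abstract
    ```
    
    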

  8. Silver recovery from the waste materials by the method of flotation process

    OpenAIRE

    B. Oleksiak; G. Siwiec; A. Tomaszewska; D. Piękoś

    2018-01-01

    During the leaching of zinc concentrates, waste materials rich in various metals, such as silver, are produced. So far no attempts at silver recovery from these waste materials have been made, due to the lack of any method that would be both effective and economical. The paper presents some possibilities for the application of the flotation process to silver recovery from waste materials generated during zinc production.

  9. Method to minimize the organic waste in liquid-liquid extraction processes

    International Nuclear Information System (INIS)

    Schoen, J.; Ochsenfeld, W.

    1978-01-01

    In order to free the aqueous phases occurring in the Purex process for the reprocessing of irradiated nuclear fuels and breeder materials from tri-n-butyl phosphate (TBP), which is present only in small amounts but is highly interfering, and from its decomposition products, it is suggested to add to them a macroporous sorption resin based on polystyrene cross-linked with divinylbenzene. A method is also described for reprocessing these resins so that almost all components can be recycled. Seven detailed examples illustrate the method. (UWI) [de

  10. Comparing performances of clements, box-cox, Johnson methods with weibull distributions for assessing process capability

    Directory of Open Access Journals (Sweden)

    Ozlem Senvar

    2016-08-01

    Full Text Available Purpose: This study examines the Clements Approach (CA, Box-Cox transformation (BCT, and Johnson transformation (JT methods for process capability assessment of Weibull-distributed data with different parameters, in order to determine the effect of tail behaviour on process capability, and compares their estimation performance in terms of accuracy and precision. Design/methodology/approach: The process performance index (PPI Ppu is used for the process capability analysis (PCA because the comparisons are performed on Weibull data generated without subgroups. Box plots, descriptive statistics, the root-mean-square deviation (RMSD, used as a measure of error, and a radar chart are utilized together to evaluate the performance of the methods. In addition, both the bias of the estimated values and the efficiency, measured by the mean square error, are important; in this regard, the Relative Bias (RB and the Relative Root Mean Square Error (RRMSE are also considered. Findings: The results reveal that the performance of a method depends on its ability to fit the tail behaviour of the Weibull distribution and on the targeted values of the PPIs. The effect of tail behaviour is observed to be more significant when the process is more capable. Research limitations/implications: Other methods, such as the Weighted Variance method, which also give good results, were examined as well; however, including them would have made consistent comparison and interpretation between the methods confusing. Practical implications: The Weibull distribution covers a wide class of non-normal processes owing to its ability to yield a variety of distinct curves depending on its parameters. Weibull distributions are known to have significantly different tail behaviours, which greatly affect process capability.
    In quality and reliability applications, they are widely used for the analysis of failure data in order to understand how
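    The BCT route to a PPI can be sketched as follows: choose a Box-Cox lambda by profile likelihood to transform the Weibull data toward normality, transform the upper specification limit the same way, and compute Ppu = (USL - mean) / (3 sigma) on the transformed scale. The synthetic Weibull sample and specification limits below are illustrative and do not reproduce the study's simulation design:

    ```python
    import numpy as np

    def boxcox(x, lam):
        """Box-Cox transform; the lam -> 0 limit is the natural logarithm."""
        return np.log(x) if abs(lam) < 1e-12 else (x ** lam - 1.0) / lam

    def boxcox_mle(x, grid=np.linspace(-2.0, 2.0, 401)):
        """Pick the lambda maximizing the Box-Cox profile log-likelihood (grid search)."""
        logs = np.log(x).sum()
        def ll(lam):
            y = boxcox(x, lam)
            return -0.5 * len(x) * np.log(y.var()) + (lam - 1.0) * logs
        return max(grid, key=ll)

    def ppu_boxcox(x, usl):
        """Ppu = (USL - mean) / (3 sigma), computed on Box-Cox transformed data."""
        lam = boxcox_mle(x)
        y = boxcox(x, lam)
        u = boxcox(np.array([usl], dtype=float), lam)[0]     # transform the spec limit too
        return (u - y.mean()) / (3.0 * y.std(ddof=1))

    rng = np.random.default_rng(42)
    x = 2.0 * rng.weibull(1.5, 500)       # synthetic right-skewed process data
    ppu_loose = ppu_boxcox(x, 10.0)       # generous upper specification limit
    ppu_tight = ppu_boxcox(x, 5.0)        # tighter limit gives a smaller Ppu
    ```

    Because the Box-Cox transform is monotone increasing, tightening the specification limit always lowers the index; the CA and JT methods replace the transformation step with Pearson-curve percentiles and a Johnson-family fit, respectively.
    
    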

  11. Energy-saving method for technogenic waste processing

    Science.gov (United States)

    Dikhanbaev, Bayandy; Dikhanbaev, Aristan Bayandievich

    2017-01-01

    Dumps of the mining-metallurgical complexes of the post-Soviet republics have accumulated a huge amount of technogenic waste products; Kazakhstan alone has stockpiled about 20 billion tons. In the field of technogenic waste treatment, there is still no technical solution that makes it a profitable process. Recent global trends have prompted scientists to focus on developing an energy-saving and highly efficient melting unit that can significantly reduce specific fuel consumption. This paper reports the development of a new technological method: a smelt layer of inversion phase. The introduced method is characterized by a combination of ideal-stirring and ideal-displacement regimes. Using the method of affine modelling, the pilot plant's test results were recalculated for an industrial-scale sample. Experiments show that, in comparison with bubbling and boiling layers of smelt, the degree of zinc recovery increases in the layer of inversion phase, indicating a reduced probability of new formation of zinc silicates and ferrites from recombined molecules of ZnO, SiO2 and Fe2O3. Calculations show that for the industrial-scale sample the consumption of natural gas is reduced approximately twofold in comparison with a fuming furnace, and the specific fuel consumption approximately fourfold in comparison with a Waelz kiln. PMID:29281646

  12. A method of network topology optimization design considering application process characteristic

    Science.gov (United States)

    Wang, Chunlin; Huang, Ning; Bai, Yanan; Zhang, Shuo

    2018-03-01

    Communication networks are designed to meet the usage requirements of users for various network applications. Current studies of network topology optimization design mainly consider network traffic, which is the result of network application operation rather than a design element of communication networks. A network application is a procedure in which users make use of services with certain demanded performance requirements, and it has an obvious process characteristic. In this paper, we propose a method to optimize communication network topology design that takes the application process characteristic into account. Taking minimum network delay as the objective, and the network design cost and network connectivity reliability as constraints, an optimization model of network topology design is formulated, and the optimal topology is searched for by a Genetic Algorithm (GA). Furthermore, we investigate the influence of network topology parameters on network delay under multiple process-oriented applications, which can guide the generation of the initial population and thereby improve the efficiency of the GA. Numerical simulations show the effectiveness and validity of the proposed method. Network topology design that considers applications can improve the reliability of those applications and provide guidance for network builders in the early stage of network design, which is of great significance in engineering practice.
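    The optimization loop described, a GA searching candidate topologies with delay as the objective and cost plus connectivity as constraints, can be sketched in miniature. In this toy version (not the paper's model) hop count stands in for delay, a fixed link budget stands in for the design cost, and all GA parameters are illustrative:

    ```python
    import itertools
    import random

    def avg_hops(n, edges):
        """Mean shortest-path length in hops over all ordered node pairs;
        infinite if the graph is disconnected."""
        adj = {v: set() for v in range(n)}
        for a, b in edges:
            adj[a].add(b)
            adj[b].add(a)
        total = 0
        for src in range(n):
            dist = {src: 0}
            frontier = [src]
            while frontier:                       # breadth-first search from src
                nxt = []
                for u in frontier:
                    for v in adj[u]:
                        if v not in dist:
                            dist[v] = dist[u] + 1
                            nxt.append(v)
                frontier = nxt
            if len(dist) < n:
                return float("inf")
            total += sum(dist.values())
        return total / (n * (n - 1))

    def ga_topology(n=8, budget=12, pop=40, gens=60, seed=3):
        """Toy GA: choose a set of links (bitstring over node pairs) minimizing
        average hop count, with a link budget as the cost constraint and
        connectivity required (violations get infinite fitness)."""
        rng = random.Random(seed)
        pairs = list(itertools.combinations(range(n), 2))

        def fitness(bits):
            edges = [p for p, b in zip(pairs, bits) if b]
            if len(edges) > budget:               # over the cost budget
                return float("inf")
            return avg_hops(n, edges)             # infinite if disconnected

        def random_individual():
            bits = [0] * len(pairs)
            for i in rng.sample(range(len(pairs)), budget):
                bits[i] = 1
            return bits

        population = [random_individual() for _ in range(pop)]
        for _ in range(gens):
            nxt = [min(population, key=fitness)]  # elitism
            while len(nxt) < pop:
                a = min(rng.sample(population, 3), key=fitness)   # tournament selection
                b = min(rng.sample(population, 3), key=fitness)
                cut = rng.randrange(len(pairs))
                child = a[:cut] + b[cut:]                         # one-point crossover
                child[rng.randrange(len(pairs))] ^= 1             # point mutation
                nxt.append(child)
            population = nxt
        best = min(population, key=fitness)
        return [p for p, b in zip(pairs, best) if b], fitness(best)

    edges, delay = ga_topology()
    ```

    A realistic design would replace hop count with a traffic-dependent delay model and the flat link budget with per-link costs, and would seed the initial population using the topology-parameter analysis the abstract mentions.
    
    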

  13. The advanced CECE process for enriching tritium by the chemical exchange method with a hydrophobic catalyst

    International Nuclear Information System (INIS)

    Kitamoto, Asashi; Shimizu, Masami; Masui, Takashi.

    1992-01-01

    The monothermal chemical exchange process with electrolysis, i.e. the CECE process, is an effective method for enriching and removing tritium from tritiated water of low to middle level activity. The purpose of this study is to present the theoretical background of a two-parameter evaluation method, based on a two-step isotope exchange reaction between hydrogen gas and liquid water, for improving the performance of a hydrophobic catalyst in a trickle bed-type column. The two-parameter method attained the highest isotope separation performance and the lowest liquid holdup for a trickle bed-type column, and it therefore offers effective and practical procedures for scaling up a tritium enrichment process. The main engineering design and system evaluation task for the CECE process was to develop an isotope exchange column with a high performance catalyst. (author)

  14. New applications of old processes in nondestructive testing - irradiation and backscatter methods

    International Nuclear Information System (INIS)

    Segebade, C.

    1995-01-01

    The application of two non-destructive test processes based on photon irradiation measurement is described. The photon backscatter process and the transmission measurement were used both in technical applications and in the examination of art objects. With the aid of the two-beam absorption method, wall thicknesses of large liquid containers made of polyethylene and of steel were measured. The same process, with a somewhat modified test rig, was used to measure the pipe wall thickness of an antique musical instrument. Components made of turbine blade material were excited to X-ray fluorescence with a radionuclide source and analysed with a semiconductor detector; this is particularly advantageous for elements which cannot be determined, or can only be determined with difficulty, by 'conventional' methods (e.g. yttrium, rhenium). The wall thickness measurement of large (approx. 6 m diameter) plastic pipes with the aid of gamma backscatter is also described, as is moisture measurement in brick material. Finally, wood profile measurement in a stringed instrument with the aid of gamma backscatter is reported. (orig./HP) [de

  15. Methods of modeling and optimization of work effects for chosen mineral processing systems

    Directory of Open Access Journals (Sweden)

    Tomasz Niedoba

    2005-11-01

    Full Text Available The methods used in mineral processing modeling are reviewed in this paper. In particular, the heuristic approach is presented. New, modern techniques of modeling and optimization are proposed, including the least median squares method and genetic algorithms; the rules of the latter are described in detail.

  16. The application of nursing process method in training nurses working in the department of interventional radiology

    International Nuclear Information System (INIS)

    Ni Daihui; Wang Hongjuan; Yang Yajuan; Ye Rui; Qu Juan; Li Xinying; Xu Ying

    2010-01-01

    Objective: To describe the training procedure, typical training methods and the clinical effect of the nursing process method used to train nurses working in the interventional ward. Methods: According to the evaluation index, the authors made a detailed assessment of each nurse and identified the problems that needed improvement for each individual; practicable measures were then drawn up for each nurse, and the clinical results were evaluated after the training course. Results: After nurses at different technical levels were trained with the nursing process method, the comprehensive quality of each nurse improved to a varying degree, and the overall nursing quality of the entire department was also markedly improved. Conclusion: By using the nursing process method, the training period can be effectively shortened, and the waste of time, manpower, material and energy caused by blind training plans can be avoided. (authors)

  17. Regularization of the double period method for experimental data processing

    Science.gov (United States)

    Belov, A. A.; Kalitkin, N. N.

    2017-11-01

    In physical and technical applications, an important task is to process experimental curves measured with large errors. Such problems are solved by applying regularization methods, whose success depends on the mathematician's intuition. We propose an approximation based on the double period method developed for smooth nonperiodic functions. Tikhonov's stabilizer with a squared second derivative is used for regularization. As a result, spurious oscillations are suppressed and the shape of an experimental curve is accurately represented. This approach offers a universal strategy for solving a broad class of problems. The method is illustrated by approximating cross sections of nuclear reactions important for controlled thermonuclear fusion. Tables recommended as reference data are obtained. These results are used to calculate the reaction rates, which are approximated in a way convenient for gas-dynamic codes. These approximations are superior to previously known formulas in both the covered temperature range and accuracy.
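    A Tikhonov stabilizer with a squared second derivative leads, in the discrete least-squares setting, to the linear system (I + lambda D2^T D2) x = y, where D2 is the second-difference operator. A small NumPy sketch of that smoothing step (the regularization weight and test signal are illustrative; this is not the double period method itself):

    ```python
    import numpy as np

    def tikhonov_smooth(y, lam=50.0):
        """Minimize ||x - y||^2 + lam * ||D2 x||^2, with D2 the second-difference
        operator (a discrete analogue of the squared-second-derivative stabilizer)."""
        n = len(y)
        D2 = np.diff(np.eye(n), n=2, axis=0)   # (n-2) x n matrix of rows [1, -2, 1]
        A = np.eye(n) + lam * D2.T @ D2
        return np.linalg.solve(A, y)

    rng = np.random.default_rng(1)
    t = np.linspace(0.0, 2.0 * np.pi, 100)
    clean = np.sin(t)
    noisy = clean + 0.3 * rng.standard_normal(t.size)
    smooth = tikhonov_smooth(noisy)

    # the penalty vanishes on straight lines, so linear trends pass through unchanged
    line = np.linspace(0.0, 5.0, 40)
    line_ok = bool(np.allclose(tikhonov_smooth(line), line))
    ```

    Suppressing the second-difference energy is exactly what damps the spurious oscillations the abstract mentions while leaving the underlying curve shape intact.
    
    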

  18. Sample processing method for the determination of perchlorate in milk

    International Nuclear Information System (INIS)

    Dyke, Jason V.; Kirk, Andrea B.; Kalyani Martinelango, P.; Dasgupta, Purnendu K.

    2006-01-01

    In recent years, many different water sources and foods have been reported to contain perchlorate. Studies indicate that significant levels of perchlorate are present in both human and dairy milk. The determination of perchlorate in milk is particularly important due to its potential health impact on infants and children. As for many other biological samples, sample preparation is more time consuming than the analysis itself. The concurrent presence of large amounts of fats, proteins, carbohydrates, etc., demands some initial cleanup; otherwise the separation column lifetime and the limit of detection are both greatly compromised. Reported milk processing methods require the addition of chemicals such as ethanol, acetic acid or acetonitrile. Reagent addition is undesirable in trace analysis. We report here an essentially reagent-free sample preparation method for the determination of perchlorate in milk. Milk samples are spiked with isotopically labeled perchlorate and centrifuged to remove lipids. The resulting liquid is placed in a disposable centrifugal ultrafilter device with a molecular weight cutoff of 10 kDa, and centrifuged. Approximately 5-10 ml of clear liquid, ready for analysis, is obtained from a 20 ml milk sample. Both bovine and human milk samples have been successfully processed and analyzed by ion chromatography-mass spectrometry (IC-MS). Standard addition experiments show good recoveries. The repeatability of the analytical result for the same sample in multiple sample cleanup runs ranged from 3 to 6% R.S.D. This processing technique has also been successfully applied for the determination of iodide and thiocyanate in milk
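    The isotopically labeled spike acts as an internal standard: because the label suffers the same losses as the native analyte during centrifugation and ultrafiltration, the analyte amount can be recovered from the measured area ratio. A schematic single-point calculation (the peak areas and amounts are hypothetical, and equal MS response for the two isotopologues is assumed; real IC-MS quantitation would use a calibration against standards):

    ```python
    def isotope_dilution_conc(area_analyte, area_label, label_ng_added, sample_ml):
        """Single-point isotope-dilution estimate in ng/mL, assuming the labeled
        spike and the native analyte have equal detector response."""
        return (area_analyte / area_label) * label_ng_added / sample_ml

    # hypothetical run: 10 ng of labeled perchlorate spiked into a 20 mL milk sample
    conc = isotope_dilution_conc(area_analyte=5400.0, area_label=9000.0,
                                 label_ng_added=10.0, sample_ml=20.0)
    ```
    
    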

  19. A novel process control method for a TT-300 E-Beam/X-Ray system

    Science.gov (United States)

    Mittendorfer, Josef; Gallnböck-Wagner, Bernhard

    2018-02-01

    This paper presents some aspects of the process control method for a TT-300 E-Beam/X-Ray system at Mediscan, Austria. The novelty of the approach is the seamless integration of routine monitoring dosimetry with process data, which makes it possible to calculate a parametric dose for each production unit and consequently enables fine-grained, holistic process performance monitoring. Process performance is documented in process control charts for the analysis of individual runs as well as for historic trending of runs of specific process categories over a specified time range.

  20. Silver recovery from the waste materials by the method of flotation process

    Directory of Open Access Journals (Sweden)

    B. Oleksiak

    2018-01-01

    Full Text Available During the leaching of zinc concentrates, waste materials rich in various metals, such as silver, are produced. So far no attempts at silver recovery from these waste materials have been made, due to the lack of any method that would be both effective and economical. The paper presents some possibilities for the application of the flotation process to silver recovery from waste materials generated during zinc production.

  1. Quality control of roll-to-roll processed polymer solar modules by complementary imaging methods

    DEFF Research Database (Denmark)

    Rösch, R.; Krebs, Frederik C; Tanenbaum, D.M.

    2012-01-01

    We applied complementary imaging methods to investigate processing failures of roll-to-roll solution-processed polymer solar modules based on polymer:fullerene bulk heterojunctions. For the investigation of processing deficiencies in solar modules we employed dark lock-in thermography (DLIT), electroluminescence imaging (ELI) and photoluminescence/reflection imaging (PLI/RI), complemented by optical imaging (OI). The combination of all high resolution images allowed us to trace the origin of processing errors to a specific deposition process, i.e. the insufficient coverage of an electrode interlayer...

  2. A Proactive Complex Event Processing Method for Large-Scale Transportation Internet of Things

    OpenAIRE

    Wang, Yongheng; Cao, Kening

    2014-01-01

    The Internet of Things (IoT) provides a new way to improve the transportation system. The key issue is how to process the numerous events generated by IoT. In this paper, a proactive complex event processing method is proposed for large-scale transportation IoT. Based on a multilayered adaptive dynamic Bayesian model, a Bayesian network structure learning algorithm using search-and-score is proposed to support accurate predictive analytics. A parallel Markov decision processes model is design...

  3. Flexural Strength of Acrylic Resin Denture Bases Processed by Two Different Methods

    Directory of Open Access Journals (Sweden)

    Jafar Gharechahi

    2014-09-01

    Full Text Available Background and aims. The aim of this study was to compare the flexural strength of specimens processed by conventional and injection-molding techniques. Materials and methods. Conventional pressure-packed PMMA was used for the conventional technique and injection-molded PMMA for the injection-molding technique. After processing, 15 specimens were stored in distilled water at room temperature until measured. A three-point flexural strength test was carried out. Statistical analysis was performed in SPSS using the t-test, with statistical significance defined at P<0.05. Results. The flexural strength of injection-molded acrylic resin specimens was higher than that of the conventionally processed specimens, and the difference was statistically significant (P=0.006). Conclusion. Within the limitations of this study, the flexural strength of acrylic resin specimens was influenced by the molding technique.
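    The reported comparison is a two-sample t-test on the flexural strength values of the two groups. A minimal pooled-variance sketch; the strength values and the group size of 10 below are made up for illustration and are not the study's measurements:

    ```python
    import math
    import statistics

    def students_t(a, b):
        """Two-sample t statistic with pooled variance (equal variances assumed)."""
        na, nb = len(a), len(b)
        va, vb = statistics.variance(a), statistics.variance(b)
        sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)   # pooled variance
        return (statistics.mean(a) - statistics.mean(b)) / math.sqrt(sp2 * (1 / na + 1 / nb))

    # illustrative flexural-strength values in MPa (not the study's raw data)
    conventional = [80.1, 82.4, 79.5, 81.0, 83.2, 78.9, 80.7, 82.0, 79.8, 81.5]
    injection    = [85.3, 87.1, 84.6, 86.0, 88.2, 84.9, 85.7, 86.8, 85.1, 86.4]
    t = students_t(injection, conventional)
    ```

    With df = n1 + n2 - 2 = 18 here, a |t| above the two-tailed 5% critical value of 2.101 indicates a significant difference, mirroring the study's reported P = 0.006 < 0.05.
    
    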

  4. Evaluation of polymer micro parts produced by additive manufacturing processes using vat photopolymerization method

    DEFF Research Database (Denmark)

    Davoudinejad, Ali; Pedersen, David Bue; Tosello, Guido

    2017-01-01

    Micro manufacturing scale feature production by Additive Manufacturing (AM) processes for the direct production of miniaturized polymer components is analysed in this work. The study characterizes the AM processes for polymer micro part production using the vat photopolymerization method...

  5. Methods of Run-Time Error Detection in Distributed Process Control Software

    DEFF Research Database (Denmark)

    Drejer, N.

    In this thesis, methods of run-time error detection in application software for distributed process control are designed. The error detection is based upon a monitoring approach in which application software is monitored by system software during the entire execution. The thesis includes definition of generic run-time error types, design of methods of observing application software behavior during execution, and design of methods of evaluating run-time constraints. In the definition of error types it is attempted to cover all relevant aspects of the application software behavior. Methods of observation and constraint evaluation are designed for the most interesting error types. These include: a) semantical errors in data communicated between application tasks; b) errors in the execution of application tasks; and c) errors in the timing of distributed events emitted by the application software. The design...

  6. A novel method for detecting and counting overlapping tracks in SSNTD by image processing techniques

    International Nuclear Information System (INIS)

    Ab Azar, N.; Babakhani, A.; Broumandnia, A.; Sepanloo, K.

    2016-01-01

    Overlapping object detection and counting is a challenge in image processing. A new method for detecting and counting overlapping circles is presented in this paper. The method is based on pattern recognition and feature extraction using “neighborhood values” in an object image, implemented with image processing techniques. Junction points are detected by assigning a value to each pixel in an image; as is shown, the neighborhood values at junction points are larger than those at other points. This distinction of neighborhood values is the main feature that can be utilized to identify the junction points and to count the overlapping tracks. The method can be used for recognizing and counting charged particle tracks, blood cells and also cancer cells. It is called “Track Counting based on Neighborhood Values” and is abbreviated “TCNV”. - Highlights: • A new method is introduced to recognize nuclear tracks by image processing. • The method is used to identify neighborhood pixels at junction points in overlapping tracks. • Enhanced method of counting overlapping tracks. • The new counting system behaves linearly for track densities below 300,000 tracks per cm². • With the new method, overlapping tracks can be recognized even for 10 or more overlaps.
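
A minimal sketch of the neighborhood-value idea: count the foreground neighbours of each foreground pixel; pixels where tracks cross accumulate larger values than pixels on a single track. The test image, threshold, and helper names below are illustrative stand-ins, not the authors' implementation:

```python
def neighborhood_values(img):
    """For each foreground pixel ('#'), count foreground pixels among its
    8 neighbours; crossing points accumulate the largest counts."""
    h, w = len(img), len(img[0])
    vals = {}
    for y in range(h):
        for x in range(w):
            if img[y][x] != '#':
                continue
            vals[(y, x)] = sum(
                1 for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                if (dy or dx) and 0 <= y + dy < h and 0 <= x + dx < w
                and img[y + dy][x + dx] == '#')
    return vals

def junctions(img, thresh=4):
    """Pixels whose neighborhood value reaches the threshold (candidate
    crossing points of overlapping tracks)."""
    return [p for p, v in neighborhood_values(img).items() if v >= thresh]

# Two straight "tracks" crossing at (2, 2); the crossing region stands out.
cross = ["..#..",
         "..#..",
         "#####",
         "..#..",
         "..#.."]
pts = junctions(cross)
```

On a real track image a thinning/skeletonization pass would precede this step so that each track is one pixel wide.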

  7. Study of transport processes in soils and plants by microautoradiographic and radioabsorption methods

    International Nuclear Information System (INIS)

    Varro, T.; Gelencser, Judit; Somogyi, G.

    1987-01-01

    The concentration profiles of lead and boron in carrot root and potato tuber were determined at various diffusion times by the microautoradiographic method. The transport of nutrients, leaf-manures and plant-protecting agents in plants was investigated by the radioabsorption method. The influence of soil pH and of complex-forming agents on the effective diffusion coefficients of nutritives was studied by the radioabsorption technique. In soils, the effective diffusion coefficient of the nutrients was found to vary in the range of 10^-16 to 10^-10 m^2 s^-1. The measurement data give valuable information about the transport processes in plants and soils. (author) 9 refs., 4 figs
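
A concentration profile of the kind measured here can be modelled, for a semi-infinite medium with a constant surface source, by C(x, t) = C0 · erfc(x / (2·sqrt(D·t))). The sketch below uses an effective diffusion coefficient inside the reported 10^-16 to 10^-10 m^2 s^-1 range; all numbers are illustrative, not fitted to the study's data:

```python
import math

def concentration_profile(x_m, t_s, d_eff, c0=1.0):
    """Semi-infinite 1-D diffusion from a constant surface source:
    C(x, t) = C0 * erfc(x / (2 * sqrt(D * t)))."""
    return c0 * math.erfc(x_m / (2.0 * math.sqrt(d_eff * t_s)))

# Illustrative only: D chosen inside the reported 1e-16..1e-10 m^2/s range,
# profile sampled every 0.1 mm after one hour of diffusion.
d_eff = 1e-12
profile = [concentration_profile(x * 1e-4, 3600.0, d_eff) for x in range(5)]
```

Fitting measured profiles to this form at several diffusion times is one standard way to extract an effective D.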

  8. [Investigation of potential toxic factors for fleece-flower root: from perspective of processing methods evolution].

    Science.gov (United States)

    Cui, He-Rong; Bai, Zhao-Fang; Song, Hai-Bo; Jia, Tian-Zhu; Wang, Jia-Bo; Xiao, Xiao-He

    2016-01-01

    In recent years, the rapid growth of reports on fleece-flower root-caused liver damage has drawn wide attention both at home and abroad; however, literature on the toxicity of fleece-flower root in ancient Chinese medicine is rare. Why, then, are there so many reports on the toxicity of fleece-flower root now compared with the ancient literature? As a typical tonic medicine, the clinical utility of fleece-flower root was largely limited by the standardization and reliability of its processing methods in ancient Chinese medicine. The ancient processing methods for fleece-flower root emphasized nine cycles of steaming and nine cycles of drying, while the modern processes have been simplified into a single steaming. Whether the differences between ancient and modern processing methods are a potential cause of the increased incidence of fleece-flower root-caused liver damage deserves deep analysis, which may provide new clues and perspectives for research on its toxicity. This article therefore discusses the affecting factors and key problems in toxicity attenuation of fleece-flower root, on the basis of sorting out the processing methods of fleece-flower root in ancient medical books and modern standards, in order to provide a reference for establishing a specification for toxicity attenuation of fleece-flower root. Copyright© by the Chinese Pharmaceutical Association.

  9. Developing an Engineering Design Process Assessment using Mixed Methods.

    Science.gov (United States)

    Wind, Stefanie A; Alemdar, Meltem; Lingle, Jeremy A; Gale, Jessica D; Moore, Roxanne A

    Recent reforms in science education worldwide include an emphasis on engineering design as a key component of student proficiency in the Science, Technology, Engineering, and Mathematics disciplines. However, relatively little attention has been directed to the development of psychometrically sound assessments for engineering. This study demonstrates the use of mixed methods to guide the development and revision of K-12 Engineering Design Process (EDP) assessment items. Using results from a middle-school EDP assessment, this study illustrates the combination of quantitative and qualitative techniques to inform item development and revisions. Overall conclusions suggest that the combination of quantitative and qualitative evidence provides an in-depth picture of item quality that can be used to inform the revision and development of EDP assessment items. Researchers and practitioners can use the methods illustrated here to gather validity evidence to support the interpretation and use of new and existing assessments.

  10. COMPARISON OF CONSEQUENCE ANALYSIS RESULTS FROM TWO METHODS OF PROCESSING SITE METEOROLOGICAL DATA

    International Nuclear Information System (INIS)

    , D

    2007-01-01

    Consequence analysis to support documented safety analysis requires the use of one or more years of representative meteorological data for atmospheric transport and dispersion calculations. At a minimum, the meteorological data needed for most atmospheric transport and dispersion models consist of hourly samples of wind speed and atmospheric stability class. Atmospheric stability is inferred from measured and/or observed meteorological data, and several methods exist to convert such data into atmospheric stability class data. In this paper, one year of meteorological data from a western Department of Energy (DOE) site is processed to determine atmospheric stability class using two methods. The method prescribed by the U.S. Nuclear Regulatory Commission (NRC) for supporting the licensing of nuclear power plants uses measurements of vertical temperature difference to determine atmospheric stability. Another method, preferred by the U.S. Environmental Protection Agency (EPA), relies upon measurements of incoming solar radiation, vertical temperature gradient, and wind speed. Consequences are calculated and compared using the two sets of processed meteorological data as input to the MELCOR Accident Consequence Code System 2 (MACCS2) code.
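
The NRC delta-T approach classifies stability from the vertical temperature gradient alone. A minimal sketch follows, with class boundaries (degC per 100 m) that track the usual NRC delta-T table but should be treated as illustrative rather than licensing-grade values:

```python
# Pasquill stability class from the vertical temperature gradient
# (delta-T method). Boundaries in degC per 100 m follow the common NRC
# delta-T scheme; treat them as illustrative, not authoritative.
_BOUNDS = [(-1.9, 'A'),   # extremely unstable
           (-1.7, 'B'),   # moderately unstable
           (-1.5, 'C'),   # slightly unstable
           (-0.5, 'D'),   # neutral
           (1.5, 'E'),    # slightly stable
           (4.0, 'F')]    # moderately stable

def stability_class(dt_per_100m):
    """Return the Pasquill class for a measured delta-T (degC/100 m)."""
    for upper, cls in _BOUNDS:
        if dt_per_100m <= upper:
            return cls
    return 'G'              # extremely stable
```

The EPA-preferred scheme mentioned in the abstract would instead combine solar radiation and wind speed (solar radiation/delta-T, or Turner-style tables), so the two methods can assign different classes to the same hour.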

  11. Investigation of the fluidized bed-chemical vapor deposition (FBCVD) process using CFD-DEM method

    International Nuclear Information System (INIS)

    Liu Malin; Liu Rongzheng; Wen Yuanyun; Liu Bing; Shao Youlin

    2014-01-01

    The CFD-DEM-CVD multiscale coupling simulation concept was proposed based on the mass/momentum/energy transfer involved in the FB-CVD process. The pyrolysis process of the reaction gas in the spouted bed can be simulated by the CFD method; the concentration and velocity fields can then be extracted and coupled with the particle movement behavior simulated by DEM. The particle deposition process can be described by the CVD model based on particle position, velocity and neighboring gas concentration. This multiscale coupling method can be implemented in the Fluent-EDEM software via its UDF (User-Defined Function) and API (Application Programming Interface). Based on the multiscale coupling concept, a criterion for evaluating the FB-CVD process is given. First, the volume in the coating furnace is divided into two parts (an active and a non-active coating area) based on simulation results of the chemical pyrolysis process. Then the residence time of all particles in the active coating area is obtained using the CFD-DEM simulation method. The residence time distribution can be used as a criterion for evaluating the gas-solid contact efficiency and operating performance of the coating furnace. Finally, different coating parameters of the coating furnace are compared based on the proposed criterion, and future research emphases are discussed. (author)
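
The residence-time criterion can be sketched directly: given sampled particle trajectories and a predicate defining the active coating area, sum the time steps each particle spends inside. The zone definition and trajectories below are toy values, not CFD-DEM output:

```python
def residence_times(trajectories, zone, dt):
    """Total time each particle spends inside the active coating zone.

    trajectories: {pid: [(x, y, z), ...]} sampled every dt seconds.
    zone: predicate (x, y, z) -> bool marking the active region.
    """
    return {pid: dt * sum(1 for p in path if zone(*p))
            for pid, path in trajectories.items()}

# Toy example: take the active zone to be the region below z = 0.1 m.
zone = lambda x, y, z: z < 0.1
traj = {1: [(0, 0, 0.05), (0, 0, 0.08), (0, 0, 0.2)],
        2: [(0, 0, 0.3), (0, 0, 0.25), (0, 0, 0.02)]}
rtd = residence_times(traj, zone, dt=0.01)
```

Histogramming the values in `rtd` over all particles gives the residence time distribution used as the gas-solid contact criterion.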

  12. Analytical methods and laboratory facility for the Defense Waste Processing Facility

    International Nuclear Information System (INIS)

    Coleman, C.J.; Dewberry, R.A.; Lethco, A.J.; Denard, C.D.

    1985-01-01

    This paper describes the analytical methods, instruments, and laboratory that will support vitrification of defense waste. The Defense Waste Processing Facility (DWPF) is now being constructed at Savannah River Plant (SRP). Beginning in 1989, SRP high-level defense waste will be immobilized in borosilicate glass for disposal in a federal repository. The DWPF will contain an analytical laboratory for performing process control analyses. Additional analyses will be performed for process history and process diagnostics. The DWPF analytical facility will consist of a large shielded sampling cell, three shielded analytical cells, a laboratory for instrumental analysis and chemical separations, and a counting room. Special instrumentation is being designed for use in the analytical cells, including microwave drying/dissolution apparatus, and remote pipetting devices. The instrumentation laboratory will contain inductively coupled plasma, atomic absorption, Moessbauer spectrometers, a carbon analyzer, and ion chromatography equipment. Counting equipment will include intrinsic germanium detectors, scintillation counters, Phoswich alpha, beta, gamma detectors, and a low-energy photon detector

  13. Processing method for cleaning water waste from cement kneader

    International Nuclear Information System (INIS)

    Soda, Kenzo; Fujita, Hisao; Nakajima, Tadashi.

    1990-01-01

    The present invention concerns a method of processing cleaning water wastes from a cement kneader in cases where liquid wastes containing radioactive wastes or deleterious materials such as heavy metals are processed by cement solidification. Cleaning water wastes from the kneader are sent to a cleaning water waste tank, in which gentle stirring is applied near the bottom so that the retained sludges do not coagulate. Sludges retained at the bottom of the cleaning water waste tank are withdrawn after a predetermined time and then kneaded with cement. Since the sludges in the cleaning water are thus solidified with cement, inhomogeneous, low-strength solidification products consisting only of cleaning sludges are not formed. The resultant solidification product is homogeneous, and its compression strength reaches a level that satisfies the marine disposal standards required for solidification products of radioactive wastes. (I.N.)

  14. Transition processes in the novel method of the muon catalysis investigation

    International Nuclear Information System (INIS)

    Filchenkov, V.V.

    1997-01-01

    The problem of modifying the interpretation of the results to be obtained with the novel method of muon catalysis investigation to take the fast transition processes into account is first considered. The results of exploring the process kinetics are compared with the ones found from the analysis of the appropriate Monte Carlo distributions. The calculation programs simulate both the kinetics and the registration system of the experiment which is now performed in the frame of the large international project TRITON. The main conclusion is that the multiplicity distribution of the fusion neutrons is 'invariant' under any assumptions of the fast transition stage

  15. Controlled decomposition and oxidation: A treatment method for gaseous process effluents

    Science.gov (United States)

    Mckinley, Roger J. B., Sr.

    1990-01-01

    The safe disposal of effluent gases produced by the electronics industry deserves special attention. Due to the hazardous nature of many of the materials used, it is essential to control and treat the reactants and reactant by-products as they are exhausted from the process tool and prior to their release into the manufacturing facility's exhaust system and the atmosphere. Controlled decomposition and oxidation (CDO) is one method of treating effluent gases from thin film deposition processes. CDO equipment applications, field experience, and results of the use of CDO equipment and technological advances gained from the field experiences are discussed.

  16. Multi Blending Technology (MBT): mineral processing method for increasing added value of marginal reserve

    Science.gov (United States)

    Agustinus, E. T. S.

    2018-02-01

    Indonesia's position on the ring of fire makes it rich in mineral resources. Nevertheless, in the past, the exploitation of Indonesian mineral resources was uncontrolled, resulting in environmental degradation and marginal reserves. Excessive exploitation of mineral resources is very detrimental to the state. Reflecting on this, the management and utilization of Indonesia's mineral resources must follow good mining practice. The problem is how to utilize marginal mineral reserves effectively and efficiently. Utilization of marginal reserves requires new technologies and processing methods, because the old processing methods are inadequate. This paper presents the Multi Blending Technology (MBT) method. The underlying concept is not extraction or refinement but processing through the formulation of raw materials by adding an additive, producing a new material called a functional material. It is important to summarize the application of this method in book form, so that the information can spread across multiple print media in a focused and optimized way; such a book is expected to serve as a reference for stakeholders adding value to environmentally marginal reserves in Indonesia. The conclusions are that the Multi Blending Technology (MBT) method can be used as a strategy to add value effectively and efficiently to marginal mineral reserves, and that it has been applied to forsterite, Atapulgite Synthesis, Zeoceramic, GEM, MPMO, SMAC and Geomaterial.

  17. Method of transition from 3D model to its ontological representation in aircraft design process

    Science.gov (United States)

    Govorkov, A. S.; Zhilyaev, A. S.; Fokin, I. V.

    2018-05-01

    This paper proposes a method of transition from a 3D model to its ontological representation and describes its usage in the aircraft design process. The problems of design for manufacturability and design automation are also discussed. The introduced method aims to ease data exchange between important aircraft design phases, namely engineering and design control, and to increase design speed and 3D model customizability. This requires careful selection of the complex systems (CAD/CAM/CAE/PDM) that provide the basis for integrating design with the technological preparation of production and allow the characteristics of products, and of the processes for their manufacture, to be taken into account more fully. Solving this problem is important, as investment in automation defines the company's competitiveness in the years ahead.

  18. Method of the Aquatic Environment Image Processing for Determining the Mineral Suspension Parameters

    Directory of Open Access Journals (Sweden)

    D.A. Antonenkov

    2016-10-01

    Full Text Available The present article features a method developed to determine mineral suspension characteristics by obtaining and processing images of the aquatic environment. The method maintains its performance under conditions of considerable dynamic activity of the water masses. Its distinctive features are a dedicated computing algorithm, the simultaneous use of morphological filters and histogram methods for image processing, and a special calibration technique; together these make it possible to calculate the size and concentration of the particles in the images obtained. The technical means developed to acquire environment images of the required quality are briefly described, and the operation of the developed software is outlined. Examples of numerical and weight distributions of the particles according to their sizes are given, together with a comparison of the results obtained by the standard and the developed methods. The developed method yields particle size data in the range of 50–1000 μm and determines the suspension concentration with ~12 % error. It can be technically implemented in instruments intended for in situ measurements using gauges that allow short exposure times, such as an electron-optical converter acting as an image intensifier and a high-speed electronic shutter. Laboratory testing shows the method to be similar in accuracy to in situ measurements.
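
The size-measurement step can be approximated with standard image processing: threshold the frame, label connected blobs, and convert each blob's pixel area to an equivalent-circle diameter via the calibration scale. The sketch below (pure Python, illustrative scale factor) shows that step only, not the authors' full morphological/histogram pipeline:

```python
import math
from collections import deque

def particle_sizes(img, scale_um_per_px):
    """Label 4-connected foreground blobs ('#') in a thresholded frame and
    return equivalent-circle diameters (micrometres), area-based."""
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    sizes = []
    for y in range(h):
        for x in range(w):
            if img[y][x] != '#' or seen[y][x]:
                continue
            area, q = 0, deque([(y, x)])    # BFS flood fill of one blob
            seen[y][x] = True
            while q:
                cy, cx = q.popleft()
                area += 1
                for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                               (cy, cx - 1), (cy, cx + 1)):
                    if (0 <= ny < h and 0 <= nx < w
                            and img[ny][nx] == '#' and not seen[ny][nx]):
                        seen[ny][nx] = True
                        q.append((ny, nx))
            # diameter of a circle with the same pixel area
            sizes.append(2 * math.sqrt(area / math.pi) * scale_um_per_px)
    return sorted(sizes)

# Toy thresholded frame with three particles; 10 um per pixel (illustrative).
frame = ["##..#",
         "##..#",
         ".....",
         "#...."]
sizes_um = particle_sizes(frame, 10.0)
```

Binning `sizes_um` gives the numerical size distribution; weighting each bin by particle volume gives the weight distribution mentioned in the abstract.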

  19. Advanced Methods of Biomedical Signal Processing

    CERN Document Server

    Cerutti, Sergio

    2011-01-01

    This book grew out of the IEEE-EMBS Summer Schools on Biomedical Signal Processing, which have been held annually since 2002 to provide the participants state-of-the-art knowledge on emerging areas in biomedical engineering. Prominent experts in the areas of biomedical signal processing, biomedical data treatment, medicine, signal processing, system biology, and applied physiology introduce novel techniques and algorithms as well as their clinical or physiological applications. The book provides an overview of a compelling group of advanced biomedical signal processing techniques, such as mult

  20. New radiation technologies and methods for control of technological processes in metallurgy

    International Nuclear Information System (INIS)

    Zaykin, Yu.

    1996-01-01

    Radiation Technology of Metal and Ceramic Production with Enhanced Service Properties. Based on the application of radiation techniques in powder metallurgy, a new technology for obtaining metals, alloys and ceramic materials with high service properties is worked out. Radiation processing of powder materials at a certain stage of the process leads to profound structure alterations at all further stages and eventually affects the properties of the resulting product. Theoretical calculations and experimental studies of electron-positron annihilation in powder-pressed samples showed that irradiation causes powder particle surface state changes favorable for the further development of sintering and crystallization processes. It is shown that irradiation of metal powders and powder-pressed samples by high-energy electrons is technologically most efficient. The right choice of the type and mode of radiation processing makes it possible to obtain metals, alloys and ceramic materials (Mo, Fe, W, Al, Ni, Cu, stainless steels, ceramics, etc.) with homogeneous structure and stable enhanced service properties. The project on the application of radiation technology to powder metallurgy was awarded a diploma and the gold medal at the 22nd International Exhibition of Inventions (Geneva, 1994). New Technological Opportunities of Chromium-Nickel Alloy Processing. To obtain the required phase-structure state, special methods of chromium-nickel alloy processing for the production of sensitive elastic devices were worked out, combining plastic deformation, thermal and radiation processing. It is shown that the h-gbb phase transfer, not observed before, is possible in extremely non-equilibrium conditions under electron irradiation. It is established that a complex reaction of recrystallization and gb-phase deposition proceeds under electron irradiation at room temperature when a certain threshold plastic deformation degree is reached, which leads to the same

  1. Determination of optimum thermal debinding and sintering process parameters using Taguchi Method

    CSIR Research Space (South Africa)

    Seerane, M

    2015-07-01

    Full Text Available powder and a wax-based binder. The binder's backbone component is a low-density polyethylene (LDPE). Careful selection of thermal debinding parameters was guided by thermogravimetric analysis (TGA) results. The Taguchi method was used to determine... Presented at the International Light Metals Technology Conference (LMT 2015), Port Elizabeth, South Africa, July 27-29.

  2. Furnace and support equipment for space processing. [space manufacturing - Czochralski method

    Science.gov (United States)

    Mazelsky, R.; Duncan, C. S.; Seidensticker, R. G.; Johnson, R. A.; Hopkins, R. H.; Roland, G. W.

    1975-01-01

    A core facility capable of performing a majority of materials processing experiments is discussed. Experiment classes are described, the needs peculiar to each experiment type are outlined, and projected facility requirements to perform the experiments are treated. Control equipment (automatic control) and variations of the Czochralski method for use in space are discussed.

  3. Isoflavone profile in soymilk as affected by soybean variety, grinding, and heat-processing methods.

    Science.gov (United States)

    Zhang, Yan; Chang, Sam K C; Liu, Zhisheng

    2015-05-01

    Isoflavones impart health benefits, and their overall content and profile in foods are greatly influenced at each step during processing. In this study, 2 soybean varieties (Prosoy and black soybean) were processed with 3 different grinding methods (ambient, cold, and hot grinding) and 3 heating methods (traditional stove cooking, 1-phase UHT, and 2-phase UHT) for soymilk making. The results showed that after cold, ambient, and hot grinding, the total isoflavones were 3917, 5013, and 5949 nmol/g for Prosoy and 4073, 3966, and 4284 nmol/g for black soybean. Grinding could significantly increase isoflavone extraction; the grinding process also had a destructive effect on isoflavones, and this effect varied with grinding temperature. Different heating methods had different effects on different isoflavone forms. The two soybean varieties showed distinct patterns with respect to changes in isoflavone profile during processing. © 2015 Institute of Food Technologists®

  4. Surface defect detection in tiling industries using digital image processing methods: analysis and evaluation.

    Science.gov (United States)

    Karimi, Mohammad H; Asemani, Davud

    2014-05-01

    Ceramic and tile industries must include a grading stage to quantify the quality of products. In practice, human inspection is often used for grading, so an automatic grading system is essential to enhance quality control and marketing of the products. Since there generally exist six different types of defects, originating from various stages of tile manufacturing lines and having distinct textures and morphologies, many image processing techniques have been proposed for defect detection. In this paper, a survey is made of the pattern recognition and image processing algorithms that have been used to detect surface defects. Each method appears to be limited to detecting some subgroup of defects. The detection techniques may be divided into three main groups: statistical pattern recognition, feature vector extraction, and texture/image classification. Methods such as the wavelet transform, filtering, morphology and the contourlet transform are more effective for pre-processing tasks, while statistical methods, neural networks and model-based algorithms can be applied to extract the surface defects. Although statistical methods are often appropriate for identifying large defects such as spots, techniques such as wavelet processing provide an acceptable response for detecting small defects such as pinholes. A thorough survey is made in this paper of the existing algorithms in each subgroup, and the evaluation parameters, both supervised and unsupervised, are discussed. Using various performance parameters, different defect detection algorithms are compared and evaluated. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.
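
As a concrete instance of the statistical subgroup, a patch-level z-score detector flags tiles whose mean intensity deviates strongly from the population, which is why such methods suit large defects like spots. The intensities and threshold below are illustrative, not from the survey:

```python
from statistics import mean, stdev

def flag_defects(patch_means, z_thresh):
    """Indices of patches whose mean intensity deviates from the population
    mean by more than z_thresh sample standard deviations. A simple
    statistical detector; note that a strong outlier inflates the stdev,
    so robust variants use median/MAD instead."""
    m, s = mean(patch_means), stdev(patch_means)
    return [i for i, v in enumerate(patch_means) if abs(v - m) > z_thresh * s]

# Mean gray levels of eight tile patches; patch 5 contains a dark spot.
intensities = [200, 202, 198, 201, 199, 120, 200, 203]
defective = flag_defects(intensities, 2.0)
```

Small defects such as pinholes barely move a patch mean, which is exactly why the abstract recommends wavelet-type methods for them instead.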

  5. THE USE OF AHP METHOD IN THE MULTI‐CRITERIA TASK SOLVING PROCESS – CASE STUDY

    Directory of Open Access Journals (Sweden)

    Zygmunt KORBAN

    2014-01-01

    Full Text Available In the decision‐making process, both single‐ and multi‐criteria tasks are dealt with. In the majority of cases, the selection of a solution comes down to determining the “best” decision (most often on the basis of subjective assessment) or to ordering the set of decisions. The Analytic Hierarchy Process (AHP) is one of the methods used for evaluating qualitative features in multi‐criteria optimisation processes. This article discusses the possibilities of using the above‐mentioned method, illustrated with an example of purchasing technical equipment for one of the municipal landfill sites in the Silesian Province.
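
The core AHP computation can be sketched compactly: build a reciprocal pairwise-comparison matrix of the criteria and approximate its priority vector by row geometric means. The judgements below are hypothetical, not from the landfill case study:

```python
import math

def ahp_weights(pairwise):
    """Priority vector of a reciprocal AHP pairwise-comparison matrix,
    via the row geometric-mean approximation of the principal eigenvector."""
    n = len(pairwise)
    gm = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical judgements for three criteria (cost, safety, speed):
# cost is 3x as important as safety and 5x as important as speed;
# safety is 2x as important as speed. Lower triangle holds reciprocals.
A = [[1,     3,   5],
     [1 / 3, 1,   2],
     [1 / 5, 1 / 2, 1]]
weights = ahp_weights(A)
```

A full AHP application would also compute the consistency ratio of `A` and combine criterion weights with alternative scores down the hierarchy.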

  6. [Cost management: the implementation of the activity-based costing method in sterile processing department].

    Science.gov (United States)

    Jericó, Marli de Carvalho; Castilho, Valéria

    2010-09-01

    This exploratory case study was performed with the aim of implementing the Activity-Based Costing (ABC) method in the sterile processing department (SPD) of a major teaching hospital. Data collection was performed throughout 2006, using documentary research techniques and non-participant closed observation. The ABC implementation made it possible to determine the activity-based cost per disinfection cycle/load for chemical and physical disinfection ($9.95 and $12.63, respectively), as well as the cost of sterilization by steam under pressure (autoclave, $31.37) and by low-temperature steam and gaseous formaldehyde (LTSF, $255.28). The information provided by the ABC method improved the overall understanding of the cost-driver process and provided the foundation for assessing performance and improvement in the SPD processes.
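
The ABC mechanics reduce to computing an activity rate (resource cost divided by driver volume) for each cost pool and summing the rates a cost object consumes. The SPD activity names and figures below are hypothetical, not the study's data:

```python
def abc_unit_cost(resource_costs, driver_volumes):
    """Activity-based cost of one cost object (e.g. one processed load):
    each activity's pooled resource cost is divided by its cost-driver
    volume to get a rate, and the rates consumed are summed."""
    rates = {a: resource_costs[a] / driver_volumes[a] for a in resource_costs}
    return sum(rates.values())

# Hypothetical SPD activity pools: annual resource cost and annual driver
# volume (number of times the activity is performed).
costs = {"washing": 18000.0, "packing": 9000.0, "autoclave run": 30000.0}
drivers = {"washing": 6000, "packing": 6000, "autoclave run": 3000}
cost_per_load = abc_unit_cost(costs, drivers)
```

Changing a single driver volume immediately shifts the unit cost, which is the visibility into cost drivers that the abstract credits to ABC.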

  7. Fault Detection for Nonlinear Process With Deterministic Disturbances: A Just-In-Time Learning Based Data Driven Method.

    Science.gov (United States)

    Yin, Shen; Gao, Huijun; Qiu, Jianbin; Kaynak, Okyay

    2017-11-01

    Data-driven fault detection plays an important role in industrial systems due to its applicability in cases where physical models are unknown. In fault detection, disturbances must be taken into account as an inherent characteristic of processes. Nevertheless, fault detection for nonlinear processes with deterministic disturbances still receives little attention, especially in the data-driven field. To solve this problem, a just-in-time learning-based data-driven (JITL-DD) fault detection method for nonlinear processes with deterministic disturbances is proposed in this paper. JITL-DD employs a JITL scheme for process description with local model structures to cope with process dynamics and nonlinearity. The proposed method provides a data-driven fault detection solution for nonlinear processes with deterministic disturbances, and offers inherent online adaptation and high fault-detection accuracy. Two nonlinear systems, i.e., a numerical example and a sewage treatment process benchmark, are employed to show the effectiveness of the proposed method.
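
The JITL idea can be sketched with the simplest possible local model: at query time, retrieve the k historical samples nearest the current operating point, predict from them, and flag a fault when the residual exceeds a threshold. The nearest-neighbour averaging and process data below are illustrative stand-ins for the paper's local model structures:

```python
def jitl_predict(history, query, k=3):
    """Just-in-time learning: build a local model on demand by averaging
    the outputs of the k historical samples closest to the query."""
    ranked = sorted(history,
                    key=lambda s: sum((a - b) ** 2 for a, b in zip(s[0], query)))
    return sum(y for _, y in ranked[:k]) / k

def is_faulty(history, query, measured, threshold):
    """Flag a fault when the residual against the local prediction is large."""
    return abs(measured - jitl_predict(history, query)) > threshold

# Hypothetical normal-operation data: inputs (u1, u2) -> output y = u1 + 2*u2.
hist = [((u1, u2), u1 + 2 * u2) for u1 in range(5) for u2 in range(5)]
```

Because the local model is rebuilt for every query, the detector adapts online as new normal-operation samples are appended to `hist`, which is the adaptation property the abstract highlights.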

  8. Bispectral methods of signal processing applications in radar, telecommunications and digital image restoration

    CERN Document Server

    Totsky, Alexander V; Kravchenko, Victor F

    2015-01-01

    By studying applications in radar, telecommunications and digital image restoration, this monograph discusses signal processing techniques based on bispectral methods. Improved robustness against different forms of noise as well as preservation of phase information render this method a valuable alternative to common power-spectrum analysis used in radar object recognition, digital wireless communications, and jitter removal in images.
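
The bispectrum underlying these methods is the third-order spectrum B(k1, k2) = E[X(k1) X(k2) X*(k1+k2)]; unlike the power spectrum it retains phase information, so phase-coupled frequency triples stand out. A small pure-Python sketch with a direct DFT and an illustrative signal:

```python
import cmath
import math

def dft(x):
    """Naive DFT, adequate for short illustrative signals."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def bispectrum(segments):
    """Direct bispectrum estimate averaged over segments:
    B(k1, k2) = E[X(k1) * X(k2) * conj(X(k1 + k2))].
    A frequency triple (k1, k2, k1 + k2) with coupled phases yields a
    large |B(k1, k2)|; independent phases average toward zero."""
    n = len(segments[0])
    half = n // 2
    acc = [[0j] * half for _ in range(half)]
    for seg in segments:
        X = dft(seg)
        for k1 in range(half):
            for k2 in range(half):
                acc[k1][k2] += X[k1] * X[k2] * X[k1 + k2].conjugate()
    return [[v / len(segments) for v in row] for row in acc]

# Signal with energy at bins 2, 3 and 5 (= 2 + 3): quadratic phase coupling,
# so |B(2, 3)| is large while bins without energy contribute almost nothing.
sig = [math.cos(2 * math.pi * 2 * t / 16) + math.cos(2 * math.pi * 3 * t / 16)
       + math.cos(2 * math.pi * 5 * t / 16) for t in range(16)]
b = bispectrum([sig])
```

For additive Gaussian noise the bispectrum tends to zero, which is the robustness property the monograph exploits in radar and communications.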

  9. Systematic process synthesis and design methods for cost effective waste minimization

    International Nuclear Information System (INIS)

    Biegler, L.T.; Grossman, I.E.; Westerberg, A.W.

    1995-01-01

    We present progress on our work to develop synthesis methods to aid in the design of cost effective approaches to waste minimization. Work continues to combine the approaches of Douglas and coworkers and of Grossmann and coworkers on a hierarchical approach where bounding information allows it to fit within a mixed integer programming approach. We continue work on the synthesis of reactors and of flexible separation processes. In the first instance, we strive for methods we can use to reduce the production of potential pollutants, while in the second we look for ways to recover and recycle solvents

  10. Systematic process synthesis and design methods for cost effective waste minimization

    Energy Technology Data Exchange (ETDEWEB)

    Biegler, L.T.; Grossman, I.E.; Westerberg, A.W. [Carnegie Mellon Univ., Pittsburgh, PA (United States)

    1995-12-31

    We present progress on our work to develop synthesis methods to aid in the design of cost effective approaches to waste minimization. Work continues to combine the approaches of Douglas and coworkers and of Grossmann and coworkers on a hierarchical approach where bounding information allows it to fit within a mixed integer programming approach. We continue work on the synthesis of reactors and of flexible separation processes. In the first instance, we strive for methods we can use to reduce the production of potential pollutants, while in the second we look for ways to recover and recycle solvents.

  11. Comparative exergy analyses of Jatropha curcas oil extraction methods: Solvent and mechanical extraction processes

    International Nuclear Information System (INIS)

    Ofori-Boateng, Cynthia; Keat Teong, Lee; JitKang, Lim

    2012-01-01

    Highlights: ► Exergy analysis detects locations of resource degradation within a process. ► Solvent extraction is six times more exergetically destructive than mechanical extraction. ► Mechanical extraction of jatropha oil is 95.93% exergetically efficient. ► Solvent extraction of jatropha oil is 79.35% exergetically efficient. ► Exergy analysis of oil extraction processes allows room for improvements. - Abstract: Vegetable oil extraction processes are energy intensive. Thermodynamically, any energy-intensive process degrades the most useful part of the energy that is available to produce work. This study uses literature values to compare the efficiencies and the degradation of useful energy during Jatropha curcas oil extraction, considering both solvent and mechanical extraction methods. According to this study, processing J. curcas seeds into oil upgrades the resource under mechanical extraction but degrades it under solvent extraction. For mechanical extraction, the total internal exergy destroyed is 3006 MJ per ton of J. curcas oil produced, which is about six times less than that for solvent extraction (18,072 MJ). The pretreatment of the J. curcas seeds recorded a total internal exergy destruction of 5768 MJ, accounting for 24% of the total internal exergy destroyed for the solvent extraction process and 66% for the mechanical extraction process. The exergetic efficiencies recorded are 79.35% and 95.93% for the solvent and mechanical extraction of J. curcas oil, respectively. Hence, mechanical oil extraction is more exergetically efficient than solvent extraction. Possible improvement methods are also elaborated in this study.
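
The sixfold figure quoted above follows directly from the reported exergy destruction totals; the sketch below simply reproduces that arithmetic (all values are taken from the abstract, in MJ per ton of oil produced):

```python
# Back-of-envelope check of the exergy figures quoted in the abstract.
solvent_destroyed = 18072.0    # total internal exergy destroyed, solvent route (MJ/ton)
mechanical_destroyed = 3006.0  # total internal exergy destroyed, mechanical route (MJ/ton)

# The abstract states solvent extraction is "about six times" more destructive.
ratio = solvent_destroyed / mechanical_destroyed
```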

  12. Experimental methods to characterize the woven composite prepreg behavior during the preforming process

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Weizhao; Ren, Huaqing; Lu, Jie; Zhang, Zixuan; Su, Lingxuan; Wang, Q Jane; Zeng, Danielle; Su, Xuming; Cao, Jian

    2016-09-19

    This paper reports several methods for characterizing the properties of uncured woven prepreg during the preforming process. Uniaxial tension, bias-extension, and bending tests are conducted to measure the in-plane properties of the material. Friction tests are utilized to reveal the prepreg-prepreg and prepreg-forming-tool interactions. All these tests are performed within the temperature range of the real manufacturing process. The results serve as inputs to numerical simulations for product prediction and preforming process parameter optimization.

  13. Incrementally Detecting Change Types of Spatial Area Object: A Hierarchical Matching Method Considering Change Process

    Directory of Open Access Journals (Sweden)

    Yanhui Wang

    2018-01-01

    Detecting and extracting the change types of spatial area objects can track the spatiotemporal change pattern of area objects and provide a change backtracking mechanism for incrementally updating spatial datasets. To address the high complexity of detection methods, the high redundancy rate of detection factors, and the low degree of automation in the incremental update process, we consider the change process of area objects in an integrated way and propose a hierarchical matching method that detects the nine types of changes of area objects while minimizing the complexity of the algorithm and the redundancy rate of detection factors. We illustrate in detail the identification, extraction, and database entry of change types, and how we achieve a close connection and organic coupling of incremental information extraction and object type-of-change detection so as to characterize the whole change process. The experimental results show that this method can successfully detect incremental information about area objects in practical applications, with the overall accuracy reaching above 90%, which is much higher than the existing weighted matching method, making it quite feasible and applicable. It helps establish the correspondence between new-version and old-version objects, and facilitates the linked update processing and quality control of spatial data.

  14. ALTERNATIVE METHODS OF TECHNOLOGICAL PROCESSING TO REDUCE SALT IN MEAT PRODUCTS

    Directory of Open Access Journals (Sweden)

    E. K. Tunieva

    2017-01-01

    World trends in reducing table salt in meat products contemplate the use of different methods for preserving taste and consistency in finished products as well as prolonging shelf life. There are several approaches to sodium chloride reduction in meat products. The paper presents a review of foreign studies that demonstrate the possibility of maintaining the quality of traditional meat products produced with a reduced salt content. Studies in the field of salty taste perception established that decreasing the salt crystal size to 20 µm makes it possible to reduce the amount of added table salt owing to an increase in the salty taste intensity of food products. Investigation of the compatibility of different taste directions is also of interest as one approach to sodium chloride reduction in food products. The use of water-in-oil-in-water (w/o/w) double emulsions allows a controlled release of encapsulated ingredients (salt), which enhances salty taste. Another alternative method of technological processing of raw meat for reducing salt in meat products is high pressure processing. This method has several advantages: it not only increases the salty taste intensity but also forms a stable emulsion, increases the water-binding capacity of minced meat, and extends shelf life.

  15. Curcumin complexation with cyclodextrins by the autoclave process: Method development and characterization of complex formation.

    Science.gov (United States)

    Hagbani, Turki Al; Nazzal, Sami

    2017-03-30

    One approach to enhance curcumin (CUR) aqueous solubility is to use cyclodextrins (CDs) to form inclusion complexes where CUR is encapsulated as a guest molecule within the internal cavity of the water-soluble CD. Several methods have been reported for the complexation of CUR with CDs. Limited information, however, is available on the use of the autoclave process (AU) in complex formation. The aims of this work were therefore to (1) investigate and evaluate the AU cycle as a complex formation method to enhance CUR solubility; (2) compare the efficacy of the AU process with the freeze-drying (FD) and evaporation (EV) processes in complex formation; and (3) confirm CUR stability by characterizing CUR:CD complexes by NMR, Raman spectroscopy, DSC, and XRD. Significant differences were found in the saturation solubility of CUR from its complexes with CD when prepared by the three complexation methods. The AU yielded a complex with the expected chemical and physical fingerprints for a CUR:CD inclusion complex that maintained the chemical integrity and stability of CUR and provided the highest solubility of CUR in water. Physical and chemical characterizations of the AU complexes confirmed the encapsulation of CUR inside the CD cavity and the transformation of the crystalline CUR:CD inclusion complex to an amorphous form. It was concluded that the autoclave process, with its short processing time, could be used as an alternative and efficient method for drug:CD complexation. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Method for Forming Pulp Fibre Yarns Developed by a Design-driven Process

    Directory of Open Access Journals (Sweden)

    Tiia-Maria Tenhunen

    2016-01-01

    A simple and inexpensive method for producing water-stable pulp fibre yarns using a deep eutectic mixture composed of choline chloride and urea (ChCl/urea) was developed in this work. Deep eutectic solvents (DESs) are eutectic mixtures consisting of two or more components that together have a lower melting point than the individual components. DESs have previously been studied with respect to cellulose dissolution, functionalisation, and pre-treatment. The new method uses the ChCl/urea mixture as a swelling and dispersing agent for the pulp fibres in the yarn-forming process. Although the pulp seemed to form a gel when dispersed in ChCl/urea, the ultrastructure of the pulp was not affected. To achieve water stability, the pulp fibres were crosslinked by esterification using polyacrylic acid. ChCl/urea could be easily recycled and reused by distillation. The novel process described in this study enables the utilisation of pulp fibres in textile production without modification or dissolution and shortens the textile value chain. An interdisciplinary approach was used in which potential applications were explored simultaneously with material development, from process development to early-phase prototyping.

  17. Presentation of a method for consequence modeling and quantitative risk assessment of fire and explosion in process industry (Case study: Hydrogen Production Process

    Directory of Open Access Journals (Sweden)

    M J Jafari

    2013-05-01

    Conclusion: The proposed method is applicable in all phases of process or system design and estimates the risk of fire and explosion through a quantitative, comprehensive, mathematically based approach. It can therefore be used as an alternative to qualitative and semi-quantitative methods.

  18. Method and Process Development of Advanced Atmospheric Plasma Spraying for Thermal Barrier Coatings

    Science.gov (United States)

    Mihm, Sebastian; Duda, Thomas; Gruner, Heiko; Thomas, Georg; Dzur, Birger

    2012-06-01

    Over the last few years, global economic growth has triggered a dramatic increase in the demand for resources, resulting in a steady rise in prices for energy and raw materials. In the gas turbine manufacturing sector, optimization of cost-intensive production steps offers a heightened potential for savings and forms the basis for securing future competitive advantages in the market. In this context, the atmospheric plasma spraying (APS) process for thermal barrier coatings (TBC) has been optimized. A constraint for the optimization of the APS coating process is the use of the existing coating equipment. Furthermore, the current coating quality and characteristics must not change, so as to avoid new qualification and testing. Using experience in APS and empirically gained data, the process optimization plan included variation of, e.g., the plasma gas composition and flow rate, the electrical power, the arrangement and angle of the powder injectors in relation to the plasma jet, the grain size distribution of the spray powder, and the plasma torch movement procedures such as spray distance, offset and iteration. In particular, plasma properties (enthalpy, velocity and temperature), powder injection conditions (injection point, injection speed, grain size and distribution) and the coating lamination (coating pattern and spraying distance) are examined. The optimized process and resulting coating were compared to the current situation using several diagnostic methods. The improved process significantly reduces costs and achieves the requirement of comparable coating quality. Furthermore, a contribution was made towards a better comprehension of the APS of ceramics and the definition of a better method for future process developments.

  19. Study on Thixojoining Process Using Partial Remelting Method

    Directory of Open Access Journals (Sweden)

    M. N. Mohammed

    2013-01-01

    Cold-work tool steel is considered to be nonweldable due to its high carbon and alloy content. The application of a new semisolid process for joining two dissimilar metals is proposed. AISI D2 cold-work tool steel was thixojoined to 304 stainless steel by using a partial remelting method. After thixojoining, microstructural examination including metallographic analysis, energy dispersive spectroscopy (EDS), and Vickers hardness testing was performed. Metallographic analysis along the joint interface between semisolid AISI D2 and stainless steel showed a smooth transition from one to the other, and neither oxides nor microcracks were observed. Hardness values obtained at points in the diffusion zone were much higher than those in the 304 stainless steel but lower than those in the AISI D2 tool steel. The study revealed that a new type of nonequilibrium diffusion interfacial structure was formed at the interface of the two different types of steel. The current work successfully confirmed that a dendritic microstructure in the semisolid joined zone can be avoided and high-bonding-quality components can be achieved without the need for force or complex equipment, compared to conventional welding processes.

  20. Possibilities of Utilizing the Method of Analytical Hierarchy Process Within the Strategy of Corporate Social Business

    Science.gov (United States)

    Drieniková, Katarína; Hrdinová, Gabriela; Naňo, Tomáš; Sakál, Peter

    2010-01-01

    The paper deals with the analysis of the theory of corporate social responsibility, risk management, and the exact method of the analytic hierarchy process as used in decision-making processes. Chapters 2 and 3 present experience with the application of the method in formulating stakeholders' strategic goals within Corporate Social Responsibility (CSR) and its simultaneous utilization in minimizing environmental risks. The major benefit of this paper is the application of the Analytical Hierarchy Process (AHP).

  1. Study of a multivariable nonlinear process by the phase space method

    International Nuclear Information System (INIS)

    Tomei, Alain

    1969-02-01

    This paper studies the properties of a multivariable nonlinear process using the phase space method. Based on the example of Rapsodie, a sodium-cooled fast reactor, the authors established simplified differential equations via an analog study of the partial differential equations (in order to replace them with ordinary differential equations), performed a mathematical study of the dynamic properties and stability of the simplified model by the phase space method, and verified the model properties using an analog computer. The reactor, with all its thermal circuits, was considered as a nonlinear system with two inputs and one output (reactor power). The great stability of a fast reactor such as Rapsodie under normal operating conditions was verified. The same method could be applied to any other type of reactor.

  2. Use of simulated data sets to evaluate the fidelity of metagenomic processing methods

    Energy Technology Data Exchange (ETDEWEB)

    Mavromatis, K [U.S. Department of Energy, Joint Genome Institute; Ivanova, N [U.S. Department of Energy, Joint Genome Institute; Barry, Kerrie [U.S. Department of Energy, Joint Genome Institute; Shapiro, Harris [U.S. Department of Energy, Joint Genome Institute; Goltsman, Eugene [U.S. Department of Energy, Joint Genome Institute; McHardy, Alice C. [IBM T. J. Watson Research Center; Rigoutsos, Isidore [IBM T. J. Watson Research Center; Salamov, Asaf [U.S. Department of Energy, Joint Genome Institute; Korzeniewski, Frank [U.S. Department of Energy, Joint Genome Institute; Land, Miriam L [ORNL; Lapidus, Alla L. [U.S. Department of Energy, Joint Genome Institute; Grigoriev, Igor [U.S. Department of Energy, Joint Genome Institute; Hugenholtz, Philip [U.S. Department of Energy, Joint Genome Institute; Kyrpides, Nikos C [U.S. Department of Energy, Joint Genome Institute

    2007-01-01

    Metagenomics is a rapidly emerging field of research for studying microbial communities. To evaluate methods presently used to process metagenomic sequences, we constructed three simulated data sets of varying complexity by combining sequencing reads randomly selected from 113 isolate genomes. These data sets were designed to model real metagenomes in terms of complexity and phylogenetic composition. We assembled the sampled reads using three commonly used genome assemblers (Phrap, Arachne and JAZZ), and predicted genes using two popular gene-finding pipelines (fgenesb and CRITICA/GLIMMER). The phylogenetic origins of the assembled contigs were predicted using one sequence-similarity-based (BLAST hit distribution) and two sequence-composition-based (PhyloPythia, oligonucleotide frequencies) binning methods. We explored the effects of the simulated community structure and method combinations on the fidelity of each processing step by comparison to the corresponding isolate genomes. The simulated data sets are available online to facilitate standardized benchmarking of tools for metagenomic analysis.

  3. A Sensitivity Analysis Method to Study the Behavior of Complex Process-based Models

    Science.gov (United States)

    Brugnach, M.; Neilson, R.; Bolte, J.

    2001-12-01

    The use of process-based models as a tool for scientific inquiry is becoming increasingly relevant in ecosystem studies. Process-based models are artificial constructs that simulate a system by mechanistically mimicking the functioning of its component processes. Structurally, a process-based model can be characterized in terms of its processes and the relationships established among them. Each process comprises a set of functional relationships among several model components (e.g., state variables, parameters and input data). While not encoded explicitly, the dynamics of the model emerge from this set of components and interactions organized in terms of processes. It is the task of the modeler to guarantee that the dynamics generated are appropriate and semantically equivalent to the phenomena being modeled. Despite the availability of techniques to characterize and understand model behavior, they do not suffice for a complete and easy understanding of how a complex process-based model operates. For example, sensitivity analysis studies model behavior by determining the rate of change in model output as parameters or input data are varied. One of the problems with this approach is that it considers the model as a "black box" and focuses on explaining model behavior by analyzing the input-output relationship. Since these models have a high degree of nonlinearity, understanding how an input affects an output can be an extremely difficult task. Operationally, applying this technique can be challenging because complex process-based models are generally characterized by a large parameter space. To overcome some of these difficulties, we propose a sensitivity analysis method applicable to complex process-based models. This method focuses sensitivity analysis at the process level, and it aims to determine how sensitive the model output is to variations in the processes.
Once the processes that exert the major influence in
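
The process-level sensitivity idea described above can be illustrated with a minimal sketch: a toy two-process model (hypothetical growth and decay processes, not the authors' ecosystem model) whose output sensitivity to each process rate is estimated by central finite differences:

```python
def model(growth_rate, decay_rate, x0=1.0, steps=100, dt=0.1):
    """Toy process-based model: a growth process and a nonlinear decay
    process jointly drive the state x (forward-Euler integration)."""
    x = x0
    for _ in range(steps):
        growth = growth_rate * x        # process 1
        decay = -decay_rate * x * x     # process 2 (nonlinear)
        x += dt * (growth + decay)
    return x

def process_sensitivity(base, name, eps=1e-5):
    """Central-difference sensitivity of the model output to the rate
    parameter controlling one process."""
    hi, lo = dict(base), dict(base)
    hi[name] += eps
    lo[name] -= eps
    return (model(**hi) - model(**lo)) / (2 * eps)

base = {"growth_rate": 0.5, "decay_rate": 0.3}
s_growth = process_sensitivity(base, "growth_rate")  # positive: growth raises x
s_decay = process_sensitivity(base, "decay_rate")    # negative: decay lowers x
```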

  4. Method for treating a nuclear process off-gas stream

    International Nuclear Information System (INIS)

    Pence, D.T.; Chou, C.C.

    1984-01-01

    Disclosed is a method for selectively removing and recovering the noble gas and other gaseous components typically emitted during nuclear process operations. The method is adaptable and useful for treating dissolver off-gas effluents released during reprocessing of spent nuclear fuels, permitting radioactive contaminant recovery prior to releasing the remaining off-gases to the atmosphere. Briefly, the method sequentially comprises treating the off-gas stream to preliminarily remove NOx, hydrogen and carbon-containing organic compounds, and semivolatile fission product metal oxide components therefrom; adsorbing iodine components on silver-exchanged mordenite; removing water vapor carried by said stream by means of a molecular sieve; selectively removing the carbon dioxide components of said off-gas stream by means of a molecular sieve; selectively removing xenon in the gas phase by passing said stream through a molecular sieve comprising silver-exchanged mordenite; selectively separating krypton from oxygen by means of a molecular sieve comprising silver-exchanged mordenite; selectively separating krypton from the bulk nitrogen stream using a molecular sieve comprising silver-exchanged mordenite cooled to about -140° to -160° C; concentrating the desorbed krypton upon a molecular sieve comprising silver-exchanged mordenite cooled to about -140° to -160° C; and further cryogenically concentrating, and then recovering for storage, the desorbed krypton.

  5. Method for treating a nuclear process off-gas stream

    Science.gov (United States)

    Pence, Dallas T.; Chou, Chun-Chao

    1984-01-01

    Disclosed is a method for selectively removing and recovering the noble gas and other gaseous components typically emitted during nuclear process operations. The method is adaptable and useful for treating dissolver off-gas effluents released during reprocessing of spent nuclear fuels, permitting radioactive contaminant recovery prior to releasing the remaining off-gases to the atmosphere. Briefly, the method sequentially comprises treating the off-gas stream to preliminarily remove NOx, hydrogen and carbon-containing organic compounds, and semivolatile fission product metal oxide components therefrom; adsorbing iodine components on silver-exchanged mordenite; removing water vapor carried by said stream by means of a molecular sieve; selectively removing the carbon dioxide components of said off-gas stream by means of a molecular sieve; selectively removing xenon in the gas phase by passing said stream through a molecular sieve comprising silver-exchanged mordenite; selectively separating krypton from oxygen by means of a molecular sieve comprising silver-exchanged mordenite; selectively separating krypton from the bulk nitrogen stream using a molecular sieve comprising silver-exchanged mordenite cooled to about -140° to -160° C; concentrating the desorbed krypton upon a molecular sieve comprising silver-exchanged mordenite cooled to about -140° to -160° C; and further cryogenically concentrating, and then recovering for storage, the desorbed krypton.

  6. Reliability Estimation of the Pultrusion Process Using the First-Order Reliability Method (FORM)

    DEFF Research Database (Denmark)

    Baran, Ismet; Tutum, Cem Celal; Hattel, Jesper Henri

    2013-01-01

    In the present study the reliability estimation of the pultrusion process of a flat plate is analyzed by using the first-order reliability method (FORM). The implementation of the numerical process model is validated by comparing the deterministic temperature and cure degree profiles with corresponding analyses in the literature. The centerline degree of cure at the exit (CDOCE) being less than a critical value and the maximum composite temperature (Tmax) during the process being greater than a critical temperature are selected as the limit state functions (LSFs) for the FORM. The cumulative...
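
FORM itself is well documented: the reliability index is the distance from the origin to the most probable failure point on the limit-state surface in standard normal space, commonly found with the Hasofer-Lind/Rackwitz-Fiessler (HL-RF) iteration. Below is a minimal sketch on a generic linear limit state, not the pultrusion model from the abstract:

```python
import math

def form_beta(g, n, u0=None, tol=1e-10, max_iter=100):
    """HL-RF iteration for the FORM reliability index of a limit state
    g(u) expressed in standard normal space u."""
    u = list(u0 or [0.0] * n)
    for _ in range(max_iter):
        h = 1e-6
        grad = []
        for i in range(n):  # numerical gradient of g at u
            up, dn = u[:], u[:]
            up[i] += h
            dn[i] -= h
            grad.append((g(up) - g(dn)) / (2 * h))
        norm2 = sum(gi * gi for gi in grad)
        # HL-RF update: project u onto the linearized limit-state surface
        scale = (sum(gi * ui for gi, ui in zip(grad, u)) - g(u)) / norm2
        u_new = [scale * gi for gi in grad]
        if max(abs(a - b) for a, b in zip(u_new, u)) < tol:
            u = u_new
            break
        u = u_new
    return math.sqrt(sum(ui * ui for ui in u))

# Linear limit state R - S with R ~ N(5, 1) and S ~ N(3, 1), written in
# standard normal coordinates; the exact reliability index is 2/sqrt(2).
beta = form_beta(lambda u: (5 + u[0]) - (3 + u[1]), n=2)
```

For a linear limit state the iteration converges in one step; the CDOCE and Tmax limit states of the paper would replace the lambda above with evaluations of the process model.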

  7. [Influence of different original processing methods on quality of Salvia Miltiorrhizae Radix et Rhizoma from Shandong].

    Science.gov (United States)

    Zhao, Zhi-Gang; Gao, Shu-Rui; Hou, Jun-Ling; Wang, Wen-Quan; Xu, Zhen-Guang; Song, Yan; Zhang, Xian-Ming; Li, Jun

    2014-04-01

    In this paper, the contents of rosmarinic acid, salvianolic acid B, cryptotanshinone, and tanshinone II(A) in samples of Salvia Miltiorrhizae Radix et Rhizoma prepared by different original processing methods were determined by HPLC. Different processing methods had different influences on the four active ingredients. Sun-drying reduced the content of cryptotanshinone, tanshinone II(A) and rosmarinic acid; integral samples were better than those cut into segments. The oven-drying method had a great influence on the water-soluble ingredients; high temperature (80-100 degrees C) could easily cause a large loss of rosmarinic acid and salvianolic acid B. The role of the traditional processing method "fahan" was complicated: the content of rosmarinic acid decreased, cryptotanshinone and tanshinone II(A) increased, and salvianolic acid B showed no difference after "fahan". Drying in the shade and oven-drying at low temperature (40-60 degrees C) were both effective in keeping the active ingredients of Salvia Miltiorrhizae Radix et Rhizoma, and there was no difference between integral samples and samples cut into segments. Therefore, comprehensively considering the content of active ingredients and the processing cost, shade-drying or oven-drying at low temperature (40-60 degrees C) should be the most suitable original processing method.

  8. The application of the analytic hierarchy process (AHP) in uranium mine mining method of the optimal selection

    International Nuclear Information System (INIS)

    Tan Zhongyin; Kuang Zhengping; Qiu Huiyuan

    2014-01-01

    The analytic hierarchy process (AHP) is a systematic and hierarchical analysis method combining qualitative and quantitative approaches. In this article, the basic decision theory of the analytic hierarchy process is applied, with a project in the northern Guangdong region as the research object, to establish a hierarchical analysis model for optimal selection of the in-situ mining method. The results show that the AHP model for mining method selection was reliable, the optimization results conformed with the in-situ mining method actually used, and the approach has good practicability. (authors)
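
The AHP computation behind such a selection model is standard: priority weights are the principal eigenvector of a pairwise-comparison matrix, checked with Saaty's consistency ratio. A minimal sketch with an illustrative matrix (the values are hypothetical, not from the study):

```python
def ahp_weights(M, iters=100):
    """Approximate the principal eigenvector of a pairwise-comparison
    matrix M by power iteration; returns normalized priority weights."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [vi / s for vi in v]
    return w

def consistency_ratio(M, w, ri={3: 0.58, 4: 0.90, 5: 1.12}):
    """Saaty consistency ratio CR = CI / RI; CR < 0.1 is conventionally
    taken as acceptable consistency of the judgments."""
    n = len(M)
    lam = sum(sum(M[i][j] * w[j] for j in range(n)) / w[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    return ci / ri[n]

# Hypothetical comparison of three mining methods on a single criterion,
# using Saaty's 1-9 scale; the entries are illustrative only.
M = [[1, 3, 5],
     [1 / 3, 1, 3],
     [1 / 5, 1 / 3, 1]]
w = ahp_weights(M)
cr = consistency_ratio(M, w)
```

The first alternative dominates (largest weight) and the judgments are consistent (CR below 0.1), which is the kind of check any AHP-based selection would report.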

  9. Using stable isotopes to monitor forms of sulfur during desulfurization processes: A quick screening method

    Science.gov (United States)

    Liu, Chao-Li; Hackley, Keith C.; Coleman, D.D.; Kruse, C.W.

    1987-01-01

    A method using stable isotope ratio analysis to monitor the reactivity of sulfur forms in coal during thermal and chemical desulfurization processes has been developed at the Illinois State Geological Survey. The method is based upon the fact that a significant difference exists in some coals between the 34S/32S ratios of the pyritic and organic sulfur. A screening method for determining the suitability of coal samples for use in isotope ratio analysis is described. Making these special coals available from coal sample programs would assist research groups in sorting out the complex sulfur chemistry which accompanies thermal and chemical processing of high sulfur coals. ?? 1987.
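
The screening idea rests on a two-end-member mass balance: if the pyritic and organic 34S/32S signatures differ enough, the bulk isotope value fixes the pyritic fraction. A minimal sketch with illustrative delta values (not from the paper):

```python
def pyritic_fraction(delta_total, delta_pyritic, delta_organic):
    """Two-end-member sulfur isotope mass balance:
    delta_total = f * delta_pyritic + (1 - f) * delta_organic,
    solved for f, the fraction of total sulfur present as pyrite."""
    return (delta_total - delta_organic) / (delta_pyritic - delta_organic)

# Illustrative delta-34S values in per mil (hypothetical, not from the
# paper): a coal whose bulk sulfur sits midway between end members.
f = pyritic_fraction(delta_total=2.0, delta_pyritic=5.0, delta_organic=-1.0)
```

A coal is "suitable" for this kind of monitoring precisely when the denominator above is large, i.e. the two sulfur forms are isotopically distinct.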

  10. METHOD OF ELECTRON BEAM PROCESSING

    DEFF Research Database (Denmark)

    2003-01-01

    As a rule, electron beam welding takes place in a vacuum. However, this means that the workpieces in question have to be placed in a vacuum chamber and have to be removed therefrom after welding. This is time-consuming and a serious limitation of a process whose greatest advantage is the option of welding workpieces of large thicknesses. Therefore the idea is to guide the electron beam (2) to the workpiece via a hollow wire, said wire thereby acting as a prolongation of the vacuum chamber (4) down to the workpiece. Thus, a workpiece need not be placed inside the vacuum chamber, thereby exploiting the potential of electron beam processing to a greater degree than previously possible, for example by means of electron beam welding.

  11. Nonuniform sampling and non-Fourier signal processing methods in multidimensional NMR.

    Science.gov (United States)

    Mobli, Mehdi; Hoch, Jeffrey C

    2014-11-01

    Beginning with the introduction of Fourier Transform NMR by Ernst and Anderson in 1966, time domain measurement of the impulse response (the free induction decay, FID) consisted of sampling the signal at a series of discrete intervals. For compatibility with the discrete Fourier transform (DFT), the intervals are kept uniform, and the Nyquist theorem dictates the largest value of the interval sufficient to avoid aliasing. With the proposal by Jeener of parametric sampling along an indirect time dimension, extension to multidimensional experiments employed the same sampling techniques used in one dimension, similarly subject to the Nyquist condition and suitable for processing via the discrete Fourier transform. The challenges of obtaining high-resolution spectral estimates from short data records using the DFT were already well understood, however. Despite techniques such as linear prediction extrapolation, the achievable resolution in the indirect dimensions is limited by practical constraints on measuring time. The advent of non-Fourier methods of spectrum analysis capable of processing nonuniformly sampled data has led to an explosion in the development of novel sampling strategies that avoid the limits on resolution and measurement time imposed by uniform sampling. The first part of this review discusses the many approaches to data sampling in multidimensional NMR, the second part highlights commonly used methods for signal processing of such data, and the review concludes with a discussion of other approaches to speeding up data acquisition in NMR. Copyright © 2014 Elsevier B.V. All rights reserved.
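
The Nyquist condition mentioned above can be demonstrated in a few lines: a tone above half the sampling rate produces exactly the same uniform samples as its alias below it (a generic sketch, not an NMR-specific reconstruction):

```python
import math

def samples(freq, fs, n):
    """n uniform samples of cos(2*pi*freq*t) taken at sampling rate fs."""
    return [math.cos(2 * math.pi * freq * k / fs) for k in range(n)]

fs = 10.0                         # sampling rate (arbitrary units)
x_true = samples(3.0, fs, 32)     # 3 Hz tone, below Nyquist (fs/2 = 5)
x_alias = samples(13.0, fs, 32)   # 13 Hz tone aliases onto 3 Hz (13 - fs = 3)

# The two sampled sequences are indistinguishable: no uniform-sampling
# analysis can separate the tone from its alias.
max_diff = max(abs(a - b) for a, b in zip(x_true, x_alias))
```

Nonuniform sampling schemes break exactly this degeneracy, which is what allows the non-Fourier reconstruction methods reviewed here to trade sampling density for resolution.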

  12. Stochastic processes, multiscale modeling, and numerical methods for computational cellular biology

    CERN Document Server

    2017-01-01

    This book focuses on the modeling and mathematical analysis of stochastic dynamical systems along with their simulations. The collected chapters review fundamental and current topics and approaches to dynamical systems in cellular biology. This text aims to develop improved mathematical and computational methods with which to study biological processes. At the scale of a single cell, stochasticity becomes important due to low copy numbers of biological molecules, such as mRNA and proteins, that take part in the biochemical reactions driving cellular processes. When trying to describe such biological processes, traditional deterministic models are often inadequate, precisely because of these low copy numbers. This book presents stochastic models, which are necessary to account for small particle numbers and extrinsic noise sources. The complexity of these models depends upon whether the biochemical reactions are diffusion-limited or reaction-limited. In the former case, one needs to adopt the framework of s...

  13. Method of processing liquid wastes containing radioactive materials

    International Nuclear Information System (INIS)

    Matsumoto, Kaname; Shirai, Takamori; Nemoto, Kuniyoshi; Yoshikawa, Jun; Matsuda, Takeshi.

    1983-01-01

    Purpose: To reduce the number of solidification products by removing Co-60, which is particularly difficult to remove from radioactive liquid wastes containing a water-soluble chelating agent, by adsorbing it on a specific chelating agent. Method: Liquid wastes containing radioactive cobalt and a water-soluble chelating agent are passed through a layer of a less water-soluble chelating agent that forms a complex compound with cobalt in an acidic pH region. Thus, the chelate compound of radioactive cobalt (particularly Co-60) is eliminated by adsorption on the specific chelating agent layer. The chelating agent with Co-60 adsorbed thereon is discarded as-is through the cement- or asphalt-solidification process, whereby the number of solidification products generated can be significantly reduced. (Moriyama, K.)

  14. Processing methods for differential analysis of LC/MS profile data

    Directory of Open Access Journals (Sweden)

    Orešič Matej

    2005-07-01

    Full Text Available Abstract Background Liquid chromatography coupled to mass spectrometry (LC/MS has been widely used in proteomics and metabolomics research. In this context, the technology has been increasingly used for differential profiling, i.e. broad screening of biomolecular components across multiple samples in order to elucidate the observed phenotypes and discover biomarkers. One of the major challenges in this domain remains development of better solutions for processing of LC/MS data. Results We present a software package MZmine that enables differential LC/MS analysis of metabolomics data. This software is a toolbox containing methods for all data processing stages preceding differential analysis: spectral filtering, peak detection, alignment and normalization. Specifically, we developed and implemented a new recursive peak search algorithm and a secondary peak picking method for improving already aligned results, as well as a normalization tool that uses multiple internal standards. Visualization tools enable comparative viewing of data across multiple samples. Peak lists can be exported into other data analysis programs. The toolbox has already been utilized in a wide range of applications. We demonstrate its utility on an example of metabolic profiling of Catharanthus roseus cell cultures. Conclusion The software is freely available under the GNU General Public License and it can be obtained from the project web page at: http://mzmine.sourceforge.net/.
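
MZmine's exact algorithms are not spelled out in this abstract; as an illustration of the normalization idea it mentions, here is a minimal Python sketch of normalizing a peak against multiple internal standards, weighting each standard by its retention-time proximity. The weighting scheme and the 0.1-minute smoothing constant are assumptions for illustration, not MZmine's implementation.

```python
def normalize_peak(peak_rt, peak_intensity, standards):
    """Normalize a peak intensity by a retention-time-weighted
    combination of internal standard intensities.

    standards: list of (rt, intensity) tuples for the internal standards.
    Weights fall off with RT distance, so the nearest standard dominates;
    the 0.1 constant only avoids division by zero (an assumption).
    """
    weights = [1.0 / (abs(peak_rt - rt) + 0.1) for rt, _ in standards]
    total = sum(weights)
    # Weighted mean response of the internal standards around this peak
    ref = sum(w * inten for w, (_, inten) in zip(weights, standards)) / total
    return peak_intensity / ref

# Example: two internal standards bracketing a peak at RT 5.0 min
stds = [(2.0, 2000.0), (8.0, 1000.0)]
print(normalize_peak(5.0, 1500.0, stds))
```

Dividing by a locally weighted standard response makes intensities comparable across samples even when instrument response drifts over a run.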

  15. Quantitative Diagnosis of Rotor Vibration Fault Using Process Power Spectrum Entropy and Support Vector Machine Method

    Directory of Open Access Journals (Sweden)

    Cheng-Wei Fei

    2014-01-01

    Full Text Available To improve the capacity for diagnosing rotor vibration faults in stochastic processes, an effective fault diagnosis method combining Process Power Spectrum Entropy (PPSE) and the Support Vector Machine (SVM), PPSE-SVM for short, was proposed. The fault diagnosis model of PPSE-SVM was established by fusing the PPSE method and SVM theory. Based on a rotor vibration fault simulation experiment, process data for four typical vibration faults (rotor imbalance, shaft misalignment, rotor-stator rubbing, and pedestal looseness) were collected under multipoint (multiple-channel) and multispeed conditions. Using the PPSE method, the PPSE values of these data were extracted as fault feature vectors to establish the SVM model of rotor vibration fault diagnosis. The rotor vibration fault diagnosis results demonstrate that the proposed method possesses high precision, good learning ability, good generalization ability, and strong fault tolerance (robustness) in four respects: distinguishing fault types, fault severity, fault location, and noise immunity of rotor stochastic vibration. This paper presents a novel method (PPSE-SVM) for rotor vibration fault diagnosis and real-time vibration monitoring. The presented effort is promising for improving the fault diagnosis precision of rotating machinery such as gas turbines.
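
The abstract does not give the PPSE formula; a common definition of power spectrum entropy is the Shannon entropy of the normalized power spectral density, which can be sketched as follows. The plain-DFT spectrum estimator and the normalization to [0, 1] are assumptions, not necessarily the paper's exact feature.

```python
import cmath
import math

def power_spectrum_entropy(signal):
    """Shannon entropy of the normalized power spectrum of a signal.

    A flat spectrum (noise-like) gives entropy near 1; power concentrated
    at a few frequencies (e.g. a dominant unbalance harmonic) gives
    entropy near 0. Entropy is normalized by its maximum, log(K).
    """
    n = len(signal)
    half = n // 2  # keep the non-redundant frequency bins
    power = []
    for k in range(1, half + 1):   # skip the DC bin
        s = sum(x * cmath.exp(-2j * math.pi * k * t / n)
                for t, x in enumerate(signal))
        power.append(abs(s) ** 2)
    total = sum(power)
    probs = [p / total for p in power if p > 0]
    entropy = -sum(p * math.log(p) for p in probs)
    return entropy / math.log(len(power))

# A pure sine concentrates power in one bin -> entropy near 0
sine = [math.sin(2 * math.pi * 4 * t / 64) for t in range(64)]
print(power_spectrum_entropy(sine) < 0.1)
```

Each channel's entropy value then becomes one component of the fault feature vector fed to the SVM classifier.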

  16. Improved methods for signal processing in measurements of mercury by Tekran® 2537A and 2537B instruments

    Science.gov (United States)

    Ambrose, Jesse L.

    2017-12-01

    Atmospheric Hg measurements are commonly carried out using Tekran® Instruments Corporation's model 2537 Hg vapor analyzers, which employ gold amalgamation preconcentration sampling and detection by thermal desorption (TD) and atomic fluorescence spectrometry (AFS). A generally overlooked and poorly characterized source of analytical uncertainty in those measurements is the method by which the raw Hg atomic fluorescence (AF) signal is processed. Here I describe new software-based methods for processing the raw signal from the Tekran® 2537 instruments, and I evaluate the performances of those methods together with the standard Tekran® internal signal processing method. For test datasets from two Tekran® instruments (one 2537A and one 2537B), I estimate that signal processing uncertainties in Hg loadings determined with the Tekran® method are within ±[1 % + 1.2 pg] and ±[6 % + 0.21 pg], respectively. I demonstrate that the Tekran® method can produce significant low biases (≥ 5 %) not only at low Hg sample loadings (< 5 pg) but also at tropospheric background concentrations of gaseous elemental mercury (GEM) and total mercury (THg) (˜ 1 to 2 ng m-3) under typical operating conditions (sample loadings of 5-10 pg). Signal processing uncertainties associated with the Tekran® method can therefore represent a significant unaccounted for addition to the overall ˜ 10 to 15 % uncertainty previously estimated for Tekran®-based GEM and THg measurements. Signal processing bias can also add significantly to uncertainties in Tekran®-based gaseous oxidized mercury (GOM) and particle-bound mercury (PBM) measurements, which often derive from Hg sample loadings < 5 pg. In comparison, estimated signal processing uncertainties associated with the new methods described herein are low, ranging from within ±0.053 pg, when the Hg thermal desorption peaks are defined manually, to within ±[2 % + 0.080 pg] when peak definition is automated. Mercury limits of detection (LODs
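
The paper's own signal-processing methods are not reproduced here; as a sketch of the manual peak-definition idea it describes, the following illustrates baseline-corrected trapezoidal integration over a user-chosen desorption-peak window. The linear two-point baseline is an assumption for illustration.

```python
def integrate_peak(times, signal, t_start, t_end):
    """Baseline-corrected trapezoidal integration of a desorption peak.

    A straight baseline is drawn between the signal values at the window
    edges and subtracted before integrating. t_start and t_end define
    the (here manually chosen) peak window.
    """
    pts = [(t, y) for t, y in zip(times, signal) if t_start <= t <= t_end]
    (t0, y0), (t1, y1) = pts[0], pts[-1]

    def baseline(t):
        return y0 + (y1 - y0) * (t - t0) / (t1 - t0)

    area = 0.0
    for (ta, ya), (tb, yb) in zip(pts, pts[1:]):
        ca, cb = ya - baseline(ta), yb - baseline(tb)
        area += 0.5 * (ca + cb) * (tb - ta)
    return area

# Triangular peak of height 2 sitting on a flat baseline of 1
times = [0, 1, 2, 3, 4]
signal = [1, 1, 3, 1, 1]
print(integrate_peak(times, signal, 0, 4))   # -> 2.0
```

How the peak window and baseline are chosen is exactly the kind of decision that produces the inter-method biases the paper quantifies.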

  17. Development of an inexpensive optical method for studies of dental erosion process in vitro

    Science.gov (United States)

    Nasution, A. M. T.; Noerjanto, B.; Triwanto, L.

    2008-09-01

    Teeth have important roles in the digestion of food, in supporting the facial structure, and in the articulation of speech. Abnormalities in tooth structure can be initiated by an erosion process, caused by diet or beverage consumption, that leads to destruction affecting their functionality. Research into the erosion processes that lead to dental abnormalities is important for care and prevention. Accurate measurement methods capable of quantifying the degree of dental destruction are necessary as research tools. In this work an inexpensive optical method is developed as a tool to study the dental erosion process. It is based on extracting parameters from 3D dental visual information. The 3D visual image is obtained by reconstruction from multiple lateral projections of 2D images captured from many angles. Using a simple stepper motor and a pocket digital camera, a sequence of multi-projection 2D images of a premolar tooth is obtained. These images are then reconstructed to produce a 3D image, which is useful for quantifying the related dental erosion parameters. The quantification is based on the shrinkage of dental volume, as well as on surface properties, due to the erosion process. The quantification results are correlated with the amount of calcium dissolved from the tooth, measured using atomic absorption spectrometry. The proposed method would be useful as a visualization tool in engineering, dentistry, and medical research, as well as for educational purposes.

  18. Obtaining bixin from semi-defatted annatto seeds by a mechanical method and solvent extraction: Process integration and economic evaluation.

    Science.gov (United States)

    Alcázar-Alay, Sylvia C; Osorio-Tobón, J Felipe; Forster-Carneiro, Tânia; Meireles, M Angela A

    2017-09-01

    This work involves the application of physical separation methods to concentrate the pigment of semi-defatted annatto seeds, a noble vegetal biomass rich in bixin pigments. Semi-defatted annatto seeds are the residue produced after the extraction of the lipid fraction from annatto seeds using supercritical fluid extraction (SFE). Semi-defatted annatto seeds are used in this work for three important reasons: i) prior lipid extraction is necessary to recover the tocotrienol-rich oil present in the annatto seeds; ii) initial removal of the oil via the SFE process favors bixin separation; and iii) the cost of the raw material is negligible. Physical methods including i) the mechanical fractionation method and ii) an integrated process of the mechanical fractionation method and low-pressure solvent extraction (LPSE) were studied. The integrated process was proposed for processing two different semi-defatted annatto materials, denoted Batches 1 and 2. The cost of manufacture (COM) was calculated for two production scales (5 and 50 L), considering the integrated process vs. the mechanical fractionation method alone. The integrated process showed a significantly higher COM than the mechanical fractionation method. This work suggests that the mechanical fractionation method is an adequate, low-cost process for obtaining a pigment-rich product from semi-defatted annatto seeds. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. A collaborative design method to support integrated care. An ICT development method containing continuous user validation improves the entire care process and the individual work situation

    Science.gov (United States)

    Scandurra, Isabella; Hägglund, Maria

    2009-01-01

    Introduction Integrated care involves different professionals, belonging to different care provider organizations and requires immediate and ubiquitous access to patient-oriented information, supporting an integrated view on the care process [1]. Purpose To present a method for development of usable and work process-oriented information and communication technology (ICT) systems for integrated care. Theory and method Based on Human-computer Interaction Science and in particular Participatory Design [2], we present a new collaborative design method in the context of health information systems (HIS) development [3]. This method implies a thorough analysis of the entire interdisciplinary cooperative work and a transformation of the results into technical specifications, via user validated scenarios, prototypes and use cases, ultimately leading to the development of appropriate ICT for the variety of occurring work situations for different user groups, or professions, in integrated care. Results and conclusions Application of the method in homecare of the elderly resulted in an HIS that was well adapted to the intended user groups. Conducted in multi-disciplinary seminars, the method captured and validated user needs and system requirements for different professionals, work situations, and environments not only for current work; it also aimed to improve collaboration in future (ICT supported) work processes. A holistic view of the entire care process was obtained and supported through different views of the HIS for different user groups, resulting in improved work in the entire care process as well as for each collaborating profession [4].

  20. Method for atmospheric pressure reactive atom plasma processing for surface modification

    Science.gov (United States)

    Carr, Jeffrey W [Livermore, CA

    2009-09-22

    Reactive atom plasma processing can be used to shape, polish, planarize and clean the surfaces of difficult materials with minimal subsurface damage. The apparatus and methods use a plasma torch, such as a conventional ICP torch. The workpiece and plasma torch are moved with respect to each other, whether by translating and/or rotating the workpiece, the plasma, or both. The plasma discharge from the torch can be used to shape, planarize, polish, and/or clean the surface of the workpiece, as well as to thin the workpiece. The processing may cause minimal or no damage to the workpiece underneath the surface, and may involve removing material from the surface of the workpiece.

  1. Apparatus and method for materials processing utilizing a rotating magnetic field

    Science.gov (United States)

    Muralidharan, Govindarajan; Angelini, Joseph A.; Murphy, Bart L.; Wilgen, John B.

    2017-04-11

    An apparatus for materials processing utilizing a rotating magnetic field comprises a platform for supporting a specimen, and a plurality of magnets underlying the platform. The plurality of magnets are configured for rotation about an axis of rotation intersecting the platform. A heat source is disposed above the platform for heating the specimen during the rotation of the plurality of magnets. A method for materials processing utilizing a rotating magnetic field comprises providing a specimen on a platform overlying a plurality of magnets; rotating the plurality of magnets about an axis of rotation intersecting the platform, thereby applying a rotating magnetic field to the specimen; and, while rotating the plurality of magnets, heating the specimen to a desired temperature.

  2. DEVELOPMENT OF A METHOD FOR STATISTICAL ANALYSIS OF THE ACCURACY AND STABILITY OF THE PRODUCTION PROCESS OF EPOXY RESIN ED-20

    Directory of Open Access Journals (Sweden)

    N. V. Zhelninskaya

    2015-01-01

    Full Text Available Statistical methods play an important role in the objective evaluation of the quantitative and qualitative characteristics of a process and are among the most important elements of a quality assurance system and of total quality management. To produce a quality product, one must know the real accuracy of the existing equipment, determine whether the accuracy of the selected technological process complies with the specified product accuracy, and assess process stability. Most random events, particularly in manufacturing and scientific research, are shaped by a large number of random factors and are described by the normal distribution, which is central to many practical studies. Modern statistical methods are difficult to grasp and to apply widely without in-depth mathematical training of all participants in the process. Knowing the distribution of a random variable, one can obtain all the characteristics of a batch of products and determine the mean value and the variance. Statistical control and quality control methods were used in the analysis of the accuracy and stability of the technological process of production of epoxy resin ED-20. The numerical characteristics of the distribution law of the controlled parameters were estimated, and the percentage of defects in the investigated products was determined. For the stability assessment of the manufacturing process of epoxy resin ED-20, Shewhart control charts using quantitative data were selected: charts of individual values X and of the moving range R. Pareto charts were used to identify the causes that affect low dynamic viscosity to the largest extent. The causes of low dynamic viscosity values were analyzed using Ishikawa diagrams, which show the most typical factors behind the variability of the process results.
To resolve the problem, it is recommended to modify the polymer composition with carbon fullerenes and to use the developed method for the production of
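
The individual-values (X) and moving-range (R) charts mentioned above can be sketched as follows; the viscosity data are hypothetical, and the constants d2 = 1.128 and D4 = 3.267 are the standard Shewhart factors for a moving range of two observations.

```python
def individuals_chart_limits(values):
    """Control limits for a Shewhart individuals (X) chart using the
    average moving range, with d2 = 1.128 for subgroups of size 2.

    Returns (lcl, center, ucl, mr_ucl), where mr_ucl is the upper
    limit of the moving-range chart (D4 = 3.267 for n = 2).
    """
    n = len(values)
    center = sum(values) / n
    mrs = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(mrs) / len(mrs)
    sigma_hat = mr_bar / 1.128          # d2 constant for n = 2
    ucl = center + 3 * sigma_hat
    lcl = center - 3 * sigma_hat
    mr_ucl = 3.267 * mr_bar             # D4 constant for n = 2
    return lcl, center, ucl, mr_ucl

# Hypothetical viscosity measurements from successive resin batches
viscosity = [20.1, 19.8, 20.4, 20.0, 19.9, 20.2, 20.3, 19.7]
lcl, center, ucl, mr_ucl = individuals_chart_limits(viscosity)
out_of_control = [v for v in viscosity if not lcl <= v <= ucl]
print(len(out_of_control))
```

Points outside the limits, or non-random patterns within them, signal that the process is not statistically stable and that assignable causes should be sought (e.g. with the Pareto and Ishikawa analyses described above).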

  3. Water conservation and reuse using the Water Sources Diagram method for batch process: case studies

    Directory of Open Access Journals (Sweden)

    Fernando Luiz Pellegrini Pessoa

    2012-04-01

    Full Text Available Water resources management has become an important factor for the sustainability of industrial processes, since there is a growing need for methodologies aimed at the conservation and rational use of water. The objective of this work was to apply the heuristic-algorithmic method called the Water Sources Diagram (WSD), which is used to define the minimum water consumption target, to batch processes. Scenarios with stream reuse were generated and evaluated by applying the method to data on water quantities and contaminant concentrations in the operations. Two case studies were presented showing the reduction of water consumption, wastewater generation, and final treatment costs, in addition to the investment in storage tanks. The scenarios proved very promising, achieving reductions of up to 45% in water consumption and wastewater generation, and of around 37% in the cost of storage tanks, without the need to allocate regeneration processes. Thus, the WSD method proved to be a relevant and flexible alternative among systemic tools aimed at minimizing water consumption in industrial processes, playing an important role within a water resources management program.
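
The WSD procedure itself is not detailed in this abstract; the following is only a simplified illustration of the mass-balance idea behind concentration-constrained water reuse. The greedy allocation, the data, and the neglect of stream mixing are all assumptions for illustration, not the WSD algorithm.

```python
def freshwater_with_reuse(operations):
    """Greedy illustration of concentration-constrained water reuse.

    Each operation is (demand_t, cin_max_ppm, cout_ppm). Outlet streams
    are offered to later operations whose inlet limit tolerates their
    concentration; whatever cannot be reused is drawn as fresh water.
    """
    fresh = 0.0
    available = []   # (volume, concentration) of reusable outlet streams
    for demand, cin_max, cout in operations:
        need = demand
        for i, (vol, conc) in enumerate(available):
            if conc <= cin_max and need > 0:
                used = min(vol, need)
                available[i] = (vol - used, conc)
                need -= used
        fresh += need                    # remainder drawn as fresh water
        available.append((demand, cout))
    return fresh

# Three hypothetical operations: demand (t), max inlet ppm, outlet ppm
ops = [(50.0, 0.0, 100.0), (30.0, 150.0, 400.0), (40.0, 500.0, 800.0)]
print(freshwater_with_reuse(ops))   # vs 120.0 t with no reuse at all
```

Even this toy allocation shows how reuse between operations cuts the freshwater target well below the sum of the individual demands.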

  4. Hybrid numerical methods for multiscale simulations of subsurface biogeochemical processes

    International Nuclear Information System (INIS)

    Scheibe, T D; Tartakovsky, A M; Tartakovsky, D M; Redden, G D; Meakin, P

    2007-01-01

    Many subsurface flow and transport problems of importance today involve coupled non-linear flow, transport, and reaction in media exhibiting complex heterogeneity. In particular, problems involving biological mediation of reactions fall into this class of problems. Recent experimental research has revealed important details about the physical, chemical, and biological mechanisms involved in these processes at a variety of scales ranging from molecular to laboratory scales. However, it has not been practical or possible to translate detailed knowledge at small scales into reliable predictions of field-scale phenomena important for environmental management applications. A large assortment of numerical simulation tools have been developed, each with its own characteristic scale. Important examples include 1. molecular simulations (e.g., molecular dynamics); 2. simulation of microbial processes at the cell level (e.g., cellular automata or particle individual-based models); 3. pore-scale simulations (e.g., lattice-Boltzmann, pore network models, and discrete particle methods such as smoothed particle hydrodynamics); and 4. macroscopic continuum-scale simulations (e.g., traditional partial differential equations solved by finite difference or finite element methods). While many problems can be effectively addressed by one of these models at a single scale, some problems may require explicit integration of models across multiple scales. We are developing a hybrid multi-scale subsurface reactive transport modeling framework that integrates models with diverse representations of physics, chemistry and biology at different scales (sub-pore, pore and continuum). The modeling framework is being designed to take advantage of advanced computational technologies including parallel code components using the Common Component Architecture, parallel solvers, gridding, data and workflow management, and visualization. 
This paper describes the specific methods/codes being used at each

  5. Identification of Hidden Failures in Process Control Systems Based on the HMG Method

    DEFF Research Database (Denmark)

    Jalashgar, Atoosa

    1998-01-01

cause the systems to become overloaded and even unstable, if they remain hidden. The method uses a particular terminology to contribute to the identification of system properties, including goals, functions, and capabilities. All identified knowledge about the system is then represented by using a tailored combination of two function-oriented methods, Multilevel Flow Modelling (MFM) and Goal Tree-Success Tree (GTST). The features of the method, called Hybrid MFM-GTST, are described and demonstrated by using an example of a process control system. (C) 1998 John Wiley & Sons, Inc.

  6. Rapid and accurate processing method for amide proton exchange rate measurement in proteins

    International Nuclear Information System (INIS)

    Koskela, Harri; Heikkinen, Outi; Kilpelaeinen, Ilkka; Heikkinen, Sami

    2007-01-01

    Exchange between protein backbone amide hydrogen and water gives relevant information about solvent accessibility and protein secondary structure stability. NMR spectroscopy provides a convenient tool to study these dynamic processes with saturation transfer experiments. Processing of this type of NMR spectra has traditionally required peak integration followed by exponential fitting, which can be tedious with large data sets. We propose here a computer-aided method that applies inverse Laplace transform in the exchange rate measurement. With this approach, the determination of exchange rates can be automated, and reliable results can be acquired rapidly without a need for manual processing
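
As a sketch of the traditional per-peak exponential fitting that the proposed inverse-Laplace approach is meant to replace, the exchange rate can be estimated by a log-linear least-squares fit of the intensity decay. The synthetic noiseless data below are illustrative.

```python
import math

def fit_exchange_rate(times, intensities):
    """Least-squares estimate of k in I(t) = I0 * exp(-k * t), obtained
    by fitting ln(I) linearly against t. This is the conventional
    per-peak exponential fit for saturation-transfer decay data.
    """
    n = len(times)
    ys = [math.log(i) for i in intensities]
    t_mean = sum(times) / n
    y_mean = sum(ys) / n
    slope = (sum((t - t_mean) * (y - y_mean) for t, y in zip(times, ys))
             / sum((t - t_mean) ** 2 for t in times))
    return -slope   # decay rate k

# Synthetic noiseless decay with k = 2.0 s^-1
ts = [0.0, 0.1, 0.2, 0.4, 0.8]
data = [5.0 * math.exp(-2.0 * t) for t in ts]
print(fit_exchange_rate(ts, data))
```

Repeating such a fit for every amide peak across large datasets is the tedious manual step that the automated inverse-Laplace processing avoids.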

  7. Drying of water based foundry coatings: Innovative test, process design and optimization methods

    DEFF Research Database (Denmark)

    Di Muoio, Giovanni Luca; Johansen, Bjørn Budolph

… of Denmark with the overall aim to optimize the drying process of water based foundry coatings. Drying of foundry coatings is a relatively new process in the foundry industry that followed the introduction of water as a solvent. In order to avoid moisture related quality problems and reach production capacity goals there is a need to understand how to design, control and optimize drying processes. The main focus of this project was on the critical parameters and properties to be controlled in production in order to achieve a stable and predictable drying process. We propose for each of these parameters … on real industrial cases. These tools have been developed in order to simulate and optimize the drying process and reduce drying time and power consumption as well as production process design time and cost of expensive drying equipment. Results show that test methods from other industries can be used …

  8. A method for acetylcholinesterase staining of brain sections previously processed for receptor autoradiography.

    Science.gov (United States)

    Lim, M M; Hammock, E A D; Young, L J

    2004-02-01

    Receptor autoradiography using selective radiolabeled ligands allows visualization of brain receptor distribution and density on film. The resolution of specific brain regions on the film often can be difficult to discern owing to the general spread of the radioactive label and the lack of neuroanatomical landmarks on film. Receptor binding is a chemically harsh protocol that can render the tissue virtually unstainable by Nissl and other conventional stains used to delineate neuroanatomical boundaries of brain regions. We describe a method for acetylcholinesterase (AChE) staining of slides previously processed for receptor binding. AChE staining is a useful tool for delineating major brain nuclei and tracts. AChE staining on sections that have been processed for receptor autoradiography provides a direct comparison of brain regions for more precise neuroanatomical description. We report a detailed thiocholine protocol that is a modification of the Koelle-Friedenwald method to amplify the AChE signal in brain sections previously processed for autoradiography. We also describe several temporal and experimental factors that can affect the density and clarity of the AChE signal when using this protocol.

  9. The ways of implementing interactive methods in the educational process of students of higher educational institutions

    Directory of Open Access Journals (Sweden)

    Y.V. Vaskov

    2015-02-01

    Full Text Available Purpose: theoretical basis and practical implementation of interactive methods in the educational process of higher education institutions. Material: The study involved 50 students of the Kharkiv Humanitarian-Pedagogical Academy. Results: The possibility of introducing the interactive teaching method "Joint Project" is shown. The theoretical study and practical implementation of the method consist in including all students of the study group, working in small groups, in joint work on mastering the content of the teaching material, in presenting their own solutions to the educational tasks, in discussing the results of the joint activity, and in making optimal decisions. Conclusions: The development of the theoretical foundations and the practical implementation of the interactive method improved the quality of the students' educational process. This is reflected in the involvement of all students in active joint learning, and it gives each student an opportunity to express their own opinion on the tasks, which raised the level of theoretical knowledge of each student.

  10. NUTRITIONAL VALUE AND METHODS OF THE TECHNOLOGICAL PROCESSING OF PELED (COREGONUS PELED GMELIN) (REVIEW)

    Directory of Open Access Journals (Sweden)

    O. Nazarov

    2016-06-01

    Full Text Available Purpose. To investigate peled as a food product and raw material for processing, and to analyze traditional methods of its technological processing. Findings. The paper analyzes the chemical composition of peled meat and its differences from that of other fish of the pond aquaculture of Ukraine. Based on the biochemical composition of the meat of peled reared in pond aquaculture, including the contents of fat, protein, and moisture, peled belongs to the category of fish of medium to high fat content with medium protein content, as well as to fish of increased nutritional value and assimilability, judging by its water-protein, fat-protein, and water-fat balance and by its amino-acid composition according to the Score standard. Unlike the cyprinids that are the main objects of pond aquaculture, the general biochemical composition and the anatomical structure of peled, as a coregonid, contribute to native organoleptic features characteristic of gourmet products of traditional processing. It was found that, unlike in other coregonids, the biochemical indices of peled meat that define the type, direction, and regime of its processing, first of all the contents of fat, protein, and moisture, are relatively stable across different age groups under pond aquaculture conditions and change less during the biological cycle. The main product requirements for the methods of technological processing of peled are summarized, namely drying, smoking, and salting. Complete technological schemes of peled processing by traditional methods, taking into account the biochemical peculiarities of the raw material and the requirements for the finished product, are presented and analyzed. Practical value. The summarized information is useful for the further development of domestic aquaculture and processing. Different indices of biochemical composition and high output indices of peled meat

  11. Sample processing device and method

    DEFF Research Database (Denmark)

    2011-01-01

A sample processing device is disclosed, which sample processing device comprises a first substrate and a second substrate, where the first substrate has a first surface comprising two area types, a first area type with a first contact angle with water and a second area type with a second contact angle with water, the first contact angle being smaller than the second contact angle. The first substrate defines an inlet system and a preparation system in areas of the first type, which two areas are separated by a barrier system in an area of the second type. The inlet system is adapted to receive …

  12. Method for verification of constituents of a process stream

    Energy Technology Data Exchange (ETDEWEB)

    Baylor, L.C.; Buchanan, B.R.; O`Rourke, P.E.

    1993-01-01

    This invention comprises a method for validating a process stream for the presence or absence of a substance of interest, such as a chemical warfare agent: that is, for verifying that a chemical warfare agent is present in an input line feeding the agent into a reaction vessel for destruction, or, in a facility producing commercial chemical products, that a constituent of a chemical warfare agent has not been substituted for the proper chemical compound. The method includes the steps of transmitting light through a sensor positioned in the feed line just before the chemical constituent enters the reaction vessel, measuring an optical spectrum of the chemical constituent from the transmitted light beam, and comparing the measured spectrum to a reference spectrum of the chemical agent and, preferably, also to reference spectra of surrogates. A signal is given if the chemical agent is not entering the reaction vessel for destruction, or if a constituent of a chemical agent has been substituted for the proper chemical compound in the feed line.
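
The patent does not specify how spectra are compared; one simple way to sketch the comparison step is a Pearson correlation between the measured spectrum and the reference, with an alarm threshold. The threshold and the toy spectra below are assumptions for illustration.

```python
def spectral_match(measured, reference):
    """Pearson correlation between a measured optical spectrum and a
    reference spectrum, both sampled at the same wavelengths. Values
    near 1 indicate the expected constituent; a low value trips an alarm.
    """
    n = len(measured)
    mx = sum(measured) / n
    my = sum(reference) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(measured, reference))
    sx = sum((a - mx) ** 2 for a in measured) ** 0.5
    sy = sum((b - my) ** 2 for b in reference) ** 0.5
    return cov / (sx * sy)

reference = [0.1, 0.5, 0.9, 0.4, 0.2]
sample_ok = [0.12, 0.51, 0.88, 0.41, 0.19]   # same shape, slight noise
sample_bad = [0.9, 0.1, 0.2, 0.8, 0.7]       # wrong constituent
print(spectral_match(reference, sample_ok) > 0.98)
print(spectral_match(reference, sample_bad) < 0.5)
```

Because correlation is insensitive to overall intensity scaling, it matches spectral shape rather than absolute absorbance, which suits a simple pass/alarm decision.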

  13. A robust method for processing scanning probe microscopy images and determining nanoobject position and dimensions

    NARCIS (Netherlands)

    Silly, F.

    2009-01-01

Processing of scanning probe microscopy (SPM) images is essential to explore nanoscale phenomena. Image processing and pattern recognition techniques are developed to improve the accuracy and consistency of nanoobject and surface characterization. We present a robust and versatile method to …

  14. Evaluation and selection of in-situ leaching mining method using analytic hierarchy process

    International Nuclear Information System (INIS)

    Zhao Heyong; Tan Kaixuan; Liu Huizhen

    2007-01-01

    According to the complicated conditions and main influencing factors of in-situ leaching mining, a model and procedure of analytic hierarchy are established for the evaluation and selection of in-situ leaching mining methods based on the analytic hierarchy process (AHP). Taking a uranium mine in Xinjiang, China as an example, the application of this model is presented. The results of the analyses and calculations indicate that acid leaching is the optimum option. (authors)
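
The abstract does not reproduce the AHP calculations; a common way to derive priority weights from a pairwise-comparison matrix is the geometric-mean approximation of the principal eigenvector, sketched here with a hypothetical comparison matrix (the options and judgments are invented for illustration).

```python
def ahp_weights(matrix):
    """Priority weights from an AHP pairwise-comparison matrix via the
    row-geometric-mean approximation of the principal eigenvector.
    Entries follow the usual Saaty 1-9 preference scale.
    """
    n = len(matrix)
    gms = []
    for row in matrix:
        prod = 1.0
        for a in row:
            prod *= a
        gms.append(prod ** (1.0 / n))   # geometric mean of the row
    total = sum(gms)
    return [g / total for g in gms]     # normalize to sum to 1

# Hypothetical comparison of three leaching options on one criterion:
# option 1 moderately preferred (3) over option 2, strongly (5) over 3.
m = [[1.0, 3.0, 5.0],
     [1 / 3.0, 1.0, 3.0],
     [1 / 5.0, 1 / 3.0, 1.0]]
weights = ahp_weights(m)
print([round(w, 3) for w in weights])
```

In a full AHP evaluation these per-criterion weights are combined with criterion weights from a higher-level comparison matrix, and a consistency ratio is checked before accepting the judgments.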

  15. A method for processing the critical zone of a carbonate stratum

    Energy Technology Data Exchange (ETDEWEB)

    Dytyuk, L T; Barsukov, A V; Bragina, O A; Kalabina, A V; Samakayev, R Kh

    1982-01-01

    A method is proposed for processing the critical zone of a carbonate stratum by pumping a carbonate rock solvent into it. It is distinguished by the fact that in order to increase the penetration depth of the solvent into the stratum by reducing the speed of interaction of the solvent, a solution of beta-phenoxyvinylphosphonic acid is pumped into the critical zone of the stratum.

  16. 40 CFR 63.645 - Test methods and procedures for miscellaneous process vents.

    Science.gov (United States)

    2010-07-01

    ... analysis based on accepted chemical engineering principles, measurable process parameters, or physical or... for a new source, Method 18 may be used to determine any non-VOC hydrocarbons that may be deducted to calculate the TOC (minus non-VOC hydrocarbons) concentration and mass flow rate. The following procedures...

  17. Scientific Process Flowchart Assessment (SPFA): A Method for Evaluating Changes in Understanding and Visualization of the Scientific Process in a Multidisciplinary Student Population

    Science.gov (United States)

    Wilson, Kristy J.; Rigakos, Bessie

    2016-01-01

    The scientific process is nonlinear, unpredictable, and ongoing. Assessing the nature of science is difficult with methods that rely on Likert-scale or multiple-choice questions. This study evaluated conceptions about the scientific process using student-created visual representations that we term "flowcharts." The methodology,…

  18. Multiresolution, Geometric, and Learning Methods in Statistical Image Processing, Object Recognition, and Sensor Fusion

    National Research Council Canada - National Science Library

    Willsky, Alan

    2004-01-01

    .... Our research blends methods from several fields-statistics and probability, signal and image processing, mathematical physics, scientific computing, statistical learning theory, and differential...

  19. Application of the AHP method in the partner selection process for supply chain development

    Directory of Open Access Journals (Sweden)

    Barac Nada

    2012-06-01

    Full Text Available The process of developing a supply chain is long and complex, with many restrictions and obstacles accompanying it. In this paper the authors focus on the first stage of developing the supply chain: the process of selecting partners. This phase of development significantly affects the competitive position of the supply chain and the creation of value for the consumer. The selected partners, or 'links', of the supply chain influence its future performance, which points to the necessity of full commitment to this process. The process of partner selection is conditioned by the key criteria used on that occasion. The use of inadequate criteria may endanger the whole process of building a supply chain through a partner selection that is inadequate for future supply chain needs. This paper analyzes partner selection based on the key criteria used by managers in Serbia. For this purpose we used the AHP method. The results show which criteria the managers rank highest.

  20. RE-EDUCATIVE METHOD IN THE PROCESS OF MINIMIZING AUTOAGGRESSIVE BEHAVIOR

    Directory of Open Access Journals (Sweden)

    Nenad GLUMBIC

    1999-05-01

    Full Text Available Autoaggressive behavior is a relatively frequent symptom of the mental and behavioral disturbances that are the subject of professional engagement of clinically oriented defectologists. In the process of rehabilitation numerous methods are used, from behavioral to psychopharmacological ones, by which the above-mentioned problems are eliminated or softened. The paper deals with four children with different diagnoses (autism, disintegrative psychosis, Patau syndrome and amaurosis) that share the same common denominator: mental retardation and autoaggression. By describing the case studies and the ways of working with these children, we have tried to point out a possible application of particular methods of general and special re-education of psychomotorics in the process of minimizing autoaggressive behavior. The paper gives the author's notion of indications for applying the re-educative method within the multihandicapped child population. Defectological treatment discovers new forms of existence in the existential field, not only for the retarded child but also for the therapist. Epistemological consequences of the mentioned transfer are detailed in the paper.

  1. Processing of micro-nano bacterial cellulose with hydrolysis method as a reinforcing bioplastic

    Science.gov (United States)

    Maryam, Maryam; Dedy, Rahmad; Yunizurwan, Yunizurwan

    2017-01-01

    Nanotechnology is the ability to create and manipulate atoms and molecules on the smallest of scales; their size allows such materials to exhibit novel and significantly improved physical, chemical and biological properties, phenomena and processes. The purpose of this research is to obtain micro-nano bacterial cellulose for reinforcing bioplastics. Bacterial cellulose (BC) was grown from coconut water for two weeks, then dried and ground, and purified with 5% NaOH for 6 hours. Micro-nano bacterial cellulose was produced by hydrolysis with hydrochloric acid (HCl) at 3.5 M and 55 °C for 6 hours, followed by spray drying. The hydrolysis process yielded bacterial cellulose particles of approximately 7 μm. The addition of 2% micro-nano bacterial cellulose as reinforcement in a bioplastic composite can improve its physical characteristics.

  2. Processing of micro-nano bacterial cellulose with hydrolysis method as a reinforcing bioplastic

    International Nuclear Information System (INIS)

    Maryam, Maryam; Yunizurwan, Yunizurwan; Dedy, Rahmad

    2017-01-01

    Nanotechnology is the ability to create and manipulate atoms and molecules on the smallest of scales; their size allows such materials to exhibit novel and significantly improved physical, chemical and biological properties, phenomena and processes. The purpose of this research is to obtain micro-nano bacterial cellulose for reinforcing bioplastics. Bacterial cellulose (BC) was grown from coconut water for two weeks, then dried and ground, and purified with 5% NaOH for 6 hours. Micro-nano bacterial cellulose was produced by hydrolysis with hydrochloric acid (HCl) at 3.5 M and 55 °C for 6 hours, followed by spray drying. The hydrolysis process yielded bacterial cellulose particles of approximately 7 μm. The addition of 2% micro-nano bacterial cellulose as reinforcement in a bioplastic composite can improve its physical characteristics. (paper)

  3. Effect of different processing methods on antioxidant activity of underutilized legumes, Entada scandens seed kernel and Canavalia gladiata seeds.

    Science.gov (United States)

    Sasipriya, Gopalakrishnan; Siddhuraju, Perumal

    2012-08-01

    The present study determines the antioxidant activity of raw and processed samples of the underutilized legumes Entada scandens (seed kernel) and Canavalia gladiata (seeds). Indigenous processing methods, namely dry heating, autoclaving, and soaking followed by autoclaving in different solutions (plain water, ash, sugar and sodium bicarbonate), were applied to the seed samples. All processing methods other than dry heating showed significant reductions in phenolics (2.9-63%), tannins (26-100%) and flavonoids (14-67%). However, in processed samples of E. scandens, the hydroxyl radical scavenging activity and β-carotene bleaching inhibition activity increased, whereas 2,2-azinobis(3-ethylbenzothiazoline-6-sulfonic acid) diammonium salt (ABTS·(+)), ferric reducing antioxidant power (FRAP), metal chelating and superoxide anion scavenging activities were similar to those of unprocessed samples. In contrast, except for dry heating in C. gladiata, all other processing methods significantly (P < 0.05) […] processing methods in E. scandens and dry heating in C. gladiata would be suitable for adoption in domestic or industrial processing. Copyright © 2012 Elsevier Ltd. All rights reserved.

  4. Methods of Run-Time Error Detection in Distributed Process Control Software

    DEFF Research Database (Denmark)

    Drejer, N.

    In this thesis, methods for run-time error detection in application software for distributed process control are designed. The error detection is based upon a monitoring approach in which application software is monitored by system software during the entire execution. The thesis includes definition...... and constraint evaluation is designed for the most interesting error types. These include: a) semantic errors in data communicated between application tasks; b) errors in the execution of application tasks; and c) errors in the timing of distributed events emitted by the application software. The design...... of error detection methods includes a high-level software specification. This has the purpose of illustrating that the design can be used in practice....

  5. Considerations on the question of applying ion exchange or reverse osmosis methods in boiler feedwater processing

    International Nuclear Information System (INIS)

    Marquardt, K.; Dengler, H.

    1976-01-01

    This consideration shows that the method of reverse osmosis presents in many cases an interesting and economical alternative to partial and total desalination plants using ion exchangers. The essential advantages of reverse osmosis are a higher degree of automation, no additional salting of the discharged waste water, the small constructional volume of the plant, as well as favourable operational costs with increasing salt content of the raw water to be processed. As there is a relatively high salt breakthrough compared to the ion exchange method, the future tendency in boiler feedwater processing will be more towards a combination of reverse osmosis and post-purification through continuous ion exchange methods. (orig./LH) [de

  6. Enhancement of Efficiency and Reduction of Grid Thickness Variation on Casting Process with Lean Six Sigma Method

    Science.gov (United States)

    Witantyo; Setyawan, David

    2018-03-01

    In the lead-acid battery industry, grid casting is a process with a high level of defects and thickness variation. The DMAIC (Define-Measure-Analyse-Improve-Control) method and its tools were used to improve the casting process. In the Define stage, a project charter and a SIPOC (Supplier Input Process Output Customer) map were used to frame the existing problem. In the Measure stage, data were collected on the types and numbers of defects and on the grid thickness variation; the data were then processed and analyzed using the 5 Whys and FMEA methods. In the Analyze stage, grids exhibiting fragility and cracking defects were examined under a microscope, revealing Pb oxide inclusions in the grid. Analysis of the grid casting process showed too large a temperature difference between the molten metal and the mold, as well as a corking process without a standard. In the Improve stage, corrective actions reduced the grid thickness variation and brought the defect/unit level from 9.184% down to 0.492%. In the Control stage, a new working standard was established and the corrected process was placed under control.

  7. Identification of wastewater treatment processes for nutrient removal on a full-scale WWTP by statistical methods

    DEFF Research Database (Denmark)

    Carstensen, Jakob; Madsen, Henrik; Poulsen, Niels Kjølstad

    1994-01-01

    The introduction of on-line sensors of nutrient salt concentrations on wastewater treatment plants opens a wide new area of modelling wastewater processes. Time series models of these processes are very useful for gaining insight into the real-time operation of wastewater treatment systems, which deal...... of the processes, i.e. including prior knowledge, with the significant effects found in data by using statistical identification methods. Rates of the biochemical and hydraulic processes are identified by statistical methods and the related constants for the biochemical processes are estimated assuming Monod...... kinetics. The models only include those hydraulic and kinetic parameters which have been shown to be significant in a statistical sense, and hence they can be quantified. The application potential of these models is on-line control, because the present state of the plant is given by the variables of the models......
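
    The Monod-kinetics parameter estimation mentioned above can be sketched as a nonlinear least-squares fit of the rate expression r = r_max * S / (K_S + S); the substrate data and "true" parameters below are synthetic illustrations, not plant data:

```python
import numpy as np
from scipy.optimize import curve_fit

def monod(S, r_max, K_S):
    """Monod rate expression: r = r_max * S / (K_S + S)."""
    return r_max * S / (K_S + S)

# Synthetic substrate concentrations (mg/l) and noisy measured rates;
# the "true" parameters r_max=4.0, K_S=1.5 are assumptions for the demo.
rng = np.random.default_rng(0)
S = np.linspace(0.2, 10.0, 25)
r_obs = monod(S, 4.0, 1.5) + rng.normal(0.0, 0.05, S.size)

# Least-squares estimation of the kinetic constants from the "measurements".
(r_max_hat, K_S_hat), _ = curve_fit(monod, S, r_obs, p0=(1.0, 1.0))
print(round(r_max_hat, 2), round(K_S_hat, 2))
```

    In an on-line setting the same fit would be updated recursively as new sensor readings arrive, so the model state tracks the current condition of the plant.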

  8. Quantitative evaluation method of the bubble structure of sponge cake by using morphology image processing

    Science.gov (United States)

    Tatebe, Hironobu; Kato, Kunihito; Yamamoto, Kazuhiko; Katsuta, Yukio; Nonaka, Masahiko

    2005-12-01

    Nowadays, many evaluation methods using image processing are proposed for the food industry. These methods are becoming a new class of evaluation alongside the sensory tests and physical measurements used for quality evaluation; an advantage of image processing is that it evaluates objectively. The goal of our research is the structure evaluation of sponge cake using image processing. In this paper, we propose a feature-extraction method for the bubble structure of sponge cake. Analysis of the bubble structure is one of the important ways to understand the characteristics of the cake from an image. To acquire the cake images, we first cut the cakes and scanned their surfaces with a CIS scanner. Because the depth of field of this type of scanner is very shallow, the bubble regions of the surface have low grey-scale values and appear blurred. We extracted bubble regions from the surface images based on these features: the input image is binarized, and the bubble features are extracted by morphological analysis. To evaluate the feature-extraction result, we compared its correlation with the "size of the bubble" score from a sensory test. The results show that bubble extraction using morphological analysis correlates well with the sensory score, indicating that our method performs as well as subjective evaluation.
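
    The binarize-then-morphology pipeline described above can be sketched as follows; the "cake surface" image is synthetic, and the threshold and structuring element are illustrative assumptions:

```python
import numpy as np
from scipy import ndimage

# Synthetic grey-scale "cake surface": bright matrix with dark circular
# bubbles (the scanner renders out-of-focus bubble regions dark and blurred).
img = np.full((120, 120), 200, dtype=float)
yy, xx = np.mgrid[0:120, 0:120]
for cy, cx, rad in [(30, 30, 8), (80, 40, 12), (60, 95, 6)]:
    img[(yy - cy) ** 2 + (xx - cx) ** 2 < rad ** 2] = 60.0
img = ndimage.gaussian_filter(img, 1.5)          # mimic the scanner blur

# 1) binarize: bubbles are the low-intensity regions
binary = img < 130
# 2) morphological opening removes speckle smaller than the structuring element
opened = ndimage.binary_opening(binary, structure=np.ones((3, 3)))
# 3) label connected components and measure bubble sizes (pixel counts)
labels, n_bubbles = ndimage.label(opened)
sizes = ndimage.sum(opened, labels, range(1, n_bubbles + 1))
print(n_bubbles, sorted(sizes))
```

    On real scans the bubble-size statistics extracted this way are what get correlated against the sensory-test scores.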

  9. A Method for Identifying Contours in Processing Digital Images from Computer Tomograph

    Science.gov (United States)

    Roşu, Şerban; Pater, Flavius; Costea, Dan; Munteanu, Mihnea; Roşu, Doina; Fratila, Mihaela

    2011-09-01

    The first step in digital processing of two-dimensional computed tomography images is to identify the contour of component elements. This paper deals with the collective work of specialists in medicine and applied mathematics in computer science on elaborating new algorithms and methods in medical 2D and 3D imagery.

  10. Stress strain modelling of casting processes in the framework of the control volume method

    DEFF Research Database (Denmark)

    Hattel, Jesper; Andersen, Søren; Thorborg, Jesper

    1998-01-01

    Realistic computer simulations of casting processes call for the solution of thermal, fluid-flow and stress/strain related problems. The multitude of influencing parameters, and their non-linear, transient and temperature-dependent nature, make the calculations complex. Therefore the need for fast, flexible, multidimensional numerical methods is obvious. The basis of the deformation and stress/strain calculation is a transient heat transfer analysis including solidification. This paper presents an approach where the stress/strain and the heat transfer analysis use the same computational domain, which is highly convenient. The basis of the method is the control volume finite difference approach on structured meshes. The basic assumptions of the method are shortly reviewed and discussed. As for other methods which aim at application-oriented analysis of casting deformations and stresses, the present model is based on the mainly decoupled representation of the thermal, mechanical and microstructural processes. Examples of industrial applications, such as predicting residual deformations in castings and stress levels in die casting dies, are presented.

  11. Biogenic amine profile in unripe Arabica coffee beans processed according to dry and wet methods.

    Science.gov (United States)

    Dias, Eduardo C; Pereira, Rosemary G F A; Borém, Flávio M; Mendes, Eulália; de Lima, Renato R; Fernandes, José O; Casal, Susana

    2012-04-25

    Immature coffee fruit processing contributes a high proportion of defective beans, which accounts for a significant amount of the low-quality coffee sold on the Brazilian internal market. Unripe bean processing was tested, taking the levels of bioactive amines as criteria for evaluating the extent of fermentation and establishing the differences between processing methods. The beans were processed by the dry method after being mechanically depulped, either immediately after harvest or after a 12 h resting period in a dry pile or immersed in water. Seven bioactive amines were quantified: putrescine, spermine, spermidine, serotonin, cadaverine, histamine, and tyramine, with total amounts ranging from 71.8 to 80.3 mg/kg. The levels of spermine and spermidine were lower in the unripe depulped coffee than in the natural coffee. The specific conditions of dry and wet processing also influenced cadaverine levels, and histamine was reduced in unripe depulped coffee. A resting period of 12 h does not induce significant alteration in the beans, and results improve if the rest is performed in water. These results confirm that peeling immature coffee can decrease fermentation while providing more uniform drying, thus reducing the number of defects and potentially increasing beverage quality.

  12. Validation of Alternative In Vitro Methods to Animal Testing: Concepts, Challenges, Processes and Tools.

    Science.gov (United States)

    Griesinger, Claudius; Desprez, Bertrand; Coecke, Sandra; Casey, Warren; Zuang, Valérie

    This chapter explores the concepts, processes, tools and challenges relating to the validation of alternative methods for toxicity and safety testing. In general terms, validation is the process of assessing the appropriateness and usefulness of a tool for its intended purpose. Validation is routinely used in various contexts in science, technology, the manufacturing and services sectors. It serves to assess the fitness-for-purpose of devices, systems, software up to entire methodologies. In the area of toxicity testing, validation plays an indispensable role: "alternative approaches" are increasingly replacing animal models as predictive tools and it needs to be demonstrated that these novel methods are fit for purpose. Alternative approaches include in vitro test methods, non-testing approaches such as predictive computer models up to entire testing and assessment strategies composed of method suites, data sources and decision-aiding tools. Data generated with alternative approaches are ultimately used for decision-making on public health and the protection of the environment. It is therefore essential that the underlying methods and methodologies are thoroughly characterised, assessed and transparently documented through validation studies involving impartial actors. Importantly, validation serves as a filter to ensure that only test methods able to produce data that help to address legislative requirements (e.g. EU's REACH legislation) are accepted as official testing tools and, owing to the globalisation of markets, recognised on international level (e.g. through inclusion in OECD test guidelines). Since validation creates a credible and transparent evidence base on test methods, it provides a quality stamp, supporting companies developing and marketing alternative methods and creating considerable business opportunities. Validation of alternative methods is conducted through scientific studies assessing two key hypotheses, reliability and relevance of the

  13. Off-flavor related volatiles in soymilk as affected by soybean variety, grinding, and heat-processing methods.

    Science.gov (United States)

    Zhang, Yan; Guo, Shuntang; Liu, Zhisheng; Chang, Sam K C

    2012-08-01

    Off-flavor of soymilk is a barrier to consumer acceptance. The objectionable soy odor can be reduced by inhibiting its formation or by removing it after it has formed. In this study, soymilk was prepared by three grinding methods (ambient, cold, and hot grinding) from two varieties (yellow Prosoy and a black soybean) before undergoing three heating processes: stove cooking, one-phase UHT (ultrahigh temperature), and two-phase UHT using a Microthermics direct-injection processor equipped with a vacuuming step to remove the injected water and volatiles. Eight typical soy odor compounds generated by lipid oxidation were extracted by a solid-phase microextraction method and analyzed by gas chromatography. The results showed that hot and cold grinding significantly reduced off-flavor compared with ambient grinding, with hot grinding achieving the best result. The UHT methods, especially the two-phase UHT method, were effective in reducing soy odor. Different odor compounds showed distinct concentration patterns because of their different formation mechanisms, and the two varieties behaved differently in odor formation during the soymilk-making process. Most odor compounds could be reduced to below the detection limit through a combination of hot grinding and two-phase UHT processing. However, hot grinding gave lower solid and protein recoveries in soymilk.

  14. Evaluation of methods for retention of radioiodine during processing of irradiated 237Np

    International Nuclear Information System (INIS)

    Thompson, G.H.; Kelley, J.A.

    1975-06-01

    Methods of removing radioiodine from 237 Np-- 238 Pu dissolver solution and process off-gas were investigated. This program is part of a continuing effort to reduce releases of radionuclides from plant operations. Experimental data show: Greater than 99.9 percent of the radioiodine in dissolver solution can be removed by precipitation, in situ, of manganese dioxide. Silver zeolite will sorb greater than 99.9 percent of radioiodine in process off-gas. Other solid sorbents and nitric acid-mercuric nitrate scrubber solutions do not remove appreciable amounts of radioiodine from process off-gas, because radioiodine is present principally as relatively unreactive organic iodine compounds. (U.S.)

  15. Emerging non-invasive Raman methods in process control and forensic applications.

    Science.gov (United States)

    Macleod, Neil A; Matousek, Pavel

    2008-10-01

    This article reviews emerging Raman techniques (Spatially Offset and Transmission Raman Spectroscopy) for non-invasive, sub-surface probing in process control and forensic applications. New capabilities offered by these methods are discussed and several application examples are given including the non-invasive detection of counterfeit drugs through blister packs and opaque plastic bottles and the rapid quantitative analysis of the bulk content of pharmaceutical tablets and capsules without sub-sampling.

  16. Analysis of Off Gas From Disintegration Process of Graphite Matrix by Electrochemical Method

    International Nuclear Information System (INIS)

    Tian Lifang; Wen Mingfen; Chen Jing

    2010-01-01

    Using an electrochemical method with salt solutions as electrolyte, some gaseous substances (off gas) are generated during the disintegration of graphite from high-temperature gas-cooled reactor fuel elements. The off gas is determined by gas chromatography to be composed of H 2 , O 2 , N 2 , CO 2 and NO x . Only about 1.5% of the graphite matrix is oxidized to CO 2 . Compared to the direct graphite-burning method, less off gas, especially CO 2 , is generated in the electrochemical disintegration of graphite, and the treatment of the off gas becomes much easier. (authors)

  17. CASE METHOD AS A MEANS TO INTENSIFY THE EDUCATIONAL PROCESS OF INTENDING ECONOMISTS

    Directory of Open Access Journals (Sweden)

    Олександр Вашків

    2014-04-01

    Full Text Available The article presents practical aspects of using the case method in the process of researching the social responsibility of business, and gives a detailed insight into a situational exercise at the level of its constituent elements. The case method is characterized in terms of its peculiarities and advantages in comparison to traditional teaching methods. The article also presents the scenario of a situational exercise, «Our daily bread», prepared as a result of the authors' participation in the state programme «the Ukrainian Initiative», which included training at enterprises in Germany.

  18. Standard-Setting Methods as Measurement Processes

    Science.gov (United States)

    Nichols, Paul; Twing, Jon; Mueller, Canda D.; O'Malley, Kimberly

    2010-01-01

    Some writers in the measurement literature have been skeptical of the meaningfulness of achievement standards and described the standard-setting process as blatantly arbitrary. We argue that standard setting is more appropriately conceived of as a measurement process similar to student assessment. The construct being measured is the panelists'…

  19. Towards a New Approach of the Economic Intelligence Process: Basic Concepts, Analysis Methods and Informational Tools

    Directory of Open Access Journals (Sweden)

    Sorin Briciu

    2009-04-01

    Full Text Available One of the obvious trends in the current business environment is increased competition. In this context, organizations are becoming more and more aware of the importance of knowledge as a key factor in obtaining competitive advantage. A possible solution in knowledge management is Economic Intelligence (EI), which involves the collection, evaluation, processing, analysis, and dissemination of economic data (about products, clients, competitors, etc.) inside organizations. The availability of massive quantities of data, correlated with advances in information and communication technology allowing for the filtering and processing of these data, provides new tools for the production of economic intelligence. The research is focused on innovative aspects of the economic intelligence process (models of analysis, activities, methods and informational tools) and provides practical guidelines for initiating this process. In this paper, we try: (a) to contribute to a coherent view on the economic intelligence process (approaches, stages, fields of application); (b) to describe the most important models of analysis related to this process; (c) to analyze the activities, methods and tools associated with each stage of an EI process.

  20. Comparison of Chemical Constituents in Scrophulariae Radix Processed by Different Methods based on UFLC-MS Combined with Multivariate Statistical Analysis.

    Science.gov (United States)

    Wang, Shengnan; Hua, Yujiao; Zou, Lisi; Liu, Xunhong; Yan, Ying; Zhao, Hui; Luo, Yiyuan; Liu, Juanxiu

    2018-02-01

    Scrophulariae Radix is one of the most popular traditional Chinese medicines (TCMs). Primary processing of Scrophulariae Radix is an important step that is closely related to the quality of products of this TCM. The aim of this study is to explore the influence of different processing methods on the chemical constituents of Scrophulariae Radix. The differences in chemical constituents among Scrophulariae Radix samples processed by different methods were analyzed by ultra-fast liquid chromatography-triple quadrupole-time-of-flight mass spectrometry coupled with principal component analysis and orthogonal partial least squares discriminant analysis. Furthermore, the contents of 12 differential index constituents in Scrophulariae Radix processed by different methods were simultaneously determined by ultra-fast liquid chromatography coupled with triple quadrupole-linear ion trap mass spectrometry. Gray relational analysis was performed to evaluate the different processed samples according to the contents of the 12 constituents. All of the results demonstrated that the quality of Scrophulariae Radix processed by the "sweating" method was better. This study provides basic information for revealing how the chemical constituents of Scrophulariae Radix change with processing method and for selecting the most suitable processing method for this TCM. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
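
    Gray relational analysis, used above to rank the processed samples, scores each sample by its closeness to an ideal reference sequence. A minimal sketch follows; the constituent contents are invented for illustration and are not the paper's data:

```python
import numpy as np

# Hypothetical contents of four index constituents in three processed
# samples (rows); values are illustrative, not the paper's measurements.
X = np.array([
    [2.1, 0.8, 5.0, 1.2],   # "sweating" method
    [1.7, 0.6, 4.1, 0.9],   # alternative processing A
    [1.5, 0.5, 3.8, 0.8],   # alternative processing B
])

# Normalise each constituent (larger-is-better) to [0, 1].
Xn = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
ref = Xn.max(axis=0)                     # ideal reference sequence

# Grey relational coefficients with distinguishing coefficient rho = 0.5.
delta = np.abs(ref - Xn)
rho = 0.5
coef = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
grades = coef.mean(axis=1)               # grey relational grade per sample
print(np.round(grades, 3))
```

    The sample with the highest grey relational grade is the one closest to the ideal constituent profile, which is how the "sweating" method was singled out in the study.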

  1. A measurement method for micro 3D shape based on grids-processing and stereovision technology

    International Nuclear Information System (INIS)

    Li, Chuanwei; Xie, Huimin; Liu, Zhanwei

    2013-01-01

    An integrated measurement method for micro 3D surface shape by a combination of stereovision technology in a scanning electron microscope (SEM) and grids-processing methodology is proposed. The principle of the proposed method is introduced in detail. By capturing two images of the tested specimen with grids on the surface at different tilt angles in an SEM, the 3D surface shape of the specimen can be obtained. Numerical simulation is applied to analyze the feasibility of the proposed method. A validation experiment is performed here. The surface shape of the metal-wire/polymer-membrane structures with thermal deformation is reconstructed. By processing the surface grids of the specimen, the out-of-plane displacement field of the specimen surface is also obtained. Compared with the measurement results obtained by a 3D digital microscope, the experimental error of the proposed method is discussed (paper)
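
    The tilt-based height recovery underlying SEM stereo imaging can be illustrated with the standard eucentric-tilt parallax relation z = (x1 - x2) / (2 sin(theta/2)); this is a textbook sketch under idealised parallel-projection assumptions, not the authors' full grid-processing pipeline:

```python
import numpy as np

def height_from_parallax(x1, x2, theta_deg):
    """Height of a surface point from its positions x1, x2 (measured
    perpendicular to the tilt axis) in two views tilted by +/- theta/2."""
    theta = np.radians(theta_deg)
    return (x1 - x2) / (2.0 * np.sin(theta / 2.0))

# Forward-project a known height to verify the round trip; the height
# and tilt angle are illustrative.
z_true, theta_deg = 5.0, 10.0
half = np.radians(theta_deg) / 2.0
x1, x2 = z_true * np.sin(half), -z_true * np.sin(half)
print(round(height_from_parallax(x1, x2, theta_deg), 6))
```

    Repeating this for every grid intersection on the specimen surface yields the 3D surface shape, and differencing the grid positions before and after loading gives the displacement field.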

  2. Application of risk analysis and quality control methods for improvement of lead molding process

    Directory of Open Access Journals (Sweden)

    H. Gołaś

    2016-10-01

    Full Text Available The aim of the paper is to highlight the significance of applying risk analysis and quality control methods for the improvement of the parameters of a lead molding process. For this reason, a Failure Mode and Effect Analysis (FMEA) was developed in the conceptual stage of a new product, TC-G100-NR. However, the final product was faulty (a complete lack of adhesion of the brass insert to the lead) despite the previously defined potential problem and its preventive action. This led to the recognition of root causes, corrective actions and changes to production parameters, and showed how these methods, the level of their organization, and systematic, rigorous study affect molding process parameters.

  3. Methods for processing and analysis functional and anatomical brain images: computerized tomography, emission tomography and nuclear resonance imaging

    International Nuclear Information System (INIS)

    Mazoyer, B.M.

    1988-01-01

    The various methods for brain image processing and analysis are presented and compared. The following topics are developed: the physical basis of brain image comparison (nature and formation of signals, intrinsic performance of the methods, image characteristics); mathematical methods for image processing and analysis (filtering, functional parameter extraction, morphological analysis, robotics and artificial intelligence); methods for anatomical localization (neuro-anatomy atlas, proportional stereotaxic atlas, digitized atlas); methodology of cerebral image superposition (normalization, registration). [fr

  4. Data warehousing methods and processing infrastructure for brain recovery research.

    Science.gov (United States)

    Gee, T; Kenny, S; Price, C J; Seghier, M L; Small, S L; Leff, A P; Pacurar, A; Strother, S C

    2010-09-01

    In order to accelerate translational neuroscience with the goal of improving clinical care it has become important to support rapid accumulation and analysis of large, heterogeneous neuroimaging samples and their metadata from both normal control and patient groups. We propose a multi-centre, multinational approach to accelerate the data mining of large samples and facilitate data-led clinical translation of neuroimaging results in stroke. Such data-driven approaches are likely to have an early impact on clinically relevant brain recovery while we simultaneously pursue the much more challenging model-based approaches that depend on a deep understanding of the complex neural circuitry and physiological processes that support brain function and recovery. We present a brief overview of three (potentially converging) approaches to neuroimaging data warehousing and processing that aim to support these diverse methods for facilitating prediction of cognitive and behavioral recovery after stroke, or other types of brain injury or disease.

  5. PGAA method for control of the technologically important elements at processing of sulfide ores

    International Nuclear Information System (INIS)

    Kurbanov, B.I.; Aripov, G.A.; Allamuratova, G.; Umaraliev, M.

    2006-01-01

    Full text: Many precious elements (Au, Re, Pt, Pd, Ag, Cu, Ni, Co, Mo) in ores exist mainly in the form of sulfide minerals, and the flotation method is often used for processing such ores. To enhance the efficiency of the process it is very important to carry out operative control of the elements of interest at the various stages of ore processing. This work presents the results of studies to develop methods for controlling technologically important elements during the processing and enrichment of sulfide ores containing gold, copper, nickel and molybdenum at the ore-processing plants of Uzbekistan. A transportable experimental PGAA device based on a low-power radionuclide neutron source ( 252 Cf) emitting 2x10 7 neutrons/sec, allowing determination of the element content of the above-named ores and their processing products, is described. It is shown that the use of thermal neutron capture gamma-ray spectrometry on real samples and technological products allows prompt determination of elements such as S, Cu, Ti and others, which are important for the flotation of sulfide ores. Efficiency control of the flotation processing of sulfide ores is based on quick determination of the content of sulfur and some other important elements at different stages of the process. It was found that the most suitable gamma lines for determining these elements are 840.3 keV for sulfur, 609 keV and 7307 keV for copper, and 1381.5 keV, 1498.3 keV and 1585.3 keV for titanium. Based on measurements of original ores, concentrates from various stages of flotation and flotation slime, the possibility of promptly determining S, Cu and Ti content, and thus obtaining the necessary information on the efficiency of the flotation process, was demonstrated. (author)
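
    Assigning measured gamma peaks to the capture lines quoted above reduces to a tolerance search over an element-line table; the peak list in the example is hypothetical:

```python
# Capture lines quoted in the abstract, in keV.
LINES = {
    "S": [840.3],
    "Cu": [609.0, 7307.0],
    "Ti": [1381.5, 1498.3, 1585.3],
}

def identify(peaks_keV, tol=2.0):
    """Return the elements whose listed lines match a measured peak within tol keV."""
    found = set()
    for element, lines in LINES.items():
        for line in lines:
            if any(abs(p - line) <= tol for p in peaks_keV):
                found.add(element)
    return sorted(found)

# Hypothetical peak list from a flotation-concentrate spectrum;
# 511 keV (annihilation) matches no listed capture line.
print(identify([840.1, 1381.9, 511.0]))
```

    In practice the matching tolerance would be set from the detector's energy resolution at each line.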

  6. Development and application of a probabilistic evaluation method for advanced process technologies. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Frey, H.C.; Rubin, E.S.

    1991-04-01

    The objective of this work is to develop and apply a method for research planning for advanced process technologies. To satisfy requirements for research planning, it is necessary to: (1) identify robust solutions to process design questions in the face of uncertainty to eliminate inferior design options; (2) identify key problem areas in a technology that should be the focus of further research to reduce the risk of technology failure; (3) compare competing technologies on a consistent basis to determine the risks associated with adopting a new technology; and (4) evaluate the effects that additional research might have on comparisons with conventional technology. An important class of process technologies is electric power plants. In particular, advanced clean coal technologies are expected to play a key role in the energy and environmental future of the US, as well as in other countries. Research planning for advanced clean coal technology development is an important part of energy and environmental policy. Thus, the research planning method developed here is applied to case studies focusing on a specific clean coal technology. The purpose of the case studies is both to demonstrate the research planning method and to obtain technology-specific conclusions regarding research strategies.

  7. Ductile cast iron obtaining by Inmold method with use of LOST FOAM process

    Directory of Open Access Journals (Sweden)

    T. Pacyniak

    2010-01-01

    Full Text Available The possibility of manufacturing ductile cast iron castings by the Inmold method using the LOST FOAM process is presented in this work. Spheroidization was carried out with a magnesium master alloy in an amount of 1% of the casting mass. The nodulizer was located in the reaction chamber in the gating system, made of foamed polystyrene. Preliminary tests showed that it is technically possible to manufacture ductile cast iron castings in the LOST FOAM process with spheroidization in the mould.

  8. Method of processing liquid waste containing fission product

    International Nuclear Information System (INIS)

    Funabashi, Kiyomi; Kawamura, Fumio; Matsuda, Masami; Komori, Itaru; Miura, Eiichi.

    1988-01-01

    Purpose: To prepare solidification products of low surface dose by removing cesium, the main radioactive nuclide in liquid wastes from reprocessing plants. Method: Liquid wastes containing a great amount of fission products are generated during the reprocessing of spent nuclear fuels. After pH adjustment, the liquid wastes are sent to a concentrator to concentrate the dissolved ingredients. The concentrated liquid wastes are pumped to an adsorption tower, in which the radioactive cesium that contributes most to the surface dose is removed. The liquid wastes are then sent by way of a surge tank to a mixing tank, in which they are mixed under stirring with solidifying agents such as cements. The mixture is then placed in a drum and solidified. According to this invention, since the radioactive cesium is removed before solidification, solidification products of low surface dose can be prepared, facilitating their handling. (Horiuchi, T.)

  9. A new method for recovery of cellulose from lignocellulosic bio-waste: Pile processing.

    Science.gov (United States)

    Tezcan, Erdem; Atıcı, Oya Galioğlu

    2017-12-01

    This paper presents a new delignification method (pile processing) for the recovery of cellulose from lignocellulosic bio-wastes, adapted from heap leaching technology in metallurgy. The method is based on stacking cellulosic material in a pile, irrigating the pile from the top with an aqueous reactive solution that removes lignin and hemicellulose and enriches the cellulose as it percolates through to the bottom of the pile, recirculating the reactive solution after adjusting parameters such as chemical concentrations, and allowing the system to run until the desired time or cellulose purity is reached. Laboratory-scale systems were designed using fall leaves (FL) as the lignocellulosic waste material. The ideal condition for FL was found to be the addition of 0.1 g of solid NaOH per gram of FL to the irrigating solution, producing an instant increase in pH to about 13.8, followed by a self-decrease in pH due to delignification down to 13.0, at which point another solid NaOH addition was performed. The new method achieved enrichment of cellulose from 30% to 81% and removal of 84% of the lignin that prevents industrial application of lignocellulosic bio-waste, using a total of 0.3 g NaOH and 4 mL of water per gram of FL at ambient temperature and pressure. Stirred reactions used instead of pile processing required the same amount of NaOH but needed at least 12 mL of water, and delignification was only 56.1%. Owing to its high delignification performance using common, odorless chemicals and simple equipment under mild conditions, the pile processing method holds great promise for the industrial exploitation of lignocellulosic bio-waste. Copyright © 2017 Elsevier Ltd. All rights reserved.
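The pH-triggered dosing rule described above (charge NaOH to reach pH ~13.8, let delignification pull the pH down to 13.0, then recharge) can be illustrated with a toy simulation; the per-step pH decay rate below is purely an assumption for illustration, not a measured value:

```python
# Toy simulation of the abstract's dosing rule: each NaOH charge lifts
# the liquor to pH ~13.8; alkali consumption by delignification lowers
# the pH until 13.0, which triggers the next charge. The decay of
# 0.1 pH units per time step is illustrative only.
PH_HIGH, PH_LOW = 13.8, 13.0

def simulate_dosing(steps, decay=0.1):
    """Return (final pH, number of NaOH charges) after `steps` time steps."""
    ph, charges = PH_HIGH, 1       # initial charge brings pH to 13.8
    for _ in range(steps):
        ph = round(ph - decay, 10) # alkali consumed by delignification
        if ph <= PH_LOW:
            ph = PH_HIGH           # solid NaOH added, pH restored
            charges += 1
    return round(ph, 2), charges
```

With these made-up numbers, one recharge occurs every eight time steps; the real trigger in the paper is the measured pH of the recirculating liquor.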

  10. Contamination control methods for gases used in the microlithography process

    Science.gov (United States)

    Rabellino, Larry; Applegarth, Chuck; Vergani, Giorgio

    2002-07-01

    Sensitivity to contamination continues to increase as the technology shrinks from 365 nm I-line lamp illumination to 13.4 nm extreme-ultraviolet laser-activated plasma. Gas-borne impurities can be readily distributed within the system, remaining both suspended in the gas and attached to critical surfaces. Effects from a variety of contaminants, some well characterized and others not, remain a continuing obstacle for stepper manufacturers and users. Impurities such as oxygen, moisture and hydrocarbons at parts-per-billion levels can absorb light, reducing the light intensity and subsequently the consistency of the process. Moisture, sulfur compounds, ammonia, acid compounds and organic compounds such as hydrocarbons can deposit on lens or mirror surfaces, affecting image quality. Regular lens replacement or removal for cleaning is a costly option, and in-situ cleaning processes must be carefully managed to avoid recontamination of the system. The contamination can come from outside the controlled environment (local gas supply, piping system and leaks) or from the materials moving into the controlled environment; contamination may also be generated inside the controlled environment as a result of the process itself. The release of amines can occur as a result of the degassing of the photoresists. For the manufacturer and user of stepper equipment, the challenge lies not in predictable contamination, but in variable or unpredictable contamination in the process. One type of unpredictable contamination is variation in the environmental conditions when producing the nitrogen gas and Clean Dry Air (CDA). Variation in the CDA, nitrogen and xenon may range from parts per billion to parts per million. The risk due to uncontrolled or unmonitored variation in gas quality can be directly related to product defects.
Global location can significantly affect the gas quality, due to the ambient air quality (for nitrogen and CDA), production methods, gas handling equipment

  11. Analyses of Methods and Algorithms for Modelling and Optimization of Biotechnological Processes

    Directory of Open Access Journals (Sweden)

    Stoyan Stoyanov

    2009-08-01

    Full Text Available A review of the problems in the modeling, optimization and control of biotechnological processes and systems is given in this paper. An analysis of existing and some new practical optimization methods for global optimum search, based on various advanced strategies (heuristic, stochastic, genetic and combined), is presented. Methods based on sensitivity theory, and on stochastic and mixed strategies for optimization with partial knowledge of the kinetic, technical and economic parameters of the optimization problems, are discussed. Several approaches to multi-criteria optimization tasks are analyzed. Problems concerning the optimal control of biotechnological systems are also discussed.

  12. An application of business process method to the clinical efficiency of hospital.

    Science.gov (United States)

    Leu, Jun-Der; Huang, Yu-Tsung

    2011-06-01

    The concept of Total Quality Management (TQM) has come to be applied in healthcare over the last few years. The process management category in the Baldrige Health Care Criteria for Performance Excellence model is designed to evaluate the quality of medical services. However, a systematic approach to implementation support is necessary to achieve excellence in the healthcare business process. The Architecture of Integrated Information Systems (ARIS) is a business process architecture developed by IDS Scheer AG that has been applied in a variety of industrial applications. It starts with a business strategy to identify the core and support processes, and encompasses the whole life-cycle range, from business process design to information system deployment, which is compatible with the concept of the healthcare performance excellence criteria. In this research, we apply the basic ARIS framework to optimize the clinical processes of an emergency department in a mid-size hospital with 300 clinical beds, while considering the characteristics of the healthcare organization. Implementation of the case is described, and 16 months of clinical data are then collected and used to study the performance and feasibility of the method. The experience gleaned in this case study can be used as a reference for mid-size hospitals with similar business models.

  13. Comparison of changes in vertical dimension of the upper and lower complete dentures processed using two investing methods

    International Nuclear Information System (INIS)

    Kharat, D.U.; Fakiha, Z.

    1990-01-01

    A standardized compression molding technique was used to process 14 sets of complete dentures. Seven sets were invested by the conventional method and the other seven by a modified method, in which the second layer of the investment extended occlusally only up to the maximum convexity on the labial/buccal and lingual surfaces of the teeth. First, only the upper or the lower denture was processed and the change in the vertical dimension of occlusion was measured. The other denture was then processed and the increase in the vertical dimension of occlusion of the set of dentures was measured. All measurements were made at the incisal guide pin using a leaf gauge. Statistical analysis using the t-test showed no difference in the changes in vertical dimension between dentures processed by the two investing methods. However, both investing methods produced a significantly greater increase in the vertical dimension of the upper complete dentures than of the lower complete dentures. (author)

  14. Simulation of optical configurations and signal processing methods in Anger-type neutron-position scintillation detector

    International Nuclear Information System (INIS)

    Roche, C.T.; Strauss, M.G.; Brenner, R.

    1984-01-01

    The spatial linearity and resolution of Anger-type neutron position scintillation detectors are studied using a semi-empirical model. Detector optics with either an air gap or optical grease between the scintillator and the dispersive light guide are considered. Three signal processing methods that truncate signals from PMTs distant from the scintillation are compared with the linear resistive weighting method. With linear processing, air-gap optics yields a 15% improvement in spatial resolution and a 50% reduction in differential and integral nonlinearity relative to grease-coupled optics. Using signal truncation instead of linear processing improves the resolution by 15-20% for the air gap and by 20-30% for the grease coupling. The initial discrepancy in resolution between the two optics thus nearly vanishes; however, the linearity of the grease-coupled system remains significantly poorer.
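The contrast between linear resistive weighting and signal truncation can be illustrated with a toy centroid calculation. The one-dimensional PMT layout, amplitudes and 10% truncation threshold below are illustrative assumptions, not values from the study:

```python
# Minimal sketch of Anger-logic position estimation. `signals[i]` is
# the pulse amplitude of the PMT centred at position `pos[i]`
# (arbitrary units); the 10% truncation fraction is an assumption.
def centroid(pos, signals):
    """Linear resistive weighting: signal-weighted mean position."""
    total = sum(signals)
    return sum(p * s for p, s in zip(pos, signals)) / total

def truncated_centroid(pos, signals, frac=0.10):
    """Zero the PMTs below `frac` of the peak before taking the centroid,
    suppressing the distant tubes that degrade linearity and resolution."""
    cut = frac * max(signals)
    kept = [s if s >= cut else 0.0 for s in signals]
    return centroid(pos, kept)
```

For a symmetric light distribution both estimators agree; with an asymmetric tail the truncated version pulls the estimate toward the true interaction point by discarding the weak distant signals.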

  15. A method for automatic control of the process of producing electrode pitch

    Energy Technology Data Exchange (ETDEWEB)

    Rozenman, E.S.; Bugaysen, I.M.; Chernyshov, Yu.A.; Klyusa, M.D.; Krysin, V.P.; Livshits, B.Ya.; Martynenko, V.V.; Meniovich, B.I.; Sklyar, M.G.; Voytenko, B.I.

    1983-01-01

    A method is proposed for automatic control of the process for producing electrode pitch through regulation of the feed of the starting raw material, with correction based on the pitch level in the last apparatus of the technological line, and through changes in the feed of air into the reactors based on the flow rate of the starting raw material and the temperature of the liquid phase in the reactors. To increase the stability of the quality of the electrode pitch under changes in the properties of the starting resin, the heating temperature of the dehydrated resin in the pipe furnace is regulated relative to the quality of the medium-temperature pitch produced from it, while the level of the liquid phase in the reactor is regulated relative to the quality of the final product. The proposed method improves the quality of process regulation, which makes it possible to improve the properties of the anode mass and to reduce its consumption in the production of aluminum.

  16. Method of processing radioactive nuclide-containing liquids

    International Nuclear Information System (INIS)

    Hirai, Masahide; Tomoshige, Shozo; Kondo, Kozo; Suzuki, Kazunori; Todo, Fukuzo; Yamanaka, Akihiro.

    1985-01-01

    Purpose: To solidify radioactive nuclides into a much more compact state and facilitate their storage. Method: Liquid wastes such as drain liquids generated from a nuclear power plant at low concentrations of 1×10⁻⁶-10⁻⁴ μCi/ml are first brought into contact with a chelate-type ion exchange resin, such as a phenolic resin, to adsorb the radioactive nuclides on the resin; the nuclides are then eluted with sulfuric acid or the like to obtain liquid concentrates. The liquid concentrates are electrolyzed in an ordinary electrolytic facility using platinum or the like as the anode and Al or the like as the cathode, in the presence of 1-20 g/l of non-radioactive heavy metals such as Co and Ni in the liquid, while adjusting the pH to 2-8. The electrolysis liquid residue is returned to the electrolysis tank, either as it is or in the form of precipitates coagulated with a polymeric flocculant. The supernatant liquid from the flocculation treatment is processed with the chelate-type ion exchange resin into a hazardless liquid. (Sekiya, K.)

  17. On the selection of optimized carbon nano tube synthesis method using analytic hierarchy process

    International Nuclear Information System (INIS)

    Besharati, M. K.; Afaghi Khatibi, A.; Akbari, M.

    2008-01-01

    Evidence from the early and late industrializers shows that technology, as the commercial application of scientific knowledge, has been a major driver of industrial and economic development. International technology transfer is now recognized as having played an important role in the development of the most successful late industrializers of the second half of the twentieth century. Our society stands to be significantly influenced by carbon nanotubes, shaped by nanotube applications in every aspect, just as silicon-based technology still shapes society today. Nanotubes can be formed in various structures using several different processing methods. In this paper, the synthesis methods used to produce nanotubes on industrial or laboratory scales are discussed and compared. A technical feasibility study is conducted using a multi-criteria decision-making model, namely the Analytic Hierarchy Process. The article ends with a discussion of selecting the best method for the technology transfer of carbon nanotubes to Iran.
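The Analytic Hierarchy Process step mentioned above can be sketched with the common geometric-mean approximation to the priority eigenvector. The 3×3 pairwise comparison matrix in the test is a made-up example, not data from the paper:

```python
# Minimal AHP sketch: derive priority weights from a reciprocal
# pairwise-comparison matrix via the geometric-mean approximation
# to the principal eigenvector (exact for consistent matrices).
from math import prod

def ahp_weights(matrix):
    """Return normalised priority weights, one per row of the matrix."""
    n = len(matrix)
    geo = [prod(row) ** (1.0 / n) for row in matrix]  # row geometric means
    total = sum(geo)
    return [g / total for g in geo]                   # normalise to sum 1
```

A full AHP study would also compute the consistency ratio of the matrix; that check is omitted here for brevity.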

  18. High-efficient method for spectrometric data real time processing with increased resolution of a measuring channel

    International Nuclear Information System (INIS)

    Ashkinaze, S.I.; Voronov, V.A.; Nechaev, Yu.I.

    1988-01-01

    The solution of the reduction problem as a means of increasing the resolution of a spectrometric channel is considered; it is realized using a modified digit-by-digit method and a special strategy that significantly reduces the processing time. The results presented confirm that the combination of the measurement channel and a microcomputer is equivalent to a channel with a higher resolution, and that the modified digit-by-digit method makes it possible to process spectrometric information in real time.

  19. A RAPID Method for Blood Processing to Increase the Yield of Plasma Peptide Levels in Human Blood.

    Science.gov (United States)

    Teuffel, Pauline; Goebel-Stengel, Miriam; Hofmann, Tobias; Prinz, Philip; Scharner, Sophie; Körner, Jan L; Grötzinger, Carsten; Rose, Matthias; Klapp, Burghard F; Stengel, Andreas

    2016-04-28

    Research in the field of food intake regulation is gaining importance. This often includes the measurement of peptides regulating food intake. For the correct determination of a peptide's concentration, it should be stable during blood processing. However, this is not the case for several peptides which are quickly degraded by endogenous peptidases. Recently, we developed a blood processing method employing Reduced temperatures, Acidification, Protease inhibition, Isotopic exogenous controls and Dilution (RAPID) for the use in rats. Here, we have established this technique for the use in humans and investigated recovery, molecular form and circulating concentration of food intake regulatory hormones. The RAPID method significantly improved the recovery for (125)I-labeled somatostatin-28 (+39%), glucagon-like peptide-1 (+35%), acyl ghrelin and glucagon (+32%), insulin and kisspeptin (+29%), nesfatin-1 (+28%), leptin (+21%) and peptide YY3-36 (+19%) compared to standard processing (EDTA blood on ice, p processing, while after standard processing 62% of acyl ghrelin were degraded resulting in an earlier peak likely representing desacyl ghrelin. After RAPID processing the acyl/desacyl ghrelin ratio in blood of normal weight subjects was 1:3 compared to 1:23 following standard processing (p = 0.03). Also endogenous kisspeptin levels were higher after RAPID compared to standard processing (+99%, p = 0.02). The RAPID blood processing method can be used in humans, yields higher peptide levels and allows for assessment of the correct molecular form.

  20. Novel ergonomic postural assessment method (NERPA) using product-process computer aided engineering for ergonomic workplace design.

    Science.gov (United States)

    Sanchez-Lite, Alberto; Garcia, Manuel; Domingo, Rosario; Angel Sebastian, Miguel

    2013-01-01

    Musculoskeletal disorders (MSDs) that result from poor ergonomic design are one of the occupational disorders of greatest concern in the industrial sector. A key advantage in the primary design phase is to focus on an assessment method that detects and evaluates the potential risks of such physical injuries to the operator. The assessment method improves the process design by identifying potential ergonomic improvements among design alternatives or activities undertaken as part of the cycle of continuous improvement throughout the phases of the product life cycle. This paper presents a novel postural assessment method (NERPA) fit for product-process design, developed with the help of a digital human model together with a 3D CAD tool widely used in the aeronautic and automotive industries. The power of 3D visualization and the possibility of studying the actual assembly sequence in a virtual environment allow the functional performance of the parts to be addressed. Such tools can also provide an ergonomic workstation design, together with a competitive advantage in the assembly process. The method developed was used in the design of six production lines, studying 240 manual assembly operations and improving 21 of them. This study demonstrated the proposed method's usefulness and found statistically significant differences between the evaluations of the proposed method and the widely used Rapid Upper Limb Assessment (RULA) method.

  1. Ready for goal setting? Process evaluation of a patient-specific goal-setting method in physiotherapy.

    Science.gov (United States)

    Stevens, Anita; Köke, Albère; van der Weijden, Trudy; Beurskens, Anna

    2017-08-31

    Patient participation and goal setting appear to be difficult in daily physiotherapy practice, and practical methods are lacking. An existing patient-specific instrument, Patient-Specific Complaints (PSC), was therefore optimized into a new Patient-Specific Goal-setting method (PSG). The aims of this study were to examine the feasibility of the PSG in daily physiotherapy practice and to explore the potential impact of the new method. We conducted a process evaluation within a non-controlled intervention study. Community-based physiotherapists were instructed on how to work with the PSG in three group training sessions. The PSG is a six-step method embedded across the physiotherapy process, in which patients are stimulated to participate in the goal-setting process by identifying problematic activities, prioritizing them, scoring their abilities, setting goals, planning and evaluating. Quantitative and qualitative data were collected from patients and physiotherapists by recording consultations and assessing patient files, questionnaires and written reflection reports. Data were collected from 51 physiotherapists and 218 patients, and 38 recordings and 219 patient files were analysed. The PSG steps were performed as intended, but the 'setting goals' and 'planning treatment' steps were not performed in detail. The patients and physiotherapists were positive about the method, and the physiotherapists perceived increased patient participation. They became aware of the importance of engaging patients in a dialogue instead of focusing on gathering information. The lack of integration in the electronic patient system was a major barrier to optimal use in practice. Although the self-reported actual use of the PSG, i.e. informing and involving patients, and client-centred competences had improved, this was not completely confirmed by the objectively observed behaviour.
The PSG is a feasible method and tends to have impact on increasing patient participation in the goal

  2. Isotopic method for investigation of process of periodic sedimentation of argillaceous suspensions

    International Nuclear Information System (INIS)

    Kohman, L.; Woznicki, T.

    1976-01-01

    The process of periodic sedimentation of a kaolin suspension in water has been investigated by the isotopic tracer method. The tracer was either the irradiated matrix material or ¹⁹⁸Au adsorbed on the kaolin grains. The velocity of lowering of the suspension level (the sedimentation curve) and the variation of density along a vertical section of the sediment layer have been determined. (author)

  3. Silver-halide sensitized gelatin (SHSG) processing method for pulse holograms recorded on VRP plates

    Science.gov (United States)

    Evstigneeva, Maria K.; Drozdova, Olga V.; Mikhailov, Viktor N.

    2002-06-01

    One of the most important areas of holography application is display holography. In the case of pulsed recording, the requirement for vibration stability is less stringent than for CW exposure. At the same time, it is widely known that the behavior of silver-halide holographic materials strongly depends on the exposure duration; in particular, the exposure sensitivity decreases drastically at nanosecond pulse durations. One of the effective ways of improving the diffraction efficiency is the SHSG processing method. This processing scheme is based on the high modulation of the refractive index due to the appearance of microvoids inside the emulsion layer. It should be mentioned that the SHSG method was previously used only for holograms recorded with CW lasers. This work is devoted to the investigation of the SHSG method for pulsed hologram recording on VRP plates. We used a pulsed YLF:Nd laser with a pulse duration of 25 nanoseconds and a wavelength of 527 nm. Both transmission and reflection holograms were recorded, and different kinds of bleaching and developing solutions were investigated. Our final processing scheme includes the following stages: 1) development in a non-tanning solution, 2) rehalogenating bleach, 3) intermediate alcohol drying, 4) uniform second exposure, 5) second development in a diluted developer, 6) reverse bleaching, 7) fixing and 8) gradient drying in isopropyl alcohol. The diffraction efficiency of transmission holograms was about 60 percent and that of reflection mirror holograms about 45 percent. We have thus demonstrated an SHSG processing scheme for producing effective holograms on VRP plates under pulsed exposure.

  4. New Design Methods And Algorithms For High Energy-Efficient And Low-cost Distillation Processes

    Energy Technology Data Exchange (ETDEWEB)

    Agrawal, Rakesh [Purdue Univ., West Lafayette, IN (United States)

    2013-11-21

    This project sought and successfully answered two major challenges facing the creation of low-energy, cost-effective, zeotropic multi-component distillation processes: first, identification of an efficient search space that includes all the useful distillation configurations and no undesired configurations; second, development of an algorithm to search the space efficiently and generate an array of low-energy options for industrial multi-component mixtures. Such mixtures are found in large-scale chemical and petroleum plants. Commercialization of our results was addressed by building a user interface that allows practical application of our methods to industrial problems by anyone with basic knowledge of distillation for a given problem. We also provided our algorithm to a major U.S. chemical company for use by its practitioners. The successful execution of this program has placed methods and algorithms at the disposal of process engineers to readily generate low-energy solutions for a large class of multicomponent distillation problems in a typical chemical or petrochemical plant. In a petrochemical complex, the distillation trains within crude oil processing and hydrotreating units containing alkylation, isomerization, reformer, LPG (liquefied petroleum gas) and NGL (natural gas liquids) processing units can benefit from our results. Effluents from naphtha crackers and ethane-propane crackers typically contain mixtures of methane, ethylene, ethane, propylene, propane, butane and heavier hydrocarbons. We have shown that our systematic search method, with a more complete search space and the optimization algorithm, has the potential to yield low-energy distillation configurations for all such applications, with energy savings of up to 50%.

  5. Application of multi attribute failure mode analysis of milk production using analytical hierarchy process method

    Science.gov (United States)

    Rucitra, A. L.

    2018-03-01

    Pusat Koperasi Induk Susu (PKIS) Sekar Tanjung, East Java is one of the modern dairy industries producing Ultra High Temperature (UHT) milk. A problem that often occurs in the production process at PKIS Sekar Tanjung is a mismatch between the production process and the predetermined standard. The purpose of applying the Analytic Hierarchy Process (AHP) was to identify the most likely cause of failure in the milk production process. The Multi Attribute Failure Mode Analysis (MAFMA) method was used to eliminate or reduce the possibility of failure in terms of its causes. This method integrates the severity, occurrence, detection and expected-cost criteria, obtained from an in-depth interview with the head of the production department as an expert. The AHP approach was used to formulate the priority ranking of the causes of failure in the milk production process. At level 1, severity had the highest weight, 0.41 or 41%, compared with the other criteria. At level 2, identifying failure in the UHT milk production process, the most likely cause was an average mixing temperature of more than 70 °C, which was higher than the standard temperature (≤70 °C). This failure cause contributed a weight of 0.47, or 47%, across all criteria. The study therefore suggested that the company control the mixing temperature to minimize or eliminate failure in this process.

  6. Method for Business Process Management System Selection

    NARCIS (Netherlands)

    Thijs van de Westelaken; Bas Terwee; Pascal Ravesteijn

    2013-01-01

    In recent years business process management (BPM) and specifically information systems that support the analysis, design and execution of processes (also called business process management systems (BPMS)) are getting more attention. This has lead to an increase in research on BPM and BPMS. However

  7. A New Digital Signal Processing Method for Spectrum Interference Monitoring

    Science.gov (United States)

    Angrisani, L.; Capriglione, D.; Ferrigno, L.; Miele, G.

    2011-01-01

    The frequency spectrum is a limited shared resource, nowadays used by an ever-growing number of different applications. Generally, the companies providing such services pay governments for the right to use a limited portion of the spectrum, and consequently expect assurance that the licensed radio spectrum resource is not affected by significant external interference. At the same time, they have to guarantee that their devices make efficient use of the spectrum and meet the electromagnetic compatibility regulations. The competent authorities are therefore called upon to control access to the spectrum by adopting suitable management and monitoring policies, while manufacturers have to periodically verify the correct operation of their apparatus. Several measurement solutions are present on the market, generally real-time spectrum analyzers and measurement receivers. Both are characterized by good metrological accuracy, but their cost, dimensions and weight make use in the field impractical. The paper presents a first step toward a digital signal processing based measurement instrument able to meet the above-mentioned needs; in particular, attention has been given to the DSP-based measurement section of the instrument. To this aim, an innovative measurement method for spectrum monitoring and management is proposed in this paper. It performs an efficient sequential analysis based on sample-by-sample digital processing. Three main goals are pursued: (i) measurement performance comparable to that exhibited by other methods proposed in the literature; (ii) fast measurement time; and (iii) easy implementation on cost-effective measurement hardware.
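As a hedged illustration of sample-by-sample digital processing (not the authors' actual algorithm), the classic Goertzel recurrence evaluates the power in a single spectral bin with one state update per incoming sample, which is why it suits cheap streaming hardware:

```python
# Goertzel algorithm: squared magnitude of DFT bin k over an n-sample
# block, computed with a single second-order recurrence per sample.
from math import cos, pi, sin

def goertzel_power(samples, k, n):
    """Return |X[k]|^2 for an n-sample block, one update per sample."""
    w = 2.0 * pi * k / n
    coeff = 2.0 * cos(w)
    s_prev = s_prev2 = 0.0
    for x in samples:                   # one cheap update per sample
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    # Convert the final recurrence state to the complex bin value.
    real = s_prev - s_prev2 * cos(w)
    imag = s_prev2 * sin(w)
    return real * real + imag * imag
```

For a unit-amplitude cosine exactly on bin k, the bin power is (n/2)², while off-tone bins are essentially zero.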

  8. Impact of post-processing methods on apparent diffusion coefficient values

    Energy Technology Data Exchange (ETDEWEB)

    Zeilinger, Martin Georg; Lell, Michael; Uder, Michael [University of Erlangen-Nuremberg, Institute of Diagnostic Radiology, Erlangen (Germany); Baltzer, Pascal Andreas Thomas [Medical University Vienna, Department of Radiology and Nuclear Medicine, Vienna (Austria); Doerfler, Arnd; Dietzel, Matthias [University of Erlangen-Nuremberg, Department of Neuroradiology, Erlangen (Germany)

    2017-03-15

    The apparent diffusion coefficient (ADC) is increasingly used as a quantitative biomarker in oncological imaging. ADC calculation is based on raw diffusion-weighted imaging (DWI) data, and multiple post-processing methods (PPMs) have been proposed for this purpose. We investigated whether the PPM has an impact on the final ADC values. Sixty-five lesions scanned with a standardized whole-body DWI protocol at 3 T served as input data (EPI-DWI; b-values: 50, 400 and 800 s/mm²). Using exactly the same ROI coordinates, four different PPMs (ADC₁-ADC₄) were executed to calculate the corresponding ADC value, given in units of 10⁻³ mm²/s, for each lesion. Statistical analysis was performed to intra-individually compare ADC values stratified by PPM (Wilcoxon signed-rank tests: α = 1%; descriptive statistics; relative difference ∇; coefficient of variation CV). Stratified by PPM, mean ADCs ranged from 1.136 to 1.206 ×10⁻³ mm²/s (∇ = 7.0%). Variances between PPMs were pronounced in the upper range of ADC values (maximum: 2.540-2.763 ×10⁻³ mm²/s, ∇ = 8%). Pairwise comparisons identified significant differences between all PPMs (P ≤ 0.003; mean CV = 7.2%), reaching 0.137 ×10⁻³ mm²/s within the 25th-75th percentile. Altering the PPM had a significant impact on the ADC value. This should be considered if ADC values from different post-processing methods are compared in patient studies. (orig.)
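As background, one common way to compute an ADC from multi-b-value DWI data is a log-linear least-squares fit of ln S against b. This sketch is generic and does not reproduce any of the paper's four PPMs; the b-values match the abstract's protocol, while the signal values in the test are synthetic:

```python
# Generic ADC post-processing sketch: fit ln(S) = ln(S0) - b * ADC by
# ordinary least squares over the acquired b-values and return the ADC
# (in the same units as 1/b, i.e. mm^2/s for b in s/mm^2).
from math import log

def adc_fit(b_values, signals):
    """ADC from a log-linear least-squares fit over (b, S) pairs."""
    n = len(b_values)
    y = [log(s) for s in signals]           # linearise the decay model
    mb = sum(b_values) / n
    my = sum(y) / n
    slope = sum((b - mb) * (yi - my) for b, yi in zip(b_values, y)) \
        / sum((b - mb) ** 2 for b in b_values)
    return -slope                           # ADC is the negative slope
```

Real PPMs differ in details such as which b-values enter the fit, two-point versus regression formulas, and ROI averaging order, which is exactly the variability the paper quantifies.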

  9. Laser apparatus and method for microscopic and spectroscopic analysis and processing of biological cells

    Science.gov (United States)

    Gourley, P.L.; Gourley, M.F.

    1997-03-04

    An apparatus and method are disclosed for microscopic and spectroscopic analysis and processing of biological cells. The apparatus comprises a laser having an analysis region within the laser cavity for containing one or more biological cells to be analyzed. The presence of a cell within the analysis region in superposition with an activated portion of a gain medium of the laser acts to encode information about the cell upon the laser beam, the cell information being recoverable by an analysis means that preferably includes an array photodetector such as a CCD camera and a spectrometer. The apparatus and method may be used to analyze biomedical cells including blood cells and the like, and may include processing means for manipulating, sorting, or eradicating cells after analysis. 20 figs.

  10. Process and research method of radionuclide migration in high level radioactive waste geological disposal system

    International Nuclear Information System (INIS)

    Chen Rui; Zhang Zhanshi

    2014-01-01

    Radionuclides released from waste can migrate from the repository to the surrounding rock and soil; on the other hand, nuclides are also retarded by the backfill material. Radionuclide migration is the main geochemical process in waste disposal. This paper introduces various methods for radionuclide migration research and gives a brief analysis of the geochemical processes involved. Finally, two of the most important radionuclide migration processes are presented as examples. (authors)

  11. [Analysis of chondroitin sulfate content of Cervi Cornu Pantotrichum with different processing methods and different parts].

    Science.gov (United States)

    Gong, Rui-Ze; Wang, Yan-Hua; Sun, Yin-Shi

    2018-02-01

    The differences and variations of chondroitin sulfate content in different parts of Cervi Cornu Pantotrichum (CCP) under different processing methods were investigated. Chondroitin sulfate from velvet was extracted by the dilute alkali-concentrated salt method and then digested by chondroitinase ABC. The contents of total chondroitin sulfate and of chondroitin sulfates A, B and C in the samples were determined by high performance liquid chromatography (HPLC). The chondroitin sulfate content in wax, powder, gauze and bone slices of freeze-dried CCP was 14.13, 11.99, 1.74 and 0.32 g·kg⁻¹, respectively; in boiled CCP, 10.71, 8.97, 2.21 and 1.40 g·kg⁻¹; in CCP without blood, 12.47, 9.47, 2.64 and 0.07 g·kg⁻¹; and in CCP with blood, 8.22, 4.39, 0.87 and 0.28 g·kg⁻¹. The results indicated that chondroitin sulfate content differed significantly among processing methods: it was higher in freeze-dried CCP than in boiled CCP, and higher in CCP without blood than in CCP with blood. Within the same processing method, the chondroitin sulfate content of the different parts of the velvet, from high to low, was: wax slices, powder, gauze slices, bone slices. Copyright© by the Chinese Pharmaceutical Association.

  12. Exploring the Q-marker of "sweat soaking method" processed radix Wikstroemia indica: Based on the "effect-toxicity-chemicals" study.

    Science.gov (United States)

    Feng, Guo; Chen, Yun-Long; Li, Wei; Li, Lai-Lai; Wu, Zeng-Guang; Wu, Zi-Jun; Hai, Yue; Zhang, Si-Chao; Zheng, Chuan-Qi; Liu, Chang-Xiao; He, Xin

    2018-06-01

    Radix Wikstroemia indica (RWI), named "Liao Ge Wang" in Chinese, is a toxic Chinese herbal medicine (CHM) commonly used by the Miao nationality of South China. "Sweat soaking method" processing of RWI can effectively decrease its toxicity while preserving its therapeutic effect. However, the underlying mechanism of processing is still not clear, and a Q-marker database for processed RWI has not been established. This study investigates and establishes a quality evaluation system and potential Q-markers based on the "effect-toxicity-chemicals" relationship of RWI for quality/safety assessment of "sweat soaking method" processing. The variation of RWI in efficacy and toxicity before and after processing was investigated by pharmacological and toxicological studies. A cytotoxicity test was used to screen the cytotoxicity of components in RWI. The material basis of the ethanol extract of raw and processed RWI was studied by UPLC-Q-TOF/MS, and the potential Q-markers were analyzed and predicted according to the "effect-toxicity-chemicals" relationship. RWI processed by the "sweat soaking method" preserved efficacy and showed reduced toxicity: raw and processed RWI did not differ significantly in antinociceptive and anti-inflammatory effect, but the liver and kidney injury caused by processed RWI was much weaker than that caused by raw RWI. Twenty compounds were identified in the ethanol extracts of the raw and processed products of RWI by UPLC-Q-TOF/MS, including daphnoretin, emodin, triumbelletin, dibutyl phthalate, methyl paraben, YH-10 + OH, matairesinol, arctigenin, kaempferol and physcion. Furthermore, 3 diterpenoids (YH-10, YH-12 and YH-15) were shown to possess high toxicity, and their contents decreased by 48%, 44% and 65%, respectively, after processing; they could therefore be regarded as potential Q-markers for quality/safety assessment of "sweat soaking method" processed RWI. A Q-marker database of RWI processed by the "sweat soaking method" was established according to the results

  13. Application of the microbiological method DEFT/APC to detect minimally processed vegetables treated with gamma radiation

    Energy Technology Data Exchange (ETDEWEB)

    Araujo, M.M.; Duarte, R.C.; Silva, P.V. [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Centro de Tecnologia das Radiacoes, Laboratorio de Deteccao de Alimentos Irradiados, Cidade Universitaria, Av. Prof. Lineu Prestes 2242, Butanta Zip Code 05508-000 Sao Paulo (Brazil); Marchioni, E. [Laboratoire de Chimie Analytique et Sciences de l' Aliment (UMR 7512), Faculte de Pharmacie, Universite Louis Pasteur, 74, route du Rhin, F-67400 Illkirch (France); Villavicencio, A.L.C.H. [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Centro de Tecnologia das Radiacoes, Laboratorio de Deteccao de Alimentos Irradiados, Cidade Universitaria, Av. Prof. Lineu Prestes 2242, Butanta Zip Code 05508-000 Sao Paulo (Brazil)], E-mail: villavic@ipen.br

    2009-07-15

    Marketing of minimally processed vegetables (MPV) is gaining impetus due to their convenience, freshness and perceived health benefits. However, minimal processing does not reduce pathogenic microorganisms to safe levels. Food irradiation is used to extend shelf life and to inactivate food-borne pathogens; combined with minimal processing, it could improve the safety and quality of MPV. A microbiological screening method based on the direct epifluorescent filter technique (DEFT) and the aerobic plate count (APC) has been established for the detection of irradiated foodstuffs. The aim of this study was to evaluate the applicability of this technique to detecting MPV irradiation. Samples from retail markets were irradiated with 0.5 and 1.0 kGy using a {sup 60}Co facility. In general, as the dose increased, DEFT counts remained similar regardless of irradiation while APC counts decreased gradually, so the difference between the two counts increased with dose in all samples. It is suggested that a DEFT/APC difference over 2.0 log could serve as a criterion to judge whether an MPV has been treated by irradiation. The DEFT/APC method could be used satisfactorily as a screening method for indicating irradiation processing.
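
    The screening rule described above (flag a sample when the DEFT count exceeds the APC count by more than 2.0 log units) can be sketched as follows; the colony counts used here are hypothetical.

```python
import math

def deft_apc_screen(deft_count, apc_count, threshold=2.0):
    """Return the DEFT/APC log10 difference and whether it exceeds the
    2.0-log screening threshold suggested for irradiated samples."""
    diff = math.log10(deft_count) - math.log10(apc_count)
    return diff, diff > threshold

# hypothetical counts (CFU/g): irradiation leaves DEFT unchanged but lowers APC
diff_irr, flag_irr = deft_apc_screen(1.0e6, 5.0e3)   # irradiated sample
diff_raw, flag_raw = deft_apc_screen(1.0e6, 8.0e5)   # untreated sample
```

    DEFT counts both viable and non-viable cells while APC counts only viable ones, which is why irradiation widens the gap between the two.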

  14. Application of the microbiological method DEFT/APC to detect minimally processed vegetables treated with gamma radiation

    Science.gov (United States)

    Araújo, M. M.; Duarte, R. C.; Silva, P. V.; Marchioni, E.; Villavicencio, A. L. C. H.

    2009-07-01

    Marketing of minimally processed vegetables (MPV) is gaining impetus due to their convenience, freshness and perceived health benefits. However, minimal processing does not reduce pathogenic microorganisms to safe levels. Food irradiation is used to extend shelf life and to inactivate food-borne pathogens; combined with minimal processing, it could improve the safety and quality of MPV. A microbiological screening method based on the direct epifluorescent filter technique (DEFT) and the aerobic plate count (APC) has been established for the detection of irradiated foodstuffs. The aim of this study was to evaluate the applicability of this technique to detecting MPV irradiation. Samples from retail markets were irradiated with 0.5 and 1.0 kGy using a 60Co facility. In general, as the dose increased, DEFT counts remained similar regardless of irradiation while APC counts decreased gradually, so the difference between the two counts increased with dose in all samples. It is suggested that a DEFT/APC difference over 2.0 log could serve as a criterion to judge whether an MPV has been treated by irradiation. The DEFT/APC method could be used satisfactorily as a screening method for indicating irradiation processing.

  15. Application of the microbiological method DEFT/APC to detect minimally processed vegetables treated with gamma radiation

    International Nuclear Information System (INIS)

    Araujo, M.M.; Duarte, R.C.; Silva, P.V.; Marchioni, E.; Villavicencio, A.L.C.H.

    2009-01-01

    Marketing of minimally processed vegetables (MPV) is gaining impetus due to their convenience, freshness and perceived health benefits. However, minimal processing does not reduce pathogenic microorganisms to safe levels. Food irradiation is used to extend shelf life and to inactivate food-borne pathogens; combined with minimal processing, it could improve the safety and quality of MPV. A microbiological screening method based on the direct epifluorescent filter technique (DEFT) and the aerobic plate count (APC) has been established for the detection of irradiated foodstuffs. The aim of this study was to evaluate the applicability of this technique to detecting MPV irradiation. Samples from retail markets were irradiated with 0.5 and 1.0 kGy using a 60Co facility. In general, as the dose increased, DEFT counts remained similar regardless of irradiation while APC counts decreased gradually, so the difference between the two counts increased with dose in all samples. It is suggested that a DEFT/APC difference over 2.0 log could serve as a criterion to judge whether an MPV has been treated by irradiation. The DEFT/APC method could be used satisfactorily as a screening method for indicating irradiation processing.

  16. Solid electrolyte material manufacturable by polymer processing methods

    Science.gov (United States)

    Singh, Mohit; Gur, Ilan; Eitouni, Hany Basam; Balsara, Nitash Pervez

    2012-09-18

    The present invention relates generally to electrolyte materials. According to an embodiment, the present invention provides for a solid polymer electrolyte material that is ionically conductive, mechanically robust, and can be formed into desirable shapes using conventional polymer processing methods. An exemplary polymer electrolyte material has an elastic modulus in excess of 1×10⁶ Pa at 90 °C and is characterized by an ionic conductivity of at least 1×10⁻⁵ S cm⁻¹ at 90 °C. An exemplary material can be characterized by a two-domain or three-domain material system. An exemplary material can include material components made of diblock polymers or triblock polymers. Many uses are contemplated for the solid polymer electrolyte materials. For example, the present invention can be applied to improve Li-based batteries by means of enabling higher energy density, better thermal and environmental stability, lower rates of self-discharge, enhanced safety, lower manufacturing costs, and novel form factors.

  17. Method of radioactive waste processing and equipment therefor

    International Nuclear Information System (INIS)

    Napravnik, J.; Skaba, V.; Ditl, P.

    1988-01-01

    Mushy or liquid radioactive wastes are mixed with chemical additives, e.g. aluminium sulfate, colloidal silicon oxide, formic acid and a cement suspension. The mix is heated to 100 to 320 °C. By drying the waste and by chemical reaction, a bulk intermediate product is obtained, which is homogenized with molten bitumen or organic polymers. The mass is then poured into containers, where it hardens, and is then transported to the depository. The advantage of the method is that the final product is a stable mass resistant to separation, leaching and erosion, offering long-term storage safety. The main components of the installation are a mixed reactor, a doser of bulk material and a homogenizer, series-connected in that order. The apparatus is mounted on a support structure which may be divided into at least two parts. The advantage of this facility is that it is easily transported and can thereby be used for processing waste at its source. (E.S.). 2 figs

  18. A Comparative Study of Applying Active-Set and Interior Point Methods in MPC for Controlling Nonlinear pH Process

    Directory of Open Access Journals (Sweden)

    Syam Syafiie

    2014-06-01

    A comparative study of Model Predictive Control (MPC) using the active-set method and the interior point method is proposed as a control technique for a highly non-linear pH process. The process is a strong acid-strong base system: a strong acid, hydrochloric acid (HCl), and a strong base, sodium hydroxide (NaOH), in the presence of a buffer solution, sodium bicarbonate (NaHCO3), flow into the neutralization reactor. The non-linear pH neutralization model governing this process is represented by multi-linear models. The performance of both controllers is studied by evaluating their set-point tracking and disturbance rejection, and the optimization time of the two methods is compared. Both MPC controllers show similar performance, with no overshoot, offset or oscillation; however, for the small-scale optimization problem of pH control, the conventional active-set method gives a shorter control action time than the interior point method.
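
    The controllers above solve a constrained quadratic program at every sampling instant, using either an active-set or an interior-point solver. As a minimal sketch of the receding-horizon idea only, the snippet below uses a one-step horizon and no constraints, so the QP collapses to a closed-form expression; the linearized model coefficients are hypothetical, and it is precisely the addition of input and output constraints that would bring the two QP solvers into play.

```python
def mpc_move(x, r, a, b, lam):
    """One-step-horizon MPC for the linear model x+ = a*x + b*u:
    u minimizes (a*x + b*u - r)**2 + lam*u**2 (closed form, no constraints)."""
    return b * (r - a * x) / (b * b + lam)

# hypothetical coefficients of a model linearized around an operating pH
a, b, lam = 0.9, 0.5, 0.01
x, r = 0.0, 2.0          # state and set-point as deviations from the operating point
for _ in range(50):
    u = mpc_move(x, r, a, b, lam)
    x = a * x + b * u    # apply the move, measure, and re-solve (receding horizon)
# x settles close to the set-point r (a small offset remains due to the input penalty)
```

    With a longer horizon and constraints, each step instead requires a numerical QP solve, which is where the active-set versus interior-point timing comparison of the paper becomes relevant.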

  19. Hospital Registration Process Reengineering Using Simulation Method

    Directory of Open Access Journals (Sweden)

    Qiang Su

    2010-01-01

    With increasing competition, many healthcare organizations have undergone tremendous reform in the last decade, aiming to increase efficiency, decrease waste, and reshape the way care is delivered. This study focuses on improving the operational efficiency of a hospital's registration process. The factors related to operational efficiency, including the service process, queue strategy, and queue parameters, were explored systematically and illustrated with a case study. Guided by the principles of business process reengineering (BPR), a simulation approach was employed for process redesign and performance optimization. As a result, the queue strategy was changed from multiple queues with multiple servers to a single queue with multiple servers plus a preparation queue. Furthermore, through a series of simulation experiments, the length of the preparation queue and the corresponding registration process efficiency were quantitatively evaluated and optimized.
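
    The queue-strategy change described above can be illustrated with a toy simulation: the sketch below compares the mean wait under a single shared queue feeding several servers against round-robin assignment to separate per-server queues (with no jockeying between lines). The arrival and service rates are hypothetical, and this is far simpler than the case-study simulation in the paper.

```python
import random

def mean_wait(n_customers, n_servers, arr_rate, svc_rate, shared_queue, seed=1):
    """Average wait: one shared FCFS line (shared_queue=True) versus
    round-robin assignment to separate per-server lines (False, no jockeying)."""
    rng = random.Random(seed)
    free = [0.0] * n_servers        # time at which each server next becomes free
    t = total_wait = 0.0
    for i in range(n_customers):
        t += rng.expovariate(arr_rate)        # Poisson arrivals
        service = rng.expovariate(svc_rate)   # exponential service times
        s = free.index(min(free)) if shared_queue else i % n_servers
        start = max(t, free[s])
        total_wait += start - t
        free[s] = start + service
    return total_wait / n_customers

w_single = mean_wait(20000, 3, 2.4, 1.0, shared_queue=True)
w_multi = mean_wait(20000, 3, 2.4, 1.0, shared_queue=False)
# the single shared queue yields the shorter average wait
```

    The shared line wins because no server can sit idle while a customer waits in another line, which is the intuition behind the redesign the study arrived at.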

  20. Accurate Methods for Signal Processing of Distorted Waveforms in Power Systems

    Directory of Open Access Journals (Sweden)

    Langella R

    2007-01-01

    A primary problem in waveform distortion assessment in power systems is reducing the effects of spectral leakage. In the framework of DFT approaches, line-frequency synchronization techniques or algorithms to compensate for desynchronization are necessary; alternative approaches such as those based on the Prony and ESPRIT methods are not sensitive to desynchronization, but they often require a significant computational burden. In this paper, the signal processing aspects of the problem are considered; different proposals by the same authors regarding DFT-, Prony-, and ESPRIT-based advanced methods are reviewed and compared in terms of accuracy and computational effort. The results of several numerical experiments are reported and analysed; some of them are in accordance with IEC Standards, while others use more open scenarios.
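
    The spectral-leakage problem motivating the paper can be demonstrated with a direct DFT: when the observation window holds a non-integer number of line-frequency cycles (desynchronization), energy from the fundamental leaks into neighbouring bins. The sketch below is illustrative only; the window length and cycle counts are hypothetical.

```python
import cmath
import math

def dft_mag(x, k):
    """Normalized DFT magnitude of x at integer bin k."""
    n = len(x)
    return abs(sum(x[i] * cmath.exp(-2j * math.pi * k * i / n)
                   for i in range(n))) / n

def tone(cycles, n):
    """Sine wave completing `cycles` periods over an n-sample window."""
    return [math.sin(2 * math.pi * cycles * i / n) for i in range(n)]

n = 256
sync = tone(5.0, n)      # synchronized window: integer number of cycles
desync = tone(5.3, n)    # 0.3-cycle desynchronization

main_bin = dft_mag(sync, 5)       # fundamental lands entirely in its own bin
leak_sync = dft_mag(sync, 7)      # essentially zero
leak_desync = dft_mag(desync, 7)  # noticeable leaked energy in a nearby bin
```

    Synchronization (or desynchronization compensation) drives `leak_desync` back toward the synchronized case; parametric methods such as Prony and ESPRIT avoid the issue by not binning the spectrum at all, at higher computational cost.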