WorldWideScience

Sample records for qplanar processing method

  1. Comparison of conventional, model-based quantitative planar, and quantitative SPECT image processing methods for organ activity estimation using In-111 agents

    International Nuclear Information System (INIS)

    He, Bin; Frey, Eric C

    2006-01-01

    Accurate quantification of organ radionuclide uptake is important for patient-specific dosimetry. The quantitative accuracy of conventional conjugate view methods is limited by overlap of projections from different organs and background activity, as well as by attenuation and scatter. In this work, we propose and validate a quantitative planar (QPlanar) processing method based on maximum likelihood (ML) estimation of organ activities using 3D organ VOIs and a projector that models the image degrading effects. Both a physical phantom experiment and Monte Carlo simulation (MCS) studies were used to evaluate the new method. In these studies, the accuracies and precisions of organ activity estimates for the QPlanar method were compared with those from conventional planar (CPlanar) processing methods with various corrections for scatter, attenuation and organ overlap, and from a quantitative SPECT (QSPECT) processing method. Experimental planar and SPECT projections and registered CT data from an RSD Torso phantom were obtained using a GE Millennium VH/Hawkeye system. The MCS data were obtained from the 3D NCAT phantom with organ activity distributions that modelled the uptake of 111In ibritumomab tiuxetan. The simulations were performed using parameters appropriate for the same system used in the RSD torso phantom experiment. The organ activity estimates obtained from the CPlanar, QPlanar and QSPECT methods from both experiments were compared. From the results of the MCS experiment, even with ideal organ overlap correction and background subtraction, CPlanar methods provided limited quantitative accuracy. The QPlanar method with accurate modelling of the physical factors increased the quantitative accuracy, at the cost of requiring estimates of the organ VOIs in 3D. The accuracy of QPlanar approached that of QSPECT, but required much less acquisition and computation time. Similar results were obtained from the physical phantom experiment. We conclude that the QPlanar method, based
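
    For orientation, the estimation principle implied by this abstract can be written compactly. The following is a sketch in our own notation (not the paper's exact formulation): planar pixel counts are modelled as Poisson with a mean that is linear in the unknown organ activities, and the activities are obtained with the standard ML-EM update for a linear Poisson model.

```latex
% y_i  : measured counts in planar pixel i
% a_j  : unknown activity of organ j
% c_ij : modelled projection (attenuation, scatter, detector response) of a
%        unit-activity, uniformly filled VOI of organ j into pixel i
y_i \sim \mathrm{Poisson}\Big(\textstyle\sum_{j} c_{ij}\, a_j\Big)

% ML-EM update for the organ activities:
a_j^{(k+1)} \;=\; \frac{a_j^{(k)}}{\sum_i c_{ij}} \sum_i \frac{c_{ij}\, y_i}{\sum_{l} c_{il}\, a_l^{(k)}}
```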

  2. EQPlanar: a maximum-likelihood method for accurate organ activity estimation from whole body planar projections

    International Nuclear Information System (INIS)

    Song, N; Frey, E C; He, B; Wahl, R L

    2011-01-01

    Optimizing targeted radionuclide therapy requires patient-specific estimation of organ doses. The organ doses are estimated from quantitative nuclear medicine imaging studies, many of which involve planar whole body scans. We have previously developed the quantitative planar (QPlanar) processing method and demonstrated its ability to provide more accurate activity estimates than conventional geometric-mean-based planar (CPlanar) processing methods using physical phantom and simulation studies. The QPlanar method uses the maximum likelihood-expectation maximization algorithm, 3D organ volumes of interest (VOIs), and rigorous models of physical image degrading factors to estimate organ activities. However, the QPlanar method requires alignment between the 3D organ VOIs and the 2D planar projections and assumes uniform activity distribution in each VOI. This makes application to patients challenging. As a result, in this paper we propose an extended QPlanar (EQPlanar) method that provides independent-organ rigid registration and includes multiple background regions. We have validated this method using both Monte Carlo simulation and patient data. In the simulation study, we evaluated the precision and accuracy of the method in comparison to the original QPlanar method. For the patient studies, we compared organ activity estimates at 24 h after injection with those from conventional geometric mean-based planar quantification using a 24 h post-injection quantitative SPECT reconstruction as the gold standard. We also compared the goodness of fit of the measured and estimated projections obtained from the EQPlanar method to those from the original method at four other time points where gold standard data were not available. In the simulation study, more accurate activity estimates were provided by the EQPlanar method for all the organs at all the time points compared with the QPlanar method. Based on the patient data, we concluded that the EQPlanar method provided a
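
    As a companion to the model sketched under record 1, here is a minimal numerical sketch of that ML-EM organ-activity update in Python. The array names and shapes are our assumptions, and the per-organ rigid registration and multiple background regions that distinguish EQPlanar are deliberately not modelled.

```python
import numpy as np

def mlem_organ_activities(y, C, n_iter=200, eps=1e-12):
    """Estimate organ activities from one planar projection by ML-EM.

    y : (n_pixels,) measured planar counts.
    C : (n_pixels, n_organs) modelled projection of each unit-activity organ VOI,
        with attenuation, scatter and detector response folded in.
    """
    a = np.ones(C.shape[1])           # non-negative initial estimate
    sens = C.sum(axis=0) + eps        # per-organ sensitivity, sum_i c_ij
    for _ in range(n_iter):
        expected = C @ a + eps        # forward model, sum_l c_il a_l
        a *= (C.T @ (y / expected)) / sens
    return a

# usage sketch with synthetic data
rng = np.random.default_rng(0)
C = rng.random((64 * 64, 3))                     # three "organs"
a_true = np.array([50.0, 20.0, 5.0])
y = rng.poisson(C @ a_true)
print(mlem_organ_activities(y, C, n_iter=500))   # approaches a_true
```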

  3. The impact of 3D volume of interest definition on accuracy and precision of activity estimation in quantitative SPECT and planar processing methods

    Science.gov (United States)

    He, Bin; Frey, Eric C.

    2010-06-01

    Accurate and precise estimation of organ activities is essential for treatment planning in targeted radionuclide therapy. We have previously evaluated the impact of processing methodology, statistical noise and variability in activity distribution and anatomy on the accuracy and precision of organ activity estimates obtained with quantitative SPECT (QSPECT) and planar (QPlanar) processing. Another important factor impacting the accuracy and precision of organ activity estimates is the accuracy of and variability in the definition of organ regions of interest (ROIs) or volumes of interest (VOIs). The goal of this work was thus to systematically study the effects of VOI definition on the reliability of activity estimates. To this end, we performed Monte Carlo simulation studies using randomly perturbed and shifted VOIs to assess the impact on organ activity estimates. The 3D NCAT phantom was used with activities that modeled clinically observed 111In ibritumomab tiuxetan distributions. In order to study the errors resulting from misdefinitions due to manual segmentation errors, VOIs of the liver and left kidney were first manually defined. Each control point was then randomly perturbed to one of the nearest or next-nearest voxels in three ways: with no, inward or outward directional bias, resulting in random perturbation, erosion or dilation, respectively, of the VOIs. In order to study the errors resulting from the misregistration of VOIs, as would happen, e.g. in the case where the VOIs were defined using a misregistered anatomical image, the reconstructed SPECT images or projections were shifted by amounts ranging from -1 to 1 voxels in increments of 0.1 voxels in both the transaxial and axial directions. The activity estimates from the shifted reconstructions or projections were compared to those from the originals, and average errors were computed for the QSPECT and QPlanar methods, respectively. For misregistration, errors in organ activity estimations were
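
    To make the perturbation and misregistration experiments concrete, here is an illustrative sketch of one-voxel erosion/dilation of a binary VOI and of sub-voxel image shifts over the stated -1 to +1 voxel range. The morphological operations stand in for the control-point perturbations described in the paper, and the simple in-VOI count sum stands in for the full QSPECT/QPlanar estimators.

```python
import numpy as np
from scipy import ndimage

def perturb_voi(voi, mode):
    """Erode or dilate a binary VOI mask by one voxel, or leave it unchanged."""
    if mode == "erode":
        return ndimage.binary_erosion(voi)
    if mode == "dilate":
        return ndimage.binary_dilation(voi)
    return voi

def activity_with_shift(image, voi, shift_vox):
    """Sum counts inside the VOI after shifting the image by a sub-voxel amount
    (linear interpolation), mimicking a misregistered anatomical image."""
    return ndimage.shift(image, shift_vox, order=1)[voi].sum()

# usage sketch: relative error versus transaxial shift in 0.1-voxel steps
rng = np.random.default_rng(1)
image = rng.poisson(10.0, size=(32, 32, 32)).astype(float)
voi = np.zeros_like(image, dtype=bool)
voi[10:20, 10:20, 10:20] = True
ref = image[voi].sum()
for dx in np.arange(-1.0, 1.01, 0.1):
    est = activity_with_shift(image, voi, (dx, 0.0, 0.0))
    print(f"shift {dx:+.1f} voxels: error {100 * (est - ref) / ref:+.2f}%")
```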

  4. The impact of 3D volume of interest definition on accuracy and precision of activity estimation in quantitative SPECT and planar processing methods

    Energy Technology Data Exchange (ETDEWEB)

    He Bin [Division of Nuclear Medicine, Department of Radiology, New York Presbyterian Hospital-Weill Medical College of Cornell University, New York, NY 10021 (United States); Frey, Eric C, E-mail: bih2006@med.cornell.ed, E-mail: efrey1@jhmi.ed [Russell H. Morgan Department of Radiology and Radiological Science, Johns Hopkins Medical Institutions, Baltimore, MD 21287-0859 (United States)

    2010-06-21

    Accurate and precise estimation of organ activities is essential for treatment planning in targeted radionuclide therapy. We have previously evaluated the impact of processing methodology, statistical noise and variability in activity distribution and anatomy on the accuracy and precision of organ activity estimates obtained with quantitative SPECT (QSPECT) and planar (QPlanar) processing. Another important factor impacting the accuracy and precision of organ activity estimates is the accuracy of and variability in the definition of organ regions of interest (ROIs) or volumes of interest (VOIs). The goal of this work was thus to systematically study the effects of VOI definition on the reliability of activity estimates. To this end, we performed Monte Carlo simulation studies using randomly perturbed and shifted VOIs to assess the impact on organ activity estimates. The 3D NCAT phantom was used with activities that modeled clinically observed 111In ibritumomab tiuxetan distributions. In order to study the errors resulting from misdefinitions due to manual segmentation errors, VOIs of the liver and left kidney were first manually defined. Each control point was then randomly perturbed to one of the nearest or next-nearest voxels in three ways: with no, inward or outward directional bias, resulting in random perturbation, erosion or dilation, respectively, of the VOIs. In order to study the errors resulting from the misregistration of VOIs, as would happen, e.g. in the case where the VOIs were defined using a misregistered anatomical image, the reconstructed SPECT images or projections were shifted by amounts ranging from -1 to 1 voxels in increments of 0.1 voxels in both the transaxial and axial directions. The activity estimates from the shifted reconstructions or projections were compared to those from the originals, and average errors were computed for the QSPECT and QPlanar methods, respectively. For misregistration, errors in organ activity estimations

  5. Evaluation of quantitative imaging methods for organ activity and residence time estimation using a population of phantoms having realistic variations in anatomy and uptake

    International Nuclear Information System (INIS)

    He Bin; Du Yong; Segars, W. Paul; Wahl, Richard L.; Sgouros, George; Jacene, Heather; Frey, Eric C.

    2009-01-01

    Estimating organ residence times is an essential part of patient-specific dosimetry for radioimmunotherapy (RIT). Quantitative imaging methods for RIT are often evaluated using a single physical or simulated phantom but are intended to be applied clinically where there is variability in patient anatomy, biodistribution, and biokinetics. To provide a more relevant evaluation, the authors have thus developed a population of phantoms with realistic variations in these factors and applied it to the evaluation of quantitative imaging methods both to find the best method and to demonstrate the effects of these variations. Using whole body scans and SPECT/CT images, organ shapes and time-activity curves of 111In ibritumomab tiuxetan were measured in dosimetrically important organs in seven patients undergoing a high dose therapy regimen. Based on these measurements, we created a 3D NURBS-based cardiac-torso (NCAT) phantom population. SPECT and planar data at realistic count levels were then simulated using previously validated Monte Carlo simulation tools. The projections from the population were used to evaluate the accuracy and variation in accuracy of residence time estimation methods that used a time series of SPECT and planar scans. Quantitative SPECT (QSPECT) reconstruction methods were used that compensated for attenuation, scatter, and the collimator-detector response. Planar images were processed with a conventional (CPlanar) method that used geometric mean attenuation and triple-energy window scatter compensation and a quantitative planar (QPlanar) processing method that used model-based compensation for image degrading effects. Residence times were estimated from activity estimates made at each of five time points. The authors also evaluated hybrid methods that used CPlanar or QPlanar time-activity curves rescaled to the activity estimated from a single QSPECT image. The methods were evaluated in terms of mean relative error and standard deviation of the
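
    As an illustration of the residence-time and hybrid-rescaling steps described here, the sketch below fits a mono-exponential to a five-point time-activity curve and integrates it analytically. The mono-exponential form, the time points and the rescaling to a single QSPECT estimate are our assumptions for illustration, not the authors' exact implementation.

```python
import numpy as np
from scipy.optimize import curve_fit

def mono_exp(t, a0, lam):
    return a0 * np.exp(-lam * t)

def residence_time_h(times_h, activities, injected_activity):
    """Residence time = integral of the organ time-activity curve divided by the
    injected activity, here using an analytic integral of a mono-exponential fit."""
    (a0, lam), _ = curve_fit(mono_exp, times_h, activities, p0=(activities[0], 0.01))
    return a0 / lam / injected_activity

def hybrid_rescale(planar_activities, qspect_activity, qspect_index):
    """Rescale a planar time-activity curve to match a single QSPECT estimate."""
    scale = qspect_activity / planar_activities[qspect_index]
    return planar_activities * scale

# usage sketch: five imaging time points (hours post injection), arbitrary units
times = np.array([1.0, 24.0, 72.0, 120.0, 168.0])
planar = np.array([90.0, 70.0, 40.0, 25.0, 15.0])
rescaled = hybrid_rescale(planar, qspect_activity=65.0, qspect_index=1)
print(residence_time_h(times, rescaled, injected_activity=100.0), "h")
```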

  6. Methods of information processing

    Energy Technology Data Exchange (ETDEWEB)

    Kosarev, Yu G; Gusev, V D

    1978-01-01

    Works are presented on automation systems for editing and publishing operations using methods for processing symbolic information and information contained in a training sample (ranking of objects by promise, a classification algorithm for tones and noise). The book will be of interest to specialists in the automation of text information processing, programming, and pattern recognition.

  7. Nonaqueous processing methods

    International Nuclear Information System (INIS)

    Coops, M.S.; Bowersox, D.F.

    1984-09-01

    A high-temperature process utilizing molten salt extraction from molten metal alloys has been developed for purification of spent power reactor fuels. Experiments with laboratory-scale processing operations show that purification and throughput parameters comparable to the Barnwell Purex process can be achieved by pyrochemical processing in equipment one-tenth the size, with all wastes being discharged as stable metal alloys at greatly reduced volume and disposal cost. This basic technology can be developed for large-scale processing of spent reactor fuels. 13 references, 4 figures

  8. Hyperspectral image processing methods

    Science.gov (United States)

    Hyperspectral image processing refers to the use of computer algorithms to extract, store and manipulate both spatial and spectral information contained in hyperspectral images across the visible and near-infrared portion of the electromagnetic spectrum. A typical hyperspectral image processing work...

  9. Radioactive waste processing method

    International Nuclear Information System (INIS)

    Sakuramoto, Naohiko.

    1992-01-01

    When granular radioactive wastes containing phosphorus are first processed in a fluidized-bed furnace, and the granular materials are phosphorus-containing activated carbon, granular materials comprising alkali compounds such as calcium hydroxide and barium hydroxide are used as the fluidizing media. Even granular materials with a slow burning speed can be burnt stably in the fluidized state by the high-temperature heat of the fluidizing media, which allows a long burning time. Accordingly, radioactive activated carbon wastes can be processed by incineration. (T.M.)

  10. Validation Process Methods

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, John E. [National Renewable Energy Lab. (NREL), Golden, CO (United States); English, Christine M. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Gesick, Joshua C. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mukkamala, Saikrishna [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2018-01-04

    This report documents the validation process as applied to projects awarded through Funding Opportunity Announcements (FOAs) within the U.S. Department of Energy Bioenergy Technologies Office (DOE-BETO). It describes the procedures used to protect and verify project data, as well as the systematic framework used to evaluate and track performance metrics throughout the life of the project. This report also describes the procedures used to validate the proposed process design, cost data, analysis methodologies, and supporting documentation provided by the recipients.

  11. Waste processing method

    International Nuclear Information System (INIS)

    Furukawa, Osamu; Shibata, Minoru.

    1996-01-01

    X-rays are directed from a predetermined direction onto solid wastes containing radioactive isotopes, packed in a bag, before the bag is charged into the inlet of an incinerator. Most of the wastes are burnable plastics such as test tubes and papers. Glasses such as chemical bottles and metals such as lead plates for radiation shielding make up a portion of the wastes. The X-rays have an intensity sufficient to discriminate metals and glasses from burnable materials. The irradiation images formed on the X-ray receiving portion are processed, and the total number of picture elements for which the gradation in the metal regions of the receiving portion lies within a predetermined range is counted in the image. Bags whose total picture-element count is not less than a predetermined number are then separated from bags with a lower count. Similar processing is conducted for glasses. With this procedure, bags containing lead or glass, which are unsuitable for incineration, are separated from bags that do not contain them, thereby preventing a loss of incinerator operating efficiency. (I.N.)

  12. Radioactive waste processing method

    International Nuclear Information System (INIS)

    Ando, Ken-ichi; Kawamura, Hideki; Takeuchi, Kunifumi.

    1997-01-01

    Base rock is dug out in a substantially cylindrical shape, and bentonite blocks in an amount for a predetermined lift are disposed on the inner side of the dug wall surfaces. Concrete blocks constituting the structure of an underground silo are disposed inside them, barrier blocks are disposed further inside, and vessels containing radioactive wastes are placed innermost. The bentonite disposed along the dug wall surfaces, the concrete structure of the underground silo and the barrier members are divided vertically into a plurality of blocks, and these blocks are stacked successively from the lowermost layer together with the vessels containing the radioactive wastes; after stacking them to a predetermined height, a filler is placed around the vessels. With such a constitution, the underground silo does not collapse or vibrate even if an earthquake occurs. In addition, bending stresses are scarcely generated, making steel reinforcement unnecessary. Accordingly, the sealing performance is improved and the processing cost is reduced. (T.M.)

  13. Digital processing methods for bronchograms

    International Nuclear Information System (INIS)

    Mamilyaev, R.M.; Popova, N.P.; Matsulevich, T.V.

    1989-01-01

    The technique of digital processing of bronchograms, with the aim of separating morphological details of the bronchi and increasing the clarity of the outlines of contrasted bronchi, is described. A block diagram of digital processing on an automated image processing system is given. It is shown that digital processing of bronchograms makes it possible to outline bronchial walls clearly and makes measurements of bronchial diameters easier and more reliable. Considerable advantages of digital image processing over optical methods are demonstrated.
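
    The abstract does not name the specific algorithms used, so as a purely generic illustration of the kind of contour sharpening involved, here is a minimal unsharp-masking sketch. The filter width, gain and synthetic test image are assumptions; this is not the authors' pipeline.

```python
import numpy as np
from scipy import ndimage

def unsharp_mask(image, sigma=2.0, amount=1.5):
    """Sharpen outlines by adding back the high-frequency residual of a Gaussian blur."""
    image = image.astype(float)
    blurred = ndimage.gaussian_filter(image, sigma)
    return image + amount * (image - blurred)

# usage sketch on a synthetic "contrasted bronchus": a bright band on a noisy background
rng = np.random.default_rng(2)
img = rng.normal(100.0, 5.0, size=(128, 128))
img[:, 60:68] += 50.0
sharp = unsharp_mask(img)
print(float(sharp[:, 60:68].mean() - sharp.mean()))   # band-to-background contrast grows
```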

  14. Present status of processing method

    Energy Technology Data Exchange (ETDEWEB)

    Kosako, Kazuaki [Sumitomo Atomic Energy Industries Ltd., Tokyo (Japan)

    1998-11-01

    The present status of processing methods for a high-energy nuclear data file was examined. The NJOY94 code is the only one available for this processing. In Japan, present processing with NJOY94 is oriented toward the production of traditional cross-section libraries, because no high-energy transport code that would use a high-energy cross-section library has been clearly identified. (author)

  15. Methods in Astronomical Image Processing

    Science.gov (United States)

    Jörsäter, S.

    A Brief Introductory Note; History of Astronomical Imaging; Astronomical Image Data; Images in Various Formats; Digitized Image Data; Digital Image Data; Philosophy of Astronomical Image Processing; Properties of Digital Astronomical Images; Human Image Processing; Astronomical vs. Computer Science Image Processing; Basic Tools of Astronomical Image Processing; Display Applications; Calibration of Intensity Scales; Calibration of Length Scales; Image Re-shaping; Feature Enhancement; Noise Suppression; Noise and Error Analysis; Image Processing Packages: Design of AIPS and MIDAS; AIPS; MIDAS; Reduction of CCD Data; Bias Subtraction; Clipping; Preflash Subtraction; Dark Subtraction; Flat Fielding; Sky Subtraction; Extinction Correction; Deconvolution Methods; Rebinning/Combining; Summary and Prospects for the Future
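
    Since the listing includes the standard CCD reduction steps (bias subtraction, dark subtraction, flat fielding), a minimal sketch of that chain may help; the frame names, exposure scaling and synthetic data are assumptions, and real pipelines combine many calibration frames and handle clipping, preflash and sky subtraction as well.

```python
import numpy as np

def reduce_ccd(raw, bias, dark, flat, exp_raw_s, exp_dark_s):
    """Basic CCD reduction: subtract bias and exposure-scaled dark, then divide
    by the median-normalized flat field."""
    raw = raw.astype(float)
    dark_scaled = (dark - bias) * (exp_raw_s / exp_dark_s)
    flat_corr = flat - bias
    flat_norm = flat_corr / np.median(flat_corr)
    return (raw - bias - dark_scaled) / flat_norm

# usage sketch with synthetic frames
rng = np.random.default_rng(3)
bias = np.full((64, 64), 300.0)
dark = bias + 5.0
flat = bias + 1000.0 * rng.uniform(0.9, 1.1, (64, 64))
raw = bias + 5.0 + 200.0 * rng.uniform(0.9, 1.1, (64, 64))
print(reduce_ccd(raw, bias, dark, flat, exp_raw_s=60.0, exp_dark_s=60.0).mean())
```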

  16. Microencapsulation and Electrostatic Processing Method

    Science.gov (United States)

    Morrison, Dennis R. (Inventor); Mosier, Benjamin (Inventor)

    2000-01-01

    Methods are provided for forming spherical multilamellar microcapsules having alternating hydrophilic and hydrophobic liquid layers, surrounded by flexible, semi-permeable hydrophobic or hydrophilic outer membranes which can be tailored specifically to control the diffusion rate. The methods of the invention rely on low shear mixing and liquid-liquid diffusion process and are particularly well suited for forming microcapsules containing both hydrophilic and hydrophobic drugs. These methods can be carried out in the absence of gravity and do not rely on density-driven phase separation, mechanical mixing or solvent evaporation phases. The methods include the process of forming, washing and filtering microcapsules. In addition, the methods contemplate coating microcapsules with ancillary coatings using an electrostatic field and free fluid electrophoresis of the microcapsules. The microcapsules produced by such methods are particularly useful in the delivery of pharmaceutical compositions.

  17. Methods of digital image processing

    International Nuclear Information System (INIS)

    Doeler, W.

    1985-01-01

    The increasing use of computerized methods for diagnostic imaging of radiological problems will open up a wide field of applications for digital image processing. The requirements set by routine diagnostics in medical radiology point to image data storage, documentation and communication as the main points of interest for the application of digital image processing. As for the purely radiological problems, the value of digital image processing is to be sought in the improved interpretability of the image information in those cases where the expert's experience and image interpretation by human visual capacities do not suffice. There are many other domains of imaging in medical physics where digital image processing and evaluation are very useful. The paper reviews the various methods available for a variety of problem solutions, and explains the hardware available for the tasks discussed. (orig.)

  18. Image restoration and processing methods

    International Nuclear Information System (INIS)

    Daniell, G.J.

    1984-01-01

    This review will stress the importance of using image restoration techniques that deal with incomplete, inconsistent, and noisy data and do not introduce spurious features into the processed image. No single image is equally suitable for both the resolution of detail and the accurate measurement of intensities. A good general purpose technique is the maximum entropy method and the basis and use of this will be explained. (orig.)
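
    For reference, the maximum entropy method mentioned here is usually posed as a constrained optimization of the following standard form (our notation; see the review itself for the author's exact presentation):

```latex
% Reconstruct a non-negative image f from data d with response R and noise sigma_i:
\max_{f \ge 0} \; S(f) = -\sum_{j} f_j \ln\!\frac{f_j}{m_j}
\quad \text{subject to} \quad
\chi^2(f) = \sum_{i} \frac{\big(d_i - (Rf)_i\big)^2}{\sigma_i^2} \approx N
```

    Here m is a default (prior) image and N the number of data points; keeping chi-squared near N makes the reconstruction consistent with the incomplete, noisy data while the entropy term discourages spurious structure.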

  19. Method of processing liquid wastes

    International Nuclear Information System (INIS)

    Naba, Katsumi; Oohashi, Takeshi; Kawakatsu, Ryu; Kuribayashi, Kotaro.

    1980-01-01

    Purpose: To process radioactive liquid wastes safely by distilling them while passing gases, treating the distillation fractions appropriately, adding a combustible liquid synthetic resin material to the distillation residues, polymerizing it to solidify them, and then burning the product. Method: Liquid wastes containing radioactive substances are distilled while gases are passed through, and the distillation fractions, which contain essentially no radioactive substances, are treated by an appropriate method. A synthetic resin material, which may be a mixture of polymer and monomer, is added together with a catalyst to the distillation residues, which contain almost all of the radioactive substances, to polymerize and solidify them. Water or solvent may remain to the extent that it does not hinder solidification. The solidification products are burnt to facilitate the treatment of the radioactive substances. The resin material can be selected suitably, methacrylate syrup (mainly a solution of polymethylmethacrylate and methylmethacrylate) being preferred. (Seki, T.)

  20. Method of processing radioactive wastes

    International Nuclear Information System (INIS)

    Katada, Katsuo.

    1986-01-01

    Purpose: To improve the management of radioactive waste containers and thereby decrease the amount of stored material by arranging the containers in the order of their radioactivity levels. Method: The radiation doses of the radioactive waste containers arranged in the storage area before volume-reduction treatment are measured in advance with a dosemeter. A sorting machine is then actuated to hoist the containers in the order of their radiation levels, and the containers are sent out through a conveyor, surface contamination gauge, weighing device and switcher to a volume-reduction processing machine. The volume-reduced products are packed, several units at a time, into storage containers. Thus, the storage containers, after being stored for a certain period of time, can be transferred in an assembled state. (Kawakami, Y.)

  1. Method of processing radioactive wastes

    International Nuclear Information System (INIS)

    Nomura, Ichiro; Hashimoto, Yasuo.

    1984-01-01

    Purpose: To improve the volume-reduction effect, enable simultaneous processing of wastes such as burnable solid wastes, waste resins and sludges, and further convert the processed materials into glass-solidified products that are much less burnable and are chemically and thermally stable. Method: Auxiliaries mainly composed of SiO2, such as clays, and wastes such as burnable solid wastes, waste resins and sludges are charged through a waste hopper into an incinerating melting furnace comprising an incinerating furnace and a melting furnace, while radioactive concentrated liquid wastes are sprayed from a spray nozzle. The wastes are burnt by the heat from the melting furnace and combustion air, and the sprayed concentrated wastes are dried by the hot air after combustion into solid components. The solid matter from the concentrated liquid wastes and the incineration ashes of the wastes are melted together with the auxiliaries in the melting furnace and converted into glass-like matter. The glass-like matter thus formed is poured into a vessel and gradually cooled to solidify. (Horiuchi, T.)

  2. Advanced methods for processing ceramics

    Energy Technology Data Exchange (ETDEWEB)

    Carter, W.B. [Georgia Institute of Technology, Atlanta, GA (United States)

    1995-05-01

    Combustion chemical vapor deposition (CCVD) is a flame assisted, open air chemical vapor deposition (CVD) process. The process is capable of producing textured, epitaxial coatings on single crystal substrates using low cost reagents. Combustion chemical vapor deposition is a relatively inexpensive, alternative thin film deposition process with potential to replace conventional coating technologies for certain applications. The goals of this project are to develop the CCVD process to the point that potential industrial applications can be identified and reliably assessed.

  3. Advanced methods for processing ceramics

    Energy Technology Data Exchange (ETDEWEB)

    Carter, W.B. [Georgia Institute of Technology, Atlanta, GA (United States)

    1997-04-01

    Combustion chemical vapor deposition (combustion CVD) is being developed for the deposition of high temperature oxide coatings. The process is being evaluated as an alternative to more capital intensive conventional coating processes. The thrusts during this reporting period were the development of the combustion CVD process for depositing lanthanum monazite, the determination of the influence of aerosol size on coating morphology, the incorporation of combustion CVD coatings into thermal barrier coatings (TBCs) and related oxidation research, and continued work on the deposition of zirconia-yttria coatings.

  4. Sample processing device and method

    DEFF Research Database (Denmark)

    2011-01-01

    A sample processing device is disclosed, which sample processing device comprises a first substrate and a second substrate, where the first substrate has a first surface comprising two area types, a first area type with a first contact angle with water and a second area type with a second contact angle with water, the first contact angle being smaller than the second contact angle. The first substrate defines an inlet system and a preparation system in areas of the first type, which two areas are separated by a barrier system in an area of the second type. The inlet system is adapted to receive...

  5. Processing method for radioactive sludge

    International Nuclear Information System (INIS)

    Shoji, Yuichi; Kaneko, Masaaki.

    1993-01-01

    The concentration of the radioactive sludges contained in a storage tank is controlled; thereafter, a filter is charged into a processing vessel and dewatering is conducted continuously. The radioactive sludges and an oxidizer are then mixed by stirring with a stirring impeller and by vibration with a vibrator. At the same time, thermal radiation is applied using infrared lamps to heat and decompose them. Since the thermal radiation from the infrared lamps reaches the centre of the radioactive sludges, the ion exchange resins are sufficiently decomposed and carbonized into inorganic material. A filling hardener such as mortar cement with good flowability is then charged to solidify the wastes. With such procedures, radioactive sludges can be stored in a stable condition for a long period of time by decomposing organic materials into inorganic materials and solidifying them. Further, the operator's radiation exposure can be remarkably reduced by performing the predetermined treatment and the stabilization treatment in the same processing vessel. (N.H.)

  6. METHOD OF ELECTRON BEAM PROCESSING

    DEFF Research Database (Denmark)

    2003-01-01

    As a rule, electron beam welding takes place in a vacuum. However, this means that the workpieces in question have to be placed in a vacuum chamber and have to be removed therefrom after welding. This is time-consuming and a serious limitation of a process whose greatest advantage is the option of welding workpieces of large thicknesses. Therefore the idea is to guide the electron beam (2) to the workpiece via a hollow wire, said wire thereby acting as a prolongation of the vacuum chamber (4) down to the workpiece. Thus, a workpiece need not be placed inside the vacuum chamber, thereby exploiting the potential of electron beam processing to a greater degree than previously possible, for example by means of electron beam welding...

  7. METHOD OF PROCESSING MONAZITE SAND

    Science.gov (United States)

    Welt, M.A.; Smutz, M.

    1958-08-26

    A process is described for recovering thorium, uranium, and rare earth values from monazite sand. The monazite sand is first digested with sulfuric acid and the resulting "monazite sulfate" solution is adjusted to a pH of between 0.4 and 3.0, and oxalate anions are added causing precipitation of the thorium and the rare earths as the oxalates. The oxalate precipitate is separated from the uranium-containing supernatant solution, and is dried and calcined to the oxides. The thorium and rare earth oxides are then dissolved in nitric acid and the solution is contacted with tributyl phosphate whereby an organic extract phase containing the cerium and thorium values is obtained, together with an aqueous raffinate containing the other rare earth values. The organic phase is then separated from the aqueous raffinate and the cerium and thorium are back extracted with an aqueous medium.

  8. Method of processing radioactive materials

    International Nuclear Information System (INIS)

    Kondo, Susumu; Moriya, Tetsuo; Ishibashi, Tadashi; Kariya, Masahiro.

    1986-01-01

    Purpose: To improve the contamination resistance, water resistance, adhesion and stretchability of strippable paints coated on substrates liable to be contaminated with radioactive materials. Method: Strippable paints are coated in advance on substrates that may become contaminated with radioactive materials. After contamination, the coated films are stripped off and removed. Alternatively, the strippable paints may be coated on already contaminated substrates and, after drying, stripped off and removed. The strippable paints used have a composition comprising, as the main ingredient, a styrene-butadiene block copolymer containing from 60 to 80 wt% styrene, blended with from 0.3 to 5 wt% of a higher alkylamine compound having 12 to 18 carbon atoms. (Ikeda, J.)

  9. Method of processing radioactive wastes

    International Nuclear Information System (INIS)

    Takahashi, Toshihiko; Maruko, Morihisa; Takamura, Yoshiyuki.

    1981-01-01

    Purpose: To effectively separate radioactive claddings from slurries of spent ion exchange resins containing radioactive claddings. Method: Spent ion exchange resins carrying radioactive claddings (fine particles of iron oxides or hydroxides to which radioactive cobalt adheres) are introduced into a clad separation tank. Sulfuric acid or sodium hydroxide is added to the separation tank to adjust the pH to 3 - 6. Sodium lauryl sulfate is then added to capture the claddings, and air is blown from an air supply nozzle to generate bubbles. The claddings detach from the ion exchange resins and adhere to the air bubbles. The bubbles carrying the claddings float to the surface of the liquid wastes and are then forced out of the separation tank. (Ikeda, J.)

  10. Method of processing radioactive wastes

    International Nuclear Information System (INIS)

    Funabashi, Kiyomi; Sugimoto, Yoshikazu; Kikuchi, Makoto; Yusa, Hideo.

    1979-01-01

    Purpose: To obtain solidified radioactive wastes with a high packing density by packing radioactive waste pellets in a container and then filling it with a thermosetting resin and curing it. Method: Radioactive liquid wastes are dried into powder and subjected to compression molding. The pellets thus obtained are supplied in a predetermined amount from the hopper into a drum can. Then, a thermosetting plastic and a curing agent are filled into the drum can. Gas between the pellets is completely expelled as the thermosetting resin and the curing agent intrude among the pellets. Thereafter, the drum can is heated by a heater and curing is effected. After curing, the drum can is sealed. (Kawakami, Y.)

  11. Method of processing radioactive gas

    International Nuclear Information System (INIS)

    Saito, Masayuki.

    1978-01-01

    Purpose: To reduce the quantity of radioactive gas discharged at the time of starting a nuclear power plant. Method: After the shutdown of a nuclear power plant, air containing radioactive gas is extracted from the main condenser by operating an air extractor. The air is sent to a gaseous waste disposal device and then introduced into the activated carbon adsorption tower of a rare gas holdup device, where xenon and krypton are trapped. Thereafter, the air passes through pipelines and is returned to the main condenser. In this manner, the radioactive gas contained in the air within the main condenser is removed while the operation of the nuclear power plant is stopped. After the plant has been started and enters normal operation, a flow control valve is closed and another valve is opened, and the purified gas exhausted from the rare gas holdup device is discharged into the atmosphere through an exhaust stack. (Aizawa, K.)

  12. Method of processing chloride waste

    International Nuclear Information System (INIS)

    Tokiwai, Moriyasu; Tsunashima, Mikiyasu; Horie, Masaaki; Koyama, Masafumi; Sudo, Minoru; Kitagawa, Masatoshi; Ogasawara, Tadashi.

    1991-01-01

    In a method of applying molten salt electrolysis to chloride wastes that are discharged from the electrolytic refining step of a dry reprocessing process for spent fuels and from which long-lived transuranium elements have been removed, metals capable of forming alloys with alkali and alkaline earth metals during electrolysis are used as the cathode material, and the electrolysis temperature is set higher than the melting point of the salts in the molten salt electrolysis bath, so that Li, Ca and Na are recovered as alloys with the cathode material in a first electrolysis step. Then, using the same cathode material, the electrolysis temperature is set higher than the melting point of the chloride salts remaining in the bath after the first step, so that the fission products Ba, Rb, Sr and Cs are also recovered as alloys with the cathode material in a second electrolysis step. Accordingly, the amount of wastes formed can be reduced, and the wastes contain no heat-generating fission elements. (T.M.)

  13. Method of processing waste sodium

    International Nuclear Information System (INIS)

    Shimoyashiki, Shigehiro; Takahashi, Kazuo.

    1982-01-01

    Purpose: To enable safe storage of waste sodium in the form of intermetallic compounds. Method: Waste sodium used in a reactor is mixed with molten metal under an inert gas atmosphere, and the resulting intermetallic compounds are stored in a tightly sealed container, enabling quasi-permanent safe storage as inert compounds. Waste sodium, particularly waste sodium from the primary system containing radioactive substances, is charged into a waste sodium melting tank having a heater on its side; the tank is evacuated by a vacuum pump, then sealed with argon gas supplied from an argon gas tank, and the waste sodium is melted under heating. The temperature and the liquid level are measured by a thermometer and a level meter, respectively. Meanwhile, molten metals such as Sn, Pb and Zn, having melting points above 300 deg C, are charged into a metal melting tank and heated by a heater. The molten sodium and the molten metals are charged into a mixing tank and mixed by stirring with an induction-type agitator. Sodium vapours in the tank are collected by traps. The air in the tank is replaced with argon gas. The molten mixture is tightly sealed in a drum can and cooled to solidify for safe storage. (Seki, T.)

  14. Method of processing radioactive wastes

    International Nuclear Information System (INIS)

    Matagi, Yoshihiko; Takahara, Akira; Ootsuka, Katsuyuki.

    1984-01-01

    Purpose: To avoid a reduction in the insulating property of the atmosphere by preventing the generation of CO2, H2O, etc. upon microwave heating. Method: Radioactive wastes are charged into a hopper, supplied on a conveyor, fed in predetermined amounts to a microwave furnace and heated by microwaves applied from a waveguide. Simultaneously, inert gases are supplied from a supply line. The radioactive wastes being treated are shielded by the inert gases to prevent combustion of the decomposition gases produced from the wastes during microwave heating, thereby preventing the generation of CO2, H2, etc.; the decomposition gases generated are also diluted with the inert gases, which decreases their dissociation and prevents a reduction in the insulating property of the atmosphere. Since the spent inert gases can be recovered for reuse, the amount of gaseous wastes released to the atmosphere can be decreased and the working life of the high-performance air filters can be extended. (Sekiya, K.)

  15. Method of processing laundry drain

    Energy Technology Data Exchange (ETDEWEB)

    Sugimoto, Y; Oda, A; Yusa, H; Kitamura, M; Horiuchi, S

    1979-09-28

    Purpose: To subject the laundry drain to a flocculation-precipitation treatment prior to reverse osmosis, and to subject only the supernatant to reverse osmosis and the precipitate directly to concentration, thereby decreasing the frequency of replacement of the reverse osmosis membranes and reducing the quantity of water to be purified. Method: The laundry drain is supplied to a flocculation-precipitation tank and mixed with a flocculant and a neutralizing agent, thus being subjected to a flocculation-precipitation treatment. The supernatant is transported to a circulation tank through a transportation pipe and is subjected to reverse osmosis in a reverse osmosis module via the circulation tank, a filter and a high-pressure pump, and then returned to the circulation tank. The supernatant is thus concentrated to a predetermined concentration by repeating this operation. Meanwhile, the precipitate at the bottom of the flocculation-precipitation tank is supplied through the transportation pipe to an evaporator feed tank together with the concentrate from the drain circulation tank, and is evaporated and concentrated in the evaporator.

  16. Processing module operating methods, processing modules, and communications systems

    Science.gov (United States)

    McCown, Steven Harvey; Derr, Kurt W.; Moore, Troy

    2014-09-09

    A processing module operating method includes using a processing module physically connected to a wireless communications device, requesting that the wireless communications device retrieve encrypted code from a web site and receiving the encrypted code from the wireless communications device. The wireless communications device is unable to decrypt the encrypted code. The method further includes using the processing module, decrypting the encrypted code, executing the decrypted code, and preventing the wireless communications device from accessing the decrypted code. Another processing module operating method includes using a processing module physically connected to a host device, executing an application within the processing module, allowing the application to exchange user interaction data communicated using a user interface of the host device with the host device, and allowing the application to use the host device as a communications device for exchanging information with a remote device distinct from the host device.
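
    A toy Python sketch of the first claim's flow may help: the host merely relays ciphertext it cannot read, while the key, the decryption and the execution all stay inside the module. The use of the third-party cryptography package (Fernet) and the in-process execution are illustrative assumptions, not part of the patent; the network fetch by the wireless device is omitted.

```python
from cryptography.fernet import Fernet

class ProcessingModule:
    """Stand-in for the physically attached processing module."""

    def __init__(self):
        self._key = Fernet(Fernet.generate_key())   # key never leaves the module

    def encrypt_for_module(self, code: bytes) -> bytes:
        # Demo-only helper playing the role of the web site that serves encrypted code.
        return self._key.encrypt(code)

    def run_encrypted(self, encrypted_code: bytes) -> None:
        decrypted = self._key.decrypt(encrypted_code)   # host never sees the plaintext
        exec(decrypted.decode(), {})                    # executed inside the module only

# The "wireless communications device" only relays opaque bytes it cannot decrypt.
module = ProcessingModule()
blob = module.encrypt_for_module(b"print('running inside the processing module')")
module.run_encrypted(blob)
```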

  17. BUSINESS PROCESS REENGINEERING AS THE METHOD OF PROCESS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    O. Honcharova

    2013-09-01

    The article is devoted to the analysis of the process management approach. The main understandings of the process management approach are researched, and definitions of process and process management are given. The methods of business process improvement are also analysed, among them fast analysis solution technology (FAST), benchmarking, reprojecting and reengineering. The main results of applying business process improvement are described in terms of reduced cycle time, costs and errors. The tasks and main stages of business process reengineering are outlined, and the main efficiency results of business process reengineering and its success factors are determined.

  18. Digital image processing mathematical and computational methods

    CERN Document Server

    Blackledge, J M

    2005-01-01

    This authoritative text (the second part of a complete MSc course) provides mathematical methods required to describe images, image formation and different imaging systems, coupled with the principle techniques used for processing digital images. It is based on a course for postgraduates reading physics, electronic engineering, telecommunications engineering, information technology and computer science. This book relates the methods of processing and interpreting digital images to the 'physics' of imaging systems. Case studies reinforce the methods discussed, with examples of current research

  19. Digital signal processing with kernel methods

    CERN Document Server

    Rojo-Alvarez, José Luis; Muñoz-Marí, Jordi; Camps-Valls, Gustavo

    2018-01-01

    A realistic and comprehensive review of joint approaches to machine learning and signal processing algorithms, with application to communications, multimedia, and biomedical engineering systems Digital Signal Processing with Kernel Methods reviews the milestones in the mixing of classical digital signal processing models and advanced kernel machines statistical learning tools. It explains the fundamental concepts from both fields of machine learning and signal processing so that readers can quickly get up to speed in order to begin developing the concepts and application software in their own research. Digital Signal Processing with Kernel Methods provides a comprehensive overview of kernel methods in signal processing, without restriction to any application field. It also offers example applications and detailed benchmarking experiments with real and synthetic datasets throughout. Readers can find further worked examples with Matlab source code on a website developed by the authors. * Presents the necess...

  20. Comparative analysis of accelerogram processing methods

    International Nuclear Information System (INIS)

    Goula, X.; Mohammadioun, B.

    1986-01-01

    The work described hereinafter is a brief account of an ongoing research project concerning high-quality processing of strong-motion recordings of earthquakes. Several processing procedures have been tested, applied to synthetic signals simulating ground motion designed for this purpose. The correction methods operating in the time domain are seen to be strongly dependent upon the sampling rate. Two methods of low-frequency filtering followed by an integration of accelerations yielded satisfactory results.
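
    As an illustration of the second class of procedures mentioned (low-frequency filtering followed by integration of the accelerations), here is a minimal sketch; the corner frequency, filter order and synthetic drifting record are assumptions, not the values tested in the study.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def velocity_from_acceleration(acc, fs, f_corner=0.1, order=4):
    """High-pass filter the accelerogram to suppress low-frequency drift, then
    integrate once with the trapezoidal rule to obtain velocity."""
    b, a = butter(order, f_corner, btype="highpass", fs=fs)
    acc_f = filtfilt(b, a, acc)
    dt = 1.0 / fs
    return np.concatenate(([0.0], np.cumsum(0.5 * (acc_f[1:] + acc_f[:-1]) * dt)))

# usage sketch: synthetic ground motion with an artificial baseline drift
fs = 200.0
t = np.arange(0.0, 20.0, 1.0 / fs)
acc = np.sin(2 * np.pi * 2.0 * t) * np.exp(-0.2 * t) + 0.02 * t   # drift term
print(velocity_from_acceleration(acc, fs).max())
```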

  1. Method for processing spent nuclear reactor fuel

    International Nuclear Information System (INIS)

    Levenson, M.; Zebroski, E.L.

    1981-01-01

    A method and apparatus are claimed for processing spent nuclear reactor fuel wherein plutonium is continuously contaminated with radioactive fission products and diluted with uranium. Plutonium of sufficient purity to fabricate nuclear weapons cannot be produced by the process or in the disclosed reprocessing plant. Diversion of plutonium is prevented by radiation hazards and ease of detection

  2. Signal processing methods for MFE plasma diagnostics

    International Nuclear Information System (INIS)

    Candy, J.V.; Casper, T.; Kane, R.

    1985-02-01

    The application of various signal processing methods to extract energy storage information from plasma diamagnetism sensors during physics experiments on the Tandem Mirror Experiment-Upgrade (TMX-U) is discussed. We show how these processing techniques can be used to decrease the uncertainty in the corresponding sensor measurements. The algorithms suggested are implemented using SIG, an interactive signal processing package developed at LLNL.

  3. Method and apparatus for processing algae

    Science.gov (United States)

    Chew, Geoffrey; Reich, Alton J.; Dykes, Jr., H. Waite; Di Salvo, Roberto

    2012-07-03

    Methods and apparatus for processing algae are described in which a hydrophilic ionic liquid is used to lyse algae cells. The lysate separates into at least two layers including a lipid-containing hydrophobic layer and an ionic liquid-containing hydrophilic layer. A salt or salt solution may be used to remove water from the ionic liquid-containing layer before the ionic liquid is reused. The used salt may also be dried and/or concentrated and reused. The method can operate at relatively low lysis, processing, and recycling temperatures, which minimizes the environmental impact of algae processing while providing reusable biofuels and other useful products.

  4. Geophysical methods for monitoring soil stabilization processes

    Science.gov (United States)

    Saneiyan, Sina; Ntarlagiannis, Dimitrios; Werkema, D. Dale; Ustra, Andréa

    2018-01-01

    Soil stabilization involves methods used to turn unconsolidated and unstable soil into a stiffer, consolidated medium that could support engineered structures, alter permeability, change subsurface flow, or immobilize contamination through mineral precipitation. Among the variety of available methods, carbonate precipitation is a very promising one, especially when it is induced through common soil-borne microbes (MICP - microbially induced carbonate precipitation). Such microbially mediated precipitation has the added benefit of not harming the environment, whereas other methods can be environmentally detrimental. Carbonate precipitation, typically in the form of calcite, is a naturally occurring process that can be manipulated to deliver the expected soil strengthening results or permeability changes. This study investigates the ability of spectral induced polarization (SIP) and shear-wave velocity to monitor calcite-driven soil strengthening processes. The results support the use of these geophysical methods as soil strengthening characterization and long-term monitoring tools, which is a requirement for viable soil stabilization projects. Both tested methods are sensitive to calcite precipitation, with SIP offering additional information related to the long-term stability of the precipitated carbonate. Carbonate precipitation has been confirmed with direct methods, such as direct sampling and scanning electron microscopy (SEM). This study advances our understanding of soil strengthening processes and permeability alterations, and is a crucial step toward the use of geophysical methods as monitoring tools in microbially induced soil alteration through carbonate precipitation.

  5. Study on Processing Method of Image Shadow

    Directory of Open Access Journals (Sweden)

    Wang Bo

    2014-07-01

    In order to effectively remove the disturbance caused by shadows and enhance the robustness of computer-vision image processing, this paper studies the detection and removal of image shadows. It examines continuous shadow-removal algorithms based on integration, the illumination surface and texture, introduces their working principles and implementation methods, and shows experimentally that they can process shadows effectively.

  6. The statistical process control methods - SPC

    Directory of Open Access Journals (Sweden)

    Floreková Ľubica

    1998-03-01

    Methods of statistical evaluation of quality - SPC (item 20 of the documentation system of quality control of the ISO 9000 series of norms) for various processes, products and services belong among the basic quality methods that enable us to analyse and compare data pertaining to various quantitative parameters. They also enable us, based on the latter, to propose suitable interventions with the aim of improving these processes, products and services. The theoretical basis and applicability of the principles of cause-and-effect diagnostics, Pareto analysis and the Lorenz curve, frequency distributions and frequency curves of random variables, and Shewhart control charts are presented in the contribution.
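
    As a concrete example of one of the tools listed above, the sketch below computes Shewhart X-bar chart limits from subgroup data (the classical centre line with 3-sigma limits; the c4 bias correction of the pooled standard deviation is omitted for brevity, and the data are synthetic):

```python
import numpy as np

def xbar_chart_limits(subgroups):
    """Shewhart X-bar chart: centre line and approximate 3-sigma control limits
    estimated from the average within-subgroup standard deviation."""
    subgroups = np.asarray(subgroups, dtype=float)
    n = subgroups.shape[1]                              # subgroup size
    centre = subgroups.mean(axis=1).mean()
    sigma_within = subgroups.std(axis=1, ddof=1).mean()
    half_width = 3.0 * sigma_within / np.sqrt(n)
    return centre - half_width, centre, centre + half_width

# usage sketch: 20 subgroups of 5 measurements
rng = np.random.default_rng(4)
data = rng.normal(10.0, 0.2, size=(20, 5))
lcl, cl, ucl = xbar_chart_limits(data)
print(f"LCL={lcl:.3f}  CL={cl:.3f}  UCL={ucl:.3f}")
```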

  7. Three-dimensional image signals: processing methods

    Science.gov (United States)

    Schiopu, Paul; Manea, Adrian; Craciun, Anca-Ileana; Craciun, Alexandru

    2010-11-01

    Over the years, extensive studies have been carried out to apply coherent optics methods to real-time processing, communications and image transmission. This is especially true when a large amount of information needs to be processed, e.g., in high-resolution imaging. The recent progress in data-processing networks and communication systems has considerably increased the capacity of information exchange. We describe the results of a literature survey of processing methods for three-dimensional image signals. All commercially available 3D technologies today are based on stereoscopic viewing. 3D technology was once the exclusive domain of skilled computer-graphics developers with high-end machines and software. Images captured by an advanced 3D digital camera can be displayed on the screen of a 3D digital viewer with or without special glasses. Considerable processing power and memory are needed to create and render the complex mix of colors, textures, and virtual lighting and perspective necessary to make figures appear three-dimensional. Also, using a standard digital camera and a technique called phase-shift interferometry, we can capture "digital holograms": holograms that can be stored on a computer and transmitted over conventional networks. We present methods for processing digital holograms for Internet transmission, together with results.
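
    The phase-shift interferometry mentioned for capturing digital holograms is commonly the four-step variant; for reference, the standard relations are (generic formulation, not specific to this paper):

```latex
I_k(x,y) = A(x,y) + B(x,y)\cos\!\big(\varphi(x,y) + (k-1)\tfrac{\pi}{2}\big), \qquad k = 1,\dots,4

\varphi(x,y) = \arctan\frac{I_4(x,y) - I_2(x,y)}{I_1(x,y) - I_3(x,y)}
```

    The recovered phase map, together with the amplitude, defines the digital hologram that can then be stored and transmitted numerically.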

  8. Parametric methods for spatial point processes

    DEFF Research Database (Denmark)

    Møller, Jesper

    ... is studied in Section 4, and Bayesian inference in Section 5. On one hand, as the development in computer technology and computational statistics continues, computationally intensive simulation-based methods for likelihood inference will probably play an increasing role in the statistical analysis of spatial ... inference procedures for parametric spatial point process models. The widespread use of sensible but ad hoc methods based on summary statistics of the kind studied in Chapter 4.3 has over the last two decades been supplemented by likelihood-based methods for parametric spatial point process models ... (This text is submitted for the volume 'A Handbook of Spatial Statistics', edited by A.E. Gelfand, P. Diggle, M. Fuentes, and P. Guttorp, to be published by Chapman and Hall/CRC Press, and planned to appear as Chapter 4.4 with the title 'Parametric methods'.) 1 Introduction: This chapter considers ...
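
    For context, the likelihood-based inference referred to here starts, in the simplest (inhomogeneous Poisson process) case, from the log-likelihood below (standard result, our notation); Gibbs and Cox models require the simulation-based approximations the chapter goes on to discuss.

```latex
\ell(\theta) \;=\; \sum_{x \in \mathbf{x}} \log \lambda_\theta(x) \;-\; \int_W \lambda_\theta(u)\,\mathrm{d}u
```

    Here x is the observed point pattern in the window W and lambda_theta is the parametric intensity function.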

  9. An Automated Processing Method for Agglomeration Areas

    Directory of Open Access Journals (Sweden)

    Chengming Li

    2018-05-01

    Agglomeration operations are a core component of the automated generalization of aggregated area groups. However, because geographical elements that possess agglomeration features are relatively scarce, the current literature has not given sufficient attention to agglomeration operations. Furthermore, most reports on the subject are limited to the general conceptual level. Consequently, current agglomeration methods are highly reliant on subjective determinations and cannot support intelligent computer processing. This paper proposes an automated processing method for agglomeration areas. Firstly, the proposed method automatically identifies agglomeration areas based on the width of the striped bridging area, a distribution pattern index (DPI), a shape similarity index (SSI), and an overlap index (OI). Next, the progressive agglomeration operation is carried out, including the computation of the external boundary outlines and the extraction of agglomeration lines. The effectiveness and rationality of the proposed method have been validated using actual geographical-conditions census data from Jiangsu Province, China.

  10. Finite Element Method in Machining Processes

    CERN Document Server

    Markopoulos, Angelos P

    2013-01-01

    Finite Element Method in Machining Processes provides a concise study on the way the Finite Element Method (FEM) is used in the case of manufacturing processes, primarily in machining. The basics of this kind of modeling are detailed to create a reference that will provide guidelines for those who start to study this method now, but also for scientists already involved in FEM who want to expand their research. A discussion on FEM, formulations and techniques currently in use is followed up by machining case studies. Orthogonal cutting, oblique cutting, 3D simulations for turning and milling, grinding, and state-of-the-art topics such as high speed machining and micromachining are explained with relevant examples. This is all supported by a literature review and a reference list for further study. As FEM is a key method for researchers in the manufacturing and especially in the machining sector, Finite Element Method in Machining Processes is a key reference for students studying manufacturing processes but al...

  11. Mathematical methods for diffusion MRI processing

    International Nuclear Information System (INIS)

    Lenglet, C.; Lenglet, C.; Sapiro, G.; Campbell, J.S.W.; Pike, G.B.; Campbell, J.S.W.; Siddiqi, K.; Descoteaux, M.; Haro, G.; Wassermann, D.; Deriche, R.; Wassermann, D.; Anwander, A.; Thompson, P.M.

    2009-01-01

    In this article, we review recent mathematical models and computational methods for the processing of diffusion Magnetic Resonance Images, including state-of-the-art reconstruction of diffusion models, cerebral white matter connectivity analysis, and segmentation techniques. We focus on Diffusion Tensor Images (DTI) and Q-Ball Images (QBI). (authors)
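
As a concrete illustration of the DTI reconstruction this kind of review covers, the following is a minimal log-linear least-squares tensor fit; the gradient table, b-value and signals are synthetic placeholders, not data or code from the article.

```python
# Hedged sketch: standard log-linear least-squares DTI fit on made-up data.
import numpy as np

def fit_tensor(signals, S0, bvecs, bval):
    """Solve log(S/S0) = -b * g^T D g for the 6 unique tensor elements."""
    g = np.asarray(bvecs, dtype=float)          # shape (N, 3), unit vectors
    # Design matrix rows: [gx^2, gy^2, gz^2, 2gxgy, 2gxgz, 2gygz]
    X = np.column_stack([g[:, 0]**2, g[:, 1]**2, g[:, 2]**2,
                         2*g[:, 0]*g[:, 1], 2*g[:, 0]*g[:, 2], 2*g[:, 1]*g[:, 2]])
    y = -np.log(np.asarray(signals, dtype=float) / S0) / bval
    d, *_ = np.linalg.lstsq(X, y, rcond=None)
    D = np.array([[d[0], d[3], d[4]],
                  [d[3], d[1], d[5]],
                  [d[4], d[5], d[2]]])
    evals = np.linalg.eigvalsh(D)
    fa = np.sqrt(1.5 * np.sum((evals - evals.mean())**2) / np.sum(evals**2))
    return D, fa

# Tiny check with synthetic signals from an isotropic tensor
bvecs = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1],
                  [1, 1, 0], [1, 0, 1], [0, 1, 1]], dtype=float)
bvecs /= np.linalg.norm(bvecs, axis=1, keepdims=True)
D_true = 0.7e-3 * np.eye(3)                     # mm^2/s, roughly brain tissue
signals = 1000 * np.exp(-1000 * np.einsum('ij,jk,ik->i', bvecs, D_true, bvecs))
D_fit, fa = fit_tensor(signals, S0=1000, bvecs=bvecs, bval=1000)
print(np.round(D_fit, 4), round(float(fa), 3))
```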

  12. SKOCh modified parameters and data processing method

    International Nuclear Information System (INIS)

    Abramov, V.V.; Baldin, B.Yu.; Vasil'chenko, V.G.

    1986-01-01

    Characteristics of a modified Cherenkov radiation ring spectrometer variant (SKOCH) are presented. Methods of experimental data processing are described. Different SKOCH optics variants are investigated. Multi-particle registering electronic equipment for data read-out from SKOCH, providing for the improvement of multiparticle occurrence registration conditions, is applied in the course of measurements using proton beams. A system of SKOCH spectrometer data processing programs has been developed and experimentally tested. An effective algorithm for calibrating Cherenkov radiation ring spectrometers with quite a large angular and radial aperture has been developed. The on-line and off-line processing program complex provides for complete control of SKOCH operation during statistics collection and for particle (π, K, P) identification within the 5.5-30 GeV/c range.

  13. Methods of process management in radiology

    International Nuclear Information System (INIS)

    Teichgraeber, U.K.M.; Gillessen, C.; Neumann, F.

    2003-01-01

    The main emphasis in health care has been on quality and availability but increasing cost pressure has made cost efficiency ever more relevant for nurses, technicians, and physicians. Within a hospital, the radiologist considerably influences the patient's length of stay through the availability of service and diagnostic information. Therefore, coordinating and timing radiologic examinations become increasingly more important. Physicians are not taught organizational management during their medical education and residency training, and the necessary expertise in economics is generally acquired through the literature or specialized courses. Beyond the medical service, the physicians are increasingly required to optimize their work flow according to economic factors. This review introduces various tools for process management and its application in radiology. By means of simple paper-based methods, the work flow of most processes can be analyzed. For more complex work flow, it is suggested to choose a method that allows for an exact qualitative and quantitative prediction of the effect of variations. This review introduces network planning technique and process simulation. (orig.) [de

  14. Advances in Packaging Methods, Processes and Systems

    Directory of Open Access Journals (Sweden)

    Nitaigour Premchand Mahalik

    2014-10-01

    Full Text Available The food processing and packaging industry is becoming a multi-trillion dollar global business. The reason is that the recent increase in incomes in traditionally less economically developed countries has led to a rise in standards of living that includes a significantly higher consumption of packaged foods. As a result, food safety guidelines have been more stringent than ever. At the same time, the number of research and educational institutions—that is, the number of potential researchers and stakeholders—has increased in the recent past. This paper reviews recent developments in food processing and packaging (FPP, keeping in view the aforementioned advancements and bearing in mind that FPP is an interdisciplinary area in that materials, safety, systems, regulation, and supply chains play vital roles. In particular, the review covers processing and packaging principles, standards, interfaces, techniques, methods, and state-of-the-art technologies that are currently in use or in development. Recent advances such as smart packaging, non-destructive inspection methods, printing techniques, application of robotics and machineries, automation architecture, software systems and interfaces are reviewed.

  15. Operating cost budgeting methods: quantitative methods to improve the process

    Directory of Open Access Journals (Sweden)

    José Olegário Rodrigues da Silva

    Full Text Available Abstract Operating cost forecasts are used in economic feasibility studies of projects and in the budgeting process. Studies have pointed out that some companies are not satisfied with the budgeting process and chief executive officers want updates more frequently. In these cases, the main problem lies in the costs versus benefits. Companies seek simple and cheap forecasting methods without, at the same time, conceding in terms of the quality of the resulting information. This study aims to compare operating cost forecasting models to identify the ones that are relatively easy to implement and produce smaller deviations. For this purpose, we applied ARIMA (autoregressive integrated moving average) and distributed dynamic lag models to data from a Brazilian petroleum company. The results suggest that the models have potential application, and that the multivariate models fitted better and proved to be a better way to forecast costs than the univariate models.
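
A minimal sketch of the ARIMA side of the comparison described above; the monthly cost series is simulated and the (1, 1, 1) order is an illustrative choice, not the model selected in the study.

```python
# Hedged sketch: fit an ARIMA model to a synthetic operating-cost series and forecast.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
costs = 100 + np.cumsum(rng.normal(1.0, 2.0, size=60))   # synthetic monthly costs

model = ARIMA(costs, order=(1, 1, 1)).fit()
forecast = model.forecast(steps=6)                        # next six months
print(forecast)
```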

  16. Method of controlling radioactive waste processing systems

    International Nuclear Information System (INIS)

    Mikawa, Hiroji; Sato, Takao.

    1981-01-01

    Purpose: To minimize the pellet production amount, maximize the working life of the solidifying device and maintain the mechanical strength of pellets at a predetermined value, irrespective of the type and the cycle of occurrence of the secondary waste, in the secondary waste solidifying device of radioactive waste processing systems in nuclear power plants. Method: Forecasting periods for the type, production amount and radioactivity level of the secondary wastes are determined in input/output devices connected to a control system, and the resulting signals are sent to computing elements. The computing elements forecast the production amount of regenerated liquid wastes after a predetermined number of days based on the running conditions of a condensate desalter, and the production amounts of filter sludges and liquid resin wastes after a predetermined number of days based on the liquid waste processing amount or the like in a processing device, respectively. Then, the mass balance between the type and the amount of the secondary wastes presently stored in a tank is calculated, and the composition and concentration of the processing liquid are set so as to obtain predetermined values for the strength of the pellets that can be dried and solidified, the working life of the solidifying device itself and the radioactivity level of the pellets. Thereafter, the running conditions for the solidifying device are determined so as to maximize its working life. (Horiuchi, T.)

  17. Method of processing radioactive liquid wastes

    International Nuclear Information System (INIS)

    Kurumada, Norimitsu; Shibata, Setsuo; Wakabayashi, Toshikatsu; Kuribayashi, Hiroshi.

    1984-01-01

    Purpose: To facilitate the processing of liquid wastes containing insoluble salts of boric acid and calcium in a process for solidifying, with volume reduction, radioactive liquid wastes containing boron. Method: A soluble calcium compound (such as calcium hydroxide, calcium oxide or calcium nitrate) is added to liquid wastes whose pH value has been adjusted to neutral or alkaline, such that the molar ratio of calcium to boron in the liquid wastes is at least 0.2. Then, they are agitated at a temperature between 40 and 70°C to form insoluble calcium salts containing boron. Thereafter, the liquid is maintained at a temperature below the above-mentioned forming temperature to age the products, and the liquid is then evaporated into a liquid concentrate containing 30 - 80% by weight of solid components. The concentrated liquid is mixed with cement to solidify. (Ikeda, J.)

  18. OPTIMAL SIGNAL PROCESSING METHODS IN GPR

    Directory of Open Access Journals (Sweden)

    Saeid Karamzadeh

    2014-01-01

    Full Text Available Over the past three decades, Ground Penetrating Radar (GPR) has found a wide variety of real-life applications. This radar faces important challenges in civil applications as well as in military applications. In this paper, the fundamentals of GPR systems will be covered and three important signal processing methods (Wavelet Transform, Matched Filter and Hilbert-Huang) will be compared to each other in order to obtain the most accurate information about objects in the subsurface or behind a wall.
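
Of the three methods named above, the matched filter is the simplest to sketch. The snippet below correlates a synthetic GPR trace with a reference wavelet; the pulse shape, sampling rate and noise level are assumptions for illustration only.

```python
# Hedged sketch: matched filtering of a synthetic GPR trace.
import numpy as np

fs = 1.0e9                                     # 1 GHz sampling rate (assumed)
t = np.arange(0, 1e-6, 1 / fs)                 # 1000-sample trace
pulse = np.exp(-((t - 5e-9) / 2e-9) ** 2) * np.cos(2 * np.pi * 2e8 * (t - 5e-9))
pulse = pulse[:50]                             # short reference wavelet

rng = np.random.default_rng(1)
trace = 0.1 * rng.normal(size=t.size)          # background noise
trace[300:300 + pulse.size] += pulse           # echo from a buried object

# Matched filtering: cross-correlate the trace with the reference wavelet
corr = np.correlate(trace, pulse, mode="full")
lag = int(np.argmax(np.abs(corr))) - (pulse.size - 1)
print("echo detected near sample", lag)        # expect ~300
```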

  19. Method of processing radioactive rare gas

    International Nuclear Information System (INIS)

    Tagusagawa, Atsushi; Tuda, Kazuaki.

    1988-01-01

    Purpose: To obtain a safe processing method without using mechanical pumps or pressure-proof containers and, accordingly, with no risk of leakage of the radioactive rare gas. Method: A container filled with zeolite is inserted, with its cover open, into an autoclave. Meanwhile, krypton-containing gases are supplied to an adsorption tower filled with adsorbents, cooled and adsorbed, and then heated to desorb the adsorbed krypton. The krypton-containing gases are introduced into the autoclave by the pressure difference, causing krypton to be adsorbed on the zeolite at ambient temperature. Then, the inside of the autoclave is heated to desorb krypton and adsorbed moisture from the zeolite, and the pressure is elevated. After sending the gases under pressure to the adsorption tower, the zeolite-filled container is taken out of the autoclave, tightly closed and then transferred to a predetermined site. (Takahashi, M.)

  20. Methods of control the machining process

    Directory of Open Access Journals (Sweden)

    Yu.V. Petrakov

    2017-12-01

    Full Text Available The paper presents control methods differentiated by the time at which the information used is obtained: a priori, a posteriori and current. When a priori information is used, the cutting mode is determined by simulating the removal of the cutting allowance, with the shapes of the workpiece and the part represented as wireframes. Control based on current information calls for an adaptive control system and modernization of the CNC machine, with the control input computed using established optimization software. For control based on a posteriori information, a method is proposed for correcting the shape-generating trajectory on the second pass using measurements of the workpiece surface formed by the first pass. Programs have been developed that automatically generate the adjusted file for machining.

  1. Processing method of radioactive metal wastes

    International Nuclear Information System (INIS)

    Uetake, Naoto; Urata, Megumu; Sato, Masao.

    1985-01-01

    Purpose: To reduce the volume and increase the density of radioactive metal wastes easily, while preventing the scattering of radioactivity, and to process them into a form suitable for storage and treatment. Method: Metal wastes mainly composed of zirconium are discharged from nuclear power plants or fuel reprocessing plants, and these metals, such as zirconium and titanium, react vigorously with hydrogen and rapidly form hydrides as it diffuses into them. Since the hydrides are extremely brittle and can be pulverized easily, the wastes can be volume-reduced. However, since metal hydrides have no ductility, dehydrogenation is applied for molding fabrication in view of the subsequent storage and processing. The dehydrogenation is as easy as the hydrogenation, and fine metal pieces can be molded in a small compression device. For the dehydrogenation, the temperature is slightly higher than in the hydrogenation, the pressure is reduced through a vacuum evacuation system, and the removed hydrogen is purified for reuse. The upper limit for the temperature of the hydrogenation is 680°C in order to prevent the scattering of radioactivity. (Kamimura, M.)

  2. Method of processing spent ion exchange resins

    International Nuclear Information System (INIS)

    Mori, Kazuhide; Tamada, Shin; Kikuchi, Makoto; Matsuda, Masami; Aoyama, Yoshiyuki.

    1985-01-01

    Purpose: To decrease the amount of radioactive spent ion exchange resins generated from nuclear power plants, etc. and to process them into stable inorganic compounds through heat decomposition. Method: Spent ion exchange resins are heat-decomposed in an inert atmosphere to selectively decompose only the ion exchange groups in the preceding step, while the high molecular skeletons are completely heat-decomposed in an oxidizing atmosphere in the succeeding step. In this way, gaseous sulfur oxides and nitrogen oxides are generated in the preceding step, while gaseous carbon dioxide and hydrogen, requiring no discharge gas processing, are generated in the succeeding step. Accordingly, the amount of discharged gases requiring processing can be significantly reduced, and the residues can be converted into stable inorganic compounds. Further, if transition metals are ionically adsorbed as a catalyst on the ion exchange resins, the ion exchange groups are decomposed at 130 - 300°C, while the high molecular skeletons are thermally decomposed at 240 - 300°C. Thus, the temperature for the heat decomposition can be lowered to prevent the degradation of the reactor materials. (Kawakami, Y.)

  3. Method of noncontacting ultrasonic process monitoring

    Science.gov (United States)

    Garcia, Gabriel V.; Walter, John B.; Telschow, Kenneth L.

    1992-01-01

    A method of monitoring a material during processing comprising the steps of (a) shining a detection light on the surface of a material; (b) generating ultrasonic waves at the surface of the material to cause a change in frequency of the detection light; (c) detecting a change in the frequency of the detection light at the surface of the material; (d) detecting said ultrasonic waves at the surface point of detection of the material; (e) measuring a change in the time elapsed from generating the ultrasonic waves at the surface of the material and their return to the surface point of detection of the material, to determine the transit time; and (f) comparing the transit time to predetermined values to determine properties such as density and the elastic quality of the material.

  4. Method of processing radioactive metal wastes

    International Nuclear Information System (INIS)

    Inoue, Yoichi; Kitagawa, Kazuo; Tsuzura, Katsuhiko.

    1980-01-01

    Purpose: To enable long-term, safe storage of radioactive metal wastes, such as used fuel cans after reprocessing, or used pipes, instruments and the like contaminated with various radioactive substances, by compacting and solidifying them. Method: Metal wastes such as used fuel cans, which have been cut short and reprocessed, are pressed into generally hexagonal blocks. The block is charged into a capsule of hexagonal cross section made of a non-gas-permeable material such as soft steel, stainless steel or the like. Then, the capsule is subjected to static hydraulic hot pressing, either as it is or after deaeration and sealing. While various combinations of temperature, pressure and time are possible as conditions for the static hydraulic hot pressing, a dense block with no residual gas pores can be obtained, for example, under conditions of 900°C, 1000 kg/cm² and one hour where the wastes are composed of zircaloy. (Kawakami, Y.)

  5. Method for processing powdery radioactive wastes

    International Nuclear Information System (INIS)

    Yasumura, Keijiro; Matsuura, Hiroyuki; Tomita, Toshihide; Nakayama, Yasuyuki.

    1978-01-01

    Purpose: To solidify radioactive wastes easily and safely, at a high reaction speed but with no boiling, by impregnating the radioactive wastes with chlorostyrene. Method: Bead-like dried ion exchange resin, powdery ion exchange resin, filter sludges, concentrated dried waste liquor or the like are mixed or impregnated with a chlorostyrene monomer in which a polymerization initiator such as methyl ethyl ketone peroxide or benzoyl peroxide is dissolved. The mixed or impregnated products polymerize to a solid after a predetermined period of time through a curing reaction, producing solidified radioactive wastes. Since inflammable materials are used, this process has a high safety. About 70% of wastes can be incorporated. The solidified products have a strength as high as 300 - 400 kg/cm³ and are suitable for ocean disposal. The products have a greater radiation resistance than other plastic solidification products. (Seki, T.)

  6. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-03-01

    Full Text Available Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations (COSO), 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957) lament that although the introduction of quantifying risk to enhance the degree of objectivity in, for instance, finance was quite parallel to its development in the manufacturing industry, the same is not true in Higher Education Institutions (HEI). In this regard, the objective of the paper was to demonstrate the methods and process of Quantitative Risk Analysis (QRA) through the likelihood of occurrence of risk (phase I). This paper serves as the first of a two-phased study, which sampled one hundred (100) risk analysts in a University in the greater Eastern Cape Province of South Africa. The analysis of the likelihood of occurrence of risk by logistic regression and percentages was conducted to investigate whether or not there was a significant difference between groups (analysts) in respect of QRA. The Hosmer and Lemeshow test was non-significant, with a chi-square (χ² = 8.181; p = 0.300), which indicated a good model fit, since the data did not significantly deviate from the model. The study concluded that to derive an overall likelihood rating that indicates the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1) threat source motivation and capability, (2) nature of the vulnerability, and (3) existence and effectiveness of current controls (methods and process).
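
A hedged sketch of the kind of analysis the abstract describes: a logistic-regression likelihood-of-occurrence model checked with a Hosmer-Lemeshow statistic. The data are simulated and the three predictors are stand-ins for the governing factors listed above, not the study's actual variables.

```python
import numpy as np
from scipy import stats
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
X = rng.normal(size=(100, 3))      # stand-ins: motivation/capability, vulnerability, controls
p_true = 1 / (1 + np.exp(-(0.8 * X[:, 0] - 0.5 * X[:, 2])))
y = rng.binomial(1, p_true)        # 1 = risk event occurred

model = LogisticRegression().fit(X, y)
p_hat = model.predict_proba(X)[:, 1]

# Hosmer-Lemeshow: group by deciles of predicted risk, compare observed vs expected
groups = np.digitize(p_hat, np.quantile(p_hat, np.linspace(0.1, 0.9, 9)))
hl = 0.0
for g in range(10):
    idx = groups == g
    if not idx.any():
        continue
    n_g, obs, exp = idx.sum(), y[idx].sum(), p_hat[idx].sum()
    hl += (obs - exp) ** 2 / (exp * (1 - exp / n_g) + 1e-12)
print("HL chi-square:", round(hl, 3), "p-value:", round(1 - stats.chi2.cdf(hl, df=8), 3))
```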

  7. Method for enhanced control of welding processes

    Science.gov (United States)

    Sheaffer, Donald A.; Renzi, Ronald F.; Tung, David M.; Schroder, Kevin

    2000-01-01

    Method and system for producing high quality welds in welding processes, in general, and gas tungsten arc (GTA) welding, in particular, by controlling weld penetration. Light emitted from a weld pool is collected from the backside of a workpiece by optical means during welding and transmitted to a digital video camera for further processing, after the emitted light is first passed through a short wavelength pass filter to remove infrared radiation. By filtering out the infrared component of the light emitted from the backside weld pool image, the present invention provides for the accurate determination of the weld pool boundary. Data from the digital camera is fed to an imaging board which focuses on a 100×100 pixel portion of the image. The board performs a thresholding operation and provides this information to a digital signal processor to compute the backside weld pool dimensions and area. This information is used by a control system, in a dynamic feedback mode, to automatically adjust appropriate parameters of a welding system, such as the welding current, to control weld penetration and thus create a uniform weld bead and high quality weld.
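
The core image-processing loop described above can be sketched as a thresholding step over a 100×100 region followed by a proportional current correction; the threshold, target area and gain below are invented values, not the patented system's parameters.

```python
# Hedged sketch: thresholded backside weld-pool area driving a proportional correction.
import numpy as np

def weld_pool_area(frame: np.ndarray, threshold: float = 0.5) -> int:
    """Count pixels above threshold in the 100x100 region of interest."""
    roi = frame[:100, :100]
    return int((roi > threshold).sum())

def adjust_current(current_amps: float, area_px: int,
                   target_px: int = 1500, gain: float = 0.01) -> float:
    """Proportional correction: reduce current if the backside pool is too large."""
    return current_amps - gain * (area_px - target_px)

frame = np.random.default_rng(0).random((120, 160))   # stand-in for a camera frame
area = weld_pool_area(frame)
print(adjust_current(110.0, area))
```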

  8. Extracellular Signatures as Indicators of Processing Methods

    Energy Technology Data Exchange (ETDEWEB)

    Wahl, Karen L.

    2012-01-09

    As described in other chapters within this volume, many aspects of microbial cells vary with culture conditions and therefore can potentially be analyzed as forensic signatures of growth conditions. In addition to changes or variations in components of the microbes themselves, extracellular materials indicative of production processes may remain associated with the final bacterial product. It is well recognized that even with considerable effort to make pure products such as fine chemicals or pharmaceuticals, trace impurities from components or synthesis steps associated with production processes can be detected in the final product. These impurities can be used as indicators of production source or methods, such as to help connect drugs of abuse to supply chains. Extracellular residue associated with microbial cells could similarly help to characterize production processes. For successful growth of microorganisms on culture media there must be an available source of carbon, nitrogen, inorganic phosphate and sulfur, trace metals, water and vitamins. The pH, temperature, and a supply of oxygen or other gases must also be appropriate for a given organism for successful culture. The sources of these components and the ranges of temperature, pH and other variables have expanded over the years, so that there is currently a wide range of possible combinations of media components, recipes and parameters to choose from for a given organism. Because of this wide variability in components, mixtures of components, and other parameters, there is the potential for differentiation of cultured organisms based on changes in culture conditions. The challenge remains how to narrow the field of potential combinations and be able to attribute variations in the final bacterial product and extracellular signatures associated with the final product to information about the culture conditions or recipe used in the production of that product.

  9. Processing method for high resolution monochromator

    International Nuclear Information System (INIS)

    Kiriyama, Koji; Mitsui, Takaya

    2006-12-01

    A processing method for a high resolution monochromator (HRM) has been developed at the Japanese Atomic Energy Agency/Quantum Beam Science Directorate/Synchrotron Radiation Research unit at SPring-8. For manufacturing an HRM, a sophisticated slicing machine and an X-ray diffractometer have been installed for shaping a crystal ingot and for precisely orienting the surface of a crystal ingot, respectively. The specifications of the slicing machine are as follows: the maximum size of a diamond blade is φ350 mm in diameter, with φ38.1 mm spindle diameter and 2 mm thickness. A large crystal such as an ingot 100 mm in diameter and 200 mm in length can be cut. Thin crystal samples such as wafers can also be cut using another sample holder. The working distance of the main shaft in the direction perpendicular to the working table is 350 mm at maximum. The smallest resolution of the main shaft in the front-and-back and top-and-bottom directions is 0.001 mm, read by a digital encoder. A feed of 2 mm/min can be set for cutting samples in the forward direction. For adjusting the orientation of crystal faces relative to the blade direction, a one-circle goniometer and a two-circle segment are mounted on the working table of the machine. Rotation and tilt of the stage are performed by manual operation. The rotation stage is furnished with a digital encoder with an angular resolution of less than 0.01 degrees. In addition, a hand drill is provided as a supporting device for detailed processing of crystals. Thus, an ideal crystal face can be cut from crystal samples within an accuracy of about 0.01 degrees. With these devices installed, a high energy resolution monochromator crystal for inelastic X-ray scattering and a beam collimator have been obtained and are expected to be used for nanotechnology studies. (author)

  10. SEAM PUCKERING EVALUATION METHOD FOR SEWING PROCESS

    Directory of Open Access Journals (Sweden)

    BRAD Raluca

    2014-07-01

    Full Text Available The paper presents an automated method for the assessment and classification of puckering defects detected during the preproduction control stage of the sewing machine or product inspection. In this respect, we have presented the possible causes and remedies of the wrinkle nonconformities. Subjective factors related to the control environment and operators during seam evaluation can be reduced using an automated system whose operation is based on image processing. Our implementation involves spectral image analysis using the Fourier transform and an unsupervised neural network, the Kohonen map, employed to classify material specimens, the input images, into five discrete degrees of quality, from grade 5 (best) to grade 1 (the worst). The puckering features presented in the learning and test images have been pre-classified using the seam puckering quality standard. The network training stage will consist in presenting five input vectors (derived from the down-sampled arrays), representing the puckering grades. The puckering classification consists in providing an input vector derived from the image supposed to be classified. A scalar product between the input vector values and the weighted training images is computed. The result will be assigned to the one of the five classes to which the input image belongs. Using the Kohonen network the puckering defects were correctly classified in a proportion of 71.42%.
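
A simplified sketch of the classification idea: Fourier-spectrum features compared against per-grade prototype vectors by a scalar product. The real method trains a Kohonen map; the nearest-prototype rule and the random prototype images below are stand-ins used only to illustrate the pipeline.

```python
import numpy as np

def spectral_feature(image: np.ndarray, size: int = 16) -> np.ndarray:
    """Down-sampled magnitude spectrum, flattened and normalised to unit length."""
    spec = np.abs(np.fft.fftshift(np.fft.fft2(image)))
    step = image.shape[0] // size
    feat = spec[::step, ::step][:size, :size].ravel()
    return feat / (np.linalg.norm(feat) + 1e-12)

rng = np.random.default_rng(0)
# Placeholder prototypes; in practice these would come from graded reference seams
prototypes = {grade: spectral_feature(rng.random((128, 128))) for grade in range(1, 6)}

def classify(image: np.ndarray) -> int:
    """Assign the grade whose prototype has the largest scalar product with the feature."""
    feat = spectral_feature(image)
    return max(prototypes, key=lambda g: float(feat @ prototypes[g]))

print(classify(rng.random((128, 128))))
```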

  11. Method for Business Process Management System Selection

    NARCIS (Netherlands)

    Thijs van de Westelaken; Bas Terwee; Pascal Ravesteijn

    2013-01-01

    In recent years business process management (BPM) and specifically information systems that support the analysis, design and execution of processes (also called business process management systems (BPMS)) are getting more attention. This has led to an increase in research on BPM and BPMS. However

  12. Method for Business Process Management System Selection

    OpenAIRE

    Westelaken, van de, Thijs; Terwee, Bas; Ravesteijn, Pascal

    2013-01-01

    In recent years business process management (BPM) and specifically information systems that support the analysis, design and execution of processes (also called business process management systems (BPMS)) are getting more attention. This has led to an increase in research on BPM and BPMS. However the research on BPMS is mostly focused on the architecture of the system and how to implement such systems. How to select a BPM system that fits the strategy and goals of a specific organization is ...

  13. Advanced Methods of Biomedical Signal Processing

    CERN Document Server

    Cerutti, Sergio

    2011-01-01

    This book grew out of the IEEE-EMBS Summer Schools on Biomedical Signal Processing, which have been held annually since 2002 to provide participants with state-of-the-art knowledge on emerging areas in biomedical engineering. Prominent experts in the areas of biomedical signal processing, biomedical data treatment, medicine, signal processing, system biology, and applied physiology introduce novel techniques and algorithms as well as their clinical or physiological applications. The book provides an overview of a compelling group of advanced biomedical signal processing techniques, such as mult

  14. Standard-Setting Methods as Measurement Processes

    Science.gov (United States)

    Nichols, Paul; Twing, Jon; Mueller, Canda D.; O'Malley, Kimberly

    2010-01-01

    Some writers in the measurement literature have been skeptical of the meaningfulness of achievement standards and described the standard-setting process as blatantly arbitrary. We argue that standard setting is more appropriately conceived of as a measurement process similar to student assessment. The construct being measured is the panelists'…

  15. Hospital Registration Process Reengineering Using Simulation Method

    Directory of Open Access Journals (Sweden)

    Qiang Su

    2010-01-01

    Full Text Available With increasing competition, many healthcare organizations have undergone tremendous reform in the last decade aiming to increase efficiency, decrease waste, and reshape the way that care is delivered. This study focuses on the operational efficiency improvement of the hospital registration process. The factors related to operational efficiency, including the service process, queue strategy, and queue parameters, were explored systematically and illustrated with a case study. Guided by the principle of business process reengineering (BPR), a simulation approach was employed for process redesign and performance optimization. As a result, the queue strategy is changed from multiple queues and multiple servers to a single queue and multiple servers with a prepare queue. Furthermore, through a series of simulation experiments, the length of the prepare queue and the corresponding registration process efficiency were quantitatively evaluated and optimized.
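
A minimal discrete-event sketch of a single-queue, multi-server registration desk, written with simpy. The arrival and service rates are invented, and the document-preparation step is folded into the service time rather than modelled as the paper's separate prepare queue.

```python
import random
import simpy

SIM_MINUTES, SERVERS = 480, 3                       # one 8-hour shift, 3 desks (assumed)

def patient(env, desk, waits):
    arrive = env.now
    with desk.request() as req:
        yield req                                        # wait in the single shared queue
        yield env.timeout(random.expovariate(1 / 1.0))   # document preparation, ~1 min
        yield env.timeout(random.expovariate(1 / 2.5))   # registration, ~2.5 min
    waits.append(env.now - arrive)

def arrivals(env, desk, waits):
    while True:
        yield env.timeout(random.expovariate(1 / 1.5))   # a patient every ~1.5 min
        env.process(patient(env, desk, waits))

random.seed(42)
env = simpy.Environment()
desk = simpy.Resource(env, capacity=SERVERS)
waits = []
env.process(arrivals(env, desk, waits))
env.run(until=SIM_MINUTES)
print(f"patients served: {len(waits)}, mean time in system: {sum(waits) / len(waits):.1f} min")
```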

  16. Soil processing method journal article supporting data

    Data.gov (United States)

    U.S. Environmental Protection Agency — This study aimed to optimize a previously used indirect processing protocol, which included a series of washing and centrifugation steps. Optimization of the...

  17. Geophysical Methods for Monitoring Soil Stabilization Processes

    Science.gov (United States)

    Soil stabilization involves methods used to turn unconsolidated and unstable soil into a stiffer, consolidated medium that could support engineered structures, alter permeability, change subsurface flow, or immobilize contamination through mineral precipitation. Among the variety...

  18. Method of processing decontaminating liquid waste

    International Nuclear Information System (INIS)

    Kusaka, Ken-ichi

    1989-01-01

    When decontaminating liquid wastes are processed by ion exchange resins, the radioactive nuclides, metals and decontaminating agents in the liquid wastes are captured in the ion exchange resins. When the exchange resins are oxidatively decomposed, most of the ingredients are decomposed into water and gaseous carbonic acid and discharged, while the sulfur ingredient in the resins is converted into sulfuric acid. In this case, even the less oxidizable ingredients in the decontaminating agent are made easily decomposable by oxidative decomposition together with the resins. The radioactive nuclides and the large amount of iron dissolved during decontamination in the liquid wastes are dissolved in the sulfuric acid formed. When the sulfuric acid wastes are neutralized with sodium hydroxide, they are converted into sodium sulfate, the most common waste from nuclear facilities, and can therefore be concentrated and solidified by existing waste processing systems, thereby facilitating the waste processing. (K.M.)

  19. Signal Processing Methods Monitor Cranial Pressure

    Science.gov (United States)

    2010-01-01

    Dr. Norden Huang, of Goddard Space Flight Center, invented a set of algorithms (called the Hilbert-Huang Transform, or HHT) for analyzing nonlinear and nonstationary signals that developed into a user-friendly signal processing technology for analyzing time-varying processes. At an auction managed by Ocean Tomo Federal Services LLC, licenses of 10 U.S. patents and 1 domestic patent application related to HHT were sold to DynaDx Corporation, of Mountain View, California. DynaDx is now using the licensed NASA technology for medical diagnosis and prediction of brain blood flow-related problems, such as stroke, dementia, and traumatic brain injury.

  20. Application of finite-element-methods in food processing

    DEFF Research Database (Denmark)

    Risum, Jørgen

    2004-01-01

    Presentation of the possible use of finite-element-methods in food processing. Examples from diffusion studies are given.
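
A minimal example of what such a model can look like: a 1D linear finite-element discretization of transient diffusion (e.g. salt into a food slab) stepped with backward Euler. Geometry, diffusivity and boundary values are illustrative assumptions, not figures from the presentation.

```python
import numpy as np

L, n_el, D, dt, steps = 0.02, 40, 1e-9, 60.0, 200   # 2 cm slab, D in m^2/s, dt in s
n = n_el + 1
h = L / n_el

# Assemble 1D mass (M) and stiffness (K) matrices for linear elements
M = np.zeros((n, n)); K = np.zeros((n, n))
Me = (h / 6.0) * np.array([[2.0, 1.0], [1.0, 2.0]])
Ke = (D / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])
for e in range(n_el):
    idx = np.ix_([e, e + 1], [e, e + 1])
    M[idx] += Me
    K[idx] += Ke

c = np.zeros(n)            # initial concentration in the slab
A = M + dt * K             # backward Euler system matrix
for _ in range(steps):
    b = M @ c
    # Dirichlet boundary: surface node 0 held at concentration 1.0
    A_bc = A.copy(); b_bc = b.copy()
    A_bc[0, :] = 0.0; A_bc[0, 0] = 1.0; b_bc[0] = 1.0
    c = np.linalg.solve(A_bc, b_bc)

print("concentration at the first nodes after", steps * dt, "s:", np.round(c[:5], 3))
```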

  1. Supporting Process Improvement using Method Increments

    NARCIS (Netherlands)

    Vlaanderen, Kevin

    2014-01-01

    With the research described in this dissertation, we aim to shed light on the characteristics of process improvement efforts by looking at their evolution (how to change?) rather than their content (what to change?). This research is triggered by three main propositions, derived from earlier work:

  2. Uranium manufacturing process employing the electrolytic reduction method

    International Nuclear Information System (INIS)

    Oda, Yoshio; Kazuhare, Manabu; Morimoto, Takeshi.

    1986-01-01

    The present invention relates to a uranium manufacturing process that employs the electrolytic reduction method, and particularly to a uranium manufacturing process that employs an electrolytic reduction method requiring low voltage. The process, in which uranium is obtained by means of the electrolytic method with uranyl acid as the raw material, is prior art.

  3. Method for double-sided processing of thin film transistors

    Science.gov (United States)

    Yuan, Hao-Chih; Wang, Guogong; Eriksson, Mark A.; Evans, Paul G.; Lagally, Max G.; Ma, Zhenqiang

    2008-04-08

    This invention provides methods for fabricating thin film electronic devices with both front- and backside processing capabilities. Using these methods, high temperature processing steps may be carried out during both frontside and backside processing. The methods are well-suited for fabricating back-gate and double-gate field effect transistors, double-sided bipolar transistors and 3D integrated circuits.

  4. Plasma processing methods for hydrogen production

    International Nuclear Information System (INIS)

    Mizeraczyk, J.; Jasinski, M.

    2016-01-01

    In the future, a transition from the fossil fuel-based economy to a hydrogen-based economy is expected. Therefore the development of systems for efficient H_2 production becomes important. The several conventional methods of mass-scale (or central) H_2 production (reforming of methane, natural gas and higher hydrocarbons, and coal gasification reforming) are well developed and their H_2 production costs are acceptable. However, due to H_2 transport and storage problems, small-scale (distributed) technologies for H_2 production are in demand. These new technologies have to meet the requirement of producing H_2 at a production cost of $(1-2)/kg(H_2) (or 60 g(H_2)/kWh) by 2020 (the U.S. Department of Energy's target). Recently several plasma methods have been proposed for small-scale H_2 production. The most promising plasmas for this purpose seem to be those generated by gliding, plasmatron and nozzle arcs, and by microwave discharges. In this paper, plasma methods proposed for H_2 production are briefly described and critically evaluated from the viewpoint of H_2 production efficiency. The paper aims to answer the question of whether any plasma method for small-scale H_2 production approaches such challenges as a production energy yield of 60 g(H_2)/kWh, a high production rate, high reliability and low investment cost. (authors)

  5. Processes of aggression described by kinetic method

    Science.gov (United States)

    Aristov, V. V.; Ilyin, O.

    2014-12-01

    In the last decades many investigations have been devoted to theoretical models in new areas concerning the description of different biological, sociological and historical processes. In the present paper we suggest a model of the Nazi Germany invasion of Poland, France and the USSR based on the kinetic theory. We model this process with the Cauchy boundary problem for the two-element kinetic equations with spatial initial conditions. The solution of the problem is given in the form of a traveling wave. The propagation velocity of a frontline depends on the quotient between the initial force concentrations. Moreover, it is obtained that the general solution of the model can be expressed in terms of quadratures and elementary functions. Finally, it is shown that the frontline velocities comply with the historical data.

  6. Processes of aggression described by kinetic method

    International Nuclear Information System (INIS)

    Aristov, V. V.; Ilyin, O.

    2014-01-01

    In the last decades many investigations have been devoted to theoretical models in new areas concerning the description of different biological, sociological and historical processes. In the present paper we suggest a model of the Nazi Germany invasion of Poland, France and the USSR based on the kinetic theory. We model this process with the Cauchy boundary problem for the two-element kinetic equations with spatial initial conditions. The solution of the problem is given in the form of a traveling wave. The propagation velocity of a frontline depends on the quotient between the initial force concentrations. Moreover, it is obtained that the general solution of the model can be expressed in terms of quadratures and elementary functions. Finally, it is shown that the frontline velocities comply with the historical data.

  7. Signal processing method for Johnson noise thermometry

    International Nuclear Information System (INIS)

    Hwang, I. G.; Moon, B. S.; Kinser, Roger

    2003-01-01

    The development of Johnson Noise Thermometry requires a highly sensitive preamplifier circuit to pick up the temperature-related noise on the sensing element. However, the random noise generated in this amplification circuit has a significant erroneous influence on the measurement. This paper describes the signal processing mechanism of the Johnson Noise Thermometry system that is under development in collaboration between KAERI and ORNL. It adopts two identical amplifier channels and utilizes a digital signal processing technique to remove the independent noise of each channel. The CPSD (Cross Power Spectral Density) function is used to cancel the independent noise, and the discrimination of narrow or single-frequency peaks in the CPSD data separates out the common-mode electromagnetic interference noise.
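
The two-channel idea can be sketched as follows: the cross power spectral density of the channels retains the common (temperature-related) Johnson noise while averaging away each channel's independent amplifier noise. Signal levels and sampling parameters below are invented for illustration.

```python
import numpy as np
from scipy import signal

fs, n = 100_000, 2**18
rng = np.random.default_rng(7)
johnson = 0.5 * rng.normal(size=n)            # common thermal noise seen by both channels
ch1 = johnson + 1.0 * rng.normal(size=n)      # channel 1 = common + its own amplifier noise
ch2 = johnson + 1.0 * rng.normal(size=n)      # channel 2 = common + independent amplifier noise

f, Pxy = signal.csd(ch1, ch2, fs=fs, nperseg=4096)
f, Pxx = signal.welch(ch1, fs=fs, nperseg=4096)

# The CPSD magnitude approaches the common (temperature-related) noise power as
# averaging increases, while the single-channel PSD still contains amplifier noise.
print("mean |CPSD|:", np.abs(Pxy).mean(), " mean PSD ch1:", Pxx.mean())
```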

  8. Processes of aggression described by kinetic method

    Energy Technology Data Exchange (ETDEWEB)

    Aristov, V. V.; Ilyin, O. [Dorodnicyn Computing Centre of Russian Academy of Sciences, Vavilova str. 40, Moscow, 119333 (Russian Federation)

    2014-12-09

    In the last decades many investigations have been devoted to theoretical models in new areas concerning the description of different biological, sociological and historical processes. In the present paper we suggest a model of the Nazi Germany invasion of Poland, France and the USSR based on the kinetic theory. We model this process with the Cauchy boundary problem for the two-element kinetic equations with spatial initial conditions. The solution of the problem is given in the form of a traveling wave. The propagation velocity of a frontline depends on the quotient between the initial force concentrations. Moreover, it is obtained that the general solution of the model can be expressed in terms of quadratures and elementary functions. Finally, it is shown that the frontline velocities comply with the historical data.

  9. Method of processing plutonium and uranium solution

    International Nuclear Information System (INIS)

    Otsuka, Katsuyuki; Kondo, Isao; Suzuki, Toru.

    1989-01-01

    Solutions of plutonium nitrate and uranyl nitrate recovered in the solvent extraction step in reprocessing plants and nuclear fuel production plants are given a low-temperature treatment by freeze-drying under vacuum into residues containing nitrates, which are denitrated by heating and calcined under reduction into powders. That is, since the complicated processes of heating, concentration and denitration conducted so far for the plutonium solution and uranyl solution are replaced with a single step of freeze-drying under vacuum, the process can be simplified significantly. In addition, since the treatment is applied at low temperature, the occurrence of corrosion of the evaporator materials, etc. can be prevented. Further, the number of operators can be reduced by dividing the operations into recovery of solidification products, supply and sintering of the solutions, and vacuum sublimation. Further, since the nitrates processed at low temperature are powderized by heating denitration, the powderization step can be simplified. The specific surface area and the grain size distribution of the powder are made appropriate, and it is possible to obtain oxide powders with physical properties that are easily formed into pellets. (N.H.)

  10. Method and algorithm for image processing

    Science.gov (United States)

    He, George G.; Moon, Brain D.

    2003-12-16

    The present invention is a modified Radon transform. It is similar to the traditional Radon transform for the extraction of line parameters and similar to the traditional slant stack for the intensity summation of pixels away from a given pixel, for example along ray paths that span 360 degrees at a given grid point in the time and offset domain. However, the present invention differs from these methods in that the intensity and direction of a composite intensity for each pixel are maintained separately instead of being combined after the transformation. An advantage of this approach is elimination of the work required to extract the line parameters in the transformed domain. The advantage of the modified Radon transform method is amplified when many lines are present in the imagery or when the lines are just short segments, both of which occur in actual imagery.
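
A rough sketch of the per-pixel idea described above: intensities are summed along short ray segments over a fan of directions, and the strongest composite intensity and its direction are stored separately. The ray length and angle count are assumptions; this is only an illustration of the concept, not the patented algorithm.

```python
import numpy as np

def directional_response(img: np.ndarray, n_angles: int = 36, ray_len: int = 7):
    """Per-pixel composite intensity and the direction that produced it."""
    h, w = img.shape
    best_val = np.zeros((h, w))
    best_dir = np.zeros((h, w))
    offsets = np.arange(-(ray_len // 2), ray_len // 2 + 1)
    for a in range(n_angles):
        theta = np.pi * a / n_angles
        dy, dx = np.sin(theta), np.cos(theta)
        acc = np.zeros((h, w))
        for o in offsets:
            yy = np.clip(np.arange(h)[:, None] + int(round(o * dy)), 0, h - 1)
            xx = np.clip(np.arange(w)[None, :] + int(round(o * dx)), 0, w - 1)
            acc += img[yy, xx]                 # sum along a short ray segment
        update = acc > best_val
        best_val[update] = acc[update]
        best_dir[update] = theta
    return best_val, best_dir

img = np.zeros((64, 64)); img[32, 10:54] = 1.0        # a horizontal line segment
val, direction = directional_response(img)
print(round(float(val[32, 32]), 1), round(float(direction[32, 32]), 2))
```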

  11. Method of processing radioactive liquid wastes

    Energy Technology Data Exchange (ETDEWEB)

    Sugimoto, Y; Kikuchi, M; Funabashi, K; Yusa, H; Horiuchi, S

    1978-12-21

    Purpose: To decrease the volume of radioactive liquid wastes essentially consisting of sodium hydroxide and boric acid. Method: The concentration ratio of sodium hydroxide to boric acid by weight in radioactive liquid wastes essentially consisting of sodium hydroxide and boric acid is adjusted in the range of 0.28 - 0.4 by means of a pH detector and a sodium concentration detector. Thereafter, the radioactive liquid wastes are dried into powder and then discharged.

  12. Solar thermochemical processing system and method

    Science.gov (United States)

    Wegeng, Robert S.; Humble, Paul H.; Krishnan, Shankar; Leith, Steven D.; Palo, Daniel R.; Dagle, Robert A.

    2018-04-24

    A solar thermochemical processing system is disclosed. The system includes a first unit operation for receiving concentrated solar energy. Heat from the solar energy is used to drive the first unit operation. The first unit operation also receives a first set of reactants and produces a first set of products. A second unit operation receives the first set of products from the first unit operation and produces a second set of products. A third unit operation receives heat from the second unit operation to produce a portion of the first set of reactants.

  13. Method for processing radioactive wastes containing sodium

    International Nuclear Information System (INIS)

    Kubota, Takeshi.

    1975-01-01

    Object: To bake, solidify and process even radioactive wastes with a high sodium content. Structure: H and/or NH4 zeolites, in an amount of more than 90 g per chemical equivalent of sodium present in the waste, are added to and left in the sodium-containing radioactive wastes, after which they are fed to a baker, such as a rotary cylindrical baker or spray baker, to bake and solidify the wastes at 350 to 800°C. Thereby, it is possible to bake and solidify even radioactive wastes with a high sodium content, which was previously impossible. (Kamimura, M.)

  14. Method of processing radioactive liquid waste

    International Nuclear Information System (INIS)

    Motojima, Kenji; Kawamura, Fumio.

    1981-01-01

    Purpose: To increase the efficiency of removing radioactive cesium from radioactive liquid waste by employing, as an adsorbent, zeolite to which a metal ferrocyanide compound is affixed. Method: Regenerated liquid waste from a reactor condensate desalting unit, floor drains and so forth is collected through respective supply tubes into a liquid waste tank, and the liquid waste is fed by a pump to a column filled with zeolite containing a ferrocyanide compound of a metal such as copper, zinc, manganese, iron, cobalt, nickel or the like. The liquid waste from which radioactive cesium has been removed is dried and pelletized by volume-reducing and solidifying means. (Yoshino, Y.)

  15. Processing method of radioactive cleaning drain

    International Nuclear Information System (INIS)

    Otsuka, Shigemitsu; Murakami, Tadashi; Kitao, Hideo

    1998-01-01

    Upon processing of radioactive cleaning drains, the Co-60 they contain is removed by a selective adsorbent. After suspended materials are removed by a filtering device, Co-60, the predominant nuclide in the drain, is selectively adsorbed. The concentration of the target Co-60 is of the order of 0.1 Bq/cc, and non-radioactive metal ions such as Na+ ions are present at the ppm level in addition to Co-60. The granular adsorbent for selectively adsorbing Co-60 is oxine-added activated carbon with a grain size of from 20 to 48 mesh. The granular adsorbent is used while being filled in an adsorbing tower. Since a relatively simple device comprising the filtering device and the adsorbing tower in combination is provided, a reduction in construction cost can be expected. In addition, since no filtering aid is used in the filtering device, the amount of secondary wastes is small. (N.H.)

  16. Discarding processing method for radioactive waste

    International Nuclear Information System (INIS)

    Komura, Shiro; Kato, Hiroaki; Hatakeyama, Takao; Oura, Masato.

    1992-01-01

    At first, in a discrimination step, extremely low level radioactive wastes are sorted into metals and concretes; the metal wastes are further sorted into those having hollow portions and those without, and the concrete wastes into those having a block-like shape and those having other shapes. Next, in a processing step, the metal wastes having hollow portions are subjected to cutting, volume-reduction or packing treatment, the block-like concrete wastes to surface solidification treatment, and the concrete wastes of other shapes to crushing treatment. Then, the extremely low level radioactive wastes contained in a container used exclusively for transportation are taken out, within a movable burying facility with a diffusion inhibitor kept at negative pressure as required, at the field for the burying operation, and are buried in a state isolated from the outside. Accordingly, they can be buried safely and efficiently. (T.M.)

  17. Radiation methods in dairy production and processing

    International Nuclear Information System (INIS)

    Ganguli, N.C.

    1975-01-01

    Various uses of radiotracers and radiation in dairy technology are described. In dairy production, radiotracers are used for studying: (1) rumen metabolism leading to protein synthesis, (2) total body water, blood volume and sodium, (3) mineral metabolism, (4) the relation between climatic stress and thyroid functioning of dairy animals, (5) the volume of milk in mammary glands, (6) hormone levels in dairy animals and (7) spermatozoa metabolism. In dairy processing, radiotracers are used for studying: (1) the compositional analysis of milk and milk products and (2) the efficiency of cleaning agents for cleaning dairy equipment. Ionizing radiation is used for: (1) preservation of milk and milk products and (2) sterilization of packaging materials. A radiation source has been used to monitor the over-run in ice cream and for fill control of fluid in paper cartons. (M.G.B.)

  18. Processing method for miscellaneous radioactive solid waste

    International Nuclear Information System (INIS)

    Matsuda, Masami; Komori, Itaru; Nishi, Takashi.

    1995-01-01

    Miscellaneous solid wastes are subjected to heat treatment at a temperature not lower than the carbonizing temperature of the organic materials in the wastes and not higher than the melting temperature of the inorganic materials in the wastes, for example, not lower than 200°C but not higher than 660°C, and the resultant miscellaneous solid wastes are then solidified using a water-hardening solidification material. With such procedures, the organic materials in the miscellaneous solids are decomposed into gases. Therefore, solid materials with excellent long-term stability can be formed. In addition, since the heat treatment is conducted at a relatively low temperature, not higher than 660°C, the amount of off-gas generated is reduced, simplifying the off-gas processing system, and since molten materials are not formed, handling is facilitated. (T.M.)

  19. Method of processing radioactive solid wastes

    International Nuclear Information System (INIS)

    Ootaka, Hisashi; Aizu, Tadashi.

    1980-01-01

    Purpose: To improve the volume-reducing effect for radioactive solid wastes by freezing and then pulverizing them. Method: Miscellaneous radioactive solid wastes produced from a nuclear power plant and packed in vinyl resin bags are placed in a drum can, and low-temperature nitrogen gas (lower than 0°C) from a cylinder previously filled with liquid nitrogen (at 15 kg/cm², -196°C) is supplied to freeze the radioactive solid wastes. Thereafter, a hydraulic press is inserted into the drum can to compress and pulverize the thus frozen miscellaneous radioactive solid wastes into powder. The powder thus formed does not expand even after removing the hydraulic press from the drum can, whereby the volume reduction of the radioactive solid wastes can be carried out effectively. (Horiuchi, T.)

  20. Method of processing radioactive liquid waste

    International Nuclear Information System (INIS)

    Hasegawa, Akira; Kuribayashi, Hiroshi; Soda, Kenzo; Mihara, Shigeru.

    1988-01-01

    Purpose: To obtain satisfactory plastic solidification products rapidly and smoothly by adding oxidizers to radioactive liquid wastes. Method: Sulfuric acid, etc. are added to radioactive liquid wastes to adjust the pH value of the liquid wastes to less than 3.0. Then, ferrous sulfates are added such that the iron concentration in the liquid wastes is 100 mg/l. Then, after adjusting the pH to a value suitable for drying and powderization by adding an alkali such as a hydroxide, the liquid wastes are dried and powderized. The resultant powder is subjected to plastic solidification using polymerizable liquid unsaturated polyester resins as the solidifying agent. The solidification products thus obtained are stable in physical properties such as strength and waterproofness, and stable operation is possible even for radioactive liquid wastes whose constituent ingredients are unknown. (Takahashi, M.)

  1. Method of processing radioactive liquid wastes

    International Nuclear Information System (INIS)

    Kawamura, Fumio; Funabashi, Kiyomi; Matsuda, Masami.

    1984-01-01

    Purpose: To improve the performance of ion exchange resins in removing metal ions, for use in the clean-up of service water or waste water in BWR type reactors. Method: A column filled with activated carbon is disposed at the pre- or post-stage of a clean-up system using ion exchange resins for the clean-up of service water or waste water of a nuclear reactor, so that organics contained in the water may be removed through adsorption. Since the organic materials are thus adsorbed and eliminated, the various types of radioactive ions contained in the radioactive liquid are no longer masked, and the ion-removal performance of the ion exchange resins in the clean-up device can be improved. (Moriyama, K.)

  2. Method and apparatus for processing oily wastewater

    International Nuclear Information System (INIS)

    Torline, W.N.; Williams, R.K.

    1993-01-01

    A method of treating oily wastewater is described comprising the steps of passing oily wastewater through a coalescer to coalesce dispersed oil droplets; separating a free oil fraction as a liquid stream having a lower specific gravity from a contaminated water stream having a higher specific gravity; filtering particulate material from said contaminated water stream; passing the filtered water stream under pressure across an ultrafiltration membrane to separate a retentate fraction enriched in residual emulsified oil from an aqueous permeate fraction; recycling substantially only said retentate fraction to said coalescer; filtering said aqueous permeate through an activated carbon filter to remove low molecular weight organic materials; subjecting the filtrate from said activated carbon filter to cation exchange to remove heavy metal ions; and periodically flushing said ultrafilter with filtrate from said particulate filter to maintain the permeability of said ultrafiltration membrane.

  3. Radioactive waste processing method and device

    International Nuclear Information System (INIS)

    Ozaki, Shigeru; Tateyama, Shinji.

    1998-01-01

    A powdery activated carbon is charged to radioactive liquid wastes to form a mixed slurry. The slurry is subjected to solid/liquid separation, and a high-molecular water absorbent is charged to the separated activated carbon sludge wastes to process them while stirring. The high-molecular water absorbent comprises a graft polymer of starch and acrylonitrile or a cross-linked polymer of sodium acrylate and a cross-linking agent. The high-molecular water absorbent is previously charged to a vessel for containing the waste activated carbon sludges. The device of the present invention comprises a filtration device for solid/liquid separation of the mixed slurry, a sludge-containing vessel, a device for charging the high-molecular water absorbent and a sludge stirring device. The device for charging the high-molecular water absorbent comprises a plurality of weighing devices for weighing the change in the weight of the charged products and a conveyor for transferring the sludge-containing vessels. With such a constitution, stable sludge can be obtained, and activated carbon sludge wastes can be burnt without crushing them. (T.M.)

  4. Underground processing method for radioactive wastes

    International Nuclear Information System (INIS)

    Endo, Yoshihiro

    1998-01-01

    In the present invention, even a processing vessel not having a satisfactory radiation shielding property can be covered by a waterproof material easily and safely. Namely, a large number of small waterproof blocks are laid on the bottom of a discarding hole of a shaft, and then a large number of the above-mentioned blocks are stacked along the inner surface of the discarding hole to an appropriate height. A discarding vessel containing radioactive wastes is inserted into the containing space surrounded by the blocks, and then a single large waterproof block is settled on the upper portion of the discarding vessel and the discarding hole is closed. The discarding vessel is thus surrounded by the blocks. With such procedures, the small waterproof blocks are disposed and reliably positioned with no gap by manual operation by operators before the discarding vessel is placed into the discarding hole, and the large waterproof block can then be settled simply by remote control after the discarding vessel has been placed. (N.H.)

  5. Solidification processing method for radioactive waste

    International Nuclear Information System (INIS)

    Hiraki, Akimitsu; Tanaka, Keiji; Heta, Katsutoshi.

    1991-01-01

    The pressure in a vessel containing radioactive wastes is previously reduced and cement mortar prepared by kneading cement, sand and kneading agent with water is poured under shaking substantially to the upper end of the vessel. After the lowering of the mortar level due to the deforming has been terminated, the pressure is increased gradually. Then, the cement mortar is further poured substantially to the upper end of the vessel again. With such a two step pouring method, spaces other than the radioactive wastes in the vessel can be filled substantially completely with the cement mortar. Accordingly, it is possible to avoid the problem in view of the strength due to the formation of gaps at the inside of the vessel, or leaching of radioactive materials due to the intrusion of water into the gaps. Further, if washing water is reutilized as water for kneading or washing after the precipitation of the solid contents, the amount of the secondary wastes generated can be reduced. (T.M.)

  6. Comparative Study of Different Processing Methods for the ...

    African Journals Online (AJOL)

    The two processing methods reduced the cyanide concentration to the minimum level required by the World Health Organization (10 mg/kg). The mechanical pressing-fermentation method removed more cyanide than the fermentation processing method. Keywords: Cyanide, Fermentation, Manihot ...

  7. Apparatus and method for radiation processing of materials

    International Nuclear Information System (INIS)

    Neuberg, W.B.; Luniewski, R.

    1983-01-01

    A method and apparatus for radiation degradation processing of polytetrafluoroethylene makes use of simultaneous irradiation, agitation and cooling. The apparatus is designed to make efficient use of radiation in the processing. (author)

  8. Method for modeling social care processes for national information exchange.

    Science.gov (United States)

    Miettinen, Aki; Mykkänen, Juha; Laaksonen, Maarit

    2012-01-01

    Finnish social services include 21 service commissions of social welfare including Adoption counselling, Income support, Child welfare, Services for immigrants and Substance abuse care. This paper describes the method used for process modeling in the National project for IT in Social Services in Finland (Tikesos). The process modeling in the project aimed to support common national target state processes from the perspective of national electronic archive, increased interoperability between systems and electronic client documents. The process steps and other aspects of the method are presented. The method was developed, used and refined during the three years of process modeling in the national project.

  9. Method of parallel processing in SANPO real time system

    International Nuclear Information System (INIS)

    Ostrovnoj, A.I.; Salamatin, I.M.

    1981-01-01

    A method of parallel processing in the SANPO real time system is described. Algorithms for data accumulation and preliminary processing in this system, implemented as parallel processes using a specialized high-level programming language, are described, as is the hierarchy of elementary processes. The approach provides synchronization of concurrent processes without semaphores. The developed means are applied to experiment automation systems based on SM-3 minicomputers.
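
    As a rough illustration of the semaphore-free synchronization mentioned above, the following sketch couples an accumulation task and a preliminary-processing task through a bounded thread-safe queue only; it is not the SANPO implementation, and every name in it is invented for illustration.

```python
# Hypothetical sketch: accumulation and preliminary processing as concurrent
# tasks synchronized only through a thread-safe queue (no explicit semaphores).
import queue
import threading

buffer = queue.Queue(maxsize=16)   # bounded queue provides back-pressure
SENTINEL = None                    # marks the end of the accumulation run

def accumulate():
    """Simulate data accumulation: push raw 'events' into the queue."""
    for event in range(100):
        buffer.put(event)          # blocks if the queue is full
    buffer.put(SENTINEL)

def preprocess(results):
    """Preliminary processing: consume events and store a derived value."""
    while True:
        event = buffer.get()       # blocks until data is available
        if event is SENTINEL:
            break
        results.append(event * 2)  # stand-in for real preprocessing

results = []
producer = threading.Thread(target=accumulate)
consumer = threading.Thread(target=preprocess, args=(results,))
producer.start(); consumer.start()
producer.join(); consumer.join()
print(len(results), "events preprocessed")
```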

  10. Recent developments in analytical detection methods for radiation processed foods

    International Nuclear Information System (INIS)

    Wu Jilan

    1993-01-01

    A short summary of the programmes of 'ADMIT' (FAO/IAEA) and the developments in analytical detection methods for radiation processed foods has been given. It is suggested that for promoting the commercialization of radiation processed foods and controlling its quality, one must pay more attention to the study of analytical detection methods of irradiated food

  11. Process control and optimization with simple interval calculation method

    DEFF Research Database (Denmark)

    Pomerantsev, A.; Rodionova, O.; Høskuldsson, Agnar

    2006-01-01

    Methods of process control and optimization are presented and illustrated with a real-world example. The optimization methods are based on PLS block modeling as well as on the simple interval calculation (SIC) methods of interval prediction and object status classification. It is proposed to employ the series of expanding PLS/SIC models in order to support the on-line process improvements. This method helps to predict the effect of planned actions on the product quality and thus enables passive quality control. We have also considered an optimization approach that proposes the correcting actions for the quality improvement in the course of production. The latter is an active quality optimization, which takes into account the actual history of the process. The advocated approach is allied to the conventional method of multivariate statistical process control (MSPC) as it also employs the historical process data.
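
    The SIC machinery itself is not reproduced here, but a minimal sketch of the PLS part of the idea — fit a model on historical process data and predict the quality effect of a planned change in settings — can be given with scikit-learn; the data and variable names below are synthetic assumptions, not from the paper.

```python
# Minimal PLS sketch (not the SIC method itself): fit a PLS model on historical
# process data and predict product quality for a planned change in settings.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
X_hist = rng.normal(size=(200, 6))                  # historical process variables
coef = np.array([0.8, -0.5, 0.3, 0.0, 0.1, 0.0])    # synthetic "true" effect
y_hist = X_hist @ coef + 0.1 * rng.normal(size=200) # synthetic quality response

pls = PLSRegression(n_components=3)
pls.fit(X_hist, y_hist)

x_planned = np.array([[0.5, -1.0, 0.2, 0.0, 0.3, -0.1]])  # planned settings
print("predicted quality:", pls.predict(x_planned).ravel()[0])
```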

  12. Device and method for shortening reactor process tubes

    Science.gov (United States)

    Frantz, Charles E.; Alexander, William K.; Lander, Walter E. B.

    1980-01-01

    This disclosure describes a device and method for in situ shortening of nuclear reactor zirconium alloy process tubes which have grown as a result of radiation exposure. An upsetting technique is utilized which involves inductively heating a short band of a process tube with simultaneous application of an axial load sufficient to cause upsetting with an attendant decrease in length of the process tube.

  13. Two Undergraduate Process Modeling Courses Taught Using Inductive Learning Methods

    Science.gov (United States)

    Soroush, Masoud; Weinberger, Charles B.

    2010-01-01

    This manuscript presents a successful application of inductive learning in process modeling. It describes two process modeling courses that use inductive learning methods such as inquiry learning and problem-based learning, among others. The courses include a novel collection of multi-disciplinary complementary process modeling examples. They were…

  14. Effect of thermal processing methods on the proximate composition ...

    African Journals Online (AJOL)

    The nutritive value of raw and thermal processed castor oil seed (Ricinus communis) was investigated using the following parameters; proximate composition, gross energy, mineral constituents and ricin content. Three thermal processing methods; toasting, boiling and soaking-and-boiling were used in the processing of the ...

  15. New Principles of Process Control in Geotechnics by Acoustic Methods

    OpenAIRE

    Leššo, I.; Flegner, P.; Pandula, B.; Horovčák, P.

    2007-01-01

    The contribution describes a new solution for the control of the rotary drilling process as an elementary process in geotechnics. The article presents the first results of research on the utilization of acoustic methods in the identification process for the optimal control of rotary drilling.

  16. New Principles of Process Control in Geotechnics by Acoustic Methods

    Directory of Open Access Journals (Sweden)

    Leššo, I.

    2007-01-01

    Full Text Available The contribution describes a new solution for the control of the rotary drilling process as an elementary process in geotechnics. The article presents the first results of research on the utilization of acoustic methods in the identification process for the optimal control of rotary drilling.

  17. Evaluation of processing methods for static radioisotope scan images

    International Nuclear Information System (INIS)

    Oakberg, J.A.

    1976-12-01

    Radioisotope scanning in the field of nuclear medicine provides a method for the mapping of a radioactive drug in the human body to produce maps (images) which prove useful in detecting abnormalities in vital organs. At best, radioisotope scanning methods produce images with poor counting statistics. One solution to improving the body scan images is using dedicated small computers with appropriate software to process the scan data. Eleven methods for processing image data are compared

  18. Multi-block methods in multivariate process control

    DEFF Research Database (Denmark)

    Kohonen, J.; Reinikainen, S.P.; Aaljoki, K.

    2008-01-01

    In chemometric studies all predictor variables are usually collected in one data matrix X. This matrix is then analyzed by PLS regression or other methods. When data from several different sub-processes are collected in one matrix, there is a possibility that the effects of some sub-processes may be obscured. With multi-block (MB) methods the effect of a sub-process can be seen, and an example with two blocks, near-infrared (NIR) and process data, is shown. The results show improvements in the modelling task when an MB-based approach is used. This way of working with data gives more information on the process than if all data are in one X-matrix. The procedure is demonstrated on an industrial continuous process, where knowledge about the sub-processes is available and the X-matrix can be divided into blocks between process variables and NIR spectra.

  19. The effect of volume-of-interest misregistration on quantitative planar activity and dose estimation

    International Nuclear Information System (INIS)

    Song, N; Frey, E C; He, B

    2010-01-01

    In targeted radionuclide therapy (TRT), dose estimation is essential for treatment planning and tumor dose response studies. Dose estimates are typically based on a time series of whole-body conjugate view planar or SPECT scans of the patient acquired after administration of a planning dose. Quantifying the activity in the organs from these studies is an essential part of dose estimation. The quantitative planar (QPlanar) processing method involves accurate compensation for image degrading factors and correction for organ and background overlap via the combination of computational models of the image formation process and 3D volumes of interest defining the organs to be quantified. When the organ VOIs are accurately defined, the method intrinsically compensates for attenuation, scatter and partial volume effects, as well as overlap with other organs and the background. However, alignment between the 3D organ volumes of interest (VOIs) used in QPlanar processing and the true organ projections in the planar images is required. The aim of this research was to study the effects of VOI misregistration on the accuracy and precision of organ activity estimates obtained using the QPlanar method. In this work, we modeled the degree of residual misregistration that would be expected after an automated registration procedure by randomly misaligning 3D SPECT/CT images, from which the VOI information was derived, and planar images. Mutual information-based image registration was used to align the realistic simulated 3D SPECT images with the 2D planar images. The residual image misregistration was used to simulate realistic levels of misregistration and allow investigation of the effects of misregistration on the accuracy and precision of the QPlanar method. We observed that accurate registration is especially important for small organs or ones with low activity concentrations compared to neighboring organs. In addition, residual misregistration gave rise to a loss of precision.
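
    A minimal sketch of the mutual-information similarity measure that intensity-based registration of this kind typically maximizes is shown below; it is not the authors' SPECT/planar registration pipeline, and the test images are synthetic.

```python
# Minimal sketch of the mutual-information similarity used in intensity-based
# registration (not the authors' full SPECT/planar registration pipeline).
import numpy as np

def mutual_information(img_a, img_b, bins=32):
    """MI between two images of equal shape, from their joint histogram."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0                        # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(1)
fixed = rng.random((64, 64))
shifted = np.roll(fixed, 3, axis=0) + 0.05 * rng.random((64, 64))  # misaligned
aligned = fixed + 0.05 * rng.random((64, 64))                      # aligned
print("MI misaligned:", mutual_information(fixed, shifted))
print("MI aligned:   ", mutual_information(fixed, aligned))        # higher value
```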

  20. Mathematical methods in time series analysis and digital image processing

    CERN Document Server

    Kurths, J; Maass, P; Timmer, J

    2008-01-01

    The aim of this volume is to bring together research directions in theoretical signal and image processing developed rather independently in electrical engineering, theoretical physics, mathematics and the computer sciences. In particular, mathematically justified algorithms and methods, the mathematical analysis of these algorithms and methods, and the investigation of connections between methods from time series analysis and image processing are reviewed. An interdisciplinary comparison of these methods, drawing upon common sets of test problems from medicine and the geophysical/environmental sciences, is also addressed. This volume coherently summarizes work carried out in the field of theoretical signal and image processing. It focuses on non-linear and non-parametric models for time series as well as on adaptive methods in image processing.

  1. Optoelectronic imaging of speckle using image processing method

    Science.gov (United States)

    Wang, Jinjiang; Wang, Pengfei

    2018-01-01

    A detailed image-processing treatment of laser speckle interferometry is proposed as an example for a postgraduate course. Several image processing methods are used together in the optoelectronic imaging system: partial differential equations (PDEs) are used to reduce the effect of noise, thresholding segmentation is also based on a heat-equation PDE, the central line is extracted from the image skeleton with branches removed automatically, the phase level is calculated by a spline interpolation method, and the fringe phase can then be unwrapped. Finally, the image processing method was used to automatically measure a bubble in rubber under negative pressure, which could be used in tire inspection.
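
    As a hedged illustration of the heat-equation (isotropic diffusion) smoothing step mentioned above, the sketch below applies an explicit finite-difference diffusion to a noisy fringe-like image; it is not the authors' full fringe-analysis chain, and the stability limit dt <= 0.25 is assumed for unit grid spacing.

```python
# Minimal heat-equation (isotropic diffusion) smoothing sketch, as one simple
# PDE-based noise-reduction step; periodic boundaries are used for brevity.
import numpy as np

def heat_smooth(img, steps=20, dt=0.2):
    """Explicit finite-difference diffusion; dt <= 0.25 keeps the scheme stable."""
    u = img.astype(float).copy()
    for _ in range(steps):
        lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
               np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4.0 * u)
        u += dt * lap
    return u

rng = np.random.default_rng(2)
fringes = np.sin(np.linspace(0, 6 * np.pi, 128))[None, :] * np.ones((128, 1))
noisy = fringes + 0.3 * rng.normal(size=fringes.shape)
smoothed = heat_smooth(noisy)
print("std before/after smoothing:", noisy.std(), smoothed.std())
```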

  2. Development of continuous pharmaceutical production processes supported by process systems engineering methods and tools

    DEFF Research Database (Denmark)

    Gernaey, Krist; Cervera Padrell, Albert Emili; Woodley, John

    2012-01-01

    The pharmaceutical industry is undergoing a radical transition towards continuous production processes. Systematic use of process systems engineering (PSE) methods and tools forms the key to achieving this transition in a structured and efficient way.

  3. Methods Used to Deal with Peace Process Spoilers

    Directory of Open Access Journals (Sweden)

    MA. Bilbil Kastrati

    2014-06-01

    Full Text Available The conflicts of the past three decades have shown that the major problems which peace processes face are the spoilers. Spoilers are warring parties and their leaders who believe that peaceful settlement of disputes threatens their interests, power and reputation; therefore, they use all means to undermine or completely spoil the process. Spoilers of peace processes can be inside or outside of the process and are characterized as limited, greedy or total spoilers. Their motives for spoiling can be different, such as political, financial, ethnic or security motives. Furthermore, it is important to emphasise that spoilers are not only rebels and insurgents, but can often be governments, diasporas, warlords, private military companies, etc. In order to counteract the spoilers, the international community has adopted and implemented three methods: inducement, socialization and coercion. Often all three methods are used to convince the spoilers to negotiate, accept and implement peace agreements. Hence, this paper will examine the methods used to deal with peace process spoilers through an assessment of the strategies employed, their impact, successes and failures. This paper will also argue that the success or failure of the peace process depends on the method(s) used to deal with spoilers. If the right method is chosen, with persistent engagement of the international community, the peace process will be successful; conversely, if they fail to do so, the consequences will be devastating.

  4. Extension of moment projection method to the fragmentation process

    International Nuclear Information System (INIS)

    Wu, Shaohua; Yapp, Edward K.Y.; Akroyd, Jethro; Mosbach, Sebastian; Xu, Rong; Yang, Wenming; Kraft, Markus

    2017-01-01

    The method of moments is a simple but efficient method of solving the population balance equation which describes particle dynamics. Recently, the moment projection method (MPM) was proposed and validated for particle inception, coagulation, growth and, more importantly, shrinkage; here the method is extended to include the fragmentation process. The performance of MPM is tested for 13 different test cases for different fragmentation kernels, fragment distribution functions and initial conditions. Comparisons are made with the quadrature method of moments (QMOM), hybrid method of moments (HMOM) and a high-precision stochastic solution calculated using the established direct simulation algorithm (DSA) and advantages of MPM are drawn.
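
    The MPM algorithm itself is not reproduced here; as a sanity-check style sketch, the moment equations for one simple fragmentation case (constant breakage rate with symmetric binary breakup) close exactly as dM_k/dt = a(2^(1-k) - 1)M_k and can be integrated numerically and compared against the analytic solution. The kernel and initial moments below are assumptions for illustration, not the paper's test cases.

```python
# Plain method-of-moments sketch for pure fragmentation with a constant breakage
# rate a and symmetric binary breakup (each particle splits into two halves).
# For this kernel the moment equations close exactly:
#   dM_k/dt = a * (2**(1 - k) - 1) * M_k
# This is an illustrative case, not the MPM algorithm or the paper's test cases.
import numpy as np
from scipy.integrate import solve_ivp

a = 1.0                          # assumed breakage rate [1/s]
k_orders = np.array([0, 1, 2])   # moment orders tracked
M0 = np.array([1.0, 1.0, 2.0])   # assumed initial moments

def dMdt(t, M):
    return a * (2.0 ** (1 - k_orders) - 1.0) * M

sol = solve_ivp(dMdt, (0.0, 2.0), M0)
analytic = M0 * np.exp(a * (2.0 ** (1 - k_orders) - 1.0) * sol.t[-1])
print("numerical:", sol.y[:, -1])
print("analytic: ", analytic)    # M1 (mass) stays constant, M0 (number) grows
```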

  5. Extension of moment projection method to the fragmentation process

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Shaohua [Department of Mechanical Engineering, National University of Singapore, Engineering Block EA, Engineering Drive 1, 117576 (Singapore); Yapp, Edward K.Y.; Akroyd, Jethro; Mosbach, Sebastian [Department of Chemical Engineering and Biotechnology, University of Cambridge, New Museums Site, Pembroke Street, Cambridge, CB2 3RA (United Kingdom); Xu, Rong [School of Chemical and Biomedical Engineering, Nanyang Technological University, 62 Nanyang Drive, 637459 (Singapore); Yang, Wenming [Department of Mechanical Engineering, National University of Singapore, Engineering Block EA, Engineering Drive 1, 117576 (Singapore); Kraft, Markus, E-mail: mk306@cam.ac.uk [Department of Chemical Engineering and Biotechnology, University of Cambridge, New Museums Site, Pembroke Street, Cambridge, CB2 3RA (United Kingdom); School of Chemical and Biomedical Engineering, Nanyang Technological University, 62 Nanyang Drive, 637459 (Singapore)

    2017-04-15

    The method of moments is a simple but efficient method of solving the population balance equation which describes particle dynamics. Recently, the moment projection method (MPM) was proposed and validated for particle inception, coagulation, growth and, more importantly, shrinkage; here the method is extended to include the fragmentation process. The performance of MPM is tested for 13 different test cases for different fragmentation kernels, fragment distribution functions and initial conditions. Comparisons are made with the quadrature method of moments (QMOM), hybrid method of moments (HMOM) and a high-precision stochastic solution calculated using the established direct simulation algorithm (DSA) and advantages of MPM are drawn.

  6. Method for pre-processing LWR spent fuel

    International Nuclear Information System (INIS)

    Otsuka, Katsuyuki; Ebihara, Hikoe.

    1986-01-01

    Purpose: To facilitate the decladding of spent fuel, cladding tube processing, and waste gas recovery, and to enable the efficient execution of the main reprocessing processes thereafter. Constitution: Spent fuel assemblies are sent to a cutting process where they are cut into chips of easy-to-process size. In a thermal decladding process, the chips undergo thermal cycling in air, with the processing temperature increased and decreased within the range of 700 deg C to 1200 deg C, oxidizing the zircaloy of the cladding tubes into zirconia. The oxidized cladding tubes develop a number of fine cracks and become very brittle, loosening easily from the fuel pellets and turning to powder when even a slight mechanical force is applied. The processed products are then separated into zirconia sand and fuel pellets by a gravitational selection method or by sifting, the zirconia sand being sent to a waste processing process and the fuel pellets to a melting-refining process. (Yoshino, Y.)

  7. Dense Medium Machine Processing Method for Palm Kernel/ Shell ...

    African Journals Online (AJOL)

    ADOWIE PERE

    Cracked palm kernel is a mixture of kernels, broken shells, dusts and other impurities. In ... machine processing method using dense medium, a separator, a shell collector and a kernel .... efficiency, ease of maintenance and uniformity of.

  8. Novel welding image processing method based on fractal theory

    Institute of Scientific and Technical Information of China (English)

    陈强; 孙振国; 肖勇; 路井荣

    2002-01-01

    Computer vision has come into use in the fields of welding process control and automation. In order to improve the precision and speed of welding image processing, a novel method based on fractal theory is put forward in this paper. Compared with traditional methods, in the new method the image is first processed coarsely at the macroscopic level and then analyzed thoroughly in the microscopic regions. The image is divided into regions according to the different fractal characteristics of the image edges, the fuzzy regions containing image edges are detected, and the edges are then identified with the Sobel operator and fitted by the least squares method (LSM). Since the amount of data to be processed is decreased and the image noise is reduced, experiments confirm that the edges of the weld seam or weld pool can be recognized correctly and quickly.
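
    A minimal sketch of the Sobel edge-magnitude step named above is given below (the fractal region partitioning and the least-squares curve fitting are not reproduced); the weld-edge test image is synthetic.

```python
# Minimal Sobel edge-magnitude sketch (the fractal region analysis and
# least-squares curve fitting described in the record are not reproduced here).
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(3)
img = np.zeros((64, 64))
img[:, 32:] = 1.0                          # synthetic vertical "weld edge"
img += 0.05 * rng.normal(size=img.shape)

gx = ndimage.sobel(img, axis=1)            # horizontal gradient
gy = ndimage.sobel(img, axis=0)            # vertical gradient
magnitude = np.hypot(gx, gy)
edges = magnitude > 0.5 * magnitude.max()  # simple threshold
print("edge pixels found near columns 31-32:", int(edges[:, 31:33].sum()))
```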

  9. Measuring methods, registration and signal processing for magnetic field research

    International Nuclear Information System (INIS)

    Nagiello, Z.

    1981-01-01

    Some measuring methods and signal processing systems based on analogue and digital technics, which have been applied in magnetic field research using magnetometers with ferromagnetic transducers, are presented. (author)

  10. Model based methods and tools for process systems engineering

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    Process systems engineering (PSE) provides means to solve a wide range of problems in a systematic and efficient manner. This presentation will give a perspective on model based methods and tools needed to solve a wide range of problems in product-process synthesis-design. These methods and tools need to be integrated with work-flows and data-flows for specific product-process synthesis-design problems within a computer-aided framework. The framework therefore should be able to manage knowledge-data, models and the associated methods and tools needed by specific synthesis-design work-flows. The application of model based methods and tools within a computer aided framework for product-process synthesis-design will be highlighted.

  11. Methods for Optimisation of the Laser Cutting Process

    DEFF Research Database (Denmark)

    Dragsted, Birgitte

    This thesis deals with the adaptation and implementation of various optimisation methods, in the field of experimental design, for the laser cutting process. The problem of optimising the laser cutting process has been defined and a structure for a Decision Support System (DSS) for the optimisation of the laser cutting process has been suggested. The DSS consists of a database with the currently used and old parameter settings. Also, one of the optimisation methods has been implemented in the DSS in order to facilitate the optimisation procedure for the laser operator. The Simplex Method has been adapted in two versions: a qualitative one, which optimises the process by comparing the laser-cut items, and a quantitative one, which uses a weighted quality response in order to achieve a satisfactory quality and after that maximises the cutting speed, thus increasing the productivity of the process...

  12. Digital processing method for monitoring the radioactivity of stack releases

    International Nuclear Information System (INIS)

    Vialettes, H.; Leblanc, P.; Perotin, J.P.; Lazou, J.P.

    1978-01-01

    The digital processing method proposed is adapted for data supplied by a fixed-filter detector normally used for analogue processing (integrator system). On the basis of the raw data (pulses) from the detector, the technique makes it possible to determine the rate of activity released whereas analogue processing gives only the released activity. Furthermore, the method can be used to develop alarm systems on the basis of a possible exposure rate at the point of fall-out, and by including in the program a coefficient which allows for atmospheric diffusion conditions at any given time one can improve the accuracy of the results. In order to test the digital processing method and demonstrate its advantages over analogue processing, various atmospheric contamination situations were simulated in a glove-box and analysed simultaneously, using both systems, from the pulses transmitted by the same sampling and fixed-filter detection unit. The experimental results confirm the advantages foreseen in the theoretical research. (author)
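
    As a rough numerical illustration of the idea only — pulse counts per interval converted to a release-rate estimate, then weighted by an atmospheric dispersion coefficient for an alarm test — the sketch below uses entirely made-up calibration and dispersion values; none of them come from the paper.

```python
# Rough illustration only: convert detector pulse counts per interval into a
# release-rate estimate and apply a dispersion-weighted alarm test.
# Every coefficient below is a made-up placeholder, not a value from the paper.
import numpy as np

counts = np.array([120, 130, 190, 260, 240])   # pulses per 60 s interval (assumed)
dt = 60.0                                      # interval length [s]
efficiency = 0.25                              # detected counts per emission (assumed)
calibration = 1.0e3                            # Bq released per detected count (assumed)

count_rate = counts / dt                               # counts/s
release_rate = count_rate / efficiency * calibration   # Bq/s (illustrative)

dispersion = 2.0e-6        # assumed atmospheric dilution factor [s/m^3]
dose_coeff = 5.0e-10       # assumed exposure-rate coefficient (arbitrary units)
exposure_index = release_rate * dispersion * dose_coeff

alarm_level = 1.0e-11      # assumed alarm threshold in the same arbitrary units
print("release rate [Bq/s]:", release_rate)
print("alarm intervals:", np.where(exposure_index > alarm_level)[0])
```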

  13. Possibilities of implementing nonthermal processing methods in the dairy industry

    OpenAIRE

    Irena Jeličić

    2010-01-01

    In the past two decades a lot of research in the field of food science has focused on new, non-thermal processing methods. This article describes the most intensively investigated new processing methods for implementation in the dairy industry, such as microfiltration, high hydrostatic pressure, ultrasound and pulsed electric fields. For each method an overview is given of the principle of microbial inactivation, the results obtained regarding reduction of microorganisms, as well as the positive ...

  14. Minimal processing - preservation methods of the future: an overview

    International Nuclear Information System (INIS)

    Ohlsson, T.

    1994-01-01

    Minimal-processing technologies are modern techniques that provide sufficient shelf life to foods to allow their distribution, while also meeting the demands of the consumers for convenience and fresh-like quality. Minimal-processing technologies can be applied at various stages of the food distribution chain, in storage, in processing and/or in packaging. Examples of methods will be reviewed, including modified-atmosphere packaging, high-pressure treatment, sous-vide cooking and active packaging

  15. A process for application of ATHEANA - a new HRA method

    International Nuclear Information System (INIS)

    Parry, G.W.; Bley, D.C.; Cooper, S.E.

    1996-01-01

    This paper describes the analytical process for the application of ATHEANA, a new approach to the performance of human reliability analysis as part of a PRA. This new method, unlike existing methods, is based upon an understanding of the reasons why people make errors, and was developed primarily to address the analysis of errors of commission

  16. A Situational Implementation Method for Business Process Management Systems

    NARCIS (Netherlands)

    R.L. Jansen; J.P.P. Ravensteyn

    For the integrated implementation of Business Process Management and supporting information systems, many methods are available. Most of these methods, however, apply a one-size-fits-all approach and do not take into account the specific situation of the organization in which an information system is

  17. Assessing Commercial and Alternative Poultry Processing Methods using Microbiome Analyses

    Science.gov (United States)

    Assessing poultry processing methods/strategies has historically used culture-based methods to assess bacterial changes or reductions, both in terms of general microbial communities (e.g. total aerobic bacteria) or zoonotic pathogens of interest (e.g. Salmonella, Campylobacter). The advent of next ...

  18. Method and apparatus for lysing and processing algae

    Science.gov (United States)

    Chew, Geoffrey; Reich, Alton J.; Dykes, Jr., H. Waite H.; Di Salvo, Roberto

    2013-03-05

    Methods and apparatus for processing algae are described in which a hydrophilic ionic liquid is used to lyse algae cells at lower temperatures than existing algae processing methods. A salt or salt solution is used as a separation agent and to remove water from the ionic liquid, allowing the ionic liquid to be reused. The used salt may be dried or concentrated and reused. The relatively low lysis temperatures and recycling of the ionic liquid and salt reduce the environmental impact of the algae processing while providing biofuels and other useful products.

  19. Municipal solid waste processing methods: Technical-economic comparison

    International Nuclear Information System (INIS)

    Bertanza, G.

    1993-01-01

    This paper points out the advantages and disadvantages of municipal solid waste processing methods incorporating different energy and/or materials recovery techniques, i.e., those involving composting or incineration and those with a mix of composting and incineration. The various technologies employed are compared especially with regard to process reliability, flexibility, modularity, pollution control efficiency and cost effectiveness. As regards composting, biodigestors are examined, while for incineration the paper analyzes systems using combustion with complete recovery of vapour, combustion with total recovery of available electric energy, and combustion with cogeneration. Each of the processing methods examined includes an iron recovery cycle

  20. Processing methods for temperature-dependent MCNP libraries

    International Nuclear Information System (INIS)

    Li Songyang; Wang Kan; Yu Ganglin

    2008-01-01

    In this paper, the processing method of NJOY, which transfers ENDF files into ACE (A Compact ENDF) files (the point-wise cross-section files used by the MCNP program), is discussed. Temperatures that cover the range for reactor design and operation are considered. Three benchmarks are used for testing the method: the Jezebel benchmark, a 28 cm-thick slab core benchmark and an LWR benchmark with burnable absorbers. The calculation results showed the precision of the neutron cross-section library and verified the correctness of the NJOY processing procedures. (authors)

  1. An alternative method to achieve metrological confirmation in measurement process

    Science.gov (United States)

    Villeta, M.; Rubio, E. M.; Sanz, A.; Sevilla, L.

    2012-04-01

    The metrological confirmation process must be designed and implemented to ensure that the metrological characteristics of the measurement system meet the metrological requirements of the measurement process. The aim of this paper is to present an alternative to the traditional metrological requirements on the relationship between tolerance and measurement uncertainty for developing such confirmation processes. The proposed approach to metrological confirmation considers a given inspection task of the measurement process within the manufacturing system, and it is based on the Index of Contamination of the Capability (ICC). The metrological confirmation process is then developed taking into account producer risks and economic considerations on this index. As a consequence, depending on the capability of the manufacturing process, the measurement system will or will not be in an adequate state of metrological confirmation for the measurement process.

  2. System and method for deriving a process-based specification

    Science.gov (United States)

    Hinchey, Michael Gerard (Inventor); Rash, James Larry (Inventor); Rouff, Christopher A. (Inventor)

    2009-01-01

    A system and method for deriving a process-based specification for a system is disclosed. The process-based specification is mathematically inferred from a trace-based specification. The trace-based specification is derived from a non-empty set of traces or natural language scenarios. The process-based specification is mathematically equivalent to the trace-based specification. Code is generated, if applicable, from the process-based specification. A process, or phases of a process, using the features disclosed can be reversed and repeated to allow for an interactive development and modification of legacy systems. The process is applicable to any class of system, including, but not limited to, biological and physical systems, electrical and electro-mechanical systems in addition to software, hardware and hybrid hardware-software systems.

  3. Calcification–carbonation method for red mud processing

    Energy Technology Data Exchange (ETDEWEB)

    Li, Ruibing [School of Metallurgy, Northeastern University, Shenyang 110819 (China); Laboratory for Simulation and Modelling of Particulate Systems, Department of Chemical Engineering, Monash University, Clayton, Victoria, 3800 (Australia); Zhang, Tingan, E-mail: zhangta@smm.neu.edu.cn [School of Metallurgy, Northeastern University, Shenyang 110819 (China); Liu, Yan; Lv, Guozhi; Xie, Liqun [School of Metallurgy, Northeastern University, Shenyang 110819 (China)

    2016-10-05

    Highlights: • A new approach named the calcification–carbonation method for red mud processing is proposed. • The method can prevent emission of red mud from alumina production and is good for the environment. • Thermodynamic characteristics were investigated. • The method was verified experimentally using a jet-flow reactor. - Abstract: Red mud, the Bayer process residue, is generated by the alumina industry and causes environmental problems. In this paper, a novel calcification–carbonation method that utilizes a large amount of the Bayer process residue is proposed. Using this method, the red mud was calcified with lime to transform the silicon phase into hydrogarnet, and the alkali in red mud was recovered. Then, the resulting hydrogarnet was decomposed by CO2 carbonation, affording calcium silicate, calcium carbonate, and aluminum hydroxide. Alumina was recovered using an alkaline solution at a low temperature. The effects of the new process were analyzed by thermodynamic analysis and experiments. The extraction efficiency of the alumina and soda obtained from the red mud reached 49.4% and 96.8%, respectively. The new red mud with <0.3% alkali can be used in cement production. Using a combination of this method and cement production, the Bayer process red mud can be completely utilized.
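
    A small worked mass balance with the reported extraction efficiencies (49.4% alumina, 96.8% soda) is sketched below; the feed composition figures are assumptions made up for illustration, not data from the paper.

```python
# Worked mass-balance illustration using the reported extraction efficiencies
# (49.4% alumina, 96.8% soda). The feed composition below is an assumption made
# up for illustration, not data from the paper.
red_mud = 100.0    # t of Bayer red mud processed (assumed batch size)
w_Al2O3 = 0.20     # assumed mass fraction of Al2O3 in the red mud
w_Na2O = 0.08      # assumed mass fraction of Na2O in the red mud

alumina_recovered = red_mud * w_Al2O3 * 0.494
soda_recovered = red_mud * w_Na2O * 0.968
residual_alkali_fraction = w_Na2O * (1 - 0.968)   # unrecovered alkali per t of feed

print(f"alumina recovered: {alumina_recovered:.2f} t")
print(f"soda recovered:    {soda_recovered:.2f} t")
print(f"residual alkali:   {residual_alkali_fraction:.3%} of the processed mass "
      "(the real figure depends on the actual feed composition)")
```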

  4. An object-oriented description method of EPMM process

    Science.gov (United States)

    Jiang, Zuo; Yang, Fan

    2017-06-01

    In order to use mature object-oriented tools and languages in software process modelling, and to make the software process model accord better with industrial standards, it is necessary to study object-oriented modelling of the software process. Based on the formal process definition in EPMM, and considering that Petri nets are mainly a formal modelling tool, this paper combines Petri net modelling with the object-oriented modelling idea and provides an implementation method for converting EPMM models based on Petri nets into object models based on an object-oriented description.

  5. Post-Processing in the Material-Point Method

    DEFF Research Database (Denmark)

    Andersen, Søren; Andersen, Lars Vabbersgaard

    The material-point method (MPM) is a numerical method for dynamic or static analysis of solids using a discretization in time and space. The method has shown to be successful in modelling physical problems involving large deformations, which are difficult to model with traditional numerical tools such as the finite element method. In the material-point method, a set of material points is utilized to track the problem in time and space, while a computational background grid is utilized to obtain spatial derivatives relevant to the physical problem. Building on the current research within the material-point method, this work presents ideas for post-processing of material-point method results. The first idea involves associating a volume with each material point and displaying the deformation of this volume. In the discretization process, the physical domain is divided into a number of smaller volumes, each represented by a simple shape; here quadrilaterals are chosen for the presented examples.

  6. Method for innovative synthesis-design of chemical process flowsheets

    DEFF Research Database (Denmark)

    Kumar Tula, Anjan; Gani, Rafiqul

    Chemical process synthesis-design involves the identification of the processing route to reach a desired product from a specified set of raw materials, the design of the operations involved in the processing route, the calculation of utility requirements, the calculation of waste and emissions to the surroundings, and many more. Different methods (knowledge-based [1], mathematical programming [2], hybrid, etc.) have been proposed and are currently employed to solve these synthesis-design problems. D’ Anterroches [3] proposed a group contribution based approach to solve the synthesis-design problem of chemical processes, where chemical process flowsheets could be synthesized in the same way as atoms or groups of atoms are synthesized to form molecules in computer aided molecular design (CAMD) techniques [4]. That is, from a library of building blocks (functional process-groups) and a set of rules to join ...

  7. Unsupervised process monitoring and fault diagnosis with machine learning methods

    CERN Document Server

    Aldrich, Chris

    2013-01-01

    This unique text/reference describes in detail the latest advances in unsupervised process monitoring and fault diagnosis with machine learning methods. Abundant case studies throughout the text demonstrate the efficacy of each method in real-world settings. The broad coverage examines such cutting-edge topics as the use of information theory to enhance unsupervised learning in tree-based methods, the extension of kernel methods to multiple kernel learning for feature extraction from data, and the incremental training of multilayer perceptrons to construct deep architectures for enhanced data
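
    The kernel and deep-learning methods covered by the book are not reproduced here; as a generic baseline of unsupervised process monitoring, the sketch below fits PCA on normal operating data and flags new samples by a Hotelling T^2 score with an empirical control limit. It is a standard MSPC-style example, not a method taken from the book.

```python
# Baseline sketch of unsupervised process monitoring: fit PCA on normal operating
# data and flag new samples with a large Hotelling T^2 score. This is a generic
# MSPC-style example, not a method taken from the book described above.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)
normal_data = rng.normal(size=(500, 8))            # training data: normal operation
pca = PCA(n_components=3).fit(normal_data)

def t2_scores(X):
    scores = pca.transform(X)
    return np.sum(scores**2 / pca.explained_variance_, axis=1)

limit = np.quantile(t2_scores(normal_data), 0.99)  # empirical 99% control limit

new_batch = rng.normal(size=(5, 8))
new_batch[2] += 4.0                                # inject a process fault
flags = t2_scores(new_batch) > limit
print("flagged samples:", np.where(flags)[0])
```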

  8. Improved Methods for Production Manufacturing Processes in Environmentally Benign Manufacturing

    Directory of Open Access Journals (Sweden)

    Yan-Yan Wang

    2011-09-01

    Full Text Available How to design a production process with low carbon emissions and low environmental impact as well as high manufacturing performance is a key factor in the success of low-carbon production. It is important to address concerns about climate change for the large carbon emission source manufacturing industries because of their high energy consumption and environmental impact during the manufacturing stage of the production life cycle. In this paper, methodology for determining a production process is developed. This methodology integrates process determination from three different levels: new production processing, selected production processing and batch production processing. This approach is taken within a manufacturing enterprise based on prior research. The methodology is aimed at providing decision support for implementing Environmentally Benign Manufacturing (EBM and low-carbon production to improve the environmental performance of the manufacturing industry. At the first level, a decision-making model for new production processes based on the Genetic Simulated Annealing Algorithm (GSAA is presented. The decision-making model considers not only the traditional factors, such as time, quality and cost, but also energy and resource consumption and environmental impact, which are different from the traditional methods. At the second level, a methodology is developed based on an IPO (Input-Process-Output model that integrates assessments of resource consumption and environmental impact in terms of a materials balance principle for batch production processes. At the third level, based on the above two levels, a method for determining production processes that focus on low-carbon production is developed based on case-based reasoning, expert systems and feature technology for designing the process flow of a new component. Through the above three levels, a method for determining the production process to identify, quantify, assess, and optimize the

  9. Ethnographic methods for process evaluations of complex health behaviour interventions.

    Science.gov (United States)

    Morgan-Trimmer, Sarah; Wood, Fiona

    2016-05-04

    This article outlines the contribution that ethnography could make to process evaluations for trials of complex health-behaviour interventions. Process evaluations are increasingly used to examine how health-behaviour interventions operate to produce outcomes and often employ qualitative methods to do this. Ethnography shares commonalities with the qualitative methods currently used in health-behaviour evaluations but has a distinctive approach over and above these methods. It is an overlooked methodology in trials of complex health-behaviour interventions that has much to contribute to the understanding of how interventions work. These benefits are discussed here with respect to three strengths of ethnographic methodology: (1) producing valid data, (2) understanding data within social contexts, and (3) building theory productively. The limitations of ethnography within the context of process evaluations are also discussed.

  10. Methods for the Evaluation of Waste Treatment Processes

    Directory of Open Access Journals (Sweden)

    Hans-Joachim Gehrmann

    2017-01-01

    Full Text Available Decision makers for waste management are confronted with the problem of selecting the most economic, environmental, and socially acceptable waste treatment process. This paper elucidates evaluation methods for waste treatment processes for the comparison of ecological and economic aspects such as material flow analysis, statistical entropy analysis, energetic and exergetic assessment, cumulative energy demand, and life cycle assessment. The work is based on the VDI guideline 3925. A comparison of two thermal waste treatment plants with different process designs and energy recovery systems was performed with the described evaluation methods. The results are mainly influenced by the type of energy recovery, where the waste-to-energy plant providing district heat and process steam emerged to be beneficial in most aspects. Material recovery options from waste incineration were evaluated according to sustainability targets, such as saving of resources and environmental protection.

  11. Adoption of the Creative Process According to the Immersive Method

    Directory of Open Access Journals (Sweden)

    Sonja Vuk

    2015-09-01

    Full Text Available The immersive method is a new concept of visual education that is better suited to the needs of students in contemporary post-industrial society. The features of the immersive method are: (1) it emerges from interaction with visual culture; (2) it encourages understanding of contemporary art (as an integral part of visual culture); and (3) it implements the strategies and processes of the dominant tendencies in contemporary art (new media art and relational art) with the goal of adopting the creative process, expressing one’s thoughts and emotions, and communicating with the environment. The immersive method transfers the creative process from art to the process of creation by the students themselves. This occurs with the mediation of an algorithmic scheme that enables students to adopt ways to solve problems, to express thoughts and emotions, to develop ideas and to transfer these ideas to form, medium and material. The immersive method uses transfer in classes, the therapeutic aspect of art and “flow state” (the optimal experience of being immersed in an activity) / aesthetic experience (a total experience that has a beginning, a process and a conclusion) / immersive experience (comprehensive immersion in the present moment). This is a state leading to the sublimative effect of creation (identification with what has been expressed), as well as to self-actualisation. The immersive method teaches one to connect the context, social relations and the artwork as a whole in which one lives as an individual. The adopted creative process is implemented in a critical manner on one’s surrounding through analysis, aesthetic interventions, and ecologically and socially aware inclusion in the life of a community. The students gain the crucial meta-competence of a creative thinking process.

  12. Radiation process control, study and acceptance of dosimetric methods

    International Nuclear Information System (INIS)

    Radak, B.B.

    1984-01-01

    The methods of primary dosimetric standardization and the calibration of dosimetric monitors suitable for radiation process control were outlined in the form of a logical pattern reflecting their current industrial-scale use in Yugoslavia. The reliability of the process control of industrial sterilization of medical supplies over the last four years was discussed. The preparatory work for the intermittent use of electron beams in the cable industry was described. (author)

  13. Decision Support Methods for Supply Processes in the Floral Industry

    Directory of Open Access Journals (Sweden)

    Kutyba Agata

    2017-12-01

    Full Text Available The aim of this paper was to show the application of ABC analysis and the AHP (the analytic hierarchy process, a multi-criteria method for the hierarchical analysis of decision processes) as an important part of decision making in the supply processes realized in the floral industry. The ABC analysis was performed in order to classify the product mix from the perspective of demand value. This in consequence enabled us to identify the most important products, which were then used as variants in the AHP method.
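
    The AHP pairwise-comparison step is not reproduced here; a minimal ABC-classification sketch by cumulative demand-value share is shown below, with the conventional 80%/95% cut-offs assumed and hypothetical product data.

```python
# Minimal ABC classification sketch: rank items by demand value and cut the
# cumulative share at conventional 80%/95% thresholds (assumed here; the AHP
# pairwise-comparison step of the paper is not reproduced).
demand_value = {"roses": 40000, "tulips": 22000, "lilies": 10000,
                "foam": 4000, "ribbon": 2500, "wrap": 1500}   # hypothetical data

total = sum(demand_value.values())
cumulative = 0.0
classes = {}
for item, value in sorted(demand_value.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += value / total
    classes[item] = "A" if cumulative <= 0.80 else ("B" if cumulative <= 0.95 else "C")

for item, cls in classes.items():
    print(f"{item:8s} -> class {cls}")
```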

  14. Methods of Complex Data Processing from Technical Means of Monitoring

    Directory of Open Access Journals (Sweden)

    Serhii Tymchuk

    2017-03-01

    Full Text Available The problem of processing information from different types of monitoring equipment was examined. As a possible solution, the use of generalized information-processing methods was proposed, based on clustering techniques that combine territorial information sources for monitoring and on a frame model of the knowledge base for identification of the monitored objects. The clustering methods were formed on the basis of the Lance-Williams hierarchical agglomerative procedure using the Ward metric. The frame model of the knowledge base was built using the tools of object-oriented modeling.
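
    A minimal sketch of Ward-linkage agglomerative clustering (which scipy computes through the Lance-Williams recurrence) on synthetic monitoring readings is given below; the frame-based knowledge base part of the record is not reproduced.

```python
# Minimal sketch of hierarchical agglomerative clustering with the Ward metric
# (scipy implements it via the Lance-Williams update); the frame-based knowledge
# base described in the record is not reproduced here.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(5)
# Synthetic readings from monitoring sources around three territorial centres.
readings = np.vstack([rng.normal(loc=c, scale=0.3, size=(20, 2))
                      for c in ([0, 0], [3, 0], [0, 3])])

Z = linkage(readings, method="ward")          # Ward linkage
labels = fcluster(Z, t=3, criterion="maxclust")
print("cluster sizes:", np.bincount(labels)[1:])
```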

  15. Bridging Technometric Method and Innovation Process: An Initial Study

    Science.gov (United States)

    Rumanti, A. A.; Reynaldo, R.; Samadhi, T. M. A. A.; Wiratmadja, I. I.; Dwita, A. C.

    2018-03-01

    The process of innovation is one of the ways utilized to increase the capability of a technology component so that it reflects the needs of an SME. The technometric method can be used to identify the level of technology advancement in an SME and also which technology component needs to be maximized in order to deliver a significant innovation. This paper serves as an early study that lays out a conceptual framework bridging the principles of the innovation process, taken from the well-established innovation model by Martin, with the technometric method, based on initial background research conducted at SME Ira Silver in Jogjakarta, Indonesia.

  16. Numerical methods in image processing for applications in jewellery industry

    OpenAIRE

    Petrla, Martin

    2016-01-01

    The presented thesis deals with a problem from the field of image processing for application in the multiple scanning of jewellery stones. The aim is to develop a method for preprocessing and subsequent mathematical registration of images in order to increase the effectiveness and reliability of the output quality control. For these purposes the thesis summarizes the mathematical definition of a digital image as well as the theoretical basis of image registration. It proposes a method adjusting every single image ...

  17. A Block-Asynchronous Relaxation Method for Graphics Processing Units

    OpenAIRE

    Anzt, H.; Dongarra, J.; Heuveline, Vincent; Tomov, S.

    2011-01-01

    In this paper, we analyze the potential of asynchronous relaxation methods on Graphics Processing Units (GPUs). For this purpose, we developed a set of asynchronous iteration algorithms in CUDA and compared them with a parallel implementation of synchronous relaxation methods on CPU-based systems. For a set of test matrices taken from the University of Florida Matrix Collection we monitor the convergence behavior, the average iteration time and the total time-to-solution time. Analyzing the r...
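
    The CUDA implementations of the asynchronous kernels are not reproduced here; for reference, the sketch below is only the conventional synchronous Jacobi baseline that such block-asynchronous variants are usually compared against, run on an assumed diagonally dominant test matrix.

```python
# Plain synchronous Jacobi relaxation sketch (CPU/numpy). The block-asynchronous
# GPU variants studied in the record relax the strict per-sweep synchronisation;
# this is only the conventional baseline they are compared against.
import numpy as np

def jacobi(A, b, iters=200):
    D = np.diag(A)
    R = A - np.diagflat(D)
    x = np.zeros_like(b)
    for _ in range(iters):
        x = (b - R @ x) / D        # every component updated from the old iterate
    return x

rng = np.random.default_rng(6)
n = 50
A = rng.random((n, n))
A += n * np.eye(n)                 # make the matrix diagonally dominant
b = rng.random(n)

x = jacobi(A, b)
print("residual norm:", np.linalg.norm(A @ x - b))
```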

  18. Apparatus and method X-ray image processing

    International Nuclear Information System (INIS)

    1984-01-01

    The invention relates to a method for X-ray image processing. The radiation passed through the object is transformed into an electric image signal from which the logarithmic value is determined and displayed by a display device. Its main objective is to provide a method and apparatus that renders X-ray images or X-ray subtraction images with strong reduction of stray radiation. (Auth.)

  19. Method of processing radioactive metallic sodium with recycling alcohols

    International Nuclear Information System (INIS)

    Sakai, Takuhiko; Mitsuzuka, Norimasa.

    1980-01-01

    Purpose: To employ a highly safe alcohol process and decrease the amount of wastes in the processing of radioactive metallic sodium discharged from LMFBR type reactors. Method: Radioactive metallic sodium containing long half-life nuclides such as cesium, strontium, barium, cerium, lanthanum or zirconium is dissolved in an alcohol of about 70% purity. After extracting the sodium alcoholate thus formed, gaseous hydrogen chloride is blown in to separate the sodium alcoholate into alcohol and sodium chloride, and the regenerated alcohol is used again for dissolving sodium metal. The sodium chloride thus separated is processed into solid wastes. (Furukawa, Y.)

  20. Information processing systems, reasoning modules, and reasoning system design methods

    Science.gov (United States)

    Hohimer, Ryan E; Greitzer, Frank L; Hampton, Shawn D

    2014-03-04

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.
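
    As a hypothetical toy reading of the dispatch idea in this patent text — reasoning modules registered per ontology classification type, each processing only the matching abstractions in a semantic-graph working memory — the sketch below uses invented class names and data; it is not the patented system.

```python
# Hypothetical toy sketch of the dispatch idea in the patent text: reasoning
# modules registered per ontology classification type process only the matching
# abstractions in a semantic-graph working memory. All names here are invented.
from collections import defaultdict

working_memory = [                      # toy "semantic graph" of abstractions
    {"individual": "alice", "type": "Person"},
    {"individual": "badge-17", "type": "Credential"},
    {"individual": "bob", "type": "Person"},
]

reasoners = defaultdict(list)           # classification type -> reasoning modules

def register(classification):
    def wrap(fn):
        reasoners[classification].append(fn)
        return fn
    return wrap

@register("Person")
def person_reasoner(abstraction):
    return f"checked behaviour profile of {abstraction['individual']}"

@register("Credential")
def credential_reasoner(abstraction):
    return f"validated issuance of {abstraction['individual']}"

for abstraction in working_memory:
    for module in reasoners[abstraction["type"]]:
        print(module(abstraction))
```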

  1. Information processing systems, reasoning modules, and reasoning system design methods

    Energy Technology Data Exchange (ETDEWEB)

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2016-08-23

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  2. Information processing systems, reasoning modules, and reasoning system design methods

    Energy Technology Data Exchange (ETDEWEB)

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2015-08-18

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  3. Method for treating a nuclear process off-gas stream

    International Nuclear Information System (INIS)

    Pence, D.T.; Chou, C.-C.

    1981-01-01

    A method is described for selectively removing and recovering the noble gas and other gaseous components typically emitted during nuclear process operations. The method is useful for treating dissolver off-gas effluents released during reprocessing of spent nuclear fuels to permit radioactive contaminant recovery prior to releasing the remaining off-gases to the atmosphere. The method involves a sequence of adsorption and desorption steps which are specified. Particular reference is made to the separation of xenon and krypton from the off-gas stream, and to the use of silver-exchanged mordenite as the adsorbent. (U.K.)

  4. Pyrochemical and Dry Processing Methods Program. A selected bibliography

    International Nuclear Information System (INIS)

    McDuffie, H.F.; Smith, D.H.; Owen, P.T.

    1979-03-01

    This selected bibliography with abstracts was compiled to provide information support to the Pyrochemical and Dry Processing Methods (PDPM) Program sponsored by DOE and administered by the Argonne National Laboratory. Objectives of the PDPM Program are to evaluate nonaqueous methods of reprocessing spent fuel as a route to the development of proliferation-resistant and diversion-resistant methods for widespread use in the nuclear industry. Emphasis was placed on the literature indexed in the ERDA--DOE Energy Data Base (EDB). The bibliography includes indexes to authors, subject descriptors, EDB subject categories, and titles

  5. Pyrochemical and Dry Processing Methods Program. A selected bibliography

    Energy Technology Data Exchange (ETDEWEB)

    McDuffie, H.F.; Smith, D.H.; Owen, P.T.

    1979-03-01

    This selected bibliography with abstracts was compiled to provide information support to the Pyrochemical and Dry Processing Methods (PDPM) Program sponsored by DOE and administered by the Argonne National Laboratory. Objectives of the PDPM Program are to evaluate nonaqueous methods of reprocessing spent fuel as a route to the development of proliferation-resistant and diversion-resistant methods for widespread use in the nuclear industry. Emphasis was placed on the literature indexed in the ERDA--DOE Energy Data Base (EDB). The bibliography includes indexes to authors, subject descriptors, EDB subject categories, and titles.

  6. Literature Review on Processing and Analytical Methods for ...

    Science.gov (United States)

    The purpose of this report was to survey the open literature to determine the current state of the science regarding the processing and analytical methods currently available for recovery of F. tularensis from water and soil matrices, and to determine what gaps remain in the collective knowledge concerning F. tularensis identification from environmental samples.

  7. Effect of Processing Methods on Nutrient Contents of Six Sweet ...

    African Journals Online (AJOL)

    in rural communities and was often given low priority. Currently ... (p≤0.05) differences between varieties in protein, fat, reducing sugars, carbohydrates, total carotenoids, calcium, iron ... maturity (about 5 months, average maturity rate for ... 630-12) (method. 968.08). ..... processing sweet potato by either boiling, roasting or.

  8. Possibilities of Particle Finite Element Methods in Industrial Forming Processes

    Science.gov (United States)

    Oliver, J.; Cante, J. C.; Weyler, R.; Hernandez, J.

    2007-04-01

    The work investigates the possibilities offered by the particle finite element method (PFEM) in the simulation of forming problems involving large deformations, multiple contacts, and new boundaries generation. The description of the most distinguishing aspects of the PFEM, and its application to simulation of representative forming processes, illustrate the proposed methodology.

  9. The Discovery of Processing Stages: Extension of Sternberg's Method

    NARCIS (Netherlands)

    Anderson, John R; Zhang, Qiong; Borst, Jelmer P; Walsh, Matthew M

    2016-01-01

    We introduce a method for measuring the number and durations of processing stages from the electroencephalographic signal and apply it to the study of associative recognition. Using an extension of past research that combines multivariate pattern analysis with hidden semi-Markov models, the approach

  10. Complexity, Methodology and Method: Crafting a Critical Process of Research

    Science.gov (United States)

    Alhadeff-Jones, Michel

    2013-01-01

    This paper defines a theoretical framework aiming to support the actions and reflections of researchers looking for a "method" in order to critically conceive the complexity of a scientific process of research. First, it starts with a brief overview of the core assumptions framing Morin's "paradigm of complexity" and Le…

  11. Indigenous processing methods and raw materials of borde, an ...

    African Journals Online (AJOL)

    A flow chart of borde production was constructed showing four major processing stages. The short shelf life of borde and the seasonal variations in production volume were identified as major problems for the vendors in the study areas. Keywords: indigenous methods; cereal fermentation; borde; beverage; Ethiopia J Food ...

  12. Option pricing with COS method on Graphics Processing Units

    NARCIS (Netherlands)

    B. Zhang (Bo); C.W. Oosterlee (Kees)

    2009-01-01

    In this paper, acceleration on the GPU for option pricing by the COS method is demonstrated. In particular, both European and Bermudan options will be discussed in detail. For Bermudan options, we consider both the Black-Scholes model and Levy processes of infinite activity. Moreover,

  13. Forest Service National Visitor Use Monitoring Process: Research Method Documentation

    Science.gov (United States)

    Donald B.K. English; Susan M. Kocis; Stanley J. Zarnoch; J. Ross Arnold

    2002-01-01

    In response to the need for improved information on recreational use of National Forest System lands, the authors have developed a nationwide, systematic monitoring process. This report documents the methods they used in estimating recreational use on an annual basis. The basic unit of measure is exiting volume of visitors from a recreation site on a given day. Sites...

  14. Non-filtration method of processing of uranium ores

    International Nuclear Information System (INIS)

    Laskorin, B.N.; Vodolazov, L.I.; Tokarev, N.N.; Vyalkov, V.I.; Goldobina, V.A.; Gosudarstvennyj Komitet po Ispol'zovaniyu Atomnoj Ehnergii SSSR, Moscow)

    1977-01-01

    The development of the filterless sorption method has led to the development of the sorption leaching process and the extraction desorption process, which have made it possible to intensify uranium ore processing and to improve the technical and economic indexes greatly by eliminating the complex procedure of multiple filtration and repulping of cakes. This method makes it possible to involve poorer uranium raw materials and at the same time to extract valuable components: molybdenum, vanadium, copper, etc. Considerable industrial experience has been accumulated in the sorption of dense pulp with a solid-to-liquid phase ratio of 1:1. This has led to a 1.5-3.0-fold increase in the productivity of operating plants, a 5-10% increase in uranium extraction, a two- to three-fold increase in the labour capacity of the main workers, and a several-fold decrease in the consumption of reagents, auxiliary materials, electric energy and vapour. The developed technology is in fact continuous in all its steps, with complete automation of the process using the simplest and most readily available means of regulation and control. The process is equipped with high-productivity, high-power apparatuses with mechanical and pneumatic mixing for high-density pulps, and with the KDS, KDZS, KNSPR and PIK columns for the regeneration of saturated sorbent in the counterflow regime. The use of fine-grained hydrophilic ion-exchange resins in a hydrophobized state is foreseen [ru]

  15. Recovery process of elite athletes: A review of contemporary methods

    Directory of Open Access Journals (Sweden)

    Veljović Draško

    2012-01-01

    Full Text Available Numerous training stimuli, as well as competition, can reduce athletes' level of ability. This decline in performance can be a temporary phenomenon, lasting several minutes or several hours after a workout, or it can last much longer, even several days. A lack of adequate recovery can leave athletes unable to train at the desired intensity or to fully meet the demands of the next training session. Chronic fatigue can lead to injuries, and therefore full recovery is necessary for achieving the optimal level of ability that will ensure better athletic performance. For these reasons, athletes often use a variety of techniques and methods aimed at recovery after training or competition. These have become a part of the training process, and their purpose is to reduce the stress and fatigue incurred as a result of daily exposure to intense training stimuli. Numerous methods and techniques exist today that can accelerate the recovery process of athletes. It is therefore necessary to know the efficiency of a given method before it is applied in the training process. The aim of this review article is to point to the methods currently used and their effects on the process of recovery after physical activity in elite sport.

  16. Research on interpolation methods in medical image processing.

    Science.gov (United States)

    Pan, Mei-Sen; Yang, Xiao-Li; Tang, Jing-Tian

    2012-04-01

    Image interpolation is widely used in the field of medical image processing. In this paper, interpolation methods are divided into three groups: filter interpolation, ordinary interpolation and general partial volume interpolation. Some commonly used filter methods for image interpolation are reviewed, but the interpolation effects need to be further improved. When analyzing and discussing ordinary interpolation, many asymmetrical kernel interpolation methods are proposed; compared with symmetrical kernel ones, the former have some advantages. After analyzing the partial volume and generalized partial volume estimation interpolations, the new concept and constraint conditions of the general partial volume interpolation are defined, and several new partial volume interpolation functions are derived. By performing experiments on image scaling, rotation and self-registration, the interpolation methods mentioned in this paper are compared in terms of entropy, peak signal-to-noise ratio, cross entropy, normalized cross-correlation coefficient and running time. Among the filter interpolation methods, the median and B-spline filter interpolations have relatively better interpolating performance. Among the ordinary interpolation methods, on the whole, the symmetrical cubic kernel interpolations demonstrate a strong advantage, especially the symmetrical cubic B-spline interpolation, although they are very time-consuming and have lower time efficiency. As for the general partial volume interpolation methods, in terms of the total error of image self-registration, the symmetrical interpolations provide certain superiority; but considering processing efficiency, the asymmetrical interpolations are better.
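
    As a rough illustration of the scale-and-rescale comparison described in the abstract above (not the authors' code), the following Python sketch shrinks and re-enlarges a synthetic test image with nearest-neighbour, linear and cubic B-spline interpolation via scipy.ndimage.zoom and scores each round trip with the peak signal-to-noise ratio; the test image, zoom factor and metric choice are illustrative assumptions.

```python
# Sketch of a scale-down / scale-up interpolation comparison (illustrative only).
import numpy as np
from scipy import ndimage


def psnr(original, restored):
    """Peak signal-to-noise ratio in dB for images in [0, 1]."""
    mse = np.mean((original - restored) ** 2)
    return np.inf if mse == 0 else 10.0 * np.log10(1.0 / mse)


def round_trip(image, factor, order):
    """Shrink then re-enlarge an image with a spline of the given order."""
    small = ndimage.zoom(image, 1.0 / factor, order=order)
    back = ndimage.zoom(small, (image.shape[0] / small.shape[0],
                                image.shape[1] / small.shape[1]), order=order)
    return np.clip(back, 0.0, 1.0)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Smooth synthetic test image standing in for a medical slice.
    x, y = np.meshgrid(np.linspace(0, 1, 256), np.linspace(0, 1, 256))
    image = 0.5 + 0.5 * np.sin(8 * np.pi * x) * np.cos(6 * np.pi * y)
    image = np.clip(image + 0.02 * rng.standard_normal(image.shape), 0, 1)

    for order, name in [(0, "nearest"), (1, "linear"), (3, "cubic B-spline")]:
        restored = round_trip(image, factor=2, order=order)
        print(f"{name:>14s}: PSNR = {psnr(image, restored):.2f} dB")
```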

  17. Processing Methods of Alkaline Hydrolysate from Rice Husk

    Directory of Open Access Journals (Sweden)

    Olga D. Arefieva

    2017-07-01

    Full Text Available This paper is devoted to finding processing methods for the alkaline hydrolysate produced from rice husk pre-extraction, and discusses alkaline hydrolysate processing schemes and the separation of several products: amorphous silica of various quality, alkaline lignin, and water- and alkaline-extraction polysaccharides. Silica samples were characterized: crude (air-dried), burnt (no preliminary water treatment), washed in distilled water, and washed in distilled water and burnt. Waste water parameters upon the extraction of solids from the alkaline hydrolysate dropped by factors of a few dozen to a few thousand depending on the applied processing method. Color decreased a few thousand times, turbidity was virtually eliminated, chemical oxygen demand decreased about 20–136 times, and polyphenol content might decrease by 50% or be virtually eliminated. The most promising scheme yielded the following two solid products from rice husk alkaline hydrolysate: amorphous silica and alkaline-extraction polysaccharide. The chemical oxygen demand of the remaining waste water decreased about 140 times compared to the silica-free solution.

  18. Method of processing low-level radioactive liquid wastes

    International Nuclear Information System (INIS)

    Matsunaga, Ichiro; Sugai, Hiroshi.

    1984-01-01

    Purpose: To effectively reduce the radioactivity density of low-level radioactive liquid wastes discharged from enriched uranium conversion processing steps or the like. Method: Hydrazine is added to low-level radioactive liquid wastes, which are brought into contact with iron hydroxide-cation exchange resins prepared by treating strongly acidic cation exchange resins with ferric chloride and aqueous ammonia to form hydrolyzates of ferric ions in the resin. The hydrazine added here may be any of hydrazine hydrate, hydrazine hydrochloride and hydrazine sulfate. The preferred addition amount is more than 100 mg per litre of the liquid wastes; if it is less than 100 mg, the reduction ratio for the radioactivity density (processed liquid density/original liquid density) is decreased. This method makes it possible to effectively reduce the radioactivity density of low-level radioactive liquid wastes containing trace amounts of radioactive nuclides. (Yoshihara, H.)

  19. Methods and systems for the processing of physiological signals

    International Nuclear Information System (INIS)

    Cosnac, B. de; Gariod, R.; Max, J.; Monge, V.

    1975-01-01

    This note is a general survey of the processing of physiological signals. After an introduction on electrodes and their limitations, the physiological nature of the main signals is briefly recalled. Different methods (signal averaging, spectral analysis, morphological shape analysis) are described as applied to the fields of magnetocardiography, electroencephalography, cardiography and electronystagmography. Processing means (single portable instruments and programmable systems) are described through the example of their application to rheography and to the Plurimat'S general system. In conclusion, the methods of signal processing are dominated by the morphological analysis of curves and by the need for a more substantial introduction of statistical classification. As for the instruments, microprocessors will appear, but specific operators linked to computers will certainly grow [fr]

  20. Collaborative simulation method with spatiotemporal synchronization process control

    Science.gov (United States)

    Zou, Yisheng; Ding, Guofu; Zhang, Weihua; Zhang, Jian; Qin, Shengfeng; Tan, John Kian

    2016-10-01

    When designing a complex mechatronics system, such as a high speed train, it is relatively difficult to effectively simulate the entire system's dynamic behaviors because it involves multi-disciplinary subsystems. Currently, the most practical approach for multi-disciplinary simulation is the interface-based coupling simulation method, but it faces a twofold challenge: spatial and temporal desynchronization among the multi-directional coupling simulations of subsystems. A new collaborative simulation method with spatiotemporal synchronization process control is proposed for coupling simulation of a given complex mechatronics system across multiple subsystems on different platforms. The method consists of 1) a coupler-based coupling mechanism to define the interfacing and interaction mechanisms among subsystems, and 2) a simulation process control algorithm to realize the coupling simulation in a spatiotemporally synchronized manner. The test results from a case study show that the proposed method 1) can certainly be used to simulate the subsystems' interactions under different simulation conditions in an engineering system, and 2) effectively supports multi-directional coupling simulation among multi-disciplinary subsystems. This method has been successfully applied in China's high speed train design and development processes, demonstrating that it can be applied to a wide range of engineering system design and simulation tasks with improved efficiency and effectiveness.

  1. Standard CMMI℠ Appraisal Method for Process Improvement (SCAMPI℠), Version 1.1: Method Definition Document

    National Research Council Canada - National Science Library

    2001-01-01

    The Standard CMMI Appraisal Method for Process Improvement (SCAMPI℠) is designed to provide benchmark quality ratings relative to Capability Maturity Model® Integration (CMMI℠) models...

  2. Studies of neutron methods for process control and criticality surveillance of fissile material processing facilities

    International Nuclear Information System (INIS)

    Zoltowski, T.

    1988-01-01

    The development of radiochemical processes for fissile material processing and spent fuel handling needs new control procedures enabling an improvement of plant throughput. This is strictly related to the implementation of a continuous criticality control policy and to developing reliable methods for monitoring the reactivity of radiochemical plant operations in the presence of process perturbations. Neutron methods seem to be applicable for fissile material control in some technological facilities. The measurement of epithermal neutron source multiplication, with heuristic evaluation of the measured data, enables surveillance of anomalous reactivity enhancement leading to unsafe states. 80 refs., 47 figs., 33 tabs. (author)

  3. Total focusing method with correlation processing of antenna array signals

    Science.gov (United States)

    Kozhemyak, O. A.; Bortalevich, S. I.; Loginov, E. L.; Shinyakov, Y. A.; Sukhorukov, M. P.

    2018-03-01

    The article proposes a method of preliminary correlation processing of a complete set of antenna array signals used in the image reconstruction algorithm. The results of experimental studies of 3D reconstruction of various reflectors with and without correlation processing are presented in the article. The software ‘IDealSystem3D’ by IDeal-Technologies was used for the experiments. Copper wires of different diameters located in a water bath were used as reflectors. The use of correlation processing makes it possible to obtain a more accurate reconstruction of the image of the reflectors and to increase the signal-to-noise ratio. The experimental results were processed using an original program, which allows varying the parameters of the antenna array and the sampling frequency.
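
    The abstract does not give the reconstruction algorithm itself; the sketch below shows only the standard delay-and-sum kernel of the total focusing method for full-matrix-capture data, without the correlation preprocessing the article proposes. The array geometry, sound speed, sampling rate and the placeholder data `fmc` are assumptions for illustration.

```python
# Delay-and-sum kernel of the total focusing method (TFM); illustrative only.
import numpy as np


def tfm_image(fmc, elem_x, c, fs, x_grid, z_grid):
    """Delay-and-sum TFM image from full matrix capture data fmc[tx, rx, t]."""
    n_el = len(elem_x)
    t_axis = np.arange(fmc.shape[-1]) / fs
    xx, zz = np.meshgrid(x_grid, z_grid)                      # pixel coordinates
    # Distance from every element to every pixel, shape (n_el, nz, nx).
    dist = np.sqrt((xx[None, :, :] - elem_x[:, None, None]) ** 2 + zz[None, :, :] ** 2)
    image = np.zeros_like(xx)
    for tx in range(n_el):
        for rx in range(n_el):
            delay = (dist[tx] + dist[rx]) / c                 # round-trip time per pixel
            # Sample the (tx, rx) A-scan at the focusing delay by linear interpolation.
            image += np.interp(delay, t_axis, fmc[tx, rx], left=0.0, right=0.0)
    return np.abs(image)


if __name__ == "__main__":
    fs, c = 50e6, 1480.0                          # sampling rate [Hz], sound speed in water [m/s]
    elem_x = np.linspace(-8e-3, 8e-3, 16)         # 16-element linear array [m]
    rng = np.random.default_rng(1)
    fmc = 1e-3 * rng.standard_normal((16, 16, 2000))   # placeholder noise instead of real data
    img = tfm_image(fmc, elem_x, c, fs,
                    x_grid=np.linspace(-10e-3, 10e-3, 81),
                    z_grid=np.linspace(2e-3, 30e-3, 113))
    print("image shape:", img.shape)
```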

  4. Methods for Dissecting Motivation and Related Psychological Processes in Rodents.

    Science.gov (United States)

    Ward, Ryan D

    2016-01-01

    Motivational impairments are increasingly recognized as being critical to functional deficits and decreased quality of life in patients diagnosed with psychiatric disease. Accordingly, much preclinical research has focused on identifying psychological and neurobiological processes which underlie motivation. Inferring motivation from changes in overt behavioural responding in animal models, however, is complicated, and care must be taken to ensure that the observed change is accurately characterized as a change in motivation, and not due to some other, task-related process. This chapter discusses current methods for assessing motivation and related psychological processes in rodents. Using an example from work characterizing the motivational impairments in an animal model of the negative symptoms of schizophrenia, we highlight the importance of careful and rigorous experimental dissection of motivation and the related psychological processes when characterizing motivational deficits in rodent models. We suggest that such work is critical to the successful translation of preclinical findings to therapeutic benefits for patients.

  5. Changing perspective on tissue processing - comparison of microwave histoprocessing method with the conventional method

    Directory of Open Access Journals (Sweden)

    G Shrestha

    2015-09-01

    Full Text Available Background: Histopathological examination of tissues requires a sliver of formalin-fixed tissue that has been chemically processed and then stained with Haematoxylin and Eosin. The time-honored conventional method of tissue processing, which requires 12 to 13 hours for completion, is employed at the majority of laboratories but is now seeing the

  6. Processing method and device for radioactive liquid waste

    International Nuclear Information System (INIS)

    Matsuo, Toshiaki; Nishi, Takashi; Matsuda, Masami; Yukita, Atsushi.

    1997-01-01

    When only suspended particulate ingredients are contained as COD components in radioactive washing liquid wastes, the liquid wastes are heated in a first process, for example an adsorption step, to adsorb the suspended particulate ingredients onto activated carbon, and the suspended particulate ingredients are then separated and removed by filtration. When both suspended particulate ingredients and soluble organic ingredients are contained, the suspended particulate ingredients are separated and removed by the first process and the soluble organic ingredients are then removed by another process, or both the suspended particulate ingredients and the soluble organic ingredients are removed by the first process. In the existing method of adding activated carbon and then filtering at normal temperature, the suspended particulate ingredients cover the layer of activated carbon formed on the filter paper or fabric and sometimes cause clogging. According to the method of the present invention, however, since such interference by the suspended particulate ingredients does not occur, the COD components can be separated and removed sufficiently without lowering the liquid waste processing speed. (T.M.)

  7. High-resolution imaging methods in array signal processing

    DEFF Research Database (Denmark)

    Xenaki, Angeliki

    in active sonar signal processing for detection and imaging of submerged oil contamination in sea water from a deep-water oil leak. The submerged oil field is modeled as a fluid medium exhibiting spatial perturbations in the acoustic parameters from their mean ambient values which cause weak scattering...... of the incident acoustic energy. A high-frequency active sonar is selected to insonify the medium and receive the backscattered waves. High-frequency acoustic methods can both overcome the optical opacity of water (unlike methods based on electromagnetic waves) and resolve the small-scale structure...... of the submerged oil field (unlike low-frequency acoustic methods). The study shows that high-frequency acoustic methods are suitable not only for large-scale localization of the oil contamination in the water column but also for statistical characterization of the submerged oil field through inference

  8. Grasping devices and methods in automated production processes

    DEFF Research Database (Denmark)

    Fantoni, Gualtiero; Santochi, Marco; Dini, Gino

    2014-01-01

    In automated production processes grasping devices and methods play a crucial role in the handling of many parts, components and products. This keynote paper starts with a classification of grasping phases, describes how different principles are adopted at different scales in different applications...... (from assembly to disassembly, from aerospace to food industry, from textile to logistics) are discussed. Finally, the most recent research is reviewed in order to introduce the new trends in grasping. They provide an outlook on the future of both grippers and robotic hands in automated production processes.

  9. Method of optimization of the natural gas refining process

    Energy Technology Data Exchange (ETDEWEB)

    Sadykh-Zade, E.S.; Bagirov, A.A.; Mardakhayev, I.M.; Razamat, M.S.; Tagiyev, V.G.

    1980-01-01

    The SATUM (automatic control system of technical operations) system introduced at the Shatlyk field should assure good quality of gas refining. In order to optimize the natural gas refining process, an experimental-analytical method is used in compiling the mathematical descriptions. The program, compiled in the Fortran language, gives, in addition to the parameters of optimal conditions, information on the yield of concentrate and water, the concentration and consumption of DEG, and the composition and characteristics of the gas and condensate. The algorithm for calculating optimum engineering conditions of gas refining is proposed for use in "advice" mode, and also for monitoring the progress of the gas refining process.

  10. USING ANALYTIC HIERARCHY PROCESS (AHP METHOD IN RURAL DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Tülay Cengiz

    2003-04-01

    Full Text Available Rural development is a body of economic and social policies aimed at improving living conditions in rural areas by enabling the rural population to utilize the economic, social, cultural and technological blessings of city life in place, without migrating. As this description suggests, rural development is a very broad concept. Therefore, in development efforts the problem should be stated clearly and analyzed, and many criteria should be evaluated by experts. The Analytic Hierarchy Process (AHP) method can be utilized at these stages of development efforts. AHP is one of the multi-criteria decision-making methods. After decomposing a problem into smaller pieces, the relative importance and level of importance of two compared elements are determined. It allows the evaluation of qualitative and quantitative factors. At the same time, it permits the use of the ideas of many experts in the decision process. Because of these features, the AHP method can be used in rural development work. In this article, cultural factors, one of the important components of rural development that is often ignored in many studies, were evaluated as an example. As a result of these applications and evaluations, it is concluded that the AHP method can be helpful in rural development efforts.
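
    As a minimal illustration of the core AHP calculation mentioned above, the sketch below derives criterion weights from a pairwise comparison matrix via its principal eigenvector and computes Saaty's consistency ratio; the three criteria and their comparison values are made-up examples, not taken from the article.

```python
# Core AHP step: weights from a reciprocal pairwise comparison matrix (illustrative values).
import numpy as np

# Saaty's random consistency index, indexed by matrix size n.
RANDOM_INDEX = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}


def ahp_weights(pairwise):
    """Return (weights, consistency_ratio) for a reciprocal comparison matrix."""
    a = np.asarray(pairwise, dtype=float)
    n = a.shape[0]
    eigvals, eigvecs = np.linalg.eig(a)
    k = np.argmax(eigvals.real)                      # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                     # normalised weights
    ci = (eigvals[k].real - n) / (n - 1)             # consistency index
    cr = ci / RANDOM_INDEX[n] if RANDOM_INDEX.get(n) else 0.0
    return w, cr


if __name__ == "__main__":
    # Hypothetical comparison of three rural-development criteria
    # (economic, social, cultural) on Saaty's 1-9 scale.
    A = [[1.0, 3.0, 5.0],
         [1 / 3.0, 1.0, 2.0],
         [1 / 5.0, 1 / 2.0, 1.0]]
    w, cr = ahp_weights(A)
    # A consistency ratio below about 0.1 is usually considered acceptable.
    print("weights:", np.round(w, 3), "consistency ratio:", round(cr, 3))
```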

  11. Nuclear pulse signal processing techniques based on blind deconvolution method

    International Nuclear Information System (INIS)

    Hong Pengfei; Yang Lei; Qi Zhong; Meng Xiangting; Fu Yanyan; Li Dongcang

    2012-01-01

    This article presents a method for the measurement and analysis of nuclear pulse signals, in which an FPGA controls a high-speed ADC that samples the nuclear radiation signals and sets the USB controller to Slave FIFO mode for high-speed transmission; LabVIEW is used for online data processing and display, and the blind deconvolution method is used to remove pile-up from the acquired signal and to restore the nuclear pulse signal. Real-time measurements at the full transmission speed demonstrate the advantages of the approach. (authors)

  12. Nuclear pulse signal processing technique based on blind deconvolution method

    International Nuclear Information System (INIS)

    Hong Pengfei; Yang Lei; Fu Tingyan; Qi Zhong; Li Dongcang; Ren Zhongguo

    2012-01-01

    In this paper, we present a method for the measurement and analysis of nuclear pulse signals, with which pile-up is removed, the signal baseline is restored, and the original signal is obtained. The data acquisition system includes an FPGA, an ADC and USB. The FPGA controls the high-speed ADC to sample the nuclear radiation signal, and the USB controller works in Slave FIFO mode to implement high-speed transmission. Using LabVIEW, online data processing with the blind deconvolution algorithm and data display are accomplished. The simulation and experimental results demonstrate the advantages of the method. (authors)
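
    Neither abstract spells out the deconvolution algorithm, so the following is only a simplified, non-blind illustration of the underlying idea: if each detector pulse is modeled as an exponential decay with a known time constant, pile-up can be undone with the exact inverse filter of that response; a blind variant would estimate the decay constant from the recorded waveform itself. All signal parameters below are invented for the demonstration.

```python
# Non-blind deconvolution of piled-up exponential pulses (simplified illustration).
import numpy as np


def synth_piled_up(n_samples, arrival_times, amplitudes, tau, noise=0.0, seed=0):
    """Build a waveform of overlapping exponential pulses (pile-up)."""
    rng = np.random.default_rng(seed)
    impulses = np.zeros(n_samples)
    for t, a in zip(arrival_times, amplitudes):
        impulses[t] += a
    response = np.exp(-np.arange(n_samples) / tau)
    waveform = np.convolve(impulses, response)[:n_samples]
    return waveform + noise * rng.standard_normal(n_samples), impulses


def deconvolve_exponential(waveform, tau):
    """Inverse filter for a single-exponential detector response:
    x[n] = y[n] - exp(-1/tau) * y[n-1] recovers the impulse amplitudes."""
    restored = waveform.copy()
    restored[1:] -= np.exp(-1.0 / tau) * waveform[:-1]
    return restored


if __name__ == "__main__":
    tau = 25.0                                   # decay constant in samples (assumed known)
    y, _ = synth_piled_up(400, [50, 70, 210], [1.0, 0.8, 0.5], tau, noise=0.01)
    x = deconvolve_exponential(y, tau)
    # The restored trace shows three separated spikes near samples 50, 70 and 210.
    print("recovered amplitudes:", np.round(x[[50, 70, 210]], 2))
```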

  13. Method of electrolytic processing for radioactive liquid waste

    International Nuclear Information System (INIS)

    Otsuka, Katsuyuki; Takahashi, Yoshiharu; Tamai, Hideaki.

    1989-01-01

    Radioactive liquid wastes containing sodium compounds are electrolyzed using mercury as a cathode. As a result, they are separated into a sodium-containing metal amalgam and residues. Metals containing sodium are separated from the amalgam, purified and re-utilized, while the mercury is recycled to the electrolysis vessel. The foregoing method provides advantageous effects such as: (1) the volume of the wastes to be processed can be reduced; (2) since processing can be carried out at a relatively low temperature, low-boiling elements can be handled with no evaporation; (3) useful elements can be recovered; and (4) since methods other than glass solidification can easily be employed, remarkable volume reduction of the solidification products can be expected. (K.M.)

  14. A qualitative diagnosis method for a continuous process monitor system

    International Nuclear Information System (INIS)

    Lucas, B.; Evrard, J.M.; Lorre, J.P.

    1993-01-01

    SEXTANT, an expert system for the analysis of transients, was built initially to study physical transients in nuclear reactors. It combines several knowledge bases concerning measurements, models and the qualitative behavior of the plant with a generate-and-test mechanism and a set of numerical models of the physical process. The integration in SEXTANT of an improved diagnosis method using a mixed model, in order to take into account the existence and reliability of only a small number of sensors, the knowledge of failures and the possibility of unanticipated failures, is presented. This diagnosis method is based on two complementary qualitative models of the process and a methodology to build these models from a system description. 8 figs., 17 refs

  15. Process synthesis, design and analysis using a process-group contribution method

    DEFF Research Database (Denmark)

    Kumar Tula, Anjan; Eden, Mario R.; Gani, Rafiqul

    2015-01-01

    This paper describes the development and application of a process-group contribution method to model, simulate and synthesize chemical processes. Process flowsheets are generated in the same way as atoms or groups of atoms are combined to form molecules in computer aided molecular design (CAMD) techniques. The fundamental pillars of this framework are the definition and use of functional process-groups (building blocks) representing a wide range of process operations, flowsheet connectivity rules to join the process-groups to generate all the feasible flowsheet alternatives and flowsheet property...... models like energy consumption, atom efficiency, environmental impact to evaluate the performance of the generated alternatives. In this way, a list of feasible flowsheets is quickly generated, screened and selected for further analysis. Since the flowsheet is synthesized and the operations......

  16. Method and equipment of processing radioactive laundry wastes

    International Nuclear Information System (INIS)

    Shirai, Takamori; Suzuki, Takeo; Tabata, Masayuki; Takada, Takao; Yamaguchi, Shin-ichi; Noda, Tetsuya.

    1985-01-01

    Purpose: To effectively process radioactive laundry wastes generated by water-washing after dry-cleaning of protective clothing worn in nuclear facilities. Method: Dry-cleaning soaps and ionic radioactive materials contained in the radioactive laundry wastes are selectively adsorbed onto adsorbents for decontamination. The adsorbents that have adsorbed the dry-cleaning soaps and ionic radioactive materials are then removed together with these materials and purified, and the purified adsorbents are re-used. (Seki, T.)

  17. Sequential optimization and reliability assessment method for metal forming processes

    International Nuclear Information System (INIS)

    Sahai, Atul; Schramm, Uwe; Buranathiti, Thaweepat; Chen Wei; Cao Jian; Xia, Cedric Z.

    2004-01-01

    Uncertainty is inevitable in any design process. The uncertainty could be due to variations in the geometry of the part or the material properties, or due to a lack of knowledge about the phenomena being modeled itself. Deterministic design optimization does not take uncertainty into account, and worst-case-scenario assumptions lead to vastly over-conservative designs. Probabilistic design, such as reliability-based design and robust design, offers tools for making robust and reliable decisions under the presence of uncertainty in the design process. Probabilistic design optimization often involves a double-loop procedure for optimization and iterative probabilistic assessment, which results in high computational demand. The computational demand can be reduced by replacing computationally intensive simulation models with less costly surrogate models and by employing the sequential optimization and reliability assessment (SORA) method. The SORA method uses a single-loop strategy with a series of cycles of deterministic optimization and reliability assessment, which are decoupled in each cycle. This leads to quick improvement of the design from one cycle to the next and an increase in computational efficiency. This paper demonstrates the effectiveness of the SORA method when applied to designing a sheet metal flanging process. Surrogate models are used as less costly approximations to the computationally expensive finite element simulations.

  18. Dental ceramics: a review of new materials and processing methods.

    Science.gov (United States)

    Silva, Lucas Hian da; Lima, Erick de; Miranda, Ranulfo Benedito de Paula; Favero, Stéphanie Soares; Lohbauer, Ulrich; Cesar, Paulo Francisco

    2017-08-28

    The evolution of computerized systems for the production of dental restorations associated to the development of novel microstructures for ceramic materials has caused an important change in the clinical workflow for dentists and technicians, as well as in the treatment options offered to patients. New microstructures have also been developed by the industry in order to offer ceramic and composite materials with optimized properties, i.e., good mechanical properties, appropriate wear behavior and acceptable aesthetic characteristics. The objective of this literature review is to discuss the main advantages and disadvantages of the new ceramic systems and processing methods. The manuscript is divided in five parts: I) monolithic zirconia restorations; II) multilayered dental prostheses; III) new glass-ceramics; IV) polymer infiltrated ceramics; and V) novel processing technologies. Dental ceramics and processing technologies have evolved significantly in the past ten years, with most of the evolution being related to new microstructures and CAD-CAM methods. In addition, a trend towards the use of monolithic restorations has changed the way clinicians produce all-ceramic dental prostheses, since the more aesthetic multilayered restorations unfortunately are more prone to chipping or delamination. Composite materials processed via CAD-CAM have become an interesting option, as they have intermediate properties between ceramics and polymers and are more easily milled and polished.

  19. Dental ceramics: a review of new materials and processing methods

    Directory of Open Access Journals (Sweden)

    Lucas Hian da SILVA

    2017-08-01

    Full Text Available Abstract The evolution of computerized systems for the production of dental restorations associated to the development of novel microstructures for ceramic materials has caused an important change in the clinical workflow for dentists and technicians, as well as in the treatment options offered to patients. New microstructures have also been developed by the industry in order to offer ceramic and composite materials with optimized properties, i.e., good mechanical properties, appropriate wear behavior and acceptable aesthetic characteristics. The objective of this literature review is to discuss the main advantages and disadvantages of the new ceramic systems and processing methods. The manuscript is divided in five parts: I) monolithic zirconia restorations; II) multilayered dental prostheses; III) new glass-ceramics; IV) polymer infiltrated ceramics; and V) novel processing technologies. Dental ceramics and processing technologies have evolved significantly in the past ten years, with most of the evolution being related to new microstructures and CAD-CAM methods. In addition, a trend towards the use of monolithic restorations has changed the way clinicians produce all-ceramic dental prostheses, since the more aesthetic multilayered restorations unfortunately are more prone to chipping or delamination. Composite materials processed via CAD-CAM have become an interesting option, as they have intermediate properties between ceramics and polymers and are more easily milled and polished.

  20. Non-filtration method of processing uranium ores

    International Nuclear Information System (INIS)

    Laskorin, B.N.; Vodolazov, L.I.; Tokarev, N.N.; Vyalkov, V.I.; Goldobina, V.A.; Gosudarstvennyj Komitet po Ispol'zovaniyu Atomnoj Ehnergii SSSR, Moscow)

    1977-01-01

    The development of the non-filtration sorption method has led to procedures of sorption leaching and extraction desorption, which have made it possible to intensify the processing of uranium ores and to improve greatly the technical and economic indexes by eliminating the complex method of multiple filtration and re-pulping of cakes. This method makes it possible to involve poorer uranium raw materials, at the same time extracting valuable components such as molybdenum, vanadium, copper, etc. Considerable industrial experience has been acquired in the sorption of dense pulp with a solid-to-liquid phase ratio of 1:1. This has led to a plant production increase of 1.5-3.0 times, an increase of uranium extraction by 5-10%, a two- to three-fold increase in the labour capacity of the main workers, and to a several-fold decrease in reagents, auxiliary materials, electric energy and vapour. This non-filtration method is a continuous process in all its phases thanks to the use of high-yield and high-power equipment for high-density pulps. (author)

  1. Survey: interpolation methods for whole slide image processing.

    Science.gov (United States)

    Roszkowiak, L; Korzynska, A; Zak, J; Pijanowska, D; Swiderska-Chadaj, Z; Markiewicz, T

    2017-02-01

    Evaluating whole slide images of histological and cytological samples is used in pathology for diagnostics, grading and prognosis. It is often necessary to rescale whole slide images of a very large size. Image resizing is one of the most common applications of interpolation. We collect the advantages and drawbacks of nine interpolation methods, and as a result of our analysis, we try to select one interpolation method as the preferred solution. To compare the performance of interpolation methods, test images were scaled and then rescaled to the original size using the same algorithm. The modified image was compared to the original image in various aspects. The time needed for calculations and results of quantification performance on modified images were also compared. For evaluation purposes, we used four general test images and 12 specialized biological immunohistochemically stained tissue sample images. The purpose of this survey is to determine which method of interpolation is the best to resize whole slide images, so they can be further processed using quantification methods. As a result, the interpolation method has to be selected depending on the task involving whole slide images. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.

  2. Key technologies of drilling process with raise boring method

    Directory of Open Access Journals (Sweden)

    Zhiqiang Liu

    2015-08-01

    Full Text Available This study presents the concept of shaft constructed by raise boring in underground mines, and the idea of inverse construction can be extended to other fields of underground engineering. The conventional raise boring methods, such as the wood support method, the hanging cage method, the creeping cage method, and the deep-hole blasting method, are analyzed and compared. In addition, the raise boring machines are classified into different types and the characteristics of each type are described. The components of a raise boring machine including the drill rig, the drill string and the auxiliary system are also presented. Based on the analysis of the raise boring method, the rock mechanics problems during the raise boring process are put forward, including rock fragmentation, removal of cuttings, shaft wall stability, and borehole deviation control. Finally, the development trends of raise boring technology are described as follows: (i) improvement of rock-breaking modes to raise drilling efficiency, (ii) development of an intelligent control technique, and (iii) development of technology and equipment for nonlinear raise boring.

  3. Quantitative Sociodynamics Stochastic Methods and Models of Social Interaction Processes

    CERN Document Server

    Helbing, Dirk

    2010-01-01

    This new edition of Quantitative Sociodynamics presents a general strategy for interdisciplinary model building and its application to a quantitative description of behavioral changes based on social interaction processes. Originally, the crucial methods for the modeling of complex systems (stochastic methods and nonlinear dynamics) were developed in physics and mathematics, but they have very often proven their explanatory power in chemistry, biology, economics and the social sciences as well. Quantitative Sociodynamics provides a unified and comprehensive overview of the different stochastic methods, their interrelations and properties. In addition, it introduces important concepts from nonlinear dynamics (e.g. synergetics, chaos theory). The applicability of these fascinating concepts to social phenomena is carefully discussed. By incorporating decision-theoretical approaches, a fundamental dynamic model is obtained, which opens new perspectives in the social sciences. It includes many established models a...

  4. Regularization of the double period method for experimental data processing

    Science.gov (United States)

    Belov, A. A.; Kalitkin, N. N.

    2017-11-01

    In physical and technical applications, an important task is to process experimental curves measured with large errors. Such problems are solved by applying regularization methods, in which success depends on the mathematician's intuition. We propose an approximation based on the double period method developed for smooth nonperiodic functions. Tikhonov's stabilizer with a squared second derivative is used for regularization. As a result, the spurious oscillations are suppressed and the shape of an experimental curve is accurately represented. This approach offers a universal strategy for solving a broad class of problems. The method is illustrated by approximating cross sections of nuclear reactions important for controlled thermonuclear fusion. Tables recommended as reference data are obtained. These results are used to calculate the reaction rates, which are approximated in a way convenient for gasdynamic codes. These approximations are superior to previously known formulas in the covered temperature range and accuracy.
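
    The double period construction itself is not reproduced here; the sketch below shows only the generic Tikhonov step the abstract refers to, smoothing noisy samples by minimizing ||f - y||^2 + λ||D₂f||^2 with a discrete second-derivative stabilizer D₂. The test curve and the value of λ are arbitrary assumptions.

```python
# Tikhonov smoothing with a squared-second-derivative stabilizer (generic sketch).
import numpy as np


def second_difference_matrix(n):
    """(n-2) x n matrix applying the discrete second derivative."""
    d2 = np.zeros((n - 2, n))
    for i in range(n - 2):
        d2[i, i:i + 3] = [1.0, -2.0, 1.0]
    return d2


def tikhonov_smooth(y, lam):
    """Solve (I + lam * D2^T D2) f = y, the normal equation of the regularized fit."""
    n = len(y)
    d2 = second_difference_matrix(n)
    return np.linalg.solve(np.eye(n) + lam * d2.T @ d2, y)


if __name__ == "__main__":
    rng = np.random.default_rng(3)
    x = np.linspace(0.0, 1.0, 200)
    clean = np.exp(-((x - 0.4) / 0.15) ** 2)            # smooth cross-section-like curve
    noisy = clean + 0.1 * rng.standard_normal(x.size)   # large measurement errors
    smooth = tikhonov_smooth(noisy, lam=50.0)
    print("rms error, noisy :", np.sqrt(np.mean((noisy - clean) ** 2)).round(4))
    print("rms error, smooth:", np.sqrt(np.mean((smooth - clean) ** 2)).round(4))
```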

  5. Quantitative sociodynamics stochastic methods and models of social interaction processes

    CERN Document Server

    Helbing, Dirk

    1995-01-01

    Quantitative Sociodynamics presents a general strategy for interdisciplinary model building and its application to a quantitative description of behavioural changes based on social interaction processes. Originally, the crucial methods for the modeling of complex systems (stochastic methods and nonlinear dynamics) were developed in physics but they have very often proved their explanatory power in chemistry, biology, economics and the social sciences. Quantitative Sociodynamics provides a unified and comprehensive overview of the different stochastic methods, their interrelations and properties. In addition, it introduces the most important concepts from nonlinear dynamics (synergetics, chaos theory). The applicability of these fascinating concepts to social phenomena is carefully discussed. By incorporating decision-theoretical approaches a very fundamental dynamic model is obtained which seems to open new perspectives in the social sciences. It includes many established models as special cases, e.g. the log...

  6. A method for manufacturing a tool part for an injection molding process, a hot embossing process, a nano-imprint process, or an extrusion process

    DEFF Research Database (Denmark)

    2013-01-01

    The present invention relates to a method for manufacturing a tool part for an injection molding process, a hot embossing process, nano-imprint process or an extrusion process. First, there is provided a master structure (10) with a surface area comprising nanometre-sized protrusions (11...

  7. Measurement of company effectiveness using analytic network process method

    Directory of Open Access Journals (Sweden)

    Goran Janjić

    2017-07-01

    Full Text Available The sustainable development of an organisation is monitored through the organisation’s performance, which beforehand incorporates all stakeholders’ requirements in its strategy. The strategic management concept enables organisations to monitor and evaluate their effectiveness along with efficiency by monitoring of the implementation of set strategic goals. In the process of monitoring and measuring effectiveness, an organisation can use multiple-criteria decision-making methods as help. This study uses the method of analytic network process (ANP to define the weight factors of the mutual influences of all the important elements of an organisation’s strategy. The calculation of an organisation’s effectiveness is based on the weight factors and the degree of fulfilment of the goal values of the strategic map measures. New business conditions influence the changes in the importance of certain elements of an organisation’s business in relation to competitive advantage on the market, and on the market, increasing emphasis is given to non-material resources in the process of selection of the organisation’s most important measures.

  8. Measurement of company effectiveness using analytic network process method

    Science.gov (United States)

    Goran, Janjić; Zorana, Tanasić; Borut, Kosec

    2017-07-01

    The sustainable development of an organisation is monitored through the organisation's performance, which beforehand incorporates all stakeholders' requirements in its strategy. The strategic management concept enables organisations to monitor and evaluate their effectiveness along with efficiency by monitoring of the implementation of set strategic goals. In the process of monitoring and measuring effectiveness, an organisation can use multiple-criteria decision-making methods as help. This study uses the method of analytic network process (ANP) to define the weight factors of the mutual influences of all the important elements of an organisation's strategy. The calculation of an organisation's effectiveness is based on the weight factors and the degree of fulfilment of the goal values of the strategic map measures. New business conditions influence the changes in the importance of certain elements of an organisation's business in relation to competitive advantage on the market, and on the market, increasing emphasis is given to non-material resources in the process of selection of the organisation's most important measures.

  9. Field Methods for the Study of Slope and Fluvial Processes

    Science.gov (United States)

    Leopold, Luna Bergere; Leopold, Luna Bergere

    1967-01-01

    In Belgium during the summer of 1966 the Commission on Slopes and the Commission on Applied Geomorphology of the International Geographical Union sponsored a joint symposium, with field excursions, and meetings of the two commissions. As a result of the conference and associated discussions, the participants expressed the view that it would be a contribution to scientific work relating to the subject area if the Commission on Applied Geomorphology could prepare a small manual describing the methods of field investigation being used by research scientists throughout the world in the study of various aspects of slope development and fluvial processes. The Commission then assumed this responsibility and asked as many persons as were known to be working on this subject to contribute whatever they wished in the way of descriptions of methods being employed. The purpose of the present manual is to show the variety of study methods now in use, to describe from the experience gained the limitations and advantages of different techniques, and to give pertinent detail which might be useful to other investigators. Some details that would be useful to know are not included in scientific publications, but in a manual on methods the details of how best to use a method have a place. Various persons have learned certain things which cannot be done, as well as some methods that are successful. It is our hope that comparison of methods tried will give the reader suggestions as to how a particular method might best be applied to his own circumstance. The manual does not purport to include methods used by all workers. In particular, it does not interfere with a more systematic treatment of the subject (1) or with various papers already published in the present journal. In fact we are sure that there are pertinent research methods that we do not know of and the Commission would be glad to receive additions and other ideas from those who find they have something to contribute. Also, the

  10. Catalytic arylation methods from the academic lab to industrial processes

    CERN Document Server

    Burke, Anthony J

    2014-01-01

    This "hands-on" approach to the topic of arylation consolidates the body of key research over the last ten years (and up to around 2014) on various catalytic methods which involve an arylation process. Clearly structured, the chapters in this one-stop resource are arranged according to the reaction type, and focus on novel, efficient and sustainable processes, rather than the well-known and established cross-coupling methods. The entire contents are written by two authors with academic and industrial expertise to ensure consistent coverage of the latest developments in the field, as well as industrial applications, such as C-H activation, iron and gold-catalyzed coupling reactions, cycloadditions or novel methodologies using arylboron reagents. A cross-section of relevant tried-and-tested experimental protocols is included at the end of each chapter for putting into immediate practice, along with patent literature. Due to its emphasis on efficient, "green" methods and industrial applications of the products c...

  11. Systematic Development of Miniaturized (Bio)Processes using Process Systems Engineering (PSE) Methods and Tools

    DEFF Research Database (Denmark)

    Krühne, Ulrich; Larsson, Hilde; Heintz, Søren

    2014-01-01

    The focus of this work is on process systems engineering (PSE) methods and tools, and especially on how such PSE methods and tools can be used to accelerate and support systematic bioprocess development at a miniature scale. After a short presentation of the PSE methods and the bioprocess...... development drivers, three case studies are presented. In the first example it is demonstrated how experimental investigations of the bi-enzymatic production of lactobionic acid can be modeled with help of a new mechanistic mathematical model. The reaction was performed at lab scale and the prediction quality...

  12. Estimation methods for process holdup of special nuclear materials

    International Nuclear Information System (INIS)

    Pillay, K.K.S.; Picard, R.R.; Marshall, R.S.

    1984-06-01

    The US Nuclear Regulatory Commission sponsored a research study at the Los Alamos National Laboratory to explore the possibilities of developing statistical estimation methods for materials holdup at highly enriched uranium (HEU)-processing facilities. Attempts at using historical holdup data from processing facilities and selected holdup measurements at two operating facilities confirmed the need for high-quality data and reasonable control over process parameters in developing statistical models for holdup estimations. A major effort was therefore directed at conducting large-scale experiments to demonstrate the value of statistical estimation models from experimentally measured data of good quality. Using data from these experiments, we developed statistical models to estimate residual inventories of uranium in large process equipment and facilities. Some of the important findings of this investigation are the following: prediction models for the residual holdup of special nuclear material (SNM) can be developed from good-quality historical data on holdup; holdup data from several of the equipment used at HEU-processing facilities, such as air filters, ductwork, calciners, dissolvers, pumps, pipes, and pipe fittings, readily lend themselves to statistical modeling of holdup; holdup profiles of process equipment such as glove boxes, precipitators, and rotary drum filters can change with time; therefore, good estimation of residual inventories in these types of equipment requires several measurements at the time of inventory; although measurement of residual holdup of SNM in large facilities is a challenging task, reasonable estimates of the hidden inventories of holdup to meet the regulatory requirements can be accomplished through a combination of good measurements and the use of statistical models. 44 references, 62 figures, 43 tables

  13. Contamination control methods for gases used in the microlithography process

    Science.gov (United States)

    Rabellino, Larry; Applegarth, Chuck; Vergani, Giorgio

    2002-07-01

    Sensitivity to contamination continues to increase as the technology shrinks from 365 nm I-line lamp illumination to 13.4 nm Extreme Ultraviolet laser activated plasma. Gas borne impurities can be readily distributed within the system, remaining both suspended in the gas and attached to critical surfaces. Effects from a variety of contamination, some well characterized and others not, remain a continuing obstacle for stepper manufacturers and users. Impurities like oxygen, moisture and hydrocarbons in parts per billion levels can absorb light, reducing the light intensity and subsequently reducing the consistence of the process. Moisture, sulfur compounds, ammonia, acid compounds and organic compounds such as hydrocarbons can deposit on lens or mirror surfaces affecting image quality. Regular lens replacement or removal for cleaning is a costly option and in-situ cleaning processes must be carefully managed to avoid recontamination of the system. The contamination can come from outside the controlled environment (local gas supply, piping system, & leaks), or from the materials moving into the controlled environment; or contamination may be generated inside the controlled environment as a result of the process itself. The release of amines can occur as a result of the degassing of the photo-resists. For the manufacturer and user of stepper equipment, the challenge is not in predictable contamination, but the variable or unpredictable contamination in the process. One type of unpredictable contamination may be variation in the environmental conditions when producing the nitrogen gas and Clean Dry Air (CDA). Variation in the CDA, nitrogen and xenon may range from parts per billion to parts per million. The risk due to uncontrolled or unmonitored variation in gas quality can be directly related to product defects. Global location can significantly affect the gas quality, due to the ambient air quality (for nitrogen and CDA), production methods, gas handling equipment

  14. Data warehousing methods and processing infrastructure for brain recovery research.

    Science.gov (United States)

    Gee, T; Kenny, S; Price, C J; Seghier, M L; Small, S L; Leff, A P; Pacurar, A; Strother, S C

    2010-09-01

    In order to accelerate translational neuroscience with the goal of improving clinical care it has become important to support rapid accumulation and analysis of large, heterogeneous neuroimaging samples and their metadata from both normal control and patient groups. We propose a multi-centre, multinational approach to accelerate the data mining of large samples and facilitate data-led clinical translation of neuroimaging results in stroke. Such data-driven approaches are likely to have an early impact on clinically relevant brain recovery while we simultaneously pursue the much more challenging model-based approaches that depend on a deep understanding of the complex neural circuitry and physiological processes that support brain function and recovery. We present a brief overview of three (potentially converging) approaches to neuroimaging data warehousing and processing that aim to support these diverse methods for facilitating prediction of cognitive and behavioral recovery after stroke, or other types of brain injury or disease.

  15. Method of volume-reducing processing for radioactive wastes

    International Nuclear Information System (INIS)

    Sato, Koei; Yamauchi, Noriyuki; Hirayama, Toshihiko.

    1985-01-01

    Purpose: To process the products from the treatment of radioactive liquid wastes and burnable solid wastes produced in nuclear facilities into stable solidification products by heat melting. Method: At first, glass fiber wastes from contaminated air filters are charged into a melting furnace. Then, waste products obtained through drying, sintering, incineration, etc. are mixed with a proper amount of glass fibers and charged into the melting furnace. Both of the charged components are heated to a temperature at which the glass fibers melt. The burnable materials are burnt out to provide a highly volume-reduced product. When the products are further heated to a temperature at which the metals or metal oxides with a higher melting point than the glass fibers also melt, the glass fibers and the metals or metal oxides are fused to each other and combined at the molecular level into more stable products. The products are excellent in strength, stability, durability and leaching resistance at ambient temperature. (Kamimura, M.)

  16. Processing method for cleaning water waste from cement kneader

    International Nuclear Information System (INIS)

    Soda, Kenzo; Fujita, Hisao; Nakajima, Tadashi.

    1990-01-01

    The present invention concerns a method of processing cleaning water wastes from a cement kneader when liquid wastes containing radioactive wastes or deleterious materials such as heavy metals are treated by cement solidification. Cleaning water wastes from the kneader are sent to a cleaning water waste tank, in which gentle stirring is applied near the bottom so that the retained sludges do not coagulate. The sludges retained at the bottom of the cleaning water waste tank are sent out after a predetermined time has elapsed and then kneaded with cement. Thus, since the sludges in the cleaning water are solidified with cement, inhomogeneous solidification products of low strength consisting only of cleaning sludges are not formed. The resultant solidification product is homogeneous, and its compression strength reaches a level that satisfies the marine disposal standards required for solidification products of radioactive wastes. (I.N.)

  17. Method of processing radiation-contaminated organic polymer materials

    International Nuclear Information System (INIS)

    Kobayashi, Yoshii.

    1980-01-01

    Purpose: To process radiation-contaminated organic high-polymer materials safely, at low temperature and with no evolution of toxic gases, by a hot-acid immersion process using sulfuric acid and hydrogen peroxide. Method: Less flammable or easily flammable organic polymers contaminated with radioactive substances, particularly with long-lived actinides, are heated and carbonized in concentrated sulfuric acid. Then, an aqueous 30% H2O2 solution is continuously added dropwise as an oxidizing agent until the solution turns colourless. If the carbonization is insufficient, the addition of the H2O2 solution is stopped temporarily and the carbonization is conducted again. Thus, the organic polymers are completely decomposed by wet oxidation. In this way, the volume of the organic materials to be discharged is decreased and the radioactive substances contained are simultaneously concentrated and collected. (Seki, T.)

  18. Signal processing methods for in-situ creep specimen monitoring

    Science.gov (United States)

    Guers, Manton J.; Tittmann, Bernhard R.

    2018-04-01

    Previous work investigated using guided waves for monitoring creep deformation during accelerated life testing. The basic objective was to relate observed changes in the time-of-flight to changes in the environmental temperature and specimen gage length. The work presented in this paper investigated several signal processing strategies for possible application in the in-situ monitoring system. Signal processing methods for both group velocity (wave-packet envelope) and phase velocity (peak tracking) time-of-flight were considered. Although the Analytic Envelope found via the Hilbert transform is commonly applied for group velocity measurements, erratic behavior in the indicated time-of-flight was observed when this technique was applied to the in-situ data. The peak tracking strategies tested had generally linear trends, and tracking local minima in the raw waveform ultimately showed the most consistent results.
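
    To make the two time-of-flight estimates discussed above concrete, the sketch below computes a group arrival from the peak of the Hilbert-transform analytic envelope and a phase arrival by tracking a local extremum of the raw waveform near a reference sample; the synthetic tone burst, sampling rate and window width are illustrative assumptions, not the authors' setup.

```python
# Group (envelope-peak) versus phase (peak-tracking) time-of-flight estimates.
import numpy as np
from scipy.signal import hilbert


def make_toneburst(fs, f0, arrival, n_samples, cycles=5):
    """Gaussian-windowed tone burst arriving at time `arrival` seconds."""
    t = np.arange(n_samples) / fs
    width = cycles / (2.0 * f0)
    return np.exp(-((t - arrival) / width) ** 2) * np.sin(2 * np.pi * f0 * (t - arrival))


def envelope_tof(signal, fs):
    """Group time-of-flight from the peak of the analytic envelope."""
    envelope = np.abs(hilbert(signal))
    return np.argmax(envelope) / fs


def peak_tracking_tof(signal, fs, ref_index, window=40):
    """Phase time-of-flight from the local extremum nearest a reference sample."""
    lo, hi = max(ref_index - window, 0), min(ref_index + window, len(signal))
    return (lo + np.argmin(signal[lo:hi])) / fs      # track a local minimum, as in the text


if __name__ == "__main__":
    fs, f0 = 10e6, 0.5e6                  # 10 MHz sampling, 500 kHz burst (assumed)
    sig = make_toneburst(fs, f0, arrival=40e-6, n_samples=1000)
    t_env = envelope_tof(sig, fs)
    t_pk = peak_tracking_tof(sig, fs, ref_index=int(40e-6 * fs))
    print(f"envelope peak: {t_env * 1e6:.2f} us, tracked minimum: {t_pk * 1e6:.2f} us")
```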

  19. Possibilities of implementing nonthermal processing methods in the dairy industry

    Directory of Open Access Journals (Sweden)

    Irena Jeličić

    2010-06-01

    Full Text Available In the past two decades a lot of research in the field of food science has focused on new, non-thermal processing methods. This article describes the most intensively investigated new processing methods for implementation in the dairy industry, such as microfiltration, high hydrostatic pressure, ultrasound and pulsed electric fields. For each method an overview is given of the principle of microbial inactivation, the results obtained regarding reduction of microorganisms, and the positive and undesirable effects on milk composition and characteristics. The most promising methods for further implementation in the dairy industry appear to be the combination of moderate temperatures with high hydrostatic pressure or pulsed electric fields, and microfiltration, since those treatments did not result in any undesirable changes in the sensory properties of milk. Additionally, milk treatment with these methods resulted in better milk fat homogenization, faster rennet coagulation, shorter duration of milk fermentations, etc. Very good results regarding microbial inactivation were obtained by treating milk with a combination of moderate temperatures and high-intensity ultrasound, also called thermosonication. However, thermosonication treatments often result in undesirable changes in milk sensory properties, most probably due to ultrasound-induced milk fat oxidation. This article also briefly describes the use of natural compounds with antimicrobial effects such as bacteriocins, the lactoperoxidase system and lysozyme. However, their implementation is limited for reasons such as high costs, interaction with other food ingredients, poor solubility, narrow activity spectrum, spontaneous loss of bacteriocinogenicity, etc. In addition, the principles of the antimicrobial effect of microwaves and ultraviolet irradiation are described; however, their implementation in the dairy industry has failed mostly for technical and commercial reasons.

  20. Application of image processing methods to industrial radiography

    International Nuclear Information System (INIS)

    Goutte, R.; Odet, C.; Tuncer, T.; Bodson, F.; Varcin, E.

    1985-01-01

    This study was carried out with the financial support of the Commission of the European Communities as part of the CECA research program comprising IRSID, INSA de Lyon and the Framatome and Creusot Loire companies. Its purpose was to evaluate the possibility of using digital enhancement of radiographic images to improve defect visibility in industrial radiography, thereby providing assistance in defect detection and a method for automatic analysis of radiographs. This paper provides the full results obtained from work on digital processing of radiographs showing real and artificial defects. Furthermore, work on simulated automatic defect detection is also presented. 2 refs
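
    As a generic illustration of the kind of digital enhancement mentioned above (the report's actual algorithms are not specified here), the following Python sketch applies global histogram equalization to an 8-bit radiograph; the array name and bit depth are assumptions.

      # Hedged sketch: global histogram equalization of an 8-bit grayscale radiograph.
      import numpy as np

      def equalize_histogram(image, levels=256):
          """Return a histogram-equalized copy of an integer grayscale image with values in [0, levels)."""
          hist, _ = np.histogram(image.ravel(), bins=levels, range=(0, levels))
          cdf = hist.cumsum()
          cdf_masked = np.ma.masked_equal(cdf, 0)                  # ignore empty gray levels
          cdf_scaled = (cdf_masked - cdf_masked.min()) * (levels - 1) / (cdf_masked.max() - cdf_masked.min())
          lut = np.ma.filled(cdf_scaled, 0).astype(np.uint8)       # gray-level lookup table
          return lut[image]

      # radiograph = np.asarray(..., dtype=np.uint8)   # hypothetical digitized radiograph
      # enhanced = equalize_histogram(radiograph)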

  1. Optimal and adaptive methods of processing hydroacoustic signals (review)

    Science.gov (United States)

    Malyshkin, G. S.; Sidel'nikov, G. B.

    2014-09-01

    Different methods of optimal and adaptive processing of hydroacoustic signals under multipath propagation and scattering are considered. Advantages and drawbacks of the classical adaptive (Capon, MUSIC, and Johnson) algorithms and of "fast" projection algorithms are analyzed for the case of multipath propagation and scattering of strong signals. The classical optimal approaches to detecting multipath signals are presented. A mechanism of controlled normalization of strong signals is proposed to automatically detect weak signals. The results of simulating the operation of different detection algorithms for a linear equidistant array under multipath propagation and scattering are presented. An automatic detector is analyzed, based on classical or fast projection algorithms, which estimates the background using median filtering or the method of bilateral spatial contrast.
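
    To make the classical adaptive processing concrete, the sketch below evaluates the Capon (minimum-variance) spatial spectrum, P(theta) = 1 / (a^H R^-1 a), for a uniform linear array. This is a textbook formulation rather than the authors' detection algorithms; the array geometry, diagonal loading and angle grid are assumptions.

      # Hedged sketch: Capon (MVDR) spatial spectrum for a uniform linear array.
      import numpy as np

      def steering_vector(n_sensors, d_over_lambda, theta):
          """Plane-wave steering vector for a uniform linear array (theta in radians)."""
          k = np.arange(n_sensors)
          return np.exp(-2j * np.pi * d_over_lambda * k * np.sin(theta))

      def capon_spectrum(R, n_sensors, d_over_lambda, angles):
          """Evaluate P(theta) = 1 / (a^H R^-1 a) over a grid of bearing angles."""
          R_inv = np.linalg.inv(R + 1e-6 * np.eye(n_sensors))      # small diagonal loading
          p = []
          for theta in angles:
              a = steering_vector(n_sensors, d_over_lambda, theta)
              p.append(1.0 / np.real(a.conj() @ R_inv @ a))
          return np.array(p)

      # Example usage with a covariance estimated from snapshots X (n_sensors x n_snapshots):
      # R = X @ X.conj().T / X.shape[1]
      # spectrum = capon_spectrum(R, 16, 0.5, np.linspace(-np.pi / 2, np.pi / 2, 361))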

  2. Simulation of ecological processes using response functions method

    International Nuclear Information System (INIS)

    Malkina-Pykh, I.G.; Pykh, Yu. A.

    1998-01-01

    The article describes further development and applications of the well-known response functions method (MRF). The method is used as a basis for the development of mathematical models of a wide range of ecological processes. A model of radioactive contamination of ecosystems is chosen as an example. The mathematical model was elaborated to describe 90Sr dynamics in elementary ecosystems of various geographical zones. The model includes blocks corresponding to the main units of any elementary ecosystem: lower atmosphere, soil, vegetation and surface water. Parameter evaluation was performed on a wide set of experimental data. A set of computer simulations was run on the model to demonstrate its suitability for ecological forecasting
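
    For illustration only, the sketch below integrates a generic linear compartment model (atmosphere, soil, vegetation, surface water) of the kind suggested by the model blocks described above. The structure and transfer coefficients are assumptions made for demonstration; they are not the parameters of the MRF model itself.

      # Hedged sketch: generic four-compartment model of 90Sr transfer (illustrative rates).
      import numpy as np
      from scipy.integrate import solve_ivp

      # Assumed first-order transfer coefficients (1/year); Sr-90 half-life is about 28.8 years.
      k_dep, k_uptake, k_runoff = 2.0, 0.1, 0.05
      k_decay = np.log(2) / 28.8

      def sr90_dynamics(t, y):
          atm, soil, veg, water = y
          d_atm = -(k_dep + k_decay) * atm
          d_soil = k_dep * atm - (k_uptake + k_runoff + k_decay) * soil
          d_veg = k_uptake * soil - k_decay * veg
          d_water = k_runoff * soil - k_decay * water
          return [d_atm, d_soil, d_veg, d_water]

      # Unit initial deposition in the lower atmosphere, integrated over 50 years.
      sol = solve_ivp(sr90_dynamics, (0, 50), [1.0, 0.0, 0.0, 0.0], dense_output=True)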

  3. Developing an Engineering Design Process Assessment using Mixed Methods.

    Science.gov (United States)

    Wind, Stefanie A; Alemdar, Meltem; Lingle, Jeremy A; Gale, Jessica D; Moore, Roxanne A

    Recent reforms in science education worldwide include an emphasis on engineering design as a key component of student proficiency in the Science, Technology, Engineering, and Mathematics disciplines. However, relatively little attention has been directed to the development of psychometrically sound assessments for engineering. This study demonstrates the use of mixed methods to guide the development and revision of K-12 Engineering Design Process (EDP) assessment items. Using results from a middle-school EDP assessment, this study illustrates the combination of quantitative and qualitative techniques to inform item development and revisions. Overall conclusions suggest that the combination of quantitative and qualitative evidence provides an in-depth picture of item quality that can be used to inform the revision and development of EDP assessment items. Researchers and practitioners can use the methods illustrated here to gather validity evidence to support the interpretation and use of new and existing assessments.

  4. A Mixed-Methods Research Framework for Healthcare Process Improvement.

    Science.gov (United States)

    Bastian, Nathaniel D; Munoz, David; Ventura, Marta

    2016-01-01

    The healthcare system in the United States is spiraling out of control due to ever-increasing costs without significant improvements in quality, access to care, satisfaction, and efficiency. Efficient workflow is paramount to improving healthcare value while maintaining the utmost standards of patient care and provider satisfaction in high stress environments. This article provides healthcare managers and quality engineers with a practical healthcare process improvement framework to assess, measure and improve clinical workflow processes. The proposed mixed-methods research framework integrates qualitative and quantitative tools to foster the improvement of processes and workflow in a systematic way. The framework consists of three distinct phases: 1) stakeholder analysis, 2a) survey design, 2b) time-motion study, and 3) process improvement. The proposed framework is applied to the pediatric intensive care unit of the Penn State Hershey Children's Hospital. The implementation of this methodology led to identification and categorization of different workflow tasks and activities into both value-added and non-value added in an effort to provide more valuable and higher quality patient care. Based upon the lessons learned from the case study, the three-phase methodology provides a better, broader, leaner, and holistic assessment of clinical workflow. The proposed framework can be implemented in various healthcare settings to support continuous improvement efforts in which complexity is a daily element that impacts workflow. We proffer a general methodology for process improvement in a healthcare setting, providing decision makers and stakeholders with a useful framework to help their organizations improve efficiency. Published by Elsevier Inc.

  5. Housing decision making methods for initiation development phase process

    Science.gov (United States)

    Zainal, Rozlin; Kasim, Narimah; Sarpin, Norliana; Wee, Seow Ta; Shamsudin, Zarina

    2017-10-01

    Late delivery and sick housing project problems have been attributed to poor decision making. These problems stem from housing developers who prefer to create their own approaches based on their experience and expertise, taking the simplest route of just applying the available standards and rules in decision making. This paper seeks to identify the decision-making methods used for housing development at the initiation phase in Malaysia. The research employed the Delphi method with a questionnaire survey, involving 50 developers as the sample in the primary stage of data collection. However, only 34 developers contributed to the second stage of the information gathering process, and only 12 developers remained for the final data collection stage. The findings affirm that Malaysian developers prefer to make their investment decisions based on simple interpolation of historical data, using simple statistical or mathematical techniques to produce the required reports. It appears that they skip several important decision-making functions at the primary development stage. These shortcomings were mainly due to time and financial constraints and the lack of statistical or mathematical expertise among the professional and management groups in the developer organisations.

  6. Methods of gated-blood-pool-spect data processing

    International Nuclear Information System (INIS)

    Kosa, I.; Mester, J.; Tanaka, M.; Csernay, L.; Mate, E.; Szasz, K.

    1991-01-01

    Three techniques of gated SPECT were evaluated. The methods of integral SPECT (ISPECT), end-diastole/end-systole SPECT (ED-ES SPECT) and Fourier SPECT were adapted and developed on the Hungarian nuclear medicine data processing system microSEGAMS. The methods are based on data reduction before back projection, which results in processing times acceptable for clinical routine. The clinical performance of the introduced techniques was tested in 10 patients with old posterior myocardial infarction and in 5 patients without cardiac disease. The left ventricular ejection fraction (EF) determined by ISPECT correlated well with the planar values; the correlation coefficient was 0.89. The correlation coefficient between EF values determined by ED-ES SPECT and planar radionuclide ventriculography was lower (0.70). For the identification of left ventricular wall motion abnormalities, ED-ES SPECT and Fourier SPECT exhibited favourable performance, but ISPECT only moderate suitability. In the detection of regional phase delay, Fourier SPECT demonstrated higher sensitivity than planar radionuclide ventriculography. (author) 4 refs.; 3 figs.; 2 tabs
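
    As a minimal illustration of the Fourier analysis underlying Fourier-SPECT-style processing (not the microSEGAMS implementation itself), the sketch below extracts the amplitude and phase of the first Fourier harmonic from a gated time-activity curve; the 16-frame curve is synthetic.

      # Hedged sketch: first-harmonic amplitude/phase of a gated time-activity curve.
      import numpy as np

      def first_harmonic(counts):
          """Return (amplitude, phase in degrees) of the first Fourier harmonic of a
          time-activity curve sampled over one cardiac cycle."""
          spectrum = np.fft.rfft(counts)
          amplitude = 2.0 * np.abs(spectrum[1]) / len(counts)
          phase = np.degrees(np.angle(spectrum[1]))
          return amplitude, phase

      # Example: 16 gated frames of counts in one ventricular region (synthetic data)
      frames = np.arange(16)
      curve = 1000 + 200 * np.cos(2 * np.pi * frames / 16 + 1.0)
      print(first_harmonic(curve))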

  7. Apparatus and method for plasma processing of SRF cavities

    Science.gov (United States)

    Upadhyay, J.; Im, Do; Peshl, J.; Bašović, M.; Popović, S.; Valente-Feliciano, A.-M.; Phillips, L.; Vušković, L.

    2016-05-01

    An apparatus and a method are described for plasma etching of the inner surface of superconducting radio frequency (SRF) cavities. Accelerator SRF cavities are formed into a variable-diameter cylindrical structure made of bulk niobium, for resonant generation of the particle accelerating field. The etch rate non-uniformity due to depletion of the radicals has been overcome by the simultaneous movement of the gas flow inlet and the inner electrode. An effective shape of the inner electrode to reduce the plasma asymmetry for the coaxial cylindrical rf plasma reactor is determined and implemented in the cavity processing method. The processing was accomplished by moving axially the inner electrode and the gas flow inlet in a step-wise way to establish segmented plasma columns. The test structure was a pillbox cavity made of steel of similar dimension to the standard SRF cavity. This was adopted to experimentally verify the plasma surface reaction on cylindrical structures with variable diameter using the segmented plasma generation approach. The pill box cavity is filled with niobium ring- and disk-type samples and the etch rate of these samples was measured.

  8. Sample processing method for the determination of perchlorate in milk

    International Nuclear Information System (INIS)

    Dyke, Jason V.; Kirk, Andrea B.; Kalyani Martinelango, P.; Dasgupta, Purnendu K.

    2006-01-01

    In recent years, many different water sources and foods have been reported to contain perchlorate. Studies indicate that significant levels of perchlorate are present in both human and dairy milk. The determination of perchlorate in milk is particularly important due to its potential health impact on infants and children. As for many other biological samples, sample preparation is more time consuming than the analysis itself. The concurrent presence of large amounts of fats, proteins, carbohydrates, etc., demands some initial cleanup; otherwise the separation column lifetime and the limit of detection are both greatly compromised. Reported milk processing methods require the addition of chemicals such as ethanol, acetic acid or acetonitrile. Reagent addition is undesirable in trace analysis. We report here an essentially reagent-free sample preparation method for the determination of perchlorate in milk. Milk samples are spiked with isotopically labeled perchlorate and centrifuged to remove lipids. The resulting liquid is placed in a disposable centrifugal ultrafilter device with a molecular weight cutoff of 10 kDa, and centrifuged. Approximately 5-10 ml of clear liquid, ready for analysis, is obtained from a 20 ml milk sample. Both bovine and human milk samples have been successfully processed and analyzed by ion chromatography-mass spectrometry (IC-MS). Standard addition experiments show good recoveries. The repeatability of the analytical result for the same sample in multiple sample cleanup runs ranged from 3 to 6% R.S.D. This processing technique has also been successfully applied for the determination of iodide and thiocyanate in milk

  9. Hybrid numerical methods for multiscale simulations of subsurface biogeochemical processes

    International Nuclear Information System (INIS)

    Scheibe, T D; Tartakovsky, A M; Tartakovsky, D M; Redden, G D; Meakin, P

    2007-01-01

    Many subsurface flow and transport problems of importance today involve coupled non-linear flow, transport, and reaction in media exhibiting complex heterogeneity. In particular, problems involving biological mediation of reactions fall into this class of problems. Recent experimental research has revealed important details about the physical, chemical, and biological mechanisms involved in these processes at a variety of scales ranging from molecular to laboratory scales. However, it has not been practical or possible to translate detailed knowledge at small scales into reliable predictions of field-scale phenomena important for environmental management applications. A large assortment of numerical simulation tools have been developed, each with its own characteristic scale. Important examples include 1. molecular simulations (e.g., molecular dynamics); 2. simulation of microbial processes at the cell level (e.g., cellular automata or particle individual-based models); 3. pore-scale simulations (e.g., lattice-Boltzmann, pore network models, and discrete particle methods such as smoothed particle hydrodynamics); and 4. macroscopic continuum-scale simulations (e.g., traditional partial differential equations solved by finite difference or finite element methods). While many problems can be effectively addressed by one of these models at a single scale, some problems may require explicit integration of models across multiple scales. We are developing a hybrid multi-scale subsurface reactive transport modeling framework that integrates models with diverse representations of physics, chemistry and biology at different scales (sub-pore, pore and continuum). The modeling framework is being designed to take advantage of advanced computational technologies including parallel code components using the Common Component Architecture, parallel solvers, gridding, data and workflow management, and visualization. This paper describes the specific methods/codes being used at each

  10. Method for treating a nuclear process off-gas stream

    Science.gov (United States)

    Pence, Dallas T.; Chou, Chun-Chao

    1984-01-01

    Disclosed is a method for selectively removing and recovering the noble gas and other gaseous components typically emitted during nuclear process operations. The method is adaptable and useful for treating dissolver off-gas effluents released during reprocessing of spent nuclear fuels, so as to permit radioactive contaminant recovery prior to releasing the remaining off-gases to the atmosphere. Briefly, the method sequentially comprises treating the off-gas stream to preliminarily remove NOx, hydrogen and carbon-containing organic compounds, and semivolatile fission product metal oxide components therefrom; adsorbing iodine components on silver-exchanged mordenite; removing water vapor carried by said stream by means of a molecular sieve; selectively removing the carbon dioxide components of said off-gas stream by means of a molecular sieve; selectively removing xenon in gas phase by passing said stream through a molecular sieve comprising silver-exchanged mordenite; selectively separating krypton from oxygen by means of a molecular sieve comprising silver-exchanged mordenite; selectively separating krypton from the bulk nitrogen stream using a molecular sieve comprising silver-exchanged mordenite cooled to about -140 to -160 °C; concentrating the desorbed krypton upon a molecular sieve comprising silver-exchanged mordenite cooled to about -140 to -160 °C; and further cryogenically concentrating, and then recovering for storage, the desorbed krypton.

  11. Energy-saving method for technogenic waste processing.

    Directory of Open Access Journals (Sweden)

    Bayandy Dikhanbaev

    Full Text Available Dumps of the mining-metallurgical complexes of the post-Soviet republics have accumulated a huge amount of technogenic waste products; Kazakhstan alone holds about 20 billion tons. In the field of technogenic waste treatment, there is still no technical solution that makes it a profitable process. Recent global trends have prompted scientists to focus on developing an energy-saving, highly efficient melting unit that can significantly reduce specific fuel consumption. This paper reports the development of a new technological method, the smelt layer of inversion phase. The introduced method is characterized by a combination of ideal-stirring and ideal-displacement regimes. Using the method of affine modelling, the pilot plant's test results were recalculated for an industrial-scale unit. Experiments show that, in comparison with bubbling and boiling layers of smelt, the degree of zinc recovery increases in the layer of inversion phase. This indicates a reduced possibility of new formation of zinc silicates and ferrites from recombined molecules of ZnO, SiO2 and Fe2O3. Calculations show that in the industrial-scale version of the pilot plant, natural gas consumption is reduced approximately twofold in comparison with a fuming furnace, and the specific fuel consumption is reduced approximately fourfold in comparison with a Waelz kiln.

  12. Comparison of tissue processing methods for microvascular visualization in axolotls.

    Science.gov (United States)

    Montoro, Rodrigo; Dickie, Renee

    2017-01-01

    The vascular system, the pipeline for oxygen and nutrient delivery to tissues, is essential for vertebrate development, growth, injury repair, and regeneration. With their capacity to regenerate entire appendages throughout their lifespan, axolotls are an unparalleled model for vertebrate regeneration, but they lack many of the molecular tools that facilitate vascular imaging in other animal models. The determination of vascular metrics requires high quality image data for the discrimination of vessels from background tissue. Quantification of the vasculature using perfused, cleared specimens is well-established in mammalian systems, but has not been widely employed in amphibians. The objective of this study was to optimize tissue preparation methods for the visualization of the microvascular network in axolotls, providing a basis for the quantification of regenerative angiogenesis. To accomplish this aim, we performed intracardiac perfusion of pigment-based contrast agents and evaluated aqueous and non-aqueous clearing techniques. The methods were verified by comparing the quality of the vascular images and the observable vascular density across treatment groups. Simple and inexpensive, these tissue processing techniques will be of use in studies assessing vascular growth and remodeling within the context of regeneration. Advantages of this method include: •Higher contrast of the vasculature within the 3D context of the surrounding tissue •Enhanced detection of microvasculature facilitating vascular quantification •Compatibility with other labeling techniques.

  13. Method for treating a nuclear process off-gas stream

    International Nuclear Information System (INIS)

    Pence, D.T.; Chou, C.C.

    1984-01-01

    Disclosed is a method for selectively removing and recovering the noble gas and other gaseous components typically emitted during nuclear process operations. The method is adaptable and useful for treating dissolver off-gas effluents released during reprocessing of spent nuclear fuels, so as to permit radioactive contaminant recovery prior to releasing the remaining off-gases to the atmosphere. Briefly, the method sequentially comprises treating the off-gas stream to preliminarily remove NOx, hydrogen and carbon-containing organic compounds, and semivolatile fission product metal oxide components therefrom; adsorbing iodine components on silver-exchanged mordenite; removing water vapor carried by said stream by means of a molecular sieve; selectively removing the carbon dioxide components of said off-gas stream by means of a molecular sieve; selectively removing xenon in gas phase by passing said stream through a molecular sieve comprising silver-exchanged mordenite; selectively separating krypton from oxygen by means of a molecular sieve comprising silver-exchanged mordenite; selectively separating krypton from the bulk nitrogen stream using a molecular sieve comprising silver-exchanged mordenite cooled to about -140 to -160 °C; concentrating the desorbed krypton upon a molecular sieve comprising silver-exchanged mordenite cooled to about -140 to -160 °C; and further cryogenically concentrating, and then recovering for storage, the desorbed krypton

  14. A Data Pre-Processing Model for the Topsis Method

    Directory of Open Access Journals (Sweden)

    Kobryń Andrzej

    2016-12-01

    Full Text Available TOPSIS is one of the most popular methods of multi-criteria decision making (MCDM). Its fundamental role is to establish a ranking of the chosen alternatives based on their distances from the ideal and negative-ideal solutions. Three primary versions of the TOPSIS method are distinguished: classical, interval and fuzzy, whose calculation algorithms are adjusted to the character of the input ratings of the decision-making alternatives (real numbers, interval data or fuzzy numbers). Various specialist publications describe the use of particular versions of the TOPSIS method in the decision-making process; the fuzzy version is particularly popular. It should be noted, however, that depending on the character of the accepted criteria, the ratings of the alternatives can be heterogeneous. The present paper suggests a way of proceeding in the situation where the set of criteria includes criteria characteristic of each of the mentioned versions of TOPSIS, as a result of which the rating of the alternatives is vague. The calculation procedure is illustrated by a numerical example.
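
    For reference, the sketch below implements the classical (real-number) TOPSIS ranking that the proposed pre-processing model feeds into; the decision matrix, weights and benefit/cost assignment are illustrative assumptions.

      # Hedged sketch: classical TOPSIS ranking with illustrative data.
      import numpy as np

      def topsis(decision_matrix, weights, benefit_mask):
          """Rank alternatives (rows) against criteria (columns) using classical TOPSIS."""
          X = np.asarray(decision_matrix, dtype=float)
          w = np.asarray(weights, dtype=float) / np.sum(weights)
          V = w * X / np.sqrt((X ** 2).sum(axis=0))                # vector-normalized, weighted matrix
          ideal = np.where(benefit_mask, V.max(axis=0), V.min(axis=0))
          anti_ideal = np.where(benefit_mask, V.min(axis=0), V.max(axis=0))
          d_pos = np.linalg.norm(V - ideal, axis=1)
          d_neg = np.linalg.norm(V - anti_ideal, axis=1)
          return d_neg / (d_pos + d_neg)                           # closeness: higher is better

      # Three alternatives rated on cost, quality, capacity (illustrative numbers)
      scores = topsis([[250, 16, 12], [200, 20, 8], [300, 11, 16]],
                      weights=[0.5, 0.3, 0.2],
                      benefit_mask=np.array([False, True, True]))  # cost, benefit, benefit
      print(np.argsort(-scores))                                   # indices of alternatives, best first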

  15. Analytical models approximating individual processes: a validation method.

    Science.gov (United States)

    Favier, C; Degallier, N; Menkès, C E

    2010-12-01

    Upscaling population models from fine to coarse resolutions, in space, time and/or level of description, allows the derivation of fast and tractable models based on a thorough knowledge of individual processes. The validity of such approximations is generally tested only on a limited range of parameter sets. A more general validation test, over a range of parameters, is proposed; it estimates the error induced by the approximation, using the original model's stochastic variability as a reference. The method is illustrated by three examples taken from the field of epidemics transmitted by vectors that bite in a temporally cyclical pattern, which show how the method can be used: to estimate whether an approximation over- or under-fits the original model; to invalidate an approximation; and to rank possible approximations by quality. As a result, the application of the validation method to this field emphasizes the need to account for the vectors' biology in epidemic prediction models and to validate these against finer-scale models. Copyright © 2010 Elsevier Inc. All rights reserved.

  16. Development of rupture process analysis method for great earthquakes using Direct Solution Method

    Science.gov (United States)

    Yoshimoto, M.; Yamanaka, Y.; Takeuchi, N.

    2010-12-01

    Conventional rupture process analysis methods using teleseismic body waves are based on ray theory. These methods therefore have the following problems when applied to great earthquakes such as the 2004 Sumatra earthquake: (1) difficulty in computing all later phases such as the PP reflection phase, (2) impossibility of computing the so-called “W phase”, the long-period phase arriving before the S wave, and (3) implausibility of the hypothesis that the distance from the observation points to the hypocenter is large compared to the fault length. To solve the above problems, we have developed a new method which uses synthetic seismograms computed by the Direct Solution Method (DSM, e.g. Kawai et al. 2006) as Green’s functions. We used the DSM software (http://www.eri.u-tokyo.ac.jp/takeuchi/software/) to compute the Green’s functions up to 1 Hz for the IASP91 (Kennett and Engdahl, 1991) model, and determined the final slip distributions using the waveform inversion method (Kikuchi et al. 2003). First we confirmed that the Green’s functions computed by DSM are accurate at frequencies up to 1 Hz. Next we applied the new method to the Mw 8.0 (GCMT) Solomon Islands earthquake of April 1, 2007. We found that this earthquake consisted of two asperities and that the rupture propagated across the subducting Sinbo ridge. The obtained slip distribution correlates better with the aftershock distribution than that from the existing method. Furthermore, the new method retains the accuracy of the existing method with respect to the direct P wave and reflection phases near the source, and also accurately calculates later phases such as the PP wave.

  17. Efficient point cloud data processing in shipbuilding: Reformative component extraction method and registration method

    Directory of Open Access Journals (Sweden)

    Jingyu Sun

    2014-07-01

    Full Text Available To survive in the current shipbuilding industry, it is of vital importance for shipyards to have the accuracy of ship components evaluated efficiently during most of the manufacturing steps. Evaluating component accuracy by comparing each component's point cloud data, scanned by laser scanners, with the ship's design data in CAD format cannot be done efficiently when (1) the components to be extracted from the point cloud data contain irregular obstacles, or when (2) the registration of the two data sets has no clear direction setting. This paper presents reformative point cloud data processing methods to solve these problems. K-d tree construction of the point cloud data speeds up the neighbor search for each point. A region growing method applied to the neighbors of a seed point extracts the continuous part of a component, while curved surface fitting and B-spline curve fitting at the edge of the continuous part recognize neighboring domains of the same component that are divided by obstacles' shadows. The ICP (Iterative Closest Point) algorithm performs the registration of the two data sets after the proper registration direction is determined by principal component analysis. In experiments conducted at the shipyard, 200 curved shell plates were extracted from the scanned point cloud data, and registrations were conducted between them and the designed CAD data using the proposed methods for accuracy evaluation. The results show that the proposed methods efficiently support accuracy-evaluation-oriented point cloud data processing in practice.
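
    The sketch below illustrates the first two steps described above, k-d tree neighbour search and a simple region-growing pass, using SciPy's cKDTree. The growth radius, seed choice and stopping condition are assumptions, and the surface-fitting and ICP registration stages are not reproduced here.

      # Hedged sketch: k-d tree neighbour search plus Euclidean region growing on a point cloud.
      import numpy as np
      from scipy.spatial import cKDTree

      def region_grow(points, seed_index, radius=0.01, max_points=None):
          """Collect the indices of points connected to the seed by hops shorter than `radius`."""
          tree = cKDTree(points)
          visited = {seed_index}
          frontier = [seed_index]
          while frontier:
              idx = frontier.pop()
              for neighbour in tree.query_ball_point(points[idx], r=radius):
                  if neighbour not in visited:
                      visited.add(neighbour)
                      frontier.append(neighbour)
              if max_points and len(visited) >= max_points:
                  break
          return np.fromiter(visited, dtype=int)

      # cloud = np.loadtxt("scanned_plate.xyz")   # hypothetical N x 3 laser scan
      # component = cloud[region_grow(cloud, seed_index=0, radius=0.005)]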

  18. Solid electrolyte material manufacturable by polymer processing methods

    Science.gov (United States)

    Singh, Mohit; Gur, Ilan; Eitouni, Hany Basam; Balsara, Nitash Pervez

    2012-09-18

    The present invention relates generally to electrolyte materials. According to an embodiment, the present invention provides for a solid polymer electrolyte material that is ionically conductive, mechanically robust, and can be formed into desirable shapes using conventional polymer processing methods. An exemplary polymer electrolyte material has an elastic modulus in excess of 1 x 10^6 Pa at 90 °C and is characterized by an ionic conductivity of at least 1 x 10^-5 S cm^-1 at 90 °C. An exemplary material can be characterized by a two-domain or three-domain material system. An exemplary material can include material components made of diblock polymers or triblock polymers. Many uses are contemplated for the solid polymer electrolyte materials. For example, the present invention can be applied to improve Li-based batteries by enabling higher energy density, better thermal and environmental stability, lower rates of self-discharge, enhanced safety, lower manufacturing costs, and novel form factors.

  19. Method of processing liquid wastes containing radioactive materials

    International Nuclear Information System (INIS)

    Matsumoto, Kaname; Shirai, Takamori; Nemoto, Kuniyoshi; Yoshikawa, Jun; Matsuda, Takeshi.

    1983-01-01

    Purpose: To reduce the number of solidification products by removing Co-60, which is particularly difficult to remove from radioactive liquid wastes containing a water-soluble chelating agent, by adsorbing it onto a specific chelating agent. Method: Liquid wastes containing radioactive cobalt and a water-soluble chelating agent are passed through a layer of a less water-soluble chelating agent that forms a complex compound with cobalt in an acidic pH region. Thus, the chelate compound of radioactive cobalt (particularly Co-60) is eliminated by adsorption on the specific chelating agent layer. The chelating agent with Co-60 adsorbed on it is discarded as it is through the cement- or asphalt-solidification process, whereby the number of solidification products generated can be significantly reduced. (Moriyama, K.)

  20. Process system and method for fabricating submicron field emission cathodes

    Science.gov (United States)

    Jankowski, Alan F.; Hayes, Jeffrey P.

    1998-01-01

    A process method and system for making field emission cathodes exists. The deposition source divergence is controlled to produce field emission cathodes with height-to-base aspect ratios that are uniform over large substrate surface areas while using very short source-to-substrate distances. The rate of hole closure is controlled from the cone source. The substrate surface is coated in well defined increments. The deposition source is apertured to coat pixel areas on the substrate. The entire substrate is coated using a manipulator to incrementally move the whole substrate surface past the deposition source. Either collimated sputtering or evaporative deposition sources can be used. The position of the aperture and its size and shape are used to control the field emission cathode size and shape.

  1. A Signal Processing Method to Explore Similarity in Protein Flexibility

    Directory of Open Access Journals (Sweden)

    Simina Vasilache

    2010-01-01

    Full Text Available Understanding mechanisms of protein flexibility is of great importance to structural biology. The ability to detect similarities between proteins and their patterns is vital in discovering new information about unknown protein functions. A Distance Constraint Model (DCM) provides a means to generate a variety of flexibility measures based on a given protein structure. Although information about mechanical properties of flexibility is critical for understanding protein function for a given protein, the question of whether certain characteristics are shared across homologous proteins is difficult to assess. For a proper assessment, a quantified measure of similarity is necessary. This paper begins to explore image processing techniques to quantify similarities in signals and images that characterize protein flexibility. The dataset considered here consists of three different families of proteins, with three proteins in each family. The similarities and differences found within flexibility measures across homologous proteins do not align with sequence-based evolutionary methods.

  2. Experimental Validation for Hot Stamping Process by Using Taguchi Method

    Science.gov (United States)

    Fawzi Zamri, Mohd; Lim, Syh Kai; Razlan Yusoff, Ahmad

    2016-02-01

    The demand for reduced gas emissions, energy saving and safer vehicles has driven the development of Ultra High Strength Steel (UHSS) materials. To strengthen a UHSS material such as boron steel, it must undergo hot stamping, in which it is heated at a certain temperature for a certain time. In this paper, the Taguchi method is applied to determine appropriate values of thickness, heating temperature and heating time to achieve optimum strength of boron steel. The experiment is conducted using a flat, square hot stamping tool with a tensile dog-bone specimen as the blank. The tensile strength and hardness are then measured as responses. The results show that lower thickness, higher heating temperature and longer heating time give higher strength and hardness in the final product. In conclusion, the boron steel blanks are able to achieve up to 1200 MPa tensile strength and 650 HV hardness.
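
    As an illustration of the Taguchi analysis step, the sketch below computes the larger-the-better signal-to-noise ratio commonly used to compare trials; the trial layout and strength values are placeholders, not the paper's measured data.

      # Hedged sketch: larger-the-better S/N ratio for Taguchi trial comparison.
      import numpy as np

      def sn_larger_is_better(responses):
          """S/N = -10 * log10( mean(1 / y^2) ), for responses that should be maximized."""
          y = np.asarray(responses, dtype=float)
          return -10.0 * np.log10(np.mean(1.0 / y ** 2))

      # One row per trial (e.g., a thickness/temperature/time setting); columns are
      # repeated tensile-strength measurements in MPa (illustrative values only).
      trials = np.array([[1100.0, 1120.0],
                         [1180.0, 1200.0],
                         [ 950.0,  980.0]])
      print([round(sn_larger_is_better(row), 2) for row in trials])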

  3. Variational methods for high-order multiphoton processes

    International Nuclear Information System (INIS)

    Gao, B.; Pan, C.; Liu, C.; Starace, A.F.

    1990-01-01

    Methods for applying the variationally stable procedure for Nth-order perturbative transition matrix elements of Gao and Starace [Phys. Rev. Lett. 61, 404 (1988); Phys. Rev. A 39, 4550 (1989)] to multiphoton processes involving systems other than atomic H are presented. Three specific cases are discussed: one-electron ions or atoms in which the electron--ion interaction is described by a central potential; two-electron ions or atoms in which the electronic states are described by the adiabatic hyperspherical representation; and closed-shell ions or atoms in which the electronic states are described by the multiconfiguration Hartree--Fock representation. Applications are made to the dynamic polarizability of He and the two-photon ionization cross section of Ar

  4. Method of processing liquid waste containing fission product

    International Nuclear Information System (INIS)

    Funabashi, Kiyomi; Kawamura, Fumio; Matsuda, Masami; Komori, Itaru; Miura, Eiichi.

    1988-01-01

    Purpose: To prepare solidification products of low surface dose by removing cesium, the main radioactive nuclide in wastes from reprocessing plants. Method: Liquid wastes containing a great amount of fission products are generated during the reprocessing of spent nuclear fuels. After pH adjustment, the liquid wastes are sent to a concentrator to concentrate the dissolved ingredients. The concentrated liquid wastes are pumped to an adsorption tower, in which the radioactive cesium that contributes most to the surface dose is removed. The liquid wastes are then sent by way of a surge tank to a mixing tank, in which they are mixed under stirring with solidifying agents such as cement. The mixture is then filled into a drum can and solidified. According to this invention, since the radioactive cesium is removed before solidification, it is possible to prepare solidification products of low surface dose and to facilitate the handling of the solidification products. (Horiuchi, T.)

  5. A New Digital Signal Processing Method for Spectrum Interference Monitoring

    Science.gov (United States)

    Angrisani, L.; Capriglione, D.; Ferrigno, L.; Miele, G.

    2011-01-01

    The frequency spectrum is a limited shared resource, nowadays used by an ever-growing number of different applications. Generally, the companies providing such services pay governments for the right to use a limited portion of the spectrum, and consequently expect assurance that the licensed radio spectrum resource is not affected by significant external interference. At the same time, they have to guarantee that their devices make efficient use of the spectrum and meet electromagnetic compatibility regulations. The competent authorities are therefore called upon to control access to the spectrum by adopting suitable management and monitoring policies, and manufacturers have to periodically verify the correct operation of their apparatus. Several measurement solutions are available on the market, generally real-time spectrum analyzers and measurement receivers. Both are characterized by good metrological accuracy, but their cost, dimensions and weight make use in the field impractical. The paper presents a first step in realizing a digital-signal-processing-based measurement instrument able to meet the above-mentioned needs. In particular, attention has been given to the DSP-based measurement section of the instrument. To this aim, an innovative measurement method for spectrum monitoring and management is proposed in this paper. It performs an efficient sequential analysis based on sample-by-sample digital processing. Three main goals are pursued: (i) measurement performance comparable to that exhibited by other methods proposed in the literature; (ii) fast measurement time; and (iii) easy implementation on cost-effective measurement hardware.
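
    The sketch below shows one generic way to realize sample-by-sample spectral monitoring: a sliding DFT that updates the power of a single frequency bin with every new sample. It is a standard technique offered only as an illustration, not the authors' algorithm, and the sampling rate, window length and test tone are assumptions.

      # Hedged sketch: sliding DFT for sample-by-sample monitoring of one frequency bin.
      import numpy as np

      def sliding_dft_bin(x, k, N):
          """Yield |X_k|^2 of the length-N window ending at each new sample."""
          twiddle = np.exp(2j * np.pi * k / N)
          X_k = 0.0 + 0.0j
          buffer = np.zeros(N)                       # holds the last N input samples
          for n, sample in enumerate(x):
              oldest = buffer[n % N]                 # sample that leaves the window
              buffer[n % N] = sample
              X_k = twiddle * (X_k + sample - oldest)  # sliding-DFT recurrence
              if n >= N - 1:
                  yield np.abs(X_k) ** 2

      # Example: a 128 Hz tone sampled at 1024 Hz falls in bin k = 32 of a 256-point window.
      fs, N = 1024, 256
      t = np.arange(4 * N) / fs
      signal = np.sin(2 * np.pi * 128 * t)
      power = list(sliding_dft_bin(signal, k=32, N=N))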

  6. Analysis of hydrogen separation methods in low pressure industrial processes

    International Nuclear Information System (INIS)

    Milidoni, M.; Somoza, J.; Borzone, E.M.; Blanco, M.V.; Cestau, D.; Baruj, A.; Meyer, G.

    2012-01-01

    In this work we present strategies for removing part of the hydrogen contained in a 500 l tank at a total pressure of 95 kPa. The hydrogen is mixed with other gases in a 95:5 ratio. The gas is generated as an end product during the production of radioisotopes; the main impurities are N2, humidity and activated gases. Two separation methods are proposed: one based on the use of a commercial Pd/Cu membrane, the other involving the use of materials capable of forming metal hydrides (HFM). The hydrogen separation properties of the Pd/Cu membrane were characterized in the laboratory using pure H2 and an H2/Ar mixture. We also present simulations of a device containing HFM of the LaNi5-xSnx type (0 ≤ x ≤ 0.5), using the hydrogen reaction properties measured in our laboratory. The performance of the different options was evaluated, and the results were compared using as evaluation criteria the pressure in the tank after 3 h of the separation process and the time needed to separate the amount of hydrogen generated during one batch of the process (author)

  7. Information theoretic methods for image processing algorithm optimization

    Science.gov (United States)

    Prokushkin, Sergey F.; Galil, Erez

    2015-01-01

    Modern image processing pipelines (e.g., those used in digital cameras) are full of advanced, highly adaptive filters that often have a large number of tunable parameters (sometimes > 100). This makes the calibration procedure for these filters very complex, and the optimal results barely achievable in the manual calibration; thus an automated approach is a must. We will discuss an information theory based metric for evaluation of algorithm adaptive characteristics ("adaptivity criterion") using noise reduction algorithms as an example. The method allows finding an "orthogonal decomposition" of the filter parameter space into the "filter adaptivity" and "filter strength" directions. This metric can be used as a cost function in automatic filter optimization. Since it is a measure of a physical "information restoration" rather than perceived image quality, it helps to reduce the set of the filter parameters to a smaller subset that is easier for a human operator to tune and achieve a better subjective image quality. With appropriate adjustments, the criterion can be used for assessment of the whole imaging system (sensor plus post-processing).
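
    As a generic illustration of the kind of information-theoretic quantity such a metric could build on (the paper's actual criterion is not reproduced here), the sketch below estimates the mutual information between a reference image and a filtered image from their joint intensity histogram; the bin count and array names are assumptions.

      # Hedged sketch: histogram-based mutual information between two images (in bits).
      import numpy as np

      def mutual_information(img_a, img_b, bins=64):
          """Estimate the mutual information of two equally shaped images."""
          joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
          p_ab = joint / joint.sum()                         # joint intensity distribution
          p_a = p_ab.sum(axis=1, keepdims=True)              # marginal of image A
          p_b = p_ab.sum(axis=0, keepdims=True)              # marginal of image B
          nonzero = p_ab > 0
          return np.sum(p_ab[nonzero] * np.log2(p_ab[nonzero] / (p_a @ p_b)[nonzero]))

      # reference = np.asarray(...)   # clean or reference frame (placeholder)
      # filtered  = np.asarray(...)   # output of the tuned noise-reduction filter
      # score = mutual_information(reference, filtered)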

  8. Study on Thixojoining Process Using Partial Remelting Method

    Directory of Open Access Journals (Sweden)

    M. N. Mohammed

    2013-01-01

    Full Text Available Cold-work tool steel is considered to be a non-weldable metal due to its high content of carbon and alloying elements. The application of a new process for the semisolid joining of two dissimilar metals is proposed. AISI D2 cold-work tool steel was thixojoined to 304 stainless steel using a partial remelting method. After thixojoining, microstructural examination, including metallographic analysis, energy dispersive spectroscopy (EDS) and Vickers hardness testing, was performed. Metallographic analysis along the joint interface between the semisolid AISI D2 and the stainless steel showed a smooth transition from one to the other, and neither oxides nor microcracking were observed. Hardness values obtained at points in the diffusion zone were much higher than those in the 304 stainless steel but lower than those in the AISI D2 tool steel. The study revealed that a new type of non-equilibrium diffusion interfacial structure was formed at the interface of the two different types of steel. The current work confirmed that a dendritic microstructure in the semisolid joined zone can be avoided and components with high bonding quality can be produced without the need for force or complex equipment, in contrast to conventional welding processes.

  9. Method for verification of constituents of a process stream

    Energy Technology Data Exchange (ETDEWEB)

    Baylor, L.C.; Buchanan, B.R.; O`Rourke, P.E.

    1993-01-01

    This invention comprises a method for validating a process stream for the presence or absence of a substance of interest such as a chemical warfare agent; that is, for verifying that a chemical warfare agent is present in an input line feeding the agent into a reaction vessel for destruction, or, in a facility producing commercial chemical products, that a constituent of a chemical warfare agent has not been substituted for the proper chemical compound. The method includes the steps of transmitting light through a sensor positioned in the feed line just before the chemical constituent in the input line enters the reaction vessel, measuring an optical spectrum of the chemical constituent from the light beam transmitted through it, and comparing the measured spectrum to a reference spectrum of the chemical agent and preferably also to reference spectra of surrogates. A signal is given if the chemical agent is not entering the reaction vessel for destruction, or if a constituent of a chemical agent has been added to a feed line in substitution for the proper chemical compound.
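
    A simple stand-in for the spectral comparison step, offered only as an illustration of the idea rather than the patented procedure, is to score the measured spectrum against reference spectra by cosine similarity and accept the best match above a threshold; the threshold value and dictionary structure below are assumptions.

      # Hedged sketch: matching a measured spectrum against reference spectra.
      import numpy as np

      def cosine_similarity(a, b):
          a, b = np.asarray(a, float), np.asarray(b, float)
          return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

      def verify_stream(measured, references, threshold=0.98):
          """Return the best-matching reference label, or None if nothing matches well enough."""
          scores = {label: cosine_similarity(measured, ref) for label, ref in references.items()}
          best = max(scores, key=scores.get)
          return best if scores[best] >= threshold else None

      # references = {"agent": agent_spectrum, "surrogate_1": surrogate_spectrum}   # placeholders
      # print(verify_stream(measured_spectrum, references))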

  10. Energy-saving method for technogenic waste processing

    Science.gov (United States)

    Dikhanbaev, Bayandy; Dikhanbaev, Aristan Bayandievich

    2017-01-01

    Dumps of the mining-metallurgical complexes of the post-Soviet republics have accumulated a huge amount of technogenic waste products; Kazakhstan alone holds about 20 billion tons. In the field of technogenic waste treatment, there is still no technical solution that makes it a profitable process. Recent global trends have prompted scientists to focus on developing an energy-saving, highly efficient melting unit that can significantly reduce specific fuel consumption. This paper reports the development of a new technological method, the smelt layer of inversion phase. The introduced method is characterized by a combination of ideal-stirring and ideal-displacement regimes. Using the method of affine modelling, the pilot plant's test results were recalculated for an industrial-scale unit. Experiments show that, in comparison with bubbling and boiling layers of smelt, the degree of zinc recovery increases in the layer of inversion phase. This indicates a reduced possibility of new formation of zinc silicates and ferrites from recombined molecules of ZnO, SiO2, and Fe2O3. Calculations show that in the industrial-scale version of the pilot plant, natural gas consumption is reduced approximately twofold in comparison with a fuming furnace, and the specific fuel consumption is reduced approximately fourfold in comparison with a Waelz kiln. PMID:29281646

  11. Soft-tissues Image Processing: Comparison of Traditional Segmentation Methods with 2D active Contour Methods

    Czech Academy of Sciences Publication Activity Database

    Mikulka, J.; Gescheidtová, E.; Bartušek, Karel

    2012-01-01

    Roč. 12, č. 4 (2012), s. 153-161 ISSN 1335-8871 R&D Projects: GA ČR GAP102/11/0318; GA ČR GAP102/12/1104; GA MŠk ED0017/01/01 Institutional support: RVO:68081731 Keywords : Medical image processing * image segmentation * liver tumor * temporomandibular joint disc * watershed method Subject RIV: JA - Electronics ; Optoelectronics, Electrical Engineering Impact factor: 1.233, year: 2012

  12. A design method for process design kit based on an SMIC 65 nm process

    International Nuclear Information System (INIS)

    Luo Haiyan; Chen Lan; Yin Minghui

    2010-01-01

    The frame structure of a process design kit (PDK) is described in detail, and a practical design method for PDKs is presented. Based on this method, a useful SMIC 65 nm PDK has been successfully designed and realized, which is applicable to the native EDA software Zeni. The design process and its difficulties are introduced through the development and analysis of parameterized cell (Pcell) devices (MOS, resistor, etc.). A structured design method is proposed to implement the Pcells, which makes the many thousands of lines of Pcell source code concise, readable, easy to maintain and portable. Moreover, a Pcase library for each Pcell is designed to verify the Pcells in batches. With this approach, the Pcells can be verified efficiently and the PDK becomes more reliable and stable. In addition, the component description format parameters and layouts of the Pcells are optimized to add flexibility and improve performance, which helps analog and custom IC designers satisfy their design requirements. Finally, the SMIC 65 nm PDK was applied to IC design. The results indicate that the SMIC 65 nm PDK is capable of supporting IC design. (semiconductor integrated circuits)

  13. Cathodic processes in high-temperature molten salts for the development of new materials processing methods

    International Nuclear Information System (INIS)

    Schwandt, Carsten

    2017-01-01

    Molten salts play an important role in the processing of a range of commodity materials. This includes the large-scale production of iron, aluminium, magnesium and alkali metals as well as the refining of nuclear fuel materials. This presentation focuses on two more recent concepts in which the cathodic reactions in molten salt electrolytic cells are used to prepare high-value-added materials. Both were developed and advanced at the Department of Materials Science and Metallurgy at the University of Cambridge and are still actively being pursued. One concept is now generally known as the FFC-Cambridge process. The presentation will highlight the optimisation of the process towards high selectivities for tubes or particles, depict a modification of the method to synthesize tin-filled carbon nanomaterial, and illustrate the implementation of a novel type of process control to enable the preparation of gramme quantities of material within a few hours with simple laboratory equipment. Also discussed will be the testing of these materials in lithium ion batteries

  14. [A new method of processing quantitative PCR data].

    Science.gov (United States)

    Ke, Bing-Shen; Li, Guang-Yun; Chen, Shi-Min; Huang, Xiang-Yan; Chen, Ying-Jian; Xu, Jun

    2003-05-01

    Today, standard PCR can no longer satisfy the needs of biotechnology development and clinical research. After extensive studies of PCR dynamics, the PE company found that there is a linear relation between the initial template number and the cycle at which the accumulating fluorescent product becomes detectable. They therefore developed a quantitative PCR technique for use in the PE7700 and PE5700. However, the error of this technique is too great to satisfy the needs of biotechnology development and clinical research, and a better quantitative PCR technique is needed. The mathematical model presented here draws on results from related fields and is based on the PCR principle and a careful analysis of the molecular relationships among the main components of the PCR reaction system. This model describes the functional relation between product quantity or fluorescence intensity and the initial template number and other reaction conditions, and can accurately reflect the accumulation behaviour of PCR product molecules. Accurate quantitative PCR analysis can be performed using this functional relation, and the accumulated PCR product quantity can be related to the initial template number. When this model is used for quantitative PCR analysis, the result error depends only on the accuracy of the fluorescence intensity measurement, i.e. on the instrument used. For example, when the fluorescence intensity is accurate to 6 digits and the template number is between 100 and 1,000,000, the accuracy of the quantitative result will be more than 99%. The result error differs markedly when the same conditions and the same instrument are used but the analysis method differs. Moreover, if this quantitative PCR analysis system is used to process the data, the results are about 80 times more accurate than those obtained with the CT method.
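
    For background, the sketch below fits the standard exponential amplification relation N_n = N0 (1 + E)^n to exponential-phase fluorescence readings in order to recover the initial template level. This is the textbook model of PCR accumulation, not the authors' model, and the cycle range, noise level and starting guesses are assumptions.

      # Hedged sketch: fitting the textbook exponential amplification model to fluorescence data.
      import numpy as np
      from scipy.optimize import curve_fit

      def amplification(cycle, n0, efficiency):
          """Fluorescence proportional to template copies after `cycle` cycles."""
          return n0 * (1.0 + efficiency) ** cycle

      # Illustrative exponential-phase readings (cycle number, fluorescence units)
      rng = np.random.default_rng(1)
      cycles = np.arange(18, 26)
      fluorescence = 1e-4 * 2.0 ** (cycles - 18) * (1 + 0.02 * rng.standard_normal(len(cycles)))

      (n0_est, eff_est), _ = curve_fit(amplification, cycles, fluorescence,
                                       p0=(1e-9, 0.9), maxfev=10000)
      print(n0_est, eff_est)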

  15. Classical-processing and quantum-processing signal separation methods for qubit uncoupling

    Science.gov (United States)

    Deville, Yannick; Deville, Alain

    2012-12-01

    The Blind Source Separation problem consists in estimating a set of unknown source signals from their measured combinations. Until now it has only been investigated in a non-quantum framework. We propose its first quantum extensions, thus introducing the field of Quantum Source Separation and investigating both its blind and non-blind configurations. More precisely, we show how to retrieve individual quantum bits (qubits) only from the global state resulting from their undesired coupling. We consider cylindrical-symmetry Heisenberg coupling, which occurs, e.g., when two electron spins interact through exchange. We first propose several qubit uncoupling methods which typically measure repeatedly the coupled quantum states resulting from individual qubit preparations, and then statistically process the classical data provided by these measurements. Numerical tests prove the effectiveness of these methods. We then derive a combination of quantum gates for performing qubit uncoupling, thus avoiding repeated qubit preparations and irreversible measurements.
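
    To ground the classical problem statement that the paper extends to qubits, the sketch below separates two synthetic mixed signals with FastICA, a standard non-quantum blind source separation algorithm; the sources and mixing matrix are illustrative, and none of the quantum machinery is represented here.

      # Hedged sketch: classical blind source separation of two synthetic mixtures via FastICA.
      import numpy as np
      from sklearn.decomposition import FastICA

      t = np.linspace(0, 1, 2000)
      sources = np.c_[np.sin(2 * np.pi * 5 * t),               # smooth source
                      np.sign(np.sin(2 * np.pi * 3 * t))]      # square-wave source
      mixing = np.array([[1.0, 0.6],
                         [0.4, 1.0]])
      observations = sources @ mixing.T                        # measured combinations

      ica = FastICA(n_components=2, random_state=0)
      recovered = ica.fit_transform(observations)              # estimated sources (up to scale/order)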

  16. Image processing methods and architectures in diagnostic pathology.

    Directory of Open Access Journals (Sweden)

    Oscar Déniz

    2010-05-01

    Full Text Available Grid technology has enabled the clustering of, and efficient and secure access to and interaction among, a wide variety of geographically distributed resources such as supercomputers, storage systems, data sources, instruments, special devices and services. Its main applications include large-scale computational and data-intensive problems in science and engineering. General grid structures and methodologies, for both software and hardware, in image analysis for virtual tissue-based diagnosis are considered in this paper. These methods focus on the user-level middleware. The article describes the distributed programming system developed by the authors for virtual slide analysis in diagnostic pathology. The system supports different image analysis operations commonly performed in anatomical pathology, and it takes into account security aspects and specialized infrastructures with high-level services designed to meet application requirements. Grids are likely to have a deep impact on health-related applications, and therefore they seem to be suitable for tissue-based diagnosis too. The implemented system is a joint application that mixes both Web and Grid Service Architecture around a distributed architecture for image processing. It has proven to be a successful solution for analyzing a large and heterogeneous set of histological images on an architecture of massively parallel processors using message passing and non-shared memory.

  17. Method of processing radioactive nuclide-containing liquids

    International Nuclear Information System (INIS)

    Hirai, Masahide; Tomoshige, Shozo; Kondo, Kozo; Suzuki, Kazunori; Todo, Fukuzo; Yamanaka, Akihiro.

    1985-01-01

    Purpose: To solidify radioactive nuclides into a much more compact state and facilitate their storage. Method: Liquid wastes such as drain liquids generated from a nuclear power plant at a low activity concentration of 1 x 10^-6 to 1 x 10^-4 μCi/ml are first brought into contact with a chelate-type ion exchange resin, such as a phenolic resin or other ion exchange resin, to adsorb the radioactive nuclides on the resin, and the nuclides are then eluted with sulfuric acid or the like to obtain liquid concentrates. The liquid concentrates are electrolyzed in an ordinary electrolytic facility using platinum or the like as the anode and Al or the like as the cathode, in the presence of 1-20 g/l of non-radioactive heavy metals such as Co and Ni in the liquid, while adjusting the pH to 2-8. The electrolysis liquid residue is returned to the electrolysis tank either as it is or in the form of precipitates coagulated with a polymeric flocculant. The supernatant liquid from the flocculation treatment is processed with the chelate-type ion exchange resin to render it harmless. (Sekiya, K.)

  18. A generic remote method invocation for intensive data processing

    International Nuclear Information System (INIS)

    Neto, A.; Alves, D.; Fernandes, H.; Ferreira, J.S.; Varandas, C.A.F.

    2006-01-01

    Based on the Extensible Markup Language (XML) and the Remote Method Invocation (RMI) standards, a client/server remote data analysis application has been developed for intensive data processing. This grid-oriented philosophy provides a powerful tool for maintaining updated code and centralized computational resources. Another major feature is the ability to share proprietary algorithms running on remote computers without the need for local code and library installation and maintenance. The 16-CPU Orionte cluster in operation at Centro de Fusao Nuclear (CFN) is currently used to provide remote data analysis. Codes written in languages such as Octave, C, Fortran or IDL are called through remote script invocation, and data are released to the client as soon as they are available. The remote calculation parameters are described in an XML file containing the configuration for the server runtime environment. Since execution is performed by calling a script, any program can be launched to perform the analysis; the only requirement is the implementation of the protocol described in the XML. Some plasma properties of the CFN tokamak (ISTTOK) that require heavy computational resources are already obtained using this approach, allowing ready inter-shot analysis and parameterization decisions
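
    Purely as a hypothetical illustration of the script-invocation idea described above (the XML schema and element names are invented here and are not the CFN protocol), a server-side handler might parse a job description and launch the requested analysis code as follows.

      # Hedged sketch: parsing a hypothetical XML job description and running the named script.
      import subprocess
      import xml.etree.ElementTree as ET

      JOB_XML = """
      <job>
        <interpreter>octave</interpreter>
        <script>plasma_density.m</script>
        <arg>shot=31415</arg>
      </job>
      """

      def run_job(job_xml):
          """Build the command line described by the XML and run it, capturing the output."""
          job = ET.fromstring(job_xml)
          command = [job.findtext("interpreter"), job.findtext("script")]
          command += [arg.text for arg in job.findall("arg")]
          return subprocess.run(command, capture_output=True, text=True)

      # result = run_job(JOB_XML)     # requires the named interpreter to be installed
      # print(result.stdout)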

  19. Method of radioactive waste processing and equipment therefor

    International Nuclear Information System (INIS)

    Napravnik, J.; Skaba, V.; Ditl, P.

    1988-01-01

    Mushy or liquid radioactive wastes are mixed with chemical additives, e.g., aluminium sulfate, colloidal silicon oxide, formic acid and cement suspension. The mix is heated to 100 to 320 degC. By drying the waste and by chemical reaction, a bulk intermediate product will be obtained which is homogenized with molten bitumen or organic polymers. The mass is then poured into containers where it will harden and will then be transported to the depository. The advantage of the method is that the final product is a stable mass resistant to separation, leaching and erosion, showing long-term storage safety. The main components of the installation are a mixed reactor, a doser of bulk material and a homogenizer which are series connected in that order. The apparatus is mounted on a support structure which may be divided into at least two parts. The advantage of this facility is that it is easily transported and can thereby be used for processing waste at source. (E.S.). 2 figs

  20. A generic remote method invocation for intensive data processing

    Energy Technology Data Exchange (ETDEWEB)

    Neto, A. [Associacao Euratom/IST, Centro de Fusao Nuclear, Av. Rovisco Pais, P-1049-001 Lisbon (Portugal)]. E-mail: andre.neto@cfn.ist.utl.pt; Alves, D. [Associacao Euratom/IST, Centro de Fusao Nuclear, Av. Rovisco Pais, P-1049-001 Lisbon (Portugal); Fernandes, H. [Associacao Euratom/IST, Centro de Fusao Nuclear, Av. Rovisco Pais, P-1049-001 Lisbon (Portugal); Ferreira, J.S. [Associacao Euratom/IST, Centro de Fusao Nuclear, Av. Rovisco Pais, P-1049-001 Lisbon (Portugal); Varandas, C.A.F. [Associacao Euratom/IST, Centro de Fusao Nuclear, Av. Rovisco Pais, P-1049-001 Lisbon (Portugal)

    2006-07-15

    Based on the Extensible Markup Language (XML) and the Remote Method Invocation (RMI) standards, a client/server remote data analysis application has been developed for intensive data processing. This GRID-oriented philosophy provides a powerful tool for maintaining updated code and centralized computational resources. Another major feature is the ability to share proprietary algorithms on remote computers without the need for local code and library installation and maintenance. The 16-CPU Orionte cluster in operation at Centro de Fusao Nuclear (CFN) is currently used to provide remote data analysis. Codes written in languages such as Octave, C, Fortran or IDL are called through a remote script invocation, and data is released to the client as soon as it is available. The remote calculation parameters are described in an XML file containing the configuration for the server runtime environment. Since execution is performed by calling a script, any program can be launched to perform the analysis; the only requirement is the implementation of the protocol described in XML. Some plasma properties of the CFN tokamak (ISTTOK) that require heavy computational resources are already obtained using this approach, allowing ready inter-shot analysis and parameterization decisions.

  1. A comparative method for processing immunological parameters: developing an "Immunogram".

    Science.gov (United States)

    Ortolani, Riccardo; Bellavite, Paolo; Paiola, Fiorenza; Martini, Morena; Marchesini, Martina; Veneri, Dino; Franchini, Massimo; Chirumbolo, Salvatore; Tridente, Giuseppe; Vella, Antonio

    2010-04-01

    The immune system is a network of numerous cells that communicate both directly and indirectly with each other. The system is very sensitive to antigenic stimuli, which are memorised, and is closely connected with the endocrine and nervous systems. Therefore, in order to study the immune system correctly, it must be considered in all its complexity by analysing its components with multiparametric tools that take its dynamic characteristic into account. We analysed lymphocyte subpopulations by using monoclonal antibodies with six different fluorochromes; the monoclonal panel employed included CD45, CD3, CD4, CD8, CD16, CD56, CD57, CD19, CD23, CD27, CD5, and HLA-DR. This panel has enabled us to measure many lymphocyte subsets in different states and with different functions: helper, suppressor, activated, effector, naïve, memory, and regulatory. A database was created to collect the values of immunological parameters of approximately 8,000 subjects who have undergone testing since 2000. When the distributions of the values for these parameters were compared with the medians of reference values published in the literature, we found that most of the values from the subjects included in the database were close to the medians in the literature. To process the data we used a comparative method that calculates the percentile rank of the values of a subject by comparing them with the values for others subjects of the same age. From this data processing we obtained a set of percentile ranks that represent the positions of the various parameters with regard to the data for other age-matched subjects included in the database. These positions, relative to both the absolute values and percentages, are plotted in a graph. We have called the final plot, which can be likened to that subject's immunological fingerprint, an "Immunogram". In order to perform the necessary calculations automatically, we developed dedicated software (Immunogramma) which provides at least two different
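
    The percentile-rank step described above can be illustrated with a small sketch. The reference values, the markers and the age-matching rule below are placeholders, not the laboratory's database of roughly 8,000 subjects or the Immunogramma software.

```python
import numpy as np
from scipy.stats import percentileofscore

# Placeholder reference data: absolute counts (cells/uL) of two lymphocyte
# subsets for age-matched subjects drawn from a hypothetical database.
reference = {
    "CD4+": np.array([820, 950, 700, 1100, 880, 760, 1020, 910]),
    "CD8+": np.array([450, 520, 380, 610, 490, 430, 560, 500]),
}

# Values measured for the subject being profiled (also placeholders).
subject = {"CD4+": 980, "CD8+": 400}

def immunogram(subject_values, reference_values):
    """Return the percentile rank of each parameter against age-matched references."""
    return {
        marker: percentileofscore(reference_values[marker], value, kind="mean")
        for marker, value in subject_values.items()
    }

if __name__ == "__main__":
    for marker, rank in immunogram(subject, reference).items():
        print(f"{marker}: {rank:.1f}th percentile")
```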

  2. [Baseflow separation methods in hydrological process research: a review].

    Science.gov (United States)

    Xu, Lei-Lei; Liu, Jing-Lin; Jin, Chang-Jie; Wang, An-Zhi; Guan, De-Xin; Wu, Jia-Bing; Yuan, Feng-Hui

    2011-11-01

    Baseflow separation is regarded as one of the most important and difficult issues in hydrology and ecohydrology, but it lacks unified standards in both concepts and methods. This paper introduces the theory of baseflow separation based on the definitions of baseflow components and analyses the development of the different baseflow separation methods. Among the methods developed, the graphical separation method is simple and widely applicable but arbitrary; the balance method accords with the hydrological mechanism but is difficult to apply; whereas the time-series separation method and the isotopic method can overcome the subjective and arbitrary defects of the graphical method and thus obtain the baseflow quickly and efficiently. In recent years, hydrological modelling, digital filtering and isotopic methods have been the main methods used for baseflow separation.
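
    As an illustration of the digital-filtering family of methods mentioned above, the sketch below applies a one-parameter recursive filter of the Lyne-Hollick type to a synthetic streamflow series. The filter parameter, the single forward pass and the synthetic hydrograph are simplifying assumptions; practical applications usually use multiple forward/backward passes and calibrated parameters.

```python
import numpy as np

def baseflow_filter(streamflow, alpha=0.925):
    """One forward pass of a Lyne-Hollick type recursive digital filter.

    Splits total streamflow into quickflow and baseflow; alpha controls how
    strongly the high-frequency (storm) component is attenuated.
    """
    q = np.asarray(streamflow, dtype=float)
    quick = np.zeros_like(q)
    for t in range(1, len(q)):
        quick[t] = alpha * quick[t - 1] + 0.5 * (1 + alpha) * (q[t] - q[t - 1])
        quick[t] = max(quick[t], 0.0)   # quickflow cannot be negative
        quick[t] = min(quick[t], q[t])  # and cannot exceed total flow
    baseflow = q - quick
    return baseflow, quick

if __name__ == "__main__":
    # Synthetic hydrograph: slow recession plus two storm peaks.
    t = np.arange(100)
    flow = (5 + 4 * np.exp(-t / 40)
            + 10 * np.exp(-((t - 20) ** 2) / 8)
            + 6 * np.exp(-((t - 60) ** 2) / 10))
    base, quick = baseflow_filter(flow)
    print(f"Baseflow index (baseflow / total flow): {base.sum() / flow.sum():.2f}")
```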

  3. Methods utilized in evaluating the profitability of commercial space processing

    Science.gov (United States)

    Bloom, H. L.; Schmitt, P. T.

    1976-01-01

    Profitability analysis is applied to commercial space processing on the basis of business concept definition and assessment and the relationship between ground and space functions. Throughput analysis is demonstrated by analysis of the space manufacturing of surface acoustic wave devices. The paper describes a financial analysis model for space processing and provides key profitability measures for space processed isoenzymes.

  4. A method for processing peat or brown coal

    Energy Technology Data Exchange (ETDEWEB)

    Belkevich, P.I.; Lishtvan, I.I.; Prokhorov, G.M.; Tolstikov, G.A.

    1983-01-01

    A method is patented for the extraction of peat and brown coal using dimethylformamide or dimethylsulfoxide in order to increase the yield of bitumen, to produce dyes and acids from it, and to utilize the debituminized fuel. The extraction is conducted at a solvent to raw material ratio of 1 to 5 at a temperature of 95 to 160 degrees for 0.5 to 3 hours. The extract is treated with hydroxides or carbonates of alkali metals at an alkali to bitumen ratio of 0.1 to 0.5 at 95 to 160 degrees for 0.5 to 2 hours, with isolation of the salts of the carboxylic acids and their recrystallization from the hydroxide to obtain humic acids as the target product. The solvent is distilled from the extraction residue and, after drying the sediment, a dye D is produced, while the debituminized fuel is treated with alkali metal hydroxides at a ratio of 0.1-1 to 1 at 100 to 150 degrees for 0.5 to 2 hours to obtain a thinner for cement solutions. Example: a suspension of 180 grams of peat with a particle size of 0.25 to 10 millimetres, characterized (in percent) by a degree of decomposition of 40, a moisture level of 20, an ash content of 3.1 and a bitumen content of 4.2, is mixed with 810 grams of dimethylformamide (an extraction agent to peat ratio of 4.5) and heated at 95 degrees for three hours. Eight hundred and seventy grams of extract (a bitumen yield of 33 percent) are obtained, along with 120 grams of debituminized peat. Thirty grams of NaOH (an alkali to bitumen ratio of 0.5) are gradually added to the bitumen extract at 90 to 100 degrees. The reaction mixture is heated to 160 degrees, held at this temperature for 2 hours, subsequently cooled to 20 degrees and filtered, and the salts of the carboxylic acids are washed with a fresh portion of dimethylsulfoxide, yielding 21.3 grams of salts with a melting point of 122 to 175 degrees.

  5. Sandia solidification process: a broad range aqueous waste solidification method

    International Nuclear Information System (INIS)

    Lynch, R.W.; Dosch, R.G.; Kenna, B.T.; Johnstone, J.K.; Nowak, E.J.

    1976-01-01

    New ion-exchange materials of the hydrous oxide type were developed for solidifying aqueous radioactive wastes. These materials have the general formula M[M'xOyHz]n, where M is an exchangeable cation of charge +n and M' may be Ti, Nb, Zr, or Ta. Affinities for polyvalent cations were found to be very high and ion-exchange capacities large (e.g., 4.0-4.5 meq/g for NaTi2O5H, depending on moisture content). The effectiveness of the exchangers for solidifying high-level waste resulting from reprocessing light-water reactor fuel was demonstrated in small-scale tests. Used in conjunction with anion exchange resin, these materials reduced test solution radioactivity from approximately 0.2 Ci/ml to as low as approximately 2 nCi/ml. The residual radioactivity was almost exclusively due to Ru-106, and total α-activity was only a few pCi/ml. Alternative methods of consolidating the solidified waste were evaluated using nonradioactive simulants. Best results were obtained by pressure-sintering, which yielded essentially fully dense ceramics, e.g., titanate/titania ceramics with a bulk density as high as 4.7 g/cm3, a waste oxide content as high as 1.2 g/cm3, and leach resistance comparable to good borosilicate glass. Based on the above results, a baseline process for solidifying high-level waste was defined, and approximate economic analyses indicated costs were not prohibitive. Additional tests have demonstrated that, if desired, operating conditions could be modified to allow recovery of radiocesium (and perhaps other isotopes) during solidification of the remaining constituents of high-level waste. Preliminary tests have also shown that these materials offer promise for treating tank-stored neutralized wastes.

  6. Image Processing Methods Usable for Object Detection on the Chessboard

    Directory of Open Access Journals (Sweden)

    Beran Ladislav

    2016-01-01

    Full Text Available Image segmentation and object detection is a challenging problem in much research. Although many algorithms for image segmentation have been invented, there is no single simple algorithm for image segmentation and object detection. Our research is based on a combination of several methods for object detection. The first method suitable for image segmentation and object detection is colour detection. This method is very simple, but there is a problem with different colours: the colour of the segmented object must be precisely determined before all calculations, and in many cases this colour has to be determined manually. An alternative simple method is based on background removal, i.e. on the difference between a reference image and the detected image. In this paper several methods suitable for object detection are described. This research is focused on the detection of coloured objects on a chessboard. The results of this research, combined with neural networks, will be applied to a user-computer checkers game.
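
    The background-removal idea described in the record (difference between a reference image of the empty board and the current image) can be sketched in plain NumPy as below. The threshold value and the synthetic images are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def detect_by_background_removal(reference, current, threshold=30):
    """Return a binary mask of pixels that differ noticeably from the reference image.

    Both images are expected as 2-D uint8 grayscale arrays of equal shape.
    """
    diff = np.abs(current.astype(np.int16) - reference.astype(np.int16))
    return (diff > threshold).astype(np.uint8)

if __name__ == "__main__":
    # Synthetic example: an "empty board" image and the same scene with one piece added.
    rng = np.random.default_rng(0)
    empty_board = rng.integers(100, 120, size=(64, 64), dtype=np.uint8)
    with_piece = empty_board.copy()
    with_piece[20:30, 20:30] = 200  # a bright object placed on the board
    mask = detect_by_background_removal(empty_board, with_piece)
    print("Detected object pixels:", int(mask.sum()))
```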

  7. Vector-Parallel processing of the successive overrelaxation method

    International Nuclear Information System (INIS)

    Yokokawa, Mitsuo

    1988-02-01

    The successive overrelaxation method, called the SOR method, is one of the iterative methods for solving linear systems of equations, and it has been computed serially with a natural ordering in many nuclear codes. After the appearance of vector processors, this natural SOR method was replaced by parallel algorithms such as the hyperplane or red-black methods, in which the calculation order is modified. These methods are suitable for vector processors, and higher computation speed can be obtained compared with the natural SOR method on vector processors. In this report, a new scheme named the 4-colours SOR method is proposed. We find that the 4-colours SOR method can be executed on vector-parallel processors and that it gives the fastest calculation among all SOR methods according to the results of vector-parallel execution on the Alliant FX/8 multiprocessor system. It is also shown that the theoretical optimal acceleration parameters are equal among the five differently ordered SOR methods, and the differences between the convergence rates of these SOR methods are examined. (author)
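
    The red-black ordering mentioned above (which the report generalizes to four colours) can be sketched for a 2-D Poisson problem as follows. The grid size, right-hand side and relaxation parameter are illustrative, and this is not the authors' 4-colour scheme; it only shows why coloured orderings expose parallelism.

```python
import numpy as np

def red_black_sor(rhs, omega=1.8, sweeps=200):
    """Red-black ordered SOR for -laplace(u) = rhs on the unit square with zero
    Dirichlet boundaries (5-point stencil). Points of one colour depend only on
    points of the other colour, so each half-sweep can be vectorized or parallelized."""
    n = rhs.shape[0]
    h2 = (1.0 / (n - 1)) ** 2
    u = np.zeros_like(rhs, dtype=float)
    for _ in range(sweeps):
        for colour in (0, 1):  # first the "red" points, then the "black" points
            for i in range(1, n - 1):
                for j in range(1, n - 1):
                    if (i + j) % 2 != colour:
                        continue
                    gauss_seidel = 0.25 * (u[i - 1, j] + u[i + 1, j]
                                           + u[i, j - 1] + u[i, j + 1]
                                           + h2 * rhs[i, j])
                    u[i, j] = (1.0 - omega) * u[i, j] + omega * gauss_seidel
    return u

if __name__ == "__main__":
    n = 33
    f = np.ones((n, n))
    u = red_black_sor(f)
    print(f"Maximum of the solution after 200 sweeps: {u.max():.4e}")
```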

  8. A method of automatic data processing in radiometric control

    International Nuclear Information System (INIS)

    Adonin, V.M.; Gulyukina, N.A.; Nemirov, Yu.V.; Mogil'nitskij, M.I.

    1980-01-01

    The algorithm for automatic data processing in gamma radiography of products is described. A specific feature of the processing is its speed, due to the application of recurrent estimation. Experimental data from in-line inspection are presented. The results obtained have shown the applicability of automatic signal processing to testing under industrial conditions, which would make it possible to increase testing efficiency, eliminate subjectivity in the assessment of test results and improve working conditions.

  9. Methods of radioactive waste processing and disposal in the United Kingdom

    International Nuclear Information System (INIS)

    Tolstykh, V.D.

    1983-01-01

    The results of investigations into radioactive waste processing and disposal in the United Kingdom are discussed. Methods for the solidification of metal and graphite radioactive wastes and of radioactive sludge from the Magnox reactors are described. Specifications of the different installations used for radioactive waste disposal are given. Climatic and geological conditions in the United Kingdom are such that any deep waste repository will lie below the groundwater level. Dissolution and transport by groundwater will therefore inevitably result in radionuclide mobility. In this connection, an extended programme of investigations into the three main aspects of the disposal problem, namely radionuclide release in repositories, groundwater transport and radionuclide migration, is being carried out. The programme is divided into two parts. The first part deals with the retrieval of hydrological and geochemical data on geological formations and with the development of the specialized investigation methods necessary for identifying sites for final waste disposal. The second part comprises theoretical and laboratory investigations into the processes of radionuclide transport in the 'repository-geological formation' system. It is concluded that vitrification based on borosilicate glass is the most advanced method of radioactive waste solidification.

  10. Effects of Process Parameters on Copper Powder Compaction Process Using Multi-Particle Finite Element Method

    Science.gov (United States)

    Güner, F.; Sofuoğlu, H.

    2018-01-01

    Powder metallurgy (PM) is widely used in several industries, especially the automotive and aerospace industries, and the output of powder metallurgy products grows every year. The mechanical properties of the final product obtained by cold compaction and sintering in powder metallurgy are closely related to the final relative density achieved in the process. The distribution of the relative density in the die is affected by parameters such as compaction velocity, friction coefficient and temperature. Moreover, most numerical studies using finite element approaches treat the examined domain as a continuous medium with uniformly homogeneous porosity, whereas the Multi-Particle Finite Element Method (MPFEM) treats every particle as an individual body. In MPFEM, each particle can be defined as an elastic-plastic deformable body, so the interactions of the particles with each other and with the die wall can be investigated. In this study, each particle was modelled and analysed as an individual deformable body with 3D tetrahedral elements using the MPFEM approach. The study therefore investigated the effects of different temperatures and compaction velocities on the stress distribution and deformation of copper powders of 200 µm diameter during the compaction process. Furthermore, the 3-D MPFEM model used a von Mises material model and a constant coefficient of friction of μ=0.05. In addition to the MPFEM approach, a continuum modelling approach was also employed for comparison purposes.

  11. The method validation step of biological dosimetry accreditation process

    International Nuclear Information System (INIS)

    Roy, L.; Voisin, P.A.; Guillou, A.C.; Busset, A.; Gregoire, E.; Buard, V.; Delbos, M.; Voisin, Ph.

    2006-01-01

    One of the missions of the Laboratory of Biological Dosimetry (L.D.B.) of the Institute for Radiation and Nuclear Safety (I.R.S.N.) is to assess the radiological dose after a suspected accidental overexposure to ionising radiation, using radiation-induced changes in certain biological parameters. The 'gold standard' is the yield of dicentrics observed in the patient's lymphocytes, and this yield is converted into a dose using dose-effect relationships. This method is complementary to clinical and physical dosimetry for the medical team in charge of the patient. To obtain formal recognition of its operational activity, the laboratory decided three years ago to seek accreditation by following the recommendations of both 17025, General Requirements for the Competence of Testing and Calibration Laboratories, and 19238, Performance criteria for service laboratories performing biological dosimetry by cytogenetics. Diagnostics and risk analyses were carried out to control the whole analysis process, leading to the writing of documents. Purchasing, the personnel department and vocational training were also included in the quality system. Audits were very helpful in improving the quality system. One specificity of this technique is that it is not standardized; therefore, apart from quality management aspects, several technical points needed validation. An inventory of potentially influential factors was carried out. To estimate their real effect on the yield of dicentrics, a Plackett-Burman experimental design was conducted. The effect of seven parameters was tested: the BUdR (bromodeoxyuridine), PHA (phytohemagglutinin) and colcemid concentrations, the culture duration, the incubator temperature, the blood volume and the medium volume. The chosen values were calculated according to the uncertainties in the way they were measured, i.e. pipettes, thermometers, test tubes. None of the factors had a significant impact on the yield of dicentrics. Therefore the uncertainty linked to their use was considered as
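
    The record does not give the design matrix or run count, so the sketch below only illustrates the general idea of a two-level screening design for seven factors: an 8-run orthogonal ±1 matrix (built here from a Sylvester Hadamard construction, which is one common way to obtain such a design) and main-effect contrasts computed from hypothetical dicentric yields. The factor order, yields and run size are assumptions, not the laboratory's actual Plackett-Burman experiment.

```python
import numpy as np

FACTORS = ["BUdR conc.", "PHA conc.", "colcemid conc.", "culture duration",
           "incubator temperature", "blood volume", "medium volume"]

def screening_design(n_factors=7):
    """8-run, two-level orthogonal design: a Sylvester Hadamard matrix of order 8
    with its all-ones column removed, leaving 7 mutually orthogonal +/-1 columns."""
    h2 = np.array([[1, 1], [1, -1]])
    h8 = np.kron(np.kron(h2, h2), h2)
    return h8[:, 1:1 + n_factors]

def main_effects(design, response):
    """Average response at the high level minus average at the low level, per factor."""
    design = np.asarray(design, dtype=float)
    response = np.asarray(response, dtype=float)
    return design.T @ response / (len(response) / 2)

if __name__ == "__main__":
    X = screening_design()
    # Hypothetical dicentric yields (dicentrics per cell) for the 8 runs.
    y = np.array([0.112, 0.108, 0.115, 0.110, 0.109, 0.113, 0.111, 0.107])
    for name, effect in zip(FACTORS, main_effects(X, y)):
        print(f"{name:>22s}: effect = {effect:+.4f}")
```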

  12. Quality control methods in accelerometer data processing: identifying extreme counts.

    Directory of Open Access Journals (Sweden)

    Carly Rich

    Full Text Available Accelerometers are designed to measure plausible human activity, however extremely high count values (EHCV have been recorded in large-scale studies. Using population data, we develop methodological principles for establishing an EHCV threshold, propose a threshold to define EHCV in the ActiGraph GT1M, determine occurrences of EHCV in a large-scale study, identify device-specific error values, and investigate the influence of varying EHCV thresholds on daily vigorous PA (VPA.We estimated quantiles to analyse the distribution of all accelerometer positive count values obtained from 9005 seven-year old children participating in the UK Millennium Cohort Study. A threshold to identify EHCV was derived by differentiating the quantile function. Data were screened for device-specific error count values and EHCV, and a sensitivity analysis conducted to compare daily VPA estimates using three approaches to accounting for EHCV.Using our proposed threshold of ≥ 11,715 counts/minute to identify EHCV, we found that only 0.7% of all non-zero counts measured in MCS children were EHCV; in 99.7% of these children, EHCV comprised < 1% of total non-zero counts. Only 11 MCS children (0.12% of sample returned accelerometers that contained negative counts; out of 237 such values, 211 counts were equal to -32,768 in one child. The medians of daily minutes spent in VPA obtained without excluding EHCV, and when using a higher threshold (≥19,442 counts/minute were, respectively, 6.2% and 4.6% higher than when using our threshold (6.5 minutes; p<0.0001.Quality control processes should be undertaken during accelerometer fieldwork and prior to analysing data to identify monitors recording error values and EHCV. The proposed threshold will improve the validity of VPA estimates in children's studies using the ActiGraph GT1M by ensuring only plausible data are analysed. These methods can be applied to define appropriate EHCV thresholds for different accelerometer models.
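
    The paper derives its threshold by differentiating the quantile function of all positive counts; the sketch below shows the numerical principle on synthetic count data. The synthetic distribution, the grid spacing and the rule used to pick the "elbow" are all assumptions, so this will not reproduce the published value of 11,715 counts/minute.

```python
import numpy as np

def quantile_elbow_threshold(counts, grid_step=0.001, slope_factor=50.0):
    """Estimate an extreme-count threshold from the empirical quantile function.

    The quantile function of the positive counts is evaluated on a fine grid of
    probabilities and numerically differentiated; the threshold is taken where the
    slope first exceeds `slope_factor` times its median, i.e. where the upper tail
    starts rising implausibly fast. The selection rule is illustrative only.
    """
    counts = np.asarray(counts, dtype=float)
    p = np.arange(grid_step, 1.0, grid_step)
    q = np.quantile(counts, p)
    dq = np.gradient(q, grid_step)  # derivative of the empirical quantile function
    jump = np.argmax(dq > slope_factor * np.median(dq))
    return q[jump]

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Synthetic accelerometer counts: mostly plausible activity plus a small extreme tail.
    plausible = rng.gamma(shape=1.5, scale=800.0, size=100_000)
    extremes = rng.uniform(12_000, 30_000, size=2_000)
    counts = np.concatenate([plausible, extremes])
    print(f"Estimated extreme-count threshold: {quantile_elbow_threshold(counts):.0f} counts/min")
```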

  13. The method validation step of biological dosimetry accreditation process

    Energy Technology Data Exchange (ETDEWEB)

    Roy, L.; Voisin, P.A.; Guillou, A.C.; Busset, A.; Gregoire, E.; Buard, V.; Delbos, M.; Voisin, Ph. [Institut de Radioprotection et de Surete Nucleaire, LDB, 92 - Fontenay aux Roses (France)

    2006-07-01

    One of the missions of the Laboratory of Biological Dosimetry (L.D.B.) of the Institute for Radiation and Nuclear Safety (I.R.S.N.) is to assess the radiological dose after a suspected accidental overexposure to ionising radiation, using radiation-induced changes in certain biological parameters. The 'gold standard' is the yield of dicentrics observed in the patient's lymphocytes, and this yield is converted into a dose using dose-effect relationships. This method is complementary to clinical and physical dosimetry for the medical team in charge of the patient. To obtain formal recognition of its operational activity, the laboratory decided three years ago to seek accreditation by following the recommendations of both 17025, General Requirements for the Competence of Testing and Calibration Laboratories, and 19238, Performance criteria for service laboratories performing biological dosimetry by cytogenetics. Diagnostics and risk analyses were carried out to control the whole analysis process, leading to the writing of documents. Purchasing, the personnel department and vocational training were also included in the quality system. Audits were very helpful in improving the quality system. One specificity of this technique is that it is not standardized; therefore, apart from quality management aspects, several technical points needed validation. An inventory of potentially influential factors was carried out. To estimate their real effect on the yield of dicentrics, a Plackett-Burman experimental design was conducted. The effect of seven parameters was tested: the BUdR (bromodeoxyuridine), PHA (phytohemagglutinin) and colcemid concentrations, the culture duration, the incubator temperature, the blood volume and the medium volume. The chosen values were calculated according to the uncertainties in the way they were measured, i.e. pipettes, thermometers, test tubes. None of the factors had a significant impact on the yield of dicentrics. Therefore the uncertainty linked to their use was

  14. A Critical Evaluation and Framework of Business Process Improvement Methods

    NARCIS (Netherlands)

    Vanwersch, R.J.B.; Shahzad, K.; Vanderfeesten, I.; Vanhaecht, K.; Grefen, P.; Pintelon, L.M.; Mendling, J.; van Merode, G.G.; Reijers, H.A.

    2016-01-01

    The redesign of business processes has a huge potential in terms of reducing costs and throughput times, as well as improving customer satisfaction. Despite rapid developments in the business process management discipline during the last decade, a comprehensive overview of the options to

  15. An overview of medical image processing methods | Maras | African ...

    African Journals Online (AJOL)

    Various standards have been established for these instruments and end products, which are being used more frequently every day. Personal computers (PCs) have reached a significant level in image processing, carrying out analysis and visualization processes on doctors' desktops that previously required expensive hardware.

  16. Proposed methods for treating high-level pyrochemical process wastes

    International Nuclear Information System (INIS)

    Johnson, T.R.; Miller, W.E.; Steunenberg, R.K.

    1985-01-01

    This survey illustrates the large variety and number of possible techniques available for treating pyrochemical wastes; there are undoubtedly other process types and many variations. The choice of a suitable process is complicated by the uncertainty as to what will be an acceptable waste form in the future for both TRU and non-TRU wastes

  17. A critical evaluation and framework of business process improvement methods

    NARCIS (Netherlands)

    Vanwersch, R.J.B.; Shahzad, K.; Vanderfeesten, I.T.P.; Vanhaecht, K.; Grefen, P.W.P.J.; Pintelon, L.M.; Mendling, J.; Merode, van G.G.; Reijers, H.A.

    2016-01-01

    The redesign of business processes has a huge potential in terms of reducing costs and throughput times, as well as improving customer satisfaction. Despite rapid developments in the business process management discipline during the last decade, a comprehensive overview of the options to

  18. On Intelligent Design and Planning Method of Process Route Based on Gun Breech Machining Process

    Science.gov (United States)

    Hongzhi, Zhao; Jian, Zhang

    2018-03-01

    The paper presents an approach to the intelligent design and planning of process routes, based on the gun breech machining process, to address several problems such as the complexity of gun breech machining, tedious route design and the long lead times of traditional, hard-to-manage process routes. Based on the gun breech machining process, an intelligent process route design and planning system was developed using DEST and VC++. The system includes two functional modules: intelligent process route design and process route planning. The intelligent process route design module, through analysis of the gun breech machining process, summarizes breech process knowledge in order to complete the design of the knowledge base and inference engine, and then outputs the gun breech process route intelligently. On the basis of the intelligent route design module, the final process route is created, edited and managed in the process route planning module.

  19. Chemical analysis of cyanide in cyanidation process: review of methods

    International Nuclear Information System (INIS)

    Nova-Alonso, F.; Elorza-Rodriguez, E.; Uribe-Salas, A.; Perez-Garibay, R.

    2007-01-01

    In cyanidation, the worldwide method for precious metal recovery, the chemical analysis of cyanide is a very important but complex operation. Cyanide can be present in the form of different species, each with a different stability, toxicity, analysis method and elimination technique. For cyanide analysis there is a wide selection of analytical methods, but most of them present difficulties because of interference from species present in the solution. This paper presents the different available methods for the chemical analysis of cyanide: titration, specific electrode and distillation, giving special emphasis to the interference problem, with the aim of helping in the interpretation of the results. (Author)

  20. The Study of Image Processing Method for AIDS PA Test

    International Nuclear Information System (INIS)

    Zhang, H J; Wang, Q G

    2006-01-01

    At present, the main test technique for AIDS in China is the PA test. Because the judgement of the PA test image still depends on the operator, the error rate is high. To resolve this problem, we present a new image processing technique, which first processes many samples to obtain data including the coordinates of the centre and the ranges of the image classes; the image is then segmented with these data, and finally the result is exported after the data have been evaluated. This technique is simple and accurate, and it also turns out to be suitable for the processing and analysis of the PA test images of other infectious diseases

  1. Interactive methods to involve users into workspace design process

    DEFF Research Database (Denmark)

    Souza da Conceição, Carolina; Broberg, Ole; Banke, Palle

    2013-01-01

    This paper addresses the question of whether the use of a combination of interactive methods involving workers can lead to a useful input to the (re)design of their workspace. The workbook and the layout design game methods were tested, and a comparison between their use and the ergonomic analysi...

  2. Methods and compositions for controlling gene expression by RNA processing

    Science.gov (United States)

    Doudna, Jennifer A.; Qi, Lei S.; Haurwitz, Rachel E.; Arkin, Adam P.

    2017-08-29

    The present disclosure provides nucleic acids encoding an RNA recognition sequence positioned proximal to an insertion site for the insertion of a sequence of interest; and host cells genetically modified with the nucleic acids. The present disclosure also provides methods of modifying the activity of a target RNA, and kits and compositions for carrying out the methods.

  3. Device and methods for processing biomass to biogas

    Energy Technology Data Exchange (ETDEWEB)

    1981-04-16

    In the title process, which gives high yields and can be used profitably in agriculture, biomass (e.g. manure, slaughterhouse wastes) is acidified prior to anaerobic gasification in a neutral alkaline medium.

  4. Using Statistical Process Control Methods to Classify Pilot Mental Workloads

    National Research Council Canada - National Science Library

    Kudo, Terence

    2001-01-01

    .... These include cardiac, ocular, respiratory, and brain activity measures. The focus of this effort is to apply statistical process control methodology on different psychophysiological features in an attempt to classify pilot mental workload...

  5. The Use of Statistical Methods in Dimensional Process Control

    National Research Council Canada - National Science Library

    Krajcsik, Stephen

    1985-01-01

    ... erection. To achieve this high degree of unit accuracy, we have begun a pilot dimensional control program that has set the guidelines for systematically monitoring each stage of the production process prior to erection...

  6. Characterization of depleted uranium oxides fabricated using different processing methods

    International Nuclear Information System (INIS)

    Hastings, E.P.; Lewis, C.; FitzPatrick, J.; Rademacher, D.; Tandon, L.

    2008-01-01

    Identifying both the physical and chemical characteristics of Special Nuclear Material (SNM) production processes is the cornerstone of nuclear forensics. Typically, processing markers are based on measuring an interdicted sample's bulk chemical properties, such as the elemental or isotopic composition, or on the chemical and physical morphology of only a few particles. It is therefore imperative that known SNM processes be fully characterized from the bulk to the trace level for each particle size range. This report outlines a series of particle size measurements and fractionation techniques that can be applied to bulk SNM powders, categorizing both chemical and physical properties in discrete particle size fractions. This is demonstrated by characterizing the process signatures of a series of different depleted uranium oxides prepared at increasing firing temperatures (350-1100 deg C). The results demonstrate how each oxide's material density, particle size distribution and morphology vary. (author)

  7. Method For Processing Spent (Trn,Zr)N Fuel

    Science.gov (United States)

    Miller, William E.; Richmann, Michael K.

    2004-07-27

    A new process for recycling spent nuclear fuels, in particular, mixed nitrides of transuranic elements and zirconium. The process consists of two electrorefiner cells in series configuration. A transuranic element such as plutonium is reduced at the cathode in the first cell, zirconium at the cathode in the second cell, and nitrogen-15 is released and captured for reuse to make transuranic and zirconium nitrides.

  8. Method for Processing Liver Spheroids Using an Automatic Tissue Processor

    Science.gov (United States)

    2016-05-01

    ...alcohol dehydration and hot liquid wax infiltration. After the water in the tissue is replaced with wax and cooled, it then becomes possible to cut... effective for processing and preparing microscopy slides of liver spheroids. The general process involved formalin fixation, dehydration in a... DPBS; formalin (37% neutral buffered formaldehyde); a series of alcohol solutions: 70, 80, 95, and 100% ethanol in water; xylene

  9. Method and apparatus for monitoring plasma processing operations

    Science.gov (United States)

    Smith, Jr., Michael Lane; Ward, Pamela Denise Peardon; Stevenson, Joel O'Don

    2002-01-01

    The invention generally relates to various aspects of a plasma process, and more specifically the monitoring of such plasma processes. One aspect relates in at least some manner to calibrating or initializing a plasma monitoring assembly. This type of calibration may be used to address wavelength shifts, intensity shifts, or both associated with optical emissions data obtained on a plasma process. A calibration light may be directed at a window through which optical emissions data is being obtained to determine the effect, if any, that the inner surface of the window is having on the optical emissions data being obtained therethrough, the operation of the optical emissions data gathering device, or both. Another aspect relates in at least some manner to various types of evaluations which may be undertaken of a plasma process which was run, and more typically one which is currently being run, within the processing chamber. Plasma health evaluations and process identification through optical emissions analysis are included in this aspect. Yet another aspect associated with the present invention relates in at least some manner to the endpoint of a plasma process (e.g., plasma recipe, plasma clean, conditioning wafer operation) or discrete/discernible portion thereof (e.g., a plasma step of a multiple step plasma recipe). Another aspect associated with the present invention relates to how one or more of the above-noted aspects may be implemented into a semiconductor fabrication facility, such as the distribution of wafers to a wafer production system. A final aspect of the present invention relates to a network a plurality of plasma monitoring systems, including with remote capabilities (i.e., outside of the clean room).

  10. Production method of hydrogen jet plasma process in hydro machinery

    International Nuclear Information System (INIS)

    Amini, F.

    2007-01-01

    The purpose of the present paper is to describe the process of plasma formation in hydro machinery when a hydro turbine operates at various conditions and during load rejection. By investigating the power, shock pressure and impact effects of hydro machinery, it is argued that energy and hydrogen are generated by a plasma process. Investigations on several turbines at various hydro power plants suggest that a cold fusion process in hydro machinery generates hydrogen. The hypothesis concerning the participation of alkali metals in river water and of the atomic nuclei of the runner blade material in the formation of hydrogen is considered. It is assumed that hydrogen, deuterium, helium and tritium atoms (based on the theories of Dr. Mizuno and Dr. Kanarev) are formed and diffuse into cavitation bubbles. The plasma is generated during the collapse of the bubble; thus, the quantity of burnt hydrogen determines the volume of hydrogen generated and the impact force caused by the hydrogen explosion (noise). There are five main effects that can indicate the hydrogen and plasma process: (1) the turbine power effect, (2) high shock pressure, (3) cracks on turbine parts, (4) impact effects and (5) the lift of rotating parts. The frequency of the excitation lies in a range from 0.786 to 1.095 Hz. In the future, it may be possible to design hydro turbines based on the plasma process that generates hydrogen, or there may be turbines that rotate with a mixture of hydrogen explosion and water energy

  11. The effect of processing and preservation methods on the oxalate ...

    African Journals Online (AJOL)

    Dr. J. T. Ekanem

    ...vegetables and consequently the associated food safety problems. Keywords: ... vegetables prepared in two slightly different ways ... Table 2: Oxalate levels of selected leafy vegetables as a function of cooking method and the interplay of freezing.

  12. A simplified method for processing dynamic images of gastric antrum

    DEFF Research Database (Denmark)

    Madsen, J L; Graff, J; Fugisang, S

    2000-01-01

    versus geometric centre curve. In all subjects, our technique gave unequivocal frequencies of antral contractions at each time point. Statistical analysis did not reveal any intraindividual variation in this frequency during gastric emptying. We believe that the simplified scintigraphic method...

  13. Method to predict process signals to learn for SVM

    International Nuclear Information System (INIS)

    Minowa, Hirotsugu; Gofuku, Akio

    2013-01-01

    Studies of diagnostic systems using machine learning are advancing in order to reduce plant incidents, because an accident causes large human, economic and social losses. A known problem is that the classification performance and the generalization performance of a diagnostic machine are mutually exclusive. However, a multi-agent diagnostic system makes it possible to use diagnostic machines specialized in either performance, since multiple diagnostic machines can be used. We propose a method to select optimized variables in order to improve classification performance. The method can also be used for supervised learning machines other than the Support Vector Machine. This paper reports our method and the results of an evaluation experiment in which the method was applied to 40% power output data of Monju. (author)

  14. Methods for Process Evaluation of Work Environment Interventions

    DEFF Research Database (Denmark)

    Fredslund, Hanne; Strandgaard Pedersen, Jesper

    2004-01-01

    In recent years, intervention studies have become increasingly popular within occupational health psychology. The vast majority of such studies have focused on interventions themselves and their effects on the working environment and employee health and well-being. Few studies have focused on how the context and processes surrounding the intervention may have influenced the outcomes (Hurrell and Murphy, 1996). Process evaluation addresses '[...] or management perceptions and actions in implementing any intervention and their influence on the overall result of the intervention' (Nytrø, Saksvik, Mikkelsen, Bohle, and Quinlan, 2000). Process evaluation can be used to a) provide feedback for improving interventions, b) interpret the outcomes of effect [...]. Thus, there is still relatively little published research that provides us with information on how to evaluate such strategies and processes (Saksvik, Nytrø, Dahl-Jørgensen, and Mikkelsen, 2002 [...]

  15. Rotor assembly and method for automatically processing liquids

    Science.gov (United States)

    Burtis, C.A.; Johnson, W.F.; Walker, W.A.

    1992-12-22

    A rotor assembly is described for performing a relatively large number of processing steps upon a sample, such as a whole blood sample, and a diluent, such as water. It includes a rotor body for rotation about an axis and includes a network of chambers within which various processing steps are performed upon the sample and diluent and passageways through which the sample and diluent are transferred. A transfer mechanism is movable through the rotor body by the influence of a magnetic field generated adjacent the transfer mechanism and movable along the rotor body, and the assembly utilizes centrifugal force, a transfer of momentum and capillary action to perform any of a number of processing steps such as separation, aliquoting, transference, washing, reagent addition and mixing of the sample and diluent within the rotor body. The rotor body is particularly suitable for automatic immunoassay analyses. 34 figs.

  16. Processing and discarding method for contaminated concrete wastes

    International Nuclear Information System (INIS)

    Yamamoto, Kazuo; Konishi, Masao; Matsuda, Atsuo; Iwamoto, Yoshiaki; Yoshikane, Toru; Koie, Toshio; Nakajima, Yoshiro

    1998-01-01

    Contaminated concrete wastes are crushed into granular concrete wastes having a continuous grain size distribution. They are placed in a treatment vessel, where they can harden owing to the presence of water-hardenable material in the granular concrete wastes. When groundwater intrudes into the treatment vessel filled with the granular concrete wastes during long-term storage, the groundwater reacts with the water-hardenable material and contributes to solidification. Accordingly, leaching of contaminated materials due to the intrusion of groundwater can be suppressed. Since the concrete wastes have a continuous grain size distribution, coarse grains can be used as coarse aggregates, medium grains as fine aggregates and fine grains as a solidifying material. Accordingly, the amount of waste after processing can be remarkably reduced, with no supply of solidifying material from outside. (T.M.)

  17. Predictive Active Set Selection Methods for Gaussian Processes

    DEFF Research Database (Denmark)

    Henao, Ricardo; Winther, Ole

    2012-01-01

    We propose an active set selection framework for Gaussian process classification for cases when the dataset is large enough to render its inference prohibitive. Our scheme consists of a two-step alternating procedure of active set update rules and hyperparameter optimization based upon marginal [...] high impact on the classifier decision process while removing those that are less relevant. We introduce two active set rules based on different criteria: the first one prefers a model with interpretable active set parameters, whereas the second puts computational complexity first, thus a model with active set parameters that directly control its complexity. We also provide both theoretical and empirical support for our active set selection strategy being a good approximation of a full Gaussian process classifier. Our extensive experiments show that our approach can compete with state

  18. Processing method for drained water containing ethanol amine

    International Nuclear Information System (INIS)

    Wakuta, Kuniharu; Ogawa, Naoki; Sagawa, Hiroshi; Kamiyoshi, Hideki; Fukunaga, Kazuo; Iwamoto, Ken; Miki, Tsuyoshi; Hirata, Toshio

    1998-01-01

    Drain water containing ethanolamine is treated with microorganisms, such as hydrazine-resistant denitrifying bacteria, in a biodegradation vessel (A) in the presence of nitrite and/or nitrate ions under anaerobic conditions; it is then treated with microorganisms such as nitrifying bacteria in a separate biological oxidation vessel (B) under aerobic conditions to generate the coexisting nitrite and/or nitrate ions, and is returned to the biodegradation vessel (A). Further, in a hydrazine removal step, the water is exposed to air or an oxidant is added, optionally together with a copper compound such as copper sulfate as a catalyst. (T.M.)

  19. Iterative Methods for MPC on Graphical Processing Units

    DEFF Research Database (Denmark)

    Gade-Nielsen, Nicolai Fog; Jørgensen, John Bagterp; Dammann, Bernd

    2012-01-01

    The high floating point performance and memory bandwidth of Graphical Processing Units (GPUs) make them ideal for a large number of computations which often arise in scientific computing, such as matrix operations. GPUs achieve this performance by utilizing massive parallelism, which requires [...] so as to avoid the use of dense matrices, which may be too large for the limited memory capacity of current graphics cards.

  20. Additional methods for the processing of solid radioactive wastes

    International Nuclear Information System (INIS)

    Tittlova, E.; Svrcek, A.; Hazucha, E. et al.

    1989-01-01

    An account is given of the work performed within the A 01-159-812/05 State Project concerned with the technology of and technical means for the processing of solid wastes arising during the operation of nuclear power plants. This included the development of the incineration equipment, development of the process of air filter disposal and equipment therefor, manufacture of a saw for fragmentation of wood, manufacture of a sorting box, ultimate solution of the problem of waste sorting, and use of high-pressure compression technology. (author). 1 tab., 9 refs

  1. [Monitoring method of extraction process for Schisandrae Chinensis Fructus based on near infrared spectroscopy and multivariate statistical process control].

    Science.gov (United States)

    Xu, Min; Zhang, Lei; Yue, Hong-Shui; Pang, Hong-Wei; Ye, Zheng-Liang; Ding, Li

    2017-10-01

    To establish an on-line monitoring method for the extraction process of Schisandrae Chinensis Fructus, a constituent medicinal material of Yiqi Fumai lyophilized injection, near infrared spectroscopy was combined with multivariate data analysis technology. The multivariate statistical process control (MSPC) model was established based on 5 normal production batches, and 2 test batches were monitored by PC scores, DModX and Hotelling T2 control charts. The results showed that the MSPC model had good monitoring ability for the extraction process. Application of the MSPC model to the actual production process could effectively achieve on-line monitoring of the Schisandrae Chinensis Fructus extraction process and reflect changes in material properties in the production process in real time. The established process monitoring method could provide a reference for the application of process analytical technology in the process quality control of traditional Chinese medicine injections. Copyright© by the Chinese Pharmaceutical Association.
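
    The record names PC scores, DModX and Hotelling T2 charts but gives no spectra or model details; the sketch below therefore only illustrates a generic Hotelling T2 chart on PCA scores of simulated batch data (using scikit-learn for the PCA). The number of components, the confidence level and all the data are assumptions.

```python
import numpy as np
from scipy.stats import f as f_dist
from sklearn.decomposition import PCA

def hotelling_t2_chart(reference, new_obs, n_components=2, confidence=0.99):
    """Fit a PCA model on in-control reference data and compute Hotelling T2 for new
    observations, plus the F-distribution based control limit commonly used in MSPC."""
    pca = PCA(n_components=n_components).fit(reference)
    scores = pca.transform(new_obs)
    t2 = np.sum(scores ** 2 / pca.explained_variance_, axis=1)
    n, k = reference.shape[0], n_components
    limit = k * (n - 1) * (n + 1) / (n * (n - k)) * f_dist.ppf(confidence, k, n - k)
    return t2, limit

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    loadings = rng.normal(size=(2, 20))            # 20 "spectral" variables, 2 latent factors
    simulate = lambda latent: latent @ loadings + 0.1 * rng.normal(size=(latent.shape[0], 20))
    reference = simulate(rng.normal(size=(50, 2)))  # toy in-control data (far more runs than 5 real batches)
    test_latent = rng.normal(size=(2, 2))
    test_latent[-1] = [4.0, -4.0]                  # one deliberately abnormal test batch
    t2, limit = hotelling_t2_chart(reference, simulate(test_latent))
    for i, value in enumerate(t2, start=1):
        status = "out of control" if value > limit else "in control"
        print(f"test batch {i}: T2 = {value:6.2f}  (limit {limit:.2f})  -> {status}")
```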

  2. THE BASE OF THE METHODICAL DESIGN AND IMPLEMENTATION OF ENGINEERING EDUCATION PROCESS

    Directory of Open Access Journals (Sweden)

    Renata Lis

    2012-12-01

    Full Text Available The article is devoted to the methodology of implementing the European and national qualifications frameworks in the academic process. It consists of a method for designing degree programmes and classes and a method for the teaching process.

  3. A Method of Measuring Inequality within a Selection Process

    Science.gov (United States)

    Bulle, Nathalie

    2016-01-01

    To explain the inequalities in access to a discrete good G across two populations, or across time in a single national context, it is necessary to distinguish, for each population or period of time, the effect of the diffusion of G from that of unequal outcomes of underlying micro-social processes. The inequality of outcomes of these micro-social…

  4. Methods and tools for sustainable chemical process design

    DEFF Research Database (Denmark)

    Loureiro da Costa Lira Gargalo, Carina; Chairakwongsa, Siwanat; Quaglia, Alberto

    2015-01-01

    As the pressure on chemical and biochemical processes to achieve a more sustainable performance increases, the need to define a systematic and holistic way to accomplish this is becoming more urgent. In this chapter, a multilevel computer-aided framework for systematic design of more sustainable...

  5. Spatio-temporal point process filtering methods with an application

    Czech Academy of Sciences Publication Activity Database

    Frcalová, B.; Beneš, V.; Klement, Daniel

    2010-01-01

    Roč. 21, 3-4 (2010), s. 240-252 ISSN 1180-4009 R&D Projects: GA AV ČR(CZ) IAA101120604 Institutional research plan: CEZ:AV0Z50110509 Keywords : cox point process * filtering * spatio-temporal modelling * spike Subject RIV: BA - General Mathematics Impact factor: 0.750, year: 2010

  6. High-performance method of morphological medical image processing

    Directory of Open Access Journals (Sweden)

    Ryabykh M. S.

    2016-07-01

    Full Text Available The article presents an implementation of the grayscale morphology vHGW algorithm for selecting borders in medical images. Image processing is executed using OpenMP and NVIDIA CUDA technology for images with different resolutions and different sizes of the structuring element.
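
    The vHGW (van Herk/Gil-Werman) algorithm is a fast way of computing running maxima and minima for large structuring elements; the sketch below does not reimplement it, but shows the underlying operation it accelerates, a grayscale morphological gradient (dilation minus erosion) used for border selection, via SciPy. The synthetic image and the structuring-element size are assumptions.

```python
import numpy as np
from scipy.ndimage import grey_dilation, grey_erosion

def morphological_gradient(image, size=5):
    """Grayscale morphological gradient: dilation minus erosion with a flat square
    structuring element. Large values mark intensity borders."""
    image = image.astype(np.int16)
    return grey_dilation(image, size=(size, size)) - grey_erosion(image, size=(size, size))

if __name__ == "__main__":
    # Synthetic "medical" image: a bright disc (e.g. an organ cross-section) on a dark background.
    y, x = np.mgrid[0:128, 0:128]
    image = np.where((x - 64) ** 2 + (y - 64) ** 2 < 40 ** 2, 180, 40).astype(np.uint8)
    grad = morphological_gradient(image)
    border = grad > 50
    print("Border pixels found:", int(border.sum()))
```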

  7. Modelling Of Flotation Processes By Classical Mathematical Methods - A Review

    Science.gov (United States)

    Jovanović, Ivana; Miljanović, Igor

    2015-12-01

    Flotation process modelling is not a simple task, mostly because of the complexity of the process, i.e. the presence of a large number of variables that (to a lesser or greater extent) affect the final outcome of the separation of mineral particles based on the differences in their surface properties. Attempts to develop a quantitative predictive model that would fully describe the operation of an industrial flotation plant started in the middle of the past century and continue to this day. This paper gives a review of published research directed toward the development of flotation models based on classical mathematical approaches. The description and systematization of classical flotation models were performed according to the available references, with emphasis placed exclusively on the modelling of the flotation process itself, regardless of the model's application in a particular control system. In accordance with contemporary considerations, the models were classified as empirical, probabilistic, kinetic and population balance types. Each model type is presented through the aspects of flotation modelling at the macro and micro process levels.
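
    As one concrete member of the kinetic model family mentioned in the review, the sketch below fits the classical first-order flotation model R(t) = R_inf (1 - exp(-k t)) to synthetic cumulative recovery data; the data points and starting values are assumptions for illustration only.

```python
import numpy as np
from scipy.optimize import curve_fit

def first_order_recovery(t, r_inf, k):
    """Classical first-order flotation kinetics: cumulative recovery versus time."""
    return r_inf * (1.0 - np.exp(-k * t))

if __name__ == "__main__":
    # Synthetic cumulative recovery data (%) at increasing flotation times (min).
    t = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 12.0])
    recovery = np.array([22.0, 38.0, 58.0, 75.0, 85.0, 87.0])
    (r_inf, k), _ = curve_fit(first_order_recovery, t, recovery, p0=(90.0, 0.5))
    print(f"Fitted ultimate recovery R_inf = {r_inf:.1f} %, rate constant k = {k:.2f} 1/min")
```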

  8. On-line sample processing methods in flow analysis

    DEFF Research Database (Denmark)

    Miró, Manuel; Hansen, Elo Harald

    2008-01-01

    In this chapter, the state of the art of flow injection and related approaches thereof for automation and miniaturization of sample processing regardless of the aggregate state of the sample medium is overviewed. The potential of the various generation of flow injection for implementation of in...

  9. SILVER RECYCLING FROM PHOTO-PROCESSING WASTE USING ELECTRODEPOSITION METHOD

    Directory of Open Access Journals (Sweden)

    Mochammad Feri Hadiyanto

    2010-06-01

    Full Text Available Silver electrodeposition from photo-processing waste, with and without the addition of 1.0 M KCN, has been studied for silver recycling. Photo-processing waste containing silver in the form of [Ag(S2O3)2]3- was electrolysed at constant potential, and the faradaic efficiency was determined at various electrolysis times. Electrolysis of 100 mL of photo-processing waste without addition of 1.0 M KCN was carried out at a constant potential of 1.20 V, while electrolysis of 100 mL of photo-processing waste with the addition of 10 mL of 1.0 M KCN was done at 1.30 V. The results showed that silver electrodeposition from photo-processing waste with the addition of 1.0 M KCN was more favourable, with faradaic efficiencies of 93.16, 87.02, 74.74 and 78.35% for 30, 60, 90 and 120 minutes of electrolysis, respectively.   Keywords: Silver extraction, electrodeposition, photo-processing waste
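
    The faradaic efficiency reported above is conventionally the ratio of the silver mass actually deposited to the mass predicted by Faraday's law. The sketch below shows that calculation with hypothetical current, time and deposit mass; the paper does not report these inputs, so the numbers are illustrative only.

```python
M_AG = 107.87        # molar mass of silver, g/mol
FARADAY = 96485.0    # Faraday constant, C/mol
N_ELECTRONS = 1      # Ag+ + e- -> Ag

def faradaic_efficiency(mass_deposited_g, current_a, time_s):
    """Ratio of deposited mass to the theoretical mass given by Faraday's law (in %)."""
    theoretical = M_AG * current_a * time_s / (N_ELECTRONS * FARADAY)
    return 100.0 * mass_deposited_g / theoretical

if __name__ == "__main__":
    # Hypothetical run: 0.20 A for 30 minutes depositing 0.375 g of silver.
    print(f"Faradaic efficiency: {faradaic_efficiency(0.375, 0.20, 30 * 60):.1f} %")
```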

  10. Signal Processing Methods for Flood Early Warning Systems

    NARCIS (Netherlands)

    Pyayt, A.L.; Mokhov, I.I.; Kozionov, A.P.; Kusherbaeva, V.T.; Krzhizhanovskaya, V.V.; Broekhuijsen, B.J.; Meijer, R.J.; Hinkelmann, R.; Nasermoaddeli, M.H.; Liong, S.Y.; Savic, D.; Fröhle, P.; Daemrich, K.F.

    2012-01-01

    We present a data-driven approach for the detection of anomalies in earthen dam (dike) behaviour that can indicate the onset of flood defence structure failure. This approach is implemented in the UrbanFlood early warning system's Artificial Intelligence component that processes dike measurements in

  11. Probabilistic Methods for Processing High-Throughput Sequencing Signals

    DEFF Research Database (Denmark)

    Sørensen, Lasse Maretty

    High-throughput sequencing has the potential to answer many of the big questions in biology and medicine. It can be used to determine the ancestry of species, to chart complex ecosystems and to understand and diagnose disease. However, going from raw sequencing data to biological or medical insight [...] for reconstructing transcript sequences from RNA sequencing data. The method is based on a novel sparse prior distribution over transcript abundances and is markedly more accurate than existing approaches. The second chapter describes a new method for calling genotypes from a fixed set of candidate variants. The method queries the reads using a graph representation of the variants and hereby mitigates the reference bias that characterises standard genotyping methods. In the last chapter, we apply this method to call the genotypes of 50 deeply sequenced parent-offspring trios from the GenomeDenmark project. By estimating the genotypes on a set of candidate variants obtained from both a standard mapping-based approach and de novo assemblies, we are able to find considerably more structural variation than previous studies

  12. A review of experiment data processing method for uranium mining and metallurgy in BRICEM

    International Nuclear Information System (INIS)

    Ye Guoqiang; Lu Kehong; Wang Congying

    1997-01-01

    The authors investigate the methods of experimental data processing used at the Beijing Research Institute of Chemical Engineering and Metallurgy (BRICEM). It turns out that error analysis is used to process experimental data, single-factor variation and orthogonal test design are adopted for arranging tests, and regression analysis and mathematical process simulation are applied to build mathematical models of uranium mining and metallurgy processes. The methods mentioned above lay a foundation for the use of mathematical statistics in our field

  13. Method and apparatus for semi-solid material processing

    Science.gov (United States)

    Han, Qingyou [Knoxville, TN; Jian, Xiaogang [Knoxville, TN; Xu, Hanbing [Knoxville, TN; Meek, Thomas T [Knoxville, TN

    2009-02-24

    A method of forming a material includes the steps of: vibrating a molten material at an ultrasonic frequency while cooling the material to a semi-solid state to form non-dendritic grains therein; forming the semi-solid material into a desired shape; and cooling the material to a solid state. The method makes semi-solid castings directly from molten materials (usually a metal), produces grain sizes usually smaller than 50 µm, and can be easily retrofitted into existing conventional forming machines.

  14. The prioritization and categorization method (PCM) process evaluation at Ericsson : a case study

    NARCIS (Netherlands)

    Ohlsson, Jens; Han, Shengnan; Bouwman, W.A.G.A.

    2017-01-01

    Purpose: The purpose of this paper is to demonstrate and evaluate the prioritization and categorization method (PCM), which facilitates the active participation of process stakeholders (managers, owners, customers) in process assessments. Stakeholders evaluate processes in terms of effectiveness,

  15. Process evaluation of the method sCOOLsport.

    NARCIS (Netherlands)

    Ooms, L.; Veenhof, C.

    2012-01-01

    Introduction: Childhood obesity has increased rapidly worldwide. Primary schools could play a major role in prevention of childhood obesity. sCOOLsport is a method that primary schools can use to implement health policy and interventions promoting a healthy lifestyle. sCOOLsport focuses on four main

  16. Analysis of some methods for reduced rank Gaussian process regression

    DEFF Research Database (Denmark)

    Quinonero-Candela, J.; Rasmussen, Carl Edward

    2005-01-01

While there is strong motivation for using Gaussian Processes (GPs) due to their excellent performance in regression and classification problems, their computational complexity makes them impractical when the size of the training set exceeds a few thousand cases. This has motivated the recent...... proliferation of a number of cost-effective approximations to GPs, both for classification and for regression. In this paper we analyze one popular approximation to GPs for regression: the reduced rank approximation. While generally GPs are equivalent to infinite linear models, we show that Reduced Rank...... Gaussian Processes (RRGPs) are equivalent to finite sparse linear models. We also introduce the concept of degenerate GPs and show that they correspond to inappropriate priors. We show how to modify the RRGP to prevent it from being degenerate at test time. Training RRGPs consists both in learning...
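The abstract's key point, that reduced rank GPs are equivalent to finite sparse linear models, can be illustrated with a minimal sketch: kernel functions centred on a small set of inducing inputs are used as basis functions for Bayesian linear regression (a subset-of-regressors style approximation). The kernel, fixed hyperparameters, inducing-point placement and data below are assumptions for illustration, not the paper's formulation.

```python
import numpy as np

def rbf(a, b, lengthscale=1.0):
    # Squared-exponential kernel matrix between 1-D input arrays a and b.
    d2 = (a[:, None] - b[None, :]) ** 2
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def rrgp_fit_predict(x, y, x_star, m=20, noise=0.1):
    """Reduced-rank GP regression viewed as a finite linear model: kernel
    functions centred on m inducing inputs act as basis functions and the
    weights get a Bayesian linear-regression treatment (subset-of-regressors
    style). Hyperparameters are fixed here; no learning is performed."""
    u = np.linspace(x.min(), x.max(), m)          # inducing inputs
    Phi = rbf(x, u)                               # n x m design matrix
    A = Phi.T @ Phi + noise ** 2 * rbf(u, u)      # scaled posterior precision
    w = np.linalg.solve(A, Phi.T @ y)             # posterior mean weights
    return rbf(x_star, u) @ w                     # predictive mean at x_star

rng = np.random.default_rng(1)
x = rng.uniform(-3, 3, 200)
y = np.sin(x) + 0.1 * rng.standard_normal(200)
x_star = np.linspace(-3, 3, 5)
print(np.round(rrgp_fit_predict(x, y, x_star), 2))   # roughly sin(x_star)
```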

  17. Influence of heat processing methods on the nutrient composition ...

    African Journals Online (AJOL)

No significant difference (p > 0.05) was obtained in the saponification number of the three samples analyzed, and values ranged from 161.3 ± 2.92 in RS to 163.0 ± 2.60 in FS. Heat processing (boiling and frying) generally decreased significantly (p < 0.05) the crude protein, crude fat, caloric value, Fe, Zn, vitamins A and C as ...

  18. Methods and Conditions for Achieving Continuous Improvement of Processes

    OpenAIRE

    Florica BADEA; Catalina RADU; Ana-Maria GRIGORE

    2010-01-01

In the early twentieth century, the Taylor model spectacularly improved the efficiency of production processes. It allowed high productivity to be obtained from low-skilled workers, employed in large numbers in production. Currently this model is questioned by experts and has been replaced by the concept of "continuous improvement". The first signs of change date from the '80s, with the appearance of quality circles and groups of operators working on quality issues, principles whi...

  19. Profile and Instrumentation Driven Methods for Embedded Signal Processing

    Science.gov (United States)

    2015-01-01

…agnostic conventions for describing and organizing tests, and uses shell scripts and programs written in high-level languages to run and analyze these tests… successive over-relaxation (SOR) and lower-triangular (TRI) transformation kernels were among those studied; a group of kernels running concurrently was defined as a benchmark (e.g., the 5-kernel and 6-kernel sets)…

  20. Composition and method for coke retardant during hydrocarbon processing

    International Nuclear Information System (INIS)

    Reid, D.K.

    1988-01-01

A process is described for inhibiting the formation and deposition of filamentous coke on metallic surfaces in contact with a hydrocarbon at a temperature of 600-1300 °F, which comprises adding to the hydrocarbon a sufficient amount of a boron compound selected from the group of boron oxide compounds, boric acid and metal borides, with the proviso that when boric acid is used, it is substantially free of water

  1. Wavelet based methods for improved wind profiler signal processing

    Directory of Open Access Journals (Sweden)

    V. Lehmann

    2001-08-01

Full Text Available In this paper, we apply wavelet thresholding for automatically removing ground and intermittent clutter (airplane echoes) from wind profiler radar data. Using the concept of discrete multi-resolution analysis and non-parametric estimation theory, we develop wavelet domain thresholding rules, which allow us to identify the coefficients relevant for clutter and to suppress them in order to obtain filtered reconstructions. Key words: Meteorology and atmospheric dynamics (instruments and techniques) – Radio science (remote sensing; signal processing)
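A minimal sketch of wavelet-domain thresholding in the spirit described here, assuming the PyWavelets package is available: decompose the signal, soft-threshold the detail coefficients with a universal threshold, and reconstruct. The wavelet choice, decomposition level and threshold rule are generic denoising defaults, not the profiler-specific clutter rules derived in the paper.

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_denoise(signal, wavelet="db4", level=5):
    """Soft-threshold the detail coefficients of a discrete wavelet
    decomposition (universal threshold estimated from the finest scale)
    and reconstruct the signal. Generic denoising defaults, not the
    profiler-specific clutter rules of the paper."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745          # noise scale estimate
    thresh = sigma * np.sqrt(2.0 * np.log(len(signal)))     # universal threshold
    coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(signal)]

t = np.linspace(0.0, 1.0, 1024)
clean = np.sin(2 * np.pi * 5 * t)
noisy = clean + 0.4 * np.random.randn(1024)
denoised = wavelet_denoise(noisy)
print(float(np.std(noisy - clean)), float(np.std(denoised - clean)))  # residual drops
```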

  2. Performance of wet process method alternatives : terminal or continuous blend

    OpenAIRE

    Fontes, Liseane P. T. L.; Pereira, Paulo A. A.; Pais, Jorge C.; Trichês, Glicério

    2006-01-01

This study presents the results of research into asphalt rubber mixtures produced with asphalt rubber binder obtained from two different processes: (i) terminal blend (produced in a refinery); (ii) continuous blend (produced in the laboratory). The experiment included the evaluation of the fatigue and permanent deformation resistance of two gap-graded mixtures (Caltrans ARHM-GG; ADOT AR-AC) and a dense-gradation Asphalt Institute (AI) type IV mix. Two asphalt rubbers from terminal blend...

  3. The "Process" of Process Use: Methods for Longitudinal Assessment in a Multisite Evaluation

    Science.gov (United States)

    Shaw, Jessica; Campbell, Rebecca

    2014-01-01

    Process use refers to the ways in which stakeholders and/or evaluands change as a function of participating in evaluation activities. Although the concept of process use has been well discussed in the literature, exploration of methodological strategies for the measurement and assessment of process use has been limited. Typically, empirical…

  4. New Python-based methods for data processing

    International Nuclear Information System (INIS)

    Sauter, Nicholas K.; Hattne, Johan; Grosse-Kunstleve, Ralf W.; Echols, Nathaniel

    2013-01-01

The Computational Crystallography Toolbox (cctbx) is a flexible software platform that has been used to develop high-throughput crystal-screening tools for both synchrotron sources and X-ray free-electron lasers. Plans for data-processing and visualization applications are discussed, and the benefits and limitations of using graphics-processing units are evaluated. Current pixel-array detectors produce diffraction images at extreme data rates (of up to 2 TB h⁻¹) that make severe demands on computational resources. New multiprocessing frameworks are required to achieve rapid data analysis, as it is important to be able to inspect the data quickly in order to guide the experiment in real time. By utilizing readily available web-serving tools that interact with the Python scripting language, it was possible to implement a high-throughput Bragg-spot analyzer (cctbx.spotfinder) that is presently in use at numerous synchrotron-radiation beamlines. Similarly, Python interoperability enabled the production of a new data-reduction package (cctbx.xfel) for serial femtosecond crystallography experiments at the Linac Coherent Light Source (LCLS). Future data-reduction efforts will need to focus on specialized problems such as the treatment of diffraction spots on interleaved lattices arising from multi-crystal specimens. In these challenging cases, accurate modeling of close-lying Bragg spots could benefit from the high-performance computing capabilities of graphics-processing units

  5. Optimization of Robotic Spray Painting process Parameters using Taguchi Method

    Science.gov (United States)

    Chidhambara, K. V.; Latha Shankar, B.; Vijaykumar

    2018-02-01

Automated spray painting is gaining interest in industry and research due to the extensive application of spray painting in the automobile industry. Automating the spray painting process has the advantages of improved quality, productivity, reduced labor, a clean environment and, in particular, cost effectiveness. This study investigates the performance characteristics of an industrial robot, Fanuc 250ib, for an automated painting process using the statistical tool of Taguchi's Design of Experiments technique. The experiment is designed using Taguchi's L25 orthogonal array, considering three factors and five levels for each factor. The objective of this work is to explore the major control parameters and to optimize them for improved quality of the paint coating, measured in terms of Dry Film Thickness (DFT), which also results in reduced rejection. Further, Analysis of Variance (ANOVA) is performed to determine the influence of the individual factors on DFT. It is observed that shaping air and paint flow are the most influential parameters. A multiple regression model is formulated for estimating predicted values of DFT. A confirmation test is then conducted, and the comparison shows that the error is within an acceptable level.
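A brief sketch of the signal-to-noise (S/N) analysis that underlies a Taguchi study: compute an S/N ratio per run and average it per factor level to see which factor's levels spread the most. The run data, coded levels and the larger-is-better S/N form are illustrative assumptions; a nominal-is-best ratio may be more appropriate for a film-thickness target, and the paper's actual L25 design is not reproduced.

```python
import numpy as np
import pandas as pd

# Hypothetical runs: three factors at coded levels and the measured
# dry film thickness (DFT) response for each run.
runs = pd.DataFrame({
    "shaping_air": [1, 1, 2, 2, 3, 3],
    "paint_flow":  [1, 2, 1, 3, 2, 3],
    "speed":       [1, 2, 3, 1, 2, 3],
    "dft":         [38.0, 41.5, 40.2, 44.1, 43.0, 45.6],
})

# Larger-is-better S/N ratio per run: -10*log10(mean(1/y^2)); with a single
# replicate per run this reduces to 20*log10(y).
runs["sn"] = -10.0 * np.log10(1.0 / runs["dft"] ** 2)

# Main effects: mean S/N per level of each factor. A larger range across
# levels suggests a more influential factor.
for factor in ["shaping_air", "paint_flow", "speed"]:
    effects = runs.groupby(factor)["sn"].mean()
    print(factor, {k: round(v, 2) for k, v in effects.items()},
          "range:", round(effects.max() - effects.min(), 2))
```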

  6. New Python-based methods for data processing

    Energy Technology Data Exchange (ETDEWEB)

    Sauter, Nicholas K., E-mail: nksauter@lbl.gov; Hattne, Johan; Grosse-Kunstleve, Ralf W.; Echols, Nathaniel [Lawrence Berkeley National Laboratory, 1 Cyclotron Road, Berkeley, CA 94720 (United States)

    2013-07-01

The Computational Crystallography Toolbox (cctbx) is a flexible software platform that has been used to develop high-throughput crystal-screening tools for both synchrotron sources and X-ray free-electron lasers. Plans for data-processing and visualization applications are discussed, and the benefits and limitations of using graphics-processing units are evaluated. Current pixel-array detectors produce diffraction images at extreme data rates (of up to 2 TB h⁻¹) that make severe demands on computational resources. New multiprocessing frameworks are required to achieve rapid data analysis, as it is important to be able to inspect the data quickly in order to guide the experiment in real time. By utilizing readily available web-serving tools that interact with the Python scripting language, it was possible to implement a high-throughput Bragg-spot analyzer (cctbx.spotfinder) that is presently in use at numerous synchrotron-radiation beamlines. Similarly, Python interoperability enabled the production of a new data-reduction package (cctbx.xfel) for serial femtosecond crystallography experiments at the Linac Coherent Light Source (LCLS). Future data-reduction efforts will need to focus on specialized problems such as the treatment of diffraction spots on interleaved lattices arising from multi-crystal specimens. In these challenging cases, accurate modeling of close-lying Bragg spots could benefit from the high-performance computing capabilities of graphics-processing units.

  7. A new method for defining and managing process alarms and for correcting process operation when an alarm occurs

    International Nuclear Information System (INIS)

    Brooks, Robin; Thorpe, Richard; Wilson, John

    2004-01-01

A new mathematical treatment of alarms that considers them as multi-variable interactions between process variables has provided the first-ever method to calculate values for alarm limits. This has resulted in substantial reductions in false alarms and hence in alarm annunciation rates in field trials. It has also unified alarm management, process control and product quality control into a single mathematical framework so that operations improvement and hence economic benefits are obtained at the same time as increased process safety. Additionally, an algorithm has been developed that advises what changes should be made to manipulable process variables to clear an alarm. The multi-variable Best Operating Zone at the heart of the method is derived from existing historical data using equation-free methods. It does not require a first-principles process model or an expensive series of process identification experiments. Integral with the method is a new format Process Operator Display that uses only existing variables to fully describe the multi-variable operating space. This combination of features makes it an affordable and maintainable solution for small plants and single items of equipment as well as for the largest plants. In many cases, it also provides the justification for the investments about to be made or already made in process historian systems. Field trials have been and are being conducted at IneosChlor and Mallinckrodt Chemicals, both in the UK, of the new geometric process control (GPC) method for improving the quality of both process operations and product by providing process alarms and alerts of much higher quality than ever before. The paper describes the methods used, including a simple visual method for alarm rationalisation that quickly delivers large sets of consistent alarm limits, and the extension to full alert management, with highlights from the field trials to indicate the overall effectiveness of the method in practice
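Purely as an illustration of deriving an operating envelope from historical data (not the authors' geometric process control method), the sketch below builds a convex-hull envelope from "good operation" samples and flags new points that fall outside it. The variables, data and hull-based test are assumptions for the example.

```python
import numpy as np
from scipy.spatial import Delaunay

def build_envelope(history):
    """Triangulate historical 'good operation' samples (rows = samples,
    columns = process variables) so membership in their convex hull can
    be tested. An illustrative envelope, not the paper's GPC zones."""
    return Delaunay(history)

def in_envelope(envelope, point):
    # find_simplex returns -1 for points outside the hull of the history.
    return bool(envelope.find_simplex(np.atleast_2d(point))[0] >= 0)

rng = np.random.default_rng(2)
history = rng.multivariate_normal([50.0, 1.2], [[4.0, 0.5], [0.5, 0.02]], size=500)
env = build_envelope(history)
print(in_envelope(env, [50.0, 1.2]))   # typical operation -> True
print(in_envelope(env, [70.0, 2.0]))   # far from history -> False (alarm candidate)
```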

  8. A new method for defining and managing process alarms and for correcting process operation when an alarm occurs

    Energy Technology Data Exchange (ETDEWEB)

    Brooks, Robin [Curvaceous Software Limited, P.O. Box 43, Gerrards Cross, Bucks SL98UX (United Kingdom)]. E-mail: enquiries@curvaceous.com; Thorpe, Richard [Curvaceous Software Limited, P.O. Box 43, Gerrards Cross, Bucks SL98UX (United Kingdom); Wilson, John [Curvaceous Software Limited, P.O. Box 43, Gerrards Cross, Bucks SL98UX (United Kingdom)

    2004-11-11

A new mathematical treatment of alarms that considers them as multi-variable interactions between process variables has provided the first-ever method to calculate values for alarm limits. This has resulted in substantial reductions in false alarms and hence in alarm annunciation rates in field trials. It has also unified alarm management, process control and product quality control into a single mathematical framework so that operations improvement and hence economic benefits are obtained at the same time as increased process safety. Additionally, an algorithm has been developed that advises what changes should be made to manipulable process variables to clear an alarm. The multi-variable Best Operating Zone at the heart of the method is derived from existing historical data using equation-free methods. It does not require a first-principles process model or an expensive series of process identification experiments. Integral with the method is a new format Process Operator Display that uses only existing variables to fully describe the multi-variable operating space. This combination of features makes it an affordable and maintainable solution for small plants and single items of equipment as well as for the largest plants. In many cases, it also provides the justification for the investments about to be made or already made in process historian systems. Field trials have been and are being conducted at IneosChlor and Mallinckrodt Chemicals, both in the UK, of the new geometric process control (GPC) method for improving the quality of both process operations and product by providing process alarms and alerts of much higher quality than ever before. The paper describes the methods used, including a simple visual method for alarm rationalisation that quickly delivers large sets of consistent alarm limits, and the extension to full alert management, with highlights from the field trials to indicate the overall effectiveness of the method in practice.

  9. A new method for defining and managing process alarms and for correcting process operation when an alarm occurs.

    Science.gov (United States)

    Brooks, Robin; Thorpe, Richard; Wilson, John

    2004-11-11

A new mathematical treatment of alarms that considers them as multi-variable interactions between process variables has provided the first-ever method to calculate values for alarm limits. This has resulted in substantial reductions in false alarms and hence in alarm annunciation rates in field trials. It has also unified alarm management, process control and product quality control into a single mathematical framework so that operations improvement and hence economic benefits are obtained at the same time as increased process safety. Additionally, an algorithm has been developed that advises what changes should be made to manipulable process variables to clear an alarm. The multi-variable Best Operating Zone at the heart of the method is derived from existing historical data using equation-free methods. It does not require a first-principles process model or an expensive series of process identification experiments. Integral with the method is a new format Process Operator Display that uses only existing variables to fully describe the multi-variable operating space. This combination of features makes it an affordable and maintainable solution for small plants and single items of equipment as well as for the largest plants. In many cases, it also provides the justification for the investments about to be made or already made in process historian systems. Field trials have been and are being conducted at IneosChlor and Mallinckrodt Chemicals, both in the UK, of the new geometric process control (GPC) method for improving the quality of both process operations and product by providing process alarms and alerts of much higher quality than ever before. The paper describes the methods used, including a simple visual method for alarm rationalisation that quickly delivers large sets of consistent alarm limits, and the extension to full alert management, with highlights from the field trials to indicate the overall effectiveness of the method in practice.

  10. Modelling a gamma irradiation process using the Monte Carlo method

    Energy Technology Data Exchange (ETDEWEB)

    Soares, Gabriela A.; Pereira, Marcio T., E-mail: gas@cdtn.br, E-mail: mtp@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2011-07-01

In gamma irradiation services, evaluation of the absorbed dose is of great importance in order to guarantee service quality. When the physical infrastructure and human resources needed to perform dosimetry on each irradiated product are not available, the application of mathematical models may be a solution. Such models make it possible to predict the dose delivered to a specific product, irradiated in a specific position for a certain period of time, provided they are validated with dosimetry tests. At the gamma irradiation facility of CDTN, equipped with a cobalt-60 source, the Monte Carlo method was applied to simulate product irradiations, and the results were compared with Fricke dosimeters irradiated under the same conditions as the simulations. The first results showed the applicability of this method, with a linear relation between simulated and experimental results. (author)
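A toy Monte Carlo sketch in the same spirit, not the CDTN facility model: sample photon free paths from an exponential distribution and estimate the fraction of photons interacting within a product slab, then compare with the analytic value. The attenuation coefficient and slab thickness are assumed values for illustration; a real simulation would track scatter, geometry and energy deposition.

```python
import numpy as np

def interaction_fraction(mu_cm, thickness_cm, n_photons=1_000_000, seed=0):
    """Sample photon free-path lengths from an exponential distribution with
    linear attenuation coefficient mu_cm (1/cm) and count the photons whose
    first interaction lies inside a slab of the given thickness."""
    rng = np.random.default_rng(seed)
    path = rng.exponential(scale=1.0 / mu_cm, size=n_photons)
    return float(np.mean(path < thickness_cm))

# Assumed coefficient: ~0.06 1/cm is roughly the order for ~1.25 MeV photons in water.
mc_estimate = interaction_fraction(mu_cm=0.06, thickness_cm=20.0)
analytic = 1.0 - np.exp(-0.06 * 20.0)
print(round(mc_estimate, 4), round(analytic, 4))   # the two should nearly agree
```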

  11. Processing method and results of meteor shower radar observations

    International Nuclear Information System (INIS)

    Belkovich, O.I.; Suleimanov, N.I.; Tokhtasjev, V.S.

    1987-01-01

    Studies of meteor showers permit the solving of some principal problems of meteor astronomy: to obtain the structure of a stream in cross section and along its orbits; to retrace the evolution of particle orbits of the stream taking into account gravitational and nongravitational forces and to discover the orbital elements of its parent body; to find out the total mass of solid particles ejected from the parent body taking into account physical and chemical evolution of meteor bodies; and to use meteor streams as natural probes for investigation of the average characteristics of the meteor complex in the solar system. A simple and effective method of determining the flux density and mass exponent parameter was worked out. This method and its results are discussed

  12. Modelling a gamma irradiation process using the Monte Carlo method

    International Nuclear Information System (INIS)

    Soares, Gabriela A.; Pereira, Marcio T.

    2011-01-01

In gamma irradiation services, evaluation of the absorbed dose is of great importance in order to guarantee service quality. When the physical infrastructure and human resources needed to perform dosimetry on each irradiated product are not available, the application of mathematical models may be a solution. Such models make it possible to predict the dose delivered to a specific product, irradiated in a specific position for a certain period of time, provided they are validated with dosimetry tests. At the gamma irradiation facility of CDTN, equipped with a cobalt-60 source, the Monte Carlo method was applied to simulate product irradiations, and the results were compared with Fricke dosimeters irradiated under the same conditions as the simulations. The first results showed the applicability of this method, with a linear relation between simulated and experimental results. (author)

  13. Evaluation of radiological processes in the Ternopil region by the box model method

    Directory of Open Access Journals (Sweden)

    І.В. Матвєєва

    2006-02-01

Full Text Available Flows of the radionuclide Sr-90 in the ecosystem of Kotsubinchiky village of Ternopolskaya oblast were analyzed. A block scheme of the ecosystem and its mathematical model were built using the box model method. This allowed us to evaluate how internal-irradiation dose loads form for different population groups – working people, retirees and children – and to predict the dynamics of these loads over the years following the Chernobyl accident.
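A minimal sketch of a box (compartment) model of the kind described: first-order transfers of Sr-90 between soil, vegetation and diet compartments plus radioactive decay, integrated numerically. The compartment structure and all transfer rates are made-up placeholders, not the values of the Kotsubinchiky ecosystem model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative first-order box model: soil -> vegetation -> diet, with each
# compartment also losing activity through radioactive decay.
LAMBDA = np.log(2) / 28.8                               # Sr-90 decay constant, 1/yr
K_SOIL_VEG, K_VEG_DIET, K_VEG_SOIL = 0.05, 0.3, 1.0     # made-up transfer rates, 1/yr

def boxes(t, y):
    soil, veg, diet = y
    d_soil = -K_SOIL_VEG * soil + K_VEG_SOIL * veg - LAMBDA * soil
    d_veg = K_SOIL_VEG * soil - (K_VEG_DIET + K_VEG_SOIL + LAMBDA) * veg
    d_diet = K_VEG_DIET * veg - LAMBDA * diet
    return [d_soil, d_veg, d_diet]

sol = solve_ivp(boxes, (0.0, 30.0), y0=[1.0, 0.0, 0.0], t_eval=np.linspace(0, 30, 7))
for t, (s, v, d) in zip(sol.t, sol.y.T):
    print(f"t={t:4.1f} yr  soil={s:.3f}  vegetation={v:.3f}  diet={d:.3f}")
```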

  14. Method of processing radioactive liquid wastes by using zeolites

    Energy Technology Data Exchange (ETDEWEB)

    Kanno, T; Mimura, H

    1975-09-18

The object is to process radioactive liquid waste with zeolites so that it is fixed in a solidified body with a very low leaching (lixiviation) rate. The nuclides in the radioactive liquid waste are exchanged and adsorbed into natural or synthetic zeolites, which are then fired at a temperature below 1000 °C (below the melting point). The zeolite structure is thereby broken down into fine amorphous aluminosilicate, or aluminosilicate of the exchanged and adsorbed nuclides, both of which are only very slightly soluble in water. Further, leaching from the solidified body is limited to its surface and is no longer detectable after a few days.

  15. System Health Monitoring Using a Novel Method: Security Unified Process

    Directory of Open Access Journals (Sweden)

    Alireza Shameli-Sendi

    2012-01-01

and change management, and project management. The dynamic dimension, or phases, contains inception, analysis and design, construction, and monitoring. Risk assessment is a major part of the ISMS process. In SUP, we present a risk assessment model which uses a fuzzy expert system to assess risks in an organization. Since the classification of assets is an important aspect of risk management and ensures that effective protection occurs, a Security Cube is proposed as an asset classification model to identify organization assets. The proposed model leads to an offline system health monitoring tool, which is a critical need in any organization.

  16. Processing method for discharged radioactive laundry water waste

    International Nuclear Information System (INIS)

    Izumida, Tatsuo; Kitsukawa, Ryozo; Tsuchiya, Hiroyuki; Kiuchi, Yoshimasa; Hattori, Yasuo.

    1995-01-01

In order to process discharged radioactive laundry waste water safely and reduce the volume of radioactive waste, foaming of the surfactant in the detergent, which causes problems during concentration, is suppressed so that the liquid concentrate can be continuously and easily dried into a powder. A nonionic surfactant is used to counter the foaming. In addition, foaming in the evaporation can is reduced and powderization is facilitated by adding an appropriate inorganic builder. (T.M.)

  17. Remote sensing models and methods for image processing

    CERN Document Server

    Schowengerdt, Robert A

    2007-01-01

    Remote sensing is a technology that engages electromagnetic sensors to measure and monitor changes in the earth's surface and atmosphere. Normally this is accomplished through the use of a satellite or aircraft. This book, in its 3rd edition, seamlessly connects the art and science of earth remote sensing with the latest interpretative tools and techniques of computer-aided image processing. Newly expanded and updated, this edition delivers more of the applied scientific theory and practical results that helped the previous editions earn wide acclaim and become classroom and industry standa

  18. Radioactive waste processing method for a nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Sugimoto, Y; Kuriyama, O

    1976-06-04

The object is to subject radioactive liquid waste from a nuclear power plant to a reverse osmosis (reverse permeation) process before it is evaporated and concentrated, thereby decreasing the quantity of foam and achieving effective concentration of the liquid waste. Liquid waste containing radioactive material from a nuclear power plant is first pressurized above its osmotic pressure in a reverse osmosis device and separated by a semi-permeable membrane into clean water and a concentrated liquid. The concentrated, reverse-osmosis-treated waste is then fed to an evaporator, in which foaming is controlled, and is further concentrated to purify the liquid waste.

  19. Device and method to enrich and process heavy water

    International Nuclear Information System (INIS)

    Hammerli, M.M.; Butler, J.P.

    1979-01-01

A device to process and enrich heavy water is proposed, based on a combined electrolysis and catalytic exchange system in which a D2O enrichment of more than 99.8% is achieved in the final stage. Partly enriched water is brought into contact, in a catalyst column, with deuterium-containing hydrogen gas from an electrolysis cell; the water is further enriched in deuterium there and is then fed to the electrolysis cell. Details of the apparatus are closely described. (UWI) [de

  20. Modelling and simulation of diffusive processes methods and applications

    CERN Document Server

    Basu, SK

    2014-01-01

    This book addresses the key issues in the modeling and simulation of diffusive processes from a wide spectrum of different applications across a broad range of disciplines. Features: discusses diffusion and molecular transport in living cells and suspended sediment in open channels; examines the modeling of peristaltic transport of nanofluids, and isotachophoretic separation of ionic samples in microfluidics; reviews thermal characterization of non-homogeneous media and scale-dependent porous dispersion resulting from velocity fluctuations; describes the modeling of nitrogen fate and transport

  1. Content of Wax during Dewaxing Process: Adopting a DOE Method

    Directory of Open Access Journals (Sweden)

    Mohammad Hosein Eghbali

    2013-01-01

    Full Text Available The oil content of the wax produced in a dewaxing process is the key economic parameter that should be reduced as much as possible. Some factors such as the type of solvents, cooling rate, temperature, and solvent to oil ratio influence the dewaxing process. Due to the fact that crude oil differs from place to place and since the operational conditions for wax extraction vary for different types of crude oil, the objective of this work is to study the operational conditions for wax production from an Iranian raffinate sample used in Sepahan Oil Company. All the experiments are conducted based on a design of experiment (DOE technique for minimizing the oil content of the wax produced. The effects of five factors have been determined quantitatively and appropriate levels are suggested for reducing the oil content. The results show that the solvent ratio, solvent composition, and cooling rate play the most important role in minimizing the oil content of the produced wax.
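A small sketch of the kind of analysis behind such a DOE study: fit a first-order model of oil content on coded factor levels by least squares and compare effect sizes. The two-level design, factor subset and response values below are invented for illustration and do not reproduce the paper's experiments.

```python
import numpy as np

# Hypothetical two-level (-1/+1) design for three of the factors named in the
# abstract (solvent ratio, solvent composition, cooling rate) and an invented
# oil-content response in percent.
X = np.array([[-1, -1, -1],
              [ 1, -1, -1],
              [-1,  1, -1],
              [ 1,  1, -1],
              [-1, -1,  1],
              [ 1, -1,  1],
              [-1,  1,  1],
              [ 1,  1,  1]], dtype=float)
y = np.array([4.2, 3.1, 3.8, 2.6, 4.9, 3.6, 4.4, 3.0])

# First-order model  y = b0 + b1*x1 + b2*x2 + b3*x3  fitted by least squares;
# the magnitude of each coefficient indicates that factor's influence.
A = np.column_stack([np.ones(len(y)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
for name, b in zip(["intercept", "solvent_ratio", "solvent_comp", "cooling_rate"], coef):
    print(f"{name:13s} estimate: {b:+.3f}")
```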

  2. XFEL diffraction: developing processing methods to optimize data quality

    Energy Technology Data Exchange (ETDEWEB)

    Sauter, Nicholas K., E-mail: nksauter@lbl.gov [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States)

    2015-01-29

    Bragg spots recorded from a still crystal necessarily give partial measurements of the structure factor intensity. Correction to the full-spot equivalent, relying on both a physical model for crystal disorder and postrefinement of the crystal orientation, improves the electron density map in serial crystallography. Serial crystallography, using either femtosecond X-ray pulses from free-electron laser sources or short synchrotron-radiation exposures, has the potential to reveal metalloprotein structural details while minimizing damage processes. However, deriving a self-consistent set of Bragg intensities from numerous still-crystal exposures remains a difficult problem, with optimal protocols likely to be quite different from those well established for rotation photography. Here several data processing issues unique to serial crystallography are examined. It is found that the limiting resolution differs for each shot, an effect that is likely to be due to both the sample heterogeneity and pulse-to-pulse variation in experimental conditions. Shots with lower resolution limits produce lower-quality models for predicting Bragg spot positions during the integration step. Also, still shots by their nature record only partial measurements of the Bragg intensity. An approximate model that corrects to the full-spot equivalent (with the simplifying assumption that the X-rays are monochromatic) brings the distribution of intensities closer to that expected from an ideal crystal, and improves the sharpness of anomalous difference Fourier peaks indicating metal positions.

  3. Method of processing solidification product of radioactive waste

    International Nuclear Information System (INIS)

    Daime, Fumiyoshi.

    1988-01-01

Purpose: To improve the long-term stability of solidification products in the treatment of radioactive wastes by giving them liquid tightness, gas tightness, abrasion resistance, etc., during solidification. Method: The surface of solidification products, prepared by mixing solidifying agents with powder or pellets, is entirely covered with a high-molecular-weight polymer such as epoxy resin. Epoxy resin has excellent radiation resistance, heat resistance, water resistance and chemical resistance, as well as satisfactory mechanical properties. This completely isolates the solidified radioactive wastes from the surrounding atmosphere. (Yoshino, Y.)

  4. Process identification method based on the Z transformation

    International Nuclear Information System (INIS)

    Zwingelstein, G.

    1968-01-01

A simple method is described for identifying the transfer function of a linear, delay-free system, based on computer inversion of the Z transform of the transmittance. It is assumed in this study that the input and output signals of the circuit considered are deterministic. The study includes the theoretical principle of the inversion of the Z transform, details of the programming and simulation, and identification of filters of first to fifth order. (authors) [fr
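In a similar spirit (though with modern tooling rather than the 1968 implementation), discrete-time transfer-function identification from deterministic input/output samples can be sketched as an ARX least-squares fit; the model orders and the simulated system below are assumptions for the example.

```python
import numpy as np

def identify_arx(u, y, na=2, nb=2):
    """Least-squares fit of  y[k] = -a1*y[k-1] - ... - a_na*y[k-na]
    + b1*u[k-1] + ... + b_nb*u[k-nb],  i.e. a discrete transfer function in
    powers of z^-1, assuming deterministic input/output signals."""
    n = max(na, nb)
    rows, targets = [], []
    for k in range(n, len(y)):
        rows.append(np.concatenate([-y[k - na:k][::-1], u[k - nb:k][::-1]]))
        targets.append(y[k])
    theta, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
    return theta[:na], theta[na:]       # denominator (a) and numerator (b) parts

# Simulate a known system and recover its coefficients.
rng = np.random.default_rng(3)
u = rng.standard_normal(500)
y = np.zeros(500)
for k in range(2, 500):
    y[k] = 0.7 * y[k - 1] + 0.5 * u[k - 1] + 0.2 * u[k - 2]
a, b = identify_arx(u, y)
print("a:", np.round(a, 3), "b:", np.round(b, 3))   # expect a ~ [-0.7, 0], b ~ [0.5, 0.2]
```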

  5. Radiometric method for the characterization of particulate processes in colloidal suspensions. II. Experimental verification of the method

    Energy Technology Data Exchange (ETDEWEB)

    Subotic, B. [Institut Rudjer Boskovic, Zagreb (Yugoslavia)

    1979-09-15

    A radiometric method for the characterization of particulate processes is verified using stable hydrosols of silver iodide. Silver iodide hydrosols satisfy the conditions required for the applications of the proposed method. Comparison shows that the values for the change of particle size measured in silver iodide hydrosols by the proposed method are in excellent agreement with the values obtained by other methods on the same systems (electron microscopy, sedimentation analysis, light scattering). This shows that the proposed method is suitable for the characterization of particulate processes in colloidal suspensions. (Auth.).

  6. Characterization of additive manufacturing processes for polymer micro parts productions using direct light processing (DLP) method

    DEFF Research Database (Denmark)

    Davoudinejad, Ali; Pedersen, David Bue; Tosello, Guido

    The process capability of additive manufacturing (AM) for direct production of miniaturized polymer components with micro features is analyzed in this work. The consideration of the minimum printable feature size and obtainable tolerances of AM process is a critical step to establish a process...... chains for the production of parts with micro scale features. A specifically designed direct light processing (DLP) AM machine suitable for precision printing has been used. A test part is designed having features with different sizes and aspect ratios in order to evaluate the DLP AM machine capability...

  7. Intelligent methods for the process parameter determination of plastic injection molding

    Science.gov (United States)

    Gao, Huang; Zhang, Yun; Zhou, Xundao; Li, Dequn

    2018-03-01

Injection molding is one of the most widely used material processing methods in producing plastic products with complex geometries and high precision. The determination of process parameters is important in obtaining qualified products and maintaining product quality. This article reviews the recent studies and developments of the intelligent methods applied in the process parameter determination of injection molding. These intelligent methods are classified into three categories: case-based reasoning methods, expert system-based methods, and data fitting and optimization methods. A framework of process parameter determination is proposed after comprehensive discussions. Finally, the conclusions and future research topics are discussed.

  8. Selecting device for processing method of radioactive gaseous wastes

    International Nuclear Information System (INIS)

    Sasaki, Ryoichi; Komoda, Norihisa.

    1976-01-01

Object: To extend the replacement interval of the filter that adsorbs radioactive material, by discharging waste gas containing radioactive material produced from nuclear power equipment after treating it with a method selected on the basis of wind direction measurements. Structure: Exhaust gas containing radioactive material produced from nuclear power equipment is discharged after being treated by a method selected on the basis of the wind direction measurement. For instance, in the case of a sea wind, the waste gas passes through a route selected for this case and is discharged through the waste gas outlet. When the sea wind disappears (that is, when a land wind or calm sets in), the exhaust gas is switched to the route for cases other than a sea wind, so that it passes through a filter of activated carbon where the radioactive material is removed by adsorption. The waste gas, now free of radioactive material, is discharged through the waste gas outlet. (Moriyama, K.)

  9. Optimization of Production Processes Using the Yamazumi Method

    Directory of Open Access Journals (Sweden)

    Dušan Sabadka

    2017-12-01

Full Text Available Manufacturing companies now place great emphasis on competitiveness and look for ways to use their resources more efficiently. This paper presents an efficiency improvement of an automotive transmission assembly line using line balancing. Three assembly stations were selected for optimization, where waste occurs and the production-capacity requirements are not met. Several measures were proposed for the assembly lines concerned: eliminating unnecessary activities from the assembly processes, reducing cycle times, and balancing the manpower workload through line balancing with a Yamazumi chart and takt time. The results of the proposed measures were compared with the current situation in terms of the increase in production-line efficiency.
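A minimal sketch of the takt-time comparison that a Yamazumi chart visualizes: compute takt from available time and demand, sum the element times per station, and flag stations whose cycle time exceeds takt. The station names, element times and demand figures are hypothetical.

```python
# Hypothetical element times per station (seconds) and demand figures.
stations = {
    "ST-1": [12.0, 18.5, 9.0],
    "ST-2": [25.0, 22.5, 14.0],   # deliberately overloaded in this example
    "ST-3": [10.0, 11.0, 8.5],
}
available_time_s = 7.5 * 3600     # effective working time in one shift
demand_units = 450                # units required per shift

takt = available_time_s / demand_units
print(f"takt time: {takt:.1f} s/unit")

for name, elements in stations.items():
    cycle = sum(elements)
    status = "over takt -> rebalance / remove waste" if cycle > takt else "within takt"
    print(f"{name}: cycle {cycle:.1f} s ({status})")
```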

  10. Diffusion in Solids Fundamentals, Methods, Materials, Diffusion-Controlled Processes

    CERN Document Server

    Mehrer, Helmut

    2007-01-01

    Diffusion is a vital topic in solid-state physics and chemistry, physical metallurgy and materials science. Diffusion processes are ubiquitous in solids at elevated temperatures. A thorough understanding of diffusion in materials is crucial for materials development and engineering. This book first gives an account of the central aspects of diffusion in solids, for which the necessary background is a course in solid state physics. It then provides easy access to important information about diffuson in metals, alloys, semiconductors, ion-conducting materials, glasses and nanomaterials. Several diffusion-controlled phenomena, including ionic conduction, grain-boundary and dislocation pipe diffusion, are considered as well. Graduate students in solid-state physics, physical metallurgy, materials science, physical and inorganic chemistry or geophysics will benefit from this book as will physicists, chemists, metallurgists, materials engineers in academic and industrial research laboratories.

  11. Analysis of Optimal Process Flow Diagrams of Light Naphtha Isomerization Process by Mathematic Modelling Method

    OpenAIRE

    Chuzlov, Vyacheslav Alekseevich; Molotov, Konstantin

    2016-01-01

An approach to the simulation of catalytic reactors for hydrocarbon refining processes is presented. A kinetic and thermodynamic study of the light naphtha isomerization process was conducted. The kinetic parameters of the chemical conversion of the hydrocarbon feedstock on different types of platinum-containing catalysts were established. The efficiency of including different isomerization technologies in an oil refinery flow diagram was estimated.

  12. Analysis of Optimal Process Flow Diagrams of Light Naphtha Isomerization Process by Mathematic Modelling Method

    Directory of Open Access Journals (Sweden)

    Chuzlov Vjacheslav

    2016-01-01

Full Text Available An approach to the simulation of catalytic reactors for hydrocarbon refining processes is presented. A kinetic and thermodynamic study of the light naphtha isomerization process was conducted. The kinetic parameters of the chemical conversion of the hydrocarbon feedstock on different types of platinum-containing catalysts were established. The efficiency of including different isomerization technologies in an oil refinery flow diagram was estimated.

  13. Melt-processing method for radioactive solid wastes

    International Nuclear Information System (INIS)

    Kobayashi, Hiroaki

    1998-01-01

Radioactive solid wastes are charged into a water-cooled cold-crucible induction melting furnace placed inside high-frequency coils, and high-frequency currents are supplied to the coils surrounding the furnace to melt the solid wastes by induction heating. Thermal plasma is first jetted onto the solid wastes from above to provide initial heating and melt a portion of them; high-frequency currents are then supplied to the coils for induction heating. With this method, even when waste components of various materials are mixed, a portion of the solid wastes in the furnace can be melted by the initial plasma heating irrespective of the kind and electrical conductivity of the waste materials. In this way the entire charge in the furnace can be brought to a molten state uniformly and rapidly. (T.M.)

  14. Methods to incorporate different data types in the characterization process

    International Nuclear Information System (INIS)

    Gomez-Hernandez, J.J.; Carrera, J.; Medina, A.

    1998-01-01

Spatial variability of the hydrodynamic parameters controlling radionuclide transport causes large uncertainties in the predictions. Methods have been devised to analyze spatial variability of these parameters and to model the uncertainty of the predictions. However, the final use given to large portions of the total data collected is minimal. Techniques have been developed and implemented with the aim of incorporating all types of data in the characterization of the spatial variability of conductivity/transmissivity. This serves to reduce the uncertainty in the predictions and to increase the confidence in the model. Types of data used in models include: geometric information, transmissivity data, piezometric data, geological/geophysical information, tracer test concentration data, and isotopic data. (R.P.)

  15. An alternative method for processing northern blots after capillary transfer.

    Science.gov (United States)

    Nilsen, Timothy W

    2015-03-02

    Different laboratories use different methods for the prehybridization, hybridization, and washing steps of the northern blotting procedure. In this protocol, a northern blot is pretreated with Church and Gilbert hybridization buffer to block nonspecific probe-binding sites. The immobilized RNA is then hybridized to a DNA probe specific for the RNA of interest. Finally, the membrane is washed and subjected to autoradiography or phosphorimaging. The solutions and conditions described here may be ideal for those who prefer to use fewer ingredients in their solutions. This protocol is designed to achieve the same goals as other northern blotting approaches. It minimizes background (nonspecific adherence of probe to membrane and nonspecific hybridization) and maximizes specific hybridization to RNAs immobilized on a membrane. © 2015 Cold Spring Harbor Laboratory Press.

  16. Method of processing nitrate-containing radioactive liquid wastes

    International Nuclear Information System (INIS)

    Ogawa, Norito; Nagase, Kiyoharu; Otsuka, Katsuyuki; Ouchi, Jin.

    1983-01-01

Purpose: To efficiently concentrate nitrate-containing low-level radioactive liquid wastes by electrolytic dialysis, decomposing the nitrate salt in an electrolytic cell comprising three chambers with ion exchange membranes and anodes made of special materials. Method: Nitrate-containing low-level radioactive liquid wastes are supplied to, and electrolytically dialyzed in, the central chamber of an electrolytic cell comprising three chambers with cation exchange membranes and anion exchange membranes of fluoropolymer as partition membranes. The nitrate is thereby decomposed to form nitric acid in the anode chamber and an alkali hydroxide or ammonium hydroxide in the cathode chamber, while the radioactive substances are concentrated in the central chamber. A metal coated with at least one platinum-group metal is used as the anode of the electrolytic cell. This enables efficient industrial concentration of nitrate-containing low-level radioactive liquid wastes. (Yoshihara, H.)

  17. A simple method for rapidly processing HEU from weapons returns

    Energy Technology Data Exchange (ETDEWEB)

    McLean, W. II; Miller, P.E.

    1994-01-01

    A method based on the use of a high temperature fluidized bed for rapidly oxidizing, homogenizing and down-blending Highly Enriched Uranium (HEU) from dismantled nuclear weapons is presented. This technology directly addresses many of the most important issues that inhibit progress in international commerce in HEU; viz., transaction verification, materials accountability, transportation and environmental safety. The equipment used to carry out the oxidation and blending is simple, inexpensive and highly portable. Mobile facilities to be used for point-of-sale blending and analysis of the product material are presented along with a phased implementation plan that addresses the conversion of HEU derived from domestic weapons and related waste streams as well as material from possible foreign sources such as South Africa or the former Soviet Union.

  18. Degradation processes and the methods of securing wall crests

    Directory of Open Access Journals (Sweden)

    Maciej Trochonowicz

    2017-12-01

Full Text Available The protection of historical ruins requires solving both doctrinal and technical problems. The technical problems above all concern the preservation of walls exposed to atmospheric factors. A problem that needs to be solved in any historic ruin is the securing of wall crests. The form of protection of the wall crests depends on many factors, mainly the technical features of the wall and the architectural and conservation vision. This article presents three aspects important for the protection of wall crests: first, an analysis of the wall as a structure; second, the characteristics of the destructive agents; third, the forms of protection of wall crests. The summary presents the advantages and disadvantages of each method of securing wall crests.

  19. E-MALL ARCHITECTURE USING THE RUP (RATIONAL UNIFIED PROCESS) METHOD

    Directory of Open Access Journals (Sweden)

    Atin Triwahyuni

    2016-09-01

Full Text Available Information systems and technology have developed rapidly and affect every aspect of life. The need for technology-supported information systems is felt by many groups in society, including people engaged in business and commerce. By making use of technology, anyone running a business can introduce their products over the Internet, which can increase sales and broaden their market reach. With conventional selling, goods are still offered face to face: the customer must first go to the desired shop to view or buy products, whereas with an E-Mall website customers can find what they are looking for without having to visit the shop. The method used in this research is RUP (Rational Unified Process), which follows an object-oriented concept, with activities focused on model development using the Unified Modeling Language (UML). The research produced a website that can support an online clothing trade business.

  20. Method of processing cellulose filter sludge containing radioactive waste

    International Nuclear Information System (INIS)

    Shibata, Setsuo; Shibuya, Hidetoshi; Kusakabe, Takao; Kawakami, Hiroshi.

    1991-01-01

Cellulose filter sludges loaded with radioactive wastes are treated with 1 to 15% cellulase, based on the solid content of the sludges, acting in an aqueous medium at pH 4 to 8 and 10 to 50 °C. If the pH exceeds 8, the hydrolyzing effect of the cellulase decreases, whereas the tank corrodes if the pH is 4 or lower. If the temperature is below 10 °C, the hydrolysis rate is too low to be practical; a temperature of around 40 °C is appropriate, and above 50 °C the cellulase itself becomes unstable. A cellulase dosage of about 8% is most effective, and additions above 15% bring no further benefit. In this way, liquids in which most of the filter sludge has been hydrolyzed are processed as low-level radioactive wastes. (T.M.)

  1. A method to automate the radiological survey process

    International Nuclear Information System (INIS)

    Berven, B.A.; Blair, M.S.

    1987-01-01

This document describes USRADS, a hardware/software ranging and data transmission system that provides real-time position data and combines it with other portable instrument measurements. Live display of position data and on-site data reduction, presentation, formatting for reports, and automatic transfer into databases are among the unusual attributes of USRADS. Approximately 25% of any survey-to-survey-report process is dedicated to data recording and formatting, which is eliminated by USRADS. Cost savings are realized by the elimination of manual transcription of instrument readouts in the field and clerical formatting of data in the office. Increased data reliability is realized by ensuring complete survey coverage of an area in the field, by elimination of mathematical errors in the conversion of instrument readouts to unit concentration, and by elimination of errors associated with transcribing data from the field into report format. USRADS can be adapted to measure other types of pollutants or physical/chemical/geological/biological conditions for which portable instrumentation exists. 2 refs., 2 figs

  2. Off-gas processing method in reprocessing plant

    International Nuclear Information System (INIS)

    Kobayashi, Yoshihiro; Seki, Eiji.

    1990-01-01

Off-gases containing radioactive Kr generated in a nuclear fuel reprocessing plant are first sent to a Kr gas separator. The radioactive Kr gas extracted there is then introduced into a Kr gas fixing device. A pretreatment and a post-treatment are applied using a non-radioactive clean inert gas other than Kr as the purge gas. If radioactive Kr gas is contained in the off-gases discharged from the Kr gas fixing device after the post-treatment, the off-gases are returned to the Kr gas separator. Accordingly, when radioactive Kr gas is present in the off-gases discharged from the fixing device, it is not necessary to apply the fixing treatment to all of the off-gases. In this way, the increase in the amount of gas to be processed can be suppressed and the radioactive Kr gas can be fixed efficiently and economically. (I.N.)

  3. Analysis of therapeutic methods for treating vocal process granulomas.

    Science.gov (United States)

    Ma, Lijing; Xiao, Yang; Ye, Jingying; Yang, Qingwen; Wang, Jun

    2015-03-01

    The combination of laryngeal microsurgery and local injections of botulinum toxin type A (BTA) can increase the cure rate of patients with vocal process granulomas (VPGs). To analyze the therapeutic effects of conservative treatments, microsurgical resection with suturing and microsurgery in combination with local injections of BTA for the treatment of VPGs. A retrospective analysis of 168 cases of VPG was performed. All of the patients initially received a conservative treatment. Some of the patients who did not respond to the conservative treatments were treated using microsurgical resection and microsuturing using an 8-0 absorbable filament. Other patients additionally received a four-point injection of BTA into the thyroarytenoid muscle and the arytenoid muscle on the operated side. The lesions of 41.3% (71/168) of the patients who were given the conservative treatments (including acid suppression, vocal rest, and voice therapy) disappeared, and the lesions of 10.7% (18/168) of the patients were reduced. The conservative treatments were unsuccessful for 47% (79/168) of the patients. The cure rate was 78.4% (29/37) for the patients who were treated by microscope resection using a CO2 laser and microsuturing of the surrounding mucosa. Of the eight patients who experienced a recurrence, five of them had lesions that disappeared after 3 months of conservative treatment, whereas the other three patients recovered after a second operation. The cure rate of the 42 patients who were treated using microsurgery combined with local injections of BTA was 95.2% (40/42), with only 2 cases of recurrence at 2 months post-treatment.

  4. A method to evaluate process performance by integrating time and resources

    Science.gov (United States)

    Wang, Yu; Wei, Qingjie; Jin, Shuang

    2017-06-01

The purpose of process mining is to improve an enterprise's existing processes, so measuring process performance is particularly important. However, current research on performance evaluation methods is still insufficient: the main evaluation approaches rely on time or resources alone, and such basic statistics cannot evaluate process performance well. In this paper, a method for evaluating process performance based on both the time dimension and the resource dimension is proposed. The method can be used to measure the utilization and redundancy of resources in a process. The paper introduces the design principle and formula of the evaluation algorithm, then the design and implementation of the evaluation method. Finally, the method is used to analyse the event log of a telephone maintenance process and an optimization plan is proposed.
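A small sketch of combining the time and resource dimensions from an event log, assuming the pandas library and an invented log with case, resource, start and end columns (not the paper's algorithm or its telephone-maintenance log): compute case throughput times and per-resource busy time and utilization.

```python
import pandas as pd

# Tiny invented event log: one row per completed activity.
log = pd.DataFrame({
    "case":     ["c1", "c1", "c2", "c2", "c2"],
    "resource": ["anna", "bob", "anna", "anna", "bob"],
    "start": pd.to_datetime(["2024-01-01 09:00", "2024-01-01 09:40",
                             "2024-01-01 09:10", "2024-01-01 10:00",
                             "2024-01-01 10:30"]),
    "end":   pd.to_datetime(["2024-01-01 09:30", "2024-01-01 10:10",
                             "2024-01-01 09:50", "2024-01-01 10:25",
                             "2024-01-01 11:00"]),
})

# Time dimension: throughput time per case.
case_span = log.groupby("case").agg(first_start=("start", "min"), last_end=("end", "max"))
print(case_span["last_end"] - case_span["first_start"])

# Resource dimension: busy time per resource and utilization over the log window.
busy = (log["end"] - log["start"]).groupby(log["resource"]).sum()
window = log["end"].max() - log["start"].min()
print((busy / window).rename("utilization"))
```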

  5. Kaizen: a method of process improvement in the emergency department.

    Science.gov (United States)

    Jacobson, Gregory H; McCoin, Nicole Streiff; Lescallette, Richard; Russ, Stephan; Slovis, Corey M

    2009-12-01

    Recent position statements from health care organizations have placed a strong emphasis on continuous quality improvement (CQI). CQI finds many of its roots in kaizen, which emphasizes small, low-cost, low-risk improvements. Based on the successful Kaizen Programs at organizations such as Toyota, the authors thought the emergency department (ED) would be an ideal environment to benefit from such a program. The authors sought to create a CQI program using a suggestion-based model that did not require a large time commitment, was easy to implement, and had the potential to empower all physicians in the department. It would not take the place of other improvement efforts, but instead augment them. The hypothesis was that such a program would foster sustainable engagement of emergency physicians in system improvement efforts and lead to a continuous stream of low-cost implementable system improvement interventions. A CQI program was created for the physician staff of the Department of Emergency Medicine at Vanderbilt University Medical Center, focusing on a suggestion-based model using kaizen philosophy. Lectures teaching kaizen philosophy were presented. Over the past 4 years, a methodology was developed utilizing a Web-based application, the Kaizen Tracker, which aids in the submission and implementation of suggestions that are called kaizen initiatives (KIs). The characteristics of the KIs submitted, details regarding resident and faculty participation, and the effectiveness of the Kaizen Tracker were retrospectively reviewed. There were 169, 105, and 101 KIs placed in the postimplementation calendar years 2006, 2007, and 2008, respectively. Seventy-six percent of KIs submitted thus far have identified a "process problem." Fifty-three percent of KIs submitted have led to operational changes within the ED. Ninety-three percent of the resident physicians entered at least one KI, and 73% of these residents submitted more than one KI. Sixty-nine percent of the

  6. Data-driven fault detection for industrial processes canonical correlation analysis and projection based methods

    CERN Document Server

    Chen, Zhiwen

    2017-01-01

Zhiwen Chen aims to develop advanced fault detection (FD) methods for the monitoring of industrial processes. With the ever increasing demands on reliability and safety in industrial processes, fault detection has become an important issue. Although the model-based fault detection theory has been well studied in the past decades, its applications are limited to large-scale industrial processes because it is difficult to build accurate models. Furthermore, motivated by the limitations of existing data-driven FD methods, novel canonical correlation analysis (CCA) and projection-based methods are proposed from the perspectives of process input and output data, less engineering effort and wide application scope. For performance evaluation of FD methods, a new index is also developed. Contents: A New Index for Performance Evaluation of FD Methods; CCA-based FD Method for the Monitoring of Stationary Processes; Projection-based FD Method for the Monitoring of Dynamic Processes; Benchmark Study and Real-Time Implementat...
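As a rough illustration of a CCA-style residual for fault detection (a simplified sketch, not the book's algorithms), the code below estimates canonical directions between process inputs and outputs on fault-free data and then monitors the mismatch between the output canonical variates and their prediction from the inputs; the data and the whitening-based CCA estimate are assumptions.

```python
import numpy as np

def cca_train(X, Y):
    """Estimate canonical directions between inputs X and outputs Y
    (rows = samples) from fault-free data, via whitening and an SVD of
    the cross-covariance. Returns projections, correlations and means."""
    Xc, Yc = X - X.mean(0), Y - Y.mean(0)
    n = len(X)
    Sxx, Syy, Sxy = Xc.T @ Xc / (n - 1), Yc.T @ Yc / (n - 1), Xc.T @ Yc / (n - 1)

    def inv_sqrt(S):                      # inverse symmetric square root
        w, V = np.linalg.eigh(S)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

    Wx, Wy = inv_sqrt(Sxx), inv_sqrt(Syy)
    U, rho, Vt = np.linalg.svd(Wx @ Sxy @ Wy)
    k = min(X.shape[1], Y.shape[1])
    return Wx @ U[:, :k], Wy @ Vt.T[:, :k], rho[:k], X.mean(0), Y.mean(0)

def cca_residual(model, x, y):
    """Mismatch between the output canonical variates and their prediction
    from the input canonical variates; a large norm suggests a fault."""
    A, B, rho, mx, my = model
    return B.T @ (y - my) - np.diag(rho) @ (A.T @ (x - mx))

rng = np.random.default_rng(4)
X = rng.standard_normal((500, 3))
Y = X @ rng.standard_normal((3, 2)) + 0.1 * rng.standard_normal((500, 2))
model = cca_train(X, Y)
print(np.linalg.norm(cca_residual(model, X[0], Y[0])))        # small: normal sample
print(np.linalg.norm(cca_residual(model, X[0], Y[0] + 5.0)))  # large: fault-like shift
```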

  7. Process-tracing methods in decision making: on growing up in the 70s

    NARCIS (Netherlands)

    Schulte-Mecklenbeck, M.; Johnson, J.G.; Böckenholt, U.; Goldstein, D.G.; Russo, J.E.; Sullivan, N.J.; Willemsen, M.C.

    2017-01-01

    Decision research has experienced a shift from simple algebraic theories of choice to an appreciation of mental processes underlying choice. A variety of process-tracing methods has helped researchers test these process explanations. Here, we provide a survey of these methods, including specific

  8. [Research on evolution and transition of processing method of fuzi in ancient and modern times].

    Science.gov (United States)

    Liu, Chan-Chan; Cheng, Ming-En; Duan, Hai-Yan; Peng, Hua-Sheng

    2014-04-01

    Fuzi is a medicine used for rescuing from collapse by restoring yang, as well as a famous toxic traditional Chinese medicine. In order to ensure efficacy and safe medication, Fuzi has mostly been applied after being processed. Different Fuzi processing methods have been recorded by doctors of previous generations, and there are also differences between the Fuzi processing methods recorded in modern pharmacopeia and those in ancient medical books. In this study, the authors traced medical books from the Han Dynasty to the period of the Republic of China and summarized the Fuzi processing methods collected in ancient and modern literature. According to the results, Fuzi processing and usage methods have changed along with the succession of dynasties, with differences between ancient and modern processing methods. Before the Tang Dynasty, Fuzi had mostly been processed and soaked. From the Tang to the Ming Dynasties, Fuzi had mostly been processed, soaked and stir-fried. During the Qing Dynasty, Fuzi had mostly been soaked and boiled. In modern times, Fuzi is mostly processed by being boiled and soaked. Before the Tang Dynasty, whole pieces of Fuzi or their fragments had been applied in medicines, whereas fragments are primarily used in modern times. Because different processing methods have great impacts on the toxicity of Fuzi, it is suggested that Fuzi processing methods be studied further.

  9. Development of Auto-Seeding System Using Image Processing Technology in the Sapphire Crystal Growth Process via the Kyropoulos Method

    Directory of Open Access Journals (Sweden)

    Churl Min Kim

    2017-04-01

    Full Text Available The Kyropoulos (Ky) and Czochralski (Cz) methods of crystal growth are used for large-diameter single crystals. The seeding process in these methods must induce initial crystallization by initiating contact between the seed crystals and the surface of the melted material. In the Ky and Cz methods, the seeding process lays the foundation for ingot growth during the entire growth process. When any defect occurs in this process, it is likely to spread to the entire ingot. In this paper, a vision system was constructed for auto seeding and for observing the surface of the melt in the Ky method. An algorithm was developed to detect the time when the internal convection of the melt is stabilized by observing the shape of the spoke pattern on the melt material surface. Then, the vision system and algorithm were applied to the growth furnace, and the possibility of process automation was examined for sapphire growth. To confirm that the convection of the melt was stabilized, the position of the island (i.e., the center of a spoke pattern) was detected using the vision system and image processing. When the observed coordinates for the center of the island were compared with the coordinates detected from the image processing algorithm, there was an average error of 1.87 mm (based on an image with 1024 × 768 pixels).

  10. Evaluation of polymer micro parts produced by additive manufacturing processes using vat photopolymerization method

    DEFF Research Database (Denmark)

    Davoudinejad, Ali; Pedersen, David Bue; Tosello, Guido

    2017-01-01

    Micro manufacturing scale feature production by Additive Manufacturing (AM) processes for the direct production of miniaturized polymer components is analysed in this work. The study characterizes the AM processes for polymer micro part production using the vat photopolymerization method...

  11. Biological features produced by additive manufacturing processes using vat photopolymerization method

    DEFF Research Database (Denmark)

    Davoudinejad, Ali; Mendez Ribo, Macarena; Pedersen, David Bue

    2017-01-01

    of micro biological features by Additive Manufacturing (AM) processes. The study characterizes the additive manufacturing processes for polymeric micro part production using the vat photopolymerization method. A specifically designed vat photopolymerization AM machine suitable for precision printing...

  12. Prediction of periodically correlated processes by wavelet transform and multivariate methods with applications to climatological data

    Science.gov (United States)

    Ghanbarzadeh, Mitra; Aminghafari, Mina

    2015-05-01

    This article studies the prediction of periodically correlated processes using the wavelet transform and multivariate methods, with applications to climatological data. Periodically correlated processes can be reformulated as multivariate stationary processes. Considering this fact, two new prediction methods are proposed. In the first method, we use stepwise regression between the principal components of the multivariate stationary process and past wavelet coefficients of the process to obtain a prediction. In the second method, we propose its multivariate version without a prior principal component analysis. Also, we study a generalization of the prediction methods dealing with a deterministic trend using exponential smoothing. Finally, we illustrate the performance of the proposed methods on simulated and real climatological data (ozone amounts, flows of a river, solar radiation, and sea levels) compared with the multivariate autoregressive model. The proposed methods give good results, as we expected.

  13. A Multi-Objective Optimization Method to integrate Heat Pumps in Industrial Processes

    OpenAIRE

    Becker, Helen; Spinato, Giulia; Maréchal, François

    2011-01-01

    The aim of process integration methods is to increase the efficiency of industrial processes by using pinch analysis combined with process design methods. In this context, appropriate integrated utilities offer promising opportunities to reduce energy consumption, operating costs and pollutant emissions. Energy integration methods are able to integrate any type of predefined utility, but so far there is no systematic approach to generate potential utility models based on their technology limit...

  14. Research on the raw data processing method of the hydropower construction project

    Science.gov (United States)

    Tian, Zhichao

    2018-01-01

    In this paper, based on the characteristics of fixed (quota) data, various mathematical-statistical analysis methods are compared and an improved Grubbs criterion is chosen to analyze the data; through this analysis, data that are not suitable are screened out. It is shown that this method can be applied to the processing of fixed raw data. This paper provides a reference for reasonably determining effective quota analysis data.
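
    The paper's "improved" criterion is not spelled out in the abstract; the sketch below shows only the standard Grubbs outlier test that such data screening is typically built on, with illustrative data.

```python
# Standard Grubbs test sketch (not the paper's improved variant): flag the single
# most extreme value if its G statistic exceeds the critical value at level alpha.
import numpy as np
from scipy import stats

def grubbs_outlier(data, alpha=0.05):
    x = np.asarray(data, dtype=float)
    n = len(x)
    mean, sd = x.mean(), x.std(ddof=1)
    idx = int(np.argmax(np.abs(x - mean)))
    G = abs(x[idx] - mean) / sd
    t = stats.t.ppf(1 - alpha / (2 * n), n - 2)
    G_crit = ((n - 1) / np.sqrt(n)) * np.sqrt(t**2 / (n - 2 + t**2))
    return (idx, x[idx]) if G > G_crit else None

print(grubbs_outlier([12.1, 12.3, 11.9, 12.0, 15.8, 12.2]))  # flags the value 15.8
```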

  15. Multiresolution, Geometric, and Learning Methods in Statistical Image Processing, Object Recognition, and Sensor Fusion

    National Research Council Canada - National Science Library

    Willsky, Alan

    2004-01-01

    .... Our research blends methods from several fields: statistics and probability, signal and image processing, mathematical physics, scientific computing, statistical learning theory, and differential...

  16. The method of multispectral image processing of phytoplankton processing for environmental control of water pollution

    Science.gov (United States)

    Petruk, Vasil; Kvaternyuk, Sergii; Yasynska, Victoria; Kozachuk, Anastasia; Kotyra, Andrzej; Romaniuk, Ryszard S.; Askarova, Nursanat

    2015-12-01

    The paper presents an improvement of the method of environmental monitoring of water bodies based on bioindication by phytoplankton, in which phytoplankton particles are identified by comparing arrays of multispectral images using a Bayesian classifier whose decision function is based on the Mahalanobis distance. This allows complex anthropogenic and technological impacts on aquatic ecosystems to be evaluated objectively.
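
    As a rough illustration of this kind of decision function, the sketch below classifies multispectral pixel vectors by minimum Mahalanobis distance to class statistics estimated from training pixels; the class names, band count and data are hypothetical.

```python
# Minimum-Mahalanobis-distance classifier sketch for multispectral pixel vectors
# (a simplified stand-in for the Bayesian decision function mentioned above).
import numpy as np

def fit_classes(samples_by_class):
    """samples_by_class: dict name -> (n_pixels, n_bands) array of training pixels."""
    model = {}
    for name, X in samples_by_class.items():
        mu = X.mean(axis=0)
        cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
        model[name] = (mu, cov_inv)
    return model

def classify(pixel, model):
    d2 = {name: (pixel - mu) @ cov_inv @ (pixel - mu)
          for name, (mu, cov_inv) in model.items()}
    return min(d2, key=d2.get)

rng = np.random.default_rng(1)
model = fit_classes({
    'class_A': rng.normal([0.2, 0.5, 0.3], 0.05, size=(200, 3)),
    'class_B': rng.normal([0.6, 0.3, 0.2], 0.05, size=(200, 3)),
})
print(classify(np.array([0.58, 0.31, 0.22]), model))      # -> 'class_B'
```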

  17. Sequential method for the assessment of innovations in computer assisted industrial processes

    International Nuclear Information System (INIS)

    Suarez Antola R.

    1995-01-01

    A sequential method for the assessment of innovations in industrial processes is proposed, using suitable combinations of mathematical modelling and numerical simulation of dynamics. Some advantages and limitations of the proposed method are discussed.

  18. Analysis of Unit Process Cost for an Engineering-Scale Pyroprocess Facility Using a Process Costing Method in Korea

    Directory of Open Access Journals (Sweden)

    Sungki Kim

    2015-08-01

    Full Text Available Pyroprocessing, which is a dry recycling method, converts spent nuclear fuel into U (Uranium)/TRU (TRansUranium) metal ingots in a high-temperature molten salt phase. This paper provides the unit process cost of a pyroprocess facility that can process up to 10 tons of pyroprocessing product per year by utilizing the process costing method. Toward this end, the pyroprocess was classified into four kinds of unit processes: pretreatment, electrochemical reduction, electrorefining and electrowinning. The unit process cost was calculated by classifying the cost consumed at each process into raw material and conversion costs. The unit process costs of the pretreatment, electrochemical reduction, electrorefining and electrowinning were calculated as 195 US$/kgU-TRU, 310 US$/kgU-TRU, 215 US$/kgU-TRU and 231 US$/kgU-TRU, respectively. Finally the total pyroprocess cost was calculated as 951 US$/kgU-TRU. In addition, the cost driver for the raw material cost was identified as the cost for Li3PO4, needed for the LiCl-KCl purification process, and platinum as an anode electrode in the electrochemical reduction process.
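
    As a quick consistency check of the figures quoted above, the four unit-process costs do sum to the stated total:

```python
# Worked check of the reported unit process costs (US$/kgU-TRU).
unit_costs = {
    'pretreatment': 195,
    'electrochemical reduction': 310,
    'electrorefining': 215,
    'electrowinning': 231,
}
print(sum(unit_costs.values()))   # 951, matching the stated total of 951 US$/kgU-TRU
```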

  19. Processing methods for operation test data of radioactive aerosols monitor based on accumulation techniques

    International Nuclear Information System (INIS)

    Fu Cuiming; Xi Pingping; Ma Yinghao; Tan Linglong; Shen Fu

    2011-01-01

    This article introduces a radioactive aerosol continuous monitor based on accumulation sampling and measuring and three methods for processing the operation data. The monitoring results are processed by the 3 methods which are applied both under the conditions of natural background and at workplaces of a nuclear facility. How the monitoring results are assessed and how to calculate the detection limit when using the 3 different methods are explained. Moreover, the advantages and disadvantages of the 3 methods are discussed. (authors)

  20. Integrated process development-a robust, rapid method for inclusion body harvesting and processing at the microscale level.

    Science.gov (United States)

    Walther, Cornelia; Kellner, Martin; Berkemeyer, Matthias; Brocard, Cécile; Dürauer, Astrid

    2017-10-21

    Escherichia coli stores large amounts of highly pure product within inclusion bodies (IBs). To take advantage of this beneficial feature, after cell disintegration, the first step to optimal product recovery is efficient IB preparation. This step is also important in evaluating upstream optimization and process development, due to the potential impact of bioprocessing conditions on product quality and on the nanoscale properties of IBs. Proper IB preparation is often neglected, due to laboratory-scale methods requiring large amounts of materials and labor. Miniaturization and parallelization can accelerate analyses of individual processing steps and provide a deeper understanding of up- and downstream processing interdependencies. Consequently, reproducible, predictive microscale methods are in demand. In the present study, we complemented a recently established high-throughput cell disruption method with a microscale method for preparing purified IBs. This preparation provided results comparable to laboratory-scale IB processing regarding impurity depletion and product loss. Furthermore, with this method, we performed a "design of experiments" study to demonstrate the influence of fermentation conditions on the performance of subsequent downstream steps and product quality. We showed that this approach provided a 300-fold reduction in material consumption for each fermentation condition and a 24-fold reduction in processing time for 24 samples.

  1. Research of Monte Carlo method used in simulation of different maintenance processes

    International Nuclear Information System (INIS)

    Zhao Siqiao; Liu Jingquan

    2011-01-01

    The paper introduces two kinds of Monte Carlo methods used in equipment life process simulation under the minimal maintenance condition: the method of generating the lifetime interval, and the method of time scale conversion. The paper also analyzes the characteristics and the applicable scope of the two methods. By using the concept of a service age reduction factor, a model of the equipment's life process under the incomplete maintenance condition is established, and a life process simulation method applicable to this situation is developed. (authors)
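
    The abstract does not give the simulation details; the sketch below shows one common way to use a service age reduction factor (a Kijima type I virtual-age model with Weibull lifetimes), with all parameter values chosen purely for illustration.

```python
# Virtual-age Monte Carlo sketch under incomplete maintenance (Kijima type I):
# each repair removes only a fraction (1 - q) of the age accumulated since the
# previous failure. Weibull lifetimes; eta, beta, q are illustrative values.
import numpy as np

def simulate_failures(horizon, eta=1000.0, beta=2.0, q=0.6, rng=None):
    rng = rng or np.random.default_rng()
    t, v, failures = 0.0, 0.0, []
    while True:
        u = rng.uniform()
        # time to next failure given virtual age v (conditional Weibull sampling)
        x = eta * ((v / eta) ** beta - np.log(u)) ** (1.0 / beta) - v
        t += x
        if t > horizon:
            return failures
        failures.append(t)
        v += q * x                       # incomplete repair: part of the new age remains

counts = [len(simulate_failures(5000.0)) for _ in range(2000)]
print(sum(counts) / len(counts))         # mean number of failures over the horizon
```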

  2. Process qualification and control in electron beams--requirements, methods, new concepts and challenges

    International Nuclear Information System (INIS)

    Mittendorfer, J.; Gratzl, F.; Hanis, D.

    2004-01-01

    In this paper the status of process qualification and control in electron beam irradiation is analyzed in terms of requirements, concepts, methods and challenges for a state-of-the-art process control concept for medical device sterilization. Aspects from process qualification to routine process control are described together with the associated process variables. As a case study, the 10 MeV beams at Mediscan GmbH are considered. Process control concepts such as statistical process control (SPC) and a new concept to determine process capability are briefly discussed

  3. Process control monitoring systems, industrial plants, and process control monitoring methods

    Science.gov (United States)

    Skorpik, James R [Kennewick, WA; Gosselin, Stephen R [Richland, WA; Harris, Joe C [Kennewick, WA

    2010-09-07

    A system comprises a valve; a plurality of RFID sensor assemblies coupled to the valve to monitor a plurality of parameters associated with the valve; a control tag configured to wirelessly communicate with the respective tags that are coupled to the valve, the control tag being further configured to communicate with an RF reader; and an RF reader configured to selectively communicate with the control tag, the reader including an RF receiver. Other systems and methods are also provided.

  4. [Influence of different processing methods on Angelica sinensis polysaccharides from same origin].

    Science.gov (United States)

    Lv, Jieli; Chen, Hongli; Duan, Jinao; Yan, Hui; Tang, Yuping; Song, Bingsheng

    2011-04-01

    To study the influences of different processing methods on the content of Angelica sinensis polysaccharides (APS) from the same origin. The contents of neutral polysaccharides and acidic polysaccharides in various samples of A. sinensis were determined by the phenol-sulfuric acid and carbazole-sulfuric acid methods, respectively. The proliferation ability of lymphocytes was detected by the MTT method after the cells were cultured with different concentrations of APS from two samples processed by different methods. The different processing methods had different effects on the polysaccharide contents. The maximum content of APS (26.03%) was found in the sample processed by microwave drying over medium heat, while the minimum content of APS (2.25%) was found in the sample processed by vacuum drying at 50 °C. Furthermore, the APS (high concentration group, P < 0.05) affected the proliferation ability of lymphocytes. Different processing methods thus have different effects on the contents of APS and on the proliferation ability of lymphocytes.

  5. Cement solidification method for miscellaneous radioactive solid, processing device and processing tool therefor

    International Nuclear Information System (INIS)

    Mihara, Shigeru; Suzuki, Kazunori; Hasegawa, Akira.

    1994-01-01

    A basket made of a metal net and a lid with a spacer, constituting a processing tool for miscellaneous radioactive solid wastes, are formed as a mesh which scarcely passes the miscellaneous solids but passes mortars. The size of the mesh is usually from about 10 to 30 mm. Since this mesh allows fine solids close to powders, such as burning ashes and heat insulation materials, to pass, they fall to the bottom of a drum can and cause corrosion. Therefore, the corners and the bottom of the drum can are coated with cement. The miscellaneous solid wastes are loaded, the lid of a metal net having a spacer at its upper portion is set, a provisional lid is put on, the can is evacuated, and mortars are injected. Since light and fine radioactive powders may be exposed on the surface of the mortars coagulated and hardened by curing, conditioning by further adding mortars is applied in order to prevent scattering of the radioactive powders. With such procedures, a satisfactorily safe solidified product can be formed. (T.M.)

  6. The measurement problem on classical diffusion process: inverse method on stochastic processes

    International Nuclear Information System (INIS)

    Bigerelle, M.; Iost, A.

    2004-01-01

    In a large number of diffusive systems, measurements are processed to calculate material parameters such as diffusion coefficients, or to verify the accuracy of mathematical models. However, the precision of the parameter determination or of the model relevance depends on the location of the measurement itself. The aim of this paper is first to analyse, for a one-dimensional system, the precision of the measurement in relation to its location by an inverse problem algorithm, and secondly to examine the physical meaning of the results. Statistical mechanics considerations show that, beyond a time-distance criterion, measurement becomes uncertain whatever the initial conditions. The criterion shows that this chaotic mode is related to the production of anti-entropy at a mesoscopic scale, which is in violation of the quantum theory of measurement

  7. Hollow fiber structures, methods of use thereof, methods of making, and pressure-retarded processes

    KAUST Repository

    Le, Lieu Ngoc; Bettahalli, Narasimha Murthy Srivatsa; Nunes, Suzana Pereira; Chung, Neal Tai-Shung

    2016-01-01

    Embodiments of the present disclosure provide for composite materials, methods of making composite materials, methods of using composite materials, and the like. In particular, the present application relates to hollow fibers and to pressure-retarded osmosis systems comprising said fibers. The hollow fibers have an inside layer and an outside layer, wherein the outside layer covers an outside surface of the inside layer, wherein the inside layer forms a boundary around the lumen, wherein the inside layer includes a bi-layer structure, wherein the bi-layer structure includes a sponge-like layer and a finger-like layer, wherein the sponge-like layer is disposed closer to the lumen of the hollow fiber and the finger-like layer is disposed on the sponge-like layer on the side opposite the lumen, wherein the outside layer includes a polyamide layer.

  8. Hollow fiber structures, methods of use thereof, methods of making, and pressure-retarded processes

    KAUST Repository

    Le, Lieu Ngoc

    2016-12-08

    Embodiments of the present disclosure provide for composite materials, methods of making composite materials, methods of using composite materials, and the like. In particular, the present application relates to hollow fibers and to pressure-retarded osmosis systems comprising said fibers. The hollow fibers have an inside layer and an outside layer, wherein the outside layer covers an outside surface of the inside layer, wherein the inside layer forms a boundary around the lumen, wherein the inside layer includes a bi-layer structure, wherein the bi-layer structure includes a sponge-like layer and a finger-like layer, wherein the sponge-like layer is disposed closer to the lumen of the hollow fiber and the finger-like layer is disposed on the sponge-like layer on the side opposite the lumen, wherein the outside layer includes a polyamide layer.

  9. The Role of Attention in Somatosensory Processing: A Multi-Trait, Multi-Method Analysis

    Science.gov (United States)

    Wodka, Ericka L.; Puts, Nicolaas A. J.; Mahone, E. Mark; Edden, Richard A. E.; Tommerdahl, Mark; Mostofsky, Stewart H.

    2016-01-01

    Sensory processing abnormalities in autism have largely been described by parent report. This study used a multi-method (parent-report and measurement), multi-trait (tactile sensitivity and attention) design to evaluate somatosensory processing in ASD. Results showed multiple significant within-method (e.g., parent report of different…

  10. The Scientific Method and the Creative Process: Implications for the K-6 Classroom

    Science.gov (United States)

    Nichols, Amanda J.; Stephens, April H.

    2013-01-01

    Science and the arts might seem very different, but the processes that both fields use are very similar. The scientific method is a way to explore a problem, form and test a hypothesis, and answer questions. The creative process creates, interprets, and expresses art. Inquiry is at the heart of both of these methods. The purpose of this article is…

  11. Memory Indexing: A Novel Method for Tracing Memory Processes in Complex Cognitive Tasks

    Science.gov (United States)

    Renkewitz, Frank; Jahn, Georg

    2012-01-01

    We validate an eye-tracking method applicable for studying memory processes in complex cognitive tasks. The method is tested with a task on probabilistic inferences from memory. It provides valuable data on the time course of processing, thus clarifying previous results on heuristic probabilistic inference. Participants learned cue values of…

  12. Solidification method for organic solution and processing method of aqueous solution

    International Nuclear Information System (INIS)

    Kamoshida, Mamoru; Fukazawa, Tetsuo; Yazawa, Noriko; Hasegawa, Toshihiko

    1998-01-01

    The relative dielectric constant of an organic solution containing polar ingredients is controlled to 13 or less to enable its solidification. The polarity of the organic solution can be evaluated quantitatively using the relative dielectric constant. If the relative dielectric constant is high, it can be lowered by dilution with a non-polar organic solvent of low relative dielectric constant. With such procedures, solidification can be conducted using economical 12-hydroxy stearic acid, the processing of liquid wastes can be facilitated, and safety can be ensured. (T.M.)

  13. A collaborative processes synchronization method with regard to system crashes and network failures

    NARCIS (Netherlands)

    Wang, Lei; Wombacher, Andreas; Ferreira Pires, Luis; van Sinderen, Marten J.; Chi, Chihung

    2014-01-01

    Processes can synchronize their states by exchanging messages. System crashes and network failures may cause message loss, so that state changes of a process may remain unnoticed by its partner processes, resulting in state inconsistency or deadlocks. In this paper we define a method to transform a

  14. A Bayesian MCMC method for point process models with intractable normalising constants

    DEFF Research Database (Denmark)

    Berthelsen, Kasper Klitgaard; Møller, Jesper

    2004-01-01

    to simulate from the "unknown distribution", perfect simulation algorithms become useful. We illustrate the method in cases where the likelihood is given by a Markov point process model. Particularly, we consider semi-parametric Bayesian inference in connection to both inhomogeneous Markov point process models...... and pairwise interaction point processes....

  15. Process Research Methods and Their Application in the Didactics of Text Production and Translation

    DEFF Research Database (Denmark)

    Dam-Jensen, Helle; Heine, Carmen

    2009-01-01

    not only as learners, but also as thinkers and problem solvers. This can be achieved by systematically applying knowledge from process research as this can give insight into mental and physical processes of text production. This article provides an overview of methods commonly used in process research...

  16. Effect of the method of processing on quality and oxidative stability ...

    African Journals Online (AJOL)

    In this study four samn samples prepared from cow milk using two processing methods (traditional T1, T2 and factory processed T3, T4) were investigated for their physico-chemical properties, fatty acids composition, oxidative stability and sensory properties. The traditionally processed samples showed a significance ...

  17. Effect of processing methods on the mechanical properties of engineered bamboo

    OpenAIRE

    Sharma, Bhavna; Gatóo, Ana; Ramage, Michael H.

    2015-01-01

    Engineered bamboo is increasingly explored as a material with significant potential for structural applications. The material is comprised of raw bamboo processed into a laminated composite. Commercial methods vary due to the current primary use as an architectural surface material, with processing used to achieve different colours in the material. The present work investigates the effect of two types of processing methods, bleaching and caramelisation, to determine the effect on the mechanic...

  18. Metal Removal Process Optimisation using Taguchi Method - Simplex Algorithm (TM-SA) with Case Study Applications

    OpenAIRE

    Ajibade, Oluwaseyi A.; Agunsoye, Johnson O.; Oke, Sunday A.

    2018-01-01

    In the metal removal process industry, the current practice to optimise cutting parameters adopts a conventional method. It is based on trial and error, in which the machine operator uses experience, coupled with handbook guidelines, to determine optimal parametric values of choice. This method is not accurate, is time-consuming and costly. Therefore, there is a need for a method that is scientific, cost-effective and precise. Keeping this in mind, a different direction for process optimisation is ...

  19. 40 CFR 63.115 - Process vent provisions-methods and procedures for process vent group determination.

    Science.gov (United States)

    2010-07-01

    ... this section. (2) The gas volumetric flow rate shall be determined using Method 2, 2A, 2C, or 2D of 40... accepted chemical engineering principles, measurable process parameters, or physical or chemical laws or...)(3) of this section. (i) The vent stream volumetric flow rate (Qs), in standard cubic meters per...

  20. A radiometric method for the characterization of particulate processes in colloidal suspensions. II

    International Nuclear Information System (INIS)

    Subotic, B.

    1979-01-01

    A radiometric method for the characterization of particulate processes is verified using stable hydrosols of silver iodide. Silver iodide hydrosols satisfy the conditions required for the applications of the proposed method. Comparison shows that the values for the change of particle size measured in silver iodide hydrosols by the proposed method are in excellent agreement with the values obtained by other methods on the same systems (electron microscopy, sedimentation analysis, light scattering). This shows that the proposed method is suitable for the characterization of particulate processes in colloidal suspensions. (Auth.

  1. MULTIPLE CRITERA METHODS WITH FOCUS ON ANALYTIC HIERARCHY PROCESS AND GROUP DECISION MAKING

    Directory of Open Access Journals (Sweden)

    Lidija Zadnik-Stirn

    2010-12-01

    Full Text Available Managing natural resources is a group multiple criteria decision making problem. In this paper the analytic hierarchy process is the chosen method for handling natural resource problems. The single decision maker problem is discussed, and three methods (the eigenvector method, the data envelopment analysis method, and the logarithmic least squares method) are presented for the derivation of the priority vector. Further, the group analytic hierarchy process is discussed and six methods for the aggregation of individual judgments or priorities (the weighted arithmetic mean method, the weighted geometric mean method, and four methods based on data envelopment analysis) are compared. A case study on land use in Slovenia is applied. The conclusions review consistency, sensitivity analyses, and some future directions of research.
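
    The sketch below illustrates two of the ingredients listed above: deriving a priority vector from a single pairwise-comparison matrix via its principal eigenvector (power iteration), and aggregating several decision makers' matrices by a weighted geometric mean; the comparison values are made up for illustration, not taken from the case study.

```python
# AHP sketch: principal-eigenvector priorities and weighted-geometric-mean
# aggregation of individual judgments (illustrative matrices).
import numpy as np

def priority_vector(A, iters=100):
    w = np.ones(A.shape[0]) / A.shape[0]
    for _ in range(iters):               # power iteration on the positive matrix A
        w = A @ w
        w /= w.sum()
    return w

def aggregate_judgments(matrices, weights):
    """Element-wise weighted geometric mean of individual comparison matrices."""
    logs = sum(w * np.log(M) for w, M in zip(weights, matrices))
    return np.exp(logs)

A1 = np.array([[1, 3, 5], [1/3, 1, 2], [1/5, 1/2, 1]])
A2 = np.array([[1, 2, 4], [1/2, 1, 3], [1/4, 1/3, 1]])
group = aggregate_judgments([A1, A2], weights=[0.5, 0.5])
print(priority_vector(group))            # group priorities for the three alternatives
```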

  2. The Influence of Different Processing Methods on Component Content of Sophora japonica

    Science.gov (United States)

    Ji, Y. B.; Zhu, H. J.; Xin, G. S.; Wei, C.

    2017-12-01

    The purpose of this experiment is to understand the effect of different processing methods on the content of active ingredients in Sophora japonica. The contents of rutin and quercetin in Sophora japonica under different processing methods were determined by UV spectrophotometry, so as to compare the effect of different processing methods on the active ingredient content of Sophora japonica. The experiments show the rutin content in the order: Fried Sophora japonica > Vinegar sunburn Sophora > Health products Sophora japonica > Charred sophora flower, with no obvious difference between Vinegar sunburn Sophora and Fried Sophora japonica; and the quercetin content in the order: Charred sophora flower > Fried Sophora japonica > Vinegar sunburn Sophora > Health products Sophora japonica. It is proved that there are some differences in the content of active ingredients in Sophora japonica under different processing methods. The content of rutin increased with increasing processing temperature, but decreased after a certain temperature; the quercetin content increases gradually with time.

  3. A novel process control method for a TT-300 E-Beam/X-Ray system

    Science.gov (United States)

    Mittendorfer, Josef; Gallnböck-Wagner, Bernhard

    2018-02-01

    This paper presents some aspects of the process control method for a TT-300 E-Beam/X-Ray system at Mediscan, Austria. The novelty of the approach is the seamless integration of routine monitoring dosimetry with process data. This allows a parametric dose to be calculated for each production unit and consequently enables fine-grained, holistic process performance monitoring. Process performance is documented in process control charts for the analysis of individual runs as well as for historic trending of runs of specific process categories over a specified time range.

  4. PROCESS CAPABILITY ESTIMATION FOR NON-NORMALLY DISTRIBUTED DATA USING ROBUST METHODS - A COMPARATIVE STUDY

    Directory of Open Access Journals (Sweden)

    Yerriswamy Wooluru

    2016-06-01

    Full Text Available Process capability indices are very important process quality assessment tools in the automotive industry. The common process capability indices (PCIs) Cp, Cpk and Cpm are widely used in practice. The use of these PCIs is based on the assumption that the process is in control and its output is normally distributed. In practice, normality is not always fulfilled. Indices developed based on the normality assumption are very sensitive to non-normal processes. When the distribution of a product quality characteristic is non-normal, Cp and Cpk indices calculated using conventional methods often lead to erroneous interpretation of process capability. In the literature, various methods have been proposed for surrogate process capability indices under non-normality, but few literature sources offer a comprehensive evaluation and comparison of their ability to capture true capability in non-normal situations. In this paper, five methods are reviewed and a capability evaluation is carried out for data pertaining to the resistivity of silicon wafers. The final results reveal that the Burr-based percentile method is better than the Clements method. Modelling of non-normal data and the Box-Cox transformation method using statistical software (Minitab 14) provide reasonably good results, as they are very promising methods for non-normal and moderately skewed data (skewness <= 1.5).
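
    The conventional indices referred to above are Cp = (USL - LSL) / (6 sigma) and Cpk = min(USL - mu, mu - LSL) / (3 sigma); the sketch below contrasts computing them directly on skewed data with computing them after a Box-Cox transformation of both the data and the specification limits (the data and limits are invented for illustration, not the paper's wafer data).

```python
# Cp/Cpk on skewed data, before and after a Box-Cox transformation
# (illustrative data; not the silicon-wafer resistivity data of the paper).
import numpy as np
from scipy import stats

def cp_cpk(x, lsl, usl):
    mu, sigma = np.mean(x), np.std(x, ddof=1)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

rng = np.random.default_rng(2)
x = rng.lognormal(mean=0.0, sigma=0.4, size=500)              # skewed quality characteristic
lsl, usl = 0.3, 3.0

print(cp_cpk(x, lsl, usl))                                    # conventional, can mislead
xt, lam = stats.boxcox(x)                                     # transform the data ...
lsl_t, usl_t = stats.boxcox(np.array([lsl, usl]), lmbda=lam)  # ... and the spec limits
print(cp_cpk(xt, lsl_t, usl_t))                               # indices on transformed scale
```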

  5. [Development of an automated processing method to detect coronary motion for coronary magnetic resonance angiography].

    Science.gov (United States)

    Asou, Hiroya; Imada, N; Sato, T

    2010-06-20

    On coronary MR angiography (CMRA), cardiac motion worsens the image quality. To improve the image quality, detection of cardiac motion, especially of individual coronary motion, is very important. Usually, scan delay and duration are determined manually by the operator. We developed a new evaluation method to calculate the static time of each individual coronary artery. First, coronary cine MRI was acquired at the level of about 3 cm below the aortic valve (80 images/R-R). The chronological change of the signal in each pixel of the images was evaluated with a Fourier transformation. Noise reduction with a subtraction process and an extraction process was performed. To extract structures with higher motion, such as the coronary arteries, a morphological filter process and a labeling process were added. Using these image processing steps, individual coronary motion was extracted and the individual coronary static time was calculated automatically. We compared the ordinary manual method and the new automated method in 10 healthy volunteers. Coronary static times were calculated with our method. The calculated coronary static time was shorter than that of the ordinary manual method, and the scan time became about 10% longer than that of the ordinary method. Image quality was improved with our method. Our automated detection method for coronary static time based on chronological Fourier transformation has the potential to improve the image quality of CMRA with easy processing.
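
    A rough sketch of the per-pixel frequency analysis described above: the signal of each pixel across the cine frames is Fourier transformed, and pixels with large high-frequency power are marked as moving structures. The array sizes, cutoff bin and threshold below are arbitrary placeholders, not the paper's actual processing chain.

```python
# Per-pixel temporal FFT sketch: flag pixels whose high-frequency power is large,
# as a crude "motion map" (sizes, cutoff and quantile are illustrative only).
import numpy as np

def motion_map(cine, cutoff_bin=5, quantile=0.95):
    """cine: (n_frames, ny, nx) array covering one R-R interval (e.g. 80 frames)."""
    spectrum = np.abs(np.fft.rfft(cine, axis=0))            # per-pixel temporal spectrum
    high_power = spectrum[cutoff_bin:].sum(axis=0)          # energy above the cutoff bin
    return high_power > np.quantile(high_power, quantile)   # boolean motion mask

cine = np.random.default_rng(3).normal(size=(80, 64, 64))   # stand-in for cine MRI data
mask = motion_map(cine)
print(mask.sum(), "pixels flagged as moving")
```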

  6. Data Processing And Machine Learning Methods For Multi-Modal Operator State Classification Systems

    Science.gov (United States)

    Hearn, Tristan A.

    2015-01-01

    This document is intended as an introduction to a set of common signal processing learning methods that may be used in the software portion of a functional crew state monitoring system. This includes overviews of both the theory of the methods involved, as well as examples of implementation. Practical considerations are discussed for implementing modular, flexible, and scalable processing and classification software for a multi-modal, multi-channel monitoring system. Example source code is also given for all of the discussed processing and classification methods.

  7. Influence of harvesting and processing methods on organic viability of soybean seed

    Directory of Open Access Journals (Sweden)

    Đukanović Lana

    2000-01-01

    Full Text Available The organic viability of soybean seed of three elite soybean varieties (Bosa, ZPS 015 and Nena), depending on the methods of seed manipulation during the harvesting and processing phases, was determined in this paper. The trial was conducted in Zemun Polje during 1999; manual and mechanized harvesting or processing methods were applied. Seed germination was tested using ISTA methods (standard method and cold test). The following parameters were evaluated: germination viability, germination, rate (speed) of emergence, and length of hypocotyl and main root. Rate of emergence was based on the number of emerged plants per day. The length of hypocotyl or root and the percentage of germination determined the vigour index. Based on the obtained results, it may be concluded that the methods of seed manipulation during the harvesting or processing phase influenced the soybean seed quality parameters evaluated. The evaluated methods of seed manipulation affected the organic viability of soybean seed by decreasing germination viability, total germination and the length of the main root.

  8. Implementation of a new rapid tissue processing method--advantages and challenges

    DEFF Research Database (Denmark)

    Munkholm, Julie; Talman, Maj-Lis; Hasselager, Thomas

    2008-01-01

    Conventional tissue processing of histologic specimens has been carried out in the same manner for many years. It is a time-consuming process involving batch production, resulting in a 1-day delay of the diagnosis. Microwave-assisted tissue processing enables a continuous high flow of histologic specimens through the processor with a processing time as low as 1 h. In this article, we present the effects of the automated microwave-assisted tissue processor on the histomorphologic quality and the turnaround time (TAT) for histopathology reports. We present a blind comparative study regarding the histomorphologic quality of microwave-processed and conventionally processed tissue samples. A total of 333 specimens were included. The microwave-assisted processing method showed a histomorphologic quality comparable to the conventional method for a number of tissue types, including skin and specimens from...

  9. Analysis of the overall energy intensity of alumina refinery process using unit process energy intensity and product ratio method

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Liru; Aye, Lu [International Technologies Center (IDTC), Department of Civil and Environmental Engineering,The University of Melbourne, Vic. 3010 (Australia); Lu, Zhongwu [Institute of Materials and Metallurgy, Northeastern University, Shenyang 110004 (China); Zhang, Peihong [Department of Municipal and Environmental Engineering, Shenyang Architecture University, Shenyang 110168 (China)

    2006-07-15

    Alumina refinery is an energy intensive industry. Traditional energy saving methods employed have been single-equipment-orientated. Based on two concepts of 'energy carrier' and 'system', this paper presents a method that analyzes the effects of unit process energy intensity (e) and product ratio (p) on overall energy intensity of alumina. The important conclusion drawn from this method is that it is necessary to decrease both the unit process energy intensity and the product ratios in order to decrease the overall energy intensity of alumina, which may be taken as a future policy for energy saving. As a case study, the overall energy intensity of the Chinese Zhenzhou alumina refinery plant with Bayer-sinter combined method between 1995 and 2000 was analyzed. The result shows that the overall energy intensity of alumina in this plant decreased by 7.36 GJ/t-Al2O3 over this period, 49% of total energy saving is due to direct energy saving, and 51% is due to indirect energy saving. The emphasis in this paper is on decreasing product ratios of high-energy consumption unit processes, such as evaporation, slurry sintering, aluminium trihydrate calcining and desilication. Energy savings can be made (1) by increasing the proportion of Bayer and indirect digestion, (2) by increasing the grade of ore by ore dressing or importing some rich gibbsite and (3) by promoting the advancement in technology. (author)
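
    The e-p relation underlying this analysis can be written as overall energy intensity E = sum over unit processes of p_i * e_i; the sketch below applies it to placeholder numbers (not the plant data reported above) to show how lowering either e_i or p_i reduces E.

```python
# The e-p relation in sketch form: overall energy intensity is the sum over unit
# processes of (product ratio p_i) x (unit energy intensity e_i). Placeholder numbers.
unit_processes = {
    #                     p_i (t intermediate / t alumina), e_i (GJ / t intermediate)
    'digestion':          (1.0, 6.0),
    'evaporation':        (2.5, 1.8),
    'slurry sintering':   (0.8, 4.0),
    'hydrate calcining':  (1.0, 3.2),
}
overall = sum(p * e for p, e in unit_processes.values())
print(f"overall energy intensity: {overall:.2f} GJ/t-Al2O3")
```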

  10. Composite media for fluid stream processing, a method of forming the composite media, and a related method of processing a fluid stream

    Science.gov (United States)

    Garn, Troy G; Law, Jack D; Greenhalgh, Mitchell R; Tranter, Rhonda

    2014-04-01

    A composite media including at least one crystalline aluminosilicate material in polyacrylonitrile. A method of forming a composite media is also disclosed. The method comprises dissolving polyacrylonitrile in an organic solvent to form a matrix solution. At least one crystalline aluminosilicate material is combined with the matrix solution to form a composite media solution. The organic solvent present in the composite media solution is diluted. The composite media solution is solidified. In addition, a method of processing a fluid stream is disclosed. The method comprises providing beads of a composite media comprising at least one crystalline aluminosilicate material dispersed in a polyacrylonitrile matrix. The beads of the composite media are contacted with a fluid stream comprising at least one constituent. The at least one constituent is substantially removed from the fluid stream.

  11. Testing of the Defense Waste Processing Facility Cold Chemical Dissolution Method in Sludge Batch 9 Qualification

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, T. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Pareizs, J. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Coleman, C. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Young, J. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Brown, L. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2016-05-10

    For each sludge batch that is processed in the Defense Waste Processing Facility (DWPF), the Savannah River National Laboratory (SRNL) tests the applicability of the digestion methods used by the DWPF Laboratory for elemental analysis of Sludge Receipt and Adjustment Tank (SRAT) Receipt samples and SRAT Product process control samples. DWPF SRAT samples are typically dissolved using a method referred to as the DWPF Cold Chemical or Cold Chem Method (CC), (see DWPF Procedure SW4- 15.201). Testing indicates that the CC method produced mixed results. The CC method did not result in complete dissolution of either the SRAT Receipt or SRAT Product with some fine, dark solids remaining. However, elemental analyses did not reveal extreme biases for the major elements in the sludge when compared with analyses obtained following dissolution by hot aqua regia (AR) or sodium peroxide fusion (PF) methods. The CC elemental analyses agreed with the AR and PF methods well enough that it should be adequate for routine process control analyses in the DWPF after much more extensive side-by-side tests of the CC method and the PF method are performed on the first 10 SRAT cycles of the Sludge Batch 9 (SB9) campaign. The DWPF Laboratory should continue with their plans for further tests of the CC method during these 10 SRAT cycles.

  12. First-order Convex Optimization Methods for Signal and Image Processing

    DEFF Research Database (Denmark)

    Jensen, Tobias Lindstrøm

    2012-01-01

    In this thesis we investigate the use of first-order convex optimization methods applied to problems in signal and image processing. First we make a general introduction to convex optimization, first-order methods and their iteration complexity. Then we look at different techniques, which can be used with first-order methods such as smoothing, Lagrange multipliers and proximal gradient methods. We continue by presenting different applications of convex optimization and notable convex formulations with an emphasis on inverse problems and sparse signal processing. We also describe the multiple...
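
    Of the techniques listed, the proximal gradient method is easy to show concretely; the sketch below is a basic ISTA loop for l1-regularised least squares (a standard sparse-recovery formulation used only as an illustration, not a result from the thesis), with synthetic data.

```python
# ISTA (proximal gradient) sketch for min_x 0.5*||Ax - b||^2 + lam*||x||_1,
# with step size 1/L where L = ||A||_2^2 is the gradient's Lipschitz constant.
import numpy as np

def ista(A, b, lam, iters=500):
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)
        z = x - grad / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # soft-thresholding prox
    return x

rng = np.random.default_rng(4)
A = rng.normal(size=(60, 100))
x_true = np.zeros(100)
x_true[[5, 40, 77]] = [1.0, -2.0, 1.5]
b = A @ x_true + 0.01 * rng.normal(size=60)
print(np.nonzero(np.abs(ista(A, b, lam=0.1)) > 0.1)[0])         # recovers indices 5, 40, 77
```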

  13. [Precautions of physical performance requirements and test methods during product standard drafting process of medical devices].

    Science.gov (United States)

    Song, Jin-Zi; Wan, Min; Xu, Hui; Yao, Xiu-Jun; Zhang, Bo; Wang, Jin-Hong

    2009-09-01

    The major idea of this article is to discuss standardization and normalization for the product standard of medical devices. It analyzes the problems related to the physical performance requirements and test methods during the product standard drafting process and makes corresponding suggestions.

  14. A review of the uses and methods of processing banana and ...

    African Journals Online (AJOL)

    Journal of Agricultural Research and Development ... Different processing methods of Musa spp. into new food products which include production of flour, preparation of jams and jellies and the quality attributes of the products obtained from ...

  15. The advanced CECE process for enriching tritium by the chemical exchange method with a hydrophobic catalyst

    International Nuclear Information System (INIS)

    Kitamoto, Asashi; Shimizu, Masami; Masui, Takashi.

    1992-01-01

    The monothermal chemical exchange process with electrolysis, i.e., the CECE process, is an effective method for enriching and removing tritium from tritiated water with low to middle level activity. The purpose of this study is to propose the theoretical background of the two-parameter evaluation method, which is based on a two-step isotope exchange reaction between hydrogen gas and liquid water, for improving the performance of a hydrophobic catalyst in a trickle bed-type column. The two-parameter method could attain the highest isotope separation performance and the lowest liquid holdup for a trickle bed-type column, and will therefore provide effective and practical procedures for scaling up a tritium enrichment process. The main aspect of the CECE process in engineering design and system evaluation was to develop an isotope exchange column with a high performance catalyst. (author)

  16. Applying Process Improvement Methods to Clinical and Translational Research: Conceptual Framework and Case Examples.

    Science.gov (United States)

    Daudelin, Denise H; Selker, Harry P; Leslie, Laurel K

    2015-12-01

    There is growing appreciation that process improvement holds promise for improving quality and efficiency across the translational research continuum but frameworks for such programs are not often described. The purpose of this paper is to present a framework and case examples of a Research Process Improvement Program implemented at Tufts CTSI. To promote research process improvement, we developed online training seminars, workshops, and in-person consultation models to describe core process improvement principles and methods, demonstrate the use of improvement tools, and illustrate the application of these methods in case examples. We implemented these methods, as well as relational coordination theory, with junior researchers, pilot funding awardees, our CTRC, and CTSI resource and service providers. The program focuses on capacity building to address common process problems and quality gaps that threaten the efficient, timely and successful completion of clinical and translational studies. © 2015 The Authors. Clinical and Translational Science published by Wiley Periodicals, Inc.

  17. SELECTION OF NON-CONVENTIONAL MACHINING PROCESSES USING THE OCRA METHOD

    Directory of Open Access Journals (Sweden)

    Miloš Madić

    2015-04-01

    Full Text Available Selection of the most suitable nonconventional machining process (NCMP) for a given machining application can be viewed as a multi-criteria decision making (MCDM) problem with many conflicting and diverse criteria. To aid these selection processes, different MCDM methods have been proposed. This paper introduces the use of an almost unexplored MCDM method, i.e. the operational competitiveness ratings analysis (OCRA) method, for solving NCMP selection problems. The applicability, suitability and computational procedure of the OCRA method have been demonstrated while solving three case studies dealing with selection of the most suitable NCMP. In each case study the obtained rankings were compared with those derived by past researchers using different MCDM methods. The results obtained using the OCRA method have good correlation with those derived by past researchers, which validates the usefulness of this method for solving complex NCMP selection problems.

  18. Systematic methods for synthesis and design of sustainable chemical and biochemical processes

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    Chemical and biochemical process design consists of designing the process that can sustainably manufacture an identified chemical product through a chemical or biochemical route. The chemical product tree is potentially very large; starting from a set of basic raw materials (such as petroleum...... for process intensification, sustainable process design, identification of optimal biorefinery models as well as integrated process-control design, and chemical product design. The lecture will present the main concepts, the decomposition based solution approach, the developed methods and tools together...

  19. Quality control of roll-to-roll processed polymer solar modules by complementary imaging methods

    DEFF Research Database (Denmark)

    Rösch, R.; Krebs, Frederik C; Tanenbaum, D.M.

    2012-01-01

    We applied complementary imaging methods to investigate processing failures of roll-to-roll solution processed polymer solar modules based on polymer:fullerene bulk heterojunctions. For investigation of processing deficiencies in solar modules we employed dark lock-in thermography (DLIT), electroluminescence (ELI) and photoluminescence/reflection imaging (PLI/RI) complemented by optical imaging (OI). The combination of all high resolution images allowed us to allocate the origin of processing errors to a specific deposition process, i.e. the insufficient coverage of an electrode interlayer...

  20. A method for automated processing of measurement information during mechanical drilling

    Energy Technology Data Exchange (ETDEWEB)

    Samonenko, V.I.; Belinkov, V.G.; Romanova, L.A.

    1984-01-01

    An algorithm is presented for the developed method for automated processing of measurement information during mechanical drilling. Its use under the operating conditions of an automated drilling control system (ASU) will make it possible to precisely identify changes in the lithology, the physical, mechanical and abrasive properties, and the stratum (pore) pressure of the rock being drilled during mechanical drilling, which along with other methods for monitoring the drilling process will increase the reliability of the decisions made.

  1. Application of the dual reciprocity boundary element method for numerical modelling of solidification process

    Directory of Open Access Journals (Sweden)

    E. Majchrzak

    2008-12-01

    Full Text Available The dual reciprocity boundary element method is applied for the numerical modelling of the solidification process. This variant of the BEM is connected with the transformation of the domain integral to boundary integrals. In the paper the details of the dual reciprocity boundary element method are presented and the usefulness of this approach to solidification process modelling is demonstrated. In the final part of the paper examples of computations are shown.

  2. The application of nursing process method in training nurses working in the department of interventional radiology

    International Nuclear Information System (INIS)

    Ni Daihui; Wang Hongjuan; Yang Yajuan; Ye Rui; Qu Juan; Li Xinying; Xu Ying

    2010-01-01

    Objective: To describe the training procedure, typical training method and clinical effect of the nursing process method used to train nurses working in the interventional ward. Methods: According to the evaluation index, the authors made a detailed assessment of each nurse and identified the problems that needed to be addressed individually; then, practicable measures were drawn up for each individual nurse, and after the training course the clinical results were evaluated. Results: After nurses at different technical levels were trained with the nursing process method, the comprehensive quality of each nurse was improved to a different degree, and the general nursing quality of the entire department was also markedly improved. Conclusion: By using the nursing process method the training period can be effectively shortened, and the possible waste of time, manpower, material and energy caused by a blind training plan can be avoided. (authors)

  3. An Efficient Method to Search Real-Time Bulk Data for an Information Processing System

    International Nuclear Information System (INIS)

    Kim, Seong Jin; Kim, Jong Myung; Suh, Yong Suk; Keum, Jong Yong; Park, Heui Youn

    2005-01-01

    The Man Machine Interface System (MMIS) of the System-integrated Modular Advanced ReacTor (SMART) is designed with fully digitalized features. The Information Processing System (IPS) of the MMIS acquires and processes plant data from other systems. In addition, the IPS provides plant operation information to operators in the control room. The IPS is required to process bulky data in real time, so it is necessary to consider a special processing method with regard to flexibility and performance, because more than a few thousand plant information points converge on the IPS. Among other things, the processing time for searching the bulk data consumes much more than the other processing times. Thus, this paper explores an efficient method for the search and examines its feasibility

  4. Development of X-ray radiography examination technology by image processing method

    Energy Technology Data Exchange (ETDEWEB)

    Min, Duck Kee; Koo, Dae Seo; Kim, Eun Ka [Korea Atomic Energy Research Institute, Taejon (Korea)

    1998-06-01

    Because the dimensions of nuclear fuel rods can be measured rapidly and accurately by X-ray radiography examination, an image processing system composed of a 979 CCD-L camera, an image processing card and fluorescent lighting was set up, which enabled image processing to be performed. The X-ray radiography examination technology, which enables dimension measurement of nuclear fuel rods, was developed by an image processing method. The dimension measurement of a standard fuel rod by the image processing method showed a 2% reduction in relative measuring error compared with that of X-ray radiography film, and was better by 100-200 μm in measuring accuracy. (author). 9 refs., 22 figs., 3 tabs.

  5. Method for qualification of cementation processes and its application to a vibration mixer

    International Nuclear Information System (INIS)

    Vicente, R.; Rzyski, B.M.; Suarez, A.A.

    1987-01-01

    In this paper the definition of homogeneity is discussed and methods to measure the 'degree of heterogeneity' of waste forms are proposed. These measurements are important as aids for mixing process qualification, and as tools in quality assurance procedures and in the development of waste management standards. Homogeneity is a basic quality requirement for waste forms to be accepted in final disposal sites. It does not depend on the immobilization matrix; rather, it is one means for qualification of the immobilization process. The proposed methods were applied to a vibration-assisted mixing process and have proved to be a useful means to judge process improvements. There are many conceivable methods to evaluate the homogeneity of waste forms. Some were selected as screening tests aiming at quickly reaching a promising set of process variables. Others were selected to evaluate the degree of excellence of the process with respect to product quality. The envisaged methods were: visual inspection, the use of cement dye as a tracer, scanning of radioactive tracers, and measurements of variations of density, water absorption, porosity and mechanical strength across the waste form sample. The process variables were: waste-cement and water-cement ratios, mixer geometry, mixing time and vibration intensity. Some of the apparatus details were changed during the experimental work in order to improve product quality. The experimental methods and results were statistically analysed and compared with data obtained from samples prepared with a planetary paddle mixer, which were adopted as the homogeneity standard. (Author)

  6. Effects of processing methods on nutritive values of Ekuru from two ...

    African Journals Online (AJOL)

    Beans contain substantial amounts of protein, dietary fibre, B-vitamins, minerals, and anti-nutrients which limit their utilisation. Processing reduces the level of antinutrients in plant products, but little information exists on the effects of processing methods on the nutrient and antinutrient composition of bean products. This study was ...

  7. The Open Method of Coordination and the Implementation of the Bologna Process

    Science.gov (United States)

    Veiga, Amelia; Amaral, Alberto

    2006-01-01

    In this paper the authors argue that the use of the Open Method of Coordination (OMC) in the implementation of the Bologna process presents coordination problems that do not allow for the full coherence of the results. As the process is quite complex, involving three different levels (European, national and local) and as the final actors in the…

  8. A robust method for processing scanning probe microscopy images and determining nanoobject position and dimensions

    NARCIS (Netherlands)

    Silly, F.

    2009-01-01

    Processing of scanning probe microscopy (SPM) images is essential to explore nanoscale phenomena. Image processing and pattern recognition techniques are developed to improve the accuracy and consistency of nanoobject and surface characterization. We present a robust and versatile method to

  9. Surface Nano Structures Manufacture Using Batch Chemical Processing Methods for Tooling Applications

    DEFF Research Database (Denmark)

    Tosello, Guido; Calaon, Matteo; Gavillet, J.

    2011-01-01

    The patterning of large surface areas with nano structures by using chemical batch processes, in order to avoid high-energy-intensive nano machining processes, was investigated. The capability of different surface treatment methods of creating micro and nano structured adaptable mould inserts for subse...

  10. Process and research method of radionuclide migration in high level radioactive waste geological disposal system

    International Nuclear Information System (INIS)

    Chen Rui; Zhang Zhanshi

    2014-01-01

    Radionuclides released from waste can migrate from the repository into the surrounding rock and soil. On the other hand, nuclides are also retarded by the backfill material. Radionuclide migration is the main geochemical process in waste disposal. This paper introduces various methods for radionuclide migration research and gives a brief analysis of the geochemical processes of radionuclide migration. Finally, two of the most important processes of radionuclide migration are presented as examples. (authors)

  11. Statistical process control methods allow the analysis and improvement of anesthesia care.

    Science.gov (United States)

    Fasting, Sigurd; Gisvold, Sven E

    2003-10-01

    Quality aspects of the anesthetic process are reflected in the rate of intraoperative adverse events. The purpose of this report is to illustrate how the quality of the anesthesia process can be analyzed using statistical process control methods, and exemplify how this analysis can be used for quality improvement. We prospectively recorded anesthesia-related data from all anesthetics for five years. The data included intraoperative adverse events, which were graded into four levels, according to severity. We selected four adverse events, representing important quality and safety aspects, for statistical process control analysis. These were: inadequate regional anesthesia, difficult emergence from general anesthesia, intubation difficulties and drug errors. We analyzed the underlying process using 'p-charts' for statistical process control. In 65,170 anesthetics we recorded adverse events in 18.3%; mostly of lesser severity. Control charts were used to define statistically the predictable normal variation in problem rate, and then used as a basis for analysis of the selected problems with the following results: Inadequate plexus anesthesia: stable process, but unacceptably high failure rate; Difficult emergence: unstable process, because of quality improvement efforts; Intubation difficulties: stable process, rate acceptable; Medication errors: methodology not suited because of low rate of errors. By applying statistical process control methods to the analysis of adverse events, we have exemplified how this allows us to determine if a process is stable, whether an intervention is required, and if quality improvement efforts have the desired effect.
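
    As a minimal sketch of the p-chart analysis described above; the adverse-event counts and subgroup sizes below are invented for illustration and are not taken from the study.

      import math

      # Hypothetical monthly subgroups: anesthetics performed and adverse events observed.
      n = [520, 480, 510, 495, 530, 505]          # subgroup sizes (assumed)
      x = [11, 9, 14, 10, 30, 12]                 # adverse events per subgroup (assumed)

      p_bar = sum(x) / sum(n)                     # overall problem rate

      for ni, xi in zip(n, x):
          p_i = xi / ni
          sigma = math.sqrt(p_bar * (1 - p_bar) / ni)
          ucl = p_bar + 3 * sigma                 # upper control limit
          lcl = max(0.0, p_bar - 3 * sigma)       # lower control limit (floored at 0)
          flag = "in control" if lcl <= p_i <= ucl else "out of control"
          print(f"p={p_i:.3f}  LCL={lcl:.3f}  UCL={ucl:.3f}  -> {flag}")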

  12. Application of PROMETHEE-GAIA method for non-traditional machining processes selection

    Directory of Open Access Journals (Sweden)

    Prasad Karande

    2012-10-01

    Full Text Available With the ever increasing demand for manufactured products of hard alloys and metals with high surface finish and complex shape geometry, more interest is now being paid to non-traditional machining (NTM) processes, in which energy in its direct form is used to remove material from the workpiece surface. Compared to conventional machining processes, NTM processes possess almost unlimited capabilities, and there is a strong belief that the use of NTM processes will keep increasing in a diverse range of applications. The presence of a large number of NTM processes, along with their complex characteristics and capabilities, and the lack of experts in the NTM process selection domain call for the development of a structured approach to NTM process selection for a given machining application. Past researchers have attempted to solve NTM process selection problems using various complex mathematical approaches which often require profound knowledge of mathematics/artificial intelligence on the part of process engineers. In this paper, four NTM process selection problems are solved using an integrated PROMETHEE (preference ranking organization method for enrichment evaluation) and GAIA (geometrical analysis for interactive aid) method which acts as a visual decision aid for process engineers. The observed results are quite satisfactory and exactly match the expected solutions.
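
    For orientation, a minimal PROMETHEE II net-flow computation is sketched below using the usual (strict preference) criterion; the alternatives, criteria and weights are invented for illustration and are not the paper's case data.

      import numpy as np

      # Hypothetical decision matrix: rows = NTM process alternatives, columns = criteria.
      # Criteria are assumed to be benefit-type (larger is better).
      X = np.array([[7.0, 5.0, 8.0],
                    [6.0, 9.0, 4.0],
                    [8.0, 6.0, 6.0]])
      w = np.array([0.5, 0.3, 0.2])        # criterion weights (assumed), summing to 1

      n = X.shape[0]
      pi = np.zeros((n, n))                # aggregated preference indices
      for a in range(n):
          for b in range(n):
              if a == b:
                  continue
              # "Usual" preference function: 1 if a beats b on a criterion, else 0.
              pref = (X[a] > X[b]).astype(float)
              pi[a, b] = np.dot(w, pref)

      phi_plus = pi.sum(axis=1) / (n - 1)   # positive outranking flow
      phi_minus = pi.sum(axis=0) / (n - 1)  # negative outranking flow
      phi = phi_plus - phi_minus            # net flow -> PROMETHEE II ranking
      print("net flows:", np.round(phi, 3), "ranking:", np.argsort(-phi))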

  13. Defect recognition in CFRP components using various NDT methods within a smart manufacturing process

    Science.gov (United States)

    Schumacher, David; Meyendorf, Norbert; Hakim, Issa; Ewert, Uwe

    2018-04-01

    The manufacturing process of carbon fiber reinforced polymer (CFRP) components is gaining a more and more significant role in view of the increasing amount of CFRPs used in industry today. The monitoring of the manufacturing process, and hence the reliability of the manufactured products, is one of the major challenges we need to face in the near future. Common defects which arise during the manufacturing process are, e.g., porosity and voids, which may lead to delaminations during operation and under load. Finding irregularities and classifying them as possible defects in an early stage of the manufacturing process is of high importance for the safety and reliability of the finished products, and also has a significant economic impact. In this study we compare various NDT methods which were applied to similar CFRP laminate samples in order to detect and characterize regions of defective volume. Besides ultrasound, thermography and eddy current, different X-ray methods like radiography, laminography and computed tomography are used to investigate the samples. These methods are compared with the intention of evaluating their capability to reliably detect and characterize defective volume. Beyond the detection and evaluation of defects, we also investigate possibilities to combine various NDT methods within a smart manufacturing process in which the decision of which method shall be applied is inherent within the process. Is it possible to design an in-line or at-line testing process which can recognize defects reliably and reduce testing time and costs? This study aims to show opportunities for designing a smart NDT process synchronized to production based on the concepts of smart production (Industry 4.0). A set of defective CFRP laminate samples and different NDT methods were used to demonstrate how effectively defects are recognized and how communication between interconnected NDT sensors and the manufacturing process could be organized.

  14. Methods of modeling and optimization of work effects for chosen mineral processing systems

    Directory of Open Access Journals (Sweden)

    Tomasz Niedoba

    2005-11-01

    Full Text Available The methods used in mineral processing modeling are reviewed in this paper. In particular, the heuristic approach is presented. New, modern techniques of modeling and optimization are proposed, including the least median squares method and genetic algorithms. The rules of the latter are described in detail.

  15. An intercomparison of computer assisted data processing and display methods in radioisotope scintigraphy using mathematical tumours

    International Nuclear Information System (INIS)

    Houston, A.S.; Macleod, M.A.

    1977-01-01

    Several computer assisted processing and display methods are evaluated using a series of 100 normal brain scintigrams, 50 of which have had single 'mathematical tumours' superimposed. Using a standard rating system, or in some cases quantitative estimation, LROC curves are generated for each method and compared. (author)

  16. MULTIAGENT TECHNOLOGIES’ METHOD IN MANAGING BUSINESS-PROCESSES OF THE TECHNICAL PREPARING FOR PRODUCTION

    Directory of Open Access Journals (Sweden)

    P.N. Pavlenko

    2005-02-01

    Full Text Available A method for managing the technological preparation process of extended production is given. The method is used for integrating industrial-purpose automated systems (CAD/CAM/SAPP) with ERP systems.

  17. Teaching Methods Influencing the Sustainability of the Teaching Process in Technology Education in General Education Schools

    Science.gov (United States)

    Soobik, Mart

    2014-01-01

    The sustainability of technology education is related to a traditional understanding of craft and the methods used to teach it; however, the methods used in the teaching process have been influenced by the innovative changes accompanying the development of technology. In respect to social and economic development, it is important to prepare young…

  18. Actively Teaching Research Methods with a Process Oriented Guided Inquiry Learning Approach

    Science.gov (United States)

    Mullins, Mary H.

    2017-01-01

    Active learning approaches have shown to improve student learning outcomes and improve the experience of students in the classroom. This article compares a Process Oriented Guided Inquiry Learning style approach to a more traditional teaching method in an undergraduate research methods course. Moving from a more traditional learning environment to…

  19. Detection of wood failure by image processing method: influence of algorithm, adhesive and wood species

    Science.gov (United States)

    Lanying Lin; Sheng He; Feng Fu; Xiping Wang

    2015-01-01

    Wood failure percentage (WFP) is an important index for evaluating the bond strength of plywood. Currently, the method used for detecting WFP is visual inspection, which lacks efficiency. In order to improve it, image processing methods are applied to wood failure detection. The present study used thresholding and K-means clustering algorithms in wood failure detection...

  20. Bispectral methods of signal processing applications in radar, telecommunications and digital image restoration

    CERN Document Server

    Totsky, Alexander V; Kravchenko, Victor F

    2015-01-01

    By studying applications in radar, telecommunications and digital image restoration, this monograph discusses signal processing techniques based on bispectral methods. Improved robustness against different forms of noise as well as preservation of phase information render this method a valuable alternative to common power-spectrum analysis used in radar object recognition, digital wireless communications, and jitter removal in images.

  1. Comparison of the quasi-static method and the dynamic method for simulating fracture processes in concrete

    Science.gov (United States)

    Liu, J. X.; Deng, S. C.; Liang, N. G.

    2008-02-01

    Concrete is heterogeneous and usually described as a three-phase material, where matrix, aggregate and interface are distinguished. To take this heterogeneity into consideration, the Generalized Beam (GB) lattice model is adopted. The GB lattice model is much more computationally efficient than the beam lattice model. Numerical procedures of both quasi-static method and dynamic method are developed to simulate fracture processes in uniaxial tensile tests conducted on a concrete panel. Cases of different loading rates are compared with the quasi-static case. It is found that the inertia effect due to load increasing becomes less important and can be ignored with the loading rate decreasing, but the inertia effect due to unstable crack propagation remains considerable no matter how low the loading rate is. Therefore, an unrealistic result will be obtained if a fracture process including unstable cracking is simulated by the quasi-static procedure.

  2. Calculations of electromechanical transient processes using implicit methods of numerical integration

    Energy Technology Data Exchange (ETDEWEB)

    Pogosyan, T A

    1983-01-01

    The article is dedicated to the solution, by implicit methods of numerical integration, of the systems of differential equations which describe transient processes in an electric power system (EES). The distinguishing feature of the implicit methods (the backward Euler method and the trapezoidal method) is their absolute stability and, consequently, the relatively small accumulation of error at each integration step. They are therefore very convenient for solving problems in electric power engineering, where the transient processes are described by a stiff system of differential equations. The stiffness is associated with the range of the time constants involved. The advantage of the implicit methods over explicit ones is shown in a specific example (calculation of the dynamic stability of the simplest electric power system), along with the field of use of the implicit methods and the expedience of their use in power engineering problems.
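
    A minimal sketch of the stability argument, assuming a simple stiff test equation dy/dt = -1000*y that is not taken from the article: with a step size far above the explicit stability limit, backward Euler remains stable while forward Euler diverges.

      # Backward vs. forward Euler on the stiff test equation dy/dt = -a*y, y(0) = 1.
      a, h, steps = 1000.0, 0.01, 50      # h is well above the explicit stability limit 2/a

      y_fwd = y_bwd = 1.0
      for _ in range(steps):
          y_fwd = y_fwd + h * (-a * y_fwd)        # forward (explicit) Euler: blows up
          y_bwd = y_bwd / (1.0 + a * h)           # backward (implicit) Euler: y_{n+1} = y_n / (1 + a*h)

      print(f"forward Euler: {y_fwd:.3e}  (diverges)")
      print(f"backward Euler: {y_bwd:.3e} (decays toward 0, as the exact solution does)")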

  3. Possibilities of Utilizing the Method of Analytical Hierarchy Process Within the Strategy of Corporate Social Business

    Science.gov (United States)

    Drieniková, Katarína; Hrdinová, Gabriela; Naňo, Tomáš; Sakál, Peter

    2010-01-01

    The paper deals with the analysis of the theory of corporate social responsibility, risk management and the exact method of the analytic hierarchy process used in decision-making processes. Chapters 2 and 3 focus on presenting the experience with the application of the method in formulating stakeholders' strategic goals within Corporate Social Responsibility (CSR) and, simultaneously, its utilization in minimizing environmental risks. The major benefit of this paper is the application of the Analytic Hierarchy Process (AHP).

  4. THE USE OF AHP METHOD IN THE MULTI‐CRITERIA TASK SOLVING PROCESS – CASE STUDY

    Directory of Open Access Journals (Sweden)

    Zygmunt KORBAN

    2014-01-01

    Full Text Available In the decision-making process, both single- and multi-criteria tasks are dealt with. In the majority of cases, the selection of a solution comes down to determining the "best" decision (most often based on subjective assessment) or to ordering the set of decisions. The Analytic Hierarchy Process (AHP) is one of the methods used for the evaluation of qualitative features in multi-criteria optimisation processes. This article discusses the possibilities of using the above-mentioned method, illustrated with an example of purchasing technical equipment for one of the municipal landfill sites in the Silesian Province.
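
    As a minimal illustration of the AHP weighting step, the sketch below derives priority weights from a hypothetical 3x3 pairwise comparison matrix via the principal eigenvector; the judgments are invented and are not the article's case data.

      import numpy as np

      # Hypothetical pairwise comparison matrix on Saaty's 1-9 scale (reciprocal by construction).
      A = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)                  # principal eigenvalue index
      w = np.abs(eigvecs[:, k].real)
      w = w / w.sum()                              # normalized priority weights

      lam_max = eigvals.real[k]
      n = A.shape[0]
      ci = (lam_max - n) / (n - 1)                 # consistency index
      cr = ci / 0.58                               # consistency ratio (random index RI = 0.58 for n = 3)
      print("weights:", np.round(w, 3), "CR:", round(cr, 3))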

  5. METHOD OF DISPLAYING AN EXECUTABLE BUSINESS PROCESS MODELS INTO PETRI NETS

    Directory of Open Access Journals (Sweden)

    Igor G. Fedorov

    2013-01-01

    Full Text Available Executable business process models, like programs, require evidence of a defect-free finish. Methods based on the formalism of Petri nets are widely used for this purpose. A business process is mapped onto a net, and its properties are established by analysing the properties of the net. The aim is to study methods of mapping an executable business process model onto a Petri net. Analysis of the properties of the resulting model allows a number of important properties to be proved: the net is free-choice and clean, without looping.

  6. Standardization of a method to study the distribution of Americium in purex process

    International Nuclear Information System (INIS)

    Dapolikar, T.T.; Pant, D.K.; Kapur, H.N.; Kumar, Rajendra; Dubey, K.

    2017-01-01

    In the present work the distribution of Americium in the PUREX process is investigated in various process streams. For this purpose a method has been standardized for the determination of Am in process samples. The method involves extraction of Am with associated actinides using 30% TRPO-NPH at 0.3M HNO3, followed by selective stripping of Am from the organic phase into the aqueous phase at 6M HNO3. The assay of the aqueous phase for Am content is carried out by alpha radiometry. The investigation has revealed that 100% of the Am follows the HLLW route. (author)

  7. Extrusion Process by Finite Volume Method Using OpenFoam Software

    International Nuclear Information System (INIS)

    Matos Martins, Marcelo; Tonini Button, Sergio; Divo Bressan, Jose; Ivankovic, Alojz

    2011-01-01

    Computational codes are very important tools for solving engineering problems. In the analysis of metal forming processes such as extrusion this is no different, because computational codes allow the process to be analysed at reduced cost. Traditionally, the Finite Element Method is used to solve solid mechanics problems; however, the Finite Volume Method (FVM) has been gaining ground in this field of application. This paper presents the velocity field and friction coefficient variation results obtained by numerical simulation of an aluminum direct cold extrusion process, using the OpenFoam software and the FVM.

  8. Comparison of potential method in analytic hierarchy process for multi-attribute of catering service companies

    Science.gov (United States)

    Mamat, Siti Salwana; Ahmad, Tahir; Awang, Siti Rahmah

    2017-08-01

    Analytic Hierarchy Process (AHP) is a method used for structuring, measuring and synthesizing criteria, in particular for ranking multiple criteria in decision-making problems. The Potential Method, on the other hand, is a ranking procedure which utilizes a preference graph ς (V, A). Two nodes are adjacent if they are compared in a pairwise comparison, with the assigned arc oriented towards the more preferred node. In this paper the Potential Method is used to solve a catering service selection problem. The result obtained with the Potential Method is compared with Extent Analysis. The Potential Method is found to produce the same ranking as Extent Analysis in AHP.

  9. The uranium waste fluid processing examination by liquid and liquid extraction method using the emulsion flow method

    International Nuclear Information System (INIS)

    Kanda, Nobuhiro; Daiten, Masaki; Endo, Yuji; Yoshida, Hideaki; Mita, Yutaka; Naganawa, Hirochika; Nagano, Tetsushi; Yanase, Nobuyuki

    2015-03-01

    Spent centrifuges which had been used for the development of uranium enrichment technology are stored in the uranium enrichment facility located at the Ningyo-toge Environmental Center, Japan Atomic Energy Agency (JAEA). In our centrifuge processing technology, the radioactive material adhering to the surfaces of the inner parts of the centrifuges is separated by a wet decontamination method using an ultrasonic bath filled with dilute sulfuric acid and water, and a neutralization sediment (sludge) is generated by the processing of the radioactive waste fluid from the decontamination. JAEA has been considering whether the sludge processing can be streamlined and reduced by lowering the radioactive concentration of the sludge through removal of uranium from the radioactive waste fluid. As part of these considerations, JAEA has been promoting technological development of uranium extraction and separation using the Emulsion Flow Extraction Method (a theory proposed by the JAEA Nuclear Science and Engineering Center), in close coordination and cooperation between the Nuclear Science and Engineering Center and the Ningyo-toge Environmental Center, since the 2007 fiscal year. This report describes the outline of an application test using actual waste fluid of dilute sulfuric acid and water in an examination system built around the emulsion flow extraction method. (author)

  10. Downstream processing and chromatography based analytical methods for production of vaccines, gene therapy vectors, and bacteriophages

    Science.gov (United States)

    Kramberger, Petra; Urbas, Lidija; Štrancar, Aleš

    2015-01-01

    Downstream processing of nanoplexes (viruses, virus-like particles, bacteriophages) is characterized by complexity of the starting material, number of purification methods to choose from, regulations that are setting the frame for the final product and analytical methods for upstream and downstream monitoring. This review gives an overview on the nanoplex downstream challenges and chromatography based analytical methods for efficient monitoring of the nanoplex production. PMID:25751122

  11. Psychophysical "blinding" methods reveal a functional hierarchy of unconscious visual processing.

    Science.gov (United States)

    Breitmeyer, Bruno G

    2015-09-01

    Numerous non-invasive experimental "blinding" methods exist for suppressing the phenomenal awareness of visual stimuli. Not all of these suppressive methods occur at, and thus index, the same level of unconscious visual processing. This suggests that a functional hierarchy of unconscious visual processing can in principle be established. The empirical results of extant studies that have used a number of different methods and additional reasonable theoretical considerations suggest the following tentative hierarchy. At the highest levels in this hierarchy is unconscious processing indexed by object-substitution masking. The functional levels indexed by crowding, the attentional blink (and other attentional blinding methods), backward pattern masking, metacontrast masking, continuous flash suppression, sandwich masking, and single-flash interocular suppression, fall at progressively lower levels, while unconscious processing at the lowest levels is indexed by eye-based binocular-rivalry suppression. Although unconscious processing levels indexed by additional blinding methods is yet to be determined, a tentative placement at lower levels in the hierarchy is also given for unconscious processing indexed by Troxler fading and adaptation-induced blindness, and at higher levels in the hierarchy indexed by attentional blinding effects in addition to the level indexed by the attentional blink. The full mapping of levels in the functional hierarchy onto cortical activation sites and levels is yet to be determined. The existence of such a hierarchy bears importantly on the search for, and the distinctions between, neural correlates of conscious and unconscious vision. Copyright © 2015 Elsevier Inc. All rights reserved.

  12. Gear hot forging process robust design based on finite element method

    International Nuclear Information System (INIS)

    Xuewen, Chen; Won, Jung Dong

    2008-01-01

    During the hot forging process, the shaping properties and forging quality fluctuate because of die wear, manufacturing tolerances, dimensional variation caused by temperature, differing friction conditions, etc. In order to control this variation in performance and to optimize the process parameters, a robust design method based on the finite element method is proposed in this paper for the hot forging process. In the robust design process, the Taguchi method provides the basic robust theory, and finite element analysis is incorporated to simulate the hot forging process. In addition, in order to calculate the objective function values, an orthogonal design method is used to arrange the experiments and collect sample points. The ANOVA method is employed to analyze the relationships between the design parameters and design objectives and to find the best parameters. Finally, a case study of the gear hot forging process is conducted. With the objective of reducing the forging force and its variation, the robust design mathematical model is established. The optimal design parameters obtained from this study indicate that the forging force has been reduced and its variation has been controlled
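
    A minimal sketch of the Taguchi signal-to-noise calculation used in such robust designs, here the smaller-the-better form applied to hypothetical forging-force measurements; the numbers and the small orthogonal array are illustrative only and are not the paper's data.

      import math

      # Hypothetical L4(2^3) orthogonal array: each key is one parameter-level combination,
      # each list holds repeated forging-force measurements (kN) for that run.
      runs = {
          (1, 1, 1): [412.0, 418.0, 409.0],
          (1, 2, 2): [385.0, 391.0, 388.0],
          (2, 1, 2): [401.0, 399.0, 405.0],
          (2, 2, 1): [377.0, 380.0, 375.0],
      }

      def sn_smaller_the_better(y):
          # S/N = -10 * log10(mean(y^2)); a larger S/N means a lower, more stable response.
          return -10.0 * math.log10(sum(v * v for v in y) / len(y))

      for levels, y in runs.items():
          print(levels, "S/N =", round(sn_smaller_the_better(y), 2), "dB")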

  13. Using stable isotopes to monitor forms of sulfur during desulfurization processes: A quick screening method

    Science.gov (United States)

    Liu, Chao-Li; Hackley, Keith C.; Coleman, D.D.; Kruse, C.W.

    1987-01-01

    A method using stable isotope ratio analysis to monitor the reactivity of sulfur forms in coal during thermal and chemical desulfurization processes has been developed at the Illinois State Geological Survey. The method is based upon the fact that a significant difference exists in some coals between the 34S/32S ratios of the pyritic and organic sulfur. A screening method for determining the suitability of coal samples for use in isotope ratio analysis is described. Making these special coals available from coal sample programs would assist research groups in sorting out the complex sulfur chemistry which accompanies thermal and chemical processing of high sulfur coals. ?? 1987.

  14. An Integrated Computational Materials Engineering Method for Woven Carbon Fiber Composites Preforming Process

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Weizhao; Ren, Huaqing; Wang, Zequn; Liu, Wing K.; Chen, Wei; Zeng, Danielle; Su, Xuming; Cao, Jian

    2016-10-19

    An integrated computational materials engineering method is proposed in this paper for analyzing the design and preforming process of woven carbon fiber composites. The goal is to reduce the cost and time needed for the mass production of structural composites. It integrates the simulation methods from the micro-scale to the macro-scale to capture the behavior of the composite material in the preforming process. In this way, the time consuming and high cost physical experiments and prototypes in the development of the manufacturing process can be circumvented. This method contains three parts: the micro-scale representative volume element (RVE) simulation to characterize the material; the metamodeling algorithm to generate the constitutive equations; and the macro-scale preforming simulation to predict the behavior of the composite material during forming. The results show the potential of this approach as a guidance to the design of composite materials and its manufacturing process.

  15. Learning-based controller for biotechnology processing, and method of using

    Science.gov (United States)

    Johnson, John A.; Stoner, Daphne L.; Larsen, Eric D.; Miller, Karen S.; Tolle, Charles R.

    2004-09-14

    The present invention relates to process control where some of the controllable parameters are difficult or impossible to characterize. The present invention relates to, but is not limited to, process control of such systems in biotechnology. Additionally, the present invention relates to process control in biotechnological minerals processing. In the inventive method, an application of the present invention manipulates a minerals bioprocess to find local extrema (maxima or minima) for selected output variables/process goals by using a learning-based controller for bioprocess oxidation of minerals during hydrometallurgical processing. The learning-based controller operates with or without human supervision and works to find process optima without previously defined optima, due to the non-characterized nature of the process being manipulated.

  16. Comparing performances of clements, box-cox, Johnson methods with weibull distributions for assessing process capability

    Energy Technology Data Exchange (ETDEWEB)

    Senvar, O.; Sennaroglu, B.

    2016-07-01

    This study examines the Clements' Approach (CA), Box-Cox transformation (BCT), and Johnson transformation (JT) methods for process capability assessment using Weibull-distributed data with different parameters, in order to determine the effects of tail behaviour on process capability, and compares their estimation performance in terms of accuracy and precision. Design/methodology/approach: The process performance index (PPI) Ppu is used for process capability analysis (PCA) because the comparisons are performed on generated Weibull data without subgroups. Box plots, descriptive statistics, the root-mean-square deviation (RMSD), which is used as a measure of error, and a radar chart are utilized together for evaluating the performance of the methods. In addition, the bias of the estimated values is important, as is the efficiency measured by the mean square error. In this regard, the Relative Bias (RB) and the Relative Root Mean Square Error (RRMSE) are also considered. Findings: The results reveal that the performance of a method depends on its capability to fit the tail behaviour of the Weibull distribution and on the targeted values of the PPIs. It is observed that the effect of tail behaviour is more significant when the process is more capable. Research limitations/implications: Some other methods, such as the Weighted Variance method, which also give good results, were also considered. However, we later realized that it would be confusing in terms of comparison issues between the methods for consistent interpretations... (Author)
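
    A minimal sketch of the Box-Cox route to a process performance index, assuming right-skewed (Weibull-like) sample data and an upper specification limit chosen only for illustration; this is not the study's simulation design.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      x = rng.weibull(1.5, size=500) * 10.0 + 0.1   # skewed, strictly positive sample data (illustrative)
      usl = 30.0                                    # upper specification limit (assumed)

      # Box-Cox transformation to approximate normality (data must be positive).
      xt, lmbda = stats.boxcox(x)
      usl_t = stats.boxcox(np.array([usl]), lmbda=lmbda)[0]  # transform the USL the same way

      ppu = (usl_t - xt.mean()) / (3.0 * xt.std(ddof=1))     # Ppu on the transformed scale
      print(f"lambda = {lmbda:.3f}, Ppu = {ppu:.3f}")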

  17. Analysis methods of stochastic transient electro–magnetic processes in electric traction system

    Directory of Open Access Journals (Sweden)

    T. M. Mishchenko

    2013-04-01

    Full Text Available Purpose. The essence and basic characteristics of methods for calculating transient electromagnetic processes in the elements and devices of nonlinear dynamic electric traction systems are developed, taking into account the stochastic changes of voltages and currents in the traction networks of the power supply subsystem and in the power circuits of electric rolling stock. Methodology. Classical methods of nonlinear electrical engineering are used together with methods of probability theory, in particular methods for stationary ergodic and non-stationary stochastic processes. Findings. Using the above-mentioned methods, an equivalent circuit and a system of nonlinear integro-differential equations for the electromagnetic state of a double-track inter-substation zone of an AC electric traction system are drawn up. The calculations give the electric traction current distribution in the feeder zones. Originality. The paper is of scientific interest primarily because the methods allow the probabilistic character of changes in traction voltages and currents to be taken into account; in addition, the research develops efficient methods for the analysis of nonlinear circuits. Practical value. The practical value of the research lies in the application of the methods to the analysis of electromagnetic and electric energy processes in the traction power supply system in the case of high-speed train traffic.

  18. Nonparametric Inference of Doubly Stochastic Poisson Process Data via the Kernel Method.

    Science.gov (United States)

    Zhang, Tingting; Kou, S C

    2010-01-01

    Doubly stochastic Poisson processes, also known as the Cox processes, frequently occur in various scientific fields. In this article, motivated primarily by analyzing Cox process data in biophysics, we propose a nonparametric kernel-based inference method. We conduct a detailed study, including an asymptotic analysis, of the proposed method, and provide guidelines for its practical use, introducing a fast and stable regression method for bandwidth selection. We apply our method to real photon arrival data from recent single-molecule biophysical experiments, investigating proteins' conformational dynamics. Our result shows that conformational fluctuation is widely present in protein systems, and that the fluctuation covers a broad range of time scales, highlighting the dynamic and complex nature of proteins' structure.
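
    A minimal sketch of kernel-based intensity estimation for event-time (point process) data; the simulated arrival times and the hand-picked bandwidth below are purely illustrative and do not reproduce the authors' estimator or bandwidth selector.

      import numpy as np

      rng = np.random.default_rng(0)
      # Simulate arrival times from an inhomogeneous Poisson process by thinning,
      # with true intensity lambda(t) = 5 + 4*sin(t) on [0, 20] (illustrative).
      T, lam_max = 20.0, 9.0
      cand = np.sort(rng.uniform(0, T, rng.poisson(lam_max * T)))
      keep = rng.uniform(0, lam_max, cand.size) < (5 + 4 * np.sin(cand))
      events = cand[keep]

      def kernel_intensity(t, events, h):
          # Gaussian-kernel estimate: lambda_hat(t) = sum_i K_h(t - t_i).
          u = (t[:, None] - events[None, :]) / h
          return np.exp(-0.5 * u**2).sum(axis=1) / (h * np.sqrt(2 * np.pi))

      grid = np.linspace(0, T, 200)
      lam_hat = kernel_intensity(grid, events, h=0.8)   # bandwidth h chosen by hand here
      print("estimated intensity at t = 5, 10, 15:",
            np.round(np.interp([5, 10, 15], grid, lam_hat), 2))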

  19. Temporal response methods for dynamic measurement of in-process inventory of dissolved nuclear materials

    International Nuclear Information System (INIS)

    Ziri, S.M.; Seefeldt, W.B.

    1977-08-01

    This analysis has demonstrated that a plant's temporal response to perturbation of feed isotope composition can be used to measure the in-process inventory, without suspending plant operations. The main advantages of the temporal response technique over the step-displacement method are (1) it (the temporal response method) obviates the need for large special feed batches, and (2) it obviates the requirement that all the in-process material have a uniform isotopic composition at the beginning of the measurement. The temporal response method holds promise for essentially continuous real-time determination of in-process SNM. However, the temporal response method requires the measurement of the isotopic composition of many samples, and it works best for a stationary random input time series of tracer perturbations. Both of these requirements appear amenable to satisfaction by practical equipment and procedures if the benefits are deemed sufficiently worthwhile

  20. Method and apparatus for surface characterization and process control utilizing radiation from desorbed particles

    International Nuclear Information System (INIS)

    Feldman, L.C.; Kraus, J.S.; Tolk, N.H.; Traum, M.M.; Tully, J.C.

    1983-01-01

    Emission of characteristic electromagnetic radiation in the infrared, visible, or UV from excited particles, typically ions, molecules, or neutral atoms, desorbed from solid surfaces by an incident beam of low-momentum probe radiation has been observed. Disclosed is a method for characterizing solid surfaces based on the observed effect, with low-momentum probe radiation consisting of electrons or photons. Further disclosed is a method for controlling manufacturing processes that is also based on the observed effect. The latter method can, for instance, be advantageously applied in integrated circuit-, integrated optics-, and magnetic bubble device manufacture. Specific examples of applications of the method are registering of masks, control of a direct-writing processing beam, end-point detection in etching, and control of a processing beam for laser- or electron-beam annealing or ion implantation

  1. TARGET CONTROLLING METHOD OF THE PRICING PROCESS IN THE TOURISM ENTERPRISES

    Directory of Open Access Journals (Sweden)

    N. Sagalakova

    2016-02-01

    Full Text Available The key stages of the pricing process in tourism enterprises are investigated: the sub-process of establishing the nominal value of the price of a new tourism product, and the sub-process of adjusting the established price depending on the situation in the tourism market. For establishing the nominal value of the price, the use of an optimization model that maximizes the utility function of the structural parts of the tourism product price is proposed. For adjusting the tourism product price under changing external conditions, a target-setting procedure using process behavior charts of the pricing process is applied. The article presents a new methodology for controlling the pricing process in tourism enterprises, based on the combined application of statistical process control methods and dynamic programming, which fully takes into account one of the key features of the tourism sphere - seasonal fluctuations of the tourism product price.

  2. Application of remote sensing methods and GIS in erosive process investigations

    Directory of Open Access Journals (Sweden)

    Mustafić Sanja

    2007-01-01

    Full Text Available Modern geomorphological investigations of the state and changes in the intensity of erosive processes should be based on the application of remote sensing methods, i.e. on the processing of aerial and satellite photographs. The use of these methods is very important because it offers good possibilities for establishing regional relations of the investigated phenomenon, as well as for estimating the spatial and temporal variability of all physical-geographical and anthropogenic factors influencing the given process. Understanding the process of land erosion as a whole is only possible by creating a universal database and by using appropriate software, that is, by establishing a uniform information system. A geographical information system, as the most effective, complex and integrated system of spatial information, enables the unification as well as the analytical and synthetic processing of all data.

  3. Performance Analysis of Entropy Methods on K Means in Clustering Process

    Science.gov (United States)

    Dicky Syahputra Lubis, Mhd.; Mawengkang, Herman; Suwilo, Saib

    2017-12-01

    K Means is a non-hierarchical data clustering method that attempts to partition existing data into one or more clusters/groups, so that data with the same characteristics are grouped into the same cluster and data with different characteristics are grouped into other clusters. The purpose of this clustering is to minimize the objective function set in the clustering process, which generally attempts to minimize variation within a cluster and maximize the variation between clusters. However, the main disadvantage of this method is that the number k is often not known beforehand. Furthermore, a randomly chosen starting point may cause two nearby points to be selected as two centroids. Therefore, for the determination of the starting point in K Means the entropy method is used, a method that can determine weights and support a decision over a set of alternatives. Entropy is able to investigate the discriminating power among a multitude of data sets: under the entropy criterion, the attributes with the highest variation receive the highest weight. The entropy method can thus help the K Means process in determining the starting point, which is usually chosen at random, so that clustering with K Means converges with fewer iterations than the standard K Means process. The postoperative patient dataset from the UCI Machine Learning Repository is used, with only 12 records taken as a calculation example; with the entropy method the desired end result is obtained after only 2 iterations.
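
    A minimal sketch of the entropy weighting step described above, computing attribute weights from a small hypothetical data matrix; the values are illustrative and are not the 12-record postoperative dataset.

      import numpy as np

      # Hypothetical data matrix: rows = records, columns = attributes (all entries strictly positive).
      X = np.array([[36.5, 70.0, 1.0],
                    [38.2, 90.0, 3.0],
                    [37.0, 80.0, 2.0],
                    [36.8, 65.0, 1.0]], dtype=float)

      # Normalize each column to proportions p_ij.
      P = X / X.sum(axis=0)

      # Shannon entropy per attribute, scaled by 1/ln(n) so that 0 <= e_j <= 1.
      n = X.shape[0]
      e = -(P * np.log(P)).sum(axis=0) / np.log(n)

      # Degree of diversification and entropy weights: high-variation attributes get high weight.
      d = 1.0 - e
      w = d / d.sum()
      print("entropy weights:", np.round(w, 3))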

  4. A fast exact simulation method for a class of Markov jump processes.

    Science.gov (United States)

    Li, Yao; Hu, Lili

    2015-11-14

    A new method of the stochastic simulation algorithm (SSA), named the Hashing-Leaping method (HLM), for exact simulations of a class of Markov jump processes, is presented in this paper. The HLM has a conditional constant computational cost per event, which is independent of the number of exponential clocks in the Markov process. The main idea of the HLM is to repeatedly implement a hash-table-like bucket sort algorithm for all times of occurrence covered by a time step with length τ. This paper serves as an introduction to this new SSA method. We introduce the method, demonstrate its implementation, analyze its properties, and compare its performance with three other commonly used SSA methods in four examples. Our performance tests and CPU operation statistics show certain advantages of the HLM for large scale problems.
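
    For context, a minimal direct-method (Gillespie) SSA for competing exponential clocks is sketched below; this is the baseline algorithm, not the Hashing-Leaping method itself, and the two-reaction birth-death system is invented for illustration.

      import numpy as np

      rng = np.random.default_rng(2)

      # Toy birth-death process for a single species X: birth at rate k1, death at rate k2*X.
      k1, k2 = 5.0, 0.1
      x, t, t_end = 0, 0.0, 100.0

      while t < t_end:
          rates = np.array([k1, k2 * x])          # propensities of the two "clocks"
          total = rates.sum()
          if total == 0.0:
              break
          t += rng.exponential(1.0 / total)       # time to the next event
          if rng.uniform() < rates[0] / total:    # choose which clock fires
              x += 1                              # birth
          else:
              x -= 1                              # death

      print("state at t_end ~", x, "(stationary mean is k1/k2 = 50)")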

  5. A novel method for detecting and counting overlapping tracks in SSNTD by image processing techniques

    International Nuclear Information System (INIS)

    Ab Azar, N.; Babakhani, A.; Broumandnia, A.; Sepanloo, K.

    2016-01-01

    Overlapping object detection and counting is a challenge in image processing. A new method for detecting and counting overlapping circles is presented in this paper. The method is based on pattern recognition and feature extraction using "neighborhood values" in an object image, implemented with image processing techniques. The junction points are detected by assigning a value to each pixel in the image. As is shown, the neighborhood values of junction points are larger than those of other points. This distinction in neighborhood values is the main feature that can be used to identify the junction points and to count the overlapping tracks. The method can be used for recognizing and counting charged particle tracks, blood cells and also cancer cells. The method is called "Track Counting based on Neighborhood Values" and is abbreviated "TCNV". - Highlights: • A new method is introduced to recognize nuclear tracks by image processing. • The method is used to identify neighborhood pixels at junction points in overlapping tracks. • Enhanced method of counting overlapping tracks. • The new counting system behaves linearly up to track densities of about 300,000 tracks per cm2. • With the new method, overlapping tracks can be recognized even for 10x overlaps and more.

  6. Review of conventional and novel food processing methods on food allergens.

    Science.gov (United States)

    Vanga, Sai Kranthi; Singh, Ashutosh; Raghavan, Vijaya

    2017-07-03

    With the turn of this century, novel food processing techniques have become commercially very important because of their profound advantages over traditional methods. These novel processing methods tend to preserve the characteristic properties of food, including organoleptic and nutritional qualities, better than conventional food processing methods. Over the same period, there has been a clear rise in the populations suffering from food allergies, especially infants and children. Although this is widely attributed to the changing lifestyles of populations in both developed and developing nations and to the introduction of new food habits with the advent of novel foods and new processing techniques, their exact role is still uncertain. Under these circumstances, it is very important to understand the structural changes in proteins as food is processed, in order to comprehend whether a specific processing technique (conventional or novel) increases or mitigates allergenicity. Various modern means are now being employed to understand the conformational changes in proteins that can affect allergenicity. In this review, the effects of processing on protein structure and allergenicity are discussed, along with the implications of recent studies and techniques for establishing a platform to investigate future pathways to reduce or eliminate allergenicity in the population.

  7. An Efficient Quality-Related Fault Diagnosis Method for Real-Time Multimode Industrial Process

    Directory of Open Access Journals (Sweden)

    Kaixiang Peng

    2017-01-01

    Full Text Available Focusing on quality-related performance monitoring of complex industrial processes, a novel multimode process monitoring method is proposed in this paper. Firstly, principal component space clustering is implemented under the guidance of quality variables; through the extraction of model tags, clustering information for the original training data can be acquired. Secondly, according to the multimode characteristics of process data, a monitoring model integrating a Gaussian mixture model with total projection to latent structures is built on a covariance description form. The multimode total projection to latent structures (MTPLS) model is the foundation for solving the problem of quality-related monitoring for multimode processes. Then, a comprehensive statistic index is defined based on the posterior probability of the monitored samples belonging to each Gaussian component in the Bayesian framework, and a combined index is constructed for process monitoring. Finally, motivated by the application of the traditional contribution plot in fault diagnosis, a gradient contribution rate is applied to analyze the variation of variable contribution rates along samples. The method ensures online fault monitoring and diagnosis for multimode processes. The performance of the whole proposed scheme is verified on a real industrial hot strip mill process (HSMP) and compared with some existing methods.

  8. Comparison between two rheocasting processes of damper cooling tube method and low superheat casting

    Directory of Open Access Journals (Sweden)

    Zhang Xiaoli

    2014-09-01

    Full Text Available To produce a high quality semisolid slurry consisting of fine primary particles uniformly suspended in the liquid matrix for rheoforming, chemical refining and electromagnetic or mechanical stirring are the two methods commonly used, but these two methods either contaminate the melt or incur high cost. In this study, the damper cooling tube (DCT) method was designed to prepare semisolid slurry of A356 aluminum alloy and was compared with the low superheat casting (LSC) method - a conventional process used to produce casting slabs with an equiaxed dendrite microstructure for the thixoforming route. A series of comparative experiments were performed at pouring temperatures of 650 °C, 638 °C and 622 °C. Metallographic observations of the cast samples were carried out using an optical microscope with image analysis software. Results show that the microstructure of the semisolid slurry produced by the DCT process consists of spherical primary α-Al grains, while an equiaxed grain microstructure is found in the LSC process. The lower the pouring temperature, the smaller the grain size and the rounder the grain morphology in both methods. The copious nucleation generated in the DCT, owing to the cooling and stirring effect, is the key to producing a high quality semisolid slurry. The DCT method could produce rounder and smaller α-Al grains, which are suitable for semisolid processing; the equivalent grain size is no more than 60 μm when the pouring temperature is 622 °C.

  9. Drying of water based foundry coatings: Innovative test, process design and optimization methods

    DEFF Research Database (Denmark)

    Di Muoio, Giovanni Luca; Johansen, Bjørn Budolph

    ... of Denmark with the overall aim to optimize the drying process of water based foundry coatings. Drying of foundry coatings is a relatively new process in the foundry industry that followed the introduction of water as a solvent. In order to avoid moisture related quality problems and reach production capacity goals there is a need to understand how to design, control and optimize drying processes. The main focus of this project was on the critical parameters and properties to be controlled in production in order to achieve a stable and predictable drying process. We propose for each of these parameters ... on real industrial cases. These tools have been developed in order to simulate and optimize the drying process and reduce drying time and power consumption as well as production process design time and cost of expensive drying equipment. Results show that test methods from other industries can be used...

  10. System and method for integrating hazard-based decision making tools and processes

    Science.gov (United States)

    Hodgin, C Reed [Westminster, CO

    2012-03-20

    A system and method for inputting, analyzing, and disseminating information necessary for identified decision-makers to respond to emergency situations. This system and method provides consistency and integration among multiple groups, and may be used for both initial consequence-based decisions and follow-on consequence-based decisions. The system and method in a preferred embodiment also provides tools for accessing and manipulating information that are appropriate for each decision-maker, in order to achieve more reasoned and timely consequence-based decisions. The invention includes processes for designing and implementing a system or method for responding to emergency situations.

  11. The Oil Point Method - A tool for indicative environmental evaluation in material and process selection

    DEFF Research Database (Denmark)

    Bey, Niki

    2000-01-01

    ... of environmental evaluation and only approximate information about the product and its life cycle. This dissertation addresses this challenge in presenting a method which is tailored to these requirements of designers - the Oil Point Method (OPM). In providing environmental key information and confining itself to three essential assessment steps, the method enables rough environmental evaluations and supports in this way material- and process-related decision-making in the early stages of design. In its overall structure, the Oil Point Method is related to Life Cycle Assessment - except for two main differences...

  12. Processing of low-quality bauxite feedstock by thermochemistry-Bayer method

    Directory of Open Access Journals (Sweden)

    О. А. Дубовиков

    2016-11-01

    Full Text Available The modern production of aluminum, which by its global output ranks first among the non-ferrous metals, includes three main stages: ore extraction, its processing into alumina and, finally, the production of primary aluminum. Alumina production from bauxites, the primary raw material of the alumina industry, is based on two main methods: the Bayer method and the sintering method developed in Russia under the lead of academician Nikolay Semenovich Kurnakov. Alumina production by the Bayer method is more cost effective, but places higher requirements on the quality of the bauxite feedstock. A great deal of research has been carried out on low quality bauxites, focusing firstly on finding ways to enrich the feedstock, secondly on improving the combined sequential Bayer-sintering method, and thirdly on developing new hydrometallurgical routes for bauxite processing. Mechanical methods of bauxite enrichment have not yet brought any positive outcome, and the development of a new hydrometallurgical high-alkaline autoclave process has faced significant hardware difficulties not addressed so far. For efficient processing of such low quality bauxite feedstock it is suggested to use the universal thermochemistry-Bayer method developed in St. Petersburg Mining University under the lead of Nikolay Ivanovich Eremin, which allows different substandard bauxite feedstocks to be processed and is cost-competitive with the sintering method and the combined methods. The main stages of the thermochemistry-Bayer method are thermal activation of the feedstock, its further desiliconization with alkaline solution, and leaching of the resultant bauxite product by the Bayer method. Despite high energy consumption at the baking stage, it allows the low quality bauxite feedstock to be conditioned by neutralizing a variety of technologically harmful impurities such as organic matter, sulfide sulfur, carbonates, and at the

  13. A new decomposition method for parallel processing multi-level optimization

    International Nuclear Information System (INIS)

    Park, Hyung Wook; Kim, Min Soo; Choi, Dong Hoon

    2002-01-01

    In practical designs, most multidisciplinary problems have a large and complicated design system. Since multidisciplinary problems involve hundreds of analyses and thousands of variables, the grouping of analyses and the order of the analyses within each group affect the speed of the total design cycle. Therefore, it is very important to reorder and regroup the original design processes in order to minimize the total computational cost, by decomposing large multidisciplinary problems into several MultiDisciplinary Analysis SubSystems (MDASS) and by processing them in parallel. In this study, a new decomposition method is proposed for parallel processing of multidisciplinary design optimization, such as Collaborative Optimization (CO) and the Individual Discipline Feasible (IDF) method. Numerical results for two example problems are presented to show the feasibility of the proposed method

  14. A digital processing method for the analysis of complex nuclear spectra

    International Nuclear Information System (INIS)

    Madan, V.K.; Abani, M.C.; Bairi, B.R.

    1994-01-01

    This paper describes a digital processing method using frequency power spectra for the analysis of complex nuclear spectra. The power spectra were estimated by employing a modified discrete Fourier transform. The method was applied to observed spectral envelopes. The results for separating closely-spaced doublets in nuclear spectra of low statistical precision compared favorably with those obtained using the popular peak-fitting program SAMPO. The paper also describes the limitations of peak-fitting methods and the advantages of digital processing techniques for type II digital signals, including nuclear spectra. A compact computer program occupying less than 2.5 kBytes of memory was written in BASIC for the processing of observed spectral envelopes. (orig.)
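
    A minimal sketch of estimating the frequency power spectrum of a spectral envelope with a discrete Fourier transform; the synthetic doublet below is illustrative and does not reproduce the paper's modified DFT.

      import numpy as np

      # Synthetic spectral envelope: two closely spaced Gaussian peaks plus Poisson noise.
      ch = np.arange(256)
      envelope = (400 * np.exp(-0.5 * ((ch - 120) / 4.0) ** 2)
                  + 300 * np.exp(-0.5 * ((ch - 130) / 4.0) ** 2) + 20)
      counts = np.random.default_rng(3).poisson(envelope).astype(float)

      # Frequency power spectrum of the (mean-subtracted) envelope.
      spectrum = np.fft.rfft(counts - counts.mean())
      power = np.abs(spectrum) ** 2

      # Low-frequency components carry the peak-shape information; high frequencies are mostly noise.
      print("power in first 10 frequency bins:", np.round(power[:10], 1))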

  15. Temporal response methods for dynamic measurement of in-process inventory of dissolved nuclear materials

    International Nuclear Information System (INIS)

    Zivi, S.M.; Seefeldt, W.B.

    1976-01-01

    This analysis demonstrated that a plant's temporal response to perturbations of feed isotope composition can be used to measure the in-process inventory, without suspending plant operations. The main advantage of the temporal response technique over the step-displacement method are (1) it obviates the need for large special feed batches and (2) it obviates the requirement that all the in-process material have a uniform isotopic composition at the beginning of the measurement. The temporal response method holds promise for essentially continuous real-time determination of in-process SNM. The main disadvantage or problem with the temporal response method is that it requires the measurement of the isotopic composition of a great many samples to moderately high accuracy. This requirement appears amenable to solution by a modest effort in instrument development

  16. A Robust Photogrammetric Processing Method of Low-Altitude UAV Images

    Directory of Open Access Journals (Sweden)

    Mingyao Ai

    2015-02-01

    Full Text Available Low-altitude Unmanned Aerial Vehicle (UAV) images, which include distortion, illumination variance, and large rotation angles, pose multiple challenges for image orientation and image processing. In this paper, a robust and convenient photogrammetric approach is proposed for processing low-altitude UAV images, involving a strip management method to automatically build a standardized regional aerial triangulation (AT) network, a parallel inner orientation algorithm, a ground control point (GCP) prediction method, and an improved Scale Invariant Feature Transform (SIFT) method to produce a large number of evenly distributed, reliable tie points for bundle adjustment (BA). A multi-view matching approach is improved to produce Digital Surface Models (DSM) and Digital Orthophoto Maps (DOM) for 3D visualization. Experimental results show that the proposed approach is robust and feasible for the photogrammetric processing of low-altitude UAV images and the 3D visualization of products.

  17. Method of transition from 3D model to its ontological representation in aircraft design process

    Science.gov (United States)

    Govorkov, A. S.; Zhilyaev, A. S.; Fokin, I. V.

    2018-05-01

    This paper proposes a method of transition from a 3D model to its ontological representation and describes its usage in the aircraft design process. The problems of design for manufacturability and design automation are also discussed. The introduced method aims to ease data exchange between important aircraft design phases, namely engineering and design control, and is also intended to increase design speed and 3D model customizability. This requires careful selection of the complex systems (CAD/CAM/CAE/PDM) that provide the basis for integrating design with the technological preparation of production and that take more fully into account the characteristics of products and the processes for their manufacture. Solving this problem is important, as investment in automation defines the company's competitiveness in the years ahead.

  18. Isoflavone profile in soymilk as affected by soybean variety, grinding, and heat-processing methods.

    Science.gov (United States)

    Zhang, Yan; Chang, Sam K C; Liu, Zhisheng

    2015-05-01

    Isoflavones impart health benefits, and their overall content and profile in foods are greatly influenced at each step during processing. In this study, 2 soybean varieties (Prosoy and black soybean) were processed with 3 different grinding methods (ambient, cold, and hot grinding) and 3 heating methods (traditional stove cooking, 1-phase UHT, and 2-phase UHT) for soymilk making. After cold, ambient, and hot grinding, the total isoflavones were 3917, 5013, and 5949 nmol/g for Prosoy and 4073, 3966, and 4284 nmol/g for black soybean, respectively. Grinding could significantly increase isoflavone extraction, but the grinding process also had a destructive effect on isoflavones that varied with grinding temperature. Different heating methods had different effects on the individual isoflavone forms. The two soybean varieties showed distinct patterns in the change of isoflavone profile during processing. © 2015 Institute of Food Technologists®

  19. CESAR cost-efficient methods and processes for safety-relevant embedded systems

    CERN Document Server

    Wahl, Thomas

    2013-01-01

    The book summarizes the findings and contributions of the European ARTEMIS project CESAR for improving and enabling interoperability of methods, tools, and processes to meet the demands of embedded systems development across four domains - avionics, automotive, automation, and rail. The contributions give insight into an improved engineering and safety process life-cycle for the development of safety-critical systems. They present a new concept of an engineering tool integration platform to improve the development of safety-critical embedded systems and illustrate the capacity of this framework for end-user instantiation to specific domain needs and processes. They also advance the state of the art in component-based development as well as component and system validation and verification, with tool support. Finally, they describe industry-relevant evaluated processes and methods especially designed for the embedded systems sector as well as easily adoptable common interoperability principles for software tool integration...

  20. Desalination Processes Evaluation at Common Platform: A Universal Performance Ratio (UPR) Method

    KAUST Repository

    Wakil Shahzad, Muhammad

    2018-01-31

    The inevitable escalation of economic development has serious implications for the energy and environment nexus. The International Energy Outlook 2016 (IEO2016) predicted that non-OECD (Organization for Economic Cooperation and Development) countries will lead with a 71% rise in energy demand from 2012 to 2040, in contrast with only 18% in developed countries. In Gulf Cooperation Council (GCC) countries, about 40% of primary energy is consumed by cogeneration-based power and desalination plants. These cogeneration plants struggle with unfair apportionment of primary fuel cost between electricity and desalination. Also, desalination process performance is evaluated on the basis of derived energy, which can mislead process selection. There is a need for (i) an appropriate primary fuel cost apportionment method for multi-purpose plants and (ii) a desalination process performance evaluation method based on primary energy. As a solution, we proposed an exergetic analysis for apportioning the primary fuel percentage to all components in the cycle according to the quality of the working fluid utilized. The proposed method showed that, relative to conventional energetic apportionment methods, the gas turbine was undercharged by 40%, the steam turbine was overcharged by 71%, and desalination was overcharged by 350%. We also proposed a new and most suitable desalination process performance evaluation method based on primary energy, called the universal performance ratio (UPR). Since UPR is based on primary energy, it can be used to evaluate any kind of desalination process - thermally driven, pressure driven, humidification-dehumidification, etc. - on a common platform. We showed that all desalination processes operate at only 10-13% of the thermodynamic limit (TL) of UPR. For future sustainability, desalination must achieve 25-30% of TL, which is only possible either by hybridization of different processes or by innovative membrane materials.

  1. Method and apparatus for rapid adjustment of process gas inventory in gaseous diffusion cascades

    International Nuclear Information System (INIS)

    1980-01-01

    A method is specified for the operation of a gaseous diffusion cascade wherein electrically driven compressors circulate a process gas through a plurality of serially connected gaseous diffusion stages to establish first and second countercurrently flowing cascade streams of process gas, one of the streams being at a relatively low pressure and enriched in a component of the process gas and the other being at a higher pressure and depleted in the same, and wherein automatic control systems maintain the stage process gas pressures by positioning process gas flow control valve openings at values which are functions of the difference between reference-signal inputs to the systems, and signal inputs proportional to the process gas pressures in the gaseous diffusion stages associated with the systems, the cascade process gas inventory being altered, while the cascade is operating, by simultaneously directing into separate process-gas freezing zones a plurality of substreams derived from one of the first and second streams at different points along the lengths thereof to solidify approximately equal weights of process gas in the zone while reducing the reference-signal inputs to maintain the positions of the control valves substantially unchanged despite the removal of process gas inventory via the substreams. (author)

  2. Silver recovery from the waste materials by the method of flotation process

    Directory of Open Access Journals (Sweden)

    B. Oleksiak

    2018-01-01

    Full Text Available During the leaching of zinc concentrates, waste materials rich in various metals, such as silver, are produced. So far, no attempts at silver recovery from these waste materials have been made, due to the lack of any method that would be both effective and beneficial. The paper presents some possibilities for applying the flotation process to silver recovery from waste materials generated during zinc production.

  3. Silver recovery from the waste materials by the method of flotation process

    OpenAIRE

    B. Oleksiak; G. Siwiec; A. Tomaszewska; D. Piękoś

    2018-01-01

    During the leaching of zinc concentrates, waste materials rich in various metals, such as silver, are produced. So far, no attempts at silver recovery from these waste materials have been made, due to the lack of any method that would be both effective and beneficial. The paper presents some possibilities for applying the flotation process to silver recovery from waste materials generated during zinc production.

  4. AUTOMR: An automatic processing program system for the molecular replacement method

    International Nuclear Information System (INIS)

    Matsuura, Yoshiki

    1991-01-01

    An automatic processing program system for the molecular replacement method, AUTOMR, is presented. The program solves the initial model of the target crystal structure using a homologous molecule as the search model. It processes the structure-factor calculation of the model molecule, the rotation function, the translation function and the rigid-group refinement successively in one computer job. Test calculations were performed for six protein crystals and the structures were solved in all of these cases. (orig.)

  5. Method and Tool for Design Process Navigation and Automatic Generation of Simulation Models for Manufacturing Systems

    Science.gov (United States)

    Nakano, Masaru; Kubota, Fumiko; Inamori, Yutaka; Mitsuyuki, Keiji

    Manufacturing system designers should concentrate on designing and planning manufacturing systems instead of spending their efforts on creating the simulation models to verify the design. This paper proposes a method and its tool to navigate the designers through the engineering process and generate the simulation model automatically from the design results. The design agent also supports collaborative design projects among different companies or divisions with distributed engineering and distributed simulation techniques. The idea was implemented and applied to a factory planning process.

  6. DEVELOPMENT OF A METHOD FOR STATISTICAL ANALYSIS OF THE ACCURACY AND STABILITY OF THE EPOXY RESIN ED-20 PRODUCTION PROCESS

    Directory of Open Access Journals (Sweden)

    N. V. Zhelninskaya

    2015-01-01

    Full Text Available Statistical methods play an important role in the objective evaluation of the quantitative and qualitative characteristics of a process and are among the most important elements of a production quality assurance system and of total quality management. To produce a quality product, one must know the real accuracy of the existing equipment, determine whether the accuracy of a selected technological process meets the specified product accuracy, and assess process stability. Most random events in manufacturing and scientific research are shaped by a large number of random factors and are described by the normal distribution, which is central to many practical studies. Modern statistical methods are difficult to grasp and to use widely in practice without in-depth mathematical training of all participants in the process. When the distribution of a random variable is known, all the characteristics of a batch of products can be obtained, including the mean value and the variance. Statistical control and quality-control methods were used in the analysis of the accuracy and stability of the technological process for producing epoxy resin ED-20. The numerical characteristics of the distribution law of the controlled parameters were estimated, and the percentage of defective products was determined. For the stability assessment of the ED-20 manufacturing process, Shewhart control charts for quantitative data were selected: charts of individual values X and moving range R. Pareto charts were used to identify the causes that most strongly affect low dynamic viscosity, and Ishikawa diagrams were used to analyse the causes of these defects, showing the most typical factors behind the variability of the process results. To resolve the problem, it is recommended to modify the polymer composition with carbon fullerenes and to use the developed method for the production of
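
    As a reminder of what the chart of individual values X and moving range R involves, here is a minimal sketch of the usual individuals/moving-range (X-mR) control limits with the standard Shewhart constants. The viscosity values are invented; the paper's data are not reproduced.

    ```python
    import numpy as np

    # Hypothetical dynamic-viscosity measurements for successive ED-20 batches.
    x = np.array([13.1, 12.8, 13.4, 12.9, 13.0, 13.6, 12.7, 13.2, 12.6, 13.3])

    # Moving ranges between consecutive individual values.
    mr = np.abs(np.diff(x))
    x_bar, mr_bar = x.mean(), mr.mean()

    # Standard Shewhart constants for subgroups of n = 2: E2 = 2.66, D4 = 3.267.
    ucl_x, lcl_x = x_bar + 2.66 * mr_bar, x_bar - 2.66 * mr_bar
    ucl_mr = 3.267 * mr_bar

    print(f"X chart:  centre {x_bar:.2f}, limits [{lcl_x:.2f}, {ucl_x:.2f}]")
    print(f"mR chart: centre {mr_bar:.2f}, upper limit {ucl_mr:.2f}")
    print("out-of-control points:", np.where((x > ucl_x) | (x < lcl_x))[0])
    ```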

  7. Evaluation and selection of in-situ leaching mining method using analytic hierarchy process

    International Nuclear Information System (INIS)

    Zhao Heyong; Tan Kaixuan; Liu Huizhen

    2007-01-01

    According to the complicated conditions and main influencing factors of in-situ leaching mining, a model and procedure based on the analytic hierarchy process are established for the evaluation and selection of in-situ leaching mining methods. Taking a uranium mine in Xinjiang, China as an example, the application of this model is presented. The results of the analyses and calculations indicate that acid leaching is the optimum option. (authors)
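
    The paper's hierarchy and judgement matrices are not given in the record; the sketch below shows only the core AHP step - deriving priority weights from a pairwise comparison matrix via its principal eigenvector and checking consistency - for an invented three-alternative comparison.

    ```python
    import numpy as np

    # Hypothetical pairwise comparison matrix for three leaching options under
    # one criterion; the judgement values are illustrative only.
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    # Priority weights = normalised principal right eigenvector.
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()

    # Consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI (RI = 0.58 for n = 3).
    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)
    cr = ci / 0.58
    print("priority weights:", np.round(w, 3), f"  consistency ratio: {cr:.3f}")
    ```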

  8. Ductile cast iron obtaining by Inmold method with use of LOST FOAM process

    Directory of Open Access Journals (Sweden)

    T. Pacyniak

    2010-01-01

    Full Text Available The possibility of manufacturing ductile cast iron castings by the Inmold method using the LOST FOAM process is presented in this work. Spheroidization was carried out with a magnesium master alloy in an amount of 1% of the casting mass. The nodulizer was placed in a reaction chamber in the gating system made of foamed polystyrene. Preliminary tests showed that it is technically feasible to manufacture ductile cast iron castings in the LOST FOAM process with in-mould spheroidization.

  9. Impact Assessment of Various Methods for Control of Synchronous Generator Excitation on Quality of Transient Processes

    Directory of Open Access Journals (Sweden)

    Y. D. Filipchik

    2011-01-01

    Full Text Available The paper considers the impact of various methods for controlling the excitation current of a synchronous generator on the nature of transient processes. A control algorithm that adjusts the excitation current according to changes in the slip and acceleration of the generator rotor is proposed. The algorithm makes it possible to improve the quality of the transient processes by reducing the oscillation range of both the active power and the δ-angle.
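
    The control law itself is not given in the record; purely as an illustration of feeding rotor slip and acceleration back into the excitation reference, a hypothetical proportional law might look as follows. The gains, limits and signal values are invented, not taken from the paper.

    ```python
    def excitation_reference(u_nominal, slip, rotor_accel, k_slip=2.0, k_accel=0.5,
                             u_min=0.0, u_max=2.0):
        """Hypothetical excitation-current reference (per unit).

        Raises excitation when the rotor slips ahead or accelerates, to damp
        active-power and delta-angle swings; the gains k_slip and k_accel are
        illustrative and would have to be tuned against a machine/network model.
        """
        u = u_nominal + k_slip * slip + k_accel * rotor_accel
        return min(max(u, u_min), u_max)   # respect actuator limits

    # Example: small positive slip and acceleration after a disturbance.
    print(excitation_reference(u_nominal=1.0, slip=0.02, rotor_accel=0.1))
    ```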

  10. Unconscious neural processing differs with method used to render stimuli invisible

    Directory of Open Access Journals (Sweden)

    Sergey Victor Fogelson

    2014-06-01

    Full Text Available Visual stimuli can be kept from awareness using various methods. The extent of processing that a given stimulus receives in the absence of awareness is typically used to make claims about the role of consciousness more generally. The neural processing elicited by a stimulus, however, may also depend on the method used to keep it from awareness, and not only on whether the stimulus reaches awareness. Here we report that the method used to render an image invisible has a dramatic effect on how category information about the unseen stimulus is encoded across the human brain. We collected fMRI data while subjects viewed images of faces and tools that were rendered invisible using either continuous flash suppression (CFS) or chromatic flicker fusion (CFF). In a third condition, we presented the same images under normal, fully visible viewing conditions. We found that category information about visible images could be extracted from patterns of fMRI responses throughout areas of neocortex known to be involved in face or tool processing. However, category information about stimuli kept from awareness using CFS could be recovered exclusively within occipital cortex, whereas information about stimuli kept from awareness using CFF was also decodable within temporal and frontal regions. We conclude that unconsciously presented objects are processed differently depending on how they are rendered subjectively invisible. Caution should therefore be used in making generalizations on the basis of any one method about the neural basis of consciousness or the extent of information processing without consciousness.

  11. Unconscious neural processing differs with method used to render stimuli invisible.

    Science.gov (United States)

    Fogelson, Sergey V; Kohler, Peter J; Miller, Kevin J; Granger, Richard; Tse, Peter U

    2014-01-01

    Visual stimuli can be kept from awareness using various methods. The extent of processing that a given stimulus receives in the absence of awareness is typically used to make claims about the role of consciousness more generally. The neural processing elicited by a stimulus, however, may also depend on the method used to keep it from awareness, and not only on whether the stimulus reaches awareness. Here we report that the method used to render an image invisible has a dramatic effect on how category information about the unseen stimulus is encoded across the human brain. We collected fMRI data while subjects viewed images of faces and tools that were rendered invisible using either continuous flash suppression (CFS) or chromatic flicker fusion (CFF). In a third condition, we presented the same images under normal, fully visible viewing conditions. We found that category information about visible images could be extracted from patterns of fMRI responses throughout areas of neocortex known to be involved in face or tool processing. However, category information about stimuli kept from awareness using CFS could be recovered exclusively within occipital cortex, whereas information about stimuli kept from awareness using CFF was also decodable within temporal and frontal regions. We conclude that unconsciously presented objects are processed differently depending on how they are rendered subjectively invisible. Caution should therefore be used in making generalizations on the basis of any one method about the neural basis of consciousness or the extent of information processing without consciousness.
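
    The category decoding described in the two records above is typically implemented as a cross-validated linear classifier on voxel response patterns; a minimal scikit-learn sketch with random stand-in data (not the study's fMRI responses) is shown below.

    ```python
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import LinearSVC

    # Stand-in data: 80 trials x 500 voxels, labels 0 = face, 1 = tool.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((80, 500))
    y = np.repeat([0, 1], 40)

    # Cross-validated linear decoding of stimulus category from response
    # patterns; above-chance accuracy in a region is taken as evidence that
    # category information is present there.
    clf = make_pipeline(StandardScaler(), LinearSVC(max_iter=5000))
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"decoding accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
    ```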

  12. Curcumin complexation with cyclodextrins by the autoclave process: Method development and characterization of complex formation.

    Science.gov (United States)

    Hagbani, Turki Al; Nazzal, Sami

    2017-03-30

    One approach to enhance curcumin (CUR) aqueous solubility is to use cyclodextrins (CDs) to form inclusion complexes in which CUR is encapsulated as a guest molecule within the internal cavity of the water-soluble CD. Several methods have been reported for the complexation of CUR with CDs. Limited information, however, is available on the use of the autoclave process (AU) in complex formation. The aims of this work were therefore to (1) investigate and evaluate the AU cycle as a complex formation method to enhance CUR solubility; (2) compare the efficacy of the AU process with the freeze-drying (FD) and evaporation (EV) processes in complex formation; and (3) confirm CUR stability by characterizing CUR:CD complexes by NMR, Raman spectroscopy, DSC, and XRD. Significant differences were found in the saturation solubility of CUR from its complexes with CD when prepared by the three complexation methods. The AU yielded a complex with the expected chemical and physical fingerprints of a CUR:CD inclusion complex, maintained the chemical integrity and stability of CUR, and provided the highest solubility of CUR in water. Physical and chemical characterizations of the AU complexes confirmed the encapsulation of CUR inside the CD cavity and the transformation of the crystalline CUR:CD inclusion complex to an amorphous form. It was concluded that the autoclave process, with its short processing time, could be used as an alternative and efficient method for drug:CD complexation. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. [Investigation of potential toxic factors for fleece-flower root: from perspective of processing methods evolution].

    Science.gov (United States)

    Cui, He-Rong; Bai, Zhao-Fang; Song, Hai-Bo; Jia, Tian-Zhu; Wang, Jia-Bo; Xiao, Xiao-He

    2016-01-01

    In recent years, the rapid growth in reports of liver damage caused by fleece-flower root has drawn wide attention both at home and abroad; however, the ancient Chinese medical literature contains very few accounts of its toxicity. Why, then, are there so many reports on the toxicity of fleece-flower root now compared with the ancient literature? As a typical tonic medicine, the clinical utility of fleece-flower root was largely limited in ancient Chinese medicine by the standardization and reliability of its processing methods. The ancient processing methods emphasized nine cycles of steaming and drying, while modern processes have been simplified to a single steaming. Whether the differences between ancient and modern processing methods are a potential cause of the increased number of fleece-flower root-induced liver injuries deserves deep analysis, which may provide new clues and perspectives for research on its toxicity. This article therefore discusses the influencing factors and key problems in the toxicity attenuation of fleece-flower root, on the basis of sorting out its processing methods in ancient medical books and modern standards, in order to provide a reference for establishing a specification for the toxicity attenuation of fleece-flower root. Copyright© by the Chinese Pharmaceutical Association.

  14. [Influence of different original processing methods on quality of Salvia Miltiorrhizae Radix et Rhizoma from Shandong].

    Science.gov (United States)

    Zhao, Zhi-Gang; Gao, Shu-Rui; Hou, Jun-Ling; Wang, Wen-Quan; Xu, Zhen-Guang; Song, Yan; Zhang, Xian-Ming; Li, Jun

    2014-04-01

    In this paper, the contents of rosmarinic acid, salvianolic acid B, cryptotanshinone and tanshinone IIA in samples of Salvia Miltiorrhizae Radix et Rhizoma prepared by different original processing methods were determined by HPLC. Different processing methods had varied influences on the four active ingredients. Sun-drying reduced the content of cryptotanshinone, tanshinone IIA and rosmarinic acid, and integral samples were better than those cut into segments. The oven-dry method had a great influence on the water-soluble ingredients; high temperature (80-100 degrees C) could easily cause a large loss of rosmarinic acid and salvianolic acid B. The role of the traditional processing method "fahan" was complicated: the content of rosmarinic acid decreased, cryptotanshinone and tanshinone IIA increased, and salvianolic acid B showed no difference after "fahan". Drying in the shade and oven drying at low temperature (40-60 degrees C) were both effective in preserving the active ingredients of Salvia Miltiorrhizae Radix et Rhizoma, and there was no difference between integral samples and samples cut into segments. Therefore, considering comprehensively the content of active ingredients in Salvia Miltiorrhizae Radix et Rhizoma, processing cost, etc., shade-drying or oven drying at low temperature (40-60 degrees C) should be the most suitable original processing method.

  15. AN ITERATION-FREE NUMERICAL METHOD FOR MODELING ELECTROMECHANICAL PROCESSES IN ASYNCHRONOUS ENGINES

    Directory of Open Access Journals (Sweden)

    D. G. Patalakh

    2018-02-01

    Full Text Available Purpose. Development of a calculation of electromagnetic and electromechanical transients in asynchronous engines without iterations. Methodology. Numerical methods for the integration of ordinary differential equations; programming. Findings. Since the system of equations describing the dynamics of an asynchronous engine contains products of rotor and stator currents and products of the rotor speed and currents, the system is nonlinear. The numerical solution of nonlinear differential equations normally requires an iteration process at every integration step; a time-consuming and poorly converging iteration process may slow the calculation. An improvement of the numerical method that removes this iteration process is offered, and as a result the modelling time is reduced. The improved numerical method is applied to the integration of the differential equations describing the dynamics of an asynchronous engine. Originality. An improvement of the numerical method is offered that allows numerical integration of differential equations containing products of functions, which avoids an iteration process at every integration step and shortens the modelling time. Practical value. On the basis of the offered methodology, a universal program for modelling electromechanical processes in asynchronous engines could be developed, taking advantage of the faster computation.
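
    The record does not spell out the improved scheme; the contrast it draws can be illustrated with a toy nonlinear system containing products of state variables, integrated by an explicit fourth-order Runge-Kutta step, where each step uses only already-known values and therefore needs no iteration. The equations and coefficients below are a generic illustration, not the machine model of the paper.

    ```python
    import numpy as np

    def f(t, y):
        # Toy nonlinear system with products of state variables, loosely analogous
        # to current-times-current and speed-times-current terms in a machine model.
        i1, i2, w = y
        return np.array([-2.0 * i1 + 0.5 * i2 * w + 1.0,
                         -3.0 * i2 - 0.4 * i1 * w,
                         0.8 * i1 * i2 - 0.1 * w])

    def rk4_step(t, y, h):
        # Explicit RK4: the nonlinear right-hand side is evaluated only at known
        # points, so no per-step iteration (as an implicit scheme would require).
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h / 2 * k1)
        k3 = f(t + h / 2, y + h / 2 * k2)
        k4 = f(t + h, y + h * k3)
        return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

    t, h, y = 0.0, 1e-3, np.array([0.0, 0.0, 0.0])
    for _ in range(5000):
        y = rk4_step(t, y, h)
        t += h
    print("state after", round(t, 3), "s:", np.round(y, 4))
    ```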

  16. Modelling dynamic processes in a nuclear reactor by state change modal method

    Science.gov (United States)

    Avvakumov, A. V.; Strizhov, V. F.; Vabishchevich, P. N.; Vasilev, A. O.

    2017-12-01

    Modelling of dynamic processes in nuclear reactors is carried out mainly using the multigroup neutron diffusion approximation. The basic model includes a multidimensional set of coupled parabolic equations and ordinary differential equations. Dynamic processes are modelled by a successive change of the reactor states, and the transition from one state to another is considered to occur instantaneously. In the modal method the approximate solution is represented as an eigenfunction expansion. The numerical-analytical method is based on the use of the dominant time-eigenvalues of a group diffusion model taking into account delayed neutrons.
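
    As an illustration of the modal idea only - not the multigroup model of the paper - the sketch below expands the solution of a one-group, one-dimensional diffusion problem in the few dominant eigenfunctions of the discretized operator and advances each modal amplitude analytically. All parameters are assumed for illustration.

    ```python
    import numpy as np
    from scipy.linalg import eigh

    # 1D one-group diffusion operator d(phi)/dt = A phi on a slab with zero-flux
    # boundaries (illustrative parameters; the paper's model is multigroup and
    # includes delayed neutrons).
    N, L = 200, 100.0
    h = L / (N + 1)
    D, nu_sigf, sig_a, v = 1.3, 0.031, 0.032, 2.2e5   # cm, 1/cm, 1/cm, cm/s (assumed)
    lap = (np.diag(-2.0 * np.ones(N)) + np.diag(np.ones(N - 1), 1)
           + np.diag(np.ones(N - 1), -1)) / h**2
    A = v * (D * lap + (nu_sigf - sig_a) * np.eye(N))

    # Dominant time-eigenvalues and eigenfunctions (A is symmetric here).
    alpha, modes = eigh(A)
    order = np.argsort(alpha)[::-1]          # largest (least negative) first
    alpha, modes = alpha[order], modes[:, order]

    # Expand the initial flux in the first m modes and advance analytically.
    x = np.linspace(h, L - h, N)
    phi0 = np.exp(-((x - 0.3 * L) / 10.0) ** 2)      # off-centre initial flux
    m, t = 5, 1.0e-3                                 # number of modes, time (s)
    a0 = modes[:, :m].T @ phi0                       # modal amplitudes at t = 0
    phi_t = modes[:, :m] @ (a0 * np.exp(alpha[:m] * t))
    print("dominant time-eigenvalue alpha_0 = %.3e 1/s" % alpha[0])
    print("flux L2 norm after 1 ms:", np.linalg.norm(phi_t) * np.sqrt(h))
    ```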

  17. Application of wavelet analysis to signal processing methods for eddy-current test

    International Nuclear Information System (INIS)

    Chen, G.; Yoneyama, H.; Yamaguchi, A.; Uesugi, N.

    1998-01-01

    This study deals with the application of wavelet analysis to the detection and characterization of defects from eddy-current and ultrasonic testing signals with a low signal-to-noise ratio. Presented in this paper are methods for processing eddy-current testing signals from heat exchanger tubes of a steam generator in a nuclear power plant. The results of processing eddy-current testing signals from tube testpieces with artificial flaws show that flaw signals corrupted by noise and/or non-defect signals can be effectively detected and characterized using the wavelet methods. (author)
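
    The paper's processing chain is not detailed in the record; a generic wavelet-shrinkage sketch with PyWavelets, of the kind commonly applied to noisy eddy-current traces, is shown below. The synthetic signal, wavelet and threshold rule are illustrative choices, not the authors' settings.

    ```python
    import numpy as np
    import pywt

    # Synthetic eddy-current-like trace: a short flaw-shaped pulse buried in noise.
    t = np.linspace(0, 1, 1024)
    clean = np.exp(-0.5 * ((t - 0.5) / 0.01) ** 2) * np.sin(2 * np.pi * 80 * t)
    noisy = clean + 0.4 * np.random.randn(t.size)

    # Multilevel discrete wavelet decomposition, soft-threshold the detail
    # coefficients (universal threshold from the finest level), reconstruct.
    coeffs = pywt.wavedec(noisy, "db4", level=5)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2 * np.log(noisy.size))
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    denoised = pywt.waverec(coeffs, "db4")[: noisy.size]

    print("noise std estimate:", round(float(sigma), 3))
    ```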

  18. Comparative exergy analyses of Jatropha curcas oil extraction methods: Solvent and mechanical extraction processes

    International Nuclear Information System (INIS)

    Ofori-Boateng, Cynthia; Keat Teong, Lee; JitKang, Lim

    2012-01-01

    Highlights: ► Exergy analysis detects locations of resource degradation within a process. ► Solvent extraction is six times more exergetically destructive than mechanical extraction. ► Mechanical extraction of jatropha oil is 95.93% exergetically efficient. ► Solvent extraction of jatropha oil is 79.35% exergetically efficient. ► Exergy analysis of oil extraction processes allows room for improvements. - Abstract: Vegetable oil extraction processes are found to be energy intensive. Thermodynamically, any energy-intensive process is considered to degrade the most useful part of the energy that is available to produce work. This study uses literature values to compare the efficiencies and the degradation of the useful energy within Jatropha curcas oil during oil extraction, taking into account solvent and mechanical extraction methods. According to this study, J. curcas seed is upgraded on processing into J. curcas oil with mechanical extraction but degraded with solvent extraction. For mechanical extraction, the total internal exergy destroyed is 3006 MJ, which is about six times less than that for solvent extraction (18,072 MJ) per ton of J. curcas oil produced. The pretreatment of the J. curcas seeds recorded a total internal exergy destruction of 5768 MJ, accounting for 24% of the total internal exergy destroyed for the solvent extraction process and 66% for mechanical extraction. The exergetic efficiencies recorded are 79.35% and 95.93% for the solvent and mechanical extraction of J. curcas oil, respectively. Hence, mechanical oil extraction processes are more exergetically efficient than solvent extraction processes. Possible improvement methods are also elaborated in this study.

  19. Methods, media and systems for managing a distributed application running in a plurality of digital processing devices

    Science.gov (United States)

    Laadan, Oren; Nieh, Jason; Phung, Dan

    2012-10-02

    Methods, media and systems for managing a distributed application running in a plurality of digital processing devices are provided. In some embodiments, a method includes running one or more processes associated with the distributed application in virtualized operating system environments on a plurality of digital processing devices, suspending the one or more processes, and saving network state information relating to network connections among the one or more processes. The method further includes storing process information relating to the one or more processes, recreating the network connections using the saved network state information, and restarting the one or more processes using the stored process information.

  20. A measurement method for micro 3D shape based on grids-processing and stereovision technology

    International Nuclear Information System (INIS)

    Li, Chuanwei; Xie, Huimin; Liu, Zhanwei

    2013-01-01

    An integrated measurement method for micro 3D surface shape based on a combination of stereovision technology in a scanning electron microscope (SEM) and grids-processing methodology is proposed, and its principle is introduced in detail. By capturing two images of the tested specimen, with grids on its surface, at different tilt angles in an SEM, the 3D surface shape of the specimen can be obtained. Numerical simulation is applied to analyze the feasibility of the proposed method, and a validation experiment is performed. The surface shape of metal-wire/polymer-membrane structures under thermal deformation is reconstructed. By processing the surface grids of the specimen, the out-of-plane displacement field of the specimen surface is also obtained. The experimental error of the proposed method is discussed by comparison with measurement results obtained by a 3D digital microscope. (paper)
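
    A common simplification in SEM tilt-stereo photogrammetry (not necessarily the exact reconstruction model of the paper) relates the height of a surface point to the parallax of corresponding grid nodes between the two tilted images; a small sketch with invented numbers is given below.

    ```python
    import numpy as np

    def height_from_parallax(parallax_um, tilt_angle_deg):
        """Classical SEM tilt-stereo relation for a symmetric tilt:
        z = p / (2 * sin(delta_theta / 2)), with the parallax p already
        converted from pixels to micrometres via the image scale.  This is a
        textbook approximation, not the paper's reconstruction model."""
        return parallax_um / (2.0 * np.sin(np.radians(tilt_angle_deg) / 2.0))

    # Hypothetical parallaxes of a few grid nodes (um) between images tilted by 10 deg.
    parallax = np.array([0.00, 0.42, 0.85, 1.10])
    print(np.round(height_from_parallax(parallax, 10.0), 2), "um above the reference node")
    ```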