WorldWideScience

Sample records for TRUEX process applied

  1. TRUEX partitioning studies applied to ICPP sodium-bearing waste

    Energy Technology Data Exchange (ETDEWEB)

    Herbst, R.S.; Brewer, K.N.; Law, J.D.; Tranter, T.J.; Todd, T.A.

    1994-05-01

    The Idaho Chemical Processing Plant (ICPP), located in southeast Idaho at the USDOE Idaho National Engineering Laboratory, formerly reprocessed highly enriched spent nuclear fuel to recover fissionable uranium. The HLW raffinates from the combined PUREX/REDOX type uranium recovery process were converted to solid oxides (calcine) in a high temperature fluidized bed. Liquid effluents from the calcination process were combined with liquid sodium-bearing waste (SBW) generated primarily in conjunction with decontamination activities. Due to the high sodium content in the SBW, this secondary waste stream is not directly amenable to solidification via calcination. Currently, approximately 1.5 million gallons of liquid SBW are stored at the ICPP in large tanks. Several treatment options for the SBW are currently being considered, including the TRansUranic EXtraction (TRUEX) process developed by Horwitz and co-workers at Argonne National Laboratory (ANL), in preparation for the final disposition of SBW. Herein described are experimental results of radionuclide tracer studies with simulated SBW using the TRUEX process solvent.

  2. Transuranic decontamination of nitric acid solutions by the TRUEX solvent extraction process: preliminary development studies

    International Nuclear Information System (INIS)

    Vandegrift, G.F.; Leonard, R.A.; Steindler, M.J.; Horwitz, E.P.; Basile, L.J.; Diamond, H.; Kalina, D.G.; Kaplan, L.

    1984-07-01

    This report summarizes the work that has been performed to date at Argonne National Laboratory on the development of the TRUEX process, a solvent extraction process employing a bifunctional organophosphorus reagent in a PUREX process solvent (tributyl phosphate-normal paraffinic hydrocarbons). The purpose of this extraction process is to separate and concentrate transuranic (TRU) elements from nuclear waste. Assessments were made of the use of two TRUEX solvents: one incorporating the well-studied dihexyl-N,N-diethylcarbamoylmethylphosphonate (DHDECMP) and a second incorporating an extractant with superior properties for a 1 M HNO3 acid feed, octyl(phenyl)-N,N-diisobutylcarbamoylmethylphosphine oxide (OφD[IB]CMPO). In this report, conceptual flowsheets for the removal of soluble TRUs from high-level nuclear wastes using these two TRUEX process solvents are presented, and flowsheet features are discussed in detail. The conceptual flowsheet for TRU-element removal from a PUREX waste by the OφD[IB]CMPO-TRUEX process solvent was tested in a bench-scale countercurrent experiment, and results of that experiment are presented and discussed. The conclusion of this study is that the TRUEX process is able to separate TRUs from high-level wastes so that the major portion of the solid waste (approx. 99%) can be classified as non-TRU. Areas where more experimentation is needed are listed at the end of the report. 45 references, 17 figures, 56 tables

  3. The TRUEX [TRansUranium EXtraction] process and the management of liquid TRU [transuranic] waste

    International Nuclear Information System (INIS)

    Schulz, W.W.; Horwitz, E.P.

    1987-01-01

    The TRUEX process is a new generic liquid-liquid extraction process for removal of all actinides from acidic nitrate or chloride nuclear waste solutions. Because of its high efficiency and great flexibility, the TRUEX process appears destined to be widely used in the US and possibly in other countries for cost-effective management and disposal of transuranic (TRU) wastes. In the US, TRU wastes are those that contain ≥3.7 x 10^6 Bq/kg of TRU elements with half-lives greater than 20 y. This paper gives a brief review of the relevant chemistry and summarizes the current status of development and deployment of the TRUEX (TRansUranium EXtraction) process flowsheets to treat specific acidic waste solutions at several US Department of Energy sites. 19 refs., 4 figs., 4 tabs

  4. TRUEX hot demonstration

    International Nuclear Information System (INIS)

    Chamberlain, D.B.; Leonard, R.A.; Hoh, J.C.; Gay, E.C.; Kalina, D.G.; Vandegrift, G.F.

    1990-04-01

    In FY 1987, a program was initiated to demonstrate technology for recovering transuranic (TRU) elements from defense wastes. This hot demonstration was to be carried out with solution from the dissolution of irradiated fuels. This recovery would be accomplished with both PUREX and TRUEX solvent extraction processes. Work planned for this program included preparation of a shielded-cell facility for the receipt and storage of spent fuel from commercial power reactors, dissolution of this fuel, operation of a PUREX process to produce specific feeds for the TRUEX process, operation of a TRUEX process to remove residual actinide elements from PUREX process raffinates, and processing and disposal of waste and product streams. This report documents the work completed in planning and starting up this program. It is meant to serve as a guide for anyone planning similar demonstrations of TRUEX or other solvent extraction processing in a shielded-cell facility

  5. Modified TRUEX process for the treatment of high-level liquid waste

    International Nuclear Information System (INIS)

    Arai, Kazuhiro; Yamashita, Masatada; Hatta, Masahisa; Tomiyasu, Hiroshi; Ikeda, Yasuhisa.

    1997-01-01

    The TRUEX process has been examined to recover Am and Cm from the high-level liquid waste of a Purex reprocessing plant. Continuous countercurrent extraction and back-extraction experiments were carried out in a mixer-settler using simulated waste solution for three process flowsheets, i.e. a process flowsheet developed by Argonne National Laboratory and two other process flowsheets which were modified in the scrub stage. The results indicate that the Argonne National Laboratory flowsheet cannot be applied to high-level liquid waste containing high concentrations of lanthanide and actinide elements because of the formation of insoluble salts of these elements with oxalic acid, which is added to restrict the extraction of fission products such as Mo and Zr. A modified process flowsheet, which had only one scrub stage with a high concentration of nitric acid, was found to be the best of the three process flowsheets examined; Nd, used as a surrogate for Am and Cm, was sufficiently recovered and no precipitation of oxalate salts was observed. (author)

  6. Extensive separations (CLEAN) processing strategy compared to TRUEX strategy and sludge wash ion exchange

    International Nuclear Information System (INIS)

    Knutson, B.J.; Jansen, G.; Zimmerman, B.D.; Seeman, S.E.; Lauerhass, L.; Hoza, M.

    1994-08-01

    Numerous pretreatment flowsheets have been proposed for processing the radioactive wastes in Hanford's 177 underground storage tanks. The CLEAN Option is examined along with two other flowsheet alternatives to quantify the trade-off of greater capital equipment and operating costs for aggressive separations against the reduced waste disposal costs and decreased environmental/health risks. The effect on the volume of HLW glass product and the radiotoxicity of the LLW glass or grout product is predicted with current assumptions about waste characteristics and separations processes using a mass balance model. The prediction is made for three principal processing options: washing of tank wastes with removal of cesium and technetium from the supernatant, with washed solids routed directly to the glass (the Sludge Wash C processing strategy); the previous steps plus dissolution of the solids and removal of transuranic (TRU) elements, uranium, and strontium using solvent extraction processes (the Transuranic Extraction Option C (TRUEX-C) processing strategy); and an aggressive yet feasible processing strategy for separating the waste components to meet several main goals or objectives (the CLEAN Option processing strategy), namely that the LLW meet the US Nuclear Regulatory Commission Class A limits, that concentrations of technetium, iodine, and uranium be reduced as low as reasonably achievable, and that the HLW be contained within 1,000 borosilicate glass canisters that meet current Hanford Waste Vitrification Plant glass specifications

  8. Removal of actinides from dissolved ORNL MVST sludge using the TRUEX process

    Energy Technology Data Exchange (ETDEWEB)

    Spencer, B.B.; Egan, B.Z.; Chase, C.W.

    1997-07-01

    Experiments were conducted to evaluate the TRansUranium EXtraction (TRUEX) process for partitioning actinides from actual dissolved high-level radioactive waste sludge. All tests were performed at ambient temperature. Time and budget constraints permitted only two experimental campaigns. Samples of sludge from Melton Valley Storage Tank W-25 were rinsed with mild caustic (0.2 M NaOH) to reduce the concentrations of nitrates and fission products associated with the interstitial liquid. In one campaign, the rinsed sludge was dissolved in nitric acid to produce a solution containing total metal concentrations of ca. 1.8 M with a nitric acid concentration of ca. 2.9 M. About 50% of the dry mass of the sludge was dissolved. In the other campaign, the sludge was neutralized with nitric acid to destroy the carbonates, then leached with ca. 2.6 M NaOH for ca. 6 h before rinsing with the mild caustic. The sludge was then dissolved in nitric acid to produce a solution containing total metal concentrations of ca. 0.6 M with a nitric acid concentration of ca. 1.7 M. About 80% of the sludge dissolved. The dissolved sludge solution from the first campaign began gelling immediately, and a visible gel layer was observed after 8 days. In the second campaign, the solution became hazy after ca. 8 days, indicating gel formation, but did not display separated gel layers after aging for 20 days. Batch liquid-liquid equilibrium tests of both the extraction and stripping operations were conducted. Chemical analyses of both phases were used to evaluate the process. Evaluation was based on two metrics: the fraction of TRU elements removed from the dissolved sludge and comparison of the results with predictions made with the Generic TRUEX Model (GTM). The fractions of Eu, Pu, Cm, Th, and U species removed from aqueous solution in only one extraction stage were > 95% and were close to the values predicted by the GTM. Mercury was also found to be strongly extracted, with a one-stage removal of > 92%.
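
    As a point of reference for interpreting the single-stage removal fractions quoted above, the standard solvent-extraction relations between a distribution ratio D and the fraction extracted in one equilibrium contact are given below (generic textbook relations, not taken from the GTM; O/A denotes the organic-to-aqueous phase ratio):

    ```latex
    D = \frac{[\mathrm{M}]_{\mathrm{org}}}{[\mathrm{M}]_{\mathrm{aq}}},
    \qquad
    f_{\text{extracted}} = \frac{D\,(O/A)}{1 + D\,(O/A)}
    ```

    For example, at an assumed O/A of 1, a distribution ratio of 12 would give about 92% removal in a single contact (12/13 ≈ 0.92); the phase ratios and distribution ratios behind the reported removal values are not stated in the record.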

  9. Mercury extraction by the TRUEX process solvent. II. Selective partitioning of mercury from co-extracted actinides in a simulated acidic ICPP waste stream

    International Nuclear Information System (INIS)

    Brewer, K.N.; Herbst, R.S.; Tranter, T.J.; Todd, T.A.

    1995-01-01

    The TRUEX process is being evaluated at the Idaho Chemical Processing Plant (ICPP) as a means to partition the actinides from acidic sodium-bearing waste (SBW). The mercury content of this waste averages 1 g/l. Because the chemistry of mercury has not been extensively evaluated in the TRUEX process, mercury was singled out as an element of interest. Radioactive mercury, 203Hg, was spiked into a simulated solution of SBW containing 1 g/l mercury. Successive extraction batch contacts with the mercury-spiked waste and successive scrubbing and stripping batch contacts of the mercury-loaded TRUEX solvent (0.2 M CMPO-1.4 M TBP in dodecane) show that mercury will extract into and strip from the solvent. The extraction distribution coefficient for mercury, as HgCl2, from SBW having a nitric acid concentration of 1.4 M and a chloride concentration of 0.035 M was found to be 3. The stripping distribution coefficient was found to be 0.5 with 5 M HNO3 and 0.077 with 0.25 M Na2CO3. Because the experiments described here show that mercury can be extracted from SBW and stripped from the solvent, a process has been developed to partition mercury from the actinides in SBW. 10 refs., 3 figs., 10 tabs
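
    To make the reported distribution coefficients concrete, the short Python sketch below estimates the mercury fraction transferred in successive batch contacts. The organic-to-aqueous phase ratio is assumed to be 1 for illustration, since the phase ratios used in the experiments are not given in this record:

    ```python
    def fraction_remaining(d, o_to_a=1.0, contacts=1):
        """Fraction of solute left in the aqueous phase after `contacts` successive
        batch contacts with fresh solvent, for a constant distribution coefficient
        d = [M]_org / [M]_aq and organic-to-aqueous phase ratio o_to_a."""
        per_contact = 1.0 / (1.0 + d * o_to_a)
        return per_contact ** contacts

    # Extraction of Hg from simulated SBW (D = 3, from the abstract above)
    for n in (1, 2, 3):
        print(f"Hg left in aqueous phase after {n} contact(s): "
              f"{fraction_remaining(3.0, contacts=n):.1%}")

    # Stripping of Hg from the loaded solvent: d is still organic/aqueous, so the
    # fraction stripped (moved back to the aqueous phase) per contact is
    # 1 / (1 + d * o_to_a).
    for strip_d, medium in ((0.5, "5 M HNO3"), (0.077, "0.25 M Na2CO3")):
        stripped = 1.0 / (1.0 + strip_d)  # single contact, O/A = 1 assumed
        print(f"Hg stripped per contact with {medium}: {stripped:.1%}")
    ```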

  10. TRUEX Radiolysis Testing Using the INL Radiolysis Test Loop: FY-2012 Status Report

    Energy Technology Data Exchange (ETDEWEB)

    Dean R. Peterman; Lonnie G. Olson; Richard D. Tillotson; Rocklan G. McDowell; Jack D. Law

    2012-09-01

    The INL radiolysis test loop has been used to evaluate the effect of radiolytic degradation upon the efficacy of the strip section of the TRUEX flowsheet for the recovery of trivalent actinides and lanthanides from acidic solution. The nominal composition of the TRUEX solvent used in this study is 0.2 M CMPO and 1.4 M TBP dissolved in n-dodecane, and the nominal composition of the TRUEX strip solution is 1.5 M lactic acid and 0.050 M diethylenetriaminepentaacetic acid. Gamma irradiation of a mixture of TRUEX process solvent and stripping solution in the test loop does not adversely impact flowsheet performance as measured by americium stripping distribution ratios. The observed increase in americium stripping distribution ratios with increasing absorbed dose indicates the radiolytic production of organic-soluble degradation compounds.

  11. Characterization of radiolytically generated degradation products in the strip section of a TRUEX flowsheet

    Energy Technology Data Exchange (ETDEWEB)

    Peterman, Dean R. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Olson, Lonnie G. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groenewold, Gary S. [Idaho National Lab. (INL), Idaho Falls, ID (United States); McDowell, Rocklan G. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Tillotson, Richard D. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Law, Jack D. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2013-08-01

    This report presents a summary of the work performed to meet the FCRD level 2 milestone M3FT-13IN0302053, “Identification of TRUEX Strip Degradation.” The INL radiolysis test loop has been used to identify radiolytically generated degradation products in the strip section of the TRUEX flowsheet. These data were used to evaluate the impact of the formation of radiolytic degradation products in the strip section upon the efficacy of the TRUEX flowsheet for the recovery of trivalent actinides and lanthanides from acidic solution. The nominal composition of the TRUEX solvent used in this study is 0.2 M CMPO and 1.4 M TBP dissolved in n-dodecane, and the nominal composition of the TRUEX strip solution is 1.5 M lactic acid and 0.050 M diethylenetriaminepentaacetic acid. Gamma irradiation of a mixture of TRUEX process solvent and stripping solution in the test loop does not adversely impact flowsheet performance as measured by americium stripping distribution ratios. The observed increase in americium stripping distribution ratios with increasing absorbed dose indicates the radiolytic production of organic-soluble degradation compounds.

  12. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  13. Summary of TRUEX Radiolysis Testing Using the INL Radiolysis Test Loop

    Energy Technology Data Exchange (ETDEWEB)

    Dean R. Peterman; Lonnie G. Olson; Rocklan G. McDowell; Gracy Elias; Jack D. Law

    2012-03-01

    The INL radiolysis and hydrolysis test loop has been used to evaluate the effects of hydrolytic and radiolytic degradation upon the efficacy of the TRUEX flowsheet for the recovery of trivalent actinides and lanthanides from acidic solution. Repeated irradiation and subsequent re-conditioning cycles did result in a significant decrease in the concentration of the TBP and CMPO extractants in the TRUEX solvent and a corresponding decrease in americium and europium extraction distributions. However, the build-up of solvent degradation products upon γ-irradiation had little impact upon the efficiency of the stripping section of the TRUEX flowsheet. Operation of the TRUEX flowsheet would require careful monitoring to ensure extraction distributions are maintained at acceptable levels.

  14. Extraction and stripping of neodymium (III) and dysprosium (III) by TRUEX solvent

    International Nuclear Information System (INIS)

    Rout, Alok; Venkatesan, K.A.; Antony, M.P.; Srinivasan, T.G.; Vasudeva Rao, P.R.

    2009-01-01

    McCabe-Thiele diagrams for the extraction and stripping of Nd(III) and Dy(III) by the TRUEX solvent have been constructed to determine the number of stages required for complete extraction and stripping. (author)
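
    The record does not give the resulting stage counts. For orientation, when the distribution ratio D is roughly constant, the stage count implied by a McCabe-Thiele construction for a countercurrent cascade fed with fresh solvent can also be estimated from the standard Kremser relation (a textbook result, not taken from this paper; E is the extraction factor and φ the fraction of metal remaining in the raffinate after N ideal stages):

    ```latex
    E = D\,\frac{O}{A},
    \qquad
    \varphi = \frac{x_N}{x_0} = \frac{E - 1}{E^{\,N+1} - 1} \quad (E \neq 1),
    \qquad
    \varphi = \frac{1}{N + 1} \quad (E = 1)
    ```

    Solving for N gives the minimum number of ideal stages for a target recovery; the graphical McCabe-Thiele stepping used by the authors also covers the general case in which D varies with loading.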

  15. Bologna Process: Apply or Not Apply

    Directory of Open Access Journals (Sweden)

    Muzaffer ELMAS

    2012-01-01

    There are many studies carried out on education and training all over the world. The U.S., Japan, Australia, East Asia and Europe continue this work in different ways, but the main idea is the same: designing and sustaining educational systems based on learning outcomes, student-centered approaches, system evaluation and quality cycles rather than inputs. Studies carried out in parallel with developments in the world are made under the European Higher Education Area/Bologna Process. The aim here is to have graduates who are world citizens open to change. In recent years, in parallel with the changes in the world and especially based on Bologna Process reforms, comprehensive studies such as the National Qualifications Framework, ECTS credits, student exchange programs and quality assurance systems are carried out in Turkey. Challenges, problems and bureaucracy make it difficult to sustain these important studies. Creating a cycle to ensure the sustainability of the quality of this work is the most important step. A university management and evaluation scheme consists of corporate figures such as students, academic staff, physical infrastructure and financial parameters, and of process qualities for student achievement, scientific research and community share, with their results shaped by the managerial and behavioral capability of the system. Goals, objectives and performance in the activities of individuals, departments and faculties of the universities are getting more important to identify and evaluate in certain periods. This challenging work, with its high bureaucracy, strains university management and other stakeholders, and increases resistance to the quality processes. Establishing systems to ensure sustainable quality gains importance as understanding and implementing the change in the world. Operating these quality processes with a systematic approach based on web technologies will result in reduced work load, increased

  16. Processing techniques applying laser technology

    International Nuclear Information System (INIS)

    Yamada, Yuji; Makino, Yoshinobu

    2000-01-01

    The requirements for the processing of nuclear energy equipment include high precision, low distortion, and low heat input. Toshiba has developed laser processing techniques for cutting, welding, and surface heat treatment of nuclear energy equipment because the zone affected by distortion and heat in laser processing is very small. Laser processing contributes to the manufacturing of high-quality and high-reliability equipment and reduces the manufacturing period. (author)

  17. Safety confirmation study of TRUEX solvent by accelerating rate calorimeter (ARC)

    International Nuclear Information System (INIS)

    Sato, Yoshihiko; Hirumachi, Suguru; Takeda, Shinso; Kanazawa, Yoshito; Sasaya, Shinji

    1999-02-01

    In order to confirm the engineering safety of the TRUEX solvent (a mixed solvent of CMPO/TBP/n-dodecane) for separating the transuranics from high-level activity liquid waste in advanced nuclear fuel recycling technology R and D, the thermal and pressure behavior on heating of the PUREX solvent (a mixed solvent of 30% TBP-n-dodecane) and the TRUEX solvent, and in the exothermic reaction of the TRUEX solvent, etc., with nitric acid, were measured in a sealed adiabatic system, a more severe condition than in an actual plant, using an accelerating rate calorimeter (ARC). The Arrhenius parameters (activation energy and frequency factor), which are necessary for the evaluation of the reaction rate, were determined from the ARC measurement data. Analytical methods and analysis conditions for the reaction products were examined in order to clarify the chemical form of the products of the exothermic reaction between solvent and nitric acid in the ARC, and a qualitative evaluation was carried out. The main results are as follows. 1) TBP, CMPO, n-dodecane and 10 M nitric acid showed little exothermic behavior as single substances. 2) For the solvent phase after the solvent had been contacted with 10 M nitric acid and equilibrium had been attained (single-phase sample), the heat quantity per unit sample weight of the TRUEX solvent tended to be larger than that of the PUREX solvent when the heat quantity was evaluated in the ARC. However, for the mixed sample of solvent and 10 M nitric acid enclosed in a sample container simultaneously (two-phase sample), the heat quantity per unit solvent weight was almost equivalent for the PUREX solvent and the TRUEX solvent. 3) A kinetic analysis was carried out, and for the TBP-10 M nitric acid single-phase sample the activation energy of the reaction was evaluated to be 118 kJ/mol. This activation energy is approximately equal to the value of 112 kJ/mol reported by Nichols. The reaction rate constant was calculated, and it was shown that reaction rate constants of PUREX solvent-10 M nitric acid single-phase sample and
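
    As an illustration of how the reported activation energy enters the rate evaluation, the sketch below uses the Arrhenius equation to compare rate constants at two temperatures. The 118 kJ/mol value is taken from the abstract; the temperatures are arbitrary examples, and the frequency factor cancels in the ratio:

    ```python
    import math

    R = 8.314  # gas constant, J/(mol*K)

    def arrhenius_ratio(ea_j_per_mol, t1_k, t2_k):
        """Ratio k(T2)/k(T1) from the Arrhenius equation k = A*exp(-Ea/(R*T));
        the frequency factor A cancels, so only the activation energy is needed."""
        return math.exp(-ea_j_per_mol / R * (1.0 / t2_k - 1.0 / t1_k))

    ea = 118e3  # J/mol, TBP-10 M nitric acid single-phase sample (from the abstract)

    # Example: relative speed-up of the reaction between 100 degC and 120 degC
    print(arrhenius_ratio(ea, t1_k=373.15, t2_k=393.15))  # roughly a factor of 7
    ```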

  18. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t

  19. Applied medical image processing a basic course

    CERN Document Server

    Birkfellner, Wolfgang

    2014-01-01

    A widely used, classroom-tested text, Applied Medical Image Processing: A Basic Course delivers an ideal introduction to image processing in medicine, emphasizing the clinical relevance and special requirements of the field. Avoiding excessive mathematical formalisms, the book presents key principles by implementing algorithms from scratch and using simple MATLAB®/Octave scripts with image data and illustrations on an accompanying CD-ROM or companion website. Organized as a complete textbook, it provides an overview of the physics of medical image processing and discusses image formats and data storage, intensity transforms, filtering of images and applications of the Fourier transform, three-dimensional spatial transforms, volume rendering, image registration, and tomographic reconstruction.

  20. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. Highlights: a good and solid introduction to probability theory and stochastic processes; logically organized, with writing presented in a clear manner; a comprehensive choice of topics within the area of probability; ample homework problems organized into chapter sections

  1. Dissimilarity between Markovian processes applied to industrial processes

    Science.gov (United States)

    García, Jesús E.; González-López, V. A.; de Andrade, F. H. Kubo

    2017-07-01

    In this paper we introduce a result which solves the problem of how to measure the similarity or discrepancy between two Markovian stochastic processes. The result is based on a consistent measure derived from a generalization of the Bayesian Information Criterion. We apply this concept to a practical problem with the aim of analyzing the similarity or discrepancy between two processes related to fuel alcohol production, which should be considered indistinguishable, according to their technical specifications.
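
    The abstract does not reproduce the statistic itself. As a rough, hypothetical illustration of the general idea (comparing the penalized likelihood of modelling two samples with separate first-order transition matrices versus a single shared one, using a BIC-style penalty), here is a minimal Python sketch; it is not the authors' measure:

    ```python
    import numpy as np

    def transition_counts(seq, n_states):
        """Count first-order transitions in an integer-coded sequence."""
        counts = np.zeros((n_states, n_states))
        for a, b in zip(seq[:-1], seq[1:]):
            counts[a, b] += 1
        return counts

    def log_likelihood(counts):
        """Maximized log-likelihood of a first-order Markov chain given its counts."""
        probs = counts / np.maximum(counts.sum(axis=1, keepdims=True), 1)
        with np.errstate(divide="ignore"):
            terms = counts * np.where(counts > 0, np.log(probs), 0.0)
        return terms.sum()

    def dissimilarity(seq_x, seq_y, n_states):
        """BIC-style gain from modelling the two samples with separate transition
        matrices instead of one shared matrix; larger values suggest the processes
        are more clearly distinguishable."""
        cx = transition_counts(seq_x, n_states)
        cy = transition_counts(seq_y, n_states)
        k = n_states * (n_states - 1)        # free parameters per chain
        n = len(seq_x) + len(seq_y) - 2      # total number of observed transitions
        separate = log_likelihood(cx) + log_likelihood(cy) - k * np.log(n)
        shared = log_likelihood(cx + cy) - 0.5 * k * np.log(n)
        return separate - shared

    # Toy usage with two synthetic 3-state sequences
    rng = np.random.default_rng(0)
    x = rng.integers(0, 3, size=500)
    y = rng.integers(0, 3, size=500)
    print(dissimilarity(x, y, n_states=3))
    ```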

  2. Artificial intelligence applied to process signal analysis

    Science.gov (United States)

    Corsberg, Dan

    1988-01-01

    Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. By using artificial intelligence techniques and a knowledge-base approach to this problem, the power of the computer can be harnessed to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.

  3. Applying the redox process to arsenical concentrates

    Science.gov (United States)

    Beattie, M. J. V.; Ismay, Arnaldo

    1990-01-01

    Extensive batch and continuous testing has been completed using a high-temperature, nitric acid pressure leach (Redox) process for oxidizing the refractory gold-containing arsenopyrite tailings presently stockpiled at Snow Lake, Manitoba. This process has achieved up to 99% oxidation of the arsenopyrite compound and precipitated more than 90% of the arsenic into a stable iron-arsenic compound (resembling ferric arsenate) in less than eight minutes of overall retention time at temperatures of 195-210°C and an oxygen overpressure of 345 kPa. The oxidation step then exposes the contained gold, allowing a recovery of 91.5% in a standard carbon-in-leach circuit. The main advantages of this process are fast reaction rates, the high proportion of arsenic precipitated, and the stability of the precipitate.

  4. Process engineering applied to receiving waters

    DEFF Research Database (Denmark)

    Harremoës, Poul

    1973-01-01

    Processes in the media which receive waste must be included in the sphere of interest within industry's environmental consciousness. Pollution problems are not very simple and should thus not be dealt with in too simple a fashion. Chemical engineers are very well suited to deal with the stagger...

  5. Applying interactive control to waste processing operations

    International Nuclear Information System (INIS)

    Grasz, E.L.; Merrill, R.D.; Couture, S.A.

    1992-08-01

    At present, waste and residue processing includes steps that require human interaction. The risk of exposure to unknown hazardous materials and the potential for radiation contamination motivate the desire to remove operators from these processes. Technologies that facilitate this include glove box robotics, modular systems for remote and automated servicing, and interactive controls that minimize human intervention. LLNL is developing an automated system which is designed to supplant the operator for glove box tasks, thus protecting the operator from the risk of radiation exposure and minimizing operator-associated waste. Although most of the processing can be automated with minimal human interaction, there are some tasks where intelligent intervention is both desirable and necessary to adapt to unexpected circumstances and events. These activities require that the operator interact with the process using a remote manipulator which provides or reflects a natural feel to the operator. The remote manipulation system which was developed incorporates sensor fusion and interactive control, and provides the operator with an effective means of controlling the robot in a potentially unknown environment. This paper describes recent accomplishments in technology development and integration, and outlines the future goals of Lawrence Livermore National Laboratory for achieving this integrated interactive control capability

  6. Digital Signal Processing applied to Physical Signals

    CERN Document Server

    Alberto, Diego; Musa, L

    2011-01-01

    It is well known that many of the scientific and technological discoveries of the 21st century will depend on the capability of processing and understanding a huge quantity of data. With the advent of the digital era, a fully digital and automated treatment can be designed and performed. From data mining to data compression, from signal elaboration to noise reduction, processing is essential to manage and enhance features of interest after every data acquisition (DAQ) session. In the near future, science will move towards interdisciplinary research. In this work an example is given of the application of signal processing to different fields of physics, from nuclear particle detectors to biomedical examinations. In Chapter 1 a brief description of the collaborations that allowed this thesis is given, together with a list of the publications co-produced by the author in these three years. The most important notations, definitions and acronyms used in the work are also provided. In Chapter 2, the last r...

  7. The controlled vitrification/crystallisation process applied

    Directory of Open Access Journals (Sweden)

    Romero, M.

    2000-02-01

    The glass-ceramic process, as well as the usual processing of ceramic and vitreous materials, is being investigated as a promising way for the isolation and recycling of both mineral wastes (debris and mineral residues, clearings in public works) and inorganic industrial wastes (muds, slags, fly ashes). Synthetic materials with useful properties for use as building materials have been prepared from inorganic wastes of different types (red muds from zinc hydrometallurgy, fly ashes from thermal power stations, slags and fly ashes from domiciliary incinerators) as well as from mixtures of such wastes with other raw materials. The obtained results allow us to conclude that the ceramic and glass-ceramic processes are emerging as a useful alternative to solve the social and environmental problems associated with waste production.

    The glass-ceramic process, as well as the usual processing of ceramic and vitreous materials, is currently being investigated as a promising route for the isolation, inertization and even recycling of mineral wastes (mine dumps and tailings, excavated material from public works, etc.) and industrial wastes (muds, sludges, slags, ashes, etc.). From inorganic wastes of different natures (muds from zinc hydrometallurgy, ashes from thermal power stations, slags and ashes from incineration plants) as well as from mixtures of these with other raw materials, synthetic materials with wide applications in construction and public works are being obtained. The results achieved so far allow the conclusion that the ceramic and glass-ceramic processes are emerging as a real and useful alternative for solving, at least partially, the social and environmental problems associated with the production of these wastes.

  8. Rejuvenation processes applied to 'poisoned' anion exchangers in uranium processing

    International Nuclear Information System (INIS)

    Gilmore, A.J.

    1979-11-01

    The removal of 'poisons' from anion exchangers in uranium processing of Canadian radioactive ores is commonly called rejuvenation or regeneration. The cost of ion exchange recovery of uranium is adversely affected by a decrease in the capacity and efficiency of the anion exchangers due to their being 'poisoned' by silica, elemental sulphur, molybdenum and tetrathionates. These 'poisons' have a high affinity for the anion exchangers, are adsorbed in preference to the uranyl complex, and do not desorb with the reagents used normally in the uranyl desorption phase. The frequency of rejuvenation and the reagents required for rejuvenation are determined by the severity of the 'poisoning' accumulated by the exchanger in contact with the uranium leach liquor. Caustic soda (NaOH), at approximately 18 cents/lb, is commonly used to remove tetrathionate (S4O6^2-) 'poisons' from uranium anion exchangers. A potential saving in operating cost would result if other reagents, e.g. sodium carbonate (Na2CO3) at approximately 3.6 cents/lb or calcium hydroxide (Ca(OH)2) at approximately 1.9 cents/lb, were effective in removing S4O6^2- from a 'poisoned' exchanger. A rejuvenation process for a test program was adopted after a perusal of the literature

  9. Process for applying control variables having fractal structures

    Science.gov (United States)

    Bullock, IV, Jonathan S.; Lawson, Roger L.

    1996-01-01

    A process and apparatus for the application of a control variable having a fractal structure to a body or process. The process of the present invention comprises the steps of generating a control variable having a fractal structure and applying the control variable to a body or process reacting in accordance with the control variable. The process is applicable to electroforming where first, second and successive pulsed-currents are applied to cause the deposition of material onto a substrate, such that the first pulsed-current, the second pulsed-current, and successive pulsed currents form a fractal pulsed-current waveform.
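
    As a toy illustration of what a "control variable having a fractal structure" can look like, the sketch below builds a self-similar (Cantor-like) pulsed-current waveform by recursive subdivision. The construction, amplitudes and durations are hypothetical; the record does not specify how the fractal waveform is generated:

    ```python
    import numpy as np

    def cantor_pulse_train(depth, samples_per_segment=1, amplitude=1.0):
        """Self-similar on/off pulse pattern: at each level the previous pattern is
        repeated on the outer thirds and the middle third is switched off,
        reproducing the Cantor-set construction as a train of pulses."""
        pattern = np.array([1.0])
        for _ in range(depth):
            pattern = np.concatenate([pattern, np.zeros_like(pattern), pattern])
        return amplitude * np.repeat(pattern, samples_per_segment)

    # Three-level fractal current waveform (arbitrary amplitude units)
    waveform = cantor_pulse_train(depth=3, samples_per_segment=10, amplitude=2.5)
    print(len(waveform), waveform[:15])
    ```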

  10. Object Oriented Business Process Modelling in RFID Applied Computing Environments

    Science.gov (United States)

    Zhao, Xiaohui; Liu, Chengfei; Lin, Tao

    As a tracking technology, Radio Frequency Identification (RFID) is now widely applied to enhance the context awareness of enterprise information systems. Such awareness provides great opportunities to facilitate business process automation and thereby improve operation efficiency and accuracy. With the aim to incorporate business logics into RFID-enabled applications, this book chapter addresses how RFID technologies impact current business process management and the characteristics of object-oriented business process modelling. This chapter first discusses the rationality and advantages of applying object-oriented process modelling in RFID applications, then addresses the requirements and guidelines for RFID data management and process modelling. Two typical solutions are introduced to further illustrate the modelling and incorporation of business logics/business processes into RFID edge systems. To demonstrate the applicability of these two approaches, a detailed case study is conducted within a distribution centre scenario.

  11. Applying Parallel Processing Techniques to Tether Dynamics Simulation

    Science.gov (United States)

    Wells, B. Earl

    1996-01-01

    The focus of this research has been to determine the effectiveness of applying parallel processing techniques to a sizable real-world problem, the simulation of the dynamics associated with a tether which connects two objects in low earth orbit, and to explore the degree to which the parallelization process can be automated through the creation of new software tools. The goal has been to utilize this specific application problem as a base to develop more generally applicable techniques.

  12. Applying the design process to apparel prototype development

    African Journals Online (AJOL)

    user

    Applying the design process to apparel prototype development: students' experiences of a community service-learning project. 28 .... advantage for the community is job creation, improvement of the quality of products and ..... from Brown and Rice (2001:47-51), included price/value for money, quality/finishing, aes-.

  13. Controllable unit concept as applied to a hypothetical tritium process

    International Nuclear Information System (INIS)

    Seabaugh, P.W.; Sellers, D.E.; Woltermann, H.A.; Boh, D.R.; Miles, J.C.; Fushimi, F.C.

    1976-01-01

    A methodology (controllable unit accountability) is described that identifies controlling errors for corrective action, locates areas and time frames of suspected diversions, defines time and sensitivity limits of diversion flags, defines the time frame in which pass-through quantities of accountable material and by inference SNM remain controllable and provides a basis for identification of incremental cost associated with purely safeguards considerations. The concept provides a rationale from which measurement variability and specific safeguard criteria can be converted into a numerical value that represents the degree of control or improvement attainable with a specific measurement system or combination of systems. Currently the methodology is being applied to a high-throughput, mixed-oxide fuel fabrication process. The process described is merely used to illustrate a procedure that can be applied to other more pertinent processes

  14. Digital processing methodology applied to exploring of radiological images

    International Nuclear Information System (INIS)

    Oliveira, Cristiane de Queiroz

    2004-01-01

    In this work, digital image processing is applied as an automatic computational method aimed at the exploration of radiological images. An automatic routine was developed, based on segmentation and post-processing techniques, for radiological images acquired from an arrangement consisting of an X-ray tube, a molybdenum target and filter of 0.4 mm and 0.03 mm, respectively, and a CCD detector. The efficiency of the developed methodology is demonstrated in this work through a case study in which internal injuries in mangoes are automatically detected and monitored. This methodology is a possible tool to be introduced in the post-harvest process in packing houses. A dichotomic test was applied to evaluate the efficiency of the method. The results show 87.7% correct diagnoses and 12.3% failures, with a sensitivity of 93% and a specificity of 80%. (author)
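
    For reference, the sensitivity and specificity figures quoted above follow the usual diagnostic-test definitions in terms of true/false positives (TP, FP) and true/false negatives (TN, FN); these are standard formulas, not specific to this work:

    ```latex
    \text{sensitivity} = \frac{TP}{TP + FN},
    \qquad
    \text{specificity} = \frac{TN}{TN + FP}
    ```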

  15. Condition Monitoring of a Process Filter Applying Wireless Vibration Analysis

    Directory of Open Access Journals (Sweden)

    Pekka KOSKELA

    2011-05-01

    This paper presents a novel wireless vibration-based method for monitoring the degree of feed filter clogging. In the process industry, these filters are applied to prevent impurities from entering the process. During operation, the filters gradually become clogged, decreasing the feed flow and, in the worst case, preventing it. The cleaning of the filter should therefore be carried out predictively in order to avoid equipment damage and unnecessary process downtime. The degree of clogging is estimated by first calculating time-domain indices from low-frequency accelerometer samples and then taking the median of the processed values. Nine different statistical quantities are compared based on estimation accuracy and criteria for operating in resource-constrained environments, with a particular focus on energy efficiency. The initial results show that the method is able to detect the degree of clogging, and the approach may be applicable to filter clogging monitoring.
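
    The abstract mentions time-domain indices and a median aggregation without naming the nine quantities compared. As a hedged illustration of that kind of processing, the Python sketch below computes a few commonly used indicators (RMS, crest factor, kurtosis) from an acceleration record and takes the median of one of them over consecutive windows:

    ```python
    import numpy as np
    from scipy.stats import kurtosis

    def time_domain_indices(window):
        """A few common time-domain condition indicators for one vibration window."""
        rms = np.sqrt(np.mean(window ** 2))
        crest = np.max(np.abs(window)) / rms
        return {"rms": rms, "crest": crest, "kurtosis": kurtosis(window)}

    def clogging_indicator(signal, window_size=256):
        """Median of a chosen indicator (here: RMS) over consecutive windows."""
        n = len(signal) // window_size
        windows = signal[: n * window_size].reshape(n, window_size)
        return float(np.median([time_domain_indices(w)["rms"] for w in windows]))

    # Toy usage on synthetic accelerometer samples
    rng = np.random.default_rng(1)
    accel = rng.normal(scale=0.1, size=4096)
    print(clogging_indicator(accel))
    ```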

  16. Statistical process control applied to the manufacturing of beryllia ceramics

    International Nuclear Information System (INIS)

    Ferguson, G.P.; Jech, D.E.; Sepulveda, J.L.

    1991-01-01

    To compete effectively in an international market, scrap and re-work costs must be minimized. Statistical Process Control (SPC) provides powerful tools to optimize production performance. These techniques are currently being applied to the forming, metallizing, and brazing of beryllia ceramic components. This paper describes specific examples of applications of SPC to dry-pressing of beryllium oxide 2x2 substrates, to Mo-Mn refractory metallization, and to metallization and brazing of plasma tubes used in lasers where adhesion strength is critical

  17. Applying Trusted Network Technology To Process Control Systems

    Science.gov (United States)

    Okhravi, Hamed; Nicol, David

    Interconnections between process control networks and enterprise networks expose instrumentation and control systems and the critical infrastructure components they operate to a variety of cyber attacks. Several architectural standards and security best practices have been proposed for industrial control systems. However, they are based on older architectures and do not leverage the latest hardware and software technologies. This paper describes new technologies that can be applied to the design of next generation security architectures for industrial control systems. The technologies are discussed along with their security benefits and design trade-offs.

  18. Applied Fourier analysis from signal processing to medical imaging

    CERN Document Server

    Olson, Tim

    2017-01-01

    The first of its kind, this focused textbook serves as a self-contained resource for teaching from scratch the fundamental mathematics of Fourier analysis and illustrating some of its most current, interesting applications, including medical imaging and radar processing. Developed by the author from extensive classroom teaching experience, it provides a breadth of theory that allows students to appreciate the utility of the subject, but at as accessible a depth as possible. With myriad applications included, this book can be adapted to a one or two semester course in Fourier Analysis or serve as the basis for independent study. Applied Fourier Analysis assumes no prior knowledge of analysis from its readers, and begins by making the transition from linear algebra to functional analysis. It goes on to cover basic Fourier series and Fourier transforms before delving into applications in sampling and interpolation theory, digital communications, radar processing, medical imaging, and heat and wave equations. Fo...

  19. Applying advanced digital signal processing techniques in industrial radioisotopes applications

    International Nuclear Information System (INIS)

    Mahmoud, H.K.A.E.

    2012-01-01

    Radioisotopes can be used to obtain signals or images in order to recognize the information inside industrial systems. The main problems of using these techniques are the difficulty of identifying the obtained signals or images and the requirement of skilled experts for the interpretation of the output data of these applications. At present, the interpretation of the output data from these applications is performed mainly manually, depending heavily on the skills and the experience of trained operators. This process is time consuming and the results typically suffer from inconsistency and errors. The objective of the thesis is to apply advanced digital signal processing techniques to improve the treatment and the interpretation of the output data from different Industrial Radioisotopes Applications (IRA). This thesis focuses on two IRAs: the Residence Time Distribution (RTD) measurement and the defect inspection of welded pipes using a gamma source (gamma radiography). In the RTD measurement application, this thesis presents methods for signal pre-processing and modeling of the RTD signals. Simulation results are presented for two case studies. The first case study is a laboratory experiment for measuring the RTD in a water flow rig. The second case study is an experiment for measuring the RTD in a phosphate production unit. The thesis proposes an approach for RTD signal identification in the presence of noise. In this approach, after signal processing, the Mel Frequency Cepstral Coefficients (MFCCs) and polynomial coefficients are extracted from the processed signal or from one of its transforms. The Discrete Wavelet Transform (DWT), Discrete Cosine Transform (DCT), and Discrete Sine Transform (DST) have been tested and compared for efficient feature extraction. Neural networks have been used for matching of the extracted features. Furthermore, the Power Density Spectrum (PDS) of the RTD signal has been also used instead of the discrete
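
    As a simplified, hypothetical sketch of the feature-extraction-and-matching pipeline described above (not the thesis' actual MFCC and neural-network implementation), the code below derives cepstral-like coefficients from an RTD signal via a log power spectrum followed by a discrete cosine transform, and matches them against a stored template by normalized correlation:

    ```python
    import numpy as np
    from scipy.fft import rfft, dct

    def cepstral_features(signal, n_coeffs=12):
        """Cepstral-like features: log power spectrum followed by a DCT,
        keeping the first few coefficients (a crude stand-in for MFCCs)."""
        power = np.abs(rfft(signal)) ** 2
        log_power = np.log(power + 1e-12)  # avoid log(0)
        return dct(log_power, type=2, norm="ortho")[:n_coeffs]

    def match_score(signal, template_features):
        """Normalized correlation between the signal's features and a template."""
        f = cepstral_features(signal, len(template_features))
        f = (f - f.mean()) / (f.std() + 1e-12)
        t = (template_features - template_features.mean()) / (template_features.std() + 1e-12)
        return float(np.dot(f, t) / len(f))

    # Toy usage: compare a noisy RTD-like curve against a clean reference
    time = np.linspace(0.0, 10.0, 1024)
    reference = time * np.exp(-time)  # idealized tank-in-series RTD shape
    template = cepstral_features(reference)
    noisy = reference + np.random.default_rng(2).normal(scale=0.02, size=time.size)
    print(match_score(noisy, template))
    ```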

  20. Anomalous diffusion process applied to magnetic resonance image enhancement

    Science.gov (United States)

    Senra Filho, A. C. da S.; Garrido Salmon, C. E.; Murta Junior, L. O.

    2015-03-01

    The diffusion process is widely applied to digital image enhancement, both directly, by introducing a diffusion equation as in the anisotropic diffusion (AD) filter, and indirectly, by convolution as in the Gaussian filter. The anomalous diffusion process (ADP), given by a nonlinear relationship in the diffusion equation and characterized by an anomalous parameter q, is supposed to be consistent with inhomogeneous media. Although the classic diffusion process is widely studied and effective in various image settings, the effectiveness of ADP for image enhancement is still unknown. In this paper we propose anomalous diffusion filters in both isotropic (IAD) and anisotropic (AAD) forms for magnetic resonance imaging (MRI) enhancement. Filters based on a discrete implementation of anomalous diffusion were applied to noisy MRI T2w images (brain, chest and abdominal) in order to quantify SNR gains and estimate the performance of the proposed anomalous filters when realistic noise is added to those images. Results show that for images containing complex structures, e.g. brain structures, anomalous diffusion presents the highest enhancements when compared to the classical diffusion approach. Furthermore, ADP presented a more effective enhancement for images containing Rayleigh and Gaussian noise. Anomalous filters showed an ability to preserve anatomic edges and an SNR improvement of 26% for brain images, compared to the classical filter. In addition, AAD and IAD filters showed optimum results for noise distributions that appear in extreme situations in MRI, i.e. in low SNR images with approximately Rayleigh noise distribution, and for high SNR images with Gaussian or non-central χ noise distributions. AAD and IAD filters showed the best results for the parametric range 1.2 MRI. This study indicates the proposed anomalous filters as promising approaches in qualitative and quantitative MRI enhancement.
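
    For context, the classical anisotropic diffusion (Perona-Malik) filter that these anomalous variants generalize evolves the image I according to the equation below; the anomalous filters modify the nonlinearity through the parameter q (the exact q-dependent form used by the authors is not reproduced here):

    ```latex
    \frac{\partial I}{\partial t}
      = \nabla \cdot \big( c(\lVert \nabla I \rVert)\, \nabla I \big),
    \qquad
    c(s) = e^{-(s/K)^{2}}
    ```

    Here K is an edge-threshold parameter; setting c to a constant recovers ordinary isotropic (Gaussian-like) smoothing.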

  1. Applying Standard Interfaces to a Process-Control Language

    Science.gov (United States)

    Berthold, Richard T.

    2005-01-01

    A method of applying open-operating-system standard interfaces to the NASA User Interface Language (UIL) has been devised. UIL is a computing language that can be used in monitoring and controlling automated processes: for example, the Timeliner computer program, written in UIL, is a general-purpose software system for monitoring and controlling sequences of automated tasks in a target system. In providing the major elements of connectivity between UIL and the target system, the present method offers advantages over the prior method. Most notably, unlike in the prior method, the software description of the target system can be made independent of the applicable compiler software and need not be linked to the applicable executable compiler image. Also unlike in the prior method, it is not necessary to recompile the source code and relink the source code to a new executable compiler image. Abstraction of the description of the target system to a data file can be defined easily, with intuitive syntax, and knowledge of the source-code language is not needed for the definition.

  2. Discretized screening to apply EOR process in Western field, Venezuela

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez, E.; Rodriguez, T.; Gonzalez, O. [PDVSA Petroleos de Venezuela SA, Caracas (Venezuela, Bolivarian Republic of). INTEVEP; Lara, V. [PDVSA Petroleos de Venezuela SA, Caracas (Venezuela, Bolivarian Republic of). CVP

    2009-07-01

    Increases in oil recovery factors through enhanced oil recovery (EOR) technologies have become an important issue in the petroleum industry because of depleting reserves of conventional fossil fuels and the low mobility of extra-heavy oils. Methodologies with different approaches have been developed to define the most suitable technology for specific reservoirs. The purpose of this paper was to determine which EOR technologies were the most appropriate for the entire Urdaneta reservoir in Venezuela, and to determine where the technologies could be applied in terms of reservoir volume. Specifically, the paper described the discretized screening methodology and showed an example of its application in the Urdaneta field. The processing of the static model of this field was described, since this is an input requirement for the EOR screening methodology. Screening results were also analysed and shown as color-coded maps. The EOR screening methodology demonstrates that it is possible to evaluate the reservoir using very detailed input information. 4 refs., 3 tabs., 10 figs.

  3. Decision Framework for Applying Attenuation Processes to Metals and Radionuclides

    Science.gov (United States)

    Nyman, J.; Goswami, D.; Spreng, C.

    2010-12-01

    Until recently, there has been little regulatory guidance to support attenuation-based remedies for groundwater contaminated with metals and radionuclides. This has contributed to inconsistent application of those remedies and generally discouraged their consideration. The net result is that many sites face intractable closure problems. The U.S. Environmental Protection Agency (EPA) recently issued a three-volume technical guidance set that specifically addresses monitored natural attenuation (MNA) of inorganic contaminants. These new documents provide technical information related to the dominant attenuation mechanisms, as well as methods for characterization and evaluation of specific inorganic contaminants and radionuclides. Attenuation-based remedies for metals and long-lived radionuclides rely primarily on immobilization of contaminants as stable and/or nontoxic species. This stabilization and toxicity reduction can result from natural processes, geochemical gradients, or biogeochemical manipulation. Except for a few radionuclides, the original contaminant remains in the subsurface so that documentation of the sustainability, or permanence, of stabilization and detoxification is crucial to assessing performance. Another challenge in applying the existing and emerging guidance is the need to simultaneously address multiple contaminants at a target site, as is often the case in actual practice. The Interstate Technology & Regulatory Council (ITRC) has developed a technical and regulatory guidance to facilitate implementation of the new EPA guidance for MNA of metals and radionuclides. To determine the specific approach of this document, ITRC conducted a web-based survey of state regulators and stakeholders to determine the existing state of knowledge and acceptance regarding the application of attenuation processes as a remedy. The document addresses issues identified in the survey and provides examples of state protocols and stakeholder issues related to the

  4. Possibility of Applying Business Process Management Methodology in Logistic Processes Optimization

    Directory of Open Access Journals (Sweden)

    Diana Božić

    2014-12-01

    Raising the service level and developing new logistic services require a better understanding of logistic processes and of the possibilities for optimization. Different methodologies have been used for that purpose, and the application of Business Process Management (BPM) methodology is outlined in this paper. Identifying parts of logistic processes that could be optimized is facilitated by applying BPM methodology. It also enables more accurate quantification of the impacts of changes introduced in a particular process or activity on the process as a whole and on other interacting processes. The application of BPM methodology is demonstrated in a case study, where a solution for logistic process optimization is suggested and the prospective outcomes are simulated. The results of the comparative analysis of the logistic process indicate a synergic effect of different improvements in sub-processes on the effectiveness of the process as a whole, both at the operative and the managerial level. The respective changes in workload distribution among interacting logistic processes have been quantified according to the same methodology.

  5. Fundamentals of Alloy Solidification Applied to Industrial Processes

    Science.gov (United States)

    1984-01-01

    Solidification processes and phenomena, segregation, porosity, gravity effects, fluid flow, undercooling, as well as the processing of materials in the microgravity environment of space, now available on space shuttle flights, were discussed.

  6. Science Process Skills in Science Curricula Applied in Turkey

    Science.gov (United States)

    Yumusak, Güngör Keskinkiliç

    2016-01-01

    One of the most important objectives of science curricula is to develop science process skills. Science process skills are the skills that underlie scientific thinking and decision-making. Thus it is important for a science curriculum to be designed in such a way that it develops science process skills. New science curricula were…

  7. Applying the design process to apparel prototype development ...

    African Journals Online (AJOL)

    The partnership between a social entrepreneur and an academic department provided the opportunity to expose students to a real-life learning experience. The project enabled students to apply knowledge, gained in the Consumer Science Clothing Management programme, to the product development subject. The aims of ...

  8. DESIGNS FOR MIXTURE AND PROCESS VARIABLES APPLIED IN TABLET FORMULATIONS

    NARCIS (Netherlands)

    DUINEVELD, C. A. A.; Smilde, A. K.; Doornbos, D. A.

    1993-01-01

    Although there are several methods for the construction of a design for process variables and mixture variables, there are not very many methods which are suitable for combining mixture and process variables in one design. Some of the methods which are feasible will be shown. These methods will be

  9. Applying a punch with microridges in multistage deep drawing processes.

    Science.gov (United States)

    Lin, Bor-Tsuen; Yang, Cheng-Yu

    2016-01-01

    The developers of high aspect ratio components aim to minimize the processing stages in deep drawing processes. This study elucidates the application of microridge punches in multistage deep drawing processes. A microridge punch improves drawing performance, thereby reducing the number of stages required in deep forming processes. As an example, the original eight-stage deep forming process for a copper cylindrical cup with a high aspect ratio was analyzed by finite element simulation. Microridge punch designs were introduced in Stages 4 and 7 to replace the original punches. In addition, Stages 3 and 6 were eliminated. Finally, these changes were verified through experiments. The results showed that the microridge punches reduced the number of deep drawing stages while yielding similar thickness difference percentages. Further, the numerical and experimental results demonstrated good consistency in the thickness distribution.

  10. Bayesian networks applied to process diagnostics. Applications in energy industry

    Energy Technology Data Exchange (ETDEWEB)

    Widarsson, Bjoern (ed.); Karlsson, Christer; Dahlquist, Erik [Maelardalen Univ., Vaesteraas (Sweden); Nielsen, Thomas D.; Jensen, Finn V. [Aalborg Univ. (Denmark)

    2004-10-01

    Uncertainty in process operation occurs frequently in the heat and power industry. This makes it hard to detect the occurrence of an abnormal process state from a number of process signals (measurements) or to find the correct cause of an abnormality. Among several other methods, Bayesian Networks (BN) can be used to build a model which handles uncertainty in both the process signals and the process itself. The purpose of this project is to investigate the possibilities of using BN for fault detection and diagnostics in combined heat and power plants through the execution of two different applications. Participants from Aalborg University contribute the knowledge of BN, and participants from Maelardalen University have experience in modelling heat and power applications. The co-operation also includes two energy companies, Elsam A/S (Nordjyllandsverket) and Maelarenergi AB (Vaesteraas CHP-plant), where the two applications are made with support from the plant personnel. The project resulted in two quite different applications. At Nordjyllandsverket, an application based (due to the lack of process knowledge) on pure operation data was built with the capability to detect an abnormal process state in a coal mill. Detection is made through a conflict analysis when process signals are entered into a model built by analysing the operation database. The application at Maelarenergi is built with a combination of process knowledge and operation data and can detect various faults caused by the fuel. The process knowledge is used to build a causal network structure, and the structure is then trained with data from the operation database. Both applications were made as off-line applications, but they are ready to be run on-line. The performance of fault detection and diagnostics is good, but a lack of abnormal process states with known causes reduces the evaluation possibilities. Advantages of combining expert knowledge of the process with operation data include the possibility to represent
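
    The kind of diagnostic reasoning described above can be illustrated with a toy example. The Python sketch below computes the posterior probability of an abnormal coal-mill state from two symptoms by direct enumeration; the network structure, symptom names and probabilities are hypothetical illustrations, not the models actually built for Nordjyllandsverket or Maelarenergi.

```python
# Minimal sketch of diagnostic inference in a two-symptom Bayesian network.
# Structure and probabilities are hypothetical, for illustration only.

P_FAULT = 0.02  # prior probability of an abnormal mill state

# P(symptom observed | fault), P(symptom observed | normal operation)
CPTS = {
    "high_diff_pressure": (0.85, 0.10),
    "low_outlet_temp":    (0.70, 0.05),
}


def posterior_fault(evidence):
    """Return P(fault | evidence) by direct enumeration (Bayes' rule).

    `evidence` maps symptom name -> observed boolean. Symptoms are assumed
    conditionally independent given the mill state.
    """
    like_fault, like_ok = P_FAULT, 1.0 - P_FAULT
    for name, observed in evidence.items():
        p_f, p_ok = CPTS[name]
        like_fault *= p_f if observed else 1.0 - p_f
        like_ok *= p_ok if observed else 1.0 - p_ok
    return like_fault / (like_fault + like_ok)


if __name__ == "__main__":
    obs = {"high_diff_pressure": True, "low_outlet_temp": True}
    print(f"P(abnormal mill state | evidence) = {posterior_fault(obs):.3f}")
```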

  11. An Incentivized Capstone Design Team Applying the System Engineering Process

    Science.gov (United States)

    2015-01-02

    gaps. Some forms of these gaps include irrigation canals, moving from one rooftop to another, crossing minefields, fast flowing mountain streams... sprayed with Rust-Oleum Leakseal flexible rubber spray , giving its black color. The spray was applied in an attempt to protect the vectran against...it had the adverse effect of causing debris to adhere to the bag in an unclean environment. Therefore, the spray was abandoned in the production of

  12. Spectrophotometry with optical fibers applied to nuclear product processing

    International Nuclear Information System (INIS)

    Boisde, G.; Perez, J.J.; Velluet, M.T.; Jeunhomme, L.B.

    1988-01-01

    Absorption spectrophotometry is widely used in laboratories for composition analysis and quality control of chemical processes. Using optical fibers to transmit the light between the instrument and the process line makes it possible to improve the safety and productivity of chemical processes thanks to real-time measurements. Such applications have been developed since 1975 at the CEA for the monitoring of nuclear products. This has led to the development of fibers, measurement cells and optical feedthroughs that withstand high radiation doses, of fiber/spectrophotometer couplers, and finally of a photodiode array spectrophotometer optimized for use with optical fibers [fr]

  13. Business Intelligence Applied to the ALMA Software Integration Process

    Science.gov (United States)

    Zambrano, M.; Recabarren, C.; González, V.; Hoffstadt, A.; Soto, R.; Shen, T.-C.

    2012-09-01

    Software quality assurance and planning of an astronomy project is a complex task, especially if it is a distributed collaborative project such as ALMA, where the development centers are spread across the globe. When you execute a software project there is much valuable information about the process itself that you might be able to collect. One of the ways you can receive this input is via an issue tracking system that gathers the problem reports relating to software bugs captured during the testing of the software, during the integration of the different components or, even worse, problems that occur during production. Usually, little time is spent on analyzing them, but with some multidimensional processing you can extract valuable information from them, and it might help with long-term planning and resource allocation. We present an analysis of the information collected at ALMA from a collection of key unbiased indicators. We describe here the extraction, transformation and load process and how the data were processed. The main goal is to assess a software process and get insights from this information.
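
    As a rough illustration of the multidimensional processing of problem reports mentioned above, the following Python sketch aggregates a handful of invented issue-tracker entries into two simple indicators; the field names and values are hypothetical and do not reflect ALMA's actual issue-tracking schema.

```python
# Sketch of rolling issue-tracker records up into simple indicators.
# Field names and values are hypothetical placeholders.
import pandas as pd

issues = pd.DataFrame({
    "component": ["correlator", "archive", "archive", "control", "correlator"],
    "phase":     ["integration", "testing", "production", "integration", "testing"],
    "severity":  ["major", "minor", "critical", "minor", "major"],
    "opened":    pd.to_datetime(["2011-01-10", "2011-01-22", "2011-02-03",
                                 "2011-02-15", "2011-03-01"]),
})

# Indicator 1: problem-report counts by component and project phase.
counts = issues.pivot_table(index="component", columns="phase",
                            values="severity", aggfunc="count", fill_value=0)

# Indicator 2: monthly intake of new problem reports.
monthly = issues.groupby(issues["opened"].dt.to_period("M")).size()

print(counts, "\n")
print(monthly)
```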

  14. Applying commercial robotic technology to radioactive material processing

    International Nuclear Information System (INIS)

    Grasz, E.L.; Sievers, R.H. Jr.

    1990-11-01

    The development of robotic systems for glove box process automation is motivated by the need to reduce operator radiation dosage, minimize the generation of process waste, and improve the security of nuclear materials. Commercial robotic systems are available with the required capabilities but are not compatible with a glove box environment. Alpha radiation, concentrated dust, a dry atmosphere and restricted work space result in the need for unique adaptations to commercial robotics. Implementation of these adaptations requires performance trade-offs. A design and development effort has been initiated to evaluate the feasibility of using a commercial overhead gantry robot for glove box processing. This paper will present the initial results and observations for this development effort. 1 ref

  15. 25 CFR 224.184 - How do other administrative appeals processes apply?

    Science.gov (United States)

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false How do other administrative appeals processes apply? 224... General Appeal Procedures § 224.184 How do other administrative appeals processes apply? The administrative appeals process in 25 CFR part 2 and 43 CFR part 4 are modified, only as they apply to appeals...

  16. Applying Information Processing Theory to Supervision: An Initial Exploration

    Science.gov (United States)

    Tangen, Jodi L.; Borders, L. DiAnne

    2017-01-01

    Although clinical supervision is an educational endeavor (Borders & Brown, [Borders, L. D., 2005]), many scholars neglect theories of learning in working with supervisees. The authors describe 1 learning theory--information processing theory (Atkinson & Shiffrin, 1968, 1971; Schunk, 2016)--and the ways its associated interventions may…

  17. Cavity prediction in sand mould production applying the DISAMATIC process

    DEFF Research Database (Denmark)

    Hovad, Emil; Larsen, Per; Spangenberg, Jon

    2017-01-01

    The sand shot in the DISAMATIC process is simulated by the discrete element method (DEM), taking into account the influence of the airflow through coupling with computational fluid dynamics (CFD). The DEM model is calibrated by a ring shear test, a sand pile experiment and a slump test. Subsequently...

  18. Formalizing and applying compliance patterns for business process compliance

    NARCIS (Netherlands)

    Elgammal, A.F.S.A.; Türetken, O.; van den Heuvel, W.J.A.M.; Papazoglou, M.

    Today’s enterprises demand a high degree of compliance of business processes to meet diverse regulations and legislation. Several industrial studies have shown that compliance management is a daunting task, and organizations are still struggling and spending billions of dollars annually to ensure

  19. Hoshin Planning Applies Total Quality Management to the Planning Process.

    Science.gov (United States)

    Heverly, Mary Ann; Parker, Jerome S.

    1993-01-01

    Hoshin, or breakthrough, planning is the Total Quality Management method of integrating strategic planning into the daily work of organizational units. Delaware County (Pennsylvania) Community College's adaptation of the process and its use in daily management are described and presented in the form of a flow chart. (MSE)

  20. Algorithm for applying interpolation in digital signal processing ...

    African Journals Online (AJOL)

    Software-defined radios and test equipment use a variety of digital signal processing techniques to improve system performance. Interpolation is one technique that can be used to increase the sample rate of digital signals. In this work, we illustrated interpolation in the time domain by writing appropriate codes using ...
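
    The record is truncated, so the sketch below is not the code it refers to; it is a minimal Python illustration of integer-factor interpolation in the time domain (zero-stuffing followed by low-pass filtering), assuming NumPy and SciPy are available.

```python
# Integer-factor interpolation: insert L-1 zeros between samples, then
# low-pass filter to suppress the spectral images.
import numpy as np
from scipy.signal import firwin, lfilter


def interpolate(x, L, numtaps=63):
    """Increase the sample rate of x by an integer factor L."""
    upsampled = np.zeros(len(x) * L)
    upsampled[::L] = x                       # zero-stuffing
    # FIR low-pass with cutoff at the original Nyquist frequency (1/L of the
    # new Nyquist); gain L compensates for the inserted zeros.
    h = firwin(numtaps, 1.0 / L) * L
    return lfilter(h, [1.0], upsampled)


if __name__ == "__main__":
    fs = 1000.0                               # original sample rate, Hz
    t = np.arange(0, 0.05, 1 / fs)
    x = np.sin(2 * np.pi * 50 * t)            # 50 Hz test tone
    y = interpolate(x, L=4)                   # 4x the original sample rate
    print(len(x), "->", len(y), "samples")
```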

  1. Integrated Gis-remote sensing processing applied to vegetation ...

    African Journals Online (AJOL)

    This study examines the special advantage offered by GIS-Remote Sensing processing to survey of vegetation, a renewable natural resource in Ibadan, South-Western, Nigeria with a view to eliciting support for sound environmental policy in the future. A remotely sensed digital image of SPOT by its linear enhancement on ...

  2. Applying MICP by denitrification in soils : A process analysis

    NARCIS (Netherlands)

    Pham, P.V.; Nakano, A.; van der Star, WRL; Heimovaara, T.J.; van Paassen, L.A.

    2016-01-01

    The process of microbially induced carbonate precipitation (MICP) by denitrification was investigated in relation to its potential use as a ground improvement method. Liquid batch experiments indicated that the substrate solution had an optimum carbon–nitrogen ratio of 1·6 and confirmed that

  3. Personal Computer (PC) based image processing applied to fluid mechanics

    Science.gov (United States)

    Cho, Y.-C.; Mclachlan, B. G.

    1987-01-01

    A PC based image processing system was employed to determine the instantaneous velocity field of a two-dimensional unsteady flow. The flow was visualized using a suspension of seeding particles in water, and a laser sheet for illumination. With a finite time exposure, the particle motion was captured on a photograph as a pattern of streaks. The streak pattern was digitized and processed using various imaging operations, including contrast manipulation, noise cleaning, filtering, statistical differencing, and thresholding. Information concerning the velocity was extracted from the enhanced image by measuring the length and orientation of the individual streaks. The fluid velocities deduced from the randomly distributed particle streaks were interpolated to obtain velocities at uniform grid points. For the interpolation a simple convolution technique with an adaptive Gaussian window was used. The results are compared with a numerical prediction by a Navier-Stokes computation.
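
    The following Python sketch illustrates the final interpolation step described above, using a fixed-width Gaussian weighting window rather than the adaptive window of the original work; the scattered velocity samples are synthetic.

```python
# Interpolate scattered velocity samples onto a uniform grid with
# Gaussian-weighted averaging (fixed window width for simplicity).
import numpy as np


def gaussian_grid_interp(xp, yp, up, grid_x, grid_y, sigma):
    """Weighted average of scattered samples (xp, yp, up) at each grid node."""
    U = np.zeros((len(grid_y), len(grid_x)))
    for j, gy in enumerate(grid_y):
        for i, gx in enumerate(grid_x):
            r2 = (xp - gx) ** 2 + (yp - gy) ** 2
            w = np.exp(-r2 / (2.0 * sigma ** 2))
            U[j, i] = np.sum(w * up) / np.sum(w)
    return U


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    xp, yp = rng.uniform(0, 1, 200), rng.uniform(0, 1, 200)
    up = np.sin(2 * np.pi * xp)               # synthetic velocity component
    gx = gy = np.linspace(0, 1, 21)
    U = gaussian_grid_interp(xp, yp, up, gx, gy, sigma=0.05)
    print("gridded velocity field shape:", U.shape)
```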

  4. A Survey of Commonly Applied Methods for Software Process Improvement

    Science.gov (United States)

    1994-02-01

    modeling in the course of systems development and virtually every organization has a systems life-cycle model of some sort. But process definition as the... Taken to its logical conclusion, CASE could make coding as we now know it today virtually obsolete, just as third generation languages dramatically... promotion staff, and research and development (R&D) people in a team that worked on a design together from drawing board to dealer showroom. Many ac

  5. Conditional Stochastic Processes Applied to Wave Load Predictions

    DEFF Research Database (Denmark)

    Jensen, Jørgen Juncher

    2015-01-01

    The concept of conditional stochastic processes provides a powerful tool for evaluation and estimation of wave loads on ships and offshore structures. This article first considers conditional waves with a focus on critical wave episodes. Then the inherent uncertainty in the results is illustrated with an application where measured wave responses are used to predict the future variation in the responses within the next 5-30 seconds. The main part of the article is devoted to the application of the First Order Reliability Method for derivation of critical wave episodes for different nonlinear wave...

  6. Touch-sensitive graphics terminal applied to process control

    International Nuclear Information System (INIS)

    Bennion, S.I.; Creager, J.D.; VanHouten, R.D.

    1981-01-01

    Limited initial demonstrations of the system described took place during September 1980. A single CRT was used as an input device in the control center while operating a furnace and a pellet inspection gage. These two process line devices were completely controlled, despite the longer than desired response times noted, using a single control station located in the control center. The operator could conveniently execute from this remote location any function which could be performed locally at the hard-wired control panels. With the installation of the enhancements, the integrated touchscreen/graphics terminal will provide a preferable alternative to normal keyboard command input devices

  7. The process of outsourcing applied to public administration - legal approach

    Directory of Open Access Journals (Sweden)

    Bruno Magera Conceição

    2018-01-01

    Full Text Available The article presents an analysis of outsourcing in the Brazilian Public Administration and the way it currently operates, initiating a process of reform aimed at reducing the size of the administrative apparatus. Several doctrinal and jurisprudential concepts are presented in order to demonstrate concisely the applicability of outsourcing as a mechanism to reduce the costs of the public machine, increase its efficiency and decrease its size. Some advantages of the use of outsourcing by the public administration, identified by several jurists, are presented, including gains in competitiveness and ease of supervision by citizens. The extension of the responsibility of the public administration to the outsourced company and the way in which it operates, in the light of legislation, jurisprudence and the best doctrine, suggest efficient alternatives in the exercise of administration, increasing its credibility and efficiency in the attainment of governmental objectives. The methodology used was deductive. The results achieved demonstrate that the outsourcing process results in quality and a new dynamic in the public service.

  8. Applying Softcomputing for Copper Recovery in Leaching Process

    Directory of Open Access Journals (Sweden)

    Claudio Leiva

    2017-01-01

    Full Text Available The mining industry of the last few decades has recognized that it is more profitable to simulate models using historical data and available mining process knowledge rather than draw conclusions about future mine exploitation from assumed conditions. The variability of the composition of copper leach piles makes it unlikely that high-precision simulations can be obtained using traditional statistical methods; however, the same data collection favors the use of softcomputing techniques to enhance the accuracy of copper recovery via leaching by way of prediction models. In this paper, a comparison of predictive models is made: a linear model, a quadratic model, a cubic model, and a model based on an artificial neural network (ANN) are presented. The model inputs were obtained from operational data and column pilot-test data. The ANN was constructed with 9 input variables, 6 hidden layers, and a neuron in the output layer corresponding to the copper leaching prediction. The validation of the models was performed with real information, and these results were used by a mining company in northern Chile to improve its copper mining processes.
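
    A minimal sketch of the ANN approach described above is given below using scikit-learn; the synthetic data, preprocessing and hidden-layer widths are assumptions, since the record only specifies 9 input variables, 6 hidden layers and a single output neuron.

```python
# ANN sketch: 9 inputs, six hidden layers, one output (predicted Cu recovery).
# Data, scaling and layer widths are illustrative assumptions.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
X = rng.normal(size=(500, 9))                       # 9 operational/pilot variables
y = 60 + 5 * X[:, 0] - 3 * X[:, 1] + rng.normal(scale=2, size=500)  # % recovery

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 32, 16, 16, 8, 8),  # six hidden layers
                 max_iter=2000, random_state=0),
)
model.fit(X_train, y_train)
print(f"R^2 on held-out data: {model.score(X_test, y_test):.2f}")
```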

  9. Software factory techniques applied to process control at CERN

    CERN Document Server

    Dutour, Mathias D

    2008-01-01

    The CERN Large Hadron Collider (LHC) requires constant monitoring and control of quantities of parameters to guarantee operational conditions. For this purpose, a methodology called UNICOS (UNIfied Industrial COntrols Systems) has been implemented to standardize the design of process control applications. To further accelerate the development of these applications, we migrated our existing UNICOS tooling suite toward a software factory in charge of assembling project, domain and technical information seamlessly into deployable PLC (Programmable logic Controller) - SCADA (Supervisory Control And Data Acquisition) systems. This software factory delivers consistently high quality by reducing human error and repetitive tasks, and adapts to user specifications in a cost-efficient way. Hence, this production tool is designed to encapsulate and hide the PLC and SCADA target platforms, enabling the experts to focus on the business model rather than specific syntaxes and grammars. Based on industry standard software, ...

  10. Software factory techniques applied to Process Control at CERN

    CERN Multimedia

    Dutour, MD

    2007-01-01

    The CERN Large Hadron Collider (LHC) requires constant monitoring and control of quantities of parameters to guarantee operational conditions. For this purpose, a methodology called UNICOS (UNIfied Industrial COntrols Systems) has been implemented to standardize the design of process control applications. To further accelerate the development of these applications, we migrated our existing UNICOS tooling suite toward a software factory in charge of assembling project, domain and technical information seamlessly into deployable PLC (Programmable logic Controller) – SCADA (Supervisory Control And Data Acquisition) systems. This software factory delivers consistently high quality by reducing human error and repetitive tasks, and adapts to user specifications in a cost-efficient way. Hence, this production tool is designed to encapsulate and hide the PLC and SCADA target platforms, enabling the experts to focus on the business model rather than specific syntaxes and grammars. Based on industry standard software...

  11. Applying operating experience to design the CANDU 3 process

    International Nuclear Information System (INIS)

    Harris, D.S.; Hinchley, E.M.; Pauksens, J.; Snell, V.; Yu, S.K.W.

    1991-01-01

    The CANDU 3 is an advanced, smaller (450 MWe), standardized version of the CANDU now being designed for service later in the decade and beyond. The design of this evolutionary nuclear power plant has been carefully planned and organized to gain maximum benefits from new technologies and from world experience to date in designing, building, commissioning and operating nuclear power stations. The good performance record of existing CANDU reactors makes consideration of operating experience from these plants a particularly vital component of the design process. Since the completion of the first four CANDU 6 stations in the early 1980s, and with the continuing evolution of the multi-unit CANDU station designs since then, AECL CANDU has devised several processes to ensure that such feedback is made available to designers. An important step was made in 1986 when a task force was set up to review and process ideas arising from the commissioning and early operation of the CANDU 6 reactors which were, by that time, operating successfully in Argentina and Korea, as well as the Canadian provinces of Quebec and New Brunswick. The task force issued a comprehensive report which, although aimed at the design of an improved CANDU 6 station, was made available to the CANDU 3 team. By that time also, the Institute of Nuclear Power Operations (INPO) in the U.S., of which AECL is a Supplier Participant member, was starting to publish Good Practices and Guidelines related to the review and the use of operating experiences. In addition, details of significant events were being made available via the INPO SEE-IN (Significant Event Evaluation and Information Network) Program, and subsequently the CANNET network of the CANDU Owners' Group (COG). Designers were thus able to systematically review operations reports, significant event reports, and related documents in a continuing program of design improvement. Another method of incorporating operations feedback is to involve experienced utility

  12. Applying lean principles to continuous renal replacement therapy processes.

    Science.gov (United States)

    Benfield, C Brett; Brummond, Philip; Lucarotti, Andrew; Villarreal, Maria; Goodwin, Adam; Wonnacott, Rob; Talley, Cheryl; Heung, Michael

    2015-02-01

    The application of lean principles to continuous renal replacement therapy (CRRT) processes in an academic medical center is described. A manual audit over six consecutive weeks revealed that 133 5-L bags of CRRT solution were discarded after being dispensed from pharmacy but before clinical use. Lean principles were used to examine the workflow for CRRT preparation and develop and implement an intervention. An educational program was developed to encourage and enhance direct communication between nursing and pharmacy about changes in a patient's condition or CRRT order. It was through this education program that the reordering workflow shifted from nurses to pharmacy technicians. The primary outcome was the number of CRRT solution bags delivered in the preintervention and postintervention periods. Nurses and pharmacy technicians were surveyed to determine their satisfaction with the workflow change. After implementation of lean principles, the mean number of CRRT solution bags dispensed per day of CRRT decreased substantially. Respondents' overall satisfaction with the CRRT solution preparation process increased during the postintervention period, and the satisfaction scores for each individual component of the workflow also increased after implementation of lean principles. The decreased solution waste resulted in projected annual cost savings exceeding $70,000 in product alone. The use of lean principles to identify medication waste in the CRRT workflow and implementation of an intervention to shift the workload from intensive care unit nurses to pharmacy technicians led to reduced CRRT solution waste, improved efficiency of CRRT workflow, and increased satisfaction among staff. Copyright © 2015 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  13. Diffusing wave spectroscopy applied to material analysis and process control

    International Nuclear Information System (INIS)

    Lloyd, Christopher James

    1997-01-01

    Diffusing Wave Spectroscopy (DWS) was studied as a method of laboratory analysis of sub-micron particles, and developed as a prospective in-line, industrial, process control sensor, capable of near real-time feedback. No sample pre-treatment was required and measurement was via a non-invasive, flexible, dip-in probe. DWS relies on the concept of the diffusive migration of light, as opposed to the ballistic scatter model used in conventional dynamic light scattering. The specific requirements of the optoelectronic hardware, data analysis methods and light scattering model were studied experimentally and, where practical, theoretically, resulting in a novel technique of analysis of particle suspensions and emulsions of volume fractions between 0.01 and 0.4. Operation at high concentrations made the technique oblivious to dust and contamination. The pure homodyne (autodyne) experimental arrangement described was resilient to environmental disturbances, unlike many other systems which utilise optical fibres or heterodyne operation. Pilot and subsequent prototype development led to a highly accurate method of size ranking, suitable for analysis of a wide range of suspensions and emulsions. The technique was shown to operate on real industrial samples with statistical variance as low as 0.3% with minimal software processing. Whilst the application studied was the analysis of TiO2 suspensions, a diverse range of materials including polystyrene beads, cell pastes and industrial cutting fluid emulsions were tested. Results suggest that, whilst all sizing should be comparative to suitable standards, concentration effects may be minimised and even completely modelled-out in many applications. Adhesion to the optical probe was initially a significant problem but was minimised after the evaluation and use of suitable non-stick coating materials. Unexpected behaviour in the correlation in the region of short decay times led to consideration of the effects of rotational diffusion

  14. Data Processing and Programming Applied to an Environmental Radioactivity Laboratory

    International Nuclear Information System (INIS)

    Trinidad, J. A.; Gasco, C.; Palacios, M. A.

    2009-01-01

    This report is the original research work presented for the attainment of the author's master's degree. Its main objective has been the resolution, by means of user-friendly programming, of some of the problems observed in the environmental radioactivity laboratory belonging to the Department of Radiological Surveillance and Environmental Radioactivity of CIEMAT. The software has been developed in Visual Basic for Applications embedded in Excel files, and it solves three of the detected problems by means of macros: a) calculation of the characteristic limits for measurements of gross beta and residual beta activity concentrations according to the MARLAP, ISO and UNE standards, and comparison of the three results; b) determination of the Pb-210 and Po-210 decontamination factors in the ultra-low-level Am-241 analysis of air samples by alpha spectrometry; and c) comparison of two analytical techniques for measuring Pb-210 in air (direct, by gamma spectrometry, and indirect, by radiochemical separation and alpha spectrometry). The organization of the different Excel files involved in the subroutines, together with the calculations and required formulae, is explained graphically to aid comprehension. The advantage of using this kind of programme lies in its versatility and the ease of obtaining data that are later required in tables, which can be modified over time as the laboratory collects more data, with special applications for describing a method (Pb-210 decontamination factors for americium analysis in air) or comparing temporal series of Pb-210 data analysed by different methods (Pb-210 in air). (Author)
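
    As a simplified illustration of the characteristic-limit calculations in item (a), the Python sketch below applies the classical Currie approximations for a gross/background counting pair with equal counting times; it is not the laboratory's VBA implementation and omits the MARLAP/ISO/UNE comparison described in the record.

```python
# Classical Currie approximations for a paired gross/background measurement
# with equal counting times (results in net counts). Simplified illustration.
import math


def currie_limits(background_counts, k=1.645):
    """Return (decision threshold L_C, a priori detection limit L_D)."""
    l_c = k * math.sqrt(2.0 * background_counts)   # decision threshold
    l_d = k * k + 2.0 * l_c                        # detection limit
    return l_c, l_d


if __name__ == "__main__":
    lc, ld = currie_limits(background_counts=400)
    print(f"L_C = {lc:.1f} counts, L_D = {ld:.1f} counts")
```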

  15. Applying state diagrams to food processing and development

    Science.gov (United States)

    Roos, Y.; Karel, M.

    1991-01-01

    The physical state of food components affects their properties during processing, storage, and consumption. Removal of water by evaporation or by freezing often results in formation of an amorphous state (Parks et al., 1928; Troy and Sharp, 1930; Kauzmann, 1948; Bushill et al., 1965; White and Cakebread, 1966; Slade and Levine, 1991). Amorphous foods are also produced from carbohydrate melts by rapid cooling after extrusion or in the manufacturing of hard sugar candies and coatings (Herrington and Branfield, 1984). Formation of the amorphous state and its relation to equilibrium conditions are shown in Fig. 1 [see text]. The most important change, characteristic of the amorphous state, is noticed at the glass transition temperature (Tg), which involves transition from a solid "glassy" to a liquid-like "rubbery" state. The main consequence of glass transition is an increase of molecular mobility and free volume above Tg, which may result in physical and physico-chemical deteriorative changes (White and Cakebread, 1966; Slade and Levine, 1991). We have conducted studies on phase transitions of amorphous food materials and related Tg to composition, viscosity, stickiness, collapse, recrystallization, and ice formation. We have also proposed that some diffusion-limited deteriorative reactions are controlled by the physical state in the vicinity of Tg (Roos and Karel, 1990, 1991a, b, c). The results are summarized in this article, with state diagrams based on experimental and calculated data to characterize the relevant water content, temperature, and time-dependent phenomena of amorphous food components.

  16. Applying fuzzy analytic network process in quality function deployment model

    Directory of Open Access Journals (Sweden)

    Mohammad Ali Afsharkazemi

    2012-08-01

    Full Text Available In this paper, we present an empirical study of QFD implementation in which fuzzy numbers are used to handle the uncertainty associated with the different components of the proposed model. We implement a fuzzy analytic network process to find the relative importance of the various criteria, and using fuzzy numbers we calculate the relative importance of these factors. The proposed model uses a fuzzy matrix and the house of quality to study product development in QFD, as well as the second phase, i.e. part deployment. In most studies, the primary focus is only on customer requirements (CRs) when implementing quality function deployment, and other criteria such as production and manufacturing costs are disregarded. The results of using the fuzzy analytic network process based QFD model in the Daroupat packaging company to develop PVDC show that the most important indices are being waterproof, resistance of pill packages, and production cost. In addition, the PVDC coating is the most important index from the company experts’ point of view.
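
    A much-reduced illustration of the fuzzy weighting step is sketched below in Python: triangular fuzzy judgements for three criteria are aggregated and defuzzified into crisp normalized weights. The criterion names and judgement values are hypothetical, and the full fuzzy ANP supermatrix treatment used in the paper is not reproduced.

```python
# Aggregate triangular fuzzy judgements and defuzzify into normalized weights.
# Criteria and values are hypothetical illustrations only.
import numpy as np

# Each criterion: triangular fuzzy judgements (low, mid, high) from experts.
judgements = {
    "water_resistance":   [(6, 7, 8), (7, 8, 9), (6, 8, 9)],
    "package_durability": [(5, 6, 7), (6, 7, 8), (5, 7, 8)],
    "production_cost":    [(6, 7, 9), (7, 8, 9), (6, 7, 8)],
}


def defuzzify(tri):
    """Centroid of a triangular fuzzy number (l, m, u)."""
    l, m, u = tri
    return (l + m + u) / 3.0


scores = {c: np.mean([defuzzify(t) for t in vals]) for c, vals in judgements.items()}
total = sum(scores.values())
weights = {c: s / total for c, s in scores.items()}
for crit, w in sorted(weights.items(), key=lambda kv: -kv[1]):
    print(f"{crit:20s} {w:.3f}")
```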

  17. BUSINESS PROCESS IMPROVEMENT BY APPLYING BENCHMARKING BASED MODEL

    Directory of Open Access Journals (Sweden)

    Aleksandar Vujovic

    2013-09-01

    Full Text Available The choice of theme is driven by the need to improve business processes in organizations, as well as by the continuous improvement of overall quality, which is under-represented in Montenegrin organizations. The state of Montenegro has recognized the growing importance of small and medium-sized organizations in the development of the national economy. Small and medium-sized organizations are the drivers of future economic growth and development of every country, and special attention has to be paid to their competitiveness. One of the main sources of competitiveness of small and medium-sized organizations is their pursuit of business excellence, because it has become the most powerful means of achieving competitive advantage. The paper investigates certified organizations in Montenegro, their contemporary business practice and their commitment to business excellence. These organizations adapt their business to international standards and procedures that represent the future of economic growth and development of modern business. The research results for Montenegrin organizations were compared with small and medium-sized organizations from Serbia that won the award for business excellence, the "Quality Oscar", in the category of small and medium-sized organizations over the last three years (2009, 2010, and 2011). The idea comes from the necessity for the Montenegrin economy to make a small contribution so that small and medium-sized organizations adjust their businesses to the new business environment.

  18. 25 CFR 224.183 - What other administrative appeals processes also apply?

    Science.gov (United States)

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false What other administrative appeals processes also apply... DETERMINATION ACT General Appeal Procedures § 224.183 What other administrative appeals processes also apply? The administrative appeal processes in 25 CFR part 2 and 43 CFR part 4, subject to the limitations in...

  19. Applying Lean to the AC-130 Maintenance Process for the Royal Saudi Air Force

    Science.gov (United States)

    2016-09-01

    Applying lean thinking in construction and performance improvement. Alexandria Engineering Journal, 52(4), 679–695. http://doi.org/10.1016/j.aej... APPLYING LEAN TO THE AC-130 MAINTENANCE PROCESS FOR THE ROYAL SAUDI AIR FORCE THESIS... Organization or any other defense organization. AFIT-ENS-MS-16-S-024 APPLYING LEAN TO THE

  20. Quantification of UV-Visible and Laser Spectroscopic Techniques for Materials Accountability and Process Control

    International Nuclear Information System (INIS)

    Czerwinski, Kenneth; Weck, Phil

    2013-01-01

    Ultraviolet-visible spectroscopy (UV-Visible) and time-resolved laser fluorescence spectroscopy (TRLFS) optical techniques can permit on-line analysis of actinide elements in a solvent extraction process in real time. These techniques have been used for measuring actinide speciation and concentration under laboratory conditions and are easily adaptable to multiple sampling geometries, such as dip probes, fiber-optic sample cells, and flow-through cell geometries. To fully exploit these techniques, researchers must determine the fundamental speciation of target actinides and the resulting influence on spectroscopic properties. Detection limits, process conditions, and speciation of key actinide components can be established and utilized in a range of areas, particularly those related to materials accountability and process control. Through this project, researchers will develop tools and spectroscopic techniques to evaluate solution extraction conditions and concentrations of U, Pu, and Cm in extraction processes, addressing areas of process control and materials accountability. The team will evaluate UV-Visible and TRLFS for use in solvent extraction-based separations. Ongoing research is examining the efficacy of UV-Visible spectroscopy to evaluate uranium and plutonium speciation under conditions found in the UREX process and using TRLFS to evaluate Cm speciation and concentration in the TALSPEAK process. A uranyl and plutonium nitrate UV-Visible spectroscopy study met with success, which supports the utility and continued exploration of spectroscopic methods for evaluation of actinide concentrations and solution conditions for other aspects of the UREX+ solvent extraction scheme. This project will examine U and Pu absorbance in TRUEX and TALSPEAK, perform detailed examination of Cm in TRUEX and TALSPEAK, study U laser fluorescence, and apply project data to contactors. The team will also determine peak ratios as a function of solution concentrations for the UV
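
    A minimal sketch of the kind of quantification these techniques enable is shown below: the Beer-Lambert law combined with linear least squares recovers two species concentrations from a synthetic UV-Visible spectrum. The absorptivity spectra, species labels and noise level are invented stand-ins, not calibration data from this project.

```python
# Two-component Beer-Lambert inversion: A(lambda) = sum_i eps_i(lambda) * l * c_i.
# Absorptivity spectra and the "measured" spectrum are synthetic placeholders.
import numpy as np

wavelengths = np.linspace(400, 700, 301)             # nm
path_length = 1.0                                     # cm


def gaussian_band(center, width, height):
    return height * np.exp(-((wavelengths - center) ** 2) / (2 * width ** 2))


# Hypothetical molar absorptivity spectra (L mol^-1 cm^-1) of two species.
eps = np.column_stack([
    gaussian_band(470, 25, 60.0),     # "species 1", e.g. an actinide complex
    gaussian_band(600, 35, 40.0),     # "species 2"
])

true_c = np.array([0.012, 0.008])                     # mol/L
measured_A = eps @ true_c * path_length
measured_A += np.random.default_rng(1).normal(scale=2e-3, size=measured_A.size)

# Least-squares inversion of A = (eps * l) c
c_est, *_ = np.linalg.lstsq(eps * path_length, measured_A, rcond=None)
print("estimated concentrations (mol/L):", np.round(c_est, 4))
```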

  1. The JMB Applied Chemistry Syllabus--the Place of Case Studies and Industrial Processes

    Science.gov (United States)

    Hallas, G.; Hughes, W. J.

    1974-01-01

    Describes two novel topics in the JMB Applied Chemistry Core Syllabus. These are the social and economic aspects of chemical technology, involving the use of six case studies, and industrial processes. (Author/GS)

  2. Applying the Mixed Methods Instrument Development and Construct Validation Process: the Transformative Experience Questionnaire

    Science.gov (United States)

    Koskey, Kristin L. K.; Sondergeld, Toni A.; Stewart, Victoria C.; Pugh, Kevin J.

    2018-01-01

    Onwuegbuzie and colleagues proposed the Instrument Development and Construct Validation (IDCV) process as a mixed methods framework for creating and validating measures. Examples applying IDCV are lacking. We provide an illustrative case integrating the Rasch model and cognitive interviews applied to the development of the Transformative…

  3. Tutorial - applying extreme value theory to characterize food-processing systems

    DEFF Research Database (Denmark)

    Skou, Peter Bæk; Holroyd, Stephen E.; van der Berg, Franciscus Winfried J

    2017-01-01

    This tutorial presents extreme value theory (EVT) as an analytical tool in process characterization and shows its potential to describe production performance, e.g., across different factories, via reliable estimates of the frequency and scale of extreme events. Two alternative EVT methods are discussed: peak over threshold and block maxima. We illustrate the theoretical framework for EVT with process data from two different examples from the food-processing industry. Finally, we discuss limitations, decisions, and possibilities when applying EVT to process data.
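
    A minimal Python sketch of the block-maxima route is shown below, fitting a generalized extreme value distribution with SciPy to synthetic data; the block definition and the return-level question are assumptions for illustration only.

```python
# Block-maxima EVT sketch: take per-block maxima of a process signal and fit
# a generalized extreme value (GEV) distribution. Data are synthetic.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(7)
signal = rng.gumbel(loc=10.0, scale=2.0, size=365 * 24)   # hourly measurements
block_maxima = signal.reshape(365, 24).max(axis=1)         # daily maxima

shape, loc, scale = genextreme.fit(block_maxima)
# Level exceeded on average once every 100 blocks (100-block return level).
return_level_100 = genextreme.ppf(1 - 1 / 100, shape, loc=loc, scale=scale)
print(f"GEV fit: shape={shape:.3f}, loc={loc:.2f}, scale={scale:.2f}")
print(f"100-block return level: {return_level_100:.2f}")
```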

  4. Fluid Dynamics Applied to Streams. Physical Processes in Terrestrial and Aquatic Ecosystems, Transport Processes.

    Science.gov (United States)

    Cowan, Christina E.

    This module is part of a series designed to be used by life science students for instruction in the application of physical theory to ecosystem operation. Most modules contain computer programs which are built around a particular application of a physical process. This module deals specifically with concepts that are basic to fluid flow and…

  5. Statistical process control applied to the liquid-fed ceramic melter process

    International Nuclear Information System (INIS)

    Pulsipher, B.A.; Kuhn, W.L.

    1987-09-01

    In this report, an application of control charts to the apparent feed composition of a Liquid-Fed Ceramic Melter (LFCM) is demonstrated by using results from a simulation of the LFCM system. Usual applications of control charts require the assumption of uncorrelated observations over time. This assumption is violated in the LFCM system because of the heels left in tanks from previous batches. Methods for dealing with this problem have been developed to create control charts for individual batches sent to the feed preparation tank (FPT). These control charts are capable of detecting changes in the process average as well as changes in the process variation. All numbers reported in this document were derived from a simulated demonstration of a plausible LFCM system. In practice, site-specific data must be used as input to a simulation tailored to that site. These data directly affect all variance estimates used to develop control charts. 64 refs., 3 figs., 2 tabs
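
    For orientation, the sketch below computes textbook individuals/moving-range control limits for a batch feed-composition variable in Python; it assumes uncorrelated batches and therefore omits the correction for tank heels that the report develops.

```python
# Individuals (X) and moving-range control limits for a batch variable.
# Textbook uncorrelated-data version; correlation from tank heels not handled.
import numpy as np


def individuals_chart_limits(x):
    """Return (center line, LCL, UCL) for an individuals chart."""
    x = np.asarray(x, dtype=float)
    mr_bar = np.abs(np.diff(x)).mean()     # mean moving range
    sigma_est = mr_bar / 1.128             # d2 constant for subgroups of size 2
    center = x.mean()
    return center, center - 3 * sigma_est, center + 3 * sigma_est


if __name__ == "__main__":
    rng = np.random.default_rng(3)
    feed_na2o = rng.normal(loc=12.0, scale=0.3, size=30)   # wt% in each batch
    cl, lcl, ucl = individuals_chart_limits(feed_na2o)
    out = np.where((feed_na2o < lcl) | (feed_na2o > ucl))[0]
    print(f"CL={cl:.2f}, LCL={lcl:.2f}, UCL={ucl:.2f}, out-of-control batches: {out}")
```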

  6. Making Faces - State-Space Models Applied to Multi-Modal Signal Processing

    DEFF Research Database (Denmark)

    Lehn-Schiøler, Tue

    2005-01-01

    The two main focus areas of this thesis are State-Space Models and multi-modal signal processing. The general State-Space Model is investigated and an addition to the class of sequential sampling methods is proposed. This new algorithm is denoted as the Parzen Particle Filter. Furthermore … optimizer can be applied to speed up convergence. The linear version of the State-Space Model, the Kalman Filter, is applied to multi-modal signal processing. It is demonstrated how a State-Space Model can be used to map from speech to lip movements. Besides the State-Space Model and the multi-modal...
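
    The linear workhorse mentioned above, the Kalman filter, can be sketched in a few lines of NumPy; the matrices and observations below are synthetic two-dimensional placeholders, not the speech-to-lip mapping estimated in the thesis.

```python
# One predict/update cycle of a linear Kalman filter with synthetic matrices.
import numpy as np


def kalman_step(x, P, z, A, C, Q, R):
    """Return the filtered state and covariance after observing z."""
    # Predict
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    # Update
    S = C @ P_pred @ C.T + R                 # innovation covariance
    K = P_pred @ C.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ (z - C @ x_pred)
    P_new = (np.eye(len(x)) - K @ C) @ P_pred
    return x_new, P_new


if __name__ == "__main__":
    A = np.array([[1.0, 0.1], [0.0, 1.0]])   # state transition
    C = np.array([[1.0, 0.0]])               # only the first state is observed
    Q, R = 0.01 * np.eye(2), np.array([[0.1]])
    x, P = np.zeros(2), np.eye(2)
    for z in (0.9, 1.1, 1.0, 1.2):
        x, P = kalman_step(x, P, np.array([z]), A, C, Q, R)
    print("filtered state:", np.round(x, 3))
```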

  7. Formulation and validation of applied engineering equations for heat transfer processes in the food industry

    OpenAIRE

    Christensen, Martin Gram; Adler-Nissen, Jens; Løje, Hanne

    2014-01-01

    The study is focused on convective heat transfer in the processing of solid foods, specifically with the aim of developing simple analytical calculation tools that can be incorporated into spreadsheet solutions. In areas of food engineering such as equipment manufacture, the use of predictive calculations, modelling activities and simulations for improved design is employed to a high degree. In food manufacture, process calculations are seldom applied. Even though the calculation of the...
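
    The sketch below shows the kind of simple analytical convective heat transfer calculation the study aims to make spreadsheet-friendly: lumped-capacitance heating via Newton's law of cooling. The product and process parameters are hypothetical, and the lumped assumption is only valid for small Biot numbers.

```python
# Lumped-capacitance convective heating (Newton's law of cooling/heating).
# Parameter values are hypothetical; valid only for small Biot numbers.
import math


def lumped_temperature(t, T_inf, T0, h, area, mass, cp):
    """Product temperature after time t [s] under convective heating."""
    tau = mass * cp / (h * area)             # time constant [s]
    return T_inf + (T0 - T_inf) * math.exp(-t / tau)


if __name__ == "__main__":
    # Hypothetical small product: 20 g, cp ~ 3500 J/(kg K), surface ~ 3.1e-3 m^2,
    # heated in oil at 160 degC with h ~ 250 W/(m^2 K), starting at 5 degC.
    for t in (30, 60, 120):
        T = lumped_temperature(t, T_inf=160, T0=5, h=250,
                               area=3.1e-3, mass=0.020, cp=3500)
        print(f"t = {t:3d} s -> T = {T:5.1f} degC")
```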

  8. 21 CFR 111.460 - What requirements apply to holding in-process material?

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false What requirements apply to holding in-process material? 111.460 Section 111.460 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION CURRENT GOOD MANUFACTURING PRACTICE IN...

  9. Undergraduate Journal Club as an Intervention to Improve Student Development in Applying the Scientific Process

    Science.gov (United States)

    Sandefur, Conner I.; Gordy, Claire

    2016-01-01

    We developed and implemented a series of workshops and seminars in an undergraduate journal club targeted at improving student development in applying the scientific process. Students were surveyed before and after participating in the club about their confidence in accessing, analyzing, and reporting scientific research. Post-club, the students…

  10. Emotion processing and regulation in women with morbid obesity who apply for bariatric surgery

    NARCIS (Netherlands)

    Zijlstra, H.; Middendorp, H. van; Devaere, L.; Larsen, J.K.; Ramshorst, B. van; Geenen, M.J.M.

    2012-01-01

    Emotional eating, the tendency to eat when experiencing negative affect, is prevalent in morbid obesity and may indicate that ways to deal with emotions are disturbed. Our aim was to compare emotion processing and regulation between 102 women with morbid obesity who apply for bariatric surgery and

  11. Exploration on practice teaching reform of Photoelectric Image Processing course under applied transformation

    Science.gov (United States)

    Cao, Binfang; Li, Xiaoqin; Liu, Changqing; Li, Jianqi

    2017-08-01

    As local colleges undergo further applied transformation, teachers urgently need to make corresponding changes to the teaching content and methods of different courses. The article discusses the practice teaching reform of the Photoelectric Image Processing course in the Optoelectronic Information Science and Engineering major. The Digital Signal Processing (DSP) platform is introduced into the experimental teaching; it mobilizes and inspires students and also enhances their learning motivation and innovation through specific examples. Through this practice-based teaching, the course has become one of the most popular among students, which further drives their enthusiasm and confidence to participate in all kinds of electronics competitions.

  12. A novel processed food classification system applied to Australian food composition databases.

    Science.gov (United States)

    O'Halloran, S A; Lacy, K E; Grimes, C A; Woods, J; Campbell, K J; Nowson, C A

    2017-08-01

    The extent of food processing can affect the nutritional quality of foodstuffs. Categorising foods by the level of processing emphasises the differences in nutritional quality between foods within the same food group and is likely useful for determining dietary processed food consumption. The present study aimed to categorise foods within Australian food composition databases according to the level of food processing using a processed food classification system, as well as assess the variation in the levels of processing within food groups. A processed foods classification system was applied to food and beverage items contained within Australian Food and Nutrient (AUSNUT) 2007 (n = 3874) and AUSNUT 2011-13 (n = 5740). The proportion of Minimally Processed (MP), Processed Culinary Ingredients (PCI), Processed (P) and Ultra Processed (ULP) by AUSNUT food group and the overall proportion of the four processed food categories across AUSNUT 2007 and AUSNUT 2011-13 were calculated. Across the food composition databases, the overall proportions of foods classified as MP, PCI, P and ULP were 27%, 3%, 26% and 44% for AUSNUT 2007 and 38%, 2%, 24% and 36% for AUSNUT 2011-13. Although there was wide variation in the classifications of food processing within the food groups, approximately one-third of foodstuffs were classified as ULP food items across both the 2007 and 2011-13 AUSNUT databases. This Australian processed food classification system will allow researchers to easily quantify the contribution of processed foods within the Australian food supply to assist in assessing the nutritional quality of the dietary intake of population groups. © 2017 The British Dietetic Association Ltd.

  13. Second-order relational face processing is applied to faces of different race and photographic contrast.

    Science.gov (United States)

    Matheson, H E; Bilsbury, T G; McMullen, P A

    2012-03-01

    A large body of research suggests that faces are processed by a specialized mechanism within the human visual system. This specialized mechanism is made up of subprocesses (Maurer, LeGrand, & Mondloch, 2002). One subprocess, called second-order relational processing, analyzes the metric distances between face parts. Importantly, it is well established that other-race faces and contrast-reversed faces are associated with impaired performance on numerous face processing tasks. Here, we investigated the specificity of second-order relational processing by testing how this process is applied to faces of different race and photographic contrast. Participants completed a feature displacement discrimination task, directly measuring the sensitivity to second-order relations between face parts. Across three experiments we show that, despite absolute differences in sensitivity in some conditions, inversion impaired performance in all conditions. The presence of robust inversion effects for all faces suggests that second-order relational processing can be applied to faces of different race and photographic contrast.

  14. Plug and Play Process Control Applied to a District Heating System

    DEFF Research Database (Denmark)

    Knudsen, Torben; Trangbæk, Klaus; Kallesøe, Carsten Skovmose

    2008-01-01

    The general ideas within plug and play process control (PTC) are to initialize and reconfigure control systems just by plug and play. In this paper these ideas are applied to a district heating pressure control problem. First of all this serves as a concrete example of PTC; secondly, some of the first techniques developed in the project to solve the problems in PTC are presented. These are in the area of incremental modelling and control, and they make it possible to "plug" in a new sensor and actuator and make it "play" automatically.

  15. Applying circular economy innovation theory in business process modeling and analysis

    Science.gov (United States)

    Popa, V.; Popa, L.

    2017-08-01

    The overall aim of this paper is to develop a new conceptual framework for business process modeling and analysis using circular economy innovation theory as a source for business knowledge management. The last part of the paper presents the authors’ proposed basic structure for new business models applying circular economy innovation theories. For people working on new innovative business models in the field of the circular economy, this paper provides new ideas for clustering their concepts.

  16. Applying the health action process approach (HAPA) to the choice of health products: An exploratory study

    DEFF Research Database (Denmark)

    Krutulyte, Rasa; Grunert, Klaus G.; Scholderer, Joachim

    This paper presents the results of a qualitative pilot study that aimed at uncovering Danish consumers' motives for choosing health food. Schwarzer's (1992) health action process approach (HAPA) was applied to understand the process by which people choose health products. The research focused on the role of behavioural intention predictors such as risk perception, outcome expectations and self-efficacy. The model has proved to be a useful framework for understanding consumers' choice of health food and is relevant to the further application of dietary choice issues.

  17. Rapid Computer Aided Ligand Design and Screening of Precious Metal Extractants from TRUEX Raffinate with Experimental Validation

    Energy Technology Data Exchange (ETDEWEB)

    Clark, Aurora Sue [Washington State Univ., Pullman, WA (United States); Wall, Nathalie [Washington State Univ., Pullman, WA (United States); Benny, Paul [Washington State Univ., Pullman, WA (United States)

    2015-11-16

    Rhodium is the most extensively used metal in catalytic applications; it occurs in mixed ores with platinum group metals (PGMs) in the earth’s crust in low concentrations (0.4 - 10 ppb). It is resistant to aerial oxidation and insoluble in all acids, including aqua regia, making classical purification methods time-consuming and inefficient. To ensure adequate purity, several precipitation and dissolution steps are necessary during separation. Low abundance, high demand, and extensive processing make rhodium the most expensive of all PGMs. From alternative sources, rhodium is also produced in sufficient quantities (0.47 kg per ton initial heavy metal (tIHM)) during the fission of U-235 in nuclear reactors along with other PGMs (i.e., Ag, Pd, Ru). A typical power water reactor operating with UO2 fuel after cooling can generate PGMs in quantities greater than found in the earth’s crust (0.5-2 kg/tIHM). This currently untapped supply of PGMs has the potential to yield $5,000-30,000/tIHM. It is estimated that by the year 2030, the amount of rhodium generated in reactors could exceed natural reserves. Typical SNF processing removes the heavier lanthanides and actinides and can leave PGMs at ambient temperatures in aqueous acidic (Cl⁻ or NO3⁻; pH < 1) solutions at various activities. While the retrieval of these precious metals from SNF would minimize waste generation and improve resource utilization, it has been difficult to achieve thus far. Two general strategies have been utilized to extract Rh(III) from chloride media: ion pairing and coordination complexation. Ion pairing mechanisms have been studied primarily with the tertiary and quaternary amines. Additionally, mixed mechanism extractions have been observed in which ion pairing is the initial mechanism, and longer extraction equilibrium time generated coordination complexes. Very few coordination complexation extraction ligands have been studied. This project approached this problem

  18. Rapid Computer Aided Ligand Design and Screening of Precious Metal Extractants from TRUEX Raffinate with Experimental Validation

    International Nuclear Information System (INIS)

    Clark, Aurora Sue; Wall, Nathalie; Benny, Paul

    2015-01-01

    Rhodium is the most extensively used metal in catalytic applications; it occurs in mixed ores with platinum group metals (PGMs) in the earth's crust in low concentrations (0.4 - 10 ppb). It is resistant to aerial oxidation and insoluble in all acids, including aqua regia, making classical purification methods time-consuming and inefficient. To ensure adequate purity, several precipitation and dissolution steps are necessary during separation. Low abundance, high demand, and extensive processing make rhodium the most expensive of all PGMs. From alternative sources, rhodium is also produced in sufficient quantities (0.47 kg per ton initial heavy metal (tIHM)) during the fission of U-235 in nuclear reactors along with other PGMs (i.e. Ag, Pd, Ru). A typical power water reactor operating with UO 2 fuel after cooling can generate PGMs in quantities greater than found in the earth's crust (0.5-2 kg/tIHM). This currently untapped supply of PGMs has the potential to yield $5,000-30,000/tIHM. It is estimated that by the year 2030, the amount of rhodium generated in reactors could exceed natural reserves. Typical SNF processing removes the heavier lanthanides and actinides and can leave PGMs at ambient temperatures in aqueous acidic (Cl - or NO 3 - ; pH < 1) solutions at various activities. While the retrieval of these precious metals from SNF would minimize waste generation and improve resource utilization, it has been difficult to achieve thus far. Two general strategies have been utilized to extract Rh(III) from chloride media: ion pairing and coordination complexation. Ion pairing mechanisms have been studied primarily with the tertiary and quaternary amines. Additionally, mixed mechanism extractions have been observed in which ion pairing is the initial mechanism, and longer extraction equilibrium time generated coordination complexes. Very few coordination complexation extraction ligands have been studied. This project approached this problem through the

  19. Modeling segregated in- situ combustion processes through a vertical displacement model applied to a Colombian field

    International Nuclear Information System (INIS)

    Guerra Aristizabal, Jose Julian; Grosso Vargas, Jorge Luis

    2005-01-01

    Recently, the incorporation of horizontal well technologies into thermal EOR processes like the in-situ combustion (ISC) process has been proposed. This has led to the conception of new recovery mechanisms, named here segregated in-situ combustion processes, which are conventional in-situ combustion processes with a segregated flow component. Top/Down Combustion, Combustion Override Split-production Horizontal-well and Toe-to-Heel Air Injection are three of these processes, which incorporate horizontal producers and gravity drainage phenomena. When applied to thick reservoirs, a process of this nature can be reasonably modeled using concepts of conventional in-situ combustion and crestal gas injection, especially for heavy oils that are mobile at reservoir conditions. A process of this nature has been studied through an analytic model conceived for the particular conditions of the Castilla field, a homogeneous thick anticline structure containing high-mobility heavy oil, which seems to be an excellent candidate for the application of these technologies.

  20. PRO-ELICERE: A Hazard Analysis Automation Process Applied to Space Systems

    Directory of Open Access Journals (Sweden)

    Tharcius Augusto Pivetta

    2016-07-01

    Full Text Available In the last decades, critical systems have increasingly been developed using computers and software, even in the space area, where the project approach is usually very conservative. In projects for rockets, satellites and their facilities, such as ground support systems and simulators, among other operations critical for the space mission, a hazard analysis must be applied. The ELICERE process was created to perform hazard analysis mainly on computer critical systems, in order to define or evaluate their safety and dependability requirements, strongly based on the Hazard and Operability Study and Failure Mode and Effect Analysis techniques. It aims to improve the project design or to understand the potential hazards of existing systems, improving their functions related to functional or non-functional requirements. The main goal of the ELICERE process is thus to ensure the safety and dependability goals of a space mission. The process was initially designed to be applied manually and gradually. A software tool called PRO-ELICERE has now been developed to facilitate the analysis process and to store the results for reuse in the analysis of other systems. To show how ELICERE and its tool work, a small space case study was carried out, based on a hypothetical rocket of the Cruzeiro do Sul family developed by the Instituto de Aeronáutica e Espaço in Brazil.

  1. Image processing applied to automatic detection of defects during ultrasonic examination

    International Nuclear Information System (INIS)

    Moysan, J.

    1992-10-01

    This work is a study of image processing applied to ultrasonic BSCAN images obtained during non-destructive testing of welds. The goal is to define what image-processing techniques can contribute to better exploitation of the collected data and, more precisely, how image processing can extract the meaningful echoes that make it possible to characterize and size the defects. The report presents ultrasonic non-destructive testing in the nuclear field and describes the specific features of ultrasonic wave propagation in austenitic welds. It gives a state of the art of data processing applied to ultrasonic images in non-destructive evaluation. A new image analysis is then developed, based on a powerful tool, the co-occurrence matrix. This matrix provides a global representation of the relations between the amplitudes of pixel pairs. From the analysis of this matrix, a complete and automatic method has been established to define a threshold that separates echoes from noise, making an automatic interpretation of the ultrasonic echoes possible. A complete validation has been carried out on standard test pieces.
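
    As a rough illustration of the co-occurrence tool mentioned above, the sketch below builds a grey-level co-occurrence matrix (GLCM) for a synthetic 8-bit BSCAN-like image and derives an amplitude threshold from it. The scikit-image call, the synthetic data and the simple cumulative-mass rule are illustrative assumptions, not the thesis's actual thresholding procedure.

```python
# Illustrative use of a grey-level co-occurrence matrix (GLCM) on a synthetic
# 8-bit BSCAN-like image, with a simple cumulative-mass rule to pick an amplitude
# threshold. The rule and data are assumptions, not the thesis's actual procedure.
# graycomatrix is the scikit-image name (older releases spell it greycomatrix).
import numpy as np
from skimage.feature import graycomatrix

def glcm_threshold(bscan_8bit, levels=256):
    # Joint distribution of horizontally adjacent pixel amplitudes (d=1, angle=0)
    glcm = graycomatrix(bscan_8bit, distances=[1], angles=[0],
                        levels=levels, symmetric=True, normed=True)
    p = glcm[:, :, 0, 0]
    marginal = p.sum(axis=1)                 # probability of each grey level
    cumulative = np.cumsum(marginal)
    # Hypothetical rule: keep the brightest amplitudes carrying the last 1% of mass
    return int(np.searchsorted(cumulative, 0.99))

rng = np.random.default_rng(0)
bscan = rng.integers(0, 60, size=(128, 512), dtype=np.uint8)   # synthetic noise floor
bscan[60:68, 200:260] = 220                                    # synthetic defect echo
threshold = glcm_threshold(bscan)
echo_mask = bscan >= threshold                                 # echoes separated from noise
```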

  2. Process management incorporating the intellectual capital and knowledge management: an applied study in research centres

    Directory of Open Access Journals (Sweden)

    Enrique Saravia Vergara

    2015-09-01

    In today's competitive environment, organizations seek to create value for customers through management approaches that not only ensure the supply of quality goods and services at low prices, but also achieve long-term competitive advantages. In this context, process management appears as a management model based on "quality", whereas "intellectual capital" and "knowledge management" represent the main models based on the management of intangible assets, the basis of competitive success in the 21st century. This study is an exploratory effort that, starting from a process management model applied to a research centre and a review of the theoretical framework relevant to the disciplines of "intellectual capital" and "knowledge management", analyses and proposes a model of process management for research centres that incorporates intellectual capital and knowledge management.

  3. 21 CFR 111.90 - What requirements apply to treatments, in-process adjustments, and reprocessing when there is a...

    Science.gov (United States)

    2010-04-01

    ... Requirement to Establish a Production and Process Control System § 111.90 What requirements apply to... reprocessing, treatment or in-process adjustment is permitted by § 111.77; (c) Any batch of dietary supplement... 21 Food and Drugs 2 2010-04-01 2010-04-01 false What requirements apply to treatments, in-process...

  4. The Process of Laying Concrete and Analysis of Operations Applying the Lean Approach

    Directory of Open Access Journals (Sweden)

    Vidmantas Gramauskas

    2012-11-01

    The paper considers the Lean philosophy, 'Just in Time', value stream mapping and total quality management principles, applying them to construction. In order to follow these principles, a case study was performed, observing and recording the process of laying concrete in three houses in which a lower ground floor was cast using fibre concrete. The collected data were needed to fragment the concreting process into smaller operations and to examine each operation independently. The examination of operations was expressed in specific units of measurement: time, the number of workers, cubic metres of concrete used, surface area, etc. This information helped to distinguish useful operations from wasteful actions that bring no value to the product. The methods mentioned above can then be applied to wasteful operations to reduce their duration or even eliminate them. The main challenge lies in splitting the concreting process into smaller operations, analysing each operation and adapting the Lean principles. The implementation of a Lean system can reduce waste and increase the value of the final product.

  5. Influence of the applied pressure of processing upon bioactive components of diets made of feathers

    Directory of Open Access Journals (Sweden)

    Kormanjoš Šandor M.

    2013-01-01

    Feathers obtained from slaughtering fattening chickens can be processed into protein meal for feeding certain animals, as indicated by their chemical characteristics. However, raw feather proteins (keratin) are poorly digestible (ca. 19%), even inert, in the digestive tract. The digestibility of feather proteins can be improved by hydrolysis (alkaline, enzymatic, microbiological or hydrothermal). In practice, hydrothermal processing of raw feathers is the most widely applied. This paper presents the influence of hydrothermal processing under pressures of 3.0, 3.5 or 4.0 bar on the nutritive value of the resulting meal. A semi-continuous procedure was applied for the hydrolysis of the raw feathers; it comprises hydrolysis of the raw wet feathers followed by partial drying of the hydrolyzed mass, carried out in a hydrolyser with indirect heating. A continuous tubular dryer with recycled air was used for the final drying step. Protein nitrogen decreased by 3.46% and 4.80%, compared with the total protein nitrogen content of raw feathers, under pressures of 3.0 and 3.5 bar, respectively. The highest hydrolysis pressure caused the greatest loss of protein nitrogen, up to 9.52%. Hydrothermal hydrolysis under pressure significantly increased in vitro protein digestibility: under pressures of 3.0, 3.5 and 4.0 bar, digestibility increased from 19.01% to 76.39%, 81.71% and 87.03%, respectively. Under pressures of 3.0, 3.5 and 4.0 bar, the cysteine content decreased from 6.44% to 4.17% (loss 35.25%), 3.94% (loss 38.825%) and 3.75% (loss 41.77%), respectively; these decreases are statistically significant. It can be concluded that hydrolysis carried out under a pressure of 3.5 bar, for a period of 25 minutes and with a water content in the raw feathers of about 61%, is the optimal technological process for converting raw feathers into a feed ingredient for certain animal diets.

  6. Applying traditional signal processing techniques to social media exploitation for situational understanding

    Science.gov (United States)

    Abdelzaher, Tarek; Roy, Heather; Wang, Shiguang; Giridhar, Prasanna; Al Amin, Md. Tanvir; Bowman, Elizabeth K.; Kolodny, Michael A.

    2016-05-01

    Signal processing techniques such as filtering, detection, estimation and frequency domain analysis have long been applied to extract information from noisy sensor data. This paper describes the exploitation of these signal processing techniques to extract information from social networks, such as Twitter and Instagram. Specifically, we view social networks as noisy sensors that report events in the physical world. We then present a data processing stack for detection, localization, tracking, and veracity analysis of reported events using social network data. We show using a controlled experiment that the behavior of social sources as information relays varies dramatically depending on context. In benign contexts, there is general agreement on events, whereas in conflict scenarios, a significant amount of collective filtering is introduced by conflicted groups, creating a large data distortion. We describe signal processing techniques that mitigate such distortion, resulting in meaningful approximations of actual ground truth, given noisy reported observations. Finally, we briefly present an implementation of the aforementioned social network data processing stack in a sensor network analysis toolkit, called Apollo. Experiences with Apollo show that our techniques are successful at identifying and tracking credible events in the physical world.
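
    As a toy illustration of the "social networks as noisy sensors" idea described above, the sketch below treats a per-minute keyword-count series as a sensor signal, smooths it with a moving-average filter and flags burst minutes with a z-score rule. The data, window length and threshold are assumptions for demonstration; this is not the Apollo toolkit's processing stack.

```python
# Toy "social sensor" example: a per-minute keyword-count series is smoothed
# (filtering) and burst minutes are flagged (detection) with a z-score rule.
# Data, window length and threshold are assumptions; this is not the Apollo stack.
import numpy as np

rng = np.random.default_rng(7)
counts = rng.poisson(5, size=240).astype(float)     # 4 hours of background chatter
counts[120:130] += 40.0                             # injected "event" burst

window = 15
baseline = np.convolve(counts, np.ones(window) / window, mode="same")  # moving average
residual = counts - baseline
z = (residual - residual.mean()) / residual.std()
event_minutes = np.flatnonzero(z > 3.0)             # detection: 3-sigma exceedances
```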

  7. Texts and data mining and their possibilities applied to the process of news production

    Directory of Open Access Journals (Sweden)

    Walter Teixeira Lima Jr

    2008-06-01

    The purpose of this essay is to discuss the challenges of representing, in a formal computational process, the knowledge that the journalist uses to articulate news values for the purpose of selecting and ranking news. It discusses how to build bridges between this empirically acquired knowledge and the foundations of computer science in the areas of storage, retrieval and linking of data in a database, which should reflect the way human brains treat information obtained through the sensory system. Systematizing and automating part of the journalistic process in a database contributes to eliminating distortions and faults and to applying, in an efficient manner, Data and/or Text Mining techniques which, by definition, permit the discovery of non-trivial relations.

  9. Environmental assessment of different advanced oxidation processes applied to a bleaching Kraft mill effluent.

    Science.gov (United States)

    Muñoz, Iván; Rieradevall, Joan; Torrades, Francesc; Peral, José; Domènech, Xavier

    2006-01-01

    Different advanced oxidation processes (AOPs) have been applied to remove the organic carbon content of a paper mill effluent originating from the Kraft pulp bleaching process. The AOPs considered were: TiO2-mediated heterogeneous photocatalysis, TiO2-mediated heterogeneous photocatalysis assisted with H2O2, TiO2-mediated heterogeneous photocatalysis coupled with Fenton, photo-Fenton, ozonation and ozonation with UV-A light irradiation. All of the selected AOPs resulted in a considerable decrease in dissolved organic carbon (DOC) content, with treatment efficiencies that varied depending on the nature/type of the applied AOP. A Life Cycle Assessment (LCA) study was used as a tool to compare the different AOPs in terms of their environmental impact. Heterogeneous photocatalysis coupled with Fenton's reagent proved to have the lowest environmental impact, together with a moderate-to-high DOC removal rate. On the other hand, heterogeneous photocatalysis alone appeared to be the worst AOP both in terms of DOC abatement rate and environmental impact. For the studied AOPs, LCA indicated that the environmental impact was attributable to the high electrical energy consumption necessary to run a UV-A lamp or to produce ozone.

  10. Applying a structured innovation process to interventional radiology: a single-center experience.

    Science.gov (United States)

    Sista, Akhilesh K; Hwang, Gloria L; Hovsepian, David M; Sze, Daniel Y; Kuo, William T; Kothary, Nishita; Louie, John D; Yamada, Kei; Hong, Richard; Dhanani, Riaz; Brinton, Todd J; Krummel, Thomas M; Makower, Joshua; Yock, Paul G; Hofmann, Lawrence V

    2012-04-01

    To determine the feasibility and efficacy of applying an established innovation process to an active academic interventional radiology (IR) practice. The Stanford Biodesign Medical Technology Innovation Process was used as the innovation template. Over a 4-month period, seven IR faculty and four IR fellow physicians recorded observations. These observations were converted into need statements. One particular need relating to gastrostomy tubes was diligently screened and was the subject of a single formal brainstorming session. Investigators collected 82 observations, 34 by faculty and 48 by fellows. The categories that generated the most observations were enteral feeding (n = 9, 11%), biopsy (n = 8, 10%), chest tubes (n = 6, 7%), chemoembolization and radioembolization (n = 6, 7%), and biliary interventions (n = 5, 6%). The output from the screening of the gastrostomy tube need was a specification sheet that served as a guidance document for the subsequent brainstorming session. The brainstorming session produced 10 concepts in three separate categories. This formalized innovation process generated numerous observations and ultimately 10 concepts to potentially solve a significant clinical need, suggesting that a structured process can help guide an IR practice interested in medical innovation. Copyright © 2012 SIR. Published by Elsevier Inc. All rights reserved.

  11. Twenty-First Century Research Needs in Electrostatic Processes Applied to Industry and Medicine

    Science.gov (United States)

    Mazumder, M. K.; Sims, R. A.; Biris, A. S.; Srirama, P. K.; Saini, D.; Yurteri, C. U.; Trigwell, S.; De, S.; Sharma, R.

    2005-01-01

    From the early twentieth-century Nobel Prize-winning (1923) experiments with charged oil droplets, resulting in the discovery of the elementary electronic charge by Robert Millikan, to the early 21st-century Nobel Prize (2002) awarded to John Fenn for his invention of electrospray ionization mass spectrometry and its applications to proteomics, electrostatic processes have been successfully applied to many areas of industry and medicine. The generation, transport, deposition, separation, analysis, and control of charged particles in the four states of matter (solid, liquid, gas, and plasma) are of interest in many industrial and biomedical processes. In this paper, we briefly discuss some of the applications and research needs involving charged particles in industrial and medical applications, including: (1) generation and deposition of unipolarly charged dry powder without the presence of ions or excessive ozone, (2) control of the tribocharging process for consistent and reliable charging, (3) thin-film (less than 25 micrometers) powder coating and powder coating on insulative surfaces, (4) fluidization and dispersion of fine powders, (5) mitigation of Mars dust, (6) the effect of particle charge on the lung deposition of inhaled medical aerosols, (7) nanoparticle deposition, and (8) plasma/corona discharge processes. A brief discussion of measurements of charged particles and suggestions for research needs is also included.

  12. Improving performances of the knee replacement surgery process by applying DMAIC principles.

    Science.gov (United States)

    Improta, Giovanni; Balato, Giovanni; Romano, Maria; Ponsiglione, Alfonso Maria; Raiola, Eliana; Russo, Mario Alessandro; Cuccaro, Patrizia; Santillo, Liberatina Carmela; Cesarelli, Mario

    2017-12-01

    This work is part of a project on the application of Lean Six Sigma to improve health care processes. A previously published work on hip replacement surgery showed promising results. Here, we propose an application of the DMAIC (Define, Measure, Analyse, Improve, and Control) cycle to improve quality and reduce costs related to prosthetic knee replacement surgery by decreasing patients' length of hospital stay (LOS). The DMAIC cycle was adopted to decrease the patients' LOS. The University Hospital "Federico II" of Naples, one of the most important university hospitals in Southern Italy, participated in this study. Data on 148 patients who underwent prosthetic knee replacement between 2010 and 2013 were used. Process mapping, statistical measures, brainstorming activities, and comparative analysis were performed to identify factors influencing LOS and improvement strategies. The study allowed the identification of the variables influencing the prolongation of the LOS and the implementation of corrective actions to improve the process of care. The adopted actions reduced the LOS by 42%, from a mean value of 14.2 to 8.3 days (the standard deviation also decreased, from 5.2 to 2.3 days). The DMAIC approach proved to be a helpful strategy, ensuring a significant decrease of the LOS. Furthermore, through its implementation, a significant reduction of the average costs of hospital stay can be achieved. Such a versatile approach could be applied to improve a wide range of health care processes. © 2017 John Wiley & Sons, Ltd.

  13. Nanotechnological applied tasks of the increase in the efficiency of the hardening processes of cement concrete

    Directory of Open Access Journals (Sweden)

    Chernishov Evgeny Mihalovich

    2017-02-01

    The scientific basis for solving applied tasks of concrete technology through the use of "nano" tools, which provide for the organization of the heterogeneous process of cement hydration and hardening, is characterized. It is shown that the introduction of nanoadditives enables direct regulation of the processes of structure formation in cement systems at the nanolevel. It is proposed to evaluate the effectiveness of "nano" tools by means of complex criteria characterizing quantitatively the change in the activation energy, the rate of the process and the time of its completion τ, the size, and the power consumption E of the technology, while ensuring the quality levels specified by R. Based on these criteria, the results of the research were monitored and the most effective nanomodifying admixtures of two types were identified. Type I is a compound nanoadditive based on SiO2 nanoparticles in combination with a superplasticizer, whose mechanism of action is associated with the catalytic role of the nanoparticles in the phase formation of hydrated compounds, and which is characterized by an increase in specific strength per unit degree of cement hydration by 1.25–1.35 times. Engineering problems have been formulated, and solutions are indicated for increasing the energy efficiency of the factory production of reinforced concrete products and structures. These solutions predetermine a reduction in the maximum curing temperature of the concrete, a reduction in the time needed to achieve the required degree of cement hydration during hardening, a reduction of the hardening time needed to reach the regulated strength values, an increase in concrete strength per unit of cement consumption per m3, and an increase in the energy efficiency of the concrete hardening process in the production of reinforced concrete products. Type II is a

  14. The use of chemical tracers to water injection processes applied on Romanian reservoirs

    Directory of Open Access Journals (Sweden)

    Zecheru M.

    2013-05-01

    Hydrocarbon reservoirs are extremely complex, each reservoir having its own identity. Reservoir heterogeneity (mainly in layered reservoirs) frequently results in low recovery efficiencies, both under the primary regime and when different agents are injected from the surface. The efficiency of EOR processes depends on how well the reservoir is known and on the information available about fluid flow through the reservoir. Certain analyses, investigations and tests provide good knowledge of the reservoir; tracer tests are among them, being frequently used in water injection processes. Depending on the method used (IWTT, Interwell Tracer Test; SWTT, Single-Well Tracer Test; TWTT, Two-Well Tracer Test), information is obtained on: the preferential flow paths of the injected fluid, the identification of water channels, the presence of geological barriers, and the residual oil saturation around the well bore or along the tracer's path between two wells. This paper focuses on ICPT Câmpina's efforts related to the use of chemical tracers in the water injection processes applied to the oil reservoirs of Romania. It describes the usual tracers and the methods used to detect them in the reaction wells. Up to now, more than 50 interwell tracer tests have been performed on-site, and this work presents some of their results.

  15. Regional LLRW [low-level radioactive waste] processing alternatives applying the DOE REGINALT systems analysis model

    International Nuclear Information System (INIS)

    Beers, G.H.

    1987-01-01

    The DOE Low-Level Waste Management Program has developed a computer-based decision support system of models that may be used by nonprogrammers to evaluate a comprehensive approach to commercial low-level radioactive waste (LLRW) management. The implementation of REGINALT (Regional Waste Management Alternatives Analysis Model) will be described as the model is applied to a hypothetical regional compact for the purpose of examining the technical and economic potential of two waste processing alternatives. Using waste from a typical regional compact, two specific regional waste processing centers will be compared for feasibility. Example 1 will assume that a regional supercompaction facility is being developed for the region. Example 2 will assume that a regional facility with both supercompaction and incineration is specified. Both examples will include identical disposal facilities, except that capacity may differ due to variation in the volume reduction achieved. The two examples will be compared with regard to volume reduction achieved, estimated occupational exposure for the processing facilities, and life cycle costs per unit of generated waste. A base case will also illustrate current disposal practices. The results of the comparisons will be evaluated, and other steps, if necessary, for additional decision support will be identified.

  16. Cameco engineered tailings program: linking applied research with industrial processes for improved tailings performance

    International Nuclear Information System (INIS)

    Kotzer, T.G.

    2010-01-01

    Mine tailings at Cameco's operations are by-products of milling uranium ore and have variable concentrations of uranium, metals, oxyanions and trace elements, or elements of concern (EOC). Cameco has undertaken an Engineered Tailings (ET) program to optimize tailings performance and minimize the environmental impacts of EOCs, regardless of the milled ore source. Applied geochemical and geotechnical tailings research is key within the ET program. In-situ drilling and experimental programs are used to understand long-term tailings behaviour and help validate source term predictions. Within this, the ET program proactively aids in the development of mill-based processes for the production of tailings with improved long-term stability. (author)

  17. Applying an integrated fuzzy gray MCDM approach: A case study on mineral processing plant site selection

    Directory of Open Access Journals (Sweden)

    Ezzeddin Bakhtavar

    2017-12-01

    Accurate selection of a processing plant site can reduce total mining cost. This problem can be addressed with multi-criteria decision-making (MCDM) methods. This research introduces a new approach that integrates fuzzy AHP and gray MCDM methods to solve this decision-making problem. The approach is applied to the case of a copper mine area. The criteria considered are adjacency to the crusher, adjacency to the tailings dam, adjacency to a power source, distance from blasting sources, availability of sufficient land, and safety against floods. After studying the mine map, six feasible alternatives were prioritized using the integrated approach. The results indicated that sites A, B, and E take the first three ranks. The separate results of fuzzy AHP and gray MCDM confirm that alternatives A and B occupy the first two ranks. Moreover, field investigations confirmed the results obtained with the approach.
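
    For readers unfamiliar with the crisp core that the fuzzy variant builds on, the sketch below computes criterion weights from a pairwise comparison matrix via the principal eigenvector and checks the consistency ratio. The 3x3 matrix values are hypothetical, and the code does not reproduce the paper's fuzzy AHP/gray MCDM integration.

```python
# Crisp-AHP core (not the paper's fuzzy/gray integration): weights from a pairwise
# comparison matrix via the principal eigenvector, plus the consistency ratio.
# The 3x3 matrix below is hypothetical, not data from the cited study.
import numpy as np

def ahp_weights(pairwise):
    vals, vecs = np.linalg.eig(pairwise)
    k = np.argmax(vals.real)                        # principal eigenvalue index
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                                    # normalized priority vector
    n = pairwise.shape[0]
    ci = (vals.real[k] - n) / (n - 1)               # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}[n]    # Saaty's random index
    return w, ci / ri                               # weights and consistency ratio

# Hypothetical comparison of three criteria, e.g. crusher adjacency vs.
# tailings-dam adjacency vs. flood safety (Saaty's 1-9 scale)
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
weights, cr = ahp_weights(A)    # a CR below 0.10 is conventionally acceptable
```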

  18. Possibilities of Applying Video Surveillance and other ICT Tools and Services in the Production Process

    Directory of Open Access Journals (Sweden)

    Adis Rahmanović

    2018-02-01

    The paper presents the possibilities of applying video surveillance and other ICT tools and services in the production process. The first part of the paper presents a video surveillance control system and the opportunities offered by video surveillance for the security of employees and assets. The second part gives an analysis of a production control system, followed by video surveillance of a work excavator. The next part of the paper presents the integration of video surveillance with the accompanying tools. At the end of the paper, suggestions are given for further work in the fields of data protection and cryptography in the use of video surveillance.

  19. Advanced gamma spectrum processing technique applied to the analysis of scattering spectra for determining material thickness

    International Nuclear Information System (INIS)

    Hoang Duc Tam; VNUHCM-University of Science, Ho Chi Minh City; Huynh Dinh Chuong; Tran Thien Thanh; Vo Hoang Nguyen; Hoang Thi Kieu Trang; Chau Van Tao

    2015-01-01

    In this work, an advanced gamma spectrum processing technique is applied to analyze experimental scattering spectra for determining the thickness of C45 heat-resistant steel plates. The single-scattering peak of the scattering spectra is exploited to measure the intensity of singly scattered photons. Based on these results, the thickness of the steel plates is determined with a maximum deviation between real and measured thickness of about 4 %. Monte Carlo simulation using the MCNP5 code is also performed to cross-check the results, yielding a maximum deviation of 2 %. These results strongly confirm the capability of this technique in analyzing gamma scattering spectra, which is a simple, effective and convenient method for determining material thickness. (author)
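
    A minimal sketch of how a single-scattering peak intensity might be extracted from such a spectrum is given below: a Gaussian on a linear background is fitted and the net peak area computed. The synthetic spectrum, the fit model and the omitted thickness-calibration step are assumptions; the authors' spectrum-processing technique is more elaborate.

```python
# Illustrative extraction of the single-scattering peak intensity: fit a Gaussian
# on a linear background and compute the net peak area. The synthetic spectrum
# is an assumption; the thickness would follow from comparing this intensity
# with a calibration measured on plates of known thickness.
import numpy as np
from scipy.optimize import curve_fit

def peak_model(ch, amp, mu, sigma, a, b):
    return amp * np.exp(-0.5 * ((ch - mu) / sigma) ** 2) + a * ch + b

channels = np.arange(300, 500, dtype=float)
true_counts = peak_model(channels, 1500.0, 400.0, 8.0, -0.2, 150.0)
counts = np.random.default_rng(1).poisson(true_counts).astype(float)

p0 = [counts.max(), 400.0, 10.0, 0.0, counts.min()]     # rough initial guesses
popt, _ = curve_fit(peak_model, channels, counts, p0=p0)
amp, mu, sigma = popt[:3]
net_area = amp * abs(sigma) * np.sqrt(2.0 * np.pi)      # singly scattered intensity
```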

  20. A review of the technology and process on integrated circuits failure analysis applied in communications products

    Science.gov (United States)

    Ming, Zhimao; Ling, Xiaodong; Bai, Xiaoshu; Zong, Bo

    2016-02-01

    The failure analysis of integrated circuits plays a very important role in improving the reliability of communications products. This paper mainly introduces the failure analysis technology and process for integrated circuits applied in communication products. Many technologies are used for failure analysis, including optical microscopy, infrared microscopy, acoustic microscopy, liquid crystal hot-spot detection, microanalysis, electrical measurement, microprobe technology, chemical etching and ion etching. Integrated circuit failure analysis depends on accurate confirmation and analysis of the chip failure mode, the search for the root cause of failure, the summary of the failure mechanism and the implementation of improvement measures. Through failure analysis, the reliability of integrated circuits and the yield of good products can be improved.

  1. A combined electrocoagulation-sorption process applied to mixed industrial wastewater

    International Nuclear Information System (INIS)

    Linares-Hernandez, Ivonne; Barrera-Diaz, Carlos; Roa-Morales, Gabriela; Bilyeu, Bryan; Urena-Nunez, Fernando

    2007-01-01

    The removal of organic pollutants from a highly complex industrial wastewater by an aluminium electrocoagulation process coupled with biosorption was evaluated. Under optimal conditions of pH 8 and a current density of 45.45 A/m2, the electrochemical method yields a very effective reduction of all organic pollutants; this reduction was enhanced when the biosorption treatment was applied as a polishing step. The treatment reduced chemical oxygen demand (COD) by 84%, biochemical oxygen demand (BOD5) by 78%, color by 97%, turbidity by 98% and fecal coliforms by 99%. The chemical species formed in aqueous solution were determined. The initial and final pollutant levels in the wastewater were monitored using UV-vis spectrometry and cyclic voltammetry. Finally, the morphology and elemental composition of the biosorbent were characterized with scanning electron microscopy (SEM) and energy-dispersive spectroscopy (EDS).

  2. Imaging and pattern recognition techniques applied to particulate solids material characterization in mineral processing

    International Nuclear Information System (INIS)

    Bonifazi, G.; La Marca, F.; Massacci, P.

    1999-01-01

    The characterization of particulate solids can be carried out by chemical and mineralogical analysis or, in some cases, by following a new approach based on the combined use of: i) imaging techniques, to detect the surface features of the particles, and ii) pattern recognition procedures, to identify and classify the mineralogical composition on the basis of the previously detected 'pictorial' features. The aim of this methodology is to establish a correlation between image parameters (texture and color) and the physical-chemical parameters characterizing the set of particles to be evaluated. The technique was applied to characterize the raw ore coming from a deposit of mineral sands with three different lithotypes. An appropriate number of samples was collected for each lithotype, and a vector of attributes (pattern vector) composed of texture and color parameters was associated with each sample. Image analysis demonstrated that the selected parameters are quite sensitive to the conditions of image acquisition: optical properties may be strongly influenced by physical condition, in terms of moisture content, optics set-up and lighting conditions. Standard acquisition conditions were therefore selected according to the in situ conditions during sampling. To verify the reliability of the proposed methodology, images were acquired under different conditions of humidity, focusing and illumination, and textural analysis procedures were applied to the images acquired from different samples in order to evaluate the influence of these parameters on the pictorial properties. Data resulting from the processing have been used for remote control of the material fed to the mineral processing plant. (author)

  3. Applying lessons learned from the USAID family planning graduation experience to the GAVI graduation process.

    Science.gov (United States)

    Shen, Angela K; Farrell, Marguerite M; Vandenbroucke, Mary F; Fox, Elizabeth; Pablos-Mendez, Ariel

    2015-07-01

    As low income countries experience economic transition, characterized by rapid economic growth and increased government spending potential in health, they have increased fiscal space to support and sustain more of their own health programmes, decreasing need for donor development assistance. Phase out of external funds should be systematic and efforts towards this end should concentrate on government commitments towards country ownership and self-sustainability. The 2006 US Agency for International Development (USAID) family planning (FP) graduation strategy is one such example of a systematic phase-out approach. Triggers for graduation were based on pre-determined criteria and programme indicators. In 2011 the GAVI Alliance (formerly the Global Alliance for Vaccines and Immunizations) which primarily supports financing of new vaccines, established a graduation policy process. Countries whose gross national income per capita exceeds $1570 incrementally increase their co-financing of new vaccines over a 5-year period until they are no longer eligible to apply for new GAVI funding, although previously awarded support will continue. This article compares and contrasts the USAID and GAVI processes to apply lessons learned from the USAID FP graduation experience to the GAVI process. The findings of the review are 3-fold: (1) FP graduation plans served an important purpose by focusing on strategic needs across six graduation plan foci, facilitating graduation with pre-determined financial and technical benchmarks, (2) USAID sought to assure contraceptive security prior to graduation, phasing out of contraceptive donations first before phasing out from technical assistance in other programme areas and (3) USAID sought to sustain political support to assure financing of products and programmes continue after graduation. Improving sustainability more broadly beyond vaccine financing provides a more comprehensive approach to graduation. The USAID FP experience provides a

  4. Optimized nanocrystalline strontium hexaferrite prepared by applying a methane GTR process on a conventionally synthesized powder

    Energy Technology Data Exchange (ETDEWEB)

    Dehghan, R.; Seyyed Ebrahimi, S.A., E-mail: saseyyed@ut.ac.ir

    2014-11-15

    The optimization of the effective re-calcination parameters in a gaseous heat treatment and re-calcination (GTR) process for producing nanocrystalline Sr-hexaferrite powder using CH4 has been investigated for the first time in this research. The initial Sr-hexaferrite powder was prepared by a conventional route, with calcination of a mixture of SrCO3 and α-Fe2O3 at 1100 °C for 1 h. The resultant powder was then isothermally heat treated in a dynamic CH4 atmosphere at 950 °C with a gas flow of 15 cc/min for 30 min. Finally, the resultant powder was re-calcined at various temperatures for different times. The heating and cooling rate was 10 °C/min. Due to the gas heat treatment, the magnetic nature of the material changed from hard to soft, with changes in phase composition, particle size and morphology. In the second step of the process, the soft magnetic nature of the intermediate material was returned from soft to hard by re-calcination. The resultant nanocrystalline Sr-hexaferrite powder, however, had a higher coercivity than the initial powder. The results showed significant changes in the morphology and crystallite size of the initial powder during the re-calcination process, which produced a large increase of about 17% in its coercivity. The crystallite size of the resultant Sr-hexaferrite was measured to be below 50 nm. - Highlights: • Optimized re-calcination in GTR using CH4 has been investigated for the first time. • The results showed a large increase of 17% in the initial powder's coercivity. • The crystallite size of the resultant Sr-hexaferrite was lower than 50 nm. • Applying this process can make it suitable for a wide range of magnetic properties.

  5. Parametric Accuracy: Building Information Modeling Process Applied to the Cultural Heritage Preservation

    Science.gov (United States)

    Garagnani, S.; Manferdini, A. M.

    2013-02-01

    Since their introduction, modeling tools aimed at architectural design have evolved into today's "digital multi-purpose drawing boards" based on enhanced parametric elements able to originate whole buildings within virtual environments. Semantic splitting and element topology are features that allow objects to be "intelligent" (i.e. self-aware of what kind of element they are and with whom they can interact), representing the basics of Building Information Modeling (BIM), a coordinated, consistent and always up-to-date workflow designed to reach higher quality, reliability and cost reductions throughout the design process. Even though BIM was originally intended for new architecture, its ability to store semantically interrelated information can be successfully applied to existing buildings as well, especially those that deserve particular care, such as Cultural Heritage sites. BIM engines can easily manage simple parametric geometries, collapsing them to standard primitives connected through hierarchical relationships; however, when components are generated from existing morphologies, for example by acquiring point clouds with digital photogrammetry or laser scanning equipment, complex abstractions have to be introduced while remodeling elements by hand, since automatic feature extraction in available software is still not effective. In order to introduce a methodology for processing point cloud data in a BIM environment with high accuracy, this paper describes some experiences in the documentation of monumental sites, generated through a plug-in written for Autodesk Revit and codenamed GreenSpider after its capability to lay out points in space as if they were nodes of an ideal cobweb.

  6. A system simulation model applied to the production schedule of a fish processing facility

    Directory of Open Access Journals (Sweden)

    Carla Roberta Pereira

    2012-11-01

    Simulation seeks to bring reality into a controlled environment, where it is possible to study its behavior under several conditions without involving physical risks and/or high costs. System simulation thus becomes a useful and powerful technique in emerging markets, such as the tilapiculture sector, which needs to expand its business. The main purpose of this study was the development of a simulation model to assist decision making in the production scheduling of a fish processing facility. The case study and modeling/simulation were applied as the research methods, including the SimuCAD methodology and the development phases of a simulation model. The model works with several alternative scenarios, testing different working shifts, types of flows and production capacities, besides variations in ending inventory and sales. The result of this research was a useful and distinctive simulation model to assist decision making in the production scheduling of the fish processing facility studied.

  7. Applying the Analytical Hierarchy Process (AHP into the effects assessment of river training works

    Directory of Open Access Journals (Sweden)

    Hachoł Justyna

    2017-12-01

    The aim of this study was to compare several methods of river regulation and to indicate the one that fully meets technical regulatory standards while ensuring the protection of the watercourse ecosystem. According to the principles of sustainable development, it is of the utmost importance in every human activity to find a compromise between the developmental and environmental needs of current and future generations. Therefore, both technical criteria related to flood safety and environmental criteria were taken into consideration in the analysis. The field study was conducted during the vegetation season between 2008 and 2014 in small and medium lowland watercourses in Lower Silesia. The research comprised measurements and descriptions of selected technical and environmental elements of the complex system of the watercourse bed. Based on the obtained results, a multi-criteria assessment of the effects of the works was conducted. The Analytic Hierarchy Process (AHP) was used to assess the results. It facilitated the creation of a linear ranking of river beds and indicated the solution that is optimal in terms of sustainable development. Such methods have not previously been applied to problems connected with river regulation, so this study also aims to check the utility of this method for decision making in both the planning and the realization of regulation works. The results of the study indicate the high usefulness of the AHP method in the decision-making process.

  8. Applying Enhancement Filters in the Pre-processing of Images of Lymphoma

    International Nuclear Information System (INIS)

    Silva, Sérgio Henrique; Do Nascimento, Marcelo Zanchetta; Neves, Leandro Alves; Batista, Valério Ramos

    2015-01-01

    Lymphoma is a type of cancer that affects the immune system and is classified as Hodgkin or non-Hodgkin. It is one of the ten most common types of cancer on earth; among all malignant neoplasms diagnosed in the world, lymphoma accounts for three to four percent. Our work presents a study of filters devoted to enhancing images of lymphoma at the pre-processing step, where enhancement is useful for removing noise from the digital images. We analysed the noise caused by different sources, such as room vibration, scraps and defocusing, in the following classes of lymphoma: follicular, mantle cell and B-cell chronic lymphocytic leukemia. The Gaussian, Median and Mean-Shift filters were applied to different colour models (RGB, Lab and HSV). Afterwards, we performed a quantitative analysis of the images by means of the Structural Similarity Index, in order to evaluate the similarity between the images. In all cases we obtained a certainty of at least 75%, which rises to 99% if one considers only HSV. We therefore conclude that HSV is an important choice of colour model for the pre-processing of histological images of lymphoma, because in this case the resulting image receives the best enhancement.
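
    The sketch below shows the kind of pre-processing compared in the study: a noisy RGB image is converted to HSV, denoised channel-by-channel with Gaussian and median filters, and each result is scored with the Structural Similarity Index. The synthetic image, filter parameters and the choice to score only the V channel are assumptions for demonstration, not the authors' exact protocol, and the Mean-Shift filter is omitted.

```python
# Sketch of the kind of pre-processing compared above: convert a noisy RGB image
# to HSV, denoise each channel with Gaussian and median filters, and score each
# result with SSIM. Synthetic data and parameters are assumptions; the study's
# Mean-Shift filter is omitted here.
import numpy as np
from scipy.ndimage import gaussian_filter, median_filter
from skimage.color import rgb2hsv
from skimage.metrics import structural_similarity as ssim

rng = np.random.default_rng(42)
clean = rng.random((256, 256, 3))                       # stand-in for a histology tile
noisy = np.clip(clean + rng.normal(0.0, 0.05, clean.shape), 0.0, 1.0)

hsv = rgb2hsv(noisy)
gauss, med = hsv.copy(), hsv.copy()
for c in range(3):                                      # filter each HSV channel
    gauss[..., c] = gaussian_filter(hsv[..., c], sigma=1.0)
    med[..., c] = median_filter(hsv[..., c], size=3)

ref_v = rgb2hsv(clean)[..., 2]                          # compare on the V channel only
score_gauss = ssim(ref_v, gauss[..., 2], data_range=1.0)
score_med = ssim(ref_v, med[..., 2], data_range=1.0)
```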

  9. A fast process development flow by applying design technology co-optimization

    Science.gov (United States)

    Chen, Yi-Chieh; Yeh, Shin-Shing; Ou, Tsong-Hua; Lin, Hung-Yu; Mai, Yung-Ching; Lin, Lawrence; Lai, Jun-Cheng; Lai, Ya Chieh; Xu, Wei; Hurat, Philippe

    2017-03-01

    Beyond the 40 nm technology node, pattern weak points and hotspot types increase dramatically. The typical patterns used for lithography verification suffer from huge turn-around time (TAT) in handling the design complexity. Therefore, in order to speed up process development and increase pattern variety, accurate design guidelines and realistic design combinations are required. This paper presents a flow for creating a cell-based layout, a lite realistic design, to identify early the problematic patterns that will negatively affect yield. A new random layout generation method, the Design Technology Co-Optimization Pattern Generator (DTCO-PG), is reported in this paper to create cell-based designs. DTCO-PG also includes ways to characterize randomness and fuzziness, so that a machine learning scheme can be built in which a model is trained on previous results and then generates patterns never seen in a lite design. This methodology not only increases pattern diversity but also finds potential hotspots early. The paper also demonstrates an integrated flow from DTCO pattern generation to layout modification. Optical proximity correction (OPC) and lithographic simulation are then applied to the DTCO-PG design database to detect hotspots, and the hotspots or weak points can be fixed automatically through the procedure or handled manually. This flow gives process development a faster cycle time, more complex pattern designs, a higher probability of finding potential hotspots at an early stage, and a more holistic yield-ramping operation.

  10. [New technologies applied to the medication-dispensing process, error analysis and contributing factors].

    Science.gov (United States)

    Alvarez Díaz, A M; Delgado Silveira, E; Pérez Menéndez-Conde, C; Pintor Recuenco, R; Gómez de Salazar López de Silanes, E; Serna Pérez, J; Mendoza Jiménez, T; Bermejo Vicedo, T

    2010-01-01

    The objective was to calculate the prevalence of errors occurring in different medication-dispensing systems, the stages at which they occur, and the contributing factors. Prospective observational study. The stages of the dispensing process were reviewed in five dispensing systems: Stock, Unit-Dose Dispensing Systems (UDDS) without Computerized Prescription Order Entry (CPOE), CPOE-UDDS, Automated Dispensing Systems (ADS) without CPOE, and CPOE-ADS. Dispensing errors were identified, together with the stages at which they occurred and their contributing factors. 2,181 errors were detected among 54,169 opportunities for error. Error rates: Stock, 10.7%; no-CPOE-UDDS, 3.7%; CPOE-UDDS, 2.2%; no-CPOE-ADS, 20.7%; CPOE-ADS, 2.9%. Most frequent stage of error: Stock, preparation of the order; no-CPOE-UDDS and CPOE-UDDS, filling of the unit-dose cart; no-CPOE-ADS and CPOE-ADS, filling of the ADS. Most frequent error: Stock, no-CPOE-ADS and CPOE-ADS, omission; CPOE-UDDS, different amount of drug; no-CPOE-UDDS, extra medication. Contributing factors: Stock, CPOE-ADS and no-CPOE-ADS, stock-out/supply problems; CPOE-UDDS, inexperienced personnel and a deficient communication system between professionals; no-CPOE-UDDS, a deficient communication system between professionals. Applying new technologies to the dispensing process has increased its safety; in particular, the implementation of CPOE has made it possible to reduce dispensing errors. Copyright © 2009 SEFH. Published by Elsevier Espana. All rights reserved.

  11. Nursing process applied to a mother gynecobstetric teenager: a case study

    Directory of Open Access Journals (Sweden)

    Nathalie Alfaro Vargas

    2013-10-01

    This article summarizes the intervention carried out with an adolescent mother who had a twin pregnancy. The methodology used is the nursing care process, which is divided into four stages: first, the assessment stage, in which information is gathered through a review of the adolescent's health record and the application of the nursing history; second, the nursing diagnosis stage, used to identify the problems presented by the patient; third, the planning stage, which includes the design of the objectives and actions that guide the interventions. The above information is summarized in the "Plan de Cuidados de Enfermería" based on the theory of Dorotea Orem. Finally, the implementation stage was applied, in which actions were carried out with the adolescent mother; she received the information needed, based on her needs, to improve her quality of life. Comprehensive and timely intervention allowed the reduction of risks for the mother and her children, and the use of Orem's theory allowed the teenager to acquire skills to cope with her new role as a mother.

  12. Applying CFD in the Analysis of Heavy Oil/Water Separation Process via Hydrocyclone

    Directory of Open Access Journals (Sweden)

    K Angelim

    2017-06-01

    In recent years, most of the oil reserves discovered have been related to heavy oil reservoirs, whose reserves are abundant but still present operational difficulties. This fact has generated great interest among petroleum companies in developing new technologies to increase heavy oil production. The generation of produced water, the effluent recovered from production wells together with oil and natural gas, is among the greatest potential factors for environmental degradation. Thus, a new scenario appears in the oil industry, requiring improvements in the treatment units for produced water. Among the technological improvements in the facilities, hydrocyclones have been applied in the treatment of oily water. In this sense, this study aims to investigate numerically the separation of heavy oil from a water stream in a hydrocyclone, using the computational fluid dynamics technique. The mathematical modeling considered a two-phase, three-dimensional, steady, isothermal and turbulent flow. Results for the streamlines, the pressure and volume fraction fields of the phases involved (oil and water) inside the hydrocyclone, and the mechanical efficiency and pumping power of the fluids are shown and analyzed. It is concluded that increasing the fluid inlet velocity in the device increases the pressure drop, indicating a greater pumping energy consumption for the mixture, and greatly influences the efficiency of the separation process.

  13. Applying some methods to process the data coming from the nuclear reactions

    International Nuclear Information System (INIS)

    Suleymanov, M.K.; Abdinov, O.B.; Belashev, B.Z.

    2010-01-01

    Methods for a posteriori enhancement of the resolution of spectral lines are proposed for processing data coming from nuclear reactions, and have been applied to data from nuclear reactions at high energies. They provide the possibility of obtaining more detailed information on the structure of the spectra of particles emitted in nuclear reactions. Nuclear reactions are the main source of information on the structure and physics of atomic nuclei. The spectra of the reaction fragments are usually complex, and it is not simple to extract the information needed for an investigation. In this talk we discuss methods for a posteriori enhancement of the resolution of spectral lines, which could be useful for processing complex data coming from nuclear reactions. We consider the Fourier transformation method and the maximum entropy method; complex structures were identified by these methods, and at least two selected points are indicated. Recently we presented a talk showing the results of analyzing the structure of the pseudorapidity spectra of charged relativistic particles with β ≥ 0.7 measured in Au+Em and Pb+Em collisions at AGS and SPS energies using the Fourier transformation and maximum entropy methods. The dependence of these spectra on the number of fast target protons was studied. The distributions visually show a plateau and a shoulder, that is, at least three selected points, and the plateau becomes wider in Pb+Em reactions. The existence of a plateau is required by parton models, and the maximum entropy method confirms the existence of the plateau and the shoulder in the distributions. The figure shows the results of applying the maximum entropy method: the method indicates several clearly selected points, some of which coincide with those observed visually. We would like to note that the Fourier transformation method could not
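
    One simple realization of a posteriori resolution enhancement in the Fourier domain is sketched below: a spectrum blurred by a known Gaussian response is sharpened with a Wiener-style deconvolution filter. The synthetic lines, the assumed response width and the regularization constant are illustrative; the authors' Fourier and maximum-entropy procedures may differ in detail.

```python
# Minimal sketch of a posteriori resolution enhancement by Fourier deconvolution:
# a spectrum blurred by a known Gaussian response is sharpened with a Wiener-style
# filter. The lines, response width and regularization constant are assumptions;
# the authors' Fourier and maximum-entropy procedures may differ in detail.
import numpy as np

n = 512
x = np.arange(n)
truth = np.zeros(n)
truth[[150, 170, 300]] = [1.0, 0.7, 1.2]            # three narrow spectral lines

sigma = 6.0                                         # assumed instrumental resolution
kernel = np.exp(-0.5 * ((x - n // 2) / sigma) ** 2)
kernel /= kernel.sum()
H = np.fft.fft(np.fft.ifftshift(kernel))            # transfer function of the blur

blurred = np.real(np.fft.ifft(np.fft.fft(truth) * H))
observed = blurred + np.random.default_rng(3).normal(0.0, 1e-3, n)

nsr = 1e-3                                          # noise-to-signal regularization
wiener = np.conj(H) / (np.abs(H) ** 2 + nsr)
restored = np.real(np.fft.ifft(np.fft.fft(observed) * wiener))   # sharpened spectrum
```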

  14. Embedding chiropractic in Indigenous Health Care Organisations: applying the normalisation process model.

    Science.gov (United States)

    Polus, Barbara I; Paterson, Charlotte; van Rotterdam, Joan; Vindigni, Dein

    2012-11-26

    Improving the health of Indigenous Australians remains a major challenge. A chiropractic service was established to evaluate this treatment option for musculoskeletal illness in rural Indigenous communities, based on the philosophy of keeping the community involved in all the phases of development, implementation, and evaluation. The development and integration of this service has experienced many difficulties with referrals, funding and building sustainability. Evaluation of the program was a key aspect of its implementation, requiring an appropriate process to identify specific problems and formulate solutions to improve the service. We used the normalisation process model (May 2006) to order the data collected in consultation meetings and to inform our strategy and actions. The normalisation process model provided us with a structure for organising consultation meeting data and helped prioritise tasks. Our data was analysed as it applied to each dimension of the model, noting aspects that the model did not encompass. During this process we reworded the dimensions into more everyday terminology. The final analysis focused on to what extent the model helped us to prioritise and systematise our tasks and plans. We used the model to consider ways to promote the chiropractic service, to enhance relationships and interactions between clinicians and procedures within the health service, and to avoid disruption of the existing service. We identified ways in which chiropractors can become trusted team members who have acceptable and recognised knowledge and skills. We also developed strategies that should result in chiropractic practitioners finding a place within a complex occupational web, by being seen as similar to well-known occupations such as physiotherapy. Interestingly, one dimension identified by our data, which we have labelled 'emancipatory', was absent from the model. The normalisation process model has resulted in a number of new insights and questions. We

  15. Effect of Shear Applied During a Pharmaceutical Process on Near Infrared Spectra.

    Science.gov (United States)

    Hernández, Eduardo; Pawar, Pallavi; Rodriguez, Sandra; Lysenko, Sergiy; Muzzio, Fernando J; Romañach, Rodolfo J

    2016-03-01

    This study describes changes observed in the near-infrared (NIR) diffuse reflectance (DR) spectra of pharmaceutical tablets after these tablets were subjected to different levels of strain (exposure to shear) during the mixing process. Powder shearing is important in the mixing of powders that are cohesive. Shear stress is created in a system by moving one surface over another causing displacements in the direction of the moving surface and is part of the mixing dynamics of particulates in many industries including the pharmaceutical industry. In continuous mixing, shear strain is developed within the process when powder particles are in constant movement and can affect the quality attributes of the final product such as dissolution. These changes in the NIR spectra could affect results obtained from NIR calibration models. The aim of the study was to understand changes in the NIR diffuse reflectance spectra that can be associated with different levels of strain developed during blend shearing of laboratory samples. Shear was applied using a Couette cell and tablets were produced using a tablet press emulator. Tablets with different shear levels were measured using NIR spectroscopy in the diffuse reflectance mode. The NIR spectra were baseline corrected to maintain the scattering effect associated with the physical properties of the tablet surface. Principal component analysis was used to establish the principal sources of variation within the samples. The angular dependence of elastic light scattering shows that the shear treatment reduces the size of particles and produces their uniform and highly isotropic distribution. Tablet compaction further reduces the diffuse component of scattering due to realignment of particles. © The Author(s) 2016.
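
    A minimal chemometric sketch of the analysis step described above follows: synthetic tablet NIR spectra with a shear-dependent scattering slope are given a simple offset-only baseline correction (so scattering information is retained) and examined with PCA. The spectra, the offset correction and the two-component model are assumptions standing in for the study's actual pre-treatment and data.

```python
# Sketch (with synthetic data) of the chemometric step described above: tablet NIR
# spectra with a shear-dependent scattering slope get an offset-only baseline
# correction (so scattering information is retained) and are examined with PCA.
# Wavelength range, band position and noise level are assumptions, not study data.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
wavelengths = np.linspace(1100, 2500, 700)          # nm, a typical NIR range
shear_levels = np.repeat([0, 1, 2], 10)             # 30 tablets, three strain levels

band = np.exp(-0.5 * ((wavelengths - 1930.0) / 40.0) ** 2)   # common absorbance band
slope = (wavelengths - 1100.0) / 1400.0                      # scatter-like baseline tilt
spectra = (band[None, :]
           + 0.02 * shear_levels[:, None] * slope[None, :]
           + rng.normal(0.0, 0.005, (shear_levels.size, wavelengths.size)))

# Offset-only baseline correction: subtract each spectrum's minimum
corrected = spectra - spectra.min(axis=1, keepdims=True)

scores = PCA(n_components=2).fit_transform(corrected)   # PC1 tracks the shear level here
```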

  16. High-pressure processing applied to cooked sausages: bacterial populations during chilled storage.

    Science.gov (United States)

    Yuste, J; Pla, R; Capellas, M; Ponce, E; Mor-Mur, M

    2000-08-01

    Vacuum-packaged cooked sausages were pressurized at 500 MPa for 5 or 15 min at mild temperature (65 degrees C) and later stored at 2 and 8 degrees C for 18 weeks. Counts of aerobic mesophiles and psychrotrophs, lactic acid bacteria, enterobacteria, Baird-Parker microbiota, and Listeria spp. were determined 1 day and 3, 6, 9, 12, 15, and 18 weeks after treatment and compared with those of cooked sausages treated at 80 to 85 degrees C for 40 min. Pressurization generated reductions of about 4 log CFU/g in psychrotrophs and lactic acid bacteria. Enterobacteria and Listeria proved the most pressure sensitive; insignificant or no growth was detected throughout the study. Heat treatment inactivated psychrotrophs and enterobacteria similarly to pressure treatment. Listeria monocytogenes and enterotoxigenic Staphylococcus aureus were not found in treated samples. In general, there was no significant difference in counts of any bacterial populations either among treatments or between storage temperatures. High-pressure processing at mild temperature is an effective preservation method that can replace heat pasteurization applied to some cooked meat and poultry products after packaging.

  17. Performance Evaluation of Various STL File Mesh Refining Algorithms Applied for FDM-RP Process

    Science.gov (United States)

    Ledalla, Siva Rama Krishna; Tirupathi, Balaji; Sriram, Venkatesh

    2016-06-01

    Layered manufacturing machines use the stereolithography (STL) file format to build parts. When a curved surface is converted from a computer-aided design (CAD) file to STL, geometric distortion and chordal error result. Parts manufactured from such a file might not satisfy geometric dimensioning and tolerancing requirements because of the approximated geometry. Current algorithms built into CAD packages offer export options to reduce this distortion globally, which leads to an increase in file size and pre-processing time. In this work, different mesh subdivision algorithms are applied to the STL file of a part with complex geometric features using MeshLab software. The mesh subdivision algorithms considered are the modified Butterfly subdivision technique, the Loop subdivision technique and the general triangular midpoint subdivision technique. A comparative study is made with respect to volume and build time using the above techniques. It is found that the triangular midpoint subdivision algorithm is the most suitable for the geometry under consideration. Only the wheel cap part is then manufactured on a Stratasys MOJO FDM machine, and the surface roughness of the part is measured on a Talysurf surface roughness tester.
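
    As a reference point for the triangular midpoint scheme favoured above, the sketch below implements a one-to-four midpoint subdivision of an indexed triangle mesh in plain Python/NumPy. It is a didactic stand-in, not the MeshLab implementation used in the study, and it ignores STL-specific details such as facet normals.

```python
# Didactic one-to-four triangular midpoint subdivision of an indexed triangle mesh.
# This is a stand-in for the MeshLab filter used in the study and ignores
# STL-specific details such as facet normals.
import numpy as np

def midpoint_subdivide(vertices, faces):
    verts = [tuple(v) for v in np.asarray(vertices, dtype=float)]
    midpoint_cache = {}          # edge (i, j) -> index of its midpoint vertex

    def midpoint(i, j):
        key = (min(i, j), max(i, j))
        if key not in midpoint_cache:
            m = tuple((np.asarray(verts[i]) + np.asarray(verts[j])) / 2.0)
            midpoint_cache[key] = len(verts)
            verts.append(m)
        return midpoint_cache[key]

    new_faces = []
    for a, b, c in np.asarray(faces):
        ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
        new_faces += [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]
    return np.array(verts), np.array(new_faces)

# One refinement level on a single triangle: 3 vertices/1 face -> 6 vertices/4 faces
v = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
f = np.array([[0, 1, 2]])
v2, f2 = midpoint_subdivide(v, f)
```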

  18. Applying analytic hierarchy process to assess healthcare-oriented cloud computing service systems.

    Science.gov (United States)

    Liao, Wen-Hwa; Qiu, Wan-Li

    2016-01-01

    Numerous differences exist between the healthcare industry and other industries. Difficulties in the business operation of the healthcare industry have continually increased because of the volatility and importance of health care, changes to and requirements of health insurance policies, and the statuses of healthcare providers, which are typically considered not-for-profit organizations. Moreover, because of the financial risks associated with constant changes in healthcare payment methods and constantly evolving information technology, healthcare organizations must continually adjust their business operation objectives; therefore, cloud computing presents both a challenge and an opportunity. As a response to aging populations and the prevalence of the Internet in fast-paced contemporary societies, cloud computing can be used to facilitate the task of balancing the quality and costs of health care. To evaluate cloud computing service systems for use in health care, providing decision makers with a comprehensive assessment method for prioritizing decision-making factors is highly beneficial. Hence, this study applied the analytic hierarchy process, compared items related to cloud computing and health care, executed a questionnaire survey, and then classified the critical factors influencing healthcare cloud computing service systems on the basis of statistical analyses of the questionnaire results. The results indicate that the primary factor affecting the design or implementation of optimal cloud computing healthcare service systems is cost effectiveness, with the secondary factors being practical considerations such as software design and system architecture.
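
    As a hedged illustration of the analytic hierarchy process calculation underlying such a study, the sketch below derives priority weights as the normalized principal eigenvector of a pairwise comparison matrix and checks Saaty's consistency ratio; the comparison values and criterion labels are invented, not the survey data.

```python
import numpy as np

# Illustrative pairwise comparison matrix for three hypothetical criteria
# (e.g., cost effectiveness, software design, system architecture).
# Entry [i, j] states how strongly criterion i is preferred over criterion j.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Priority weights = normalized principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency check (Saaty): CI = (lambda_max - n) / (n - 1), CR = CI / RI.
n = A.shape[0]
lambda_max = eigvals.real[k]
ci = (lambda_max - n) / (n - 1)
ri = 0.58  # Saaty's random index for n = 3
cr = ci / ri
print("priority weights:", weights.round(3), " consistency ratio:", round(cr, 3))
```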

  19. Development of vertical electropolishing process applied on 1300 and 704 MHz superconducting niobium resonators

    Directory of Open Access Journals (Sweden)

    F. Eozénou

    2014-08-01

    Full Text Available An advanced setup for vertical electropolishing of superconducting radio-frequency niobium elliptical cavities has been installed at CEA Saclay. Cavities are vertically electropolished with circulating standard HF-H_{2}SO_{4} electrolytes. Parameters such as voltage, cathode shape, acid flow, and temperature have been investigated. A low voltage (between 6 and 10 V depending on the cavity geometry), a high acid flow (25 L/min), and a low acid temperature (20 °C) are considered promising parameters. Such a recipe has been tested on single-cell and nine-cell International Linear Collider (ILC) cavities as well as 704 MHz five-cell Super Proton Linac (SPL) cavities. Single-cell cavities showed similar performance at 1.6 K whether vertically or horizontally electropolished. The applied baking process provides similar benefit. An asymmetric removal is observed, with faster removal in the upper half-cells. Multicell cavities (nine-cell ILC and five-cell SPL cavities) exhibit a standard Q_{0} value at low and medium accelerating fields, though limited by power losses due to field-emitted electrons.

  20. Detecting Events Beyond the Catalog - Applying Empirical Matched Field Processing to Salton Sea Geothermal Field Seismicity

    Science.gov (United States)

    Templeton, D. C.; Wang, J.; Harris, D. B.

    2011-12-01

    We apply the empirical Matched Field Processing (MFP) method to continuous seismic data obtained from the Salton Sea Geothermal Field to detect and locate more microearthquakes than can be detected using only traditional earthquake detection methods. The empirical MFP method compares the amplitude and phase of the incoming seismic data to a set of pre-computed master templates. The master templates are created from previously observed earthquakes with good signal-to-noise ratio. We will relocate the seismicity using two different methods: hypoDD and BayesLoc. hypoDD is a double-difference earthquake relocation method that utilizes absolute P- and S-wave travel-time measurements and cross-correlation P- and S-wave differential travel-time measurements to determine high-resolution relative hypocenter locations (Waldhauser and Ellsworth, 2000). BayesLoc is a probabilistic (Bayesian) multiple-event locator that simultaneously provides a probabilistic characterization of the unknown origin parameters, corrections to the assumed travel-time model, improvements in the precision of the observed arrival-time data and accuracy of the assigned phase labels (Myers et al., 2007, 2009). Additionally, we will model the Coulomb stress changes, assuming the seismicity is due to an opening or shearing crack. We will match the locations of the modeled stress increases with the locations of the mapped seismicity using a grid-search method.
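
    A greatly simplified, single-channel analogue of template-based detection (normalized cross-correlation of continuous data against a master waveform) can be sketched as follows; this is not the authors' MFP code, and the synthetic signal, threshold and function name are assumptions for illustration only.

```python
import numpy as np

def normalized_xcorr(data, template):
    """Sliding normalized cross-correlation of a 1-D template against
    continuous data; values near 1 indicate template-like events."""
    n = len(template)
    t = (template - template.mean()) / (template.std() + 1e-12)
    out = np.empty(len(data) - n + 1)
    for i in range(len(out)):
        w = data[i:i + n]
        w = (w - w.mean()) / (w.std() + 1e-12)
        out[i] = np.dot(w, t) / n
    return out

rng = np.random.default_rng(1)
template = np.sin(2 * np.pi * 5 * np.linspace(0, 1, 100)) * np.hanning(100)
data = 0.3 * rng.standard_normal(2000)
data[800:900] += template              # hidden "event" buried in noise
cc = normalized_xcorr(data, template)
detections = np.where(cc > 0.6)[0]     # simple threshold detector
print("candidate detections near sample:", detections[:5])
```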

  1. Applying the health action process approach to bicycle helmet use and evaluating a social marketing campaign.

    Science.gov (United States)

    Karl, Florian M; Smith, Jennifer; Piedt, Shannon; Turcotte, Kate; Pike, Ian

    2017-08-05

    Bicycle injuries are of concern in Canada. Since helmet use was mandated in 1996 in the province of British Columbia, Canada, use has increased and head injuries have decreased. Despite the law, many cyclists do not wear a helmet. The health action process approach (HAPA) model explains intention and behaviour with self-efficacy, risk perception, outcome expectancies and planning constructs. The present study examines the impact of a social marketing campaign on HAPA constructs in the context of bicycle helmet use. A questionnaire was administered to identify factors determining helmet use. Intention to obey the law, and perceived risk of being caught if not obeying the law, were included as additional constructs. Path analysis was used to extract the strongest influences on intention and behaviour. The social marketing campaign was evaluated through t-test comparisons after propensity score matching, and generalised linear modelling (GLM) was applied to adjust for the same covariates. 400 cyclists aged 25-54 years completed the questionnaire. Self-efficacy and intention to obey the law were most predictive of intention to wear a helmet, which, moderated by planning, strongly predicted behaviour. Perceived risk and outcome expectancies had no significant impact on intention. GLM showed that exposure to the campaign was significantly associated with higher values in self-efficacy, intention and bicycle helmet use. Self-efficacy and planning are important points of action for promoting helmet use. Social marketing campaigns that remind people of appropriate preventive action have an impact on behaviour. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  2. 25 CFR 900.133 - Does the declination process or the Contract Dispute Act apply to construction contract...

    Science.gov (United States)

    2010-04-01

    25 CFR 900.133 (Indian Self-Determination and Education Assistance Act regulations, Construction subpart, 2010-04-01 edition) addresses whether the declination process or the Contract Dispute Act applies to construction contract amendments proposed either by an Indian tribe or tribal organization or ...

  3. The Time Lens Concept Applied to Ultra-High-Speed OTDM Signal Processing

    DEFF Research Database (Denmark)

    Clausen, Anders; Palushani, Evarist; Mulvad, Hans Christian Hansen

    2013-01-01

    This survey paper presents some of the applications where the versatile time-lens concept can successfully be applied to ultra-high-speed serial systems, offering functionalities expected to be needed in future optical communication networks.

  4. Parallel Processing and Applied Mathematics. 10th International Conference, PPAM 2013. Revised Selected Papers

    DEFF Research Database (Denmark)

    The following topics are dealt with: parallel scientific computing; numerical algorithms; parallel nonnumerical algorithms; cloud computing; evolutionary computing; metaheuristics; applied mathematics; GPU computing; multicore systems; hybrid architectures; hierarchical parallelism; HPC systems...

  5. Assessment of radiobiological metrics applied to patient-specific QA process of VMAT prostate treatments.

    Science.gov (United States)

    Clemente-Gutiérrez, Francisco; Pérez-Vara, Consuelo; Clavo-Herranz, María H; López-Carrizosa, Concepción; Pérez-Regadera, José; Ibáñez-Villoslada, Carmen

    2016-03-08

    VMAT is a powerful technique to deliver hypofractionated prostate treatments. The lack of correlation between the usual 2D pretreatment QA results and the clinical impact of possible mistakes has motivated the development of 3D verification systems. Dose determination on the patient anatomy has provided clinical predictive capability to the patient-specific QA process. Dose-volume metrics, as evaluation criteria, should be replaced or complemented by radiobiological indices. These metrics can be incorporated into individualized QA by extracting the information for response parameters (gEUD, TCP, NTCP) from DVHs. The aim of this study is to assess the role of two 3D verification systems dealing with radiobiological metrics applied to a prostate VMAT QA program. Radiobiological calculations were performed for the AAPM TG-166 test cases. Maximum differences were 9.3% for gEUD, -1.3% for TCP, and 5.3% for NTCP calculations. Gamma tests and DVH-based comparisons were carried out for both systems in order to assess their performance in 3D dose determination for prostate treatments (high-, intermediate-, and low-risk, as well as prostate bed patients). Mean gamma passing rates for all structures were better than 92.0% and 99.1% for the 2%/2 mm and 3%/3 mm criteria, respectively. Maximum discrepancies were (2.4% ± 0.8%) and (6.2% ± 1.3%) for targets and normal tissues, respectively. Values for gEUD, TCP, and NTCP were extracted from the TPS and compared to the results obtained with the two systems. Three models were used for TCP calculations (Poisson, sigmoidal, and Niemierko) and two models for NTCP determinations (LKB and Niemierko). The maximum mean difference for gEUD calculations was (4.7% ± 1.3%); for TCP, the maximum discrepancy was (-2.4% ± 1.1%); and NTCP comparisons led to a maximum deviation of (1.5% ± 0.5%). The potential usefulness of biological metrics in patient-specific QA has been explored. Both systems have been successfully assessed as potential tools for evaluating the clinical
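
    The gEUD referred to here is a standard DVH-derived quantity, gEUD = (Σ_i v_i D_i^a)^(1/a); a minimal implementation is sketched below with illustrative DVH bins and volume-effect parameters (not the study's data or the TG-166 test cases).

```python
import numpy as np

def geud(doses_gy, volume_fractions, a):
    """Generalized equivalent uniform dose from a differential DVH:
    gEUD = (sum_i v_i * D_i**a) ** (1/a), with the v_i normalized to sum to 1."""
    v = np.asarray(volume_fractions, dtype=float)
    v = v / v.sum()
    d = np.asarray(doses_gy, dtype=float)
    return (np.sum(v * d ** a)) ** (1.0 / a)

# Illustrative differential DVH bins (doses in Gy, fractional volumes).
doses = np.array([60.0, 70.0, 74.0, 78.0])
vols = np.array([0.05, 0.15, 0.60, 0.20])

print("gEUD (a = -10, tumour-like):", round(geud(doses, vols, -10), 2))
print("gEUD (a = +8, serial organ-like):", round(geud(doses, vols, 8), 2))
```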

  6. The Z-Transform Applied to Birth-Death Markov Processes ...

    African Journals Online (AJOL)

    Birth-death Markov models have been widely used in the study of natural and physical processes. The analysis of such processes, however, is mostly performed using time series analysis. In this report, a finite state birth‑death Markov process is analyzed using the z‑transform approach. The performance metrics of the ...
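
    Independently of the z-transform machinery, the stationary distribution of a finite birth-death chain follows from detailed balance, π_k ∝ Π_{i<k} λ_i/μ_{i+1}; the sketch below evaluates it for an illustrative constant-rate (M/M/1/4-like) example, not the system analyzed in the report.

```python
import numpy as np

def birth_death_stationary(birth_rates, death_rates):
    """Stationary distribution of a finite birth-death chain via detailed
    balance: pi_k proportional to prod_{i<k} lambda_i / mu_{i+1}."""
    ratios = np.ones(len(birth_rates) + 1)
    for k in range(1, len(ratios)):
        ratios[k] = ratios[k - 1] * birth_rates[k - 1] / death_rates[k - 1]
    return ratios / ratios.sum()

# Illustrative M/M/1/4-like queue with constant arrival and service rates.
lam = [2.0, 2.0, 2.0, 2.0]   # birth (arrival) rates out of states 0..3
mu = [3.0, 3.0, 3.0, 3.0]    # death (service) rates out of states 1..4
pi = birth_death_stationary(lam, mu)
print("stationary distribution:", pi.round(4))
print("mean number in system:", round(float(np.dot(np.arange(len(pi)), pi)), 4))
```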

  7. Decision making for wildfires: A guide for applying a risk management process at the incident level

    Science.gov (United States)

    Mary A. Taber; Lisa M. Elenz; Paul G. Langowski

    2013-01-01

    This publication focuses on the thought processes and considerations surrounding a risk management process for decision making on wildfires. The publication introduces a six element risk management cycle designed to encourage sound risk-informed decision making in accordance with Federal wildland fire policy, although the process is equally applicable to non-Federal...

  8. Process modeling and control applied to real-time monitoring of distillation processes by near-infrared spectroscopy.

    Science.gov (United States)

    de Oliveira, Rodrigo R; Pedroza, Ricardo H P; Sousa, A O; Lima, Kássio M G; de Juan, Anna

    2017-09-08

    A distillation device that acquires continuous and synchronized measurements of temperature, percentage of distilled fraction and NIR spectra has been designed for real-time monitoring of distillation processes. As a process model, synthetic commercial gasoline batches produced in Brazil, which contain mixtures of pure gasoline blended with ethanol, have been analyzed. The information provided by this device, i.e., distillation curves and NIR spectra, has served as initial information for the proposal of new strategies of process modeling and multivariate statistical process control (MSPC). Process modeling based on PCA batch analysis provided global distillation trajectories, whereas multiset MCR-ALS analysis is proposed to obtain a component-wise characterization of the distillation evolution and distilled fractions. Distillation curves, NIR spectra or compressed NIR information under the form of PCA scores and MCR-ALS concentration profiles were tested as the seed information to build MSPC models. New on-line PCA-based MSPC approaches, some inspired by local rank exploratory methods for process analysis, are proposed and work as follows: a) MSPC based on individual process observation models, where multiple local PCA models are built considering the sole information in each observation point; b) Fixed Size Moving Window - MSPC, in which local PCA models are built considering a moving window of the current and a few past observation points; and c) Evolving MSPC, where local PCA models are built with an increasing window of observations covering all points since the beginning of the process until the current observation. The performance of the different approaches has been assessed in terms of sensitivity to fault detection and number of false alarms. The outcome of this work will be of general use to define strategies for on-line process monitoring and control and, in a more specific way, to improve quality control of petroleum derived fuels and other substances submitted
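
    A minimal sketch of the fixed-size moving-window idea described above (fit a local PCA on the most recent observations and flag the current point if its squared prediction error is large) is given below; the synthetic data, window size and empirical control limit are illustrative assumptions, not the authors' monitoring scheme.

```python
import numpy as np
from sklearn.decomposition import PCA

def moving_window_spe(X, window=20, n_components=2):
    """Fit a local PCA on the previous `window` observations and return the
    squared prediction error (SPE/Q statistic) of each current observation."""
    spe = np.full(len(X), np.nan)
    for t in range(window, len(X)):
        pca = PCA(n_components=n_components).fit(X[t - window:t])
        recon = pca.inverse_transform(pca.transform(X[t:t + 1]))[0]
        spe[t] = np.sum((X[t] - recon) ** 2)
    return spe

rng = np.random.default_rng(2)
normal = rng.standard_normal((80, 5))                        # in-control phase
faulty = rng.standard_normal((20, 5)) + np.array([0.0, 0.0, 8.0, 0.0, 0.0])
X = np.vstack([normal, faulty])                              # fault from obs. 80

spe = moving_window_spe(X)
in_control = spe[20:80]
limit = in_control.mean() + 3 * in_control.std()             # crude empirical limit
alarms = np.where(np.nan_to_num(spe, nan=0.0) > limit)[0]
print("control limit:", round(float(limit), 2), " first alarms:", alarms[:5])
```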

  9. Applying social network analysis to monitor web-enabled business processes

    OpenAIRE

    Carroll, Noel; Richardson, Ita; Whelan, Eoin

    2010-01-01

    peer-reviewed The unprecedented growth in service-based business processes over a short period of time has underscored the need for understanding the mechanisms and theorising the business models and business process management adopted across many organisations today. This is more evident within the Irish health sector. This research summarises a survey of the literature and argues that the inability of current Business Process Management (BPM) techniques to visualise and monitor web-enabl...

  10. Applying lean principles to achieve continuous flow in 3PLs’ outbound processes

    NARCIS (Netherlands)

    Overboom, M.A.; Small, J.S.; Naus, A.J.A.M.; de Haan, J.A.C.

    2013-01-01

    The article offers information on the application of lean principles to achieve continuous flow in third party logistics providers (3PLs). It mentions that lean management principles and practices have been traditionally applied to manufacturing systems and try to make products flow through the

  11. Reducing Quantization Error and Contextual Bias problems in Software Development Processes by Applying Fuzzy Logic

    NARCIS (Netherlands)

    Marcelloni, Francesco; Aksit, Mehmet

    Object-oriented methods define a considerable number of rules, which are generally expressed using two-valued logic. For example, an entity in a requirement specification is either accepted or rejected as a class. There are two major problems in how rules are defined and applied in current methods.

  12. Applying Collaborative Engineering to the Facility Delivery Process: A Testbed Demonstration

    National Research Council Canada - National Science Library

    Brucker, Beth

    1998-01-01

    ...) have been developing a collaborative engineering (CE) software environment to enable sharing of design information as it is created and refined during the facility design and construction process...

  13. Applying the Theory of Constraints in the Course of Process Improvement

    Science.gov (United States)

    Marton, Michal; Paulová, Iveta

    2010-01-01

    Theory of constraints (TOC) is about thinking in logical, systematic, or structured processes similar to the PDCA learning loop. It is about analyzing cause and effect, verifying basic assumptions, exploring alternatives and improving processes. The goal of TOC is to maximize the efficiency of a process selectively at the most critical points and thereby maximize profitability, quality, or other corporate objectives. This paper includes basic theoretical information about TOC and its subsequent application during process improvement. This paper was realised with VEGA support No. 1/0229/08, Perspectives of quality management development in coherence with requirements of the Slovak republic market.

  14. Coagulation-flocculation process applied to wastewaters generated in hydrocarbon-contaminated soil washing

    International Nuclear Information System (INIS)

    Torres, L. g.; Belloc, C.; Iturbe, R.; Bandala, E.

    2009-01-01

    A wastewater produced in contaminated soil washing was treated by means of a coagulation-flocculation (CF) process. The wastewater treated in this work contained petroleum hydrocarbons, a surfactant (sodium dodecyl sulphate, SDS), as well as salts, humic acids and other constituents that were leached from the soil during the washing process. The aim of this work was to develop a process for treating the wastewaters generated when washing hydrocarbon-contaminated soils in such a way that they could be recycled to the washing process and, at the end of the clean-up, disposed of properly. (Author)

  15. Model Reduction in Chemical Engineering : Case studies applied to process analysis, design and operation

    NARCIS (Netherlands)

    Dorneanu, B.

    2011-01-01

    During the last decades, models have become widely used for supporting a broad range of chemical engineering activities, such as product and process design and development, process monitoring and control, real time optimization of plant operation or supply chain management. Although tremendous

  16. the z-transform applied to a birth-death process having varying birth

    African Journals Online (AJOL)

    DEPT OF AGRICULTURAL ENGINEERING

    The analysis of a birth-death process using the z-transform was recently reported for processes having fixed transition probabilities ... The model can be used to study practical queuing and birth-death systems where the arrival, birth, service and death rates may ...

  17. Sensitivity of process design to uncertainties in property estimates applied to extractive distillation

    DEFF Research Database (Denmark)

    Jones, Mark Nicholas; Hukkerikar, Amol; Sin, Gürkan

    During the design of a chemical process, engineers typically switch from simple (shortcut) calculations to more detailed rigorous models to perform mass and energy balances around unit operations and to design the process equipment involved in that process. The choice of the most appropriate thermodynamic ... of the methodology is illustrated using a case study of extractive distillation in which acetone is separated from methanol using water as a solvent. Among others, the vapour pressure of acetone and water was found to be the most critical, and even small uncertainties from -0.25 % to +0.75 % in vapour pressure data ... have shown a significant impact on the reflux ratio of the extractive distillation process. In general, systematic sensitivity analysis should be part of process design efforts and is expected to contribute to better-informed and reliable design solutions in chemical industries.
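
    As a hedged numerical illustration of propagating the quoted vapour-pressure uncertainty band to a derived separation quantity, the sketch below perturbs placeholder saturation pressures and recomputes an ideal relative volatility; the pressure values are invented, and the calculation is far simpler than the rigorous models used in the study.

```python
import numpy as np

# Placeholder saturation pressures (kPa) at some column temperature; these are
# illustrative numbers only, not regressed property data for acetone or water.
p_sat_light = 120.0    # "acetone-like" volatile component
p_sat_heavy = 20.0     # "water-like" heavy component

def relative_volatility(p1, p2):
    # Ideal (Raoult's-law) approximation: alpha ~ P1_sat / P2_sat.
    return p1 / p2

# Propagate the -0.25 % to +0.75 % vapour-pressure uncertainty band through to
# the relative volatility, taking opposite-signed shifts as a worst case.
for d in np.linspace(-0.0025, 0.0075, 5):
    alpha = relative_volatility(p_sat_light * (1 + d), p_sat_heavy * (1 - d))
    print(f"vapour-pressure shift {d:+.2%}: alpha = {alpha:.3f}")
```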

  18. Metal mask free dry-etching process for integrated optical devices applying highly photostabilized resist.

    NARCIS (Netherlands)

    Sengo, G.; Sengo, G.; van Wolferen, Hendricus A.G.M.; Worhoff, Kerstin; Driessen, A.; Koonen, A.M.J.; Leijtens, X.J.M.; van den Boom, H.P.A.; Verdurmen, E.J.M.; Molina Vazquez, J.

    2006-01-01

    Photostabilization is a widely used post-lithographic resist treatment process, which allows the resist profile to be hardened in order to maintain critical dimensions and to increase selectivity in subsequent process steps such as reactive ion etching. In this paper we present the optimization of deep

  19. Analysis of data from radioactive wastes treatment process and implementation of a data management applied program

    International Nuclear Information System (INIS)

    Jeo, H. S.; Son, J. S.; Kim, T. K.; Kang, I. S.; Lee, Y. H

    2003-01-01

    Radioactive waste generated at KAERI varies in nuclide content and physical form, and arises in small quantities at irregular intervals. Managing the history records of radioactive waste is an important element of its disposal. While developing RAWMIS, which can generate waste history records, documents and various statistics for efficient waste management, it became necessary to collect data on the liquid and solid radioactive waste treatment processes of the research organization. This paper analyzes the current state of waste generation and the data produced at each treatment step, and introduces the design of an input/output application program that stores the results and the treatment stream data in a database. Recording the actual treatment process in the database, rather than only in documents, saves human effort and material resources, improves the efficiency of tracking radioactive wastes and processes, and supports radioactive waste material balance and inventory studies.

  20. Applying Process Improvement Methods to Clinical and Translational Research: Conceptual Framework and Case Examples.

    Science.gov (United States)

    Daudelin, Denise H; Selker, Harry P; Leslie, Laurel K

    2015-12-01

    There is growing appreciation that process improvement holds promise for improving quality and efficiency across the translational research continuum but frameworks for such programs are not often described. The purpose of this paper is to present a framework and case examples of a Research Process Improvement Program implemented at Tufts CTSI. To promote research process improvement, we developed online training seminars, workshops, and in-person consultation models to describe core process improvement principles and methods, demonstrate the use of improvement tools, and illustrate the application of these methods in case examples. We implemented these methods, as well as relational coordination theory, with junior researchers, pilot funding awardees, our CTRC, and CTSI resource and service providers. The program focuses on capacity building to address common process problems and quality gaps that threaten the efficient, timely and successful completion of clinical and translational studies. © 2015 The Authors. Clinical and Translational Science published by Wiley Periodicals, Inc.

  1. Agents Modeling Experience Applied To Control Of Semi-Continuous Production Process

    Directory of Open Access Journals (Sweden)

    Gabriel Rojek

    2014-01-01

    Full Text Available The lack of proper analytical models of some production processes prevents us from obtaining proper values of process parameters by simply computing optimal values. Possible solutions of control problems in such areas of industrial processes can be found using certain methods from the domain of artificial intelligence: neural networks, fuzzy logic, expert systems, or evolutionary algorithms. The solution to such a control problem presented in this work is an alternative approach that combines control of the industrial process with learning based on production results. In formulating the main assumptions of the proposed methodology, the decision processes of a human operator using his experience are taken into consideration. The investigated model of gathering and using human experience is designed with the contribution of agent technology. The presented solution of the control problem coincides with the case-based reasoning (CBR) methodology.
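
    A minimal sketch of the case-based reasoning retrieval step implied here (return the stored control action of the most similar past case) is shown below; the case base, features and distance measure are illustrative assumptions, not the agent system described in the record.

```python
import numpy as np

# Illustrative case base: past process settings and the control action that
# gave good production results (purely made-up numbers).
cases = {
    "params": np.array([
        [850.0, 12.0],   # e.g., temperature, speed
        [900.0, 10.0],
        [870.0, 15.0],
    ]),
    "actions": np.array([0.30, 0.45, 0.25]),   # stored control decisions
}

def retrieve_action(query, cases):
    """CBR retrieval: return the action of the most similar stored case,
    using a distance normalized by the per-feature spread of the case base."""
    p = cases["params"]
    scale = p.max(axis=0) - p.min(axis=0)
    d = np.linalg.norm((p - query) / scale, axis=1)
    return cases["actions"][np.argmin(d)]

print("suggested action:", retrieve_action(np.array([880.0, 11.0]), cases))
```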

  2. A Systematic Approach to Applying Lean Techniques to Optimize an Office Process at the Y-12 National Security Complex

    Energy Technology Data Exchange (ETDEWEB)

    Credille, Jennifer [Y-12 National Security Complex, Oak Ridge, TN (United States); Univ. of Tennessee, Knoxville, TN (United States); Owens, Elizabeth [Y-12 National Security Complex, Oak Ridge, TN (United States); Univ. of Tennessee, Knoxville, TN (United States)

    2017-10-11

    This capstone offers the introduction of Lean concepts to an office activity to demonstrate the versatility of Lean. Traditionally Lean has been associated with process improvements as applied to an industrial atmosphere. However, this paper will demonstrate that implementing Lean concepts within an office activity can result in significant process improvements. Lean first emerged with the conception of the Toyota Production System. This innovative concept was designed to improve productivity in the automotive industry by eliminating waste and variation. Lean has also been applied to office environments, however the limited literature reveals most Lean techniques within an office are restricted to one or two techniques. Our capstone confronts these restrictions by introducing a systematic approach that utilizes multiple Lean concepts. The approach incorporates: system analysis, system reliability, system requirements, and system feasibility. The methodical Lean outline provides tools for a successful outcome, which ensures the process is thoroughly dissected and can be achieved for any process in any work environment.

  3. Applying lean methods to improve quality and safety in surgical sterile instrument processing.

    Science.gov (United States)

    Blackmore, C Craig; Bishop, Robbi; Luker, Samuel; Williams, Barbara L

    2013-03-01

    Surgical instrument processing is critical to safe, high-quality surgical care but has received little attention in the medical literature. Typical hospitals have inventories in the tens of thousands of surgical instruments organized into thousands of instrument sets. The use of these instruments for multiple procedures per day leads to millions of instrument sets being reprocessed yearly in a single hospital. Errors in the processing of sterile instruments may lead to increased operative times and costs, as well as potentially contributing to surgical infections and perioperative morbidity. At Virginia Mason Medical Center (Seattle), a quality monitoring approach was developed to identify and categorize errors in sterile instrument processing, through use of a daily defect sheet. Lean methods were used to improve the quality of surgical instrument processing through redefining operator roles, alteration of the workspace, mistake-proofing, quality monitoring, staff training, and continuous feedback. To study the effectiveness of the quality improvement project, a before/after comparison of prospectively collected sterile processing error rates during a 37-month time frame was performed. Before the intervention, instrument processing errors occurred in 3.0% of surgical cases, decreasing to 1.5% at the final follow-up. Instrument processing errors are a barrier to the highest quality and safety in surgical care but are amenable to substantial improvement using Lean techniques.

  4. Machine Vision for High Precision Volume Measurement Applied to Levitated Containerless Materials Processing

    Science.gov (United States)

    Bradshaw, R. C.; Schmidt, D. P.; Rogers, J. R.; Kelton, K. F.; Hyers, R. W.

    2005-01-01

    By combining the best practices in optical dilatometry with new numerical methods, a high-speed and high-precision technique has been developed to measure the volume of levitated, containerlessly processed samples with sub-pixel resolution. Containerless processing provides the ability to study highly reactive materials without the possibility of contamination affecting thermophysical properties. Levitation is a common technique used to isolate a sample as it is being processed. Non-contact optical measurement of thermophysical properties is very important, as traditional measuring methods cannot be used. Modern, digitally recorded images require advanced numerical routines to recover the sub-pixel locations of sample edges and, in turn, produce high-precision measurements.

  5. Regime switching state-space models applied to psychological processes: handling missing data and making inferences

    NARCIS (Netherlands)

    Hamaker, E.L.; Grasman, R.P.P.P.

    2012-01-01

    Many psychological processes are characterized by recurrent shifts between distinct regimes or states. Examples that are considered in this paper are the switches between different states associated with premenstrual syndrome, hourly fluctuations in affect during a major depressive episode, and

  6. Design and implementation of HMI and SCADA applied to a didactic emulation of a packing process

    Directory of Open Access Journals (Sweden)

    Manuel Alejandro Bohórquez-Dallos

    2013-07-01

    The supervision and control of the signals are performed with the CANopen protocol. The developed system allows the user to interact directly with the emulation of the process, actuating its elements remotely from the computer and locally using the touch screen. The SCADA and HMI are embedded in a CoDeSys® function that the user can use as a learning tool for automating the process emulation.

  7. Design strategy for optimal iterative learning control applied on a deep drawing process

    DEFF Research Database (Denmark)

    Endelt, Benny Ørtoft

    2017-01-01

    changes in the material properties. The process is highly non-linear, the system plant is modelled using a non-linear finite element model, and the gain factors for the iterative learning controller are identified by solving a non-linear optimal control problem. The optimal control problem is formulated as a non-linear least-squares problem where the system response is evaluated using a non-linear finite element model of the process.
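
    The iterative learning controller referred to above repeatedly updates the input from one batch to the next using the previous tracking error; a minimal sketch on a toy linear plant (standing in for the non-linear finite element model) is given below, with an illustrative model-inversion update rather than the paper's optimized gains.

```python
import numpy as np

# Toy static linear "plant" standing in for the non-linear FE model of the
# deep drawing process: y = G @ u + repeating disturbance.
n = 20
G = 0.1 * np.tril(np.ones((n, n)))               # simple causal response matrix
y_ref = np.linspace(0.0, 1.0, n)                 # desired process response
disturbance = 0.05 * np.sin(np.linspace(0.0, np.pi, n))

u = np.zeros(n)
gamma = 0.5                                      # illustrative relaxation gain

for k in range(30):                              # one iteration = one batch
    y = G @ u + disturbance
    e = y_ref - y                                # tracking error of this batch
    # Model-inversion ILC update: u_{k+1} = u_k + gamma * G^{-1} e_k
    u = u + gamma * np.linalg.solve(G, e)

print(f"RMS tracking error after 30 batches: {np.sqrt(np.mean(e ** 2)):.2e}")
```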

  8. Research on the experiment of reservoir water treatment applying ultrafiltration membrane technology of different processes.

    Science.gov (United States)

    Zhang, Liyong; Zhang, Penghui; Wang, Meng; Yang, Kai; Liu, Junliang

    2016-09-01

    The processes and effects of coagulation-ultrafiltration (C-UF) and coagulation sedimentation-ultrafiltration (CS-UF) used in the treatment of Dalangdian Reservoir water were compared. The experimental data indicated that 99% turbidity removal and essentially 100% removal of microorganisms and algae were achieved in both the C-UF and CS-UF processes. The organic removal of the CS-UF process was slightly better than that of the C-UF process; however, the difference between the processes was not pronounced, owing to the limitation of the ultrafiltration membrane pore size. Polyaluminium chloride was used as the coagulant in the water plant. The aluminium ion removal results revealed that coagulant dosage was effectively reduced by using membrane technology during the high-temperature, high-algae-laden period. Within a certain range of reagent concentration and soaking time, air-water backwashing in every membrane filtration cycle effectively reduced membrane fouling. In addition, maintenance cleaning was conducted every 60 min; whether restorative cleaning was conducted depended on the extent of fouling. After cleaning, the recovery of membrane filtration performance was evident.

  9. Early process development of API applied to poorly water-soluble TBID.

    Science.gov (United States)

    Meise, Marius; Niggemann, Matthias; Dunens, Alexandra; Schoenitz, Martin; Kuschnerow, Jan C; Kunick, Conrad; Scholl, Stephan

    2018-01-12

    Finding and optimising synthesis processes for active pharmaceutical ingredients (APIs) is time consuming. In the finding phase, established methods for synthesis, purification and formulation are used to achieve a high-purity API for biological studies. For promising API candidates, this is followed by pre-clinical and clinical studies requiring sufficient quantities of the active component. Ideally, these should be produced with a process representative of a later production process and suitable for scaling to production capacity. This work presents an overview of different approaches to process synthesis based on an existing lab protocol. This is demonstrated for the production of the model drug 4,5,6,7-tetrabromo-2-(1H-imidazol-2-yl)isoindolin-1,3-dione (TBID). Early batch synthesis and purification procedures typically suffer from low and fluctuating yields and purities due to poor process control. In a first step, the literature synthesis and purification procedure was modified and optimized using solubility measurements, targeting easier and safer processing for subsequent studies. Copyright © 2018 Elsevier B.V. All rights reserved.

  10. A universal electronical adaptation of automats for biochemical analysis to a central processing computer by applying CAMAC-signals

    International Nuclear Information System (INIS)

    Schaefer, R.

    1975-01-01

    A universal expansion of a CAMAC-subsystem - BORER 3000 - for adapting analysis instruments in biochemistry to a processing computer is described. The possibility of standardizing input interfaces for lab instruments with such circuits is discussed and the advantages achieved by applying the CAMAC-specifications are described

  11. Efficient reductive amination process for enantioselective synthesis of L-phosphinothricin applying engineered glutamate dehydrogenase.

    Science.gov (United States)

    Yin, Xinjian; Wu, Jianping; Yang, Lirong

    2018-03-16

    The objective of this study was to identify and exploit a robust biocatalyst that can be applied in reductive amination for enantioselective synthesis of the competitive herbicide L-phosphinothricin. Applying a genome mining-based library construction strategy, eight NADPH-specific glutamate dehydrogenases (GluDHs) were identified for reductively aminating 2-oxo-4-[(hydroxy)(methyl)phosphinoyl]butyric acid (PPO) to L-phosphinothricin. Among them, the glutamate dehydrogenase cloned from Pseudomonas putida (PpGluDH) exhibited relatively high catalytic activity and favorable soluble expression. This enzyme was purified to homogeneity for further characterization. The specific activity of PpGluDH was 296.1 U/g-protein, which is significantly higher than the reported value for a GluDH. To the best of our knowledge, there has not been any report on protein engineering of GluDH for PPO-oriented activity. Taking full advantage of the available information and the diverse characteristics of the enzymes in the enzyme library, PpGluDH was engineered by site-directed mutation based on multiple sequence alignment. The mutant I170M, which had 2.1-fold enhanced activity, was successfully produced. When the I170M mutant was applied in the batch production of L-phosphinothricin, it showed markedly improved catalytic efficiency compared with the wild type enzyme. The conversion reached 99% (0.1 M PPO) with an L-phosphinothricin productivity of 1.35 g/h·L, which far surpassed the previously reported level. These results show that PpGluDH I170M is a promising biocatalyst for highly enantioselective synthesis of L-phosphinothricin by reductive amination.

  12. Integrated processing for the treatment of materials applied to thermal compression of hydrogen

    International Nuclear Information System (INIS)

    Rodriguez, M.G; Esquivel, M. R

    2009-01-01

    In this work, AB5 intermetallics are synthesized by low-energy mechanical alloying according to AB5 + AB5 = AB5. The obtained intermetallics are annealed at 600 °C to optimize both the microstructural and hydrogen sorption properties. The material is then applied to the design of schemes for thermal compression of hydrogen (TCH). These results are obtained within the frame of a research project related to Energy and Environment and focused on the replacement of fossil fuel supply systems by a hydrogen-based one. [es]

  13. Statistical process control applied to intensity modulated radiotherapy pretreatment controls with portal dosimetry

    International Nuclear Information System (INIS)

    Villani, N.; Noel, A.; Villani, N.; Gerard, K.; Marchesi, V.; Huger, S.; Noel, A.; Francois, P.

    2010-01-01

    Purpose The first purpose of this study was to illustrate the contribution of statistical process control to better security in intensity modulated radiotherapy (I.M.R.T.) treatments. This improvement is possible by controlling the dose delivery process, which is characterized by the pretreatment quality control results. It is therefore necessary to bring the portal dosimetry measurements under control (the ionisation chamber measurements were already monitored with statistical process control tools). The second objective was to state whether it is possible to substitute portal dosimetry for the ionisation chamber in order to optimize the time devoted to pretreatment quality control. Patients and methods At the Alexis-Vautrin center, pretreatment quality controls in I.M.R.T. for prostate and head-and-neck treatments were performed for each beam of each patient. These controls were made with an ionisation chamber, which is the reference detector for absolute dose measurement, and with portal dosimetry for the verification of the dose distribution. Statistical process control is a statistical analysis method, originating in industry, used to control and improve the quality of the studied process. It uses graphic tools, such as control charts, to follow the process and warn the operator in case of failure, and quantitative tools to evaluate the ability of the process to respect guidelines: the capability study. The study was performed on 450 head-and-neck beams and on 100 prostate beams. Results Control charts of the mean and standard deviation, showing both slow, weak drifts and strong, fast drifts, were established and revealed an introduced special cause (a manual shift of the leaf gap of the multi-leaf collimator). The correlation between the dose measured at one point with the E.P.I.D. and with the ionisation chamber was evaluated at more than 97%, and cases of disagreement between the two measurements were identified. Conclusion The study allowed to
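
    A minimal individuals control chart of the kind used in statistical process control can be sketched as follows; the simulated dose deviations, the use of the sample standard deviation for the limits and the injected drift are illustrative assumptions, not the centre's QA data.

```python
import numpy as np

rng = np.random.default_rng(4)
# Illustrative pretreatment QC results: relative dose deviation (%) between
# measured and planned point dose for successive beams.
deviation = rng.normal(0.0, 1.0, 60)
deviation[45:] += 3.0                    # injected drift (special cause)

baseline = deviation[:30]                # assumed in-control reference period
center = baseline.mean()
sigma = baseline.std(ddof=1)             # sample std (moving range is also common)
ucl, lcl = center + 3.0 * sigma, center - 3.0 * sigma

out_of_control = np.where((deviation > ucl) | (deviation < lcl))[0]
print(f"center = {center:.2f} %, limits = [{lcl:.2f}, {ucl:.2f}] %")
print("out-of-control beams:", out_of_control)
```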

  14. The reversible process concept applied to the environmental management of large river systems

    Science.gov (United States)

    Amoros, Claude; Rostan, Jean-Claude; Pautou, Guy; Bravard, Jean-Paul

    1987-09-01

    The wetland ecosystems occurring within alluvial floodplains change rapidly. Within the ecological successions, the life span of pioneer and transient stages may be measured in several years or decades depending on the respective influences of allogenic (water dynamics, erosion, and deposition) and autogenic developmental processes (population dynamics, eutrophication, and terrestrialization). This article emphasizes the mechanisms that are responsible for the ecosystem changes and their importance to environmental management. Two case studies exemplify reversible and irreversible successional processes in reference to different spatial and temporal scales. On the scale of the former channels, the standing-water ecosystems with low homeostasis may recover their previous status after human action on the allogenic processes. On the scale of a whole reach of the floodplain, erosion and deposition appear as reversible processes that regenerate the ecological successions. The concepts of stability and reversibility are discussed in relation to different spatiotemporal referential frameworks and different levels of integration. The reversible process concept is also considered with reference to the energy inputs into the involved subsystems. To estimate the probability of ecosystem regeneration or the cost of restoration, a concept of “degrees of reversibility” is proposed.

  15. Workflow for Criticality Assessment Applied in Biopharmaceutical Process Validation Stage 1

    Directory of Open Access Journals (Sweden)

    Thomas Zahel

    2017-10-01

    Full Text Available Identification of critical process parameters that impact product quality is a central task during regulatory requested process validation. Commonly, this is done via design of experiments and identification of parameters significantly impacting product quality (rejection of the null hypothesis that the effect equals 0). However, parameters which show a large uncertainty and might result in product quality beyond a limit critical to the product may be missed. This might occur during the evaluation of experiments, since the residual/un-modelled variance in the experiments is larger than expected a priori. Estimation of such a risk is the task of the presented novel retrospective power analysis permutation test. This is evaluated using a data set for two unit operations established during characterization of a biopharmaceutical process in industry. The results show that, for one unit operation, the observed variance in the experiments is much larger than expected a priori, resulting in low power levels for all non-significant parameters. Moreover, we present a workflow of how to mitigate the risk associated with overlooked parameter effects. This enables a statistically sound identification of critical process parameters. The developed workflow will substantially support industry in delivering constant product quality, reduce process variance and increase patient safety.
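
    The significance criterion mentioned above (rejecting the null hypothesis that a parameter effect equals 0) can be illustrated with a minimal permutation test on synthetic two-level DoE data; the factor, response and effect size below are invented, and the test is far simpler than the retrospective power analysis proposed in the record.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic two-level DoE data: coded factor setting (-1/+1) and a quality
# attribute response with a small true effect plus residual noise.
x = np.tile([-1, 1], 8).astype(float)
y = 0.4 * x + rng.normal(0.0, 1.0, x.size)

def effect(x, y):
    # Effect estimate for a two-level factor: mean(high) - mean(low).
    return y[x > 0].mean() - y[x < 0].mean()

observed = effect(x, y)

# Permutation null: shuffle the responses over runs and recompute the effect.
n_perm = 5000
null = np.array([effect(x, rng.permutation(y)) for _ in range(n_perm)])
p_value = np.mean(np.abs(null) >= abs(observed))

print(f"estimated effect = {observed:.3f}, permutation p-value = {p_value:.3f}")
```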

  16. A Three Tier Architecture Applied to LiDAR Processing and Monitoring

    Directory of Open Access Journals (Sweden)

    Efrat Jaeger-Frank

    2006-01-01

    Full Text Available Emerging Grid technologies enable solving scientific problems that involve large datasets and complex analyses, which in the past were often considered difficult to solve. Coordinating distributed Grid resources and computational processes requires adaptable interfaces and tools that provide modularized and configurable environments for accessing Grid clusters and executing high-performance computational tasks. Computationally intensive processes are also subject to a high risk of component failures and thus require close monitoring. In this paper we describe a scientific workflow approach to coordinate various resources via data analysis pipelines. We present a three-tier architecture for LiDAR interpolation and analysis, the high-performance processing of point-intensive datasets, utilizing a portal, a scientific workflow engine and Grid technologies. Our proposed solution is available to the community in a unified framework through a shared cyberinfrastructure, the GEON portal, enabling scientists to focus on their scientific work and not be concerned with the implementation of the underlying infrastructure.

  17. Digital Signal Processing Applied to the Modernization Of Polish Navy Sonars

    Directory of Open Access Journals (Sweden)

    Marszal Jacek

    2014-04-01

    Full Text Available The article presents the equipment and digital signal processing methods used for modernizing the Polish Navy’s sonars. With the rapid advancement of electronic technologies and digital signal processing methods, electronic systems, including sonars, become obsolete very quickly. In the late 1990s a team of researchers of the Department of Marine Electronics Systems, Faculty of Electronics, Telecommunications and Informatics, Gdansk University of Technology, began work on modernizing existing sonar systems for the Polish Navy. As part of the effort, a methodology of sonar modernization was implemented involving a complete replacement of existing electronic components with newly designed ones by using bespoke systems and methods of digital signal processing. Large and expensive systems of ultrasound transducers and their dipping and stabilisation systems underwent necessary repairs but were otherwise left unchanged. As a result, between 2001 and 2014 the Gdansk University of Technology helped to modernize 30 sonars of different types.

  18. Study Of The Wet Multipass Drawing Process Applied On High Strength Thin Steel Wires

    Science.gov (United States)

    Thimont, J.; Felder, E.; Bobadilla, C.; Buessler, P.; Persem, N.; Vaubourg, JP.

    2011-05-01

    Many kinds of high strength thin steel wires are used in a wide range of applications. Most of the time, these wires are made of a pearlitic steel grade. The current developments mainly concern the last drawing operation of the wire: after a patenting treatment, several reduction passes are performed on a slip-type multipass drawing machine. This paper focuses on modeling this multipass drawing process: a constitutive law based on the evolution of the wire microstructure is created, a mechanical study is performed, a set of experiments enabling determination of the process friction coefficients is suggested, and finally the related analytical model is introduced. This model provides several general results about the process and can be used to help set up the drawing machines.

  19. Applying industrial process improvement techniques to increase efficiency in a surgical practice.

    Science.gov (United States)

    Reznick, David; Niazov, Lora; Holizna, Eric; Siperstein, Allan

    2014-10-01

    The goal of this study was to examine how industrial process improvement techniques could help streamline the preoperative workup. Lean process improvement was used to streamline patient workup at an endocrine surgery service at a tertiary medical center, utilizing multidisciplinary collaboration. The program consisted of several major changes in how patients are processed in the department. The goal was to shorten the wait time between initial call and consult visit and between consult and surgery. We enrolled 1,438 patients in the program. The wait time from the initial call until consult was reduced from 18.3 ± 0.7 to 15.4 ± 0.9 days. Wait time from consult until operation was reduced from 39.9 ± 1.5 to 33.9 ± 1.3 days for the overall practice and to 15.0 ± 4.8 days for low-risk patients. Patient cancellations were reduced from 27.9 ± 2.4% to 17.3 ± 2.5%. Overall patient flow increased from 30.9 ± 5.1 to 52.4 ± 5.8 consults per month (all P values statistically significant). Using Lean process improvement methodology, surgery patients can benefit from an improved, streamlined process with a significant reduction in wait time from call to initial consult and from initial consult to surgery, with reduced cancellations. This generalized process has resulted in increased practice throughput and efficiency and is applicable to any surgery practice. Copyright © 2014 Elsevier Inc. All rights reserved.

  20. A Plan for Academic Biobank Solvency-Leveraging Resources and Applying Business Processes to Improve Sustainability.

    Science.gov (United States)

    Uzarski, Diane; Burke, James; Turner, Barbara; Vroom, James; Short, Nancy

    2015-10-01

    Researcher-initiated biobanks based at academic institutions contribute valuable biomarker and translational research advances to medicine. With many legacy banks once supported by federal funding, reductions in fiscal support threaten the future of existing and new biobanks. When the Brain Bank at Duke University's Bryan Alzheimer's Disease Center (ADRC) faced a funding crisis, a collaborative, multidisciplinary team embarked on a 2-year biobank sustainability project utilizing a comprehensive business strategy, dedicated project management, and a systems approach involving many Duke University entities. By synthesizing and applying existing knowledge, Duke Translational Medicine Institute created and launched a business model that can be adjusted and applied to legacy and start-up academic biobanks. This model provides a path to identify new funding mechanisms, while also emphasizing improved communication, business development, and a focus on collaborating with industry to improve access to biospecimens. Benchmarks for short-term Brain Bank stabilization have been successfully attained, and the evaluation of long-term sustainability metrics is ongoing. © 2015 Wiley Periodicals, Inc.

  1. A Plan for Academic Biobank Solvency—Leveraging Resources and Applying Business Processes to Improve Sustainability

    Science.gov (United States)

    Burke, James; Turner, Barbara; Vroom, James; Short, Nancy

    2015-01-01

    Abstract Researcher‐initiated biobanks based at academic institutions contribute valuable biomarker and translational research advances to medicine. With many legacy banks once supported by federal funding, reductions in fiscal support threaten the future of existing and new biobanks. When the Brain Bank at Duke University's Bryan Alzheimer's Disease Center (ADRC) faced a funding crisis, a collaborative, multidisciplinary team embarked on a 2‐year biobank sustainability project utilizing a comprehensive business strategy, dedicated project management, and a systems approach involving many Duke University entities. By synthesizing and applying existing knowledge, Duke Translational Medicine Institute created and launched a business model that can be adjusted and applied to legacy and start‐up academic biobanks. This model provides a path to identify new funding mechanisms, while also emphasizing improved communication, business development, and a focus on collaborating with industry to improve access to biospecimens. Benchmarks for short‐term Brain Bank stabilization have been successfully attained, and the evaluation of long‐term sustainability metrics is ongoing. PMID:25996355

  2. Digital processing methodology applied to exploring of radiological images; Metodologia de processamento digital aplicada a exploracao de imagens radiologicas

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Cristiane de Queiroz

    2004-07-01

    In this work, digital image processing is applied as an automatic computational method for the exploration of radiological images. An automatic routine, based on segmentation and post-processing techniques, was developed for radiological images acquired with an arrangement consisting of an X-ray tube with a molybdenum target and filter, of 0.4 mm and 0.03 mm respectively, and a CCD detector. The efficiency of the developed methodology is shown in this work through a case study in which internal injuries in mangoes are automatically detected and monitored. This methodology is a possible tool to be introduced in the post-harvest process in packing houses. A dichotomous test was applied to evaluate the efficiency of the method. The results show 87.7% correct diagnoses and 12.3% incorrect diagnoses, with a sensitivity of 93% and a specificity of 80%. (author)
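
    The reported sensitivity and specificity follow directly from the confusion-matrix definitions; the short sketch below computes them from illustrative counts chosen only to be roughly consistent with the quoted figures, not the study's raw data.

```python
def diagnostic_metrics(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP);
    accuracy = (TP+TN)/total."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, accuracy

# Illustrative counts roughly consistent with the reported 93 % sensitivity
# and 80 % specificity (not the actual study data).
sens, spec, acc = diagnostic_metrics(tp=93, fn=7, tn=80, fp=20)
print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}, accuracy = {acc:.2f}")
```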

  3. Mild processing applied to the inactivation of the main foodborne bacterial pathogens

    DEFF Research Database (Denmark)

    Barba Orellana, Francisco Jose; Koubaa, Mohamed; do Prado-Silva, Leonardo

    2017-01-01

    such as high pressure processing, ultrasounds, pulsed electric fields, UV-light, and atmospheric cold plasma may serve, in some conditions, as useful alternatives to commercial sterilization and pasteurization aiming to destroy foodborne pathogens. Each of these mild technologies has a specific mode...

  4. Biologically-based signal processing system applied to noise removal for signal extraction

    Science.gov (United States)

    Fu, Chi Yung; Petrich, Loren I.

    2004-07-13

    The method and system described herein use a biologically-based signal processing system for noise removal for signal extraction. A wavelet transform may be used in conjunction with a neural network to imitate a biological system. The neural network may be trained using ideal data derived from physical principles, or noiseless signals, to determine how to remove noise from the signal.
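
    A minimal wavelet-thresholding denoiser (omitting the neural-network stage of the patented system) can be sketched with PyWavelets as follows; the wavelet, decomposition level and universal threshold are conventional illustrative choices, not the method claimed in the patent.

```python
import numpy as np
import pywt  # PyWavelets

rng = np.random.default_rng(6)
t = np.linspace(0, 1, 1024)
clean = np.sin(2 * np.pi * 7 * t) + 0.5 * np.sign(np.sin(2 * np.pi * 3 * t))
noisy = clean + 0.4 * rng.standard_normal(t.size)

# Decompose, soft-threshold the detail coefficients, reconstruct.
coeffs = pywt.wavedec(noisy, "db4", level=5)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745          # noise estimate (MAD)
threshold = sigma * np.sqrt(2 * np.log(noisy.size))     # universal threshold
denoised_coeffs = [coeffs[0]] + [
    pywt.threshold(c, threshold, mode="soft") for c in coeffs[1:]
]
denoised = pywt.waverec(denoised_coeffs, "db4")[: noisy.size]

print("RMS error, noisy:", round(float(np.sqrt(np.mean((noisy - clean) ** 2))), 3))
print("RMS error, denoised:", round(float(np.sqrt(np.mean((denoised - clean) ** 2))), 3))
```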

  5. Medical Image Processing Server applied to Quality Control of Nuclear Medicine.

    Science.gov (United States)

    Vergara, C.; Graffigna, J. P.; Marino, E.; Omati, S.; Holleywell, P.

    2016-04-01

    This paper is framed within the area of medical image processing and aims to present the process of installation, configuration and implementation of a medical image processing server (MIPS) at the Fundación Escuela de Medicina Nuclear (FUESMEN), located in Mendoza, Argentina. It has been developed in the Gabinete de Tecnologia Médica (GA.TE.ME), Facultad de Ingeniería-Universidad Nacional de San Juan. MIPS is a software system that, using the DICOM standard, can receive medical imaging studies from different modalities or viewing stations, execute algorithms, and finally return the results to other devices. To achieve the objectives mentioned above, preliminary tests were conducted in the laboratory. Moreover, the tools were installed remotely in the clinical environment. The appropriate protocols for setting them up and using them in different services were established once the suitable algorithms had been defined. Finally, attention is given to the implementation and training provided at FUESMEN, using nuclear medicine quality control processes. Results of the implementation are presented in this work.

  6. Medical Image Processing Server applied to Quality Control of Nuclear Medicine

    International Nuclear Information System (INIS)

    Vergara, C.; Graffigna, J.P.; Holleywell, P.; Marino, E.; Omati, S.

    2016-01-01

    This paper is framed within the area of medical image processing and aims to present the process of installation, configuration and implementation of a medical image processing server (MIPS) at the Fundación Escuela de Medicina Nuclear (FUESMEN), located in Mendoza, Argentina. It has been developed in the Gabinete de Tecnologia Médica (GA.TE.ME), Facultad de Ingeniería-Universidad Nacional de San Juan. MIPS is a software system that, using the DICOM standard, can receive medical imaging studies from different modalities or viewing stations, execute algorithms, and finally return the results to other devices. To achieve the objectives mentioned above, preliminary tests were conducted in the laboratory. Moreover, the tools were installed remotely in the clinical environment. The appropriate protocols for setting them up and using them in different services were established once the suitable algorithms had been defined. Finally, attention is given to the implementation and training provided at FUESMEN, using nuclear medicine quality control processes. Results of the implementation are presented in this work. (paper)

  7. Applying Topographic Classification, Based on the Hydrological Process, to Design Habitat Linkages for Climate Change

    Directory of Open Access Journals (Sweden)

    Yongwon Mo

    2017-11-01

    Full Text Available The use of biodiversity surrogates has been discussed in the context of designing habitat linkages to support the migration of species affected by climate change. Topography has been proposed as a useful surrogate in the coarse-filter approach, as the hydrological processes caused by topography, such as erosion and accumulation, are the basis of ecological processes. However, studies that have designed topographic linkages as habitat linkages have so far focused mainly on the shape of the topography (morphometric topographic classification), with little emphasis on the hydrological processes (generic topographic classification), when identifying such topographic linkages. We aimed to understand whether the generic classification is valid for designing these linkages. First, we evaluated which topographic classification is more appropriate for describing actual (coniferous and deciduous forest) and potential (mammal and amphibian) habitat distributions. Second, we analyzed the difference in the linkages between the morphometric and generic topographic classifications. The results showed that the generic classification represented the actual distribution of the trees, but neither the morphometric nor the generic classification could represent the potential animal distributions adequately. Our study demonstrated that the topographic classes, according to the generic classification, were arranged successively according to the flow of water, nutrients, and sediment; therefore, it would be advantageous to secure linkages with a width of 1 km or more. In addition, the edge effect would be smaller than with the morphometric classification. Accordingly, we suggest that topographic characteristics based on the hydrological process are required to design topographic linkages for climate change.

  8. Applying the Decoding the Disciplines Process to Teaching Structural Mechanics: An Autoethnographic Case Study

    Science.gov (United States)

    Tingerthal, John Steven

    2013-01-01

    Using case study methodology and autoethnographic methods, this study examines a process of curricular development known as "Decoding the Disciplines" (Decoding) by documenting the experience of its application in a construction engineering mechanics course. Motivated by the call to integrate what is known about teaching and learning…

  9. Applying unit process life cycle inventory (UPLCI) methodology in product/packaging combinations

    NARCIS (Netherlands)

    Oude Luttikhuis, Ellen; Toxopeus, Marten E.; Overcash, M.; Nee, Andrew Y.C.; Song, Bin; Ong, Soh-Khim

    2013-01-01

    This paper discusses how the UPLCI approach can be used for determining the inventory of the manufacturing phases of product/packaging combinations. The UPLCI approach can make the inventory of the manufacturing process of the product that is investigated more accurate. The life cycle of

  10. Systems Theoretic Process Analysis Applied to an Offshore Supply Vessel Dynamic Positioning System

    Science.gov (United States)

    2016-06-01

    Only fragments of this report are preserved in the record: a list-of-figures entry ("Figure 14: Eight Elements of the System Safety Process") and excerpts stating that improving the safety of dynamic positioning (DP) operation requires addressing all major elements of the human-machine system, and that the operator must maintain communication with the target vessel, adhere to all commands given by the target vessel DP system (auto and manual), and provide safe operation.

  11. Sequential Analysis Applied to Counseling Process and Outcome: A Case Study Revisited.

    Science.gov (United States)

    Wampold, Bruce E.; Kim, Kay-Hyon

    1989-01-01

    Analyzed case study presented by Hill, Carter, and O'Farrell (1983) with sequential analysis methods developed to demonstrate the usefulness of these methods for understanding counseling process and outcome. Sequential analysis method revealed several facets of interaction between counselor and client that were undetected by Hill et al.'s…

  12. Electrolytic in process dressing (ELID) applied to double side grinding of ceramic materials

    Science.gov (United States)

    Spanu, Cristian E.

    The objective of the present work is to design, optimize, and validate an electrolytic in-process dressing (ELID)-assisted double side grinding process for finishing advanced ceramic components. To attain this objective, an original ELID double side grinding system was designed, fabricated, and operated at Precision Micro-Machining Center at The University of Toledo, Ohio. The ELID technique was selected from among other options to assure the in-situ dressing of the metal-bonded superabrasive grinding wheel and to maintain its cutting ability throughout the operation, which is, otherwise, a challenging enterprise. Optimizing the ELID double side grinding process parameters is an important goal of the present study. To achieve this goal, a complex integrated model was developed and validated through extensive experimental testing. Four analytical computerized models were developed and integrated: (1) an improved kinematic model of double side grinding accounting for workpiece rotation, which is used to simulate the grinding trajectories; (2) a microscopic model of the interaction between a single diamond grit and the work surface, which is used to predict the volume of material removed; (3) a stochastic model for the topographical characterization of the superabrasive wheel, which leads to a new prediction method of depth of indentation; and (4) an electrolytic oxidation model, which explains the dynamics of the oxide layer. In order to validate the models and to confirm the optimized process, experimental tests were conducted under different conditions: with vitrified and metallic bond grinding wheels, with various average grain sizes of diamond grits, with different superabrasive concentrations, with different grinding fluids, with and without ELID assistance. Our findings show that an optimized ceramic double side grinding process using fine diamond grit is more efficient than lapping in producing very fine surfaces. The experiments confirmed the superiority of

  13. Generalized Software Architecture Applied to the Continuous Lunar Water Separation Process and the Lunar Greenhouse Amplifier

    Science.gov (United States)

    Perusich, Stephen; Moos, Thomas; Muscatello, Anthony

    2011-01-01

    This innovation provides the user with autonomous on-screen monitoring, embedded computations, and tabulated output for two new processes. The software was originally written for the Continuous Lunar Water Separation Process (CLWSP), but was found to be general enough to be applicable to the Lunar Greenhouse Amplifier (LGA) as well, with minor alterations. The resultant program should have general applicability to many laboratory processes (see figure). The objective for these programs was to create a software application that would provide both autonomous monitoring and data storage, along with manual manipulation. The software also allows operators the ability to input experimental changes and comments in real time without modifying the code itself. Common process elements, such as thermocouples, pressure transducers, and relative humidity sensors, are easily incorporated into the program in various configurations, along with specialized devices such as photodiode sensors. The goal of the CLWSP research project is to design, build, and test a new method to continuously separate, capture, and quantify water from a gas stream. The application is any In-Situ Resource Utilization (ISRU) process that desires to extract or produce water from lunar or planetary regolith. The present work is aimed at circumventing current problems and ultimately producing a system capable of continuous operation at moderate temperatures that can be scaled over a large capacity range depending on the ISRU process. The goal of the LGA research project is to design, build, and test a new type of greenhouse that could be used on the moon or Mars. The LGA uses super greenhouse gases (SGGs) to absorb long-wavelength radiation, thus creating a highly efficient greenhouse at a future lunar or Mars outpost. Silica-based glass, although highly efficient at trapping heat, is heavy, fragile, and not suitable for space greenhouse applications. Plastics are much lighter and resilient, but are not

  14. Image processing technology applied in quality inspecting of the motor carling

    Science.gov (United States)

    Xu, Hongsheng; Lei, Jun; Fu, Yongzhi

    2010-07-01

    After an in-depth analysis of how an image processing system can detect missing holes in motor carlings, we designed the whole system in accordance with the actual production conditions of the motor carling. We then explain the system's hardware and software in detail, introducing their general functions and, from this analysis, discussing the hardware and software modules and the principles behind their design. The measures used to confirm the region of interest for image processing, edge detection, and circle detection by randomized Hough transform are explained in detail. Finally, the results of testing the system in the laboratory and in the factory are reported.
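
    The abstract names edge detection and a randomized Hough transform for circle detection but gives no code. The sketch below illustrates the same detect-and-count idea with OpenCV's gradient-based Hough circle transform (not the randomized variant the authors describe); the file name, expected hole count and thresholds are assumptions.

      import cv2

      img = cv2.imread("carling.png", cv2.IMREAD_GRAYSCALE)   # hypothetical inspection image
      blurred = cv2.GaussianBlur(img, (5, 5), 1.5)             # suppress noise before detection
      edges = cv2.Canny(blurred, 50, 150)                      # edge map, useful for locating the part
      circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1.2, minDist=20,
                                 param1=150, param2=30, minRadius=5, maxRadius=30)
      expected_holes = 8                                       # assumed number of holes in the carling
      found = 0 if circles is None else circles.shape[1]
      print("missing holes detected" if found < expected_holes else "part OK")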

  15. Formulation and validation of applied engineering equations for heat transfer processes in the food industry

    DEFF Research Database (Denmark)

    Christensen, Martin Gram

    The study is focused on convective heat transfer in the processing of solid foods, specifically with the scope to develop simple analytical calculation tools that can be incorporated into spreadsheet solutions. In areas of food engineering such as equipment manufacture, the use of predictive ... are conducted in food manufacture. The study provides a method by which traditional processes can be calculated with high precision by using an expanded 1st-term approximation to the series expansion. This is advantageous in terms of application in the industry, where the solution can be incorporated ... in food manufacture. It is also hoped that the solutions provided, and the insight into the [Fo-exp] approximation, will become a part of the engineering training for food science students and, most important, that the study will find application in the food industry.
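
    The record refers to an expanded first-term approximation without reproducing it. For reference only (this is the classical one-term approximation for a plane wall, not the author's expanded form), the dimensionless temperature is theta = (T - Tinf)/(T0 - Tinf) ~ A1*exp(-lambda1^2*Fo), where lambda1 is the first root of lambda*tan(lambda) = Bi and A1 = 4*sin(lambda1)/(2*lambda1 + sin(2*lambda1)); a minimal numeric sketch, valid for Fo > 0.2:

      import numpy as np
      from scipy.optimize import brentq

      def one_term_theta(Bi, Fo):
          """Classical one-term approximation for transient conduction in a plane wall."""
          lam1 = brentq(lambda l: l * np.tan(l) - Bi, 1e-6, np.pi / 2 - 1e-6)
          A1 = 4.0 * np.sin(lam1) / (2.0 * lam1 + np.sin(2.0 * lam1))
          return A1 * np.exp(-lam1**2 * Fo)

      print(one_term_theta(Bi=1.0, Fo=0.5))   # hypothetical Biot and Fourier numbers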

  16. FAILURES AND DEFECTS IN THE BUILDING PROCESSAPPLYING THE BOW-TIE APPROACH

    DEFF Research Database (Denmark)

    Jørgensen, Kirsten

    2009-01-01

    A building site was observed from the very start to the very end, and all failures and defects of a certain size were recorded and analysed. The methodological approach used in this analysis was the bow-tie model from the area of safety research. It combines critical-event analysis for both causes and effects...... with event-tree analysis. The paper describes this analytical approach as an introduction to a new concept for understanding failures and defects in construction. Analysing the many critical events in the building process with the bow-tie model visualises the complexity of causes. This visualisation offers...... the possibility of a much more direct and focused discussion of what needs doing, by whom and when – not only to reduce the number of defects in the final product, but also to make the building process flow much better and reduce the need for damage control....

  17. Considerations on the question of applying ion exchange or reverse osmosis methods in boiler feedwater processing

    International Nuclear Information System (INIS)

    Marquardt, K.; Dengler, H.

    1976-01-01

    This consideration shows that the method of reverse osmosis presents, in many cases, an interesting and economical alternative to partial and total desalination plants using ion exchangers. The essential advantages of reverse osmosis are a higher degree of automation, no additional salt load in the discharged waste water, a small constructional volume of the plant, and favourable operating costs with increasing salt content of the raw water to be processed. As there is a relatively high salt breakthrough compared to the ion exchange method, the future tendency in boiler feedwater processing will be more towards a combination of reverse osmosis and post-purification by continuous ion exchange methods. (orig./LH) [de

  18. Supervisory System and Multivariable Control Applying Weighted Fuzzy-PID Logic in an Alcoholic Fermentation Process

    Directory of Open Access Journals (Sweden)

    Márcio Mendonça

    2015-10-01

    Full Text Available In this work, a multivariable control system for a non-minimum-phase alcoholic fermentation process is analyzed. The control is performed with classic PID controllers associated with a supervisory system based on fuzzy systems. The fuzzy system, a priori, sends set-points to the PID controllers, but also adds protection functions: for example, if the biomass value is at or very close to zero, the fuzzy controller changes the campaign to prevent or mitigate stalling of the process. Three control architectures based on fuzzy control systems are presented and compared in performance with classic control in different campaigns. The third architecture, in particular, adds an adaptive function. A brief summary of fuzzy theory and related works is presented. Finally, simulation results, conclusions and future works end the article.
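
    The paper's fuzzy rule base and gains are not included in the record. The sketch below is only a schematic stand-in for the idea the abstract describes: a supervisory rule overrides the set-point handed to a conventional PID loop when the measured biomass approaches zero. The gains, threshold and scaling factor are invented for illustration.

      class PID:
          def __init__(self, kp, ki, kd, dt):
              self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
              self.integral = 0.0
              self.prev_err = 0.0

          def step(self, setpoint, measurement):
              err = setpoint - measurement
              self.integral += err * self.dt
              deriv = (err - self.prev_err) / self.dt
              self.prev_err = err
              return self.kp * err + self.ki * self.integral + self.kd * deriv

      def supervisor(biomass, nominal_setpoint):
          # Crude protection rule standing in for the fuzzy supervisor:
          # if biomass is near zero, cut the set-point to avoid stalling the fermentation
          return 0.2 * nominal_setpoint if biomass < 0.05 else nominal_setpoint

      pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.1)                      # hypothetical gains
      u = pid.step(supervisor(biomass=0.01, nominal_setpoint=35.0),  # supervised set-point
                   measurement=30.0)
      print(u)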

  19. ANALYTIC NETWORK PROCESS AND BALANCED SCORECARD APPLIED TO THE PERFORMANCE EVALUATION OF PUBLIC HEALTH SYSTEMS

    Directory of Open Access Journals (Sweden)

    Marco Aurélio Reis dos Santos

    2015-08-01

    Full Text Available The performance of public health systems is an issue of great concern. After all, to assure people's quality of life, public health systems need different kinds of resources. Balanced Scorecard provides a multi-dimensional evaluation framework. This paper presents the application of the Analytic Network Process and Balanced Scorecard in the performance evaluation of a public health system in a typical medium-sized Southeastern town in Brazil.
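
    The record gives no numerical detail of the ANP model. As background only, the basic building block shared with AHP is the priority vector obtained from a pairwise comparison matrix (ANP then assembles such vectors into a supermatrix to capture dependence, which this sketch omits). The comparison values for the four Balanced Scorecard perspectives are hypothetical.

      import numpy as np

      # Hypothetical Saaty-scale pairwise comparisons of the four BSC perspectives:
      # financial, customer, internal processes, learning & growth
      A = np.array([[1,   3,   2,   4],
                    [1/3, 1,   1/2, 2],
                    [1/2, 2,   1,   3],
                    [1/4, 1/2, 1/3, 1]], dtype=float)

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)                      # principal eigenvalue
      w = np.abs(eigvecs[:, k].real)
      priorities = w / w.sum()                         # local priority vector
      ci = (eigvals.real[k] - len(A)) / (len(A) - 1)   # consistency index
      print(priorities, ci)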

  20. Development of a Procedure to Apply Detailed Chemical Kinetic Mechanisms to CFD Simulations as Post Processing

    DEFF Research Database (Denmark)

    Skjøth-Rasmussen, Martin Skov; Glarborg, Peter; Jensen, Anker

    2003-01-01

    It is desired to make detailed chemical kinetic mechanisms applicable to the complex geometries of practical combustion devices simulated with computational fluid dynamics tools. This work presents a novel general approach to combining computational fluid dynamics and a detailed chemical kinetic...... mechanism. It involves post-processing of data extracted from computational fluid dynamics simulations. Application of this approach successfully describes combustion chemistry in a standard swirl burner, the so-called Harwell furnace. Nevertheless, it needs validation against more complex combustion models...
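
    The record does not show how the extracted CFD fields are post-processed. A minimal sketch of the general idea (not the authors' Harwell-furnace workflow) is to re-run each extracted cell state through a detailed mechanism in a zero-dimensional reactor, here with the open-source Cantera package (assuming a recent version with the bundled GRI-Mech 3.0 mechanism); the cell values below are invented.

      import cantera as ct
      import numpy as np

      # Each row: temperature [K], pressure [Pa], residence time [s], CH4/O2/N2 mole fractions
      cfd_cells = np.array([
          [1500.0, 101325.0, 0.01, 0.05, 0.15, 0.80],
          [1800.0, 101325.0, 0.02, 0.02, 0.10, 0.88],
      ])

      gas = ct.Solution("gri30.yaml")              # detailed chemical kinetic mechanism
      for T, p, tau, x_ch4, x_o2, x_n2 in cfd_cells:
          gas.TPX = T, p, {"CH4": x_ch4, "O2": x_o2, "N2": x_n2}
          reactor = ct.IdealGasConstPressureReactor(gas)
          net = ct.ReactorNet([reactor])
          net.advance(tau)                         # integrate detailed chemistry over the cell residence time
          print(reactor.thermo["CO", "NO"].X)      # minor species not resolved by the CFD combustion model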

  1. Applied Neural Cross-Correlation into the Curved Trajectory Detection Process for Braitenberg Vehicles

    OpenAIRE

    Macktoobian, Matin; Jafari, Mohammad; Gh, Erfan Attarzadeh

    2014-01-01

    The Curved Trajectory Detection (CTD) process can be considered among the high-level planning capabilities of cognitive agents, which has been acquired under the aegis of embedded artificial spiking neuronal circuits. In this paper, a hard-wired implementation of cross-correlation, the most common comparison-driven scheme for both natural and artificial bionic constructions, named the Depth Detection Module (DDM), is taken into account. It is a manifestation of efficient handling upon epileptic seiz...
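
    The paper implements cross-correlation in hard-wired spiking-neuron circuitry; as plain background on the comparison scheme itself (not the authors' DDM), the sketch below estimates the sample lag between two signals with an ordinary digital cross-correlation in NumPy, using synthetic signals rather than vehicle sensor data.

      import numpy as np

      def best_lag(left, right):
          """Return the sample lag at which `right` best matches `left`."""
          left = (left - left.mean()) / left.std()
          right = (right - right.mean()) / right.std()
          corr = np.correlate(left, right, mode="full")
          return np.argmax(corr) - (len(right) - 1)

      t = np.linspace(0, 1, 500)
      sig = np.sin(2 * np.pi * 5 * t)
      delayed = np.roll(sig, 12)        # simulate a 12-sample delay between two sensors
      print(best_lag(delayed, sig))     # prints 12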

  2. Data-processing problems in filmless readout systems applied to physical experiments

    International Nuclear Information System (INIS)

    Bogdanova, N.B.; Prikhod'ko, V.I.; Ososkov, G.A.; Gadzhokov, V.

    1984-01-01

    The applications of filmless readout systems in modern physical experiments are considered. The basic characteristics of systems built on TV tubes and on charge-coupled devices (CCD) are reported. Filmless-data processing problems are formulated: recognition of images of tracks and fiducial marks; data compression; computation of the calibration transforms and of the system accuracy parameter. Results from mathematical algorithms and computer codes are reported for the case of streamer-chamber systems

  3. An Assessment of Software Safety as Applied to the Department of Defense Software Development Process

    Science.gov (United States)

    1992-12-01

    Only fragments of this report are preserved in the record: table-of-contents entries ("Relationship between 300 Series Tasks and the Software Development Process", "Real Time Logic") and excerpts noting that the support documents include the Computer System Operator's Manual, Software User's Manual, Software Programmer's Manual and Firmware Support Manual, that the procedures to implement the requirements must be developed and included in technical manuals, and that in order to assess the risk of any hazard, the hazard must be...

  4. Process management incorporating the intellectual capital and knowledge management: an applied study in research centres

    OpenAIRE

    Enrique Saravia Vergara

    2015-01-01

    In today’s competitive environment, organizations seek to create value for customers through management approaches that not only ensure the supply of goods and services of quality and at low prices, but that achieve long-term competitive advantages. In this context, process management appears as a management model based on "quality"; whereas "intellectual capital" and "knowledge management" models represent the main models based on the management of intangible assets, the basis of competitive...

  5. Kaolin processing waste applied in the manufacturing of ceramic tiles and mullite bodies.

    Science.gov (United States)

    Menezes, Romualdo R; Farias, Felipe F; Oliveira, Maurício F; Santana, Lisiane N L; Neves, Gelmires A; Lira, Helio L; Ferreira, Heber C

    2009-02-01

    In the last few years, mineral extraction and processing industries have been identified as sources of environmental contamination and pollution. The kaolin processing industry around the world generates large amounts of waste materials. The present study evaluated the suitability of kaolin processing waste as an alternative source of ceramic raw material for the production of ceramic tiles and dense mullite bodies. Several formulations were prepared and sintered at different temperatures. The sintered samples were characterized to determine their porosity, water absorption, firing shrinkage and mechanical strength. The fired samples were microstructurally analysed by X-ray diffraction. The results indicated that ceramic tile formulations containing up to 60% of waste could be used for the production of tiles with low water absorption (approximately 0.5%) and low sintering temperature (1150 degrees C). Mullite formulations with more than 40% of kaolin waste could be used in the production of bodies with high strength, of about 75 MPa, which can be used as refractory materials.

  6. ROBUST REGULATION FOR SYSTEMS WITH POLYNOMIAL NONLINEARITY APPLIED TO RAPID THERMAL PROCESSES

    Directory of Open Access Journals (Sweden)

    S. V. Aranovskiy

    2014-07-01

    Full Text Available A problem of robust output control for a system with a power nonlinearity is considered. By introducing the error term, the problem can be rewritten as a stabilization problem for a system with a polynomial nonlinearity. The problem of temperature regulation is considered as the application; the rapid thermal processes in vapor deposition processing are studied. Modern industrial equipment uses complex sensors and control systems; these devices are not available for laboratory setups. The limited set of available sensors and other technical restrictions of laboratory setups make the design of simple low-order output control laws a relevant problem. The problem is solved by the consecutive compensator approach. The paper deals with a new type of restriction, which is a combination of linear and power restrictions, and it is shown that the polynomial nonlinearity satisfies this restriction. Asymptotic stability of the closed-loop system is proved by the Lyapunov function approach for the considered nonlinear function; this contribution extends previously known results. Numerical simulation of the vapor deposition processing illustrates that the proposed approach results in a zero-mean tracking error with a standard deviation of less than 1 K.

  7. Visual operations management tools applied to the oil pipelines and terminals standardization process: the experience of TRANSPETRO

    Energy Technology Data Exchange (ETDEWEB)

    Almeida, Maria Fatima Ludovico de [Pontificia Universidade Catolica do Rio de Janeiro (PUC-Rio/ITUC), Rio de Janeiro, RJ (Brazil). Instituto Tecnologico; Santiago, Adilson; Ribeiro, Kassandra Senra; Arruda, Daniela Mendonca [TRANSPETRO - PETROBRAS Transporte S.A., Rio de Janeiro, RJ (Brazil)

    2009-07-01

    This paper describes the process by which visual operations management (VOM) tools were implemented, concerning standards and operational procedures in TRANSPETRO's Oil Pipelines and Terminals Unit. It provides: a brief literature review of visual operations management tools applied to total quality management and the standardization processes; a discussion of the assumptions from the second level of VOM (visual standards) upon which TRANSPETRO's oil pipelines and terminals business processes and operational procedures are based; and a description of the VOM implementation process involving more than 100 employees and one illustrative example of 'Quick Guides' for right-of-way management activities. Finally, it discusses the potential impacts and benefits of using VOM tools in the current practices in TRANSPETRO's Oil Pipelines and Terminals Unit, reinforcing the importance of such visual guides as vital to implement regional and corporate procedures, focusing on the main operational processes. (author)

  8. Electron beam irradiation process applied to primary and secondary recycled high density polyethylene

    Energy Technology Data Exchange (ETDEWEB)

    Cardoso, Jéssica R.; Moura, Eduardo de; Geraldo, Áurea B.C., E-mail: ageraldo@ipen.br [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN-SP), São Paulo, SP (Brazil)

    2017-07-01

    Plastic bags, packaging and furniture items are examples of the plastic utilities always present in daily life. However, the end-of-life of plastics impacts the environment because of this ubiquity and also because of their often long degradation time. Recycling processes are important in this scenario because they offer many solutions to this problem. Basically, four routes are known for plastic recycling: primary recycling, which consists of re-extrusion of clean plastic scraps from a production plant; secondary recycling, which uses end-of-life products that are generally reduced in size by extrusion to obtain a shape more suitable for reprocessing (pellets and powder); tertiary recovery, which is related to thermo-chemical methods to produce fuels and petrochemical feedstock; and the quaternary route, which is related to energy recovery and is carried out in appropriate reactors. In this work, high density polyethylene (HDPE) was recovered to simulate empirically the primary and secondary recycling routes, using materials which ranged from pristine to 20-fold re-extruded material. The final 20-fold recycled thermoplastic was irradiated in an electron beam accelerator at a dose rate of 22.4 kGy/s and absorbed doses of 50 kGy and 100 kGy. The characterization of HDPE at the distinct levels of recovery was performed by infrared spectroscopy (FTIR) and thermogravimetric degradation analysis. In HDPE recycling, degradation and crosslinking are consecutive processes; degradation is very noticeable in the 20-fold recycled product. Despite this, the 20-fold recycled product presents crosslinking after the irradiation process, and the post-irradiation product presents spectroscopic and thermal degradation characteristics similar to those of pristine, irradiated HDPE. These results are discussed. (author)

  9. Electron beam irradiation process applied to primary and secondary recycled high density polyethylene

    International Nuclear Information System (INIS)

    Cardoso, Jéssica R.; Moura, Eduardo de; Geraldo, Áurea B.C.

    2017-01-01

    Plastic bags, packaging and furniture items are examples of the plastic utilities always present in daily life. However, the end-of-life of plastics impacts the environment because of this ubiquity and also because of their often long degradation time. Recycling processes are important in this scenario because they offer many solutions to this problem. Basically, four routes are known for plastic recycling: primary recycling, which consists of re-extrusion of clean plastic scraps from a production plant; secondary recycling, which uses end-of-life products that are generally reduced in size by extrusion to obtain a shape more suitable for reprocessing (pellets and powder); tertiary recovery, which is related to thermo-chemical methods to produce fuels and petrochemical feedstock; and the quaternary route, which is related to energy recovery and is carried out in appropriate reactors. In this work, high density polyethylene (HDPE) was recovered to simulate empirically the primary and secondary recycling routes, using materials which ranged from pristine to 20-fold re-extruded material. The final 20-fold recycled thermoplastic was irradiated in an electron beam accelerator at a dose rate of 22.4 kGy/s and absorbed doses of 50 kGy and 100 kGy. The characterization of HDPE at the distinct levels of recovery was performed by infrared spectroscopy (FTIR) and thermogravimetric degradation analysis. In HDPE recycling, degradation and crosslinking are consecutive processes; degradation is very noticeable in the 20-fold recycled product. Despite this, the 20-fold recycled product presents crosslinking after the irradiation process, and the post-irradiation product presents spectroscopic and thermal degradation characteristics similar to those of pristine, irradiated HDPE. These results are discussed. (author)

  10. Study on structural design technique of silicon carbide applied for thermochemical hydrogen production IS process

    International Nuclear Information System (INIS)

    Takegami, Hiroaki; Terada, Atsuhiko; Inagaki, Yoshiyuki; Ishikura, Syuichi

    2011-03-01

    The IS process is a hydrogen production method that uses a thermochemical reaction cycle of sulfuric acid and iodine. The equipment is therefore required to withstand high temperatures and, moreover, a corrosive environment. Specifically, the sulfuric acid decomposer, one of the main components of the IS process, is the equipment in which 90 wt% sulfuric acid is heated with hot helium and evaporated. It is also important equipment that feeds the subsequent SO3 decomposer, in which part of the sulfuric acid vapor is decomposed into SO3. The heat exchanger in which the sulfuric acid evaporates must be of pressure-resistant construction because it contains high-pressure helium at 4 MPa, and a material must be used that can withstand the high-temperature, corrosive environment of 700 degrees C or more. Corrosion experiments and related tests identified silicon carbide (SiC) ceramics as the most suitable material. Because the ceramic block serving as the heat exchanger is housed in a pressure-resistant metallic vessel, fluids such as sulfuric acid do not leak outside even if the block is damaged. However, no structural design methodology has yet been standardized for the use of ceramics as structural parts. This report presents a study of a structural design technique that takes the material strength characteristics of ceramics into consideration, with reference to existing structural design standards. (author)

  11. Fulfillment of GMP standard, halal standard, and applying HACCP for production process of beef floss (Case study: Ksatria enterprise)

    Science.gov (United States)

    A'diat, Arkan Addien Al; Liquiddanu, Eko; Laksono, Pringgo Widyo; Sutopo, Wahyudi; Suletra, I. Wayan

    2018-02-01

    Along with the increasing number of modern retail businesses in Indonesia, small and medium enterprises (SMEs) have an opportunity to sell their products through modern retailers. There are some obstacles faced by SMEs, one of which concerns product standards. The standards that SMEs must hold are the GMP standard and the halal standard. This research was conducted to assess how a beef floss enterprise in Jagalan fulfils the GMP and halal standards. In addition, a Hazard Analysis and Critical Control Points (HACCP) system was applied to analyze the process. The HACCP used in this research was based on the seven principles in SNI (Indonesian National Standard) 01-4852-1998. The seven principles include hazard analysis, critical control point (CCP) determination, critical limit establishment, CCP monitoring system establishment, corrective action establishment, verification, and documentation establishment, which must be applied in preparing a HACCP plan. Based on this case study, it is concluded that there were 5 CCPs: the boiling process, the roasting process, the frying process, the beef floss draining process, and the packaging process.

  12. Neural networks-based modeling applied to a process of heavy metals removal from wastewaters.

    Science.gov (United States)

    Suditu, Gabriel D; Curteanu, Silvia; Bulgariu, Laura

    2013-01-01

    This article approaches the problem of environmental pollution with heavy metals from the disposal of industrial wastewaters, namely the removal of these metals by means of biosorbents, particularly Romanian peat (from Poiana Stampei). The study is carried out by simulation using feed-forward and modular neural networks with one or two hidden layers, examining the influence of certain operating parameters (metal nature, sorbent dose, pH, temperature, initial concentration of metal ion, contact time) on the amount of metal ions retained per unit mass of sorbent. In the neural network modeling, a consistent data set was used, including five metals: lead, mercury, cadmium, nickel and cobalt, the nature of the metal being quantified by its electronegativity. Although based on successive trials, the design of the neural models was conducted systematically, recording and comparing the errors obtained with different types of neural networks having various numbers of hidden layers and neurons and numbers of training epochs, or using various learning methods. Errors with values under 5% make clear the efficiency of the applied method.
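
    The record does not specify the network topologies or the software used by the authors. Purely as an illustration of a feed-forward regression on the six listed inputs, the sketch below trains a small scikit-learn network on synthetic stand-in data (the relationship used to generate the target is invented, not the Poiana Stampei measurements).

      import numpy as np
      from sklearn.neural_network import MLPRegressor
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      # Columns: electronegativity, sorbent dose, pH, temperature,
      # initial ion concentration, contact time; target: uptake per unit sorbent mass
      rng = np.random.default_rng(0)
      X = rng.uniform([1.8, 1, 2, 10, 10, 5], [2.2, 10, 7, 50, 200, 120], size=(200, 6))
      y = 0.3 * X[:, 4] / (1 + X[:, 1]) + 2 * X[:, 2] + rng.normal(0, 0.5, 200)  # synthetic target

      model = make_pipeline(StandardScaler(),
                            MLPRegressor(hidden_layer_sizes=(12, 8), max_iter=5000,
                                         random_state=0))
      model.fit(X, y)
      print(model.predict(X[:3]))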

  13. Parametric Design within an Atomic Design Process (ADP) applied to Spacecraft Design

    Science.gov (United States)

    Ramos Alarcon, Rafael

    This thesis describes research investigating the development of a model for the initial design of complex systems, with application to spacecraft design. The design model is called an atomic design process (ADP) and contains four fundamental stages (specifications, configurations, trade studies and drivers) that constitute the minimum steps of an iterative process that helps designers find a feasible solution. Representative design models from the aerospace industry are reviewed and are compared with the proposed model. The design model's relevance, adaptability and scalability features are evaluated through a focused design task exercise with two undergraduate teams and a long-term design exercise performed by a spacecraft payload team. The implementation of the design model is explained in the context in which the model has been researched. This context includes the organization (a student-run research laboratory at the University of Michigan), its culture (academically oriented), members that have used the design model and the description of the information technology elements meant to provide support while using the model. This support includes a custom-built information management system that consolidates relevant information that is currently being used in the organization. The information is divided in three domains: personnel development history, technical knowledge base and laboratory operations. The focused study with teams making use of the design model to complete an engineering design exercise consists of the conceptual design of an autonomous system, including a carrier and a deployable lander that form the payload of a rocket with an altitude range of over 1000 meters. Detailed results from each of the stages of the design process while implementing the model are presented, and an increase in awareness of good design practices in the teams while using the model are explained. A long-term investigation using the design model consisting of the

  14. Automation of the process of generation of the students insurance, applying RFID and GPRS technologies

    Directory of Open Access Journals (Sweden)

    Nelson Barrera-Lombana

    2013-07-01

    Full Text Available This article describes the design and implementation of a system that allows queries on various parameters to be made to a web server using a GSM modem, information exchange systems over the Internet (ISS) and radio-frequency identification (RFID). The application was validated for use in automating the process of generating student insurance, and the hardware and software, developed by the Research Group in Robotics and Industrial Automation (GIRA) of UPTC, are used as its platform.

  15. ExcelAutomat: a tool for systematic processing of files as applied to quantum chemical calculations

    Science.gov (United States)

    Laloo, Jalal Z. A.; Laloo, Nassirah; Rhyman, Lydia; Ramasami, Ponnadurai

    2017-07-01

    The processing of the input and output files of quantum chemical calculations often necessitates a spreadsheet as a key component of the workflow. Spreadsheet packages with a built-in programming language editor can automate the steps involved and thus provide a direct link between processing files and the spreadsheet. This helps to reduce user-interventions as well as the need to switch between different programs to carry out each step. The ExcelAutomat tool is the implementation of this method in Microsoft Excel (MS Excel) using the default Visual Basic for Application (VBA) programming language. The code in ExcelAutomat was adapted to work with the platform-independent open-source LibreOffice Calc, which also supports VBA. ExcelAutomat provides an interface through the spreadsheet to automate repetitive tasks such as merging input files, splitting, parsing and compiling data from output files, and generation of unique filenames. Selected extracted parameters can be retrieved as variables which can be included in custom codes for a tailored approach. ExcelAutomat works with Gaussian files and is adapted for use with other computational packages including the non-commercial GAMESS. ExcelAutomat is available as a downloadable MS Excel workbook or as a LibreOffice workbook.
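
    ExcelAutomat itself is VBA code living inside a spreadsheet. For readers outside MS Excel or LibreOffice, the sketch below performs one of the tasks the abstract describes (collecting the final SCF energy from a folder of Gaussian output files into a summary table) in plain Python; the folder name and file pattern are assumptions.

      import re
      from pathlib import Path

      pattern = re.compile(r"SCF Done:\s+E\((\S+)\)\s+=\s+(-?\d+\.\d+)")

      rows = []
      for log in sorted(Path("outputs").glob("*.log")):        # hypothetical output folder
          matches = pattern.findall(log.read_text(errors="ignore"))
          if matches:
              method, energy = matches[-1]                     # last SCF cycle in the file
              rows.append((log.name, method, float(energy)))

      with open("summary.csv", "w") as f:                      # spreadsheet-ready summary
          f.write("file,method,scf_energy_hartree\n")
          for name, method, e in rows:
              f.write(f"{name},{method},{e}\n")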

  16. Analysis of the possibility of applying radio frequency identification in the flexible production process

    Directory of Open Access Journals (Sweden)

    Mirkov Gligorije I.

    2017-01-01

    Full Text Available Flexible manufacturing systems (FMS), as complex and stochastic environments, require the development of innovative, intelligent control architectures in order to improve flexibility, agility and reconfigurability. A distributed management system addresses this challenge by introducing optimal process management supported by autonomous control units that cooperate with each other. Most existing transport management systems lack flexibility and agility, especially in cases with a large variety of products and a small share of parts of smaller dimensions. In such cases, the system is insensitive to random 'ad-hoc' events. The transport of parts through a flexible manufacturing system can potentially be used to obtain information about the product for process management. Radio frequency identification (RFID) has been introduced as a new technology that allows parts to be monitored, identified and categorized. This paper gives the grounds for a flexible manufacturing cell (FMC) architecture and the deployment of RFID devices with the aim of distributing and tracking parts. The paper gives an example of setting up an agent-based control architecture for the FMC.

  17. Video processing of remote sensor data applied to uranium exploration in Wyoming

    International Nuclear Information System (INIS)

    Levinson, R.A.; Marrs, R.W.; Crockell, F.

    1979-01-01

    LANDSAT satellite imagery and aerial photography can be used to map areas of altered sandstone associated with roll-front uranium deposits. Image data must be enhanced so that alteration spectral contrasts can be seen, and video image processing is a fast, low-cost, and efficient tool. For LANDSAT data, the 7/4 ratio produces the best enhancement of altered sandstone. The 6/4 ratio is most effective for color infrared aerial photography. Geochemical and mineralogical associations occur in unaltered, altered, and ore roll-front zones. Samples from Pumpkin Buttes show that iron is the primary coloring agent which makes alteration visually detectable. Eh and pH changes associated with passage of a roll front cause oxidation of magnetite and pyrite to hematite, goethite, and limonite in the host sandstone, thereby producing the alteration. Statistical analyses show that the detectability of geochemical and color zonation in host sands is weakened by soil-forming processes. Alteration can only be mapped in areas of thin soil cover and moderate to sparse vegetative cover

  18. Windowed persistent homology: A topological signal processing algorithm applied to clinical obesity data.

    Directory of Open Access Journals (Sweden)

    Craig Biwer

    Full Text Available Overweight and obesity are highly prevalent in the population of the United States, affecting roughly 2/3 of Americans. These diseases, along with their associated conditions, are a major burden on the healthcare industry in terms of both dollars spent and effort expended. Volitional weight loss is attempted by many, but weight regain is common. The ability to predict which patients will lose weight and successfully maintain the loss versus those prone to regain weight would help ease this burden by allowing clinicians to skip treatments likely to be ineffective. In this paper we introduce a new windowed approach to the persistent homology signal processing algorithm that, when paired with a modified, semimetric version of the Hausdorff distance, can differentiate the two groups where other commonly used methods fail. The novel approach is tested on accelerometer data gathered from an ongoing study at the University of Michigan. While most standard approaches to signal processing show no difference between the two groups, windowed persistent homology and the modified Hausdorff semimetric show a clear separation. This has significant implications for clinical decision making and patient care.

  19. Modeling and simulation of continuous powder blending applied to a continuous direct compression process.

    Science.gov (United States)

    Galbraith, Shaun C; Liu, Huolong; Cha, Bumjoon; Park, Seo-Young; Huang, Zhuangrong; Yoon, Seongkyu

    2018-01-17

    Continuous manufacturing techniques are increasingly being adopted in the pharmaceutical industry, and powder blending is a key operation for solid-dosage tablets. A modeling methodology involving axial and radial tanks-in-series flowsheet models is developed to describe the residence time distribution (RTD) and blend uniformity of a commercial powder blending system. Process data for a six-component formulation processed in a continuous direct compression line (GEA Pharma Systems) are used to test the methodology. Impulse tests were used to generate experimental RTDs, which were used along with parameter estimation to determine the number of axial tanks in the flowsheet. The weighted residual from the parameter estimation was less than the χ² value at a 95% confidence level, indicating a good fit between the model and the measured data. In-silico impulse tests showed the tanks-in-series modeling methodology could successfully describe the RTD behavior of the blenders, along with blend uniformity through the use of radial tanks. The simulation outputs for both impulse weight percentage and blend uniformity were within the experimentally observed variance.
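
    The flowsheet model itself is not reproduced in the record. The sketch below shows the underlying tanks-in-series residence time distribution, E(t) = (n/tau)*(n*t/tau)^(n-1)*exp(-n*t/tau)/Gamma(n), fitted to a synthetic impulse response with SciPy, which is the kind of parameter estimation the abstract describes; the data and initial guesses are invented.

      import numpy as np
      from scipy.optimize import curve_fit
      from scipy.special import gammaln

      def tanks_in_series(t, n, tau):
          """RTD of n ideal mixed tanks in series with total mean residence time tau."""
          t = np.clip(t, 1e-12, None)
          return np.exp((n - 1) * np.log(n * t / tau) - n * t / tau
                        - gammaln(n) + np.log(n / tau))

      # Synthetic impulse-test data: time [s] vs normalized tracer concentration
      t = np.linspace(0, 300, 61)
      true = tanks_in_series(t, 6, 90.0)
      noisy = true + np.random.default_rng(1).normal(0, 0.0002, t.size)

      (n_fit, tau_fit), _ = curve_fit(tanks_in_series, t, noisy, p0=(3, 60))
      print(n_fit, tau_fit)    # recovers roughly n = 6, tau = 90 s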

  20. APPLIED BOTANY, I. PROTECTION OF TREES AND BUSHES IN THE INVESTMENT PROCESS IN URBAN AREAS

    Directory of Open Access Journals (Sweden)

    Mariola Garczyńska

    2017-06-01

    Full Text Available Ecosystem services are the benefits resulting from resources and processes in nature. Trees and stand densities constitute a significant element of the landscape (in both urban and rural areas) and serve a number of ecosystem functions, forming an inherent part of each group of benefits singled out on the basis of the Millennium Ecosystem Assessment. In this paper, selected ecosystem services of trees and stand densities are detailed: provisioning, regulatory, supporting and cultural functions. Identifying the services performed by trees, and valuing them, may contribute to increased care and protection during various investments; it is therefore appropriate to provide multifaceted ecological education to everyone, particularly to those directly responsible for trees and bushes in towns and rural areas. In order to limit construction stress on trees and bushes, an environmental impact assessment has to be made as soon as construction planning begins (natural, cultural and landscape conditions should be provided); additionally, it is advisable to conduct a dendrological inventory for planning purposes. Appropriate protection of trees and stand densities is also regulated legally by the Nature Conservation Act and the Construction Law. During the investment process, the trees and their site conditions should be adequately secured so that their viability is not affected. After the investment is completed, the condition of the tree stand should be monitored.

  1. Integrated Intelligent Industrial Process Sensing and Control: Applied to and Demonstrated on Cupola Furnaces

    Energy Technology Data Exchange (ETDEWEB)

    Mohamed Abdelrahman; roger Haggard; Wagdy Mahmoud; Kevin Moore; Denis Clark; Eric Larsen; Paul King

    2003-02-12

    The final goal of this project was the development of a system capable of controlling an industrial process effectively through the integration of information obtained through intelligent sensor fusion and intelligent control technologies. The industry of interest in this project was the metal casting industry, as represented by cupola iron-melting furnaces; however, the developed technology is generic and hence applicable to several other industries. The system was divided into the following four major interacting components: (1) an object-oriented generic architecture to integrate the developed software and hardware components; (2) generic algorithms for intelligent signal analysis and sensor and model fusion; (3) a supervisory structure for the integration of intelligent sensor fusion data into the controller; and (4) a hardware implementation of the intelligent signal analysis and fusion algorithms.

  2. Digital image processing applied to analysis of geophysical and geochemical data for southern Missouri

    Science.gov (United States)

    Guinness, E. A.; Arvidson, R. E.; Leff, C. E.; Edwards, M. H.; Bindschadler, D. L.

    1983-01-01

    Digital image-processing techniques have been used to analyze a variety of geophysical and geochemical map data covering southern Missouri, a region with important basement and strata-bound mineral deposits. Gravity and magnetic anomaly patterns, which have been reformatted to image displays, indicate a deep crustal structure cutting northwest-southeast through southern Missouri. In addition, geologic map data, topography, and Landsat multispectral scanner images have been used as base maps for the digital overlay of aerial gamma-ray and stream sediment chemical data for the 1 x 2-deg Rolla quadrangle. Results indicate enrichment of a variety of elements within the clay-rich alluvium covering many of the interfluvial plains, as well as a complicated pattern of enrichment for the sedimentary units close to the Precambrian rhyolites and granites of the St. Francois Mountains.

  3. Personal computer (PC) based image processing applied to fluid mechanics research

    Science.gov (United States)

    Cho, Y.-C.; Mclachlan, B. G.

    1987-01-01

    A PC based image processing system was employed to determine the instantaneous velocity field of a two-dimensional unsteady flow. The flow was visualized using a suspension of seeding particles in water, and a laser sheet for illumination. With a finite time exposure, the particle motion was captured on a photograph as a pattern of streaks. The streak pattern was digitized and processed using various imaging operations, including contrast manipulation, noise cleaning, filtering, statistical differencing, and thresholding. Information concerning the velocity was extracted from the enhanced image by measuring the length and orientation of the individual streaks. The fluid velocities deduced from the randomly distributed particle streaks were interpolated to obtain velocities at uniform grid points. For the interpolation, a simple convolution technique with an adaptive Gaussian window was used. The results are compared with a numerical prediction by a Navier-Stokes computation.
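
    The paper's adaptive Gaussian window is not specified in the record. The sketch below shows the simpler fixed-width version of the same convolution idea, interpolating scattered streak-derived velocities onto a regular grid with Gaussian distance weights; the data are synthetic.

      import numpy as np

      def gaussian_grid(x, y, u, grid_x, grid_y, sigma):
          """Interpolate scattered samples u(x, y) onto a regular grid with Gaussian weights."""
          gx, gy = np.meshgrid(grid_x, grid_y)
          out = np.zeros_like(gx, dtype=float)
          for i in range(gx.shape[0]):
              for j in range(gx.shape[1]):
                  w = np.exp(-((x - gx[i, j])**2 + (y - gy[i, j])**2) / (2 * sigma**2))
                  out[i, j] = np.sum(w * u) / np.sum(w)
          return out

      rng = np.random.default_rng(0)
      x, y = rng.uniform(0, 1, 200), rng.uniform(0, 1, 200)   # random streak midpoints
      u = np.sin(2 * np.pi * x)                               # synthetic velocity component
      grid = gaussian_grid(x, y, u, np.linspace(0, 1, 20), np.linspace(0, 1, 20), sigma=0.08)
      print(grid.shape)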

  4. Applied Multi-Mission Telemetry Processing and Display for Operations, Integration, Training, Playback and Event Reconstruction

    Science.gov (United States)

    Pomerantz, Marc; Nguyen, Viet; Lee, Daren; Lim, Christopher; Huynh, Tom

    2015-01-01

    Conveying spacecraft health and status information to mission engineering personnel during various mission phases, including mission operations, is a requirement to achieve a successful mission. For NASA/JPL spacecraft, that often means displaying hundreds of telemetry channels from a variety of sensors and components emitting data at rates varying from 1 Hz to 100 Hz (and faster) in a way that allows the operations team to quickly evaluate the health of the vehicle, identify any off-nominal states and resolve any issues. In this paper we will discuss the system design, requirements and use cases of three telemetry processing and visualization systems recently developed and deployed by our team for NASA's Low Density Supersonic Decelerator (LDSD) test vehicle, NASA's Soil Moisture Active/Passive (SMAP) orbiter, and JPL's Sampling Lab Universal Robotic Manipulator (SLURM) test bed.

  5. Accident consequence analysis models applied to licensing process of nuclear installations, radioactive and conventional industries

    International Nuclear Information System (INIS)

    Senne Junior, Murillo; Vasconcelos, Vanderley de; Jordao, Elizabete

    2002-01-01

    The industrial accidents that happened in recent decades, particularly in the 1980s, contributed in a significant way to calling the attention of government authorities, industry and society as a whole, demanding mechanisms for preventing episodes that could affect people's safety and environmental quality. Techniques and methods already thoroughly used in the nuclear, aeronautic and war industries were then adapted for performing analysis and evaluation of the risks associated with other industrial activities, especially in the petroleum, chemical and petrochemical areas. Some models for analyzing the consequences of accidents involving fire and explosion, used in the licensing processes of nuclear and radioactive facilities, are presented in this paper. These models also have application in the licensing of conventional industrial facilities. (author)

  6. SITEGI Project: Applying Geotechnologies to Road Inspection. Sensor Integration and software processing

    Directory of Open Access Journals (Sweden)

    J. Martínez-Sánchez

    2013-10-01

    Full Text Available Infrastructure management represents a critical economic milestone. The current decision-making process in infrastructure rehabilitation is essentially based on qualitative parameters obtained from visual inspections and subject to the ability of technicians. In order to increase both efficiency and productivity in infrastructure management, this work addresses the integration of different instrumentation and sensors in a mobile mapping vehicle. This vehicle allows the continuous recording of quantitative data suitable for roadside inspection. The geometric integration and synchronization of these sensors is achieved through hardware and/or software strategies that permit the georeferencing of the data obtained with each sensor. In addition, a visualization software for simpler data management was implemented using Qt framework, PCL library and C++. As a result, the developed system supports the decision-making in road inspection, providing quantitative information suitable for sophisticated analysis systems.

  7. Applying fenton process in acrylic fiber wastewater treatment and practice teaching

    Science.gov (United States)

    Zhang, Chunhui; Jiang, Shan

    2018-02-01

    Acrylic fiber manufacturing wastewater, which contains a wide range of pollutants, a high concentration of refractory organics, and poisonous and harmful matter, is important to treat in the effluents of wastewater treatment plants (WWTPs). In this work, a Fenton reactor was employed for advanced treatment of the WWTP effluents. An orthogonal test and a parametric study were carried out to determine the effect of the main operating conditions, and the Fenton process attained excellent performance in the degradation of pollutants under the optimal conditions of a ferrous dosage of 6.25 mM, a hydrogen peroxide dosage of 75 mM and an initial pH value of 3.0, with a 90 min reaction time. The removal efficiencies of COD, TOC, NH4+-N and TN ranged from 45% to 69%. Lastly, as teaching practice, the Fenton reactor was also used effectively in practical teaching.

  8. A sampler of useful computational tools for applied geometry, computer graphics, and image processing foundations for computer graphics, vision, and image processing

    CERN Document Server

    Cohen-Or, Daniel; Ju, Tao; Mitra, Niloy J; Shamir, Ariel; Sorkine-Hornung, Olga; Zhang, Hao (Richard)

    2015-01-01

    A Sampler of Useful Computational Tools for Applied Geometry, Computer Graphics, and Image Processing shows how to use a collection of mathematical techniques to solve important problems in applied mathematics and computer science areas. The book discusses fundamental tools in analytical geometry and linear algebra. It covers a wide range of topics, from matrix decomposition to curvature analysis and principal component analysis to dimensionality reduction.Written by a team of highly respected professors, the book can be used in a one-semester, intermediate-level course in computer science. It

  9. The Design Process of Physical Security as Applied to a U.S. Border Point of Entry

    Energy Technology Data Exchange (ETDEWEB)

    Wagner, G.G.

    1998-10-26

    This paper describes the design process of physical security as applied to a U.S. Border Port of Entry (PoE). Included in this paper are descriptions of the elements that compose U.S. border security. The physical security design will describe the various elements that make up the process as well as the considerations that must be taken into account when dealing with system integration of those elements. The distinctions between preventing unlawful entry and exit of illegal contraband will be emphasized.

  10. Does applying technology throughout the medication use process improve patient safety with antineoplastics?

    Science.gov (United States)

    Bubalo, Joseph; Warden, Bruce A; Wiegel, Joshua J; Nishida, Tess; Handel, Evelyn; Svoboda, Leanne M; Nguyen, Lam; Edillo, P Neil

    2014-12-01

    Medical errors, in particular medication errors, continue to be a troublesome factor in the delivery of safe and effective patient care. Antineoplastic agents represent a group of medications highly susceptible to medication errors due to their complex regimens and narrow therapeutic indices. As the majority of these medication errors are frequently associated with breakdowns in poorly defined systems, developing technologies and evolving workflows seem to be a logical approach to provide added safeguards against medication errors. This article will review both the pros and cons of today's technologies and their ability to simplify the medication use process, reduce medication errors, improve documentation, improve healthcare costs and increase provider efficiency as relates to the use of antineoplastic therapy throughout the medication use process. Several technologies, mainly computerized provider order entry (CPOE), barcode medication administration (BCMA), smart pumps, electronic medication administration record (eMAR), and telepharmacy, have been well described and proven to reduce medication errors, improve adherence to quality metrics, and/or improve healthcare costs in a broad scope of patients. The utilization of these technologies during antineoplastic therapy is weak at best and lacking for most. Specific to the antineoplastic medication use system, the only technology with data to adequately support a claim of reduced medication errors is CPOE. In addition to the benefits these technologies can provide, it is also important to recognize their potential to induce new types of errors and inefficiencies which can negatively impact patient care. The utilization of technology reduces but does not eliminate the potential for error. The evidence base to support technology in preventing medication errors is limited in general but even more deficient in the realm of antineoplastic therapy. Though CPOE has the best evidence to support its use in the

  11. Acquisition, processing, and visualization of big data as applied to robust multivariate impact models

    Science.gov (United States)

    Romeo, L.; Rose, K.; Bauer, J. R.; Dick, D.; Nelson, J.; Bunn, A.; Buenau, K. E.; Coleman, A. M.

    2016-02-01

    Increased offshore oil exploration and production emphasizes the need for environmental, social, and economic impact models that require big data from disparate sources to conduct thorough multi-scale analyses. The National Energy Technology Laboratory's Cumulative Spatial Impact Layers (CSILs) and Spatially Weighted Impact Model (SWIM) are user-driven flexible suites of GIS-based tools that can efficiently process, integrate, visualize, and analyze a wide variety of big datasets that are acquired to better to understand potential impacts for oil spill prevention and response readiness needs. These tools provide solutions to address a range of stakeholder questions and aid in prioritization decisions needed when responding to oil spills. This is particularly true when highlighting ecologically sensitive areas and spatially analyzing which species may be at risk. Model outputs provide unique geospatial visualizations of potential impacts and informational reports based on user preferences. The spatio-temporal capabilities of these tools can be leveraged to a number of anthropogenic and natural disasters enabling decision-makers to be better informed to potential impacts and response needs.

  12. Radioscopy applied to the improvement of industrial processes of quality control in the Brazilian footwear production

    International Nuclear Information System (INIS)

    Fernandes, Marcela Tatiana Fernandes; Mello Filho, Mauro Otto de Cavalcanti; Raupp, Fernanda Maria Pereira

    2013-01-01

    According to the Ministry of Development, Industry and Foreign Trade, China has led Brazilian footwear imports for the last five years, accounting for 70% of total imports. Brazil has been recording declines in footwear exports; in 2011 there was an average reduction of 21.5% compared to 2010, and Brazil has thus moved to eighth position in the export market. Moreover, Asian producers have been improving the quality and technological level of their footwear for niche markets. It is well known that the introduction of new technologies into industrial organizations makes it possible to add value to their products, making the organizations more competitive in the global market. In this work, we present a study on the use of the radioscopy technique to improve quality control in the Brazilian footwear industry. Already used by some international footwear manufacturers for the identification of foreign bodies and the inspection of heels, among other aspects, this technique brings innovation to the referred industry, since it is a non-destructive testing approach that makes use of X-rays. We also propose a tool for the application of the radioscopy technique to improve quality control processes in footwear production, employing concepts of Failure Modes and Effects Analysis (FMEA). (author)

  13. EMPIRICAL MODELS FOR PERFORMANCE OF DRIPPERS APPLYING CASHEW NUT PROCESSING WASTEWATER

    Directory of Open Access Journals (Sweden)

    KETSON BRUNO DA SILVA

    2016-01-01

    Full Text Available The objective of this work was to develop empirical models for the hydraulic performance of drippers operating with cashew nut processing wastewater as a function of operating time, operating pressure and effluent quality. The experiment consisted of two factors: types of drippers (D1 = 1.65 L h-1, D2 = 2.00 L h-1 and D3 = 4.00 L h-1) and operating pressures (70, 140, 210 and 280 kPa), with three replications. The flow variation coefficient (FVC), distribution uniformity coefficient (DUC) and the physicochemical and biological characteristics of the effluent were evaluated every 20 hours until completing 160 hours of operation. Data were interpreted through simple and multiple linear stepwise regression models. The regression models fitted to the FVC and DUC as a function of operating time were square root, linear and quadratic in 17%, 17% and 8%, and 17%, 17% and 0% of cases, respectively. The regression models fitted to the FVC and DUC as a function of operating pressure were square root, linear and quadratic in 11%, 22% and 0%, and 0%, 22% and 11% of cases, respectively. Multiple linear regression showed that the dissolved solids content is the main wastewater characteristic that interferes with the FVC and DUC values of the drip units D1 (1.65 L h-1) and D3 (4.00 L h-1) operating at a working pressure of 70 kPa (P1).

  14. Technology Readiness Level Assessment Process as Applied to NASA Earth Science Missions

    Science.gov (United States)

    Leete, Stephen J.; Romero, Raul A.; Dempsey, James A.; Carey, John P.; Cline, Helmut P.; Lively, Carey F.

    2015-01-01

    Technology assessments of fourteen science instruments were conducted within NASA using the NASA Technology Readiness Level (TRL) Metric. The instruments were part of three NASA Earth Science Decadal Survey missions in pre-formulation. The Earth Systematic Missions Program (ESMP) Systems Engineering Working Group (SEWG), composed of members of three NASA Centers, provided a newly modified electronic workbook to be completed, with instructions. Each instrument development team performed an internal assessment of its technology status, prepared an overview of its instrument, and completed the workbook with the results of its assessment. A team from the ESMP SEWG met with each instrument team and provided feedback. The instrument teams then reported through the Program Scientist for their respective missions to NASA's Earth Science Division (ESD) on technology readiness, taking the SEWG input into account. The instruments were found to have a range of TRL from 4 to 7. Lessons Learned are presented; however, due to the competition-sensitive nature of the assessments, the results for specific missions are not presented. The assessments were generally successful, and produced useful results for the agency. The SEWG team identified a number of potential improvements to the process. Particular focus was on ensuring traceability to guiding NASA documents, including the NASA Systems Engineering Handbook. The TRL Workbook has been substantially modified, and the revised workbook is described.

  15. Relationship between arsenic content of food and water applied for food processing.

    Science.gov (United States)

    Sugár, Eva; Tatár, Enikő; Záray, Gyula; Mihucz, Victor G

    2013-12-01

    As part of a survey conducted by the Central Agricultural Office of Hungary, 67 food samples including beverages were taken from 57 food industrial and catering companies, 75% of them being small and medium-sized enterprises (SMEs). Moreover, 40% of the SMEs were micro entities. Water used for food processing was simultaneously sampled. The arsenic (As) content of solid food stuff was determined by hydride generation atomic absorption spectrometry after dry ashing. Food stuff with high water content and water samples were analyzed by inductively coupled plasma mass spectrometry. The As concentration exceeded 10 μg/L in 74% of the water samples taken from SMEs. The As concentrations of samples with high water content and water used were linearly correlated. Estimated As intake from combined exposure to drinking water and food of the population was on average 40% of the daily lower limit of WHO on the benchmark dose for a 0.5% increased incidence of lung cancer (BMDL0.5) for As. Five settlements had higher As intake than the BMDL0.5. Three of these settlements are situated in Csongrád county and the distance between them is less than 55 km. The maximum As intake might be 3.8 μg/kg body weight. Copyright © 2013 Elsevier Ltd. All rights reserved.

  16. Statistical process control applied to mechanized peanut sowing as a function of soil texture.

    Directory of Open Access Journals (Sweden)

    Cristiano Zerbato

    Full Text Available The successful establishment of agricultural crops depends on sowing quality, machinery performance, soil type and conditions, among other factors. This study evaluates the operational quality of mechanized peanut sowing in three soil types (sand, silt, and clay) with variable moisture contents. The experiment was conducted in three locations in the state of São Paulo, Brazil. The track-sampling scheme was used for 80 sampling locations of each soil type. Descriptive statistics and statistical process control (SPC) were used to evaluate the quality indicators of mechanized peanut sowing. The variables had normal distributions and were stable from the viewpoint of SPC. The best performance for peanut sowing density, normal spacing, and the initial seedling growing stand was found for clayey soil followed by sandy soil and then silty soil. Sandy or clayey soils displayed similar results regarding sowing depth, which was deeper than in the silty soil. Overall, the texture and the moisture of clayey soil provided the best operational performance for mechanized peanut sowing.
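
    A minimal sketch of the statistical process control step is shown below: an individuals (I-MR) control chart whose limits are estimated from the average moving range. The sowing-density readings are simulated, not the study's field data.

```python
import numpy as np

# Hypothetical sowing-density readings (seeds per meter) from 80 sampling points.
rng = np.random.default_rng(7)
x = rng.normal(loc=16.0, scale=0.8, size=80)

# Individuals (I-MR) chart: center line and 3-sigma limits estimated
# from the average moving range (d2 = 1.128 for subgroups of size 2).
moving_range = np.abs(np.diff(x))
mr_bar = moving_range.mean()
center = x.mean()
sigma_hat = mr_bar / 1.128
ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

# Points outside the control limits would indicate an unstable process.
out_of_control = np.where((x > ucl) | (x < lcl))[0]
print(f"CL={center:.2f}  UCL={ucl:.2f}  LCL={lcl:.2f}")
print("Out-of-control points:", out_of_control.tolist() or "none")
```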

  17. A self-organizing algorithm for vector quantizer design applied to signal processing.

    Science.gov (United States)

    Madeiro, F; Vilar, R M; Fechine, J M; Neto, B G

    1999-06-01

    Vector quantization plays an important role in many signal processing problems, such as speech/speaker recognition and signal compression. This paper presents an unsupervised algorithm for vector quantizer design. Although the proposed method is inspired by Kohonen learning, it does not incorporate the classical definition of topological neighborhood as an array of nodes. Simulations are carried out to compare the performance of the proposed algorithm, named SOA (self-organizing algorithm), to that of the traditional LBG (Linde-Buzo-Gray) algorithm. The authors present an evaluation concerning the codebook design for Gauss-Markov and Gaussian sources, since the theoretic optimal performance bounds for these sources, as described by Shannon's Rate-Distortion Theory, are known. In speech and image compression, SOA codebooks lead to reconstructed (vector-quantized) signals with better quality as compared to the ones obtained by using LBG codebooks. Additionally, the influence of the initial codebook on the algorithm performance is investigated and the algorithm's ability to learn representative patterns is evaluated. In a speaker identification system, it is shown that the codebooks designed by SOA lead to higher identification rates when compared to the ones designed by LBG.
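
    The record compares the proposed SOA against the traditional LBG algorithm. The sketch below implements only the baseline LBG (generalized Lloyd) codebook design on a synthetic correlated source; the SOA itself is not reproduced here.

```python
import numpy as np

def lbg(train, codebook_size, n_iter=20, seed=0):
    """Classical LBG / generalized Lloyd vector quantizer design."""
    rng = np.random.default_rng(seed)
    # Initialize the codebook with randomly chosen training vectors.
    codebook = train[rng.choice(len(train), codebook_size, replace=False)].copy()
    for _ in range(n_iter):
        # Nearest-neighbour partition of the training set.
        d = np.linalg.norm(train[:, None, :] - codebook[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Centroid update; keep the old codevector if a cell is empty.
        for k in range(codebook_size):
            members = train[labels == k]
            if len(members):
                codebook[k] = members.mean(axis=0)
    return codebook

# Synthetic correlated training vectors (dimension 4), a crude stand-in
# for a Gauss-Markov source.
rng = np.random.default_rng(1)
train = rng.normal(size=(2000, 4)).cumsum(axis=1)
cb = lbg(train, codebook_size=16)

# Average distortion (MSE per vector) of the designed codebook.
d = np.linalg.norm(train[:, None, :] - cb[None, :, :], axis=2).min(axis=1)
print("mean squared distortion:", np.mean(d ** 2))
```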

  18. Dynamic laser speckle applied to the analysis of maturation process of irradiated fresh fruits

    Science.gov (United States)

    Vincitorio, F. M.; Budini, N.; Freyre, C.; Mulone, C.; Fiorucci, M. P.; López, A. J.; Ramil, A.

    2012-10-01

    The treatment of fresh fruits with different doses of ionizing radiation has been found effective for delaying ripening and, in this way, extending shelf life. This preservation method is likely to produce some functional or constitutive changes in the cellular structure of the fruit. In this work, a test of the effectiveness of fruit irradiation with relatively low doses was performed using dynamic speckle imaging. Bananas from the same lot were chosen, with a first series irradiated with doses of 0.2, 0.4 and 0.6 kGy (Gy = J/kg) and a second series with doses of 0.2, 0.4, 0.6 and 1 kGy. Non-irradiated bananas (0 kGy) were considered as the lot reference for contrast. Irradiation was carried out at the Semi-Industrial Cobalt-60 facility of the Ezeiza Atomic Center, with an activity of 6 × 10^5 Ci and a dose rate of 28.5 Gy/min. The objective of this work is to analyze differences in the maturation process between irradiated and non-irradiated fruits by means of dynamic speckle pattern evaluation.

  19. Applying a Data Stewardship Maturity Matrix to the NOAA Observing System Portfolio Integrated Assessment Process

    Science.gov (United States)

    Peng, G.; Austin, M.

    2017-12-01

    Identification and prioritization of targeted user community needs are not always considered until after data has been created and archived. Gaps in data curation and documentation in the data production and delivery phases limit the data's broad utility, specifically for decision makers. Expert understanding and knowledge of a particular dataset is often required as part of the data and metadata curation process to establish the credibility of the data and support informed decision-making. To enhance curation practices, content from NOAA's Observing System Integrated Assessment (NOSIA) Value Tree and NOAA's Data Catalog/Digital Object Identifier (DOI) projects (collection-level metadata) has been integrated with Data/Stewardship Maturity Matrices (data and stewardship quality information) focused on assessment of user community needs. This results in user-focused, evidence-based decision-making tools created by NOAA's National Environmental Satellite, Data, and Information Service (NESDIS) through identification and assessment of data content gaps related to scientific knowledge and application to key areas of societal benefit. Enabling user-need feedback from the beginning of data creation through archiving allows users to determine the quality and value of data that is fit for purpose. Data gap assessment and prioritization are presented in a user-friendly way using the data stewardship maturity matrices as a measurement of data management quality. These decision-maker tools encourage data producers and data providers/stewards to consider users' needs prior to data creation and dissemination, resulting in user-driven data requirements and increasing return on investment. A use case focused on the need for NOAA observations linked to societal benefit will be used to demonstrate the value of these tools.

  20. Applying the Analytic Hierarchy Process to Oil Sands Environmental Compliance Risk Management

    Science.gov (United States)

    Roux, Izak Johannes, III

    Oil companies in Alberta, Canada, invested $32 billion on new oil sands projects in 2013. Despite the size of this investment, there is a demonstrable deficiency in the uniformity and understanding of environmental legislation requirements that manifests in increased project compliance risks. This descriptive study developed 2 prioritized lists of environmental regulatory compliance risks and mitigation strategies and used multi-criteria decision theory for its theoretical framework. Information from compiled lists of environmental compliance risks and mitigation strategies was used to generate a specialized pairwise survey, which was piloted by 5 subject matter experts (SMEs). The survey was validated by a sample of 16 SMEs, after which the Analytic Hierarchy Process (AHP) was used to rank a total of 33 compliance risks and 12 mitigation strategy criteria. A key finding was that the AHP is a suitable tool for ranking compliance risks and mitigation strategies. Several working hypotheses were also tested regarding how SMEs prioritized 1 compliance risk or mitigation strategy compared to another. The AHP showed that regulatory compliance, company reputation, environmental compliance, and economics ranked the highest and that a multi-criteria mitigation strategy for environmental compliance ranked the highest. The study results will inform Alberta oil sands industry leaders about the ranking and utility of specific compliance risks and mitigation strategies, enabling them to focus on actions that will generate legislative and public trust. Oil sands leaders implementing a risk management program using the risks and mitigation strategies identified in this study will contribute to environmental conservation, economic growth, and positive social change.
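
    A minimal sketch of the core AHP computation is given below: the priority vector is taken as the principal eigenvector of a pairwise comparison matrix and checked with Saaty's consistency ratio. The 3x3 matrix is a hypothetical example, not the study's 33-risk survey data.

```python
import numpy as np

# Hypothetical 3x3 pairwise comparison matrix for three compliance-risk
# criteria (Saaty 1-9 scale); A[i, j] = importance of i relative to j.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# Priority vector = principal right eigenvector, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Saaty consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI.
n = A.shape[0]
lambda_max = eigvals[k].real
ci = (lambda_max - n) / (n - 1)
ri = 0.58  # random index for n = 3
cr = ci / ri

print("weights:", np.round(w, 3), " CR =", round(cr, 3), "(acceptable if < 0.1)")
```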

  1. FOUR SQUARE WRITING METHOD APPLIED IN PRODUCT AND PROCESS BASED APPROACHES COMBINATION TO TEACHING WRITING DISCUSSION TEXT

    Directory of Open Access Journals (Sweden)

    Vina Agustiana

    2017-12-01

    Full Text Available The Four Square Writing Method is a writing method which helps students organize concepts for writing by using a graphic organizer. This study aims to examine the influence of applying FSWM in a combination of product- and process-based approaches to teaching the writing of discussion texts on students' writing skill, the teaching-learning writing process, and the students' attitude toward the implementation of the writing method. This study applies a mixed-method, embedded design. 26 EFL university students of a private university in West Java, Indonesia, are involved in the study. Three kinds of instruments were used, namely tests (pre- and post-test), field notes, and questionnaires. Data taken from the students' writing tests are analyzed statistically to identify the influence of applying the writing method on students' writing skill; data taken from field notes are analyzed qualitatively to examine the learning and writing activities at the time the writing method is implemented; and data taken from questionnaires are analyzed with descriptive statistics to explore students' attitude toward the implementation of the writing method. Regarding the result of the paired t-test, the writing method is effective in improving students' writing skill since the level of significance (two-tailed) is less than alpha (0.000 < 0.05). Furthermore, the results taken from the field notes show that each step applied and the graphic organizer used in the writing method lead students to compose discussion texts which meet the demands of the genre. In addition, with regard to the results taken from the questionnaire, the students show a highly positive attitude toward the treatment since the mean score is 4.32.
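
    The effectiveness claim above rests on a paired t-test of pre- and post-test scores. The sketch below runs that test on fabricated placeholder scores for 26 students, assuming scipy is available; it is not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical pre-test and post-test writing scores for 26 students.
rng = np.random.default_rng(3)
pre = rng.normal(65, 8, size=26)
post = pre + rng.normal(7, 5, size=26)  # assume an average gain after treatment

# Paired (dependent-samples) t-test on the score differences.
t_stat, p_two_tailed = stats.ttest_rel(post, pre)
print(f"t = {t_stat:.2f}, two-tailed p = {p_two_tailed:.4f}")
# Reject H0 (no improvement) at alpha = 0.05 when p < 0.05.
```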

  2. Features, events, processes, and safety factor analysis applied to a near-surface low-level radioactive waste disposal facility

    Energy Technology Data Exchange (ETDEWEB)

    Stephens, M.E.; Dolinar, G.M.; Lange, B.A. [Atomic Energy of Canada Limited, Ontario (Canada)] [and others]

    1995-12-31

    An analysis of features, events, processes (FEPs) and other safety factors was applied to AECL's proposed IRUS (Intrusion Resistant Underground Structure) near-surface LLRW disposal facility. The FEP analysis process which had been developed for and applied to high-level and transuranic disposal concepts was adapted for application to a low-level facility for which significant efforts in developing a safety case had already been made. The starting point for this process was a series of meetings of the project team to identify and briefly describe FEPs or safety factors which they thought should be considered. At this early stage participants were specifically asked not to screen ideas. This initial list was supplemented by selecting FEPs documented in other programs and comments received from an initial regulatory review. The entire list was then sorted by topic and common issues were grouped, and issues were classified in three priority categories and assigned to individuals for resolution. In this paper, the issue identification and resolution process will be described, from the initial description of an issue to its resolution and inclusion in the various levels of the safety case documentation.

  3. Respirometry applied for biological nitrogen removal process; Aplicacion de la respirometria al tratamiento biologico para la eliminacion del nitrogeno

    Energy Technology Data Exchange (ETDEWEB)

    Serrano, E.

    2004-07-01

    In waste water treatment plants, Biological Nitrogen Removal (BNR) has acquired fundamental importance. The BNR processes are Nitrification (aerobic) and Denitrification (anoxic). Since both processes are carried out by living microorganisms, a lack of information on their bioactivity can cause serious confusion about control criteria and follow-up. For this reason, respirometry applied to these processes has come to play an important role by providing essential information in a timely manner through respiration rate measurements in static and dynamic modes, and through applications such as AUR (Ammonium Uptake Rate), Nitrification Capacity and RBCOD (Readily Biodegradable COD), as well as AUR related to SRT (sludge age), RBCOD related to NUR (Specific Nitrate Uptake Rate) and others. In addition, in this article we introduce a less well-known application related to denitrification, concerning methanol acclimatization and the generated bioactivity. (Author) 6 refs.

  4. Applying the Quadruple Process model to evaluate change in implicit attitudinal responses during therapy for panic disorder.

    Science.gov (United States)

    Clerkin, Elise M; Fisher, Christopher R; Sherman, Jeffrey W; Teachman, Bethany A

    2014-01-01

    This study explored the automatic and controlled processes that may influence performance on an implicit measure across cognitive-behavioral group therapy for panic disorder. The Quadruple Process model was applied to error scores from an Implicit Association Test evaluating associations between the concepts Me (vs. Not Me) + Calm (vs. Panicked) to evaluate four distinct processes: Association Activation, Detection, Guessing, and Overcoming Bias. Parameter estimates were calculated in the panic group (n = 28) across each treatment session where the IAT was administered, and at matched times when the IAT was completed in the healthy control group (n = 31). Association Activation for Me + Calm became stronger over treatment for participants in the panic group, demonstrating that it is possible to change automatically activated associations in memory (vs. simply overriding those associations) in a clinical sample via therapy. As well, the Guessing bias toward the calm category increased over treatment for participants in the panic group. This research evaluates key tenets about the role of automatic processing in cognitive models of anxiety, and emphasizes the viability of changing the actual activation of automatic associations in the context of treatment, versus only changing a person's ability to use reflective processing to overcome biased automatic processing. Copyright © 2013 Elsevier Ltd. All rights reserved.

  5. State and parameter estimation based on a nonlinear filter applied to an industrial process control of ethanol production

    Directory of Open Access Journals (Sweden)

    Meleiro L.A.C.

    2000-01-01

    Full Text Available Most advanced computer-aided control applications rely on good dynamic process models. The performance of the control system depends on the accuracy of the model used. Typically, such models are developed by conducting off-line identification experiments on the process. These identification experiments often result in input-output data with a small output signal-to-noise ratio, and using these data results in inaccurate model parameter estimates [1]. In this work, a multivariable adaptive self-tuning controller (STC) was developed for a biotechnological process application. Due to the difficulties involving the measurements, or the excessive number of variables normally found in industrial processes, it is proposed to develop "soft-sensors", which are based fundamentally on artificial neural networks (ANN). A second approach was based on hybrid models, resulting from the association of deterministic models (which incorporate the available prior knowledge about the process being modeled) with artificial neural networks. In this case, kinetic parameters - which are very hard to determine accurately during real-time operation of industrial plants - were obtained using ANN predictions. These methods are especially suitable for the identification of time-varying and nonlinear models. This advanced control strategy was applied to a fermentation process producing ethyl alcohol (ethanol) on an industrial scale. The reaction rates considered for substrate consumption, cell and ethanol production were validated with industrial data for typical operating conditions. The results obtained show that the procedure proposed in this work has great potential for application.
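
    As a rough illustration of the soft-sensor idea described above, the sketch below trains a small neural network to infer a hard-to-measure variable (labeled here as biomass concentration) from easily measured online variables. The variables, data, and network size are assumptions for demonstration, not the industrial plant model.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

# Synthetic "plant" data: easily measured online variables (temperature,
# CO2 off-gas rate, broth density) and a hard-to-measure target (biomass).
rng = np.random.default_rng(0)
X = rng.uniform([30.0, 0.5, 1.00], [36.0, 3.0, 1.08], size=(500, 3))
biomass = 2.0 * X[:, 1] + 50.0 * (X[:, 2] - 1.0) + rng.normal(0, 0.1, 500)

X_tr, X_te, y_tr, y_te = train_test_split(X, biomass, random_state=0)

# Neural-network soft-sensor: scale inputs, then map them to the target.
soft_sensor = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=3000, random_state=0),
)
soft_sensor.fit(X_tr, y_tr)
print("R^2 on held-out data:", round(soft_sensor.score(X_te, y_te), 3))
```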

  6. Applying the Quadruple Process Model to Evaluate Change in Implicit Attitudinal Responses During Therapy for Panic Disorder

    Science.gov (United States)

    Clerkin, Elise M.; Fisher, Christopher R.; Sherman, Jeffrey W.; Teachman, Bethany A.

    2013-01-01

    Objective This study explored the automatic and controlled processes that may influence performance on an implicit measure across cognitive-behavioral group therapy for panic disorder. Method The Quadruple Process model was applied to error scores from an Implicit Association Test evaluating associations between the concepts Me (vs. Not Me) + Calm (vs. Panicked) to evaluate four distinct processes: Association Activation, Detection, Guessing, and Overcoming Bias. Parameter estimates were calculated in the panic group (n=28) across each treatment session where the IAT was administered, and at matched times when the IAT was completed in the healthy control group (n=31). Results Association Activation for Me + Calm became stronger over treatment for participants in the panic group, demonstrating that it is possible to change automatically activated associations in memory (vs. simply overriding those associations) in a clinical sample via therapy. As well, the Guessing bias toward the calm category increased over treatment for participants in the panic group. Conclusions This research evaluates key tenets about the role of automatic processing in cognitive models of anxiety, and emphasizes the viability of changing the actual activation of automatic associations in the context of treatment, versus only changing a person’s ability to use reflective processing to overcome biased automatic processing. PMID:24275066

  7. Application of structured flowsheets to global evaluation of tank waste processing alternatives

    International Nuclear Information System (INIS)

    Jansen, G.; Knutson, B.J.; Niccoli, L.G.; Frank, D.D.

    1994-01-01

    Remediation of the Hanford waste tanks requires integration of chemical technologies and evaluation of alternatives from the perspective of the overall Hanford cleanup purpose. The use of Design/IDEF (R) logic to connect chemical process functions to the overall cleanup mission in the Hanford Strategic Analysis (HSA) and to Aspen Plus (R) process models can show the effect of each process step on global performance measures such as safety, cost, and public perception. This hybrid of chemical process analysis and systems engineering produces structured material balance flowsheets at any level of process aggregation within the HSA. Connectivity and consistent process and stream nomenclature are automatically transferred between detailed process models, the HSA top purpose, and the global material balance flowsheet evaluation. Application to separation processes is demonstrated for a generic Truex-Sludge Wash flowsheet with many process options and for the aggregation of a Clean Option flowsheet from a detailed chemical process level to a global evaluation level.

  8. Interpretation of sedimentological processes of coarse-grained deposits applying a novel combined cluster and discriminant analysis

    Directory of Open Access Journals (Sweden)

    Farics Éva

    2017-10-01

    Full Text Available The main aim of this paper is to determine the depositional environments of an Upper-Eocene coarse-grained clastic succession in the Buda Hills, Hungary. First of all, we measured some commonly used parameters of samples (size, amount, roundness and sphericity) in a much more objective overall and faster way than with traditional measurement approaches, using the newly developed Rock Analyst application. For the multivariate data obtained, we applied Combined Cluster and Discriminant Analysis (CCDA) in order to determine homogeneous groups of the sampling locations based on the quantitative composition of the conglomerate as well as the shape parameters (roundness and sphericity). The result is the spatial pattern of these groups, which assists with the interpretation of the depositional processes. According to our concept, those sampling sites which belong to the same homogeneous groups were likely formed under similar geological circumstances and by similar geological processes.
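
    CCDA is a specific published method; the sketch below is not CCDA itself, but a simplified stand-in for the same general workflow: group sampling sites by their clast composition and shape descriptors, then check how cleanly a linear discriminant separates the resulting groups. All input values are hypothetical.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Hypothetical clast descriptors per sampling site: fractions of three
# lithologies plus mean roundness and sphericity (rows = sites).
rng = np.random.default_rng(0)
X = np.vstack([
    rng.normal([0.6, 0.3, 0.1, 0.55, 0.70], 0.05, size=(20, 5)),
    rng.normal([0.2, 0.5, 0.3, 0.40, 0.60], 0.05, size=(20, 5)),
    rng.normal([0.3, 0.2, 0.5, 0.70, 0.80], 0.05, size=(20, 5)),
])

# Step 1: group the sites (stand-in for the clustering half of the workflow).
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# Step 2: check how well a linear discriminant separates the groups;
# high re-classification accuracy suggests genuinely homogeneous groups.
lda = LinearDiscriminantAnalysis().fit(X, labels)
print("re-classification accuracy:", round(lda.score(X, labels), 3))
```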

  9. Automated chromatographic system with polarimetric detection laser applied in the control of fermentation processes and seaweed extracts characterization

    International Nuclear Information System (INIS)

    Fajer, V.; Naranjo, S.; Mora, W.; Patinno, R.; Coba, E.; Michelena, G.

    2012-01-01

    Applications and innovations of chromatographic and polarimetric systems are presented, in which methodologies were developed for measuring the input molasses and the resulting product of an alcohol fermentation process from rich honey, and for evaluating a honey fermentation process used to obtain a drink native to the Yucatan region. The composition of optically active substances in seaweed, of interest to the pharmaceutical industry, was also assessed. The findings provide alternative measurements for raw materials and products of the sugar, beekeeping and pharmaceutical industries. Liquid chromatography with automated polarimetric detection reduces measurement times to as little as 15 min, making it comparable to high-resolution chromatography and significantly reducing operating costs. The chromatography system with polarimetric detection (SCDP) now includes new standard-size columns designed by the authors, which allow samples with volumes up to 1 ml to be processed and reduce the measurement time to 15 min, decreasing the sample volume five-fold and halving the measurement time. The determination of substance concentrations from the peaks of the chromatograms obtained for the different columns was evaluated, and the measurement uncertainty was calculated. The results relating to the improvement of a data acquisition program (ADQUIPOL v.2.0) and new programs for the preparation of chromatograms (CROMAPOL v.1.0 and v.1.2) provide important benefits, which allow a considerable saving of time in the processing of the results and can be applied in other chromatography systems with the appropriate adjustments. (Author)

  10. Central Composite Design (CCD) applied for statistical optimization of glucose and sucrose binary carbon mixture in enhancing the denitrification process

    Science.gov (United States)

    Lim, Jun-Wei; Beh, Hoe-Guan; Ching, Dennis Ling Chuan; Ho, Yeek-Chia; Baloo, Lavania; Bashir, Mohammed J. K.; Wee, Seng-Kew

    2017-11-01

    The present study provides an insight into the optimization of a glucose and sucrose mixture to enhance the denitrification process. Central Composite Design was applied to design the batch experiments with the factors of glucose and sucrose, each measured as a carbon-to-nitrogen (C:N) ratio, and the response of percentage removal of nitrate-nitrogen (NO3−-N). Results showed that the polynomial regression model of NO3−-N removal was successfully derived, capable of describing the interactive relationships of the glucose and sucrose mixture that influenced the denitrification process. Furthermore, the presence of glucose was noticed to have a more consequential effect on NO3−-N removal as opposed to sucrose. The optimum carbon source mixture to achieve complete removal of NO3−-N required less glucose (C:N ratio of 1.0:1.0) than sucrose (C:N ratio of 2.4:1.0). At the optimum glucose and sucrose mixture, the activated sludge showed faster acclimation towards the glucose used to perform the denitrification process. Later, upon acclimation with sucrose, the glucose uptake rate by the activated sludge abated. Therefore, it is vital to optimize the added carbon source mixture to ensure the rapid and complete removal of NO3−-N via the denitrification process.
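
    The sketch below illustrates the Central Composite Design workflow in coded units for two factors: build the factorial, axial, and center points, fit a full quadratic response surface by least squares, and locate its stationary point. The response values are hypothetical, not the study's measurements.

```python
import numpy as np

# Two-factor central composite design in coded units (glucose and sucrose
# C:N ratios): 4 factorial, 4 axial (alpha = sqrt(2)) and 1 center point.
a = np.sqrt(2)
design = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
                   [-a, 0], [a, 0], [0, -a], [0, a], [0, 0]])

# Hypothetical NO3-N removal responses (%) at the design points.
y = np.array([62, 90, 75, 95, 55, 97, 68, 88, 93], dtype=float)

# Full quadratic response surface:
# y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
x1, x2 = design[:, 0], design[:, 1]
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
print("coefficients:", np.round(b, 2))

# Stationary point of the fitted surface (candidate optimum): solve grad = 0.
H = np.array([[2 * b[4], b[3]], [b[3], 2 * b[5]]])
grad_rhs = -np.array([b[1], b[2]])
print("stationary point (coded units):", np.round(np.linalg.solve(H, grad_rhs), 2))
```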

  11. Managing Zirconium Chemistry and Phase Compatibility in Combined Process Separations for Minor Actinide Partitioning

    Energy Technology Data Exchange (ETDEWEB)

    Wall, Nathalie [Washington State Univ., Pullman, WA (United States)]; Nash, Ken [Washington State Univ., Pullman, WA (United States)]; Martin, Leigh [Washington State Univ., Pullman, WA (United States)]

    2017-03-17

    In response to the NEUP Program Supporting Fuel Cycle R&D Separations and Waste Forms call DE-FOA-0000799, this report describes the results of an R&D project focusing on streamlining separation processes for advanced fuel cycles. An example of such a process relevant to the U.S. DOE FCR&D program would be one combining the functions of the TRUEX process for partitioning of lanthanides and minor actinides from PUREX(UREX) raffinates with that of the TALSPEAK process for separating transplutonium actinides from fission product lanthanides. A fully-developed PUREX(UREX)/TRUEX/TALSPEAK suite would generate actinides as product(s) for reuse (or transmutation) and fission products as waste. As standalone, consecutive unit-operations, TRUEX and TALSPEAK employ different extractant solutions (solvating (CMPO, octyl(phenyl)-N,N-diisobutylcarbamoylmethylphosphine oxide) vs. cation exchanging (HDEHP, di-(2-ethylhexyl)phosphoric acid) extractants), and distinct aqueous phases (2-4 M HNO3 vs. concentrated pH 3.5 carboxylic acid buffers containing actinide selective chelating agents). The separate processes may also operate with different phase transfer kinetic constraints. Experience teaches (and it has been demonstrated at the lab scale) that, with proper control, multiple process separation systems can operate successfully. However, it is also recognized that considerable economies of scale could be achieved if multiple operations could be merged into a single process based on a combined extractant solvent. The task of accountability of nuclear materials through the process(es) also becomes more robust with fewer steps, providing that the processes can be accurately modeled. Work is underway in the U.S. and Europe on developing several new options for combined processes (TRUSPEAK, ALSEP, SANEX, GANEX, ExAm are examples). There are unique challenges associated with the operation of such processes, some relating to organic phase chemistry, others arising from the

  12. Impact on process results of clinical decision support systems (CDSSs) applied to medication use: overview of systematic reviews.

    Science.gov (United States)

    Reis, Wálleri C; Bonetti, Aline F; Bottacin, Wallace E; Reis, Alcindo S; Souza, Thaís T; Pontarolo, Roberto; Correr, Cassyano J; Fernandez-Llimos, Fernando

    2017-01-01

    The purpose of this overview (systematic review of systematic reviews) is to evaluate the impact of clinical decision support systems (CDSS) applied to medication use in the care process. A search for systematic reviews that address CDSS was performed on Medline following Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) and Cochrane recommendations. Terms related to CDSS and systematic reviews were used in combination with Boolean operators and search field tags to build the electronic search strategy. There was no limitation of date or language for inclusion. We included revisions that investigated, as a main or secondary objective, changes in process outcomes. The Revised Assessment of Multiple Systematic Reviews (R-AMSTAR) score was used to evaluate the quality of the studies. The search retrieved 954 articles. Five articles were added through manual search, totaling an initial sample of 959 articles. After screening and reading in full, 44 systematic reviews met the inclusion criteria. In the medication-use processes where CDSS was used, the most common stages were prescribing (n=38; 86.36%) and administering (n=12; 27.27%). Most of the systematic reviews demonstrated improvement in the health care process (30/44 - 68.2%). The main positive results were related to improvement of the quality of prescription by the physicians (14/30 - 46.6%) and reduction of errors in prescribing (5/30 - 16.6%). However, the quality of the studies was poor, according to the score used. CDSSs represent a promising technology to optimize the medication-use process, especially related to improvement in the quality of prescriptions and reduction of prescribing errors, although higher quality studies are needed to establish the predictors of success in these systems.

  13. Molecule-based kinetic Monte Carlo modeling of hydrotreating processes applied to Light Cycle Oil gas oils

    Science.gov (United States)

    Kolb, Max; Pereira de Oliveira, Luis; Verstraete, Jan

    2013-03-01

    A novel kinetic modeling strategy for refining processes for heavy petroleum fractions is proposed. The approach makes it possible to overcome the notorious lack of molecular detail in describing petroleum fractions. The simulation of the reaction process consists of a two-step procedure. In the first step, a mixture of molecules representing the feedstock of the process is generated via two successive molecular reconstruction algorithms. The first algorithm, termed stochastic reconstruction, generates an equimolar set of molecules with the appropriate analytical properties via a Monte Carlo method. The second algorithm, called reconstruction by entropy maximization, adjusts the molar fractions of the generated molecules in order to further improve the properties of the mixture. In the second step, a kinetic Monte Carlo method is used to simulate the effect of the refining reactions on the previously generated set of molecules. The full two-step methodology has been applied to the hydrotreating of LCO gas oils and to the hydrocracking of vacuum residues from different origins (e.g. Athabasca).
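
    The second step described above is a kinetic Monte Carlo simulation over a set of molecules. The sketch below shows a Gillespie-style kMC loop for a single lumped hydrodesulfurization reaction with assumed counts and an assumed rate constant; it is a toy stand-in, not the authors' molecule-based model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical lumped "molecules" and counts in the reconstructed mixture.
counts = {"aromatic-S": 500, "hydrocarbon": 1500, "H2S": 0}

# One lumped reaction: hydrodesulfurization, aromatic-S + H2 -> hydrocarbon + H2S
rate_constant = 0.05  # per molecule per unit time (assumed)

t, t_end = 0.0, 50.0
while t < t_end and counts["aromatic-S"] > 0:
    total_rate = rate_constant * counts["aromatic-S"]
    # Gillespie step: exponential waiting time, then apply the chosen event.
    t += rng.exponential(1.0 / total_rate)
    counts["aromatic-S"] -= 1
    counts["hydrocarbon"] += 1
    counts["H2S"] += 1

print(f"t = {t:.1f}: {counts}")
```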

  14. Zero order and signal processing spectrophotometric techniques applied for resolving interference of metronidazole with ciprofloxacin in their pharmaceutical dosage form.

    Science.gov (United States)

    Attia, Khalid A M; Nassar, Mohammed W I; El-Zeiny, Mohamed B; Serag, Ahmed

    2016-02-05

    Four rapid, simple, accurate and precise spectrophotometric methods were used for the determination of ciprofloxacin in the presence of metronidazole as an interferent. The methods under study are area under the curve and simultaneous equation methods, in addition to smart signal processing techniques for manipulating ratio spectra, namely Savitzky-Golay filters and the continuous wavelet transform. All the methods were validated according to the ICH guidelines, where accuracy, precision and repeatability were found to be within the acceptable limits. The selectivity of the proposed methods was tested using laboratory-prepared mixtures and assessed by applying the standard addition technique. They can therefore be used for the routine analysis of ciprofloxacin in quality-control laboratories. Copyright © 2015 Elsevier B.V. All rights reserved.
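
    One of the ratio-spectra manipulations mentioned above is Savitzky-Golay derivative filtering. The sketch below applies scipy's savgol_filter to the ratio of a synthetic two-component spectrum and a synthetic interferent spectrum; the Gaussian band shapes are assumptions for illustration, not the real drug spectra.

```python
import numpy as np
from scipy.signal import savgol_filter

# Synthetic spectra: Gaussian bands standing in for the analyte and the
# interferent (hypothetical shapes); wavelengths in nm.
wl = np.linspace(220, 380, 801)
band = lambda center, width: np.exp(-((wl - center) / width) ** 2)
analyte = 0.8 * band(275, 18)
interferent = 0.6 * band(320, 25)
rng = np.random.default_rng(0)
mixture = analyte + interferent + rng.normal(0, 0.002, wl.size)

# Ratio-spectra step: divide the mixture by a standard spectrum of the
# interferent, restricted to the region where the divisor is appreciable.
mask = interferent > 0.05
ratio = mixture[mask] / interferent[mask]

# Savitzky-Golay first derivative removes the constant contribution of the
# interferent, leaving a signal related to the analyte concentration.
deriv = savgol_filter(ratio, window_length=21, polyorder=3, deriv=1)
print("derivative peak-to-trough amplitude:",
      round(float(deriv.max() - deriv.min()), 3))
```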

  15. Natural Language Processing and Machine Learning (NLP/ML): Applying Advances in Biomedicine to the Earth Sciences

    Science.gov (United States)

    Duerr, R.; Myers, S.; Palmer, M.; Jenkins, C. J.; Thessen, A.; Martin, J.

    2015-12-01

    Semantics underlie many of the tools and services available from and on the web. From improving search results to enabling data mashups and other forms of interoperability, semantic technologies have proven themselves. But creating semantic resources, especially re-usable semantic resources, is extremely time consuming and labor intensive. Why? Because it is not just a matter of technology but also of obtaining rough consensus if not full agreement amongst community members on the meaning and order of things. One way to develop these resources in a more automated way would be to use NLP/ML techniques to extract the required resources from large corpora of subject-specific text such as peer-reviewed papers where presumably a rough consensus has been achieved at least about the basics of the particular discipline involved. While not generally applied to Earth Sciences, considerable resources have been spent in other fields such as medicine on these types of techniques with some success. The NSF-funded ClearEarth project is applying the techniques developed for biomedicine to the cryosphere, geology, and biology in order to spur faster development of the semantic resources needed in these fields. The first area being addressed by the project is the cryosphere, specifically sea ice nomenclature where an existing set of sea ice ontologies are being used as the "Gold Standard" against which to test and validate the NLP/ML techniques. The processes being used, lessons learned and early results will be described.

  16. Study and methodology development for quality control in the production process of iodine-125 radioactive sealed sources applied to brachytherapy

    International Nuclear Information System (INIS)

    Moura, Joao Augusto

    2009-01-01

    Today cancer is the second leading cause of death by disease in several countries, including Brazil. Excluding skin cancer, prostate cancer is the most incident in the population. Prostate tumors can be treated in several ways, including brachytherapy, which consists in introducing sealed radioactive sources (Iodine-125 seeds) inside the tumor. The target region of the treatment receives a high radiation dose, while healthy neighboring tissues receive a significantly reduced radiation dose. The seed is made of a weld-sealed titanium capsule, 0.8 mm external diameter and 4.5 mm length, enclosing a 0.5 mm diameter silver wire with Iodine-125 adsorbed. After welding, the seeds have to be submitted to a leak test to prevent any radioactive material release. The aims of this work were: (a) the study of the different leakage test methods applied to radioactive seeds and recommended by ISO 9978, (b) the choice of the appropriate method and (c) the determination of the flowchart of the process to be used during seed production. The tests went beyond the standards with the use of ultrasound during immersion and showed the corresponding benefits for leakage detection. Best results were obtained with immersion in distilled water at 20 °C for 24 hours and in distilled water at 70 °C for 30 minutes. These methods will be used during seed production. The process flowchart includes all the phases of the leakage tests according to the sequence determined in the experiments. (author)

  17. Interpretation of sedimentological processes of coarse-grained deposits applying a novel combined cluster and discriminant analysis

    Science.gov (United States)

    Farics, Éva; Farics, Dávid; Kovács, József; Haas, János

    2017-10-01

    The main aim of this paper is to determine the depositional environments of an Upper-Eocene coarse-grained clastic succession in the Buda Hills, Hungary. First of all, we measured some commonly used parameters of samples (size, amount, roundness and sphericity) in a much more objective overall and faster way than with traditional measurement approaches, using the newly developed Rock Analyst application. For the multivariate data obtained, we applied Combined Cluster and Discriminant Analysis (CCDA) in order to determine homogeneous groups of the sampling locations based on the quantitative composition of the conglomerate as well as the shape parameters (roundness and sphericity). The result is the spatial pattern of these groups, which assists with the interpretation of the depositional processes. According to our concept, those sampling sites which belong to the same homogeneous groups were likely formed under similar geological circumstances and by similar geological processes. In the Buda Hills, we were able to distinguish various sedimentological environments within the area based on the results: fan, intermittent stream or marine.

  18. Occurrence and distribution study of residues from pesticides applied under controlled conditions in the field during rice processing.

    Science.gov (United States)

    Pareja, Lucía; Colazzo, Marcos; Pérez-Parada, Andrés; Besil, Natalia; Heinzen, Horacio; Böcking, Bernardo; Cesio, Verónica; Fernández-Alba, Amadeo R

    2012-05-09

    The results of an experiment to study the occurrence and distribution of pesticide residues during rice cropping and processing are reported. Four herbicides, nine fungicides, and two insecticides (azoxystrobin, bispyribac-sodium, carbendazim, clomazone, difenoconazole, epoxiconazole, isoprothiolane, kresoxim-methyl, propanil, quinclorac, tebuconazole, thiamethoxam, tricyclazole, trifloxystrobin, λ-cyhalothrin) were applied to an isolated rice-crop plot under controlled conditions, during the 2009-2010 cropping season in Uruguay. Paddy rice was harvested and industrially processed to brown rice, white rice, and rice bran, which were analyzed for pesticide residues using the original QuEChERS methodology and its citrate variation by LC-MS/MS and GC-MS. The distribution of pesticide residues was uneven among the different matrices. Ten different pesticide residues were found in paddy rice, seven in brown rice, and eight in rice bran. The highest concentrations were detected in paddy rice. These results provide information regarding the fate of pesticides in the rice food chain and its safety for consumers.

  19. Applying a Markov approach as a Lean Thinking analysis of waste elimination in a Rice Production Process

    Directory of Open Access Journals (Sweden)

    Eldon Glen Caldwell Marin

    2015-01-01

    Full Text Available The Markov Chain Model was proposed to analyze stochastic events when recursive cycles occur; for example, when rework in a continuous flow production affects the overall performance. Typically, the analysis of rework and scrap is done from a wasted-material cost perspective and not from the perspective of wasted capacity that reduces throughput and economic value added (EVA). Also, we cannot find many cases of this application in agro-industrial production in Latin America, given the complexity of the calculations and the need for robust applications. This scientific work presents the results of a quasi-experimental research approach in order to explain how to apply DOE methods and Markov analysis to a rice production process located in Central America, evaluating the global effects of a single reduction in rework and scrap in one part of the whole line. The results show that in this case it is possible to evaluate benefits from a global throughput and EVA perspective and not only from a cost-savings perspective, finding a relationship between operational indicators and corporate performance. However, it was found that it is necessary to analyze the Markov chain configuration with many rework points; it is also still relevant to take into account the effects on takt time and not only scrap costs.
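
    The sketch below shows the kind of absorbing Markov chain calculation the record refers to: with transient states for processing and inspection and absorbing states for good output and scrap, the fundamental matrix N = (I - Q)^-1 gives the expected number of visits per station (capacity consumed by rework) and the absorption probabilities. The transition probabilities are assumed, not the mill's data.

```python
import numpy as np

# Transient states: S0 = milling, S1 = inspection; absorbing: good, scrap.
# Assumed probabilities: after inspection 80% pass, 15% go back to milling
# (rework), 5% are scrapped.
Q = np.array([[0.0, 1.0],    # milling -> inspection
              [0.15, 0.0]])  # inspection -> milling (rework loop)
R = np.array([[0.0, 0.0],    # milling -> {good, scrap}
              [0.80, 0.05]]) # inspection -> {good, scrap}

# Fundamental matrix N = (I - Q)^-1: expected number of visits to each
# transient state before absorption, for each starting transient state.
N = np.linalg.inv(np.eye(2) - Q)
absorption = N @ R  # probability of ending up good vs scrapped

print("expected visits per unit starting at milling:", np.round(N[0], 3))
print("P(good), P(scrap) from milling:", np.round(absorption[0], 3))
```

    In this assumed configuration each unit visits milling about 1.18 times on average, which is the hidden capacity loss that a pure scrap-cost view would miss.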

  20. An integrated approach of analytical network process and fuzzy based spatial decision making systems applied to landslide risk mapping

    Science.gov (United States)

    Abedi Gheshlaghi, Hassan; Feizizadeh, Bakhtiar

    2017-09-01

    Landslides in mountainous areas cause major damage to residential areas, roads, and farmland. Hence, one of the basic measures to reduce the possible damage is identifying landslide-prone areas through landslide mapping by different models and methods. The purpose of this study is to evaluate the efficacy of a combination of two models, the analytical network process (ANP) and fuzzy logic, in landslide risk mapping in the Azarshahr Chay basin in northwest Iran. After field investigations and a review of the research literature, factors affecting the occurrence of landslides, including slope, slope aspect, altitude, lithology, land use, vegetation density, rainfall, distance to faults, distance to roads, and distance to rivers, along with a map of the distribution of past landslides, were prepared in a GIS environment. Then, fuzzy logic was used for weighting the sub-criteria, and the ANP was applied to weight the criteria. Next, they were integrated based on GIS spatial analysis methods and the landslide risk map was produced. Evaluating the results of this study using receiver operating characteristic curves shows that the designed hybrid model, with an area under the curve of 0.815, has good accuracy. Also, according to the prepared map, a total of 23.22% of the area, amounting to 105.38 km2, is in the high and very high risk classes. The results of this research are of great importance for regional planning tasks, and the landslide prediction map can be used for spatial planning and for the mitigation of future hazards in the study area.

  1. Flexible ITO-free organic solar cells applying aqueous solution-processed V2O5 hole transport layer: An outdoor stability study

    DEFF Research Database (Denmark)

    Lima, F. Anderson S.; Beliatis, Michail J.; Roth, Bérenger

    2016-01-01

    Solution-processable semiconductor oxides have opened a new paradigm for the enhancement of the lifetime of thin film solar cells. Their fabrication by low-cost and environmentally friendly solution-processable methods makes them ideal barrier (hole and electron) transport layers. In this work, we fabricate flexible ITO-free organic solar cells (OPV) by printing methods, applying an aqueous solution-processed V2O5 as the hole transport layer (HTL), and compare them to devices applying PEDOT:PSS. The transparent conducting electrode was PET/Ag/PEDOT/ZnO, and the OPV configuration was PET/Ag/PEDOT/ZnO/P3HT:PC60BM/HTL/Ag.

  2. Development of coal petrography applied in technical processes at the Bergbau-Forschung/DMT during the last 50 years

    Energy Technology Data Exchange (ETDEWEB)

    Steller, Monika; Arendt, Paul; Kuehl, Helmut [Deutsche Montan Technologie GmbH - Mining Service Division, Essen (Germany)]

    2006-06-06

    The paper deals with the activities of the Bergbau-Forschung Coal Petrography Laboratory in Essen (Germany), which, under the influence of Marie-Therese Mackowsky, developed into a stronghold of the industrial application of coal petrology. In 1979, the formerly independent Section for Mineralogy and Petrology was merged with the Chemistry Section. This synergy has widened the research limits and resulted in higher efficiency of projects being carried out within both units. Since 1990, after transforming Bergbau-Forschung into DMT GmbH, a worldwide competition within hard coal and hard coal-based coke markets, together with the switch of the industry towards alternative energy sources, have significantly lowered the importance of the domestic coal mining industry. This in turn resulted in reduction of coal research programs. However, it is stressed that, in spite of transformations of the applied coal petrology experienced during the past 50 years, some achievements are still as applicable as ever. Among them, the method of predicting coke strength using maceral analysis and coal types, published by Mackowsky and Simonis [Mackowsky, M.-Th., Simonis, W., 1969. Die Kennzeichnung von Kokskohlen fur die mathematische Beschreibung der Hochtemperaturverkokung im Horizontalkammerofen bei Schuttbetrieb durch Ergebnisse mikroskopischer Analysen. Gluckauf-Forschungshefte 30, 25-27], is still in use today. The second part of this paper presents some examples of coal petrography applications, which are still important in carbonization processes. Mackowsky discovered that the pyrolytic components were influencing the coke homogeneity in coke ovens and affected coke quality parameters such as CRI and CSR. These highly graphitic layers and lenses prevent gasification of the inner zones of coke lumps, thus lowering the reactivity of metallurgical coke. Moreover, it also seems possible to predict wall load and maximum internal gas pressure as to prevent coke ovens from damage

  3. Development of coal petrography applied in technical processes at the Bergbau-Forschung/DMT during the last 50 years

    International Nuclear Information System (INIS)

    Steller, Monika; Arendt, Paul; Kuehl, Helmut

    2006-01-01

    The paper deals with the activities of the Bergbau-Forschung Coal Petrography Laboratory in Essen (Germany), which, under the influence of Marie-Therese Mackowsky, developed into a stronghold of the industrial application of coal petrology. In 1979, the formerly independent Section for Mineralogy and Petrology was merged with the Chemistry Section. This synergy has widened the research limits and resulted in higher efficiency of projects being carried out within both units. Since 1990, after transforming Bergbau-Forschung into DMT GmbH, a worldwide competition within hard coal and hard coal-based coke markets, together with the switch of the industry towards alternative energy sources, have significantly lowered the importance of the domestic coal mining industry. This in turn resulted in reduction of coal research programs. However, it is stressed that, in spite of transformations of the applied coal petrology experienced during the past 50 years, some achievements are still as applicable as ever. Among them, the method of predicting coke strength using maceral analysis and coal types, published by Mackowsky and Simonis [Mackowsky, M.-Th., Simonis, W., 1969. Die Kennzeichnung von Kokskohlen fur die mathematische Beschreibung der Hochtemperaturverkokung im Horizontalkammerofen bei Schuttbetrieb durch Ergebnisse mikroskopischer Analysen. Gluckauf-Forschungshefte 30, 25-27], is still in use today. The second part of this paper presents some examples of coal petrography applications, which are still important in carbonization processes. Mackowsky discovered that the pyrolytic components were influencing the coke homogeneity in coke ovens and affected coke quality parameters such as CRI and CSR. These highly graphitic layers and lenses prevent gasification of the inner zones of coke lumps, thus lowering the reactivity of metallurgical coke. Moreover, it also seems possible to predict wall load and maximum internal gas pressure as to prevent coke ovens from damage

  4. Applied mathematics

    CERN Document Server

    Logan, J David

    2013-01-01

    Praise for the Third Edition"Future mathematicians, scientists, and engineers should find the book to be an excellent introductory text for coursework or self-study as well as worth its shelf space for reference." -MAA Reviews Applied Mathematics, Fourth Edition is a thoroughly updated and revised edition on the applications of modeling and analyzing natural, social, and technological processes. The book covers a wide range of key topics in mathematical methods and modeling and highlights the connections between mathematics and the applied and nat

  5. Analysis of the Second Law of Thermodynamics applied to GS process in the primary production of heavy water

    International Nuclear Information System (INIS)

    Chavez, Rosa Hilda

    1991-01-01

    An evaluation of the entropic change, through a Second Law of Thermodynamics analysis of the GS process, is presented in this work, with the aim of determining the sections where the major irreversibilities of the process are located. The process consists of the first enrichment stage of GS, which operates bithermally at 305 and 403 Kelvin and a pressure of 2 MPa, involving four chemical compounds: H2O, HDO, H2S and HDS (Author)

  6. Technological forecasting applied to the processes of hydrogen generation; Previsao tecnologica sobre os processos de geracao de hidrogenio

    Energy Technology Data Exchange (ETDEWEB)

    Nakano, Milton Satocy; Oliveira, Wagner dos Santos [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)], e-mails: msnakano@usp.br; wagner@ipen.br

    2008-11-15

    Fuel cells are attracting interest as efficient and clean energy conversion devices. Hydrogen is the fuel of fuel cells and must be generated by an efficient and clean method. This work applies the Delphi methodology of technological forecasting to hydrogen generation and identifies the most probable methods that, in the future, can be used to obtain hydrogen in Brazil. (author)

  7. Collaborative Technology Assessments Of Transient Field Processing And Additive Manufacturing Technologies As Applied To Gas Turbine Components

    Energy Technology Data Exchange (ETDEWEB)

    Ludtka, Gerard Michael [ORNL]; Dehoff, Ryan R [ORNL]; Szabo, Attila [General Electric (GE) Power and Water]; Ucok, Ibrahim [General Electric (GE) Power and Water]

    2016-01-01

    ORNL partnered with GE Power & Water to investigate the effect of thermomagnetic processing on the microstructure and mechanical properties of GE Power & Water's newly developed wrought Ni-Fe-Cr alloys. Exploration of the effects of high magnetic field processing during heat treatment of the alloys indicated conditions where application of magnetic fields yields significant property improvements. The alloy aged using high magnetic field processing exhibited 3 HRC higher hardness compared to the conventionally aged alloy. The alloy annealed at 1785 °F using high magnetic field processing demonstrated an average creep life 2.5 times longer than that of the conventionally heat-treated alloy. Preliminary results show that high magnetic field processing can improve the mechanical properties of Ni-Fe-Cr alloys and potentially extend the life cycle of gas turbine components such as nozzles, leading to significant energy savings.

  8. Relevance of Toxicity Assessment in Wastewater Treatments: Case Study—Four Fenton Processes Applied to the Mineralization of C.I. Acid Red 14

    Directory of Open Access Journals (Sweden)

    Rajaa Idel-aouad

    2015-01-01

    Full Text Available Fenton and Fenton-like processes, both in homogeneous and heterogeneous phases, have been applied to an aqueous solution containing the dye AR 14 in order to study the mineralization and toxicity of the solutions generated after color elimination. The mineralization of AR 14 occurred more slowly than the decolorization. The Microtox analysis of the treated solutions showed low toxicity, intrinsic to the chemicals used in the process rather than to the degradation products obtained after the treatment of the dye solution. The dye degradation in the Fenton oxidation process was initially faster than in the Fenton-like process, but after a short time the four processes showed similar degradation yields. All processes showed good results, with the heterogeneous process being the most convenient since pH adjustment is not necessary, the catalyst is recovered and reused, and the generation of contaminated sludge is avoided.

  9. Relevance of Toxicity Assessment in Wastewater Treatments: Case Study-Four Fenton Processes Applied to the Mineralization of C.I. Acid Red 14.

    Science.gov (United States)

    Idel-Aouad, Rajaa; Valiente, Manuel; Gutiérrez-Bouzán, Carmen; Vilaseca, Mercè; Yaacoubi, Abdlrani; Tanouti, Boumediene; López-Mesas, Montserrat

    2015-01-01

    Fenton and Fenton-like processes, both in homogeneous and heterogeneous phases, have been applied to an aqueous solution containing the dye AR 14 in order to study the mineralization and toxicity of the solutions generated after color elimination. The mineralization of AR 14 occurred more slowly than the decolorization. The Microtox analysis of the treated solutions showed low toxicity, intrinsic to the chemicals used in the process rather than to the degradation products obtained after the treatment of the dye solution. The dye degradation in the Fenton oxidation process was initially faster than in the Fenton-like process, but after a short time the four processes showed similar degradation yields. All processes showed good results, with the heterogeneous process being the most convenient since pH adjustment is not necessary, the catalyst is recovered and reused, and the generation of contaminated sludge is avoided.

  10. Flexible ITO-free organic solar cells applying aqueous solution-processed V2O5 hole transport layer: An outdoor stability study

    Directory of Open Access Journals (Sweden)

    F. Anderson S. Lima

    2016-02-01

    Full Text Available Solution processable semiconductor oxides have opened a new paradigm for the enhancement of the lifetime of thin film solar cells. Their fabrication by low-cost and environmentally friendly solution-processable methods makes them ideal barrier (hole and electron) transport layers. In this work, we fabricate flexible ITO-free organic solar cells (OPV) by printing methods applying an aqueous solution-processed V2O5 as the hole transport layer (HTL) and compared them to devices applying PEDOT:PSS. The transparent conducting electrode was PET/Ag/PEDOT/ZnO, and the OPV configuration was PET/Ag/PEDOT/ZnO/P3HT:PC60BM/HTL/Ag. Outdoor stability analyses carried out for more than 900 h revealed higher stability for devices fabricated with the aqueous solution-processed V2O5.

  11. Design and simulation of rate-based CO2 capture processes using carbonic anhydrase (CA) applied to biogas

    DEFF Research Database (Denmark)

    Fosbøl, Philip Loldrup; Gaspar, Jozsef; Jacobsen, Bjartur

    2017-01-01

    Today the mix of the energy sector is changing from reduction of CO2 emissions from the fossil-fueled power industry into a general focus on renewables, which emit fewer greenhouse gases. Renewable fuels like biomass for electricity production or biogas for bio-methane production have ... a potential to create negative emissions using bio-energy carbon capture and storage (BECCS). All sectors are still in need of applying more sustainable carbon capture and storage (CCS) technologies which result in lower energy consumption while reducing the impact on the environment. Recently several ... The advantage is a noticeably lower regeneration energy compared to primary and secondary amines. As a result the cost for stripping is significantly lower. Reactivated slow tertiary amines are applied in this study with the aim of reducing energy consumption. This is achieved by using carbonic anhydrase (CA...

  12. Some experience in applying the REDUCE algebraic system to the calculation of scattering processes in QED and QCD

    International Nuclear Information System (INIS)

    Mohring, H.J.; Schiller, A.

    1980-01-01

    The problems arising in the use of the REDUCE algebraic system for calculating traces of the Dirac matrix products describing scattering processes in quantum electrodynamics (QED) and quantum chromodynamics (QCD) are considered. Application of the REDUCE system for describing two-photon processes in e+e- reactions is discussed. An example of using the REDUCE system for calculating matrix elements of elementary processes of hard scattering is described. The calculations were performed by means of the REDUCE2 version on an EC1040 computer. The computations take almost 10 minutes of machine time and a computer storage capacity of about 800 kilobytes.
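
    As a rough illustration of the kind of symbolic trace calculation this abstract describes, the sketch below computes the trace of a product of four Dirac matrices. It uses SymPy's gamma-matrix module rather than the REDUCE 2 system actually employed in the paper, so it is only a modern analogue under that assumption, not the original code.

```python
# Modern analogue (assumption: SymPy, not REDUCE 2) of a Dirac-trace calculation.
# Computes Tr[gamma^mu gamma^nu gamma^rho gamma^sigma], whose expected result is
# 4*(g^{mu nu} g^{rho sigma} - g^{mu rho} g^{nu sigma} + g^{mu sigma} g^{nu rho}).
from sympy.physics.hep.gamma_matrices import GammaMatrix as G, LorentzIndex, gamma_trace
from sympy.tensor.tensor import tensor_indices

mu, nu, rho, sigma = tensor_indices("mu nu rho sigma", LorentzIndex)

expr = gamma_trace(G(mu) * G(nu) * G(rho) * G(sigma))
print(expr)
```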

  13. Business process modeling applied to oil pipeline and terminal processes: a proposal for TRANSPETRO's oil pipelines and terminals in Rio de Janeiro and Minas Gerais

    Energy Technology Data Exchange (ETDEWEB)

    Santiago, Adilson da Silva [TRANSPETRO - PETROBRAS Transporte S.A., Rio de Janeiro, RJ (Brazil); Caulliraux, Heitor Mansur [Universidade Federal do Rio de Janeiro (COPPE/UFRJ/GPI), RJ (Brazil). Coordenacao de Pos-graduacao em Engenharia. Grupo de Producao Integrada; Almeida, Maria Fatima Ludovico de [Pontificia Universidade Catolica do Rio de Janeiro (PUC-Rio/ITUC), Rio de Janeiro, RJ (Brazil). Instituto Tecnologico; Felippe, Adriana Vieira de Oliveira [TRANSPETRO - PETROBRAS Transporte S.A., Rio de Janeiro, RJ (Brazil)

    2009-07-01

    Business process modeling (BPM) using event driven process chain diagrams (EPCs) to lay out business process work flows is now widely adopted around the world. The EPC method was developed within the framework of the ARIS Toolset by Prof. August-Wilhelm Scheer at the Institut für Wirtschaftsinformatik at the Universität des Saarlandes in the early 1990s. It is used by many companies to model, analyze and redesign business processes. As such it forms the core technique for modeling in ARIS, which serves to link the different aspects of the so-called control view, which is discussed in the section on ARIS business process modeling. This paper describes a proposal made to TRANSPETRO's Oil Pipelines and Terminals Division in the states of Rio de Janeiro and Minas Gerais, which will be jointly developed by specialists and managers from TRANSPETRO and from COPPETEC, the collaborative research arm of Rio de Janeiro Federal University (UFRJ). The proposal is based on ARIS business process modeling and is presented here according to its seven phases, as follows: information survey and definition of the project structure; mapping and analysis of Campos Eliseos Terminal (TECAM) processes; validation of TECAM process maps; mapping and analysis of the remaining organizational units' processes; validation of the remaining organizational units' process maps; proposal of a business process model for all organizational units of TRANSPETRO's Oil Pipelines and Terminals Division in Rio de Janeiro and Minas Gerais; and critical analysis of the process itself and the results and potential benefits of BPM. (author)

  14. A Survey on Optimal Signal Processing Techniques Applied to Improve the Performance of Mechanical Sensors in Automotive Applications

    Science.gov (United States)

    Hernandez, Wilmar

    2007-01-01

    In this paper a survey on recent applications of optimal signal processing techniques to improve the performance of mechanical sensors is made. Here, a comparison between classical filters and optimal filters for automotive sensors is made, and the current state of the art of the application of robust and optimal control and signal processing techniques to the design of the intelligent (or smart) sensors that today's cars need is presented through several experimental results, which show that the fusion of intelligent sensors and optimal signal processing techniques is the clear way to go. However, the switch between the traditional methods of designing automotive sensors and the new ones cannot be done overnight because there are some open research issues that have to be solved. This paper draws attention to one of the open research issues and tries to arouse researchers' interest in the fusion of intelligent sensors and optimal signal processing techniques.

  15. Applying a transdisciplinary process to define a research agenda in a smallholder irrigated farming system in South Africa

    CSIR Research Space (South Africa)

    Musvoto, Constansia D

    2015-07-01

    Full Text Available Defining an agenda is critical to a research process, and a transdisciplinary approach is expected to improve relevance of an agenda and resultant research outputs. Given the complexity of farming systems, farmer differences and the involvement...

  16. MODELING OF PATTERN FORMING PROCESS OF AUTOMATIC RADIO DIRECTION FINDER OF PHASE VHF IN THE DEVELOPMENT ENVIRONMENT OF LabVIEW APPLIED PROGRAMS

    Directory of Open Access Journals (Sweden)

    G. K. Aslanov

    2015-01-01

    Full Text Available The article develops a model demonstrating the pattern-forming process of the antenna system of an aerodrome quasi-Doppler automatic radio direction-finder station, built in the LabVIEW application development environment of the National Instruments company.

  17. Development of a Curriculum Management Process by Applying Lean Concept for Waste Elimination to Enhance Curriculum Implementation of Primary School Teacher

    Science.gov (United States)

    Chitrangsan, Nadrudee; Sawekngam, Wichai; Thongthew, Sumlee

    2015-01-01

    This research aims to study and develop a curriculum management process by applying the Lean concept for waste elimination to enhance curriculum implementation by primary school teachers. The study was conducted with a focus on qualitative data collection and was divided into 2 phases, including (1) analyzing and synthesizing relevant notions, theories,…

  18. Process of super-black shading material applied to the star sensor based on Ni-P alloys

    Science.gov (United States)

    Liu, Fengdeng; Xing, Fei; Wu, Yuelong; You, Zheng

    2014-12-01

    Super-black materials based on nanotechnology have very important applications in many scientific fields. Although the super-black materials reported to date have excellent light-trapping properties, most of them require sophisticated equipment, long synthesis times and high-temperature environments, and release flammable, explosive or otherwise dangerous gases. These problems have hindered the practical application of such super-black materials. In this project, a nano super-black material was developed with simple equipment and processing, avoiding complicated and dangerous high-temperature, high-pressure process steps. On the basis of a literature survey, a large-area Ni-P alloy plating method was successfully worked out through a series of exploratory experiments and analysis of the results. Starting from this Ni-P alloy, the alloy was anodized while immersed in a non-oxidizing acid solution instead of using the conventional blackening process. This is a significant improvement over the situation in which oxidation, corrosion and vigorous evolution of hydrogen gas all occur at the same location. As a result, the reaction process became less sensitive to timing errors, and the bubble layer was no longer located at the workpiece surface where it could impede observation of the reaction. Consequently, the solution improved the controllability of the blackening process. In addition, research on the nano super-black material was conducted with a view to its use in space optical sensors.

  19. Microstructural and mechanical approaches of the selective laser melting process applied to a nickel-base superalloy

    International Nuclear Information System (INIS)

    Vilaro, T.; Colin, C.; Bartout, J.D.; Nazé, L.; Sennour, M.

    2012-01-01

    Highlights: ► We examine the as-fabricated microstructure of Nimonic 263 processed by selective laser melting. ► We optimized heat treatments to modify the microstructure and improve the mechanical properties. ► We tested the various microstructures through tensile tests in order to compare the effects of the heat treatments. - Abstract: This article presents the as-processed microstructure of Nimonic 263 produced by selective laser melting, an innovative process. Because the melt pool is small and the scanning speed of the laser beam is relatively high, the as-processed microstructure is out of equilibrium and very typical of additive manufacturing processes. To meet industrial requirements, the microstructures are modified through heat treatments in order to either produce precipitation hardening or relieve the thermal stresses. Tensile tests at room temperature give rise to high mechanical properties, close to or above those presented by Wang et al. However, a strong anisotropy is noted as a function of the building direction of the samples because of the columnar grain growth.

  20. The transfer and growth of Salmonella modelled during pork processing and applied to a risk assessment for the catering sector

    DEFF Research Database (Denmark)

    Møller, Cleide

    ... reported outbreaks in Denmark in 2010 were associated with outside-the-home settings, such as restaurants, canteens, hotels, schools, shops, institutions and sport events (Anonymous 2011), food prepared outside the home is a significant source of foodborne illness. In the present study, Quantitative Microbiological Risk Assessment (QMRA), following the Codex Alimentarius Principles and using the modular process risk model (MPRM) methodology, was used as the tool to investigate the fate of Salmonella during processing of pork meatballs from the reception of whole pork cuttings, through processing, until ... of data from observational studies, models specifically developed studying transfer and growth of Salmonella in pork (PAPER I and MANUSCRIPT I), and literature data related to Salmonella in different meat matrices resulted in a new approach that may improve the quality of estimates in risk assessments ...

  1. Elimination of man-made radionuclides from natural waters by applying a standard coagulation-flocculation process

    International Nuclear Information System (INIS)

    Baeza, A.; Miro, C.; Salas, A.; Fernandez, M.; Herranz, M.; Legarda, F.

    2004-01-01

    Effectiveness of potable water treatment processes that consist of the stages of coagulation-flocculation-decantation, using iron-based coagulants, in eliminating gamma-emitting man-made radioisotopes of cesium, strontium, and americium from two natural waters with different degrees of mineralization was studied. The resulting decontamination was found to depend on the chemical behavior of each of the radionuclides considered, on the pH at which the process of coagulation is carried out, and on the concentration of the other stable cations present. (author)

  2. Applying the fWLR concept to Stress induced leakage current in non-volatile memory processes

    NARCIS (Netherlands)

    Tao, Guoqiao; Scarpa, Andrea; van Marwijk, Leo; van Dijk, Kitty; Kuper, F.G.

    A fast wafer level reliability structure and evaluation method has been developed for stress induced leakage current (SILC) in non-volatile memory processes. The structure is based on parallel floating gate cell arrays. The evaluation method is straightforward, and not time-consuming. The

  3. Applying 3-PG, a simple process-based model designed to produce practical results, to data from loblolly pine experiments

    Science.gov (United States)

    Joe J. Landsberg; Kurt H. Johnsen; Timothy J. Albaugh; H. Lee Allen; Steven E. McKeand

    2001-01-01

    3-PG is a simple process-based model that requires few parameter values and only readily available input data. We tested the structure of the model by calibrating it against loblolly pine data from the control treatment of the SETRES experiment in Scotland County, NC, then altered the fertility rating to simulate the effects of fertilization. There was excellent...

  4. A Survey on Optimal Signal Processing Techniques Applied to Improve the Performance of Mechanical Sensors in Automotive Applications

    Directory of Open Access Journals (Sweden)

    Wilmar Hernandez

    2007-01-01

    Full Text Available In this paper a survey on recent applications of optimal signal processing techniques to improve the performance of mechanical sensors is made. Here, a comparison between classical filters and optimal filters for automotive sensors is made, and the current state of the art of the application of robust and optimal control and signal processing techniques to the design of the intelligent (or smart) sensors that today's cars need is presented through several experimental results, which show that the fusion of intelligent sensors and optimal signal processing techniques is the clear way to go. However, the switch between the traditional methods of designing automotive sensors and the new ones cannot be done overnight because there are some open research issues that have to be solved. This paper draws attention to one of the open research issues and tries to arouse researchers' interest in the fusion of intelligent sensors and optimal signal processing techniques.
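
    To make the contrast between classical fixed-coefficient filters and the "optimal filters" surveyed here concrete, the sketch below runs a one-dimensional Kalman filter over a synthetic noisy sensor signal. The sensor model, noise levels and signal are hypothetical illustrations, not taken from the survey.

```python
# Illustrative sketch only: a one-dimensional Kalman filter of the kind contrasted with
# classical fixed-coefficient filters. The model and noise values are hypothetical.
import numpy as np

def kalman_1d(measurements, process_var=0.01, meas_var=0.04, x0=0.0, p0=1.0):
    """Filter a noisy scalar sensor signal; returns the sequence of state estimates."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        # Predict: constant-state model, uncertainty grows by the process variance.
        p = p + process_var
        # Update: blend prediction and measurement using the Kalman gain.
        k = p / (p + meas_var)
        x = x + k * (z - x)
        p = (1.0 - k) * p
        estimates.append(x)
    return np.array(estimates)

rng = np.random.default_rng(0)
true_signal = np.sin(np.linspace(0, 2 * np.pi, 200))     # slowly varying quantity
noisy = true_signal + rng.normal(scale=0.2, size=200)    # simulated sensor noise
filtered = kalman_1d(noisy)
print("raw RMS error:", np.sqrt(np.mean((noisy - true_signal) ** 2)))
print("filtered RMS error:", np.sqrt(np.mean((filtered - true_signal) ** 2)))
```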

  5. Discovering Decision Knowledge from Web Log Portfolio for Managing Classroom Processes by Applying Decision Tree and Data Cube Technology.

    Science.gov (United States)

    Chen, Gwo-Dong; Liu, Chen-Chung; Ou, Kuo-Liang; Liu, Baw-Jhiune

    2000-01-01

    Discusses the use of Web logs to record student behavior that can assist teachers in assessing performance and making curriculum decisions for distance learning students who are using Web-based learning systems. Adopts decision tree and data cube information processing methodologies for developing more effective pedagogical strategies. (LRW)
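
    As a hedged illustration of the decision-tree side of the methodology described above, the sketch below trains a small scikit-learn decision tree on synthetic web-log features. The feature names, the pass/fail rule and the use of scikit-learn are all assumptions made for illustration, not the authors' system.

```python
# Illustrative sketch only (not the authors' system): a decision tree over synthetic
# web-log features predicting whether a student passes. Feature names are hypothetical.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(42)
n = 500
logins_per_week = rng.poisson(4, n)        # hypothetical web-log derived features
pages_read = rng.poisson(30, n)
forum_posts = rng.poisson(2, n)
X = np.column_stack([logins_per_week, pages_read, forum_posts])

# Synthetic labeling rule: engagement drives passing, plus noise.
score = 0.4 * logins_per_week + 0.05 * pages_read + 0.6 * forum_posts
y = (score + rng.normal(0, 0.5, n) > 3.5).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)

print("test accuracy:", round(tree.score(X_te, y_te), 2))
print(export_text(tree, feature_names=["logins_per_week", "pages_read", "forum_posts"]))
```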

  6. Applied Research in the Institute of Chemical Process Fundamentals, Academy of Sciences of the Czech Republic (ACR), Prague

    Czech Academy of Sciences Publication Activity Database

    Hanika, Jiří

    2005-01-01

    Vol. 1 (2005), 121 /6P20/. ISSN 1336-7242. [57th Congress of Chemical Societies, 04.09.2005-08.09.2005, Tatranské Matliare]. Institutional research plan: CEZ:AV0Z40720504. Keywords: fundamental research * environmental engineering * chemical processes. Subject RIV: CI - Industrial Chemistry, Chemical Engineering

  7. A Method for Re-using Existing ITIL Processes for Creating an ISO 27001 ISMS Process Applied to a High Availability Video Conferencing Cloud Scenario

    OpenAIRE

    Beckers, Kristian; Hofbauer, Stefan; Quirchmayr, Gerald; Wills, Christopher

    2013-01-01

    Part 1: Cross-Domain Conference and Workshop on Multidisciplinary Research and Practice for Information Systems (CD-ARES 2013); International audience; Many companies have already adopted their business processes to be in accordance with defined and organized standards. Two standards that are sought after by companies are IT Infrastructure Library (ITIL) and ISO 27001. Often companies start certifying their business processes with ITIL and continue with ISO 27001. For small and medium-sized b...

  8. Modelling the Processes of Maximizing Hotel Revenues, Based on Applying the Linear Programming and the Network Flows

    Directory of Open Access Journals (Sweden)

    Margareta RACOVITA

    2011-11-01

    Full Text Available This work proposes to solve a problem related to maximizing hotel revenues through two methods established in the operational research domain. In the first part of the paper, the approach involves formulating the objective function and the problem's constraints, as well as expanding the model to take into consideration clients' preferences and the opportunities for group reservations. In the second part of the paper, the problem is solved with the help of the network flows model, which allows optimum allocation of the rooms in real time. At the end of the paper, the advantages of applying these two mathematical methods within performance development strategies in the hotel industry are highlighted.
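
    A minimal sketch of the kind of linear-programming formulation described in the first part of the paper is given below, using scipy.optimize.linprog. The room types, rates, demand figures and housekeeping constraint are hypothetical placeholders, not the paper's actual data or model.

```python
# Minimal sketch (not the paper's actual model) of a revenue-maximization LP with
# scipy.optimize.linprog. Room counts, demand and rates are hypothetical figures.
import numpy as np
from scipy.optimize import linprog

rates = np.array([80.0, 120.0, 200.0])    # nightly rate per room type: standard, deluxe, suite
capacity = np.array([60, 30, 10])         # rooms available per type
demand = np.array([55, 40, 8])            # forecast demand per type

# Decision variables x_i = rooms of type i sold; maximize rates @ x == minimize -rates @ x.
c = -rates
# Upper bounds: cannot sell more than capacity or more than demand.
bounds = [(0, min(cap, dem)) for cap, dem in zip(capacity, demand)]
# Example shared constraint: housekeeping can service at most 90 occupied rooms per night.
A_ub = np.ones((1, 3))
b_ub = np.array([90])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
print("rooms to sell per type:", np.round(res.x).astype(int))
print("maximum nightly revenue:", -res.fun)
```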

  9. Applying Idea Management System (IMS) Approach to Design and Implement a collaborative Environment in Public Service related open Innovation Processes

    Directory of Open Access Journals (Sweden)

    Marco Alessi

    2015-12-01

    Full Text Available Novel ideas are the key ingredient of innovation processes, and an Idea Management System (IMS) plays a prominent role in managing ideas captured from external stakeholders and internal actors within an Open Innovation process. Considering a specific case study, Lecce, Italy, we have designed and implemented a collaborative environment, which provides an ideal platform for government, citizens, etc. to share ideas and co-create the value of innovative public services in Lecce. In this study, the application of IMS with six main steps, including idea generation, idea improvement, idea selection, refinement, idea implementation, and monitoring, shows that it remarkably helps service providers to exploit the intellectual capital and initiatives of the regional stakeholders and citizens and assists service providers in staying in line with the needs of society. Moreover, we have developed two support tools to foster collaboration and transparency: a sentiment analysis tool and a gamification application.

  10. The Design Process of Physical Security as Applied to a U.S. Border Port of Entry

    Energy Technology Data Exchange (ETDEWEB)

    Wagner, G.G.

    1999-02-22

    This paper details the application of a standard physical security system design process to a US Border Port of Entry (PoE) for vehicle entry/exit. The physical security design methodology is described, as well as the physical security similarities to facilities currently at a US Border PoE for vehicles. The physical security design process description includes the various elements that make up the methodology as well as the considerations that must be taken into account when dealing with system integration of those elements. The distinctions between preventing unlawful entry/exit of illegal contraband and personnel are described. The potential to enhance the functions of drug/contraband detection in the Pre-Primary Inspection area through the application of emerging technologies is also addressed.

  11. The adaptation process following acute onset disability: an interactive two-dimensional approach applied to acquired brain injury.

    Science.gov (United States)

    Brands, Ingrid M H; Wade, Derick T; Stapert, Sven Z; van Heugten, Caroline M

    2012-09-01

    To describe a new model of the adaptation process following acquired brain injury, based on the patient's goals, the patient's abilities and the emotional response to the changes and the possible discrepancy between goals and achievements. The process of adaptation after acquired brain injury is characterized by a continuous interaction of two processes: achieving maximal restoration of function and adjusting to the alterations and losses that occur in the various domains of functioning. Consequently, adaptation requires a balanced mix of restoration-oriented coping and loss-oriented coping. The commonly used framework to explain adaptation and coping, 'The Theory of Stress and Coping' of Lazarus and Folkman, does not capture this interactive duality. This model additionally considers theories concerned with self-regulation of behaviour, self-awareness and self-efficacy, and with the setting and achievement of goals. THE TWO-DIMENSIONAL MODEL: Our model proposes the simultaneous and continuous interaction of two pathways; goal pursuit (short term and long term) or revision as a result of success and failure in reducing distance between current state and expected future state and an affective response that is generated by the experienced goal-performance discrepancies. This affective response, in turn, influences the goals set. This two-dimensional representation covers the processes mentioned above: restoration of function and consideration of long-term limitations. We propose that adaptation centres on readjustment of long-term goals to new achievable but desired and important goals, and that this adjustment underlies re-establishing emotional stability. We discuss how the proposed model is related to actual rehabilitation practice.

  12. Fractionation and fluxes of metals and radionuclides during the recycling process of phosphogypsum wastes applied to mineral CO₂ sequestration.

    Science.gov (United States)

    Contreras, M; Pérez-López, R; Gázquez, M J; Morales-Flórez, V; Santos, A; Esquivias, L; Bolívar, J P

    2015-11-01

    The industry of phosphoric acid produces a calcium-rich by-product known as phosphogypsum, which is usually stored in large stacks of millions of tons. Up to now, no commercial application has been widely implemented for its reuse because of the significant presence of potentially toxic contaminants. This work confirmed that up to 96% of the calcium of phosphogypsum could be recycled for CO2 mineral sequestration by a simple two-step process: alkaline dissolution and aqueous carbonation, under ambient pressure and temperature. This CO2 sequestration process based on recycling phosphogypsum wastes would help to mitigate greenhouse gasses emissions. Yet this work goes beyond the validation of the sequestration procedure; it tracks the contaminants, such as trace metals or radionuclides, during the recycling process in the phosphogypsum. Thus, most of the contaminants were transferred from raw phosphogypsum to portlandite, obtained by dissolution of the phosphogypsum in soda, and from portlandite to calcite during aqueous carbonation. These findings provide valuable information for managing phosphogypsum wastes and designing potential technological applications of the by-products of this environmentally-friendly proposal. Copyright © 2015 Elsevier Ltd. All rights reserved.

  13. Evaluation of Preclinical Assays to Investigate an Anthroposophic Pharmaceutical Process Applied to Mistletoe (Viscum album L.) Extracts

    Directory of Open Access Journals (Sweden)

    Stephan Baumgartner

    2014-01-01

    Full Text Available Extracts from European mistletoe (Viscum album L.) developed in anthroposophic medicine are based on specific pharmaceutical procedures to enhance remedy efficacy. One such anthroposophic pharmaceutical process was evaluated regarding effects on cancer cell toxicity in vitro and on colchicine tumor formation in Lepidium sativum. Anthroposophically processed Viscum album extract (APVAE) was produced by mixing winter and summer mistletoe extracts in the edge of a high-speed rotating disk and was compared with manually mixed Viscum album extract (VAE). The antiproliferative effect of VAE/APVAE was determined in five cell lines (NCI-H460, DU-145, HCC1143, MV3, and PA-TU-8902) by WST-1 assay in vitro; no difference was found between VAE and APVAE in any cell line tested (P>0.14). Incidence of colchicine tumor formation was assessed by measurement of the root/shoot ratio of seedlings of Lepidium sativum treated with colchicine as well as VAE, APVAE, or water. Colchicine tumor formation decreased after application of VAE (−5.4% compared to water, P<0.001) and decreased even more strongly with APVAE (−8.8% compared to water, P<0.001). The high-speed mistletoe extract mixing process investigated thus did not influence toxicity against cancer cells but seemed to sustain morphostasis and to enhance resistance against external noxious influences leading to phenomenological malformations.

  14. Applying decision-making tools to national e-waste recycling policy: an example of Analytic Hierarchy Process.

    Science.gov (United States)

    Lin, Chun-Hsu; Wen, Lihchyi; Tsai, Yue-Mi

    2010-05-01

    As policy making is in essence a process of discussion, decision-making tools have in many cases been proposed to resolve the differences of opinion among the different parties. In our project, which sought to promote a country's performance in recycling, we used the Analytic Hierarchy Process (AHP) to evaluate the possibilities and determine the priority of the addition of new mandatory recycled waste, also referred to as Due Recycled Wastes, from candidate waste appliances. The evaluation process started with the collection of data based on telephone interviews and field investigations to understand the behavior of consumers as well as their overall opinions regarding the disposal of certain waste appliances. With the data serving as background information, the research team then implemented the Analytic Hierarchy Process using the information that formed an incomplete hierarchy structure in order to determine the priority for recycling. Since the number of objects to be evaluated exceeded the number that the AHP researchers had suggested, we reclassified the objects into four groups and added one more level of pair-wise comparisons, which substantially reduced the inconsistency in the judgment of the AHP participants. The project was found to serve as a flexible and achievable application of AHP to the environmental policy-making process. In addition, based on the project's outcomes as a whole, the research team drew conclusions regarding the government's need to take back 15 of the items evaluated, and suggested instruments that could be used or recycling regulations that could be changed in the future. Further analysis of the top three items recommended for recycling by the results of the evaluation, namely Compact Disks, Cellular Phones and Computer Keyboards, was then conducted to clarify their concrete feasibility. After the trial period for recycling ordered by the Taiwan Environmental Protection Administration, only Computer
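
    The core AHP computation referred to above, deriving priority weights and a consistency ratio from a pairwise comparison matrix, can be sketched as follows. The 3x3 judgment matrix is a hypothetical example, not the values elicited in the Taiwanese study.

```python
# Sketch of the standard AHP calculation: priority weights and consistency ratio from a
# pairwise comparison matrix. The Saaty-scale judgments below are hypothetical.
import numpy as np

# Pairwise comparisons for three candidate waste items (A, B, C).
M = np.array([
    [1.0, 3.0, 5.0],    # A vs A, A vs B, A vs C
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(M)
k = np.argmax(eigvals.real)                  # principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                     # normalized priority vector

n = M.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)         # consistency index
ri = 0.58                                    # Saaty's random index for n = 3
cr = ci / ri                                 # consistency ratio (should be < 0.10)

print("priorities:", np.round(weights, 3))
print("consistency ratio:", round(cr, 3))
```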

  15. [Process control in acute pain management. An analysis of the degree of organization of applied standard protocols].

    Science.gov (United States)

    Erlenwein, J; Emons, M I; Hecke, A; Nestler, N; Przemeck, M; Bauer, M; Meißner, W; Petzke, F

    2014-10-01

    The aim of this study was to analyze the degree of organization of different standard protocols for acute pain management, as well as the derivation and definition of typical but structurally different models. A total of 85 hospitals provided their written standardized protocols for analysis. Protocols for defined target processes from 76 hospitals and another protocol used by more than one hospital were included into the analysis. The suggested courses of action were theoretically simulated to identify and characterize process types in a multistage evaluation process. The analysis included 148 standards. Four differentiated process types were defined ("standardized order", "analgesic ladder", "algorithm", "therapy path"), each with an increasing level of organization. These four types had the following distribution: 27 % (n = 40) "standardized order", 47 % (n = 70) "analgesic ladder", 22 % (n = 33) "algorithm", 4 % (n = 5) "therapy path". Models with a higher degree of organization included more control elements, such as action and intervention triggers or safety and supervisory elements, and were also associated with a formally better access to medication. For models with a lower degree of organization, immediate courses of action were more dependent on individual decisions. Although not quantifiable, this was particularly evident when simulating downstream courses of action. Interfaces between areas of hospital activity and a cross-departmental-boundary validity were only considered in a fraction of the protocols. Concepts from clinics with a certificate in (acute) pain management were more strongly process-oriented. For children, there were proportionately more simple concepts with a lower degree of organization and less controlling elements. This is the first analysis of a large sample of standardized protocols for acute pain management focusing on the degree of organization and the possible influence on courses of action. The analysis

  16. Tensor decomposition-based unsupervised feature extraction applied to matrix products for multi-view data processing

    Science.gov (United States)

    2017-01-01

    In the current era of big data, the amount of data available is continuously increasing. Both the number and types of samples, or features, are on the rise. The mixing of distinct features often makes interpretation more difficult. However, separate analysis of individual types requires subsequent integration. A tensor is a useful framework to deal with distinct types of features in an integrated manner without mixing them. On the other hand, tensor data are not easy to obtain since they require the measurement of huge numbers of combinations of distinct features; if there are m kinds of features, each of which has N dimensions, the number of measurements needed is as many as N^m, which is often too large to measure. In this paper, I propose a new method where a tensor is generated from individual features without combinatorial measurements, and the generated tensor is decomposed back to matrices, by which unsupervised feature extraction is performed. In order to demonstrate the usefulness of the proposed strategy, it was applied to synthetic data, as well as three omics datasets. It outperformed other matrix-based methodologies. PMID:28841719

  17. Tensor decomposition-based unsupervised feature extraction applied to matrix products for multi-view data processing.

    Directory of Open Access Journals (Sweden)

    Y-H Taguchi

    Full Text Available In the current era of big data, the amount of data available is continuously increasing. Both the number and types of samples, or features, are on the rise. The mixing of distinct features often makes interpretation more difficult. However, separate analysis of individual types requires subsequent integration. A tensor is a useful framework to deal with distinct types of features in an integrated manner without mixing them. On the other hand, tensor data are not easy to obtain since they require the measurement of huge numbers of combinations of distinct features; if there are m kinds of features, each of which has N dimensions, the number of measurements needed is as many as N^m, which is often too large to measure. In this paper, I propose a new method where a tensor is generated from individual features without combinatorial measurements, and the generated tensor is decomposed back to matrices, by which unsupervised feature extraction is performed. In order to demonstrate the usefulness of the proposed strategy, it was applied to synthetic data, as well as three omics datasets. It outperformed other matrix-based methodologies.
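
    One reading of the strategy described above (generating a tensor from individual feature matrices without combinatorial measurements, then decomposing it back to matrices) is sketched below with NumPy. The product construction and the HOSVD-style unfolding step are assumptions made for illustration; they are not the author's published code.

```python
# Schematic sketch (one reading of the described strategy, not the author's code):
# build a 3-way tensor from two feature matrices sharing samples, then recover factor
# matrices via SVD of the tensor unfoldings for unsupervised feature extraction.
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_feat1, n_feat2 = 20, 50, 40
X1 = rng.normal(size=(n_samples, n_feat1))   # e.g. one omics data set
X2 = rng.normal(size=(n_samples, n_feat2))   # a second omics data set, same samples

# Generate T[i, j, k] = X1[i, j] * X2[i, k] without measuring feature combinations.
T = np.einsum("ij,ik->ijk", X1, X2)

def mode_unfold(tensor, mode):
    """Matricize the tensor along the given mode."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

# Singular vectors of each unfolding play the role of the extracted feature matrices.
factors = []
for mode in range(3):
    u, s, _ = np.linalg.svd(mode_unfold(T, mode), full_matrices=False)
    factors.append(u[:, :3])                 # keep the top 3 components per mode

print("factor shapes:", [f.shape for f in factors])
```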

  18. It's Only a Phase: Applying the 5 Phases of Clinical Trials to the NSCR Model Improvement Process

    Science.gov (United States)

    Elgart, S. R.; Milder, C. M.; Chappell, L. J.; Semones, E. J.

    2017-01-01

    NASA limits astronaut radiation exposures to a 3% risk of exposure-induced death from cancer (REID) at the upper 95% confidence level. Since astronauts approach this limit, it is important that the estimate of REID be as accurate as possible. The NASA Space Cancer Risk 2012 (NSCR-2012) model has been the standard for NASA's space radiation protection guidelines since its publication in 2013. The model incorporates elements from U.S. baseline statistics, Japanese atomic bomb survivor research, animal models, cellular studies, and radiation transport to calculate astronaut baseline risk of cancer and REID. The NSCR model is under constant revision to ensure emerging research is incorporated into radiation protection standards. It is important to develop guidelines, however, to determine what new research is appropriate for integration. Certain standards of transparency are necessary in order to assess data quality, statistical quality, and analytical quality. To this effect, all original source code and any raw data used to develop the code are required to confirm there are no errors which significantly change reported outcomes. It is possible to apply a clinical trials approach to select and assess the improvement concepts that will be incorporated into future iterations of NSCR. This poster describes the five phases of clinical trials research, pre-clinical research, and clinical research phases I-IV, explaining how each step can be translated into an appropriate NSCR model selection guideline.

  19. Analysis of the United States Marine Corps Continuous Process Improvement Program Applied to the Contracting Process at Marine Corps Regional Contracting Office - Southwest

    Science.gov (United States)

    2007-12-01

    Abbreviations: SAP = Simplified Acquisition Procedures; SE = Supporting Establishment; SIPOC = Suppliers, Inputs, Processes, Outputs, Customers; SMED = Single ... Minimization of non-value added activities (muda) • Decreased cycle times • Single minute exchange of dies (SMED) • Set-up reduction (SUR

  20. Video processing of remote sensor data applied to uranium exploration in Wyoming. [Roll-front U deposits]

    Energy Technology Data Exchange (ETDEWEB)

    Levinson, R.A.; Marrs, R.W.; Crockell, F.

    1979-06-30

    LANDSAT satellite imagery and aerial photography can be used to map areas of altered sandstone associated with roll-front uranium deposits. Image data must be enhanced so that alteration spectral contrasts can be seen, and video image processing is a fast, low-cost, and efficient tool. For LANDSAT data, the 7/4 band ratio produces the best enhancement of altered sandstone. The 6/4 ratio is most effective for color infrared aerial photography. Geochemical and mineralogical associations occur in unaltered, altered, and ore roll-front zones. Samples from Pumpkin Buttes show that iron is the primary coloring agent which makes alteration visually detectable. Eh and pH changes associated with passage of a roll front cause oxidation of magnetite and pyrite to hematite, goethite, and limonite in the host sandstone, thereby producing the alteration. Statistical analyses show that the detectability of geochemical and color zonation in host sands is weakened by soil-forming processes. Alteration can only be mapped in areas of thin soil cover and moderate to sparse vegetative cover.
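
    A minimal sketch of the band-ratio enhancement reported above (LANDSAT band 7 divided by band 4, followed by a contrast stretch) is shown below. The arrays are synthetic stand-ins for co-registered image bands, not the Wyoming data.

```python
# Minimal sketch of a 7/4 band-ratio enhancement; arrays are synthetic stand-ins for
# co-registered LANDSAT bands, not the data analyzed in the abstract.
import numpy as np

rng = np.random.default_rng(7)
band4 = rng.uniform(20, 200, size=(512, 512)).astype(np.float32)   # hypothetical DN values
band7 = rng.uniform(20, 200, size=(512, 512)).astype(np.float32)

ratio = band7 / np.clip(band4, 1e-6, None)   # 7/4 ratio; guard against divide-by-zero

# Linear 2-98 percentile stretch to 0-255 for display, so high-ratio (altered) pixels appear bright.
lo, hi = np.percentile(ratio, (2, 98))
stretched = (np.clip((ratio - lo) / (hi - lo), 0, 1) * 255).astype(np.uint8)
print("ratio range:", float(ratio.min()), float(ratio.max()))
print("stretched image dtype and max:", stretched.dtype, int(stretched.max()))
```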

  1. Applying value engineering and modern assessment tools in managing NEPA: Improving effectiveness of the NEPA scoping and planning process

    Energy Technology Data Exchange (ETDEWEB)

    ECCLESTON, C.H.

    1998-09-03

    While the National Environmental Policy Act (NEPA) implementing regulations focus on describing "what" must be done, they provide surprisingly little direction on "how" such requirements are to be implemented. Specific implementation of these requirements has largely been left to the discretion of individual agencies. More than a quarter of a century after NEPA's enactment, few rigorous tools, techniques, or methodologies have been developed or widely adopted for implementing the regulatory requirements. In preparing an Environmental Impact Statement, agencies are required to conduct a public scoping process to determine the range of actions, alternatives, and impacts that will be investigated. Determining the proper scope of analysis is an element essential to the successful planning and implementation of future agency actions. The lack of rigorous tools and methodologies can lead to project delays, cost escalation, and increased risk that the scoping process may not adequately capture the scope of decisions that eventually might need to be considered. Recently, selected Value Engineering (VE) techniques were successfully used in managing a pre-scoping effort. A new strategy is advanced for conducting a pre-scoping/scoping effort that combines NEPA with VE. Consisting of five distinct phases, this approach has potentially widespread implications for the way NEPA, and scoping in particular, is practiced.

  2. Circumferential welding applied for inox steel super duplex UNS S32750 using the process MIG using CMT® control

    International Nuclear Information System (INIS)

    Invernizzi, Bruno Pizol

    2017-01-01

    This study carried out circumferential welding experiments on UNS S32750 super duplex stainless steel tubes with diameters of 19.05 mm and 48.20 mm. Welds were performed using various welding parameters on a MIG machine with Cold Metal Transfer® (CMT) control. The weld joints were evaluated by visual and dimensional inspection in addition to Vickers microhardness and tensile tests, as well as microstructural analysis in conjunction with phase precipitation analysis, which was performed according to practice A of ASTM A923, and a corrosion test in accordance with practice A of ASTM G48 in conjunction with ASTM A923. The results indicated that welds performed in pipes with a diameter of 19.05 mm showed a weld joint with unacceptable dimensions according to the standard, this condition being attributed to the use of a large wire diameter for the welding conditions used. Welding performed on pipes with a diameter of 48.20 mm showed a lack of penetration under the conditions employed when welded by the conventional CMT® process. When CMT® was combined with pulsed arc, under conditions that generated greater heat input during welding, the result was total penetration of the joint and an adequate surface finish. The results indicated that welding using the CMT® process combined with pulsed arc, under the conditions (parameters) employed, generated a good surface finish and suitable combined mechanical properties, meeting standards requirements, as well as a balanced microstructure and high resistance to corrosion. (author)

  3. Studies on molybdenum elution study in dowex 1x8 resin applied on purification process of fission 99Mo

    International Nuclear Information System (INIS)

    Damasceno, M.O.; Yamaura, M.; Santos, J.L. dos; Forbicini, C.A.L.G. de O.

    2013-01-01

    Molybdenum-99 is the most widely employed radioisotope in nuclear medicine due to its decay product, technetium-99m, which is used in radiopharmaceutical labelling of molecules for diagnostic examination of tumor diseases. Today Brazil imports 99Mo from other countries, so the National Commission of Nuclear Energy (CNEN) is implementing a new research reactor, RMB, currently in the conceptual design phase. The process of separation of fission 99Mo begins with the dissolution of uranium targets after irradiation in the reactor; the resulting solution goes through a series of chromatographic columns that allows a gradual decontamination from other components, yielding 99Mo with high radiochemical and chemical purity for use in nuclear medicine as a generator of 99mTc. This work is part of the RMB research project to separate and purify fission 99Mo by chromatographic columns from the alkaline dissolution of LEU UAlx-Al targets. In the present study, Mo removal was investigated by batch assays and in a glass column using the anionic exchanger Dowex 1x8. Different salts and their concentrations, cations and temperature were evaluated for the elution of molybdenum and iodine (a contaminant) retained on the Dowex 1x8 resin, aiming at their use in the separation and purification process in chromatography columns in the Brazilian project. Results showed high recovery of Mo and low-level contamination by iodine using hot NaHCO3 solution. (author)

  4. 3D Point Clouds in Archaeology: Advances in Acquisition, Processing and Knowledge Integration Applied to Quasi-Planar Objects

    Directory of Open Access Journals (Sweden)

    Florent Poux

    2017-09-01

    Full Text Available Digital investigations of the real world through point clouds and derivatives are changing how curators, cultural heritage researchers and archaeologists work and collaborate. To progressively aggregate expertise and enhance the working proficiency of all professionals, virtual reconstructions demand adapted tools to facilitate knowledge dissemination. However, to achieve this perceptive level, a point cloud must be semantically rich, retaining relevant information for the end user. In this paper, we review the state of the art of point cloud integration within archaeological applications, giving an overview of 3D technologies for heritage, digital exploitation and case studies showing the assimilation status within 3D GIS. Identified issues and new perspectives are addressed through a knowledge-based point cloud processing framework for multi-sensory data, and illustrated on mosaics and quasi-planar objects. A new acquisition, pre-processing, segmentation and ontology-based classification method on hybrid point clouds from both terrestrial laser scanning and dense image matching is proposed to enable reasoning for information extraction. Experiments in detection and semantic enrichment show promising results of 94% correct semantization. Then, we integrate the metadata in an archaeological smart point cloud data structure allowing spatio-semantic queries related to CIDOC-CRM. Finally, a WebGL prototype is presented that leads to efficient communication between actors by proposing optimal 3D data visualizations as a basis on which interaction can grow.

  5. Process control and dosimetry applied to establish a relation between reference dose measurements and actual dose distribution

    International Nuclear Information System (INIS)

    Ehlerman, D.A.E.

    2001-01-01

    The availability of the first commercial dose level indicator prompted attempts to verify the radiation absorbed dose to items under quarantine control (e.g. for insect disinfestation) by means of an indicator attached to these items. Samples of the new commercial dose level indicators were tested for their metrological properties using gamma and electron irradiation. The devices are suitable for the intended purpose, and a subjective judgement of whether the threshold dose was surpassed is possible in a reliable manner. The subjective judgements are completely backed by the instrumental results. Consequently, a prototype reader was developed; first tests were successful. The value of dose level indicators and the implications of their use for food or quarantine inspection depend on a link between the dose measured (indicated) at the position of such an indicator and the characteristic parameters of the frequency distribution of dose throughout the product load, i.e. a box, a container, or a whole batch of multiple units. Therefore, studies into the variability and statistical properties of dose distributions obtained under a range of commercial situations were undertaken. Gamma processing at a commercial multipurpose contract irradiator, and electron processing and bremsstrahlung applications at a large-scale research facility were included; products were apples, potatoes, wheat, maize, and pistachio. Studies revealed that still more detailed information on irradiation geometries is needed in order to render meaningful information from dose level indicators. (author)

  6. Algal Foams Applied in Fixed-Bed Process for Lead(II) Removal Using Recirculation or One-Pass Modes

    Directory of Open Access Journals (Sweden)

    Shengye Wang

    2017-10-01

    Full Text Available The incorporation of brown algae into biopolymer beads or foams for metal sorption has been previously reported. However, the direct use of these biomasses for preparing foams is a new approach. In this study, two kinds of porous foams were prepared by ionotropic gelation using algal biomass (AB, Laminaria digitata) or alginate (as the reference) and applied for Pb(II) sorption. These foams (manufactured as macroporous discs) were packed in filtration holders (simulating a fixed-bed column) and the system was operated in either a recirculation or a one-pass mode. Sorption isotherms, uptake kinetics and sorbent reuse were studied in the recirculation mode (analogous to a batch system). In the one-pass mode (continuous fixed-bed system), the influence of parameters such as flow rate, feed metal concentration and bed height was investigated on both sorption and desorption. In addition, the effect of Cu(II) on Pb(II) recovery from binary solutions was also studied in terms of both sorption and desorption. Sorption isotherms are well fitted by the Langmuir equation, while the pseudo-second order rate equation describes well both sorption and desorption kinetic profiles. The study of material regeneration confirms that the reuse of the foams was feasible with a small mass loss, even after 9 cycles. In the one-pass mode, for alginate foams, a slower flow rate led to a smaller saturation volume, while the effect of flow rate was less marked for AB foams. The competitive study suggests that the foams have a preference for Pb(II) over Cu(II) but cannot selectively remove Pb(II) from the binary solution.
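
    Since the abstract states that the sorption isotherms are well fitted by the Langmuir equation, the sketch below shows how such a fit can be performed with scipy.optimize.curve_fit. The equilibrium data points are synthetic placeholders, not the measured Pb(II) values.

```python
# Sketch of a Langmuir isotherm fit, q = q_max*K_L*Ce/(1 + K_L*Ce), with curve_fit.
# The equilibrium data below are synthetic placeholders, not the measured Pb(II) data.
import numpy as np
from scipy.optimize import curve_fit

def langmuir(ce, q_max, k_l):
    """Langmuir isotherm: sorbed amount q (mg/g) vs equilibrium concentration Ce (mg/L)."""
    return q_max * k_l * ce / (1.0 + k_l * ce)

ce = np.array([5, 10, 25, 50, 100, 200, 400], dtype=float)               # mg/L (hypothetical)
q_obs = langmuir(ce, 180.0, 0.02) + np.random.default_rng(3).normal(0, 3, ce.size)

(q_max_fit, k_l_fit), _ = curve_fit(langmuir, ce, q_obs, p0=(100.0, 0.01))
print(f"fitted q_max = {q_max_fit:.1f} mg/g, K_L = {k_l_fit:.4f} L/mg")
```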

  7. An eco design strategy for high pressure die casting components: microstructural analysis applied to mass reducing processes

    International Nuclear Information System (INIS)

    Suarez-Pena, B.; Asensio-Lozano, J.

    2009-01-01

    In this work, the study focused on the possibility of using new aluminium alloys with optimized microstructures that ensure the mechanical properties requested for cast components made by high pressure die casting. The objective was to check the possibility of manufacturing structurally sound eco-steps for escalators with reduced structural integrity. The eco-steps arise as a result of a new redesign of the traditional steps aiming at a significant weight reduction. The experimental results show that it is feasible to cut the use of materials during processing and therefore to reduce the impact of the components during their lifetime, whilst the performance and safety standards are kept identical or even improved. (Author) 17 refs

  8. Fitting Frequency-Lowering Signal Processing Applying the American Academy of Audiology Pediatric Amplification Guideline: Updates and Protocols.

    Science.gov (United States)

    Scollie, Susan; Glista, Danielle; Seto, Julie; Dunn, Andrea; Schuett, Brittany; Hawkins, Marianne; Pourmand, Nazanin; Parsa, Vijay

    2016-03-01

    Although guidelines for fitting hearing aids for children are well developed and have strong basis in evidence, specific protocols for fitting and verifying technologies can supplement such guidelines. One such technology is frequency-lowering signal processing. Children require access to a broad bandwidth of speech to detect and use all phonemes including female /s/. When access through conventional amplification is not possible, the use of frequency-lowering signal processing may be considered as a means to overcome limitations. Fitting and verification protocols are needed to better define candidacy determination and options for assessing and fine tuning frequency-lowering signal processing for individuals. This work aims to (1) describe a set of calibrated phonemes that can be used to characterize the variation in different brands of frequency-lowering processors in hearing aids and the verification with these signals and (2) determine whether verification with these signal are predictive of perceptual changes associated with changes in the strength of frequency-lowering signal processing. Finally, we aimed to develop a fitting protocol for use in pediatric clinical practice. Study 1 used a sample of six hearing aids spanning four types of frequency lowering algorithms for an electroacoustic evaluation. Study 2 included 21 adults who had hearing loss (mean age 66 yr). Simulated fricatives were designed to mimic the level and frequency shape of female fricatives extracted from two sources of speech. These signals were used to verify the frequency-lowering effects of four distinct types of frequency-lowering signal processors available in commercial hearing aids, and verification measures were compared to extracted fricatives made in a reference system. In a second study, the simulated fricatives were used within a probe microphone measurement system to verify a wide range of frequency compression settings in a commercial hearing aid, and 27 adult listeners were

  9. Comparison of Roller Burnishing Method with Other Hole Surface Finishing Processes Applied on AISI 304 Austenitic Stainless Steel

    Science.gov (United States)

    Akkurt, Adnan

    2011-08-01

    Component surface quality and selection of the optimum material are the main factors determining the performance of components used in machine manufacturing. The level of hole surface quality can be evaluated by measurements of surface roughness, micro-hardness, and cylindricity. In this study, data were obtained for different hole drilling methods. The characteristics of the materials obtained after the applications were compared for different hole-finishing processes to identify the best hole drilling method. AISI 304 austenitic stainless steel material was used. Surface finishing of the holes was performed using drilling, turning, reaming, grinding, honing, and roller burnishing methods. The results of the study show that the roller burnishing method gives the best results for the mechanical and metallurgical properties and hole surface quality of the material. On the other hand, the worst characteristics were obtained with the drilling method.

  10. Students’ attitude to the possibility of applying modern information and communication technologies in the educational process in physical education

    Directory of Open Access Journals (Sweden)

    A.S. Ilnitskaya

    2014-04-01

    Full Text Available Purpose: to analyze the problem of the formation of students' attitudes toward physical education classes and the application of information and communication technologies in physical education in higher education institutions. Material: 245 students participated in the survey. Results: it was found that, according to the students, physical education classes using modern technologies are more efficient than traditional classes (52%), are more emotional in nature and help to improve mood (28%), help to provide students with the latest up-to-date health-related information (26%), and contribute to increased energy expenditure of the body (8%). Conclusion: there is a need for the development and application of information and communication technologies and non-traditional forms of physical education to improve the effectiveness of the educational process in physical education in higher education institutions.

  11. Marine Spatial Planning Applied to the High Seas - Process and Results of an Exercise Focused on the Sargasso Sea

    Science.gov (United States)

    Siuda, A. N.; Smythe, T. C.

    2016-12-01

    The Sargasso Sea, at the center of the North Atlantic gyre, is recognized by the United Nations Convention on Biological Diversity as a globally unique ecosystem threatened by anthropogenic activity. In its stewardship capacity, the Sargasso Sea Commission works within the current system of international organizations and treaties to secure protection for particular species or areas. Without a single governing authority to implement and enforce protective measures across the region, a coordinated management plan for the region is lacking. A research team comprised of 20 advanced undergraduate scientists participating in the spring 2015 SEA Semester: Marine Biodiversity and Conservation program of Sea Education Association (Woods Hole, MA) engaged in a groundbreaking simulated high seas marine spatial planning process resulting in A Marine Management Proposal for the Sargasso Sea. Based on natural and social science research, the interdisciplinary Proposal outlines goals, objectives and realistic strategies that encompass ecological, economic, human use, and future use considerations. Notably, the Proposal is the product of a classroom-based simulation intended to improve emerging scientists' understanding of how research is integrated into the policy process and how organizations work across disciplinary boundaries to address complex ocean management problems. Student researchers identified several discrete management areas and associated policy recommendations for those areas, as well as strategies for coordinated management across the entire Sargasso Sea region. The latter include establishment of a United Nations Regional Ocean Management Organization as well as provisions for monitoring and managing high seas traffic. To make progress toward these strategies, significant attention to the importance of high seas regions for global-scale conservation will be necessary.

  12. DEVELOPMENT OF A KINETIC MODEL OF BOEHMITE DISSOLUTION IN CAUSTIC SOLUTIONS APPLIED TO OPTIMIZE HANFORD WASTE PROCESSING

    Energy Technology Data Exchange (ETDEWEB)

    DISSELKAMP RS

    2011-01-06

    Boehmite (e.g., aluminum oxyhydroxide) is a major non-radioactive component in Hanford and Savannah River nuclear tank waste sludge. Boehmite dissolution from sludge using caustic at elevated temperatures is being planned at Hanford to minimize the mass of material disposed of as high-level waste (HLW) during operation of the Waste Treatment Plant (WTP). To more thoroughly understand the chemistry of this dissolution process, we have developed an empirical kinetic model for aluminate production due to boehmite dissolution. Application of this model to Hanford tank wastes would allow predictability and optimization of the caustic leaching of aluminum solids, potentially yielding significant improvements to overall processing time, disposal cost, and schedule. This report presents an empirical kinetic model that can be used to estimate the aluminate production from the leaching of boehmite in Hanford waste as a function of the following parameters: (1) hydroxide concentration; (2) temperature; (3) specific surface area of boehmite; (4) initial soluble aluminate plus gibbsite present in waste; (5) concentration of boehmite in the waste; and (6) (pre-fit) Arrhenius kinetic parameters. The model was fit to laboratory, non-radioactive (e.g. 'simulant boehmite') leaching results, providing best-fit values of the Arrhenius A-factor, A, and apparent activation energy, E_A, of A = 5.0 x 10^12 hour^-1 and E_A = 90 kJ/mole. These parameters were then used to predict boehmite leaching behavior observed in previously reported actual waste leaching studies. Acceptable aluminate versus leaching time profiles were predicted for waste leaching data from both Hanford and Savannah River site studies.

  13. DEVELOPMENT OF A KINETIC MODEL OF BOEHMITE DISSOLUTION IN CAUSTIC SOLUTIONS APPLIED TO OPTIMIZE HANFORD WASTE PROCESSING

    International Nuclear Information System (INIS)

    Disselkamp, R.S.

    2011-01-01

    Boehmite (e.g., aluminum oxyhydroxide) is a major non-radioactive component in Hanford and Savannah River nuclear tank waste sludge. Boehmite dissolution from sludge using caustic at elevated temperatures is being planned at Hanford to minimize the mass of material disposed of as high-level waste (HLW) during operation of the Waste Treatment Plant (WTP). To more thoroughly understand the chemistry of this dissolution process, we have developed an empirical kinetic model for aluminate production due to boehmite dissolution. Application of this model to Hanford tank wastes would allow predictability and optimization of the caustic leaching of aluminum solids, potentially yielding significant improvements to overall processing time, disposal cost, and schedule. This report presents an empirical kinetic model that can be used to estimate the aluminate production from the leaching of boehmite in Hanford waste as a function of the following parameters: (1) hydroxide concentration; (2) temperature; (3) specific surface area of boehmite; (4) initial soluble aluminate plus gibbsite present in waste; (5) concentration of boehmite in the waste; and (6) (pre-fit) Arrhenius kinetic parameters. The model was fit to laboratory, non-radioactive (e.g. 'simulant boehmite') leaching results, providing best-fit values of the Arrhenius A-factor, A, and apparent activation energy, E_A, of A = 5.0 x 10^12 hour^-1 and E_A = 90 kJ/mole. These parameters were then used to predict boehmite leaching behavior observed in previously reported actual waste leaching studies. Acceptable aluminate versus leaching time profiles were predicted for waste leaching data from both Hanford and Savannah River site studies.
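
    Using the best-fit Arrhenius parameters reported above (A = 5.0 x 10^12 hour^-1, E_A = 90 kJ/mole), the short worked example below shows how strongly the boehmite dissolution rate constant varies with leach temperature. The full empirical model also depends on hydroxide concentration, surface area and other terms that this sketch does not reproduce.

```python
# Worked example with the reported Arrhenius parameters (A = 5.0e12 1/h, E_A = 90 kJ/mol);
# this isolates only the temperature dependence, not the full empirical leaching model.
import math

A = 5.0e12          # pre-exponential factor, 1/hour
EA = 90.0e3         # apparent activation energy, J/mol
R = 8.314           # gas constant, J/(mol*K)

def arrhenius_k(temp_c):
    """Arrhenius rate constant (1/hour) at the given temperature in degrees Celsius."""
    t_k = temp_c + 273.15
    return A * math.exp(-EA / (R * t_k))

for temp in (60, 80, 100):
    k = arrhenius_k(temp)
    # 1/k gives a crude indicative time scale for one e-folding of a first-order process
    print(f"{temp} C: k = {k:.3e} 1/h, 1/k = {1.0 / k:.1f} h")
```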

  14. Agro-ecological aspects when applying the remaining products from agricultural biogas processes as fertilizer in crop production

    Energy Technology Data Exchange (ETDEWEB)

    Bermejo Dominguez, Gabriela

    2012-06-11

    With the increase of biogas production in recent years, the amount of digestates or the remaining residues increased accordingly. Every year in Germany more than 50 million tons of digestates are produced, which are used as fertilizer. Thus nutrients return into the circulation of agricultural ecosystems. However, the agro-ecological effects have not been deeply researched until now. For this reason, the following parameters were quantified: the influence of dry and liquid fermentation products on the yield of three selected crops in comparison to or in combination with mineral-N-fertilizers in on-farm experiments; the growth, development and yield of two selected crops in comparison to mineral-N-fertilizer, liquid manure and farmyard manure in a randomized complete block design; selected soil organisms as compared to mineral-N-fertilizer, liquid manure and farmyard manure in a randomized complete block design. In addition, the mineralization of dry and wet digestates in comparison with liquid manure and farmyard manure was investigated in order to evaluate the effects of different fertilizers on the humus formation under controlled conditions. The 2-year results of on-farm experiments showed that for a sandy soil, the combination of digestates in autumn and mineral-N-fertilizer in spring for winter crops (wheat, rye and rape) brought the highest yields. The wet digestate achieved the highest dry-matter yield as the only fertilizer for maize in spring. In a clayey soil, the use of 150 kg ha{sup -1} N mineral-N-fertilizer brought the highest grain yield. These results were similar to the ones obtained by the application of dry digestates, if they were applied in two doses. Maize showed no significant differences between the dry-matter yields of the different treatments. The results in the field experiments from 2009 to 2011 showed that the effect of digestates on the yield of winter wheat and Sorghum sudanense was up to 15 % lower than the effect of the mineral

  15. An integrated multi attribute decision model for energy efficiency processes in petrochemical industry applying fuzzy set theory

    International Nuclear Information System (INIS)

    Taylan, Osman; Kaya, Durmus; Demirbas, Ayhan

    2016-01-01

    Graphical abstract: Evaluation of compressors by comparing the different cost parameters. - Highlights: • Fuzzy sets and systems are used for decision making in MCDM problems. • Integrated fuzzy AHP and fuzzy TOPSIS approaches are employed for compressor selection. • Compressor selection is a highly complex and non-linear process. • This approach increases the efficiency and reliability of alternative scenarios, and reduces the pay-back period. - Abstract: Energy efficient technologies offered by the market increase productivity. However, decision making about these technologies is often obstructed within firms by organizational barriers. Compressor selection in the petrochemical industry requires assessment of several criteria such as ‘reliability, energy consumption, initial investment, capacity, pressure, and maintenance cost.’ Therefore, air compressor selection is a multi-attribute decision making (MADM) problem. The aim of this study is to select the most eligible compressor(s) so as to avoid the high energy consumption due to the capacity and maintenance costs. It also aims to avoid failures due to reliability problems and high pressure. MADM usually takes place in a vague and imprecise environment. Soft computing techniques such as fuzzy sets and systems can be used for decision making where vague and imprecise knowledge is available. In this study, integrated fuzzy analytical hierarchy process (FAHP) and fuzzy technique for order performance by similarity to ideal solution (TOPSIS) methodologies are employed for the compressor selection. Fuzzy AHP was used to determine the weights of criteria and fuzzy TOPSIS was employed to order the scenarios according to their superiority. The total effect of all criteria was determined for all alternative scenarios to make an optimal decision. Moreover, the types of compressor, carbon emission, waste heat recovery and their capacities were analyzed and compared by statistical
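
    The abstract names fuzzy AHP for the criterion weights and fuzzy TOPSIS for the ranking; the sketch below is a crisp (non-fuzzy) TOPSIS simplification in Python, with hypothetical compressor scores, weights and benefit/cost flags, intended only to show the ranking mechanics (normalize, weight, measure distances to the ideal and anti-ideal solutions, rank by closeness).

        import numpy as np

        # Hypothetical decision matrix: rows = compressor alternatives, columns =
        # criteria (reliability, energy use, investment, capacity, pressure,
        # maintenance cost); values and weights are illustrative only.
        X = np.array([[7.0, 55.0, 120.0, 900.0, 8.5, 10.0],
                      [8.0, 60.0, 100.0, 850.0, 8.0, 12.0],
                      [6.0, 50.0, 140.0, 950.0, 9.0,  9.0]])
        weights = np.array([0.25, 0.20, 0.15, 0.15, 0.10, 0.15])      # e.g. from AHP
        benefit = np.array([True, False, False, True, True, False])   # larger is better?

        # 1. Vector-normalize each criterion column and apply the weights
        V = weights * X / np.linalg.norm(X, axis=0)

        # 2. Ideal and anti-ideal solutions per criterion
        ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
        anti = np.where(benefit, V.min(axis=0), V.max(axis=0))

        # 3. Closeness coefficient = distance to anti-ideal / total distance
        d_plus = np.linalg.norm(V - ideal, axis=1)
        d_minus = np.linalg.norm(V - anti, axis=1)
        closeness = d_minus / (d_plus + d_minus)
        print("ranking (best first):", np.argsort(-closeness))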

  16. Applying the Fuzzy Analytic Hierarchy Process to Construct the Product Innovative Service System of Wedding Photography Apparel

    Directory of Open Access Journals (Sweden)

    Jui-Che Tu

    2015-01-01

    Full Text Available This study aimed to investigate the indicators of the wedding photography apparel product system in order to construct the wedding photography apparel product system indicators and analyze the hierarchical weights of the wedding photography apparel product system indicators. By using the Delphi method and the fuzzy analytic hierarchy process, this study constructed the questionnaire for wedding photography apparel product system indicator weights. This study used the mean and standard deviation to learn about the distribution of opinions of the experts and scholars. According to the findings of this study, for the fuzzy weight analysis results of the wedding photography apparel product system indicators, the most important indicators are the product-service indicators. Moreover, for the product-service indicators, the wedding apparel package service is the most important. For the information platform indicator, the wedding apparel style opinion platform is the most important. For the maintenance and recycling indicators, the wedding apparel second-hand auction/donation is the most important. For the sales market indicators, the wedding apparel store sales/rental is the most important. The main purpose of the indicators of the wedding photography apparel product system constructed in this study is to propose detailed items and connotations to provide a substantial reference and basis of business strategic indicators for the wedding photography enterprises.

  17. Efficient degradation of solid yeast biomass from ethanol industry by Fenton and UV-Fenton processes applying multivariate analysis

    Directory of Open Access Journals (Sweden)

    Geórgia Labuto

    2017-12-01

    Full Text Available Organic agro-industrial residues have been successfully used as biosorbents, and promoting new uses for agricultural wastes benefits the economy. However, the allocation of a solid waste biosorbent after the sorption of contaminants has limited their effective application on a large scale as an alternative treatment of water and wastewaters. One solution could be degradation to convert the biosorbent material and adsorbed organic contaminants into environmentally friendly compounds suitable for discharge. This study used an experimental design to evaluate the Fenton degradation of yeast biomass (YB) from the alcohol industry as a potential biosorbent. The efficiency of degradation was monitored according to the degraded mass (DM) and total organic carbon (TOC) remaining in the solution. The ANOVA showed an error of 9.7% for the effects and the mean of the interactions for the model employed for DM. Conducting the experiments with the best-predicted conditions (60 min, 25 g of YB, pH 3, 8,000 mg L-1 H2O2 and 40 mg L-1 Fe2+) with 30 W UV irradiation resulted in a YB reduction of 72 ± 2% with a TOC of 30 ± 2%. This suggests that an advanced oxidative process is an alternative for degradation of a biosorbent after sorption.

  18. Applying the Context, Input, Process, Product Evaluation Model for Evaluation, Research, and Redesign of an Online Master’s Program

    Directory of Open Access Journals (Sweden)

    Hatice Sancar Tokmak

    2013-07-01

    Full Text Available This study aimed to evaluate and redesign an online master’s degree program consisting of 12 courses from the informatics field using a context, input, process, product (CIPP) evaluation model. Research conducted during the redesign of the online program followed a mixed methodology in which data was collected through a CIPP survey, focus-group interview, and open-ended questionnaire. An initial CIPP survey sent to students, which had a response rate of approximately 60%, indicated that the Fuzzy Logic course did not fully meet the needs of students. Based on these findings, the program managers decided to improve this course, and a focus group was organized with the students of the Fuzzy Logic course in order to obtain more information to help in redesigning the course. Accordingly, the course was redesigned to include more examples and visuals, including videos; student-instructor interaction was increased through face-to-face meetings; and extra meetings were arranged before exams so that additional examples could be presented for problem-solving to satisfy students about assessment procedures. Lastly, the modifications to the Fuzzy Logic course were implemented, and the students in the course were sent an open-ended form asking them what they thought about the modifications. The results indicated that most students were pleased with the new version of the course.

  19. Electrochemical Study of Ni20Cr Coatings Applied by HVOF Process in ZnCl2-KCl at High Temperatures

    Directory of Open Access Journals (Sweden)

    J. Porcayo-Calderón

    2014-01-01

    Full Text Available Corrosion behavior of Ni20Cr coatings deposited by the HVOF (high velocity oxygen-fuel) process was evaluated in ZnCl2-KCl (1:1 mole ratio) molten salts. Electrochemical techniques employed were potentiodynamic polarization curves, open circuit potential, and linear polarization resistance (LPR) measurements. Experimental conditions included static air and temperatures of 350, 400, and 450°C. 304-type SS was evaluated under the same conditions as the Ni20Cr coatings and was used as a reference material to assess the coatings' corrosion resistance. Coatings were evaluated as-deposited and in a ground surface finish condition. Results showed that the Ni20Cr coatings have a better corrosion performance than 304-type SS. Analysis showed that the Ni content of the coatings improved their corrosion resistance, and the low corrosion resistance of 304 stainless steel was attributed to the low stability of Fe and Cr and their oxides in the corrosive media used.

  20. Data Processing and Programming Applied to an Environmental Radioactivity Laboratory; Desarrollo Informatico Aplicado a un Laboratorio de Radiactividad Ambiental

    Energy Technology Data Exchange (ETDEWEB)

    Trinidad, J.A.; Gasco, C.; Palacios, M.A.

    2009-07-01

    This report is the original research work presented for the attainment of the author's master's degree, and its main objective has been the resolution, by means of user-friendly programming, of some of the problems observed in the environmental radioactivity laboratory belonging to the Department of Radiological Surveillance and Environmental Radioactivity at CIEMAT. The software has been developed in Visual Basic for Applications in Excel files and solves three of the detected problems by means of macros: a) calculation of characteristic limits for the measurements of total beta and residual beta activity concentrations according to the MARLAP, ISO and UNE standards and comparison of the three results; b) determination of Pb-210 and Po-210 decontamination factors in the ultra-low level Am-241 analysis of air samples by alpha spectrometry; and c) comparison of two analytical techniques for measuring Pb-210 in air (direct, by gamma spectrometry, and indirect, by radiochemical separation and alpha spectrometry). The organization of the different Excel files involved in the subroutines, the calculations and the required formulae are explained graphically to aid comprehension. The advantage of using this kind of programme lies in its versatility and the ease of obtaining data that are later required in tables, which can be modified over time as the laboratory acquires more data, with special applications for describing a method (Pb-210 decontamination factors for americium analysis in air) or comparing temporal series of Pb-210 data analysed by different methods (Pb-210 in air). (Author)

  1. Applying Hillslope Hydrology to Bridge between Ecosystem and Grid-Scale Processes within an Earth System Model

    Science.gov (United States)

    Subin, Z. M.; Sulman, B. N.; Malyshev, S.; Shevliakova, E.

    2013-12-01

    Soil moisture is a crucial control on surface energy fluxes, vegetation properties, and soil carbon cycling. Its interactions with ecosystem processes are highly nonlinear across a large range, as both drought stress and anoxia can impede vegetation and microbial growth. Earth System Models (ESMs) generally only represent an average soil-moisture state in grid cells at scales of 50-200 km, and as a result are not able to adequately represent the effects of subgrid heterogeneity in soil moisture, especially in regions with large wetland areas. We addressed this deficiency by developing the first ESM-coupled subgrid hillslope-hydrological model, TiHy (Tiled-hillslope Hydrology), embedded within the Geophysical Fluid Dynamics Laboratory (GFDL) land model. In each grid cell, one or more representative hillslope geometries are discretized into land model tiles along an upland-to-lowland gradient. These geometries represent ~1 km hillslope-scale hydrological features and allow for flexible representation of hillslope profile and plan shapes, in addition to variation of subsurface properties among or within hillslopes. Each tile (which may represent ~100 m along the hillslope) has its own surface fluxes, vegetation state, and vertically-resolved state variables for soil physics and biogeochemistry. Resolution of water state in deep layers (~200 m) down to bedrock allows for physical integration of groundwater transport with unsaturated overlying dynamics. Multiple tiles can also co-exist at the same vertical position along the hillslope, allowing the simulation of ecosystem heterogeneity due to disturbance. The hydrological model is coupled to the vertically-resolved Carbon, Organisms, Respiration, and Protection in the Soil Environment (CORPSE) model, which captures non-linearity resulting from interactions between vertically-heterogeneous soil carbon and water profiles. We present comparisons of simulated water table depth to observations. We examine sensitivities to

  2. Signal processing for airborne doppler radar detection of hazardous wind shear as applied to NASA 1991 radar flight experiment data

    Science.gov (United States)

    Baxa, Ernest G., Jr.

    1992-01-01

    Radar data collected during the 1991 NASA flight tests have been selectively analyzed to support research directed at developing both improved and new algorithms for detecting hazardous low-altitude windshear. Analysis of aircraft attitude data from several flights indicated that platform stability bandwidths were small compared to the data rate bandwidths, which should support an assumption that radar returns can be treated as short-time stationary. Various approaches to the detection of weather returns in the presence of ground clutter are being investigated. Non-conventional clutter rejection through spectrum mode tracking and classification algorithms is a subject of continuing research. Based upon autoregressive modeling of the radar return time sequence, this approach may offer an alternative to overcome errors in conventional pulse-pair estimates. Adaptive filtering is being evaluated as a means of rejecting clutter, with emphasis on low signal-to-clutter ratio situations, particularly in the presence of discrete clutter interference. An analysis of out-of-range clutter returns is included to illustrate effects of ground clutter interference due to range aliasing for aircraft on final approach. Data are presented to indicate how aircraft groundspeed might be corrected from the radar data, as well as to point to an observed problem of groundspeed estimate bias variation with radar antenna scan angle. A description of how recorded clutter return data are mixed with simulated weather returns is included. This enables the researcher to run controlled experiments to test signal processing algorithms. In the summary, research efforts involving improved modelling of radar ground clutter returns and a Bayesian approach to hazard factor estimation are mentioned.
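
    The conventional pulse-pair estimator mentioned above recovers the mean Doppler velocity from the phase of the lag-1 autocorrelation of the I/Q time series. The sketch below is a minimal Python illustration on synthetic data; the PRT, wavelength and noise level are assumptions chosen to keep the target inside the Nyquist interval, not values from the NASA experiment.

        import numpy as np

        def pulse_pair_velocity(iq, prt, wavelength):
            """Conventional pulse-pair mean Doppler velocity estimate (m/s).

            iq: complex I/Q samples for one range gate, prt: pulse repetition
            time (s), wavelength: radar wavelength (m).
            """
            r1 = np.mean(iq[1:] * np.conj(iq[:-1]))   # lag-1 autocorrelation
            return -wavelength / (4.0 * np.pi * prt) * np.angle(r1)

        # Synthetic check: 6 m/s scatterer, assumed 3.2 cm wavelength and 1 ms PRT
        # (Nyquist velocity = wavelength / (4 * prt) = 8 m/s), plus weak noise.
        rng = np.random.default_rng(0)
        prt, wavelength, v_true = 1.0e-3, 0.032, 6.0
        n = np.arange(64)
        iq = np.exp(-1j * 4.0 * np.pi * v_true * prt / wavelength * n)
        iq = iq + 0.1 * (rng.standard_normal(64) + 1j * rng.standard_normal(64))
        print(round(pulse_pair_velocity(iq, prt, wavelength), 2), "m/s")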

  3. Redes neurais artificiais aplicadas ao processo de coagulação Artificial neural networks applied to the coagulation process

    Directory of Open Access Journals (Sweden)

    Fábio Conceição de Menezes

    2009-12-01

    Full Text Available Coagulation is a stage in water treatment and, for this, jar tests are performed, which allow determining the required doses of the coagulant and the pH-adjustment chemical in the coagulation process. However, these tests are time-consuming and do not enable real-time responses to changes in raw water quality. To overcome these limitations, artificial multilayer perceptron neural networks were built, trained, validated and tested to predict the sodium hydroxide and aluminum sulfate doses - used as pH-adjustment (alkalizer) and coagulant agents, respectively. The results of the obtained models are compatible with the experimental data, considering that the estimated uncertainties are of the same order of magnitude as the ranges indicated by the jar tests carried out over a period of almost six years.
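
    A minimal sketch of the kind of multilayer perceptron described above, built with scikit-learn on entirely synthetic jar-test records (the raw-water features, dose relationships and network size are assumptions, not the authors' data), to show how one model can be trained to predict the coagulant and alkalizer doses simultaneously.

        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.model_selection import train_test_split
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        # Hypothetical jar-test records: raw-water turbidity (NTU), pH and color
        rng = np.random.default_rng(1)
        X = np.column_stack([rng.uniform(5, 500, 400),     # turbidity
                             rng.uniform(6.0, 8.5, 400),   # pH
                             rng.uniform(10, 300, 400)])   # apparent color
        # Synthetic "true" alum and NaOH doses (mg/L), only to make the sketch run
        y = np.column_stack([0.08 * X[:, 0] + 0.05 * X[:, 2] + rng.normal(0, 2, 400),
                             4.0 * (7.2 - X[:, 1]) + 0.01 * X[:, 0] + rng.normal(0, 0.5, 400)])

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
        model = make_pipeline(StandardScaler(),
                              MLPRegressor(hidden_layer_sizes=(10, 10),
                                           max_iter=5000, random_state=0))
        model.fit(X_tr, y_tr)                      # trains on both doses at once
        print("R^2 on held-out jar tests:", round(model.score(X_te, y_te), 3))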

  4. Immobilized TiO2 on glass spheres applied to heterogeneous photocatalysis: photoactivity, leaching and regeneration process.

    Science.gov (United States)

    Cunha, Deivisson Lopes; Kuznetsov, Alexei; Achete, Carlos Alberto; Machado, Antonio Eduardo da Hora; Marques, Marcia

    2018-01-01

    Heterogeneous photocatalysis using titanium dioxide as catalyst is an attractive advanced oxidation process due to its high chemical stability, good performance and low cost. When immobilized in a supporting material, additional benefits are achieved in the treatment. The purpose of this study was to develop a simple protocol for impregnation of TiO2-P25 on borosilicate glass spheres and evaluate its efficiency in the photocatalytic degradation using an oxidizable substrate (methylene blue), in a Compound Parabolic Concentrator (CPC) reactor. The assays were conducted at lab-scale using radiation, which simulated the solar spectrum. TiO2 leaching from the glass and the catalyst regeneration were both demonstrated. A very low leaching ratio (0.03%) was observed after 24 h of treatment, suggesting that deposition of TiO2 resulted in good adhesion and stability of the photocatalyst on the surface of borosilicate. This deposition was successfully achieved after calcination of the photocatalyst at 400 °C (TiO2-400 °C). The TiO2 film was immobilized on glass spheres and the powder was characterized by scanning electron microscopy (SEM), X-ray diffraction and BET. This characterization suggested that thermal treatment did not introduce substantial changes in the measured microstructural characteristics of the photocatalyst. The immobilized photocatalyst degraded more than 96% of the MB in up to 90 min of reaction. The photocatalytic activity decreased after four photocatalytic cycles, but it was recovered by the removal of contaminants adsorbed on the active sites after washing in water under UV-Vis irradiation. Based on these results, the TiO2-400 °C coated on glass spheres is potentially a very attractive option for removal of persistent contaminants present in the environment.

  5. Quantification of compensatory processes of postnatal hypoxia in newborn piglets applying short-term nonlinear dynamics analysis

    Directory of Open Access Journals (Sweden)

    Walter Bernd

    2011-10-01

    Full Text Available Abstract. Background: Newborn mammals suffering from moderate hypoxia during or after birth are able to compensate for a transitory lack of oxygen by adapting their vital functions. Exposure to hypoxia leads to an increase in the sympathetic tone causing a cardio-respiratory response, peripheral vasoconstriction and vasodilatation in privileged organs like the heart and brain. However, there is only limited information available about the time and intensity changes of the underlying complex processes controlled by the autonomic nervous system. Methods: In this study an animal model involving seven piglets was used to examine an induced state of circulatory redistribution caused by moderate oxygen deficit. In addition to the main focus on the complex dynamics occurring during sustained normocapnic hypoxia, the development of autonomic regulation after induced reoxygenation was analysed. For this purpose, we first introduced a new algorithm to prove stationary conditions in short-term time series. Then we investigated a multitude of indices from heart rate and blood pressure variability and from bivariate interactions, also analysing respiration signals, to quantify the complexity of vegetative oscillations influenced by hypoxia. Results: The results demonstrated that normocapnic hypoxia causes an initial increase in cardiovascular complexity and variability, which decreases during moderate hypoxia lasting one hour (p …). Conclusions: Indices from linear and nonlinear dynamics reflect considerable temporal changes of complexity in autonomous cardio-respiratory regulation due to normocapnic hypoxia shortly after birth. These findings might be suitable for non-invasive clinical monitoring of hypoxia-induced changes of autonomic regulation in newborn humans.

  6. Applying the polarity rapid assessment method to characterize nitrosamine precursors and to understand their removal by drinking water treatment processes.

    Science.gov (United States)

    Liao, Xiaobin; Bei, Er; Li, Shixiang; Ouyang, Yueying; Wang, Jun; Chen, Chao; Zhang, Xiaojian; Krasner, Stuart W; Suffet, I H Mel

    2015-12-15

    Some N-nitrosamines (NAs) have been identified as emerging disinfection by-products during water treatment. Thus, it is essential to understand the characteristics of the NA precursors. In this study, the polarity rapid assessment method (PRAM) and the classical resin fractionation method were studied as methods to fractionate the NA precursors during drinking water treatment. The results showed that PRAM has much higher selectivity for NA precursors than the resin approach. The normalized N-nitrosodimethylamine formation potential (NDMA FP) and N-nitrosodiethylamine (NDEA) FP of four resin fractions were at the same level as the average yield of the bulk organic matter, whereas that of the cationic fraction by PRAM showed 50 times the average. Thus, the cationic fraction was shown to be the most important NDMA precursor contributor. The PRAM method also helped understand which portions of the NA precursor were removed by different water treatment processes. Activated carbon (AC) adsorption removed over 90% of the non-polar PRAM fraction (that sorbs onto the C18 solid phase extraction [SPE] cartridge) of NDMA and NDEA precursors. Bio-treatment removed 80-90% of the cationic fraction of PRAM (that is retained on the cation exchange SPE cartridge) and 40-60% of the non-cationic fractions. Ozonation removed 50-60% of the non-polar PRAM fraction of NA precursors and transformed part of them into the polar fraction. Coagulation and sedimentation had very limited removal of various PRAM fractions of NA precursors. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Applying Multi-Criteria Decision Aiding Techniques in the Process of Project Management within the Wedding Planning Business

    Directory of Open Access Journals (Sweden)

    Dorota Górecka

    2012-01-01

    Full Text Available Numerous problems that emerge in the process of project management can be presented as multi-criteria issues and solved with the help of appropriate methods. The contracting authority, selecting one tender out of many available tenders, assesses them, taking into account various criteria, e.g. price, expected execution time and the contractor's experience. The owner of a company intending to purchase the fixed assets requisite for the realization of the project behaves similarly, i.e. the most advantageous model of the device is chosen, taking into account not only its price but also production capacity, energy intensity, noise emission, service availability, etc. From among many concepts, the investor has to choose a solution which frequently constitutes a compromise between price, functional properties, durability and aesthetics of performance, as well as safety of the utilization and impact on the environment. The choice of an investment location depends not only on the market, financial and supply factors, but also on so called soft factors such as the perceived quality of institutions and the attitude of local communities. All such situations can be described in the same way: taking into account preferences of the decision maker, the best possible choice must be made out of a finite set of alternatives evaluated according to a finite set of criteria. There are many different methods that can be used to aid a decision maker in this choice, including, but not limited to, techniques based on the outranking relation, verbal decision analysis and the MACBETH method. In this article, they will be compared and their applicability to different types of decision making problems will be considered. Furthermore, the PROMETHEE II method with a veto threshold will be presented within the text. Because the application of project management in the wedding planning business has gained wide popularity, as an illustrative example an empirical study of

  8. Image processing applied to automatic detection of defects during ultrasonic examination; Imagerie numerique ultrasonore pour la detection automatique de defauts en controle non destructif

    Energy Technology Data Exchange (ETDEWEB)

    Moysan, J.

    1992-10-01

    This work is a study of image processing applied to ultrasonic BSCAN images obtained in the non-destructive testing of welds. The goal is to define what image processing techniques can contribute to improving the exploitation of the collected data and, more precisely, what image processing can do to extract the meaningful echoes that make it possible to characterize and size the defects. The report presents non-destructive testing by ultrasound in the nuclear field and indicates the specificities of the propagation of ultrasonic waves in austenitic welds. It gives a state of the art of data processing applied to ultrasonic images in non-destructive evaluation. A new image analysis is then developed. It is based on a powerful tool, the co-occurrence matrix. This matrix makes it possible to represent, in a single representation, the relations between the amplitudes of pairs of pixels. From the matrix analysis, a new, complete and automatic method has been established to define a threshold that separates echoes from noise. An automatic interpretation of the ultrasonic echoes is then possible. Complete validation has been carried out with standard test pieces.
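
    A minimal Python sketch of the co-occurrence matrix underlying this analysis: for a chosen pixel offset it counts how often each pair of quantized amplitudes occurs in a BSCAN image. The synthetic image, the 16-level quantization and the offset are illustrative assumptions, and the way the thesis derives the echo/noise threshold from the matrix is not reproduced here.

        import numpy as np

        def cooccurrence_matrix(image, levels=16, dr=0, dc=1):
            """Gray-level co-occurrence matrix for a (dr, dc) offset, dr, dc >= 0.

            Counts how often a pixel quantized to level i has a neighbour at the
            given offset quantized to level j.
            """
            q = np.minimum((image.astype(float) / (image.max() + 1e-12) * levels).astype(int),
                           levels - 1)
            rows, cols = q.shape
            glcm = np.zeros((levels, levels), dtype=np.int64)
            src = q[:rows - dr, :cols - dc]
            dst = q[dr:, dc:]
            np.add.at(glcm, (src.ravel(), dst.ravel()), 1)
            return glcm

        # Synthetic BSCAN-like image: Rayleigh noise plus one bright echo region
        rng = np.random.default_rng(2)
        bscan = rng.rayleigh(0.1, size=(128, 128))
        bscan[60:70, 40:80] += 1.0                 # simulated defect echo
        glcm = cooccurrence_matrix(bscan)
        print(glcm.shape, glcm.sum())              # (16, 16), one count per pixel pair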

  9. Applying multi-physics requirements and loads in FEM analysis and testing—The JET KL11 endoscope design verification process

    International Nuclear Information System (INIS)

    Zauner, C.; Klammer, J.; Hartl, M.; Kampf, D.; Huber, A.; Mertens, Ph.; Schweer, B.; Terra, A.; Balshaw, N.

    2013-01-01

    Considering multi-physics requirements and loads in the early design phase as well as during the later experimental verification is especially important for the design of fusion devices due to the extreme environmental conditions and loads. Typical disciplines in design of fusion devices are thermodynamics, structural-mechanics, electro-magnetics, and optics. The interaction of these disciplines as well as an efficient approach to implement this interaction in numerical and experimental simulations is presented as applied at the new JET KL11 divertor endoscope design and verification process. The endoscope's first pictures already showed the very good performance of the instrument

  10. STUDY REGARDING THE IMPORTANCE OF APPLYING THE DIDACTIC STRATEGIES IN LEARNING AND PERFECTING PROCESS OF WOMEN VAULT YURCHENKO - ROUND OFF FLIC FLAC 1 ½ SALTO BACKWARD STRETCHED OFF

    Directory of Open Access Journals (Sweden)

    Gina Gogean

    2009-03-01

    Full Text Available Artistic gymnastics has registered significant progress at the international level, and the contest requirements for the vault event have risen to a very high achievement level, so that a special training methodology is needed. This study is approached from the point of view of the importance of the instructional strategies applied in the learning and perfecting process of the Yurchenko vault, since these determine the quality of the instruction process and of the competitive behavior. The instructional strategies aim at achieving the operational objectives through a coherent, well-defined methodological system, by efficiently using the best combination of the 3M (methods, materials, means), chosen and adapted to the learning and perfecting process of Yurchenko vaults, after selecting and obeying certain rules, principles and organization methods, in order to provide a high level of quality and efficiency in the training process for achieving the proposed objectives and to allow the junior gymnasts, at the senior level, to approach vaults of higher difficulty.

  11. Control of nanoparticle size and amount by using the mesh grid and applying DC-bias to the substrate in silane ICP-CVD process

    Science.gov (United States)

    Yoo, Seung-Wan; Hwang, Nong-Moon; You, Shin-Jae; Kim, Jung-Hyung; Seong, Dae-Jin

    2017-11-01

    The effect of applying a bias to the substrate on the size and amount of charged crystalline silicon nanoparticles deposited on the substrate was investigated in the inductively coupled plasma chemical vapor deposition process. By inserting the grounded grid with meshes above the substrate, the region just above the substrate was separated from the plasma. Thereby, crystalline Si nanoparticles formed by the gas-phase reaction in the plasma could be deposited directly on the substrate, successfully avoiding the formation of a film. Moreover, the size and the amount of deposited nanoparticles could be changed by applying direct current bias to the substrate. When the grid of 1 × 1-mm-sized mesh was used, the nanoparticle flux was increased as the negative substrate bias increased from 0 to - 50 V. On the other hand, when a positive bias was applied to the substrate, Si nanoparticles were not deposited at all. Regardless of substrate bias voltages, the most frequently observed nanoparticles synthesized with the grid of 1 × 1-mm-sized mesh had the size range of 10-12 nm in common. When the square mesh grid of 2-mm size was used, as the substrate bias was increased from - 50 to 50 V, the size of the nanoparticles observed most frequently increased from the range of 8-10 to 40-45 nm but the amount that was deposited on the substrate decreased.

  12. Spectroscopic methods of process monitoring for safeguards of used nuclear fuel separations

    Science.gov (United States)

    Warburton, Jamie Lee

    To support the demonstration of a more proliferation-resistant nuclear fuel processing plant, techniques and instrumentation to allow the real-time, online determination of special nuclear material concentrations in-process must be developed. An ideal materials accountability technique for proliferation resistance should provide nondestructive, realtime, on-line information of metal and ligand concentrations in separations streams without perturbing the process. UV-Visible spectroscopy can be adapted for this precise purpose in solvent extraction-based separations. The primary goal of this project is to understand fundamental URanium EXtraction (UREX) and Plutonium-URanium EXtraction (PUREX) reprocessing chemistry and corresponding UV-Visible spectroscopy for application in process monitoring for safeguards. By evaluating the impact of process conditions, such as acid concentration, metal concentration and flow rate, on the sensitivity of the UV-Visible detection system, the process-monitoring concept is developed from an advanced application of fundamental spectroscopy. Systematic benchtop-scale studies investigated the system relevant to UREX or PUREX type reprocessing systems, encompassing 0.01-1.26 M U and 0.01-8 M HNO3. A laboratory-scale TRansUranic Extraction (TRUEX) demonstration was performed and used both to analyze for potential online monitoring opportunities in the TRUEX process, and to provide the foundation for building and demonstrating a laboratory-scale UREX demonstration. The secondary goal of the project is to simulate a diversion scenario in UREX and successfully detect changes in metal concentration and solution chemistry in a counter current contactor system with a UV-Visible spectroscopic process monitor. UREX uses the same basic solvent extraction flowsheet as PUREX, but has a lower acid concentration throughout and adds acetohydroxamic acid (AHA) as a complexant/reductant to the feed solution to prevent the extraction of Pu. By examining

  13. Comparative Study on Interaction of Form and Motion Processing Streams by Applying Two Different Classifiers in Mechanism for Recognition of Biological Movement

    Science.gov (United States)

    2014-01-01

    Research on psychophysics, neurophysiology, and functional imaging shows a particular representation of biological movements that involves two pathways. The visual perception of biological movements is formed through the visual system's dorsal and ventral processing streams. The ventral processing stream is associated with form information extraction; on the other hand, the dorsal processing stream provides motion information. The active basic model (ABM), as a hierarchical representation of the human object, revealed novelty in the form pathway by applying a Gabor-based supervised object recognition method. It creates more biological plausibility along with similarity to the original model. A fuzzy inference system is used for motion pattern information in the motion pathway, creating more robustness in the recognition process. Besides, the interaction of these paths is intriguing, and many studies in various fields have considered it. Here, the interaction of the pathways has been investigated in order to obtain more appropriate results. An extreme learning machine (ELM) has been employed as the classification unit of this model because it retains the main properties of artificial neural networks while the difficulty of long training times is substantially diminished. Here, there will be a comparison between two different configurations, interactions using a synergetic neural network and ELM, in terms of accuracy and compatibility. PMID:25276860

  14. Comparative Study on Interaction of Form and Motion Processing Streams by Applying Two Different Classifiers in Mechanism for Recognition of Biological Movement

    Directory of Open Access Journals (Sweden)

    Bardia Yousefi

    2014-01-01

    Full Text Available Research on psychophysics, neurophysiology, and functional imaging shows a particular representation of biological movements that involves two pathways. The visual perception of biological movements is formed through the visual system's dorsal and ventral processing streams. The ventral processing stream is associated with form information extraction; on the other hand, the dorsal processing stream provides motion information. The active basic model (ABM), as a hierarchical representation of the human object, revealed novelty in the form pathway by applying a Gabor-based supervised object recognition method. It creates more biological plausibility along with similarity to the original model. A fuzzy inference system is used for motion pattern information in the motion pathway, creating more robustness in the recognition process. Besides, the interaction of these paths is intriguing, and many studies in various fields have considered it. Here, the interaction of the pathways has been investigated in order to obtain more appropriate results. An extreme learning machine (ELM) has been employed as the classification unit of this model because it retains the main properties of artificial neural networks while the difficulty of long training times is substantially diminished. Here, there will be a comparison between two different configurations, interactions using a synergetic neural network and ELM, in terms of accuracy and compatibility.
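
    A minimal extreme learning machine sketch in Python/NumPy, illustrating why ELM training is fast: the hidden layer is random and fixed, and only the output weights are solved in closed form. The feature dimensions, class structure and network size are synthetic assumptions, not the form/motion features used in the study.

        import numpy as np

        class SimpleELM:
            """Minimal single-hidden-layer extreme learning machine classifier.

            Hidden weights are random and fixed; only the output weights are
            solved in closed form, which is why ELM training is fast compared
            with back-propagation.
            """

            def __init__(self, n_hidden=40, seed=0):
                self.n_hidden = n_hidden
                self.rng = np.random.default_rng(seed)

            def fit(self, X, y):
                targets = np.eye(int(y.max()) + 1)[y]          # one-hot labels
                self.W = self.rng.standard_normal((X.shape[1], self.n_hidden))
                self.b = self.rng.standard_normal(self.n_hidden)
                H = np.tanh(X @ self.W + self.b)               # random hidden layer
                self.beta = np.linalg.pinv(H) @ targets        # closed-form output weights
                return self

            def predict(self, X):
                return np.argmax(np.tanh(X @ self.W + self.b) @ self.beta, axis=1)

        # Tiny usage example on synthetic two-class feature vectors
        rng = np.random.default_rng(1)
        X = np.vstack([rng.normal(0.0, 1.0, (50, 8)), rng.normal(1.5, 1.0, (50, 8))])
        y = np.array([0] * 50 + [1] * 50)
        elm = SimpleELM().fit(X, y)
        print("training accuracy:", (elm.predict(X) == y).mean())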

  15. Applying the food technology neophobia scale in a developing country context. A case-study on processed matooke (cooking banana) flour in Central Uganda.

    Science.gov (United States)

    De Steur, Hans; Odongo, Walter; Gellynck, Xavier

    2016-01-01

    The success of new food technologies largely depends on consumers' behavioral responses to the innovation. In Eastern Africa, and Uganda in particular, a technology to process matooke into flour has been introduced with limited success. We measure and apply the Food Technology Neophobia Scale (FTNS) to this specific case. This technique has been increasingly used in consumer research to determine consumers' fear of foods produced by novel technologies. Although it has been successful in developed countries, the low number and limited scope of past studies underline the need for testing its applicability in a developing country context. Data were collected from 209 matooke consumers from Central Uganda. In general, respondents are relatively neophobic towards the new technology, with an average FTNS score of 58.7%, which hampers the success of processed matooke flour. Besides socio-demographic indicators, 'risk perception', 'healthiness' and the 'necessity of technologies' were key factors that influenced consumers' preference for processed matooke flour. Benchmarking the findings against previous FTNS surveys allows evaluating factor solutions and comparing standardized FTNS scores, and further lends support to the multidimensionality of the FTNS. Being the first application in a developing country context, this study provides a case for examining food technology neophobia for processed staple crops in various regions and cultures. Nevertheless, research is needed to replicate this method and evaluate the external validity of our findings. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. Combinatorial process optimization for negative photo-imageable spin-on dielectrics and investigation of post-apply bake and post-exposure bake interactions

    Science.gov (United States)

    Kim, Jihoon; Zhang, Ruzhi M.; Wolfer, Elizabeth; Patel, Bharatkumar K.; Toukhy, Medhat; Bogusz, Zachary; Nagahara, Tatsuro

    2012-03-01

    Patternable dielectric materials were developed and introduced to reduce semiconductor manufacturing complexity and cost of ownership (CoO). However, the dual functionalities required of photo-imageable spin-on dielectrics (PSOD) place great challenges on material design and development. In this work, we investigated combinatorial process optimization for negative-tone PSOD lithography by employing the Temperature Gradient Plate (TGP) technique, which significantly reduced the number of wafers processed and minimized the development time. We demonstrated that this TGP combinatorial approach is very efficient at evaluating the effects and interactions of several independent variables such as post-apply bake (PAB) and post-exposure bake (PEB). Unlike with most conventional photoresists, PAB turned out to have a great effect on the PSOD pattern profiles. Based on our extensive investigation, we observed a strong correlation between the PAB and PEB processes. In this paper, we discuss the variation of pattern profiles as a matrix of PAB and PEB and propose two possible cross-linking mechanisms for the PSOD materials to explain the unusual experimental results.

  17. A Tooth Flank Crowning Method by Applying a Novel Crossed Angle Function Between the Hob Cutter and Work Gear in the Gear Hobbing Process

    Directory of Open Access Journals (Sweden)

    Wu Yu-Ren

    2016-01-01

    Full Text Available In this paper, a novel longitudinal tooth flank crowning method is proposed by setting the crossed angle between the hob cutter and work gear as a linear function of hob’s traverse feed movement in the gear hobbing process. However, this method makes twisted tooth flanks on the hobbed work gear. Therefore, a variable pressure angle hob cutter is applied to obtain an anti-twist tooth flank of hobbed work gear. A computer simulation example is performed to verify the superiority of the proposed novel hobbing method by comparing topographies of the crowned work gear surfaces hobbed by a standard hob cutter and a variable pressure angle hob cutter.

  18. Quality by design (QbD), Process Analytical Technology (PAT), and design of experiment applied to the development of multifunctional sunscreens.

    Science.gov (United States)

    Peres, Daniela D'Almeida; Ariede, Maira Bueno; Candido, Thalita Marcilio; de Almeida, Tania Santos; Lourenço, Felipe Rebello; Consiglieri, Vladi Olga; Kaneko, Telma Mary; Velasco, Maria Valéria Robles; Baby, André Rolim

    2017-02-01

    Multifunctional formulations are of great importance to ensure better skin protection from the harm caused by ultraviolet (UV) radiation. Despite the advantages of the Quality by Design and Process Analytical Technology approaches for the development and optimization of new products, we found in the literature only a few studies concerning their application in the cosmetic industry. Thus, in this research work, we applied the QbD and PAT approaches to the development of multifunctional sunscreens containing bemotrizinol, ethylhexyl triazone, and ferulic acid. In addition, a UV transmittance method was applied to assess qualitative and quantitative critical quality attributes of the sunscreens using chemometric analyses. Linear discriminant analysis allowed classifying unknown formulations, which is useful for the investigation of counterfeiting and adulteration. Simultaneous quantification of ethylhexyl triazone, bemotrizinol, and ferulic acid present in the formulations was performed using PLS regression. This design allowed us to evaluate the compounds in isolation and in combination and to confirm the antioxidant action of ferulic acid as well as its contribution to the sunscreen performance, since the presence of this component increased the in vitro antioxidant activity by 90%.
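
    A minimal sketch of the PLS-regression step with scikit-learn, using synthetic stand-ins for the UV transmittance spectra and the three analyte contents; the spectral dimension, number of formulations and noise level are assumptions, not the authors' calibration data.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        # Synthetic stand-ins: rows = formulations, X = UV transmittance spectra,
        # Y = contents of the three actives (the real calibration data would come
        # from the UV transmittance measurements described in the abstract).
        rng = np.random.default_rng(3)
        n_formulations, n_wavelengths = 30, 120
        Y = rng.uniform(0.5, 5.0, size=(n_formulations, 3))
        component_spectra = rng.random((3, n_wavelengths))       # stand-in spectra
        X = Y @ component_spectra + rng.normal(0, 0.01, size=(n_formulations, n_wavelengths))

        pls = PLSRegression(n_components=3)
        pls.fit(X, Y)
        print("calibration R^2:", round(pls.score(X, Y), 3))
        print("predicted composition of first sample:", np.round(pls.predict(X[:1]), 2))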

  19. Modulating the resting-state functional connectivity patterns of language processing areas in the human brain with anodal transcranial direct current stimulation applied over the Broca's area.

    Science.gov (United States)

    Cao, Jianwei; Liu, Hanli; Alexandrakis, George

    2018-04-01

    Cortical circuit reorganization induced by anodal transcranial direct current stimulation (tDCS) over Broca's area of the dominant language hemisphere in 13 healthy adults was quantified by functional near-infrared spectroscopy (fNIRS). Transient cortical reorganization patterns in steady-state functional connectivity (seed-based and graph theory analysis) and temporal functional connectivity (sliding window correlation analysis) were recorded before, during, and after applying high current tDCS (1 mA, 8 min). fNIRS connectivity mapping showed that tDCS induced significantly increased functional connectivity between Broca's area and its neighboring cortical regions while it simultaneously decreased the connectivity to remote cortical regions. Furthermore, the anodal stimulation caused significant increases in the functional connectivity variability (FCV) of remote cortical regions related to language processing. In addition to the high current tDCS, low current tDCS (0.5 mA, 2 min 40 s) was also applied to test whether the transient effects of the lower stimulation current could qualitatively predict the cortical connectivity alterations induced by the higher current. Interestingly, low current tDCS could qualitatively predict the increase in clustering coefficient and FCV but not the enhancement of local connectivity. Our findings indicate the possibility of combining fNIRS with tDCS at lower currents in future studies to help guide therapeutic interventions.

  20. Applied Macroeconomics

    NARCIS (Netherlands)

    Heijman, W.J.M.

    2000-01-01

    This book contains a course in applied macroeconomics. Macroeconomic theory is applied to real world cases. Students are expected to compute model results with the help of a spreadsheet program. To that end the book also contains descriptions of the spreadsheet applications used, such as linear

  1. Applied Electromagnetics

    International Nuclear Information System (INIS)

    Yamashita, H.; Marinova, I.; Cingoski, V.

    2002-01-01

    These proceedings contain papers relating to the 3rd Japanese-Bulgarian-Macedonian Joint Seminar on Applied Electromagnetics. Included are the following groups: Numerical Methods I; Electrical and Mechanical System Analysis and Simulations; Inverse Problems and Optimizations; Software Methodology; Numerical Methods II; Applied Electromagnetics

  2. Proposal for implementation risk management according ABNT NBR ISO 31000 standard applied to internal audit process of Integrated Management System of IPEN

    International Nuclear Information System (INIS)

    Scapin Junior, Wilson S.; Salvetti, Tereza C.; Longo, Guilherme C.

    2015-01-01

    The objective of this paper is to establish a risk management methodology applied to the internal audit processes of the IPEN Integrated Management System (IMS). In the continuous search for updated methodologies to support effective management in the face of constant changes in the organizational world, and for management tools to be used in decision making, risk management is emerging as a new, highly efficient tool. This trend is accentuated by the fact that risk management is being incorporated into the new revision of the quality management standard ISO 9001, whose conclusion is estimated for November 2015. The identification, evaluation and treatment of risks are present in eleven items of the ten requirements of the new revision. Once the revision is concluded, all organizations certified to that standard will have to make the necessary changes in their systems to meet the new requirements. This proposal will make it possible to anticipate the changes that will occur in the management system of IPEN in accordance with this new revision. It has the character of a pilot program to implement the organizational culture change related to the new concepts of risk, and to implement risk management in all the other system processes that will be affected by the new revision of this standard. The methodology used for this paper is supported by the ABNT NBR ISO 31000 standard. (author)

  3. Proposal for implementation risk management according ABNT NBR ISO 31000 standard applied to internal audit process of Integrated Management System of IPEN

    Energy Technology Data Exchange (ETDEWEB)

    Scapin Junior, Wilson S.; Salvetti, Tereza C.; Longo, Guilherme C., E-mail: wsscapin@ipen.br, E-mail: salvetti@ipen.br, E-mail: glongo@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    The objective of this paper is to establish a risk management methodology applied to the internal audit processes of the IPEN Integrated Management System (IMS). In the continuous search for updated methodologies to support effective management in the face of constant changes in the organizational world, and for management tools to be used in decision making, risk management is emerging as a new, highly efficient tool. This trend is accentuated by the fact that risk management is being incorporated into the new revision of the quality management standard ISO 9001, whose conclusion is estimated for November 2015. The identification, evaluation and treatment of risks are present in eleven items of the ten requirements of the new revision. Once the revision is concluded, all organizations certified to that standard will have to make the necessary changes in their systems to meet the new requirements. This proposal will make it possible to anticipate the changes that will occur in the management system of IPEN in accordance with this new revision. It has the character of a pilot program to implement the organizational culture change related to the new concepts of risk, and to implement risk management in all the other system processes that will be affected by the new revision of this standard. The methodology used for this paper is supported by the ABNT NBR ISO 31000 standard. (author)

  4. A new automated assessment method for contrast–detail images by applying support vector machine and its robustness to nonlinear image processing

    International Nuclear Information System (INIS)

    Takei, Takaaki; Ikeda, Mitsuru; Imai, Kumiharu; Yamauchi-Kawaura, Chiyo; Kato, Katsuhiko; Isoda, Haruo

    2013-01-01

    The automated contrast–detail (C–D) analysis methods developed so far cannot be expected to work well on images processed with nonlinear methods, such as noise reduction methods. Therefore, we have devised a new automated C–D analysis method by applying a support vector machine (SVM), and tested it for robustness to nonlinear image processing. We acquired the CDRAD (a commercially available C–D test object) images at a tube voltage of 120 kV and a milliampere-second product (mAs) of 0.5–5.0. A partial diffusion equation based technique was used as the noise reduction method. Three radiologists and three university students participated in the observer performance study. The training data for our SVM method were the classification data scored by one radiologist for the CDRAD images acquired at 1.6 and 3.2 mAs and their noise-reduced images. We also compared the performance of our SVM method with the CDRAD Analyser algorithm. The mean C–D diagrams (that is, a plot of the mean of the smallest visible hole diameter vs. hole depth) obtained from our devised SVM method agreed well with the ones averaged across the six human observers for both the original and noise-reduced CDRAD images, whereas the mean C–D diagrams from the CDRAD Analyser algorithm disagreed with the ones from the human observers for both the original and noise-reduced CDRAD images. In conclusion, our proposed SVM method for C–D analysis will work well for images processed with the non-linear noise reduction method as well as for the original radiographic images.
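
    A hedged sketch of how an SVM can be trained to reproduce an observer's visible/not-visible scoring of CDRAD holes, using scikit-learn and synthetic data; the feature choice (diameter, depth, contrast-to-noise ratio) and the labels are assumptions for illustration and do not reproduce the authors' feature extraction.

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        # Hypothetical training set: one row per CDRAD hole, with the hole
        # diameter (mm), hole depth (mm) and a local contrast-to-noise ratio,
        # plus a label saying whether the reference observer scored it visible.
        rng = np.random.default_rng(4)
        diameter = rng.uniform(0.3, 8.0, 600)
        depth = rng.uniform(0.3, 8.0, 600)
        cnr = 1.5 * diameter * depth / 8.0 + rng.normal(0, 0.4, 600)
        X = np.column_stack([diameter, depth, cnr])
        y = (cnr > 1.2).astype(int)                   # synthetic "visible" labels

        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
        clf.fit(X, y)
        print("training accuracy:", round(clf.score(X, y), 3))
        # The smallest diameter predicted visible at each depth then traces out
        # an automated C-D diagram for the image being analysed.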

  5. Design and construction of coal/biomass to liquids (CBTL) process development unit (PDU) at the University of Kentucky Center for Applied Energy Research (CAER)

    Energy Technology Data Exchange (ETDEWEB)

    Placido, Andrew [Univ. of Kentucky, Lexington, KY (United States); Liu, Kunlei [Univ. of Kentucky, Lexington, KY (United States); Challman, Don [Univ. of Kentucky, Lexington, KY (United States); Andrews, Rodney [Univ. of Kentucky, Lexington, KY (United States); Jacques, David [Univ. of Kentucky, Lexington, KY (United States)

    2015-10-30

    This report describes a first phase of a project to design, construct and commission an integrated coal/biomass-to-liquids facility at a capacity of 1 bbl. /day at the University of Kentucky Center for Applied Energy Research (UK-CAER) – specifically for construction of the building and upstream process units for feed handling, gasification, and gas cleaning, conditioning and compression. The deliverables from the operation of this pilot plant [when fully equipped with the downstream process units] will be firstly the liquid FT products and finished fuels which are of interest to UK-CAER’s academic, government and industrial research partners. The facility will produce research quantities of FT liquids and finished fuels for subsequent Fuel Quality Testing, Performance and Acceptability. Moreover, the facility is expected to be employed for a range of research and investigations related to: Feed Preparation, Characteristics and Quality; Coal and Biomass Gasification; Gas Clean-up/ Conditioning; Gas Conversion by FT Synthesis; Product Work-up and Refining; Systems Analysis and Integration; and Scale-up and Demonstration. Environmental Considerations - particularly how to manage and reduce carbon dioxide emissions from CBTL facilities and from use of the fuels - will be primary research objectives. Such a facility has required significant lead time for environmental review, architectural/building construction, and EPC services. UK, with DOE support, has advanced the facility in several important ways. These include: a formal EA/FONSI, and permits and approvals; construction of a building; selection of a range of technologies and vendors; and completion of the upstream process units. The results of this project are the FEED and detailed engineering studies, the alternate configurations and the as-built plant - its equipment and capabilities for future research and demonstration and its adaptability for re-purposing to meet other needs. These are described in

  6. Applied superconductivity

    CERN Document Server

    Newhouse, Vernon L

    1975-01-01

    Applied Superconductivity, Volume II, is part of a two-volume series on applied superconductivity. The first volume dealt with electronic applications and radiation detection, and contains a chapter on liquid helium refrigeration. The present volume discusses magnets, electromechanical applications, accelerators, and microwave and rf devices. The book opens with a chapter on high-field superconducting magnets, covering applications and magnet design. Subsequent chapters discuss superconductive machinery such as superconductive bearings and motors; rf superconducting devices; and future prospec

  7. MO-G-BRE-05: Clinical Process Improvement and Billing in Radiation Oncology: A Case Study of Applying FMEA for CPT Code 77336 (continuing Medical Physics Consultation)

    International Nuclear Information System (INIS)

    Spirydovich, S; Huq, M

    2014-01-01

    Purpose: The improvement of quality in healthcare can be assessed by Failure Mode and Effects Analysis (FMEA). In radiation oncology, FMEA, as applied to the billing CPT code 77336, can improve both charge capture and, most importantly, the quality of the performed services. Methods: We created an FMEA table for the process performed under CPT code 77336. For a given process step, each member of the assembled team (physicist, dosimetrist, and therapist) independently assigned numerical values for: probability of occurrence (O, 1–10), severity (S, 1–10), and probability of detection (D, 1–10) for every failure mode cause and effect combination. The risk priority number, RPN, was then calculated as a product of O, S and D, from which an average RPN was calculated for each combination mentioned above. A fault tree diagram, with each process sorted into 6 categories, was created with linked RPN. For processes with high RPN, recommended actions were assigned. Two separate record-and-verify (R and V) systems (Lantis and EMR-based ARIA) were considered. Results: We identified 9 potential failure modes and 19 corresponding potential causes of these failure modes, all resulting in an unjustified 77336 charge and compromised quality of care. In Lantis, the range of RPN was 24.5–110.8, and that of S values was 2–10. The highest-ranking RPN of 110.8 came from the failure mode described as “end-of-treatment check not done before the completion of treatment”, and the highest S value of 10 (RPN=105) from “overrides not checked”. For the same failure modes, within the ARIA electronic environment with its additional controls, RPN values were significantly lower (44.3 for the missing end-of-treatment check and 20.0 for overrides not checked). Conclusion: Our work has shown that missed charge capture was also associated with some services not being performed. Absence of such necessary services may result in sub-optimal quality of care rendered to patients
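
    A small Python sketch of the RPN bookkeeping described above: each team member's occurrence, severity and detection scores give RPN = O x S x D, and the combination's ranking uses the average across members; the numerical scores shown are hypothetical.

        # Hypothetical FMEA scores for one failure-mode cause/effect combination,
        # assigned independently by the three team members (occurrence O,
        # severity S and detectability D, each on a 1-10 scale).
        scores = {
            "physicist":   {"O": 4, "S": 9,  "D": 3},
            "dosimetrist": {"O": 3, "S": 10, "D": 4},
            "therapist":   {"O": 4, "S": 8,  "D": 3},
        }
        rpns = [v["O"] * v["S"] * v["D"] for v in scores.values()]   # RPN = O x S x D
        average_rpn = sum(rpns) / len(rpns)
        print("individual RPNs:", rpns, "average RPN:", round(average_rpn, 1))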

  8. IRECCSEM: Evaluating Clare Basin potential for onshore carbon sequestration using magnetotelluric data (Preliminary results). New approaches applied for processing, modeling and interpretation

    Science.gov (United States)

    Campanya i Llovet, J.; Ogaya, X.; Jones, A. G.; Rath, V.

    2014-12-01

    The IRECCSEM project (www.ireccsem.ie) is a Science Foundation Ireland Investigator Project that is funded to evaluate Ireland's potential for onshore carbon sequestration in saline aquifers by integrating new electromagnetic data with existing geophysical and geological data. The main goals of the project are to determine porosity-permeability values of the potential reservoir formation as well as to evaluate the integrity of the seal formation. During the summer of 2014 a magnetotelluric (MT) survey was carried out in the Clare basin (Ireland). A total of 140 sites were acquired, including audiomagnetotelluric (AMT), broadband magnetotelluric (BBMT) and long period magnetotelluric (LMT) data. The nominal spacing between sites is 0.6 km for AMT sites, 1.2 km for BBMT sites and 8 km for LMT sites. To evaluate the potential for carbon sequestration of the Clare basin, three advances in geophysical methodology related to electromagnetic techniques were applied. First, processing of the MT data was improved following the recently published ELICIT methodology. Second, during the inversion process, the electrical resistivity distribution of the subsurface was constrained by combining three different tensor relationships: impedances (Z), induction arrows (TIP) and multi-site horizontal magnetic transfer functions (HMT). Results from synthetic models were used to evaluate the sensitivity and properties of each tensor relationship. Finally, a computer code was developed which employs a stabilized least squares approach to estimate the cementation exponent in the generalized Archie law formulated by Glover (2010). This allows MT-derived electrical resistivity models to be related to porosity distributions. The final aim of this procedure is to generalize the porosity-permeability values measured in boreholes to regional scales. This methodology will contribute to the evaluation of possible sequestration targets in the study area.
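    As a simplified illustration of the resistivity-to-porosity step, the sketch below fits a cementation exponent by least squares and inverts a resistivity value for porosity using the classical single-phase Archie relation rather than Glover's (2010) multi-phase generalisation used in the project; the borehole calibration data and pore-fluid resistivity are hypothetical.

```python
# Sketch of relating resistivity to porosity via the classical Archie law
# rho_bulk = rho_fluid * phi**(-m); m fitted by least squares in log space.
# Calibration data and fluid resistivity are hypothetical.

import numpy as np

phi_obs = np.array([0.08, 0.12, 0.15, 0.20, 0.25])   # porosity (fraction)
rho_obs = np.array([95.0, 48.0, 33.0, 20.0, 13.0])    # bulk resistivity (ohm.m)
rho_fluid = 0.5                                       # assumed pore-fluid resistivity (ohm.m)

# log10(rho_bulk / rho_fluid) = -m * log10(phi)  ->  linear least squares for m
m, *_ = np.linalg.lstsq(
    -np.log10(phi_obs)[:, None], np.log10(rho_obs / rho_fluid), rcond=None)
m = float(m[0])
print(f"fitted cementation exponent m = {m:.2f}")

# invert one resistivity-model value to porosity with the fitted exponent
rho_model = 40.0   # ohm.m, e.g. one cell of the 3-D resistivity model
phi_est = (rho_fluid / rho_model) ** (1.0 / m)
print(f"estimated porosity = {phi_est:.3f}")
```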

  9. Applying accreditation standards in a self-evaluation process: The experience of Educational Development Center of Tehran University of Medical Sciences

    Directory of Open Access Journals (Sweden)

    A Mirzazadeh

    2016-03-01

    Introduction: Educational Development Centers (EDCs), as the coordinators of education development in medical sciences universities, should evaluate their activities in order to improve their quality. In spite of the remarkable performance of the Tehran University of Medical Sciences (TUMS) EDC in previous national rankings, it faces many challenges and problems. This paper presents the process, results and lessons learned from a self-evaluation conducted at the TUMS EDC based on accreditation standards. Method: The present study is an institutional self-evaluation based on the national accreditation standards for EDCs (2012). Data were gathered using an open-ended questionnaire developed on the basis of the SWOT format. Directional content analysis was applied to analyze the data. Results: In total, 84 strengths, 87 weaknesses, 15 opportunities, 24 threats and 99 recommendations for quality improvement were reported. The most important strengths of the center were an established mechanism for research in education and the scholarship of education, the holding of various faculty development courses, and standardized patient training. The most important weaknesses were the lack of specified procedures in some areas, such as monitoring the planning and review of educational programs and the evaluation of empowerment courses. Conclusion: The evaluation results will be useful in directing future policies of the TUMS EDC, such as revising its strategic planning. We hope that this experience can be helpful for administrators of EDCs in the Ministry of Health and Medical Education and in other medical sciences universities.

  10. Applied Macroeconometrics

    OpenAIRE

    Nektarios Aslanidis

    2017-01-01

    This book treats econometric methods for applied analysis, with a particular focus on applications in macroeconomics. Topics include macroeconomic data, panel data models, unobserved heterogeneity, model comparison, endogeneity, dynamic econometric models, vector autoregressions, forecast evaluation, and structural identification. The book provides undergraduate students with the necessary knowledge to be able to undertake econometric analysis in modern macroeconomic research.

  11. Applied dynamics

    CERN Document Server

    Schiehlen, Werner

    2014-01-01

    Applied Dynamics is an important branch of engineering mechanics widely applied to mechanical and automotive engineering, aerospace and biomechanics as well as control engineering and mechatronics. The computational methods presented are based on common fundamentals. For this purpose analytical mechanics turns out to be very useful, where D’Alembert’s principle in the Lagrangian formulation proves to be most efficient. The methods of multibody systems, finite element systems and continuous systems are treated consistently. Thus, students get a much better understanding of dynamical phenomena, and engineers in design and development departments using computer codes may check the results more easily by choosing models of different complexity for vibration and stress analysis.

  12. Applied optics

    International Nuclear Information System (INIS)

    Orszag, A.; Antonetti, A.

    1988-01-01

    The 1988 progress report of the Applied Optics Laboratory of the Polytechnic School (France) is presented. The optical fiber activities are focused on the development of an optical gyrometer containing a resonance cavity. The following domains are included in the research program: infrared laser physics, laser sources, semiconductor physics, multiple-photon ionization and nonlinear optics. Investigations in the biomedical, biological and biophysical domains are carried out. The published papers and the congress communications are listed [fr

  13. Applied mathematics

    International Nuclear Information System (INIS)

    Nedelec, J.C.

    1988-01-01

    The 1988 progress report of the Applied Mathematics Center of the Polytechnic School (France) is presented. The research fields of the Center are scientific computation, probability and statistics, and video image synthesis. The research topics developed are: the analysis of numerical methods, the mathematical analysis of the fundamental models of physics and mechanics, the numerical solution of complex models related to industrial problems, stochastic calculus and Brownian motion, stochastic partial differential equations, the identification of adaptive filtering parameters, discrete element systems, statistics, stochastic control, and the development of image synthesis techniques for education and research programs. The published papers, the congress communications and the theses are listed [fr

  14. Applying radiation

    International Nuclear Information System (INIS)

    Mallozzi, P.J.; Epstein, H.M.; Jung, R.G.; Applebaum, D.C.; Fairand, B.P.; Gallagher, W.J.; Uecker, R.L.; Muckerheide, M.C.

    1979-01-01

    The invention discloses a method and apparatus for applying radiation by producing X-rays of a selected spectrum and intensity and directing them to a desired location. Radiant energy is directed from a laser onto a target to produce such X-rays at the target, which is so positioned adjacent to the desired location as to emit the X-rays toward the desired location; or such X-rays are produced in a region away from the desired location, and are channeled to the desired location. The radiant energy directing means may be shaped (as with bends; adjustable, if desired) to circumvent any obstruction between the laser and the target. Similarly, the X-ray channeling means may be shaped (as with fixed or adjustable bends) to circumvent any obstruction between the region where the X-rays are produced and the desired location. For producing a radiograph in a living organism the X-rays are provided in a short pulse to avoid any blurring of the radiograph from movement of or in the organism. For altering tissue in a living organism the selected spectrum and intensity are such as to affect substantially the tissue in a preselected volume without injuring nearby tissue. Typically, the selected spectrum comprises the range of about 0.1 to 100 keV, and the intensity is selected to provide about 100 to 1000 rads at the desired location. The X-rays may be produced by stimulated emission thereof, typically in a single direction

  15. Characterization of the effect generated by the preformed and formed processes applied to drainage catheters of QuadrathaneTM, in the blistered defect

    International Nuclear Information System (INIS)

    Rodriguez Forero, Diana Catalina

    2014-01-01

    The effect of the preforming and forming processes on the blistered defect in drainage catheters is characterized. The potential root causes of the blistered defect are identified through a one-factor-at-a-time experimental design. The experimental phases performed on the blistered defect were: chemical interaction, humidity, mechanical stress, RO bonding parameters, and the temperature and retention-time parameters of the forming process. The application of a quality-control process methodology is recommended to obtain robust information about the defect and the process in general. The polymeric extrusion and drainage catheter construction processes are described, and the preforming process, the forming process and the blistered defect are explained. The incidence of the blistered defect and the yield of each catheter batch produced should be controlled by means of weekly records to avoid further complications at the level of yield or quality [es

  16. Development of an Integrated Multi-Contaminant Removal Process Applied to Warm Syngas Cleanup for Coal-Based Advanced Gasification Systems

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Howard

    2010-11-30

    This project met its objective of furthering the development of an integrated multi-contaminant removal process in which H2S, NH3, HCl and heavy metals including Hg, As, Se and Cd present in the coal-derived syngas can be removed to specified levels in a single/integrated process step. The process supports the mission and goals of the Department of Energy's Gasification Technologies Program, namely to enhance the performance of gasification systems, thus enabling U.S. industry to improve the competitiveness of gasification-based processes. The gasification program will reduce equipment costs, improve process environmental performance, and increase process reliability and flexibility. Two sulfur conversion concepts were tested in the laboratory under this project, i.e., the solvent-based University of California Sulfur Recovery Process High Pressure (UCSRP-HP) and the catalytic, direct oxidation (DO) section of the CrystaSulf-DO process. Each process required a polishing unit to meet the ultra-clean sulfur content goals of <50 ppbv (parts per billion by volume) as may be necessary for fuel cells or chemical production applications. UCSRP-HP was also tested for the removal of trace, non-sulfur contaminants, including ammonia, hydrogen chloride, and heavy metals. A bench-scale unit was commissioned and limited testing was performed with simulated syngas. Aspen-Plus®-based computer simulation models were prepared and the economics of the UCSRP-HP and CrystaSulf-DO processes were evaluated for a nominal 500 MWe, coal-based, IGCC power plant with carbon capture. This report covers the progress on the UCSRP-HP technology development and the CrystaSulf-DO technology.

  17. The influence of multi-frontal teaching method on the effectiveness of the teaching process in the applied studies in technical sciences

    Directory of Open Access Journals (Sweden)

    Novković Dragan

    2015-01-01

    The orientation towards equity and quality in education clearly imposes the need for an individual approach to each student. This situation is especially pronounced in higher education institutions of applied studies in the field of technology, whose primary goal is very often individual training in the use of highly specialized software and hardware tools. In such a situation it is necessary to move away from the classical ex-cathedra methodology and to develop student-centred learning environments. The multi-frontal teaching method has so far been experimentally analyzed at the level of primary and secondary education in Serbia, where it showed results suggesting that additional research is warranted. The research presented in this paper investigates the effectiveness of this method applied in a higher education institution of applied studies in the domain of technology. The results of the conducted research indicate that the application of the multi-frontal teaching method has a positive effect on students' performance, self-efficacy and overall sense of personal gain and satisfaction.

  18. Applying Failure Modes, Effects, And Criticality Analysis And Human Reliability Analysis Techniques To Improve Safety Design Of Work Process In Singapore Armed Forces

    Science.gov (United States)

    2016-09-01

    Recoverable fragments of the source compare the advantages and disadvantages of candidate human reliability analysis (HRA) methods (THERP, HEART and SPAR-H; comparison adapted from Bell and Holroyd, 2009) and study the feasibility of adopting HEART as an alternate hazard assessment technique for human-centric work processes in the Singapore Armed Forces.

  19. Applying process mapping and analysis as a quality improvement strategy to increase the adoption of fruit, vegetable, and water breaks in Australian primary schools.

    Science.gov (United States)

    Biggs, Janice S; Farrell, Louise; Lawrence, Glenda; Johnson, Julie K

    2014-03-01

    Over the past decade, public health policy in Australia has prioritized the prevention and control of obesity and invested in programs that promote healthy eating-related behaviors, which includes increasing fruit and vegetable consumption in children. This article reports on a study that used process mapping and analysis as a quality improvement strategy to improve the delivery of a nutrition primary prevention program delivered in primary schools in New South Wales, Australia. Crunch&Sip® has been delivered since 2008. To date, adoption is low with only 25% of schools implementing the program. We investigated the cause of low adoption and propose actions to increase school participation. We conducted semistructured interviews with key stakeholders and analyzed the process of delivering Crunch&Sip to schools. Interviews and process mapping and analysis identified a number of barriers to schools adopting the program. The analyses identified the need to simplify and streamline the process of delivering the program to schools and introduce monitoring and feedback loops to track ongoing participation. The combination of stakeholder interviews and process mapping and analysis provided important practical solutions to improving program delivery and also contributed to building an understanding of factors that help and hinder program adoption. The insight provided by this analysis helped identify usable routine measures of adoption, which were an improvement over those used in the existing program plan. This study contributed toward improving the quality and efficiency of delivering a health promoting program to work toward achieving healthy eating behaviors in children.

  20. Evaluation of diamide insecticides co-applied with other agrochemicals at various times to manage Ostrinia nubilalis in processing snap bean.

    Science.gov (United States)

    Huseth, Anders S; Groves, Russell L; Chapman, Scott A; Nault, Brian A

    2015-12-01

    Multiple applications of pyrethroid insecticides are used to manage European corn borer, Ostrinia nubilalis Hübner, in snap bean, but new diamide insecticides may reduce application frequency. In a 2 year small-plot study, O. nubilalis control was evaluated by applying cyantraniliprole (diamide) and bifenthrin (pyrethroid) insecticides at one of three phenological stages (bud, bloom and pod formation) of snap bean development. Co-application of these insecticides with either herbicides or fungicides was also examined as a way to reduce the total number of sprays during a season. Cyantraniliprole applications timed either during bloom or during pod formation controlled O. nubilalis better than similar timings of bifenthrin. Co-applications of insecticides with fungicides controlled O. nubilalis as well as insecticide applications alone. Insecticides applied either alone or with herbicides during bud stage did not control this pest. Diamides are an alternative to pyrethroids for the management of O. nubilalis in snap bean. Adoption of diamides by snap bean growers could improve the efficiency of production by reducing the number of sprays required each season. © 2015 Society of Chemical Industry.

  1. Experimental device, corresponding forward model and processing of the experimental data using wavelet analysis for tomographic image reconstruction applied to eddy current nondestructive evaluation

    International Nuclear Information System (INIS)

    Joubert, P.Y.; Madaoui, N.

    1999-01-01

    In the context of eddy current non-destructive evaluation using a tomographic image reconstruction process, the success of the reconstruction depends not only on the choice of the forward model and of the inversion algorithms, but also on the ability to extract the pertinent data from the raw signal provided by the sensor. We present in this paper an experimental device designed for imaging purposes, the corresponding forward model, and a pre-processing of the experimental data using wavelet analysis. These three steps, implemented together with an inversion algorithm, will in the future allow image reconstruction of 3-D flaws. (authors)

  2. Evaluation Model for Applying an E-Learning System in a Course: An Analytic Hierarchy Process-Multi-Choice Goal Programming Approach

    Science.gov (United States)

    Lin, Teng-Chiao; Ho, Hui-Ping; Chang, Ching-Ter

    2014-01-01

    With the widespread use of the Internet, adopting e-learning systems in courses has gradually become more and more important in universities in Taiwan. However, because of limitations of teachers' time, selecting suitable online IT tools has become very important. This study proposes an analytic hierarchy process (AHP)-multi-choice goal…
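    As a hedged illustration of the AHP step mentioned above (not the authors' actual criteria or judgements), the sketch below derives priority weights for three hypothetical e-learning tools from a pairwise comparison matrix via the principal eigenvector and checks Saaty's consistency ratio.

```python
# AHP priority weights from a pairwise comparison matrix (illustrative 3x3 matrix).

import numpy as np

A = np.array([
    [1.0, 3.0, 5.0],   # hypothetical judgements: tool 1 vs tools 1..3
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)              # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                             # normalised priority weights

lam_max = eigvals.real[k]
n = A.shape[0]
ci = (lam_max - n) / (n - 1)             # consistency index
ri = 0.58                                # Saaty's random index for n = 3
print("weights:", np.round(w, 3), " consistency ratio CR =", round(ci / ri, 3))
```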

  3. Using social network analysis tools in ecology : Markov process transition models applied to the seasonal trophic network dynamics of the Chesapeake Bay

    NARCIS (Netherlands)

    Johnson, Jeffrey C.; Luczkovich, Joseph J.; Borgatti, Stephen P.; Snijders, Tom A. B.; Luczkovich, S.P.

    2009-01-01

    Ecosystem components interact in complex ways and change over time due to a variety of both internal and external influences (climate change, season cycles, human impacts). Such processes need to be modeled dynamically using appropriate statistical methods for assessing change in network structure.

  4. A Comparative Study of Applying Active-Set and Interior Point Methods in MPC for Controlling Nonlinear pH Process

    Directory of Open Access Journals (Sweden)

    Syam Syafiie

    2014-06-01

    A comparative study of Model Predictive Control (MPC) using the active-set method and the interior point method is presented as a control technique for a highly non-linear pH process. The process is a strong acid-strong base system: a strong acid (hydrochloric acid, HCl) and a strong base (sodium hydroxide, NaOH), in the presence of a sodium bicarbonate (NaHCO3) buffer solution, are used in a neutralization process flowing into a reactor. The non-linear pH neutralization model governing this process is represented by multi-linear models. The performance of both controllers is studied by evaluating their set-point tracking and disturbance rejection. In addition, the optimization time is compared between the two methods; both MPC formulations show similar performance with no overshoot, offset, or oscillation. However, the conventional active-set method gives a shorter control computation time for small-scale optimization problems compared to the interior-point MPC for pH control.
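    A minimal sketch of the kind of optimization being compared is given below: a toy linear MPC problem (not the paper's multi-linear pH model) solved with two scipy optimizers, where SLSQP stands in for an active-set-type method and trust-constr for an interior-point method; the plant, horizon, set-point and weights are assumptions made for illustration.

```python
# Toy MPC optimization solved with two scipy methods, for illustration only.

import numpy as np
from scipy.optimize import minimize

a, b = 0.9, 0.3                       # toy first-order plant x[k+1] = a*x[k] + b*u[k]
N, x0, r, lam = 10, 0.0, 7.0, 0.05    # horizon, initial state, set-point, input weight

def cost(u):
    """Quadratic tracking cost over the horizon for an input sequence u."""
    x, J = x0, 0.0
    for uk in u:
        x = a * x + b * uk
        J += (x - r) ** 2 + lam * uk ** 2
    return J

u0 = np.zeros(N)
bounds = [(-5.0, 5.0)] * N            # actuator limits

for method in ("SLSQP", "trust-constr"):
    res = minimize(cost, u0, method=method, bounds=bounds)
    print(f"{method:12s} J* = {res.fun:.3f}  first move u0 = {res.x[0]:.3f}")
```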

  5. RESEARCH OF THE ALLOYING PROCESS OF FUSED COATINGS OBTAINED FROM WIRE SURFACE-ALLOYED WITH BORON AND WITH AN ADDITIONALLY APPLIED ELECTROPLATED COATING OF CHROMIUM AND COPPER

    Directory of Open Access Journals (Sweden)

    V. A. Stefanovich

    2015-01-01

    Studies were carried out on the distribution of chromium and copper in fused coatings obtained from a wire surface-alloyed with boron and additionally electroplated with chromium and copper. The structure of the fused coating consists of dendrites with a boride eutectic located along their boundaries. It is established that the chromium content in the dendrites is 1.5-1.6 times lower than in the boride eutectic, while the copper distribution across the structure is uniform. Assimilation coefficients of chromium and copper for argon-arc welding from a wire electrode with an electroplated coating were determined: 0.9-1.0 for chromium and 0.6-0.75 for copper.

  6. High School Student Perceptions of the Utility of the Engineering Design Process: Creating Opportunities to Engage in Engineering Practices and Apply Math and Science Content

    Science.gov (United States)

    Berland, Leema; Steingut, Rebecca; Ko, Pat

    2014-12-01

    Research and policy documents increasingly advocate for incorporating engineering design into K-12 classrooms in order to accomplish two goals: (1) provide an opportunity to engage with science content in a motivating real-world context; and (2) introduce students to the field of engineering. The present study uses multiple qualitative data sources (i.e., interviews, artifact analysis) in order to examine the ways in which engaging in engineering design can support students in participating in engineering practices and applying math and science knowledge. This study suggests that students better understand and value those aspects of engineering design that are more qualitative (i.e., interviewing users, generating multiple possible solutions) than the more quantitative aspects of design which create opportunities for students to integrate traditional math and science content into their design work (i.e., modeling or systematically choosing between possible design solutions). Recommendations for curriculum design and implementation are discussed.

  7. Relationships between structure, process and outcome to assess quality of integrated chronic disease management in a rural South African setting: applying a structural equation model.

    Science.gov (United States)

    Ameh, Soter; Gómez-Olivé, Francesc Xavier; Kahn, Kathleen; Tollman, Stephen M; Klipstein-Grobusch, Kerstin

    2017-03-23

    South Africa faces a complex dual burden of chronic communicable and non-communicable diseases (NCDs). In response, the Integrated Chronic Disease Management (ICDM) model was initiated in primary health care (PHC) facilities in 2011 to leverage the HIV/ART programme to scale-up services for NCDs, achieve optimal patient health outcomes and improve the quality of medical care. However, little is known about the quality of care in the ICDM model. The objectives of this study were to: i) assess patients' and operational managers' satisfaction with the dimensions of ICDM services; and ii) evaluate the quality of care in the ICDM model using Avedis Donabedian's theory of relationships between structure (resources), process (clinical activities) and outcome (desired result of healthcare) constructs as a measure of quality of care. A cross-sectional study was conducted in 2013 in seven PHC facilities in the Bushbuckridge municipality of Mpumalanga Province, north-east South Africa - an area underpinned by a robust Health and Demographic Surveillance System (HDSS). The patient satisfaction questionnaire (PSQ-18), with measures reflecting structure/process/outcome (SPO) constructs, was adapted and administered to 435 chronic disease patients and the operational managers of all seven PHC facilities. The adapted questionnaire contained 17 dimensions of care, including eight dimensions identified as priority areas in the ICDM model - critical drugs, equipment, referral, defaulter tracing, prepacking of medicines, clinic appointments, waiting time, and coherence. A structural equation model was fit to operationalise Donabedian's theory, using unidirectional, mediation, and reciprocal pathways. The mediation pathway showed that the relationships between structure, process and outcome represented quality systems in the ICDM model. Structure correlated with process (0.40) and outcome (0.75). Given structure, process correlated with outcome (0.88). Of the 17 dimensions of care in

  8. COMPARATIVE ANALYSIS OF MECHANICAL CHARACTERISTICS OF THE STEELS USED FOR PRODUCTION OF CHIPPING KNIVES, OBTAINED BY THERMAL AND THERMOMECHANICAL PROCESSING

    Directory of Open Access Journals (Sweden)

    A. V. Alifanov

    2014-01-01

    The article presents results of studies of the chemical composition of chipping knives from foreign and domestic producers. Results of mechanical tests of samples of the various tool steels subjected to heat treatment (tempering) and to thermomechanical processing with low tempering are given, including determination of tensile strength, percentage elongation, ultimate strength in cross bending, and bending. Recommendations on the use of heat treatment and thermomechanical processing for the investigated steels are given.

  9. Applying meta-pathway analyses through metagenomics to identify the functional properties of the major bacterial communities of a single spontaneous cocoa bean fermentation process sample.

    Science.gov (United States)

    Illeghems, Koen; Weckx, Stefan; De Vuyst, Luc

    2015-09-01

    A high-resolution functional metagenomic analysis of a representative single sample of a Brazilian spontaneous cocoa bean fermentation process was carried out to gain insight into its bacterial community functioning. By reconstruction of microbial meta-pathways based on metagenomic data, the current knowledge about the metabolic capabilities of bacterial members involved in the cocoa bean fermentation ecosystem was extended. Functional meta-pathway analysis revealed the distribution of the metabolic pathways between the bacterial members involved. The metabolic capabilities of the lactic acid bacteria present were most associated with the heterolactic fermentation and citrate assimilation pathways. The role of Enterobacteriaceae in the conversion of substrates was shown through the use of the mixed-acid fermentation and methylglyoxal detoxification pathways. Furthermore, several other potential functional roles for Enterobacteriaceae were indicated, such as pectinolysis and citrate assimilation. Concerning acetic acid bacteria, metabolic pathways were partially reconstructed, in particular those related to responses toward stress, explaining their metabolic activities during cocoa bean fermentation processes. Further, the in-depth metagenomic analysis unveiled functionalities involved in bacterial competitiveness, such as the occurrence of CRISPRs and potential bacteriocin production. Finally, comparative analysis of the metagenomic data with bacterial genomes of cocoa bean fermentation isolates revealed the applicability of the selected strains as functional starter cultures. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Applying a process based erosion model to assess off-site effects of soil erosion from the regional scale to the measure level

    Science.gov (United States)

    Schindewolf, Marcus; Arevalo, Annika; Saathoff, Ulfert; Käpermann, Philipp; Schmidt, Jürgen

    2013-04-01

    Since soil erosion is one of the most important issues of global soil degradation, great effort has been put into the application of erosion models for the assessment and prevention of on-site damages. Besides the primary impact of soil loss in decreasing soil fertility, erosion can cause significant impacts if transported sediments enter downslope ecosystems, settlements, infrastructure or traffic routes. These off-site damages can be very costly, affect many people and contaminate water resources. The analysis of these problems is intensified by the requirements of new legislation, such as the EU Water Framework Directive (WFD), providing new challenges for planning authorities in order to combat off-site damage. Hence there is strong public and scientific interest in understanding the processes of sediment as well as particle-attached nutrient and pollutant transport. Predicting the frequency, magnitude and extent of off-site impacts of water erosion is a necessary precondition for adequate risk assessments and mitigation measures. Process-based models are increasingly used for the simulation of soil erosion. Regarding the requirements of the WFD, these models need to deliver comparable estimates from the regional scale to the level of mitigation measures. This study aims at the application of the process-based model EROSION 3D for off-site risk assessment on different scales for the German federal state of Saxony using available geodata, database applications and GIS routines. The following issues were investigated: - Where are the expected sediment deposition areas? - Which settlements, infrastructures and traffic routes are affected by sediment fluxes? - Which river sections are affected by sediment inputs? - Which river sections are affected by nutrient and heavy metal inputs? The model results identify the Saxon loess belt as highly endangered by off-site damages although hotspots can be found in the northern flatlands and the southern mountain range as

  11. A Dynamic study of Mantle processes applying In-situ Methods to Compound Xenoliths: implications for small to intermediate scale heterogeneity

    Science.gov (United States)

    Baziotis, Ioannis; Asimow, Paul; Koroneos, Antonios; Ntaflos, Theodoros; Poli, Giampero

    2013-04-01

    The mantle is the major geochemical reservoir of most rock-forming elements in the Earth. Convection and plate-tectonic driven processes act to generate local and regional heterogeneity within the mantle, which in turn through thermal and chemical interactions modulates ongoing geophysical processes; this feedback shapes the dynamics of the deep interior. Consequently, these processes contribute to the evolution of the Earth throughout its geological history. Up to now, the heterogeneity of the mantle has been extensively studied in terms of conventional methods using basalt chemistry, bulk rock and mineral major and trace element analysis of isolated xenolith specimens of varying lithology, and massif exposures. The milestone of the present study, part of an ongoing research project, is the application of in-situ analytical methods such as microprobe, LA-ICP-MS and high resolution SEM in order to provide high quality major and trace element analyses as well as elemental distribution of the coexisting phases in the preserved intra-mantle lithologies. In particular, in the context of the current study we used selected compound xenoliths from San Carlos (Arizona, USA), Kilbourne Hole (New Mexico, USA), Cima Dome and Dish Hill suites (California, USA), San Quintin (Baja California, Mexico) and Chino Valley (Arizona, USA), from the Howard Wilshire collection archived at the Smithsonian Institution. The selection of these compound xenoliths was based upon freshness and integrity of specimens, maximum distance on both sides of lithologic contacts, and rock types thought most likely to represent subsolidus juxtaposition of different lithologies that later partially melted in contact. The San Carlos samples comprise composite xenoliths with websterite, lherzolite and clinopyroxenite layers or clinopyroxenite veins surrounded by lherzolite or orthopyroxenite-rich rims. The Kilbourne Hole suite comprises spinel-(olivine) clinopyroxenite and orthopyroxenite dikes cutting

  12. Tank waste processing analysis: Database development, tank-by-tank processing requirements, and examples of pretreatment sequences and schedules as applied to Hanford Double-Shell Tank Supernatant Waste - FY 1993

    International Nuclear Information System (INIS)

    Colton, N.G.; Orth, R.J.; Aitken, E.A.

    1994-09-01

    This report gives the results of work conducted in FY 1993 by the Tank Waste Processing Analysis Task for the Underground Storage Tank Integrated Demonstration. The main purpose of this task, led by Pacific Northwest Laboratory, is to demonstrate a methodology to identify processing sequences, i.e., the order in which a tank should be processed. In turn, these sequences may be used to assist in the development of time-phased deployment schedules. Time-phased deployment is implementation of pretreatment technologies over a period of time as technologies are required and/or developed. The work discussed here illustrates how tank-by-tank databases and processing requirements have been used to generate processing sequences and time-phased deployment schedules. The processing sequences take into account requirements such as the amount and types of data available for the tanks, tank waste form and composition, required decontamination factors, the types of compact processing units (CPUs) required, and technology availability. These sequences were developed from processing requirements for the tanks, which were determined from spreadsheet analyses. The spreadsheet analysis program was generated by this task in FY 1993. Efforts conducted for this task have focused on the processing requirements for Hanford double-shell tank (DST) supernatant wastes (pumpable liquid) because this waste type is easier to retrieve than the other types (saltcake and sludge), and more tank space would become available for future processing needs. The processing requirements were based on Class A criteria set by the U.S. Nuclear Regulatory Commission and Clean Option goals provided by Pacific Northwest Laboratory

  13. O uso da estatística de qui-quadrado no controle de processos The noncentral chi square statistic applied to process control

    Directory of Open Access Journals (Sweden)

    Antônio Fernando Branco Costa

    2005-08-01

    It is standard practice to use joint charts in process control, one designed to detect shifts in the mean and the other to detect changes in the variance of the process. In this paper, we propose the use of a single chart to control both the mean and the variance. Based on the noncentral chi-square statistic, the single chart is faster in detecting shifts in the mean and increases in variance than its competitor, the joint "Xbar" and R charts. The noncentral chi-square statistic can also be used with the EWMA procedure, particularly for the detection of small mean shifts, accompanied or not by slight increases in variance.
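    A hedged sketch of a single-chart scheme in this spirit follows (constants and data are illustrative, not the paper's design): with samples of size n from an in-control N(mu0, sigma0^2) process, the statistic T = sum_i (x_i - mu0 + xi*sigma0)^2 is, in control, sigma0^2 times a noncentral chi-square variable with n degrees of freedom and noncentrality n*xi^2, so a single upper limit reacts to mean shifts and to variance increases.

```python
# Single-chart monitoring with a noncentral chi-square control limit (illustrative).

import numpy as np
from scipy.stats import ncx2

mu0, sigma0, n, xi, alpha = 10.0, 1.0, 5, 1.0, 0.005

# upper control limit from the in-control distribution of T
ucl = sigma0**2 * ncx2.ppf(1 - alpha, df=n, nc=n * xi**2)

def t_stat(sample):
    return np.sum((sample - mu0 + xi * sigma0) ** 2)

rng = np.random.default_rng(1)
in_control = rng.normal(mu0, sigma0, size=(20, n))               # stable process
shifted = rng.normal(mu0 + 1.0, 1.3 * sigma0, size=(20, n))      # mean and variance shift

print("false alarms:", int(np.sum([t_stat(s) > ucl for s in in_control])))
print("detections  :", int(np.sum([t_stat(s) > ucl for s in shifted])))
```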

  14. DOE applied to study the effect of process parameters on silicon spacing in lost foam Al-Si-Cu alloy casting

    International Nuclear Information System (INIS)

    Shayganpour, A; Izman, S; Idris, M H; Jafari, H

    2012-01-01

    Lost foam casting, a relatively new manufacturing process, is extensively employed to produce sound, complicated castings. In this study, an experimental investigation of lost foam casting of an Al-Si-Cu aluminium cast alloy was conducted. The research was aimed at evaluating the effect of different pouring temperatures, slurry viscosities, vibration durations and sand grain sizes on the eutectic silicon spacing of thin-wall castings. A stepped pattern was used in the study and the investigations focused on the thinnest, 3 mm section. A full two-level factorial design was used to plan the experiments and afterwards identify the significant factors affecting silicon spacing in the casting. The results showed that pouring temperature and its interaction with vibration time have a pronounced effect on the eutectic silicon phase size. Increasing the pouring temperature coarsened the eutectic silicon spacing, while longer vibration time diminished this coarsening effect. Moreover, no significant effects on silicon spacing were found with variation of sand size and slurry viscosity.
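    The effect estimates behind such a two-level full factorial analysis reduce to differences of mean responses between the +1 and -1 levels; a minimal sketch with simulated (not measured) silicon-spacing responses follows.

```python
# Main effects and one interaction from a 2^4 full factorial design.
# The response values are simulated for illustration, not the study's measurements.

import itertools
import numpy as np

factors = ["pour_temp", "viscosity", "vibration", "sand_size"]
design = np.array(list(itertools.product([-1, 1], repeat=4)))   # 16 coded runs

rng = np.random.default_rng(0)
# toy response (micrometres): temperature coarsens spacing, vibration counteracts it
y = (10.0 + 2.0 * design[:, 0] - 1.2 * design[:, 0] * design[:, 2]
     + rng.normal(0.0, 0.3, len(design)))

def effect(column):
    """Difference of mean responses between the +1 and -1 levels of a coded column."""
    return y[column == 1].mean() - y[column == -1].mean()

for name, col in zip(factors, design.T):
    print(f"main effect {name:10s}: {effect(col):+.2f}")
print(f"interaction pour_temp x vibration: {effect(design[:, 0] * design[:, 2]):+.2f}")
```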

  15. Optimization of process conditions for the preparation of amine-impregnated activated carbon developed for CO2 capture and applied to methylene blue adsorption by response surface methodology.

    Science.gov (United States)

    Das, Dipa; Meikap, Bhim C

    2017-10-15

    The present research describes the optimal adsorption conditions for methylene blue (MB). The adsorbent used here was monoethanolamine-impregnated activated carbon (MEA-AC) prepared from green coconut shell. Response surface methodology (RSM) is the multivariate statistical technique used for the optimization of the process variables. A central composite design is used to determine the effect of activation temperature, activation time and impregnation ratio on MB removal. The percentage (%) MB adsorption by MEA-AC is evaluated as the response of the system. A quadratic model was developed for the response. From the analysis of variance, the factor most influential on the response was identified. The optimum conditions for the preparation of MEA-AC from green coconut shells are an activation temperature of 545.6°C, an activation time of 41.64 min and an impregnation ratio of 0.33, achieving a maximum removal efficiency of 98.21%. Under the same optimum conditions, the % MB removal from textile-industry effluent was examined and found to be 96.44%.
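    As a hedged illustration of the RSM step, reduced to two coded factors for brevity and using simulated responses rather than the study's data, the sketch below builds a face-centred central composite design, fits a quadratic model by least squares and locates its stationary point.

```python
# Face-centred CCD, quadratic response-surface fit and stationary point (illustrative).

import numpy as np

# factorial, axial and centre points in coded units (two factors)
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
              [-1, 0], [1, 0], [0, -1], [0, 1],
              [0, 0], [0, 0], [0, 0]], dtype=float)

rng = np.random.default_rng(3)
y = (95 - 4 * X[:, 0]**2 - 3 * X[:, 1]**2 + 1.5 * X[:, 0] * X[:, 1]
     + rng.normal(0, 0.5, len(X)))                 # simulated % removal

# quadratic model: b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                     X[:, 0]**2, X[:, 1]**2, X[:, 0] * X[:, 1]])
b = np.linalg.lstsq(A, y, rcond=None)[0]

# stationary point: solve grad = 0 for the fitted quadratic
H = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
x_opt = np.linalg.solve(H, -b[1:3])
pred = (b[0] + b[1] * x_opt[0] + b[2] * x_opt[1]
        + b[3] * x_opt[0]**2 + b[4] * x_opt[1]**2 + b[5] * x_opt[0] * x_opt[1])
print("coded optimum:", np.round(x_opt, 3), " predicted removal: %.2f" % pred)
```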

  16. Study of the electrolyte process efficiency applied to the oil industry wastewaters; Estudo da eficiencia do processo eletrolitico aplicado em aguas de producao da industria do petroleo

    Energy Technology Data Exchange (ETDEWEB)

    Brasileiro, Ilza M.N.; Cavalcanti, Eliane B. [Universidade Federal de Campina Grande (UFCG), PB (Brazil); Vilar, Eudesio O. [Universidade Federal de Campina Grande (UFCG), PB (Brazil). Dept. de Engenharia Quimica; Tonholo, Josealdo [Universidade Federal de Alagoas (UFAL), Maceio, AL (Brazil). Dept. de Quimica

    2004-07-01

    The present work presents results obtained from the oxidation of ammoniacal nitrogen and sulphides using a bench-scale electrochemical cell consisting of 6 pairs of stainless steel and DSA{sup R} electrodes. A factorial 2{sup 4} experimental design plus three repetitions of the central point was carried out, in which the following factors were studied: salinity (mg/L), electric potential (volts), flowrate (l/h) and electrolysis time (min). As indicators of the efficiency of the electrochemical treatment, the following response parameters were analyzed under different operational conditions: percentage removal of sulfate and percentage removal of ammoniacal nitrogen. The removal of ammoniacal nitrogen was of the order of 100%, with an energy consumption of 8 kWh/m{sup 3} for a flowrate of 600 l/h and 15,000 mg/L of chlorides. Moreover, the work addresses the study of scaling on the electrodes, which considerably diminishes the yield of the electrochemical process. To reduce this problem, a procedure of electrolysis controlled by a timer is being studied. To evaluate the scaling level, physical-chemical analyses of alkalinity and hardness as CaCO{sub 3}, scaling index by chronoamperometry, and scanning electron microscopy were carried out. (author)

  17. Study of the coefficient of separation for some processes which are applied to lithium isotopes; Etude du coefficient de separation de quelques processus concernant les isotopes du lithium

    Energy Technology Data Exchange (ETDEWEB)

    Perret, L.; Rozand, L.; Saito, E. [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1958-07-01

    The fundamental separation factors of some processes are investigated: the distillation of metallic lithium, counter-current electromigration in fused salts (particularly in lithium nitrate), electrolysis in aqueous solution and ion exchange. The chemical exchange between a lithium amalgam and lithium salts in a dimethylformamide solution (a solvent which is not attacked by the amalgam) is also studied. Finally, a description is given of the isotopic analyses, carried out either by scintillation counting or by mass spectrometry using an apparatus specially designed for this particular task. (author)

  18. Reduction of emission when applying thermal separation processes in the dismantling of nuclear facilities - oxy-fuel gas and plasma arc cutting

    International Nuclear Information System (INIS)

    Stoiber, H.; Hammer, G.; Schultz, H.

    1995-01-01

    Plasma arc cutting and laser beam cutting were used for the studies, with the goal of significantly reducing material emission by changing the operating and equipment parameters. Some cuts made with the oxy-fuel gas cutting process served as a guide for determining which factors can most effectively reduce emission. The cutting experiments were carried out with specimens of R-St 37-2, 10 mm thick, as well as of X 6 CrNi 18 10 steel 5, 10, 15 and 20 mm thick. In all cases, lowering the cutting speed and the amount of gas proved at first to be effective measures to check material emission. It was also possible to achieve adherence of molten mass and slag on the flank of the joint with excessive icicling. When plasma cutting the CrNi steel, emission can be reduced further by using an argon/hydrogen mixture instead of nitrogen as the cutting gas. (orig./DG) [de

  19. DSS-13 - Using an OSI process control standard for monitor and control. [Deep Space Network experimental station applying Open System Interconnection]

    Science.gov (United States)

    Heuser, W. R.; Chen, Richard L.; Stockett, Michael H.

    1993-01-01

    The flexibility and robustness of a monitor and control (M&C) system are a direct result of the underlying inter-processor communications architecture. A new architecture for M&C at the Deep Space Communications Complexes has been developed based on the manufacturing message specification (MMS) process control standard of the open system interconnection (OSI) suite of protocols. This architecture has been tested both in a laboratory environment and under operational conditions at the Deep Space Network experimental station (DSS-13). The DSS-13 experience in the application of OSI standards to support M&C has been extremely successful. MMS meets the functional needs of the station and provides a level of flexibility and responsiveness previously unknown in that environment. The architecture is robust enough to meet current operational needs and flexible enough to provide a migration path for new subsystems. This paper describes the architecture of the DSS-13 M&C system, discusses how MMS was used and the requirements this imposed on other parts of the system, and provides results from systems and operational testing at DSS-13.

  20. APPLIED GEOSPATIAL EDUCATION: ACQUISITION AND PROCESSING OF HIGH RESOLUTION AIRBORNE LIDAR AND ORTHOIMAGES FOR THE GREAT SMOKY MOUNTAINS NATIONAL PARK, SOUTHEASTERN UNITED STATES

    Directory of Open Access Journals (Sweden)

    T. R. Jordan

    2012-07-01

    In an innovative collaboration between government, university and private industry, researchers at the University of Georgia and Gainesville State College are collaborating with Photo Science, Inc. to acquire, process and quality-control check lidar and orthoimages of forest areas in the Southern Appalachian Mountains of the United States. Funded by the U.S. Geological Survey, this project meets the objectives of the ARRA initiative by creating jobs, preserving jobs and training students for high-skill positions in geospatial technology. Leaf-off lidar data were acquired at 1-m resolution over the Tennessee portion of the Great Smoky Mountains National Park (GRSM) and the adjacent Foothills Parkway. This 1400-sq. km area is of high priority for national/global interests due to biodiversity, rare and endangered species and protection of some of the last remaining virgin forest in the U.S. High spatial resolution (30 cm) leaf-off 4-band multispectral orthoimages also were acquired for both the Chattahoochee National Forest in north Georgia and the entire GRSM. The data are intended to augment the National Elevation Dataset and orthoimage database of The National Map with information that can be used by many researchers in applications of LiDAR point clouds, high resolution DEMs and orthoimage mosaics. Graduate and undergraduate students were involved at every stage of the workflow in order to provide them with high-level technical, educational and professional experience in preparation for entering the geospatial workforce. This paper will present geospatial workflow strategies, multi-team coordination, distance-learning training and the industry-academia partnership.

  1. Impact of urban effluents on summer hypoxia in the highly turbid Gironde Estuary, applying a 3D model coupling hydrodynamics, sediment transport and biogeochemical processes

    Science.gov (United States)

    Lajaunie-Salla, Katixa; Wild-Allen, Karen; Sottolichio, Aldo; Thouvenin, Bénédicte; Litrico, Xavier; Abril, Gwenaël

    2017-10-01

    Estuaries are increasingly degraded due to coastal urban development and are prone to hypoxia problems. The macro-tidal Gironde Estuary is characterized by a highly concentrated turbidity maximum zone (TMZ). Field observations show that hypoxia occurs in summer in the TMZ at low river flow and a few days after the spring tide peak. In situ data highlight lower dissolved oxygen (DO) concentrations around the city of Bordeaux, located in the upper estuary. Interactions between multiple factors limit the understanding of the processes controlling the dynamics of hypoxia. A 3D biogeochemical model was developed, coupled with hydrodynamics and a sediment transport model, to assess the contribution of the TMZ and the impact of urban effluents through wastewater treatment plants (WWTPs) and sewage overflows (SOs) on hypoxia. Our model describes the transport of solutes and suspended material and the biogeochemical mechanisms impacting oxygen: primary production, degradation of all organic matter (i.e. including phytoplankton respiration, degradation of river and urban watershed matter), nitrification and gas exchange. The composition and the degradation rates of each variable were characterized by in situ measurements and experimental data from the study area. The DO model was validated against observations in Bordeaux City. The simulated DO concentrations show good agreement with field observations and satisfactorily reproduce the seasonal and neap-spring time scale variations around the city of Bordeaux. Simulations show a spatial and temporal correlation between the formation of summer hypoxia and the location of the TMZ, with minimum DO centered in the vicinity of Bordeaux. To understand the contribution of the urban watershed forcing, different simulations with the presence or absence of urban effluents were compared. Our results show that in summer, a reduction of POC from SO would increase the DO minimum in the vicinity of Bordeaux by 3% of saturation. Omitting
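    The oxygen source and sink terms listed above can be illustrated with a zero-dimensional budget (a sketch only, not the 3-D coupled model; rates, constants and units are assumed for illustration): gas exchange pushes dissolved oxygen toward saturation while organic-matter degradation and nitrification draw it down, with weak primary production in the turbid TMZ.

```python
# Zero-dimensional dissolved-oxygen budget; all rates are illustrative.

do_sat = 240.0     # saturation DO, mmol O2 m-3 (rough summer estuarine value)
k_gas = 0.5        # reaeration rate, d-1
prod = 5.0         # gross primary production, mmol O2 m-3 d-1 (low in the turbid TMZ)
resp_om = 80.0     # organic-matter degradation, mmol O2 m-3 d-1
resp_nit = 10.0    # nitrification, mmol O2 m-3 d-1

dt, days = 0.05, 20.0
do = do_sat
for _ in range(int(days / dt)):
    ddo = k_gas * (do_sat - do) + prod - resp_om - resp_nit
    do = max(do + dt * ddo, 0.0)

print(f"quasi-steady DO = {do:.0f} mmol m-3 ({100 * do / do_sat:.0f}% saturation)")
```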

  2. The characteristics of unusual OBS data exposed to strong shaking and the influence of applying these data to EEW processing: examples of Off-Kushiro OBS, JAMSTEC

    Science.gov (United States)

    Hayashimoto, N.; Nakamura, T.; Hoshiba, M.

    2014-12-01

    Online cable-type ocean bottom seismographs (OBSs) are expected to be useful for making Earthquake Early Warning (EEW) earlier. However, careful handling of these data is required because the installation environment of OBSs may differ from that of land stations. The stability of OBS data exposed to strong shaking is one such problem. For instance, Yamamoto et al. (2004) pointed out that the attitude of one of the Off-Kushiro OBSs (JAMSTEC) changed by about 5 degrees under strong ground motion during the 2003 Tokachi-oki earthquake of Mjma 8.0. The inclination of an OBS causes a baseline offset in the acceleration waveform through the gravitational acceleration component. Furthermore, the coupling between the OBS and the ocean floor is expected to become weak. Since EEW processing runs in real time, it is difficult to detect abnormal data appropriately. In this study, we investigate the characteristics of unusual OBS data exposed to strong motion at the Off-Kushiro OBSs. First, we estimate the amount of acceleration offset caused by rotation of the cable. The acceleration offset due to slight inclination of the OBS increases with input acceleration, and it is found that the offset is larger on the horizontal component perpendicular to the cable line (Y') than on the component along the cable line (X'). Second, we compare the S-wave H/V spectral ratio for strong ground motion with that for weak ground motion to investigate nonlinear response. We find that the S-wave H/V ratio for strong motion at the OBS has features of nonlinear response typical of land stations: a shift of the dominant peak to lower frequency and attenuation at high frequency. Finally, we discuss the influence of these unusual OBS data on EEW magnitude. We conclude that acceleration offsets resulting from inclination of the OBS could cause overestimation of magnitude. Acknowledgment: Strong motion acceleration waveform data of Off-Kushiro OBS
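    The baseline-offset mechanism described above can be checked with a back-of-envelope calculation: a tilt of the sensor by an angle theta projects g*sin(theta) onto the horizontal components and reduces the vertical component by g*(1-cos(theta)); the tilt values in the sketch are illustrative.

```python
# Apparent acceleration offsets caused by sensor tilt (illustrative angles).

import math

g = 9.81  # m/s^2

for theta_deg in (0.5, 1.0, 2.0, 5.0):
    theta = math.radians(theta_deg)
    horiz_offset = g * math.sin(theta)          # projected onto a horizontal component
    vert_change = g * (1.0 - math.cos(theta))   # reduction on the vertical component
    # 1 gal = 0.01 m/s^2
    print(f"tilt {theta_deg:3.1f} deg -> horizontal offset {100 * horiz_offset:6.1f} gal, "
          f"vertical change {100 * vert_change:5.2f} gal")
```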

  3. A method to assist in the diagnosis of early diabetic retinopathy: Image processing applied to detection of microaneurysms in fundus images.

    Science.gov (United States)

    Rosas-Romero, Roberto; Martínez-Carballido, Jorge; Hernández-Capistrán, Jonathan; Uribe-Valencia, Laura J

    2015-09-01

    Diabetes increases the risk of developing deterioration of the blood vessels that supply the retina, an ailment known as diabetic retinopathy (DR). Since this disease is asymptomatic, it can only be diagnosed by an ophthalmologist. However, the number of ophthalmologists is growing more slowly than the population with diabetes, so that preventive and early diagnosis is difficult due to the lack of opportunity in terms of time and cost. Preliminary, affordable and accessible ophthalmological diagnosis would give the opportunity to perform routine preventive examinations, indicating the need to consult an ophthalmologist during the non-proliferative stage. During this stage, there is a lesion on the retina known as a microaneurysm (MA), which is one of the first clinically observable lesions that indicate the disease. In recent years, different image processing algorithms that allow the detection of DR have been developed; however, the issue is still open since acceptable levels of sensitivity and specificity have not yet been reached, preventing their use as a pre-diagnostic tool. Consequently, this work proposes a new approach for MA detection based on (1) reduction of non-uniform illumination; (2) normalization of image grayscale content to reduce the dependence on images acquired in different contexts; (3) application of the bottom-hat transform to leave reddish regions intact while suppressing bright objects; (4) binarization of the image of interest with the result that objects corresponding to MAs, blood vessels, and other reddish objects (Regions of Interest, ROIs) are completely separated from the background; (5) application of the hit-or-miss transformation on the binary image to remove blood vessels from the ROIs; (6) two features are extracted from a candidate to distinguish real MAs from FPs, where one feature discriminates round-shaped candidates (MAs) from elongated ones (vessels) through application of Principal Component Analysis (PCA
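    A rough sketch of steps (3)-(5) of this pipeline, using scikit-image on the green channel of a fundus image, is given below; a connected-component shape filter stands in for the paper's hit-or-miss vessel-removal step, the PCA-based candidate classification is omitted, and the file name, structuring-element size and thresholds are assumptions.

```python
# Bottom-hat based microaneurysm candidate detection (illustrative parameters,
# assumes a recent scikit-image where morphology functions take `footprint`).

import numpy as np
from skimage import io, morphology, filters, measure

rgb = io.imread("fundus.png")                  # hypothetical input image
green = rgb[..., 1].astype(float) / 255.0      # MAs appear darkest in the green channel

# (3) bottom-hat: keep small dark (reddish) structures, suppress bright objects
bh = morphology.black_tophat(green, footprint=morphology.disk(8))

# (4) binarise candidate regions
mask = bh > filters.threshold_otsu(bh)

# (5) drop elongated components (vessels); keep small, roughly round candidates (MAs)
labels = measure.label(mask)
candidates = np.zeros_like(mask)
for region in measure.regionprops(labels):
    if region.area < 120 and region.eccentricity < 0.8:
        candidates[labels == region.label] = True

print("microaneurysm candidates:", int(measure.label(candidates).max()))
```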

  4. Paper spray mass spectrometry applied in the monitoring of a chemical system in dynamic chemical equilibrium: the redox process of methylene blue.

    Science.gov (United States)

    de Paula, Camila Cristina Almeida; Valadares, Alberto; Jurisch, Marina; Piccin, Evandro; Augusti, Rodinei

    2016-05-15

    The monitoring of chemical systems in dynamic equilibrium is not an easy task. This is due to the high rate at which the system returns to equilibrium after being perturbed, which hampers the possibility of following the aftereffects of the disturbance. In this context, it is necessary to use a fast analytical technique that requires no (or minimal) sample preparation, and which is capable of monitoring the species constituting the system in equilibrium. Paper spray ionization mass spectrometry (PS-MS), a recently introduced ambient ionization technique, has such characteristics and hence was chosen for monitoring a model system: the redox process of methylene blue. The model system evaluated herein was composed of three cationic species of methylene blue (MB), which coexist in a dynamic redox system: (1) [MB](+) of m/z 284 (cationic MB); (2) [MB + H + e](+•) of m/z 285 (the protonated form of a transient species resulting from the reduction of [MB](+) ); (3) [MB + 2H + 2e](+) or [leuco-MB + H](+) of m/z 286 (the protonated leuco form of MB). Aliquots of an MB solution were collected before and after the addition of a reducing agent (metallic zinc) and directly analyzed by PS-MS for identification of the predominant cationic species under different conditions. The mass spectra revealed that before the addition of the reducing agent the ion of m/z 284 (cationic MB) is the unique species. Upon the addition of the reducing agent and acid, however, the solution continuously undergoes discoloration while reduced species derived directly from cationic MB (m/z 285 and 286) are detected in the mass spectra with increasing intensities. Fragmentation patterns obtained for each ionic species, i.e. [MB](+) , [MB + H + e](+•) and [leuco-MB + H](+) , were shown to be consistent with the proposed structures. The PS-MS technique proved to be suitable for an in situ and 'near' real-time analysis of the dynamic equilibrium involving the redox of MB in aqueous medium. The data clearly

  5. APPLIED ORGANIZATION OF CONSTRUCTION

    Directory of Open Access Journals (Sweden)

    Kievskiy Leonid Vladimirovich

    2017-03-01

    Full Text Available Applied disciplines in the sphere of construction that address vital macroeconomic problems are considered (the general trend in the development of these disciplines is the expansion of their problem range and their mutual integration). A characterization of construction organization at the present stage, as a systems engineering discipline covering the investment process of creating real estate items, is given. The main source of current research topics for these applied sciences (socio-economic development forecasts, regional and local programs) is identified. The interpenetration and integration of various fields of knowledge is demonstrated using the current inter-industry problem of organizing the renovation of blocks of existing development. A mathematical model of wave construction (for the deployment period) is proposed. The nature of the dependence of the total renovation duration on the annual input limit and the renovation coefficient is established. The overall structure of the Moscow region housing market is presented, and approaches to defining effective demand are proposed.

  6. Post-factum detection of radiation treatment in processed food by analysis of radiation-induced hydrocarbons. Pt. 1. Applying the method L 06.00-37 defined in Para. 35 LMBG (German Act on Food Irradiation) to processed food

    International Nuclear Information System (INIS)

    Hartmann, M.; Ammon, J.; Berg, H.

    1995-01-01

    The German official method L 06.00-37 (Para. 35 of the German Act on Food Irradiation) is used for the identification of irradiated fat-containing food by GC analysis of radiation-induced hydrocarbons. Simple modifications in sample preparation allow a distinct improvement in both detection capability and detection limits. The applicability of the modified method for the detection of irradiated ingredients in model processed food is shown. Identification of an irradiated ingredient (1.5 kGy) accounting for only 3% of the fat (irradiated fat to total fat ratio) in processed food was possible. Additionally, the kind of irradiated ingredient could be identified from the pattern of radiation-induced hydrocarbons, whose concentrations correspond to the fatty acid composition of the irradiated component. (orig.) [de

  7. Generalized reciprocal method applied in processing seismic ...

    African Journals Online (AJOL)

    A geophysical investigation was carried out at Shika, near Zaria, using seismic refraction method; with the aim of analyzing the data obtained using the generalized reciprocal method (GRM). The technique is for delineating undulating refractors at any depth from in-line seismic refraction data consisting of forward and ...

  8. Digital image processing applied Rock Art tracing

    Directory of Open Access Journals (Sweden)

    Montero Ruiz, Ignacio

    1998-06-01

    Full Text Available Adequate graphic recording has been one of the main objectives of rock art research. Photography has increased its role as a documentary technique. Now, digital images and their treatment allow new ways to observe the details of the figures and to develop a recording procedure that is as accurate as, or more accurate than, direct tracing. This technique also avoids deterioration of the rock paintings. The mathematical basis of the method is also presented.

    La correcta documentación del arte rupestre ha sido una preocupación constante por parte de los investigadores. En el desarrollo de nuevas técnicas de registro, directas e indirectas, la fotografía ha ido adquiriendo mayor protagonismo. La imagen digital y su tratamiento permiten nuevas posibilidades de observación de las figuras representadas y, en consecuencia, una lectura mediante la realización de calcos indirectos de tanta o mayor fiabilidad que la observación directa. Este sistema evita los riesgos de deterioro que provocan los calcos directos. Se incluyen las bases matemáticas que sustentan el método.

  9. Emotional recognition applying speech signal processing

    OpenAIRE

    Morales Pérez, Mauricio; Echeverry Correa, Julián David; Orozco Gutiérrez, Alvaro Angel

    2007-01-01

    This work presents a methodology for characterizing the speech signal, applied to the recognition of emotional states. The different emotional states of a speaker produce physiological changes in the vocal apparatus, which are reflected in the variation of several parameters of the voice. The processing techniques employed are time-frequency transforms, linear prediction analysis and raw data.
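
    As a rough illustration of the kind of time-frequency, frame-level feature extraction named above (not the authors' exact feature set), a short sketch using SciPy; the WAV file name, the assumption of a mono recording and the frame parameters are assumptions.

```python
# Illustrative frame-level feature extraction from a speech signal (not the authors' exact features).
import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram

rate, signal = wavfile.read("utterance.wav")        # hypothetical mono recording
signal = signal.astype(np.float64)
signal /= (np.max(np.abs(signal)) + 1e-12)          # amplitude normalization

# Time-frequency representation (spectrogram), 25 ms windows with 10 ms hop
freqs, times, S = spectrogram(signal, fs=rate,
                              nperseg=int(0.025 * rate),
                              noverlap=int(0.015 * rate))

# Simple per-frame descriptors often used in emotion recognition
energy = S.sum(axis=0)                                                  # frame energy
centroid = (freqs[:, None] * S).sum(axis=0) / (S.sum(axis=0) + 1e-12)   # spectral centroid

features = np.vstack([energy, centroid]).T          # one row of features per frame
print(features.shape)
```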

  10. SIFT applied to CBIR

    Directory of Open Access Journals (Sweden)

    ALMEIDA, J.

    2009-12-01

    Full Text Available Content-Based Image Retrieval (CBIR) is a challenging task. Common approaches use only low-level features. Notwithstanding, such CBIR solutions fail to capture some local features representing the details and nuances of scenes. Many techniques in image processing and computer vision can capture these scene semantics. Among them, the Scale Invariant Feature Transform (SIFT) has been widely used in many applications. This approach relies on the choice of several parameters which directly impact its effectiveness when applied to retrieve images. In this paper, we discuss the results obtained in several experiments proposed to evaluate the application of SIFT in CBIR tasks.
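
    A minimal sketch of SIFT description and matching with OpenCV's SIFT_create API; the image file names and the ratio-test threshold are illustrative, and a real CBIR system would also need an index over the whole image collection.

```python
# Minimal SIFT extraction and matching sketch for image retrieval experiments.
import cv2

query = cv2.imread("query.jpg", cv2.IMREAD_GRAYSCALE)       # hypothetical images
candidate = cv2.imread("candidate.jpg", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create(nfeatures=500)                       # parameter choices affect retrieval quality
kp_q, des_q = sift.detectAndCompute(query, None)
kp_c, des_c = sift.detectAndCompute(candidate, None)

# Brute-force matching with Lowe's ratio test to keep distinctive matches only
matcher = cv2.BFMatcher(cv2.NORM_L2)
good = [m for m, n in matcher.knnMatch(des_q, des_c, k=2) if m.distance < 0.75 * n.distance]

# A simple (and crude) relevance score: number of surviving matches
print(f"{len(good)} good matches between query and candidate")
```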

  11. Applying normalization process theory to understand implementation of a family violence screening and care model in maternal and child health nursing practice: a mixed method process evaluation of a randomised controlled trial.

    Science.gov (United States)

    Hooker, Leesa; Small, Rhonda; Humphreys, Cathy; Hegarty, Kelsey; Taft, Angela

    2015-03-28

    In Victoria, Australia, Maternal and Child Health (MCH) services deliver primary health care to families with children 0-6 years, focusing on health promotion, parenting support and early intervention. Family violence (FV) has been identified as a major public health concern, with increased prevalence in the child-bearing years. Victorian Government policy recommends routine FV screening of all women attending MCH services. Using Normalization Process Theory (NPT), we aimed to understand the barriers and facilitators of implementing an enhanced screening model into MCH nurse clinical practice. NPT informed the process evaluation of a pragmatic, cluster randomised controlled trial in eight MCH nurse teams in metropolitan Melbourne, Victoria, Australia. Using mixed methods (surveys and interviews), we explored the views of MCH nurses, MCH nurse team leaders, FV liaison workers and FV managers on implementation of the model. Quantitative data were analysed by comparing proportionate group differences and change within trial arm over time between interim and impact nurse surveys. Qualitative data were inductively coded, thematically analysed and mapped to NPT constructs (coherence, cognitive participation, collective action and reflexive monitoring) to enhance our understanding of the outcome evaluation. MCH nurse participation rates for interim and impact surveys were 79% (127/160) and 71% (114/160), respectively. Twenty-three key stakeholder interviews were completed. FV screening work was meaningful and valued by participants; however, the implementation coincided with a significant (government directed) change in clinical practice which impacted on full engagement with the model (coherence and cognitive participation). The use of MCH nurse-designed FV screening/management tools in focussed women's health consultations and links with FV services enhanced the participants' work (collective action). Monitoring of FV work (reflexive monitoring) was limited. The use of
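
    For the quantitative arm described above, comparing proportions between groups can be done with a standard two-proportion z-test; a sketch with statsmodels, using the reported survey participation counts (127/160 and 114/160) purely as example numbers rather than reproducing the trial's actual outcome comparisons.

```python
# Two-proportion z-test sketch for comparing group proportions (illustrative numbers only).
from statsmodels.stats.proportion import proportions_ztest

# Example: interim vs. impact survey participation, 127/160 vs. 114/160 (figures quoted in the abstract)
counts = [127, 114]
nobs = [160, 160]

z_stat, p_value = proportions_ztest(counts, nobs)
print(f"z = {z_stat:.2f}, p = {p_value:.3f}")
```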

  12. Applied ALARA techniques

    International Nuclear Information System (INIS)

    Waggoner, L.O.

    1998-01-01

    The presentation focuses on some of the time-proven and new technologies being used to accomplish radiological work. These techniques can be applied at nuclear facilities to reduce radiation doses and protect the environment. The last reactor plants and processing facilities were shut down and Hanford was given a new mission to put the facilities in a safe condition, decontaminate them, and prepare them for decommissioning. The skills that were necessary to operate these facilities were different from the skills needed today to clean up Hanford. Workers were not familiar with many of the tools, equipment, and materials needed to accomplish the new mission, which includes cleanup of contaminated areas in and around all the facilities, recovery of reactor fuel from spent fuel pools, and the removal of millions of gallons of highly radioactive waste from 177 underground tanks. In addition, this work has to be done with a reduced number of workers and a smaller budget. At Hanford, facilities contain a myriad of radioactive isotopes located inside plant systems, underground tanks, and the soil. As cleanup work at Hanford began, it became obvious early on that in order to get workers to apply ALARA and use new tools and equipment to accomplish the radiological work, it was necessary to plan the work in advance and get radiological control and/or ALARA committee personnel involved early in the planning process. Emphasis was placed on applying ALARA techniques to reduce dose, limit contamination spread and minimize the amount of radioactive waste generated. Progress on the cleanup has been steady, and Hanford workers have learned to use different types of engineered controls and ALARA techniques to perform radiological work. The purpose of this presentation is to share the lessons learned on how Hanford is accomplishing radiological work

  13. Revisión de los principales modelos para aplicar técnicas de Minería de Procesos (Review of models for applying process mining techniques)

    Directory of Open Access Journals (Sweden)

    Arturo Orellana García

    2016-03-01

    Full Text Available Spanish abstract La minería de procesos constituye una alternativa novedosa para mejorar los procesos en una variedad de dominios de aplicación. Tiene como objetivo extraer información a partir de los datos almacenados en los registros de trazas de los sistemas de información, en busca de errores, inconsistencias, vulnerabilidades y variabilidad en los procesos que se ejecutan. Las técnicas de minería de procesos se utilizan en múltiples sectores, como la industria, los servicios web, la inteligencia de negocios y la salud. Sin embargo, para aplicar estas técnicas existen varios modelos a seguir y poca información sobre cual aplicar, al no contar con un análisis comparativo entre estos. La investigación se centró en recopilar información sobre los principales modelos propuestos por autores de referencia mundial en el tema de minería de procesos para aplicar técnicas en el descubrimiento, chequeo de conformidad y mejoramiento de los procesos. Se realiza un análisis de los mismos en función de seleccionar los elementos y características útiles para su aplicación en el entorno hospitalario. La actual investigación contribuye al desarrollo de un modelo para la detección y análisis de variabilidad en procesos hospitalarios utilizando técnicas de minería de procesos. Permite a los investigadores tener de forma centralizada, los criterios para decidir qué modelo utilizar, o qué fases emplear de uno o más modelos. English abstract Process mining is a novel alternative to improve processes in a variety of application domains. It aims to extract information from data stored in records of traces from information systems, looking for errors, inconsistencies, vulnerabilities and variability in processes that are executing. The process mining techniques are used in multiple sectors such as industry, web services, business intelligence and health. However, to apply these techniques there are several models and little information on
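
    A small sketch of the discovery and conformance-checking steps that such models organise, assuming the open-source pm4py library (the abstract does not name a specific tool); the event-log file name is hypothetical.

```python
# Sketch of process discovery and conformance checking on an event log (assumes pm4py is installed).
import pm4py

log = pm4py.read_xes("hospital_traces.xes")   # hypothetical event log exported from an information system

# Discovery: derive a Petri net model from the recorded traces
net, initial_marking, final_marking = pm4py.discover_petri_net_inductive(log)

# Conformance checking: measure how well the traces fit the discovered model
fitness = pm4py.fitness_token_based_replay(log, net, initial_marking, final_marking)
print(fitness)
```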

  14. Separation Science and Technology. Semiannual progress report, April 1993--September 1993

    International Nuclear Information System (INIS)

    Vandegrift, G.F.; Chamberlain, D.B.; Conner, C.

    1996-01-01

    This document reports on the work done by the Separations Science and Technology Programs of the Chemical Technology Division, Argonne National Laboratory, in the period April-September 1993. This effort is mainly concerned with developing the TRUEX process for removing and concentrating actinides from acidic waste streams contaminated with transuranic (TRU) elements. The objectives of TRUEX processing are to recover valuable TRU elements and to lower disposal costs for the non-TRU waste product of the process. Other projects are underway with the objective of developing (1) evaporation technology for concentrating radioactive waste and product streams such as those generated by the TRUEX process, (2) treatment schemes for liquid wastes stored or being generated at Argonne, (3) a process based on sorbing modified TRUEX solvent on magnetic beads to be used for separation of contaminants from radioactive and hazardous waste streams, and (4) a process that uses low-enriched uranium targets for production of 99Mo for nuclear medicine uses

  15. Different Ways to Apply a Measurement Instrument of E-Nose Type to Evaluate Ambient Air Quality with Respect to Odour Nuisance in a Vicinity of Municipal Processing Plants.

    Science.gov (United States)

    Szulczyński, Bartosz; Wasilewski, Tomasz; Wojnowski, Wojciech; Majchrzak, Tomasz; Dymerski, Tomasz; Namieśnik, Jacek; Gębicki, Jacek

    2017-11-19

    This review paper presents different ways to apply a measurement instrument of e-nose type to evaluate ambient air with respect to detection of the odorants characterized by unpleasant odour in a vicinity of municipal processing plants. An emphasis was put on the following applications of the electronic nose instruments: monitoring networks, remote controlled robots and drones as well as portable devices. Moreover, this paper presents commercially available sensors utilized in the electronic noses and characterized by the limit of quantification below 1 ppm v/v, which is close to the odour threshold of some odorants. Additionally, information about bioelectronic noses being a possible alternative to electronic noses and their principle of operation and application potential in the field of air evaluation with respect to detection of the odorants characterized by unpleasant odour was provided.

  16. Different Ways to Apply a Measurement Instrument of E-Nose Type to Evaluate Ambient Air Quality with Respect to Odour Nuisance in a Vicinity of Municipal Processing Plants

    Directory of Open Access Journals (Sweden)

    Bartosz Szulczyński

    2017-11-01

    Full Text Available This review paper presents different ways to apply a measurement instrument of e-nose type to evaluate ambient air with respect to detection of the odorants characterized by unpleasant odour in a vicinity of municipal processing plants. An emphasis was put on the following applications of the electronic nose instruments: monitoring networks, remote controlled robots and drones as well as portable devices. Moreover, this paper presents commercially available sensors utilized in the electronic noses and characterized by the limit of quantification below 1 ppm v/v, which is close to the odour threshold of some odorants. Additionally, information about bioelectronic noses being a possible alternative to electronic noses and their principle of operation and application potential in the field of air evaluation with respect to detection of the odorants characterized by unpleasant odour was provided.

  17. The effect of an acidic, copper sulfate-based commercial sanitizer on indicator, pathogenic, and spoilage bacteria associated with broiler chicken carcasses when applied at various intervention points during poultry processing.

    Science.gov (United States)

    Russell, S M

    2008-07-01

    Studies were conducted to evaluate 1) the effect of an acidic, copper sulfate-based commercial sanitizer on pathogenic, indicator, and spoilage bacteria in a model scalder system, 2) the effect of this sanitizer on total aerobic bacteria (APC) and Escherichia coli counts, and Salmonella prevalence on broiler chicken carcasses when applied during scalding or scalding and postpick dipping, and 3) the ability of sanitizer to extend the shelf-life of broiler chicken carcasses. Exposure of Salmonella Typhimurium, Listeria monocytogenes, Staphylococcus aureus, Pseudomonas fluorescens, or Shewanella putrefaciens to the sanitizer in scalder water at 54 degrees C for 2 min resulted in complete elimination of these bacterial species. Exposure of E. coli to the treated scald water resulted in a 4.9 log(10) reduction. These data suggest that this sanitizer would be effective for use in scalders. When applied during scalding in a commercial processing plant, APC and E. coli counts were significantly (P sanitizer, APC were significantly P sanitizer, except for d 2 and 10. Averages on these days were higher for controls, but were not significantly different. Salmonella prevalence was not consistently impacted overall. For the shelf-life study, odor scores were significantly (P sanitizer suppressed spoilage bacteria with a 99.99% reduction at d 10 and a 99.9% reduction at d 12 of storage. This effect could result in an extension of the shelf life of the poultry carcasses by up to 4 d.

  18. Applied systems theory

    CERN Document Server

    Dekkers, Rob

    2017-01-01

    Offering an up-to-date account of systems theories and its applications, this book provides a different way of resolving problems and addressing challenges in a swift and practical way, without losing overview and grip on the details. From this perspective, it offers a different way of thinking in order to incorporate different perspectives and to consider multiple aspects of any given problem. Drawing examples from a wide range of disciplines, it also presents worked cases to illustrate the principles. The multidisciplinary perspective and the formal approach to modelling of systems and processes of ‘Applied Systems Theory’ makes it suitable for managers, engineers, students, researchers, academics and professionals from a wide range of disciplines; they can use this ‘toolbox’ for describing, analysing and designing biological, engineering and organisational systems as well as getting a better understanding of societal problems. This revised, updated and expanded second edition includes coverage of a...

  19. Applying Continuous Process Improvement to the Contract Closeout Process

    Science.gov (United States)

    1993-12-16

    the work they supervise. Closing out U.S. Government contracts is part of the acquisition cycle and cannot continue to be brushed aside when higher...

  20. Applied and Professional Ethics

    OpenAIRE

    Collste, Göran

    2012-01-01

    The development of applied ethics in recent decades has had great significance for philosophy and society. In this article, I try to characterise this field of philosophical inquiry. I also discuss the relation of applied ethics to social policy and to professional ethics. In the first part, I address the following questions: What is applied ethics? When and why did applied ethics appear? How do we engage in applied ethics? What are the methods? In the second part of the article, I introduce...

  1. Método para aplicação de gráficos de controle de regressão no monitoramento de processos Method for applying regression control charts to process monitoring

    Directory of Open Access Journals (Sweden)

    Danilo Cuzzuol Pedrini

    2011-03-01

    Full Text Available Este artigo propõe um método para a aplicação do gráfico de controle de regressão no monitoramento de processos industriais. Visando facilitar a aplicação do gráfico, o método é apresentado em duas fases: análise retrospectiva (Fase I) e monitoramento do processo (Fase II), além de incluir uma modificação do gráfico de controle de regressão múltipla, permitindo o monitoramento direto da característica de qualidade do processo ao invés do monitoramento dos resíduos padronizados do modelo. Também é proposto o gráfico de controle de extrapolação, que verifica se as variáveis de controle extrapolam o conjunto de valores utilizado para estimar o modelo de regressão. O método foi aplicado em um processo de uma indústria de borrachas. O desempenho do gráfico de controle foi avaliado pelo Número Médio de Amostras (NMA) até o sinal através do método de Monte Carlo, mostrando a eficiência do gráfico em detectar algumas modificações nos parâmetros do processo. This work proposes a method for the application of regression control charts in the monitoring of manufacturing processes. The proposed method is presented in two phases: retrospective analysis (Phase I) and process monitoring (Phase II). It includes a simple modification of the multiple regression control chart, allowing the monitoring of the values of quality characteristics of the process, instead of monitoring the regression standardized residuals. It also proposes an extrapolation control chart, which verifies whether the control variables extrapolate the set of data used in regression model estimation. The proposed method was successfully applied in a rubber manufacturing process. The Average Run Length (ARL) distribution was estimated using the Monte Carlo method, proving the efficiency of the proposed chart in detecting some alterations in process parameters.
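
    A rough sketch of the two-phase idea, assuming ordinary least squares via statsmodels: Phase I fits the regression on in-control data and sets prediction limits, Phase II flags new observations that fall outside them. The simulated data, the single predictor and the 3-sigma-like alpha are illustrative, not the article's model.

```python
# Two-phase regression control chart sketch (simulated data, not the article's process).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Phase I: in-control data used to estimate the regression model
x1 = rng.uniform(10, 20, 50)                      # control variable
y1 = 2.0 + 0.5 * x1 + rng.normal(0, 0.3, 50)      # quality characteristic
model = sm.OLS(y1, sm.add_constant(x1)).fit()

# Phase II: monitor new observations against prediction limits (~3-sigma, alpha = 0.0027)
x2 = rng.uniform(10, 20, 10)
y2 = 2.0 + 0.5 * x2 + rng.normal(0, 0.3, 10)
y2[-1] += 2.0                                     # inject a shift to trigger a signal

pred = model.get_prediction(sm.add_constant(x2)).summary_frame(alpha=0.0027)
out_of_control = (y2 < pred["obs_ci_lower"]) | (y2 > pred["obs_ci_upper"])
print(np.where(out_of_control)[0])                # indices of signalling points
```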

  2. Análisis de vibraciones para el diagnóstico aplicando procesamiento estadístico de orden superior. // Diagnosis by vibration analysis applying higher-order statistics signal processing.

    Directory of Open Access Journals (Sweden)

    F. E. Hernández Montero

    2004-05-01

    Full Text Available El siguiente trabajo tiene la finalidad de aplicar algunas de las generalidades que involucra el procesamiento estadístico de orden superior al monitoreo del estado de las máquinas rotatorias, a partir del sensado de las vibraciones que esta genera. Básicamente se trabajó en la detección del desequilibrio presente, así como en la estimación de la magnitud del mismo. Como elementos estadísticos que se procesaron figuran los momentos y cumulantes de orden superior (hasta orden cuarto) de las vibraciones medidas tanto en el eje horizontal como en el vertical. Finalmente, se constató como resultado la efectividad de tal método para detectar un fallo específico, en este caso el desequilibrio, y que podría ser extendido para otros tipos de fallos. Palabras claves: análisis de vibraciones, procesamiento estadístico de orden superior, diagnóstico de maquinarias rotatorias. Abstract. The aim of this work is to apply higher-order statistics foundations to machine condition monitoring via vibration analysis. Specifically, shaft unbalance was the defect under study, and tasks for detection and magnitude estimation of this mechanical problem were accomplished. Higher-order (up to fourth-order) moments and cumulants of the vibrations measured on both the horizontal and vertical axes were estimated and analyzed. The effectiveness of the method was proven when applied to unbalance detection. Keywords: vibration analysis, higher-order statistics processing, rotating machine diagnosis.
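
    A small sketch of the kind of higher-order statistics involved, computed on a simulated two-channel vibration signal with SciPy; the synthetic signal, sampling rate and rotation frequency are illustrative only and no decision rule from the paper is reproduced.

```python
# Higher-order statistics (up to fourth order) of simulated horizontal and vertical vibration signals.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
fs = 5000                                   # sampling rate, Hz (illustrative)
t = np.arange(0, 2.0, 1.0 / fs)
rot_freq = 25.0                             # hypothetical shaft rotation frequency, Hz

# Simulated channels: unbalance shows up as a strong once-per-revolution component
horizontal = 0.8 * np.sin(2 * np.pi * rot_freq * t) + 0.2 * rng.normal(size=t.size)
vertical = 0.3 * np.sin(2 * np.pi * rot_freq * t + 0.5) + 0.2 * rng.normal(size=t.size)

for name, x in (("horizontal", horizontal), ("vertical", vertical)):
    variance = stats.moment(x, 2)           # second-order central moment
    print(f"{name}: variance={variance:.3f}, "
          f"skewness={stats.skew(x):.3f}, kurtosis={stats.kurtosis(x):.3f}")
```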

  3. Journal of applied mathematics

    National Research Council Canada - National Science Library

    2001-01-01

    "[The] Journal of Applied Mathematics is a refereed journal devoted to the publication of original research papers and review articles in all areas of applied, computational, and industrial mathematics...

  4. Advances in Applied Mechanics

    OpenAIRE

    2013-01-01

    Advances in Applied Mechanics draws together recent significant advances in various topics in applied mechanics. Published since 1948, Advances in Applied Mechanics aims to provide authoritative review articles on topics in the mechanical sciences, primarily of interest to scientists and engineers working in the various branches of mechanics, but also of interest to the many who use the results of investigations in mechanics in various application areas, such as aerospace, chemical, civil, en...

  5. Perspectives on Applied Ethics

    OpenAIRE

    2007-01-01

    Applied ethics is a growing, interdisciplinary field dealing with ethical problems in different areas of society. It includes for instance social and political ethics, computer ethics, medical ethics, bioethics, environmental ethics, business ethics, and it also relates to different forms of professional ethics. From the perspective of ethics, applied ethics is a specialisation in one area of ethics. From the perspective of social practice applying ethics is to focus on ethical aspects and ...

  6. Applied Neuroscience Laboratory Complex

    Data.gov (United States)

    Federal Laboratory Consortium — Located at WPAFB, Ohio, the Applied Neuroscience lab researches and develops technologies to optimize Airmen individual and team performance across all AF domains....

  7. Essays in applied microeconomics

    Science.gov (United States)

    Wang, Xiaoting

    In this dissertation I use Microeconomic theory to study firms' behavior. Chapter One introduces the motivations and main findings of this dissertation. Chapter Two studies the issue of information provision through advertisement when markets are segmented and consumers' price information is incomplete. Firms compete in prices and advertising strategies for consumers with transportation costs. High advertising costs contribute to market segmentation. Low advertising costs promote price competition among firms and improves consumer welfare. Chapter Three also investigates market power as a result of consumers' switching costs. A potential entrant can offer a new product bundled with an existing product to compensate consumers for their switching cost. If the primary market is competitive, bundling simply plays the role of price discrimination, and it does not dominate unbundled sales in the process of entry. If the entrant has market power in the primary market, then bundling also plays the role of leveraging market power and it dominates unbundled sales. The market for electric power generation has been opened to competition in recent years. Chapter Four looks at issues involved in the deregulated electricity market. By comparing the performance of the competitive market with the social optimum, we identify the conditions under which market equilibrium generates socially efficient levels of electric power. Chapter Two to Four investigate the strategic behavior among firms. Chapter Five studies the interaction between firms and unemployed workers in a frictional labor market. We set up an asymmetric job auction model, where two types of workers apply for two types of job openings by bidding in auctions and firms hire the applicant offering them the most profits. The job auction model internalizes the determination of the share of surplus from a match, therefore endogenously generates incentives for an efficient division of the matching surplus. Microeconomic

  8. Applied eye tracking research

    NARCIS (Netherlands)

    Jarodzka, Halszka

    2011-01-01

    Jarodzka, H. (2010, 12 November). Applied eye tracking research. Presentation and Labtour for Vereniging Gewone Leden in oprichting (VGL i.o.), Heerlen, The Netherlands: Open University of the Netherlands.

  9. Applied Mathematics Seminar 1982

    International Nuclear Information System (INIS)

    1983-01-01

    This report contains the abstracts of the lectures delivered at the 1982 Applied Mathematics Seminar of the DPD/LCC/CNPq and the Colloquy on Applied Mathematics of LCC/CNPq. The Seminar comprised 36 conferences. Among these, 30 were presented by researchers associated with Brazilian institutions, 9 of them with the LCC/CNPq, and the other 6 were given by visiting lecturers according to the following distribution: 4 from the USA, 1 from England and 1 from Venezuela. The 1981 Applied Mathematics Seminar was organized by Leon R. Sinay and Nelson do Valle Silva. The Colloquy on Applied Mathematics was held from October 1982 on, being organized by Ricardo S. Kubrusly and Leon R. Sinay. (Author) [pt

  10. Applied Learning Networks (ALN)

    National Research Council Canada - National Science Library

    Bannister, Joseph; Shen, Wei-Min; Touch, Joseph; Hou, Feili; Pingali, Venkata

    2007-01-01

    Applied Learning Networks (ALN) demonstrates that a network protocol can learn to improve its performance over time, showing how to incorporate learning methods into a general class of network protocols...

  11. Mesothelioma Applied Research Foundation

    Science.gov (United States)


  12. Applying contemporary statistical techniques

    CERN Document Server

    Wilcox, Rand R

    2003-01-01

    Applying Contemporary Statistical Techniques explains why traditional statistical methods are often inadequate or outdated when applied to modern problems. Wilcox demonstrates how new and more powerful techniques address these problems far more effectively, making these modern robust methods understandable, practical, and easily accessible.* Assumes no previous training in statistics * Explains how and why modern statistical methods provide more accurate results than conventional methods* Covers the latest developments on multiple comparisons * Includes recent advanc

  13. Computer and Applied Ethics

    OpenAIRE

    越智, 貢

    2014-01-01

    With this essay I treat some problems raised by the new developments in science and technology, that is, those about Computer Ethics to show how and how far Applied Ethics differs from traditional ethics. I take up backgrounds on which Computer Ethics rests, particularly historical conditions of morality. Differences of conditions in time and space explain how Computer Ethics and Applied Ethics are not any traditional ethics in concrete cases. But I also investigate the normative rea...

  14. Applied machining technology

    CERN Document Server

    Tschätsch, Heinz

    2010-01-01

    Machining and cutting technologies are still crucial for many manufacturing processes. This reference presents all important machining processes in a comprehensive and coherent way. It includes many examples of concrete calculations, problems and solutions.

  15. PSYCHOANALYSIS AS APPLIED AESTHETICS.

    Science.gov (United States)

    Richmond, Stephen H

    2016-07-01

    The question of how to place psychoanalysis in relation to science has been debated since the beginning of psychoanalysis and continues to this day. The author argues that psychoanalysis is best viewed as a form of applied art (also termed applied aesthetics) in parallel to medicine as applied science. This postulate draws on a functional definition of modernity as involving the differentiation of the value spheres of science, art, and religion. The validity criteria for each of the value spheres are discussed. Freud is examined, drawing on Habermas, and seen to have erred by claiming that the psychoanalytic method is a form of science. Implications for clinical and metapsychological issues in psychoanalysis are discussed. © 2016 The Psychoanalytic Quarterly, Inc.

  16. Applied chemical engineering thermodynamics

    CERN Document Server

    Tassios, Dimitrios P

    1993-01-01

    Applied Chemical Engineering Thermodynamics provides the undergraduate and graduate student of chemical engineering with the basic knowledge, the methodology and the references needed to apply it in industrial practice. Thus, in addition to the classical topics of the laws of thermodynamics, pure component and mixture thermodynamic properties as well as phase and chemical equilibria, the reader will find: - history of thermodynamics - energy conservation - intermolecular forces and molecular thermodynamics - cubic equations of state - statistical mechanics. A great number of calculated problems with solutions and an appendix with numerous tables of numbers of practical importance are extremely helpful for applied calculations. The computer programs on the included disk help the student to become familiar with the typical methods used in industry for volumetric and vapor-liquid equilibria calculations.

  17. On applying cognitive psychology.

    Science.gov (United States)

    Baddeley, Alan

    2013-11-01

    Recent attempts to assess the practical impact of scientific research prompted my own reflections on over 40 years worth of combining basic and applied cognitive psychology. Examples are drawn principally from the study of memory disorders, but also include applications to the assessment of attention, reading, and intelligence. The most striking conclusion concerns the many years it typically takes to go from an initial study, to the final practical outcome. Although the complexity and sheer timescale involved make external evaluation problematic, the combination of practical satisfaction and theoretical stimulation make the attempt to combine basic and applied research very rewarding. © 2013 The British Psychological Society.

  18. Introduction to applied thermodynamics

    CERN Document Server

    Helsdon, R M; Walker, G E

    1965-01-01

    Introduction to Applied Thermodynamics is an introductory text on applied thermodynamics and covers topics ranging from energy and temperature to reversibility and entropy, the first and second laws of thermodynamics, and the properties of ideal gases. Standard air cycles and the thermodynamic properties of pure substances are also discussed, together with gas compressors, combustion, and psychrometry. This volume is comprised of 16 chapters and begins with an overview of the concept of energy as well as the macroscopic and molecular approaches to thermodynamics. The following chapters focus o

  19. Applied mathematics made simple

    CERN Document Server

    Murphy, Patrick

    1982-01-01

    Applied Mathematics: Made Simple provides an elementary study of the three main branches of classical applied mathematics: statics, hydrostatics, and dynamics. The book begins with discussion of the concepts of mechanics, parallel forces and rigid bodies, kinematics, motion with uniform acceleration in a straight line, and Newton's law of motion. Separate chapters cover vector algebra and coplanar motion, relative motion, projectiles, friction, and rigid bodies in equilibrium under the action of coplanar forces. The final chapters deal with machines and hydrostatics. The standard and conte

  20. Applied Music (Individual Study).

    Science.gov (United States)

    Texas Education Agency, Austin.

    Background information and resources to help students in grades 9-12 in Texas pursue an individual study contract in applied music is presented. To fulfill a contract students must publicly perform from memory, with accompaniment as specified, three selections from a list of approved music for their chosen field (instrument or voice). Material…

  1. Essays in applied microeconometrics

    NARCIS (Netherlands)

    Cervený, Jakub

    2017-01-01

    Duration analysis has been widely used in the applied economic research since the late 1970s. The framework allows to examine the duration of time intervals and the rate of transition across a set of states over time. Many economic behaviors follow a similar pattern, such as transition from the

  2. Applied Statistics with SPSS

    Science.gov (United States)

    Huizingh, Eelko K. R. E.

    2007-01-01

    Accessibly written and easy to use, "Applied Statistics Using SPSS" is an all-in-one self-study guide to SPSS and do-it-yourself guide to statistics. What is unique about Eelko Huizingh's approach is that this book is based around the needs of undergraduate students embarking on their own research project, and its self-help style is designed to…

  3. Advances in applied mechanics

    CERN Document Server

    Wu, Theodore Y; Wu, Theodore Y

    2000-01-01

    This highly acclaimed series provides survey articles on the present state and future direction of research in important branches of applied solid and fluid mechanics. Mechanics is defined as a branch of physics that focuses on motion and on the reaction of physical systems to internal and external forces.

  4. What Are Applied Linguistics?

    Science.gov (United States)

    Sridhar, S. N.

    1993-01-01

    Several different conceptualizations of applied linguistics are evaluated, ranging from "applications of linguistic theory" to alternative models for studying language that extend and complement generative grammar as a theory of language. It is shown that they imply substantive differences in goals, methods, and priorities of language study. (30…

  5. Thermodynamics applied. Where? Why?

    NARCIS (Netherlands)

    Hirs, Gerard

    2003-01-01

    In recent years, thermodynamics has been applied in a number of new fields leading to a greater societal impact. This paper gives a survey of these new fields and the reasons why these applications are important. In addition, it is shown that the number of fields could be even greater in the future

  6. Thermodynamics, applied. : Where? why?

    NARCIS (Netherlands)

    Hirs, Gerard

    1999-01-01

    In recent years thermodynamics has been applied in a number of new fields leading to a greater societal impact. The paper gives a survey of these new fields and the reasons why these applications are important. In addition it is shown that the number of fields could be even greater in the future and

  7. Essays on Applied Microeconomics

    Science.gov (United States)

    Mejia Mantilla, Carolina

    2013-01-01

    Each chapter of this dissertation studies a different question within the field of Applied Microeconomics. The first chapter examines the mid- and long-term effects of the 1998 Asian Crisis on the educational attainment of Indonesian children ages 6 to 18, at the time of the crisis. The effects are identified as deviations from a linear trend for…

  8. Applied Behavior Analysis

    Science.gov (United States)

    Szapacs, Cindy

    2006-01-01

    Teaching strategies that work for typically developing children often do not work for those diagnosed with an autism spectrum disorder. However, teaching strategies that work for children with autism do work for typically developing children. In this article, the author explains how the principles and concepts of Applied Behavior Analysis can be…

  9. Journal of Applied Biosciences

    African Journals Online (AJOL)

    The Journal of Applied Biosciences provides a forum for scholars and practitioners in all spheres of biological sciences to publish their research findings or theoretical concepts and ideas of a scientific nature. Other websites related to this journal: http://m.elewa.org/Journals/about-jab/ ...

  10. 32 CFR 37.1220 - Applied research.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense, Section 37.1220... REGULATIONS, TECHNOLOGY INVESTMENT AGREEMENTS, Definitions of Terms Used in This Part, § 37.1220 Applied research... technology such as new materials, devices, methods and processes. It typically is funded in Research...

  11. Applied Linguistics in Its Disciplinary Context

    Science.gov (United States)

    Liddicoat, Anthony J.

    2010-01-01

    Australia's current attempt to develop a process to evaluate the quality of research (Excellence in Research for Australia--ERA) places a central emphasis on the disciplinary organisation of academic work. This disciplinary focus poses particular problems for Applied Linguistics in Australia. This paper will examine Applied Linguistics in relation…

  12. Applying quantum principles to psychology

    International Nuclear Information System (INIS)

    Busemeyer, Jerome R; Wang, Zheng; Khrennikov, Andrei; Basieva, Irina

    2014-01-01

    This article starts out with a detailed example illustrating the utility of applying quantum probability to psychology. Then it describes several alternative mathematical methods for mapping fundamental quantum concepts (such as state preparation, measurement, state evolution) to fundamental psychological concepts (such as stimulus, response, information processing). For state preparation, we consider both pure states and densities with mixtures. For measurement, we consider projective measurements and positive operator valued measurements. The advantages and disadvantages of each method with respect to applications in psychology are discussed. (paper)
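
    For reference, the standard quantum probability rules that such a mapping relies on can be written compactly; these are textbook formulas, not equations quoted from the abstract.

```latex
% Projective measurement on a pure state |psi>: probability of response A and state revision
P(A) = \lVert \mathbf{P}_A \, |\psi\rangle \rVert^{2},
\qquad
|\psi_{A}\rangle = \frac{\mathbf{P}_A |\psi\rangle}{\lVert \mathbf{P}_A |\psi\rangle \rVert}

% Mixed state (density operator rho) measured with a positive operator valued measure {E_a}
P(a) = \operatorname{tr}\!\left(\rho \, E_a\right),
\qquad
\sum_a E_a = I
```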

  13. Applying quantum principles to psychology

    Science.gov (United States)

    Busemeyer, Jerome R.; Wang, Zheng; Khrennikov, Andrei; Basieva, Irina

    2014-12-01

    This article starts out with a detailed example illustrating the utility of applying quantum probability to psychology. Then it describes several alternative mathematical methods for mapping fundamental quantum concepts (such as state preparation, measurement, state evolution) to fundamental psychological concepts (such as stimulus, response, information processing). For state preparation, we consider both pure states and densities with mixtures. For measurement, we consider projective measurements and positive operator valued measurements. The advantages and disadvantages of each method with respect to applications in psychology are discussed.

  14. Applied in vitro radio bioassay

    International Nuclear Information System (INIS)

    Gaburo, J.C.G.; Sordi, G.M.A.A.

    1992-11-01

    The aim of this publication is to show the concepts and in vitro bioassay techniques as well as experimental procedures related with internal contamination evaluation. The main routes of intake, metabolic behavior, and the possible types of bioassay samples that can be collected for radionuclides analysis are described. Both biological processes and the chemical and physical behavior of the radioactive material of interest are considered and the capabilities of analytical techniques to detect and quantify the radionuclides are discussed. Next, the need of quality assurance throughout procedures are considered and finally a summary of the techniques applied to the internal routine monitoring of IPEN workers is given. (author)

  15. Quality in applied science

    Science.gov (United States)

    Sten, T.

    1993-12-01

    Science is in many senses a special kind of craft and only skilled craftsmen are able to distinguish good work from bad. Due to the variation in approaches, methods and even philosophical basis, it is nearly impossible to derive a general set of quality criteria for scientific work outside specific research traditions. Applied science introduces a new set of quality criteria having to do with the application of results in practical situations and policy making. A scientist doing basic research relates mainly to the scientific community of which he is a member, while in applied contract research the scientist has to consider the impact of his results both for the immediate users and upon interest groups possibly being affected. Application thus raises a whole new set of requirements having to do with business ethics, policy consequences and societal ethics in general.

  16. Applied evaluative informetrics

    CERN Document Server

    Moed, Henk F

    2017-01-01

    This book focuses on applied evaluative informetric artifacts or topics. It explains the base notions and assumptions of evaluative informetrics by discussing a series of important applications. The structure of the book is therefore not organized by methodological characteristics, but is centered around popular, often discussed or used informetric artifacts - indicators, methodologies, products, databases - or so called hot topics in which informetric indicators play an important role. Most of the artifacts and topics emerged during the past decade. The principal aim of the book is to present a state of the art in applied evaluative informetrics, and to inform the readers about the pros and cons, potentialities and limitations of the use of informetric/bibliometric indicators in research assessment. The book is a continuation of the book Citation Analysis in Research Evaluation (Springer, 2005). It is of interest to non-specialists, especially research students at advanced master level and higher, all thos...

  17. Applied statistics for economists

    CERN Document Server

    Lewis, Margaret

    2012-01-01

    This book is an undergraduate text that introduces students to commonly-used statistical methods in economics. Using examples based on contemporary economic issues and readily-available data, it not only explains the mechanics of the various methods, it also guides students to connect statistical results to detailed economic interpretations. Because the goal is for students to be able to apply the statistical methods presented, online sources for economic data and directions for performing each task in Excel are also included.

  18. Applied longitudinal analysis

    CERN Document Server

    Fitzmaurice, Garrett M; Ware, James H

    2012-01-01

    Praise for the First Edition: ". . . [this book] should be on the shelf of everyone interested in . . . longitudinal data analysis." (Journal of the American Statistical Association). Features newly developed topics and applications of the analysis of longitudinal data. Applied Longitudinal Analysis, Second Edition presents modern methods for analyzing data from longitudinal studies and now features the latest state-of-the-art techniques. The book emphasizes practical, rather than theoretical, aspects of methods for the analysis of diverse types of lo

  19. Methods of applied mathematics

    CERN Document Server

    Hildebrand, Francis B

    1992-01-01

    This invaluable book offers engineers and physicists working knowledge of a number of mathematical facts and techniques not commonly treated in courses in advanced calculus, but nevertheless extremely useful when applied to typical problems in many different fields. It deals principally with linear algebraic equations, quadratic and Hermitian forms, operations with vectors and matrices, the calculus of variations, and the formulations and theory of linear integral equations. Annotated problems and exercises accompany each chapter.

  20. Applied clinical engineering

    International Nuclear Information System (INIS)

    Feinberg, B.

    1986-01-01

    This book demonstrates how clinical engineering has applied engineering principles to the development and use of complex medical devices for the diagnosis and treatment of the sick and injured. It discusses the proper utilization of medical devices and equipment in the health-care industry and provides understanding of complex engineering systems, and their uses in the modern hospital or other health-care facility

  1. Applied mediation analyses

    DEFF Research Database (Denmark)

    Lange, Theis; Hansen, Kim Wadt; Sørensen, Rikke

    2017-01-01

    In recent years, mediation analysis has emerged as a powerful tool to disentangle causal pathways from an exposure/treatment to clinically relevant outcomes. Mediation analysis has been applied in scientific fields as diverse as labour market relations and randomized clinical trials of heart...... disease treatments. In parallel to these applications, the underlying mathematical theory and computer tools have been refined. This combined review and tutorial will introduce the reader to modern mediation analysis including: the mathematical framework; required assumptions; and software implementation...
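
    A bare-bones sketch of the classical product-of-coefficients decomposition on simulated data, using statsmodels OLS; the modern counterfactual mediation framework reviewed in the tutorial adds identification assumptions and exposure-mediator interactions that this toy example ignores.

```python
# Toy product-of-coefficients mediation sketch on simulated data (not the tutorial's full framework).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 1000
exposure = rng.binomial(1, 0.5, n)                       # e.g. treatment assignment
mediator = 0.8 * exposure + rng.normal(0, 1, n)          # mediator influenced by exposure
outcome = 0.5 * mediator + 0.3 * exposure + rng.normal(0, 1, n)

# Path a: exposure -> mediator
a = sm.OLS(mediator, sm.add_constant(exposure)).fit().params[1]

# Paths b (mediator -> outcome) and c' (direct effect), adjusting for exposure
X = sm.add_constant(np.column_stack([exposure, mediator]))
fit = sm.OLS(outcome, X).fit()
direct, b = fit.params[1], fit.params[2]

print(f"indirect effect (a*b) = {a * b:.3f}, direct effect = {direct:.3f}")
```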

  2. Applied science. Introduction.

    Science.gov (United States)

    Bud, Robert

    2012-09-01

    Such categories as applied science and pure science can be thought of as "ideological." They have been contested in the public sphere, exposing long-term intellectual commitments, assumptions, balances of power, and material interests. This group of essays explores the contest over applied science in Britain and the United States during the nineteenth century. The essays look at the concept in the context of a variety of neighbors, including pure science, technology, and art. They are closely related and connected to contemporary historiographic debate. Jennifer Alexander links the issues raised to a recent paper by Paul Forman. Paul Lucier and Graeme Gooday deal with the debates in the last quarter of the century in the United States and Britain, respectively. Robert Bud deals with the earlier part of the nineteenth century, with an eye specifically on the variety of concepts hybridized under the heading of "applied science." Eric Schatzberg looks at the erosion of the earlier concept of art. As a whole, the essays illuminate both long-term changes and nuanced debate and are themselves intended to provoke further reflection on science in the public sphere.

  3. Towards "open applied" Earth sciences

    Science.gov (United States)

    Ziegler, C. R.; Schildhauer, M.

    2014-12-01

    Concepts of open science -- in the context of cyber/digital technology and culture -- could greatly benefit applied and secondary Earth science efforts. However, international organizations (e.g., environmental agencies, conservation groups and sustainable development organizations) that are focused on applied science have been slow to incorporate open practices across the spectrum of scientific activities, from data to decisions. Myriad benefits include transparency, reproducibility, efficiency (timeliness and cost savings), stakeholder engagement, direct linkages between research and environmental outcomes, reduction in bias and corruption, improved simulation of Earth systems and improved availability of science in general. We map out where and how open science can play a role, providing next steps, with specific emphasis on applied science efforts and processes such as environmental assessment, synthesis and systematic reviews, meta-analyses, decision support and emerging cyber technologies. Disclaimer: The views expressed in this paper are those of the authors and do not necessarily reflect the views or policies of the organizations for which they work and/or represent.

  4. Statistical process control applied to intensity modulated radiotherapy pretreatment controls with portal dosimetry;Maitrise statistique des processus appliquee aux controles avant traitement par dosimetrie portale en radiotherapie conformationnelle avec modulation d'intensite

    Energy Technology Data Exchange (ETDEWEB)

    Villani, N.; Noel, A. [Laboratoire de recherche en radiophysique, CRAN UMR 7039, Nancy universite-CNRS, 54 - Vandoeuvre-les-Nancy (France); Villani, N.; Gerard, K.; Marchesi, V.; Huger, S.; Noel, A. [Departement de radiophysique, centre Alexis-Vautrin, 54 - Vandoeuvre-les-Nancy (France); Francois, P. [Institut Curie, 75 - Paris (France)

    2010-06-15

    Purpose: The first purpose of this study was to illustrate the contribution of statistical process control to better security in intensity modulated radiotherapy (I.M.R.T.) treatments. This improvement is possible by controlling the dose delivery process, characterized by pretreatment quality control results. It is therefore necessary to put the portal dosimetry measurements under statistical control (the ionisation chamber measurements were already monitored with statistical process control tools). The second objective was to determine whether portal dosimetry can be substituted for the ionisation chamber in order to optimize the time devoted to pretreatment quality control. Patients and methods: At the Alexis-Vautrin center, pretreatment quality controls in I.M.R.T. for prostate and head and neck treatments were performed for each beam of each patient. These controls were made with an ionisation chamber, which is the reference detector for the absolute dose measurement, and with portal dosimetry for the verification of the dose distribution. Statistical process control is a statistical analysis method, originating in industry, used to control and improve the quality of the studied process. It uses graphical tools, such as control charts, to follow the process and warn the operator in case of failure, and quantitative tools to evaluate the ability of the process to respect guidelines: this is the capability study. The study was performed on 450 head and neck beams and on 100 prostate beams. Results: Control charts of the mean and standard deviation were established; they showed drifts, both slow and weak and also strong and fast, and revealed a special cause that had been introduced (a manual shift of the leaf gap of the multi-leaf collimator). The correlation between the dose measured at one point with the E.P.I.D. and with the ionisation chamber was evaluated at more than 97%, and cases of disagreement between the two measurements were identified. Conclusion: The study allowed to
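
    A compact sketch of the two statistical process control tools named above, an individuals control chart and a capability index, applied to simulated pretreatment dose deviations; the data, specification limits and constants (d2 = 1.128 for a moving range of size 2) are illustrative, not the centre's clinical values.

```python
# Individuals control chart and capability study on simulated pretreatment dose deviations (%).
import numpy as np

rng = np.random.default_rng(7)
deviations = rng.normal(0.2, 0.8, 60)           # simulated QC results, in % dose difference

# Control limits of an individuals (I) chart: sigma estimated from the average moving range
mean = deviations.mean()
mr_bar = np.abs(np.diff(deviations)).mean()
sigma = mr_bar / 1.128                          # d2 constant for a moving range of size 2
ucl, lcl = mean + 3 * sigma, mean - 3 * sigma
signals = np.where((deviations > ucl) | (deviations < lcl))[0]

# Capability study against tolerance limits (e.g. +/- 3% dose difference, an assumed guideline)
usl, lsl = 3.0, -3.0
cp = (usl - lsl) / (6 * sigma)
cpk = min(usl - mean, mean - lsl) / (3 * sigma)
print(f"UCL={ucl:.2f} LCL={lcl:.2f} signals={signals} Cp={cp:.2f} Cpk={cpk:.2f}")
```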

  5. Implementing an Applied Science Program

    Science.gov (United States)

    Rickman, Doug; Presson, Joan

    2007-01-01

    The work implied in the NASA Applied Science Program requires a delicate balancing act for the those doing it. At the implementation level there are multiple tensions intrinsic to the program. For example each application of an existing product to a decision support process requires deep knowledge about the data and deep knowledge about the decision making process. It is highly probable no one person has this range of knowledge. Otherwise the decision making process would already be using the data. Therefore, a team is required. But building a team usually requires time, especially across agencies. Yet the program mandates efforts of relatively short duration. Further, those who know the data are scientists, which makes them essential to the program. But scientists are evaluated on their publication record. Anything which diverts a scientist from the research for his next publication is an anathema to him and potential death to their career. Trying to get another agency to use NASA data does not strike most scientists as material inherently suitable for publication. Also, NASA wishes to rapidly implement often substantial changes to another agency's process. For many reasons, such as budget and program constraints, speed is important. But the owner of a decision making process is tightly constrained, usually by law, regulation, organization and custom. Changes when made are slow, cautious, even hesitant, and always done according a process specific to the situation. To manage this work MSFC must balance these and other tensions. Some things we have relatively little control over, such as budget. These we try to handle by structural techniques. For example by insisting all of our people work on multiple projects simultaneously we inherently have diversification of funding for all of our people. In many cases we explicitly use some elements of tension to be productive. For example the need for the scientists to constantly publish is motivation to keep tasks short and

  6. The scientific research methods applied from Physical Education Theory and Methodology contents at physical skills treatment and its relationship with the sport motor skills connected with the Teaching and Learning Process

    Directory of Open Access Journals (Sweden)

    Naivy Lanza-Escobar

    2015-06-01

    Full Text Available Within the current theoretical and methodological conceptions that approach the educational processes and phenomena of Physical Education in the topic of physical capabilities, there are several levels that, with a systemic character, provide coherence and unity, from their most general argumentation down to the description of how they should be studied. However, this reality, which is implicit in the different theories, usually goes unnoticed by researchers and thus brings about theoretical and methodological inconsistencies in the investigation which undermine the rigour of the research process. The aim of this article is to analyse each level in the theoretical foundation of investigations into the educational process of Physical Education in the topic of physical capabilities. The teaching-learning process is assessed in terms of how different authors have approached the theoretical and methodological levels; later, each of them is specified, highlighting new elements which, according to the author, can enrich the work and the relationship among the contents to be dealt with in that process.

  7. Study of a 'zero discharge' process applied to the treatment of wastewater containing heavy metals and radionuclides by coupling nano-filtration and a controlled electrical elution

    International Nuclear Information System (INIS)

    Ferreira-Esmi, Caue

    2014-01-01

    The aim of this thesis is to study a process designed to remove nickel and cobalt cations, present at low concentrations, from the wastewater of a nuclear fuel reprocessing facility. The proposed process combines nano-filtration with a sorption step in which the adsorbent (carbon felt) is a conductive material that can be electrically regenerated. Each step of the process is studied separately and their combination is then evaluated. The nano-filtration step is studied by an approach integrating experiments with numerical simulation. A simple experiment-based method was developed to populate the simulation software database, improving its predictive capacity. Three commercial nano-filtration membranes were compared in continuous and batch recycling operating modes, which allowed the membrane best suited to the process to be chosen. The permeate produced by nano-filtration was used to study the sorption step. After a physical characterization of the carbon felts, their application was studied in two stages. The first was a closed batch operating mode, which allowed the sorption kinetics to be characterized and equilibrium isotherms to be obtained. The second was a fixed-bed operating mode in which adsorbent breakthrough curves were studied. The influence of the operating conditions and of the wastewater composition on the results was analyzed. Regeneration of the carbon felts was investigated by both acid and electrical routes. A process scheme using acid regeneration was proposed; the electrical route still requires further study. (author)
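
    The batch sorption step described above is typically characterized through an equilibrium isotherm and a kinetic model. As a purely illustrative sketch, not taken from the thesis, the following Python snippet evaluates a Langmuir isotherm and a pseudo-first-order approach to equilibrium for a hypothetical metal uptake on carbon felt; the parameters q_max, K_L and k1 and the feed concentration are assumed values for demonstration only.

```python
import numpy as np

# Hypothetical Langmuir and kinetic parameters (illustrative values only,
# not from the cited thesis).
q_max = 0.50   # mmol/g, maximum sorption capacity
K_L = 2.0      # L/mmol, Langmuir affinity constant
k1 = 0.05      # 1/min, pseudo-first-order rate constant

def langmuir(c):
    """Equilibrium uptake q_e (mmol/g) at liquid-phase concentration c (mmol/L)."""
    return q_max * K_L * c / (1.0 + K_L * c)

def pseudo_first_order(q_e, t):
    """Uptake q(t) approaching equilibrium uptake q_e with rate constant k1."""
    return q_e * (1.0 - np.exp(-k1 * t))

c0 = 0.2                        # mmol/L, assumed feed concentration
t = np.linspace(0, 120, 7)      # minutes
q_e = langmuir(c0)
print(f"Equilibrium uptake at {c0} mmol/L: {q_e:.3f} mmol/g")
for ti, qi in zip(t, pseudo_first_order(q_e, t)):
    print(f"t = {ti:5.0f} min  q = {qi:.3f} mmol/g")
```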

  8. Applied impulsive mathematical models

    CERN Document Server

    Stamova, Ivanka

    2016-01-01

    Using the theory of impulsive differential equations, this book focuses on mathematical models which reflect current research in biology, population dynamics, neural networks and economics. The authors provide the basic background from the fundamental theory and give a systematic exposition of recent results related to the qualitative analysis of impulsive mathematical models. Consisting of six chapters, the book presents many applicable techniques, making them available in a single source easily accessible to researchers interested in mathematical models and their applications. Serving as a valuable reference, this text is addressed to a wide audience of professionals, including mathematicians, applied researchers and practitioners.
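
    To make the notion of an impulsive mathematical model concrete, the following sketch, not taken from the book, integrates a logistic population subject to periodic impulsive harvesting; the continuous dynamics are advanced with a simple Euler scheme and the jump condition is applied at fixed times. All parameter values (r, K, h, tau) are assumed for illustration.

```python
import numpy as np

# Logistic growth dx/dt = r*x*(1 - x/K) with an impulsive harvest
# x -> (1 - h)*x applied every tau time units (illustrative parameters).
r, K = 0.8, 100.0     # growth rate and carrying capacity (assumed)
h, tau = 0.3, 5.0     # harvest fraction and impulse period (assumed)
dt, T = 0.01, 50.0    # Euler step and total simulated time

x, t = 10.0, 0.0
next_impulse = tau
history = [(t, x)]
while t < T:
    x += dt * r * x * (1.0 - x / K)   # continuous logistic dynamics
    t += dt
    if t >= next_impulse:             # impulsive harvesting event
        x *= (1.0 - h)
        next_impulse += tau
    history.append((t, x))

# Print the trajectory at regular intervals
for ti, xi in history[::500]:
    print(f"t = {ti:6.2f}  x = {xi:7.2f}")
```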

  9. Applied energy an introduction

    CERN Document Server

    Abdullah, Mohammad Omar

    2012-01-01

    Introduction to Applied Energy: General Introduction; Energy and Power Basics; Energy Equation; Energy Generation Systems; Energy Storage and Methods; Energy Efficiencies and Losses. Energy Industry and Energy Applications in Small-Medium Enterprises (SME) Industries: Energy Industry; Energy-Intensive Industry; Energy Applications in SME Energy Industries. Energy Sources and Supply: Energy Sources; Energy Supply and Energy Demand; Energy Flow Visualization and Sankey Diagram. Energy Management and Analysis: Energy Audits; Energy Use and Fuel Consumption Study; Energy Life-Cycle Analysis. Energy and Environment: Energy Pollutants, S

  10. Applied Meteorology Unit (AMU)

    Science.gov (United States)

    Bauman, William; Crawford, Winifred; Barrett, Joe; Watson, Leela; Wheeler, Mark

    2010-01-01

    This report summarizes the Applied Meteorology Unit (AMU) activities for the first quarter of Fiscal Year 2010 (October - December 2009). A detailed project schedule is included in the Appendix. Included tasks are: (1) Peak Wind Tool for User Launch Commit Criteria (LCC), (2) Objective Lightning Probability Tool, Phase III, (3) Peak Wind Tool for General Forecasting, Phase II, (4) Upgrade Summer Severe Weather Tool in Meteorological Interactive Data Display System (MIDDS), (5) Advanced Regional Prediction System (ARPS) Data Analysis System (ADAS) Update and Maintainability, (6) Verify 12-km resolution North American Model (MesoNAM) Performance, and (7) Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) Graphical User Interface.

  11. Applied Semantic Web Technologies

    CERN Document Server

    Sugumaran, Vijayan

    2011-01-01

    The rapid advancement of semantic web technologies, along with the fact that they are at various levels of maturity, has left many practitioners confused about the current state of these technologies. Focusing on the most mature technologies, Applied Semantic Web Technologies integrates theory with case studies to illustrate the history, current state, and future direction of the semantic web. It maintains an emphasis on real-world applications and examines the technical and practical issues related to the use of semantic technologies in intelligent information management. The book starts with

  12. Applied nonparametric statistical methods

    CERN Document Server

    Sprent, Peter

    2007-01-01

    While preserving the clear, accessible style of previous editions, Applied Nonparametric Statistical Methods, Fourth Edition reflects the latest developments in computer-intensive methods that deal with intractable analytical problems and unwieldy data sets. Reorganized and with additional material, this edition begins with a brief summary of some relevant general statistical concepts and an introduction to basic ideas of nonparametric or distribution-free methods. Designed experiments, including those with factorial treatment structures, are now the focus of an entire chapter. The text also e
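
    A minimal example of the computer-intensive, distribution-free methods of the kind the book covers is a two-sample permutation test. The sketch below, an illustration of ours rather than an excerpt from the text, uses small synthetic samples (the data and number of permutations are assumed) and estimates a p-value for the difference in means without any distributional assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two small synthetic samples (illustrative data only)
a = np.array([12.1, 11.4, 13.0, 12.7, 11.9])
b = np.array([10.8, 11.1, 10.5, 11.6, 10.9])

observed = a.mean() - b.mean()
pooled = np.concatenate([a, b])
n_a = len(a)

# Permutation test: shuffle the group labels and recompute the statistic
n_perm = 20000
count = 0
for _ in range(n_perm):
    rng.shuffle(pooled)
    diff = pooled[:n_a].mean() - pooled[n_a:].mean()
    if abs(diff) >= abs(observed):
        count += 1

p_value = count / n_perm
print(f"observed difference = {observed:.3f}, permutation p-value ~ {p_value:.4f}")
```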

  13. Applied linear regression

    CERN Document Server

    Weisberg, Sanford

    2013-01-01

    Praise for the Third Edition: "...this is an excellent book which could easily be used as a course text..." - International Statistical Institute. The Fourth Edition of Applied Linear Regression provides a thorough update of the basic theory and methodology of linear regression modeling. Demonstrating the practical applications of linear regression analysis techniques, the Fourth Edition uses interesting, real-world exercises and examples. Stressing central concepts such as model building, understanding parameters, assessing fit and reliability, and drawing conclusions, the new edition illus
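
    As a small illustration of the kind of model fitting the book treats, and not an example from the text itself, the sketch below fits a simple linear regression to synthetic data by least squares and reports the estimated coefficients and R^2; the data-generating values are assumed.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data: y = 2 + 3*x + noise (illustrative only)
n = 50
x = rng.uniform(0, 10, n)
y = 2.0 + 3.0 * x + rng.normal(0, 1.5, n)

# Design matrix with an intercept column
X = np.column_stack([np.ones(n), x])

# Least-squares fit (lstsq is numerically preferable to explicit inversion)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
residuals = y - X @ beta
r2 = 1.0 - residuals @ residuals / np.sum((y - y.mean()) ** 2)

print(f"intercept = {beta[0]:.3f}, slope = {beta[1]:.3f}, R^2 = {r2:.3f}")
```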

  14. Applied logistic regression

    CERN Document Server

    Hosmer, David W; Sturdivant, Rodney X

    2013-01-01

     A new edition of the definitive guide to logistic regression modeling for health science and other applications This thoroughly expanded Third Edition provides an easily accessible introduction to the logistic regression (LR) model and highlights the power of this model by examining the relationship between a dichotomous outcome and a set of covariables. Applied Logistic Regression, Third Edition emphasizes applications in the health sciences and handpicks topics that best suit the use of modern statistical software. The book provides readers with state-of-
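
    To illustrate the relationship the book studies between a dichotomous outcome and a set of covariables, the following sketch, not drawn from the book, fits a logistic regression to synthetic data by gradient ascent on the log-likelihood; the true coefficients, sample size and learning rate are assumed for demonstration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic dichotomous outcome driven by two covariables (illustrative only)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
true_beta = np.array([-0.5, 1.2, -0.8])
p_true = 1.0 / (1.0 + np.exp(-X @ true_beta))
y = rng.binomial(1, p_true)

# Fit by plain gradient ascent on the log-likelihood
beta = np.zeros(3)
lr = 0.1
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    gradient = X.T @ (y - p) / n
    beta += lr * gradient

print("estimated coefficients:", np.round(beta, 2))
print("true coefficients:     ", true_beta)
```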

  15. Metoder for Modellering, Simulering og Regulering af Større Termiske Processer anvendt i Sukkerproduktion. Methods for Modelling, Simulation and Control of Large Scale Thermal Systems Applied in Sugar Production

    DEFF Research Database (Denmark)

    Nielsen, Kirsten Mølgaard; Nielsen, Jens Frederik Dalsgaard

    The subject of this Ph.D. thesis is to investigate and develop methods for modelling, simulation and control applicable in large scale thermal industrial plants. An ambition has been to evaluate the results in a physical process. Sugar production is well suited for the purpose. In collaboration...... with The Danish Sugar Corporation two subsystems in the production have been chosen for application - the evaporation process and the crystallization process. In order to obtain information about the static and dynamic behaviour of the subsystems, field measurements have been performed. A realtime evaporator...... of a computer, a data terminal and an electric interface corresponding to the interface at the sugar plant. The simulator is operating in realtime and thus a realistic test of controllers is possible. The idiomatic control methodology has been investigated developing a control concept for the evaporation......

  16. Applied multidimensional systems theory

    CERN Document Server

    Bose, Nirmal K

    2017-01-01

    Revised and updated, this concise new edition of the pioneering book on multidimensional signal processing is ideal for a new generation of students. Multidimensional systems or m-D systems are the necessary mathematical background for modern digital image processing with applications in biomedicine, X-ray technology and satellite communications. Serving as a firm basis for graduate engineering students and researchers seeking applications in mathematical theories, this edition eschews detailed mathematical theory not useful to students. Presentation of the theory has been revised to make it more readable for students, and introduce some new topics that are emerging as multidimensional DSP topics in the interdisciplinary fields of image processing. New topics include Groebner bases, wavelets, and filter banks.

  17. Applied probability models with optimization applications

    CERN Document Server

    Ross, Sheldon M

    1992-01-01

    Concise advanced-level introduction to stochastic processes that frequently arise in applied probability. Largely self-contained text covers Poisson process, renewal theory, Markov chains, inventory theory, Brownian motion and continuous time optimization models, much more. Problems and references at chapter ends. "Excellent introduction." - Journal of the American Statistical Association. Bibliography. 1970 edition.
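
    As a quick illustration of the Poisson process mentioned above (an example of ours, not the book's), the sketch below simulates event counts on a horizon T using exponential inter-arrival times and checks the empirical mean and variance against the theoretical value lambda*T; the rate and horizon are assumed.

```python
import numpy as np

rng = np.random.default_rng(3)

lam, T = 2.0, 10.0        # rate (events per unit time) and horizon (assumed)
n_runs = 10000

counts = []
for _ in range(n_runs):
    t, n = 0.0, 0
    while True:
        t += rng.exponential(1.0 / lam)   # exponential inter-arrival times
        if t > T:
            break
        n += 1
    counts.append(n)

counts = np.array(counts)
print(f"empirical mean count = {counts.mean():.2f} (theory: lambda*T = {lam*T:.2f})")
print(f"empirical variance   = {counts.var():.2f} (theory: lambda*T = {lam*T:.2f})")
```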

  18. Determination of optimum parameters of the technological process for forming plates from V95 and V-1461 alloys in creep, applied in aircraft constructed by the “Sukhoi design bureau”

    Science.gov (United States)

    Raevskaya, G. A.; Zakharchenko, K.; Larichkin, A.

    2017-10-01

    The research is devoted to the scientific justification of metal forming by pressure through the forming of thick monolithic plates (thickness 40 mm) from the V95 (analog 7475) (Al-Zn-Mg-Cu) and V-1461 (analog 2099) (Al-Cu-Li-Zn) alloys under creep and near-superplastic conditions. Optimum parameters of the technological process of plate forming are described. The effect of temperature on the magnitude of mechanical stresses (relaxation) during pure-bending tests of the materials is determined experimentally. Forming of the thick plates (40 mm) on the UFP-1M unit, and inspection of the obtained surface against the given electronic model, made it possible to determine experimentally the time and number of forming stages. Mechanical properties of the material after the technological process and heat treatment are preliminarily evaluated. The efficiency of using the obtained parameters of the technological process, and of such pressure-forming methods in general, is shown.

  19. Optimizing the technological and informational relationship of the health care process and of the communication between physician and patient. The impact of Preventive Medicine and social marketing applied in Health Care on youth awareness - a preliminary study.

    Science.gov (United States)

    Petrescu, C M; Gheorghe, I R; Petrescu, G D

    2011-01-01

    In this case study we wanted to find out the impact of Preventive Medicine, and implicitly of social marketing, on young students with an average age of 19 belonging to the academic environment in Romania. The study lasted one month and consisted of a questionnaire that was designed and applied to 304 adolescents. The questionnaire contained demographic and personal information, such as age, origin, gender and marital status, and questions related to the respondents' attitude towards several issues within the preventive medicine discipline, such as the date of their last consultation, whether they were registered with a family physician, whether they suffered from chronic diseases, how often they did physical exercise, whether they ate salty and fatty meals, whether they were on a diet, and their rate of alcohol, caffeine and tobacco consumption. The panel was made up of more female than male respondents, with an average age of 19, who had had a medical consultation in the last 3 months, were registered with a family physician, had no chronic diseases, usually exercised moderately, were not on a diet and had 3 meals per day. The meals were moderately salty and rarely rich in fats. They drank 2 cups of tea per day and were non-drinkers and non-smokers. After applying several statistical tests to find a correlation between our variables, we reached the conclusion that, even if the results are encouraging, there is no correlation between the impact of Preventive Medicine and the respondents' health behavior.

  20. NASA Applied Sciences Program

    Science.gov (United States)

    Estes, Sue M.; Haynes, J. A.

    2009-01-01

    NASA's strategic goals: a) Develop a balanced overall program of science, exploration, and aeronautics consistent with the redirection of the human spaceflight program to focus on exploration. b) Study Earth from space to advance scientific understanding and meet societal needs. NASA's partnership efforts in global modeling and data assimilation over the next decade will shorten the distance from observations to answers for important, leading-edge science questions. NASA's Applied Sciences program will continue the Agency's efforts in benchmarking the assimilation of NASA research results into policy and management decision-support tools that are vital for the Nation's environment, economy, safety, and security. NASA also is working with NOAA and inter-agency forums to transition mature research capabilities to operational systems, primarily the polar and geostationary operational environmental satellites, and to utilize fully those assets for research purposes.

  1. Applied number theory

    CERN Document Server

    Niederreiter, Harald

    2015-01-01

    This textbook effectively builds a bridge from basic number theory to recent advances in applied number theory. It presents the first unified account of the four major areas of application where number theory plays a fundamental role, namely cryptography, coding theory, quasi-Monte Carlo methods, and pseudorandom number generation, allowing the authors to delineate the manifold links and interrelations between these areas.  Number theory, which Carl-Friedrich Gauss famously dubbed the queen of mathematics, has always been considered a very beautiful field of mathematics, producing lovely results and elegant proofs. While only very few real-life applications were known in the past, today number theory can be found in everyday life: in supermarket bar code scanners, in our cars’ GPS systems, in online banking, etc.  Starting with a brief introductory course on number theory in Chapter 1, which makes the book more accessible for undergraduates, the authors describe the four main application areas in Chapters...
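
    As a toy illustration of the cryptographic applications mentioned above (not an example from the book), the sketch below implements square-and-multiply modular exponentiation and uses it for a Diffie-Hellman style key exchange over a deliberately small prime; the prime, generator and secrets are assumptions and far too small for real use.

```python
def mod_pow(base, exp, mod):
    """Square-and-multiply modular exponentiation (same result as pow(base, exp, mod))."""
    result = 1
    base %= mod
    while exp > 0:
        if exp & 1:
            result = (result * base) % mod
        base = (base * base) % mod
        exp >>= 1
    return result

# Toy Diffie-Hellman key exchange over a small prime (never use such sizes in practice)
p, g = 2087, 5                   # small prime modulus and generator (illustrative only)
a_secret, b_secret = 123, 456    # hypothetical private values
A = mod_pow(g, a_secret, p)      # Alice's public value
B = mod_pow(g, b_secret, p)      # Bob's public value
shared_a = mod_pow(B, a_secret, p)
shared_b = mod_pow(A, b_secret, p)
assert shared_a == shared_b
print("shared secret:", shared_a)
```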

  2. Applied plasma physics

    International Nuclear Information System (INIS)

    Anon.

    1979-01-01

    Applied Plasma Physics is a major sub-organizational unit of the Magnetic Fusion Energy (MFE) Program. It includes Fusion Plasma Theory and Experimental Plasma Research. The Fusion Plasma Theory group has the responsibility for developing theoretical-computational models in the general areas of plasma properties, equilibrium, stability, transport, and atomic physics. This group has responsibility for giving guidance to the mirror experimental program. There is a formal division of the group into theory and computational; however, in this report the efforts of the two areas are not separated since many projects have contributions from members of both. Under the Experimental Plasma Research Program we are developing a neutral-beam source, the intense, pulsed ion-neutral source (IPINS), for the generation of a reversed-field configuration on 2XIIB. We are also studying the feasibility of using certain neutron-detection techniques as plasma diagnostics in the next generation of thermonuclear experiments

  3. Applied partial differential equations

    CERN Document Server

    Logan, J David

    2015-01-01

    This text presents the standard material usually covered in a one-semester, undergraduate course on boundary value problems and PDEs.  Emphasis is placed on motivation, concepts, methods, and interpretation, rather than on formal theory. The concise treatment of the subject is maintained in this third edition covering all the major ideas: the wave equation, the diffusion equation, the Laplace equation, and the advection equation on bounded and unbounded domains. Methods include eigenfunction expansions, integral transforms, and characteristics. In this third edition, text remains intimately tied to applications in heat transfer, wave motion, biological systems, and a variety other topics in pure and applied science. The text offers flexibility to instructors who, for example, may wish to insert topics from biology or numerical methods at any time in the course. The exposition is presented in a friendly, easy-to-read, style, with mathematical ideas motivated from physical problems. Many exercises and worked e...
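
    For a concrete taste of the diffusion equation and the numerical side of such a course, the following sketch, an illustration of ours rather than an excerpt from the text, solves u_t = alpha*u_xx on [0, 1] with homogeneous Dirichlet boundaries by an explicit finite-difference scheme; the grid sizes and initial profile are assumed, and the time step is chosen to respect the usual stability condition dt <= dx^2/(2*alpha).

```python
import numpy as np

# Explicit finite-difference solution of u_t = alpha * u_xx on [0, 1]
# with u(0,t) = u(1,t) = 0 and an initial "hat" profile (illustrative setup).
alpha = 1.0
nx, L = 51, 1.0
dx = L / (nx - 1)
dt = 0.4 * dx**2 / alpha          # satisfies dt <= dx^2 / (2*alpha)
nt = 500

x = np.linspace(0.0, L, nx)
u = np.where(x < 0.5, 2 * x, 2 * (1 - x))   # initial condition

for _ in range(nt):
    u[1:-1] = u[1:-1] + alpha * dt / dx**2 * (u[2:] - 2 * u[1:-1] + u[:-2])
    u[0] = u[-1] = 0.0                       # Dirichlet boundaries

print(f"time simulated: {nt*dt:.4f}; peak value decayed to {u.max():.4f}")
```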

  4. Applied mechanics of solids

    CERN Document Server

    Bower, Allan F

    2009-01-01

    Modern computer simulations make stress analysis easy. As they continue to replace classical mathematical methods of analysis, these software programs require users to have a solid understanding of the fundamental principles on which they are based. Develop Intuitive Ability to Identify and Avoid Physically Meaningless Predictions Applied Mechanics of Solids is a powerful tool for understanding how to take advantage of these revolutionary computer advances in the field of solid mechanics. Beginning with a description of the physical and mathematical laws that govern deformation in solids, the text presents modern constitutive equations, as well as analytical and computational methods of stress analysis and fracture mechanics. It also addresses the nonlinear theory of deformable rods, membranes, plates, and shells, and solutions to important boundary and initial value problems in solid mechanics. The author uses the step-by-step manner of a blackboard lecture to explain problem solving methods, often providing...

  5. Applied statistical thermodynamics

    CERN Document Server

    Lucas, Klaus

    1991-01-01

    The book guides the reader from the foundations of statistical thermodynamics, including the theory of intermolecular forces, to modern computer-aided applications in chemical engineering and physical chemistry. The approach is new. The foundations of quantum and statistical mechanics are presented in a simple way and their applications to the prediction of fluid phase behavior of real systems are demonstrated. A particular effort is made to introduce the reader to explicit formulations of intermolecular interaction models and to show how these models influence the properties of fluid systems. The established methods of statistical mechanics - computer simulation, perturbation theory, and numerical integration - are discussed in a style appropriate for newcomers and are extensively applied. Numerous worked examples illustrate how practical calculations should be carried out.
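
    As a small illustration of the explicit intermolecular interaction models the book emphasizes, the sketch below evaluates a Lennard-Jones pair potential; the parameter values are roughly those often quoted for argon and are assumptions for demonstration, not taken from the book.

```python
import numpy as np

def lennard_jones(r, epsilon=0.997, sigma=3.4):
    """Lennard-Jones pair potential in kJ/mol with r and sigma in angstrom.
    Parameter values are roughly those often quoted for argon (illustrative)."""
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6**2 - sr6)

r = np.linspace(3.0, 8.0, 11)
for ri, ui in zip(r, lennard_jones(r)):
    print(f"r = {ri:4.1f} A   u(r) = {ui:7.3f} kJ/mol")

# The minimum of the potential sits at r_min = 2**(1/6) * sigma with depth -epsilon
r_min = 2 ** (1 / 6) * 3.4
print(f"r_min = {r_min:.2f} A, well depth = {-0.997:.3f} kJ/mol")
```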

  6. Applied multivariate statistical analysis

    CERN Document Server

    Härdle, Wolfgang Karl

    2015-01-01

    Focusing on high-dimensional applications, this 4th edition presents the tools and concepts used in multivariate data analysis in a style that is also accessible for non-mathematicians and practitioners.  It surveys the basic principles and emphasizes both exploratory and inferential statistics; a new chapter on Variable Selection (Lasso, SCAD and Elastic Net) has also been added.  All chapters include practical exercises that highlight applications in different multivariate data analysis fields: in quantitative financial studies, where the joint dynamics of assets are observed; in medicine, where recorded observations of subjects in different locations form the basis for reliable diagnoses and medication; and in quantitative marketing, where consumers’ preferences are collected in order to construct models of consumer behavior.  All of these examples involve high to ultra-high dimensions and represent a number of major fields in big data analysis. The fourth edition of this book on Applied Multivariate ...

  7. Applied computational physics

    CERN Document Server

    Boudreau, Joseph F; Bianchi, Riccardo Maria

    2018-01-01

    Applied Computational Physics is a graduate-level text stressing three essential elements: advanced programming techniques, numerical analysis, and physics. The goal of the text is to provide students with essential computational skills that they will need in their careers, and to increase the confidence with which they write computer programs designed for their problem domain. The physics problems give them an opportunity to reinforce their programming skills, while the acquired programming skills augment their ability to solve physics problems. The C++ language is used throughout the text. Physics problems include Hamiltonian systems, chaotic systems, percolation, critical phenomena, few-body and multi-body quantum systems, quantum field theory, simulation of radiation transport, and data modeling. The book, the fruit of a collaboration between a theoretical physicist and an experimental physicist, covers a broad range of topics from both viewpoints. Examples, program libraries, and additional documentatio...

  8. Remote sensing and geoprocessing techniques applied to decision making for the siting of wind farms; Tecnicas de sensoriamento remoto e geoprocessamento aplicadas no auxilio a tomada de decisao na implantacao de parques eolicos

    Energy Technology Data Exchange (ETDEWEB)

    Lahm, Regis Alexandre [Pontificia Univ. Catolica do Rio Grande do Sul (PUCRS), Porto Alegre, RS (Brazil). Lab. de Tratamento de Imagens e Geoprocessamento (LTIG); Ale, Jorge Antonio Villar [Pontificia Univ. Catolica do Rio Grande do Sul (PUCRS), Porto Alegre, RS (Brazil). Nucleo Tecnologico de Energia e Meio Ambiente (NUTEMA); Bottezini, Dilane [Pontificia Univ. Catolica do Rio Grande do Sul (PUCRS), Porto Alegre, RS (Brazil)

    2004-07-01

    The paper presents a methodology to support decision making for the positioning of wind turbines on previously selected sites with potential for wind farm installation. Topographic charts at 1:50,000 scale produced by the DSG and an orbital image from Landsat 7 ETM were used. Based on these materials, and using remote sensing and geoprocessing techniques, a digital relief model and surface roughness charts were produced.

  9. Applied predictive control

    CERN Document Server

    Sunan, Huang; Heng, Lee Tong

    2002-01-01

    The presence of considerable time delays in the dynamics of many industrial processes, leading to difficult problems in the associated closed-loop control systems, is a well-recognized phenomenon. The performance achievable in conventional feedback control systems can be significantly degraded if an industrial process has a relatively large time delay compared with the dominant time constant. Under these circumstances, advanced predictive control is necessary to improve the performance of the control system significantly. The book is a focused treatment of the subject matter, including the fundamentals and some state-of-the-art developments in the field of predictive control. Three main schemes for advanced predictive control are addressed in this book: • Smith Predictive Control; • Generalised Predictive Control; • a form of predictive control based on Finite Spectrum Assignment. A substantial part of the book addresses application issues in predictive control, providing several interesting case studie...
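
    To make the Smith predictor idea concrete, the following sketch (an illustration of ours, not code from the book) simulates a discrete first-order plant with a pure input delay and wraps a PI controller around a Smith predictor built from a perfect internal model; all plant and controller parameters are assumed.

```python
import numpy as np

# Discrete first-order plant with an input delay of d samples:
#   y[k+1] = a*y[k] + b*u[k-d]
# A PI controller is wrapped around a Smith predictor built from a perfect
# internal model (all values are illustrative).
a, b, d = 0.9, 0.1, 5
Kp, Ki = 2.0, 0.2
N, r = 200, 1.0

y = np.zeros(N + 1)    # measured plant output
yp = np.zeros(N + 1)   # delay-free model output
ym = np.zeros(N + 1)   # delayed model output
u = np.zeros(N)        # control inputs
integ = 0.0

for k in range(N):
    # Smith predictor feedback: measurement corrected by the model mismatch,
    # plus the delay-free prediction, so the PI controller "sees" an undelayed plant.
    feedback = y[k] - ym[k] + yp[k]
    e = r - feedback
    integ += e
    u[k] = Kp * e + Ki * integ

    u_delayed = u[k - d] if k >= d else 0.0
    y[k + 1] = a * y[k] + b * u_delayed        # true plant
    yp[k + 1] = a * yp[k] + b * u[k]           # model without delay
    ym[k + 1] = a * ym[k] + b * u_delayed      # model with delay

print(f"output after {N} steps: {y[-1]:.3f} (setpoint {r})")
```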

  10. Applied heterogeneous catalysis

    International Nuclear Information System (INIS)

    Le Page, A.J.F.

    1988-01-01

    This reference book explains the scientific principles of heterogeneous catalysis while also providing details on the methods used to develop commercially viable catalyst products. A section of the book presents reactor design engineering theory and practices for the profitable application of these catalysts in large-scale industrial processes. A description of the mechanisms and commercial applications of catalysis is followed by a review of catalytic reaction kinetics. There are five chapters on selecting catalyst agents, developing and preparing industrial catalysts, measuring catalyst properties, and analyzing the physico-chemical characteristics of solid catalyst particles. The final chapter reviews the elements of catalytic reactor design, with emphasis on flow regimes vs. reactor types, heat and mass transfer in reactor beds, single- and multi-phase flows, and the effects of thermodynamics and other catalyst properties on the process flow scheme

  11. Improvisation and Learning Processes in Organizations: a metaphor applying the Brazilian rhythm choro [Improvisação e Processos de Aprendizagem nas Organizações: uma metáfora a partir do ritmo Brasileiro Choro]

    Directory of Open Access Journals (Sweden)

    Leonardo Flach

    2011-12-01

    Whereas improvisation has been discussed in the international literature mainly through the metaphors of jazz and theater, this essay discusses how the phenomenon of improvisation can contribute to new interpretations of Organizational Learning. We use the metaphor of improvisation in the Brazilian rhythm 'Choro' in order to understand the process of improvisation in organizations. Thus, the main objective of the study is to discuss and analyze the role of improvisation in the Organizational Learning process. In the final considerations, we conclude that improvisation plays a significant role in the processes of Organizational Learning. We argue that the socio-cultural approach to Organizational Learning can help in understanding the process of improvisation, given the role of communities of practice, culture, social practices and sensemaking in this phenomenon.

  12. Applied hydraulic transients

    CERN Document Server

    Chaudhry, M Hanif

    2014-01-01

    This book covers hydraulic transients in a comprehensive and systematic manner from introduction to advanced level and presents various methods of analysis for computer solution. The field of application of the book is very broad and diverse and covers areas such as hydroelectric projects, pumped storage schemes, water-supply systems, cooling-water systems, oil pipelines and industrial piping systems. Strong emphasis is given to practical applications, including several case studies, problems of applied nature, and design criteria. This will help design engineers and introduce students to real-life projects. This book also: presents modern methods of analysis suitable for computer solution, such as the method of characteristics, explicit and implicit finite-difference methods and matrix methods; includes case studies of actual projects; provides extensive and complete treatment of governed hydraulic turbines; and presents design charts, desi...

  13. Academic training: Applied superconductivity

    CERN Multimedia

    2007-01-01

    LECTURE SERIES 17, 18, 19 January from 11.00 to 12.00 hrs Council Room, Bldg 503 Applied Superconductivity: Theory, Superconducting Materials and Applications E. PALMIERI/INFN, Padova, Italy When hearing about persistent currents recirculating for several years in a superconducting loop without any appreciable decay, one realizes that we are dealing with a phenomenon which in nature is the closest known to perpetual motion. Zero resistivity and perfect diamagnetism in mercury at 4.2 K, the discovery over 75 years of several hundred superconducting materials, the revolution of "liquid nitrogen superconductivity", the discovery that even a simple binary compound becomes superconducting at 40 K, and the subsequent re-exploration of the already known superconducting materials: Nature discloses its intimate secrets drop by drop, and nobody can exclude that the final surprise is still to come. After an overview of the phenomenology and basic theory of superconductivity, the lectures for this a...

  14. Applying Evolutionary Anthropology

    Science.gov (United States)

    Gibson, Mhairi A; Lawson, David W

    2015-01-01

    Evolutionary anthropology provides a powerful theoretical framework for understanding how both current environments and legacies of past selection shape human behavioral diversity. This integrative and pluralistic field, combining ethnographic, demographic, and sociological methods, has provided new insights into the ultimate forces and proximate pathways that guide human adaptation and variation. Here, we present the argument that evolutionary anthropological studies of human behavior also hold great, largely untapped, potential to guide the design, implementation, and evaluation of social and public health policy. Focusing on the key anthropological themes of reproduction, production, and distribution we highlight classic and recent research demonstrating the value of an evolutionary perspective to improving human well-being. The challenge now comes in transforming relevance into action and, for that, evolutionary behavioral anthropologists will need to forge deeper connections with other applied social scientists and policy-makers. We are hopeful that these developments are underway and that, with the current tide of enthusiasm for evidence-based approaches to policy, evolutionary anthropology is well positioned to make a strong contribution. PMID:25684561

  15. Informatics applied to cytology

    Directory of Open Access Journals (Sweden)

    Pantanowitz Liron

    2008-01-01

    Automation and emerging information technologies are being adopted by cytology laboratories to augment Pap test screening and improve diagnostic accuracy. As a result, informatics, the application of computers and information systems to information management, has become essential for the successful operation of the cytopathology laboratory. This review describes how laboratory information management systems can be used to achieve an automated and seamless workflow process. The use of software, electronic databases and spreadsheets to perform necessary quality control measures is discussed, as well as Lean production and Six Sigma approaches to reducing errors in the cytopathology laboratory.

  16. Circumferential welding of UNS S32750 super duplex stainless steel by the MIG process with CMT® control; Soldagem circunferencial do aço inoxidável super duplex UNS S32750 pelo processo MIG com controle CMT®

    Energy Technology Data Exchange (ETDEWEB)

    Invernizzi, Bruno Pizol

    2017-07-01

    This study carried out circumferential welding experiments on UNS S32750 super duplex stainless steel tubes with diameters of 19.05 mm and 48.20 mm. Welds were performed using various welding parameters on a MIG machine with Cold Metal Transfer® (CMT) control. The weld joints were evaluated by visual and dimensional inspection, Vickers microhardness and tensile tests, and microstructural analysis together with phase precipitation analysis performed according to practice A of ASTM A923, as well as a corrosion test in accordance with practice A of ASTM G48 in conjunction with ASTM A923. The results indicated that welds performed in pipes with a diameter of 19.05 mm produced a weld joint with unacceptable dimensions according to the standard, a condition attributed to the use of a wire diameter too large for the welding conditions employed. Welding of pipes with a diameter of 48.20 mm showed a lack of penetration under the conditions employed when welded by the conventional CMT® process. When CMT® was combined with pulsed arc, under conditions that generated greater heat input during welding, the result was full penetration of the joint and an adequate surface finish. The results indicated that welding using the CMT® process combined with pulsed arc, under the conditions (parameters) employed, produced a good surface finish and mechanical properties meeting the standards' requirements, as well as a balanced microstructure and high resistance to corrosion. (author)

  17. Vygotsky in applied neuropsychology

    Directory of Open Access Journals (Sweden)

    Glozman J. M.

    2016-12-01

    The aims of this paper are: (1) to show the role of clinical experience in the theoretical contributions of L.S. Vygotsky, and (2) to analyze the development of these theories in contemporary applied neuropsychology. An analysis of disturbances of mental functioning is impossible without a systemic approach to the evidence observed; therefore, medical psychology is fundamental for forming a systemic approach to psychology. The assessment of neurological patients at the neurological hospital of Moscow University permitted L.S. Vygotsky to create, in collaboration with A.R. Luria, the theory of systemic dynamic localization of higher mental functions and their relationship to cultural conditions. In his studies of patients with Parkinson's disease, Vygotsky also set out three steps of systemic development: interpsychological, then extrapsychological, then intrapsychological. In the late 1920s, L.S. Vygotsky and A.R. Luria created a program to compensate for the motor subcortical disturbances in Parkinson's disease (PD) through a cortical (visual) mediation of movements. We propose to distinguish objective mediating factors, such as teaching techniques and modalities, from subjective mediating factors, such as the individual's internal representation of his or her own disease. The cultural-historical approach in contemporary neuropsychology requires neuropsychologists to re-analyze and re-interpret the classic neuropsychological syndromes; to develop new assessment procedures more in accordance with the patient's conditions of life; and to reconsider the concept of the social brain as a social and cultural determinant and regulator of brain functioning. L.S. Vygotsky and A.R. Luria proved that a defect interferes with a child's appropriation of his or her culture, but cultural means can help the child overcome the defect. In this way, the cultural-historical approach became, and still is, a methodological basis for remedial education.

  18. Applied Historical Astronomy

    Science.gov (United States)

    Stephenson, F. Richard

    2014-01-01

    F. Richard Stephenson has spent most of his research career -- spanning more than 45 years -- studying various aspects of Applied Historical Astronomy. The aim of this interdisciplinary subject is the application of historical astronomical records to the investigation of problems in modern astronomy and geophysics. Stephenson has almost exclusively concentrated on pre-telescopic records, especially those preserved from ancient and medieval times -- the earliest reliable observations dating from around 700 BC. The records which have mainly interested him are of eclipses (both solar and lunar), supernovae, sunspots and aurorae, and Halley's Comet. The main sources of early astronomical data are fourfold: records from ancient and medieval East Asia (China, together with Korea and Japan); ancient Babylon; ancient and medieval Europe; and the medieval Arab world. A feature of Stephenson's research is the direct consultation of early astronomical texts in their original language -- either working unaided or with the help of colleagues. He has also developed a variety of techniques to help interpret the various observations. Most pre-telescopic observations are very crude by present-day standards. In addition, early motives for skywatching were more often astrological rather than scientific. Despite these drawbacks, ancient and medieval astronomical records have two remarkable advantages over modern data. Firstly, they can enable the investigation of long-term trends (e.g. in the terrestrial rate of rotation), which in the relatively short period covered by telescopic observations are obscured by short-term fluctuations. Secondly, over the lengthy time-scale which they cover, significant numbers of very rare events (such as Galactic supernovae) were reported, which have few -- if any-- counterparts in the telescopic record. In his various researches, Stephenson has mainly focused his attention on two specific topics. These are: (i) long-term changes in the Earth's rate of

  19. An eco design strategy for high pressure die casting components: microstructural analysis applied to mass reducing processes; Una estrategia de ecodiseno de piezas obtenidas mediante moldeo a presion: analisis microestructrual aplicado a la desmaterializacion

    Energy Technology Data Exchange (ETDEWEB)

    Suarez-Pena, B.; Asensio-Lozano, J.

    2009-07-01

    In this work the study focused on the possible use of new aluminium alloys with optimized microstructures that ensure the mechanical properties required for components made by high pressure die casting. The objective was to check the feasibility of manufacturing structurally sound eco-steps for escalators with a reduced amount of material. This reduction arises as a result of a redesign of the traditional steps aiming at a significant weight reduction. The experimental results show that it is feasible to cut the use of materials during processing and therefore to reduce the impact of the components during their lifetime, while performance and safety standards are kept identical or even improved. (Author) 17 refs.

  20. Applying the 5 Why method to verification of non-compliance causes established after application of the Ishikawa diagram in the process of improving the production of drive half-shafts

    Directory of Open Access Journals (Sweden)

    Szymon T. Dziuba

    2014-06-01

    The automotive industry is one of the most important branches of global industry. Its products are made by an extensive network of suppliers, who supply components directly to the Original Equipment Manufacturer (OEM) or to the secondary parts market. In recent years the economic situation of these companies has deteriorated dramatically, which is associated with a decrease in new car sales. These companies are constantly making decisions to improve their production processes using various quality tools; such actions may help to increase their competitive advantage and thereby improve their financial situation. The surveyed company produces drive half-shafts for different types of cars. The studies were carried out to analyse the causes of a non-compliance: the band falling off the housing on the drive shaft. In order to identify and verify the causes of the non-compliance and undertake corrective action, the following methods were used: the Ishikawa diagram, 5WHY and brainstorming.
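
    As a minimal illustration of how a 5 Why chain can be recorded and traced to a root cause before corrective action, the sketch below stores a hypothetical question-answer chain for a band non-compliance; the content of the chain is invented for demonstration and is not the analysis from the article.

```python
# A minimal way to record a 5 Why chain for a non-compliance, so that each
# answer can be verified against evidence before corrective action is taken.
# The chain below is hypothetical and only illustrates the structure.
five_why = [
    ("Why did the band fall off the housing?", "Clamping force was below specification."),
    ("Why was the clamping force below specification?", "The crimping tool was worn."),
    ("Why was the tool worn?", "Preventive maintenance was overdue."),
    ("Why was maintenance overdue?", "The maintenance plan did not cover this tool."),
    ("Why was the tool missing from the plan?", "No process exists for adding new tooling to the plan."),
]

for level, (question, answer) in enumerate(five_why, start=1):
    print(f"Why {level}: {question}\n  -> {answer}")

print("\nRoot cause to address:", five_why[-1][1])
```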

  1. Lab work goes social, and vice versa: strategising public engagement processes: commentary on: "What happens in the lab does not stay in the lab: applying midstream modulation to enhance critical reflection in the laboratory".

    Science.gov (United States)

    Wynne, Brian

    2011-12-01

    Midstream modulation is a form of public engagement with science which benefits from strategic application of science and technology studies (STS) insights accumulated over nearly 20 years. These have been developed from STS researchers' involvement in practical engagement processes and research with scientists, science funders, policy and other public stakeholders. The strategic aim of this specific method, to develop what is termed second-order reflexivity amongst scientist-technologists, builds upon and advances earlier, more general STS work. However, this method is focused and structured so as to help generate such reflexivity amongst practising scientists-technologists in their specialist contexts (public or private, in principle) over the 'upstream' questions which have been identified in other STS research as important public issues for scientific research, development and innovation. This is a different focus from virtually all such previous work, and it offers novel opportunities for those key broader issues to be opened up. The further development of these promising results depends on some important conditions, such as identifying and engaging research funders and other stakeholders, like affected publics, in similar exercises. Implementing these conditions could connect the productive impacts of midstream modulation with wider public engagement work, including 'uninvited' public engagement with science. It would also generate broader institutional and political changes in the larger networks of institutional actors which constitute contemporary technoscientific innovation and governance processes. All of these various broader dimensions, far beyond the laboratory alone, need to be appropriately open, committed to democratic needs, and reflexive, for the aims of midstream modulation to be achieved, whilst allowing specialists to work as specialists.

  2. Applied electromagnetic scattering theory

    CERN Document Server

    Osipov, Andrey A

    2017-01-01

    Besides classical applications (radar and stealth, antennas, microwave engineering), scattering and diffraction are enabling phenomena for some emerging research fields (artificial electromagnetic materials or metamaterials, terahertz technologies, electromagnetic aspects of nano-science). This book is a tutorial for advanced students who need to study diffraction theory. The textbook gives fundamental knowledge about scattering and diffraction of electromagnetic waves and provides some working examples of solutions for practical high-frequency scattering and diffraction problems. The book focuses on the most important diffraction effects and mechanisms influencing the scattering process and describes efficient and physically justified simulation methods - physical optics (PO) and the physical theory of diffraction (PTD) - applicable in typical remote sensing scenarios. The material is presented in a comprehensible and logical form, which relates the presented results to the basic principles of electromag...

  3. Applying technical versatility

    Energy Technology Data Exchange (ETDEWEB)

    None

    1970-01-01

    The breadth and depth of the diversified technical programs at Mound Laboratory since its inception are characterized by a record of competence and versatility during the past generation. A spectrum of mission assignments has been completed successfully in such diverse technical areas as process development and manufacturing of explosive components, research on fuels for the Civilian Power Reactor Program, separation of radioactive materials, fabrication of radioisotopic heat sources, stable gaseous isotope separation and purification, and many other areas. Mound Laboratory is one of the key U.S. Atomic Energy Commission sites that has demonstrated its technical competence in both weapons and non-weapons activities. This report has been prepared to complement the AEC’s vigorous program of scientific information dissemination. Three broad areas of technical competence are highlighted here: explosive technology, radionuclide technology, and stable gaseous isotope separation, which encompass a broad variety of techniques and supporting disciplines.

  4. Applied partial differential equations

    CERN Document Server

    Logan, J David

    2004-01-01

    This primer on elementary partial differential equations presents the standard material usually covered in a one-semester, undergraduate course on boundary value problems and PDEs. What makes this book unique is that it is a brief treatment, yet it covers all the major ideas: the wave equation, the diffusion equation, the Laplace equation, and the advection equation on bounded and unbounded domains. Methods include eigenfunction expansions, integral transforms, and characteristics. Mathematical ideas are motivated from physical problems, and the exposition is presented in a concise style accessible to science and engineering students; emphasis is on motivation, concepts, methods, and interpretation, rather than formal theory. This second edition contains new and additional exercises, and it includes a new chapter on the applications of PDEs to biology: age structured models, pattern formation; epidemic wave fronts, and advection-diffusion processes. The student who reads through this book and solves many of t...

  5. Essays in Applied Microeconomics

    Science.gov (United States)

    Ge, Qi

    This dissertation consists of three self-contained applied microeconomics essays on topics related to behavioral economics and industrial organization. Chapter 1 studies how sentiment as a result of sports event outcomes affects consumers' tipping behavior in the presence of social norms. I formulate a model of tipping behavior that captures consumer sentiment following a reference-dependent preference framework and empirically test its relevance using the game outcomes of the NBA and the trip and tipping data on New York City taxicabs. While I find that consumers' tipping behavior responds to unexpected wins and losses of their home team, particularly in close game outcomes, I do not find evidence for loss aversion. Coupled with the findings on default tipping, my empirical results on the asymmetric tipping responses suggest that while social norms may dominate loss aversion, affect and surprises can result in freedom on the upside of tipping. Chapter 2 utilizes a novel data source of airline entry and exit announcements and examines how the incumbent airlines adjust quality provisions as a response to their competitors' announcements and the role of timing in such responses. I find no evidence that the incumbents engage in preemptive actions when facing probable entry and exit threats as signaled by the competitors' announcements in either the short term or the long term. There is, however, evidence supporting their responses to the competitors' realized entry or exit. My empirical findings underscore the role of timing in determining preemptive actions and suggest that previous studies may have overestimated how the incumbent airlines respond to entry threats. Chapter 3, co-authored with Benjamin Ho, investigates the habit formation of consumers' thermostat setting behavior, an often implicitly made decision and yet a key determinant of home energy consumption and expenditures. We utilize a high frequency dataset on household thermostat usage and find that

  6. Essays in applied economics

    Science.gov (United States)

    Arano, Kathleen

    Three independent studies in applied economics are presented. The first essay looks at the US natural gas industrial sector and estimates welfare effects associated with the changes in natural gas regulatory policy over the past three decades. Using a disequilibrium model suited to the natural gas industry, welfare transfers and deadweight losses are calculated. Results indicate that deregulation policies, beginning with the NGPA of 1978, have caused the industry to become more responsive to market conditions. Over time, regulated prices converge toward the estimated equilibrium prices. As a result of this convergence, deadweight losses associated with regulation are also diminished. The second essay examines the discounted utility model (DU), the standard model used for intertemporal decision-making. Prior empirical studies challenge the descriptive validity of the model. This essay addresses the four main inconsistencies that have been raised: domain dependence, magnitude effects, time effects, and gain/loss asymmetries. These inconsistencies, however, may be the result of the implicit assumption of linear utility and not a failure of the DU model itself. In order to test this hypothesis, data was collected from in-class surveys of economics classes at Mississippi State University. A random effects model for panel data estimation which accounts for individual specific effects was then used to impute discount rates measured in terms of dollars and utility. All four inconsistencies were found to be present when the dollar measures were used. Using utility measures of the discount rate resolved the inconsistencies in some cases. The third essay brings together two perspectives in the study of religion and economics: modeling religious behavior using economic tools and variables, and modeling economic behavior using religious variables. A system of ordered probit equations is developed to simultaneously model religious activities and economic outcomes. Using data

  7. Validação de limpeza de zidovudina: estratégia aplicada ao processo de fabricação de medicamentos anti-retrovirais; Cleaning validation of zidovudine: strategy applied to the manufacturing process of antiretroviral medicines

    Directory of Open Access Journals (Sweden)

    João Rui Barbosa de Alencar

    2004-03-01

    Cleaning validation is an integral part of the good manufacturing practices for medicines. It is a systematic approach used to ensure that equipment cleaning procedures effectively remove residues down to a predetermined acceptance level. Few studies addressing cleaning validation are available in the literature. This work presents a strategy for validating the cleaning process used in the manufacture of the medicine zidovudine, produced by LAFEPE® (Recife - PE, Brazil) and widely prescribed in the treatment of AIDS. A spectrophotometric analytical method and surface sampling by swab were used. The cleaning acceptance criterion was 10 ppm of zidovudine in the subsequent product (stavudine). The zidovudine residues found on the equipment after cleaning were below the acceptance criterion, as well as below the lowest concentration capable of producing a pharmacological effect.
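
    To show how a 10 ppm criterion of this kind can be turned into a per-swab surface limit, the sketch below performs a simple maximum-allowable-carryover style check; every number in it (batch size, shared surface area, swab area, recovery factor, measured result) is hypothetical and not taken from the study.

```python
# Illustrative check of a 10 ppm cleaning acceptance criterion (all numbers
# below are hypothetical and not taken from the cited study).
limit_ppm = 10.0            # mg of zidovudine allowed per kg of the next product
batch_next_kg = 100.0       # assumed batch size of the subsequent product (stavudine)
shared_area_cm2 = 50_000.0  # assumed product-contact surface of shared equipment
swab_area_cm2 = 25.0        # assumed area wiped per swab
recovery = 0.80             # assumed swab recovery factor from validation

# Maximum allowable carryover into the whole batch, then per swabbed area
maco_mg = limit_ppm * batch_next_kg
limit_per_swab_mg = maco_mg / shared_area_cm2 * swab_area_cm2

measured_in_swab_mg = 0.05  # hypothetical analytical result, corrected for recovery below
corrected_mg = measured_in_swab_mg / recovery

print(f"allowed per swab: {limit_per_swab_mg:.3f} mg, found: {corrected_mg:.3f} mg")
print("PASS" if corrected_mg <= limit_per_swab_mg else "FAIL")
```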

  8. Applying lean thinking in construction

    Directory of Open Access Journals (Sweden)

    Remon Fayek Aziz

    2013-12-01

    The productivity of the construction industry worldwide has been declining over the past 40 years. One approach for improving the situation is lean construction. Lean construction results from the application of a new form of production management to construction. Essential features of lean construction include a clear set of objectives for the delivery process, aimed at maximizing performance for the customer at the project level; concurrent design and construction; and the application of project control throughout the life cycle of the project, from design to delivery. An increasing number of construction academics and professionals have been storming the ramparts of conventional construction management in an effort to deliver better value to owners while making real profits. As a result, lean-based tools have emerged and have been successfully applied to simple and complex construction projects. In general, lean construction projects are easier to manage, safer, completed sooner, cost less and are of better quality. Significant research remains to complete the translation of lean thinking to construction in Egypt. This research discusses the principles, methods, and implementation phases of lean construction, showing the waste in construction and how it can be minimized. The Last Planner System technique, which is an important and widely used application of lean construction concepts and methodologies, has proved that it can enhance construction management practices in various aspects. The work also aims to develop a methodology for process evaluation and to define areas for improvement based on lean approach principles.

  9. The university extension as a process of digital and social inclusion: Literacy Project in Informatics - PAI applied to the students of the Esperidião Marques State School, Cáceres, State of Mato Grosso, Brazil

    Directory of Open Access Journals (Sweden)

    Tiago Luís de Andrade

    2017-06-01

    This article presents the results of a digital inclusion extension project using free tools, carried out in the city of Cáceres, State of Mato Grosso, in 2016. The project was implemented through practical training courses in basic computer science, the internet and social networks, aiming to encourage the use of computers and free software in the teaching-learning process. The target audience was elementary school students, grades 6 to 9, and 72 students from the school's four classes participated. The results show the importance of the project in introducing the students to the use of the computer as a technological resource in the day-to-day life of the people involved, and in encouraging them to use this instrument for the construction and transmission of knowledge. Such a resource favored digital and social inclusion and the pluralization of knowledge, as well as generating a positive view of the institutions involved among the local community.

  10. [Echinocandins: Applied pharmacology].

    Science.gov (United States)

    Azanza Perea, José Ramón

    2016-01-01

    The echinocandins share pharmacodynamic properties, although there are some interesting differences in their pharmacokinetic behaviour in the clinical practice. They are not absorbed by the oral route. They have a somewhat special distribution in the organism, as some of them can reach high intracellular concentrations while, with some others, the concentration is reduced. They are highly bound to plasma proteins, thus it is recommended to administer a loading dose for anidulafungin and caspofungin, although this procedure is not yet clear with micafungin. Echinocandins are excreted via a non-microsomal metabolism, so the urinary concentration is very low. Some carrier proteins that take part in the biliary clearance process are probably involved in the interactions described with caspofungin and micafungin. These two drugs must be used with caution in patients with severely impaired hepatic function, while all of them can be used without special precautions when there is renal impairment or the patient requires renal replacement therapy.

  11. Scientific methodology applied.

    Science.gov (United States)

    Lussier, A

    1975-04-01

    The subject of this symposium is naproxen, a new drug that resulted from an investigation to find a superior anti-inflammatory agent. It was synthesized by Harrison et al. in 1970 at the Syntex Institute of Organic Chemistry and Biological Sciences. How can we chart the evolution of this or any other drug? Three steps are necessary: first, chemical studies (synthesis, analysis); second, animal pharmacology; third, human pharmacology. The last step can be further divided into four phases: metabolism and toxicology of the drug in normal volunteers; dose titration and initial clinical trials with sick subjects (pharmacometry); confirmatory clinical trials; and, once the drug is accepted on the market, re-evaluation (familiarization trials). To discover the truth about naproxen, we must all participate actively with a critical mind, following the principles of scientific methodology. We shall find that the papers to be presented today all deal with the third step in the evaluation process, clinical pharmacology. It is quite evident that the final and most decisive test must be aimed at the most valuable target: the human being. The end product of this day's work for each of us should be the formation of an opinion based on solid scientific proofs. And let us hope that we will all enjoy fulfilling the symposium in its entire etymological meaning this evening. In vino veritas.

  12. Separations Science and Technology, Semiannual progress report, October 1991--March 1992

    International Nuclear Information System (INIS)

    Vandegrift, G.F.; Betts, S.; Chamberlain, D.B.

    1994-01-01

    This document reports on the work done by the Separations Science and Technology Programs of the Chemical Technology Division, Argonne National Laboratory, in the period October 1991--March 1992. This effort is mainly concerned with developing the TRUEX process for removing and concentrating actinides from acidic waste streams contaminated with transuranic (TRU) elements. The objectives of TRUEX processing are to recover valuable TRU elements and to lower disposal costs for the non-TRU waste product of the process. Two other projects are underway with the objective of developing (1) a membrane-assisted solvent extraction method for treating natural and process waters contaminated by volatile organic compounds and (2) evaporation technology for concentrating radioactive waste and product streams such as those generated by the TRUEX process.

  13. Separations Science and Technology, Semiannual progress report, October 1991--March 1992

    Energy Technology Data Exchange (ETDEWEB)

    Vandegrift, G.F.; Betts, S.; Chamberlain, D.B. [and others]

    1994-01-01

    This document reports on the work done by the Separations Science and Technology Programs of the Chemical Technology Division, Argonne National Laboratory, in the period October 1991--March 1992. This effort is mainly concerned with developing the TRUEX process for removing and concentrating actinides from acidic waste streams contaminated with transuranic (TRU) elements. The objectives of TRUEX processing are to recover valuable TRU elements and to lower disposal costs for the non-TRU waste product of the process. Two other projects are underway with the objective of developing (1) a membrane-assisted solvent extraction method for treating natural and process waters contaminated by volatile organic compounds and (2) evaporation technology for concentrating radioactive waste and product streams such as those generated by the TRUEX process.
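
    The two records above describe removing and concentrating actinides from acidic waste without showing the staging arithmetic behind that claim. The short sketch below applies the standard Kremser relation for countercurrent solvent extraction; the distribution ratio, organic-to-aqueous flow ratio, and stage count are assumed illustrative values, not figures from the progress report.

      # Illustrative Kremser-equation sketch for countercurrent solvent extraction.
      # D, the organic/aqueous flow ratio, and the stage count are assumed example
      # values, NOT data from the ANL progress report.

      def fraction_unextracted(D: float, o_to_a: float, stages: int) -> float:
          """Fraction of solute left in the aqueous raffinate after N
          countercurrent stages (clean solvent fed to the last stage)."""
          E = D * o_to_a               # extraction factor per stage
          if abs(E - 1.0) < 1e-12:     # limiting case E == 1
              return 1.0 / (stages + 1)
          return (E - 1.0) / (E ** (stages + 1) - 1.0)

      if __name__ == "__main__":
          D, o_to_a, n = 20.0, 0.5, 4          # assumed values for illustration
          phi = fraction_unextracted(D, o_to_a, n)
          print(f"fraction unextracted: {phi:.2e}")
          print(f"decontamination factor: {1.0 / phi:.1e}")

    With these assumed numbers the extraction factor per stage is 10, so four countercurrent stages leave roughly one part in ten thousand of the solute in the raffinate, which is the sense in which a multistage contactor both removes and concentrates the extracted species.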

  14. Applied quantum cryptography

    International Nuclear Information System (INIS)

    Kollmitzer, Christian; Pivk, Mario

    2010-01-01

    Using the quantum properties of single photons to exchange binary keys between two partners for subsequent encryption of secret data is an absolutely novel technology. Only a few years ago quantum cryptography - or better: quantum key distribution - was the domain of basic research laboratories at universities. But during the last few years things changed. QKD left the laboratories and was picked up by more practically oriented teams that worked hard to develop a practically applicable technology out of the astonishing results of basic research. One major milestone towards a QKD technology was a large research and development project funded by the European Commission that aimed at combining quantum physics with the complementary technologies necessary to create a technical solution: electronics, software, and network components were added within the project SECOQC (Development of a Global Network for Secure Communication based on Quantum Cryptography), which teamed up expertise at the European level to create a technology for future encryption. The practical application of QKD in a standard optical fibre network was demonstrated in October 2008 in Vienna, giving a glimpse of the future of secure communication. Although many steps still have to be taken in order to achieve a truly mature technology, the cornerstone for future secure communication is already laid. QKD will not be the Holy Grail of security; it will not be able to solve all problems for evermore. But QKD has the potential to replace one of the weakest parts of symmetric encryption: the exchange of the key. It can be proven that the key exchange process cannot be corrupted and that keys that are generated and exchanged quantum cryptographically will be secure forever (as long as some additional conditions are kept). This book shows the state of the art of quantum cryptography and sketches how it can be implemented in standard communication infrastructure. The growing vulnerability of sensitive
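
    The abstract describes quantum key distribution only in prose. As a hedged illustration of just the basis-sifting step of a BB84-style exchange (the random bits and bases are invented for the demonstration; no hardware, channel loss, noise, or eavesdropping check from the SECOQC work is modelled), a toy simulation might look like this:

      # Toy BB84-style sifting demo: an illustrative sketch only, not an
      # implementation of the SECOQC network or of any real QKD stack.
      import secrets

      def random_bits(n):
          return [secrets.randbelow(2) for _ in range(n)]

      def bb84_sift(n=32):
          alice_bits  = random_bits(n)   # raw key material
          alice_bases = random_bits(n)   # 0 = rectilinear, 1 = diagonal
          bob_bases   = random_bits(n)   # Bob chooses bases independently
          # When the bases match, Bob's measurement reproduces Alice's bit;
          # mismatched positions are discarded after public basis comparison.
          return [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases)
                  if a == b]

      if __name__ == "__main__":
          key = bb84_sift()
          print(f"sifted key ({len(key)} bits): {''.join(map(str, key))}")

    On average about half of the positions survive sifting; the real protocol then spends further bits on error estimation and privacy amplification before a usable key remains.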

  15. A multicriteria optimization model applied to the capital budgeting process

    Directory of Open Access Journals (Sweden)

    Eder Oliveira Abensur

    2012-12-01

    Full Text Available The capital budgeting process involves the analysis and selection of projects committed over long periods of time. These investment decisions are traditionally made by the simultaneous application of various financial techniques based on discounted cash flow, such as the Net Present Value (NPV) and the Internal Rate of Return (IRR). Despite the long-standing and wide dissemination of these techniques, there are major problems of inconsistency, especially with mono-criterion functions and mutually exclusive projects. When dealing with financial decisions, it seems illusory to address optimization without taking multiple objectives and attributes into account. The objective of this paper is to present a practical multi-objective mathematical model that supports the selection of investment projects evaluated simultaneously against several financial indicators; the model incorporates a new measure of risk (GAFT). The model was tested on a sample of forty-five projects, and the results show that it is a practical and promising management tool.
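
    Because the record above revolves around discounted-cash-flow indicators, a short sketch of the NPV and IRR calculations it mentions may help; the cash flows below are invented for illustration, and the GAFT risk measure proposed in the paper is not described in the abstract, so it is not reproduced here.

      # NPV/IRR sketch for screening a single project; cash flows are invented.
      # The paper's GAFT risk measure is deliberately omitted (not specified here).

      def npv(rate, cash_flows):
          """Net present value; cash_flows[0] is the (negative) initial outlay."""
          return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

      def irr(cash_flows, lo=-0.99, hi=10.0):
          """Internal rate of return by bisection (assumes one sign change)."""
          for _ in range(200):
              mid = (lo + hi) / 2.0
              if npv(lo, cash_flows) * npv(mid, cash_flows) <= 0.0:
                  hi = mid
              else:
                  lo = mid
          return 0.5 * (lo + hi)

      if __name__ == "__main__":
          project = [-1000.0, 300.0, 400.0, 500.0, 200.0]   # hypothetical project
          print(f"NPV at 10%: {npv(0.10, project):.2f}")
          print(f"IRR: {irr(project):.2%}")

    A multi-criteria model such as the one the paper proposes would treat NPV, IRR, and a risk measure as simultaneous objectives rather than ranking projects on any single indicator.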

  16. Bioprocess microfluidics: applying microfluidic devices for bioprocessing.

    Science.gov (United States)

    Marques, Marco Pc; Szita, Nicolas

    2017-11-01

    Scale-down approaches have long been applied in bioprocessing to resolve scale-up problems. Miniaturized bioreactors have thrived as a tool to obtain process-relevant data during early-stage process development. Microfluidic devices are an attractive alternative in bioprocess development due to the high degree of control over process variables afforded by the laminar flow, and the possibility of reducing time and cost factors. The data quality obtained with these devices is high when they are integrated with sensing technology, and it is invaluable for scale translation and for assessing the economic viability of bioprocesses. Microfluidic devices have been developed as upstream process development tools in the areas of small molecules, therapeutic proteins, and cellular therapies. More recently, they have also been applied to mimic downstream unit operations.
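
    The claim that laminar flow gives a high degree of control over process variables can be made concrete with a Reynolds-number estimate; the channel geometry, fluid properties, and flow rate below are typical assumed values, not figures from the review.

      # Reynolds-number estimate for a water-filled rectangular microchannel.
      # Channel dimensions, fluid properties, and flow rate are assumed values.

      def hydraulic_diameter(width_m, height_m):
          """Hydraulic diameter of a rectangular duct: 4*area / wetted perimeter."""
          return 4.0 * width_m * height_m / (2.0 * (width_m + height_m))

      def reynolds(velocity_m_s, d_h_m, density_kg_m3=1000.0, viscosity_pa_s=1.0e-3):
          """Re = rho * v * D_h / mu (defaults approximate water near 20 C)."""
          return density_kg_m3 * velocity_m_s * d_h_m / viscosity_pa_s

      if __name__ == "__main__":
          w, h = 200e-6, 100e-6           # 200 um x 100 um channel (assumed)
          q = 10.0 * 1e-9 / 60.0          # 10 uL/min expressed in m^3/s (assumed)
          v = q / (w * h)                 # mean velocity
          re = reynolds(v, hydraulic_diameter(w, h))
          print(f"mean velocity = {v*1e3:.2f} mm/s, Re = {re:.2f} (laminar, Re << 2300)")

    With these assumptions the Reynolds number is of order one, far below the turbulent transition, which is why concentration gradients, residence times, and mixing in such devices are so predictable.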

  17. Special issue - Applying the accelerator

    International Nuclear Information System (INIS)

    Anon.

    1995-01-01

    The CERN Courier is the international journal of high energy physics, covering current developments in and around this branch of basic science. A recurrent theme is applying the technology developed for particle accelerators, the machines which produce beams of high energy particles for physics experiments. Twentieth-century science is full of similar examples of applications derived from pure research. This special issue of the CERN Courier is given over to one theme - the applications of accelerators. Accelerator systems and facilities are normally associated with high energy particle physics research, the search for fundamental particles and the quest to understand the physics of the Big Bang. To the layman, accelerator technology has become synonymous with large and expensive machines, exploiting the most modern technology for basic research. In reality, the range of accelerators and their applications is much broader. A vast number of accelerators, usually much smaller and operating for specific applications, create wealth and directly benefit the population, particularly in the important areas of healthcare, energy and the environment. There are well established applications in diagnostic and therapeutic medicine for research and routine clinical treatments. Accelerators and associated technologies are widely employed by industry for manufacturing and process control. In fundamental and applied research, accelerator systems are frequently used as tools. The biennial conference on the Applications of Accelerators in Industry and Research at Denton, Texas, attracts a thousand participants. This special issue of the CERN Courier includes articles on major applications, reflecting the diversity and value of accelerator technology. Under Guest Editor Dewi Lewis of Amersham International, contributions from leading international specialists with experience of the application end of the accelerator chain describe their fields of direct interest. The

  18. Applied Behavior Analysis in Education.

    Science.gov (United States)

    Cooper, John O.

    1982-01-01

    Applied behavior analysis in education is expanding rapidly. This article describes the dimensions of applied behavior analysis and the contributions this technology offers teachers in the areas of systematic application, direct and daily measurement, and experimental methodology. (CJ)

  19. Applying Lean on Agile Scrum Development Methodology

    OpenAIRE

    SurendRaj Dharmapal; K. Thirunadana Sikamani

    2015-01-01

    This paper introduces the reader to Agile and Lean concepts and provides a basic level of understanding of each process. It also provides a brief background on applying Lean concepts to each phase of the agile Scrum methodology and summarizes their primary business advantages for delivering value to the customer.

  20. Applied programs at Brookhaven National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    1991-09-01

    This document overviews the areas of current research at Brookhaven National Laboratory. Technology transfer and the user facilities are discussed. Current topics are presented in the areas of applied physics, chemical science, material science, energy efficiency and conservation, environmental health and mathematics, biosystems and process science, oceanography, and nuclear energy. (GHH)