WorldWideScience

Sample records for catacarb process

  1. Primary Processing

    NARCIS (Netherlands)

    Mulder, W.J.; Harmsen, P.F.H.; Sanders, J.P.M.; Carre, P.; Kamm, B.; Schoenicke, P.

    2012-01-01

    Primary processing of oil-containing material involves pre-treatment processes, oil recovery processes and the extraction and valorisation of valuable compounds from waste streams. Pre-treatment processes, e.g. thermal, enzymatic, electrical and radio frequency, have an important effect on the oil r

  2. Elektrokemiske Processer

    DEFF Research Database (Denmark)

    Bech-Nielsen, Gregers

    1997-01-01

    Electrochemical processes in: Power sources, Electrosynthesis, Corrosion. Pourbaix diagrams. Decontamination of industrial waste water for heavy metals.

  3. Data processing

    CERN Document Server

    Fry, T F

    2013-01-01

    Data Processing discusses the principles, practices, and associated tools in data processing. The book is comprised of 17 chapters that are organized into three parts. The first part covers the characteristics, systems, and methods of data processing. Part 2 deals with the data processing practice; this part discusses the data input, output, and storage. The last part discusses topics related to systems and software in data processing, which include checks and controls, computer language and programs, and program elements and structures. The text will be useful to practitioners of computer-rel

  4. Design Processes

    DEFF Research Database (Denmark)

    Ovesen, Nis

    2009-01-01

    Inspiration for most research and optimisations on design processes still seems to focus within the narrow field of the traditional design practice. The focus in this study turns to associated businesses of the design professions in order to learn from their development processes. Through interviews, the advantages and challenges of agile processes in mobile software and web businesses are identified. The applicability of these agile processes is discussed in regards to design education and product development in the domain of Industrial Design, and is briefly seen in relation to the concept of dromology.

  5. Magnetics Processing

    Data.gov (United States)

    Federal Laboratory Consortium — The Magnetics Processing Lab is equipped to perform testing of magnetometers, integrate them into aircraft systems, and perform data analysis, including noise reduction...

  6. Stochastic processes

    CERN Document Server

    Parzen, Emanuel

    2015-01-01

    Well-written and accessible, this classic introduction to stochastic processes and related mathematics is appropriate for advanced undergraduate students of mathematics with a knowledge of calculus and continuous probability theory. The treatment offers examples of the wide variety of empirical phenomena for which stochastic processes provide mathematical models, and it develops the methods of probability model-building.Chapter 1 presents precise definitions of the notions of a random variable and a stochastic process and introduces the Wiener and Poisson processes. Subsequent chapters examine
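
    As an illustrative aside (not drawn from Parzen's text), below is a minimal Python sketch of the two processes named above; the step sizes, rates and seeds are arbitrary choices for illustration.

        # Minimal simulation sketch of a Wiener process and a Poisson counting
        # process; illustrative only, not code from the cited book.
        import numpy as np

        rng = np.random.default_rng(0)

        def wiener_path(n_steps=1000, dt=0.01):
            """Standard Wiener process: independent N(0, dt) increments."""
            increments = rng.normal(0.0, np.sqrt(dt), size=n_steps)
            return np.concatenate(([0.0], np.cumsum(increments)))

        def poisson_jump_times(rate=2.0, t_max=10.0):
            """Homogeneous Poisson process: i.i.d. exponential inter-arrival times."""
            times, t = [], rng.exponential(1.0 / rate)
            while t < t_max:
                times.append(t)
                t += rng.exponential(1.0 / rate)
            return np.array(times)  # N(t) = number of jump times <= t

        print(wiener_path()[-1], len(poisson_jump_times()))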

  7. Image processing

    NARCIS (Netherlands)

    Heijden, van der F.; Spreeuwers, L.J.; Blanken, H.M.; Vries de, A.P.; Blok, H.E.; Feng, L

    2007-01-01

    The field of image processing addresses handling and analysis of images for many purposes using a large number of techniques and methods. The applications of image processing range from enhancement of the visibility of certain organs in medical images to object recognition for handling by industri

  8. Sustainable processing

    DEFF Research Database (Denmark)

    Kristensen, Niels Heine

    2004-01-01

    Kristensen NH and Beck A: Sustainable processing. In Otto Schmid, Alexander Beck and Ursula Kretzschmar (Editors) (2004): Underlying Principles in Organic and "Low-Input Food" Processing - Literature Survey. Research Institute of Organic Agriculture FiBL, CH-5070 Frick, Switzerland. ISBN 3-906081-58-3...

  9. Process mining

    DEFF Research Database (Denmark)

    van der Aalst, W.M.P.; Rubin, V.; Verbeek, H.M.W.

    2010-01-01

    Process mining includes the automated discovery of processes from event logs. Based on observed events (e.g., activities being executed or messages being exchanged) a process model is constructed. One of the essential problems in process mining is that one cannot assume to have seen all possible...... behavior. At best, one has seen a representative subset. Therefore, classical synthesis techniques are not suitable as they aim at finding a model that is able to exactly reproduce the log. Existing process mining techniques try to avoid such “overfitting” by generalizing the model to allow for more...... behavior. This generalization is often driven by the representation language and very crude assumptions about completeness. As a result, parts of the model are “overfitting” (allow only for what has actually been observed) while other parts may be “underfitting” (allow for much more behavior without strong...
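
    As an illustrative aside (not the authors' mining algorithm), below is a minimal Python sketch of the discovery step described above: counting which activity directly follows which in a toy event log. A model that only replays these three traces would "overfit" in exactly the sense discussed; the event log and activity names are invented.

        # Count directly-follows pairs over a toy event log; each trace is the
        # ordered list of activities observed for one case.
        from collections import Counter

        event_log = [
            ["register", "check", "decide", "notify"],
            ["register", "check", "check", "decide", "notify"],
            ["register", "decide", "notify"],
        ]

        directly_follows = Counter()
        for trace in event_log:
            for a, b in zip(trace, trace[1:]):
                directly_follows[(a, b)] += 1

        for (a, b), n in sorted(directly_follows.items()):
            print(f"{a} -> {b}: {n}")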

  10. Organizing Process

    DEFF Research Database (Denmark)

    Hull Kristensen, Peer; Bojesen, Anders

    This paper invites to discuss the processes of individualization and organizing being carried out under what we might see as an emerging regime of change. The underlying argumentation is that in certain processes of change, competence becomes questionable at all times. The hazy characteristics...... of this regime of change are pursued through a discussion of competencies as opposed to qualifications illustrated by distinct cases from the Danish public sector in the search for repetitive mechanisms. The cases are put into a general perspective by drawing upon experiences from similar change processes...

  11. Electrochemical Processes

    DEFF Research Database (Denmark)

    Bech-Nielsen, Gregers

    1997-01-01

    The notes describe in detail primary and secondary galvanic cells, fuel cells, electrochemical synthesis and electroplating processes, corrosion: measurements, inhibitors, cathodic and anodic protection, details of metal dissolution reactions, Pourbaix diagrams and purification of waste water from...

  12. Processing Proteases

    DEFF Research Database (Denmark)

    Ødum, Anders Sebastian Rosenkrans

    Processing proteases are proteases which proteolytically activate proteins and peptides into their biologically active form. Processing proteases play an important role in biotechnology as tools in protein fusion technology. Fusion strategies where helper proteins or peptide tags are fused...... to the protein of interest are an elaborate method to optimize expression or purification systems. It is however critical that fusion proteins can be removed and processing proteases can facilitate this in a highly specific manner. The commonly used proteases all have substrate specificities to the N...... of few known proteases to have substrate specificity for the C-terminal side of the scissile bond. LysN exhibits specificity for lysine, and has primarily been used to complement trypsin in proteomic studies. A working hypothesis during this study was the potential of LysN as a processing protease...

  13. Grants Process

    Science.gov (United States)

    The NCI Grants Process provides an overview of the end-to-end lifecycle of grant funding. Learn about the types of funding available and the basics for application, review, award, and on-going administration within the NCI.

  14. Sewer Processes

    DEFF Research Database (Denmark)

    Hvitved-Jacobsen, Thorkild; Vollertsen, Jes; Nielsen, Asbjørn Haaning

    and valuable information on the sewer as a chemical and biological reactor. It focuses on how to predict critical impacts and control adverse effects. It also provides an integrated description of sewer processes in modeling terms. This second edition is full of illustrative examples and figures, includes...... microbial and chemical processes and demonstrates how this knowledge can be applied for the design, operation, and the maintenance of wastewater collection systems. The authors add chemical and microbial dimensions to the design and management of sewer networks with an overall aim of improved sustainability...... by hydrogen sulfide and other volatile organic compounds, as well as other potential health issues, have caused environmental concerns to rise. Reflecting the most current developments, Sewer Processes: Microbial and Chemical Process Engineering of Sewer Networks, Second Edition, offers the reader updated...

  15. In process...

    OpenAIRE

    LI Xin

    1999-01-01

    Architecture is a wonderful world. As a student of architecture, time and time again I am impressed by its powerful images. The more I study and learn, however, the more I question. What is the truth beyond those fantastic images? What is the nature of Architecture? Is there any basic way or process to approach the work of Architecture? With these questions, I begin my thesis project and the process of looking for answers. MArch

  16. In process...

    OpenAIRE

    LI Xin

    2000-01-01

    Architecture is a wonderful world. As a student of architecture, time and time again I am impressed by its powerful images. The more I study and learn, however, the more I question. What is the truth beyond those fantastic images? What is the nature of Architecture? Is there any basic way or process to approach the work of Architecture? With these questions, I begin my thesis project and the process of looking for answers.

  17. Renewal processes

    CERN Document Server

    Mitov, Kosto V

    2014-01-01

    This monograph serves as an introductory text to classical renewal theory and some of its applications for graduate students and researchers in mathematics and probability theory. Renewal processes play an important part in modeling many phenomena in insurance, finance, queuing systems, inventory control and other areas. In this book, an overview of univariate renewal theory is given and renewal processes in the non-lattice and lattice case are discussed. A pre-requisite is a basic knowledge of probability theory.
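
    As an illustrative aside (not from the monograph), below is a minimal Python sketch of a renewal process: counting renewals up to a time horizon for i.i.d. inter-arrival times; the Gamma(2, 1) inter-arrival distribution is an arbitrary choice.

        # Count renewals in [0, t_max] for i.i.d. inter-arrival times; by the
        # elementary renewal theorem the count grows roughly like t_max / mean.
        import numpy as np

        rng = np.random.default_rng(1)

        def renewal_count(t_max, draw_interarrival):
            t, count = 0.0, 0
            while True:
                t += draw_interarrival()
                if t > t_max:
                    return count
                count += 1

        counts = [renewal_count(100.0, lambda: rng.gamma(2.0, 1.0)) for _ in range(200)]
        print(np.mean(counts))  # mean inter-arrival time is 2, so roughly 50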

  18. Macdonald processes

    CERN Document Server

    Borodin, Alexei

    2011-01-01

    Macdonald processes are probability measures on sequences of partitions defined in terms of nonnegative specializations of the Macdonald symmetric functions and two Macdonald parameters q,t in [0,1). We prove several results about these processes, which include the following. (1) We explicitly evaluate expectations of a rich family of observables for these processes. (2) In the case t=0, we find a Fredholm determinant formula for a q-Laplace transform of the distribution of the last part of the Macdonald-random partition. (3) We introduce Markov dynamics that preserve the class of Macdonald processes and lead to new "integrable" 2d and 1d interacting particle systems. (4) In a large time limit transition, and as q goes to 1, the particles of these systems crystallize on a lattice, and fluctuations around the lattice converge to O'Connell's Whittaker process that describes semi-discrete Brownian directed polymers. (5) This yields a Fredholm determinant for the Laplace transform of the polymer partition function...

  19. Offshoring Process

    DEFF Research Database (Denmark)

    Slepniov, Dmitrij; Sørensen, Brian Vejrum; Katayama, Hiroshi

    2011-01-01

    The purpose of this chapter is to contribute to the knowledge on how production offshoring and international operations management vary across cultural contexts. The chapter attempts to shed light on how companies approach the process of offshoring in different cultural contexts. In order...... of globalisation. Yet there are clear differences in how offshoring is conducted in Denmark and Japan. The main differences are outlined in a framework and explained employing cultural variables. The findings lead to a number of propositions suggesting that the process of offshoring is not simply a uniform...

  20. Processing Branches

    DEFF Research Database (Denmark)

    Schindler, Christoph; Tamke, Martin; Tabatabai, Ali;

    2014-01-01

    Angled and forked wood – a desired material until the 19th century – was swept away by industrialization and its standardization of processes and materials. Contemporary information technology has the potential for the capturing and recognition of individual geometries through laser scanning and compu...

  1. Innovation process

    DEFF Research Database (Denmark)

    Kolodovski, A.

    2006-01-01

    Purpose of this report: This report was prepared for the RISO team involved in the design of the innovation system. The report provides an innovation methodology to establish a common understanding of the process concepts and related terminology. The report does not include RISO- or Denmark-specific cultural, economic...

  2. BENTONITE PROCESSING

    Directory of Open Access Journals (Sweden)

    Anamarija Kutlić

    2012-07-01

    Full Text Available Bentonite has a wide variety of uses. A special use of bentonite, where its absorbing properties are employed to provide water-tight sealing, is for an underground repository in granites. In this paper, bentonite processing and beneficiation are described.

  3. Optical Processing.

    Science.gov (United States)

    1985-12-31

    ..."perceptron" (F. Rosenblatt, Principles of Neurodynamics, Spartan, 1962), workers in the neural network field have been seeking to understand how neural networks can perform... W. Stoner, "Incoherent optical processing via spatially offset pupil..."

  4. Photobiomodulation Process

    Directory of Open Access Journals (Sweden)

    Yang-Yi Xu

    2012-01-01

    Full Text Available Photobiomodulation (PBM) is a modulation of laser irradiation or monochromatic light (LI) on biosystems. There is little research on PBM dynamics although its phenomena and mechanism have been widely studied. The PBM was discussed from a dynamic viewpoint in this paper. It was found that the primary process of cellular PBM might be the key process of cellular PBM, so that the transition rate of cellular molecules can be extended to discuss the dose relationship of PBM. There may be a dose zone in which low intensity LI (LIL) at different doses has biological effects similar to each other, so that the biological information model of PBM might hold. LIL may self-adaptively modulate a chronic stress until it becomes successful.

  5. Boolean process

    Institute of Scientific and Technical Information of China (English)

    闵应骅; 李忠诚; 赵著行

    1997-01-01

    Boolean algebra successfully describes the logical behavior of a digital circuit, and has been widely used in electronic circuit design and test. With the development of high speed VLSIs it is a drawback for Boolean algebra to be unable to describe circuit timing behavior. Therefore a Boolean process is defined as a family of Boolean variables relevant to the time parameter t. A real-valued sample of a Boolean process is a waveform. Waveform functions can be manipulated formally by using mathematical tools. The distance, difference and limit of a waveform polynomial are defined, and a sufficient and necessary condition of the limit existence is presented. Based on this, the concept of sensitization is redefined precisely to demonstrate the potential and wide application possibility. The new definition is very different from the traditional one, and has an impact on determining the sensitizable paths with maximum or minimum length, and false paths, and then designing and testing high performance circuits
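
    As an illustrative aside (a rough toy model, not the authors' formalism), below is a Python sketch of the waveform idea above: a Boolean signal represented by its transition times and sampled as a function of continuous time t, with a crude sample-based distance between two waveforms. All names and the distance measure are invented for illustration.

        # A waveform as a function t -> {0, 1} that flips at each transition time.
        import numpy as np

        def waveform(transitions, initial=0):
            transitions = sorted(transitions)
            def value(t):
                flips = sum(1 for tr in transitions if tr <= t)
                return (initial + flips) % 2
            return value

        def sampled_distance(w1, w2, t_grid):
            """Fraction of sample points on which the two waveforms disagree."""
            return float(np.mean([w1(t) != w2(t) for t in t_grid]))

        w_a = waveform([1.0, 3.0])   # 0 until t=1, then 1 until t=3, then 0
        w_b = waveform([1.2, 3.0])   # same shape with a delayed rising edge
        print(sampled_distance(w_a, w_b, np.linspace(0.0, 5.0, 501)))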

  6. Purification process

    Energy Technology Data Exchange (ETDEWEB)

    Marshall, A.

    1981-02-17

    A process for the removal of hydrogen sulphide from gases or liquid hydrocarbons, comprises contacting the gas or liquid hydrocarbon with an aqueous alkaline solution, preferably having a pH value of 8 to 10, comprising (A) an anthraquinone disulphonic acid or a water-soluble sulphonamide thereof (B) a compound of a metal which can exist in at least two valency states and (C) a sequestering agent.

  7. Ceramic Processing

    Energy Technology Data Exchange (ETDEWEB)

    EWSUK,KEVIN G.

    1999-11-24

    Ceramics represent a unique class of materials that are distinguished from common metals and plastics by their: (1) high hardness, stiffness, and good wear properties (i.e., abrasion resistance); (2) ability to withstand high temperatures (i.e., refractoriness); (3) chemical durability; and (4) electrical properties that allow them to be electrical insulators, semiconductors, or ionic conductors. Ceramics can be broken down into two general categories, traditional and advanced ceramics. Traditional ceramics include common household products such as clay pots, tiles, pipe, and bricks, porcelain china, sinks, and electrical insulators, and thermally insulating refractory bricks for ovens and fireplaces. Advanced ceramics, also referred to as ''high-tech'' ceramics, include products such as spark plug bodies, piston rings, catalyst supports, and water pump seals for automobiles, thermally insulating tiles for the space shuttle, sodium vapor lamp tubes in streetlights, and the capacitors, resistors, transducers, and varistors in the solid-state electronics we use daily. The major differences between traditional and advanced ceramics are in the processing tolerances and cost. Traditional ceramics are manufactured with inexpensive raw materials, are relatively tolerant of minor process deviations, and are relatively inexpensive. Advanced ceramics are typically made with more refined raw materials and processing to optimize a given property or combination of properties (e.g., mechanical, electrical, dielectric, optical, thermal, physical, and/or magnetic) for a given application. Advanced ceramics generally have improved performance and reliability over traditional ceramics, but are typically more expensive. Additionally, advanced ceramics are typically more sensitive to the chemical and physical defects present in the starting raw materials, or those that are introduced during manufacturing.

  8. Hydrocarbon processing

    Energy Technology Data Exchange (ETDEWEB)

    Hill, S.G.; Seddon, D.

    1989-06-28

    A process for the catalytic conversion of synthesis gas into a product which comprises naphtha, kerosene and distillate is characterized in that the catalyst is a Fischer-Tropsch catalyst also containing a zeolite, the naphtha fraction contains 60% or less linear paraffins, and the kerosene and distillate fractions contain more linear paraffins and olefins than found in the naphtha fraction. Reduction of the relative amount of straight chain material in the naphtha fraction increases the octane number and so enhances the quality of the gasoline product, while the high quality of the kerosene and distillate fractions is maintained.

  9. Crystallization process

    Science.gov (United States)

    Adler, Robert J.; Brown, William R.; Auyang, Lun; Liu, Yin-Chang; Cook, W. Jeffrey

    1986-01-01

    An improved crystallization process is disclosed for separating a crystallizable material and an excluded material which is at least partially excluded from the solid phase of the crystallizable material obtained upon freezing a liquid phase of the materials. The solid phase is more dense than the liquid phase, and it is separated therefrom by relative movement with the formation of a packed bed of solid phase. The packed bed is continuously formed adjacent its lower end and passed from the liquid phase into a countercurrent flow of backwash liquid. The packed bed extends through the level of the backwash liquid to provide a drained bed of solid phase adjacent its upper end which is melted by a condensing vapor.

  10. Lithospheric processes

    Energy Technology Data Exchange (ETDEWEB)

    Baldridge, W. [and others]

    2000-12-01

    The authors used geophysical, geochemical, and numerical modeling to study selected problems related to Earth's lithosphere. We interpreted seismic waves to better characterize the thickness and properties of the crust and lithosphere. In the southwestern US and the Tien Shan, crust of high elevation is dynamically supported above buoyant mantle. In California, mineral fabrics in the mantle correlate with regional strain history. Although plumes of buoyant mantle may explain surface deformation and magmatism, our geochemical work does not support this mechanism for Iberia. Generation and ascent of magmas remains puzzling. Our work in Hawaii constrains the residence of magma beneath Hualalai to be a few hundred to about 1000 years. In the crust, heat drives fluid and mass transport. Numerical modeling yielded robust and accurate predictions of these processes. This work is important fundamental science, and applies to mitigation of volcanic and earthquake hazards, Test Ban Treaties, nuclear waste storage, environmental remediation, and hydrothermal energy.

  11. Image Processing

    Science.gov (United States)

    1993-01-01

    Electronic Imagery, Inc.'s ImageScale Plus software, developed through a Small Business Innovation Research (SBIR) contract with Kennedy Space Flight Center for use on space shuttle Orbiter in 1991, enables astronauts to conduct image processing, prepare electronic still camera images in orbit, display them and downlink images to ground based scientists for evaluation. Electronic Imagery, Inc.'s ImageCount, a spin-off product of ImageScale Plus, is used to count trees in Florida orange groves. Other applications include x-ray and MRI imagery, textile designs and special effects for movies. As of 1/28/98, company could not be located, therefore contact/product information is no longer valid.

  12. Dynamic Processes

    Science.gov (United States)

    Klingshirn, C.

    (... Phys. Lett. 92:211105, 2008). For this point, recall Figs. 6.16 and 6.33. Since the polarisation amplitude is gone in any case after the recombination process, there is an upper limit for T2 given by T2 ≤ 2T1. The factor of two comes from the fact that T2 describes the decay of an amplitude and T1 the decay of a population, which is proportional to the amplitude squared. Sometimes T2 is subdivided into a term due to recombination described by T1 and another, called 'pure dephasing' and denoted T2*, with the relation 1/T2 = 1/(2T1) + 1/T2*. The quantity T2* can considerably exceed 2T1. In the part on relaxation processes, that is on processes contributing to T3, we also give examples for the capture of excitons into bound, localized, or deep states. For more details on dynamics in semiconductors in general see, for example, the (text-)books [Klingshirn, Semiconductor Optics, 3rd edn. (Springer, Berlin, 2006); Haug and Koch, Quantum Theory of the Optical and Electronic Properties of Semiconductors, 4th edn. (World Scientific, Singapore, 2004); Haug and Jauho, Quantum Kinetics in Transport and Optics of Semiconductors, Springer Series in Solid State Sciences vol. 123 (Springer, Berlin, 1996); J. Shah, Ultrafast Spectroscopy of Semiconductors and of Semiconductor Nanostructures, Springer Series in Solid State Sciences vol. 115 (Springer, Berlin, 1996); Schafer and Wegener, Semiconductor Optics and Transport Phenomena (Springer, Berlin, 2002)]. We present selected data for free, bound and localized excitons, biexcitons and electron-hole pairs in an EHP, and examples for bulk materials, epilayers, quantum wells, nano rods and nano crystals, with the restriction that - to the knowledge of the author - data are not available for all these systems, density ranges and temperatures. Therefore, we subdivide the topic below only according to the three time constants T2, T3 and T1.

  13. Data Processing

    Science.gov (United States)

    Grangeat, P.

    A new area of biology has been opened up by nanoscale exploration of the living world. This has been made possible by technological progress, which has provided the tools needed to make devices that can measure things on such length and time scales. In a sense, this is a new window upon the living world, so rich and so diverse. Many of the investigative methods described in this book seek to obtain complementary physical, chemical, and biological data to understand the way it works and the way it is organised. At these length and time scales, only dedicated instrumentation could apprehend the relevant phenomena. There is no way for our senses to observe these things directly. One important field of application is molecular medicine, which aims to explain the mechanisms of life and disease by the presence and quantification of specific molecular entities. This involves combining information about genes, proteins, cells, and organs. This in turn requires the association of instruments for molecular diagnosis, either in vitro, e.g., the microarray or the lab-on-a-chip, or in vivo, e.g., probes for molecular biopsy, and tools for molecular imaging, used to localise molecular information in living organisms in a non-invasive way. These considerations concern both preclinical research for drug design and human medical applications. With the development of DNA and RNA chips [1], genomics has revolutionised investigative methods for cells and cell processes [2,3]. By sequencing the human genome, new ways have been found for understanding the fundamental mechanisms of life [4]. A revolution is currently under way with the analysis of the proteome [5-8], i.e., the complete set of proteins that can be found in some given biological medium, such as the blood plasma. The goal is to characterise certain diseases by recognisable signatures in the proteomic profile, as determined from a blood sample or a biopsy, for example [9-13]. What is at stake is the early detection of

  14. Information Processing - Administrative Data Processing

    Science.gov (United States)

    Bubenko, Janis

    A three semester, 60-credit course package in the topic of Administrative Data Processing (ADP), offered in 1966 at Stockholm University (SU) and the Royal Institute of Technology (KTH) is described. The package had an information systems engineering orientation. The first semester focused on datalogical topics, while the second semester focused on the infological topics. The third semester aimed to deepen the students’ knowledge in different parts of ADP and at writing a bachelor thesis. The concluding section of this paper discusses various aspects of the department’s first course effort. The course package led to a concretisation of our discipline and gave our discipline an identity. Our education seemed modern, “just in time”, and well adapted to practical needs. The course package formed the first concrete activity of a group of young teachers and researchers. In a forty-year perspective, these people have further developed the department and the topic to an internationally well-reputed body of knowledge and research. The department has produced more than thirty professors and more than one hundred doctoral degrees.

  15. Electrotechnologies to process foods

    Science.gov (United States)

    Electrical energy is being used to process foods. In conventional food processing plants, electricity drives mechanical devices and controls the degree of process. In recent years, several processing technologies are being developed to process foods directly with electricity. Electrotechnologies use...

  16. Studies on process synthesis and process integration

    OpenAIRE

    Fien, Gert-Jan A. F.

    1994-01-01

    This thesis discusses topics in the field of process engineering that have received much attention over the past twenty years: (1) conceptual process synthesis using heuristic shortcut methods and (2) process integration through heat-exchanger networks and energy-saving power and refrigeration systems. The shortcut methods for conceptual process synthesis presented in Chapter 2, utilize Residue Curve Maps in ternary diagrams and are illustrated with examples of processes...

  17. Extensible packet processing architecture

    Energy Technology Data Exchange (ETDEWEB)

    Robertson, Perry J.; Hamlet, Jason R.; Pierson, Lyndon G.; Olsberg, Ronald R.; Chun, Guy D.

    2013-08-20

    A technique for distributed packet processing includes sequentially passing packets associated with packet flows between a plurality of processing engines along a flow-through data bus linking the plurality of processing engines in series. At least one packet within a given packet flow is marked by a given processing engine to signify to the other processing engines that the given processing engine has claimed the given packet flow for processing. A processing function is applied to each of the packet flows within the processing engines and the processed packets are output on a time-shared, arbitrated data bus coupled to the plurality of processing engines.
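
    As an illustrative aside (a toy model, not the implementation from the cited patent), below is a Python sketch of the flow-claiming idea described above: packets pass through a chain of engines, and an engine that claims an unclaimed flow marks the packet so the other engines know the flow is taken. The hash-based claiming rule and all field names are invented for illustration.

        # Packets traverse the engines in series; one engine claims each flow
        # and marks the packets of that flow as claimed.
        def flow_hash(flow_id: str) -> int:
            return sum(ord(c) for c in flow_id)

        class Engine:
            def __init__(self, index: int, total: int):
                self.index, self.total = index, total
                self.claimed = set()

            def handle(self, packet: dict) -> dict:
                flow = packet["flow"]
                unclaimed = packet.get("claimed_by") is None
                if flow in self.claimed or (unclaimed and flow_hash(flow) % self.total == self.index):
                    self.claimed.add(flow)
                    packet["claimed_by"] = self.index              # mark the flow as claimed
                    packet["payload"] = packet["payload"].upper()  # stand-in processing step
                return packet                                      # forward along the bus

        engines = [Engine(i, 3) for i in range(3)]
        for pkt in [{"flow": "a", "payload": "x1"},
                    {"flow": "b", "payload": "y1"},
                    {"flow": "a", "payload": "x2"}]:
            for engine in engines:
                pkt = engine.handle(pkt)
            print(pkt)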

  18. Process mineralogy IX

    Energy Technology Data Exchange (ETDEWEB)

    Petruk, W.; Hagni, R.D.; Pignolet-Brandom, S.; Hausen, D.M. (eds.) (Canada Centre for Mineral and Energy Technology, Ottawa, ON (Canada))

    1990-01-01

    54 papers are presented under the headings: keynote address; process mineralogy applications to mineral processing; process mineralogy applications to gold; process mineralogy applications to pyrometallurgy and hydrometallurgy; process mineralogy applications to environment and health; and process mineralogy applications to synthetic materials. Subject and author indexes are provided. Three papers have been abstracted separately.

  19. Thinning spatial point processes into Poisson processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Schoenberg, Frederic Paik

    2010-01-01

    are identified, and where we simulate backwards and forwards in order to obtain the thinned process. In the case of a Cox process, a simple independent thinning technique is proposed. In both cases, the thinning results in a Poisson process if and only if the true Papangelou conditional intensity is used, and...
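
    As an illustrative aside (not the paper's construction, which relies on the generally unknown Papangelou conditional intensity), below is a minimal Python sketch of independent thinning of a homogeneous Poisson process with a location-dependent retention probability; the thinned pattern is again Poisson, with intensity equal to the original intensity times the retention probability. The intensity and retention function are arbitrary.

        # Independent thinning of a homogeneous Poisson process on the unit square.
        import numpy as np

        rng = np.random.default_rng(2)

        def poisson_points(intensity, area=1.0):
            n = rng.poisson(intensity * area)
            return rng.uniform(0.0, 1.0, size=(n, 2))

        def independent_thinning(points, retention):
            keep = rng.uniform(size=len(points)) < np.array([retention(x, y) for x, y in points])
            return points[keep]

        pts = poisson_points(intensity=500.0)
        thinned = independent_thinning(pts, retention=lambda x, y: 0.2 + 0.6 * x)
        print(len(pts), len(thinned))  # thinned points: Poisson with intensity 500*(0.2 + 0.6x)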

  20. Thinning spatial point processes into Poisson processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Schoenberg, Frederic Paik

    , and where one simulates backwards and forwards in order to obtain the thinned process. In the case of a Cox process, a simple independent thinning technique is proposed. In both cases, the thinning results in a Poisson process if and only if the true Papangelou conditional intensity is used, and thus can...

  1. Refactoring Process Models in Large Process Repositories.

    NARCIS (Netherlands)

    Weber, B.; Reichert, M.U.

    2008-01-01

    With the increasing adoption of process-aware information systems (PAIS), large process model repositories have emerged. Over time respective models have to be re-aligned to the real-world business processes through customization or adaptation. This bears the risk that model redundancies are introdu

  2. Process Intensification: A Perspective on Process Synthesis

    DEFF Research Database (Denmark)

    Lutze, Philip; Gani, Rafiqul; Woodley, John

    2010-01-01

    In recent years, process intensification (PI) has attracted considerable academic interest as a potential means for process improvement, to meet the increasing demands for sustainable production. A variety of intensified operations developed in academia and industry creates a large number...... of options to potentially improve the process but to identify the set of feasible solutions for PI in which the optimal can be found takes considerable resources. Hence, a process synthesis tool to achieve PI would potentially assist in the generation and evaluation of PI options. Currently, several process...... of the main concepts is illustrated through an example involving the operation of a membrane reactor....

  3. The permanental process

    DEFF Research Database (Denmark)

    McCullagh, Peter; Møller, Jesper

    2006-01-01

    We extend the boson process first to a large class of Cox processes and second to an even larger class of infinitely divisible point processes. Density and moment results are studied in detail. These results are obtained in closed form as weighted permanents, so the extension is called a permanental...... process. Temporal extensions and a particularly tractable case of the permanental process are also studied. Extensions of the fermion process along similar lines, leading to so-called determinantal processes, are discussed....

  4. Food processing and allergenicity

    NARCIS (Netherlands)

    Verhoeckx, K.C.M.; Vissers, Y.M.; Baumert, J.L.; Faludi, R.; Feys, M.; Flanagan, S.; Herouet-Guicheney, C.; Holzhauser, T.; Shimojo, R.; Bolt, N. van der; Wichers, H.; Kimber, I.

    2015-01-01

    Food processing can have many beneficial effects. However, processing may also alter the allergenic properties of food proteins. A wide variety of processing methods is available and their use depends largely on the food to be processed. In this review the impact of processing (heat and non-heat tre

  5. Food Processing and Allergenicity

    NARCIS (Netherlands)

    Verhoeckx, K.; Vissers, Y.; Baumert, J.L.; Faludi, R.; Fleys, M.; Flanagan, S.; Herouet-Guicheney, C.; Holzhauser, T.; Shimojo, R.; Bolt, van der Nieke; Wichers, H.J.; Kimber, I.

    2015-01-01

    Food processing can have many beneficial effects. However, processing may also alter the allergenic properties of food proteins. A wide variety of processing methods is available and their use depends largely on the food to be processed.

    In this review the impact of processing (heat and non

  6. SAR processing using SHARC signal processing systems

    Science.gov (United States)

    Huxtable, Barton D.; Jackson, Christopher R.; Skaron, Steve A.

    1998-09-01

    Synthetic aperture radar (SAR) is uniquely suited to help solve the Search and Rescue problem since it can be utilized either day or night and through dense fog or thick cloud cover. Other papers in this session, and in this session in 1997, describe the various SAR image processing algorithms that are being developed and evaluated within the Search and Rescue Program. All of these approaches to using SAR data require substantial amounts of digital signal processing: for the SAR image formation, and possibly for the subsequent image processing. In recognition of the demanding processing that will be required for an operational Search and Rescue Data Processing System (SARDPS), NASA/Goddard Space Flight Center and NASA/Stennis Space Center are conducting a technology demonstration utilizing SHARC multi-chip modules from Boeing to perform SAR image formation processing.

  7. INTEGRATED RENEWAL PROCESS

    Directory of Open Access Journals (Sweden)

    Suyono .

    2012-07-01

    Full Text Available The marginal distribution of integrated renewal process is derived in this paper. Our approach is based on the theory of point processes, especially Poisson point processes. The results are presented in the form of Laplace transforms.

  8. Integrated Process Capability Analysis

    Institute of Scientific and Technical Information of China (English)

    Chen; H; T; Huang; M; L; Hung; Y; H; Chen; K; S

    2002-01-01

    Process Capability Analysis (PCA) is a powerful tool to assess the ability of a process for manufacturing product that meets specifications. The larger process capability index implies the higher process yield, and the larger process capability index also indicates the lower process expected loss. Chen et al. (2001) has applied the indices Cpu, Cpl, and Cpk for evaluating the process capability for a multi-process product with smaller-the-better, larger-the-better, and nominal-the-best spec...
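
    As an illustrative aside (not the cited authors' method), below is a minimal Python sketch of the capability indices named above, computed from the sample mean and standard deviation under the usual normal-process assumptions; the measurements and specification limits are made up.

        # Cpu, Cpl and Cpk from sample statistics and the specification limits.
        import statistics

        def capability_indices(samples, lsl, usl):
            mu = statistics.fmean(samples)
            sigma = statistics.stdev(samples)
            cpu = (usl - mu) / (3 * sigma)   # margin to the upper spec limit
            cpl = (mu - lsl) / (3 * sigma)   # margin to the lower spec limit
            return cpu, cpl, min(cpu, cpl)   # Cpk is the worse of the two

        measurements = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 9.7, 10.1]
        print(capability_indices(measurements, lsl=9.0, usl=11.0))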

  9. Product Development Process Modeling

    Institute of Scientific and Technical Information of China (English)

    1999-01-01

    The use of Concurrent Engineering and other modern methods of product development and maintenance require that a large number of time-overlapped "processes" be performed by many people. However, successfully describing and optimizing these processes are becoming even more difficult to achieve. The perspective of industrial process theory (the definition of process) and the perspective of process implementation (process transition, accumulation, and inter-operations between processes) are used to survey the method used to build one base (multi-view) process model.

  10. From Process Understanding to Process Control

    NARCIS (Netherlands)

    Streefland, M.

    2010-01-01

    A licensed pharmaceutical process is required to be executed within the validated ranges throughout the lifetime of product manufacturing. Changes to the process usually require the manufacturer to demonstrate that the safety and efficacy of the product remains unchanged. Recent changes in the regul

  11. Business Process Redesign: Design the Improved Process

    Science.gov (United States)

    1993-09-01

    Nunamaker [1992] suggests that traditional voting usually happens at the end of a discussion, to close...

  12. Idaho Chemical Processing Plant Process Efficiency improvements

    Energy Technology Data Exchange (ETDEWEB)

    Griebenow, B.

    1996-03-01

    In response to decreasing funding levels available to support activities at the Idaho Chemical Processing Plant (ICPP) and a desire to be cost competitive, the Department of Energy Idaho Operations Office (DOE-ID) and Lockheed Idaho Technologies Company have increased their emphasis on cost-saving measures. The ICPP Effectiveness Improvement Initiative involves many activities to improve cost effectiveness and competitiveness. This report documents the methodology and results of one of those cost cutting measures, the Process Efficiency Improvement Activity. The Process Efficiency Improvement Activity performed a systematic review of major work processes at the ICPP to increase productivity and to identify nonvalue-added requirements. A two-phase approach was selected for the activity to allow for near-term implementation of relatively easy process modifications in the first phase while obtaining long-term continuous improvement in the second phase and beyond. Phase I of the initiative included a concentrated review of processes that had a high potential for cost savings with the intent of realizing savings in Fiscal Year 1996 (FY-96.) Phase II consists of implementing long-term strategies too complex for Phase I implementation and evaluation of processes not targeted for Phase I review. The Phase II effort is targeted for realizing cost savings in FY-97 and beyond.

  13. Technology or Process First?

    DEFF Research Database (Denmark)

    Siurdyban, Artur Henryk; Svejvig, Per; Møller, Charles

    between them using strategic alignment, Enterprise Systems and Business Process Management theories. We argue that the insights from these cases can lead to a better alignment between process and technology. Implications for practice include the direction towards a closer integration of process...... and technology factors in organizations. Theoretical implications call for a design-oriented view of technology and process alignment....

  14. Thin film processes

    CERN Document Server

    Vossen, John L

    1978-01-01

    Remarkable advances have been made in recent years in the science and technology of thin film processes for deposition and etching. It is the purpose of this book to bring together tutorial reviews of selected film deposition and etching processes from a process viewpoint. Emphasis is placed on the practical use of the processes to provide working guidelines for their implementation, a guide to the literature, and an overview of each process.

  15. Metallurgical process engineering

    Energy Technology Data Exchange (ETDEWEB)

    Yin, Ruiyu [Central Iron and Steel Research Institute (CISRI), Beijing (China)

    2011-07-01

    ''Metallurgical Process Engineering'' discusses large-scale integrated theory on the level of manufacturing production processes, putting forward concepts for exploring non-equilibrium and irreversible complex system. It emphasizes the dynamic and orderly operation of the steel plant manufacturing process, the major elements of which are the flow, process network and program. The book aims at establishing a quasi-continuous and continuous process system for improving several techno-economic indices, minimizing dissipation and enhancing the market competitiveness and sustainability of steel plants. The book is intended for engineers, researchers and managers in the fields of metallurgical engineering, industrial design, and process engineering. (orig.)

  16. Acoustic signal processing toolbox for array processing

    Science.gov (United States)

    Pham, Tien; Whipps, Gene T.

    2003-08-01

    The US Army Research Laboratory (ARL) has developed an acoustic signal processing toolbox (ASPT) for acoustic sensor array processing. The intent of this document is to describe the toolbox and its uses. The ASPT is a GUI-based software that is developed and runs under MATLAB. The current version, ASPT 3.0, requires MATLAB 6.0 and above. ASPT contains a variety of narrowband (NB) and incoherent and coherent wideband (WB) direction-of-arrival (DOA) estimation and beamforming algorithms that have been researched and developed at ARL. Currently, ASPT contains 16 DOA and beamforming algorithms. It contains several different NB and WB versions of the MVDR, MUSIC and ESPRIT algorithms. In addition, there are a variety of pre-processing, simulation and analysis tools available in the toolbox. The user can perform simulation or real data analysis for all algorithms with user-defined signal model parameters and array geometries.
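
    As an illustrative aside (ASPT itself is MATLAB-based and this sketch is not part of the toolbox), below is a minimal Python example of the simplest narrowband DOA idea behind the algorithms listed above: a conventional delay-and-sum beamformer scan over a uniform linear array, with simulated snapshots for a single far-field source at +20 degrees. The array geometry, SNR and all parameter values are invented for illustration.

        # Conventional (delay-and-sum) beamformer scan for a uniform linear array.
        import numpy as np

        def steering_vector(theta_rad, n_sensors, spacing_wavelengths=0.5):
            k = np.arange(n_sensors)
            return np.exp(-2j * np.pi * spacing_wavelengths * k * np.sin(theta_rad))

        def conventional_doa(snapshots, n_sensors, scan_deg):
            R = snapshots @ snapshots.conj().T / snapshots.shape[1]  # sample covariance
            power = []
            for deg in scan_deg:
                a = steering_vector(np.deg2rad(deg), n_sensors)
                power.append(np.real(a.conj() @ R @ a) / n_sensors**2)
            return np.array(power)

        rng = np.random.default_rng(3)
        n_sensors, n_snap = 8, 200
        a_true = steering_vector(np.deg2rad(20.0), n_sensors)
        signal = (rng.normal(size=n_snap) + 1j * rng.normal(size=n_snap)) / np.sqrt(2)
        noise = 0.1 * (rng.normal(size=(n_sensors, n_snap)) + 1j * rng.normal(size=(n_sensors, n_snap)))
        snapshots = np.outer(a_true, signal) + noise

        scan = np.arange(-90, 91)
        print(scan[np.argmax(conventional_doa(snapshots, n_sensors, scan))])  # about 20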

  17. News: Process intensification

    Science.gov (United States)

    Conservation of materials and energy is a major objective to the philosophy of sustainability. Where production processes can be intensified to assist these objectives, significant advances have been developed to assist conservation as well as cost. Process intensification (PI) h...

  18. Group Decision Process Support

    DEFF Research Database (Denmark)

    Gøtze, John; Hijikata, Masao

    1997-01-01

    Introducing the notion of Group Decision Process Support Systems (GDPSS) to traditional decision-support theorists.

  19. Business Process Inventory

    Data.gov (United States)

    Office of Personnel Management — Inventory of maps and descriptions of the business processes of the U.S. Office of Personnel Management (OPM), with an emphasis on the processes of the Office of the...

  20. Towards better process understanding

    DEFF Research Database (Denmark)

    Matero, Sanni Elina; van der Berg, Franciscus Winfried J; Poutiainen, Sami

    2013-01-01

    The manufacturing of tablets involves many unit operations that possess multivariate and complex characteristics. The interactions between the material characteristics and process related variation are presently not comprehensively analyzed due to univariate detection methods. As a consequence......, current best practice to control a typical process is to not allow process-related factors to vary i.e. lock the production parameters. The problem related to the lack of sufficient process understanding is still there: the variation within process and material properties is an intrinsic feature...... and cannot be compensated for with constant process parameters. Instead, a more comprehensive approach based on the use of multivariate tools for investigating processes should be applied. In the pharmaceutical field these methods are referred to as Process Analytical Technology (PAT) tools that aim...

  1. Secure Processing Lab

    Data.gov (United States)

    Federal Laboratory Consortium — The Secure Processing Lab is the center of excellence for new and novel processing techniques for the formation, calibration and analysis of radar. In addition, this...

  2. Radiochemical Processing Laboratory (RPL)

    Data.gov (United States)

    Federal Laboratory Consortium — The Radiochemical Processing Laboratory (RPL) is a scientific facility funded by DOE to create and implement innovative processes for environmental clean-up and...

  3. Infrared processing of foods

    Science.gov (United States)

    Infrared (IR) processing of foods has been gaining popularity over conventional processing in several unit operations, including drying, peeling, baking, roasting, blanching, pasteurization, sterilization, disinfection, disinfestation, cooking, and popping . It has shown advantages over conventional...

  4. Dairy processing, Improving quality

    NARCIS (Netherlands)

    Smit, G.

    2003-01-01

    This book discusses raw milk composition, production and quality, and reviews developments in processing from hygiene and HACCP systems to automation, high-pressure processing and modified atmosphere packaging.

  5. Stochastic processes - quantum physics

    Energy Technology Data Exchange (ETDEWEB)

    Streit, L. (Bielefeld Univ. (Germany, F.R.))

    1984-01-01

    The author presents an elementary introduction to stochastic processes. He starts from simple quantum mechanics and considers problems in probability, finally presenting quantum dynamics in terms of stochastic processes.

  6. Drug Development Process

    Science.gov (United States)

    Step 1: Discovery and Development. Research for a new drug ...

  7. Polyamines in tea processing.

    Science.gov (United States)

    Palavan-Unsal, Narcin; Arisan, Elif Damla; Terzioglu, Salih

    2007-06-01

    The distribution of dietary polyamines, putrescine, spermidine and spermine, was determined during processing of Camellia sinensis. Black tea manufacture is carried out by a series of processes on fresh tea leaves involving withering, rolling, fermentation, drying and sieving. The aim of this research was to determine the effect of tea processing on the polyamine content in relation to the antioxidant enzymes superoxide dismutase, lipid peroxidase and glutathione peroxidase. Before processing, the spermine content was much higher than the putrescine and spermidine content in green tea leaves. Spermine was significantly decreased during processing while the putrescine and spermine contents increased during withering and rolling and decreased in the following stages. The superoxide dismutase activity increased at the withering stage and declined during processing. The transcript level of the polyamine biosynthesis-responsible enzyme ornithine decarboxylase was reduced during each processing step. This study reveals the importance of protection of nutritional compounds that are essential for health during the manufacturing process.

  8. Grind hardening process

    CERN Document Server

    Salonitis, Konstantinos

    2015-01-01

    This book presents the grind-hardening process and the main studies published since it was introduced in 1990s.  The modelling of the various aspects of the process, such as the process forces, temperature profile developed, hardness profiles, residual stresses etc. are described in detail. The book is of interest to the research community working with mathematical modeling and optimization of manufacturing processes.

  9. Dosimetry for radiation processing

    DEFF Research Database (Denmark)

    Miller, Arne

    1986-01-01

    During the past few years significant advances have taken place in the different areas of dosimetry for radiation processing, mainly stimulated by the increased interest in radiation for food preservation, plastic processing and sterilization of medical products. Reference services both...... and sterilization dosimetry, optichromic dosimeters in the shape of small tubes for food processing, and ESR spectroscopy of alanine for reference dosimetry. In this paper the special features of radiation processing dosimetry are discussed, several commonly used dosimeters are reviewed, and factors leading...

  10. Software Process Improvement Defined

    DEFF Research Database (Denmark)

    Aaen, Ivan

    2002-01-01

    This paper argues in favor of the development of explanatory theory on software process improvement. The last one or two decades' commitment to prescriptive approaches in software process improvement theory may contribute to the emergence of a gulf dividing theorists and practitioners....... It is proposed that this divide be met by the development of theory evaluating prescriptive approaches and informing practice with a focus on the software process policymaking and process control aspects of improvement efforts...

  11. Auditory processing models

    DEFF Research Database (Denmark)

    Dau, Torsten

    2008-01-01

    The Handbook of Signal Processing in Acoustics will compile the techniques and applications of signal processing as they are used in the many varied areas of Acoustics. The Handbook will emphasize the interdisciplinary nature of signal processing in acoustics. Each Section of the Handbook will pr...

  12. Semisolid Metal Processing Consortium

    Energy Technology Data Exchange (ETDEWEB)

    Apelian, Diran

    2002-01-10

    Mathematical modeling and simulation of semisolid filling processes remains a critical issue in understanding and optimizing the process. Semisolid slurries are non-Newtonian materials that exhibit complex rheological behavior. Therefore, the way these slurries flow in cavities is very different from the way liquid in classical casting fills cavities. Actually, filling in semisolid processing is often counterintuitive

  13. Clinical Process Intelligence

    DEFF Research Database (Denmark)

    Vilstrup Pedersen, Klaus

    2006-01-01

    .e. local guidelines. From a knowledge management point of view, this externalization of generalized processes, gives the opportunity to learn from, evaluate and optimize the processes. "Clinical Process Intelligence" (CPI), will denote the goal of getting generalized insight into patient centered health...

  14. Biomass process handbook

    Energy Technology Data Exchange (ETDEWEB)

    1983-01-01

    Descriptions are given of 42 processes which use biomass to produce chemical products. Marketing and economic background, process description, flow sheets, costs, major equipment, and availability of technology are given for each of the 42 processes. Some of the chemicals discussed are: ethanol, ethylene, acetaldehyde, butanol, butadiene, acetone, citric acid, gluconates, itaconic acid, lactic acid, xanthan gum, sorbitol, starch polymers, fatty acids, fatty alcohols, glycerol, soap, azelaic acid, pelargonic acid, nylon-11, jojoba oil, furfural, furfuryl alcohol, tetrahydrofuran, cellulose polymers, products from pulping wastes, and methane. Processes include acid hydrolysis, enzymatic hydrolysis, fermentation, distillation, Purox process, and anaerobic digestion.

  15. Thin film processes II

    CERN Document Server

    Kern, Werner

    1991-01-01

    This sequel to the 1978 classic, Thin Film Processes, gives a clear, practical exposition of important thin film deposition and etching processes that have not yet been adequately reviewed. It discusses selected processes in tutorial overviews with implementation guidelines and an introduction to the literature. Though edited to stand alone, when taken together, Thin Film Processes II and its predecessor present a thorough grounding in modern thin film techniques. Key Features: provides an all-new sequel to the 1978 classic, Thin Film Processes; introduces new topics, and sever

  16. Colloid process engineering

    CERN Document Server

    Peukert, Wolfgang; Rehage, Heinz; Schuchmann, Heike

    2015-01-01

    This book deals with colloidal systems in technical processes and the influence of technical processes on colloidal systems. It explores how new measurement capabilities can offer the potential for a dynamic development of science and engineering, and examines the origin of colloidal systems and their use for new products. The future challenges to colloidal process engineering are the development of appropriate equipment and processes for the production and obtainment of multi-phase structures and energetic interactions in market-relevant quantities. The book explores the relevant processes and for controlled production and how they can be used across all scales.

  17. Silicon production process evaluations

    Science.gov (United States)

    1982-01-01

    Chemical engineering analyses involving the preliminary process design of a plant (1,000 metric tons/year capacity) to produce silicon via the technology under consideration were accomplished. Major activities in the chemical engineering analyses included base case conditions, reaction chemistry, process flowsheet, material balance, energy balance, property data, equipment design, major equipment list, production labor and forward for economic analysis. The process design package provided detailed data for raw materials, utilities, major process equipment and production labor requirements necessary for polysilicon production in each process.

  18. Data processing made simple

    CERN Document Server

    Wooldridge, Susan

    2013-01-01

    Data Processing: Made Simple, Second Edition presents discussions of a number of trends and developments in the world of commercial data processing. The book covers the rapid growth of micro- and mini-computers for both home and office use; word processing and the 'automated office'; the advent of distributed data processing; and the continued growth of database-oriented systems. The text also discusses modern digital computers; fundamental computer concepts; information and data processing requirements of commercial organizations; and the historical perspective of the computer industry. The

  19. Dynamical laser spike processing

    CERN Document Server

    Shastri, Bhavin J; Tait, Alexander N; Rodriguez, Alejandro W; Wu, Ben; Prucnal, Paul R

    2015-01-01

    Novel materials and devices in photonics have the potential to revolutionize optical information processing, beyond conventional binary-logic approaches. Laser systems offer a rich repertoire of useful dynamical behaviors, including the excitable dynamics also found in the time-resolved "spiking" of neurons. Spiking reconciles the expressiveness and efficiency of analog processing with the robustness and scalability of digital processing. We demonstrate that graphene-coupled laser systems offer a unified low-level spike optical processing paradigm that goes well beyond previously studied laser dynamics. We show that this platform can simultaneously exhibit logic-level restoration, cascadability and input-output isolation---fundamental challenges in optical information processing. We also implement low-level spike-processing tasks that are critical for higher level processing: temporal pattern detection and stable recurrent memory. We study these properties in the context of a fiber laser system, but the addit...

  20. A secondary fuel removal process: plasma processing

    Energy Technology Data Exchange (ETDEWEB)

    Min, J. Y.; Kim, Y. S. [Hanyang Univ., Seoul (Korea, Republic of); Bae, K. K.; Yang, M. S. [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1997-07-01

    Plasma etching process of UO{sub 2} by using fluorine containing gas plasma is studied as a secondary fuel removal process for DUPIC (Direct Use of PWR spent fuel Into Candu) process which is taken into consideration for potential future fuel cycle in Korea. CF{sub 4}/O{sub 2} gas mixture is chosen for reactant gas and the etching rates of UO{sub 2} by the gas plasma are investigated as functions of CF{sub 4}/O{sub 2} ratio, plasma power, substrate temperature, and plasma gas pressure. It is found that the optimum CF{sub 4}/O{sub 2} ratio is around 4:1 at all temperatures up to 400 deg C and the etching rate increases with increasing r.f. power and substrate temperature. Under 150W r.f. power the etching rate reaches 1100 monolayers/min at 400 deg C, which is equivalent to about 0.5mm/min. (author).

  1. Business Model Process Configurations

    DEFF Research Database (Denmark)

    Taran, Yariv; Nielsen, Christian; Thomsen, Peter

    2015-01-01

    Purpose – The paper aims: 1) to develop systematically a structural list of various business model process configurations and to group (deductively) these selected configurations in a structured typological categorization list; 2) to facilitate companies in the process of BM innovation by developing (inductively) an ontological classification framework, in view of the BM process configurations typology developed. Design/methodology/approach – Given the inconsistencies found in business model studies (e.g. definitions, configurations, classifications), we adopted the analytical induction method of data analysis. Findings – A comprehensive literature review and analysis resulted in a list of business model process configurations systematically organized under five classification groups, namely: revenue model; value proposition; value configuration; target customers; and strategic...

  2. Decomposability for stable processes

    CERN Document Server

    Wang, Yizao; Roy, Parthanil

    2011-01-01

    We characterize all possible independent symmetric $\alpha$-stable (S$\alpha$S) components of a non-Gaussian S$\alpha$S process, $0<\alpha<2$. In particular, we characterize the independent stationary S$\alpha$S components of a stationary S$\alpha$S process. One simple consequence of our characterization is that all stationary components of the S$\alpha$S moving average processes are trivial. As a main application, we show that the standard Brown-Resnick process has a moving average representation. This complements a result of Kabluchko et al. (2009), who obtained mixed moving average representations for these processes. We also develop a parallel characterization theory for max-stable processes.

  3. Badge Office Process Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Haurykiewicz, John Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Dinehart, Timothy Grant [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Parker, Robert Young [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-12

    The purpose of this process analysis was to analyze the Badge Offices’ current processes from a systems perspective and consider ways of pursuing objectives set forth by SEC-PS, namely increased customer flow (throughput) and reduced customer wait times. Information for the analysis was gathered for the project primarily through Badge Office Subject Matter Experts (SMEs), and in-person observation of prevailing processes. Using the information gathered, a process simulation model was constructed to represent current operations and allow assessment of potential process changes relative to factors mentioned previously. The overall purpose of the analysis was to provide SEC-PS management with information and recommendations to serve as a basis for additional focused study and areas for potential process improvements in the future.

  4. Linearity in Process Languages

    DEFF Research Database (Denmark)

    Nygaard, Mikkel; Winskel, Glynn

    2002-01-01

    The meaning and mathematical consequences of linearity (managing without a presumed ability to copy) are studied for a path-based model of processes which is also a model of affine-linear logic. This connection yields an affine-linear language for processes, automatically respecting open-map bisimulation, in which a range of process operations can be expressed. An operational semantics is provided for the tensor fragment of the language. Different ways to make assemblies of processes lead to different choices of exponential, some of which respect bisimulation.

  5. Process Improvement Essentials

    CERN Document Server

    Persse, James R

    2006-01-01

    Process Improvement Essentials combines the foundation needed to understand process improvement theory with the best practices to help individuals implement process improvement initiatives in their organization. The three leading programs: ISO 9001:2000, CMMI, and Six Sigma--amidst the buzz and hype--tend to get lumped together under a common label. This book delivers a combined guide to all three programs, compares their applicability, and then sets the foundation for further exploration.

  6. TEP process flow diagram

    Energy Technology Data Exchange (ETDEWEB)

    Wilms, R Scott [Los Alamos National Laboratory; Carlson, Bryan [Los Alamos National Laboratory; Coons, James [Los Alamos National Laboratory; Kubic, William [Los Alamos National Laboratory

    2008-01-01

    This presentation describes the development of the proposed Process Flow Diagram (PFD) for the Tokamak Exhaust Processing System (TEP) of ITER. A brief review of design efforts leading up to the PFD is followed by a description of the hydrogen-like, air-like, and water-like processes. Two new design values are described: the most-common and most-demanding design values. The proposed PFD is shown to meet specifications under both the most-common and most-demanding design values.

  7. Business process transformation

    CERN Document Server

    Grover, Varun

    2015-01-01

    Featuring contributions from prominent thinkers and researchers, this volume in the "Advances in Management Information Systems" series provides a rich set of conceptual, empirical, and introspective studies that epitomize fundamental knowledge in the area of Business Process Transformation. Processes are interpreted broadly to include operational and managerial processes within and between organizations, as well as those involved in knowledge generation. Transformation includes radical and incremental change, its conduct, management, and outcome. The editors and contributing authors pay clo

  8. Jointly Poisson processes

    CERN Document Server

    Johnson, D H

    2009-01-01

    What constitutes jointly Poisson processes remains an unresolved issue. This report reviews the current state of the theory and indicates how the accepted but unproven model equals that resulting from the small time-interval limit of jointly Bernoulli processes. One intriguing consequence of these models is that jointly Poisson processes can only be positively correlated as measured by the correlation coefficient defined by cumulants of the probability generating functional.
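
    A small simulation, under illustrative rates, of the jointly Bernoulli construction the report refers to: events in each small time bin are either shared or independent, and the resulting counts come out non-negatively correlated.

```python
# Illustrative simulation of jointly Bernoulli trials whose small-interval
# limit approximates a pair of (positively) correlated Poisson processes.
# All rates and the shared-event rate are invented illustrative values.
import numpy as np

rng = np.random.default_rng(0)
T, dt = 100.0, 1e-3                      # observation window and bin width
n_bins = int(T / dt)
lam1, lam2, lam_common = 2.0, 3.0, 1.0   # marginal rates include a shared component

# In each small bin, a "common" event fires with prob lam_common*dt and is
# counted in both processes; independent extra events fire on top of it.
common = rng.random(n_bins) < lam_common * dt
extra1 = rng.random(n_bins) < (lam1 - lam_common) * dt
extra2 = rng.random(n_bins) < (lam2 - lam_common) * dt
counts1 = common | extra1
counts2 = common | extra2

print("mean rates:", counts1.sum() / T, counts2.sum() / T)
print("count correlation:", np.corrcoef(counts1, counts2)[0, 1])  # non-negative
```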

  9. Multiphoton processes: conference proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Lambropoulos, P.; Smith, S.J. (eds.)

    1984-01-01

    The chapters of this volume represent the invited papers delivered at the conference. They are arranged according to thematic proximity, beginning with atoms and continuing with molecules and surfaces. Section headings include multiphoton processes in atoms, field fluctuations and collisions in multiphoton processes, and multiphoton processes in molecules and surfaces. Abstracts of individual items from the conference were prepared separately for the data base. (GHT)

  10. Polygon mesh processing

    CERN Document Server

    Botsch, Mario; Pauly, Mark; Alliez, Pierre; Levy, Bruno

    2010-01-01

    Geometry processing, or mesh processing, is a fast-growing area of research that uses concepts from applied mathematics, computer science, and engineering to design efficient algorithms for the acquisition, reconstruction, analysis, manipulation, simulation, and transmission of complex 3D models. Applications of geometry processing algorithms already cover a wide range of areas from multimedia, entertainment, and classical computer-aided design, to biomedical computing, reverse engineering, and scientific computing. Over the last several years, triangle meshes have become increasingly popular,

  11. Basic digital signal processing

    CERN Document Server

    Lockhart, Gordon B

    1985-01-01

    Basic Digital Signal Processing describes the principles of digital signal processing and experiments with BASIC programs involving the fast Fourier transform (FFT). The book reviews the fundamentals of the BASIC program, continuous and discrete time signals including analog signals, Fourier analysis, the discrete Fourier transform, and signal energy and power. The text also explains digital signal processing involving digital filters, linear time-invariant systems, the discrete time unit impulse, discrete-time convolution, and the alternative structure for second order infinite impulse response (IIR) sections.
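
    A short analogue of the book's FFT experiments, written here in Python rather than BASIC, with an invented two-tone test signal:

```python
# A small analogue of the book's FFT experiments: sample a sum of two
# sinusoids and locate their frequencies with the FFT.
import numpy as np

fs = 1000.0                      # sampling rate in Hz
t = np.arange(0, 1.0, 1.0 / fs)  # one second of samples
x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

spectrum = np.fft.rfft(x)
freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)

# The two largest spectral magnitudes should sit at 50 Hz and 120 Hz.
peaks = freqs[np.argsort(np.abs(spectrum))[-2:]]
print(sorted(peaks))             # -> [50.0, 120.0]
```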

  12. NASA Hazard Analysis Process

    Science.gov (United States)

    Deckert, George

    2010-01-01

    This viewgraph presentation reviews The NASA Hazard Analysis process. The contents include: 1) Significant Incidents and Close Calls in Human Spaceflight; 2) Subsystem Safety Engineering Through the Project Life Cycle; 3) The Risk Informed Design Process; 4) Types of NASA Hazard Analysis; 5) Preliminary Hazard Analysis (PHA); 6) Hazard Analysis Process; 7) Identify Hazardous Conditions; 8) Consider All Interfaces; 9) Work a Preliminary Hazard List; 10) NASA Generic Hazards List; and 11) Final Thoughts

  13. Scientific information processing procedures

    Directory of Open Access Journals (Sweden)

    García, Maylin

    2013-07-01

    The paper systematizes several theoretical viewpoints on the scientific information processing skill and decomposes the processing skill into sub-skills. Several methods, such as analysis, synthesis, induction, deduction and document analysis, were used to build up a theoretical framework. Interviews and surveys of professionals in training and a case study were carried out to evaluate the results. All professionals in the sample improved their performance in scientific information processing.

  14. Multimodal Processes Rescheduling

    DEFF Research Database (Denmark)

    Bocewicz, Grzegorz; Banaszak, Zbigniew A.; Nielsen, Peter

    2013-01-01

    Cyclic scheduling problems concerning multimodal processes are usually observed in FMSs producing multi-type parts, where the Automated Guided Vehicles System (AGVS) plays the role of a material handling system. Schedulability analysis of concurrently flowing cyclic processes (SCCP) executed in the...

  15. Fuels Processing Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — NETL’s Fuels Processing Laboratory in Morgantown, WV, provides researchers with the equipment they need to thoroughly explore the catalytic issues associated with...

  16. Process modeling style

    CERN Document Server

    Long, John

    2014-01-01

    Process Modeling Style focuses on aspects of process modeling beyond notation that are very important to practitioners. Many people who model processes focus on the specific notation used to create their drawings. While that is important, there are many other aspects to modeling, such as naming, creating identifiers, descriptions, interfaces, patterns, and creating useful process documentation. Experienced author John Long focuses on those non-notational aspects of modeling, which practitioners will find invaluable. Gives solid advice for creating roles, work produ

  17. Living olefin polymerization processes

    Science.gov (United States)

    Schrock, Richard R.; Baumann, Robert

    1999-01-01

    Processes for the living polymerization of olefin monomers with terminal carbon-carbon double bonds are disclosed. The processes employ initiators that include a metal atom and a ligand having two group 15 atoms and a group 16 atom or three group 15 atoms. The ligand is bonded to the metal atom through two anionic or covalent bonds and a dative bond. The initiators are particularly stable under reaction conditions in the absence of olefin monomer. The processes provide polymers having low polydispersities, especially block copolymers having low polydispersities. It is an additional advantage of these processes that, during block copolymer synthesis, a relatively small amount of homopolymer is formed.

  18. Financial information processing

    Institute of Scientific and Technical Information of China (English)

    Shuo BAI; Shouyang WANG; Lean YU; Aoying ZHOU

    2009-01-01

    The rapid growth in financial data volume has made financial information processing more and more difficult due to the increase in complexity, which has forced businesses and academics alike to turn to sophisticated information processing technologies for better solutions. A typical feature is that high-performance computers and advanced computational techniques play increasingly important roles in giving businesses and industries a competitive advantage. Accordingly, financial information processing has emerged as a new cross-disciplinary field integrating computer science, mathematics, financial economics, intelligent techniques, and computer simulations to make different decisions based on processed financial information.

  19. The Recruitment Process:

    DEFF Research Database (Denmark)

    Holm, Anna

    The aim of this research was to determine whether the introduction of e-recruitment has an impact on the process and underlying tasks, subtasks and activities of recruitment. Three large organizations with well-established e-recruitment practices were included in the study. The three case studies, which were carried out in Denmark in 2008-2009 using qualitative research methods, revealed changes in the sequence, divisibility and repetitiveness of a number of recruitment tasks and subtasks. The new recruitment process design was identified and presented in the paper. The study concluded that the main task of the process shifted from processing applications to communicating with candidates.

  20. Biased predecision processing.

    Science.gov (United States)

    Brownstein, Aaron L

    2003-07-01

    Decision makers conduct biased predecision processing when they restructure their mental representation of the decision environment to favor one alternative before making their choice. The question of whether biased predecision processing occurs has been controversial since L. Festinger (1957) maintained that it does not occur. The author reviews relevant research in sections on theories of cognitive dissonance, decision conflict, choice certainty, action control, action phases, dominance structuring, differentiation and consolidation, constructive processing, motivated reasoning, and groupthink. Some studies did not find evidence of biased predecision processing, but many did. In the Discussion section, the moderators are summarized and used to assess the theories.

  1. Acoustic Signal Processing

    Science.gov (United States)

    Hartmann, William M.; Candy, James V.

    Signal processing refers to the acquisition, storage, display, and generation of signals - also to the extraction of information from signals and the re-encoding of information. As such, signal processing in some form is an essential element in the practice of all aspects of acoustics. Signal processing algorithms enable acousticians to separate signals from noise, to perform automatic speech recognition, or to compress information for more efficient storage or transmission. Signal processing concepts are the building blocks used to construct models of speech and hearing. Now, in the 21st century, all signal processing is effectively digital signal processing. Widespread access to high-speed processing, massive memory, and inexpensive software make signal processing procedures of enormous sophistication and power available to anyone who wants to use them. Because advanced signal processing is now accessible to everybody, there is a need for primers that introduce basic mathematical concepts that underlie the digital algorithms. The present handbook chapter is intended to serve such a purpose.
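
    A minimal example of one operation mentioned above, separating a signal from noise, using a crude FFT-based low-pass filter; the signal, noise level and cutoff frequency are all illustrative choices.

```python
# Minimal signal-from-noise separation by zeroing high-frequency FFT bins.
# The test signal, noise level and 60 Hz cutoff are arbitrary illustrative choices.
import numpy as np

fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
clean = np.sin(2 * np.pi * 5 * t)                      # slow "signal"
noisy = clean + 0.5 * np.random.default_rng(1).standard_normal(t.size)

spectrum = np.fft.rfft(noisy)
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
spectrum[freqs > 60.0] = 0.0                           # discard high-frequency content
recovered = np.fft.irfft(spectrum, n=t.size)

print("RMS error before filtering:", np.sqrt(np.mean((noisy - clean) ** 2)))
print("RMS error after filtering: ", np.sqrt(np.mean((recovered - clean) ** 2)))
```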

  2. Cooperative internal conversion process

    CERN Document Server

    Kálmán, Péter

    2015-01-01

    A new phenomenon, called the cooperative internal conversion process, in which the coupling of bound-free electron and neutron transitions due to the dipole term of their Coulomb interaction permits cooperation of two nuclei leading to neutron exchange if it is allowed by energy conservation, is discussed theoretically. A general expression for the cross section of the process is reported in one-particle nuclear and spherical shell models as well as in the case of free atoms (e.g. noble gases). A half-life characteristic of the process is also determined. The case of $Ne$ is investigated numerically. The process may have significance in the fields of nuclear waste disposal and nuclear energy production.

  3. Kuhlthau's Information Search Process.

    Science.gov (United States)

    Shannon, Donna

    2002-01-01

    Explains Kuhlthau's Information Search Process (ISP) model which is based on a constructivist view of learning and provides a framework for school library media specialists for the design of information services and instruction. Highlights include a shift from library skills to information skills; attitudes; process approach; and an interview with…

  4. The Analytical Hierarchy Process

    DEFF Research Database (Denmark)

    Barfod, Michael Bruhn

    2007-01-01

    The technical note gathers the theory behind the Analytical Hierarchy Process (AHP) and presents its advantages and disadvantages in practical use.
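
    A minimal sketch of the core AHP calculation such a note covers: deriving priority weights from a pairwise comparison matrix via its principal eigenvector and checking consistency. The comparison matrix below is an invented example.

```python
# Minimal AHP step: derive criterion weights from a pairwise comparison
# matrix via the principal eigenvector, and compute the consistency ratio.
# The 3x3 comparison matrix below is an invented example.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3., 1.0, 2.0],
              [1/5., 1/2., 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                 # principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                    # normalised priority vector

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)        # consistency index
ri = 0.58                                   # Saaty's random index for n = 3
print("weights:", np.round(weights, 3))
print("consistency ratio:", round(ci / ri, 3))   # < 0.1 is conventionally acceptable
```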

  5. Mineral Processing Technology Roadmap

    Energy Technology Data Exchange (ETDEWEB)

    none,

    2000-09-01

    This document represents the roadmap for Processing Technology Research in the US Mining Industry. It was developed based on the results of a Processing Technology Roadmap Workshop sponsored by the National Mining Association in conjunction with the US Department of Energy, Office of Energy Efficiency and Renewable Energy, Office of Industrial Technologies. The Workshop was held January 24 - 25, 2000.

  6. Technologies for Optical Processing

    DEFF Research Database (Denmark)

    Stubkjær, Kristian

    2008-01-01

    The article consists of a Powerpoint presentation on technologies for optical processing. The paper concludes that the nonlinear elements based on SOA, fibers and waveguide structures have capabilities of simple processing at data rates of 100-600 Gb/s. Switching powers comparable to electronics...

  7. Cognitive Processes and Achievement.

    Science.gov (United States)

    Hunt, Dennis; Randhawa, Bikkar S.

    For a group of 165 fourth- and fifth-grade students, four achievement test scores were correlated with success on nine tests designed to measure three cognitive functions: sustained attention, successive processing, and simultaneous processing. This experiment was designed in accordance with Luria's model of the three functional units of the…

  8. The process of entrepreneurship:

    DEFF Research Database (Denmark)

    Neergaard, Helle

    2003-01-01

    Growing a technology-based new venture is a complex process because these ventures are embedded in turbulent environments that require fast organisational and managerial transformation. This chapter addresses the evolutionary process of such ventures. It seeks to provide insight into the link...

  9. Process Writing Checklist.

    Science.gov (United States)

    Jenks, Christopher J.

    This checklist is designed to help develop writing strategies for English language learners (ELLs), focusing on a variety of linguistic strategies inherent in the writing process. It provides them with a graphical representation of the cognitive process involved in complex writing, promoting self-assessment strategies and integrating oral…

  10. Food processing and allergenicity.

    Science.gov (United States)

    Verhoeckx, Kitty C M; Vissers, Yvonne M; Baumert, Joseph L; Faludi, Roland; Feys, Marcel; Flanagan, Simon; Herouet-Guicheney, Corinne; Holzhauser, Thomas; Shimojo, Ryo; van der Bolt, Nieke; Wichers, Harry; Kimber, Ian

    2015-06-01

    Food processing can have many beneficial effects. However, processing may also alter the allergenic properties of food proteins. A wide variety of processing methods is available and their use depends largely on the food to be processed. In this review the impact of processing (heat and non-heat treatment) on the allergenic potential of proteins, and on the antigenic (IgG-binding) and allergenic (IgE-binding) properties of proteins has been considered. A variety of allergenic foods (peanuts, tree nuts, cows' milk, hens' eggs, soy, wheat and mustard) have been reviewed. The overall conclusion drawn is that processing does not completely abolish the allergenic potential of allergens. Currently, only fermentation and hydrolysis may have potential to reduce allergenicity to such an extent that symptoms will not be elicited, while other methods might be promising but need more data. Literature on the effect of processing on allergenic potential and the ability to induce sensitisation is scarce. This is an important issue since processing may impact on the ability of proteins to cause the acquisition of allergic sensitisation, and the subject should be a focus of future research. Also, there remains a need to develop robust and integrated methods for the risk assessment of food allergenicity.

  11. Process innovation laboratory

    DEFF Research Database (Denmark)

    Møller, Charles

    2007-01-01

    Most organizations today are required not only to operate effective business processes but also to allow for changing business conditions at an increasing rate. Today nearly every business relies on their enterprise information systems (EIS) for process integration and future generations of EIS...

  12. Hyperspectral image processing methods

    Science.gov (United States)

    Hyperspectral image processing refers to the use of computer algorithms to extract, store and manipulate both spatial and spectral information contained in hyperspectral images across the visible and near-infrared portion of the electromagnetic spectrum. A typical hyperspectral image processing work...
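
    A minimal illustration of extracting spectral information per pixel from a hypercube, using a synthetic (rows, columns, bands) array and a simple normalised two-band index; the band positions are invented.

```python
# Minimal hyperspectral example: a synthetic (rows, cols, bands) cube and a
# simple normalised two-band index computed per pixel. Band positions are
# invented for illustration; real work would use calibrated wavelengths.
import numpy as np

rows, cols, bands = 64, 64, 120
rng = np.random.default_rng(2)
cube = rng.random((rows, cols, bands)).astype(np.float32)   # stand-in reflectance cube

NIR_BAND, RED_BAND = 90, 60    # invented band indices near 800 nm and 670 nm
nir = cube[:, :, NIR_BAND]
red = cube[:, :, RED_BAND]

ndvi_like = (nir - red) / (nir + red + 1e-9)   # normalised difference index per pixel
print("index image shape:", ndvi_like.shape, "mean:", float(ndvi_like.mean()))
```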

  13. Flow generating processes

    NARCIS (Netherlands)

    Lanen, van H.A.J.; Fendeková, M.; Kupczyk, E.; Kasprzyk, A.; Pokojski, W.

    2004-01-01

    This chapter starts with an overview of how climatic water deficits affect hydrological processes in different type of catchments. It then continues with a more comprehensive description of drought relevant processes. Two catchments in climatologically contrasting regions are used for illustrative p

  14. WWTP Process Tank Modelling

    DEFF Research Database (Denmark)

    Laursen, Jesper

    hydrofoil shaped propellers. These two sub-processes deliver the main part of the supplied energy to the activated sludge tank, and for this reason they are important for the mixing conditions in the tank. For other important processes occurring in the activated sludge tank, existing models and measurements...

  15. Uranium processing and properties

    CERN Document Server

    2013-01-01

    Covers a broad spectrum of topics and applications that deal with uranium processing and the properties of uranium; offers extensive coverage of both new and established practices for dealing with uranium supplies in nuclear engineering; promotes the documentation of the state-of-the-art processing techniques utilized for uranium and other specialty metals

  16. Relational Processing Following Stroke

    Science.gov (United States)

    Andrews, Glenda; Halford, Graeme S.; Shum, David; Maujean, Annick; Chappell, Mark; Birney, Damian

    2013-01-01

    The research examined relational processing following stroke. Stroke patients (14 with frontal, 30 with non-frontal lesions) and 41 matched controls completed four relational processing tasks: sentence comprehension, Latin square matrix completion, modified Dimensional Change Card Sorting, and n-back. Each task included items at two or three…

  17. Monitoring Business Processes

    Science.gov (United States)

    Bellandi, Valerio; Ceravolo, Paolo; Damiani, Ernesto; Frati, Fulvio

    In this chapter, we introduce the TEKNE Metrics Framework that performs services to monitor business processes. This framework was designed to support the prescription and explanation of these processes. TEKNE's most innovative contribution is managing data expressed in declarative form. To face this challenge, the TEKNE project implemented an infrastructure that relies on declarative Semantic Web technologies designed to be used in distributed systems.

  18. Image processing mini manual

    Science.gov (United States)

    Matthews, Christine G.; Posenau, Mary-Anne; Leonard, Desiree M.; Avis, Elizabeth L.; Debure, Kelly R.; Stacy, Kathryn; Vonofenheim, Bill

    1992-01-01

    The intent is to provide an introduction to the image processing capabilities available at the Langley Research Center (LaRC) Central Scientific Computing Complex (CSCC). Various image processing software components are described. Information is given concerning the use of these components in the Data Visualization and Animation Laboratory at LaRC.

  19. Hybrid quantum information processing

    Energy Technology Data Exchange (ETDEWEB)

    Furusawa, Akira [Department of Applied Physics, School of Engineering, The University of Tokyo (Japan)

    2014-12-04

    I will briefly explain the definition and advantage of hybrid quantum information processing, which is the hybridization of qubit and continuous-variable technologies. The final goal would be the realization of universal gate sets both for qubit and continuous-variable quantum information processing with the hybrid technologies. For that purpose, qubit teleportation with a continuous-variable teleporter is one of the most important ingredients.

  20. Organic food processing

    DEFF Research Database (Denmark)

    Kahl, Johannes; Alborzi, Farnaz; Beck, Alexander

    2014-01-01

    In 2007 EU Regulation (EC) 834/2007 introduced principles and criteria for organic food processing. These regulations have been analysed and discussed in several scientific publications and research project reports. Recently, organic food quality was described by principles, aspects and criteria. These principles from organic agriculture were verified and adapted for organic food processing. Different levels for evaluation were suggested. In another document, underlying paradigms and consumer perception of organic food were reviewed against functional food, resulting in identifying integral product ... to evaluate processing methods. Therefore the goal of this paper is to describe and discuss the topic of organic food processing to make it operational. A conceptual background for organic food processing is given by verifying the underlying paradigms and principles of organic farming and organic food as well...

  1. Beryllium chemistry and processing

    CERN Document Server

    Walsh, Kenneth A

    2009-01-01

    This book introduces beryllium; its history, its chemical, mechanical, and physical properties including nuclear properties. The 29 chapters include the mineralogy of beryllium and the preferred global sources of ore bodies. The identification and specifics of the industrial metallurgical processes used to form oxide from the ore and then metal from the oxide are thoroughly described. The special features of beryllium chemistry are introduced, including analytical chemical practices. Beryllium compounds of industrial interest are identified and discussed. Alloying, casting, powder processing, forming, metal removal, joining and other manufacturing processes are covered. The effect of composition and process on the mechanical and physical properties of beryllium alloys assists the reader in material selection. The physical metallurgy chapter brings conformity between chemical and physical metallurgical processing of beryllium, metal, alloys, and compounds. The environmental degradation of beryllium and its all...

  2. Posttranslational processing of progastrin

    DEFF Research Database (Denmark)

    Bundgaard, Jens René; Rehfeld, Jens F.

    2010-01-01

    Gastrin and cholecystokinin (CCK) are homologous hormones with important functions in the brain and the gut. Gastrin is the main regulator of gastric acid secretion and gastric mucosal growth, whereas cholecystokinin regulates gall bladder emptying, pancreatic enzyme secretion and besides acts... processing progastrin is often greatly disturbed in neoplastic cells. The posttranslational phase of the biogenesis of gastrin and the various progastrin products in gastrin gene-expressing tissues is now reviewed here. In addition, the individual contributions of the processing enzymes are discussed, as are structural features of progastrin that are involved in the precursor activation process. Thus, the review describes how the processing depends on the cell-specific expression of the processing enzymes and kinetics in the secretory pathway.

  3. Managing Software Process Evolution

    DEFF Research Database (Denmark)

    This book focuses on the design, development, management, governance and application of evolving software processes that are aligned with changing business objectives, such as expansion to new domains or shifting to global production. In the context of an evolving business world, it examines the complete software process lifecycle, from the initial definition of a product to its systematic improvement. In doing so, it addresses difficult problems, such as how to implement processes in highly regulated domains or where to find a suitable notation system for documenting processes, and provides essential insights and tips to help readers manage process evolutions. And last but not least, it provides a wealth of examples and cases on how to deal with software evolution in practice. Reflecting these topics, the book is divided into three parts. Part 1 focuses on software business transformation...

  4. Business Process Management

    Science.gov (United States)

    Mendling, Jan

    The recent progress of Business Process Management (BPM) is reflected by the figures of the related industry. Wintergreen Research estimates that the international market for BPM-related software and services accounted for more than USD 1 billion in 2005 with a tendency towards rapid growth in the subsequent couple of years [457]. The relevance of business process modeling to general management initiatives has been previously studied in the 1990s [28]. Today, Gartner finds that organizations that had the best results in implementing business process management spent more than 40 percent of the total project time on discovery and construction of their initial process model [265]. As a consequence, Gartner considers Business Process Modeling to be among the Top 10 Strategic Technologies for 2008.

  5. New Processes for Annulation

    Institute of Scientific and Technical Information of China (English)

    Liu Hsing-Jang

    2004-01-01

    Making use of the high propensity of 2-cyano-2-cycloalkenones to undergo conjugate addition with various carbanions and the high reactivity of the ensuing α-cyano ketone system, a number of new annulation processes have been developed recently in our laboratories. As shown in Eq. 1 (n=1) with a specific example, one such process involves the addition of 3-butenylmagnesium bromide, followed by a palladium (Ⅱ) acetate mediated oxidative cyclization, to facilitate methylenecyclopentane ring formation. This annulation process could be readily extended to effect methylenecyclohexane ring formation (Eq. 1, n=2), using 4-pentenylmagnesium bromide as the initial reagent, and to install the carbomethoxy-substituted methylenecyclopentane and methylenecyclohexane rings, using the carbanions derived from methyl 4-pentenoate and methyl 5-hexenoate, respectively (Eq. 2). In another annulation process, the addition of the enolate of methyl 5-chloropentanoate is involved initially, and the ring formation is readily effected by an intramolecular alkylation process. A specific example is given in Eq. 3.

  6. Business process support

    Energy Technology Data Exchange (ETDEWEB)

    Carle, Adriana; Fiducia, Daniel [Transportadora de Gas del Sur S.A. (TGS), Buenos Aires (Argentina)

    2005-07-01

    This paper is about the in-house development of business support software. The developed applications are used to support two business processes: gas transportation and natural gas processing. This software has interfaces with the SAP ERP system, SCADA software and online gas transportation simulation software. The main functionalities of the applications are: online, real-time entry of clients' transport nominations; transport programming; allocation of the clients' transport nominations; transport control; measurements; pipeline balancing; allocation of gas volumes to the gas processing plants; and calculation of the tons of product processed in each plant and the tons of product distributed to clients. All the developed software generates information for internal staff, regulatory authorities and clients. (author)

  7. Nonhomogeneous fractional Poisson processes

    Energy Technology Data Exchange (ETDEWEB)

    Wang Xiaotian [School of Management, Tianjin University, Tianjin 300072 (China)]. E-mail: swa001@126.com; Zhang Shiying [School of Management, Tianjin University, Tianjin 300072 (China); Fan Shen [Computer and Information School, Zhejiang Wanli University, Ningbo 315100 (China)

    2007-01-15

    In this paper, we propose a class of non-Gaussian stationary increment processes, named nonhomogeneous fractional Poisson processes W{sub H}{sup (j)}(t), which permit the study of the effects of long-range dependence in a large number of fields including quantum physics and finance. The processes W{sub H}{sup (j)}(t) are self-similar in a wide sense, exhibit fatter tails than Gaussian processes, and converge to Gaussian processes in distribution in some cases. In addition, we also show that the intensity function {lambda}(t) strongly influences the existence of the highest finite moment of W{sub H}{sup (j)}(t) and the behaviour of the tail probability of W{sub H}{sup (j)}(t).
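
    The processes proposed here are driven by an intensity function {lambda}(t). A standard way to simulate an ordinary nonhomogeneous Poisson process with a given intensity is Lewis-Shedler thinning, sketched below; this is the generic algorithm, not the authors' fractional construction, and the intensity used is illustrative.

```python
# Lewis-Shedler thinning for an ordinary nonhomogeneous Poisson process with
# intensity lambda(t). This is the generic simulation algorithm, not the
# authors' fractional construction W_H(t); the intensity below is illustrative.
import math
import random

def simulate_nhpp(intensity, t_max, lam_max, seed=0):
    """Return event times of a nonhomogeneous Poisson process on [0, t_max],
    where intensity(t) <= lam_max for all t."""
    rng = random.Random(seed)
    t, events = 0.0, []
    while True:
        t += rng.expovariate(lam_max)                 # candidate from a rate-lam_max process
        if t > t_max:
            return events
        if rng.random() <= intensity(t) / lam_max:    # accept with prob lambda(t)/lam_max
            events.append(t)

if __name__ == "__main__":
    lam = lambda t: 2.0 + math.sin(t)                 # illustrative intensity, bounded by 3
    times = simulate_nhpp(lam, t_max=10.0, lam_max=3.0)
    print(len(times), "events; first few:", [round(x, 2) for x in times[:5]])
```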

  8. Conceptualizing operations strategy processes

    DEFF Research Database (Denmark)

    Rytter, Niels Gorm; Boer, Harry; Koch, Christian

    2007-01-01

    Purpose - The purpose of this paper is to present insights into operations strategy (OS) in practice. It outlines a conceptualization and model of OS processes and, based on findings from an in-depth and longitudinal case study, contributes to further development of extant OS models and methods...; taking place in five dimensions of change - technical-rational, cultural, political, project management, and facilitation; and typically unfolding as a sequential and parallel, ordered and disordered, planned and emergent as well as top-down and bottom-up process. The proposed OS conceptualization... outcomes for an OS process in practice, change agents may need to moderate their outcome ambitions, manage process dimensions and agendas in a situational manner, balance inherent process paradoxes, strive at bridging both language and reality, as well as mobilizing key stakeholders, especially middle...

  9. Branching processes in biology

    CERN Document Server

    Kimmel, Marek

    2015-01-01

    This book provides a theoretical background of branching processes and discusses their biological applications. Branching processes are a well-developed and powerful set of tools in the field of applied probability. The range of applications considered includes molecular biology, cellular biology, human evolution and medicine. The branching processes discussed include Galton-Watson, Markov, Bellman-Harris, Multitype, and General Processes. As an aid to understanding specific examples, two introductory chapters, and two glossaries are included that provide background material in mathematics and in biology. The book will be of interest to scientists who work in quantitative modeling of biological systems, particularly probabilists, mathematical biologists, biostatisticians, cell biologists, molecular biologists, and bioinformaticians. The authors are a mathematician and cell biologist who have collaborated for more than a decade in the field of branching processes in biology for this new edition. This second ex...
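
    A minimal simulation of the simplest process in the family described above, the Galton-Watson process, with an assumed Poisson offspring distribution; the offspring mean is an invented example.

```python
# Minimal Galton-Watson branching process: each individual leaves a Poisson-
# distributed number of offspring. The offspring mean is an invented example;
# with mean <= 1 the process dies out with probability one.
import math
import random

def poisson(rng: random.Random, lam: float) -> int:
    """Knuth's simple Poisson sampler (adequate for small lam)."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while p > threshold:
        k += 1
        p *= rng.random()
    return k - 1

def galton_watson(mean_offspring: float, generations: int, seed: int = 0) -> list:
    """Return the population size in each generation, starting from one individual."""
    rng = random.Random(seed)
    sizes = [1]
    for _ in range(generations):
        sizes.append(sum(poisson(rng, mean_offspring) for _ in range(sizes[-1])))
    return sizes

print(galton_watson(mean_offspring=1.1, generations=15))
```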

  10. Formed HIP Can Processing

    Energy Technology Data Exchange (ETDEWEB)

    Clarke, Kester Diederik [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-07-27

    The intent of this report is to document a procedure used at LANL for HIP bonding aluminum cladding to U-10Mo fuel foils using a formed HIP can for the Domestic Reactor Conversion program in the NNSA Office of Material Management and Minimization, and to provide some details that may not have been published elsewhere. The HIP process is based on the procedures that have been used to develop the formed HIP can process, including the baseline process developed at Idaho National Laboratory (INL). The HIP bonding cladding process development is summarized in the listed references. Further iterations with Babcock & Wilcox (B&W) to refine the process to meet production and facility requirements are expected.

  11. Heavy oils processing materials requirements crude processing

    Energy Technology Data Exchange (ETDEWEB)

    Sloley, Andrew W. [CH2M Hill, Englewood, CO (United States)

    2012-07-01

    Over time, recommended best practices for crude unit materials selection have evolved to accommodate new operating requirements, feed qualities, and product qualities. The shift to heavier oil processing is one of the major changes in crude feed quality occurring over the last 20 years. The three major types of crude unit corrosion include sulfidation attack, naphthenic acid attack, and corrosion resulting from hydrolyzable chlorides. Heavy oils processing makes all three areas worse. Heavy oils have higher sulfur content; higher naphthenic acid content; and are more difficult to desalt, leading to higher chloride corrosion rates. Materials selection involves two major criteria, meeting required safety standards, and optimizing economics of the overall plant. Proper materials selection is only one component of a plant integrity approach. Materials selection cannot eliminate all corrosion. Proper materials selection requires appropriate support from other elements of an integrity protection program. The elements of integrity preservation include: materials selection (type and corrosion allowance); management limits on operating conditions allowed; feed quality control; chemical additives for corrosion reduction; and preventive maintenance and inspection (PMI). The following discussion must be taken in the context of the application of required supporting work in all the other areas. Within that context, specific materials recommendations are made to minimize corrosion due to the most common causes in the crude unit. (author)

  12. The Isasmelt process

    Energy Technology Data Exchange (ETDEWEB)

    Barrett, K.R. (MIM Technology Marketing Ltd., Northfleet (United Kingdom))

    1993-01-01

    The Isasmelt process was developed at Mt Isa Mines Ltd. in Queensland. The process was initially developed for the treatment of lead concentrate. After successful application of the process to lead production, a pilot plant was built for the treatment of copper concentrate to produce copper matte. This was successful and as a result Mt Isa Mines decided to build a new copper smelter with a capacity of 180,000 t copper/a in copper matte. Further commercialisation of the process has resulted in the construction of further plants for lead, nickel and copper production. Mt Isa Mines Ltd. has been associated with CSIRO (Commonwealth Scientific Industrial Research Organisation) in the development of the Isasmelt process since 1977, when a Sirosmelt lance was tested for reducing copper converter slags. Thermodynamic modelling and crucible scale investigations on a lead smelting process were initiated in 1978. After further work on a 120 kg/h pilot plant, the Isasmelt process for lead concentrate smelting was patented jointly by Mt Isa Mines and CSIRO. Since then a 5 t/h demonstration plant was commissioned in 1983/85 for smelting and reduction. Finally, in 1991 a commercial scale plant with a capacity of 60,000 t/a was commissioned. (orig.).

  13. Revealing the programming process

    DEFF Research Database (Denmark)

    Bennedsen, Jens; Caspersen, Michael Edelgaard

    2005-01-01

    One of the most important goals of an introductory programming course is that the students learn a systematic approach to the development of computer programs. Revealing the programming process is an important part of this; however, textbooks do not address the issue -- probably because the textbook medium is static and therefore ill-suited to expose the process of programming. We have found that process recordings in the form of captured narrated programming sessions are a simple, cheap, and efficient way of providing the revelation. We identify seven different elements of the programming process for which process recordings are a valuable communication medium in order to enhance the learning process. Student feedback indicates both high learning outcome and superior learning potential compared to traditional classroom teaching.

  14. States in Process Calculi

    Directory of Open Access Journals (Sweden)

    Christoph Wagner

    2014-08-01

    Formal reasoning about distributed algorithms (like Consensus) typically requires analyzing global states in a traditional state-based style. This is in contrast to the traditional action-based reasoning of process calculi. Nevertheless, we use domain-specific variants of the latter, as they are convenient modeling languages in which the local code of processes can be programmed explicitly, with the local state information usually managed via parameter lists of process constants. However, domain-specific process calculi are often equipped with (unlabeled) reduction semantics, building upon a rich and convenient notion of structural congruence. Unfortunately, the price for this convenience is that the analysis is cumbersome: the set of reachable states is only defined modulo structural congruence, and the processes' state information is very hard to identify. We extract from congruence classes of reachable states individual state-informative representatives that we supply with a proper formal semantics. As a result, we can now freely switch between process calculus terms and their representatives, and we can use the stateful representatives to perform assertional reasoning on process calculus models.

  15. Semi-Markov processes

    CERN Document Server

    Grabski

    2014-01-01

    Semi-Markov Processes: Applications in System Reliability and Maintenance is a modern view of discrete state space and continuous time semi-Markov processes and their applications in reliability and maintenance. The book explains how to construct semi-Markov models and discusses the different reliability parameters and characteristics that can be obtained from those models. The book is a useful resource for mathematicians, engineering practitioners, and PhD and MSc students who want to understand the basic concepts and results of semi-Markov process theory. Clearly defines the properties and
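
    A minimal sketch of the discrete-state, continuous-time setting the book treats: a two-state (up/down) semi-Markov model in which holding times are non-exponential. The transition structure and holding-time distributions are invented for illustration.

```python
# Minimal two-state semi-Markov simulation: the embedded chain chooses the next
# state, and the holding time in each state has its own (non-exponential)
# distribution. Transition probabilities and holding times are invented.
import random

P = {"up": {"up": 0.0, "down": 1.0},         # embedded Markov chain
     "down": {"up": 1.0, "down": 0.0}}
HOLD = {"up": lambda rng: rng.weibullvariate(100.0, 1.5),   # time to failure
        "down": lambda rng: rng.lognormvariate(1.0, 0.5)}   # repair time

def simulate(t_max: float, seed: int = 0):
    rng, t, state, history = random.Random(seed), 0.0, "up", []
    while t < t_max:
        dwell = HOLD[state](rng)
        history.append((state, t, min(t + dwell, t_max)))
        t += dwell
        state = rng.choices(list(P[state]), weights=list(P[state].values()))[0]
    return history

hist = simulate(500.0)
uptime = sum(end - start for s, start, end in hist if s == "up")
print(f"estimated availability: {uptime / 500.0:.3f}")
```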

  16. Transnational Learning Processes

    DEFF Research Database (Denmark)

    Nedergaard, Peter

    This paper analyses and compares the transnational learning processes in the employment field in the European Union and among the Nordic countries. Based theoretically on a social constructivist model of learning and methodologically on a questionnaire distributed to the relevant participants, a number of hypotheses concerning transnational learning processes are tested. The paper closes with a number of suggestions regarding an optimal institutional setting for facilitating transnational learning processes. Key words: Transnational learning, Open Method of Coordination, Learning, Employment, European Employment Strategy, European Union, Nordic countries.

  17. Plasma processing for VLSI

    CERN Document Server

    Einspruch, Norman G

    1984-01-01

    VLSI Electronics: Microstructure Science, Volume 8: Plasma Processing for VLSI (Very Large Scale Integration) discusses the utilization of plasmas for general semiconductor processing. It also includes expositions on advanced deposition of materials for metallization, lithographic methods that use plasmas as exposure sources and for multiple resist patterning, and device structures made possible by anisotropic etching.This volume is divided into four sections. It begins with the history of plasma processing, a discussion of some of the early developments and trends for VLSI. The second section

  18. Getting Started with Processing

    CERN Document Server

    Reas, Casey

    2010-01-01

    Learn computer programming the easy way with Processing, a simple language that lets you use code to create drawings, animation, and interactive graphics. Programming courses usually start with theory, but this book lets you jump right into creative and fun projects. It's ideal for anyone who wants to learn basic programming, and serves as a simple introduction to graphics for people with some programming skills. Written by the founders of Processing, this book takes you through the learning process one step at a time to help you grasp core programming concepts. You'll learn how to sketch wi

  19. Chemical Processing Manual

    Science.gov (United States)

    Beyerle, F. J.

    1972-01-01

    Chemical processes presented in this document include cleaning, pickling, surface finishes, chemical milling, plating, dry film lubricants, and polishing. All types of chemical processes applicable to aluminum, for example, are to be found in the aluminum alloy section. There is a separate section for each category of metallic alloy plus a section for non-metals, such as plastics. The refractories, super-alloys and titanium are prime candidates for the space shuttle; therefore, the chemical processes applicable to these alloys are contained in individual sections of this manual.

  20. The image processing handbook

    CERN Document Server

    Russ, John C

    2006-01-01

    Now in its fifth edition, John C. Russ's monumental image processing reference is an even more complete, modern, and hands-on tool than ever before. The Image Processing Handbook, Fifth Edition is fully updated and expanded to reflect the latest developments in the field. Written by an expert with unequalled experience and authority, it offers clear guidance on how to create, select, and use the most appropriate algorithms for a specific application. What's new in the Fifth Edition? A new chapter on the human visual process that explains which visual cues elicit a response from the vie

  1. Study on Glulam Process

    Institute of Scientific and Technical Information of China (English)

    PENG Limin; WANG Haiqing; HE Weili

    2006-01-01

    This paper selected lumbers of Manchurian ash (Fraxinus mandshurica), Manchurian walnut (Juglans mandshurica) and Spruce (Picea jezoensis var. komarovii) for manufacturing glulam with a water-borne polymeric-isocyanate adhesive in order to determine process variables. The process variables influencing the shear strength of the glulam, namely specific pressure, pressing time and adhesive application amount, were investigated through an orthogonal test. The results indicated that the optimum process variables for glulam manufacturing were as follows: a specific pressure of 1.5 MPa for Spruce and 2.0 MPa for both Manchurian ash and Manchurian walnut, a pressing time of 60 min and an adhesive application amount of 250 g/m2.

  2. Digital Differential Geometry Processing

    Institute of Scientific and Technical Information of China (English)

    Xin-Guo Liu; Hu-Jun Bao; Qun-Sheng Peng

    2006-01-01

    The theory and methods of digital geometry processing have been a hot research area in computer graphics, as geometric models serve as the core data for 3D graphics applications. The purpose of this paper is to introduce some recent advances in digital geometry processing, particularly mesh fairing, surface parameterization and mesh editing, that heavily use differential geometry quantities. Some related concepts from differential geometry, such as the normal, curvature, gradient, Laplacian and their counterparts on digital geometry, are also reviewed for understanding the strengths and weaknesses of various digital geometry processing methods.
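
    A minimal example of one such quantity in the discrete setting: a uniform (graph) Laplacian on a tiny triangle mesh, used for one explicit step of Laplacian smoothing (mesh fairing). Practical pipelines usually use cotangent weights; the tetrahedron here is only illustrative.

```python
# Uniform (graph) Laplacian smoothing on a tiny triangle mesh (a tetrahedron).
# Real mesh-fairing pipelines typically use cotangent weights instead of the
# uniform weights used here; this only illustrates the discrete Laplacian idea.
import numpy as np

vertices = np.array([[0.0, 0.0, 0.0],
                     [1.0, 0.0, 0.0],
                     [0.0, 1.0, 0.0],
                     [0.0, 0.0, 1.0]])
faces = np.array([[0, 1, 2], [0, 1, 3], [0, 2, 3], [1, 2, 3]])

# Build one-ring neighbourhoods from the face list.
neighbours = [set() for _ in vertices]
for a, b, c in faces:
    neighbours[a].update((b, c)); neighbours[b].update((a, c)); neighbours[c].update((a, b))

def laplacian_smooth(v: np.ndarray, step: float = 0.5) -> np.ndarray:
    """One explicit smoothing step: move each vertex toward its neighbour average."""
    out = v.copy()
    for i, nbrs in enumerate(neighbours):
        delta = v[list(nbrs)].mean(axis=0) - v[i]   # uniform Laplacian at vertex i
        out[i] = v[i] + step * delta
    return out

print(laplacian_smooth(vertices))
```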

  3. Nano integrated circuit process

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Yung Sup

    2004-02-15

    This book contains nine chapters covering: an introduction to the manufacture of semiconductor chips; oxidation, including dry oxidation, wet oxidation, oxidation models and oxide films; diffusion, including the diffusion process, diffusion equation, diffusion coefficient and diffusion systems; ion implantation, including ion distribution, channeling, multi-implantation, masking and implantation systems; sputtering, such as CVD and PVD; lithography; wet etch and dry etch; interconnection and flattening, including metal-silicon connections, silicide, multiple layer metal processes and flattening; and the integrated circuit process, including MOSFET and CMOS.

  4. Irreversible processes kinetic theory

    CERN Document Server

    Brush, Stephen G

    2013-01-01

    Kinetic Theory, Volume 2: Irreversible Processes deals with the kinetic theory of gases and the irreversible processes they undergo. It includes the two papers by James Clerk Maxwell and Ludwig Boltzmann in which the basic equations for transport processes in gases are formulated, together with the first derivation of Boltzmann's "H-theorem" and a discussion of this theorem, along with the problem of irreversibility. Comprised of 10 chapters, this volume begins with an introduction to the fundamental nature of heat and of gases, along with Boltzmann's work on the kinetic theory of gases and s

  5. Ultrasonic Processing of Materials

    Science.gov (United States)

    Han, Qingyou

    2015-08-01

    Irradiation of high-energy ultrasonic vibration in metals and alloys generates oscillating strain and stress fields in solids, and introduces nonlinear effects such as cavitation, acoustic streaming, and radiation pressure in molten materials. These nonlinear effects can be utilized to assist conventional material processing processes. This article describes recent research at Oak Ridge National Labs and Purdue University on using high-intensity ultrasonic vibrations for degassing molten aluminum, processing particulate-reinforced metal matrix composites, refining metals and alloys during solidification process and welding, and producing bulk nanostructures in solid metals and alloys. Research results suggest that high-intensity ultrasonic vibration is capable of degassing and dispersing small particles in molten alloys, reducing grain size during alloy solidification, and inducing nanostructures in solid metals.

  6. Processer i undervisningen

    DEFF Research Database (Denmark)

    Bundsgaard, Jeppe

    The study focuses on processes in teaching, and through this on how digital learning materials can support or be integrated into typical processes. The study is based on participant observation at Abildgårdskolen in Odense. Through the observations, a number of examples of ... challenges in carrying out the teaching processes were identified, and suggestions were made for digital learning materials that are expected to be able to support these processes. The study also shows how a focus on processes can serve as a method for user-driven innovation.

  7. Processed Products Database System

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Collection of annual data on processed seafood products. The Division provides authoritative advice, coordination and guidance on matters related to the collection,...

  8. Reconfigurable network processing platforms

    NARCIS (Netherlands)

    Kachris, C.

    2007-01-01

    This dissertation presents our investigation on how to efficiently exploit reconfigurable hardware to design flexible, high performance, and power efficient network devices capable to adapt to varying processing requirements of network applications and traffic. The proposed reconfigurable network pr

  9. Quantum processes in semiconductors

    CERN Document Server

    Ridley, B K

    2013-01-01

    Aimed at graduate students, this is a guide to quantum processes of importance in the physics and technology of semiconductors. The fifth edition includes new chapters that expand the coverage of semiconductor physics relevant to its accompanying technology.

  10. CAPSULE REPORT: EVAPORATION PROCESS

    Science.gov (United States)

    Evaporation has been an established technology in the metal finishing industry for many years. In this process, wastewaters containing reusable materials, such as copper, nickel, or chromium compounds are heated, producing a water vapor that is continuously removed and condensed....

  11. Phenol removal pretreatment process

    Science.gov (United States)

    Hames, Bonnie R.

    2004-04-13

    A process for removing phenols from an aqueous solution is provided, which comprises the steps of contacting a mixture comprising the solution and a metal oxide, forming a phenol metal oxide complex, and removing the complex from the mixture.

  12. Logistics Innovation Process Revisited

    DEFF Research Database (Denmark)

    Gammelgaard, Britta; Su, Shong-Iee Ivan; Yang, Su-Lan

    2011-01-01

    Purpose – The purpose of this paper is to learn more about logistics innovation processes and their implications for the focal organization as well as the supply chain, especially suppliers. Design/methodology/approach – The empirical basis of the study is a longitudinal action research project that was triggered by the practical needs of new ways of handling material flows of a hospital. This approach made it possible to revisit theory on logistics innovation process. Findings – Apart from the tangible benefits reported to the case hospital, five findings can be extracted from this study: the logistics innovation process model may include not just customers but also suppliers; logistics innovation in buyer-supplier relations may serve as an alternative to outsourcing; logistics innovation processes are dynamic and may improve supplier partnerships; logistics innovations in the supply chain are as dependent...

  13. SIMULATION OF LOGISTICS PROCESSES

    Directory of Open Access Journals (Sweden)

    Yu. Taranenko

    2016-08-01

    The article deals with the theoretical basis of simulation. The study shows that the simulation of logistics processes in industrial countries is an integral part of many economic projects aimed at the creation or improvement of logistics systems. The paper used the Beer Game model for the management of logistics processes in the enterprise. The simulation model is implemented in the AnyLogic package. The AnyLogic product allows us to consider logistics processes as an integrated system, which allows reaching better solutions. Logistics process management involves pooling the sales market, production and distribution to ensure the required level of customer service at the lowest overall cost. This made it possible to conduct experiments and to determine the optimal size of the warehouse at the lowest cost.
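
    A minimal single-echelon analogue of the Beer Game dynamics described above (order delay, inventory, backlog, and a simple ordering rule); all parameters are invented, and the paper's actual model runs in AnyLogic.

```python
# Minimal single-echelon analogue of Beer Game dynamics: a retailer with an
# order (shipping) delay, inventory and backlog, using a simple base-stock
# ordering rule. All parameters are invented; the paper's model is in AnyLogic.
from collections import deque

WEEKS, DELAY, BASE_STOCK = 30, 2, 20
inventory, backlog = 12, 0
pipeline = deque([4] * DELAY, maxlen=DELAY)     # orders already in transit

for week in range(WEEKS):
    demand = 4 if week < 5 else 8               # the classic demand step
    inventory += pipeline.popleft()             # receive the oldest outstanding order
    shipped = min(demand + backlog, inventory)
    backlog = backlog + demand - shipped
    inventory -= shipped
    order = max(0, BASE_STOCK - inventory - sum(pipeline) + backlog)  # base-stock rule
    pipeline.append(order)
    print(f"week {week:2d}: inv={inventory:3d} backlog={backlog:3d} order={order:3d}")
```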

  14. Ultrahigh bandwidth signal processing

    DEFF Research Database (Denmark)

    Oxenløwe, Leif Katsuo

    2016-01-01

    Optical time lenses have proven to be very versatile for advanced optical signal processing. Based on a controlled interplay between dispersion and phase-modulation by e.g. four-wave mixing, the processing is phase-preserving, and hence useful for all types of data signals including coherent multi......-level modulation formats. This has enabled processing of phase-modulated spectrally efficient data signals, such as orthogonal frequency division multiplexed (OFDM) signals. In that case, a spectral telescope system was used, using two time lenses with different focal lengths (chirp rates), yielding a spectral...... regeneration. These operations require a broad-bandwidth nonlinear platform, and novel photonic integrated nonlinear platforms such as aluminum gallium arsenide nano-waveguides used for 1.28 Tbaud optical signal processing will be described....
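
    For orientation, the "focal lengths (chirp rates)" wording above can be related to the standard space-time-duality description of temporal imaging; the relations below are the textbook lens analogy (with input and output group-delay dispersions $D_1$, $D_2$ and focal dispersion $D_f$), not formulas quoted from this record:

        \[
            \frac{1}{D_1} + \frac{1}{D_2} = \frac{1}{D_f},
            \qquad
            M = -\frac{D_2}{D_1},
        \]

    so that using two time lenses with different chirp rates changes the magnification $M$ applied to the signal spectrum.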

  15. Radiation processing in Japan

    Energy Technology Data Exchange (ETDEWEB)

    Makuuchi, Keizo [Japan Atomic Energy Research Inst., Takasaki, Gunma (Japan). Takasaki Radiation Chemistry Research Establishment

    2001-03-01

    The economic scale of radiation applications in the fields of industry, agriculture and medicine in Japan in 1997 was investigated to compare their economic impact with that of the nuclear energy industry. The total production value of radiation applications accounted for 54% of the nuclear industry as a whole, which includes the nuclear energy industry and radiation applications in the three fields above. Industrial radiation applications were further divided into five groups, namely nondestructive testing, RI instruments, radiation facilities, radiation processing and ion beam processing. More than 70% of the total production value was brought about by ion beam processing for use with ICs and semiconductors. The future economic prospects of radiation processing of polymers (for example cross-linking, EB curing, graft polymerization and degradation) are reviewed. Particular attention was paid to radiation vulcanization of natural rubber latex and also to degradation of natural polymers. (S. Ohno)

  16. IT Project Prioritization Process

    DEFF Research Database (Denmark)

    Shollo, Arisa; Constantiou, Ioanna

    2013-01-01

    In most large companies, the IT project prioritization process is designed based on principles of evidence-based management. We investigate a case of IT project prioritization in a financial institution, and in particular, how managers practice evidence-based management during this process. We use...... a rich dataset built from a longitudinal study of the prioritization process for the IT projects. Our findings indicate that managers reach a decision not only by using evidence but also from the interplay between the evidence and the judgment devices that managers employ. The interplay between evidence...... and judgment devices is manifested in three ways: supplementing, substituting, and interpreting evidence. We show that while evidence does not fully determine the decision, it plays a central role in discussions, reflections, and negotiations during the IT prioritization process....

  17. Cooperative processing data bases

    Science.gov (United States)

    Hasta, Juzar

    1991-01-01

    Cooperative processing for the 1990's using client-server technology is addressed. The main theme is concepts of downsizing from mainframes and minicomputers to workstations on a local area network (LAN). This document is presented in view graph form.

  18. TERMINOLOGY IN PROCESS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Igor G. Fedorov

    2013-01-01

    Full Text Available If we formulate the basic concepts of process management incorrectly, we risk taking the wrong path in solving the targeted problems – instead of process management we may end up with mere automation, and instead of a process-oriented system we may introduce a function-oriented one. Without a clear idea of the model we have to execute, we may design it as a purely analytical model and fail to include all the tools necessary for management at the planning stage. The article is aimed at analysts who have skills in the analytical modeling of business processes and would like to take a step forward towards the implementation of these models. To become professionals in this field, it is first of all necessary to learn the terminology.

  19. Essentials of stochastic processes

    CERN Document Server

    Durrett, Richard

    2016-01-01

    Building upon the previous editions, this textbook is a first course in stochastic processes taken by undergraduate and graduate students (MS and PhD students from math, statistics, economics, computer science, engineering, and finance departments) who have had a course in probability theory. It covers Markov chains in discrete and continuous time, Poisson processes, renewal processes, martingales, and option pricing. One can only learn a subject by seeing it in action, so there are a large number of examples and more than 300 carefully chosen exercises to deepen the reader’s understanding. Drawing from teaching experience and student feedback, there are many new examples and problems with solutions that use TI-83 to eliminate the tedious details of solving linear equations by hand, and the collection of exercises is much improved, with many more biological examples. Originally included in previous editions, material too advanced for this first course in stochastic processes has been eliminated while treatm...

  20. Assessing Process and Product

    DEFF Research Database (Denmark)

    Bennedsen, Jens; Caspersen, Michael E.

    2006-01-01

    The final assessment of a course must reflect its goals and contents. An important goal of our introductory programming course is that the students learn a systematic approach for the development of computer programs. Having the programming process as a learning objective naturally raises...... the question of how to include this in assessments. Traditional assessments (e.g. oral, written, or multiple choice) are unsuitable for testing the programming process. We describe and evaluate a practical lab examination that assesses the students' programming process as well as the developed programs...

  1. Markovian risk process

    Institute of Scientific and Technical Information of China (English)

    WANG Han-xing; YAN Yun-zhi; ZHAO Fei; FANG Da-fan

    2007-01-01

    A Markovian risk process is considered in this paper, which is a generalization of the classical risk model. It is appropriate to model a risk process with large claims as a Markovian risk model. In such a model, the occurrence of claims is described by a point process {N(t)}t≥0, with N(t) being the number of jumps during the interval (0, t] for a Markov jump process. The ruin probability Ψ(u) of a company facing such a risk model is mainly studied. An integral equation satisfied by the ruin probability function Ψ(u) is obtained, and bounds for the convergence rate of the ruin probability Ψ(u) are given by using a generalized renewal technique developed in the paper.
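
    For readers unfamiliar with the notation, the ruin probability in a risk model of this kind is conventionally defined as follows (a generic formulation consistent with the description above, not quoted from the paper): with initial capital $u$, premium rate $c$, claim sizes $X_i$ and claim-counting process $N(t)$,

        \[
            U(t) = u + ct - \sum_{i=1}^{N(t)} X_i,
            \qquad
            \Psi(u) = \mathbb{P}\Bigl(\inf_{t \ge 0} U(t) < 0 \,\Big|\, U(0) = u\Bigr).
        \]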

  2. Fractional Pure Birth Processes

    CERN Document Server

    Orsingher, Enzo; 10.3150/09-BEJ235

    2010-01-01

    We consider a fractional version of the classical non-linear birth process of which the Yule-Furry model is a particular case. Fractionality is obtained by replacing the first-order time derivative in the difference-differential equations which govern the probability law of the process, with the Dzherbashyan-Caputo fractional derivative. We derive the probability distribution of the number $ \\mathcal{N}_\
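
    Written out under the usual conventions (state probabilities $p_k(t)$, birth rates $\lambda_k$, fractional order $0 < \nu \le 1$), the fractionalisation described above amounts to the following sketch, with the Dzherbashyan-Caputo derivative replacing the first-order time derivative in the governing difference-differential equations:

        \[
            \frac{d^{\nu}}{dt^{\nu}}\, p_k(t) = -\lambda_k\, p_k(t) + \lambda_{k-1}\, p_{k-1}(t),
            \qquad
            \frac{d^{\nu}}{dt^{\nu}} f(t) = \frac{1}{\Gamma(1-\nu)} \int_0^t \frac{f'(s)}{(t-s)^{\nu}}\, ds .
        \]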

  3. Harmonizable Processes: Structure.

    Science.gov (United States)

    1980-11-05

    A related result of Thomas ([39], p. 146) is noted; however, the Bourbaki set-up of these papers is inconvenient here, and they will be converted to the set ...of processes. Introduction: recently there have been significant attempts at extending the well-understood theory of stationary processes...characterizations of the respective classes. This involves a free use of some elementary aspects of vector measure theory; and it already raises some interesting

  4. Process for compound transformation

    KAUST Repository

    Basset, Jean-Marie

    2016-12-29

    Embodiments of the present disclosure provide for methods of using a catalytic system to chemically transform a compound (e.g., a hydrocarbon). In an embodiment, the method does not employ grafting the catalyst prior to catalysis. In particular, embodiments of the present disclosure provide for a process of hydrocarbon (e.g., C1 to C20 hydrocarbon) metathesis (e.g., alkane, olefin, or alkyne metathesis) transformation, where the process can be conducted without employing grafting prior to catalysis.

  5. Digital signal processing: Handbook

    Science.gov (United States)

    Goldenberg, L. M.; Matiushkin, B. D.; Poliak, M. N.

    The fundamentals of the theory and design of systems and devices for the digital processing of signals are presented. Particular attention is given to algorithmic methods of synthesis and digital processing equipment in communication systems (e.g., selective digital filtering, spectral analysis, and variation of the signal discretization frequency). Programs for the computer-aided analysis of digital filters are described. Computational examples are presented, along with tables of transfer function coefficients for recursive and nonrecursive digital filters.
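
    As a concrete illustration of the "transfer function coefficients for recursive ... digital filters" mentioned above, a recursive (IIR) filter can be evaluated directly from its difference equation; the coefficients in the example are arbitrary placeholders, not values from the handbook.

        # Direct-form recursive (IIR) filter: y[n] = (sum_k b[k] x[n-k] - sum_{k>=1} a[k] y[n-k]) / a[0].
        def iir_filter(x, b, a):
            """Apply a recursive digital filter with numerator b and denominator a."""
            y = []
            for n in range(len(x)):
                acc = sum(b[k] * x[n - k] for k in range(len(b)) if n - k >= 0)
                acc -= sum(a[k] * y[n - k] for k in range(1, len(a)) if n - k >= 0)
                y.append(acc / a[0])
            return y

        # Impulse response of a simple first-order recursion (placeholder coefficients).
        print(iir_filter([1, 0, 0, 0, 0], b=[0.2], a=[1.0, -0.8]))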

  6. Diasporic Relationships and Processes

    DEFF Research Database (Denmark)

    Singla, Rashmi

    2010-01-01

    How does moving across geographical borders affect the relationships of diaspora members both here, in the country of residence, and there, in the country of origin? The article delineates some of these processes through the gendered experiences of young adults perceived as active actors, based...... an empirical longitudinal study. The results indicate transformations in belongings and longings, indicating a reinterpretation of the self, others and home in the context of exclusion processes at various levels....

  7. Bank Record Processing

    Science.gov (United States)

    1982-01-01

    Barnett Banks of Florida, Inc. operates 150 banking offices in 80 Florida cities. Banking offices have computerized systems for processing deposits or withdrawals in checking/savings accounts, and for handling commercial and installment loan transactions. In developing a network engineering design for the terminals used in record processing, an affiliate, Barnett Computing Company, used COSMIC's STATCOM program. This program provided a reliable network design tool and avoided the cost of developing new software.

  8. Poststroke neuroplasticity processes

    Directory of Open Access Journals (Sweden)

    I. V. Damulin

    2014-01-01

    Full Text Available The paper considers different aspects of neuroplasticity in patients with stroke. It underlines the dynamism of this process and the ambiguity of the involvement of structures of the contralateral cerebral hemisphere in the restorative process. It considers the periods after the onset of stroke and the activation of different brain regions (of both the involved and the intact hemisphere) in the poststroke period. Particular emphasis is placed on the issues of neurorehabilitation in this category of patients. Delay in rehabilitation measures leads to a worse outcome, and patients must stay in hospital longer. It is emphasized that neurorehabilitation measures should use strategies aimed at improving plasticity processes at the level of synaptic transmission and neuronal communications. At the same time, of great importance are the processes of structural and functional remodeling of neuronal communications with the involvement of surviving neurons that are located in the peri-infarct area and partially damaged during ischemia. To recover motor functions lost to stroke, measures are implemented to modulate the ipsilateral motor cortex, the contralateral motor cortex, and sensory afferentation. Remodeling processes, one of the manifestations of neuroplasticity, vary with the size and location of an ischemic focus. The specific features of this process with subcortical and cortical foci are considered. It is stressed that there are genetically determined neurotrophic factors that may enhance remodeling processes in the peri-infarct area, as well as factors that inhibit these processes. The sensory system is noted to have a high potential for compensation, which is appreciably associated with the considerable extent of sensory fibers even at the level of the cerebral cortex.

  9. Cognitive Processes in Writing

    Institute of Scientific and Technical Information of China (English)

    李莹

    2009-01-01

    Writing has become one of the important topics of discussion in the new age. Its theories can be learnt in general terms, but its nature has to be handled in specific contexts. In other words, everyone who can write must engage his or her thinking, or cognitive processes. Because thinking in writing is a meaningful activity, solving writing problems can be managed through the cognitive process.

  10. Apple Image Processing Educator

    Science.gov (United States)

    Gunther, F. J.

    1981-01-01

    A software system design is proposed and demonstrated with pilot-project software. The system permits the Apple II microcomputer to be used for personalized computer-assisted instruction in the digital image processing of LANDSAT images. The programs provide data input, menu selection, graphic and hard-copy displays, and both general and detailed instructions. The pilot-project results are considered to be successful indicators of the capabilities and limits of microcomputers for digital image processing education.

  11. The Analytic Network Process Method

    OpenAIRE

    2010-01-01

    The thesis is concerned with Multi-Criteria Decision Making, in particular the Analytic Network Process method. The introductory part is dedicated to compile all the theory necessary to understand the method and utilized throughout the paper. The Analytic Hierarchy Process method is described and later generalized in the form of the ANP. Part of the paper is a description of available software products that are able to solve the ANP models. The main focus is on the application of the method, ...

  12. Biomedical signal processing

    CERN Document Server

    Akay, Metin

    1994-01-01

    Sophisticated techniques for signal processing are now available to the biomedical specialist! Written in an easy-to-read, straightforward style, Biomedical Signal Processing presents techniques to eliminate background noise, enhance signal detection, and analyze computer data, making results easy to comprehend and apply. In addition to examining techniques for electrical signal analysis, filtering, and transforms, the author supplies an extensive appendix with several computer programs that demonstrate techniques presented in the text.

  13. Cauchy cluster process

    DEFF Research Database (Denmark)

    Ghorbani, Mohammad

    2013-01-01

    In this paper we introduce an instance of the well-known Neyman–Scott cluster process model with clusters having a long-tail behaviour. In our model the offspring points are distributed around the parent points according to a circular Cauchy distribution. Using a modified Cramér–von Mises test st...... statistic and the simulated pointwise envelopes, it is shown that this model fits the frequently analyzed longleaf pine data set better than the Thomas process does....
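
    To make the cluster construction concrete, here is a rough Python sketch of a Neyman-Scott simulation in which offspring are displaced from their parents by heavy-tailed Cauchy radii; the window, intensities and scale are made-up illustration values, and the displacement rule is only one possible reading of the circular Cauchy distribution used in the paper.

        import numpy as np

        rng = np.random.default_rng(0)

        def cauchy_cluster_process(kappa=10, mu=5, scale=0.02, window=1.0):
            """Neyman-Scott cluster process with heavy-tailed (Cauchy) offspring displacements."""
            n_parents = rng.poisson(kappa * window ** 2)
            parents = rng.uniform(0, window, size=(n_parents, 2))
            offspring = []
            for px, py in parents:
                n_off = rng.poisson(mu)
                radii = np.abs(scale * rng.standard_cauchy(n_off))   # heavy-tailed radii
                angles = rng.uniform(0, 2 * np.pi, n_off)            # isotropic directions
                offspring.append(np.column_stack((px + radii * np.cos(angles),
                                                  py + radii * np.sin(angles))))
            pts = np.vstack(offspring) if offspring else np.empty((0, 2))
            keep = (pts >= 0).all(axis=1) & (pts <= window).all(axis=1)   # clip to the window
            return pts[keep]

        print(cauchy_cluster_process().shape)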

  14. Scramjet Combustion Processes

    Science.gov (United States)

    2010-09-01

    plan for these flights is as follows: HyShot 5 – A Free-Flying Hypersonic Glider. HyShot 5 will be a hypersonic glider designed to fly at Mach 8. It will separate from its rocket booster in space and perform controlled manoeuvres as it... Michael Smart and Ray Stalker, Centre for Hypersonics, The University of Queensland

  15. Solution Processing - Rodlike Polymers

    Science.gov (United States)

    1979-08-01

    Keywords: para-ordered polymers, high-modulus fibers and films, polybenzobisoxazoles, polybenzobisthiazoles. ...Considerations important in solution processing are considered, with special emphasis on the dry-jet wet spinning process used to form fibers. Pertinent...Company, Summit, N.J. Contents: 1. Introduction; 2. Remarks on dry-jet wet spun fiber

  16. Image Processing Software

    Science.gov (United States)

    1992-01-01

    To convert raw data into environmental products, the National Weather Service and other organizations use the Global 9000 image processing system marketed by Global Imaging, Inc. The company's GAE software package is an enhanced version of the TAE, developed by Goddard Space Flight Center to support remote sensing and image processing applications. The system can be operated in three modes and is combined with HP Apollo workstation hardware.

  17. The Integrated Renovation Process

    DEFF Research Database (Denmark)

    Galiotto, Nicolas; Heiselberg, Per; Knudstrup, Mary-Ann

    The Integrated Renovation Process (IRP) is a user customized methodology based on judiciously selected constructivist and interactive multi-criteria decision making methods (Galiotto, Heiselberg, & Knudstrup, 2014 (expected)). When applied for home renovation, the Integrated Renovation Process...... for the quantitative analyses and the generation of the renovation scenarios so they get more time for the cost optimisation and the qualitative analysis of the homeowners’ needs, wishes and behaviours....

  18. Privatisation Process in Kosovo

    Directory of Open Access Journals (Sweden)

    Dr.Sc. Hysni Terziu

    2015-06-01

    Full Text Available This paper aims at analysing activities of the privatisation process in Kosovo, since privatisation is treated as a fundamental factor in the overall transformation of society. The primary aim of the privatisation process is to increase economic efficiency and to reflect the current state and general directions of development. Privatisation as a process has as its primary aim the opening of new areas of freedom, economic efficiency and individualism. The key aim of the privatisation process in Kosovo must be the increase of economic efficiency, the preservation of the healthy economic potential created to date, and the securing of a long-term concept which enables growth and macroeconomic stability. The policy of privatisation should respond to the strategic aspects of privatisation in these sectors: models, procedures, potential investors, technological modernisation and the overcoming of social barriers. The process of privatisation and transition, which has now covered the countries of Eastern and Central Europe, aims at a profound economic and political transformation of these countries. To achieve this, some basic preconditions are required, related to the incitement of the general efficiency of enterprises, expansion of the capital market, introduction of competition, development of a business culture of private property and freedom of entrepreneurship. The impact of privatisation on the economic development of Kosovo is considerable compared to other countries; therefore, our aim in this paper is to analyse the factors and methods of implementation in this process.

  19. PALSAR ground data processing

    Science.gov (United States)

    Frick, Heinrich; Palsetia, Marzban; Carande, Richard; Curlander, James C.

    2002-02-01

    The upcoming launches of new satellites like ALOS, Envisat, Radarsat2 and ECHO will pose a significant challenge for many ground stations, namely to integrate new SAR processing software into their existing systems. Vexcel Corporation in Boulder, Colorado, has built a SAR processing system, named APEX-Suite, for spaceborne SAR satellites that can easily be expanded for the next generation of SAR satellites. APEX-Suite includes an auto-satellite-detecting Level 0 Processor that includes bit-error correction, data quality characterization, and as a unique feature, a sophisticated and very accurate Doppler centroid estimator. The Level 1 processing is divided into the strip mode processor FOCUST, based on the well-proven range-Doppler algorithm, and the SWATHT ScanSAR processor that uses the Chirp Z Transform algorithm. A high-accuracy ortho-rectification processor produces systematic and precision corrected Level 2 SAR image products. The PALSAR instrument is an L-band SAR with multiple fine and standard resolution beams in strip mode, and several wide-swath ScanSAR modes. We will address the adaptation process of Vexcel's APEX-Suite processing system for the PALSAR sensor and discuss image quality characteristics based on processed simulated point target phase history data.

  20. Helium process cycle

    Science.gov (United States)

    Ganni, Venkatarao

    2007-10-09

    A unique process cycle and apparatus design separates the consumer (cryogenic) load return flow from most of the recycle return flow of a refrigerator and/or liquefier process cycle. The refrigerator and/or liquefier process recycle return flow is recompressed by a multi-stage compressor set and the consumer load return flow is recompressed by an independent consumer load compressor set that maintains a desirable constant suction pressure using a consumer load bypass control valve and the consumer load return pressure control valve that controls the consumer load compressor's suction pressure. The discharge pressure of this consumer load compressor is thereby allowed to float at the intermediate pressure in between the first and second stage recycle compressor sets. Utilizing the unique gas management valve regulation, the unique process cycle and apparatus design in which the consumer load return flow is separate from the recycle return flow, the pressure ratios of each recycle compressor stage and all main pressures associated with the recycle return flow are allowed to vary naturally, thus providing a naturally regulated and balanced floating pressure process cycle that maintains optimal efficiency at design and off-design process cycle capacity and conditions automatically.

  1. Novel food processing techniques

    Directory of Open Access Journals (Sweden)

    Vesna Lelas

    2006-12-01

    Full Text Available Recently, a lot of investigations have been focused on the development of novel mild food processing techniques with the aim of obtaining high-quality food products. It is presumed that they could also substitute some of the traditional processes in the food industry. The investigations are primarily directed to the use of high hydrostatic pressure, ultrasound, tribomechanical micronization, microwaves and pulsed electric fields. The results of the scientific research indicate that the application of some of these processes in a particular food industry can bring numerous benefits: significant energy savings, shortening of process duration, mild thermal conditions, and food products with better sensory characteristics and higher nutritional value. As some of these techniques also act on the molecular level, changing the conformation, structure and electrical potential of organic as well as inorganic materials, the improvement of some functional properties of these components may occur. Common characteristics of all of these techniques are treatment at ambient or only slightly higher temperatures and short processing times (1 to 10 minutes). High hydrostatic pressure applied to various foodstuffs can destroy some microorganisms, successfully modify molecule conformation and consequently improve the functional properties of foods. At the same time it acts positively on food products intended for freezing. Tribomechanical treatment causes micronization of various solid materials, resulting in nanoparticles and changes in the structure and electrical potential of molecules; therefore, a significant improvement of some rheological and functional properties of materials occurs. Ultrasound treatment has proved to be a potentially very successful food processing technique. It can be used as a pretreatment to drying (it decreases drying time and improves the functional properties of food), as an extraction process for various components

  2. Carbon dioxide reducing processes; Koldioxidreducerande processer

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, Fredrik

    1999-12-01

    This thesis discusses different technologies to reduce or eliminate the carbon dioxide emissions, when a fossil fuel is used for energy production. Emission reduction can be accomplished by separating the carbon dioxide for storage or reuse. There are three different ways of doing the separation. The carbon dioxide can be separated before the combustion, the process can be designed so that the carbon dioxide can be separated without any energy consumption and costly systems or the carbon dioxide can be separated from the flue gas stream. Two different concepts of separating the carbon dioxide from a combined cycle are compared, from the performance and the economical point of view, with a standard natural gas fired combined cycle where no attempts are made to reduce the carbon dioxide emissions. One concept is to use absorption technologies to separate the carbon dioxide from the flue gas stream. The other concept is based on a semi-closed gas turbine cycle using carbon dioxide as working fluid and combustion with pure oxygen, generated in an air-separating unit. The calculations show that the efficiency (power) drop is smaller for the first concept than for the second, 8.7 % points compared to 13.7 % points, when power is produced. When both heat and power are produced, the relation concerning the efficiency (power) remains. Regarding the overall efficiency (heat and power) the opposite relation is present. A possible carbon dioxide tax must exceed 0.21 SEK/kg CO{sub 2} for it to be profitable to separate carbon dioxide with any of these technologies.

  3. Customer Innovation Process Leadership

    DEFF Research Database (Denmark)

    Lindgren, Peter; Jørgensen, Jacob Høj; Goduscheit, René Chester

    2007-01-01

    Innovation leadership has traditionally been focused on leading the companies' product development fast, cost effectively and with an optimal performance driven by technological inventions or by customers' needs. To improve the efficiency of the product development process focus has been on diffe...... to handle shorter and faster product life cycles. Continuously changing customer needs are pushing companies' competence of continuous innovation to a maximum - but still this seems not to be enough to stay competitive on the global market and reach the goals of growth. This article therefore suggests...... another outlook on future innovation leadership - Customer Innovation Process Leadership (CIP-leadership). CIP-leadership moves the company's innovation process closer to the customer innovation process and discusses how companies can be involved and innovate in customers' future needs and lead...

  4. VLSI signal processing technology

    CERN Document Server

    Swartzlander, Earl

    1994-01-01

    This book is the first in a set of forthcoming books focussed on state-of-the-art development in the VLSI Signal Processing area. It is a response to the tremendous research activities taking place in that field. These activities have been driven by two factors: the dramatic increase in demand for high speed signal processing, especially in consumer elec­ tronics, and the evolving microelectronic technologies. The available technology has always been one of the main factors in determining al­ gorithms, architectures, and design strategies to be followed. With every new technology, signal processing systems go through many changes in concepts, design methods, and implementation. The goal of this book is to introduce the reader to the main features of VLSI Signal Processing and the ongoing developments in this area. The focus of this book is on: • Current developments in Digital Signal Processing (DSP) pro­ cessors and architectures - several examples and case studies of existing DSP chips are discussed in...

  5. Due process traditionalism.

    Science.gov (United States)

    Sunstein, Cass R

    2008-06-01

    In important cases, the Supreme Court has limited the scope of "substantive due process" by reference to tradition, but it has yet to explain why it has done so. Due process traditionalism might be defended in several distinctive ways. The most ambitious defense draws on a set of ideas associated with Edmund Burke and Friedrich Hayek, who suggested that traditions have special credentials by virtue of their acceptance by many minds. But this defense runs into three problems. Those who have participated in a tradition may not have accepted any relevant proposition; they might suffer from a systematic bias; and they might have joined a cascade. An alternative defense sees due process traditionalism as a second-best substitute for two preferable alternatives: a purely procedural approach to the Due Process Clause, and an approach that gives legislatures the benefit of every reasonable doubt. But it is not clear that in these domains, the first-best approaches are especially attractive; and even if they are, the second-best may be an unacceptably crude substitute. The most plausible defense of due process traditionalism operates on rule-consequentialist grounds, with the suggestion that even if traditions are not great, they are often good, and judges do best if they defer to traditions rather than attempting to specify the content of "liberty" on their own. But the rule-consequentialist defense depends on controversial and probably false assumptions about the likely goodness of traditions and the institutional incapacities of judges.

  6. Beyond the search process

    DEFF Research Database (Denmark)

    Hyldegård, Jette

    2009-01-01

    It is concluded that the ISP-model does not fully comply with group members' problem-solving process and the involved information-seeking behavior. Further, complex academic problem solving seems to be even more complex when it is performed in a group-based setting. The study contributes with a new conceptual......This paper reports on the findings from a longitudinal case study exploring Kuhlthau's information search process (ISP) model in a group-based academic setting. The research focus is on group members' activities and cognitive and emotional experiences during the task process of writing...... an assignment. It is investigated whether group members' information behavior differs from that of the individual information seeker in the ISP-model and to what extent this behavior is influenced by contextual (work task) and social (group work) factors. Three groups of LIS students were followed over a 14-week period...

  7. Laser Processing and Chemistry

    CERN Document Server

    Bäuerle, Dieter

    2011-01-01

    This book gives an overview of the fundamentals and applications of laser-matter interactions, in particular with regard to laser material processing. Special attention is given to laser-induced physical and chemical processes at gas-solid, liquid-solid, and solid-solid interfaces. Starting with the background physics, the book proceeds to examine applications of lasers in “standard” laser machining and laser chemical processing (LCP), including the patterning, coating, and modification of material surfaces. This fourth edition has been enlarged to cover the rapid advances in the understanding of the dynamics of materials under the action of ultrashort laser pulses, and to include a number of new topics, in particular the increasing importance of lasers in various different fields of surface functionalizations and nanotechnology. In two additional chapters, recent developments in biotechnology, medicine, art conservation and restoration are summarized. Graduate students, physicists, chemists, engineers, a...

  8. Processes for xanthomonas biopolymers

    Energy Technology Data Exchange (ETDEWEB)

    Engelskirchen, K.; Stein, W.; Bahn, M.; Schieferstein, L.; Schindler, J.

    1984-03-27

    A process is described for producing xanthan gum in which the use of a stable, water-in-oil emulsion in the fermentation medium markedly lowers the viscosity of the medium, resulting in lower energy requirements for the process, and also resulting in enhanced yields of the biopolymer. In such an emulsion, the aqueous fermentation phase, with its microbial growth and metabolic processes, takes place in a finely dispersed homogeneous oil phase. The viscosity increase in each droplet of the aqueous nutrient solution will not noticeably affect this mixture in the fermenter because the viscosity of the reaction mixture in the fermenter is determined primarily by the viscosity of the oil phase. 45 claims

  9. Identification of wastewater processes

    DEFF Research Database (Denmark)

    Carstensen, Niels Jacob

    -known theory of the processes with the significant effects found in data. These models are called grey box models, and they contain rate expressions for the processes of influent load of nutrients, transport of nutrients between the aeration tanks, hydrolysis and growth of biomass, nitrification...... function. The grey box models are estimated on data sets from the Lundtofte pilot-scale plant and the Aalborg West wastewater treatment plant. Estimation of Monod-kinetic expressions is made possible through the application of large data sets. Parameter estimates from the two plants show a reasonable......The introduction of on-line sensors for monitoring of nutrient salt concentrations at wastewater treatment plants with nutrient removal opens a wide new area of modelling wastewater processes. The subject of this thesis is the formulation of operational dynamic models based on time series
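
    The Monod-kinetic expressions referred to above have the standard saturating form (shown here with conventional symbols, which need not match the thesis notation): the specific growth rate as a function of substrate concentration $S$ is

        \[
            \mu(S) = \mu_{\max}\, \frac{S}{K_S + S},
        \]

    where $\mu_{\max}$ is the maximum specific growth rate and $K_S$ the half-saturation constant; such nonlinear rate terms are what the large data sets make identifiable.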

  10. Stochastic processes inference theory

    CERN Document Server

    Rao, Malempati M

    2014-01-01

    This is the revised and enlarged 2nd edition of the authors’ original text, which was intended to be a modest complement to Grenander's fundamental memoir on stochastic processes and related inference theory. The present volume gives a substantial account of regression analysis, both for stochastic processes and measures, and includes recent material on Ridge regression with some unexpected applications, for example in econometrics. The first three chapters can be used for a quarter or semester graduate course on inference on stochastic processes. The remaining chapters provide more advanced material on stochastic analysis suitable for graduate seminars and discussions, leading to dissertation or research work. In general, the book will be of interest to researchers in probability theory, mathematical statistics and electrical and information theory.

  11. Process window metrology

    Science.gov (United States)

    Ausschnitt, Christopher P.; Chu, William; Hadel, Linda M.; Ho, Hok; Talvi, Peter

    2000-06-01

    This paper is the third of a series that defines a new approach to in-line lithography control. The first paper described the use of optically measurable line-shortening targets to enhance signal-to-noise and reduce measurement time. The second described the dual-tone optical critical dimension (OCD) measurement and analysis necessary to distinguish dose and defocus. Here we describe the marriage of dual-tone OCD to SEM-CD metrology that comprises what we call 'process window metrology' (PWM), the means to locate each measured site in dose and focus space relative to the allowed process window. PWM provides in-line process tracking and control essential to the successful implementation of low-k lithography.

  12. Quartz resonator processing system

    Science.gov (United States)

    Peters, Roswell D. M.

    1983-01-01

    Disclosed is a single chamber ultra-high vacuum processing system for the production of hermetically sealed quartz resonators wherein electrode metallization and sealing are carried out along with cleaning and bake-out without any air exposure between the processing steps. The system includes a common vacuum chamber in which is located a rotatable wheel-like member which is adapted to move a plurality of individual component sets of a flat pack resonator unit past discretely located processing stations in said chamber whereupon electrode deposition takes place followed by the placement of ceramic covers over a frame containing a resonator element and then to a sealing stage where a pair of hydraulic rams including heating elements effect a metallized bonding of the covers to the frame.

  13. Process of timbral composing

    Science.gov (United States)

    Withrow, Sam

    In this paper, I discuss the techniques and processes of timbral organization I developed while writing my chamber work, Afterimage. I compare my techniques with illustrative examples by other composers to place my work in historical context. I examine three elements of my composition process. The first is the process of indexing and cataloging basic sonic materials. The second consists of the techniques and mechanics of manipulating and assembling these collections into larger scale phrases, textures, and overall form in a musical work. The third element is the more elusive, and often extra-musical, source of inspiration and motivation. The evocative power of tone color is both immediately evident yet difficult to explain. What is timbre? This question cannot be answered solely in scientific terms; subjective factors affect our perception of it.

  14. The Nursing Process

    Directory of Open Access Journals (Sweden)

    M. Hammond

    1978-09-01

    Full Text Available The essence of the nursing process can be summed up in this quotation by Sir Francis Bacon: “Human knowledge and human powers meet in one; for where the cause is not known the effect cannot be produced.” Arriving at a concise, accurate definition of the nursing process was, for me, an impossible task. It is altogether too vast and too personal a topic to contract down into a nifty-looking, we-pay-lip-service-to-it cliché. So what I propose to do is to present my understanding of the nursing process throughout this essay, and then to leave the reader with some overall, general impression of what it all entails.

  15. Quantum independent increment processes

    CERN Document Server

    Franz, Uwe

    2005-01-01

    This volume is the first of two volumes containing the revised and completed notes of lectures given at the school "Quantum Independent Increment Processes: Structure and Applications to Physics". This school was held at the Alfried-Krupp-Wissenschaftskolleg in Greifswald during the period March 9 – 22, 2003, and was supported by the Volkswagen Foundation. The school gave an introduction to current research on quantum independent increment processes aimed at graduate students and non-specialists working in classical and quantum probability, operator algebras, and mathematical physics. The present first volume contains the following lectures: "Lévy Processes in Euclidean Spaces and Groups" by David Applebaum, "Locally Compact Quantum Groups" by Johan Kustermans, "Quantum Stochastic Analysis" by J. Martin Lindsay, and "Dilations, Cocycles and Product Systems" by B.V. Rajarama Bhat.

  16. A support design process

    Energy Technology Data Exchange (ETDEWEB)

    Arthur, J.; Scott, P.B. [Health and Safety Executive (United Kingdom)

    2004-07-01

    A workman suffered a fatal injury due to a fall of ground from the face of a development drivage, which was supported by passive supports supplemented with roof bolts. A working party was set up to review the support process and evaluate how protection of the workmen could be improved whilst setting supports. The working party included representatives from the trade unions, the mines inspectorate and mine operators. Visits were made to several mines and discussions were held with the workmen and management at these mines. The paper describes the results of the visits and how a support design process was evolved. The process will ensure that the support system is designed to reduce the inherent hazards associated with setting supports using either conventional or mixed support systems.

  17. Topological signal processing

    CERN Document Server

    Robinson, Michael

    2014-01-01

    Signal processing is the discipline of extracting information from collections of measurements. To be effective, the measurements must be organized and then filtered, detected, or transformed to expose the desired information.  Distortions caused by uncertainty, noise, and clutter degrade the performance of practical signal processing systems. In aggressively uncertain situations, the full truth about an underlying signal cannot be known.  This book develops the theory and practice of signal processing systems for these situations that extract useful, qualitative information using the mathematics of topology -- the study of spaces under continuous transformations.  Since the collection of continuous transformations is large and varied, tools which are topologically-motivated are automatically insensitive to substantial distortion. The target audience comprises practitioners as well as researchers, but the book may also be beneficial for graduate students.

  18. COTS software selection process.

    Energy Technology Data Exchange (ETDEWEB)

    Watkins, William M. (Strike Wire Technologies, Louisville, CO); Lin, Han Wei; McClelland, Kelly (U.S. Security Associates, Livermore, CA); Ullrich, Rebecca Ann; Khanjenoori, Soheil; Dalton, Karen; Lai, Anh Tri; Kuca, Michal; Pacheco, Sandra; Shaffer-Gant, Jessica

    2006-05-01

    Today's need for rapid software development has generated a great interest in employing Commercial-Off-The-Shelf (COTS) software products as a way of managing cost, development time, and effort. With an abundance of COTS software packages to choose from, the problem now is how to systematically evaluate, rank, and select a COTS product that best meets the software project requirements and at the same time can leverage off the current corporate information technology architectural environment. This paper describes a systematic process for decision support in evaluating and ranking COTS software. Performed right after the requirements analysis, this process provides the evaluators with more concise, structural, and step-by-step activities for determining the best COTS software product with manageable risk. In addition, the process is presented in phases that are flexible to allow for customization or tailoring to meet various projects' requirements.
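
    The paper's own evaluation procedure is not reproduced here, but decision support for ranking COTS candidates is often sketched as a weighted scoring matrix; the criteria, weights and scores below are purely hypothetical and serve only to illustrate the idea of systematic evaluation and ranking.

        # Hypothetical weighted-scoring sketch for ranking COTS candidates.
        weights = {"functionality": 0.4, "cost": 0.2, "vendor_support": 0.2, "it_fit": 0.2}

        candidates = {  # criterion scores on a 1-5 scale (made up)
            "ProductA": {"functionality": 4, "cost": 3, "vendor_support": 5, "it_fit": 2},
            "ProductB": {"functionality": 5, "cost": 2, "vendor_support": 3, "it_fit": 4},
        }

        def weighted_score(scores):
            return sum(weights[c] * s for c, s in scores.items())

        for name in sorted(candidates, key=lambda n: -weighted_score(candidates[n])):
            print(f"{name}: {weighted_score(candidates[name]):.2f}")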

  19. NTP comparison process

    Science.gov (United States)

    Corban, Robert

    The systems engineering process for the concept definition phase of the program involves requirements definition, system definition, and consistent concept definition. The requirements definition process involves obtaining a complete understanding of the system requirements based on customer needs, mission scenarios, and nuclear thermal propulsion (NTP) operating characteristics. A system functional analysis is performed to provide a comprehensive traceability and verification of top-level requirements down to detailed system specifications and provides significant insight into the measures of system effectiveness to be utilized in system evaluation. The second key element in the process is the definition of system concepts to meet the requirements. This part of the process requires engine system and reactor contractor teams to develop alternative NTP system concepts that can be evaluated against specific attributes, as well as a reference configuration against which to compare system benefits and merits. Quality function deployment (QFD), as an excellent tool within Total Quality Management (TQM) techniques, can provide the required structure and provide a link to the voice of the customer in establishing critical system qualities and their relationships. The third element of the process is the consistent performance comparison. The comparison process involves validating developed concept data and quantifying system merits through analysis, computer modeling, simulation, and rapid prototyping of the proposed high-risk NTP subsystems. The maximum possible amount of quantitative data will be developed and/or validated to be utilized in the QFD evaluation matrix. If, upon evaluation, a new concept or its associated subsystems are determined to have substantial merit, those features will be incorporated into the reference configuration for subsequent system definition and comparison efforts.

  20. AERONET Version 3 processing

    Science.gov (United States)

    Holben, B. N.; Slutsker, I.; Giles, D. M.; Eck, T. F.; Smirnov, A.; Sinyuk, A.; Schafer, J.; Rodriguez, J.

    2014-12-01

    The Aerosol Robotic Network (AERONET) database has evolved in measurement accuracy, data quality products, and availability to the scientific community over the course of 21 years with the support of NASA, PHOTONS and all federated partners. This evolution is periodically manifested as a new data version release by carefully reprocessing the entire database with the most current algorithms that fundamentally change the database and ultimately the data products used by the community. The newest processing, Version 3, will be released in 2015 after the entire database is reprocessed and real-time data processing becomes operational. All V3 algorithms have been developed and individually vetted, and represent four main categories: aerosol optical depth (AOD) processing, inversion processing, database management and new products. The primary trigger for the release of V3 lies with cloud screening of the direct sun observations and computation of AOD that will fundamentally change all data available for analysis and all subsequent retrieval products. This presentation will illustrate the innovative approach used for cloud screening and assess the elements of V3 AOD relative to the current version. We will also present the advances in the inversion product processing with emphasis on the random and systematic uncertainty estimates. This processing will be applied to the new hybrid measurement scenario intended to provide inversion retrievals for all solar zenith angles. We will introduce automatic quality assurance criteria that will allow near-real-time quality-assured aerosol products necessary for real-time satellite and model validation and assimilation. Lastly, we will introduce the new management structure that will improve access to the database. The current Version 2 will be supported for at least two years after the initial release of V3 to maintain continuity for ongoing investigations.

  1. Orchestrator Telemetry Processing Pipeline

    Science.gov (United States)

    Powell, Mark; Mittman, David; Joswig, Joseph; Crockett, Thomas; Norris, Jeffrey

    2008-01-01

    Orchestrator is a software application infrastructure for telemetry monitoring, logging, processing, and distribution. The architecture has been applied to support operations of a variety of planetary rovers. Built in Java with the Eclipse Rich Client Platform, Orchestrator can run on most commonly used operating systems. The pipeline supports configurable parallel processing that can significantly reduce the time needed to process a large volume of data products. Processors in the pipeline implement a simple Java interface and declare their required input from upstream processors. Orchestrator is programmatically constructed by specifying a list of Java processor classes that are initiated at runtime to form the pipeline. Input dependencies are checked at runtime. Fault tolerance can be configured to attempt continuation of processing in the event of an error or failed input dependency if possible, or to abort further processing when an error is detected. This innovation also provides support for Java Message Service broadcasts of telemetry objects to clients and provides a file system and relational database logging of telemetry. Orchestrator supports remote monitoring and control of the pipeline using browser-based JMX controls and provides several integration paths for pre-compiled legacy data processors. At the time of this reporting, the Orchestrator architecture has been used by four NASA customers to build telemetry pipelines to support field operations. Example applications include high-volume stereo image capture and processing, simultaneous data monitoring and logging from multiple vehicles. Example telemetry processors used in field test operations support include vehicle position, attitude, articulation, GPS location, power, and stereo images.
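
    The record describes a Java/Eclipse RCP system; purely to illustrate the pipeline idea it mentions (processors declaring required upstream inputs that are checked at runtime), here is a small Python sketch with invented class and product names, not the actual Orchestrator API.

        # Illustrative telemetry-pipeline sketch (hypothetical names, not the Orchestrator API).
        class Processor:
            requires: tuple = ()      # names of upstream products this processor needs
            provides: str = ""

            def process(self, products):
                raise NotImplementedError

        class PoseProcessor(Processor):
            requires, provides = ("raw_telemetry",), "pose"

            def process(self, products):
                return {"x": products["raw_telemetry"]["x"], "attitude": "nominal"}

        def run_pipeline(processors, initial_products):
            products = dict(initial_products)
            for proc in processors:
                missing = [r for r in proc.requires if r not in products]
                if missing:                       # failed input dependency
                    raise RuntimeError(f"{type(proc).__name__} is missing inputs: {missing}")
                products[proc.provides] = proc.process(products)
            return products

        print(run_pipeline([PoseProcessor()], {"raw_telemetry": {"x": 1.0}}))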

  2. Biomedical Image Processing

    CERN Document Server

    Deserno, Thomas Martin

    2011-01-01

    In modern medicine, imaging is the most effective tool for diagnostics, treatment planning and therapy. Almost all modalities have moved to directly digital acquisition techniques, and processing of this image data has become an important option for health care in the future. This book is written by a team of internationally recognized experts from all over the world. It provides a brief but complete overview of medical image processing and analysis, highlighting recent advances that have been made in academics. Color figures are used extensively to illustrate the methods and help the reader to understand the complex topics.

  3. Exoplanet atmospheres physical processes

    CERN Document Server

    Seager, Sara

    2010-01-01

    Over the past twenty years, astronomers have identified hundreds of extrasolar planets--planets orbiting stars other than the sun. Recent research in this burgeoning field has made it possible to observe and measure the atmospheres of these exoplanets. This is the first textbook to describe the basic physical processes--including radiative transfer, molecular absorption, and chemical processes--common to all planetary atmospheres, as well as the transit, eclipse, and thermal phase variation observations that are unique to exoplanets. In each chapter, Sara Seager offers a conceptual introdu

  4. Semantic and Process Interoperability

    Directory of Open Access Journals (Sweden)

    Félix Oscar Fernández Peña

    2010-05-01

    Full Text Available Knowledge management systems support education at its different levels. This is very important for the process in which Cuban higher education is currently involved, where structural transformations of teaching are focused on supporting the foundation of the information society in the country. This paper describes technical aspects of the design of a model for the integration of multiple knowledge management tools supporting teaching. The proposal is based on the definition of an ontology for the explicit formal description of the semantics of the motivations of students and teachers in the learning process. Its target is to facilitate the spreading of knowledge.

  5. Advanced Polymer Processing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Muenchausen, Ross E. [Los Alamos National Laboratory

    2012-07-25

    Some conclusions of this presentation are: (1) Radiation-assisted nanotechnology applications will continue to grow; (2) The APPF will provide a unique focus for radiolytic processing of nanomaterials in support of DOE-DP, other DOE and advanced manufacturing initiatives; (3) {gamma}, X-ray, e-beam and ion beam processing will increasingly be applied for 'green' manufacturing of nanomaterials and nanocomposites; and (4) Biomedical science and engineering may ultimately be the biggest application area for radiation-assisted nanotechnology development.

  6. Hard exclusive QCD processes

    Energy Technology Data Exchange (ETDEWEB)

    Kugler, W.

    2007-01-15

    Hard exclusive processes in high energy electron proton scattering offer the opportunity to get access to a new generation of parton distributions, the so-called generalized parton distributions (GPDs). These functions provide more detailed information about the structure of the nucleon than the usual PDFs obtained from DIS. In this work we present a detailed analysis of exclusive processes, especially of hard exclusive meson production. We investigated the influence of exclusively produced mesons on the semi-inclusive production of mesons at fixed target experiments like HERMES. Furthermore, we give a detailed analysis of higher order corrections (NLO) for the exclusive production of mesons in a very broad range of kinematics. (orig.)

  7. An Integrated Design Process

    DEFF Research Database (Denmark)

    Petersen, Mads Dines; Knudstrup, Mary-Ann

    2010-01-01

    Present paper is placed in the discussion about how sustainable measures are integrated in the design process by architectural offices. It presents results from interviews with four leading Danish architectural offices working with sustainable architecture and their experiences with it, as well...... as the requirements they meet in terms of how to approach the design process – especially focused on the early stages like a competition. The interviews focus on their experiences with working in multidisciplinary teams and using digital tools to support their work with sustainable issues. The interviews show...

  8. Research Planning Process

    Science.gov (United States)

    Lofton, Rodney

    2010-01-01

    This presentation describes the process used to collect, review, integrate, and assess research requirements desired to be a part of research and payload activities conducted on the ISS. The presentation provides a description of: where the requirements originate, to whom they are submitted, how they are integrated into a requirements plan, and how that integrated plan is formulated and approved. It is hoped that from completing the review of this presentation, one will get an understanding of the planning process that formulates payload requirements into an integrated plan used for specifying research activities to take place on the ISS.

  9. Multivariate Statistical Process Control

    DEFF Research Database (Denmark)

    Kulahci, Murat

    2013-01-01

    As sensor and computer technology continues to improve, it becomes a normal occurrence that we confront with high dimensional data sets. As in many areas of industrial statistics, this brings forth various challenges in statistical process control (SPC) and monitoring for which the aim...... is to identify “out-of-control” state of a process using control charts in order to reduce the excessive variation caused by so-called assignable causes. In practice, the most common method of monitoring multivariate data is through a statistic akin to the Hotelling’s T2. For high dimensional data with excessive...
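
    For reference, a Hotelling-type monitoring statistic for a new observation vector x against in-control estimates of the mean and covariance can be computed as below; this is a generic sketch of the classical T2 chart, not the specific high-dimensional methods discussed in the chapter.

        import numpy as np

        def hotelling_t2(x, mean, cov):
            """Hotelling's T^2 statistic: (x - mean)' S^{-1} (x - mean)."""
            diff = np.asarray(x) - np.asarray(mean)
            return float(diff @ np.linalg.solve(cov, diff))

        # In-control reference data (made up) and one new observation to monitor.
        rng = np.random.default_rng(1)
        reference = rng.normal(size=(200, 3))
        x_new = np.array([0.5, -1.2, 2.5])
        t2 = hotelling_t2(x_new, reference.mean(axis=0), np.cov(reference, rowvar=False))
        print(f"T2 = {t2:.2f}")   # compare against an F- or chi-square-based control limit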

  10. Solar industrial process heat

    Energy Technology Data Exchange (ETDEWEB)

    Lumsdaine, E.

    1981-04-01

    The aim of the assessment reported is to candidly examine the contribution that solar industrial process heat (SIPH) is realistically able to make in the near- and long-term energy futures of the United States. The performance history of government and privately funded SIPH demonstration programs, 15 of which are briefly summarized, and the present status of SIPH technology are discussed. The technical and performance characteristics of solar industrial process heat plants and equipment are reviewed, and an evaluation is made of how the operating experience of over a dozen SIPH demonstration projects is influencing institutional acceptance and economic projections. Implications for domestic energy policy and international implications are briefly discussed. (LEW)

  11. Thermal stir welding process

    Science.gov (United States)

    Ding, R. Jeffrey (Inventor)

    2012-01-01

    A welding method is provided for forming a weld joint between first and second elements of a workpiece. The method includes heating the first and second elements to form an interface of material in a plasticized or melted state between the elements. The interface material is then allowed to cool to a plasticized state if previously in a melted state. The interface material, while in the plasticized state, is then mixed, for example, using a grinding/extruding process, to remove any dendritic-type weld microstructures introduced into the interface material during the heating process.

  12. Genomic signal processing

    CERN Document Server

    Shmulevich, Ilya

    2007-01-01

    Genomic signal processing (GSP) can be defined as the analysis, processing, and use of genomic signals to gain biological knowledge, and the translation of that knowledge into systems-based applications that can be used to diagnose and treat genetic diseases. Situated at the crossroads of engineering, biology, mathematics, statistics, and computer science, GSP requires the development of both nonlinear dynamical models that adequately represent genomic regulation, and diagnostic and therapeutic tools based on these models. This book facilitates these developments by providing rigorous mathema

  13. Anaerobic Digestion: Process

    DEFF Research Database (Denmark)

    Angelidaki, Irini; Batstone, Damien J.

    2011-01-01

    with very little dry matter may also be called a digest. The digest should not be termed compost unless it specifically has been composted in an aerated step. This chapter describes the basic processes of anaerobic digestion. Chapter 9.5 describes the anaerobic treatment technologies, and Chapter 9.6 addresses the mass balances and environmental aspects of anaerobic digestion....

  14. Authenticizing the Research Process

    Directory of Open Access Journals (Sweden)

    Nora Elizondo-Schmelkes, MA, Ph.D. Candidate

    2011-06-01

    Full Text Available This study reflects the main concern of students (national and international) who are trying to get a postgraduate degree in a third world (or “in means of development”) country. The emergent problem found is that students have to finish their thesis or dissertation but they do not really know how to accomplish this goal. They resolve this problem by authenticizing the process as their own. The theory of authenticizing involves compassing their way to solving the problem of advancing in the research process. Compassing allows the student to authenticize his/her research process, making it a personal and “owned” process. The main categories of compassing are the intellectual, physical and emotional dimension patterns that the student has, learns and follows in order to finish the project and get a degree. Authenticizing implies authoring their thesis or dissertation with authenticity. Compassing allows them to do this in their own way, at their own pace or time and with their own internal resources, strengths and weaknesses.

  15. Performance Evaluation Process.

    Science.gov (United States)

    1998

    This document contains four papers from a symposium on the performance evaluation process and human resource development (HRD). "Assessing the Effectiveness of OJT (On the Job Training): A Case Study Approach" (Julie Furst-Bowe, Debra Gates) is a case study of the effectiveness of OJT in one of a high-tech manufacturing company's product…

  16. Advanced Biosignal Processing

    CERN Document Server

    Nait-Ali, Amine

    2009-01-01

    Presents the principles of many advanced biosignal processing techniques. This title introduces the main biosignal properties and the acquisition techniques. It concerns some of the most intensively used biosignals in the clinical routine, namely the Electrocardiogram, the Electroencephalogram, the Electromyogram and the Evoked Potential

  17. Matchmaking for business processes

    NARCIS (Netherlands)

    Wombacher, Andreas; Fankhauser, Peter; Mahleko, Bendick; Neuhold, Erich

    2003-01-01

    Web services have the potential to enhance B2B e-commerce over the Internet by allowing companies and organizations to publish their business processes on service directories where potential trading partners can find them. This can give rise to new business paradigms based on ad-hoc trading relations a

  18. Flax shive thermocatalytic processing

    Science.gov (United States)

    Sulman, E. M.; Lugovoy, Yu. V.; Chalov, K. V.; Kosivtsov, Yu. Yu.; Stepacheva, A. A.; Shimanskaya, E. I.

    2016-11-01

    The paper presents a thermogravimetric study of the biomass waste thermodestruction process. Metal chlorides have the highest influence on flax shive thermodestruction. The results of kinetic modeling, based on thermogravimetric analysis of samples of flax shive both with and without the addition of 10% (wt.) nickel chloride at different heating rates, are also shown.

  19. Ultrahigh bandwidth signal processing

    Science.gov (United States)

    Oxenløwe, Leif Katsuo

    2016-04-01

    Optical time lenses have proven to be very versatile for advanced optical signal processing. Based on a controlled interplay between dispersion and phase-modulation by e.g. four-wave mixing, the processing is phase-preserving, and hence useful for all types of data signals including coherent multi-level modulation formats. This has enabled processing of phase-modulated spectrally efficient data signals, such as orthogonal frequency division multiplexed (OFDM) signals. In that case, a spectral telescope system was employed, built from two time lenses with different focal lengths (chirp rates), yielding a spectral magnification of the OFDM signal. Utilising such telescopic arrangements, it has become possible to perform a number of interesting functionalities, which will be described in the presentation. This includes conversion from OFDM to Nyquist WDM, compression of WDM channels to a single Nyquist channel and WDM regeneration. These operations require a broad bandwidth nonlinear platform, and novel photonic integrated nonlinear platforms like aluminum gallium arsenide nano-waveguides used for 1.28 Tbaud optical signal processing will be described.
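    As a rough aide-memoire only (this is the generic temporal-imaging summary, not taken from the record, and sign conventions differ between treatments): a time lens imparts a quadratic temporal phase whose chirp rate K plays the role of an inverse focal length, and a telescopic arrangement of two time lenses with chirp rates K1 and K2 is commonly characterized by a spectral magnification on the order of their ratio,

        $$\phi_{\mathrm{lens}}(t) \;=\; \tfrac{1}{2}\,K\,t^{2}, \qquad |M_{\mathrm{spectral}}| \;\approx\; \frac{K_{1}}{K_{2}}.$$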

  20. Obsolescence: the underlying processes

    NARCIS (Netherlands)

    Thomsen, A.F.; Nieboer, N.E.T.; Van der Flier, C.L.

    2015-01-01

    Obsolescence, defined as the process of declining performance of buildings, is a serious threat for the value, the usefulness and the life span of housing properties. Thomsen and van der Flier (2011) developed a model in which obsolescence is categorised on the basis of two distinctions, namely betw

  1. Cascaded Poisson processes

    Science.gov (United States)

    Matsuo, Kuniaki; Saleh, Bahaa E. A.; Teich, Malvin Carl

    1982-12-01

    We investigate the counting statistics for stationary and nonstationary cascaded Poisson processes. A simple equation is obtained for the variance-to-mean ratio in the limit of long counting times. Explicit expressions for the forward-recurrence and inter-event-time probability density functions are also obtained. The results are expected to be of use in a number of areas of physics.
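    A quick numerical illustration of such counting statistics (a toy two-stage cascade with arbitrary rates, not the authors' analytical derivation): each primary Poisson event spawns a Poisson-distributed number of secondaries, and the variance-to-mean ratio of the total counts is estimated by simulation.

        import numpy as np

        rng = np.random.default_rng(42)

        def cascaded_counts(rate_primary, mean_secondary, T, trials):
            """Counts per window of length T for a two-stage cascaded Poisson process:
            primaries ~ Poisson(rate_primary*T), each spawning Poisson(mean_secondary) events."""
            primaries = rng.poisson(rate_primary * T, size=trials)
            return np.array([rng.poisson(mean_secondary, size=n).sum() for n in primaries])

        counts = cascaded_counts(rate_primary=5.0, mean_secondary=3.0, T=10.0, trials=20000)
        # In this toy model the variance-to-mean ratio should be close to 1 + mean_secondary = 4.
        print("variance-to-mean ratio:", counts.var() / counts.mean())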

  2. Photonic curvilinear data processing

    Science.gov (United States)

    Browning, Clyde; Quaglio, Thomas; Figueiro, Thiago; Pauliac, Sébastien; Belledent, Jérôme; Fay, Aurélien; Bustos, Jessy; Marusic, Jean-Christophe; Schiavone, Patrick

    2014-10-01

    With more and more photonic data present in e-beam lithography, efficient and accurate data fracturing is required to meet acceptable manufacturing cycle times. Large photonic-based layouts now create high-shot-count patterns for VSB-based tools. Multiple angles, sweeping curves, and non-orthogonal data create a challenge for today's e-beam tools, which are more efficient on Manhattan-style data. This paper describes techniques developed and used for creating fractured data for VSB-based pattern generators. Proximity Effect Correction is also applied during the fracture process, taking into account variable shot sizes for accuracy and design style. Choosing different fracture routines for pattern data on the fly allows for fast and efficient processing. Data interpretation is essential for processing curvilinear data with respect to its size, angle, and complexity. Fracturing complex angled data into "efficient" shot counts is no longer practical, as shot creation now requires knowledge of the actual data content seen in photonic-based pattern data. Simulation and physical printing results validate the implementations for accuracy and write times compared to traditional VSB writing strategies on photonic data. Geometry tolerance is used as part of the fracturing algorithm for controlling edge placement accuracy and tuning to different e-beam processing parameters.

  3. The magnetization process: Hysteresis

    Science.gov (United States)

    Balsamel, Richard

    1990-01-01

    The magnetization process, hysteresis (the difference in the path of magnetization for an increasing and decreasing magnetic field), hysteresis loops, and hard magnetic materials are discussed. The fabrication of classroom projects for demonstrating hysteresis and the hysteresis of common magnetic materials is described in detail.

  4. Sustainability of abrasive processes

    DEFF Research Database (Denmark)

    Aurich, J.C.; Linke, B.; Hauschild, Michael Zwicky

    2013-01-01

    This paper presents an overview of research on sustainability of abrasive processes. It incorporates results from a round robin study on ‘‘energy-efficiency of abrasive processes’’ which has been carried out within the scientific technical committee ‘‘abrasive processes’’ (STC G) of CIRP...

  5. BUSINESS PROCESS REENGINEERING

    Directory of Open Access Journals (Sweden)

    Magdalena LUCA (DEDIU)

    2014-06-01

    Full Text Available Business process reengineering changes organizational functions from an orientation focused on operations towards a multidimensional approach. Employees who were formerly mere executors are now expected to make their own decisions, and as a result the functional departments lose their reason to exist. Managers no longer act as supervisors, but mainly as mentors, while employees focus more attention on customer needs and less on those of their superiors. Under these conditions, new organizational paradigms are required, the most important being that of learning organizations. In order to implement a reengineering of economic processes and promote a new organizational paradigm, information technology plays a decisive role. The article presents some results obtained in an ANSTI research theme funded by contract no. 501/2000. Economic and financial analysis is performed in order to know the current situation and to achieve better results in the future. One of its objectives is to analyze production as a labour process together with the interacting elements of this process. The indicators investigated in the analysis of the financial and economic activity of production reflect the development directions, the means and resources to accomplish predetermined objectives, and express the results and effectiveness of what is expected.

  6. Attentional Processes in Autism.

    Science.gov (United States)

    Goldstein, Gerald; Johnson, Cynthia R.; Minshew, Nancy J.

    2001-01-01

    Attention processes in 103 children and adults with high functioning autism were compared with a matched control group using a battery of attention measures. Differences were found only on tasks which placed demands on cognitive flexibility or psychomotor speed, suggesting that purported attention deficits in autism may actually be primary…

  7. Normal modified stable processes

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Shephard, N.

    2002-01-01

    Gaussian (NGIG) laws. The wider framework thus established provides, in particular, for added flexibility in the modelling of the dynamics of financial time series, of importance especially as regards OU based stochastic volatility models for equities. In the special case of the tempered stable OU process...

  8. Communicating Process Architectures 2005

    NARCIS (Netherlands)

    Broenink, Jan F.; Roebbers, Herman W.; Sunters, Johan P.E.; Welch, Peter H.; Wood, David C.

    2005-01-01

    The awareness of the ideas characterized by Communicating Processes Architecture and their adoption by industry beyond their traditional base in safety-critical systems and security is growing. The complexity of modern computing systems has become so great that no one person – maybe not even a small

  9. Software Process Improvement

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Diebold, Philipp; Münch, Jürgen

    2016-01-01

    Software process improvement (SPI) is around for decades: frameworks are proposed, success factors are studied, and experiences have been reported. However, the sheer mass of concepts, approaches, and standards published over the years overwhelms practitioners as well as researchers. What is out...

  10. ERGONOMICS AND PROCESS AUTOMATION

    OpenAIRE

    Carrión Muñoz, Rolando; Docente de la FII - UNMSM

    2014-01-01

    The article shows the role that ergonomics plays in process automation, and its importance for Industrial Engineering.

  11. Students' Differentiated Translation Processes

    Science.gov (United States)

    Bossé, Michael J.; Adu-Gyamfi, Kwaku; Chandler, Kayla

    2014-01-01

    Understanding how students translate between mathematical representations is of both practical and theoretical importance. This study examined students' processes in their generation of symbolic and graphic representations of given polynomial functions. The purpose was to investigate how students perform these translations. The result of the study…

  12. Restricted broadcast process theory

    NARCIS (Netherlands)

    Ghassemi, F.; Fokkink, W.J.; Movaghar, A.; Cerone, A.; Gruner, S.

    2008-01-01

    We present a process algebra for modeling and reasoning about Mobile Ad hoc Networks (MANETs) and their protocols. In our algebra we model the essential modeling concepts of ad hoc networks, i.e. local broadcast, connectivity of nodes and connectivity changes. Connectivity and connectivity changes a

  13. Governing Knowledge Processes

    DEFF Research Database (Denmark)

    Foss, Nicolai Juul; Husted, Kenneth; Michailova, Snejina;

    2003-01-01

    An under-researched issue in work within the `knowledge movement' is the relation between organizational issues and knowledge processes (i.e., sharing and creating knowledge). We argue that managers can shape formal organization structure and organization forms and can influence the more informal...

  14. Audio Spectral Processing

    Science.gov (United States)

    2010-05-01

    Global Security & Engineering Solutions Division, 1300-B Floyd Avenue, Rome, NY 13440-4615. BACKGROUND: This report is being submitted by L-3 Global Security...tasks. Utilized the Avid Xpress video enhancement system to process the Group 2, Phase II competency test A. This was done to attempt to recreate

  15. Qualitative Process Theory.

    Science.gov (United States)

    1984-07-01

    write a heat flow process that violates energy conservation and transfers "caloric fluid" between the source and destination. The assumptions made about... Second, if the program is drawing conclusions that rely critically on an assumption, then it...

  16. Food processing in action

    Science.gov (United States)

    Radio frequency (RF) heating is a commonly used food processing technology that has been applied for drying and baking as well as thawing of frozen foods. Its use in pasteurization, as well as for sterilization and disinfection of foods, is more limited. This column will review various RF heating ap...

  17. Challenges for transcultural processes

    DEFF Research Database (Denmark)

    Petersen, Karen Bjerg

    2013-01-01

    to narrow the space of possibility for transcultural processes and for learning from a terra nullius position. The focus is on empirical studies of the views of culture in the 2010 legislation on residence permits, in the 2006 legislation on citizenship, and in the mandatory... introduced in 2003

  18. Sparsity and Information Processing

    OpenAIRE

    Ikeda, Shiro

    2015-01-01

    Recently, many information processing methods utilizing the sparsity of the information source have been studied. We have reported some results along this line of research. Here we pick up two results from our own work. One is an image reconstruction method for radio interferometry and the other is a motor command computation method for a two-joint arm.

  19. Image Processing for Teaching.

    Science.gov (United States)

    Greenberg, R.; And Others

    1993-01-01

    The Image Processing for Teaching project provides a powerful medium to excite students about science and mathematics, especially children from minority groups and others whose needs have not been met by traditional teaching. Using professional-quality software on microcomputers, students explore a variety of scientific data sets, including…

  20. Image-Processing Program

    Science.gov (United States)

    Roth, D. J.; Hull, D. R.

    1994-01-01

    IMAGEP manipulates digital image data to effect various processing, analysis, and enhancement functions. It is a keyboard-driven program organized into nine subroutines. Within the subroutines are sub-subroutines, also selected via the keyboard. The algorithm has possible scientific, industrial, and biomedical applications in the study of flows in materials, the analysis of steels and ores, and pathology, respectively.

  1. The Serendipitous Research Process

    Science.gov (United States)

    Nutefall, Jennifer E.; Ryder, Phyllis Mentzell

    2010-01-01

    This article presents the results of an exploratory study asking faculty in the first-year writing program and instruction librarians about their research process focusing on results specifically related to serendipity. Steps to prepare for serendipity are highlighted as well as a model for incorporating serendipity into a first-year writing…

  2. Automated process planning system

    Science.gov (United States)

    Mann, W.

    1978-01-01

    Program helps process engineers set up manufacturing plans for machined parts. The system allows one to develop and store a library of similar parts' characteristics, as related to a particular facility. This information is then used in an interactive system to help develop manufacturing plans that meet required standards.

  3. Quantum image processing?

    Science.gov (United States)

    Mastriani, Mario

    2017-01-01

    This paper presents a number of problems concerning the practical (real) implementation of the techniques known as quantum image processing. The most serious problem is the recovery of the outcomes after the quantum measurement, which, as will be demonstrated in this work, is equivalent to a noise measurement and is not considered in the literature on the subject. This is due to several factors: (1) a classical algorithm that uses Dirac's notation and is then coded in MATLAB does not constitute a quantum algorithm, (2) the literature emphasizes the internal representation of the image but says nothing about the classical-to-quantum and quantum-to-classical interfaces and how these are affected by decoherence, (3) the literature does not mention how to implement these proposed internal representations in a practical way (at the laboratory), (4) given that quantum image processing works with generic qubits, this requires measurements along all axes of the Bloch sphere, logically, and (5) among others. In return, the technique known as quantum Boolean image processing is mentioned, which works exclusively with computational basis states (CBS). This methodology allows us to avoid the problem of quantum measurement, which alters the measured results except in the case of CBS. What has been said so far extends to quantum algorithms outside image processing too.

  4. Electrochemical Discharge Machining Process

    Directory of Open Access Journals (Sweden)

    Anjali V. Kulkarni

    2007-09-01

    Full Text Available Electrochemical discharge machining process is evolving as a promising micromachining process. The experimental investigations in the present work substantiate this trend. In the present work, in situ, synchronised, transient temperature and current measurements have been carried out. The need for the transient measurements arose due to the time-varying nature of the discharge formation and time varying circuit current. Synchronised and transient measurements revealed the discrete nature of the process. It also helped in formulating the basic mechanism for the discharge formation and the material removal in the process. Temperature profile on workpiece and in electrochemical discharge machining cell is experimentally measured using pyrometer, and two varieties of K-type thermocouples. Surface topography of the discharge-affected zones on the workpiece has been carried out using scanning electron microscope. Measurements and surface topographical studies reveal the potential use of this process for machining in micron regime. With careful experimental set-up design, suitable supply voltage and its polarity, the process can be applied for both micromachining and micro-deposition. It can be extended for machining and/or deposition of a wide range of materials.

  5. Food Process Engineering

    DEFF Research Database (Denmark)

    Friis, Alan; Jensen, Bo Boye Busk; Risum, Jørgen

    to calculate the requirements of heat processing. Our goal is to put food engineering into a production context. Other courses teach food chemistry, food microbiology and food technology. These are topics of great importance, and all have to be seen in the broader context of producing good and safe food on a large scale...

  6. Pattern evaporation process

    Directory of Open Access Journals (Sweden)

    Z. Żółkiewicz

    2007-04-01

    Full Text Available The paper discusses the process of thermal evaporation of a foundry pattern. At several research-development centres, studies have been carried out to examine the physico-chemical phenomena that take place in foundry mould filled with polystyrene pattern when it is poured with molten metal. In the technique of evaporative patterns, the process of mould filling with molten metal (the said mould holding inside a polystyrene pattern) is interrelated with the process of thermal decomposition of this pattern. The transformation of an evaporative pattern (e.g. made from foamed polystyrene) from the solid into liquid and then gaseous state occurs as a result of the thermal effect that the liquid metal exerts onto this pattern. Consequently, at the liquid metal-pattern-mould phase boundary some physico-chemical phenomena take place, which until now have not been fully explained. When the pattern is evaporating, some solid and gaseous products are evolved, e.g. CO, CO2, H2, N2, and hydrocarbons, e.g. styrene, toluene, ethane, methane, benzene [16, 23]. The process of polystyrene pattern evaporation in foundry mould under the effect of molten metal is of a very complex nature and depends on many different factors, still not fully investigated. The kinetics of pattern evaporation is also affected by the technological properties of foundry mould, e.g. permeability, thermophysical properties, parameters of the gating system, temperature of pouring, properties of pattern material, and the size of pattern-liquid metal contact surface.

  7. Biosphere Process Model Report

    Energy Technology Data Exchange (ETDEWEB)

    J. Schmitt

    2000-05-25

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor

  8. PROcess Based Diagnostics PROBE

    Science.gov (United States)

    Clune, T.; Schmidt, G.; Kuo, K.; Bauer, M.; Oloso, H.

    2013-01-01

    Many of the aspects of the climate system that are of the greatest interest (e.g., the sensitivity of the system to external forcings) are emergent properties that arise via the complex interplay between disparate processes. This is also true for climate models: most diagnostics are not a function of an isolated portion of source code, but rather are affected by multiple components and procedures. Thus any model-observation mismatch is hard to attribute to any specific piece of code or imperfection in a specific model assumption. An alternative approach is to identify diagnostics that are more closely tied to specific processes -- implying that if a mismatch is found, it should be much easier to identify and address specific algorithmic choices that will improve the simulation. However, this approach requires looking at model output and observational data in a more sophisticated way than the more traditional production of monthly or annual mean quantities. The data must instead be filtered in time and space for examples of the specific process being targeted. We are developing a data analysis environment called PROcess-Based Explorer (PROBE) that seeks to enable efficient and systematic computation of process-based diagnostics on very large sets of data. In this environment, investigators can define arbitrarily complex filters and then seamlessly perform computations in parallel on the filtered output from their model. The same analysis can be performed on additional related data sets (e.g., reanalyses), thereby enabling routine comparisons between model and observational data. PROBE also incorporates workflow technology to automatically update computed diagnostics for subsequent executions of a model. In this presentation, we will discuss the design and current status of PROBE as well as share results from some preliminary use cases.
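    The filtering idea can be pictured with a deliberately generic sketch (random stand-in arrays and invented variable names, nothing PROBE-specific): mask the model output where a process-specific condition holds, then reduce only the selected samples.

        import numpy as np

        rng = np.random.default_rng(7)
        # stand-in model output on a (time, lat, lon) grid
        precip = rng.gamma(shape=1.5, scale=2.0, size=(240, 32, 64))
        omega = rng.normal(size=(240, 32, 64))

        # process-based filter: strongly ascending, strongly raining grid points only
        mask = (omega < -1.0) & (precip > 5.0)
        composite = precip[mask].mean() if mask.any() else float("nan")
        print("conditional-mean precipitation in the filtered regime:", round(float(composite), 2))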

  9. Retinomorphic image processing.

    Science.gov (United States)

    Ghosh, Kuntal; Bhaumik, Kamales; Sarkar, Sandip

    2008-01-01

    The present work is aimed at understanding and explaining some aspects of visual signal processing at the retinal level while exploiting the same towards the development of some simple techniques in the domain of digital image processing. Classical studies on retinal physiology revealed the nature of contrast sensitivity of the receptive field of bipolar or ganglion cells, which lie in the outer and inner plexiform layers of the retina. To explain these observations, a difference of Gaussian (DOG) filter was suggested, which was subsequently modified to a Laplacian of Gaussian (LOG) filter for computational ease in handling two-dimensional retinal inputs. To date, almost all image processing algorithms used in various branches of science and engineering have followed the LOG filter or one of its variants. Recent observations in retinal physiology, however, indicate that the retinal ganglion cells receive input from a larger area than the classical receptive fields. We have proposed an isotropic model for the non-classical receptive field of the retinal ganglion cells, corroborated by these recent observations, by introducing higher-order derivatives of the Gaussian expressed as linear combinations of Gaussians only. In digital image processing, this provides a new mechanism of edge detection on one hand and image half-toning on the other. It has also been found that living systems may sometimes prefer to "perceive" the external scenario by adding noise to the received signals at the pre-processing level, arriving at better information on light and shade in the edge map. The proposed model also provides an explanation for many brightness-contrast illusions hitherto unexplained not only by the classical isotropic model but also by some other Gestalt and Constructivist models or by non-isotropic multi-scale models. The proposed model is easy to implement in both the analog and digital domains. A scheme for implementation in the analog domain generates a new silicon retina
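    The classical center-surround (DOG) filter referred to above translates directly into a few lines of code; the sigmas and the toy image below are arbitrary choices, and this is the textbook isotropic filter rather than the extended non-classical receptive-field model proposed in the work.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def dog_filter(image, sigma_center=1.0, sigma_surround=2.0):
            """Difference of Gaussians: narrow 'center' minus wide 'surround',
            the classical approximation of a retinal ganglion-cell receptive field."""
            img = image.astype(float)
            return gaussian_filter(img, sigma_center) - gaussian_filter(img, sigma_surround)

        # toy example: a bright square on a dark background; zero crossings mark edges
        img = np.zeros((64, 64))
        img[16:48, 16:48] = 1.0
        response = dog_filter(img)
        print("response range:", response.min(), response.max())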

  10. Privatization Process in Kosovo

    Directory of Open Access Journals (Sweden)

    Ing. Florin Aliu

    2014-06-01

    Full Text Available Privatization is considered an initial step toward a market economy, restructuring the financial and economic sector so that competition in the economy is enabled. Privatization is among the most painful processes in an economy, where besides legal establishment and political will it also involves aspects of fairness and honesty. The analysis of this process is based on models and comparisons between Kosovo and the countries of Central and Eastern Europe, in order to give a clearer picture of the overall process of privatization in Kosovo. The methodology used to analyze this issue is based on empirical results, qualitative interpretation of the models, and the study of particular asset privatization cases. A widely discussed case of privatization in Kosovo is that of Post and Telecom of Kosovo (PTK). Since each company has its own value, I have focused my appraisal on the financial statements, with special attention to cash flow from operations, as the most significant indicator of how a company uses its physical and human resources to generate money. The research is based on the methodology of discounted cash flow from operations analysis, even though the company valuation was done using net cash flow from operations; the cash flow valuation was then discounted by the T-bond interest rate. This paper concludes that the privatization process in Kosovo has not brought the expected results, first by setting inappropriate asset prices and finally through the restructuring of the overall privatization sector and industry. Kosovo consequently lost a big opportunity to create a competitive financial industry, starting with the banking industry, followed by the pension trust, which remained at the initial stages of development
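    The valuation logic described, discounting operating cash flows at a T-bond rate, reduces to an ordinary present-value sum; the cash-flow figures and the 5% rate below are invented placeholders used only to make the mechanics concrete.

        def discounted_cash_flow(cash_flows, discount_rate):
            """Present value of yearly cash flows (years 1, 2, ...) discounted
            at a flat rate such as a T-bond yield."""
            return sum(cf / (1.0 + discount_rate) ** t
                       for t, cf in enumerate(cash_flows, start=1))

        # hypothetical operating cash flows (in millions) and a 5% T-bond yield
        cash_flows = [120.0, 130.0, 125.0, 140.0, 150.0]
        print(round(discounted_cash_flow(cash_flows, 0.05), 1))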

  11. Cassini science planning process

    Science.gov (United States)

    Paczkowski, Brian G.; Ray, Trina L.

    2004-01-01

    The mission design for Cassini-Huygens calls for a four-year orbital survey of the Saturnian system and the descent into the Titan atmosphere and eventual soft-landing of the Huygens probe. The Cassini orbiter tour consists of 76 orbits around Saturn with 44 close Titan flybys and 8 targeted icy satellite flybys. The Cassini orbiter spacecraft carries twelve scientific instruments that will perform a wide range of observations on a multitude of designated targets. The science opportunities, frequency of encounters, the length of the Tour, and the use of distributed operations pose significant challenges for developing the science plan for the orbiter mission. The Cassini Science Planning Process is the process used to develop and integrate the science and engineering plan that incorporates an acceptable level of science required to meet the primary mission objectives for the orbiter. The bulk of the integrated science and engineering plan will be developed prior to Saturn Orbit Insertion (SOI). The Science Planning Process consists of three elements: 1) the creation of the Tour Atlas, which identifies the science opportunities in the tour, 2) the development of the Science Operations Plan (SOP), which is the conflict-free timeline of all science observations and engineering activities, a constraint-checked spacecraft pointing profile, and data volume allocations to the science instruments, and 3) an Aftermarket and SOP Update process, which is used to update the SOP while in tour with the latest information on spacecraft performance, science opportunities, and ephemerides. This paper will discuss the various elements of the Science Planning Process used on the Cassini Mission to integrate, implement, and adapt the science and engineering activity plans for Tour.

  12. Vaccine process technology.

    Science.gov (United States)

    Josefsberg, Jessica O; Buckland, Barry

    2012-06-01

    The evolution of vaccines (e.g., live attenuated, recombinant) and vaccine production methods (e.g., in ovo, cell culture) are intimately tied to each other. As vaccine technology has advanced, the methods to produce the vaccine have advanced and new vaccine opportunities have been created. These technologies will continue to evolve as we strive for safer and more immunogenic vaccines and as our understanding of biology improves. The evolution of vaccine process technology has occurred in parallel to the remarkable growth in the development of therapeutic proteins as products; therefore, recent vaccine innovations can leverage the progress made in the broader biotechnology industry. Numerous important legacy vaccines are still in use today despite their traditional manufacturing processes, with further development focusing on improving stability (e.g., novel excipients) and updating formulation (e.g., combination vaccines) and delivery methods (e.g., skin patches). Modern vaccine development is currently exploiting a wide array of novel technologies to create safer and more efficacious vaccines including: viral vectors produced in animal cells, virus-like particles produced in yeast or insect cells, polysaccharide conjugation to carrier proteins, DNA plasmids produced in E. coli, and therapeutic cancer vaccines created by in vitro activation of patient leukocytes. Purification advances (e.g., membrane adsorption, precipitation) are increasing efficiency, while innovative analytical methods (e.g., microsphere-based multiplex assays, RNA microarrays) are improving process understanding. Novel adjuvants such as monophosphoryl lipid A, which acts on antigen presenting cell toll-like receptors, are expanding the previously conservative list of widely accepted vaccine adjuvants. As in other areas of biotechnology, process characterization by sophisticated analysis is critical not only to improve yields, but also to determine the final product quality. From a regulatory

  13. Modeling of biopharmaceutical processes. Part 2: Process chromatography unit operation

    DEFF Research Database (Denmark)

    Kaltenbrunner, Oliver; McCue, Justin; Engel, Philip;

    2008-01-01

    Process modeling can be a useful tool to aid in process development, process optimization, and process scale-up. When modeling a chromatography process, one must first select the appropriate models that describe the mass transfer and adsorption that occurs within the porous adsorbent...
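    For orientation only, a commonly used generic combination of column mass-transfer and adsorption models, and not necessarily the specific model chosen in the cited work, is the lumped-kinetic transport equation coupled to a Langmuir isotherm:

        $$\frac{\partial c}{\partial t} + F\,\frac{\partial q}{\partial t} + u\,\frac{\partial c}{\partial z} = D_{ax}\,\frac{\partial^{2} c}{\partial z^{2}}, \qquad \frac{\partial q}{\partial t} = k_{m}\left(q^{*}-q\right), \qquad q^{*} = \frac{q_{\max}\,b\,c}{1+b\,c},$$

    where c and q are the mobile- and stationary-phase concentrations, F the phase ratio, u the interstitial velocity, D_ax the axial dispersion coefficient, and k_m a lumped mass-transfer coefficient.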

  14. Process and Post-Process: A Discursive History.

    Science.gov (United States)

    Matsuda, Paul Kei

    2003-01-01

    Examines the history of process and post-process in composition studies, focusing on ways in which terms, such as "current-traditional rhetoric,""process," and "post-process" have contributed to the discursive construction of reality. Argues that use of the term post-process in the context of second language writing needs to be guided by a…

  15. Managing Process Variants in the Process Life Cycle

    NARCIS (Netherlands)

    Hallerbach, A.; Bauer, Th.; Reichert, M.U.

    2007-01-01

    When designing process-aware information systems, often variants of the same process have to be specified. Each variant then constitutes an adjustment of a particular process to specific requirements building the process context. Current Business Process Management (BPM) tools do not adequately supp

  16. 5 CFR 1653.13 - Processing legal processes.

    Science.gov (United States)

    2010-01-01

    ... TSP is notified in writing that the legal process has been appealed, and that the effect of the filing... PROCESSES AFFECTING THRIFT SAVINGS PLAN ACCOUNTS: Legal Process for the Enforcement of a Participant's...

  17. An Integrated Design Process

    DEFF Research Database (Denmark)

    Petersen, Mads Dines; Knudstrup, Mary-Ann

    2010-01-01

    Present paper is placed in the discussion about how sustainable measures are integrated in the design process by architectural offices. It presents results from interviews with four leading Danish architectural offices working with sustainable architecture and their experiences with it, as well...... as the requirements they meet in terms of how to approach the design process – especially focused on the early stages like a competition. The interviews focus on their experiences with working in multidisciplinary teams and using digital tools to support their work with sustainable issues. The interviews show...... that there is a difference in the experiences of the different offices. Architects taking an active part in the development of projects and tools in general have a better understanding of how to approach this. It is of course not surprising, because of a focused strategy towards this. However the most important thing...

  18. Instabilities in sensory processes

    Science.gov (United States)

    Balakrishnan, J.

    2014-07-01

    In any organism there are different kinds of sensory receptors for detecting the various, distinct stimuli through which its external environment may impinge upon it. These receptors convey these stimuli in different ways to an organism's information-processing region, enabling it to distinctly perceive the varied sensations and to respond to them. The behavior of cells and their response to stimuli may be captured through simple mathematical models employing regulatory feedback mechanisms. We argue that sensory processes such as olfaction function optimally by operating in close proximity to dynamical instabilities. In the case of coupled neurons, we point out that random disturbances and fluctuations can move their operating point close to certain dynamical instabilities, triggering synchronous activity.
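    As a generic illustration of operation near a dynamical instability (a stand-in model, not the regulatory feedback model used in the paper), the sketch below integrates a single noise-driven FitzHugh-Nagumo unit held just below its oscillation threshold; weak noise is then enough to trigger large excursions of the fast variable.

        import numpy as np

        def noisy_fhn(T=2000.0, dt=0.01, I=0.33, noise=0.05, seed=1):
            """Euler-Maruyama integration of a FitzHugh-Nagumo unit biased
            just below its oscillation threshold and perturbed by weak noise."""
            rng = np.random.default_rng(seed)
            steps = int(T / dt)
            v_trace = np.empty(steps)
            v, w = -1.0, -0.3
            for i in range(steps):
                dv = v - v ** 3 / 3.0 - w + I
                dw = 0.08 * (v + 0.7 - 0.8 * w)
                v += dt * dv + noise * np.sqrt(dt) * rng.normal()
                w += dt * dw
                v_trace[i] = v
            return v_trace

        v = noisy_fhn()
        # large positive excursions of v indicate noise-triggered firing events
        print("maximum of the fast variable:", round(float(v.max()), 2))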

  19. The Player Engagement Process

    DEFF Research Database (Denmark)

    Schoenau-Fog, Henrik

    2011-01-01

    Engagement is an essential element of the player experience, and the concept is described in various ways in the literature. To gain a more detailed comprehension of this multifaceted concept, and in order to better understand what aspects can be used to evaluate engaging game play and to design...... engaging user experiences, this study investigates one dimension of player engagement by empirically identifying the components associated with the desire to continue playing. Based on a description of the characteristics of player engagement, a series of surveys were developed to discover the components......, categories and triggers involved in this process. By applying grounded theory to the analysis of the responses, a process-oriented player engagement framework was developed and four main components consisting of objectives, activities, accomplishments and affects as well as the corresponding categories...

  20. Plant hydrocarbon recovery process

    Energy Technology Data Exchange (ETDEWEB)

    Dzadzic, P.M.; Price, M.C.; Shih, C.J.; Weil, T.A.

    1982-01-26

    A process for production and recovery of hydrocarbons from hydrocarbon-containing whole plants in a form suitable for use as chemical feedstocks or as hydrocarbon energy sources, which process comprises: (A) pulverizing by grinding or chopping hydrocarbon-containing whole plants selected from the group consisting of euphorbiaceae, apocynaceae, asclepiadaceae, compositae, cactaceae and pinaceae families to a suitable particle size, (B) drying and preheating said particles in a reducing atmosphere under positive pressure, (C) passing said particles through a thermal conversion zone containing a reducing atmosphere and with a residence time of 1 second to about 30 minutes at a temperature within the range of from about 200 °C to about 1000 °C, (D) separately recovering the condensable vapors as liquids and the noncondensable gases in a condition suitable for use as chemical feedstocks or as hydrocarbon fuels.

  1. A Logical Process Calculus

    Science.gov (United States)

    Cleaveland, Rance; Luettgen, Gerald; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    This paper presents the Logical Process Calculus (LPC), a formalism that supports heterogeneous system specifications containing both operational and declarative subspecifications. Syntactically, LPC extends Milner's Calculus of Communicating Systems with operators from the alternation-free linear-time mu-calculus (LT(mu)). Semantically, LPC is equipped with a behavioral preorder that generalizes Hennessy's and DeNicola's must-testing preorder as well as LT(mu)'s satisfaction relation, while being compositional for all LPC operators. From a technical point of view, the new calculus is distinguished by the inclusion of: (1) both minimal and maximal fixed-point operators and (2) an unimplementability predicate on process terms, which tags inconsistent specifications. The utility of LPC is demonstrated by means of an example highlighting the benefits of heterogeneous system specification.

  2. Integral Politics as Process

    Directory of Open Access Journals (Sweden)

    Tom Atlee

    2010-03-01

    Full Text Available Using the definition proposed here, integral politics can be a process of integrating diverse perspectives into wholesome guidance for a community or society. Characteristics that follow from this definition have ramifications for understanding what such political processes involve. Politics becomes integral as it transcends partisan battle and nurtures generative conversation toward the common good. Problems, conflicts and crises become opportunities for new (or renewed social coherence. Conversational methodologies abound that can help citizen awareness temporarily expand during policy-making, thus helping raise society’s manifested developmental stage. Convening archetypal stakeholders or randomly selected citizens in conversations designed to engage the broader public enhances democratic legitimacy. With minimal issue- and candidate-advocacy, integral political leaders would develop society’s capacity to use integral conversational tools to improve its health, resilience, and collective intelligence. This both furthers and manifests evolution becoming conscious of itself.

  3. Yeast nuclear RNA processing

    Institute of Scientific and Technical Information of China (English)

    Bernstein, Jade; Toth, Eric A.

    2012-01-01

    Nuclear RNA processing requires dynamic and intricately regulated machinery composed of multiple enzymes and their cofactors. In this review, we summarize recent experiments using Saccharomyces cerevisiae as a model system that have yielded important insights regarding the conversion of pre-RNAs to functional RNAs, and the elimination of aberrant RNAs and unneeded intermediates from the nuclear RNA pool. Much progress has been made recently in describing the 3D structure of many elements of the nuclear degradation machinery and its cofactors. Similarly, the regulatory mechanisms that govern RNA processing are gradually coming into focus. Such advances invariably generate many new questions, which we highlight in this review.

  4. Posttranslational processing of progastrin

    DEFF Research Database (Denmark)

    Bundgaard, Jens René; Rehfeld, Jens F.

    2010-01-01

    Gastrin and cholecystokinin (CCK) are homologous hormones with important functions in the brain and the gut. Gastrin is the main regulator of gastric acid secretion and gastric mucosal growth, whereas cholecystokinin regulates gall bladder emptying, pancreatic enzyme secretion and besides acts...... as a major neurotransmitter in the central and peripheral nervous systems. The tissue-specific expression of the hormones is regulated at the transcriptional level, but the posttranslational phase is also decisive and is highly complex in order to ensure accurate maturation of the prohormones in a cell...... processing progastrin is often greatly disturbed in neoplastic cells. The posttranslational phase of the biogenesis of gastrin and the various progastrin products in gastrin gene-expressing tissues is now reviewed here. In addition, the individual contributions of the processing enzymes are discussed...

  5. Process Improvement: Customer Service.

    Science.gov (United States)

    Cull, Donald

    2015-01-01

    Utilizing the comment section of patient satisfaction surveys, Clark Memorial Hospital in Jeffersonville, IN went through a thoughtful process to arrive at an experience that patients said they wanted. Two Lean Six Sigma tools were used--the Voice of the Customer (VoC) and the Affinity Diagram. Even when using these tools, a facility will not be able to accomplish everything the patient may want. Guidelines were set and rules were established for the Process Improvement Team in order to lessen frustration, increase focus, and ultimately be successful. The project's success is driven by the team members carrying its message back to their areas. It's about ensuring that everyone is striving to improve the patients' experience by listening to what they say is being done right and what they say can be done better. And then acting on it.

  6. Thin film interconnect processes

    Science.gov (United States)

    Malik, Farid

    Interconnects and associated photolithography and etching processes play a dominant role in the feature shrinkage of electronic devices. Most interconnects are fabricated by use of thin film processing techniques. Planarization of dielectrics and novel metal deposition methods are the focus of current investigations. Spin-on glass, polyimides, etch-back, bias-sputtered quartz, and plasma-enhanced conformal films are being used to obtain planarized dielectrics over which metal films can be reliably deposited. Recent trends have been towards chemical vapor depositions of metals and refractory metal silicides. Interconnects of the future will be used in conjunction with planarized dielectric layers. Reliability of devices will depend to a large extent on the quality of the interconnects.

  7. The aluminum smelting process.

    Science.gov (United States)

    Kvande, Halvor

    2014-05-01

    This introduction to the industrial primary aluminum production process presents a short description of the electrolytic reduction technology, the history of aluminum, and the importance of this metal and its production process to modern society. Aluminum's special qualities have enabled advances in technologies coupled with energy and cost savings. Aircraft capabilities have been greatly enhanced, and increases in size and capacity are made possible by advances in aluminum technology. The metal's flexibility for shaping and extruding has led to architectural advances in energy-saving building construction. The high strength-to-weight ratio has meant a substantial reduction in energy consumption for trucks and other vehicles. The aluminum industry is therefore a pivotal one for ecological sustainability and strategic for technological development.

  8. [In Process Citation].

    Science.gov (United States)

    Yildirim, Ayhan; Metzler, Philipp; Lanzer, Martin; Lübbers, Heinz-Theo; Yildirim, Vedat

    2015-01-01

    Solcoseryl® is a protein-free haemodialysate, containing a broad spectrum of low molecular components of cellular mass and blood serum obtained from veal calves. Solcoseryl® improves the transport of oxygen and glucose to cells that are under hypoxic conditions. It increases the synthesis of intracellular ATP and contributes to an increase in the level of aerobic glycolysis and oxidative phosphorylation. It activates the reparative and regenerative processes in tissues by stimulating fibroblast proliferation and repair of the collagen vascular wall. The formulations of Solcoseryl® are infusion, injection, gel and ointment, and it is also available as a dental paste for inflammatory processes of the mouth cavity, gums and lips.

  9. Plutonium dissolution process

    Science.gov (United States)

    Vest, Michael A.; Fink, Samuel D.; Karraker, David G.; Moore, Edwin N.; Holcomb, H. Perry

    1996-01-01

    A two-step process for dissolving plutonium metal, in which the two steps can be carried out sequentially or simultaneously. Plutonium metal is exposed to a first mixture containing approximately 1.0M-1.67M sulfamic acid and 0.0025M-0.1M fluoride, the mixture having been heated to a temperature between 45 °C and 70 °C. The mixture will dissolve a first portion of the plutonium metal but leave a portion of the plutonium in an oxide residue. Then, a mineral acid and additional fluoride are added to dissolve the residue. Alternatively, nitric acid in a concentration between approximately 0.05M and 0.067M is added to the first mixture to dissolve the residue as it is produced. Hydrogen released during the dissolution process is diluted with nitrogen.

  10. Youpi: YOUr processing PIpeline

    Science.gov (United States)

    Monnerville, Mathias; Sémah, Gregory

    2012-03-01

    Youpi is a portable, easy to use web application providing high level functionalities to perform data reduction on scientific FITS images. Built on top of various open source reduction tools released to the community by TERAPIX (http://terapix.iap.fr), Youpi can help organize data, manage processing jobs on a computer cluster in real time (using Condor) and facilitate teamwork by allowing fine-grain sharing of results and data. Youpi is modular and comes with plugins which perform, from within a browser, various processing tasks such as evaluating the quality of incoming images (using the QualityFITS software package), computing astrometric and photometric solutions (using SCAMP), resampling and co-adding FITS images (using SWarp) and extracting sources and building source catalogues from astronomical images (using SExtractor). Youpi is useful for small to medium-sized data reduction projects; it is free and is published under the GNU General Public License.

  11. Image processing occupancy sensor

    Science.gov (United States)

    Brackney, Larry J.

    2016-09-27

    A system and method of detecting occupants in a building automation system environment using image based occupancy detection and position determinations. In one example, the system includes an image processing occupancy sensor that detects the number and position of occupants within a space that has controllable building elements such as lighting and ventilation diffusers. Based on the position and location of the occupants, the system can finely control the elements to optimize conditions for the occupants, optimize energy usage, among other advantages.

  12. Fastdata processing with Spark

    CERN Document Server

    Karau, Holden

    2013-01-01

    This book is a basic, step-by-step tutorial that will help readers take advantage of all that Spark has to offer. Fastdata Processing with Spark is for software developers who want to learn how to write distributed programs with Spark. It will help developers who have had problems that were too large to be dealt with on a single computer. No previous experience with distributed programming is necessary. This book assumes knowledge of either Java, Scala, or Python.
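    A minimal example of the kind of distributed program the book targets, using the classic RDD word count (the local master and the input file name are placeholders):

        from pyspark import SparkContext

        sc = SparkContext("local[*]", "WordCount")      # placeholder master and app name
        lines = sc.textFile("input.txt")                # hypothetical input file
        counts = (lines.flatMap(lambda line: line.split())
                       .map(lambda word: (word, 1))
                       .reduceByKey(lambda a, b: a + b))
        for word, n in counts.take(10):
            print(word, n)
        sc.stop()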

  13. Sample Data Processing.

    Science.gov (United States)

    1982-08-01

    the relative practicality of compensating the channel with an approach of predistorting the masking sequence, by processing in a filter that...replicates the channel response, with a conventional approach of equalizing the channel with an inverse filter. The predistortion method demonstrated a...compensate for the channel distortion is to predistort the encryption stream in the receiver by means of a filter which replicates the impulse response of

  14. Near Shore Wave Processes

    Science.gov (United States)

    2016-06-07

    given the offshore wave conditions. OBJECTIVES: We hypothesize that the wave-induced kinematic, sediment and morphologic processes are nonlinearly... morphology, which acts as hydraulic roughness for the mean flows and perturbs the velocity-sediment fields, is measured as a function of time and over...

  15. Aluminum powder metallurgy processing

    Energy Technology Data Exchange (ETDEWEB)

    Flumerfelt, J.F.

    1999-02-12

    The objective of this dissertation is to explore the hypothesis that there is a strong linkage between gas atomization processing conditions, as-atomized aluminum powder characteristics, and the consolidation methodology required to make components from aluminum powder. The hypothesis was tested with pure aluminum powders produced by commercial air atomization, commercial inert gas atomization, and gas atomization reaction synthesis (GARS). A comparison of the GARS aluminum powders with the commercial aluminum powders showed the former to exhibit superior powder characteristics. The powders were compared in terms of size and shape, bulk chemistry, surface oxide chemistry and structure, and oxide film thickness. Minimum explosive concentration measurements assessed the dependence of explosibility hazard on surface area, oxide film thickness, and gas atomization processing conditions. The GARS aluminum powders were exposed to different relative humidity levels, demonstrating the effect of atmospheric conditions on post-atomization oxidation of aluminum powder. An Al-Ti-Y GARS alloy exposed in ambient air at different temperatures revealed the effect of reactive alloy elements on post-atomization powder oxidation. The pure aluminum powders were consolidated by two different routes, a conventional consolidation process for fabricating aerospace components with aluminum powder and a proposed alternative. The consolidation procedures were compared by evaluating the consolidated microstructures and the corresponding mechanical properties. A low temperature solid state sintering experiment demonstrated that tap densified GARS aluminum powders can form sintering necks between contacting powder particles, unlike the total resistance to sintering of commercial air atomization aluminum powder.

  16. Topology and mental processes.

    Science.gov (United States)

    McLeay, H

    2000-08-01

    The study reported here considers the effect of rotation on the decision time taken to compare nonrigid objects, presented as like and unlike pairs of knots and unknots. The results for 48 subjects, 21 to 45 years old, support the notion that images which have a characteristic 'foundation part' are more easily stored and accessed in the brain. Also, there is evidence that the comparison of deformable objects is processed by mental strategies other than self-evident mental rotation.

  17. Processing Nanostructured Structural Ceramics

    Science.gov (United States)

    2006-08-01

    aspects of the processing of nanostructured ceramics, viz. the production of a flowable and compactable dry nanopowder suitable for use in... composition due to the different synthesis routes used. Therefore, ‘industry-standard’ dispersants can cause flocculation rather than dispersion...stabilised zirconia (3-YSZ) were no higher than for conventional, micron-sized material of the same composition. However, detailed crystallographic

  18. Inelastic Light Scattering Processes

    Science.gov (United States)

    Fouche, Daniel G.; Chang, Richard K.

    1973-01-01

    Five different inelastic light scattering processes will be denoted by ordinary Raman scattering (ORS), resonance Raman scattering (RRS), off-resonance fluorescence (ORF), resonance fluorescence (RF), and broad fluorescence (BF). A distinction between fluorescence (including ORF and RF) and Raman scattering (including ORS and RRS) will be made in terms of the number of intermediate molecular states which contribute significantly to the scattered amplitude, and not in terms of excited state lifetimes or virtual versus real processes. The theory of these processes will be reviewed, including the effects of pressure, laser wavelength, and laser spectral distribution on the scattered intensity. The application of these processes to the remote sensing of atmospheric pollutants will be discussed briefly. It will be pointed out that the poor sensitivity of the ORS technique cannot be increased by going toward resonance without also compromising the advantages it has over the RF technique. Experimental results on inelastic light scattering from I(sub 2) vapor will be presented. As a single longitudinal mode 5145 A argon-ion laser line was tuned away from an I(sub 2) absorption line, the scattering was observed to change from RF to ORF. The basis of the distinction is the different pressure dependence of the scattered intensity. Nearly three orders of magnitude enhancement of the scattered intensity was measured in going from ORF to RF. Forty-seven overtones were observed and their relative intensities measured. The ORF cross section of I(sub 2) compared to the ORS cross section of N(sub 2) was found to be 3 x 10(exp 6), with I(sub 2) at its room temperature vapor pressure.

  19. Pyrolysis process and apparatus

    Science.gov (United States)

    Lee, Chang-Kuei

    1983-01-01

    This invention discloses a process and apparatus for pyrolyzing particulate coal by heating with a particulate solid heating media in a transport reactor. The invention tends to dampen fluctuations in the flow of heating media upstream of the pyrolysis zone, and by so doing forms a substantially continuous and substantially uniform annular column of heating media flowing downwardly along the inside diameter of the reactor. The invention is particularly useful for bituminous or agglomerative type coals.

  20. The Caroline interrogatory process

    Energy Technology Data Exchange (ETDEWEB)

    Degagne, D. [Alberta Energy and Utilities Board, Calgary, AB (Canada); Gibson, T. [Gecko Management, Calgary, AB (Canada)

    1999-11-01

    Using the specific case study of the Caroline interrogatory process, an example is given of how an effective communications and public involvement program can re-establish trust and credibility levels within a community after an incident. The public is nervous about sour gas, especially about blowouts of gas from a pipeline. The post-approval period was marked by high expectations and a community consultation program which included a community advisory board, an emergency planning committee, socio-economic factors, and environmental monitoring and studies. Information and education involve newspaper articles, newsletters, tours, public consultation meetings, and weekly e-mail. Mercury was detected as a potential hazard at the site, and company actions are illustrated. Overall lessons learned included: starting early paid off, face to face resident contacts were the most effective, the willingness to make changes was the key to success, the community helped, knowing all the answers is not essential, and there is a need for empathy. The interrogatory process includes a hybrid technique comprising four stages: 1) process review and public input, 2) identification and clarification of issues, 3) responses by industry and government, and 4) a public forum and follow-up action.

  2. Processing of lateritic ores

    Energy Technology Data Exchange (ETDEWEB)

    Collier, D.E.; Ring, R.J. [Environment Division, Australian Nuclear Science and Technology Organisation, Menai, New South Wales (Australia); McGill, J.; Russell, H. [Energy Resources of Australia Ltd., Ranger Mine, Jabiru, Northern Territory (Australia)

    2000-07-01

    Highly weathered or lateritic ores that contain high proportions of fine clay minerals present specific problems when they are processed to extract uranium. Of perhaps the greatest significance is the potential of the fine minerals to adsorb dissolved uranium (preg-robbing) from leach liquors produced by processing laterites or blends of laterite and primary ores. These losses can amount to 25% of the readily soluble uranium. The clay components can also restrict practical slurry densities to relatively low values in order to avoid rheology problems in pumping and agitation. The fine fractions also contribute to relatively poor solid-liquid separation characteristics in settling and/or filtration. Studies at ANSTO have characterised the minerals believed to be responsible for these problems and quantified the effects of the fines in these types of ores. Processing strategies were also examined, including roasting, resin-in-leach and separate leaching of the laterite fines to overcome potential problems. The incorporation of the preferred treatment option into an existing mill circuit is discussed. (author)

  3. Advanced microwave processing concepts

    Energy Technology Data Exchange (ETDEWEB)

    Lauf, R.J.; McMillan, A.D.; Paulauskas, F.L. [Oak Ridge National Laboratory, TN (United States)

    1995-05-01

    The purpose of this work is to explore the feasibility of several advanced microwave processing concepts to develop new energy-efficient materials and processes. The project includes two tasks: (1) commercialization of the variable-frequency microwave furnace; and (2) microwave curing of polymer composites. The variable frequency microwave furnace, whose initial conception and design was funded by the AIC Materials Program, will allow us, for the first time, to conduct microwave processing studies over a wide frequency range. This novel design uses a high-power traveling wave tube (TWT) originally developed for electronic warfare. By using this microwave source, one can not only select individual microwave frequencies for particular experiments, but also achieve uniform power densities over a large area by the superposition of many different frequencies. Microwave curing of thermoset resins will be studied because it holds the potential of in-situ curing of continuous-fiber composites for strong, lightweight components. Microwave heating can shorten curing times, provided issues of scaleup, uniformity, and thermal management can be adequately addressed.
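    The claim that superposing many frequencies yields a more uniform power density can be illustrated with a one-dimensional toy calculation (my own sketch with arbitrary mode numbers, not from the report): a single standing-wave mode has nodes where no heating occurs, while the average over a band of modes is much flatter across the cavity interior.

```python
# Toy 1-D illustration (not from the report): compare the heating pattern of a
# single standing-wave mode with the average over a band of modes.
import numpy as np

L = 1.0                                    # assumed cavity length (arbitrary units)
x = np.linspace(0.05 * L, 0.95 * L, 500)   # sample the interior (the walls are always nodes)

def power_density(modes):
    """Average standing-wave heating pattern sin(k x)^2 over a set of modes."""
    p = np.zeros_like(x)
    for n in modes:
        p += np.sin(n * np.pi * x / L) ** 2
    return p / len(modes)

for modes in ([10], list(range(10, 41))):  # one mode versus a band of modes
    p = power_density(modes)
    print(f"{len(modes):2d} mode(s): power density min={p.min():.2f} "
          f"max={p.max():.2f} spread={p.std():.2f}")
```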

  4. Advanced microwave processing concepts

    Energy Technology Data Exchange (ETDEWEB)

    Lauf, R.J.; McMillan, A.D.; Paulauskas, F.L. [Oak Ridge National Lab., TN (United States)

    1997-04-01

    The purpose of this work is to explore the feasibility of several advanced microwave processing concepts to develop new energy-efficient materials and processes. The project includes two tasks: (1) commercialization of the variable-frequency microwave furnace; and (2) microwave curing of polymeric materials. The variable frequency microwave furnace, whose initial conception and design was funded by the AIM Materials Program, allows the authors, for the first time, to conduct microwave processing studies over a wide frequency range. This novel design uses a high-power traveling wave tube (TWT) originally developed for electronic warfare. By using this microwave source, one can not only select individual microwave frequencies for particular experiments, but also achieve uniform power densities over a large area by the superposition of many different frequencies. Microwave curing of various thermoset resins will be studied because it holds the potential of in-situ curing of continuous-fiber composites for strong, lightweight components or in-situ curing of adhesives, including metal-to-metal. Microwave heating can shorten curing times, provided issues of scaleup, uniformity, and thermal management can be adequately addressed.

  5. Laser processing of materials

    Indian Academy of Sciences (India)

    J Dutta Majumdar; I Manna

    2003-06-01

    Light amplification by stimulated emission of radiation (laser) is a coherent and monochromatic beam of electromagnetic radiation that can propagate in a straight line with negligible divergence and occur in a wide range of wavelength, energy/power and beam-modes/configurations. As a result, lasers find wide applications in the mundane to the most sophisticated devices, in commercial to purely scientific purposes, and in life-saving as well as life-threatening causes. In the present contribution, we provide an overview of the application of lasers for material processing. The processes covered are broadly divided into four major categories; namely, laser-assisted forming, joining, machining and surface engineering. Apart from briefly introducing the fundamentals of these operations, we present an updated review of the relevant literature to highlight the recent advances and open questions. We begin our discussion with the general applications of lasers, fundamentals of laser-matter interaction and classification of laser material processing. A major part of the discussion focuses on laser surface engineering that has attracted a good deal of attention from the scientific community for its technological significance and scientific challenges. In this regard, a special mention is made about laser surface vitrification or amorphization that remains a very attractive but unaccomplished proposition.

  6. Basic Social Processes

    Directory of Open Access Journals (Sweden)

    Barney G. Glaser, PhD, Hon. PhD

    2005-06-01

    The goal of grounded theory is to generate a theory that accounts for a pattern of behavior that is relevant and problematic for those involved. The goal is not voluminous description, nor clever verification. As with all grounded theory, the generation of a basic social process (BSP) theory occurs around a core category. While a core category is always present in a grounded research study, a BSP may not be. BSPs are ideally suited to generation by grounded theory from qualitative research because qualitative research can pick up process through fieldwork that continues over a period of time. BSPs are a delight to discover and formulate since they give so much movement and scope to the analyst’s perception of the data. BSPs such as cultivating, defaulting, centering, highlighting or becoming give the feeling of process, change and movement over time. They also have clear, amazing general implications; so much so, that it is hard to contain them within the confines of a single substantive study. The tendency is to refer to them as a formal theory without the necessary comparative development of formal theory. They are labeled by a “gerund” (“-ing”), which both stimulates their generation and the tendency to over-generalize them.

  7. Adaptive Signal Processing Testbed

    Science.gov (United States)

    Parliament, Hugh A.

    1991-09-01

    The design and implementation of a system for the acquisition, processing, and analysis of signal data is described. The initial application for the system is the development and analysis of algorithms for excision of interfering tones from direct sequence spread spectrum communication systems. The system is called the Adaptive Signal Processing Testbed (ASPT) and is an integrated hardware and software system built around the TMS320C30 chip. The hardware consists of a radio frequency data source, digital receiver, and an adaptive signal processor implemented on a Sun workstation. The software components of the ASPT consist of a number of packages, including the Sun driver package; UNIX programs that support software development on the TMS320C30 boards; UNIX programs that provide the control, user interaction, and display capabilities for the data acquisition, processing, and analysis components of the ASPT; and programs that perform the ASPT functions including data acquisition, despreading, and adaptive filtering. The performance of the ASPT system is evaluated by comparing actual data rates against their desired values. A number of system limitations are identified and recommendations are made for improvements.
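    As a rough illustration of the tone-excision problem the ASPT was built for (an assumed textbook-style adaptive line enhancer with invented signal parameters, not the ASPT's actual algorithms), an LMS predictor can learn the narrowband interfering tone from a delayed reference and subtract it, leaving the wideband spread-spectrum component in the prediction error.

```python
# Minimal adaptive line-enhancer sketch (invented parameters, not the ASPT's
# algorithms): an LMS predictor learns the narrowband tone from a delayed
# reference and subtracts it, leaving the wideband spread-spectrum component.
import numpy as np

rng = np.random.default_rng(1)
n = 4000
chips = rng.choice([-1.0, 1.0], size=n)                 # wideband DSSS-like signal
tone = 3.0 * np.cos(2 * np.pi * 0.12 * np.arange(n))    # strong interfering tone
x = chips + tone

taps, delay, mu = 32, 4, 1e-4
w = np.zeros(taps)
cleaned = np.zeros(n)
for k in range(taps + delay, n):
    ref = x[k - delay - taps:k - delay][::-1]           # delayed reference vector
    pred = w @ ref                                      # predictable (tone) part
    err = x[k] - pred                                   # error ~ wideband signal
    w += 2 * mu * err * ref                             # LMS weight update
    cleaned[k] = err

print("power before excision:", round(np.var(x), 2),
      "  after:", round(np.var(cleaned[taps + delay:]), 2))
```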

  8. Spacelab Ground Processing

    Science.gov (United States)

    Scully, Edward J.; Gaskins, Roger B.

    1982-02-01

    Spacelab (SL) ground processing is active at the Kennedy Space Center (KSC). The palletized payload for the second Shuttle launch is staged and integrated with interface verification active. The SL Engineering Model is being assembled for subsequent test and checkout activities. After delivery of SL flight elements from Europe, prelaunch operations for the first SL flight start with receipt of the flight experiment packages and staging of the SL hardware. Experiment operations consist of integrating the various experiment elements into the SL racks, floors and pallets. Rack and floor assemblies, with the experiments installed, are integrated into the flight module. Aft end-cone installation, pallet connections, and SL subsystems interface verifications are accomplished, and SL-Orbiter interfaces verified. The Spacelab cargo is then transferred to the Orbiter Processing Facility (OPF) in a controlled environment using a canister/transporter. After the SL is installed into the Orbiter payload bay, the physical and functional integrity of all payload-to-Orbiter interfaces is verified and final close-out operations conducted. Spacelab payload activities at the launch pad are minimal with the payload bay doors remaining closed. Limited access is available to the module through the Spacelab Transfer Tunnel. After mission completion, the SL is removed from the Orbiter in the OPF and returned to the SL processing facility for experiment equipment removal and reconfiguration for the subsequent mission.

  9. Process for protein PEGylation.

    Science.gov (United States)

    Pfister, David; Morbidelli, Massimo

    2014-04-28

    PEGylation is a versatile drug delivery technique that presents a particularly wide range of conjugation chemistry and polymer structure. The conjugated protein can be tuned to specifically meet the needs of the desired application. In the area of drug delivery this typically means to increase the persistency in the human body without affecting the activity profile of the original protein. On the other hand, because of the high costs associated with the production of therapeutic proteins, subsequent operations imposed by PEGylation must be optimized to minimize the costs inherent to the additional steps. The closest attention has to be given to the PEGylation reaction engineering and to the subsequent purification processes. This review article focuses on these two aspects and critically reviews the current state of the art with a clear focus on the development of industrial scale processes which can meet the market requirements in terms of quality and costs. The possibility of using continuous processes, with integration between the reaction and the separation steps is also illustrated.

  10. Process measuring techniques; Prozessmesstechnik

    Energy Technology Data Exchange (ETDEWEB)

    Freudenberger, A.

    2000-07-01

    This introduction to measurement techniques for chemical and process-technical plant in science and industry describes in detail the methods used to measure basic quantities. Most prominent are modern measuring techniques by means of ultrasound, microwaves and the Coriolis effect. Alongside physical and measuring technique fundamentals, the practical applications of measuring devices are described. Calculation examples are given to illustrate the subject matter. The book addresses students of physical engineering, process engineering and environmental engineering at technical schools as well as engineers of other disciplines wishing to familiarize themselves with the subject of process measurement techniques. (orig.)

  11. RACORO aerosol data processing

    Energy Technology Data Exchange (ETDEWEB)

    Elisabeth Andrews

    2011-10-31

    The RACORO aerosol data (cloud condensation nuclei (CCN), condensation nuclei (CN) and aerosol size distributions) need further processing to be useful for model evaluation (e.g., GCM droplet nucleation parameterizations) and other investigations. These tasks include: (1) Identification and flagging of 'splash' contaminated Twin Otter aerosol data. (2) Calculation of actual supersaturation (SS) values in the two CCN columns flown on the Twin Otter. (3) Interpolation of CCN spectra from SGP and Twin Otter to 0.2% SS. (4) Process data for spatial variability studies. (5) Provide calculated light scattering from measured aerosol size distributions. Below we first briefly describe the measurements and then describe the results of several data processing tasks that have been completed, paving the way for the scientific analyses for which the campaign was designed. The end result of this research will be several aerosol data sets which can be used to achieve some of the goals of the RACORO mission, including the enhanced understanding of cloud-aerosol interactions and improved cloud simulations in climate models.
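    For task (3), one common way to interpolate a CCN spectrum to 0.2% supersaturation is to fit the two-parameter power law N_CCN = C·SS^k in log-log space and evaluate it at the target SS; this is an assumption for illustration, not necessarily the method used in the RACORO processing, and the numbers below are invented.

```python
# Hypothetical CCN interpolation sketch: fit N = C * SS**k in log-log space to
# measured (SS, N) pairs and evaluate at SS = 0.2%. The sample values are
# invented; the actual RACORO processing may differ.
import numpy as np

ss = np.array([0.1, 0.3, 0.6, 1.0])              # supersaturation (%)
n_ccn = np.array([180.0, 420.0, 650.0, 900.0])   # CCN concentration (cm^-3)

k, log_c = np.polyfit(np.log(ss), np.log(n_ccn), 1)   # slope k, intercept ln(C)
c = np.exp(log_c)

ss_target = 0.2
n_interp = c * ss_target ** k
print(f"fit: N = {c:.0f} * SS^{k:.2f};  N_CCN(0.2% SS) ~ {n_interp:.0f} cm^-3")
```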

  12. Approximate simulation of Hawkes processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Rasmussen, Jakob Gulddahl

    2006-01-01

    Hawkes processes are important in point process theory and its applications, and simulation of such processes is often needed for various statistical purposes. This article concerns a simulation algorithm for unmarked and marked Hawkes processes, exploiting that the process can be constructed...
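    The cluster construction referred to above can be sketched roughly as follows (an illustration with an exponential excitation kernel chosen for convenience, not the article's algorithm, and ignoring the edge effects the authors address): immigrants arrive as a homogeneous Poisson process, and each event independently generates a Poisson number of offspring displaced by the normalized kernel.

```python
# Rough cluster-based Hawkes simulation sketch (exponential kernel chosen for
# illustration; this is not the algorithm of the article, which also handles
# edge effects and marks). Intensity: lambda(t) = mu + sum alpha*exp(-beta*(t - t_i)).
import numpy as np

def simulate_hawkes(mu=0.5, alpha=0.8, beta=1.2, T=100.0, seed=0):
    rng = np.random.default_rng(seed)
    branching_ratio = alpha / beta                    # mean offspring per event (< 1)
    # Immigrants: homogeneous Poisson process with rate mu on [0, T].
    immigrants = rng.uniform(0.0, T, rng.poisson(mu * T))
    events, generation = [], list(immigrants)
    while generation:                                 # breadth-first over the cluster tree
        events.extend(generation)
        children = []
        for t in generation:
            n_kids = rng.poisson(branching_ratio)
            kids = t + rng.exponential(1.0 / beta, n_kids)   # offspring displacements
            children.extend(kids[kids <= T])          # crude truncation at T (edge effects ignored)
        generation = children
    return np.sort(np.array(events))

times = simulate_hawkes()
print(len(times), "events; rough theoretical mean ~", round(0.5 * 100 / (1 - 0.8 / 1.2), 1))
```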

  13. CONVERGENCE TO PROCESS ORGANIZATION BY MODEL OF PROCESS MATURITY

    Directory of Open Access Journals (Sweden)

    Blaženka Piuković Babičković

    2015-06-01

    Modern business process orientation is bound up primarily with process thinking and a process-based organizational structure. Although business processes are increasingly written and spoken about, a major problem remains in the business world, especially in countries in transition, where a lack of understanding of the concept of business process management has been found. The aim of this paper is to make a specific contribution to overcoming the identified problem by pointing out the significance of the concept of business process management, and by presenting a model for reviewing process maturity along with tools recommended for use in process management.

  14. Discovery as a process

    Energy Technology Data Exchange (ETDEWEB)

    Loehle, C.

    1994-05-01

    The three great myths, which form a sort of triumvirate of misunderstanding, are the Eureka! myth, the hypothesis myth, and the measurement myth. These myths are prevalent among scientists as well as among observers of science. The Eureka! myth asserts that discovery occurs as a flash of insight, and as such is not subject to investigation. This leads to the perception that discovery or deriving a hypothesis is a moment or event rather than a process. Events are singular and not subject to description. The hypothesis myth asserts that proper science is motivated by testing hypotheses, and that if something is not experimentally testable then it is not scientific. This myth leads to absurd posturing by some workers conducting empirical descriptive studies, who dress up their study with a "hypothesis" to obtain funding or get it published. Methods papers are often rejected because they do not address a specific scientific problem. The fact is that many of the great breakthroughs in science involve methods and not hypotheses or arise from largely descriptive studies. Those captured by this myth also try to block funding for those developing methods. The third myth is the measurement myth, which holds that determining what to measure is straightforward, so one doesn't need a lot of introspection to do science. As one ecologist put it to me, "Don't give me any of that philosophy junk, just let me out in the field. I know what to measure." These myths lead to difficulties for scientists who must face peer review to obtain funding and to get published. These myths also inhibit the study of science as a process. Finally, these myths inhibit creativity and suppress innovation. In this paper I first explore these myths in more detail and then propose a new model of discovery that opens the supposedly miraculous process of discovery to closer scrutiny.

  15. Solar Flares: Magnetohydrodynamic Processes

    Directory of Open Access Journals (Sweden)

    Kazunari Shibata

    2011-12-01

    This paper outlines the current understanding of solar flares, mainly focused on the magnetohydrodynamic (MHD) processes responsible for producing a flare. Observations show that flares are one of the most explosive phenomena in the atmosphere of the Sun, releasing a huge amount of energy up to about 10^32 erg on the timescale of hours. Flares involve the heating of plasma, mass ejection, and particle acceleration that generates high-energy particles. The key physical processes for producing a flare are: the emergence of magnetic field from the solar interior to the solar atmosphere (flux emergence), local enhancement of electric current in the corona (formation of a current sheet), and rapid dissipation of electric current (magnetic reconnection) that causes shock heating, mass ejection, and particle acceleration. The evolution toward the onset of a flare is rather quasi-static when free energy is accumulated in the form of coronal electric current (field-aligned current, more precisely), while the dissipation of coronal current proceeds rapidly, producing various dynamic events that affect lower atmospheres such as the chromosphere and photosphere. Flares manifest such rapid dissipation of coronal current, and their theoretical modeling has been developed in accordance with observations, in which numerical simulations proved to be a strong tool reproducing the time-dependent, nonlinear evolution of a flare. We review the models proposed to explain the physical mechanism of flares, giving a comprehensive explanation of the key processes mentioned above. We start with basic properties of flares, then go into the details of energy build-up, release and transport in flares where magnetic reconnection works as the central engine to produce a flare.
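    The quoted energy scale of about 10^32 erg can be checked with a standard back-of-the-envelope estimate of the free magnetic energy, B^2/8π times the active-region volume; the field strength and length scale below are typical assumed values, not figures from this paper.

```python
# Order-of-magnitude check of the ~1e32 erg flare energy quoted above, using
# typical (assumed) active-region values in CGS units.
import math

B = 300.0            # coronal magnetic field strength (gauss), assumed
L = 3.0e9            # characteristic size of the flaring region (cm), assumed

energy_density = B**2 / (8.0 * math.pi)   # magnetic energy density (erg/cm^3)
volume = L**3                             # active-region volume (cm^3)
print(f"magnetic energy ~ {energy_density * volume:.1e} erg")
```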

  16. Time processing in dyscalculia

    Directory of Open Access Journals (Sweden)

    marinella eCappelletti

    2011-12-01

    To test whether atypical number development may affect other types of quantity processing, we investigated temporal discrimination in adults with developmental dyscalculia (DD). This also allowed us to test whether (1) number and time may be sub-served by a common quantity system or decision mechanisms – in which case they may both be impaired, or (2) whether number and time are distinct – and therefore they may dissociate. Participants judged which of two successively presented horizontal lines was longer in duration, the first line being preceded by either a small or a large number prime (‘1’ or ‘9’) or by a neutral symbol (‘#’), or, in a third task, decided which of two Arabic numbers (either ‘1’, ‘5’, or ‘9’) lasted longer. Results showed that (i) DD’s temporal discriminability was normal as long as numbers were not part of the experimental design even as task-irrelevant stimuli; however (ii) task-irrelevant numbers dramatically disrupted DD’s temporal discriminability, the more their salience increased, though the actual magnitude of the numbers had no effect; and in contrast (iii) controls’ time perception was robust to the presence of numbers but modulated by numerical quantity such that small number primes or numerical stimuli made durations appear shorter than veridical and the opposite for larger numerical primes or numerical stimuli. This study is the first to investigate a continuous quantity such as time in a population with a congenital number impairment and to show that atypical development of numerical competence leaves continuous quantity processing spared. Our data support the idea of a partially shared quantity system across numerical and temporal dimensions, which allows dissociations and interactions among dimensions; furthermore, they suggest that impaired number in DD is unlikely to originate from systems initially dedicated to continuous quantity processing like time.

  17. Stone dusting process advance

    Energy Technology Data Exchange (ETDEWEB)

    Matt Ryan; David Humphreys [Mining Attachments (Qld.) Pty Ltd. (Australia)

    2009-01-15

    The coal mining industry has, for many years, used dry stone dust or calcium carbonate (CaCO{sub 3}) in the prevention of the propagation of coal dust explosions throughout their underground mines in Australia. In the last decade wet stone dusting has been introduced. This is where stone dust and water are mixed together to form a paste like slurry. This mixture is pumped and sprayed on to the underground roadway surfaces. This method solved the contamination of the intake airways but brought with it a new problem known as 'caking'. Caking is the hardened layer that is formed as the stone dust slurry dries. It was proven that this hardened layer compromises the dispersal characteristics of the stone dust and therefore its ability to suppress a coal dust explosion. This project set out to prove a specially formulated, non toxic slurry additive and process that could overcome the caking effect. The slurry additive process combines dry stone dust with water to form a slurry. The slurry is then treated with the additive and compressed air to create a highly vesicular foam like stone dusted surface. The initial testing on a range of additives and the effectiveness in minimising the caking effect of wet dusting were performed at Applied Chemical's research laboratory in Melbourne, Victoria and independently tested at the SGS laboratory in Paget, Queensland. The results from these tests provided the platform to conduct full scale spraying trials at the Queensland Mines Rescue Station and Caledon Coal's Cook Colliery, Blackwater. The project moved into the final stage of completion with the collection of data. The intent was to compare the slurry additive process to dry stone dusting in full-scale methane explosions at the CSIR Kloppersbos explosion facility in Kloppersbos, South Africa.

  18. Multivariate Statistical Process Control Process Monitoring Methods and Applications

    CERN Document Server

    Ge, Zhiqiang

    2013-01-01

      Given their key position in the process control industry, process monitoring techniques have been extensively investigated by industrial practitioners and academic control researchers. Multivariate statistical process control (MSPC) is one of the most popular data-based methods for process monitoring and is widely used in various industrial areas. Effective routines for process monitoring can help operators run industrial processes efficiently at the same time as maintaining high product quality. Multivariate Statistical Process Control reviews the developments and improvements that have been made to MSPC over the last decade, and goes on to propose a series of new MSPC-based approaches for complex process monitoring. These new methods are demonstrated in several case studies from the chemical, biological, and semiconductor industrial areas.   Control and process engineers, and academic researchers in the process monitoring, process control and fault detection and isolation (FDI) disciplines will be inter...
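    As a concrete (and generic) example of the data-based monitoring the book surveys, the sketch below builds a PCA model on in-control data and monitors new samples with a Hotelling T^2 statistic; the data, the number of retained components, and the control limit are all invented for illustration and are not taken from the book.

```python
# Generic MSPC sketch (not from the book): fit a PCA model on in-control data,
# then flag new samples whose Hotelling T^2 in the retained principal-component
# subspace exceeds an empirical control limit. All numbers are invented.
import numpy as np

rng = np.random.default_rng(2)
train = rng.normal(size=(500, 5))              # in-control training data (5 variables)

mean = train.mean(axis=0)
cov = np.cov(train - mean, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
keep = np.argsort(eigvals)[::-1][:2]           # retain the two largest components
P, lam = eigvecs[:, keep], eigvals[keep]

def t2(sample):
    """Hotelling T^2 of one sample in the retained subspace."""
    scores = (sample - mean) @ P
    return float(np.sum(scores ** 2 / lam))

limit = np.quantile([t2(row) for row in train], 0.99)   # empirical 99% control limit

normal_sample = rng.normal(size=5)
faulty_sample = normal_sample + 6.0 * P[:, 0]  # simulated fault along the main PC
print(f"limit {limit:.2f}  normal T2 {t2(normal_sample):.2f}  faulty T2 {t2(faulty_sample):.2f}")
```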

  19. Process for soil consolidation

    Energy Technology Data Exchange (ETDEWEB)

    Herrick, F.W.; Brandstrom, R.I.

    1967-01-09

    In this process for the formation of a consolidated aggregate, a mass of solid particles is combined with an aqueous alkaline consolidating compound which forms a gel. This gel consists principally of a mixture of the following: a vegetable polyphenolic material; one of the group of catechins; condensed tannins and extract of the bark of coniferous trees; with 1-10% by weight of formaldehyde; and a catalyst of the group of water-soluble salts of chromium, iron, and aluminum. This catalyst serves to catalyze the reaction of formation of the gel.

  20. Phonocardiography Signal Processing

    CERN Document Server

    Abbas, Abbas K

    2009-01-01

    The auscultation method is an important diagnostic indicator for hemodynamic anomalies. Heart sound classification and analysis play an important role in the auscultative diagnosis. The term phonocardiography refers to the tracing technique of heart sounds and the recording of cardiac acoustic vibrations by means of a microphone-transducer. Therefore, understanding the nature and source of this signal is important as a basis for developing a competent tool for further analysis and processing, in order to enhance and optimize the cardiac clinical diagnostic approach. This book gives the

  1. Bismuth vanadate process

    Energy Technology Data Exchange (ETDEWEB)

    Sullivan, R.M.

    1990-06-26

    This patent describes the process for the preparation of bismuth vanadate and bismuth vanadate-containing compounds wherein the precursor materials are calcined in the solid state at temperatures sufficient to react the precursor materials to prepare the vanadate compounds. It comprises: wet grinding the calcined product, contacting the calcined product with sufficient alkaline material to provide a pH level of 7.0-13.0 and recovering the treated product, the wet grinding of the calcined product being conducted either in the presence of the alkaline material or prior to the contacting with the alkaline material.

  2. Medical image processing

    CERN Document Server

    Dougherty, Geoff

    2011-01-01

    This book is designed for end users in the field of digital imaging, who wish to update their skills and understanding with the latest techniques in image analysis. This book emphasizes the conceptual framework of image analysis and the effective use of image processing tools. It uses applications in a variety of fields to demonstrate and consolidate both specific and general concepts, and to build intuition, insight and understanding. Although the chapters are essentially self-contained they reference other chapters to form an integrated whole. Each chapter employs a pedagogical approach to e

  3. FHR Process Instruments

    Energy Technology Data Exchange (ETDEWEB)

    Holcomb, David Eugene [ORNL

    2015-01-01

    Fluoride salt-cooled High temperature Reactors (FHRs) are entering into early phase engineering development. Initial candidate technologies have been identified to measure all of the required process variables. The purpose of this paper is to describe the proposed measurement techniques in sufficient detail to enable assessment of the proposed instrumentation suite and to support development of the component technologies. This paper builds upon the instrumentation chapter of the recently published FHR technology development roadmap. Locating instruments outside of the intense core radiation and high-temperature fluoride salt environment significantly decreases their environmental tolerance requirements. Under operating conditions, FHR primary coolant salt is a transparent, low-vapor-pressure liquid. Consequently, FHRs can employ standoff optical measurements from above the salt pool to assess in-vessel conditions. For example, the core outlet temperature can be measured by observing the fuel's blackbody emission. Similarly, the intensity of the core's Cerenkov glow indicates the fission power level. Short-lived activation of the primary coolant provides another means for standoff measurements of process variables. The primary coolant flow and neutron flux can be measured using gamma spectroscopy along the primary coolant piping. FHR operation entails a number of process measurements. Reactor thermal power and core reactivity are the most significant variables for process control. Thermal power can be determined by measuring the primary coolant mass flow rate and temperature rise across the core. The leading candidate technologies for primary coolant temperature measurement are Au-Pt thermocouples and Johnson noise thermometry. Clamp-on ultrasonic flow measurement, which includes high-temperature tolerant standoffs, is a potential coolant flow measurement technique. Also, the salt redox condition will be monitored as an indicator of its corrosiveness. Both
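    The relation used for thermal power, Q = ṁ·c_p·ΔT, is simple enough to sketch numerically; the specific heat below is roughly that of a FLiBe-type salt, and the flow rate and temperature rise are assumed values for illustration only.

```python
# Rough thermal-power calculation Q = m_dot * c_p * dT with assumed values:
# c_p is taken as roughly that of FLiBe salt (~2400 J/(kg*K)), and the flow
# rate and temperature rise are invented for illustration.
m_dot = 1200.0        # primary coolant mass flow rate (kg/s), assumed
c_p = 2400.0          # salt specific heat (J/(kg*K)), approximate FLiBe value
delta_t = 100.0       # core outlet minus inlet temperature (K), assumed

q_watts = m_dot * c_p * delta_t
print(f"core thermal power ~ {q_watts / 1e6:.0f} MWt")
```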

  4. Coupled Diffusion Processes

    Institute of Scientific and Technical Information of China (English)

    章复熹

    2004-01-01

    Coupled diffusion processes (or CDP for short) model the systems of molecular motors, which have attracted much interest from physicists and biologists in recent years[1,2,9,14,4,7,21]. The protein moves along a filament called the track, and it is crucial that there are several inner states of the protein and the underlying chemical reaction causes transitions among different inner states, while chemical energy can be converted to mechanical energy by ratchet effects[5,3,2,14,12].

  5. Image Processing Research

    Science.gov (United States)

    1975-09-30

    Technical Journal, Vol. 36, pp. 653-709, May 1957. ... Image Restoration and Enhancement Projects: image restoration and image enhancement are ... where σ_n^2 is the noise energy and I is an identity matrix. Color Image Scanner Calibration: a common problem in the ... line of the image ... The statistics of the process N(k) can now be given in terms of the statistics of m, σ^2, and the sequence W ...

  6. Process for treating biomass

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, Timothy J; Teymouri, Farzaneh

    2015-11-04

    This invention is directed to a process for treating biomass. The biomass is treated with a biomass swelling agent within the vessel to swell or rupture at least a portion of the biomass. A portion of the swelling agent is removed from a first end of the vessel following the treatment. Then steam is introduced into a second end of the vessel different from the first end to further remove swelling agent from the vessel in such a manner that the swelling agent exits the vessel at a relatively low water content.

  7. Industrial Information Processing

    DEFF Research Database (Denmark)

    Svensson, Carsten

    2002-01-01

    This paper demonstrates how cross-functional business processes may be aligned with product specification systems in an intra-organizational environment by integrating planning systems and expert systems, thereby providing an end-to-end integrated and automated solution to the “build-to-order” challenge. An outcome of this capability is that the potential market for customized products will expand, resulting in a reduction in administrative and manufacturing costs. This potential for cost reduction, simultaneous with market expansion, is a source of competitive advantage; hence manufacturers have...

  8. Software Process Improvement

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Konopka, Claudia; Nellemann, Peter

    2016-01-01

    Software process improvement (SPI) is around for decades: frameworks are proposed, success factors are studied, and experiences have been reported. However, the sheer mass of concepts, approaches, and standards published over the years overwhelms practitioners as well as researchers. What is out...... directions. An analysis of 635 publications draws a big picture of SPI-related research of the past 25 years. Our study shows a high number of solution proposals, experience reports, and secondary studies, but only few theories. In particular, standard SPI models are analyzed and evaluated for applicability...

  9. Process Principle of Information

    Institute of Scientific and Technical Information of China (English)

    张高锋; 任君

    2006-01-01

    Ⅰ. Introduction Information structure is the organization model of Given and New information in the course of information transmission. A discourse contains a variety of information and not all the information listed in the discourse is necessary and useful to us. When we decode a discourse, usually, we do not need to read every word in the discourse or text but skim or scan the discourse or text to search for what we think is important or useful to us in the discourse as quickly as possible. Ⅱ. Process Principles of Informati...

  10. Process Analytical Chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Veltkamp, David J.(VISITORS); Doherty, Steve D.(BCO); Anderson, B B.(VISITORS); Koch, Mel (University of Washington); Bond, Leonard J.(BATTELLE (PACIFIC NW LAB)); Burgess, Lloyd W.(VISITORS); Ullman, Alan H.(UNKNOWN); Bamberger, Judith A.(BATTELLE (PACIFIC NW LAB)); Greenwood, Margaret S.(BATTELLE (PACIFIC NW LAB))

    1999-06-15

    This review of process analytical chemistry is an update to the previous review on this subject published in 1995(A2). The time period covered for this review includes publications written or published from late 1994 until early 1999, with the addition of a few classic references pointing to background information critical to an understanding of a specific topic area. These older references have been critically included as established fundamental works. New topics covered in this review not previously treated as separate subjects in past reviews include sampling systems, imaging (via optical spectroscopy), and ultrasonic analysis.

  11. Sea Ice Processes

    Science.gov (United States)

    1988-01-01

    ... to provide information as to processes and their scales (as ascertained by ... horizontal compression being compensated by vertical motion. In the case of ice, upward ... operating characteristics of PIPS. These factors include the vertical grid ... warranted at this time. Further investigation is needed. ...

  12. Introduction to information processing

    CERN Document Server

    Deitel, Harvey M

    2014-01-01

    An Introduction to Information Processing provides an informal introduction to the computer field. This book introduces computer hardware, which is the actual computing equipment. Organized into three parts encompassing 12 chapters, this book begins with an overview of the evolution of personal computing and includes detailed case studies on two of the most essential personal computers for the 1980s, namely, the IBM Personal Computer and Apple's Macintosh. This text then traces the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. Other chapte

  13. Computers and data processing

    CERN Document Server

    Deitel, Harvey M

    1985-01-01

    Computers and Data Processing provides information pertinent to the advances in the computer field. This book covers a variety of topics, including the computer hardware, computer programs or software, and computer applications systems.Organized into five parts encompassing 19 chapters, this book begins with an overview of some of the fundamental computing concepts. This text then explores the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. Other chapters consider how computers present their results and explain the storage and retrieval of

  14. Stochastic conditional intensity processes

    DEFF Research Database (Denmark)

    Bauwens, Luc; Hautsch, Nikolaus

    2006-01-01

    In this article, we introduce the so-called stochastic conditional intensity (SCI) model by extending Russell’s (1999) autoregressive conditional intensity (ACI) model by a latent common dynamic factor that jointly drives the individual intensity components. We show by simulations that the proposed...... model allows for a wide range of (cross-)autocorrelation structures in multivariate point processes. The model is estimated by simulated maximum likelihood (SML) using the efficient importance sampling (EIS) technique. By modeling price intensities based on NYSE trading, we provide significant evidence...

  15. Hyperspectral image processing

    CERN Document Server

    Wang, Liguo

    2016-01-01

    Based on the authors’ research, this book introduces the main processing techniques in hyperspectral imaging. In this context, SVM-based classification, distance comparison-based endmember extraction, SVM-based spectral unmixing, spatial attraction model-based sub-pixel mapping, and MAP/POCS-based super-resolution reconstruction are discussed in depth. Readers will gain a comprehensive understanding of these cutting-edge hyperspectral imaging techniques. Researchers and graduate students in fields such as remote sensing, surveying and mapping, geosciences and information systems will benefit from this valuable resource.
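    As a generic illustration of the SVM-based per-pixel classification mentioned above (synthetic spectra and scikit-learn, not an example from the book), each pixel is treated as a vector of band values and classified with an RBF-kernel SVM.

```python
# Generic SVM pixel-classification sketch (synthetic spectra, not an example
# from the book): each "pixel" is a vector of band values, and an RBF-kernel
# SVM assigns it to one of two made-up land-cover classes.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(3)
bands, per_class = 50, 200
wl = np.linspace(0.0, 1.0, bands)              # normalized "wavelength" axis
class_a = np.sin(2 * np.pi * wl) + rng.normal(0, 0.3, (per_class, bands))
class_b = np.cos(2 * np.pi * wl) + rng.normal(0, 0.3, (per_class, bands))

X = np.vstack([class_a, class_b])
y = np.array([0] * per_class + [1] * per_class)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X_tr, y_tr)
print("held-out pixel classification accuracy:", round(clf.score(X_te, y_te), 3))
```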

  16. Digital signal processing

    CERN Document Server

    O'Shea, Peter; Hussain, Zahir M

    2011-01-01

    In three parts, this book contributes to the advancement of engineering education and serves as a general reference on digital signal processing. Part I presents the basics of analog and digital signals and systems in the time and frequency domain. It covers the core topics: convolution, transforms, filters, and random signal analysis. It also treats important applications including signal detection in noise, radar range estimation for airborne targets, binary communication systems, channel estimation, banking and financial applications, and audio effects production. Part II considers sel
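    One of the applications listed, radar range estimation, reduces to estimating the echo delay and converting it with R = c·τ/2; the sketch below (invented pulse, delay, and noise levels, not an example from the book) does this by cross-correlating the received signal with the transmitted pulse.

```python
# Minimal radar range-estimation sketch (invented pulse, delay, and noise):
# cross-correlate the received signal with the transmitted pulse, take the lag
# of the correlation peak, and convert delay to range with R = c * tau / 2.
import numpy as np

fs = 1.0e6                                  # sample rate (Hz), assumed
c = 3.0e8                                   # speed of light (m/s)
pulse = np.ones(64)                         # simple rectangular transmit pulse
true_delay = 700                            # echo delay in samples (invented)

rng = np.random.default_rng(4)
rx = np.zeros(4096)
rx[true_delay:true_delay + pulse.size] += 0.5 * pulse   # attenuated echo
rx += rng.normal(0, 0.2, rx.size)                       # receiver noise

corr = np.correlate(rx, pulse, mode="valid")            # matched-filter output
tau = np.argmax(corr) / fs                              # estimated round-trip delay (s)
print(f"estimated range ~ {c * tau / 2 / 1e3:.1f} km "
      f"(true {c * true_delay / fs / 2 / 1e3:.1f} km)")
```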

  17. Digital signal processing laboratory

    CERN Document Server

    Kumar, B Preetham

    2011-01-01

    INTRODUCTION TO DIGITAL SIGNAL PROCESSING: Brief Theory of DSP Concepts; Problem Solving; Computer Laboratory: Introduction to MATLAB®/SIMULINK®; Hardware Laboratory: Working with Oscilloscopes, Spectrum Analyzers, Signal Sources; Digital Signal Processors (DSPs); References. DISCRETE-TIME LTI SIGNALS AND SYSTEMS: Brief Theory of Discrete-Time Signals and Systems; Problem Solving; Computer Laboratory: Simulation of Continuous-Time and Discrete-Time Signals and Systems; References. TIME AND FREQUENCY ANALYSIS OF COMMUNICATION SIGNALS: Brief Theory of Discrete-Time Fourier Transform (DTFT), Discrete Fourier Transform

  18. Koenigs function and branching processes

    CERN Document Server

    Chikilev, O G

    2001-01-01

    An explicit solution of time-homogeneous pure birth branching processes is described. It gives alternative extensions for the negative binomial distribution (branching processes with immigration) and for the Furry-Yule distribution (branching processes without immigration).
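    As a concrete instance of the pure-birth case (my own check, not the paper's derivation): a Yule process started from one individual with per-individual birth rate λ has P(N(t) = n) = p(1 − p)^(n−1) with p = e^(−λt), i.e. the Furry-Yule (geometric) distribution mentioned above. The sketch below compares a simulation against this formula.

```python
# Monte Carlo check (my own sketch) that a pure-birth (Yule) process started
# from one individual has P(N(t)=n) = p*(1-p)**(n-1) with p = exp(-lam*t).
import numpy as np

rng = np.random.default_rng(5)
lam, t, runs = 1.0, 1.0, 20000

def yule_size(lam, t):
    """Simulate N(t) by summing exponential holding times with rate n*lam."""
    n, elapsed = 1, 0.0
    while True:
        elapsed += rng.exponential(1.0 / (n * lam))   # waiting time to the next birth
        if elapsed > t:
            return n
        n += 1

sizes = np.array([yule_size(lam, t) for _ in range(runs)])
p = np.exp(-lam * t)
for n in (1, 2, 3, 5):
    empirical = np.mean(sizes == n)
    theory = p * (1.0 - p) ** (n - 1)
    print(f"P(N(t)={n}): simulated {empirical:.3f}  vs  geometric {theory:.3f}")
```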

  19. Perfect simulation of Hawkes processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Rasmussen, Jakob Gulddahl

    This article concerns a perfect simulation algorithm for unmarked and marked Hawkes processes. The usual straightforward simulation algorithm suffers from edge effects, whereas our perfect simulation algorithm does not. By viewing Hawkes processes as Poisson cluster processes and using...

  20. NANOSCALE PROCESS ENGINEERING

    Institute of Scientific and Technical Information of China (English)

    Qixiang Wang; Fei Wei

    2003-01-01

    The research of nanoscale process engineering (NPE) is based on the interdisciplinary nature of nanoscale science and technology. It mainly deals with transformation of materials and energy into nanostructured materials and nanodevices, and synergizes the multidisciplinary convergence between materials science and technology, biotechnology, and information technology. The core technologies of NPE concern all aspects of nanodevice construction and operation, such as manufacture of nanomaterials "by design", concepts and design of nanoarchitectures, and manufacture and control of customizable nanodevices. Two main targets of NPE at present are focused on nanoscale manufacture and concept design of nanodevices. The research progress of nanoscale manufacturing processes focused on creating nanostructures and assembling them into nanosystems and larger scale architectures has built the interdiscipline of NPE. The concepts and design of smart, multi-functional, environmentally compatible and customizable nanodevice prototypes built from the nanostructured systems of nanocrystalline, nanoporous and microemulsion systems are most challenging tasks of NPE. The development of NPE may also impel us to consider the curriculum and educational reform of chemical engineering in universities.

  1. Oxytocin and emotion processing.

    Science.gov (United States)

    Di Simplicio, Martina; Harmer, Catherine J

    2016-11-01

    Since the observation that oxytocin has key effects on social decision making, research on this exciting neuropeptide has doubled in volume: hundreds of studies have pursued the promise of a specific oxytocin action on high-level cognition and social function with wide potential translational implications (from autism to social anxiety to dementia). Here we review the evidence on whether the complex behavioural effects observed in humans after exogenous oxytocin administration build on changes in basic emotional information processing, in particular recognition of emotional facial expressions, and attention and memory for emotionally-valenced stimuli. We observe that recent studies confirm that oxytocin facilitates more accurate emotion processing, irrespective of emotion type. However, it remains unclear whether this action precedes, is independent of, or even secondary to the neuropeptide promoting a greater salience of social stimuli. Overall, this growing research area has shown that oxytocin produces behavioural and neurofunctional outcomes that are highly dependent on the experimental context and on individual differences (gender, personality, life experiences). This poses an exciting challenge for future experimental medicine designs to address and unpack complex interactions between individual and context characteristics, which is needed for the development of more precise clinical applications.

  2. Mastering the diesel process

    Energy Technology Data Exchange (ETDEWEB)

    Antila, E.; Kaario, O.; Lahtinen, T. (and others)

    2004-07-01

    This is the final report of the research project 'Mastering the Diesel Process'. The project has been a joint research effort of the Helsinki University of Technology, the Tampere University of Technology, the Technical Research Centre of Finland, and the Aabo Akademi University. Moreover, the contribution of the Michigan Technological University has been important. The project 'Mastering the Diesel Process' has been a computational research project on the physical phenomena of diesel combustion. The theoretical basis of the project lies in computational fluid dynamics. Various submodels for computational fluid dynamics have been developed or tested within engine simulation. Various model combinations in three diesel engines of different sizes have been studied. The most important submodels comprise fuel spray drop breakup, fuel evaporation, gas-fuel interaction in the spray, the mixing model of combustion, heat transfer, and emission mechanisms. The boundary conditions and flow field modelling have been studied as well. The main simulation tool has been Star-CD. The KIVA code has been used in model development as well. With the help of simulation, we are able to investigate the effect of various design parameters or operational parameters on diesel combustion and emission formation. (orig.)

  3. ARM Mentor Selection Process

    Energy Technology Data Exchange (ETDEWEB)

    Sisterson, D. L. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-10-01

    The Atmospheric Radiation Measurement (ARM) Program was created in 1989 with funding from the U.S. Department of Energy (DOE) to develop several highly instrumented ground stations to study cloud formation processes and their influence on radiative transfer. In 2003, the ARM Program became a national scientific user facility, known as the ARM Climate Research Facility. This scientific infrastructure provides for fixed sites, mobile facilities, an aerial facility, and a data archive available for use by scientists worldwide through the ARM Climate Research Facility—a scientific user facility. The ARM Climate Research Facility currently operates more than 300 instrument systems that provide ground-based observations of the atmospheric column. To keep ARM at the forefront of climate observations, the ARM infrastructure depends heavily on instrument scientists and engineers, also known as lead mentors. Lead mentors must have an excellent understanding of in situ and remote-sensing instrumentation theory and operation and have comprehensive knowledge of critical scale-dependent atmospheric processes. They must also possess the technical and analytical skills to develop new data retrievals that provide innovative approaches for creating research-quality data sets. The ARM Climate Research Facility is seeking the best overall qualified candidate who can fulfill lead mentor requirements in a timely manner.

  4. The anaerobic digestion process

    Energy Technology Data Exchange (ETDEWEB)

    Rivard, C.J. [National Renewable Energy Lab., Golden, CO (United States); Boone, D.R. [Oregon Graduate Inst., Portland, OR (United States)

    1996-01-01

    The microbial process of converting organic matter into methane and carbon dioxide is so complex that anaerobic digesters have long been treated as "black boxes." Research into this process during the past few decades has gradually unraveled this complexity, but many questions remain. The major biochemical reactions for forming methane by methanogens are largely understood, and evolutionary studies indicate that these microbes are as different from bacteria as they are from plants and animals. In anaerobic digesters, methanogens are at the terminus of a metabolic web, in which the reactions of myriads of other microbes produce a very limited range of compounds - mainly acetate, hydrogen, and formate - on which the methanogens grow and from which they form methane. "Interspecies hydrogen-transfer" and "interspecies formate-transfer" are major mechanisms by which methanogens obtain their substrates and by which volatile fatty acids are degraded. Present understanding of these reactions and other complex interactions among the bacteria involved in anaerobic digestion is only now to the point where anaerobic digesters need no longer be treated as black boxes.

  5. Ambiguity in sentence processing.

    Science.gov (United States)

    Altmann, G T

    1998-04-01

    As listeners and readers, we rarely notice the ambiguities that pervade our everyday language. When we hear the proverb 'Time flies like an arrow' we might ponder its meaning, but not the fact that there are almost 100 grammatically permissible interpretations of this short sentence. On occasion, however, we do notice sentential ambiguity: headlines, such as 'Two Sisters Reunited After 18 Years in Checkout Counter', are amusing because they so consistently lead to the unintended interpretation (presumably, the sisters did not spend 18 years at the checkout). It is this consistent preference for one interpretation - and one grammatical structure - rather than another that has fuelled research into sentence processing for more than 20 years. Until relatively recently, the dominant belief had been that these preferences arise from general principles that underlie our use of grammar, with certain grammatical constructions being preferred over others. There has now accrued, however, a considerable body of evidence demonstrating that these preferences are not absolute, but can change in particular circumstances. With this evidence have come new theories of sentence processing, some of which, at first glance, radically question the standard notions of linguistic representation, grammar and understanding.

  6. Multidimensional diffusion processes

    CERN Document Server

    Stroock, Daniel W

    1997-01-01

    From the reviews: "… Both the Markov-process approach and the Itô approach … have been immensely successful in diffusion theory. The Stroock-Varadhan book, developed from the historic 1969 papers by its authors, presents the martingale-problem approach as a more powerful - and, in certain regards, more intrinsic - means of studying the foundations of the subject. […] … the authors make the uncompromising decision not "to proselytise by intimidating the reader with myriad examples demonstrating the full scope of the techniques", but rather to persuade the reader "with a careful treatment of just one problem to which they apply". […] Most of the main tools of stochastic-processes theory are used, … but it is the formidable combination of probability theory with analysis … which is the core of the work. […] I have emphasized the great importance of the Stroock-Varadhan book. It contains a lot more than I have indicated; in particular, its many exercises contain much interesting material. For immediat...

  7. Mindfulness and psychological process.

    Science.gov (United States)

    Williams, J Mark G

    2010-02-01

    The author reviews the articles in the Special Section on Mindfulness, starting from the assumption that emotions evolved as signaling systems that need to be sensitive to environmental contingencies. Failure to switch off emotion is due to the activation of mental representations of present, past, and future that are created independently of external contingencies. Mindfulness training can be seen as one way to teach people to discriminate such "simulations" from objects and contingencies as they actually are. The articles in this Special Section show how even brief laboratory training can have effects on processing affective stimuli; that long-term meditation practitioners show distinct reactions to pain; that longer meditation training is associated with differences in brain structure; that 8 weeks' mindfulness practice brings about changes in the way emotion is processed, showing that participants can learn to uncouple the sensory, directly experienced self from the "narrative" self; that mindfulness training can affect working memory capacity, and enhance the ability of participants to talk about past crises in a way that enables them to remain specific and yet not be overwhelmed. The implications of these findings for understanding emotion and for further research are discussed.

  8. Turbulence and Stochastic Processes

    Science.gov (United States)

    Celani, Antonio; Mazzino, Andrea; Pumir, Alain

    In 1931 the monograph Analytical Methods in Probability Theory appeared, in which A.N. Kolmogorov laid the foundations for the modern theory of Markov processes [1]. According to Gnedenko: "In the history of probability theory it is difficult to find other works that changed the established points of view and basic trends in research work in such a decisive way". Ten years later, his article on fully developed turbulence provided the framework within which most, if not all, of the subsequent theoretical investigations have been conducted [2] (see e.g. the review by Biferale et al. in this volume [3]). Remarkably, the greatest advances made in the last few years towards a thorough understanding of turbulence developed from the successful marriage between the theory of stochastic processes and the phenomenology of turbulent transport of scalar fields. In this article we will summarize these recent developments which expose the direct link between the intermittency of transported fields and the statistical properties of particle trajectories advected by the turbulent flow (see also [4], and, for a more thorough review, [5]). We also discuss the perspectives of the Lagrangian approach beyond passive scalars, especially for the modeling of hydrodynamic turbulence.

  9. Natural gas conversion process

    Energy Technology Data Exchange (ETDEWEB)

    1992-01-01

    The experimental apparatus was dismantled and transferred to a laboratory space provided by Lawrence Berkeley Laboratory (LBL) which is already equipped with a high-ventilation fume hood. This will enable us to make tests at higher gas flow rates in a safe environment. Three papers presented at the ACS meeting in San Francisco (Symposium on Natural Gas Upgrading II) April 5-10, 1992 show that the goal of direct catalytic conversion of methane into heavier hydrocarbons in a reducing atmosphere is actively pursued in three other laboratories. There are similarities in their general concept with our own approach, but the temperature range of the experiments reported in these recent papers is much lower and this leads to uneconomic conversion rates. This illustrates the advantages of methane activation by a hydrogen plasma to reach commercial conversion rates. A preliminary process flow diagram was established for the Integrated Process, which was outlined in the previous Quarterly Report. The flow diagram also includes all the required auxiliary facilities for product separation and recycle of the unconverted feed as well as for the preparation and compression of the syngas by-product.

  10. Grants Process Overview

    Science.gov (United States)

    This infographic shows the steps in the National Institutes of Health and National Cancer Institute Grants Process. The graphic shows which steps are done by the Principal Investigator, the Grantee Institution, and NIH. The process is represented by a circular flow of steps. Starting from the top and reading clockwise: The Principal Investigator “Initiates Research Idea and Prepares Application” The Grantee Institution “Submits Application” NIH “NIH Center For Scientific Review, Assigns To NCI And To Study Section” NIH “Scientific Review Group (NCI OR CSR) Evaluates for Scientific Merit” NIH “National Cancer Advisory Board Recommends Action” NIH “NCI Evaluates Program Relevance And Need” NIH “NCI Makes Funding Selections And Issues Grant Awards” NIH “NCI Monitors Programmatic and Business Management Performance of the Grant” The Grantee Institution “Manages Funds” The Principal Investigator “Conducts Research” Source: www.cancer.gov Icons made by Freepik from http://www.flaticon.com are licensed under CC BY 3.0.

  11. Poultry Slaughtering and Processing Facilities

    Data.gov (United States)

    Department of Homeland Security — Agriculture Production Poultry Slaughtering and Processing in the United States This dataset consists of facilities which engage in slaughtering, processing, and/or...

  12. Beryllium Manufacturing Processes

    Energy Technology Data Exchange (ETDEWEB)

    Goldberg, A

    2006-06-30

    This report is one of a number of reports that will be combined into a handbook on beryllium. Each report covers a specific topic. To date, the following reports have been published: (1) Consolidation and Grades of Beryllium; (2) Mechanical Properties of Beryllium and the Factors Affecting these Properties; (3) Corrosion and Corrosion Protection of Beryllium; (4) Joining of Beryllium; (5) Atomic, Crystal, Elastic, Thermal, Nuclear, and other Properties of Beryllium; and (6) Beryllium Coating (Deposition) Processes and the Influence of Processing Parameters on Properties and Microstructure. The conventional method of using ingot-cast material is unsuitable for manufacturing a beryllium product. Beryllium is a highly reactive metal with a high melting point, making it susceptible to react with mold-wall materials forming beryllium compounds (BeO, etc.) that become entrapped in the solidified metal. In addition, the grain size is excessively large, being 50 to 100 µm in diameter, while grain sizes of 15 µm or less are required to meet acceptable strength and ductility requirements. Attempts at refining the as-cast grain size have been unsuccessful. Because of the large grain size and limited slip systems, the casting will invariably crack during a hot-working step, which is an important step in the microstructural-refining process. The high reactivity of beryllium together with its high viscosity (even with substantial superheat) also makes it an unsuitable candidate for precision casting. In order to overcome these problems, alternative methods have been developed for the manufacturing of beryllium. The vast majority of these methods involve the use of beryllium powders. The powders are consolidated under pressure in vacuum at an elevated temperature to produce vacuum hot-pressed (VHP) blocks and vacuum hot-isostatic-pressed (HIP) forms and billets. The blocks (typically cylindrical), which are produced over a wide range of sizes (up to 183 cm dia. by 61

  13. Fundamentals of process intensification: A process systems engineering view

    DEFF Research Database (Denmark)

    Babi, Deenesh Kavi; Sales Cruz, Alfonso Mauricio; Gani, Rafiqul

    2016-01-01

    at different scales of size, that is, the unit operation scale, the task scale, and the phenomena scale. The roles of process intensification with respect to process improvements and the generation of more sustainable process designs are discussed and questions related to when to apply process intensification...

  14. Brownian semi-stationary processes, turbulence and smooth processes

    DEFF Research Database (Denmark)

    Urbina, José Ulises Márquez

    process a bounded variation process with differentiable paths. It is natural to inquire if it is possible to obtain an asymptotic theory for this class of BSS processes. This problem is investigated and some partial results are presented. The asymptotic theory for BSS processes naturally leads...

  15. Particle processing technology

    Science.gov (United States)

    Yoshio, Sakka

    2014-02-01

    In recent years, there has been strong demand for the development of novel devices and equipment that support advanced industries including IT/semiconductors, the environment, energy and aerospace along with the achievement of higher efficiency and reduced environmental impact. Many studies have been conducted on the fabrication of innovative inorganic materials with novel individual properties and/or multifunctional properties including electrical, dielectric, thermal, optical, chemical and mechanical properties through the development of particle processing. The fundamental technologies that are key to realizing such materials are (i) the synthesis of nanoparticles with uniform composition and controlled crystallite size, (ii) the arrangement/assembly and controlled dispersion of nanoparticles with controlled particle size, (iii) the precise structural control at all levels from micrometer to nanometer order and (iv) the nanostructural design based on theoretical/experimental studies of the correlation between the local structure and the functions of interest. In particular, it is now understood that the application of an external stimulus, such as magnetic energy, electrical energy and/or stress, to a reaction field is effective in realizing advanced particle processing [1-3]. This special issue comprises 12 papers including three review papers. Among them, seven papers are concerned with phosphor particles, such as silicon, metals, Si3N4-related nitrides, rare-earth oxides, garnet oxides, rare-earth sulfur oxides and rare-earth hydroxides. In these papers, the effects of particle size, morphology, dispersion, surface states, dopant concentration and other factors on the optical properties of phosphor particles and their applications are discussed. These nanoparticles are classified as zero-dimensional materials. Carbon nanotubes (CNT) and graphene are well-known one-dimensional (1D) and two-dimensional (2D) materials, respectively. This special issue also

  16. Business process modeling for processing classified documents using RFID technology

    Directory of Open Access Journals (Sweden)

    Koszela Jarosław

    2016-01-01

    The article outlines the application of the processing approach to the functional description of the designed IT system supporting the operations of the secret office, which processes classified documents. The article describes the application of the method of incremental modeling of business processes according to the BPMN model to the description of the processes currently implemented (“as is”) in a manual manner and the target processes (“to be”), using RFID technology for the purpose of their automation. Additionally, examples of applying the method of structural and dynamic analysis of the processes (process simulation) to verify their correctness and efficiency were presented. The extension of the process analysis method is the possibility of applying the warehouse of processes and process mining methods.
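
    As a rough illustration of the kind of "as is" versus "to be" comparison that such a process simulation supports, the following Python sketch totals the cycle time of a manual and an RFID-automated document-handling variant. The task names and durations are invented for illustration and are not taken from the article.

        # Hypothetical "as is" (manual) and "to be" (RFID) process variants,
        # each a list of (task, duration in minutes); durations are invented.
        as_is = [("register document", 5.0), ("log handover", 3.0), ("archive", 4.0)]
        to_be = [("RFID auto-registration", 0.5), ("RFID gate handover", 0.2), ("archive", 4.0)]

        def cycle_time(process):
            """Total processing time per document for one process variant."""
            return sum(minutes for _, minutes in process)

        print(f"as-is: {cycle_time(as_is):.1f} min/document")
        print(f"to-be: {cycle_time(to_be):.1f} min/document")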

  17. Process dissociation, single-process theories, and recognition memory.

    Science.gov (United States)

    Ratcliff, R; Van Zandt, T; McKoon, G

    1995-12-01

    According to the assumptions of L. L. Jacoby's (1991) process dissociation method, performance in recognition memory is determined by the combination of an unconscious familiarity process and a conscious intentional recollection process. The process dissociation method is used to produce estimates of the contributions of the 2 components to recognition performance. This article investigates whether the method provides the correct estimates of components if performance actually depends on only a single process or on 2 processes different from those assumed by the method. The SAM model (G. Gillund & R. M. Shiffrin, 1984) was used to produce simulated data based on a single process. Variants of SAM with 2 processes and R. C. Atkinson and J. F. Juola's (1973) 2-process model were used to produce data based on 2 processes.
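
    The estimates referred to above are usually obtained from inclusion and exclusion test conditions via the standard process-dissociation equations P(inclusion) = R + (1 - R)F and P(exclusion) = (1 - R)F. The following Python sketch solves these for R and F; the input probabilities are hypothetical values, not data from the article.

        # Standard process-dissociation estimates of recollection (R) and
        # familiarity (F) from inclusion/exclusion response probabilities.
        def process_dissociation(p_inclusion, p_exclusion):
            r = p_inclusion - p_exclusion                 # R = P(inc) - P(exc)
            f = p_exclusion / (1.0 - r) if r < 1.0 else float("nan")
            return r, f

        # Hypothetical illustration values.
        R, F = process_dissociation(p_inclusion=0.75, p_exclusion=0.30)
        print(f"recollection R = {R:.2f}, familiarity F = {F:.2f}")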

  18. Titan's global geologic processes

    Science.gov (United States)

    Malaska, Michael; Lopes, Rosaly M. C.; Schoenfeld, Ashley; Birch, Samuel; Hayes, Alexander; Williams, David A.; Solomonidou, Anezina; Janssen, Michael A.; Le Gall, Alice; Soderblom, Jason M.; Neish, Catherine; Turtle, Elizabeth P.; Cassini RADAR Team

    2016-10-01

    We have mapped the Cassini SAR imaged areas of Saturn's moon Titan in order to determine the geological properties that modify the surface [1]. We used the SAR dataset for mapping, but incorporated data from radiometry, VIMS, ISS, and SARTopo for terrain unit determination. This work extends our analyses of the mid-latitude/equatorial Afekan Crater region [2] and of the southern and northern polar regions [3]. We placed Titan terrains into six broad terrain classes: craters, mountain/hummocky, labyrinth, plains, dunes, and lakes. We also extended the fluvial mapping done by Burr et al. [4], and defined areas as potential cryovolcanic features [5]. We found that hummocky/mountainous and labyrinth areas are the oldest units on Titan, and that lakes and dunes are among the youngest. Plains units are the largest unit in terms of surface area, followed by the dunes unit. Radiometry data suggest that most of Titan's surface is covered in high-emissivity materials, consistent with organic materials, with only minor exposures of low-emissivity materials that are consistent with water ice, primarily in the mountain and hummocky areas and crater rims and ejecta [6, 7]. From examination of terrain orientation, we find that landscape evolution in the mid-latitude and equatorial regions is driven by aeolian processes, while polar landscapes are shaped by fluvial, lacustrine, and possibly dissolution or volatilization processes involving cycling organic materials [3, 8]. Although important in deciphering Titan's terrain evolution, impact processes play a very minor role in the modification of Titan's landscape [9]. We find no evidence for large-scale aqueous cryovolcanic deposits. References: [1] Lopes, R.M.C. et al. (2010) Icarus, 205, 540–558. [2] Malaska, M.J. et al. (2016) Icarus, 270, 130–161. [3] Birch et al., in revision. [4] Burr et al. (2013) GSA Bulletin 125, 299–321. [5] Lopes et al. JGR: Planets, 118, 1–20. [6] Janssen et al., (2009) Icarus, 200, 222–239. [7

  19. Sensors in Spray Processes

    Science.gov (United States)

    Fauchais, P.; Vardelle, M.

    2010-06-01

    This paper presents our current knowledge of the sensors used in the harsh environment of spray booths to improve the reproducibility and reliability of coatings sprayed with hot or cold gases. First are described, with their limitations and precisions, the different sensors following the in-flight hot particle parameters (trajectories, temperatures, velocities, sizes, and shapes). A few comments are also made about techniques, still under development in laboratories, to improve our understanding of coating formation such as plasma jet temperature measurements in non-symmetrical conditions, hot gases heat flux, particles flattening and splats formation, particles evaporation. Then are described the illumination techniques by laser flash of either cold particles (those injected in hot gases, or in cold spray gun) or liquid injected into hot gases (suspensions or solutions). The possibilities they open to determine the flux and velocities of cold particles or visualize liquid penetration in the core of hot gases are discussed. Afterwards are presented sensors to follow, when spraying hot particles, substrate and coating temperature evolution, and the stress development within coatings during the spray process as well as the coating thickness. The different uses of these sensors are then described, successively: (i) Measurements limited to particle trajectories, velocities, temperatures, and sizes in different spray conditions: plasma (including transient conditions due to arc root fluctuations in d.c. plasma jets), HVOF, wire arc, cold spray. Afterwards it is discussed how such sensor data can be used to achieve a better understanding of the different spray processes, compare experiments to calculations and improve the reproducibility and reliability of the spray conditions. (ii) Coatings monitoring through in-flight measurements coupled with those devoted to coatings formation. This is achieved by either maintaining at their set point both in-flight and

  20. Process technology implications of procurement process: some initial observations

    OpenAIRE

    Ellmer, E.; Emmerich, W.; Finkelstein, A

    1998-01-01

    We report on a study of procurement processes in a large organization. The purpose of the study was to identify problems in the organization’s procurement processes and to suggest improvement actions. Procurement processes determine the characteristics of software processes. Procurement processes are themselves complex and amenable to process technology. Cost and scheduling benefits can be realised if procurement and contracting organizations integrate their respective processes...

  1. Learning Determinantal Point Processes

    CERN Document Server

    Kulesza, Alex

    2012-01-01

    Determinantal point processes (DPPs), which arise in random matrix theory and quantum physics, are natural models for subset selection problems where diversity is preferred. Among many remarkable properties, DPPs offer tractable algorithms for exact inference, including computing marginal probabilities and sampling; however, an important open question has been how to learn a DPP from labeled training data. In this paper we propose a natural feature-based parameterization of conditional DPPs, and show how it leads to a convex and efficient learning formulation. We analyze the relationship between our model and binary Markov random fields with repulsive potentials, which are qualitatively similar but computationally intractable. Finally, we apply our approach to the task of extractive summarization, where the goal is to choose a small subset of sentences conveying the most important information from a set of documents. In this task there is a fundamental tradeoff between sentences that are highly relevant to th...
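
    A property that makes the exact inference mentioned above tractable is that subset-inclusion probabilities of a DPP are determinants of submatrices of its marginal kernel, P(A ⊆ Y) = det(K_A). The NumPy sketch below builds a marginal kernel from a small, made-up L-ensemble and evaluates two such marginals; the feature matrix is purely illustrative.

        import numpy as np

        rng = np.random.default_rng(0)
        Phi = rng.normal(size=(5, 3))            # 5 items, 3 features (hypothetical)
        L = Phi @ Phi.T                          # L-ensemble (similarity) kernel
        K = L @ np.linalg.inv(L + np.eye(5))     # marginal kernel K = L (L + I)^-1

        def inclusion_prob(K, subset):
            """P(subset of items is included) = det of K restricted to the subset."""
            return float(np.linalg.det(K[np.ix_(subset, subset)]))

        print(inclusion_prob(K, [0]))            # marginal probability of item 0
        print(inclusion_prob(K, [0, 1]))         # joint inclusion of items 0 and 1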

  2. Signal processing unit

    Energy Technology Data Exchange (ETDEWEB)

    Boswell, J.

    1983-01-01

    The architecture of the signal processing unit (SPU) comprises a ROM connected to a program bus, and an input-output bus connected to a data bus and register through a pipeline multiplier accumulator (pmac) and a pipeline arithmetic logic unit (palu), each associated with a random access memory (ram1,2). The system clock frequency is 20 MHz. The pmac is further detailed, and has a capability of 20 mega operations per second. There is also a block diagram for the palu, showing interconnections between the register block (rbl), separator for bus (bs), register (reg), shifter (sh) and combination unit. The first and second rams have formats 64*16 and 32*32 bits, respectively. Further data are a 5-V power supply and 2.5 micron n-channel silicon-gate MOS technology with about 50000 transistors.
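
    As a rough illustration of the multiply-accumulate operation that such a pipelined unit performs once per clock cycle, the Python sketch below expresses a small FIR filter as repeated MACs; the coefficients and samples are invented and no attempt is made to model the pipeline itself.

        def fir_mac(samples, coeffs):
            """FIR filtering written as explicit multiply-accumulate steps."""
            out = []
            for n in range(len(samples)):
                acc = 0.0
                for k, c in enumerate(coeffs):
                    if n - k >= 0:
                        acc += c * samples[n - k]   # one MAC per filter tap
                out.append(acc)
            return out

        print(fir_mac([1, 2, 3, 4], [0.5, 0.25]))   # [0.5, 1.25, 2.0, 2.75]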

  3. Forward Osmosis Process

    KAUST Repository

    Duan, Jintang

    2013-12-05

    A process that can alleviate the internal concentration polarization and can enhance membrane performance of a forward osmosis system includes the steps of passing a fluid in a forward osmosis system from a feed solution with a first osmotic pressure, through a membrane into a draw solution comprising a draw solute with a second osmotic pressure, where the first osmotic pressure is lower than the second osmotic pressure, the membrane includes an active layer and a support layer, and the membrane is oriented such that the active layer of the membrane faces a draw side, and the support layer faces a feed side; and applying an external force to the fluid on the feed side of the membrane.
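
    A back-of-the-envelope view of why the draw solution must have the higher osmotic pressure is the commonly used idealized water-flux relation Jw = A(πdraw − πfeed), which neglects concentration polarization. The Python sketch below evaluates it for hypothetical values; the permeability and osmotic pressures are assumptions, not figures from this patent.

        def water_flux(A_lmh_per_bar, pi_draw_bar, pi_feed_bar):
            """Idealized FO water flux (L m^-2 h^-1), neglecting polarization effects."""
            return A_lmh_per_bar * (pi_draw_bar - pi_feed_bar)

        # Hypothetical membrane permeability and osmotic pressures.
        print(water_flux(A_lmh_per_bar=1.0, pi_draw_bar=28.0, pi_feed_bar=2.8))  # ~25.2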

  4. Catalyzing alignment processes

    DEFF Research Database (Denmark)

    Lauridsen, Erik Hagelskjær; Jørgensen, Ulrik

    2004-01-01

    This paper describes how environmental management systems (EMS) spur the circulation of processes that support the constitution of environmental issues as specific environmental objects and objectives. EMS catalyzes alignment processes that produce coherence among the different elements involved...., the networks of environmental professionals that work in the environmental organisation, in consulting and regulatory enforcement, and dominating business cultures. These have previously been identified in the literature as individually significant in relation to the evolving environmental agendas.... They are here used to describe the context in which environmental management is implemented. Based on findings from contributions to a research program studying the implementation and impact of EMS in different settings, we highlight the diverse roles that these systems play in the Thai context. EMS may over...

  5. Plutonium dissolution process

    Science.gov (United States)

    Vest, M.A.; Fink, S.D.; Karraker, D.G.; Moore, E.N.; Holcomb, H.P.

    1994-01-01

    A two-step process for dissolving Pu metal is disclosed in which the two steps can be carried out sequentially or simultaneously. Pu metal is exposed to a first mixture of 1.0-1.67 M sulfamic acid and 0.0025-0.1 M fluoride, the mixture having been heated to 45-70 °C. The mixture will dissolve a first portion of the Pu metal but leave a portion of the Pu in an oxide residue. Then, a mineral acid and additional fluoride are added to dissolve the residue. Alternatively, nitric acid between 0.05 and 0.067 M is added to the first mixture to dissolve the residue as it is produced. Hydrogen released during the dissolution is diluted with nitrogen.

  6. Additive Gaussian Processes

    CERN Document Server

    Duvenaud, David; Rasmussen, Carl Edward

    2011-01-01

    We introduce a Gaussian process model of functions which are additive. An additive function is one which decomposes into a sum of low-dimensional functions, each depending on only a subset of the input variables. Additive GPs generalize both Generalized Additive Models, and the standard GP models which use squared-exponential kernels. Hyperparameter learning in this model can be seen as Bayesian Hierarchical Kernel Learning (HKL). We introduce an expressive but tractable parameterization of the kernel function, which allows efficient evaluation of all input interaction terms, whose number is exponential in the input dimension. The additional structure discoverable by this model results in increased interpretability, as well as state-of-the-art predictive power in regression tasks.
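
    A first-order additive kernel of the kind described above is simply a sum of one-dimensional squared-exponential kernels, one per input dimension. The NumPy sketch below builds such a kernel matrix; the lengthscales and data are hypothetical, and higher-order interaction terms are omitted.

        import numpy as np

        def se_1d(x, y, lengthscale):
            """One-dimensional squared-exponential kernel."""
            return np.exp(-0.5 * ((x - y) / lengthscale) ** 2)

        def additive_kernel(X, Y, lengthscales):
            """First-order additive kernel: K[i, j] = sum_d k_d(X[i, d], Y[j, d])."""
            K = np.zeros((X.shape[0], Y.shape[0]))
            for d, ell in enumerate(lengthscales):
                K += se_1d(X[:, d:d + 1], Y[:, d][None, :], ell)
            return K

        X = np.random.default_rng(1).normal(size=(4, 3))
        K = additive_kernel(X, X, lengthscales=[1.0, 0.5, 2.0])
        print(K.shape, np.allclose(K, K.T))   # (4, 4) True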

  7. The Integrated Renovation Process

    DEFF Research Database (Denmark)

    Galiotto, Nicolas; Heiselberg, Per; Knudstrup, Mary-Ann

    The Integrated Renovation Process (IRP) is a user-customized methodology based on judiciously selected constructivist and interactive multi-criteria decision making methods (Galiotto, Heiselberg, & Knudstrup, 2014 (expected)). The IRP supports, informs and reassures building owners to decide...... they get more time for the cost optimization and the qualitative analysis of the users’ needs and behaviours. In order to reach a fossil free energy building stock within an acceptable time frame, it is essential that researchers, politicians and the building industry work hand in hand. Indeed, in order...... to overcome the financial barriers to energy renovation and bring a new type of building expert into the building renovation sector, cost optimization tools for building renovation have been and can be developed but new legislation and politico-economic supports are still much needed. We present in this report...

  8. PROCESSING OF MONAZITE SAND

    Science.gov (United States)

    Calkins, G.D.; Bohlmann, E.G.

    1957-12-01

    A process for the recovery of thorium, uranium, and rare earths from monazite sands is presented. The sands are first digested and dissolved in concentrated NaOH, and the solution is then diluted causing precipitation of uranium, thorium and rare earth hydroxides. The precipitate is collected and dissolved in HCl, and the pH of this solution is adjusted to about 6, precipitating the hydroxides of thorium and uranium but leaving the rare earths in solution. The rare earths are then separated from the solution by precipitation at a still higher pH. The thorium- and uranium-containing precipitate is redissolved in HNO3 and the two elements are separated by extraction into tributyl phosphate and back extraction with a weakly acidic solution to remove the thorium.

  9. Fractal Poisson processes

    Science.gov (United States)

    Eliazar, Iddo; Klafter, Joseph

    2008-09-01

    The Central Limit Theorem (CLT) and Extreme Value Theory (EVT) study, respectively, the stochastic limit-laws of sums and maxima of sequences of independent and identically distributed (i.i.d.) random variables via an affine scaling scheme. In this research we study the stochastic limit-laws of populations of i.i.d. random variables via nonlinear scaling schemes. The stochastic population-limits obtained are fractal Poisson processes which are statistically self-similar with respect to the scaling scheme applied, and which are characterized by two elemental structures: (i) a universal power-law structure common to all limits, and independent of the scaling scheme applied; (ii) a specific structure contingent on the scaling scheme applied. The sum-projection and the maximum-projection of the population-limits obtained are generalizations of the classic CLT and EVT results - extending them from affine to general nonlinear scaling schemes.

  10. Paretian Poisson Processes

    Science.gov (United States)

    Eliazar, Iddo; Klafter, Joseph

    2008-05-01

    Many random populations can be modeled as a countable set of points scattered randomly on the positive half-line. The points may represent magnitudes of earthquakes and tornados, masses of stars, market values of public companies, etc. In this article we explore a specific class of such random populations, which we coin ‘Paretian Poisson processes’. This class is elemental in statistical physics—connecting together, in a deep and fundamental way, diverse issues including: the Poisson distribution of the Law of Small Numbers; Paretian tail statistics; the Fréchet distribution of Extreme Value Theory; the one-sided Lévy distribution of the Central Limit Theorem; scale-invariance, renormalization and fractality; resilience to random perturbations.
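
    As an illustrative sketch (the exact intensity used by the authors may differ), a point population with a Paretian, power-law intensity of the assumed form λ(x) = aαx^(−α−1) above a threshold x0 contains a Poisson number of points, each distributed as an independent Pareto variable. The Python code below samples such a population; all parameter values are assumptions.

        import numpy as np

        def sample_paretian_poisson(a, alpha, x0, rng):
            """Sample points of a Poisson process with intensity a*alpha*x**-(alpha+1) on x > x0."""
            mean_count = a * x0 ** (-alpha)         # expected number of points above x0
            n = rng.poisson(mean_count)
            u = rng.uniform(size=n)
            return x0 * u ** (-1.0 / alpha)         # i.i.d. Pareto(alpha) points

        rng = np.random.default_rng(42)
        points = sample_paretian_poisson(a=10.0, alpha=1.5, x0=1.0, rng=rng)
        print(len(points), np.sort(points)[-3:])    # count and a few of the largest points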

  11. Supplier Evaluation Processes

    DEFF Research Database (Denmark)

    Hald, Kim Sundtoft; Ellegaard, Chris

    2011-01-01

    Purpose – The purpose of this paper is to illuminate how supplier evaluation practices are linked to supplier performance improvements. Specifically, the paper investigates how performance information travelling between the evaluating buyer and the evaluated suppliers is shaped and reshaped...... in the evaluation process. Design/methodology/approach – The paper relies on a multiple, longitudinal case research methodology. The two cases show two companies' efforts in designing, implementing, and using supplier evaluation in order to improve supplier performance. Findings – The findings show how the dynamics...... of representing, reducing, amplifying, dampening, and directing shape and reshape supplier evaluation information. In both companies, evaluation practices were defined, redefined, and re-directed by the involved actors' perception and decision making, as well as organisational structures, IT systems...

  12. Process and plant safety

    CERN Document Server

    Hauptmanns, Ulrich

    2015-01-01

    Accidents in technical installations are random events. Hence they cannot be totally avoided. Only the probability of their occurrence may be reduced and their consequences be mitigated. The book proceeds from hazards caused by materials and process conditions to indicating technical and organizational measures for achieving the objectives of reduction and mitigation. Qualitative methods for identifying weaknesses of design and increasing safety as well as models for assessing accident consequences are presented. The quantitative assessment of the effectiveness of safety measures is explained. The treatment of uncertainties plays a role there. They stem from the random character of the accident and from a lack of knowledge of some of the phenomena to be addressed. The reader is acquainted with the simulation of accidents, safety and risk analyses and learns how to judge the potential and limitations of mathematical modelling. Risk analysis is applied amongst others to “functional safety” and the determinat...

  13. Advanced powder processing

    Energy Technology Data Exchange (ETDEWEB)

    Janney, M.A. [Oak Ridge National Lab., TN (United States)

    1997-04-01

    Gelcasting is an advanced powder forming process. It is most commonly used to form ceramic or metal powders into complex, near-net shapes. Turbine rotors, gears, nozzles, and crucibles have been successfully gelcast in silicon nitride, alumina, nickel-based superalloy, and several steels. Gelcasting can also be used to make blanks that can be green machined to near-net shape and then high fired. Green machining has been successfully applied to both ceramic and metal gelcast blanks. Recently, the authors have used gelcasting to make tooling for metal casting applications. Most of the work has centered on H13 tool steel. They have demonstrated an ability to gelcast and sinter H13 to near net shape for metal casting tooling. Also, blanks of H13 have been cast, green machined into complex shape, and fired. Issues associated with forming, binder burnout, and sintering are addressed.

  14. Welding processes handbook

    CERN Document Server

    Weman, Klas

    2011-01-01

    Offers an introduction to the range of available welding technologies. This title includes chapters on individual techniques that cover principles, equipment, consumables and key quality issues. It includes material on such topics as the basics of electricity in welding, arc physics, and distortion, and the weldability of particular metals. The first edition of Welding processes handbook established itself as a standard introduction and guide to the main welding technologies and their applications. This new edition has been substantially revised and extended to reflect the latest developments. After an initial introduction, the book first reviews gas welding before discussing the fundamentals of arc welding, including arc physics and power sources. It then discusses the range of arc welding techniques including TIG, plasma, MIG/MAG, MMA and submerged arc welding. Further chapters cover a range of other important welding technologies such as resistance and laser welding, as well as the use of welding techniqu...

  15. Electro Processing Research

    Science.gov (United States)

    1982-01-01

    Electroprocessing, which is concerned with the fluid dynamics of the electroreduction process and with how that process may be modified to improve the quality of the deposit, was studied. Experimental techniques are used in this research. These techniques include laser Schlieren photography, laser Doppler velocimetry, and frequency spectrum analysis. Projects involve fluid flow studies of zinc plating in aqueous and molten salt electrolytes, study of cell design for magnesium chloride electrolysis, digital signal analysis of manganese electrodeposition in molten chlorides, and electroplating of molybdenum from low-melting salts. It is anticipated that the use of refractory metals as construction materials in engineering will increase. Their electrodeposition from molten salt electrolytes is important in the extraction metallurgy of refractory metals.

  16. Experiencing Historical Processes

    DEFF Research Database (Denmark)

    Marchetti, Emanuela

    2016-01-01

    are involved in guided tours: the visitors (in this case primary school children), the guides, and museum practitioners responsible for planning exhibitions. Current studies tend to focus on one user group; this means that the proposed solutions do not take into account the needs of the other groups. Instead......, which is discussed in the second paper, based on the framework of apprenticeship in thinking (Rogoff 1990) and play as a resource for conceptual thinking (Vygotsky 1978). Play is also seen as a state of mind (Apter 2007; Sutton-Smith 1997) allowing children to reconfigure the hierarchical relationship...... emerging with the guides. Moreover, as discussed in the third paper presented in this thesis, the design process takes into account children’s individual needs, regarding play and museum experience. Final evaluations with MicroCulture (fourth paper) show that digital technologies allow for compelling

  17. Continuous coal processing method

    Science.gov (United States)

    Ryason, P. R.

    1980-06-01

    A coal pump is provided in which solid coal is heated in the barrel of an extruder under pressure to a temperature at which the coal assumes plastic properties. The coal is continuously extruded, without static zones, using, for example, screw extrusion preferably without venting through a reduced diameter die to form a dispersed spray. As a result, the dispersed coal may be continuously injected into vessels or combustors at any pressure up to the maximum pressure developed in the extrusion device. The coal may be premixed with other materials such as desulfurization aids or reducible metal ores so that reactions occur, during or after conversion to its plastic state. Alternatively, the coal may be processed and caused to react after extrusion, through the die, with, for example, liquid oxidizers, whereby a coal reactor is provided.

  18. Fluorination process using catalysts

    Science.gov (United States)

    Hochel, R.C.; Saturday, K.A.

    1983-08-25

    A process is given for converting an actinide compound selected from the group consisting of uranium oxides, plutonium oxides, uranium tetrafluorides, plutonium tetrafluorides and mixtures of said oxides and tetrafluorides, to the corresponding volatile actinide hexafluoride by fluorination with a stoichiometric excess of fluorine gas. The improvement involves conducting the fluorination of the plutonium compounds in the presence of a fluoride catalyst selected from the group consisting of CoF3, AgF2 and NiF2, whereby the fluorination is significantly enhanced. The improvement also involves conducting the fluorination of one of the uranium compounds in the presence of a fluoride catalyst selected from the group consisting of CoF3 and AgF2, whereby the fluorination is significantly enhanced.

  19. Fluorination process using catalyst

    Science.gov (United States)

    Hochel, Robert C.; Saturday, Kathy A.

    1985-01-01

    A process for converting an actinide compound selected from the group consisting of uranium oxides, plutonium oxides, uranium tetrafluorides, plutonium tetrafluorides and mixtures of said oxides and tetrafluorides, to the corresponding volatile actinide hexafluoride by fluorination with a stoichiometric excess of fluorine gas. The improvement involves conducting the fluorination of the plutonium compounds in the presence of a fluoride catalyst selected from the group consisting of CoF3, AgF2 and NiF2, whereby the fluorination is significantly enhanced. The improvement also involves conducting the fluorination of one of the uranium compounds in the presence of a fluoride catalyst selected from the group consisting of CoF3 and AgF2, whereby the fluorination is significantly enhanced.

  20. Catalyst Alloys Processing

    Science.gov (United States)

    Tan, Xincai

    2014-10-01

    Catalysts are one of the key materials used for diamond formation at high pressures. Several such catalyst products have been developed and applied in China and around the world. The catalyst alloy most widely used in China is Ni70Mn25Co5 developed at Changsha Research Institute of Mining and Metallurgy. In this article, detailed techniques for manufacturing such a typical catalyst alloy will be reviewed. The characteristics of the alloy will be described. Detailed processing of the alloy will be presented, including remelting and casting, hot rolling, annealing, surface treatment, cold rolling, blanking, finishing, packaging, and waste treatment. An example use of the catalyst alloy will also be given. Industrial experience shows that for the catalyst alloy products, a vacuum induction remelt furnace can be used for remelting, a metal mold can be used for casting, hot and cold rolling can be used for forming, and acid pickling can be used for metal surface cleaning.

  1. Evaluating Discourse Processing Algorithms

    CERN Document Server

    Walker, M A

    1994-01-01

    In order to take steps towards establishing a methodology for evaluating Natural Language systems, we conducted a case study. We attempt to evaluate two different approaches to anaphoric processing in discourse by comparing the accuracy and coverage of two published algorithms for finding the co-specifiers of pronouns in naturally occurring texts and dialogues. We present the quantitative results of hand-simulating these algorithms, but this analysis naturally gives rise to both a qualitative evaluation and recommendations for performing such evaluations in general. We illustrate the general difficulties encountered with quantitative evaluation. These are problems with: (a) allowing for underlying assumptions, (b) determining how to handle underspecifications, and (c) evaluating the contribution of false positives and error chaining.
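
    The accuracy/coverage comparison described above can be made concrete with a small scoring function: coverage is the fraction of pronouns for which an algorithm proposes a co-specifier at all, and accuracy is the fraction of those proposals that match the hand-annotated answer. The Python sketch below uses invented data, not the texts or algorithms from the paper.

        def evaluate(gold, predicted):
            """Return (coverage, accuracy) of predicted co-specifiers against gold ones."""
            attempted = [p for p, ante in predicted.items() if ante is not None]
            correct = [p for p in attempted if predicted[p] == gold.get(p)]
            coverage = len(attempted) / len(gold) if gold else 0.0
            accuracy = len(correct) / len(attempted) if attempted else 0.0
            return coverage, accuracy

        # Hypothetical pronoun -> antecedent mappings.
        gold = {"it_1": "the report", "she_2": "Mary", "they_3": "the committee"}
        predicted = {"it_1": "the report", "she_2": "Ann", "they_3": None}
        print(evaluate(gold, predicted))   # roughly (0.67, 0.5)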

  2. Entrepreneurship and Process Studies

    DEFF Research Database (Denmark)

    Hjorth, Daniel; Holt, Robin; Steyaert, Chris

    2015-01-01

    Process studies put movement, change and flow first; to study processually is to consider the world as restless, something underway, becoming and perishing, without end. To understand firms processually is to accept but also – and this is harder perhaps – to absorb this fluidity, to treat...... a variable as just that, a variable. The resonance with entrepreneurship studies is obvious. If any field is alive to, and fully resonant with, a processual understanding of, for example, the creation of firms, it is entrepreneurship studies. This special issue is an attempt to consider the promise...... and potential of processual approaches to studying, researching and practising entrepreneurship. The articles in the issue attest to an increasing sensitivity to processual thinking. We argue that appreciating entrepreneurial phenomena processually opens up the field to an understanding of entrepreneurship...

  3. Perception and information processing

    DEFF Research Database (Denmark)

    Scholderer, Joachim

    2010-01-01

    : as consumers, we can only respond to a stimulus if our senses are actually stimulated by it. Psychologically speaking, a stimulus only exists for us once we have formed an internal representation of it. The objective of this chapter is to introduce the systems that are involved in this processing of perceptual...... information and to characterise the operations they perform. To avoid confusion, it should be stressed that the term "perception" is often used in a colloquial sense in consumer research. In concepts like perceived quality, perceived value, or perceived risk, the modifier "perceived" simply highlights...... ("psychophysics") can be considered the birth of experimental psychology. Today, most perception research is carried out in the interdisciplinary field of cognitive neuroscience. Only selected issues have made their way into consumer research. After a short general introduction, we will therefore focus...

  4. COMMUNICATION PROCESSES AND STRATEGIES

    Institute of Scientific and Technical Information of China (English)

    1996-01-01

    Introduction: The main goal of studying a foreign language is to be able to communicate. The essence of communication is sending and receiving messages and negotiating meaning. During the communication process, learners may meet problems which hinder their understanding. In order to overcome these limitations, it is very important to know and use certain strategies involved in the communication process. There are three basic activities in the communication process: expressing intentions, interpretation and negotiation. Expressing intentions is giving information. During communication, every speaker has to first send his or her messages and the listener must decode what he or she has heard. This activity may be called interpretation. During conversation, both listener and speaker must do some negotiation in order to make sure that they understand each other. Negotiation could be called communication exchange.

  5. Business Process Outsourcing

    Directory of Open Access Journals (Sweden)

    Doina FOTACHE

    2006-01-01

    Business Process Outsourcing (BPO) is gaining widespread acceptance throughout the US, Europe, South America and Asia Pacific as the top executives of leading multinationals turn to outsourcing as a strategic management tool for improving corporate performance, profitability and shareholder value. BPO started to emerge a few years ago as a follow-on to IT outsourcing. The concept is not new; BPO is the contracting of a specific business task. Outsourcing focuses on adding value typically to non-core and non-complex activities by buying in best practices and economies of scale. Because it reduces costs, allows a focus on core strategic activities and improves customer service, an increasing number of organizations in both the public and the private sector are looking toward BPO as a solution to their needs.

  6. Experimental adaptive process tomography

    Science.gov (United States)

    Pogorelov, I. A.; Struchalin, G. I.; Straupe, S. S.; Radchenko, I. V.; Kravtsov, K. S.; Kulik, S. P.

    2017-01-01

    Adaptive measurements were recently shown to significantly improve the performance of quantum state tomography. Utilizing information about the system for the online choice of optimal measurements allows one to reach the ultimate bounds of precision for state reconstruction. In this article we generalize an adaptive Bayesian approach to the case of process tomography and experimentally show its superiority in the task of learning unknown quantum operations. Our experiments with photonic polarization qubits cover all types of single-qubit channels. We also discuss instrumental errors and the criteria for evaluation of the ultimate achievable precision in an experiment. It turns out that adaptive tomography provides a lower noise floor in the presence of strong technical noise.
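
    As a toy illustration of the Bayesian updating that such adaptive schemes build on (not the authors' protocol), the sketch below maintains a grid posterior over the depolarizing parameter p of a single-qubit channel probed with the |0> state and measured in the computational basis, for which P(outcome 0) = 1 - p/2; the true parameter and shot count are assumptions.

        import numpy as np

        rng = np.random.default_rng(7)
        p_true = 0.3                                      # hypothetical channel parameter
        grid = np.linspace(0.0, 1.0, 501)
        posterior = np.full_like(grid, 1.0 / grid.size)   # flat prior on p

        for _ in range(500):
            outcome0 = rng.random() < (1.0 - p_true / 2.0)    # simulate one measurement shot
            likelihood = (1.0 - grid / 2.0) if outcome0 else (grid / 2.0)
            posterior *= likelihood
            posterior /= posterior.sum()                      # renormalize after each shot

        print(f"posterior mean of p: {np.sum(grid * posterior):.3f} (true value {p_true})")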

  7. Epoxidation catalyst and process

    Science.gov (United States)

    Linic, Suljo; Christopher, Phillip

    2010-10-26

    Disclosed herein is a catalytic method of converting alkenes to epoxides. This method generally includes reacting alkenes with oxygen in the presence of a specific silver catalyst under conditions suitable to produce a yield of the epoxides. The specific silver catalyst is a silver nanocrystal having a plurality of surface planes, a substantial portion of which is defined by Miller indices of (100). The reaction is performed by charging a suitable reactor with this silver catalyst and then feeding the reactants to the reactor under conditions to carry out the reaction. The reaction may be performed in batch, or as a continuous process that employs a recycle of any unreacted alkenes. The specific silver catalyst has unexpectedly high selectivity for epoxide products. Consequently, this general method (and its various embodiments) will result in extraordinarily high epoxide yields heretofore unattainable.

  8. Innovation Processes and Closure

    DEFF Research Database (Denmark)

    Darsø, Lotte; Austin, Robert

    2009-01-01

    The article describes, discusses and illustrates a wide range of innovation process models, with a focus on how (and when) innovative teams handle bringing the process to a close. Sometimes closure is forced, for example by deadlines; at other times a concept clearly crystallizes. New types of models are also pointed out, in which the development process is kept open because the product never becomes "finished". Finally, the innovation process is compared with artistic and creative processes.

  9. Uncloaking the Scientific Process

    Science.gov (United States)

    Leitzell, K.; Meier, W.

    2009-12-01

    Since April 2008, NSIDC has offered daily updates of sea ice data on our Arctic Sea Ice News & Analysis Web page (http://nsidc.org/arcticseaicenews). The images provide near-real-time data to the general public and policy makers, accompanied by monthly or more frequent analysis updates. In February 2009, a crucial channel of the Special Sensor Microwave/Imager (SSM/I) sensor on the Defense Meteorological Satellite Program (DMSP) F15 satellite, from which NSIDC was obtaining near-real-time Arctic sea ice data, suddenly failed. The daily image, which is automatically updated, showed a sudden drop in ice extent of over 50,000 square kilometers. Even after taking the images down, skeptical blogs jumped on the event, posting headlines such as “Errors in publicly presented data - Worth blogging about?” and “NSIDC pulls the plug on sea ice data.” In fact, NSIDC data managers and scientists were well aware that the F15 satellite sensor would eventually fail. NSIDC switched to a previously used back-up sensor, F13, and work to transition to a newer sensor on the F17 satellite had been underway for several weeks. While the deluge of questions from readers and bloggers was frustrating to NSIDC communications staff and scientists, it also presented a chance to give readers a window into the scientific process, and specifically into the collection of satellite data. We decided to publish a clear account of the process used to transition between sensors, as well as a basic explanation of the satellites used to measure sea ice data. While most scientists are familiar with the limitations of near-real-time data, the concept is unfamiliar to many in the general public. The Web page includes links to information on near-real-time data, including notes that images sometimes contain missing or erroneous data, and that delays can occur. However, to a skeptical person, the words that scientists use to describe the processing of final data, including “adjustment,”

  10. Foam process models.

    Energy Technology Data Exchange (ETDEWEB)

    Moffat, Harry K.; Noble, David R.; Baer, Thomas A. (Procter & Gamble Co., West Chester, OH); Adolf, Douglas Brian; Rao, Rekha Ranjana; Mondy, Lisa Ann

    2008-09-01

    In this report, we summarize our work on developing a production level foam processing computational model suitable for predicting the self-expansion of foam in complex geometries. The model is based on a finite element representation of the equations of motion, with the movement of the free surface represented using the level set method, and has been implemented in SIERRA/ARIA. An empirically based time- and temperature-dependent density model is used to encapsulate the complex physics of foam nucleation and growth in a numerically tractable model. The change in density with time is at the heart of the foam self-expansion as it creates the motion of the foam. This continuum-level model uses an homogenized description of foam, which does not include the gas explicitly. Results from the model are compared to temperature-instrumented flow visualization experiments giving the location of the foam front as a function of time for our EFAR model system.
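
    The report's empirical density model is not reproduced here; as a stand-in, the sketch below uses an assumed exponential approach from the initial resin density to the final foam density and, by mass conservation, the corresponding expansion ratio that would drive the motion of the foam front.

        import math

        def density(t, rho0=1000.0, rho_final=100.0, tau=30.0):
            """Assumed density (kg/m^3) versus time (s): exponential approach to rho_final."""
            return rho_final + (rho0 - rho_final) * math.exp(-t / tau)

        for t in (0, 30, 60, 120):
            rho = density(t)
            print(f"t = {t:3d} s   rho = {rho:6.1f} kg/m^3   expansion = {1000.0 / rho:4.2f}x")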

  11. Conditions of possibility for transcultural processes

    DEFF Research Database (Denmark)

    Petersen, Karen Bjerg

    2012-01-01

    An increasingly neoliberal discourse, not only in international education policy over recent decades but also in Scandinavian countries such as Denmark in the 2000s, indicates significant changes in the understanding of the purpose of cultural processes such as learning and teaching. Exemplified through...... learning processes, but also appears to have changed concrete cultural learning processes in an undesirable direction. Since a neoliberal education policy is relatively new in Denmark, the aim of the article is 1. to discuss the development in the specific case and to compare it with international tendencies in, for example,...

  12. Weather Information Processing

    Science.gov (United States)

    1991-01-01

    Science Communications International (SCI), formerly General Science Corporation, has developed several commercial products based upon experience acquired as a NASA Contractor. Among them are METPRO, a meteorological data acquisition and processing system, which has been widely used, RISKPRO, an environmental assessment system, and MAPPRO, a geographic information system. METPRO software is used to collect weather data from satellites, ground-based observation systems and radio weather broadcasts to generate weather maps, enabling potential disaster areas to receive advance warning. GSC's initial work for NASA Goddard Space Flight Center resulted in METPAK, a weather satellite data analysis system. METPAK led to the commercial METPRO system. The company also provides data to other government agencies, U.S. embassies and foreign countries.

  13. Irradiation and food processing.

    Science.gov (United States)

    Sigurbjörnsson, B; Loaharanu, P

    1989-01-01

    After more than four decades of research and development, food irradiation has been demonstrated to be safe, effective and versatile as a process of food preservation, decontamination or disinfection. Its various applications cover: inhibition of sprouting of root crops; insect disinfestation of stored products, fresh and dried food; shelf-life extension of fresh fruits, vegetables, meat and fish; destruction of parasites and pathogenic micro-organisms in food of animal origin; decontamination of spices and food ingredients, etc. Such applications provide consumers with an increase in the variety, volume and value of food. Although regulations on food irradiation in different countries are largely unharmonized, national authorities have shown increasing recognition and acceptance of this technology based on the Codex Standard for Irradiated Foods and its associated Code of Practice. Harmonization of national legislations represents an important prerequisite to international trade in irradiated food. Consumers at large are still not aware of the safety and benefits that food irradiation has to offer. Thus, national and international organizations, the food industry, trade associations and consumer unions have important roles to play in introducing this technology based on its scientific values. Public acceptance of food irradiation may be slow at the beginning, but should increase at a faster rate in the foreseeable future when consumers are well informed of the safety and benefits of this technology in comparison with existing ones. Commercial applications of food irradiation have already started in 18 countries at present. The volume of food or ingredients treated on a commercial scale varies from country to country ranging from several tons of spices to hundreds of thousands of tons of grains per annum. With the increasing interest of national authorities and the food industry in applying the process, it is anticipated that some 25 countries will use some 55 commercial

  14. Time processing in dyscalculia.

    Science.gov (United States)

    Cappelletti, Marinella; Freeman, Elliot D; Butterworth, Brian L

    2011-01-01

    To test whether atypical number development may affect other types of quantity processing, we investigated temporal discrimination in adults with developmental dyscalculia (DD). This also allowed us to test whether number and time may be sub-served by a common quantity system or decision mechanisms: if they do, both should be impaired in dyscalculia, but if number and time are distinct they should dissociate. Participants judged which of two successively presented horizontal lines was longer in duration, the first line being preceded by either a small or a large number prime ("1" or "9") or by a neutral symbol ("#"), or in a third task participants decided which of two Arabic numbers (either "1," "5," "9") lasted longer. Results showed that (i) DD's temporal discriminability was normal as long as numbers were not part of the experimental design, even as task-irrelevant stimuli; however (ii) task-irrelevant numbers dramatically disrupted DD's temporal discriminability the more their salience increased, though the actual magnitude of the numbers had no effect; in contrast (iii) controls' time perception was robust to the presence of numbers but modulated by numerical quantity: therefore small number primes or numerical stimuli seemed to make durations appear shorter than veridical, but longer for larger numerical prime or numerical stimuli. This study is the first to show spared temporal discrimination - a dimension of continuous quantity - in a population with a congenital number impairment. Our data reinforce the idea of a partially shared quantity system across numerical and temporal dimensions, which supports both dissociations and interactions among dimensions; however, they suggest that impaired number in DD is unlikely to originate from systems initially dedicated to continuous quantity processing like time.

  15. Laundry process intensification by ultrasound

    NARCIS (Netherlands)

    Warmoeskerken, M.M.C.G.; Vlist, van der P.; Moholkar, V.S.; Nierstrasz, V.A.

    2002-01-01

    In domestic textile laundering processes, mass transfer and mass transport are often rate limiting. Therefore, these processes require a long processing time, large amounts of water and chemicals, and they are energy consuming. In most of these processes, diffusion and convection in the inter-yarn a

  16. Process algebra for Hybrid systems

    NARCIS (Netherlands)

    Bergstra, J.A.; Middelburg, C.A.

    2008-01-01

    We propose a process algebra obtained by extending a combination of the process algebra with continuous relative timing from Baeten and Middelburg [Process Algebra with Timing, Springer, Chap. 4, 2002] and the process algebra with propositional signals from Baeten and Bergstra [Theoretical Computer

  17. Process algebra for hybrid systems

    NARCIS (Netherlands)

    Bergstra, J.A.; Middelburg, C.A.

    2005-01-01

    We propose a process algebra obtained by extending a combination of the process algebra with continuous relative timing from Baeten and Middelburg (Process Algebra with Timing, Springer, Berlin, 2002, Chapter 4), and the process algebra with propositional signals from Baeten and Bergstra (Theoret. Com

  18. Multivariate supOU processes

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Stelzer, Robert

    2011-01-01

    Univariate superpositions of Ornstein–Uhlenbeck-type processes (OU), called supOU processes, provide a class of continuous time processes capable of exhibiting long memory behavior. This paper introduces multivariate supOU processes and gives conditions for their existence and finiteness of momen...

  19. Perspectives on Multienzyme Process Technology

    DEFF Research Database (Denmark)

    Santacoloma, Paloma A.; Woodley, John M.

    2014-01-01

    There is little doubt that chemical processing of the future will involve an increasing number of biocatalytic processes using more than one enzyme. There are good reasons for developing such innovative biocatalytic processes and interesting new biocatalyst and process options will be introduced....

  20. BUSINESS PROCESS REENGINEERING AS THE METHOD OF PROCESS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    O. Honcharova

    2013-09-01

    Full Text Available The article is devoted to the analysis of the process management approach. The main understanding of the process management approach is reviewed, and definitions of process and process management are given. The methods of business process improvement are also analyzed, among them fast-analysis solution technology (FAST), benchmarking, reprojecting and reengineering. The main results of applying business process improvement are described in terms of reductions in cycle time, costs and errors. The tasks and main stages of business process reengineering are outlined, and its main efficiency results and success factors are determined.

  1. Process correlation analysis model for process improvement identification.

    Science.gov (United States)

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment identifying strengths and weaknesses, and improvement plans are developed based on the findings. In general, a process reference model (e.g., CMMI) is used as the basis throughout the software process improvement effort. CMMI defines a set of process areas involved in software development and what is to be carried out in each process area in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort and expertise required. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. The model is defined based on CMMI and empirical data from improvement practices. We evaluate the model using industrial data.
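
    As a loose illustration of the underlying idea (not the paper's CMMI-based model), the sketch below computes pairwise correlations between hypothetical assessment ratings of a few CMMI process areas; the data and numbers are invented purely for illustration.

```python
import pandas as pd

# Hypothetical assessment ratings (1-5) for a few CMMI process areas across
# several assessed projects; values are fabricated for illustration only.
ratings = pd.DataFrame({
    "REQM": [3, 4, 2, 5, 3, 4],   # Requirements Management
    "PP":   [3, 4, 3, 5, 2, 4],   # Project Planning
    "PMC":  [2, 4, 2, 4, 2, 3],   # Project Monitoring and Control
    "CM":   [4, 3, 2, 5, 3, 3],   # Configuration Management
})

# Pairwise Pearson correlations between process areas; strongly correlated
# areas are candidates for being addressed together in an improvement plan.
corr = ratings.corr()
print(corr.round(2))
```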

  2. Process of Petri Nets Extension

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    To describe the dynamic semantics of network computing, the concept of process is presented based on a semantic model with variables, resources and relations. Accordingly, the formal definition of process and the mapping rules from the Petri nets extension specification to processes are each discussed in detail. Based on these process concepts, the specification of dynamic semantics is also constructed as a net system. Finally, to illustrate the notion of process intuitively, a complete example is specified.

  3. Process-oriented evaluation of agile business processes

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Agile enterprises are built on agile business processes. At the same time, agile enterprises must be able to use agile business processes to respond rapidly to market opportunities and maintain their competitiveness. How to evaluate and choose agile business processes is therefore a key problem in building agile enterprises. The paper proposes a goal-driven method and an evaluation architecture for business process agility. Furthermore, a four-layer configuring model for agile business processes is developed based on the evaluation architecture; it can evaluate and configure agile business processes among alternatives.

  4. Event-driven process execution model for process virtual machine

    Institute of Scientific and Technical Information of China (English)

    WU Dong-yao; WEI Jun; GAO Chu-shu; DOU Wen-shen

    2012-01-01

    Current orchestration and choreography process engines only work with dedicated process languages. To solve this problem, an Event-driven Process Execution Model (EPEM) was developed. Formalization and mapping principles of the model were presented to guarantee the correctness and efficiency of process transformation. As a case study, EPEM descriptions of the Web Services Business Process Execution Language (WS-BPEL) were represented, and a Process Virtual Machine (PVM), OncePVM, was implemented in compliance with the EPEM.

  5. Data Processing for Scientists.

    Science.gov (United States)

    Heumann, K F

    1956-10-26

    This brief survey of integrated and electronic data processing has touched on such matters as the origin of the concepts, their use in business, machines that are available, indexing problems, and, finally, some scientific uses that surely foreshadow further development. The purpose of this has been to present for the consideration of scientists a point of view and some techniques which have had a phenomenal growth in the business world and to suggest that these are worth consideration in scientific data-handling problems (30). To close, let me quote from William Bamert on the experience of the C. and O. Railroad once more (8, p. 121): "Frankly, we have been asked whether we weren't planning for Utopia-the implication being that everyone except starry-eyed visionaries knows that Utopia is unattainable. Our answer is that of course we are! Has anyone yet discovered a better way to begin program planning of this nature? Our feeling is that compromise comes early enough in the normal order of things."

  6. Controlled processing during sequencing

    Directory of Open Access Journals (Sweden)

    Malathi Thothathiri

    2015-10-01

    Full Text Available Longstanding evidence has identified a role for the frontal cortex in sequencing within both linguistic and non-linguistic domains. More recently, neuropsychological studies have suggested a specific role for the left premotor-prefrontal junction (BA 44/6 in selection between competing alternatives during sequencing. In this study, we used neuroimaging with healthy adults to confirm and extend knowledge about the neural correlates of sequencing. Participants reproduced visually presented sequences of syllables and words using manual button presses. Items in the sequence were presented either consecutively or concurrently. Concurrent presentation is known to trigger the planning of multiple responses, which might compete with one another. Therefore, we hypothesized that regions involved in controlled processing would show greater recruitment during the concurrent than the consecutive condition. Whole-brain analysis showed concurrent > consecutive activation in sensory, motor and somatosensory cortices and notably also in rostral-dorsal anterior cingulate cortex (ACC. Region of interest analyses showed increased activation within left BA 44/6 and correlation between this region’s activation and behavioral response times. Functional connectivity analysis revealed increased connectivity between left BA 44/6 and the posterior lobe of the cerebellum during the concurrent than the consecutive condition. These results corroborate recent evidence and demonstrate the involvement of BA 44/6 and other control regions when ordering co-activated representations.

  7. Coal liquefaction processes

    Energy Technology Data Exchange (ETDEWEB)

    Baker, N.R.; Blazek, C.F.; Tison, R.R.

    1979-07-01

    Coal liquefaction is an emerging technology receiving great attention as a possible liquid fuel source. Currently, four general methods of converting coal to liquid fuel are under active development: direct hydrogenation; pyrolysis/hydrocarbonization; solvent extraction; and indirect liquefaction. This work is being conducted at the pilot plant stage, usually with a coal feed rate of several tons per day. Several conceptual design studies have been published recently for large (measured in tens of thousands of tons per day coal feed rate) commercial liquefaction plants, and these reports form the data base for this evaluation. Products from a liquefaction facility depend on the particular method and plant design selected, and these products range from synthetic crude oils up through the lighter hydrocarbon gases, and, in some cases, electricity. Various processes are evaluated with respect to product compositions, thermal efficiency, environmental effects, operating and maintenance requirements, and cost. Because of the large plant capacities of current conceptual designs, it is not clear as to how, and on what scale, coal liquefaction may be considered appropriate as an energy source for Integrated Community Energy Systems (CES). Development work, both currently under way and planned for the future, should help to clarify and quantify the question of applicability.

  8. Controlled processing during sequencing.

    Science.gov (United States)

    Thothathiri, Malathi; Rattinger, Michelle

    2015-01-01

    Longstanding evidence has identified a role for the frontal cortex in sequencing within both linguistic and non-linguistic domains. More recently, neuropsychological studies have suggested a specific role for the left premotor-prefrontal junction (BA 44/6) in selection between competing alternatives during sequencing. In this study, we used neuroimaging with healthy adults to confirm and extend knowledge about the neural correlates of sequencing. Participants reproduced visually presented sequences of syllables and words using manual button presses. Items in the sequence were presented either consecutively or concurrently. Concurrent presentation is known to trigger the planning of multiple responses, which might compete with one another. Therefore, we hypothesized that regions involved in controlled processing would show greater recruitment during the concurrent than the consecutive condition. Whole-brain analysis showed concurrent > consecutive activation in sensory, motor and somatosensory cortices and notably also in rostral-dorsal anterior cingulate cortex. Region of interest analyses showed increased activation within left BA 44/6 and correlation between this region's activation and behavioral response times. Functional connectivity analysis revealed increased connectivity between left BA 44/6 and the posterior lobe of the cerebellum during the concurrent than the consecutive condition. These results corroborate recent evidence and demonstrate the involvement of BA 44/6 and other control regions when ordering co-activated representations.

  9. Radiation processing of polyethylene

    Science.gov (United States)

    Barlow, A.; Biggs, J. W.; Meeks, L. A.

    This paper covers two areas: (a) the use of high energy radiation for the synthesis and improvement of polymer properties and (b) the formulation of radiation curable compounds for automotive/appliance wire applications and high voltage insulation. The first part discusses the use of gamma radiation for the bulk polymerization of ethylene and the properties of the polymer produced. The use of low dose radiation to increase polymer molecular weight and modify polydispersity is also described together with its projected operational cost. An update is provided of the cost savings that can be realized when using radiation crosslinked heavy duty film, which expands its applications, compared with noncrosslinked materials. The second section of the paper considers the advantages and disadvantages of radiation vs. peroxide curing of wire and cable compounds. The formulation of a radiation curable, automotive/appliance wire compound is discussed together with the interactions between the various ingredients; i.e., base resin, antioxidants, flame retardant filler, coupling agents, processing aids and radiation to achieve the desired product. In addition, the general property requirements of a radiation curable polyethylene for high voltage insulation are discussed; these include crosslinking efficiency, thermal stability, wet tree resistance and satisfactory dielectric properties. Preliminary data generated in the development of a 230 kV radiation crosslinked polyethylene insulation are included.

  10. Processing of food wastes.

    Science.gov (United States)

    Kosseva, Maria R

    2009-01-01

    Every year almost 45 billion kg of fresh vegetables, fruits, milk, and grain products is lost to waste in the United States. According to the EPA, the disposal of this costs approximately $1 billion. In the United Kingdom, 20 million tonnes of food waste are produced annually. Every tonne of food waste means 4.5 tonnes of CO2 emissions. The food wastes are generated largely by the fruit-and-vegetable/olive oil, fermentation, dairy, meat, and seafood industries. The aim of this chapter is to highlight trends in food waste processing technologies over the last 15 years. The chapter consists of three major parts, which cover the recovery of added-value products (the upgrading concept), food waste treatment technologies, and food chain management for sustainable food system development. The final part summarizes recent research on user-oriented innovation in the food sector, emphasizing the circular structure of a sustainable economy.

  11. Maintenance Process Strategic Analysis

    Science.gov (United States)

    Jasiulewicz-Kaczmarek, M.; Stachowiak, A.

    2016-08-01

    The performance and competitiveness of manufacturing companies are dependent on the availability, reliability and productivity of their production facilities. Low productivity, downtime, and poor machine performance are often linked to inadequate plant maintenance, which in turn can lead to reduced production levels, increasing costs, lost market opportunities, and lower profits. These pressures have given firms worldwide the motivation to explore and embrace proactive maintenance strategies over traditional reactive firefighting methods. The traditional view of maintenance has shifted into an overall view that encompasses Overall Equipment Efficiency, Stakeholder Management and Life Cycle Assessment. From a practical point of view, this requires changes in the approach to maintenance taken by managers and changes in the actions performed within the maintenance area. Managers have to understand that maintenance is not only about repairs and conservation of machines and devices, but also about actions striving for more efficient resource management and care for the safety and health of employees. The purpose of this work is to present a strategic analysis based on SWOT analysis to identify the opportunities and strengths of the maintenance process, so as to benefit from them as much as possible, as well as to identify weaknesses and threats so that they can be eliminated or minimized.

  12. Food Processing Control

    Science.gov (United States)

    1997-01-01

    When NASA started planning for manned space travel in 1959, the myriad challenges of sustaining life in space included a seemingly mundane but vitally important problem: How and what do you feed an astronaut? There were two main concerns: preventing food crumbs from contaminating the spacecraft's atmosphere or floating into sensitive instruments, and ensuring complete freedom from potentially catastrophic disease-producing bacteria, viruses, and toxins. To address these concerns, NASA enlisted the help of the Pillsbury Company. Pillsbury quickly solved the first problem by coating bite-size foods to prevent crumbling. They developed the hazard analysis and critical control point (HACCP) concept to ensure against bacterial contamination. Hazard analysis is a systematic study of a product, its ingredients, processing conditions, handling, storage, packing, distribution, and directions for consumer use to identify sensitive areas that might prove hazardous. Hazard analysis provides a basis for blueprinting the Critical Control Points (CCPs) to be monitored. CCPs are points in the chain from raw materials to the finished product where loss of control could result in unacceptable food safety risks. In early 1970, Pillsbury plants were following HACCP in production of food for Earthbound consumers. Pillsbury's subsequent training courses for Food and Drug Administration (FDA) personnel led to the incorporation of HACCP in the FDA's Low Acid Canned Foods Regulations, set down in the mid-1970s to ensure the safety of all canned food products in the U.S.

  13. Sustainable Process Synthesis-Intensification

    DEFF Research Database (Denmark)

    Babi, Deenesh Kavi

    materials (feedstock) and the use of sustainable technologies or processes, which directly impacts and improves sustainability/LCA factors. Process intensification is a concept by which processes, whether conceptual or existing, can be designed or redesigned to achieve more efficient and sustainable designs... Therefore, sustainable process design can be achieved by performing process synthesis and process intensification together. The main contribution of this work is the development of a systematic computer-aided multi-scale, multi-level framework for performing process synthesis-intensification that aims to make a process more sustainable than a base case design, which represents either a new or existing process. The framework consists of eight steps (step 1 to step 8) that operate at the unit operation scale and task scale, and four integrated task-phenomena-based steps (IT-PBS.1 to IT-PBS.4...

  14. Locally Stationary Processes - A Review

    CERN Document Server

    Dahlhaus, Rainer

    2011-01-01

    The article contains an overview of locally stationary processes. At the beginning, time-varying autoregressive processes are discussed in detail - both as a deep example and as an important class of locally stationary processes. In the next section a general framework for time series with time-varying finite dimensional parameters is discussed, with special emphasis on nonlinear locally stationary processes. Then the paper focusses on linear processes where a more general theory is possible. First a general definition for linear processes is given and time-varying spectral densities are discussed in detail. Then the Gaussian likelihood theory is presented for locally stationary processes. In the next section the relevance of empirical spectral processes for locally stationary time series is discussed. Empirical spectral processes play a major role in proving theoretical results and provide a deeper understanding of many techniques. The article concludes with an overview of other results for locally stationar...
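
    A minimal sketch of the canonical example mentioned above, a time-varying AR(1) process whose coefficient is a smooth function of rescaled time; the coefficient curve and parameter values are illustrative assumptions, not taken from the article.

```python
import numpy as np

# Time-varying AR(1): X_t = a(t/T) * X_{t-1} + eps_t, the standard example of
# a locally stationary process; the coefficient varies smoothly in rescaled
# time u = t/T. The curve and sample size below are illustrative only.
rng = np.random.default_rng(1)
T = 2000
a = lambda u: 0.9 * np.cos(np.pi * u)   # smooth coefficient curve on [0, 1]

x = np.zeros(T)
for t in range(1, T):
    x[t] = a(t / T) * x[t - 1] + rng.standard_normal()

# Locally, the process behaves like a stationary AR(1) with coefficient a(u).
print(x[:5])
```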

  15. Transforming spatial point processes into Poisson processes using random superposition

    DEFF Research Database (Denmark)

    Møller, Jesper; Berthelsen, Kasper Klitgaaard

    with a complementary spatial point process Y to obtain a Poisson process X∪Y with intensity function β. Underlying this is a bivariate spatial birth-death process (Xt,Yt) which converges towards the distribution of (X,Y). We study the joint distribution of X and Y, and their marginal and conditional distributions. In particular, we introduce a fast and easy simulation procedure for Y conditional on X. This may be used for model checking: given a model for the Papangelou intensity of the original spatial point process, this model is used to generate the complementary process, and the resulting superposition is a Poisson process with intensity function β if and only if the true Papangelou intensity is used. Whether the superposition is actually such a Poisson process can easily be examined using well known results and fast simulation procedures for Poisson processes. We illustrate this approach to model checking...
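
    For reference, the kind of fast, well-known simulation procedure for Poisson processes alluded to above can be sketched as follows for a homogeneous process on a rectangular window; this is the textbook construction, not the bivariate birth-death construction of the paper, and the intensity value is an arbitrary example.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_poisson(beta, width=1.0, height=1.0):
    """Homogeneous Poisson process with intensity beta on a rectangle:
    draw a Poisson number of points, then place them uniformly."""
    n = rng.poisson(beta * width * height)
    return np.column_stack([rng.uniform(0, width, n), rng.uniform(0, height, n)])

pts = simulate_poisson(beta=100.0)
print(len(pts), "points simulated")
```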

  16. Design Process Optimization Based on Design Process Gene Mapping

    Institute of Scientific and Technical Information of China (English)

    LI Bo; TONG Shu-rong

    2011-01-01

    The idea of genetic engineering is introduced into the area of product design to improve design efficiency. A method for design process optimization based on the design process gene is proposed through analyzing the correlation between the design process gene and the characteristics of the design process. The concept of the design process gene is analyzed and categorized into five categories, namely the task specification gene, the concept design gene, the overall design gene, the detailed design gene and the processing design gene, in the light of five design phases. The elements and their interactions involved in each kind of design process gene are analyzed, and a design process gene map is drawn, with its structure disclosed based on its function.

  17. Integrated Process Design, Control and Analysis of Intensified Chemical Processes

    DEFF Research Database (Denmark)

    Mansouri, Seyed Soheil

    Process design and process control have been considered as independent problems for many years. In this context, a sequential approach is used where the process is designed first, followed by the control design. However, this sequential approach has its limitations related to dynamic constraint violations, for example, infeasible operating points, process overdesign or under-performance. Therefore, by using this approach, a robust performance is not always guaranteed. Furthermore, process design decisions can influence process control and operation. To overcome these limitations, an alternative ... chemical processes; for example, intensified processes such as reactive distillation. Most importantly, it identifies and eliminates potentially promising design alternatives that may have controllability problems later. To date, a number of methodologies have been proposed and applied on various problems...

  18. Business Process Innovation using the Process Innovation Laboratory

    DEFF Research Database (Denmark)

    Møller, Charles

    process innovation (BPI) in future organizations. There is a significant body of knowledge on various aspects of process innovation, e.g. on conceptual modeling, business processes, supply chains and enterprise systems. Still, an overall comprehensive and consistent theoretical framework with guidelines for practical applications has not been identified. The aim of this paper is to establish a conceptual framework for business process innovation in the supply chain based on advanced enterprise systems. The main approach to business process innovation in this context is to create a new methodology for exploring process models and patterns of applications. The paper thus presents a new concept for business process innovation called the process innovation laboratory a.k.a. the ?-Lab. The ?-Lab is a comprehensive framework for BPI using advanced enterprise systems. The ?-Lab is a collaborative workspace...

  19. Markov reward processes

    Science.gov (United States)

    Smith, R. M.

    1991-01-01

    Numerous applications in the area of computer system analysis can be effectively studied with Markov reward models. These models describe the behavior of the system with a continuous-time Markov chain, where a reward rate is associated with each state. In a reliability/availability model, upstates may have reward rate 1 and down states may have reward rate zero associated with them. In a queueing model, the number of jobs of certain type in a given state may be the reward rate attached to that state. In a combined model of performance and reliability, the reward rate of a state may be the computational capacity, or a related performance measure. Expected steady-state reward rate and expected instantaneous reward rate are clearly useful measures of the Markov reward model. More generally, the distribution of accumulated reward or time-averaged reward over a finite time interval may be determined from the solution of the Markov reward model. This information is of great practical significance in situations where the workload can be well characterized (deterministically, or by continuous functions e.g., distributions). The design process in the development of a computer system is an expensive and long term endeavor. For aerospace applications the reliability of the computer system is essential, as is the ability to complete critical workloads in a well defined real time interval. Consequently, effective modeling of such systems must take into account both performance and reliability. This fact motivates our use of Markov reward models to aid in the development and evaluation of fault tolerant computer systems.
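
    As a minimal sketch of the expected steady-state reward computation described above, the snippet below solves πQ = 0 with the normalization Σπ = 1 for a small, hypothetical three-state availability model; the generator and reward rates are invented for illustration.

```python
import numpy as np

# Hypothetical 3-state availability model: states 0 (up), 1 (degraded), 2 (down).
Q = np.array([[-0.2,  0.15, 0.05],
              [ 0.3, -0.4,  0.1 ],
              [ 0.5,  0.5, -1.0 ]])   # CTMC generator: rows sum to zero
reward = np.array([1.0, 0.5, 0.0])    # reward rate per state (e.g. capacity)

# Stationary distribution: solve pi @ Q = 0 together with sum(pi) = 1.
A = np.vstack([Q.T, np.ones(len(Q))])
b = np.zeros(len(Q) + 1)
b[-1] = 1.0
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

print("stationary distribution:", pi.round(4))
print("expected steady-state reward rate:", (pi @ reward).round(4))
```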

  20. Flash Lidar Data Processing

    Science.gov (United States)

    Bergkoetter, M. D.; Ruppert, L.; Weimer, C. S.; Ramond, T.; Lefsky, M. A.; Burke, I. C.; Hu, Y.

    2009-12-01

    Late last year, a prototype Flash LIDAR instrument flew on a series of airborne tests to demonstrate its potential for improved vegetation measurements. The prototype is a precursor to the Electronically Steerable Flash LIDAR (ESFL) currently under development at Ball Aerospace and Technology Corp. with funding from the NASA Earth Science Technology Office. ESFL may soon significantly expand our ability to measure vegetation and forests and better understand the extent of their role in global climate change and the carbon cycle - all critical science questions relating to the upcoming NASA DESDynI and ESA BIOMASS missions. In order to more efficiently exploit data returned from the experimental Flash Lidar system and plan for data exploitation from future flights, Ball funded a graduate student project (through the Ball Summer Intern Program, summer 2009) to develop and implement algorithms for post-processing of the 3-Dimensional Flash Lidar data. This effort included developing autonomous algorithms to resample the data to a uniform rectangular grid, geolocation of the data, and visual display of large swaths of data. The resampling, geolocation, surface hit detection, and aggregation of frame data are implemented with new MATLAB code, and the efficient visual display is achieved with free commercial viewing software. These efforts directly support additional tests flights planned as early as October 2009, including possible flights over Niwot Ridge, CO, for which there is ICESat data, and a sea-level coastal area in California to test the effect of higher altitude (above ground level) on the divergence of the beams and the beam spot sizes.

  1. Representative process sampling - in practice

    DEFF Research Database (Denmark)

    Esbensen, Kim; Friis-Pedersen, Hans Henrik; Julius, Lars Petersen

    2007-01-01

    Didactic data sets representing a range of real-world processes are used to illustrate "how to do" representative process sampling and process characterisation. The selected process data lead to diverse variogram expressions with different systematics (no range vs. important ranges; trends and/or periodicity; different nugget effects and process variations ranging from less than one lag to full variogram lag). Variogram data analysis leads to a fundamental decomposition into 0-D sampling vs. 1-D process variances, based on the three principal variogram parameters: range, sill and nugget effect...
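
    A minimal sketch of the 1-D process variogram computation underlying this kind of analysis, applied to a synthetic series; the data and lag range are assumptions for illustration, not the didactic data sets used in the chapter.

```python
import numpy as np

def empirical_variogram(x, max_lag):
    """Empirical 1-D process variogram v(h) = mean of (x[i+h] - x[i])^2 / 2."""
    x = np.asarray(x, dtype=float)
    return np.array([0.5 * np.mean((x[h:] - x[:-h]) ** 2)
                     for h in range(1, max_lag + 1)])

# Synthetic process stream with a trend plus noise, purely for illustration.
rng = np.random.default_rng(3)
series = 0.01 * np.arange(500) + rng.standard_normal(500)
v = empirical_variogram(series, max_lag=50)
print(v[:5].round(3))   # nugget-dominated at short lags, rising with the trend
```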

  2. An improved approach for process monitoring in laser material processing

    Science.gov (United States)

    König, Hans-Georg; Pütsch, Oliver; Stollenwerk, Jochen; Loosen, Peter

    2016-04-01

    Process monitoring is used in many laser material processes due to the demand for reliable and stable processes. Among the different methods, on-axis process monitoring offers multiple advantages. To observe a laser material process, it is unavoidable to choose an observation wavelength different from the one used for material processing; otherwise the light of the processing laser would outshine the image of the process. With a different wavelength, lateral chromatic aberration occurs in optical systems with scanning units and f-theta lenses that are not chromatically corrected. These aberrations lead to a truncated image of the process on the camera or the pyrometer, which results in distorted measurements and unsatisfactory images of the process. A new approach for solving the problem of field-dependent lateral chromatic aberration in process monitoring is presented. The scanner-based optical system is reproduced in a simulation environment to predict the occurring lateral chromatic aberrations, and a second deflecting system is integrated into the setup. Using the simulation, a predictive control is designed that uses the additional deflecting system to introduce reverse lateral deviations in order to compensate the lateral effect of chromatic aberration. This paper illustrates the concept and implementation of the predictive control used to eliminate lateral chromatic aberration in process monitoring, the simulation on which the system is based, the optical system, and the control concept.

  3. Process mining discovery, conformance and enhancement of business processes

    CERN Document Server

    van der Aalst, Wil M P

    2011-01-01

    The first to cover this missing link between data mining and process modeling, this book provides real-world techniques for monitoring and analyzing processes in real time. It is a powerful new tool destined to play a key role in business process management.

  4. Composable Data Processing in Environmental Science - A Process View

    NARCIS (Netherlands)

    Wombacher, A.

    2008-01-01

    Data processing in environmental science is essential for doing science. The heterogeneity of data sources, data processing operations and infrastructures results in a lot of manual data and process integration work done by each scientist individually. This is very inefficient and time consuming. Th

  5. Process intensification: a balance between product and process innovation

    NARCIS (Netherlands)

    Graaff, M.P. de; Swinkels, P.

    2013-01-01

    Martijn de Graaff of TNO and Pieter Swinkels of TU Delft discuss the challenges of implementing process intensification in new product and process innovation. The Delft Product & Process Design Institute at Delft University of Technology in the Netherlands (TU Delft) has seen over 100 case studies o

  6. Arctic Summer Ice Processes

    Science.gov (United States)

    Holt, Benjamin

    1999-01-01

    The primary objective of this study is to estimate the flux of heat and freshwater resulting from sea ice melt in the polar seas. The approach taken is to examine the decay of sea ice in the summer months primarily through the use of spaceborne Synthetic Aperture Radar (SAR) imagery. The improved understanding of the dynamics of the melt process can be usefully combined with ice thermodynamic and upper ocean models to form more complete models of ice melt. Models indicate that more heat is absorbed in the upper ocean when the ice cover is composed of smaller rather than larger floes and when there is more open water. Over the course of the summer, floes disintegrate by physical forcing and heating, melting into smaller and smaller sizes. By measuring the change in distribution of floes together with open water over a summer period, we can make estimates of the amount of heating by region and time. In a climatic sense, these studies are intended to improve the understanding of the Arctic heat budget which can then be eventually incorporated into improved global climate models. This work has two focus areas. The first is examining the detailed effect of storms on floe size and open water. A strong Arctic low pressure storm has been shown to loosen up the pack ice, increase the open water concentration well into the pack ice, and change the distribution of floes toward fewer and smaller floes. This suggests episodic melting and the increased importance of horizontal (lateral) melt during storms. The second focus area is related to an extensive ship-based experiment that recently took place in the Arctic called Surface Heat Budget of the Arctic (SHEBA). An icebreaker was placed purposely into the older pack ice north of Alaska in September 1997. The ship served as the base for experimenters who deployed extensive instrumentation to measure the atmosphere, ocean, and ice during a one-year period. My experiment will be to derive similar measurements (floe size, open

  7. Human Assisted Assembly Processes

    Energy Technology Data Exchange (ETDEWEB)

    CALTON,TERRI L.; PETERS,RALPH R.

    2000-01-01

    Automatic assembly sequencing and visualization tools are valuable in determining the best assembly sequences, but without Human Factors and Figure Models (HFFMs) it is difficult to evaluate or visualize human interaction. In industry, accelerating technological advances and shorter market windows have forced companies to turn to an agile manufacturing paradigm. This trend has promoted computerized automation of product design and manufacturing processes, such as automated assembly planning. However, all automated assembly planning software tools assume that the individual components fly into their assembled configuration and generate what appear to be perfectly valid operations, but in reality the operations cannot physically be carried out by a human. Similarly, human figure modeling algorithms may indicate that assembly operations are not feasible and consequently force design modifications; however, if they had the capability to quickly generate alternative assembly sequences, they might have identified a feasible solution. To solve this problem, HFFMs must be integrated with automated assembly planning to allow engineers to verify that assembly operations are possible and to see ways to make the designs even better. Factories will very likely put humans and robots together in cooperative environments to meet the demands for customized products, for purposes including robotic and automated assembly. For robots to work harmoniously within an integrated environment with humans, the robots must have cooperative operational skills. For example, in a human-only environment, humans may tolerate collisions with one another if they do not cause much pain. This level of tolerance may or may not apply to robot-human environments. Humans expect that robots will be able to operate and navigate in their environments without collisions or interference. The ability to accomplish this is linked to the sensing capabilities available. Current work in the field of cooperative

  8. [In Process Citation].

    Science.gov (United States)

    Stahnisch, Frank W

    2014-01-01

    Since the middle of the Nineteenth Century, neurophysiological researchers such as Theodor Fechner (1801-1887), Wilhelm Wundt (1832-1920), or Maximilian Ruppert Franz von Frey (1852-1932) started to analyze the causes, propagation, and perception of "pain" in the nervous system through the systematic use of experimental laboratory investigations. Particularly, Theodor Fechner's groundbreaking works made the contemporary neurophysiologists aware of the potential inclusion of psychological and subjective perceptions as a respectable object for the experimental study in mid-nineteenth century laboratories and clinical wards. Wilhelm Wundt frequently crossed the intersections between animal and human subject research and opened up many theoretical discussions, which also incorporated pluridisciplinary perspectives. On the research side, Wundt worked with many experimental physiological methods, developed theoretical psychophysiological considerations, and provided a detailed philosophical analysis of the new experimental findings and the subjective accounts of pain perceptions in his test persons--among many other experimental and investigative approaches. While each one of these neurophysiologists' research programs have been extensively studied in their own right, their mutual contributions to modern pain research and impact on this emerging interdisciplinary field of biomedical, psychophysiological and philosophical studies have so far not sufficiently been analyzed from a historiographical perspective. This even regards their highly sophisticated instruments and apparatuses that they applied to the study of pain, which Maximilian von Frey used further in the medical wards at the Fin de Siècle. These instruments became applied to many patients with acute or chronic pain disorders. In a way, the substantial time lag between early laboratory research and the application of these findings in the medical clinics of the time could also be explained as a process of newly

  9. Process Correlation Analysis Model for Process Improvement Identification

    Directory of Open Access Journals (Sweden)

    Su-jin Choi

    2014-01-01

    software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to significant efforts and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  10. Process waste assessment: Color print processing (RA-4)

    Energy Technology Data Exchange (ETDEWEB)

    Catlett, P.

    1994-05-01

    The Kodak RA-4 process is used to develop prints and overhead transparencies from photographic negatives. The assessment was based on usage, effluent discharge, and final disposition of waste generated by the process. Two options explored were bleach-fix regeneration and the conversion to a digital image processing system. The RA-4 process is environmentally sound and generates a relatively small amount of waste. The bleach-fix option would provide only a small effluent reduction. The digital imaging conversion option, if fully implemented, could greatly reduce waste generated in the photo lab.

  11. Nontraditional machining processes research advances

    CERN Document Server

    2013-01-01

    Nontraditional machining employs processes that remove material by various methods involving thermal, electrical, chemical and mechanical energy or even combinations of these. Nontraditional Machining Processes covers recent research and development in techniques and processes which focus on achieving high accuracies and good surface finishes, parts machined without burrs or residual stresses especially with materials that cannot be machined by conventional methods. With applications to the automotive, aircraft and mould and die industries, Nontraditional Machining Processes explores different aspects and processes through dedicated chapters. The seven chapters explore recent research into a range of topics including laser assisted manufacturing, abrasive water jet milling and hybrid processes. Students and researchers will find the practical examples and new processes useful for both reference and for developing further processes. Industry professionals and materials engineers will also find Nontraditional M...

  12. National Automated Conformity Inspection Process -

    Data.gov (United States)

    Department of Transportation — The National Automated Conformity Inspection Process (NACIP) Application is intended to expedite the workflow process as it pertains to the FAA Form 81 0-10 Request...

  13. Sustainable development through process innovation

    NARCIS (Netherlands)

    Sleeman, E.; Oonk, J.; Krist-Spit, mw. C.E.

    1998-01-01

    Innovation of processes is one of the cornerstones of sustainable development. Innovation may be characterized by the degree of intervention in the existing base materials-processes-products chains. Four types of innovation are distinguished [1]:

  14. Chemometrics applications in biotech processes: assessing process comparability.

    Science.gov (United States)

    Bhushan, Nitish; Hadpe, Sandip; Rathore, Anurag S

    2012-01-01

    A typical biotech process starts with the vial of the cell bank, ends with the final product and has anywhere from 15 to 30 unit operations in series. The total number of process variables (input and output parameters) and other variables (raw materials) can add up to several hundred variables. As the manufacturing process is widely accepted to have significant impact on the quality of the product, the regulatory agencies require an assessment of process comparability across different phases of manufacturing (Phase I vs. Phase II vs. Phase III vs. Commercial) as well as other key activities during product commercialization (process scale-up, technology transfer, and process improvement). However, assessing comparability for a process with such a large number of variables is nontrivial and often companies resort to qualitative comparisons. In this article, we present a quantitative approach for assessing process comparability via use of chemometrics. To our knowledge this is the first time that such an approach has been published for biotech processing. The approach has been applied to an industrial case study involving evaluation of two processes that are being used for commercial manufacturing of a major biosimilar product. It has been demonstrated that the proposed approach is able to successfully identify the unit operations in the two processes that are operating differently. We expect this approach, which can also be applied toward assessing product comparability, to be of great use to both the regulators and the industry which otherwise struggle to assess comparability.
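
    One common chemometric building block for this kind of comparison is projection of batch data onto principal components; the sketch below is a generic PCA-based comparison on fabricated data, not the specific multivariate model used in the article, and all array shapes and values are assumptions.

```python
import numpy as np

# Hypothetical batch data: rows = batches, columns = process variables
# (e.g. titer, pH, viability). Data and shapes are made up for illustration.
rng = np.random.default_rng(7)
process_a = rng.normal(loc=0.0, scale=1.0, size=(30, 8))
process_b = rng.normal(loc=0.3, scale=1.0, size=(25, 8))   # slight shift

# Fit a PCA model on process A (mean-centred), then project process B batches
# onto the same components and compare score distributions component-wise.
mean_a = process_a.mean(axis=0)
_, _, vt = np.linalg.svd(process_a - mean_a, full_matrices=False)
components = vt[:2]                          # first two principal components

scores_a = (process_a - mean_a) @ components.T
scores_b = (process_b - mean_a) @ components.T
print("PC1 mean shift:", (scores_b[:, 0].mean() - scores_a[:, 0].mean()).round(3))
```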

  15. PROCESS VARIABILITY REDUCTION THROUGH STATISTICAL PROCESS CONTROL FOR QUALITY IMPROVEMENT

    Directory of Open Access Journals (Sweden)

    B.P. Mahesh

    2010-09-01

    Full Text Available Quality has become one of the most important customer decision factors in the selection among competing products and services. Consequently, understanding and improving quality is a key factor leading to business success, growth and an enhanced competitive position. Hence, a quality improvement program should be an integral part of the overall business strategy. According to TQM, the effective way to improve the quality of a product or service is to improve the process used to build the product. Hence, TQM focuses on processes rather than results, as the results are driven by the processes. Many techniques are available for quality improvement. Statistical Process Control (SPC) is one such TQM technique which is widely accepted for analyzing quality problems and improving the performance of the production process. This article illustrates the step-by-step procedure adopted at a soap manufacturing company to improve quality by reducing process variability using Statistical Process Control.
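
    As a small illustration of the kind of SPC calculation involved, the sketch below computes individuals and moving-range (I-MR) control chart limits for a fabricated series of measurements; it is not the specific chart or data used in the case study.

```python
import numpy as np

# Individuals and moving-range (I-MR) control chart limits, a standard SPC
# tool; the measurements below are fabricated for illustration.
x = np.array([10.2, 10.4, 9.9, 10.1, 10.6, 10.0, 10.3, 9.8, 10.2, 10.5])

mr_bar = np.mean(np.abs(np.diff(x)))       # average moving range (span of 2)
center = x.mean()
ucl = center + 2.66 * mr_bar               # 2.66 = 3 / d2 with d2 = 1.128
lcl = center - 2.66 * mr_bar

out_of_control = (x > ucl) | (x < lcl)
print(f"CL={center:.2f}  UCL={ucl:.2f}  LCL={lcl:.2f}  violations={out_of_control.sum()}")
```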

  16. Analysis of Hospital Processes with Process Mining Techniques.

    Science.gov (United States)

    Orellana García, Arturo; Pérez Alfonso, Damián; Larrea Armenteros, Osvaldo Ulises

    2015-01-01

    Process mining allows for discovering, monitoring, and improving the processes recorded in information systems through their event logs. In hospital environments, process analysis has been a crucial factor for cost reduction, control and proper use of resources, better patient care, and achieving service excellence. This paper presents a new component for event log generation in the Hospital Information System (HIS) developed at the University of Informatics Sciences. The event logs obtained are used for analysis of hospital processes with process mining techniques. The proposed solution aims to generate high-quality event logs in the system. The analyses performed allowed functions in the system to be redefined and a proper flow of information to be proposed. The study exposed the need to incorporate process mining techniques in hospital systems to analyze process execution. Moreover, we illustrate its application in making clinical and administrative decisions for the management of hospital activities.
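
    As a minimal illustration of what process mining does with such event logs, the sketch below derives directly-follows counts from a fabricated hospital-style log; the cases and activities are invented, and nothing about the actual HIS log format is assumed.

```python
from collections import Counter, defaultdict

# Minimal directly-follows analysis of an event log, a basic building block of
# many process-mining discovery algorithms. The log below is a fabricated
# hospital-style example (case id, activity), already ordered by timestamp.
log = [
    ("c1", "admission"), ("c1", "triage"), ("c1", "treatment"), ("c1", "discharge"),
    ("c2", "admission"), ("c2", "triage"), ("c2", "lab test"), ("c2", "treatment"),
    ("c2", "discharge"),
]

traces = defaultdict(list)
for case, activity in log:
    traces[case].append(activity)

dfg = Counter()
for activities in traces.values():
    dfg.update(zip(activities, activities[1:]))   # consecutive activity pairs

for (a, b), count in dfg.most_common():
    print(f"{a} -> {b}: {count}")
```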

  17. Electrochemical process of titanium extraction

    Institute of Scientific and Technical Information of China (English)

    CH. RVS. NAGESH; C. S. RAMACHANDRAN

    2007-01-01

    A wide variety of processes are being pursued by researchers for cost effective extraction of titanium metal. Electrochemical processes are promising due to simplicity and being less capital intensive. Some of the promising electrochemical processes of titanium extraction were reviewed and the results of laboratory scale experiments on electrochemical reduction of TiO2 granules were brought out. Some of the kinetic parameters of the reduction process were discussed while presenting the quality improvements achieved in the experimentation.

  18. GPU applications for data processing

    Energy Technology Data Exchange (ETDEWEB)

    Vladymyrov, Mykhailo, E-mail: mykhailo.vladymyrov@cern.ch [LPI - Lebedev Physical Institute of the Russian Academy of Sciences, RUS-119991 Moscow (Russian Federation); Aleksandrov, Andrey [LPI - Lebedev Physical Institute of the Russian Academy of Sciences, RUS-119991 Moscow (Russian Federation); INFN sezione di Napoli, I-80125 Napoli (Italy); Tioukov, Valeri [INFN sezione di Napoli, I-80125 Napoli (Italy)

    2015-12-31

    Modern experiments that use nuclear photoemulsion require fast and efficient data acquisition from the emulsion. New approaches to developing scanning systems require real-time processing of large amounts of data. Methods that use Graphics Processing Unit (GPU) computing power for emulsion data processing are presented here. It is shown how GPU-accelerated emulsion processing helped us to raise the scanning speed by a factor of nine.

  19. Advanced methods for processing ceramics

    Energy Technology Data Exchange (ETDEWEB)

    Carter, W.B. [Georgia Institute of Technology, Atlanta, GA (United States)

    1995-05-01

    Combustion chemical vapor deposition (CCVD) is a flame assisted, open air chemical vapor deposition (CVD) process. The process is capable of producing textured, epitaxial coatings on single crystal substrates using low cost reagents. Combustion chemical vapor deposition is a relatively inexpensive, alternative thin film deposition process with potential to replace conventional coating technologies for certain applications. The goals of this project are to develop the CCVD process to the point that potential industrial applications can be identified and reliably assessed.

  20. Project management of business process

    OpenAIRE

    Jovanović, Slaviša

    2011-01-01

    This thesis presents one of the hottest areas in today's software industry, business process management (BPM). Business process management has become an important tool for companies to hone their capabilities and business processes, to adjust to rapid and sudden changes in the marketplace, and to achieve the plans they have set. The thesis is divided into two parts. The first part provides an overview of the scope of business process management through the project approach, historically defined its...

  1. PSE in Pharmaceutical Process Development

    DEFF Research Database (Denmark)

    Gernaey, Krist; Cervera Padrell, Albert Emili; Woodley, John

    2011-01-01

    The pharmaceutical industry is under growing pressure to increase efficiency, both in production and in process development. This paper will discuss the use of Process Systems Engineering (PSE) methods in pharmaceutical process development, and searches for answers to questions such as: Which PSE...

  2. Rapid thermal processing of semiconductors

    CERN Document Server

    Borisenko, Victor E

    1997-01-01

    Rapid thermal processing has contributed to the development of single wafer cluster processing tools and other innovations in integrated circuit manufacturing environments. Borisenko and Hesketh review theoretical and experimental progress in the field, discussing a wide range of materials, processes, and conditions. They thoroughly cover the work of international investigators in the field.

  3. Dynamic similarity in erosional processes

    Science.gov (United States)

    Scheidegger, A.E.

    1963-01-01

    A study is made of the dynamic similarity conditions obtaining in a variety of erosional processes. The pertinent equations for each type of process are written in dimensionless form; the similarity conditions can then easily be deduced. The processes treated are: raindrop action, slope evolution and river erosion. © 1963 Istituto Geofisico Italiano.

  4. Bayesian inference for Hawkes processes

    DEFF Research Database (Denmark)

    Rasmussen, Jakob Gulddahl

    The Hawkes process is a practically and theoretically important class of point processes, but parameter-estimation for such a process can pose various problems. In this paper we explore and compare two approaches to Bayesian inference. The first approach is based on the so-called conditional...

  5. Present status of processing method

    Energy Technology Data Exchange (ETDEWEB)

    Kosako, Kazuaki [Sumitomo Atomic Energy Industries Ltd., Tokyo (Japan)

    1998-11-01

    The present status of processing methods for a high-energy nuclear data file was examined. The NJOY94 code is the only one available for this processing. In Japan, present processing using NJOY94 is oriented toward the production of a traditional cross-section library, because a high-energy transport code using a high-energy cross-section library is indistinct. (author)

  6. IT Support for Healthcare Processes

    NARCIS (Netherlands)

    Lenz, R.; Reichert, M.U.

    2005-01-01

    Patient treatment processes require the cooperation of different organizational units and medical disciplines. In such an environment an optimal process support becomes crucial. Though healthcare processes frequently change, and therefore the separation of the flow logic from the application code se

  7. Process control in biogas plants

    DEFF Research Database (Denmark)

    Holm-Nielsen, Jens Bo; Oleskowicz-Popiel, Piotr

    2013-01-01

    Efficient monitoring and control of anaerobic digestion (AD) processes are necessary in order to enhance biogas plant performance. The aim of monitoring and controlling the biological processes is to stabilise and optimise the production of biogas. The principles of process analytical technology...

  8. Counting Processes in Simple Addition.

    Science.gov (United States)

    Svenson, Ola; Hedenborg, Maj-Lene

    1980-01-01

    The cognitive processes of seven children solving arithmetic problems were accurately classified as reconstructive or reproductive according to the child's verbal report of his thought processes. Classifications of thought processes by means of verbal reports can also be used to improve the analysis of latencies. (SB)

  9. Learning processes across knowledge domains

    DEFF Research Database (Denmark)

    Hall-Andersen, Lene Bjerg; Broberg, Ole

    2014-01-01

    informed by selected perspectives on learning processes and boundary processes was applied on three illustrative vignettes to illuminate learning potentials and shortcomings in boundary processes. Findings - In the engineering consultancy, it was found that while learning did occur in the consultancy...

  10. Perfect simulation of Hawkes processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Rasmussen, Jakob Gulddahl

    2005-01-01

    Our objective is to construct a perfect simulation algorithm for unmarked and marked Hawkes processes. The usual straightforward simulation algorithm suffers from edge effects, whereas our perfect simulation algorithm does not. By viewing Hawkes processes as Poisson cluster processes and using...

  11. Semantic Processing of Mathematical Gestures

    Science.gov (United States)

    Lim, Vanessa K.; Wilson, Anna J.; Hamm, Jeff P.; Phillips, Nicola; Iwabuchi, Sarina J.; Corballis, Michael C.; Arzarello, Ferdinando; Thomas, Michael O. J.

    2009-01-01

    Objective: To examine whether or not university mathematics students semantically process gestures depicting mathematical functions (mathematical gestures) similarly to the way they process action gestures and sentences. Semantic processing was indexed by the N400 effect. Results: The N400 effect elicited by words primed with mathematical gestures…

  12. Process algebra for synchronous communication

    NARCIS (Netherlands)

    Bergstra, J.A.; Klop, J.W.

    1984-01-01

    Within the context of an algebraic theory of processes, an equational specification of process cooperation is provided. Four cases are considered: free merge or interleaving, merging with communication, merging with mutual exclusion of tight regions, and synchronous process cooperation. The rewrite

  13. Methods in Astronomical Image Processing

    Science.gov (United States)

    Jörsäter, S.

    Contents include: A Brief Introductory Note; History of Astronomical Imaging; Astronomical Image Data; Images in Various Formats; Digitized Image Data; Digital Image Data; Philosophy of Astronomical Image Processing; Properties of Digital Astronomical Images; Human Image Processing; Astronomical vs. Computer Science Image Processing; Basic Tools of Astronomical Image Processing; Display Applications; Calibration of Intensity Scales; Calibration of Length Scales; Image Re-shaping; Feature Enhancement; Noise Suppression; Noise and Error Analysis; Image Processing Packages: Design of AIPS and MIDAS; AIPS; MIDAS; Reduction of CCD Data; Bias Subtraction; Clipping; Preflash Subtraction; Dark Subtraction; Flat Fielding; Sky Subtraction; Extinction Correction; Deconvolution Methods; Rebinning/Combining; Summary and Prospects for the Future.
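
    As a small illustration of the CCD reduction arithmetic listed above (bias and dark subtraction, flat fielding), the sketch below applies the standard formula to synthetic frames; the array sizes and pixel values are made up for illustration.

```python
import numpy as np

# Basic CCD reduction: subtract bias and dark, then divide by the normalised
# flat field. Frames here are synthetic and purely illustrative.
rng = np.random.default_rng(5)
raw  = rng.normal(1000.0, 10.0, (512, 512))          # raw science frame (ADU)
bias = np.full((512, 512), 300.0)                    # master bias
dark = np.full((512, 512), 5.0)                      # master dark (exposure-scaled)
flat = rng.normal(1.0, 0.02, (512, 512)) * 2000.0    # master flat field

flat_corr = flat - bias                      # remove bias from the flat
flat_norm = flat_corr / np.median(flat_corr) # normalise flat to unit median

reduced = (raw - bias - dark) / flat_norm
print(float(reduced.mean()))
```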

  14. Continuous-state branching processes

    CERN Document Server

    Li, Zenghu

    2012-01-01

    These notes were used in a short graduate course on branching processes the author gave at Beijing Normal University. The following main topics are covered: scaling limits of Galton-Watson processes, continuous-state branching processes, extinction probabilities, conditional limit theorems, decompositions of sample paths, martingale problems, stochastic equations, Lamperti's transformations, and independent and dependent immigration processes. Some of the results are simplified versions of those in the author's book "Measure-valued branching Markov processes" (Springer, 2011). We hope these simplified results set out the main ideas in an accessible way and give the reader quick access to the subject.
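
    As a toy illustration of the discrete model whose scaling limits the notes start from, the sketch below simulates a Galton-Watson branching process with Poisson offspring; the parameters are illustrative assumptions only.

```python
import numpy as np

# Galton-Watson branching process with Poisson(m) offspring, the discrete
# model whose rescaled limits lead to continuous-state branching processes.
rng = np.random.default_rng(11)

def galton_watson(m, n_generations, z0=1):
    """Return generation sizes Z_0, ..., Z_n for offspring mean m."""
    sizes = [z0]
    for _ in range(n_generations):
        z = sizes[-1]
        sizes.append(int(rng.poisson(m, size=z).sum()) if z > 0 else 0)
    return sizes

print(galton_watson(m=1.0, n_generations=20))   # critical case: extinction is certain
```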

  15. Optical and digital image processing

    CERN Document Server

    Cristobal, Gabriel; Thienpont, Hugo

    2011-01-01

    In recent years, Moore's law has fostered the steady growth of the field of digital image processing, though the computational complexity remains a problem for most of the digital image processing applications. In parallel, the research domain of optical image processing has matured, potentially bypassing the problems digital approaches were suffering and bringing new applications. The advancement of technology calls for applications and knowledge at the intersection of both areas but there is a clear knowledge gap between the digital signal processing and the optical processing communities. T

  16. Unifying the Software Process Spectrum

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Software Process Workshop (SPW 2005) was held in Beijing on May 25-27, 2005. This paper introduces the motivation of organizing such a workshop, as well as its theme and paper gathering and review; and summarizes the main content and insights of 11 keynote speeches, 30 regular papers in five sessions of "Process Content", "Process Tools and Metrics", "Process Management", "Process Representation and Analysis", and "Experience Reports", 8 software development support tools demonstration, and the ending panel "Where Are We Now? Where Should We Go Next?".

  17. Bayesian inference for Hawkes processes

    DEFF Research Database (Denmark)

    Rasmussen, Jakob Gulddahl

    2013-01-01

    The Hawkes process is a practically and theoretically important class of point processes, but parameter-estimation for such a process can pose various problems. In this paper we explore and compare two approaches to Bayesian inference. The first approach is based on the so-called conditional...... intensity function, while the second approach is based on an underlying clustering and branching structure in the Hawkes process. For practical use, MCMC (Markov chain Monte Carlo) methods are employed. The two approaches are compared numerically using three examples of the Hawkes process....
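
    For readers unfamiliar with the model, the sketch below simulates a Hawkes process with an exponential kernel via Ogata-style thinning; it illustrates the conditional intensity the abstract refers to, not the MCMC inference compared in the paper, and all parameter values are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_hawkes(mu, alpha, beta, horizon):
    """Ogata-style thinning for a Hawkes process with conditional intensity
    lambda(t) = mu + sum_{t_i < t} alpha * exp(-beta * (t - t_i))."""
    events, t = [], 0.0
    while True:
        # Conservative upper bound on lambda(s) for s > t: the intensity decays
        # between events, so its current value plus one extra jump suffices.
        lam_bar = mu + alpha + sum(alpha * np.exp(-beta * (t - ti)) for ti in events)
        t += rng.exponential(1.0 / lam_bar)
        if t > horizon:
            return np.array(events)
        lam_t = mu + sum(alpha * np.exp(-beta * (t - ti)) for ti in events)
        if rng.uniform() <= lam_t / lam_bar:   # accept with probability lambda(t)/lam_bar
            events.append(t)

times = simulate_hawkes(mu=0.5, alpha=0.8, beta=1.2, horizon=100.0)
print(len(times), "events on [0, 100]")
```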

  19. Wet flue gas desulfurization processes

    Directory of Open Access Journals (Sweden)

    Hayrunnisa Çavuşoğlu

    2013-04-01

    The wet flue gas desulfurization process is widely used for the treatment of exhaust gases in power stations. Due to its high level of effectiveness compared with other available processes, it has also been the method most preferred by industry. Its high SO2 removal efficiency, the wide applicability of absorption chemicals, and the ease of handling the chemical process, which does not require comprehensive knowledge, are among the main advantages of this process. In this article, various wet flue gas desulfurization processes, such as lime/limestone, are discussed.

  20. Guideline Implementation: Processing Flexible Endoscopes.

    Science.gov (United States)

    Bashaw, Marie A

    2016-09-01

    The updated AORN "Guideline for processing flexible endoscopes" provides guidance to perioperative, endoscopy, and sterile processing personnel for processing all types of reusable flexible endoscopes and accessories in all procedural settings. This article focuses on key points of the guideline to help perioperative personnel safely and effectively process flexible endoscopes to prevent infection transmission. The key points address verification of manual cleaning, mechanical cleaning and processing, storage in a drying cabinet, determination of maximum storage time before reprocessing is needed, and considerations for implementing a microbiologic surveillance program. Perioperative RNs should review the complete guideline for additional information and for guidance when writing and updating policies and procedures.

  1. IWTU Process Sample Analysis Report

    Energy Technology Data Exchange (ETDEWEB)

    Nick Soelberg

    2013-04-01

    CH2M-WG Idaho (CWI) requested that Battelle Energy Alliance (BEA) analyze various samples collected during June – August 2012 at the Integrated Waste Treatment Unit (IWTU). Samples of IWTU process materials were collected from various locations in the process. None of these samples were radioactive. These samples were collected and analyzed to provide more understanding of the compositions of various materials in the process at the time of the process shutdown that occurred on June 16, 2012, while the IWTU was undergoing nonradioactive startup.

  2. Business process modeling in healthcare.

    Science.gov (United States)

    Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd

    2012-01-01

    The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making focused on achieving their objective of providing quality medical assistance. In this chapter the application of business process modelling, using the Business Process Modelling Notation (BPMN) standard, is described. The main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in healthcare processes, the need for interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare.

  3. Sustainable Process Synthesis-Intensification

    DEFF Research Database (Denmark)

    Babi, Deenesh Kavi; Holtbruegge, Johannes; Lutze, Philip

    2014-01-01

    Sustainable process design can be achieved by performing process synthesis and process intensification together. This approach first defines a design target through a sustainability analysis and then finds design alternatives that match the target through process intensification. A systematic, multi-stage framework for process synthesis-intensification that identifies more sustainable process designs has been developed. At stages 1-2, the working scale is at the level of unit operations, where a base case design is identified and analyzed with respect to sustainability metrics. At stages 3 ... concepts and the framework are presented together with the results from a case study highlighting the application of the framework to the sustainable design of a production process for dimethyl carbonate.

  4. Machine intelligence and signal processing

    CERN Document Server

    Vatsa, Mayank; Majumdar, Angshul; Kumar, Ajay

    2016-01-01

    This book comprises chapters on key problems in the machine learning and signal processing arenas. The contents of the book are a result of the 2014 Workshop on Machine Intelligence and Signal Processing held at the Indraprastha Institute of Information Technology. Traditionally, signal processing and machine learning were considered to be separate areas of research. However, in recent times the two communities are getting closer. In a very abstract fashion, signal processing is the study of operator design. The contribution of signal processing has been to devise operators for restoration, compression, etc. Applied mathematicians were more interested in operator analysis. Nowadays signal processing research is gravitating towards operator learning: instead of designing operators based on heuristics (for example wavelets), the trend is to learn these operators (for example dictionary learning). Thus, the gap between signal processing and machine learning is fast closing. The 2014 Workshop on Machine Intel...

  5. Multivariate supOU processes

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Stelzer, Robert

    Univariate superpositions of Ornstein-Uhlenbeck (OU) type processes, called supOU processes, provide a class of continuous time processes capable of exhibiting long memory behaviour. This paper introduces multivariate supOU processes and gives conditions for their existence and finiteness of moments. Moreover, the second order moment structure is explicitly calculated, and examples exhibit the possibility of long range dependence. Our supOU processes are defined via homogeneous and factorisable Lévy bases. We show that the behaviour of supOU processes is particularly nice when the mean reversion parameter is restricted to normal matrices and especially to strictly negative definite ones. For finite variation Lévy bases we are able to give conditions for supOU processes to have locally bounded càdlàg paths of finite variation and to show an analogue of the stochastic differential equation...

  6. Taking the Copenhagen Process apart

    DEFF Research Database (Denmark)

    Cort, Pia

    The aim of this thesis is to analyse the EU vocational education and training policy process (the Copenhagen Process) from a critical perspective based on the policy analysis methodology “What’s the Problem Represented to Be?” (WPR) developed by Professor Carol Bacchi. The main research question, “How can the European vocational education and training policy process - the Copenhagen Process - be understood from a WPR perspective?”, is addressed in six articles which take apart the Copenhagen Process and deal with specific WPR questions and specific aspects of the Copenhagen Process: the construction of vocational education and training; changes in governmentality; the genealogy of EC vocational education and training policy; the technologies of Europeanization; and finally the discursive and institutional effects of the policy process in the Danish context. The thesis argues...

  7. Contact Process with Exogenous Infection and the Scaled SIS Process

    CERN Document Server

    Zhang, June

    2015-01-01

    Propagation of contagion in networks depends on the graph topology. This paper is concerned with studying the time-asymptotic behavior of the extended contact processes on static, undirected, finite-size networks. This is a contact process with nonzero exogenous infection rate (also known as the ε-SIS, ε susceptible-infected-susceptible, model [1]). The only known analytical characterization of the equilibrium distribution of this process is for complete networks. For large networks with arbitrary topology, it is infeasible to numerically solve for the equilibrium distribution since it requires solving the eigenvalue-eigenvector problem of a matrix that is exponential in N, the size of the network. We show that, for a certain range of the network process parameters, the equilibrium distribution of the extended contact process on arbitrary, finite-size networks is well approximated by the equilibrium distribution of the scaled SIS process, which we derived in closed form in prior work. We co...

  8. PROPOSAL OF SPATIAL OPTIMIZATION OF PRODUCTION PROCESS IN PROCESS DESIGNER

    Directory of Open Access Journals (Sweden)

    Peter Malega

    2015-03-01

    This contribution is focused on optimizing the use of space in the production process using the software Process Designer. The aim of this contribution is to suggest possible improvements to the existing layout of the selected production process. The production process was analysed in terms of inputs, outputs and course of actions. Nowadays there are many software solutions aimed at optimizing the use of space. One of these software products is Process Designer, which belongs to the product line Tecnomatix. This software is primarily aimed at production planning. With Process Designer it is possible to design the layout of production and subsequently to analyse the production or to change it according to the current needs of the company.

  9. On some applications of diffusion processes for image processing

    Energy Technology Data Exchange (ETDEWEB)

    Morfu, S., E-mail: smorfu@u-bourgogne.f [Laboratoire d' Electronique, Informatique et Image (LE2i), UMR Cnrs 5158, Aile des Sciences de l' Ingenieur, BP 47870, 21078 Dijon Cedex (France)

    2009-06-29

    We propose a new algorithm inspired by the properties of diffusion processes for image filtering. We show that a purely nonlinear diffusion process ruled by the Fisher equation allows contrast enhancement and noise filtering, but yields a blurry image. By contrast, anisotropic diffusion, described by the Perona-Malik algorithm, allows noise filtering and preserves the edges. We show that combining the properties of anisotropic diffusion with those of nonlinear diffusion provides a better processing tool which enables noise filtering, contrast enhancement and edge preservation.
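
    As a hedged illustration of the anisotropic diffusion mentioned above (a generic Perona-Malik sketch, not the authors' combined algorithm), one explicit diffusion step with an edge-stopping conductance can be written as follows; kappa, dt and the iteration count are arbitrary choices.

      # Sketch of Perona-Malik anisotropic diffusion: smooths noise while
      # largely preserving edges via the conductance g(d) = exp(-(d/kappa)^2).
      import numpy as np

      def perona_malik(img, n_iter=20, kappa=20.0, dt=0.2):
          u = img.astype(float).copy()
          g = lambda d: np.exp(-(d / kappa) ** 2)
          for _ in range(n_iter):
              # finite differences towards the four neighbours
              dN = np.roll(u, -1, axis=0) - u
              dS = np.roll(u, 1, axis=0) - u
              dE = np.roll(u, -1, axis=1) - u
              dW = np.roll(u, 1, axis=1) - u
              u += dt * (g(dN) * dN + g(dS) * dS + g(dE) * dE + g(dW) * dW)
          return u

      noisy = 100.0 + np.random.default_rng(0).normal(0.0, 10.0, (64, 64))
      smoothed = perona_malik(noisy)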

  10. Process intensification technologies for biodiesel production reactive separation processes

    CERN Document Server

    Kiss, A A

    2014-01-01

    This book is among the first to address the novel process intensification technologies for biodiesel production, in particular the integrated reactive separations. It provides a comprehensive overview illustrated with many industrially relevant examples of novel reactive separation processes used in the production of biodiesel (e.g. fatty acid alkyl esters): reactive distillation, reactive absorption, reactive extraction, membrane reactors, and centrifugal contact separators. Readers will also learn about the working principles, design and control of integrated processes, while also getting a

  11. Developing engineering processes through integrated modelling of product and process

    DEFF Research Database (Denmark)

    Nielsen, Jeppe Bjerrum; Hvam, Lars

    2012-01-01

    This article aims at developing an operational tool for integrated modelling of product assortments and engineering processes in companies making customer specific products. Integrating a product model in the design of engineering processes will provide a deeper understanding of the engineering activities as well as insight into how product features affect the engineering processes. The article suggests possible ways of integrating models of products with models of engineering processes. The models have been tested and further developed in an action research study carried out in collaboration...

  12. Rapid thermal processing and beyond applications in semiconductor processing

    CERN Document Server

    Lerch, W

    2008-01-01

    Heat-treatment and thermal annealing are very common processing steps which have been employed during semiconductor manufacturing right from the beginning of integrated circuit technology. In order to minimize undesired diffusion, and other thermal budget-dependent effects, the trend has been to reduce the annealing time sharply by switching from standard furnace batch-processing (involving several hours or even days), to rapid thermal processing involving soaking times of just a few seconds. This transition from thermal equilibrium, to highly non-equilibrium, processing was very challenging a

  13. Process-in-Network: A Comprehensive Network Processing Approach

    Directory of Open Access Journals (Sweden)

    Juan Carlos Lopez

    2012-06-01

    A solid and versatile communications platform is very important in modern Ambient Intelligence (AmI) applications, which usually require the transmission of large amounts of multimedia information over a highly heterogeneous network. This article focuses on the concept of Process-in-Network (PIN), which is defined as the possibility that the network processes information as it is being transmitted, and introduces a more comprehensive approach than current network processing technologies. PIN can take advantage of waiting times in queues of routers, idle processing capacity in intermediate nodes, and the information that passes through the network.

  14. Process-in-Network: a comprehensive network processing approach.

    Science.gov (United States)

    Urzaiz, Gabriel; Villa, David; Villanueva, Felix; Lopez, Juan Carlos

    2012-01-01

    A solid and versatile communications platform is very important in modern Ambient Intelligence (AmI) applications, which usually require the transmission of large amounts of multimedia information over a highly heterogeneous network. This article focuses on the concept of Process-in-Network (PIN), which is defined as the possibility that the network processes information as it is being transmitted, and introduces a more comprehensive approach than current network processing technologies. PIN can take advantage of waiting times in queues of routers, idle processing capacity in intermediate nodes, and the information that passes through the network.

  15. Digital signal processing techniques and applications in radar image processing

    CERN Document Server

    Wang, Bu-Chin

    2008-01-01

    A self-contained approach to DSP techniques and applications in radar imagingThe processing of radar images, in general, consists of three major fields: Digital Signal Processing (DSP); antenna and radar operation; and algorithms used to process the radar images. This book brings together material from these different areas to allow readers to gain a thorough understanding of how radar images are processed.The book is divided into three main parts and covers:* DSP principles and signal characteristics in both analog and digital domains, advanced signal sampling, and

  16. Integrated Process Design and Control of Reactive Distillation Processes

    DEFF Research Database (Denmark)

    Mansouri, Seyed Soheil; Sales-Cruz, Mauricio; Huusom, Jakob Kjøbsted;

    2015-01-01

    In this work, integrated process design and control of reactive distillation processes is presented. Simple graphical design methods that are similar in concept to those for non-reactive distillation processes are used, such as the reactive McCabe-Thiele method and the driving force approach. The methods are based ... of this approach, it is shown that designing the reactive distillation process at the maximum driving force results in an optimal design in terms of controllability and operability. It is verified that the reactive distillation design option is less sensitive to disturbances in the feed at the highest driving...
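
    For orientation only (the record above is truncated and gives no formulas): in the textbook form of the driving force approach for a binary mixture with constant relative volatility $\alpha$, the driving force and its maximizer are commonly written as

      $$F_D(x) \;=\; \frac{\alpha x}{1+(\alpha-1)x} - x, \qquad x^{*} \;=\; \frac{1}{1+\sqrt{\alpha}},$$

    so that designing and controlling the column around $x^{*}$ corresponds to operating at the maximum driving force referred to above. This notation is assumed here, not quoted from the paper.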

  17. Secure Inter-Process Communication

    Directory of Open Access Journals (Sweden)

    Valentin Radulescu

    2013-12-01

    This article addresses the need in modern distributed systems to authenticate a process running in the system and to provide a secure channel for inter-process communication in which both the client authenticates to the server and the server authenticates to the client. The distributed system considered is a client-server system based on the Enea LINX inter-process communication framework. Enea LINX is a Linux open source project which allows processes to exchange information over several media channels: shared memory (local process communication), Ethernet (local network inter-process communication) and TCP/IP (inter-process communication over the Internet), with nodes communicating regardless of the underlying media. Because Enea LINX offers no security mechanism, the need arises to secure communication over the LINX protocol. Process authentication removes the need for personal authentication of the user and also prevents an attacker from starting a process which would harm the entire system. Besides authentication, using public key combined with symmetric key technologies, the secure inter-process communication system must provide integrity and confidentiality.
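
    Enea LINX APIs are not reproduced here; purely as an illustration of the mutual (two-way) authentication the article calls for, the sketch below runs an HMAC challenge-response in both directions over an arbitrary message transport. The pre-shared key, names and transport are all assumptions made for the example.

      # Generic sketch, NOT Enea LINX code: each side only trusts the other
      # after a fresh nonce it generated has been answered with a valid HMAC.
      import hmac, hashlib, os

      SHARED_KEY = b"pre-provisioned-process-key"   # hypothetical provisioning

      def prove(challenge: bytes, key: bytes = SHARED_KEY) -> bytes:
          return hmac.new(key, challenge, hashlib.sha256).digest()

      def verify(challenge: bytes, response: bytes, key: bytes = SHARED_KEY) -> bool:
          return hmac.compare_digest(prove(challenge, key), response)

      server_nonce, client_nonce = os.urandom(16), os.urandom(16)
      client_ok = verify(server_nonce, prove(server_nonce))   # server checks client
      server_ok = verify(client_nonce, prove(client_nonce))   # client checks server
      assert client_ok and server_ok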

  18. Stochastic processes in cell biology

    CERN Document Server

    Bressloff, Paul C

    2014-01-01

    This book develops the theory of continuous and discrete stochastic processes within the context of cell biology.  A wide range of biological topics are covered including normal and anomalous diffusion in complex cellular environments, stochastic ion channels and excitable systems, stochastic calcium signaling, molecular motors, intracellular transport, signal transduction, bacterial chemotaxis, robustness in gene networks, genetic switches and oscillators, cell polarization, polymerization, cellular length control, and branching processes. The book also provides a pedagogical introduction to the theory of stochastic process – Fokker Planck equations, stochastic differential equations, master equations and jump Markov processes, diffusion approximations and the system size expansion, first passage time problems, stochastic hybrid systems, reaction-diffusion equations, exclusion processes, WKB methods, martingales and branching processes, stochastic calculus, and numerical methods.   This text is primarily...

  19. Oxidase-based biocatalytic processes

    DEFF Research Database (Denmark)

    Ramesh, Hemalata; Woodley, John; Krühne, Ulrich

    Biocatalytic processes are gaining significant focus in frontiers where they offer unique advantages (selectivity and mild operating conditions) over chemical catalysts. It is therefore not surprising that there have been many industrial biocatalytic processes implemented. Despite past successes, the implementation of a new biocatalytic process still presents some challenges (demands placed on the biocatalyst) in terms of the requirements to make a viable industrial process. In order for a biocatalytic process to be economically successful, it is necessary that a certain set of target metrics (product titre ... solvent-resistant oxygen sensors as supporting technology for oxidase-based reactions, using a glucose oxidase reaction system as an example. Implementation of biocatalytic oxidation at scale still requires process knowledge, which includes the limitations of the system and knowledge about the potential...

  20. Fourier analysis and stochastic processes

    CERN Document Server

    Brémaud, Pierre

    2014-01-01

    This work is unique as it provides a uniform treatment of the Fourier theories of functions (Fourier transforms and series, z-transforms), finite measures (characteristic functions, convergence in distribution), and stochastic processes (including arma series and point processes). It emphasises the links between these three themes. The chapter on the Fourier theory of point processes and signals structured by point processes is a novel addition to the literature on Fourier analysis of stochastic processes. It also connects the theory with recent lines of research such as biological spike signals and ultrawide-band communications. Although the treatment is mathematically rigorous, the convivial style makes the book accessible to a large audience. In particular, it will be interesting to anyone working in electrical engineering and communications, biology (point process signals) and econometrics (arma models). A careful review of the prerequisites (integration and probability theory in the appendix, Hilbert spa...

  1. Fundamentals of semiconductor processing technology

    CERN Document Server

    El-Kareh, Badih

    1995-01-01

    The drive toward new semiconductor technologies is intricately related to market demands for cheaper, smaller, faster, and more reliable circuits with lower power consumption. The development of new processing tools and technologies is aimed at optimizing one or more of these requirements. This goal can, however, only be achieved by a concerted effort between scientists, engineers, technicians, and operators in research, development, and manufacturing. It is therefore important that experts in specific disciplines, such as device and circuit design, understand the principle, capabilities, and limitations of tools and processing technologies. It is also important that those working on specific unit processes, such as lithography or hot processes, be familiar with other unit processes used to manufacture the product. Several excellent books have been published on the subject of process technologies. These texts, however, cover subjects in too much detail, or do not cover topics important to modern techn...

  2. Time-Changed Poisson Processes

    CERN Document Server

    Kumar, A; Vellaisamy, P

    2011-01-01

    We consider time-changed Poisson processes and derive the governing difference-differential equations (DDEs) for these processes. In particular, we consider the time-changed Poisson processes where the time change is an inverse Gaussian process or its hitting time process, and discuss the governing DDEs. The stable subordinator, the inverse stable subordinator and their iterated versions are also considered as time changes. DDEs corresponding to the probability mass functions of these time-changed processes are obtained. Finally, we obtain a new governing partial differential equation for the tempered stable subordinator of index $0<\beta<1$, when $\beta$ is a rational number. We then use this result to obtain the governing DDE for the mass function of the Poisson process time-changed by the tempered stable subordinator. Our results extend and complement the results in Baeumer et al. \cite{B-M-N} and Beghin et al. \cite{BO-1} in several directions.
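
    As a rough illustration (not taken from the paper) of a time-changed Poisson process, the sketch below simulates $N(T(t))$ where $T$ is an inverse Gaussian subordinator drawn with NumPy's Wald sampler; all parameter values are arbitrary.

      # Sketch: Poisson process of rate lam observed at the random clock T(t),
      # where T has independent inverse Gaussian increments IG(mu*dt, lambda*dt^2).
      import numpy as np

      rng = np.random.default_rng(1)
      lam, mu_ig, lam_ig, dt, n_steps = 3.0, 1.0, 2.0, 0.01, 1000

      increments = rng.wald(mu_ig * dt, lam_ig * dt * dt, size=n_steps)
      T = np.cumsum(increments)                      # the subordinator on a grid

      delta_T = np.diff(T, prepend=0.0)              # increments of the random clock
      counts = np.cumsum(rng.poisson(lam * delta_T)) # N(T(t)) on the same grid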

  3. Telerobotic electronic materials processing experiment

    Science.gov (United States)

    Ollendorf, Stanford

    1991-01-01

    The Office of Commercial Programs (OCP), working in conjunction with NASA engineers at the Goddard Space Flight Center, is supporting research efforts in robot technology and microelectronics materials processing that will provide many spinoffs for science and industry. The Telerobotic Materials Processing Experiment (TRMPX) is a Shuttle-launched materials processing test payload using a Get Away Special can. The objectives of the project are to define, develop, and demonstrate an automated materials processing capability under realistic flight conditions. TRMPX will provide the capability to test the production processes that are dependent on microgravity. The processes proposed for testing include the annealing of amorphous silicon to increase grain size for more efficient solar cells, thin film deposition to demonstrate the potential of fabricating solar cells in orbit, and the annealing of radiation damaged solar cells.

  4. Dynamic analysis of process reactors

    Energy Technology Data Exchange (ETDEWEB)

    Shadle, L.J.; Lawson, L.O.; Noel, S.D.

    1995-06-01

    The approach and methodology of conducting a dynamic analysis is presented in this poster session in order to describe how this type of analysis can be used to evaluate the operation and control of process reactors. Dynamic analysis of the PyGas™ gasification process is used to illustrate the utility of this approach. PyGas™ is the gasifier being developed for the Gasification Product Improvement Facility (GPIF) by Jacobs-Siffine Engineering and Riley Stoker. In the first step of the analysis, process models are used to calculate the steady-state conditions and associated sensitivities for the process. For the PyGas™ gasifier, the process models are non-linear mechanistic models of the jetting fluidized-bed pyrolyzer and the fixed-bed gasifier. These process sensitivities are key input, in the form of gain parameters or transfer functions, to the dynamic engineering models.

  5. Guide to Computational Geometry Processing

    DEFF Research Database (Denmark)

    Bærentzen, Jakob Andreas; Gravesen, Jens; Anton, François;

    ... be processed before it is useful. This Guide to Computational Geometry Processing reviews the algorithms for processing geometric data, with a practical focus on important techniques not covered by traditional courses on computer vision and computer graphics. This is balanced with an introduction ... metric space, affine spaces, differential geometry, and finite difference methods for derivatives and differential equations. Reviews geometry representations, including polygonal meshes, splines, and subdivision surfaces. Examines techniques for computing curvature from polygonal meshes. Describes...

  6. Markov processes, semigroups and generators

    CERN Document Server

    Kolokoltsov, Vassili N

    2011-01-01

    This work offers a highly useful, well developed reference on Markov processes, the universal model for random processes and evolutions. The wide range of applications, in exact sciences as well as in other areas like social studies, require a volume that offers a refresher on fundamentals before conveying the Markov processes and examples for applications. This work does just that, and with the necessary mathematical rigor.

  7. Verification of Stochastic Process Calculi

    DEFF Research Database (Denmark)

    Skrypnyuk, Nataliya

    Stochastic process calculi represent widely accepted formalisms within Computer Science for modelling nondeterministic stochastic systems in a compositional way. Similar to process calculi in general, they are suited for modelling systems in a hierarchical manner, by explicitly specifying subsystems as well as their interdependences and communication channels. Stochastic process calculi incorporate both the quantified uncertainty on probabilities or durations of events and nondeterministic choices between several possible continuations of the system behaviour. Modelling of a system is often...

  8. BPMN Impact on Process Modeling

    OpenAIRE

    Polak, Przemyslaw

    2013-01-01

    Recent years have seen a huge rise in the popularity of BPMN in the area of business process modeling, especially among business analysts. This notation has characteristics that distinguish it significantly from previously popular process modeling notations, such as EPC. The article contains an analysis of some important characteristics of BPMN and provides the author's conclusions on the impact that the popularity and specificity of BPMN can have on the practice of process modeling. Author's obse...

  9. Introduction to digital signal processing

    CERN Document Server

    Kuc, Roman

    2008-01-01

    This book approaches digital signal processing and filter design in a novel way, by presenting the relevant theory and then having the student apply it by implementing signal processing routines on a computer. This mixture of theory and application has worked successfully. With this approach, the students receive a deeper and intuitive understanding of the theory, its applications and its limitations. This text also includes projects that require students to write computer programs to accomplish signal processing tasks.

  10. Process Analysis Via Accuracy Control

    Science.gov (United States)

    1982-02-01

    The National ... Research Program, February 1982, U.S. Department of Transportation, Maritime ... Examples are contained in Appendix C. Included are examples of how “A/C” process analysis leads to design improvement and how a change in sequence can...

  11. Thermodynamics of irreversible physicochemical processes

    Science.gov (United States)

    Bulatov, N. K.; Lundin, A. B.

    The main principles of the phenomenological thermodynamics of irreversible processes are expounded in close relation to concepts of classical phenomenological thermodynamics, and the most important thermodynamic equations of state are presented. These principles are then used in describing various physicochemical processes, including chemical transformations, structural relaxation, heat conduction, electrical conductivity, diffusion, and sedimentation in homogeneous, continuous, and discontinuous systems. Other processes discussed include filtration, electrical osmosis, heat transfer, and the mechanocaloric effect.

  12. Terrestrial photovoltaic cell process testing

    Science.gov (United States)

    Burger, D. R.

    1985-01-01

    The paper examines critical test parameters, criteria for selecting appropriate tests, and the use of statistical controls and test patterns to enhance PV-cell process test results. The coverage of critical test parameters is evaluated by examining available test methods and then screening these methods by considering the ability to measure those critical parameters which are most affected by the generic process, the cost of the test equipment and test performance, and the feasibility for process testing.

  13. Design of Industrial Quenching Processes

    Institute of Scientific and Technical Information of China (English)

    Nikolai. I. KOBASKO; George .E. TOTTEN

    2004-01-01

    A method for designing industrial quench cooling processes - in particular, the speed of conveyor movement with regard to the shape and sizes of the parts to be quenched, the thermal and physical properties of the material, and the cooling capacity of the quenchants - has been developed. The suggested design method and databases are the basis for the complete automation of industrial quench cooling processes, especially for continuous conveyor lines, with the purpose of making high-strength materials. The process is controlled by infrared technique.

  14. DIGITAL IMAGES PROCESSING IN RADIOGRAPHY

    OpenAIRE

    Pilař, Martin

    2010-01-01

    This thesis is focused primarily on digital image processing and the algorithms of modern imaging modalities. An algorithm is a method or set of instructions for solving a problem; in image processing, an algorithm covers the steps from data acquisition to the resulting image displayed on the monitor. Therefore, the first part of the thesis gives a brief overview of the principles of the imaging modalities used in radiodiagnostics. The collected data have to be analyzed and modelled in a certain way. The...

  15. Process mining using convex polytopes

    OpenAIRE

    Alemany Puig, Lluís

    2017-01-01

    Process Mining is a relatively young field of study that highlights the difficulty of inferring models of processes from which enough information can be extracted to make predictions about their behaviour, and to find bottlenecks and causality relationships so as to be able to answer as many questions about them as possible. In this context, a process may be understood as any activity performed by humans or computers, or the result of the interaction between the two. Research on this topic has...

  16. Model feedstock supply processing plants

    Directory of Open Access Journals (Sweden)

    V. M. Bautin

    2013-01-01

    A model of raw material supply for the processing enterprises belonging to a vertically integrated structure for the production and processing of dairy raw materials is developed. The model is distinguished by its orientation towards achieving a cumulative effect for the integrated structure, which acts as the criterion function; this function is maximized by optimizing capacities, the volumes of raw material deliveries and their qualitative characteristics, the costs of industrial processing of the raw materials, and the demand for dairy production.

  17. GREAT Process Modeller user manual

    OpenAIRE

    Rueda, Urko; España, Sergio; Ruiz, Marcela

    2015-01-01

    This report contains instructions to install, uninstall and use GREAT Process Modeller, a tool that supports Communication Analysis, a communication-oriented business process modelling method. GREAT allows creating communicative event diagrams (i.e. business process models), specifying message structures (which describe the messages associated to each communicative event), and automatically generating a class diagram (representing the data model of an information system that would support suc...

  18. Representation of cointegrated autoregressive processes with application to fractional processes

    DEFF Research Database (Denmark)

    Johansen, Søren

    2009-01-01

    We analyse vector autoregressive processes using the matrix-valued characteristic polynomial. The purpose of this paper is to give a survey of the mathematical results on inversion of a matrix polynomial in the case where there are unstable roots, in order to study integrated and cointegrated processes. The new...
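
    As a reminder of the standard notation (assumed here, not quoted from the paper): for a $p$-dimensional VAR($k$) process the matrix-valued characteristic polynomial is

      $$A(z) \;=\; I_p - A_1 z - A_2 z^2 - \dots - A_k z^k,$$

    and the integration and cointegration properties are read off from the roots of $\det A(z)=0$: roots strictly outside the unit circle correspond to stationary behaviour, while unit roots ($z=1$) give integrated and possibly cointegrated processes.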

  19. Translating Message Sequence Charts to other Process Languages using Process

    DEFF Research Database (Denmark)

    Lassen, Kristian Bisgaard; van Dongen, Boudewijn F.

    2008-01-01

    stakeholders. Often such discussions lead to more complete behavioral models described by e.g. Event-driven Process Chains (EPCs), UML activity diagrams, BPMN models, Petri nets, etc. Process mining on the other hand, deals with the problem of constructing complete behavioral models by analyzing event logs...

  20. Phenomena based Methodology for Process Synthesis incorporating Process Intensification

    DEFF Research Database (Denmark)

    Lutze, Philip; Babi, Deenesh Kavi; Woodley, John

    2013-01-01

    at processes at the lowest level of aggregation which is the phenomena level. In this paper, a phenomena based synthesis/design methodology incorporating process intensification is presented. Using this methodology, a systematic identification of necessary and desirable (integrated) phenomena as well...

  1. Optimality of Poisson Processes Intensity Learning with Gaussian Processes

    NARCIS (Netherlands)

    A. Kirichenko; H. van Zanten

    2015-01-01

    In this paper we provide theoretical support for the so-called "Sigmoidal Gaussian Cox Process" approach to learning the intensity of an inhomogeneous Poisson process on a d-dimensional domain. This method was proposed by Adams, Murray and MacKay (ICML, 2009), who developed a tractable computational
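
    For orientation, the sigmoidal Gaussian Cox process referred to above models a bounded intensity as a sigmoid transform of a Gaussian process; the notation below is the commonly used one and is our assumption, not a quotation from this record:

      $$\lambda(s) \;=\; \lambda^{*}\,\sigma\big(g(s)\big), \qquad \sigma(x)=\frac{1}{1+e^{-x}}, \qquad g \sim \mathcal{GP}\big(0, k(\cdot,\cdot)\big), \qquad s \in \mathcal{S} \subset \mathbb{R}^{d},$$

    where $\lambda^{*}>0$ is an upper bound on the intensity over the $d$-dimensional domain $\mathcal{S}$.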

  2. From pulsed power to processing: Plasma initiated chemical process intensification

    NARCIS (Netherlands)

    Heesch, E.J.M. van; Yan, K.; Pemen, A.J.M.; Winands, G.J.J.; Beckers, F.J.C.M.; Hoeben, W.F.L.M.

    2012-01-01

    Smart electric power for process intensification is a challenging research field that integrates power engineering, chemistry and green technology. Pulsed power technology is offering elegant solutions. This work focuses on backgrounds of matching the power source to the process. Important items are

  3. Extremes of independent stochastic processes: a point process approach

    CERN Document Server

    Dombry, Clément

    2011-01-01

    For each $n\geq 1$, let $\{X_{in},\ i \geq 1\}$ be independent copies of a nonnegative continuous stochastic process $X_{n}=(X_n(t))_{t\in T}$ indexed by a compact metric space $T$. We are interested in the process of partial maxima $$\tilde M_n(u,t) = \max\{X_{in}(t),\ 1 \leq i \leq [nu]\}, \qquad u\geq 0,\ t\in T,$$ where the brackets $[\,\cdot\,]$ denote the integer part. Under a regular variation condition on the sequence of processes $X_n$, we prove that the partial maxima process $\tilde M_n$ weakly converges to a superextremal process $\tilde M$ as $n\to\infty$. We use a point process approach based on the convergence of empirical measures. Properties of the limit process are investigated: we characterize its finite-dimensional distributions, prove that it satisfies a homogeneous Markov property, and show in some cases that it is max-stable and self-similar. Convergence of further order statistics is also considered. We illustrate our results on the class of log-normal processes in connection with some r...
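
    A small simulation sketch (our illustration, not from the paper) of the empirical partial-maxima process $\tilde M_n(u,\cdot)$ for independent copies of a crude log-normal process on a grid; every numerical choice below is arbitrary.

      import numpy as np

      rng = np.random.default_rng(0)
      n, grid = 200, np.linspace(0.0, 1.0, 50)

      # n independent copies of a rough log-normal process exp(B_t) on the grid
      dt = np.diff(grid, prepend=0.0)
      B = np.cumsum(rng.normal(0.0, np.sqrt(dt), size=(n, grid.size)), axis=1)
      X = np.exp(B)

      def partial_max(u):
          """Pointwise maximum over the first [n*u] copies, i.e. M_n(u, .)."""
          k = max(1, int(n * u))
          return X[:k].max(axis=0)

      M_half, M_one = partial_max(0.5), partial_max(1.0)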

  4. DECAB: process development of a phase change absorption process

    NARCIS (Netherlands)

    Sanchez Fernandez, E.; Goetheer, E.L.V.

    2011-01-01

    This work describes the conceptual design of a novel separation process for CO2 removal from flue gas based on precipitating solvents. The process here described (DECAB) is an enhanced CO2 absorption based on the Le Chatelier's principle, which states that reaction equilibrium can be shifted by remo

  5. Analyzing Discourse Processing Using a Simple Natural Language Processing Tool

    Science.gov (United States)

    Crossley, Scott A.; Allen, Laura K.; Kyle, Kristopher; McNamara, Danielle S.

    2014-01-01

    Natural language processing (NLP) provides a powerful approach for discourse processing researchers. However, there remains a notable degree of hesitation by some researchers to consider using NLP, at least on their own. The purpose of this article is to introduce and make available a "simple" NLP (SiNLP) tool. The overarching goal of…

  6. Industrial processing versus home processing of tomato sauce

    NARCIS (Netherlands)

    Tomas, Merve; Beekwilder, Jules; Hall, Robert D.; Sagdic, Osman; Boyacioglu, Dilek; Capanoglu, Esra

    2017-01-01

    The effects of industrial and home processing of tomato fruit into tomato sauce on in vitro gastrointestinal digestion, individual phenolic content, and antioxidant capacity were investigated. Industrial processing of tomato fruit into sauce had an overall positive effect on the total antioxidant capacity

  7. Fundamental Laser Welding Process Investigations

    DEFF Research Database (Denmark)

    Bagger, Claus; Olsen, Flemming Ove

    1998-01-01

    In a number of systematic laboratory investigations the fundamental behavior of the laser welding process was analyzed by the use of normal video (30 Hz), high speed video (100 and 400 Hz) and photo diodes. Sensors were positioned to monitor the welding process from both the top side and the rear side of the specimen. Special attention has been given to the dynamic nature of the laser welding process, especially during unstable welding conditions. In one series of experiments, the stability of the process has been varied by changing the gap distance in lap welding. In another series...

  8. Modelling of CWS combustion process

    Science.gov (United States)

    Rybenko, I. A.; Ermakova, L. A.

    2016-10-01

    The paper considers the combustion process of coal water slurry (CWS) drops. A physico-chemical process scheme consisting of several independent parallel-sequential stages is proposed. This scheme of the drop combustion process is supported by particle size distribution tests and stereomicroscopic analysis of the combustion products. The results of mathematical modelling and optimization of stationary regimes of CWS combustion are provided. During modelling, the problem of defining the possible equilibrium composition of products that can be obtained as a result of CWS combustion at different temperatures is solved.

  9. Integer-valued trawl processes

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole E.; Lunde, Asger; Shephard, Neil;

    2014-01-01

    This paper introduces a new continuous-time framework for modelling serially correlated count and integer-valued data. The key component in our new model is the class of integer-valued trawl processes, which are serially correlated, stationary, infinitely divisible processes. We analyse the proba...

  10. Design of environmentally benign processes

    DEFF Research Database (Denmark)

    Hostrup, Martin; Harper, Peter Mathias; Gani, Rafiqul

    1999-01-01

    This paper presents a hybrid method for design of environmentally benign processes. The hybrid method integrates mathematical modelling with heuristic approaches to solving the optimisation problems related to separation process synthesis and solvent design and selection. A structured method of solution, which employs thermodynamic insights to reduce the complexity and size of the mathematical problem by eliminating redundant alternatives, has been developed for the hybrid method. Separation process synthesis and design problems related to the removal of a chemical species from process streams ... mixture and the second example involves the determination of environmentally benign substitute solvents for removal of a chemical species from wastewater. (C) 1999 Elsevier Science Ltd. All rights reserved.

  11. Necessary Processing of Personal Data

    DEFF Research Database (Denmark)

    Tranberg, Charlotte Bagger

    2006-01-01

    The Data Protection Directive prohibits processing of sensitive data (racial or ethnic origin, political, religious or philosophical convictions, trade union membership and information on health and sex life). All other personal data may be processed, provided processing is deemed necessary ... Handelsgesellschaft. The aim of this article is to clarify the necessity requirement of the Data Protection Directive in terms of the general principle of proportionality. The usefulness of the principle of proportionality as the standard by which processing of personal data may be weighed is illustrated by the Peck...

  12. Graphical Language for Data Processing

    Science.gov (United States)

    Alphonso, Keith

    2011-01-01

    A graphical language for processing data allows processing elements to be connected with virtual wires that represent data flows between processing modules. The processing of complex data, such as lidar data, requires many different algorithms to be applied. The purpose of this innovation is to automate the processing of complex data, such as LIDAR, without the need for complex scripting and programming languages. The system consists of a set of user-interface components that allow the user to drag and drop various algorithmic and processing components onto a process graph. By working graphically, the user can completely visualize the process flow and create complex diagrams. This innovation supports the nesting of graphs, such that a graph can be included in another graph as a single step for processing. In addition to the user interface components, the system includes a set of .NET classes that represent the graph internally. These classes provide the internal system representation of the graphical user interface. The system includes a graph execution component that reads the internal representation of the graph (as described above) and executes that graph. The execution of the graph follows the interpreted model of execution in that each node is traversed and executed from the original internal representation. In addition, there are components that allow external code elements, such as algorithms, to be easily integrated into the system, thus making the system infinitely expandable.
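
    The sketch below is a toy illustration of the idea described above: processing nodes joined by virtual wires and executed by walking the graph. It is not the actual .NET implementation, and every class and function name in it is made up.

      # Minimal dataflow-graph interpreter: evaluate upstream nodes, then this one.
      from dataclasses import dataclass, field

      @dataclass
      class Node:
          name: str
          func: callable
          inputs: list = field(default_factory=list)   # upstream Node objects

      def execute(node, cache=None):
          cache = {} if cache is None else cache
          if node.name not in cache:
              args = [execute(up, cache) for up in node.inputs]
              cache[node.name] = node.func(*args)
          return cache[node.name]

      # example graph: load -> clip -> mean
      load = Node("load", lambda: [1.0, 2.0, 3.0, 40.0])
      clip = Node("clip", lambda xs: [x for x in xs if x < 10.0], [load])
      mean = Node("mean", lambda xs: sum(xs) / len(xs), [clip])
      print(execute(mean))   # 2.0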

  13. ITSM process assessment supporting ITIL

    CERN Document Server

    Barafort, Béatrix; Cortina, Stéphane

    2009-01-01

    The key to any successful IT Service Management solution are strong, clear processes that are fit for purpose. The continual cycle of service improvements must therefore look at the existing processes and assess how effective they are within changing business requirements.This innovative title not only looks at this fundamental process assessment, it does it using the key ISO/IEC standard in this area. In brief, this title explains the meeting between two standards:ITIL: the de facto standard in IT Service Management.ISO/IEC 15504 Information technology - Process assessmentReaders can therefor

  14. Containerless processing of amorphous ceramics

    Science.gov (United States)

    Weber, J. K. Richard; Krishnan, Shankar; Schiffman, Robert A.; Nordine, Paul C.

    1990-01-01

    The absence of gravity allows containerless processing of materials which could not otherwise be processed. High melting point, hard materials such as borides, nitrides, and refractory metals are usually brittle in their crystalline form. The absence of dislocations in amorphous materials frequently endows them with flexibility and toughness. Systematic studies of the properties of many amorphous materials have not been carried out. The requirement for their production is that they can be processed in a controlled way without container interaction. Containerless processing in microgravity could permit the control necessary to produce amorphous forms of hard materials.

  15. Consolidated Copayment Processing Center (CCPC)

    Data.gov (United States)

    Department of Veterans Affairs — The Consolidated Copayment Processing Center (CCPC) database contains Veteran patient contact and billing information in order to support the printing and mailing of...

  16. Austempered ductile iron process development

    Science.gov (United States)

    Gupta, C. D.; Keough, J. R.; Pramstaller, D. M.

    1986-11-01

    Pressure from imports and material substitution has severely affected demand for domestic iron industry products. It is estimated that the potential market for Austempered Ductile Iron (ADI) is as large as the market for carburized and/or through-hardened forgings. The primary interest in ADI is generated by the economics of the process. Improved machinability and reduced processing costs, as well as interesting physical properties, have created enormous interest in ADI throughout the metalworking industries. The development of gas-fired austempering processes and the resolution of technical and economic uncertainties concerning the process will help improve the outlook for iron foundries.

  17. PAT tools for fermentation processes

    DEFF Research Database (Denmark)

    Gernaey, Krist; Bolic, Andrijana; Svanholm, Bent

    2012-01-01

    The publication of the Process Analytical Technology (PAT) guidance has been one of the most important milestones for pharmaceutical production during the past ten years. The ideas outlined in the PAT guidance are also applied in other industries, for example the fermentation industry. Process knowledge is central in PAT projects. This manuscript therefore gives a brief overview of a number of PAT tools for collecting process knowledge on fermentation processes: on-line sensors, mechanistic models and small-scale equipment for high-throughput experimentation. The manuscript ends with a short...

  18. PAT tools for fermentation processes

    DEFF Research Database (Denmark)

    Gernaey, Krist

    The publication of the Process Analytical Technology (PAT) guidance has been one of the most important milestones for pharmaceutical production during the past ten years. The ideas outlined in the PAT guidance are also applied in other industries, for example the fermentation industry. Process knowledge is central in PAT projects. This presentation therefore gives a brief overview of a number of PAT tools for collecting process knowledge on fermentation processes: on-line sensors, where for example spectroscopic measurements are increasingly applied; and mechanistic models, which can be used...

  19. Evaluating Knowledge of Business Processes

    OpenAIRE

    Andra TURDASAN; Razvan PETRUSEL

    2016-01-01

    Any organization relies on processes/procedures in order to organize the operations. Those processes can be explicit (e.g. textual descriptions of workflow steps or graphical descriptions) or implicit (e.g. employees have learned by experience the steps needed to ‘get things done’). A widely acknowledged fact is that processes change due to internal and/or external factors. How can managers make sure the employees know the last version of the process? The current practice is to test employees...

  20. Polymer Processing and Characterization Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — The purpose is to process and evaluate polymers for use in nonlinear optical, conductive and structural Air Force applications. Primary capabilities are extrusion of...