WorldWideScience

Sample records for data-flow processing

  1. Synthesis of a parallel data stream processor from data flow process networks

    NARCIS (Netherlands)

    Zissulescu-Ianculescu, Claudiu

    2008-01-01

    In this talk, we address the problem of synthesizing Process Network specifications to FPGA execution platforms. The process networks we consider are special cases of Kahn Process Networks. We call them COMPAAN Data Flow Process Networks (CDFPN) because they are provided by a translator called the

  2. Adapting high-level language programs for parallel processing using data flow

    Science.gov (United States)

    Standley, Hilda M.

    1988-01-01

    EASY-FLOW, a very high-level data flow language, is introduced for the purpose of adapting programs written in a conventional high-level language to a parallel environment. The level of parallelism provided is of the large-grained variety in which parallel activities take place between subprograms or processes. A program written in EASY-FLOW is a set of subprogram calls as units, structured by iteration, branching, and distribution constructs. A data flow graph may be deduced from an EASY-FLOW program.
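The abstract's last claim — that a data flow graph can be deduced from a program written as a set of subprogram calls — can be sketched as follows. This is an illustrative Python sketch, not EASY-FLOW itself; the call tuples and function name are assumptions for the example.

```python
# Hypothetical sketch: deduce a data-flow graph from subprogram calls.
# Each call declares the variables it reads and writes; an edge A -> B
# means B consumes a value produced by A, so A must precede B, while
# calls with no connecting edge may execute in parallel.

def deduce_dataflow(calls):
    """calls: list of (name, reads, writes) tuples in program order."""
    last_writer = {}          # variable -> subprogram that last wrote it
    edges = set()
    for name, reads, writes in calls:
        for var in reads:
            if var in last_writer:
                edges.add((last_writer[var], name))
        for var in writes:
            last_writer[var] = name
    return edges

program = [
    ("load_a",  [],         ["a"]),
    ("load_b",  [],         ["b"]),
    ("combine", ["a", "b"], ["c"]),
]
print(sorted(deduce_dataflow(program)))
# → [('load_a', 'combine'), ('load_b', 'combine')]
```

Here `load_a` and `load_b` share no edge, so a large-grained scheduler may run them concurrently.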

  3. - GEONET - A Realization of an Automated Data Flow for Data Collecting, Processing, Storing, and Retrieving

    International Nuclear Information System (INIS)

    Friedsam, Horst; Pushor, Robert; Ruland, Robert; SLAC

    2005-01-01

    GEONET is a database system developed at the Stanford Linear Accelerator Center for the alignment of the Stanford Linear Collider. It features an automated data flow, ranging from data collection using HP110 handheld computers to processing, storing and retrieving data and finally to adjusted coordinates. This paper gives a brief introduction to the SLC project and the applied survey methods. It emphasizes the hardware and software implementation of GEONET using a network of IBM PC/XT's

  4. A realization of an automated data flow for data collecting, processing, storing and retrieving

    International Nuclear Information System (INIS)

    Friedsam, H.; Pushor, R.; Ruland, R.

    1986-11-01

    GEONET is a database system developed at the Stanford Linear Accelerator Center for the alignment of the Stanford Linear Collider. It features an automated data flow, ranging from data collection using HP110 handheld computers to processing, storing and retrieving data and finally to adjusted coordinates. This paper gives a brief introduction to the SLC project and the applied survey methods. It emphasizes the hardware and software implementation of GEONET using a network of IBM PC/XT's. 14 refs., 4 figs

  5. Semantic Complex Event Processing over End-to-End Data Flows

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Qunzhi [University of Southern California; Simmhan, Yogesh; Prasanna, Viktor K.

    2012-04-01

    Emerging Complex Event Processing (CEP) applications in cyber-physical systems like Smart Power Grids present novel challenges for end-to-end analysis over events flowing from heterogeneous information sources to persistent knowledge repositories. CEP for these applications must support two distinctive features - easy specification of patterns over diverse information streams, and integrated pattern detection over real-time and historical events. Existing work on CEP has been limited to relational query patterns, and engines that match events arriving after the query has been registered. We propose SCEPter, a semantic complex event processing framework which uniformly processes queries over continuous and archived events. SCEPter is built around an existing CEP engine with innovative support for semantic event pattern specification and allows their seamless detection over past, present and future events. Specifically, we describe a unified semantic query model that can operate over data flowing through event streams to event repositories. Compile-time and runtime semantic patterns are distinguished and addressed separately for efficiency. Query rewriting is examined and analyzed in the context of temporal boundaries that exist between event streams and their repository to avoid duplicate or missing results. The design and prototype implementation of SCEPter are analyzed using latency and throughput metrics for scenarios from the Smart Grid domain.
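The temporal-boundary idea in this abstract — answering one query over both archived and live events without duplicates or gaps — can be sketched minimally. This is an assumed illustration, not SCEPter's API; the field names and predicate are invented for the example.

```python
# Sketch (assumed names, not SCEPter's query model): evaluate one
# pattern over archived events plus a live stream, using a temporal
# boundary so each event is counted exactly once, even if it appears
# in both the repository and the stream.

def unified_query(archive, stream, predicate, boundary):
    # Archived events strictly before the boundary...
    hits = [e for e in archive if e["ts"] < boundary and predicate(e)]
    # ...and live events at or after it.
    hits += [e for e in stream if e["ts"] >= boundary and predicate(e)]
    return hits

archive = [{"ts": 1, "kW": 5}, {"ts": 2, "kW": 9}]
stream  = [{"ts": 2, "kW": 9}, {"ts": 3, "kW": 12}]  # ts=2 is in both
spike = lambda e: e["kW"] > 8
print([e["ts"] for e in unified_query(archive, stream, spike, 2)])
# → [2, 3]   (ts=2 reported once, not twice)
```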

  6. Multiverse data-flow control.

    Science.gov (United States)

    Schindler, Benjamin; Waser, Jürgen; Ribičić, Hrvoje; Fuchs, Raphael; Peikert, Ronald

    2013-06-01

    In this paper, we present a data-flow system which supports comparative analysis of time-dependent data and interactive simulation steering. The system creates data on-the-fly to allow for the exploration of different parameters and the investigation of multiple scenarios. Existing data-flow architectures provide no generic approach to handle modules that perform complex temporal processing such as particle tracing or statistical analysis over time. Moreover, there is no solution to create and manage module data, which is associated with alternative scenarios. Our solution is based on generic data-flow algorithms to automate this process, enabling elaborate data-flow procedures, such as simulation, temporal integration or data aggregation over many time steps in many worlds. To hide the complexity from the user, we extend the World Lines interaction techniques to control the novel data-flow architecture. The concept of multiple, special-purpose cursors is introduced to let users intuitively navigate through time and alternative scenarios. Users specify only what they want to see, the decision which data are required is handled automatically. The concepts are explained by taking the example of the simulation and analysis of material transport in levee-breach scenarios. To strengthen the general applicability, we demonstrate the investigation of vortices in an offline-simulated dam-break data set.

  7. Storing Data Flow Monitoring in Hadoop

    CERN Document Server

    Georgiou, Anastasia

    2013-01-01

    The on-line data flow monitoring for the CMS data acquisition system produces a large amount of data. Only 5% of data is stored permanently in a relational database due to performance issues and the cost for using dedicated infrastructure (e.g. Oracle systems). In a commercial environment, companies and organizations need to find new innovative approaches to process such big volumes of data, known as “big data”. The Big Data approach is trying to address the problem of a large and complex collection of data sets that become difficult to handle using traditional data processing applications. Using these new technologies, it should be possible to store all the monitoring information for a time window of months or a year. This report contains an initial evaluation of Hadoop for storage of data flow monitoring and subsequent data mining.

  8. Data flow manager for DART

    International Nuclear Information System (INIS)

    Berg, D.; Black, D.; Slimmer, D.; Engelfried, J.; O'Dell, V.

    1994-04-01

    The DART Data Flow Manager (dfm) integrates a buffer manager with a requester/provider model for scheduling work on buffers. Buffer lists, representing built events or other data, are queued by service requesters to service providers. Buffers may be either internal (reside on the local node), or external (located elsewhere, e.g., dual ported memory). Internal buffers are managed locally. Wherever possible, dfm moves only addresses of buffers rather than buffers themselves
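The requester/provider scheme described above — queuing buffer lists to providers while moving only buffer addresses, never the buffers themselves — can be sketched as follows. The class and method names are illustrative assumptions, not dfm's actual interface.

```python
# Sketch of the requester/provider idea (names are illustrative, not
# dfm's API): requesters queue lists of buffer *references* to a
# provider; the buffer contents stay in the local pool.
from collections import deque

class Provider:
    def __init__(self):
        self.work = deque()          # queued buffer lists

    def request(self, buffer_ids):
        # Only the ids (addresses) travel, not the buffer contents.
        self.work.append(buffer_ids)

    def serve(self, buffers):
        buffer_ids = self.work.popleft()
        return [buffers[i] for i in buffer_ids]

buffers = {0: b"event-0", 1: b"event-1", 2: b"event-2"}  # local pool
p = Provider()
p.request([2, 0])            # a built event referencing two buffers
print(p.serve(buffers))
# → [b'event-2', b'event-0']
```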

  9. VLT Data Flow System Begins Operation

    Science.gov (United States)

    1999-06-01

    conceived as a complex digital facility to explore the Universe. In order for astronomers to be able to use this marvellous research tool in the most efficient manner possible, the VLT computer software and hardware systems must guarantee a smooth flow of scientific information through the entire system. This process starts when the astronomers submit well-considered proposals for observing time and it ends with large volumes of valuable astronomical data being distributed to the international astronomical community. For this, ESO has produced an integrated collection of software and hardware, known as the VLT Data Flow System (DFS) , that manages and facilitates the flow of scientific information within the VLT Observatory. Early information about this new concept was published as ESO Press Release 12/96 and extensive tests were first carried out at ESO's 3.5-m New Technology Telescope (NTT) at La Silla, cf. ESO Press Release 03/97 [1]. The VLT DFS is a complete (end-to-end) system that guarantees the highest data quality by optimization of the observing process and repeated checks that identify and eliminate any problems. It also introduces automatic calibration of the data, i.e. the removal of external effects introduced by the atmospheric conditions at the time of the observations, as well as the momentary state of the telescope and the instruments. From Proposals to Observations In order to obtain observing time with ESO telescopes, also with the VLT, astronomers must submit a detailed observing proposal to the ESO Observing Programmes Committee (OPC) . It meets twice a year and ranks the proposals according to scientific merit. More than 1000 proposals are submitted each year, mostly by astronomers from the ESO member states and Chile; the competition is fierce and only a fraction of the total demand for observing time can be fulfilled.
During the submission of observing proposals, DFS software tools available over the World Wide Web enable the astronomers to simulate

  10. Functional language and data flow architectures

    Science.gov (United States)

    Ercegovac, M. D.; Patel, D. R.; Lang, T.

    1983-01-01

    This is a tutorial article about language and architecture approaches for highly concurrent computer systems based on the functional style of programming. The discussion concentrates on the basic aspects of functional languages, and sequencing models such as data-flow, demand-driven and reduction which are essential at the machine organization level. Several examples of highly concurrent machines are described.

  11. Data flow in LCG Data Challenge 3

    CERN Multimedia

    2005-01-01

    This map shows the real data transfer from CERN to selected nodes during the Large Hadron Collider Computer Grid (LCG) Data Challenge 3. The goal of this activity was to achieve an average data flow out of CERN of 400 Mbytes/sec, equivalent to 100 million words every second, for one week. At this rate, the complete works of Shakespeare could be sent every second.

  12. The nuclear safeguards data flow for the item facilities

    International Nuclear Information System (INIS)

    Wang Hongjun; Chen Desheng

    1994-04-01

    The constitution of the nuclear safeguards data flow for item facilities is introduced; its main content is the data flow of nuclear safeguards. If the data flow moves forward, i.e. from source data → supporting documents → accounting records → accounting reports, the system of records and reports is constituted. If the data flow moves in the reverse direction, a way to trace and inspect the quality of nuclear material accounting is constituted

  13. File-based data flow in the CMS Filter Farm

    Science.gov (United States)

    Andre, J.-M.; Andronidis, A.; Bawej, T.; Behrens, U.; Branson, J.; Chaze, O.; Cittolin, S.; Darlea, G.-L.; Deldicque, C.; Dobson, M.; Dupont, A.; Erhan, S.; Gigi, D.; Glege, F.; Gomez-Ceballos, G.; Hegeman, J.; Holzner, A.; Jimenez-Estupiñán, R.; Masetti, L.; Meijers, F.; Meschi, E.; Mommsen, R. K.; Morovic, S.; Nunez-Barranco-Fernandez, C.; O'Dell, V.; Orsini, L.; Paus, C.; Petrucci, A.; Pieri, M.; Racz, A.; Roberts, P.; Sakulin, H.; Schwick, C.; Stieger, B.; Sumorok, K.; Veverka, J.; Zaza, S.; Zejdl, P.

    2015-12-01

    During the LHC Long Shutdown 1, the CMS Data Acquisition system underwent a partial redesign to replace obsolete network equipment, use more homogeneous switching technologies, and prepare the ground for future upgrades of the detector front-ends. The software and hardware infrastructure to provide input, execute the High Level Trigger (HLT) algorithms and deal with output data transport and storage has also been redesigned to be completely file-based. This approach provides additional decoupling between the HLT algorithms and the input and output data flow. All the metadata needed for bookkeeping of the data flow and the HLT process lifetimes are also generated in the form of small “documents” using the JSON encoding, by either services in the flow of the HLT execution (for rates etc.) or watchdog processes. These “files” can remain memory-resident or be written to disk if they are to be used in another part of the system (e.g. for aggregation of output data). We discuss how this redesign improves the robustness and flexibility of the CMS DAQ and the performance of the system currently being commissioned for the LHC Run 2.
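The small JSON bookkeeping "documents" described above can be sketched minimally. The field names and file extension here are assumptions for illustration, not the actual CMS schema.

```python
# Sketch (illustrative fields, not the CMS schema): a service in the
# HLT flow emits bookkeeping metadata as a small JSON "document",
# which may stay memory-resident or be written to disk so another
# process (e.g. an aggregator) can pick it up.
import json, os, tempfile

doc = {"run": 251251, "lumisection": 12,
       "events_in": 1000, "events_accepted": 37}   # assumed fields

blob = json.dumps(doc)                  # memory-resident form

path = os.path.join(tempfile.mkdtemp(), "rates.jsn")
with open(path, "w") as f:              # on-disk form for aggregation
    f.write(blob)

with open(path) as f:
    print(json.load(f)["events_accepted"])
# → 37
```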

  14. File-Based Data Flow in the CMS Filter Farm

    Energy Technology Data Exchange (ETDEWEB)

    Andre, J.M.; et al.

    2015-12-23

    During the LHC Long Shutdown 1, the CMS Data Acquisition system underwent a partial redesign to replace obsolete network equipment, use more homogeneous switching technologies, and prepare the ground for future upgrades of the detector front-ends. The software and hardware infrastructure to provide input, execute the High Level Trigger (HLT) algorithms and deal with output data transport and storage has also been redesigned to be completely file-based. This approach provides additional decoupling between the HLT algorithms and the input and output data flow. All the metadata needed for bookkeeping of the data flow and the HLT process lifetimes are also generated in the form of small “documents” using the JSON encoding, by either services in the flow of the HLT execution (for rates etc.) or watchdog processes. These “files” can remain memory-resident or be written to disk if they are to be used in another part of the system (e.g. for aggregation of output data). We discuss how this redesign improves the robustness and flexibility of the CMS DAQ and the performance of the system currently being commissioned for the LHC Run 2.

  15. The ATLAS Data Flow System for LHC Run II

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00305920; The ATLAS collaboration

    2016-01-01

    After its first shutdown, the LHC will provide pp collisions with increased luminosity and energy. In the ATLAS experiment, the Trigger and Data Acquisition (TDAQ) system has been upgraded to deal with the increased event rates. The Data Flow (DF) element of the TDAQ is a distributed hardware and software system responsible for buffering and transporting event data from the readout system to the High Level Trigger (HLT) and to the event storage. The DF has been reshaped in order to profit from the technological progress and to maximize the flexibility and efficiency of the data selection process. The updated DF is radically different from the previous implementation both in terms of architecture and expected performance. The pre-existing two level software filtering, known as L2 and the Event Filter, and the Event Building are now merged into a single process, performing incremental data collection and analysis. This design has many advantages, among which are: the radical simplification of the architecture, ...

  16. The ATLAS Data Flow System for Run 2

    CERN Document Server

    Kazarov, Andrei; The ATLAS collaboration

    2015-01-01

    After its first shutdown, the LHC will provide pp collisions with increased luminosity and energy. In the ATLAS experiment, the Trigger and Data Acquisition (TDAQ) system has been upgraded to deal with the increased event rates. The Data Flow (DF) element of the TDAQ is a distributed hardware and software system responsible for buffering and transporting event data from the readout system to the High Level Trigger (HLT) and to the event storage. The DF has been reshaped in order to profit from the technological progress and to maximize the flexibility and efficiency of the data selection process. The updated DF is radically different from the previous implementation both in terms of architecture and expected performance. The pre-existing two level software filtering, known as L2 and the Event Filter, and the Event Building are now merged into a single process, performing incremental data collection and analysis. This design has many advantages, among which are: the radical simplification of the architecture, ...

  17. Distributed Wireless Data Acquisition System with Synchronized Data Flow

    CERN Document Server

    Astakhova, N V; Dikoussar, N D; Eremin, G I; Gerasimov, A V; Ivanov, A I; Kryukov, Yu S; Mazny, N G; Ryabchun, O V; Salamatin, I M

    2006-01-01

    New methods are devised to provide succession of computer codes under changes of the class of problems and to integrate the drivers of special-purpose devices into the application. The worked-out scheme and methods for constructing automation systems are used to elaborate a distributed wireless system intended for registration of the characteristics of pulse processes with synchronized data flow, transmitted over a radio channel. The equipment, with a sampling frequency of 20 kHz, allowed us to achieve a synchronization accuracy of up to ±50 μs. Modification of part of the equipment (sampling frequency) permits one to improve the accuracy up to 0.1 μs. The obtained results can be applied to develop systems for monitoring various objects, as well as automation systems for experiments and automated process control systems.

  18. A formal definition of data flow graph models

    Science.gov (United States)

    Kavi, Krishna M.; Buckles, Bill P.; Bhat, U. Narayan

    1986-01-01

    In this paper, a new model for parallel computations and parallel computer systems that is based on data flow principles is presented. Uninterpreted data flow graphs can be used to model computer systems including data driven and parallel processors. A data flow graph is defined to be a bipartite graph with actors and links as the two vertex classes. Actors can be considered similar to transitions in Petri nets, and links similar to places. The nondeterministic nature of uninterpreted data flow graphs necessitates the derivation of liveness conditions.
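The bipartite model in this abstract — actors as one vertex class, links as the other, with Petri-net-like firing — can be sketched directly. The dictionary encoding and the single-token firing rule are simplifying assumptions for illustration; the paper's formal definition is more general.

```python
# Sketch of a bipartite data-flow graph: actors (like Petri-net
# transitions) and links (like places) are the two vertex classes.
# An actor is enabled when every input link holds a token; firing
# consumes one token per input link and produces one per output link.

actors = {"add": (["l1", "l2"], ["l3"])}   # actor -> (inputs, outputs)
tokens = {"l1": 1, "l2": 1, "l3": 0}       # tokens per link

def enabled(actor):
    inputs, _ = actors[actor]
    return all(tokens[l] > 0 for l in inputs)

def fire(actor):
    inputs, outputs = actors[actor]
    for l in inputs:
        tokens[l] -= 1
    for l in outputs:
        tokens[l] += 1

assert enabled("add")
fire("add")
print(tokens)
# → {'l1': 0, 'l2': 0, 'l3': 1}
```

After firing, `add` is no longer enabled; liveness conditions of the kind the paper derives ask whether every actor can always eventually fire again.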

  19. Electronic device, system on chip and method for monitoring a data flow

    NARCIS (Netherlands)

    2012-01-01

    An electronic device is provided which comprises a plurality of processing units (IP1-IP6), a network-based inter-connect (N) coupled to the processing units (IP1-IP6) and at least one monitoring unit (P1, P2) for monitoring a data flow of at least one first communication path between the processing

  20. The management and realizing of image data flow in PACS

    International Nuclear Information System (INIS)

    Tao Yonghao; Miao Jingtao

    2002-01-01

    Objective: To explore the management model and realization of PACS image data flow. Methods: Based on the implementation environment and management model of PACS image data flow after the full digital reengineering of the radiology department at Shanghai First Hospital was completed, the image data flow types, procedures, and realization patterns were analyzed. Results: Two kinds of image data flow management were set up for the PACS of Shanghai First Hospital: an image archiving procedure and an image forwarding procedure. The former was implemented with a central management model, while the latter was achieved with a program that functionally acted as a workflow manager running on the central server. Conclusion: The image data flow management pattern, as a key factor for PACS, has to be designed and implemented functionally and effectively depending on the operating environment and the tasks and requirements specific to particular users

  1. The ATLAS Data Flow system for the Second LHC Run

    CERN Document Server

    Hauser, Reiner; The ATLAS collaboration

    2015-01-01

    After its first shutdown, LHC will provide pp collisions with increased luminosity and energy. In the ATLAS experiment the Trigger and Data Acquisition (TDAQ) system has been upgraded to deal with the increased event rates. The Data Flow (DF) element of the TDAQ is a distributed hardware and software system responsible for buffering and transporting event data from the Readout system to the High Level Trigger (HLT) and to the event storage. The DF has been reshaped in order to profit from the technological progress and to maximize the flexibility and efficiency of the data selection process. The updated DF is radically different from the previous implementation both in terms of architecture and expected performance. The pre-existing two level software filtering, known as L2 and the Event Filter, and the Event Building are now merged into a single process, performing incremental data collection and analysis. This design has many advantages, among which are: the radical simplification of the architecture, the f...

  2. Collecting and Storing Data Flow Monitoring in Elasticsearch

    CERN Document Server

    Hashim, Fatin Hazwani

    2014-01-01

    A very large amount of data is produced by the online data flow monitoring for the CMS data acquisition system. However, only a small portion of the data is stored permanently in the relational database, because of the high cost of relying on dedicated infrastructure as well as performance issues. A new approach needs to be found in order to confront such a big volume of data, known as “Big Data”. Big Data [1] is the term given to very large and complex data sets that cannot be handled by traditional data processing applications [2] in terms of capturing, storing, managing, and analyzing. The sheer size of the data [3] in the CMS data acquisition system is one of the major challenges, and one of the most easily recognized. New technology needs to be used as an alternative to traditional databases to handle this problem, as more data need to be stored permanently and easily retrieved. This report consists of the intro...

  3. Making Data Flow Diagrams Accessible for Visually Impaired Students Using Excel Tables

    Science.gov (United States)

    Sauter, Vicki L.

    2015-01-01

    This paper addresses the use of Excel tables to convey information to blind students that would otherwise be presented using graphical tools, such as Data Flow Diagrams. These tables can supplement diagrams in the classroom when introducing their use to understand the scope of a system and its main sub-processes, on exams when answering questions…

  4. Data-flow Analysis of Programs with Associative Arrays

    Directory of Open Access Journals (Sweden)

    David Hauzar

    2014-05-01

    Dynamic programming languages, such as PHP, JavaScript, and Python, provide built-in data structures including associative arrays and objects with similar semantics—object properties can be created at run-time and accessed via arbitrary expressions. While a high level of security and safety of applications written in these languages can be of particular importance (consider a web application storing sensitive data and providing its functionality worldwide), dynamic data structures pose significant challenges for data-flow analysis, making traditional static verification methods both unsound and imprecise. In this paper, we propose a sound and precise approach for value and points-to analysis of programs with associative-array-like data structures, upon which data-flow analyses can be built. We implemented our approach in the web-application domain—in an analyzer of PHP code.
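Why associative arrays make naive analyses unsound can be illustrated with the standard strong-vs-weak update distinction. This is a generic textbook sketch, not the paper's algorithm; the state encoding is an assumption for the example.

```python
# Sketch of the soundness issue (illustrative, not the paper's
# algorithm): abstract state maps each array key to the *set* of
# values it may hold. A write to a statically known key may strongly
# overwrite; a write to an unknown key must weakly update every key
# it might alias, keeping the old values, or the analysis is unsound.

def strong_update(state, key, value):
    state[key] = {value}               # key is known: replace outright

def weak_update_unknown_key(state, value):
    for key in state:                  # key is arbitrary at run-time
        state[key] = state[key] | {value}

state = {"name": {"alice"}, "role": {"user"}}
strong_update(state, "role", "admin")       # $a['role'] = 'admin';
weak_update_unknown_key(state, "guest")     # $a[$unknown] = 'guest';
print(sorted(state["role"]))
# → ['admin', 'guest']
```

A strong update on the unknown-key write would wrongly prove `role` can never be `'guest'`; the weak update keeps the analysis sound at the cost of precision.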

  5. Foundations of Total Functional Data-Flow Programming

    Directory of Open Access Journals (Sweden)

    Baltasar Trancón y Widemann

    2014-06-01

    The field of declarative stream programming (discrete time, clocked synchronous, modular, data-centric) is divided between the data-flow graph paradigm favored by domain experts, and the functional reactive paradigm favored by academics. In this paper, we describe the foundations of a framework for unifying functional and data-flow styles that differs from FRP proper in significant ways: it is based on set theory to match the expectations of domain experts, and the two paradigms are reduced symmetrically to a low-level middle ground, with strongly compositional semantics. The design of the framework is derived from mathematical first principles, in particular coalgebraic coinduction and a standard relational model of stateful computation. The abstract syntax and semantics introduced here constitute the full core of a novel stream programming language.

  6. CyNC - a method for Real Time Analysis of Systems with Cyclic Data Flows

    DEFF Research Database (Denmark)

    Schiøler, Henrik; Nielsen, Jens F. Dalsgaard; Larsen, Kim Guldstrand

    2005-01-01

    The paper addresses a novel method for real-time analysis of systems with cyclic data flows. The presented method is based on Network Calculus principles, where upper and lower flow and service constraints are used to bound data flows and processing resources. In acyclic systems flow constraints ma… in a space of constraint functions. In this paper a method denoted CyNC for obtaining a well-defined solution to that problem is presented, along with a theoretical justification of the method as well as comparative results for CyNC and alternative methods on a relevant example. The method is implemented in a prototype tool, also denoted CyNC, providing a graphical user interface for model specification based on the MATLAB/SimuLink framework.

  7. Fastr: a workflow engine for advanced data flows in medical image analysis

    Directory of Open Access Journals (Sweden)

    Hakim Christiaan Achterberg

    2016-08-01

    With the increasing number of datasets encountered in imaging studies, the increasing complexity of processing workflows, and a growing awareness for data stewardship, there is a need for managed, automated workflows. In this paper we introduce Fastr, an automated workflow engine with support for advanced data flows. Fastr has built-in data provenance for recording processing trails and ensuring reproducible results. The extensible plugin-based design allows the system to interface with virtually any image archive and processing infrastructure. This workflow engine is designed to consolidate quantitative imaging biomarker pipelines in order to enable easy application to new data.

  8. Built-In Data-Flow Integration Testing in Large-Scale Component-Based Systems

    Science.gov (United States)

    Piel, Éric; Gonzalez-Sanchez, Alberto; Gross, Hans-Gerhard

    Modern large-scale component-based applications and service ecosystems are built following a number of different component models and architectural styles, such as the data-flow architectural style. In this style, each building block receives data from a previous one in the flow and sends output data to other components. This organisation expresses information flows adequately, and also favours decoupling between the components, leading to easier maintenance and quicker evolution of the system. Integration testing is a major means to ensure the quality of large systems. Their size and complexity, together with the fact that they are developed and maintained by several stakeholders, make Built-In Testing (BIT) an attractive approach to manage their integration testing. However, so far no technique has been proposed that combines BIT and data-flow integration testing. We have introduced the notion of a virtual component in order to realize such a combination. It makes it possible to define the behaviour of several components assembled to process a flow of data, using BIT. Test cases are defined in a way that they are simple to write and flexible to adapt. We present two implementations of our proposed virtual component integration testing technique, and we extend our previous proposal to detect and handle errors in the definition by the user. The evaluation of the virtual component testing approach suggests that more issues can be detected in systems with data-flows than through other integration testing approaches.
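The virtual-component idea — wrapping several components assembled into one data-flow chain so a single built-in test covers the whole flow — can be sketched minimally. The class and the two stage functions are illustrative assumptions, not the authors' implementation.

```python
# Sketch of a virtual component (names are illustrative): several
# components chained in data-flow order are wrapped as one unit, so a
# single built-in test case exercises the assembled flow end to end.

class VirtualComponent:
    def __init__(self, *components):
        self.components = components          # data-flow order

    def process(self, data):
        for component in self.components:     # pipe data through chain
            data = component(data)
        return data

    def built_in_test(self, test_input, expected):
        return self.process(test_input) == expected

normalize = lambda xs: [x / max(xs) for x in xs]   # scale to [0, 1]
threshold = lambda xs: [x > 0.5 for x in xs]       # binarize

vc = VirtualComponent(normalize, threshold)
print(vc.built_in_test([2, 8, 10], [False, True, True]))
# → True
```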

  9. Scenarios for control and data flows in multiprotocol over ATM

    Science.gov (United States)

    Kujoory, Ali

    1997-10-01

    The Multiprotocol over ATM (MPOA) architecture, specified by the ATM Forum, provides for the transfer of internetwork-layer packets (Layer 3 datagrams such as IP, IPX) over ATM subnets or across emulated LANs. MPOA provides shortcuts that bypass routers to avoid router bottlenecks. It is a grand union of some of the existing standards, such as LANE by the ATM Forum, NHRP by the IETF, and Q.2931 by the ITU. The intent of this paper is to clarify the data flows between pairs of source and destination hosts in an MPOA system. It includes scenarios for both the intra- and inter-subnet flows between different pairs of MPOA end-systems. The intra-subnet flows simply use LANE for address resolution or data transfer. The inter-subnet flows may use a default path for short-lived flows or a shortcut for long-lived flows. The default path uses the LANE and router capabilities. The shortcut path uses LANE plus NHRP for ATM address resolution. An ATM virtual circuit is established before the data transfer. This allows efficient transfer of internetwork-layer packets over ATM for real-time applications.
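The default-path/shortcut decision for inter-subnet flows can be sketched as a simple flow classifier. The packet-count threshold is an assumed policy knob for illustration, not a value from the MPOA specification.

```python
# Sketch of the decision described above (threshold is an assumed
# policy, not from the MPOA spec): short-lived flows ride the routed
# default path; once a flow proves long-lived, an ATM shortcut is
# resolved (NHRP-style) and used for the rest of the flow.

SHORTCUT_THRESHOLD = 10          # packets before a shortcut is set up
flow_counts, shortcuts = {}, set()

def route(flow):
    flow_counts[flow] = flow_counts.get(flow, 0) + 1
    if flow in shortcuts:
        return "shortcut"
    if flow_counts[flow] >= SHORTCUT_THRESHOLD:
        shortcuts.add(flow)      # NHRP address resolution would go here
        return "shortcut"
    return "default"             # via LANE and routers

paths = [route(("hostA", "hostB")) for _ in range(12)]
print(paths.count("default"), paths.count("shortcut"))
# → 9 3
```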

  10. Analyzing data flows of WLCG jobs at batch job level

    Science.gov (United States)

    Kuehn, Eileen; Fischer, Max; Giffels, Manuel; Jung, Christopher; Petzold, Andreas

    2015-05-01

    With the introduction of federated data access to the workflows of WLCG, it is becoming increasingly important for data centers to understand specific data flows regarding storage element accesses, firewall configurations, as well as the scheduling of batch jobs themselves. As existing batch system monitoring and related system monitoring tools do not support measurements at batch job level, a new tool has been developed and put into operation at the GridKa Tier 1 center for monitoring continuous data streams and characteristics of WLCG jobs and pilots. Long term measurements and data collection are in progress. These measurements have already proven useful for analyzing misbehaviors and various issues. Therefore we aim for an automated, real-time approach to anomaly detection. As a requirement, prototypes for standard workflows have to be examined. Based on measurements over several months, different features of HEP jobs are evaluated regarding their effectiveness for data mining approaches to identify these common workflows. The paper will introduce the actual measurement approach and statistics, as well as the general concept and first results classifying different HEP job workflows derived from the measurements at GridKa.

  11. Network Transfer of Control Data: An Application of the NIST SMART DATA FLOW

    Directory of Open Access Journals (Sweden)

    Vincent Stanford

    2004-12-01

    Pervasive computing environments range from basic mobile point-of-sale terminal systems to rich Smart Spaces with many devices and sensors, such as lapel microphones, audio and video sensor arrays, and multiple interactive PDAs acting as electronic briefcases, providing authentication and user preference data to the environment. These systems present new challenges in distributed human-computer interfaces, such as how to best use sensor streams, distribute interfaces across multiple devices, and manage the network dynamically as users come and go and as devices are added or fail. The NIST SMART DATA FLOW system is a low-overhead, high-bandwidth transport mechanism for standardized multi-modal data streams. It is designed to allow integration of multiple sensors with the distributed processing needed for the sense-recognize-respond cycle of multi-modal user interfaces. Its core is a server/client architecture, allowing clients to produce or subscribe to data flows, and supporting steps toward scalable processing, distributing the computing requirements among many network-connected computers and pervasive devices. This article introduces the communication broker and provides an example of effective real-time sensor fusion to track a speaker with a video camera using data captured from a multi-channel microphone array.

  12. Monitoring the data flow of LHCb’s data acquisition system

    CERN Document Server

    Svantesson, David; Rainer, S

    2010-01-01

    The data acquisition system of the Large Hadron Collider beauty (LHCb) experiment needs to read out a huge amount of data. Monitoring is done for each subsystem, but no system existed to monitor the overall data flow. The aim of this work has been to design a system in which the data rates can be viewed continuously, making it possible to perform an exact consistency check after a run to ensure that no data are lost. This involves collecting and processing all the necessary data from each subsystem and integrating it into the experiment control system for display to the operators. The challenges are to communicate with and collect data from all stages of the data acquisition system, which use different techniques and data formats. The size of the system also makes it a challenge to gather all statistics in real time. The system must also be able to support partitioning. The result was a data flow monitoring system that acquires statistics from all stages of the data acquisition, processes them and displays them in the experiment control system.
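The post-run consistency check described above reduces, at its core, to comparing event counts across successive stages of the data path. A hedged Python sketch; the stage names and counts are invented, and the real LHCb system aggregates counters from many subsystems in real time:

```python
def consistency_check(stage_counts):
    """stage_counts: ordered (stage_name, events_counted) pairs along the data path.
    Returns (from_stage, to_stage, events_lost) for each boundary that lost data."""
    losses = []
    prev_stage, prev_count = stage_counts[0]
    for stage, count in stage_counts[1:]:
        if count < prev_count:
            losses.append((prev_stage, stage, prev_count - count))
        prev_stage, prev_count = stage, count
    return losses

# Hypothetical counters gathered after a run:
report = consistency_check([("readout", 1000), ("event_builder", 1000), ("storage", 997)])
print(report)  # -> [('event_builder', 'storage', 3)]
```

An empty result means every event read out was accounted for at every downstream stage.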

  13. Harnessing Data Flow and Modelling Potentials for Sustainable Development

    Directory of Open Access Journals (Sweden)

    Kassim S Mwitondi

    2012-12-01

    Tackling the global challenges relating to health, poverty, business, and the environment is heavily dependent on the flow and utilisation of data. However, while enhancements in data generation, storage, modelling, dissemination, and the related integration of global economies and societies are fast transforming the way we live and interact, the resulting dynamic, globalised information society remains digitally divided. On the African continent in particular, this division has resulted in a gap between knowledge generation and its transformation into tangible products and services. This paper proposes some fundamental approaches for a sustainable transformation of data into knowledge for the purpose of improving people's quality of life. Its main strategy is based on a generic data-sharing model providing access to data-utilising and data-generating entities in a multi-disciplinary environment. It highlights the great potential of unsupervised and supervised modelling in tackling the typically predictive-in-nature challenges we face. Using both simulated and real data, the paper demonstrates how some of the key parameters may be generated and embedded in models to enhance their predictive power and reliability. The paper's conclusions include a proposed implementation framework setting the scene for the creation of decision support systems capable of addressing the key issues in society. It is expected that a sustainable data flow will forge synergies among the private sector, academic, and research institutions within and among countries. It is also expected that the paper's findings will help in the design and development of knowledge extraction from data in the wake of cloud computing and, hence, contribute towards the improvement of people's overall quality of life. To avoid high implementation costs, selected open source tools are recommended for developing and sustaining the system.

  14. A New Automated Method and Sample Data Flow for Analysis of Volatile Nitrosamines in Human Urine*

    Science.gov (United States)

    Hodgson, James A.; Seyler, Tiffany H.; McGahee, Ernest; Arnstein, Stephen; Wang, Lanqing

    2016-01-01

    Volatile nitrosamines (VNAs) are a group of compounds classified as probable (group 2A) and possible (group 2B) carcinogens in humans. Along with certain foods and contaminated drinking water, VNAs are detected at high levels in tobacco products and in both mainstream and sidestream smoke. Our laboratory monitors six urinary VNAs—N-nitrosodimethylamine (NDMA), N-nitrosomethylethylamine (NMEA), N-nitrosodiethylamine (NDEA), N-nitrosopiperidine (NPIP), N-nitrosopyrrolidine (NPYR), and N-nitrosomorpholine (NMOR)—using isotope dilution GC-MS/MS (QQQ) for large population studies such as the National Health and Nutrition Examination Survey (NHANES). In this paper, we report for the first time a new automated sample preparation method to more efficiently quantitate these VNAs. Automation is done using Hamilton STAR™ and Caliper Staccato™ workstations. This new automated method reduces sample preparation time from 4 hours to 2.5 hours while maintaining precision (inter-run CV < 10%) and accuracy (85% - 111%). More importantly, this method increases sample throughput while maintaining a low limit of detection (<10 pg/mL) for all analytes. A streamlined sample data flow was created in parallel to the automated method, in which samples can be tracked from receipt to final LIMS output with minimal human intervention, further minimizing human error in the sample preparation process. This new automated method and sample data flow are currently applied in bio-monitoring of VNAs in the US non-institutionalized population in the NHANES 2013-2014 cycle. PMID:26949569

  15. Reconstructing Data Flow Diagrams from Structure Charts Based on the Input and Output Relationship

    OpenAIRE

    YAMAMOTO, Shuichiro

    1995-01-01

    The traceability of data flow diagrams against structure charts is very important for large software development. Determining whether there is a relationship between a data flow diagram and a structure chart is a time-consuming task. Existing CASE tools provide a way to maintain traceability. If we can extract the input-output relationship of a system from a structure chart, the corresponding data flow diagram can be automatically generated from that relationship. For example, Benedusi et al. propos...
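The core idea, generating flows from the input-output relationship, can be illustrated with a small sketch: if module A outputs a datum that module B inputs, a data flow runs from A to B. The module and datum names below are invented for illustration; this is not Yamamoto's actual algorithm:

```python
def derive_flows(modules):
    """modules: {module_name: {"in": set_of_data, "out": set_of_data}}.
    Emit a flow (producer, datum, consumer) whenever one module's output
    matches another module's input."""
    flows = set()
    for a, io_a in modules.items():
        for b, io_b in modules.items():
            if a != b:
                for datum in io_a["out"] & io_b["in"]:
                    flows.add((a, datum, b))
    return sorted(flows)

# Hypothetical structure chart, reduced to each module's inputs and outputs:
chart = {
    "read_order":    {"in": set(),            "out": {"order"}},
    "price_order":   {"in": {"order"},        "out": {"priced_order"}},
    "print_invoice": {"in": {"priced_order"}, "out": set()},
}
flows = derive_flows(chart)
print(flows)
```

The resulting triples are exactly the labelled arcs of the corresponding data flow diagram.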

  16. Run-Time HW/SW Scheduling of Data Flow Applications on Reconfigurable Architectures

    Directory of Open Access Journals (Sweden)

    Ghaffari Fakhreddine

    2009-01-01

    This paper presents an efficient dynamic, run-time hardware/software scheduling approach. The scheduling heuristic consists of mapping online the different tasks of a highly dynamic application in such a way that the total execution time is minimized. We consider soft real-time, data flow graph oriented applications whose execution time is a function of the nature of the input data. The target architecture is composed of two processors connected to a dynamically reconfigurable hardware accelerator. Our approach takes advantage of the reconfiguration property of the considered architecture to adapt the processing to the system dynamics. We compare our heuristic with another similar approach. We present the results of our scheduling method on several image processing applications. Our experiments include simulation and synthesis results on a Virtex V-based platform. These results show better performance than existing methods.
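The essence of such an online mapping, sending each arriving task to whichever resource will finish it earliest, can be sketched greedily. The resource names, costs and greedy rule below are illustrative assumptions, not the authors' exact heuristic:

```python
def schedule_online(tasks, cost):
    """Greedy run-time mapping: each arriving task goes to the resource with the
    earliest predicted finish time. cost[task][resource] is the execution time."""
    ready = {"cpu0": 0.0, "cpu1": 0.0, "fpga": 0.0}  # time each resource frees up
    mapping = {}
    for task in tasks:
        best = min(ready, key=lambda r: ready[r] + cost[task][r])
        ready[best] += cost[task][best]
        mapping[task] = best
    return mapping, max(ready.values())

# Hypothetical per-resource execution times for three tasks:
cost = {
    "A": {"cpu0": 2, "cpu1": 2, "fpga": 1},
    "B": {"cpu0": 3, "cpu1": 3, "fpga": 1},
    "C": {"cpu0": 2, "cpu1": 2, "fpga": 5},
}
mapping, makespan = schedule_online(["A", "B", "C"], cost)
print(mapping, makespan)  # A and B map to the FPGA, C to cpu0; makespan 2.0
```

A real run-time scheduler would also account for reconfiguration overhead when a new task displaces the accelerator's current configuration.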

  17. A Fault Detection Mechanism in a Data-flow Scheduled Multithreaded Processor

    NARCIS (Netherlands)

    Fu, J.; Yang, Q.; Poss, R.; Jesshope, C.R.; Zhang, C.

    2014-01-01

    This paper designs and implements Redundant Multi-Threading (RMT) in a Data-flow scheduled MultiThreaded (DMT) multicore processor, called Data-flow scheduled Redundant Multi-Threading (DRMT). It also presents Asynchronous Output Comparison (AOC) for RMT techniques to avoid fault detection...

  18. CyNC: A method for real time analysis of systems with cyclic data flows

    DEFF Research Database (Denmark)

    Jessen, Jan Jacob; Schiøler, Henrik; Nielsen, Jens Frederik Dalsgaard

    2006-01-01

    The paper addresses a novel method for performance analysis of distributed real-time systems with complex, and especially cyclic, data flow graphs. The presented method is based on Network Calculus principles, where flow and service constraint functions are used to bound data flows and processing resources, with the constraints implicitly given by a fixed-point equation in a space of constraint functions. A method denoted CyNC for obtaining a well-defined solution to that problem is presented, along with a theoretical justification of the method as well as comparative results for CyNC and alternative methods on a relevant example. The method is implemented in a prototype tool, also denoted CyNC, providing a graphical user interface for model specification based on the MATLAB/SimuLink framework.
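The fixed-point formulation can be illustrated with a scalar toy example: a cyclic dependency means the sought quantity satisfies x = T(x), which plain iteration solves when T is a contraction. All numbers below are invented for illustration; CyNC itself iterates over whole constraint functions, not scalars:

```python
def fixpoint(transform, x0, tol=1e-9, max_iter=10_000):
    """Iterate x <- transform(x) until convergence; the limit solves x = transform(x)."""
    x = x0
    for _ in range(max_iter):
        nxt = transform(x)
        if abs(nxt - x) <= tol:
            return nxt
        x = nxt
    raise RuntimeError("no fixed point reached within max_iter")

# Toy cyclic constraint: a flow's output burstiness b feeds back into the
# latency it experiences, so b must satisfy b = b0 + r * latency(b).
b0, r = 2.0, 0.5
latency = lambda b: 0.1 * b + 1.0          # latency grows with burstiness
b = fixpoint(lambda b: b0 + r * latency(b), 0.0)
print(round(b, 4))  # converges to 2.5 / 0.95
```

Since the transform here has slope 0.05, it is a contraction and the iteration converges quickly; the same structure underlies fixed-point solution in the function space case.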

  19. Secure Data Flow in a Calculus for Context Awareness

    DEFF Research Database (Denmark)

    Bucur, Doina; Nielsen, Mogens

    2008-01-01

    We present a Mobile-Ambients-based process calculus to describe context-aware computing in an infrastructure-based Ubiquitous Computing setting. In our calculus, computing agents can provide and discover contextual information and are owners of security policies. Simple access control to contextual...

  20. The web as platform: Data flows in social media

    NARCIS (Netherlands)

    Helmond, A.

    2015-01-01

    This dissertation looks into the history of Web 2.0 as "the web as platform" (O’Reilly 2004) and traces the transition of social network sites into social media platforms to examine how social media has transformed the web. In order to understand this process from an infrastructural perspective, I

  1. Data Flow for the TERRA-REF project

    Science.gov (United States)

    Kooper, R.; Burnette, M.; Maloney, J.; LeBauer, D.

    2017-12-01

    The Transportation Energy Resources from Renewable Agriculture Phenotyping Reference Platform (TERRA-REF) program aims to identify crop traits that are best suited to producing high-energy sustainable biofuels and match those plant characteristics to their genes to speed the plant breeding process. One tool used to achieve this goal is a high-throughput phenotyping robot outfitted with sensors and cameras to monitor the growth of 1.25 acres of sorghum. Data types range from hyperspectral imaging to 3D reconstructions and thermal profiles, all at 1mm resolution. This system produces thousands of daily measurements with high spatiotemporal resolution. The team at NCSA processes, annotates, organizes and stores the massive amounts of data produced by this system - up to 5 TB per day. Data from the sensors is streamed to a local gantry-cache server. The standardized sensor raw data stream is automatically and securely delivered to NCSA using Globus Connect service. Once files have been successfully received by the Globus endpoint, the files are removed from the gantry-cache server. As each dataset arrives or is created the Clowder system automatically triggers different software tools to analyze each file, extract information, and convert files to a common format. Other tools can be triggered to run after all required data is uploaded. For example, a stitched image of the entire field is created after all images of the field become available. Some of these tools were developed by external collaborators based on predictive models and algorithms, others were developed as part of other projects and could be leveraged by the TERRA project. Data will be stored for the lifetime of the project and is estimated to reach 10 PB over 3 years. The Clowder system, BETY and other systems will allow users to easily find data by browsing or searching the extracted information.
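The trigger logic described, tools firing automatically once the files they need have arrived, can be sketched as a small event-driven loop. File and extractor names below are hypothetical; the production system uses Clowder and Globus rather than this in-memory sketch:

```python
class Pipeline:
    """Event-driven sketch: extractors fire once all the files they require exist."""
    def __init__(self):
        self.files = set()
        self.rules = []            # [required_files, action, already_fired]
        self.log = []

    def register(self, required, action):
        self.rules.append([set(required), action, False])

    def upload(self, filename):
        # Each arriving file may satisfy one or more pending extractors.
        self.files.add(filename)
        for rule in self.rules:
            required, action, fired = rule
            if not fired and required <= self.files:
                rule[2] = True
                self.log.append(action(self.files))

pipe = Pipeline()
pipe.register({"img_1.tif"}, lambda fs: "extract_metadata")
pipe.register({"img_1.tif", "img_2.tif"}, lambda fs: "stitch_field")
pipe.upload("img_1.tif")   # triggers metadata extraction only
pipe.upload("img_2.tif")   # now the whole-field stitcher can run
print(pipe.log)  # -> ['extract_metadata', 'stitch_field']
```

The "stitch after all field images arrive" rule mirrors the paper's example of producing the stitched field image only once every image is available.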

  2. ATLAS DataFlow Infrastructure: Recent results from ATLAS cosmic and first-beam data-taking

    Energy Technology Data Exchange (ETDEWEB)

    Vandelli, Wainer, E-mail: wainer.vandelli@cern.c

    2010-04-01

    The ATLAS DataFlow infrastructure is responsible for the collection and conveyance of event data from the detector front-end electronics to mass storage. Several optimized, multi-threaded applications fulfill this purpose, operating over the multi-stage Gigabit Ethernet network that is the backbone of the ATLAS Trigger and Data Acquisition System. The system must be able to transport event data efficiently and with high reliability, while providing aggregated bandwidths larger than 5 GByte/s and coping with many thousands of network connections. Routing and streaming capabilities, as well as monitoring and data-accounting functionalities, are also fundamental requirements. During 2008, a few months of ATLAS cosmic data-taking and the first experience with the LHC beams provided an unprecedented test-bed for evaluating the performance of the ATLAS DataFlow in terms of functionality, robustness and stability. In addition, operating the system far from its design specifications helped in exercising its flexibility and contributed to understanding its limitations. The integration with the detector and the interfacing with the offline data processing and management were also able to take advantage of this extended data-taking period. In this paper we report on the usage of the DataFlow infrastructure during ATLAS data-taking. These results, backed up by complementary performance tests, validate the architecture of the ATLAS DataFlow and prove that the system is robust, flexible and scalable enough to cope with the final requirements of the ATLAS experiment.

  3. ATLAS DataFlow Infrastructure: Recent results from ATLAS cosmic and first-beam data-taking

    International Nuclear Information System (INIS)

    Vandelli, Wainer

    2010-01-01

    The ATLAS DataFlow infrastructure is responsible for the collection and conveyance of event data from the detector front-end electronics to mass storage. Several optimized, multi-threaded applications fulfill this purpose, operating over the multi-stage Gigabit Ethernet network that is the backbone of the ATLAS Trigger and Data Acquisition System. The system must be able to transport event data efficiently and with high reliability, while providing aggregated bandwidths larger than 5 GByte/s and coping with many thousands of network connections. Routing and streaming capabilities, as well as monitoring and data-accounting functionalities, are also fundamental requirements. During 2008, a few months of ATLAS cosmic data-taking and the first experience with the LHC beams provided an unprecedented test-bed for evaluating the performance of the ATLAS DataFlow in terms of functionality, robustness and stability. In addition, operating the system far from its design specifications helped in exercising its flexibility and contributed to understanding its limitations. The integration with the detector and the interfacing with the offline data processing and management were also able to take advantage of this extended data-taking period. In this paper we report on the usage of the DataFlow infrastructure during ATLAS data-taking. These results, backed up by complementary performance tests, validate the architecture of the ATLAS DataFlow and prove that the system is robust, flexible and scalable enough to cope with the final requirements of the ATLAS experiment.

  4. The ATLAS Eventlndex: data flow and inclusion of other metadata

    Science.gov (United States)

    Barberis, D.; Cárdenas Zárate, S. E.; Favareto, A.; Fernandez Casani, A.; Gallas, E. J.; Garcia Montoro, C.; Gonzalez de la Hoz, S.; Hrivnac, J.; Malon, D.; Prokoshin, F.; Salt, J.; Sanchez, J.; Toebbicke, R.; Yuan, R.; ATLAS Collaboration

    2016-10-01

    The ATLAS EventIndex is the catalogue of the event-related metadata for the information collected from the ATLAS detector. The basic unit of this information is the event record, containing the event identification parameters, pointers to the files containing this event as well as trigger decision information. The main use case for the EventIndex is event picking, as well as data consistency checks for large production campaigns. The EventIndex employs the Hadoop platform for data storage and handling, as well as a messaging system for the collection of information. The information for the EventIndex is collected both at Tier-0, when the data are first produced, and from the Grid, when various types of derived data are produced. The EventIndex uses various types of auxiliary information from other ATLAS sources for data collection and processing: trigger tables from the condition metadata database (COMA), dataset information from the data catalogue AMI and the Rucio data management system and information on production jobs from the ATLAS production system. The ATLAS production system is also used for the collection of event information from the Grid jobs. EventIndex developments started in 2012 and in the middle of 2015 the system was commissioned and started collecting event metadata, as a part of ATLAS Distributed Computing operations.
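Event picking against such a catalogue reduces, at its core, to a keyed lookup from event identification parameters to file pointers. A deliberately minimal Python sketch with invented identifiers; the real EventIndex runs on Hadoop at vastly larger scale and stores richer records:

```python
# The index maps (run_number, event_number) to the record needed to locate
# and filter that event. All values below are hypothetical.
index = {}

def add_record(run, event, file_guid, trigger_bits):
    index[(run, event)] = {"file_guid": file_guid, "trigger": trigger_bits}

def pick(run, event):
    """Return the record for one event, or None if it was never catalogued."""
    return index.get((run, event))

add_record(358031, 4120997, "guid-001", 0b1011)
rec = pick(358031, 4120997)
print(rec["file_guid"])  # -> guid-001
print(pick(358031, 1))   # -> None
```

The same lookup, repeated over a production campaign's expected event list, gives the consistency check mentioned in the abstract: any (run, event) pair that returns no record is missing.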

  5. Data flows and water woes: The Utah Data Center

    Directory of Open Access Journals (Sweden)

    Mél Hogan

    2015-07-01

    Using a new materialist line of questioning that looks at the agential potentialities of water and its entanglements with Big Data and surveillance, this article explores how the recent Snowden revelations about the National Security Agency (NSA) have prompted media scholars to engage with the infrastructures that enable intercepting, hosting, and processing immeasurable amounts of data. Focusing on the expansive architecture, location, and resource dependence of the NSA's Utah Data Center, I demonstrate how surveillance and privacy can never be disconnected from the material infrastructures that allow, and render natural, the epistemological state of mass surveillance. Specifically, I explore the NSA's infrastructure and the millions of gallons of water it requires daily to cool its servers, while located in one of the driest states in the US. Complicating surveillance beyond the NSA, as already imbricated with various social media companies, this article questions the emplacement and impact of corporate data centers more generally, and the changes they are causing to the landscape and local economies. I look at how water is an intriguing and politically relevant part of the surveillance infrastructure, how it has been constructed as the main tool for activism in this case, and how it may eventually help transform the public's conceptualization of Big Data as deeply material.

  6. The use of Ethernet in the DataFlow of the ATLAS Trigger & DAQ

    CERN Document Server

    Stancu, Stefan; Dobinson, Bob; Korcyl, Krzysztof; Knezo, Emil; CHEP 2003 Computing in High Energy Physics

    2003-01-01

    The article analyzes a proposed network topology for the ATLAS DAQ DataFlow and identifies the Ethernet features required for proper operation of the network: MAC address table size, switch performance in terms of throughput and latency, and the use of Flow Control, Virtual LANs and Quality of Service. We investigate these features on several Ethernet switches and draw conclusions about their usefulness for the ATLAS DataFlow network.

  7. Improved Low Power FPGA Binding of Datapaths from Data Flow Graphs with NSGA II -based Schedule Selection

    Directory of Open Access Journals (Sweden)

    BHUVANESWARI, M. C.

    2013-11-01

    FPGAs are increasingly being used to implement datapath-intensive algorithms for signal processing and image processing applications. In high-level synthesis of data flow graphs targeted at FPGAs, the effect of interconnect resources such as multiplexers must be considered, since they contribute significantly to area and switching power. We propose a binding framework for behavioral synthesis of Data Flow Graphs (DFGs) onto FPGA targets with power reduction as the main criterion. The technique uses a multi-objective genetic algorithm, NSGA-II, for design space exploration to identify schedules that have the potential to yield low-power bindings from a population of non-dominated solutions. A greedy constructive binding technique reported in the literature is adapted for interconnect minimization. The binding is further subjected to a perturbation process by altering the register and multiplexer assignments. Results obtained on standard DFG benchmarks indicate that our technique yields better power-aware bindings than the constructive binding approach, with little or no area overhead.

  8. Jackson System Development, Entity-relationship Analysis and Data Flow Models: a comparative study

    NARCIS (Netherlands)

    Wieringa, Roelf J.

    1994-01-01

    This report compares JSD with ER modeling and data flow modeling. It is shown that JSD can be combined with ER modeling and that the result is a richer method than either of the two. The resulting method can serve as a basis for a practical object-oriented modeling method and has some resemblance to

  9. A new approach in development of data flow control and investigation system for computer networks

    International Nuclear Information System (INIS)

    Frolov, I.; Vaguine, A.; Silin, A.

    1992-01-01

    This paper describes a new approach in development of data flow control and investigation system for computer networks. This approach was developed and applied in the Moscow Radiotechnical Institute for control and investigations of Institute computer network. It allowed us to solve our network current problems successfully. Description of our approach is represented below along with the most interesting results of our work. (author)

  10. T.I.M.S: TaqMan Information Management System, tools to organize data flow in a genotyping laboratory

    Directory of Open Access Journals (Sweden)

    Albion Tim

    2005-10-01

    Abstract Background Single Nucleotide Polymorphism (SNP) genotyping is a major activity in biomedical research. The Taqman technology is one of the most commonly used approaches. It produces large amounts of data that are difficult to process by hand. Laboratories not equipped with a Laboratory Information Management System (LIMS) need tools to organize the data flow. Results We propose a package of Visual Basic programs focused on sample management and on the parsing of input and output TaqMan files. The code is written in Visual Basic, embedded in the Microsoft Office package, and it allows anyone to have access to those tools, without any programming skills and with basic computer requirements. Conclusion We have created useful tools focused on management of TaqMan genotyping data, a critical issue in genotyping laboratories without a more sophisticated and expensive system, such as a LIMS.

  11. T.I.M.S: TaqMan Information Management System, tools to organize data flow in a genotyping laboratory

    Science.gov (United States)

    Monnier, Stéphanie; Cox, David G; Albion, Tim; Canzian, Federico

    2005-01-01

    Background Single Nucleotide Polymorphism (SNP) genotyping is a major activity in biomedical research. The Taqman technology is one of the most commonly used approaches. It produces large amounts of data that are difficult to process by hand. Laboratories not equipped with a Laboratory Information Management System (LIMS) need tools to organize the data flow. Results We propose a package of Visual Basic programs focused on sample management and on the parsing of input and output TaqMan files. The code is written in Visual Basic, embedded in the Microsoft Office package, and it allows anyone to have access to those tools, without any programming skills and with basic computer requirements. Conclusion We have created useful tools focused on management of TaqMan genotyping data, a critical issue in genotyping laboratories without a more sophisticated and expensive system, such as a LIMS. PMID:16221298

  12. Perancangan Model Data Flow Diagram Untuk Mengukur Kualitas Website Menggunakan Webqual 4.0

    Directory of Open Access Journals (Sweden)

    Karina Hapsari

    2017-05-01

    With increasing competition among e-commerce companies and the ongoing development of technology companies in Indonesia, the traffic rank of the Zalora Indonesia website has been declining. Measuring website quality using WebQual 4.0 helps manage the website so that its quality can be adjusted to user perceptions. The research aims to design a Data Flow Diagram model for measuring website quality using WebQual 4.0 based on user-satisfaction variables. A case study was conducted on the Zalora Indonesia website. A Data Flow model is used to design the recommended system model, while the WebQual 4.0 method is used to measure website quality against user satisfaction. The research uses primary data in the form of questionnaires involving 384 respondents in the city of Bandung who had transacted on the Zalora Indonesia website. The data analysis technique is descriptive analysis. The results show that website quality has a simultaneous positive and significant impact on user satisfaction with the Zalora Indonesia website. t-test results showed that three variables individually have a positive impact on user satisfaction with the Zalora Indonesia website: usability quality, information quality and service interaction quality, with information quality having the largest impact. Therefore, the system modeling using the Context Diagram and Data Flow Diagram focused on the information quality variable.

  13. Understanding the ‘Intensive’ in ‘Data Intensive Research’: Data Flows in Next Generation Sequencing and Environmental Networked Sensors

    Directory of Open Access Journals (Sweden)

    Ruth McNally

    2012-03-01

    Genomic and environmental sciences represent two poles of scientific data. In the first, highly parallel sequencing facilities generate large quantities of sequence data. In the latter, loosely networked remote and field sensors produce intermittent streams of different data types. Yet both genomic and environmental sciences are said to be moving to data intensive research. This paper explores and contrasts data flow in these two domains in order to better understand how data intensive research is being done. Our case studies are next generation sequencing for genomics and environmental networked sensors. Our objective was to enrich understanding of the ‘intensive’ processes and properties of data intensive research through a ‘sociology’ of data, using methods that capture the relational properties of data flows. Our key methodological innovation was the staging of events for practitioners with different kinds of expertise in data intensive research to participate in the collective annotation of visual forms. Through such events we built a substantial digital data archive of our own that we then analysed in terms of three traits of data flow: durability, replicability and metrology. Our findings are that analysing data flow with respect to these three traits provides better insight into how doing data intensive research involves people, infrastructures, practices, things, knowledge and institutions. Collectively, these elements shape the topography of data and condition how it flows. We argue that although much attention is given to phenomena such as the scale, volume and speed of data in data intensive research, these are measures of what we call ‘extensive’ properties rather than intensive ones. Our thesis is that extensive changes, that is to say those that result in non-linear changes in metrics, can be seen to result from intensive changes that bring multiple, disparate flows into confluence. If extensive shifts in the modalities of

  14. Data flow methods for dynamic system simulation - A CSSL-IV microcomputer network interface

    Science.gov (United States)

    Makoui, A.; Karplus, W. J.

    1983-01-01

    A major problem in employing networks of microcomputers for the real-time simulation of complex systems is to allocate computational tasks to the various microcomputers in such a way that idle time and time lost in interprocess communication is minimized. The research reported in this paper is directed to the development of a software interface between a higher-level simulation language and a network of microcomputers. A CSSL-IV source program is translated to a data flow graph. This graph is then analyzed automatically so as to allocate computing tasks to the various processors.
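The allocation step, mapping nodes of the derived data flow graph onto processors, can be illustrated by levelling the graph and dealing the independent nodes of each level across processors. This is an illustrative sketch under invented node names, not the paper's CSSL-IV translator:

```python
from collections import defaultdict

def allocate(graph, n_proc):
    """graph: node -> set of nodes it depends on. Nodes in the same dependency
    level are mutually independent and are dealt round-robin to processors."""
    level = {}
    def depth(n):
        if n not in level:
            level[n] = 1 + max((depth(d) for d in graph[n]), default=0)
        return level[n]
    for n in graph:
        depth(n)
    by_level = defaultdict(list)
    for n in sorted(graph):
        by_level[level[n]].append(n)
    assignment = {}
    for lvl in sorted(by_level):
        for i, n in enumerate(by_level[lvl]):
            assignment[n] = i % n_proc   # processor index
    return assignment

# Toy data flow graph for x' = f(x, u): two independent multiplies can run in
# parallel; the add must wait for both; the integrator waits for the add.
dfg = {"mul1": set(), "mul2": set(), "add": {"mul1", "mul2"}, "integ": {"add"}}
plan = allocate(dfg, 2)
print(plan)  # -> {'mul1': 0, 'mul2': 1, 'add': 0, 'integ': 0}
```

A real allocator would also weigh interprocess communication cost, which is exactly the idle-time trade-off the abstract describes.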

  15. Perancangan Model Data Flow Diagram Untuk Mengukur Kualitas Website Menggunakan Webqual 4.0

    OpenAIRE

    Karina Hapsari

    2017-01-01

    With increasing competition among e-commerce companies and the ongoing development of technology companies in Indonesia, the traffic rank of the Zalora Indonesia website has been declining. Measuring website quality using WebQual 4.0 helps manage the website so that its quality can be adjusted to user perceptions. The research aims to design a Data Flow Diagram model for measuring website quality using WebQual 4.0 based on user-satisfaction variables. A case study was conducted on the Zalora Indonesia websit...

  16. Data flow between RFID devices in a modern restricted access administrative office

    Directory of Open Access Journals (Sweden)

    Robert Waszkowski

    2016-01-01

    The paper presents models of data flow between RFID devices in a modern restricted-access administrative office. The presented diagrams are the result of analytical work performed by a multidisciplinary team of experts composed of IT specialists, security-systems specialists and employees of the secret office. The presented models reflect the fact that the facilities in the secret office (cabinet, sluice, photocopier, desks) are equipped with RFID readers, which make it possible to immediately read the documents within their reach.

  17. Data Flow in Relation to Life-Cycle Costing of Construction Projects in the Czech Republic

    Science.gov (United States)

    Biolek, Vojtěch; Hanák, Tomáš; Marović, Ivan

    2017-10-01

    Life-cycle costing is an important part of every construction project, as it makes it possible to take into consideration future costs relating to the operation and demolition phase of a built structure. In this way, investors can optimize the project design to minimize the total project costs. Even though there have already been some attempts to implement BIM software in the Czech Republic, the current state of affairs does not support automated data flow between the bill of costs and applications that support building facility management. The main aim of this study is to critically evaluate the current situation and outline a future framework that should allow for the use of the data contained in the bill of costs to manage building operating costs.

  18. Developing a Translator from C Programs to Data Flow Graphs Using RAISE

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth

    1996-01-01

    Describes how a translator from a subset of C to data flow graphs has been formally developed using the RAISE (Rigorous Approach to Industrial Software Engineering) method and tools. In contrast to many development examples described in the literature, this development is not a case study but a real one, and it covers all development phases, including the code-generation phase. The translator is now one of the components of the LYCOS (LYngby CO-Synthesis) system, a software/hardware co-synthesis system under development at the Technical University of Denmark. The translator, together with the other components of LYCOS, provides a means for moving parts of C programs to dedicated hardware, thereby obtaining better performance. The translator was refined in steps, starting with an abstract specification and ending with a concrete specification from which C++ code was then automatically generated.

  19. A Data Flow Model to Solve the Data Distribution Changing Problem in Machine Learning

    Directory of Open Access Journals (Sweden)

    Shang Bo-Wen

    2016-01-01

    Continuous prediction is widely used in broad communities, from social domains to business, and machine learning is an important method for this problem. When we use machine learning for prediction, we use the data in the training set to fit the model and estimate the distribution of the data in the test set. But when we use machine learning for continuous prediction, we obtain new data as time goes by and use them to predict future data, and a problem arises: as the size of the data set increases over time, the distribution changes and much garbage data accumulates in the training set. We should remove the garbage data, as it reduces the accuracy of the prediction. The main contribution of this article is using the new data to detect the timeliness of historical data and remove the garbage data. We build a data flow model to describe how the data flow among the test set, training set, validation set and garbage set, and thereby improve the accuracy of prediction. As the data set changes, the best machine learning model also changes. We design a hybrid voting algorithm to fit the data set better: seven machine learning models predict the same problem, and the validation set is used to put different weights on the models, giving better models more weight. Experimental results show that, when the distribution of the data set changes over time, our data flow model can remove most of the garbage data and achieves a better result than the traditional method of adding all the data to the data set, and our hybrid voting algorithm yields a better prediction result than the average accuracy of the other prediction models.
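The hybrid voting idea, weighting each model's vote by its validation-set performance, can be sketched directly. The model names, labels and weights below are invented for illustration; the paper uses seven models rather than four:

```python
from collections import defaultdict

def weighted_vote(predictions, val_accuracy):
    """predictions: {model: predicted_label}; val_accuracy: {model: accuracy on
    the validation set}, used as the voting weight so better models count more."""
    totals = defaultdict(float)
    for model, label in predictions.items():
        totals[label] += val_accuracy[model]
    return max(totals, key=totals.get)

preds   = {"svm": "up", "rf": "down", "knn": "down", "mlp": "up"}
weights = {"svm": 0.90, "rf": 0.60, "knn": 0.55, "mlp": 0.80}
winner = weighted_vote(preds, weights)
print(winner)  # -> up  (0.90 + 0.80 outweighs 0.60 + 0.55)
```

Re-estimating the weights on a sliding validation set is what lets the ensemble track the drifting data distribution the abstract describes.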

  20. Accessible Modelling of Complexity in Health (AMoCH and associated data flows: asthma as an exemplar

    Directory of Open Access Journals (Sweden)

    Harshana Liyanage

    2016-04-01

    Full Text Available Background Modelling is an important part of information science. Models are abstractions of reality. We use models in the following contexts: (1) to describe the data and information flows in clinical practice to information scientists, (2) to compare health systems and care pathways, (3) to understand how clinical cases are recorded in record systems and (4) to model health care business models. Asthma is an important condition associated with substantial mortality and morbidity. However, there are difficulties in determining who has the condition, making both its incidence and prevalence uncertain. Objective To demonstrate an approach for modelling complexity in health using asthma prevalence and incidence as an exemplar. Method The four steps in our process are: 1. Drawing a rich picture, following Checkland’s soft systems methodology; 2. Constructing data flow diagrams (DFDs); 3. Creating Unified Modelling Language (UML) use case diagrams to describe the interaction of the key actors with the system; 4. Activity diagrams, either UML activity diagrams or business process modelling notation diagrams. Results Our rich picture flagged the complexity of factors that might impact on asthma diagnosis. There was consensus that the principal issue was that there were undiagnosed and misdiagnosed cases as well as correctly diagnosed ones. Genetic predisposition to atopy; exposure to environmental triggers; the impact of respiratory health on earnings or the ability to attend education or participate in sport; and charities, pressure groups and the pharmaceutical industry all increased the likelihood of a diagnosis of asthma. Stigma and some factors within the health system diminished the likelihood of a diagnosis. The DFDs and other elements focused on better case finding. Conclusions This approach flagged the factors that might impact on the reported prevalence or incidence of asthma. The models suggested that applying selection criteria may improve the specificity of new or confirmed diagnosis.

  1. Accessible Modelling of Complexity in Health (AMoCH) and associated data flows: asthma as an exemplar.

    Science.gov (United States)

    Liyanage, Harshana; Luzi, Daniela; De Lusignan, Simon; Pecoraro, Fabrizio; McNulty, Richard; Tamburis, Oscar; Krause, Paul; Rigby, Michael; Blair, Mitch

    2016-04-18

    Background Modelling is an important part of information science. Models are abstractions of reality. We use models in the following contexts: (1) to describe the data and information flows in clinical practice to information scientists, (2) to compare health systems and care pathways, (3) to understand how clinical cases are recorded in record systems and (4) to model health care business models. Asthma is an important condition associated with substantial mortality and morbidity. However, there are difficulties in determining who has the condition, making both its incidence and prevalence uncertain. Objective To demonstrate an approach for modelling complexity in health using asthma prevalence and incidence as an exemplar. Method The four steps in our process are: 1. Drawing a rich picture, following Checkland's soft systems methodology; 2. Constructing data flow diagrams (DFDs); 3. Creating Unified Modelling Language (UML) use case diagrams to describe the interaction of the key actors with the system; 4. Activity diagrams, either UML activity diagrams or business process modelling notation diagrams. Results Our rich picture flagged the complexity of factors that might impact on asthma diagnosis. There was consensus that the principal issue was that there were undiagnosed and misdiagnosed cases as well as correctly diagnosed ones. Genetic predisposition to atopy; exposure to environmental triggers; the impact of respiratory health on earnings or the ability to attend education or participate in sport; and charities, pressure groups and the pharmaceutical industry all increased the likelihood of a diagnosis of asthma. Stigma and some factors within the health system diminished the likelihood of a diagnosis. The DFDs and other elements focused on better case finding. Conclusions This approach flagged the factors that might impact on the reported prevalence or incidence of asthma. The models suggested that applying selection criteria may improve the specificity of new or confirmed diagnosis.

  2. ATLAS DataFlow Infrastructure recent results from ATLAS cosmic and first-beam data-taking

    CERN Document Server

    Vandelli, W

    2010-01-01

    The ATLAS DataFlow infrastructure is responsible for the collection and conveyance of event data from the detector front-end electronics to mass storage. Several optimized, multi-threaded applications fulfill this purpose, operating over a multi-stage Gigabit Ethernet network which is the backbone of the ATLAS Trigger and Data Acquisition System. The system must be able to transport event data efficiently and with high reliability, while providing aggregated bandwidths larger than 5 GByte/s and coping with many thousands of network connections. Routing and streaming capabilities, as well as monitoring and data-accounting functionalities, are also fundamental requirements. During 2008, a few months of ATLAS cosmic data-taking and the first experience with the LHC beams provided an unprecedented testbed for the evaluation of the performance of the ATLAS DataFlow in terms of functionality, robustness and stability. Besides, operating the system far from its design specifications helped in exercising its fle...

  3. Management of complex data flows in the ASDEX Upgrade plasma control system

    Energy Technology Data Exchange (ETDEWEB)

    Treutterer, Wolfgang, E-mail: Wolfgang.Treutterer@ipp.mpg.de [Max-Planck Institut fuer Plasmaphysik, EURATOM Association, Garching (Germany); Neu, Gregor; Raupp, Gerhard; Zasche, Dieter; Zehetbauer, Thomas [Max-Planck Institut fuer Plasmaphysik, EURATOM Association, Garching (Germany); Cole, Richard; Lueddecke, Klaus [Unlimited Computer Systems, Iffeldorf (Germany)

    2012-12-15

    Highlights: ► Control system architectures with data-driven workflows are efficient, flexible and maintainable. ► Signal groups provide coherence of interrelated signals and increase the efficiency of process synchronisation. ► Sample tags indicating sample quality form the fundament of a local event handling strategy. ► A self-organising workflow benefits from sample tags consisting of time stamp and stream activity. - Abstract: Establishing adequate technical and physical boundary conditions for a sustained nuclear fusion reaction is a challenging task. Phased feedback control and monitoring for heating, fuelling and magnetic shaping is mandatory, especially for fusion devices aiming at high performance plasmas. Technical and physical interrelations require close collaboration of many components in sequential as well as in parallel processing flows. Moreover, handling of asynchronous, off-normal events has become a key element of modern plasma performance optimisation and machine protection recipes. The manifoldness of plasma states and events, the variety of plant system operation states and the diversity in diagnostic data sampling rates can hardly be mastered with a rigid control scheme. Rather, an adaptive system topology in combination with sophisticated synchronisation and process scheduling mechanisms is suited for such an environment. Moreover, the system is subject to real-time control constraints: response times must be deterministic and adequately short. Therefore, the experimental tokamak device ASDEX Upgrade employs a discharge control system DCS, whose core has been designed to meet these requirements. In the paper we will compare the scheduling schemes for the parallelised realisation of a control workflow and show the advantage of a data-driven workflow over a managed workflow. The data-driven workflow as used in DCS is based on signals

  4. Management of complex data flows in the ASDEX Upgrade plasma control system

    International Nuclear Information System (INIS)

    Treutterer, Wolfgang; Neu, Gregor; Raupp, Gerhard; Zasche, Dieter; Zehetbauer, Thomas; Cole, Richard; Lüddecke, Klaus

    2012-01-01

    Highlights: ► Control system architectures with data-driven workflows are efficient, flexible and maintainable. ► Signal groups provide coherence of interrelated signals and increase the efficiency of process synchronisation. ► Sample tags indicating sample quality form the fundament of a local event handling strategy. ► A self-organising workflow benefits from sample tags consisting of time stamp and stream activity. - Abstract: Establishing adequate technical and physical boundary conditions for a sustained nuclear fusion reaction is a challenging task. Phased feedback control and monitoring for heating, fuelling and magnetic shaping is mandatory, especially for fusion devices aiming at high performance plasmas. Technical and physical interrelations require close collaboration of many components in sequential as well as in parallel processing flows. Moreover, handling of asynchronous, off-normal events has become a key element of modern plasma performance optimisation and machine protection recipes. The manifoldness of plasma states and events, the variety of plant system operation states and the diversity in diagnostic data sampling rates can hardly be mastered with a rigid control scheme. Rather, an adaptive system topology in combination with sophisticated synchronisation and process scheduling mechanisms is suited for such an environment. Moreover, the system is subject to real-time control constraints: response times must be deterministic and adequately short. Therefore, the experimental tokamak device ASDEX Upgrade employs a discharge control system DCS, whose core has been designed to meet these requirements. In the paper we will compare the scheduling schemes for the parallelised realisation of a control workflow and show the advantage of a data-driven workflow over a managed workflow. The data-driven workflow as used in DCS is based on signals connecting process outputs and inputs. These are implemented as real-time streams of data samples
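A minimal sketch of the data-driven workflow idea described above, under assumed names: processes are connected by named signals, and a process fires as soon as all of its input signals carry a sample with the same time stamp. The signal names, the toy control law, and the tagging scheme are illustrative and are not the DCS implementation.

```python
# Data-driven scheduling sketch: publishing a time-stamped sample on a signal
# fires every downstream process whose inputs are complete for that time stamp;
# each result is published in turn, which may trigger further processes.

class DataDrivenWorkflow:
    def __init__(self):
        self.processes = []          # (input signals, output signal, function)
        self.samples = {}            # (signal, time stamp) -> value

    def add_process(self, inputs, output, fn):
        self.processes.append((inputs, output, fn))

    def publish(self, signal, t, value):
        """Store a time-stamped sample and fire any process that became ready."""
        self.samples[(signal, t)] = value
        for inputs, output, fn in self.processes:
            ready = all((s, t) in self.samples for s in inputs)
            if ready and (output, t) not in self.samples:
                result = fn(*[self.samples[(s, t)] for s in inputs])
                self.publish(output, t, result)   # may trigger further nodes

wf = DataDrivenWorkflow()
wf.add_process(["density", "target"], "error", lambda d, tgt: tgt - d)
wf.add_process(["error"], "valve", lambda e: max(0.0, 0.5 * e))

wf.publish("target", t=1, value=2.0)   # nothing fires yet: "density" missing
wf.publish("density", t=1, value=1.2)  # "error" fires, which fires "valve"
print(wf.samples[("valve", 1)])
```

No central manager decides the execution order: the arrival of samples alone organises the workflow, which is the advantage claimed for the data-driven scheme.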

  5. Data-flow performance optimisation of the ATLAS data acquisition system

    CERN Document Server

    Colombo, Tommaso; Vandelli, Wainer

    Colliding particles at higher and higher energies has proven to be a fruitful avenue to expand our knowledge of nature. Results from high-energy physics experiments have led to the formulation of the Standard Model, which has been strikingly successful in describing the currently known fundamental particles and the interactions between them. Nevertheless, the Standard Model is necessarily an incomplete theory, as it neither accounts for gravity nor provides an explanation for cosmological problems like the apparent existence of dark matter and the observed matter-antimatter asymmetry. New phenomena, not contained in the Standard Model, could be discovered by pushing the energy boundary further. Current high-energy physics experiments aim to observe these new phenomena and explore the electroweak symmetry breaking mechanism predicted by the Standard Model. These will necessarily be concealed within a huge background of already well known processes. Therefore, not only the energy, but also the collision rate ...

  6. The ATLAS Data Flow system in Run2: Design and Performance

    CERN Document Server

    Rifki, Othmane; The ATLAS collaboration

    2016-01-01

    The ATLAS detector uses a real-time selective triggering system to reduce the high interaction rate from 40 MHz to its data storage capacity of 1 kHz. A hardware first-level trigger limits the rate to 100 kHz, and a software high-level trigger selects events for offline analysis. By building on the experience gained during the successful first run of the LHC, the ATLAS Trigger and Data Acquisition system has been simplified and upgraded to take advantage of state-of-the-art technologies. The Dataflow element of the system is composed of distributed hardware and software responsible for buffering and transporting event data from the Readout system to the High Level Trigger and to the event storage. This system has been reshaped in order to maximize the flexibility and efficiency of the data selection process. The updated dataflow is different from the previous implementation both in terms of architecture and performance. The biggest difference is within the high level trigger, where the merger of region-of-inte...

  7. The ATLAS EventIndex: data flow and inclusion of other metadata

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00064378; Cardenas Zarate, Simon Ernesto; Favareto, Andrea; Fernandez Casani, Alvaro; Gallas, Elizabeth; Garcia Montoro, Carlos; Gonzalez de la Hoz, Santiago; Hrivnac, Julius; Malon, David; Prokoshin, Fedor; Salt, Jose; Sanchez, Javier; Toebbicke, Rainer; Yuan, Ruijun

    2016-01-01

    The ATLAS EventIndex is the catalogue of the event-related metadata for the information collected from the ATLAS detector. The basic unit of this information is the event record, containing the event identification parameters, pointers to the files containing this event as well as trigger decision information. The main use case for the EventIndex is event picking, as well as data consistency checks for large production campaigns. The EventIndex employs the Hadoop platform for data storage and handling, as well as a messaging system for the collection of information. The information for the EventIndex is collected both at Tier-0, when the data are first produced, and from the Grid, when various types of derived data are produced. The EventIndex uses various types of auxiliary information from other ATLAS sources for data collection and processing: trigger tables from the condition metadata database (COMA), dataset information from the data catalogue AMI and the Rucio data management system and information on p...

  8. The ATLAS EventIndex: data flow and inclusion of other metadata

    CERN Document Server

    Prokoshin, Fedor; The ATLAS collaboration; Cardenas Zarate, Simon Ernesto; Favareto, Andrea; Fernandez Casani, Alvaro; Gallas, Elizabeth; Garcia Montoro, Carlos; Gonzalez de la Hoz, Santiago; Hrivnac, Julius; Malon, David; Salt, Jose; Sanchez, Javier; Toebbicke, Rainer; Yuan, Ruijun

    2016-01-01

    The ATLAS EventIndex is the catalogue of the event-related metadata for the information obtained from the ATLAS detector. The basic unit of this information is the event record, containing the event identification parameters, pointers to the files containing this event, as well as trigger decision information. The main use cases for the EventIndex are event picking, providing information for the Event Service, and data consistency checks for large production campaigns. The EventIndex employs the Hadoop platform for data storage and handling, as well as a messaging system for the collection of information. The information for the EventIndex is collected both at Tier-0, when the data are first produced, and from the Grid, when various types of derived data are produced. The EventIndex uses various types of auxiliary information from other ATLAS sources for data collection and processing: trigger tables from the condition metadata database (COMA), dataset information from the data catalogue AMI and the Rucio data man...

  9. Analysis of data flow and activities at radiology reporting stations for design and evaluation of digital work stations

    International Nuclear Information System (INIS)

    Mun, S.K.; Benson, H.; Welsh, C.; Elliott, L.P.; Zeman, R.

    1987-01-01

    Definition of the necessary and desirable functional capabilities of PACS work stations is critical in the design of digital systems for the successful clinical acceptance of digital imaging networks. The authors conducted a detailed time-motion study of data flow patterns, diagnostic decision making, and reporting activities at current film alternators for neuroradiology, body CT, and pulmonary services. The measured parameters include data volume, data presentation speed, frequency of use of previous studies, effort needed to retrieve previous studies, time required for diagnosis, frequency and duration of consultation with referring physicians, frequency of interruptions, and dictation time and efficiency. The results of this study provide critical information for designing digital work stations for various services.

  10. Data-flow coupling and data-acquisition triggers for the PreSPEC-AGATA campaign at GSI

    Energy Technology Data Exchange (ETDEWEB)

    Ralet, D., E-mail: D.Ralet@gsi.de [Institut für Kernphysik, Technische Universität Darmstadt, Darmstadt (Germany); GSI, Helmholtzzentrum für Schwerionenforschung GmbH, Darmstadt (Germany); Pietri, S. [GSI, Helmholtzzentrum für Schwerionenforschung GmbH, Darmstadt (Germany); Aubert, Y. [Institut de Physique Nucléaire, Orsay (France); Bellato, M.; Bortolato, D. [Istituto Nazionale di Fisica Nucleare sezione di Padova, Padova (Italy); Brambilla, S.; Camera, F. [Istituto Nazionale di Fisica Nucleare sezione di Milano, Milano (Italy); Dosme, N. [CSNSM, Université Paris-Sud, Orsay (France); Gadea, A. [Instituto di Fisica Corpuscular, Valencia (Spain); Gerl, J. [GSI, Helmholtzzentrum für Schwerionenforschung GmbH, Darmstadt (Germany); Golubev, P. [Department of Physics, Lund University, Lund (Sweden); Grave, X. [CSNSM, Université Paris-Sud, Orsay (France); Johansson, H.T. [Chalmers University of Technology, Göteborg (Sweden); Karkour, N.; Korichi, A. [CSNSM, Université Paris-Sud, Orsay (France); Kurz, N. [GSI, Helmholtzzentrum für Schwerionenforschung GmbH, Darmstadt (Germany); Lafay, X.; Legay, E.; Linget, D. [CSNSM, Université Paris-Sud, Orsay (France); Pietralla, N. [Institut für Kernphysik, Technische Universität Darmstadt, Darmstadt (Germany); GSI, Helmholtzzentrum für Schwerionenforschung GmbH, Darmstadt (Germany); and others

    2015-06-21

    The PreSPEC setup for high-resolution γ-ray spectroscopy using radioactive ion beams was employed for experimental campaigns in 2012 and 2014. The setup consisted of the state of the art Advanced GAmma Tracking Array (AGATA) and the High Energy γ deteCTOR (HECTOR+) positioned around a secondary target at the final focal plane of the GSI FRagment Separator (FRS) to perform in-beam γ-ray spectroscopy of exotic nuclei. The Lund York Cologne CAlorimeter (LYCCA) was used to identify the reaction products. In this paper we report on the trigger scheme used during the campaigns. The data-flow coupling between the Multi-Branch System (MBS) based Data AcQuisition (DAQ) used for FRS-LYCCA and the “Nouvelle Acquisition temps Réel Version 1.2 Avec Linux” (NARVAL) based acquisition system used for AGATA are also described.

  11. Personal Data Flows

    DEFF Research Database (Denmark)

    Albrechtslund, Anders; Damkjær, Maja Sonne; Bøge, Ask Risom

    are used and how they potentially change the relation between parents and children. Both parents and children use their digital devices, particularly smartphones, as cameras to document their lives and to share photos with others. The interviews show that parents do not generally plan to store or organize...

  12. Dynamics of data flows on the low-activated vanadium alloy for thermonuclear power engineering (analysis of four international data bases)

    International Nuclear Information System (INIS)

    Shepelev, A.G.; Kurilo, Yu.P.; Krivchenko, O.V.

    2015-01-01

    The paper presents the results of a scientometric analysis of data flows in the International Data Bases SCOPUS, INSPEC, INIS and MSCI over the period from 1971 to 2014 on low-activated vanadium alloys suitable for operation as structural materials under extremely hard conditions in future fusion reactors. Data on the dynamics of publications, and on the contributions to them from scientists of different countries, have been obtained. The types and languages of the publications have been identified. The analysis shows that investigations of low-activated vanadium alloys are of current importance.

  13. When Are Mobile Phones Useful for Water Quality Data Collection? An Analysis of Data Flows and ICT Applications among Regulated Monitoring Institutions in Sub-Saharan Africa.

    Science.gov (United States)

    Kumpel, Emily; Peletz, Rachel; Bonham, Mateyo; Fay, Annette; Cock-Esteb, Alicea; Khush, Ranjiv

    2015-09-02

    Water quality monitoring is important for identifying public health risks and ensuring water safety. However, even when water sources are tested, many institutions struggle to access data for immediate action or long-term decision-making. We analyzed water testing structures among 26 regulated water suppliers and public health surveillance agencies across six African countries and identified four water quality data management typologies. Within each typology, we then analyzed the potential for information and communication technology (ICT) tools to facilitate water quality information flows. A consistent feature of all four typologies was that testing activities occurred in laboratories or offices, not at water sources; therefore, mobile phone-based data management may be most beneficial for institutions that collect data from multiple remote laboratories. We implemented a mobile phone application to facilitate water quality data collection within the national public health agency in Senegal, Service National de l'Hygiène. Our results indicate that using the phones to transmit more than just water quality data will likely improve the effectiveness and sustainability of this type of intervention. We conclude that an assessment of program structure, particularly its data flows, provides a sound starting point for understanding the extent to which ICTs might strengthen water quality monitoring efforts.

  14. When Are Mobile Phones Useful for Water Quality Data Collection? An Analysis of Data Flows and ICT Applications among Regulated Monitoring Institutions in Sub-Saharan Africa

    Directory of Open Access Journals (Sweden)

    Emily Kumpel

    2015-09-01

    Full Text Available Water quality monitoring is important for identifying public health risks and ensuring water safety. However, even when water sources are tested, many institutions struggle to access data for immediate action or long-term decision-making. We analyzed water testing structures among 26 regulated water suppliers and public health surveillance agencies across six African countries and identified four water quality data management typologies. Within each typology, we then analyzed the potential for information and communication technology (ICT) tools to facilitate water quality information flows. A consistent feature of all four typologies was that testing activities occurred in laboratories or offices, not at water sources; therefore, mobile phone-based data management may be most beneficial for institutions that collect data from multiple remote laboratories. We implemented a mobile phone application to facilitate water quality data collection within the national public health agency in Senegal, Service National de l’Hygiène. Our results indicate that using the phones to transmit more than just water quality data will likely improve the effectiveness and sustainability of this type of intervention. We conclude that an assessment of program structure, particularly its data flows, provides a sound starting point for understanding the extent to which ICTs might strengthen water quality monitoring efforts.

  15. Design, functioning and possible applications of process computers

    International Nuclear Information System (INIS)

    Kussl, V.

    1975-01-01

    Process computers are useful as automation instruments a) when large numbers of data are processed in analog or digital form, b) for low data flow (data rate), and c) when data must be stored over short or long periods of time. (orig./AK) [de

  16. Reachability for Finite-State Process Algebras Using Static Analysis

    DEFF Research Database (Denmark)

    Skrypnyuk, Nataliya; Nielson, Flemming

    2011-01-01

    In this work we present an algorithm for solving the reachability problem in finite systems that are modelled with process algebras. Our method uses Static Analysis, in particular, Data Flow Analysis, of the syntax of a process algebraic system with multi-way synchronisation. The results of the Data Flow Analysis are used in order to “cut off” some of the branches in the reachability analysis that are not important for determining whether or not a state is reachable. In this way, it is possible for our reachability algorithm to avoid building large parts of the system altogether and still solve the reachability problem in a precise way.

  17. Standard services for the capture, processing, and distribution of packetized telemetry data

    Science.gov (United States)

    Stallings, William H.

    1989-01-01

    Standard functional services for the capture, processing, and distribution of packetized data are discussed with particular reference to the future implementation of packet processing systems, such as those for the Space Station Freedom. The major functions are listed under the following major categories: input processing, packet processing, and output processing. A functional block diagram of a packet data processing facility is presented, showing the distribution of the various processing functions as well as the primary data flow through the facility.
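The three service categories above can be illustrated as one chained data flow; the frame and packet layout used here is invented for illustration and is not the facility's actual format.

```python
# Toy packet-data pipeline in the spirit of the categories listed above:
# input processing (capture), packet processing (sorting), output processing
# (distribution). The 2-byte frame header and 1-byte application ID are
# assumptions made for this sketch.

def input_processing(frames):
    """Capture: strip a 2-byte frame header, yielding raw packet bytes."""
    return [f[2:] for f in frames]

def packet_processing(packets):
    """Sort packets into streams keyed by their 1-byte application ID."""
    streams = {}
    for p in packets:
        streams.setdefault(p[0], []).append(p[1:])
    return streams

def output_processing(streams):
    """Distribute: summarise the bytes delivered per application ID."""
    return {apid: sum(len(p) for p in pkts) for apid, pkts in streams.items()}

frames = [b"\x00\x01\x07abc", b"\x00\x01\x07de", b"\x00\x01\x09xyz"]
print(output_processing(packet_processing(input_processing(frames))))
```

Chaining the three functions mirrors the primary data flow through the facility shown in the block diagram.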

  18. Timing analysis of synchronous data flow graphs

    NARCIS (Netherlands)

    Ghamarian, A.H.

    2008-01-01

    Consumer electronic systems are getting more and more complex. Consequently, their design is getting more complicated. Typical systems built today are made of different subsystems that work in parallel in order to meet the functional requirements of the demanded applications. The types of

  19. STREAM PROCESSING ALGORITHMS FOR DYNAMIC 3D SCENE ANALYSIS

    Science.gov (United States)

    2018-02-15

    [Report-form and list-of-figures residue; recoverable fragments: contract number FA8750-14-2-0072; figure captions “The 3D processing pipeline flowchart showing key modules” and “Overall view (data flow) of the proposed pipeline”; abstract text on a structure from motion and bundle adjustment algorithm, with fusion of depth masks of the scene obtained from 3D…]

  20. The ATLAS Event Service: A New Approach to Event Processing

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00070566; De, Kaushik; Guan, Wen; Maeno, Tadashi; Nilsson, Paul; Oleynik, Danila; Panitkin, Sergey; Tsulaia, Vakhtang; van Gemmeren, Peter; Wenaus, Torre

    2015-01-01

    The ATLAS Event Service (ES) implements a new fine grained approach to HEP event processing, designed to be agile and efficient in exploiting transient, short-lived resources such as HPC hole-filling, spot market commercial clouds, and volunteer computing. Input and output control and data flows, bookkeeping, monitoring, and data storage are all managed at the event level in an implementation capable of supporting ATLAS-scale distributed processing throughputs (about 4M CPU-hours/day). Input data flows utilize remote data repositories with no data locality or pre-staging requirements, minimizing the use of costly storage in favor of strongly leveraging powerful networks. Object stores provide a highly scalable means of remotely storing the quasi-continuous, fine grained outputs that give ES based applications a very light data footprint on a processing resource, and ensure negligible losses should the resource suddenly vanish. We will describe the motivations for the ES system, its unique features and capabi...

  1. Flow Logic for Process Calculi

    DEFF Research Database (Denmark)

    Nielson, Hanne Riis; Nielson, Flemming; Pilegaard, Henrik

    2012-01-01

    Flow Logic is an approach to statically determining the behavior of programs and processes. It borrows methods and techniques from Abstract Interpretation, Data Flow Analysis and Constraint Based Analysis while presenting the analysis in a style more reminiscent of Type Systems. Traditionally developed for programming languages, this article provides a tutorial development of the approach of Flow Logic for process calculi based on a decade of research. We first develop a simple analysis for the π-calculus; this consists of the specification, semantic soundness (in the form of subject reduction) …; finally, we extend it to a relational analysis. A Flow Logic is a program logic---in the same sense that a Hoare logic is. We conclude with an executive summary presenting the highlights of the approach from this perspective, including a discussion of theoretical properties as well as implementation...

  2. Declaratively programmable ultra-low latency audio effects processing on FPGA

    NARCIS (Netherlands)

    Verstraelen, Martinus Johannes Wilhelmina; Kuper, Jan; Smit, Gerardus Johannes Maria

    2014-01-01

    WaveCore is a coarse-grained reconfigurable processor architecture, based on data-flow principles. The processor architecture consists of a scalable and interconnected cluster of Processing Units (PU), where each PU embodies a small floating-point RISC processor. The processor has been designed in

  3. Graphical Language for Data Processing

    Science.gov (United States)

    Alphonso, Keith

    2011-01-01

    A graphical language for processing data allows processing elements to be connected with virtual wires that represent data flows between processing modules. The processing of complex data, such as lidar data, requires many different algorithms to be applied. The purpose of this innovation is to automate the processing of complex data, such as LIDAR, without the need for complex scripting and programming languages. The system consists of a set of user-interface components that allow the user to drag and drop various algorithmic and processing components onto a process graph. By working graphically, the user can completely visualize the process flow and create complex diagrams. This innovation supports the nesting of graphs, such that a graph can be included in another graph as a single step for processing. In addition to the user interface components, the system includes a set of .NET classes that represent the graph internally. These classes provide the internal system representation of the graphical user interface. The system includes a graph execution component that reads the internal representation of the graph (as described above) and executes that graph. The execution of the graph follows the interpreted model of execution in that each node is traversed and executed from the original internal representation. In addition, there are components that allow external code elements, such as algorithms, to be easily integrated into the system, thus making the system infinitely expandable.
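A minimal sketch of the interpreted execution model described above, with assumed names: a process graph holds nodes joined by virtual wires, and the executor traverses the internal representation, running each node once all of its input wires carry data. (The innovation itself is a .NET system with a drag-and-drop interface; this sketch only mirrors the execution idea.)

```python
from collections import deque

class Node:
    """One processing step: a function fed by named input wires."""
    def __init__(self, name, fn, inputs=()):
        self.name, self.fn, self.inputs = name, fn, list(inputs)

def execute_graph(nodes, sources):
    """Interpret the graph: values flow along wires from sources to sinks."""
    values = dict(sources)               # wire name -> data currently on it
    pending = deque(nodes)
    while pending:
        node = pending.popleft()
        if all(i in values for i in node.inputs):
            values[node.name] = node.fn(*[values[i] for i in node.inputs])
        else:
            pending.append(node)         # inputs not ready yet; retry later

    return values

# Toy lidar-style pipeline: filter points, grid them, then count grid cells.
graph = [
    Node("filtered", lambda pts: [p for p in pts if p >= 0], ["points"]),
    Node("cells", lambda pts: {int(p) for p in pts}, ["filtered"]),
    Node("count", len, ["cells"]),
]
result = execute_graph(graph, {"points": [0.5, -1.0, 2.2, 2.9]})
print(result["count"])
```

A nested graph could be supported by wrapping `execute_graph` itself as a node's `fn`, matching the composability the text describes.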

  4. Verification of Stochastic Process Calculi

    DEFF Research Database (Denmark)

    Skrypnyuk, Nataliya

    In support of this claim we have developed analysis methods that belong to a particular type of Static Analysis – Data Flow / Pathway Analysis. These methods have previously been applied to a number of non-stochastic process calculi. In this thesis we are lifting them to the stochastic calculus of Interactive Markov Chains (IMC). We have devised the Pathway Analysis of IMC that is not only correct in the sense of overapproximating all possible behaviour scenarios, as is usual for Static Analysis methods, but is also precise. This gives us the possibility to explicitly decide on the trade-off between … algorithms for constructing bisimulation relations, computing (overapproximations of) sets of reachable states and computing the expected time reachability, the last for a linear fragment of IMC. In all the cases we have the complexities of algorithms which are low polynomial in the size of the syntactic...

  5. Data triggered data processing at MFTF-B

    International Nuclear Information System (INIS)

    Jackson, R.J.; Balch, T.R.; Preckshot, G.G.

    1985-01-01

    A primary characteristic of most batch systems is that the input data files must exist before jobs are scheduled. On the Mirror Fusion Test Facility (MFTF-B) at Lawrence Livermore National Laboratory we schedule jobs to process experimental data to be collected during a five minute shot cycle. Our data-driven processing system emulates a coarsely granular data flow architecture. Processing jobs are scheduled before the experimental data is collected. Processing jobs ''fire'', or execute, as input data becomes available. Similar to UNIX ''pipes'', data produced by upstream processing nodes may be used as inputs by following nodes. Users, working on our networked SUN workstations, specify data processing templates which define processes and their data dependencies. Data specifications indicate the source of data; actual associations with specific data instantiations are made when the jobs are scheduled. We report here on details of diagnostic data processing and our experiences
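The firing behaviour described above, with jobs scheduled before their data exists and executing as inputs arrive, can be sketched as follows (hypothetical names; a toy model, not the MFTF-B software):

```python
class Job:
    """A processing job scheduled before its input data exists."""
    def __init__(self, name, needs, action):
        self.name, self.needs, self.action = name, set(needs), action

def run_shot(jobs, arriving):
    """Jobs 'fire' as their input data becomes available; outputs of upstream
    jobs feed downstream ones, much like UNIX pipes."""
    available, fired, pending = {}, [], list(jobs)
    queue = list(arriving)                       # (name, value) in arrival order
    while queue:
        name, value = queue.pop(0)
        available[name] = value
        for job in list(pending):
            if job.needs <= available.keys():    # all declared inputs present?
                pending.remove(job)
                out = job.action(available)
                fired.append(job.name)
                queue.append((job.name, out))    # may trigger downstream jobs
    return fired, available

# Templates declare data dependencies; the shot delivers "raw" only later.
jobs = [
    Job("plot",      {"calibrate"}, lambda d: len(d["calibrate"])),
    Job("calibrate", {"raw"},       lambda d: [x - 1 for x in d["raw"]]),
]
fired, data = run_shot(jobs, [("raw", [5, 6, 7])])
print(fired)  # ['calibrate', 'plot']
```

Note that "plot" is listed first but fires second, since its input is produced by "calibrate": ordering follows data availability, not the schedule.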

  6. Data triggered data processing at the Mirror Fusion Test Facility

    International Nuclear Information System (INIS)

    Jackson, R.J.; Balch, T.R.; Preckshot, G.G.

    1986-01-01

    A primary characteristic of most batch systems is that the input data files must exist before jobs are scheduled. On the Mirror Fusion Test Facility (MFTF-B) at Lawrence Livermore National Laboratory the authors schedule jobs to process experimental data to be collected during a five minute shot cycle. The data driven processing system emulates a coarsely granular data flow architecture. Processing jobs are scheduled before the experimental data is collected. Processing jobs ''fire'', or execute, as input data becomes available. Similar to UNIX ''pipes'', data produced by upstream processing nodes may be used as inputs by following nodes. Users, working on the networked SUN workstations, specify data processing templates which define processes and their data dependencies. Data specifications indicate the source of data; actual associations with specific data instantiations are made when the jobs are scheduled. The authors report here on details of diagnostic data processing and their experiences

  7. NPTool: Towards Scalability and Reliability of Business Process Management

    Science.gov (United States)

    Braghetto, Kelly Rosa; Ferreira, João Eduardo; Pu, Calton

    Currently one important challenge in business process management is to provide scalability and reliability of business process executions at the same time. This difficulty becomes more accentuated when the execution control involves countless complex business processes. This work presents NavigationPlanTool (NPTool), a tool to control the execution of business processes. NPTool is supported by the Navigation Plan Definition Language (NPDL), a language for business process specification that uses process algebra as its formal foundation. NPTool implements the NPDL language as a SQL extension. The main contribution of this paper is a description of NPTool showing how process algebra features combined with a relational database model can be used to provide scalable and reliable control of the execution of business processes. The next steps of NPTool include reuse of control-flow patterns and support for data flow management.
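The process-algebra foundation mentioned above can be illustrated with a toy trace semantics for sequence, choice and interleaving (hypothetical operators; NPDL itself is implemented as a SQL extension, which this sketch does not attempt to reproduce):

```python
def atom(a):      return lambda: [[a]]            # a single activity
def seq(p, q):    return lambda: [x + y for x in p() for y in q()]
def choice(p, q): return lambda: p() + q()        # exclusive choice

def interleave(x, y):
    """All interleavings of two traces (used for parallel composition)."""
    if not x: return [y]
    if not y: return [x]
    return [[x[0]] + t for t in interleave(x[1:], y)] + \
           [[y[0]] + t for t in interleave(x, y[1:])]

def par(p, q):    # parallel composition as trace interleaving
    return lambda: [t for x in p() for y in q() for t in interleave(x, y)]

# A tiny process: receive an order, then either approve or reject it.
proc = seq(atom("receive_order"), choice(atom("approve"), atom("reject")))
print(proc())  # [['receive_order', 'approve'], ['receive_order', 'reject']]
```

Enumerating traces like this is what lets an execution controller check that each observed step is a legal continuation of the process expression.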

  8. Fuel processing. Wastes processing

    International Nuclear Information System (INIS)

    Bourgeois, M.

    2000-01-01

    The gaseous, liquid and solid radioactive effluents generated by fuel reprocessing cannot be released into the environment. They have to be treated in order to respect the limits of the pollution regulations. These treatment processes are detailed and discussed in this technical paper. A second part is devoted to the SPIN research program on the separation of long-lived radionuclides in order to reduce the radioactive waste storage volume. (A.L.B.)

  9. Systematic framework for carbon dioxide capture and utilization processes to reduce the global carbon dioxide emissions

    DEFF Research Database (Denmark)

    Frauzem, Rebecca; Plaza, Cristina Calvera; Gani, Rafiqul

    information-data on various carbon dioxide emission sources and available capture-utilization technologies; the model and solution libraries [2]; and the generic 3-stage approach for determining more sustainable solutions [3] through superstructure (processing networks) based optimization – adopted for global… need to provide, amongst other options: useful data from in-house databases on carbon dioxide emission sources; mathematical models from a library of process-property models; numerical solvers from a library of implemented solvers; and work-flows and data-flows for different benefit scenarios to be investigated. It is useful to start by developing a prototype framework and then augmenting its application range by increasing the contents of its databases, libraries and work-flows and data-flows. The objective is to present such a prototype framework with its implemented database containing collected…

  10. Process Accounting

    OpenAIRE

    Gilbertson, Keith

    2002-01-01

    Standard utilities can help you collect and interpret your Linux system's process accounting data. Describes the uses of process accounting, standard process accounting commands, and example code that makes use of process accounting utilities.

  11. Modular trigger processing The GCT muon and quiet bit system

    CERN Document Server

    Stettler, Matthew; Hansen, Magnus; Iles, Gregory; Jones, John; PH-EP

    2007-01-01

    The CMS Global Calorimeter Trigger system's HCAL Muon and Quiet bit reformatting function is being implemented with a novel processing architecture. This architecture utilizes MicroTCA, a modern modular communications standard based on high-speed serial links, to implement a processing matrix. This matrix is configurable in both logical functionality and data flow, allowing far greater flexibility than current trigger processing systems. In addition, the modular nature of this architecture allows flexibility in scale unmatched by traditional approaches. The Muon and Quiet bit system consists of two major components, a custom MicroTCA backplane and a processing module. These components are based on Xilinx Virtex5 and Mindspeed crosspoint switch devices, bringing together state-of-the-art FPGA-based processing and telecom switching technologies.

  12. Model based methods and tools for process systems engineering

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    need to be integrated with work-flows and data-flows for specific product-process synthesis-design problems within a computer-aided framework. The framework therefore should be able to manage knowledge-data, models and the associated methods and tools needed by specific synthesis-design work… of model based methods and tools within a computer aided framework for product-process synthesis-design will be highlighted… Process systems engineering (PSE) provides means to solve a wide range of problems in a systematic and efficient manner. This presentation will give a perspective on model based methods and tools needed to solve a wide range of problems in product-process synthesis-design. These methods and tools…

  13. Meat Processing.

    Science.gov (United States)

    Legacy, Jim; And Others

    This publication provides an introduction to meat processing for adult students in vocational and technical education programs. Organized in four chapters, the booklet provides a brief overview of the meat processing industry and the techniques of meat processing and butchering. The first chapter introduces the meat processing industry and…

  14. Vulnerability detection using data-flow graphs and SMT solvers

    Science.gov (United States)

    2016-10-31

    …correctly. Developers rarely develop with consideration for eliminating vulnerabilities in source code. Source code is not always available for… as deep neural networks. We also plan to define heuristics on what types of learners and features to use when identifying different vulnerabilities.

  15. Data-flow oriented visual programming libraries for scientific computing

    NARCIS (Netherlands)

    Maubach, J.M.L.; Drenth, W.D.; Sloot, P.M.A.

    2002-01-01

    The growing release of scientific computational software does not seem to aid the implementation of complex numerical algorithms. Released libraries lack a common standard interface with regard to, for instance, finite element, difference or volume discretizations. And libraries written in standard

  16. Computer network prepared to handle massive data flow

    CERN Multimedia

    2006-01-01

    "Massive quantities of data will soon begin flowing from the largest scientific instrument ever built into an international network of computer centers, including one operated jointly by the University of Chicago and Indiana University." (2 pages)

  17. Software Engineering Laboratory (SEL) cleanroom process model

    Science.gov (United States)

    Green, Scott; Basili, Victor; Godfrey, Sally; Mcgarry, Frank; Pajerski, Rose; Waligora, Sharon

    1991-01-01

    The Software Engineering Laboratory (SEL) cleanroom process model is described. The term 'cleanroom' originates in the integrated circuit (IC) production process, where IC's are assembled in dust free 'clean rooms' to prevent the destructive effects of dust. When applying the clean room methodology to the development of software systems, the primary focus is on software defect prevention rather than defect removal. The model is based on data and analysis from previous cleanroom efforts within the SEL and is tailored to serve as a guideline in applying the methodology to future production software efforts. The phases that are part of the process model life cycle from the delivery of requirements to the start of acceptance testing are described. For each defined phase, a set of specific activities is discussed, and the appropriate data flow is described. Pertinent managerial issues, key similarities and differences between the SEL's cleanroom process model and the standard development approach used on SEL projects, and significant lessons learned from prior cleanroom projects are presented. It is intended that the process model described here will be further tailored as additional SEL cleanroom projects are analyzed.

  18. Process development

    Energy Technology Data Exchange (ETDEWEB)

    Schuegerl, K

    1984-01-01

    The item 'process development' comprises the production of acetone/butanol with C. acetobutylicum and the fermentation of potato waste. The target is to increase productivity by taking the following measures: optimization of media, on-line process analysis, analysis of reaction, mathematical modelling and identification of parameters, process simulation, development of a state estimator with the help of the on-line process analysis and the model, and optimization and adaptive control.

  19. Model based process-product design and analysis

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    This paper gives a perspective on modelling and the important role it has within product-process design and analysis. Different modelling issues related to the development and application of systematic model-based solution approaches for product-process design are discussed, and the need for a hybrid model-based framework is highlighted. This framework should be able to manage knowledge-data, models, and associated methods and tools integrated with design work-flows and data-flows for specific product-process design problems. In particular, the framework needs to manage models of different types, forms and complexity, together with their associated parameters. An example of a model-based system for design of chemicals based formulated products is also given.

  20. Poisson processes

    NARCIS (Netherlands)

    Boxma, O.J.; Yechiali, U.; Ruggeri, F.; Kenett, R.S.; Faltin, F.W.

    2007-01-01

    The Poisson process is a stochastic counting process that arises naturally in a large variety of daily life situations. We present a few definitions of the Poisson process and discuss several properties as well as relations to some well-known probability distributions. We further briefly discuss the

  1. Data processing

    CERN Document Server

    Fry, T F

    2013-01-01

    Data Processing discusses the principles, practices, and associated tools in data processing. The book is comprised of 17 chapters that are organized into three parts. The first part covers the characteristics, systems, and methods of data processing. Part 2 deals with the data processing practice; this part discusses the data input, output, and storage. The last part discusses topics related to systems and software in data processing, which include checks and controls, computer language and programs, and program elements and structures. The text will be useful to practitioners of computer-rel

  2. Stochastic processes

    CERN Document Server

    Parzen, Emanuel

    1962-01-01

    Well-written and accessible, this classic introduction to stochastic processes and related mathematics is appropriate for advanced undergraduate students of mathematics with a knowledge of calculus and continuous probability theory. The treatment offers examples of the wide variety of empirical phenomena for which stochastic processes provide mathematical models, and it develops the methods of probability model-building.Chapter 1 presents precise definitions of the notions of a random variable and a stochastic process and introduces the Wiener and Poisson processes. Subsequent chapters examine

  3. Magnetics Processing

    Data.gov (United States)

    Federal Laboratory Consortium — The Magnetics Processing Lab equipped to perform testing of magnetometers, integrate them into aircraft systems, and perform data analysis, including noise reduction...

  4. Data processing

    International Nuclear Information System (INIS)

    Cousot, P.

    1988-01-01

    The 1988 progress report of the Data Processing laboratory (Polytechnic School, France) is presented. The laboratory research fields are: semantics, the testing and semantic analysis of codes, formal calculus, software applications, algorithms, neural networks and VLSI (Very Large Scale Integration). The investigations concerning polynomial rings are performed by means of the standard basis approach. The research topics include Pascal codes, parallel processing, the combinatorial, statistical and asymptotic properties of fundamental data processing tools, signal processing and pattern recognition. The published papers, the congress communications and the theses are also included [fr

  5. Suppurative processes

    International Nuclear Information System (INIS)

    Vinner, M.G.

    1983-01-01

    Suppurative processes in the case of bronchiectatic disease, abscess and gangrene of the lungs are described. Characteristic signs of the roentgenologic picture of the above-mentioned diseases are considered. It is shown that in most cases roentgenologic studies make it possible to establish a reliable diagnosis of suppurative processes

  6. Design Processes

    DEFF Research Database (Denmark)

    Ovesen, Nis

    2009-01-01

    Inspiration for most research and optimisations on design processes still seems to focus within the narrow field of traditional design practice. The focus in this study turns to associated businesses of the design professions in order to learn from their development processes. Through interviews… and emerging production methods…

  7. Process development

    International Nuclear Information System (INIS)

    Zapata G, G.

    1989-01-01

    Process development: The paper describes the organization and laboratory facilities of the group working on radioactive ore processing studies. It contains a review of the research carried out and the plans for the near future. A list of the published reports is also presented

  8. Sustainable processing

    DEFF Research Database (Denmark)

    Kristensen, Niels Heine

    2004-01-01

    Kristensen_NH and_Beck A: Sustainable processing. In Otto Schmid, Alexander Beck and Ursula Kretzschmar (Editors) (2004): Underlying Principles in Organic and "Low-Input Food" Processing - Literature Survey. Research Institute of Organic Agriculture FiBL, CH-5070 Frick, Switzerland. ISBN 3-906081-58-3...

  9. Food processing

    NARCIS (Netherlands)

    Teodorowicz, Malgorzata; Neerven, Van Joost; Savelkoul, Huub

    2017-01-01

    The majority of foods that are consumed in our developed society have been processed. Processing promotes a non-enzymatic reaction between proteins and sugars, the Maillard reaction (MR). Maillard reaction products (MRPs) contribute to the taste, smell and color of many food products, and thus

  10. Membrane processes

    Science.gov (United States)

    Staszak, Katarzyna

    2017-11-01

    Membrane processes have played an important role in industrial separation processes. These technologies can be found in all industrial areas such as food, beverages, metallurgy, pulp and paper, textile, pharmaceutical, automotive, biotechnology and chemical industry, as well as in water treatment for domestic and industrial application. Although these processes have been known since the twentieth century, there are still many studies that focus on the testing of new membrane materials and the determination of conditions for optimal selectivity, i.e. the optimum transmembrane pressure (TMP) or permeate flux to minimize fouling. Moreover, researchers have proposed calculation methods to predict the properties of membrane processes. In this article, laboratory-scale experiments on membrane separation techniques, as well as their validation by calculation methods, are presented. Because the membrane is the "heart" of the process, experimental and computational methods for its characterization are also described.

  11. Process mining

    DEFF Research Database (Denmark)

    van der Aalst, W.M.P.; Rubin, V.; Verbeek, H.M.W.

    2010-01-01

    Process mining includes the automated discovery of processes from event logs. Based on observed events (e.g., activities being executed or messages being exchanged) a process model is constructed. One of the essential problems in process mining is that one cannot assume to have seen all possible behavior. At best, one has seen a representative subset. Therefore, classical synthesis techniques are not suitable as they aim at finding a model that is able to exactly reproduce the log. Existing process mining techniques try to avoid such “overfitting” by generalizing the model to allow for more… support for it). None of the existing techniques enables the user to control the balance between “overfitting” and “underfitting”. To address this, we propose a two-step approach. First, using a configurable approach, a transition system is constructed. Then, using the “theory of regions”, the model…

  12. Partial processing

    International Nuclear Information System (INIS)

    1978-11-01

    This discussion paper considers the possibility of applying to the recycle of plutonium in thermal reactors a particular method of partial processing based on the PUREX process but named CIVEX to emphasise the differences. The CIVEX process is based primarily on the retention of short-lived fission products. The paper suggests: (1) the recycle of fission products with uranium and plutonium in thermal reactor fuel would be technically feasible; (2) it would, however, take ten years or more to develop the CIVEX process to the point where it could be launched on a commercial scale; (3) since the majority of spent fuel to be reprocessed this century will have been in storage for ten years or more, the recycling of short-lived fission products with the U-Pu would not provide an effective means of making refabrication fuel ''inaccessible'' because the radioactivity associated with the fission products would have decayed. There would therefore be no advantage in partial processing

  13. Process monitoring

    International Nuclear Information System (INIS)

    Anon.

    1981-01-01

    Many of the measurements and observations made in a nuclear processing facility to monitor processes and product quality can also be used to monitor the location and movements of nuclear materials. In this session information is presented on how to use process monitoring data to enhance nuclear material control and accounting (MC and A). It will be seen that SNM losses can generally be detected with greater sensitivity and timeliness and point of loss localized more closely than by conventional MC and A systems if process monitoring data are applied. The purpose of this session is to enable the participants to: (1) identify process unit operations that could improve control units for monitoring SNM losses; (2) choose key measurement points and formulate a loss indicator for each control unit; and (3) describe how the sensitivities and timeliness of loss detection could be determined for each loss indicator
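A loss indicator of the kind described in item (2) is typically a material balance over a control unit; the following is a minimal sketch with hypothetical readings, not a real MC and A system:

```python
def inventory_difference(begin, additions, removals, end):
    """Loss indicator for one control unit: material unaccounted for,
    ID = (beginning inventory + additions) - (removals + ending inventory)."""
    return (begin + sum(additions)) - (sum(removals) + end)

# Hypothetical readings (kg SNM) for one process unit over one balance period.
id_value = inventory_difference(begin=12.0,
                                additions=[3.5, 2.5],
                                removals=[4.0, 1.5],
                                end=12.4)
print(round(id_value, 2))  # 0.1
```

Sensitivity and timeliness of loss detection then depend on how often such a balance can be closed and on the measurement uncertainty of each term.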

  14. Message processing in the human brain. III

    Energy Technology Data Exchange (ETDEWEB)

    Gerke, P

    1983-10-07

    For pt.II see ibid., no.19, p.95-100 (1983). The general problem of the possibly achievable super brain is discussed, and subtle differences between various linkages leading to selective processes, creativity decision making and speculative assessments are pointed out and translated into possible approaches to the making of machine intelligence. Generally, associative sequences for processing of large data flows cannot be attempted without the provision of generally valid linkage rules. Such coordination steps are considered first, the brain-machine simulation being built-up vertically on 6 levels and horizontally as recognition stages in an event. These six levels are: repertoire (i.e. vocabulary); definition; scene; happenings; spatial linkages; temporal linkages. Event simulation proceeds from the descriptive to the cognitive situation. Speculative discussions continue with the gradual introduction of computer hardware and software concepts to be adapted for intelligence simulation; thus, the simplest associative process could start with an adder network and proceed to a virtual expert system, which would include teaching by example, autonomous control, non-procedural language, all these governed by schedules.

  15. Design and simulation for real-time distributed processing systems

    International Nuclear Information System (INIS)

    Legrand, I.C.; Gellrich, A.; Gensah, U.; Leich, H.; Wegner, P.

    1996-01-01

    The aim of this work is to provide a proper framework for the simulation and optimization of the event building, the on-line third-level trigger, and the complete event reconstruction processor farm for the future HERA-B experiment. A discrete event, process oriented simulation developed in concurrent μC++ is used for modelling the farm nodes running with multi-tasking constraints and different types of switching elements and digital signal processors interconnected for distributing the data through the system. An adequate graphic interface to the simulation part, which makes it possible to monitor features on-line and to analyze trace files, provides a powerful development tool for evaluating and designing parallel processing architectures. Control software and data flow protocols for event building and dynamic processor allocation are presented for two architectural models. (author)
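A discrete-event, process-oriented farm simulation of the kind described above can be sketched in a few lines (Python rather than μC++; hypothetical names and a deliberately simplistic service-time model):

```python
import heapq
import random

def simulate_farm(n_nodes, n_events, mean_service=1.0, seed=0):
    """Discrete-event sketch of a processor farm: each event is dispatched to
    the node that becomes idle first; service times are drawn at random."""
    rng = random.Random(seed)
    idle_at = [(0.0, n) for n in range(n_nodes)]   # (time node becomes idle, id)
    heapq.heapify(idle_at)
    counts = [0] * n_nodes                         # events processed per node
    for _ in range(n_events):
        t, node = heapq.heappop(idle_at)           # earliest-idle node wins
        heapq.heappush(idle_at, (t + rng.expovariate(1.0 / mean_service), node))
        counts[node] += 1
    makespan = max(t for t, _ in idle_at)
    return makespan, counts

makespan, counts = simulate_farm(n_nodes=8, n_events=1000)
print(sum(counts))  # 1000
```

The event queue (`idle_at`) plays the role of the simulator's future-event list; richer models add switching elements and data-transfer delays as further event types.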

  16. Biased Competition in Visual Processing Hierarchies: A Learning Approach Using Multiple Cues.

    Science.gov (United States)

    Gepperth, Alexander R T; Rebhan, Sven; Hasler, Stephan; Fritsch, Jannik

    2011-03-01

    In this contribution, we present a large-scale hierarchical system for object detection fusing bottom-up (signal-driven) processing results with top-down (model or task-driven) attentional modulation. Specifically, we focus on the question of how the autonomous learning of invariant models can be embedded into a performing system and how such models can be used to define object-specific attentional modulation signals. Our system implements bi-directional data flow in a processing hierarchy. The bottom-up data flow proceeds from a preprocessing level to the hypothesis level where object hypotheses created by exhaustive object detection algorithms are represented in a roughly retinotopic way. A competitive selection mechanism is used to determine the most confident hypotheses, which are used on the system level to train multimodal models that link object identity to invariant hypothesis properties. The top-down data flow originates at the system level, where the trained multimodal models are used to obtain space- and feature-based attentional modulation signals, providing biases for the competitive selection process at the hypothesis level. This results in object-specific hypothesis facilitation/suppression in certain image regions which we show to be applicable to different object detection mechanisms. In order to demonstrate the benefits of this approach, we apply the system to the detection of cars in a variety of challenging traffic videos. Evaluating our approach on a publicly available dataset containing approximately 3,500 annotated video images from more than 1 h of driving, we can show strong increases in performance and generalization when compared to object detection in isolation. Furthermore, we compare our results to a late hypothesis rejection approach, showing that early coupling of top-down and bottom-up information is a favorable approach especially when processing resources are constrained.
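The core of the biased-competition step, bottom-up confidences multiplied by top-down attentional gains before competitive selection, can be sketched as follows (hypothetical names and scores, not the authors' implementation):

```python
def select_hypotheses(bottom_up, top_down_bias, k=2):
    """Competitive selection: bottom-up detection confidences are multiplied
    by top-down attentional gains, then the top-k hypotheses are chosen."""
    modulated = {h: bottom_up[h] * top_down_bias.get(h, 1.0) for h in bottom_up}
    return sorted(modulated, key=modulated.get, reverse=True)[:k]

# Signal-driven scores for candidate regions; the task biases 'car'-like ones.
scores = {"region_a": 0.6, "region_b": 0.55, "region_c": 0.4}
bias   = {"region_c": 2.0}              # task-driven facilitation
print(select_hypotheses(scores, bias))  # ['region_c', 'region_a']
```

The point of the early coupling argued for in the record is visible even here: the bias reorders the competition before selection, rather than rejecting hypotheses afterwards.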

  17. Process automation

    International Nuclear Information System (INIS)

    Moser, D.R.

    1986-01-01

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs

  18. Perceptual Processing Affects Conceptual Processing

    Science.gov (United States)

    van Dantzig, Saskia; Pecher, Diane; Zeelenberg, Rene; Barsalou, Lawrence W.

    2008-01-01

    According to the Perceptual Symbols Theory of cognition (Barsalou, 1999), modality-specific simulations underlie the representation of concepts. A strong prediction of this view is that perceptual processing affects conceptual processing. In this study, participants performed a perceptual detection task and a conceptual property-verification task…

  19. The ATLAS Event Service: A new approach to event processing

    Science.gov (United States)

    Calafiura, P.; De, K.; Guan, W.; Maeno, T.; Nilsson, P.; Oleynik, D.; Panitkin, S.; Tsulaia, V.; Van Gemmeren, P.; Wenaus, T.

    2015-12-01

    The ATLAS Event Service (ES) implements a new fine grained approach to HEP event processing, designed to be agile and efficient in exploiting transient, short-lived resources such as HPC hole-filling, spot market commercial clouds, and volunteer computing. Input and output control and data flows, bookkeeping, monitoring, and data storage are all managed at the event level in an implementation capable of supporting ATLAS-scale distributed processing throughputs (about 4M CPU-hours/day). Input data flows utilize remote data repositories with no data locality or pre-staging requirements, minimizing the use of costly storage in favor of strongly leveraging powerful networks. Object stores provide a highly scalable means of remotely storing the quasi-continuous, fine grained outputs that give ES based applications a very light data footprint on a processing resource, and ensure negligible losses should the resource suddenly vanish. We will describe the motivations for the ES system, its unique features and capabilities, its architecture and the highly scalable tools and technologies employed in its implementation, and its applications in ATLAS processing on HPCs, commercial cloud resources, volunteer computing, and grid resources. Notice: This manuscript has been authored by employees of Brookhaven Science Associates, LLC under Contract No. DE-AC02-98CH10886 with the U.S. Department of Energy. The publisher by accepting the manuscript for publication acknowledges that the United States Government retains a non-exclusive, paid-up, irrevocable, world-wide license to publish or reproduce the published form of this manuscript, or allow others to do so, for United States Government purposes.
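The event-level control flow described above can be sketched generically (hypothetical callback names; not the actual ATLAS ES/PanDA interfaces):

```python
def process_event_ranges(fetch_range, process_event, store_output):
    """Fine-grained event loop: pull a small range of events from a remote
    repository, process each one, and push its output immediately, so almost
    nothing is lost if the transient resource vanishes mid-job."""
    processed = 0
    while True:
        event_range = fetch_range()              # e.g. a short list of event IDs
        if not event_range:
            break                                # no more work assigned
        for event in event_range:
            store_output(event, process_event(event))   # per-event upload
            processed += 1
    return processed

# Toy stand-ins for the remote work source and the object store.
ranges = iter([[1, 2, 3], [4, 5], []])
outputs = {}
n = process_event_ranges(lambda: next(ranges),
                         lambda event: event * event,
                         lambda event, out: outputs.update({event: out}))
print(n, outputs[4])  # 5 16
```

Because bookkeeping is per event, a resource that disappears costs at most the events in flight, which is what makes opportunistic HPC and spot-market resources usable.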

  20. Sewer Processes

    DEFF Research Database (Denmark)

    Hvitved-Jacobsen, Thorkild; Vollertsen, Jes; Nielsen, Asbjørn Haaning

    Since the first edition was published over a decade ago, advancements have been made in the design, operation, and maintenance of sewer systems, and new problems have emerged. For example, sewer processes are now integrated in computer models, and simultaneously, odor and corrosion problems caused by hydrogen sulfide and other volatile organic compounds, as well as other potential health issues, have caused environmental concerns to rise. Reflecting the most current developments, Sewer Processes: Microbial and Chemical Process Engineering of Sewer Networks, Second Edition, offers the reader updated and valuable information on the sewer as a chemical and biological reactor. It focuses on how to predict critical impacts and control adverse effects. It also provides an integrated description of sewer processes in modeling terms. This second edition is full of illustrative examples and figures, includes…

  1. Electrochemical Processes

    DEFF Research Database (Denmark)

    Bech-Nielsen, Gregers

    1997-01-01

    The notes describe in detail primary and secondary galvanic cells, fuel cells, electrochemical synthesis and electroplating processes, corrosion: measurements, inhibitors, cathodic and anodic protection, details of metal dissolution reactions, Pourbaix diagrams and purification of waste water from

  2. Dissolution processes

    International Nuclear Information System (INIS)

    Silver, G.L.

    1976-01-01

    This review contains more than 100 observations and 224 references on the dissolution phenomenon. The dissolution processes are grouped into three categories: methods of aqueous attack, fusion methods, and miscellaneous observations on phenomena related to dissolution problems

  3. Renewal processes

    CERN Document Server

    Mitov, Kosto V

    2014-01-01

    This monograph serves as an introductory text to classical renewal theory and some of its applications for graduate students and researchers in mathematics and probability theory. Renewal processes play an important part in modeling many phenomena in insurance, finance, queuing systems, inventory control and other areas. In this book, an overview of univariate renewal theory is given and renewal processes in the non-lattice and lattice case are discussed. A pre-requisite is a basic knowledge of probability theory.
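
    As a small illustration of the renewal processes the book covers, the sketch below (hypothetical code, not from the monograph) counts renewals N(t) up to time t and checks the elementary renewal theorem, N(t)/t → 1/E[X], by simulation with exponential interarrival times.

```python
import random
from itertools import accumulate

def renewal_count(interarrival_times, t):
    """N(t): number of renewal epochs S_1 <= S_2 <= ... occurring by time t."""
    count = 0
    for s in accumulate(interarrival_times):
        if s > t:
            break
        count += 1
    return count

# Deterministic check: unit interarrival times give N(t) = floor(t).
assert renewal_count([1.0] * 10, 4.5) == 4

# Monte Carlo illustration of the elementary renewal theorem:
# with Exp(rate=2) interarrivals, E[X] = 0.5, so N(t)/t should be near 2.
random.seed(0)
t = 10_000.0
xs, total = [], 0.0
while total <= t:
    x = random.expovariate(2.0)
    xs.append(x)
    total += x
rate_estimate = renewal_count(xs, t) / t
```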

  4. Fuel processing

    International Nuclear Information System (INIS)

    Allardice, R.H.

    1990-01-01

    The technical and economic viability of the fast breeder reactor as an electricity generating system depends not only upon the reactor performance but also on a capability to recycle plutonium efficiently, reliably and economically through the reactor and fuel cycle facilities. Thus the fuel cycle is an integral and essential part of the system. Fuel cycle research and development has focused on demonstrating that the challenging technical requirements of processing plutonium fuel could be met and that the sometimes conflicting requirements of the fuel developer, fuel fabricator and fuel reprocessor could be reconciled. Pilot plant operation and development and design studies have established both the technical and economic feasibility of the fuel cycle but scope for further improvement exists through process intensification and flowsheet optimization. These objectives and the increasing processing demands made by the continuing improvement to fuel design and irradiation performance provide an incentive for continuing fuel cycle development work. (author)

  5. Organizing Process

    DEFF Research Database (Denmark)

    Hull Kristensen, Peer; Bojesen, Anders

    This paper invites discussion of the processes of individualization and organizing being carried out under what we might see as an emerging regime of change. The underlying argument is that in certain processes of change, competence becomes questionable at all times. The hazy characteristics of this regime of change are pursued through a discussion of competencies as opposed to qualifications, illustrated by distinct cases from the Danish public sector in the search for repetitive mechanisms. The cases are put into a general perspective by drawing upon experiences from similar change processes in MNCs. The paper concludes by asking whether we can escape from a regime of competence in a world defined by a rhetoric of change and create a more promising world in which doubt and search serve as a strategy for gaining knowledge and professionalism that improve our capability for mutualism.

  6. Welding process

    International Nuclear Information System (INIS)

    Abdul Nassir Ibrahim; Azali Muhammad; Ab. Razak Hamzah; Abd. Aziz Mohamed; Mohamad Pauzi Ismail

    2008-01-01

    The final chapter of this book gives a basic introduction to the welding process. A good radiographer must have some knowledge of welding so that they can judge which kinds of welds must be rejected. Almost all of the exposure techniques mentioned in the earlier chapters are applicable in this field, because an uninspected weld is a critical problem. This chapter therefore discusses the discontinuities that commonly appear; discontinuities that are less important and have little impact when found are not described here. Finally, the decision to accept or reject a weld is based on the code, standard, and specification agreed by both parties, so that the agreed decision is correct and meaningful.

  7. Markov Processes in Image Processing

    Science.gov (United States)

    Petrov, E. P.; Kharina, N. L.

    2018-05-01

    Digital images are used as an information carrier in different sciences and technologies. There is a tendency to increase the number of bits per image pixel for the purpose of obtaining more information. In this paper, some methods of compression and contour detection based on a two-dimensional Markov chain are offered. Increasing the number of bits per pixel allows fine object details to be resolved more precisely, but it significantly complicates image processing. The proposed methods do not concede in efficiency to well-known analogues, but surpass them in processing speed. An image is separated into binary images, and each is processed in parallel, so processing time does not grow as the number of bits per pixel increases. A further advantage of the methods is their low consumption of energy resources: only logical procedures are used and there are no computing operations. The methods can be useful for processing images of any class and purpose in processing systems with limited time and energy resources.
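
    The decomposition described above, splitting a multi-bit image into binary images that can be processed independently, can be sketched as follows. This is an illustrative bit-plane split, not the authors' Markov-chain method, and the function names are invented for the example.

```python
# Illustrative sketch: decompose an image into binary bit-plane images
# (processable independently, in parallel) and recombine them losslessly.

def to_bit_planes(image, bits):
    """Split a 2D image of `bits`-bit pixels into `bits` binary images;
    plane 0 holds each pixel's least significant bit."""
    return [[[(px >> b) & 1 for px in row] for row in image]
            for b in range(bits)]

def from_bit_planes(planes):
    """Recombine binary planes into the original pixel values."""
    rows, cols = len(planes[0]), len(planes[0][0])
    return [[sum(planes[b][r][c] << b for b in range(len(planes)))
             for c in range(cols)] for r in range(rows)]

image = [[0, 3], [5, 7]]          # tiny image with 3-bit pixels
planes = to_bit_planes(image, 3)  # three binary images
assert from_bit_planes(planes) == image  # lossless round trip
```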

  8. LHCb Online event processing and filtering

    Science.gov (United States)

    Alessio, F.; Barandela, C.; Brarda, L.; Frank, M.; Franek, B.; Galli, D.; Gaspar, C.; Herwijnen, E. v.; Jacobsson, R.; Jost, B.; Köstner, S.; Moine, G.; Neufeld, N.; Somogyi, P.; Stoica, R.; Suman, S.

    2008-07-01

    The first level trigger of LHCb accepts one million events per second. After preprocessing in custom FPGA-based boards these events are distributed to a large farm of PC-servers using a high-speed Gigabit Ethernet network. Synchronisation and event management is achieved by the Timing and Trigger system of LHCb. Due to the complex nature of the selection of B-events, which are the main interest of LHCb, a full event-readout is required. Event processing on the servers is parallelised on an event basis. The reduction factor is typically 1/500. The remaining events are forwarded to a formatting layer, where the raw data files are formed and temporarily stored. A small part of the events is also forwarded to a dedicated farm for calibration and monitoring. The files are subsequently shipped to the CERN Tier0 facility for permanent storage and from there to the various Tier1 sites for reconstruction. In parallel files are used by various monitoring and calibration processes running within the LHCb Online system. The entire data-flow is controlled and configured by means of a SCADA system and several databases. After an overview of the LHCb data acquisition and its design principles this paper will emphasize the LHCb event filter system, which is now implemented using the final hardware and will be ready for data-taking for the LHC startup. Control, configuration and security aspects will also be discussed.
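
    The event-level parallelism and roughly 1/500 reduction factor described above can be illustrated schematically. This is a toy sketch of independent per-event trigger decisions, not LHCb code; the accept rule in `filter_event` is purely artificial.

```python
# Schematic sketch of event-level parallel filtering: each event is
# filtered independently, so the work parallelises trivially across a farm.

from concurrent.futures import ThreadPoolExecutor

REDUCTION = 500  # typical reduction factor quoted in the abstract

def filter_event(event_id):
    """Stand-in trigger decision accepting ~1 event in REDUCTION.
    A real filter would run the physics selection on the event data."""
    return event_id % REDUCTION == 0

def run_farm(event_ids, workers=8):
    """Apply the filter to every event in parallel; return the accepted
    events destined for the formatting/storage layer."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        decisions = list(pool.map(filter_event, event_ids))
    return [e for e, keep in zip(event_ids, decisions) if keep]

accepted = run_farm(range(5000))
# 5000 input events at a 1/500 reduction leave 10 accepted events.
```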

  9. LHCb Online event processing and filtering

    International Nuclear Information System (INIS)

    Alessio, F; Barandela, C; Brarda, L; Frank, M; Gaspar, C; Herwijnen, E v; Jacobsson, R; Jost, B; Koestner, S; Moine, G; Neufeld, N; Somogyi, P; Stoica, R; Suman, S; Franek, B; Galli, D

    2008-01-01

    The first level trigger of LHCb accepts one million events per second. After preprocessing in custom FPGA-based boards these events are distributed to a large farm of PC-servers using a high-speed Gigabit Ethernet network. Synchronisation and event management is achieved by the Timing and Trigger system of LHCb. Due to the complex nature of the selection of B-events, which are the main interest of LHCb, a full event-readout is required. Event processing on the servers is parallelised on an event basis. The reduction factor is typically 1/500. The remaining events are forwarded to a formatting layer, where the raw data files are formed and temporarily stored. A small part of the events is also forwarded to a dedicated farm for calibration and monitoring. The files are subsequently shipped to the CERN Tier0 facility for permanent storage and from there to the various Tier1 sites for reconstruction. In parallel files are used by various monitoring and calibration processes running within the LHCb Online system. The entire data-flow is controlled and configured by means of a SCADA system and several databases. After an overview of the LHCb data acquisition and its design principles this paper will emphasize the LHCb event filter system, which is now implemented using the final hardware and will be ready for data-taking for the LHC startup. Control, configuration and security aspects will also be discussed

  10. Film processing

    International Nuclear Information System (INIS)

    Abdul Nassir Ibrahim; Azali Muhammad; Ab. Razak Hamzah; Abd. Aziz Mohamed; Mohamad Pauzi Ismail

    2008-01-01

    Processing is carried out not only to reveal what is in the film but also to produce a high-quality radiograph in which the information gathered truly represents the quality of the object inspected. A good procedure also yields film of sufficient quality to be kept for a long time for reference. This chapter details how the darkroom functions and how it is designed; good film-processing procedure is discussed in detail, from entering the darkroom to leaving it.

  11. Extraction process

    International Nuclear Information System (INIS)

    Rendall, J.S.; Cahalan, M.J.

    1979-01-01

    A process is described for extracting at least two desired constituents from a mineral, using a liquid reagent which produces the constituents, or compounds thereof, in separable form and independently extracting those constituents, or compounds. The process is especially valuable for the extraction of phosphoric acid and metal values from acidulated phosphate rock, the slurry being contacted with selective extractants for phosphoric acid and metal (e.g. uranium) values. In an example, uranium values are oxidized to uranyl form and extracted using an ion exchange resin. (U.K.)

  12. Process simulation

    International Nuclear Information System (INIS)

    Cao, E.G.; Suarez, P.S.; Pantaleon, J.C.

    1984-01-01

    The search for an optimal design of a heavy water plant is carried out by means of a simulation model for the mass and enthalpy balances of the SH2-H2O exchange process. A simplified model was used for the simulation diagram, in which the entire plant is represented by a single tray tower with recycles and heat and mass feeds/extractions. The tower is simulated by the method developed by Tomich, with convergence handled by the algorithm of Broyden. The concluding part of the work centres on setting the design parameters (flow rates, heat exchange rates, number of plates) which give the desired process operating conditions. (author) [es
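
    The convergence step mentioned above relies on Broyden's algorithm. As a generic illustration (not the flowsheet model of the paper), in one dimension Broyden's rank-one Jacobian update reduces to the secant method:

```python
# Minimal 1-D illustration of Broyden-type iteration: the scalar Jacobian
# is updated from successive residuals (the secant slope) instead of being
# recomputed analytically at each step.

def broyden_1d(f, x0, x1, tol=1e-10, max_iter=50):
    """Find a root of f using secant updates of the scalar Jacobian."""
    f0, f1 = f(x0), f(x1)
    for _ in range(max_iter):
        slope = (f1 - f0) / (x1 - x0)  # Broyden update in one dimension
        x0, f0 = x1, f1
        x1 = x1 - f1 / slope
        f1 = f(x1)
        if abs(f1) < tol:
            return x1
    return x1

root = broyden_1d(lambda x: x * x - 2.0, 1.0, 2.0)
# Converges to sqrt(2) ~ 1.41421356...
```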

  13. Processing Proteases

    DEFF Research Database (Denmark)

    Ødum, Anders Sebastian Rosenkrans

    -terminal of the scissile bond, leaving C-terminal fusions to have non-native C-termini after processing. A solution yielding native C-termini would allow novel expression and purification systems for therapeutic proteins and peptides. The peptidyl-Lys metallopeptidase (LysN) of the fungus Armillaria mellea (Am) is one of few known proteases to have substrate specificity for the C-terminal side of the scissile bond. LysN exhibits specificity for lysine, and has primarily been used to complement trypsin in proteomic studies. A working hypothesis during this study was the potential of LysN as a processing protease...

  14. Processing Branches

    DEFF Research Database (Denmark)

    Schindler, Christoph; Tamke, Martin; Tabatabai, Ali

    2014-01-01

    Angled and forked wood, a desired material until the 19th century, was swept away by industrialization and its standardization of processes and materials. Contemporary information technology has the potential for the capturing and recognition of individual geometries through laser scanning...

  15. BENTONITE PROCESSING

    Directory of Open Access Journals (Sweden)

    Anamarija Kutlić

    2012-07-01

    Full Text Available Bentonite has a wide variety of uses. A special use of bentonite, in which its absorbing properties are employed to provide water-tight sealing, is for an underground repository in granites. In this paper, bentonite processing and beneficiation are described.

  16. Purex process

    International Nuclear Information System (INIS)

    Starks, J.B.

    1977-01-01

    The following aspects of the Purex Process are discussed: head end dissolution, first solvent extraction cycle, second plutonium solvent extraction cycle, second uranium solvent extraction cycle, solvent recovery systems, primary recovery column for high activity waste, low activity waste, laboratory waste evaporation, vessel vent system, airflow and filtration, acid recovery unit, fume recovery, and discharges to seepage basin

  17. Innovation process

    DEFF Research Database (Denmark)

    Kolodovski, A.

    2006-01-01

    Purpose of this report: This report was prepared for the RISO team involved in design of the innovation system. The report provides an innovation methodology to establish a common understanding of the process concepts and related terminology. The report does not include RISO- or Denmark-specific cultural, econom...

  18. Processing Determinism

    Science.gov (United States)

    O'Grady, William

    2015-01-01

    I propose that the course of development in first and second language acquisition is shaped by two types of processing pressures--internal efficiency-related factors relevant to easing the burden on working memory and external input-related factors such as frequency of occurrence. In an attempt to document the role of internal factors, I consider…

  19. Shale processing

    Energy Technology Data Exchange (ETDEWEB)

    Hampton, W H

    1928-05-29

    The process of treating bituminiferous solid materials such as shale or the like to obtain valuable products therefrom, which comprises digesting a mixture of such material in comminuted condition with a suitable digestion liquid, such as an oil, recovering products vaporized in the digestion, and separating residual solid matter from the digestion liquid by centrifuging.

  20. Radiation processing

    International Nuclear Information System (INIS)

    Noriah Mod Ali

    2005-01-01

    This chapter covers the basic principles and applications of radiation technology. The section on specific applications briefly discusses the following subtopics: 1) Polymer modification - crosslinking, polymerisation, degradation, grafting; 2) Medical sterilisation; 3) Food irradiation; 4) Environmental protection - waste processing, pollutants treatment

  1. Leaching process

    International Nuclear Information System (INIS)

    Heinen, H.J.; McClelland, G.E.; Lindstrom, R.E.

    1982-01-01

    A gold and uranium ore is heap leached in accordance with the process comprising initial agglomeration of fines in the feed by means of a binding agent and cyanide solution. The lixiviant comprises a compatible mixture of sodium cyanide and sodium bicarbonate

  2. Leaching process

    Energy Technology Data Exchange (ETDEWEB)

    Heinen, H J; McClelland, G E; Lindstrom, R E

    1982-10-18

    A gold and uranium ore is heap leached in accordance with the process comprising initial agglomeration of fines in the feed by means of a binding agent and cyanide solution. The lixiviant comprises a compatible mixture of sodium cyanide and sodium bicarbonate.

  3. Signal Processing

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    Signal processing techniques, extensively used nowadays to maximize the performance of audio and video equipment, have been a key part in the design of hardware and software for high energy physics detectors since pioneering applications in the UA1 experiment at CERN in 1979

  4. Algorithm-structured computer arrays and networks architectures and processes for images, percepts, models, information

    CERN Document Server

    Uhr, Leonard

    1984-01-01

    Computer Science and Applied Mathematics: Algorithm-Structured Computer Arrays and Networks: Architectures and Processes for Images, Percepts, Models, Information examines the parallel-array, pipeline, and other network multi-computers. This book describes and explores arrays and networks, those built, being designed, or proposed. The problems of developing higher-level languages for systems and designing algorithm, program, data flow, and computer structure are also discussed. This text likewise describes several sequences of successively more general attempts to combine the power of arrays wi

  5. Process validation for radiation processing

    International Nuclear Information System (INIS)

    Miller, A.

    1999-01-01

    Process validation concerns the establishment of the irradiation conditions that will lead to the desired changes of the irradiated product. Process validation therefore establishes the link between absorbed dose and the characteristics of the product, such as degree of crosslinking in a polyethylene tube, prolongation of shelf life of a food product, or degree of sterility of the medical device. Detailed international standards are written for the documentation of radiation sterilization, such as EN 552 and ISO 11137, and the steps of process validation that are described in these standards are discussed in this paper. They include material testing for the documentation of the correct functioning of the product, microbiological testing for selection of the minimum required dose and dose mapping for documentation of attainment of the required dose in all parts of the product. The process validation must be maintained by reviews and repeated measurements as necessary. This paper presents recommendations and guidance for the execution of these components of process validation. (author)

  6. Stochastic processes

    CERN Document Server

    Borodin, Andrei N

    2017-01-01

    This book provides a rigorous yet accessible introduction to the theory of stochastic processes. A significant part of the book is devoted to the classic theory of stochastic processes. In turn, it also presents proofs of well-known results, sometimes together with new approaches. Moreover, the book explores topics not previously covered elsewhere, such as distributions of functionals of diffusions stopped at different random times, the Brownian local time, diffusions with jumps, and an invariance principle for random walks and local times. Supported by carefully selected material, the book showcases a wealth of examples that demonstrate how to solve concrete problems by applying theoretical results. It addresses a broad range of applications, focusing on concrete computational techniques rather than on abstract theory. The content presented here is largely self-contained, making it suitable for researchers and graduate students alike.

  7. Offshoring Process

    DEFF Research Database (Denmark)

    Slepniov, Dmitrij; Sørensen, Brian Vejrum; Katayama, Hiroshi

    2011-01-01

    The purpose of this chapter is to contribute to the knowledge on how production offshoring and international operations management vary across cultural contexts. The chapter attempts to shed light on how companies approach the process of offshoring in different cultural contexts. In order... of globalisation. Yet there are clear differences in how offshoring is conducted in Denmark and Japan. The main differences are outlined in a framework and explained employing cultural variables. The findings lead to a number of propositions suggesting that the process of offshoring is not simply a uniform technical-rational calculation of the most efficient organisation of activities across national borders, but is rather specific to the parent companies' national contexts...

  8. Photobiomodulation Process

    Directory of Open Access Journals (Sweden)

    Yang-Yi Xu

    2012-01-01

    Full Text Available Photobiomodulation (PBM) is the modulation of biosystems by laser irradiation or monochromatic light (LI). There is little research on PBM dynamics, although its phenomena and mechanisms have been widely studied. In this paper, PBM is discussed from a dynamic viewpoint. It was found that the primary process of cellular PBM might be its key process, so that the transition rate of cellular molecules can be extended to discuss the dose relationship of PBM. There may be a dose zone in which low-intensity LI (LIL) at different doses has biological effects similar to each other, so that a biological information model of PBM might hold. LIL may self-adaptively modulate a chronic stress until it becomes successful.

  9. Multiphoton processes

    International Nuclear Information System (INIS)

    Manus, C.; Mainfray, G.

    1980-01-01

    The main features of multiphoton processes are described on a somewhat elementary basis. The emphasis is put on multiphoton ionization of atoms, where the influence of resonance effects is illustrated through typical examples. The important role played by the coherence of light is shown to have a very dramatic influence on multiphoton absorption. Different observations concerning molecules, electrons, as well as solid surfaces illustrate the generality of this very nonlinear interaction between light and matter.

  10. Process heat. Triggering the processes

    Energy Technology Data Exchange (ETDEWEB)

    Augsten, Eva

    2012-07-01

    If solar process heat is to find a market, then the decision makers in industrial companies need to be aware that it actually exists. This was one of the main goals of the So-Pro project, which officially drew to a close in April 2012. (orig.)

  11. Speech Processing.

    Science.gov (United States)

    1983-05-01

    The VDE system developed had the capability of recognizing up to 248 separate words in syntactic structures. The two systems described are isolated... Contents include: ...AND SPEAKER RECOGNITION, by M.J. Hunt; ASSESSMENT OF SPEECH SYSTEMS, by R.K. Moore; A SURVEY OF CURRENT EQUIPMENT AND RESEARCH, by J.S. Bridle; ...TECHNOLOGY IN NAVY TRAINING SYSTEMS, by R. Breaux, M. Blind and R. Lynchard; GENERAL REVIEW OF MILITARY APPLICATIONS OF VOICE PROCESSING, by Dr. Bruno

  12. Markov processes

    CERN Document Server

    Kirkwood, James R

    2015-01-01

    Review of ProbabilityShort HistoryReview of Basic Probability DefinitionsSome Common Probability DistributionsProperties of a Probability DistributionProperties of the Expected ValueExpected Value of a Random Variable with Common DistributionsGenerating FunctionsMoment Generating FunctionsExercisesDiscrete-Time, Finite-State Markov ChainsIntroductionNotationTransition MatricesDirected Graphs: Examples of Markov ChainsRandom Walk with Reflecting BoundariesGambler’s RuinEhrenfest ModelCentral Problem of Markov ChainsCondition to Ensure a Unique Equilibrium StateFinding the Equilibrium StateTransient and Recurrent StatesIndicator FunctionsPerron-Frobenius TheoremAbsorbing Markov ChainsMean First Passage TimeMean Recurrence Time and the Equilibrium StateFundamental Matrix for Regular Markov ChainsDividing a Markov Chain into Equivalence ClassesPeriodic Markov ChainsReducible Markov ChainsSummaryExercisesDiscrete-Time, Infinite-State Markov ChainsRenewal ProcessesDelayed Renewal ProcessesEquilibrium State f...
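
    As a small worked example of the book's "Finding the Equilibrium State" topic, the sketch below (a hypothetical two-state chain, not one from the text) finds the stationary distribution of a regular Markov chain by repeatedly applying the transition matrix:

```python
# Power iteration toward the equilibrium state of a regular Markov chain:
# starting from any distribution, repeated steps converge to pi with pi = pi P.

def step(dist, P):
    """One step of the chain: new_dist[j] = sum_i dist[i] * P[i][j]."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

def equilibrium(P, iters=200):
    """Iterate from a uniform start; for a regular chain this converges
    geometrically to the unique stationary distribution."""
    n = len(P)
    dist = [1.0 / n] * n
    for _ in range(iters):
        dist = step(dist, P)
    return dist

# P[i][j] = probability of moving from state i to state j.
P = [[0.9, 0.1],
     [0.5, 0.5]]
pi = equilibrium(P)
# Solving pi = pi P by hand gives pi = (5/6, 1/6), which the iteration matches.
```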

  13. Coking processes

    Energy Technology Data Exchange (ETDEWEB)

    Hiller, H K

    1917-11-20

    A gas suitable for use in containers or motor-vehicles, etc., and consisting mainly of methane, is obtained by distilling at a temperature not exceeding 500°C bastard cannel coal, lignite, wood, peat, shale, etc., in an horizontal or vertical retort, through which the material is continuously fed in a thin layer or column by means of a screw conveyor or the like. Cracking or dissociation of the gaseous products is prevented by introducing into the retort part of the gas which is the result of the process and which is compressed to a pressure of at least 15 atmospheres and allowed to expand into the retort to cool and carry away the gaseous products produced. These are then passed through condensers for extracting liquid hydrocarbons, and other hydrocarbons are extracted by passage through washing-oils. The gas is then compressed by a water-cooled pump to a pressure of 15 atmospheres, whereby a spirit similar to petrol is formed, and a stable gas left which is mainly methane, part of the gas being used to carry out the process described above.

  14. Etherification process

    Science.gov (United States)

    Smith, L.A. Jr.; Hearn, D.; Jones, E.M. Jr.

    1990-08-21

    A liquid phase process is described for oligomerization of C4 and C5 isoolefins or the etherification thereof with C1 to C6 alcohols, wherein the reactants are contacted in a reactor with a fixed bed acid cation exchange resin catalyst at an LHSV of 5 to 20, pressure of 0 to 400 psig, and temperature of 120 to 300°F, wherein the improvement is the operation of the reactor at a pressure to maintain the reaction mixture at its boiling point, whereby at least a portion, but less than all, of the reaction mixture is vaporized. By operating at the boiling point and allowing a portion of the reaction mixture to vaporize, the exothermic heat of reaction is dissipated by the formation of more boil-up and the temperature in the reactor is controlled. 2 figs.

  15. Oligomerization process

    Science.gov (United States)

    Smith, L.A. Jr.; Hearn, D.; Jones, E.M. Jr.

    1991-03-26

    A liquid phase process is described for oligomerization of C4 and C5 isoolefins or the etherification thereof with C1 to C6 alcohols, wherein the reactants are contacted in a reactor with a fixed bed acid cation exchange resin catalyst at an LHSV of 5 to 20, pressure of 0 to 400 psig, and temperature of 120 to 300°F, wherein the improvement is the operation of the reactor at a pressure to maintain the reaction mixture at its boiling point, whereby at least a portion, but less than all, of the reaction mixture is vaporized. By operating at the boiling point and allowing a portion of the reaction mixture to vaporize, the exothermic heat of reaction is dissipated by the formation of more boil-up and the temperature in the reactor is controlled. 2 figures.

  16. Lithospheric processes

    Energy Technology Data Exchange (ETDEWEB)

    Baldridge, W. [and others

    2000-12-01

    The authors used geophysical, geochemical, and numerical modeling to study selected problems related to Earth's lithosphere. We interpreted seismic waves to better characterize the thickness and properties of the crust and lithosphere. In the southwestern US and the Tien Shan, crust of high elevation is dynamically supported above buoyant mantle. In California, mineral fabrics in the mantle correlate with regional strain history. Although plumes of buoyant mantle may explain surface deformation and magmatism, our geochemical work does not support this mechanism for Iberia. Generation and ascent of magmas remains puzzling. Our work in Hawaii constrains the residence of magma beneath Hualalai to be a few hundred to about 1000 years. In the crust, heat drives fluid and mass transport. Numerical modeling yielded robust and accurate predictions of these processes. This work is important fundamental science, and applies to mitigation of volcanic and earthquake hazards, Test Ban Treaties, nuclear waste storage, environmental remediation, and hydrothermal energy.

  17. Lithospheric processes

    International Nuclear Information System (INIS)

    Baldridge, W.S.

    2000-01-01

    The authors used geophysical, geochemical, and numerical modeling to study selected problems related to Earth's lithosphere. We interpreted seismic waves to better characterize the thickness and properties of the crust and lithosphere. In the southwestern US and the Tien Shan, crust of high elevation is dynamically supported above buoyant mantle. In California, mineral fabrics in the mantle correlate with regional strain history. Although plumes of buoyant mantle may explain surface deformation and magmatism, our geochemical work does not support this mechanism for Iberia. Generation and ascent of magmas remains puzzling. Our work in Hawaii constrains the residence of magma beneath Hualalai to be a few hundred to about 1000 years. In the crust, heat drives fluid and mass transport. Numerical modeling yielded robust and accurate predictions of these processes. This work is important fundamental science, and applies to mitigation of volcanic and earthquake hazards, Test Ban Treaties, nuclear waste storage, environmental remediation, and hydrothermal energy

  18. Carbonizing process

    Energy Technology Data Exchange (ETDEWEB)

    1923-11-22

    In the downward distillation of coal, shale, lignite, or the like, the heat is generated by the combustion of liquid or gaseous fuel above the charge, the zone of carbonization thus initiated travelling downwards through the charge. The combustible gases employed are preferably those resulting from the process, but gases such as natural gas may be employed. The charge is in a moistened and pervious state, the lower parts being maintained at a temperature not above 212°F until influenced by contact with the carbonization zone, and steam may be admitted to increase the yield of ammonia. The combustible gases may be supplied with insufficient air so as to impart to them a reducing effect.

  19. WELDING PROCESS

    Science.gov (United States)

    Zambrow, J.; Hausner, H.

    1957-09-24

    A method of joining metal parts for the preparation of relatively long, thin fuel element cores of uranium or alloys thereof for nuclear reactors is described. The process includes the steps of cleaning the surfaces to be joined, placing the surfaces together, and providing between and in contact with them a layer of a compound in finely divided form that is decomposable to metal by heat. The fuel element members are then heated at the contact zone and maintained under pressure during the heating to decompose the compound to metal and sinter the members and the reduced metal together, producing a weld. The preferred class of decomposable compounds is the metal hydrides, such as uranium hydride, which release hydrogen, thus providing a reducing atmosphere in the vicinity of the welding operation.

  20. Processing Disability.

    Science.gov (United States)

    Harris, Jasmine

    2015-01-01

    This Article argues that the practice of holding so many adjudicative proceedings related to disability in private settings (e.g., guardianship, special education due process, civil commitment, and social security) relative to our strong normative presumption of public access to adjudication may cultivate and perpetuate stigma in contravention of the goals of inclusion and enhanced agency set forth in antidiscrimination laws. Descriptively, the law has a complicated history with disability--initially rendering disability invisible; later, underwriting particular narratives of disability synonymous with incapacity; and, in recent history, promoting the full socio-economic visibility of people with disabilities. The Americans with Disabilities Act (ADA), the marquee civil rights legislation for people with disabilities (about to enter its twenty-fifth year), expresses a national approach to disability that recognizes the role of society in its construction, maintenance, and potential remedy. However, the ADA's mission is incomplete. It has not generated the types of interactions between people with disabilities and nondisabled people empirically shown to deconstruct deeply entrenched social stigma. Prescriptively, procedural design can act as an "antistigma agent" to resist and mitigate disability stigma. This Article focuses on one element of institutional design--public access to adjudication--as a potential tool to construct and disseminate counter-narratives of disability. The unique substantive focus in disability adjudication on questions of agency provides a potential public space for the negotiation of nuanced definitions of disability and capacity more reflective of the human condition.

  1. [In process.

    Science.gov (United States)

    Kaasch, Michael; Kaasch, Joachim

-increasing competitiveness which came to a head in an embroiled dispute resulting from differences in scientific and science policy views. In the process a battle was fought over research resources, so that what was at first an apparently personal quarrel affected the course of research promotion at an institutional level in the life sciences in the GDR. Despite several attempts at mediation, old age finally forced the adversaries to put aside their differences.

  2. Data Processing

    Science.gov (United States)

    Grangeat, P.

    A new area of biology has been opened up by nanoscale exploration of the living world. This has been made possible by technological progress, which has provided the tools needed to make devices that can measure things on such length and time scales. In a sense, this is a new window upon the living world, so rich and so diverse. Many of the investigative methods described in this book seek to obtain complementary physical, chemical, and biological data to understand the way it works and the way it is organised. At these length and time scales, only dedicated instrumentation could apprehend the relevant phenomena. There is no way for our senses to observe these things directly. One important field of application is molecular medicine, which aims to explain the mechanisms of life and disease by the presence and quantification of specific molecular entities. This involves combining information about genes, proteins, cells, and organs. This in turn requires the association of instruments for molecular diagnosis, either in vitro, e.g., the microarray or the lab-on-a-chip, or in vivo, e.g., probes for molecular biopsy, and tools for molecular imaging, used to localise molecular information in living organisms in a non-invasive way. These considerations concern both preclinical research for drug design and human medical applications. With the development of DNA and RNA chips [1], genomics has revolutionised investigative methods for cells and cell processes [2,3]. By sequencing the human genome, new ways have been found for understanding the fundamental mechanisms of life [4]. A revolution is currently under way with the analysis of the proteome [5-8], i.e., the complete set of proteins that can be found in some given biological medium, such as the blood plasma. The goal is to characterise certain diseases by recognisable signatures in the proteomic profile, as determined from a blood sample or a biopsy, for example [9-13]. What is at stake is the early detection of

  3. Hydrothermal Processes

    Science.gov (United States)

    German, C. R.; von Damm, K. L.

    2003-12-01

What is Hydrothermal Circulation? Hydrothermal circulation occurs when seawater percolates downward through fractured ocean crust along the volcanic mid-ocean ridge (MOR) system. The seawater is first heated and then undergoes chemical modification through reaction with the host rock as it continues downward, reaching maximum temperatures that can exceed 400 °C. At these temperatures the fluids become extremely buoyant and rise rapidly back to the seafloor where they are expelled into the overlying water column. Seafloor hydrothermal circulation plays a significant role in the cycling of energy and mass between the solid earth and the oceans; the first identification of submarine hydrothermal venting and their accompanying chemosynthetically based communities in the late 1970s remains one of the most exciting discoveries in modern science. The existence of some form of hydrothermal circulation had been predicted almost as soon as the significance of ridges themselves was first recognized, with the emergence of plate tectonic theory. Magma wells up from the Earth's interior along "spreading centers" or "MORs" to produce fresh ocean crust at a rate of ˜20 km3 yr-1, forming new seafloor at a rate of ˜3.3 km2 yr-1 (Parsons, 1981; White et al., 1992). The young oceanic lithosphere formed in this way cools as it moves away from the ridge crest. Although much of this cooling occurs by upward conduction of heat through the lithosphere, early heat-flow studies quickly established that a significant proportion of the total heat flux must also occur via some additional convective process (Figure 1), i.e., through circulation of cold seawater within the upper ocean crust (Anderson and Silbeck, 1981). Figure 1. Oceanic heat flow versus age of ocean crust. Data from the Pacific, Atlantic, and Indian oceans, averaged over 2 Ma intervals (circles) depart from the theoretical cooling curve (solid line) indicating convective cooling of young ocean crust by circulating seawater
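The "theoretical cooling curve" in Figure 1 is, in the standard half-space cooling model (a textbook result, not spelled out in this record), conductive heat flow decaying with crustal age t; the symbols below are the usual ones and are assumptions here:

```latex
% Half-space cooling model: k = thermal conductivity,
% \kappa = thermal diffusivity, T_m = mantle temperature,
% T_0 = seafloor temperature, t = crustal age.
q(t) \;=\; \frac{k\,(T_m - T_0)}{\sqrt{\pi \kappa t}} \;\propto\; t^{-1/2}
```

The hydrothermal contribution is then estimated as the deficit between this conductive prediction and the heat flow actually measured on young crust.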

  4. A System Structure for a VHTR-SI Process Dynamic Simulation Code

    International Nuclear Information System (INIS)

    Chang, Jiwoon; Shin, Youngjoon; Kim, Jihwan; Lee, Kiyoung; Lee, Wonjae; Chang, Jonghwa; Youn, Cheung

    2008-01-01

The VHTR-SI process dynamic simulation code embedded in a mathematical solution engine is an application software system that simulates the dynamic behavior of the VHTR-SI process. The software system also supports a user-friendly graphical user interface (GUI) for user input/output. Structured analysis techniques were developed in the late 1970s by Yourdon, DeMarco, Gane and Sarson for applying a systematic approach to systems analysis. They included the use of data flow diagrams and data modeling and fostered the use of an implementation-independent graphical notation for documentation. In this paper, we present a system structure for the VHTR-SI process dynamic simulation code by using the methodologies of structured analysis

  5. A Framework for Coordination Process into Construction Projects

    Directory of Open Access Journals (Sweden)

    Alaloul Wesam S.

    2016-01-01

Full Text Available The construction industry is marked by high fragmentation, low efficiency, and cost and time overruns in contrast with other industries. These peculiarities are the main roots of the poor performance faced by the industry. Effective coordination is vital to construction project success and mitigates the fragmentation dilemma; however, it is often difficult to achieve and needs an iterative process. Coordination is a core issue in improving performance in construction projects. Relevant studies have addressed the importance and implementation of the coordination process, but not in a framework. This paper proposes a framework for the coordination process in construction projects, as well as its relationship with performance. The objective of the framework is to provide a roadmap for the construction parties to realize operational excellence so that collectively stakeholders can recognize the effect of coordination process application on project performance. The data were obtained from a literature review and structured interviews with five experts. The analysis produced the framework of coordination based on the extensively used procedures for information and data flow between stakeholders.

  6. The Specific Features of design and process engineering in branch of industrial enterprise

    Science.gov (United States)

    Sosedko, V. V.; Yanishevskaya, A. G.

    2017-06-01

Production at an industrial enterprise is organized through well-tuned working mechanisms at each stage of the product's life cycle, from initial design documentation to the finished product and its final disposal. The topic of this article is a mathematical model of the design and process engineering system in a branch of an industrial enterprise, statistical processing of the estimated results of implementing the developed mathematical model in the branch, and a demonstration of the advantages of applying it at this enterprise. During the creation of the model, the flows of information, orders, details and modules among the branch's groups of divisions were classified. Proceeding from the analysis of division activity, data flows, details and documents, a state graph of design and process engineering was constructed, transitions were described, and coefficients were assigned. For each condition of the system in the constructed state graph, the corresponding limiting state probabilities were defined and Kolmogorov's equations were worked out. By integrating the set of Kolmogorov equations, the probability that the specified divisions and production are active is obtained as a function of time at each instant. On the basis of the developed mathematical model of a unified system of design, process engineering and manufacture, and of the state graph, the authors carried out statistical processing of the results of applying the model and showed the advantage of its application at this enterprise. Studies of the loading probability of the branch's services and of third-party contractors (orders received from the branch within a month) were conducted. The developed mathematical model can be applied to determine the probability that divisions and manufacture are active as a function of time at each instant, which makes it possible to keep account of the workload in branches of the enterprise.
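The modelling step described here — limiting state probabilities obtained by integrating Kolmogorov's forward equations over a state graph — can be sketched as follows. The 3-state generator matrix and its rates are invented for illustration and are not taken from the article.

```python
# Sketch: state probabilities pi(t) for a state graph, obtained by
# integrating Kolmogorov's forward equations d(pi)/dt = pi * Q with
# simple Euler steps. The generator matrix Q (rows sum to zero) is a
# hypothetical 3-state example, NOT taken from the article.

def kolmogorov_forward(Q, pi0, t_end, dt=1e-3):
    """Euler integration of d(pi)/dt = pi Q up to time t_end."""
    pi = list(pi0)
    n = len(pi)
    for _ in range(int(t_end / dt)):
        flow = [sum(pi[i] * Q[i][j] for i in range(n)) for j in range(n)]
        pi = [pi[j] + dt * flow[j] for j in range(n)]
    return pi

# Hypothetical transition rates between three conditions of the
# design/engineering system (e.g. idle, designing, manufacturing).
Q = [
    [-0.5,  0.3,  0.2],
    [ 0.4, -0.6,  0.2],
    [ 0.1,  0.5, -0.6],
]

pi_t = kolmogorov_forward(Q, [1.0, 0.0, 0.0], t_end=50.0)
print([round(p, 3) for p in pi_t])  # long-run (limiting) state probabilities
```

For large t the integration settles on the limiting state probabilities the abstract refers to (here ≈ [0.361, 0.389, 0.250] for this made-up Q).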

  7. Multidimensional process discovery

    NARCIS (Netherlands)

    Ribeiro, J.T.S.

    2013-01-01

    Typically represented in event logs, business process data describe the execution of process events over time. Business process intelligence (BPI) techniques such as process mining can be applied to get strategic insight into business processes. Process discovery, conformance checking and

  8. PC image processing

    International Nuclear Information System (INIS)

    Hwa, Mok Jin Il; Am, Ha Jeng Ung

    1995-04-01

This book starts with a summary of digital image processing and personal computers, then covers the classification of personal computer image processing systems, digital image processing, the development of personal computers and image processing, image processing systems, basic methods of image processing such as color image processing and video processing, software and interfaces, computer graphics, video images and video processing, and application cases of image processing such as satellite image processing, high-speed color transformation, and portrait work systems.

  9. A tool to increase information-processing capacity for consumer water meter data

    Directory of Open Access Journals (Sweden)

    Heinz E. Jacobs

    2012-06-01

    Objective: The objective of this research article was to describe the development of Swift, a locally developed software tool for analysing water meter data from an information management perspective, which engineers in the water field generally use, and to assess critically the influence of Swift on published research and industry. This article focuses on water usage and the challenge of data interchange and extraction as issues that various industries face. Method: This article presents the first detailed report on Swift. It uses a detailed knowledge review and presents and summarises the findings chronologically. Results: The water meter data flow path used to be quite simple. The risk of breaches in confidentiality was limited. Technological advances over the years have led to additional knowledge coming from the same water meter readings with subsequent research outputs. However, there are also complicated data flow paths and increased risks. Users have used Swift to analyse more than two million consumers’ water meter readings to date. Studies have culminated in 10 peer-reviewed journal articles using the data. Seven of them were in the last five years. Conclusion: Swift-based data was the basis of various research studies in the past decade. Practical guidelines in the civil engineering fraternity for estimating water use in South Africa have incorporated knowledge from these studies. Developments after 1995 have increased the information processing capacity for water meter data.

  10. AN ADVANCED OXIDATION PROCESS : FENTON PROCESS

    Directory of Open Access Journals (Sweden)

    Engin GÜRTEKİN

    2008-03-01

Full Text Available Biological wastewater treatment is not an effective treatment method if raw wastewater contains toxic and refractory organics. Advanced oxidation processes are applied before or after biological treatment for the detoxification and reclamation of such wastewaters. The advanced oxidation processes are based on the formation of powerful hydroxyl radicals. Among advanced oxidation processes, the Fenton process is one of the most promising methods, because its application is simple and cost effective and the reaction occurs in a short time period. The Fenton process is applied for many different purposes. In this study, the Fenton process was evaluated as an advanced oxidation process in wastewater treatment.
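The hydroxyl radical formation the abstract refers to is the classical Fenton chemistry (textbook stoichiometry, not quoted from this article):

```latex
% Fenton reaction: Fe(II) catalytically decomposes hydrogen peroxide
% into hydroxyl radicals; the second (regeneration) step is much slower.
\mathrm{Fe^{2+} + H_2O_2 \longrightarrow Fe^{3+} + {}^{\bullet}OH + OH^{-}}
\qquad
\mathrm{Fe^{3+} + H_2O_2 \longrightarrow Fe^{2+} + HO_2^{\bullet} + H^{+}}
```

The slow regeneration step is why the Fe2+/H2O2 dosing ratio is a key operating parameter in Fenton treatment.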

11. ASKGene, a system for automated DNA processing - DOI: 10.3395/reciis.v1i2.Sup.100en

    Directory of Open Access Journals (Sweden)

    Eden Cardim

    2007-12-01

Full Text Available Computational resources have become essential for genome project development. Distributed systems managing complex structures integrating graphical user interfaces, expensive data processing, data mining and large databases have been proposed. Most consolidated sequencing laboratories have developed their own bioinformatics solutions. However, a portable and scalable system integrating all these aspects is not yet available to the scientific community. In this report, we present the prototype of such a system in open development at http://sourceforge.net/projects/askgene. It allows for (i) accessibility of data and processes all along the data flow, (ii) data representation and ontology, (iii) workflow tuning, (iv) system architecture and documentation, (v) corporate development, (vi) manual annotation, (vii) bogus data processing, (viii) process parallelization and distribution, and (ix) portability and scalability.

  12. TRAQ I, a CAMAC system for multichannel data acquisition, storage and processing

    International Nuclear Information System (INIS)

    Broad, A.S.; Jordan, C.L.; Kojola, P.H.; Miller, M.

    1983-01-01

Multichannel, high-speed signal sources generate large amounts of data which cannot be handled in real time on the CAMAC dataway. TRAQ I is a modular CAMAC system designed to buffer and process data of this type. The system can acquire data from up to 256 sources (ADCs etc.) and store it in local memory (4 Mbytes). Many different signal sources can be controlled, working in either a histogramming or a sequential mode. The system's data transfer bus is designed to accommodate other modules which can pre- or post-process the data. Pre-processors can either intercept the data flow to memory for data compaction or passively monitor, looking for signal excursions, etc. Post-processors access memory to process and rewrite the data or transmit it to other devices
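The two acquisition modes named above can be illustrated with a toy sketch (the function, mode names, and bin layout are hypothetical, not TRAQ I's actual interface): "sequential" buffers every sample in arrival order, while "histogramming" only increments per-value bin counters — the data-compaction trade-off the abstract describes.

```python
# Toy illustration of sequential vs histogramming acquisition modes.
# Names and bin layout are invented for illustration only.

def acquire(samples, mode, n_bins=16):
    if mode == "sequential":
        return list(samples)            # full time-ordered record
    hist = [0] * n_bins                 # histogramming: counts per bin
    for s in samples:
        hist[min(s, n_bins - 1)] += 1   # overflow goes to the last bin
    return hist

print(acquire([1, 1, 3], "sequential"))        # [1, 1, 3]
print(acquire([1, 1, 3], "histogramming", 4))  # [0, 2, 0, 1]
```

A histogram's size is fixed by the number of bins, not by the event rate, which is what makes it attractive when the dataway cannot keep up with the raw stream.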

  13. Process-driven information management system at a biotech company: concept and implementation.

    Science.gov (United States)

    Gobbi, Alberto; Funeriu, Sandra; Ioannou, John; Wang, Jinyi; Lee, Man-Ling; Palmer, Chris; Bamford, Bob; Hewitt, Robin

    2004-01-01

    While established pharmaceutical companies have chemical information systems in place to manage their compounds and the associated data, new startup companies need to implement these systems from scratch. Decisions made early in the design phase usually have long lasting effects on the expandability, maintenance effort, and costs associated with the information management system. Careful analysis of work and data flows, both inter- and intradepartmental, and identification of existing dependencies between activities are important. This knowledge is required to implement an information management system, which enables the research community to work efficiently by avoiding redundant registration and processing of data and by timely provision of the data whenever needed. This paper first presents the workflows existing at Anadys, then ARISE, the research information management system developed in-house at Anadys. ARISE was designed to support the preclinical drug discovery process and covers compound registration, analytical quality control, inventory management, high-throughput screening, lower throughput screening, and data reporting.

  14. Management of processes of electrochemical dimensional processing

    Science.gov (United States)

    Akhmetov, I. D.; Zakirova, A. R.; Sadykov, Z. B.

    2017-09-01

In different industries, many high-precision parts are produced from hard-to-machine, scarce materials. Forming such parts is possible only with non-contact processing, or with a minimum of applied force, and is achievable by the use, for example, of electrochemical processing. At the present stage of development of metal-working processes, the management of electrochemical machining and its automation are important issues. This article provides some indicators and factors of the electrochemical machining process.

  15. Extensible packet processing architecture

    Science.gov (United States)

    Robertson, Perry J.; Hamlet, Jason R.; Pierson, Lyndon G.; Olsberg, Ronald R.; Chun, Guy D.

    2013-08-20

A technique for distributed packet processing includes sequentially passing packets associated with packet flows between a plurality of processing engines along a flow-through data bus linking the plurality of processing engines in series. At least one packet within a given packet flow is marked by a given processing engine to signify to the other processing engines that the given processing engine has claimed the given packet flow for processing. A processing function is applied to each of the packet flows within the processing engines and the processed packets are output on a time-shared, arbitrated data bus coupled to the plurality of processing engines.
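The flow-claiming idea can be sketched as follows: packets pass every engine in series, and the first engine with spare capacity marks an unclaimed flow as its own, after which only that engine processes packets of that flow. The claim rule and capacity limit here are invented for illustration, not taken from the patent text.

```python
# Minimal sketch of serial flow claiming. All names and the
# capacity-based claim rule are hypothetical.

class Engine:
    def __init__(self, name, capacity=1):
        self.name = name
        self.capacity = capacity
        self.claimed = set()

    def handle(self, flow_id, claims):
        """Claim the flow if unclaimed and capacity allows; report ownership."""
        if claims.get(flow_id) is None and len(self.claimed) < self.capacity:
            claims[flow_id] = self.name       # mark: "this flow is mine"
            self.claimed.add(flow_id)
        return claims.get(flow_id) == self.name

def run(packets, engines):
    claims, output_bus = {}, []               # shared claim table, output bus
    for flow_id, payload in packets:          # packets pass engines in series
        for eng in engines:
            if eng.handle(flow_id, claims):
                output_bus.append((eng.name, flow_id, payload.upper()))
                break                         # the owning engine processed it
    return output_bus

engines = [Engine("E0"), Engine("E1")]
packets = [("flowA", "p1"), ("flowB", "p2"), ("flowA", "p3")]
print(run(packets, engines))
```

Once E0 is at capacity with flowA, flowB falls through to E1, and all later flowA packets keep going to E0 — the per-flow affinity the marking scheme provides.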

  16. Experiment 2-B data processing system

    International Nuclear Information System (INIS)

    Price, H.; Svrcek, F.

    1976-08-01

    A new set of programs has been written for the analysis of data from the Fermilab 30-inch bubble chamber--wide gap spark chamber hybrid system. This report describes those programs, provides operating instructions, and indicates how they fit into the overall data flow

  17. Image processing system for the measurement of timber truck loads

    Science.gov (United States)

    Carvalho, Fernando D.; Correia, Bento A. B.; Davies, Roger; Rodrigues, Fernando C.; Freitas, Jose C. A.

    1993-01-01

    The paper industry uses wood as its raw material. To know the quantity of wood in the pile of sawn tree trunks, every truck load entering the plant is measured to determine its volume. The objective of this procedure is to know the solid volume of wood stocked in the plant. Weighing the tree trunks has its own problems, due to their high capacity for absorbing water. Image processing techniques were used to evaluate the volume of a truck load of logs of wood. The system is based on a PC equipped with an image processing board using data flow processors. Three cameras allow image acquisition of the sides and rear of the truck. The lateral images contain information about the sectional area of the logs, and the rear image contains information about the length of the logs. The machine vision system and the implemented algorithms are described. The results being obtained with the industrial prototype that is now installed in a paper mill are also presented.
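The measurement principle described above reduces to a simple product: the side-view cameras yield the load's cross-sectional area, the rear view yields log length, and solid wood volume is their product scaled by a packing (air-gap) factor. The function name and all numbers below are hypothetical.

```python
# Back-of-envelope sketch of the truck-load volume estimate.
# All names and numbers are illustrative, not from the paper.

def load_volume(side_area_m2, log_length_m, packing=0.65):
    """Estimated solid wood volume in cubic metres."""
    return side_area_m2 * log_length_m * packing

print(load_volume(7.5, 5.0))  # hypothetical 7.5 m^2 section, 5 m logs
```

In practice the packing factor (the fraction of the bounding volume that is actually wood) is the hard part, which is why the vision system measures the log cross-sections rather than just the outline of the stack.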

  18. BPMN process views construction

    NARCIS (Netherlands)

    Yongchareon, S.; Liu, Chengfei; Zhao, X.; Kowalkiewicz, M.; Kitagawa, H.; Ishikawa, Y.

    2010-01-01

    Process view technology is catching more attentions in modern business process management, as it enables the customisation of business process representation. This capability helps improve the privacy protection, authority control, flexible display, etc., in business process modelling. One of

  19. Silicon integrated circuit process

    International Nuclear Information System (INIS)

    Lee, Jong Duck

    1985-12-01

This book introduces the silicon integrated circuit process. It is composed of seven parts: the oxidation process; the diffusion process; the ion implantation process, covering ion implantation equipment, damage, annealing, and the influence on the manufacture of integrated circuits and devices; the chemical vapor deposition process, such as silicon epitaxy, LPCVD and PECVD; the photolithography process, including sensitizers, spin, hard bake, reflection of light, process-related problems, and infrared bake; etching, covering wet etch, dry etch, special etch and etching problems; and the metal process, covering metal-silicon connection, the aluminum process, the reliability of aluminum, and the test process.

  20. Silicon integrated circuit process

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jong Duck

    1985-12-15

This book introduces the silicon integrated circuit process. It is composed of seven parts: the oxidation process; the diffusion process; the ion implantation process, covering ion implantation equipment, damage, annealing, and the influence on the manufacture of integrated circuits and devices; the chemical vapor deposition process, such as silicon epitaxy, LPCVD and PECVD; the photolithography process, including sensitizers, spin, hard bake, reflection of light, process-related problems, and infrared bake; etching, covering wet etch, dry etch, special etch and etching problems; and the metal process, covering metal-silicon connection, the aluminum process, the reliability of aluminum, and the test process.

  1. The Newest Laser Processing

    International Nuclear Information System (INIS)

    Lee, Baek Yeon

    2007-01-01

This book covers laser processing: laser principles, laser history, laser beam properties, and kinds of lasers; the foundations of laser processing, such as laser oscillation, the characteristics of laser processing, and lasers for processing and their characteristics; laser hole drilling, including its basic concepts, each material, and hole drilling of metal materials; laser cutting and its practice; laser welding; laser surface hardening; application cases of special processing; and laser safety measures.

  2. Minimal and careful processing

    OpenAIRE

    Nielsen, Thorkild

    2004-01-01

    In several standards, guidelines and publications, organic food processing is strongly associated with "minimal processing" and "careful processing". The term "minimal processing" is nowadays often used in the general food processing industry and described in literature. The term "careful processing" is used more specifically within organic food processing but is not yet clearly defined. The concept of carefulness seems to fit very well with the processing of organic foods, especially if it i...

  3. Thinning spatial point processes into Poisson processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Schoenberg, Frederic Paik

    2010-01-01

In this paper we describe methods for randomly thinning certain classes of spatial point processes. In the case of a Markov point process, the proposed method involves a dependent thinning of a spatial birth-and-death process, where clans of ancestors associated with the original points are identified, and where we simulate backwards and forwards in order to obtain the thinned process. In the case of a Cox process, a simple independent thinning technique is proposed. In both cases, the thinning results in a Poisson process if and only if the true Papangelou conditional intensity is used, and thus can be used as a graphical exploratory tool for inspecting the goodness-of-fit of a spatial point process model. Several examples, including clustered and inhibitive point processes, are considered.
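The independent-thinning case can be illustrated numerically: retaining each point of a rate-λ Poisson process independently with probability p yields a rate-pλ Poisson process. This sketch shows only that classical special case, not the paper's dependent, Papangelou-intensity-based thinning for Markov processes.

```python
# Classical independent thinning of a homogeneous Poisson process on a
# line segment: keep each point with probability p; the result is again
# Poisson, with rate p * lam.
import random

def simulate_poisson(lam, length=1.0):
    """Points of a homogeneous Poisson process on [0, length]."""
    t, pts = 0.0, []
    while True:
        t += random.expovariate(lam)   # exponential inter-arrival times
        if t > length:
            return pts
        pts.append(t)

def independent_thin(points, p):
    """Retain each point independently with probability p."""
    return [x for x in points if random.random() < p]

random.seed(42)
pts = simulate_poisson(10_000.0)
kept = independent_thin(pts, 0.3)
print(len(pts), len(kept))  # counts concentrate near 10000 and 3000
```

The papers' diagnostic use follows from the "if and only if" statement: thinning with the wrong retention probabilities produces visible clustering or inhibition in the residual pattern instead of Poisson behaviour.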

  4. Thinning spatial point processes into Poisson processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Schoenberg, Frederic Paik

This paper describes methods for randomly thinning certain classes of spatial point processes. In the case of a Markov point process, the proposed method involves a dependent thinning of a spatial birth-and-death process, where clans of ancestors associated with the original points are identified, and where one simulates backwards and forwards in order to obtain the thinned process. In the case of a Cox process, a simple independent thinning technique is proposed. In both cases, the thinning results in a Poisson process if and only if the true Papangelou conditional intensity is used, and thus can be used as a diagnostic for assessing the goodness-of-fit of a spatial point process model. Several examples, including clustered and inhibitive point processes, are considered.

  5. A process insight repository supporting process optimization

    OpenAIRE

    Vetlugin, Andrey

    2012-01-01

Existing solutions for the analysis and optimization of manufacturing processes, such as online analytical processing or statistical calculations, have shortcomings that limit continuous process improvement. In particular, they lack means of storing and integrating the results of analysis, so the valuable information that could be used for process optimization is used only once and then discarded. The goal of the Advanced Manufacturing Analytics (AdMA) research project is to design an integrate...

  6. Process mining: making knowledge discovery process centric

    NARCIS (Netherlands)

    Aalst, van der W.M.P.

    2011-01-01

    Recently, the Task Force on Process Mining released the Process Mining Manifesto. The manifesto is supported by 53 organizations and 77 process mining experts contributed to it. The active contributions from end-users, tool vendors, consultants, analysts, and researchers illustrate the growing

  7. Business process model repositories : efficient process retrieval

    NARCIS (Netherlands)

    Yan, Z.

    2012-01-01

    As organizations increasingly work in process-oriented manner, the number of business process models that they develop and have to maintain increases. As a consequence, it has become common for organizations to have collections of hundreds or even thousands of business process models. When a

  8. Uranium enrichment. Enrichment processes

    International Nuclear Information System (INIS)

    Alexandre, M.; Quaegebeur, J.P.

    2009-01-01

Despite the remarkable progress made in the diversity and efficiency of the different uranium enrichment processes, only two industrial processes remain today which satisfy all enriched uranium needs: gaseous diffusion and centrifugation. This article describes both processes and some others still at the demonstration or laboratory stage of development: 1 - general considerations; 2 - gaseous diffusion: physical principles, implementation, utilisation in the world; 3 - centrifugation: principles, elementary separation factor, flows inside a centrifuge, modeling of separation efficiencies, mechanical design, types of industrial centrifuges, realisation of cascades, main characteristics of the centrifugation process; 4 - aerodynamic processes: vortex process, nozzle process; 5 - chemical exchange separation processes: Japanese ASAHI process, French CHEMEX process; 6 - laser-based processes: SILVA process, SILMO process; 7 - electromagnetic and ionic processes: mass spectrometer and calutron, ion cyclotron resonance, rotating plasmas; 8 - thermal diffusion; 9 - conclusion. (J.S.)
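For context, the "elementary separation factor" named in the outline is, for an ideal gaseous-diffusion stage, fixed by the effusion-rate ratio of the two UF6 isotopologues (standard textbook value, not quoted from this article):

```latex
% Ideal single-stage separation factor for gaseous diffusion of UF6:
\alpha_0 \;=\; \sqrt{\frac{M_{^{238}\mathrm{UF}_6}}{M_{^{235}\mathrm{UF}_6}}}
\;=\; \sqrt{\frac{352}{349}} \;\approx\; 1.0043
```

This tiny per-stage factor is why gaseous diffusion needs cascades of over a thousand stages, whereas a centrifuge's separation factor grows with peripheral speed and requires far fewer stages.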

  9. The permanental process

    DEFF Research Database (Denmark)

    McCullagh, Peter; Møller, Jesper

    2006-01-01

We extend the boson process first to a large class of Cox processes and second to an even larger class of infinitely divisible point processes. Density and moment results are studied in detail. These results are obtained in closed form as weighted permanents, so the extension is called a permanental process. Temporal extensions and a particularly tractable case of the permanental process are also studied. Extensions of the fermion process along similar lines, leading to so-called determinantal processes, are discussed.

  10. The new inter process communication middle-ware for the ATLAS Trigger and Data Acquisition system

    CERN Document Server

    Kolos, Serguei; The ATLAS collaboration

    2016-01-01

The ATLAS Trigger & Data Acquisition (TDAQ) project was started almost twenty years ago with the aim of providing a scalable distributed data collection system for the experiment. While the software dealing with the physics data flow was implemented by directly using low-level communication protocols, like TCP and UDP, the control and monitoring infrastructure services for the TDAQ system were implemented on top of the CORBA communication middle-ware. CORBA provides a high-level object-oriented abstraction for inter-process communication, hiding communication complexity from the developers. This approach speeds up and simplifies development of communication services but incurs some extra cost in terms of performance and resource overhead. Our experience of using CORBA for control and monitoring data exchange in the distributed TDAQ system was very successful, mostly due to the outstanding quality of the CORBA brokers which have been used in the project: omniORB for C++ and JacORB for Java. However, du...

  11. The permanent process

    DEFF Research Database (Denmark)

    Møller, Jesper; McCullagh, Peter

We extend the boson process first to a large class of Cox processes and second to an even larger class of infinitely divisible point processes. Density and moment results are studied in detail. These results are obtained in closed form as weighted permanents, so the extension is called a permanent process. Temporal extensions and a particularly tractable case of the permanent process are also studied. Extensions of the fermion process along similar lines, leading to so-called determinant processes, are discussed at the end. While the permanent process is attractive, the determinant process...

  12. Explosive processes in nucleosynthesis

    International Nuclear Information System (INIS)

    Boyd, R.N.

    2002-01-01

    There are many explosive processes in nucleosynthesis: big bang nucleosynthesis, the rp-process, the γ-process, the ν-process, and the r-process. However, I will discuss just the rp-process and the r-process in detail, primarily because both seem to have been very active research areas of late, and because they have great potential for studies with radioactive nuclear beams. I will also discuss briefly the γ-process because of its inevitability in conjunction with the rp-process. (orig.)

  13. Process Intensification: A Perspective on Process Synthesis

    DEFF Research Database (Denmark)

    Lutze, Philip; Gani, Rafiqul; Woodley, John

    2010-01-01

In recent years, process intensification (PI) has attracted considerable academic interest as a potential means for process improvement, to meet the increasing demands for sustainable production. A variety of intensified operations developed in academia and industry creates a large number of options to potentially improve the process, but to identify the set of feasible solutions for PI in which the optimal can be found takes considerable resources. Hence, a process synthesis tool to achieve PI would potentially assist in the generation and evaluation of PI options. Currently, several process design tools with a clear focus on specific PI tasks exist. Therefore, in this paper, the concept of a general systematic framework for synthesis and design of PI options in hierarchical steps through analyzing an existing process, generating PI options in a superstructure and evaluating intensified

  14. Mining processes in dentistry

    NARCIS (Netherlands)

    Mans, R.S.; Reijers, H.A.; van Genuchten, M.; Wismeijer, D.

    2012-01-01

    Business processes in dentistry are quickly evolving towards "digital dentistry". This means that many steps in the dental process will increasingly deal with computerized information or computerized half products. A complicating factor in the improvement of process performance in dentistry,

  15. Realtime Color Stereovision Processing

    National Research Council Canada - National Science Library

    Formwalt, Bryon

    2000-01-01

    .... This research takes a step forward in real time machine vision processing. It investigates techniques for implementing a real time stereovision processing system using two miniature color cameras...

  16. Business Process Customization using Process Merging Techniques

    NARCIS (Netherlands)

    Bulanov, Pavel; Lazovik, Alexander; Aiello, Marco

    2012-01-01

    One of the important applications of service composition techniques lies in the field of business process management. Essentially, a business process can be considered as a composition of services, which is usually prepared by domain experts, and many tasks still have to be performed manually. These

  17. From Process Understanding to Process Control

    NARCIS (Netherlands)

    Streefland, M.

    2010-01-01

    A licensed pharmaceutical process is required to be executed within the validated ranges throughout the lifetime of product manufacturing. Changes to the process usually require the manufacturer to demonstrate that the safety and efficacy of the product remains unchanged. Recent changes in the

  18. Idaho Chemical Processing Plant Process Efficiency improvements

    International Nuclear Information System (INIS)

    Griebenow, B.

    1996-03-01

    In response to decreasing funding levels available to support activities at the Idaho Chemical Processing Plant (ICPP) and a desire to be cost competitive, the Department of Energy Idaho Operations Office (DOE-ID) and Lockheed Idaho Technologies Company have increased their emphasis on cost-saving measures. The ICPP Effectiveness Improvement Initiative involves many activities to improve cost effectiveness and competitiveness. This report documents the methodology and results of one of those cost-cutting measures, the Process Efficiency Improvement Activity. The Process Efficiency Improvement Activity performed a systematic review of major work processes at the ICPP to increase productivity and to identify nonvalue-added requirements. A two-phase approach was selected for the activity to allow for near-term implementation of relatively easy process modifications in the first phase while obtaining long-term continuous improvement in the second phase and beyond. Phase I of the initiative included a concentrated review of processes that had a high potential for cost savings with the intent of realizing savings in Fiscal Year 1996 (FY-96). Phase II consists of implementing long-term strategies too complex for Phase I implementation and evaluation of processes not targeted for Phase I review. The Phase II effort is targeted for realizing cost savings in FY-97 and beyond

  19. PGAS in-memory data processing for the Processing Unit of the Upgraded Electronics of the Tile Calorimeter of the ATLAS Detector

    International Nuclear Information System (INIS)

    Ohene-Kwofie, Daniel; Otoo, Ekow

    2015-01-01

    The ATLAS detector, operated at the Large Hadron Collider (LHC), records proton-proton collisions at CERN every 50 ns, resulting in a sustained data flow of up to PB/s. The upgraded Tile Calorimeter of the ATLAS experiment will sustain about 5 PB/s of digital throughput. These massive data rates require extremely fast data capture and processing. Although there has been a steady increase in the processing speed of CPUs/GPGPUs assembled for high performance computing, the rate of data input and output, even under parallel I/O, has not kept up with the general increase in computing speeds. The problem then is whether one can implement an I/O subsystem infrastructure capable of meeting the computational speeds of the advanced computing systems at the petascale and exascale level. We propose a system architecture that leverages the Partitioned Global Address Space (PGAS) model of computing to maintain an in-memory data-store for the Processing Unit (PU) of the upgraded electronics of the Tile Calorimeter, which is proposed to be used as a high throughput general purpose co-processor to the sROD of the upgraded Tile Calorimeter. The physical memory of the PUs is aggregated into a large global logical address space using RDMA-capable interconnects such as PCI-Express to enhance data processing throughput. (paper)
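    The PGAS idea in the abstract above, one flat logical address space whose storage is block-distributed over the local memories of many processing units, can be illustrated with a toy address translator. The class name, the round-robin block layout, and all parameter values are illustrative assumptions, not details of the cited system; a real implementation would issue one-sided RDMA reads and writes over the interconnect instead of touching Python dictionaries:

    ```python
    class PGASStore:
        """Toy partitioned global address space: one flat logical array whose
        storage is block-distributed round-robin over several nodes' memories."""

        def __init__(self, n_nodes: int, block_size: int):
            self.block_size = block_size
            self.nodes = [dict() for _ in range(n_nodes)]  # stand-ins for local memory

        def locate(self, global_addr: int):
            """Translate a global address into (node, local offset)."""
            block = global_addr // self.block_size
            node = block % len(self.nodes)                 # round-robin block placement
            local = (block // len(self.nodes)) * self.block_size \
                    + global_addr % self.block_size
            return node, local

        def put(self, global_addr: int, value):
            node, offset = self.locate(global_addr)
            self.nodes[node][offset] = value               # would be a one-sided RDMA write

        def get(self, global_addr: int):
            node, offset = self.locate(global_addr)
            return self.nodes[node][offset]                # would be a one-sided RDMA read

    store = PGASStore(n_nodes=4, block_size=1024)
    store.put(5000, b"event-frame")
    print(store.locate(5000))  # (0, 1928): block 4 lives on node 0, second local block
    ```

    The point of the translation step is that any PU can compute the owner of any address locally, so reads and writes need no coordination with the owning node's CPU.
    
    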

  20. Making process improvement 'stick'.

    Science.gov (United States)

    Studer, Quint

    2014-06-01

    To sustain gains from a process improvement initiative, healthcare organizations should: Explain to staff why a process improvement initiative is needed. Encourage leaders within the organization to champion the process improvement, and tie their evaluations to its outcomes. Ensure that both leaders and employees have the skills to help sustain the sought-after process improvements.

  1. Fractional Poisson process (II)

    International Nuclear Information System (INIS)

    Wang Xiaotian; Wen Zhixiong; Zhang Shiying

    2006-01-01

    In this paper, we propose a stochastic process W_H(t), with H in (1/2, 1], which we call a fractional Poisson process. The process W_H(t) is self-similar in the wide sense, displays long-range dependence, and has a fatter tail than a Gaussian process. In addition, it converges to fractional Brownian motion in distribution

  2. Genetic process mining

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Alves De Medeiros, A.K.; Weijters, A.J.M.M.; Ciardo, G.; Darondeau, P.

    2005-01-01

    The topic of process mining has attracted the attention of both researchers and tool vendors in the Business Process Management (BPM) space. The goal of process mining is to discover process models from event logs, i.e., events logged by some information system are used to extract information about

  3. Business process quality management

    NARCIS (Netherlands)

    Reijers, H.A.; Mendling, J.; Recker, J.; Brocke, vom J.; Rosemann, M.

    2010-01-01

    Process modeling is a central element in any approach to Business Process Management (BPM). However, what hinders both practitioners and academics is the lack of support for assessing the quality of process models, let alone realizing high quality process models. Existing frameworks are

  4. Shell coal gasification process

    Energy Technology Data Exchange (ETDEWEB)

    Hennekes, B. [Shell Global Solutions (US) Inc. (United States). Technology Marketing

    2002-07-01

    The presentation, on which 17 slides/overheads are included in the papers, explained the principles of the Shell coal gasification process and the methods incorporated for control of sulfur dioxide, nitrogen oxides, particulates and mercury. The economics of the process were discussed. The differences between gasification and burning, and the differences between the Shell process and other processes were discussed.

  5. Exploring processes and deviations

    NARCIS (Netherlands)

    Leemans, S.J.J.; Fahland, D.; Aalst, van der W.M.P.; Fournier, F.; Mendling, J.

    2015-01-01

    In process mining, one of the main challenges is to discover a process model, while balancing several quality criteria. This often requires repeatedly setting parameters, discovering a map and evaluating it, which we refer to as process exploration. Commercial process mining tools like Disco,

  6. Distributed genetic process mining

    NARCIS (Netherlands)

    Bratosin, C.C.; Sidorova, N.; Aalst, van der W.M.P.

    2010-01-01

    Process mining aims at discovering process models from data logs in order to offer insight into the real use of information systems. Most of the existing process mining algorithms fail to discover complex constructs or have problems dealing with noise and infrequent behavior. The genetic process

  7. Dosimetry and process control for radiation processing

    International Nuclear Information System (INIS)

    Mod Ali, N.

    2002-01-01

    Complete text of publication follows. Accurate radiation dosimetry can provide quality assurance in radiation processing. Considerable experience in dosimetry at the SSDL-MINT has necessitated the development of methods for making measurements at gamma plants traceable to the national standard. This involves the establishment of proper calibration procedures and the selection of appropriate transfer systems/techniques to assure adequate traceability to a primary radiation standard. The effort forms the basis for irradiation process control, the legal approval of the process by the public health authorities (medical product sterilization and food preservation), and the safety and acceptance of the product

  8. Thin film processes

    CERN Document Server

    Vossen, John L

    1978-01-01

    Remarkable advances have been made in recent years in the science and technology of thin film processes for deposition and etching. It is the purpose of this book to bring together tutorial reviews of selected film deposition and etching processes from a process viewpoint. Emphasis is placed on the practical use of the processes to provide working guidelines for their implementation, a guide to the literature, and an overview of each process.

  9. A Campbell random process

    International Nuclear Information System (INIS)

    Reuss, J.D.; Misguich, J.H.

    1993-02-01

    The Campbell process is a stationary random process which can have various correlation functions, according to the choice of an elementary response function. The statistical properties of this process are presented. A numerical algorithm and a subroutine for generating such a process are built up and tested, for the physically interesting case of a Campbell process with Gaussian correlations. The (non-Gaussian) probability distribution appears to be similar to the Gamma distribution
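    A Campbell process is a superposition of copies of the elementary response function centred at Poisson arrival times, so a generator of the kind the abstract describes can be sketched directly. Function names, the Gaussian response, and all parameter values below are illustrative assumptions, not the cited subroutine:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def campbell_process(rate, response, t_grid, t_max):
        """Superpose one elementary response per Poisson arrival in [0, t_max]."""
        n_pulses = rng.poisson(rate * t_max)            # Poisson number of pulses
        arrivals = rng.uniform(0.0, t_max, n_pulses)    # given n, arrivals are i.i.d. uniform
        return sum(response(t_grid - t_i) for t_i in arrivals)

    # A Gaussian-shaped elementary response gives Gaussian correlations
    tau = 1.0
    h = lambda t: np.exp(-t**2 / (2.0 * tau**2))

    t = np.linspace(20.0, 80.0, 500)                    # sample away from the window edges
    x = campbell_process(rate=5.0, response=h, t_grid=t, t_max=100.0)

    # Campbell's theorem: E[X] = rate * integral(h) = rate * tau * sqrt(2*pi), ~12.5 here
    print(f"sample mean = {x.mean():.2f}")
    ```

    For a high pulse rate the distribution approaches Gaussian; at low rates the skewed, Gamma-like shape the abstract mentions appears.
    
    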

  10. Business process transformation the process tangram framework

    CERN Document Server

    Sharma, Chitra

    2015-01-01

    This book presents a framework through transformation and explains  how business goals can be translated into realistic plans that are tangible and yield real results in terms of the top line and the bottom line. Process Transformation is like a tangram puzzle, which has multiple solutions yet is essentially composed of seven 'tans' that hold it together. Based on practical experience and intensive research into existing material, 'Process Tangram' is a simple yet powerful framework that proposes Process Transformation as a program. The seven 'tans' are: the transformation program itself, triggers, goals, tools and techniques, culture, communication and success factors. With its segregation into tans and division into core elements, this framework makes it possible to use 'pick and choose' to quickly and easily map an organization's specific requirements. Change management and process modeling are covered in detail. In addition, the book approaches managed services as a model of service delivery, which it ex...

  11. Dry process potentials

    International Nuclear Information System (INIS)

    Faugeras, P.

    1997-01-01

    Various dry processes have been studied and more or less developed, in particular to reduce waste quantities, but none of them has replaced the PUREX process, for reasons ranging from policy errors and inappropriate demonstration examples to too-late development, although realistic and efficient dry processes, such as fluoride selective-volatility based processes, have been demonstrated in France (CLOVIS, ATILA) and would be ten times cheaper than the PUREX process. Dry processes could regain interest in case of a nuclear revival (following global warming fears) or thermal waste over-production. In the near future, dry processes could be introduced as a complement to the PUREX process, especially at the end of the process cycle, for more efficient recycling and safer storage (inactivation)

  12. Honing process optimization algorithms

    Science.gov (United States)

    Kadyrov, Ramil R.; Charikov, Pavel N.; Pryanichnikova, Valeria V.

    2018-03-01

    This article considers the relevance of honing processes for creating high-quality mechanical engineering products. The features of the honing process are revealed and such important concepts as the task for optimization of honing operations, the optimal structure of the honing working cycles, stepped and stepless honing cycles, simulation of processing and its purpose are emphasized. It is noted that the reliability of the mathematical model determines the quality parameters of the honing process control. An algorithm for continuous control of the honing process is proposed. The process model reliably describes the machining of a workpiece in a sufficiently wide area and can be used to operate the CNC machine CC743.

  13. Process automation system for integration and operation of Large Volume Plasma Device

    International Nuclear Information System (INIS)

    Sugandhi, R.; Srivastava, P.K.; Sanyasi, A.K.; Srivastav, Prabhakar; Awasthi, L.M.; Mattoo, S.K.

    2016-01-01

    Highlights: • Analysis and design of process automation system for Large Volume Plasma Device (LVPD). • Data flow modeling for process model development. • Modbus based data communication and interfacing. • Interface software development for subsystem control in LabVIEW. - Abstract: The Large Volume Plasma Device (LVPD) has been successfully contributing towards understanding of the plasma turbulence driven by the Electron Temperature Gradient (ETG), considered a major contributor to plasma loss in fusion devices. The large size of the device imposes certain difficulties in its operation, such as access to the diagnostics, manual control of subsystems, monitoring of a large number of signals, etc. To achieve integrated operation of the machine, automation is essential for enhanced performance and operational efficiency. Recently, the machine has been undergoing a major upgrade for new physics experiments. The new operation and control system consists of the following: (1) a PXIe based fast data acquisition system for the equipped diagnostics; (2) a Modbus based Process Automation System (PAS) for the subsystem controls; and (3) a Data Utilization System (DUS) for efficient storage, processing and retrieval of the acquired data. In the ongoing development, a data flow model of the machine’s operation has been developed. As a proof of concept, the following two subsystems have been successfully integrated: (1) the Filament Power Supply (FPS) for the heating of the W-filament based plasma source and (2) the Probe Positioning System (PPS) for control of 12 linear probe drives over a travel length of 100 cm. The process model of the vacuum production system has been prepared and validated against acquired pressure data. In the next upgrade, all the subsystems of the machine will be integrated in a systematic manner. The automation backbone is based on a 4-wire multi-drop serial interface (RS485) using the Modbus communication protocol. Software is developed on the LabVIEW platform using

  14. Process automation system for integration and operation of Large Volume Plasma Device

    Energy Technology Data Exchange (ETDEWEB)

    Sugandhi, R., E-mail: ritesh@ipr.res.in; Srivastava, P.K.; Sanyasi, A.K.; Srivastav, Prabhakar; Awasthi, L.M.; Mattoo, S.K.

    2016-11-15

    Highlights: • Analysis and design of process automation system for Large Volume Plasma Device (LVPD). • Data flow modeling for process model development. • Modbus based data communication and interfacing. • Interface software development for subsystem control in LabVIEW. - Abstract: The Large Volume Plasma Device (LVPD) has been successfully contributing towards understanding of the plasma turbulence driven by the Electron Temperature Gradient (ETG), considered a major contributor to plasma loss in fusion devices. The large size of the device imposes certain difficulties in its operation, such as access to the diagnostics, manual control of subsystems, monitoring of a large number of signals, etc. To achieve integrated operation of the machine, automation is essential for enhanced performance and operational efficiency. Recently, the machine has been undergoing a major upgrade for new physics experiments. The new operation and control system consists of the following: (1) a PXIe based fast data acquisition system for the equipped diagnostics; (2) a Modbus based Process Automation System (PAS) for the subsystem controls; and (3) a Data Utilization System (DUS) for efficient storage, processing and retrieval of the acquired data. In the ongoing development, a data flow model of the machine’s operation has been developed. As a proof of concept, the following two subsystems have been successfully integrated: (1) the Filament Power Supply (FPS) for the heating of the W-filament based plasma source and (2) the Probe Positioning System (PPS) for control of 12 linear probe drives over a travel length of 100 cm. The process model of the vacuum production system has been prepared and validated against acquired pressure data. In the next upgrade, all the subsystems of the machine will be integrated in a systematic manner. The automation backbone is based on a 4-wire multi-drop serial interface (RS485) using the Modbus communication protocol. Software is developed on the LabVIEW platform using
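    The LVPD records above describe a Modbus-over-RS485 automation backbone. Every Modbus RTU frame carries a CRC-16 checksum; the sketch below, which is not taken from the LVPD software (the abstracts say that software is written in LabVIEW), shows how such a frame is assembled and verified. The request bytes are a standard textbook example, not a command from the cited system:

    ```python
    def modbus_crc16(frame: bytes) -> bytes:
        """CRC-16/MODBUS: polynomial 0xA001 (reflected 0x8005), initial value 0xFFFF."""
        crc = 0xFFFF
        for byte in frame:
            crc ^= byte
            for _ in range(8):
                if crc & 1:
                    crc = (crc >> 1) ^ 0xA001
                else:
                    crc >>= 1
        return crc.to_bytes(2, "little")   # Modbus RTU appends the low byte first

    # Example request: read 2 holding registers from slave 0x01 starting at address 0
    pdu = bytes([0x01, 0x03, 0x00, 0x00, 0x00, 0x02])
    frame = pdu + modbus_crc16(pdu)
    print(frame.hex())  # 010300000002c40b

    # A receiver re-runs the CRC over the whole frame; a valid frame yields residue 0
    assert int.from_bytes(modbus_crc16(frame), "little") == 0x0000
    ```

    The zero-residue check at the end is why Modbus devices can validate a frame with the same routine they use to generate it.
    
    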

  15. A tool to increase information-processing capacity for consumer water meter data

    Directory of Open Access Journals (Sweden)

    Heinz E. Jacobs

    2012-02-01

    Full Text Available Background: Water service providers invoice most South African urban consumers for the water they use every month. A secure treasury system generates water invoices at municipalities’ financial departments. Information about the water usage of customers initially comes from reading the water meters, usually located in gardens near the front boundaries of properties. Until as recently as 1990, the main purpose of the water meter readings was to generate invoices for water usage. There are various treasury systems for this purpose. Objective: The objective of this research article was to describe the development of Swift, a locally developed software tool for analysing water meter data from an information management perspective, which engineers in the water field generally use, and to assess critically the influence of Swift on published research and industry. This article focuses on water usage and the challenge of data interchange and extraction as issues that various industries face. Method: This article presents the first detailed report on Swift. It uses a detailed knowledge review and presents and summarises the findings chronologically. Results: The water meter data flow path used to be quite simple. The risk of breaches in confidentiality was limited. Technological advances over the years have led to additional knowledge coming from the same water meter readings, with subsequent research outputs. However, there are also complicated data flow paths and increased risks. Users have used Swift to analyse more than two million consumers’ water meter readings to date. Studies have culminated in 10 peer-reviewed journal articles using the data. Seven of them were in the last five years. Conclusion: Swift-based data was the basis of various research studies in the past decade. Practical guidelines in the civil engineering fraternity for estimating water use in South Africa have incorporated knowledge from these studies. Developments after 1995 have

  16. Defense waste processing facility precipitate hydrolysis process

    International Nuclear Information System (INIS)

    Doherty, J.P.; Eibling, R.E.; Marek, J.C.

    1986-03-01

    Sodium tetraphenylborate and sodium titanate are used to assist in the concentration of soluble radionuclides in the Savannah River Plant's high-level waste. In the Defense Waste Processing Facility, concentrated tetraphenylborate/sodium titanate slurry containing cesium-137, strontium-90 and traces of plutonium from the waste tank farm is hydrolyzed in the Salt Processing Cell, forming organic and aqueous phases. The two phases are then separated and the organic phase is decontaminated for incineration outside the DWPF building. The aqueous phase, containing the radionuclides and less than 10% of the original organic, is blended with the insoluble radionuclides in the high-level waste sludge and is fed to the glass melter for vitrification into borosilicate glass. During the Savannah River Laboratory's development of this process, copper (II) was found to act as a catalyst during the hydrolysis reactions, which improved the organic removal and simplified the design of the reactor

  17. The Diazo Copying Process.

    Science.gov (United States)

    Osterby, Bruce

    1989-01-01

    Described is an activity which demonstrates an organic-based reprographic method that is used extensively for the duplication of microfilm and engineering drawings. Discussed are the chemistry of the process and how to demonstrate the process for students. (CW)

  18. Waste processing air cleaning

    International Nuclear Information System (INIS)

    Kriskovich, J.R.

    1998-01-01

    Waste processing and preparing waste to support waste processing relies heavily on ventilation. Ventilation is used at the Hanford Site on the waste storage tanks to provide confinement, cooling, and removal of flammable gases

  19. Radiochemical Processing Laboratory (RPL)

    Data.gov (United States)

    Federal Laboratory Consortium — The Radiochemical Processing Laboratory (RPL) is a scientific facility funded by DOE to create and implement innovative processes for environmental clean-up and...

  20. Laser material processing

    CERN Document Server

    Steen, William

    2010-01-01

    This text moves from the basics of laser physics to detailed treatments of all major materials processing techniques for which lasers are now essential. New chapters cover laser physics, drilling, micro- and nanomanufacturing and biomedical laser processing.

  1. Business Process Inventory

    Data.gov (United States)

    Office of Personnel Management — Inventory of maps and descriptions of the business processes of the U.S. Office of Personnel Management (OPM), with an emphasis on the processes of the Office of the...

  2. Process evaluation distributed system

    Science.gov (United States)

    Moffatt, Christopher L. (Inventor)

    2006-01-01

    The distributed system includes a database server, an administration module, a process evaluation module, and a data display module. The administration module is in communication with the database server for providing observation criteria information to the database server. The process evaluation module is in communication with the database server for obtaining the observation criteria information from the database server and collecting process data based on the observation criteria information. The process evaluation module utilizes a personal digital assistant (PDA). A data display module in communication with the database server, including a website for viewing collected process data in a desired metrics form, the data display module also for providing desired editing and modification of the collected process data. The connectivity established by the database server to the administration module, the process evaluation module, and the data display module, minimizes the requirement for manual input of the collected process data.

  3. Group Decision Process Support

    DEFF Research Database (Denmark)

    Gøtze, John; Hijikata, Masao

    1997-01-01

    Introducing the notion of Group Decision Process Support Systems (GDPSS) to traditional decision-support theorists.

  4. Cognitive processes in CBT

    NARCIS (Netherlands)

    Becker, E.S.; Vrijsen, J.N.; Hofmann, S.G.; Asmundson, G.J.G.

    2017-01-01

    Automatic cognitive processing helps us navigate the world. However, if the emotional and cognitive interplay becomes skewed, those cognitive processes can become maladaptive and result in psychopathology. Although biases are present in most mental disorders, different disorders are characterized by

  5. Improving Healthcare Logistics Processes

    DEFF Research Database (Denmark)

    Feibert, Diana Cordes

    logistics processes in hospitals and aims to provide theoretically and empirically based evidence for improving these processes to both expand the knowledge base of healthcare logistics and provide a decision tool for hospital logistics managers to improve their processes. Case studies were conducted...... processes. Furthermore, a method for benchmarking healthcare logistics processes was developed. Finally, a theoretically and empirically founded framework was developed to support managers in making an informed decision on how to improve healthcare logistics processes. This study contributes to the limited...... literature concerned with the improvement of logistics processes in hospitals. Furthermore, the developed framework provides guidance for logistics managers in hospitals on how to improve their processes given the circumstances in which they operate....

  6. Flavor changing lepton processes

    International Nuclear Information System (INIS)

    Kuno, Yoshitaka

    2002-01-01

    The flavor changing lepton processes, or in other words the lepton flavor changing processes, are described with emphasis on the updated theoretical motivations and the on-going experimental progress on a new high-intensity muon source. (author)

  7. Process innovation laboratory

    DEFF Research Database (Denmark)

    Møller, Charles

    2007-01-01

    to create a new methodology for developing and exploring process models and applications. The paper outlines the process innovation laboratory as a new approach to BPI. The process innovation laboratory is a comprehensive framework and a collaborative workspace for experimenting with process models....... The process innovation laboratory facilitates innovation by using an integrated action learning approach to process modelling in a controlled environment. The study is based on design science and the paper also discusses the implications to EIS research and practice......Most organizations today are required not only to operate effective business processes but also to allow for changing business conditions at an increasing rate. Today nearly every business relies on their enterprise information systems (EIS) for process integration and future generations of EIS...

  8. The Integrated Renovation Process

    DEFF Research Database (Denmark)

    Galiotto, Nicolas; Heiselberg, Per; Knudstrup, Mary-Ann

    The Integrated Renovation Process (IRP) is a user customized methodology based on judiciously selected constructivist and interactive multi-criteria decision making methods (Galiotto, Heiselberg, & Knudstrup, 2014 (expected)). When applied for home renovation, the Integrated Renovation Process...

  9. Wetland Groundwater Processes

    National Research Council Canada - National Science Library

    Williams, Greg

    1993-01-01

    This technical note summarizes hydrologic and hydraulic (H&H) processes and the related terminology that will likely be encountered during an evaluation of the effect of ground-water processes on wetland function...

  10. Flue Gas Desulphurization Processes

    International Nuclear Information System (INIS)

    Aly, A.I.M.; Halhouli, K.A.; Abu-Ashur, B.M.

    1999-01-01

    Flue gas desulphurization processes are discussed. These processes can be grouped into non-regenerable systems and regenerable systems. The non-regenerable systems produce a product which is either disposed of as waste or sold as a by-product, e.g. the lime/limestone process. In the regenerable systems, e.g. the Wellman-Lord process, the SO2 is regenerated from the sorbent (sodium sulphite), which is returned to absorb more SO2. A newer technology for flue gas desulphurization is also discussed. The Ispra process uses bromine as oxidant, producing HBr, from which bromine is regenerated by electrolysis. The only by-products of this process are sulphuric acid and hydrogen, which are both valuable products, and no waste products are produced. Modifications to the process are suggested, based on experimental investigations, to improve the efficiency of the process and to reduce its costs
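    The overall chemistry behind the two process families named above can be sketched as follows. These are simplified textbook reactions, not equations taken from the cited paper:

    ```latex
    % Non-regenerable limestone scrubbing, with forced oxidation to gypsum:
    \mathrm{CaCO_3 + SO_2 + \tfrac{1}{2}\,O_2 + 2\,H_2O \longrightarrow CaSO_4\cdot 2H_2O + CO_2}

    % Regenerable Wellman-Lord absorption step (the sulphite sorbent is later
    % regenerated thermally, releasing a concentrated SO2 stream):
    \mathrm{Na_2SO_3 + SO_2 + H_2O \longrightarrow 2\,NaHSO_3}
    ```

    The first route consumes its sorbent and yields gypsum as the saleable by-product; the second recycles the sorbent, which is what makes it a regenerable system.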

  11. Chemical radwaste solidification processes

    International Nuclear Information System (INIS)

    Malloy, C.W.

    1979-01-01

    Some of these processes and their problems are briefly reviewed: early cement systems; urea-formaldehyde; Dow solidification process; low-viscosity chemical agents (POLYPAC); and water-extensible polyester. 9 refs

  12. The plasma hearth process: Process residuals characterization

    International Nuclear Information System (INIS)

    Leatherman, G.L.; Geimer, R.; Batdorf, J.; Hassel, G.; Wolfe, P.; Carney, K.P.

    1994-01-01

    The Plasma Hearth Process (PHP) is a high-temperature waste treatment process being developed by Science Applications International Corporation (SAIC) for the Department of Energy (DOE) that destroys hazardous organics while stabilizing radionuclides and hazardous metals in a vitreous slag waste form. The PHP has potential application for the treatment of a wide range of mixed waste types in both the low-level and transuranic (TRU) mixed waste categories. DOE, through the Office of Technology Development's Mixed Waste Integrated Program (MWIP) is conducting a three phase development project to ready the PHP for implementation in the DOE complex

  13. Process and device for processing radioactive wastes

    International Nuclear Information System (INIS)

    1974-01-01

    A method is described for processing liquid radioactive wastes. It includes the heating of the liquid wastes so that the contained liquids are evaporated and a practically anhydrous mass of solid particles inferior in volume to that of the wastes introduced is formed, then the transformation of the solid particles into a monolithic structure. This transformation includes the compressing of the particles and sintering or fusion. The solidifying agent is a mixture of polyethylene and paraffin wax or a styrene copolymer and a polyester resin. The device used for processing the radioactive liquid wastes is also described [fr

  14. Grind hardening process

    CERN Document Server

    Salonitis, Konstantinos

    2015-01-01

    This book presents the grind-hardening process and the main studies published since it was introduced in the 1990s. The modelling of the various aspects of the process, such as the process forces, temperature profile developed, hardness profiles, residual stresses, etc., is described in detail. The book is of interest to the research community working with mathematical modeling and optimization of manufacturing processes.

  15. Software Process Improvement Defined

    DEFF Research Database (Denmark)

    Aaen, Ivan

    2002-01-01

    This paper argues in favor of the development of explanatory theory on software process improvement. The last one or two decades' commitment to prescriptive approaches in software process improvement theory may contribute to the emergence of a gulf dividing theorists and practitioners....... It is proposed that this divide be met by the development of theory evaluating prescriptive approaches and informing practice with a focus on the software process policymaking and process control aspects of improvement efforts

  16. Modeling multiphase materials processes

    CERN Document Server

    Iguchi, Manabu

    2010-01-01

    ""Modeling Multiphase Materials Processes: Gas-Liquid Systems"" describes the methodology and application of physical and mathematical modeling to multi-phase flow phenomena in materials processing. The book focuses on systems involving gas-liquid interaction, the most prevalent in current metallurgical processes. The performance characteristics of these processes are largely dependent on transport phenomena. This volume covers the inherent characteristics that complicate the modeling of transport phenomena in such systems, including complex multiphase structure, intense turbulence, opacity of

  17. Dosimetry for radiation processing

    DEFF Research Database (Denmark)

    Miller, Arne

    1986-01-01

    During the past few years significant advances have taken place in the different areas of dosimetry for radiation processing, mainly stimulated by the increased interest in radiation for food preservation, plastic processing and sterilization of medical products. Reference services both...... and sterilization dosimetry, optichromic dosimeters in the shape of small tubes for food processing, and ESR spectroscopy of alanine for reference dosimetry. In this paper the special features of radiation processing dosimetry are discussed, several commonly used dosimeters are reviewed, and factors leading...

  18. Generating process model collections

    NARCIS (Netherlands)

    Yan, Z.; Dijkman, R.M.; Grefen, P.W.P.J.

    2017-01-01

    Business process management plays an important role in the management of organizations. More and more organizations describe their operations as business processes. It is common for organizations to have collections of thousands of business processes, but for reasons of confidentiality these

  19. Multiphoton processes: conference proceedings

    International Nuclear Information System (INIS)

    Lambropoulos, P.; Smith, S.J.

    1984-01-01

    The chapters of this volume represent the invited papers delivered at the conference. They are arranged according to thematic proximity, beginning with atoms and continuing with molecules and surfaces. Section headings include multiphoton processes in atoms, field fluctuations and collisions in multiphoton processes, and multiphoton processes in molecules and surfaces. Abstracts of individual items from the conference were prepared separately for the database.

  20. Semisolid Metal Processing Consortium

    Energy Technology Data Exchange (ETDEWEB)

    Apelian, Diran

    2002-01-10

    Mathematical modeling and simulation of semisolid filling processes remains a critical issue in understanding and optimizing the process. Semisolid slurries are non-Newtonian materials that exhibit complex rheological behavior. Therefore, the way these slurries flow in cavities is very different from the way liquid metal fills cavities in classical casting. In fact, filling in semisolid processing is often counterintuitive.
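    The non-Newtonian behavior mentioned above can be illustrated with a simple shear-thinning model; the power-law (Ostwald-de Waele) form and the parameter values below are a generic rheology sketch, not taken from the cited work:

```python
# Power-law model of a shear-thinning slurry: apparent viscosity falls as
# the shear rate rises, one reason semisolid filling behaves so
# differently from Newtonian filling. The consistency index k and flow
# index n are hypothetical values.
def apparent_viscosity(shear_rate, k=30.0, n=0.3):
    """Apparent viscosity in Pa*s for a shear rate in 1/s."""
    return k * shear_rate ** (n - 1.0)

for rate in (0.1, 1.0, 10.0, 100.0):
    print(f"shear rate {rate:6.1f} 1/s -> viscosity {apparent_viscosity(rate):8.2f} Pa*s")
```

    With n < 1 the viscosity drops steeply with shear rate, so a slurry that barely flows at rest can fill a cavity readily once sheared.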

  1. How yogurt is processed

    Science.gov (United States)

    This month’s Processing column on the theme of “How Is It Processed?” focuses on yogurt. Yogurt is known for its health-promoting properties. This column will provide a brief overview of the history of yogurt and the current market. It will also unveil both traditional and modern yogurt processing t...

  2. Clinical Process Intelligence

    DEFF Research Database (Denmark)

    Vilstrup Pedersen, Klaus

    2006-01-01

    .e. local guidelines. From a knowledge management point of view, this externalization of generalized processes, gives the opportunity to learn from, evaluate and optimize the processes. "Clinical Process Intelligence" (CPI), will denote the goal of getting generalized insight into patient centered health...

  3. Process research and development

    Science.gov (United States)

    Bickler, D. B.

    1986-01-01

    The following major processes involved in the production of crystalline-silicon solar cells were discussed: surface preparation, junction formation, metallization, and assembly. The status of each of these processes, and the sequence in which they are applied, were described as they were in 1975 and 1985, and as they might be in the future.

  4. The Constitutional Amendment Process

    Science.gov (United States)

    Chism, Kahlil

    2005-01-01

    This article discusses the constitutional amendment process. Although the process is not described in great detail, Article V of the United States Constitution allows for and provides instruction on amending the Constitution. While the amendment process currently consists of six steps, the Constitution is nevertheless quite difficult to change.…

  5. Teaching Process Design through Integrated Process Synthesis

    Science.gov (United States)

    Metzger, Matthew J.; Glasser, Benjamin J.; Patel, Bilal; Hildebrandt, Diane; Glasser, David

    2012-01-01

    The design course is an integral part of chemical engineering education. A novel approach to the design course was recently introduced at the University of the Witwatersrand, Johannesburg, South Africa. The course aimed to introduce students to systematic tools and techniques for setting and evaluating performance targets for processes, as well as…

  6. Statistical Process Control for KSC Processing

    Science.gov (United States)

    Ford, Roger G.; Delgado, Hector; Tilley, Randy

    1996-01-01

    The 1996 Summer Faculty Fellowship Program and Kennedy Space Center (KSC) served as the basis for a research effort into statistical process control for KSC processing. The effort entailed several tasks and goals. The first was to develop a customized statistical process control (SPC) course for the Safety and Mission Assurance Trends Analysis Group. The actual teaching of this course took place over several weeks. In addition, an Internet version of the same course, complete with animation and video excerpts from the course as taught at KSC, was developed. The application of SPC to shuttle processing took up the rest of the summer research project. This effort entailed an evaluation of SPC use at KSC, both present and potential, in light of the change in roles for NASA and the Single Flight Operations Contractor (SFOC). Individual consulting on SPC use was performed, as well as an evaluation of SPC software for future KSC use. Finally, the author's orientation to NASA changes, terminology, data formats, and new NASA task definitions will allow future consultation as needs arise.
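    The SPC methods taught in such a course can be illustrated with a minimal individuals control chart. The sketch below is not from the KSC course material; the data are hypothetical, and the sigma estimate (average moving range divided by the d2 = 1.128 constant for subgroups of two) is the standard textbook construction:

```python
# Individuals (X) control chart: flag points outside mean +/- 3 sigma,
# with sigma estimated from the average moving range.
def control_limits(samples):
    moving_ranges = [abs(b - a) for a, b in zip(samples, samples[1:])]
    sigma = (sum(moving_ranges) / len(moving_ranges)) / 1.128  # d2 for n = 2
    mean = sum(samples) / len(samples)
    return mean - 3 * sigma, mean + 3 * sigma

def out_of_control(samples):
    lo, hi = control_limits(samples)
    return [x for x in samples if x < lo or x > hi]

measurements = [10.0] * 19 + [50.0]   # stable process with one gross outlier
print(out_of_control(measurements))   # the outlier is flagged
```

    The moving-range estimate is deliberate: a single gross outlier inflates the ordinary sample standard deviation enough to hide itself, while the moving range stays dominated by the stable short-term variation.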

  7. Process-based costing.

    Science.gov (United States)

    Lee, Robert H; Bott, Marjorie J; Forbes, Sarah; Redford, Linda; Swagerty, Daniel L; Taunton, Roma Lee

    2003-01-01

    Understanding how quality improvement affects costs is important. Unfortunately, low-cost, reliable ways of measuring direct costs are scarce. This article builds on the principles of process improvement to develop a costing strategy that meets both criteria. Process-based costing has 4 steps: developing a flowchart, estimating resource use, valuing resources, and calculating direct costs. To illustrate the technique, this article uses it to cost the care planning process in 3 long-term care facilities. We conclude that process-based costing is easy to implement; generates reliable, valid data; and allows nursing managers to assess the costs of new or modified processes.
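    The four-step strategy above can be sketched numerically. The step names, staffing roles, minutes, and wage rates below are hypothetical, purely to show how the direct-cost roll-up works:

```python
# Sketch of process-based costing: (1) flowchart steps, (2) estimated
# resource use per step, (3) resource valuation, (4) direct cost roll-up.
steps = [  # (step, role, minutes per care-plan episode)
    ("assess resident",     "RN", 45),
    ("draft care plan",     "RN", 30),
    ("team review meeting", "MD", 20),
]
hourly_rate = {"RN": 36.0, "MD": 120.0}  # step 3: value the resources

def direct_cost(steps, rates):
    # step 4: sum minutes x (hourly rate / 60) across the flowchart
    return sum(minutes * rates[role] / 60.0 for _, role, minutes in steps)

print(f"direct cost per care plan: ${direct_cost(steps, hourly_rate):.2f}")  # $85.00
```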

  8. Colloid process engineering

    CERN Document Server

    Peukert, Wolfgang; Rehage, Heinz; Schuchmann, Heike

    2015-01-01

    This book deals with colloidal systems in technical processes and the influence of technical processes on colloidal systems. It explores how new measurement capabilities can offer the potential for a dynamic development of science and engineering, and examines the origin of colloidal systems and their use for new products. The future challenges of colloidal process engineering are the development of appropriate equipment and processes for the production of multi-phase structures and energetic interactions in market-relevant quantities. The book explores the relevant processes for controlled production and how they can be used across all scales.

  9. Nonaqueous processing methods

    International Nuclear Information System (INIS)

    Coops, M.S.; Bowersox, D.F.

    1984-09-01

    A high-temperature process utilizing molten salt extraction from molten metal alloys has been developed for purification of spent power reactor fuels. Experiments with laboratory-scale processing operations show that purification and throughput parameters comparable to the Barnwell Purex process can be achieved by pyrochemical processing in equipment one-tenth the size, with all wastes being discharged as stable metal alloys at greatly reduced volume and disposal cost. This basic technology can be developed for large-scale processing of spent reactor fuels. 13 references, 4 figures

  10. Data processing made simple

    CERN Document Server

    Wooldridge, Susan

    2013-01-01

    Data Processing: Made Simple, Second Edition presents discussions of a number of trends and developments in the world of commercial data processing. The book covers the rapid growth of micro- and mini-computers for both home and office use; word processing and the 'automated office'; the advent of distributed data processing; and the continued growth of database-oriented systems. The text also discusses modern digital computers; fundamental computer concepts; information and data processing requirements of commercial organizations; and the historical perspective of the computer industry. The

  11. Thin film processes II

    CERN Document Server

    Kern, Werner

    1991-01-01

    This sequel to the 1978 classic, Thin Film Processes, gives a clear, practical exposition of important thin film deposition and etching processes that have not yet been adequately reviewed. It discusses selected processes in tutorial overviews with implementation guidelines and an introduction to the literature. Though edited to stand alone, when taken together, Thin Film Processes II and its predecessor present a thorough grounding in modern thin film techniques. Key Features: * Provides an all-new sequel to the 1978 classic, Thin Film Processes * Introduces new topics, and sever

  12. FFTF gas processing systems

    International Nuclear Information System (INIS)

    Halverson, T.G.

    1977-01-01

    The design and operation of the two radioactive gas processing systems at the Fast Flux Test Facility (FFTF) exemplify the concept that will be used in the first generation of Liquid Metal Fast Breeder Reactors (LMFBRs). The two systems, the Radioactive Argon Processing System (RAPS) and the Cell Atmosphere Processing System (CAPS), process the argon and nitrogen used in the FFTF as cover gas on liquid metal systems and as inert atmospheres in steel-lined cells housing sodium equipment. The RAPS specifically processes the argon cover gas from the reactor coolant system, providing for decontamination and eventual reuse. The CAPS processes radioactive gases from inerted cells and other liquid metal cover gas systems, providing for decontamination and ultimate discharge to the atmosphere. The cryogenic processing of waste gas by both systems is described.

  13. Poisson branching point processes

    International Nuclear Information System (INIS)

    Matsuo, K.; Teich, M.C.; Saleh, B.E.A.

    1984-01-01

    We investigate the statistical properties of a special branching point process. The initial process is assumed to be a homogeneous Poisson point process (HPP). The initiating events at each branching stage are carried forward to the following stage. In addition, each initiating event independently contributes a nonstationary Poisson point process (whose rate is a specified function) located at that point. The additional contributions from all points of a given stage constitute a doubly stochastic Poisson point process (DSPP) whose rate is a filtered version of the initiating point process at that stage. The process studied is a generalization of a Poisson branching process in which random time delays are permitted in the generation of events. Particular attention is given to the limit in which the number of branching stages is infinite while the average number of added events per event of the previous stage is infinitesimal. In the special case when the branching is instantaneous this limit of continuous branching corresponds to the well-known Yule--Furry process with an initial Poisson population. The Poisson branching point process provides a useful description for many problems in various scientific disciplines, such as the behavior of electron multipliers, neutron chain reactions, and cosmic ray showers
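    The branching construction described above can be sketched with a small Monte Carlo simulation. Event times and the stage rate functions are omitted; only the population count per stage is tracked, and the offspring mean `mu` is a hypothetical parameter:

```python
# Stage 0 is a homogeneous Poisson population; each event of a stage is
# carried forward while independently contributing Poisson-distributed
# offspring to the next stage.
import math
import random

def poisson(lam, rng):
    # Knuth's method: count uniform draws until their product falls below exp(-lam)
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def branching_counts(rate, interval, mu, stages, rng):
    counts = [poisson(rate * interval, rng)]       # initial HPP population
    for _ in range(stages):
        offspring = sum(poisson(mu, rng) for _ in range(counts[-1]))
        counts.append(counts[-1] + offspring)      # initiators carried forward
    return counts

rng = random.Random(42)
print(branching_counts(rate=5.0, interval=1.0, mu=0.5, stages=4, rng=rng))
```

    Because initiators are carried forward, the stage populations are nondecreasing; letting the number of stages grow while `mu` shrinks approximates the continuous-branching limit discussed in the abstract.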

  14. A secondary fuel removal process: plasma processing

    Energy Technology Data Exchange (ETDEWEB)

    Min, J Y; Kim, Y S [Hanyang Univ., Seoul (Korea, Republic of); Bae, K K; Yang, M S [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1997-07-01

    Plasma etching of UO{sub 2} using a fluorine-containing gas plasma is studied as a secondary fuel removal process for the DUPIC (Direct Use of PWR spent fuel Into Candu) process, which is under consideration as a potential future fuel cycle in Korea. A CF{sub 4}/O{sub 2} gas mixture is chosen as the reactant gas, and the etching rates of UO{sub 2} by the gas plasma are investigated as functions of the CF{sub 4}/O{sub 2} ratio, plasma power, substrate temperature, and plasma gas pressure. It is found that the optimum CF{sub 4}/O{sub 2} ratio is around 4:1 at all temperatures up to 400 deg C and that the etching rate increases with increasing r.f. power and substrate temperature. Under 150 W r.f. power the etching rate reaches 1100 monolayers/min at 400 deg C, which is equivalent to about 0.5 {mu}m/min. (author).
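    As a rough consistency check on the quoted rate, assuming a UO2 monolayer thickness of about 0.5 nm (an assumption; the fluorite lattice constant of UO2 is roughly 0.547 nm, and only the 1100 monolayers/min figure comes from the abstract), the rate works out to about half a micrometre per minute:

```python
# Convert an etching rate in monolayers/min to um/min under an assumed
# monolayer thickness. Only the 1100 figure is from the abstract.
MONOLAYER_NM = 0.5            # assumed UO2 monolayer thickness, nm
monolayers_per_min = 1100

rate_nm_per_min = monolayers_per_min * MONOLAYER_NM  # 550 nm/min
rate_um_per_min = rate_nm_per_min / 1000.0           # 0.55 um/min
print(f"{rate_um_per_min:.2f} um/min")
```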

  15. The SILVA atomic process

    International Nuclear Information System (INIS)

    Cazalet, J.

    1997-01-01

    The SILVA laser isotope separation process is based on the selective laser photo-ionization of uranium atomic vapour; the process is presently under development by CEA and COGEMA in France, with the aim of reducing the cost of uranium enrichment by a factor of three. The two main components of a SILVA process plant are the lasers (copper vapour lasers and dye lasers) and the separator for the vaporization (with a high-energy electron beam), ionization and separation operations. Research on the SILVA process started in 1985, and its technical and economic feasibility is to be demonstrated in 1997. The progress of similar rival processes and of other processes is discussed, and the remaining research stages and themes of the SILVA program are presented.

  16. Technology or Process First?

    DEFF Research Database (Denmark)

    Siurdyban, Artur Henryk; Svejvig, Per; Møller, Charles

    Enterprise Systems Management (ESM) and Business Process Management (BPM), although highly correlated, have evolved as alternative and mutually exclusive approaches to corporate infrastructure. As a result, companies struggle to find the right balance between technology and process factors...... in infrastructure implementation projects. The purpose of this paper is to articulate a need and a direction to mediate between the process-driven and the technology-driven approaches. Using a cross-case analysis, we gain insight into two examples of systems and process implementation. We highlight the differences...... between them using strategic alignment, Enterprise Systems and Business Process Management theories. We argue that the insights from these cases can lead to a better alignment between process and technology. Implications for practice include the direction towards a closer integration of process...

  17. Badge Office Process Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Haurykiewicz, John Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Dinehart, Timothy Grant [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Parker, Robert Young [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-12

    The purpose of this process analysis was to analyze the Badge Offices’ current processes from a systems perspective and consider ways of pursuing objectives set forth by SEC-PS, namely increased customer flow (throughput) and reduced customer wait times. Information for the analysis was gathered for the project primarily through Badge Office Subject Matter Experts (SMEs), and in-person observation of prevailing processes. Using the information gathered, a process simulation model was constructed to represent current operations and allow assessment of potential process changes relative to factors mentioned previously. The overall purpose of the analysis was to provide SEC-PS management with information and recommendations to serve as a basis for additional focused study and areas for potential process improvements in the future.

  18. Empowering genomic medicine by establishing critical sequencing result data flows: the eMERGE example.

    Science.gov (United States)

    Aronson, Samuel; Babb, Lawrence; Ames, Darren; Gibbs, Richard A; Venner, Eric; Connelly, John J; Marsolo, Keith; Weng, Chunhua; Williams, Marc S; Hartzler, Andrea L; Liang, Wayne H; Ralston, James D; Devine, Emily Beth; Murphy, Shawn; Chute, Christopher G; Caraballo, Pedro J; Kullo, Iftikhar J; Freimuth, Robert R; Rasmussen, Luke V; Wehbe, Firas H; Peterson, Josh F; Robinson, Jamie R; Wiley, Ken; Overby Taylor, Casey

    2018-05-31

    The eMERGE Network is establishing methods for electronic transmittal of patient genetic test results from laboratories to healthcare providers across organizational boundaries. We surveyed the capabilities and needs of different network participants, established a common transfer format, and implemented transfer mechanisms based on this format. The interfaces we created are examples of the connectivity that must be instantiated before electronic genetic and genomic clinical decision support can be effectively built at the point of care. This work serves as a case example for both standards bodies and other organizations working to build the infrastructure required to provide better electronic clinical decision support for clinicians.

  19. AUTOMATION OF OPERATIONAL CONTROL OF DATA FLOWS OF THE METALLURGICAL ENTERPRISE ORGANIZATIONAL STRUCTURE

    Directory of Open Access Journals (Sweden)

    A. N. Chichko

    2006-01-01

    A new method for creating models of operative control of an enterprise is offered. A computer variant of the organizational structure, based on an analysis of the charging dynamics of control units, is offered and illustrated with the example of one of the organizational structures of the Belorussian metallurgical works.

  20. Discovering and understanding android sensor usage behaviors with data flow analysis

    KAUST Repository

    Liu, Xing

    2017-03-20

    Today’s Android-powered smartphones have various embedded sensors that measure acceleration, orientation, light, and other environmental conditions. Many functions in third-party applications (apps) need to use these sensors. However, embedded sensors may lead to security issues, as third-party apps can read data from these sensors without claiming any permissions. It has been proven that embedded sensors can be exploited by well-designed malicious apps, resulting in leaks of users’ privacy. In this work, we are motivated to provide an overview of sensor usage patterns in current apps by investigating what, why, and how embedded sensors are used in apps collected from both a Chinese app market called “AppChina” and the official market, “Google Play”. To fulfill this goal, we develop a tool called “SDFDroid” to identify the types of sensors used and to generate the sensor data propagation graphs in each app. We then cluster the apps to find their sensor usage patterns based on their sensor data propagation graphs. We apply our method to 22,010 apps collected from AppChina and 7,601 apps from Google Play. Extensive experiments are conducted, and the experimental results show that most apps implement their sensor-related functions using third-party libraries. We further study the sensor usage behaviors of the third-party libraries. Our results show that the accelerometer is the most frequently used sensor. Though many third-party libraries use no more than four types of sensors, some third-party libraries still register all types of sensors recklessly. These results call for more attention to better regulating sensor usage in Android apps.

  1. Timing distribution and Data Flow for the ATLAS Tile Calorimeter Phase II Upgrade

    CERN Document Server

    AUTHOR|(SzGeCERN)713745; The ATLAS collaboration

    2016-01-01

    The Hadronic Tile Calorimeter (TileCal) detector is one of the several subsystems composing the ATLAS experiment at the Large Hadron Collider (LHC). The LHC upgrade program plans an increase of around five times the LHC nominal instantaneous luminosity, culminating in the High Luminosity LHC (HL-LHC). In order to adapt the detector to the new HL-LHC parameters, the TileCal read-out electronics is being redesigned, introducing a new read-out strategy with a fully digital trigger system. In the new read-out architecture, the front-end electronics comprises the MainBoards and the DaughterBoards. The MainBoard digitizes the analog signals coming from the PhotoMultiplier Tubes (PMTs), provides integrated data for minimum bias monitoring, and includes electronics for PMT calibration. The DaughterBoard receives and distributes Detector Control System (DCS) commands, clock and timing commands to the rest of the elements of the front-end electronics, as well as collects and transmits the digitized data to the back-e...

  2. Discovering and understanding android sensor usage behaviors with data flow analysis

    KAUST Repository

    Liu, Xing; Liu, Jiqiang; Wang, Wei; He, Yongzhong; Zhang, Xiangliang

    2017-01-01

    Today’s Android-powered smartphones have various embedded sensors that measure acceleration, orientation, light, and other environmental conditions. Many functions in third-party applications (apps) need to use these sensors. However, embedded sensors may lead to security issues, as third-party apps can read data from these sensors without claiming any permissions. It has been proven that embedded sensors can be exploited by well-designed malicious apps, resulting in leaks of users’ privacy. In this work, we are motivated to provide an overview of sensor usage patterns in current apps by investigating what, why, and how embedded sensors are used in apps collected from both a Chinese app market called “AppChina” and the official market, “Google Play”. To fulfill this goal, we develop a tool called “SDFDroid” to identify the types of sensors used and to generate the sensor data propagation graphs in each app. We then cluster the apps to find their sensor usage patterns based on their sensor data propagation graphs. We apply our method to 22,010 apps collected from AppChina and 7,601 apps from Google Play. Extensive experiments are conducted, and the experimental results show that most apps implement their sensor-related functions using third-party libraries. We further study the sensor usage behaviors of the third-party libraries. Our results show that the accelerometer is the most frequently used sensor. Though many third-party libraries use no more than four types of sensors, some third-party libraries still register all types of sensors recklessly. These results call for more attention to better regulating sensor usage in Android apps.

  3. Scalable and Accurate SMT-Based Model Checking of Data Flow Systems

    Science.gov (United States)

    2013-10-31

    of variable x is always less than that of variable y) can be represented in this theory. • A theory of inductive datatypes. Modeling software datatypes can be done directly in this theory. • A theory of arrays. Software that uses arrays can be modeled with constraints in this theory, as can... [Report sidebar: supported theories include arithmetic (and specialized fragments), arrays, inductive datatypes, bit-vectors, and uninterpreted functions; SMT engine; input interfaces.]

  4. Controlling and Monitoring the Data Flow of the LHCb Read-out and DAQ Network

    CERN Multimedia

    Schwemmer, R; Neufeld, N; Svantesson, D

    2011-01-01

    The LHCb readout uses a set of 320 FPGA based boards as interface between the on-detector hardware and the GBE DAQ network. The boards are the logical Level 1 (L1) read-out electronics and aggregate the experiment's raw data into event fragments that are sent to the DAQ network. To control the many parameters of the read-out boards, an embedded PC is included on each board, connecting to the boards ICs and FPGAs. The data from the L1 boards is sent through an aggregation network into the High Level Trigger farm. The farm comprises approximately 1500 PCs which at first assemble the fragments from the L1 boards and then do a partial reconstruction and selection of the events. In total there are approximately 3500 network connections. Data is pushed through the network and there is no mechanism for resending packets. Loss of data on a small scale is acceptable but care has to be taken to avoid data loss if possible. To monitor and debug losses, different probes are inserted throughout the entire read-out chain t...

  5. Controlling and monitoring the data flow of the LHCb read-out and DAQ network

    International Nuclear Information System (INIS)

    Schwemmer, R.; Gaspar, C.; Neufeld, N.; Svantesson, D.

    2012-01-01

    The LHCb read-out uses a set of 320 FPGA based boards as interface between the on-detector hardware and the GBE DAQ network. The boards are the logical Level 1 (L1) read-out electronics and aggregate the experiment's raw data into event fragments that are sent to the DAQ network. To control the many parameters of the read-out boards, an embedded PC is included on each board, connecting to the boards ICs and FPGAs. The data from the L1 boards is sent through an aggregation network into the High Level Trigger farm. The farm comprises approximately 1500 PCs which at first assemble the fragments from the L1 boards and then do a partial reconstruction and selection of the events. In total there are approximately 3500 network connections. Data is pushed through the network and there is no mechanism for resending packets. Loss of data on a small scale is acceptable but care has to be taken to avoid data loss if possible. To monitor and debug losses, different probes are inserted throughout the entire read-out chain to count fragments, packets and their rates at different positions. To keep uniformity throughout the experiment, all control software was developed using the common SCADA software, PVSS, with the JCOP framework as base. The presentation will focus on the low level controls interface developed for the L1 boards and the networking probes, as well as the integration of the high level user interfaces into PVSS. (authors)
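    The per-hop loss accounting that such counting probes enable can be sketched as follows; the probe names and counter values are hypothetical, not taken from the LHCb system:

```python
# In a push-only chain with no resends, each probe exposes a cumulative
# fragment counter; the difference between consecutive probes over the
# same interval is the loss at that hop.
def hop_losses(probe_counts):
    """probe_counts: (probe_name, fragments_seen) pairs, ordered along the chain."""
    return [
        (src, dst, seen_src - seen_dst)
        for (src, seen_src), (dst, seen_dst) in zip(probe_counts, probe_counts[1:])
    ]

chain = [("L1-board", 1_000_000), ("aggregation", 999_950), ("farm-node", 999_940)]
for src, dst, lost in hop_losses(chain):
    print(f"{src} -> {dst}: {lost} fragments lost")
```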

  6. New Chicago-Indiana computer network prepared to handle massive data flow

    CERN Multimedia

    2006-01-01

    "The Chicago-Indiana system is ont of five Tier-2 (regional) centers in the United States that will receive data from one of four massive detectors at the Large Hadron Collider at CERN, the European particle physics laboratory in Geneva. When the new instrument begins operating late next year, beams of protons will collide 40 million times a second. When each of those proton beams reaches full intensity, each collision will produce approximately 23 interactions between protons that will create various types of subatomic particles." (1,5 page)

  7. O2A - Data Flow Framework from Sensor Observations to Archives

    OpenAIRE

    Gerchow, Peter; Koppe, Roland; Macario, Ana; Haas, Antonie; Schäfer-Neth, Christian; Pfeiffenberger, Hans; Schäfer, Angela

    2017-01-01

    The Alfred Wegener Institute coordinates German polar research and is one of the most productive polar research institutions worldwide with scientists working in both Polar Regions – a task that can only be successful with the help of excellent infrastructure and logistics. Conducting research in the Arctic and Antarctic requires research stations staffed throughout the year as the basis for expeditions and data collection. It needs research vessels, aircrafts and long-term observ...

  8. Real-time meteorological data flow in support of TVA's radiological emergency plan

    International Nuclear Information System (INIS)

    Hunter, C.H.; Pittman, D.E.; Malo, J.E.

    1985-01-01

    The Tennessee Valley Authority (TVA) presently operates two nuclear power plants: Browns Ferry (3 units) and Sequoyah (2 units). Two additional plants are under construction: Watts Bar, scheduled for commercial operation later this year, and Bellefonte (2 units), scheduled for operation near the end of the decade. Under regulations promulgated under 10 CFR Part 50, TVA has developed a Radiological Emergency Plan (REP) to facilitate assessment of the effects of a radiological accident at any of the operational plants. As part of the REP, TVA has developed a system for collecting, displaying, reviewing, and disseminating real-time meteorological information collected at the nuclear plant sites. The flow of this information must be reliable and continuous so that prompt, informed decisions are possible. This system has been designed using guidance provided in applicable Nuclear Regulatory Commission (NRC) documents, most notably Supplement 1 to NUREG-0737 and Regulatory Guide (R.G.) 1.23. This paper presents a brief description of the REP meteorological support. Meteorological support for nuclear plant emergency preparedness at TVA nuclear plants has been provided for several years. The system has undergone numerous changes during this time, reflecting changes in regulatory guidance and experience gained in implementing the system through numerous drills and exercises. A brief discussion of some of this experience is also presented.

  9. On the Efficient Exploitation of Speculation under Data flow Paradigms of Control

    Science.gov (United States)

    1989-05-19


  10. Controlling and Monitoring the Data Flow of the LHCb Read-out and DAQ Network

    CERN Document Server

    Schwemmer, Rainer; Neufeld, N; Svantesson, D

    2011-01-01

    The LHCb read-out uses a set of 320 FPGA based boards as interface between the on-detector hardware and the GBE DAQ network. The boards are the logical Level 1 (L1) read-out electronics and aggregate the experiment’s raw data into event fragments that are sent to the DAQ network. To control the many parameters of the read-out boards, an embedded PC is included on each board, connecting to the boards ICs and FPGAs. The data from the L1 boards is sent through an aggregation network into the High Level Trigger farm. The farm comprises approximately 1500 PCs which at first assemble the fragments from the L1 boards and then do a partial reconstruction and selection of the events. In total there are approximately 3500 network connections. Data is pushed through the network and there is no mechanism for resending packets. Loss of data on a small scale is acceptable but care has to be taken to avoid data loss if possible. To monitor and debug losses, different probes are inserted throughout the entire read-out cha...

  11. An event-oriented database for continuous data flows in the TJ-II environment

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez, E. [Asociacion Euratom/CIEMAT para Fusion Madrid, 28040 Madrid (Spain)], E-mail: edi.sanchez@ciemat.es; Pena, A. de la; Portas, A.; Pereira, A.; Vega, J. [Asociacion Euratom/CIEMAT para Fusion Madrid, 28040 Madrid (Spain); Neto, A.; Fernandes, H. [Associacao Euratom/IST, Centro de Fusao Nuclear, Avenue Rovisco Pais P-1049-001 Lisboa (Portugal)

    2008-04-15

    A new database for storing data related to the TJ-II experiment has been designed and implemented. It allows the storage of raw data not acquired during plasma shots, i.e. data collected continuously or between plasma discharges while testing subsystems (e.g. during neutral beam test pulses). This new database complements already existing ones by permitting the storage of raw data that are not classified by shot number. Rather, these data are indexed according to a more general entity, the event. An event is defined as any occurrence relevant to the TJ-II environment. Such occurrences are registered, thus allowing relationships to be established between data acquisition, TJ-II control-system, and diagnostic control-system actions. In the new database, raw data are stored in files on the TJ-II UNIX central server disks, while meta-data are stored in Oracle tables, thereby permitting fast data searches according to different criteria. In addition, libraries for registering data/events in the database from different subsystems within the laboratory local area network have been developed. Finally, a Shared Data Access System has been implemented for external access to data. It permits both new event-indexed data and old data (indexed by shot number) to be read from a common event perspective.
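    The file-plus-metadata split described above (raw data as files on disk, searchable metadata indexed by event) can be sketched with a minimal relational schema. This is an illustrative sketch only: the table and column names below are hypothetical, not the actual TJ-II Oracle schema, and SQLite stands in for Oracle.

    ```python
    import sqlite3

    # Hypothetical minimal schema: metadata indexed by "event" rather than shot
    # number, while the raw data themselves stay in files on the server disks.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE event (
        event_id INTEGER PRIMARY KEY,
        kind     TEXT NOT NULL,   -- e.g. 'nbi_test_pulse', 'plasma_shot'
        t_start  TEXT NOT NULL
    );
    CREATE TABLE raw_signal (
        signal_id INTEGER PRIMARY KEY,
        event_id  INTEGER REFERENCES event(event_id),
        name      TEXT NOT NULL,
        file_path TEXT NOT NULL   -- pointer to the raw-data file on disk
    );
    """)
    conn.execute("INSERT INTO event VALUES (1, 'nbi_test_pulse', '2008-01-15T10:00')")
    conn.execute("INSERT INTO raw_signal VALUES (1, 1, 'beam_current', '/data/ev1/beam_current.raw')")

    # Fast metadata search by event kind, returning pointers to the raw files.
    rows = conn.execute(
        "SELECT name, file_path FROM raw_signal JOIN event USING (event_id) "
        "WHERE event.kind = 'nbi_test_pulse'"
    ).fetchall()
    ```

    The point of the design is that queries touch only the small metadata tables; bulky raw data are fetched from disk only once the relevant events are identified.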

  12. Air Education and Training Command Cost and Capacity System: Implications for Organizational and Data Flow Changes

    National Research Council Canada - National Science Library

    Manacapilli, Thomas

    2004-01-01

    .... It briefly reviews training management systems and associated organizational arrangements in the other services and the private sector to draw insights for a model management system for the Air Force...

  13. Data-flow Performance Optimisation on Unreliable Networks: the ATLAS Data-Acquisition Case

    CERN Document Server

    Colombo, T; The ATLAS collaboration

    2015-01-01

    The ATLAS detector at CERN records proton-proton collisions delivered by the Large Hadron Collider (LHC). The ATLAS Trigger and Data-Acquisition (TDAQ) system identifies, selects, and stores interesting collision data. These are received from the detector readout electronics at an average rate of 100 kHz. The typical event data size is 1 to 2 MB. Overall, the ATLAS TDAQ can be seen as a distributed software system executed on a farm of roughly 2000 commodity PCs. The worker nodes are interconnected by an Ethernet network that at the restart of the LHC in 2015 is expected to experience a sustained throughput of several 10 GB/s. A particular type of challenge posed by this system, and by DAQ systems in general, is the inherently bursty nature of the data traffic from the readout buffers to the worker nodes. This can cause instantaneous network congestion and therefore performance degradation. The effect is particularly pronounced for unreliable network interconnections, such as Ethernet...

  14. Data-flow performance optimization on unreliable networks: the ATLAS data-acquisition case

    CERN Document Server

    Colombo, T; The ATLAS collaboration

    2014-01-01

    The ATLAS detector at CERN records proton-proton collisions delivered by the Large Hadron Collider (LHC). The ATLAS Trigger and Data-Acquisition (TDAQ) system identifies, selects, and stores interesting collision data. These are received from the detector readout electronics at an average rate of 100 kHz. The typical event data size is 1 to 2 MB. Overall, the ATLAS TDAQ can be seen as a distributed software system executed on a farm of roughly 2000 commodity PCs. The worker nodes are interconnected by an Ethernet network that at the restart of the LHC in 2015 is expected to experience a sustained throughput of several 10 GB/s. A particular type of challenge posed by this system, and by DAQ systems in general, is the inherently bursty nature of the data traffic from the readout buffers to the worker nodes. This can cause instantaneous network congestion and therefore performance degradation. The effect is particularly pronounced for unreliable network interconnections, such as Ethernet. In this presentation we...
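    One common way to tame bursty many-to-one traffic of the kind described above is credit-based traffic shaping: each worker node caps the number of fragment requests it has in flight, so the readout buffers are never asked to emit an unbounded burst. The sketch below is illustrative only (the `CreditScheduler` class and its parameters are hypothetical, not the actual ATLAS TDAQ mechanism).

    ```python
    from collections import deque

    class CreditScheduler:
        """Toy sketch of credit-based traffic shaping: fragment requests are
        issued only while credits remain, capping the instantaneous burst."""

        def __init__(self, credits):
            self.credits = credits      # maximum requests in flight
            self.pending = deque()      # fragments waiting to be requested
            self.in_flight = 0

        def submit(self, fragment_id):
            self.pending.append(fragment_id)

        def next_requests(self):
            """Issue requests up to the credit limit and return them."""
            issued = []
            while self.pending and self.in_flight < self.credits:
                issued.append(self.pending.popleft())
                self.in_flight += 1
            return issued

        def on_fragment_received(self):
            self.in_flight -= 1         # a completed transfer returns a credit

    sched = CreditScheduler(credits=2)
    for fid in range(5):
        sched.submit(fid)
    first_burst = sched.next_requests()   # only 2 requests issued despite 5 pending
    ```

    With a credit limit of 2, the five submitted fragments are requested in waves of at most two, smoothing the load the network sees at any instant.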

  15. Business process transformation

    CERN Document Server

    Grover, Varun

    2015-01-01

    Featuring contributions from prominent thinkers and researchers, this volume in the ""Advances in Management Information Systems"" series provides a rich set of conceptual, empirical, and introspective studies that epitomize fundamental knowledge in the area of Business Process Transformation. Processes are interpreted broadly to include operational and managerial processes within and between organizations, as well as those involved in knowledge generation. Transformation includes radical and incremental change, its conduct, management, and outcome. The editors and contributing authors pay clo

  16. Rational Unified Process

    OpenAIRE

    Kopal, Nils

    2016-01-01

    In this German seminar paper, written in 2011 at the University of Duisburg for a Bachelor colloquium in applied computer science, we give a brief overview of the Rational Unified Process (RUP). Thus, interested students and people generally interested in software development gain a first impression of RUP. The paper includes a survey and overview of the underlying process structure, the phases of the process, and its workflows, and describes the always by the RUP developers pos...

  17. Deference and Due Process

    OpenAIRE

    Vermeule, Cornelius Adrian

    2015-01-01

    In the textbooks, procedural due process is a strictly judicial enterprise; although substantive entitlements are created by legislative and executive action, it is for courts to decide independently what process the Constitution requires. The notion that procedural due process might be committed primarily to the discretion of the agencies themselves is almost entirely absent from the academic literature. The facts on the ground are very different. Thanks to converging strands of caselaw ...

  18. Process Improvement Essentials

    CERN Document Server

    Persse, James R

    2006-01-01

    Process Improvement Essentials combines the foundation needed to understand process improvement theory with the best practices to help individuals implement process improvement initiatives in their organization. The three leading programs: ISO 9001:2000, CMMI, and Six Sigma--amidst the buzz and hype--tend to get lumped together under a common label. This book delivers a combined guide to all three programs, compares their applicability, and then sets the foundation for further exploration.

  19. Basic digital signal processing

    CERN Document Server

    Lockhart, Gordon B

    1985-01-01

    Basic Digital Signal Processing describes the principles of digital signal processing and experiments with BASIC programs involving the fast Fourier transform (FFT). The book reviews the fundamentals of the BASIC language, continuous and discrete time signals including analog signals, Fourier analysis, the discrete Fourier transform, signal energy, and power. The text also explains digital signal processing involving digital filters, linear time-invariant systems, the discrete-time unit impulse, discrete-time convolution, and the alternative structure for second-order infinite impulse response (IIR) sections.
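    The book's FFT experiments are written in BASIC; as a rough modern analogue, here is a minimal recursive radix-2 Cooley-Tukey FFT in Python. This is an illustrative sketch, not the book's code; input length must be a power of two.

    ```python
    import cmath

    def fft(x):
        """Recursive radix-2 Cooley-Tukey FFT. len(x) must be a power of two."""
        n = len(x)
        if n == 1:
            return list(x)
        even = fft(x[0::2])   # DFT of even-indexed samples
        odd = fft(x[1::2])    # DFT of odd-indexed samples
        out = [0j] * n
        for k in range(n // 2):
            twiddle = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
            out[k] = even[k] + twiddle
            out[k + n // 2] = even[k] - twiddle
        return out

    # A pure complex tone at bin 3 concentrates all its energy in spectrum[3].
    signal = [cmath.exp(2j * cmath.pi * 3 * t / 8) for t in range(8)]
    spectrum = fft(signal)
    ```

    Splitting the length-n DFT into two length-n/2 DFTs at each level is what reduces the cost from O(n²) to O(n log n).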

  20. Multiphoton processes: conference proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Lambropoulos, P.; Smith, S.J. (eds.)

    1984-01-01

    The chapters of this volume represent the invited papers delivered at the conference. They are arranged according to thematic proximity, beginning with atoms and continuing with molecules and surfaces. Section headings include multiphoton processes in atoms; field fluctuations and collisions in multiphoton processes; and multiphoton processes in molecules and surfaces. Abstracts of individual items from the conference were prepared separately for the database. (GHT)

  1. TEP process flow diagram

    Energy Technology Data Exchange (ETDEWEB)

    Wilms, R Scott [Los Alamos National Laboratory; Carlson, Bryan [Los Alamos National Laboratory; Coons, James [Los Alamos National Laboratory; Kubic, William [Los Alamos National Laboratory

    2008-01-01

    This presentation describes the development of the proposed Process Flow Diagram (PFD) for the Tokamak Exhaust Processing System (TEP) of ITER. A brief review of design efforts leading up to the PFD is followed by a description of the hydrogen-like, air-like, and water-like processes. Two new design values are described: the most-common and most-demanding design values. The proposed PFD is shown to meet specifications under the most-common and most-demanding design values.

  2. Standard Model processes

    CERN Document Server

    Mangano, M.L.; Aguilar-Saavedra, Juan Antonio; Alekhin, S.; Badger, S.; Bauer, C.W.; Becher, T.; Bertone, V.; Bonvini, M.; Boselli, S.; Bothmann, E.; Boughezal, R.; Cacciari, M.; Carloni Calame, C.M.; Caola, F.; Campbell, J.M.; Carrazza, S.; Chiesa, M.; Cieri, L.; Cimaglia, F.; Febres Cordero, F.; Ferrarese, P.; D'Enterria, D.; Ferrera, G.; Garcia i Tormo, X.; Garzelli, M.V.; Germann, E.; Hirschi, V.; Han, T.; Ita, H.; Jäger, B.; Kallweit, S.; Karlberg, A.; Kuttimalai, S.; Krauss, F.; Larkoski, A.J.; Lindert, J.; Luisoni, G.; Maierhöfer, P.; Mattelaer, O.; Martinez, H.; Moch, S.; Montagna, G.; Moretti, M.; Nason, P.; Nicrosini, O.; Oleari, C.; Pagani, D.; Papaefstathiou, A.; Petriello, F.; Piccinini, F.; Pierini, M.; Pierog, T.; Pozzorini, S.; Re, E.; Robens, T.; Rojo, J.; Ruiz, R.; Sakurai, K.; Salam, G.P.; Salfelder, L.; Schönherr, M.; Schulze, M.; Schumann, S.; Selvaggi, M.; Shivaji, A.; Siodmok, A.; Skands, P.; Torrielli, P.; Tramontano, F.; Tsinikos, I.; Tweedie, B.; Vicini, A.; Westhoff, S.; Zaro, M.; Zeppenfeld, D.; CERN. Geneva. ATS Department

    2017-06-22

    This report summarises the properties of Standard Model processes at the 100 TeV pp collider. We document the production rates and typical distributions for a number of benchmark Standard Model processes, and discuss new dynamical phenomena arising at the highest energies available at this collider. We discuss the intrinsic physics interest in the measurement of these Standard Model processes, as well as their role as backgrounds for New Physics searches.

  3. Scientific information processing procedures

    Directory of Open Access Journals (Sweden)

    García, Maylin

    2013-07-01

    The paper systematizes several theoretical viewpoints on the scientific information processing skill. It decomposes the processing skill into sub-skills. Several methods, such as analysis, synthesis, induction, deduction, and document analysis, were used to build a theoretical framework. Interviews, a survey of professionals in training, and a case study were carried out to evaluate the results. All professionals in the sample improved their performance in scientific information processing.

  4. Integrated Renovation Process

    DEFF Research Database (Denmark)

    Galiotto, Nicolas; Heiselberg, Per; Knudstrup, Mary-Ann

    2016-01-01

    renovation to be overcome. The homeowners were better integrated and their preferences and immaterial values were better taken into account. To keep the decision-making process economically viable and timely, the process as known today still needs to be improved, and new tools need to be developed. This paper presents a new scheme: the integrated renovation process. One successful case study is introduced, and recommendations for future developments needed in the field are provided.

  5. Transuranium processing plant

    International Nuclear Information System (INIS)

    King, L.J.

    1983-01-01

    The Transuranium Processing Plant (TRU) is a remotely operated, hot-cell, chemical processing facility of advanced design. The heart of TRU is a battery of nine heavily shielded process cells housed in a two-story building. Each cell, with its 54-inch-thick walls of a special high-density concrete, has enough shielding to stop the neutrons and gamma radiation from 1 gram of 252Cf and associated fission products. Four cells contain chemical processing equipment, three contain equipment for the preparation and inspection of HFIR targets, and two cells are used for analytical chemistry operations. In addition, there are eight laboratories used for process development, for part of the process-control analyses, and for product finishing operations. Although the Transuranium Processing Plant was built for the purpose of recovering transuranium elements from targets irradiated in the High Flux Isotope Reactor (HFIR), it is also a highly versatile facility which has extensive provisions for changing and modifying equipment. Thus, it was a relatively simple matter to install a Solvent Extraction Test Facility (SETF) in one of the TRU chemical processing cells for use in the evaluation and demonstration of solvent extraction flowsheets for the recovery of fissile and fertile materials from irradiated reactor fuels. The equipment in the SETF has been designed for process development and demonstrations and the particular type of mixer-settler contactors was chosen because it is easy to observe and sample

  6. Linearity in Process Languages

    DEFF Research Database (Denmark)

    Nygaard, Mikkel; Winskel, Glynn

    2002-01-01

    The meaning and mathematical consequences of linearity (managing without a presumed ability to copy) are studied for a path-based model of processes which is also a model of affine-linear logic. This connection yields an affine-linear language for processes, automatically respecting open-map bisimulation, in which a range of process operations can be expressed. An operational semantics is provided for the tensor fragment of the language. Different ways to make assemblies of processes lead to different choices of exponential, some of which respect bisimulation.

  7. Refractometry in process engineering

    Energy Technology Data Exchange (ETDEWEB)

    Roepscher, H

    1980-02-01

    Following a brief historical introduction to general refractometry, the limiting-angle refractometer is dealt with in the first section and the differential refractometer in the second, along with process engineering information on this measuring method. An extensive, practice-oriented description attempts to introduce planners and technicians to this physical measuring method in process engineering so that they can use it themselves if necessary. Properly applied, it can be a valuable aid to process control in line with process automation.

  8. Process modeling style

    CERN Document Server

    Long, John

    2014-01-01

    Process Modeling Style focuses on aspects of process modeling beyond notation that are very important to practitioners. Many people who model processes focus on the specific notation used to create their drawings. While that is important, there are many other aspects to modeling, such as naming, creating identifiers, descriptions, interfaces, patterns, and creating useful process documentation. Experienced author John Long focuses on those non-notational aspects of modeling, which practitioners will find invaluable. Gives solid advice for creating roles, work produ

  9. The Critical Design Process

    DEFF Research Database (Denmark)

    Brunsgaard, Camilla; Knudstrup, Mary-Ann; Heiselberg, Per

    2014-01-01

    within Danish tradition of architecture and construction. The objective of the research presented in this paper is to compare the different design processes behind the making of passive houses in a Danish context. We evaluated the process with regard to the integrated and traditional design process. Data analysis showed that the majority of the consortiums worked in an integrated manner, though there was room for improvement. Additionally, the paper discusses the challenges of implementing the integrated design process in practice and suggests ways of overcoming some of the barriers. In doing so...

  10. Multimodal Processes Rescheduling

    DEFF Research Database (Denmark)

    Bocewicz, Grzegorz; Banaszak, Zbigniew A.; Nielsen, Peter

    2013-01-01

    Cyclic scheduling problems concerning multimodal processes are usually observed in FMSs producing multi-type parts where the Automated Guided Vehicles System (AGVS) plays the role of a material handling system. Schedulability analysis of concurrently flowing cyclic processes (SCCP) executed in the...

  11. Fuels Processing Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — NETL’s Fuels Processing Laboratory in Morgantown, WV, provides researchers with the equipment they need to thoroughly explore the catalytic issues associated with...

  12. Quantum information processing

    National Research Council Canada - National Science Library

    Leuchs, Gerd; Beth, Thomas

    2003-01-01

    Table-of-contents excerpt: 1.5 Simulation of Hamiltonians; References; 2. Quantum Information Processing and Error Correction with Jump Codes (G. Alber, M. Mussinger...

  13. Exhaust gas processing facility

    International Nuclear Information System (INIS)

    Terada, Shin-ichi.

    1995-01-01

    The facility of the present invention comprises a radioactive liquid storage vessel, an exhaust gas dehumidifying device for dehumidifying gases exhausted from the vessel and an exhaust gas processing device for reducing radioactive materials in the exhaust gases. A purified gas line is disposed to the radioactive liquid storage vessel for purging exhaust gases generated from the radioactive liquid, then dehumidified and condensed liquid is recovered, and exhaust gases are discharged through an exhaust gas pipe disposed downstream of the exhaust gas processing device. With such procedures, the scale of the exhaust gas processing facility can be reduced and exhaust gases can be processed efficiently. (T.M.)

  14. Chemical process hazards analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-02-01

    The Office of Worker Health and Safety (EH-5) under the Assistant Secretary for the Environment, Safety and Health of the US Department of Energy (DOE) has published two handbooks for use by DOE contractors managing facilities and processes covered by the Occupational Safety and Health Administration (OSHA) Rule for Process Safety Management of Highly Hazardous Chemicals (29 CFR 1910.119), herein referred to as the PSM Rule. The PSM Rule contains an integrated set of chemical process safety management elements designed to prevent chemical releases that can lead to catastrophic fires, explosions, or toxic exposures. The purpose of the two handbooks, "Process Safety Management for Highly Hazardous Chemicals" and "Chemical Process Hazards Analysis," is to facilitate implementation of the provisions of the PSM Rule within the DOE. The purpose of this handbook, "Chemical Process Hazards Analysis," is to facilitate, within the DOE, the performance of chemical process hazards analyses (PrHAs) as required under the PSM Rule. It provides basic information for the performance of PrHAs, and should not be considered a complete resource on PrHA methods. Likewise, to determine if a facility is covered by the PSM Rule, the reader should refer to the handbook "Process Safety Management for Highly Hazardous Chemicals" (DOE-HDBK-1101-96). Promulgation of the PSM Rule has heightened the awareness of chemical safety management issues within the DOE. This handbook is intended for use by DOE facilities and processes covered by the PSM Rule to facilitate contractor implementation of the PrHA element of the PSM Rule. However, contractors whose facilities and processes are not covered by the PSM Rule may also use this handbook as a basis for conducting process hazards analyses as part of their good management practices. This handbook explains the minimum requirements for PrHAs outlined in the PSM Rule. Nowhere have requirements been added beyond what is specifically required by the rule.

  15. Processer og procesledelse

    DEFF Research Database (Denmark)

    Madsen, Benedicte

    The book unfolds, nuances, and concretizes process facilitation with respect to mental, relational, and organizational processes. Examples of chapter headings: The concept of process, Framing, The contract tool, Meeting facilitation, Conversations, Reflective positions and processes, Conflict management and team development, Invitation to the dialogical space - on working above and below the line, and Mental restructurings.

  16. The Player Engagement Process

    DEFF Research Database (Denmark)

    Schoenau-Fog, Henrik

    2011-01-01

    , categories and triggers involved in this process. By applying grounded theory to the analysis of the responses, a process-oriented player engagement framework was developed and four main components consisting of objectives, activities, accomplishments and affects as well as the corresponding categories...

  17. Mineral Processing Technology Roadmap

    Energy Technology Data Exchange (ETDEWEB)

    none,

    2000-09-01

    This document represents the roadmap for Processing Technology Research in the US Mining Industry. It was developed based on the results of a Processing Technology Roadmap Workshop sponsored by the National Mining Association in conjunction with the US Department of Energy, Office of Energy Efficiency and Renewable Energy, Office of Industrial Technologies. The Workshop was held January 24 - 25, 2000.

  18. Process control program development

    International Nuclear Information System (INIS)

    Dameron, H.J.

    1985-01-01

    This paper details the development and implementation of a "Process Control Program" at Duke Power's three nuclear stations: Oconee, McGuire, and Catawba. Each station is required by Technical Specification to have a "Process Control Program" (PCP) to control all dewatering and/or solidification activities for radioactive wastes.

  19. Valuing the Accreditation Process

    Science.gov (United States)

    Bahr, Maria

    2018-01-01

    The value of the National Association for Developmental Education (NADE) accreditation process is far-reaching. Not only do students and programs benefit from the process, but also the entire institution. Through data collection of student performance, analysis, and resulting action plans, faculty and administrators can work cohesively towards…

  20. CMOS/SOS processing

    Science.gov (United States)

    Ramondetta, P.

    1980-01-01

    Report describes processes used in making complementary - metal - oxide - semiconductor/silicon-on-sapphire (CMOS/SOS) integrated circuits. Report lists processing steps ranging from initial preparation of sapphire wafers to final mapping of "good" and "bad" circuits on a wafer.

  1. Management oriented process

    International Nuclear Information System (INIS)

    2004-01-01

    ANAV decided to implement process-oriented management by adopting the U.S. NEI (Nuclear Energy Institute) model. The article describes the initial phases of the project, its current status, and future prospects. The project has been considered an improvement in the areas of organization and human factors. Recently, IAEA standard drafts have been including processes as an accepted management model. (Author)

  2. Hybrid quantum information processing

    Energy Technology Data Exchange (ETDEWEB)

    Furusawa, Akira [Department of Applied Physics, School of Engineering, The University of Tokyo (Japan)

    2014-12-04

    I will briefly explain the definition and advantage of hybrid quantum information processing, which is the hybridization of qubit and continuous-variable technologies. The final goal would be the realization of universal gate sets both for qubit and continuous-variable quantum information processing with the hybrid technologies. For that purpose, qubit teleportation with a continuous-variable teleporter is one of the most important ingredients.

  3. Industrial separation processes : fundamentals

    NARCIS (Netherlands)

    Haan, de A.B.; Bosch, Hans

    2013-01-01

    Separation processes on an industrial scale account for well over half of capital and operating costs. They are basic knowledge in every chemical engineering and process engineering curriculum. This book provides comprehensive and fundamental knowledge for university teaching in this discipline,

  4. METAL PLATING PROCESS

    Science.gov (United States)

    Walker, D.E.; Noland, R.A.

    1958-08-12

    A process is described for obtaining a closely bonded coating of steel or iron on uranium. The process consists of providing, between the steel and uranium, a layer of silver, and then pressure rolling the assembly at about 600 deg C until a reduction of from 10 to 50% has been obtained.

  5. The Analytical Hierarchy Process

    DEFF Research Database (Denmark)

    Barfod, Michael Bruhn

    2007-01-01

    The technical note gathers the theory behind the Analytical Hierarchy Process (AHP) and presents its advantages and disadvantages in practical use.
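    For readers unfamiliar with AHP mechanics, a common approximation derives priority weights from a pairwise comparison matrix by the row geometric-mean method. The sketch below uses a hypothetical 3-criterion matrix for illustration; the note itself may use other examples or the eigenvector method.

    ```python
    import math

    def ahp_weights(matrix):
        """Approximate AHP priority weights via the row geometric-mean method:
        take the geometric mean of each row, then normalize to sum to 1."""
        n = len(matrix)
        gm = [math.prod(row) ** (1.0 / n) for row in matrix]
        total = sum(gm)
        return [g / total for g in gm]

    # Hypothetical comparisons: criterion A is 3x as important as B and 5x as
    # important as C; B is 2x as important as C. Reciprocals fill the lower half.
    pairwise = [
        [1.0,   3.0, 5.0],
        [1 / 3, 1.0, 2.0],
        [1 / 5, 1 / 2, 1.0],
    ]
    weights = ahp_weights(pairwise)   # A gets the largest weight, C the smallest
    ```

    The geometric-mean method is popular in practice because it is simple to compute by hand and closely approximates the principal-eigenvector weights for consistent matrices.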

  6. Semantic Business Process Modeling

    OpenAIRE

    Markovic, Ivan

    2010-01-01

    This book presents a process-oriented business modeling framework based on semantic technologies. The framework consists of modeling languages, methods, and tools that allow for semantic modeling of business motivation, business policies and rules, and business processes. Quality of the proposed modeling framework is evaluated based on the modeling content of SAP Solution Composer and several real-world business scenarios.

  7. Process energy analysis

    International Nuclear Information System (INIS)

    Kaiser, V.

    1993-01-01

    In Chapter 2 process energy cost analysis for chemical processing is treated in a general way, independent of the specific form of energy and power production. Especially, energy data collection and data treatment, energy accounting (metering, balance setting), specific energy input, and utility energy costs and prices are discussed. (R.P.) 14 refs., 4 figs., 16 tabs

  8. Continuous affine processes

    DEFF Research Database (Denmark)

    Buchardt, Kristian

    2016-01-01

    Affine processes possess the property that expectations of exponential affine transformations are given by a set of Riccati differential equations, which is the main feature of this popular class of processes. In this paper we generalise these results for expectations of more general transformati...
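    A concrete instance of the Riccati connection mentioned above is the CIR short-rate model, where the zero-coupon bond price takes the exponential-affine form exp(A(tau) - B(tau)*r0), with A and B solving Riccati-type ODEs. The sketch below Euler-integrates those ODEs; the parameter values are illustrative, and this is a generic textbook example rather than the paper's own construction.

    ```python
    import math

    def cir_bond_price(r0, kappa, theta, sigma, T, steps=10000):
        """Zero-coupon bond price under the CIR model via the affine ansatz
        P = exp(A - B*r0), Euler-integrating the Riccati ODEs
            dB/dtau = 1 - kappa*B - (sigma^2/2)*B^2,  B(0) = 0
            dA/dtau = -kappa*theta*B,                 A(0) = 0."""
        dt = T / steps
        A, B = 0.0, 0.0
        for _ in range(steps):
            dB = 1.0 - kappa * B - 0.5 * sigma**2 * B**2
            dA = -kappa * theta * B
            B += dB * dt
            A += dA * dt
        return math.exp(A - B * r0)

    # Illustrative parameters: 5-year bond, mean-reverting rate around 4%.
    price = cir_bond_price(r0=0.03, kappa=0.5, theta=0.04, sigma=0.1, T=5.0)
    ```

    The key affine property on display: the expectation of the exponential transform reduces to two scalar ODEs, so no PDE or Monte Carlo simulation is needed.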

  9. Food-Processing Wastes.

    Science.gov (United States)

    Frenkel, Val S; Cummings, Gregg A; Maillacheruvu, K Y; Tang, Walter Z

    2017-10-01

    Literature published in 2016 and early 2017 related to food processing waste treatment for industrial applications is reviewed. This review is a subsection of the Treatment Systems section of the annual Water Environment Federation literature review and covers the following food processing industries and applications: general, meat and poultry, fruits and vegetables, dairy and beverage, and miscellaneous treatment of food wastes.

  10. Food processing and allergenicity.

    Science.gov (United States)

    Verhoeckx, Kitty C M; Vissers, Yvonne M; Baumert, Joseph L; Faludi, Roland; Feys, Marcel; Flanagan, Simon; Herouet-Guicheney, Corinne; Holzhauser, Thomas; Shimojo, Ryo; van der Bolt, Nieke; Wichers, Harry; Kimber, Ian

    2015-06-01

    Food processing can have many beneficial effects. However, processing may also alter the allergenic properties of food proteins. A wide variety of processing methods is available and their use depends largely on the food to be processed. In this review the impact of processing (heat and non-heat treatment) on the allergenic potential of proteins, and on the antigenic (IgG-binding) and allergenic (IgE-binding) properties of proteins has been considered. A variety of allergenic foods (peanuts, tree nuts, cows' milk, hens' eggs, soy, wheat and mustard) have been reviewed. The overall conclusion drawn is that processing does not completely abolish the allergenic potential of allergens. Currently, only fermentation and hydrolysis may have potential to reduce allergenicity to such an extent that symptoms will not be elicited, while other methods might be promising but need more data. Literature on the effect of processing on allergenic potential and the ability to induce sensitisation is scarce. This is an important issue since processing may impact on the ability of proteins to cause the acquisition of allergic sensitisation, and the subject should be a focus of future research. Also, there remains a need to develop robust and integrated methods for the risk assessment of food allergenicity.

  11. Business process intelligence

    NARCIS (Netherlands)

    Castellanos, M.; Alves De Medeiros, A.K.; Mendling, J.; Weber, B.; Weijters, A.J.M.M.; Cardoso, J.; Aalst, van der W.M.P.

    2009-01-01

    Business Process Intelligence (BPI) is an emerging area that is getting increasingly popular for enterprises. The need to improve business process efficiency, to react quickly to changes and to meet regulatory compliance is among the main drivers for BPI. BPI refers to the application of Business

  12. Correctness of concurrent processes

    NARCIS (Netherlands)

    E.R. Olderog (Ernst-Rüdiger)

    1989-01-01

    A new notion of correctness for concurrent processes is introduced and investigated. It is a relationship P sat S between process terms P built up from operators of CCS [Mi 80], CSP [Ho 85] and COSY [LTS 79] and logical formulas S specifying sets of finite communication sequences as in

  13. Uranium processing and properties

    CERN Document Server

    2013-01-01

    Covers a broad spectrum of topics and applications that deal with uranium processing and the properties of uranium Offers extensive coverage of both new and established practices for dealing with uranium supplies in nuclear engineering Promotes the documentation of the state-of-the-art processing techniques utilized for uranium and other specialty metals

  14. Relational Processing Following Stroke

    Science.gov (United States)

    Andrews, Glenda; Halford, Graeme S.; Shum, David; Maujean, Annick; Chappell, Mark; Birney, Damian

    2013-01-01

    The research examined relational processing following stroke. Stroke patients (14 with frontal, 30 with non-frontal lesions) and 41 matched controls completed four relational processing tasks: sentence comprehension, Latin square matrix completion, modified Dimensional Change Card Sorting, and n-back. Each task included items at two or three…

  15. Contaminated nickel scrap processing

    International Nuclear Information System (INIS)

    Compere, A.L.; Griffith, W.L.; Hayden, H.W.; Johnson, J.S. Jr.; Wilson, D.F.

    1994-12-01

    The DOE will soon choose between treating contaminated nickel scrap as a legacy waste and developing high-volume nickel decontamination processes. In addition to reducing the volume of legacy wastes, a decontamination process could make 200,000 tons of this strategic metal available for domestic use. Contaminants in DOE nickel scrap include ²³⁴Th, ²³⁴Pa, ¹³⁷Cs, ²³⁹Pu (trace), ⁶⁰Co, U, ⁹⁹Tc, and ²³⁷Np (trace). This report reviews several industrial-scale processes -- electrorefining, electrowinning, vapor metallurgy, and leaching -- used for the purification of nickel. Conventional nickel electrolysis processes are particularly attractive because they use side-stream purification of process solutions to improve the purity of nickel metal. Additionally, nickel purification by electrolysis is effective in a variety of electrolyte systems, including sulfate, chloride, and nitrate. Conventional electrorefining processes typically use a mixed electrolyte which includes sulfate, chloride, and borate. The use of an electrorefining or electrowinning system for scrap nickel recovery could be combined effectively with a variety of processes, including cementation, solvent extraction, ion exchange, complex-formation, and surface sorption, developed for uranium and transuranic purification. Selected processes were reviewed and evaluated for use in nickel side-stream purification. 80 refs.

  16. Production process of VE

    International Nuclear Information System (INIS)

    1987-07-01

    This book presents a synopsis of the VE (value engineering) production process: object selection methods and the establishment of targets, collection of object information, design of functions, writing improvement suggestions, evaluation of improvement suggestions, the various worksheets of the VE production process, and explanations of IE and PERT.

  17. Perspectives on Innovation Processes

    NARCIS (Netherlands)

    Garud, R.; Tuertscher, P.R.; Van de Ven, A.H.

    2013-01-01

    Innovation is often thought of as an outcome. In this chapter, we review the literatures on innovation processes pertaining to the invention, development, and implementation of ideas. In particular, we explore how these processes unfold within firms, across multi-party networks, and within

  18. Hyperspectral image processing methods

    Science.gov (United States)

    Hyperspectral image processing refers to the use of computer algorithms to extract, store and manipulate both spatial and spectral information contained in hyperspectral images across the visible and near-infrared portion of the electromagnetic spectrum. A typical hyperspectral image processing work...

  19. Methods of information processing

    Energy Technology Data Exchange (ETDEWEB)

    Kosarev, Yu G; Gusev, V D

    1978-01-01

    Works are presented on automation systems for editing and publishing operations using methods of processing symbolic information and information contained in training samples (ranking of objectives by promise, an algorithm for classifying tones and noise). The book will be of interest to specialists in the automated processing of textual information, programming, and pattern recognition.

  20. Process for preparing radiopharmaceuticals

    International Nuclear Information System (INIS)

    Barak, M.; Winchell, H.S.

    1977-01-01

    A process for the preparation of technetium-99m labeled pharmaceuticals is disclosed. The process comprises initially isolating technetium-99m pertechnetate by adsorption upon an adsorbent packing in a chromatographic column. The technetium-99m is then eluted from the packing with a biological compound to form a radiopharmaceutical

  1. Contaminated nickel scrap processing

    Energy Technology Data Exchange (ETDEWEB)

    Compere, A.L.; Griffith, W.L.; Hayden, H.W.; Johnson, J.S. Jr.; Wilson, D.F.

    1994-12-01

    The DOE will soon choose between treating contaminated nickel scrap as a legacy waste and developing high-volume nickel decontamination processes. In addition to reducing the volume of legacy wastes, a decontamination process could make 200,000 tons of this strategic metal available for domestic use. Contaminants in DOE nickel scrap include ²³⁴Th, ²³⁴Pa, ¹³⁷Cs, ²³⁹Pu (trace), ⁶⁰Co, U, ⁹⁹Tc, and ²³⁷Np (trace). This report reviews several industrial-scale processes -- electrorefining, electrowinning, vapor metallurgy, and leaching -- used for the purification of nickel. Conventional nickel electrolysis processes are particularly attractive because they use side-stream purification of process solutions to improve the purity of nickel metal. Additionally, nickel purification by electrolysis is effective in a variety of electrolyte systems, including sulfate, chloride, and nitrate. Conventional electrorefining processes typically use a mixed electrolyte which includes sulfate, chloride, and borate. The use of an electrorefining or electrowinning system for scrap nickel recovery could be combined effectively with a variety of processes, including cementation, solvent extraction, ion exchange, complex-formation, and surface sorption, developed for uranium and transuranic purification. Selected processes were reviewed and evaluated for use in nickel side-stream purification. 80 refs.

  2. Microcontroller base process emulator

    OpenAIRE

    Jovrea Titus Claudiu

    2009-01-01

    This paper describes the design of a microcontroller-based emulator for a conventional industrial process. The emulator is built around a microcontroller and is used for testing and evaluating the performance of industrial regulators. The parameters of the emulated process are fully customizable online and downloadable through a serial communication link from a personal computer.

  3. Cementation process study

    International Nuclear Information System (INIS)

    Park, H.H.; Han, K.W.; Ahn, S.J.; Choi, K.S.; Lee, M.W.; Ryu, Y.K.

    1985-01-01

    In the cementation process study, the design of the waste treatment simulator was finished in 1984 as a first step. Through the design of the cementation process we gain experience not only in the operation of the solidification system but also in the design and construction of the coming large-scale plant. (Author)

  4. Monitoring of operating processes

    International Nuclear Information System (INIS)

    Barry, R.F.

    1981-01-01

    Apparatus is described for monitoring the processes of a nuclear reactor to detect off-normal operation of any process, and for testing the monitoring apparatus itself. The processes are evaluated through their parameters, such as temperature, pressure, etc. The apparatus includes a pair of monitoring paths, or signal-processing units. Each unit includes facilities for receiving, on a time-sharing basis, a status binary word made up of digits each indicating the status of a process (normal or off-normal), and test-signal binary words simulating the status binary words. The status words and test words are processed in succession during successive cycles. During each cycle, the two units receive the same status word and the same test word. The test words simulate the status words both when they indicate normal operation and when they indicate off-normal operation. Each signal-processing unit includes a pair of memories. Each memory receives a status word or a test word, as the case may be, and converts it into a converted status word or a converted test word. The memories of each monitoring unit feed a non-coincidence circuit, which signals when the converted word from one memory of a signal-processing unit is not identical to the converted word from the other memory of the same unit.
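
The dual-path arrangement amounts to a lockstep comparison: the same word is converted by two independent memories, and any disagreement raises an alarm. A minimal sketch (the two-bit status words and the conversion table are hypothetical, not from the patent):

```python
def non_coincidence(word, memory_a, memory_b):
    """Convert the same status (or test) word through two independent
    memories; return True when the converted words disagree, i.e. when
    the non-coincidence circuit would signal a fault."""
    return memory_a(word) != memory_b(word)

# Hypothetical conversion table shared by both memories of a healthy unit
table = {0b00: "normal", 0b01: "off-normal"}

print(non_coincidence(0b01, table.get, table.get))           # False: healthy
print(non_coincidence(0b01, table.get, lambda w: "normal"))  # True: fault detected
```

Feeding test words that simulate both normal and off-normal states, as the abstract describes, exercises the comparison path itself rather than only the monitored processes.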

  5. Teaching the Writing Process

    Science.gov (United States)

    Keen, John

    2017-01-01

    This article outlines some cognitive process models of writing composition. Possible reasons why students' writing capabilities do not match their abilities in some other school subjects are explored. Research findings on the efficacy of process approaches to teaching writing are presented and potential shortcomings are discussed. Product-based…

  6. Heavy oils processing materials requirements crude processing

    Energy Technology Data Exchange (ETDEWEB)

    Sloley, Andrew W. [CH2M Hill, Englewood, CO (United States)

    2012-07-01

    Over time, recommended best practices for crude unit materials selection have evolved to accommodate new operating requirements, feed qualities, and product qualities. The shift to heavier oil processing is one of the major changes in crude feed quality occurring over the last 20 years. The three major types of crude unit corrosion include sulfidation attack, naphthenic acid attack, and corrosion resulting from hydrolyzable chlorides. Heavy oils processing makes all three areas worse. Heavy oils have higher sulfur content; higher naphthenic acid content; and are more difficult to desalt, leading to higher chloride corrosion rates. Materials selection involves two major criteria, meeting required safety standards, and optimizing economics of the overall plant. Proper materials selection is only one component of a plant integrity approach. Materials selection cannot eliminate all corrosion. Proper materials selection requires appropriate support from other elements of an integrity protection program. The elements of integrity preservation include: materials selection (type and corrosion allowance); management limits on operating conditions allowed; feed quality control; chemical additives for corrosion reduction; and preventive maintenance and inspection (PMI). The following discussion must be taken in the context of the application of required supporting work in all the other areas. Within that context, specific materials recommendations are made to minimize corrosion due to the most common causes in the crude unit. (author)

  7. Performance Measurement of Location Enabled e-Government Processes: A Use Case on Traffic Safety Monitoring

    Science.gov (United States)

    Vandenbroucke, D.; Vancauwenberghe, G.

    2016-12-01

    The European Union Location Framework (EULF), as part of the Interoperable Solutions for European Public Administrations (ISA) Programme of the EU (EC DG DIGIT), aims to enhance the interactions between governments, businesses and citizens by embedding location information into e-Government processes. The challenge remains to find scientifically sound and at the same time practicable approaches to estimate or measure the impact of location enablement of e-Government processes on the performance of the processes. A method has been defined to estimate process performance in terms of variables describing the efficiency, effectiveness, as well as the quality of the output of the work processes. A series of use cases have been identified, corresponding to existing e-Government work processes in which location information could bring added value. In a first step, the processes are described by means of BPMN (Business Process Model and Notation) to better understand the process steps, the actors involved, the spatial data flows, as well as the required input and the generated output. In a second step the processes are assessed in terms of the (sub-optimal) use of location information and the potential enhancement of the process by better integrating location information and services. The process performance is measured ex ante (before using location enabled e-Government services) and ex post (after the integration of such services) in order to estimate and measure the impact of location information. The paper describes the method for performance measurement and highlights how the method is applied to one use case, i.e. the process of traffic safety monitoring. The use case is analysed and assessed in terms of location enablement and its potential impact on process performance. The results of applying the methodology on the use case revealed that performance is highly impacted by factors such as the way location information is collected, managed and shared throughout the
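
The ex ante / ex post comparison reduces to a per-indicator relative change. A toy sketch of that step only (the indicator names and values are hypothetical, not from the study):

```python
def performance_delta(ex_ante, ex_post):
    """Compare process-performance indicators measured before (ex ante)
    and after (ex post) location-enabling an e-Government process.
    Returns the relative change per indicator (positive = increase)."""
    return {k: (ex_post[k] - ex_ante[k]) / ex_ante[k] for k in ex_ante}

# Hypothetical indicators for the traffic-safety-monitoring use case
ex_ante = {"cases_handled_per_day": 40.0, "output_quality_score": 0.70}
ex_post = {"cases_handled_per_day": 50.0, "output_quality_score": 0.77}

deltas = performance_delta(ex_ante, ex_post)  # cases handled improves by 25%
```

Efficiency, effectiveness and output quality would each contribute indicators of this kind; aggregating them into a single score is a separate modelling choice.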

  8. Nuclear process heat

    International Nuclear Information System (INIS)

    Barnert, H.; Hohn, H.; Schad, M.; Schwarz, D.; Singh, J.

    1993-01-01

    In a system for the application of high-temperature heat from the HTR, one must distinguish between electricity generation and the use of process heat. In this respect it is important that electricity can be generated by dual-purpose power plants. The process heat is used as sensible heat, vaporisation heat and as chemical energy in chemical conversion: for the conversion of raw materials, the refinement of fossil primary energy carriers and, finally, cycle processes for the splitting of water. These processes supply the market for heat, fuels, motor fuels and basic materials. Fifteen examples of HTR heat processes from various projects and programmes are presented in the form of energy balances, albeit rather briefly. (orig./DG) [de]

  9. Nonhomogeneous fractional Poisson processes

    Energy Technology Data Exchange (ETDEWEB)

    Wang Xiaotian [School of Management, Tianjin University, Tianjin 300072 (China)]. E-mail: swa001@126.com; Zhang Shiying [School of Management, Tianjin University, Tianjin 300072 (China); Fan Shen [Computer and Information School, Zhejiang Wanli University, Ningbo 315100 (China)

    2007-01-15

    In this paper, we propose a class of non-Gaussian stationary increment processes, named nonhomogeneous fractional Poisson processes W{sub H}{sup (j)}(t), which permit the study of the effects of long-range dependence in a large number of fields including quantum physics and finance. The processes W{sub H}{sup (j)}(t) are self-similar in a wide sense, exhibit fatter tails than Gaussian processes, and converge to the Gaussian processes in distribution in some cases. In addition, we also show that the intensity function {lambda}(t) strongly influences the existence of the highest finite moment of W{sub H}{sup (j)}(t) and the behaviour of the tail probability of W{sub H}{sup (j)}(t)

  10. Nonhomogeneous fractional Poisson processes

    International Nuclear Information System (INIS)

    Wang Xiaotian; Zhang Shiying; Fan Shen

    2007-01-01

    In this paper, we propose a class of non-Gaussian stationary increment processes, named nonhomogeneous fractional Poisson processes W_H^(j)(t), which permit the study of the effects of long-range dependence in a large number of fields including quantum physics and finance. The processes W_H^(j)(t) are self-similar in a wide sense, exhibit fatter tails than Gaussian processes, and converge to the Gaussian processes in distribution in some cases. In addition, we also show that the intensity function λ(t) strongly influences the existence of the highest finite moment of W_H^(j)(t) and the behaviour of the tail probability of W_H^(j)(t)
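
The intensity function λ(t) governs how the arrival rate varies in time. One standard way to simulate a plain nonhomogeneous Poisson process (a generic technique, not taken from the paper) is Lewis-Shedler thinning, assuming λ is bounded above by a known constant:

```python
import random

def nhpp_times(lam, lam_max, horizon, seed=0):
    """Simulate arrival times of a nonhomogeneous Poisson process with
    intensity lam(t) on [0, horizon] by Lewis-Shedler thinning: candidates
    arrive at the constant rate lam_max, and a candidate at time t is
    kept with probability lam(t) / lam_max."""
    rng = random.Random(seed)
    t, events = 0.0, []
    while True:
        t += rng.expovariate(lam_max)   # next candidate arrival
        if t > horizon:
            return events
        if rng.random() < lam(t) / lam_max:
            events.append(t)            # accept: thinned arrival

# Example: linearly increasing intensity lam(t) = 0.5 + t, bounded by 5.5 on [0, 5]
arrivals = nhpp_times(lambda t: 0.5 + t, lam_max=5.5, horizon=5.0)
```

The acceptance probability lam(t)/lam_max makes arrivals cluster where the intensity is high, which is exactly the mechanism by which λ(t) shapes the process.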

  11. Identification of wastewater processes

    DEFF Research Database (Denmark)

    Carstensen, Niels Jacob

    The introduction of on-line sensors for monitoring of nutrient salts concentrations on wastewater treatment plants with nutrient removal, opens a wide new area of modelling wastewater processes. The subject of this thesis is the formulation of operational dynamic models based on time series...... of ammonia, nitrate, and phosphate concentrations, which are measured in the aeration tanks of the biological nutrient removal system. The alternating operation modes of the BIO-DENITRO and BIO-DENIPHO processes are of particular interest. Time series models of the hydraulic and biological processes are very......-known theory of the processes with the significant effects found in data. These models are called grey box models, and they contain rate expressions for the processes of influent load of nutrients, transport of nutrients between the aeration tanks, hydrolysis and growth of biomass, nitrification......

  12. Quantum independent increment processes

    CERN Document Server

    Franz, Uwe

    2006-01-01

    This is the second of two volumes containing the revised and completed notes of lectures given at the school "Quantum Independent Increment Processes: Structure and Applications to Physics". This school was held at the Alfried-Krupp-Wissenschaftskolleg in Greifswald in March, 2003, and supported by the Volkswagen Foundation. The school gave an introduction to current research on quantum independent increment processes aimed at graduate students and non-specialists working in classical and quantum probability, operator algebras, and mathematical physics. The present second volume contains the following lectures: "Random Walks on Finite Quantum Groups" by Uwe Franz and Rolf Gohm, "Quantum Markov Processes and Applications in Physics" by Burkhard Kümmerer, "Classical and Free Infinite Divisibility and Lévy Processes" by Ole E. Barndorff-Nielsen and Steen Thorbjornsen, and "Lévy Processes on Quantum Groups and Dual Groups" by Uwe Franz.

  13. Integrated biofuels process synthesis

    DEFF Research Database (Denmark)

    Torres-Ortega, Carlo Edgar; Rong, Ben-Guang

    2017-01-01

    Second and third generation bioethanol and biodiesel are more environmentally friendly fuels than gasoline and petrodiesel, and more sustainable than first generation biofuels. However, their production processes are more complex and more expensive. In this chapter, we describe a two-stage synthesis......% used for bioethanol process), and steam and electricity from combustion (54% used as electricity) in the bioethanol and biodiesel processes. In the second stage, we saved about 5% in equipment costs and 12% in utility costs for bioethanol separation. This dual synthesis methodology, consisting of a top......-level screening task followed by a down-level intensification task, proved to be an efficient methodology for integrated biofuel process synthesis. The case study illustrates and provides important insights into the optimal synthesis and intensification of biofuel production processes with the proposed synthesis...

  14. Managing Software Process Evolution

    DEFF Research Database (Denmark)

    This book focuses on the design, development, management, governance and application of evolving software processes that are aligned with changing business objectives, such as expansion to new domains or shifting to global production. In the context of an evolving business world, it examines...... the complete software process lifecycle, from the initial definition of a product to its systematic improvement. In doing so, it addresses difficult problems, such as how to implement processes in highly regulated domains or where to find a suitable notation system for documenting processes, and provides...... essential insights and tips to help readers manage process evolutions. And last but not least, it provides a wealth of examples and cases on how to deal with software evolution in practice. Reflecting these topics, the book is divided into three parts. Part 1 focuses on software business transformation...

  15. Beryllium chemistry and processing

    CERN Document Server

    Walsh, Kenneth A

    2009-01-01

    This book introduces beryllium; its history, its chemical, mechanical, and physical properties including nuclear properties. The 29 chapters include the mineralogy of beryllium and the preferred global sources of ore bodies. The identification and specifics of the industrial metallurgical processes used to form oxide from the ore and then metal from the oxide are thoroughly described. The special features of beryllium chemistry are introduced, including analytical chemical practices. Beryllium compounds of industrial interest are identified and discussed. Alloying, casting, powder processing, forming, metal removal, joining and other manufacturing processes are covered. The effect of composition and process on the mechanical and physical properties of beryllium alloys assists the reader in material selection. The physical metallurgy chapter brings conformity between chemical and physical metallurgical processing of beryllium, metal, alloys, and compounds. The environmental degradation of beryllium and its all...

  16. Business process support

    Energy Technology Data Exchange (ETDEWEB)

    Carle, Adriana; Fiducia, Daniel [Transportadora de Gas del Sur S.A. (TGS), Buenos Aires (Argentina)

    2005-07-01

    This paper is about the in-house development of business support software. The developed applications are used to support two business processes: gas transportation and natural gas processing. This software has interfaces with the ERP SAP, SCADA software and on-line gas transportation simulation software. The main functionalities of the applications are: on-line, real-time entry of clients' transport nominations; transport programming; allocation of the clients' transport nominations; transport control; measurements; pipeline balancing; allocation of gas volumes to the gas processing plants; and calculation of the tons of product processed in each plant and the tons of product distributed to clients. All the developed software generates information for the internal staff, regulatory authorities and clients. (author)

  17. Branching processes in biology

    CERN Document Server

    Kimmel, Marek

    2015-01-01

    This book provides a theoretical background of branching processes and discusses their biological applications. Branching processes are a well-developed and powerful set of tools in the field of applied probability. The range of applications considered includes molecular biology, cellular biology, human evolution and medicine. The branching processes discussed include Galton-Watson, Markov, Bellman-Harris, Multitype, and General Processes. As an aid to understanding specific examples, two introductory chapters, and two glossaries are included that provide background material in mathematics and in biology. The book will be of interest to scientists who work in quantitative modeling of biological systems, particularly probabilists, mathematical biologists, biostatisticians, cell biologists, molecular biologists, and bioinformaticians. The authors are a mathematician and cell biologist who have collaborated for more than a decade in the field of branching processes in biology for this new edition. This second ex...

  18. Scalable Stream Processing with Quality of Service for Smart City Crowdsensing Applications

    Directory of Open Access Journals (Sweden)

    Paolo Bellavista

    2013-12-01

    Crowdsensing is emerging as a powerful paradigm capable of leveraging the collective, though imprecise, monitoring capabilities of common people carrying smartphones or other personal devices, which can effectively become real-time mobile sensors, collecting information about the physical places they live in. This unprecedented amount of information, considered collectively, offers new valuable opportunities to understand more thoroughly the environment in which we live and, more importantly, gives the chance to use this deeper knowledge to act and improve, in a virtuous loop, the environment itself. However, managing this process is a hard technical challenge, spanning several socio-technical issues: here, we focus on the related quality, reliability, and scalability trade-offs by proposing an architecture for crowdsensing platforms that dynamically self-configure and self-adapt depending on application-specific quality requirements. In the context of this general architecture, the paper will specifically focus on the Quasit distributed stream processing middleware, and show how Quasit can be used to process and analyze crowdsensing-generated data flows with differentiated quality requirements in a highly scalable and reliable way.

  19. EDITORIAL: Industrial Process Tomography

    Science.gov (United States)

    Anton Johansen, Geir; Wang, Mi

    2008-09-01

    There has been tremendous development within measurement science and technology over the past couple of decades. New sensor technologies and compact versatile signal recovery electronics are continuously expanding the limits of what can be measured and the accuracy with which this can be done. Miniaturization of sensors and the use of nanotechnology push these limits further. Also, thanks to powerful and cost-effective computer systems, sophisticated measurement and reconstruction algorithms previously only accessible in advanced laboratories are now available for in situ online measurement systems. The process industries increasingly require more process-related information, motivated by key issues such as improved process control, process utilization and process yields, ultimately driven by cost-effectiveness, quality assurance, environmental and safety demands. Industrial process tomography methods have taken advantage of the general progress in measurement science, and aim at providing more information, both quantitatively and qualitatively, on multiphase systems and their dynamics. The typical approach for such systems has been to carry out one local or bulk measurement and assume that this is representative of the whole system. In some cases, this is sufficient. However, there are many complex systems where the component distribution varies continuously and often unpredictably in space and time. The foundation of industrial tomography is to conduct several measurements around the periphery of a multiphase process, and use these measurements to unravel the cross-sectional distribution of the process components in time and space. This information is used in the design and optimization of industrial processes and process equipment, and also to improve the accuracy of multiphase system measurements in general. In this issue we are proud to present a selection of the 145 papers presented at the 5th World Congress on Industrial Process Tomography in Bergen

  20. States in Process Calculi

    Directory of Open Access Journals (Sweden)

    Christoph Wagner

    2014-08-01

    Formal reasoning about distributed algorithms (like Consensus) typically requires analyzing global states in a traditional state-based style. This is in contrast to the traditional action-based reasoning of process calculi. Nevertheless, we use domain-specific variants of the latter, as they are convenient modeling languages in which the local code of processes can be programmed explicitly, with the local state information usually managed via parameter lists of process constants. However, domain-specific process calculi are often equipped with (unlabeled) reduction semantics, building upon a rich and convenient notion of structural congruence. Unfortunately, the price for this convenience is that the analysis is cumbersome: the set of reachable states is considered modulo structural congruence, and the processes' state information is very hard to identify. We extract from congruence classes of reachable states individual state-informative representatives that we supply with a proper formal semantics. As a result, we can now freely switch between the process calculus terms and their representatives, and we can use the stateful representatives to perform assertional reasoning on process calculus models.

  1. Welding processes handbook

    CERN Document Server

    Weman, Klas

    2003-01-01

    Deals with the main commercially significant and commonly used welding processes. This title takes the student or novice welder through the individual steps involved in each process in an easily understood way. It covers many of the requirements referred to in European Standards including EN 719, EN 729 and EN 287. Welding processes handbook is a concise, explanatory guide to the main commercially significant and commonly-used welding processes. It takes the novice welder or student through the individual steps involved in each process in a clear and easily understood way. It is intended to provide an up-to-date reference to the major applications of welding as they are used in industry. The contents have been arranged so that it can be used as a textbook for European welding courses in accordance with guidelines from the European Welding Federation. Welding processes and equipment necessary for each process are described so that they can be applied to all instruction levels required by the EWF and th...

  2. Organic waste incineration processes

    Energy Technology Data Exchange (ETDEWEB)

    Lemort, F.; Charvillat, J.P.; Nabot, J.P. [CEA Valrho, Bagnols sur Ceze Cedex (France); Chateauvieux, H.; Thiebaut, C. [CEA Valduc, 21 - Is-sur-Tille (France)

    2001-07-01

    Nuclear activities produce organic waste compatible with thermal processes designed to obtain a significant weight and volume reduction as well as to stabilize the inorganic residue in a form suitable for various interim storage or disposal routes. Several processes may be implemented (e.g. excess air, plasma, fluidized bed or rotating furnace) depending on the nature of the waste and the desired objectives. The authors focus on the IRIS rotating-kiln process, which was used for the first time with radioactive materials during the first half of 1999. IRIS is capable of processing highly chlorinated and α-contaminated waste at a rate of several kilograms per hour, while limiting corrosion due to chlorine as well as mechanical entrainment of radioactive particles in the off-gas stream. Although operated industrially, the process is under continual development to improve its performance and adapt it to a wider range of industrial applications. The main focus of attention today is on adapting the pyrolytic processes to waste with highly variable compositions and to enhance the efficiency of the off-gas purification systems. These subjects are of considerable interest for a large number of heat treatment processes (including all off-gas treatment systems) for which extremely durable, high-performance and low-flow electrostatic precipitators are now being developed. (author)

  3. Organic waste incineration processes

    International Nuclear Information System (INIS)

    Lemort, F.; Charvillat, J.P.; Nabot, J.P.; Chateauvieux, H.; Thiebaut, C.

    2001-01-01

    Nuclear activities produce organic waste compatible with thermal processes designed to obtain a significant weight and volume reduction as well as to stabilize the inorganic residue in a form suitable for various interim storage or disposal routes. Several processes may be implemented (e.g. excess air, plasma, fluidized bed or rotating furnace) depending on the nature of the waste and the desired objectives. The authors focus on the IRIS rotating-kiln process, which was used for the first time with radioactive materials during the first half of 1999. IRIS is capable of processing highly chlorinated and α-contaminated waste at a rate of several kilograms per hour, while limiting corrosion due to chlorine as well as mechanical entrainment of radioactive particles in the off-gas stream. Although operated industrially, the process is under continual development to improve its performance and adapt it to a wider range of industrial applications. The main focus of attention today is on adapting the pyrolytic processes to waste with highly variable compositions and to enhance the efficiency of the off-gas purification systems. These subjects are of considerable interest for a large number of heat treatment processes (including all off-gas treatment systems) for which extremely durable, high-performance and low-flow electrostatic precipitators are now being developed. (author)

  4. IT Project Prioritization Process

    DEFF Research Database (Denmark)

    Shollo, Arisa; Constantiou, Ioanna

    2013-01-01

    In most large companies, the IT project prioritization process is designed based on principles of evidence-based management. We investigate a case of IT project prioritization in a financial institution, and in particular, how managers practice evidence-based management during this process. We use...... a rich dataset built from a longitudinal study of the prioritization process for the IT projects. Our findings indicate that managers reach a decision not only by using evidence but from the interplay between the evidence and the judgment devices that managers employ. The interplay between evidence...

  5. Revealing the programming process

    DEFF Research Database (Denmark)

    Bennedsen, Jens; Caspersen, Michael Edelgaard

    2005-01-01

    One of the most important goals of an introductory programming course is that the students learn a systematic approach to the development of computer programs. Revealing the programming process is an important part of this; however, textbooks do not address the issue -- probably because...... the textbook medium is static and therefore ill-suited to expose the process of programming. We have found that process recordings in the form of captured narrated programming sessions are a simple, cheap, and efficient way of providing the revelation. We identify seven different elements of the programming...

  6. Process of performance assessment

    International Nuclear Information System (INIS)

    King, C.M.; Halford, D.K.

    1987-01-01

    Performance assessment is the process used to evaluate the environmental consequences of disposal of radioactive waste in the biosphere. An introductory review of the subject is presented. Emphasis is placed on the process of performance assessment from the standpoint of defining the process. Performance assessment, from evolving experience at DOE sites, has short-term and long-term subprograms, the components of which are discussed. The role of mathematical modeling in performance assessment is addressed including the pros and cons of current approaches. Finally, the system/site/technology issues as the focal point of this symposium are reviewed

  7. Semi-Markov processes

    CERN Document Server

    Grabski

    2014-01-01

    Semi-Markov Processes: Applications in System Reliability and Maintenance is a modern view of discrete state space and continuous time semi-Markov processes and their applications in reliability and maintenance. The book explains how to construct semi-Markov models and discusses the different reliability parameters and characteristics that can be obtained from those models. The book is a useful resource for mathematicians, engineering practitioners, and PhD and MSc students who want to understand the basic concepts and results of semi-Markov process theory. Clearly defines the properties and
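As a minimal illustration of the kind of model the book treats (not an example taken from the book itself), a two-state semi-Markov reliability model can be simulated by Monte Carlo to estimate steady-state availability. The Weibull failure and exponential repair distributions and their parameters below are illustrative assumptions:

```python
import random

random.seed(1)

def simulate_availability(t_end=1e5):
    """Monte Carlo estimate of steady-state availability for a two-state
    semi-Markov model: 'up' sojourns ~ Weibull, 'down' sojourns ~ exponential."""
    t, up_time = 0.0, 0.0
    state = "up"
    while t < t_end:
        if state == "up":
            sojourn = random.weibullvariate(100.0, 1.5)  # time to failure
            up_time += min(sojourn, t_end - t)           # clip last interval
            state = "down"
        else:
            sojourn = random.expovariate(1 / 10.0)       # repair time, mean 10
            state = "up"
        t += sojourn
    return up_time / t_end

print(round(simulate_availability(), 3))
```

With these parameters the estimate settles near mean-up / (mean-up + mean-down), i.e. roughly 0.9.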

  8. Irreversible processes kinetic theory

    CERN Document Server

    Brush, Stephen G

    2013-01-01

    Kinetic Theory, Volume 2: Irreversible Processes deals with the kinetic theory of gases and the irreversible processes they undergo. It includes the two papers by James Clerk Maxwell and Ludwig Boltzmann in which the basic equations for transport processes in gases are formulated, together with the first derivation of Boltzmann's "H-theorem" and a discussion of this theorem, along with the problem of irreversibility. Comprised of 10 chapters, this volume begins with an introduction to the fundamental nature of heat and of gases, along with Boltzmann's work on the kinetic theory of gases and s

  9. Catalytic biomass pyrolysis process

    Science.gov (United States)

    Dayton, David C.; Gupta, Raghubir P.; Turk, Brian S.; Kataria, Atish; Shen, Jian-Ping

    2018-04-17

    Described herein are processes for converting a biomass starting material (such as lignocellulosic materials) into a low oxygen containing, stable liquid intermediate that can be refined to make liquid hydrocarbon fuels. More specifically, the process can be a catalytic biomass pyrolysis process wherein an oxygen removing catalyst is employed in the reactor while the biomass is subjected to pyrolysis conditions. The stream exiting the pyrolysis reactor comprises bio-oil having a low oxygen content, and such stream may be subjected to further steps, such as separation and/or condensation to isolate the bio-oil.

  10. DWPF process control

    International Nuclear Information System (INIS)

    Heckendorn, F.M. II

    1983-01-01

    The Defense Waste Processing Facility (DWPF) for waste vitrification at the Savannah River Plant (SRP) is in the final design stage. Instrumentation to provide the parameter sensing required to assure the quality of the two-foot-diameter, ten-foot-high waste canister is in the final stage of development. All steps of the process and instrumentation are now operating as nearly full-scale prototypes at SRP. Quality will be maintained by assuring that only the intended material enters the canisters, and by sensing the resultant condition of the filled canisters. Primary emphasis will be on instrumentation of the process

  11. Advanced uranium enrichment processes

    International Nuclear Information System (INIS)

    Clerc, M.; Plurien, P.

    1986-01-01

    Three advanced uranium enrichment processes are dealt with in the report: AVLIS (Atomic Vapour LASER Isotope Separation), MLIS (Molecular LASER Isotope Separation) and PSP (Plasma Separation Process). The description of the physical and technical features of the processes constitutes a major part of the report. It further presents comparisons with existing industrially used enrichment technologies, gives information on actual development programmes and budgets and ends with a chapter on perspectives and conclusions. An extensive bibliography of the relevant open literature is added to the different subjects discussed. The report was drawn up by the nuclear research Centre (CEA) Saclay on behalf of the Commission of the European Communities

  12. Plasma processing for VLSI

    CERN Document Server

    Einspruch, Norman G

    1984-01-01

    VLSI Electronics: Microstructure Science, Volume 8: Plasma Processing for VLSI (Very Large Scale Integration) discusses the utilization of plasmas for general semiconductor processing. It also includes expositions on advanced deposition of materials for metallization, lithographic methods that use plasmas as exposure sources and for multiple resist patterning, and device structures made possible by anisotropic etching. This volume is divided into four sections. It begins with the history of plasma processing, a discussion of some of the early developments and trends for VLSI. The second section

  13. Radioactive waste processing method

    International Nuclear Information System (INIS)

    Sakuramoto, Naohiko.

    1992-01-01

    When granular radioactive wastes containing phosphorus are processed in a fluidized-bed furnace, and the granular materials are phosphorus-containing activated carbon, granular alkali compounds such as calcium hydroxide and barium hydroxide are used as the fluidizing media. Even granular materials with a slow burning speed can be burnt stably in the fluidized state by the high-temperature heat of the fluidizing media, thereby enabling a long burning time. Accordingly, radioactive activated carbon wastes can be processed by burning treatment. (T.M.)

  14. Nano integrated circuit process

    International Nuclear Information System (INIS)

    Yoon, Yung Sup

    2004-02-01

    This book contains nine chapters: an introduction to semiconductor chip manufacture; oxidation, including dry oxidation, wet oxidation, oxidation models and oxide films; diffusion, including the diffusion process, the diffusion equation, diffusion coefficients and diffusion systems; ion implantation, including ion distribution, channeling, multi-implantation, masking and implantation systems; sputtering, including CVD and PVD; lithography; wet and dry etching; interconnection and flattening, including metal-silicon connections, silicide, multilayer metal processes and flattening; and the integrated circuit process, including MOSFET and CMOS.

  15. Transnational Learning Processes

    DEFF Research Database (Denmark)

    Nedergaard, Peter

    This paper analyses and compares the transnational learning processes in the employment field in the European Union and among the Nordic countries. Based theoretically on a social constructivist model of learning and methodologically on a questionnaire distributed to the relevant participants......, a number of hypotheses concerning transnational learning processes are tested. The paper closes with a number of suggestions regarding an optimal institutional setting for facilitating transnational learning processes. Key words: Transnational learning, Open Method of Coordination, Learning, Employment......, European Employment Strategy, European Union, Nordic countries....

  16. Nano integrated circuit process

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Yung Sup

    2004-02-15

    This book contains nine chapters: an introduction to semiconductor chip manufacture; oxidation, including dry oxidation, wet oxidation, oxidation models and oxide films; diffusion, including the diffusion process, the diffusion equation, diffusion coefficients and diffusion systems; ion implantation, including ion distribution, channeling, multi-implantation, masking and implantation systems; sputtering, including CVD and PVD; lithography; wet and dry etching; interconnection and flattening, including metal-silicon connections, silicide, multilayer metal processes and flattening; and the integrated circuit process, including MOSFET and CMOS.

  17. Getting Started with Processing

    CERN Document Server

    Reas, Casey

    2010-01-01

    Learn computer programming the easy way with Processing, a simple language that lets you use code to create drawings, animation, and interactive graphics. Programming courses usually start with theory, but this book lets you jump right into creative and fun projects. It's ideal for anyone who wants to learn basic programming, and serves as a simple introduction to graphics for people with some programming skills. Written by the founders of Processing, this book takes you through the learning process one step at a time to help you grasp core programming concepts. You'll learn how to sketch wi

  18. Quantum Information Processing

    CERN Document Server

    Leuchs, Gerd

    2005-01-01

    Quantum processing and communication is emerging as a challenging technique at the beginning of the new millennium. This is an up-to-date insight into the current research of quantum superposition, entanglement, and the quantum measurement process - the key ingredients of quantum information processing. The authors further address quantum protocols and algorithms. Complementary to similar programmes in other countries and at the European level, the German Research Foundation (DFG) started a focused research program on quantum information in 1999. The contributions - written by leading experts - bring together the latest results in quantum information as well as addressing all the relevant questions

  19. Robot welding process control

    Science.gov (United States)

    Romine, Peter L.

    1991-01-01

    This final report documents the development and installation of software and hardware for Robotic Welding Process Control. Primary emphasis is on serial communications between the CYRO 750 robotic welder, Heurikon minicomputer running Hunter & Ready VRTX, and an IBM PC/AT, for offline programming and control and closed-loop welding control. The requirements for completion of the implementation of the Rocketdyne weld tracking control are discussed. The procedure for downloading programs from the Intergraph, over the network, is discussed. Conclusions are made on the results of this task, and recommendations are made for efficient implementation of communications, weld process control development, and advanced process control procedures using the Heurikon.

  20. Lasers in chemical processing

    International Nuclear Information System (INIS)

    Davis, J.I.

    1982-01-01

    The high cost of laser energy is the crucial issue in any potential laser-processing application. It is expensive relative to other forms of energy and to most bulk chemicals. We show those factors that have previously frustrated attempts to find commercially viable laser-induced processes for the production of materials. Having identified the general criteria to be satisfied by an economically successful laser process and shown how these imply the laser-system requirements, we present a status report on the uranium laser isotope separation (LIS) program at the Lawrence Livermore National Laboratory

  1. Dosimetry for radiation processing

    International Nuclear Information System (INIS)

    Miller, Arne

    1986-01-01

    During the past few years significant advances have taken place in the different areas of dosimetry for radiation processing, mainly stimulated by the increased interest in radiation for food preservation, plastic processing and sterilization of medical products. Reference services both by international organizations (IAEA) and national laboratories have helped to improve the reliability of dose measurements. In this paper the special features of radiation processing dosimetry are discussed, several commonly used dosimeters are reviewed, and factors leading to traceable and reliable dosimetry are discussed. (author)

  2. The SILVA atomic process

    International Nuclear Information System (INIS)

    Cazalet, J.

    1996-01-01

    The SILVA isotopic laser separation process of atomic uranium vapor requires the use of specific high power visible light laser devices and systems for uranium evaporation and management (separation modules). The CEA, in collaboration with industrialists, has developed these components and built some demonstration plants. The scientific and technological challenges raised by this process are now surmounted. The principle of the SILVA process is the selective photo-ionization of uranium isotopes using laser photon beams tuned to the exact excitation frequency of the isotope electron layers. This paper describes the principle of the SILVA process (lasers and separator), the technical feasibility and actual progress of the program and its future steps, its economical stakes, and the results obtained so far. (J.S.). 2 figs., 2 photos

  3. Cognitive Processing Hardware Elements

    National Research Council Canada - National Science Library

    Widrow, Bernard; Eliashberg, Victor; Kamenetsky, Max

    2005-01-01

    The purpose of this research is to identify and develop cognitive information processing systems and algorithms that can be implemented with novel architectures and devices with the goal of achieving...

  4. Ferrous Metal Processing Plants

    Data.gov (United States)

    Department of Homeland Security — This map layer includes ferrous metal processing plants in the United States. The data represent commodities covered by the Minerals Information Team (MIT) of the...

  5. Nonferrous Metal Processing Plants

    Data.gov (United States)

    Department of Homeland Security — This map layer includes nonferrous metal processing plants in the United States. The data represent commodities covered by the Minerals Information Team (MIT) of the...

  6. Canada's hydrocarbon processing evolution

    International Nuclear Information System (INIS)

    Wise, T.H.; Horton, R.

    2000-01-01

    The development of the petroleum refining, petrochemical and natural gas industries in Canada is discussed, together with future issues and prospects. Figures give data on (a) refined products trade 1998; (b) refining capacity; (c) product demand 1980-1999; (d) refinery crude runs and capacity; (e) refining and marketing, historical returns 1993-1999; (f) processing power index for Canada and USA; (g) ethylene capacity; (i) Montreal petrochemical capacities; (j) Sarnia petrochemical capacities in 2000; (k) Alberta petrochemicals capacities 2001; (l) ethylene net equivalent trade; (m) ethylene costs 1999 for W. Canada and other countries. It was concluded that the hydrocarbon processing business continues to expand in Canada and natural gas processing is likely to increase. Petrochemicals may expand in W. Canada, possibly using feedstock from the Far North. Offshore developments may stimulate new processing on the E. Coast

  7. Essentials of stochastic processes

    CERN Document Server

    Durrett, Richard

    2016-01-01

    Building upon the previous editions, this textbook is a first course in stochastic processes taken by undergraduate and graduate students (MS and PhD students from math, statistics, economics, computer science, engineering, and finance departments) who have had a course in probability theory. It covers Markov chains in discrete and continuous time, Poisson processes, renewal processes, martingales, and option pricing. One can only learn a subject by seeing it in action, so there are a large number of examples and more than 300 carefully chosen exercises to deepen the reader’s understanding. Drawing from teaching experience and student feedback, there are many new examples and problems with solutions that use TI-83 to eliminate the tedious details of solving linear equations by hand, and the collection of exercises is much improved, with many more biological examples. Originally included in previous editions, material too advanced for this first course in stochastic processes has been eliminated while treatm...
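A Poisson process, one of the topics this course covers, can be simulated with a few lines of standard-library Python by summing i.i.d. exponential inter-arrival gaps; the rate and horizon below are arbitrary illustrative values:

```python
import random

random.seed(0)

def poisson_arrivals(rate, t_end):
    """Sample arrival times of a homogeneous Poisson process on [0, t_end)."""
    times, t = [], random.expovariate(rate)
    while t < t_end:
        times.append(t)
        t += random.expovariate(rate)  # exponential gap with mean 1/rate
    return times

# With rate 2 over 1000 time units, the count should be close to 2000.
n = len(poisson_arrivals(2.0, 1000.0))
print(n)
```

The observed count fluctuates around rate × t_end with standard deviation of roughly its square root (about 45 here).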

  8. Product and Process Modelling

    DEFF Research Database (Denmark)

    Cameron, Ian T.; Gani, Rafiqul

    This book covers the area of product and process modelling via a case study approach. It addresses a wide range of modelling applications with emphasis on modelling methodology and the subsequent in-depth analysis of mathematical models to gain insight via structural aspects of the models...... These approaches are put into the context of life cycle modelling, where multiscale and multiform modelling is increasingly prevalent in the 21st century. The book commences with a discussion of modern product and process modelling theory and practice followed by a series of case studies drawn from a variety...... to biotechnology applications, food, polymer and human health application areas. The book highlights the important nature of modern product and process modelling in the decision making processes across the life cycle. As such it provides an important resource for students, researchers and industrial practitioners.

  9. HYDICE postflight data processing

    Science.gov (United States)

    Aldrich, William S.; Kappus, Mary E.; Resmini, Ronald G.; Mitchell, Peter A.

    1996-06-01

    The hyperspectral digital imagery collection experiment (HYDICE) sensor records instrument counts for scene data, in-flight spectral and radiometric calibration sequences, and dark current levels onto an AMPEX DCRsi data tape. Following flight, the HYDICE ground data processing subsystem (GDPS) transforms selected scene data from digital numbers (DN) to calibrated radiance levels at the sensor aperture. This processing includes: dark current correction, spectral and radiometric calibration, conversion to radiance, and replacement of bad detector elements. A description of the algorithms for post-flight data processing is presented. A brief analysis of the original radiometric calibration procedure is given, along with a description of the development of the modified procedure currently used. Example data collected during the 1995 flight season, both uncorrected and processed, are shown to demonstrate the removal of apparent sensor artifacts (e.g., non-uniformities in detector response over the array) as a result of this transformation.
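The correction chain described above (dark-current subtraction, per-detector gain calibration, bad-element replacement) can be sketched as follows. This is a simplified illustration with made-up numbers, not the actual GDPS algorithm:

```python
# Illustrative sketch only: the real HYDICE gain derivation and spectral
# calibration are not reproduced here.
def calibrate(dn, dark, gain, bad):
    """Convert raw instrument counts to at-aperture radiance:
    dark-current subtraction, per-detector gain, bad-element replacement."""
    radiance = [(d - dk) * g for d, dk, g in zip(dn, dark, gain)]
    for i in bad:  # replace flagged detectors with the mean of good neighbours
        nbrs = [radiance[j] for j in (i - 1, i + 1)
                if 0 <= j < len(radiance) and j not in bad]
        radiance[i] = sum(nbrs) / len(nbrs) if nbrs else 0.0
    return radiance

print(calibrate([110, 120, 999, 140],        # raw DN, detector 2 is bad
                [10, 10, 10, 10],            # dark-current levels
                [0.5, 0.5, 0.5, 0.5],        # per-detector gains
                {2}))                        # → [50.0, 55.0, 60.0, 65.0]
```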

  10. Reconfigurable network processing platforms

    NARCIS (Netherlands)

    Kachris, C.

    2007-01-01

    This dissertation presents our investigation on how to efficiently exploit reconfigurable hardware to design flexible, high performance, and power efficient network devices capable to adapt to varying processing requirements of network applications and traffic. The proposed reconfigurable network

  11. Towards processable Afrikaans

    CSIR Research Space (South Africa)

    Pretorius, L

    2009-06-01

    Full Text Available In this paper researchers discuss a number of structural problems that are faced when designing a machine-oriented controlled natural language for Afrikaans, taking the underlying principles of Attempto Controlled English (ACE) and Processable...

  12. Gas processing device

    International Nuclear Information System (INIS)

    Kobayashi, Yoshihiro; Seki, Eiji.

    1991-01-01

    The state of the electric discharge is detected based on the gas pressure in the sealed container and the discharge current flowing between the two electrodes. When electric arc discharges occur, the introduction of gases to be processed is stopped and the voltage applied to the electrodes is interrupted. Then, when the gas pressure in the sealed container falls to a predetermined value, the power source voltage is applied to the electrodes again to recover glow discharges, and the introduction of the gas to be processed is restarted. With such steps, even if electric arc discharges occur, they are eliminated automatically and normal glow discharges can be recovered, preventing failures of the device due to arcing. The glow discharges are recovered automatically without stopping the operation of the gas processing device, and gas injection and solidification processing can be conducted continuously and stably. (T.M.)
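The recovery sequence described here (stop the gas feed, interrupt the voltage, wait for the pressure to drop, then reapply the voltage) amounts to a small state machine. The sketch below is a hypothetical illustration; the function name, threshold values and signal names are invented for the example:

```python
# Hypothetical controller sketch of the arc-recovery sequence; not the
# actual device logic, and all thresholds are illustrative.
def control_step(state, pressure, current, arc_current=5.0, safe_pressure=0.1):
    """One controller step. Returns (next_state, gas_valve_open, voltage_on)."""
    if state == "glow":
        if current > arc_current:              # arc detected
            return "recovering", False, False  # stop gas, cut voltage
        return "glow", True, True              # normal glow discharge
    # recovering: wait for the vessel pressure to fall, then restart
    if pressure < safe_pressure:
        return "glow", True, True              # reapply voltage, resume gas feed
    return "recovering", False, False

print(control_step("glow", 0.5, 8.0))        # arc current -> shut down
print(control_step("recovering", 0.05, 0))   # pressure low -> recover glow
```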

  13. Cooperative processing data bases

    Science.gov (United States)

    Hasta, Juzar

    1991-01-01

    Cooperative processing for the 1990's using client-server technology is addressed. The main theme is concepts of downsizing from mainframes and minicomputers to workstations on a local area network (LAN). This document is presented in view graph form.

  14. Radiation processing in Japan

    International Nuclear Information System (INIS)

    Makuuchi, Keizo

    2001-01-01

    Economic scale of radiation application in the field of industry, agriculture and medicine in Japan in 1997 was investigated to compare its economic impacts with that of nuclear energy industry. Total production value of radiation application accounted for 54% of nuclear industry including nuclear energy industry and radiation applications in three fields above. Industrial radiation applications were further divided into five groups, namely nondestructive test, RI instruments, radiation facilities, radiation processing and ion beam processing. More than 70% of the total production value was brought about by ion beam processing for use with IC and semiconductors. Future economic prospect of radiation processing of polymers, for example cross-linking, EB curing, graft polymerization and degradation, is reviewed. Particular attention was paid to radiation vulcanization of natural rubber latex and also to degradation of natural polymers. (S. Ohno)

  15. Logistics Innovation Process Revisited

    DEFF Research Database (Denmark)

    Gammelgaard, Britta; Su, Shong-Iee Ivan; Yang, Su-Lan

    2011-01-01

    Purpose – The purpose of this paper is to learn more about logistics innovation processes and their implications for the focal organization as well as the supply chain, especially suppliers. Design/methodology/approach – The empirical basis of the study is a longitudinal action research project...... that was triggered by the practical needs of new ways of handling material flows of a hospital. This approach made it possible to revisit theory on logistics innovation process. Findings – Apart from the tangible benefits reported to the case hospital, five findings can be extracted from this study: the logistics...... innovation process model may include not just customers but also suppliers; logistics innovation in buyer-supplier relations may serve as an alternative to outsourcing; logistics innovation processes are dynamic and may improve supplier partnerships; logistics innovations in the supply chain are as dependent...

  16. Negative ion detachment processes

    International Nuclear Information System (INIS)

    Champion, R.L.; Doverspike, L.D.

    1990-10-01

    This paper discusses the following topics: H - and D - collisions with atomic hydrogen; collisional decomposition of SF 6 - ; two-electron loss processes in negative ion collisions; associative electron detachment; and negative ion desorption from surfaces

  17. Acoustic MIMO signal processing

    CERN Document Server

    Huang, Yiteng; Chen, Jingdong

    2006-01-01

    A timely and important book addressing a variety of acoustic signal processing problems under multiple-input multiple-output (MIMO) scenarios. It uniquely investigates these problems within a unified framework offering a novel and penetrating analysis.

  18. Introduction to stochastic processes

    CERN Document Server

    Cinlar, Erhan

    2013-01-01

    Clear presentation employs methods that recognize computer-related aspects of theory. Topics include expectations and independence, Bernoulli processes and sums of independent random variables, Markov chains, renewal theory, more. 1975 edition.

  19. Dosimetry for radiation processing

    International Nuclear Information System (INIS)

    McLaughlin, W.L.; Boyd, A.W.; Chadwick, K.H.; McDonald, J.C.; Miller, A.

    1989-01-01

    Radiation processing is a relatively young industry with broad applications and considerable commercial success. Dosimetry provides an independent and effective way of developing and controlling many industrial processes. In the sterilization of medical devices and in food irradiation, where the radiation treatment impacts directly on public health, the measurements of dose provide the official means of regulating and approving its use. In this respect, dosimetry provides the operator with a means of characterizing the facility, of proving that products are treated within acceptable dose limits and of controlling the routine operation. This book presents an up-to-date review of the theory, data and measurement techniques for radiation processing dosimetry in a practical and useful way. It is hoped that this book will lead to improved measurement procedures, more accurate and precise dosimetry and a greater appreciation of the necessity of dosimetry for radiation processing. (author)

  20. SIMULATION OF LOGISTICS PROCESSES

    Directory of Open Access Journals (Sweden)

    Yu. Taranenko

    2016-08-01

    Full Text Available The article deals with the theoretical basis of simulation. The study shows that in industrial countries the simulation of logistics processes is an integral part of many economic projects aimed at the creation or improvement of logistics systems. The paper used the Beer Game model for the management of logistics processes in the enterprise. The simulation model is implemented in the AnyLogic package. The AnyLogic product allows us to consider logistics processes as an integrated system, which makes it possible to reach better solutions. Logistics process management involves pooling the sales market, production and distribution to ensure the desired level of customer service at the lowest overall cost. This made it possible to conduct experiments and to determine the optimal size of the warehouse at the lowest cost.
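The ordering dynamics behind the Beer Game can be sketched with a minimal single-echelon base-stock simulation. This is an illustrative toy, not the AnyLogic model from the paper, and all parameter values are invented:

```python
# Toy sketch of Beer-Game-style inventory dynamics: a base-stock ordering
# policy with a fixed delivery delay, tracking holding and backlog cost.
def simulate(demand, base_stock=20, delay=2, hold_cost=0.5, back_cost=1.0):
    inventory, backlog = base_stock, 0
    pipeline, cost = [0] * delay, 0.0        # orders in transit
    for d in demand:
        inventory += pipeline.pop(0)         # receive the oldest shipment
        served = min(inventory, d + backlog)
        inventory -= served
        backlog = backlog + d - served       # unmet demand carries over
        # order up to base stock, counting stock already on the way
        order = max(0, base_stock + backlog - inventory - sum(pipeline))
        pipeline.append(order)
        cost += hold_cost * inventory + back_cost * backlog
    return cost

print(simulate([4, 4, 8, 8, 4, 4]))
```

With the toy demand stream above the total cost comes out to 30.0; lengthening the delay or making demand burstier quickly inflates it, which is the bullwhip effect the Beer Game demonstrates.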

  1. Digital image processing

    National Research Council Canada - National Science Library

    Gonzalez, Rafael C; Woods, Richard E

    2008-01-01

    Completely self-contained-and heavily illustrated-this introduction to basic concepts and methodologies for digital image processing is written at a level that truly is suitable for seniors and first...

  2. Ultrahigh bandwidth signal processing

    DEFF Research Database (Denmark)

    Oxenløwe, Leif Katsuo

    2016-01-01

    Optical time lenses have proven to be very versatile for advanced optical signal processing. Based on a controlled interplay between dispersion and phase-modulation by e.g. four-wave mixing, the processing is phase-preserving, and hence useful for all types of data signals including coherent multi......-level modulation formats. This has enabled processing of phase-modulated spectrally efficient data signals, such as orthogonal frequency division multiplexed (OFDM) signals. In that case, a spectral telescope system was used, using two time lenses with different focal lengths (chirp rates), yielding a spectral...... regeneration. These operations require a broad bandwidth nonlinear platform, and novel photonic integrated nonlinear platforms like aluminum gallium arsenide nano-waveguides used for 1.28 Tbaud optical signal processing will be described....

  3. Rubber compounding and processing

    CSIR Research Space (South Africa)

    John, MJ

    2014-06-01

    Full Text Available This chapter presents an overview on the compounding and processing techniques of natural rubber compounds. The introductory portion deals with different types of rubbers and principles of rubber compounding. The primary and secondary fillers used...

  4. Radiation processing in Japan

    Energy Technology Data Exchange (ETDEWEB)

    Makuuchi, Keizo [Japan Atomic Energy Research Inst., Takasaki, Gunma (Japan). Takasaki Radiation Chemistry Research Establishment

    2001-03-01

    Economic scale of radiation application in the field of industry, agriculture and medicine in Japan in 1997 was investigated to compare its economic impacts with that of nuclear energy industry. Total production value of radiation application accounted for 54% of nuclear industry including nuclear energy industry and radiation applications in three fields above. Industrial radiation applications were further divided into five groups, namely nondestructive test, RI instruments, radiation facilities, radiation processing and ion beam processing. More than 70% of the total production value was brought about by ion beam processing for use with IC and semiconductors. Future economic prospect of radiation processing of polymers, for example cross-linking, EB curing, graft polymerization and degradation, is reviewed. Particular attention was paid to radiation vulcanization of natural rubber latex and also to degradation of natural polymers. (S. Ohno)

  5. Desalination processes and technologies

    International Nuclear Information System (INIS)

    Furukawa, D.H.

    1996-01-01

    Reasons of the development of desalination processes, the modern desalination technologies, such as multi-stage flash evaporation, multi-effect distillation, reverse osmosis, and the prospects of using nuclear power for desalination purposes are discussed. 9 refs

  6. Enrichment: centrifuge process

    International Nuclear Information System (INIS)

    Soubbaramayer.

    1989-01-01

    This short course is divided into three sections devoted respectively to the physics of the process, some practical problems raised by the design of a centrifuge and the present situation of centrifugation in the World. 31 figs., 18 refs

  7. Constitutional reform as process

    OpenAIRE

    Schultze, Rainer-Olaf (Prof.)

    2000-01-01

    Constitutional reform as process. - In: The politics of constitutional reform in North America / Rainer-Olaf Schultze ... (eds.). - Opladen : Leske + Budrich, 2000. - S. 11-31. - (Politikwissenschaftliche paperbacks ; 30)

  8. Paper 2: Process characterisation

    African Journals Online (AJOL)

    drinie

    A variety of names represent the prefermentation process, due to emphasis placed on ... monitoring and control instrumentation employed successfully at existing side-stream ..... MÜNCH E (1998) DSP-Prefermenter Technology Book. Science ...

  9. Project management process.

    Science.gov (United States)

    2007-03-01

    This course provides INDOT staff with foundational knowledge and skills in project management principles and methodologies. INDOT's project management processes provide the tools for interdisciplinary teams to efficiently and effectively deliver pr...

  10. Processed Products Database System

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Collection of annual data on processed seafood products. The Division provides authoritative advice, coordination and guidance on matters related to the collection,...

  11. Model Process Control Language

    Data.gov (United States)

    National Aeronautics and Space Administration — The MPC (Model Process Control) language enables the capture, communication and preservation of a simulation instance, with sufficient detail that it can be...

  12. Digital image processing

    National Research Council Canada - National Science Library

    Gonzalez, Rafael C; Woods, Richard E

    2008-01-01

    ...-year graduate students in almost any technical discipline. The leading textbook in its field for more than twenty years, it continues its cutting-edge focus on contemporary developments in all mainstream areas of image processing-e.g...

  13. CAPSULE REPORT: EVAPORATION PROCESS

    Science.gov (United States)

    Evaporation has been an established technology in the metal finishing industry for many years. In this process, wastewaters containing reusable materials, such as copper, nickel, or chromium compounds, are heated, producing a water vapor that is continuously removed and condensed....

  14. A novel digital pulse processing architecture for nuclear instrumentation

    Energy Technology Data Exchange (ETDEWEB)

    Moline, Yoann; Thevenin, Mathieu; Corre, Gwenole [CEA, LIST - Laboratoire Capteurs et Architectures electroniques, F-91191 Gif-sur-Yvette, (France); Paindavoine, Michel [CNRS, Universite de Bourgogne - Laboratoire d' Etude de l' Apprentissage et du Developpement, 21000 DIJON, (France)

    2015-07-01

    The field of nuclear instrumentation covers a wide range of applications, including counting, spectrometry, pulse shape discrimination and multi-channel coincidence. These applications are the topic of much research, and new algorithms and implementations are constantly proposed thanks to advances in digital signal processing. However, these improvements are not yet implemented in instrumentation devices. This is especially true for neutron-gamma discrimination applications, which traditionally use the charge comparison method, while the literature proposes other algorithms, based on frequency-domain or wavelet theory, that show better performance. Another example is pileups, which are generally rejected even though pileup correction algorithms exist. These processes are traditionally performed offline due to two issues. The first is the Poissonian character of the signal, composed of randomly arriving pulses, which requires current architectures to work in data-flow fashion. The second is the real-time requirement, which implies losing pulses when the pulse rate is too high. Despite the possibility of treating pulses independently from each other, current architectures paralyze the acquisition of the signal during the processing of a pulse. This loss is called dead-time. These two issues have led current architectures to use dedicated solutions based on re-configurable components such as Field Programmable Gate Arrays (FPGAs) to meet the performance needed to deal with dead-time. However, dedicated hardware implementations of algorithms on re-configurable technologies are complex and time-consuming. For all these reasons, a programmable Digital Pulse Processing (DPP) architecture, programmable in a high-level language such as C or C++, that can reduce dead-time would be worthwhile for nuclear instrumentation. This would reduce prototyping and test duration by reducing the level of hardware expertise needed to implement new algorithms. However, today's programmable solutions do not meet
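The dead-time loss described in this abstract can be illustrated with a small simulation. The sketch below is not from the paper; it assumes the standard non-paralyzable dead-time model and arbitrary parameter values, and shows how the observed count rate for Poisson pulse arrivals falls below the true rate:

```python
import random

def simulate_dead_time(rate, dead_time, n_pulses, seed=0):
    """Count Poisson-arriving pulses under a non-paralyzable dead-time.

    A pulse is recorded only if it arrives at least `dead_time` seconds
    after the previously *recorded* pulse; otherwise it is lost.
    Returns the observed (recorded) count rate.
    """
    rng = random.Random(seed)
    t = 0.0                    # current time
    last_recorded = None       # arrival time of last recorded pulse
    recorded = 0
    for _ in range(n_pulses):
        t += rng.expovariate(rate)   # exponential inter-arrival times
        if last_recorded is None or t - last_recorded >= dead_time:
            recorded += 1
            last_recorded = t
    return recorded / t

true_rate = 1e5              # 100k pulses per second (assumed)
tau = 2e-6                   # 2 microseconds of dead-time per pulse (assumed)
observed = simulate_dead_time(true_rate, tau, 200_000)
expected = true_rate / (1 + true_rate * tau)   # non-paralyzable model
print(observed, expected)
```

For a true rate n and per-pulse dead-time tau, the non-paralyzable model predicts an observed rate of n/(1 + n*tau); the simulated rate should land close to that value, making the loss the abstract calls dead-time directly visible.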

  15. A novel digital pulse processing architecture for nuclear instrumentation

    International Nuclear Information System (INIS)

    Moline, Yoann; Thevenin, Mathieu; Corre, Gwenole; Paindavoine, Michel

    2015-01-01

    The field of nuclear instrumentation covers a wide range of applications, including counting, spectrometry, pulse shape discrimination and multi-channel coincidence. These applications are the topic of much research, and new algorithms and implementations are constantly proposed thanks to advances in digital signal processing. However, these improvements are not yet implemented in instrumentation devices. This is especially true for neutron-gamma discrimination applications, which traditionally use the charge comparison method, while the literature proposes other algorithms, based on frequency-domain or wavelet theory, that show better performance. Another example is pileups, which are generally rejected even though pileup correction algorithms exist. These processes are traditionally performed offline due to two issues. The first is the Poissonian character of the signal, composed of randomly arriving pulses, which requires current architectures to work in data-flow fashion. The second is the real-time requirement, which implies losing pulses when the pulse rate is too high. Despite the possibility of treating pulses independently from each other, current architectures paralyze the acquisition of the signal during the processing of a pulse. This loss is called dead-time. These two issues have led current architectures to use dedicated solutions based on re-configurable components such as Field Programmable Gate Arrays (FPGAs) to meet the performance needed to deal with dead-time. However, dedicated hardware implementations of algorithms on re-configurable technologies are complex and time-consuming. For all these reasons, a programmable Digital Pulse Processing (DPP) architecture, programmable in a high-level language such as C or C++, that can reduce dead-time would be worthwhile for nuclear instrumentation. This would reduce prototyping and test duration by reducing the level of hardware expertise needed to implement new algorithms. However, today's programmable solutions do not meet

  16. Aerospace Materials Process Modelling

    Science.gov (United States)

    1988-08-01

    Continuous Cooling Transformation diagram (CCT diagram): when an IT (isothermal transformation) diagram is used in heat process modelling, a sudden (instantaneous) cooling is assumed. Thermo-mechanical properties are instead studied with reference to a CCT diagram, which is thought to be more reliable. This determination is, however, based on approximations, among them that a CCT diagram is valid only for the [remainder of the scanned abstract is illegible OCR debris]

  17. Radiopharmaceutical drug review process

    International Nuclear Information System (INIS)

    Frankel, R.

    1985-01-01

    To ensure proper radioactive drug use (such as quality, diagnostic improvement, and minimal radioactive exposure), the Food and Drug Administration evaluates new drugs with respect to safety, effectiveness, and accuracy and adequacy of the labeling. The IND or NDA process is used for this purpose. A brief description of the process, including the Chemical Classification System and the therapeutic potential classification, is presented as it applies to radiopharmaceuticals. Also, the status of the IND or NDA review of radiopharmaceuticals is given

  18. Qualification of conditioning process

    International Nuclear Information System (INIS)

    Wolf, J.

    1989-01-01

    A conditioning process is qualified by the PTB if the execution of pre-treatment and conditioning occurs so that a safe and orderly final storage of the products and waste containers produced can be assumed. All the relevant operating conditions for the plant are laid down by the producer/conditioner of the waste in a handbook. The elements of product inspection by process qualification are shown in tabular form. (DG) [de

  19. Hydrogen recovery process

    Science.gov (United States)

    Baker, Richard W.; Lokhandwala, Kaaeid A.; He, Zhenjie; Pinnau, Ingo

    2000-01-01

    A treatment process for a hydrogen-containing off-gas stream from a refinery, petrochemical plant or the like. The process includes three separation steps: condensation, membrane separation and hydrocarbon fraction separation. The membrane separation step is characterized in that it is carried out under conditions at which the membrane exhibits a selectivity in favor of methane over hydrogen of at least about 2.5.
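The quoted methane-over-hydrogen selectivity of at least about 2.5 can be put into rough numbers. The sketch below is illustrative only, not from the patent; it assumes a single well-mixed binary stage in the low-permeate-pressure limit, where the permeate composition ratio equals the ideal selectivity times the feed composition ratio:

```python
def permeate_fraction(x_ch4, selectivity):
    """Mole fraction of methane in the permeate for a binary CH4/H2 feed.

    Assumes the low-permeate-pressure limit, where
    y_CH4 / y_H2 = selectivity * (x_CH4 / x_H2).
    """
    x_h2 = 1.0 - x_ch4
    ratio = selectivity * x_ch4 / x_h2   # permeate ratio y_CH4 / y_H2
    return ratio / (1.0 + ratio)

# With the patent's minimum selectivity of 2.5 and an assumed 30% methane feed:
y = permeate_fraction(0.30, 2.5)
print(round(y, 3))  # ≈ 0.517
```

Under these assumptions, a 30% methane feed is enriched to roughly 52% methane in the permeate, which illustrates why even a modest selectivity of 2.5 meaningfully shifts the stream toward methane.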

  20. Process through practice

    DEFF Research Database (Denmark)

    Burry, Jane; Burry, Mark; Tamke, Martin

    2012-01-01

    This paper describes the development of a design and prototype production system for novel structural use of networked small components of wood, deploying elastic and plastic bending. The design process engaged with a significant number of different overlapping and interrelated design criteria ... simulation and feedback. The outcome was amplified through carrying out the research over a series of workshops with distinct foci and participation. Two full-scale demonstrators have so far been constructed and exhibited as outputs of the process.

  1. Process for compound transformation

    KAUST Repository

    Basset, Jean-Marie

    2016-12-29

    Embodiments of the present disclosure provide for methods of using a catalytic system to chemically transform a compound (e.g., a hydrocarbon). In an embodiment, the method does not employ grafting the catalyst prior to catalysis. In particular, embodiments of the present disclosure provide for a process of hydrocarbon (e.g., C1 to C20 hydrocarbon) metathesis (e.g., alkane, olefin, or alkyne metathesis) transformation, where the process can be conducted without employing grafting prior to catalysis.

  2. Processes of Strategic Renewal,

    OpenAIRE

    Harald Aadne, John; Mahnke, Volker

    2010-01-01

    We discuss strategic renewal from a competence perspective. We argue that the management of speed and timing in this process is viewed distinctively when perceived through a cognitive lens. Managers need more firmly grounded process understanding. The key idea of this paper is to dynamically conceptualize key activities of strategic renewal, and possible sources of breakdown, as they relate to the management of speed and timing. Based on a case from the media industry, we identi...

  3. Hydrogen production processes

    International Nuclear Information System (INIS)

    2003-01-01

    The goals of this first Gedepeon workshop on hydrogen production processes are: to stimulate the exchange of information about research programs and research advances in the domain of hydrogen production processes; to indicate the domains of interest of these processes and the potentialities linked with coupling to a nuclear reactor; and to establish the actions of common interest for the CEA, the CNRS, and eventually EDF, that can be funded in the framework of the Gedepeon research group. This document gathers the slides of the 17 presentations given at this workshop, dealing with: the H2 question and the international research programs (Lucchese P.); the CEA's research program (Lucchese P., Anzieu P.); processes based on the iodine/sulfur cycle: efficiency of a facility - flow-sheets, efficiencies, hard points (Borgard J.M.); R and D on the I/S cycle: the Bunsen reaction (Colette S.); R and D on the I/S cycle: the HI/I2/H2O system (Doizi D.); demonstration loop/chemical engineering (Duhamet J.); materials and corrosion (Terlain A.); other processes under study: the Westinghouse cycle (Eysseric C.); other processes under study at the CEA (UT3, plasma,...) (Lemort F.); a database of thermochemical cycles (Abanades S.); the Zn/ZnO cycle (Broust F.); H2 production by cracking, high temperature reforming with carbon trapping (Flamant G.); membrane technology (De Lamare J.); high-temperature electrolysis: SOFCs used as electrolyzers (Grastien R.); and generic aspects linked with hydrogen production: technical-economical evaluation of processes (Werkoff F.), thermodynamic tools (Neveu P.), and the reactor-process coupling (Aujollet P.). (J.S.)

  4. Qualitative Process Theory.

    Science.gov (United States)

    1984-07-01

    Technical Report 789: Qualitative Process Theory. Kenneth Dale Forbus, MIT Artificial Intelligence Laboratory, July 1984 (AD-A148 987). Keywords recoverable from the scanned report documentation page: common sense reasoning, mathematical reasoning, naive physics, artificial intelligence. [Remainder of the record is OCR debris from the DTIC cover page.]

  5. Electron-attachment processes

    International Nuclear Information System (INIS)

    Christophorou, L.G.; McCorkle, D.L.; Christodoulides, A.A.

    1982-01-01

    Topics covered include: (1) modes of production of negative ions, (2) techniques for the study of electron attachment processes, (3) dissociative electron attachment to ground-state molecules, (4) dissociative electron attachment to hot molecules (effects of temperature on dissociative electron attachment), (5) molecular parent negative ions, and (6) negative ions formed by ion-pair processes and by collisions of molecules with ground state and Rydberg atoms

  6. Process management practice

    International Nuclear Information System (INIS)

    Pyeon, In Beom

    1983-04-01

    This book describes the qualifying subjects and test scope, covering production planning and control, economic feasibility, process management, quality management and operations research; industrial economics, such as materials and marketing management; production management, including the meaning and goals of process management and production planning and control; basic economic concepts, such as official interest and equivalence, and depreciation; and OR concepts such as network analysis, PERT/CPM and simulation.

  7. Harmful Waste Process

    International Nuclear Information System (INIS)

    Ki, Mun Bong; Lee, Shi Jin; Park, Jun Seok; Yoon, Seok Pyo; Lee, Jae Hyo; Jo, Byeong Ryeol

    2008-08-01

    This book describes the processing of harmful waste, including the relevant law and the definition of harmful waste, current conditions and generation of harmful waste in Korea, the international situation of harmful waste, minimization of harmful waste generation, and treatment and storage. It also covers the basic science for harmful waste disposal, including physics, chemistry, combustion engineering and microbiology, and disposal techniques such as physical, chemical and biological processes, stabilization and solidification, incineration, and landfill.

  8. Natural Information Processing Systems

    OpenAIRE

    John Sweller; Susan Sweller

    2006-01-01

    Natural information processing systems such as biological evolution and human cognition organize information used to govern the activities of natural entities. When dealing with biologically secondary information, these systems can be specified by five common principles that we propose underlie natural information processing systems. The principles equate: (1) human long-term memory with a genome; (2) learning from other humans with biological reproduction; (3) problem solving through random ...

  9. Processing Of Binary Images

    Science.gov (United States)

    Hou, H. S.

    1985-07-01

    An overview of the recent progress in the area of digital processing of binary images in the context of document processing is presented here. The topics covered include input scan, adaptive thresholding, halftoning, scaling and resolution conversion, data compression, character recognition, electronic mail, digital typography, and output scan. Emphasis has been placed on illustrating the basic principles rather than descriptions of a particular system. Recent technology advances and research in this field are also mentioned.
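Among the topics this overview lists, adaptive thresholding lends itself to a short sketch. The fragment below is a generic mean-based version, an assumption for illustration rather than the method of the paper: each pixel is binarized against the mean of its local window, so the threshold adapts to local brightness.

```python
def adaptive_threshold(img, window=3, offset=0):
    """Binarize a grayscale image (list of lists of ints).

    Each pixel becomes 1 if it exceeds the mean of its local
    window-by-window neighborhood (clipped at the borders) plus
    `offset`, else 0.
    """
    h, w = len(img), len(img[0])
    r = window // 2
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[yy][xx]
                    for yy in range(max(0, y - r), min(h, y + r + 1))
                    for xx in range(max(0, x - r), min(w, x + r + 1))]
            local_mean = sum(vals) / len(vals)
            out[y][x] = 1 if img[y][x] > local_mean + offset else 0
    return out

# A bright pixel on a dark background survives; the background does not.
binary = adaptive_threshold([[10, 10, 10],
                             [10, 200, 10],
                             [10, 10, 10]])
print(binary[1][1], binary[0][0])  # 1 0
```

Because the threshold is computed per neighborhood, this kind of scheme copes with uneven illumination across a scanned document, which a single global threshold does not.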

  10. Personal sale process

    Directory of Open Access Journals (Sweden)

    Gašović Milan

    2002-01-01

    Experience from the prior successful sales of many companies in different business activities tells us that it is necessary to create an approach system flexible enough for different buyers and environments. The basis of this system is the belief that salesmen can stimulate big buyers to make buying decisions if the selling process is done well. Emphasis is placed on the practical selling techniques used throughout the selling process.

  11. Synroc processing options

    International Nuclear Information System (INIS)

    Rozsa, R.B.; Hoenig, C.L.

    1981-01-01

    Synroc is a titanate-based ceramic material currently being developed for immobilizing high-level nuclear reactor wastes in solid form. Synroc D is a unique variation of Synroc. It can contain the high-level defense wastes, particularly those in storage at the Savannah River Plant. In this report, we review the early development of the initial Synroc process, discuss modification and other options that simplify it overall, and recommend the future direction of research and development in the processing area. A reference Synroc process is described briefly and contrasted with the Savannah River Laboratory glass-based reference case. Preliminary engineering layouts show Synroc to be a more complex processing operation and, thus, more expensive than the glass-based process. However, we believe that simplifications, which will significantly reduce the cost difference, are possible. Further research and development will continue in the areas of slurry processing, fluidized bed calcination, and mineralization. This last will use sintering, hot uniaxial pressing, or hot isostatic pressing

  12. Ultrasound in chemical processes

    International Nuclear Information System (INIS)

    Baig, S.; Farooq, R.; Malik, A.H.

    2009-01-01

    The use of ultrasound to promote chemical reactions, or sonochemistry, is a field of chemistry that involves acoustic cavitation, i.e. the collapse of microscopic bubbles in a liquid. There are two essential components for the application of sonochemistry: a liquid medium and a source of high-energy vibrations. The liquid medium is necessary because sonochemistry is driven by acoustic cavitation, which can only occur in liquids. The source of the vibrational energy is the transducer. The chemical effects of ultrasound include the enhancement of reaction rates at ambient temperatures and striking advancements in stoichiometric and catalytic reactions. In some cases, ultrasonic irradiation can increase reactivities by nearly a million-fold. Ultrasound has a large number of applications, not only in improving old chemical processes but also in developing new synthetic strategies. Ultrasound enhances many chemical and physical processes, e.g. crystallization, vitamin synthesis, preparation of catalysts, dissolution of chemicals, organometallic reactions, electrochemical processes, etc. High-power ultrasonics is a powerful new technology that is not only safe and environmentally friendly in its application but also efficient and economical. It can be applied to existing processes to eliminate the need for chemicals and/or heat in a variety of industrial processes. (author)

  13. Customer Innovation Process Leadership

    DEFF Research Database (Denmark)

    Lindgren, Peter; Jørgensen, Jacob Høj; Goduscheit, René Chester

    2007-01-01

    Innovation leadership has traditionally been focused on leading the company's product development fast, cost-effectively and with optimal performance, driven by technological inventions or by customers' needs. To improve the efficiency of the product development process, focus has been on different types of organisational setup for the product development model and process. Globalization and increasingly competitive markets are, however, changing the innovation game and the challenge to innovation leadership: excellent product development innovation and leadership seem no longer to be enough. The paper offers another outlook on future innovation leadership - Customer Innovation Process Leadership - CIP-leadership. CIP-leadership moves the company's innovation process closer to the customer innovation process and discusses how companies can be involved and innovate in customers' future needs and lead...

  14. Process Management Plans

    Directory of Open Access Journals (Sweden)

    Tomasz Miksa

    2014-07-01

    In the era of research infrastructures and big data, sophisticated data management practices are becoming essential building blocks of successful science. Most practices follow a data-centric approach, which does not take into account the processes that created, analysed and presented the data. This fact limits the possibilities for reliable verification of results. Furthermore, it does not guarantee the reuse of research, which is one of the key aspects of credible data-driven science. For that reason, we propose the introduction of the new concept of Process Management Plans, which focus on the identification, description, sharing and preservation of the entire scientific processes. They enable verification and later reuse of result data and processes of scientific experiments. In this paper we describe the structure and explain the novelty of Process Management Plans by showing in what way they complement existing Data Management Plans. We also highlight key differences, major advantages, as well as references to tools and solutions that can facilitate the introduction of Process Management Plans.

  15. Pennsylvania's partnering process

    International Nuclear Information System (INIS)

    Latham, J.W.

    1996-01-01

    Pennsylvania is committed to finding a site for a low-level radioactive waste (LLRW) disposal facility through an innovative voluntary process. The Pennsylvania Department of Environmental Protection (DEP) and Chem-Nuclear Systems, Inc. (CNSI) developed the Community Partnering Plan with extensive public participation. The Community Partnering Plan outlines a voluntary process that empowers municipalities to evaluate the advantages and disadvantages of hosting the facility. DEP and CNSI began developing the Community Partnering Plan in July 1995. Before then, CNSI was using a screening process prescribed by state law and regulations to find a location for the facility. So far, approximately 78 percent of the Commonwealth has been identified as disqualified as a site for the LLRW disposal facility. The siting effort will now focus on identifying volunteer host municipalities in the remaining 22 percent of the state. This combination of technical screening and voluntary consideration makes Pennsylvania's process unique. A volunteered site will have to meet the same tough requirements for protecting people and the environment as a site chosen through the screening process. Protection of public health and safety continues to be the foundation of the state's siting efforts. The Community Partnering Plan offers a window of opportunity. If Pennsylvania does not find volunteer municipalities with suitable sites by the end of 1997, it probably will return to a technical screening process

  16. An Extended Petri-Net Based Approach for Supply Chain Process Enactment in Resource-Centric Web Service Environment

    Science.gov (United States)

    Wang, Xiaodong; Zhang, Xiaoyu; Cai, Hongming; Xu, Boyi

    Enacting a supply-chain process involves variant partners and different IT systems. REST receives increasing attention for distributed systems with loosely coupled resources. Nevertheless, resource model incompatibilities and conflicts prevent effective process modeling and deployment in resource-centric Web service environment. In this paper, a Petri-net based framework for supply-chain process integration is proposed. A resource meta-model is constructed to represent the basic information of resources. Then based on resource meta-model, XML schemas and documents are derived, which represent resources and their states in Petri-net. Thereafter, XML-net, a high level Petri-net, is employed for modeling control and data flow of process. From process model in XML-net, RESTful services and choreography descriptions are deduced. Therefore, unified resource representation and RESTful services description are proposed for cross-system integration in a more effective way. A case study is given to illustrate the approach and the desirable features of the approach are discussed.
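The control-flow side of a Petri-net process model like the one described above can be sketched with a minimal token-game executor. This is an illustrative toy, not the paper's XML-net: place names, transition names and the order-fulfilment example are invented, and tokens here are anonymous counters rather than XML documents.

```python
class PetriNet:
    """Minimal Petri net: places hold token counts; a transition fires
    when every input place holds at least one token."""

    def __init__(self, marking):
        self.marking = dict(marking)      # place name -> token count
        self.transitions = {}             # name -> (input places, output places)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (list(inputs), list(outputs))

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) > 0 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"transition {name!r} is not enabled")
        inputs, outputs = self.transitions[name]
        for p in inputs:                  # consume one token per input place
            self.marking[p] -= 1
        for p in outputs:                 # produce one token per output place
            self.marking[p] = self.marking.get(p, 0) + 1

# A toy supply-chain fragment: an order and available stock are both
# required to reserve; a reservation is required to ship.
net = PetriNet({"order_placed": 1, "stock_available": 1})
net.add_transition("reserve", ["order_placed", "stock_available"], ["reserved"])
net.add_transition("ship", ["reserved"], ["shipped"])
net.fire("reserve")
net.fire("ship")
print(net.marking["shipped"])  # 1
```

In an XML-net, as used in the paper, the tokens would be structured XML documents representing resource states, and firing would additionally transform those documents; the enable/consume/produce discipline sketched here is the underlying Petri-net part.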

  17. Continuous Correctness of Business Processes Against Process Interference

    NARCIS (Netherlands)

    van Beest, Nick; Bucur, Doina

    2013-01-01

    In distributed business process support environments, process interference from multiple stakeholders may cause erroneous process outcomes. Existing solutions to detect and correct interference at runtime employ formal verification and the automatic generation of intervention processes at runtime.

  18. Advanced materials processing

    International Nuclear Information System (INIS)

    Giamei, A.F.

    1993-01-01

    Advanced materials will require improved processing methods due to high melting points, low toughness or ductility values, high reactivity with air or ceramics, and typically complex crystal structures with significant anisotropy in flow and/or fracture stress. Materials for structural applications at elevated temperature in critical systems will require processing with a high degree of control. This requires an improved understanding of the relationship between process variables and microstructure to enable control systems to achieve consistently high quality. One avenue to the required level of understanding is computer simulation. Past attempts at process modeling have been hampered by incomplete data regarding thermophysical or mechanical material behavior. Some of the required data can be calculated. Thanks to advances in software and hardware, such calculations now approach experimental data acquisition in accuracy and cost. They can, for example, be done at an atomic level to compute lattice energy, fault energies, density of states and charge densities. These can lead to fundamental information about the competition between slip and fracture, anisotropy of bond strength (and therefore cleavage strength), cohesive strength, adhesive strength, elastic modulus, thermal expansion and possibly other quantities which are difficult (and therefore expensive) to measure. Some of these quantities can be fed into a process model. It is probable that temperature dependencies can be derived numerically as well. Examples are given of the beginnings of such an approach for Ni3Al and MoSi2. Solidification problems exemplify the state of the art in process modeling and adequately demonstrate the need for extensive input data. Such processes can be monitored in terms of interfacial position vs. time, cooling rate and thermal gradient

  19. Novel food processing techniques

    Directory of Open Access Journals (Sweden)

    Vesna Lelas

    2006-12-01

    Recently, many investigations have focused on the development of novel mild food processing techniques with the aim of obtaining high-quality food products. It is also presumed that they could substitute for some of the traditional processes in the food industry. The investigations are primarily directed at the use of high hydrostatic pressure, ultrasound, tribomechanical micronization, microwaves and pulsed electric fields. The results of scientific research show that application of some of these processes in a particular food industry can bring many benefits: significant energy savings, shortened process duration, mild thermal conditions, and food products with better sensory characteristics and higher nutritional value. As some of these techniques also act at the molecular level, changing the conformation, structure and electrical potential of organic as well as inorganic materials, some functional properties of these components may improve. Common characteristics of all of these techniques are treatment at ambient or only slightly higher temperatures and short processing times (1 to 10 minutes). High hydrostatic pressure applied to various foodstuffs can destroy some microorganisms, successfully modify molecular conformation and consequently improve the functional properties of foods. At the same time it acts positively on food products intended for freezing. Tribomechanical treatment causes micronization of various solid materials, resulting in nanoparticles and changes in the structure and electrical potential of molecules; therefore, significant improvement of some rheological and functional properties of materials occurs. Ultrasound treatment has proved to be a potentially very successful food processing technique. It can be used as a pretreatment to drying (it decreases drying time and improves the functional properties of food), and as an extraction process for various components

  20. Radiation processing of polysaccharides

    International Nuclear Information System (INIS)

    2004-11-01

    Radiation processing is a very convenient tool for imparting desirable effects in polymeric materials and it has been an area of enormous interest in the last few decades. The success of radiation technology for processing of synthetic polymers can be attributed to two reasons namely, their ease of processing in various shapes and sizes, and secondly, most of these polymers undergo crosslinking reaction upon exposure to radiation. In recent years, natural polymers are being looked at with renewed interest because of their unique characteristics, such as inherent biocompatibility, biodegradability and easy availability. Traditionally, the commercial exploitation of natural polymers like carrageenans, alginates or starch etc. has been based, to a large extent, on empirical knowledge. But now, the applications of natural polymers are being sought in knowledge - demanding areas such as pharmacy and biotechnology, which is acting as a locomotive for further scientific research in their structure-function relationship. Selected success stories concerning radiation processed natural polymers and application of their derivatives in the health care products industries and agriculture are reported. This publication will be of interest to individuals at nuclear institutions worldwide that have programmes of R and D and applications in radiation processing technologies. New developments in radiation processing of polymers and other natural raw materials give insight into converting them into useful products for every day life, human health and environmental remediation. The book will also be of interest to other field specialists, readers including managers and decision makers in industry (health care, food and agriculture) helping them to understand the important role of radiation processing technology in polysaccharides

  1. Nonconscious processes and health.

    Science.gov (United States)

    Sheeran, Paschal; Gollwitzer, Peter M; Bargh, John A

    2013-05-01

    Health behavior theories focus on the role of conscious, reflective factors (e.g., behavioral intentions, risk perceptions) in predicting and changing behavior. Dual-process models, on the other hand, propose that health actions are guided not only by a conscious, reflective, rule-based system but also by a nonconscious, impulsive, associative system. This article argues that research on health decisions, actions, and outcomes will be enriched by greater consideration of nonconscious processes. A narrative review is presented that delineates research on implicit cognition, implicit affect, and implicit motivation. In each case, we describe the key ideas, how they have been taken up in health psychology, and the possibilities for behavior change interventions, before outlining directions that might profitably be taken in future research. Correlational research on implicit cognitive and affective processes (attentional bias and implicit attitudes) has recently been supplemented by intervention studies using implementation intentions and practice-based training that show promising effects. Studies of implicit motivation (health goal priming) have also observed encouraging findings. There is considerable scope for further investigations of implicit affect control, unconscious thought, and the automatization of striving for health goals. Research on nonconscious processes holds significant potential that can and should be developed by health psychologists. Consideration of impulsive as well as reflective processes will engender new targets for intervention and should ultimately enhance the effectiveness of behavior change efforts. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  2. Staging Collaborative Innovation Processes

    DEFF Research Database (Denmark)

    Pedersen, Signe; Clausen, Christian

Organisations are currently challenged by demands for increased collaborative innovation internally as well as with external and new entities - e.g. across the value chain. The authors seek to develop new approaches to managing collaborative innovation processes in the context of open innovation and public-private innovation partnerships. Based on a case study of a collaborative design process in a large electronics company, the paper points to the key importance of staging and navigating the collaborative innovation process. Staging and navigation are presented as a combined activity: 1) to translate the diverse matters of concern into a coherent product or service concept, and 2) in the same process to move these diverse holders of the matters of concern into a translated actor network which carries or supports the concept.

  3. Range Process Simulation Tool

    Science.gov (United States)

    Phillips, Dave; Haas, William; Barth, Tim; Benjamin, Perakath; Graul, Michael; Bagatourova, Olga

    2005-01-01

Range Process Simulation Tool (RPST) is a computer program that assists managers in rapidly predicting and quantitatively assessing the operational effects of proposed technological additions to, and/or upgrades of, complex facilities and engineering systems such as the Eastern Test Range. Originally designed for application to space transportation systems, RPST is also suitable for assessing effects of proposed changes in industrial facilities and large organizations. RPST follows a model-based approach that includes finite-capacity schedule analysis and discrete-event process simulation. A component-based, scalable, open architecture makes RPST easily and rapidly tailorable for diverse applications. Specific RPST functions include: (1) definition of analysis objectives and performance metrics; (2) selection of process templates from a process-template library; (3) configuration of process models for detailed simulation and schedule analysis; (4) design of operations-analysis experiments; (5) schedule and simulation-based process analysis; and (6) optimization of performance by use of genetic algorithms and simulated annealing. The main benefits afforded by RPST are provision of information that can be used to reduce costs of operation and maintenance, and the capability for affordable, accurate, and reliable prediction and exploration of the consequences of many alternative proposed decisions.
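
As a toy illustration of the last listed function, optimization by simulated annealing, the sketch below balances task durations across two stations. All task data, parameter values, and function names are invented for illustration and are not part of RPST:

```python
import math
import random

def makespan(assignment, durations, n_stations):
    """Finish time of the busiest station under a task-to-station assignment."""
    loads = [0.0] * n_stations
    for task, station in enumerate(assignment):
        loads[station] += durations[task]
    return max(loads)

def anneal_schedule(durations, n_stations, steps=20000, t0=10.0, seed=0):
    """Simulated annealing over assignments: always accept an improvement,
    accept a worsening move with probability exp(-delta / temperature)."""
    rng = random.Random(seed)
    current = [rng.randrange(n_stations) for _ in durations]
    cur_e = makespan(current, durations, n_stations)
    best, best_e = list(current), cur_e
    for step in range(steps):
        temp = t0 * (1.0 - step / steps) + 1e-9   # linear cooling schedule
        task = rng.randrange(len(durations))
        old = current[task]
        current[task] = rng.randrange(n_stations)  # propose moving one task
        new_e = makespan(current, durations, n_stations)
        if new_e <= cur_e or rng.random() < math.exp((cur_e - new_e) / temp):
            cur_e = new_e
            if new_e < best_e:
                best, best_e = list(current), new_e
        else:
            current[task] = old                    # reject the move
    return best, best_e

durations = [3.0, 5.0, 2.0, 7.0, 4.0, 3.0]
best, best_e = anneal_schedule(durations, n_stations=2)
print(best_e)  # the optimum for this instance is 12.0 (24.0 of work over 2 stations)
```

Real schedule optimizers add capacity constraints and precedence relations; the acceptance rule and cooling loop are the part this sketch is meant to show.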

  4. VLSI signal processing technology

    CERN Document Server

    Swartzlander, Earl

    1994-01-01

This book is the first in a set of forthcoming books focussed on state-of-the-art development in the VLSI Signal Processing area. It is a response to the tremendous research activities taking place in that field. These activities have been driven by two factors: the dramatic increase in demand for high speed signal processing, especially in consumer electronics, and the evolving microelectronic technologies. The available technology has always been one of the main factors in determining algorithms, architectures, and design strategies to be followed. With every new technology, signal processing systems go through many changes in concepts, design methods, and implementation. The goal of this book is to introduce the reader to the main features of VLSI Signal Processing and the ongoing developments in this area. The focus of this book is on: • Current developments in Digital Signal Processing (DSP) processors and architectures - several examples and case studies of existing DSP chips are discussed in...

  5. Chemical process engineering in the transuranium processing plant

    International Nuclear Information System (INIS)

    Collins, E.D.; Bigelow, J.E.

    1976-01-01

    Since operation of the Transuranium Processing Plant began, process changes have been made to counteract problems caused by equipment corrosion, to satisfy new processing requirements, and to utilize improved processes. The new processes, equipment, and techniques have been incorporated into a sequence of steps which satisfies all required processing functions

  6. A simulation of the SDC on-line processing farm

    International Nuclear Information System (INIS)

    Wang, C.; Chen, Y.; Dorenbosch, J.; Lee, J.; Sayle, R.

    1993-10-01

    In the Solenoidal Detector Collaboration (SDC) data acquisition system (DAQ), an enormous amount of data flows into a processor farm for extraction of interesting physics events. To design an efficient on-line filter, the operations in the farm must be carefully modeled. The authors present a simulation model developed at the Superconducting Super Collider Laboratory which efficiently allocates physics events to the farm
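
The kind of event allocation described above can be mocked up with a simple discrete-event model. The sketch below is purely illustrative, with invented arrival and service rates rather than anything from the SSCL simulation; it dispatches each event to the processor that frees up earliest:

```python
import heapq
import random

def simulate_farm(n_workers, n_events, seed=0):
    """Dispatch each incoming event to the earliest-free worker and
    return the time at which the last event finishes processing."""
    rng = random.Random(seed)
    # min-heap of (time_free, worker_id)
    free_at = [(0.0, w) for w in range(n_workers)]
    heapq.heapify(free_at)
    arrival = 0.0
    for _ in range(n_events):
        arrival += rng.expovariate(1.0)        # mean inter-arrival time 1.0
        service = rng.expovariate(1.0 / 8.0)   # mean filtering time 8.0
        t_free, worker = heapq.heappop(free_at)
        start = max(arrival, t_free)           # event waits if worker is busy
        heapq.heappush(free_at, (start + service, worker))
    return max(t for t, _ in free_at)

print(simulate_farm(n_workers=16, n_events=1000))
```

With these rates the farm runs at roughly 50% utilization, so the finish time tracks the last arrival; raising the service mean or cutting workers quickly exposes the queueing bottleneck such models are built to find.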

  7. Acoustic Levitation Containerless Processing

    Science.gov (United States)

    Whymark, R. R.; Rey, C. A.

    1985-01-01

This research program consists of the development of acoustic containerless processing systems with applications in materials science research, as well as in the production of new materials, solid forms with novel and unusual microstructures, fusion target spheres, and improved optical fibers. Efforts have been focused on containerless processing at high temperatures for producing new kinds of glasses. Some development has also occurred in the area of containerless support of liquids at room temperature, with applications in studies of fluid dynamics, potential undercooling of liquids, etc. The high-temperature area holds the greatest promise for producing new kinds of glasses and ceramics, new alloys, and possibly unusual structural shapes, such as very uniform hollow glass shells for fusion target applications. High-temperature acoustic levitation required for containerless processing has been demonstrated in low-g environments as well as in ground-based experiments. Future activities include continued development of the single-axis acoustic levitator.

  8. A complementary MOS process

    International Nuclear Information System (INIS)

    Jhabvala, M.D.

    1977-03-01

    The complete sequence used to manufacture complementary metal oxide semiconductor (CMOS) integrated circuits is described. The fixed-gate array concept is presented as a means of obtaining CMOS integrated circuits in a fast and reliable fashion. Examples of CMOS circuits fabricated by both the conventional method and the fixed-gate array method are included. The electrical parameter specifications and characteristics are given along with typical values used to produce CMOS circuits. Temperature-bias stressing data illustrating the thermal stability of devices manufactured by this process are presented. Results of a preliminary study on the radiation sensitivity of circuits manufactured by this process are discussed. Some process modifications are given which have improved the radiation hardness of our CMOS devices. A formula description of the chemicals and gases along with the gas flow rates is also included

  9. Radiation processes in astrophysics

    CERN Document Server

    Tucker, Wallace H

    1975-01-01

The purpose of this book is twofold: to provide a brief, simple introduction to the theory of radiation and its application in astrophysics and to serve as a reference manual for researchers. The first part of the book consists of a discussion of the basic formulas and concepts that underlie the classical and quantum descriptions of radiation processes. The rest of the book is concerned with applications. The spirit of the discussion is to present simple derivations that will provide some insight into the basic physics involved and then to state the exact results in a form useful for applications. The reader is referred to the original literature and to reviews for rigorous derivations. The wide range of topics covered is illustrated by the following table of contents: Basic Formulas for Classical Radiation Processes; Basic Formulas for Quantum Radiation Processes; Cyclotron and Synchrotron Radiation; Electron Scattering; Bremsstrahlung and Collision Losses; Radiative Recombination; The Photoelectric Effect; a...

  10. The Nursing Process

    Directory of Open Access Journals (Sweden)

    M. Hammond

    1978-09-01

Full Text Available The essence of the nursing process can be summed up in this quotation by Sir Francis Bacon: “Human knowledge and human powers meet in one; for where the cause is not known the effect cannot be produced.” Arriving at a concise, accurate definition of the nursing process was, for me, an impossible task. It is altogether too vast and too personal a topic to contract down into a nifty-looking, we-pay-lip-service-to-it cliché. So what I propose to do is to present my understanding of the nursing process throughout this essay, and then to leave the reader with some overall, general impression of what it all entails.

  11. Stochastic processes inference theory

    CERN Document Server

    Rao, Malempati M

    2014-01-01

    This is the revised and enlarged 2nd edition of the authors’ original text, which was intended to be a modest complement to Grenander's fundamental memoir on stochastic processes and related inference theory. The present volume gives a substantial account of regression analysis, both for stochastic processes and measures, and includes recent material on Ridge regression with some unexpected applications, for example in econometrics. The first three chapters can be used for a quarter or semester graduate course on inference on stochastic processes. The remaining chapters provide more advanced material on stochastic analysis suitable for graduate seminars and discussions, leading to dissertation or research work. In general, the book will be of interest to researchers in probability theory, mathematical statistics and electrical and information theory.
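
The Ridge regression the abstract mentions has a simple closed form. As a minimal sketch with one predictor, no intercept, and invented data (names and numbers are illustrative, not from the book):

```python
def ridge_1d(xs, ys, lam):
    """Closed-form ridge estimate for y ~ w*x (no intercept):
    w = sum(x*y) / (sum(x*x) + lambda)."""
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    return sxy / (sxx + lam)

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]   # roughly y = 2x
print(round(ridge_1d(xs, ys, lam=0.0), 3))  # lambda = 0 is ordinary least squares
print(round(ridge_1d(xs, ys, lam=5.0), 3))  # a positive lambda shrinks w toward zero
```

The penalty term `lam` in the denominator is what trades a little bias for reduced variance, which is the property exploited in the econometric applications the abstract alludes to.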

  12. Oxygen Dependent Biocatalytic Processes

    DEFF Research Database (Denmark)

    Pedersen, Asbjørn Toftgaard

Enzyme catalysts have the potential to improve both the process economics and the environmental profile of many oxidation reactions, especially in the fine- and specialty-chemical industry, due to their exquisite ability to perform stereo-, regio- and chemo-selective oxidations at ambient...... to aldehydes and ketones, oxyfunctionalization of C-H bonds, and epoxidation of C-C double bonds. Although oxygen dependent biocatalysis offers many possibilities, there are numerous challenges to be overcome before an enzyme can be implemented in an industrial process. These challenges require the combined...... far below their potential maximum catalytic rate at industrially relevant oxygen concentrations. Detailed knowledge of the enzyme kinetics is therefore required in order to determine the best operating conditions and design oxygen supply to minimize processing costs. This is enabled...

  13. Cryogenic process simulation

    International Nuclear Information System (INIS)

    Panek, J.; Johnson, S.

    1994-01-01

    Combining accurate fluid property databases with a commercial equation-solving software package running on a desktop computer allows simulation of cryogenic processes without extensive computer programming. Computer simulation can be a powerful tool for process development or optimization. Most engineering simulations to date have required extensive programming skills in languages such as Fortran, Pascal, etc. Authors of simulation code have also usually been responsible for choosing and writing the particular solution algorithm. This paper describes a method of simulating cryogenic processes with a commercial software package on a desktop personal computer that does not require these traditional programming tasks. Applications include modeling of cryogenic refrigerators, heat exchangers, vapor-cooled power leads, vapor pressure thermometers, and various other engineering problems
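
A minimal illustration of this equation-solving approach, stated as a hypothetical heat-intercept balance with invented conductivity and geometry values rather than any real cryostat data: the only "programming" is writing the residual, and a generic root finder does the rest.

```python
def conduction(k0, k1, area, length, t_hot, t_cold):
    """Heat flow (W) through a strut with conductivity k(T) = k0 + k1*T,
    integrated analytically: Q = (A/L) * [k0*dT + k1*(Th^2 - Tc^2)/2]."""
    return (area / length) * (k0 * (t_hot - t_cold)
                              + 0.5 * k1 * (t_hot**2 - t_cold**2))

def solve_intercept(q_station, lo=4.0, hi=300.0, tol=1e-9):
    """Find the intercept temperature Tm where heat arriving from the warm
    end equals heat leaving to the cold end plus the heat removed at the
    intercept station (all numbers here are illustrative)."""
    def residual(tm):
        q_in = conduction(0.2, 1e-3, 1e-4, 0.5, 300.0, tm)   # warm side
        q_out = conduction(0.2, 1e-3, 1e-4, 0.5, tm, 4.0)    # cold side
        return q_in - q_out - q_station
    a, b = lo, hi
    while b - a > tol:                  # bisection: robust, derivative-free
        m = 0.5 * (a + b)
        if residual(a) * residual(m) <= 0:
            b = m
        else:
            a = m
    return 0.5 * (a + b)

tm = solve_intercept(q_station=0.01)
print(round(tm, 2))
```

Commercial equation-solving packages of the kind the abstract describes replace the hand-rolled bisection with a general nonlinear solver and the toy `conduction` model with property-database lookups, but the structure of the problem is the same.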

  14. Total process surveillance: (TOPS)

    International Nuclear Information System (INIS)

    Millar, J.H.P.

    1992-01-01

A Total Process Surveillance system is under development which can provide, in real-time, additional process information from a limited number of raw measurement signals. This is achieved by using a robust model-based observer to generate estimates of the process' internal states. The observer utilises the analytical redundancy among a diverse range of transducers and can thus accommodate off-normal conditions which lead to transducer loss or damage. The modular hierarchical structure of the system enables the maximum amount of information to be assimilated from the available instrument signals no matter how diverse. This structure also constitutes a data reduction path, thus reducing operator cognitive overload from a large number of varying, and possibly contradictory, raw plant signals. (orig.)
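
A model-based observer of the kind described can be sketched as a discrete-time Luenberger observer: a copy of the plant model driven by the mismatch between measured and predicted output. The two-state model, gains, and numbers below are invented for illustration, not taken from TOPS:

```python
def step_plant(x, u):
    """Two-state process model; only the first state is measured."""
    x1 = 0.9 * x[0] + 0.1 * x[1]
    x2 = 0.8 * x[1] + 0.2 * u
    return [x1, x2]

def step_observer(xh, u, y, l1=0.5, l2=0.3):
    """Luenberger observer: model prediction corrected by the innovation
    (measured output minus predicted output), scaled by gains l1, l2."""
    pred = step_plant(xh, u)
    err = y - xh[0]                  # innovation from the single measurement
    return [pred[0] + l1 * err, pred[1] + l2 * err]

x, xh = [1.0, 2.0], [0.0, 0.0]       # true state vs. initial estimate
for _ in range(50):
    u = 1.0
    y = x[0]                          # the one raw measurement signal
    xh = step_observer(xh, u, y)
    x = step_plant(x, u)
print([round(v, 3) for v in xh])      # estimate of both states, including the unmeasured one
```

The gains here place the error dynamics' eigenvalues at 0.7 and 0.5, so the estimate of the unmeasured second state converges even though only the first state is instrumented; that is the "additional process information" an observer buys.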

  15. Near field transport processes

    International Nuclear Information System (INIS)

    Neretnieks, I.

    1991-01-01

    In repositories for nuclear waste there are many processes which will be instrumental in corroding the canisters and releasing the nuclides. Based on experiences from studies on the performance of repositories and on an actual design the major mechanisms influencing the integrity and performance of a repository are described and discussed. The paper addresses only conditions in crystalline rock repositories. The low water flow rate in fractures and channels plays a dominant role in limiting the interaction between water and waste. Molecular diffusion in the backfill and rock matrix as well as in the mobile water is an important transport process but actually limits the exchange rate because diffusive transport is slow. Solubility limits of both waste matrix and of individual nuclides are also important. Complicating processes include gas generation by iron corrosion and alpha-radiolysis. (au) (19 refs., 2 figs.)

  16. Topological signal processing

    CERN Document Server

    Robinson, Michael

    2014-01-01

    Signal processing is the discipline of extracting information from collections of measurements. To be effective, the measurements must be organized and then filtered, detected, or transformed to expose the desired information.  Distortions caused by uncertainty, noise, and clutter degrade the performance of practical signal processing systems. In aggressively uncertain situations, the full truth about an underlying signal cannot be known.  This book develops the theory and practice of signal processing systems for these situations that extract useful, qualitative information using the mathematics of topology -- the study of spaces under continuous transformations.  Since the collection of continuous transformations is large and varied, tools which are topologically-motivated are automatically insensitive to substantial distortion. The target audience comprises practitioners as well as researchers, but the book may also be beneficial for graduate students.

  17. Posttranslational processing of progastrin

    DEFF Research Database (Denmark)

    Bundgaard, Jens René; Rehfeld, Jens F.

    2010-01-01

as a major neurotransmitter in the central and peripheral nervous systems. The tissue-specific expression of the hormones is regulated at the transcriptional level, but the posttranslational phase is also decisive and is highly complex in order to ensure accurate maturation of the prohormones in a cell...... picomolar concentrations, whereas gastrin is expressed at higher levels and accordingly circulates in 10-20-fold higher concentrations. Both common cancers and the less frequent neuroendocrine tumors express the gastrin gene and prohormone. But the posttranslational......, as are structural features of progastrin that are involved in the precursor activation process. Thus, the review describes how the processing depends on the cell-specific expression of the processing enzymes and kinetics in the secretory pathway....

  18. Quantum independent increment processes

    CERN Document Server

    Franz, Uwe

    2005-01-01

This volume is the first of two volumes containing the revised and completed notes of lectures given at the school "Quantum Independent Increment Processes: Structure and Applications to Physics". This school was held at the Alfried-Krupp-Wissenschaftskolleg in Greifswald during the period March 9 – 22, 2003, and supported by the Volkswagen Foundation. The school gave an introduction to current research on quantum independent increment processes aimed at graduate students and non-specialists working in classical and quantum probability, operator algebras, and mathematical physics. The present first volume contains the following lectures: "Lévy Processes in Euclidean Spaces and Groups" by David Applebaum, "Locally Compact Quantum Groups" by Johan Kustermans, "Quantum Stochastic Analysis" by J. Martin Lindsay, and "Dilations, Cocycles and Product Systems" by B.V. Rajarama Bhat.

  19. Nuclear process heat

    Energy Technology Data Exchange (ETDEWEB)

    Schulten, R [Kernforschungsanlage Juelich G.m.b.H. (F.R. Germany). Inst. fuer Reaktorentwicklung

    1976-05-01

    It is anticipated that the coupled utilization of coal and nuclear energy will achieve great importance in the future, the coal serving mainly as raw material and nuclear energy more as primary energy. Prerequisite for this development is the availability of high-temperature reactors, the state of development of which is described here. Raw materials for coupled use with nuclear process heat are petroleum, natural gas, coal, lignite, and water. Steam reformers heated by nuclear process heat, which are suitable for numerous processes, are expected to find wide application. The article describes several individual methods, all based on the transport of gas in pipelines, which could be utilized for the long distance transport of 'nuclear energy'.

  20. Laser Processing and Chemistry

    CERN Document Server

    Bäuerle, Dieter

    2011-01-01

    This book gives an overview of the fundamentals and applications of laser-matter interactions, in particular with regard to laser material processing. Special attention is given to laser-induced physical and chemical processes at gas-solid, liquid-solid, and solid-solid interfaces. Starting with the background physics, the book proceeds to examine applications of lasers in “standard” laser machining and laser chemical processing (LCP), including the patterning, coating, and modification of material surfaces. This fourth edition has been enlarged to cover the rapid advances in the understanding of the dynamics of materials under the action of ultrashort laser pulses, and to include a number of new topics, in particular the increasing importance of lasers in various different fields of surface functionalizations and nanotechnology. In two additional chapters, recent developments in biotechnology, medicine, art conservation and restoration are summarized. Graduate students, physicists, chemists, engineers, a...

  1. Radiation processing and sterilization

    International Nuclear Information System (INIS)

    Takehisa, M.; Machi, S.

    1987-01-01

The growth of commercial radiation processing has been largely dependent on achievements in the production of reliable and less expensive radiation facilities, as well as on research and development efforts toward new applications. Although world statistics of the growth are not available, Figure 20-1 shows steady growth in the number of EBAs installed in Japan for various purposes. The growth rate of Co-60 sources supplied by AECL (Atomic Energy of Canada Limited), which supplies approximately 80% of the world market, is approximately 10% per year, including future growth estimates. Potential applications of radiation processing under development are in environmental conservation (e.g., treatment of sewage sludge, waste water, and exhaust gases) and bioengineering (e.g., immobilization of bioactive materials). The authors introduce here the characteristics of radiation processing, examples of its industrial applications, the status of its research and development activities, and an economic analysis

  2. Plasma processing of nanomaterials

    CERN Document Server

    Sankaran, R Mohan

    2014-01-01

    CRC Press author R. Mohan Sankaran is the winner of the 2011 Peter Mark Memorial Award "… for the development of a tandem plasma synthesis method to grow carbon nanotubes with unprecedented control over the nanotube properties and chirality." -2011 AVS Awards Committee"Readers who want to learn about how nanomaterials are processed, using the most recent methods, will benefit greatly from this book. It contains very recent technical details on plasma processing and synthesis methods used by current researchers developing new nano-based materials, with all the major plasma-based processing techniques used today being thoroughly discussed."-John J. Shea, IEEE Electrical Insulation Magazine, May/June 2013, Vol. 29, No. 3.

  3. WWTP Process Tank Modelling

    DEFF Research Database (Denmark)

    Laursen, Jesper

The present thesis considers numerical modeling of activated sludge tanks on municipal wastewater treatment plants. Focus is aimed at integrated modeling, where the detailed microbiological model, the Activated Sludge Model 3 (ASM3), is combined with a detailed hydrodynamic model based on a numerical solution of the Navier-Stokes equations in a multiphase scheme. After a general introduction to the activated sludge tank as a system, the activated sludge tank model is gradually set up in separate stages. The individual sub-processes that are often occurring in activated sludge tanks are initially...... hydrofoil shaped propellers. These two sub-processes deliver the main part of the supplied energy to the activated sludge tank, and for this reason they are important for the mixing conditions in the tank. For other important processes occurring in the activated sludge tank, existing models and measurements...

  4. s-process chronometers

    International Nuclear Information System (INIS)

    Beer, H.

    1983-01-01

The radionuclides 40K, 81Kr, 87Rb, 93Zr, 107Pd, 147Sm, 176Lu and 205Pb are built up totally or partially by the s-process. Due to their long half-lives they are potential chronometers for the age and the development of the s-process. The usefulness of the various nuclei is discussed. For the determination of the mean age of s-process synthesis, and with it the age of the galaxy, 176Lu is best suited. It is demonstrated that this age can be calculated solely from measured cross-section and abundance ratios. Various effects which can limit the usefulness of 176Lu as a clock are discussed. (orig.) [de]
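
The decay arithmetic behind such chronometers can be sketched as follows. The numbers are purely illustrative: the 176Lu half-life is roughly 37 Gyr, and a single production event is assumed here, whereas the actual analysis in the abstract uses continuous synthesis and abundance ratios:

```python
import math

def decay_age(n_produced, n_now, half_life):
    """Elapsed time since a single production event, inverting
    N(t) = N0 * exp(-t * ln 2 / t_half)."""
    return half_life / math.log(2) * math.log(n_produced / n_now)

# Illustrative only: if half of an initially produced 176Lu inventory
# survives, exactly one half-life has elapsed.
age = decay_age(n_produced=1.0, n_now=0.5, half_life=37.0)  # Gyr
print(round(age, 6))
```

The hard part of the real chronometry is not this inversion but fixing `n_produced` from s-process systematics, which is where the measured cross-section ratios enter.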

  5. The Integrated Renovation Process

    DEFF Research Database (Denmark)

    Galiotto, Nicolas

and constructivist multiple criteria decision-making analysis method is selected for developing the work further. The method is introduced and applied to the renovation of a multi-residential historic building. Furthermore, a new scheme, the Integrated Renovation Process, is presented. Finally, the methodology is applied to two single-family homes. In practice, such a scheme allowed most informational barriers to sustainable home renovation to be overcome. The homeowners were better integrated and their preferences and immaterial values were better taken into account. They assimilated the multiple benefits...... To keep the decision-making process economically viable and timely, the process still needs to be improved and new tools need to be developed....

  6. Foundations of signal processing

    CERN Document Server

    Vetterli, Martin; Goyal, Vivek K

    2014-01-01

    This comprehensive and engaging textbook introduces the basic principles and techniques of signal processing, from the fundamental ideas of signals and systems theory to real-world applications. Students are introduced to the powerful foundations of modern signal processing, including the basic geometry of Hilbert space, the mathematics of Fourier transforms, and essentials of sampling, interpolation, approximation and compression. The authors discuss real-world issues and hurdles to using these tools, and ways of adapting them to overcome problems of finiteness and localisation, the limitations of uncertainty and computational costs. Standard engineering notation is used throughout, making mathematical examples easy for students to follow, understand and apply. It includes over 150 homework problems and over 180 worked examples, specifically designed to test and expand students' understanding of the fundamentals of signal processing, and is accompanied by extensive online materials designed to aid learning, ...

  7. MATHEMATICAL SIMULATION AND AUTOMATION OF PROCESS ENGINEERING FOR WELDED STRUCTURE PRODUCTION

    Directory of Open Access Journals (Sweden)

    P. V. Zankovets

    2017-01-01

Full Text Available Models and methods for the presentation of a database and knowledge base have been developed on the basis of the composition and structure of data flow in the technological process of welding. The information in the data and knowledge base is presented in the form of a multilevel hierarchical structure and is organized according to its functionality in the form of separate files. Each file contains a great number of tables. Using mathematical simulation and information technologies, an expert system has been developed to support decision-making in the design and process engineering of welded structures. The system makes it possible to carry out a technically substantiated selection of welded and welding materials, types of welded connections, welding methods, and parameters and modes of welding. The developed system improves the quality of the accepted design decisions by reducing manual labour costs for work with normative reference documentation and for the analysis and evaluation of dozens of possible alternatives. The system also reduces labour inputs for testing structures for technological effectiveness, ensures a reduction of materials consumption for welded structures, and guarantees faultless formation of welded connections at this stage.

  8. Process Damping Parameters

    International Nuclear Information System (INIS)

    Turner, Sam

    2011-01-01

The phenomenon of process damping as a stabilising effect in milling has been encountered by machinists since milling and turning began. It is of great importance when milling aerospace alloys where maximum surface speed is limited by excessive tool wear and high speed stability lobes cannot be attained. Much of the established research into regenerative chatter and chatter avoidance has focussed on stability lobe theory with different analytical and time domain models developed to expand on the theory first developed by Tlusty and Tobias. Process damping is a stabilising effect that occurs when the surface speed is low relative to the dominant natural frequency of the system and has been less successfully modelled and understood. Process damping is believed to be influenced by the interference of the relief face of the cutting tool with the waveform traced on the cut surface, with material properties and the relief geometry of the tool believed to be key factors governing performance. This study combines experimental trials with Finite Element (FE) simulation in an attempt to identify and understand the key factors influencing process damping performance in titanium milling. Rake angle, relief angle and chip thickness are the variables considered experimentally with the FE study looking at average radial and tangential forces and surface compressive stress. For the experimental study a technique is developed to identify the critical process damping wavelength as a means of measuring process damping performance. For the range of parameters studied, chip thickness is found to be the dominant factor with maximum stable parameters increased by a factor of 17 in the best case. Within the range studied, relief angle was found to have a lesser effect than expected whilst rake angle had an influence.
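
The surface wavelength underlying process damping is just the surface speed divided by the vibration frequency; a minimal sketch with invented cutter and mode values (the abstract's "critical process damping wavelength" is determined experimentally, not by this formula alone):

```python
import math

def surface_speed_mm_s(diameter_mm, spindle_rpm):
    """Peripheral cutting speed of a rotating tool, in mm/s."""
    return math.pi * diameter_mm * spindle_rpm / 60.0

def chatter_wavelength_mm(diameter_mm, spindle_rpm, chatter_freq_hz):
    """Wavelength of the vibration trace left on the cut surface:
    lambda = surface speed / chatter frequency."""
    return surface_speed_mm_s(diameter_mm, spindle_rpm) / chatter_freq_hz

# Invented values: a 16 mm cutter at 1000 rpm against a 600 Hz dominant mode
lam_mm = chatter_wavelength_mm(16.0, 1000.0, 600.0)
print(round(lam_mm, 3))  # 1.396
```

Lowering the spindle speed shortens this wavelength, increasing the interference between the tool's relief face and the undulated surface, which is the mechanism the abstract credits for the stabilising effect.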

  9. NTP comparison process

    Science.gov (United States)

    Corban, Robert

The systems engineering process for the concept definition phase of the program involves requirements definition, system definition, and consistent concept definition. The requirements definition process involves obtaining a complete understanding of the system requirements based on customer needs, mission scenarios, and nuclear thermal propulsion (NTP) operating characteristics. A system functional analysis is performed to provide a comprehensive traceability and verification of top-level requirements down to detailed system specifications and provides significant insight into the measures of system effectiveness to be utilized in system evaluation. The second key element in the process is the definition of system concepts to meet the requirements. This part of the process involves engine system and reactor contractor teams to develop alternative NTP system concepts that can be evaluated against specific attributes, as well as a reference configuration against which to compare system benefits and merits. Quality function deployment (QFD), as an excellent tool within Total Quality Management (TQM) techniques, can provide the required structure and provide a link to the voice of the customer in establishing critical system qualities and their relationships. The third element of the process is the consistent performance comparison. The comparison process involves validating developed concept data and quantifying system merits through analysis, computer modeling, simulation, and rapid prototyping of the proposed high risk NTP subsystems. The maximum amount possible of quantitative data will be developed and/or validated to be utilized in the QFD evaluation matrix. If, upon evaluation, a new concept or its associated subsystems are determined to have substantial merit, those features will be incorporated into the reference configuration for subsequent system definition and comparison efforts.
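
The QFD-style comparison against a reference configuration can be sketched as a weighted evaluation matrix. The attributes, weights, and concept scores below are invented placeholders, not NTP data:

```python
def qfd_score(weights, scores):
    """Weighted-sum rating of one concept against customer attributes."""
    return sum(w * s for w, s in zip(weights, scores))

weights = [0.5, 0.3, 0.2]            # attribute importance (sums to 1)
concepts = {
    "reference": [3, 3, 3],          # baseline configuration
    "concept_a": [4, 2, 5],
    "concept_b": [2, 5, 3],
}
ranked = sorted(concepts, key=lambda c: qfd_score(weights, concepts[c]),
                reverse=True)
print(ranked)  # concepts ordered by weighted merit, best first
```

A concept that outranks the reference here corresponds to the abstract's "substantial merit" case, whose winning features would then be folded back into the reference configuration.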

  10. Fundamentals of business process management

    NARCIS (Netherlands)

    Dumas, Marlon; La Rosa, Marcello; Mendling, Jan; Reijers, Hajo A.

    2018-01-01

    This textbook covers the entire Business Process Management (BPM) lifecycle, from process identification to process monitoring, covering along the way process modelling, analysis, redesign and automation. Concepts, methods and tools from business management, computer science and industrial

  11. Thermal stir welding process

    Science.gov (United States)

    Ding, R. Jeffrey (Inventor)

    2012-01-01

    A welding method is provided for forming a weld joint between first and second elements of a workpiece. The method includes heating the first and second elements to form an interface of material in a plasticized or melted state between the elements. The interface material is then allowed to cool to a plasticized state if previously in a melted state. The interface material, while in the plasticized state, is then mixed, for example, using a grinding/extruding process, to remove any dendritic-type weld microstructures introduced into the interface material during the heating process.

  12. A grieving process illustrated?

    Science.gov (United States)

    Hutchinson, Rory

    2018-03-01

    The sudden death of Pablo Picasso's closest friend Carlos Casagemas in 1901 came as a great shock to the young Picasso. From a young age, Picasso had ruminated on life and death; however, this was his first experience of bereavement. Following the death of Casagemas, Picasso's paintings can be seen as a diary of his grieving process and clearly illustrate the five stages of the grieving process as outlined by Kubler-Ross in 'On Death and Dying' (1969).

  13. Actinide metal processing

    International Nuclear Information System (INIS)

    Sauer, N.N.; Watkin, J.G.

    1992-01-01

    A process for converting an actinide metal such as thorium, uranium, or plutonium to an actinide oxide material by admixing the actinide metal in an aqueous medium with a hypochlorite as an oxidizing agent for sufficient time to form the actinide oxide material and recovering the actinide oxide material is described together with a low temperature process for preparing an actinide oxide nitrate such as uranyl nitrate. Additionally, a composition of matter comprising the reaction product of uranium metal and sodium hypochlorite is provided, the reaction product being an essentially insoluble uranium oxide material suitable for disposal or long term storage

  14. Biomedical Image Processing

    CERN Document Server

    Deserno, Thomas Martin

    2011-01-01

    In modern medicine, imaging is the most effective tool for diagnostics, treatment planning and therapy. Almost all modalities have moved to directly digital acquisition techniques, and processing of this image data has become an important option for health care in the future. This book is written by a team of internationally recognized experts from all over the world. It provides a brief but complete overview of medical image processing and analysis, highlighting recent advances made in academia. Color figures are used extensively to illustrate the methods and help the reader to understand the complex topics.

  15. Radiation processed polysaccharide products

    International Nuclear Information System (INIS)

    Nguyen, Quoc Hien

    2007-01-01

    Radiation crosslinking, degradation and grafting techniques for modification of polymeric materials including natural polysaccharides have been providing many unique products. In this communication, typical products from radiation processed polysaccharides particularly plant growth promoter from alginate, plant protector and elicitor from chitosan, super water absorbent containing starch, hydrogel sheet containing carrageenan/CM-chitosan as burn wound dressing, metal ion adsorbent from partially deacetylated chitin were described. The procedures for producing those above products were also outlined. Future development works on radiation processing of polysaccharides were briefly presented. (author)

  16. Symbolic signal processing

    International Nuclear Information System (INIS)

    Rechester, A.B.; White, R.B.

    1993-01-01

    Complex dynamic processes exhibit many complicated patterns of evolution. How can all these patterns be recognized using only output (observational, experimental) data, without prior knowledge of the equations of motion? A powerful method for doing this is based on symbolic dynamics: (1) Present the output data in symbolic form (a trial language). (2) Construct topological and metric entropies. (3) Develop algorithms for computer optimization of the entropies. (4) By maximizing the entropies, find the most appropriate symbolic language for the purpose of pattern recognition. (5) Test the method using a variety of dynamical models from nonlinear science. The authors are in the process of applying this method to the analysis of MHD fluctuations in tokamaks
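
    Steps (1) and (2) of the symbolic-dynamics recipe can be sketched for a finite symbol sequence. The thresholding rule, block length, and sample signal below are illustrative assumptions, not the authors' implementation.

```python
from collections import Counter
from math import log

def block_counts(symbols, n):
    """Count distinct length-n blocks (words) in a symbol sequence."""
    return Counter(tuple(symbols[i:i + n]) for i in range(len(symbols) - n + 1))

def topological_entropy(symbols, n):
    """log of the number of distinct length-n blocks, per symbol."""
    return log(len(block_counts(symbols, n))) / n

def metric_entropy(symbols, n):
    """Shannon entropy of the empirical length-n block distribution, per symbol."""
    counts = block_counts(symbols, n)
    total = sum(counts.values())
    return -sum((c / total) * log(c / total) for c in counts.values()) / n

# Step (1): symbolize an output signal by thresholding (a trivial trial language).
signal = [0.1, 0.9, 0.2, 0.8, 0.7, 0.1, 0.6, 0.3]
symbols = [1 if x > 0.5 else 0 for x in signal]

# Step (2): construct the entropies for block length 2.
print(topological_entropy(symbols, 2), metric_entropy(symbols, 2))
```

    Steps (3)-(4) would then search over symbolization rules (e.g. threshold placement, alphabet size) to maximize these entropies.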

  17. Reversible brazing process

    Science.gov (United States)

    Pierce, Jim D.; Stephens, John J.; Walker, Charles A.

    1999-01-01

    A method of reversibly brazing surfaces together. An interface is affixed to each surface. The interfaces can be affixed by processes such as mechanical joining, welding, or brazing. The two interfaces are then brazed together using a brazing process that does not defeat the surface-to-interface joint. Interfaces of materials such as Ni-200 can be affixed to metallic surfaces by welding or by brazing with a first braze alloy. The Ni-200 interfaces can then be brazed together using a second braze alloy. The second braze alloy can be chosen so that it minimally alters the properties of the interfaces, allowing multiple braze, heat-and-disassemble, and rebraze cycles.

  18. Power plant process computer

    International Nuclear Information System (INIS)

    Koch, R.

    1982-01-01

    The concept of instrumentation and control in nuclear power plants incorporates the use of process computers for tasks which are on-line in respect to real-time requirements but not closed-loop in respect to closed-loop control. The general scope of tasks is: - alarm annunciation on CRT's - data logging - data recording for post trip reviews and plant behaviour analysis - nuclear data computation - graphic displays. Process computers are used additionally for dedicated tasks such as the aeroball measuring system, the turbine stress evaluator. Further applications are personal dose supervision and access monitoring. (orig.)

  19. Lasers in materials processing

    International Nuclear Information System (INIS)

    Davis, J.I.; Rockower, E.B.

    1981-01-01

    A status report on the uranium Laser Isotope Separation (LIS) Program at the Lawrence Livermore National Laboratory is presented. Before the status report, a process economic analysis is presented to show how the unique properties of laser photons can best be utilized in the production of materials and components despite the high cost of laser energy. The characteristics of potential applications that are necessary for success are identified, and the factors that have so far frustrated attempts to find commercially viable laser-induced chemical and physical processes for the production of new or existing materials are pointed out

  20. Signal processing in microdosimetry

    International Nuclear Information System (INIS)

    Arbel, A.

    1984-01-01

    Signals occurring in microdosimetric measurements cover a dynamic range of 100 dB at a counting rate which normally stays below 10^4 but could increase significantly in case of an accident. The need for high resolution at low energies, non-linear signal processing to accommodate the specified dynamic range, easy calibration and thermal stability are conflicting requirements which pose formidable design problems. These problems are reviewed, and a practical approach to their solution is given employing a single processing channel. (author)

  1. An Integrated Design Process

    DEFF Research Database (Denmark)

    Petersen, Mads Dines; Knudstrup, Mary-Ann

    2010-01-01

    Present paper is placed in the discussion about how sustainable measures are integrated in the design process by architectural offices. It presents results from interviews with four leading Danish architectural offices working with sustainable architecture and their experiences with it, as well as the requirements they meet in terms of how to approach the design process – especially focused on the early stages like a competition. The interviews focus on their experiences with working in multidisciplinary teams and using digital tools to support their work with sustainable issues. The interviews show ... the environmental measures cannot be discarded due to extra costs.

  3. PROCESS OF RECOVERING URANIUM

    Science.gov (United States)

    Carter, J.M.; Larson, C.E.

    1958-10-01

    A process is presented for recovering uranium values from calutron deposits. The process consists in treating such deposits to produce an oxidized acidic solution containing uranium together with the following impurities: Cu, Fe, Cr, Ni, Mn, Zn. The uranium is recovered from such an impurity-bearing solution by adjusting the pH of the solution to the range 1.5 to 3.0 and then treating the solution with hydrogen peroxide. This results in the precipitation of uranium peroxide which is substantially free of the metal impurities in the solution. The peroxide precipitate is then separated from the solution, washed, and calcined to produce uranium trioxide.

  4. Hard exclusive QCD processes

    Energy Technology Data Exchange (ETDEWEB)

    Kugler, W.

    2007-01-15

    Hard exclusive processes in high energy electron proton scattering offer the opportunity to get access to a new generation of parton distributions, the so-called generalized parton distributions (GPDs). These functions provide more detailed information about the structure of the nucleon than the usual PDFs obtained from DIS. In this work we present a detailed analysis of exclusive processes, especially of hard exclusive meson production. We investigated the influence of exclusively produced mesons on the semi-inclusive production of mesons at fixed target experiments like HERMES. Furthermore, we give a detailed analysis of higher order corrections (NLO) for the exclusive production of mesons in a very broad range of kinematics. (orig.)

  5. The process of entrepreneurship:

    DEFF Research Database (Denmark)

    Neergaard, Helle

    2003-01-01

    Growing a technology-based new venture is a complex process because these ventures are embedded in turbulent environments that require fast organisational and managerial transformation. This chapter addresses the evolutionary process of such ventures. It seeks to provide insight into the link between organisational growth and managerial role transformation in technology-based new ventures. The chapter begins by reviewing existing literature on organisational growth patterns and establishing a link to managerial roles in order to elucidate the basic premises of the study. The chapter ... for understanding the link between organisational growth and managerial role transformation.

  6. Process for purifying graphite

    International Nuclear Information System (INIS)

    Clausius, R.A.

    1985-01-01

    A process for purifying graphite comprising: comminuting graphite containing mineral matter to liberate at least a portion of the graphite particles from the mineral matter; mixing the comminuted graphite particles containing mineral matter with water and hydrocarbon oil to form a fluid slurry; separating a water phase containing mineral matter and a hydrocarbon oil phase containing graphite particles; and separating the graphite particles from the hydrocarbon oil to obtain graphite particles reduced in mineral matter. Depending upon the purity of the graphite desired, steps of the process can be repeated one or more times to provide a progressively purer graphite

  7. Stochastic conditional intensity processes

    DEFF Research Database (Denmark)

    Bauwens, Luc; Hautsch, Nikolaus

    2006-01-01

    In this article, we introduce the so-called stochastic conditional intensity (SCI) model by extending Russell’s (1999) autoregressive conditional intensity (ACI) model by a latent common dynamic factor that jointly drives the individual intensity components. We show by simulations that the proposed model allows for a wide range of (cross-)autocorrelation structures in multivariate point processes. The model is estimated by simulated maximum likelihood (SML) using the efficient importance sampling (EIS) technique. By modeling price intensities based on NYSE trading, we provide significant evidence for a joint latent factor and show that its inclusion allows for an improved and more parsimonious specification of the multivariate intensity process...
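
    The ACI-style intensity recursion that this model class builds on can be sketched in a heavily simplified univariate form: the intensity is constant between events, and its log follows an autoregression driven by a mean-zero innovation. The recursion and all parameter values below are illustrative assumptions, not the bivariate SCI specification from the paper.

```python
import random
from math import exp

def simulate_aci(n, omega=0.1, alpha=0.2, beta=0.7, seed=42):
    """Simulate event times from a simplified, univariate ACI-type recursion.

    Between events the intensity is exp(phi_i); the integrated intensity over
    a spell is unit exponential, so eps = lam * d - 1 is a mean-zero
    innovation. This is an illustrative reduction, not the SCI model itself.
    """
    rng = random.Random(seed)
    phi, t, times = 0.0, 0.0, []
    for _ in range(n):
        lam = exp(phi)                # conditional intensity for this spell
        d = rng.expovariate(lam)      # duration until the next event
        t += d
        times.append(t)
        eps = lam * d - 1.0           # standardized innovation
        phi = omega + alpha * eps + beta * phi
    return times

events = simulate_aci(1000)
print(len(events), events[-1])
```

    The latent-factor extension in the paper would add a common stochastic component to each phi recursion, which is what makes simulation-based (SML/EIS) estimation necessary.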

  8. Process energy reduction

    International Nuclear Information System (INIS)

    Lowthian, W.E.

    1993-01-01

    Process Energy Reduction (PER) is a demand-side energy reduction approach which complements and often supplants other traditional energy reduction methods such as conservation and heat recovery. Because the application of PER is less obvious than the traditional methods, it takes some time to learn the steps as well as practice to become proficient in its use. However, the benefit is significant, often far outweighing the traditional energy reduction approaches. Furthermore, the method usually results in a better process having less waste and pollution along with improved yields, increased capacity, and lower operating costs

  9. Conceptualizing operations strategy processes

    DEFF Research Database (Denmark)

    Rytter, Niels Gorm; Boer, Harry; Koch, Christian

    2007-01-01

    Purpose - The purpose of this paper is to present insights into operations strategy (OS) in practice. It outlines a conceptualization and model of OS processes and, based on findings from an in-depth and longitudinal case study, contributes to further development of extant OS models and methods......; taking place in five dimensions of change - technical-rational, cultural, political, project management, and facilitation; and typically unfolding as a sequential and parallel, ordered and disordered, planned and emergent as well as top-down and bottom-up process. The proposed OS conceptualization...

  10. Advanced Polymer Processing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Muenchausen, Ross E. [Los Alamos National Laboratory

    2012-07-25

    Some conclusions of this presentation are: (1) Radiation-assisted nanotechnology applications will continue to grow; (2) The APPF will provide a unique focus for radiolytic processing of nanomaterials in support of DOE-DP, other DOE and advanced manufacturing initiatives; (3) γ, X-ray, e-beam and ion beam processing will increasingly be applied for 'green' manufacturing of nanomaterials and nanocomposites; and (4) Biomedical science and engineering may ultimately be the biggest application area for radiation-assisted nanotechnology development.

  11. Process of pyrogenation, etc

    Energy Technology Data Exchange (ETDEWEB)

    Pascal, P V.H.

    1949-03-31

    The invention has for its object improvements relating to processes of pyrogenation of carbonaceous materials in the presence of hydrocarbon wetting agents, notably to those processes for solid fossil combustibles, which improvements consist, chiefly, in adding to the wetting agent and/or carbonaceous materials to be heated, a catalyst constituted by at least one mineral material derived from a polyvalent element such as vanadium, molybdenum, iron, manganese, nickel, cobalt, tin etc, or by a derivative of such an element, the catalyst being added in the proportions of some ten thousandths to some hundredths by weight, for example up to five hundredths.

  12. Research Planning Process

    Science.gov (United States)

    Lofton, Rodney

    2010-01-01

    This presentation describes the process used to collect, review, integrate, and assess research requirements desired to be a part of research and payload activities conducted on the ISS. The presentation provides a description of: where the requirements originate, to whom they are submitted, how they are integrated into a requirements plan, and how that integrated plan is formulated and approved. It is hoped that from completing the review of this presentation, one will get an understanding of the planning process that formulates payload requirements into an integrated plan used for specifying research activities to take place on the ISS.

  13. Normal modified stable processes

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Shephard, N.

    2002-01-01

    This paper discusses two classes of distributions, and stochastic processes derived from them: modified stable (MS) laws and normal modified stable (NMS) laws. This extends corresponding results for the generalised inverse Gaussian (GIG) and generalised hyperbolic (GH) or normal generalised inverse Gaussian (NGIG) laws. The wider framework thus established provides, in particular, for added flexibility in the modelling of the dynamics of financial time series, of importance especially as regards OU based stochastic volatility models for equities. In the special case of the tempered stable OU process...

  14. Data processing on FPGAs

    CERN Document Server

    Teubner, Jens

    2013-01-01

    Roughly a decade ago, power consumption and heat dissipation concerns forced the semiconductor industry to radically change its course, shifting from sequential to parallel computing. Unfortunately, improving performance of applications has now become much more difficult than in the good old days of frequency scaling. This is also affecting databases and data processing applications in general, and has led to the popularity of so-called data appliances-specialized data processing engines, where software and hardware are sold together in a closed box. Field-programmable gate arrays (FPGAs) incr

  15. Radioactive waste processing container

    International Nuclear Information System (INIS)

    Ishizaki, Kanjiro; Koyanagi, Naoaki; Sakamoto, Hiroyuki; Uchida, Ikuo.

    1992-01-01

    A radioactive waste processing container used for processing radioactive wastes into solidification products suitable for disposal, such as underground burying or ocean discarding, is constituted by using cements. As the cements, for instance, calcium sulfoaluminate clinker mainly comprising the calcium sulfoaluminate compound 3CaO·3Al₂O₃·CaSO₄, Portland cement and aqueous blast furnace slag is used. Calcium hydroxide formed from the Portland cement is consumed by hydration of the calcium sulfoaluminate clinker. Accordingly, calcium hydroxide is substantially eliminated in the cement constituent layer of the container. With such a constitution, damage such as cracking and peeling is less likely, improving durability and safety. (I.N.)

  16. Silicon web process development

    Science.gov (United States)

    Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Skutch, M. E.; Driggers, J. M.; Hopkins, R. H.

    1981-01-01

    The silicon web process takes advantage of natural crystallographic stabilizing forces to grow long, thin single crystal ribbons directly from liquid silicon. The ribbon, or web, is formed by the solidification of a liquid film supported by surface tension between two silicon filaments, called dendrites, which border the edges of the growing strip. The ribbon can be propagated indefinitely by replenishing the liquid silicon as it is transformed to crystal. The dendritic web process has several advantages for achieving low cost, high efficiency solar cells. These advantages are discussed.

  17. The Recruitment Process:

    DEFF Research Database (Denmark)

    Holm, Anna

    The aim of this research was to determine whether the introduction of e-recruitment has an impact on the process and underlying tasks, subtasks and activities of recruitment. Three large organizations with well-established e-recruitment practices were included in the study. The three case studies, which were carried out in Denmark in 2008-2009 using qualitative research methods, revealed changes in the sequence, divisibility and repetitiveness of a number of recruitment tasks and subtasks. The new recruitment process design was identified and presented in the paper. The study concluded...

  18. Validation Process Methods

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, John E. [National Renewable Energy Lab. (NREL), Golden, CO (United States); English, Christine M. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Gesick, Joshua C. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mukkamala, Saikrishna [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2018-01-04

    This report documents the validation process as applied to projects awarded through Funding Opportunity Announcements (FOAs) within the U.S. Department of Energy Bioenergy Technologies Office (DOE-BETO). It describes the procedures used to protect and verify project data, as well as the systematic framework used to evaluate and track performance metrics throughout the life of the project. This report also describes the procedures used to validate the proposed process design, cost data, analysis methodologies, and supporting documentation provided by the recipients.

  19. Genomic signal processing

    CERN Document Server

    Shmulevich, Ilya

    2007-01-01

    Genomic signal processing (GSP) can be defined as the analysis, processing, and use of genomic signals to gain biological knowledge, and the translation of that knowledge into systems-based applications that can be used to diagnose and treat genetic diseases. Situated at the crossroads of engineering, biology, mathematics, statistics, and computer science, GSP requires the development of both nonlinear dynamical models that adequately represent genomic regulation, and diagnostic and therapeutic tools based on these models. This book facilitates these developments by providing rigorous mathema

  20. Water And Waste Water Processing

    International Nuclear Information System (INIS)

    Yang, Byeong Ju

    1988-04-01

    This book presents the distribution diagram of water and waste water processing, covering devices for water processing and for waste water processing; properties of water quality, such as measurement of waste water pollution, theoretical oxygen demand, and chemical oxygen demand; processing rates, such as zero-order reactions and enzyme reactions; physical processing of water and waste water; chemical processing of water and waste water, such as neutralization and buffering effects; biological processing of waste water; ammonia removal; and sludge processing.
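
    The two reaction-rate regimes mentioned (zero-order reactions and enzyme reactions) can be illustrated with the standard rate laws; the rate constants below are arbitrary example values, not figures from the book.

```python
def zero_order_rate(k):
    """Zero-order kinetics: rate is independent of substrate concentration."""
    return k

def michaelis_menten_rate(s, v_max, k_m):
    """Enzyme (Michaelis-Menten) kinetics: rate saturates at v_max."""
    return v_max * s / (k_m + s)

# At high substrate concentration the enzyme rate approaches v_max,
# i.e. it behaves approximately zero-order:
for s in (0.1, 1.0, 100.0):
    print(s, michaelis_menten_rate(s, v_max=2.0, k_m=1.0))
```

    This saturation behaviour is why heavily loaded biological treatment stages are often modelled as approximately zero-order.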

  1. Modeling of biopharmaceutical processes. Part 2: Process chromatography unit operation

    DEFF Research Database (Denmark)

    Kaltenbrunner, Oliver; McCue, Justin; Engel, Philip

    2008-01-01

    Process modeling can be a useful tool to aid in process development, process optimization, and process scale-up. When modeling a chromatography process, one must first select the appropriate models that describe the mass transfer and adsorption that occurs within the porous adsorbent. The theoret...

  2. Automated interactive sales processes

    NARCIS (Netherlands)

    T.B. Klos (Tomas); D.J.A. Somefun (Koye); J.A. La Poutré (Han)

    2010-01-01

    htmlabstractWhen we look at successful sales processes occurring in practice, we find they combine two techniques which have been studied separately in the literature. Recommender systems are used to suggest additional products or accessories to include in the bundle under consideration, and

  3. Process Experimental Pilot Plant

    International Nuclear Information System (INIS)

    Henze, H.

    1986-01-01

    The Process Experimental Pilot Plant (PREPP) at the Idaho National Engineering Laboratory (INEL) was built to convert transuranic contaminated solid waste into a form acceptable for disposal at the Waste Isolation Pilot Plant (WIPP), located near Carlsbad, New Mexico. There are about 2.0 million cubic ft of transuranic waste stored at the Transuranic Storage Area of the INEL's Radioactive Waste Management Complex (RWMC). The Stored Waste Examination Pilot Plant (SWEPP) located at the RWMC will examine this stored transuranic waste to determine if the waste is acceptable for direct shipment to and storage at WIPP, or if it requires shipment to PREPP for processing before shipment to WIPP. The PREPP process shreds the waste, incinerates the shredded waste, and cements (grouts) the shredded incinerated waste in new 55-gal drums. Unshreddable items are repackaged and returned to SWEPP. The process off-gas is cleaned prior to its discharge to the atmosphere, and complies with the effluent standards of the State of Idaho, EPA, and DOE. Waste liquid generated is used in the grouting operation

  4. Pervaporation process and assembly

    Science.gov (United States)

    Wynn, Nicholas P.; Huang, Yu; Aldajani, Tiem; Fulton, Donald A.

    2010-07-20

    The invention is a pervaporation process and pervaporation equipment, using a series of membrane modules, and including inter-module reheating of the feed solution under treatment. The inter-module heating is achieved within the tube or vessel in which the modules are housed, thereby avoiding the need to repeatedly extract the feed solution from the membrane module train.

  5. Behavioural Hybrid Process Calculus

    NARCIS (Netherlands)

    Brinksma, Hendrik; Krilavicius, T.

    2005-01-01

    Process algebra is a theoretical framework for the modelling and analysis of the behaviour of concurrent discrete event systems that has been developed within computer science in the past quarter century. It has generated a deeper understanding of the nature of concepts such as observable behaviour in

  6. Students' Differentiated Translation Processes

    Science.gov (United States)

    Bossé, Michael J.; Adu-Gyamfi, Kwaku; Chandler, Kayla

    2014-01-01

    Understanding how students translate between mathematical representations is of both practical and theoretical importance. This study examined students' processes in their generation of symbolic and graphic representations of given polynomial functions. The purpose was to investigate how students perform these translations. The result of the study…

  7. Cauchy cluster process

    DEFF Research Database (Denmark)

    Ghorbani, Mohammad

    2013-01-01

    In this paper we introduce an instance of the well-known Neyman–Scott cluster process model with clusters having a long-tail behaviour. In our model the offspring points are distributed around the parent points according to a circular Cauchy distribution. Using a modified Cramér–von Mises test...
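
    A minimal simulation sketch of this model class: Poisson parents on a window, each with a Poisson number of offspring displaced by a heavy-tailed isotropic jump. As a stand-in for the exact circular Cauchy law, the sketch draws a half-Cauchy radius with a uniform direction; parameter names and values are illustrative, not from the paper.

```python
import random
from math import cos, sin, pi, tan

def poisson(rng, lam):
    """Draw a Poisson(lam) variate by counting unit-exponential arrivals."""
    n, t = 0, rng.expovariate(1.0)
    while t < lam:
        n += 1
        t += rng.expovariate(1.0)
    return n

def simulate_cauchy_cluster(kappa, mu, scale, window=1.0, seed=7):
    """Simulate a planar Neyman-Scott process with Cauchy-type offspring.

    Parents form a Poisson process with intensity kappa on the window;
    each parent gets Poisson(mu) offspring displaced by a half-Cauchy
    radius in a uniform direction (an illustrative approximation to the
    circular Cauchy displacement law).
    """
    rng = random.Random(seed)
    points = []
    for _ in range(poisson(rng, kappa * window * window)):
        px, py = rng.random() * window, rng.random() * window
        for _ in range(poisson(rng, mu)):
            r = scale * abs(tan(pi * (rng.random() - 0.5)))  # half-Cauchy radius
            theta = rng.random() * 2 * pi                    # uniform direction
            points.append((px + r * cos(theta), py + r * sin(theta)))
    return points

pts = simulate_cauchy_cluster(kappa=10, mu=5, scale=0.02)
print(len(pts))
```

    The heavy (Cauchy) tail of the displacement means occasional offspring land far from their parent, which is exactly the long-tail cluster behaviour the abstract describes.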

  8. Business process simulation revisited

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Barjis, J.

    2010-01-01

    Computer simulation attempts to "mimic" real-life or hypothetical behavior on a computer to see how processes or systems can be improved and to predict their performance under different circumstances. Simulation has been successfully applied in many disciplines and is considered to be a relevant and

  9. Processes of Similarity Judgment

    Science.gov (United States)

    Larkey, Levi B.; Markman, Arthur B.

    2005-01-01

    Similarity underlies fundamental cognitive capabilities such as memory, categorization, decision making, problem solving, and reasoning. Although recent approaches to similarity appreciate the structure of mental representations, they differ in the processes posited to operate over these representations. We present an experiment that…

  10. Reforming the Interagency Process

    National Research Council Canada - National Science Library

    Uchida, Ted T

    2005-01-01

    .... The result is a process that continues to lack the ability to clarify objectives, chains of command, and policy implementation plans. Insights from organizational behavior theory reveal that some of the IAP's sub-optimal performance and irrational behavior are rooted in bureaucratic bargaining and decisions.

  11. Matchmaking for business processes

    NARCIS (Netherlands)

    Wombacher, Andreas; Fankhauser, Peter; Mahleko, Bendick; Neuhold, Erich

    2003-01-01

    Web services have a potential to enhance B2B ecommerce over the Internet by allowing companies and organizations to publish their business processes on service directories where potential trading partners can find them. This can give rise to new business paradigms based on ad-hoc trading relations

  12. Fusion reactor fuel processing

    International Nuclear Information System (INIS)

    Johnson, E.F.

    1972-06-01

    For thermonuclear power reactors based on the continuous fusion of deuterium and tritium the principal fuel processing problems occur in maintaining desired compositions in the primary fuel cycled through the reactor, in the recovery of tritium bred in the blanket surrounding the reactor, and in the prevention of tritium loss to the environment. Since all fuel recycled through the reactor must be cooled to cryogenic conditions for reinjection into the reactor, cryogenic fractional distillation is a likely process for controlling the primary fuel stream composition. Another practical possibility is the permeation of the hydrogen isotopes through thin metal membranes. The removal of tritium from the ash discharged from the power system would be accomplished by chemical procedures to assure physiologically safe concentration levels. The recovery process for tritium from the breeder blanket depends on the nature of the blanket fluids. For molten lithium the only practicable possibility appears to be permeation from the liquid phase. For molten salts the process would involve stripping with inert gas followed by chemical recovery. In either case extremely low concentrations of tritium in the melts would be desirable to maintain low tritium inventories, and to minimize escape of tritium through unwanted permeation, and to avoid embrittlement of metal walls. 21 refs

  13. Process design and redesign

    NARCIS (Netherlands)

    Reijers, H.A.; Dumas, M.; Aalst, van der W.M.P.; Hofstede, ter A.H.M.

    2005-01-01

    This chapter aims to provide concrete guidance in redesigning business processes. Two alternative methods are described, both of them suitable to boost business performance. The first one is based on a collection of best practices, as applied in various redesign projects. These best practices all

  14. Microbiological metal extraction processes

    International Nuclear Information System (INIS)

    Torma, A.E.

    1991-01-01

    Application of biotechnological principles in mineral processing, especially in hydrometallurgy, has created new opportunities and challenges for these industries. During the 1950s and 1960s, mining wastes and unused complex mineral resources were successfully treated in bacterially assisted heap and dump leaching processes for copper and uranium. The interest in bioleaching processes is a consequence of the economic advantages associated with these techniques. For example, copper can be produced from mining wastes for about 1/3 to 1/2 of the cost of copper production by the conventional smelting process from high-grade sulfide concentrates. The economic viability of bioleaching technology led to its worldwide acceptance by the extractive industries. During the 1970s this technology grew into a more structured discipline called 'biohydrometallurgy'. Currently, bioleaching techniques are ready to be used, in addition to copper and uranium, for the extraction of cobalt, nickel, zinc and precious metals, and for the desulfurization of high-sulfur pyritic coals. As a developing technology, the microbiological leaching of the less common and rare metals has yet to reach commercial maturity; however, research in this area is very active. In addition, in the foreseeable future, biotechnological methods may also be applied to the treatment of high-grade ores and mineral concentrates using adapted native and/or genetically engineered microorganisms. (author)
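
    Heap and dump leaching recoveries of the kind mentioned above are often approximated with simple first-order kinetics, X(t) = 1 - exp(-kt). This is a hedged illustration of that common approximation, not a model from the record; the rate constant is a made-up placeholder:

    ```python
    import math

    def fraction_extracted(k_per_day: float, t_days: float) -> float:
        """First-order approximation of metal recovery in heap/dump
        leaching: X(t) = 1 - exp(-k t).  k is a placeholder rate constant,
        not a value taken from the record."""
        return 1.0 - math.exp(-k_per_day * t_days)

    # e.g. with k = 0.01 /day, roughly 63 % of the copper is leached
    # after 100 days; recovery asymptotically approaches 100 %.
    print(round(fraction_extracted(0.01, 100.0), 3))
    ```

    The slow, asymptotic recovery this model captures is one reason bioleaching suits low-grade wastes, where long residence times are acceptable in exchange for low processing cost.
    
    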

  15. Food processing in action

    Science.gov (United States)

    Radio frequency (RF) heating is a commonly used food processing technology that has been applied for drying and baking as well as thawing of frozen foods. Its use in pasteurization, as well as for sterilization and disinfection of foods, is more limited. This column will review various RF heating ap...

  16. Video processing project

    CSIR Research Space (South Africa)

    Globisch, R

    2009-03-01

    Full Text Available Video processing source code for algorithms and tools used in software media pipelines (e.g. image scalers, colour converters, etc.). The currently available source code is written in C++ with its associated libraries and DirectShow filters.

  17. Laser processing of materials

    Indian Academy of Sciences (India)

    The initial foundation of laser theory was laid by Einstein [11]. … general definition and scope of the processes as understood in conventional practice … [54]. Table excerpt: laser welding of Ti-alloys; welding of TiNi shape memory alloys with a CW–CO2 laser (2001) to study corrosion, mechanical and shape memory properties of the weldments.

  18. Restricted broadcast process theory

    NARCIS (Netherlands)

    F. Ghassemi; W.J. Fokkink (Wan); A. Movaghar; A. Cerone; S. Gruner

    2008-01-01

    We present a process algebra for modeling and reasoning about Mobile Ad hoc Networks (MANETs) and their protocols. In our algebra we model the essential modeling concepts of ad hoc networks, i.e. local broadcast, connectivity of nodes and connectivity changes. Connectivity and

  19. A Process for Planning.

    Science.gov (United States)

    Gurowitz, William D.; And Others

    1988-01-01

    Describes how the Division of Campus Life at Cornell University conducted long-range planning and the results of its 2-year effort. Explains a 2 (strategic and organizational) by 3 (diagnosis, formulation, and execution) matrix providing a systems view for describing and evaluating long-range planning. Presents the 10-step process implemented at Cornell. (NB)

  20. Television picture signal processing

    NARCIS (Netherlands)

    1998-01-01

    Field or frame memories are often used in television receivers for video signal processing functions, such as noise reduction and/or flicker reduction. Television receivers also have graphic features such as teletext, menu-driven control systems, multilingual subtitling, an electronic TV-Guide, etc.