WorldWideScience

Sample records for back-end processing system

  1. CO2-laser micromachining and back-end processing for rapid production of PMMA-based microfluidic systems

    DEFF Research Database (Denmark)

    Klank, Henning; Kutter, Jörg Peter; Geschke, Oliver

    2002-01-01

    In this article, we focus on the enormous potential of a CO2-laser system for rapidly producing polymer microfluidic structures. We assessed the dependence of the depth and width of laser-cut channels on the laser beam power and on the number of passes of the beam along the same channel......, a three-layer polymer microstructure with included optical fibers was fabricated within two days. The use of CO2-laser systems to produce microfluidic systems has not been published before. These systems provide a cost-effective alternative to UV-laser systems and they are especially useful....... In the experiments, the laser beam power was varied between 0 and 40 W and the number of passes from 1 to 7. Typical channel depths were between 100 and 300 µm, while the channels were typically 250 µm wide. The narrowest channel produced was 85 µm wide. Several bonding methods...
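
    A minimal sketch of how the reported depth-versus-parameter dependence could be captured numerically; the linear model and the calibration points below are illustrative assumptions, since the abstract reports only measured ranges, not a fitted model.

      # Hypothetical calibration sketch: fit channel depth (um) as a linear
      # function of laser power (W) and number of passes. Data points invented
      # for illustration only.
      import numpy as np

      power  = np.array([10, 20, 30, 40, 10, 20, 30, 40], dtype=float)   # W
      passes = np.array([ 1,  1,  1,  1,  3,  3,  3,  3], dtype=float)
      depth  = np.array([60, 110, 160, 210, 120, 190, 260, 330], dtype=float)  # um

      # Least-squares fit: depth ~ a*power + b*passes + c
      A = np.column_stack([power, passes, np.ones_like(power)])
      coef, *_ = np.linalg.lstsq(A, depth, rcond=None)
      a, b, c = coef
      print(f"depth ~ {a:.1f}*power + {b:.1f}*passes + {c:.1f} (um)")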

  2. System Level Power Optimization of Digital Audio Back End for Hearing Aids

    DEFF Research Database (Denmark)

    Pracny, Peter; Jørgensen, Ivan Harald Holger; Bruun, Erik

    2016-01-01

    the interpolation filter and the SD modulator on the system level so that the switching frequency of the Class D PA - the main power consumer in the back end - is minimized. A figure-of-merit (FOM) which allows judging the power consumption of the digital part of the back end early in the design process is used...... to track the hardware and power demands as the tradeoffs of the system-level parameters are investigated. The result is the digital part of the back end optimized with respect to power, which provides audio performance comparable to the state-of-the-art. A combination of system-level parameters leading...... to the lowest switching frequency of the Class D power amplifier reported in the literature for the SD modulator-based back end is derived using this approach....

  3. Performance and scalability of the back-end sub-system in the ATLAS DAQ/EF prototype

    CERN Document Server

    Alexandrov, I N; Badescu, E; Burckhart, Doris; Caprini, M; Cohen, L; Duval, P Y; Hart, R; Jones, R; Kazarov, A; Kolos, S; Kotov, V; Laugier, D; Mapelli, Livio P; Moneta, L; Qian, Z; Radu, A A; Ribeiro, C A; Roumiantsev, V; Ryabov, Yu; Schweiger, D; Soloviev, I V

    2000-01-01

    The DAQ group of the future ATLAS experiment has developed a prototype system based on the trigger/DAQ architecture described in the ATLAS Technical Proposal to support studies of the full system functionality and architecture, as well as of available hardware and software technologies. One sub-system of this prototype is the back-end, which encompasses the software needed to configure, control and monitor the DAQ, but excludes the processing and transportation of physics data. The back-end consists of a number of components including run control, configuration databases and a message reporting system. The software has been developed using standard, external software technologies such as OO databases and CORBA. It has been ported to several C++ compilers and operating systems including Solaris, Linux, WNT and LynxOS. This paper gives an overview of the back-end software, its performance, scalability and current status. (17 refs).

  4. Front- and back-end process characterization by SIMS to achieve electrically matched devices

    Science.gov (United States)

    Budri, Thanas; Kouzminov, Dimitry

    2004-06-01

    Application of SIMS metrology in high-volume wafer manufacturing allows comparison of important physical characteristics of devices and can address changes in the process during early stages of the process flow, thus improving production yields and cycles. In the current paper, we investigate the correlation between wafer-level SIMS characterization and electrical characteristics of devices in a wide spectrum of front- and back-end applications. High-precision SIMS analysis for implanter recipe development and monitoring is a technique that has provided major contributions to achieving electrically matched devices. SIMS analysis is also widely used for gate material selection and characterization. As SiGe/SiGeC is taking precedence over III-V materials for rf applications due to processing simplicity, the SIMS analytical technique provides major metrology support for process targeting and development. The SIMS analytical technique has earned its reputation and is widely used as a metrology solution in front-end semiconductor processing. Investigation of fluorine by SIMS in TiN and W, and its relation to increased via resistance and voids at nucleation, is an example of SIMS analysis applied to back-end process support.

  5. A hearing aid on-chip system based on accuracy optimized front- and back-end blocks

    Science.gov (United States)

    Fanyang, Li; Hao, Jiang

    2014-03-01

    A hearing aid on-chip system based on accuracy-optimized front- and back-end blocks is presented for enhancing the signal processing accuracy of the hearing aid. Compared with the conventional system, the accuracy-optimized system is characterized by a dual feedback network and a gain compensation technique used in the front- and back-end blocks, respectively, so as to alleviate the nonlinearity distortion caused by the output swing. By using these techniques, the accuracy of the whole hearing aid system can be significantly improved. The prototype chip has been designed in a 0.13 μm standard CMOS process and tested with a 1 V supply voltage. The measurement results show that, for driving a 16 Ω loudspeaker with a normalized output level of 300 mVp-p, the total harmonic distortion reaches about -60 dB, achieving at least a threefold reduction compared to previously reported works. In addition, the typical input-referred noise is only about 5 μVrms.
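
    A short back-of-the-envelope check of the reported output conditions; the conversion formulas are standard, and only the figures quoted in the abstract (300 mVp-p into a 16 Ω load, THD of about -60 dB) are used, assuming a sinusoidal output.

      import math

      v_pp    = 0.300    # output swing, V peak-to-peak (from the abstract)
      r_load  = 16.0     # loudspeaker impedance, Ohm
      thd_db  = -60.0    # total harmonic distortion, dB

      v_rms       = v_pp / (2 * math.sqrt(2))        # assuming a sine wave
      p_out_mw    = 1e3 * v_rms ** 2 / r_load        # average power into the load
      thd_percent = 100 * 10 ** (thd_db / 20)        # -60 dB corresponds to 0.1 %

      print(f"Vrms ~ {v_rms*1e3:.0f} mV, Pout ~ {p_out_mw:.2f} mW, THD ~ {thd_percent:.2f} %")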

  6. System aspects on safeguards for the back-end of the Swedish nuclear fuel cycle

    Energy Technology Data Exchange (ETDEWEB)

    Fritzell, Anni (Dept. of Physics and Astronomy, Uppsala Univ., Uppsala (Sweden))

    2008-03-15

    This thesis has investigated system aspects of safeguarding the back-end of the Swedish nuclear fuel cycle. These aspects include the important notion of continuity of knowledge, the philosophy of verifying measurements and the need to consider the safeguards system as a whole when expanding it to include the encapsulation facility and the geological repository. The research has been analytical in method, both in the identification of concrete challenges for the safeguards community in Paper 1 and in the diversion path analysis performed in Paper 2. This way of working is beneficial, for example, when abstract notions are treated. However, as a suggestion for further work along these lines, a formal systems analysis would be advantageous, and may even reveal properties of the safeguards system that the human mind has so far been too narrow to consider. A systems analysis could be used to model a proposed safeguards approach with the purpose of finding vulnerabilities in its detection probabilities. From the results, capabilities needed to overcome these vulnerabilities could be deduced, thereby formulating formal boundary conditions. These could include: the necessary partial defect level for the NDA measurement; the level of redundancy required in the C/S system to minimize the risk of inconclusive results due to equipment failure; and requirements on the capabilities of seismic methods, etc. The field of vulnerability assessment as a tool for systems analysis should be of interest to the safeguards community, as a formal approach could give a new dimension to the credibility of safeguards systems.

  7. Solar dynamics imaging system a back-end instrument for the proposed NLST

    Science.gov (United States)

    Ramesh, K. B.; Vasantharaju, N.; Hemanth, P.; Reardon, K.

    2016-12-01

    The Solar Dynamics Imaging System (SDIS) will be one of the focal plane instruments operated at the National Large Solar Telescope (NLST). The prime objective of the instrument is to obtain high spatial and temporal resolution images of the region of interest on the Sun in the wavelength range from 390 nm to 900 nm. The SDIS provides filtergrams using broad-band filters while preserving the Strehl ratio provided by the telescope. Furthermore, the SDIS is expected to provide observations that allow image reconstruction to extract wavefront information and achieve a homogeneous image quality over the entire FOV.

  8. 40 CFR 63.500 - Back-end process provisions-carbon disulfide limitations for styrene butadiene rubber by emulsion...

    Science.gov (United States)

    2010-07-01

    ... disulfide limitations for styrene butadiene rubber by emulsion processes. 63.500 Section 63.500 Protection... limitations for styrene butadiene rubber by emulsion processes. (a) Owners or operators of sources subject to this subpart producing styrene butadiene rubber using an emulsion process shall operate the...

  9. 40 CFR 63.497 - Back-end process provisions-monitoring provisions for control and recovery devices.

    Science.gov (United States)

    2010-07-01

    ... flame is required. (3) Where a boiler or process heater of less than 44 megawatts design heat input... required. Any boiler or process heater in which all vent streams are introduced with primary fuel or are... steam flow, nitrogen flow, or pressure monitoring device having an accuracy of at least ±10 percent...

  10. The Back-end of User Centred Innovation

    DEFF Research Database (Denmark)

    Lassen, Astrid Heidemann

    2015-01-01

    User Centred Innovation (UCI) has during the past decade developed into a widely acknowledged approach to innovation. Yet, in spite of a plethora of methods and tools for conducting UCI, companies continue to struggle to create the desired effect with UCI. In this paper, it is proposed...... that a lack of focus on the back-end of UCI is what is hampering the effect of the approach. Through a single case study it is demonstrated how opposing logics, different foci, and dissimilar working processes create serious challenges in relation to the implementation of UCI. The clarification...

  11. Great Swamp Wilderness Character Monitoring Back-end Database

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This is the back-end data file for the Great Swamp Wilderness Character Monitoring Application. User interface and lookup databases are required for use (see...

  12. Island Bay Wilderness Character Monitoring Back-end Database

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This is the back-end data file for the Island Bay Wilderness Character Monitoring Application. User interface and lookup databases are required for use (see...

  13. Back-end and interface implementation of the STS-XYTER2 prototype ASIC for the CBM experiment

    Science.gov (United States)

    Kasinski, K.; Szczygiel, R.; Zabolotny, W.

    2016-11-01

    Each front-end readout ASIC for High-Energy Physics experiments requires a robust and effective hit data streaming and control mechanism. The new STS-XYTER2 full-size prototype chip for the Silicon Tracking System and Muon Chamber detectors in the Compressed Baryonic Matter experiment at the Facility for Antiproton and Ion Research (FAIR, Germany) is a 128-channel time and amplitude measuring solution for silicon microstrip and gas detectors. It operates at a 250 kHit/s/channel hit rate, each hit producing 27 bits of information (5-bit amplitude, 14-bit timestamp, position and diagnostics data). The chip back-end implements fast front-end channel read-out, timestamp-wise hit sorting, and data streaming via a scalable interface implementing a dedicated protocol (STS-HCTSP) for chip control and hit transfer with a data bandwidth from 9.7 MHit/s up to 47 MHit/s. It also includes multiple options for link diagnostics, failure detection, and throttling features. The back-end is designed to operate with the data acquisition architecture based on the CERN GBTx transceivers. This paper presents the details of the back-end and interface design and its implementation in the UMC 180 nm CMOS process.
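
    A sketch of how a 27-bit hit word with a 5-bit amplitude and a 14-bit timestamp could be packed and unpacked; the width and placement of the remaining position/diagnostics bits are assumptions, since the abstract does not spell out the exact word format.

      # Hypothetical 27-bit hit word layout (MSB -> LSB):
      #   [position/diagnostics : 8][timestamp : 14][amplitude : 5]
      # Only the 5-bit amplitude and 14-bit timestamp widths come from the abstract.

      def pack_hit(position: int, timestamp: int, amplitude: int) -> int:
          assert 0 <= position < (1 << 8)
          assert 0 <= timestamp < (1 << 14)
          assert 0 <= amplitude < (1 << 5)
          return (position << 19) | (timestamp << 5) | amplitude

      def unpack_hit(word: int):
          return (word >> 19) & 0xFF, (word >> 5) & 0x3FFF, word & 0x1F

      word = pack_hit(position=42, timestamp=12345, amplitude=17)
      assert unpack_hit(word) == (42, 12345, 17)

      # Sanity check on the interface bandwidth quoted in the abstract:
      # 128 channels * 250 kHit/s/channel = 32 MHit/s, inside the 9.7-47 MHit/s range.
      print(f"aggregate hit rate: {128 * 250e3 / 1e6:.0f} MHit/s")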

  14. Back-end interconnection. A generic concept for high volume manufacturing

    Energy Technology Data Exchange (ETDEWEB)

    Bosman, J.; Budel, T.; De Kok, C.J.G.M.

    2013-10-15

    The general method to realize series connection in thin-film PV modules is monolithic interconnection through a sequence of laser scribes (P1, P2 and P3) and layer depositions. This method, however, implies that the deposition processes are interrupted several times, an undesirable situation in high-volume processing. In order to eliminate this drawback we focus our developments on the so-called 'back-end interconnection concept', in which series interconnection takes place AFTER the deposition of the functional layers of the thin-film PV device. The process of making a back-end interconnection combines laser scribing, curing, sintering and inkjet processes. These different processes interact with each other and are investigated in order to create robust processing strategies that ensure high-volume production. The generic approach created a technology base that can be applied to any thin-film PV technology.

  15. New Persistent Back-End for the ATLAS Online Information Service

    CERN Document Server

    Soloviev, I; The ATLAS collaboration

    2014-01-01

    The Trigger and Data Acquisition (TDAQ) and detector systems of the ATLAS experiment deploy more than 3000 computers, running more than 15000 concurrent processes, to perform the selection, recording and monitoring of the proton collision data in ATLAS. Most of these processes produce and share operational monitoring data used for inter-process communication and analysis of the systems. Only a few of these data are archived by dedicated applications into conditions and histogram databases; the rest remain transient and are lost at the end of a data-taking session. To save these data for later offline analysis of data-taking quality, and to help experts investigate the behavior of the system, the first prototype of a new Persistent Back-End for the Atlas Information System of TDAQ (P-BEAST) was developed and deployed in the second half of 2012. The modern, distributed, Java-based Cassandra database has been used as the storage technology and the CERN EOS for long-term storage. This paper pr...
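
    A minimal sketch of the kind of key-value time-series write such a back-end might perform; the keyspace, table and column names below are hypothetical and use the standard cassandra-driver Python client, not the actual P-BEAST schema.

      # Hypothetical time-series layout: one row per (attribute, timestamp) pair,
      # partitioned by attribute name. Not the actual P-BEAST data model.
      from datetime import datetime, timezone
      from cassandra.cluster import Cluster

      cluster = Cluster(["127.0.0.1"])          # assumed local Cassandra node
      session = cluster.connect()
      session.execute("""
          CREATE KEYSPACE IF NOT EXISTS monitoring
          WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1}
      """)
      session.execute("""
          CREATE TABLE IF NOT EXISTS monitoring.samples (
              attribute text, ts timestamp, value double,
              PRIMARY KEY (attribute, ts))
      """)
      session.execute(
          "INSERT INTO monitoring.samples (attribute, ts, value) VALUES (%s, %s, %s)",
          ("dcm.event_rate", datetime.now(timezone.utc), 4321.0),
      )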

  16. LFI 30 and 44 GHz receivers Back-End Modules

    Energy Technology Data Exchange (ETDEWEB)

    Artal, E; Aja, B; Fuente, M L de la; Pascual, J P; Mediavilla, A [Dpt. Ingenieria de Comunicaciones, Universidad de Cantabria, Plaza de la Ciencia, 39005 Santander (Spain); Martinez-Gonzalez, E [Instituto de Fisica de Cantabria, CSIC-Universidad de Cantabria, Avda. de los Castros s/n, 39005 Santander (Spain); Pradell, L; Paco, P de [Departament de Teoria del Senyal i Comunicacions, Universitat Politecnica de Catalunya, Barcelona (Spain); Bara, M; Blanco, E; GarcIa, E [Mier Comunicaciones S.A. La Garriga, Barcelona (Spain); Davis, R; Kettle, D; Roddis, N; Wilkinson, A [Jodrell Bank Centre for Astrophysics, University of Manchester, Manchester (United Kingdom); Bersanelli, M; Mennella, A; Tomasi, M [Universita degli studi di Milano, Department of Physics Via Celoria 16, Milano (Italy); Butler, R C; Cuttaia, F, E-mail: artale@unican.e [INAF/IASF - Bologna Via P. Gobetti 101, 40129 Bologna (Italy)

    2009-12-15

    The 30 and 44 GHz Back End Modules (BEM) for the Planck Low Frequency Instrument are broadband receivers (20% relative bandwidth) working at room temperature. The signals coming from the Front End Module are amplified, band pass filtered and finally converted to DC by a detector diode. Each receiver has two identical branches following the differential scheme of the Planck radiometers. The BEM design is based on MMIC Low Noise Amplifiers using GaAs P-HEMT devices, microstrip filters and Schottky diode detectors. Their manufacturing development has included elegant breadboard prototypes and finally qualification and flight model units. Electrical, mechanical and environmental tests were carried out for the characterization and verification of the manufactured BEMs. A description of the 30 and 44 GHz Back End Modules of Planck-LFI radiometers is given, with details of the tests done to determine their electrical and environmental performances. The electrical performances of the 30 and 44 GHz Back End Modules: frequency response, effective bandwidth, equivalent noise temperature, 1/f noise and linearity are presented.

  17. Web News Publishing System Back-end Database Design and Implementation

    Institute of Scientific and Technical Information of China (English)

    隆重

    2011-01-01

    This paper studies the design and implementation of the server-side database of a B/S-based news publishing system. The server side of the news publishing system is responsible for news storage, integration, process control, publishing, etc.

  18. Improvement in Product Development: Use of back-end data to support upstream efforts of Robust Design Methodology

    Directory of Open Access Journals (Sweden)

    Vanajah Siva

    2012-12-01

    In the area of Robust Design Methodology (RDM), little has been done on how to use and work with data from the back-end of the product development process to support upstream improvement. The purpose of this paper is to suggest RDM practices for the use of customer claims data in early design phases as a basis for improvements. The back-end data, when systematically analyzed and fed back into the product development process, aid in closing the product development loop from claims to improvement in the design phase. This is proposed through a flow of claims data analysis tied to an existing tool, namely Failure Mode and Effects Analysis (FMEA). The systematic and integrated analysis of back-end data is suggested as an upstream effort of RDM to increase understanding of noise factors during product usage, based on the feedback of claims data to FMEA, and to address continuous improvement in product development.
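
    A minimal sketch of how claims data could be fed back into an FMEA-style risk ranking. The claim records, ratings and the claim-to-occurrence mapping are illustrative assumptions; the RPN = severity x occurrence x detection product is the standard FMEA formula rather than anything specific to this paper.

      # Toy claims records and an FMEA-style re-ranking based on claim frequency.
      claims = [
          {"failure_mode": "seal leakage", "count": 42},
          {"failure_mode": "connector corrosion", "count": 7},
      ]
      fmea = {  # severity/detection ratings assumed to come from the existing FMEA
          "seal leakage": {"severity": 8, "detection": 5},
          "connector corrosion": {"severity": 6, "detection": 3},
      }

      def occurrence_from_count(count: int) -> int:
          """Map a claim count to a 1-10 occurrence rating (illustrative thresholds)."""
          thresholds = [1, 2, 5, 10, 20, 40, 80, 160, 320]
          return 1 + sum(count >= t for t in thresholds)

      for claim in claims:
          mode = claim["failure_mode"]
          occ = occurrence_from_count(claim["count"])
          rpn = fmea[mode]["severity"] * occ * fmea[mode]["detection"]
          print(f"{mode}: occurrence={occ}, RPN={rpn}")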

  19. Visualization for Hyper-Heuristics: Back-End Processing

    Energy Technology Data Exchange (ETDEWEB)

    Simon, Luke [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-03-01

    Modern society is faced with increasingly complex problems, many of which can be formulated as generate-and-test optimization problems. Yet, general-purpose optimization algorithms may sometimes require too much computational time. In these instances, hyper-heuristics may be used. Hyper-heuristics automate the design of algorithms to create a custom algorithm for a particular scenario, finding the solution significantly faster than its general-purpose predecessor. However, it may be difficult to understand exactly how a design was derived and why it should be trusted. This project aims to address these issues by creating an easy-to-use graphical user interface (GUI) for hyper-heuristics and an easy-to-understand scientific visualization of the produced solutions. To support the development of this GUI, my portion of the research involved developing algorithms that parse the data produced by the hyper-heuristics. These data are then sent to the front-end, where they are displayed to the end user.

  20. Scientific research on the back-end of the fuel cycle for the 21st century; Les recherches scientifiques sur l'aval du cycle pour le 21. siecle

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-07-01

    The aim of the Atalante-2000 conference is to present the major research axes concerning the back-end of the nuclear fuel cycle. The different topics are: - Present options concerning the fuel cycle back-end; - Reprocessing of spent fuel; - Advanced separation for transmutation; - Processing and packaging of radioactive wastes; - Design and fabrication of targets for transmutation; and - Conversion of military plutonium into MOX fuels.

  1. Back End of Line Nanorelays for Ultra-low Power Monolithic Integrated NEMS-CMOS Circuits

    KAUST Repository

    Lechuga Aranda, Jesus Javier

    2016-05-01

    Since the introduction of Complementary-Metal-Oxide-Semiconductor (CMOS) technology, the chip industry has enjoyed many benefits of transistor feature size scaling, including higher speed and device density and improved energy efficiency. However, in recent years, IC designers have encountered a few roadblocks, namely reaching the physical limits of scaling and increased device leakage, which have resulted in a slow-down of supply voltage and power density scaling. Therefore, there has been an extensive hunt for alternative circuit architectures and switching devices that can alleviate or eliminate the current crisis in the semiconductor industry. The Nano-Electro-Mechanical (NEM) relay is a promising alternative switch that offers zero leakage and abrupt turn-on behaviour. Even though these devices are intrinsically slower than CMOS transistors, new circuit design techniques tailored for the electromechanical properties of such devices can be leveraged to design medium-performance, ultra-low-power integrated circuits. In this thesis, we deal with a new generation of such devices that is built in the back end of line (BEOL) CMOS process and is an ideal option for full integration with current CMOS transistor technology. Simulation and verification at the circuit and system level is a critical step in the design flow of microelectronic circuits, and this is especially important for new technologies that lack the standard design infrastructure and well-known verification platforms. Although most of the physical and electrical properties of NEM structures can be simulated using standard electronic design automation software, there is no report of a reliable behavioural model for NEMS switches that enables large circuit simulations. In this work, we present an optimised model of a BEOL nano relay that encompasses all the electromechanical characteristics of the device and is robust and lightweight enough for VLSI applications that require simulation of thousands of
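
    For orientation, a sketch of the textbook parallel-plate pull-in relation often used in first-order electrostatic relay models; the parameter values below are invented, and this is not a reproduction of the thesis's actual behavioural model.

      import math

      EPS0 = 8.854e-12        # vacuum permittivity, F/m

      def pull_in_voltage(k: float, gap: float, area: float) -> float:
          """Classic parallel-plate pull-in voltage: Vpi = sqrt(8*k*g^3 / (27*eps0*A))."""
          return math.sqrt(8 * k * gap ** 3 / (27 * EPS0 * area))

      # Illustrative (not measured) numbers for a small BEOL relay:
      k = 50.0                # spring constant, N/m
      gap = 100e-9            # actuation gap, m
      area = (2e-6) ** 2      # electrode area, m^2

      print(f"pull-in voltage ~ {pull_in_voltage(k, gap, area):.2f} V")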

  2. Design concepts and advanced telerobotics development for facilities in the back end of the nuclear fuel cycle

    Energy Technology Data Exchange (ETDEWEB)

    Feldman, M.J.

    1987-01-01

    In the Fuel Recycle Division at the Oak Ridge National Laboratory, a comprehensive remote systems development program has existed for the past seven years. The new remote technology under development is expected to significantly improve remote operations by extending the range of tasks accomplished by remote means and increasing the efficiency of remote work undertaken. Five areas of the development effort are primary contributors to the goal of higher operating efficiency for major facilities for the back end of the nuclear fuel cycle. These areas are the single-cell concept, the low-flow ventilation concept, television viewing, equipment-mounting racks, and force-reflecting manipulation. These somewhat innovative directions are products of an iterative design process in which the technical scenario to be accomplished, the remote equipment to accomplish the scenario, and the facility design to house the equipment are considered together to optimize performance, maximize long-term cost effectiveness, and minimize initial capital outlay. 14 refs., 3 figs.

  3. The CMS fast beams condition monitor back-end electronics based on MicroTCA technology: status and development

    Science.gov (United States)

    Zagozdzinska, Agnieszka A.; Dabrowski, Anne E.; Pozniak, Krzysztof T.

    2015-09-01

    The Fast Beams Condition Monitor (BCM1F), upgraded for LHC Run II, is used to measure the online luminosity and machine induced background for the CMS experiment. The detector consists of 24 single-crystal CVD diamond sensors that are read out with a custom fast front-end chip fabricated in 130 nm CMOS technology. Since the signals from the sensors are used for real time monitoring of the LHC conditions they are processed by dedicated back-end electronics to measure separately rates corresponding to LHC collision products, machine induced background and residual activation exploiting different arrival times. The system is built in MicroTCA technology and uses high speed analog-to-digital converters. In operational modes of high rates, consecutive events, spaced in time by less than 12.5 ns, may cause partially overlapping events. Hence, novel signal processing techniques are deployed to resolve overlapping peaks. The high accuracy qualification of the signals is crucial to determine the luminosity and the machine induced background rates for the CMS experiment and the LHC.
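
    A highly simplified sketch of peak finding on a digitized trace; it only illustrates the generic idea of locating local maxima above a threshold and is not the dedicated processing used to separate overlapping BCM1F pulses spaced by less than 12.5 ns.

      import numpy as np

      def find_peaks_simple(trace: np.ndarray, threshold: float):
          """Return indices of local maxima above threshold (illustrative only)."""
          above = trace > threshold
          local_max = (trace[1:-1] > trace[:-2]) & (trace[1:-1] >= trace[2:])
          return np.where(above[1:-1] & local_max)[0] + 1

      # Two partially overlapping Gaussian-like pulses on a sampled trace (toy data).
      t = np.arange(0, 100)
      trace = 10 * np.exp(-0.5 * ((t - 40) / 3) ** 2) + 7 * np.exp(-0.5 * ((t - 47) / 3) ** 2)
      print(find_peaks_simple(trace, threshold=3.0))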

  4. Kenai National Wildlife Refuge Wilderness Character Monitoring Back-end Database

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This is the back-end data file for the Kenai Wilderness Character Monitoring Application. User interface and lookup databases are required for use (see reference...

  5. Okefenokee National Wildlife Refuge Wilderness Character Monitoring Back-end Database

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This is the back-end data file for the Okefenokee Wilderness Character Monitoring Application. User interface and lookup databases are required for use (see...

  6. Red Rock Lakes National Wildlife Refuge Wilderness Character Monitoring Back-end Database

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This is the back-end data file for the Red Rock Lakes Wilderness Character Monitoring Application. User interface and lookup databases are required for use (see...

  7. Becharof National Wildlife Refuge Wilderness Character Monitoring Back-end Database

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This is the back-end data file for the Becharof Wilderness Character Monitoring Application. User interface and lookup databases are required for use (see reference...

  8. St Marks National Wildlife Wilderness Character Monitoring Back-end Database

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This is the back-end data file for the St. Marks Wilderness Character Monitoring Application. User interface and lookup databases are required for use (see reference...

  9. West Sister Island National Wildlife Refuge Wilderness Character Monitoring Back-end Database

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This is the back-end data file for the West Sister Island Wilderness Character Monitoring Application. User interface and lookup databases are required for use (see...

  10. Charles M. Russell National Wildlife Refuge Wilderness Character Monitoring Back-end Database

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This is the back-end data file for the Charles M. Russell Wilderness Character Monitoring Application. User interface and lookup databases are required for use (see...

  11. Mingo National Wildlife Refuge Wilderness Character Monitoring Back-end Database

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This is the back-end data file for the Mingo Wilderness Character Monitoring Application. User interface and lookup databases are required for use (see reference...

  12. Seney National Wildlife Refuge, Huron Islands Wilderness Character Monitoring Back-end Database

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This is the back-end data file for the Huron Islands Wilderness Character Monitoring Application. User interface and lookup databases are required for use (see...

  13. Alaska Maritime National Wildlife Refuge Wilderness Character Monitoring Back-end Database

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This is the back-end data file for the Alaska Maritime Wilderness Character Monitoring Application. User interface and lookup databases are required for use (see...

  14. Seney National Wildlife Refuge Wilderness Character Monitoring Back-end Database

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This is the back-end data file for the Seney Wilderness Character Monitoring Application. User interface and lookup databases are required for use (see...

  15. Chincoteague National Wildlife Refuge, Assateague Island Wilderness Character Monitoring Back-end Database

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This is the back-end data file for the proposed Assateague Island Wilderness Character Monitoring Application. User interface and lookup databases are required for...

  16. Edwin B. Forsythe National Wildlife Refuge, Brigantine Wilderness Character Monitoring Back-end Database

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This is the back-end data file for the Edwin B. Forsythe National Wildlife Refuge, Brigantine Wilderness Character Monitoring Application. User interface and lookup...

  17. Izembek National Wildlife Refuge Wilderness Character Monitoring Back-end Database

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This is the back-end data file for the Izembek Wilderness Character Monitoring Application. User interface and lookup databases are required for use (see reference...

  18. Crab Orchard National Wildlife Refuge Wilderness Character Monitoring Back-end Database

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This is the back-end data file for the Crab Orchard Wilderness Character Monitoring Application. User interface and lookup databases are required for use (see...

  19. Alaska Maritime National Wildlife Refuge, Unimak Island Wilderness Character Monitoring Back-end Database

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This is the back-end data file for the Unimak Island Wilderness Character Monitoring Application. User interface and lookup databases are required for use (see...

  20. Cedar Keys National Wildlife Refuge Wilderness Character Monitoring Back-end Database

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This is the back-end data file for the Cedar Keys Wilderness Character Monitoring Application. User interface and lookup databases are required for use (see...

  1. Chassahowitzka National Wildlife Refuge Wilderness Character Monitoring Back-end Database

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This is the back-end data file for the Chassahowitzka Wilderness Character Monitoring Application. User interface and lookup databases are required for use (see...

  2. Pelican Island National Wildlife Refuge Wilderness Character Monitoring Back-end Database

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This is the back-end data file for the Pelican Island Wilderness Character Monitoring Application. User interface and lookup databases are required for use (see...

  3. Michigan Islands National Wildlife Refuge Wilderness Character Monitoring Back-end Database

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This is the back-end data file for the Michigan Islands Wilderness Character Monitoring Application. User interface and lookup databases are required for use (see...

  4. J.N. "Ding" Darling National Wildlife Refuge Wilderness Character Monitoring Back-end Database

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This is the back-end data file for the J.N. "Ding" Darling Wilderness Character Monitoring Application. User interface and lookup databases are required for use (see...

  5. Moosehorn National Wildlife Refuge Wilderness Character Monitoring Back-end Database

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This is the back-end data file for the Moosehorn Wilderness Character Monitoring Application. User interface and lookup databases are required for use (see reference...

  6. Monomoy National Wildlife Refuge Wilderness Character Monitoring Back-end Database

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This is the back-end data file for the Monomoy Wilderness Character Monitoring Application. User interface and lookup databases are required for use (see...

  7. Agassiz National Wildlife Refuge Wilderness Character Monitoring Back-end Database

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This is the back-end data file for the Agassiz Wilderness Character Monitoring Application. User interface and lookup databases are required for use (see reference...

  8. Cape Romain National Wildlife Refuge Wilderness Character Monitoring Back-end Database

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This is the back-end data file for the Cape Romain Wilderness Character Monitoring Application. User interface and lookup databases are required for use (see...

  9. Readout Control Specifications for the Front-End and Back-End of the LHCb Upgrade

    CERN Document Server

    Alessio, F

    2014-01-01

    The LHCb experiment has proposed an upgrade towards a full 40 MHz readout system [1] in order to run at between five and ten times the initial design luminosity. The various sub-systems in the readout architecture will need to be upgraded in order to cope with higher sub-detector occupancies, higher rate and higher network load. The development of a new readout control system for the upgraded LHCb readout system was investigated already in 2008 [2]. This work has evolved into a detailed system-level specification of the entire timing and readout control system [3]. In this paper, we specify in detail the functionalities that must be supported by the Front-End and the Back-End electronics to comply with the timing requirements and the readout scheme, and the necessary control and monitoring capabilities in order to validate, commission and operate the upgraded experiment efficiently and with sufficient flexibility. The document focuses entirely on the readout control aspects of the FE and BE, and the ECS inter...

  10. An open-source data storage and visualization back end for experimental data.

    Science.gov (United States)

    Nielsen, Kenneth; Andersen, Thomas; Jensen, Robert; Nielsen, Jane H; Chorkendorff, Ib

    2014-04-01

    In this article, a flexible free and open-source software system for data logging and presentation will be described. The system is highly modular and adaptable and can be used in any laboratory in which continuous and/or ad hoc measurements require centralized storage. A presentation component for the data back end has furthermore been written that enables live visualization of data on any device capable of displaying Web pages. The system consists of three parts: data-logging clients, a data server, and a data presentation Web site. The logging of data from independent clients leads to high resilience to equipment failure, whereas the central storage of data dramatically eases backup and data exchange. The visualization front end allows direct monitoring of acquired data to see live progress of long-duration experiments. This enables the user to alter experimental conditions based on these data and to interfere with the experiment if needed. The data stored consist both of specific measurements and of continuously logged system parameters. The latter is crucial to a variety of automation and surveillance features, and three cases of such features are described: monitoring system health, getting status of long-duration experiments, and implementation of instant alarms in the event of failure.
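
    A minimal sketch of the client/server split the abstract describes: an independent logging client that samples a value and pushes it to a central storage service over HTTP, surviving temporary server outages. The endpoint URL and JSON payload are hypothetical, not the API of the actual system.

      # Hypothetical data-logging client: sample a value, push it to a central
      # server, and keep going even if the network or server is temporarily down
      # (the resilience property highlighted in the abstract).
      import json
      import time
      import urllib.request

      SERVER_URL = "http://dataserver.local:8080/api/measurements"   # assumed endpoint

      def read_sensor() -> float:
          """Placeholder for an actual instrument readout."""
          return 42.0

      def push_sample(codename: str, value: float) -> None:
          payload = json.dumps({"codename": codename, "time": time.time(), "value": value})
          req = urllib.request.Request(SERVER_URL, data=payload.encode(),
                                       headers={"Content-Type": "application/json"})
          urllib.request.urlopen(req, timeout=5)

      while True:
          try:
              push_sample("chamber_pressure", read_sensor())
          except OSError:
              pass      # server unreachable: skip this point, do not crash the client
          time.sleep(10)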

  11. Analysis of factors affecting the implementation of back-end nuclear fuel cycle policy in Korea

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Yung Myung; Yang, Maeng Ho; Kim, Hyun Joon; Chung, Hwan Sam; Oh, Keun Bae; Lee, Byung OoK; Ko, Han Suk; Song, Ki Dong; Lee, Man Ki; Moon, Ki Hwan; Lee, Han Myung [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1994-01-01

    In this study, the acceptability of the back-end nuclear fuel cycle is surveyed and analyzed from the following three aspects. To begin with, the future political situation and energy-environmental issues are analyzed as part of the socio-economic aspect. Secondly, the domestic situation of the nuclear industry and the fuel cycle policies of foreign countries are surveyed as the technical aspect. Finally, the NPT, IAEA safeguards and nuclear export control regimes are analyzed as the institutional aspect. The timing of unification of South and North Korea will also greatly affect the implementation of back-end fuel cycle policy, and public attitudes will affect site acquisition, construction, and operation of nuclear facilities. An effort to ease international restrictions on the back-end fuel cycle is also required to accelerate the implementation of the policy. In this regard, the back-end fuel cycle policy should be clear-cut to avoid misunderstanding with respect to nuclear proliferation. Importantly, agreements with foreign countries should be amended on a mutually equivalent level. (Author) 30 refs., 5 figs., 25 tabs.

  12. The Areva Group back-end division - challenges and prospects; Le pole aval dans le groupe Areva - enjeux et perspectives

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-06-01

    This document presents the challenges and prospects of the Areva Group back-end division. Areva, a world leader in the nuclear industry, analyzes in this document the high-profile mix of complementary activities of the nuclear energy industry; for the back-end division, this covers the full range of services for the end of the fuel cycle, the fuel cycle back-end markets, and the associated economic and financial considerations. (A.L.B.)

  13. Legal, institutional, and political issues in transportation of nuclear materials at the back end of the LWR nuclear fuel cycle

    Energy Technology Data Exchange (ETDEWEB)

    Lippek, H.E.; Schuller, C.R.

    1979-03-01

    A study was conducted to identify major legal and institutional problems and issues in the transportation of spent fuel and associated processing wastes at the back end of the LWR nuclear fuel cycle. (Most of the discussion centers on the transportation of spent fuel, since this activity will involve virtually all of the legal and institutional problems likely to be encountered in moving waste materials, as well.) Actions or approaches that might be pursued to resolve the problems identified in the analysis are suggested. Two scenarios for the industrial-scale transportation of spent fuel and radioactive wastes, taken together, highlight most of the major problems and issues of a legal and institutional nature that are likely to arise: (1) utilizing the Allied General Nuclear Services (AGNS) facility at Barnwell, SC, as a temporary storage facility for spent fuel; and (2) utilizing AGNS for full-scale commercial reprocessing of spent LWR fuel.

  14. Green Bay and Gravel Island National Wildlife Refuge, Wisconsin Islands Wilderness Character Monitoring Back-end Database

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This is the back-end data file for the Wisconsin Islands Wilderness Character Monitoring Application. User interface and lookup databases are required for use (see...

  15. Back end of the nuclear fuel cycle; Aval du cycle du combustible nucleaire

    Energy Technology Data Exchange (ETDEWEB)

    Dognon, J.P.; Rabbe, C.; Beudaert, Ph.; Lamare, V.; Wipff, G.; Moisy, Ph.; Charrin, N.; Blanc, P.; Den Auwer, Ch.; Revel, R.; Charbonnel, M.C.; Presson, M.T.; Cau Dit Coumes, C.; Chopin-Dumas, J.; Devisme, F.; Rat, B.; Hill, C.; Guillaneux, D.; Madic, C.; Carrera, A.; Dozol, J.F.; Rouquette, H.; Allain, F.; Virelizier, H.; Moulin, Ch.; Lemort, F.; Orlhac, X.; Fillet, C.; Carpena, J.; Advocat, T.; Leturcq, G.; Lacombe, J.; Bonnetier, A.; Ribet, I.; Poitou, S.; Richaud, D.; Fiquet, O.; Gramondi, P.; Massit, H.; Meyer, D.; Conocar, O.; Pettier, J.L.; Raphael, T.; Bouniol, P.; Sercombe, J.; Badouix, P.; Adenot, F.; Le Bescop, P.; Mazoin, C.; Motellier, S.; Charles, Y.; Richet, C.; Ayache, R.; Pitsch, H.; Ly, J.; Beaucaire, C.; Devol-Brown, I.; Libert, M.F.; Besnainou, B

    1999-07-01

    In this chapter of the DCC 1999 scientific report, the following theoretical studies are detailed: the electronic structure of lanthanide and actinide complexes, prediction of the stoichiometry of europium nitrate complexes, analytical and thermodynamic chemistry of actinide aqueous solutions, and structural determination of actinide complexes. It also covers experimental studies: actinide and lanthanide separation, radioactive waste processing and conditioning, the plasma torch vitrification process, simulation of waste package characterization, and waste storage, including concrete behaviour and biodegradation. (A.L.B.)

  16. An Open-Source Data Storage and Visualization Back End for Experimental Data

    DEFF Research Database (Denmark)

    Nielsen, Kenneth; Andersen, Thomas; Jensen, Robert

    2014-01-01

    In this article, a flexible free and open-source software system for data logging and presentation will be described. The system is highly modular and adaptable and can be used in any laboratory in which continuous and/or ad hoc measurements require centralized storage. A presentation component...

  17. Back-end of the fuel cycle and non-proliferation strategies

    Energy Technology Data Exchange (ETDEWEB)

    Chebeskov, A.N.; Oussanov, V.I.; Iougai, S.V.; Pshakin, G.M. [Institute of Physics and Power Engineering, State Scientific Center of Russian Federation, Obninsk (Russian Federation)

    2001-07-01

    The paper focuses on the problem of fissile materials proliferation risk estimation. Some methodological approaches to the solution of this task and results of their application for comparison of different nuclear fuel cycle strategies are discussed. The results of comparative assessment of non-proliferation aspects of plutonium utilization alternatives in Russia using system analysis approach are presented. (author)

  18. A Persistent Back-End for the ATLAS Online Information Service (P-BEAST)

    CERN Document Server

    SICOE, A D; The ATLAS collaboration; KOLOS, S; MAGNONI, L; SOLOVIEV, I

    2012-01-01

    This poster describes P-BEAST, a highly scalable, highly available and durable system for archiving monitoring information of the trigger and data acquisition (TDAQ) system of the ATLAS experiment. Currently this consists of 20,000 applications running on 2,400 interconnected computers but it is foreseen to grow further in the near future. P-BEAST stores considerable amounts of monitoring information which would otherwise be lost. Making this data accessible, facilitates long term analysis and faster debugging. The novelty of this research consists of using a modern key-value storage technology (Cassandra) to satisfy the large time series data rates, flexibility and scalability requirements entailed by the project. The loose schema allows the stored data to evolve seamlessly with the information flowing within the Information Service. An architectural overview of P-BEAST is presented together with a discussion on the arguments which ultimately lead to choosing Cassandra as the storage technology. Measurements...

  19. A Real-Time, GPU-Based, Non-Imaging Back-End for Radio Telescopes

    CERN Document Server

    Magro, Alessio

    2014-01-01

    Since the discovery of RRATs, interest in single pulse radio searches has increased dramatically. Due to the large data volumes generated by these searches, especially in planned surveys for future radio telescopes, such searches have to be conducted in real time. This has led to the development of a multitude of search techniques and real-time pipeline prototypes. In this work we investigated the applicability of GPUs. We have designed and implemented a scalable, flexible, GPU-based, transient search pipeline composed of several processing stages, including RFI mitigation, dedispersion, event detection and classification, as well as data quantisation and persistence. These stages are encapsulated as a standalone framework. The optimised GPU implementation of direct dedispersion achieves a speedup of more than an order of magnitude when compared to an optimised CPU implementation. We use a density-based clustering algorithm, coupled with a candidate selection mechanism, to group detections caused by the same ...
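
    For context, the standard cold-plasma dispersion delay that any dedispersion stage (CPU or GPU) has to remove; the frequencies and dispersion measure below are example values, and this scalar sketch is not the paper's GPU implementation.

      # Dispersion delay between two observing frequencies for a given dispersion
      # measure (DM), using the usual approximation
      #   dt = 4.149 ms * DM * (f^-2 - f_ref^-2)
      # with frequencies in GHz and DM in pc cm^-3.
      K_DM_MS = 4.149  # ms GHz^2 / (pc cm^-3)

      def dispersion_delay_ms(dm: float, f_ghz: float, f_ref_ghz: float) -> float:
          return K_DM_MS * dm * (f_ghz ** -2 - f_ref_ghz ** -2)

      # Example: DM = 50 pc cm^-3 across a 1.2-1.5 GHz band -> about 52 ms of delay.
      print(f"{dispersion_delay_ms(50.0, 1.2, 1.5):.1f} ms")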

  20. Persistent Back End for the ATLAS Information Service of Trigger and Data Acquisition

    CERN Document Server

    Sicoe, Alexandru

    ATLAS is the largest of several experiments built along the Large Hadron Collider at CERN, Geneva. Its aim is to measure particle production when protons collide at a very high center-of-mass energy, thus reproducing the behavior of matter a few instants after the big bang. The detecting techniques used for this purpose are very sophisticated, and the amount of digitized data created by the sensing elements requires a very large data acquisition system based on thousands of interconnected computers. The experiment has been successfully taking data since the end of 2008, and the trigger and data acquisition systems are now in a production stage. The main development efforts are directed towards adding easy-to-use and intuitive tools to help experts monitor different components or subsystems. P-BEAST is an example of such a tool. It facilitates the storage of vast amounts of operational information which would otherwise be lost. With these data at hand, long-term analysis can be made and issues discovered. The project has reached deployment...

  1. 40 CFR 63.496 - Back-end process provisions-procedures to determine compliance using control or recovery devices.

    Science.gov (United States)

    2010-07-01

    .... ER05SE96.026 where: Cij, Coj=Concentration of sample component j of the gas stream at the inlet and outlet... (kg/hr). Mij, Moj=Molecular weight of sample component j of the gas stream at the inlet and outlet of... × 10−6 (ppmv)−1 (gm-mole/scm) (kg/gm) (min/hr), where standard temperature is 20 °C. (v) Inlet...

  2. Investigation of economics of back-end nuclear fuel cycle options in the Republic of Korea based on Once-through

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Seok-Ki; Yim, Man-Sung [Korea Advance Institute of Science and Technology, Daejeon (Korea, Republic of)

    2015-05-15

    The purpose of this study is to examine these questions and perform economic evaluations of various once-through back-end fuel cycle cases in the ROK. The study therefore supports decision making on how the long-term spent nuclear fuel (SNF) management strategy should be developed. A spreadsheet model was developed to plan reactor construction, interim storage and HLW repository construction within engineering constraints, based on estimates of the spent fuel flow and the energy supply of the nuclear power program. The model computes the back-end levelized costs for various fuel cycle choices. The scenarios assumed in the model include (1) 0-year/10-year/20-year extension of the licensed operation period; (2) phase-out of the NPP program versus continuous use, including the reunification of the Korean peninsula; (3) reactor decommissioning and construction lead times of 10 years and 5 years, respectively; (4) geological constraints on siting a new reactor - 38 sites without reunification and 70 with reunification; (5) the first initiation of reactor decommissioning and operation of the HLW repository, assumed to be 2020 and 2050; and (6) a reactor capacity factor of 0.85 and an on-site wet storage pool capacity of 0.498 MTHM per MWe, equivalent to the APR1400. The capacity factor for PHWR reactors was assumed to be 0.85 and the plan for PHWRs was fixed as phase-out. The spreadsheet model computes the annual expenditures of the back-end fuel cycle and calculates the levelized costs. Licensed operation period extension not only enhances economic efficiency and stable energy supply, but also reduces the burden of siting a new reactor and of waste disposal. Regardless of reunification, continuous use of nuclear energy lowers the back-end fuel cycle cost. Given the projection that a large portion of social cost is included in the current back-end fuel cycle cost, nuclear energy likely has more competency in
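
    A minimal sketch of the levelized back-end cost computation such a spreadsheet model performs: discount annual expenditures and annual electricity generation to a common year and take their ratio. The cash-flow series and discount rate below are placeholders, not figures from the study.

      def levelized_cost(costs, energies, rate):
          """Levelized cost = sum of discounted costs / sum of discounted energy."""
          disc_cost = sum(c / (1 + rate) ** t for t, c in enumerate(costs))
          disc_energy = sum(e / (1 + rate) ** t for t, e in enumerate(energies))
          return disc_cost / disc_energy

      # Placeholder series: annual back-end expenditures (million USD) and
      # electricity generation (TWh) over a five-year window, 5 % discount rate.
      annual_costs  = [120.0, 130.0, 125.0, 140.0, 135.0]
      annual_energy = [150.0, 152.0, 151.0, 149.0, 150.0]
      print(f"{levelized_cost(annual_costs, annual_energy, 0.05):.3f} MUSD/TWh")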

  3. A methodology for assessing the environmental and health impact of options for the back-end of the nuclear fuel cycle

    Energy Technology Data Exchange (ETDEWEB)

    Ouzounian, G.H. [Agence Nationale pour la Gestion des Dechets Radioactifs ANDRA, 92 - Chatenay Malabry (France); Devezeaux de Lavergne, J.G.; Devin, P. [Cogema, 78 - Saint Quentin en Yvelines (France); Lioure, A. [CEA Valrho, 30 - Marcoule (France); Mouney, H. [Electricite de France (EDF), 93 - Saint Denis (France); Le Boulch, D. [Electricite de France, DRD, 77 - Moret sur Loing (France)

    2001-07-01

    Research programs conducted in France in the framework of the 1991 act offer various options for management of the back-end of the fuel cycle. Proposals to be debated in 2006 will rely not only on broad scientific and technical knowledge, but also on the compilation and integration of results, with syntheses and analyses intended to highlight the advantages and limitations of each of the waste management paths. This presentation introduces a methodology derived from life cycle analysis as well as some preliminary results. (author)

  4. Development of a web application with front-end and back-end for second-hand buying and selling

    OpenAIRE

    NIETO RODRIGO, JORGE

    2016-01-01

    From the CDM (Centro De Movilidad) of the multinational 'Babel, sistemas de información', the author was encouraged to design and create a website for buying and selling second-hand products, to serve as his final degree project. The company's objectives for this project are: to implement a website, both its front-end and back-end; to implement security measures and verify them; to work with agile programming methodologies such as Scrum; to work with G...

  5. The Evolution of Wafer Bonding Moving from the back-end further to the front-end

    Institute of Scientific and Technical Information of China (English)

    Thomas Glinsner; Peter Hangweier

    2009-01-01

    1 Introduction As the nanoscale era progresses, innovative new materials and processes continue to be developed and implemented as a means of keeping the industry on the path of Moore's Law. Wafer bonding - literally, the temporary or permanent joining of two wafers or substrates using a suitable combination of process technologies, chemicals and adhesives - is one such innovation.

  6. Demonstrator System for the Phase-I Upgrade of the Trigger Readout Electronics of the ATLAS Liquid Argon Calorimeters

    CERN Document Server

    FRAGNAUD, J; The ATLAS collaboration

    2014-01-01

    The trigger readout electronics of the ATLAS LAr Calorimeters will be improved for the Phase-I luminosity upgrade of the LHC to enhance the trigger feature extraction. Signals with higher spatial granularity will be digitized and processed by newly developed front-end and back-end components. In order to evaluate technical and performance aspects, a demonstrator system is being set up which is planned to be installed on the ATLAS detector during the upcoming LHC run. Results from system tests of the analog signal treatment, the trigger digitizer, the optical signal transmission and the FPGA-based back-end are reported.

  7. An analysis of the back end of the nuclear fuel cycle with emphasis on high-level waste management, volume 1

    Science.gov (United States)

    1977-01-01

    The programs and plans of the U.S. government for the "back end of the nuclear fuel cycle" were examined to determine if there were any significant technological or regulatory gaps and inconsistencies. Particular emphasis was placed on analysis of high-level nuclear waste management plans, since the permanent disposal of radioactive waste has emerged as a major factor in the public acceptance of nuclear power. The implications of various light water reactor fuel cycle options were examined including throwaway, stowaway, uranium recycle, and plutonium plus uranium recycle. The results of this study indicate that the U.S. program for high-level waste management has significant gaps and inconsistencies. Areas of greatest concern include: the adequacy of the scientific data base for geological disposal; programs for the disposal of spent fuel rods; interagency coordination; and uncertainties in NRC regulatory requirements for disposal of both commercial and military high-level waste.

  8. A Persistent Back-End for the ATLAS TDAQ On-line Information Service (P-BEAST)

    CERN Document Server

    Sicoe, A; The ATLAS collaboration; Kolos, S; Magnoni, L; Soloviev, I

    2011-01-01

    This paper describes P-BEAST, a highly scalable, highly available and durable system for archiving monitoring information of the trigger and data acquisition (TDAQ) system of the ATLAS experiment at CERN. Currently this consists of 20,000 applications running on 2,400 interconnected computers but it is foreseen to grow further in the near future. P-BEAST stores considerable amounts of monitoring information which would otherwise be lost. Making this data accessible, facilitates long term analysis and faster debugging. The novelty of this research consists of using a modern key-value storage technology (Cassandra) to satisfy the massive time series data rates, flexibility and scalability requirements entailed by the project. The loose schema allows the stored data to evolve seamlessly with the information flowing within the Information Service. An architectural overview of P-BEAST is presented alongside a discussion about the technologies considered as candidates for storing the data. The arguments which ultim...

  9. Concerns when designing a safeguards approach for the back-end of the Swedish nuclear fuel cycle

    Energy Technology Data Exchange (ETDEWEB)

    Fritzell, Anni (Uppsala Univ., Uppsala (Sweden))

    2008-03-15

    In Sweden, the construction of an encapsulation plant and a geological repository for the final disposal of spent nuclear fuel is planned to start within the next ten years. Due to Sweden's international agreements on non-proliferation, the Swedish safeguards regime must be extended to include these facilities. The geological repository has some unique features, which present the safeguards system with unprecedented challenges. These features include, inter alia, the long period of time that the facility will contain nuclear material and the fact that the disposed nuclear material will be very difficult to access, implying that physical verification of its presence in the repository is not foreseen. This work presents the available techniques for creating a safeguards system for the back-end of the Swedish nuclear fuel cycle. Important issues to consider in the planning and implementation of the safeguards system have been investigated, which in some cases has led to an identification of areas needing further research. The results include three proposed options for a safeguards approach, which have been evaluated on the basis of the safeguards authorities' requirements. Also, the evolution and present situation of the work carried out in connection with safeguards for geological repositories has been compiled.

  10. Performance and description of the upgraded readout with the new back-end electronics for the ATLAS Pixel detector

    CERN Document Server

    Yajima, Kazuki; The ATLAS collaboration

    2017-01-01

    The LHC drastically increased its performance during the Run 2 data taking, starting from a peak instantaneous luminosity of up to $5\times10^{33} \mathrm{cm}^{-2} \mathrm{s}^{-1}$ in 2015 and concluding with the record value of $1.4\times10^{34} \mathrm{cm}^{-2} \mathrm{s}^{-1}$ in November 2016. The concurrent increase of the trigger rate and event size forced the ATLAS experiment to exploit its sub-detectors to the maximum, approaching and possibly exceeding the design parameters. The ATLAS Pixel data acquisition system was upgraded to avoid possible bandwidth limitations. Two upgrades of the read-out electronics have been carried out. The first took place during the 2015/16 YETS, when the outermost pixel layer (Layer-2) was upgraded and its bandwidth was doubled. This upgrade partly contributed to maintaining the data taking efficiency of the Pixel detector at a relatively high level ($\sim$99%) during the 2016 run. A similar upgrade of the read-out system for the middle layer (Layer-1) is ongoing during the 2016/17 EYETS. The details o...

  11. Automated process planning system

    Science.gov (United States)

    Mann, W.

    1978-01-01

    Program helps process engineers set up manufacturing plans for machined parts. System allows one to develop and store library of similar parts characteristics, as related to particular facility. Information is then used in interactive system to help develop manufacturing plans that meet required standards.

  12. Quartz resonator processing system

    Science.gov (United States)

    Peters, Roswell D. M.

    1983-01-01

    Disclosed is a single chamber ultra-high vacuum processing system for the production of hermetically sealed quartz resonators wherein electrode metallization and sealing are carried out along with cleaning and bake-out without any air exposure between the processing steps. The system includes a common vacuum chamber in which is located a rotatable wheel-like member which is adapted to move a plurality of individual component sets of a flat pack resonator unit past discretely located processing stations in said chamber whereupon electrode deposition takes place followed by the placement of ceramic covers over a frame containing a resonator element and then to a sealing stage where a pair of hydraulic rams including heating elements effect a metallized bonding of the covers to the frame.

  13. SAR processing using SHARC signal processing systems

    Science.gov (United States)

    Huxtable, Barton D.; Jackson, Christopher R.; Skaron, Steve A.

    1998-09-01

    Synthetic aperture radar (SAR) is uniquely suited to help solve the Search and Rescue problem since it can be utilized either day or night and through both dense fog and thick cloud cover. Other papers in this session, and in this session in 1997, describe the various SAR image processing algorithms that are being developed and evaluated within the Search and Rescue Program. All of these approaches to using SAR data require substantial amounts of digital signal processing: for the SAR image formation, and possibly for the subsequent image processing. In recognition of the demanding processing that will be required for an operational Search and Rescue Data Processing System (SARDPS), NASA/Goddard Space Flight Center and NASA/Stennis Space Center are conducting a technology demonstration utilizing SHARC multi-chip modules from Boeing to perform SAR image formation processing.

  14. Processed Products Database System

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Collection of annual data on processed seafood products. The Division provides authoritative advice, coordination and guidance on matters related to the collection,...

  15. Personal Investigations Processing System

    Data.gov (United States)

    US Agency for International Development — PIPS is a system that maintains the Security/Suitability Investigations Index (SII) for OPM. It contains over 11 million background investigation records of Federal...

  16. A brief introduction to the back-end dedusting system retrofit at the Beijing Yanshan cement plant

    Institute of Scientific and Technical Information of China (English)

    钱晓露

    2002-01-01

    Beijing Yanshan Cement Plant, built in 1958, is located outside Beijing's West Fifth Ring Road and is the cement plant closest to the city centre. To support the urban construction following Beijing's successful Olympic bid, the plant retrofitted the kiln back-end dedusting system of its 700 t/d clinker production line.

  17. Further development and data basis for safety and accident analyses of nuclear front end and back end facilities and actualization and revision of calculation methods for nuclear safety analyses. Final report; Weiterentwicklung von Methoden und Datengrundlagen zu Sicherheits- und Stoerfallanalysen fuer Anlagen der nuklearen Ver- und Entsorgung sowie Aktualisierung und Ueberpruefung von Rechenmethoden zu nuklearen Sicherheitsanalysen. Abschlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Kilger, Robert; Peters, Elisabeth; Sommer, Fabian; Moser, Eberhard-Franz; Kessen, Sven; Stuke, Maik

    2016-07-15

    This report briefly describes the activities carried out under project 3613R03350 on the GRS "Handbook on Accident Analysis for Nuclear Front and Back End Facilities", and describes in detail the continuing work on the revision and updating of the GRS "Handbook on Criticality", which here focused on fissile systems with plutonium and ²³³U. The ongoing literature study on innovative fuel concepts, started in previous projects, is continued. Also described are the review and qualification of computational methods through research and active benchmark participation, and the results of tracking the state of science and technology in the field of computational methods for criticality safety analysis. Special in-depth analyses of selected criticality-relevant occurrences in the past are also documented.

  18. Design and implementation of a device for effective utilization of residual heat from the back-end of organic heat carrier boilers

    Institute of Scientific and Technical Information of China (English)

    郭建平; 李文炜; 楼云定

    2009-01-01

    Organic heat carrier boilers are widely used in the textile, printing and dyeing industries. Due to the special processes of high-temperature heat setting and dyeing and finishing, as well as the structure and operating principle of organic heat carrier boilers, the temperature of the exhaust gas from the back-end of these boilers reaches about 300 °C, resulting in an enormous waste of heat energy. To reduce this loss, this paper uses heat pipe technology to design and develop a well-structured, effective and easy-to-operate device for recovering and reusing the residual heat of the high-temperature flue gas from the back-end of organic heat carrier boilers, thereby improving the boilers' overall thermal efficiency and maximizing energy conservation and emission reduction.

  19. AREVA Technical Days (ATD) session 2: operations of the back-end of the nuclear fuel cycle; AREVA Technical Days (ATD) session 2: les activites du pole Aval du cycle du combustible nucleaire

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2002-07-01

    These technical days organized by the Areva Group aim to explain the group's activities from a technological and economic point of view, to provide an outlook of worldwide energy trends and challenges, and to present each of its businesses in a synthetic manner. This second session deals with the reprocessing business, back-end financing mechanisms, technology transfer, environmental management, risk management programs, the contribution of research and development to waste volume reduction, issues and outlook of nuclear wastes, and a comparison of the open and closed cycles. (A.L.B.)

  20. UNICS - An Unified Instrument Control System for Small/Medium Sized Astronomical Observatories

    CERN Document Server

    Srivastava, Mudit K; Burse, Mahesh P; Chordia, Pravin A; Chillal, Kalpesh S; Mestry, Vilas B; Das, Hillol K; Kohok, Abhay A

    2009-01-01

    Although the astronomy community is witnessing an era of large telescopes, smaller and medium sized telescopes still maintain their utility, being larger in number. In order to obtain better scientific output it is necessary to incorporate modern and advanced technologies into the back-end instruments and into their interfaces with the telescopes through various control processes. However, tight financial constraints on smaller and medium sized observatories often limit the scope and utility of these systems. Most of the time, for every new development on the telescope, the back-end control systems have to be built from scratch, leading to high costs and efforts. Therefore a simple, low cost control system for small and medium sized observatories needs to be developed to minimize the cost and effort of expanding the observatory. Here we report on the development of a modern, multipurpose instrument control system UNICS (Unified Instrument Control System) to integrate the controls of vari...

  1. Evaluation of alignment marks using ASML ATHENA alignment system in 90nm BEOL process

    CERN Document Server

    Tan Chin Boon; Koh Hui Peng; Koo Chee, Kiong; Siew Yong Kong; Yeo Swee Hock

    2003-01-01

    As the critical dimension (CD) in integrated circuit (IC) devices reduces, the total overlay budget needs to become more stringent. Typically, the allowable overlay error is 1/3 of the CD in the IC device. In this case, robustness of the alignment mark is critical, as an accurate signal is required by the scanner's alignment system to precisely align a layer of pattern to the previous layer. The alignment issue is more severe in the back-end process, partly due to the influence of Chemical Mechanical Polishing (CMP), which contributes to the asymmetry or total destruction of the alignment marks. Alignment marks on the wafer can be placed along the scribe-line of the IC pattern. The ASML scanner allows such wafer alignment using a phase grating mark, known as a Scribe-line Primary Mark (SPM), which can fit into a standard 80um scribe-line. In this paper, we have studied the feasibility of introducing a Narrow SPM (NSPM) to enable a smaller scribe-line. The width of the NSPM has been shrunk down to 70% of the SPM and the length remain...

  2. Mars Aqueous Processing System Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Mars Aqueous Processing System (MAPS) is an innovative method to produce useful building materials from Martian regolith. Acids and bases produced from the regolith...

  3. Mars Aqueous Processing System Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The Mars Aqueous Processing System (MAPS) is a novel technology for recovering oxygen, iron, and other constituents from lunar and Mars soils. The closed-loop...

  4. Handbook of signal processing systems

    CERN Document Server

    Deprettere, Ed; Leupers, Rainer; Takala, Jarmo

    2013-01-01

    Handbook of Signal Processing Systems is organized in three parts. The first part motivates representative applications that drive and apply state-of-the art methods for design and implementation of signal processing systems; the second part discusses architectures for implementing these applications; the third part focuses on compilers and simulation tools, describes models of computation and their associated design tools and methodologies. This handbook is an essential tool for professionals in many fields and researchers of all levels.

  5. Handbook of signal processing systems

    CERN Document Server

    Bhattacharyya, Shuvra S; Leupers, Rainer; Takala, Jarmo

    2010-01-01

    The Handbook is organized in four parts. The first part motivates representative applications that drive and apply state-of-the art methods for design and implementation of signal processing systems; the second part discusses architectures for implementing these applications; the third part focuses on compilers and simulation tools; and the fourth part describes models of computation and their associated design tools and methodologies.

  6. VLSI mixed signal processing system

    Science.gov (United States)

    Alvarez, A.; Premkumar, A. B.

    1993-01-01

    An economical and efficient VLSI implementation of a mixed signal processing system (MSP) is presented in this paper. The MSP concept is investigated and the functional blocks of the proposed MSP are described. The requirements of each of the blocks are discussed in detail. A sample application using active acoustic cancellation technique is described to demonstrate the power of the MSP approach.

  7. PENGEMBANGAN CONTENT MANAGEMENT SYSTEM PADA ADMISI ONLINE BINUS UNIVERSITY

    Directory of Open Access Journals (Sweden)

    Karto Iskandar

    2012-11-01

    Technically, the registration form for each new Binus institution must be rebuilt and is not dynamic. Therefore, this research aims to simplify the creation of the registration process for new Binus institutions and to provide solutions when Binus establishes other new institutions. The methodology used is analysis and design of the database (database oriented). The analysis is done by asking about the problems of existing systems in the IT Directorate, whereas the design uses UML 2.0 diagram notation. The results obtained are front end and back end applications for the Content Management System of the registration form. The results of this design can be used to simplify new student registration across the many Binus institutions by grouping similar fields. With some changes in the front end and back end applications for the Content Management System, the addition of a new online admission application form can be managed faster, since the creation of the admission registration form is managed in the back end application. As a suggestion for future development, online admission registration could be run in a mobile version.

  8. Evaluation of fuel fabrication and the back end of the fuel cycle for light-water- and heavy-water-cooled nuclear power reactors

    Energy Technology Data Exchange (ETDEWEB)

    Carter, W.L.; Olsen, A.R.

    1979-06-01

    The classification of water-cooled nuclear reactors offers a number of fuel cycles that present inherently low risk of weapons proliferation while making power available to the international community. Eight fuel cycles in light water reactor (LWR), heavy water reactor (HWR), and the spectral shift controlled reactor (SSCR) systems have been proposed to promote these objectives in the International Fuel Cycle Evaluation (INFCE) program. Each was examined in an effort to provide technical and economic data to INFCE on fuel fabrication, refabrication, and reprocessing for an initial comparison of alternate cycles. The fuel cycles include three once-through cycles that require only fresh fuel fabrication, shipping, and spent fuel storage; four cycles that utilize denatured uranium--thorium and require all recycle operations; and one cycle that considers the LWR--HWR tandem operation requiring refabrication but no reprocessing.

  9. Forward Osmosis System And Process

    KAUST Repository

    Duan, Jintang

    2013-08-22

    A forward osmosis fluid purification system includes a cross-flow membrane module with a membrane, a channel on each side of the membrane which allows a feed solution and a draw solution to flow through separately, a feed side, a draw side including a draw solute, where the draw solute includes an aryl sulfonate salt. The system can be used in a process to extract water from impure water, such as wastewater or seawater. The purified water can be applied to arid land.

  10. Condensation Processes in Geothermal Systems

    Science.gov (United States)

    Norman, D. I.; Moore, J. N.

    2005-12-01

    We model condensation processes in geothermal systems to understand how this process changes fluid chemistry. We assume two processes operate in geothermal systems: 1) condensation of a vapor phase derived by boiling an aqueous geothermal fluid into a cool near surface water and 2) condensation of a magmatic vapor by a deep circulating meteoric thermal fluid. It is assumed that the condensation process has two stages. Initially the condensing fluid is undersaturated in gaseous species. Condensation of the vapor phase continues until the pressure on the fluid equals the sum of the partial pressures of water and the dissolved gaseous species. At that time bubbles flux through the condensing fluid. In time the fluid and fluxing gas phase come to equilibrium. Calculation shows that during the second stage of the condensation process the liquid phase becomes enriched in more soluble gaseous species like CO2 and H2S, and depleted in less soluble species like CH4 and N2. Stage 2 condensation processes can therefore be monitored by ratios of more and less condensable species like CO2/N2. Condensation of vapor released by boiling geothermal fluids results in liquids with high concentrations of H2S and CO2, as is seen in geothermal system steam-heated waters. Condensation of a magmatic vapor into circulating meteoric water has been proposed, but not well demonstrated. We compare our models to the Cerro Prieto, Mexico, gas analysis data set collected over twelve years' time by USGS personnel. It was assumed for modeling that the Cerro Prieto geothermal fluids are circulating meteoric fluids with N2/Ar ratios about 40 to which is added a magmatic vapor with N2/Ar ratio = 400. The Cerro Prieto analyses show a strong correlation between N2/Ar and CO2/N2 as predicted by calculation. Two dimensional image plots of well N2/Ar + CO2/N2 show a bull's-eye pattern on the geothermal field. Image plots of analyses collected over a year or less time show N2/Ar and CO2/N2 hot spots

  11. Demonstrator System for the Phase-I Upgrade of the Trigger Readout Electronics of the ATLAS Liquid-Argon Calorimeters

    CERN Document Server

    Chen, Kai; The ATLAS collaboration

    2014-01-01

    The trigger readout electronics of the ATLAS Liquid Argon Calorimeters are foreseen to be improved for the Phase-I luminosity upgrade of the LHC, in 2019, in order to enhance the trigger feature extraction. Signals with higher spatial granularity will be digitized and processed by newly developed front-end and back-end components. In order to evaluate technical and performance aspects, a demonstrator system is being developed, with the intention of installing it on the ATLAS detector for operation during the data-taking period beginning in 2015. Results from system tests of the analog signal treatment, the trigger digitizer, the optical signal transmission and the FPGA-based back-end modules will be reported.

  12. XCPU2 process management system

    Energy Technology Data Exchange (ETDEWEB)

    Ionkov, Latchesar [Los Alamos National Laboratory; Van Hensbergen, Eric [IBM AUSTIN RESEARCH LAB

    2009-01-01

    Xcpu2 is a new process management system that allows users to specify a custom file system for a running job. Most cluster management systems enforce a single software distribution running on all nodes. Xcpu2 allows programs running on the cluster to work in an environment identical to the user's desktop, using the same versions of the libraries and tools the user installed locally, and accessing configuration files in the same places they are located on the desktop. Xcpu2 builds on our earlier work with the Xcpu system. Like Xcpu, Xcpu2's process management interface is represented as a set of files exported by a 9P file server. It supports heterogeneous clusters and multiple head nodes. Unlike Xcpu, it uses a pull instead of a push model. In this paper we describe the Xcpu2 clustering model, its operation, and how the per-job filesystem configuration can be used to solve some of the common problems encountered when running a cluster.

  13. NDMAS System and Process Description

    Energy Technology Data Exchange (ETDEWEB)

    Larry Hull

    2012-10-01

    Experimental data generated by the Very High Temperature Reactor Program need to be more available to users in the form of data tables on Web pages that can be downloaded to Excel or in delimited text formats that can be used directly for input to analysis and simulation codes, statistical packages, and graphics software. One solution that can provide current and future researchers with direct access to the data they need, while complying with records management requirements, is the Nuclear Data Management and Analysis System (NDMAS). This report describes the NDMAS system and its components, defines roles and responsibilities, describes the functions the system performs, describes the internal processes the NDMAS team uses to carry out the mission, and describes the hardware and software used to meet Very High Temperature Reactor Program needs.

  14. Dynamic security assessment processing system

    Science.gov (United States)

    Tang, Lei

    The architecture of dynamic security assessment processing system (DSAPS) is proposed to address online dynamic security assessment (DSA) with focus of the dissertation on low-probability, high-consequence events. DSAPS upgrades current online DSA functions and adds new functions to fit into the modern power grid. Trajectory sensitivity analysis is introduced and its applications in power system are reviewed. An index is presented to assess transient voltage dips quantitatively using trajectory sensitivities. Then the framework of anticipatory computing system (ACS) for cascading defense is presented as an important function of DSAPS. ACS addresses various security problems and the uncertainties in cascading outages. Corrective control design is automated to mitigate the system stress in cascading progressions. The corrective controls introduced in the dissertation include corrective security constrained optimal power flow, a two-stage load control for severe under-frequency conditions, and transient stability constrained optimal power flow for cascading outages. With state-of-the-art computing facilities to perform high-speed extended-term time-domain simulation and optimization for large-scale systems, DSAPS/ACS efficiently addresses online DSA for low-probability, high-consequence events, which are not addressed by today's industrial practice. Human interference is reduced in the computationally burdensome analysis.

  15. A systems process of reinforcement.

    Science.gov (United States)

    Sudakov, K V

    1997-01-01

    Functional systems theory was used to consider the process of reinforcement, i.e., the action on the body of reinforcing factors (the results of behavior satisfying the body's original needs). The systems process of reinforcement includes reverse afferentation entering the CNS from receptors acted upon by various parameters of the desired results, and mechanisms for comparing reverse afferentation with the apparatus which accepts the results of the action and the corresponding emotional component. A tight interaction between reinforcement and the dominant motivation is generated on the basis of the hologram principle. Reinforcement forms an apparatus for predicting a desired result, i.e. a result-of-action acceptor. Reinforcement produces significant changes in the activities of individual neurons in the various brain structures involved in dominant motivation, transforming their spike activity from a burst pattern to regular discharges; there are also molecular changes in neuron properties. After preliminary reinforcement, the corresponding motivation induces the ribosomal system of neurons to start synthesizing special effector molecules, which organize molecular engrams of the acceptor of the action's result. Sensory mechanisms of reinforcement are considered, with particular reference to the informational role of emotions.

  16. [The systems process of reinforcement].

    Science.gov (United States)

    Sudakov, K V

    1996-01-01

    The process of reinforcement is considered in the context of the general theory of functional systems as an important part of the organization of a behavioural act, closely interacting with the dominant motivation. It is shown that reinforcement substantially changes the activities of separate neurons in different brain structures involved in dominant motivation. After a preliminary reinforcement, under the influence of the corresponding motivation the ribosomal apparatus of neurons begins to synthesize special molecular engrams of the action acceptor. The sensory mechanisms of reinforcement and, especially, the role of emotions are considered in detail in the paper.

  17. Process algebra for Hybrid systems

    NARCIS (Netherlands)

    Bergstra, J.A.; Middelburg, C.A.

    2008-01-01

    We propose a process algebra obtained by extending a combination of the process algebra with continuous relative timing from Baeten and Middelburg [Process Algebra with Timing, Springer, Chap. 4, 2002] and the process algebra with propositional signals from Baeten and Bergstra [Theoretical Computer

  18. Process algebra for hybrid systems

    NARCIS (Netherlands)

    Bergstra, J.A.; Middelburg, C.A.

    2005-01-01

    We propose a process algebra obtained by extending a combination of the process algebra with continuous relative timing from Baeten and Middelburg (Process Algebra with Timing, Springer,Berlin, 2002, Chapter 4), and the process algebra with propositional signals from Baeten and Bergstra(Theoret. Com

  19. Pulsar Coherent De-dispersion System on the Urumqi Observatory

    Science.gov (United States)

    Liu, Li-Yong; Ali, Esamdin; Zhang, Jin

    2007-03-01

    A pulsar coherent de-dispersion experiment was carried out using the 25m Nanshan radio telescope at the Urumqi Observatory. It uses a dual polarization receiver operating at 18cm and a VLBI back-end, Mark5A. The data processing system is based on a C program running on Linux and a 4-node Beowulf cluster. A high quality data acquisition system and a cluster with more processors are needed to build an online pulsar coherent de-dispersion system in the future.

  20. Pulsar coherent de-dispersion system of Urumqi Observatory

    Science.gov (United States)

    Liyong, Liu; Esamdin, Ali; Jin, Zhang

    A pulsar coherent de-dispersion experiment has been carried out using the 25-m Nanshan radio telescope of Urumqi Observatory. It uses a dual polarization receiver operating at 18cm and a VLBI back-end, Mark5A. The data processing system is based on a C program on Linux and a 4-node Beowulf cluster. A high quality data acquisition system and a cluster with more processors are needed to build an on-line pulsar coherent de-dispersion system in the future. Key words: Astronomical instrument; Pulsar; Coherent de-dispersion; Parallel computing; Cluster; Mark5A
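
    As a schematic illustration of the operation these two records describe (neither gives the actual implementation), the sketch below applies coherent de-dispersion to complex baseband samples by multiplying their spectrum with the inverse of the interstellar dispersion chirp. The dispersion constant and the sign of the phase are assumptions that depend on the adopted Fourier and mixing conventions, and all names are hypothetical.

```python
# Schematic sketch of coherent de-dispersion (illustrative; not the
# Observatory's pipeline). Complex baseband voltages sampled at `fs_hz` around
# centre frequency `f0_mhz` are corrected in the frequency domain by the
# inverse of the interstellar dispersion chirp. The dispersion constant and
# the overall sign of the phase are assumptions and should be checked against
# standard pulsar-processing references.
import numpy as np

KDM = 4.148808e3  # assumed dispersion constant, MHz^2 cm^3 pc^-1 s


def coherent_dedisperse(voltage, fs_hz, f0_mhz, dm):
    """Return baseband samples with the dispersive phase rotation removed."""
    n = len(voltage)
    df = np.fft.fftfreq(n, d=1.0 / fs_hz) / 1e6            # offset from f0, in MHz
    # Chirp phase: 2*pi * 1e6 * kDM * DM * df^2 / (f0^2 * (f0 + df)); the 1e6
    # converts the s*MHz product into a dimensionless phase in radians.
    phase = 2.0 * np.pi * 1e6 * KDM * dm * df**2 / (f0_mhz**2 * (f0_mhz + df))
    return np.fft.ifft(np.fft.fft(voltage) * np.exp(-1j * phase))


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    data = rng.normal(size=2**16) + 1j * rng.normal(size=2**16)  # fake baseband noise
    cleaned = coherent_dedisperse(data, fs_hz=16e6, f0_mhz=1666.0, dm=56.7)
    print(cleaned.shape, cleaned.dtype)
```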

  1. Applications of text processing using natural processing system in Printer

    Science.gov (United States)

    Saito, Tadashi

    DAI NIPPON PRINTING CO., Ltd. developed a natural language processing system for automatic indexing and for assigning readable kana characters to kanji characters, which is called ruby. This system is based on the automatic indexing system called INDEXER produced by NTT Communications and Information Processing Laboratories and NTT Data Communications Co., Ltd. This paper describes some applications using the system. The system creates kana readings for kanji characters, which is useful for address books, name lists and books. Further, we apply this system to automatic indexing on CD-ROM.

  2. NASA System Engineering Design Process

    Science.gov (United States)

    Roman, Jose

    2011-01-01

    This slide presentation reviews NASA's use of systems engineering for the complete life cycle of a project. Systems engineering is a methodical, disciplined approach for the design, realization, technical management, operations, and retirement of a system. Each phase of a NASA project is terminated with a Key decision point (KDP), which is supported by major reviews.

  3. Process automatization in system administration

    OpenAIRE

    Petauer, Janja

    2013-01-01

    The aim of the thesis is to present the automatization of user management in the company Studio Moderna. The company has grown exponentially in recent years, which is why we needed to find a faster, easier and cheaper way of managing user accounts. We automatized the processes of creating, changing and removing user accounts within Active Directory. We prepared a user interface inside an existing application, used JavaScript for drop down menus, wrote a script in a scripting programming langu...

  4. Vision Systems Illuminate Industrial Processes

    Science.gov (United States)

    2013-01-01

    When NASA designs a spacecraft to undertake a new mission, innovation does not stop after the design phase. In many cases, these spacecraft are firsts of their kind, requiring not only remarkable imagination and expertise in their conception but new technologies and methods for their manufacture. In the realm of manufacturing, NASA has from necessity worked on the cutting-edge, seeking new techniques and materials for creating unprecedented structures, as well as capabilities for reducing the cost and increasing the efficiency of existing manufacturing technologies. From friction stir welding enhancements (Spinoff 2009) to thermoset composites (Spinoff 2011), NASA's innovations in manufacturing have often transferred to the public in ways that enable the expansion of the Nation's industrial productivity. NASA has long pursued ways of improving upon and ensuring quality results from manufacturing processes ranging from arc welding to thermal coating applications. But many of these processes generate blinding light (hence the need for special eyewear during welding) that obscures the process while it is happening, making it difficult to monitor and evaluate. In the 1980s, NASA partnered with a company to develop technology to address this issue. Today, that collaboration has spawned multiple commercial products that not only support effective manufacturing for private industry but also may support NASA in the use of an exciting, rapidly growing field of manufacturing ideal for long-duration space missions.

  5. Applied signal processing concepts, circuits, and systems

    CERN Document Server

    Hamdy, Nadder

    2008-01-01

    Introduction What are Signals? Signal parameters Why Signal processing? Analog vs. Digital Signal processing Practical Signal processing Systems Analog Signal Processing Amplitude Shaping Frequency Spectrum Shaping Phase Errors Correction Waveform Generation Analog Filter Design Describing Equations Design Procedures Filter Specifications Approximations to the Ideal Response Realization Practical RC-Filters Design Switched Capacitor Filter Realization Design examples Data Converters Introduction A typical DSP System Specifications of Data Converters Sampling Samp

  6. Software Defined Common Processing System (SDCPS) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Coherent Logix, Incorporated proposes the Software Defined Common Processing System (SDCPS) program to facilitate the development of a Software Defined Radio...

  7. Software Defined Common Processing System (SDCPS) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Coherent Logix, Incorporated (CLX) proposes the development of a Software Defined Common Processing System (SDCPS) that leverages the inherent advantages of an...

  8. Integrated Monitoring System of Production Processes

    Directory of Open Access Journals (Sweden)

    Oborski Przemysław

    2016-12-01

    An integrated monitoring system for discrete manufacturing processes is presented in the paper. A multilayer hardware and software reference model was developed. The original research is an answer to industry's need to integrate the information flow in the production process. The reference model corresponds with a proposed data model based on a multilayer data tree allowing orders, products and processes to be described and monitoring data to be saved. The elaborated models were implemented in the integrated monitoring system demonstrator developed in the project. It was built on the basis of multiagent technology to assure high flexibility and openness in applying intelligent algorithms for data processing. Currently, on the basis of the experience achieved, an integrated monitoring system for a real production system is being developed. In the article the main problems of monitoring integration are presented, including the specificity of discrete production, data processing and the future application of Cyber-Physical Systems. The development of manufacturing systems is increasingly based on taking advantage of applying intelligent solutions to machine and production process control and monitoring. The connection of technical systems, machine tools and manufacturing process monitoring with advanced information processing seems to be one of the most important areas of near-future development. It will play an important role in the efficient operation and competitiveness of the whole production system. It is also an important area for the future application of Cyber-Physical Systems, which can radically improve the functionality of monitoring systems and reduce the cost of their implementation.
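
    The multilayer data tree mentioned in the abstract (orders, products, processes and their monitoring data) is not specified further in the record; the following Python sketch is a hypothetical illustration of such a structure, not the project's actual data model.

```python
# Hypothetical sketch of a multilayer data tree for production monitoring:
# orders contain products, products contain processes, and each process
# accumulates monitoring samples. Not the project's actual data model.
from dataclasses import dataclass, field
from typing import Dict, List, Tuple


@dataclass
class ProcessNode:
    name: str
    samples: List[Tuple[float, float]] = field(default_factory=list)  # (time, value)

    def record(self, t, value):
        self.samples.append((t, value))


@dataclass
class ProductNode:
    name: str
    processes: Dict[str, ProcessNode] = field(default_factory=dict)


@dataclass
class OrderNode:
    order_id: str
    products: Dict[str, ProductNode] = field(default_factory=dict)

    def process(self, product, process):
        node = self.products.setdefault(product, ProductNode(product))
        return node.processes.setdefault(process, ProcessNode(process))


if __name__ == "__main__":
    order = OrderNode("ORD-001")
    milling = order.process("gear housing", "milling")
    milling.record(0.0, 1200.0)   # e.g. a spindle-speed sample
    milling.record(1.0, 1185.0)
    print(order)
```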

  9. Weak Markov Processes as Linear Systems

    CERN Document Server

    Gohm, Rolf

    2012-01-01

    A noncommutative Fornasini-Marchesini system (a multi-variable version of a linear system) can be realized within a weak Markov process (a model for quantum evolution). For a discrete time parameter this is worked out systematically as a theory of representations of structure maps of a system by a weak process. We introduce subprocesses and quotient processes which can be described naturally by a suitable category of weak processes. A corresponding notion of cascade for processes induces a represented cascade of systems. We study the control theoretic notion of observability which turns out to be particularly interesting in connection with a cascade structure. As an application we gain new insights into stationary Markov chains where observability for the system is closely related to asymptotic completeness in the scattering theory of the chain. This motivates a general definition of asymptotic completeness in the category of weak processes.

  10. Improving Process Heating System Performance v3

    Energy Technology Data Exchange (ETDEWEB)

    None

    2016-04-11

    Improving Process Heating System Performance: A Sourcebook for Industry is a development of the U.S. Department of Energy (DOE) Advanced Manufacturing Office (AMO) and the Industrial Heating Equipment Association (IHEA). The AMO and IHEA undertook this project as part of a series of sourcebook publications developed by AMO on energy-consuming industrial systems and opportunities to improve their performance. Other topics in this series include compressed air systems, pumping systems, fan systems, steam systems, and motors and drives.

  11. Sticky continuous processes have consistent price systems

    DEFF Research Database (Denmark)

    Bender, Christian; Pakkanen, Mikko; Sayit, Hasanjan

    Under proportional transaction costs, a price process is said to have a consistent price system, if there is a semimartingale with an equivalent martingale measure that evolves within the bid-ask spread. We show that a continuous, multi-asset price process has a consistent price system, under arb...

  12. Gaussian process based recursive system identification

    Science.gov (United States)

    Prüher, Jakub; Šimandl, Miroslav

    2014-12-01

    This paper is concerned with the problem of recursive system identification using a nonparametric Gaussian process model. The non-linear stochastic system under consideration is affine in control and given in the input-output form. The use of a recursive Gaussian process algorithm for non-linear system identification is proposed to alleviate the computational burden of the full Gaussian process. The problem of online hyper-parameter estimation is handled using a proposed ad-hoc procedure. The approach to system identification using the recursive Gaussian process is compared with the full Gaussian process in terms of model error and uncertainty as well as computational demands. Using Monte Carlo simulations it is shown that the use of the recursive Gaussian process with an ad-hoc learning procedure offers converging estimates of hyper-parameters and constant computational demands.
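
    To make the idea concrete, the sketch below shows online Gaussian-process regression for input-output system identification. It is illustrative only: hyper-parameters are fixed and the Cholesky factor is recomputed at every step, whereas the paper proposes a recursive algorithm with online hyper-parameter estimation precisely to avoid that growing cost. All names are hypothetical.

```python
# Illustrative sketch of online Gaussian-process regression for system
# identification; a truly recursive implementation would use low-rank updates
# or sparse approximations instead of refactorising the full kernel matrix.
import numpy as np


def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)


class OnlineGP:
    def __init__(self, noise_var=1e-2, lengthscale=1.0, variance=1.0):
        self.noise_var, self.ls, self.var = noise_var, lengthscale, variance
        self.X = np.empty((0, 0))
        self.y = np.empty(0)

    def update(self, x_new, y_new):
        """Add one (input, output) sample and refresh the posterior factorisation."""
        x_new = np.atleast_2d(x_new)
        self.X = x_new if self.X.size == 0 else np.vstack([self.X, x_new])
        self.y = np.append(self.y, y_new)
        K = rbf_kernel(self.X, self.X, self.ls, self.var)
        self.L = np.linalg.cholesky(K + self.noise_var * np.eye(len(self.y)))
        self.alpha = np.linalg.solve(self.L.T, np.linalg.solve(self.L, self.y))

    def predict(self, x_star):
        x_star = np.atleast_2d(x_star)
        k_star = rbf_kernel(self.X, x_star, self.ls, self.var)
        mean = k_star.T @ self.alpha
        v = np.linalg.solve(self.L, k_star)
        var = rbf_kernel(x_star, x_star, self.ls, self.var) - v.T @ v
        return mean, np.diag(var)


if __name__ == "__main__":
    gp = OnlineGP()
    # Identify y_k = 0.8*y_{k-1} + u_k from streaming data (toy example).
    y_prev, rng = 0.0, np.random.default_rng(0)
    for _ in range(50):
        u = rng.normal()
        y = 0.8 * y_prev + u + 0.05 * rng.normal()
        gp.update([y_prev, u], y)
        y_prev = y
    print(gp.predict([0.5, 0.1]))
```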

  13. Multidisciplinary systems engineering architecting the design process

    CERN Document Server

    Crowder, James A; Demijohn, Russell

    2016-01-01

    This book presents Systems Engineering from a modern, multidisciplinary engineering approach, providing the understanding that all aspects of systems design (systems, software, test, security, maintenance and the full life-cycle) must be factored into any large-scale system design up front, not added later. It lays out a step-by-step approach to systems-of-systems architectural design, describing in detail the documentation flow throughout the systems engineering design process. It provides a straightforward look at the entire systems engineering process, providing realistic case studies, examples, and design problems that will enable students to gain a firm grasp on the fundamentals of modern systems engineering. Included is a comprehensive design problem that weaves throughout the entire textbook, concluding with a complete top-level systems architecture for a real-world design problem.

  14. The Farm Processing System at CDF

    Institute of Scientific and Technical Information of China (English)

    JaroslayAntos; MarianBabik; 等

    2001-01-01

    At Fermilab's CDF farm, a modular and highly scalable software and control system for processing, reprocessing, Monte Carlo generation and many other tasks has been created. The system is called FPS (Farm Processing System). It consists of independent software components and allows modifications to suit other types of processing as well. FPS is accompanied by fully featured monitoring and control interfaces, including web statistics displays and a multiplatform Java control interface that allows easy management and control. The system also features automatic error recovery procedures with early warnings that allow smooth running. A general overview of the software design along with a description of the features and limitations of the system and its components will be presented. Run 2 experience with the system will be given as well.

  15. Image Processing in Intelligent Medical Robotic Systems

    Directory of Open Access Journals (Sweden)

    Shashev Dmitriy

    2016-01-01

    The paper deals with the use of high-performance computing systems with a parallel-operation architecture in intelligent medical systems. A medical robotic system based on a computer vision system is an automatic control system with strict requirements, such as high reliability, accuracy and speed of performance. The paper shows the basic block diagram of an automatic control system based on a computer vision system. The author considers the possibility of using a reconfigurable computing environment in such systems. The design principles of the reconfigurable computing environment allow the reliability, accuracy and performance of the whole system to be improved many times over. The article contains a brief overview and the theory of the research, and demonstrates the use of reconfigurable computing environments for image preprocessing, namely morphological image processing operations. It presents the results of a successful simulation of the reconfigurable computing environment and the implementation of the morphological image processing operations on a test image in MATLAB Simulink.

  16. The message processing and distribution system development

    Science.gov (United States)

    Whitten, K. L.

    1981-06-01

    A historical approach is used in presenting the life cycle development of the Navy's message processing and distribution system beginning with the planning phase and ending with the integrated logistic support phase. Several maintenance problems which occurred after the system was accepted for fleet use were examined to determine if they resulted from errors in the acquisition process. The critical decision points of the acquisition process are examined and constructive recommendations are made for avoiding the problems which hindered the successful development of this system.

  17. Design approaches for the X band LLRF system

    Energy Technology Data Exchange (ETDEWEB)

    Mavric, Uros, E-mail: uros@i-tech.si [Instrumentation Technologies, Velika pot 22, Solkan 5250 (Slovenia); Jug, Gasper [Instrumentation Technologies, Velika pot 22, Solkan 5250 (Slovenia)

    2011-11-21

    The low-level RF (LLRF) system regulates disturbances over a limited bandwidth in accordance with its capabilities and the RF loop parameters. The disturbances usually originate in the RF system or can be coupled to the RF system from the environment. In this paper a general overview of the possible design approaches for a digital LLRF system operating in X band is presented. Firstly, the possible design approaches of the RF front/back ends are presented and reviewed. We also define the main design parameters for the RF front/back ends. Parameters like isolation between channels, noise, gain, linearity and number of IF stages are put into the perspective of machines using RF components in the X band. An important part of the LLRF system is the local RF timing generation and distribution, which is also treated in the paper. In the second part of the paper the main design approaches in the digital signal processing part of the LLRF system are presented. The emphasis is on the algorithms that are machine specific. Some standard processing algorithms like adaptive feed-forward and arbitrary shaping of feed-forward pulses are presented. Finally, a suggestion for the X band LLRF design is given.

  18. A dynamically reconfigurable data stream processing system

    Energy Technology Data Exchange (ETDEWEB)

    Nogiec, J.M.; Trombly-Freytag, K.; /Fermilab

    2004-11-01

    This paper describes a component-based framework for data stream processing that allows for configuration, tailoring, and runtime system reconfiguration. The system's architecture is based on a pipes and filters pattern, where data is passed through routes between components. A network of pipes and filters can be dynamically reconfigured in response to a preplanned sequence of processing steps, operator intervention, or a change in one or more data streams. This framework provides several mechanisms supporting dynamic reconfiguration and can be used to build static data stream processing applications such as monitoring or data acquisition systems, as well as self-adjusting systems that can adapt their processing algorithm, presentation layer, or data persistency layer in response to changes in input data streams.
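
    A minimal sketch of the pipes-and-filters idea with runtime rewiring is given below; it is illustrative only, and the class and method names are hypothetical rather than the framework's actual API.

```python
# Minimal pipes-and-filters sketch in the spirit of the framework described
# above (illustrative only). Filters are chained through routes, and the
# network can be rewired while data keeps flowing.
class Filter:
    def __init__(self, func):
        self.func = func
        self.outputs = []           # downstream filters (the "pipes")

    def connect(self, other):
        self.outputs.append(other)

    def disconnect(self, other):
        self.outputs.remove(other)

    def push(self, item):
        result = self.func(item)
        if result is None:
            return                   # filter consumed or dropped the item
        for downstream in self.outputs:
            downstream.push(result)


if __name__ == "__main__":
    source = Filter(lambda x: x)
    scale = Filter(lambda x: 2 * x)
    sink_a = Filter(lambda x: print("A:", x))
    sink_b = Filter(lambda x: print("B:", x))

    source.connect(scale)
    scale.connect(sink_a)
    source.push(1)                   # prints "A: 2"

    # Dynamic reconfiguration: reroute the scaled stream to a new sink.
    scale.disconnect(sink_a)
    scale.connect(sink_b)
    source.push(2)                   # prints "B: 4"
```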

  19. Human-Systems Integration Processes (HSIP) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — In FY12, this project removed the commercial-specific content from the Commercial Human-Systems Integration Design Processes (CHSIP), identified gaps in the...

  20. A plasma process monitor/control system

    Energy Technology Data Exchange (ETDEWEB)

    Stevenson, J.O.; Ward, P.P.; Smith, M.L. [Sandia National Labs., Albuquerque, NM (United States); Markle, R.J. [Advanced Micro Devices, Inc., Austin, TX (United States)

    1997-08-01

    Sandia National Laboratories has developed a system to monitor plasma processes for control of industrial applications. The system is designed to act as a fully automated, stand-alone process monitor during printed wiring board and semiconductor production runs. The monitor routinely performs data collection, analysis, process identification, and error detection/correction without the need for human intervention. The monitor can also be used in research mode to allow process engineers to gather additional information about plasma processes. The plasma monitor can perform real-time control of support systems known to influence plasma behavior. The monitor can also signal personnel to modify plasma parameters when the system is operating outside of desired specifications and requires human assistance. A notification protocol can be selected for conditions detected in the plasma process. The Plasma Process Monitor/Control System consists of a computer running software developed by Sandia National Laboratories, a commercially available spectrophotometer equipped with a charge-coupled device camera, an input/output device, and a fiber optic cable.
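
    As an illustration only (the record does not describe the software internals), the sketch below mimics a stand-alone monitor that checks a measured plasma quantity against specification limits and invokes a notification callback when the process runs out of spec; all names and thresholds are hypothetical.

```python
# Hypothetical sketch of an automated process monitor: each sample of a plasma
# quantity (e.g. an emission-line intensity from the spectrometer) is checked
# against specification limits, and out-of-spec conditions trigger a
# configurable notification callback. Not the Sandia system's actual code.
from dataclasses import dataclass
from typing import Callable, Iterable


@dataclass
class SpecLimit:
    name: str
    low: float
    high: float

    def check(self, value):
        return self.low <= value <= self.high


def monitor(samples: Iterable[float], limit: SpecLimit,
            notify: Callable[[str], None]):
    """Run through samples, returning True if the whole run stayed in spec."""
    in_spec = True
    for i, value in enumerate(samples):
        if not limit.check(value):
            in_spec = False
            notify(f"sample {i}: {limit.name}={value:.1f} outside "
                   f"[{limit.low}, {limit.high}]")
    return in_spec


if __name__ == "__main__":
    line_intensity = [1010.0, 995.0, 1200.0, 1005.0]   # simulated readings
    limit = SpecLimit("emission line intensity", 950.0, 1100.0)
    ok = monitor(line_intensity, limit, notify=lambda msg: print("ALERT:", msg))
    print("run in spec:", ok)
```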

  1. Teaching Information Systems Development via Process Variants

    Science.gov (United States)

    Tan, Wee-Kek; Tan, Chuan-Hoo

    2010-01-01

    Acquiring the knowledge to assemble an integrated Information System (IS) development process that is tailored to the specific needs of a project has become increasingly important. It is therefore necessary for educators to impart to students this crucial skill. However, Situational Method Engineering (SME) is an inherently complex process that…

  2. Interactive data-processing system for metallurgy

    Science.gov (United States)

    Rathz, T. J.

    1978-01-01

    Equipment indicates that system can rapidly and accurately process metallurgical and materials-processing data for wide range of applications. Advantages include increase in contrast between areas on image, ability to analyze images via operator-written programs, and space available for storing images.

  3. Agents-based distributed processes control systems

    Directory of Open Access Journals (Sweden)

    Adrian Gligor

    2011-12-01

    Large industrial distributed systems have seen remarkable development in recent years. We may note an increase in their structural and functional complexity, at the same time as an increase in the requirements placed on them. These are some of the reasons why numerous research efforts, energy and resources are devoted to solving problems related to these types of systems. The paper addresses the issue of industrial distributed systems, with special attention given to distributed industrial process control systems. A solution for a distributed process control system based on mobile intelligent agents is presented. The main objective of the proposed system is to provide an optimal solution in terms of costs, maintenance, reliability and flexibility. The paper focuses on the requirements, architecture, functionality and advantages brought by the proposed solution.

  4. Process monitoring using ultrasonic sensor systems.

    Science.gov (United States)

    Henning, Bernd; Rautenberg, Jens

    2006-12-22

    Continuous in-line measurement of substance concentration in liquid mixtures is valuable in improving industrial processes in terms of material properties, energy efficiency and process safety. Ultrasonic sensor systems meet the practical requirements of a chemical sensor quite well. Currently ultrasonic sensor systems are widely used as acoustic chemical sensors to measure concentration of selected substances or to monitor the course of polymerisation, crystallisation or fermentation processes. Useable acoustic properties for the characterisation of liquid mixtures are sound velocity, sound absorption and acoustic impedance. This contribution will give a short overview of the state of the art and several trends for the use of ultrasonic sensor systems in process applications. Novel investigations show the very promising possibility to analyse liquid multi-phase mixtures like suspensions, emulsions and dispersions.

  5. Power systems signal processing for smart grids

    CERN Document Server

    Ribeiro, Paulo Fernando; Ribeiro, Paulo Márcio; Cerqueira, Augusto Santiago

    2013-01-01

    With special relation to smart grids, this book provides clear and comprehensive explanation of how Digital Signal Processing (DSP) and Computational Intelligence (CI) techniques can be applied to solve problems in the power system. Its unique coverage bridges the gap between DSP, electrical power and energy engineering systems, showing many different techniques applied to typical and expected system conditions with practical power system examples. Surveying all recent advances on DSP for power systems, this book enables engineers and researchers to understand the current state of the art a

  6. Air conditioning for data processing system areas

    Directory of Open Access Journals (Sweden)

    Hernando Camacho García

    2011-06-01

    The appropriate selection of air conditioners for data processing system areas requires knowledge of the environmental design conditions, the air conditioning systems successfully used in computer rooms, and the cooling loads to handle. This work contains information about a wide variety of systems designed for computer room applications. A complete example of the calculation to determine the amount of heat to be removed for satisfactory operation is also included.

  7. Waste receiving and processing plant control system; system design description

    Energy Technology Data Exchange (ETDEWEB)

    LANE, M.P.

    1999-02-24

    The Plant Control System (PCS) is a heterogeneous computer system composed of numerous sub-systems. The PCS represents every major computer system that is used to support operation of the Waste Receiving and Processing (WRAP) facility. This document, the System Design Description (PCS SDD), includes several chapters and appendices. Each chapter is devoted to a separate PCS sub-system. Typically, each chapter includes an overview description of the system, a list of associated documents related to operation of that system, and a detailed description of relevant system features. Each appendix provides configuration information for selected PCS sub-systems. The appendices are designed as separate sections to assist in maintaining this document due to frequent changes in system configurations. This document is intended to serve as the primary reference for configuration of PCS computer systems. The use of this document is further described in the WRAP System Configuration Management Plan, WMH-350, Section 4.1.

  8. Information processing systems, reasoning modules, and reasoning system design methods

    Energy Technology Data Exchange (ETDEWEB)

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2015-08-18

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.
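
    A hypothetical sketch of the dispatch pattern described in this and the two related records is given below: abstractions in a semantic graph carry an ontology classification type, and a registry routes each abstraction to the reasoning modules configured for that type. It is an illustration only, not the patented design.

```python
# Illustrative sketch (not the patented design): abstractions in working
# memory carry an ontology classification type, and a registry dispatches each
# abstraction to the reasoning modules registered for that type.
from collections import defaultdict
from dataclasses import dataclass


@dataclass
class Abstraction:
    individual: str
    classification_type: str       # e.g. "Person" or "Event" in some ontology


class ReasoningSystem:
    def __init__(self):
        self._modules = defaultdict(list)

    def register(self, classification_type, module):
        """module: a callable taking an Abstraction and returning a conclusion."""
        self._modules[classification_type].append(module)

    def process(self, working_memory):
        conclusions = []
        for abstraction in working_memory:
            for module in self._modules[abstraction.classification_type]:
                conclusions.append(module(abstraction))
        return conclusions


if __name__ == "__main__":
    system = ReasoningSystem()
    system.register("Person", lambda a: f"{a.individual} is an agent")
    system.register("Event", lambda a: f"{a.individual} occurred")
    memory = [Abstraction("alice", "Person"), Abstraction("login-42", "Event")]
    print(system.process(memory))
```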

  9. Information processing systems, reasoning modules, and reasoning system design methods

    Science.gov (United States)

    Hohimer, Ryan E; Greitzer, Frank L; Hampton, Shawn D

    2014-03-04

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  10. Information processing systems, reasoning modules, and reasoning system design methods

    Energy Technology Data Exchange (ETDEWEB)

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2016-08-23

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  11. Architecture for Survivable System Processing (ASSP)

    Science.gov (United States)

    Wood, Richard J.

    1991-11-01

    The Architecture for Survivable System Processing (ASSP) Program is a multi-phase effort to implement Department of Defense (DOD) and commercially developed high-tech hardware, software, and architectures for reliable space avionics and ground based systems. System configuration options provide processing capabilities to address Time Dependent Processing (TDP), Object Dependent Processing (ODP), and Mission Dependent Processing (MDP) requirements through Open System Architecture (OSA) alternatives that allow for the enhancement, incorporation, and capitalization of a broad range of development assets. High technology developments in hardware, software, and networking models address the technology challenges of long processor lifetimes, fault tolerance, reliability, throughput, memories, radiation hardening, size, weight, power (SWAP) and security. Hardware and software design, development, and implementation focus on the interconnectivity/interoperability of an open system architecture and are being developed to apply new technology to practical OSA components. To ensure a widely acceptable architecture capable of interfacing with various commercial and military components, this program provides for regular interactions with standardization working groups, e.g., the International Standards Organization (ISO), American National Standards Institute (ANSI), Society of Automotive Engineers (SAE), and Institute of Electrical and Electronic Engineers (IEEE). Selection of a viable open architecture is based on the widely accepted standards that implement the ISO/OSI Reference Model.

  12. Improving industrial process control systems security

    CERN Document Server

    Epting, U; CERN. Geneva. TS Department

    2004-01-01

    System providers today create process control systems based on remote connectivity using internet technology, effectively exposing these systems to the same threats as corporate computers. It is becoming increasingly difficult and costly to patch and maintain the technical infrastructure of monitoring and control systems to remove these vulnerabilities. A strategy including risk assessment, security policy issues, and service level agreements between the IT department and the controls engineering groups must be defined. In addition, an increased awareness of IT security in the controls system engineering domain is needed. As a consequence of these new factors, control system architectures have to take security requirements into account, which often have an impact on operational aspects as well as on project and maintenance costs. Manufacturers of industrial control system equipment are, however, also progressively proposing security related solutions that can be used for our active projects. The paper discusses ...

  13. Data processing system of GA and PPPL

    Energy Technology Data Exchange (ETDEWEB)

    Oshima, Takayuki [Japan Atomic Energy Research Inst., Naka, Ibaraki (Japan). Naka Fusion Research Establishment

    2001-11-01

    Results of a 1997 research visit to General Atomics (GA) and the Princeton Plasma Physics Laboratory (PPPL) are reported. The author visited the computer system of the fusion group at GA, joined the DIII-D tokamak experiment, in particular the demonstration of remote experimentation within the U.S., and investigated the data processing system of DIII-D, the computer network, and related facilities. After the visit to GA, he visited PPPL and exchanged information about remote-experiment equipment between JAERI and PPPL under the US-Japan fusion energy research cooperation. He also investigated the data processing system of the TFTR tokamak, the computer network, and so on. Results of a second visit to GA in 2000 are also reported, which describe the rapid progress of each piece of data processing equipment, driven by advances in computer technology, in just three years. (author)

  14. Integrated Virtual Assembly Process Planning System

    Institute of Scientific and Technical Information of China (English)

    LIU Jianhua; HOU Weiwei; SHANG Wei; NING Ruxin

    2009-01-01

    Assembly process planning (APP) for complicated products is time-consuming and difficult with conventional methods. Virtual assembly process planning (VAPP) provides engineers with a new and efficient approach. Previous studies of VAPP have been largely isolated and dispersed, and have not established a complete understanding or discussed the key realization techniques of VAPP from a systemic and integrated view. The integrated virtual assembly process planning (IVAPP) system is a new virtual-reality-based engineering application, which offers engineers an efficient, intuitive, immersive and integrated method for assembly process planning in a virtual environment. Based on an analysis of the information integration requirements of VAPP, the architecture of IVAPP is proposed. Through the integrated structure, the IVAPP system can realize information integration and workflow control. In order to model the assembly process in IVAPP, a hierarchical assembly task list (HATL) is presented, in which different assembly tasks for assembling different components are organized into a hierarchical list. A process-oriented automatic geometrical constraint recognition algorithm (AGCR) is proposed, so that geometrical constraints between components can be automatically recognized during interactive assembly. At the same time, a progressive hierarchical reasoning (PHR) model is discussed. AGCR and PHR greatly reduce the interactive workload. A discrete control node model (DCNM) for cable harness assembly planning in IVAPP is detailed. DCNM converts a cable harness into continuous flexed line segments connected by a series of section center points, and designers can realize cable harness planning by controlling those nodes. Mechanical assemblies (such as the transmission case and engine of an automobile) are used to illustrate the feasibility of the proposed method and algorithms. The application of the IVAPP system reveals advantages over the traditional assembly process planning method.

  15. Expert systems in the process industries

    Science.gov (United States)

    Stanley, G. M.

    1992-01-01

    This paper gives an overview of industrial applications of real-time knowledge based expert systems (KBES's) in the process industries. After a brief overview of the features of a KBES useful in process applications, the general roles of KBES's are covered. A particular focus is diagnostic applications, one of the major applications areas. Many applications are seen as an expansion of supervisory control. The lessons learned from numerous online applications are summarized.

  16. Arabic Natural Language Processing System Code Library

    Science.gov (United States)

    2014-06-01

    This technical note provides a brief description of a Java library for Arabic natural language processing (NLP) containing code for training and applying the Arabic NLP system described in the paper "A Cross-Task Flexible Transition Model for Arabic Tokenization, Affix ...".

  17. Modification of the kiln back-end dust collection system of a 700 t/d rotary kiln

    Institute of Scientific and Technical Information of China (English)

    刘忠东; 钱晓露

    2003-01-01

    The 700 t/d clinker dry-process production line at the Yanshan Cement Plant is equipped at the kiln back end with a Φ6 m × 20 m gas-conditioning (humidifying) tower and a WY-70 three-field electrostatic precipitator. From its commissioning in 1987 until the modification, the precipitator could not reliably meet the emission standard. To prepare for the 2008 Olympic Games and satisfy the new environmental protection requirements, the original kiln back-end precipitator urgently needed to be modified.

  18. The development of technical database of advanced spent fuel management process

    Energy Technology Data Exchange (ETDEWEB)

    Ro, Seung Gy; Byeon, Kee Hoh; Song, Dae Yong; Park, Seong Won; Shin, Young Jun

    1999-03-01

    The purpose of this study is to develop a technical database system that provides useful information to researchers who study the back end of the nuclear fuel cycle. A technical database of the advanced spent fuel management process was developed as a prototype system in 1997. In 1998, this database system was improved into a multi-user system, and a special database composed of thermochemical formation data and reaction data was appended. In this report, the detailed specification of the system design is described and the operating methods are illustrated as a user's manual. The report also serves as a useful reference for expanding the current system or interfacing it with other systems. (Author). 10 refs., 18 tabs., 46 figs.

  19. System Runs Analysis with Process Mining

    Directory of Open Access Journals (Sweden)

    S. A. Shershakov

    2015-01-01

    Full Text Available Information systems (IS) produce numerous traces and logs at runtime. In the context of SOA-based (service-oriented architecture) IS, these logs contain details about sequences of process and service calls. Modern application monitoring and error tracking tools provide only rather straightforward log search and filtering functionality. However, "clever" analysis of the logs is highly useful, since it can provide valuable insights into the system architecture and the interaction of business domains and services. Here we took event logs (trace data) of runs of a big booking system and discovered architectural guideline violations and common anti-patterns. We applied mature process mining techniques for the discovery and analysis of these logs. The aims of process mining are to discover, analyze, and improve processes on the basis of IS behavior recorded as event logs. In several specific examples, we show successful applications of process mining to system runtime analysis and motivate further research in this area. The article is published in the authors' wording.
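    As a flavor of the analysis described above, the sketch below derives a directly-follows relation, one of the basic artifacts of process mining, from per-case event logs. The traces and activity names are invented for illustration and are not the booking-system data analyzed in the article.

      # Minimal process-mining sketch: count directly-follows pairs in per-case traces.
      from collections import Counter

      # Hypothetical trace data: case id -> ordered list of process/service calls
      event_log = {
          "case-1": ["search", "book", "pay", "confirm"],
          "case-2": ["search", "book", "cancel"],
          "case-3": ["search", "pay", "confirm"],   # skips "book": a possible violation
      }

      directly_follows = Counter()
      for trace in event_log.values():
          for a, b in zip(trace, trace[1:]):
              directly_follows[(a, b)] += 1

      for (a, b), count in sorted(directly_follows.items()):
          print(f"{a} -> {b}: {count}")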

  20. Telemedicine optoelectronic biomedical data processing system

    Science.gov (United States)

    Prosolovska, Vita V.

    2010-08-01

    The telemedicine optoelectronic biomedical data processing system was created to share medical information for the control of health rights and for timely, rapid response to crises. The system includes the following main blocks: a bioprocessor, an analog-to-digital converter for biomedical images, an optoelectronic module for image processing, an optoelectronic module for parallel recording and storage of biomedical images, and a matrix screen for displaying biomedical images. The rated temporal characteristics of the blocks are determined by the particular triggering optoelectronic couple in the analog-to-digital converters and by the imaging time of the matrix screen. The element base for the hardware implementation of the developed matrix screen is integrated optoelectronic couples produced by selective epitaxy.

  1. Process Identification through Test on Cryogenic System

    CERN Document Server

    Pezzetti, M; Chadli, M; Coppier, H

    2008-01-01

    UNICOS (UNified Industrial Control System) is the CERN object-based control standard for the cryogenics of the LHC and its experiments. It includes a variety of embedded functions dedicated to the specific cryogenic processes. To enlarge the capabilities of the standard, it is proposed to integrate the parametrical identification step into the control system of large-scale cryogenic plants. Different methods of parametrical identification have been tested and the results were combined to obtain a better model. The main objective of the work is to find a compromise between an easy-to-use solution and a good process identification model. The study focuses on the identification protocol for large delayed systems, measurement consistency, and the correlation between different inputs and outputs. Furthermore, the paper describes in detail the results and the tests carried out in parametrical identification investigations with large-scale systems.

  2. Haptic teleoperation systems signal processing perspective

    CERN Document Server

    Lee, Jae-young

    2015-01-01

    This book examines the signal processing perspective in haptic teleoperation systems. This text covers the topics of prediction, estimation, architecture, data compression, and error correction that can be applied to haptic teleoperation systems. The authors begin with an overview of haptic teleoperation systems, then look at a Bayesian approach to haptic teleoperation systems. They move on to a discussion of haptic data compression, haptic data digitization and forward error correction. The book presents haptic data prediction/estimation methods that compensate for unreliable networks; discusses haptic data compression, which reduces haptic data size over limited network bandwidth, and haptic data error correction, which compensates for the packet loss problem; and provides signal processing techniques used with existing control architectures.

  3. The IUE Final Archive Processing System

    Science.gov (United States)

    Imhoff, C. L.; Dunn, N.; Fireman, G. F.; Levay, K. L.; Meylan, T.; Nichols, J.; Michalitsianos, A.

    1993-12-01

    The IUE Project has begun the task of reprocessing all IUE data using significantly enhanced reduction algorithms and calibrations. In order to perform this task in a timely, reliable manner, we have developed the IUE Final Archive Processing System. The system runs on a DECstation 5000, using Fortran software embedded in portable MIDAS. The processing queue is driven by a commercial relational database. The database interface allows the system to access the enhanced IUE database, which is resident on a second DECstation 5000 (see poster by Levay et al.). The system runs automatically, with little operator intervention. Built-in quality assurance software detects virtually all input or processing problems. In addition, a fraction of the images, including all those with quality assurance warnings, are screened by the staff. The screening system, known as the Post-Production Verification (PPV) system, uses a widget-based graphics user interface written in IDL. It allows one to display and inspect the MIDAS and FITS files, review the FITS headers and other text files, and record the results in the IUE database. Images which have passed quality assurance are then delivered to NASA's National Space Science Data Center, which makes the data available to the astronomical community. This work has been supported under NASA contract NAS5-31230 to Computer Sciences Corp.

  4. Features, Events, and Processes: system Level

    Energy Technology Data Exchange (ETDEWEB)

    D. McGregor

    2004-10-15

    The purpose of this analysis report is to evaluate and document the inclusion or exclusion of the system-level features, events, and processes (FEPs) with respect to modeling used to support the total system performance assessment for the license application (TSPA-LA). A screening decision, either Included or Excluded, is given for each FEP along with the technical basis for screening decisions. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.113 (d, e, and f) (DIRS 156605). The system-level FEPs addressed in this report typically are overarching in nature, rather than being focused on a particular process or subsystem. As a result, they are best dealt with at the system level rather than addressed within supporting process-level or subsystem-level analyses and models reports. The system-level FEPs also tend to be directly addressed by regulations, guidance documents, or assumptions listed in the regulations; or are addressed in background information used in development of the regulations. For included FEPs, this analysis summarizes the implementation of the FEP in the TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from the TSPA-LA (i.e., why the FEP is excluded). The initial version of this report (Revision 00) was developed to support the total system performance assessment for site recommendation (TSPA-SR). This revision addresses the license application (LA) FEP List (DIRS 170760).

  5. Currency Recognition System Using Image Processing

    Directory of Open Access Journals (Sweden)

    S. M. Saifullah

    2015-11-01

    Full Text Available In the last few years, with great technological advances in color printing, duplicating and scanning, counterfeiting problems have become more serious. In the past only an authorized printing house had the ability to make currency paper, but nowadays it is possible for anyone to print fake banknotes with the help of modern technology such as a computer and a laser printer. Fake notes are a burning question in almost every country. Like other countries, Bangladesh has also been hit really hard, and counterfeiting has become a very acute problem. Therefore there is a need to design a currency recognition system that can easily distinguish between real and fake banknotes without a time-consuming process. Our system describes an approach for verification of Bangladeshi currency banknotes. The currency is verified using image processing techniques. The approach consists of a number of components including image processing, image segmentation, feature extraction, and image comparison. The system is designed in MATLAB. Image processing involves changing the nature of an image in order to improve its pictorial information for human interpretation. The image processing software is a collection of functions that extends the capability of the MATLAB numeric computing environment. The result is whether the currency is real or fake.
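    As a rough sketch of the comparison step described above (the system itself is implemented in MATLAB and is not reproduced here), a scanned note can be compared against a reference by correlating intensity histograms; the NumPy code below uses synthetic arrays in place of real banknote scans.

      # Minimal, assumed approach: histogram correlation between a reference note and a candidate.
      import numpy as np

      def histogram(img, bins=32):
          hist, _ = np.histogram(img, bins=bins, range=(0, 255), density=True)
          return hist

      def similarity(reference, candidate):
          """Pearson correlation of intensity histograms; values near 1.0 mean similar content."""
          r, c = histogram(reference), histogram(candidate)
          return float(np.corrcoef(r, c)[0, 1])

      # Synthetic stand-ins for scanned banknote images (grayscale values 0..255)
      rng = np.random.default_rng(0)
      genuine = rng.integers(0, 256, size=(100, 200))
      suspect = np.clip(genuine + rng.normal(0, 5, size=genuine.shape), 0, 255)

      print("genuine vs suspect similarity:", round(similarity(genuine, suspect), 3))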

  6. System and process for biomass treatment

    Science.gov (United States)

    Dunson, Jr., James B; Tucker, III, Melvin P; Elander, Richard T; Lyons, Robert C

    2013-08-20

    A system including an apparatus is presented for the treatment of biomass that allows successful treatment at a high solids dry weight of biomass in the biomass mixture. The design of the system provides extensive distribution of a reactant by spreading the reactant over the biomass as the reactant is introduced through an injection lance, while the biomass is rotated using baffles. The apparatus is also designed to provide extensive assimilation of the reactant into the biomass, using baffles to lift and drop the biomass, as well as attrition media which fall onto the biomass, to enhance the treatment process.

  7. Emulsification with microstructured systems : process principles

    NARCIS (Netherlands)

    Zwan, van der E.A.

    2008-01-01

    The aim of this thesis is to elucidate the underlying processes and mechanisms that determine the droplet size of emulsions produced with microstructured systems, such as premix microstructure homogenization and microchannel emulsification. The ultimate goal is to describe these methods based on det

  8. Microprocessor systems for industrial process control

    Science.gov (United States)

    Lesh, F. H.

    1980-01-01

    Six computers operate synchronously and are interconnected by three independent data buses. Each processor controls one subsystem. Some can control buses to transfer data at 1 megabit per second. Every 2.5 msec each processor examines a list of things to do during the next interval. This spacecraft control system could be adapted for controlling complex industrial processes.

  9. Color Image Processing and Object Tracking System

    Science.gov (United States)

    Klimek, Robert B.; Wright, Ted W.; Sielken, Robert S.

    1996-01-01

    This report describes a personal computer based system for automatic and semiautomatic tracking of objects on film or video tape, developed to meet the needs of the Microgravity Combustion and Fluids Science Research Programs at the NASA Lewis Research Center. The system consists of individual hardware components working under computer control to achieve a high degree of automation. The most important hardware components include 16-mm and 35-mm film transports, a high resolution digital camera mounted on a x-y-z micro-positioning stage, an S-VHS tapedeck, an Hi8 tapedeck, video laserdisk, and a framegrabber. All of the image input devices are remotely controlled by a computer. Software was developed to integrate the overall operation of the system including device frame incrementation, grabbing of image frames, image processing of the object's neighborhood, locating the position of the object being tracked, and storing the coordinates in a file. This process is performed repeatedly until the last frame is reached. Several different tracking methods are supported. To illustrate the process, two representative applications of the system are described. These applications represent typical uses of the system and include tracking the propagation of a flame front and tracking the movement of a liquid-gas interface with extremely poor visibility.

  10. Small Interactive Image Processing System (SMIPS) system description

    Science.gov (United States)

    Moik, J. G.

    1973-01-01

    The Small Interactive Image Processing System (SMIPS) operates under control of the IBM-OS/MVT operating system and uses an IBM-2250 model 1 display unit as interactive graphic device. The input language in the form of character strings or attentions from keys and light pen is interpreted and causes processing of built-in image processing functions as well as execution of a variable number of application programs kept on a private disk file. A description of design considerations is given and characteristics, structure and logic flow of SMIPS are summarized. Data management and graphic programming techniques used for the interactive manipulation and display of digital pictures are also discussed.

  11. A process planning system for cold extrusion

    Directory of Open Access Journals (Sweden)

    SANTOSH KUMAR,

    2010-12-01

    Full Text Available A process planning system, ProEx-Cold, has been developed for extrusion shapes to eliminate the tedious and expensive trial-and-correction procedure of arriving at a proper die and process. The system has three modules: feature recognition, upper-bound analysis, and 3D graphics generation and display using the OpenGL application engine. The input parameters to the proposed CAPP system include die type, billet type and material, geometrical details of the product, ram speed, reduction, friction condition, billet condition, etc., which influence parameters like production rate and extrusion ram pressure. C programming, OpenGL graphics and the Visual C++ editor have been used to implement ProEx-Cold.

  12. Model systems for life processes on Mars

    Science.gov (United States)

    Mitz, M. A.

    1974-01-01

    In the evolution of life forms nonphotosynthetic mechanisms are developed. The question remains whether a total life system could evolve which is not dependent upon photosynthesis. In trying to visualize life on other planets, the photosynthetic process has problems. On Mars, the high intensity of light at the surface is a concern and alternative mechanisms need to be defined and analyzed. In the UV search for alternate mechanisms, several different areas may be identified. These involve activated inorganic compounds in the atmosphere, such as the products of photodissociation of carbon dioxide and the organic material which may be created by natural phenomena. In addition, a life system based on the pressure of the atmospheric constituents, such as carbon dioxide, is a possibility. These considerations may be important for the understanding of evolutionary processes of life on another planet. Model systems which depend on these alternative mechanisms are defined and related to presently planned and future planetary missions.

  13. Process simulation in digital camera system

    Science.gov (United States)

    Toadere, Florin

    2012-06-01

    The goal of this paper is to simulate the functionality of a digital camera system. The simulations cover the conversion from light to numerical signal and the color processing and rendering. We consider the image acquisition system to be linear shift invariant and axial. The light propagation is orthogonal to the system. We use a spectral image processing algorithm in order to simulate the radiometric properties of a digital camera. In the algorithm we take into consideration the transmittances of the light source, lenses and filters, and the quantum efficiency of a CMOS (complementary metal oxide semiconductor) sensor. The optical part is characterized by a multiple convolution between the different point spread functions of the optical components. We use a Cooke triplet, the aperture, the light fall-off and the optical part of the CMOS sensor. The electrical part consists of the Bayer sampling, interpolation, signal-to-noise ratio, dynamic range, analog-to-digital conversion and JPG compression. We reconstruct the noisy blurred image by blending differently exposed images in order to reduce the photon shot noise; we also filter the fixed pattern noise and sharpen the image. Then we have the color processing blocks: white balancing, color correction, gamma correction, and conversion from the XYZ color space to the RGB color space. For the reproduction of color we use an OLED (organic light emitting diode) monitor. The analysis can be useful to assist students and engineers in image quality evaluation and imaging system design. Many other configurations of blocks can be used in our analysis.
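    Two of the color-processing blocks named above, conversion from the XYZ color space to RGB and gamma correction, can be sketched as follows. The matrix is the standard XYZ-to-sRGB matrix and the simple power-law gamma is an assumption for illustration, not the paper's exact pipeline.

      # Sketch of two color-processing blocks: XYZ -> linear sRGB, then gamma encoding.
      import numpy as np

      XYZ_TO_SRGB = np.array([[ 3.2406, -1.5372, -0.4986],
                              [-0.9689,  1.8758,  0.0415],
                              [ 0.0557, -0.2040,  1.0570]])

      def xyz_to_linear_rgb(xyz):
          """xyz: (..., 3) array of CIE XYZ values in [0, 1]; returns clipped linear RGB."""
          return np.clip(xyz @ XYZ_TO_SRGB.T, 0.0, 1.0)

      def gamma_encode(linear_rgb, gamma=2.2):
          """Simple power-law gamma correction applied before display rendering."""
          return np.power(linear_rgb, 1.0 / gamma)

      pixel_xyz = np.array([[0.4124, 0.2126, 0.0193]])   # roughly the XYZ of a pure red pixel
      print(gamma_encode(xyz_to_linear_rgb(pixel_xyz)))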

  14. Sedimentation process and design of settling systems

    CERN Document Server

    De, Alak

    2017-01-01

    This book is designed to serve as a comprehensive source of information on sedimentation processes and the design of settling systems, especially as applied to the design of such systems in civil and environmental engineering. The book begins with an introduction to sedimentation as a whole and goes on to cover the development and details of various settling theories. The book traces the chronological development of the comprehensive knowledge of settling studies and the design of settling systems from 1889. A new concept, the 'Velocity Profile Theorem', a tool for settling problem analysis, has been applied to the analysis of the phenomenon of short circuiting. A complete theory of tube settling has been developed and its application to the computation of residual solids from the assorted solids through the same has been demonstrated. Experimental verification of the tube settling theory has also been presented. A field-oriented, compatible design and operation methodology for settling systems has been developed from the detailed...

  15. Computer performance optimization systems, applications, processes

    CERN Document Server

    Osterhage, Wolfgang W

    2013-01-01

    Computing power performance was important at times when hardware was still expensive, because hardware had to be put to the best use. Later on this criterion was no longer critical, since hardware had become inexpensive. Meanwhile, however, people have realized that performance again plays a significant role, because of the major drain on system resources involved in developing complex applications. This book distinguishes between three levels of performance optimization: the system level, application level and business processes level. On each, optimizations can be achieved and cost-cutting p

  16. Smart Parking System with Image Processing Facility

    Directory of Open Access Journals (Sweden)

    M.O. Reza

    2012-04-01

    Full Text Available Smart parking systems obtain information about available parking spaces, process it, and then place the car at a certain position. A prototype of the parking assistance system based on the proposed architecture was constructed here. The adopted hardware, software, and implementation solutions in this prototype construction are described in this paper. An effective circular design is introduced, having a special rack-and-pinion mechanism which is used to lift and place the car in a certain position. The design of the rack-and-pinion mechanism is also simulated using AUTODESK INVENTOR and COMSOL software.

  17. Security of legacy process control systems : Moving towards secure process control systems

    NARCIS (Netherlands)

    Oosterink, M.

    2012-01-01

    This white paper describes solutions which organisations may use to improve the security of their legacy process control systems. When we refer to a legacy system, we generally refer to old methodologies, technologies, computer systems or applications which are still in use, despite the fact that ne

  18. Modeling delayed processes in biological systems

    Science.gov (United States)

    Feng, Jingchen; Sevier, Stuart A.; Huang, Bin; Jia, Dongya; Levine, Herbert

    2016-09-01

    Delayed processes are ubiquitous in biological systems and are often characterized by delay differential equations (DDEs) and their extension to include stochastic effects. DDEs do not explicitly incorporate intermediate states associated with a delayed process but instead use an estimated average delay time. In an effort to examine the validity of this approach, we study systems with significant delays by explicitly incorporating intermediate steps. We show that such explicit models often yield significantly different equilibrium distributions and transition times as compared to DDEs with deterministic delay values. Additionally, different explicit models with qualitatively different dynamics can give rise to the same DDEs revealing important ambiguities. We also show that DDE-based predictions of oscillatory behavior may fail for the corresponding explicit model.
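    The modeling choice discussed above can be illustrated with a minimal sketch that represents a delayed production process either with a fixed lag (DDE-style) or with an explicit chain of intermediate states whose mean transit time equals the delay. Rates, step counts, and the simple Euler integration below are illustrative assumptions, not the paper's models.

      # Sketch: fixed-lag (DDE-style) delay vs. an explicit chain of intermediate states.
      import numpy as np

      def dde_style(delay=5.0, rate=1.0, t_end=30.0, dt=0.01):
          """Mature product appears exactly `delay` time units after production starts."""
          t = np.arange(0.0, t_end, dt)
          return np.where(t > delay, rate * (t - delay), 0.0)

      def explicit_chain(n_steps=10, delay=5.0, rate=1.0, t_end=30.0, dt=0.01):
          """Delay represented as n intermediate states with mean transit time `delay`."""
          k = n_steps / delay                      # per-stage transition rate
          stages = np.zeros(n_steps)
          mature, out = 0.0, []
          for _ in np.arange(0.0, t_end, dt):
              flow = k * stages                    # outflow from each intermediate stage
              inflow = np.concatenate(([rate], flow[:-1]))
              stages = stages + dt * (inflow - flow)
              mature += dt * flow[-1]
              out.append(mature)
          return np.array(out)

      print(dde_style()[-1], explicit_chain()[-1])   # both approach rate * (t_end - delay)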

  19. Systems perspectives on mRNA processing

    Institute of Scientific and Technical Information of China (English)

    Adrienne E McKee; Pamela A Silver

    2007-01-01

    The application of genomic technologies to the study of mRNA processing is increasingly conducted in metazoan organisms in order to understand the complex events that occur during and after transcription. Large-scale systems analyses of mRNA-protein interactions and mRNA dynamics have revealed specificity in mRNA transcription, splicing, transport, translation, and turnover, and have begun to make connections between the different layers of mRNA processing. Here, we review global studies of post-transcriptional processes and discuss the challenges facing our understanding of mRNA regulation in metazoan organisms. In parallel, we examine genome-scale investigations that have expanded our knowledge of RNA-binding proteins and the networks of mRNAs that they regulate.

  20. Fundamentals of process intensification: A process systems engineering view

    DEFF Research Database (Denmark)

    Babi, Deenesh Kavi; Sales Cruz, Alfonso Mauricio; Gani, Rafiqul

    2016-01-01

    at different scales of size, that is, the unit operation scale, the task scale, and the phenomena scale. The roles of process intensification with respect to process improvements and the generation of more sustainable process designs are discussed and questions related to when to apply process intensification...

  1. Graphics Processing Units for HEP trigger systems

    Science.gov (United States)

    Ammendola, R.; Bauce, M.; Biagioni, A.; Chiozzi, S.; Cotta Ramusino, A.; Fantechi, R.; Fiorini, M.; Giagu, S.; Gianoli, A.; Lamanna, G.; Lonardo, A.; Messina, A.; Neri, I.; Paolucci, P. S.; Piandani, R.; Pontisso, L.; Rescigno, M.; Simula, F.; Sozzi, M.; Vicini, P.

    2016-07-01

    General-purpose computing on GPUs (Graphics Processing Units) is emerging as a new paradigm in several fields of science, although so far applications have been tailored to the specific strengths of such devices as accelerator in offline computation. With the steady reduction of GPU latencies, and the increase in link and memory throughput, the use of such devices for real-time applications in high-energy physics data acquisition and trigger systems is becoming ripe. We will discuss the use of online parallel computing on GPU for synchronous low level trigger, focusing on CERN NA62 experiment trigger system. The use of GPU in higher level trigger system is also briefly considered.

  2. Integrated system for automated financial document processing

    Science.gov (United States)

    Hassanein, Khaled S.; Wesolkowski, Slawo; Higgins, Ray; Crabtree, Ralph; Peng, Antai

    1997-02-01

    A system was developed that integrates intelligent document analysis with multiple character/numeral recognition engines in order to achieve high accuracy automated financial document processing. In this system, images are accepted in both their grayscale and binary formats. A document analysis module starts by extracting essential features from the document to help identify its type (e.g. personal check, business check, etc.). These features are also utilized to conduct a full analysis of the image to determine the location of interesting zones such as the courtesy amount and the legal amount. These fields are then made available to several recognition knowledge sources such as courtesy amount recognition engines and legal amount recognition engines through a blackboard architecture. This architecture allows all the available knowledge sources to contribute incrementally and opportunistically to the solution of the given recognition query. Performance results on a test set of machine printed business checks using the integrated system are also reported.
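    The blackboard architecture mentioned above can be pictured as a shared store to which several recognition knowledge sources post candidate readings of a field, after which the best-supported hypothesis is selected. The engine names and confidence values in the sketch below are hypothetical, not the system's actual components.

      # Minimal blackboard sketch: knowledge sources post hypotheses; the best one is selected.
      class Blackboard:
          def __init__(self):
              self.hypotheses = []                     # (field, value, confidence, source)

          def post(self, field, value, confidence, source):
              self.hypotheses.append((field, value, confidence, source))

          def best(self, field):
              candidates = [h for h in self.hypotheses if h[0] == field]
              return max(candidates, key=lambda h: h[2]) if candidates else None

      def courtesy_amount_engine(board, image_zone):
          board.post("courtesy_amount", "125.40", 0.92, "numeral-recognizer")

      def legal_amount_engine(board, image_zone):
          board.post("courtesy_amount", "125.40", 0.81, "legal-amount-cross-check")

      board = Blackboard()
      for engine in (courtesy_amount_engine, legal_amount_engine):
          engine(board, image_zone=None)               # image_zone would hold the extracted field
      print(board.best("courtesy_amount"))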

  3. Graphics Processing Units for HEP trigger systems

    Energy Technology Data Exchange (ETDEWEB)

    Ammendola, R. [INFN Sezione di Roma “Tor Vergata”, Via della Ricerca Scientifica 1, 00133 Roma (Italy); Bauce, M. [INFN Sezione di Roma “La Sapienza”, P.le A. Moro 2, 00185 Roma (Italy); University of Rome “La Sapienza”, P.lee A.Moro 2, 00185 Roma (Italy); Biagioni, A. [INFN Sezione di Roma “La Sapienza”, P.le A. Moro 2, 00185 Roma (Italy); Chiozzi, S.; Cotta Ramusino, A. [INFN Sezione di Ferrara, Via Saragat 1, 44122 Ferrara (Italy); University of Ferrara, Via Saragat 1, 44122 Ferrara (Italy); Fantechi, R. [INFN Sezione di Pisa, Largo B. Pontecorvo 3, 56127 Pisa (Italy); CERN, Geneve (Switzerland); Fiorini, M. [INFN Sezione di Ferrara, Via Saragat 1, 44122 Ferrara (Italy); University of Ferrara, Via Saragat 1, 44122 Ferrara (Italy); Giagu, S. [INFN Sezione di Roma “La Sapienza”, P.le A. Moro 2, 00185 Roma (Italy); University of Rome “La Sapienza”, P.lee A.Moro 2, 00185 Roma (Italy); Gianoli, A. [INFN Sezione di Ferrara, Via Saragat 1, 44122 Ferrara (Italy); University of Ferrara, Via Saragat 1, 44122 Ferrara (Italy); Lamanna, G., E-mail: gianluca.lamanna@cern.ch [INFN Sezione di Pisa, Largo B. Pontecorvo 3, 56127 Pisa (Italy); INFN Laboratori Nazionali di Frascati, Via Enrico Fermi 40, 00044 Frascati (Roma) (Italy); Lonardo, A. [INFN Sezione di Roma “La Sapienza”, P.le A. Moro 2, 00185 Roma (Italy); Messina, A. [INFN Sezione di Roma “La Sapienza”, P.le A. Moro 2, 00185 Roma (Italy); University of Rome “La Sapienza”, P.lee A.Moro 2, 00185 Roma (Italy); and others

    2016-07-11

    General-purpose computing on GPUs (Graphics Processing Units) is emerging as a new paradigm in several fields of science, although so far applications have been tailored to the specific strengths of such devices as accelerator in offline computation. With the steady reduction of GPU latencies, and the increase in link and memory throughput, the use of such devices for real-time applications in high-energy physics data acquisition and trigger systems is becoming ripe. We will discuss the use of online parallel computing on GPU for synchronous low level trigger, focusing on CERN NA62 experiment trigger system. The use of GPU in higher level trigger system is also briefly considered.

  4. Disease processes as hybrid dynamical systems

    Directory of Open Access Journals (Sweden)

    Pietro Liò

    2012-08-01

    Full Text Available We investigate the use of hybrid techniques in complex processes of infectious diseases. Since predictive disease models in biomedicine require a multiscale approach for understanding the molecule-cell-tissue-organ-body interactions, heterogeneous methodologies are often employed for describing the different biological scales. Hybrid models provide effective means for complex disease modelling where the action and dosage of a drug or a therapy could be meaningfully investigated: the infection dynamics can be classically described in a continuous fashion, while the scheduling of multiple treatment discretely. We define an algebraic language for specifying general disease processes and multiple treatments, from which a semantics in terms of hybrid dynamical system can be derived. Then, the application of control-theoretic tools is proposed in order to compute the optimal scheduling of multiple therapies. The potentialities of our approach are shown in the case study of the SIR epidemic model and we discuss its applicability on osteomyelitis, a bacterial infection affecting the bone remodelling system in a specific and multiscale manner. We report that formal languages are helpful in giving a general homogeneous formulation for the different scales involved in a multiscale disease process; and that the combination of hybrid modelling and control theory provides solid grounds for computational medicine.
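    As a minimal sketch of the hybrid idea in this record, the code below integrates continuous SIR dynamics while applying discretely scheduled treatment events that raise the recovery rate. Parameter values and the treatment effect are illustrative assumptions, not the paper's model or its control-theoretic scheduling.

      # Hybrid sketch: continuous SIR dynamics punctuated by discrete treatment events.
      def sir_with_treatments(beta=0.3, gamma=0.1, treatment_days=(20, 40),
                              treatment_boost=0.05, t_end=100.0, dt=0.1):
          s, i, r = 0.99, 0.01, 0.0
          gamma_t = gamma
          history = []
          for step in range(int(t_end / dt)):
              t = step * dt
              # Discrete event: each scheduled treatment raises the recovery rate
              if any(abs(t - day) < dt / 2 for day in treatment_days):
                  gamma_t += treatment_boost
              ds = -beta * s * i
              di = beta * s * i - gamma_t * i
              dr = gamma_t * i
              s, i, r = s + dt * ds, i + dt * di, r + dt * dr
              history.append((t, i))
          return history

      peak_t, peak_i = max(sir_with_treatments(), key=lambda p: p[1])
      print(f"peak infected fraction {peak_i:.3f} at day {peak_t:.1f}")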

  5. Advances in Packaging Methods, Processes and Systems

    Directory of Open Access Journals (Sweden)

    Nitaigour Premchand Mahalik

    2014-10-01

    Full Text Available The food processing and packaging industry is becoming a multi-trillion dollar global business. The reason is that the recent increase in incomes in traditionally less economically developed countries has led to a rise in standards of living that includes a significantly higher consumption of packaged foods. As a result, food safety guidelines have been more stringent than ever. At the same time, the number of research and educational institutions—that is, the number of potential researchers and stakeholders—has increased in the recent past. This paper reviews recent developments in food processing and packaging (FPP, keeping in view the aforementioned advancements and bearing in mind that FPP is an interdisciplinary area in that materials, safety, systems, regulation, and supply chains play vital roles. In particular, the review covers processing and packaging principles, standards, interfaces, techniques, methods, and state-of-the-art technologies that are currently in use or in development. Recent advances such as smart packaging, non-destructive inspection methods, printing techniques, application of robotics and machineries, automation architecture, software systems and interfaces are reviewed.

  6. Stochastic transport processes in discrete biological systems

    CERN Document Server

    Frehland, Eckart

    1982-01-01

    These notes are in part based on a course for advanced students on the applications of stochastic processes held in 1978 at the University of Konstanz. They contain the results of recent studies on the stochastic description of ion transport through biological membranes. In particular, they serve as an introduction to a unified theory of fluctuations in complex biological transport systems. We emphasize that the subject of this volume is not to introduce the mathematics of stochastic processes but to present a field of theoretical biophysics in which stochastic methods are important. In the last years the study of membrane noise has become an important method in biophysics. Valuable information on the ion transport mechanisms in membranes can be obtained from noise analysis. A number of different processes, such as the opening and closing of ion channels, have been shown to be sources of the measured current or voltage fluctuations. Biological transport systems can be complex. For example, the tr...

  7. Electrochemical decontamination system for actinide processing gloveboxes

    Energy Technology Data Exchange (ETDEWEB)

    Wedman, D.E.; Lugo, J.L.; Ford, D.K.; Nelson, T.O.; Trujillo, V.L.; Martinez, H.E.

    1998-03-01

    An electrolytic decontamination technology has been developed and successfully demonstrated at Los Alamos National Laboratory (LANL) for the decontamination of actinide processing gloveboxes. The technique decontaminates the interior surfaces of stainless steel gloveboxes utilizing a process similar to electropolishing. The decontamination device is compact and transportable, allowing it to be placed entirely within the glovebox line. In this way, decontamination does not require the operator to wear any additional personal protective equipment and there is no need for additional air handling or containment systems. Decontamination prior to glovebox decommissioning reduces the potential for worker exposure and environmental releases during the decommissioning, transport, and size reduction procedures which follow. The goal of this effort is to reduce contamination levels of alpha-emitting nuclides for a resultant reduction in waste category from high-level transuranic (TRU) to low specific activity (LSA, less than or equal to 100 nCi/g). This reduction in category results in a 95% reduction in disposal and disposition costs for the decontaminated gloveboxes. The resulting contamination levels following decontamination by this method are generally five orders of magnitude below the LSA specification. Additionally, the sodium sulfate based electrolyte utilized in the process is fully recyclable, which results in a minimum of secondary waste. The process has been implemented on seven gloveboxes within LANL's Plutonium Facility at Technical Area 55. Of these gloveboxes, two have been discarded as low level waste items and the remaining five have been reused.

  8. NASA Human System Risk Assessment Process

    Science.gov (United States)

    Francisco, D.; Romero, E.

    2016-01-01

    NASA utilizes an evidence-based system to perform risk assessments for the human system for spaceflight missions. At the center of this process is the multi-disciplinary Human System Risk Board (HSRB). The HSRB is chartered by the Office of the Chief Health and Medical Officer (OCHMO) at NASA Headquarters. The HSRB reviews all human system risks via an established comprehensive risk and configuration management plan based on a project management approach. The HSRB facilitates the integration of human research (terrestrial and spaceflight), medical operations, occupational surveillance, systems engineering and many other disciplines in a comprehensive review of human system risks. The HSRB considers all factors that influence human risk. These factors include pre-mission considerations such as screening criteria, training, age, sex, and physiological condition; in-mission factors such as available countermeasures, mission duration and location; and post-mission factors such as time to return to baseline (reconditioning), post-mission health screening, and available treatments. All of these factors influence the total risk assessment for each human risk. The HSRB performed a comprehensive review of all potential in-flight medical conditions and events and, over the course of several reviews, consolidated the number of human system risks to 30, where the greatest emphasis is placed for investing program dollars in risk mitigation. The HSRB considers all available evidence from human research, medical operations, and occupational surveillance in assessing the risks for appropriate mitigation and future work. All applicable DRMs (low earth orbit for 6 and 12 months, deep space for 30 days and 1 year, a lunar mission for 1 year, and a planetary mission for 3 years) are considered as human system risks are modified by the hazards associated with space flight such as microgravity, exposure to radiation, distance from the earth, isolation and a closed environment. Each risk has a summary

  9. System and process for upgrading hydrocarbons

    Energy Technology Data Exchange (ETDEWEB)

    Bingham, Dennis N.; Klingler, Kerry M.; Smith, Joseph D.; Turner, Terry D.; Wilding, Bruce M.

    2015-08-25

    In one embodiment, a system for upgrading a hydrocarbon material may include a black wax upgrade subsystem and a molten salt gasification (MSG) subsystem. The black wax upgrade subsystem and the MSG subsystem may be located within a common pressure boundary, such as within a pressure vessel. Gaseous materials produced by the MSG subsystem may be used in the process carried out within the black wax upgrade subsystem. For example, hydrogen may pass through a gaseous transfer interface to interact with black wax feed material to hydrogenate such material during a cracking process. In one embodiment, the gaseous transfer interface may include one or more openings in a tube or conduit which is carrying the black wax material. A pressure differential may control the flow of hydrogen within the tube or conduit. Related methods are also disclosed.

  10. Processing and Linguistics Properties of Adaptable Systems

    Directory of Open Access Journals (Sweden)

    Dumitru TODOROI

    2006-01-01

    Full Text Available A continuation and development of the research on Adaptable Programming Initialization [Tod-05.1,2,3] is presented. As a continuation of [Tod-05.2,3], this paper develops the metalinguistic tools used in the process of introducing new constructions (data, operations, instructions and controls). The generalization schemes for the evaluation of adaptable languages and systems are discussed. These results, analogously with [Tod-05.2,3], were obtained by the team composed of the researchers D. Todoroi [Tod-05.4], Z. Todoroi [ZTod-05], and D. Micusa [Mic-03]. The presented results will be included in the book [Tod-06].

  11. The snow system: A decentralized medical data processing system.

    Science.gov (United States)

    Bellika, Johan Gustav; Henriksen, Torje Starbo; Yigzaw, Kassaye Yitbarek

    2015-01-01

    Systems for large-scale reuse of electronic health record data are claimed to have the potential to transform the current health care delivery system. In principle, three alternative solutions for reuse exist: centralized, data warehouse, and decentralized solutions. This chapter focuses on the decentralized system alternative. Decentralized systems may be categorized into approaches that move data to enable computations or move computations to where the data is located. We describe a system that moves computations to where the data is located. Only this kind of decentralized solution has the capability to become an ideal system for reuse, as the decentralized alternative enables computation and reuse of electronic health record data without moving or exposing the information to outsiders. This chapter describes the Snow system, which is a decentralized medical data processing system, its components, and how it has been used. It also describes the requirements this kind of system needs to support to become sustainable and successful in recruiting voluntary participation from health institutions.

  12. SOFC system with integrated catalytic fuel processing

    Science.gov (United States)

    Finnerty, Caine; Tompsett, Geoff. A.; Kendall, Kevin; Ormerod, R. Mark

    In recent years, there has been much interest in the development of solid oxide fuel cell technology operating directly on hydrocarbon fuels. The development of a catalytic fuel processing system, which is integrated with the solid oxide fuel cell (SOFC) power source is outlined here. The catalytic device utilises a novel three-way catalytic system consisting of an in situ pre-reformer catalyst, the fuel cell anode catalyst and a platinum-based combustion catalyst. The three individual catalytic stages have been tested in a model catalytic microreactor. Both temperature-programmed and isothermal reaction techniques have been applied. Results from these experiments were used to design the demonstration SOFC unit. The apparatus used for catalytic characterisation can also perform in situ electrochemical measurements as described in previous papers [C.M. Finnerty, R.H. Cunningham, K. Kendall, R.M. Ormerod, Chem. Commun. (1998) 915-916; C.M. Finnerty, N.J. Coe, R.H. Cunningham, R.M. Ormerod, Catal. Today 46 (1998) 137-145]. This enabled the performance of the SOFC to be determined at a range of temperatures and reaction conditions, with current output of 290 mA cm -2 at 0.5 V, being recorded. Methane and butane have been evaluated as fuels. Thus, optimisation of the in situ partial oxidation pre-reforming catalyst was essential, with catalysts producing high H 2/CO ratios at reaction temperatures between 873 K and 1173 K being chosen. These included Ru and Ni/Mo-based catalysts. Hydrocarbon fuels were directly injected into the catalytic SOFC system. Microreactor measurements revealed the reaction mechanisms as the fuel was transported through the three-catalyst device. The demonstration system showed that the fuel processing could be successfully integrated with the SOFC stack.

  13. SOFC system with integrated catalytic fuel processing

    Energy Technology Data Exchange (ETDEWEB)

    Finnerty, C.; Tompsett, G.A.; Kendall, K.; Ormerod, R.M. [Birchall Centre for Inorganic Chemistry and Materials Science, Keele Univ. (United Kingdom)

    2000-03-01

    In recent years, there has been much interest in the development of solid oxide fuel cell technology operating directly on hydrocarbon fuels. The development of a catalytic fuel processing system, which is integrated with the solid oxide fuel cell (SOFC) power source is outlined here. The catalytic device utilises a novel three-way catalytic system consisting of an in situ pre-reformer catalyst, the fuel cell anode catalyst and a platinum-based combustion catalyst. The three individual catalytic stages have been tested in a model catalytic microreactor. Both temperature-programmed and isothermal reaction techniques have been applied. Results from these experiments were used to design the demonstration SOFC unit. The apparatus used for catalytic characterisation can also perform in situ electrochemical measurements as described in previous papers [C.M. Finnerty, R.H. Cunningham, K. Kendall, R.M. Ormerod, Chem. Commun. (1998) 915-916; C.M. Finnerty, N.J. Coe, R.H. Cunningham, R.M. Ormerod, Catal. Today 46 (1998) 137-145]. This enabled the performance of the SOFC to be determined at a range of temperatures and reaction conditions, with a current output of 290 mA cm-2 at 0.5 V being recorded. Methane and butane have been evaluated as fuels. Thus, optimisation of the in situ partial oxidation pre-reforming catalyst was essential, with catalysts producing high H2/CO ratios at reaction temperatures between 873 K and 1173 K being chosen. These included Ru and Ni/Mo-based catalysts. Hydrocarbon fuels were directly injected into the catalytic SOFC system. Microreactor measurements revealed the reaction mechanisms as the fuel was transported through the three-catalyst device. The demonstration system showed that the fuel processing could be successfully integrated with the SOFC stack. (orig.)

  14. Onboard Image Processing System for Hyperspectral Sensor

    Directory of Open Access Journals (Sweden)

    Hiroki Hihara

    2015-09-01

    Full Text Available Onboard image processing systems for a hyperspectral sensor have been developed in order to maximize image data transmission efficiency for large volume and high speed data downlink capacity. Since more than 100 channels are required for hyperspectral sensors on Earth observation satellites, fast and small-footprint lossless image compression capability is essential for reducing the size and weight of a sensor system. A fast lossless image compression algorithm has been developed, and is implemented in the onboard correction circuitry of sensitivity and linearity of Complementary Metal Oxide Semiconductor (CMOS sensors in order to maximize the compression ratio. The employed image compression method is based on Fast, Efficient, Lossless Image compression System (FELICS, which is a hierarchical predictive coding method with resolution scaling. To improve FELICS’s performance of image decorrelation and entropy coding, we apply a two-dimensional interpolation prediction and adaptive Golomb-Rice coding. It supports progressive decompression using resolution scaling while still maintaining superior performance measured as speed and complexity. Coding efficiency and compression speed enlarge the effective capacity of signal transmission channels, which lead to reducing onboard hardware by multiplexing sensor signals into a reduced number of compression circuits. The circuitry is embedded into the data formatter of the sensor system without adding size, weight, power consumption, and fabrication cost.

  15. Onboard Image Processing System for Hyperspectral Sensor.

    Science.gov (United States)

    Hihara, Hiroki; Moritani, Kotaro; Inoue, Masao; Hoshi, Yoshihiro; Iwasaki, Akira; Takada, Jun; Inada, Hitomi; Suzuki, Makoto; Seki, Taeko; Ichikawa, Satoshi; Tanii, Jun

    2015-09-25

    Onboard image processing systems for a hyperspectral sensor have been developed in order to maximize image data transmission efficiency for large volume and high speed data downlink capacity. Since more than 100 channels are required for hyperspectral sensors on Earth observation satellites, fast and small-footprint lossless image compression capability is essential for reducing the size and weight of a sensor system. A fast lossless image compression algorithm has been developed, and is implemented in the onboard correction circuitry of sensitivity and linearity of Complementary Metal Oxide Semiconductor (CMOS) sensors in order to maximize the compression ratio. The employed image compression method is based on Fast, Efficient, Lossless Image compression System (FELICS), which is a hierarchical predictive coding method with resolution scaling. To improve FELICS's performance of image decorrelation and entropy coding, we apply a two-dimensional interpolation prediction and adaptive Golomb-Rice coding. It supports progressive decompression using resolution scaling while still maintaining superior performance measured as speed and complexity. Coding efficiency and compression speed enlarge the effective capacity of signal transmission channels, which lead to reducing onboard hardware by multiplexing sensor signals into a reduced number of compression circuits. The circuitry is embedded into the data formatter of the sensor system without adding size, weight, power consumption, and fabrication cost.
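    The Golomb-Rice entropy-coding step mentioned in the two records above can be sketched as follows. The fixed parameter k and the zigzag mapping of signed residuals are simplifications for illustration; the onboard coder adapts k to the local residual statistics and is not reproduced here.

      # Sketch of Golomb-Rice coding of prediction residuals (fixed k for simplicity).
      def golomb_rice_encode(value, k):
          """Encode a non-negative integer as a unary quotient plus a k-bit binary remainder."""
          q, r = value >> k, value & ((1 << k) - 1)
          return "1" * q + "0" + format(r, f"0{k}b")

      def zigzag(residual):
          """Map signed residuals 0, -1, 1, -2, 2, ... to non-negative integers 0, 1, 2, 3, 4, ..."""
          return residual * 2 if residual >= 0 else -residual * 2 - 1

      residuals = [0, -1, 2, -3, 5]   # e.g. pixel value minus its 2-D interpolation prediction
      k = 2                           # adaptive coders would adjust k to the residual magnitude
      for res in residuals:
          print(res, "->", golomb_rice_encode(zigzag(res), k))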

  16. Scaled CMOS Reliability and Considerations for Spacecraft Systems: Bottom-Up and Top-Down Perspective

    Science.gov (United States)

    White, Mark

    2012-01-01

    New space missions will increasingly rely on more advanced technologies because of system requirements for higher performance, particularly in instruments and high-speed processing. Component-level reliability challenges with scaled CMOS in spacecraft systems from a bottom-up perspective have been presented. Fundamental Front-end and Back-end processing reliability issues with more aggressively scaled parts have been discussed. Effective thermal management from system-level to the componentlevel (top-down) is a key element in overall design of reliable systems. Thermal management in space systems must consider a wide range of issues, including thermal loading of many different components, and frequent temperature cycling of some systems. Both perspectives (top-down and bottom-up) play a large role in robust, reliable spacecraft system design.

  17. Information Processing in Auto-regulated Systems

    Directory of Open Access Journals (Sweden)

    Karl Javorszky

    2003-06-01

    Full Text Available Abstract: We present a model of information processing which is based on two concurrent ways of describing the world, where a description in one of the languages limits the possibilities for realisations in the other language. The two describing dimensions appear in our common sense as dichotomies of perspectives: subjective - objective; diversity - similarity; individual - collective. We abstract from the subjective connotations and treat the test theoretical case of an interval on which several concurrent categories can be introduced. We investigate multidimensional partitions as potential carriers of information and compare their efficiency to that of sequenced carriers. We regard the same assembly once as a contemporary collection, once as a longitudinal sequence and find promising inroads towards understanding information processing by auto-regulated systems. Information is understood to point out that what is the case from among alternatives, which could be the case. We have translated these ideas into logical operations on the set of natural numbers and have found two equivalence points on N where matches between sequential and commutative ways of presenting a state of the world can agree in a stable fashion: a flip-flop mechanism is envisioned. By following this new approach, a mathematical treatment of some poignant biomathematical problems is allowed. Also, the concepts presented in this treatise may well have relevance and applications within the information processing and the theory of language fields.

  18. Integration mockup and process material management system

    Science.gov (United States)

    Verble, Adas James, Jr.

    1992-01-01

    Work to define and develop a full-scale Space Station Freedom (SSF) mockup with the flexibility to evolve into future designs, to validate techniques for maintenance and logistics, and to verify human task allocations and support trade studies is described. This work began in early 1985 and ended in August 1991. The mockups are presently being used at MSFC in Building 4755 as a technology and design testbed, as well as for public display. Micro Craft also began work on the Process Material Management System (PMMS) under this contract. The PMMS simulator was a sealed enclosure for testing to identify liquid, gaseous, and particulate samples and specimens, including urine, waste water, condensate, hazardous gases, surrogate gases, liquids, and solids. The SSF would require many trade studies to validate techniques for maintenance and logistics and verify system task allocations; it was therefore necessary to develop a full-scale mockup which would be representative of the current SSF design with the ease of changing those designs as the SSF design evolved and changed. The tasks defined for Micro Craft were to provide the personnel, services, tools, and materials for the SSF mockup, which would consist of four modules, nodes, interior components, and part-task mockups of MSFC-responsible engineering systems. This included the Environmental Control and Life Support System (ECLSS) testbed. For the initial study, the mockups were low-fidelity, soft mockups of graphics art bottle, and other low-cost materials, which evolved into higher fidelity mockups as the R&D design evolved, by modifying or rebuilding, an important cost-saving factor in the design process. We designed, fabricated, and maintained the full-size mockup shells and support stands. The shells consisted of cylinders, end cones, rings, longerons, docking ports, crew airlocks, and windows. The ECLSS required a heavier cylinder to support the ECLSS systems test program. Details of this activity will be covered. Support stands were

  19. Thermionic power system power processing and control

    Science.gov (United States)

    Metcalf, Kenneth J.

    1992-01-01

    Thermionic power systems are being considered for space-based military applications because of their survivability and high efficiency. Under the direction of the Air Force, conceptual designs were generated for two thermionic power systems to determine preliminary system performance data and direct future component development. This paper discusses the power processing and control (PP&C) subsystem that conditions the thermionic converter power and controls the operation of the reactor and thermionic converter subsystems. The baseline PP&C design and design options are discussed, mass and performance data are provided, and technology needs are identified. The impact of alternate power levels and boom lengths on PP&C subsystem mass and efficiency is also presented. The baseline PP&C subsystem is lightweight and reliable, and it uses proven design concepts to minimize development and testing time. However, the radiation dosages specified in the program research and development announcement (PRDA) are 10 to 100 times the capabilities of present semiconductor devices. While these levels are aggressive, they are considered to be achievable by 1995 if the Air Force and other government agencies continue to actively develop radiation-resistant electronic devices for military applications.

  20. Control-based Scheduling in a Distributed Stream Processing System

    OpenAIRE

    2006-01-01

    Stream processing systems receive continuous streams of messages with raw information and produce streams of messages with processed information. The utility of a stream-processing system depends, in part, on the accuracy and timeliness of the output. Streams in complex event processing systems are processed on distributed systems; several steps are taken on different processors to process each incoming message, and messages may be enqueued between steps. This paper de...

  1. ENGINEERED BARRIER SYSTEM FEATURES, EVENTS AND PROCESSES

    Energy Technology Data Exchange (ETDEWEB)

    Jaros, W.

    2005-08-30

    The purpose of this report is to evaluate and document the inclusion or exclusion of engineered barrier system (EBS) features, events, and processes (FEPs) with respect to models and analyses used to support the total system performance assessment for the license application (TSPA-LA). A screening decision, either Included or Excluded, is given for each FEP along with the technical basis for exclusion screening decisions. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.114 (d, e, and f) [DIRS 173273]. The FEPs addressed in this report deal with those features, events, and processes relevant to the EBS focusing mainly on those components and conditions exterior to the waste package and within the rock mass surrounding emplacement drifts. The components of the EBS are the drip shield, waste package, waste form, cladding, emplacement pallet, emplacement drift excavated opening (also referred to as drift opening in this report), and invert. FEPs specific to the waste package, cladding, and drip shield are addressed in separate FEP reports: for example, ''Screening of Features, Events, and Processes in Drip Shield and Waste Package Degradation'' (BSC 2005 [DIRS 174995]), ''Clad Degradation--FEPs Screening Arguments (BSC 2004 [DIRS 170019]), and Waste-Form Features, Events, and Processes'' (BSC 2004 [DIRS 170020]). For included FEPs, this report summarizes the implementation of the FEP in the TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from TSPA-LA (i.e., why the FEP is excluded). This report also documents changes to the EBS FEPs list that have occurred since the previous versions of this report. These changes have resulted due to a reevaluation of the FEPs for TSPA-LA as identified in Section 1.2 of this report and described in more detail in Section 6.1.1. This revision addresses updates in Yucca Mountain Project

  2. Efficient audio signal processing for embedded systems

    Science.gov (United States)

    Chiu, Leung Kin

    As mobile platforms continue to pack on more computational power, electronics manufacturers start to differentiate their products by enhancing the audio features. However, consumers also demand smaller devices that can operate for a longer time, hence imposing design constraints. In this research, we investigate two design strategies that would allow us to efficiently process audio signals on embedded systems such as mobile phones and portable electronics. In the first strategy, we exploit properties of the human auditory system to process audio signals. We designed a sound enhancement algorithm to make piezoelectric loudspeakers sound "richer" and "fuller." Piezoelectric speakers have a small form factor but exhibit poor response in the low-frequency region. In the algorithm, we combine psychoacoustic bass extension and dynamic range compression to improve the perceived bass coming out of the tiny speakers. We also developed an audio energy reduction algorithm for loudspeaker power management. The perceptually transparent algorithm extends the battery life of mobile devices and prevents thermal damage in speakers. This method is similar to audio compression algorithms, which encode audio signals in such a way that the compression artifacts are not easily perceivable. Instead of reducing the storage space, however, we suppress the audio contents that are below the hearing threshold, therefore reducing the signal energy. In the second strategy, we use low-power analog circuits to process the signal before digitizing it. We designed an analog front-end for sound detection and implemented it on a field programmable analog array (FPAA). The system is an example of an analog-to-information converter. The sound classifier front-end can be used in a wide range of applications because programmable floating-gate transistors are employed to store classifier weights. Moreover, we incorporated a feature selection algorithm to simplify the analog front-end. A machine
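
    The record above names psychoacoustic bass extension, dynamic range compression and perceptually transparent energy reduction without giving the algorithms. As a rough illustration of the kind of gain computation such methods involve, the following minimal Python sketch applies a static compression curve to a signal; the threshold, ratio and make-up gain values are arbitrary assumptions, not parameters from the cited work.

        import numpy as np

        def compress(signal, threshold_db=-20.0, ratio=4.0, makeup_db=6.0):
            """Static dynamic range compression: attenuate samples above the threshold."""
            eps = 1e-12
            level_db = 20.0 * np.log10(np.abs(signal) + eps)   # instantaneous level in dB
            over = np.maximum(level_db - threshold_db, 0.0)    # dB above the threshold
            gain_db = -over * (1.0 - 1.0 / ratio) + makeup_db  # compression plus make-up gain
            return signal * 10.0 ** (gain_db / 20.0)

        # usage: a quiet 440 Hz tone with a loud burst in the middle
        fs = 16000
        t = np.arange(fs) / fs
        x = 0.1 * np.sin(2 * np.pi * 440 * t)
        x[6000:7000] *= 8.0
        y = compress(x)
        print(np.max(np.abs(x)), np.max(np.abs(y)))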

  3. Assessing biosphere feedbacks on Earth System Processes

    Science.gov (United States)

    McElwain, Jennifer

    2016-04-01

    The evolution and ecology of plant life has been shaped by the direct and indirect influence of plate tectonics. Climatic change and environmental upheaval associated with the emplacement of large igneous provinces have triggered biosphere level ecological change, physiological modification and pulses of both extinction and origination. This talk will investigate the influence of large scale changes in atmospheric composition on plant ecophysiology at key intervals of the Phanerozoic. Furthermore, I will assess the extent to which plant ecophysiological response can in turn feedback on earth system processes such as the global hydrological cycle and biogeochemical cycling of nitrogen and carbon. Palaeo-atmosphere simulation experiments, palaeobotanical data and recent historical (last 50 years) data-model comparison will be used to address the extent to which plant physiological responses to atmospheric CO2 can modulate global climate change via biosphere level feedback.

  4. Dynamics of ranking processes in complex systems.

    Science.gov (United States)

    Blumm, Nicholas; Ghoshal, Gourab; Forró, Zalán; Schich, Maximilian; Bianconi, Ginestra; Bouchaud, Jean-Philippe; Barabási, Albert-László

    2012-09-21

    The world is addicted to ranking: everything, from the reputation of scientists, journals, and universities to purchasing decisions is driven by measured or perceived differences between them. Here, we analyze empirical data capturing real time ranking in a number of systems, helping to identify the universal characteristics of ranking dynamics. We develop a continuum theory that not only predicts the stability of the ranking process, but shows that a noise-induced phase transition is at the heart of the observed differences in ranking regimes. The key parameters of the continuum theory can be explicitly measured from data, allowing us to predict and experimentally document the existence of three phases that govern ranking stability.

  5. Signal processing by the endosomal system.

    Science.gov (United States)

    Villaseñor, Roberto; Kalaidzidis, Yannis; Zerial, Marino

    2016-04-01

    Cells need to decode chemical or physical signals from their environment in order to make decisions on their fate. In the case of signalling receptors, ligand binding triggers a cascade of chemical reactions but also the internalization of the activated receptors in the endocytic pathway. Here, we highlight recent studies revealing a new role of the endosomal network in signal processing. The diversity of entry pathways and endosomal compartments is exploited to regulate the kinetics of receptor trafficking, and interactions with specific signalling adaptors and effectors. By governing the spatio-temporal distribution of signalling molecules, the endosomal system functions analogously to a digital-analogue computer that regulates the specificity and robustness of the signalling response.

  6. Range Query Processing in Multidisk Systems

    Institute of Scientific and Technical Information of China (English)

    李建中

    1992-01-01

    In order to reduce disk access time, a database can be stored on several simultaneously accessible disks. In this paper, we are concerned with the dynamic d-attribute database allocation problem for range queries. An allocation method, called the coordinate modulo allocation method, is proposed to allocate data in a d-attribute database among disks so that maximum disk-access concurrency can be achieved for range queries. Our analysis and experiments show that the method achieves optimum or near-optimum parallelism for range queries. The paper gives the conditions under which the method is optimal. Worst-case bounds on the performance of the method are also given. In addition, the parallel algorithm for processing range queries is described at the end of the paper. The method has been used in the statistical and scientific database management system which is being designed by us.
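
    The abstract does not spell out the allocation rule, but coordinate-modulo style declustering schemes are usually described as assigning each cell of the d-attribute grid to disk (sum of its coordinates) mod M, so that the cells touched by a range query spread across many disks. The Python sketch below illustrates that idea only; the function names and the exact formula are assumptions for illustration, not taken from the cited paper.

        from itertools import product

        def disk_of(cell, num_disks):
            """Assign a d-dimensional grid cell to a disk by coordinate-modulo declustering."""
            return sum(cell) % num_disks

        def disks_for_range(lo, hi, num_disks):
            """Set of disks that must be read for the range query [lo, hi] (inclusive per dimension)."""
            ranges = [range(l, h + 1) for l, h in zip(lo, hi)]
            return {disk_of(cell, num_disks) for cell in product(*ranges)}

        # usage: a 2-attribute grid and 4 disks; a 3x3 block of cells should hit all 4 disks
        print(disks_for_range((2, 5), (4, 7), num_disks=4))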

  7. Integrating RFID technique to design mobile handheld inventory management system

    Science.gov (United States)

    Huang, Yo-Ping; Yen, Wei; Chen, Shih-Chung

    2008-04-01

    An RFID-based mobile handheld inventory management system is proposed in this paper. Differing from the manual inventory management method, the proposed system works on the personal digital assistant (PDA) with an RFID reader. The system identifies electronic tags on the properties and checks the property information in the back-end database server through a ubiquitous wireless network. The system also provides a set of functions to manage the back-end inventory database and assigns different levels of access privilege according to various user categories. In the back-end database server, to prevent improper or illegal accesses, the server not only stores the inventory database and user privilege information, but also keeps track of the user activities in the server including the login and logout time and location, the records of database accessing, and every modification of the tables. Some experimental results are presented to verify the applicability of the integrated RFID-based mobile handheld inventory management system.
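
    To picture the interaction described above between the handheld reader and the back-end server, the short Python sketch below looks up a scanned tag ID in an inventory table and appends an audit record of the access. The table layout, field names and in-memory store are hypothetical stand-ins; the cited system queries a remote database over a wireless network.

        from datetime import datetime, timezone

        # hypothetical in-memory stand-ins for the back-end inventory and audit tables
        INVENTORY = {"E2001234": {"name": "Projector", "location": "Room 301"}}
        AUDIT_LOG = []

        def check_property(tag_id, user, location):
            """Look up a scanned RFID tag and record who accessed the inventory, when and where."""
            record = INVENTORY.get(tag_id)
            AUDIT_LOG.append({
                "user": user,
                "tag": tag_id,
                "found": record is not None,
                "location": location,
                "time": datetime.now(timezone.utc).isoformat(),
            })
            return record

        print(check_property("E2001234", user="auditor01", location="Room 301"))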

  8. AHTR Refueling Systems and Process Description

    Energy Technology Data Exchange (ETDEWEB)

    Varma, Venugopal Koikal [ORNL; Holcomb, David Eugene [ORNL; Bradley, Eric Craig [ORNL; Zaharia, Nathaniel M [ORNL; Cooper, Eliott J [ORNL

    2012-07-01

    The Advanced High-Temperature Reactor (AHTR) is a design concept for a central station-type [1500 MW(e)] Fluoride salt-cooled High-temperature Reactor (FHR) that is currently undergoing development by Oak Ridge National Laboratory for the US. Department of Energy, Office of Nuclear Energy's Advanced Reactor Concepts program. FHRs, by definition, feature low-pressure liquid fluoride salt cooling, coated-particle fuel, a high-temperature power cycle, and fully passive decay heat rejection. The overall goal of the AHTR development program is to demonstrate the technical feasibility of FHRs as low-cost, large-size power producers while maintaining full passive safety. The AHTR is approaching a preconceptual level of maturity. An initial integrated layout of its major systems, structures, and components (SSCs), and an initial, high-level sequence of operations necessary for constructing and operating the plant is nearing completion. An overview of the current status of the AHTR concept has been recently published and a report providing a more detailed overview of the AHTR structures and mechanical systems is currently in preparation. This report documents the refueling components and processes envisioned at this early development phase. The report is limited to the refueling aspects of the AHTR and does not include overall reactor or power plant design information. The report, however, does include a description of the materials envisioned for the various components and the instrumentation necessary to control the refueling process. The report begins with an overview of the refueling strategy. Next a mechanical description of the AHTR fuel assemblies and core is provided. The reactor vessel upper assemblies are then described. Following this the refueling path structures and the refueling mechanisms and components are described. The sequence of operations necessary to fuel and defuel the reactor is then discussed. The report concludes with a discussion of the

  9. AHTR Refueling Systems and Process Description

    Energy Technology Data Exchange (ETDEWEB)

    Varma, V.K.; Holcomb, D.E.; Bradley, E.C.; Zaharia, N.M.; Cooper, E.J.

    2012-07-15

    The Advanced High-Temperature Reactor (AHTR) is a design concept for a central station-type [1500 MW(e)] Fluoride salt–cooled High-temperature Reactor (FHR) that is currently undergoing development by Oak Ridge National Laboratory for the US. Department of Energy, Office of Nuclear Energy’s Advanced Reactor Concepts program. FHRs, by definition, feature low-pressure liquid fluoride salt cooling, coated-particle fuel, a high-temperature power cycle, and fully passive decay heat rejection. The overall goal of the AHTR development program is to demonstrate the technical feasibility of FHRs as low-cost, large-size power producers while maintaining full passive safety. The AHTR is approaching a preconceptual level of maturity. An initial integrated layout of its major systems, structures, and components (SSCs), and an initial, high-level sequence of operations necessary for constructing and operating the plant is nearing completion. An overview of the current status of the AHTR concept has been recently published [1], and a report providing a more detailed overview of the AHTR structures and mechanical systems is currently in preparation. This report documents the refueling components and processes envisioned at this early development phase. The report is limited to the refueling aspects of the AHTR and does not include overall reactor or power plant design information. The report, however, does include a description of the materials envisioned for the various components and the instrumentation necessary to control the refueling process. The report begins with an overview of the refueling strategy. Next a mechanical description of the AHTR fuel assemblies and core is provided. The reactor vessel upper assemblies are then described. Following this the refueling path structures and the refueling mechanisms and components are described. The sequence of operations necessary to fuel and defuel the reactor is then discussed. The report concludes with a discussion of the

  10. NASA's Earth Science Data Systems Standards Process

    Science.gov (United States)

    Ullman, R.; Enloe, Y.

    2006-12-01

    Starting in January 2004, NASA instituted a set of internal working groups to develop ongoing recommendations for the continuing broad evolution of Earth Science Data Systems development and management within NASA. One of these Data Systems Working Groups is called the Standards Process Group (SPG). This group's goal is to facilitate broader use of standards that have proven implementation and operational benefit to NASA Earth science by facilitating the approval of proposed standards and directing the evolution of standards. We have found that the candidate standards that self defined communities are proposing for approval to the SPG are one of 3 types: (1) A NASA community developed standard used within at least one self defined community where the proposed standard has not been approved or adopted by an external standards organization and where new implementations are expected to be developed from scratch, using the proposed standard as the implementation specification; (2) A NASA community developed standard used within at least one self defined community where the proposed standard has not been approved or adopted by an external standards organization and where new implementations are not expected to be developed from scratch but use existing software libraries or code;. (3) A standard already approved by an external standards organization but is being proposed for use for the NASA Earth science community. There are 3 types of reviews potentially needed to evaluate a proposed standard: (1) A detailed technical review to determine the quality, accuracy, and clarity of the proposed specification and where a detailed technical review ensures that implementers can use the proposed standard as an implementation specification for any future implementations with confidence; (2) A "usefulness" user review that determines if the proposed standard is useful or helpful or necessary to the user to carry out his work; (3) An operational review that evaluates if the

  11. Theory of Neural Information Processing Systems

    Energy Technology Data Exchange (ETDEWEB)

    Galla, Tobias [Abdus Salam International Centre for Theoretical Physics and INFM/CNR SISSA-Unit, Strada Costiera 11, I-34014 Trieste (Italy)

    2006-04-07

    It is difficult not to be amazed by the ability of the human brain to process, to structure and to memorize information. Even by the toughest standards the behaviour of this network of about 10{sup 11} neurons qualifies as complex, and both the scientific community and the public take great interest in the growing field of neuroscience. The scientific endeavour to learn more about the function of the brain as an information processing system is here a truly interdisciplinary one, with important contributions from biology, computer science, physics, engineering and mathematics as the authors quite rightly point out in the introduction of their book. The role of the theoretical disciplines here is to provide mathematical models of information processing systems and the tools to study them. These models and tools are at the centre of the material covered in the book by Coolen, Kuehn and Sollich. The book is divided into five parts, providing basic introductory material on neural network models as well as the details of advanced techniques to study them. A mathematical appendix complements the main text. The range of topics is extremely broad, still the presentation is concise and the book well arranged. To stress the breadth of the book let me just mention a few keywords here: the material ranges from the basics of perceptrons and recurrent network architectures to more advanced aspects such as Bayesian learning and support vector machines; Shannon's theory of information and the definition of entropy are discussed, and a chapter on Amari's information geometry is not missing either. Finally the statistical mechanics chapters cover Gardner theory and the replica analysis of the Hopfield model, not without being preceded by a brief introduction of the basic concepts of equilibrium statistical physics. The book also contains a part on effective theories of the macroscopic dynamics of neural networks. Many dynamical aspects of neural networks are usually hard

  12. Annuitants Added to the Annuity Roll Processing System (ARPS)

    Data.gov (United States)

    Office of Personnel Management — Small table showing the total Civil Service Retirement System (CSRS) and Federal Employees Retirement System (FERS) Annuitants added to the Annuity Roll Processing...

  13. ENGINEERED BARRIER SYSTEM FEATURES, EVENTS, AND PROCESSES

    Energy Technology Data Exchange (ETDEWEB)

    na

    2005-05-30

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the volcanic ash exposure scenario, and the development of dose factors for calculating inhalation dose during volcanic eruption. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1 - 1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed descriptions of the model input parameters, their development and the relationship between the parameters and specific features, events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the volcanic ash exposure scenario. This analysis receives direct input from the outputs of the ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) and from the five analyses that develop parameter values for the biosphere model (BSC 2005 [DIRS 172827]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; and BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the ''Biosphere Dose Conversion Factor Importance and Sensitivity Analysis'' (Figure 1 - 1). The

  14. Digital processing system for developing countries

    Science.gov (United States)

    Nanayakkara, C.; Wagner, H.

    1977-01-01

    An effort was undertaken to perform simple digital processing tasks using pre-existing general purpose digital computers. An experimental software package, LIGMALS, was obtained and modified for this purpose. The resulting software permits basic processing tasks to be performed including level slicing, gray mapping and ratio processing. The experience gained in this project indicates a possible direction which may be used by other developing countries to obtain digital processing capabilities.
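
    Level slicing, gray mapping and ratio processing are classical per-pixel operations that the record names without detail. The Python sketch below shows one plausible reading of each operation on small arrays; the exact definitions used in the LIGMALS package are not given in the record, so these are generic illustrations only.

        import numpy as np

        def level_slice(img, lo, hi):
            """Binary mask of pixels whose gray level falls inside [lo, hi]."""
            return ((img >= lo) & (img <= hi)).astype(np.uint8) * 255

        def gray_map(img, lut):
            """Remap gray levels through a 256-entry lookup table."""
            return lut[img]

        def ratio(band_a, band_b):
            """Pixel-wise ratio of two bands, e.g. to suppress illumination differences."""
            return band_a.astype(float) / np.maximum(band_b.astype(float), 1.0)

        img = np.array([[10, 120], [200, 250]], dtype=np.uint8)
        lut = np.clip(np.arange(256) * 1.5, 0, 255).astype(np.uint8)  # simple contrast stretch
        print(level_slice(img, 100, 220))
        print(gray_map(img, lut))
        print(ratio(img, img[::-1]))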

  15. Standardization of Information Systems Development Processes and Banking Industry Adaptations

    OpenAIRE

    Tuna Ozcer; Zuhal Tanrikulu

    2011-01-01

    This paper examines the current system development processes of three major Turkish banks in terms of compliance to internationally accepted system development and software engineering standards to determine the common process problems of banks. After an in-depth investigation into system development and software engineering standards, related process-based standards were selected. Questions were then prepared covering the whole system development process by applying the classical Waterfall l...

  16. Second International Conference on Communications, Signal Processing, and Systems

    CERN Document Server

    Mu, Jiasong; Wang, Wei; Liang, Qilian; Pi, Yiming

    2014-01-01

    The Proceedings of The Second International Conference on Communications, Signal Processing, and Systems provides the state-of-art developments of Communications, Signal Processing, and Systems. The conference covered such topics as wireless communications, networks, systems, signal processing for communications. This book is a collection of contributions coming out of The Second International Conference on Communications, Signal Processing, and Systems (CSPS) held September 2013 in Tianjin, China.

  17. Modeling and Advanced Control for Sustainable Process Systems

    Science.gov (United States)

    This book chapter introduces a novel process systems engineering framework that integrates process control with sustainability assessment tools for the simultaneous evaluation and optimization of process operations. The implemented control strategy consists of a biologically-insp...

  18. 21 CFR 864.9145 - Processing system for frozen blood.

    Science.gov (United States)

    2010-04-01

    21 CFR 864.9145 (Food and Drugs, revised as of 2010-04-01), Blood and Blood Products: Processing system for frozen blood. (a) Identification. A processing system for frozen blood is a device used to glycerolize red blood cells prior to freezing to...

  19. 3rd International Conference on Communications, Signal Processing, and Systems

    CERN Document Server

    Liang, Qilian; Wang, Wei; Zhang, Baoju; Pi, Yiming

    2015-01-01

    The Proceedings of The Third International Conference on Communications, Signal Processing, and Systems provides the state-of-art developments of communications, signal processing, and systems. This book is a collection of contributions from the conference and covers such topics as wireless communications, networks, systems, and signal processing for communications. The conference was held July 2014 in Hohhot, Inner Mongolia, China.

  20. The Back-end of User Centred Innovation

    DEFF Research Database (Denmark)

    Lassen, Astrid Heidemann

    2015-01-01

    User Centred Innovation (UCI) has during the past decade developed into a widely acknowledged approach to innovation. Yet, in spite of a plethora of methods and tools for conducting UCI, companies continue to struggle to create the desired effect of UCI. In this paper, it is proposed...... of such challenges calls for a new focus in UCI research on interorganizational alignment and cross-functional collaboration....

  1. Agassiz Wilderness Character Monitoring Back-end Database

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — The Wilderness Act of 1964 mandated the preservation of wilderness character. The NWRS has 18% of designated wilderness, comprising 21 million acres. After over 40...

  2. Biocatalytic process development using microfluidic miniaturized systems

    DEFF Research Database (Denmark)

    Krühne, Ulrich; Heintz, Søren; Ringborg, Rolf Hoffmeyer

    2014-01-01

    The increasing interest in biocatalytic processes means there is a clear need for a new systematic development paradigm which encompasses both protein engineering and process engineering. This paper argues that through the use of a new microfluidic platform, data can be collected more rapidly...... and integrated with process modeling, can provide the basis for validating a reduced number of potential processes. The miniaturized platform should use a smaller reagent inventory and make better use of precious biocatalysts. The EC funded BIOINTENSE project will use ω-transaminase based synthesis of chiral...

  3. 77 FR 38782 - Privacy Act of 1974; System of Records

    Science.gov (United States)

    2012-06-29

    ..., Department of Defense. N07250-1 System Name: Navy Cash Financial System. System Location: Navy Cash is... records notices and may be obtained from the System Manager. The Navy Cash back-end ashore is operated by... branch or company name, Social Security Number (SSN), rate, rank, title, pay grade, date of birth,...

  4. Processing in (linear) systems with stochastic input

    Science.gov (United States)

    Nutu, Catalin Silviu; Axinte, Tiberiu

    2016-12-01

    The paper provides a different approach to real-world systems, such as the micro and macro systems of our everyday life, in which humans have little or no influence on the system, either because the rules of the system are unknown or because its input is unknown, so that one is mainly a spectator of the system's output. In such a system, the input and the laws ruling the system can only be "guessed", based on intuition or on previous knowledge held by the analyzer of the system. But, as we will see in the paper, there also exists another, more theoretical and hence scientific way to approach real-world systems, and this approach is mostly based on the theory related to Schrödinger's equation and the wave function associated with it, as well as on quantum mechanics. The main results of the paper concern the use of Schrödinger's equation and the related theory, as well as quantum mechanics, in modeling real-life and real-world systems.

  5. Processing information system for highly specialized information in corporate networks

    Science.gov (United States)

    Petrosyan, M. O.; Kovalev, I. V.; Zelenkov, P. V.; Brezitskaya, VV; Prohorovich, G. A.

    2016-11-01

    A new structure for the formation and management of highly specialized information in corporate systems is offered. The main distinguishing feature of this structure is that it involves the processing of multilingual information within a single user request.

  6. Processes, Forms Of Sport Management System

    Directory of Open Access Journals (Sweden)

    Gheorghe Jinga

    2013-05-01

    Full Text Available The process of instructing sportsmen has always been a complex and thorough activity that requires depth, professional sensitivity and bonding. The main roles in this process are played by managers, coaches, trainers, methodologists, psychologists, sociologists and technicians, who establish and hand over theoretical knowledge, abilities and skills to the sportsmen. In this way, the environment is created for instructing and highlighting the physical, technical, tactical and psychic potential of the participants in competitions. The training process of the sportsmen is more and more headed towards the integral and deep internalization of the instructive components, based on the interconnections between the elements of modern sport science.

  7. Power system operations: State estimation distributed processing

    Science.gov (United States)

    Ebrahimian, Mohammad Reza

    We present an application of a robust and fast parallel algorithm to power system state estimation, with a minimal amount of modifications to existing state estimators presently in place, using the Auxiliary Problem Principle. We demonstrate its effectiveness on IEEE test systems, the Electric Reliability Council of Texas (ERCOT) system, and the Southwest Power Pool (SPP) system. Since the state estimation formulation may lead to an ill-conditioned system, we provide analytical explanations of the effects of mixtures of measurements on the condition of the state estimation information matrix. We demonstrate the closeness of the analytical equations to the condition of several test-case systems, including the IEEE RTS-96 and IEEE 118-bus systems. The research on the condition of the state estimation problem covers centralized as well as distributed state estimation.
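
    The record refers to the Auxiliary Problem Principle and to the conditioning of the state estimation information matrix without reproducing either. The Python sketch below shows only the common building block, a linear weighted-least-squares state estimate together with the condition number of its information matrix; it is not the parallel algorithm of the thesis, and the measurement model is a made-up example.

        import numpy as np

        def wls_state_estimate(H, z, w):
            """Weighted least squares estimate x_hat = (H^T W H)^-1 H^T W z and the
            condition number of the information matrix G = H^T W H."""
            W = np.diag(w)
            G = H.T @ W @ H                        # information (gain) matrix
            x_hat = np.linalg.solve(G, H.T @ W @ z)
            return x_hat, np.linalg.cond(G)

        # toy example: 2 states, 3 measurements of mixed quality (w = inverse variances)
        H = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, -1.0]])
        z = np.array([1.02, 0.49, 0.52])
        w = np.array([100.0, 100.0, 1.0])
        x_hat, cond_G = wls_state_estimate(H, z, w)
        print(x_hat, cond_G)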

  8. Research and development of process innovation design oriented web-based process case base system

    Directory of Open Access Journals (Sweden)

    Guo Xin

    2015-01-01

    Full Text Available Process innovation is very significant for an enterprise to lower costs, improve product quality and win competitive advantage. In order to inspire designers to realize innovative designs, this paper proposes a concept for a Web-based process case base system model oriented towards process innovation design. Specifically, it constructs the system mainline through the realization of techniques and application flow, determines the system architecture by combining a process case base with cognition methods, and on this basis establishes links among principles, innovation approaches and process cases. The process case prototype system is established under the browser/server model, and five kinds of search modes, i.e. processing methods, processing focus, design depth, innovation approaches and a user-defined mode, are integrated. This paper demonstrates the back-end realization and management methods of the case base, showcases the system interface and demonstrates its effectiveness in process design based on actual cases.

  9. Reliable High Performance Processing System (RHPPS) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — NASA's exploration, science, and space operations systems are critically dependent on the hardware technologies used in their implementation. Specifically, the...

  10. Digital signal processing in communication systems

    CERN Document Server

    Frerking, Marvin E

    1994-01-01

    An engineer's introduction to concepts, algorithms, and advancements in Digital Signal Processing. This lucidly written resource makes extensive use of real-world examples as it covers all the important design and engineering references.

  11. Intelligent Signal Processing for Detection System Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Fu, C Y; Petrich, L I; Daley, P F; Burnham, A K

    2004-06-18

    A wavelet-neural network signal processing method has demonstrated approximately tenfold improvement in the detection limit of various nitrogen and phosphorus compounds over traditional signal-processing methods in analyzing the output of a thermionic detector attached to the output of a gas chromatograph. A blind test was conducted to validate the lower detection limit. All fourteen of the compound spikes were detected when above the estimated threshold, including all three within a factor of two above. In addition, two of six were detected at levels 1/2 the concentration of the nominal threshold. We would have had another two correct hits if we had allowed human intervention to examine the processed data. One apparent false positive in five nulls was traced to a solvent impurity, whose presence was identified by running a solvent aliquot evaporated to 1% residual volume, while the other four nulls were properly classified. We view this signal processing method as broadly applicable in analytical chemistry, and we advocate that advanced signal processing methods be applied as directly as possible to the raw detector output so that less discriminating preprocessing and post-processing does not throw away valuable signal.
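
    The reports describe a wavelet-neural network pipeline applied to chromatograph output but do not include the processing itself. The fragment below sketches only a wavelet-thresholding front end for such a pipeline using the PyWavelets package; the wavelet family, decomposition level and universal threshold are generic choices, not those of the cited work.

        import numpy as np
        import pywt

        def wavelet_denoise(signal, wavelet="db4", level=4):
            """Soft-threshold the detail coefficients (universal threshold) and reconstruct."""
            coeffs = pywt.wavedec(signal, wavelet, level=level)
            sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise estimate from finest scale
            thr = sigma * np.sqrt(2.0 * np.log(len(signal)))
            denoised = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
            return pywt.waverec(denoised, wavelet)[: len(signal)]

        # usage: a small chromatographic-style peak buried in noise
        rng = np.random.default_rng(0)
        t = np.linspace(0.0, 1.0, 1024)
        x = np.exp(-((t - 0.5) ** 2) / 0.001) + 0.2 * rng.standard_normal(t.size)
        print(np.std(x - wavelet_denoise(x)))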

  12. Intelligent Signal Processing for Detection System Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Fu, C Y; Petrich, L I; Daley, P F; Burnham, A K

    2004-12-05

    A wavelet-neural network signal processing method has demonstrated approximately tenfold improvement over traditional signal-processing methods for the detection limit of various nitrogen and phosphorus compounds from the output of a thermionic detector attached to a gas chromatograph. A blind test was conducted to validate the lower detection limit. All fourteen of the compound spikes were detected when above the estimated threshold, including all three within a factor of two above the threshold. In addition, two of six spikes were detected at levels of 1/2 the concentration of the nominal threshold. Another two of the six would have been detected correctly if we had allowed human intervention to examine the processed data. One apparent false positive in five nulls was traced to a solvent impurity, whose presence was subsequently identified by analyzing a solvent aliquot evaporated to 1% residual volume, while the other four nulls were properly classified. We view this signal processing method as broadly applicable in analytical chemistry, and we advocate that advanced signal processing methods should be applied as directly as possible to the raw detector output so that less discriminating preprocessing and post-processing does not throw away valuable signal.

  13. Parallel and distributed processing: applications to power systems

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Felix; Murphy, Liam [California Univ., Berkeley, CA (United States). Dept. of Electrical Engineering and Computer Sciences

    1994-12-31

    Applications of parallel and distributed processing to power systems problems are still in the early stages. Rapid progress in computing and communications promises a revolutionary increase in the capacity of distributed processing systems. In this paper, the state of the art in distributed processing technology and applications is reviewed and future trends are discussed. (author)

  14. System Engineering Process Realization Toolkit Project

    Data.gov (United States)

    National Aeronautics and Space Administration — NASA faces many systems engineering challenges as it seeks to conduct exploration and science missions concurrently. One such challenge is implementing a repeatable...

  15. Influence Business Process On The Quality Of Accounting Information System

    Directory of Open Access Journals (Sweden)

    Meiryani

    2015-01-01

    Full Text Available Abstract The purpose of this study was to determine the influence of business processes on the quality of accounting information systems. The study was theoretical research that considered the role of business processes in the quality of accounting information systems, using secondary data collection. The results showed that business processes have a significant effect on the quality of accounting information systems.

  16. SAR Systems and Related Signal Processing

    NARCIS (Netherlands)

    Hoogeboom, P.; Dekker, R.J.; Otten, M.P.G.

    1996-01-01

    Synthetic Aperture Radar (SAR) is today a valuable source of remote sensing information. SAR is a side-looking imaging radar and operates from airborne and spaceborne platforms. Coverage, resolution and image quality are strongly influenced by the platform. SAR processing can be performed on standard

  17. Sneak analysis applied to process systems

    Science.gov (United States)

    Whetton, Cris

    Traditional safety analyses, such as HAZOP, FMEA, FTA, and MORT, are less than effective at identifying hazards resulting from incorrect 'flow' - whether this be flow of information, actions, electric current, or even the literal flow of process fluids. Sneak Analysis (SA) has existed since the mid nineteen-seventies as a means of identifying such conditions in electric circuits; in which area, it is usually known as Sneak Circuit Analysis (SCA). This paper extends the ideas of Sneak Circuit Analysis to a general method of Sneak Analysis applied to process plant. The methods of SA attempt to capitalize on previous work in the electrical field by first producing a pseudo-electrical analog of the process and then analyzing the analog by the existing techniques of SCA, supplemented by some additional rules and clues specific to processes. The SA method is not intended to replace any existing method of safety analysis; instead, it is intended to supplement such techniques as HAZOP and FMEA by providing systematic procedures for the identification of a class of potential problems which are not well covered by any other method.

  18. VICAR-DIGITAL image processing system

    Science.gov (United States)

    Billingsley, F.; Bressler, S.; Friden, H.; Morecroft, J.; Nathan, R.; Rindfleisch, T.; Selzer, R.

    1969-01-01

    Computer program corrects various photometric, geometric and frequency response distortions in pictures. The program converts pictures to a number of elements, with each element's optical density quantized to a numerical value. The translated picture is recorded on magnetic tape in digital form for subsequent processing and enhancement by computer.
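
    Converting a picture to elements whose optical density is quantized to a numerical value is essentially uniform quantization; the snippet below shows that step in its simplest form. The number of levels and the density range are arbitrary assumptions made for illustration, not values from the VICAR-DIGITAL system.

        import numpy as np

        def quantize_density(density, d_min=0.0, d_max=3.0, levels=256):
            """Map continuous optical density values onto integer levels 0..levels-1."""
            scaled = (np.clip(density, d_min, d_max) - d_min) / (d_max - d_min)
            return np.round(scaled * (levels - 1)).astype(np.uint16)

        sample = np.array([[0.12, 1.5], [2.9, 3.2]])  # optical densities of four picture elements
        print(quantize_density(sample))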

  19. Design and development of expert system for controlling sintering process

    Institute of Scientific and Technical Information of China (English)

    1999-01-01

    The general structure of an expert system for controlling the sintering process has been proposed. It includes a knowledge base, an inference engine, a data acquisition system, a learning system, a knowledge base management system, an explanation system and so on. The control functions consist of sintering chemical composition control centered on basicity and sintering process state control centered on permeability. The adaptive prediction of sintering chemical composition, the control strategy centered on basicity, the control strategy centered on permeability, the judgement of permeability and the prediction of the burn-through point were studied. The system software, which includes about 1000 expert rules, was successfully applied to off-line control of the sintering process in a sintering plant.
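
    To make the idea of a rule base for basicity and permeability control more concrete, the fragment below evaluates a handful of toy rules against a measured process state. The rule conditions, variable names and recommended actions are invented for illustration; they are not drawn from the cited system's roughly 1000 rules.

        # each rule: (condition on the measured state, recommended control action)
        RULES = [
            (lambda s: s["basicity"] < s["basicity_target"] - 0.05, "increase limestone feed"),
            (lambda s: s["basicity"] > s["basicity_target"] + 0.05, "decrease limestone feed"),
            (lambda s: s["permeability"] < 0.8, "reduce bed moisture / coke rate"),
            (lambda s: s["burn_through_point"] > 0.95, "lower strand speed"),
        ]

        def infer(state):
            """Return the actions of all rules whose conditions fire for this state."""
            return [action for cond, action in RULES if cond(state)]

        state = {"basicity": 1.78, "basicity_target": 1.90,
                 "permeability": 0.75, "burn_through_point": 0.92}
        print(infer(state))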

  20. Mask alignment system for semiconductor processing

    Energy Technology Data Exchange (ETDEWEB)

    Webb, Aaron P.; Carlson, Charles T.; Weaver, William T.; Grant, Christopher N.

    2017-02-14

    A mask alignment system for providing precise and repeatable alignment between ion implantation masks and workpieces. The system includes a mask frame having a plurality of ion implantation masks loosely connected thereto. The mask frame is provided with a plurality of frame alignment cavities, and each mask is provided with a plurality of mask alignment cavities. The system further includes a platen for holding workpieces. The platen may be provided with a plurality of mask alignment pins and frame alignment pins configured to engage the mask alignment cavities and frame alignment cavities, respectively. The mask frame can be lowered onto the platen, with the frame alignment cavities moving into registration with the frame alignment pins to provide rough alignment between the masks and workpieces. The mask alignment cavities are then moved into registration with the mask alignment pins, thereby shifting each individual mask into precise alignment with a respective workpiece.

  1. Internet-based intelligent information processing systems

    CERN Document Server

    Tonfoni, G; Ichalkaranje, N S

    2003-01-01

    The Internet/WWW has made it possible to easily access quantities of information never available before. However, both the amount of information and the variation in quality pose obstacles to the efficient use of the medium. Artificial intelligence techniques can be useful tools in this context. Intelligent systems can be applied to searching the Internet and data-mining, interpreting Internet-derived material, the human-Web interface, remote condition monitoring and many other areas. This volume presents the latest research on the interaction between intelligent systems (neural networks, adap

  2. A SYSTEM DESIGN PROCESS TAILORED FOR REVERSE ENGINEERING AND REENGINEERING

    Directory of Open Access Journals (Sweden)

    Tae-Hun Yoon

    2010-10-01

    Full Text Available This paper discusses a system design process using reverse engineering. The reverse engineering approach, when feasible, is a cost-effective and easy approach to use in system design. All industries use this approach, consciously or unconsciously, to reduce system development risks. It can be part of a formal process, simple requirement reuse, or the adoption of industry standards. The reverse engineering approach can be considered an effective system design method in immature systems engineering environments. This paper proposes a system design process using reverse engineering which can be tailored for large, complex system development projects. The proposed process is composed of two stages leading to system specification generation. The reverse engineering stage is performed to define the functional and physical architecture of the legacy system used as a reference model when they are not available. The reengineering stage takes the outputs of the reverse engineering stage to define the rest of the logical and physical solutions.

  3. The Impacts of Bologna Process on European Higher Education Systems

    Directory of Open Access Journals (Sweden)

    Zafer ÇELİK

    2012-01-01

    Full Text Available This study aims to examine the impact of the Bologna Process on European higher education systems. It focuses on the influence of the main components of the Bologna Process (i.e., implementing the two-cycle system, increasing student and academic staff mobility, the European Credit Transfer System, quality assurance and the qualifications framework) on the transformation of higher education systems. Although the Bologna Process is perceived as a move to increase the quality of the higher education system in Turkey, there are very serious criticisms of the Bologna Process from academics, students, and businesspeople in various European countries. This study claims that the Process did not achieve its goals; more importantly, the main instruments of the Process (qualifications, quality assurance agencies, etc.) brought about hyper-bureaucratization, hierarchization and standardization of European higher education systems.

  4. Distributive Processing Issues in Education Information Systems.

    Science.gov (United States)

    Ender, Philip B.

    This is one of a series of reports based on an ongoing reality test of systemic evaluation for instructional decision making. This feasibility study is being carried out by the Center for the Study of Evaluation with the Laboratory in School and Community Relations at a suburban Los Angeles high school (called Site A). Viewing a school as a…

  5. Very Large Scale Distributed Information Processing Systems

    Science.gov (United States)

    1991-09-27


  6. Processing abstract language modulates motor system activity.

    Science.gov (United States)

    Glenberg, Arthur M; Sato, Marc; Cattaneo, Luigi; Riggio, Lucia; Palumbo, Daniele; Buccino, Giovanni

    2008-06-01

    Embodiment theory proposes that neural systems for perception and action are also engaged during language comprehension. Previous neuroimaging and neurophysiological studies have only been able to demonstrate modulation of action systems during comprehension of concrete language. We provide neurophysiological evidence for modulation of motor system activity during the comprehension of both concrete and abstract language. In Experiment 1, when the described direction of object transfer or information transfer (e.g., away from the reader to another) matched the literal direction of a hand movement used to make a response, speed of responding was faster than when the two directions mismatched (an action-sentence compatibility effect). In Experiment 2, we used single-pulse transcranial magnetic stimulation to study changes in the corticospinal motor pathways to hand muscles while reading the same sentences. Relative to sentences that do not describe transfer, there is greater modulation of activity in the hand muscles when reading sentences describing transfer of both concrete objects and abstract information. These findings are discussed in relation to the human mirror neuron system.

  7. Control and systems concepts in the innovation process

    Institute of Scientific and Technical Information of China (English)

    2004-01-01

    Allan Kjaer is an electrical engineer who earned master's and Ph.D. degrees in control and system identification. He then began work on control, information, and production processes in the steel industry. Leading a team of developers at the Danish Steel Company, Dr. Kjaer applied systems thinking to the wider issues of process and information integration to achieve a tightly integrated production and business process. His article is important because it directly shows how control-systems

  8. Summary of the International Conference on Software and System Processes

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; O'Connor, Rory V.; Perry, Dewayne E.

    2016-01-01

    The International Conference on Software and Systems Process (ICSSP), continuing the success of the Software Process Workshop (SPW), the Software Process Modeling and Simulation Workshop (ProSim) and the International Conference on Software Process (ICSP) conference series, has become the established...... premier event in the field of software and systems engineering processes. It provides a leading forum for the exchange of research outcomes and industrial best practices in process development from the software and systems disciplines. ICSSP 2016 was held in Austin, Texas, from 14-15 May 2016, co......-located with the 38th International Conference on Software Engineering (ICSE). The theme of ICSSP 2016 was studying "Process(es) in Action" by recognizing that the AS-Planned and AS-Practiced processes can be quite different in many ways, including their flows, their complexity and the evolving needs of stakeholders...

  9. Integrated Automation System for Rare Earth Countercurrent Extraction Process

    Institute of Scientific and Technical Information of China (English)

    柴天佑; 杨辉

    2004-01-01

    Low automation levels in industrial rare-earth extraction processes result in high production costs, inconsistent product quality and great consumption of resources in China. An integrated automation system for the rare earth extraction process is proposed to realize optimal product indices, such as product purity, recycle rate and output. The optimal control strategy for output components, together with the structure and function of the two-graded integrated automation system composed of the process management grade and the process control grade, is discussed. This system has been successfully applied to a HAB yttrium extraction production process and was found to provide optimal control, optimal operation, optimal management and remarkable benefits.

  10. High risk process control system assessment methodology

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Venetia [Pontificia Universidade Catolica do Rio de Janeiro (PUC-Rio), RJ (Brazil); Zamberlan, Maria Cristina [National Institute of Technology (INT), Rio de Janeiro, RJ (Brazil). Human Reliability and Ergonomics Research Group for the Oil, Gas and Energy Sector

    2009-07-01

    The evolution of ergonomics methodology has become necessary due to the dynamics imposed by the work environment, the increased need for human cooperation and the high degree of interaction between the various sections within a company. Over the last 25 years, based on studies of high-risk process control, we have developed a methodology to evaluate these situations that focuses on the assessment of activities and human cooperation, the assessment of context, the assessment of the impact of the work of other sectors on the final activity of the operator, as well as the modeling of existing risks. (author)

  11. Solar energy engineering processes and systems

    CERN Document Server

    Kalogirou, Soteris A

    2009-01-01

    As perhaps the most promising of all the renewable energy sources available today, solar energy is becoming increasingly important in the drive to achieve energy independence and climate balance. This new book is the masterwork from world-renowned expert Dr. Soteris Kalogirou, who has championed solar energy for decades. The book includes all areas of solar energy engineering, from the fundamentals to the highest level of current research. The author includes pivotal subjects such as solar collectors, solar water heating, solar space heating and cooling, industrial process heat, solar desalina

  12. Solar energy engineering processes and systems

    CERN Document Server

    Kalogirou, Soteris A

    2013-01-01

    As perhaps the most promising of all the renewable energy sources available today, solar energy is becoming increasingly important in the drive to achieve energy independence and climate balance. This new book is the masterwork from world-renowned expert Dr. Soteris Kalogirou, who has championed solar energy for decades. The book includes all areas of solar energy engineering, from the fundamentals to the highest level of current research. The author includes pivotal subjects such as solar collectors, solar water heating, solar space heating and cooling, industrial process heat, solar desalina

  13. Development of continuous pharmaceutical production processes supported by process systems engineering methods and tools

    DEFF Research Database (Denmark)

    Gernaey, Krist; Cervera Padrell, Albert Emili; Woodley, John

    2012-01-01

    The pharmaceutical industry is undergoing a radical transition towards continuous production processes. Systematic use of process systems engineering (PSE) methods and tools forms the key to achieving this transition in a structured and efficient way.

  14. Retrofitting automated process control systems at Ukrainian power stations

    Energy Technology Data Exchange (ETDEWEB)

    B.E. Simkin; V.S. Naumchik; B.D. Kozitskii (and others) [OAO L'vovORGRES, Lviv (Ukraine)

    2008-04-15

    Approaches and principles for retrofitting automated process control systems at Ukrainian power stations are considered. The results obtained from retrofitting the monitoring and control system of Unit 9 at the Burshtyn thermal power station are described.

  15. A signal processing method for the friction-based endpoint detection system of a CMP process

    Energy Technology Data Exchange (ETDEWEB)

    Xu Chi; Guo Dongming; Jin Zhuji; Kang Renke, E-mail: xuchi_dut@163.com [Key Laboratory for Precision and Non-Traditional Machining Technology of Ministry of Education, Dalian University of Technology, Dalian 116024 (China)

    2010-12-15

    A signal processing method for the friction-based endpoint detection system of a chemical mechanical polishing (CMP) process is presented. The signal processing method uses the wavelet threshold denoising method to reduce the noise contained in the measured original signal, extracts the Kalman filter innovation from the denoised signal as the feature signal, and judges the CMP endpoint based on the features of the Kalman filter innovation sequence during the CMP process. Applying the signal processing method, endpoint detection experiments for the Cu CMP process were carried out. The results show that the signal processing method can judge the endpoint of the Cu CMP process. (semiconductor technology)
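
    The denoise-then-Kalman pipeline is only named in the abstract, so the Python sketch below illustrates just the innovation step: a scalar constant-level Kalman filter tracks the (already denoised) friction signal, and the endpoint is flagged when the innovation stays large, indicating a change in the material being polished. All tuning constants are placeholders, not values from the cited experiments.

        import numpy as np

        def innovation_endpoint(signal, q=1e-4, r=1e-2, thresh=0.1, run=5):
            """Return the first index where the Kalman innovation exceeds `thresh`
            for `run` consecutive samples (a simple change indicator), else None."""
            x, p = signal[0], 1.0
            streak = 0
            for k, z in enumerate(signal[1:], start=1):
                p += q                    # predict (constant-level model)
                innov = z - x             # innovation: measurement minus prediction
                gain = p / (p + r)
                x += gain * innov         # update state estimate
                p *= (1.0 - gain)
                streak = streak + 1 if abs(innov) > thresh else 0
                if streak >= run:
                    return k
            return None

        # usage: friction-like signal with a step when the copper layer clears
        rng = np.random.default_rng(1)
        sig = np.concatenate([np.ones(200), 0.6 * np.ones(100)]) + 0.01 * rng.standard_normal(300)
        print(innovation_endpoint(sig))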

  16. Standardization of information systems development processes and banking industry adaptations

    CERN Document Server

    Tanrikulu, Zuhal

    2011-01-01

    This paper examines the current system development processes of three major Turkish banks in terms of compliance to internationally accepted system development and software engineering standards to determine the common process problems of banks. After an in-depth investigation into system development and software engineering standards, related process-based standards were selected. Questions were then prepared covering the whole system development process by applying the classical Waterfall life cycle model. Each question is made up of guidance and suggestions from the international system development standards. To collect data, people from the information technology departments of three major banks in Turkey were interviewed. Results have been aggregated by examining the current process status of the three banks together. Problematic issues were identified using the international system development standards.

  17. Performance engineering for industrial embedded data-processing systems

    NARCIS (Netherlands)

    Hendriks, M.; Verriet, J.; Basten, T.; Brassn, M.; Dankers, R.; Laan, R.; Lint, A.; Moneva, H.; Somers, L.; Willekens, M.

    2015-01-01

    Performance is a key aspect of many embedded systems, embedded data processing systems in particular. System performance can typically only be measured in the later stages of system development. To avoid expensive re-work in the final stages of development, it is essential to have accurate performan

  18. Analysis of bilinear stochastic systems. [involving multiplicative noise processes

    Science.gov (United States)

    Willsky, A. S.; Marcus, S. I.; Martin, D. N.

    1974-01-01

    Analysis of stochastic dynamical systems that involve multiplicative (bilinear) noise processes is considered. After defining the systems of interest, the evolution of the moments of such systems, the question of stochastic stability, and estimation for bilinear stochastic systems are discussed. Both exact and approximate methods of analysis are introduced, and, in particular, the uses of Lie-theoretic concepts and harmonic analysis are discussed.
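
    For orientation only (this is a generic textbook form, not necessarily the notation of the paper), a bilinear stochastic system with multiplicative noise can be written as the Ito equation

        \[
        dx(t) = A\,x(t)\,dt + \sum_{i=1}^{m} B_i\,x(t)\,dw_i(t),
        \]

    where \(x \in \mathbb{R}^n\) and the \(w_i\) are independent Wiener processes. Ito's formula then gives closed moment equations, e.g. \(\tfrac{d}{dt}\mathbb{E}[x] = A\,\mathbb{E}[x]\) and \(\tfrac{d}{dt}P = AP + PA^{\mathsf{T}} + \sum_i B_i P B_i^{\mathsf{T}}\) with \(P = \mathbb{E}[xx^{\mathsf{T}}]\), which is the starting point for the moment and stability analysis mentioned in the abstract.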

  19. Globally stable control systems for processes with input multiplicities

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jietae [Kyungpook National University, Daegu (Korea, Republic of); Edgar, Thomas F. [University of Texas, Austin (United States)

    2016-02-15

    A nonlinear process with input multiplicity has two or more input values for a given output at the steady state, and the process steady state gain changes its sign as the operating point changes. A control system with integral action will be unstable when both signs of the process gain and the controller integral gain are different, and its stability region will be limited to the boundary where the process steady state gain is zero. Unlike processes with output multiplicities, feedback controllers cannot be used to correct the sign changes of process gain. To remove such stability limitation, a simple control system with parallel compensator is proposed. The parallel compensator can be easily designed based on the process steady state gain information and tuned in the field. Using the two time scale method, the stability of proposed control systems for processes with input multiplicities can be checked.
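
    A minimal worked example of input multiplicity and of the parallel-compensation idea (the numbers are illustrative and are not taken from the paper): consider the steady-state map

        \[
        y = f(u) = u\,(2-u), \qquad \frac{dy}{du} = 2 - 2u,
        \]

    so that u and 2-u produce the same output and the steady-state gain changes sign at u = 1. If the controller instead acts on the compensated output \(\tilde{y} = f(u) + k\,u\), the gain becomes \(2 - 2u + k\), which remains positive over an operating range \(0 \le u \le u_{\max}\) whenever \(k > 2u_{\max} - 2\), so an integral controller no longer meets a gain sign change.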

  20. Process Planning Support System for Green Manufacturing and its application

    Institute of Scientific and Technical Information of China (English)

    HE Yan; LIU Fei; CAO Huajun; ZHANG Hua

    2007-01-01

    Owing to a lack of practical methods and software tools in existing research on green manufacturing (GM), a process planning support system for green manufacturing (GMPPSS) was developed to deal with the problems in the optimization of environment-benign process planning. The GMPPSS consists mainly of three function modules and related model repositories: selection of process elements, optimization of process courses, and evaluation of process projects for GM. The database of the GMPPSS provides plentiful information on resource consumption and environmental impact in manufacturing processes; it consists of a process attribute database, inventory database, machine database, tool database, and cutting fluid database. Raw materials, secondary material consumption, energy consumption, and environmental impacts of process planning were optimized to improve the green attributes of the process planning of parts, with the support of these databases and model repositories. The gear processing in a machine tool factory is presented to verify the system's applicability.

  1. Grey systems for intelligent sensors and information processing

    Institute of Scientific and Technical Information of China (English)

    Chen Chunlin; Dong Daoyi; Chen Zonghai; Wang Haibo

    2008-01-01

    In a measurement system, new representation methods are necessary to maintain uncertainty and to provide more powerful capabilities for reasoning and transformation between numerical and symbolic systems. A grey measurement system is discussed from the point of view of intelligent sensors and incomplete information processing, compared with numerical and symbolized measurement systems. Methods of grey representation and information processing are proposed for data collection and reasoning. As a case study, multi-ultrasonic sensor systems are demonstrated to verify the effectiveness of the proposed methods.

  2. Automated business processes in outbound logistics: An information system perspective

    DEFF Research Database (Denmark)

    Tambo, Torben

    2010-01-01

    by securing a much higher quality of data and eliminating a number of defect management processes in the domain of fashion wholesale and retail. The information system perspective is used in dealing with adopting bespoke ERP development effort of creating an ERP system on the leading edge of business process...

  3. Eco-efficiency of grinding processes and systems

    CERN Document Server

    Winter, Marius

    2016-01-01

    This research monograph aims at presenting an integrated assessment approach to describe, model, evaluate and improve the eco-efficiency of existing and new grinding processes and systems. Various combinations of grinding process parameters and system configurations can be evaluated based on the eco-efficiency. The book presents the novel concept of empirical and physical modeling of technological, economic and environmental impact indicators. This includes the integrated evaluation of different grinding process and system scenarios. The book is a valuable read for research experts and practitioners in the field of eco-efficiency of manufacturing processes but the book may also be beneficial for graduate students.

  4. The Technique of Building a Networked Manufacturing Process Monitoring System

    Institute of Scientific and Technical Information of China (English)

    XIE Yong; ZHANG Yu; YANG Musheng

    2006-01-01

    This paper introduces the composition, structure and software model of a networked manufacturing process monitoring system. JAVA network techniques are used to realize a three-layer distributed manufacturing process monitoring system comprising a remote management center, a manufacturing process supervision center, and the units of the measurement and control layer, such as displacement sensors and temperature measurement and alarm devices. Network integration of the production management layer, the process control layer and the hardware control layer is realized with this approach. The object-oriented design based on JAVA can easily be ported to different operating systems with good expansibility.

  5. SIMULATION AS A TOOL FOR PROCESS OPTIMIZATION OF LOGISTIC SYSTEMS

    Directory of Open Access Journals (Sweden)

    Radko Popovič

    2015-09-01

    Full Text Available The paper deals with the simulation of production processes, in particular with the Tecnomatix module of Siemens software. Tecnomatix Process Simulate is designed for building new or modifying existing production processes. A simulation created in this software allows fast testing of planned changes or improvements of the production processes. On the basis of the simulation you can envisage the future picture of the real production system. A 3D simulation can reflect the actual status and conditions of the running system and, after some improvements, it can show the possible shape of the production system.

  6. Virtual Systems Pharmacology (ViSP software for mechanistic system-level model simulations

    Directory of Open Access Journals (Sweden)

    Sergey eErmakov

    2014-10-01

    Full Text Available Multiple software programs are available for designing and running large scale system-level pharmacology models used in the drug development process. Depending on the problem, scientists may be forced to use several modeling tools, which could increase model development time, IT costs and so on. Therefore it is desirable to have a single platform that allows setting up and running large-scale simulations for models that have been developed with different modeling tools. We developed a workflow and a software platform in which a model file is compiled into a self-contained executable that is no longer dependent on the software that was used to create the model. At the same time the full model specifics are preserved by presenting all model parameters as input parameters for the executable. This platform was implemented as a model-agnostic, therapeutic-area-agnostic and web-based application with a database back-end that can be used to configure, manage and execute large-scale simulations for multiple models by multiple users. The user interface is designed to be easily configurable to reflect the specifics of the model and the user's particular needs, and the back-end database has been implemented to store and manage all aspects of the system, such as Models, Virtual Patients, User Interface Settings, and Results. The platform can be adapted and deployed on an existing cluster or cloud computing environment. Its use was demonstrated with a metabolic disease systems pharmacology model that simulates the effects of two antidiabetic drugs, metformin and fasiglifam, in type 2 diabetes mellitus patients.
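
    The "compile once, drive by parameters" workflow can be sketched as below; the executable name, command-line flags and JSON result format are invented for illustration and do not describe the actual ViSP interfaces.

        import json
        import subprocess

        def run_simulation(executable, parameters, output_file="result.json"):
            # Launch a self-contained model executable, passing every model
            # parameter on the command line so the original modeling tool is
            # not needed at simulation time.
            args = [executable, "--out", output_file]
            for name, value in parameters.items():
                args += ["--param", f"{name}={value}"]
            subprocess.run(args, check=True)
            with open(output_file) as fh:
                return json.load(fh)

        # hypothetical virtual-patient parameter sets
        patients = [{"body_weight": 70, "dose_mg": 500},
                    {"body_weight": 90, "dose_mg": 750}]
        results = [run_simulation("./metabolic_model", p) for p in patients]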

  7. SUPPLY CHAIN MANAGEMENT SYSTEM USING THE PROPERTY OF GRAPHICAL USER INTERFACE

    Directory of Open Access Journals (Sweden)

    Venkatesan.M,

    2011-02-01

    Full Text Available Manufacturing companies use supply chain management (SCM) systems. SCM is a system for managing raw material and finished goods requirements in a manufacturing process. It is a set of techniques that uses inventory data and requirements for materials and goods, and the system also makes recommendations for purchasing, sale and dispatch of raw materials for job work. The main objective of the project is to manage the raw materials and finished goods in the manufacturing organization. In this software the information is stored in a database, which produces reports on demand, and a very reliable front-end structure with GUI properties makes the system easy to understand and operate correctly, even for a layman. The project is developed using Oracle Developer 2000 Forms 6i as the front-end, Oracle 8.0 as the back-end, and Oracle Developer 2000 Reports 6i as the reporting tool.

  8. Inclusive Education as Complex Process and Challenge for School System

    Science.gov (United States)

    Al-Khamisy, Danuta

    2015-01-01

    Education may be considered as a number of processes, actions and effects affecting human being, as the state or level of the results of these processes or as the modification of the functions, institutions and social practices roles, which in the result of inclusion become new, integrated system. Thus this is very complex process. Nowadays the…

  9. A new data acquisition and processing system for profiling sonar

    Institute of Scientific and Technical Information of China (English)

    XU Xiao-ka; SANG En-fang; QIAO Gang; WANG Ji-sheng

    2008-01-01

    A multi-beam chirp sonar based on IP connections and DSP processing nodes was proposed and designed to provide an expandable system with high-speed processing and mass storage of real-time signals for multi-beam profiling sonar. The system was designed for seabed petroleum pipeline detection and orientation, and can receive echo signals and process the data in real time, refreshing the display 10 times per second. Every node of the chirp sonar connects with the data processing nodes through TCP/IP. Merely by adding nodes, the system's processing ability can be increased proportionately without changing the software. System debugging and experimental testing proved the system to be practical and stable. This design provides a new method for high-speed active sonar.

  10. Industrial process system assessment: bridging process engineering and life cycle assessment through multiscale modeling.

    Science.gov (United States)

    The Industrial Process System Assessment (IPSA) methodology is a multiple step allocation approach for connecting information from the production line level up to the facility level and vice versa using a multiscale model of process systems. The allocation procedure assigns inpu...

  11. The operation technology of realtime image processing system (Datacube)

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Jai Wan; Lee, Yong Bum; Lee, Nam Ho; Choi, Young Soo; Park, Soon Yong; Park, Jin Seok

    1997-02-01

    In this project, a Sparc VME-based MaxSparc system running the Solaris operating environment was selected as the dedicated image processing hardware for robot vision applications. In this report, the operation of the Datacube MaxSparc system, a high-performance real-time image processing platform, is systematized, and image flow example programs for running the MaxSparc system are studied and analyzed. The state of the art of Datacube system utilization is also reviewed. For the next phase, an advanced real-time image processing platform for robot vision applications is going to be developed. (author). 19 refs., 71 figs., 11 tabs.

  12. Big Data Processing in Complex Hierarchical Network Systems

    CERN Document Server

    Polishchuk, Olexandr; Tyutyunnyk, Maria; Yadzhak, Mykhailo

    2016-01-01

    This article covers the problem of processing Big Data that describe the operation of complex networks and network systems. It also introduces the notion of combining hierarchical network systems into associations and conglomerates, alongside the combination of complex networks into multiplexes. Methods for studying global network structures are analysed depending on the purpose of the research. The main types of information flows in complex hierarchical network systems, the basic components of associations and conglomerates, are also covered. Approaches are proposed for the creation of efficient computing environments, the organization of distributed computations, and the parallelization of information processing methods at different levels of the system hierarchy.

  13. INFORMATION SYSTEM OF AUTOMATION OF PREPARATION EDUCATIONAL PROCESS DOCUMENTS

    Directory of Open Access Journals (Sweden)

    V. A. Matyushenko

    2016-01-01

    Full Text Available Information technology is rapidly conquering the world, permeating all spheres of human activity. Education is no exception. An important direction of the informatization of education is the development of university management systems. Modern information systems improve and facilitate the management of all types of activities of an institution. The purpose of this paper is the development of a system that allows automating the preparation of accounting documents for the educational process. The article describes the problem of preparing educational process documents. It was decided to design and create the information system in the Microsoft Access environment. The result is four types of reports obtained by using the developed system. The use of this system allows automating the process and reducing the effort required to prepare accounting documents. All reports were implemented in the Microsoft Excel software product and can be used for further analysis and processing.

  14. How Workflow Systems Facilitate Business Process Reengineering and Improvement

    Directory of Open Access Journals (Sweden)

    Mohamed El Khadiri

    2012-03-01

    Full Text Available This paper investigates the relationship between workflow systems and business process reengineering and improvement. The study is based on a real case study at the “Centre Régional d'Investissement” (CRI of Marrakech, Morocco. The CRI is entrusted to coordinate various investment projects at the regional level. Our previous work has shown that a workflow system can be a basis for business process reengineering. However, for continuous process improvement, the system has been shown to be insufficient as it fails to deal with the exceptions and problem resolutions that informal communications provide. However, when this system is augmented with an expanded corporate memory system that includes social tools, to capture informal communication and data, we are closer to a more complete system that facilitates business process reengineering and improvement.

  15. Optimal control of switched systems arising in fermentation processes

    CERN Document Server

    Liu, Chongyang

    2014-01-01

    The book presents, in a systematic manner, the optimal controls under different mathematical models in fermentation processes. Variant mathematical models – i.e., those for multistage systems; switched autonomous systems; time-dependent and state-dependent switched systems; multistage time-delay systems and switched time-delay systems – for fed-batch fermentation processes are proposed and the theories and algorithms of their optimal control problems are studied and discussed. By putting forward novel methods and innovative tools, the book provides a state-of-the-art and comprehensive systematic treatment of optimal control problems arising in fermentation processes. It not only develops nonlinear dynamical system, optimal control theory and optimization algorithms, but can also help to increase productivity and provide valuable reference material on commercial fermentation processes.
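
    In generic form (stated here only as background, not as the book's specific models), an optimal control problem for a switched system with a fixed mode sequence and free switching times reads

        \[
        \min_{u(\cdot),\;\tau_1<\dots<\tau_{N-1}}\;
        \Phi\bigl(x(T)\bigr) + \sum_{i=1}^{N}\int_{\tau_{i-1}}^{\tau_i} L_i\bigl(x(t),u(t)\bigr)\,dt
        \quad\text{s.t.}\quad
        \dot{x}(t) = f_i\bigl(x(t),u(t)\bigr),\; t\in[\tau_{i-1},\tau_i),
        \]

    with \(\tau_0 = 0\), \(\tau_N = T\), continuity of the state at the switching instants, and, in fed-batch applications, additional state and input constraints; the switching times themselves are decision variables.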

  16. Massive MIMO Systems: Signal Processing Challenges and Research Trends

    OpenAIRE

    de Lamare, R.C.

    2013-01-01

    This article presents a tutorial on multiuser multiple-antenna wireless systems with a very large number of antennas, known as massive multi-input multi-output (MIMO) systems. Signal processing challenges and future trends in the area of massive MIMO systems are presented and key application scenarios are detailed. A linear algebra approach is considered for the description of the system and data models of massive MIMO architectures. The operational requirements of massive MIMO systems are di...
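
    The linear-algebra description referred to above rests on the standard narrowband uplink model (given here only as background):

        \[
        \mathbf{y} = \mathbf{H}\,\mathbf{x} + \mathbf{n}, \qquad
        \mathbf{H} \in \mathbb{C}^{M\times K},\; M \gg K,
        \]

    where \(M\) is the number of base-station antennas, \(K\) the number of single-antenna users and \(\mathbf{n}\) additive noise; as \(M/K\) grows, simple linear detectors such as matched filtering \(\hat{\mathbf{x}} = \mathbf{H}^{H}\mathbf{y}\) or zero forcing \(\hat{\mathbf{x}} = (\mathbf{H}^{H}\mathbf{H})^{-1}\mathbf{H}^{H}\mathbf{y}\) approach optimal performance, which is one reason signal processing for massive MIMO emphasizes low-complexity linear methods.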

  17. Process Control System Cyber Security Standards - An Overview

    Energy Technology Data Exchange (ETDEWEB)

    Robert P. Evans

    2006-05-01

    The use of cyber security standards can greatly assist in the protection of process control systems by providing guidelines and requirements for the implementation of computer-controlled systems. These standards are most effective when the engineers and operators, using the standards, understand what each standard addresses. This paper provides an overview of several standards that deal with the cyber security of process measurements and control systems.

  18. Process of activation of a palladium catalyst system

    Science.gov (United States)

    Sobolevskiy, Anatoly; Rossin, Joseph A.; Knapke, Michael J.

    2011-08-02

    Improved processes for activating a catalyst system used for the reduction of nitrogen oxides are provided. In one embodiment, the catalyst system is activated by passing an activation gas stream having an amount of each of oxygen, water vapor, nitrogen oxides, and hydrogen over the catalyst system and increasing a temperature of the catalyst system to a temperature of at least 180 °C at a heating rate of from 1-20°/min. Use of activation processes described herein leads to a catalyst system with superior NOx reduction capabilities.

  19. Software control and system configuration management - A process that works

    Science.gov (United States)

    Petersen, K. L.; Flores, C., Jr.

    1983-01-01

    A comprehensive software control and system configuration management process for flight-crucial digital control systems of advanced aircraft has been developed and refined to ensure efficient flight system development and safe flight operations. Because of the highly complex interactions among the hardware, software, and system elements of state-of-the-art digital flight control system designs, a systems-wide approach to configuration control and management has been used. Specific procedures are implemented to govern discrepancy reporting and reconciliation, software and hardware change control, systems verification and validation testing, and formal documentation requirements. An active and knowledgeable configuration control board reviews and approves all flight system configuration modifications and revalidation tests. This flexible process has proved effective during the development and flight testing of several research aircraft and remotely piloted research vehicles with digital flight control systems that ranged from relatively simple to highly complex, integrated mechanizations.

  20. LabData database sub-systems for post-processing and quality control of stable isotope and gas chromatography measurements

    Science.gov (United States)

    Suckow, A. O.

    2013-12-01

    Measurements need post-processing to obtain results that are comparable between laboratories. Raw data may need to be corrected for blank, memory, drift (change of reference values with time), linearity (dependence of reference on signal height) and normalized to international reference materials. Post-processing parameters need to be stored for traceability of results. State of the art stable isotope correction schemes are available based on MS Excel (Geldern and Barth, 2012; Gröning, 2011) or MS Access (Coplen, 1998). These are specialized to stable isotope measurements only, often only to the post-processing of a special run. Embedding of the algorithms into a multipurpose database system was missing. This is necessary to combine results of different tracers (3H, 3He, 2H, 18O, CFCs, SF6...) or geochronological tools (sediment dating e.g. with 210Pb, 137Cs), to relate to attribute data (submitter, batch, project, geographical origin, depth in core, well information etc.) and for further interpretation tools (e.g. lumped parameter modelling). Database sub-systems to the LabData laboratory management system (Suckow and Dumke, 2001) are presented for stable isotopes and for gas chromatographic CFC and SF6 measurements. The sub-system for stable isotopes allows the following post-processing: 1. automated import from measurement software (Isodat, Picarro, LGR), 2. correction for sample-to-sample memory, linearity, drift, and renormalization of the raw data. The sub-system for gas chromatography covers: 1. storage of all raw data, 2. storage of peak integration parameters, 3. correction for blank, efficiency and linearity. The user interface allows interactive and graphical control of the post-processing and all corrections by export to and plot in MS Excel and is a valuable tool for quality control. The sub-databases are integrated into LabData, a multi-user client-server architecture using MS SQL server as back-end and an MS Access front-end and installed in four
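
    As a minimal sketch of one of the listed corrections, a two-point normalization of raw delta values to two international reference materials measured in the same run can be written as below; the reference values are placeholders and the code does not reproduce the LabData implementation.

        def normalize_two_point(delta_raw, ref_measured, ref_certified):
            # Linear (stretch-and-shift) normalization using two reference
            # materials bracketing the samples in the same run.
            (m1, m2), (c1, c2) = ref_measured, ref_certified
            slope = (c2 - c1) / (m2 - m1)
            return [c1 + slope * (d - m1) for d in delta_raw]

        # hypothetical raw delta-18O values (per mil) and standards
        samples = [-8.13, -7.95, -8.40]
        corrected = normalize_two_point(samples,
                                        ref_measured=(-9.8, 0.2),
                                        ref_certified=(-9.5, 0.0))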

  1. Markov Skeleton Processes and Applications to Queueing Systems

    Institute of Scientific and Technical Information of China (English)

    Zhen-ting Hou

    2002-01-01

    In this paper, we apply the backward equations of Markov skeleton processes to queueing systems. The transient distribution of the waiting time of a GI/G/1 queueing system, the transient distribution of the length of a GI/G/N queueing system and the transient distribution of the length of queueing networks are obtained.
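
    For orientation only (the paper's own derivation goes through the backward equations of Markov skeleton processes), the waiting time of a GI/G/1 queue satisfies Lindley's recursion

        \[
        W_{n+1} = \max\bigl(0,\; W_n + S_n - A_{n+1}\bigr),
        \]

    where \(W_n\) is the waiting time of the \(n\)-th customer, \(S_n\) its service time and \(A_{n+1}\) the interarrival time to the next customer; the transient distributions obtained in the paper concern the law of such quantities at finite \(n\) or finite time rather than only in steady state.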

  2. Integration of Management Systems: A Process Based Model

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    The paper discusses the barriers to integration of management systems (IMS). A model based on process is explored. It is indicated that integrating management systems should not ignore the characteristics of the individual management systems, especially scope issues. IMS also needs to take continuous improvement into account.

  3. A COST ORIENTED SYSTEM FOR HOLE MAKING PROCESSES

    Directory of Open Access Journals (Sweden)

    Uğur PAMUKOĞLU

    2004-01-01

    Full Text Available A knowledge-based system for various hole-making processes has been developed. In the system, selection of machining methods, determination of sequences based on cutting tools for each process, determination of process time, and cost analysis have been conducted. In the procedure, all available processes are taken into account with regard to their costs, and the most cost-effective one is chosen. The system helps facilitate the determination of process time and the costs of the features to be manufactured. It is especially useful for quick cost estimation. In addition, the system assists people who are inexperienced in manufacturing operations, so that they can be employed in the related manufacturing stages.

  4. Microscale and Nanoscale Process Systems Engineering: Challenge and Progress

    Institute of Scientific and Technical Information of China (English)

    杨友麒

    2008-01-01

    This is an overview of the development of process systems engineering (PSE) in a smaller world. Two different spatio-temporal scopes are identified for microscale and nanoscale process systems. The features and challenges of each scale are reviewed, and the different methodologies used by them discussed. A comparison of these two new areas with traditional process systems engineering is given. While microscale PSE can be considered an extension of traditional PSE, nanoscale PSE should be accepted as a new discipline with a looser connection to the extant core of chemical engineering. Since "molecular factories" are the next frontier of processing scale, nanoscale PSE will be the new theory for handling the design, simulation and operation of those active processing systems.

  5. Performance appraisal process and system facets: relationship with contextual performance.

    Science.gov (United States)

    Findley, H M; Giles, W F; Mossholder, K W

    2000-08-01

    Because appraisal-related interactions between supervisors and employees may influence more than task performance, the authors considered the potential effects of social and interpersonal processes in performance appraisal on contextual performance. They hypothesized that performance appraisal process and system facets were associated with employees' contextual performance as well as with their perceptions of appraisal accuracy. After controlling relevant variables, they found that appraisal process facets explained variance in contextual performance and perceived accuracy beyond that accounted for by the system facets. However, when the order of entry for the process and system variable sets was reversed, only for perceived appraisal accuracy, as hypothesized, did the system facets account for variance beyond that explained by the appraisal process facets.

  6. 1st International Conference on Cognitive Systems and Information Processing

    CERN Document Server

    Hu, Dewen; Liu, Huaping

    2014-01-01

    "Foundations and Practical Applications of Cognitive Systems and Information Processing" presents selected papers from the First International Conference on Cognitive Systems and Information Processing, held in Beijing, China on December 15-17, 2012 (CSIP2012). The aim of this conference is to bring together experts from different fields of expertise to discuss the state-of-the-art in artificial cognitive systems and advanced information processing, and to present new findings and perspectives on future development. This book introduces multidisciplinary perspectives on the subject areas of Cognitive Systems and Information Processing, including cognitive sciences and technology, autonomous vehicles, cognitive psychology, cognitive metrics, information fusion, image/video understanding, brain-computer interfaces, visual cognitive processing, neural computation, bioinformatics, etc. The book will be beneficial for both researchers and practitioners in the fields of Cognitive Science, Computer Science and Cogni...

  7. Competing particle systems evolving by interacting Levy processes

    CERN Document Server

    Shkolnikov, Mykhaylo

    2010-01-01

    We consider finite and infinite systems of particles on the real line and half-line evolving in continuous time. The particles are driven by i.i.d. Levy processes endowed with rank-dependent drift and diffusion coefficients. In the finite systems we show that the processes of gaps in the respective particle configurations possess unique invariant distributions and prove the convergence of the gap processes to the latter in the total variation distance, assuming a bound on the jumps of the Levy processes. In the infinite case we show that the gap process of the particle system on the half-line is tight for appropriate initial conditions and the same drift and diffusion coefficients for all particles. Applications of such processes include the modelling of capital distributions among the ranked participants in a financial market, the stability of certain stochastic queueing and storage networks and the study of the Sherrington-Kirkpatrick model of spin glasses.
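
    Schematically (as a reading aid only, not the paper's exact setting), such rank-based systems take the form

        \[
        dX_i(t) = \gamma_{r_i(t)}\,dt + \sigma_{r_i(t)}\,dL_i(t), \qquad i = 1,\dots,N,
        \]

    where \(L_1,\dots,L_N\) are i.i.d. Levy processes and \(r_i(t)\) is the current rank of particle \(i\), so the drift \(\gamma_k\) and dispersion \(\sigma_k\) felt by a particle depend only on its rank; the gap process studied in the abstract is then the vector of spacings \(X_{(1)}-X_{(2)}, X_{(2)}-X_{(3)}, \dots\) between the ranked particles.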

  8. An extended process automation system : an approach based on a multi-agent system

    OpenAIRE

    Seilonen, Ilkka

    2006-01-01

    This thesis describes studies on application of multi-agent systems (acronym: MAS) to enhance process automation systems. A specification of an extended process automation system is presented. According to this specification, MAS can be used to extend the functionality of ordinary process automation systems at higher levels of control. Anticipated benefits of the specification include enhanced reconfigurability, responsiveness and flexibility properties of process automation. Previous res...

  9. Thermal processing system concepts and considerations for RWMC buried waste

    Energy Technology Data Exchange (ETDEWEB)

    Eddy, T.L.; Kong, P.C.; Raivo, B.D.; Anderson, G.L.

    1992-02-01

    This report presents a preliminary determination of ex situ thermal processing system concepts and related processing considerations for application to remediation of transuranic (TRU)-contaminated buried wastes (TRUW) at the Radioactive Waste Management Complex (RWMC) of the Idaho National Engineering Laboratory (INEL). Beginning with top-level thermal treatment concepts and requirements identified in a previous Preliminary Systems Design Study (SDS), a more detailed consideration of the waste materials thermal processing problem is provided. Anticipated waste stream elements and problem characteristics are identified and considered. Final waste form performance criteria, requirements, and options are examined within the context of providing a high-integrity, low-leachability glass/ceramic, final waste form material. Thermal processing conditions required and capability of key systems components (equipment) to provide these material process conditions are considered. Information from closely related companion study reports on melter technology development needs assessment and INEL Iron-Enriched Basalt (IEB) research are considered. Five potentially practicable thermal process system design configuration concepts are defined and compared. A scenario for thermal processing of a mixed waste and soils stream with essentially no complex presorting and using a series process of incineration and high temperature melting is recommended. Recommendations for applied research and development necessary to further detail and demonstrate the final waste form, required thermal processes, and melter process equipment are provided.

  10. Integrated System for Design and Analysis of Separation Processes with Electrolyte Systems

    DEFF Research Database (Denmark)

    Takano, Kiyoteru; Gani, Rafiqul; Ishikawa, T.

    2000-01-01

    A thermodynamic insights based algorithm for integrated design and analysis of crystallization processes with electrolyte systems is presented. This algorithm consists of a thermodynamic calculation part, a process design/analysis part and a process simulation part, which are integrated through...... of the integrated system are illustrated through two case studies where one represents an industrial crystallization process....

  11. Design and Implementation of an Embedded NIOS II System for JPEG2000 Tier II Encoding

    Directory of Open Access Journals (Sweden)

    John M. McNichols

    2013-01-01

    Full Text Available This paper presents a novel implementation of the JPEG2000 standard as a system on a chip (SoC). While most of the research in this field centers on acceleration of the EBCOT Tier I encoder, this work focuses on an embedded solution for EBCOT Tier II. Specifically, this paper proposes using an embedded softcore processor to perform Tier II processing as the back end of an encoding pipeline. The Altera NIOS II processor is chosen for the implementation and is coupled with existing embedded processing modules to realize a fully embedded JPEG2000 encoder. The design is synthesized on a Stratix IV FPGA and is shown to outperform other comparable SoC implementations by 39% in computation time.

  12. Advanced Manufacturing Systems in Food Processing and Packaging Industry

    Science.gov (United States)

    Shafie Sani, Mohd; Aziz, Faieza Abdul

    2013-06-01

    In this paper, several advanced manufacturing systems in the food processing and packaging industry are reviewed, including biodegradable smart packaging and nanocomposites, and advanced automation control systems consisting of fieldbus technology, distributed control systems and food safety inspection features. The main purpose of current technology in the food processing and packaging industry is discussed with respect to the major concerns of plant process efficiency, productivity, quality, and safety. These applications were chosen because they are robust, flexible, reconfigurable, preserve the quality of the food, and are efficient.

  13. Nonterrestrial material processing and manufacturing of large space systems

    Science.gov (United States)

    Von Tiesenhausen, G.

    1979-01-01

    Nonterrestrial processing of materials and manufacturing of large space system components from preprocessed lunar materials at a manufacturing site in space is described. Lunar materials mined and preprocessed at the lunar resource complex will be flown to the space manufacturing facility (SMF), where together with supplementary terrestrial materials, they will be final processed and fabricated into space communication systems, solar cell blankets, radio frequency generators, and electrical equipment. Satellite Power System (SPS) material requirements and lunar material availability and utilization are detailed, and the SMF processing, refining, fabricating facilities, material flow and manpower requirements are described.

  14. A web accessible scientific workflow system for vadoze zone performance monitoring: design and implementation examples

    Science.gov (United States)

    Mattson, E.; Versteeg, R.; Ankeny, M.; Stormberg, G.

    2005-12-01

    Long term performance monitoring has been identified by DOE, DOD and EPA as one of the most challenging and costly elements of contaminated site remedial efforts. Such monitoring should provide timely and actionable information relevant to a multitude of stakeholder needs. This information should be obtained in a manner which is auditable, cost effective and transparent. Over the last several years INL staff has designed and implemented a web accessible scientific workflow system for environmental monitoring. This workflow environment integrates distributed, automated data acquisition from diverse sensors (geophysical, geochemical and hydrological) with server side data management and information visualization through flexible browser based data access tools. Component technologies include a rich browser-based client (using dynamic javascript and html/css) for data selection, a back-end server which uses PHP for data processing, user management, and result delivery, and third party applications which are invoked by the back-end using webservices. This system has been implemented and is operational for several sites, including the Ruby Gulch Waste Rock Repository (a capped mine waste rock dump on the Gilt Edge Mine Superfund Site), the INL Vadoze Zone Research Park and an alternative cover landfill. Implementations for other vadoze zone sites are currently in progress. These systems allow for autonomous performance monitoring through automated data analysis and report generation. This performance monitoring has allowed users to obtain insights into system dynamics, regulatory compliance and residence times of water. Our system uses modular components for data selection and graphing and WSDL compliant webservices for external functions such as statistical analyses and model invocations. Thus, implementing this system for novel sites and extending functionality (e.g. adding novel models) is relatively straightforward. As system access requires a standard webbrowser

  15. Expert,Neural and Fuzzy Systems in Process Planning

    Institute of Scientific and Technical Information of China (English)

    1999-01-01

    Computer aided process planning (CAPP) aims at improving efficiency, quality, and productivity in a manufacturing concern through reducing lead-times and costs by utilizing better manufacturing practices thus improving competitiveness in the market. CAPP attempts to capture the thoughts and methods of the experienced process planner. Variant systems are understandable, generative systems can plan new parts. Expert systems increase flexibility, fuzzy logic captures vague knowledge while neural networks learn. The combination of fuzzy, neural and expert system technologies is necessary to capture and utilize the process planning logic. A system that maintains the dependability and clarity of variant systems, is capable of planning new parts, and improves itself through learning is needed by industry.

  16. Integrated Main Propulsion System Performance Reconstruction Process/Models

    Science.gov (United States)

    Lopez, Eduardo; Elliott, Katie; Snell, Steven; Evans, Michael

    2013-01-01

    The Integrated Main Propulsion System (MPS) Performance Reconstruction process provides the MPS post-flight data files needed for postflight reporting to the project integration management and key customers to verify flight performance. This process/model was used as the baseline for the currently ongoing Space Launch System (SLS) work. The process utilizes several methodologies, including multiple software programs, to model integrated propulsion system performance through space shuttle ascent. It is used to evaluate integrated propulsion systems, including propellant tanks, feed systems, rocket engine, and pressurization systems performance throughout ascent based on flight pressure and temperature data. The latest revision incorporates new methods based on main engine power balance model updates to model higher mixture ratio operation at lower engine power levels.

  17. APPLEPIPS /Apple Personal Image Processing System/ - An interactive digital image processing system for the Apple II microcomputer

    Science.gov (United States)

    Masuoka, E.; Rose, J.; Quattromani, M.

    1981-01-01

    Recent developments related to microprocessor-based personal computers have made low-cost digital image processing systems a reality. Image analysis systems built around these microcomputers provide color image displays for images as large as 256 by 240 pixels in sixteen colors. Descriptive statistics can be computed for portions of an image, and supervised image classification can be obtained. The systems support Basic, Fortran, Pascal, and assembler language. A description is provided of a system which is representative of the new microprocessor-based image processing systems currently on the market. While small systems may never be truly independent of larger mainframes, because they lack 9-track tape drives, the independent processing power of the microcomputers will help alleviate some of the turn-around time problems associated with image analysis and display on the larger multiuser systems.

  18. Developing of robot flexible processing system for shipbuilding profile steel

    Institute of Scientific and Technical Information of China (English)

    姚舜; 邱涛; 楼松年; 王宏杰

    2003-01-01

    A robot flexible processing system for shipbuilding profile steel was developed. The system consists of computer integrated control and a robot. An off-line programmed robot was used for marking and cutting of shipbuilding profile steel. In the system the deformation and position error of the profile steel can be detected by precise sensors, and the figure position coordinate error resulting from profile steel deformation can be compensated by modifying the traveling track of the robotic arm online. The practical operation results show that the system performance can meet the needs of profile steel processing.

  19. Web-based software system for processing bilingual digital resources

    Directory of Open Access Journals (Sweden)

    Ralitsa Dutsova

    2014-09-01

    Full Text Available The article describes a software management system developed at the Institute of Mathematics and Informatics, BAS, for the creation, storing and processing of digital language resources in Bulgarian. Independent components of the system are intended for the creation and management of bilingual dictionaries, for information retrieval and data mining from a bilingual dictionary, and for the presentation of aligned corpora. A module which connects these components is also being developed. The system, implemented as a web-application, contains tools for compilation, editing and search within all components.

  20. Gathering Information from Transport Systems for Processing in Supply Chains

    Science.gov (United States)

    Kodym, Oldřich; Unucka, Jakub

    2016-12-01

    The paper deals with a complex system for processing information from means of transport acting as parts of a train (rail or road). It focuses on automated information gathering using AutoID technology, information transmission via Internet of Things networks, and the use of the information in the information systems of logistics firms to support selected processes on the MES and ERP levels. Different kinds of information gathered from the whole transport chain are discussed. Compliance with existing standards is mentioned. Security of information over the full life cycle is an integral part of the presented system. The design of a fully equipped system based on synthesized functional nodes is presented.

  1. Extracting Process and Mapping Management for Heterogennous Systems

    Science.gov (United States)

    Hagara, Igor; Tanuška, Pavol; Duchovičová, Soňa

    2013-12-01

    A lot of papers describe three common methods of data selection from primary systems. This paper defines how to select the correct method or combination of methods to minimize the impact on the production system and common operation. Before using any method, it is necessary to know the primary system and its database structures in order to make optimal use of the actual data structure setup and to find the best design for the ETL process. Database structures are usually categorized into groups which characterize their quality. The classification helps to find the ideal method for each group and thus design an ETL process with minimal impact on the data warehouse and the production system.
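
    As one concrete instance of picking an extraction method with minimal impact on the production system, an incremental (timestamp-watermark) extraction can be sketched as follows; the table and column names are invented for illustration.

        import sqlite3

        def extract_incremental(conn, last_watermark):
            # Pull only the rows changed since the previous ETL run, so the
            # source is scanned incrementally instead of in full.
            cur = conn.execute(
                "SELECT id, payload, updated_at FROM orders "
                "WHERE updated_at > ? ORDER BY updated_at",
                (last_watermark,),
            )
            rows = cur.fetchall()
            new_watermark = rows[-1][2] if rows else last_watermark
            return rows, new_watermark

        # usage: persist the watermark in the staging area between runs
        conn = sqlite3.connect("production_copy.db")   # hypothetical source copy
        rows, watermark = extract_incremental(conn, "2013-01-01T00:00:00")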

  2. Modular trigger processing The GCT muon and quiet bit system

    CERN Document Server

    Stettler, Matthew; Hansen, Magnus; Iles, Gregory; Jones, John; PH-EP

    2007-01-01

    The CMS Global Calorimeter Trigger system's HCAL Muon and Quiet bit reformatting function is being implemented with a novel processing architecture. This architecture utilizes microTCA, a modern modular communications standard based on high speed serial links, to implement a processing matrix. This matrix is configurable in both logical functionality and data flow, allowing far greater flexibility than current trigger processing systems. In addition, the modular nature of this architecture allows flexibility in scale unmatched by traditional approaches. The Muon and Quiet bit system consists of two major components, a custom microTCA backplane and a processing module. These components are based on Xilinx Virtex5 and Mindspeed crosspoint switch devices, bringing together state-of-the-art FPGA-based processing and telecom switching technologies.

  3. Statistical process control methods for expert system performance monitoring.

    Science.gov (United States)

    Kahn, M G; Bailey, T C; Steib, S A; Fraser, V J; Dunagan, W C

    1996-01-01

    The literature on the performance evaluation of medical expert systems is extensive, yet most of the techniques used in the early stages of system development are inappropriate for deployed expert systems. Because extensive clinical and informatics expertise and resources are required to perform evaluations, efficient yet effective methods of monitoring performance during the long-term maintenance phase of the expert system life cycle must be devised. Statistical process control techniques provide a well-established methodology that can be used to define policies and procedures for continuous, concurrent performance evaluation. Although the field of statistical process control has been developed for monitoring industrial processes, its tools, techniques, and theory are easily transferred to the evaluation of expert systems. Statistical process tools provide convenient visual methods and heuristic guidelines for detecting meaningful changes in expert system performance. The underlying statistical theory provides estimates of the detection capabilities of alternative evaluation strategies. This paper describes a set of statistical process control tools that can be used to monitor the performance of a number of deployed medical expert systems. It describes how p-charts are used in practice to monitor the GermWatcher expert system. The case volume and error rate of GermWatcher are then used to demonstrate how different inspection strategies would perform.
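
    The p-chart logic mentioned above reduces to a few lines, sketched here with synthetic daily volumes rather than GermWatcher data: the center line is the pooled error proportion and the 3-sigma limits are recomputed for each subgroup size.

        import math

        def p_chart_limits(errors, cases):
            # Center line and 3-sigma limits for the error proportion,
            # recomputed for each subgroup size n_i.
            p_bar = sum(errors) / sum(cases)
            limits = []
            for n in cases:
                s = math.sqrt(p_bar * (1 - p_bar) / n)
                limits.append((max(0.0, p_bar - 3 * s), p_bar, min(1.0, p_bar + 3 * s)))
            return limits

        # synthetic daily case volumes and disagreement counts
        cases = [120, 98, 133, 110]
        errors = [4, 3, 9, 2]
        for (lcl, cl, ucl), e, n in zip(p_chart_limits(errors, cases), errors, cases):
            flag = "out of control" if not (lcl <= e / n <= ucl) else "in control"
            print(f"p={e / n:.3f}  LCL={lcl:.3f}  UCL={ucl:.3f}  {flag}")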

  4. Bioregenerative Life Support Systems Test Complex (Bio-Plex) Food Processing System: A Dual System

    Science.gov (United States)

    Perchonok, Michele; Vittadini, Elena; Peterson, Laurie J.; Swango, Beverly E.; Toerne, Mary E.; Russo, Dane M. (Technical Monitor)

    2001-01-01

    A Bioregenerative Life Support Test Complex, BIO-Plex, is currently being constructed at the Johnson Space Center (JSC) in Houston, TX. This facility will attempt to answer the questions involved in developing a lunar or planetary base. The Food Processing System (FPS) of the BIO-Plex is responsible for supplying food to the crew in coordination with the chosen mission scenario. Long duration space missions require development of both a Transit Food System and of a Lunar or Planetary Food System. These two systems are intrinsically different since the first one will be utilized in the transit vehicle in microgravity conditions with mostly resupplied foods, while the second will be used in conditions of partial gravity (hypogravity) to process foods from crops grown in the facility. The Transit Food System will consist of prepackaged food of extended shelf life. It will be supplemented with salad crops that will be consumed fresh. Microgravity imposes significant limitation on the ability to handle food and allows only for minimal processing. The challenge is to develop food systems similar to the International Space Station or Shuttle Food Systems but with a shelf life of 3 - 5 years. The Lunar or Planetary Food System will allow for food processing of crops due to the presence of some gravitational force (1/6 to 1/3 that of Earth). Crops such as wheat, soybean, rice, potato, peanut, and salad crops, will be processed to final products to provide a nutritious and acceptable diet for the crew. Not only are constraints imposed on the FPS from the crops (e.g., crop variation, availability, storage and shelf-life) but also significant requirements are present for the crew meals (e.g., RDA, high quality, safety, variety). The FPS becomes a fulcrum creating the right connection from crops to crew meals while dealing with issues of integration within a closed self-regenerative system (e.g., safe processing, waste production, volumes, air contaminations, water usage, etc

  5. Flexible Process Notations for Cross-organizational Case Management Systems

    DEFF Research Database (Denmark)

    Slaats, Tijs

    2016-01-01

    frustration and inefficiency because they do not allow workers to use their expert experience to make the best judgements on how to solve the unique challenges they are faced with. However some structuring of their work is still required to ensure that laws and business rules are being followed. IT Systems...... for process control have a large role to play in structuring and organizing such processes, however most of these systems have been developed with a focus on production work and fail to support the more flexible processes required by knowledge workers. The problem arises at the core of these systems...... of the process and techniques for runtime adaptation. This dissertation reports on the results of the Technologies for Flexible Cross-organizational Case Management Systems (FLExCMS) research project which was started in cooperation between ITU and the company Exformatics A/S. The goals of the project were...

  6. Solar System Processes Underlying Planetary Formation, Geodynamics, and the Georeactor

    CERN Document Server

    Herndon, J M

    2006-01-01

    Only three processes, operant during the formation of the Solar System, are responsible for the diversity of matter in the Solar System and are directly responsible for planetary internal-structures, including planetocentric nuclear fission reactors, and for dynamical processes, including and especially, geodynamics. These processes are: (i) Low-pressure, low-temperature condensation from solar matter in the remote reaches of the Solar System or in the interstellar medium; (ii) High-pressure, high-temperature condensation from solar matter associated with planetary-formation by raining out from the interiors of giant-gaseous protoplanets, and; (iii) Stripping of the primordial volatile components from the inner portion of the Solar System by super-intense solar wind associated with T-Tauri phase mass-ejections, presumably during the thermonuclear ignition of the Sun. As described herein, these processes lead logically, in a causally related manner, to a coherent vision of planetary formation with profound imp...

  7. An Automatic Number Plate Recognition System under Image Processing

    OpenAIRE

    Sarbjit Kaur

    2016-01-01

    An Automatic Number Plate Recognition system is an application of computer vision and image processing technology that takes a photograph of a vehicle as the input image and, by extracting the number plate from the whole vehicle image, displays the number plate information as text. The ANPR system mainly consists of 4 phases: Acquisition of Vehicle Image and Pre-Processing, Extraction of Number Plate Area, Character Segmentation and Character Recognition. The overall accuracy and efficiency of whol...
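
    A compressed sketch of the first two phases (pre-processing and number-plate area extraction) using OpenCV is given below; the thresholds, the aspect-ratio test and the input file name are illustrative choices, and character segmentation and recognition are only indicated in comments.

        import cv2

        def find_plate_candidate(path):
            # Pre-process the vehicle image and return the bounding box of a
            # roughly rectangular, plate-like contour (None if nothing matches).
            img = cv2.imread(path)
            gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
            gray = cv2.bilateralFilter(gray, 11, 17, 17)   # smooth but keep edges
            edges = cv2.Canny(gray, 30, 200)
            found = cv2.findContours(edges, cv2.RETR_TREE, cv2.CHAIN_APPROX_SIMPLE)
            contours = found[0] if len(found) == 2 else found[1]
            for c in sorted(contours, key=cv2.contourArea, reverse=True)[:10]:
                approx = cv2.approxPolyDP(c, 0.02 * cv2.arcLength(c, True), True)
                if len(approx) == 4:                       # four corners: plate-like
                    x, y, w, h = cv2.boundingRect(approx)
                    if 2.0 < w / float(h) < 6.0:           # typical plate aspect ratio
                        return (x, y, w, h)
            return None

        # remaining phases: crop the region, segment characters (e.g. by vertical
        # projection) and pass each glyph to an OCR or classifier stage
        box = find_plate_candidate("vehicle.jpg")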

  8. An expert systems application to space base data processing

    Science.gov (United States)

    Babb, Stephen M.

    1988-01-01

    The advent of space vehicles with their increased data requirements are reflected in the complexity of future telemetry systems. Space based operations with its immense operating costs will shift the burden of data processing and routine analysis from the space station to the Orbital Transfer Vehicle (OTV). A research and development project is described which addresses the real time onboard data processing tasks associated with a space based vehicle, specifically focusing on an implementation of an expert system.

  9. 4th International Conference on Communications, Signal Processing, and Systems

    CERN Document Server

    Mu, Jiasong; Wang, Wei; Zhang, Baoju

    2016-01-01

    This book brings together papers presented at the 4th International Conference on Communications, Signal Processing, and Systems, which provides a venue to disseminate the latest developments and to discuss the interactions and links between these multidisciplinary fields. Spanning topics ranging from Communications, Signal Processing and Systems, this book is aimed at undergraduate and graduate students in Electrical Engineering, Computer Science and Mathematics, researchers and engineers from academia and industry as well as government employees (such as NSF, DOD, DOE, etc).

  10. Signals, processes, and systems an interactive multimedia introduction to signal processing

    CERN Document Server

    Karrenberg, Ulrich

    2013-01-01

    This is a very new concept for learning Signal Processing, not only from the physically-based scientific fundamentals, but also from the didactic perspective, based on modern results of brain research. The textbook together with the DVD form a learning system that provides investigative studies and enables the reader to interactively visualize even complex processes. The unique didactic concept is built on visualizing signals and processes on the one hand, and on graphical programming of signal processing systems on the other. The concept has been designed especially for microelectronics, computer technology and communication. The book allows to develop, modify, and optimize useful applications using DasyLab - a professional and globally supported software for metrology and control engineering. With the 3rd edition, the software is also suitable for 64 bit systems running on Windows 7. Real signals can be acquired, processed and played on the sound card of your computer. The book provides more than 200 pre-pr...

  11. REVISITING THE SIMILAR PROCESS TO ENGINEER THE CONTEMPORARY SYSTEMS

    Institute of Scientific and Technical Information of China (English)

    Ana Luísa RAMOS; José Vasconcelos FERREIRA; Jaume BARCELÓ

    2010-01-01

    This paper addresses the present-day context of Systems Engineering, revisiting and setting up an updated framework for the SIMILAR process in order to use it to engineer contemporary systems. The contemporary world is crowded with large interdisciplinary complex systems made of other systems, personnel, hardware, software, information, processes, and facilities. An integrated holistic approach is crucial to develop these systems and take proper account of their multifaceted nature and numerous interrelationships. As a system's complexity and extent grow, the number of parties involved (stakeholders and shareholders) usually also rises, bringing to the interaction a considerable amount of points of view, skills, responsibilities, and interests. The Systems Engineering approach aims to tackle the complex and interdisciplinary whole of those socio-technical systems, providing the means to enable their successful realization. Its exploitation in our modern world is assuming an increasing relevance, noticeable in emergent standards, academic papers, international conferences, and post-graduate programmes in the field. This work aims to provide "the picture" of modern Systems Engineering, and to update the context of the SIMILAR process model in order to use this renewed framework to engineer the challenging contemporary systems. The emerging trends in the field are also pointed out, with particular reference to the Model-Based Systems Engineering approach.

  12. Digital Signal Processing for In-Vehicle Systems and Safety

    CERN Document Server

    Boyraz, Pinar; Takeda, Kazuya; Abut, Hüseyin

    2012-01-01

    Compiled from papers of the 4th Biennial Workshop on DSP (Digital Signal Processing) for In-Vehicle Systems and Safety this edited collection features world-class experts from diverse fields focusing on integrating smart in-vehicle systems with human factors to enhance safety in automobiles. Digital Signal Processing for In-Vehicle Systems and Safety presents new approaches on how to reduce driver inattention and prevent road accidents. The material addresses DSP technologies in adaptive automobiles, in-vehicle dialogue systems, human machine interfaces, video and audio processing, and in-vehicle speech systems. The volume also features: Recent advances in Smart-Car technology – vehicles that take into account and conform to the driver Driver-vehicle interfaces that take into account the driving task and cognitive load of the driver Best practices for In-Vehicle Corpus Development and distribution Information on multi-sensor analysis and fusion techniques for robust driver monitoring and driver recognition ...

  13. MATERIAL PROCESSING FOR SELF-ASSEMBLING MACHINE SYSTEMS

    Energy Technology Data Exchange (ETDEWEB)

    K. LACKNER; D. BUTT; C. WENDT

    1999-06-01

    We are developing an important aspect of a new technology based on self-reproducing machine systems. Such systems could overcome resource limitations and control the deleterious side effects of human activities on the environment. Machine systems capable of building themselves promise an increase in industrial productivity as dramatic as that of the industrial revolution. To operate successfully, such systems must procure necessary raw materials from their surroundings. Therefore, next to automation, most critical for this new technology is the ability to extract important chemicals from readily available soils. In contrast to conventional metallurgical practice, these extraction processes cannot make substantial use of rare elements. We have designed a thermodynamically viable process and experimentally demonstrated most steps that differ from common practice. To this end we had to develop a small, disposable vacuum furnace system. Our work points to a viable extraction process.

  14. Decay Process of Quantum Open System at Finite Temperatures

    Institute of Scientific and Technical Information of China (English)

    肖骁; 高一波

    2012-01-01

    Starting from the formal solution to the Heisenberg equation, we revisit a universal model for a quantum open system with a harmonic oscillator linearly coupled to a boson bath. The analysis of the decay process for a Fock state and a coherent state demonstrates that this method is very useful in dealing with problems in the decay process of an open system. For finite temperatures, the calculations of the reduced density matrix and the mean excitation number for the open system show that an initial coherent state will evolve into a temperature-dependent coherent state after tracing over the bath variables. Also, in the short-time limit, a temperature-dependent effective Hamiltonian for the open system characterizes the decay process of the open system.
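
    For reference (this is standard textbook material, not reproduced from the paper itself), the Lindblad master equation for a harmonic oscillator of frequency omega linearly coupled to a thermal boson bath, and the resulting decay of the mean excitation number, can be written as follows; gamma is the damping rate and \bar{n} the thermal occupation of the bath at temperature T.

        \dot{\rho} = -i\omega\,[a^{\dagger}a,\rho]
                     + \gamma(\bar{n}+1)\left(a\rho a^{\dagger} - \tfrac{1}{2}\{a^{\dagger}a,\rho\}\right)
                     + \gamma\bar{n}\left(a^{\dagger}\rho a - \tfrac{1}{2}\{a a^{\dagger},\rho\}\right)

        \langle a^{\dagger}a\rangle(t) = \langle a^{\dagger}a\rangle(0)\,e^{-\gamma t} + \bar{n}\,\bigl(1 - e^{-\gamma t}\bigr),
        \qquad \bar{n} = \frac{1}{e^{\hbar\omega/k_{B}T}-1}

    A coherent state subject to the damping term alone keeps its coherent form with an exponentially decaying amplitude, which is consistent with the temperature-dependent coherent state described in the abstract.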

  15. Characteristics of the Audit Processes for Distributed Informatics Systems

    Directory of Open Access Journals (Sweden)

    Marius POPA

    2009-01-01

    The paper addresses the main characteristics and examples of distributed informatics systems and the main categories of differences among them; concepts, principles, techniques and fields for auditing distributed informatics systems; and the concept and classes of standards, their characteristics, and examples of standards, guidelines, procedures and controls for auditing distributed informatics systems. Distributed informatics systems are characterized by the following issues: development process, resources, implemented functionalities, architectures, system classes, and particularities. The audit framework has two sides: the audit process and the auditors. The audit process must be conducted in accordance with the standard specifications in the IT&C field. The auditors must meet the ethical principles and must have a high level of professional skills and competence in the IT&C field.

  16. Designing and Securing an Event Processing System for Smart Spaces

    Science.gov (United States)

    Li, Zang

    2011-01-01

    Smart spaces, or smart environments, represent the next evolutionary development in buildings, banking, homes, hospitals, transportation systems, industries, cities, and government automation. By riding the tide of sensor and event processing technologies, the smart environment captures and processes information about its surroundings as well as…

  17. Towards securing SCADA systems against process-related threats

    NARCIS (Netherlands)

    Hadziosmanovic, Dina; Bolzoni, Damiano; Hartel, Pieter

    2010-01-01

    We propose a tool-assisted approach to address process-related threats on SCADA systems. Process-related threats have not been addressed before in a systematic manner. Our approach consists of two steps: threat analysis and threat mitigation. For the threat analysis, we combine two methodologies (PH

  18. Designing Robust Process Analytical Technology (PAT) Systems for Crystallization Processes: A Potassium Dichromate Crystallization Case Study

    DEFF Research Database (Denmark)

    Abdul Samad, Noor Asma Fazli Bin; Sin, Gürkan

    2013-01-01

    The objective of this study is to test and validate a Process Analytical Technology (PAT) system design on a potassium dichromate crystallization process in the presence of input uncertainties using uncertainty and sensitivity analysis. To this end a systematic framework for managing uncertaintie...

  19. A systematic framework for design of process monitoring and control (PAT) systems for crystallization processes

    DEFF Research Database (Denmark)

    Abdul Samad, Noor Asma Fazli Bin; Sin, Gürkan; Gernaey, Krist

    2013-01-01

    A generic computer-aided framework for systematic design of a process monitoring and control system for crystallization processes has been developed to study various aspects of crystallization operations. The systematic design framework contains a generic crystallizer modelling toolbox, a tool for...

  20. A Generic Framework for Systematic Design of Process Monitoring and Control System for Crystallization Processes

    DEFF Research Database (Denmark)

    Abdul Samad, Noor Asma Fazli Bin; Meisler, Kresten Troelstrup; Sin, Gürkan

    2012-01-01

    A generic framework for systematic design of a process monitoring and control system for crystallization processes has been developed in order to obtain the desired end-product properties notably the crystal size distribution (CSD). The design framework contains a generic crystallizer modelling t...

  1. A Robust Process Analytical Technology (PAT) System Design for Crystallization Processes

    DEFF Research Database (Denmark)

    Abdul Samad, Noor Asma Fazli Bin; Sin, Gürkan; Gernaey, Krist

    2013-01-01

    A generic computer-aided framework for systematic design of a process monitoring and control system for crystallization processes has been developed to study various aspects of crystallization operations. The design framework contains a generic multidimensional modelling framework, a tool for gen...

  2. The Trichotomy of Processes: a philosophical basis for information systems

    Directory of Open Access Journals (Sweden)

    George Widmeyer

    2003-11-01

    The principle of trichotomy from the American philosopher Charles S. Peirce can be used to categorize processes into the triad of transactional, relational, and informational. The usefulness of these categories is explicated by a comparison with structuration theory and control theory, and elaborated with a consideration of democracy in a knowledge economy. These three example applications of the process triad show the generality of the conceptual categories and provide a natural way of bringing ideas from social and ethical theories into information systems design. Modeling the world and understanding business applications through the use of the Trichotomy of Processes should facilitate the development of more valuable information systems.

  3. An Experimental-Numerical Study of Metal Peel Off in Cu/low-k Back-End Structures

    Institute of Scientific and Technical Information of China (English)

    R. Kregting; R.B.R. van Silfhout; O. van der Sluis; R.A.B. Engelen; W.D. van Driel; G.Q. Zhang

    2006-01-01

    Wire pull tests are generally conducted to assess wire bonding quality. With Cu/low-k technology, two failure modes are usually observed: failure in the neck of the wire (neck break) and metal peel off (MPO). The objective of our study is to investigate the root cause of metal peel off by using a combined experimental and numerical approach. First, dedicated failure analyses are conducted to identify the failure locations. Scanning electron microscopy of a large number of completely failed samples shows that the delaminated interface, after MPO has occurred, is always the interface on top of the third metal layer, near the centre of the stack. However, these inspections do not indicate where and when MPO initiated. To understand the initiation, incremental (non-destructive) wire pull tests are used. These samples have not failed completely, but may already show the initiating crack in either of the two possible regions. Combined with scanning acoustic tomography (SCAT) and focused ion beam (FIB) inspection, they show that MPO initiates by delamination in the back-end structure at the interface on top of the third metal layer. Secondly, a 3D FEM model of a half bond pad with a partial wire bond is used to simulate the wire pull test, in order to understand the failure mode. Analyses using stress as a failure index indicate, however, that the top interface is the most critical one, which does not match the experimental observation. Therefore, an alternative, energy-based failure index is used, the so-called area release energy (ARE) method. The ARE method approximately identifies the same critical interface as found in the experiments. It is assumed that the presence of a stiff layer near a potential crack location restricts the elastic deformation upon release. This indicates that the initiation of a crack in the upper and lower interfaces results in the release of a lower amount of energy when compared to an interface in the centre, where the surrounding

  4. Biophysical Chemistry of Fractal Structures and Processes in Environmental Systems

    NARCIS (Netherlands)

    Buffle, J.; Leeuwen, van H.P.

    2008-01-01

    This book aims to provide the scientific community with a novel and valuable approach, based on fractal geometry concepts, to the important properties and processes of diverse environmental systems. The interpretation of complex environmental systems using modern fractal approaches is compared and con

  5. Plug and Play Process Control of a District Heating System

    DEFF Research Database (Denmark)

    Trangbaek, Klaus; Knudsen, Torben; Skovmose Kallesøe, Carsten

    2009-01-01

    The main idea of plug and play process control is to initialise and reconfigure control systems automatically. In this paper these ideas are applied to a scaled laboratory model of a district heating pressure control system.  First of all this serves as a concrete example of plug and play control...

  6. The process matters: cyber security in industrial control systems

    NARCIS (Netherlands)

    Hadžiosmanović, Dina

    2014-01-01

    An industrial control system (ICS) is a computer system that controls industrial processes such as power plants, water and gas distribution, food production, etc. Since cyber-attacks on an ICS may have devastating consequences on human lives and safety in general, the security of ICS is important. I

  7. Research on Three Dimensional Computer Assistance Assembly Process Design System

    Institute of Scientific and Technical Information of China (English)

    HOU Wenjun; YAN Yaoqi; DUAN Wenjia; SUN Hanxu

    2006-01-01

    Computer-aided process planning will certainly play a significant role in the success of enterprise informatization, and 3-dimensional design will promote 3-dimensional process planning. This article analyses the current situation and problems of assembly process planning, presents a 3-dimensional computer-aided assembly process planning system (3D-VAPP), and investigates product information extraction, assembly sequence and path planning in visual interactive assembly process design, dynamic simulation of assembly and process verification, assembly animation output and automatic exploded-view generation, interactive process-card filling and process knowledge management, etc. It also gives a multi-layer collision detection and multi-perspective automatic camera switching algorithm. Experiments were done to validate the feasibility of the proposed technology and algorithms, which established the foundation of 3-dimensional computer-aided process planning.

  8. Process Control Systems in the Chemical Industry: Safety vs. Security

    Energy Technology Data Exchange (ETDEWEB)

    Jeffrey Hahn; Thomas Anderson

    2005-04-01

    Traditionally, the primary focus of the chemical industry has been safety and productivity. However, recent threats to our nation’s critical infrastructure have prompted a tightening of security measures across many different industry sectors. Reducing vulnerabilities of control systems against physical and cyber attack is necessary to ensure the safety, security and effective functioning of these systems. The U.S. Department of Homeland Security has developed a strategy to secure these vulnerabilities. Crucial to this strategy is the Control Systems Security and Test Center (CSSTC) established to test and analyze control systems equipment. In addition, the CSSTC promotes a proactive, collaborative approach to increase industry's awareness of standards, products and processes that can enhance the security of control systems. This paper outlines measures that can be taken to enhance the cybersecurity of process control systems in the chemical sector.

  9. Small Interactive Image Processing System (SMIPS) users manual

    Science.gov (United States)

    Moik, J. G.

    1973-01-01

    The Small Interactive Image Processing System (SMIPS) is designed to facilitate the acquisition, digital processing and recording of image data as well as pattern recognition in an interactive mode. Objectives of the system are ease of communication with the computer by personnel who are not expert programmers, fast response to requests for information on pictures, complete error recovery, as well as simplification of future programming efforts for extension of the system. The SMIPS system is intended for operation under OS/MVT on an IBM 360/75 or 91 computer equipped with the IBM-2250 Model 1 display unit. This terminal is used as an interface between user and main computer. It has an alphanumeric keyboard, a programmed function keyboard and a light pen which are used for specification of input to the system. Output from the system is displayed on the screen as messages and pictures.

  10. Non-Contact Conductivity Measurement for Automated Sample Processing Systems

    Science.gov (United States)

    Beegle, Luther W.; Kirby, James P.

    2012-01-01

    A new method has been developed for monitoring and control of automated sample processing and preparation, especially focusing on desalting of samples before analytical analysis (described in more detail in Automated Desalting Apparatus (NPO-45428), NASA Tech Briefs, Vol. 34, No. 8 (August 2010), page 44). The use of non-contact conductivity probes, one at the inlet and one at the outlet of the solid phase sample preparation media, allows monitoring of the process, and acts as a trigger for the start of the next step in the sequence (see figure). At each step of the multi-step process, the system is flushed with low-conductivity water, which sets the system back to an overall low-conductivity state. This measurement then triggers the next stage of sample processing protocols, and greatly minimizes use of consumables. In the case of amino acid sample preparation for desalting, the conductivity measurement will define three key conditions for the sample preparation process. First, when the system is neutralized (low conductivity, by washing with excess de-ionized water); second, when the system is acidified, by washing with a strong acid (high conductivity); and third, when the system is at a basic condition of high pH (high conductivity). Taken together, this non-contact conductivity measurement for monitoring sample preparation will not only facilitate automation of the sample preparation and processing, but will also act as a way to optimize the operational time and use of consumables.
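
    As an illustration only (stage names, thresholds and readings below are hypothetical, not taken from the NASA brief), a conductivity-triggered sequencing of the kind described could be sketched in Python as:

        # Illustrative sketch of conductivity-triggered sequencing for an automated
        # sample-preparation protocol. Thresholds and stage names are hypothetical.

        LOW_THRESHOLD = 5.0     # microS/cm: system considered neutralized (rinsed)

        def classify(conductivity_uS_cm, ph):
            """Map a non-contact conductivity reading plus pH to a system state."""
            if conductivity_uS_cm < LOW_THRESHOLD:
                return "neutralized"          # flushed with de-ionized water
            return "acidified" if ph < 7.0 else "basic"

        def next_stage(current_stage, state):
            """Advance the protocol only when the expected state is observed."""
            transitions = {
                ("load",    "neutralized"): "acidify",
                ("acidify", "acidified"):   "elute",
                ("elute",   "basic"):       "rinse",
                ("rinse",   "neutralized"): "done",
            }
            return transitions.get((current_stage, state), current_stage)

        stage = "load"
        for conductivity, ph in [(2.0, 7.0), (800.0, 2.5), (900.0, 11.0), (3.0, 7.0)]:
            stage = next_stage(stage, classify(conductivity, ph))
            print(f"{conductivity:7.1f} microS/cm, pH {ph:4.1f} -> stage: {stage}")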

  11. Memory Systems, Processing Modes, and Components: Functional Neuroimaging Evidence

    Science.gov (United States)

    Cabeza, Roberto; Moscovitch, Morris

    2013-01-01

    In the 1980s and 1990s, there was a major theoretical debate in the memory domain regarding the multiple memory systems and processing modes frameworks. The components of processing framework argued for a middle ground: Instead of neatly divided memory systems or processing modes, this framework proposed the existence of numerous processing components that are recruited in different combinations by memory tasks and yield complex patterns of associations and dissociations. Because behavioral evidence was not sufficient to decide among these three frameworks, the debate was largely abandoned. However, functional neuroimaging evidence accumulated during the last two decades resolves the stalemate, because this evidence is more consistent with the components framework than with the other two frameworks. For example, functional neuroimaging evidence shows that brain regions attributed to one memory system can contribute to tasks associated with other memory systems and that brain regions attributed to the same processing mode (perceptual or conceptual) can be dissociated from each other. Functional neuroimaging evidence suggests that memory processes are supported by transient interactions between a few regions called process-specific alliances. These conceptual developments are an example of how functional neuroimaging can contribute to theoretical debates in cognitive psychology. PMID:24163702

  12. Data Processing Model of Coalmine Gas Early-Warning System

    Institute of Scientific and Technical Information of China (English)

    QIAN Jian-sheng; YIN Hong-sheng; LIU Xiu-rong; HUA Gang; XU Yong-gang

    2007-01-01

    The data processing mode is vital to the performance of an entire coalmine gas early-warning system, especially its real-time performance. Our objective was to present the structural features of coalmine gas data so that the data could be processed at different priority levels in the C language. Two different data processing models, one with priority and the other without, were built based on queuing theory. Their theoretical formulas were determined via an M/M/1 model in order to calculate the average occupation time of each measuring point in an early-warning program. We validated the model with the gas early-warning system of the Huaibei Coalmine Group Corp. The results indicate that the average occupation time for gas data processing using the queuing system model with priority is nearly 1/30 of that of the model without priority.
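
    The abstract does not reproduce the formulas, but for a rough illustration one can compare textbook M/M/1 results: without priority, the mean time in system is 1/(mu - lambda), while with non-preemptive priority classes the waiting times follow Cobham's formula. A minimal sketch with hypothetical rates:

        # Illustrative comparison of mean time in system for gas-data jobs:
        # plain M/M/1 vs. non-preemptive priority classes (Cobham's formula).
        # Arrival and service rates are hypothetical.

        def mm1_time_in_system(lam, mu):
            """Mean time in system for a plain M/M/1 queue (requires lam < mu)."""
            return 1.0 / (mu - lam)

        def priority_times(lams, mu):
            """Per-class mean time in system for non-preemptive priority M/M/1.
            Class 0 has the highest priority; all classes share service rate mu."""
            rho = [l / mu for l in lams]
            w0 = sum(l * (2.0 / mu ** 2) for l in lams) / 2.0   # mean residual work
            times, sigma_prev = [], 0.0
            for k, l in enumerate(lams):
                sigma_k = sigma_prev + rho[k]
                wq = w0 / ((1.0 - sigma_prev) * (1.0 - sigma_k))
                times.append(wq + 1.0 / mu)
                sigma_prev = sigma_k
            return times

        lams = [2.0, 3.0]   # jobs/s: class 0 = urgent alarms, class 1 = routine data
        mu = 10.0           # service rate of the processing routine, jobs/s
        print("no priority  :", mm1_time_in_system(sum(lams), mu))
        print("with priority:", priority_times(lams, mu))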

  13. Parallel and distributed processing in power system simulation and control

    Energy Technology Data Exchange (ETDEWEB)

    Falcao, Djalma M. [Universidade Federal, Rio de Janeiro, RJ (Brazil). Coordenacao dos Programas de Pos-graduacao de Engenharia

    1994-12-31

    Recent advances in computer technology will certainly have a great impact on the methodologies used in power system expansion and operational planning as well as in real-time control. Parallel and distributed processing are among the new technologies that present great potential for application in these areas. Parallel computers use multiple functional or processing units to speed up computation, while distributed processing computer systems are collections of computers joined together by high speed communication networks, with many objectives and advantages. The paper presents some ideas for the use of parallel and distributed processing in power system simulation and control. It also comments on some of the current research work in these topics and presents a summary of the work presently being developed at COPPE. (author) 53 refs., 2 figs.

  14. The COMPTEL Processing and Analysis Software system (COMPASS)

    Science.gov (United States)

    de Vries, C. P.; COMPTEL Collaboration

    The data analysis system of the gamma-ray Compton Telescope (COMPTEL) onboard the Compton-GRO spacecraft is described. A continuous stream of data of the order of 1 kbyte per second is generated by the instrument. The data processing and analysis software is built around a relational database management system (RDBMS) in order to be able to trace the heritage and processing status of all data in the processing pipeline. Four institutes cooperate in this effort, requiring procedures to keep local RDBMS contents identical between the sites and swift exchange of data using network facilities. Lately, there has been a gradual move of the system from central processing facilities towards clusters of workstations.

  15. Generic Health Management: A System Engineering Process Handbook Overview and Process

    Science.gov (United States)

    Wilson, Moses Lee; Spruill, Jim; Hong, Yin Paw

    1995-01-01

    Health Management, a System Engineering Process, is one of those processes-techniques-and-technologies used to define, design, analyze, build, verify, and operate a system from the viewpoint of preventing, or minimizing, the effects of failure or degradation. It supports all ground and flight elements during manufacturing, refurbishment, integration, and operation through combined use of hardware, software, and personnel. This document will integrate Health Management Processes (six phases) into five phases in such a manner that it is never a stand alone task/effort which separately defines independent work functions.

  16. Quantum Information Processing in Disordered and Complex Quantum Systems

    CERN Document Server

    De, Aditi Sen; Sen, Ujjwal; Ahufinger, Veronica; Briegel, Hans J.; Sanpera, Anna; Lewenstein, Maciej

    2005-01-01

    We investigate quantum information processing and manipulations in disordered systems of ultracold atoms and trapped ions. First, we demonstrate generation of entanglement and local realization of quantum gates in a quantum spin glass system. Entanglement in such systems attains significantly high values, after quenched averaging, and has a stable positive value for arbitrary times. Complex systems with long range interactions, such as ion chains or dipolar atomic gases, can be modeled by neural network Hamiltonians. In such systems, we find the characteristic time of persistence of quenched averaged entanglement, and also find the time of its revival.

  17. Arranging transient process used in iterative learning control system

    Institute of Scientific and Technical Information of China (English)

    Li-chuan Hui; Hui Lin

    2009-01-01

    Considering the same initial state error in each repetitive operation of an iterative learning system, a method of arranging the transient process is given. During the current iteration, the system first tracks the transient function and then the expected trajectory. After several iterations, the learning system output tends to the arranged curve, which avoids the effect of the initial error on the controller. The transient time can also be changed as needed, which makes the design simple and the operation easy. The detailed design steps are then given for a robot system. Finally, a simulation of the robot system is presented, which shows the validity of the method.
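
    As a purely illustrative sketch (the first-order plant, gains and trajectories below are invented here, not taken from the paper), a P-type iterative learning update tracking a reference whose first segment is an arranged transient curve could look like:

        # Minimal P-type iterative learning control on a first-order plant, where the
        # reference starts with an arranged transient that smoothly connects the
        # non-zero initial state to the desired trajectory. Parameters are hypothetical.
        import math

        dt, N, trials = 0.01, 200, 30
        a, b = -1.0, 1.0                 # plant: x' = a*x + b*u
        gain = 30.0                      # learning gain
        x0 = 0.5                         # repeated non-zero initial state
        t_transient = 0.5                # arranged transient time [s]

        def desired(t):                  # expected trajectory after the transient
            return math.sin(2 * math.pi * t)

        def reference(t):                # arranged curve: transient, then desired
            if t >= t_transient:
                return desired(t)
            s = t / t_transient
            blend = 3 * s ** 2 - 2 * s ** 3          # smooth cubic blend
            return (1 - blend) * x0 + blend * desired(t_transient)

        u = [0.0] * N                    # control input, refined between trials
        for j in range(trials):
            x = [x0] + [0.0] * N
            for k in range(N):                           # one repetition of the task
                x[k + 1] = x[k] + dt * (a * x[k] + b * u[k])
            e = [reference(k * dt) - x[k] for k in range(N + 1)]
            u = [u[k] + gain * e[k + 1] for k in range(N)]   # P-type ILC update
            if j % 10 == 0 or j == trials - 1:
                print(f"trial {j:2d}  max |tracking error| = {max(map(abs, e)):.4f}")

    Because the arranged reference starts exactly at the repeated initial state, the initial error with respect to the tracked curve is zero, which is the point the abstract makes.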

  18. Bundle adjustment for data processing of theodolite industrial surveying system

    Institute of Scientific and Technical Information of China (English)

    邹峥嵘; 丁晓利; 曾卓乔; 何凭宗

    2001-01-01

    Photogrammetric bundle adjustment was used in the data processing of an electronic theodolite industrial surveying system by converting angular observations into virtual photo coordinates. The developed algorithm provides precision estimation and data snooping and does not need initial values of the exterior orientation elements or the object point coordinates. The form of the control conditions for the system is quite flexible. The theodolite needs neither centering nor leveling, and the layout of the theodolite positions is flexible when the system is used for precise surveys. Experiments carried out in a test field verify the validity of the data processing method.
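
    The paper's virtual-photo-coordinate formulation is not given in the abstract; as a simpler, self-contained illustration of turning theodolite angle observations into 3-D coordinates, a least-squares intersection of pointing rays from two stations can be computed as follows (station coordinates and angles are made up):

        # Illustrative least-squares intersection of theodolite pointing rays.
        # This is NOT the paper's bundle-adjustment formulation, only a minimal
        # example of processing angular observations into object coordinates.
        import math

        def direction(azimuth_deg, elevation_deg):
            """Unit pointing vector from horizontal (azimuth) and vertical (elevation) angles."""
            az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
            return (math.cos(el) * math.sin(az), math.cos(el) * math.cos(az), math.sin(el))

        def solve3(A, b):
            """Gauss-Jordan elimination for a 3x3 linear system."""
            M = [row[:] + [bi] for row, bi in zip(A, b)]
            for c in range(3):
                piv = max(range(c, 3), key=lambda r: abs(M[r][c]))
                M[c], M[piv] = M[piv], M[c]
                for r in range(3):
                    if r != c:
                        f = M[r][c] / M[c][c]
                        M[r] = [x - f * y for x, y in zip(M[r], M[c])]
            return [M[i][3] / M[i][i] for i in range(3)]

        def intersect(stations, angles):
            """Point minimizing the squared distances to all pointing rays:
            solves the normal equations  sum(I - d d^T) X = sum(I - d d^T) p."""
            A = [[0.0] * 3 for _ in range(3)]
            rhs = [0.0, 0.0, 0.0]
            for p, (az, el) in zip(stations, angles):
                d = direction(az, el)
                for i in range(3):
                    for j in range(3):
                        m = (1.0 if i == j else 0.0) - d[i] * d[j]
                        A[i][j] += m
                        rhs[i] += m * p[j]
            return solve3(A, rhs)

        stations = [(0.0, 0.0, 1.5), (10.0, 0.0, 1.5)]   # hypothetical instrument positions [m]
        angles = [(45.0, 10.0), (315.0, 10.0)]           # (azimuth, elevation) to one target
        print(intersect(stations, angles))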

  19. An approach to control collaborative processes in PLM systems

    CERN Document Server

    Kadiri, Soumaya El; Delattre, Miguel; Bouras, Abdelaziz

    2008-01-01

    Companies that collaborate within product development processes need to implement effective management of their collaborative activities. Despite the implementation of a PLM system, collaborative activities are often not as efficient as might be expected. This paper presents an analysis of the problems related to collaborative work using a PLM system. From this analysis, we propose an approach for improving collaborative processes within a PLM system, based on monitoring indicators. This approach leads to identifying, and therefore mitigating, the obstacles to collaborative work.

  20. Arranging transient process used in iterative learning control system

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    Considering the same initial state error in each repetitive operation of an iterative learning system, a method of arranging the transient process is given. During the current iteration, the system first tracks the transient function and then the expected trajectory. After several iterations, the learning system output tends to the arranged curve, which avoids the effect of the initial error on the controller. The transient time can also be changed as needed, which makes the designing si...

  1. Carbon fibers: precursor systems, processing, structure, and properties.

    Science.gov (United States)

    Frank, Erik; Steudle, Lisa M; Ingildeev, Denis; Spörl, Johanna M; Buchmeiser, Michael R

    2014-05-19

    This Review gives an overview of precursor systems, their processing, and the final precursor-dependent structure of carbon fibers (CFs) including new developments in precursor systems for low-cost CFs. The following CF precursor systems are discussed: poly(acrylonitrile)-based copolymers, pitch, cellulose, lignin, poly(ethylene), and new synthetic polymeric precursors for high-end CFs. In addition, structure-property relationships and the different models for describing both the structure and morphology of CFs will be presented.

  2. Process Mining Methodology for Health Process Tracking Using Real-Time Indoor Location Systems

    Directory of Open Access Journals (Sweden)

    Carlos Fernandez-Llatas

    2015-11-01

    The definition of efficient and accurate health processes in hospitals is crucial for ensuring an adequate quality of service. Knowing and improving the behavior of the surgical processes in a hospital can improve the number of patients that can be operated on using the same resources. However, the measurement of this process is usually made in an obtrusive way, forcing nurses to collect information and time data, affecting the proper process and generating inaccurate data due to human errors during the stressful journey of health staff in the operating theater. The use of indoor location systems can take time information about the process in an unobtrusive way, freeing nurses and allowing them to engage in purely welfare work. However, it is necessary to present these data in an understandable way for health professionals, who cannot deal with large amounts of historical localization log data. The use of process mining techniques can deal with this problem, offering an easily understandable view of the process. In this paper, we present a tool and a process mining-based methodology that, using indoor location systems, enables health staff not only to represent the process, but to know precise information about the deployment of the process in an unobtrusive and transparent way. We have successfully tested this tool in a real surgical area with 3613 patients during February, March and April of 2015.

  3. Process Mining Methodology for Health Process Tracking Using Real-Time Indoor Location Systems

    Science.gov (United States)

    Fernandez-Llatas, Carlos; Lizondo, Aroa; Monton, Eduardo; Benedi, Jose-Miguel; Traver, Vicente

    2015-01-01

    The definition of efficient and accurate health processes in hospitals is crucial for ensuring an adequate quality of service. Knowing and improving the behavior of the surgical processes in a hospital can improve the number of patients that can be operated on using the same resources. However, the measurement of this process is usually made in an obtrusive way, forcing nurses to collect information and time data, affecting the proper process and generating inaccurate data due to human errors during the stressful journey of health staff in the operating theater. The use of indoor location systems can take time information about the process in an unobtrusive way, freeing nurses and allowing them to engage in purely welfare work. However, it is necessary to present these data in an understandable way for health professionals, who cannot deal with large amounts of historical localization log data. The use of process mining techniques can deal with this problem, offering an easily understandable view of the process. In this paper, we present a tool and a process mining-based methodology that, using indoor location systems, enables health staff not only to represent the process, but to know precise information about the deployment of the process in an unobtrusive and transparent way. We have successfully tested this tool in a real surgical area with 3613 patients during February, March and April of 2015. PMID:26633395

  4. Process Mining Methodology for Health Process Tracking Using Real-Time Indoor Location Systems.

    Science.gov (United States)

    Fernandez-Llatas, Carlos; Lizondo, Aroa; Monton, Eduardo; Benedi, Jose-Miguel; Traver, Vicente

    2015-11-30

    The definition of efficient and accurate health processes in hospitals is crucial for ensuring an adequate quality of service. Knowing and improving the behavior of the surgical processes in a hospital can improve the number of patients that can be operated on using the same resources. However, the measurement of this process is usually made in an obtrusive way, forcing nurses to collect information and time data, affecting the proper process and generating inaccurate data due to human errors during the stressful journey of health staff in the operating theater. The use of indoor location systems can take time information about the process in an unobtrusive way, freeing nurses and allowing them to engage in purely welfare work. However, it is necessary to present these data in an understandable way for health professionals, who cannot deal with large amounts of historical localization log data. The use of process mining techniques can deal with this problem, offering an easily understandable view of the process. In this paper, we present a tool and a process mining-based methodology that, using indoor location systems, enables health staff not only to represent the process, but to know precise information about the deployment of the process in an unobtrusive and transparent way. We have successfully tested this tool in a real surgical area with 3613 patients during February, March and April of 2015.
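
    The tool and methodology are not described in reproducible detail in the abstract; as a toy illustration of the underlying idea, stage durations and transition counts (the raw material of a process-mining model) can be extracted from an indoor-location log as sketched below. The log format and zone names are invented for the example.

        # Toy illustration: derive stage durations and transition counts from an
        # indoor location log. Log format and zone names are invented.
        from collections import Counter, defaultdict

        # (patient_id, zone, timestamp in minutes since start of day)
        log = [
            ("p1", "waiting", 480), ("p1", "preparation", 500),
            ("p1", "operating", 530), ("p1", "recovery", 620),
            ("p2", "waiting", 490), ("p2", "operating", 540),
            ("p2", "recovery", 650),
        ]

        by_patient = defaultdict(list)
        for pid, zone, ts in log:
            by_patient[pid].append((ts, zone))

        durations = defaultdict(list)   # zone -> list of stay lengths [min]
        transitions = Counter()         # (zone_from, zone_to) -> count
        for events in by_patient.values():
            events.sort()
            for (t0, z0), (t1, z1) in zip(events, events[1:]):
                durations[z0].append(t1 - t0)
                transitions[(z0, z1)] += 1

        for zone, stays in durations.items():
            print(f"{zone:12s} mean stay {sum(stays) / len(stays):5.1f} min ({len(stays)} visits)")
        for (src, dst), n in transitions.items():
            print(f"{src} -> {dst}: {n}")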

  5. DESIGN OF AN EXPERT CONTROL SYSTEM FOR LEACHING PROCESS

    Institute of Scientific and Technical Information of China (English)

    1999-01-01

    One important step in zinc hydrometallurgy is the leaching process, which involves dissolving zinc-bearing material in dilute sulfuric acid to form a zinc sulfate solution. The key problem in the process control is to determine the optimal pHs of the overflows of the continuous leaches and to track them. An expert control system for the leaching process was designed to solve this key problem. A methodology is proposed for determining and tracking the optimal pHs with an expert control strategy based on a combination of steady-state mathematical models and rule models of the process.
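
    The steady-state and rule models themselves are not given in the abstract; purely as an illustration of the idea of combining a model-based proposal with rule-based corrections, a pH setpoint selection routine might be sketched as follows (all relations and numbers are hypothetical):

        # Hypothetical sketch of expert-control style pH setpoint selection for a
        # continuous leaching stage: a steady-state relation proposes a setpoint
        # and simple rules correct it using measured quality indicators.

        def model_setpoint(feed_zinc_gpl, acid_gpl):
            """Steady-state (empirical) relation between feed composition and target pH."""
            return 4.0 - 0.01 * feed_zinc_gpl + 0.005 * acid_gpl

        def expert_rules(ph_setpoint, residue_zinc_pct, iron_gpl):
            """Rule model: adjust the proposed setpoint for the observed condition."""
            if residue_zinc_pct > 5.0:      # too much zinc left in residue -> leach harder
                ph_setpoint -= 0.2
            if iron_gpl > 1.0:              # too much dissolved iron -> raise pH
                ph_setpoint += 0.3
            return min(max(ph_setpoint, 2.0), 5.5)   # keep within plant limits

        proposal = model_setpoint(feed_zinc_gpl=120.0, acid_gpl=40.0)
        print("pH setpoint:", expert_rules(proposal, residue_zinc_pct=6.2, iron_gpl=0.4))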

  6. Numerical Simulation System for Casting Process in Concurrent Engineering

    Institute of Scientific and Technical Information of China (English)

    1999-01-01

    Based on the implementation principles and application background of the Concurrent Engineering (CE) project, this paper discusses the integration of a numerical simulation system for the casting process with CE, the simulation of turbulent phenomena in the mold-filling process using the Algebraic Stress Model (ASM), the computational efficiency of the filling-process simulation, and the quantitative prediction of shrinkage cavities and porosity under the feeding conditions of several risers. The simulation of the casting process for a typical magnesium-based alloy casting with a complicated structure also demonstrates remarkable success in assuring casting quality.

  7. Production process for advanced space satellite system cables/interconnects.

    Energy Technology Data Exchange (ETDEWEB)

    Mendoza, Luis A.

    2007-12-01

    This production process was generated for the satellite system program cables/interconnects group, which in essence had no well-defined production process. The driver for the development of a formalized process was the setbacks, problem areas, challenges, and needed improvements encountered within the program at Sandia National Laboratories. In addition, the formal production process was developed as part of the Master's program in Engineering Management at the New Mexico Institute of Mining and Technology in Socorro, New Mexico, and was submitted as a thesis to meet the institute's graduation requirements.

  8. Application of a medical image processing system in liver transplantation

    Institute of Scientific and Technical Information of China (English)

    Chi-Hua Fang; Xiao-Feng Li; Zhou Li; Ying-Fang Fan; Chao-Min Lu; Yan-Peng Huang; Feng-Ping Peng

    2010-01-01

    BACKGROUND: At present, imaging is used not only to show the form of images, but also to make three-dimensional (3D) reconstructions and visual simulations based on original data to guide clinical surgery. This study aimed to assess the use of a medical image-processing system in liver transplantation surgery. METHODS: Abdominal 64-slice spiral CT scan data were collected from 200 healthy volunteers and 37 liver cancer patients for the hepatic arterial phase, portal phase, and hepatic venous phase. A 3D model of the abdominal blood vessels, including the abdominal aorta system, portal vein system, and inferior vena cava system, was reconstructed by an abdominal image processing system to identify vascular variations. Then, a 3D model of the liver was reconstructed in terms of hepatic segmentation, and the liver volume was calculated. The FreeForm modeling system with a PHANTOM force feedback device was used to simulate the real liver transplantation environment, in which the total process of liver transplantation was completed. RESULTS: The reconstructed model of the abdominal blood vessels and the liver was clearly demonstrated to be three-dimensionally consistent with the anatomy of the liver, in which the variations of the abdominal blood vessels were identified and liver segmentation was performed digitally. In the model, liver transplantation was subsequently simulated, and different operative approaches were selected successfully. CONCLUSION: The digitized medical image processing system may be valuable for liver transplantation.

  9. Energy Efficient Pump Control for an Offshore Oil Processing System

    DEFF Research Database (Denmark)

    Yang, Zhenyu; Soleiman, Kian; Løhndorf, Bo

    2012-01-01

    The energy efficient control of a pump system for an offshore oil processing system is investigated. The seawater is lifted by a pump system which consists of three identical centrifugal pumps in parallel, and the lifted seawater is used to cool down the crude oil flowing out of a three-phase separator on one of the Danish North Sea platforms. A hierarchical pump-speed control strategy is developed for the considered system by minimizing the pump power consumption subject to keeping a satisfactory system performance. The proposed control strategy consists of online estimation of some system operating parameters, optimization of pump configurations, and a real-time feedback control. Compared with the current control strategy for the considered system, where the pump system is on/off controlled and the seawater flows are controlled by a number of control valves, the proposed control strategy
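
    The details of the optimization are not given in the abstract; a minimal sketch of the idea, using the centrifugal-pump affinity laws (flow roughly proportional to speed, shaft power roughly proportional to speed cubed) to choose how many identical pumps to run and at what common relative speed, could be (pump data hypothetical):

        # Hypothetical sketch: choose the number of running pumps and their common
        # relative speed so that a required total seawater flow is met with minimal
        # power, using affinity laws (Q ~ n, P ~ n^3) for identical parallel pumps.

        RATED_FLOW_M3H = 400.0     # flow of one pump at full speed
        RATED_POWER_KW = 75.0      # shaft power of one pump at full speed
        MIN_SPEED = 0.4            # minimum allowed relative speed

        def best_configuration(required_flow_m3h, pumps_available=3):
            best = None
            for n in range(1, pumps_available + 1):
                speed = required_flow_m3h / (n * RATED_FLOW_M3H)   # Q scales with speed
                if speed > 1.0 or speed < MIN_SPEED:
                    continue                                       # infeasible with n pumps
                power = n * RATED_POWER_KW * speed ** 3            # P scales with speed^3
                if best is None or power < best[2]:
                    best = (n, speed, power)
            return best

        for flow in (300.0, 700.0, 1100.0):
            n, speed, power = best_configuration(flow)
            print(f"flow {flow:6.0f} m3/h -> {n} pump(s) at {speed:.2f} speed, {power:.1f} kW")

    In this simplified picture, running more pumps more slowly is favoured whenever it is feasible, which is the intuition behind replacing on/off operation plus throttling valves with coordinated speed control.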

  10. Demonstrator System for the Phase-I Upgrade of the Trigger Readout Electronics of the ATLAS Liquid-Argon Calorimeters

    CERN Document Server

    Chen, Kai; The ATLAS collaboration

    2014-01-01

    The trigger readout electronics of the ATLAS Liquid Argon (LAr) Calorimeters will be improved for the Phase-I luminosity upgrade of the LHC, to enhance the trigger feature extraction. Signals with higher spatial granularity will be digitized and processed by newly developed front-end and back-end electronics. In order to evaluate technical and performance aspects, a demonstrator system has been set up and many off-detector tests have been performed. Analog signal parameters, including noise and cross-talk, as well as digital signal treatment and high-speed data transmission, have been measured and verified. After this series of tests, the demonstrator system was installed on the ATLAS detector before LHC Run 2.

  11. System Analysis of Flat Grinding Process with Wheel Face

    Directory of Open Access Journals (Sweden)

    T. N. Ivanova

    2014-01-01

    The paper presents a system analysis of face grinding with the wheel face, considering the state parameters and the input and output variables of the subsystems, namely: machine tool, workpiece, grinding wheel, cutting fluids, and the contact area. It reveals the factors influencing the temperature and power conditions of the grinding process. Aim: the system analysis of the face grinding process is expected to enable the development of a system of grinding process parameters as a technical system, which will make it possible to evaluate each parameter individually and to optimize the entire system. One of the most important criteria in defining the optimal process conditions is the grinding temperature, which, to avoid the appearance of defects on the surface of the component, should not exceed critical temperature values that are to be determined experimentally. The temperature criterion can be useful for choosing the conditions for the maximum defect-free performance of mechanical face grinding. Other criteria can also be used to define the maximum performance of defect-free grinding, such as a critical power density, which indirectly reflects the allowable thermal stress of the grinding process; the structure of the ground surface, which reflects the presence or absence of a defect layer and is determined after a large number of experiments; and the consumption of the diamond layer. Optimal conditions should not exceed those of defect-free grinding. It is found that the maximum performance depends on the characteristics of the wheels and the grade of the processed material, as well as on the contact area and grinding conditions. Optimal performance depends on the diamond value (cost) and the specific consumption of diamonds in a wheel. The above criteria require formalization as functions of the variable parameters of the grinding process. There is an option for a compromise of inter-criteria optimality, thereby providing a set of acceptable solutions, from

  12. Models for Trustworthy Service and Process Oriented Systems

    DEFF Research Database (Denmark)

    Lopez, Hugo Andres

    2010-01-01

    studies focus on two dichotomies: the global/local views of service interactions, and their imperative/declarative specification. A global view of service interactions describes a process as a protocol for interactions, e.g. a UML sequence diagram or a WS-CDL choreography. A local view describes the system as a set of processes, e.g. specified as a π-calculus or WS-BPEL process, implementing each participant in the process. While the global view is what is usually provided as specification, the local view is a necessary step towards a distributed implementation. If processes are defined imperatively, the control flow is defined explicitly, e.g. as a sequence or flow graph of interactions/commands. In a declarative approach, processes are described as a collection of conditions they should fulfill in order to be considered correct. The two approaches have evolved rather independently from each other. Our

  13. Security Processing for High End Embedded System with Cryptographic Algorithms

    Directory of Open Access Journals (Sweden)

    M.Shankar

    2012-01-01

    This paper is intended to introduce embedded system designers and design tool developers to the challenges involved in designing secure embedded systems. The challenges unique to embedded systems require new approaches to security covering all aspects of embedded system design, from architecture to implementation. Security processing, which refers to the computations that must be performed in a system for the purpose of security, can easily overwhelm the computational capabilities of processors in both low- and high-end embedded systems. The paper also describes the security enforced in a device by the use of proprietary security technology and discusses the security measures taken during the production of the device. We also survey solution techniques to address these challenges, drawing from both current practice and emerging research, and identify open research problems that will require innovations in embedded system architecture and design methodologies.

  14. Licensing process for safety-critical software-based systems

    Energy Technology Data Exchange (ETDEWEB)

    Haapanen, P. [VTT Automation, Espoo (Finland); Korhonen, J. [VTT Electronics, Espoo (Finland); Pulkkinen, U. [VTT Automation, Espoo (Finland)

    2000-12-01

    System vendors nowadays propose software-based technology even for the most critical safety functions in nuclear power plants. Due to the nature of software faults and the way they cause system failures, new methods are needed for the safety and reliability evaluation of these systems. In the research project 'Programmable automation systems in nuclear power plants (OHA)', financed jointly by the Radiation and Nuclear Safety Authority (STUK), the Ministry of Trade and Industry (KTM) and the Technical Research Centre of Finland (VTT), various safety assessment methods and tools for software-based systems are developed and evaluated. As a part of the OHA work, a reference model for the licensing process for software-based safety automation systems is defined. The licensing process is defined as the set of interrelated activities whose purpose is to produce and assess evidence concerning the safety and reliability of the system/application to be licensed and to make the decision about granting the construction and operation permits based on this evidence. The parties in the licensing process are the authority, the licensee (the utility company), system vendors and their subcontractors, and possible external independent assessors. The responsibility for producing the evidence lies in the first place with the licensee, who in most cases relies heavily on vendor expertise. The evaluation and gauging of the evidence is carried out by the authority (possibly using external experts), who can also acquire additional evidence by using their own (independent) methods and tools. A central issue in the licensing process is to combine the quality evidence about the system development process with the information acquired through tests, analyses and operational experience. The purpose of the licensing process described in this report is to act as a reference model both for the authority and the licensee when planning the licensing of individual applications.

  15. Advances in signal processing and intelligent recognition systems

    CERN Document Server

    Gelbukh, Alexander; Mukhopadhyay, Jayanta

    2014-01-01

    This edited volume contains a selection of refereed and revised papers originally presented at the International Symposium on Signal Processing and Intelligent Recognition Systems (SIRS-2014), March 13-15, 2014, Trivandrum, India. The program committee received 134 submissions from 11 countries. Each paper was peer reviewed by at least three independent referees of the program committee, and 52 papers were finally selected. The papers offer stimulating insights into Pattern Recognition, Machine Learning and Knowledge-Based Systems; Signal and Speech Processing; Image and Video Processing; Mobile Computing and Applications; and Computer Vision. The book is directed to researchers and scientists engaged in various fields of signal processing and related areas.

  16. Realization Techniques of Virtual Assembly Process Planning System

    Institute of Scientific and Technical Information of China (English)

    LIU Jian-hua; NING Ru-xin; TANG Cheng-tong

    2005-01-01

    The key realization techniques of a virtual assembly process planning (VAPP) system are analyzed, including the virtual assembly model, real-time collision detection, an automatic constraint recognition algorithm, cable harness assembly process planning and the visualization of assembly process plans at the workshop. A virtual assembly model based on a hierarchical assembly task list (HATL) is put forward, in which assembly tasks are defined to express component assembling operations and are sequentially and hierarchically organized according to different subassemblies, so that the construction process of the product can be modelled accurately. A multi-layer automatic geometric constraint recognition algorithm for identifying assembly constraint relations in the virtual environment is proposed, and a four-layer collision detection algorithm is discussed. A VAPP system is built, and some simple mechanical assemblies are used to illustrate the feasibility of the proposed method and algorithms.

  17. Living is information processing; from molecules to global systems

    CERN Document Server

    Farnsworth, Keith D; Gershenson, Carlos

    2012-01-01

    We extend the concept that life is an informational phenomenon, at every level of organisation, from molecules to the global ecological system. According to this thesis: (a) living is information processing, in which memory is maintained by both molecular states and ecological states as well as the more obvious nucleic acid coding; (b) this information processing has one overall function - to perpetuate itself; and (c) the processing method is filtration (cognition) of, and synthesis of, information at lower levels to appear at higher levels in complex systems (emergence). We show how information patterns are united by the creation of mutual context, generating persistent consequences, to result in `functional information'. This constructive process forms arbitrarily large complexes of information, the combined effects of which include the functions of life. Molecules and simple organisms have already been measured in terms of functional information content; we show how quantification may be extended to each le...

  18. System Choice for Data Processing, Analysis and Applications in Defence

    Directory of Open Access Journals (Sweden)

    Y. S. Rajan

    1995-10-01

    The design of a suitable system for image data processing, analysis and applications in Defence is governed by users' requirements during peace time and the pre-hostility/hostility period. The users need timely information and image products for decision-making. The product specifications in terms of scale, geometrical accuracy, information content, and turnaround time, among other things, are crucial for the design of such systems. The systems are not complete without efficient software for information extraction and analysis and for aiding the decision-making process. Usually, the base data come from high resolution remote sensing systems, both airborne and spaceborne, and also from conventional sources like topographic maps and other intelligence-gathering mechanisms. The database thus evolved is basic and vital for a decision support system. The sensors providing input to the database could be airborne high resolution camera systems, high resolution synthetic aperture radar systems and thermal imaging systems operating from a stand-off range of 50 to 100 km, or high resolution spaceborne panchromatic optical and synthetic aperture radar imagery. High resolution stereo data from airborne and spaceborne sensors are also increasingly needed for image interpretation and analysis. Digital elevation data are another important source of information, derived either from existing topographic maps or from high resolution space stereo imagery. The system should also cater for a large information archival/retrieval system and a data dissemination system for users spread far and wide. This may call for to-and-fro traffic between the central operational system and units spread over different locations, preferably through high speed satellite communication channels. Finally, the total system should have reliability, data security, adequate redundancy and user-friendliness, and be efficient enough to provide timely information transfer for the decision makers. This paper discusses

  19. Internet-based dimensional verification system for reverse engineering processes

    Energy Technology Data Exchange (ETDEWEB)

    Song, In Ho [Ajou University, Suwon (Korea, Republic of); Kim, Kyung Don [Small Business Corporation, Suwon (Korea, Republic of); Chung, Sung Chong [Hanyang University, Seoul (Korea, Republic of)

    2008-07-15

    This paper proposes a design methodology for a Web-based collaborative system applicable to reverse engineering processes in a distributed environment. By using the developed system, design reviewers of new products are able to confirm geometric shapes, inspect dimensional information of products through measured point data, and exchange views with other design reviewers on the Web. In addition, it is applicable to verifying the accuracy of production processes by manufacturing engineers. Functional requirements for designing this Web-based dimensional verification system are described in this paper. ActiveX-server architecture and OpenGL plug-in methods using ActiveX controls realize the proposed system. In the developed system, visualization and dimensional inspection of the measured point data are done directly on the Web: conversion of the point data into a CAD file or a VRML form is unnecessary. Dimensional verification results and design modification ideas are uploaded to markups and/or XML files during collaboration processes. Collaborators review the markup results created by others to produce a good design result on the Web. The use of XML files allows information sharing on the Web to be independent of the platform of the developed system. It is possible to diversify the information sharing capability among design collaborators. The validity and effectiveness of the developed system have been confirmed by case studies.

  20. Process technology for multi-enzymatic reaction systems

    DEFF Research Database (Denmark)

    Xue, Rui; Woodley, John M.

    2012-01-01

    synthesis and fermentation as an alternative to chemical catalysis for the production of pharmaceuticals and fine chemicals. In particular, the use of multiple enzymes is of special interest. However, many challenges remain in the scale-up of a multi-enzymatic system. This review summarizes and discusses the technology options and strategies that are available for the development of multi-enzymatic processes. Some engineering tools, including kinetic models and operating windows, for developing and evaluating such processes are also introduced...
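
    The review's models are not reproduced in the abstract; as a generic illustration of the kind of kinetic model used to evaluate a multi-enzymatic process, a two-enzyme cascade S -> I -> P with Michaelis-Menten kinetics can be simulated as follows (all parameters hypothetical):

        # Generic two-enzyme cascade S --E1--> I --E2--> P with Michaelis-Menten
        # kinetics, integrated with a simple explicit Euler scheme.
        # All parameters are hypothetical and purely illustrative.

        vmax1, km1 = 2.0, 5.0      # enzyme 1: mM/h, mM
        vmax2, km2 = 1.0, 2.0      # enzyme 2: mM/h, mM

        def rates(s, i):
            v1 = vmax1 * s / (km1 + s)     # S -> I
            v2 = vmax2 * i / (km2 + i)     # I -> P
            return v1, v2

        s, i, p = 50.0, 0.0, 0.0           # initial concentrations [mM]
        steps_per_hour = 100
        dt = 1.0 / steps_per_hour
        for step in range(1, 48 * steps_per_hour + 1):
            v1, v2 = rates(s, i)
            s, i, p = s - v1 * dt, i + (v1 - v2) * dt, p + v2 * dt
            if step % (12 * steps_per_hour) == 0:
                print(f"t={step / steps_per_hour:5.1f} h  S={s:6.2f}  I={i:6.2f}  P={p:6.2f} mM")

    Models of this kind make it easy to see, for example, whether the intermediate accumulates because the second enzyme is rate limiting, which is one of the typical questions such engineering tools are used to answer.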

  1. Magnetic Field Satellite (Magsat) data processing system specifications

    Science.gov (United States)

    Berman, D.; Gomez, R.; Miller, A.

    1980-01-01

    The software specifications for the MAGSAT data processing system (MDPS) are presented. The MDPS is divided functionally into preprocessing of primary input data, data management, chronicle processing, and postprocessing. Data organization and validity, and checks of spacecraft and instrumentation, are discussed. Output products of the MDPS, including various plots and data tapes, are described. Formats for important tapes are presented. Discussions and mathematical formulations for coordinate transformations and field model coefficients are included.

  2. High-Level Waste System Process Interface Description

    Energy Technology Data Exchange (ETDEWEB)

    d'Entremont, P.D.

    1999-01-14

    The High-Level Waste System is a set of six different processes interconnected by pipelines. These processes function as one large treatment plant that receives, stores, and treats high-level wastes from various generators at SRS and converts them into forms suitable for final disposal. The three major forms are borosilicate glass, which will be eventually disposed of in a Federal Repository, Saltstone to be buried on site, and treated water effluent that is released to the environment.

  3. Configuration and Data Management Process and the System Safety Professional

    Science.gov (United States)

    Shivers, Charles Herbert; Parker, Nelson C. (Technical Monitor)

    2001-01-01

    This article presents a discussion of the configuration management (CM) and the Data Management (DM) functions and provides a perspective of the importance of configuration and data management processes to the success of system safety activities. The article addresses the basic requirements of configuration and data management generally based on NASA configuration and data management policies and practices, although the concepts are likely to represent processes of any public or private organization's well-designed configuration and data management program.

  4. Automatic and controlled processing in the corticocerebellar system.

    Science.gov (United States)

    Ramnani, Narender

    2014-01-01

    During learning, performance changes often involve a transition from controlled processing in which performance is flexible and responsive to ongoing error feedback, but effortful and slow, to a state in which processing becomes swift and automatic. In this state, performance is unencumbered by the requirement to process feedback, but its insensitivity to feedback reduces its flexibility. Many properties of automatic processing are similar to those that one would expect of forward models, and many have suggested that these may be instantiated in cerebellar circuitry. Since hierarchically organized frontal lobe areas can both send and receive commands, I discuss the possibility that they can act both as controllers and controlled objects and that their behaviors can be independently modeled by forward models in cerebellar circuits. Since areas of the prefrontal cortex contribute to this hierarchically organized system and send outputs to the cerebellar cortex, I suggest that the cerebellum is likely to contribute to the automation of cognitive skills, and to the formation of habitual behavior which is resistant to error feedback. An important prerequisite to these ideas is that cerebellar circuitry should have access to higher order error feedback that signals the success or failure of cognitive processing. I have discussed the pathways through which such feedback could arrive via the inferior olive and the dopamine system. Cerebellar outputs inhibit both the inferior olive and the dopamine system. It is possible that learned representations in the cerebellum use this as a mechanism to suppress the processing of feedback in other parts of the nervous system. Thus, cerebellar processes that control automatic performance may be completed without triggering the engagement of controlled processes by prefrontal mechanisms.

  5. Information systems for material flow management in construction processes

    Science.gov (United States)

    Mesároš, P.; Mandičák, T.

    2015-01-01

    The article describes options for the management of material flows in the construction process. Management and resource planning is one of the key factors influencing the effectiveness of a construction project, and it is very difficult to set these flows correctly. Several options and tools are currently available to do this; information systems and their modules can be used precisely for the management of materials in the construction process.

  6. Tera-Op Reliable Intelligently Adaptive Processing System (TRIPS)

    Science.gov (United States)

    2004-04-01

    Report AFRL-IF-WP-TR-2004-1514, "Tera-Op Reliable Intelligently Adaptive Processing System (TRIPS)", Stephen W. Keckler, Doug Burger, Michael Dahlin et al.; contract F33615-01-C-1892; reporting period ending 03/31/2004. Only fragments of the report text survive in this record, including a note that the project's influence is expected to increase with the fabrication of the prototype in Phase 2.

  7. Engineered Barrier System Degradation, Flow, and Transport Process Model Report

    Energy Technology Data Exchange (ETDEWEB)

    E.L. Hardin

    2000-07-17

    The Engineered Barrier System Degradation, Flow, and Transport Process Model Report (EBS PMR) is one of nine PMRs supporting the Total System Performance Assessment (TSPA) being developed by the Yucca Mountain Project for the Site Recommendation Report (SRR). The EBS PMR summarizes the development and abstraction of models for processes that govern the evolution of conditions within the emplacement drifts of a potential high-level nuclear waste repository at Yucca Mountain, Nye County, Nevada. Details of these individual models are documented in 23 supporting Analysis/Model Reports (AMRs). Nineteen of these AMRs are for process models, and the remaining four describe the abstraction of results for application in TSPA. The process models themselves cluster around four major topics: "Water Distribution and Removal Model, Physical and Chemical Environment Model, Radionuclide Transport Model, and Multiscale Thermohydrologic Model". One AMR (Engineered Barrier System-Features, Events, and Processes/Degradation Modes Analysis) summarizes the formal screening analysis used to select the Features, Events, and Processes (FEPs) included in TSPA and those excluded from further consideration. Performance of a potential Yucca Mountain high-level radioactive waste repository depends on both the natural barrier system (NBS) and the engineered barrier system (EBS) and on their interactions. Although the waste packages are generally considered as components of the EBS, the EBS as defined in the EBS PMR includes all engineered components outside the waste packages. The principal function of the EBS is to complement the geologic system in limiting the amount of water contacting nuclear waste. A number of alternatives were considered by the Project for different EBS designs that could provide better performance than the design analyzed for the Viability Assessment. The design concept selected was Enhanced Design Alternative II (EDA II).

  8. HACCP and quality system in the food processing industry

    Directory of Open Access Journals (Sweden)

    Turubatović Lazar

    2002-01-01

    Full Text Available HACCP (Hazard Analysis and Critical Control Points) is an indispensable contemporary system of process control in the food processing industry. In its original meaning, this control procedure includes hazard analysis and identification of the points in the production process where product contamination is reasonably likely to occur, resulting in an unsafe product. At the critical points, control of the production process should be stricter in order to eliminate or reduce product safety risks. Since the aim of implementing a quality management system according to the standards of the ISO 9000 series is the formulation of a product that meets "the requirements stated or implied" (where the implied requirements refer to the prescribed quality requirements, which in the food industry above all comprise safety), it is necessary to build HACCP into the quality system. The application of HACCP principles when introducing a quality system should be extended to those parts of the production process in which the required quality of the product may be at risk.

  9. Digital signal processing in power system protection and control

    CERN Document Server

    Rebizant, Waldemar; Wiszniewski, Andrzej

    2011-01-01

    Digital Signal Processing in Power System Protection and Control bridges the gap between the theory of protection and control and the practical applications of protection equipment. Understanding how protection functions is crucial not only for equipment developers and manufacturers, but also for their users who need to install, set and operate the protection devices in an appropriate manner. After introductory chapters related to protection technology and functions, Digital Signal Processing in Power System Protection and Control presents the digital algorithms for signal filtering, followed
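    The record does not name specific filtering algorithms, so as a hedged illustration of the kind of digital algorithm used in protection relays, the sketch below implements a full-cycle DFT phasor estimator in Python; the sampling rate, system frequency and test signal are assumptions made for the example, not material from the book.

```python
import numpy as np

def fullcycle_dft_phasor(samples: np.ndarray) -> complex:
    """Estimate the fundamental-frequency phasor from one cycle of samples.

    `samples` must hold exactly one nominal power cycle; the returned complex
    number has the peak value of the fundamental as magnitude and its phase
    as angle.
    """
    n = len(samples)
    k = np.arange(n)
    # Correlate with one cycle of cosine and sine (the full-cycle DFT).
    return (2.0 / n) * np.sum(samples * np.exp(-2j * np.pi * k / n))

if __name__ == "__main__":
    fs, f0 = 1000.0, 50.0            # assumed sampling rate and system frequency
    n = int(fs / f0)                 # samples per cycle
    t = np.arange(n) / fs
    # Fundamental plus a decaying DC offset and a 3rd harmonic, as in fault signals.
    x = (100 * np.cos(2 * np.pi * f0 * t - 0.5)
         + 20 * np.exp(-t / 0.02)
         + 10 * np.cos(6 * np.pi * f0 * t))
    ph = fullcycle_dft_phasor(x)
    print(f"magnitude = {abs(ph):.1f}, angle = {np.angle(ph):.2f} rad")
```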

  10. H∞ filtering for stochastic systems driven by Poisson processes

    Science.gov (United States)

    Song, Bo; Wu, Zheng-Guang; Park, Ju H.; Shi, Guodong; Zhang, Ya

    2015-01-01

    This paper investigates the H∞ filtering problem for stochastic systems driven by Poisson processes. By utilising the martingale theory such as the predictable projection operator and the dual predictable projection operator, this paper transforms the expectation of the stochastic integral with respect to the Poisson process into the expectation of a Lebesgue integral. Then, based on this, this paper designs an H∞ filter such that the filtering error system is mean-square asymptotically stable and satisfies a prescribed H∞ performance level. Finally, a simulation example is given to illustrate the effectiveness of the proposed filtering scheme.
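    The transformation described above, from a stochastic integral against the Poisson process to a Lebesgue integral, rests on the standard compensator (martingale) identity; a generic statement, with λ denoting the Poisson intensity and f a predictable integrand (notation assumed here, not taken from the paper), reads:

```latex
\mathbb{E}\!\left[\int_0^t f(s)\,\mathrm{d}N_s\right]
  = \mathbb{E}\!\left[\int_0^t f(s)\,\lambda\,\mathrm{d}s\right],
\qquad\text{equivalently,}\qquad
N_t - \lambda t \ \text{is a martingale.}
```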

  11. Aligning Work Processes and the Adviser Portal Bank System

    DEFF Research Database (Denmark)

    Jørgensen, Jens Bæk; Lassen, Kristian Bisgaard

    2006-01-01

    The Adviser Portal (AP) is a new IT system for 15 Danish banks. The main goal of AP is to increase the efficiency and quality of bank advisers' work. Requirements engineering for AP includes describing new work processes that must be supported by AP using a combination of: (1) prose and informal drawings; (2) ...

  12. Scheduling algorithms for automatic control systems for technological processes

    Science.gov (United States)

    Chernigovskiy, A. S.; Tsarev, R. Yu; Kapulin, D. V.

    2017-01-01

    Wide use of automatic process control systems and of high-performance systems containing a number of computers (processors) gives opportunities for creating high-quality and fast production that increases the competitiveness of an enterprise. Exact and fast calculations, control computation, and processing of big data arrays all require a high level of productivity and, at the same time, minimum time for data handling and for obtaining results. In order to achieve the best time, it is necessary not only to use computing resources optimally, but also to design and develop the software so that the time gain is maximal. For this purpose, task (job or operation) scheduling techniques for multi-machine/multiprocessor systems are applied. Some of the basic task scheduling methods for multi-machine process control systems are considered in this paper, their advantages and disadvantages are highlighted, and some considerations on their use in developing software for automatic process control systems are made.
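    The paper surveys scheduling methods without fixing one; as a minimal sketch of a classical heuristic in this family, the Python fragment below implements longest-processing-time (LPT) list scheduling of tasks onto m identical machines (the task durations and machine count are made-up example data).

```python
import heapq

def lpt_schedule(durations, m):
    """Assign tasks to m identical machines with the LPT heuristic.

    Returns (makespan, assignment), where assignment maps machine index to
    the list of task indices it executes.
    """
    # Machines kept in a min-heap keyed by their current finishing time.
    machines = [(0.0, i) for i in range(m)]
    heapq.heapify(machines)
    assignment = {i: [] for i in range(m)}
    # Place the longest tasks first, always on the least-loaded machine.
    for task, d in sorted(enumerate(durations), key=lambda p: -p[1]):
        load, i = heapq.heappop(machines)
        assignment[i].append(task)
        heapq.heappush(machines, (load + d, i))
    makespan = max(load for load, _ in machines)
    return makespan, assignment

if __name__ == "__main__":
    makespan, plan = lpt_schedule([7, 5, 3, 2, 2, 8, 4], m=3)
    print(makespan, plan)
```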

  13. QUrdPro: Query processing system for Urdu Language

    Directory of Open Access Journals (Sweden)

    Rukhsana Thaker,

    2015-06-01

    Full Text Available The tremendous increase in multilingual data on the internet has increased the demand for efficient retrieval of information. Urdu is one of the widely spoken and written languages of South Asia. Due to the unstructured format of the Urdu language, information retrieval is a big challenge. Question answering systems aim to retrieve point-to-point answers rather than flooding the user with documents; they are needed when the user requires in-depth knowledge in a particular domain. When the user needs some information, the system must give the relevant answer. Question-answer retrieval over an ontology knowledge base provides a convenient way to obtain knowledge for use, but the natural language needs to be mapped to the query statements of the ontology. This paper describes QUrdPro, an ontology-based query processing system that combines NLP and ontology, making use of the ontology in several phases for efficient query processing. Our focus is on the knowledge derived from the concepts used in the ontology and the relationships between these concepts. The architecture of QUrdPro, a query processing system for Urdu, is described, and the process model for the system is also discussed in detail.

  14. Processing of Sensory Information in the Olfactory System

    DEFF Research Database (Denmark)

    The olfactory system is an attractive model system due to the easy control of sensory input and the experimental accessibility in animal studies. The odorant signals are processed from receptor neurons to a neural network of mitral and granular cells while various types of nonlinear behaviour can...... and equation-free techniques allow for a better reproduction and understanding of recent experimental findings. Talks: Olfaction as a Model System for Sensory-Processing Neural Networks (Jens Midtgaard, University of Copenhagen, Denmark) Nonlinear Effects of Signal Transduction in Olfactory Sensory Neurons...... (Peter Borowski, University of British Columbia, Canada; Juergen Reidl, University of Heidelberg, Germany; Jens Starke, Technical University of Denmark, Denmark; Martin Zapotocky, Max Planck Institute for Physics of Complex Systems, Germany; Markus Eiswirth, Fritz-Haber Institut, Germany; Anke Sensse...

  15. Process-based design of dynamical biological systems

    Science.gov (United States)

    Tanevski, Jovan; Todorovski, Ljupčo; Džeroski, Sašo

    2016-09-01

    The computational design of dynamical systems is an important emerging task in synthetic biology. Given desired properties of the behaviour of a dynamical system, the task of design is to build an in-silico model of a system whose simulated behaviour meets these properties. We introduce a new, process-based, design methodology for addressing this task. The new methodology combines a flexible process-based formalism for specifying the space of candidate designs with multi-objective optimization approaches for selecting the most appropriate among these candidates. We demonstrate that the methodology is general enough to both formulate and solve tasks of designing deterministic and stochastic systems, successfully reproducing plausible designs reported in previous studies and proposing new designs that meet the design criteria, but have not been previously considered.
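    The process-based formalism and multi-objective optimiser of the paper are not reproduced in this record; the sketch below only illustrates, under an assumed one-species model and an assumed design criterion, the inner step any such design loop needs: simulate one candidate parameterisation and score it against a desired behaviour.

```python
import numpy as np
from scipy.integrate import solve_ivp

def simulate_candidate(k_prod, k_deg, t_end=50.0):
    """Simulate a one-species production/degradation model dx/dt = k_prod - k_deg*x."""
    rhs = lambda t, x: [k_prod - k_deg * x[0]]
    sol = solve_ivp(rhs, (0.0, t_end), [0.0])
    return sol.y[0, -1]          # level reached at the end of the simulation

def design_score(params, target=2.0):
    """Lower is better: squared distance of the final level from the desired level."""
    k_prod, k_deg = params
    return (simulate_candidate(k_prod, k_deg) - target) ** 2

if __name__ == "__main__":
    # Three candidate parameterisations; a real design loop would let an
    # optimiser propose these.
    for candidate in [(1.0, 1.0), (2.0, 1.0), (4.0, 2.0)]:
        print(candidate, design_score(candidate))
```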

  16. Modelling and control of dynamic systems using gaussian process models

    CERN Document Server

    Kocijan, Juš

    2016-01-01

    This monograph opens up new horizons for engineers and researchers in academia and in industry dealing with or interested in new developments in the field of system identification and control. It emphasizes guidelines for working solutions and practical advice for their implementation rather than the theoretical background of Gaussian process (GP) models. The book demonstrates the potential of this recent development in probabilistic machine-learning methods and gives the reader an intuitive understanding of the topic. The current state of the art is treated along with possible future directions for research. Systems control design relies on mathematical models and these may be developed from measurement data. This process of system identification, when based on GP models, can play an integral part of control design in data-based control and its description as such is an essential aspect of the text. The background of GP regression is introduced first with system identification and incorporation of prior know...
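    As a hedged illustration of GP-based system identification of the kind the monograph treats (the data-generating system, kernel choice and one-step NARX regressor structure below are assumptions made for the example, not the book's case studies):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Generate input/output data from an "unknown" first-order nonlinear system.
rng = np.random.default_rng(0)
u = rng.uniform(-1, 1, 300)
y = np.zeros(301)
for k in range(300):
    y[k + 1] = 0.8 * y[k] + 0.4 * np.tanh(u[k]) + 0.01 * rng.standard_normal()

# One-step-ahead NARX regressors: predict y[k+1] from (y[k], u[k]).
X = np.column_stack([y[:-1], u])
t = y[1:]

gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0) + WhiteKernel(1e-3),
                              normalize_y=True)
gp.fit(X[:250], t[:250])

# Predictions come with an uncertainty estimate, the main appeal of GP models.
mean, std = gp.predict(X[250:], return_std=True)
print("RMS one-step error:", np.sqrt(np.mean((mean - t[250:]) ** 2)))
print("mean predictive std:", std.mean())
```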

  17. IMPLEMENTATION OF IMAGE PROCESSING IN REAL TIME CAR PARKING SYSTEM

    Directory of Open Access Journals (Sweden)

    SAYANTI BANERJEE,

    2011-02-01

    Full Text Available Car parking lots are an important object class in many traffic and civilian applications. With the problems of increasing urban traffic congestion and the ever increasing shortage of space, these car parking lots need to be well equipped with automatic parking information and guidance systems. Goals of intelligent parking lot management include counting the number of parked cars and identifying the available locations. This work proposes a new system for providing parking information and guidance using image processing. The proposed system includes counting the number of parked vehicles and identifying the stalls available. The system detects cars through images instead of using electronic sensors embedded in the floor. A camera is installed at the entry point of the parking lot. It captures image sequences. The image sequences are then analyzed using digital image processing for vehicle detection, and according to the status of vehicle occupancy inside, real-time guidance and information is provided to the incoming driver.
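    The paper's exact detection pipeline is not given in the record; the fragment below is only a minimal OpenCV-style sketch of stall-occupancy detection by differencing the current frame against an empty-lot reference, where the frames are synthetic stand-ins and the stall rectangles and threshold are assumptions.

```python
import cv2
import numpy as np

# Synthetic stand-ins for camera images: an empty-lot reference frame and a
# current frame in which a bright "vehicle" occupies the second stall.
reference = np.full((280, 320), 90, dtype=np.uint8)
frame = reference.copy()
cv2.rectangle(frame, (140, 80), (200, 200), color=200, thickness=-1)

# Hypothetical stall regions as (x, y, width, height) rectangles.
stalls = [(40, 60, 80, 160), (130, 60, 80, 160), (220, 60, 80, 160)]

def occupied(ref, img, rect, min_changed_fraction=0.25):
    """A stall counts as occupied if enough pixels differ from the empty reference."""
    x, y, w, h = rect
    diff = cv2.absdiff(ref[y:y + h, x:x + w], img[y:y + h, x:x + w])
    _, mask = cv2.threshold(diff, 40, 255, cv2.THRESH_BINARY)
    return cv2.countNonZero(mask) / float(w * h) > min_changed_fraction

status = [occupied(reference, frame, r) for r in stalls]
print("occupied stalls:", sum(status), "free stalls:", len(stalls) - sum(status))
```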

  18. The UNIX* Localization and Chinese Information Processing System

    Institute of Scientific and Technical Information of China (English)

    孙玉方

    1991-01-01

    To facilitate the wider use of computers all over the world, it is necessary to provide National Language Support in computer systems. This paper introduces some aspects of the design and implementation of the UNIX-based Chinese Information Processing System (CIPS). Due to the special nature of the Oriental languages, and in order to be able to share resources and exchange information between different countries, it is necessary to create a standard multilingual information exchange code. The unified Chinese/Japanese/Korean character code, the Han Character Collection (HCC), was proposed to ISO/IEC JTC1/SC2/WG2 by the China Computer and Information Processing Standardization Technical Committee. Based on this character set and the corresponding coding system, it is possible to create a truly internationalized UNIX system.

  19. Model based methods and tools for process systems engineering

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    Process systems engineering (PSE) provides means to solve a wide range of problems in a systematic and efficient manner. This presentation will give a perspective on model based methods and tools needed to solve a wide range of problems in product-process synthesis-design. These methods and tools...... of the framework. The issue of commercial simulators or software providing the necessary features for product-process synthesis-design as opposed to their development by the academic PSE community will also be discussed. An example of a successful collaboration between academia-industry for the development...

  20. Image processing system for digital chest X-ray images

    Energy Technology Data Exchange (ETDEWEB)

    Cocklin, M.; Gourlay, A.; Jackson, P.; Kaye, G.; Miessler, M. (I.B.M. U.K. Scientific Centre, Winchester (UK)); Kerr, I.; Lams, P. (Radiology Department, Brompton Hospital, London (UK))

    1984-01-01

    This paper investigates the requirements for image processing of digital chest X-ray images. These images are conventionally recorded on film and are characterised by large size, wide dynamic range and high resolution. X-ray detection systems are now becoming available for capturing these images directly in photoelectronic-digital form. The hardware and software facilities required for handling these images are described. These facilities include high resolution digital image displays, programmable video look up tables, image stores for image capture and processing and a full range of software tools for image manipulation. Examples are given of the applications of digital image processing techniques to this class of image.
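    The programmable video look-up tables mentioned above can be mimicked in software; the snippet below is a small illustrative sketch, not the paper's implementation, of applying a window/level LUT to a wide-dynamic-range image so it can be rendered on an 8-bit display (the bit depth, window and level values are assumed).

```python
import numpy as np

def window_level_lut(bits_stored=12, window=1500, level=2000):
    """Build a LUT mapping raw detector values to 8-bit display values."""
    values = np.arange(2 ** bits_stored, dtype=np.float64)
    lo, hi = level - window / 2.0, level + window / 2.0
    # Linear ramp inside the window, clipped to black/white outside it.
    lut = np.clip((values - lo) / (hi - lo), 0.0, 1.0) * 255.0
    return lut.astype(np.uint8)

# Apply the LUT to a synthetic 12-bit chest image (random data stands in for pixels).
raw = np.random.default_rng(1).integers(0, 4096, size=(512, 512), dtype=np.uint16)
display = window_level_lut()[raw]   # fancy indexing performs the per-pixel table lookup
print(display.dtype, display.min(), display.max())
```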

  1. Intellectual Control System of Processing on CNC Machines

    Science.gov (United States)

    Nekrasov, R. Y.; Lasukov, A. A.; Starikov, A. I.; Soloviev, I. V.; Bekareva, O. V.

    2016-04-01

    Scientific and technical progress places great demands on the quality of engineering production. The priority is to ensure that metalworking equipment maintains the required dimensional accuracy during the entire period of operation at minimum manufacturing cost. The article considers the problem of increasing the accuracy of processing products on CNC machines. The authors offer a solution to the problem by providing compensating adjustments to the trajectory of the cutting tool and the machining mode. This requires the creation of mathematical models of process behaviour in an automated technological system of operations (OATS). Based on this research, the authors propose a generalized diagram of diagnosis and operative input correction, together with approximate mathematical models of individual diagnostic processes.

  2. Signal Processing in Large Systems: a New Paradigm

    CERN Document Server

    Couillet, Romain

    2011-01-01

    For a long time, signal processing applications, and most particularly detection and parameter estimation methods, have relied on the limiting behaviour of test statistics and estimators, as the number n of observations of a population grows large comparatively to the population size N, i.e. n>>N. Modern technological and societal advances now demand the study of sometimes extremely large populations, while simultaneously requiring fast signal processing due to accelerated system dynamics; this results in not-so-large practical ratios n/N, sometimes even smaller than one. A disruptive change in classical signal processing methods has therefore been initiated in the past ten years, mostly spurred by the field of large dimensional random matrix theory. The early literature in random matrix theory for signal processing applications is however scarce and highly technical. This tutorial proposes an accessible methodological introduction to the modern tools of random matrix theory and to the signal processing metho...

  3. Practical Implementations of Advanced Process Control for Linear Systems

    DEFF Research Database (Denmark)

    Knudsen, Jørgen K . H.; Huusom, Jakob Kjøbsted; Jørgensen, John Bagterp

    2013-01-01

    Most advanced process control systems are based on Model Predictive Control (MPC). In this paper we discuss three critical issues for the practical implementation of linear MPC for process control applications. The first issue is related to offset-free control and disturbance models; the second issue is related to the use of soft output constraints in MPC; and the third issue is related to the computationally efficient solution of the quadratic program in the dynamic regulator of the MPC. We have implemented MPC in .Net using C# and the MPCMath library. The implemented MPC is based on the target ... models and integration of the innovation errors. If the disturbances increase, offset-free control cannot be achieved without violation of process constraints. A target calculation function is used to calculate the optimal achievable target for the process. The use of soft constraints for process output...
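    The authors' C#/MPCMath implementation is not available in this record; as a hedged, generic sketch of the second issue (soft output constraints realised with slack variables in the regulator's quadratic program), the following Python/cvxpy fragment uses an invented scalar model, horizon and weights.

```python
import cvxpy as cp

# Illustrative single-input single-output model x[k+1] = a*x[k] + b*u[k], y = x.
a, b = 0.9, 0.5
N = 20                               # prediction horizon
x0, y_max, rho = 4.0, 1.0, 1000.0    # initial state, output bound, slack penalty

x = cp.Variable(N + 1)
u = cp.Variable(N)
s = cp.Variable(N + 1, nonneg=True)  # slack variables soften the output constraint

cost = cp.sum_squares(x) + 0.1 * cp.sum_squares(u) + rho * cp.sum(s)
constraints = [x[0] == x0]
for k in range(N):
    constraints += [x[k + 1] == a * x[k] + b * u[k],
                    u[k] >= -2.0, u[k] <= 2.0]      # hard input constraints
constraints += [x <= y_max + s]                      # soft upper bound on the output

prob = cp.Problem(cp.Minimize(cost), constraints)
prob.solve()
print("optimal cost:", prob.value, "max slack used:", s.value.max())
```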

  4. Research on the Visual Processing System of the Punch Press

    Directory of Open Access Journals (Sweden)

    Sun Xuan

    2016-01-01

    Full Text Available Most of the raw materials for small hardware processing are plate scraps, and processing is usually carried out through the manual operation of an ordinary punch press, a way of working with low production efficiency and high labor intensity. In order to improve the automation level of production, a visual processing system for a punch press manipulator was developed and designed based on the MFC tools of the Visual Studio software platform. Through image acquisition and image processing, information about the plate to be processed is obtained, such as shape, length, center of gravity position, and pose, providing the relevant parameters for the positioning, gripping, and placing of the feeding manipulator onto the punch table and for the automatic programming of the punching machine, so as to realize automatic press feeding and processing.
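    The paper's image-processing chain is not detailed in the record; the sketch below only illustrates one plausible step, extracting the workpiece contour, centre of gravity and orientation from a binarised image with OpenCV, where the input image is a synthetic stand-in and the Otsu thresholding choice is an assumption.

```python
import cv2
import numpy as np

# Synthetic stand-in for a camera image: a dark background with a bright,
# rotated rectangular "plate" (a real system would grab frames from the camera).
img = np.zeros((400, 400), dtype=np.uint8)
box = cv2.boxPoints(((200.0, 180.0), (220.0, 90.0), 25.0)).astype(np.int32)
cv2.fillPoly(img, [box], color=255)

_, binary = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# The largest external contour is assumed to be the workpiece outline.
contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
part = max(contours, key=cv2.contourArea)

m = cv2.moments(part)
cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]        # centre of gravity in pixels
(_, _), (width, height), angle = cv2.minAreaRect(part)   # oriented bounding box

print(f"area={cv2.contourArea(part):.0f}px centroid=({cx:.1f},{cy:.1f}) "
      f"size={width:.1f}x{height:.1f} angle={angle:.1f}deg")
```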

  5. Decision Support Systems (DSS) in Construction Tendering Processes

    CERN Document Server

    Mohemad, Rosmayati; Othman, Zulaiha Ali; Noor, Noor Maizura Mohamad

    2010-01-01

    The successful execution of a construction project is heavily impacted by making the right decision during tendering processes. Managing tender procedures is very complex and uncertain, involving coordination of many tasks and individuals with different priorities and objectives. Bias and inconsistent decisions are inevitable if the decision-making process totally depends on intuition, subjective judgement or emotion. To make transparent decisions and ensure healthy competition in tendering, there exists a need for a flexible guidance tool for decision support. The aim of this paper is to give a review of current practices of Decision Support Systems (DSS) technology in construction tendering processes. Current practices of general tendering processes as applied in most countries in different regions such as the United States, Europe, the Middle East and Asia are comprehensively discussed. Applications of Web-based tendering processes are also summarised in terms of their properties. Besides that, a summary of Decision Support Sy...

  6. System and process for capture of acid gasses at elevated pressure from gaseous process streams

    Science.gov (United States)

    Heldebrant, David J.; Koech, Phillip K.; Linehan, John C.; Rainbolt, James E.; Bearden, Mark D.; Zheng, Feng

    2016-09-06

    A system, method, and material that enable the pressure-activated reversible chemical capture of acid gasses such as CO2 from gas volumes such as streams, flows or any other volume. Once the acid gas is chemically captured, the resulting product, typically a zwitterionic salt, can be subjected to a reduced pressure, whereupon the resulting product will release the captured acid gas and the capture material will be regenerated. The invention includes this process as well as the materials and systems for carrying out and enabling this process.

  7. Decision Support Systems (DSS) in Construction Tendering Processes

    Directory of Open Access Journals (Sweden)

    Rosmayati Mohemad

    2010-03-01

    Full Text Available The successful execution of a construction project is heavily impacted by making the right decision during tendering processes. Managing tender procedures is very complex and uncertain, involving coordination of many tasks and individuals with different priorities and objectives. Bias and inconsistent decisions are inevitable if the decision-making process totally depends on intuition, subjective judgement or emotion. To make transparent decisions and ensure healthy competition in tendering, there exists a need for a flexible guidance tool for decision support. The aim of this paper is to give a review of current practices of Decision Support Systems (DSS) technology in construction tendering processes. Current practices of general tendering processes as applied in most countries in different regions such as the United States, Europe, the Middle East and Asia are comprehensively discussed. Applications of Web-based tendering processes are also summarised in terms of their properties. Besides that, a summary of Decision Support System (DSS) components is included in the next section. Furthermore, prior research on the implementation of DSS approaches in tendering processes is discussed in detail. Current issues arising from both paper-based and Web-based tendering processes are outlined. Finally, a conclusion is included at the end of this paper.

  8. Advanced information processing system: Input/output network management software

    Science.gov (United States)

    Nagle, Gail; Alger, Linda; Kemp, Alexander

    1988-01-01

    The purpose of this document is to provide the software requirements and specifications for the Input/Output Network Management Services for the Advanced Information Processing System. This introduction and overview section is provided to briefly outline the overall architecture and software requirements of the AIPS system before discussing the details of the design requirements and specifications of the AIPS I/O Network Management software. A brief overview of the AIPS architecture is followed by a more detailed description of the network architecture.

  9. A system for pulsed NQR spectrometer control and signal processing

    Science.gov (United States)

    Gourdji, M.; Péneau, A.

    The system described was built at the IEF around an HP-2100A computer and is presently used with a nitrogen-14 pulsed NQR spectrometer. Two main functions are provided: spectrometer control (radio-frequency, pulse sequence repetition rate, sample temperature settings) and signal processing (accumulation of the NQR signals, Fourier transform). Results are presented which show typical uses of the system for the observation of complex signals.

  10. Dynamic systems-engineering process - The application of concurrent engineering

    Science.gov (United States)

    Wiskerchen, Michael J.; Pittman, R. Bruce

    1989-01-01

    A system engineering methodology is described which enables users, particularly NASA and DOD, to accommodate changing needs; incorporate emerging technologies; identify, quantify, and manage system risks; manage evolving functional requirements; track the changing environment; and reduce system life-cycle costs. The approach is a concurrent, dynamic one which starts by constructing a performance model defining the required system functions and the interrelationships. A detailed probabilistic risk assessment of the system elements and their interrelationships is performed, and quantitative analysis of the reliability and maintainability of an engineering system allows its different technical and process failure modes to be identified and their probabilities to be computed. Decision makers can choose technical solutions that maximize an objective function and minimize the probability of failure under resource constraints.

  11. Submerged demineralizer system processing of TMI-2 accident waste water

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez, H.F.; Quinn, G.J.

    1983-02-01

    Accident-generated radioactive waste at Three Mile Island Unit 2 includes a variety of high- and low-specific-activity waste. The high-specific-activity waste, particularly over one million gallons of contaminated water, required special processing and secondary waste handling. General Public Utilities and its contractors developed a zeolite-based ion-exchange system called the Submerged Demineralizer System to reduce contamination levels in the water to below allowable limits. Testing and modifications resulted in an operating system that had successfully processed waste water from the Reactor Coolant Bleed Tanks, the Reactor Building Basement, and the Reactor Coolant System as of August 1982. System design objectives were met and the decontamination criteria established in 10 CFR 20 were attained. Additional wastes that could not be handled routinely were generated by another water-processing system, called EPICOR II. EPICOR II wastes are discussed. Low-specific-activity (LSA) wastes, such as trash and resin-bed waste canisters, are also covered. LSA wastes are routinely handled and shipped according to existing industry practice. Plant records are summarized to provide approximate yearly volumes and curie loadings of low-specific-activity wastes being shipped off the Island to a commercial burial site.

  12. A common type system for clinical natural language processing

    Directory of Open Access Journals (Sweden)

    Wu Stephen T

    2013-01-01

    Full Text Available Abstract Background One challenge in reusing clinical data stored in electronic medical records is that these data are heterogeneous. Clinical Natural Language Processing (NLP) plays an important role in transforming information in clinical text into a standard representation that is comparable and interoperable. Information may be processed and shared when a type system specifies the allowable data structures. Therefore, we aim to define a common type system for clinical NLP that enables interoperability between structured and unstructured data generated in different clinical settings. Results We describe a common type system for clinical NLP that has an end target of deep semantics based on Clinical Element Models (CEMs), thus interoperating with structured data and accommodating diverse NLP approaches. The type system has been implemented in UIMA (Unstructured Information Management Architecture) and is fully functional in a popular open-source clinical NLP system, cTAKES (clinical Text Analysis and Knowledge Extraction System) versions 2.0 and later. Conclusions We have created a type system that targets deep semantics, thereby allowing NLP systems to encapsulate knowledge from text and share it alongside heterogeneous clinical data sources. Rather than the surface semantics that are typically the end product of NLP algorithms, CEM-based semantics explicitly build in deep clinical semantics as the point of interoperability with more structured data types.

  13. The Error Monitoring and Processing System in Alcohol Use

    Directory of Open Access Journals (Sweden)

    Menizibeya O. Welcome

    2010-10-01

    Full Text Available Background: Current data suggest that alcohol might play a significant role in error commission. Error commission is related to the functions of the Error Monitoring and Processing System (EMPS), located in the substantia nigra of the midbrain, the basal ganglia and the cortex of the forebrain. The main components of the EMPS are the dopaminergic system and the anterior cingulate cortex. Although recent data show that alcohol disrupts the EMPS, the ways in which alcohol affects this system are poorly understood. Aims & Objectives: We reviewed recent data that suggest an indirect effect of alcohol use on error commission. Methods / Study Design: Databases were searched for relevant literature using the following keyword combination: Alcohol AND Error Commission (OR Processing, Monitoring, Correction, Detection). Literature was searched in scientific databases (Medline, DOAJ, Embase) from 1940 to August 2010 and on journal websites (Psychophysiology, Neuroscience and Trends in Neuroscience). Manual book searches, including library information, were included in the data collection process. Other additional information was searched through Google. Results / Findings: Blood and brain glucose levels play a vital role in error commission, and are related to error commission, monitoring and processing through the modulation of the activity of the dopaminergic system. To summarize the results of our findings, we here suggest a hypothesis of an Alcohol-Related Glucose-Dependent System of Error Monitoring and Processing (the ARGD-EMPS hypothesis), which holds that the disruption of the EMPS is related to the competency of glucose homeostasis regulation, which in turn may determine the dopamine level as a major component of the EMPS. The ARGD-EMPS hypothesis explains the general processes and mechanism of alcohol-related disruption of the EMPS. Conclusion: Alcohol may indirectly disrupt the EMPS by affecting the dopamine level through disorders in blood glucose homeostasis regulation. The

  14. Critical factors in the implementation process of integrated management systems

    Directory of Open Access Journals (Sweden)

    Ademir Antonio Ferreira

    2015-09-01

    Full Text Available This study is the result of research whose purpose was to study the implementation process of integrated management systems, called ERP (Enterprise Resource Planning), in the business environment. More specifically, this study tried to identify the variables in this process that, in some way, made the implementation of the system easier or caused some type of difficulty. Based on the mixed method approach (Creswell, 2003), the study was performed by means of a content analysis of technical and scientific publications on this theme and by means of field research for data collection from primary sources. The content analysis was based on the per mile procedure by Bardin (1977), making it possible to identify critical factors that may be found in the implementation of ERP system projects. Primary data were collected from structured interviews with the managers in charge of the implementation of the system in each of 12 companies in different sectors of the economy and based in Brazil. Based on this information, it was possible to test the factors extracted from the content analysis and then develop a list of factors that may effectively influence the implementation process of the system. In order to recognize possible relations between the selected factors, the Spearman (rsp) correlation coefficient was applied and multiple regression analysis was performed by means of the stepwise procedure. The purpose of the regression analysis was to determine the relation of the "Assessment of the Implementation" dependent variable with the other variables in the selected categories. The results of these analyses showed that the support of top management, the communication process providing clear evidence of this support, the technical support of the ERP program provider together with the project team expertise, and the training and qualification processes of the team in the system operation are significantly correlated and relevant factors for a

  15. From archive to process in past fluvial systems

    Science.gov (United States)

    Dikau, R.

    2009-04-01

    The reconstruction of sediment fluxes through palaeo-ecological systems is based on effect (sediment record) - cause (soil erosion, fluvial transport, sediment deposition) relationships, using abduction as the central methodology. In the philosophy of science, abduction means that the effect of a palaeo process is known, e.g. a recent sediment body including specific properties of this archive. There are, however, potentially a range of laws that could be applied to explain the cause, e.g. a human or a climatic impact or internal system behaviour. From a methodological point of view this means that the coupling of cause and effect has to consider several potential starting points of the sediment flux system and a range of laws or explanations, which increases the degree of uncertainty significantly. Particularly in modelling palaeo sediment flux systems, no reliable transfer functions exist which translate sediment archive properties into flux processes. This general methodological challenge in reconstructing palaeo systems is a particular problem in fluvial systems. Fluvial systems act as a filter whose properties for past time scales are widely unknown. This represents a decoupled cause-effect relationship. The filter function of these system types means that the external signal that drives the sediment flux record cannot be read directly from that record and that e.g. climatic hypotheses eventually are not testable. The methodology to link archive and process therefore requires spatially-structured storage and release models including abductive interpretation laws for internal feedbacks, thresholds and complex non-linear dynamics. Based on these arguments, the aim of this presentation is a discussion of a methodological framework for understanding past fluvial systems.

  16. A data processing firmware for an upgrade of the Outer Tracker detector at the LHCb experiment

    CERN Document Server

    Swientek, Stefan

    This thesis describes the data processing software for an Outer Tracker upgrade at the LHCb experiment. The upgrade of the LHCb detector intended for 2018/19 will introduce new readout electronics on the front end of the detector as well as on the back end. The readout electronics on the back end of the detector will use a common board equipped with a Field Programmable Gate Array (FPGA). To ensure correct data processing for each subdetector, the firmware used on the FPGA will contain subdetector-specific data processing parts. The data processing part for the Outer Tracker includes receiving the data, sorting incoming data streams, merging hit patterns and drift times of three consecutive bunch crossings, clustering, and formatting the data for output. In addition, a mezzanine card, called the SantaLuz board, is presented which can be used to extend existing FPGA boards with up to eight optical transceivers.
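    The firmware itself is written for an FPGA and is not reproduced here; the fragment below is only a small Python software model of the merging step described above, combining the hit patterns of three consecutive bunch crossings and keeping, per straw, the earliest valid drift time (the data layout and the invalid-time encoding are assumptions).

```python
from typing import Dict, List

INVALID = 0xFF  # assumed encoding for "no drift time measured"

def merge_bunch_crossings(bxs: List[Dict[int, int]]) -> Dict[int, int]:
    """Merge hits from consecutive bunch crossings.

    Each element of `bxs` maps straw number -> drift time for one bunch
    crossing; the merged result keeps every hit straw with its earliest
    recorded drift time (a software stand-in for the firmware behaviour).
    """
    merged: Dict[int, int] = {}
    for bx in bxs:
        for straw, drift in bx.items():
            if drift == INVALID:
                continue
            merged[straw] = min(drift, merged.get(straw, INVALID))
    return merged

if __name__ == "__main__":
    three_bxs = [{12: 30, 13: INVALID}, {12: 18, 40: 5}, {41: 22}]
    print(merge_bunch_crossings(three_bxs))   # {12: 18, 40: 5, 41: 22}
```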

  17. A New Optimal Control System Design for Chemical Processes

    Institute of Scientific and Technical Information of China (English)

    丛二丁; 胡明慧; 涂善东; 邵惠鹤

    2013-01-01

    Based on frequency response and convex optimization, a novel optimal control system was developed for chemical processes. The feedforward control is designed to improve the tracking performance of closed loop chemical systems. The parametric model is not required because the system directly utilizes the frequency response of the loop transfer function, which can be measured accurately. In particular, the extremal values of magnitude and phase can be solved according to constrained quadratic programming optimizer and convex optimization. Simulation examples show the effectiveness of the method. The design method is simple and easily adopted in chemical industry.

  18. Design and image processing for tactile endoscope system

    Science.gov (United States)

    Yamada, Kenji; Susuki, Yuto; Nagakura, Toshiaki; Ishihara, Ken; Ohno, Yuko

    2010-08-01

    We have developed a new type of tactile endoscope with a silicone rubber membrane. The system consists of a silicone rubber membrane, an image sensor and an illumination system. The surface of the silicone rubber membrane carries patterns made by nanotechnology. The pattern is deformed when the membrane is pressed against tissue such as cancerous tissue or the colon. The deformed pattern is captured by the image sensor and analyzed by image processing. In this paper, the proposed architecture is presented. With several test targets, the characteristics of the prototype system are evaluated in computational simulation.

  19. GPU real-time processing in NA62 trigger system

    Science.gov (United States)

    Ammendola, R.; Biagioni, A.; Chiozzi, S.; Cretaro, P.; Di Lorenzo, S.; Fantechi, R.; Fiorini, M.; Frezza, O.; Lamanna, G.; Lo Cicero, F.; Lonardo, A.; Martinelli, M.; Neri, I.; Paolucci, P. S.; Pastorelli, E.; Piandani, R.; Piccini, M.; Pontisso, L.; Rossetti, D.; Simula, F.; Sozzi, M.; Vicini, P.

    2017-01-01

    A commercial Graphics Processing Unit (GPU) is used to build a fast Level 0 (L0) trigger system tested parasitically with the TDAQ (Trigger and Data Acquisition systems) of the NA62 experiment at CERN. In particular, the parallel computing power of the GPU is exploited to perform real-time fitting in the Ring Imaging CHerenkov (RICH) detector. Direct GPU communication using a FPGA-based board has been used to reduce the data transmission latency. The performance of the system for multi-ring reconstruction obtained during the NA62 physics run will be presented.
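    The GPU kernels are not part of this record; as a CPU reference for the per-ring computation that the GPU parallelises, the sketch below performs an algebraic least-squares (Kasa) circle fit to photon hit coordinates, using synthetic hits rather than NA62 data.

```python
import numpy as np

def algebraic_circle_fit(x, y):
    """Least-squares circle fit (Kasa method): returns centre (a, b) and radius r."""
    # Linearised circle equation: 2*a*x + 2*b*y + (r^2 - a^2 - b^2) = x^2 + y^2.
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    rhs = x ** 2 + y ** 2
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    r = np.sqrt(c + a ** 2 + b ** 2)
    return a, b, r

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    phi = rng.uniform(0, 2 * np.pi, 20)
    # Synthetic ring of radius 11 centred at (3, -2) with measurement smearing.
    x = 3.0 + 11.0 * np.cos(phi) + 0.1 * rng.standard_normal(20)
    y = -2.0 + 11.0 * np.sin(phi) + 0.1 * rng.standard_normal(20)
    print(algebraic_circle_fit(x, y))
```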

  20. Guidance on Design of Internet-based Process Control Systems

    Institute of Scientific and Technical Information of China (English)

    S.H.YANG; L.YANG

    2005-01-01

    Internet-based process control is becoming a new generation of control systems, in which the Internet is used as a platform for global remote monitoring and control. The obvious benefit is to enable global collaboration between operators at geographically dispersed locations, data sharing and data provision for remote monitoring and control. However, connection to an open network and the use of universal technology present new problems that did not exist with the conventional design and construction of control systems, such as time delay and data loss in Internet transmission, and security. This paper reviews the latest research results and presents design guidance for Internet-based monitoring and control systems.

  1. Global Precipitation Measurement (GPM) Mission: NASA Precipitation Processing System (PPS)

    Science.gov (United States)

    Stocker, Erich Franz

    2008-01-01

    NASA is contributing the precipitation measurement data system PPS to support the GPM mission. PPS will distribute all GPM data products, including NASA's GMI data products, freely and quickly. PPS is implementing no system mechanisms for restricting access to GPM data. PPS is implementing no system mechanisms for charging for GPM data products. PPS will provide a number of geographical and parameter subsetting features to its users. The first implementation of PPS (called PPS--) will assume processing of TRMM data effective 1 June 2008. TRMM realtime data will be available via PPS-- to all users requesting access.

  2. Intelligent query processing for semantic mediation of information systems

    Directory of Open Access Journals (Sweden)

    Saber Benharzallah

    2011-11-01

    Full Text Available We propose an intelligent and efficient query processing approach for semantic mediation of information systems. We also propose a generic multi-agent architecture that supports our approach. Our approach focuses on the exploitation of intelligent agents for query reformulation and the use of a new technology for semantic representation. The algorithm is self-adapted to changes in the environment, offers wide applicability and solves the various data conflicts in a dynamic way; it also reformulates the query using the schema mediation method for the discovered systems and context mediation for the other systems.

  3. Systems and methods for rapid processing and storage of data

    Energy Technology Data Exchange (ETDEWEB)

    Stalzer, Mark A.

    2017-01-24

    Systems and methods of building massively parallel computing systems using low power computing complexes in accordance with embodiments of the invention are disclosed. A massively parallel computing system in accordance with one embodiment of the invention includes at least one Solid State Blade configured to communicate via a high performance network fabric. In addition, each Solid State Blade includes a processor configured to communicate with a plurality of low power computing complexes interconnected by a router, and each low power computing complex includes at least one general processing core, an accelerator, an I/O interface, and cache memory and is configured to communicate with non-volatile solid state memory.

  4. Rethinking the Systems Engineering Process in Light of Design Thinking

    Science.gov (United States)

    2016-04-30

    incorporated into the systems engineering process. Architecting vs. Engineering The design problem changes in character from an ill-structured...limit, perhaps unnecessarily in some cases, the design space. Prototyping Prototyping during the early architecting phase is as important as during...Prototyping in the design thinking community is much more inclusive. Prototyping during the architecting phase is important for reasons of discovery

  5. Applying Systems Engineering Methodologies to the Creative Process

    Science.gov (United States)

    2014-09-01

    Concepts are further developed using mockups and prototypes (29). Concept development is similar to the idea generation/idea combination/idea...explained that elements of a system can be “…products (hardware, software, firmware), processes, people, information, techniques, facilities, services

  6. Cancer systems biology: signal processing for cancer research

    Institute of Scientific and Technical Information of China (English)

    Olli Yli-Harja; Antti Ylipää; Matti Nykter; Wei Zhang

    2011-01-01

    In this editorial we introduce the research paradigms of signal processing in the era of systems biology. Signal processing is a field of science traditionally focused on modeling electronic and communications systems, but recently it has turned to biological applications with astounding results. The essence of signal processing is to describe the natural world by mathematical models and then, based on these models, develop efficient computational tools for solving engineering problems. Here, we underline, with examples, the endless possibilities which arise when the battle-hardened tools of engineering are applied to solve the problems that have tormented cancer researchers. Based on this approach, a new field has emerged, called cancer systems biology. Despite its short history, cancer systems biology has already produced several success stories tackling previously impracticable problems. Perhaps most importantly, it has been accepted as an integral part of the major endeavors of cancer research, such as analyzing the genomic and epigenomic data produced by The Cancer Genome Atlas (TCGA) project. Finally, we show that signal processing and cancer research, two fields that are seemingly distant from each other, have merged into a field that is indeed more than the sum of its parts.

  7. Post-processing procedure for industrial quantum key distribution systems

    Science.gov (United States)

    Kiktenko, Evgeny; Trushechkin, Anton; Kurochkin, Yury; Fedorov, Aleksey

    2016-08-01

    We present algorithmic solutions aimed at the post-processing procedure for industrial quantum key distribution systems with hardware sifting. The main steps of the procedure are error correction, parameter estimation, and privacy amplification. Authentication of the classical public communication channel is also considered.
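    The paper's concrete algorithms are not reproduced in the record; the snippet below only illustrates one standard building block of the privacy-amplification step, compressing the reconciled key with a random binary Toeplitz matrix (2-universal hashing); the key length, output length and seeding are arbitrary example values.

```python
import numpy as np

def toeplitz_hash(key_bits: np.ndarray, out_len: int, seed: int = 0) -> np.ndarray:
    """Compress a reconciled key with a random binary Toeplitz matrix (mod-2)."""
    n = len(key_bits)
    rng = np.random.default_rng(seed)
    # A Toeplitz matrix is fully defined by out_len + n - 1 diagonal values.
    diagonals = rng.integers(0, 2, out_len + n - 1)
    # Row i has entry diagonals[i - j + n - 1] at column j, i.e. constant diagonals.
    rows = np.stack([diagonals[i:i + n][::-1] for i in range(out_len)])
    return rows.dot(key_bits) % 2

if __name__ == "__main__":
    raw_key = np.random.default_rng(1).integers(0, 2, 256)   # stand-in corrected key
    final_key = toeplitz_hash(raw_key, out_len=128)
    print(len(final_key), final_key[:16])
```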

  8. System and method for cognitive processing for data fusion

    Science.gov (United States)

    Duong, Tuan A. (Inventor); Duong, Vu A. (Inventor)

    2012-01-01

    A system and method for cognitive processing of sensor data. A processor array receiving analog sensor data and having programmable interconnects, multiplication weights, and filters provides for adaptive learning in real-time. A static random access memory contains the programmable data for the processor array and the stored data is modified to provide for adaptive learning.

  9. Production management information system in wood processing and furniture manufacture

    Directory of Open Access Journals (Sweden)

    Tomislav Grladinović

    2007-11-01

    Full Text Available Introduction of a production management information system is one of the ways that could help the management to increase its efficiency. It should enable the monitoring of the whole business of a firm through co-ordination in the process of collecting and using information.

  10. The New Standards System for Processing Food to Be Proved

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    It is reported that the work of restructuring the framework of the China national standards system for processing food has been finished with the printing and distribution of the 2004-2005 Development Plan of National Standards for Food (hereinafter, the Plan).

  11. Cancer systems biology: signal processing for cancer research.

    Science.gov (United States)

    Yli-Harja, Olli; Ylipää, Antti; Nykter, Matti; Zhang, Wei

    2011-04-01

    In this editorial we introduce the research paradigms of signal processing in the era of systems biology. Signal processing is a field of science traditionally focused on modeling electronic and communications systems, but recently it has turned to biological applications with astounding results. The essence of signal processing is to describe the natural world by mathematical models and then, based on these models, develop efficient computational tools for solving engineering problems. Here, we underline, with examples, the endless possibilities which arise when the battle-hardened tools of engineering are applied to solve the problems that have tormented cancer researchers. Based on this approach, a new field has emerged, called cancer systems biology. Despite its short history, cancer systems biology has already produced several success stories tackling previously impracticable problems. Perhaps most importantly, it has been accepted as an integral part of the major endeavors of cancer research, such as analyzing the genomic and epigenomic data produced by The Cancer Genome Atlas (TCGA) project. Finally, we show that signal processing and cancer research, two fields that are seemingly distant from each other, have merged into a field that is indeed more than the sum of its parts.

  12. Risk Informed Design as Part of the Systems Engineering Process

    Science.gov (United States)

    Deckert, George

    2010-01-01

    This slide presentation reviews the importance of Risk Informed Design (RID) as an important feature of the systems engineering process. RID is based on the principle that risk is a design commodity such as mass, volume, cost or power. It also reviews Probabilistic Risk Assessment (PRA) as it is used in the product life cycle in the development of NASA's Constellation Program.

  13. Restrictions on autoregressive error processes in systems of demand equations

    OpenAIRE

    Brown, Mark G.

    1993-01-01

    Alternative theoretically based restrictions on autoregressive error processes in systems of demand equations are examined. Scaling, translation, and a utility-based approach suggested by Theil are used to generate restrictions. A study of juice demands suggests that the restrictions examined may be useful for empirical analysis.

  14. Declarative Business Process Modelling and the Generation of ERP Systems

    DEFF Research Database (Denmark)

    Schultz-Møller, Nicholas Poul; Hølmer, Christian; Hansen, Michael Reichhardt

    2009-01-01

    We present an approach to the construction of Enterprise Resource Planning (ERP) Systems, which is based on the Resources, Events and Agents (REA) ontology. This framework deals with processes involving exchange and flow of resources in a declarative, graphically-based manner describing what the ...

  15. [Fetal ECG monitoring system based on MCU processing].

    Science.gov (United States)

    Hu, Gang; Chen, Wei; Xie, Xicheng; Zhang, Hao

    2004-12-01

    In order to monitor the fetus in labor, the signal characteristics of the fetal scalp electrode are studied. An adaptive algorithm and a peak-to-peak detection technique are adopted in the signal processing, and an adaptive gain control method is used to eliminate disturbance from baseline shift. A fetal ECG monitoring system is designed on the basis of the C8051F020 MCU.
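    The MCU firmware is not available in this record; as a rough software analogue of the described steps (suppression of baseline shift followed by peak detection with an adaptive threshold), the Python sketch below uses assumed sampling parameters and a synthetic signal.

```python
import numpy as np

def detect_r_peaks(ecg, fs=500.0, refractory_s=0.25):
    """Small R-peak detector: remove baseline drift, then apply an adaptive threshold."""
    # Baseline removal by subtracting a moving average (roughly a high-pass filter).
    win = int(0.6 * fs)
    baseline = np.convolve(ecg, np.ones(win) / win, mode="same")
    x = ecg - baseline

    threshold = 0.5 * np.max(np.abs(x[: int(2 * fs)]))   # initialise from the first 2 s
    peaks, last = [], -np.inf
    for i in range(1, len(x) - 1):
        if x[i] > threshold and x[i] >= x[i - 1] and x[i] > x[i + 1]:
            if i - last > refractory_s * fs:              # ignore peaks too close together
                peaks.append(i)
                last = i
                # Slowly adapt the threshold toward half of recent peak heights.
                threshold = 0.875 * threshold + 0.125 * 0.5 * x[i]
    return np.array(peaks)

if __name__ == "__main__":
    fs = 500.0
    t = np.arange(int(fs * 6)) / fs
    ecg = 0.3 * np.sin(2 * np.pi * 0.3 * t)               # slow baseline wander
    for b in range(8):                                     # eight synthetic R-wave spikes
        ecg += np.exp(-((t - (0.5 + 0.7 * b)) ** 2) / (2 * 0.01 ** 2))
    print(detect_r_peaks(ecg, fs))
```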

  16. The NJOY Nuclear Data Processing System, Version 2016

    Energy Technology Data Exchange (ETDEWEB)

    Macfarlane, Robert [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Muir, Douglas W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Boicourt, R. M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Kahler, III, Albert Comstock [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Conlin, Jeremy Lloyd [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-01-09

    The NJOY Nuclear Data Processing System, version 2016, is a comprehensive computer code package for producing pointwise and multigroup cross sections and related quantities from evaluated nuclear data in the ENDF-4 through ENDF-6 legacy card-image formats. NJOY works with evaluated files for incident neutrons, photons, and charged particles, producing libraries for a wide variety of particle transport and reactor analysis codes.

  17. Universal Reading Processes Are Modulated by Language and Writing System

    Science.gov (United States)

    Perfetti, Charles A.; Harris, Lindsay N.

    2013-01-01

    The connections among language, writing system, and reading are part of what confronts a child in learning to read. We examine these connections in addressing how reading processes adapt to the variety of written language and how writing adapts to language. The first adaptation (reading to writing), as evidenced in behavioral and neuroscience…

  18. DYSIM - A Modular Simulation System for Continuous Dynamic Processes

    DEFF Research Database (Denmark)

    Christensen, P. la Cour; Kofoed, J. E.; Larsen, N.

    1986-01-01

    The report describes a revised version of a simulation system for continuous processes, DYSIM. In relation to the previous version, which was developed in 1981, the main changes are conversion to Fortran 77 and introduction of a modular structure. The latter feature gives the user a possibility...

  19. Decision support for information systems management : applying analytic hierarchy process

    NARCIS (Netherlands)

    Huizingh, Eelko K.R.E.; Vrolijk, Hans C.J.

    1995-01-01

    Decision-making in the field of information systems has become more complex due to a larger number of alternatives, multiple and sometimes conflicting goals, and an increasingly turbulent environment. In this paper we explore the appropriateness of Analytic Hierarchy Process to support I/S decision

  20. Integrating Biological Systems in the Process Dynamics and Control Curriculum

    Science.gov (United States)

    Parker, Robert S.; Doyle, Francis J.; Henson, Michael A.

    2006-01-01

    The evolution of the chemical engineering discipline motivates a re-evaluation of the process dynamics and control curriculum. A key requirement of future courses will be the introduction of theoretical concepts and application examples relevant to emerging areas, notably complex biological systems. We outline the critical concepts required to…

  1. ANALYTICAL SOLUTION OF FILLING AND EXHAUSTING PROCESS IN PNEUMATIC SYSTEM

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    The filling and exhausting processes in a pneumatic system involve many factors, and numerical solutions of many partial differential equations have usually been adopted in the study of those processes, which has proved to be troublesome and less intuitive. Analytical solutions based on a loss-less tube model and an average friction tube model are found by using fluid net theory, and they fit the experimental results well. The research shows that fluid net theory can be used to obtain analytical solutions for the filling and exhausting processes of a pneumatic system; that the result of the loss-less tube model is close to that of the average friction model, so the loss-less tube model is recommended since it is simpler; and that the difference between filling time and exhausting time is determined by the initial and final pressures, the volume of the container and the section area of the tube, and has nothing to do with the length of the tube.

  2. Mesoscopic phenomena in oxide nanoparticles systems: processes of growth

    Energy Technology Data Exchange (ETDEWEB)

    Konstantinova, Tetyana, E-mail: matscidep@aim.com; Danilenko, Igor; Glazunova, Valentina; Volkova, Galina; Gorban, Oksana [Donetsk Institute for Physics and Engineering of the NAS of Ukraine (Ukraine)

    2011-09-15

    The process of nanoparticle growth has been investigated and discussed in terms of a mesoscopic approach using the example of the ZrO2-3 mol% Y2O3 system. The growth of nanoparticles synthesized by co-precipitation has three stages: cooperative-oriented crystallization of ordered areas in the xerogel polymer matrix and disintegration of the crystallized areas (350-400 °C); oriented attachment of particles into single crystals caused by electrostatic interaction (400-600 °C); and attachment of particles to single crystals and polycrystals by oxygen diffusion through vacancies in the surface layers of the joining crystals (600-1,000 °C). The proposed conception of the mesoscopic processes of nanoparticle formation makes the understanding and theoretical description of a significant amount of experimental data possible and opens the way for purposeful control of the oxide powder system at the stages of obtaining, compaction, and sintering.

  3. Snore related signals processing in a private cloud computing system.

    Science.gov (United States)

    Qian, Kun; Guo, Jian; Xu, Huijie; Zhu, Zhaomeng; Zhang, Gongxuan

    2014-09-01

    Snore related signals (SRS) have been demonstrated in recent years to carry important information about the obstruction site and degree in the upper airway of Obstructive Sleep Apnea-Hypopnea Syndrome (OSAHS) patients. To make this acoustic signal analysis method more accurate and robust, processing of large volumes of SRS data is inevitable. As an emerging concept and technology, cloud computing has motivated numerous researchers and engineers to exploit applications both in academia and industry, and it could enable a huge blueprint in biomedical engineering. Considering the security and transfer requirements of biomedical data, we designed a system based on private cloud computing to process SRS. We then set up comparative experiments, processing a 5-hour audio recording of an OSAHS patient on a personal computer, a server and a private cloud computing system, to demonstrate the efficiency of the proposed infrastructure.

  4. FEATURES, EVENTS, AND PROCESSES: SYSTEM-LEVEL AND CRITICALITY

    Energy Technology Data Exchange (ETDEWEB)

    D.L. McGregor

    2000-12-20

    The primary purpose of this Analysis/Model Report (AMR) is to identify and document the screening analyses for the features, events, and processes (FEPs) that do not easily fit into the existing Process Model Report (PMR) structure. These FEPs include the 31 FEPs designated as System-Level Primary FEPs and the 22 FEPs designated as Criticality Primary FEPs. A list of these FEPs is provided in Section 1.1. This AMR (AN-WIS-MD-000019) documents the Screening Decision and Regulatory Basis, Screening Argument, and Total System Performance Assessment (TSPA) Disposition for each of the subject Primary FEPs. This AMR provides screening information and decisions for the TSPA-SR report and provides the same information for incorporation into a project-specific FEPs database. This AMR may also assist reviewers during the licensing-review process.

  5. The Horse Raced Past: Gardenpath Processing in Dynamical Systems

    CERN Document Server

    Graben, Peter beim

    2012-01-01

    I pinpoint an interesting similarity between a recent account of rational parsing and the treatment of sequential decision problems in a dynamical systems approach. I argue that expectation-driven search heuristics aiming at fast computation resemble a high-risk decision strategy in favor of large transition velocities. Hale's rational parser, combining generalized left-corner parsing with informed $\mathrm{A}^*$ search to resolve processing conflicts, explains gardenpath effects in natural sentence processing by misleading estimates of the future processing costs that are to be minimized. On the other hand, minimizing the duration of cognitive computations in time-continuous dynamical systems can be described by combining vector space representations of cognitive states, by means of filler/role decompositions and subsequent tensor product representations, with the paradigm of stable heteroclinic sequences. Maximizing transition velocities according to a high-risk decision strategy could account for a fast race e...

  6. Electromagnetic mixed-waste processing system for asbestos decontamination

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-04-01

    The first phase of a program to develop and demonstrate a cost-effective, integrated process for remediation of asbestos-containing material that is contaminated with organics, heavy metals, and radioactive compounds was successfully completed. Laboratory scale tests were performed to demonstrate initial process viability for asbestos conversion, organics removal, and radionuclide and heavy metal removal. All success criteria for the laboratory tests were met. (1) Ohio DSI demonstrated greater than 99% asbestos conversion to amorphous solids using their commercial process. (2) KAI demonstrated 90% removal of organics from the asbestos suspension. (3) Westinghouse STC achieved the required metals removal criteria on a laboratory scale (e.g., 92% removal of uranium from solution, resin loadings of 0.6 equivalents per liter, and greater than 50% regeneration of resin in a batch test.) Using the information gained in the laboratory tests, the process was reconfigured to provide the basis for the mixed waste remediation system. An integrated process is conceptually developed, and a Phase 2 program plan is proposed to provide the bench-scale development needed in order to refine the design basis for a pilot processing system.

  7. Laser heated pedestal growth system commissioning and fiber processing

    Science.gov (United States)

    Buric, Michael; Yip, M. J.; Chorpening, Ben; Ohodnicki, Paul

    2016-05-01

    A new Laser Heated Pedestal Growth system was designed and fabricated, drawing on effective legacy designs, for the growth of single-crystal, high-temperature-compatible optical fibers. The system is heated by a 100-watt, DC-driven CO2 laser with PID power control. Fiber diameter measurements are performed with a telecentric video system that identifies the molten zone and uses edge-detection algorithms to report the fiber diameter. Beam-shaping components include a beam telescope along with gold-coated reflaxicon, turning, and parabolic focusing mirrors, consistent with similar previous systems. The optical system permits melting of sapphire feedstock up to 1.5 mm in diameter for growth. Operational characteristics are reviewed and the properties of single-crystal sapphire fibers produced by the system are evaluated. Aspects of the control algorithm's efficacy are discussed, along with relevant alternatives. Finally, some new techniques for in-situ processing that make use of the laser-heating system are discussed, and ex-situ fiber modification and processing are also examined for improvements in fiber properties.

  8. High Performance Image Processing And Laser Beam Recording System

    Science.gov (United States)

    Fanelli, Anthony R.

    1980-09-01

    The article is meant to provide the digital image recording community with an overview of digital image processing and recording. The Digital Interactive Image Processing System (DIIPS) was assembled by ESL for Air Force Systems Command under Rome Air Development Center's guidance. The system provides the capability of mensuration and exploitation of digital imagery, with both mono and stereo digital images as inputs. The development covered system design, basic hardware, software, and operational procedures to enable Air Force Systems Command photo analysts to perform digital mensuration and exploitation of stereo digital images. The engineering model was based on state-of-the-art technology and, to the extent possible, off-the-shelf hardware and software. A laser recorder, known as the Ultra High Resolution Image Recorder (UHRIR), was also developed for the DIIPS system. The UHRIR is a prototype model that will enable Air Force Systems Command to record computer-enhanced digital image data on photographic film at high resolution with geometric and radiometric distortion minimized.

  9. An Accident Precursor Analysis Process Tailored for NASA Space Systems

    Science.gov (United States)

    Groen, Frank; Stamatelatos, Michael; Dezfuli, Homayoon; Maggio, Gaspare

    2010-01-01

    Accident Precursor Analysis (APA) serves as the bridge between existing risk modeling activities, which are often based on historical or generic failure statistics, and system anomalies, which provide crucial information about the failure mechanisms that are actually operative in the system and which may differ in frequency or type from those in the various models. These discrepancies between the models (perceived risk) and the system (actual risk) provide the leading indication of an underappreciated risk. This paper presents an APA process developed specifically for NASA Earth-to-Orbit space systems. The purpose of the process is to identify and characterize potential sources of system risk as evidenced by anomalous events which, although not necessarily presenting an immediate safety impact, may indicate that an unknown or insufficiently understood risk-significant condition exists in the system. Such anomalous events are considered accident precursors because they signal the potential for severe consequences that may occur in the future, due to causes that are discernible from their occurrence today. Their early identification allows them to be integrated into the overall system risk model used to inform decisions relating to safety.

  10. Dual Systems Competence [Image Omitted] Procedural Processing: A Relational Developmental Systems Approach to Reasoning

    Science.gov (United States)

    Ricco, Robert B.; Overton, Willis F.

    2011-01-01

    Many current psychological models of reasoning minimize the role of deductive processes in human thought. In the present paper, we argue that deduction is an important part of ordinary cognition and we propose that a dual systems Competence [image omitted] Procedural processing model conceptualized within relational developmental systems theory…

  11. Computer-aided process planning: Development of an expert process planning system. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Brooks, S.L.; Hummel, K.E.; Wolf, M.L.

    1991-12-01

    The project utilizes artificial intelligence (AI) technology to develop an expert system that will be used to prepare production plans, to automatically make cutting tool selections, and to automatically define machinability parameters and NC tape requirements. The expert system, XCUT, can plan features on moderately complex prismatic machined parts and reduce 2 to 4 hours of manual process planning effort to 15 or 30 minutes. Areas of future work have been identified that will enable the XCUT system to be used for production. These areas where further development is required are geometric reasoning, representation of manufacturing features, dimensioning and tolerancing, feature extraction, distributed computing architecture, knowledge gathering, and user interfaces.

  12. Computer-aided process planning: Development of an expert process planning system

    Energy Technology Data Exchange (ETDEWEB)

    Brooks, S.L.; Hummel, K.E.; Wolf, M.L.

    1991-12-01

    The project utilizes artificial intelligence (AI) technology to develop an expert system that will be used to prepare production plans, to automatically make cutting tool selections, and to automatically define machinability parameters and NC tape requirements. The expert system, XCUT, can plan features on moderately complex prismatic machined parts and reduce 2 to 4 hours of manual process planning effort to 15 or 30 minutes. Areas of future work have been identified that will enable the XCUT system to be used for production. These areas where further development is required are geometric reasoning, representation of manufacturing features, dimensioning and tolerancing, feature extraction, distributed computing architecture, knowledge gathering, and user interfaces.

  13. Visual processing in rapid-chase systems: Image processing, attention, and awareness

    Directory of Open Access Journals (Sweden)

    Thomas eSchmidt

    2011-07-01

    Visual stimuli can be classified so rapidly that their analysis may be based on a single sweep of feedforward processing through the visuomotor system. Behavioral criteria for feedforward processing can be evaluated in response priming tasks where speeded pointing or keypress responses are performed towards target stimuli which are preceded by prime stimuli. We apply this method to several classes of complex stimuli. (1) When participants classify natural images into animals or non-animals, the time course of their pointing responses indicates that prime and target signals remain strictly sequential throughout all processing stages, meeting stringent behavioral criteria for feedforward processing (rapid-chase criteria). (2) Such priming effects are boosted by selective visual attention for positions, shapes, and colors, in a way consistent with bottom-up enhancement of visuomotor processing, even when primes cannot be consciously identified. (3) Speeded processing of phobic images is observed in participants specifically fearful of spiders or snakes, suggesting enhancement of feedforward processing by long-term perceptual learning. (4) When the perceived brightness of primes in complex displays is altered by means of illumination or transparency illusions, priming effects in speeded keypress responses can systematically contradict subjective brightness judgments, such that one prime appears brighter than the other but activates motor responses as if it were darker. We propose that response priming captures the output of the first feedforward pass of visual signals through the visuomotor system, and that this output lacks some characteristic features of more elaborate, recurrent processing. This way, visuomotor measures may become dissociated from several aspects of conscious vision. We argue that "fast" visuomotor measures predominantly driven by feedforward processing should supplement "slow" psychophysical measures predominantly based on visual

  14. Reliability Analysis of Repairable Systems Using Stochastic Point Processes

    Institute of Scientific and Technical Information of China (English)

    TAN Fu-rong; JIANG Zhi-bin; BAI Tong-shuo

    2008-01-01

    In order to analyze the failure data from repairable systems, the homogeneous Poisson process (HPP) is usually used. In general, HPP cannot be applied to analyze the entire life cycle of a complex, repairable system because the rate of occurrence of failures (ROCOF) of the system changes over time rather than remains stable. However, from a practical point of view, it is always preferred to apply the simplest method to address problems and to obtain useful practical results. Therefore, we attempted to use the HPP model to analyze the failure data from real repairable systems. A graphic method and the Laplace test were also used in the analysis. Results of numerical applications show that the HPP model may be a useful tool for the entire life cycle of repairable systems.
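
    The abstract cites the Laplace test as the trend check on the HPP assumption. A minimal sketch of the time-truncated form of that test is shown below; the failure times and observation window in the example are illustrative, not data from the paper.

```python
import math

def laplace_trend_test(failure_times, observation_end):
    """Laplace trend test for failure times from one repairable system observed
    over (0, T] (time-truncated data). Under the HPP null hypothesis the statistic
    is approximately standard normal: large positive values suggest an increasing
    ROCOF (deterioration), large negative values an improving system."""
    n = len(failure_times)
    mean_time = sum(failure_times) / n
    return (mean_time - observation_end / 2.0) / (
        observation_end * math.sqrt(1.0 / (12.0 * n)))

# Illustrative failure times (hours) over a 1000-hour observation window.
print(laplace_trend_test([120, 305, 470, 610, 745, 930], 1000.0))
```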

  15. Energy saving in data processing and communication systems.

    Science.gov (United States)

    Iazeolla, Giuseppe; Pieroni, Alessandra

    2014-01-01

    The power management of ICT systems, that is, data processing (Dp) and telecommunication (Tlc) systems, is becoming a relevant problem in economic terms. Dp systems comprise millions of servers and associated subsystems (processors, monitors, storage devices, etc.) all over the world that need to be electrically powered. Dp systems are also used to govern Tlc systems, which, besides requiring Dp electrical power, also require Tlc-specific power, both for mobile networks (with their cell-phone towers and associated subsystems: base stations, subscriber stations, switching nodes, etc.) and for wired networks (with their routers, gateways, switches, etc.). ICT research is thus expected to investigate methods to reduce Dp- and Tlc-specific power consumption. However, saving power may come at the expense of performance, in other words, of ICT quality of service (QoS). This paper investigates Dp and Tlc power management policies that seek compromises between power saving and QoS.

  16. Development of technical information processing system(VI)

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jee Hoh; Kim, Tae Hwan; Choi, Kwang; Chung, Hyun Sook; Keum, Jong Yong [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1994-12-01

    This project aims to establish a high-quality information circulation system by developing a serials-control system that improves serials management from ordering to distribution and availability for R and D, and to raise the quality of the information services needed in R and D through fast retrieval and delivery of research information with CD-Net. The results of the project are as follows. 1. The serials management process, from ordering to distribution, is made more efficient through the development of a subscription information system. 2. Systematic control of each serial issue is achieved through the development of a serials checking system. 3. Volume and issue information for currently received issues can be provided to researchers promptly thanks to the improved serials holding information system. 4. Research information contained in various CD-ROM databases can be retrieved across KAERI-NET based on the investigated CD-Net construction methods. 2 figs, 25 refs. (Author).

  17. Detecting Anomalous Process Behaviour using Second Generation Artificial Immune Systems

    CERN Document Server

    Twycross, Jamie; Whitbrook, Amanda

    2010-01-01

    Artificial Immune Systems have been successfully applied to a number of problem domains including fault tolerance and data mining, but have been shown to scale poorly when applied to computer intrusion detection despite the fact that the biological immune system is a very effective anomaly detector. This may be because AIS algorithms have previously been based on the adaptive immune system and biologically-naive models. This paper focuses on describing and testing a more complex and biologically-authentic AIS model, inspired by the interactions between the innate and adaptive immune systems. Its performance on a realistic process anomaly detection problem is shown to be better than standard AIS methods (negative-selection), policy-based anomaly detection methods (systrace), and an alternative innate AIS approach (the DCA). In addition, it is shown that runtime information can be used in combination with system call information to enhance detection capability.
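
    For orientation, the sketch below implements the negative-selection baseline mentioned in the abstract, not the authors' two-signal innate/adaptive model, which the abstract does not detail; the real-valued detectors, Euclidean matching rule, and toy data are assumptions made purely for illustration.

```python
import random

def euclid(a, b):
    """Euclidean distance between two real-valued feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def train_detectors(self_set, n_detectors, threshold, dim=2, seed=0):
    """Negative selection: keep only random detectors that do NOT match
    any 'self' (normal) sample within the matching threshold."""
    rng = random.Random(seed)
    detectors = []
    while len(detectors) < n_detectors:
        candidate = [rng.random() for _ in range(dim)]
        if all(euclid(candidate, s) > threshold for s in self_set):
            detectors.append(candidate)
    return detectors

def is_anomalous(sample, detectors, threshold):
    """A sample is flagged as anomalous when any detector matches it."""
    return any(euclid(sample, d) <= threshold for d in detectors)

# Toy 'self' data clustered near the origin of the unit square.
normal = [[0.10, 0.10], [0.15, 0.20], [0.20, 0.10]]
detectors = train_detectors(normal, n_detectors=50, threshold=0.2)
print(is_anomalous([0.15, 0.20], detectors, 0.2))  # a self sample: False
print(is_anomalous([0.85, 0.90], detectors, 0.2))  # far from self: normally True
```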

  18. The Dark Energy Survey Data Processing and Calibration System

    CERN Document Server

    Mohr, Joseph J; Bertin, Emmanuel; Daues, Gregory E; Desai, Shantanu; Gower, Michelle; Gruendl, Robert; Hanlon, William; Kuropatkin, Nikolay; Lin, Huan; Marriner, John; Petravick, Don; Sevilla, Ignacio; Swanson, Molly; Tomashek, Todd; Tucker, Douglas; Yanny, Brian

    2012-01-01

    The Dark Energy Survey (DES) is a 5000 deg² grizY survey reaching characteristic photometric depths of 24th magnitude (10 sigma) and enabling accurate photometry and morphology of objects ten times fainter than in SDSS. Preparations for DES have included building a dedicated 3 deg² CCD camera (DECam), upgrading the existing CTIO Blanco 4m telescope and developing a new high performance computing (HPC) enabled data management system (DESDM). The DESDM system will be used for processing, calibrating and serving the DES data. The total data volumes are high (~2 PB), and so considerable effort has gone into designing an automated processing and quality control system. Special purpose image detrending and photometric calibration codes have been developed to meet the data quality requirements, while survey astrometric calibration, coaddition and cataloging rely on new extensions of the AstrOmatic codes which now include tools for PSF modeling, PSF homogenization, PSF corrected model fitting cataloging and joint mode...

  19. INTEC CPP-603 Basin Water Treatment System Closure: Process Design

    Energy Technology Data Exchange (ETDEWEB)

    Kimmitt, Raymond Rodney; Faultersack, Wendell Gale; Foster, Jonathan Kay; Berry, Stephen Michael

    2002-09-01

    This document describes the engineering activities that have been completed in support of the closure plan for the Idaho Nuclear Technology and Engineering Center (INTEC) CPP-603 Basin Water Treatment System. This effort includes detailed assessments of methods and equipment for performing work in four areas: 1. A cold (nonradioactive) mockup system for testing equipment and procedures for vessel cleanout and vessel demolition. 2. Cleanout of process vessels to meet standards identified in the closure plan. 3. Dismantlement and removal of vessels, should it not be possible to clean them to required standards in the closure plan. 4. Cleanout or removal of pipelines and pumps associated with the CPP-603 basin water treatment system. Cleanout standards for the pipes will be the same as those used for the process vessels.

  20. Incomplete fuzzy data processing systems using artificial neural network

    Science.gov (United States)

    Patyra, Marek J.

    1992-01-01

    In this paper, the implementation of a fuzzy data processing system using an artificial neural network (ANN) is discussed. A binary representation of fuzzy data is assumed, where the universe of discourse is discretized into n equal intervals and the value of the membership function is represented by a binary number. It is proposed that incomplete fuzzy data processing be performed in two stages: the first stage performs the 'retrieval' of incomplete fuzzy data, and the second stage performs the desired operation on the retrieved data. The proposed retrieval method is based on linear approximation of the missing values of the membership function. The ANN implementation of the proposed system is presented. The system was computationally verified and showed a relatively small total error.
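
    The 'retrieval' stage described above rests on linear approximation of missing membership values; a small illustrative sketch of that idea, independent of the paper's binary encoding and ANN realization, might look like this:

```python
def fill_missing_membership(mu):
    """Linearly interpolate missing membership values (None) over a universe of
    discourse discretized into equal intervals; endpoints with no known neighbour
    on one side are padded with the nearest known value."""
    known = [i for i, v in enumerate(mu) if v is not None]
    filled = list(mu)
    for i, v in enumerate(mu):
        if v is not None:
            continue
        left = max((k for k in known if k < i), default=None)
        right = min((k for k in known if k > i), default=None)
        if left is None:
            filled[i] = mu[right]
        elif right is None:
            filled[i] = mu[left]
        else:
            w = (i - left) / (right - left)
            filled[i] = mu[left] + w * (mu[right] - mu[left])
    return filled

# Membership vector over 8 intervals with two missing values.
print(fill_missing_membership([0.0, 0.2, None, 0.8, 1.0, None, 0.4, 0.1]))
# -> [0.0, 0.2, 0.5, 0.8, 1.0, 0.7, 0.4, 0.1]
```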

  1. Integration of CAD/CAE System for Casting Process Design

    Institute of Scientific and Technical Information of China (English)

    周舰; 荆涛

    2003-01-01

    Concurrent engineering is needed to modernize the foundry industry and to reduce the scrap from castings and thus increase the economic profit. This paper presents an integrated 3-D CAD/CAE system for a foundry using concurrent engineering which considers casting structure, casting type, and manufacturing properties in the CAD module to design the pouring system, the riser, the chill core and so on. A visualized solid model is developed for the casting component with the model design enhanced by CAE analysis. Heat transfer and fluid flow simulation are used to analyze the initial design. The whole product development process is analyzed using concurrent engineering methods. The application shows that the integrated system can improve the efficiency of the design and manufacturing process of die casting.

  2. Characteristic Time Scales of Characteristic Magmatic Processes and Systems

    Science.gov (United States)

    Marsh, B. D.

    2004-05-01

    Every specific magmatic process, regardless of spatial scale, has an associated characteristic time scale. Time scales associated with crystals alone are rates of growth, dissolution, settling, aggregation, annealing, and nucleation, among others. At the other extreme are the time scales associated with the dynamics of the entire magmatic system. These can be separated into two groups: those associated with system genetics (e.g., the production and transport of magma, establishment of the magmatic system) and those due to physical characteristics of the established system (e.g., wall rock failure, solidification front propagation and instability, porous flow). The detailed geometry of a specific magmatic system is particularly important to appreciate; although generic systems are useful, care must be taken to make model systems as absolutely realistic as possible. Fuzzy models produce fuzzy science. Knowledge of specific time scales is not necessarily useful or meaningful unless the hierarchical context of the time scales for a realistic magmatic system is appreciated. The age of a specific phenocryst or ensemble of phenocrysts, as determined from isotopic or CSD studies, is not meaningful unless something can be ascertained of the provenance of the crystals. For example, crystal size multiplied by growth rate gives a meaningful crystal age only if it is from a part of the system that has experienced semi-monotonic cooling prior to chilling; crystals entrained from a long-standing cumulate bed that were mechanically sorted in ascending magma may not reveal this history. Ragged old crystals rolling about in the system for untold numbers of flushing times record specious process times, telling more about the noise in the system than the life of typical, first generation crystallization processes. The most helpful process-related time scales are those that are known well and that bound or define the temporal style of the system. Perhaps the most valuable of these

  3. Systems, Devices, and Materials for Digital Optical Processing.

    Science.gov (United States)

    Title, Mark Alan

    The massive parallelism and flexibility of three-dimensional optical communication may allow the development of new parallel computers free from the constraints of planar electronic technology. To bring the optical computer from possibility to reality, however, requires technological and scientific development in new optical systems, devices, and materials. We present here research results in each of these areas. First described is a prototype optical information processing system using CdS/liquid crystal spatial light modulators for optical logic and memory. This system has been developed as the first step in the implementation of a fine-grained, globally-interconnected optical processing element array. Notable system features include the implementation of programmable electronic control and the analysis of the optical power distribution within the processor, both directly applicable to the design of new and more advanced optical information processing systems. Next presented is the design and initial performance data for a new spatial light modulator combining an array of silicon phototransistors with the electro-optic material (Pb,La)(Zr,Ti)O_3, opening new possibilities for "intelligent" optical logic, memory, and switching devices. Important to the optimal performance of this Si/PLZT device is the fabrication of embedded electrodes in the electro-optic material, reducing the device operating voltage and switching energy while improving the uniformity of the optical modulation. An extensive computer model of embedded electrode performance and details of the electrode fabrication by reactive ion beam etching and electroless Ni deposition are presented. Finally, in the area of optical materials development we present initial results in the RF magnetron deposition of electro-optic PLZT on r-plane sapphire. This work is important to the fabrication of a monolithic, Si/PLZT-on-sapphire spatial light modulator, promising superior performance to devices using

  4. Programmable fast data acquisition system

    Science.gov (United States)

    Montebugnoli, S.; Bianchi, G.; Zoni, L.

    The goal of this work is to investigate the possibility to install on modern radiotelescopes a fast and programmable backend instead of several backends, each dedicated to a single acquisition task. Such an approach could lower the costs and make available a more compact and flexible back end. Exploiting the state-of-the-art of the FPGAs (Field Programmable Gate Arrays) and innovative architectures, a programmable system might be conceived.

  5. Quantum Processes and Dynamic Networks in Physical and Biological Systems.

    Science.gov (United States)

    Dudziak, Martin Joseph

    Quantum theory since its earliest formulations in the Copenhagen Interpretation has been difficult to integrate with general relativity and with classical Newtonian physics. There has been traditionally a regard for quantum phenomena as being a limiting case for a natural order that is fundamentally classical except for microscopic extrema where quantum mechanics must be applied, more as a mathematical reconciliation rather than as a description and explanation. Macroscopic sciences including the study of biological neural networks, cellular energy transports and the broad field of non-linear and chaotic systems point to a quantum dimension extending across all scales of measurement and encompassing all of Nature as a fundamentally quantum universe. Theory and observation lead to a number of hypotheses all of which point to dynamic, evolving networks of fundamental or elementary processes as the underlying logico-physical structure (manifestation) in Nature and a strongly quantized dimension to macroscalar processes such as are found in biological, ecological and social systems. The fundamental thesis advanced and presented herein is that quantum phenomena may be the direct consequence of a universe built not from objects and substance but from interacting, interdependent processes collectively operating as sets and networks, giving rise to systems that on microcosmic or macroscopic scales function wholistically and organically, exhibiting non-locality and other non-classical phenomena. The argument is made that such effects as non-locality are not aberrations or departures from the norm but ordinary consequences of the process-network dynamics of Nature. Quantum processes are taken to be the fundamental action-events within Nature; rather than being the exception quantum theory is the rule. The argument is also presented that the study of quantum physics could benefit from the study of selective higher-scale complex systems, such as neural processes in the brain

  6. Automation of the CFD Process on Distributed Computing Systems

    Science.gov (United States)

    Tejnil, Ed; Gee, Ken; Rizk, Yehia M.

    2000-01-01

    A script system was developed to automate and streamline portions of the CFD process. The system was designed to facilitate the use of CFD flow solvers on supercomputer and workstation platforms within a parametric design event. Integrating solver pre- and postprocessing phases, the fully automated ADTT script system marshalled the required input data, submitted the jobs to available computational resources, and processed the resulting output data. A number of codes were incorporated into the script system, which itself was part of a larger integrated design environment software package. The IDE and scripts were used in a design event involving a wind tunnel test. This experience highlighted the need for efficient data and resource management in all parts of the CFD process. To facilitate the use of CFD methods to perform parametric design studies, the script system was developed using UNIX shell and Perl languages. The goal of the work was to minimize the user interaction required to generate the data necessary to fill a parametric design space. The scripts wrote out the required input files for the user-specified flow solver, transferred all necessary input files to the computational resource, submitted and tracked the jobs using the resource queuing structure, and retrieved and post-processed the resulting dataset. For computational resources that did not run queueing software, the script system established its own simple first-in-first-out queueing structure to manage the workload. A variety of flow solvers were incorporated in the script system, including INS2D, PMARC, TIGER and GASP. Adapting the script system to a new flow solver was made easier through the use of object-oriented programming methods. The script system was incorporated into an ADTT integrated design environment and evaluated as part of a wind tunnel experiment. The system successfully generated the data required to fill the desired parametric design space. This stressed the computational
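
    The original scripts were written in UNIX shell and Perl; purely as an illustration of the "simple first-in-first-out queueing structure" described above, a Python sketch of that pattern, with placeholder echo commands standing in for real solver invocations, could look as follows.

```python
import subprocess
from collections import deque

class SimpleFifoQueue:
    """Minimal first-in-first-out job manager for hosts without queueing
    software: jobs are run one at a time and their exit status is recorded."""

    def __init__(self):
        self.pending = deque()

    def submit(self, case_name, command):
        self.pending.append((case_name, command))

    def run_all(self):
        results = {}
        while self.pending:
            case, cmd = self.pending.popleft()
            # A real solver invocation would go here; its output files would
            # then be retrieved and post-processed.
            proc = subprocess.run(cmd, shell=True, capture_output=True, text=True)
            results[case] = proc.returncode
        return results

# Hypothetical two-point sweep of an angle-of-attack parameter.
queue = SimpleFifoQueue()
queue.submit("alpha_0", "echo run solver for alpha=0")
queue.submit("alpha_5", "echo run solver for alpha=5")
print(queue.run_all())
```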

  7. New real-time image processing system for IRFPA

    Institute of Scientific and Technical Information of China (English)

    WANG Bing-jian; LIU Shang-qian; CHENG Yu-bao

    2006-01-01

    Because of differences in detector material, manufacturing technology, and other factors, the detectors in an infrared focal plane array (IRFPA) output different voltages even when their input radiation flux is the same; this is called the non-uniformity of the IRFPA. At the same time, high background temperature, a low temperature difference between targets and background, and the low responsivity of the IRFPA result in low-contrast infrared images. Non-uniformity correction and image enhancement are therefore important techniques for IRFPA imaging systems. This paper proposes a new real-time infrared image processing system based on a Field Programmable Gate Array (FPGA). The system implements non-uniformity correction, image enhancement, video synthesis, and related functions. By using a parallel architecture and pipelining, the system reaches a processing speed of 50 M x 12 bits per second, which suits large, high-frame-rate IRFPA imaging systems, and the whole system is miniaturized into a single FPGA.
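
    The abstract does not specify which correction algorithm the FPGA implements; the sketch below shows a generic two-point non-uniformity correction, a common baseline for IRFPA systems, with made-up calibration values.

```python
import numpy as np

def two_point_nuc(raw, low_ref, high_ref, t_low, t_high):
    """Generic two-point non-uniformity correction: per-pixel gain and offset are
    derived from two uniform (blackbody) reference frames so that every detector
    maps onto the same linear response."""
    gain = (t_high - t_low) / (high_ref - low_ref)   # per-pixel gain
    offset = t_low - gain * low_ref                   # per-pixel offset
    return gain * raw + offset

# Toy 2x2 array with mismatched pixel responses (made-up calibration values).
low = np.array([[100.0, 110.0], [95.0, 105.0]])
high = np.array([[300.0, 330.0], [290.0, 310.0]])
raw = np.array([[200.0, 220.0], [192.5, 207.5]])
print(two_point_nuc(raw, low, high, t_low=20.0, t_high=40.0))  # all pixels -> 30.0
```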

  8. A new Wellsite Information System to aid the drilling process

    Energy Technology Data Exchange (ETDEWEB)

    Grenadier, J.A.; McCann, D.; Koch, S.; Schlumberger, A.

    1994-12-31

    The IDEAL Wellsite Information System acquires data to monitor the drilling process. It interprets the realtime data flow from both surface and downhole and displays useful information on high resolution color screens to the key decision makers on and off the wellsite. The IDEAL Wellsite Information System can support four classes of users simultaneously: the driller, the directional driller on the rig floor, logging specialists in the unit and the company representative in the customer's office. Color displays have been customized to the specialized needs of each class of user. In particular, the IDEAL Driller's Display is a pressurized unit located on the rig floor. The driller can select from a number of screens with a minimum number of keystrokes. This information network improves drilling efficiency, geological evaluation and subsequent production through enhanced geological steering. Data is continually stored in both the time and depth domains. These databases can be exported into a variety of formats. Data can also be transmitted in realtime to the customer's office offsite. Backup system components allow for redundancy so that system downtime is virtually eliminated. By having system developers concentrate on making the workstation easy to operate, the users can focus on the drilling process and not on the computer system. Custom graphic displays were designed by drillers for drillers. "Smart Alarms" have been designed to alert the user of potential problems such as kicks, sticking pipe and drillpipe washout.

  9. The Earth System Documentation (ES-DOC) Software Process

    Science.gov (United States)

    Greenslade, M. A.; Murphy, S.; Treshansky, A.; DeLuca, C.; Guilyardi, E.; Denvil, S.

    2013-12-01

    Earth System Documentation (ES-DOC) is an international project supplying high-quality tools & services in support of earth system documentation creation, analysis and dissemination. It is nurturing a sustainable standards based documentation eco-system that aims to become an integral part of the next generation of exa-scale dataset archives. ES-DOC leverages open source software, and applies a software development methodology that places end-user narratives at the heart of all it does. ES-DOC has initially focused upon nurturing the Earth System Model (ESM) documentation eco-system and currently supporting the following projects: * Coupled Model Inter-comparison Project Phase 5 (CMIP5); * Dynamical Core Model Inter-comparison Project (DCMIP); * National Climate Predictions and Projections Platforms Quantitative Evaluation of Downscaling Workshop. This talk will demonstrate that ES-DOC implements a relatively mature software development process. Taking a pragmatic Agile process as inspiration, ES-DOC: * Iteratively develops and releases working software; * Captures user requirements via a narrative based approach; * Uses online collaboration tools (e.g. Earth System CoG) to manage progress; * Prototypes applications to validate their feasibility; * Leverages meta-programming techniques where appropriate; * Automates testing whenever sensibly feasible; * Streamlines complex deployments to a single command; * Extensively leverages GitHub and Pivotal Tracker; * Enforces strict separation of the UI from underlying API's; * Conducts code reviews.

  10. Control and Systems Concepts in the Innovation Process

    Institute of Scientific and Technical Information of China (English)

    2004-01-01

    The idea for this article and the associated feature articles in this special section come from two sources. The first is the opinion, shared by many control experts, that, while our subject has achieved much in its short history, it is now time to seek new directions and a new identity. The second source is my experience over the past 30 years with multidisciplinary R&D; I have seen that control systems methods can contribute to the development of new products and processes in a broader sense than the traditional design of feedback control loops.In some specific incidents, key innovative steps were made by taking a generalized control systems view. Interestingly, in most cases the control analyst, rather than making the key inventive steps, acted as a scientific facilitator by linking the skills of team members via the common languages of control systems, including modeling, simulation, and dynamic analysis. These experiences led me to reflect on searches for new directions in control and to question whether control systems analysts and educators should reposition control as a general systems science that can assist innovation in our industries, much in the same way that Wiener placed control at the heart of his cybernetic vision. This idea led to a further questioning of the sense in which innovation itself is a systematic process and therefore susceptible to control systems analysis.

  11. Autostereoscopic 3D visualization and image processing system for neurosurgery.

    Science.gov (United States)

    Meyer, Tobias; Kuß, Julia; Uhlemann, Falk; Wagner, Stefan; Kirsch, Matthias; Sobottka, Stephan B; Steinmeier, Ralf; Schackert, Gabriele; Morgenstern, Ute

    2013-06-01

    A demonstrator system for planning neurosurgical procedures was developed based on commercial hardware and software. The system combines an easy-to-use environment for surgical planning with high-end visualization and the opportunity to analyze data sets for research purposes. The demonstrator system is based on the software AMIRA. Specific algorithms for segmentation, elastic registration, and visualization have been implemented and adapted to the clinical workflow. Modules from AMIRA and the image processing library Insight Segmentation and Registration Toolkit (ITK) can be combined to solve various image processing tasks. Customized modules tailored to specific clinical problems can easily be implemented using the AMIRA application programming interface and a self-developed framework for ITK filters. Visualization is done via autostereoscopic displays, which provide a 3D impression without viewing aids. A Spaceball device allows a comfortable, intuitive way of navigation in the data sets. Via an interface to a neurosurgical navigation system, the demonstrator system can be used intraoperatively. The precision, applicability, and benefit of the demonstrator system for planning of neurosurgical interventions and for neurosurgical research were successfully evaluated by neurosurgeons using phantom and patient data sets.

  12. Using the Unified Modelling Language (UML) to guide the systemic description of biological processes and systems.

    Science.gov (United States)

    Roux-Rouquié, Magali; Caritey, Nicolas; Gaubert, Laurent; Rosenthal-Sabroux, Camille

    2004-07-01

    One of the main issues in Systems Biology is to deal with semantic data integration. Previously, we examined the requirements for a reference conceptual model to guide semantic integration based on the systemic principles. In the present paper, we examine the usefulness of the Unified Modelling Language (UML) to describe and specify biological systems and processes. This makes unambiguous representations of biological systems, which would be suitable for translation into mathematical and computational formalisms, enabling analysis, simulation and prediction of these systems behaviours.

  13. Rheological behavior of a bismaleimide resin system for RTM process

    Institute of Scientific and Technical Information of China (English)

    DUAN Yuexin; SHI Feng; LIANG Zhiyong; ZHANG Zuoguang

    2007-01-01

    The curing properties and rheological behavior of a bismaleimide resin system were studied with differential scanning calorimetry (DSC) analysis and viscometer measurements, respectively. A dual-Arrhenius viscosity model and an engineering viscosity model were established to predict the rheological behavior of this resin system, and the two viscosity models were compared. The results show that both models are suitable for predicting the viscosity in the mold-filling stage of resin transfer molding (RTM); however, the engineering model provides a more accurate prediction of the viscosity near the gel point. The effectiveness of the engineering viscosity model is verified under both isothermal and nonisothermal conditions. The limitation of the engineering model is that it cannot be used to predict the viscosity after cross-linking of the curing system. The engineering viscosity model can be used to predict the processing windows for different processing parameters of the RTM process, which is critical for the simulation and optimization of composite manufacturing processes.
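
    The abstract names a dual-Arrhenius viscosity model without giving its parameters; a commonly cited (Roller-type) form of such a model, written here only for orientation and not necessarily identical to the authors' fitted expression, is:

```latex
% A commonly cited dual-Arrhenius (Roller-type) chemorheological form, shown only
% for orientation; the paper's fitted parameters are not given in the abstract.
\[
  \ln \eta(t,T) \;=\; \ln \eta_{\infty} \;+\; \frac{\Delta E_{\eta}}{R\,T}
  \;+\; \int_{0}^{t} k_{\infty}
        \exp\!\left(\frac{\Delta E_{k}}{R\,T(\tau)}\right)\,\mathrm{d}\tau ,
\]
% where $\eta_{\infty}$ and $k_{\infty}$ are pre-exponential constants,
% $\Delta E_{\eta}$ and $\Delta E_{k}$ the activation energies for viscous flow and
% cure, $R$ the gas constant, and $T(\tau)$ the (possibly non-isothermal)
% temperature history; for isothermal cure the integral reduces to
% $t\,k_{\infty}\exp(\Delta E_{k}/RT)$.
```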

  14. A Process for Comparing Dynamics of Distributed Space Systems Simulations

    Science.gov (United States)

    Cures, Edwin Z.; Jackson, Albert A.; Morris, Jeffery C.

    2009-01-01

    The paper describes a process that was developed for comparing the primary orbital dynamics behavior between space systems distributed simulations. This process is used to characterize and understand the fundamental fidelities and compatibilities of the modeling of orbital dynamics between spacecraft simulations. This is required for high-latency distributed simulations such as NASA's Integrated Mission Simulation and must be understood when reporting results from simulation executions. This paper presents 10 principal comparison tests along with their rationale and examples of the results. The Integrated Mission Simulation (IMSim) (formerly known as the Distributed Space Exploration Simulation (DSES)) is a NASA research and development project focusing on the technologies and processes that are related to the collaborative simulation of complex space systems involved in the exploration of our solar system. Currently, the NASA centers that are actively participating in the IMSim project are the Ames Research Center, the Jet Propulsion Laboratory (JPL), the Johnson Space Center (JSC), the Kennedy Space Center, the Langley Research Center and the Marshall Space Flight Center. In concept, each center participating in IMSim has its own set of simulation models and environment(s). These simulation tools are used to build the various simulation products that are used for scientific investigation, engineering analysis, system design, training, planning, operations and more. Working individually, these production simulations provide important data to various NASA projects.

  15. Water recycle treatment system for use in metal processing

    Energy Technology Data Exchange (ETDEWEB)

    Hewitt, D.E.; Dando, T.J.

    1976-08-10

    A water recycle treatment system is described comprising two main treatment sub-systems for treatment of contaminated water from a plurality of concentrated solutions and rinse baths to separate out the impurities therein. A first sub-system treats less concentrated solutions used for the rinse baths by channeling the flow therefrom to a first neutralizing tank which provides for pH control to produce a mixed output solution having a substantially constant pH factor, which is filtered to remove gross particles, the filtered solution being cooled in a holding tank and passed through a reverse osmosis process and carbon bed to produce clean water. The second sub-system treats highly concentrated solutions obtained from a plurality of chemical processes, mixes them in a second neutralizing tank which is utilized to produce a substantially constant pH output, which is fed to an evaporator to precipitate the metals and salts in sludge and also forms a water vapor output. The reverse osmosis waste is fed back into the second neutralizing tank and processed as noted above.

  16. Micro-Task Processing in Heterogeneous Reconfigurable Systems

    Institute of Scientific and Technical Information of China (English)

    Sebastian Wallner

    2005-01-01

    New reconfigurable computing architectures are introduced to overcome some of the limitations of conventional microprocessors and fine-grained reconfigurable devices (e.g., FPGAs). One of the promising new architectures is the Configurable System-on-Chip (CSoC), designed to offer high computational performance for real-time signal processing and for a wide range of applications exhibiting high degrees of parallelism. Programming such systems is an inherently challenging problem due to the lack of a programming model. This paper describes a novel heterogeneous system architecture for signal processing and data streaming applications. It offers high computational performance and a high degree of flexibility and adaptability by employing a micro Task Controller (mTC) unit in conjunction with programmable and configurable hardware. The hierarchically organized architecture provides a programming model, allows an efficient mapping of applications, and is shown to be easily scalable to future VLSI technologies. Several mappings of commonly used digital signal processing algorithms for future telecommunication and multimedia systems are given, together with implementation results for a standard-cell ASIC design realization in 0.18 micron 6-layer UMC CMOS technology.

  17. Management by process based systems and safety focus; Verksamhetsstyrning med process-baserade ledningssystem och saekerhetsfokus

    Energy Technology Data Exchange (ETDEWEB)

    Rydnert, Bo; Groenlund, Bjoern [SIS Forum AB, Stockholm (Sweden)

    2005-12-15

    An initiative from the Swedish Nuclear Power Inspectorate led to this study, carried out in the late autumn of 2005. The objective was to understand in more detail how the increasing use of process management affects organisations, regarding risks and security on the one hand and management by objectives and other managerial and operative effects on the other. The main method was interviewing representatives of companies and independent experts; more than 20 interviews were carried out, and a literature study was also made. All participating companies use management systems based on processes. However, the methods chosen, and the results achieved, vary extensively. Thus, there are surprisingly few examples of complete and effective management by processes. Yet there is no doubt that management by processes is effective and efficient: overall goals are reached, business results are achieved in more reliable ways, and customers are more satisfied. The weaknesses found can be translated into a few comprehensive recommendations. A clear, structured and acknowledged model should be used, and the processes should be described unambiguously. The changed management roles should be clearly described and respected. New types of process objectives need to be formulated. In addition, one fact needs to be observed and effectively fended off: changes are often met by mental opposition at management level as well as among co-workers, and this needs attention and leadership. Safety development is closely related to the design and operation of a business management system and its continual improvement. A deep understanding of what constitutes an efficient and effective management system affects the understanding of safety, safety culture and the ability to achieve safety goals. Concerning risk, the opinions were unambiguous: management by processes as such does not create any additional risks. On the contrary, processes give a clear view of production and

  18. System for processing an encrypted instruction stream in hardware

    Science.gov (United States)

    Griswold, Richard L.; Nickless, William K.; Conrad, Ryan C.

    2016-04-12

    A system and method of processing an encrypted instruction stream in hardware is disclosed. Main memory stores the encrypted instruction stream and unencrypted data. A central processing unit (CPU) is operatively coupled to the main memory. A decryptor is operatively coupled to the main memory and located within the CPU. The decryptor decrypts the encrypted instruction stream upon receipt of an instruction fetch signal from a CPU core. Unencrypted data is passed through to the CPU core without decryption upon receipt of a data fetch signal.
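
    To make the fetch-path behaviour concrete, here is a toy model of the dispatch described in the abstract: instruction fetches are routed through the in-CPU decryptor while data fetches bypass it. The XOR "cipher", key, addresses, and values are placeholders, not the patented design.

```python
class FetchPath:
    """Toy model of the fetch path in the abstract: instruction fetches go through
    the in-CPU decryptor, data fetches bypass it. The XOR 'cipher', key, addresses
    and values are placeholders, not the patented design."""

    def __init__(self, memory, key):
        self.memory = memory   # address -> word (encrypted code, plain data)
        self.key = key

    def fetch(self, address, is_instruction_fetch):
        word = self.memory[address]
        if is_instruction_fetch:
            return word ^ self.key   # decrypt only the instruction stream
        return word                  # unencrypted data passes straight through

memory = {0x00: 0x9A ^ 0x5C,   # encrypted opcode stored at address 0x00
          0x10: 0x2A}          # plain data value stored at address 0x10
path = FetchPath(memory, key=0x5C)
print(hex(path.fetch(0x00, is_instruction_fetch=True)))    # 0x9a (decrypted opcode)
print(hex(path.fetch(0x10, is_instruction_fetch=False)))   # 0x2a (data untouched)
```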

  19. Layers And Processes In The Model Of Technological Postal System

    Directory of Open Access Journals (Sweden)

    Madleňáková Lucia

    2015-12-01

    The paper covers important aspects of a layer model of the postal technological system, which makes it possible to define rules for regulation, technical and technological requirements, and interfaces for communicating with other postal systems. The current postal reform is mainly driven by the opening of network access and the need to ensure full interoperability between technological systems, not only to foster and protect competition but also to preserve the requirements for providing the universal service, which is performed in the public interest. There is therefore room to examine the postal system not only from a procedural point of view but also as an open communication system. Commonalities can be found with other branches of the communication sector, and the technological postal system can be handled in several layers, similarly to electronic communication systems. A layer model of the postal system, based not only on processes but on the functionality of layers, will make it possible to identify the communication protocols and interfaces that determine interoperability. It also opens the question of an appropriate regulation model.

  20. System design package for the solar heating and cooling central data processing system

    Energy Technology Data Exchange (ETDEWEB)

    1978-03-01

    This system design package for the Central Data Processing System consists of the Software Performance Specification, Hardware Performance Specification, Software Verification Plan, CDPS Development Program, Qualification and Acceptance Test Procedures, Qualification Test and Analysis Report, and Qualification and Acceptance Test Review. The Central Data Processing System, located at IBM's Federal System Division facility in Huntsville, Alabama, provides the resources required to assess the performance of solar heating and cooling systems installed at remote sites. These sites consist of residential, commercial, government, and educational types of buildings, and the solar heating and cooling systems can be hot-water, space heating, cooling, and combinations of these. The instrumentation data associated with these systems will vary according to the application and must be collected, processed, and presented in a form which supports continuity of performance evaluation across all applications.

  1. Rapid prototyping in the development of image processing systems

    Science.gov (United States)

    von der Fecht, Arno; Kelm, Claus Thomas

    2004-08-01

    This contribution presents a rapid prototyping approach for the real-time demonstration of image processing algorithms. As an example, EADS/LFK has developed a basic IR target tracking system implementing this approach. Traditionally, in research and industry, image processing algorithms are simulated time-independently on a host computer. This method is good for demonstrating the algorithms' capabilities, but a time-dependent simulation, or even a real-time demonstration on a target platform to prove the real-time capabilities, is rarely done. In 1D signal processing applications, time-dependent simulation and real-time demonstration have already been used for quite a while. For time-dependent simulation, Simulink from The MathWorks has become established as an industry standard. Combined with The MathWorks' Real-Time Workshop, the simulation model can be transferred to a real-time target processor; the executable is generated automatically by the Real-Time Workshop directly out of the simulation model. In 2D signal processing applications like image processing, The MathWorks' Matlab is commonly used for time-independent simulation. To achieve time-dependent simulation and real-time demonstration capabilities, the algorithms can be transferred to Simulink, which in fact runs on top of Matlab. Additionally, to increase performance, Simulink models or parts of them can be transferred to Xilinx FPGAs using Xilinx' System Generator. With a single model and the automatic workflow, both time-dependent simulation and real-time demonstration are covered, leading to an easy and flexible rapid prototyping approach. EADS/LFK is going to use this approach for a wider spectrum of IR image processing applications like automatic target recognition, image-based navigation, or imaging laser radar target recognition.

  2. Early Stage Disease Diagnosis System Using Human Nail Image Processing

    Directory of Open Access Journals (Sweden)

    Trupti S. Indi

    2016-07-01

    Analysis of the human fingernail can help identify many diseases at an early stage of diagnosis; in particular, the color of a person's nails aids the identification of specific diseases in the healthcare domain. The proposed system supports decision making in such disease diagnosis scenarios. The input to the system is an image of a person's nail. The system processes the nail image and extracts features that are used for diagnosis. Of the various features a nail offers, the proposed system uses changes in nail color. First, a training data set is prepared with the Weka tool from nail images of patients with specific diseases. A feature extracted from the input nail image is then compared with the training data set to obtain the result. In the experiments, using the color feature of the nail image, on average 65% of results were correctly matched with the training set data over three tests.
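
    As a schematic illustration of the colour-feature matching described above (the actual system builds its training set with the Weka tool), a nearest-mean comparison over average nail colour might look like the following; the class labels and RGB values are invented for the example.

```python
import numpy as np

def mean_rgb(nail_pixels):
    """Average RGB colour of a segmented nail region (N x 3 array of pixels)."""
    return np.asarray(nail_pixels, dtype=float).mean(axis=0)

def nearest_label(feature, training_means):
    """Return the label whose stored mean colour is closest in Euclidean distance."""
    return min(training_means,
               key=lambda label: np.linalg.norm(feature - training_means[label]))

# Invented training means (RGB), standing in for the Weka-built training set.
training_means = {"healthy":     np.array([205.0, 160.0, 150.0]),
                  "condition_A": np.array([230.0, 220.0, 160.0]),
                  "condition_B": np.array([150.0, 130.0, 170.0])}
sample_pixels = [[228, 219, 158], [232, 222, 163], [229, 218, 161]]
print(nearest_label(mean_rgb(sample_pixels), training_means))  # -> condition_A
```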

  3. A Miniaturized System for Neural Signal Acquiring and Processing

    Institute of Scientific and Technical Information of China (English)

    WANG Min; GAO Guang-hong; XIANG Dong-sheng; CAO Mao-yong; JIA Ai-bin; DING Lei; KONG Hui-min

    2008-01-01

    To collect neural activity data from awake, freely behaving animals, we developed a miniaturized implantable recording system based on a modern Programmable System on Chip (PSoC) and chronic electrodes in the cortex. Using the PSoC family member CY8C29466, the system integrates operational and instrumentation amplifiers, filters, timers, AD converters, serial communication, and related functions, with the signal processing handled by virtual instrument technology. All of these factors can significantly affect the price and development cycle of the project. The results showed that the system was able to record and analyze extracellular discharges generated by neurons continuously for a week or more, which is very useful for interdisciplinary research between neuroscience and information engineering. The circuits and architecture of the devices can be adapted for neurobiology and research with other small animals.

  4. Increased noise signal processing in incoherent radar systems

    Directory of Open Access Journals (Sweden)

    I. I. Chesanovskyi

    2013-09-01

    Introduction. The work is devoted to a method for increasing the coherence and noise immunity of pulse radar systems with incoherent probing-signal sources. Problem. The trade-off between the resolution and the range of pulsed radar systems cannot be resolved within the classical approaches to building incoherent radar systems, which calls for new approaches to their construction. Main part. The paper presents a method for two-stage processing of incoherent pulsed radar signals that makes it possible to compensate for, and exploit the information contained in, the spurious amplitude and angular modulation. Conclusions. Simulation results and analysis of the corresponding ambiguity functions indicate that using the transmitter instability as an additional modulation significantly improves the resolution and noise immunity of the radar system.

  5. The dynamic power management for embedded system with Poisson process

    Institute of Scientific and Technical Information of China (English)

    CHEN Tian-zhou; HUANG Jiang-wei; DAI Hong-jun

    2005-01-01

    Most embedded systems are powered by secondary batteries rather than a wired power supply, so saving energy is one of the main design goals for embedded systems. In this paper we present a new technique for modelling and solving the dynamic power management (DPM) problem for embedded systems with complex behavioural characteristics. We first model a power-managed embedded computing system as a controllable flow chart, then use a Poisson process for optimisation and derive the power management algorithm with the help of Dynamic Voltage Scaling (DVS) technology. Finally, we built an experimental model using the PXA 255 processor. The experimental results show that the proposed technique can achieve more than 12% power saving compared with existing DPM techniques.
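
    The paper's algorithm is not reproduced in the abstract; the sketch below only illustrates how a Poisson arrival model feeds a classic break-even sleep decision in DPM, with invented power and energy numbers.

```python
import math

def break_even_time(transition_energy_j, p_idle_w, p_sleep_w):
    """Shortest idle period for which entering the sleep state saves energy."""
    return transition_energy_j / (p_idle_w - p_sleep_w)

def sleep_decision(arrival_rate_hz, transition_energy_j, p_idle_w, p_sleep_w):
    """With Poisson request arrivals, idle gaps are exponential with mean 1/lambda.
    Sleep when the expected gap exceeds the break-even time; also report the
    probability that a gap is long enough to pay back the transition energy."""
    t_be = break_even_time(transition_energy_j, p_idle_w, p_sleep_w)
    expected_idle = 1.0 / arrival_rate_hz
    p_gap_long_enough = math.exp(-arrival_rate_hz * t_be)
    return expected_idle > t_be, t_be, p_gap_long_enough

# Invented numbers: 0.5 requests/s, 30 mJ per sleep/wake cycle, 120 mW idle, 5 mW sleep.
print(sleep_decision(0.5, 0.030, 0.120, 0.005))
```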

  6. Collaborative process control: Observation of tracks generated by PLM system

    CERN Document Server

    Elkadiri, Soumaya; Delattre, Miguel; Bouras, Abdelaziz

    2008-01-01

    This paper aims at analyzing the problems related to collaborative work using a PLM system. The research is mainly focused on the organisational aspects of SMEs involved in networks composed of large companies, subcontractors and other industrial partners. From this analysis, we propose the deployment of an approach based on observing the tracks generated by the PLM system. The specific contributions are twofold. The first is to identify the break points of collaborative work. The second, through the exploitation of the generated tracks, is to reduce risks by reacting in real time to the incidents or dysfunctions that may occur. The overall system architecture, based on services technology and supporting the proposed approach, is described, as well as the associated prototype developed using an industrial PLM system.

  7. 40 CFR 63.480 - Applicability and designation of affected sources.

    Science.gov (United States)

    2010-07-01

    ... segregated sewers; (3) Water from fire-fighting and deluge systems in segregated sewers; (4) Spills; (5... emission points, back-end process operations subject to §§ 63.493 and 63.500, and heat exchange systems and... 2 emission points, back-end process operations subject to §§ 63.493 through 63.500, and...

  8. Assessing and Optimizing Microarchitectural Performance of Event Processing Systems

    Science.gov (United States)

    Mendes, Marcelo R. N.; Bizarro, Pedro; Marques, Paulo

    Event Processing (EP) systems are being progressively used in business critical applications in domains such as algorithmic trading, supply chain management, production monitoring, or fraud detection. To deal with high throughput and low response time requirements, these EP systems mainly use the CPU-RAM sub-system for data processing. However, as we show here, collected statistics on CPU usage or on CPU-RAM communication reveal that available systems are poorly optimized and grossly waste resources. In this paper we quantify some of these inefficiencies and propose cache-aware algorithms and changes on internal data structures to overcome them. We test the before and after system both at the microarchitecture and application level and show that: i) the changes improve microarchitecture metrics such as clocks-per-instruction, cache misses or TLB misses; ii) and that some of these improvements result in very high application level improvements such as a 44% improvement on stream-to-table joins with 6-fold reduction on memory consumption, and order-of-magnitude increase on throughput for moving aggregation operations.
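
    To illustrate the kind of cache-conscious data-structure change the abstract alludes to, the sketch below keeps a moving aggregation's state in a preallocated, contiguous ring buffer of primitive values rather than in per-event heap objects. This is a generic illustration, not the authors' engine or algorithms.

      # Cache-friendly moving aggregation (sliding-window sum) over an event stream.
      from array import array

      class MovingSum:
          def __init__(self, window_size):
              self.buf = array("d", [0.0] * window_size)  # contiguous primitive storage
              self.size = window_size
              self.idx = 0
              self.count = 0
              self.total = 0.0

          def push(self, value):
              # Add the new event value; evict the oldest one once the window is full.
              if self.count == self.size:
                  self.total -= self.buf[self.idx]
              else:
                  self.count += 1
              self.buf[self.idx] = value
              self.total += value
              self.idx = (self.idx + 1) % self.size
              return self.total

      stream = [3.0, 1.0, 4.0, 1.0, 5.0, 9.0]
      agg = MovingSum(window_size=3)
      print([agg.push(x) for x in stream])  # [3.0, 4.0, 8.0, 6.0, 10.0, 15.0]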

  9. Tank waste remediation system process engineering instruction manual

    Energy Technology Data Exchange (ETDEWEB)

    ADAMS, M.R.

    1998-11-04

    The purpose of the Tank Waste Remediation System (TWRS) Process Engineering Instruction Manual is to provide guidance and direction to TWRS Process Engineering staff regarding the conduct of business. The objective is to establish a disciplined and consistent approach to business such that the work processes within TWRS Process Engineering are safe, high quality, disciplined, efficient, and consistent with Lockheed Martin Hanford Corporation Policies and Procedures. The sections within this manual are of two types: for compliance and for guidance. For-compliance sections are intended to be followed to the letter until such time as they are formally changed per Section 2.0 of this manual. For-guidance sections are intended to be used by the staff for guidance in the conduct of work where technical judgment and discernment are required; the guidance sections shall also be changed per Section 2.0 of this manual. The required header for each manual section is illustrated in Section 2.0, Manual Change Control procedure. It is intended that this manual be used as a training and indoctrination resource for employees of the TWRS Process Engineering organization. The manual shall be required reading for all TWRS Process Engineering staff, matrixed, and subcontracted employees.

  10. Photonics for microwave systems and ultra-wideband signal processing

    Science.gov (United States)

    Ng, W.

    2016-08-01

    The advantages of using the broadband and low-loss distribution attributes of photonics to enhance the signal processing and sensing capabilities of microwave systems are well known. In this paper, we review the progress made in the topical areas of true-time-delay beamsteering, photonic-assisted analog-to-digital conversion, RF-photonic filtering, and link performance. We also provide an outlook on the emerging field of integrated microwave photonics (MWP), which promises to reduce the cost of MWP subsystems and components while providing significantly improved form factors for system insertion.

  11. Lifetime-Based Memory Management for Distributed Data Processing Systems

    DEFF Research Database (Denmark)

    Lu, Lu; Shi, Xuanhua; Zhou, Yongluan

    2016-01-01

    In-memory caching of intermediate data and eager combining of data in shuffle buffers have been shown to be very effective in minimizing the re-computation and I/O cost in distributed data processing systems like Spark and Flink. However, it has also been widely reported that these techniques would create a large amount of long-living data objects in the heap, which may quickly saturate the garbage collector, especially when handling a large dataset, and hence would limit the scalability of the system. To eliminate this problem, we propose a lifetime-based memory management framework, which...

  12. Signal Processing System for the CASA Integrated Project I Radars

    Energy Technology Data Exchange (ETDEWEB)

    Bharadwaj, Nitin; Chandrasekar, V.; Junyent, Francesc

    2010-09-01

    This paper describes the waveform design space and signal processing system for dual-polarization Doppler weather radar operating at X band. The performance of the waveforms is presented with ground clutter suppression capability and mitigation of range velocity ambiguity. The operational waveform is designed based on operational requirements and system/hardware requirements. A dual Pulse Repetition Frequency (PRF) waveform was developed and implemented for the first generation X-band radars deployed by the Center for Collaborative Adaptive Sensing of the Atmosphere (CASA). This paper presents an evaluation of the performance of the waveforms based on simulations and data collected by the first-generation CASA radars during operations.
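
    For readers unfamiliar with why a dual-PRF waveform is needed, the sketch below works through the basic range-velocity ambiguity relations for a single PRF. The wavelength and PRF pair are illustrative X-band values, not the CASA IP1 operating parameters.

      # Range-velocity ambiguity trade-off that motivates dual-PRF Doppler waveforms.
      C = 3.0e8            # speed of light, m/s
      WAVELENGTH = 0.032   # X-band wavelength, m (~9.4 GHz)

      def unambiguous_range_m(prf_hz):
          # Maximum unambiguous range: r_a = c / (2 * PRF).
          return C / (2.0 * prf_hz)

      def nyquist_velocity_ms(prf_hz):
          # Maximum unambiguous (Nyquist) velocity: v_a = wavelength * PRF / 4.
          return WAVELENGTH * prf_hz / 4.0

      for prf in (1600.0, 2400.0):  # a 2:3 dual-PRF pair
          print(f"PRF {prf:6.0f} Hz: r_a = {unambiguous_range_m(prf) / 1e3:5.1f} km, "
                f"v_a = {nyquist_velocity_ms(prf):4.1f} m/s")
      # Combining measurements from the two PRFs extends the effective Nyquist
      # velocity well beyond either single-PRF value while keeping range coverage.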

  13. Optimizing FORTRAN Programs for Hierarchical Memory Parallel Processing Systems

    Institute of Scientific and Technical Information of China (English)

    金国华; 陈福接

    1993-01-01

    Parallel loops account for the greatest amount of parallelism in numerical programs. Executing nested loops in parallel with low run-time overhead is thus very important for achieving high performance in parallel processing systems. However, in parallel processing systems with caches or local memories in their memory hierarchies, a "thrashing problem" may arise whenever data move back and forth between the caches or local memories of different processors. Previous techniques can only deal with the rather simple cases with one linear function in a perfectly nested loop. In this paper, we present a parallel program optimizing technique called hybrid loop interchange (HLI) for cases with multiple linear functions and loop-carried data dependences in the nested loop. With HLI we can easily eliminate or reduce the thrashing phenomena without reducing the program parallelism.
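
    The sketch below shows the basic idea behind loop interchange, the transformation underlying HLI: reordering the loop nest so that the innermost loop walks along contiguous memory. It is a generic illustration rather than the HLI algorithm itself, and in interpreted Python the timing difference is much smaller than in compiled FORTRAN or C, where the locality effect dominates.

      # Loop interchange for memory locality on a row-major array (generic illustration).
      import time
      import numpy as np

      N = 1500
      a = np.zeros((N, N))

      def column_order_update(m):
          # Inner loop strides across rows: non-contiguous accesses on a row-major array.
          for j in range(m.shape[1]):
              for i in range(m.shape[0]):
                  m[i, j] += 1.0

      def row_order_update(m):
          # Loops interchanged: the inner loop walks along contiguous memory.
          for i in range(m.shape[0]):
              for j in range(m.shape[1]):
                  m[i, j] += 1.0

      for fn in (column_order_update, row_order_update):
          t0 = time.perf_counter()
          fn(a)
          print(f"{fn.__name__}: {time.perf_counter() - t0:.2f} s")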

  14. A DNA Network as an Information Processing System

    Directory of Open Access Journals (Sweden)

    Andy M. Tyrrell

    2012-04-01

    Full Text Available Biomolecular systems that can process information are sought for computational applications, because of their potential for parallelism and miniaturization and because their biocompatibility also makes them suitable for future biomedical applications. DNA has been used to design machines, motors, finite automata, logic gates, reaction networks and logic programs, amongst many other structures and dynamic behaviours. Here we design and program a synthetic DNA network to implement computational paradigms abstracted from cellular regulatory networks. These show information processing properties that are desirable in artificial, engineered molecular systems, including robustness of the output in relation to different sources of variation. We show the results of numerical simulations of the dynamic behaviour of the network and preliminary experimental analysis of its main components.

  15. Enzyme-Based Logic Systems for Information Processing

    CERN Document Server

    Katz, Evgeny

    2009-01-01

    We review enzymatic systems which involve biocatalytic reactions utilized for information processing (biocomputing). Extensive ongoing research in biocomputing, mimicking Boolean logic gates has been motivated by potential applications in biotechnology and medicine. Furthermore, novel sensor concepts have been contemplated with multiple inputs processed biochemically before the final output is coupled to transducing "smart-material" electrodes and other systems. These applications have warranted recent emphasis on networking of biocomputing gates. First few-gate networks have been experimentally realized, including coupling, for instance, to signal-responsive electrodes for signal readout. In order to achieve scalable, stable network design and functioning, considerations of noise propagation and control have been initiated as a new research direction. Optimization of single enzyme-based gates for avoiding analog noise amplification has been explored, as were certain network-optimization concepts. We review a...

  16. System Design Support by Optimization Method Using Stochastic Process

    Science.gov (United States)

    Yoshida, Hiroaki; Yamaguchi, Katsuhito; Ishikawa, Yoshio

    We propose a new optimization method based on a stochastic process. The characteristic of this method is that it obtains an approximation of the optimum solution as an expected value. In the numerical calculation, a kind of Monte Carlo method is used to obtain the solution, because the method is based on a stochastic process. The method also yields the probability distribution of the design variables, because they are generated with probability proportional to the evaluation function value. This probability distribution shows the influence of the design variables on the evaluation function value, and it is information that is very useful for system design. In this paper, it is shown that the proposed method is useful not only for optimization but also for system design. The flight trajectory optimization problem for a hang-glider is shown as an example of the numerical calculation.
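
    The sketch below illustrates the general idea described above: draw candidate design variables, weight them in proportion to the evaluation function, and read off both the expected value (the approximate optimum) and the spread of the resulting distribution. The objective function and sampling range are illustrative assumptions, not the hang-glider trajectory problem.

      # Monte Carlo sampling with probability proportional to the evaluation function.
      import numpy as np

      rng = np.random.default_rng(0)

      def evaluate(x):
          # Illustrative objective to maximize; its peak is at x = 2.
          return np.exp(-(x - 2.0) ** 2)

      candidates = rng.uniform(-5.0, 5.0, size=20000)          # trial design variables
      weights = evaluate(candidates)
      weights /= weights.sum()
      samples = rng.choice(candidates, size=5000, p=weights)   # proportional resampling

      print("approximate optimum (expected value):", samples.mean())
      print("spread (influence on the objective):", samples.std())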

  17. Variant Computer Aided Process Planning System for Rotational Parts

    Institute of Scientific and Technical Information of China (English)

    AHMED Hassan; YAO Zhen-qiang; CAI Jian-guo

    2005-01-01

    The amount of material that must be removed to produce the final product should be minimized: excess stock increases not only the material cost but also the processing cost, fixture cost and tooling cost, and lengthens machine cycle times. Noting that in recent years the world is running low on mineral resources and that the price of engineering materials will continue to rise, the share of a manufactured part's cost that is due to the cost of materials is also rising. This paper proposes a variant CAPP system for rotational parts based on the concept of group technology. The system accepts a part's feature-characteristics code number as input and provides, as output, the operation details of the manufacturing route with the suitable primary processes required to produce the blank workpiece.

  18. Process Systems Engineering R&D for Advanced Fossil Energy Systems

    Energy Technology Data Exchange (ETDEWEB)

    Zitney, S.E.

    2007-09-11

    This presentation will examine process systems engineering R&D needs for application to advanced fossil energy (FE) systems and highlight ongoing research activities at the National Energy Technology Laboratory (NETL) under the auspices of a recently launched Collaboratory for Process & Dynamic Systems Research. The three current technology focus areas include: 1) High-fidelity systems with NETL's award-winning Advanced Process Engineering Co-Simulator (APECS) technology for integrating process simulation with computational fluid dynamics (CFD) and virtual engineering concepts, 2) Dynamic systems with R&D on plant-wide IGCC dynamic simulation, control, and real-time training applications, and 3) Systems optimization including large-scale process optimization, stochastic simulation for risk/uncertainty analysis, and cost estimation. Continued R&D aimed at these and other key process systems engineering models, methods, and tools will accelerate the development of advanced gasification-based FE systems and produce increasingly valuable outcomes for DOE and the Nation.

  19. Process fluids of aero-hydraulic systems and their properties

    Directory of Open Access Journals (Sweden)

    I. S. Shumilov

    2014-01-01

    Full Text Available The article considers the process fluids which are presently applied in aviation hydraulic systems in domestic and world practice. Aviation practice deals with a rather wide list of fluids; based on the technical specification, the designer chooses a specific fluid for a specific aircraft. Process fluids have to possess the properties presented in the article, namely: lubricating properties; stability of physical and chemical characteristics during operation and storage; low-temperature properties; an acceptable congelation temperature; compatibility with the materials of units and components of hydraulic systems; heat conductivity; high rigidity; a minimally low coefficient of volume expansion; fire and explosion safety; and low density. They should also have good dielectric properties, resist destruction of their molecules, have good anticorrosion and antierosion properties, not create conditions for the electro-kinetic erosion of spool-type and other precision devices, and possess a number of other properties. The article presents materials on oil-based process fluids with a boiling temperature of 200-320 °C, gelled by a polymer of vinyl butyl ether, with an aging inhibitor and dye, for the hydraulic systems of subsonic and transonic aircraft; these fluids are combustible, with a temperature interval of use from -60 °C to +125 °C. It also describes process fluids based on a mix of polydialkylsiloxane oligomers with organic diester, aging inhibitors and a wear-resistant additive, to be applied in the hydraulic systems of supersonic aircraft within the temperature interval from -60 °C to +175 °C for a long duration. Fire- and explosion-safe process fluids, representing a mix of phosphoric esters with additives to improve their viscous, anti-oxidizing, anticorrosive and anti-erosive properties, are considered as well. They are used within the temperature range from -60 °C to +125 °C with overheats up to +150

  20. Bioattractors: dynamical systems theory and the evolution of regulatory processes.

    Science.gov (United States)

    Jaeger, Johannes; Monk, Nick

    2014-06-01

    In this paper, we illustrate how dynamical systems theory can provide a unifying conceptual framework for evolution of biological regulatory systems. Our argument is that the genotype-phenotype map can be characterized by the phase portrait of the underlying regulatory process. The features of this portrait--such as attractors with associated basins and their bifurcations--define the regulatory and evolutionary potential of a system. We show how the geometric analysis of phase space connects Waddington's epigenetic landscape to recent computational approaches for the study of robustness and evolvability in network evolution. We discuss how the geometry of phase space determines the probability of possible phenotypic transitions. Finally, we demonstrate how the active, self-organizing role of the environment in phenotypic evolution can be understood in terms of dynamical systems concepts. This approach yields mechanistic explanations that go beyond insights based on the simulation of evolving regulatory networks alone. Its predictions can now be tested by studying specific, experimentally tractable regulatory systems using the tools of modern systems biology. A systematic exploration of such systems will enable us to understand better the nature and origin of the phenotypic variability, which provides the substrate for evolution by natural selection.

  1. INFORMATION SYSTEMS AND PROCESSES OF MONITORING POWER TRANSFORMERS CONDITION

    Directory of Open Access Journals (Sweden)

    Litvinov V. N.

    2016-02-01

    Full Text Available Prompt, full-scale diagnostics based on modern methods can help solve the problem of the decreasing reliability of power supply systems. Introducing information systems for operational diagnostics provides operating personnel with information that makes it possible to predict disturbances in the operation of power transformers and to prepare an action plan to address them in advance. The paper presents fragments of the developed system for monitoring power transformers using programmable logic controllers. Within the system, the following groups of controlled parameters are distinguished: information about temperature and the operation of the cooling system; the magnitude of the winding voltage per phase; the winding current values per phase; information about the power being transmitted; and information about the insulation state. A functional scheme of the system for monitoring the state of the power transformer is designed, and a general algorithm of system functioning is described. A graphical operator interface is developed that allows monitoring the object state and managing the system state. Using the XML markup language, the format of the data packets was designed. The designed hardware and software package can be used in the educational process, as it improves the quality of students' training and brings them closer to the realities of modern professional activities; in operational activities, as a domestically developed replacement for foreign software that complies with the approved domestic calculation methods; and in science, for solving problems of analysis and optimization of the operating parameters of power transformers

  2. Acceptance test report for 241-AW process air system

    Energy Technology Data Exchange (ETDEWEB)

    Kostelnik, A.J.

    1994-10-06

    The acceptance test procedure (ATP) for the compressed air system at building 241-AW-273 was completed on March 11, 1993. The system was upgraded to provide a reliable source of compressed air to the tank farm. The upgrade included the demolition of the existing air compressor and associated piping, as well as the installation of a new air compressor with a closed loop cooling system. A compressed air cross-tie was added to allow the process air compressor to function as a back-up to the existing instrument air compressor. The purpose of the ATP was to achieve three primary objectives: verify the system upgrade in accordance with the design media; provide a functional test of system components and controls; and prepare the system for the Operational Test. The ATP was successfully completed with thirteen exceptions, which were resolved prior to completing the acceptance test. The repaired exceptions had no impact on safety or the environment and are briefly summarized. Testing ensured that the system was installed per design, that its components function as required and that it is ready for operational testing and subsequent turnover to operations.

  3. Stream Processing in the Robot Operating System framework

    OpenAIRE

    Hongslo, Anders

    2012-01-01

    Streams of information rather than static databases are becoming increasingly important with the rapid changes involved in a number of fields such as finance, social media and robotics. DyKnow is a stream-based knowledge processing middleware which has been used in autonomous Unmanned Aerial Vehicle (UAV) research. ROS (Robot Operating System) is an open-source robotics framework providing hardware abstraction, device drivers, communication infrastructure, tools, libraries as well as other fu...

  4. Artificial intelligence, expert systems, computer vision, and natural language processing

    Science.gov (United States)

    Gevarter, W. B.

    1984-01-01

    An overview of artificial intelligence (AI), its core ingredients, and its applications is presented. The knowledge representation, logic, problem solving approaches, languages, and computers pertaining to AI are examined, and the state of the art in AI is reviewed. The use of AI in expert systems, computer vision, natural language processing, speech recognition and understanding, speech synthesis, problem solving, and planning is examined. Basic AI topics, including automation, search-oriented problem solving, knowledge representation, and computational logic, are discussed.

  5. The NJOY nuclear data processing system Version 91

    Energy Technology Data Exchange (ETDEWEB)

    MacFarlane, R.E.; Muir, D.W.

    1994-10-01

    The NJOY nuclear data processing system is a comprehensive computer code package for producing pointwise and multigroup cross sections and related quantities from evaluated nuclear data in the ENDF format, including the latest US library, ENDF/B-VI. The NJOY code can work with neutrons, photons, and charged particles, and it can produce libraries for a wide variety of particle transport and reactor analysis codes.

  6. Decision support for information systems management: applying analytic hierarchy process

    OpenAIRE

    Huizingh, Eelko K.R.E.; Vrolijk, Hans C.J.

    1995-01-01

    Decision-making in the field of information systems has become more complex due to a larger number of alternatives, multiple and sometimes conflicting goals, and an increasingly turbulent environment. In this paper we explore the appropriateness of the Analytic Hierarchy Process (AHP) to support I/S decision making. AHP can be applied if the decision problem includes multiple objectives, conflicting criteria, incommensurable units, and aims at selecting an alternative from a known set of alternatives. ...
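
    The following sketch shows the core AHP calculation the abstract refers to: derive priority weights from a pairwise comparison matrix via its principal eigenvector and check the consistency ratio. The criteria and comparison values are illustrative assumptions, not taken from the paper.

      # AHP priority weights and consistency ratio from a pairwise comparison matrix.
      import numpy as np

      # Pairwise comparisons of three hypothetical I/S criteria (cost, fit, risk):
      # entry [i, j] states how much more important criterion i is than criterion j.
      A = np.array([
          [1.0,   3.0,   5.0],
          [1/3.0, 1.0,   2.0],
          [1/5.0, 1/2.0, 1.0],
      ])

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)
      weights = np.abs(eigvecs[:, k].real)
      weights /= weights.sum()

      # Consistency index CI = (lambda_max - n) / (n - 1); divide by Saaty's random
      # index (0.58 for n = 3). A consistency ratio below 0.1 is usually acceptable.
      n = A.shape[0]
      cr = ((eigvals.real[k] - n) / (n - 1)) / 0.58

      print("priority weights:", np.round(weights, 3))
      print("consistency ratio:", round(cr, 3))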

  7. Inclusive Education as Complex Process and Challenge for School System

    Directory of Open Access Journals (Sweden)

    Al-Khamisy Danuta

    2015-08-01

    Full Text Available Education may be considered as a number of processes, actions and effects affecting the human being, as the state or level of the results of these processes, or as the modification of functions, institutions and social practice roles, which as a result of inclusion become a new, integrated system. It is thus a very complex process. Nowadays complexity appears to be one of the most significant terms both in science and in philosophy. It appears that, despite the search for simple rules, strategies and solutions, everything remains ever more complex. The environment is complex, so is the organism living in it and exploring it, and the exploration itself is a complex phenomenon, much more so than it might initially seem to be.

  8. Use of microwave in processing of drug delivery systems.

    Science.gov (United States)

    Wong, T W

    2008-04-01

    Microwave has received a widespread application in pharmaceuticals and food processing, microbial sterilization, biomedical therapy, scientific and biomedical analysis, as well as, drug synthesis. This paper reviews the basis of application of microwave to prepare pharmaceutical dosage forms such as agglomerates, gel beads, microspheres, nanomatrix, solid dispersion, tablets and film coat. The microwave could induce drying, polymeric crosslinkages as well as drug-polymer interaction, and modify the structure of drug crystallites via its effects of heating and/or electromagnetic field on the dosage forms. The use of microwave opens a new approach to control the physicochemical properties and drug delivery profiles of pharmaceutical dosage forms without the need for excessive heat, lengthy process or toxic reactants. Alternatively, the microwave can be utilized to process excipients prior to their use in the formulation of drug delivery systems. The intended release characteristics of drugs in dosage forms can be met through modifying the physicochemical properties of excipients using the microwave.

  9. Survey of real-time processing systems for big data

    DEFF Research Database (Denmark)

    Iftikhar, Nadeem

    2014-01-01

    In recent years, real-time processing and analytics systems for big data–in the context of Business Intelligence (BI)–have received a growing attention. The traditional BI platforms that perform regular updates on daily, weekly or monthly basis are no longer adequate to satisfy the fast-changing business environments. However, due to the nature of big data, it has become a challenge to achieve the real-time capability using the traditional technologies. The recent distributed computing technology, MapReduce, provides off-the-shelf high scalability that can significantly shorten the processing time for big data; its open-source implementation such as Hadoop has become the de-facto standard for processing big data, however, Hadoop has the limitation of supporting real-time updates. The improvements in Hadoop for the real-time capability, and the other alternative real-time frameworks have been...
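
    To make the batch-versus-real-time distinction discussed above concrete, the sketch below contrasts a batch-style map/reduce aggregation, whose result is only available after the whole data set is processed, with an incremental streaming update whose state is refreshed as each record arrives. It is a generic illustration and uses no framework-specific API.

      # Batch map/reduce aggregation versus an incremental streaming update.
      from collections import Counter

      def batch_word_count(documents):
          # Batch model: map every record, then reduce over the whole data set.
          mapped = (word for doc in documents for word in doc.split())
          return Counter(mapped)

      class StreamingWordCount:
          # Streaming model: keep running state and update it per arriving record.
          def __init__(self):
              self.counts = Counter()

          def on_record(self, doc):
              self.counts.update(doc.split())
              return self.counts  # an up-to-date result after every record

      docs = ["big data", "real time big data"]
      print(batch_word_count(docs))
      stream = StreamingWordCount()
      for doc in docs:
          latest = stream.on_record(doc)
      print(latest)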

  10. Healthcare Firms and the ERP Systems

    OpenAIRE

    A. Garefalakis; G. Mantalis; E. Vourgourakis; K. Spinthiropoulos; Ch. Lemonakis

    2016-01-01

    With the continuous and drastic changes due to the economic crisis, along with the increasing market demands, major reforms are initiated in the healthcare sector in order to improve the quality of healthcare and operational efficiency, while reducing costs and optimizing back-end operations. ERP systems have been the basic technological infrastructure to many sectors as well as healthcare. The main objective of this study is to discuss how the adoption of ERP systems in healthcar...

  11. [Building a quality evaluation system for the nursing triage process].

    Science.gov (United States)

    Gadda, Giorgio; Destrebecq, Anne; Bollini, Giovanna; Terzoni, Stefano

    2009-01-01

    In the literature there are no studies regarding the quality evaluation of the whole triage process; however, since it is necessary to evaluate what really happens every day in Emergency Rooms, as well as to verify the daily level of throughput, there is a strong need for an appropriate tool. Measuring the quality of triage means improving the caring process. This article presents a new measurement grid, aimed at realizing a Quality Evaluation of Nursing Care. Currently, the most widespread systems used for the QENC are the Australian Triage Scale, the Canadian Triage and Acuity Scale and the Italian Group of Triage Scale. The global/biphasic triage system was used in our study because it seems the most accurate, according to the literature. There are several indicators that evaluate many aspects of the triage process, and every single indicator has a score. The sum of the scores defines the quality level of the nursing triage process. Our paper discusses the application of this score in a major Milan hospital, based upon a preliminary study.

  12. Aggression proneness: Transdiagnostic processes involving negative valence and cognitive systems.

    Science.gov (United States)

    Verona, Edelyn; Bresin, Konrad

    2015-11-01

    Aggressive behavior is observed in persons with various mental health problems and has been studied from the perspectives of neuroscience and psychophysiology. The present research reviews some of the extant experimental literature to help clarify the interplay between domains of functioning implicated in aggression proneness. We then convey a process-oriented model that elucidates how the interplay of the Negative Valence and Cognitive System domains of NIMH's Research Domain Criteria (RDoC) helps explain aggression proneness, particularly reactive aggression. Finally, we report on a study involving event-related potential (ERP) indices of emotional and inhibitory control processing during an emotional-linguistic go/no-go task among 67 individuals with histories of violence and criminal offending (30% female, 44% African-American) who reported on their aggressive tendencies using the Buss-Perry Aggression Questionnaire. Results provide evidence that tendencies toward angry and aggressive behavior relate to reduced inhibitory control processing (no-go P3) specifically during relevant threat-word blocks, suggesting deterioration of cognitive control by acute or sustained threat sensitivity. These findings highlight the value of ERP methodologies for clarifying the interplay of Negative Valence and Cognitive System processes in aggression proneness.

  13. Living is information processing: from molecules to global systems.

    Science.gov (United States)

    Farnsworth, Keith D; Nelson, John; Gershenson, Carlos

    2013-06-01

    We extend the concept that life is an informational phenomenon, at every level of organisation, from molecules to the global ecological system. According to this thesis: (a) living is information processing, in which memory is maintained by both molecular states and ecological states as well as the more obvious nucleic acid coding; (b) this information processing has one overall function-to perpetuate itself; and (c) the processing method is filtration (cognition) of, and synthesis of, information at lower levels to appear at higher levels in complex systems (emergence). We show how information patterns are united by the creation of mutual context, generating persistent consequences, to result in 'functional information'. This constructive process forms arbitrarily large complexes of information, the combined effects of which include the functions of life. Molecules and simple organisms have already been measured in terms of functional information content; we show how quantification may be extended to each level of organisation up to the ecological. In terms of a computer analogy, life is both the data and the program, and its biochemical structure is the way the information is embodied. This idea supports the seamless integration of life at all scales with the physical universe. The innovation reported here is essentially to integrate these ideas, basing information on the 'general definition' of information rather than simply the statistics of information, thereby explaining how functional information operates throughout life.

  14. Precise timing correlation in telemetry recording and processing systems

    Science.gov (United States)

    Pickett, R. B.; Matthews, F. L.

    1973-01-01

    Independent PCM telemetry data signals received from missiles must be correlated to within + or - 100 microseconds for comparison with radar data. Tests have been conducted to determine RF antenna receiving system delays; delays associated with the wideband analog tape recorders used in the recording, dubbing and reproducing processes; and uncertainties associated with computer-processed time tag data. Several methods used in the recording of timing are evaluated. Through the application of a special time tagging technique, the cumulative timing bias from all sources is determined and the bias removed from the final data. Conclusions show that relative time differences in the receiving, recording, playback and processing of two telemetry links can be determined with a + or - 4 microsecond accuracy. In addition, the absolute time tag error (with respect to UTC) can be reduced to less than 15 microseconds. This investigation is believed to be the first attempt to identify the individual error contributions within the telemetry system and to describe the methods of error reduction and correction.

  15. The IBA Easy-E-Beam™ Integrated Processing System

    Science.gov (United States)

    Cleland, Marshall R.; Galloway, Richard A.; Lisanti, Thomas F.

    2011-06-01

    IBA Industrial Inc., (formerly known as Radiation Dynamics, Inc.) has been making high-energy and medium-energy, direct-current proton and electron accelerators for research and industrial applications for many years. Some industrial applications of high-power electron accelerators are the crosslinking of polymeric materials and products, such as the insulation on electrical wires, multi-conductor cable jackets, heat-shrinkable plastic tubing and film, plastic pipe, foam and pellets, the partial curing of rubber sheet for automobile tire components, and the sterilization of disposable medical devices. The curing (polymerization and crosslinking) of carbon and glass fiber-reinforced composite plastic parts, the preservation of foods and the treatment of waste materials are attractive possibilities for future applications. With electron energies above 1.0 MeV, the radiation protection for operating personnel is usually provided by surrounding the accelerator facility with thick concrete walls. With lower energies, steel and lead panels can be used, which are substantially thinner and more compact than the equivalent concrete walls. IBA has developed a series of electron processing systems called Easy-e-Beam™ for the medium energy range from 300 keV to 1000 keV. These systems include the shielding as an integral part of a complete radiation processing facility. The basic concepts of the electron accelerator, the product processing equipment, the programmable control system, the configuration of the radiation shielding and some performance characteristics are described in this paper.

  16. A new ATLAS muon CSC readout system with system on chip technology on ATCA platform

    Energy Technology Data Exchange (ETDEWEB)

    Claus, R., E-mail: claus@slac.stanford.edu

    2016-07-11

    The ATLAS muon Cathode Strip Chamber (CSC) back-end readout system has been upgraded during the LHC 2013–2015 shutdown to be able to handle the higher Level-1 trigger rate of 100 kHz and the higher occupancy at Run 2 luminosity. The readout design is based on the Reconfiguration Cluster Element (RCE) concept for high bandwidth generic DAQ implemented on the ATCA platform. The RCE design is based on the new System on Chip Xilinx Zynq series with a processor-centric architecture with ARM processor embedded in FPGA fabric and high speed I/O resources together with auxiliary memories to form a versatile DAQ building block that can host applications tapping into both software and firmware resources. The Cluster on Board (COB) ATCA carrier hosts RCE mezzanines and an embedded Fulcrum network switch to form an online DAQ processing cluster. More compact firmware solutions on the Zynq for G-link, S-link and TTC allowed the full system of 320 G-links from the 32 chambers to be processed by 6 COBs in one ATCA shelf through software waveform feature extraction to output 32 S-links. The full system was installed in Sept. 2014. We will present the RCE/COB design concept, the firmware and software processing architecture, and the experience from the intense commissioning towards LHC Run 2.

  17. Medium and high energy electron beam processing system

    Energy Technology Data Exchange (ETDEWEB)

    Kashiwagi, Masayuki [Nissin-High Voltage Co., Ltd., Kyoto (Japan)

    2003-02-01

    An Electron Beam Processing System (EPS) is a useful and powerful tool for industrial irradiation processes. The specification of an EPS is decided by considering what material is to be irradiated, how thick and wide it is, how much dose is required, how it is to be handled, and in what atmosphere. In designing an EPS, it is necessary to consider safety measures such as x-ray shielding, ozone control and an interlock system. The initial costs to install a typical EPS are estimated for acceleration voltages from 500 kV to 5 MV, including the following items: the electron beam machine, x-ray shielding, auxiliary equipment, material handling, the survey for installation, the ozone exhaust duct, the cooling water system, and wiring and piping. These prices are for reference only, because the price changes case by case; the price of the x-ray shielding, in particular, depends on the construction cost. Auxiliary equipment includes the window, cooling blower, ozone exhaust blower and SF6 gas handling equipment. For installation work at the site, a crew of 3-4 workers for 2 months is necessary. For reference, the material handling system is assumed to consist only of rolls provided in the shielding room. In addition to the initial installation, operators and workers may be required to wear a personal radiation monitor. An x-ray monitor of suitable design should be installed outside the shield room to monitor the x-ray level in the working area. (Y. Tanaka)

  18. Optimally efficient neural systems for processing spoken language.

    Science.gov (United States)

    Zhuang, Jie; Tyler, Lorraine K; Randall, Billi; Stamatakis, Emmanuel A; Marslen-Wilson, William D

    2014-04-01

    Cognitive models claim that spoken words are recognized by an optimally efficient sequential analysis process. Evidence for this is the finding that nonwords are recognized as soon as they deviate from all real words (Marslen-Wilson 1984), reflecting continuous evaluation of speech inputs against lexical representations. Here, we investigate the brain mechanisms supporting this core aspect of word recognition and examine the processes of competition and selection among multiple word candidates. Based on new behavioral support for optimal efficiency in lexical access from speech, a functional magnetic resonance imaging study showed that words with later nonword points generated increased activation in the left superior and middle temporal gyrus (Brodmann area [BA] 21/22), implicating these regions in dynamic sound-meaning mapping. We investigated competition and selection by manipulating the number of initially activated word candidates (competition) and their later drop-out rate (selection). Increased lexical competition enhanced activity in bilateral ventral inferior frontal gyrus (BA 47/45), while increased lexical selection demands activated bilateral dorsal inferior frontal gyrus (BA 44/45). These findings indicate functional differentiation of the fronto-temporal systems for processing spoken language, with left middle temporal gyrus (MTG) and superior temporal gyrus (STG) involved in mapping sounds to meaning, bilateral ventral inferior frontal gyrus (IFG) engaged in less constrained early competition processing, and bilateral dorsal IFG engaged in later, more fine-grained selection processes.

  19. GPUs for real-time processing in HEP trigger systems

    CERN Document Server

    Ammendola, R; Deri, L; Fiorini, M; Frezza, O; Lamanna, G; Lo Cicero, F; Lonardo, A; Messina, A; Sozzi, M; Pantaleo, F; Paolucci, Ps; Rossetti, D; Simula, F; Tosoratto, L; Vicini, P

    2014-01-01

    We describe a pilot project (GAP - GPU Application Project) for the use of GPUs (Graphics processing units) for online triggering applications in High Energy Physics experiments. Two major trends can be identified in the development of trigger and DAQ systems for particle physics experiments: the massive use of general-purpose commodity systems such as commercial multicore PC farms for data acquisition, and the reduction of trigger levels implemented in hardware, towards a fully software data selection system ("trigger-less"). The innovative approach presented here aims at exploiting the parallel computing power of commercial GPUs to perform fast computations in software not only in high level trigger levels but also in early trigger stages. General-purpose computing on GPUs is emerging as a new paradigm in several fields of science, although so far applications have been tailored to the specific strengths of such devices as accelerators in offline computation. With the steady reduction of GPU latencies, and the incre...

  20. Enhanced 3D face processing using an active vision system

    DEFF Research Database (Denmark)

    Lidegaard, Morten; Larsen, Rasmus; Kraft, Dirk;

    2014-01-01

    We present an active face processing system based on 3D shape information extracted by means of stereo information. We use two sets of stereo cameras with different fields of view (FOV): one with a wide FOV is used for face tracking, while the other with a narrow FOV is used for face identification. We argue for two advantages of such a system: first, an extended work range, and second, the possibility to place the narrow FOV camera in a way such that a much better reconstruction quality can be achieved compared to a static camera, even if the face had been fully visible in the periphery of the narrow FOV camera. We substantiate these two observations by qualitative results on face reconstruction and quantitative results on face recognition. As a consequence, such a set-up allows achieving a better and much more flexible system for 3D face reconstruction, e.g. for recognition or emotion...

  1. Processes for microemulsion polymerization employing novel microemulsion systems

    Science.gov (United States)

    Beckman, Eric J.; Smith, Richard D.; Fulton, John L.

    1990-06-12

    This invention is directed to a microemulsion system comprising a first phase including a low-polarity fluid material which is a gas at standard temperature and pressure, and which has a cloud-point density. It also includes a second phase including a polar fluid, typically water, a monomer, preferably a monomer soluble in the polar fluid, and a microemulsion promoter for facilitating the formation of micelles including the monomer in the system. In the subject process, micelles including the monomer are formed in the first phase. A polymerization initiator is introduced into the micelles in the microemulsion system. The monomer is then polymerized in the micelles, preferably in the core of the micelle, to produce a polymeric material having a relatively high molecular weight.

  2. CLIMLAB: a Python-based software toolkit for interactive, process-oriented climate modeling

    Science.gov (United States)

    Rose, B. E. J.

    2015-12-01

    Global climate is a complex emergent property of the rich interactions between simpler components of the climate system. We build scientific understanding of this system by breaking it down into component process models (e.g. radiation, large-scale dynamics, boundary layer turbulence), understanding each component, and putting them back together. Hands-on experience and freedom to tinker with climate models (whether simple or complex) is invaluable for building physical understanding. CLIMLAB is an open-ended software engine for interactive, process-oriented climate modeling. With CLIMLAB you can interactively mix and match model components, or combine simpler process models together into a more comprehensive model. It was created primarily to support classroom activities, using hands-on modeling to teach fundamentals of climate science at both undergraduate and graduate levels. CLIMLAB is written in Python and ties in with the rich ecosystem of open-source scientific Python tools for numerics and graphics. The IPython notebook format provides an elegant medium for distributing interactive example code. I will give an overview of the current capabilities of CLIMLAB, the curriculum we have developed thus far, and plans for the future. Using CLIMLAB requires some basic Python coding skills. We consider this an educational asset, as we are targeting upper-level undergraduates and Python is an increasingly important language in STEM fields. However CLIMLAB is well suited to be deployed as a computational back-end for a graphical gaming environment based on earth-system modeling.
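
    A minimal sketch of the kind of interactive experiment CLIMLAB targets follows: build a simple model, integrate it forward, and inspect its state. The class and method names (climlab.EBM, integrate_years, Ts) are quoted from memory of the library and should be checked against the CLIMLAB documentation.

      # Build and run a simple 1D energy balance model with CLIMLAB (API names from memory).
      import numpy as np
      import climlab

      model = climlab.EBM()          # diffusive energy balance model, default parameters
      model.integrate_years(5.0)     # run forward in time until near equilibrium
      print("mean zonal surface temperature:", float(np.mean(model.Ts)))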

  3. An Architecture of Computer Aided Process Planning System Integrated with Scheduling Using Decision Support System

    Institute of Scientific and Technical Information of China (English)

    Manish; Kumar; Sunil; Rajotia

    2002-01-01

    Process planning and scheduling are two major planning and control activities that consume a significant part of the lead-time, therefore all attempts are being made to reduce lead-time by automating them. Computer Aided Process Planning (CAPP) is a step in this direction. Most of the existing CAPP systems do not consider scheduling while generating a process plan. Scheduling is done separately after the process plan has been generated and therefore, it is possible that a process plan so generated is e...

  4. Software Systems 2--Compiler and Operating Systems Lab--Advanced, Data Processing Technology: 8025.33.

    Science.gov (United States)

    Dade County Public Schools, Miami, FL.

    The course outline has been prepared as a guide to help the student develop the skills and knowledge necessary to succeed in the field of data processing. By learning the purpose and principles of compiler programs and operating systems, the student will become familiar with advanced data processing procedures that are representative of computer…

  5. Developing a Mobile Application "Educational Process Remote Management System" on the Android Operating System

    Science.gov (United States)

    Abildinova, Gulmira M.; Alzhanov, Aitugan K.; Ospanova, Nazira N.; Taybaldieva, Zhymatay; Baigojanova, Dametken S.; Pashovkin, Nikita O.

    2016-01-01

    Nowadays, when there is a need to introduce various innovations into the educational process, most efforts are aimed at simplifying the learning process. To that end, electronic textbooks, testing systems and other software is being developed. Most of them are intended to run on personal computers with limited mobility. Smart education is…

  6. Smart Screening System (S3) In Taconite Processing

    Energy Technology Data Exchange (ETDEWEB)

    Daryoush Allaei; Angus Morison; David Tarnowski; Asim Syed Mohammed

    2005-09-01

    components include smart motor and associated electronics, resonators, and supporting structural elements. It is shown that the smart motors have an acceptable life and performance. Resonator (or motion amplifier) designs are selected based on the final system requirement and vibration characteristics. All the components for a fully functional prototype are fabricated. The development program is on schedule. The last semi-annual report described the process of FE model validation and correlation with experimental data in terms of dynamic performance and predicted stresses. It also detailed efforts into making the supporting structure less important to system performance. Finally, an introduction into the dry application concept was presented. Since then, the design refinement phase was completed. This has resulted in a Smart Screen design that meets performance targets both in the dry condition and with taconite slurry flow using PZT motors. Furthermore, this system was successfully demonstrated for the DOE and partner companies at the Coleraine Mineral Research Laboratory in Coleraine, Minnesota.

  7. EXPRESSION AND ROLE OF PLASMINOGEN SYSTEM IN PROCESS OF RESTENOSIS

    Institute of Scientific and Technical Information of China (English)

    ZHAO Hai-guang; LU Xin-wu; HUANG Ying; JIANG Mi-er

    2005-01-01

    Objective To study the expression and role of the plasminogen system in the process of restenosis. Methods We established a double-injury model of atherosclerotic restenosis in the rabbit iliac artery mimicking human arterial restenosis. The time course of tissue plasminogen activator (tPA), urokinase plasminogen activator (uPA), urokinase plasminogen activator receptor (uPAR) and plasminogen activator inhibitor-1 (PAI-1) was investigated by immunohistochemistry. The mRNA expression of uPA and uPAR was detected after the vascular procedures by in situ hybridization. Results In uninjured arteries, weak expression of tPA and PAI-1 was detected in intimal and endothelial cells. The expression of tPA, uPA, uPAR and PAI-1 was significantly induced after double-injury, but by 14 d after double-injury the expression of tPA had returned to preinjury levels. The expression of uPA and uPAR in the intima was higher than that in the media and remained at high levels in the intima within 42 d and 56 d. Conclusion Whereas tPA is primarily involved in clot dissolution and plays a limited role in the process of restenosis, within the plasminogen system uPA and uPAR play a prominent role in the process of restenosis.

  8. Digital Signal Processing Based Real Time Vehicular Detection System

    Institute of Scientific and Technical Information of China (English)

    YANG Zhaoxuan; LIN Tao; LI Xiangping; LIU Chunyi; GAO Jian

    2005-01-01

    Traffic monitoring is of major importance for enforcing traffic management policies. To accomplish this task, the detection of vehicles can be achieved by exploiting image analysis techniques. In this paper, a solution is presented to obtain various traffic parameters through a vehicular video detection system (VVDS). VVDS exploits an algorithm based on virtual loops to detect moving vehicles in real time. This algorithm uses the background differencing method, and vehicles can be detected through the luminance difference of pixels between the background image and the current image. Furthermore, a novel technique referred to as spatio-temporal image sequence analysis is applied to background differencing to improve detection accuracy. A hardware implementation on a digital signal processing (DSP) based board is then described in detail, and the board can simultaneously process four channels of video from different cameras. The benefit of using a DSP is that images of a roadway can be processed at frame rate due to the DSP's high performance. In the end, VVDS is tested on real-world scenes and the experimental results show that the system is both fast and robust for the surveillance of transportation.
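
    The sketch below illustrates background differencing over a virtual loop region, the core detection step described above: pixels inside the loop that differ from the background by more than a threshold are counted, and a vehicle is declared present when enough of them change. Frames are plain NumPy luminance arrays; the threshold, fill ratio and loop coordinates are illustrative assumptions, not VVDS parameters.

      # Background differencing over a "virtual loop" region for vehicle detection.
      import numpy as np

      def vehicle_present(background, frame, loop, diff_threshold=25, fill_ratio=0.2):
          # background, frame: 2D uint8 luminance images of the roadway
          # loop: (row_start, row_stop, col_start, col_stop) of the virtual loop
          r0, r1, c0, c1 = loop
          bg = background[r0:r1, c0:c1].astype(np.int16)
          cur = frame[r0:r1, c0:c1].astype(np.int16)
          changed = np.abs(cur - bg) > diff_threshold
          return changed.mean() > fill_ratio

      # Synthetic example: a bright "vehicle" patch enters an otherwise static scene.
      bg = np.full((120, 160), 60, dtype=np.uint8)
      frame = bg.copy()
      frame[40:80, 50:110] = 200
      print(vehicle_present(bg, frame, loop=(30, 90, 40, 120)))  # True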

  9. Energy management and process information system; Energiemanagement- und Betriebsinformationssysteme

    Energy Technology Data Exchange (ETDEWEB)

    Blumauer, G.; Schoefberger, W.; Traxler, R. [Elin EBG Elektrotechnik GmbH, Linz (Austria)

    1998-03-09

    The increasing pressure on utilities resulting from the liberalization of the energy market, together with the growing information demands of their staff, makes EDP-supported processing of data and information necessary. In the fields of energy production and energy distribution, a huge amount of data is generated by SCADA systems. In these systems, the critical factors of operational reliability and reaction time have absolute priority. Extensive data storage and evaluation are offered only inadequately, for capacity and security reasons. Only through the extensive linking and processing of process and hybrid data is operational information produced that is of fundamental interest to several company divisions. (orig.) [German original, translated:] In energy production and grid operation, the automation and control systems continuously produce large volumes of process and field data. In these systems, the critical factors of operational reliability and reaction speed have absolute priority. Comprehensive data preparation and long-term archiving for individual evaluations is offered only inadequately, for capacity and security reasons. Only through the extensive linking and processing of process and hybrid data does operational information arise that is of significant interest to a large number of other company divisions. (orig.)

  10. 5 MV 30 mA industrial electron processing system

    Science.gov (United States)

    Hoshi, Y.; Mizusawa, K.

    1991-05-01

    Industrial electron beam processing systems have been in use in various application fields, such as improving the heat resistivity of wire insulation; controlling the quality of automobile rubber tires and the melt index characteristics of PE foams; and curing paints or printing inks. Recently, a need has arisen for electron beams with an energy higher than 3 MV in order to disinfect salmonella in chicken meat, to kill insects in fruits, and to sterilize medical disposables. To meet this need we developed a 5 MV 30 mA electron processing system with an X-ray conversion target. The machine was tested in NHV's plant in Kyoto in continuous operation at full voltage and full current. It proved to be very steady in operation with a high efficiency (as much as 72%). The X-ray target was also tested in a continuous run at 5 MV 30 mA (150 kW) and proved to be viable for industrial utilization. This paper introduces the process and the results of the development.

  11. Advanced Information Processing System (AIPS) proof-of-concept system functional design I/O network system services

    Science.gov (United States)

    1985-01-01

    The functional design of the Input/Output (I/O) services for the Advanced Information Processing System (AIPS) proof-of-concept system is described. The data flow diagrams, which show the functional processes in I/O services and the data that flow among them, are included. A complete list of the data identified on the data flow diagrams and in the process descriptions is provided.

  12. Analog signal processing for optical coherence imaging systems

    Science.gov (United States)

    Xu, Wei

    Optical coherence tomography (OCT) and optical coherence microscopy (OCM) are non-invasive optical coherence imaging techniques, which enable micron-scale resolution, depth-resolved imaging capability. Both OCT and OCM are based on Michelson interferometer theory. They are widely used in ophthalmology, gastroenterology and dermatology, because of their high resolution, safety and low cost. OCT creates cross-sectional images whereas OCM obtains en face images. In this dissertation, the design and development of three increasingly complicated analog signal processing (ASP) solutions for optical coherence imaging are presented. The first ASP solution was implemented for a time domain OCT system with a Rapid Scanning Optical Delay line (RSOD)-based optical signal modulation and logarithmic amplifier (Log amp) based demodulation. This OCT system can acquire up to 1600 A-scans per second. The measured dynamic range is 106 dB at 200 A-scans per second. The OCT signal processing electronics include an off-the-shelf filter box with a Log amp circuit implemented on a PCB. The second ASP solution was developed for an OCM system with synchronized modulation and demodulation and compensation for interferometer phase drift. This OCM acquired micron-scale resolution, high dynamic range images at acquisition speeds up to 45,000 pixels/second. This OCM ASP solution is fully custom designed on a perforated circuit board. The third ASP solution was implemented on a single 2.2 mm x 2.2 mm complementary metal oxide semiconductor (CMOS) chip. This design is expandable to a multiple-channel OCT system. A single on-chip CMOS photodetector and ASP channel was used for coherent demodulation in a time domain OCT system. Cross-sectional images were acquired with a dynamic range of 76 dB (limited by photodetector responsivity). When incorporated with a bump-bonded InGaAs photodiode with higher responsivity, the expected dynamic range is close to 100 dB.
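
    The sketch below mimics, in software, the demodulation step that the analog electronics perform in a time-domain OCT channel: a fringe signal at the delay-line carrier frequency is envelope-detected and log-compressed. The Hilbert-transform approach and all parameters are illustrative assumptions; the systems described above perform this step with analog Log-amp circuitry.

      # Digital illustration of envelope detection and log compression of an OCT fringe signal.
      import numpy as np
      from scipy.signal import hilbert

      fs = 1.0e6                       # sample rate, Hz
      t = np.arange(0, 2e-3, 1 / fs)   # 2 ms of A-scan time
      f_carrier = 50e3                 # fringe (carrier) frequency from the delay line, Hz

      # Synthetic interferogram: a Gaussian reflectivity envelope on a carrier.
      true_envelope = np.exp(-((t - 1e-3) ** 2) / (2 * (5e-5) ** 2))
      fringes = true_envelope * np.cos(2 * np.pi * f_carrier * t)

      # Envelope detection (the role of the Log-amp demodulator in the analog chain).
      envelope = np.abs(hilbert(fringes))
      log_signal_db = 20 * np.log10(envelope + 1e-6)   # log compression for dynamic range

      print("detected peak position (ms):", t[np.argmax(envelope)] * 1e3)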

  13. Energy Saving in Data Processing and Communication Systems

    Directory of Open Access Journals (Sweden)

    Giuseppe Iazeolla

    2014-01-01

    Full Text Available The power management of ICT systems, that is, data processing (Dp) and telecommunication (Tlc) systems, is becoming a relevant problem in economical terms. Dp systems totalize millions of servers and associated subsystems (processors, monitors, storage devices, etc.) all over the world that need to be electrically powered. Dp systems are also used in the government of Tlc systems, which, besides requiring Dp electrical power, also require Tlc-specific power, both for mobile networks (with their cell-phone towers and associated subsystems: base stations, subscriber stations, switching nodes, etc.) and for wired networks (with their routers, gateways, switches, etc.). ICT research is thus expected to investigate into methods to reduce Dp- and Tlc-specific power consumption. However, saving power may turn into waste of performance, in other words, into waste of ICT quality of service (QoS). This paper investigates the Dp and Tlc power management policies that look at compromises between power saving and QoS.

  14. Power Processing for a Conceptual Project Prometheus Electric Propulsion System

    Science.gov (United States)

    Scina, Joseph E., Jr.; Aulisio, Michael; Gerber, Scott S.; Hewitt, Frank; Miller, Leonard; Elbuluk, Malik; Pinero, Luis R. (Technical Monitor)

    2005-01-01

    NASA has proposed a bold mission to orbit and explore the moons of Jupiter. This mission, known as the Jupiter Icy Moons Orbiter (JIMO), would significantly increase NASA's capability to explore deep space by making use of high-power electric propulsion. One electric propulsion option under study for JIMO is an ion propulsion system. An early version of an ion propulsion system was successfully used on NASA's Deep Space 1 mission. One concept for an ion thruster system capable of meeting the current JIMO mission requirements would have individual thrusters rated at 16 to 25 kW each and requiring voltages as high as 8.0 kV. The purpose of this work is to develop power processing schemes for delivering the high-voltage power to the spacecraft ion thrusters based upon a three-phase AC distribution system. In addition, a proposed DC-DC converter topology is presented for an ion thruster ancillary supply based upon a DC distribution system. All specifications discussed in this paper are for design convenience and are speculative in nature.
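
    As a quick check on the scale of such a power processing unit, the back-of-the-envelope calculation below uses only the figures quoted above (up to 25 kW per thruster at up to 8.0 kV); the converter efficiency, three-phase bus voltage and power factor are illustrative assumptions.

```python
import math

# Back-of-the-envelope sizing using only the figures quoted in the abstract
# (16 to 25 kW per thruster, beam voltage up to 8.0 kV). The converter
# efficiency, AC bus voltage and power factor are illustrative assumptions.
thruster_power_w = 25.0e3     # upper end of the quoted per-thruster range
beam_voltage_v = 8.0e3        # maximum quoted beam voltage
eta = 0.94                    # assumed converter efficiency
v_ll = 440.0                  # assumed three-phase line-to-line voltage, V rms
pf = 0.95                     # assumed power factor at the converter input

beam_current = thruster_power_w / beam_voltage_v
p_in = thruster_power_w / eta
line_current = p_in / (math.sqrt(3) * v_ll * pf)

print(f"beam current at 25 kW / 8 kV: {beam_current:.2f} A")
print(f"input power per thruster: {p_in / 1e3:.1f} kW")
print(f"AC line current per thruster: {line_current:.1f} A rms")
```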

  15. Process control of the EUS battery energy storage system

    Energy Technology Data Exchange (ETDEWEB)

    Harke, R.; Pierschke, T.; Schroeder, M. [EUS GmbH, Gelsenkirchen (Germany)

    1999-07-01

    The process control of the EUS battery energy storage system (BESS) is presented, which is used to improve the utilization of renewable energy. This multifunctional energy storage system provides three different functions: (i) Uninterruptible power supply (UPS); (ii) Improvement of power quality; (iii) Peak load shaving. The UPS application has a long tradition and is used whenever a reliable power supply is needed. Additionally, there is nowadays a growing demand for high-quality power, given the increasing level of disturbances in electric grids. Peak load shaving here means using renewably generated power stored in a battery during periods of high peak load. For such a multifunctional application, large lead-acid batteries with high power capability, good charge acceptance and good cycle life are needed. The batteries consist of standard OCSM cells with positive tubular plates and negative copper grids, modified according to the special demands of a multifunctional application. This paper is based on two examples of multifunctional energy storage systems that have recently started operation in Germany: one system was installed in combination with a 1 MW solar plant in Herne, and another in combination with a 3.5 MW wind farm in Bocholt. At each site, a 1.2 MWh (1 h rate) lead-acid battery has been installed. (orig.)
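
    The sketch below illustrates the kind of peak-shaving dispatch decision such a controller makes at each time step: discharge just enough to cap grid import at a set limit, otherwise recharge within the remaining headroom. The power limit, battery rating, state of charge and load/PV samples are illustrative assumptions and do not describe the EUS control implementation.

```python
# A minimal peak-shaving dispatch loop: thresholds, battery ratings and the
# load/PV samples are illustrative assumptions and do not describe the EUS
# control implementation.
PEAK_LIMIT_KW = 800.0     # grid-import level above which the battery discharges
BATTERY_KWH = 1200.0      # usable energy (roughly the quoted 1.2 MWh)
MAX_POWER_KW = 1000.0     # charge/discharge power limit
soc_kwh = 600.0           # current state of charge

def dispatch(load_kw, renewable_kw, soc_kwh, dt_h=0.25):
    """Return (battery_kw, new_soc); battery_kw > 0 means discharging."""
    net_load = load_kw - renewable_kw
    if net_load > PEAK_LIMIT_KW:
        # Shave the peak: discharge just enough to cap grid import at the limit
        p = min(net_load - PEAK_LIMIT_KW, MAX_POWER_KW, soc_kwh / dt_h)
    else:
        # Recharge within the headroom left below the import limit
        headroom = PEAK_LIMIT_KW - net_load
        p = -min(headroom, MAX_POWER_KW, (BATTERY_KWH - soc_kwh) / dt_h)
    return p, soc_kwh - p * dt_h

for load, pv in [(950, 50), (1200, 100), (600, 300)]:
    p, soc_kwh = dispatch(load, pv, soc_kwh)
    print(f"load={load} kW, PV={pv} kW -> battery {p:+.0f} kW, SOC {soc_kwh:.0f} kWh")
```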

  16. Synthetic membranes and membrane processes with counterparts in biological systems

    Science.gov (United States)

    Matson, Stephen L.

    1996-02-01

    Conventional synthetic membranes, fashioned for the most part from rather unremarkable polymeric materials, are essentially passive structures that achieve various industrial and biomedical separations through simple and selective membrane permeation processes. Indeed, simplicity of membrane material, structure, and function has long been perceived as a virtue of membranes relative to other separation processes with which they compete. The passive membrane separation processes -- exemplified by micro- and ultrafiltration, dialysis, reverse osmosis, and gas permeation -- differ from one another primarily in terms of membrane morphology or structure (e.g., porous, gel-type, and nonporous) and the permeant transport mechanism and driving force (e.g., diffusion, convection, and 'solution/diffusion'). The passive membrane separation processes have in common the fact that interaction between permeant and membrane material is typically weak and physicochemical in nature; indeed, it is frequently an objective of membrane materials design to minimize interaction between permeant and membrane polymer, since such strategies can minimize membrane fouling. As a consequence, conventional membrane processes often provide only modest separation factors or permselectivities; that is, they are more useful in performing 'group separations' (i.e., the separation of different classes of material) than they are in fractionating species within a given class. It has long been recognized within the community of membrane technologists that biological membrane structures and their components are extraordinarily sophisticated and powerful as compared to their synthetic counterparts. Moreover, biomembranes and related biological systems have been 'designed' according to a very different paradigm -- one that frequently maximizes and capitalizes on extraordinarily strong and biochemically specific interactions between components of the membrane and species interacting with them. Thus, in recent

  17. Linear systems analysis of activating processes of complement system as a defense mechanism.

    Science.gov (United States)

    Hirayama, H; Yoshii, K; Ojima, H; Kawai, N; Gotoh, S; Fukuyama, Y

    1996-01-01

    The complement system is an important element of the host defense mechanism, although its kinetics and its characteristics as a system are still unclear. We have investigated its temporal changes and system properties from the viewpoint of systems engineering. The temporal changes of the sequential activating processes of the system were expressed by 26 non-linear differential equations using reported values of the rate constants and serum concentrations of each component. The intermediate products in the activating processes increased parabolically, while the membrane attack component, as the final product, increased linearly. The small changes in the inactive precursors justified linearization of the system. Linear systems analysis revealed that the system, which was insensitive to changes in the rate constants, was unstable. The system became stable when the feedback input from the final product was set to operate on the first step of the activating processes. Seven uncontrollable variables were insensitive to changes in the rate constants or to system optimization that minimized the changes in the concentrations of the components of the complement system. The singular values of the complement system were reduced and the impulse responses of the system were improved when the system was optimized. When stronger minimization was imposed on the changes in the concentrations of the components, the singular values were reduced further, the magnitude of the impulse responses was depressed further, and the responses terminated earlier than when the elements of the concentration weighting matrix were set to unity. By this stronger minimization, the influence of changes in the rate constants on the singular values was diminished. This theoretical analysis is presented as a way to evaluate the defensive capability of the complement system.
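
    The workflow described above (express a nonlinear activation cascade as ODEs, linearize about an operating point, then examine singular values and impulse responses) can be illustrated with the toy sketch below. The 3-state cascade, its rate constants and the operating point are illustrative assumptions, not the 26-equation complement model analyzed in the paper.

```python
import numpy as np
from scipy.signal import StateSpace, impulse

# Toy stand-in for the approach described above: linearize a small nonlinear
# activation cascade around an operating point, then inspect singular values
# and the impulse response of the linearized system. The 3-state cascade and
# its rate constants are illustrative assumptions, not the complement model.
k1, k2, k3 = 1.0, 0.5, 0.2                 # assumed rate constants

def f(x, u):
    # x[0]: activated precursor, x[1]: intermediate, x[2]: final product
    return np.array([
        u - k1 * x[0] * x[1],
        k1 * x[0] * x[1] - k2 * x[1],
        k2 * x[1] - k3 * x[2],
    ])

x0, u0 = np.array([0.5, 1.0, 2.5]), 0.5    # steady state: f(x0, u0) = 0

# Jacobians A = df/dx and B = df/du by central finite differences
eps = 1e-6
A = np.column_stack([(f(x0 + eps * e, u0) - f(x0 - eps * e, u0)) / (2 * eps)
                     for e in np.eye(3)])
B = ((f(x0, u0 + eps) - f(x0, u0 - eps)) / (2 * eps)).reshape(3, 1)
C = np.array([[0.0, 0.0, 1.0]])            # observe the final product only
D = np.zeros((1, 1))

print("singular values of A:", np.linalg.svd(A, compute_uv=False))
t, y = impulse(StateSpace(A, B, C, D))
print("peak of the final-product impulse response:", y.max())
```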

  18. Process Completing Sequences for Resource Allocation Systems with Synchronization

    Directory of Open Access Journals (Sweden)

    Song Foh Chew

    2012-01-01

    Full Text Available This paper considers the problem of establishing live resource allocation in workflows with synchronization stages. Establishing live resource allocation in this class of systems is challenging since deciding whether a given level of resource capacities is sufficient to complete a single process is NP-complete. In this paper, we develop two necessary conditions and one sufficient condition that provide quickly computable tests for the existence of process completing sequences. The necessary conditions are based on the sequence of completions of subprocesses that merge together at a synchronization. Although the worst case complexity is O(2, we expect the number of subprocesses combined at any synchronization will be sufficiently small so that total computation time remains manageable. The sufficient condition uses a reduction scheme that computes a sufficient capacity level of each resource type to complete and merge all subprocesses. The worst case complexity is O(⋅, where is the number of synchronizations. Finally, the paper develops capacity bounds and polynomial methods for generating feasible resource allocation sequences for merging systems with single unit allocation. This method is based on single step look-ahead for deadly marked siphons and is O(2. Throughout the paper, we use a class of Petri nets called Generalized Augmented Marked Graphs to represent our resource allocation systems.
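
    The single-step look-ahead idea mentioned above can be illustrated generically, as in the sketch below: a request is granted only if the state it leads to still admits a completing sequence. The sketch uses a simple Banker's-style safety check rather than the paper's siphon-based test, and the processes, stages and capacities are made-up examples.

```python
from copy import deepcopy

# Generic illustration of single-step look-ahead for single-unit resource
# allocation: grant a request only when the resulting state is still "safe",
# i.e. the remaining process stages can complete in some order. This uses a
# Banker's-style safety check, not the siphon-based test developed in the
# paper; the processes, stages and capacities are made up.
capacity = {"R1": 1, "R2": 1}
remaining = {"P1": ["R1", "R2"], "P2": ["R2", "R1"]}  # resources still needed, in order
held = {"P1": [], "P2": []}                           # resources currently held

def free(held):
    used = {r: sum(rs.count(r) for rs in held.values()) for r in capacity}
    return {r: capacity[r] - used[r] for r in capacity}

def safe(remaining, held):
    """True if every process can still run to completion in some order."""
    remaining, held = deepcopy(remaining), deepcopy(held)
    progress = True
    while progress and any(remaining.values()):
        progress = False
        for p, stages in remaining.items():
            # p can finish now if every resource it still needs is free;
            # on completion it releases everything it holds.
            if stages and all(free(held)[r] >= 1 for r in stages):
                remaining[p], held[p] = [], []
                progress = True
    return not any(remaining.values())

def request(p, r):
    """Single-step look-ahead: grant r to p only if the next state stays safe."""
    if free(held)[r] < 1 or remaining[p][0] != r:
        return False
    held[p].append(r)
    remaining[p].pop(0)
    if safe(remaining, held):
        return True
    held[p].remove(r)                 # would lead to deadlock: deny and roll back
    remaining[p].insert(0, r)
    return False

print(request("P1", "R1"))   # True: P2 can still complete afterwards
print(request("P2", "R2"))   # False: granting R2 here would create a circular wait
```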

  19. Market development directory for solar industrial process heat systems

    Energy Technology Data Exchange (ETDEWEB)

    None

    1980-02-01

    The purpose of this directory is to provide a basis for market development activities through a location listing of key trade associations, trade periodicals, and key firms for three target groups. Potential industrial users and potential IPH system designers were identified as the prime targets for market development activities. The bulk of the directory is a listing of these two groups. The third group, solar IPH equipment manufacturers, was included to provide an information source for potential industrial users and potential IPH system designers. Trade associations and their publications are listed for selected four-digit Standard Industrial Classification (SIC) industries. Since industries requiring relatively low-temperature process heat will probably comprise most of the near-term market for solar IPH systems, the 80 SICs included in this chapter have process temperature requirements of less than 350°F. Some key statistics and a location list of the largest plants (according to number of employees) in each state are included for 15 of the 80 SICs. Architectural/engineering and consulting firms known to have solar experience are listed. Professional associations and periodicals to which information on solar IPH systems may be directed are also included. Solar equipment manufacturers and their associations are listed. The listing is based on the SERI Solar Energy Information Data Base (SEIDB).

  20. A MEMS-based, wireless, biometric-like security system

    Science.gov (United States)

    Cross, Joshua D.; Schneiter, John L.; Leiby, Grant A.; McCarter, Steven; Smith, Jeremiah; Budka, Thomas P.

    2010-04-01

    We present a system for secure identification applications that is based upon biometric-like MEMS chips. The MEMS chips have unique frequency signatures resulting from fabrication process variations, so each chip possesses something analogous to a "voiceprint". The chips are vacuum encapsulated, rugged, and suitable for low-cost, high-volume mass production. Furthermore, the fabrication process is fully integrated with standard CMOS fabrication methods. The MEMS-based identification system is operated similarly to a conventional RFID system: the reader (essentially a custom network analyzer) detects the power reflected across a frequency spectrum from a MEMS chip in its vicinity. We demonstrate prototype "tags" - MEMS chips placed on a credit-card-like substrate - to show how the system could be used in standard identification or authentication applications. We have integrated power scavenging to provide DC bias for the MEMS chips through the use of a 915 MHz source in the reader and an RF-DC conversion circuit on the tag. The system enables a high level of protection against typical RFID hacking attacks. There is no need for signal encryption, so back-end infrastructure is minimal. We believe this system would be a viable low-cost, high-security solution for a variety of identification and authentication applications.
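
    The sketch below shows, in simplified form, how a reader might match a measured reflection spectrum against enrolled frequency signatures to identify a tag. The frequency grid, synthetic Lorentzian-dip spectra, correlation-based score and acceptance threshold are illustrative assumptions, not the matching method used in the actual system.

```python
import numpy as np

# Toy sketch of matching a measured reflection spectrum against enrolled
# MEMS frequency signatures. The frequency grid, the synthetic Lorentzian-dip
# spectra and the correlation threshold are illustrative assumptions.
freqs = np.linspace(10e6, 20e6, 2001)                 # swept frequency grid, Hz

def signature(resonances_hz, q=200):
    """Synthetic reflection spectrum: a Lorentzian dip at each resonance."""
    s = np.ones_like(freqs)
    for f0 in resonances_hz:
        half_width = f0 / q / 2.0
        s -= 0.8 / (1.0 + ((freqs - f0) / half_width) ** 2)
    return s

enrolled = {
    "tag_A": signature([12.1e6, 15.7e6, 18.3e6]),
    "tag_B": signature([11.4e6, 14.9e6, 19.2e6]),
}

def identify(measured, threshold=0.9):
    """Return (best_tag, score) by normalized correlation, or (None, score)."""
    scores = {}
    for name, ref in enrolled.items():
        a, b = measured - measured.mean(), ref - ref.mean()
        scores[name] = float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    best = max(scores, key=scores.get)
    return (best, scores[best]) if scores[best] >= threshold else (None, scores[best])

# Simulate a noisy read of tag_A and identify it
rng = np.random.default_rng(0)
measured = enrolled["tag_A"] + 0.02 * rng.standard_normal(freqs.size)
print(identify(measured))
```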