WorldWideScience

Sample records for handling data

  1. LACIE data-handling techniques

    Science.gov (United States)

    Waits, G. H. (Principal Investigator)

    1979-01-01

    Techniques implemented to facilitate processing of LANDSAT multispectral data between 1975 and 1978 are described. The data that were handled during the large area crop inventory experiment and the storage mechanisms used for the various types of data are defined. The overall data flow, from the placing of the LANDSAT orders through the actual analysis of the data set, is discussed. An overview is provided of the status and tracking system that was developed and of the data base maintenance and operational task. The archiving of the LACIE data is explained.

  2. Data Handling and Parameter Estimation

    DEFF Research Database (Denmark)

    Sin, Gürkan; Gernaey, Krist

    2016-01-01

    Modelling is one of the key tools at the disposal of modern wastewater treatment professionals, researchers and engineers. It enables them to study and understand complex phenomena underlying the physical, chemical and biological performance of wastewater treatment plants at different temporal … It is also expected that the chapter will be useful both for graduate teaching and as a stepping stone for academic researchers who wish to expand their theoretical interest in the subject. The chapter presents an overview of the most commonly used methods for the estimation of parameters from experimental batch data, namely: (i) data handling and validation, (ii) … For the models selected to interpret the experimental data, the chapter uses available models from the literature that are mostly based on the Activated Sludge Model (ASM) framework and their appropriate extensions (Henze et al., 2000).

  3. HMSRP Hawaiian Monk Seal Handling Data

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This data set contains records for all handling and measurement of Hawaiian monk seals since 1981. Live seals are handled and measured during a variety of events...

  4. ATA diagnostic data handling system: an overview

    International Nuclear Information System (INIS)

    Chambers, F.W.; Kallman, J.; McDonald, J.; Slominski, M.

    1984-01-01

    The functions to be performed by the ATA diagnostic data handling system are discussed. The capabilities of the present data acquisition system (System 0) are presented. The goals for the next generation acquisition system (System 1), currently under design, are discussed. Facilities on the Octopus system for data handling are reviewed. Finally, we discuss what has been learned about diagnostics and computer based data handling during the past year.

  5. Data handling systems and methods of wiring

    International Nuclear Information System (INIS)

    Grant, J.

    1981-01-01

    An improved data handling system, for monitoring and control of nuclear reactor operations, is described in which time delays associated with scanning are reduced and noise and fault signals in the system are resolved. (U.K.)

  6. WISE TECHNOLOGY FOR HANDLING BIG DATA FEDERATIONS

    NARCIS (Netherlands)

    Valentijn, E; Begeman, Kornelis; Belikov, Andrey; Boxhoorn, Danny; Verdoes Kleijn, Gijs; McFarland, John; Vriend, Willem-Jan; Williams, Owen; Soille, P.; Marchetti, P.G.

    2014-01-01

    The effective use of Big Data in current and future scientific missions requires intelligent data handling systems which are able to interface the user to complicated distributed data collections. We review the WISE Concept of Scientific Information Systems and the WISE solutions for the storage and …

  7. Experience of Data Handling with IPPM Payload

    Science.gov (United States)

    Errico, Walter; Tosi, Pietro; Ilstad, Jorgen; Jameux, David; Viviani, Riccardo; Collantoni, Daniele

    2010-08-01

    A simplified On-Board Data Handling system has been developed by CAEN AURELIA SPACE and ABSTRAQT as a PUS-over-SpaceWire demonstration platform for the Onboard Payload Data Processing laboratory at ESTEC. The system is composed of three Leon2-based IPPM (Integrated Payload Processing Module) computers that play the roles of Instrument, Payload Data Handling Unit and Satellite Management Unit. Two PCs complete the test set-up, simulating an external Memory Management Unit and the Ground Control Unit. Communication among units takes place primarily through SpaceWire links; the RMAP [2] protocol is used for configuration and housekeeping. A limited implementation of the ECSS-E-70-41B Packet Utilisation Standard (PUS) [1] over CAN bus and MIL-STD-1553B has also been realized. The open-source RTEMS real-time operating system runs on the IPPM AT697E CPU.

  8. Statistical methods for handling incomplete data

    CERN Document Server

    Kim, Jae Kwang

    2013-01-01

    ""… this book nicely blends the theoretical material and its application through examples, and will be of interest to students and researchers as a textbook or a reference book. Extensive coverage of recent advances in handling missing data provides resources and guidelines for researchers and practitioners in implementing the methods in new settings. … I plan to use this as a textbook for my teaching and highly recommend it.""-Biometrics, September 2014

  9. Warmwassersphaere - handling and processing of hydrographic data

    Energy Technology Data Exchange (ETDEWEB)

    Sy, A

    1983-01-01

    This report reviews CTD data handling and the principles of processing for the profiling instrument 'Multisonde' as presently in use at the Institut fuer Meereskunde in Kiel, F.R.G., within the framework of the research programme 'Warmwassersphaere'. An introduction to the shipboard system for measuring and logging is given. An outline of both laboratory and in situ calibrations is presented. The main subject of this report is the processing of the data. Possible sources of error in field measurements and their influence on data accuracy are discussed, and the standard processing stages are described. A specific problem lies in data errors that inhibit routine processing. In order to edit these data a special filter, the median filter, is introduced. Its efficiency, as well as the experience gained through its practical application, is described and discussed by comparison with conventional techniques. The data used for these tests were collected during the experiment Nordostatlantik '81.
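    The despiking idea behind a median filter of the kind described above can be sketched in a few lines. This is a minimal illustration with invented temperature values and window length, not the Multisonde processing chain itself; a median filter removes an isolated spike entirely, where a moving average would only smear it across neighbouring samples.

    ```python
    from statistics import median

    def median_filter(samples, window=3):
        """Replace each sample by the median of its neighbourhood.

        Edge samples are filtered over the part of the window that
        exists, so the output has the same length as the input.
        """
        half = window // 2
        out = []
        for i in range(len(samples)):
            lo = max(0, i - half)
            hi = min(len(samples), i + half + 1)
            out.append(median(samples[lo:hi]))
        return out

    # A temperature trace with one spurious spike (a typical data error):
    trace = [10.1, 10.2, 10.2, 99.9, 10.3, 10.4, 10.4]
    cleaned = median_filter(trace, window=3)
    # The spike at index 3 is replaced by the median of its neighbours.
    ```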

  10. The handling of data from experiments

    CERN Document Server

    Davies, H E

    1974-01-01

    The use of small computers in on-line experiments in high-energy physics is briefly indicated. The requirement for an above-average performance (data-handling rates up to 1.5 Mbit/sec) is described, emphasizing the problem of data acquisition; data rates and buffering, data storage, and the importance of flexibility are dealt with. The discussion of hardware solutions to the special problems posed by on-line experiments includes the use of CAMAC interfaces, systems of linked computers, and the use of special processors which perform the first steps of data analysis very rapidly. A section on the software solution to data acquisition problems treats the requirements for flexibility and ease of use, giving as an example a comparison of a manufacturer-supplied Editor and CERN's ORION Editor, and concludes with an outline of the need for direct access to more powerful computers, giving as an illustration the FOCUS and Omega/SFM networks. (0 refs).

  11. Distributed computing for FTU data handling

    Energy Technology Data Exchange (ETDEWEB)

    Bertocchi, A. E-mail: bertocchi@frascati.enea.it; Bracco, G.; Buceti, G.; Centioli, C.; Giovannozzi, E.; Iannone, F.; Panella, M.; Vitale, V

    2002-06-01

    The growth of data warehouses in tokamak experiments is leading fusion laboratories to provide new IT solutions in data handling. In the last three years, the Frascati Tokamak Upgrade (FTU) experimental database was migrated from an IBM mainframe to a Unix distributed computing environment. The migration efforts have taken into account the following items: (1) a new data storage solution based on a storage area network over fibre channel; (2) the Andrew File System (AFS) for wide-area network file sharing; (3) a 'one measure/one file' philosophy replacing 'one shot/one file' to provide faster read/write data access; (4) more powerful services, such as AFS, CORBA and MDSplus, to allow users to access the FTU database from different clients, regardless of their OS; (5) wide availability of data analysis tools, from the locally developed utility SHOW to the multi-platform Matlab, Interactive Data Language and jScope (all these tools are now also able to access Joint European Torus data, in the framework of the remote data access activity); (6) a batch-computing cluster of Alpha/Compaq Tru64 CPUs based on CODINE/GRD to optimize utilization of software and hardware resources.

  12. A memory module for experimental data handling

    Science.gov (United States)

    De Blois, J.

    1985-02-01

    A compact CAMAC memory module for experimental data handling was developed to eliminate the need of direct memory access in computer controlled measurements. When using autonomous controllers it also makes measurements more independent of the program and enlarges the available space for programs in the memory of the micro-computer. The memory module has three modes of operation: an increment-, a list- and a fifo mode. This is achieved by connecting the main parts, being: the memory (MEM), the fifo buffer (FIFO), the address buffer (BUF), two counters (AUX and ADDR) and a readout register (ROR), by an internal 24-bit databus. The time needed for databus operations is 1 μs, for measuring cycles as well as for CAMAC cycles. The FIFO provides temporary data storage during CAMAC cycles and separates the memory part from the application part. The memory is variable from 1 to 64K (24 bits) by using different types of memory chips. The application part, which forms 1/3 of the module, will be specially designed for each application and is added to the memory by an internal connector. The memory unit will be used in Mössbauer experiments and in thermal neutron scattering experiments.
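    The three modes of operation named in the abstract can be modelled in software. The toy class below is an illustrative sketch only (the class and method names are invented, and the real module is hardware): increment mode bumps a count at the event's channel address, building a spectrum as in a multichannel analyser; list mode writes events to successive addresses; fifo mode buffers data until the readout cycle drains them.

    ```python
    from collections import deque

    class MemoryModule:
        """Toy software model of the module's three modes of operation."""

        def __init__(self, size=16):
            self.mem = [0] * size   # MEM: the main memory array
            self.fifo = deque()     # FIFO: buffer decoupling readout
            self.list_addr = 0      # ADDR: counter for list mode

        def increment(self, channel):
            # Increment mode: each event adds one count at its channel.
            self.mem[channel] += 1

        def append(self, value):
            # List mode: events are stored at successive addresses.
            self.mem[self.list_addr] = value
            self.list_addr += 1

        def push(self, value):
            # FIFO mode: buffer data until the (CAMAC) readout cycle.
            self.fifo.append(value)

        def readout(self):
            return self.fifo.popleft()

    mod = MemoryModule(size=8)
    for channel in (2, 2, 5):   # three detector events
        mod.increment(channel)
    # mod.mem now holds a tiny spectrum: channel 2 fired twice.
    ```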

  13. A memory module for experimental data handling

    International Nuclear Information System (INIS)

    Blois, J. de

    1985-01-01

    A compact CAMAC memory module for experimental data handling was developed to eliminate the need of direct memory access in computer controlled measurements. When using autonomous controllers it also makes measurements more independent of the program and enlarges the available space for programs in the memory of the micro-computer. The memory module has three modes of operation: an increment-, a list- and a fifo mode. This is achieved by connecting the main parts, being: the memory (MEM), the fifo buffer (FIFO), the address buffer (BUF), two counters (AUX and ADDR) and a readout register (ROR), by an internal 24-bit databus. The time needed for databus operations is 1 μs, for measuring cycles as well as for CAMAC cycles. The FIFO provides temporary data storage during CAMAC cycles and separates the memory part from the application part. The memory is variable from 1 to 64K (24 bits) by using different types of memory chips. The application part, which forms 1/3 of the module, will be specially designed for each application and is added to the memory by an internal connector. The memory unit will be used in Moessbauer experiments and in thermal neutron scattering experiments. (orig.)

  14. Command and Data Handling Branch Internship

    Science.gov (United States)

    Billings, Rachel Mae

    2016-01-01

    Modular Integrated Stackable Layers (MISL) is a computer system designed for simple, fast, and cost-effective flexible reconfiguration in space environments, such as the ISS and Orion projects, for various uses. Existing applications include wireless and wired communications, data acquisition and instrumentation, and camera systems; potential applications include bus protocol converters and subsystem control. MISL is based on Texas Instruments' (TI) MSP430 16-bit ultra-low-power microcontroller. The purpose of my project was to integrate the MISL system with a liquid crystal display (LCD) touchscreen. The LCD, manufactured by Crystalfontz (part number CFAF320240F-035T-TS), is a 320-by-240 RGB resistive color touchscreen with an optional carrier board. The vast majority of the project was done with Altium Designer, a tool for printed circuit board (PCB) schematic capture, 3D design, and FPGA (Field Programmable Gate Array) development. The new PCB was to allow the LCD to stack directly onto the rest of MISL. Research was done with the datasheets for the TI microcontroller and the touchscreen display in order to meet the desired hardware specifications. Documentation on prior MISL projects was also utilized. The initial step was to create a schematic for the LCD, power bus, and data bus connections between components. A layout was then designed with the required physical dimensions, routed traces and vias, power and ground planes, layer stacks, and other specified design rules such as plane clearance and hole size. Multiple consultation sessions were held with Hester Yim, the technical discipline lead for the Command and Data Handling Branch, and Christy Herring, the lead PCB layout designer in the Electronic Design and Manufacturing Branch, in order to ensure proper configuration. At the moment, the PCB is awaiting revision by the latter branch. Afterwards, the board will undergo the manufacturing and testing process. Throughout the internship at …

  15. Data-handling system for the Fly's Eye experiment

    International Nuclear Information System (INIS)

    Bergeson, H.E.; Cassiday, G.L.; Cooper, D.A.

    1975-01-01

    The Fly's Eye air scintillation experiment presents severe data-handling requirements for two reasons. First, nearly 1,000 photomultipliers each produce outputs at rates from 100 kHz to 20 MHz. Second, much of the signal arrives before a trigger is formed. A data handling system which will deal with this problem is described. (orig.)

  16. The prevention and handling of the missing data

    OpenAIRE

    Kang, Hyun

    2013-01-01

    Even in a well-designed and controlled study, missing data occurs in almost all research. Missing data can reduce the statistical power of a study and can produce biased estimates, leading to invalid conclusions. This manuscript reviews the problems and types of missing data, along with the techniques for handling missing data. The mechanisms by which missing data occurs are illustrated, and the methods for handling the missing data are discussed. The paper concludes with recommendations for ...
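    Two of the simplest techniques such reviews cover can be shown side by side: complete-case analysis (listwise deletion) and single mean imputation. This is a minimal sketch with invented numbers; `None` stands in for a missing observation, and neither function is from the paper itself.

    ```python
    def complete_case_mean(values):
        """Complete-case analysis: drop missing values, average the rest."""
        observed = [v for v in values if v is not None]
        return sum(observed) / len(observed)

    def mean_impute(values):
        """Single mean imputation: fill each gap with the observed mean.

        Note this shrinks the variance of the imputed variable, one of
        the biases that motivates more careful methods.
        """
        m = complete_case_mean(values)
        return [m if v is None else v for v in values]

    scores = [4.0, None, 5.0, 3.0, None, 4.0]   # two missing responses
    avg = complete_case_mean(scores)
    filled = mean_impute(scores)
    ```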

  17. Methods for Handling Missing Secondary Respondent Data

    Science.gov (United States)

    Young, Rebekah; Johnson, David

    2013-01-01

    Secondary respondent data are underutilized because researchers avoid using these data in the presence of substantial missing data. The authors reviewed, evaluated, and tested solutions to this problem. Five strategies of dealing with missing partner data were reviewed: (a) complete case analysis, (b) inverse probability weighting, (c) correction…
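    Strategy (b), inverse probability weighting, has a very compact core idea: each observed case is weighted by the reciprocal of its estimated probability of responding, so under-responding groups count more. The sketch below uses made-up response probabilities (in practice they would be estimated, e.g. by logistic regression); it is not the authors' implementation.

    ```python
    def ipw_mean(observations):
        """Inverse-probability-weighted mean.

        observations: list of (value, response_probability) pairs for
        the cases that were actually observed.
        """
        num = sum(v / p for v, p in observations)
        den = sum(1 / p for _, p in observations)
        return num / den

    # Two observed partners from a group that responds 50% of the time,
    # one from a group that always responds:
    observed = [(10.0, 0.5), (12.0, 0.5), (20.0, 1.0)]
    est = ipw_mean(observed)
    # Each 50%-responder stands in for two people, pulling the estimate
    # toward that group relative to the unweighted mean of 14.0.
    ```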

  18. Data handling for the lepton detector

    International Nuclear Information System (INIS)

    Cutts, D.; Droege, T.F.; Kasha, H.; Kirsch, L.E.; Littenberg, L.; Matthews, J.A.J.; Rabin, M.S.Z.; Scharenguivel, J.

    1978-01-01

    The data acquisition and processing needs of a typical lepton-oriented large-solid-angle detector were evaluated, and a configuration of the data readout systems (including microprocessors), control computer(s), and local intersection computer was recommended. Features considered explicitly include triggering and data rates, data readout and monitoring (standard device processor, event processor, and intersection event processor), and off-line computing. 6 figures, 2 tables

  19. MICE data handling on the Grid

    International Nuclear Information System (INIS)

    Martyniak, J

    2014-01-01

    The international Muon Ionisation Cooling Experiment (MICE) is designed to demonstrate the principle of muon ionisation cooling for the first time, for application to a future Neutrino factory or Muon Collider. The experiment is currently under construction at the ISIS synchrotron at the Rutherford Appleton Laboratory (RAL), UK. In this paper we present a system – the Raw Data Mover, which allows us to store and distribute MICE raw data – and a framework for offline reconstruction and data management. The aim of the Raw Data Mover is to upload raw data files onto a safe tape storage as soon as the data have been written out by the DAQ system and marked as ready to be uploaded. Internal integrity of the files is verified and they are uploaded to the RAL Tier-1 Castor Storage Element (SE) and placed on two tapes for redundancy. We also make another copy at a separate disk-based SE at this stage to make it easier for users to access data quickly. Both copies are check-summed and the replicas are registered with an instance of the LCG File Catalog (LFC). On success a record with basic file properties is added to the MICE Metadata DB. The reconstruction process is triggered by new raw data records filled in by the mover system described above. Off-line reconstruction jobs for new raw files are submitted to RAL Tier-1 and the output is stored on tape. Batch reprocessing is done at multiple MICE enabled Grid sites and output files are shipped to central tape or disk storage at RAL using a custom File Transfer Controller.
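    The check-summing step in such a mover reduces to: digest the file before transfer, digest the replica after, and accept the upload only if they match. The sketch below imitates that with a local copy standing in for the transfer to the Storage Element; file names and the helper names are invented for the example.

    ```python
    import hashlib
    import os
    import tempfile

    def file_checksum(path, algo="sha256", chunk_size=1 << 20):
        """Stream a file through a hash so large raw files fit in memory."""
        h = hashlib.new(algo)
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                h.update(chunk)
        return h.hexdigest()

    def verified_copy(src, dst):
        """'Upload' (here: local copy) and verify replica integrity."""
        with open(src, "rb") as fin, open(dst, "wb") as fout:
            fout.write(fin.read())
        return file_checksum(src) == file_checksum(dst)

    # Demo with a throw-away file standing in for a raw data file:
    tmp = tempfile.mkdtemp()
    src = os.path.join(tmp, "run0001.raw")
    dst = os.path.join(tmp, "run0001.raw.copy")
    with open(src, "wb") as f:
        f.write(b"DAQ payload bytes")
    ok = verified_copy(src, dst)   # replica digest matches the source
    ```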

  20. Wireless Sensor Network Handles Image Data

    Science.gov (United States)

    2008-01-01

    To relay data from remote locations for NASA's Earth sciences research, Goddard Space Flight Center contributed to the development of "microservers" (wireless sensor network nodes), which are now used commercially as a quick and affordable means to capture and distribute geographical information, including rich sets of aerial and street-level imagery. NASA began this work out of a necessity for real-time recovery of remote sensor data. These microservers work much like a wireless office network, relaying information between devices. The key difference, however, is that instead of linking workstations within one office, the interconnected microservers operate miles away from one another. This attribute traces back to the technology's original use: the microservers were originally designed for seismology on remote glaciers and ice streams in Alaska, Greenland, and Antarctica, acquiring, storing, and relaying data wirelessly between ground sensors. The microservers boast three key attributes. First, a researcher in the field can establish a "managed network" of microservers and rapidly see the data streams (recovered wirelessly) on a field computer. This rapid feedback permits the researcher to reconfigure the network for different purposes over the course of a field campaign. Second, through careful power management, the microservers can dwell unsupervised in the field for up to 2 years, collecting tremendous amounts of data at a research location. The third attribute is the exciting potential to deploy a microserver network that works in synchrony with robotic explorers (e.g., providing ground truth validation for satellites, supporting rovers as they traverse the local environment). Managed networks of remote microservers that relay data unsupervised for up to 2 years can drastically reduce the costs of field instrumentation and data recovery.

  1. A New Format for Handling Nuclear Data

    CERN Document Server

    Bak, S I; Tenreiro, C; Kadi, Y; Hong, S W; Manchanda, V; Gheata, M; Chai, J S; Carminati, F; Park, T S; Brun, R

    2011-01-01

    The ASCII ENDF format for nuclear data has been used for four decades. It is practical for human inspection and portability, but it is not very effective for manipulating and displaying the data or for using them in Monte Carlo applications. In this paper we present a prototype of a nuclear data manipulation package (TNudy) based on the ROOT system (http://root.cern.ch). The ROOT object-oriented C++ framework has been the de facto standard in high-energy and nuclear physics for ten years. Starting from the ENDF format, the data is stored in a machine-portable binary format. ROOT files also offer a powerful direct-access capability to their different sections and compression upon writing, minimising disk occupancy. ROOT offers a complete library of visualisation and mathematical routines and the Virtual Monte-Carlo system, which allows running different transport Monte-Carlo codes (Geant4, Geant3) with common scoring and geometry modellers, which come as part of ROOT. ROOT contains isotope decay data and the …

  2. Data handling for the ''big lepton'' detector

    International Nuclear Information System (INIS)

    Cutts, D.

    1977-01-01

    The large lepton detector proposed and described in previous workshops was used as a known standard to study how data might be gathered, sorted, and recorded for ISABELLE detectors. The main source of concern is the high singles rate throughout most of the detector, together with the enormous number of individual devices that are running at this rate and must be serviced for each event. A calculation of this rate is given; the result is 1.0 x 10^7 particles/sec into the detector (at a luminosity of 10^32/cm^2 sec). In this particular situation the use of charge-coupled devices is natural; individual data elements feed continuously into these ''analog delay lines'', yet the data relevant to an occasional event are processed in parallel through a limited number of channels. The general technique seems desirable for many large-scale experiments which manage in some way to achieve a reasonable trigger rate.

  3. Computer facilities for ISABELLE data handling

    International Nuclear Information System (INIS)

    Kramer, M.A.; Love, W.A.; Miller, R.J.; Zeller, M.

    1977-01-01

    The analysis of data produced by ISABELLE experiments will need a large system of computers. An official group of prospective users and operators of that system should begin planning now. Included in the array will be a substantial computer system at each ISABELLE intersection in use. These systems must include enough computer power to keep experimenters aware of the health of the experiment. This will require at least one very fast sophisticated processor in the system, the size depending on the experiment. Other features of the intersection systems must be a good, high speed graphic display, ability to record data on magnetic tape at 500 to 1000 KB, and a high speed link to a central computer. The operating system software must support multiple interactive users. A substantially larger capacity computer system, shared by the six intersection region experiments, must be available with good turnaround for experimenters while ISABELLE is running. A computer support group will be required to maintain the computer system and to provide and maintain software common to all experiments. Special superfast computing hardware or special function processors constructed with microprocessor circuitry may be necessary both in the data gathering and data processing work. Thus both the local and central processors should be chosen with the possibility of interfacing such devices in mind

  4. An AFDX Network for Spacecraft Data Handling

    Science.gov (United States)

    Deredempt, Marie-Helene; Kollias, Vangelis; Sun, Zhili; Canamares, Ernest; Ricco, Philippe

    2014-08-01

    In the aeronautical domain, the ARINC-664 Part 7 specification (AFDX) [4] provides the enabling technology for interfacing equipment in Integrated Modular Avionics (IMA) architectures. The complementary part of AFDX for complete interoperability, the Time and Space Partitioning (ARINC 653) concepts [1], was already studied as part of the space domain ESA roadmap (i.e. the IMA4Space project). A standardized IMA-based architecture is already considered in the aeronautical domain as more flexible, reliable and secure. Integration and validation become simple, using a common set of tools and databases, and can be done in parts on different means with the same definition (hardware and software test benches, flight control or alarm test benches, simulator and flight test installation). In some areas, requirements in terms of data processing are quite similar in the space domain, and the concept could be applicable to take benefit of the technology itself and of the panel of hardware and software solutions and tools available on the market. The MISSION project (Methodology and assessment for the applicability of ARINC-664 (AFDX) in Satellite/Spacecraft on-board communicatION networks), an FP7 initiative for bringing terrestrial SME research into the space domain, started to evaluate the applicability of the standard in the space domain.

  5. Handling Imbalanced Data Sets in Multistage Classification

    Science.gov (United States)

    López, M.

    Multistage classification is a logical approach, based on a divide-and-conquer solution, for dealing with problems with a high number of classes. The classification problem is divided into several sequential steps, each one associated to a single classifier that works with subgroups of the original classes. In each level, the current set of classes is split into smaller subgroups of classes until they (the subgroups) are composed of only one class. The resulting chain of classifiers can be represented as a tree, which (1) simplifies the classification process by using fewer categories in each classifier and (2) makes it possible to combine several algorithms or use different attributes in each stage. Most of the classification algorithms can be biased in the sense of selecting the most populated class in overlapping areas of the input space. This can degrade a multistage classifier performance if the training set sample frequencies do not reflect the real prevalence in the population. Several techniques such as applying prior probabilities, assigning weights to the classes, or replicating instances have been developed to overcome this handicap. Most of them are designed for two-class (accept-reject) problems. In this article, we evaluate several of these techniques as applied to multistage classification and analyze how they can be useful for astronomy. We compare the results obtained by classifying a data set based on Hipparcos with and without these methods.
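    One of the rebalancing techniques mentioned, replicating instances, can be sketched directly: minority-class training examples are duplicated (here by cycling through them) until every class is as frequent as the largest one. The data, labels, and function name below are invented for illustration and are not from the article.

    ```python
    from collections import Counter

    def oversample_by_replication(samples, labels):
        """Replicate minority-class examples up to the majority count."""
        counts = Counter(labels)
        target = max(counts.values())
        out_x, out_y = [], []
        for cls in counts:
            members = [x for x, y in zip(samples, labels) if y == cls]
            # Cycle through the class's examples until 'target' copies exist.
            for i in range(target):
                out_x.append(members[i % len(members)])
                out_y.append(cls)
        return out_x, out_y

    # A toy 3:1 imbalanced training set (features are 1-D for brevity):
    X = [[0.1], [0.2], [0.3], [0.9]]
    y = ["star", "star", "star", "galaxy"]
    Xb, yb = oversample_by_replication(X, y)
    # Both classes now contribute equally to training.
    ```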

  6. Data handling for the wide-angle hall jet experiment

    International Nuclear Information System (INIS)

    Cassel, D.; Engelmann, R.; Gordon, H.; Grannis, P.; Mallik, U.; Meadows, B.; Morris, T.; Plano, R.; Saulys, A.; Stein, S.

    1978-01-01

    The data handling needs for a jet production experiment in the wide-angle hall are discussed. The several layers of triggering and a data acquisition system were designed, the configuration of the local intersection computer system was discussed, and the time required to analyze a typical event was estimated. It was concluded that the experiment does not appear to place unrealistic demands on detector technology. The slow trigger was believed to be the crucial aspect of the experiment. 15 figures, 1 table.

  7. Invention activities as preparation for learning laboratory data handling skills

    Science.gov (United States)

    Day, James

    2012-10-01

    Undergraduate physics laboratories are often driven by a mix of goals, and usually enough of them to cause cognitive overload for the student. Our recent findings align well with studies indicating that students often exit a physics lab without having properly learned how to handle real data. The value of having students explore the underlying structure of a problem before being able to solve it has been shown as an effective way to ready students for learning. Borrowing on findings from the fields of education and cognitive psychology, we use ``invention activities'' to precede direct instruction and bolster learning. In this talk I will show some of what we have learned about students' data handling skills, explain how an invention activity works, and share some observations of successful transfer.

  8. Conditions Data Handling In The Multithreaded ATLAS Framework

    CERN Document Server

    Leggett, Charles; The ATLAS collaboration

    2018-01-01

    In preparation for Run 3 of the LHC, the ATLAS experiment is migrating its offline software to use a multithreaded framework, which will allow multiple events to be processed simultaneously. This implies that the handling of non-event, time-dependent (conditions) data, such as calibrations and geometry, must also be extended to allow for multiple versions of such data to exist simultaneously. This has now been implemented as part of the new ATLAS framework. The detector geometry is included in this scheme by having sets of time-dependent displacements on top of a static base geometry.
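    Keeping multiple versions of conditions data alive at once usually means an interval-of-validity (IOV) lookup: each in-flight event fetches the calibration valid for its own timestamp. The sketch below shows the core lookup with `bisect`; the class, field names, and payloads are invented, not the ATLAS framework's actual API.

    ```python
    import bisect

    class ConditionsStore:
        """Toy interval-of-validity store: each payload is valid from its
        start time until the next payload's start time."""

        def __init__(self):
            self._starts = []    # sorted IOV start times
            self._payloads = []  # calibration payload per IOV

        def add(self, start_time, payload):
            i = bisect.bisect(self._starts, start_time)
            self._starts.insert(i, start_time)
            self._payloads.insert(i, payload)

        def get(self, event_time):
            # Latest IOV whose start time is <= event_time.
            i = bisect.bisect_right(self._starts, event_time) - 1
            if i < 0:
                raise KeyError("no conditions valid at this time")
            return self._payloads[i]

    store = ConditionsStore()
    store.add(0, {"gain": 1.00})     # calibration for the first epoch
    store.add(100, {"gain": 1.02})   # updated calibration
    # Events from either epoch can now be processed concurrently,
    # each reading the version valid for its own timestamp.
    ```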

  9. SaaS Platform for Time Series Data Handling

    Science.gov (United States)

    Oplachko, Ekaterina; Rykunov, Stanislav; Ustinin, Mikhail

    2018-02-01

    The paper is devoted to the description of MathBrain, a cloud-based resource, which works as a "Software as a Service" model. It is designed to maximize the efficiency of the current technology and to provide a tool for time series data handling. The resource provides access to the following analysis methods: direct and inverse Fourier transforms, Principal component analysis and Independent component analysis decompositions, quantitative analysis, magnetoencephalography inverse problem solution in a single dipole model based on multichannel spectral data.
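    The first of the listed methods, the direct and inverse Fourier transforms, can be written out in a few lines of stdlib Python. This is a textbook O(N²) DFT for illustration only (a service like the one described would use an FFT); the signal is an invented pure tone.

    ```python
    import cmath
    import math

    def dft(x):
        """Direct discrete Fourier transform of a length-n sequence."""
        n = len(x)
        return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n))
                for k in range(n)]

    def idft(X):
        """Inverse transform; recovers the original samples."""
        n = len(X)
        return [sum(X[k] * cmath.exp(2j * cmath.pi * k * t / n)
                    for k in range(n)) / n
                for t in range(n)]

    # A pure tone occupying bin 2 of an 8-sample record:
    signal = [math.cos(2 * math.pi * 2 * t / 8) for t in range(8)]
    spectrum = dft(signal)
    # |spectrum| peaks at bins 2 and 6 (the mirrored negative frequency).
    ```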

  10. Integrated Payload Data Handling Systems Using Software Partitioning

    Science.gov (United States)

    Taylor, Alun; Hann, Mark; Wishart, Alex

    2015-09-01

    An integrated Payload Data Handling System (I-PDHS) is one in which multiple instruments share a central payload processor for their on-board data processing tasks. This offers a number of advantages over the conventional decentralised architecture. Savings in payload mass and power can be realised because the total processing resource is matched to the requirements, whereas in the decentralised architecture the processing resource is in effect the sum of all the applications. Overall development cost can be reduced using a common processor. At the individual instrument level, the potential benefits include a standardised application development environment and the opportunity to run the instrument data handling application on a fully redundant and more powerful processing platform [1]. This paper describes a joint programme by SCISYS UK Limited, Airbus Defence and Space, Imperial College London and RAL Space to implement a realistic demonstration of an I-PDHS using engineering models of flight instruments (a magnetometer and a camera) and a laboratory demonstrator of a central payload processor which is functionally representative of a flight design. The objective is to raise the Technology Readiness Level of the centralised data processing technique by addressing the key areas of task partitioning to prevent fault propagation and the use of a common development process for the instrument applications. The project is supported by a UK Space Agency grant awarded under the National Space Technology Programme SpaceCITI scheme [1].

  11. The data handling processor of the Belle II DEPFET detector

    Energy Technology Data Exchange (ETDEWEB)

    Germic, Leonard; Hemperek, Tomasz; Kishishita, Tetsuichi; Paschen, Botho; Luetticke, Florian; Krueger, Hans; Marinas, Carlos; Wermes, Norbert [Universitaet Bonn (Germany); Collaboration: Belle II-Collaboration

    2016-07-01

    A two-layer highly granular DEPFET pixel detector will be operated as the innermost subsystem of the Belle II experiment at the new Japanese super flavor factory (SuperKEKB). Such a finely segmented system will improve vertex reconstruction in this ultra-high luminosity environment, but at the same time the raw data stream generated by the 8 million pixel detector will exceed the capability of real-time processing due to its high frame rate, given the limited material budget and strict space constraints. For this reason a new ASIC, the Data Handling Processor (DHP), is designed to provide data processing at the level of the front-end electronics, such as zero-suppression and common mode correction. An additional feature of the Data Handling Processor is the control block, providing control signals for the on-module ASICs used in the pixel detector. In this contribution, the latest chip revision in TSMC 65 nm technology is described together with the latest results of the interface functionality tests.
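The two front-end processing steps named here, common-mode correction and zero-suppression, can be sketched in a few lines. The frame size, pedestal and threshold below are illustrative values, not Belle II parameters:

```python
import numpy as np

# Hypothetical sketch: subtract a per-row baseline (common-mode correction),
# then transmit only pixels above threshold (zero-suppression).
rng = np.random.default_rng(0)
frame = rng.normal(10.0, 2.0, size=(8, 8))   # raw ADC frame, pedestal ~10
frame[3, 5] += 50.0                           # one genuine hit

# Common-mode correction: remove the row-wise median baseline shift.
corrected = frame - np.median(frame, axis=1, keepdims=True)

# Zero-suppression: keep only (row, col, value) triplets above threshold.
threshold = 20.0
rows, cols = np.nonzero(corrected > threshold)
hits = [(int(r), int(c), float(corrected[r, c])) for r, c in zip(rows, cols)]
print(hits)   # only the injected hit survives
```

The output stream then carries a handful of hit triplets per frame instead of the full pixel matrix, which is what makes the data rate manageable downstream.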

  12. Practical aspects of handling data protection and data security.

    Science.gov (United States)

    Louwerse, C P

    1991-01-01

    Looking at practical applications of health care information systems, we must conclude that in the field of data protection there still is too large a gap between what is feasible and necessary on one hand, and what is achieved in actual realizations on the other. To illustrate this point, we sketch the actual data protection measures in a large hospital information system, and describe the effects of changes affecting the system, such as increasing use of personal computers and growing intensity of use of the system. Trends in the development of new and additional systems are indicated, a summary of possible weak points and gaps in the security is given, and some suggestions for improvement are made.

  13. Extension of ERIM multispectral data processing capabilities through improved data handling techniques

    Science.gov (United States)

    Kriegler, F. J.

    1973-01-01

    The improvement and extension of the capabilities of the Environmental Research Institute of Michigan processing facility in handling multispectral data are discussed. Improvements consisted of implementing hardware modifications which permitted more rapid access to the recorded data through improved numbering and indexing of such data. In addition, techniques are discussed for handling data from sources other than the ERIM M-5 and M-7 scanner systems.

  14. Handling high data rate detectors at Diamond Light Source

    Science.gov (United States)

    Pedersen, U. K.; Rees, N.; Basham, M.; Ferner, F. J. K.

    2013-03-01

    An increasing number of area detectors in use at Diamond Light Source produce high rates of data. In order to capture, store and process this data, High Performance Computing (HPC) systems have been implemented. This paper will present the architecture and usage for handling high-rate data: detector data capture, large-volume storage and parallel processing. The EPICS areaDetector framework has been adopted to abstract the detectors for common tasks including live processing, file format and storage. The chosen data format is HDF5, which provides multidimensional data storage and NeXus compatibility. The storage system and related computing infrastructure include a centralised Lustre-based parallel file system, a dedicated network and an HPC cluster. A well-defined roadmap is in place for the evolution of this to meet demand as the requirements and technology advance. For processing the science data, the HPC cluster allows efficient parallel computing on a mixture of x86 and GPU processing units. The nature of the Lustre storage system in combination with the parallel HDF5 library allows efficient disk I/O during computation jobs. Software developments, which include utilising optimised parallel file reading for a variety of post-processing techniques, are being developed in collaboration as part of the Pan-Data EU Project (www.pan-data.eu). These are particularly applicable to tomographic reconstruction and processing of non-crystalline diffraction data.
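The storage pattern described, chunked multidimensional HDF5 written frame by frame, can be sketched with h5py. The file layout and dataset names below are illustrative, not Diamond's actual NeXus structure:

```python
import os
import tempfile

import h5py
import numpy as np

# Hypothetical sketch: detector frames appended to a chunked, extendable
# HDF5 dataset, one chunk per frame, as they arrive from the detector.
path = os.path.join(tempfile.mkdtemp(), "scan.h5")
with h5py.File(path, "w") as f:
    frames = f.create_dataset(
        "entry/data/frames",
        shape=(0, 64, 64), maxshape=(None, 64, 64),
        chunks=(1, 64, 64), dtype="uint16")
    for i in range(5):                      # pretend 5 frames arrive
        frames.resize(i + 1, axis=0)        # grow along the frame axis
        frames[i] = np.full((64, 64), i, dtype="uint16")

with h5py.File(path, "r") as f:
    shape = f["entry/data/frames"].shape
print(shape)   # (5, 64, 64)
```

Chunking along the frame axis is what lets a parallel reader later pull individual frames efficiently, which matters for the post-processing workloads mentioned above.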

  15. BASIC overlay for CAMAC data and command handling

    Energy Technology Data Exchange (ETDEWEB)

    Ciftcioglu, O [Istanbul Technical Univ. (Turkey). Inst. for Nuclear Energy

    1979-11-15

    A BASIC overlay has been developed for the BASIC language run on the PDP-11 series of computers. The overlay was written particularly for a dedicated CAMAC crate controller, the Ortec DC-011. By means of the overlay, any command comprising C, N, A, F information can easily be issued by the host system to communicate, through the CAMAC interface, with the peripherals connected to the CAMAC system. The overlay is particularly useful for rather slow control systems, and for data handling between two different operating systems with incompatible data file formats that have the CAMAC system as a mutual component controllable by each operating system individually. The overlay can easily be modified for use with a standard controller (type A-1) or any other type of dedicated controller.
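A CNAF command bundles crate (C), station number (N), subaddress (A) and function (F) fields with well-known ranges, but how a given crate controller packs them into a command word is hardware-specific. The bit layout below is therefore purely illustrative and not the DC-011 register map:

```python
# Hypothetical CNAF packing: field ranges are the CAMAC standard ones
# (N 1-23, A 0-15, F 0-31); the bit positions are invented for illustration.

def pack_cnaf(c, n, a, f):
    """Pack a CNAF command into one integer (illustrative layout)."""
    assert 0 <= c <= 7 and 1 <= n <= 23 and 0 <= a <= 15 and 0 <= f <= 31
    return (c << 14) | (n << 9) | (a << 5) | f

def unpack_cnaf(word):
    """Recover the (C, N, A, F) fields from a packed command word."""
    return (word >> 14) & 0x7, (word >> 9) & 0x1F, (word >> 5) & 0xF, word & 0x1F

word = pack_cnaf(1, 12, 0, 16)    # e.g. a write function to station 12
print(unpack_cnaf(word))           # (1, 12, 0, 16)
```

An overlay like the one described would build such words on the host side and hand them to the controller's register interface.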

  16. Evaluating the Open Source Data Containers for Handling Big Geospatial Raster Data

    Directory of Open Access Journals (Sweden)

    Fei Hu

    2018-04-01

    Big geospatial raster data pose a grand challenge to data management technologies for effective big data query and processing. To address these challenges, various big data container solutions have been developed or enhanced to facilitate data storage, retrieval, and analysis, including for geospatial data: for example, Rasdaman was developed to handle raster data, and GeoSpark/SpatialHadoop were enhanced from Spark/Hadoop to handle vector data. However, there are few studies that systematically compare and evaluate the features and performance of these popular data containers. This paper provides a comprehensive evaluation of six popular data containers (i.e., Rasdaman, SciDB, Spark, ClimateSpark, Hive, and MongoDB) for handling multi-dimensional, array-based geospatial raster datasets. Their architectures, technologies, capabilities, and performance are compared and evaluated from two perspectives: (a) system design and architecture (distributed architecture, logical data model, physical data model, and data operations); and (b) practical use experience and performance (data preprocessing, data uploading, query speed, and resource consumption). Four major conclusions are offered: (1) no data container except ClimateSpark has good support for the HDF data format used in this paper, requiring time- and resource-consuming data preprocessing to load data; (2) SciDB, Rasdaman, and MongoDB handle small/moderate volumes of data well, whereas Spark and ClimateSpark can handle large volumes of data with stable resource consumption; (3) SciDB and Rasdaman provide mature array-based data operations and analytical functions, while the others lack these functions; and (4) SciDB, Spark, and Hive have better support for user-defined functions (UDFs) to extend system capability.

  17. Software for handling and replacement of missing data

    Directory of Open Access Journals (Sweden)

    Mayer, Benjamin

    2009-10-01

    In medical research, missing values often arise in the course of a data analysis. This constitutes a problem for several reasons: standard methods for analyzing data require complete data sets and therefore omit incomplete cases, leading to biased estimates and a loss of statistical power. Furthermore, missing values imply a certain loss of information, so the validity of the results of a study with missing values must be rated lower than in a case where all data were available. For years there have been methods for the replacement of missing values (Rubin, Schafer) that tackle these problems and solve them in part. Hence, in this article we present the existing software for handling and replacing missing values on the one hand, and give an outline of the available options for obtaining information on the other. The methodological aspects of the replacement strategies are delineated only briefly in this article.
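The simplest of the replacement strategies surveyed in this literature, item-mean substitution, can be sketched in a few lines. The toy score table below is invented; real analyses would use multiple imputation (Rubin) via a dedicated statistics package:

```python
# Minimal sketch of item-mean imputation: each None (missing value) is
# replaced by the mean of the observed values in its column (item).

def impute_item_mean(rows):
    """Replace None entries column-wise with the mean of observed values."""
    n_items = len(rows[0])
    means = []
    for j in range(n_items):
        observed = [r[j] for r in rows if r[j] is not None]
        means.append(sum(observed) / len(observed))
    return [[means[j] if r[j] is None else r[j] for j in range(n_items)]
            for r in rows]

data = [[1, 0, 1],
        [None, 1, 1],
        [1, 1, None]]
print(impute_item_mean(data))   # [[1, 0, 1], [1.0, 1, 1], [1, 1, 1.0]]
```

Unlike case deletion, this keeps every participant in the analysis, at the cost of understating the variance of the imputed items.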

  18. Variable identification in group method of data handling methodology

    Energy Technology Data Exchange (ETDEWEB)

    Pereira, Iraci Martinez, E-mail: martinez@ipen.b [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); Bueno, Elaine Inacio [Instituto Federal de Educacao, Ciencia e Tecnologia, Guarulhos, SP (Brazil)

    2011-07-01

    The Group Method of Data Handling - GMDH is a combinatorial multi-layer algorithm in which a network of layers and nodes is generated using a number of inputs from the data stream being evaluated. The GMDH network topology has been traditionally determined using a layer-by-layer pruning process based on a preselected criterion of what constitutes the best nodes at each level. The traditional GMDH method is based on an underlying assumption that the data can be modeled by an approximation of the Volterra series or Kolmogorov-Gabor polynomial. A Monitoring and Diagnosis System was developed based on GMDH and Artificial Neural Network (ANN) methodologies and applied to the IPEN research reactor IEA-R1. The GMDH was used to determine the best set of variables for training an ANN, resulting in the best monitoring variable estimates. The system performs the monitoring by comparing these estimated values with measured ones. The IPEN reactor data acquisition system comprises 58 variables (process and nuclear variables). As the GMDH is a self-organizing methodology, the input variables are chosen automatically, and the actual input variables used in the Monitoring and Diagnosis System were not shown in the final result. This work presents a study of variable identification in the GMDH methodology by means of an algorithm that works in parallel with the GMDH algorithm and traces the paths of the initial variables, resulting in an identification of the variables that compose the best Monitoring and Diagnosis Model. (author)

  19. Variable identification in group method of data handling methodology

    International Nuclear Information System (INIS)

    Pereira, Iraci Martinez; Bueno, Elaine Inacio

    2011-01-01

    The Group Method of Data Handling - GMDH is a combinatorial multi-layer algorithm in which a network of layers and nodes is generated using a number of inputs from the data stream being evaluated. The GMDH network topology has been traditionally determined using a layer-by-layer pruning process based on a preselected criterion of what constitutes the best nodes at each level. The traditional GMDH method is based on an underlying assumption that the data can be modeled by an approximation of the Volterra series or Kolmogorov-Gabor polynomial. A Monitoring and Diagnosis System was developed based on GMDH and Artificial Neural Network (ANN) methodologies and applied to the IPEN research reactor IEA-R1. The GMDH was used to determine the best set of variables for training an ANN, resulting in the best monitoring variable estimates. The system performs the monitoring by comparing these estimated values with measured ones. The IPEN reactor data acquisition system comprises 58 variables (process and nuclear variables). As the GMDH is a self-organizing methodology, the input variables are chosen automatically, and the actual input variables used in the Monitoring and Diagnosis System were not shown in the final result. This work presents a study of variable identification in the GMDH methodology by means of an algorithm that works in parallel with the GMDH algorithm and traces the paths of the initial variables, resulting in an identification of the variables that compose the best Monitoring and Diagnosis Model. (author)
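The building block of a GMDH network is a two-input node fitting the quadratic Kolmogorov-Gabor term by least squares; the layer-by-layer pruning then keeps only the best such nodes. A single node can be sketched on synthetic data (the full network logic is omitted):

```python
import numpy as np

# One GMDH node: fit y = a0 + a1*x1 + a2*x2 + a3*x1^2 + a4*x2^2 + a5*x1*x2
# by ordinary least squares. Data below are synthetic, chosen so the fit
# should recover known coefficients.

def fit_gmdh_node(x1, x2, y):
    X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

rng = np.random.default_rng(1)
x1, x2 = rng.uniform(-1, 1, 200), rng.uniform(-1, 1, 200)
y = 2.0 + 0.5 * x1 - 1.5 * x2 + 3.0 * x1 * x2      # known noiseless target
coeffs = fit_gmdh_node(x1, x2, y)
print(np.round(coeffs, 3))   # approximately [2, 0.5, -1.5, 0, 0, 3]
```

In a full GMDH run, every pair of inputs gets such a node, the best nodes (by an external criterion on held-out data) survive to form the next layer's inputs, and the variable-tracing algorithm described above follows which original inputs feed the surviving nodes.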

  20. Reservoir water level forecasting using group method of data handling

    Science.gov (United States)

    Zaji, Amir Hossein; Bonakdari, Hossein; Gharabaghi, Bahram

    2018-06-01

    Accurately forecasted reservoir water levels are among the most vital data for efficient reservoir structure design and management. In this study, the group method of data handling is combined with the minimum description length method to develop a very practical and functional model for predicting reservoir water levels. The models' performance is evaluated using two groups of input combinations, based on recent days and recent weeks; four different input combinations are considered in total. The data collected from Chahnimeh#1 Reservoir in eastern Iran are used for model training and validation. To assess the models' applicability in practical situations, the models are made to predict a non-observed dataset for the nearby Chahnimeh#4 Reservoir. According to the results, the input combinations (L, L-1) and (L, L-1, L-12) for recent days, with root-mean-squared errors (RMSE) of 0.3478 and 0.3767 respectively, outperform the input combinations (L, L-7) and (L, L-7, L-14) for recent weeks, with RMSE of 0.3866 and 0.4378 respectively. Accordingly, (L, L-1) is selected as the best input combination for making 7-day ahead predictions of reservoir water levels.

  1. Review: Lyn Richards (2005). Handling Qualitative Data: A Practical Guide

    Directory of Open Access Journals (Sweden)

    Robert L. Miller

    2006-03-01

    Handling Qualitative Data: A Practical Guide is an introductory textbook covering all stages of qualitative research, from the initial conceptualisation of a project, through data collection and analysis, to writing up. The author, Lyn RICHARDS, is a well-known developer of two key qualitative analysis software packages, NUD*IST and NVivo. While RICHARDS clearly advocates the use of qualitative analysis software, the text is "generic" and could be used in tandem with any qualitative software package. The book concentrates on practical advice about the use of software to manage and analyse qualitative data, and provides insights in these areas. The consideration of issues around team-based qualitative research is another strong point. However, due in part to its short length, the overall coverage of topics tends to be superficial. In itself, the book does not provide sufficient detailed support for a student who would like to use it as her/his main source of guidance for carrying out a qualitative research project. URN: urn:nbn:de:0114-fqs0602244

  2. New System For Tokamak T-10 Experimental Data Acquisition, Data Handling And Remote Access

    International Nuclear Information System (INIS)

    Sokolov, M. M.; Igonkina, G. B.; Koutcherenko, I. Yu.; Nurov, D. N.

    2008-01-01

    For carrying out experiments on nuclear fusion devices at the Institute of Nuclear Fusion, Moscow, a system for experimental data acquisition, data handling and remote access (hereafter 'DAS-T10') was developed and has been in use at the Institute since 2000. The DAS-T10 maintains the whole cycle of experimental data handling: from configuration of the data measuring equipment and acquisition of raw data from the fusion device (the Device), to presentation of math-processed data and support of the experiment data archive. The DAS-T10 provides facilities for researchers to access the data both at early stages of an experiment and well afterwards, locally from within the experiment network and remotely over the Internet. The DAS-T10 has been undergoing modernization since 2007. The new version of the DAS-T10 will accommodate modern data measuring equipment and implement improved architectural solutions. The innovations will allow the DAS-T10 to produce and handle larger amounts of experimental data, providing opportunities to intensify and extend fusion research. The new features of the DAS-T10, along with the existing design principles, are reviewed in this paper.

  3. Geospatial Big Data Handling Theory and Methods: A Review and Research Challenges

    DEFF Research Database (Denmark)

    Li, Songnian; Dragicevic, Suzana; Anton, François

    2016-01-01

    Big data has now become a strong focus of global interest that is increasingly attracting the attention of academia, industry, government and other organizations. Big data can be situated in the disciplinary area of traditional geospatial data handling theory and methods. The increasing volume...... for Photogrammetry and Remote Sensing (ISPRS) Technical Commission II (TC II) revisits the existing geospatial data handling methods and theories to determine if they are still capable of handling emerging geospatial big data. Further, the paper synthesises problems, major issues and challenges with current...... developments as well as recommending what needs to be developed further in the near future....

  4. Hazard Control Extensions in a COTS Based Data Handling System

    Science.gov (United States)

    Vogel, Torsten; Rakers, Sven; Gronowski, Matthias; Schneegans, Joachim

    2011-08-01

    EML is an electromagnetic levitator for containerless processing of conductive samples on the International Space Station. This material sciences experiment is running in the European Drawer Rack (EDR) facility. The objective of this experiment is to gain insight into the parameters of liquid metal samples and their crystallisation processes without the influence of container walls. To this end the samples are electromagnetically positioned in a coil system and then heated up beyond their melting point in an ultraclean environment.The EML programme is currently under development by Astrium Space Transportation in Friedrichshafen and Bremen; jointly funded by ESA and DLR (on behalf of BMWi, contract 50WP0808). EML consists of four main modules listed in Table 1. The paper focuses mainly on the architecture and design of the ECM module and its contribution to a safe operation of the experiment. The ECM is a computer system that integrates the power supply to the EML experiment, control functions and video handling and compression features. Experiment control is performed by either telecommand or the execution of predefined experiment scripts.

  5. Data handling and processing for the ATLAS experiment

    CERN Document Server

    Barberis, D; The ATLAS collaboration

    2011-01-01

    The ATLAS experiment has taken data steadily since Autumn 2009, collecting close to 1 fb-1 of data (several petabytes of raw and reconstructed data per year of data-taking). Data are calibrated, reconstructed, distributed and analysed at over 100 different sites using the World-wide LHC Computing Grid and the tools produced by the ATLAS Distributed Computing project. This paper reports on the experience of using this distributed computing infrastructure with real data and in real time, on the evolution of the computing model driven by this experience, and on the system performance during the first two years of operation.

  6. Data handling and processing for the ATLAS experiment

    CERN Document Server

    Barberis, D; The ATLAS collaboration

    2011-01-01

    The ATLAS experiment has been taking data steadily since Autumn 2009, collecting so far over 2.5 fb-1 of data (several petabytes of raw and reconstructed data per year of data-taking). Data are calibrated, reconstructed, distributed and analysed at over 100 different sites using the World-wide LHC Computing Grid and the tools produced by the ATLAS Distributed Computing project. This paper reports on the experience of setting up and operating this distributed computing infrastructure with real data and in real time, on the evolution of the computing model driven by this experience, and on the system performance during the first two years of operation.

  7. The handling of missing binary data in language research

    Directory of Open Access Journals (Sweden)

    François Pichette

    2015-01-01

    Researchers are frequently confronted with unanswered questions or items on their questionnaires and tests, due to factors such as item difficulty, lack of testing time, or participant distraction. This paper first presents results from a poll confirming previous claims (Rietveld & van Hout, 2006; Schafer & Graham, 2002) that data replacement and deletion methods are common in research. Language researchers declared that when faced with missing answers of the yes/no type (that translate into zero or one in data tables), the three most common solutions they adopt are to exclude the participant's data from the analyses, to leave the square empty, or to fill in with zero, as for an incorrect answer. This study then examines the impact on Cronbach's α of five types of data insertion, using simulated and actual data with various numbers of participants and missing percentages. Our analyses indicate that the three most common methods we identified among language researchers are the ones with the greatest impact on Cronbach's α coefficients; in other words, they are the least desirable solutions to the missing data problem. On the basis of our results, we make recommendations for language researchers concerning the best way to deal with missing data. Given that none of the most common simple methods works properly, we suggest that the missing data be replaced either by the item's mean or by the participants' overall mean to provide a better, more accurate image of the instrument's internal consistency.
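The paper's central comparison, how the choice of replacement strategy shifts Cronbach's α, can be illustrated on a toy yes/no table. The values below are invented; only the contrast between zero-filling and item-mean filling matters:

```python
import statistics

# Cronbach's alpha = k/(k-1) * (1 - sum(item variances) / variance of totals),
# computed here after two different replacements for one missing answer (None).

def cronbach_alpha(rows):
    k = len(rows[0])
    item_vars = [statistics.pvariance([r[j] for r in rows]) for j in range(k)]
    total_var = statistics.pvariance([sum(r) for r in rows])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

answers = [[1, 1, 1], [1, 0, 1], [0, 0, 0], [1, None, 1]]
observed = [r[1] for r in answers if r[1] is not None]

zero_filled = [[0 if v is None else v for v in r] for r in answers]
mean_filled = [[statistics.mean(observed) if v is None else v for v in r]
               for r in answers]
print(cronbach_alpha(zero_filled), cronbach_alpha(mean_filled))
# the two strategies give different alphas (≈0.789 vs 0.85)
```

Even with a single missing cell the two strategies disagree, which is the effect the study quantifies at scale across participants and missing percentages.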

  8. Pattern data handling system for an electron beam exposure system

    International Nuclear Information System (INIS)

    Berrian, D.W.; Ward, B.W.

    1985-01-01

    A beam is blanked over a rectangular area by storing beam blanking data in a bit map memory having data words of n bits. An address counter is loaded with the x-coordinate of the location of the rectangle and inhibited (by a word counter) from counting after advancing through a number of addresses corresponding to the length of the rectangle. Mask bits, whose position determines the y-coordinate of the location of the rectangle and whose number determines its width, are loaded into a mask latch having n bit locations, each with an output coupled to the write enable inputs of the memory corresponding to the n bits of the data words. As the address counter advances through the number of addresses corresponding to the length of the rectangle, data describing the rectangle are stored in the memory without altering masked parts of previously stored blanking data. (author)
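A software analogue of this masked-write scheme makes the mechanism concrete: the mask latch selects which bit positions (the rectangle's y-extent) are write-enabled while the address counter sweeps the x-extent. Word width and memory depth below are illustrative:

```python
# Bit-map "memory" of n-bit words, one word per x address. A rectangle is
# written by OR-ing a mask into a run of words, leaving other bits untouched,
# mirroring the hardware's write-enable mask latch.

N_BITS, DEPTH = 8, 16
memory = [0] * DEPTH

def blank_rectangle(x, length, y, width):
    """Set blanking bits for a rectangle at (x, y) of size length x width."""
    mask = ((1 << width) - 1) << y        # mask latch: which bits to enable
    for addr in range(x, x + length):     # address counter sweep over x
        memory[addr] |= mask              # masked bits written, rest preserved

blank_rectangle(x=2, length=3, y=1, width=2)   # bits 1-2 at addresses 2-4
print([format(w, "08b") for w in memory[:6]])
```

The OR-with-mask step is the key property claimed in the abstract: previously stored blanking data outside the masked bit positions survives each write.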

  9. Computer handling of Savannah River Plant environmental monitoring data

    International Nuclear Information System (INIS)

    Zeigler, C.C.

    1975-12-01

    At the Savannah River Plant, computer programs are used to calculate, store, and retrieve radioactive and nonradioactive environmental monitoring data. Objectives are to provide daily, monthly, and annual summaries of all routine monitoring data; to calculate and tabulate releases according to radioisotopic species or nonradioactive pollutant, source point, and mode of entry to the environment (atmosphere, stream, or earthen seepage basins). The computer programs use a compatible numeric coding system for the data, and printouts are in the form required for internal and external reports. Data input and program maintenance are accomplished with punched cards, paper or magnetic tapes, and when applicable, with computer terminals. Additional aids for data evaluation provided by the programs are statistical counting errors, maximum and minimum values, standard deviations of averages, and other statistical analyses

  10. Generalized Nuclear Data: A New Structure (with Supporting Infrastructure) for Handling Nuclear Data

    International Nuclear Information System (INIS)

    Mattoon, C.M.; Beck, B.R.; Patel, N.R.; Summers, N.C.; Hedstrom, G.W.; Brown, D.A.

    2012-01-01

    The Evaluated Nuclear Data File (ENDF) format was designed in the 1960s to accommodate neutron reaction data to support nuclear engineering applications in power, national security and criticality safety. Over the years, the scope of the format has been extended to handle many other kinds of data including charged particle, decay, atomic, photo-nuclear and thermal neutron scattering. Although ENDF has wide acceptance and support for many data types, its limited support for correlated particle emission, limited numeric precision, and general lack of extensibility mean that the nuclear data community cannot take advantage of many emerging opportunities. More generally, the ENDF format provides an unfriendly environment that makes it difficult for new data evaluators and users to create and access nuclear data. The Cross Section Evaluation Working Group (CSEWG) has begun the design of a new Generalized Nuclear Data (or 'GND') structure, meant to replace older formats with a hierarchy that mirrors the underlying physics, and is aligned with modern coding and database practices. In support of this new structure, Lawrence Livermore National Laboratory (LLNL) has updated its nuclear data/reactions management package Fudge to handle GND structured nuclear data. Fudge provides tools for converting both the latest ENDF format (ENDF-6) and the LLNL Evaluated Nuclear Data Library (ENDL) format to and from GND, as well as for visualizing, modifying and processing (i.e., converting evaluated nuclear data into a form more suitable to transport codes) GND structured nuclear data. GND defines the structure needed for storing nuclear data evaluations and the type of data that needs to be stored. But unlike ENDF and ENDL, GND does not define how the data are to be stored in a file. Currently, Fudge writes the structured GND data to a file using the eXtensible Markup Language (XML), as it is ASCII based and can be viewed with any text editor. XML is a meta-language, meaning that it

  11. Generalized Nuclear Data: A New Structure (with Supporting Infrastructure) for Handling Nuclear Data

    Energy Technology Data Exchange (ETDEWEB)

    Mattoon, C.M. [Lawrence Livermore National Laboratory, 7000 East Avenue, Livermore CA (United States); Beck, B.R.; Patel, N.R.; Summers, N.C.; Hedstrom, G.W. [Lawrence Livermore National Laboratory, 7000 East Avenue, Livermore CA (United States); Brown, D.A. [National Nuclear Data Center, Upton NY (United States)

    2012-12-15

    The Evaluated Nuclear Data File (ENDF) format was designed in the 1960s to accommodate neutron reaction data to support nuclear engineering applications in power, national security and criticality safety. Over the years, the scope of the format has been extended to handle many other kinds of data including charged particle, decay, atomic, photo-nuclear and thermal neutron scattering. Although ENDF has wide acceptance and support for many data types, its limited support for correlated particle emission, limited numeric precision, and general lack of extensibility mean that the nuclear data community cannot take advantage of many emerging opportunities. More generally, the ENDF format provides an unfriendly environment that makes it difficult for new data evaluators and users to create and access nuclear data. The Cross Section Evaluation Working Group (CSEWG) has begun the design of a new Generalized Nuclear Data (or 'GND') structure, meant to replace older formats with a hierarchy that mirrors the underlying physics, and is aligned with modern coding and database practices. In support of this new structure, Lawrence Livermore National Laboratory (LLNL) has updated its nuclear data/reactions management package Fudge to handle GND structured nuclear data. Fudge provides tools for converting both the latest ENDF format (ENDF-6) and the LLNL Evaluated Nuclear Data Library (ENDL) format to and from GND, as well as for visualizing, modifying and processing (i.e., converting evaluated nuclear data into a form more suitable to transport codes) GND structured nuclear data. GND defines the structure needed for storing nuclear data evaluations and the type of data that needs to be stored. But unlike ENDF and ENDL, GND does not define how the data are to be stored in a file. Currently, Fudge writes the structured GND data to a file using the eXtensible Markup Language (XML), as it is ASCII based and can be viewed with any text editor. XML is a meta-language, meaning that it
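The separation these records draw between the GND hierarchy and its on-disk serialization can be illustrated with a small XML sketch. The element names below are invented for illustration and are not the actual GND schema:

```python
import xml.etree.ElementTree as ET

# A toy physics-mirroring hierarchy (reaction suite -> reaction -> cross
# section), serialized to XML the way Fudge currently writes GND data.
# All names and numbers here are made up.
root = ET.Element("reactionSuite", projectile="n", target="U238")
reaction = ET.SubElement(root, "reaction", label="n + U238 -> n + U238")
xs = ET.SubElement(reaction, "crossSection", unit="b")
xs.text = "1e-5 1.2  2e7 2.8"     # invented (energy, value) pairs

text = ET.tostring(root, encoding="unicode")
print(text)

# Structure and serialization are decoupled: the same tree could be emitted
# as JSON or HDF5 without changing the hierarchy, which is the design point.
parsed = ET.fromstring(text)
print(parsed.find("reaction").get("label"))   # n + U238 -> n + U238
```

This decoupling is what lets GND replace the fixed-width ENDF card format while leaving the file encoding an implementation choice.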

  12. Disk storage at CERN: Handling LHC data and beyond

    International Nuclear Information System (INIS)

    Espinal, X; Adde, G; Chan, B; Iven, J; Presti, G Lo; Lamanna, M; Mascetti, L; Pace, A; Peters, A; Ponce, S; Sindrilaru, E

    2014-01-01

    The CERN-IT Data Storage and Services (DSS) group stores and provides access to data coming from the LHC and other physics experiments. We implement specialised storage services to provide tools for optimal data management, based on the evolution of data volumes, the available technologies and the observed experiment and user usage patterns. Our current solutions are CASTOR, for highly-reliable tape-backed storage for heavy-duty Tier-0 workflows, and EOS, for disk-only storage for full-scale analysis activities. CASTOR is evolving towards a simplified disk layer in front of the tape robotics, focusing on recording the primary data from the detectors. EOS is now a well-established storage service used intensively by the four big LHC experiments. Its conceptual design, based on multiple replicas and an in-memory namespace, makes it the perfect system for data-intensive workflows. The LHC Long Shutdown 1 (LS1) presents a window of opportunity to shape up both of our storage services and validate them against the ongoing analysis activity in order to successfully face the new LHC data-taking period in 2015. In this paper, the current state and foreseen evolution of CASTOR and EOS are presented, together with a study of the reliability of our systems.

  13. Computer network prepared to handle massive data flow

    CERN Multimedia

    2006-01-01

    "Massive quantities of data will soon begin flowing from the largest scientific instrument ever built into an international network of computer centers, including one operated jointly by the University of Chicago and Indiana University." (2 pages)

  14. The role of visualisation in data handling in Grade 9 within a problem-centred context

    Directory of Open Access Journals (Sweden)

    Antonia Makina

    2009-09-01

    In the recent past, data handling has been neglected at secondary school level, perhaps partially due to the strong emphasis on developing arithmetic, algebra and geometry. For the first time, the South African curriculum includes substantial amounts of data handling at all grade levels. The introduction of more data handling in the secondary school curriculum in South Africa, and the prevalence of many problems in the teaching of probability and statistics, argue for a serious reconsideration of the way it is taught to pupils. Currently this concern has been the focus of a call for reform in mathematics education at all levels of schooling by bodies like the National Council of Teachers of Mathematics (NCTM) (NCTM, 1989; 2000). The importance of visualisation in mathematics, at all levels of mathematical problem solving, is well documented in the literature (Bishop, 1989; Maher & Alston, 1989; Moses, 1982; Wheatley, 1991), but almost nothing has been done to appreciate visualisation in the learning of data handling. This paper therefore provides a qualitative examination, from a Masters dissertation (Makina, 2005), of the role of visualisation in the learning of data handling. This is done by examining the thought processes of Grade 9 learners during visualisation while solving data handling tasks. Several roles of visualisation were identified, and most were found to improve the critical and creative thinking of pupils during their learning of data handling. The results show that learners are likely to improve their performance in data handling if awareness of the need to use visualisation creatively as a tool for understanding is highlighted.

  15. The Unified Database for BM@N experiment data handling

    Science.gov (United States)

    Gertsenberger, Konstantin; Rogachevsky, Oleg

    2018-04-01

    The article describes the developed Unified Database, designed as a comprehensive relational data storage for the BM@N experiment at the Joint Institute for Nuclear Research in Dubna. The BM@N experiment, one of the main elements of the first stage of the NICA project, is a fixed-target experiment at extracted Nuclotron beams of the Laboratory of High Energy Physics (LHEP JINR). The structure and purposes of the BM@N setup are briefly presented. The article considers the scheme of the Unified Database, its attributes and implemented features in detail. The use of the developed BM@N database provides correct multi-user access to current information of the experiment for data processing. It stores information on the experiment runs, detectors and their geometries, and the different configuration, calibration and algorithm parameters used in offline data processing. User interfaces, an important part of any database system, are also presented.

  16. Handling data redundancy and update anomalies in fuzzy relational databases

    International Nuclear Information System (INIS)

    Chen, G.; Kerre, E.E.

    1996-01-01

    This paper discusses various data redundancy and update anomaly problems that may occur with fuzzy relational databases. In coping with these problems, to avoid undesirable consequences when fuzzy databases are updated via data insertion, deletion and modification, a number of fuzzy normal forms (e.g., F1NF, 0-F2NF, 0-F3NF, 0-FBCNF) are used to guide the design of relation schemes such that partial and transitive fuzzy functional dependencies (FFDs) between relation attributes are restricted. Based upon FFDs and related concepts, particular attention is paid to 0-F3NF and 0-FBCNF, and to the corresponding decomposition algorithms. These algorithms not only produce relation schemes which are either in 0-F3NF or in 0-FBCNF, but also guarantee that the information (data content and FFDs) associated with the original schemes can be recovered from the resultant schemes.

  17. LFI Radiometric Chain Assembly (RCA) data handling ``Rachel''

    Science.gov (United States)

    Malaspina, M.; Franceschi, E.; Battaglia, P.; Binko, P.; Butler, R. C.; D'Arcangelo, O.; Fogliani, S.; Frailis, M.; Franceschet, C.; Galeotta, S.; Gasparo, F.; Gregorio, A.; Lapolla, M.; Leonardi, R.; Maggio, G.; Mandolesi, N.; Manzato, P.; Maris, M.; Meharga, M.; Meinhold, P.; Morisset, N.; Pasian, F.; Perrotta, F.; Rohlfs, R.; Sandri, M.; Tomasi, M.; Türler, M.; Zacchei, A.; Zonca, A.

    2009-12-01

    Planck's Low Frequency Instrument is an array of 22 pseudo-correlation radiometers at 30, 44, and 70 GHz. Before integrating the overall array assembly, a first set of tests has been performed for each radiometer chain assembly (RCA), consisting of two radiometers. In this paper, we describe Rachel, a software application which has been purposely developed and used during the RCA test campaign to carry out both near-realtime on-line data analysis and data storage (in FITS format) of the raw output from the radiometric chains.

  18. LFI Radiometric Chain Assembly (RCA) data handling 'Rachel'

    International Nuclear Information System (INIS)

    Malaspina, M; Franceschi, E; Butler, R C; Mandolesi, N; Battaglia, P; Franceschet, C; Lapolla, M; Binko, P; Meharga, M; D'Arcangelo, O; Fogliani, S; Frailis, M; Galeotta, S; Gasparo, F; Maggio, G; Manzato, P; Maris, M; Gregorio, A; Leonardi, R; Meinhold, P

    2009-01-01

    Planck's Low Frequency Instrument is an array of 22 pseudo-correlation radiometers at 30, 44, and 70 GHz. Before integrating the overall array assembly, a first set of tests has been performed for each radiometer chain assembly (RCA), consisting of two radiometers. In this paper, we describe Rachel, a software application which has been purposely developed and used during the RCA test campaign to carry out both near-realtime on-line data analysis and data storage (in FITS format) of the raw output from the radiometric chains.

  19. Optimization of Planck-LFI on-board data handling

    Energy Technology Data Exchange (ETDEWEB)

    Maris, M; Galeotta, S; Frailis, M; Zacchei, A; Fogliani, S; Gasparo, F [INAF-OATs, Via G.B. Tiepolo 11, 34131 Trieste (Italy); Tomasi, M; Bersanelli, M [Universita di Milano, Dipartimento di Fisica, Via G. Celoria 16, 20133 Milano (Italy); Miccolis, M [Thales Alenia Space Italia S.p.A., S.S. Padana Superiore 290, 20090 Vimodrone (Italy); Hildebrandt, S; Chulani, H; Gomez, F [Instituto de Astrofisica de Canarias (IAC), C/o Via Lactea, s/n E38205 - La Laguna, Tenerife (Spain); Rohlfs, R; Morisset, N; Binko, P [ISDC Data Centre for Astrophysics, University of Geneva, ch. d' Ecogia 16, 1290 Versoix (Switzerland); Burigana, C; Butler, R C; Cuttaia, F; Franceschi, E [INAF-IASF Bologna, Via P. Gobetti, 101, 40129 Bologna (Italy); D' Arcangelo, O, E-mail: maris@oats.inaf.i [IFP-CNR, via Cozzi 53, 20125 Milano (Italy)

    2009-12-15

    To assess stability against 1/f noise, the Low Frequency Instrument (LFI) on board the Planck mission will acquire data at a rate much higher than the data rate allowed by the science telemetry bandwidth of 35.5 kbps. The data are processed by an on-board pipeline, followed on ground by a decoding and reconstruction step, to reduce the volume of data to a level compatible with the bandwidth while minimizing the loss of information. This paper illustrates the on-board processing of the scientific data used by Planck/LFI to fit the allowed data rate, an intrinsically lossy process which distorts the signal in a manner which depends on a set of five free parameters (N_aver, r_1, r_2, q, O) for each of the 44 LFI detectors. The paper quantifies the level of distortion introduced by the on-board processing as a function of these parameters. It describes the method of tuning the on-board processing chain to cope with the limited bandwidth while keeping the signal distortion to a minimum. Tuning is sensitive to the statistics of the signal and has to be constantly adapted during flight. The tuning procedure is based on an optimization algorithm applied to unprocessed and uncompressed raw data provided either by simulations, pre-launch tests, or data taken in flight from LFI operating in a special diagnostic acquisition mode. All the needed optimization steps are performed by an automated tool, OCA2, which simulates the on-board processing, explores the space of possible combinations of parameters, and produces a set of statistical indicators, among them the compression rate C_r and the processing noise epsilon_Q. For Planck/LFI it is required that C_r = 2.4 while, as for other systematics, epsilon_Q would have to be less than 10% of the rms of the instrumental white noise. An analytical model is developed that is able to extract most of the relevant information on the processing errors and the compression rate as a function of the signal
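The 10% processing-noise budget quoted above can be checked against a standard result: uniform quantization with step q adds noise with rms q/sqrt(12). A minimal numerical sketch, where the step size and signal statistics are illustrative assumptions, not actual Planck/LFI settings:

```python
import numpy as np

# Toy check of quantization ("processing") noise against the white-noise rms.
# The step size q below is an illustrative choice, not a Planck/LFI value.
rng = np.random.default_rng(0)
sigma = 1.0                          # rms of the instrumental white noise
x = rng.normal(0.0, sigma, 200_000)  # simulated raw samples
q = 0.3 * sigma                      # quantization step
xq = np.round(x / q) * q             # uniform quantization, as in lossy packing
eps_q = np.std(xq - x)               # measured processing noise

print(f"measured eps_q/sigma  = {eps_q / sigma:.3f}")
print(f"predicted q/sqrt(12)  = {q / np.sqrt(12):.3f}")  # ~0.087, inside the 10% budget
```

With this step size the added noise sits just under the 10%-of-white-noise requirement; a larger q would raise the compression rate but push eps_q over budget, which is the trade-off the OCA2 tuning explores.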

  20. Information jet: Handling noisy big data from weakly disconnected network

    Science.gov (United States)

    Aurongzeb, Deeder

    Sudden aggregation (an information jet) of large amounts of data is ubiquitous in connected social networks, driven by sudden interacting and non-interacting events, network security threat attacks, online sales channels, etc. Clustering of information jets based on time-series analysis and graph theory is not new, but little work has been done to connect them with particle-jet statistics. We show that context-based pre-clustering can eliminate soft networks, or networks of information, which is critical for minimizing the time needed to extract results from noisy big data. We show the difference between stochastic gradient boosting and time-series graph clustering. For disconnected higher-dimensional information jets, we use the Kallenberg representation theorem (Kallenberg, 2005, arXiv:1401.1137) to identify and eliminate jet similarities from dense or sparse graphs.

  1. Tracking, Vertexing and data handling strategy for the LHCb upgrade

    CERN Document Server

    Seyfert, Paul

    2017-01-01

    For Run III (2021 onwards) of the LHC, LHCb will take data at an instantaneous luminosity of $2 \\times 10^{33} \\mathrm{cm}^{-2} \\mathrm{s}^{-1}$, five times higher than in Run II (2015-2018). To cope with the harsher data taking conditions, the LHCb collaboration will upgrade the DAQ system and install a purely software based trigger, in addition to various detector upgrades. The high readout rate contributes to the challenge of reconstructing and selecting events in real time. Special emphasis in this contribution will be put on the need for fast track reconstruction in the software trigger. The modified detector infrastructure will be able to face this challenge and the necessary changes to the reconstruction sequence are discussed. A novel strategy is presented which distributes and maximises the bandwidth among the different physics channels using a genetic algorithm. The data processing chain includes a re-design of the event scheduling, introduction of concurrent processing, optimisations in processor ...

  2. Online data handling and storage at the CMS experiment

    Science.gov (United States)

    Andre, J.-M.; Andronidis, A.; Behrens, U.; Branson, J.; Chaze, O.; Cittolin, S.; Darlea, G.-L.; Deldicque, C.; Demiragli, Z.; Dobson, M.; Dupont, A.; Erhan, S.; Gigi, D.; Glege, F.; Gómez-Ceballos, G.; Hegeman, J.; Holzner, A.; Jimenez-Estupiñán, R.; Masetti, L.; Meijers, F.; Meschi, E.; Mommsen, RK; Morovic, S.; Nuñez-Barranco-Fernández, C.; O'Dell, V.; Orsini, L.; Paus, C.; Petrucci, A.; Pieri, M.; Racz, A.; Roberts, P.; Sakulin, H.; Schwick, C.; Stieger, B.; Sumorok, K.; Veverka, J.; Zaza, S.; Zejdl, P.

    2015-12-01

    During the LHC Long Shutdown 1, the CMS Data Acquisition (DAQ) system underwent a partial redesign to replace obsolete network equipment, use more homogeneous switching technologies, and support new detector back-end electronics. The software and hardware infrastructure to provide input, execute the High Level Trigger (HLT) algorithms and deal with output data transport and storage has also been redesigned to be completely file- based. All the metadata needed for bookkeeping are stored in files as well, in the form of small documents using the JSON encoding. The Storage and Transfer System (STS) is responsible for aggregating these files produced by the HLT, storing them temporarily and transferring them to the T0 facility at CERN for subsequent offline processing. The STS merger service aggregates the output files from the HLT from ∼62 sources produced with an aggregate rate of ∼2GB/s. An estimated bandwidth of 7GB/s in concurrent read/write mode is needed. Furthermore, the STS has to be able to store several days of continuous running, so an estimated of 250TB of total usable disk space is required. In this article we present the various technological and implementation choices of the three components of the STS: the distributed file system, the merger service and the transfer system.
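The buffer figures in the abstract can be cross-checked with simple arithmetic; only the three quoted numbers are taken from the text, the rest is a back-of-the-envelope sketch:

```python
# Back-of-the-envelope check of the STS buffer sizing quoted above.
sources = 62             # HLT output streams feeding the merger
aggregate_rate = 2e9     # bytes/s produced by the HLT in total (~2 GB/s)
disk_capacity = 250e12   # usable buffer space (~250 TB)

per_source = aggregate_rate / sources                 # mean rate per stream
fill_time_days = disk_capacity / aggregate_rate / 86400

print(f"{per_source / 1e6:.0f} MB/s per source")       # ~32 MB/s
print(f"{fill_time_days:.1f} days to fill at peak")    # ~1.4 days at sustained peak
```

At a sustained 2 GB/s the buffer fills in about a day and a half, so the "several days of continuous running" headroom presumably reflects the fact that real data taking has inter-fill gaps and sub-peak rates.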

  3. Online Data Handling and Storage at the CMS Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Andre, J. M.; et al.

    2015-12-23

    During the LHC Long Shutdown 1, the CMS Data Acquisition (DAQ) system underwent a partial redesign to replace obsolete network equipment, use more homogeneous switching technologies, and support new detector back-end electronics. The software and hardware infrastructure to provide input, execute the High Level Trigger (HLT) algorithms and deal with output data transport and storage has also been redesigned to be completely file-based. All the metadata needed for bookkeeping are stored in files as well, in the form of small documents using the JSON encoding. The Storage and Transfer System (STS) is responsible for aggregating these files produced by the HLT, storing them temporarily and transferring them to the T0 facility at CERN for subsequent offline processing. The STS merger service aggregates the output files from the HLT from ~62 sources at an aggregate rate of ~2 GB/s. An estimated bandwidth of 7 GB/s in concurrent read/write mode is needed. Furthermore, the STS has to be able to store several days of continuous running, so an estimated 250 TB of total usable disk space is required. In this article we present the various technological and implementation choices of the three components of the STS: the distributed file system, the merger service and the transfer system.

  4. Online data handling and storage at the CMS experiment

    CERN Document Server

    Andre, Jean-marc Olivier; Behrens, Ulf; Branson, James; Chaze, Olivier; Demiragli, Zeynep; Dobson, Marc; Dupont, Aymeric; Erhan, Samim; Gigi, Dominique; Glege, Frank; Gomez Ceballos, Guillelmo; Hegeman, Jeroen Guido; Holzner, Andre Georg; Jimenez Estupinan, Raul; Masetti, Lorenzo; Meijers, Franciscus; Meschi, Emilio; Mommsen, Remigius; Morovic, Srecko; Nunez Barranco Fernandez, Carlos; O'Dell, Vivian; Orsini, Luciano; Paus, Christoph Maria Ernst; Petrucci, Andrea; Pieri, Marco; Racz, Attila; Roberts, Penelope Amelia; Sakulin, Hannes; Schwick, Christoph; Stieger, Benjamin Bastian; Sumorok, Konstanty; Veverka, Jan; Zaza, Salvatore; Zejdl, Petr

    2015-01-01

    During the LHC Long Shutdown 1, the CMS Data Acquisition (DAQ) system underwent a partial redesign to replace obsolete network equipment, use more homogeneous switching technologies, and support new detector back-end electronics. The software and hardware infrastructure to provide input, execute the High Level Trigger (HLT) algorithms and deal with output data transport and storage has also been redesigned to be completely file-based. All the metadata needed for bookkeeping are stored in files as well, in the form of small 'documents' using the JSON encoding. The Storage and Transfer System (STS) is responsible for aggregating these files produced by the HLT, storing them temporarily and transferring them to the T0 facility at CERN for subsequent offline processing. The STS merger service aggregates the output files from the HLT from ~62 sources at an aggregate rate of ~2 GB/s. An estimated bandwidth of 7 GB/s in concurrent read/write mode is needed. Furthermore, the STS has to be able to store ...

  5. Online data handling and storage at the CMS experiment

    International Nuclear Information System (INIS)

    Andre, J-M; Andronidis, A; Chaze, O; Deldicque, C; Dobson, M; Dupont, A; Gigi, D; Glege, F; Hegeman, J; Jimenez-Estupiñán, R; Masetti, L; Meijers, F; Behrens, U; Branson, J; Cittolin, S; Holzner, A; Darlea, G-L; Demiragli, Z; Gómez-Ceballos, G; Erhan, S

    2015-01-01

    During the LHC Long Shutdown 1, the CMS Data Acquisition (DAQ) system underwent a partial redesign to replace obsolete network equipment, use more homogeneous switching technologies, and support new detector back-end electronics. The software and hardware infrastructure to provide input, execute the High Level Trigger (HLT) algorithms and deal with output data transport and storage has also been redesigned to be completely file-based. All the metadata needed for bookkeeping are stored in files as well, in the form of small documents using the JSON encoding. The Storage and Transfer System (STS) is responsible for aggregating these files produced by the HLT, storing them temporarily and transferring them to the T0 facility at CERN for subsequent offline processing. The STS merger service aggregates the output files from the HLT from ∼62 sources at an aggregate rate of ∼2 GB/s. An estimated bandwidth of 7 GB/s in concurrent read/write mode is needed. Furthermore, the STS has to be able to store several days of continuous running, so an estimated 250 TB of total usable disk space is required. In this article we present the various technological and implementation choices of the three components of the STS: the distributed file system, the merger service and the transfer system. (paper)

  6. Computing and data handling recent experiences at Fermilab and SLAC

    International Nuclear Information System (INIS)

    Cooper, P.S.

    1990-01-01

    Computing has become ever more central to the doing of high energy physics. There are now major second- and third-generation experiments for which the largest single cost is computing. At the same time, the availability of "cheap" computing has made possible experiments which were previously considered infeasible. The result of this trend has been an explosion of computing and computing needs. I will review here the magnitude of the problem, as seen at Fermilab and SLAC, and the present methods for dealing with it. I will then undertake the dangerous assignment of projecting the needs and solutions forthcoming in the next few years at both laboratories. I will concentrate on the "offline" problem: the process of turning terabytes of data tapes into pages of physics journals. 5 refs., 4 figs., 4 tabs

  7. Fuel handling, reprocessing, and waste and related nuclear data aspects

    International Nuclear Information System (INIS)

    Kuesters, H.; Lalovic, M.; Wiese, H.W.

    1979-06-01

    The essential processes in the out-of-pile nuclear fuel cycle are described, i.e. mining and milling of uranium ores, enrichment, fuel fabrication, storage, transportation, reprocessing of irradiated fuel, waste treatment and waste disposal. The aspects of radiation (mainly gammas and neutrons) and of heat production, as well as special safety considerations, are outlined with respect to their potential operational impacts and long-term hazards. In this context the importance of nuclear data for the out-of-pile fuel cycle is discussed. Special weight is given to the LWR fuel cycle including recycling; the differences of LMFBR high burn-up fuel with its large PuO2 content are described. The HTR fuel cycle is discussed briefly, as well as some alternative fuel cycle concepts. (orig.)

  8. Data handling and visualization for NASA's science programs

    Science.gov (United States)

    Bredekamp, Joseph H. (Editor)

    1995-01-01

    Advanced information systems capabilities are essential to conducting NASA's scientific research mission. Access to these capabilities is no longer a luxury for a select few within the science community, but rather an absolute necessity for carrying out scientific investigations. The dependence on high performance computing and networking, as well as ready and expedient access to science data, metadata, and analysis tools, is the fundamental underpinning for the entire research endeavor. At the same time, advances in the whole range of information technologies continue on an almost explosive growth path, reaching beyond the research community to affect the population as a whole. Capitalizing on and exploiting these advances is critical to the continued success of space science investigations. NASA must remain abreast of developments in the field and strike an appropriate balance between being a smart buyer and a direct investor in the technology which serves its unique requirements. Another key theme deals with the need for the space and computer science communities to collaborate as partners to more fully realize the potential of information technology in the space science research environment.

  9. Teachers’ professional development needs in data handling and probability

    Directory of Open Access Journals (Sweden)

    Helena Wessels

    2011-07-01

    Full Text Available Poor Trends in International Mathematics and Science Study (TIMSS) results and widespread disappointing mathematics results in South Africa necessitate research-based and more efficient professional development for in-service mathematics teachers. This article reports on the profiling of mathematics teachers' statistical knowledge, beliefs and confidence in order to inform the development of in-service teacher education programmes in statistics for Grade 8 and Grade 9 teachers. Ninety mathematics teachers from schools with culturally diverse learner populations in an urban region in South Africa were profiled using an adapted profiling instrument (Watson, 2001). Although statistics formed part of quite a number of these teachers' initial teacher education, and about half of them were involved in professional development in statistics education, they still teach traditionally rather than using a more data-driven approach. Teachers indicated high levels of confidence in teaching most statistics topics but showed low levels of statistical thinking when they had to apply their knowledge of concepts such as sample and average in social contexts, including newspaper articles and research reports.

  10. Individual Information-Centered Approach for Handling Physical Activity Missing Data

    Science.gov (United States)

    Kang, Minsoo; Rowe, David A.; Barreira, Tiago V.; Robinson, Terrance S.; Mahar, Matthew T.

    2009-01-01

    The purpose of this study was to validate individual information (II)-centered methods for handling missing data, using data samples of 118 middle-aged adults and 91 older adults equipped with Yamax SW-200 pedometers and Actigraph accelerometers for 7 days. We used a semisimulation approach to create six data sets: three physical activity outcome…
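As far as the abstract describes it, an individual-information (II) approach replaces a person's missing values using that person's own observed data rather than group-level statistics. A minimal sketch of that contrast, with invented step counts:

```python
import numpy as np

# Two subjects, four days of step counts; NaN marks a missing day.
# All numbers are invented for illustration.
steps = np.array([[8000., 9000., np.nan, 7000.],
                  [3000., np.nan, 4000., 3500.]])

# Individual-information fill: each subject's own mean replaces their gaps.
ii = np.where(np.isnan(steps),
              np.nanmean(steps, axis=1, keepdims=True), steps)

# Group-level fill, for contrast: the overall mean replaces every gap.
grp = np.where(np.isnan(steps), np.nanmean(steps), steps)

print(ii[0, 2], ii[1, 1])    # 8000.0 3500.0 (subject-specific means)
print(grp[0, 2], grp[1, 1])  # 5750.0 5750.0 (one group mean for everyone)
```

The group-level fill drags the active subject down and the sedentary subject up, which is exactly the distortion an II-centered method avoids.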

  11. Operation of data acquisition and handling system in the INS-SF cyclotron

    International Nuclear Information System (INIS)

    Yasue, M.; Omata, K.

    1976-01-01

    Operation of the following data processing routines is described: 1) one-dimensional multiplexer PHA; 2) two-dimensional multiplexer PHA; 3) two- or three-parameter data handling: digital gating, dumping of raw data onto magnetic tape, and processing in function modes. These processing routines are executed under the control of a real-time disk operating system on the TOSBAC-40C. (auth.)

  12. Handling Missing Data in Structural Equation Models in R: A Replication Study for Applied Researchers

    Science.gov (United States)

    Wolgast, Anett; Schwinger, Malte; Hahnel, Carolin; Stiensmeier-Pelster, Joachim

    2017-01-01

    Introduction: Multiple imputation (MI) is one of the most highly recommended methods for replacing missing values in research data. The scope of this paper is to demonstrate missing data handling in SEM by analyzing two modified data examples from educational psychology, and to give practical recommendations for applied researchers. Method: We…

  13. Data handling and post-reconstruction analysis at next generation experiments

    International Nuclear Information System (INIS)

    Fischler, M.; Lammel, S.

    1995-11-01

    A new generation of experiments in high energy physics is approaching. With the approval of the LHC at CERN and the revised Main Injector project at Fermilab, high-statistics experiments will start operation within 5 to 10 years. With luminosities up to 10^34 cm^-2 s^-1 and several hundred thousand readout channels, data most likely cannot be handled and analysed using traditional HEP approaches. The CAP group at Fermilab is investigating different approaches to data handling and organization for post-reconstruction analysis. We discuss the approaches considered, their strengths and weaknesses, integration with hierarchical storage, and sharing of primary data resources

  14. The on-board data handling system of the AFIS-P mission

    Energy Technology Data Exchange (ETDEWEB)

    Gaisbauer, Dominic; Greenwald, Daniel; Hahn, Alexander; Hauptmann, Philipp; Konorov, Igor; Meng, Lingxin; Paul, Stephan; Poeschl, Thomas [Physics Department E18, Technische Universitaet Muenchen (Germany); Losekamm, Martin [Physics Department E18, Technische Universitaet Muenchen (Germany); Institute of Astronautics, Technische Universitaet Muenchen (Germany); Renker, Dieter [Physics Department E17, Technische Universitaet Muenchen (Germany)

    2014-07-01

    The Antiproton Flux in Space experiment (AFIS) is a novel particle detector composed of silicon photomultipliers and scintillating plastic fibers. Its purpose is to measure the trapped antiproton flux in low Earth orbit. To test the detector and the data acquisition system, a prototype detector will be flown aboard a high-altitude research balloon as part of the REXUS/BEXUS program by the German Aerospace Center (DLR). This talk presents the on-board data handling system and the ground support equipment of AFIS-P. It will also highlight the data handling algorithms developed and used for the mission.

  15. Handling and archiving of magnetic fusion data at DIII-D

    International Nuclear Information System (INIS)

    VanderLaan, J.F.; Miller, S.; McHarg, B.B. Jr.; Henline, P.A.

    1995-10-01

    Recent modifications to the computer network at DIII-D enhance the collection and distribution of newly acquired and archived experimental data. Linked clients and servers route new data from diagnostic computers to centralized mass storage and distribute data on demand to local and remote workstations and computers. Capacity for data handling exceeds the upper limit of DIII-D Tokamak data production of about 4 GBytes per day. Network users have fast access to new data stored on line. An interactive program handles requests for restoration of data archived off line. Disk management procedures retain selected data on line in preference to other data. Redundancy of all components on the archiving path from the network to magnetic media has prevented loss of data. Older data are rearchived as dictated by limited media life

  16. Computing and data handling requirements for SSC [Superconducting Super Collider] and LHC [Large Hadron Collider] experiments

    International Nuclear Information System (INIS)

    Lankford, A.J.

    1990-05-01

    A number of issues for computing and data handling in the online environment at future high-luminosity, high-energy colliders, such as the Superconducting Super Collider (SSC) and Large Hadron Collider (LHC), are outlined. Requirements for trigger processing, data acquisition, and online processing are discussed. Some aspects of possible solutions are sketched. 6 refs., 3 figs

  17. Substructure analysis techniques and automation. [to eliminate logistical data handling and generation chores

    Science.gov (United States)

    Hennrich, C. W.; Konrath, E. J., Jr.

    1973-01-01

    A basic automated substructure analysis capability for NASTRAN is presented which eliminates most of the logistical data handling and generation chores that are currently associated with the method. Rigid formats are proposed which will accomplish this using three new modules, all of which can be added to level 16 with a relatively small effort.

  18. 40 CFR 65.161 - Continuous records and monitoring system data handling.

    Science.gov (United States)

    2010-07-01

    ... section. (D) Owners and operators shall retain the current description of the monitoring system as long as... Routing to a Fuel Gas System or a Process § 65.161 Continuous records and monitoring system data handling...) Monitoring system breakdowns, repairs, preventive maintenance, calibration checks, and zero (low-level) and...

  19. Remote sensing data handling to improve the system integration of indonesian national spatial data infrastructure

    International Nuclear Information System (INIS)

    Hari, G. R. V.

    2010-01-01

    With the usage of metadata as a reference for spatial data queries, remote sensing images and other spatial datasets have been linked to their related semantic information. In current catalogue systems, such as those of satellite data providers or clearinghouses, each remote sensing image is maintained as an independent entity. There is very limited possibility of knowing the linkage of one image to another, even if one image has actually been derived from the other. For many purposes it is an advantage if the linkage among remote sensing images or other spatial data can be maintained, or at least reconstructed. This research explores how an image is linked to its related information, and how an image can be linked to other images. By exploring links among remote sensing images, queries over a remote sensing data collection can be extended, for example to answer: 'which images were used to create a certain dataset?', 'which images have been created from a concrete dataset?', or 'is there a relationship between image A and image B based on their processing steps?'. By building links among spatial datasets in a collection based on their creation process, a further possibility of spatial data organization can be supported. The applicability and compatibility of the proposed method with the current platform is also considered. The proposed method can be implemented using the same standards and protocols, and the same metadata files, as used by the existing system. This approach also makes it possible to implement the method in many countries which use the same infrastructure. To demonstrate this, we developed a prototype based on an open source platform, including PostgreSQL, the Apache web server, the MapServer WebGIS, and the PHP programming environment. The output of this research leads to an improvement of spatial data handling, where an adjacency list is used to maintain spatial dataset history links. This improvement can enhance the query of spatial data in a
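The adjacency-list idea mentioned at the end can be sketched as a pair of edge maps over dataset identifiers; all names and the structure below are hypothetical, not taken from the paper:

```python
# Minimal lineage store using adjacency lists: forward and reverse edges
# between source images and derived products (identifiers are invented).
from collections import defaultdict

derived_from = defaultdict(list)   # product -> source images/datasets
used_in = defaultdict(list)        # image/dataset -> derived products

def link(source, product):
    """Record one processing-step edge in both directions."""
    derived_from[product].append(source)
    used_in[source].append(product)

link("landsat_scene_A", "ndvi_map_1")
link("landsat_scene_B", "ndvi_map_1")
link("ndvi_map_1", "landcover_2010")

print(derived_from["ndvi_map_1"])   # which images were used to create this dataset?
print(used_in["landsat_scene_A"])   # which products derive from this image?
```

Walking the maps transitively would answer the third query in the abstract, whether image A and image B are related through their processing steps.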

  20. Architectures and methodologies for future deployment of multi-site Zettabyte-Exascale data handling platforms

    CERN Document Server

    Acín, V; Boccali, T; Cancio, G; Collier, I P; Corney, D; Delaunay, B; Delfino, M; dell'Agnello, L; Flix, J; Fuhrmann, P; Gasthuber, M; Gülzow, V; Heiss, A; Lamanna, G; Macchi, P E; Maggi, M; Matthews, B; Neissner, C; Nief, J Y; Porto, M C; Sansum, A; Schulz, M; Shiers, J

    2015-01-01

    Several scientific fields, including Astrophysics, Astroparticle Physics, Cosmology, Nuclear and Particle Physics, and Research with Photons, are estimating that by the 2020 decade they will require data handling systems with data volumes approaching the Zettabyte distributed amongst as many as 10^18 individually addressable data objects (Zettabyte-Exascale systems). It may be convenient or necessary to deploy such systems using multiple physical sites. This paper describes the findings of a working group composed of experts from several
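The two figures quoted above imply a striking constraint: a Zettabyte spread over 10^18 objects averages only about a kilobyte per object, which suggests namespace and metadata handling, not raw capacity, dominate the design. A rough estimate from the two quoted numbers only:

```python
# Average object size implied by the Zettabyte-Exascale figures above.
zettabyte = 1e21   # bytes in one Zettabyte
objects = 1e18     # individually addressable data objects

avg_object_size = zettabyte / objects
print(avg_object_size)  # 1000.0 bytes -> ~1 kB per object on average
```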

  1. Architectures and methodologies for future deployment of multi-site Zettabyte-Exascale data handling platforms

    International Nuclear Information System (INIS)

    Acín, V; Neissner, C; Bird, I; Cancio, G; Boccali, T; Dell'Agnello, L; Maggi, M; Collier, I P; Corney, D; Matthews, B; Delaunay, B; Lamanna, G; Macchi, P-E; Nief, J-Y; Delfino, M; Flix, J; Fuhrmann, P; Gasthuber, M; Gülzow, V; Heiss, A

    2015-01-01

    Several scientific fields, including Astrophysics, Astroparticle Physics, Cosmology, Nuclear and Particle Physics, and Research with Photons, are estimating that by the 2020 decade they will require data handling systems with data volumes approaching the Zettabyte distributed amongst as many as 10^18 individually addressable data objects (Zettabyte-Exascale systems). It may be convenient or necessary to deploy such systems using multiple physical sites. This paper describes the findings of a working group composed of experts from several (paper)

  2. Architectures and methodologies for future deployment of multi-site Zettabyte-Exascale data handling platforms

    Science.gov (United States)

    Acín, V.; Bird, I.; Boccali, T.; Cancio, G.; Collier, I. P.; Corney, D.; Delaunay, B.; Delfino, M.; dell'Agnello, L.; Flix, J.; Fuhrmann, P.; Gasthuber, M.; Gülzow, V.; Heiss, A.; Lamanna, G.; Macchi, P.-E.; Maggi, M.; Matthews, B.; Neissner, C.; Nief, J.-Y.; Porto, M. C.; Sansum, A.; Schulz, M.; Shiers, J.

    2015-12-01

    Several scientific fields, including Astrophysics, Astroparticle Physics, Cosmology, Nuclear and Particle Physics, and Research with Photons, are estimating that by the 2020 decade they will require data handling systems with data volumes approaching the Zettabyte distributed amongst as many as 10^18 individually addressable data objects (Zettabyte-Exascale systems). It may be convenient or necessary to deploy such systems using multiple physical sites. This paper describes the findings of a working group composed of experts from several

  3. CAMAC - A modular instrumentation system for data handling. Revised description and specification

    International Nuclear Information System (INIS)

    1977-03-01

    CAMAC is a modern data handling system in widespread use with on-line digital computers. It is based on a digital highway for data and control. The CAMAC specification ensures compatibility between equipment from different sources. The revised specification introduces several new features but is consistent with the previous version (EUR 4100e, 1969). The CAMAC system was specified by European laboratories, through the ESONE Committee, and has been endorsed by the USAEC NIM Committee, which has an identical specification (TID-25875)

  4. Data handling at EBR-II [Experimental Breeder Reactor II] for advanced diagnostics and control work

    International Nuclear Information System (INIS)

    Lindsay, R.W.; Schorzman, L.W.

    1988-01-01

    Improved control and diagnostics systems are being developed for nuclear and other applications. The Experimental Breeder Reactor II (EBR-II) Division of Argonne National Laboratory has embarked on a project to upgrade the EBR-II control and data handling systems. The nature of the work at EBR-II requires that reactor plant data be readily available to experimenters and that the plant control systems be flexible enough to accommodate testing and development needs. In addition, operational concerns require that improved operator interfaces and computerized diagnostics be included in the reactor plant control system. The EBR-II systems have been upgraded to incorporate new data handling computers and new digital plant process controllers, and new displays and diagnostics are being developed and tested for permanent use. In addition, the new systems will make improved engineering surveillance possible

  5. A versatile system for the rapid collection, handling and graphics analysis of multidimensional data

    International Nuclear Information System (INIS)

    O'Brien, P.M.; Moloney, G.; O'Connor, A.; Legge, G.J.F.

    1991-01-01

    The paper discusses the performance of a versatile computerized system developed at the Microanalytical Research Centre of the University of Melbourne for handling multiparameter data that may arise from a variety of experiments - nuclear, accelerator mass spectrometry, microprobe elemental analysis or 3-D microtomography. Some of the most demanding requirements arise in the application of microprobes to quantitative elemental mapping and to microtomography. A system to handle data from such experiments has been under continuous development. It has been reprogrammed to run on a DG DS7540 workstation. The whole system of software has been rewritten, greatly expanded and made much more powerful and faster through the use of modern computer technology - a VME bus computer with a real-time operating system and a RISC workstation running UNIX and the X-window environment

  6. Handling Data Skew in MapReduce Cluster by Using Partition Tuning

    Directory of Open Access Journals (Sweden)

    Yufei Gao

    2017-01-01

    Full Text Available The healthcare industry has generated large amounts of data, and analyzing these data has emerged as an important problem in recent years. The MapReduce programming model has been successfully used for big data analytics. However, data skew invariably occurs in big data analytics and seriously affects efficiency. To overcome the data skew problem in MapReduce, we have in the past proposed a data processing algorithm called Partition Tuning-based Skew Handling (PTSH). In comparison with the one-stage partitioning strategy used in the traditional MapReduce model, PTSH uses a two-stage strategy and the partition tuning method to disperse key-value pairs in virtual partitions and recombines each partition in case of data skew. The robustness and efficiency of the proposed algorithm were tested on a wide variety of simulated datasets and real healthcare datasets. The results showed that the PTSH algorithm can handle data skew in MapReduce efficiently and improve the performance of MapReduce jobs in comparison with native Hadoop, Closer, and locality-aware and fairness-aware key partitioning (LEEN). We also found that the time needed for rule extraction can be reduced significantly by adopting the PTSH algorithm, since it is more suitable for association rule mining (ARM) on healthcare data.
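The abstract does not give PTSH's exact tuning rule, but the spirit of a second-stage repartition can be sketched as a greedy load rebalancer; the function and key counts below are illustrative, not the paper's algorithm:

```python
from collections import Counter

def tune_partitions(key_counts, n_reducers):
    """Greedy second stage: assign the heaviest key groups first,
    always to the reducer with the lightest current load."""
    loads = [0] * n_reducers
    assignment = {}
    for key, count in sorted(key_counts.items(), key=lambda kv: -kv[1]):
        r = loads.index(min(loads))          # lightest reducer so far
        assignment[key] = r
        loads[r] += count
    return assignment, loads

# Skewed toy distribution: one "hot" key dominates the key space.
counts = Counter({"a": 90, "b": 10, "c": 8, "d": 7, "e": 5})
assign, loads = tune_partitions(counts, 3)
```

A naive hash partitioner could send the hot key "a" together with other keys to one reducer; the greedy pass at least isolates it, which is the spirit of dispersing key-value pairs into virtual partitions and recombining them.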

  7. The 'last mile' of data handling: Fermilab's IFDH tools

    International Nuclear Information System (INIS)

    Lyon, Adam L; Mengel, Marc W

    2014-01-01

    IFDH (Intensity Frontier Data Handling), is a suite of tools for data movement tasks for Fermilab experiments and is an important part of the FIFE[2] (Fabric for Intensity Frontier [1] Experiments) initiative described at this conference. IFDH encompasses moving input data from caches or storage elements to compute nodes (the 'last mile' of data movement) and moving output data potentially to those caches as part of the journey back to the user. IFDH also involves throttling and locking to ensure that large numbers of jobs do not cause data movement bottlenecks. IFDH is realized as an easy to use layer that users call in their job scripts (e.g. 'ifdh cp'), hiding the low level data movement tools. One advantage of this layer is that the underlying low level tools can be selected or changed without the need for the user to alter their scripts. Logging and performance monitoring can also be added easily. This system will be presented in detail as well as its impact on the ease of data handling at Fermilab experiments.
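The layering described (`ifdh cp` hiding the low-level mover) can be illustrated with a minimal façade; the backend names and commands below are assumptions for illustration, not IFDH's actual internals:

```python
def build_copy_command(src, dst, backend="cp"):
    """Return the argv for the selected transfer tool. User scripts
    call one stable entry point; the mapping below can be changed
    without touching those scripts (illustrative backends only)."""
    backends = {
        "cp":      ["cp", src, dst],
        "gridftp": ["globus-url-copy", src, dst],
        "xrootd":  ["xrdcp", src, dst],
    }
    if backend not in backends:
        raise ValueError(f"unknown backend: {backend}")
    return backends[backend]
```

Swapping `backend` changes the underlying tool while the calling job script stays identical, which is the key advantage the abstract describes.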

  8. Outline and handling manual of experimental data time slice monitoring software 'SLICE'

    International Nuclear Information System (INIS)

    Shirai, Hiroshi; Hirayama, Toshio; Shimizu, Katsuhiro; Tani, Keiji; Azumi, Masafumi; Hirai, Ken-ichiro; Konno, Satoshi; Takase, Keizou.

    1993-02-01

    We have developed a software package, 'SLICE', which maps various kinds of plasma experimental data measured at different geometrical positions in JT-60U and JFT-2M onto the equilibrium magnetic configuration and treats them as a function of the volume-averaged minor radius ρ. Experimental data can be handled uniformly by using 'SLICE', and its extensive command set makes it easy to process the mapped data. Experimental data measured as line-integrated values are also transformed by Abel inversion. The mapped data are fitted to a functional form and saved to the database 'MAPDB'. 'SLICE' can read the data back from 'MAPDB' and re-display and transform them. In addition, 'SLICE' creates run data for the orbit-following Monte Carlo code 'OFMC' and the tokamak predictive and interpretation code system 'TOPICS'. This report summarizes an outline and the usage of 'SLICE'. (author)

  9. Interactive handling of regional cerebral blood flow data using a macrolanguage

    International Nuclear Information System (INIS)

    Sveinsdottir, E.; Schomacker, T.; Lassen, N.A.

    1976-01-01

    A general image handling software system has been developed for on-line collection, processing and display of gamma camera images (IMAGE system). The most distinctive feature of the system is the ability for the user to interactively specify sequences, called macros, of basic functions to be performed. Information about a specified sequence is retained in the system, enabling new sequences or macros to be defined using already specified sequences. Facilities for parameter setting and parameter transfer between functions, as well as facilities for repetition of a function, are included. Finally, functions, be they basic or macro, can be specified to be iteratively activated using a physiological trigger signal such as the ECG. In addition, a special program system was developed for handling the dynamic data from Xenon-133 studies of regional cerebral blood flow (CBF system). Parametric or functional images derived from the CBF system, depicting estimates of regional cerebral blood flow, relative weights of grey matter or other parameters, can after computation be handled in the IMAGE system

  10. Extensions to the Joshua GDMS to support environmental science and analysis data handling requirements

    International Nuclear Information System (INIS)

    Suich, J.E.; Honeck, H.C.

    1978-01-01

    For the past ten years, a generalized data management system (GDMS) called JOSHUA has been in use at the Savannah River Laboratory. Originally designed and implemented to support nuclear reactor physics and safety computational applications, the system is now also supporting environmental science modeling and impact assessment. Extensions to the original system are being developed to meet new data handling requirements, which include more general owner-member record relationships occurring in geographically encoded data sets, unstructured (relational) inquiry capability, cartographic analysis and display, and offsite data exchange. This paper discusses the need for these capabilities, places them in perspective as generic scientific data management activities, and presents the planned context-free extensions to the basic JOSHUA GDMS

  11. Extensions to the Joshua GDMS to support environmental science and analysis data handling requirements

    International Nuclear Information System (INIS)

    Suich, J.E.; Honeck, H.C.

    1977-01-01

    For the past ten years, a generalized data management system (GDMS) called JOSHUA has been in use at the Savannah River Laboratory. Originally designed and implemented to support nuclear reactor physics and safety computational applications, the system is now also supporting environmental science modeling and impact assessment. Extensions to the original system are being developed to meet new data handling requirements, which include more general owner-member record relationships occurring in geographically encoded data sets, unstructured (relational) inquiry capability, cartographic analysis and display, and offsite data exchange. This paper discusses the need for these capabilities, places them in perspective as generic scientific data management activities, and presents the planned context-free extensions to the basic JOSHUA GDMS

  12. The sample handling system for the Mars Icebreaker Life mission: from dirt to data.

    Science.gov (United States)

    Davé, Arwen; Thompson, Sarah J; McKay, Christopher P; Stoker, Carol R; Zacny, Kris; Paulsen, Gale; Mellerowicz, Bolek; Glass, Brian J; Willson, David; Bonaccorsi, Rosalba; Rask, Jon

    2013-04-01

    The Mars Icebreaker Life mission will search for subsurface life on Mars. It consists of three payload elements: a drill to retrieve soil samples from approximately 1 m below the surface, a robotic sample handling system to deliver the sample from the drill to the instruments, and the instruments themselves. This paper will discuss the robotic sample handling system. Collecting samples from ice-rich soils on Mars in search of life presents two challenges: protection of that icy soil--considered a "special region" with respect to planetary protection--from contamination from Earth, and delivery of the icy, sticky soil to spacecraft instruments. We present a sampling device that meets these challenges. We built a prototype system and tested it at martian pressure, drilling into ice-cemented soil, collecting cuttings, and transferring them to the inlet port of the SOLID2 life-detection instrument. The tests successfully demonstrated that the Icebreaker drill, sample handling system, and life-detection instrument can collectively operate in these conditions and produce science data that can be delivered via telemetry--from dirt to data. Our results also demonstrate the feasibility of using an air gap to prevent forward contamination. We define a set of six analog soils for testing over a range of soil cohesion, from loose sand to basalt soil, with angles of repose of 27° and 39°, respectively. Particle size is a key determinant of jamming of mechanical parts by soil particles. Jamming occurs when the clearance between moving parts is equal in size to the most common particle size or equal to three of these particles together. Three particles acting together tend to form bridges and lead to clogging. Our experiments show that rotary-hammer action of the Icebreaker drill influences the particle size, typically reducing particle size by ≈ 100 μm.
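The jamming rule stated above (clearance matching one particle, or three bridging particles) can be expressed as a simple design check; the 15% tolerance is an assumed margin, not a value from the paper:

```python
def jam_risk(clearance_um, common_particle_um, tol=0.15):
    """True when the clearance is within `tol` of either the most
    common particle size or three particles bridged together."""
    for multiple in (1, 3):
        target = multiple * common_particle_um
        if abs(clearance_um - target) <= tol * target:
            return True
    return False
```

For cuttings reduced to roughly 100 μm by the rotary-hammer action, clearances near 100 μm and 300 μm would both be flagged as jam-prone.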

  13. Steller sea lion capture, marking, and handling data across their range 1985-2014

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This database contains information on individual sea lions that were marked or handled from 1985-2014. Individuals were handled for various projects including vital...

  14. Handling of time-critical Conditions Data in the CMS experiment - Experience of the first year of data taking

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Data management for a wide category of non-event data plays a critical role in the operation of the CMS experiment. The processing chain (data taking-reconstruction-analysis) relies in the prompt availability of specific, time dependent data describing the state of the various detectors and their calibration parameters, which are treated separately from event data. The Condition Database system is the infrastructure established to handle these data and to make sure that they are available to both offline and online workflows. The Condition Data layout is designed such that the payload data (the Condition) is associated to an Interval Of Validity (IOV). The IOV allows accessing selectively the sets corresponding to specific intervals of time, run number or luminosity section. Both payloads and IOVs are stored in a cluster of relational database servers (Oracle) using an object-relational access approach. The strict requirements of security and isolation of the CMS online systems are imposing a redundant archit...

  15. Tourism forecasting using modified empirical mode decomposition and group method of data handling

    Science.gov (United States)

    Yahya, N. A.; Samsudin, R.; Shabri, A.

    2017-09-01

    In this study, a hybrid model using modified Empirical Mode Decomposition (EMD) and the Group Method of Data Handling (GMDH) is proposed for tourism forecasting. This approach reconstructs the intrinsic mode functions (IMFs) produced by EMD using a trial and error method. The new component and the remaining IMFs are then each predicted using the GMDH model. Finally, the forecasted results for each component are aggregated to construct an ensemble forecast. The data used in this experiment are monthly time series of tourist arrivals from China, Thailand and India to Malaysia from 2000 to 2016. The performance of the model is evaluated using the Root Mean Square Error (RMSE) and the Mean Absolute Percentage Error (MAPE), with the conventional GMDH model and the EMD-GMDH model used as benchmark models. Empirical results show that the proposed model produces better forecasts than the benchmark models.
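The two evaluation metrics named in the abstract are standard and can be computed directly; the sample series below is made up for illustration:

```python
import math

def rmse(actual, forecast):
    """Root Mean Square Error."""
    return math.sqrt(sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(actual))

def mape(actual, forecast):
    """Mean Absolute Percentage Error, as a percentage (assumes no zero actuals)."""
    return 100.0 * sum(abs((a - f) / a) for a, f in zip(actual, forecast)) / len(actual)

# Hypothetical monthly tourist arrivals vs. a model's forecast.
y_true = [100.0, 200.0, 400.0]
y_hat  = [110.0, 190.0, 400.0]
```

Lower values of either metric indicate a better forecast; MAPE is scale-free, which is why it is popular for comparing arrival series of different magnitudes.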

  16. Software development for statistical handling of dosimetric and epidemiological data base

    International Nuclear Information System (INIS)

    Amaro, M.

    1990-01-01

    The dose records from different groups of occupationally exposed workers are available in a computerized data base whose main purpose is individual dose follow-up. Apart from this objective, such a dosimetric data base can be useful for statistical analysis. The statistical information that can be extracted from the data base aims mainly at two kinds of objectives: - Individual and collective dose distributions and statistics. - Epidemiological statistics. The report describes the software developed to obtain the statistical reports required by the Regulatory Body, as well as any other type of dose distributions or statistics to be included in epidemiological studies. A User's Guide for the operators who handle this software package, and the code listings, are also included in the report. (Author) 2 refs

  17. Software development for statistical handling of dosimetric and epidemiological data base

    International Nuclear Information System (INIS)

    Amaro, M.

    1990-01-01

    The dose records from different groups of occupationally exposed workers are available in a computerized data base whose main purpose is individual dose follow-up. Apart from this objective, such a dosimetric data base can be useful for statistical analysis. The statistical information that can be extracted from the data base aims mainly at two kinds of objectives: - Individual and collective dose distributions and statistics. - Epidemiological statistics. The report describes the software developed to obtain the statistical reports required by the Regulatory Body, as well as any other type of dose distributions or statistics to be included in epidemiological studies. A User's Guide for the operators who handle this software package, and the code listings, are also included in the report. (Author)

  18. Handling missing data in transmission disequilibrium test in nuclear families with one affected offspring.

    Directory of Open Access Journals (Sweden)

    Gulhan Bourget

    Full Text Available The Transmission Disequilibrium Test (TDT) compares frequencies of transmission of two alleles from heterozygous parents to an affected offspring. This test requires all genotypes to be known for all members of the nuclear families. However, obtaining all genotypes in a study might not be possible for some families, in which case the data set contains missing genotypes. There are many techniques for handling missing genotypes in parents but only a few in offspring. The robust TDT (rTDT) is one of the methods that handles missing genotypes for all members of nuclear families [with one affected offspring]. Even though all family members can be imputed, the rTDT is a conservative test with low power. We propose a new method, Mendelian Inheritance TDT (MITDT-ONE), that controls type I error and has high power. The MITDT-ONE uses Mendelian inheritance properties and takes population frequencies of the disease allele and marker allele into account in the rTDT method. One of the advantages of using the MITDT-ONE is that it can identify additional significant genes that are not found by the rTDT. We demonstrate the performance of both tests along with the Sib-TDT (S-TDT) in Monte Carlo simulation studies. Moreover, we apply our method to the type 1 diabetes data from the Warren families in the United Kingdom to identify significant genes that are related to type 1 diabetes.
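The classical TDT statistic underlying these variants is easy to state: with b transmissions of one allele and c of the other from heterozygous parents, it is a McNemar-type chi-square with one degree of freedom. A minimal sketch (the rTDT and MITDT-ONE adjustments from the abstract are not reproduced here):

```python
def tdt_statistic(b, c):
    """Classical TDT: b and c count the two alleles transmitted from
    heterozygous parents to affected offspring. Under the null of no
    linkage/association, (b - c)^2 / (b + c) is approximately
    chi-square distributed with 1 degree of freedom."""
    if b + c == 0:
        raise ValueError("no informative transmissions")
    return (b - c) ** 2 / (b + c)
```

With missing genotypes, b and c cannot always be counted directly, which is exactly the gap the imputation-based variants above are designed to fill.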

  19. A Globally Distributed System for Job, Data, and Information Handling for High Energy Physics

    Energy Technology Data Exchange (ETDEWEB)

    Garzoglio, Gabriele [DePaul Univ., Chicago, IL (United States)

    2006-01-13

    The computing infrastructures of the modern high energy physics experiments need to address an unprecedented set of requirements. The collaborations consist of hundreds of members from dozens of institutions around the world and the computing power necessary to analyze the data produced surpasses already the capabilities of any single computing center. A software infrastructure capable of seamlessly integrating dozens of computing centers around the world, enabling computing for a large and dynamical group of users, is of fundamental importance for the production of scientific results. Such a computing infrastructure is called a computational grid. The SAM-Grid offers a solution to these problems for CDF and DZero, two of the largest high energy physics experiments in the world, running at Fermilab. The SAM-Grid integrates standard grid middleware, such as Condor-G and the Globus Toolkit, with software developed at Fermilab, organizing the system in three major components: data handling, job handling, and information management. This dissertation presents the challenges and the solutions provided in such a computing infrastructure.

  20. Neo: an object model for handling electrophysiology data in multiple formats

    Directory of Open Access Journals (Sweden)

    Samuel eGarcia

    2014-02-01

    Full Text Available Neuroscientists use many different software tools to acquire, analyse and visualise electrophysiological signals. However, incompatible data models and file formats make it difficult to exchange data between these tools. This reduces scientific productivity, renders potentially useful analysis methods inaccessible and impedes collaboration between labs. A common representation of the core data would improve interoperability and facilitate data-sharing. To that end, we propose here a language-independent object model, named Neo, suitable for representing data acquired from electroencephalographic, intracellular, or extracellular recordings, or generated from simulations. As a concrete instantiation of this object model we have developed an open source implementation in the Python programming language. In addition to representing electrophysiology data in memory for the purposes of analysis and visualisation, the Python implementation provides a set of input/output (IO) modules for reading/writing the data from/to a variety of commonly used file formats. Support is included for formats produced by most of the major manufacturers of electrophysiology recording equipment and also for more generic formats such as MATLAB. Data representation and data analysis are conceptually separate: it is easier to write robust analysis code if it is focused on analysis and relies on an underlying package to handle data representation. For that reason, and also to be as lightweight as possible, the Neo object model and the associated Python package are deliberately limited to representation of data, with no functions for data analysis or visualisation. Software for neurophysiology data analysis and visualisation built on top of Neo automatically gains the benefits of interoperability, easier data sharing and automatic format conversion; there is already a burgeoning ecosystem of such tools. We intend that Neo should become the standard basis for Python tools in
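The container hierarchy Neo proposes (representation only, no analysis) can be pictured with a schematic stand-in; the classes below merely mirror Neo's concepts (Block, Segment, AnalogSignal) and are not the real neo package API:

```python
from dataclasses import dataclass, field

@dataclass
class AnalogSignal:
    samples: list            # raw samples; real Neo attaches physical units
    sampling_rate_hz: float
    units: str

@dataclass
class Segment:               # one recording period, e.g. a trial
    name: str
    analogsignals: list = field(default_factory=list)

@dataclass
class Block:                 # top-level grouping, e.g. a session
    name: str
    segments: list = field(default_factory=list)

trial = Segment("trial-01")
trial.analogsignals.append(AnalogSignal([0.10, 0.20, 0.15], 10_000.0, "mV"))
recording = Block("session-A", segments=[trial])
```

Analysis code written against such a shared in-memory model works unchanged regardless of which vendor format an IO module loaded the data from, which is the interoperability benefit the abstract describes.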

  1. EUDAT strategies for handling dynamic data in the solid Earth Sciences

    Science.gov (United States)

    Michelini, Alberto; Evans, Peter; Kemps-Snijder, Mark; Heikkinen, Jani; Buck, Justin; Misutka, Jozef; Drude, Sebastian; Fares, Massimo; Cacciari, Claudio; Fiameni, Giuseppe

    2014-05-01

    Some dynamic data is generated by sensors which produce data streams that may be temporarily incomplete (owing to latencies or temporary interruptions of the transmission lines between the field sensors and the data acquisition centres) and that may consequently fill up over time (automatically or after manual intervention). Dynamic data can also be generated by massive crowd sourcing where, for example, experimental collections of data can be filled up at random moments. The nature of dynamic data makes it difficult to handle for various reasons: a) establishing valid policies that guide early replication for data preservation and access optimization is not trivial, b) identifying versions of such data - thus making it possible to check their integrity - and referencing the versions is also a challenging task, and c) performance issues are extremely important since all these activities must be performed fast enough to keep up with the incoming data stream. There is no doubt that both application areas (namely data from sensors and crowdsourcing) are growing in their relevance for science, and that appropriate infrastructure support (by initiatives such as EUDAT) is vital to handle these challenges. In addition, data must be citeable to encourage transparent, reproducible science, and to provide clear metrics for assessing the impact of research, which also drives funding choices. Data streams in real time often undergo changes/revisions while they are still growing, as new data arrives, and they are revised as missing data is recovered, or as new calibration values are applied. We call these "dynamic" data sets, DDS. A common form of DDS is time series data in which measurements are obtained on a regular schedule, with a well-defined sample rate. Examples include the hourly temperature in Barcelona, and the displacement (a 3-D vector quantity) of a seismograph from its rest position, which may record at a rate of 100 or more samples per second. These form streams

  2. Robust, Radiation Tolerant Command and Data Handling and Power System Electronics for SmallSats

    Science.gov (United States)

    Nguyen, Hanson Cao; Fraction, James

    2018-01-01

    In today's budgetary environment, there is significant interest within the National Aeronautics and Space Administration (NASA) to enable small robotic science missions that can be executed faster and cheaper than previous larger missions. To help achieve this, focus has shifted from using exclusively radiation-tolerant or radiation-hardened parts to using more commercial-off-the-shelf (COTS) components for NASA small satellite missions that can last at least one year in orbit. However, there are some portions of a spacecraft's avionics, such as the Command and Data Handling (C&DH) subsystem and the Power System Electronics (PSE) that need to have a higher level of reliability that goes beyond what is attainable with currently available COTS parts. While there are a number of COTS components that can withstand a total ionizing dose (TID) of tens or hundreds of kilorads, there is still a great deal of concern about tolerance to and mitigation of single-event effects (SEE).

  3. Influence of vinasse on water movement in soil, using automatic acquisition and handling data system

    International Nuclear Information System (INIS)

    Nascimento Filho, V.F. do; Barros Ferraz, E.S. de

    1986-01-01

    Vinasse, a by-product of the ethyl alcohol industry from the fermentation of sugar cane juice or molasses, has been incorporated into soil as fertilizer owing to its high content of organic matter (2-6%), potassium and sulphate (0.1-0.5%) and other nutrients. The influence of vinasse on water movement in the soil was studied by employing the monoenergetic gamma-ray beam attenuation technique (241Am; 59.5 keV; 100 mCi). For this, an automatic data acquisition and handling system was used, based on a multichannel analyser operated in multi-scaling mode and coupled to a personal microcomputer and plotter. Despite the small depth studied (6 cm), it was observed that vinasse decreases the water infiltration velocity in the soil. (Author) [pt
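The underlying measurement is Beer-Lambert attenuation, I = I0·exp(-(μ_s ρ_s + μ_w θ)x), so the volumetric water content θ follows directly from the measured count rates. The coefficient values below are illustrative assumptions, not the 241Am constants used in the study:

```python
import math

def water_content(I, I0, x_cm, mu_s, rho_s, mu_w):
    """Invert I = I0 * exp(-(mu_s*rho_s + mu_w*theta) * x) for theta.
    mu_s, rho_s, mu_w are soil/water attenuation terms with
    illustrative (assumed) values and units."""
    total = math.log(I0 / I) / x_cm          # mu_s*rho_s + mu_w*theta
    return (total - mu_s * rho_s) / mu_w

# Assumed count rates through a 6 cm soil column.
theta = water_content(I=6000.0, I0=10000.0, x_cm=6.0,
                      mu_s=0.04, rho_s=1.4, mu_w=0.2)
```

Tracking θ at fixed depths over time is how the infiltration velocity with and without vinasse can be compared.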

  4. Gravity Probe B data analysis: II. Science data and their handling prior to the final analysis

    International Nuclear Information System (INIS)

    Silbergleit, A S; Conklin, J W; Heifetz, M I; Holmes, T; Li, J; Mandel, I; Solomonik, V G; Stahl, K; P W Worden Jr; Everitt, C W F; Adams, M; Berberian, J E; Bencze, W; Clarke, B; Al-Jadaan, A; Keiser, G M; Kozaczuk, J A; Al-Meshari, M; Muhlfelder, B; Salomon, M

    2015-01-01

    The results of the Gravity Probe B relativity science mission published in Everitt et al (2011 Phys. Rev. Lett. 106 221101) required a rather sophisticated analysis of experimental data due to several unexpected complications discovered on-orbit. We give a detailed description of the Gravity Probe B data reduction. In the first paper (Silbergleit et al Class. Quantum Grav. 22 224018) we derived the measurement models, i.e., mathematical expressions for all the signals to analyze. In the third paper (Conklin et al Class. Quantum Grav. 22 224020) we explain the estimation algorithms and their program implementation, and discuss the experiment results obtained through data reduction. This paper deals with the science data preparation for the main analysis yielding the relativistic drift estimates. (paper)

  5. Dealing with the Data Deluge: Handling the Multitude Of Chemical Biology Data Sources.

    Science.gov (United States)

    Guha, Rajarshi; Nguyen, Dac-Trung; Southall, Noel; Jadhav, Ajit

    2012-09-01

    Over the last 20 years, there has been an explosion in the amount and type of biological and chemical data that has been made publicly available in a variety of online databases. While this means that vast amounts of information can be found online, there is no guarantee that it can be found easily (or at all). A scientist searching for a specific piece of information is faced with a daunting task - many databases have overlapping content, use their own identifiers and, in some cases, have arcane and unintuitive user interfaces. In this overview, a variety of well known data sources for chemical and biological information are highlighted, focusing on those most useful for chemical biology research. The issue of using multiple data sources together and the associated problems such as identifier disambiguation are highlighted. A brief discussion is then provided on Tripod, a recently developed platform that supports the integration of arbitrary data sources, providing users a simple interface to search across a federated collection of resources.

  6. HOPE: An On-Line Piloted Handling Qualities Experiment Data Book

    Science.gov (United States)

    Jackson, E. B.; Proffitt, Melissa S.

    2010-01-01

    A novel on-line database for capturing most of the information obtained during piloted handling qualities experiments (either flight or simulated) is described. The Hyperlinked Overview of Piloted Evaluations (HOPE) web application is based on an open-source object-oriented Web-based front end (Ruby-on-Rails) that can be used with a variety of back-end relational database engines. The hyperlinked, on-line data book approach allows an easily-traversed way of looking at a variety of collected data, including pilot ratings, pilot information, vehicle and configuration characteristics, test maneuvers, and individual flight test cards and repeat runs. It allows for on-line retrieval of pilot comments, both audio and transcribed, as well as time history data retrieval and video playback. Pilot questionnaires are recorded as are pilot biographies. Simple statistics are calculated for each selected group of pilot ratings, allowing multiple ways to aggregate the data set (by pilot, by task, or by vehicle configuration, for example). Any number of per-run or per-task metrics can be captured in the database. The entire run metrics dataset can be downloaded in comma-separated text for further analysis off-line. It is expected that this tool will be made available upon request.

  7. Designing on-Board Data Handling for EDF (Electric Ducted Fan) Rocket

    Science.gov (United States)

    Mulyana, A.; Faiz, L. A. A.

    2018-02-01

    To launch, the EDF (Electric Ducted Fan) rocket requires monitoring, tracking and control systems that allow it to glide properly. One of the important components in the rocket is the OBDH (On-Board Data Handling) subsystem, which serves as the medium for executing commands and processing data. TTC (Telemetry, Tracking, and Command) links are required for communication between the GCS (Ground Control Station) and the OBDH on the EDF rocket, so a control system for the EDF rocket and a GCS for telemetry and telecommand need to be designed. The integrated OBDH controller design uses many electronics modules: an IMU (Inertial Measurement Unit) sensor, consisting of a 3-axis gyroscope and a 3-axis accelerometer, captures the behavior of the rocket; a GPS is used for tracking; a compass sensor determines the direction of the rocket and serves as the reference for processing the z-axis of the gyroscope; and a barometer sensor measures the height of the rocket as it glides. The data can be monitored in real time by sending them to the GCS through XBee-Pro S2B radio modules at a frequency of 2.4 GHz. Noise can be reduced by using a window filter, which helps guarantee that the monitoring and control system works properly.

  8. Handling imbalance data in churn prediction using combined SMOTE and RUS with bagging method

    Science.gov (United States)

    Pura Hartati, Eka; Adiwijaya; Arif Bijaksana, Moch

    2018-03-01

    Customer churn has become a significant problem and a challenge for telecommunication companies such as PT. Telkom Indonesia. It is necessary to evaluate the extent of the customer churn problem so that the company's management can devise appropriate strategies to minimize churn and retain customers. The customer churn data in this company, categorized as churn Atas Permintaan Sendiri (APS, churn at the customer's own request), are imbalanced, and class imbalance is one of the challenging tasks in machine learning. This study investigates how handling class imbalance in churn prediction using combined Synthetic Minority Over-Sampling (SMOTE) and Random Under-Sampling (RUS) with the bagging method yields better churn prediction performance. The dataset used is Broadband Internet data collected from Telkom Regional 6 Kalimantan. The research first applies data preprocessing to balance the imbalanced dataset and to select features using the sampling techniques SMOTE and RUS, and then builds a churn prediction model using the bagging method with C4.5.
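A minimal sketch of the combined sampling idea: SMOTE-style interpolation grows the minority (churn) class while random under-sampling (RUS) shrinks the majority class before the bagged classifier is trained. The class sizes, `k`, and seeds here are illustrative assumptions, not values from the paper:

```python
import numpy as np

def smote_oversample(X_min, n_new, k=3, seed=0):
    """Synthesize n_new minority-class points by interpolating between a
    random minority point and one of its k nearest minority neighbours."""
    rng = np.random.default_rng(seed)
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        dists = np.linalg.norm(X_min - X_min[i], axis=1)
        neighbours = np.argsort(dists)[1:k + 1]  # skip the point itself
        j = rng.choice(neighbours)
        gap = rng.random()                       # position along the segment
        synthetic.append(X_min[i] + gap * (X_min[j] - X_min[i]))
    return np.array(synthetic)

def random_undersample(X_maj, n_keep, seed=0):
    """RUS: randomly keep n_keep of the majority-class points."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X_maj), size=n_keep, replace=False)
    return X_maj[idx]

# Rebalance a 100-vs-10 split to 30 majority + 30 minority points.
X_maj = np.random.default_rng(1).normal(0.0, 1.0, (100, 2))
X_min = np.random.default_rng(2).normal(3.0, 1.0, (10, 2))
X_bal = np.vstack([random_undersample(X_maj, 30), X_min,
                   smote_oversample(X_min, 20)])
print(X_bal.shape)  # (60, 2)
```

In practice a library such as imbalanced-learn provides production implementations of both samplers; the balanced set would then be fed to a bagging ensemble of C4.5-style decision trees.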

  9. Scalable Data Quality for Big Data: The Pythia Framework for Handling Missing Values.

    Science.gov (United States)

    Cahsai, Atoshum; Anagnostopoulos, Christos; Triantafillou, Peter

    2015-09-01

    Solving the missing-value (MV) problem with small estimation errors in large-scale data environments is a notoriously resource-demanding task. The most widely used MV imputation approaches are computationally expensive because they explicitly depend on the volume and the dimension of the data. Moreover, as datasets and their user communities continuously grow, the problem can only be exacerbated. To deal with this problem, in our previous work we introduced a novel framework, coined Pythia, which employs a number of distributed data nodes (cohorts), each of which contains a partition of the original dataset. To perform MV imputation, Pythia, based on specific machine and statistical learning structures (signatures), selects the most appropriate subset of cohorts to locally perform a missing-value substitution algorithm (MVA). This selection relies on the principle that a particular subset of cohorts maintains the most relevant partition of the dataset. In addition, since Pythia uses only part of the dataset for imputation and accesses different cohorts in parallel, it improves efficiency, scalability, and accuracy compared to a single machine (coined Godzilla) that uses the entire massive dataset to compute imputation requests. This article extends our previous work: we particularly investigate the robustness of the Pythia framework and show that Pythia is independent of any MVA and signature construction algorithm. To facilitate our research, we considered two well-known MVAs (namely, the K-nearest-neighbor and expectation-maximization imputation algorithms), as well as two machine and neural computational learning signature construction algorithms based on adaptive vector quantization and competitive learning. We provide comprehensive experiments to assess the performance of Pythia against Godzilla and showcase the benefits stemming from this framework.
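The K-nearest-neighbor imputation algorithm mentioned above can be sketched on a single machine as follows; this is a generic illustration of the MVA, not Pythia's distributed implementation, and `k` and the toy data are assumptions:

```python
import numpy as np

def knn_impute(X, k=2):
    """Replace NaNs with the mean of the feature over the k complete rows
    that are closest on the columns observed in the target row."""
    X = X.astype(float).copy()
    complete = X[~np.isnan(X).any(axis=1)]       # fully observed rows
    for r in range(len(X)):
        miss = np.isnan(X[r])
        if not miss.any():
            continue
        obs = ~miss                              # compare only observed columns
        dists = np.linalg.norm(complete[:, obs] - X[r, obs], axis=1)
        nearest = complete[np.argsort(dists)[:k]]
        X[r, miss] = nearest[:, miss].mean(axis=0)
    return X

X = np.array([[1.0, 2.0], [1.1, 2.2], [5.0, 6.0], [1.05, np.nan]])
imputed = knn_impute(X)
print(imputed)   # the NaN becomes the mean of 2.0 and 2.2, i.e. 2.1
```

In the Pythia setting, each cohort would run such an MVA only on its own partition, with the signatures deciding which cohorts are consulted.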

  10. The Sample Handling System for the Mars Icebreaker Life Mission: from Dirt to Data

    Science.gov (United States)

    Dave, Arwen; Thompson, Sarah J.; McKay, Christopher P.; Stoker, Carol R.; Zacny, Kris; Paulsen, Gale; Mellerowicz, Bolek; Glass, Brian J.; Wilson, David; Bonaccorsi, Rosalba; hide

    2013-01-01

    The Mars Icebreaker Life mission will search for subsurface life on Mars. It consists of three payload elements: a drill to retrieve soil samples from approx. 1 meter below the surface, a robotic sample handling system to deliver the sample from the drill to the instruments, and the instruments themselves. This paper will discuss the robotic sample handling system.

  11. Study of input variables in group method of data handling methodology

    International Nuclear Information System (INIS)

    Pereira, Iraci Martinez; Bueno, Elaine Inacio

    2013-01-01

    The Group Method of Data Handling (GMDH) is a combinatorial multi-layer algorithm in which a network of layers and nodes is generated using a number of inputs from the data stream being evaluated. The GMDH network topology has traditionally been determined using a layer-by-layer pruning process based on a pre-selected criterion of what constitutes the best nodes at each level. The traditional GMDH method is based on the underlying assumption that the data can be modeled by an approximation of the Volterra series or Kolmogorov-Gabor polynomial. A monitoring and diagnosis system was developed based on the GMDH and ANN methodologies and applied to the IPEN research reactor IEA-R1. The system performs monitoring by comparing the GMDH- and ANN-calculated values with measured ones. As GMDH is a self-organizing methodology, the choice of input variables is made automatically. On the other hand, the results of the ANN methodology depend strongly on which variables are used as neural network inputs. (author)
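The Kolmogorov-Gabor building block underlying GMDH can be sketched as a two-input quadratic node fitted by least squares; in a full GMDH network, many such candidate nodes are generated per layer and pruned by an external criterion. This standalone node is an illustration, not the paper's code:

```python
import numpy as np

def fit_gmdh_node(x1, x2, y):
    """Least-squares fit of the quadratic Ivakhnenko polynomial
    y ≈ a0 + a1*x1 + a2*x2 + a3*x1*x2 + a4*x1**2 + a5*x2**2."""
    A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def eval_gmdh_node(coef, x1, x2):
    A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
    return A @ coef

# Recover a known relation y = 1 + 2*x1 - 0.5*x1*x2 from noise-free data.
rng = np.random.default_rng(0)
x1, x2 = rng.normal(size=200), rng.normal(size=200)
y = 1.0 + 2.0 * x1 - 0.5 * x1 * x2
coef = fit_gmdh_node(x1, x2, y)
pred = eval_gmdh_node(coef, x1, x2)
print(np.round(coef, 3))
```

Stacking layers of the best-performing such nodes, each taking the outputs of the previous layer as inputs, yields the self-organizing polynomial network that GMDH builds.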

  12. Determination of heat capacity of ionic liquid based nanofluids using group method of data handling technique

    Science.gov (United States)

    Sadi, Maryam

    2018-01-01

    In this study a group method of data handling model has been successfully developed to predict the heat capacity of ionic liquid based nanofluids, using reduced temperature, acentric factor and molecular weight of the ionic liquids, and nanoparticle concentration as input parameters. To accomplish the modeling, 528 experimental data points extracted from the literature were divided into training and testing subsets. The training set was used to estimate the model coefficients, and the testing set was applied for model validation. The ability and accuracy of the developed model were evaluated by comparing model predictions with experimental values using different statistical parameters such as the coefficient of determination, mean square error and mean absolute percentage error. The mean absolute percentage errors of the developed model for the training and testing sets are 1.38% and 1.66%, respectively, which indicates excellent agreement between model predictions and experimental data. Also, the results estimated by the developed GMDH model exhibit higher accuracy than the available theoretical correlations.
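The validation statistics named above can be computed as follows; this is a generic sketch, and the numbers in the example are made up, not taken from the paper:

```python
import numpy as np

def regression_metrics(y_true, y_pred):
    """Coefficient of determination (R^2), mean square error (MSE),
    and mean absolute percentage error (MAPE, in percent)."""
    y_true = np.asarray(y_true, float)
    y_pred = np.asarray(y_pred, float)
    resid = y_true - y_pred
    mse = np.mean(resid**2)
    ss_res = np.sum(resid**2)
    ss_tot = np.sum((y_true - y_true.mean())**2)
    r2 = 1.0 - ss_res / ss_tot
    mape = 100.0 * np.mean(np.abs(resid / y_true))
    return r2, mse, mape

r2, mse, mape = regression_metrics([100.0, 200.0, 300.0], [101.0, 198.0, 303.0])
print(round(r2, 4), round(mse, 2), round(mape, 2))  # 0.9993 4.67 1.0
```

Computing the same metrics separately on the training and testing subsets, as the paper does, is what reveals whether the fitted GMDH model generalizes.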

  13. Data Container Study for Handling array-based data using Hive, Spark, MongoDB, SciDB and Rasdaman

    Science.gov (United States)

    Xu, M.; Hu, F.; Yang, J.; Yu, M.; Yang, C. P.

    2017-12-01

    Geoscience communities have come up with various big data storage solutions, such as Rasdaman and Hive, to address the grand challenges of massive Earth observation data management and processing. To examine the readiness of current solutions to support big Earth observation data, we propose to investigate and compare five popular data container solutions: Rasdaman, Hive, Spark, SciDB and MongoDB. Using different types of spatial and non-spatial queries, datasets stored in common scientific data formats (e.g., NetCDF and HDF), and two applications (i.e., dust storm simulation data mining and MERRA data analytics), we systematically compare and evaluate the features and performance of these data containers in terms of data discovery and access. The computing resources (e.g., CPU, memory, hard drive, network) consumed while performing various queries and operations are monitored and recorded for the performance evaluation. The initial results show that 1) the popular data container clusters are able to handle large volumes of data, but their performance varies in different situations, and there is a trade-off among data preprocessing, disk usage, query time, and resource consumption; 2) ClimateSpark, MongoDB and SciDB perform the best among all the containers in all the query tests, while Hive performs the worst; 3) the studied data containers can be applied to other array-based datasets, such as high-resolution remote sensing data and model simulation data; and 4) Rasdaman's clustering configuration is more complex than the others'. A comprehensive report will detail the experimental results and compare the containers' pros and cons regarding system performance, ease of use, accessibility, scalability, compatibility, and flexibility.

  14. Handling missing data in cluster randomized trials: A demonstration of multiple imputation with PAN through SAS

    Directory of Open Access Journals (Sweden)

    Jiangxiu Zhou

    2014-09-01

    Full Text Available The purpose of this study is to demonstrate a way of dealing with missing data in cluster randomized trials by doing multiple imputation (MI) with the PAN package in R through SAS. The procedure for doing MI with PAN through SAS is demonstrated in detail in order for researchers to be able to use this procedure with their own data. An illustration of the technique with empirical data is also included. In this illustration the PAN results were compared with pairwise deletion and three types of MI: (1) Normal Model (NM-MI) ignoring the cluster structure; (2) NM-MI with dummy-coded cluster variables (fixed cluster structure); and (3) a hybrid NM-MI which imputes half the time ignoring the cluster structure, and the other half including the dummy-coded cluster variables. The empirical analysis showed that using PAN and the other strategies produced comparable parameter estimates. However, the dummy-coded MI overestimated the intraclass correlation, whereas MI ignoring the cluster structure and the hybrid MI underestimated the intraclass correlation. When compared with PAN, the p-value and standard error for the treatment effect were higher with dummy-coded MI, and lower with MI ignoring the cluster structure, the hybrid MI approach, and pairwise deletion. Previous studies have shown that NM-MI is not appropriate for handling missing data in cluster randomized trials. This approach, in addition to the pairwise deletion approach, leads to a biased intraclass correlation and faulty statistical conclusions. Imputation in cluster randomized trials should be performed with PAN. We have demonstrated an easy way to use PAN through SAS.
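After the m imputed datasets are analyzed, the per-imputation estimates are pooled; a minimal sketch of Rubin's rules, which PAN-based workflows rely on for this step (generic, not tied to PAN, R, or SAS, and the numbers are illustrative):

```python
import math

def rubins_rules(estimates, variances):
    """Pool m per-imputation estimates and their within-imputation variances.
    Returns (pooled estimate, total variance, standard error)."""
    m = len(estimates)
    q_bar = sum(estimates) / m                            # pooled point estimate
    w = sum(variances) / m                                # within-imputation variance
    b = sum((q - q_bar)**2 for q in estimates) / (m - 1)  # between-imputation variance
    t = w + (1 + 1 / m) * b                               # total variance
    return q_bar, t, math.sqrt(t)

est, var, se = rubins_rules([1.0, 1.2, 0.8], [0.04, 0.05, 0.03])
print(round(est, 3), round(var, 4), round(se, 4))  # 1.0 0.0933 0.3055
```

The between-imputation term is what distinguishes MI from single imputation: it propagates the uncertainty due to the missing values into the pooled standard error.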

  15. An automated data handling process integrating spreadsheets and word processors with analytical programs

    International Nuclear Information System (INIS)

    Fisher, G.F.; Bennett, L.G.I.

    1994-01-01

    A data handling process utilizing software programs that are commercially available for MS-DOS microcomputers was developed to reduce the time, energy and labour required to tabulate the final results of trace analyses. The elimination of hand computations reduced the possibility of transcription errors since, once the γ-ray spectrum analysis results are obtained and saved to the hard disk of a microcomputer, they can be manipulated very easily with little possibility of distortion. The 8-step process permitted the selection of each element of interest's best concentration value based upon its associated peak area. Calculated concentration values were automatically compared against the sample's determination limit. Unsatisfactory values were flagged for later review and adjustment by the user. In the final step, a file was created which identified the samples with their appropriate particulars (i.e. source, sample, date, etc.) and displayed the trace element concentrations. This final file contained a fully formatted summary table that listed all of the samples' results and particulars such that it could be printed or imported into a word processor for inclusion in a report. In the illustrated application of analyzing wear debris in oil-lubricated systems, over 13,000 individual numbers were processed to arrive at final concentration estimates of 19 trace elements in 80 samples. The system works very well for the elements that were analyzed in this investigation. The usefulness of commercially available spreadsheets and word processors for this task was demonstrated. (author) 5 refs.; 2 figs.; 5 tabs

  16. Data handling with SAM and art at the NOνA experiment

    International Nuclear Information System (INIS)

    Aurisano, A; Backhouse, C; Davies, G S; Illingworth, R; Mengel, M; Norman, A; Mayer, N; Rocco, D; Zirnstein, J

    2015-01-01

    During operations, NOvA produces between 5,000 and 7,000 raw files per day with peaks in excess of 12,000. These files must be processed in several stages to produce fully calibrated and reconstructed analysis files. In addition, many simulated neutrino interactions must be produced and processed through the same stages as data. To accommodate the large volume of data and Monte Carlo, production must be possible both on the Fermilab grid and on off-site farms, such as the ones accessible through the Open Science Grid. To handle the challenge of cataloging these files and to facilitate their off-line processing, we have adopted the SAM system developed at Fermilab. SAM indexes files according to metadata, keeps track of each file's physical locations, provides dataset management facilities, and facilitates data transfer to off-site grids. To integrate SAM with Fermilab's art software framework and the NOvA production workflow, we have developed methods to embed metadata into our configuration files, art files, and standalone ROOT files. A module in the art framework propagates the embedded information from configuration files into art files, and from input art files to output art files, allowing us to maintain a complete processing history within our files. Embedding metadata in configuration files also allows configuration files indexed in SAM to be used as inputs to Monte Carlo production jobs. Further, SAM keeps track of the input files used to create each output file. Parentage information enables the construction of self-draining datasets which have become the primary production paradigm used at NOvA. In this paper we will present an overview of SAM at NOvA and how it has transformed the file production framework used by the experiment. (paper)

  17. The development and operation of the international solar-terrestrial physics central data handling facility

    Science.gov (United States)

    Lehtonen, Kenneth

    1994-01-01

    The National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC) International Solar-Terrestrial Physics (ISTP) Program is committed to the development of a comprehensive, multi-mission ground data system which will support a variety of national and international scientific missions in an effort to study the flow of energy from the sun through the Earth-space environment, known as the geospace. A major component of the ISTP ground data system is an ISTP-dedicated Central Data Handling Facility (CDHF). Acquisition, development, and operation of the ISTP CDHF were delegated by the ISTP Project Office within the Flight Projects Directorate to the Information Processing Division (IPD) within the Mission Operations and Data Systems Directorate (MO&DSD). The ISTP CDHF supports the receipt, storage, and electronic access of the full complement of ISTP Level-zero science data; serves as the linchpin for the centralized processing and long-term storage of all key parameters generated either by the ISTP CDHF itself or received from external, ISTP Program approved sources; and provides the required networking and 'science-friendly' interfaces for the ISTP investigators. Once connected to the ISTP CDHF, the online catalog of key parameters can be browsed from their remote processing facilities for the immediate electronic receipt of selected key parameters using the NASA Science Internet (NSI), managed by NASA's Ames Research Center. The purpose of this paper is twofold: (1) to describe how the ISTP CDHF was successfully implemented and operated to support initially the Japanese Geomagnetic Tail (GEOTAIL) mission and correlative science investigations, and (2) to describe how the ISTP CDHF has been enhanced to support ongoing as well as future ISTP missions. Emphasis will be placed on how various project management approaches were undertaken that proved to be highly effective in delivering an operational ISTP CDHF to the Project on schedule and

  18. Command and Data Handling Flight Software test framework: A Radiation Belt Storm Probes practice

    Science.gov (United States)

    Hill, T. A.; Reid, W. M.; Wortman, K. A.

    During the Radiation Belt Storm Probes (RBSP) mission, a test framework was developed by the Embedded Applications Group in the Space Department at the Johns Hopkins Applied Physics Laboratory (APL). The test framework is implemented for verification of the Command and Data Handling (C&DH) Flight Software. The RBSP C&DH Flight Software consists of applications developed for use with Goddard Space Flight Center's core Flight Executive (cFE) architecture. The test framework's initial concept originated with tests developed for verification of the Autonomy rules that execute with the Autonomy Engine application of the RBSP C&DH Flight Software. The test framework was adopted and expanded for system and requirements verification of the RBSP C&DH Flight Software. During the evolution of the RBSP C&DH Flight Software test framework design, a set of script conventions and a script library were developed. The script conventions and library eased integration of system and requirements verification tests into a comprehensive automated test suite. The comprehensive test suite is currently being used to verify releases of the RBSP C&DH Flight Software. In addition to providing the details and benefits of the test framework, the discussion will include several lessons learned throughout the verification process of the RBSP C&DH Flight Software. Our next mission, Solar Probe Plus (SPP), will use the cFE architecture for the C&DH Flight Software. SPP also plans to use the same ground system as RBSP. Many of the RBSP C&DH Flight Software applications are reusable on the SPP mission; therefore there is potential for test design and test framework reuse for system and requirements verification.

  19. A versatile data handling system for nuclear physics experiments based on PDP 11/03 micro-computers

    International Nuclear Information System (INIS)

    Raaf, A.J. de

    1979-01-01

    A reliable and low-cost data handling system for nuclear physics experiments is described. It is based on two PDP 11/03 micro-computers together with GEC-Elliott CAMAC equipment. A fast system has been designed for the acquisition of the experimental data. It consists of a controller for four ADCs together with an intelligent 38k MOS memory with a word size of 24 bits. (Auth.)

  20. Modular, Autonomous Command and Data Handling Software with Built-In Simulation and Test

    Science.gov (United States)

    Cuseo, John

    2012-01-01

    The spacecraft system that plays the greatest role throughout the program lifecycle is the Command and Data Handling System (C&DH), along with its associated algorithms and software. The C&DH takes on this role as a cost driver because it is the brains of the spacecraft and is the element of the system that is primarily responsible for the integration and interoperability of all spacecraft subsystems. During design and development, many activities associated with mission design, system engineering, and subsystem development result in products that are directly supported by the C&DH, such as interfaces, algorithms, flight software (FSW), and parameter sets. A modular system architecture has been developed that provides a means for rapid spacecraft assembly, test, and integration. This modular C&DH software architecture, which can be targeted and adapted to a wide variety of spacecraft architectures, payloads, and mission requirements, eliminates the current practice of rewriting the spacecraft software and test environment for every mission. This software allows mission-specific software and algorithms to be rapidly integrated and tested, significantly decreasing the time involved in the software development cycle. Additionally, the FSW includes an Onboard Dynamic Simulation System (ODySSy) that allows the C&DH software to support rapid integration and test. With this solution, the C&DH software capabilities encompass all phases of the spacecraft lifecycle. ODySSy is an on-board simulation capability built directly into the FSW that provides dynamic built-in test capabilities as soon as the FSW image is loaded onto the processor. It includes a six-degree-of-freedom, high-fidelity simulation that allows complete closed-loop and hardware-in-the-loop testing of a spacecraft in a ground processing environment without any additional external stimuli.
ODySSy can intercept and modify sensor inputs using mathematical sensor models, and can intercept and respond to actuator

  1. Handling Large and Complex Data in a Photovoltaic Research Institution Using a Custom Laboratory Information Management System

    Energy Technology Data Exchange (ETDEWEB)

    White, Robert R.; Munch, Kristin

    2014-01-01

    Twenty-five years ago the desktop computer started becoming ubiquitous in the scientific lab. Researchers were delighted with its ability to both control instrumentation and acquire data on a single system, but they were not completely satisfied: there were often gaps in knowledge that they thought might be filled if they just had more data and could get it faster. Computer technology has evolved in keeping with Moore's Law, meeting those desires; however, those improvements have of late become both a boon and a bane for researchers. Computers are now capable of producing high-speed data streams containing terabytes of information, capabilities that evolved faster than envisioned last century. Software to handle large scientific data sets has not kept up. How much information might be lost through accidental mismanagement, and how many discoveries are missed through data overload, are now vital questions. An important new task in most scientific disciplines involves developing methods to address those issues and creating software that can handle large data sets with an eye towards scalability. This software must create archived, indexed, and searchable data from heterogeneous instrumentation for the implementation of a strong data-driven materials development strategy. At the National Center for Photovoltaics at the National Renewable Energy Laboratory, we began developing a Laboratory Information Management System (LIMS) a few years ago, designed to handle lab-wide scientific data acquisition, management, processing and mining needs for physics and materials science data, with a specific focus on future scalability for new equipment or research focuses. We will present the decisions, processes, and problems we went through while building our LIMS for materials research, its current operational state, and our steps for future development.

  2. Adaptive handling of Rayleigh and Raman scatter of fluorescence data based on evaluation of the degree of spectral overlap

    Science.gov (United States)

    Hu, Yingtian; Liu, Chao; Wang, Xiaoping; Zhao, Dongdong

    2018-06-01

    At present, general scatter handling methods are unsatisfactory when scatter and fluorescence overlap severely in the excitation-emission matrix. In this study, an adaptive method for scatter handling of fluorescence data is proposed. First, the Raman scatter was corrected by subtracting the baseline of deionized water, collected in each experiment to adapt to intensity fluctuations. Then, the degree of spectral overlap between Rayleigh scatter and fluorescence was classified into three categories based on the distance between the spectral peaks. The corresponding algorithms, including setting the scatter region to zero or fitting on one or both sides, were applied to each individual emission spectrum after its degree of overlap was evaluated. The proposed method minimizes the number of fitting and interpolation operations, which reduces complexity, saves time, avoids overfitting, and most importantly preserves the authenticity of the data. Furthermore, the effectiveness of this procedure for the subsequent PARAFAC analysis was assessed and compared to Delaunay interpolation in experiments with four typical organic chemicals and real water samples. Using this method, we conducted long-term monitoring of tap water and of river water near a dyeing and printing plant. This method can improve adaptability and accuracy in the scatter handling of fluorescence data.
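The per-spectrum Rayleigh handling described above can be sketched as follows; the band half-width, the overlap threshold, and the use of linear interpolation in place of the paper's one- and two-sided fits are simplifying assumptions:

```python
import numpy as np

def handle_rayleigh(em, intensity, scatter_center, half_width, fl_peak):
    """Excise the Rayleigh scatter band from one emission spectrum.
    Far from the fluorescence peak, the band is simply zeroed; when the
    peak is close, the band is bridged from the clean points around it."""
    band = np.abs(em - scatter_center) <= half_width
    out = intensity.astype(float).copy()
    if abs(fl_peak - scatter_center) > 3 * half_width:  # negligible overlap
        out[band] = 0.0
    else:                                               # overlap: bridge the band
        out[band] = np.interp(em[band], em[~band], intensity[~band])
    return out

em = np.arange(10.0)   # emission wavelengths (arbitrary units)
spec = np.ones(10)     # a flat toy spectrum
no_overlap = handle_rayleigh(em, spec, scatter_center=5.0, half_width=1.0, fl_peak=20.0)
overlap = handle_rayleigh(em, spec, scatter_center=5.0, half_width=1.0, fl_peak=6.0)
print(no_overlap[4:7], overlap[4:7])
```

Choosing per-spectrum between zeroing and fitting, rather than interpolating everywhere, is what keeps the number of fitted points small and the untouched data authentic.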

  3. ASSIST - a package of Fortran routines for handling input under specified syntax rules and for management of data structures

    International Nuclear Information System (INIS)

    Sinclair, J.E.

    1991-02-01

    The ASSIST package (A Structured Storage and Input Syntax Tool) provides for Fortran programs a means for handling data structures more general than those provided by the Fortran language, and for obtaining input to the program from a file or terminal according to specified syntax rules. The syntax-controlled input can be interactive, with automatic generation of prompts, and dialogue to correct any input errors. The range of syntax rules possible is sufficient to handle lists of numbers and character strings, keywords, commands with optional clauses, and many kinds of variable-format constructions, such as algebraic expressions. ASSIST was developed for use in two large programs for the analysis of safety of radioactive waste disposal facilities, but it should prove useful for a wide variety of applications. (author)

  4. A dedicated database system for handling multi-level data in systems biology

    OpenAIRE

    Pornputtapong, Natapol; Wanichthanarak, Kwanjeera; Nilsson, Avlant; Nookaew, Intawat; Nielsen, Jens

    2014-01-01

    Background Advances in high-throughput technologies have enabled extensive generation of multi-level omics data. These data are crucial for systems biology research, though they are complex, heterogeneous, highly dynamic, incomplete and distributed among public databases. This leads to difficulties in data accessibility and often results in errors when data are merged and integrated from varied resources. Therefore, integration and management of systems biological data remain very challenging...

  5. Empirical comparison of techniques for handling incomplete data using decision trees

    CSIR Research Space (South Africa)

    Twala, B

    2009-01-01

    Full Text Available Increasing the awareness of how incomplete data affects learning and classification accuracy has led to increasing numbers of missing data techniques. This paper investigates the robustness and accuracy of seven popular techniques for tolerating...

  6. Application of the software system USS to nuclear power plant data handling

    International Nuclear Information System (INIS)

    Wellhausen, U.

    1979-01-01

    The Unified Software System (USS) has been used to establish a data bank of general, economic, and technical nuclear power plant data. On the basis of a test magnetic tape, the principal layout of the data bank is described and examples of searches are given. In conclusion, an additional programme is presented for sorting numerical parameters in a certain order.

  7. A new approach for handling longitudinal count data with zero-inflation and overdispersion: poisson geometric process model.

    Science.gov (United States)

    Wan, Wai-Yin; Chan, Jennifer S K

    2009-08-01

    For time series of count data, correlated measurements, clustering and excessive zeros occur simultaneously in biomedical applications. Ignoring such effects might contribute to misleading treatment outcomes. A generalized mixture Poisson geometric process (GMPGP) model and a zero-altered mixture Poisson geometric process (ZMPGP) model are developed from the geometric process model, which was originally developed for modelling positive continuous data and is extended here to handle count data. These models are motivated by the evaluation of the trend in new tumour counts for bladder cancer patients as well as by the identification of useful covariates which affect the count level. The models are implemented using Bayesian methods with Markov chain Monte Carlo (MCMC) algorithms and are assessed using the deviance information criterion (DIC).

  8. A proposed framework on hybrid feature selection techniques for handling high dimensional educational data

    Science.gov (United States)

    Shahiri, Amirah Mohamed; Husain, Wahidah; Rashid, Nur'Aini Abd

    2017-10-01

    Huge amounts of data in educational datasets can make it difficult to produce quality results. Recently, data mining approaches have been increasingly used by educational data mining researchers to analyze data patterns. However, many research studies have concentrated on selecting suitable learning algorithms instead of performing a feature selection process. As a result, such analyses suffer from high computational complexity and long classification times. The main objective of this research is to provide an overview of feature selection techniques that have been used to identify the most significant features. This research then proposes a framework to improve the quality of students' datasets. The proposed framework uses filter- and wrapper-based techniques to support the prediction process in future studies.
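A minimal sketch of a hybrid filter-plus-wrapper pipeline of the kind the framework proposes: a correlation filter first ranks candidate features, then a greedy forward-selection wrapper scores subsets with a model. The correlation filter, the least-squares score, and the toy data are illustrative assumptions, not the paper's specific choices:

```python
import numpy as np

def filter_step(X, y, top_n):
    """Filter stage: rank features by absolute Pearson correlation with y."""
    scores = [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])]
    return [int(j) for j in np.argsort(scores)[::-1][:top_n]]

def wrapper_step(X, y, candidates, score_fn, max_feats):
    """Wrapper stage: greedy forward selection scored by a model."""
    chosen = []
    while len(chosen) < max_feats:
        best_j, best_s = None, -np.inf
        for j in candidates:
            if j in chosen:
                continue
            s = score_fn(X[:, chosen + [j]], y)
            if s > best_s:
                best_j, best_s = j, s
        chosen.append(best_j)
    return chosen

def lstsq_score(Xs, y):
    # model-based score: negative residual sum of squares of a linear fit
    coef, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    return -np.sum((Xs @ coef - y) ** 2)

# Toy data: feature 0 is a perfect predictor, the remaining four are noise.
rng = np.random.default_rng(0)
y = rng.normal(size=100)
X = np.column_stack([y, rng.normal(size=(100, 4))])
cand = filter_step(X, y, top_n=3)
selected = wrapper_step(X, y, cand, lstsq_score, max_feats=1)
print(selected)  # [0]
```

The cheap filter prunes the search space so that the expensive, model-driven wrapper only has to evaluate a handful of subsets, which is the point of combining the two.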

  9. A general method for handling missing binary outcome data in randomized controlled trials

    OpenAIRE

    Jackson, Dan; White, Ian R; Mason, Dan; Sutton, Stephen

    2014-01-01

    Aims The analysis of randomized controlled trials with incomplete binary outcome data is challenging. We develop a general method for exploring the impact of missing data in such trials, with a focus on abstinence outcomes. Design We propose a sensitivity analysis where standard analyses, which could include ‘missing = smoking’ and ‘last observation carried forward’, are embedded in a wider class of models. Setting We apply our general method to data from two smoking cessation trials. Partici...

  10. A dedicated database system for handling multi-level data in systems biology.

    Science.gov (United States)

    Pornputtapong, Natapol; Wanichthanarak, Kwanjeera; Nilsson, Avlant; Nookaew, Intawat; Nielsen, Jens

    2014-01-01

    Advances in high-throughput technologies have enabled extensive generation of multi-level omics data. These data are crucial for systems biology research, though they are complex, heterogeneous, highly dynamic, incomplete and distributed among public databases. This leads to difficulties in data accessibility and often results in errors when data are merged and integrated from varied resources. Therefore, integration and management of systems biological data remain very challenging. To overcome this, we designed and developed a dedicated database system that can serve and solve the vital issues in data management and thereby facilitate data integration, modeling and analysis in systems biology within a single database. In addition, a yeast data repository was implemented as an integrated database environment operated by the database system. Two applications were implemented to demonstrate the extensibility and utilization of the system. Both illustrate how the user can access the database via the web query function and implemented scripts. These scripts are specific to two sample cases: (1) detecting the pheromone pathway in protein interaction networks; and (2) finding metabolic reactions regulated by Snf1 kinase. In this study we present the design of a database system which offers an extensible environment to efficiently capture the majority of biological entities and relations encountered in systems biology. Critical functions and control processes were designed and implemented to ensure consistent, efficient, secure and reliable transactions. The two sample cases on the yeast integrated data clearly demonstrate the value of a single database environment for systems biology research.

  11. Strategies for Handling Missing Data with Maximum Likelihood Estimation in Career and Technical Education Research

    Science.gov (United States)

    Lee, In Heok

    2012-01-01

    Researchers in career and technical education often ignore more effective ways of reporting and treating missing data and instead implement traditional, but ineffective, missing data methods (Gemici, Rojewski, & Lee, 2012). The recent methodological, and even the non-methodological, literature has increasingly emphasized the importance of…

  12. Handling Imprecision in Qualitative Data Warehouse: Urban Building Sites Annoyance Analysis Use Case

    Science.gov (United States)

    Amanzougarene, F.; Chachoua, M.; Zeitouni, K.

    2013-05-01

    Data warehouse means a decision support database allowing integration, organization, historisation, and management of data from heterogeneous sources, with the aim of exploiting them for decision-making. Data warehouses are essentially based on multidimensional model. This model organizes data into facts (subjects of analysis) and dimensions (axes of analysis). In classical data warehouses, facts are composed of numerical measures and dimensions which characterize it. Dimensions are organized into hierarchical levels of detail. Based on the navigation and aggregation mechanisms offered by OLAP (On-Line Analytical Processing) tools, facts can be analyzed according to the desired level of detail. In real world applications, facts are not always numerical, and can be of qualitative nature. In addition, sometimes a human expert or learned model such as a decision tree provides a qualitative evaluation of phenomenon based on its different parameters i.e. dimensions. Conventional data warehouses are thus not adapted to qualitative reasoning and have not the ability to deal with qualitative data. In previous work, we have proposed an original approach of qualitative data warehouse modeling, which permits integrating qualitative measures. Based on computing with words methodology, we have extended classical multidimensional data model to allow the aggregation and analysis of qualitative data in OLAP environment. We have implemented this model in a Spatial Decision Support System to help managers of public spaces to reduce annoyances and improve the quality of life of the citizens. In this paper, we will focus our study on the representation and management of imprecision in annoyance analysis process. The main objective of this process consists in determining the least harmful scenario of urban building sites, particularly in dense urban environments.
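
    The "computing with words" aggregation of qualitative measures described above can be illustrated with a minimal sketch: ordinal labels are mapped to indices on a scale, averaged, and mapped back to the nearest word. The scale and label names below are hypothetical, not taken from the paper.

```python
SCALE = ["none", "low", "moderate", "high", "extreme"]  # ordered annoyance levels

def aggregate_labels(labels):
    # Minimal "computing with words" style aggregation: map ordinal labels
    # to indices, average them, and map back to the nearest scale word.
    idx = [SCALE.index(label) for label in labels]
    mean = sum(idx) / len(idx)
    return SCALE[round(mean)]

# Four site assessments rolled up into one qualitative measure:
print(aggregate_labels(["low", "moderate", "high", "moderate"]))  # → moderate
```

    A real qualitative OLAP aggregation would also propagate the imprecision of each label (e.g. as fuzzy sets) rather than collapsing to a single index.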

  13. A method for data handling numerical results in parallel OpenFOAM simulations

    International Nuclear Information System (INIS)

    Anton, Alin (Faculty of Automatic Control and Computing, Politehnica University of Timişoara, 2nd Vasile Pârvan Ave., 300223, TM Timişoara, Romania, alin.anton@cs.upt.ro (Romania)); Muntean, Sebastian (Center for Advanced Research in Engineering Science, Romanian Academy – Timişoara Branch, 24th Mihai Viteazu Ave., 300221, TM Timişoara (Romania))

    2015-01-01

    Parallel computational fluid dynamics simulations produce vast amounts of numerical result data. This paper introduces a method for reducing the size of the data by replaying the interprocessor traffic. The results are recovered only in certain regions of interest configured by the user. A known test case is used for several mesh partitioning scenarios using the OpenFOAM® toolkit [1]. The space savings obtained with classic algorithms remain constant for more than 60 GB of floating point data. Our method is most efficient on large simulation meshes and is much better suited for compressing large-scale simulation results than the regular algorithms.

  14. A method for data handling numerical results in parallel OpenFOAM simulations

    Energy Technology Data Exchange (ETDEWEB)

    Anton, Alin [Faculty of Automatic Control and Computing, Politehnica University of Timişoara, 2nd Vasile Pârvan Ave., 300223, TM Timişoara, Romania, alin.anton@cs.upt.ro (Romania); Muntean, Sebastian [Center for Advanced Research in Engineering Science, Romanian Academy – Timişoara Branch, 24th Mihai Viteazu Ave., 300221, TM Timişoara (Romania)

    2015-12-31

    Parallel computational fluid dynamics simulations produce vast amounts of numerical result data. This paper introduces a method for reducing the size of the data by replaying the interprocessor traffic. The results are recovered only in certain regions of interest configured by the user. A known test case is used for several mesh partitioning scenarios using the OpenFOAM® toolkit [1]. The space savings obtained with classic algorithms remain constant for more than 60 GB of floating point data. Our method is most efficient on large simulation meshes and is much better suited for compressing large-scale simulation results than the regular algorithms.
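
    The region-of-interest idea behind the method can be sketched as a simple post-processing filter: only result records whose coordinates fall inside a user-configured region are kept. The cell records and field names below are illustrative, not the paper's actual OpenFOAM data structures.

```python
def extract_roi(cells, roi):
    # Keep only result records whose centre falls inside the configured
    # region of interest, discarding the rest of the (large) field data.
    (xmin, xmax), (ymin, ymax) = roi
    return [c for c in cells if xmin <= c["x"] <= xmax and ymin <= c["y"] <= ymax]

# Hypothetical per-cell results from a partitioned CFD run:
cells = [{"x": x * 0.1, "y": y * 0.1, "p": x + y}
         for x in range(10) for y in range(10)]
roi = ((0.2, 0.4), (0.2, 0.4))
print(len(extract_roi(cells, roi)))  # → 9
```

    The paper's actual contribution goes further, recovering the regional results by replaying interprocessor traffic rather than storing the full field.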

  15. Time-Critical Database Conditions Data-Handling for the CMS Experiment

    CERN Document Server

    De Gruttola, M; Innocente, V; Pierro, A

    2011-01-01

    Automatic, synchronous and of course reliable population of the condition database is critical for the correct operation of the online selection as well as of the offline reconstruction and data analysis. We describe here the system put in place in the CMS experiment to automate the processes that populate the database centrally and make condition data promptly available both online for the high-level trigger and offline for reconstruction. The data are "dropped" by the users in a dedicated service which synchronizes them and takes care of writing them into the online database. Then they are automatically streamed to the offline database, and hence are immediately accessible offline worldwide. This mechanism was used intensively during 2008 and 2009 operation with cosmic ray challenges and the first LHC collision data, and many improvements have been made so far. The experience of these first years of operation is discussed in detail.

  16. Packet telemetry and packet telecommand - The new generation of spacecraft data handling techniques

    Science.gov (United States)

    Hooke, A. J.

    1983-01-01

    Because of the rising costs and reduced reliability associated with customized spacecraft and ground network hardware and software, the standardized Packet Telemetry and Packet Telecommand concepts are emerging as viable alternatives. Within each concept, autonomous packets of data are created within ground and space application processes using formatting techniques, and are switched end-to-end through the space data network to their destination application processes using standard transfer protocols. Because the intermediate data networks can be designed to be completely mission-independent, this approach can facilitate a high degree of automation and interoperability. The goals of the Consultative Committee for Space Data Systems are the adoption of an international guideline for future space telemetry formatting based on the Packet Telemetry concept, and the advancement of the NASA-ESA Working Group's Packet Telecommand concept to a level of maturity parallel to that of Packet Telemetry. Both the Packet Telemetry and Packet Telecommand concepts are reviewed.

  17. Functional tests of a prototype for the CMS-ATLAS common non-event data handling framework

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00366910; The ATLAS collaboration; Formica, Andrea

    2017-01-01

    Since 2014 the ATLAS and CMS experiments have shared a common vision of the database infrastructure for the handling of non-event data in forthcoming LHC runs. The wide commonality in the use cases has allowed the two experiments to agree on a common overall design solution that meets the requirements of both. A first prototype was completed in 2016 and has been made available to both experiments. The prototype is based on a web service implementing a REST API with a set of functions for the management of conditions data. In this contribution, we describe the prototype architecture and the tests that have been performed within the CMS computing infrastructure, with the aim of validating the support of the main use cases and of suggesting future improvements.

  18. Big data handling mechanisms in the healthcare applications: A comprehensive and systematic literature review.

    Science.gov (United States)

    Pashazadeh, Asma; Jafari Navimipour, Nima

    2018-04-12

    Healthcare provides many services such as diagnosis, treatment, and prevention of diseases, illnesses, injuries, and other physical and mental disorders. Large-scale distributed data processing applications in healthcare fundamentally operate on large amounts of data. Big data application functions are therefore a main part of healthcare operations, but there has been no comprehensive and systematic survey studying and evaluating the important techniques in this field. This paper therefore aims to provide a comprehensive, detailed, and systematic study of the state-of-the-art mechanisms in big data related to healthcare applications in five categories: machine learning, cloud-based, heuristic-based, agent-based, and hybrid mechanisms. The paper also presents a systematic literature review (SLR) of the big data applications in the healthcare literature up to the end of 2016. Initially, 205 papers were identified, but a paper selection process reduced the number of papers to 29 important studies. Copyright © 2018 Elsevier Inc. All rights reserved.

  19. A Framework for the Interactive Handling of High-Dimensional Simulation Data in Complex Geometries

    KAUST Repository

    Benzina, Amal; Buse, Gerrit; Butnaru, Daniel; Murarasu, Alin; Treib, Marc; Varduhn, Vasco; Mundani, Ralf-Peter

    2013-01-01

    Flow simulations around building infrastructure models involve large scale complex geometries, which when discretized in adequate detail entail high computational cost. Moreover, tasks such as simulation insight by steering or optimization require many such costly simulations. In this paper, we illustrate the whole pipeline of an integrated solution for interactive computational steering, developed for complex flow simulation scenarios that depend on a moderate number of both geometric and physical parameters. A mesh generator takes building information model input data and outputs a valid cartesian discretization. A sparse-grids-based surrogate model—a less costly substitute for the parameterized simulation—uses precomputed data to deliver approximated simulation results at interactive rates. Furthermore, a distributed multi-display visualization environment shows building infrastructure together with flow data. The focus is set on scalability and intuitive user interaction.

  20. Data acquisition, handling, and display for the heater experiments at Stripa

    Energy Technology Data Exchange (ETDEWEB)

    McEvoy, M.B.

    1979-02-01

    In June 1978, a joint Swedish/American research team began acquiring data from the Stripa mine in Sweden, 340 m below the surface. Electrical heaters are used to assess the suitability of granite rock as a repository for radioactive waste material. Extensive instrumentation also measures temperature, stress, and displacement effects caused by these heaters. This report describes the data acquisition system, its design considerations, capabilities, and operational use. The techniques employed to detect and analyze any anomalous experimental results are also described. Environmental considerations are described in an appendix.

  1. Insights into vehicle trajectories at the handling limits: analysing open data from race car drivers

    Science.gov (United States)

    Kegelman, John C.; Harbott, Lene K.; Gerdes, J. Christian

    2017-02-01

    Race car drivers can offer insights into vehicle control during extreme manoeuvres; however, little data from race teams is publicly available for analysis. The Revs Program at Stanford has built a collection of vehicle dynamics data acquired from vintage race cars during live racing events with the intent of making this database publicly available for future analysis. This paper discusses the data acquisition, post-processing, and storage methods used to generate the database. An analysis of available data quantifies the repeatability of professional race car driver performance by examining the statistical dispersion of their driven paths. Certain map features, such as sections with high path curvature, consistently corresponded to local minima in path dispersion, quantifying the qualitative concept that drivers anchor their racing lines at specific locations around the track. A case study explores how two professional drivers employ distinct driving styles to achieve similar lap times, supporting the idea that driving at the limits allows a family of solutions in terms of paths and speed that can be adapted based on specific spatial, temporal, or other constraints and objectives.
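
    The path-dispersion analysis described above can be approximated with a short sketch: given lateral-offset traces resampled at common track stations, the per-station standard deviation measures how repeatable the driven line is, and local minima mark the "anchor" points. The lap data below are invented for illustration, not from the Revs database.

```python
import statistics

def path_dispersion(laps):
    # laps: lateral-offset traces (m) sampled at the same track stations.
    # The per-station standard deviation quantifies line repeatability;
    # local minima mark where the driver anchors the racing line.
    return [statistics.stdev(offsets) for offsets in zip(*laps)]

# Hypothetical offsets over 5 stations for 4 laps; the corner apex
# (station 2) is hit very consistently.
laps = [
    [0.4, 0.2, 0.05, 0.3, 0.5],
    [0.6, 0.3, 0.06, 0.2, 0.4],
    [0.5, 0.1, 0.04, 0.4, 0.7],
    [0.3, 0.2, 0.05, 0.3, 0.6],
]
disp = path_dispersion(laps)
print(disp.index(min(disp)))  # → 2 (the apex)
```

    Real data would first need the paths resampled onto a common arc-length parameterisation before the station-wise statistics are meaningful.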

  2. Extending Local Canonical Correlation Analysis to Handle General Linear Contrasts for fMRI Data

    Directory of Open Access Journals (Sweden)

    Mingwu Jin

    2012-01-01

    Full Text Available Local canonical correlation analysis (CCA is a multivariate method that has been proposed to more accurately determine activation patterns in fMRI data. In its conventional formulation, CCA has several drawbacks that limit its usefulness in fMRI. A major drawback is that, unlike the general linear model (GLM, a test of general linear contrasts of the temporal regressors has not been incorporated into the CCA formalism. To overcome this drawback, a novel directional test statistic was derived using the equivalence of multivariate multiple regression (MVMR and CCA. This extension will allow CCA to be used for inference of general linear contrasts in more complicated fMRI designs without reparameterization of the design matrix and without reestimating the CCA solutions for each particular contrast of interest. With the proper constraints on the spatial coefficients of CCA, this test statistic can yield a more powerful test on the inference of evoked brain regional activations from noisy fMRI data than the conventional t-test in the GLM. The quantitative results from simulated and pseudoreal data and activation maps from fMRI data were used to demonstrate the advantage of this novel test statistic.

  3. The Network Data Handling War: MySQL vs. NfDump

    NARCIS (Netherlands)

    Hofstede, Rick; Hofstede, R.J.; Sperotto, Anna; Fioreze, Tiago; Pras, Aiko

    Network monitoring plays a crucial role in any network management environment. Especially nowadays, with network speed and load constantly increasing, more and more data needs to be collected and efficiently processed. In highly interactive network monitoring systems, a quick response time from

  4. Combining machine learning and ontological data handling for multi-source classification of nature conservation areas

    Science.gov (United States)

    Moran, Niklas; Nieland, Simon; Tintrup gen. Suntrup, Gregor; Kleinschmit, Birgit

    2017-02-01

    Manual field surveys for nature conservation management are expensive and time-consuming and could be supplemented and streamlined by using Remote Sensing (RS). RS is critical to meet requirements of existing laws such as the EU Habitats Directive (HabDir) and more importantly to meet future challenges. The full potential of RS has yet to be harnessed as different nomenclatures and procedures hinder interoperability, comparison and provenance. Therefore, automated tools are needed to use RS data to produce comparable, empirical data outputs that lend themselves to data discovery and provenance. These issues are addressed by a novel, semi-automatic ontology-based classification method that uses machine learning algorithms and Web Ontology Language (OWL) ontologies that yields traceable, interoperable and observation-based classification outputs. The method was tested on European Union Nature Information System (EUNIS) grasslands in Rheinland-Palatinate, Germany. The developed methodology is a first step in developing observation-based ontologies in the field of nature conservation. The tests show promising results for the determination of the grassland indicators wetness and alkalinity with an overall accuracy of 85% for alkalinity and 76% for wetness.

  5. Handling limited datasets with neural networks in medical applications: A small-data approach.

    Science.gov (United States)

    Shaikhina, Torgyn; Khovanova, Natalia A

    2017-01-01

    Single-centre studies in the medical domain are often characterised by limited samples due to the complexity and high costs of patient data collection. Machine learning methods for regression modelling of small datasets (less than 10 observations per predictor variable) remain scarce. Our work bridges this gap by developing a novel framework for application of artificial neural networks (NNs) for regression tasks involving small medical datasets. In order to address the sporadic fluctuations and validation issues that appear in regression NNs trained on small datasets, the methods of multiple runs and surrogate data analysis were proposed in this work. The approach was compared to the state-of-the-art ensemble NNs; the effect of dataset size on NN performance was also investigated. The proposed framework was applied for the prediction of compressive strength (CS) of femoral trabecular bone in patients suffering from severe osteoarthritis. The NN model was able to estimate the CS of osteoarthritic trabecular bone from its structural and biological properties with a standard error of 0.85 MPa. When evaluated on independent test samples, the NN achieved accuracy of 98.3%, outperforming an ensemble NN model by 11%. We reproduce this result on CS data of another porous solid (concrete) and demonstrate that the proposed framework allows for an NN modelled with as few as 56 samples to generalise on 300 independent test samples with 86.5% accuracy, which is comparable to the performance of an NN developed with an 18 times larger dataset (1030 samples). The significance of this work is two-fold: the practical application allows for non-destructive prediction of bone fracture risk, while the novel methodology extends beyond the task considered in this study and provides a general framework for application of regression NNs to medical problems characterised by limited dataset sizes. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
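
    The "multiple runs" idea can be illustrated with a deliberately simplified stand-in: instead of a neural network, a one-variable least-squares model is refit on random subsamples and the predictions are averaged. Everything below (data, function names, parameters) is a hypothetical sketch of the aggregation principle, not the authors' framework.

```python
import random
import statistics

def fit_linear(xs, ys):
    # Ordinary least squares for y = a*x + b with a single predictor.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    return a, my - a * mx

def multiple_runs_predict(xs, ys, x_new, runs=25, train_frac=0.8, seed=0):
    # Refit the model on random subsamples and average the predictions,
    # damping the run-to-run fluctuations typical of small datasets.
    rng = random.Random(seed)
    k = max(2, int(train_frac * len(xs)))
    preds = []
    for _ in range(runs):
        idx = rng.sample(range(len(xs)), k)
        a, b = fit_linear([xs[i] for i in idx], [ys[i] for i in idx])
        preds.append(a * x_new + b)
    return statistics.mean(preds), statistics.stdev(preds)

# Tiny synthetic "small dataset": y is roughly 2x + 1 plus noise.
xs = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0]
ys = [2.1, 3.0, 3.9, 5.2, 6.0, 6.9, 8.1, 9.0]
mean_pred, spread = multiple_runs_predict(xs, ys, x_new=5.0)
print(round(mean_pred, 1))
```

    The spread across runs also gives a cheap diagnostic of how unstable a model trained on so few samples really is.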

  6. New Chicago-Indiana computer network prepared to handle massive data flow

    CERN Multimedia

    2006-01-01

    "The Chicago-Indiana system is one of five Tier-2 (regional) centers in the United States that will receive data from one of four massive detectors at the Large Hadron Collider at CERN, the European particle physics laboratory in Geneva. When the new instrument begins operating late next year, beams of protons will collide 40 million times a second. When each of those proton beams reaches full intensity, each collision will produce approximately 23 interactions between protons that will create various types of subatomic particles." (1,5 page)

  7. Quality control and data-handling in multicentre studies: the case of the Multicentre Project for Tuberculosis Research

    Directory of Open Access Journals (Sweden)

    Caloto Teresa

    2001-12-01

    Full Text Available Abstract Background The Multicentre Project for Tuberculosis Research (MPTR) was a clinical-epidemiological study on tuberculosis carried out in Spain from 1996 to 1998. In total, 96 centres scattered all over the country participated in the project; 19935 "possible cases" of tuberculosis were examined and 10053 were finally included. The data-handling and quality control procedures implemented in the MPTR are described. Methods The study was divided into three phases: (1) preliminary phase, (2) field work, and (3) final phase. Quality control procedures during the three phases are described. Results Preliminary phase: (a) organisation of the research team; (b) design of epidemiological tools; (c) training of researchers. Field work: (a) data collection; (b) data computerisation; (c) data transmission; (d) data cleaning; (e) quality control audits; (f) confidentiality. Final phase: (a) final data cleaning; (b) final analysis. Conclusion The undertaking of a multicentre project implies the need to work with a heterogeneous research team and yet at the same time attain a common goal by following a homogeneous methodology. This demands an additional effort in quality control.

  8. Efficient Geometry and Data Handling for Large-Scale Monte Carlo - Thermal-Hydraulics Coupling

    Science.gov (United States)

    Hoogenboom, J. Eduard

    2014-06-01

    Detailed coupling of thermal-hydraulics calculations to Monte Carlo reactor criticality calculations requires each axial layer of each fuel pin to be defined separately in the input to the Monte Carlo code in order to assign to each volume the temperature according to the result of the TH calculation, and if the volume contains coolant, also the density of the coolant. This leads to huge input files for even small systems. In this paper a methodology for dynamical assignment of temperatures with respect to cross section data is demonstrated to overcome this problem. The method is implemented in MCNP5. The method is verified for an infinite lattice with 3x3 BWR-type fuel pins with fuel, cladding and moderator/coolant explicitly modeled. For each pin 60 axial zones are considered with different temperatures and coolant densities. The results of the axial power distribution per fuel pin are compared to a standard MCNP5 run in which all 9x60 cells for fuel, cladding and coolant are explicitly defined and their respective temperatures determined from the TH calculation. Full agreement is obtained. For large-scale application the method is demonstrated for an infinite lattice with 17x17 PWR-type fuel assemblies with 25 rods replaced by guide tubes. Again, all geometrical detail is retained. The method was used in a procedure for coupled Monte Carlo and thermal-hydraulics iterations. Using an optimised iteration technique, convergence was obtained in 11 iteration steps.
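
    The dynamic temperature assignment can be caricatured as a per-cell lookup that picks the cross-section set evaluated at the library temperature nearest the TH-computed cell temperature. This is a strong simplification (production treatments interpolate or use on-the-fly Doppler broadening), and all names below are hypothetical.

```python
def assign_xs_temperature(cell_temp, library_temps):
    # Pick the cross-section evaluation temperature closest to the
    # TH-computed cell temperature, instead of writing one material
    # card per axial zone in the Monte Carlo input file.
    return min(library_temps, key=lambda t: abs(t - cell_temp))

# Hypothetical library evaluation temperatures (K) and one TH result:
library = [300.0, 600.0, 900.0, 1200.0]
print(assign_xs_temperature(785.4, library))  # → 900.0
```

    Mapping every axial zone through such a lookup at run time keeps the input file small while still reflecting the per-zone TH feedback.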

  9. Easy research data handling with an OpenEarth DataLab for geo-monitoring research

    Science.gov (United States)

    Vanderfeesten, Maurice; van der Kuil, Annemiek; Prinčič, Alenka; den Heijer, Kees; Rombouts, Jeroen

    2015-04-01

    OpenEarth DataLab is an open source-based collaboration and processing platform to enable streamlined research data management from raw data ingest and transformation to interoperable distribution. It enables geo-scientists to easily synchronise, share, compute and visualise the dynamic and most up-to-date research data, scripts and models in multi-stakeholder geo-monitoring programs. This DataLab is developed by the Research Data Services team of TU Delft Library and 3TU.Datacentrum together with coastal engineers of Delft University of Technology and Deltares. Based on the OpenEarth software stack, an environment has been developed to orchestrate numerous geo-related open source software components that can empower researchers and increase the overall research quality by managing research data; enabling automatic and interoperable data workflows between all the components with track & trace, hit & run data transformation processing in cloud infrastructure using MatLab and Python, synchronisation of data and scripts (SVN), and much more. Transformed interoperable data products (KML, NetCDF, PostGIS) can be used by ready-made OpenEarth tools for further analyses and visualisation, and can be distributed via interoperable channels such as THREDDS (OpenDAP) and GeoServer. An example of a successful application of OpenEarth DataLab is the Sand Motor, an innovative method for coastal protection in the Netherlands. The Sand Motor is a huge volume of sand that has been applied along the coast to be spread naturally by wind, waves and currents. Different research disciplines are involved, concerned with weather, waves and currents, sand distribution, water table and water quality, flora and fauna, and recreation and management. Researchers share and transform their data in the OpenEarth DataLab, which makes it possible to combine their data and to see the influence of different aspects of the coastal protection on their models.
During the project the data are available only for the

  10. Data Container Study for Handling Array-based Data Using Rasdaman, Hive, Spark, and MongoDB

    Science.gov (United States)

    Xu, M.; Hu, F.; Yu, M.; Scheele, C.; Liu, K.; Huang, Q.; Yang, C. P.; Little, M. M.

    2016-12-01

    Geoscience communities have come up with various big data storage solutions, such as Rasdaman and Hive, to address the grand challenges for massive Earth observation data management and processing. To examine the readiness of current solutions in supporting big Earth observation, we propose to investigate and compare four popular data container solutions, including Rasdaman, Hive, Spark, and MongoDB. Using different types of spatial and non-spatial queries, datasets stored in common scientific data formats (e.g., NetCDF and HDF), and two applications (i.e. dust storm simulation data mining and MERRA data analytics), we systematically compare and evaluate the features and performance of these four data containers in terms of data discovery and access. The computing resources (e.g. CPU, memory, hard drive, network) consumed while performing various queries and operations are monitored and recorded for the performance evaluation. The initial results show that 1) Rasdaman has the best performance for queries on statistical and operational functions, and supports the NetCDF data format better than HDF; 2) Rasdaman clustering configuration is more complex than the others; 3) Hive performs better on single pixel extraction from multiple images; and 4) except for the single pixel extractions, Spark performs better than Hive and its performance is close to Rasdaman. A comprehensive report will detail the experimental results, and compare their pros and cons regarding system performance, ease of use, accessibility, scalability, compatibility, and flexibility.
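
    The per-query resource monitoring used in such comparisons can be sketched with Python's standard library: wrap each query in a timer and a memory tracer. The "single pixel extraction" workload below is a toy in-memory stand-in for the NetCDF/HDF queries in the study.

```python
import time
import tracemalloc

def profile_query(query_fn, *args):
    # Run a query while recording wall-clock time and peak Python memory,
    # mimicking the per-query resource monitoring described in the study.
    tracemalloc.start()
    t0 = time.perf_counter()
    result = query_fn(*args)
    elapsed = time.perf_counter() - t0
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return result, {"seconds": elapsed, "peak_bytes": peak}

# Toy stand-in for "single pixel extraction" over a stack of images:
stack = [[[t * 100 + r * 10 + c for c in range(10)] for r in range(10)]
         for t in range(50)]  # 50 time steps of 10x10 "pixels"

def pixel_series(stack, r, c):
    return [image[r][c] for image in stack]

series, stats = profile_query(pixel_series, stack, 3, 7)
print(len(series), series[0])  # → 50 37
```

    A full benchmark would also sample CPU, disk and network counters at the system level, which needs OS-specific tooling beyond this sketch.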

  11. When and how should multiple imputation be used for handling missing data in randomised clinical trials - a practical guide with flowcharts

    DEFF Research Database (Denmark)

    Jakobsen, Janus Christian; Gluud, Christian; Wetterslev, Jørn

    2017-01-01

    the missingness. Therefore, the analysis of trial data with missing values requires careful planning and attention. METHODS: The authors had several meetings and discussions considering optimal ways of handling missing data to minimise the bias potential. We also searched PubMed (key words: missing data; randomi...

  12. Data handling and modelling

    International Nuclear Information System (INIS)

    Minchin, P.E.H.

    1986-01-01

    The author reviews the interpretation of various tracer profiles. A quantitative description of the change of profile shape is given by the transfer function calculated from observed profiles. From the transfer function physically meaningful quantities such as average transit time, dispersion and leakage can be calculated
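
    The quantities mentioned above can be computed directly from a discretised tracer profile by treating it as a transit-time distribution: the first moment gives the average transit time and the second central moment the dispersion. This is a generic moments sketch with invented data, not the author's transfer-function formulation.

```python
def profile_moments(times, concentrations):
    # Treat the tracer profile as a distribution of transit times:
    # the first moment is the average transit time, the second
    # central moment the dispersion (variance) of the profile.
    total = sum(concentrations)
    mean = sum(t * c for t, c in zip(times, concentrations)) / total
    var = sum(((t - mean) ** 2) * c for t, c in zip(times, concentrations)) / total
    return mean, var

times = [0, 1, 2, 3, 4, 5, 6]
conc = [0, 1, 4, 6, 4, 1, 0]  # symmetric pulse centred at t = 3
mean, var = profile_moments(times, conc)
print(mean, var)  # → 3.0 1.0
```

    Estimating leakage additionally requires comparing the areas under the input and output profiles, which this moments-only sketch does not cover.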

  13. Handling data redundancy in helical cone beam reconstruction with a cone-angle-based window function and its asymptotic approximation

    International Nuclear Information System (INIS)

    Tang Xiangyang; Hsieh Jiang

    2007-01-01

    A cone-angle-based window function is defined in this manuscript for image reconstruction using helical cone beam filtered backprojection (CB-FBP) algorithms. Rather than defining the window boundaries in a two-dimensional detector acquiring projection data for computed tomographic imaging, the cone-angle-based window function deals with data redundancy by selecting rays with the smallest cone angle relative to the reconstruction plane. To be computationally efficient, an asymptotic approximation of the cone-angle-based window function is also given and analyzed in this paper. The benefit of using such an asymptotic approximation also includes the avoidance of functional discontinuities that cause artifacts in reconstructed tomographic images. The cone-angle-based window function and its asymptotic approximation provide a way, equivalent to the Tam-Danielsson window, for helical CB-FBP reconstruction algorithms to deal with data redundancy, regardless of whether the helical pitch is constant or dynamically variable during a scan. By taking the cone-parallel geometry as an example, a computer simulation study is conducted to evaluate the proposed window function and its asymptotic approximation for the helical CB-FBP reconstruction algorithm to handle data redundancy. The computer-simulated Forbild head and thorax phantoms are utilized in the performance evaluation, showing that the proposed cone-angle-based window function and its asymptotic approximation can deal with data redundancy very well in cone beam image reconstruction from projection data acquired along helical source trajectories. Moreover, a numerical study carried out in this paper reveals that the proposed cone-angle-based window function is actually equivalent to the Tam-Danielsson window, and rigorous mathematical proofs are being investigated.
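
    A toy version of a smooth cone-angle-based weighting, in the spirit of the asymptotic approximation described above: among redundant rays measuring the same line integral, favour those with the smallest absolute cone angle, using a smooth normalised weight to avoid the hard 0/1 discontinuity. The softmax form and sharpness parameter below are this sketch's own choices, not the paper's window function.

```python
import math

def cone_angle_weights(cone_angles, sharpness=50.0):
    # Smooth redundancy weights: rays with smaller |cone angle| relative
    # to the reconstruction plane get larger weights; the weights sum to 1.
    scores = [math.exp(-sharpness * abs(a)) for a in cone_angles]
    total = sum(scores)
    return [s / total for s in scores]

# Three redundant rays (cone angles in radians) measuring the same line:
weights = cone_angle_weights([0.02, -0.10, 0.25])
print([round(w, 3) for w in weights])
```

    As sharpness grows, the weighting approaches the hard smallest-cone-angle selection while remaining continuous.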

  14. Time-critical database condition data handling in the CMS experiment during the first data taking period

    CERN Document Server

    Di Guida, Salvatore

    2011-01-01

    Automatic, synchronous and of course reliable population of the condition databases is critical for the correct operation of the online selection as well as of the offline reconstruction and analysis of data. In this complex infrastructure, monitoring and fast detection of errors is a very challenging task. To recover the system and to put it in a safe state requires spotting a faulty situation within strict time constraints. We will describe here the system put in place in the CMS experiment to automate the processes that populate centrally the Condition Databases and make condition data promptly available both online for the high-level trigger and offline for reconstruction. The data are automatically collected using centralized jobs or are "dropped" by the users in dedicated services (offline and online drop-box), which synchronize them and take care of writing them into the online database. Then they are automatically streamed to the offline database, and thus are immediately acce...

  15. Short-term effects of air pollution on lower respiratory diseases and forecasting by the group method of data handling

    Science.gov (United States)

    Zhu, Wenjin; Wang, Jianzhou; Zhang, Wenyu; Sun, Donghuai

    2012-05-01

    The risk of lower respiratory diseases was significantly correlated with the monthly average concentrations of SO2 and NO2, and the association rules have high lifts. In view of Lanzhou's special geographical location, and taking into account the impact of different seasons, especially winter, the relations between air pollutants and respiratory disease deserve further study. In this study, the monthly average concentrations of SO2, NO2 and PM10 and the monthly number of people hospitalized for lower respiratory disease from January 2001 to December 2005 were grouped into equidistant bins and treated as the items of transactions. Then, based on relational algebraic theory, we employed optimized relational association rules to mine the association rules of the transactions. Based on the association rules revealing the effects of air pollutants on lower respiratory disease, we forecast the number of people suffering from lower respiratory disease by the group method of data handling (GMDH), to reveal the risk and offer guidance to the hospital in Xigu District, the most seriously polluted district in Lanzhou. The data and analysis indicate that individuals may be susceptible to the short-term effects of pollution and thus suffer from lower respiratory diseases, and that this effect is seasonal.
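
    The "lift" of an association rule between pollutant levels and hospitalisation levels can be computed from binned monthly transactions as P(A and B) / (P(A) * P(B)); values above 1 indicate co-occurrence beyond independence. The monthly transactions and bin labels below are invented for illustration, not the Lanzhou data.

```python
def lift(transactions, antecedent, consequent):
    # lift(A -> B) = P(A and B) / (P(A) * P(B)); > 1 means the pollutant
    # level and the hospitalisation level co-occur more often than chance.
    n = len(transactions)
    p_a = sum(antecedent <= t for t in transactions) / n
    p_b = sum(consequent <= t for t in transactions) / n
    p_ab = sum((antecedent | consequent) <= t for t in transactions) / n
    return p_ab / (p_a * p_b)

# Each month discretised into level bins (hypothetical labels):
months = [
    {"SO2=high", "NO2=high", "resp=high"},
    {"SO2=high", "NO2=mid", "resp=high"},
    {"SO2=low", "NO2=low", "resp=low"},
    {"SO2=low", "NO2=mid", "resp=low"},
    {"SO2=high", "NO2=high", "resp=high"},
    {"SO2=low", "NO2=low", "resp=low"},
]
print(round(lift(months, {"SO2=high"}, {"resp=high"}), 2))  # → 2.0
```

    A GMDH forecaster would then be trained on the binned series itself; the rule mining only identifies which pollutant levels are informative predictors.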

  16. Time-critical Database Condition Data Handling in the CMS Experiment During the First Data Taking Period

    International Nuclear Information System (INIS)

    Cavallari, Francesca; Gruttola, Michele de; Di Guida, Salvatore; Innocente, Vincenzo; Pfeiffer, Andreas; Govi, Giacomo; Pierro, Antonio

    2011-01-01

    Automatic, synchronous and reliable population of the condition databases is critical for the correct operation of the online selection as well as of the offline reconstruction and analysis of data. In this complex infrastructure, monitoring and fast detection of errors is a very challenging task. In this paper, we describe the CMS experiment system to process and populate the Condition Databases and make condition data promptly available both online for the high-level trigger and offline for reconstruction. The data are automatically collected using centralized jobs or are 'dropped' by the users in dedicated services (offline and online drop-box), which synchronize them and take care of writing them into the online database. Then they are automatically streamed to the offline database, and thus are immediately accessible offline worldwide. The condition data are managed by different users using a wide range of applications. In normal operation the database monitor is used to provide simple timing information and the history of all transactions for all database accounts, and in the case of faults it is used to return simple error messages and more complete debugging information.

  17. MOBBED: a computational data infrastructure for handling large collections of event-rich time series datasets in MATLAB.

    Science.gov (United States)

    Cockfield, Jeremy; Su, Kyungmin; Robbins, Kay A

    2013-01-01

    Experiments to monitor human brain activity during active behavior record a variety of modalities (e.g., EEG, eye tracking, motion capture, respiration monitoring) and capture a complex environmental context leading to large, event-rich time series datasets. The considerable variability of responses within and among subjects in more realistic behavioral scenarios requires experiments to assess many more subjects over longer periods of time. This explosion of data requires better computational infrastructure to more systematically explore and process these collections. MOBBED is a lightweight, easy-to-use, extensible toolkit that allows users to incorporate a computational database into their normal MATLAB workflow. Although capable of storing quite general types of annotated data, MOBBED is particularly oriented to multichannel time series such as EEG that have event streams overlaid with sensor data. MOBBED directly supports access to individual events, data frames, and time-stamped feature vectors, allowing users to ask questions such as what types of events or features co-occur under various experimental conditions. A database provides several advantages not available to users who process one dataset at a time from the local file system. In addition to archiving primary data in a central place to save space and avoid inconsistencies, such a database allows users to manage, search, and retrieve events across multiple datasets without reading the entire dataset. The database also provides infrastructure for handling more complex event patterns that include environmental and contextual conditions. The database can also be used as a cache for expensive intermediate results that are reused in such activities as cross-validation of machine learning algorithms. MOBBED is implemented over PostgreSQL, a widely used open source database, and is freely available under the GNU general public license at http://visual.cs.utsa.edu/mobbed. Source and issue reports for MOBBED

  18. A survey aimed at general citizens of the US and Japan about their attitudes toward electronic medical data handling.

    Science.gov (United States)

    Kimura, Michio; Nakaya, Jun; Watanabe, Hiroshi; Shimizu, Toshiro; Nakayasu, Kazuyuki

    2014-04-25

    To clarify the views of the general population of two countries (US and Japan) concerning the handling of their medical records electronically. We contacted people nationwide in the United States at random via Random Digit Dialing (RDD) to obtain 200 eligible responders. The questionnaire was designed to obtain information on their attitudes towards the handling of their medical records, disclosure of the name of their disease, secondary usage of information, compiling their records into a lifelong medical record, and access to their medical records on the Internet. We also surveyed people of Shizuoka prefecture in Japan using the same questionnaire sent by mail, for which we obtained 457 valid answers. Even in an unidentifiable manner, US people feel that profit-oriented usage of medical data without specific consent is not acceptable. There is a significant difference between usage of unidentifiable medical data for profit (about 50% feel negatively) and for official/research purposes (about 30% feel negatively). About 60% of the US responders have a negative view on the proposal that unidentifiable medical information be utilized for profit by private companies to attain healthcare cost savings. As regards compiling a lifelong medical record, positive and negative answers are almost equally divided in the US (46% vs. 38%), while more positive attitudes are seen in Japan (74% vs. 12%). However, any incentive measures aimed at changing attitudes to such compiling, including discounts on healthcare costs or insurance fees, are unwelcome to people regardless of their age or health condition in both surveys. Regarding access to their own medical record via the Internet, 38% of the US responders feel this is unacceptable while 50.5% were willing to accept it. Participants from the US think that the extent of sharing of their identifiable medical records should be limited to the doctors-in-charge and specified doctors referred to by their own doctors. On the other

  19. A Survey Aimed at General Citizens of the US and Japan about Their Attitudes toward Electronic Medical Data Handling

    Directory of Open Access Journals (Sweden)

    Michio Kimura

    2014-04-01

    Full Text Available Objectives: To clarify the views of the general population of two countries (US and Japan) concerning the handling of their medical records electronically. Methods: We contacted people nationwide in the United States at random via Random Digit Dialing (RDD) to obtain 200 eligible responders. The questionnaire was designed to obtain information on their attitudes towards the handling of their medical records, disclosure of the name of their disease, secondary usage of information, compiling their records into a lifelong medical record, and access to their medical records on the Internet. We also surveyed people of Shizuoka prefecture in Japan using the same questionnaire sent by mail, for which we obtained 457 valid answers. Results: Even in an unidentifiable manner, US people feel that profit-oriented usage of medical data without specific consent is not acceptable. There is a significant difference between usage of unidentifiable medical data for profit (about 50% feel negatively) and for official/research purposes (about 30% feel negatively). About 60% of the US responders have a negative view on the proposal that unidentifiable medical information be utilized for profit by private companies to attain healthcare cost savings. As regards compiling a lifelong medical record, positive and negative answers are almost equally divided in the US (46% vs. 38%), while more positive attitudes are seen in Japan (74% vs. 12%). However, any incentive measures aimed at changing attitudes to such compiling, including discounts on healthcare costs or insurance fees, are unwelcome to people regardless of their age or health condition in both surveys. Regarding access to their own medical record via the Internet, 38% of the US responders feel this is unacceptable while 50.5% were willing to accept it. Conclusions: Participants from the US think that the extent of sharing of their identifiable medical records should be limited to the doctors-in-charge and specified

  20. Standards should be applied in the prevention and handling of missing data for patient-centered outcomes research: a systematic review and expert consensus.

    Science.gov (United States)

    Li, Tianjing; Hutfless, Susan; Scharfstein, Daniel O; Daniels, Michael J; Hogan, Joseph W; Little, Roderick J A; Roy, Jason A; Law, Andrew H; Dickersin, Kay

    2014-01-01

    To recommend methodological standards in the prevention and handling of missing data for primary patient-centered outcomes research (PCOR). We searched National Library of Medicine Bookshelf and Catalog as well as regulatory agencies' and organizations' Web sites in January 2012 for guidance documents that had formal recommendations regarding missing data. We extracted the characteristics of included guidance documents and recommendations. Using a two-round modified Delphi survey, a multidisciplinary panel proposed mandatory standards on the prevention and handling of missing data for PCOR. We identified 1,790 records and assessed 30 as having relevant recommendations. We proposed 10 standards as mandatory, covering three domains. First, the single best approach is to prospectively prevent missing data occurrence. Second, use of valid statistical methods that properly reflect multiple sources of uncertainty is critical when analyzing missing data. Third, transparent and thorough reporting of missing data allows readers to judge the validity of the findings. We urge researchers to adopt rigorous methodology and promote good science by applying best practices to the prevention and handling of missing data. Developing guidance on the prevention and handling of missing data for observational studies and studies that use existing records is a priority for future research. Copyright © 2014 Elsevier Inc. All rights reserved.

  1. Handling missing data for the identification of charged particles in a multilayer detector: A comparison between different imputation methods

    Energy Technology Data Exchange (ETDEWEB)

    Riggi, S., E-mail: sriggi@oact.inaf.it [INAF - Osservatorio Astrofisico di Catania (Italy); Riggi, D. [Keras Strategy - Milano (Italy); Riggi, F. [Dipartimento di Fisica e Astronomia - Università di Catania (Italy); INFN, Sezione di Catania (Italy)

    2015-04-21

    Identification of charged particles in a multilayer detector by the energy loss technique may also be achieved by the use of a neural network. The performance of the network becomes worse when a large fraction of information is missing, for instance due to detector inefficiencies. Algorithms which provide a way to impute missing information have been developed over the past years. Among the various approaches, we focused on normal mixture models in comparison with standard mean imputation and multiple imputation methods. Further, to account for the intrinsic asymmetry of the energy loss data, we considered skew-normal mixture models and provided a closed-form implementation in the Expectation-Maximization (EM) algorithm framework to handle missing patterns. The method has been applied to a test case where the energy losses of pions, kaons and protons in a six-layer silicon detector are considered as input neurons to a neural network. Results are given in terms of reconstruction efficiency and purity of the various species in different momentum bins.
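
    The mean-imputation baseline that the paper compares against is simple to sketch; the mixture/EM approach is considerably more involved. A minimal illustration, with a toy energy-loss matrix (rows = particles, columns = detector layers) in which missing entries are NaN:

```python
import numpy as np

# Baseline mean imputation: replace each missing energy-loss value (NaN)
# with the mean of its column (detector layer) over the observed entries.
def mean_impute(X):
    X = X.astype(float).copy()
    col_means = np.nanmean(X, axis=0)        # per-layer mean over observed values
    rows, cols = np.where(np.isnan(X))
    X[rows, cols] = col_means[cols]          # fill each NaN with its column mean
    return X

# Toy data: three particles, three layers, two missing measurements.
X = np.array([[1.0, 2.0, np.nan],
              [3.0, np.nan, 6.0],
              [5.0, 4.0, 9.0]])
X_filled = mean_impute(X)
```

    Mean imputation ignores the correlations between layers that the mixture models exploit, which is precisely why the paper finds it a weaker choice for asymmetric, correlated energy-loss data.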

  2. "I spy, with my little sensor": fair data handling practices for robots between privacy, copyright and security

    Science.gov (United States)

    Schafer, Burkhard; Edwards, Lilian

    2017-07-01

    The paper suggests an amendment to Principle 4 of ethical robot design, and a demand for "transparency by design". It argues that while misleading vulnerable users as to the nature of a robot is a serious ethical issue, other forms of intentionally deceptive or unintentionally misleading aspects of robotic design pose challenges that are, on the one hand, more universal and harmful in their application and, on the other hand, more difficult to address consistently through design choices. The focus will be on transparent design regarding the sensory capacities of robots. Intuitive, low-tech but highly efficient privacy-preserving behaviour is regularly dependent on an accurate understanding of surveillance risks. Design choices that hide, camouflage or misrepresent these capacities can undermine these strategies. However, formulating an ethical principle of "sensor transparency" is not straightforward, as openness can also lead to greater vulnerability and with that security risks. We argue that the discussion on sensor transparency needs to be embedded in a broader discussion of "fair data handling principles" for robots that involve issues of privacy, but also intellectual property rights such as copyright.

  3. Criteria of GenCall score to edit marker data and methods to handle missing markers have an influence on accuracy of genomic predictions

    DEFF Research Database (Denmark)

    Edriss, Vahid; Guldbrandtsen, Bernt; Lund, Mogens Sandø

    2013-01-01

    The aim of this study was to investigate the effect of different strategies for handling low-quality or missing data on prediction accuracy for direct genomic values of protein yield, mastitis and fertility using a Bayesian variable model and a GBLUP model in the Danish Jersey population. The data...... contained 1071 Jersey bulls that were genotyped with the Illumina Bovine 50K chip. After preliminary editing, 39227 SNP remained in the dataset. Four methods to handle missing genotypes were: 1) BEAGLE: missing markers were imputed using Beagle 3.3 software, 2) COMMON: missing genotypes at a locus were...

  4. Group method of data handling and neural networks applied in monitoring and fault detection in sensors in nuclear power plants; Group Method of Data Handling (GMDH) e Redes Neurais na Monitoracao e Deteccao de Falhas em sensores de centrais nucleares

    Energy Technology Data Exchange (ETDEWEB)

    Bueno, Elaine Inacio

    2011-07-01

    The increasing demand for complexity, efficiency and reliability in modern industrial systems has stimulated studies on control theory applied to the development of Monitoring and Fault Detection systems. In this work a new Monitoring and Fault Detection methodology was developed using the GMDH (Group Method of Data Handling) algorithm and Artificial Neural Networks (ANNs), and was applied to the IEA-R1 research reactor at IPEN. The Monitoring and Fault Detection system was developed in two parts: the first was dedicated to preprocessing information, using the GMDH algorithm; and the second part to processing the information using ANNs. The GMDH algorithm was used in two different ways: firstly, to generate a better estimated database, called matrix z, which was used to train the ANNs; after that, to study the best set of variables to be used to train the ANNs, resulting in a better estimate of the monitored variables. The methodology was developed and tested using five different models: one Theoretical Model and four Models using different sets of reactor variables. After an exhaustive study dedicated to sensor Monitoring, Fault Detection in sensors was developed by simulating faults of 5%, 10%, 15% and 20% in the sensor database. The results obtained using the GMDH algorithm in the choice of the best input variables to the ANNs were better than those obtained using only ANNs, thus making possible the use of these methods in the implementation of a new Monitoring and Fault Detection methodology applied to sensors. (author)
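
    The GMDH building block underlying this kind of work can be sketched in a few lines. The following is a minimal, illustrative one-layer implementation (not the author's code): for every pair of inputs it fits the quadratic Ivakhnenko polynomial by least squares on a training split, then ranks candidates on a separate selection split (the "external criterion"); the survivors would feed the next layer as new inputs.

```python
import itertools
import numpy as np

# One GMDH layer: fit y = a0 + a1*xi + a2*xj + a3*xi*xj + a4*xi^2 + a5*xj^2
# for each input pair, rank by mean squared error on a held-out selection set.
def gmdh_layer(X_train, y_train, X_sel, y_sel, keep=2):
    def design(xi, xj):
        return np.column_stack([np.ones_like(xi), xi, xj, xi * xj, xi**2, xj**2])
    candidates = []
    for i, j in itertools.combinations(range(X_train.shape[1]), 2):
        A = design(X_train[:, i], X_train[:, j])
        coef, *_ = np.linalg.lstsq(A, y_train, rcond=None)
        pred = design(X_sel[:, i], X_sel[:, j]) @ coef
        err = float(np.mean((pred - y_sel) ** 2))
        candidates.append((err, (i, j), coef))
    candidates.sort(key=lambda c: c[0])
    return candidates[:keep]        # best partial models survive to the next layer

# Synthetic check: y depends on the interaction of inputs 0 and 1.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = 1.0 + 2.0 * X[:, 0] * X[:, 1] + 0.01 * rng.normal(size=200)
best = gmdh_layer(X[:100], y[:100], X[100:], y[100:])
```

    The external selection set is what gives GMDH its self-organizing character: model complexity grows layer by layer only as long as the held-out error keeps improving.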

  5. Energy spectra unfolding of fast neutron sources using the group method of data handling and decision tree algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Hosseini, Seyed Abolfazl, E-mail: sahosseini@sharif.edu [Department of Energy Engineering, Sharif University of Technology, Tehran 8639-11365 (Iran, Islamic Republic of); Afrakoti, Iman Esmaili Paeen [Faculty of Engineering & Technology, University of Mazandaran, Pasdaran Street, P.O. Box: 416, Babolsar 47415 (Iran, Islamic Republic of)

    2017-04-11

    Accurate unfolding of the energy spectrum of a neutron source gives important information about unknown neutron sources. The obtained information is useful in many areas like nuclear safeguards, nuclear nonproliferation, and homeland security. In the present study, the energy spectrum of a poly-energetic fast neutron source is reconstructed using the developed computational codes based on the Group Method of Data Handling (GMDH) and Decision Tree (DT) algorithms. The neutron pulse height distribution (neutron response function) in the considered NE-213 liquid organic scintillator has been simulated using the developed MCNPX-ESUT computational code (MCNPX-Energy engineering of Sharif University of Technology). The developed computational codes based on the GMDH and DT algorithms use some data for training, testing and validation steps. In order to prepare the required data, 4000 randomly generated energy spectra distributed over 52 bins are used. The randomly generated energy spectra and the simulated neutron pulse height distributions by MCNPX-ESUT for each energy spectrum are used as the output and input data. Since there is no need to solve the inverse problem with an ill-conditioned response matrix, the unfolded energy spectrum has the highest accuracy. The ²⁴¹Am-⁹Be and ²⁵²Cf neutron sources are used in the validation step of the calculation. The unfolded energy spectra for the used fast neutron sources have an excellent agreement with the reference ones. Also, the accuracy of the unfolded energy spectra obtained using the GMDH is slightly better than those obtained from the DT. The results obtained in the present study have good accuracy in comparison with the previously published paper based on the logsig and tansig transfer functions. - Highlights: • The neutron pulse height distribution was simulated using MCNPX-ESUT. • The energy spectrum of the neutron source was unfolded using GMDH. • The energy spectrum of the neutron source was

  6. Cybernetic group method of data handling (GMDH) statistical learning for hyperspectral remote sensing inverse problems in coastal ocean optics

    Science.gov (United States)

    Filippi, Anthony Matthew

    For complex systems, sufficient a priori knowledge is often lacking about the mathematical or empirical relationship between cause and effect or between inputs and outputs of a given system. Automated machine learning may offer a useful solution in such cases. Coastal marine optical environments represent such a case, as the optical remote sensing inverse problem remains largely unsolved. A self-organizing, cybernetic mathematical modeling approach known as the group method of data handling (GMDH), a type of statistical learning network (SLN), was used to generate explicit spectral inversion models for optically shallow coastal waters. Optically shallow water light fields represent a particularly difficult challenge in oceanographic remote sensing. Several algorithm-input data treatment combinations were utilized in multiple experiments to automatically generate inverse solutions for various inherent optical property (IOP), bottom optical property (BOP), constituent concentration, and bottom depth estimations. The objective was to identify the optimal remote-sensing reflectance Rrs(lambda) inversion algorithm. The GMDH also has the potential of inductive discovery of physical hydro-optical laws. Simulated data were used to develop generalized, quasi-universal relationships. The Hydrolight numerical forward model, based on radiative transfer theory, was used to compute simulated above-water remote-sensing reflectance Rrs(lambda) pseudodata, matching the spectral channels and resolution of the experimental Naval Research Laboratory Ocean PHILLS (Portable Hyperspectral Imager for Low-Light Spectroscopy) sensor. The input-output pairs were used for GMDH and artificial neural network (ANN) model development, the latter of which was used as a baseline, or control, algorithm. Both types of models were applied to in situ and aircraft data. Also, in situ spectroradiometer-derived Rrs(lambda) were used as input to an optimization-based inversion procedure. Target variables

  7. Group method of data handling and neural networks applied in monitoring and fault detection in sensors in nuclear power plants

    International Nuclear Information System (INIS)

    Bueno, Elaine Inacio

    2011-01-01

    The increasing demand for complexity, efficiency and reliability in modern industrial systems has stimulated studies on control theory applied to the development of Monitoring and Fault Detection systems. In this work a new Monitoring and Fault Detection methodology was developed using the GMDH (Group Method of Data Handling) algorithm and Artificial Neural Networks (ANNs), and was applied to the IEA-R1 research reactor at IPEN. The Monitoring and Fault Detection system was developed in two parts: the first was dedicated to preprocessing information, using the GMDH algorithm; and the second part to processing the information using ANNs. The GMDH algorithm was used in two different ways: firstly, to generate a better estimated database, called matrix z, which was used to train the ANNs; after that, to study the best set of variables to be used to train the ANNs, resulting in a better estimate of the monitored variables. The methodology was developed and tested using five different models: one Theoretical Model and four Models using different sets of reactor variables. After an exhaustive study dedicated to sensor Monitoring, Fault Detection in sensors was developed by simulating faults of 5%, 10%, 15% and 20% in the sensor database. The results obtained using the GMDH algorithm in the choice of the best input variables to the ANNs were better than those obtained using only ANNs, thus making possible the use of these methods in the implementation of a new Monitoring and Fault Detection methodology applied to sensors. (author)

  8. A Bit-Encoding Based New Data Structure for Time and Memory Efficient Handling of Spike Times in an Electrophysiological Setup.

    Science.gov (United States)

    Ljungquist, Bengt; Petersson, Per; Johansson, Anders J; Schouenborg, Jens; Garwicz, Martin

    2018-04-01

    Recent neuroscientific and technical developments of brain machine interfaces have put increasing demands on neuroinformatic databases and data handling software, especially when managing data in real time from large numbers of neurons. Extrapolating these developments, we here set out to construct a scalable software architecture that would enable near-future massive parallel recording, organization and analysis of neurophysiological data on a standard computer. To this end we combined, for the first time in the present context, bit-encoding of spike data with a specific communication format for real time transfer and storage of neuronal data, synchronized by a common time base across all unit sources. We demonstrate that our architecture can simultaneously handle data from more than one million neurons and provide feedback in real time, based on analysis of previously recorded data. In addition to managing recordings from very large numbers of neurons in real time, it also has the capacity to handle the extensive periods of recording time necessary in certain scientific and clinical applications. Furthermore, the bit-encoding proposed has the additional advantage of allowing an extremely fast analysis of spatiotemporal spike patterns in a large number of neurons. Thus, we conclude that this architecture is well suited to support current and near-future Brain Machine Interface requirements.

  9. The application of advanced remote systems technology to future waste handling facilities: Waste Systems Data and Development Program

    International Nuclear Information System (INIS)

    Kring, C.T.; Herndon, J.N.; Meacham, S.A.

    1987-01-01

    The Consolidated Fuel Reprocessing Program (CFRP) at the Oak Ridge National Laboratory (ORNL) has been advancing the technology in remote handling and remote maintenance of in-cell systems planned for future US nuclear fuel reprocessing plants. Much of the experience and technology developed over the past decade in this endeavor are directly applicable to the in-cell systems being considered for the facilities of the Federal Waste Management System (FWMS). The ORNL developments are based on the application of teleoperated force-reflecting servomanipulators controlled by an operator completely removed from the hazardous environment. These developments address the nonrepetitive nature of remote maintenance in the unstructured environments encountered in a waste handling facility. Employing technological advancements in dexterous manipulators, as well as basic design guidelines that have been developed for remotely maintained equipment and processes, can increase operation and maintenance system capabilities, thereby allowing the attainment of two FWMS major objectives: decreasing plant personnel radiation exposure and increasing plant availability by decreasing the mean-time-to-repair in-cell maintenance and process equipment. 5 refs., 7 figs

  10. Automated Data Handling And Instrument Control Using Low-Cost Desktop Computers And An IEEE 488 Compatible Version Of The ODETA V.

    Science.gov (United States)

    van Leunen, J. A. J.; Dreessen, J.

    1984-05-01

    The result of a measurement of the modulation transfer function is only useful as long as it is accompanied by a complete description of all relevant measuring conditions involved. For this reason it is necessary to file a full description of the relevant measuring conditions together with the results. In earlier times some of our results were rendered useless because some of the relevant measuring conditions were accidentally not written down and were forgotten. This was mainly due to the lack of consensus about which measuring conditions had to be filed together with the result of a measurement. One way to secure uniform and complete archiving of measuring conditions and results is to automate the data handling. An attendant advantage of automating the data handling is that it does away with the time-consuming correction of rough measuring results. The automation of the data handling was accomplished with rather cheap desktop computers, which were nevertheless powerful enough to allow us to automate the measurement as well. After automation of the data handling we started with automatic collection of rough measurement data. Step by step we extended the automation by letting the desktop computer control more and more of the measuring set-up. At present the desktop computer controls all the electrical and most of the mechanical measuring conditions. Further, it controls and reads the MTF measuring instrument. Focussing and orientation optimization can be fully automatic, semi-automatic or completely manual. MTF measuring results can be collected automatically but they can also be typed in by hand. Due to the automation we are able to implement proper archival of measuring results together with all necessary measuring conditions. The improved measuring efficiency made it possible to increase the number of routine measurements done in the same time period by an order of magnitude. To our surprise the measuring accuracy also improved by a factor of two. This was due to

  11. Test sample handling apparatus

    International Nuclear Information System (INIS)

    1981-01-01

    A test sample handling apparatus using automatic scintillation counting for gamma detection, for use in such fields as radioimmunoassay, is described. The apparatus automatically and continuously counts large numbers of samples rapidly and efficiently by the simultaneous counting of two samples. By means of sequential ordering of non-sequential counting data, it is possible to obtain precisely ordered data while utilizing sample carrier holders having a minimum length. (U.K.)

  12. BWR spent fuel storage cask performance test. Volume 1. Cask handling experience and decay heat, heat transfer, and shielding data

    International Nuclear Information System (INIS)

    McKinnon, M.A.; Doman, J.W.; Tanner, J.E.; Guenther, R.J.; Creer, J.M.; King, C.E.

    1986-02-01

    This report documents a heat transfer and shielding performance test conducted on a Ridihalgh, Eggers and Associates REA 2023 boiling water reactor (BWR) spent fuel storage cask. The testing effort consisted of three parts: pretest preparations, performance testing, and post-test activities. Pretest preparations included conducting cask handling dry runs and characterizing BWR spent fuel assemblies from Nebraska Public Power District's Cooper Nuclear Station. The performance test matrix included 14 runs consisting of two loadings, two cask orientations, and three backfill environments. Post-test activities included calorimetry and axial radiation scans of selected fuel assemblies, in-basin sipping of each assembly, crud collection, video and photographic scans, and decontamination of the cask interior and exterior

  13. Handling the data management needs of high-throughput sequencing data: SpeedGene, a compression algorithm for the efficient storage of genetic data

    Science.gov (United States)

    2012-01-01

    Background As Next-Generation Sequencing data becomes available, existing hardware environments do not provide sufficient storage space and computational power to store and process the data due to their enormous size. This is and will be a frequent problem that is encountered every day by researchers who are working on genetic data. There are some options available for compressing and storing such data, such as general-purpose compression software, PBAT/PLINK binary format, etc. However, these currently available methods either do not offer sufficient compression rates, or require a great amount of CPU time for decompression and loading every time the data is accessed. Results Here, we propose a novel and simple algorithm for storing such sequencing data. We show that the compression factor of the algorithm ranges from 16 to several hundred, which potentially allows SNP data of hundreds of Gigabytes to be stored in hundreds of Megabytes. We provide a C++ implementation of the algorithm, which supports direct loading and parallel loading of the compressed format without requiring extra time for decompression. By applying the algorithm to simulated and real datasets, we show that the algorithm gives a greater compression rate than the commonly used compression methods, and the data-loading process takes less time. Also, the C++ library provides direct-data-retrieving functions, which allows the compressed information to be easily accessed by other C++ programs. Conclusions The SpeedGene algorithm enables the storage and the analysis of next generation sequencing data in current hardware environments, making system upgrades unnecessary. PMID:22591016
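
    A baseline form of genotype compression, in the spirit of (though not identical to) SpeedGene or the PLINK binary format mentioned above, is 2-bit packing: each biallelic genotype 0/1/2 fits in two bits, so four genotypes pack into one byte, a 4x reduction over byte-per-genotype storage before any further compression. A minimal sketch:

```python
# Sketch: pack biallelic genotypes (minor-allele counts 0/1/2) at 2 bits each.
def pack_genotypes(genos):
    out = bytearray((len(genos) + 3) // 4)          # 4 genotypes per byte
    for i, g in enumerate(genos):
        out[i // 4] |= (g & 0b11) << (2 * (i % 4))  # 2-bit slot within the byte
    return bytes(out)

def unpack_genotypes(data, n):
    """Recover the first n genotypes from the packed bytes."""
    return [(data[i // 4] >> (2 * (i % 4))) & 0b11 for i in range(n)]

genos = [0, 1, 2, 0, 2, 2, 1]
packed = pack_genotypes(genos)                      # 7 genotypes -> 2 bytes
restored = unpack_genotypes(packed, len(genos))
```

    The bit-level layout also explains the "no decompression step" property claimed in the abstract: a packed genotype can be read in place with a shift and a mask, so loading cost is essentially the cost of reading the bytes.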

  14. Kollektiv Handling

    DEFF Research Database (Denmark)

    Ibsen, Flemming; Høgedahl, Laust Kristian; Scheuer, Steen

    wage earners orient themselves differently and have changed their preferences regarding choosing, or opting out of, trade-union membership. This book builds on unique data from a comprehensive survey whose purpose has been to map Danish wage earners' motives for switching or opting out of a trade union. The book's

  15. Development of an integrated data acquisition and handling system based on digital time series analysis for the measurement of plasma fluctuations

    International Nuclear Information System (INIS)

    Ghayspoor, R.; Roth, J.R.

    1986-01-01

    The nonlinear characteristics of data obtained from many plasma diagnostic systems require the power of modern computers for on-line data processing and reduction. The objective of this work is to develop an integrated data acquisition and handling system based on digital time series analysis techniques. These techniques make it possible to investigate the nature of plasma fluctuations and the physical processes which give rise to them. The approach is to digitize the data and to generate various spectra by means of Fast Fourier Transforms (FFT). Of particular interest are the computer-generated auto-power spectrum, cross-power spectrum, phase spectrum, and squared-coherency spectrum. Software programs based on those developed by Jae Y. Hong at the University of Texas are utilized for these spectra. The LeCroy 3500-SA signal analyzer and a VAX 11/780 are used as the data handling and reduction system in this work. In this report, the software required to link these two systems is described
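
    The four spectra named above can be sketched with standard FFT routines. The following is a minimal illustration (assuming NumPy, not the original VAX/LeCroy software); the spectra are averaged over segments, since the squared coherency of a single un-averaged record is identically one.

```python
import numpy as np

def fluctuation_spectra(x, y, dt, nseg=4):
    """Auto-power, cross-power phase, and squared-coherency spectra,
    averaged over nseg non-overlapping segments (averaging is what makes
    the coherency estimate meaningful)."""
    n = len(x) // nseg
    Pxx = Pyy = Pxy = 0.0
    for k in range(nseg):
        X = np.fft.rfft(x[k * n:(k + 1) * n])
        Y = np.fft.rfft(y[k * n:(k + 1) * n])
        Pxx = Pxx + np.abs(X) ** 2      # auto-power of x
        Pyy = Pyy + np.abs(Y) ** 2      # auto-power of y
        Pxy = Pxy + X * np.conj(Y)      # cross-power
    f = np.fft.rfftfreq(n, dt)
    phase = np.angle(Pxy)               # phase spectrum
    coh2 = np.abs(Pxy) ** 2 / (Pxx * Pyy + 1e-30)  # squared coherency
    return f, Pxx / nseg, phase, coh2

# A 50 Hz test fluctuation, sampled at 1 kHz:
t = np.arange(1024) * 1e-3
x = np.sin(2 * np.pi * 50 * t)
f, Pxx, phase, coh2 = fluctuation_spectra(x, x, 1e-3)
peak = int(np.argmax(Pxx))
assert abs(f[peak] - 50) < 4 and coh2[peak] > 0.999
```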

  16. 7 CFR 926.9 - Handle.

    Science.gov (United States)

    2010-01-01

    ... the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing Agreements and Orders; Fruits, Vegetables, Nuts), DEPARTMENT OF AGRICULTURE DATA COLLECTION, REPORTING AND RECORDKEEPING REQUIREMENTS APPLICABLE TO CRANBERRIES NOT SUBJECT TO THE CRANBERRY MARKETING ORDER § 926.9 Handle. Handle...

  17. [A guide to good practice for information security in the handling of personal health data by health personnel in ambulatory care facilities].

    Science.gov (United States)

    Sánchez-Henarejos, Ana; Fernández-Alemán, José Luis; Toval, Ambrosio; Hernández-Hernández, Isabel; Sánchez-García, Ana Belén; Carrillo de Gea, Juan Manuel

    2014-04-01

    The appearance of electronic health records has led to the need to strengthen the security of personal health data in order to ensure privacy. Despite the large number of technical security measures and recommendations that exist to protect the security of health data, there is an increase in violations of the privacy of patients' personal data in healthcare organizations, which is in many cases caused by the mistakes or oversights of healthcare professionals. In this paper, we present a guide to good practice for information security in the handling of personal health data by health personnel, drawn from recommendations, regulations and national and international standards. The material presented in this paper can be used in the security audit of health professionals, or as a part of continuing education programs in ambulatory care facilities. Copyright © 2013 Elsevier España, S.L. All rights reserved.

  18. Belgian and Spanish consumption data and consumer handling practices for fresh fruits and vegetables useful for further microbiological and chemical exposure assessment.

    Science.gov (United States)

    Jacxsens, L; Ibañez, I Castro; Gómez-López, V M; Fernandes, J Araujo; Allende, A; Uyttendaele, M; Huybrechts, I

    2015-04-01

    A consumer survey was organized in Spain and Belgium to obtain consumption data and to gain insight into consumer handling practices for fresh vegetables consumed raw or minimally processed (i.e., heads of leafy greens, bell peppers, tomatoes, fresh herbs, and precut and packed leafy greens) and fruits to be consumed without peeling (i.e., apples, grapes, strawberries, raspberries, other berries, fresh juices, and precut mixed fruit). This information can be used for microbiological and/or chemical food safety research. After extensive cleanup of the raw databases for missing and extreme values, and after age correction, information from 583 respondents from Spain and 1,605 respondents from Belgium (18 to 65 years of age) was retained. Daily intake (grams per day) was calculated taking into account frequency and seasonality of consumption, and distributions were obtained that can be used in quantitative risk assessment for chemical hazards with chronic effects on human health. Data were also recalculated to obtain discrete distributions of consumption per portion and the corresponding frequency of consumption, which can be used in acute microbiological risk assessment or outbreak investigations. The ranked median daily consumption of fruits and vegetables was similar in Spain and Belgium: apple > strawberry > grapes > strawberries and raspberries; and tomatoes > leafy greens > bell peppers > fresh herbs. However, vegetable consumption was higher (in terms of both portion size and frequency of consumption) in Spain than in Belgium, whereas the opposite was found for fruit consumption. Regarding consumer handling practices related to storage time and method, Belgian consumers stored their fresh produce in a refrigerator less frequently, and for shorter times, than Spanish consumers. Washing practices for lettuce heads and packed leafy greens also differed. The survey revealed differences between these two countries in consumption and consumer handling practices.

  19. Review of guidelines and literature for handling missing data in longitudinal clinical trials with a case study.

    Science.gov (United States)

    Liu, M; Wei, L; Zhang, J

    2006-01-01

    Missing data in clinical trials are inevitable. We highlight the ICH guidelines and the CPMP points to consider on missing data. Specifically, we outline how missing data issues should be considered when designing, planning and conducting studies in order to minimize their impact. We also go beyond the coverage of these two documents: we provide a more detailed review of the basic concepts of missing data and frequently used terminology, give examples of typical missing data mechanisms, and discuss technical details and literature for several frequently used statistical methods and their associated software. Finally, we provide a case study in which the principles outlined in this paper are applied to one clinical program at the protocol design, data analysis plan and other stages of a clinical trial.
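
    One of the frequently used methods such reviews discuss for longitudinal trials is last observation carried forward (LOCF). A minimal sketch, for illustration only: LOCF implicitly assumes the outcome stays flat after dropout, a strong assumption that is one reason the guidelines stress minimizing missing data by design.

```python
def locf(series):
    """Last observation carried forward: replace each missing value (None)
    with the most recent observed value; leading missings stay missing."""
    out, last = [], None
    for v in series:
        if v is not None:
            last = v
        out.append(last)
    return out

# One subject's outcome across five scheduled visits, two visits missed:
visits = [10.0, None, 12.0, None, None]
assert locf(visits) == [10.0, 10.0, 12.0, 12.0, 12.0]
```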

  20. Research Note: The consequences of different methods for handling missing network data in Stochastic Actor Based Models.

    Science.gov (United States)

    Hipp, John R; Wang, Cheng; Butts, Carter T; Jose, Rupa; Lakon, Cynthia M

    2015-05-01

    Although stochastic actor based models (e.g., as implemented in the SIENA software program) are growing in popularity as a technique for estimating longitudinal network data, a relatively understudied issue is the consequence of missing network data for longitudinal analysis. We explore this issue in our research note by utilizing data from four schools in an existing dataset (the AddHealth dataset) over three time points, assessing the substantive consequences of using four different strategies for addressing missing network data. The results indicate that whereas some measures in such models are estimated relatively robustly regardless of the strategy chosen for addressing missing network data, some of the substantive conclusions will differ based on the missing data strategy chosen. These results have important implications for this burgeoning applied research area, implying that researchers should more carefully consider how they address missing data when estimating such models.

  1. Software development for statistical handling of dosimetric and epidemiological data base; Programacion para la explotacion estadistica de los bancos de datos dosimetrico y epidemiologico

    Energy Technology Data Exchange (ETDEWEB)

    Amaro, M

    1990-07-01

    The dose records of different groups of occupationally exposed workers are available in a computerized data base whose main purpose is individual dose follow-up. Beyond this objective, such a dosimetric data base can also be used for statistical analysis. The statistical information that can be extracted from the data base serves mainly two kinds of objectives: individual and collective dose distributions and statistics, and epidemiological statistics. The report describes the software developed to produce the statistical reports required by the Regulatory Body, as well as any other type of dose distribution or statistics to be included in epidemiological studies. A User's Guide for the operators who handle this software package, and the code listings, are also included in the report. (Author) 2 refs.
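
    A typical statistical product of such a dosimetric data base is the dose-distribution table: the number of workers per annual-dose band, plus the collective dose. A sketch in which the dose bands and worker doses are invented for illustration:

```python
from bisect import bisect_right

def dose_distribution(doses_mSv, bands=(0.1, 1.0, 5.0, 20.0)):
    """Count workers per annual-dose band (mSv), as in regulatory dose reports.
    Bands here: <0.1, [0.1, 1), [1, 5), [5, 20), >=20 mSv (illustrative)."""
    counts = [0] * (len(bands) + 1)
    for d in doses_mSv:
        counts[bisect_right(bands, d)] += 1
    return counts

doses = [0.05, 0.4, 0.0, 2.3, 1.1, 0.7, 5.2, 0.2]  # hypothetical annual doses
collective_dose = sum(doses)                        # person-mSv
assert dose_distribution(doses) == [2, 3, 2, 1, 0]
assert abs(collective_dose - 9.95) < 1e-9
```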

  2. Development of a Data-Driven Predictive Model of Supply Air Temperature in an Air-Handling Unit for Conserving Energy

    Directory of Open Access Journals (Sweden)

    Goopyo Hong

    2018-02-01

    The purpose of this study was to develop a data-driven predictive model that can predict the supply air temperature (SAT) in an air-handling unit (AHU) by using a neural network. A case study was selected, and AHU operational data from December 2015 to November 2016 were collected. The data-driven predictive model was generated through an evolving process consisting of an initial model, an optimal model, and an adaptive model. To develop the optimal model, the input variables, the number of neurons and hidden layers, and the period of the training data set were considered. Since AHU data change over time, an adaptive model was developed with the ability to cope actively with the constantly changing data. At every prediction, this adaptive model selects, from 91 candidate models with two hidden layers, the one with the lowest mean square error (MSE) on a 12-hour test set. The adaptive model uses recently collected data as training data, employing the sliding window technique rather than the accumulative data method. Furthermore, additional testing was performed to validate the adaptive model using AHU data from another building. The final adaptive model predicts SAT with a root mean square error (RMSE) of less than 0.6 °C.
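
    The sliding-window idea (retraining only on the most recent samples rather than on everything accumulated) can be shown compactly. The sketch below uses an ordinary least-squares model as a stand-in for the paper's neural network, with synthetic inputs standing in for AHU sensor channels; all names and numbers are invented.

```python
import numpy as np

def sliding_window_fit(X, y, window):
    """Refit on only the most recent `window` samples (sliding window),
    rather than on the full accumulated history."""
    Xw, yw = X[-window:], y[-window:]
    A = np.c_[Xw, np.ones(len(Xw))]               # add a bias column
    coef, *_ = np.linalg.lstsq(A, yw, rcond=None)
    return coef

rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 3))   # e.g. outdoor temp, return-air temp, valve position
y = X @ np.array([2.0, -1.0, 0.5]) + 21.0        # synthetic SAT signal
coef = sliding_window_fit(X, y, window=48)        # last 48 samples only
pred = np.c_[X[-48:], np.ones(48)] @ coef
rmse = float(np.sqrt(np.mean((pred - y[-48:]) ** 2)))
assert rmse < 1e-6   # exact fit here, since the synthetic relation is linear
```

    A real adaptive scheme would repeat this refit at each prediction step, discarding the oldest window samples as new ones arrive.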

  3. Easy handling of tectonic data: the programs TectonicVB for Mac and TectonicsFP for Windows™

    Science.gov (United States)

    Ortner, Hugo; Reiter, Franz; Acs, Peter

    2002-12-01

    TectonicVB for Macintosh and TectonicsFP for Windows™ operating systems are two menu-driven computer programs which allow the shared use of data in these environments. The programs can produce stereographic plots of orientation data (great circles, poles, lineations). Frequently used statistical procedures, such as the calculation of eigenvalues and eigenvectors or of the mean vector with concentration parameters and confidence cone, can easily be performed. Fault data can be plotted in stereographic projection (Angelier and Hoeppener plots). Sorting of datasets into homogeneous subsets and rotation of tectonic data can be performed in interactive two-diagram windows. The paleostress tensor can be calculated from fault data sets using graphical (calculation of kinematic axes and the right dihedra method) or mathematical methods (direct inversion or numerical dynamical analysis). The calculations can be checked in dimensionless Mohr diagrams and fluctuation histograms.
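
    The mean-vector and eigenvalue statistics mentioned above operate on the direction cosines of orientation data. A short sketch (assuming NumPy, with the usual trend/plunge convention: azimuth from north, inclination downward; the measurements are invented):

```python
import numpy as np

def direction_cosines(trend_deg, plunge_deg):
    """Unit vector (north, east, down) from trend/plunge in degrees."""
    t, p = np.radians(trend_deg), np.radians(plunge_deg)
    return np.array([np.cos(p) * np.cos(t), np.cos(p) * np.sin(t), np.sin(p)])

def orientation_stats(trends, plunges):
    """Mean vector, concentration (R-bar) and orientation-matrix eigenvalues."""
    V = np.array([direction_cosines(t, p) for t, p in zip(trends, plunges)])
    resultant = V.sum(axis=0)
    R = np.linalg.norm(resultant)                 # resultant length
    eigvals = np.sort(np.linalg.eigvalsh(V.T @ V / len(V)))[::-1]
    return resultant / R, R / len(V), eigvals

trends = [40, 42, 38, 41, 39]    # invented lineation measurements
plunges = [10, 12, 9, 11, 10]
mean_vec, rbar, ev = orientation_stats(trends, plunges)
assert rbar > 0.99                          # tight cluster -> high concentration
assert abs(float(ev.sum()) - 1.0) < 1e-9    # eigenvalues sum to trace = 1
```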

  4. Handling Low-Density LiDAR Data: Calculating the Heights of Civil Constructions and the Accuracy Expected

    Directory of Open Access Journals (Sweden)

    Rubén Martínez Marín

    2013-01-01

    During recent years, in many developed countries, administrations and private companies have devoted considerable amounts of money to obtaining mapping data using airborne LiDAR. For many civil activities we can take advantage of this, since those data are available at no cost. Some important questions arise: Are those data good enough for determining the heights of civil constructions with the accuracy we need in civil works? What accuracy can we expect when using low-density LiDAR data (0.5 pts/m²)? In order to answer those questions, we have developed a specific methodology based on establishing a set of control points on top of several constructions and calculating the elevation of each one using postprocessed GPS. Those results were taken as correct values and compared with the elevations obtained by assigning values to the control points through interpolation of the LiDAR dataset. This paper shows the results obtained using low-density airborne LiDAR data and the accuracy achieved. Results show that LiDAR can be accurate enough (10–25 cm) to determine the height of civil constructions and to apply those data in many civil engineering activities.
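
    The methodology described (GPS-surveyed control elevations compared against values interpolated from the LiDAR-derived grid) reduces to an interpolation step plus an RMSE. A toy sketch with a hypothetical 2 × 2 elevation grid; all numbers are invented for illustration.

```python
import math

def bilinear(grid, cell, x, y):
    """Bilinear interpolation on a regular elevation grid with origin (0, 0)."""
    i, j = int(x // cell), int(y // cell)
    fx, fy = x / cell - i, y / cell - j
    return (grid[j][i] * (1 - fx) * (1 - fy) + grid[j][i + 1] * fx * (1 - fy)
            + grid[j + 1][i] * (1 - fx) * fy + grid[j + 1][i + 1] * fx * fy)

grid = [[10.0, 10.2],   # toy LiDAR-derived elevations, 1 m cell size
        [10.4, 10.6]]
controls = [(0.5, 0.5, 10.33), (0.25, 0.75, 10.38)]   # (x, y, GPS elevation)
errors = [bilinear(grid, 1.0, x, y) - z for x, y, z in controls]
rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
assert abs(rmse - 0.03) < 1e-6   # 3 cm on this toy grid
```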

  5. National Survey of Workplaces Handling and Manufacturing Nanomaterials, Exposure to and Health Effects of Nanomaterials, and Evaluation of Nanomaterial Safety Data Sheets

    Science.gov (United States)

    2016-01-01

    A national survey on workplace environment nanomaterial handling and manufacturing was conducted in 2014. Workplaces relevant to nanomaterials were in the order of TiO2 (91), SiO2 (88), carbon black (84), Ag (35), Al2O3 (35), ZnO (34), Pb (33), and CeO2 (31). The survey results indicated that the number of workplaces handling or manufacturing nanomaterials was 340 (0.27% of total 126,846) workplaces. The number of nanomaterials used and products was 546 (1.60 per company) and 583 (1.71 per company), respectively. For most workplaces, the results on exposure to hazardous particulate materials, including nanomaterials, were below current OELs, yet a few workplaces were above the action level. As regards the health status of workers, 9 workers were diagnosed with a suspected respiratory occupational disease, where 7 were recommended for regular follow-up health monitoring. 125 safety data sheets (SDSs) were collected from the nanomaterial-relevant workplaces and evaluated for their completeness and reliability. Only 4 CNT SDSs (3.2%) included the term nanomaterial, while most nanomaterial SDSs were not regularly updated and lacked hazard information. When taken together, the current analysis provides valuable national-level information on the exposure and health status of workers that can guide the next policy steps for nanomaterial management in the workplace. PMID:27556041

  6. Reporting and Handling Missing Outcome Data in Mental Health: A Systematic Review of Cochrane Systematic Reviews and Meta-Analyses

    Science.gov (United States)

    Spineli, Loukia M.; Pandis, Nikolaos; Salanti, Georgia

    2015-01-01

    Objectives: The purpose of the study was to provide empirical evidence about the reporting of methodology to address missing outcome data and the acknowledgement of their impact in Cochrane systematic reviews in the mental health field. Methods: Systematic reviews published in the Cochrane Database of Systematic Reviews after January 1, 2009 by…

  7. Realization of the low background neutrino detector Double Chooz. From the development of a high-purity liquid and gas handling concept to first neutrino data

    Energy Technology Data Exchange (ETDEWEB)

    Pfahler, Patrick

    2012-12-17

    Neutrino physics is one of the most vivid fields in particle physics. Within this field, neutrino oscillations are of special interest, as they allow the determination of the driving oscillation parameters, which are collected as mixing angles in the leptonic mixing matrix. The exact knowledge of these parameters is the main key for the investigation of new physics beyond the currently known Standard Model of particle physics. The Double Chooz experiment is one of three reactor disappearance experiments currently taking data, and it recently succeeded in discovering a non-zero value for the last neutrino mixing angle θ13. As successor of the CHOOZ experiment, Double Chooz will use two detectors with improved design, each of them composed of four concentrically nested detector vessels, each filled with a different detector liquid. The integrity of this multi-layered structure and the quality of the detector liquids are essential for the success of the experiment. Within this frame, the work presented here describes the production of two detector liquids, the filling and handling of the Double Chooz far detector, and the installation of all hardware components necessary for this purpose. In order to meet the strict requirements on the detector liquids, all components were individually selected in an extensive material selection process at TUM, which compared samples from different companies for their key properties: density, transparency, light yield and radiopurity. Based on these measurements, the compositions of the muon veto scintillator and the buffer liquid were determined. For the production of the detector liquids, a simple surface building close to the far detector site was upgraded into a large-scale storage and mixing facility, which allowed 90 m³ of muon veto scintillator and 110 m³ of buffer liquid to be separately mixed, handled and stored. For the muon veto scintillator, a master solution composed of 4800 l LAB, 180 kg PPO and 1.8 kg of bis-MSB was

  8. Realization of the low background neutrino detector Double Chooz. From the development of a high-purity liquid and gas handling concept to first neutrino data

    International Nuclear Information System (INIS)

    Pfahler, Patrick

    2012-01-01

    Neutrino physics is one of the most vivid fields in particle physics. Within this field, neutrino oscillations are of special interest, as they allow the determination of the driving oscillation parameters, which are collected as mixing angles in the leptonic mixing matrix. The exact knowledge of these parameters is the main key for the investigation of new physics beyond the currently known Standard Model of particle physics. The Double Chooz experiment is one of three reactor disappearance experiments currently taking data, and it recently succeeded in discovering a non-zero value for the last neutrino mixing angle θ13. As successor of the CHOOZ experiment, Double Chooz will use two detectors with improved design, each of them composed of four concentrically nested detector vessels, each filled with a different detector liquid. The integrity of this multi-layered structure and the quality of the detector liquids are essential for the success of the experiment. Within this frame, the work presented here describes the production of two detector liquids, the filling and handling of the Double Chooz far detector, and the installation of all hardware components necessary for this purpose. In order to meet the strict requirements on the detector liquids, all components were individually selected in an extensive material selection process at TUM, which compared samples from different companies for their key properties: density, transparency, light yield and radiopurity. Based on these measurements, the compositions of the muon veto scintillator and the buffer liquid were determined. For the production of the detector liquids, a simple surface building close to the far detector site was upgraded into a large-scale storage and mixing facility, which allowed 90 m³ of muon veto scintillator and 110 m³ of buffer liquid to be separately mixed, handled and stored. For the muon veto scintillator, a master solution composed of 4800 l LAB, 180 kg PPO and 1.8 kg of bis-MSB was produced and

  9. The Back-End of the Nuclear Fuel Cycle in Sweden. Considerations for safeguards and data handling

    International Nuclear Information System (INIS)

    Fritzell, Anni

    2011-01-01

    report, Paper 2, describes which data must be secured prior to encapsulation and disposal of the nuclear material. Fuel data is needed for safeguards reasons and for future national needs. The conclusions include a summary of the data types that cannot be recreated once the fuel assembly is encapsulated. The safeguards system needs data from a measurement showing that the fuel assembly to be encapsulated actually contains spent nuclear fuel, i.e. is not a dummy. Furthermore, each assembly must be undisputedly identified. These conclusions are supported by the diversion path analysis presented in Paper 3. Fuel data for national needs should cover all future needs that the State (which will assume responsibility for the repository and its contents after its closure) may have. Data that must be secured prior to encapsulation is the isotopic composition of the fuel including the uncertainties for the computed or measured data. The third and final part of the report, Paper 3, includes an analysis of possible paths from the encapsulation plant along which nuclear material could be diverted. The diversion path analysis is the basis for discussions on which data the safeguards system must have access to (see Paper 2) and on how a comprehensive safeguards system for the encapsulation plant could be designed (see Paper 1). The diversion path analysis includes a summary of identified diversion paths from the encapsulation process and detection points, i.e. points in the process where diversion can be detected

  10. The Back-End of the Nuclear Fuel Cycle in Sweden. Considerations for safeguards and data handling

    Energy Technology Data Exchange (ETDEWEB)

    Fritzell, Anni (ES-konsult, Solna (Sweden))

    2011-01-15

    report, Paper 2, describes which data must be secured prior to encapsulation and disposal of the nuclear material. Fuel data is needed for safeguards reasons and for future national needs. The conclusions include a summary of the data types that cannot be recreated once the fuel assembly is encapsulated. The safeguards system needs data from a measurement showing that the fuel assembly to be encapsulated actually contains spent nuclear fuel, i.e. is not a dummy. Furthermore, each assembly must be undisputedly identified. These conclusions are supported by the diversion path analysis presented in Paper 3. Fuel data for national needs should cover all future needs that the State (which will assume responsibility for the repository and its contents after its closure) may have. Data that must be secured prior to encapsulation is the isotopic composition of the fuel including the uncertainties for the computed or measured data. The third and final part of the report, Paper 3, includes an analysis of possible paths from the encapsulation plant along which nuclear material could be diverted. The diversion path analysis is the basis for discussions on which data the safeguards system must have access to (see Paper 2) and on how a comprehensive safeguards system for the encapsulation plant could be designed (see Paper 1). The diversion path analysis includes a summary of identified diversion paths from the encapsulation process and detection points, i.e. points in the process where diversion can be detected

  11. Handling mixed-state magnetization data for magnetocaloric studies-a solution to achieve realistic entropy behaviour

    International Nuclear Information System (INIS)

    Das, S; Amaral, J S; Amaral, V S

    2010-01-01

    We present an approach to extract a realistic magnetic entropy value from non-equilibrium magnetization data near the transition temperature of a typical first-order system with a mixed-phase state. The phase transformation influences the magnetization and is responsible for the anomalously large entropy values reported, some even higher than the theoretical limit. The effect of the mixed-phase state is modelled in the magnetization, and its non-physical contribution is removed to obtain a magnetic entropy in accordance with calorimetric experiment and theoretical simulation. This approach gives a reliable estimate of the magnetic entropy value, incorporating experimental non-equilibrium magnetization data and correcting the use of Maxwell's relation. (fast track communication)
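
    The correction targets Maxwell's relation, ΔS(T) = ∫₀^Hmax (∂M/∂T)_H dH, which is valid only for equilibrium magnetization. A sketch of its numerical evaluation on a toy single-phase M(T, H) grid (assuming NumPy; the magnetization model is invented for illustration):

```python
import numpy as np

def entropy_change(T, H, M):
    """Magnetic entropy change via Maxwell's relation on an equilibrium
    M(T, H) grid: dS(T) = integral of (dM/dT)_H over H, trapezoidal rule."""
    dMdT = np.gradient(M, T, axis=0)                 # dM/dT at each field
    return np.sum(0.5 * (dMdT[:, 1:] + dMdT[:, :-1]) * np.diff(H), axis=1)

T = np.linspace(280.0, 320.0, 41)                    # K
H = np.linspace(0.0, 5.0, 51)                        # applied field, T
M = (340.0 - T)[:, None] * np.tanh(H[None, :])       # toy single-phase M(T, H)
dS = entropy_change(T, H, M)
assert np.all(dS < 0)   # applying a field suppresses spin entropy here
```

    Feeding mixed-state (non-equilibrium) M data into the same integral is what produces the spuriously large peaks the paper corrects for.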

  12. Functional tests of a prototype for the CMS-ATLAS common non-event data handling framework

    CERN Document Server

    Formica, Andrea; The ATLAS collaboration

    2016-01-01

    Since 2014, the ATLAS and CMS experiments have shared a common vision for the Conditions Database infrastructure required for the forthcoming LHC runs. The large commonality in the use cases to be satisfied made it possible to agree on an overall design solution that could meet the requirements of both experiments. A first prototype implementing this solution was completed in 2015 and made available to both experiments. The prototype is based on a web service implementing a REST API with a set of functions for the management of conditions data. The objects that constitute the elements of the data model are seen as resources on which CRUD operations can be performed via standard HTTP methods. The choice to insert a REST API into the architecture has several advantages: 1) the conditions data are exchanged in a neutral format (JSON or XML), allowing them to be processed by different technologies in different frameworks; 2) the client is agnostic with respect to the underlying technology adopted f...
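
    The resource-oriented pattern described (condition objects exchanged as neutral JSON through standard HTTP methods) can be sketched with a standard-library client. The base URL, resource names and payload fields below are hypothetical, not the actual ATLAS/CMS API.

```python
import json
from urllib.request import Request

BASE = "http://conditions.example.org/api"  # hypothetical service URL

def crud_request(method, resource, payload=None):
    """Build an HTTP request for a CRUD operation on a conditions resource.
    The resource layout and field names are illustrative only."""
    data = json.dumps(payload).encode() if payload is not None else None
    return Request(f"{BASE}/{resource}", data=data, method=method,
                   headers={"Content-Type": "application/json"})

# An interval-of-validity object exchanged in a neutral (JSON) format:
iov = {"tag": "pixel-align-v3", "since": 310000, "payload_hash": "abc123"}
req = crud_request("POST", "iovs", iov)   # create; GET/PUT/DELETE work alike
assert req.get_method() == "POST"
assert json.loads(req.data)["since"] == 310000
```

    Because the wire format is plain JSON over HTTP, the same service can be consumed by C++, Java or Python clients alike, which is the framework-neutrality argument made above.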

  13. Predicting methionine and lysine contents in soybean meal and fish meal using a group method of data handling-type neural network

    Energy Technology Data Exchange (ETDEWEB)

    Mottaghitalab, M.; Nikkhah, N.; Darmani-Kuhi, H.; López, S.; France, J.

    2015-07-01

    Artificial neural network models offer an alternative to linear regression analysis for predicting the amino acid content of feeds from their chemical composition. A group method of data handling-type neural network (GMDH-type NN), with an evolutionary genetic algorithm, was used to predict the methionine (Met) and lysine (Lys) contents of soybean meal (SBM) and fish meal (FM) from their proximate analyses (i.e. crude protein, crude fat, crude fibre, ash and moisture). A data set with 119 data lines for Met and 116 lines for Lys was used to develop GMDH-type NN models with two hidden layers. The data lines were divided into two groups to produce training and validation sets. The data sets were imported into the GEvoM software for training the networks. The predictive capability of the constructed models was evaluated by their ability to estimate the validation data sets accurately. A quantitative examination of goodness of fit for the predictive models was made using a number of precision, concordance and bias statistics. The statistical performance of the models revealed close agreement between observed and predicted Met and Lys contents for SBM and FM. The results of this study clearly illustrate the validity of GMDH-type NN models for estimating accurately the amino acid content of poultry feed ingredients from their chemical composition. (Author)
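
    Among the "precision, concordance and bias statistics" commonly used for such evaluations is Lin's concordance correlation coefficient, which penalizes both scatter and systematic bias between observed and predicted values. A standard-library sketch, with Met values invented for illustration:

```python
from statistics import mean, pvariance

def concordance_cc(obs, pred):
    """Lin's concordance correlation coefficient: equals 1 only for perfect
    agreement; falls with both scatter and location/scale bias."""
    mo, mp = mean(obs), mean(pred)
    cov = mean((o - mo) * (p - mp) for o, p in zip(obs, pred))
    return 2 * cov / (pvariance(obs) + pvariance(pred) + (mo - mp) ** 2)

obs = [2.1, 2.4, 2.0, 2.6, 2.3]    # invented "observed" Met contents
pred = [2.0, 2.5, 2.1, 2.5, 2.2]   # invented model predictions
ccc = concordance_cc(obs, pred)
assert 0.8 < ccc <= 1.0
```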

  14. A personal digital assistant application (MobilDent) for dental fieldwork data collection, information management and database handling.

    Science.gov (United States)

    Forsell, M; Häggström, M; Johansson, O; Sjögren, P

    2008-11-08

    To develop a personal digital assistant (PDA) application for oral health assessment fieldwork, including back-office and database systems (MobilDent). System design, construction and implementation of PDA, back-office and database systems. System requirements for MobilDent were collected, analysed and translated into system functions. User interfaces were implemented and the system architecture was outlined. MobilDent was based on a platform with .NET (Microsoft) components, using SQL Server 2005 (Microsoft) for data storage and the Windows Mobile (Microsoft) operating system. The PDA devices were Dell Axim. System functions and user interfaces were specified for MobilDent. User interfaces for the PDA, back-office and database systems were based on .NET programming. The PDA user interface was based on Windows and suited to a PDA display, whereas the back-office interface was designed for a normal-sized computer screen. A synchronisation module (MS ActiveSync, Microsoft) was used to enable download of field data from the PDA to the database. MobilDent is a feasible application for oral health assessment fieldwork, and the oral health assessment database may prove a valuable source for care planning, educational and research purposes. Further development of the MobilDent system will include wireless connectivity with download-on-demand technology.

  15. The impact of different strategies to handle missing data on both precision and bias in a drug safety study: a multidatabase multinational population-based cohort study

    Science.gov (United States)

    Martín-Merino, Elisa; Calderón-Larrañaga, Amaia; Hawley, Samuel; Poblador-Plou, Beatriz; Llorente-García, Ana; Petersen, Irene; Prieto-Alhambra, Daniel

    2018-01-01

    Background Missing data are often an issue in electronic medical records (EMRs) research. However, there are many ways that people deal with missing data in drug safety studies. Aim To compare the risk estimates resulting from different strategies for the handling of missing data in the study of venous thromboembolism (VTE) risk associated with antiosteoporotic medications (AOM). Methods New users of AOM (alendronic acid, other bisphosphonates, strontium ranelate, selective estrogen receptor modulators, teriparatide, or denosumab) aged ≥50 years during 1998–2014 were identified in two Spanish (the Base de datos para la Investigación Farmacoepidemiológica en Atención Primaria [BIFAP] and EpiChron cohort) and one UK (Clinical Practice Research Datalink [CPRD]) EMR. Hazard ratios (HRs) according to AOM (with alendronic acid as reference) were calculated adjusting for VTE risk factors, body mass index (that was missing in 61% of patients included in the three databases), and smoking (that was missing in 23% of patients) in the year of AOM therapy initiation. HRs and standard errors obtained using cross-sectional multiple imputation (MI) (reference method) were compared to complete case (CC) analysis – using only patients with complete data – and longitudinal MI – adding to the cross-sectional MI model the body mass index/smoking values as recorded in the year before and after therapy initiation. Results Overall, 422/95,057 (0.4%), 19/12,688 (0.1%), and 2,051/161,202 (1.3%) VTE cases/participants were seen in BIFAP, EpiChron, and CPRD, respectively. HRs moved from 100.00% underestimation to 40.31% overestimation in CC compared with cross-sectional MI, while longitudinal MI methods provided similar risk estimates compared with cross-sectional MI. Precision for HR improved in cross-sectional MI versus CC by up to 160.28%, while longitudinal MI improved precision (compared with cross-sectional) only minimally (up to 0.80%). Conclusion CC may substantially
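
    The contrast between complete-case analysis and multiple imputation can be shown in miniature: impute m times, analyse each completed data set, and pool the estimates (Rubin's rules). The imputation model below, resampling observed values, is deliberately crude and purely illustrative; all BMI values are invented.

```python
import random
from statistics import mean, variance

def impute_once(values, rng):
    """Fill each missing entry by resampling an observed value (a crude
    stand-in for a proper imputation model such as chained equations)."""
    observed = [v for v in values if v is not None]
    return [v if v is not None else rng.choice(observed) for v in values]

def pool(estimates):
    """Rubin-style pooling: overall estimate and between-imputation variance."""
    return mean(estimates), variance(estimates)

rng = random.Random(42)
bmi = [27.1, None, 31.4, None, 24.9, 29.8, None, 26.3]        # 3 of 8 missing
estimates = [mean(impute_once(bmi, rng)) for _ in range(20)]  # m = 20 imputations
pooled, between_var = pool(estimates)

complete_case = mean(v for v in bmi if v is not None)  # discards 3 subjects
assert abs(pooled - complete_case) < 2.0
assert between_var >= 0.0
```

    Unlike a complete-case analysis, the pooled result keeps all subjects and carries the between-imputation variance into the standard error, which is where the precision gains reported above come from.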

  16. Summary of Conceptual Models and Data Needs to Support the INL Remote-Handled Low-Level Waste Disposal Facility Performance Assessment and Composite Analysis

    International Nuclear Information System (INIS)

    Sondrup, A. Jeff; Schafter, Annette L.; Rood, Arthur S.

    2010-01-01

    An overview of the technical approach and the data required to support development of the performance assessment and composite analysis is presented for the remote-handled low-level waste disposal facility on-site alternative being considered at Idaho National Laboratory (INL). Previous analyses and available data that meet requirements are identified and discussed. Outstanding data and analysis needs are also identified and summarized. The on-site disposal facility is being evaluated in anticipation of the closure of the Radioactive Waste Management Complex at the INL. Assessments of facility performance and of composite performance are required to meet the Department of Energy's low-level waste requirements (DOE Order 435.1, 2001), which stipulate that operation and closure of the disposal facility be managed in a manner that is protective of worker and public health and safety, and the environment. The corresponding established procedures to ensure these protections are contained in DOE Manual 435.1-1, Radioactive Waste Management Manual (DOE M 435.1-1, 2001). Requirements include assessment of (1) all-pathways, (2) air pathway, (3) radon, and (4) groundwater pathway doses. Doses are computed from radionuclide concentrations in the environment. The performance assessment and composite analysis are being prepared to assess compliance with performance objectives, to establish limits on concentrations and inventories of radionuclides at the facility, and to support specification of design, construction, operation and closure requirements. The technical objectives of the PA and CA are accomplished primarily through the development of an established inventory and through the use of predictive environmental transport models implementing an overarching conceptual framework. This document reviews the conceptual model, its inherent assumptions, and the data required to implement the conceptual model in a numerical framework.
Available site-specific data and data sources

  17. Plutonium safe handling

    International Nuclear Information System (INIS)

    Tvehlov, Yu.

    2000-01-01

    The abstract, prepared on the basis of the IAEA's new guidance on the safe handling and storage of plutonium (publication No. 9 in the Safety Reports Series), is presented. It aims to set out internationally acknowledged criteria for evaluating radiation hazards and to summarize the experience accumulated in the nuclear states in the safe management of large quantities of plutonium. Data are presented on weapons-grade and civil plutonium, the degree of its hazard, and the measures for ensuring its safety, including data on the radiological consequences of an accident involving 10^18 fissions. Recommendations are given that make it possible to eliminate the danger of supercriticality, as well as of ignition and explosion; to maintain the tightness of the facility so as to exclude radioactive contamination and the possibility of internal irradiation; and to provide for plutonium security and physical protection and to reduce irradiation [ru

  18. Remote handling in ZEPHYR

    International Nuclear Information System (INIS)

    Andelfinger, C.; Lackner, E.; Ulrich, M.; Weber, G.; Schilling, H.B.

    1982-04-01

    A conceptual design of the ZEPHYR building is described. The listed radiation data show that remote handling devices will be necessary in most areas of the building. For difficult repair and maintenance work it is intended to transfer complete units from the experimental hall to a hot cell which provides better working conditions. The necessary crane systems and other transport means are summarized, as well as suitable commercially available manipulators and observation devices. The concept of automatic devices for cutting, welding, and other operations inside the vacuum vessel, together with the associated position control system, is sketched. Guidelines for the design of passive components are set up in order to facilitate remote operation. (orig.)

  19. Development of the remote-handled transuranic waste radioassay data quality objectives. An evaluation of RH-TRU waste inventories, characteristics, radioassay methods and capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Meeks, A.M.; Chapman, J.A.

    1997-09-01

    The Waste Isolation Pilot Plant (WIPP) will accept remote-handled transuranic (RH-TRU) waste as early as October of 2001. Several tasks must be accomplished to meet this schedule, one of which is the development of Data Quality Objectives (DQOs) and corresponding Quality Assurance Objectives (QAOs) for the assay of radioisotopes in RH-TRU waste. Oak Ridge National Laboratory (ORNL) was assigned the task of providing to the DOE the information necessary to aid in the development of DQOs for the radioassay of RH-TRU waste. Consistent with the DQO process, the information needed and presented in this report includes: identification of RH-TRU generator site radionuclide data that may have potential significance to the performance of the WIPP repository or to transportation requirements; evaluation of existing methods to measure the identified isotopic and quantitative radionuclide data; evaluation of existing data as a function of site waste streams using documented site information on fuel burnup, radioisotope processing and reprocessing, special research and development activities, measurement collection efforts, and acceptable knowledge; and the current status of technologies and capabilities at site facilities for the identification and assay of radionuclides in RH-TRU waste streams. This report is intended to provide guidance in developing the RH-TRU waste radioassay DQOs: first, by establishing a baseline from which to work; second, by identifying needs to fill in the gaps between what is known and achievable today and what will be required before DQOs can be formulated; and third, by recommending measures that should be taken to ensure that the DQOs in fact balance risk and cost with an achievable degree of certainty.

  20. Development of the remote-handled transuranic waste radioassay data quality objectives. An evaluation of RH-TRU waste inventories, characteristics, radioassay methods and capabilities

    International Nuclear Information System (INIS)

    Meeks, A.M.; Chapman, J.A.

    1997-09-01

    The Waste Isolation Pilot Plant (WIPP) will accept remote-handled transuranic (RH-TRU) waste as early as October of 2001. Several tasks must be accomplished to meet this schedule, one of which is the development of Data Quality Objectives (DQOs) and corresponding Quality Assurance Objectives (QAOs) for the assay of radioisotopes in RH-TRU waste. Oak Ridge National Laboratory (ORNL) was assigned the task of providing to the DOE the information necessary to aid in the development of DQOs for the radioassay of RH-TRU waste. Consistent with the DQO process, the information needed and presented in this report includes: identification of RH-TRU generator site radionuclide data that may have potential significance to the performance of the WIPP repository or to transportation requirements; evaluation of existing methods to measure the identified isotopic and quantitative radionuclide data; evaluation of existing data as a function of site waste streams using documented site information on fuel burnup, radioisotope processing and reprocessing, special research and development activities, measurement collection efforts, and acceptable knowledge; and the current status of technologies and capabilities at site facilities for the identification and assay of radionuclides in RH-TRU waste streams. This report is intended to provide guidance in developing the RH-TRU waste radioassay DQOs: first, by establishing a baseline from which to work; second, by identifying needs to fill in the gaps between what is known and achievable today and what will be required before DQOs can be formulated; and third, by recommending measures that should be taken to ensure that the DQOs in fact balance risk and cost with an achievable degree of certainty.

  1. Nuclear fuel handling apparatus

    International Nuclear Information System (INIS)

    Andrea, C.; Dupen, C.F.G.; Noyes, R.C.

    1977-01-01

    A fuel handling machine for a liquid metal cooled nuclear reactor in which a retractable handling tube and gripper are lowered into the reactor to withdraw a spent fuel assembly into the handling tube. The handling tube containing the fuel assembly immersed in liquid sodium is then withdrawn completely from the reactor into the outer barrel of the handling machine. The machine is then used to transport the spent fuel assembly directly to a remotely located decay tank. The fuel handling machine includes a decay heat removal system which continuously removes heat from the interior of the handling tube and which is capable of operating at its full cooling capacity at all times. The handling tube is supported in the machine from an articulated joint which enables it to readily align itself with the correct position in the core. An emergency sodium supply is carried directly by the machine to provide make up in the event of a loss of sodium from the handling tube during transport to the decay tank. 5 claims, 32 drawing figures

  2. The WAIS Melt Monitor: An automated ice core melting system for meltwater sample handling and the collection of high resolution microparticle size distribution data

    Science.gov (United States)

    Breton, D. J.; Koffman, B. G.; Kreutz, K. J.; Hamilton, G. S.

    2010-12-01

    Paleoclimate data are often extracted from ice cores by careful geochemical analysis of meltwater samples. The analysis of the microparticles found in ice cores can also yield unique clues about atmospheric dust loading and transport, dust provenance and past environmental conditions. Determination of microparticle concentration, size distribution and chemical makeup as a function of depth is especially difficult because the particle size measurement either consumes or contaminates the meltwater, preventing further geochemical analysis. Here we describe a microcontroller-based ice core melting system which allows the collection of separate microparticle and chemistry samples from the same depth intervals in the ice core, while logging and accurately depth-tagging real-time electrical conductivity and particle size distribution data. This system was designed specifically to support microparticle analysis of the WAIS Divide WDC06A deep ice core, but many of the subsystems are applicable to more general ice core melting operations. Major system components include: a rotary encoder to measure ice core melt displacement with 0.1 millimeter accuracy, a meltwater tracking system to assign core depths to conductivity, particle and sample vial data, an optical debubbler level control system to protect the Abakus laser particle counter from damage due to air bubbles, a Rabbit 3700 microcontroller which communicates with a host PC, collects encoder and optical sensor data and autonomously operates Gilson peristaltic pumps and fraction collectors to provide automatic sample handling, melt monitor control software operating on a standard PC allowing the user to control and view the status of the system, data logging software operating on the same PC to collect data from the melting, electrical conductivity and microparticle measurement systems. Because microparticle samples can easily be contaminated, we use optical air bubble sensors and high resolution ice core density
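    The depth-tagging logic described above (a rotary encoder with 0.1 mm resolution driving depth assignment of conductivity and particle readings) can be sketched as follows. The counts-per-millimetre constant and the record layout are illustrative assumptions, not taken from the actual WAIS Melt Monitor firmware:

```python
COUNTS_PER_MM = 10  # assumed encoder resolution: 0.1 mm per count

def depth_mm(encoder_counts: int, top_depth_mm: float) -> float:
    """Convert cumulative encoder counts since the core section was loaded
    into an absolute depth for the melt head."""
    return top_depth_mm + encoder_counts / COUNTS_PER_MM

def tag_readings(readings, top_depth_mm: float):
    """Attach a depth to each (encoder_counts, value) sensor reading so that
    conductivity and particle data can be aligned with sample vials."""
    return [(depth_mm(counts, top_depth_mm), value) for counts, value in readings]

# Two conductivity readings taken 500 counts (50 mm) apart,
# for a core section whose top sits at 10 m depth.
tagged = tag_readings([(0, 1.2), (500, 1.4)], top_depth_mm=10_000.0)
```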

  3. Software for handling MFME1

    International Nuclear Information System (INIS)

    Van der Merwe, W.G.

    1984-01-01

    The report deals with SEMFIP, a computer code for determining magnetic field measurements. The program is written in FORTRAN and ASSEMBLER. The preparations for establishing SEMFIP, the actual measurements, data handling and the problems that were experienced are discussed. Details on the computer code are supplied in an appendix

  4. Scheduling of outbound luggage handling at airports

    DEFF Research Database (Denmark)

    Barth, Torben C.; Pisinger, David

    2012-01-01

    This article considers the outbound luggage handling problem at airports. The problem is to assign handling facilities to outbound flights and to decide on the handling start times. This dynamic, near real-time assignment problem is part of daily airport operations. Quality, efficiency... Another solution method is a decomposition approach: the problem is divided into subproblems and solved in iterative steps. The different solution approaches are tested on real-world data from Frankfurt Airport.
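    The assignment problem outlined in this abstract can be illustrated with a simple greedy baseline: each flight, in departure order, is handled at the facility that becomes free earliest. This is not the article's method (the abstract only outlines its decomposition approach); it is just a minimal sketch of the problem structure, with hypothetical flight and facility names:

```python
def assign_flights(flights, facilities):
    """Greedy baseline: process flights by departure time and give each one
    the facility that frees up earliest, starting handling as soon as it is free.

    flights: list of (flight_id, departure_time, handling_duration) tuples.
    Returns {flight_id: (facility, start_time)}."""
    free_at = {f: 0 for f in facilities}   # time each facility becomes free
    plan = {}
    for flight, departure, duration in sorted(flights, key=lambda x: x[1]):
        facility = min(free_at, key=free_at.get)   # earliest-available facility
        start = free_at[facility]
        plan[flight] = (facility, start)
        free_at[facility] = start + duration
    return plan

plan = assign_flights([("LH1", 10, 5), ("BA2", 8, 5)], ["F1", "F2"])
```

A real solver would additionally enforce deadline and capacity constraints; the greedy pass only shows the facility/start-time decision the abstract describes.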

  5. How to Handle Abuse

    Science.gov (United States)

    ... How Do You Know Something Is Abuse? ... babysitter, teacher, coach, or a bigger kid. Child abuse can happen anywhere ... Tell Right Away: a kid who is being seriously hurt ...

  6. Grain Handling and Storage.

    Science.gov (United States)

    Harris, Troy G.; Minor, John

    This text for a secondary- or postecondary-level course in grain handling and storage contains ten chapters. Chapter titles are (1) Introduction to Grain Handling and Storage, (2) Elevator Safety, (3) Grain Grading and Seed Identification, (4) Moisture Control, (5) Insect and Rodent Control, (6) Grain Inventory Control, (7) Elevator Maintenance,…

  7. Handling Pyrophoric Reagents

    Energy Technology Data Exchange (ETDEWEB)

    Alnajjar, Mikhail S.; Haynie, Todd O.

    2009-08-14

    Pyrophoric reagents are extremely hazardous. Special handling techniques are required to prevent contact with air and the resulting fire. This document provides several methods for working with pyrophoric reagents outside of an inert atmosphere.

  8. Remote handling equipment

    International Nuclear Information System (INIS)

    Clement, G.

    1984-01-01

    After a definition of intervention, problems encountered for working in an adverse environment are briefly analyzed for development of various remote handling equipments. Some examples of existing equipments are given [fr

  9. Ergonomics and patient handling.

    Science.gov (United States)

    McCoskey, Kelsey L

    2007-11-01

    This study aimed to describe patient-handling demands in inpatient units during a 24-hour period at a military health care facility. A 1-day total population survey described the diverse nature and impact of patient-handling tasks relative to a variety of nursing care units, patient characteristics, and transfer equipment. Productivity baselines were established based on patient dependency, physical exertion, type of transfer, and time spent performing the transfer. Descriptions of the physiological effect of transfers on staff based on patient, transfer, and staff characteristics were developed. Nursing staff response to surveys demonstrated how patient-handling demands are impacted by the staff's physical exertion and level of patient dependency. The findings of this study describe the types of transfers occurring in these inpatient units and the physical exertion and time requirements for these transfers. This description may guide selection of the most appropriate and cost-effective patient-handling equipment required for specific units and patients.

  10. DDOS ATTACK DETECTION SIMULATION AND HANDLING MECHANISM

    Directory of Open Access Journals (Sweden)

    Ahmad Sanmorino

    2013-11-01

    In this study we discuss how to handle a DDoS attack coming from an attacker by using a detection method and a handling mechanism. Detection is performed by comparing the number of packets with the number of flows, whereas the handling mechanism works by limiting or dropping the packets detected as a DDoS attack. The study begins with a simulation on a real network, which aims to obtain real traffic data. The traffic dump obtained from the simulation is then used by the detection method in our prototype system, called DASHM (DDoS Attack Simulation and Handling Mechanism). In the experiments conducted, the proposed method successfully detects the DDoS attack and handles the incoming packets sent by the attacker.
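    The detection rule the abstract describes (comparing packet counts with flow counts) can be sketched as a simple threshold check. The function names and the threshold value below are illustrative assumptions, not taken from the paper:

```python
def is_ddos(packet_count: int, flow_count: int, ratio_threshold: float = 100.0) -> bool:
    """Flag traffic as a suspected DDoS attack when the average number of
    packets per flow exceeds a threshold (illustrative value).

    Flood attacks typically push many packets through few flows, so a high
    packets-per-flow ratio is a simple anomaly signal."""
    if flow_count == 0:
        return packet_count > 0   # packets with no flow records: suspicious
    return packet_count / flow_count > ratio_threshold

def handle(packets: list, flow_count: int) -> list:
    """Handling mechanism from the abstract: drop packets flagged as an attack."""
    return [] if is_ddos(len(packets), flow_count) else packets

suspected = is_ddos(50_000, 20)   # flood-like: 2500 packets per flow
normal = is_ddos(5_000, 400)      # ~12 packets per flow
```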

  11. Remote handling machines

    International Nuclear Information System (INIS)

    Sato, Shinri

    1985-01-01

    In nuclear power facilities, the management of radioactive wastes combines established technology with automatic techniques. Maintenance or support of such systems under a radiation field is important. To cope with this situation, the MF-2 system, the MF-3 system, and a manipulator system are described as remote handling machines. The MF-2 system consists of an MF-2 carrier truck, a control unit and a command trailer; it is capable of handling heavy objects and is driven electrically rather than hydraulically. The MF-3 system consists of a four-crawler truck and a manipulator; the truck can adopt versatile postures by means of its four independent crawlers. The manipulator system is bilateral in operation, so that delicate handling is possible. (Mori, K.)

  12. Practices of Handling

    DEFF Research Database (Denmark)

    Ræbild, Ulla

    to touch, pick up, carry, or feel with the hands. Figuratively it is to manage, deal with, direct, train, or control. Additionally, as a noun, a handle is something by which we grasp or open up something. Lastly, handle also has a Nordic root, here meaning to trade, bargain or deal. Together all four...... meanings seem to merge in the fashion design process, thus opening up for an embodied engagement with matter that entails direction giving, organizational management and negotiation. By seeing processes of handling as a key fashion methodological practice, it is possible to divert the discourse away from...... introduces four ways whereby fashion designers apply their own bodies as tools for design; a) re-activating past garment-design experiences, b) testing present garment-design experiences c) probing for new garment-design experiences and d) design of future garment experiences by body proxy. The paper...

  13. Remote handling at LAMPF

    International Nuclear Information System (INIS)

    Grisham, D.L.; Lambert, J.E.

    1983-01-01

    Experimental area A at the Clinton P. Anderson Meson Physics Facility (LAMPF) encompasses a large area. Presently there are four experimental target cells along the main proton beam line that have become highly radioactive, dictating that all maintenance be performed remotely. The Monitor remote handling system was developed to perform in situ maintenance at any location within area A. Owing to the complexity of the experimental systems and the confined space, conventional remote handling methods based upon hot cell and/or hot bay concepts are not workable. Contrary to conventional remote handling, which requires special tooling for each specifically planned operation, the Monitor concept aims at providing a totally flexible system capable of remotely performing general mechanical and electrical maintenance operations using standard tools. The Monitor system is described

  14. TRANSPORT/HANDLING REQUESTS

    CERN Multimedia

    Groupe ST/HM

    2002-01-01

    A new EDH document entitled 'Transport/Handling Request' will be in operation as of Monday, 11th February 2002, when the corresponding icon will be accessible from the EDH desktop, together with the application instructions. This EDH form will replace the paper-format transport/handling request form for all activities involving the transport of equipment and materials. However, the paper form will still be used for all vehicle-hire requests. The introduction of the EDH transport/handling request form is accompanied by the establishment of the following time limits for the various services concerned: 24 hours for the removal of office items, 48 hours for the transport of heavy items (of up to 6 metric tons and of standard road width), 5 working days for a crane operation, extra-heavy transport operation or complete removal, 5 working days for all transport operations relating to LHC installation. ST/HM Group, Logistics Section Tel: 72672 - 72202

  15. 30° inclination in handles of plastic boxes can reduce postural and muscular workload during handling

    Directory of Open Access Journals (Sweden)

    Luciana C. C. B. Silva

    2013-06-01

    BACKGROUND: The handling of materials, common in the industrial sector, is associated with lesions of the lumbar spine and the upper limbs. Inserting handles in industrial boxes is one way to reduce work-related risks. Although the position and angle of the handles are significant factors in comfort and safety during handling, these factors have rarely been studied objectively. OBJECTIVE: To compare the handling of a commercial box and of prototypes with handles, and to evaluate the effects on upper limb posture, muscle electrical activity, and perceived acceptability using different grips while handling materials from different heights. METHOD: Thirty-seven healthy volunteers evaluated the handles of prototypes that allowed changes in position (top and bottom) and angle (0°, 15°, and 30°). Wrist, elbow, and shoulder movements were evaluated using electrogoniometry and inclinometry. The muscle electrical activity of the wrist extensors, biceps brachii, and the upper portion of the trapezius was measured using a portable electromyograph. The recorded movement and electrical activity data were synchronized. Subjective acceptability was evaluated using a visual analog scale. RESULTS AND CONCLUSIONS: The prototypes with handles at a 30° angle produced the highest acceptability ratings, more neutral wrist positions, lower levels of electromyographic activity for the upper trapezius, and lower elevation angles for the arms. The different measurement methods were complementary in evaluating the upper limbs during handling.

  16. Proximity measuring device with backscattered radiation usable notably in remote handling or robotics, and associated data processing system

    Energy Technology Data Exchange (ETDEWEB)

    Andre, G; Espiau, B

    1985-05-03

    The invention concerns a proximity measuring device whose emitter, an electroluminescent diode, is controlled to emit short-duration (< 10 microseconds), high-intensity (> 1 A) flashes with periods greater than 100 microseconds. The emitter-object distance can be measured precisely over a 0-30 cm interval with the help of data processing of the response given by the device's receiver. This device can be used in remote handling and robotics.
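    The inversion step, from backscattered intensity to distance, can be sketched with a calibration-based model. The inverse-square falloff and the reference distance below are assumptions for illustration; the patented device relies on its own signal processing, which the abstract does not detail:

```python
import math

def distance_cm(received: float, reference: float, ref_distance_cm: float = 5.0) -> float:
    """Estimate emitter-object distance from backscattered intensity,
    assuming the received signal falls off with the square of distance and
    was calibrated to `reference` at `ref_distance_cm` (illustrative model)."""
    if received <= 0:
        raise ValueError("no backscattered signal received")
    return ref_distance_cm * math.sqrt(reference / received)

# A quarter of the calibration intensity implies double the distance.
d = distance_cm(received=0.25, reference=1.0)
```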

  17. Safe handling of tritium

    International Nuclear Information System (INIS)

    1991-01-01

    The main objective of this publication is to provide practical guidance and recommendations on operational radiation protection aspects related to the safe handling of tritium in laboratories, industrial-scale nuclear facilities such as heavy-water reactors, tritium removal plants and fission fuel reprocessing plants, and facilities for manufacturing commercial tritium-containing devices and radiochemicals. The requirements of nuclear fusion reactors are not addressed specifically, since there is as yet no tritium handling experience with them. However, much of the material covered is expected to be relevant to them as well. Annex III briefly addresses problems in the comparatively small-scale use of tritium at universities, medical research centres and similar establishments. However, the main subject of this publication is the handling of larger quantities of tritium. Operational aspects include designing for tritium safety, safe handling practice, the selection of tritium-compatible materials and equipment, exposure assessment, monitoring, contamination control and the design and use of personal protective equipment. This publication does not address the technologies involved in tritium control and cleanup of effluents, tritium removal, or immobilization and disposal of tritium wastes, nor does it address the environmental behaviour of tritium. Refs, figs and tabs

  18. Grain Grading and Handling.

    Science.gov (United States)

    Rendleman, Matt; Legacy, James

    This publication provides an introduction to grain grading and handling for adult students in vocational and technical education programs. Organized in five chapters, the booklet provides a brief overview of the jobs performed at a grain elevator and of the techniques used to grade grain. The first chapter introduces the grain industry and…

  19. Mars Sample Handling Functionality

    Science.gov (United States)

    Meyer, M. A.; Mattingly, R. L.

    2018-04-01

    The final leg of a Mars Sample Return campaign would be an entity that we have referred to as Mars Returned Sample Handling (MRSH.) This talk will address our current view of the functional requirements on MRSH, focused on the Sample Receiving Facility (SRF).

  20. Handling wood shavings

    Energy Technology Data Exchange (ETDEWEB)

    1974-09-18

    Details of bulk handling equipment suitable for collection and compressing wood waste from commercial joinery works are discussed. The Redler Bin Discharger ensures free flow of chips from storage silo discharge prior to compression into briquettes for use as fuel or processing into chipboard.

  1. ERROR HANDLING IN INTEGRATION WORKFLOWS

    Directory of Open Access Journals (Sweden)

    Alexey M. Nazarenko

    2017-01-01

    Simulation experiments performed while solving multidisciplinary engineering and scientific problems require the joint use of multiple software tools. Further, when following a preset plan of experiment or searching for optimum solutions, the same sequence of calculations is run multiple times with various simulation parameters, input data, or conditions, while the overall workflow does not change. Automation of simulations like these requires implementing a workflow in which tool execution and data exchange are usually controlled by a special type of software, an integration environment or platform. The result is an integration workflow (a platform-dependent implementation of some computing workflow) which, in the context of automation, is a composition of weakly coupled (in terms of communication intensity) typical subtasks. These compositions can be decomposed into a few workflow patterns (types of subtask interaction), and the patterns, in their turn, can be interpreted as higher-level subtasks. This paper considers the execution control and data exchange rules that should be imposed by the integration environment when an error is encountered by some integrated software tool. An error is defined as any abnormal behavior of a tool that invalidates its result data, thus disrupting the data flow within the integration workflow. The main requirement on the error handling mechanism implemented by the integration environment is to prevent abnormal termination of the entire workflow in the case of missing intermediate result data. Error handling rules are formulated on the basic pattern level and on the level of a composite task that can combine several basic patterns as next-level subtasks. The cases where workflow behavior may differ, depending on the user's purposes, when an error takes place, and the possible error handling options that can be specified by the user, are also noted in the work.
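    The central requirement stated in this abstract, that one failed tool must not abort the entire workflow, only the subtasks whose input data it invalidates, can be sketched as a runner that records per-task errors and skips dependents. The task names and the skip-on-missing-input policy are illustrative assumptions, not the paper's formal rules:

```python
from typing import Callable, Dict, List

def run_workflow(tasks: Dict[str, Callable[[], object]],
                 deps: Dict[str, List[str]]) -> Dict[str, object]:
    """Run tasks in declaration order. A failed task invalidates its own
    result and every task depending on it, but unrelated tasks still run."""
    results: Dict[str, object] = {}
    failed: set = set()
    for name, fn in tasks.items():
        if any(d in failed for d in deps.get(name, [])):
            failed.add(name)          # upstream data is missing: skip this task
            continue
        try:
            results[name] = fn()
        except Exception:
            failed.add(name)          # record the error, keep the workflow alive
    return results

# 'solve' fails, so 'report' is skipped, but 'mesh' still produces a result.
out = run_workflow(
    {"mesh": lambda: "grid", "solve": lambda: 1 / 0, "report": lambda: "ok"},
    {"solve": ["mesh"], "report": ["solve"]},
)
```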

  2. Handling and Transport Problems

    Energy Technology Data Exchange (ETDEWEB)

    Pomarola, J. [Head of Technical Section, Atomic Energy Commission, Saclay (France); Savouyaud, J. [Head of Electro-Mechanical Sub-Division, Atomic Energy Commission, Saclay (France)

    1960-07-01

    Arrangements for special or dangerous transport operations by road arising out of the activities of the Atomic Energy Commission are made by the Works and Installations Division which acts in concert with the Monitoring and Protection Division (MPD) whenever radioactive substances or appliances are involved. In view of the risk of irradiation and contamination entailed in handling and transporting radioactive substances, including waste, a specialized transport and storage team has been formed as a complement to the emergency and decontamination teams.

  3. Solid waste handling

    International Nuclear Information System (INIS)

    Parazin, R.J.

    1995-01-01

    This study presents estimates of the solid radioactive waste quantities that will be generated in the Separations, Low-Level Waste Vitrification and High-Level Waste Vitrification facilities, collectively called the Tank Waste Remediation System Treatment Complex, over the life of these facilities. This study then considers previous estimates from other 200 Area generators and compares alternative methods of handling (segregation, packaging, assaying, shipping, etc.)

  4. Handling of radioactive waste

    International Nuclear Information System (INIS)

    Sanhueza Mir, Azucena

    1998-01-01

    Based on the characteristics and quantities of the different types of radioactive waste produced in the country, this paper presents achievements in infrastructure and the way problems related to radioactive waste handling and management are solved. The objectives of maintaining facilities and capabilities for controlling, processing, and storing radioactive waste in conditioned form are attained within a well-defined legal framework intended to protect people and the environment (au)

  5. Renal phosphate handling: Physiology

    Directory of Open Access Journals (Sweden)

    Narayan Prasad

    2013-01-01

    Phosphorus is a common anion that plays an important role in energy generation. Renal phosphate handling is regulated by three organs, the parathyroid, kidney and bone, through feedback loops. These counter-regulatory loops also regulate intestinal absorption and thus maintain the serum phosphorus concentration within the physiologic range. Parathyroid hormone, vitamin D, fibroblast growth factor 23 (FGF23) and the klotho co-receptor are the key regulators of phosphorus balance in the body.

  6. Uranium hexafluoride handling

    International Nuclear Information System (INIS)

    1991-01-01

    The United States Department of Energy, Oak Ridge Field Office, and Martin Marietta Energy Systems, Inc., are co-sponsoring this Second International Conference on Uranium Hexafluoride Handling. The conference is offered as a forum for the exchange of information and concepts regarding the technical and regulatory issues and the safety aspects which relate to the handling of uranium hexafluoride. Through the papers presented here, we attempt not only to share technological advances and lessons learned, but also to demonstrate that we are concerned about the health and safety of our workers and the public, and are good stewards of the environment in which we all work and live. These proceedings are a compilation of the work of many experts in that phase of world-wide industry which comprises the nuclear fuel cycle. Their experience spans the entire range over which uranium hexafluoride is involved in the fuel cycle, from the production of UF6 from the naturally-occurring oxide to its re-conversion to oxide for reactor fuels. The papers furnish insights into the chemical, physical, and nuclear properties of uranium hexafluoride as they influence its transport, storage, and the design and operation of plant-scale facilities for production, processing, and conversion to oxide. The papers demonstrate, in an industry often cited for its excellent safety record, continuing efforts to further improve safety in all areas of handling uranium hexafluoride

  7. Uranium hexafluoride handling. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    1991-12-31

    The United States Department of Energy, Oak Ridge Field Office, and Martin Marietta Energy Systems, Inc., are co-sponsoring this Second International Conference on Uranium Hexafluoride Handling. The conference is offered as a forum for the exchange of information and concepts regarding the technical and regulatory issues and the safety aspects which relate to the handling of uranium hexafluoride. Through the papers presented here, we attempt not only to share technological advances and lessons learned, but also to demonstrate that we are concerned about the health and safety of our workers and the public, and are good stewards of the environment in which we all work and live. These proceedings are a compilation of the work of many experts in that phase of world-wide industry which comprises the nuclear fuel cycle. Their experience spans the entire range over which uranium hexafluoride is involved in the fuel cycle, from the production of UF6 from the naturally-occurring oxide to its re-conversion to oxide for reactor fuels. The papers furnish insights into the chemical, physical, and nuclear properties of uranium hexafluoride as they influence its transport, storage, and the design and operation of plant-scale facilities for production, processing, and conversion to oxide. The papers demonstrate, in an industry often cited for its excellent safety record, continuing efforts to further improve safety in all areas of handling uranium hexafluoride. Selected papers were processed separately for inclusion in the Energy Science and Technology Database.

  8. Torus sector handling system

    International Nuclear Information System (INIS)

    Grisham, D.L.

    1981-01-01

    A remote handling system is proposed for moving a torus sector of the accelerator from under the cryostat to a point where it can be handled by a crane, and for the reverse process for a new sector. Equipment recommendations are presented, as well as possible alignment schemes. Some general comments about future remote-handling methods and the present capabilities of existing systems are also included. The specific task to be addressed is the removal and replacement of a 425 to 450 ton torus sector. This requires a horizontal movement of approx. 10 m from the normal operating position to a point where further transport can be accomplished by more conventional means (crane or floor transporter). The same horizontal movement is required for reinstallation, but a positional tolerance of 2 cm is required to allow reasonable fit-up of the vacuum seal from the radial frames to the torus sector. Since the sectors are not only heavy but rather tall and narrow, the transport system must provide a safe, stable, and repeatable method of sector movement. This limited study indicates that the LAMPF-based method of transporting torus sectors offers a proven way of moving heavy items. In addition, the present state of the art in remote equipment is adequate for FED maintenance

  9. Handling of Solid Residues

    International Nuclear Information System (INIS)

    Medina Bermudez, Clara Ines

    1999-01-01

    Solid residues are of great interest and concern to the authorities, institutions, and communities that recognize in them a real threat to human health and the environment, reflected in the aesthetic deterioration of urban centers and the natural landscape, in the proliferation of disease-transmitting vectors, and in effects on biodiversity. Within the wide spectrum of topics related to environmental protection, the inadequate handling of solid and hazardous residues occupies an important place in the definition of environmentally sustainable policies and practices. Industrial development and population growth have caused a continuous increase in the production of solid residues; likewise, their composition grows more heterogeneous day by day. Sound handling rests on appropriate intervention at the different stages of integrated residue management, which include separation at the source, collection, handling, use, treatment, final disposal, and the institutional organization of the management system. Hazardous residues raise even greater concern: they range from the pathogenic residues generated in health-care and hospital establishments to the combustible, flammable, explosive, radioactive, volatile, corrosive, reactive, or toxic residues associated with the numerous industrial processes common in our developing countries.

  10. Harvesting and handling agricultural residues for energy

    Energy Technology Data Exchange (ETDEWEB)

    Jenkins, B.M.; Summer, H.R.

    1986-05-01

    Significant progress has been made in understanding the design needs of agricultural residue collection and handling systems, but additional research is required. Recommendations are made for research to (a) integrate residue collection and handling systems into general agricultural practices through the development of multi-use equipment and total harvest systems; (b) improve methods for routine evaluation of agricultural residue resources, possibly through remote sensing and image processing; (c) analyze biomass properties to obtain detailed data relevant to engineering design and analysis; (d) evaluate long-term environmental, social, and agronomic impacts of residue collection; (e) develop improved equipment with higher capacities to reduce residue collection and handling costs, with emphasis on optimal design of complete systems including collection, transportation, processing, storage, and utilization; and (f) produce standard forms of biomass fuels or products to enhance material handling and expand biomass markets through improved reliability and automatic control of biomass conversion and other utilization systems. 118 references.

  11. Preference Handling for Artificial Intelligence

    OpenAIRE

    Goldsmith, Judy; University of Kentucky; Junker, Ulrich; ILOG

    2009-01-01

    This article explains the benefits of preferences for AI systems and draws a picture of current AI research on preference handling. It thus provides an introduction to the topics covered by this special issue on preference handling.

  12. Religious Serpent Handling and Community Relations.

    Science.gov (United States)

    Williamson, W Paul; Hood, Ralph W

    2015-01-01

    Christian serpent handling sects of Appalachia comprise a community that has long been mischaracterized and marginalized by the larger communities surrounding them. To explore this dynamic, this article traces the emergence of serpent handling in Appalachia and the emergence of anti-serpent-handling state laws, which eventually failed to curb the practice, as local communities gave serpent handling groups support. We present two studies to consider for improving community relations with serpent handling sects. In study 1, we present data relating the incidence of reported serpent-bite deaths with the rise of anti-serpent-handling laws and their eventual abatement, based on increasing acceptance of serpent handlers by the larger community. Study 2 presents interview data on serpent bites and death that provide explanations for these events from the cultural and religious perspective. We conclude that first-hand knowledge about serpent handlers, and other marginalized groups, helps to lessen suspicion and allows them to be seen as not much different, which are tendencies that are important for promoting inter-community harmony.

  13. Crud handling circuit

    International Nuclear Information System (INIS)

    Smith, J.C.; Manuel, R.J.; McAllister, J.E.

    1981-01-01

    A process for handling the problems of crud formation during the solvent extraction of wet-process phosphoric acid, e.g. for uranium and rare earth removal, is described. It involves clarification of the crud-solvent mixture, settling, water washing the residue and treatment of the crud with a caustic wash to remove and regenerate the solvent. Applicable to synergistic mixtures of dialkylphosphoric acids and trialkylphosphine oxides dissolved in inert diluents and more preferably to the reductive stripping technique. (U.K.)

  14. Handling of potassium

    International Nuclear Information System (INIS)

    Schwarz, N.; Komurka, M.

    1983-03-01

    As a result of fast breeder development, extensive worldwide experience is available in sodium technology. Following the extension of the research program to topping cycles with potassium as the working medium, test facilities using potassium have been designed and operated at the Institute of Reactor Safety. The differing chemical properties of sodium and potassium give rise to new safety concepts and operating procedures. The handling problems of potassium are described in the light of its theoretical properties and our own experience. Selected literature on the main safety and operating problems completes this report. (Author) [de

  15. Extreme coal handling

    Energy Technology Data Exchange (ETDEWEB)

    Bradbury, S; Homleid, D. [Air Control Science Inc. (United States)

    2004-04-01

    Within the journal's 'Focus on O & M' section is a short article describing modifications to coal handling systems at Eielson Air Force Base near Fairbanks, Alaska, which is supplied with power and heat from a subbituminous coal-fired central plant. Measures to reduce dust include the addition of an enclosed recirculation chamber at each transfer point and new chute designs that reduce coal velocity, turbulence, and induced air. The modifications were developed by Air Control Science (ACS). 7 figs., 1 tab.

  16. Experience of safety and performance improvement for fuel handling equipment

    International Nuclear Information System (INIS)

    Gyoon Chang, Sang; Hee Lee, Dae

    2014-01-01

    The purpose of this study is to provide experience of safety and performance improvement of fuel handling equipment for nuclear power plants in Korea. The fuel handling equipment, which is used as an important part of critical processes during the refueling outage, has been improved to enhance safety and to optimize fuel handling procedures. Results of data measured during the fuel reloading are incorporated into design changes. The safety and performance improvement for fuel handling equipment could be achieved by simply modifying the components and improving the interlock system. The experience provided in this study can be useful lessons for further improvement of the fuel handling equipment. (authors)

  17. Handling hunger strikers.

    Science.gov (United States)

    1992-04-01

    Hunger strikes are being used increasingly and not only by those with a political point to make. Whereas in the past, hunger strikes in the United Kingdom seemed mainly to be started by terrorist prisoners for political purposes, the most recent was begun by a Tamil convicted of murder, to protest his innocence. In the later stages of his strike, before calling it off, he was looked after at the Hammersmith Hospital. So it is not only prison doctors who need to know how to handle a hunger strike. The following guidelines, adopted by the 43rd World Medical Assembly in Malta in November 1991, are therefore a timely reminder of the doctor's duties during a hunger strike.

  18. MFTF exception handling system

    International Nuclear Information System (INIS)

    Nowell, D.M.; Bridgeman, G.D.

    1979-01-01

    In the design of large experimental control systems, a major concern is ensuring that operators are quickly alerted to emergency or other exceptional conditions and that they are provided with sufficient information to respond adequately. This paper describes how the MFTF exception handling system satisfies these requirements. Conceptually, exceptions are divided into two classes: those which affect command status by producing an abort or suspend condition, and those which fall into a softer notification category requiring report only or operator acknowledgement. Additionally, an operator may choose to accept an exception condition as operational, or to turn off monitoring for sensors determined to be malfunctioning. Control panels and displays used in operator response to exceptions are described.
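    The two-class scheme described above (command-affecting abort/suspend conditions versus softer report/acknowledge notifications, with operator masking of faulty sensors) can be illustrated with a small dispatcher sketch. The class names, severity values, and function signature below are hypothetical, not taken from the MFTF system.

```python
from enum import Enum

class Severity(Enum):
    ABORT = "abort"              # terminates the current command
    SUSPEND = "suspend"          # pauses the current command
    ACKNOWLEDGE = "acknowledge"  # operator must acknowledge
    REPORT = "report"            # log only

def handle_exception(severity, command_active, masked_sensors=(), sensor=None):
    """Return the action taken for an exception, mirroring the two classes:
    command-affecting (abort/suspend) vs. notification (report/acknowledge).
    Operators may mask sensors judged to be malfunctioning."""
    if sensor in masked_sensors:
        return "ignored"  # monitoring turned off for this sensor
    if severity in (Severity.ABORT, Severity.SUSPEND):
        return f"command {severity.value}ed" if command_active else "logged"
    return f"operator {severity.value}"

assert handle_exception(Severity.ABORT, True) == "command aborted"
```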

  19. Handle with care

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1965-03-15

    Full text: A film dealing with transport of radioactive materials by everyday means - rail, road, sea and air transport - has been made for IAEA. It illustrates in broad terms some of the simple precautions which should be followed by persons dealing with such materials during shipment. Throughout, the picture stresses the transport regulations drawn up and recommended by the Agency, and in particular the need to carry out carefully the instructions based on these regulations in order to ensure that there is no hazard to the public nor to those who handle radioactive materials in transit and storage. In straightforward language, the film addresses the porter of a goods wagon, an airline cargo clerk, a dockside crane operator, a truck driver and others who load and ship freight. It shows the various types of package used to contain different categories of radioactive substances according to the intensity of the radiation emitted. It also illustrates their robustness by a series of tests involving drops, fires, impact, crushing, etc. Clear instructions are conveyed on what to do in the event of an unlikely accident with any type of package. The film is entitled, 'The Safe Transport of Radioactive Materials', and is No. 3 in the series entitled, 'Handle with Care'. It was made for IAEA through the United Kingdom Atomic Energy Authority by the Film Producers' Guild in the United Kingdom. It is in 16 mm colour, optical sound, with a running time of 20 minutes. It is available for order at $50 either direct from IAEA or through any of its Member Governments. Prints can be supplied in English, French, Russian or Spanish. Copies are also available for adaptation for commentaries in other languages. (author)

  20. biojs-io-biom, a BioJS component for handling data in the Biological Observation Matrix (BIOM) format [version 2; referees: 1 approved, 2 approved with reservations]

    Directory of Open Access Journals (Sweden)

    Markus J. Ankenbrand

    2017-01-01

    Full Text Available The Biological Observation Matrix (BIOM) format is widely used to store data from high-throughput studies. It aims at increasing the interoperability of bioinformatic tools that process such data. However, due to multiple versions and implementation details, working with this format can be tricky. Currently, libraries are available in Python, R, and Perl, whilst a JavaScript library is lacking. Here, we present a BioJS component for parsing BIOM data in all format versions. It supports import, modification, and export via a unified interface. This module aims to facilitate the development of web applications that use BIOM data. Finally, we demonstrate its usefulness by two applications that already use this component. Availability: https://github.com/molbiodiv/biojs-io-biom, https://dx.doi.org/10.5281/zenodo.218277
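    For readers unfamiliar with the format the component handles, a BIOM 1.0 table is a JSON document whose `rows`, `columns`, `shape`, and `data` fields describe a (typically sparse) observation matrix. The Python sketch below (the library itself is JavaScript) expands a small sparse table into a dense matrix; the example table is hand-written for illustration and omits some fields required by the full specification.

```python
import json

# Minimal hand-written BIOM 1.0-style table (sparse matrix as [row, col, value]).
biom_json = """{
  "id": "minimal", "format": "1.0.0", "type": "OTU table",
  "matrix_type": "sparse", "shape": [2, 3],
  "rows": [{"id": "OTU_1", "metadata": null}, {"id": "OTU_2", "metadata": null}],
  "columns": [{"id": "S1", "metadata": null}, {"id": "S2", "metadata": null},
              {"id": "S3", "metadata": null}],
  "data": [[0, 0, 5], [0, 2, 1], [1, 1, 7]]
}"""

def dense_matrix(table: dict) -> list[list[float]]:
    """Expand sparse BIOM data triplets into a dense rows-by-columns matrix."""
    n_rows, n_cols = table["shape"]
    dense = [[0.0] * n_cols for _ in range(n_rows)]
    for r, c, value in table["data"]:
        dense[r][c] = value
    return dense

table = json.loads(biom_json)
assert dense_matrix(table) == [[5, 0, 1], [0, 7, 0]]
```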

  1. Unvented Drum Handling Plan

    International Nuclear Information System (INIS)

    MCDONALD, K.M.

    2000-01-01

    This drum-handling plan proposes a method for dealing with unvented transuranic drums encountered during drum retrieval. Finding unvented drums during retrieval activities was expected, as identified in the Transuranic (TRU) Phase I Retrieval Plan (HNF-4781); however, significant numbers of unvented drums were not expected until excavation of buried drums began. This plan represents accelerated planning for the management of unvented drums, which are managed differently according to three categories. The first category comprises any drums that visually appear to be pressurized. These will be vented immediately, using either the Hanford Fire Department Hazardous Materials (Haz. Mat.) team, if such drums are encountered before the facility's capabilities are established, or internal capabilities once established. To date, no drums have been retrieved that showed signs of pressurization. The second category consists of drums that contain a minimal amount of Pu isotopes, typically less than 1 gram of Pu, though this may be waste-stream dependent. Drums in this category are assayed to determine whether they are low-level waste (LLW). LLW drums are typically disposed of without venting. Any unvented drums that assay as TRU will be staged for a future venting campaign, using appropriate safety precautions in their handling. The third category comprises drums for which records show larger amounts of Pu isotopes (typically greater than or equal to 1 gram of Pu). These are assumed to be TRU and are not assayed at this point, but are staged for a future venting campaign. Any of these drums that do not have a visible venting device will be staged awaiting venting and managed under appropriate controls, including covering the drums to protect them from direct solar exposure, minimizing container movement, and placing a barrier to restrict vehicle access. There are a number of equipment options available to perform the venting. The

  2. New transport and handling contract

    CERN Multimedia

    SC Department

    2008-01-01

    A new transport and handling contract entered into force on 1.10.2008. As with the previous contract, the user interface is the internal transport/handling request form on EDH: https://edh.cern.ch/Document/TransportRequest/ To ensure that you receive the best possible service, we invite you to complete the various fields as accurately as possible and to include a mobile telephone number on which we can reach you. You can follow the progress of your request (schedule, completion) in the EDH request routing information. We remind you that the following deadlines apply: 48 hours for the transport of heavy goods (up to 8 tonnes) or simple handling operations 5 working days for crane operations, transport of extra-heavy goods, complex handling operations and combined transport and handling operations in the tunnel. For all enquiries, the number to contact remains unchanged: 72202. Heavy Handling Section TS-HE-HH 72672 - 160319

  3. Remote handling and accelerators

    International Nuclear Information System (INIS)

    Wilson, M.T.

    1983-01-01

    The high-current levels of contemporary and proposed accelerator facilities induce radiation levels into components, requiring consideration be given to maintenance techniques that reduce personnel exposure. Typical components involved include beamstops, targets, collimators, windows, and instrumentation that intercepts the direct beam. Also included are beam extraction, injection, splitting, and kicking regions, as well as purposeful spill areas where beam tails are trimmed and neutral particles are deposited. Scattered beam and secondary particles activate components all along a beamline such as vacuum pipes, magnets, and shielding. Maintenance techniques vary from hands-on to TV-viewed operation using state-of-the-art servomanipulators. Bottom- or side-entry casks are used with thimble-type target and diagnostic assemblies. Long-handled tools are operated from behind shadow shields. Swinging shield doors, unstacking block, and horizontally rolling shield roofs are all used to provide access. Common to all techniques is the need to make operations simple and to provide a means of seeing and reaching the area

  4. TFTR tritium handling concepts

    International Nuclear Information System (INIS)

    Garber, H.J.

    1976-01-01

    The Tokamak Fusion Test Reactor, to be located on the Princeton Forrestal Campus, is expected to operate with 1 to 2.5 MA tritium--deuterium plasmas, with the pulses involving injection of 50 to 150 Ci (5 to 16 mg) of tritium. Attainment of fusion conditions is based on generation of an approximately 1 keV tritium plasma by ohmic heating and conversion to a moderately hot tritium--deuterium ion plasma by injection of a ''preheating'' deuterium neutral beam (40 to 80 keV), followed by injection of a ''reacting'' beam of high energy neutral deuterium (120 to 150 keV). Additionally, compressions accompany the beam injections. Environmental, safety and cost considerations led to the decision to limit the amount of tritium gas on-site to that required for an experiment, maintaining all other tritium in ''solidified'' form. The form of the tritium supply is as uranium tritide, while the spent tritium and other hydrogen isotopes are getter-trapped by zirconium--aluminum alloy. The issues treated include: (1) design concepts for the tritium generator and its purification, dispensing, replenishment, containment, and containment--cleanup systems; (2) features of the spent plasma trapping system, particularly the regenerable absorption cartridges, their integration into the vacuum system, and the handling of non-getterables; (3) tritium permeation through the equipment and the anticipated releases to the environment; (4) overview of the tritium related ventilation systems; and (5) design bases for the facility's tritium clean-up systems

  5. Safe Handling of Radioisotopes

    International Nuclear Information System (INIS)

    1958-01-01

    Under its Statute the International Atomic Energy Agency is empowered to provide for the application of standards of safety for protection against radiation to its own operations and to operations making use of assistance provided by it or with which it is otherwise directly associated. To this end authorities receiving such assistance are required to observe relevant health and safety measures prescribed by the Agency. As a first step, it has been considered an urgent task to provide users of radioisotopes with a manual of practice for the safe handling of these substances. Such a manual is presented here and represents the first of a series of manuals and codes to be issued by the Agency. It has been prepared after careful consideration of existing national and international codes of radiation safety, by a group of international experts and in consultation with other international bodies. At the same time it is recommended that the manual be taken into account as a basic reference document by Member States of the Agency in the preparation of national health and safety documents covering the use of radioisotopes.

  6. Radioactive wastes handling facility

    International Nuclear Information System (INIS)

    Hirose, Emiko; Inaguma, Masahiko; Ozaki, Shigeru; Matsumoto, Kaname.

    1997-01-01

    The facility comprises an area in which a conveyor is installed for separating miscellaneous radioactive solid wastes such as metals, an area for operators arranged perpendicular to the conveying direction, an area for receiving the radioactive wastes and placing them on the conveyor, and an area for collecting the wastes transferred by the conveyor. Since an operator can carry out the handling while wearing a working suit attached to a partition wall over his ordinary clothes, working conditions are improved and the efficiency of the separation work is increased. When the conveyor area and the operator area are depressurized, crud on the surface of the wastes is not released to the outside and the working suits are prevented from being drawn in. Since the wastes are transferred by the conveyor, the operator's range of movement is reduced; poisonous materials fall and move along a slide to an area for collecting the separated materials, so the materials to be removed can be accumulated easily. (N.H.)

  7. Trends in Modern Exception Handling

    Directory of Open Access Journals (Sweden)

    Marcin Kuta

    2003-01-01

    Full Text Available Exception handling is nowadays a necessary component of error-proof information systems. The paper presents an overview of techniques and models of exception handling, the problems connected with them, and potential solutions. The implementation of propagation mechanisms and exception handling, and their effect on semantics and general program efficiency, are also taken into account. The mechanisms presented have been adopted in modern programming languages. In the areas of design, formal methods, and formal verification of program properties, exception handling mechanisms are only weakly present, which leaves a field open for future research.
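    As a concrete illustration of the propagation mechanism such surveys discuss, the Python sketch below shows an exception raised in a low-level routine propagating unhandled through an intermediate caller until the first enclosing handler on the call stack catches it. The function names and data are invented for the example.

```python
def read_sensor(raw: str) -> float:
    # May raise ValueError; there is no handler here, so it propagates.
    return float(raw)

def average(readings: list[str]) -> float:
    # No handler here either: propagation continues up the call stack.
    values = [read_sensor(r) for r in readings]
    return sum(values) / len(values)

def report(readings: list[str]) -> str:
    # The first enclosing handler on the stack catches the exception.
    try:
        return f"mean={average(readings):.2f}"
    except ValueError as exc:
        return f"bad reading: {exc}"

assert report(["1.0", "2.0"]) == "mean=1.50"
assert report(["1.0", "oops"]).startswith("bad reading")
```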

  8. A comparison of two methods for retrieving ICD-9-CM data: the effect of using an ontology-based method for handling terminology changes.

    Science.gov (United States)

    Yu, Alexander C; Cimino, James J

    2011-04-01

    Most existing controlled terminologies can be characterized as collections of terms, wherein the terms are arranged in a simple list or organized in a hierarchy. These kinds of terminologies are considered useful for standardizing terms and encoding data and are currently used in many existing information systems. However, they suffer from a number of limitations that make data reuse difficult. Relatively recently, it has been proposed that formal ontological methods can be applied to some of the problems of terminological design. Biomedical ontologies organize concepts (embodiments of knowledge about biomedical reality) whereas terminologies organize terms (what is used to code patient data at a certain point in time, based on the particular terminology version). However, the application of these methods to existing terminologies is not straightforward. The use of these terminologies is firmly entrenched in many systems, and what might seem to be the simple option of replacing them is not possible. Moreover, these terminologies evolve over time to suit the needs of users. Any methodology must therefore take these constraints into consideration, hence the need for formal methods of managing changes. Along these lines, we have developed a formal representation of the concept-term relation, around which we have also developed a methodology for the management of terminology changes. The objective of this study was to determine whether our methodology would result in improved retrieval of data. Two methods for retrieving data encoded with terms from the International Classification of Diseases (ICD-9-CM) were compared on their recall when retrieving data for ICD-9-CM terms whose codes had changed but which had retained their original meaning (code change). The measures were recall and the interclass correlation coefficient. Statistically significant differences were detected in favour of the ontology-based ICD-9-CM data retrieval method that takes into account the effects of
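    Recall, the evaluation measure used in the study, can be computed for two retrieval methods as follows. The record sets are invented toy data, not the study's: the point is only that a code-only lookup misses records filed under a term's old code, while a method that tracks the concept across code changes can retrieve them.

```python
def recall(retrieved: set, relevant: set) -> float:
    """Fraction of the relevant records that were actually retrieved."""
    return len(retrieved & relevant) / len(relevant) if relevant else 1.0

# Toy example: relevant records for a term whose ICD-9-CM code changed.
relevant = {"r1", "r2", "r3", "r4"}
code_only = {"r1", "r2"}                    # misses data encoded under the old code
concept_tracking = {"r1", "r2", "r3", "r4"} # concept-term mapping bridges the change

assert recall(code_only, relevant) == 0.5
assert recall(concept_tracking, relevant) == 1.0
```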

  9. Mushroom Tyrosinase: A Model System to Combine Experimental Investigation of Enzyme-Catalyzed Reactions, Data Handling Using R, and Enzyme-Inhibitor Structural Studies

    Science.gov (United States)

    Nairn, Robert; Cresswell, Will; Nairn, Jacqueline

    2015-01-01

    The activity of mushroom tyrosinase can be measured by monitoring the conversion of phenolic compounds into quinone derivatives using spectrophotometry. This article describes a series of experiments which characterize the functional properties of tyrosinase, the analysis of the resulting data using R to determine the kinetic parameters, and the…
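    Kinetic parameters of the kind the article has students estimate in R can also be recovered with a Lineweaver-Burk linearization of the Michaelis-Menten equation. The sketch below uses synthetic data (Vmax and Km chosen arbitrarily) rather than real tyrosinase measurements, and plain least squares in Python instead of R.

```python
def michaelis_menten(s: float, vmax: float, km: float) -> float:
    """Michaelis-Menten rate law: v = Vmax * [S] / (Km + [S])."""
    return vmax * s / (km + s)

def fit_lineweaver_burk(substrate: list, rate: list) -> tuple:
    """Fit 1/v = (Km/Vmax)*(1/[S]) + 1/Vmax by ordinary least squares."""
    xs = [1.0 / s for s in substrate]
    ys = [1.0 / v for v in rate]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    vmax = 1.0 / intercept
    km = slope * vmax
    return vmax, km

substrate = [0.5, 1.0, 2.0, 5.0, 10.0]  # substrate concentrations (synthetic)
rates = [michaelis_menten(s, vmax=2.0, km=1.5) for s in substrate]
vmax, km = fit_lineweaver_burk(substrate, rates)
assert abs(vmax - 2.0) < 1e-9 and abs(km - 1.5) < 1e-9
```

Because the synthetic rates lie exactly on the model curve, the fit recovers the generating parameters; with noisy spectrophotometric data, a nonlinear fit (as R's `nls` performs) is generally preferred over the linearization.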

  10. Introduction to meteorological measurements and data handling for solar energy applications. Task IV-Development of an insolation handbook and instrument package

    Energy Technology Data Exchange (ETDEWEB)

    None

    1980-10-01

    Recognizing a need for a coordinated approach to resolve energy problems, certain members of the Organization for Economic Cooperation and Development (OECD) met in September 1974 and agreed to develop an International Energy Program. The International Energy Agency (IEA) was established within the OECD to administer, monitor and execute this International Energy Program. In July 1975, Solar Heating and Cooling was selected as one of the sixteen technology fields for multilateral cooperation. Five project areas, called tasks, were identified for cooperative activities within the IEA Program to Develop and Test Solar Heating and Cooling Systems. The objective of one task was to obtain improved basic resource information for the design and operation of solar heating and cooling systems through a better understanding of the required insolation (solar radiation) and related weather data, and through improved techniques for measurement and evaluation of such data. At the February 1976 initial experts meeting in Norrkoeping, Sweden, the participants developed the objective statement into two subtasks: (1) an insolation handbook; and (2) a portable meteorological instrument package. This handbook is the product of the first subtask. The objective of this handbook is to provide a basis for a dialogue between solar scientists and meteorologists. Introducing the solar scientist to solar radiation and related meteorological data enables him to better express his scientific and engineering needs to the meteorologist; and introducing the meteorologist to the special solar radiation and meteorological data applications of the solar scientist enables him to better meet the needs of the solar energy community.

  11. Personal ways of handling everyday life

    DEFF Research Database (Denmark)

    Jensen, Lasse Meinert

    at variations in everyday life pursuits: How does a person's pursuit of goals and concerns lead him/her to experience and handle breaks, interruptions, and variation in everyday activities? The research project so far holds quantitative data. A convenience sample of 217 persons were administered

  12. Guidance Counsellor Strategies for Handling Bullying

    Science.gov (United States)

    Power-Elliott, Michleen; Harris, Gregory E.

    2012-01-01

    The purpose of this exploratory-descriptive study was to examine how guidance counsellors in the province of Newfoundland and Labrador would handle a specific verbal-relational bullying incident. Also of interest was guidance counsellor involvement and training in bullying programmes and Positive Behaviour Supports. Data for this study was…

  13. Ergonomics of disposable handles for minimally invasive surgery.

    Science.gov (United States)

    Büchel, D; Mårvik, R; Hallabrin, B; Matern, U

    2010-05-01

    The ergonomic deficiencies of currently available minimally invasive surgery (MIS) instrument handles have been addressed in many studies. In this study, a new ergonomic pistol handle concept, realized as a prototype, and two disposable ring handles were investigated according to ergonomic properties set by new European standards. In this study, 25 volunteers performed four practical tasks to evaluate the ergonomics of the handles used in standard operating procedures (e.g., measuring a suture and cutting to length, precise maneuvering and targeting, and dissection of a gallbladder). Moreover, 20 participants underwent electromyography (EMG) tests to measure the muscle strain they experienced while carrying out the basic functions (grasp, rotate, and maneuver) in the x, y, and z axes. The data measured included the number of errors, the time required for task completion, perception of pressure areas, and EMG data. The values for usability in the test were effectiveness, efficiency, and user satisfaction. Surveys relating to the subjective rating were completed after each task for each of the three handles tested. Each handle except the new prototype caused pressure areas and pain. Extreme differences in muscle strain could not be observed for any of the three handles. Experienced surgeons worked more quickly with the prototype when measuring and cutting a suture (approximately 20%) and during precise maneuvering and targeting (approximately 20%). On the other hand, they completed the dissection task faster with the handle manufactured by Ethicon. Fewer errors were made with the prototype in dissection of the gallbladder. In contrast to the handles available on the market, the prototype was always rated as positive by the volunteers in the subjective surveys. None of the handles could fulfil all of the requirements with top scores. Each handle had its advantages and disadvantages. In contrast to the ring handles, the volunteers could fulfil most of the tasks more

  14. Safety measuring for sodium handling

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Ji Young; Jeong, K C; Kim, T J; Kim, B H; Choi, J H

    2001-09-01

    This report describes safety measures for sodium handling. Its contents are prerequisites for the development of sodium technology, and thus the workers who participate in sodium handling and experiments must know them thoroughly. As an appendix, the relevant parts of the applicable laws are presented.

  15. Waste Handling Building Conceptual Study

    International Nuclear Information System (INIS)

    G.W. Rowe

    2000-01-01

    The objective of the ''Waste Handling Building Conceptual Study'' is to develop proposed design requirements for the repository Waste Handling System in sufficient detail to allow the surface facility design to proceed to the License Application effort if the proposed requirements are approved by DOE. Proposed requirements were developed to further refine waste handling facility performance characteristics and design constraints with an emphasis on supporting modular construction, minimizing fuel inventory, and optimizing facility maintainability and dry handling operations. To meet this objective, this study attempts to provide an alternative design to the Site Recommendation design that is flexible, simple, reliable, and can be constructed in phases. The design concept will be input to the ''Modular Design/Construction and Operation Options Report'', which will address the overall program objectives and direction, including options and issues associated with transportation, the subsurface facility, and Total System Life Cycle Cost. This study (herein) is limited to the Waste Handling System and associated fuel staging system

  16. A Perspective on Remote Handling Operations and Human Machine Interface for Remote Handling in Fusion

    International Nuclear Information System (INIS)

    Haist, B.; Hamilton, D.; Sanders, St.

    2006-01-01

    A large-scale fusion device presents many challenges to the remote handling operations team. This paper is based on unique operational experience at JET and gives a perspective on remote handling task development, logistics and resource management, as well as command, control and human-machine interface systems. Remote operations require an accurate perception of a dynamic environment, ideally providing the operators with the same unrestricted knowledge of the task scene as would be available if they were actually at the remote work location. Traditional camera based systems suffer from a limited number of viewpoints and also degrade quickly when exposed to high radiation. Virtual Reality and Augmented Reality software offer great assistance. The remote handling system required to maintain a tokamak requires a large number of different and complex pieces of equipment coordinating to perform a large array of tasks. The demands on the operator's skill in performing the tasks can escalate to a point where the efficiency and safety of operations are compromised. An operations guidance system designed to facilitate the planning, development, validation and execution of remote handling procedures is essential. Automatic planning of motion trajectories of remote handling equipment and the remote transfer of heavy loads will be routine and need to be reliable. This paper discusses the solutions developed at JET in these areas and also the trends in management and presentation of operational data as well as command, control and HMI technology development offering the potential to greatly assist remote handling in future fusion machines. (author)

  17. Handling conditional discrimination

    NARCIS (Netherlands)

    Zliobaite, I.; Kamiran, F.; Calders, T.G.K.

    2011-01-01

    Historical data used for supervised learning may contain discrimination. We study how to train classifiers on such data, so that they are discrimination free with respect to a given sensitive attribute, e.g., gender. Existing techniques that deal with this problem aim at removing all discrimination
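The classifier-training problem sketched above lends itself to a small illustration. The snippet below is not the conditional-discrimination method of this paper, but a hedged sketch of the simpler "reweighing" preprocessing idea associated with the same authors: each instance is weighted so that, under the weights, the sensitive attribute becomes statistically independent of the class label. The gender/hiring toy data is invented for illustration.

```python
from collections import Counter

def reweighing_weights(sensitive, labels):
    """Weight each instance by P(S=s) * P(Y=y) / P(S=s, Y=y) so that,
    under the weights, the sensitive attribute S is independent of
    the class label Y."""
    n = len(labels)
    s_count = Counter(sensitive)                 # occurrences of each S value
    y_count = Counter(labels)                    # occurrences of each Y value
    sy_count = Counter(zip(sensitive, labels))   # joint occurrences
    return [
        (s_count[s] * y_count[y]) / (n * sy_count[(s, y)])
        for s, y in zip(sensitive, labels)
    ]

# Invented toy data: sensitive attribute (gender) and hiring decision.
sensitive = ["f", "f", "f", "m", "m", "m", "m", "m"]
labels = [0, 0, 1, 1, 1, 1, 0, 1]
weights = reweighing_weights(sensitive, labels)
# Under these weights the weighted positive rate is equal for both
# groups, so a learner trained with them sees no blanket discrimination.
```

The resulting weights can be passed to any learner that accepts per-instance weights (e.g. a `sample_weight` argument in scikit-learn-style APIs).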

  18. Sophisticated fuel handling system evolved

    International Nuclear Information System (INIS)

    Ross, D.A.

    1988-01-01

    The control systems at Sellafield fuel handling plant are described. The requirements called for built-in diagnostic features as well as the ability to handle a large sequencing application. Speed was also important; responses better than 50ms were required. The control systems are used to automate operations within each of the three main process caves - two Magnox fuel decanners and an advanced gas-cooled reactor fuel dismantler. The fuel route within the fuel handling plant is illustrated and described. ASPIC (Automated Sequence Package for Industrial Control) which was developed as a controller for the plant processes is described. (U.K.)

  19. Production management of window handles

    Directory of Open Access Journals (Sweden)

    Manuela Ingaldi

    2014-12-01

Full Text Available In this chapter, a company involved in the production of aluminum window and door handles is presented. The company's main customers are primarily producers of PVC joinery and the wholesalers supplying them. One chosen product of the company - a single-arm pin-lift window handle - is described, together with its production process. The chapter also includes a SWOT analysis conducted in the company and the value stream of the single-arm pin-lift window handle.

  20. CHLOE: a system for the automatic handling of spark pictures

    International Nuclear Information System (INIS)

    Butler, J.W.; Hodges, D.; Royston, R.

    The system for automatic data handling uses commercially available or state-of-the-art components. The system is flexible enough to accept information from various types of experiments involving photographic data acquisition

  1. Handling missing data in ranked set sampling

    CERN Document Server

    Bouza-Herrera, Carlos N

    2013-01-01

    The existence of missing observations is a very important aspect to be considered in the application of survey sampling, for example. In human populations they may be caused by a refusal of some interviewees to give the true value for the variable of interest. Traditionally, simple random sampling is used to select samples. Most statistical models are supported by the use of samples selected by means of this design. In recent decades, an alternative design has started being used, which, in many cases, shows an improvement in terms of accuracy compared with traditional sampling. It is called R

  2. A Guide to Handling Biomedical Data

    Science.gov (United States)

    1982-01-01

Scrotal pain? 1 = yes SCRPN 2 = no 9 = unk 27 Chyluria? 1 = yes CHUIA 2 = no 9 = unk 28 Malaria? 1 = yes MALIA 2 = no 9 = unk 29 Elephantiasis? 1 = yes...33 Thickened epididymis 1 = yes THKEP 2 = no 9 = unk 34 Elephantiasis Scrotum Breast? 1 = yes SCBRE 2 = no 9 = unk 35 Hackett Spleen (1-5 ps; o-5; 9

  3. Safe handling of radiation sources

    International Nuclear Information System (INIS)

    Abd Nasir Ibrahim; Azali Muhammad; Ab Razak Hamzah; Abd Aziz Mohamed; Mohammad Pauzi Ismail

    2004-01-01

    This chapter discussed the subjects related to the safe handling of radiation sources: type of radiation sources, method of use: transport within premises, transport outside premises; Disposal of Gamma Sources

  4. How Retailers Handle Complaint Management

    DEFF Research Database (Denmark)

    Hansen, Torben; Wilke, Ricky; Zaichkowsky, Judy

    2009-01-01

This article fills a gap in the literature by providing insight about the handling of complaint management (CM) across a large cross section of retailers in the grocery, furniture, electronic and auto sectors. Determinants of retailers’ CM handling are investigated and insight is gained as to the links between CM and redress of consumers’ complaints. The results suggest that retailers who attach large negative consequences to consumer dissatisfaction are more likely than other retailers to develop a positive strategic view on customer complaining, but at the same time an increase in perceived...

  5. Ergonomic material-handling device

    Science.gov (United States)

    Barsnick, Lance E.; Zalk, David M.; Perry, Catherine M.; Biggs, Terry; Tageson, Robert E.

    2004-08-24

    A hand-held ergonomic material-handling device capable of moving heavy objects, such as large waste containers and other large objects requiring mechanical assistance. The ergonomic material-handling device can be used with neutral postures of the back, shoulders, wrists and knees, thereby reducing potential injury to the user. The device involves two key features: 1) gives the user the ability to adjust the height of the handles of the device to ergonomically fit the needs of the user's back, wrists and shoulders; and 2) has a rounded handlebar shape, as well as the size and configuration of the handles which keep the user's wrists in a neutral posture during manipulation of the device.

  6. The technique on handling radiation

    International Nuclear Information System (INIS)

    1997-11-01

This book describes the measurement and handling of radiation. The first part deals with the measurement of radiation. Its contents are: characteristics of radiation measurement techniques, radiation detectors, measurement of energy spectra, measurement of radioactivity, measurement of radiation levels, and counting statistics. The second part explains the handling of radiation, covering the treatment of sealed radioisotopes, the treatment of unsealed sources, and radiation shielding.

  7. Civilsamfundets ABC: H for Handling

    DEFF Research Database (Denmark)

    Lund, Anker Brink; Meyer, Gitte

    2015-01-01

What is civil society? Anker Brink Lund and Gitte Meyer from the CBS Center for Civil Society Studies go through civil society letter by letter. We have reached H for Handling (action).

  8. Asthma, guides for diagnostic and handling

    International Nuclear Information System (INIS)

    Salgado, Carlos E; Caballero A, Andres S; Garcia G, Elizabeth

    1999-01-01

The paper defines asthma and covers its diagnosis and management, as well as special situations such as asthma in pregnancy, perioperative management of the asthmatic patient, and occupational asthma.

  9. SRV-automatic handling device

    International Nuclear Information System (INIS)

    Yamada, Koji

    1987-01-01

An automatic handling device for the steam relief valves (SRV's) is developed in order to achieve a decrease in worker exposure, an increase in availability factor, improved reliability, improved operational safety, and labor saving. A survey is made during a periodical inspection to examine the actual SRV handling operation. The SRV automatic handling device consists of four components: conveyor, armed conveyor, lifting machine, and control/monitoring system. The conveyor is designed so that the existing I-rail installed in the containment vessel can be used without any modification; it is employed for conveying an SRV along the rail. The armed conveyor, designed for a box rail, is used for an SRV installed away from the rail. By using the lifting machine, an SRV installed away from the I-rail is brought to a spot just below the rail so that the SRV can be transferred by the conveyor. The control/monitoring system consists of a control computer, operation panel, TV monitor and annunciator. The SRV handling device is operated by remote control from a control room. Trial equipment is constructed and performance/function testing is carried out using actual SRV's. As a result, it is shown that the SRV handling device requires only two operators to serve satisfactorily. The required time for removal and replacement of one SRV is about 10 minutes. (Nogami, K.)

  10. Selecting the group method of data handling as one of the most perspective algorithmes for building a predictive model of petroleum consumption in the system of energy balance of Ukraine

    Directory of Open Access Journals (Sweden)

    Trachuk A.R.

    2017-06-01

Full Text Available This paper deals with petroleum consumption in Ukraine. The dynamics of petroleum consumption are analysed and guidelines are proposed for the efficient production, consumption and import of petroleum in Ukraine. Predictive models of petroleum consumption in Ukraine were constructed and developed using modern software and the group method of data handling, which allowed adequate predictive models of petroleum consumption to be built within the system of Ukraine's energy balance. Scenarios of petroleum consumption in Ukraine were researched and forecasted. The problem of efficient use of energy resources is critical for sustainable economic development, given that the national economy depends on energy imports on the one hand and faces rising prices for these resources on the other. The basic foundation of forming Ukraine's energy system is to build forecasting scenarios for different types of energy and different criteria for the effective use of energy resources. Solving this problem is linked not only to ensuring energy security, but also to the level of development of the regions of Ukraine and to ensuring quality of life. Predicting petroleum consumption in Ukraine is today an issue of strategic importance, since the analysis and predictive models will make it possible to develop guidelines for the efficient production and consumption of petroleum across Ukraine as a whole.
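Since the abstract names the group method of data handling (GMDH) without algorithmic detail, a minimal sketch of its selection principle may help. It is deliberately simplified: the partial models are linear in two inputs rather than the classical quadratic Ivakhnenko polynomials, and only one selection layer is shown. Each candidate feature pair is fitted on a training split and ranked by its error on a held-out split (the external criterion).

```python
import itertools

def solve(A, b):
    """Tiny Gauss-Jordan elimination with partial pivoting for the
    normal equations of the least-squares fit."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col and M[r][col]:
                f = M[r][col] / M[col][col]
                for c in range(col, n + 1):
                    M[r][c] -= f * M[col][c]
    return [M[i][n] / M[i][i] for i in range(n)]

def fit_pair(xi, xj, y):
    """Least-squares fit of the partial model y ~ a + b*xi + c*xj."""
    X = [[1.0, u, v] for u, v in zip(xi, xj)]
    A = [[sum(r[p] * r[q] for r in X) for q in range(3)] for p in range(3)]
    b = [sum(r[p] * t for r, t in zip(X, y)) for p in range(3)]
    return solve(A, b)

def gmdh_layer(features, y_train, y_valid, split):
    """One GMDH selection step: fit a partial model on every feature
    pair using the training part, score it on the validation part
    (the external criterion), and return pairs sorted by that error."""
    scored = []
    for i, j in itertools.combinations(range(len(features)), 2):
        xi, xj = features[i], features[j]
        a, b_, c = fit_pair(xi[:split], xj[:split], y_train)
        err = sum((a + b_ * u + c * v - t) ** 2
                  for u, v, t in zip(xi[split:], xj[split:], y_valid))
        scored.append((err, (i, j), (a, b_, c)))
    return sorted(scored)

# Toy data: y depends only on features 0 and 1 (feature 2 is irrelevant),
# so the external criterion should select the pair (0, 1).
x0 = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
x1 = [2.0, 1.0, 4.0, 3.0, 6.0, 5.0]
x2 = [5.0, 5.0, 1.0, 2.0, 3.0, 3.0]
y = [2 * a + 3 * b for a, b in zip(x0, x1)]
ranked = gmdh_layer([x0, x1, x2], y[:4], y[4:], split=4)
best_err, best_pair, _ = ranked[0]
```

A full GMDH network would feed the surviving partial models' outputs into the next layer and repeat until the external criterion stops improving.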

  11. TURVA-2012: Handling QA

    International Nuclear Information System (INIS)

    Snellman, Margit; Hellae, Pirjo; Pastina, Barbara; Smith, Paul; Myllymaa, Samu

    2014-01-01

    Posiva applies a management system that complies with the ISO 9001:2008 standard for all activities, including the production of safety case reports, and requires the application of the same quality assurance principles from all its contractors and suppliers. The ISO standard was first launched in 1997 and has since been subject to continuous maintenance, updating and several internal and external audits. The purpose of Posiva's quality management system is to ensure, in a documented and traceable way, that Posiva's products - whether in the form of abstract knowledge and information, published reports or physical objects - fulfil the requirements set for them. The general quality objectives, requirements and instructions defined in Posiva's management system also form the foundation for the quality management of safety case activities. The quality management of the safety case follows the Posiva's general management system, which is based on the ISO 9001:2008 standard and management through processes, but also applies the principle of a graded approach similar to the safety guides for nuclear facilities. This means that the primary emphasis in the quality control and assurance of safety case activities is placed on those activities that have a direct bearing on the arguments and conclusions on the long-term safety of disposal, whereas standard quality measures are applied in the supporting work. The quality management of the safety case aims at traceability and transparency of the key data, assumptions, modelling and calculations. Regarding the activities related to ONKALO, the management system also takes into account the regulatory requirements of YVL Guide 1.4 'Management System for Nuclear Facilities' (which will be subject to revision in 2013). (authors)

  12. Investigation into slipping and falling accidents and materials handling in the South African mining industry.

    CSIR Research Space (South Africa)

    Schutte, PC

    2003-03-01

    Full Text Available The objective of this study was to analyze information on slipping and falling accidents and materials handling activities in the South African mining industry. Accident data pertaining to slipping, falling and materials handling accidents...

  13. Handling of waste in ports

    International Nuclear Information System (INIS)

    Olson, P.H.

    1994-01-01

The regulations governing the handling of port-generated waste are often national and/or local legislation, whereas the handling of ship-generated waste is governed by the MARPOL Convention in most parts of the world. The handling of waste consists of two main phases: collection and treatment. Waste has to be collected in every port and on board every ship, whereas generally only some wastes are treated, and only to a certain degree, in ports and on board ships. This paper considers the different kinds of waste generated both in ports and on board ships, where and how they are generated, and how they could be collected and treated. The two sources are treated together to show how some ship-generated waste may be treated in port installations primarily constructed for the treatment of port-generated waste, making integrated use of the available treatment facilities. (author)

  14. Incorporating Handling Qualities Analysis into Rotorcraft Conceptual Design

    Science.gov (United States)

    Lawrence, Ben

    2014-01-01

This paper describes the initial development of a framework to incorporate handling qualities analyses into a rotorcraft conceptual design process. In particular, the paper describes how rotorcraft conceptual design level data can be used to generate flight dynamics models for handling qualities analyses. Also, methods are described that couple a basic stability augmentation system to the rotorcraft flight dynamics model to extend analysis beyond that of the bare airframe. A methodology for calculating the handling qualities characteristics of the flight dynamics models and for comparing the results to ADS-33E criteria is described. Preliminary results from the application of the handling qualities analysis for variations in key rotorcraft design parameters of main rotor radius, blade chord, hub stiffness and flap moment of inertia are shown. Varying relationships, with counteracting trends for different handling qualities criteria and different flight speeds, are exhibited, with the action of the control system playing a complex part in the outcomes. Overall, the paper demonstrates how a broad array of technical issues across flight dynamics stability and control, simulation and modeling, control law design, and handling qualities testing and evaluation had to be confronted to implement even a moderately comprehensive handling qualities analysis of relatively low fidelity models. A key outstanding issue is how to 'close the loop' with an overall design process, and options for exploring how to feed back handling qualities results to a conceptual design process are proposed for future work.
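The last step, comparing model results to criteria boundaries, can be sketched trivially. The boundary numbers below are illustrative placeholders, not actual ADS-33E limits, which are task- and axis-specific charts rather than single thresholds.

```python
def handling_qualities_level(bandwidth_rad_s, phase_delay_s,
                             level1=(2.0, 0.12), level2=(1.0, 0.20)):
    """Classify a response against notional bandwidth (rad/s) and
    phase-delay (s) boundaries. The default boundary values are
    invented placeholders, NOT real ADS-33E values."""
    bw1, pd1 = level1
    bw2, pd2 = level2
    if bandwidth_rad_s >= bw1 and phase_delay_s <= pd1:
        return 1   # Level 1: satisfactory without improvement
    if bandwidth_rad_s >= bw2 and phase_delay_s <= pd2:
        return 2   # Level 2: adequate, improvement warranted
    return 3       # Level 3: deficiencies require improvement

# Sweep three hypothetical design points that trade bandwidth against
# phase delay, and record the predicted level for each.
sweep = [(2.5, 0.10), (1.5, 0.15), (0.8, 0.25)]
levels = [handling_qualities_level(bw, pd) for bw, pd in sweep]
```

In the framework described in the paper, such a classification would be evaluated for each flight condition and design-parameter variation, producing the counteracting trends noted above.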

  15. Welding method by remote handling

    International Nuclear Information System (INIS)

    Hashinokuchi, Minoru.

    1994-01-01

    Water is charged into a pit (or a water reservoir) and an article to be welded is placed on a support in the pit by remote handling. A steel plate is disposed so as to cover the article to be welded by remote handling. The welding device is positioned to the portion to be welded and fixed in a state where the article to be welded is shielded from radiation by water and the steel plate. Water in the pit is drained till the portion to be welded is exposed to the atmosphere. Then, welding is conducted. After completion of the welding, water is charged again to the pit and the welding device and fixing jigs are decomposed in a state where the article to be welded is shielded again from radiation by water and the steel plate. Subsequently, the steel plate is removed by remote handling. Then, the article to be welded is returned from the pit to a temporary placing pool by remote handling. This can reduce operator's exposure. Further, since the amount of the shielding materials can be minimized, the amount of radioactive wastes can be decreased. (I.N.)

  16. Good analytical practice: statistics and handling data in biomedical science. A primer and directions for authors. Part 1: Introduction. Data within and between one or two sets of individuals.

    Science.gov (United States)

    Blann, A D; Nation, B R

    2008-01-01

    The biomedical scientist is bombarded on a daily basis by information, almost all of which refers to the health status of an individual or groups of individuals. This review is the first of a two-part article written to explain some of the issues related to the presentation and analysis of data. The first part focuses on types of data and how to present and analyse data from an individual or from one or two groups of persons. The second part will examine data from three or more sets of persons, what methods are available to allow this analysis (i.e., statistical software packages), and will conclude with a statement on appropriate descriptors of data, their analyses, and presentation for authors considering submission of their data to this journal.
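For the "two groups of persons" case mentioned above, a common parametric comparison is Welch's unequal-variance t-test. A self-contained sketch (not taken from the article) using only the standard library:

```python
import math
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch's t statistic and Welch-Satterthwaite degrees of freedom
    for comparing the means of two independent groups."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)   # sample variances (n-1)
    se2 = va / na + vb / nb                           # squared standard error
    t = (mean(sample_a) - mean(sample_b)) / math.sqrt(se2)
    # Welch-Satterthwaite approximation for the degrees of freedom
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Invented example data for two groups of individuals
group_a = [1.0, 2.0, 3.0, 4.0, 5.0]
group_b = [2.0, 3.0, 4.0, 5.0, 6.0]
t, df = welch_t(group_a, group_b)
```

The p-value would then be read from the t distribution with `df` degrees of freedom (e.g. via `scipy.stats.t.sf`); for non-normal data a rank-based test such as Mann-Whitney is the usual alternative.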

  17. Advanced handling-systems with enhanced performance flexibility

    International Nuclear Information System (INIS)

    1986-04-01

This report describes the results of a project related to future applications and requirements for advanced handling systems. The report consists of six chapters. Following the description of the aims, the tools for setting up the requirements for the handling systems, including the experience gained during the data acquisition process, are described. Furthermore, some information is given about the current state of the art of robotics and manipulators. Of paramount importance are the descriptions of applications and related concepts in the following chapters, leading to specific categories of advanced handling units. The paper closes with the description of the first concepts for realization. (orig./HP)

  18. Building a framework for ergonomic research on laparoscopic instrument handles.

    Science.gov (United States)

    Li, Zheng; Wang, Guohui; Tan, Juan; Sun, Xulong; Lin, Hao; Zhu, Shaihong

    2016-06-01

    Laparoscopic surgery carries the advantage of minimal invasiveness, but ergonomic design of the instruments used has progressed slowly. Previous studies have demonstrated that the handle of laparoscopic instruments is vital for both surgical performance and surgeon's health. This review provides an overview of the sub-discipline of handle ergonomics, including an evaluation framework, objective and subjective assessment systems, data collection and statistical analyses. Furthermore, a framework for ergonomic research on laparoscopic instrument handles is proposed to standardize work on instrument design. Copyright © 2016 IJS Publishing Group Ltd. Published by Elsevier Ltd. All rights reserved.

  19. Handling Procedures of Vegetable Crops

    Science.gov (United States)

    Perchonok, Michele; French, Stephen J.

    2004-01-01

The National Aeronautics and Space Administration (NASA) is working towards future long duration manned space flights beyond low earth orbit. The duration of these missions may be as long as 2.5 years and will likely include a stay on a lunar or planetary surface. The primary goal of the Advanced Food System in these long duration exploratory missions is to provide the crew with a palatable, nutritious, and safe food system while minimizing volume, mass, and waste. Vegetable crops can provide the crew with added nutrition and variety. These crops do not require any cooking or food processing prior to consumption. The vegetable crops, unlike prepackaged foods, will provide bright colors, textures (crispy), and fresh aromas. Ten vegetable crops have been identified for possible use in long duration missions: lettuce, spinach, carrot, tomato, green onion, radish, bell pepper, strawberries, fresh herbs, and cabbage. Whether these crops are grown on a transit vehicle (e.g., International Space Station) or on the lunar or planetary surface, it will be necessary to determine how to safely handle the vegetables while maintaining acceptability. Since hydrogen peroxide degrades into water and oxygen and is generally recognized as safe (GRAS), hydrogen peroxide has been recommended as the sanitizer. The objective of this research is to determine the required effective concentration of hydrogen peroxide. In addition, it will be determined whether the use of hydrogen peroxide, although a viable sanitizer, adversely affects the quality of the vegetables. Vegetables will be dipped in 1% hydrogen peroxide, 3% hydrogen peroxide, or 5% hydrogen peroxide. Treated produce and controls will be stored in plastic bags at 5 C for up to 14 days. Sensory, color, texture, and total plate count will be measured. The effect on several vegetables including lettuce, radish, tomato and strawberries has been completed. Although each vegetable reacts to hydrogen peroxide differently, the

  20. Experience in handling concentrated tritium

    International Nuclear Information System (INIS)

    Holtslander, W.J.

    1985-12-01

    The notes describe the experience in handling concentrated tritium in the hydrogen form accumulated in the Chalk River Nuclear Laboratories Tritium Laboratory. The techniques of box operation, pumping systems, hydriding and dehydriding operations, and analysis of tritium are discussed. Information on the Chalk River Tritium Extraction Plant is included as a collection of reprints of papers presented at the Dayton Meeting on Tritium Technology, 1985 April 30 - May 2

  1. International handling of fissionable material

    International Nuclear Information System (INIS)

    1975-01-01

    The opinion of the ministry for foreign affairs on international handling of fissionable materials is given. As an introduction a survey is given of the possibilities to produce nuclear weapons from materials used in or produced by power reactors. Principles for international control of fissionable materials are given. International agreements against proliferation of nuclear weapons are surveyed and methods to improve them are proposed. (K.K.)

  2. Confinement facilities for handling plutonium

    International Nuclear Information System (INIS)

    Maraman, W.J.; McNeese, W.D.; Stafford, R.G.

    1975-01-01

    Plutonium handling on a multigram scale began in 1944. Early criteria, equipment, and techniques for confining contamination have been superseded by more stringent criteria and vastly improved equipment and techniques for in-process contamination control, effluent air cleaning and treatment of liquid wastes. This paper describes the evolution of equipment and practices to minimize exposure of workers and escape of contamination into work areas and into the environment. Early and current contamination controls are compared. (author)

  3. Remote handling equipment for SNS

    International Nuclear Information System (INIS)

    Poulten, B.H.

    1983-01-01

This report gives information on the areas of the SNS facility which become highly radioactive, preventing hands-on maintenance. Levels of activity are sufficiently high in the Target Station Area of the SNS, especially under fault conditions, to warrant reactor technology being used in the design of the water, drainage and ventilation systems. These problems, together with the type of remote handling equipment required in the SNS, are discussed

  4. Remote handling in reprocessing plants

    International Nuclear Information System (INIS)

    Streiff, G.

    1984-01-01

    Remote control will be the rule for maintenance in hot cells of future spent fuel reprocessing plants because of the radioactivity level. New handling equipments will be developed and intervention principles defined. Existing materials, recommendations for use and new manipulators are found in the PMDS' documentation. It is also a help in the choice and use of intervention means and a guide for the user [fr

  5. Equipment for the handling of thorium materials

    International Nuclear Information System (INIS)

    Heisler, S.W. Jr.; Mihalovich, G.S.

    1988-01-01

The Feed Materials Production Center (FMPC) is the United States Department of Energy's storage facility for thorium. FMPC thorium handling and overpacking projects ensure the continued safe handling and storage of the thorium inventory until final disposition of the materials is determined and implemented. The handling and overpacking of the thorium materials requires the design of a system that utilizes remote handling and overpacking equipment not currently utilized at the FMPC in the handling of uranium materials. The use of remote equipment significantly reduces radiation exposure to personnel during the handling and overpacking efforts. The designed system combines existing technologies from the nuclear industry, the materials processing and handling industry and the mining industry. It consists of a modified fork lift truck for the transport of thorium containers, automated equipment for material identification and inventory control, and remote handling and overpacking equipment for repackaging of the thorium materials

  6. Enteral Feeding Set Handling Techniques.

    Science.gov (United States)

    Lyman, Beth; Williams, Maria; Sollazzo, Janet; Hayden, Ashley; Hensley, Pam; Dai, Hongying; Roberts, Cristine

    2017-04-01

Enteral nutrition therapy is common practice in pediatric clinical settings. Often patients will receive a pump-assisted bolus feeding over 30 minutes several times per day using the same enteral feeding set (EFS). This study aims to determine the safest and most efficacious way to handle the EFS between feedings. Three EFS handling techniques were compared through simulation for bacterial growth, nursing time, and supply costs: (1) rinsing the EFS with sterile water after each feeding, (2) refrigerating the EFS between feedings, and (3) using a ready-to-hang (RTH) product maintained at room temperature. Cultures were obtained at baseline, hour 12, and hour 21 of the 24-hour cycle. A time-in-motion analysis was conducted and reported in average number of seconds to complete each procedure. Supply costs were inventoried for 1 month comparing the actual usage to our estimated usage. Of 1080 cultures obtained, the overall bacterial growth rate was 8.7%. The rinse and refrigeration techniques displayed similar bacterial growth (11.4% vs 10.3%, P = .63). The RTH technique displayed the least bacterial growth of any method (4.4%, P = .002). The time analysis in minutes showed the rinse method was the most time-consuming (44.8 ± 2.7) vs refrigeration (35.8 ± 2.6) and RTH (31.08 ± 0.6) (P …). Refrigerating the EFS between uses is the next most efficacious method for handling the EFS between bolus feeds.
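The growth-rate comparisons reported above (e.g. 11.4% vs 4.4%) are tests of two proportions. A hedged sketch follows; since the abstract does not give per-arm counts, an even split of the 1080 cultures (360 per technique) is assumed purely for illustration.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Pooled two-proportion z statistic for comparing two observed
    rates, assuming independent samples."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Assumed per-arm counts (not given in the abstract): 360 cultures each.
rinse_positives = round(0.114 * 360)   # ~41 positive cultures
rth_positives = round(0.044 * 360)     # ~16 positive cultures
z = two_proportion_z(rinse_positives, 360, rth_positives, 360)
# |z| well above 1.96 would be consistent with the reported P = .002.
```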

  7. The handling of radiation accidents

    International Nuclear Information System (INIS)

    Macdonald, H.F.; Orchard, H.C.; Walker, C.W.

    1977-04-01

    Some of the more interesting and important contributions to a recent International Symposium on the Handling of Radiation Accidents are discussed and personal comments on many of the papers presented are included. The principal conclusion of the Symposium was that although the nuclear industry has an excellent safety record, there is no room for complacency. Continuing attention to emergency planning and exercising are essential in order to maintain this position. A full list of the papers presented at the Symposium is included as an Appendix. (author)

  8. 7 CFR 58.443 - Whey handling.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Whey handling. 58.443 Section 58.443 Agriculture... Procedures § 58.443 Whey handling. (a) Adequate sanitary facilities shall be provided for the handling of whey. If outside, necessary precautions shall be taken to minimize flies, insects and development of...

  9. VVER NPPs fuel handling machine control system

    International Nuclear Information System (INIS)

    Mini, G.; Rossi, G.; Barabino, M.; Casalini, M.

    2002-01-01

In order to increase the safety level of the fuel handling machine on WWER NPPs, Ansaldo Nucleare was asked to design and supply a new Control System. Two Fuel Handling Machine (FHM) Control System units have already been supplied for Temelin NPP, and further supplies are in progress for the Atommash company, which is in charge of supplying FHMs for NPPs located in Russia, Ukraine and China. The computer-based system takes into account all the operational safety interlocks so that it is able to avoid incorrect and dangerous manoeuvres in the case of operator error. Control system design criteria, hardware and software architecture, and quality assurance control are in accordance with the most recent international requirements and standards, in particular for electromagnetic disturbance immunity and seismic compatibility. The hardware architecture of the control system is based on the ABB INFI 90 system. The microprocessor-based ABB INFI 90 system incorporates and improves upon many of the time-proven control capabilities of the Bailey Network 90, validated over 14,000 installations world-wide. The control system retains all the machine's originally designed sensors and devices, notably the Russian-designed angular position measurement sensors named 'selsyn'. Nevertheless, it is fully compatible with the most recent sensors and devices currently available on the market (e.g. multiturn absolute encoders). All control logic was developed using a standard INFI 90 Engineering Work Station, interconnecting blocks extracted from an extensive SAMA library using a graphical approach (CAD), allowing easier intelligibility, more flexibility, and updated and coherent documentation. The data acquisition system and the Man Machine Interface are implemented by ABB in co-operation with Ansaldo. The flexible and powerful software structure of 1090 Work-stations (APMS - Advanced Plant Monitoring System, or Tenore NT) has been successfully used to interface the

  10. Safety of Cargo Aircraft Handling Procedure

    Directory of Open Access Journals (Sweden)

    Daniel Hlavatý

    2017-07-01

Full Text Available The aim of this paper is to examine ways to improve the safety management system during cargo aircraft handling. The first chapter is dedicated to general information about air cargo transportation, including the history and types of cargo aircraft handling as well as the means of handling. The second part focuses on a detailed description of cargo aircraft handling, including a description of activities performed before and after handling. The following part covers a theoretical interpretation of safety, safety indicators and legislative provisions related to the safety of cargo aircraft handling. The fourth part analyzes the fault trees of events which might occur during handling. The factors found by this analysis are compared with safety reports of FedEx. Based on the comparison, a proposal is made on how to improve safety management in this transportation company.
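The fault-tree analysis mentioned above can be sketched numerically. Assuming independent basic events, AND gates multiply probabilities and OR gates combine complements; the event names and probabilities below are hypothetical, not taken from the paper or from FedEx data.

```python
from functools import reduce

def and_gate(probs):
    """All basic events must occur (independence assumed)."""
    return reduce(lambda acc, p: acc * p, probs, 1.0)

def or_gate(probs):
    """At least one basic event occurs: 1 minus product of complements."""
    return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), probs, 1.0)

# Hypothetical top event: ground-handling damage occurs if a loader
# fault goes undetected, OR a loading-plan error is made.
p_loader_fault = 0.01      # assumed per-turnaround probability
p_missed_detection = 0.1   # assumed
p_plan_error = 0.002       # assumed
p_top = or_gate([and_gate([p_loader_fault, p_missed_detection]),
                 p_plan_error])
```

Nesting these two gate functions reproduces any coherent fault tree; minimal cut sets and importance measures can then be derived from the same structure.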

  11. Transfer Area Mechanical Handling Calculation

    International Nuclear Information System (INIS)

    Dianda, B.

    2004-01-01

    This calculation is intended to support the License Application (LA) submittal of December 2004, in accordance with the directive given by DOE correspondence received on the 27th of January 2004 entitled: ''Authorization for Bechtel SAIC Company L.L.C. to Include a Bare Fuel Handling Facility and Increased Aging Capacity in the License Application, Contract Number DE-AC28-01RW12101'' (Arthur, W.J., III 2004). This correspondence was supplemented by further correspondence received on the 19th of February 2004 entitled: ''Technical Direction to Bechtel SAIC Company L.L.C. for Surface Facility Improvements, Contract Number DE-AC28-01RW12101; TDL No. 04-024'' (BSC 2004a). These documents give the authorization for a Fuel Handling Facility to be included in the baseline. The purpose of this calculation is to establish preliminary bounding equipment envelopes and weights for the Fuel Handling Facility (FHF) transfer area equipment. This calculation provides preliminary information only, to support development of facility layouts and preliminary load calculations. The limitations of this preliminary calculation lie within the assumptions of section 5, as this calculation is part of an evolutionary design process. It is intended that this calculation be superseded as the design advances to reflect information necessary to support the License Application. The design choices outlined within this calculation represent a demonstration of feasibility and may or may not be included in the completed design. This calculation provides preliminary weight, dimensional envelope, and equipment position in the building for the purposes of defining interface variables. This calculation identifies and sizes major equipment and assemblies that dictate overall equipment dimensions and facility interfaces. Sizing of components is based on the selection of commercially available products, where applicable. This is not a specific recommendation for the future use of these components or their

  12. Translator for Optimizing Fluid-Handling Components

    Science.gov (United States)

    Landon, Mark; Perry, Ernest

    2007-01-01

    A software interface has been devised to facilitate optimization of the shapes of valves, elbows, fittings, and other components used to handle fluids under extreme conditions. This software interface translates data files generated by PLOT3D (a NASA grid-based plotting-and-data-display program) and by computational fluid dynamics (CFD) software into a format in which the files can be read by Sculptor, which is a shape-deformation-and-optimization program. Sculptor enables the user to interactively, smoothly, and arbitrarily deform the surfaces and volumes in two- and three-dimensional CFD models. Sculptor also includes design-optimization algorithms that can be used in conjunction with the arbitrary-shape-deformation components to perform automatic shape optimization. In the optimization process, the output of the CFD software is used as feedback while the optimizer strives to satisfy design criteria that could include, for example, improved values of pressure loss, velocity, flow quality, mass flow, etc.
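The feedback loop the abstract describes (deform the shape, re-evaluate it with CFD, let the optimizer improve a design criterion such as pressure loss) can be sketched generically. This is not Sculptor's or PLOT3D's API; the "CFD" evaluation is replaced by a synthetic one-variable objective and every name here is illustrative:

```python
# Generic sketch of the deform -> evaluate -> optimize loop described in
# the abstract. The shape deformation and the CFD solver are stood in
# for by simple placeholder functions; all names are illustrative.

def deform_shape(base_radius, bulge):
    # Stand-in for an arbitrary shape deformation: a duct radius
    # perturbed by a single design variable.
    return base_radius * (1.0 + bulge)

def cfd_pressure_loss(radius):
    # Placeholder "CFD" objective: pressure loss falls as radius grows,
    # but an oversized duct is penalized (purely synthetic model).
    return 1.0 / radius**4 + 0.5 * radius**2

def optimize(base_radius=1.0, step=0.01, iters=200):
    # Crude coordinate-descent optimizer using the CFD output as feedback
    bulge = 0.0
    best = cfd_pressure_loss(deform_shape(base_radius, bulge))
    for _ in range(iters):
        for delta in (+step, -step):
            trial = cfd_pressure_loss(deform_shape(base_radius, bulge + delta))
            if trial < best:
                best, bulge = trial, bulge + delta
    return bulge, best

bulge, loss = optimize()
print(f"best bulge {bulge:+.2f}, pressure-loss metric {loss:.3f}")
```

In a real workflow each objective evaluation is a full CFD run, so gradient-free optimizers with few evaluations (or adjoint-based gradients) replace this naive search.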

  13. CANISTER HANDLING FACILITY DESCRIPTION DOCUMENT

    Energy Technology Data Exchange (ETDEWEB)

    J.F. Beesley

    2005-04-21

    The purpose of this facility description document (FDD) is to establish requirements and associated bases that drive the design of the Canister Handling Facility (CHF), which will allow the design effort to proceed to license application. This FDD will be revised at strategic points as the design matures. This FDD identifies the requirements and describes the facility design, as it currently exists, with emphasis on attributes of the design provided to meet the requirements. This FDD is an engineering tool for design control; accordingly, the primary audience and users are design engineers. This FDD is part of an iterative design process. It leads the design process with regard to the flowdown of upper tier requirements onto the facility. Knowledge of these requirements is essential in performing the design process. The FDD follows the design with regard to the description of the facility. The description provided in this FDD reflects the current results of the design process.

  14. Bulk handling benefits from ICT

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2007-11-15

    The efficiency and accuracy of bulk handling is being improved by the range of management information systems and services available today. As part of the program to extend Richards Bay Coal Terminal, Siemens is installing a manufacturing execution system which coordinates and monitors all movements of raw materials. The article also reports recent developments by AXSMarine, SunGuard Energy, Fuelworx and Railworx in providing integrated tools for tracking, managing and optimising solid/liquid fuels and rail car maintenance activities. QMASTOR Ltd. has secured a contract with Anglo Coal Australia to provide its Pit to Port.net® and iFuse® software systems across all their Australian sites, to include pit-to-product stockpile management. 2 figs.

  15. Handling and transport problems (1960)

    International Nuclear Information System (INIS)

    Pomarola, J.; Savouyaud, J.

    1960-01-01

    I. The handling and transport of radioactive wastes involves the danger of irradiation and contamination. It is indispensable: - to lay down a special set of rules governing the removal and transport of wastes within centres or from one centre to another; - to give charge of this transportation to a group containing teams of specialists. The organisation, equipment and output of these teams is being examined. II. Certain materials are particularly dangerous to transport, and for these special vehicles and fixed installations are necessary. This is especially the case for the evacuation of very active liquids. A transport vehicle is described, consisting of a trailer tractor and a container holding 500 litres of liquid whose activity can reach 1000 C/l; the decanting operation, the route to be followed by the vehicle, and the precautions taken are also described. (author) [fr

  16. CANISTER HANDLING FACILITY DESCRIPTION DOCUMENT

    International Nuclear Information System (INIS)

    Beesley, J.F.

    2005-01-01

    The purpose of this facility description document (FDD) is to establish requirements and associated bases that drive the design of the Canister Handling Facility (CHF), which will allow the design effort to proceed to license application. This FDD will be revised at strategic points as the design matures. This FDD identifies the requirements and describes the facility design, as it currently exists, with emphasis on attributes of the design provided to meet the requirements. This FDD is an engineering tool for design control; accordingly, the primary audience and users are design engineers. This FDD is part of an iterative design process. It leads the design process with regard to the flowdown of upper tier requirements onto the facility. Knowledge of these requirements is essential in performing the design process. The FDD follows the design with regard to the description of the facility. The description provided in this FDD reflects the current results of the design process

  17. Fuel Handling Facility Description Document

    International Nuclear Information System (INIS)

    M.A. LaFountain

    2005-01-01

    The purpose of the facility description document (FDD) is to establish the requirements and their bases that drive the design of the Fuel Handling Facility (FHF) to allow the design effort to proceed to license application. This FDD is a living document that will be revised at strategic points as the design matures. It identifies the requirements and describes the facility design as it currently exists, with emphasis on design attributes provided to meet the requirements. This FDD was developed as an engineering tool for design control. Accordingly, the primary audience and users are design engineers. It leads the design process with regard to the flow down of upper tier requirements onto the facility. Knowledge of these requirements is essential to performing the design process. It trails the design with regard to the description of the facility. This description is a reflection of the results of the design process to date

  18. Cask system design guidance for robotic handling

    International Nuclear Information System (INIS)

    Griesmeyer, J.M.; Drotning, W.D.; Morimoto, A.K.; Bennett, P.C.

    1990-10-01

    Remote automated cask handling has the potential to reduce both the occupational exposure and the time required to process a nuclear waste transport cask at a handling facility. The ongoing Advanced Handling Technologies Project (AHTP) at Sandia National Laboratories is described. AHTP was initiated to explore the use of advanced robotic systems to perform cask handling operations at handling facilities for radioactive waste, and to provide guidance to cask designers regarding the impact of robotic handling on cask design. The proof-of-concept robotic systems developed in AHTP are intended to extrapolate from currently available commercial systems to the systems that will be available by the time that a repository would be open for operation. The project investigates those cask handling operations that would be performed at a nuclear waste repository facility during cask receiving and handling. The ongoing AHTP indicates that design guidance, rather than design specification, is appropriate, since the requirements for robotic handling do not place severe restrictions on cask design but rather focus on attention to detail and design for limited dexterity. The cask system design features that facilitate robotic handling operations are discussed, and results obtained from AHTP design and operation experience are summarized. The application of these design considerations is illustrated by discussion of the robot systems and their operation on cask feature mock-ups used in the AHTP project. 11 refs., 11 figs

  19. The claims handling process of liability insurance in South Africa

    Directory of Open Access Journals (Sweden)

    Jacoline van Jaarsveld

    2015-04-01

    Full Text Available Liabilities play a very important financial role in business operations and professional services, as well as in the personal lives of people. It is possible that a single claim may even lead to the bankruptcy of the defendant. The claims handling process of liability insurance by short-term insurers is therefore very important to these parties, as liability claims may have enormous and far-reaching financial implications for them. The objective of this research paper is to improve financial decision-making by short-term insurers with regard to the claims handling process of liability insurance. Secondary data was initially studied, which provided the basis to compile a questionnaire for the empirical survey. The leaders of liability insurance in the South African short-term insurance market, who represented 69.5% of the annual gross written premiums received for liability insurance in South Africa, were the respondents of the empirical study. The perceptions of these short-term insurers provided the primary data for the conclusions of this research. This paper pays special attention to the importance of the claims handling factors of liability insurance, how often the stipulations of liability insurance policies are adjusted by short-term insurers to take the claims handling factors into consideration, and the problem areas which short-term insurers may experience during the claims handling process. Feasible solutions to address the problem areas are also discussed.

  20. Hot Laboratories and Remote Handling

    International Nuclear Information System (INIS)

    Bart, G.; Blanc, J.Y.; Duwe, R.

    2003-01-01

    The European Working Group on 'Hot Laboratories and Remote Handling' is firmly established as the major contact forum for the nuclear R and D facilities at the European scale. The yearly plenary meetings intend to: - Exchange experience on analytical methods, their implementation in hot cells, the methodologies used and their application in nuclear research; - Share experience on common infrastructure exploitation matters such as remote handling techniques, safety features, QA-certification, waste handling; - Promote normalization and co-operation, e.g., by looking at mutual complementarities; - Survey present and future demands from the nuclear industry and draw strategic conclusions regarding further needs. The 41st plenary meeting was held at CEA Saclay from September 22 to 24, 2003 in the premises and with the technical support of the INSTN (National Institute for Nuclear Science and Technology). The Nuclear Energy Division of CEA sponsored it. The Saclay meeting was divided into three topical oral sessions covering: - Post irradiation examination: new analysis methods and methodologies, small specimen technology, programmes and results; - Hot laboratory infrastructure: decommissioning, refurbishment, waste, safety, nuclear transports; - Prospective research on materials for future applications: innovative fuels (Generation IV, HTR, transmutation, ADS), spallation source materials, and candidate materials for fusion reactors. A poster session was opened to transport companies and laboratory suppliers. The meeting addressed in three sessions the following items: Session 1 - Post Irradiation Examinations. Out of 12 papers (including 1 poster) 7 dealt with surface and solid state micro analysis, another one with an equally complex wet chemical instrumental analytical technique, while the other four papers (including the poster) presented new concepts for digital x-ray image analysis; Session 2 - Hot laboratory infrastructure (including the waste theme), which was

  1. Development of commercial robots for radwaste handling

    International Nuclear Information System (INIS)

    Colborn, K.A.

    1988-01-01

    The cost and dose burden associated with low level radwaste handling activities is a matter of increasing concern to the commercial nuclear power industry. This concern is evidenced by the fact that many utilities have begun to re-evaluate waste generation, handling, and disposal activities at their plants in an effort to improve their overall radwaste handling operations. This paper reports on the project Robots for Radwaste Handling, undertaken to identify the potential of robots to improve radwaste handling operations. The project has focussed on the potential of remote or automated technology to improve well defined, recognizable radwaste operations, in particular repetitive, low skill level radwaste handling and decontamination tasks which involve significant radiation exposure.

  2. Experiences with decontaminating tritium-handling apparatus

    International Nuclear Information System (INIS)

    Maienschein, J.L.; Garcia, F.; Garza, R.G.; Kanna, R.L.; Mayhugh, S.R.; Taylor, D.T.

    1992-01-01

    Tritium-handling apparatus has been decontaminated as part of the downsizing of the LLNL Tritium Facility. Two stainless-steel glove boxes that had been used to process lithium deuteride-tritide (LiDT) salt were decontaminated using the Portable Cleanup System so that they could be flushed with room air through the facility ventilation system. This paper provides details of the decontamination operation. A series of metal (palladium and vanadium) hydride storage beds have been drained of tritium and flushed with deuterium in order to remove as much tritium as possible. The bed draining and flushing procedure is described, and a calculational method is presented which allows estimation of the tritium remaining in a bed after it has been drained and flushed. Data on specific bed draining and flushing are given.
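The abstract does not give the paper's calculational method, but a common first-order way to estimate tritium remaining after repeated flushes is a well-mixed dilution model, in which each flush cycle retains a fixed fraction of the inventory. A sketch under that assumption, with hypothetical numbers:

```python
# Simple well-mixed dilution model for estimating tritium remaining
# after repeated flush cycles. This is an illustrative sketch, not the
# calculational method of the paper (which the abstract does not detail).

def remaining_after_flushes(initial_ci, dead_volume_l, flush_volume_l, cycles):
    # Each cycle the bed's gas space is assumed fully mixed with the
    # flush gas, then pumped back down to the dead volume, so a fixed
    # fraction of the remaining inventory is kept per cycle.
    fraction_kept = dead_volume_l / (dead_volume_l + flush_volume_l)
    return initial_ci * fraction_kept ** cycles

# Hypothetical numbers: 10 Ci left after draining, 0.5 L dead volume,
# 5 L of deuterium per flush, 4 flush cycles
left = remaining_after_flushes(10.0, 0.5, 5.0, 4)
print(f"estimated tritium remaining = {left:.2e} Ci")
```

Because the retained fraction compounds geometrically, a few cycles with a large flush-to-dead-volume ratio reduce the inventory by orders of magnitude; tritium bound in the hydride itself would need a separate term.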

  3. Sequence trajectory generation for garment handling systems

    OpenAIRE

    Liu, Honghai; Lin, Hua

    2008-01-01

    This paper presents a novel generic approach to the planning strategy of garment handling systems. An assumption is proposed to separate the components of such systems into a component for intelligent gripper techniques and a component for handling planning strategies. Researchers can concentrate on one of the two components first, then merge the two problems together. An algorithm is addressed to generate the trajectory position and a clothes handling sequence of clothes partitions, which ar...

  4. Enclosure for handling high activity materials

    International Nuclear Information System (INIS)

    Jimeno de Osso, F.

    1977-01-01

    One of the most important problems that are met at the laboratories producing and handling radioisotopes is that of designing, building and operating enclosures suitable for the safe handling of active substances. With this purpose in mind, an enclosure has been designed and built for handling moderately high activities under a shielding made of 150 mm thick lead. In this report a description is given of those aspects that may be of interest to people working in this field. (Author)

  5. Enclosure for handling high activity materials abstract

    International Nuclear Information System (INIS)

    Jimeno de Osso, F.; Dominguez Rodriguez, G.; Cruz Castillo, F. de la; Rodriguez Esteban, A.

    1977-01-01

    One of the most important problems that are met at the laboratories producing and handling radioisotopes is that of designing, building and operating enclosures suitable for the safe handling of active substances. With that purpose in mind, an enclosure has been designed and built for handling moderately high activities under a shielding made of 150 mm thick lead. A description is given of those aspects that may be of interest to people working in this field. (author) [es

  6. Enclosure for handling high activity materials

    Energy Technology Data Exchange (ETDEWEB)

    Jimeno de Osso, F

    1977-07-01

    One of the most important problems that are met at the laboratories producing and handling radioisotopes is that of designing, building and operating enclosures suitable for the safe handling of active substances. With this purpose in mind, an enclosure has been designed and built for handling moderately high activities under a shielding made of 150 mm thick lead. In this report a description is given of those aspects that may be of interest to people working in this field. (Author)

  7. WWER NPPs fuel handling machine control system

    International Nuclear Information System (INIS)

    Mini, G.; Rossi, G.; Barabino, M.; Casalini, M.

    2001-01-01

    In order to increase the safety level of the fuel handling machine on WWER NPPs, Ansaldo Nucleare was asked to design and supply a new Control System. Two FHM Control System units have already been supplied for Temelin NPP, and further supplies are in progress for the Atommash company, which is in charge of supplying FHMs for NPPs located in Russia, Ukraine and China. The Fuel Handling Machine (FHM) Control System is an integrated system capable of complete management of nuclear fuel assemblies. The computer-based system takes into account all the operational safety interlocks so that it is able to avoid incorrect and dangerous manoeuvres in the case of operator error. Control system design criteria, hardware and software architecture, and quality assurance control are in accordance with the most recent international requirements and standards, in particular for electromagnetic disturbance immunity and seismic compatibility. The hardware architecture of the control system is based on the ABB INFI 90 system. The microprocessor-based ABB INFI 90 system incorporates and improves upon many of the time-proven control capabilities of Bailey Network 90, validated over 14,000 installations world-wide. The control system accommodates all of the machine's original sensors and devices, notably the Russian-designed angular position measurement sensors named 'selsyn'. Nevertheless, it is fully compatible with all the most recent sensors and devices currently available on the market (for example, multiturn absolute encoders). All control logic components were developed using the standard INFI 90 Engineering Work Station, interconnecting blocks extracted from an extensive SAMA library by using a graphical approach (CAD), allowing easier intelligibility, more flexibility, and updated and coherent documentation. The data acquisition system and the Man Machine Interface are implemented by ABB in co-operation with Ansaldo. 
The flexible and powerful software structure

  8. Fire and earthquake counter measures in radiation handling facilities

    International Nuclear Information System (INIS)

    1985-01-01

    'Fire countermeasures in radiation handling facilities', published in 1961, is still widely utilized as a valuable guideline for those handling radiation, through the revision in 1972. However, science and technology have advanced rapidly, the relevant laws were revised after the publication, and many points which do not conform to the present state have come to light. Therefore, it was decided to rewrite the book, and the new book has been completed. The title was changed to 'Fire and earthquake countermeasures in radiation handling facilities', and countermeasures against earthquakes were added. Moreover, consideration was given so that the book is sufficiently useful also for those concerned with fire fighting, not only for those handling radiation. The book describes the way of thinking about countermeasures against fires and earthquakes; the countermeasures in the normal state and when a fire or an earthquake has occurred; the countermeasures when a warning declaration has been announced; and data on fires, earthquakes, the risks of radioisotopes, fire fighting equipment, earthquake countermeasures for equipment, protectors and radiation measuring instruments, first aid, examples of emergency systems in radiation handling facilities, the activities of fire fighters, examples of accidents, and so on. (Kako, I.)

  9. Remote handling needs of the Princeton Plasma Physics Laboratory

    International Nuclear Information System (INIS)

    Smiltnieks, V.

    1982-07-01

    This report is the result of a Task Force study commissioned by the Canadian Fusion Fuels Technology Project (CFFTP) to investigate the remote handling requirements at the Princeton Plasma Physics Laboratory (PPPL) and identify specific areas where CFFTP could offer a contractual or collaborative participation, drawing on Canadian industrial expertise in remote handling technology. The Task Force reviewed four areas related to remote handling requirements: the TFTR facility as a whole, the service equipment required for remote maintenance, the more complex in-vessel components, and the tritium systems. Remote maintenance requirements both inside the vacuum vessel and around the periphery of the machine were identified as the principal areas where Canadian resources could effectively provide an input, initially in requirement definition, concept evaluation and feasibility design, and subsequently in detailed design and manufacture. Support requirements were identified in such areas as the mock-up facility and a variety of planning studies relating to reliability, availability, and staff training. Specific tasks are described which provide an important data base for the facility's remote handling requirements. Canadian involvement is suggested in those areas where expertise exists and support for the remote handling work is warranted. Reliability, maintenance operations, inspection strategy and decommissioning are suggested for study. Several specific components are singled out as needing development

  10. Application Examples for Handle System Usage

    Science.gov (United States)

    Toussaint, F.; Weigel, T.; Thiemann, H.; Höck, H.; Stockhause, M.; Lautenschlager, M.

    2012-12-01

    Besides the well-known DOI (Digital Object Identifier), a special form of Handle that resolves to scientific publications, there are various other applications in use, and others perhaps not yet. We present some examples of the existing ones and some ideas for the future. The national German project C3-Grid provides a framework to implement a first solution for provenance tracing and to explore unforeseen implications. Though project-specific, the high-level architecture is generic and represents well a common notion of data derivation. Users select one or many input datasets and a workflow software module (an agent in this context) to execute on the data. The output data is deposited in a repository to be delivered to the user. All data is accompanied by an XML metadata document. All input and output data, metadata and the workflow module receive Handles and are linked together to establish a directed acyclic graph of derived data objects and involved agents. Data that has been modified by a workflow module is linked to its predecessor data and to the workflow module involved. Version control systems such as svn or git provide Internet access to software repositories using URLs. To refer to a specific state of the source code of, for instance, a C3 workflow module, it is sufficient to reference the URL of the svn revision or git hash. In consequence, individual revisions and the repository as a whole receive PIDs. Moreover, the revision-specific PIDs are linked to their respective predecessors and become part of the provenance graph. Another example of PID usage in a current major project is given by EUDAT (European Data Infrastructure), which will link scientific data of several research communities together. In many fields it is necessary to provide data objects at multiple locations for a variety of applications. To ensure consistency, not only the master of a data object but also its copies shall be provided with a PID. 
To verify transaction safety and to
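The derivation graph described above (datasets, metadata, and workflow-module revisions all carrying PIDs, with each output linked to its inputs and to the agent that produced it) can be sketched as a simple registry. Handle-server interaction is omitted; the PID strings below are illustrative stand-ins, not real Handles:

```python
# Minimal sketch of a provenance DAG keyed by PIDs: each derived data
# object records its input PIDs and the workflow-module (agent) PID.
# The prefix/suffix strings are illustrative, not resolvable Handles.

derivations = {}  # output PID -> (tuple of input PIDs, agent PID)

def register_derivation(output_pid, input_pids, agent_pid):
    derivations[output_pid] = (tuple(input_pids), agent_pid)

def provenance(pid):
    # Walk the DAG backwards to recover how a data object was derived
    lineage = []
    stack = [pid]
    while stack:
        current = stack.pop()
        if current in derivations:
            inputs, agent = derivations[current]
            lineage.append((current, inputs, agent))
            stack.extend(inputs)
    return lineage

# Two hypothetical derivation steps: a workflow revision produces out-1
# from two inputs, and a later revision derives out-2 from out-1
register_derivation("11858/out-1", ["11858/in-a", "11858/in-b"], "11858/wf-r42")
register_derivation("11858/out-2", ["11858/out-1"], "11858/wf-r43")

for step in provenance("11858/out-2"):
    print(step)
```

Because every node is a PID, the same walk works whether a predecessor is a dataset, a metadata record, or a specific svn/git revision of a workflow module.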

  11. Hot Laboratories and Remote Handling

    International Nuclear Information System (INIS)

    2007-01-01

    The Opening talk of the workshop 'Hot Laboratories and Remote Handling' was given by Marin Ciocanescu with the communication 'Overview of R and D Program in Romanian Institute for Nuclear Research'. The works of the meeting were structured into three sections addressing the following items: Session 1. Hot cell facilities: Infrastructure, Refurbishment, Decommissioning; Session 2. Waste, transport, safety and remote handling issues; Session 3. Post-Irradiation examination techniques. In the frame of Section 1 the communication 'Overview of hot cell facilities in South Africa' by Wouter Klopper, Willie van Greunen et al, was presented. In the framework of the second session there were given the following four communications: 'The irradiated elements cell at PHENIX' by Laurent Breton et al., 'Development of remote equipment for DUPIC fuel fabrication at KAERI', by Jung Won Lee et al., 'Aspects of working with manipulators and small samples in an αβγ-box, by Robert Zubler et al., and 'The GIOCONDA experience of the Joint Research Centre Ispra: analysis of the experimental assemblies finalized to their safe recovery and dismantling', by Roberto Covini. Finally, in the framework of the third section the following five communications were presented: 'PIE of a CANDU fuel element irradiated for a load following test in the INR TRIGA reactor' by Marcel Parvan et al., 'Adaptation of the pole figure measurement to the irradiated items from zirconium alloys' by Yury Goncharenko et al., 'Fuel rod profilometry with a laser scan micrometer' by Daniel Kuster et al., 'Raman spectroscopy, a new facility at LECI laboratory to investigate neutron damage in irradiated materials' by Lionel Gosmain et al., and 'Analysis of complex nuclear materials with the PSI shielded analytical instruments' by Didier Gavillet. In addition, eleven more presentations were given as posters. Their titles were: 'Presentation of CETAMA activities (CEA analytic group)' by Alain Hanssens et al. 'Analysis of

  12. Arrival condition of spent fuel after storage, handling, and transportation

    International Nuclear Information System (INIS)

    Bailey, W.J.; Pankaskie, P.J.; Langstaff, D.C.; Gilbert, E.R.; Rising, K.H.; Schreiber, R.E.

    1982-11-01

    This report presents the results of a study conducted to determine the probable arrival condition of spent light-water reactor (LWR) fuel after handling and interim storage in spent fuel storage pools and subsequent handling and accident-free transport operations under normal or slightly abnormal conditions. The objective of this study was to provide information on the expected condition of spent LWR fuel upon arrival at interim storage or fuel reprocessing facilities or at disposal facilities if the fuel is declared a waste. Results of a literature survey and data evaluation effort are discussed. Preliminary threshold limits for storing, handling, and transporting unconsolidated spent LWR fuel are presented. The difficulty in trying to anticipate the amount of corrosion products (crud) that may be on spent fuel in future shipments is also discussed, and potential areas for future work are listed. 95 references, 3 figures, 17 tables

  13. Fuel handling problems at KANUPP

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed, I; Mazhar Hasan, S; Mugtadir, A [Karachi Nuclear Power Plant (KANUPP), Karachi (Pakistan)

    1991-04-01

    KANUPP experienced two abnormal fuel and fuel handling related problems during the year 1990. One of these arose due to the development of end plate to end plate coupling between the two bundles at the leading end of the fuel string in channel HO2-S. The incident occurred when attempts were being made to fuel this channel. Due to the pulling of sticking bundles into the acceptor fuelling machine (north) magazine, which was not designed to accommodate two bundles, a magazine rotary stop occurred. The forward motion of the charge tube was simultaneously discovered to be restricted. The incident led to the stalling of the fuelling machine locked on to channel HO2, necessitating a reactor shutdown. Removal of the fuelling machine was accomplished some time later, after draining of the channel. The second incident, which made the fuelling of channel KO5-N temporarily impracticable, occurred during attempts to remove its north end shield plug when this channel came up for fuelling. The incident resulted from the breaking of the lugs of the shield plug, making its withdrawal impossible. The Plant however kept operating with suspended fuelling of channel KO5, until it could no longer sustain a further increase in fuel burnup at the maximum rating position. Resolving both these problems necessitated draining of the respective channels, leaving the resident fuel uncovered for the duration of the associated operation. Due to the substantial difference in the oxidation temperatures of UO2 and Zircaloy and its influence on the cooling requirement, it was necessary either to determine explicitly that the respective channels did not contain defective fuel bundles or to wait long enough to allow the decay heat to reduce to manageable proportions. This had a significant bearing on the Plant down time necessary for the rectification of the problems. This paper describes the two incidents in detail and dwells upon the measures adopted to resolve the related problems. (author)

  14. Fuel handling problems at KANUPP

    International Nuclear Information System (INIS)

    Ahmed, I.; Mazhar Hasan, S.; Mugtadir, A.

    1991-01-01

    KANUPP experienced two abnormal fuel and fuel-handling problems during 1990. The first arose from end-plate-to-end-plate coupling between the two bundles at the leading end of the fuel string in channel HO2-S, and occurred while attempts were being made to fuel this channel. Pulling the sticking bundles into the acceptor fuelling machine (north) magazine, which was not designed to accommodate two bundles, caused a magazine rotary stop; the forward motion of the charge tube was simultaneously found to be restricted. The incident left the fuelling machine stalled and locked onto channel HO2, necessitating a reactor shutdown. The fuelling machine was removed some time later, after the channel had been drained. The second incident, which made the fuelling of channel KO5-N temporarily impossible, occurred during attempts to remove its north-end shield plug when this channel came up for fuelling: the lugs of the shield plug broke, making its withdrawal impossible. The plant nevertheless kept operating with fuelling of channel KO5 suspended, until it could no longer sustain a further increase in fuel burnup at the maximum rating position. Resolving both problems required draining the respective channels, leaving the resident fuel uncovered for the duration of the associated operation. Because of the substantial difference between the oxidation temperatures of UO2 and Zircaloy, and its influence on the cooling requirement, it was necessary either to determine explicitly that the respective channels contained no defective fuel bundles or to wait long enough for the decay heat to decrease to manageable proportions. This had a significant bearing on the plant downtime needed to rectify the problems. This paper describes the two incidents in detail and dwells on the measures adopted to resolve the related problems. (author)

  15. Evaluating ITER remote handling middleware concepts

    Energy Technology Data Exchange (ETDEWEB)

    Koning, J.F., E-mail: j.f.koning@differ.nl [FOM Institute DIFFER, Association EURATOM-FOM, Partner in the Trilateral Euregio Cluster and ITER-NL, PO Box 1207, 3430 BE Nieuwegein (Netherlands); Heemskerk, C.J.M.; Schoen, P.; Smedinga, D. [Heemskerk Innovative Technology, Noordwijk (Netherlands); Boode, A.H. [University of Applied Sciences InHolland, Alkmaar (Netherlands); Hamilton, D.T. [ITER Organization, Route de Vinon sur Verdon, 13115 Saint Paul Lez Durance (France)

    2013-10-15

    Highlights: ► Remote Handling Study Centre: middleware system setup and modules built. ► Aligning to ITER RH Control System Layout: prototype of database, VR and simulator. ► OpenSplice DDS, ZeroC ICE messaging and object oriented middlewares reviewed. ► Windows network latency found problematic for semi-realtime control over the network. -- Abstract: Remote maintenance activities in ITER will be performed by a unique set of hardware systems, supported by an extensive software kit. A layer of middleware will manage and control a complex set of interconnections between teams of operators, hardware devices in various operating theatres, and databases managing tool and task logistics. The middleware is driven by constraints on amounts and timing of data like real-time control loops, camera images, and database access. The Remote Handling Study Centre (RHSC), located at FOM institute DIFFER, has a 4-operator work cell in an ITER relevant RH Control Room setup which connects to a virtual hot cell back-end. The centre is developing and testing flexible integration of the Control Room components, resulting in proof-of-concept tests of this middleware layer. SW components studied include generic human-machine interface software, a prototype of a RH operations management system, and a distributed virtual reality system supporting multi-screen, multi-actor, and multiple independent views. Real-time rigid body dynamics and contact interaction simulation software supports simulation of structural deformation, “augmented reality” operations and operator training. The paper presents generic requirements and conceptual design of middleware components and Operations Management System in the context of a RH Control Room work cell. The simulation software is analyzed for real-time performance and it is argued that it is critical for middleware to have complete control over the physical network to be able to guarantee bandwidth and latency to the components.

  16. Evaluating ITER remote handling middleware concepts

    International Nuclear Information System (INIS)

    Koning, J.F.; Heemskerk, C.J.M.; Schoen, P.; Smedinga, D.; Boode, A.H.; Hamilton, D.T.

    2013-01-01

    Highlights: ► Remote Handling Study Centre: middleware system setup and modules built. ► Aligning to ITER RH Control System Layout: prototype of database, VR and simulator. ► OpenSplice DDS, ZeroC ICE messaging and object oriented middlewares reviewed. ► Windows network latency found problematic for semi-realtime control over the network. -- Abstract: Remote maintenance activities in ITER will be performed by a unique set of hardware systems, supported by an extensive software kit. A layer of middleware will manage and control a complex set of interconnections between teams of operators, hardware devices in various operating theatres, and databases managing tool and task logistics. The middleware is driven by constraints on amounts and timing of data like real-time control loops, camera images, and database access. The Remote Handling Study Centre (RHSC), located at FOM institute DIFFER, has a 4-operator work cell in an ITER relevant RH Control Room setup which connects to a virtual hot cell back-end. The centre is developing and testing flexible integration of the Control Room components, resulting in proof-of-concept tests of this middleware layer. SW components studied include generic human-machine interface software, a prototype of a RH operations management system, and a distributed virtual reality system supporting multi-screen, multi-actor, and multiple independent views. Real-time rigid body dynamics and contact interaction simulation software supports simulation of structural deformation, “augmented reality” operations and operator training. The paper presents generic requirements and conceptual design of middleware components and Operations Management System in the context of a RH Control Room work cell. The simulation software is analyzed for real-time performance and it is argued that it is critical for middleware to have complete control over the physical network to be able to guarantee bandwidth and latency to the components.
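    The abstract above argues that the middleware must have complete control over the physical network to guarantee bandwidth and latency for semi-realtime control loops. As an illustration of that kind of latency budgeting (not the ITER middleware itself), the following Python sketch measures the worst-case round-trip time of small messages over a local socket pair and compares it with a hypothetical 10 ms loop budget; the message size, sample count and budget value are all illustrative assumptions.

```python
import socket
import threading
import time

def echo(conn: socket.socket) -> None:
    """Echo fixed-size messages back until the peer closes the connection."""
    while True:
        data = conn.recv(64)
        if not data:
            break
        conn.sendall(data)

def worst_case_rtt(n_samples: int = 200) -> float:
    """Worst round-trip time (seconds) for 64-byte messages over a local socket pair."""
    a, b = socket.socketpair()
    threading.Thread(target=echo, args=(b,), daemon=True).start()
    payload = b"x" * 64
    worst = 0.0
    for _ in range(n_samples):
        t0 = time.perf_counter()
        a.sendall(payload)
        a.recv(64)  # wait for the echo before timing stops
        worst = max(worst, time.perf_counter() - t0)
    a.close()
    b.close()
    return worst

if __name__ == "__main__":
    budget_s = 0.010  # hypothetical 10 ms budget for a semi-realtime loop
    rtt = worst_case_rtt()
    print(f"worst RTT: {rtt * 1000:.3f} ms; within budget: {rtt < budget_s}")
```

    On a shared network (as the Windows latency finding in the highlights suggests), the worst-case figure, not the average, is what decides whether a control loop deadline can be met.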

  17. Remote-handled transuranic waste study

    International Nuclear Information System (INIS)

    1995-10-01

    The Waste Isolation Pilot Plant (WIPP) was developed by the US Department of Energy (DOE) as a research and development facility to demonstrate the safe disposal of transuranic (TRU) radioactive wastes generated from the Nation's defense activities. The WIPP disposal inventory will include up to 250,000 cubic feet of TRU wastes classified as remote handled (RH). The remaining inventory will include contact-handled (CH) TRU wastes, which characteristically have less specific activity (radioactivity per unit volume) than the RH-TRU wastes. The WIPP Land Withdrawal Act (LWA), Public Law 102-579, requires a study of the effect of RH-TRU waste on long-term performance. This RH-TRU Waste Study has been conducted to satisfy the requirements defined by the LWA and is considered by the DOE to be a prudent exercise in the compliance certification process of the WIPP repository. The objectives of this study include: conducting an evaluation of the impacts of RH-TRU wastes on the performance assessment (PA) of the repository to determine the effects of RH-TRU waste as a part of the total WIPP disposal inventory; and conducting a comparison of CH-TRU and RH-TRU wastes to assess the differences and similarities for such issues as gas generation, flammability and explosiveness, solubility, and brine and geochemical interactions. This study was conducted using the data, models, computer codes, and information generated in support of long-term compliance programs, including the WIPP PA. The study is limited in scope to post-closure repository performance and includes an analysis of the issues associated with RH-TRU wastes subsequent to emplacement of these wastes at WIPP in consideration of the current baseline design. 41 refs

  18. 9 CFR 3.118 - Handling.

    Science.gov (United States)

    2010-01-01

    9 Animals and Animal Products, § 3.118 Handling. Animal and Plant Health Inspection Service, Department of Agriculture, Animal Welfare Standards: Specifications for the Humane Handling, Care, Treatment, and Transportation of Marine...

  19. How to Handle Impasses in Bargaining.

    Science.gov (United States)

    Durrant, Robert E.

    Guidelines in an outline format are presented to school board members and administrators on how to handle impasses in bargaining. The following two rules are given: there sometimes may be strikes, but there always will be settlements; and on the way to settlements, there always will be impasses. Suggestions for handling impasses are listed under…

  20. Handling uncertainty through adaptiveness in planning approaches

    NARCIS (Netherlands)

    Zandvoort, M.; Vlist, van der M.J.; Brink, van den A.

    2018-01-01

    Planners and water managers seek to be adaptive to handle uncertainty through the use of planning approaches. In this paper, we study what type of adaptiveness is proposed and how this may be operationalized in planning approaches to adequately handle different uncertainties. We took a

  1. Survey of postharvest handling, preservation and processing ...

    African Journals Online (AJOL)

    Survey of postharvest handling, preservation and processing practices along the camel milk chain in Isiolo district, Kenya. ... Despite the important contribution of camel milk to food security for pastoralists in Kenya, little is known about the postharvest handling, preservation and processing practices. In this study, existing ...

  2. PND fuel handling decontamination: facilities and techniques

    International Nuclear Information System (INIS)

    Pan, R.Y.

    1996-01-01

    The use of various decontamination techniques and equipment has become a critical part of Fuel Handling maintenance work at Ontario Hydro's Pickering Nuclear Division. This paper presents an overview of the setup and techniques used for decontamination in the PND Fuel Handling Maintenance Facility and the effectiveness of each. (author). 1 tab., 9 figs

  3. Handling Kids in Crisis with Care

    Science.gov (United States)

    Bushinski, Cari

    2018-01-01

    The Handle with Care program helps schools help students who experience trauma. While at the scene of an event like a domestic violence call, drug raid, or car accident, law enforcement personnel determine the names and school of any children present. They notify that child's school to "handle ___ with care" the next day, and the school…

  4. PND fuel handling decontamination: facilities and techniques

    Energy Technology Data Exchange (ETDEWEB)

    Pan, R Y [Ontario Hydro, Toronto, ON (Canada)

    1997-12-31

    The use of various decontamination techniques and equipment has become a critical part of Fuel Handling maintenance work at Ontario Hydro's Pickering Nuclear Division. This paper presents an overview of the setup and techniques used for decontamination in the PND Fuel Handling Maintenance Facility and the effectiveness of each. (author). 1 tab., 9 figs.

  5. Handling knowledge on osteoporosis - a qualitative study

    DEFF Research Database (Denmark)

    Nielsen, Dorthe; Huniche, Lotte; Brixen, Kim

    2013-01-01

    Scand J Caring Sci, 2012. The aim of this qualitative study was to increase understanding of the importance of osteoporosis information and knowledge for patients' ways of handling osteoporosis in their everyday lives. Interviews were...

  6. MRI of meniscal bucket-handle tears

    Energy Technology Data Exchange (ETDEWEB)

    Magee, T.H.; Hinson, G.W. [Menorah Medical Center, Overland Park, KS (United States). Dept. of Radiology

    1998-09-01

    A meniscal bucket-handle tear is a tear with an attached fragment displaced from the meniscus of the knee joint. Low sensitivity of MRI for detection of bucket-handle tears (64% as compared with arthroscopy) has been reported previously. We report increased sensitivity for detecting bucket-handle tears with the use of coronal short tau inversion recovery (STIR) images. Results. By using four criteria for diagnosis of meniscal bucket-handle tears, our overall sensitivity compared with arthroscopy was 93% (28 of 30 meniscal bucket-handle tears seen at arthroscopy were detected by MRI). The meniscal fragment was well visualized in all 28 cases on coronal STIR images. The double posterior cruciate ligament sign was seen in 8 of 30 cases, the flipped meniscus was seen in 10 of 30 cases and a fragment in the intercondylar notch was seen in 18 of 30 cases. (orig.)

  7. The handling of radiation accidents

    International Nuclear Information System (INIS)

    1977-01-01

    The symposium was attended by 204 participants from 39 countries and 5 international organizations. Forty-two papers were presented in 8 sessions. The purpose of the meeting was to foster an exchange of experience gained in establishing and exercising plans for mitigating the effects of radiation accidents and in the handling of actual accident situations. Only a small number of accidents were reported at the symposium, which reflects the very high standard of safety that has been achieved by the nuclear industry. No accidents of radiological significance were reported to have occurred at commercial nuclear power plants. Of the accidents reported, industrial radiography continues to be the area in which most radiation accidents occur. The experience gained in the reported accident situations confirmed the crucial importance of the prompt availability of medical and radiological services, particularly in the case of uptake of radioactive material, and emphasized the importance of detailed investigation into the causes of an accident in order to improve preventive measures. One of the principal themes of the symposium involved emergency procedures related to nuclear power plant accidents, and several papers defining the scope, progression and consequences of design basis accidents for both thermal and fast reactor systems were presented. These were complemented by papers defining the resultant protection requirements that should be satisfied in the establishment of plans designed to mitigate the effects of the postulated accident situations. Several papers described existing emergency organizational arrangements relating both to specific nuclear power plants and to comprehensive national schemes, and a particularly informative session was devoted to the training of personnel in the practical conduct of emergency arrangements. The general feeling of the participants was one of studied confidence in the competence and

  8. Musculoskeletal injuries resulting from patient handling tasks among hospital workers.

    Science.gov (United States)

    Pompeii, Lisa A; Lipscomb, Hester J; Schoenfisch, Ashley L; Dement, John M

    2009-07-01

    The purpose of this study was to evaluate musculoskeletal injuries and disorders resulting from patient handling prior to the implementation of a "minimal manual lift" policy at a large tertiary care medical center. We sought to define the circumstances surrounding patient handling injuries and to identify potential preventive measures. Human resources data were used to define the cohort and their time at work. Workers' compensation records (1997-2003) were utilized to identify work-related musculoskeletal claims, while the workers' descriptions of injury were used to identify those that resulted from patient handling. Adjusted rate ratios were generated using Poisson regression. One-third (n = 876) of all musculoskeletal injuries resulted from patient handling activities. Most (83%) of the injury burden was incurred by inpatient nurses, nurses' aides and radiology technicians, while injury rates were highest for nurses' aides (8.8 per 100 full-time equivalents [FTEs]) and for smaller workgroups including emergency medical technicians (10.3/100 FTEs), patient transporters (4.3/100 FTEs), operating room technicians (3.1/100 FTEs), and morgue technicians (2.2/100 FTEs). Forty percent of injuries due to lifting/transferring patients may have been prevented through the use of mechanical lift equipment, while 32% of injuries resulting from repositioning/turning patients, pulling patients up in bed, or catching falling patients may not have been prevented by the use of lift equipment. The use of mechanical lift equipment could significantly reduce the risk of some patient handling injuries, but additional interventions need to be considered that address other patient handling tasks. Smaller high-risk workgroups should not be neglected in prevention efforts.
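    The study above reports injury rates per 100 full-time equivalents and adjusted rate ratios from Poisson regression. As a minimal illustration of the rate arithmetic only (the adjusted Poisson model is beyond a sketch), the following Python computes crude rates and a crude rate ratio; the injury counts and FTE denominators are invented for illustration and are not the study's data.

```python
def rate_per_100_fte(injuries: int, fte_years: float) -> float:
    """Crude injury rate per 100 full-time equivalents (FTEs)."""
    return 100.0 * injuries / fte_years

def rate_ratio(exposed_rate: float, referent_rate: float) -> float:
    """Crude (unadjusted) rate ratio of one workgroup against a referent group."""
    return exposed_rate / referent_rate

# Hypothetical counts chosen to reproduce rates of the same order as the abstract.
aides = rate_per_100_fte(injuries=44, fte_years=500.0)         # 8.8 per 100 FTEs
transporters = rate_per_100_fte(injuries=13, fte_years=300.0)  # about 4.3 per 100 FTEs
print(aides, transporters, rate_ratio(aides, transporters))
```

    An adjusted analysis like the study's would instead fit a Poisson regression with the log of person-time as an offset, so that covariates (job group, calendar year) can be controlled simultaneously.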

  9. Fuel handling machine and auxiliary systems for a fuel handling cell

    International Nuclear Information System (INIS)

    Suikki, M.

    2013-10-01

    This working report is an update of, and a supplement to, an earlier fuel handling machine design (Kukkola and Roennqvist 2006). The earlier design proposal focused primarily on the selection of a mechanical structure and operating principle for the fuel handling machine. This report introduces not only a fuel handling machine design but also the auxiliary fuel handling cell equipment and its operation. An objective of the design work was to verify the operating principles of, and space allocations for, the fuel handling cell equipment. The fuel handling machine is a remote-controlled apparatus capable of handling intensely radiating fuel assemblies in the fuel handling cell of an encapsulation plant. The fuel handling cell is an airtight, radiation-shielded space with massive concrete walls. The fuel handling machine is based on a bridge crane capable of traveling in the handling cell along wall tracks. The carriage of the bridge crane is provided with a carousel-type turntable on which both fixed and telescopic masts are mounted. The fixed mast has a gripper movable on linear guides for the transfer of fuel assemblies. The telescopic mast has a manipulator arm capable of maneuvering equipment present in the fuel handling cell, as well as conducting necessary maintenance and cleaning operations or rectifying possible fault conditions. The auxiliary fuel handling cell systems consist of several subsystems: a service manipulator, a tool carrier for manipulators, a material hatch, assisting winches, a vacuum cleaner, and a hose reel. With the exception of the vacuum cleaner, the devices included in the fuel handling cell's auxiliary system are only used when the actual encapsulation process is not ongoing. In a worst-case scenario, malfunctions of the mechanisms or actuators responsible for the motion actions of the fuel handling machine preclude bringing the fuel handling cell and related systems to a condition appropriate for

  10. Fuel handling machine and auxiliary systems for a fuel handling cell

    Energy Technology Data Exchange (ETDEWEB)

    Suikki, M. [Optimik Oy, Turku (Finland)

    2013-10-15

    This working report is an update of, and a supplement to, an earlier fuel handling machine design (Kukkola and Roennqvist 2006). The earlier design proposal focused primarily on the selection of a mechanical structure and operating principle for the fuel handling machine. This report introduces not only a fuel handling machine design but also the auxiliary fuel handling cell equipment and its operation. An objective of the design work was to verify the operating principles of, and space allocations for, the fuel handling cell equipment. The fuel handling machine is a remote-controlled apparatus capable of handling intensely radiating fuel assemblies in the fuel handling cell of an encapsulation plant. The fuel handling cell is an airtight, radiation-shielded space with massive concrete walls. The fuel handling machine is based on a bridge crane capable of traveling in the handling cell along wall tracks. The carriage of the bridge crane is provided with a carousel-type turntable on which both fixed and telescopic masts are mounted. The fixed mast has a gripper movable on linear guides for the transfer of fuel assemblies. The telescopic mast has a manipulator arm capable of maneuvering equipment present in the fuel handling cell, as well as conducting necessary maintenance and cleaning operations or rectifying possible fault conditions. The auxiliary fuel handling cell systems consist of several subsystems: a service manipulator, a tool carrier for manipulators, a material hatch, assisting winches, a vacuum cleaner, and a hose reel. With the exception of the vacuum cleaner, the devices included in the fuel handling cell's auxiliary system are only used when the actual encapsulation process is not ongoing. In a worst-case scenario, malfunctions of the mechanisms or actuators responsible for the motion actions of the fuel handling machine preclude bringing the fuel handling cell and related systems to a condition appropriate for

  11. Handling of multiassembly sealed baskets between reactor storage and a remote handling facility

    International Nuclear Information System (INIS)

    Massey, J.V.; Kessler, J.H.; McSherry, A.J.

    1989-06-01

    The storage of multiple fuel assemblies in sealed (welded) dry storage baskets is gaining increasing use as a means to augment at-reactor fuel storage capacity. Since this will place a significant number of such baskets on reactor sites, some initial downstream planning is warranted. This study examined scenarios for retrieving multi-assembly sealed baskets (MSBs) from onsite storage and transferring and shipping the fuel (and/or the baskets) to a federally operated remote handling facility (RHF). Numerous options for at-reactor and away-from-reactor handling were investigated. Materials handling flowsheets were developed, along with conceptual designs for the equipment and tools required to handle and open the MSBs. The handling options were evaluated and compared to a reference-case fuel handling sequence (i.e., fuel assemblies are taken from the fuel pool, shipped to a receiving and handling facility and placed into interim storage). The main parameters analyzed are throughput, radiation dose burden and cost. In addition to the handling of MSBs, this work also evaluated the handling of consolidated fuel canisters (CFCs). In summary, the handling of MSBs and CFCs in the store, ship and bury fuel cycle was found to be feasible and, under some conditions, to offer significant benefits in terms of throughput, cost and safety. 14 refs., 20 figs., 24 tabs

  12. Safeguards information handling and treatment

    International Nuclear Information System (INIS)

    Carchon, R.; Liu, J.; Ruan, D.

    2001-01-01

    Many states are currently discussing the new additional protocol (INFCIRC/540). This expanded framework is expected to provide additional confirmation that there are no undeclared activities and facilities in a state. The information collected by the IAEA comes mainly from three different sources: information provided by the state, information collected by the IAEA, and open sources. This information can be uncertain, incomplete, imprecise, not fully reliable, contradictory, etc. Hence, there is a need for a mathematical framework that provides a basis for the handling and treatment of multidimensional information of varying quality. We use a linguistic assessment based on fuzzy set theory as a flexible and realistic approach. The concept of a linguistic variable provides a means of approximate characterization of information that may be imprecise, too complex or ill-defined for the traditional quantitative approach to give an adequate answer. In applying this linguistic assessment approach, a problem arises in how to aggregate linguistic information. Two different approaches can be followed: (1) an approximation approach using the associated membership function; (2) a symbolic approach acting by direct computation on labels, where the use of membership functions and linguistic approximation is unnecessary, which makes computation simple and quick. To manipulate the linguistic information in this context, we work with aggregation operators for combining non-weighted and weighted linguistic values by direct computation on labels, such as the Min-type and Max-type weighted aggregation operators as well as the median aggregation operator. A case study on the application of these aggregation operators to the fusion of safeguards-relevant information is given. The IAEA Physical Model of the nuclear fuel cycle can be taken as a systematic and comprehensive indicator system. It identifies and describes indicators of
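    The abstract above describes symbolic aggregation of linguistic labels by direct computation on label indices, including Min-type and Max-type weighted operators and a median operator. The sketch below implements one common form of these operators from the fuzzy linguistic literature; the seven-label scale and the exact operator definitions (max over min(w_i, a_i), and min over max(Neg(w_i), a_i)) are assumptions and may differ from the paper's formulation.

```python
# Ordered label set: the index encodes strength; a common 7-label scale.
LABELS = ["none", "very_low", "low", "medium", "high", "very_high", "perfect"]

def idx(label: str) -> int:
    """Position of a label on the ordered scale."""
    return LABELS.index(label)

def neg(label: str) -> str:
    """Symbolic negation: Neg(s_i) = s_{T-i} on a scale of T+1 labels."""
    return LABELS[len(LABELS) - 1 - idx(label)]

def median_label(labels: list) -> str:
    """Median aggregation by direct computation on label indices."""
    ordered = sorted(labels, key=idx)
    return ordered[(len(ordered) - 1) // 2]

def max_type_weighted(values: list, weights: list) -> str:
    """Max-type weighted aggregation: max_i min(w_i, a_i) (optimistic)."""
    return LABELS[max(min(idx(a), idx(w)) for a, w in zip(values, weights))]

def min_type_weighted(values: list, weights: list) -> str:
    """Min-type weighted aggregation: min_i max(Neg(w_i), a_i) (pessimistic)."""
    return LABELS[min(max(idx(neg(w)), idx(a)) for a, w in zip(values, weights))]

# Three hypothetical indicator assessments with their importance weights.
values = ["high", "low", "medium"]
weights = ["very_high", "medium", "high"]
print(median_label(values), max_type_weighted(values, weights),
      min_type_weighted(values, weights))
```

    Because everything is computed on label positions, no membership functions or linguistic approximation steps are needed, which is the advantage the symbolic approach claims over the approximation approach.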

  13. Development of remote handling tools and equipment

    International Nuclear Information System (INIS)

    Nakahira, Masataka; Oka, Kiyoshi; Taguchi, Kou; Ito, Akira; Fukatsu, Seiichi; Oda, Yasushi; Kajiura, Soji; Yamazaki, Seiichiro; Aoyama, Kazuo.

    1997-01-01

    The remote handling (RH) tools and equipment development in ITER focuses mainly on the welding and cutting technique, weld inspection and double-seal door which are essential factors in the replacement of in-vessel components such as divertor and blanket. The conceptual design of these RH tools and equipment has been defined through ITER engineering design activity (EDA). Similarly, elementary R and D of the RH tools and equipment have been extensively performed to accumulate a technological data base for process and performance qualification. Based on this data, fabrications of full-scale RH tools and equipment are under progress. A prototypical bore tool for pipe welding and cutting has already been fabricated and is currently undergoing integrated performance tests. This paper describes the design outline of the RH tools and equipment related to in-vessel components maintenance, and highlights the current status of RH tools and equipment development by the Japan Home Team as an ITER R and D program. This paper also includes an outline of insulation joint and quick-pipe connector development, which has also been conducted through the ITER R and D program in order to standardize RH operations and components. (author)

  14. Manual Handling Risk Factors and Low Back Pain Complaints among Brick Makers (Faktor Risiko Manual Handling dengan Keluhan Nyeri Punggung Bawah Pembuat Batu Bata)

    Directory of Open Access Journals (Sweden)

    Heru Subaris Kasjono

    2017-08-01

    Full Text Available Manual handling of heavy objects carries a risk of injury to the musculoskeletal system. This study assessed manual handling risk with the Key Indicator Method (Leitmerkmalmethode, LMM) to examine the relationship between the time, load, body posture and working conditions of manual handling and the low back pain complaints perceived by brick makers at each stage of brick making. The study was a cross-sectional survey. Data were collected with a low back pain questionnaire, supported by a physical examination performed by nurses and the Key-LMM checklist, and analyzed with Spearman's correlation. For the time variable (frequency of lifting or transfer operations), a relationship with low back pain complaints was found at the raw material excavation, forming and drying stages (p = 0.039, 0.047 and 0.038, respectively), while for the working conditions variable a relationship was found at the raw material excavation stage (p = 0.028). The load and body posture variables showed no significant relationship with low back pain at any stage of brick making. The researchers conclude that the time variable is significantly related to low back pain complaints at the excavation, forming and drying stages but not at the raw material processing stage; that load and body posture are unrelated to low back pain at all stages; and that working conditions are related to low back pain complaints only at the raw material excavation stage.
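    The study above scores manual handling risk with the Key Indicator Method (Leitmerkmalmethode, LMM). A minimal sketch of an LMM-style score follows: the time rating is multiplied by the sum of the load, posture and working-condition ratings, and the product is mapped to a risk band. The specific rating values in the example are illustrative, not taken from the study.

```python
def lmm_risk_score(time_rating: int, load_rating: int,
                   posture_rating: int, conditions_rating: int) -> int:
    """Key Indicator Method style score: time rating multiplied by
    the sum of the other key-indicator ratings."""
    return time_rating * (load_rating + posture_rating + conditions_rating)

def risk_category(score: int) -> int:
    """Four-band interpretation used by KIM/LMM-style methods."""
    if score < 10:
        return 1  # low load situation
    if score < 25:
        return 2  # increased load; redesign helpful for less resilient workers
    if score < 50:
        return 3  # highly increased load; redesign recommended
    return 4      # high load; redesign required

# Illustrative excavation-stage assessment: frequent lifts (time rating 4),
# moderate load (4), stooped posture (3), unfavorable ground (1).
score = lmm_risk_score(4, 4, 3, 1)
print(score, risk_category(score))
```

    A score in band 3 or 4 would flag a work stage, such as raw material excavation here, as a candidate for redesign before correlating it with reported symptoms.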

  15. Ergonomics: safe patient handling and mobility.

    Science.gov (United States)

    Hallmark, Beth; Mechan, Patricia; Shores, Lynne

    2015-03-01

    This article reviews and investigates the issues surrounding ergonomics, with a specific focus on safe patient handling and mobility. The health care worker of today faces many challenges, one of which is related to the safety of patients. Safe patient handling and mobility is on the forefront of the movement to improve patient safety. This article reviews the risks associated with patient handling and mobility, and informs the reader of current evidence-based practice relevant to this area of care. Copyright © 2015 Elsevier Inc. All rights reserved.

  16. How the NWC handles software as product

    Energy Technology Data Exchange (ETDEWEB)

    Vinson, D.

    1997-11-01

    This tutorial provides a hands-on view of how the Nuclear Weapons Complex project should be handling (or planning to handle) software as a product in response to Engineering Procedure 401099. The SQAS has published the document SQAS96-002, Guidelines for NWC Processes for Handling Software Product, that will be the basis for the tutorial. The primary scope of the tutorial is on software products that result from weapons and weapons-related projects, although the information presented is applicable to many software projects. Processes that involve the exchange, review, or evaluation of software product between or among NWC sites, DOE, and external customers will be described.

  17. Handling of bulk solids theory and practice

    CERN Document Server

    Shamlou, P A

    1990-01-01

    Handling of Bulk Solids provides a comprehensive discussion of the field of solids flow and handling in the process industries. Presentation of the subject follows classical lines of separate discussions for each topic, so each chapter is self-contained and can be read on its own. Topics discussed include bulk solids flow and handling properties; pressure profiles in bulk solids storage vessels; the design of storage silos for reliable discharge of bulk materials; gravity flow of particulate materials from storage vessels; pneumatic transportation of bulk solids; and the hazards of solid-mater

  18. Capturing Complex Multidimensional Data in Location-Based Data Warehouses

    DEFF Research Database (Denmark)

    Timko, Igor; Pedersen, Torben Bach

    2004-01-01

    Motivated by the increasing need to handle complex multidimensional data in location-based data warehouses, this paper proposes a powerful data model that is able to capture the complexities of such data. The model provides a foundation for handling complex transportation infrastructures...
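    The abstract above concerns a data model for complex multidimensional data, such as transportation infrastructures, in location-based data warehouses. As a hypothetical illustration of one such complexity (partial containment between hierarchy levels, which is not necessarily the paper's exact model), the following Python rolls a per-section measure up to roads, weighting by the fraction of each section that lies on each road; all names and fractions are invented.

```python
# Hypothetical two-level location hierarchy (road section -> road) with
# partial containment: a section may straddle several roads.
CONTAINMENT = {
    # section -> list of (road, fraction of the section lying on that road)
    "S1": [("R1", 1.0)],
    "S2": [("R1", 0.6), ("R2", 0.4)],
}

def roll_up(section_measures: dict) -> dict:
    """Aggregate a per-section measure to roads, weighted by containment."""
    road_totals = {}
    for section, value in section_measures.items():
        for road, fraction in CONTAINMENT[section]:
            road_totals[road] = road_totals.get(road, 0.0) + fraction * value
    return road_totals

# E.g. traffic counts per section rolled up to per-road totals.
print(roll_up({"S1": 10.0, "S2": 5.0}))
```

    With strict (full) containment the fractions would all be 1.0 and this reduces to an ordinary dimensional roll-up; the partial weights are what a conventional star-schema hierarchy cannot express directly.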

  19. A Look at Technologies Vis-a-vis Information Handling Techniques.

    Science.gov (United States)

    Swanson, Rowena W.

    The paper examines several ideas for information handling implemented with new technologies that suggest directions for future development. These are grouped under the topic headings: Handling Large Data Banks, Providing Personalized Information Packages, Providing Information Specialist Services, and Expanding Man-Machine Interaction. Guides in…

  20. Laboratory Activity on Sample Handling and Maintaining a Laboratory Notebook through Simple pH Measurements

    Science.gov (United States)

    Erdmann, Mitzy A.; March, Joe L.

    2016-01-01

    Sample handling and laboratory notebook maintenance are necessary skills but can seem abstract if not presented to students in context. An introductory exercise focusing on proper sample handling, data collection and laboratory notebook keeping for the general chemistry laboratory was developed to emphasize the importance of keeping an accurate…

  1. Management of transport and handling contracts

    CERN Document Server

    Rühl, I

    2004-01-01

This paper outlines the content, application and management strategies for the various contracts related to transport and handling activities. In total, the two sections Logistics and Handling Maintenance are in charge of 27 (!) contracts, ranging from small supply contracts to large industrial support contracts. The activities as well as the contracts can generally be divided into four main topics: "Vehicle Fleet Management"; "Supply, Installation and Commissioning of Lifting and Hoisting Equipment"; "Equipment Maintenance"; and "Industrial Support for Transport and Handling". Each activity and contract requires a different approach and permanent adaptation to CERN's often changing requirements. In particular, the management of, and the difficulties experienced with, the contracts E072 "Maintenance of lifting and hoisting equipment", F420 "Supply of seven overhead traveling cranes for LHC" and S090/S103 "Industrial support for transport and handling" will be explained in detail.

  2. Travelling cranes for heavy reactor component handling

    International Nuclear Information System (INIS)

    Champeil, M.

    1977-01-01

    Structure and operating machinery of two travelling cranes (600 t and 450 t) used in the Framatome factory for handling heavy reactor components are described. When coupled, these cranes can lift loads up to 1000 t [fr

  3. Remote handling for an ISIS target change

    International Nuclear Information System (INIS)

    Broome, T.A.; Holding, M.

    1989-01-01

During 1987 two ISIS targets were changed. This document describes the main features of the remote handling aspects of the work. All the work had to be carried out using remote handling techniques. The radiation level measured on the surface of the reflector when the second target had been removed was about 800 mGy/h, demonstrating that hands-on operation on any part of the target reflector moderator assembly is not practical. The target changes were the first large-scale operations in the Target Station Remote Handling Cell and a great deal was learned about both equipment and working practices. Some general principles emerged which are applicable to other active handling tasks on facilities like ISIS and these are discussed below. 8 figs

  4. Aerobot Sampling and Handling System, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Honeybee Robotics proposes to: ?Derive and document the functional and technical requirements for Aerobot surface sampling and sample handling across a range of...

  5. Handling of disused radioactive materials in Ecuador

    International Nuclear Information System (INIS)

    Benitez, Manuel

    1999-10-01

    This paper describes the handling of disused radioactive sources. It also shows graphic information of medical and industrial equipment containing radioactive sources. This information was prepared as part of a training course on radioactive wastes. (The author)

  6. Foster parenting, human imprinting and conventional handling ...

    African Journals Online (AJOL)

    p2492989

    Foster parenting, human imprinting and conventional handling affects survival and early .... bird may subsequently direct its sexual attention to those humans on whom it was imprinted (Bubier et al., ..... The mind through chicks' eyes: memory,.

  7. Control device for handling device of control rod drive

    International Nuclear Information System (INIS)

    Sasaki, Toshiya

    1998-01-01

A predetermined aimed portion of the control rod drives disposed in a pedestal is photographed, and image data and camera data, including the position of the camera, are output. Edge cut-out processed image data are formed from the output image data, and the aimed image data and aimed camera data obtained when the handling device was previously positioned precisely at a predetermined aimed position are stored. The aimed image data are taken from the aimed image data file to prepare computer graphic image data, and an image-superposing processing portion is provided for comparing the images based on the computer graphic image data with the images based on the edge cut-out processed image data, as well as comparing the aimed camera data with the camera data, and displaying each of them on an image display portion. (I.S.)

  8. Regulatory process for material handling equipment

    International Nuclear Information System (INIS)

    Rajendran, S.; Agarwal, Kailash

    2017-01-01

Atomic Energy (Factories) Rules (AEFR) 1996, Rule 35 states: 'Thorough inspection and load testing of a Crane shall be done by a Competent Person at least once every 12 months'. To adhere to this rule, the BARC Safety Council constituted the 'Material Handling Equipment Committee (MHEC)' under the aegis of the Conventional Fire and Safety Review Committee (CFSRC) to carry out periodical inspection and certification of Material Handling Equipment (MHE), tools and tackles used in BARC facilities at Trombay, Tarapur and Kalpakkam

  9. Stud bolt handling equipment for reactor vessel

    International Nuclear Information System (INIS)

    Bunyan, T.W.

    1989-01-01

    Reactor vessel stud bolt handling equipment includes means for transferring a stud bolt to a carrier from a parking station, or vice versa. Preferably a number of stud bolts are handled simultaneously. The transfer means may include cross arms rotatable about extendable columns, and the equipment is mounted on a mobile base for movement into and out of position. Each carrier comprises a tubular socket and an expandable sleeve to grip a stud bolt. (author)

  10. Testing of FFTF fuel handling equipment

    International Nuclear Information System (INIS)

    Coleman, D.W.; Grazzini, E.D.; Hill, L.F.

    1977-07-01

    The Fast Flux Test Facility has several manual/computer controlled fuel handling machines which are exposed to severe environments during plant operation but still must operate reliably when called upon for reactor refueling. The test programs for two such machines--the Closed Loop Ex-Vessel Machine and the In-Vessel Handling Machine--are described. The discussion centers on those areas where design corrections or equipment repairs substantiated the benefits of a test program prior to plant operation

  11. Human factors issues in fuel handling

    International Nuclear Information System (INIS)

    Beattie, J.D.; Iwasa-Madge, K.M.; Tucker, D.A.

    1994-01-01

    The staff of the Atomic Energy Control Board wish to further their understanding of human factors issues of potential concern associated with fuel handling in CANDU nuclear power stations. This study contributes to that objective by analysing the role of human performance in the overall fuel handling process at Ontario Hydro's Darlington Nuclear Generating Station, and reporting findings in several areas. A number of issues are identified in the areas of design, operating and maintenance practices, and the organizational and management environment

  12. About brachytherapy for the handling of cancer

    International Nuclear Information System (INIS)

    Campos, Tarcisio P.R.; Silva, Nilton O.; Damaso, Renato S.; Costa, Helder R.; Borges, Paulo H.R.; Mendes, Bruno M.

    2000-01-01

The technique of brachytherapy is discussed in this article. The hardware and software necessary for its application are briefly presented. Since macro-dosimetry is an important stage in the radiation therapy procedure, a simplified method of dose evaluation in conventional brachytherapy is presented. As an illustration, isodoses of a three-dimensional distribution of linear sources are drawn on a digitized X-ray picture, exemplifying the handling of breast brachytherapy with iridium sources
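The sum-of-point-sources idea behind such isodose calculations can be sketched with a bare inverse-square model (a rough illustration only: it ignores tissue attenuation, scatter and source anisotropy, and the seed spacing and unit strength are invented for the example):

```python
import math

def dose_rate(point, seeds, strength=1.0):
    """Sum inverse-square contributions from discrete point seeds
    approximating a linear source (no attenuation or anisotropy)."""
    x, y, z = point
    total = 0.0
    for sx, sy, sz in seeds:
        r2 = (x - sx) ** 2 + (y - sy) ** 2 + (z - sz) ** 2
        total += strength / r2
    return total

# A 5 cm linear source approximated by 11 point seeds along the z-axis
seeds = [(0.0, 0.0, -2.5 + 0.5 * i) for i in range(11)]

# Dose rate falls off with lateral distance from the source axis
near = dose_rate((1.0, 0.0, 0.0), seeds)
far = dose_rate((2.0, 0.0, 0.0), seeds)
```

Sampling such values on a grid and contouring them is what produces the isodose curves overlaid on the radiograph.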

  13. Development of standard components for remote handling

    International Nuclear Information System (INIS)

    Taguchi, Kou; Kakudate, Satoshi; Nakahira, Masataka; Ito, Akira

    1998-01-01

The core of the Fusion Experimental Reactor consists of various components, such as superconducting magnets and forced-cooled in-vessel components, which are remotely maintained due to the intense gamma radiation. Mechanical connectors such as cooling pipe connections, insulation joints and electrical connectors are commonly used for maintenance of these components and have to be standardized in terms of remote handling. This paper describes these mechanical connectors, developed as standard components compatible with remote handling and tolerant of radiation. (author)

  14. Development of standard components for remote handling

    Energy Technology Data Exchange (ETDEWEB)

    Taguchi, Kou; Kakudate, Satoshi; Nakahira, Masataka; Ito, Akira [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1998-04-01

The core of the Fusion Experimental Reactor consists of various components, such as superconducting magnets and forced-cooled in-vessel components, which are remotely maintained due to the intense gamma radiation. Mechanical connectors such as cooling pipe connections, insulation joints and electrical connectors are commonly used for maintenance of these components and have to be standardized in terms of remote handling. This paper describes these mechanical connectors, developed as standard components compatible with remote handling and tolerant of radiation. (author)

  15. Safety Training: "Manual Handling" course in September

    CERN Multimedia

    Safety Training, HSE Unit

    2016-01-01

The next "Manual Handling" course will be given, in French, on 26 September 2016. This course is designed for anyone required to carry out manual handling of loads in the course of their work. The main objective of this course is to adopt and apply the basic principles of physical safety and economy of effort. There are places available. If you are interested in following this course, please fill in an EDH training request via our catalogue.

  16. The claims handling process of engineering insurance in South Africa

    Directory of Open Access Journals (Sweden)

    I.C. de Beer

    2015-05-01

Due to technological developments, the complicated world of engineering and its associated products is continuously becoming more specialised. Short-term insurers provide engineering insurance to enable the owners and operators of engineering assets to combat the negative impact of the associated risks. It is, however, a huge challenge for the insurers of engineering insurance to manage the particular risks against the background of technological enhancement. The skills gap in the short-term insurance market and the engineering environment may be the main factor inhibiting the growth of the engineering insurance market. The objective of this research is to improve financial decision-making concerning the claims handling process of engineering insurance. Secondary as well as primary data were necessary to achieve the stated objective. The secondary data provided the background of the research and enabled the researchers to compile a questionnaire for the empirical survey. The questionnaire and a cover letter were sent to the top 10 short-term insurers in South Africa that provide engineering insurance. Their perceptions should provide guidelines to other short-term insurers engaged in engineering insurance, as they are regarded as the market leaders of engineering insurance in South Africa. The empirical results of this research focus on the importance of various claims handling factors when assessing the claims handling process of engineering insurance, the problem areas in the claims handling process concerned, and how often the stipulations of engineering insurance policies are adjusted to take the claims handling factors into account.

  17. Specialization and Flexibility in Port Cargo Handling

    Directory of Open Access Journals (Sweden)

    Hakkı KİŞİ

    2016-11-01

Cargo handling is the fundamental function of ports. In this context, the questions of equipment type and capacity rate need to be tackled with respect to cargo handling principles. The purpose of this study is to discuss the types of equipment to be used in ports, relating the matter to costs and capacity. The question is studied with a basic economic theoretical approach. Various conditions such as port location, size, resources, cargo traffic and ships are the given parameters that dictate the type and specification of the cargo handling equipment. A simple approach based on the cost-capacity relation can also be useful in deciding whether to use specialised or flexible equipment. Port equipment is sometimes expected to be flexible, to handle as many types of cargo as possible, and sometimes to be specialised, to handle one specific type of cargo. The cases that might be suitable for these alternatives are discussed from an economic point of view in this article. Consequently, effectiveness and efficiency criteria play important roles in determining the handling equipment in ports.

  18. Genotoxic monitoring of nurses handling cytotoxic drugs

    Directory of Open Access Journals (Sweden)

    Anna Tompa

    2016-01-01

Objective: Several biomarkers may be used to detect harmful exposure and individual susceptibility to cancer. Monitoring of exposure-related biomarkers may have a significant effect on early detection of cell transformation, thereby aiding the primary prevention of various chronic and malignant diseases. Nurses who handle cytotoxic drugs are exposed to carcinogenic agents, which have the potential to interrupt the cell cycle and to induce chromosomal aberrations. A high frequency of chromosomal aberrations indicates the need for intervention even when exposure to these carcinogens is low. Methods: Nationally representative samples of 552 nurses were investigated by a follow-up monitoring system. The measured biomarkers were clinical laboratory routine tests, complemented with genotoxicological monitoring (chromosome aberrations [CAs] and sister chromatid exchanges [SCEs]) and immunotoxicological monitoring (ratio of lymphocyte subpopulations and lymphocyte activation markers measured on peripheral blood lymphocytes). Results were compared to the data of 140 healthy, age-matched controls. Results: In nurses exposed to cytostatics, we observed a significantly increased frequency of CAs and SCEs compared with the controls. Cytostatic drug exposure also manifested itself in an increased frequency of helper T lymphocytes. Genotoxicological and immunotoxicological changes, as well as negative health effects (i.e., iron deficiency, anemia, and thyroid diseases), increased among cytostatic-exposed subjects. Conclusions: These results raise concerns about the protection of nursing staff from chemical carcinogens in the working environment.

  19. Experiences with decontaminating tritium-handling apparatus

    International Nuclear Information System (INIS)

    Maienschein, J.L.; Garcia, F.; Garza, R.G.; Kanna, R.L.; Mayhugh, S.R.; Taylor, D.T.

    1991-07-01

Tritium-handling apparatus has been decontaminated as part of the shutdown of the LLNL Tritium Facility. Two stainless-steel gloveboxes that had been used to process lithium deuteride-tritide (LiDT) salt were decontaminated using the Portable Cleanup System so that they could be flushed with room air through the facility ventilation system. Further surface decontamination was performed by scrubbing the interior with paper towels and ethyl alcohol or Swish™. The surface contamination, as shown by swipe surveys, was reduced from 4×10⁴–10⁶ disintegrations per minute (dpm)/cm² to 2×10²–4×10⁴ dpm/cm². Details on the decontamination operation are provided. A series of metal (palladium and vanadium) hydride storage beds have been drained of tritium and flushed with deuterium in order to remove as much tritium as possible. The bed draining and flushing procedure is described, and a calculational method is presented which allows estimation of the tritium remaining in a bed after it has been drained and flushed. Data on specific bed draining and flushing are given.
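The report's own calculational method is not reproduced in the abstract, but a minimal well-mixed dilution model illustrates how such an estimate can work: if each deuterium flush replaces a fixed fraction of the gas in the bed, the tritium remaining after n flushes falls geometrically. (The inventory and replaced fraction below are illustrative assumptions, not the report's data.)

```python
def tritium_remaining(initial_ci, replaced_fraction, n_flushes):
    """Well-mixed dilution estimate: each flush removes a fixed
    fraction of the remaining tritium inventory (in curies)."""
    return initial_ci * (1.0 - replaced_fraction) ** n_flushes

# e.g. 10 Ci left after draining, each flush replacing 90% of the gas
after3 = tritium_remaining(10.0, 0.9, 3)  # 10 * 0.1**3 = 0.01 Ci
```

In practice, residual tritium bound in the hydride phase would make the true decline slower than this ideal gas-phase dilution.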

  20. Electronic astronomical information handling and flexible publishing.

    Science.gov (United States)

    Heck, A.

The current dramatic evolution in information technology is bringing major modifications in the way scientists work and communicate. The concept of electronic information handling encompasses the diverse types of information, the different media, as well as the various communication methodologies and technologies. It ranges from the very collection of data to the final publication of results and sharing of knowledge. New problems and challenges also result from the new information culture, especially on legal, ethical, and educational grounds. Electronic publishing will have to diverge from being an electronic version of contributions on paper and will be part of a more general flexible-publishing policy. The benefits of private publishing are questioned. The procedures for validating published material and for evaluating scientific activities will have to be adjusted too. Provision of electronic refereed information independently of commercial publishers is now feasible. Scientists and scientific institutions now have the possibility to run an efficient information server with validated (refereed) material without the help of a commercial publisher.

  1. Physical load handling and listening comprehension effects on balance control.

    Science.gov (United States)

    Qu, Xingda

    2010-12-01

    The purpose of this study was to determine the physical load handling and listening comprehension effects on balance control. A total of 16 young and 16 elderly participants were recruited in this study. The physical load handling task required holding a 5-kg load in each hand with arms at sides. The listening comprehension task involved attentive listening to a short conversation. Three short questions were asked regarding the conversation right after the testing trial to test the participants' attentiveness during the experiment. Balance control was assessed by centre of pressure-based measures, which were calculated from the force platform data when the participants were quietly standing upright on a force platform. Results from this study showed that both physical load handling and listening comprehension adversely affected balance control. Physical load handling had a more deleterious effect on balance control under the listening comprehension condition vs. no-listening comprehension condition. Based on the findings from this study, interventions for the improvement of balance could be focused on avoiding exposures to physically demanding tasks and cognitively demanding tasks simultaneously. STATEMENT OF RELEVANCE: Findings from this study can aid in better understanding how humans maintain balance, especially when physical and cognitive loads are applied. Such information is useful for developing interventions to prevent fall incidents and injuries in occupational settings and daily activities.
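Centre-of-pressure (COP) measures of the kind used here are typically derived from force-plate force and moment channels; a minimal sketch, assuming the common convention COPx = -My/Fz and COPy = Mx/Fz (sign conventions vary by plate manufacturer) and illustrative sample values:

```python
import math

def cop_trajectory(samples):
    """Centre-of-pressure coordinates from force-plate samples.
    Each sample is (Fz, Mx, My) in N and N*m; here we assume
    COPx = -My/Fz and COPy = Mx/Fz (conventions vary by plate)."""
    return [(-my / fz, mx / fz) for fz, mx, my in samples]

def sway_path_length(cop):
    """Total distance travelled by the COP, a common balance measure."""
    return sum(math.dist(a, b) for a, b in zip(cop, cop[1:]))

# Three illustrative samples from a subject standing on a plate
samples = [(700.0, 3.5, -7.0), (700.0, 0.0, 0.0), (700.0, -3.5, 7.0)]
cop = cop_trajectory(samples)
path = sway_path_length(cop)
```

A larger path length over a fixed trial duration indicates poorer balance control, which is how effects like the load-handling and listening-comprehension conditions above are compared.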

  2. A model for transfer baggage handling at airports

    DEFF Research Database (Denmark)

    Barth, Torben C.; Timler Holm, Janus; Lindorff Larsen, Jakob

This work deals with the handling of baggage from passengers changing aircraft at an airport. The transfer baggage problem is to assign the bags from each arriving aircraft to an infeed area into the airport infrastructure. The infrastructure will then distribute the bags to the handling facilities. The model can be solved with a commercial MIP-solver. Furthermore, the use of the model in the dynamic environment during daily operations is introduced. The model includes two different approaches for increasing the robustness of the generated solutions. The uncertainty of the input data is studied and future approaches for improving robustness are discussed. The presented solution approach has run successfully as part of the operation control systems at Frankfurt Airport since 2008.
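The core assignment decision (bags from each arriving flight to an infeed area) can be illustrated with a toy minimum-cost assignment. The published model is a richer MIP with capacity and robustness terms, so this brute-force sketch with invented costs only shows the basic structure:

```python
from itertools import permutations

def assign_flights(cost):
    """Brute-force minimum-cost assignment of flights (rows) to
    infeed areas (columns); cost[i][j] could encode transfer time
    or workload. Assumes no more flights than areas."""
    n_flights, areas = len(cost), range(len(cost[0]))
    best = None
    for perm in permutations(areas, n_flights):
        total = sum(cost[i][j] for i, j in enumerate(perm))
        if best is None or total < best[0]:
            best = (total, perm)
    return best  # (total cost, area index chosen for each flight)

# 3 arriving flights, 3 infeed areas, invented cost matrix
cost = [[4, 1, 3],
        [2, 0, 5],
        [3, 2, 2]]
total, assignment = assign_flights(cost)
```

A real instance would be solved with an LP/MIP solver rather than enumeration, and would add constraints such as infeed-area capacity over time.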

  3. Safety handling of beryllium for fusion technology R and D

    International Nuclear Information System (INIS)

    Yoshida, Hiroshi; Okamoto, Makoto; Terai, Takayuki; Odawara, Osamu; Ashibe, Kusuo; Ohara, Atsushi.

    1992-07-01

The feasibility of beryllium use as a blanket neutron multiplier, first wall and plasma-facing material has been studied for D-T burning experimental reactors such as ITER. Various experimental work on beryllium and its compounds will be performed under conditions of high temperature and high-energy particle exposure simulating fusion reactor conditions. Beryllium is known as a hazardous substance and its handling has been carefully controlled by various health and safety guidance documents and/or regulations in many countries. Japanese regulations for hazardous substances provide various guidelines on beryllium for the protection of industrial workers and the environment. This report was prepared for the safe handling of beryllium in laboratory-scale experiments for fusion technology R and D such as blanket development. The major items in this report are: (1) a brief review of guidance documents and regulations in the USA, UK and Japan; (2) safe handling and administration manuals at the beryllium facilities at INEL, LANL and JET; (3) a conceptual design study of a beryllium handling facility for small- to mid-scale blanket R and D; (4) data on beryllium toxicity, examples of clinical diagnosis of beryllium disease, and the environmental occurrence of beryllium; (5) personnel protection tools of the Japanese Industrial Standard for hazardous substances. (author) 61 refs

  4. Handling final storage of unreprocessed spent nuclear fuel

    International Nuclear Information System (INIS)

    1978-01-01

    The present second report from KBS describes how the safe final storage of spent unreprocessed nuclear fuel can be implemented. According to the Swedish Stipulation Law, the owner must specify in which form the waste is to be stored, how final storage is to be effected, how the waste is to be transported and all other aspects of fuel handling and storage which must be taken into consideration in judging whether the proposed final storage method can be considered to be absolutely safe and feasible. Thus, the description must go beyond general plans and sketches. The description is therefore relatively detailed, even concerning those parts which are less essential for evaluating the safety of the waste storage method. For those parts of the handling chain which are the same for both alternatives of the Stipulation Law, the reader is referred in some cases to the first report. Both of the alternatives of the Stipulation Law may be used in the future. Handling equipment and facilities for the two storage methods are so designed that a combination in the desired proportions is practically feasible. In this first part of the report are presented: premises and data, a description of the various steps of the handling procedure, a summary of dispersal processes and a safety analysis. (author)

  5. Quinone-induced protein handling changes: Implications for major protein handling systems in quinone-mediated toxicity

    International Nuclear Information System (INIS)

    Xiong, Rui; Siegel, David; Ross, David

    2014-01-01

Para-quinones such as 1,4-benzoquinone (BQ) and menadione (MD), and ortho-quinones, including the oxidation products of catecholamines, are derived from xenobiotics as well as endogenous molecules. The effects of quinones on the major protein handling systems in cells (the 20/26S proteasome, the ER stress response, autophagy, chaperone proteins and aggresome formation) have not been investigated in a systematic manner. Both BQ and aminochrome (AC) inhibited proteasomal activity and activated the ER stress response and autophagy in rat dopaminergic N27 cells. AC also induced aggresome formation while MD had little effect on any protein handling systems in N27 cells. The effect of NQO1 on quinone-induced protein handling changes and toxicity was examined using N27 cells stably transfected with NQO1 to generate an isogenic NQO1-overexpressing line. NQO1 protected against BQ-induced apoptosis but led to a potentiation of AC- and MD-induced apoptosis. Modulation of quinone-induced apoptosis in N27 and NQO1-overexpressing cells correlated only with changes in the ER stress response and not with changes in other protein handling systems. These data suggested that NQO1 modulated the ER stress response to potentiate the toxicity of AC and MD, but protected against BQ toxicity. We further demonstrated that NQO1-mediated reduction to unstable hydroquinones and subsequent redox cycling was important for the activation of the ER stress response and toxicity for both AC and MD. In summary, our data demonstrate that quinone-specific changes in protein handling are evident in N27 cells and the induction of the ER stress response is associated with quinone-mediated toxicity. - Highlights: • Unstable hydroquinones contributed to quinone-induced ER stress and toxicity

  6. Quinone-induced protein handling changes: Implications for major protein handling systems in quinone-mediated toxicity

    Energy Technology Data Exchange (ETDEWEB)

    Xiong, Rui; Siegel, David; Ross, David, E-mail: david.ross@ucdenver.edu

    2014-10-15

Para-quinones such as 1,4-benzoquinone (BQ) and menadione (MD), and ortho-quinones, including the oxidation products of catecholamines, are derived from xenobiotics as well as endogenous molecules. The effects of quinones on the major protein handling systems in cells (the 20/26S proteasome, the ER stress response, autophagy, chaperone proteins and aggresome formation) have not been investigated in a systematic manner. Both BQ and aminochrome (AC) inhibited proteasomal activity and activated the ER stress response and autophagy in rat dopaminergic N27 cells. AC also induced aggresome formation while MD had little effect on any protein handling systems in N27 cells. The effect of NQO1 on quinone-induced protein handling changes and toxicity was examined using N27 cells stably transfected with NQO1 to generate an isogenic NQO1-overexpressing line. NQO1 protected against BQ-induced apoptosis but led to a potentiation of AC- and MD-induced apoptosis. Modulation of quinone-induced apoptosis in N27 and NQO1-overexpressing cells correlated only with changes in the ER stress response and not with changes in other protein handling systems. These data suggested that NQO1 modulated the ER stress response to potentiate the toxicity of AC and MD, but protected against BQ toxicity. We further demonstrated that NQO1-mediated reduction to unstable hydroquinones and subsequent redox cycling was important for the activation of the ER stress response and toxicity for both AC and MD. In summary, our data demonstrate that quinone-specific changes in protein handling are evident in N27 cells and the induction of the ER stress response is associated with quinone-mediated toxicity. - Highlights: • Unstable hydroquinones contributed to quinone-induced ER stress and toxicity.

  7. Preoperational checkout of the remote-handled transuranic waste handling at the Waste Isolation Pilot Plant

    International Nuclear Information System (INIS)

    1987-09-01

    This plan describes the preoperational checkout for handling Remote-Handled Transuranic (RH-TRU) Wastes from their receipt at the Waste Isolation Pilot Plant (WIPP) to their emplacement underground. This plan identifies the handling operations to be performed, personnel groups responsible for executing these operations, and required equipment items. In addition, this plan describes the quality assurance that will be exercised throughout the checkout, and finally, it establishes criteria by which to measure the success of the checkout. 7 refs., 5 figs

  8. Radiological safety aspects of handling plutonium

    International Nuclear Information System (INIS)

    Sundararajan, A.R.

    2016-01-01

The Department of Atomic Energy, in its scheme of harnessing nuclear energy for electrical power generation and strategic applications, has given a huge role to the utilization of plutonium. In the power production programme, fast reactors with plutonium as fuel are expected to play a major role. This requires establishing fuel reprocessing plants to handle both thermal and fast reactor fuels. Thus, in nuclear fuel cycle facilities, a variety of chemical, metallurgical and mechanical operations have to be carried out involving significant inventories of ²³⁹Pu and associated radionuclides. Plutonium is the most radiotoxic radionuclide and therefore any facility handling it has to be designed and operated with utmost care. Two problems of major concern in the protection of persons working in plutonium handling facilities are the internal exposure of the operating personnel from uptake of plutonium and transplutonic nuclides, which are highly radiotoxic, and the radiation exposure of the hands and eye lens during fuel fabrication operations, especially while handling recycled high-burnup plutonium. Given that the annual limit on intake is very small for ²³⁹Pu and that its radiation emission characteristics are unfavourable for measurement, detecting Pu in air and in workers is a huge challenge for health physicists. This paper discusses the principles and practices followed in providing radiological surveillance to workers in plutonium handling areas. The challenges in protecting workers from exposures to the hands and eye lens while handling high-burnup plutonium are also discussed. Sites having Pu fuel cycle facilities should have trained medical staff to handle cases involving excessive intake of plutonium. (author)

  9. Full scale tests on remote handled FFTF fuel assembly waste handling and packaging

    International Nuclear Information System (INIS)

    Allen, C.R.; Cash, R.J.; Dawson, S.A.; Strode, J.N.

    1986-01-01

    Handling and packaging of remote handled, high activity solid waste fuel assembly hardware components from spent FFTF reactor fuel assemblies have been evaluated using full scale components. The demonstration was performed using FFTF fuel assembly components and simulated components which were handled remotely using electromechanical manipulators, shielding walls, master slave manipulators, specially designed grapples, and remote TV viewing. The testing and evaluation included handling, packaging for current and conceptual shipping containers, and the effects of volume reduction on packing efficiency and shielding requirements. Effects of waste segregation into transuranic (TRU) and non-transuranic fractions also are discussed

  10. MANU. Handling of bentonite prior buffer block manufacturing

    International Nuclear Information System (INIS)

    Laaksonen, R.

    2010-01-01

    The aim of this study is to describe the entire bentonite handling process, starting with freight from the harbour to the storage facility and ending with the filling of the bentonite block moulds. This work describes the bentonite handling prior to the process in which bentonite blocks are manufactured in great quantities. The work included a study of relevant, well-documented Nordic and international cases of storage, processing and techniques involving bentonite material. Information about storage and handling processes was collected from producers and re-sellers of bentonite, keeping in mind the requirements from the Posiva side. A limited experiment on the humidification of different material types was also made. This work includes a detailed description of the methods and equipment needed for bentonite storage and processing. Posiva Oy used Jauhetekniikka Oy as a consultant to prepare handling process flow charts for bentonite; Jauhetekniikka Oy also evaluated the content of this report. The handling of bentonite was based on the assumption that the bentonite processing work is done in one factory over 11 months of work time, with a weekly volume of around 41-45 tons. The storage space needed in this case is about 300 tons of bentonite, which equals about seven weeks of raw material consumption. This work concluded that several things must be carefully considered: sampling at various phases of the process, the air quality at the production/storage facilities (humidity and temperature), the level of automation/process control of the manufacturing process, and the means of producing/saving data from the different phases of the process. (orig.)
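
    The consumption arithmetic quoted above is easy to verify. A minimal sketch, using only the figures stated in the abstract (the variable names are ours):

```python
# Sanity check of the storage figures quoted above (illustrative arithmetic only).
weekly_low, weekly_high = 41, 45   # tons of bentonite consumed per week
storage = 300                      # tons of bentonite held in storage

# Weeks of consumption the storage covers, at the high and low usage rates.
weeks_low = storage / weekly_high
weeks_high = storage / weekly_low
print(f"storage covers {weeks_low:.1f} to {weeks_high:.1f} weeks of consumption")
```

At both ends of the stated usage range, the 300-ton store does indeed correspond to roughly seven weeks of raw material.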

  11. Email Adaptation for Conflict Handling

    DEFF Research Database (Denmark)

    Lee, Joyce Yi‐Hui; Panteli, Niki; Bülow, Anne Marie

    2018-01-01

    This paper explores the context of email-based communication in an established but fragile, inter-organisational partnership, which was often overlain with conflict. Drawing upon adaptation theory, this study explores how participants adapt to the use of email to handle conflict. Extensive data were obtained during a 6-month field study of a case of cross-border inter-organisational collaboration in East Asia. We observed that the individuals involved in the cross-border partnership used email as a lean form of communication to stop covert conflict from explicitly emerging. In contrast to prior research on the leanness of email in managing conflict, we found that under the described conflict situation the very leanness of email was appreciated and thus exploited by those concerned to manage the conflict situation. Specifically, we identified 4 key conflict-triggered adaptation strategies, namely...

  12. Quantifying the effect of editor-author relations on manuscript handling times.

    Science.gov (United States)

    Sarigöl, Emre; Garcia, David; Scholtes, Ingo; Schweitzer, Frank

    2017-01-01

    In this article we study to what extent the academic peer review process is influenced by social relations between the authors of a manuscript and the editor handling the manuscript. Taking the open access journal PlosOne as a case study, our analysis is based on a data set of more than 100,000 articles published between 2007 and 2015. Using available data on handling editor, submission and acceptance time of manuscripts, we study the question whether co-authorship relations between authors and the handling editor affect the manuscript handling time, i.e. the time taken between the submission and acceptance of a manuscript. Our analysis reveals (1) that editors handle papers co-authored by previous collaborators significantly more often than expected at random, and (2) that such prior co-author relations are significantly related to faster manuscript handling. Addressing the question whether these shorter manuscript handling times can be explained by the quality of publications, we study the number of citations and downloads which accepted papers eventually accumulate. Moreover, we consider the influence of additional (social) factors, such as the editor's experience, the topical similarity between authors and editors, as well as reciprocal citation relations between authors and editors. Our findings show that, even when correcting for other factors like time, experience, and performance, prior co-authorship relations have a large and significant influence on manuscript handling times, speeding up the editorial decision on average by 19 days.
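
    The study's central comparison, handling time with versus without a prior co-author relation, can be illustrated with a toy calculation. The handling times below are invented for illustration and are not taken from the PlosOne data set:

```python
from statistics import median

# Hypothetical (handling_time_days, prior_coauthor_relation) records --
# illustrative values only, not data from the study.
papers = [
    (45, True), (60, True), (52, True), (58, True),
    (70, False), (88, False), (75, False), (95, False),
]

with_rel = [t for t, rel in papers if rel]
without_rel = [t for t, rel in papers if not rel]

# Difference in median handling time between the two groups.
gap = median(without_rel) - median(with_rel)
print(f"median, prior co-authors:    {median(with_rel)} days")
print(f"median, no prior co-authors: {median(without_rel)} days")
print(f"gap: {gap} days")
```

The actual study controls for confounders (time, editor experience, performance) rather than comparing raw medians, but the quantity of interest is the same kind of group difference.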

  13. A computerized system for handling renal size measurements from urograms

    International Nuclear Information System (INIS)

    Claesson, I.; Jacobsson, B.F.; Riha, M.

    1987-01-01

    The size of a kidney, as measured on a urogram, is a sensitive indicator of renal damage in a child with urinary tract infection, and renal surface area correlates well with glomerular filtration rate. Sequential measurements can be invaluable in evaluating the efficacy of a regimen of treatment. A system utilizing a personal microcomputer has been developed to facilitate the measuring procedure and the handling and analysis of data. (orig.)

  14. Effective Teaching Practices in Handling Non Readers

    Directory of Open Access Journals (Sweden)

    Jacklyn S. Dacalos

    2016-08-01

    Full Text Available The study determined the effective teaching practices in handling nonreaders. It sought to meet the following objectives: describe the adjustments, effective strategies, and scaffolds utilized by teachers in handling nonreaders; differentiate the teachers' reading adjustments, strategies and scaffolds in teaching nonreaders; analyze the teaching reading efficiency of nonreaders using effective teaching reading strategies; and find significant correlation between nonreaders' grades and reading teachers' reading adjustments, strategies and scaffolds. This study utilized mixed methods of research. Case studies of five public school teachers were selected as primary subjects, who were interviewed on handling nonreaders in the areas of adjustments, strategies, and reading scaffolds. Actual teaching observation was conducted at each of the five subjects' most convenient time. In ascertaining the nonreaders' academic performance, the students' grades in the English subject were analyzed using a within-subject T-test design. Teaching nonreaders to read and understand the lesson better is an arduous act; yet, once done with effectiveness and passion, it yielded a great amount of learning success. Effective teaching practices in handling nonreaders comprised the use of teachers' adjustments, strategies, and scaffolds to establish reading mastery, exposing students to letter sounds, short stories, and the use of follow-up WH questions, which enhanced their reading performance significantly. Variations in reading teachers' nature (as an enabler, a facilitator, a humanist, a behaviorist, and an expert) with regard to their teaching practices were proven significant to students' reading effectiveness.
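
    The within-subject T-test mentioned above operates on paired before/after scores for the same students. A minimal sketch with hypothetical grades (the numbers are invented, not the study's data):

```python
import math
from statistics import mean, stdev

# Hypothetical pre- and post-intervention English grades for five nonreaders.
pre  = [68, 70, 65, 72, 66]
post = [75, 78, 70, 80, 74]

diffs = [b - a for a, b in zip(pre, post)]
n = len(diffs)

# Paired (within-subject) t statistic: mean difference over its standard error.
t_stat = mean(diffs) / (stdev(diffs) / math.sqrt(n))
print(f"mean improvement: {mean(diffs):.1f} points, t = {t_stat:.2f}, df = {n - 1}")
```

With df = n - 1, the resulting t statistic is compared against the critical value for the chosen significance level to decide whether the improvement is significant.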

  15. Thorium-applications and handling

    International Nuclear Information System (INIS)

    Reichelt, A.

    1993-01-01

    The most important aspects concerning the natural occurrence and extraction of thorium are presented. The topics covered are: natural isotopes, occurrence in minerals, thorium activity content of naturally occurring materials, the resulting radiation exposure, extraction of thorium from ores, and time-dependent activity after separation. The sources of radiation exposure due to thorium, caused by human activity, can be divided into two categories, namely, those in which thorium is deliberately added to (consumer) products in order to improve their usefulness, and those in which the thorium is present accidentally and unwanted due to the naturally occurring thorium in the material used in the manufacturing processes. Some examples of such products and substances will be presented and results about their specific thorium activity will be discussed. Experimental data from a currently running research programme will be presented, including results concerning the occupational radiation exposure due to phosphate fertilizers, thorium-impregnated gas mantles and the use of thoriated TIG electrodes in arc welding. (orig.) [de

  16. Development of tritium-handling technique

    International Nuclear Information System (INIS)

    Ohmura, Hiroshi; Hosaka, Akio; Okamoto, Takahumi

    1988-01-01

    An overview of IHI's development activities for tritium-handling techniques is presented. To establish a fusion power plant, tritium handling is one of the key technologies. Recently in JAERI, conceptual design of FER (Fusion Experimental Reactor) has been carried out, and the FER system requires a processing system for a large amount of tritium. IHI concentrates on the investigation of fuel gas purification, isotope separation and storage systems under contract with Toshiba Corporation. Design results for the systems and their components are reviewed. IHI has been developing fundamental handling techniques, namely the ZrNi bed for hydrogen isotope storage and isotope separation by laser. A ZrNi bed with a tritium storage capacity of 1000 Ci has been constructed, and recovery of the hydrogen isotope down to 10⁻⁴ Torr (0.013 Pa) was confirmed. In laser isotope separation, the optimum laser wavelength has been determined. (author)

  17. Automated system for handling tritiated mixed waste

    International Nuclear Information System (INIS)

    Dennison, D.K.; Merrill, R.D.; Reitz, T.C.

    1995-03-01

    Lawrence Livermore National Laboratory (LLNL) is developing a semi-automated system for handling, characterizing, processing, sorting, and repackaging hazardous wastes containing tritium. The system combines an IBM-developed gantry robot with a special glove box enclosure designed to protect operators and minimize the potential release of tritium to the atmosphere. All hazardous waste handling and processing will be performed remotely, using the robot in a teleoperational mode for one-of-a-kind functions and in an autonomous mode for repetitive operations. Initially, this system will be used in conjunction with a portable gas system designed to capture any gaseous-phase tritium released into the glove box. This paper presents the objectives of this development program, provides background related to LLNL's robotics and waste handling program, describes the major system components, outlines system operation, and discusses current status and plans

  18. DOE handbook: Tritium handling and safe storage

    International Nuclear Information System (INIS)

    1999-03-01

    The DOE Handbook was developed as an educational supplement and reference for operations and maintenance personnel. Most of the tritium publications are written from a radiological protection perspective. This handbook provides more extensive guidance and advice on the full range of tritium operations. This handbook can be used by personnel involved in the full range of tritium handling from receipt to ultimate disposal. Compliance issues are addressed at each stage of handling. This handbook can also be used as a reference for those individuals involved in real-time determination of bounding doses resulting from inadvertent tritium releases. This handbook provides useful information for establishing processes and procedures for the receipt, storage, assay, handling, packaging, and shipping of tritium and tritiated wastes. It includes discussions and advice on compliance-based issues and adds insight to those areas that currently possess unclear DOE guidance

  19. DOE handbook: Tritium handling and safe storage

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    The DOE Handbook was developed as an educational supplement and reference for operations and maintenance personnel. Most of the tritium publications are written from a radiological protection perspective. This handbook provides more extensive guidance and advice on the full range of tritium operations. This handbook can be used by personnel involved in the full range of tritium handling from receipt to ultimate disposal. Compliance issues are addressed at each stage of handling. This handbook can also be used as a reference for those individuals involved in real-time determination of bounding doses resulting from inadvertent tritium releases. This handbook provides useful information for establishing processes and procedures for the receipt, storage, assay, handling, packaging, and shipping of tritium and tritiated wastes. It includes discussions and advice on compliance-based issues and adds insight to those areas that currently possess unclear DOE guidance.

  20. MHSS: a material handling system simulator

    Energy Technology Data Exchange (ETDEWEB)

    Pomernacki, L.; Hollstien, R.B.

    1976-04-07

    A Material Handling System Simulator (MHSS) program is described that provides specialized functional blocks for modeling and simulation of nuclear material handling systems. Models of nuclear fuel fabrication plants may be built using functional blocks that simulate material receiving, storage, transport, inventory, processing, and shipping operations as well as the control and reporting tasks of operators or on-line computers. Blocks are also provided that allow the user to observe and gather statistical information on the dynamic behavior of simulated plants over single or replicated runs. Although it is currently being developed for the nuclear materials handling application, MHSS can be adapted to other industries in which material accountability is important. In this paper, emphasis is on the simulation methodology of the MHSS program with application to the nuclear material safeguards problem. (auth)

  1. Fuel handling grapple for nuclear reactor plants

    International Nuclear Information System (INIS)

    Rousar, D.L.

    1992-01-01

    This patent describes a fuel handling system for nuclear reactor plants. It comprises: a reactor vessel having an openable top and removable cover and containing therein, submerged in water substantially filling the reactor vessel, a fuel core including a multiplicity of fuel bundles formed of groups of sealed tube elements enclosing fissionable fuel assembled into units, the fuel handling system consisting essentially of the combination of: a fuel bundle handling platform movable over the open top of the reactor vessel; a fuel bundle handling mast extendable downward from the platform with a lower end projecting into the open-top reactor vessel to the fuel core submerged in water; a grapple head mounted on the lower end of the mast provided with grapple means comprising complementary hooks which pivot inward toward each other to securely grasp a bail handle of a nuclear reactor fuel bundle and pivot backward away from each other to release a bail handle; the grapple means having a hollow cylindrical support shaft fixed within the grapple head with hollow cylindrical sleeves rotatably mounted and fixed in longitudinal axial position on the support shaft and each sleeve having complementary hooks secured thereto whereby each hook pivots with the rotation of the sleeve secured thereto; and the hollow cylindrical support shaft being provided with complementary orifices on opposite sides of its hollow cylinder and intermediate to the sleeves mounted thereon whereby the orifices on both sides of the hollow cylindrical support shaft are vertically aligned, providing a direct in-line optical viewing path downward therethrough so that a remote operator positioned above the grapple means can observe from overhead the area immediately below the grapple hooks

  2. Handling of biological specimens for electron microscopy

    International Nuclear Information System (INIS)

    Bullock, G.

    1987-01-01

    There are many different aspects of specimen preparation procedure which need to be considered in order to achieve good results. Whether using the scanning or transmission microscope, the initial handling procedures are very similar and are selected for the information required. Handling procedures and techniques described are: structural preservation; immuno- and histo-chemistry; x-ray microanalysis and autoradiography; dehydration and embedding; mounting and coating specimens for scanning electron microscopy; and sectioning of resin-embedded material. With attention to detail and careful choice of the best available technique, excellent results should be obtainable whatever the specimen. 6 refs

  3. Human factors issues in fuel handling

    Energy Technology Data Exchange (ETDEWEB)

    Beattie, J D; Iwasa-Madge, K M; Tucker, D A [Humansystems Inc., Milton, ON (Canada)

    1994-12-31

    The staff of the Atomic Energy Control Board wish to further their understanding of human factors issues of potential concern associated with fuel handling in CANDU nuclear power stations. This study contributes to that objective by analysing the role of human performance in the overall fuel handling process at Ontario Hydro's Darlington Nuclear Generating Station, and reporting findings in several areas. A number of issues are identified in the areas of design, operating and maintenance practices, and the organizational and management environment. 1 fig., 4 tabs., 19 refs.

  4. Safe handling of plutonium: a panel report

    Energy Technology Data Exchange (ETDEWEB)

    1974-01-01

    This guide results from a meeting of a Panel of Experts held by the International Atomic Energy Agency on 8 to 12 November 1971. It is directed to workers in research laboratories handling plutonium in gram amounts. Contents: aspects of the physical and chemical properties of plutonium; metabolic features of plutonium; facility design features for safe handling of plutonium (layout of facility, working zones, decontamination room, etc.); glove boxes; health surveillance (surveillance of environment and supervision of workers); emergencies; organization. Annexes: types of glove boxes; tables; mobile α air sampler; aerosol monitor; bio-assay limits of detection; examples of contamination control monitors.

  5. Apparatus for handling control rod drives

    International Nuclear Information System (INIS)

    Akimoto, A.; Watanabe, M.; Yoshida, T.; Sugaya, Z.; Saito, T.; Ishii, Y.

    1979-01-01

    An apparatus is described for handling control rod drives (CRD's) attached by detachable fixing means to housings mounted in a reactor pressure vessel, each CRD coupled to one of the control rods inserted in the reactor pressure vessel. The apparatus for handling the CRD's comprises cylindrical housing means, uncoupling means mounted in the housing means for uncoupling each of the control rods from the respective CRD, means mounted on the housing means for effecting attaching and detaching of the fixing means, means for supporting the housing means, and means for moving the support means longitudinally of the CRD

  6. Control panel handling of a nuclear simulator

    International Nuclear Information System (INIS)

    Martin Polo, F.; Jimenez Fraustro, L.A.; Banuelos Galindo, A.; Diamant Rubinstein, A.

    1985-01-01

    The handling of the control panels of a nuclear simulator for operator training is described. The control panels are handled by a set of intelligent controllers, each with at least two processors (an 8035 communications controller and an 8085 master processor). The controllers are connected to the main computers (two dual-processor Gould Concept 32/6780 and a single-processor Gould Concept 32/6705) via serial asynchronous channels in a multidrop, star-like architecture. The controllers transmit to the main computers only the changes detected and receive the whole set of output variables as computed by the mathematical models of the nuclear plant

  7. Notification: Audit of Security Categorization for EPA Systems That Handle Hazardous Material Information

    Science.gov (United States)

    Project #OA-FY18-0089, January 8, 2018. The OIG plans to begin preliminary research to determine whether the EPA classified the sensitivity of data for systems that handle hazardous waste material information as prescribed by NIST.

  8. Concept Analysis of Occupational Therapy Handling in the Children with Cerebral Palsy: A Hybrid Model

    Directory of Open Access Journals (Sweden)

    Hamid Dalvand

    2015-07-01

    Full Text Available Objective: This study aimed to analyze the concept of occupational therapy handling in children with cerebral palsy from the perspective of occupational therapy instructors and clinicians in Iran. Materials & Methods: In this qualitative study, a hybrid model was used to clarify the concept of handling through three phases. For the theoretical phase, attributes of handling were recognized through a review of the literature (until February 2014), and six in-depth semi-structured interviews, two observations and one panel of experts were conducted for the fieldwork phase to develop attributes from the data and to verify those identified in the literature review. In the third phase, attributes and the final analysis of handling were extracted from the first and second phases. Results: The results were classified into five main categories: (1) care of the child, (2) management of treatment, (3) manual techniques, (4) education in activities of daily living (ADL), and (5) lifting and carrying. Core attributes of handling include "control, safety, transfer and positioning". Conclusion: The results of this study may help clarify the concept of handling in children with CP. In addition, by identifying the process, the barriers and facilitative factors, and the concept of handling, occupational therapy instructors and therapists will be able to design and run their educational activities based on scientific findings, which can provide them with the necessary conditions for education, learning and proper execution of handling in occupational therapy.

  9. A pilot modeling technique for handling-qualities research

    Science.gov (United States)

    Hess, R. A.

    1980-01-01

    A brief survey of the more dominant analysis techniques used in closed-loop handling-qualities research is presented. These techniques are shown to rely on so-called classical and modern analytical models of the human pilot which have their foundation in the analysis and design principles of feedback control. The optimal control model of the human pilot is discussed in some detail and a novel approach to the a priori selection of pertinent model parameters is discussed. Frequency domain and tracking performance data from 10 pilot-in-the-loop simulation experiments involving 3 different tasks are used to demonstrate the parameter selection technique. Finally, the utility of this modeling approach in handling-qualities research is discussed.

  10. Remotely-operated equipment for inspection, measurement and handling

    CERN Document Server

    Bertone, C; CERN. Geneva. TS Department

    2008-01-01

    As part of the application of ALARA radiation dose reduction principles at CERN, inspection, measurement and handling interventions in controlled areas are being studied in detail. A number of activities which could be carried out as remote operations have already been identified and equipment is being developed. Example applications include visual inspection to check for ice formation on LHC components or water leaks, measurement of radiation levels before allowing personnel access, measurement of collimator or magnet alignment, visual inspection or measurements before fire service access in the event of fire, gas leak or oxygen deficiency. For these applications, a modular monorail train, TIM, has been developed with inspection and measurement wagons. In addition TIM provides traction, power and data communication for lifting and handling units such as the remote collimator exchange module and vision for other remotely operated units such as the TAN detector exchange mini-cranes. This paper describes the eq...

  11. Design of systems for handling radioactive ion exchange resin beads

    International Nuclear Information System (INIS)

    Shapiro, S.A.; Story, G.L.

    1979-01-01

    The flow of slurries in pipes is a complex phenomenon. Few slurry data are available on which to base the design of systems for radioactive ion exchange resin beads and, as a result, the designs vary markedly in operating plants. With several plants on-line, the opportunity now exists to evaluate the designs of systems handling high-activity spent resin beads. Results of testing at the Robbins and Meyers Pump Division to quantify the behavior of resin bead slurries are presented. These tests evaluated the following slurry parameters: resin slurry velocity, pressure drop, bead degradation, and slurry concentration effects. A discussion of the general characteristics of resin bead slurries is presented, along with a correlation to enable the designer to establish the proper flowrate for a given slurry composition and flow regime as a function of line size. Guidelines to follow in designing a resin handling system are presented

  12. Recommendations for cask features for robotic handling from the Advanced Handling Technology Project

    International Nuclear Information System (INIS)

    Drotning, W.

    1991-02-01

    This report describes the current status and recent progress in the Advanced Handling Technology Project (AHTP) initiated to explore the use of advanced robotic systems and handling technologies to perform automated cask handling operations at radioactive waste handling facilities, and to provide guidance to cask designers on the impact of robotic handling on cask design. Current AHTP tasks have developed system mock-ups to investigate robotic manipulation of impact limiters and cask tiedowns. In addition, cask uprighting and transport, using computer control of a bridge crane and robot, were performed to demonstrate the high speed cask transport operation possible under computer control. All of the current AHTP tasks involving manipulation of impact limiters and tiedowns require robotic operations using a torque wrench. To perform these operations, a pneumatic torque wrench and control system were integrated into the tool suite and control architecture of the gantry robot. The use of captured fasteners is briefly discussed as an area where alternative cask design preferences have resulted from the influence of guidance for robotic handling vs traditional operations experience. Specific robotic handling experiences with these system mock-ups highlight a number of continually recurring design principles: (1) robotic handling feasibility is improved by mechanical designs which emphasize operation with limited dexterity in constrained workspaces; (2) clearances, tolerances, and chamfers must allow for operations under actual conditions with consideration for misalignment and imprecise fixturing; (3) successful robotic handling is enhanced by including design detail in representations for model-based control; (4) robotic handling and overall quality assurance are improved by designs which eliminate the use of loose, disassembled parts. 8 refs., 15 figs

  13. Handling Uncertainty in Accessing Petroleum Exploration Data Traitement de l'incertain dans l'accès aux données d'exploration pétrolière

    Directory of Open Access Journals (Sweden)

    Chung P. W. H.

    2006-11-01

    Full Text Available This paper discusses the role of uncertainty in accessing petroleum exploration databases. Two distinct forms of uncertainty are identified: the uncertainty in the user's requirements, and the uncertainty in the data held. A general mechanism is described which is applicable to both.

  14. Comparison of Imputation Methods for Handling Missing Categorical Data with Univariate Pattern|| Una comparación de métodos de imputación de variables categóricas con patrón univariado

    Directory of Open Access Journals (Sweden)

    Torres Munguía, Juan Armando

    2014-06-01

    Full Text Available This paper examines the estimation of sample proportions in the presence of univariate missing categorical data. A database about smoking habits (2011 National Addiction Survey of Mexico) was used to create simulated yet realistic datasets at rates of 5% and 15% missingness, each for the MCAR, MAR and MNAR mechanisms. The performance of six methods for addressing missingness was then evaluated: listwise deletion, mode imputation, random imputation, hot-deck, imputation by polytomous regression, and random forests. Results showed that the most effective methods for dealing with missing categorical data, in most of the scenarios assessed in this paper, were the hot-deck and polytomous regression approaches.
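
    Two of the compared methods, mode imputation and random hot-deck, are short enough to sketch directly. The column below is invented for illustration and is not drawn from the 2011 survey:

```python
import random
from collections import Counter

random.seed(0)  # reproducible hot-deck draws

# Hypothetical categorical column with missing values encoded as None.
responses = ["yes", "no", "no", None, "yes", "no", None, "no", "yes", "no"]
observed = [r for r in responses if r is not None]

# Mode imputation: every missing value becomes the most frequent category.
mode = Counter(observed).most_common(1)[0][0]
mode_imputed = [r if r is not None else mode for r in responses]

# Random hot-deck: each missing value is filled with a donor value drawn
# at random from the observed cases, preserving category variability.
hotdeck_imputed = [r if r is not None else random.choice(observed) for r in responses]

print("mode imputation:", mode_imputed)
print("hot-deck       :", hotdeck_imputed)
```

Mode imputation shrinks the variability of the column toward the majority category, which is one reason donor-based methods such as hot-deck tend to perform better for estimating proportions.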

  15. A multi-component patient-handling intervention improves attitudes and behaviors for safe patient handling and reduces aggression experienced by nursing staff

    DEFF Research Database (Denmark)

    Risør, Bettina Wulff; Casper, Sven Dalgas; Andersen, Lars L.

    2017-01-01

    This study evaluated an intervention for patient-handling equipment aimed to improve nursing staff's use of patient-handling equipment and their general health, and to reduce musculoskeletal problems, aggressive episodes, days of absence and work-related accidents. As a controlled before-after study, questionnaire data were collected at baseline and 12-month follow-up among nursing staff at intervention and control wards at two hospitals. At 12-month follow-up, the intervention group had more positive attitudes towards patient-handling equipment and increased use of specific patient-handling equipment. In addition, a lower proportion of nursing staff in the intervention group had experienced physically aggressive episodes. No significant change was observed in general health status, musculoskeletal problems, days of absence or work-related accidents. The intervention resulted in more positive...

  16. Confluence Modulo Equivalence in Constraint Handling Rules

    DEFF Research Database (Denmark)

    Christiansen, Henning; Kirkeby, Maja Hanne

    2015-01-01

    Previous results on confluence for Constraint Handling Rules, CHR, are generalized to take into account user-defined state equivalence relations. This allows a much larger class of programs to enjoy the advantages of confluence, which include various optimization techniques and simplified...

  17. 9 CFR 3.142 - Handling.

    Science.gov (United States)

    2010-01-01

    ... Warmblooded Animals Other Than Dogs, Cats, Rabbits, Hamsters, Guinea Pigs, Nonhuman Primates, and Marine... 9 Animals and Animal Products 1 2010-01-01 2010-01-01 false Handling. 3.142 Section 3.142 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL...

  18. 340 Waste Handling Facility interim safety basis

    International Nuclear Information System (INIS)

    Bendixsen, R.B.

    1995-01-01

    This document establishes the interim safety basis (ISB) for the 340 Waste Handling Facility (340 Facility). An ISB is a documented safety basis that provides a justification for the continued operation of the facility until an upgraded final safety analysis report is prepared that complies with US Department of Energy (DOE) Order 5480.23, Nuclear Safety Analysis Reports. The ISB for the 340 Facility documents the current design and operation of the facility. The 340 Facility ISB (ISB-003) is based on a facility walkdown and review of the design and operation of the facility, as described in the existing safety documentation. The safety documents reviewed, to develop ISB-003, include the following: OSD-SW-153-0001, Operating Specification Document for the 340 Waste Handling Facility (WHC 1990); OSR-SW-152-00003, Operating Limits for the 340 Waste Handling Facility (WHC 1989); SD-RE-SAP-013, Safety Analysis Report for Packaging, Railroad Liquid Waste Tank Cars (Mercado 1993); SD-WM-TM-001, Safety Assessment Document for the 340 Waste Handling Facility (Berneski 1994a); SD-WM-SEL-016, 340 Facility Safety Equipment List (Berneski 1992); and 340 Complex Fire Hazard Analysis, Draft (Hughes Assoc. Inc. 1994)

  19. 7 CFR 985.8 - Handle.

    Science.gov (United States)

    2010-01-01

    ... the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing Agreements and Orders; Fruits, Vegetables, Nuts), DEPARTMENT OF AGRICULTURE MARKETING ORDER REGULATING THE HANDLING OF...: Provided, That: (a) The preparation for market of salable oil by producers who are not dealers or users, (b...

  20. Handling system for nuclear fuel pellet inspection

    International Nuclear Information System (INIS)

    Nyman, D.H.; McLemore, D.R.; Sturges, R.H.

    1978-11-01

    HEDL is developing automated fabrication equipment for fast reactor fuel. A major inspection operation in the process is the gaging of fuel pellets. A key element in the system has been the development of a handling system that reliably moves pellets at the rate of three per second without product damage or excessive equipment wear

  1. Combating wear in bulk solids handling plants

    Energy Technology Data Exchange (ETDEWEB)

    1986-01-01

    A total of five papers were presented at a seminar on problems of wear caused by the abrasive effects of materials in bulk handling. The papers cover the designer's viewpoint and practical experience from the steel, coal, cement and quarry industries, to create an awareness of possible solutions.

  2. Emergency handling of radiation accident cases: firemen

    International Nuclear Information System (INIS)

    Procedures for the emergency handling of persons exposed to radiation or radioactive contamination are presented, with emphasis on information needed by firemen. The types of radiation accident patients that may be encountered are described and procedures for first aid, for preventing the spread of radioactive contamination, and for reporting the accident are outlined

  3. 340 waste handling facility interim safety basis

    Energy Technology Data Exchange (ETDEWEB)

    VAIL, T.S.

    1999-04-01

    This document presents an interim safety basis for the 340 Waste Handling Facility classifying the 340 Facility as a Hazard Category 3 facility. The hazard analysis quantifies the operating safety envelope for this facility and demonstrates that the facility can be operated without a significant threat to onsite or offsite people.

  4. 7 CFR 981.16 - To handle.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 8 2010-01-01 2010-01-01 false To handle. 981.16 Section 981.16 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing Agreements... in any other way to put almonds grown in the area of production into any channel of trade for human...

  5. Ergonomics intervention in manual handling of oxygen

    Directory of Open Access Journals (Sweden)

    M Motamedzadeh

    2013-05-01

    Conclusion: With the implementation of the ergonomic intervention in the casting unit, the risk of exposure to musculoskeletal disorders caused by manual handling of oxygen cylinders was eliminated, and the safety of employees against the risk of cylinder explosion was improved compared with before the intervention.

  6. PREPD O and VE remote handling system

    International Nuclear Information System (INIS)

    Theil, T.N.

    1985-01-01

    The Process Experimental Pilot Plant (PREPP) at the Idaho National Engineering Laboratory is designed for volume reduction and packaging of transuranic (TRU) waste. The PREPP opening and verification enclosure (O and VE) remote handling system, within that facility, is designed to provide examination of the contents of various TRU waste storage containers. This remote handling system will provide the means of performing a hazardous operation that is currently performed manually. The TeleRobot to be used in this system is a concept that will incorporate and develop man-in-the-loop operation (manual mode), standardized automatic sequencing of end-effector tools, increased payload and reach over currently available computer-controlled robots, and remote handling of a hazardous waste operation. The system is designed within limited space constraints, for an operation that is currently performed manually at other plants. The PREPP O and VE remote handling system design incorporates advancing technology to improve the working environment in the nuclear field

  7. Intertextuality for Handling Complex Environmental Issues

    Science.gov (United States)

    Byhring, Anne Kristine; Knain, Erik

    2016-01-01

    Nowhere is the need for handling complexity more pertinent than in addressing environmental issues. Our study explores students' situated constructs of complexity in unfolding discourses on socio-scientific issues. Students' dialogues in two group-work episodes are analysed in detail, with tools from Systemic Functional Linguistics. We identify…

  8. 7 CFR 996.4 - Handle.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 8 2010-01-01 2010-01-01 false Handle. 996.4 Section 996.4 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing Agreements and... consumption channels of commerce: Provided, That this term does not include sales or deliveries of peanuts by...

  9. 7 CFR 982.7 - To handle.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 8 2010-01-01 2010-01-01 false To handle. 982.7 Section 982.7 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing Agreements and... hazelnuts, inshell or shelled, into the channels of trade either within the area of production or from such...

  10. Exploring Reflective Means to Handle Plagiarism

    Science.gov (United States)

    Dalal, Nikunj

    2016-01-01

    Plagiarism has become widespread in the university teaching environment. This article presents practical wisdom from several years of experience handling plagiarism in two Information Systems (IS) courses with the exploratory use of reflective means such as dialogues and essays. There has been very little work on the use of reflective approaches…

  11. Remote technologies for handling spent fuel

    International Nuclear Information System (INIS)

    Ramakumar, M.S.

    1999-01-01

    The nuclear programme in India involves building and operating power and research reactors, production and use of isotopes, fabrication of reactor fuel, reprocessing of irradiated fuel, recovery of plutonium and uranium-233, fabrication of fuel containing plutonium-239, uranium-233, post-irradiation examination of fuel and hardware and handling solid and liquid radioactive wastes. Fuel that could be termed 'spent' in thermal reactors is a source for second generation fuel (plutonium and uranium-233). Therefore, it is only logical to extend remote techniques beyond handling fuel from thermal reactors to fuel from fast reactors, post-irradiation examination etc. Fabrication of fuel containing plutonium and uranium-233 poses challenges in view of restriction on human exposure to radiation. Hence, automation will serve as a step towards remotisation. Automated systems, both rigid and flexible (using robots) need to be developed and implemented. Accounting of fissile material handled by robots in local area networks with appropriate access codes will be possible. While dealing with all these activities, it is essential to pay attention to maintenance and repair of the facilities. Remote techniques are essential here. There are a number of commonalities in these requirements and so development of modularized subsystems, and integration of different configurations should receive attention. On a long-term basis, activities like decontamination, decommissioning of facilities and handling of waste generated have to be addressed. While robotized remote systems have to be designed for existing facilities, future designs of facilities should take into account total operation with robotic remote systems. (author)

  12. 21 CFR 820.140 - Handling.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Handling. 820.140 Section 820.140 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES... manufacturer shall establish and maintain procedures to ensure that mixups, damage, deterioration...

  13. Handling and disposing of radioactive waste

    International Nuclear Information System (INIS)

    Trauger, D.B.

    1983-01-01

    Radioactive waste has been separated by definition into six categories. These are: commercial spent fuel; high-level wastes; transuranium waste; low-level wastes; decommissioning and decontamination wastes; and mill tailings and mine wastes. Handling and disposing of these various types of radioactive wastes are discussed briefly

  14. Biodiesel Handling and Use Guide (Fifth Edition)

    Energy Technology Data Exchange (ETDEWEB)

    Alleman, T.L.; McCormick, R.L.; Christensen, E.D.; Fioroni, G.; Moriarty, K.; Yanowitz, J.

    2016-11-08

    This document is a guide for those who blend, distribute, and use biodiesel and biodiesel blends. It provides basic information on the proper and safe use of biodiesel and biodiesel blends in engines and boilers, and is intended to help fleets, individual users, blenders, distributors, and those involved in related activities understand procedures for handling and using biodiesel fuels.

  15. Instrumentation to handle thermal polarized neutron beams

    NARCIS (Netherlands)

    Kraan, W.H.

    2004-01-01

    In this thesis we investigate devices needed to handle the polarization of thermal neutron beams: π/2-flippers (to start/stop Larmor precession) and π-flippers (to reverse polarization/precession direction) and illustrate how these devices are used to investigate the properties of matter and of the

  16. Confluence Modulo Equivalence in Constraint Handling Rules

    DEFF Research Database (Denmark)

    Christiansen, Henning; Kirkeby, Maja Hanne

    2014-01-01

    Previous results on confluence for Constraint Handling Rules, CHR, are generalized to take into account user-defined state equivalence relations. This allows a much larger class of programs to enjoy the advantages of confluence, which include various optimization techniques and simplified...

  17. 340 waste handling facility interim safety basis

    International Nuclear Information System (INIS)

    VAIL, T.S.

    1999-01-01

    This document presents an interim safety basis for the 340 Waste Handling Facility classifying the 340 Facility as a Hazard Category 3 facility. The hazard analysis quantifies the operating safety envelope for this facility and demonstrates that the facility can be operated without a significant threat to onsite or offsite people

  18. Remote-handled transuranic system assessment appendices. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-11-01

    Volume 2 of this report contains six appendices to the report: Inventory and generation of remote-handled transuranic waste; Remote-handled transuranic waste site storage; Characterization of remote-handled transuranic waste; RH-TRU waste treatment alternatives system analysis; Packaging and transportation study; and Remote-handled transuranic waste disposal alternatives.

  19. Remote-handled transuranic system assessment appendices. Volume 2

    International Nuclear Information System (INIS)

    1995-11-01

    Volume 2 of this report contains six appendices to the report: Inventory and generation of remote-handled transuranic waste; Remote-handled transuranic waste site storage; Characterization of remote-handled transuranic waste; RH-TRU waste treatment alternatives system analysis; Packaging and transportation study; and Remote-handled transuranic waste disposal alternatives

  20. Regulating nanomedicine - can the FDA handle it?

    Science.gov (United States)

    Bawa, Raj

    2011-05-01

    There is enormous excitement and expectation surrounding the multidisciplinary field of nanomedicine - the application of nanotechnology to healthcare - which is already influencing the pharmaceutical industry. This is especially true in the design, formulation and delivery of therapeutics. Currently, nanomedicine is poised at a critical stage. However, regulatory guidance in this area is generally lacking and critically needed to provide clarity and legal certainty to manufacturers, policymakers and healthcare providers, as well as the public. There are hundreds, if not thousands, of nanoproducts on the market for human use, but little is known of their health risks, safety data and toxicity profiles. Even less is known of nanoproducts that are released into the environment and come into contact with humans. These nanoproducts, whether they are a drug, device, biologic or a combination of any of these, are creating challenges for the Food and Drug Administration (FDA), as regulators struggle to accumulate data and formulate testing criteria to ensure the development of safe and efficacious nanoproducts (products incorporating nanoscale technologies). Evidence continues to mount that many nanoproducts inherently possess novel size-based properties and toxicity profiles. Yet this scientific fact has been generally ignored by the FDA, and the agency continues to adopt a precautionary approach to the issue in hopes of countering future potential negative public opinion. As a result, the FDA has simply maintained the status quo with regard to its regulatory policies pertaining to nanomedicine. Therefore, there are no specific laws or mechanisms in place for oversight of nanomedicine, and the FDA continues to treat nanoproducts as substantially equivalent ("bioequivalent") to their bulk counterparts. So, for now, nanoproducts submitted for FDA review will continue to be subjected to an uncertain regulatory pathway.
Such regulatory uncertainty could negatively impact venture funding, stifle

  1. Airborne nanoparticle exposures associated with the manual handling of nanoalumina and nanosilver in fume hoods

    International Nuclear Information System (INIS)

    Tsai, Su-Jung; Ada, Earl; Isaacs, Jacqueline A.; Ellenbecker, Michael J.

    2009-01-01

    Manual handling of nanoparticles is a fundamental task of most nanomaterial research; such handling may expose workers to ultrafine or nanoparticles. Recent studies confirm that exposures to ultrafine or nanoparticles produce adverse inflammatory responses in rodent lungs and such particles may translocate to other areas of the body, including the brain. An important method for protecting workers handling nanoparticles from exposure to airborne nanoparticles is the laboratory fume hood. Such hoods rely on the proper face velocity for optimum performance. In addition, several other hood design and operating factors can affect worker exposure. Handling experiments were performed to measure airborne particle concentration while handling nanoparticles in three fume hoods located in different buildings under a range of operating conditions. Nanoalumina and nanosilver were selected to perform handling experiments in the fume hoods. Air samples were also collected on polycarbonate membrane filters and particles were characterized by scanning electron microscopy. Handling tasks included transferring particles from beaker to beaker by spatula and by pouring. Measurement locations were the room background, the researcher's breathing zone and upstream and downstream from the handling location. Variable factors studied included hood design, transfer method, face velocity/sash location and material types. Airborne particle concentrations measured at breathing zone locations were analyzed to characterize exposure level. Statistics were used to test the correlation between data. The test results found that the handling of dry powders consisting of nano-sized particles inside laboratory fume hoods can result in a significant release of airborne nanoparticles from the fume hood into the laboratory environment and the researcher's breathing zone. Many variables were found to affect the extent of particle release including hood design, hood operation (sash height, face velocity

  2. The Influence Of Customer Handling On Brand Image In Building Customer Loyalty

    Directory of Open Access Journals (Sweden)

    Ryan Kurniawan

    2015-08-01

    Full Text Available Complaint handling influences brand image, which in turn influences future customer loyalty. This research aims to find out how complaint handling influences brand image in building customer loyalty, using Indomaret Minimarket as a case study. It also examines how complaint handling is conducted by Indomaret Minimarket, the brand image of its service, and the loyalty of its customers. The research uses a questionnaire as the data-collection instrument; the analysis is descriptive and causal. The sample comprises 165 respondents selected by purposive sampling, and 33 indicators are analysed with Structural Equation Modelling (SEM). The results show that the complaint handling conducted by Indomaret Minimarket has been good, although the speed of complaint handling is rated as poor; the brand image and loyalty are good enough. Complaint handling and brand image together explain 32.7% of customer loyalty. Complaint handling has a significant influence on customer satisfaction but no direct influence on customer loyalty, whereas brand image significantly influences customer loyalty; complaint handling thus affects loyalty indirectly, through customer satisfaction. In addition, an integrated system, standardized compensation, and regular review of issues are needed to improve complaint handling, which can affect customer loyalty through brand image.

  3. Airborne nanoparticle exposures associated with the manual handling of nanoalumina and nanosilver in fume hoods

    Energy Technology Data Exchange (ETDEWEB)

    Tsai, Su-Jung, E-mail: candace.umass@gmail.com; Ada, Earl [University of Massachusetts Lowell, NSF Center for High-rate Nanomanufacturing (CHN) (United States); Isaacs, Jacqueline A. [Northeastern University, NSF Center for High-rate Nanomanufacturing (CHN) (United States); Ellenbecker, Michael J. [University of Massachusetts Lowell, NSF Center for High-rate Nanomanufacturing (CHN) (United States)

    2009-01-15

    Manual handling of nanoparticles is a fundamental task of most nanomaterial research; such handling may expose workers to ultrafine or nanoparticles. Recent studies confirm that exposures to ultrafine or nanoparticles produce adverse inflammatory responses in rodent lungs and such particles may translocate to other areas of the body, including the brain. An important method for protecting workers handling nanoparticles from exposure to airborne nanoparticles is the laboratory fume hood. Such hoods rely on the proper face velocity for optimum performance. In addition, several other hood design and operating factors can affect worker exposure. Handling experiments were performed to measure airborne particle concentration while handling nanoparticles in three fume hoods located in different buildings under a range of operating conditions. Nanoalumina and nanosilver were selected to perform handling experiments in the fume hoods. Air samples were also collected on polycarbonate membrane filters and particles were characterized by scanning electron microscopy. Handling tasks included transferring particles from beaker to beaker by spatula and by pouring. Measurement locations were the room background, the researcher's breathing zone and upstream and downstream from the handling location. Variable factors studied included hood design, transfer method, face velocity/sash location and material types. Airborne particle concentrations measured at breathing zone locations were analyzed to characterize exposure level. Statistics were used to test the correlation between data. The test results found that the handling of dry powders consisting of nano-sized particles inside laboratory fume hoods can result in a significant release of airborne nanoparticles from the fume hood into the laboratory environment and the researcher's breathing zone. Many variables were found to affect the extent of particle release including hood design, hood operation (sash height, face

  4. Performance Evaluation and Suggestion of Upgraded Fuel Handling Equipment for Operating OPR1000

    International Nuclear Information System (INIS)

    Chang, Sang Gyoon; Hwang, Jeung Ki; Choi, Taek Sang; Na, Eun Seok; Lee, Myung Lyul; Baek, Seung Jin; Kim, Man Su; Kunik, Jack

    2011-01-01

    The purpose of this study is to evaluate the performance of upgraded FHE (Fuel Handling Equipment) for operating OPR 1000 (Optimized Power Reactor) by using data measured during the fuel reloading, and make some suggestions on enhancing the performance of the FHE. The fuel handling equipment, which serves critical processes in the refueling outage, has been upgraded to increase and improve the operational availability of the plant. The evaluation and suggestion of this study can be a beneficial tool related to the performance of the fuel handling equipment

  5. Distributed and parallel approach for handle and perform huge datasets

    Science.gov (United States)

    Konopko, Joanna

    2015-12-01

    Big Data refers to the dynamic, large and disparate volumes of data that come from many different sources (tools, machines, sensors, mobile devices), uncorrelated with each other. It requires new, innovative and scalable technology to collect, host and analytically process the vast amount of data. A proper architecture for a system that processes huge data sets is needed. In this paper, distributed and parallel system architectures are compared using the example of the MapReduce (MR) Hadoop platform and a parallel database platform (DBMS). The paper also analyzes the problem of extracting valuable information from petabytes of data. Both paradigms, MapReduce and parallel DBMS, are described and compared, and a hybrid architecture is proposed that could be used to solve the analyzed problem of storing and processing Big Data.
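    The MapReduce paradigm compared in the abstract can be sketched in miniature (the sensor records below are invented for illustration): a map phase emits key-value pairs, a shuffle groups them by key, and a reduce phase aggregates each group independently, which is what makes the pattern parallelizable. A parallel DBMS would express the same computation declaratively, e.g. SELECT sensor, AVG(reading) ... GROUP BY sensor.

```python
from collections import defaultdict

# Toy records: (sensor_id, reading) pairs from disparate sources.
records = [("s1", 3.0), ("s2", 1.5), ("s1", 4.0), ("s3", 2.0), ("s2", 2.5)]

# Map phase: emit (key, value) pairs (here the identity mapping).
mapped = [(sensor, reading) for sensor, reading in records]

# Shuffle phase: group all values by key.
grouped = defaultdict(list)
for key, value in mapped:
    grouped[key].append(value)

# Reduce phase: aggregate each group; each key could be reduced on a different node.
averages = {key: sum(vals) / len(vals) for key, vals in grouped.items()}
print(averages)  # {'s1': 3.5, 's2': 2.0, 's3': 2.0}
```

    Hadoop distributes exactly these phases across a cluster with fault tolerance; the sketch only shows the data flow, not the distribution.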

  6. Spent nuclear fuel shipping cask handling capabilities of commercial light water reactors

    International Nuclear Information System (INIS)

    Daling, P.M.; Konzek, G.J.; Lezberg, A.J.; Votaw, E.F.; Collingham, M.I.

    1985-04-01

    This report describes an evaluation of the cask handling capabilities of those reactors which are operating or under construction. A computerized data base that includes cask handling information was developed with information from the literature and utility-supplied data. The capability of each plant to receive and handle existing spent fuel shipping casks was then evaluated. Modal fractions were then calculated based on the results of these evaluations and the quantities of spent fuel projected to be generated by commercial nuclear power plants through 1998. The results indicated that all plants are capable of receiving and handling truck casks. Up to 118 out of 130 reactors (91%) could potentially handle the larger and heavier rail casks if the maximum capability of each facility is utilized. Design and analysis efforts and physical modifications to some plants would be needed to achieve this high rail percentage. These modifications would be needed to satisfy regulatory requirements, increase lifting capabilities, develop rail access, or improve other deficiencies. The remaining 12 reactors were determined to be capable of handling only the smaller truck casks. The percentage of plants that could receive and handle rail casks in the near-term would be reduced to 64%. The primary reason for a plant to be judged incapable of handling rail casks in the near-term was a lack of rail access. The remaining 36% of the plants would be limited to truck shipments. The modal fraction calculations indicated that up to 93% of the spent fuel accumulated by 1998 could be received at federal storage or disposal facilities via rail (based on each plant's maximum capabilities). If the near-term cask handling capabilities are considered, the rail percentage is reduced to 62%
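    The modal-fraction calculation described in the abstract reduces to a tonnage-weighted share of spent fuel at rail-capable plants. A minimal sketch with a hypothetical plant list (names, tonnages and capabilities invented, not from the report):

```python
# Hypothetical inventory: (plant, spent-fuel tonnage, rail-capable?).
plants = [
    ("A", 120.0, True),
    ("B", 80.0, False),   # truck-only: no rail access
    ("C", 200.0, True),
    ("D", 50.0, False),
]

total = sum(tonnage for _, tonnage, _ in plants)
rail = sum(tonnage for _, tonnage, rail_ok in plants if rail_ok)

# Modal fraction: share of spent fuel that could move by rail.
rail_fraction = rail / total
print(f"rail modal fraction: {rail_fraction:.0%}")
```

    Running the report's two scenarios through such a calculation is what yields its 93% (maximum capability) versus 62% (near-term capability) rail figures.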

  7. Development of liquid handling techniques in microgravity

    Science.gov (United States)

    Antar, Basil N.

    1995-01-01

    A large number of experiments dealing with protein crystal growth, and also with growth of crystals from solution, require complicated fluid handling procedures, including filling of empty containers with liquids, mixing of solutions, and stirring of liquids. Such procedures are accomplished in a straightforward manner when performed under terrestrial conditions in the laboratory. However, in the low-gravity environment of space, such as on board the Space Shuttle or an Earth-orbiting space station, these procedures sometimes produce entirely undesirable results. Under terrestrial conditions, liquids usually completely separate from the gas due to the buoyancy effects of Earth's gravity. Consequently, any gas pockets that are entrained into the liquid during a fluid handling procedure will eventually migrate towards the top of the vessel, where they can be removed. In a low-gravity environment, any entrained gas bubble will remain within the liquid bulk indefinitely, at a location that is not known a priori, resulting in a mixture of liquid and vapor.

  8. Fission reactor recycling pump handling device

    International Nuclear Information System (INIS)

    Togasawa, Hiroshi; Komita, Hideo; Susuki, Shoji; Endo, Takio; Yamamoto, Tetsuzo; Takahashi, Hideaki; Saito, Noboru.

    1991-01-01

    This invention provides a device for handling a recycling pump in a nuclear reactor during periodical inspections in a BWR type power plant. That is, in a handling device comprising a support for the components of a recycling pump and a lifter for vertically moving the support below a motor case disposed through a reactor pressure vessel, a weight is disposed below the support, so that the center of gravity of the components, the support and the weight as a whole is substantially aligned with the position of the support. With such a constitution, the components can be moved vertically to the motor case extremely safely, and vibrations are remarkably suppressed. Further, operational safety is remarkably improved by preventing the assembly from toppling during earthquakes. Furthermore, since the vibration-proof jigs of the prior art can be dispensed with, operation efficiency is improved. (I.S.)

  9. Process & Quality procedures for transport & handling activities

    CERN Document Server

    Böttcher, O

    2002-01-01

    To respect the detailed and complex planning of the LHC installation project it is essential to reduce possible faults in every technical service that can cause delays in the schedule. In order to ensure proper execution of transport and handling activities it is important to get detailed information from the clients as early as possible in order to do the planning and the organisation of the required resources. One procedure that requires greater focus in the future is the preparation of the resources. The goal is to prevent equipment breakdowns and accidents while executing transport and handling activities. In the LEP dismantling project multiple breakdowns of important cranes caused serious problems in the project schedule. For the LHC installation project similar incidents in the reliability of the equipment cannot be accepted because of the high sensitivity of the whole schedule. This paper shall outline the efforts and methods that are put in place in order to meet the LHC installation requirements.

  10. Trends in remote handling device development

    International Nuclear Information System (INIS)

    Raimondi, T.

    1991-01-01

    A brief review is given of studies on layouts and methods for handling some major components requiring remote maintenance in future fusion reactors: Neutral sources and beam lines, the blanket, divertor plates, armour tiles and vacuum pumps. Comparison is made to problems encountered in JET, methods and equipment used and development work done there. Areas requiring development and research are outlined. These include topics which are the subject of papers presented here, such as dynamic studies and control of transporters, improvements to the man-machine interface and hot cell equipment. A variety of other topics where effort is needed are also mentioned: Environmental tolerance of components and equipment, TV viewing and compensation of viewing difficulties with aids such as computer graphics and image processing, safety assessment, computer aids for remote manipulation, remote cutting and welding techniques, routine in-vessel inspection methods and selection of connectors and flanges for remote handling. (orig.)

  11. Fission reactor recycling pump handling device

    Energy Technology Data Exchange (ETDEWEB)

    Togasawa, Hiroshi; Komita, Hideo; Susuki, Shoji; Endo, Takio; Yamamoto, Tetsuzo; Takahashi, Hideaki; Saito, Noboru

    1991-06-24

    This invention provides a device for handling a recycling pump in a nuclear reactor during periodical inspections in a BWR type power plant. That is, in a handling device comprising a support for the components of a recycling pump and a lifter for vertically moving the support below a motor case disposed through a reactor pressure vessel, a weight is disposed below the support, so that the center of gravity of the components, the support and the weight as a whole is substantially aligned with the position of the support. With such a constitution, the components can be moved vertically to the motor case extremely safely, and vibrations are remarkably suppressed. Further, operational safety is remarkably improved by preventing the assembly from toppling during earthquakes. Furthermore, since the vibration-proof jigs of the prior art can be dispensed with, operation efficiency is improved. (I.S.).

  12. Adaptive control of manipulators handling hazardous waste

    International Nuclear Information System (INIS)

    Colbaugh, R.; Glass, K.

    1994-01-01

    This article focuses on developing a robot control system capable of meeting hazardous waste handling application requirements, and presents as a solution an adaptive strategy for controlling the mechanical impedance of kinematically redundant manipulators. The proposed controller is capable of accurate end-effector impedance control and effective redundancy utilization, does not require knowledge of the complex robot dynamic model or parameter values for the robot or the environment, and is implemented without calculation of the robot inverse transformation. Computer simulation results are given for a four degree of freedom redundant robot under adaptive impedance control. These results indicate that the proposed controller is capable of successfully performing important tasks in robotic waste handling applications. (author) 3 figs., 39 refs
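    Target impedance control, the idea this paper builds its adaptive scheme around, can be illustrated with a minimal one-degree-of-freedom simulation. This is a sketch, not the authors' adaptive, model-free controller for redundant manipulators: it simply enforces the desired impedance relation M_d·ẍ + B_d·ẋ + K_d·(x − x_ref) = F_ext against a stiff wall, with all parameter values chosen for illustration.

```python
# Minimal 1-DOF target-impedance simulation (illustrative only; not the
# adaptive redundant-manipulator scheme of the paper). The end-effector is
# commanded to behave like a mass-spring-damper (M_d, B_d, K_d) about x_ref
# while contacting a stiff wall at x_wall. Semi-implicit Euler integration.

def simulate_impedance(x_ref=0.05, x_wall=0.02, k_env=5e4,
                       M_d=1.0, B_d=40.0, K_d=400.0,
                       dt=1e-3, steps=5000):
    x, v = 0.0, 0.0
    for _ in range(steps):
        f_ext = -k_env * max(0.0, x - x_wall)        # wall pushes back
        # impedance law solved for the commanded acceleration
        a = (f_ext - B_d * v - K_d * (x - x_ref)) / M_d
        v += a * dt                                   # update velocity first
        x += v * dt                                   # then position
    return x, v, -f_ext  # final position, velocity, steady contact force

x, v, f_contact = simulate_impedance()
```

Because the wall is much stiffer than K_d, the end-effector settles just past the contact point and the steady-state contact force is set almost entirely by K_d·(x_ref − x); regulating contact force through a chosen compliance, rather than tracking position rigidly, is why impedance control suits contact-rich handling tasks.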

  13. Handling construction waste of building demolition

    Directory of Open Access Journals (Sweden)

    Vondráčková Terezie

    2018-01-01

    Full Text Available Some building defects lead to demolition. What happens to the construction and demolition waste? Under the Waste Act 185/2001 Coll. and its amendment 223/2015 Coll., which came into force on January 1, 2017, the production of waste must be reduced because, as already stated in the amendment to Act No. 229/2014 Coll., a ban on landfilling waste will apply from 2024 onwards. The main goals of waste management can thus be considered to be: preventing or minimizing waste; and handling waste so it can be used as a secondary raw material through recycling, composting or combustion, with only the remaining waste landfilled. The company AZS 98 s. r. o. was established, among other activities, for the purpose of recycling construction and demolition waste. It operates 12 recycling centres throughout the Czech Republic, and we therefore selected it to demonstrate the handling of construction and demolition waste when addressing the defects of buildings.

  14. Chromosome analyses of nurses handling cytostatic agents

    International Nuclear Information System (INIS)

    Waksvik, H.; Klepp, O.; Brogger, A.

    1981-01-01

    A cytogenetic study of ten nurses handling cytostatic agents (average exposure, 2150 hours) and ten female hospital clerks revealed an increased frequency of chromosome gaps and a slight increase in sister chromatid exchange frequency among the nurses. The increase may be due to exposure to cytostatic drugs and points to these agents as a possible occupational health hazard. A second group of 11 nurses handling cytostatic agents for a shorter period of time (average exposure, 1078 hours), and three other groups (eight nurses engaged in therapeutic and diagnostic radiology, nine nurses engaged in anesthesiology, and seven nurses in postoperative ward) did not differ from the office personnel, except for an increased frequency of chromosome gaps in the radiology group

  15. Robotic requirements for plutonium handling automation

    International Nuclear Information System (INIS)

    Heywood, A.C.; Armantrout, G.A.

    1990-01-01

    While over 200,000 robots are in manufacturing service worldwide, only two are in use for the handling of plutonium in a glovebox. The difficulties of applying robotics to the glovebox environment include limited access for service and maintenance, radiation damage to electronics and insulators, and abrasion damage to bearings and sliding surfaces. The limited volume of the glovebox environment, the need to handle heavy workloads, and the need to maximize work volume dictate the use of an overhead gantry system. This paper discusses how the application of such a system will require a robot with extensive safety features, a high degree of flexibility to perform a variety of tasks, and high reliability coupled with an easily serviced design. Substantial challenges exist in control system design, sensor and operator integration, and programming to achieve these goals

  16. Player-Specific Conflict Handling Ontology

    Directory of Open Access Journals (Sweden)

    Charline Hondrou

    2014-09-01

    Full Text Available This paper presents an ontology that leads the player of a serious game - regarding conflict handling - to the educative experience from which they will benefit the most. It provides a clearly defined tree of axioms that maps the player’s visually manifested affective cues and emotional stimuli from the serious game to conflict handling styles and proposes interventions. The importance of this ontology lies in the fact that it promotes natural interaction (non-invasive methods and at the same time makes the game as player-specific as it can be for its educational goal. It is an ontology that can be adapted to different educational theories and serve various educational purposes.

  17. Remote Inspection, Measurement and Handling for LHC

    CERN Document Server

    Kershaw, K; Coin, A; Delsaux, F; Feniet, T; Grenard, J L; Valbuena, R

    2007-01-01

    Personnel access to the LHC tunnel will be restricted to varying extents during the life of the machine due to radiation, cryogenic and pressure hazards. The ability to carry out visual inspection, measurement and handling activities remotely during periods when the LHC tunnel is potentially hazardous offers advantages in terms of safety, accelerator down time, and costs. The first applications identified were remote measurement of radiation levels at the start of shut-down, remote geometrical survey measurements in the collimation regions, and remote visual inspection during pressure testing and initial machine cool-down. In addition, for remote handling operations, it will be necessary to be able to transmit several real-time video images from the tunnel to the control room. The paper describes the design, development and use of a remotely controlled vehicle to demonstrate the feasibility of meeting the above requirements in the LHC tunnel. Design choices are explained along with operating experience to date...

  18. Teachers' professional development needs in data handling and probability

    OpenAIRE

    Nieuwoudt, Hercules David; Wessels, Helena

    2011-01-01

    Poor Trends in International Mathematics and Science Study (TIMSS) results and widespread disappointing mathematics results in South Africa necessitate research-based and more efficient professional development for in-service mathematics teachers. This article reports on the profiling of mathematics teachers’ statistical knowledge, beliefs and confidence in order to inform the development of in-service teacher education programmes in statistics for Grade 8 and Grade 9 teachers. Ninety mathema...

  19. An introduction to the measurement errors and data handling

    International Nuclear Information System (INIS)

    Rubio, J.A.

    1979-01-01

    Some commonly used methods for estimating and correlating measurement errors are presented. An introduction to the theory of parameter determination and the goodness of estimates is also given. Some examples are discussed. (author)
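    The machinery such an introduction typically covers, namely weighted least squares with per-point uncertainties, parameter error estimates, and a chi-square goodness-of-fit statistic, can be sketched as follows. The data and uncertainties below are synthetic, chosen purely for illustration; this is not an excerpt from the report.

```python
import math

# Weighted least-squares fit of y = a + b*x with per-point uncertainties,
# returning 1-sigma parameter errors and the chi-square statistic.

def wls_line_fit(xs, ys, sigmas):
    w = [1.0 / s**2 for s in sigmas]
    S   = sum(w)
    Sx  = sum(wi * x for wi, x in zip(w, xs))
    Sy  = sum(wi * y for wi, y in zip(w, ys))
    Sxx = sum(wi * x * x for wi, x in zip(w, xs))
    Sxy = sum(wi * x * y for wi, x, y in zip(w, xs, ys))
    delta = S * Sxx - Sx**2
    a = (Sxx * Sy - Sx * Sxy) / delta            # intercept
    b = (S * Sxy - Sx * Sy) / delta              # slope
    sigma_a = math.sqrt(Sxx / delta)             # 1-sigma error on a
    sigma_b = math.sqrt(S / delta)               # 1-sigma error on b
    chi2 = sum(wi * (y - a - b * x)**2
               for wi, x, y in zip(w, xs, ys))   # goodness of fit
    return a, b, sigma_a, sigma_b, chi2

# Synthetic data: true line y = 2 + 3x with small known offsets, sigma = 0.1
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
offsets = [0.1, -0.1, 0.05, -0.05, 0.0]
ys = [2.0 + 3.0 * x + d for x, d in zip(xs, offsets)]
a, b, sa, sb, chi2 = wls_line_fit(xs, ys, [0.1] * len(xs))
```

Comparing chi2 against its number of degrees of freedom (here 5 points minus 2 parameters) is the usual goodness-of-estimate check: a value far above the degrees of freedom suggests underestimated errors or a wrong model.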

  20. Handling of wet residues in industry

    DEFF Research Database (Denmark)

    Villanueva, Alejandro

    is fundamental in most disposal routes for clarifying the possibility of treating the residue. The better the characterisation from the start is, the easier the assessment of the feasible disposal alternatives becomes. The decision about the handling/disposal solution for the residue is a trade-off between......, and can depend on factors such as the investment capacity, the relationships with the stakeholders, or the promotion of its environmental profile....

  1. Safe handling of plutonium in research laboratories

    International Nuclear Information System (INIS)

    1976-01-01

    The training film illustrates the main basic requirements for the safe handling of small amounts of plutonium. The film is intended not only for people setting up plutonium research laboratories but also for all those who work in existing plutonium research laboratories. It was awarded the first prize in the category ''Protection of Workers'' at the international film festival organized by the 4th World Congress of the International Radiation Protection Association (IRPA) in Paris in April 1977

  2. Procedure of safe handling with cytostatic drugs

    Directory of Open Access Journals (Sweden)

    Kodžo Dragan

    2003-01-01

    Full Text Available A working group for the safe handling of cytostatic drugs has been formed by the Ministry of Health, consisting of professionals from IORS, the Federal Bureau of Weights and Measures, Industrial Medicine, the Institute of Hematology, the Military Medical Academy, and Crown Agents. The aim of this working group is to prepare procedures for the safe handling of cytostatic drugs, as well as a programme for an educational seminar for nurses, medical technicians, and pharmaceutical technicians. The procedures will serve as a guide to good practice in oncology health care, and will cover all actions that health care professionals carry out from the moment the drugs arrive at the pharmacy to the moment of their application. The first segment of the procedure gives general rules for working with cytotoxic agents: control of risky exposures, a safe system of work, control of the working environment, monitoring of the employees' health condition, adequate protection in the working environment, protective equipment for the employees (gloves, mask, cap, eyeglasses, shoe covers, coats) and chambers with a vertical laminar air stream. The storage of cytostatics, the procedure in case of accident, and waste handling and removal are also described in this segment. Fifty-three standard operational procedures are described in detail in the second segment. A training scheme for the preparation of chemotherapy is given in the third segment - education related to various fields and a practical part carried out through workshops, at the end of which participants pass a test and obtain a certificate. After the procedures for the safe handling of cytostatics are legally regulated, the employer will have to provide a minimum of protective equipment, special rooms for drug dissolution, chambers with laminar airflow, a 6-hour working day, rotation of the staff dissolving drugs every five years, higher efficiency, and better health control. In conclusion

  3. Development of spent fuel remote handling technology

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Ji Sup; Park, B S; Park, Y S; Oh, S C; Kim, S H; Cho, M W; Hong, D H

    1997-12-01

    Since the nation's policy on spent fuel management is not finalized, the technical items commonly required for safe management and recycling of spent fuel - remote technologies for transportation, inspection, maintenance, and disassembly of spent fuel - are selected and pursued. In this regard, the following R and D activities are carried out: collision-free transportation of spent fuel assemblies, mechanical disassembly of spent nuclear fuel, and graphical simulation of the fuel handling/disassembly process. (author). 36 refs., 16 tabs., 77 figs

  4. Repository waste-handling operations, 1998

    International Nuclear Information System (INIS)

    Cottam, A.E.; Connell, L.

    1986-04-01

    The Civilian Radioactive Waste Management Program Mission Plan and the Generic Requirements for a Mined Geologic Disposal System state that beginning in 1998, commercial spent fuel not exceeding 70,000 metric tons of heavy metal, or a quantity of solidified high-level radioactive waste resulting from the reprocessing of such a quantity of spent fuel, will be shipped to a deep geologic repository for permanent storage. The development of a waste-handling system that can process 3000 metric tons of heavy metal annually will require the adoption of a fully automated approach. The safety and minimum exposure of personnel will be the prime goals of the repository waste handling system. A man-out-of-the-loop approach will be used in all operations including the receipt of spent fuel in shipping casks, the inspection and unloading of the spent fuel into automated hot-cell facilities, the disassembly of spent fuel assemblies, the consolidation of fuel rods, and the packaging of fuel rods into heavy-walled site-specific containers. These containers are designed to contain the radionuclides for up to 1000 years. The ability of a repository to handle more than 6000 pressurized water reactor spent-fuel rods per day on a production basis for approximately a 23-year period will require that a systems approach be adopted that combines space-age technology, robotics, and sophisticated automated computerized equipment. New advanced inspection techniques, maintenance by robots, and safety will be key factors in the design, construction, and licensing of a repository waste-handling facility for 1998
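    The throughput figures quoted above are mutually consistent, as a quick arithmetic check shows. The per-assembly heavy-metal mass, rods per assembly, and working days per year used below are typical PWR values assumed for illustration; they are not taken from the abstract.

```python
# Quick consistency check of the throughput figures quoted in the abstract.
total_mthm = 70_000           # statutory repository limit, MTHM
annual_mthm = 3_000           # stated annual handling rate, MTHM/year
years = total_mthm / annual_mthm   # ~23.3, matching "approximately 23-year period"

# Rods-per-day figure, under assumed values (not from the abstract):
mthm_per_assembly = 0.45      # typical PWR assembly heavy-metal mass
rods_per_assembly = 264       # typical 17x17 PWR assembly
working_days = 250            # assumed production days per year
assemblies_per_day = annual_mthm / mthm_per_assembly / working_days
rods_per_day = assemblies_per_day * rods_per_assembly
```

Under these assumptions the result lands near 7,000 rods per day, the same order as the "more than 6000 pressurized water reactor spent-fuel rods per day" quoted in the text.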

  5. Means for attaching remote handling tongs

    International Nuclear Information System (INIS)

    Kearney, A.S.

    1982-01-01

    A remote handling tong has a replaceable slave head assembly provided with a spring biased latch which engages a recess in a barrel member of the tong. The latch bolt extends transverse to the barrel member, and has studs which project at each end beyond the body of the slave head assembly so as to engage respective linear cam surfaces at a station for parking the slave head assembly. (author)

  6. Inert gas handling in ion plating systems

    International Nuclear Information System (INIS)

    Goode, A.R.; Burden, M.St.J.

    1979-01-01

    The results of an investigation into the best methods for producing and monitoring the inert gas environment for ion plating systems are reported. Work carried out on Pirani gauges and high-pressure ion gauges for the measurement of pressures in the ion plating region (1-50 mtorr), and on the use of furnaces for cleaning argon, is outlined. A schematic of a gas handling system is shown and discussed. (UK)

  7. Development of spent fuel remote handling technology

    International Nuclear Information System (INIS)

    Yoon, Ji Sup; Park, B. S.; Park, Y. S.; Oh, S. C.; Kim, S. H.; Cho, M. W.; Hong, D. H.

    1997-12-01

    Since the nation's policy on spent fuel management is not finalized, the technical items commonly required for safe management and recycling of spent fuel - remote technologies for transportation, inspection, maintenance, and disassembly of spent fuel - are selected and pursued. In this regard, the following R and D activities are carried out: collision-free transportation of spent fuel assemblies, mechanical disassembly of spent nuclear fuel, and graphical simulation of the fuel handling/disassembly process. (author). 36 refs., 16 tabs., 77 figs

  8. Powder handling for automated fuel processing

    International Nuclear Information System (INIS)

    Frederickson, J.R.; Eschenbaum, R.C.; Goldmann, L.H.

    1989-01-01

    Installation of the Secure Automated Fabrication (SAF) line has been completed. It is located in the Fuel Cycle Plant (FCP) at the Department of Energy's (DOE) Hanford site near Richland, Washington. The SAF line was designed to fabricate advanced reactor fuel pellets and assemble fuel pins by automated, remote operation. This paper describes powder handling equipment and techniques utilized for automated powder processing and powder conditioning systems in this line. 9 figs

  9. Safe handling of plutonium in research laboratories

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1977-12-31

    The training film illustrates the main basic requirements for the safe handling of small amounts of plutonium. The film is intended not only for people setting up plutonium research laboratories but also for all those who work in existing plutonium research laboratories. It was awarded the first prize in the category ''Protection of Workers'' at the international film festival organized by the 4th World Congress of the International Radiation Protection Association (IRPA) in Paris in April 1977

  10. Differences in Muscle Activity During Cable Resistance Training Are Influenced by Variations in Handle Types.

    Science.gov (United States)

    Rendos, Nicole K; Heredia Vargas, Héctor M; Alipio, Taislaine C; Regis, Rebeca C; Romero, Matthew A; Signorile, Joseph F

    2016-07-01

    Rendos, NK, Heredia Vargas, HM, Alipio, TC, Regis, RC, Romero, MA, and Signorile, JF. Differences in muscle activity during cable resistance training are influenced by variations in handle types. J Strength Cond Res 30(7): 2001-2009, 2016 - There has been a recent resurgence in the use of cable machines for resistance training, allowing movements that more effectively simulate daily activities and sports-specific movements. By necessity, these devices require a machine/human interface through some type of handle. Considerable data from material handling, industrial engineering, and exercise training studies indicate that handle qualities, especially size and shape, can significantly influence force production and muscular activity, particularly of the forearm muscles, which constitute the critical link in activities that require object manipulation. The purpose of this study was to examine the influence of three different handle conditions - standard handle (StandH), ball handle with the cable between the index and middle fingers (BallIM), and ball handle with the cable between the middle and ring fingers (BallMR) - on activity levels (rmsEMG) of the triceps brachii lateral and long heads (TriHLat, TriHLong), brachioradialis (BR), flexor carpi radialis (FCR), extensor carpi ulnaris, and extensor digitorum (ED) during eight repetitions of the standing triceps pushdown performed from 90° to 0° elbow flexion at 1.5 s per contractile stage. Handle order was randomized. No significant differences were seen for triceps or BR rmsEMG across handle conditions; however, relative patterns of activation did vary for the forearm muscles by handle condition, with more coordinated activation levels for the FCR and ED during the ball handle conditions. In addition, the rmsEMG for the ED was significantly higher during the BallIM than any other condition and during the BallMR than the StandH. These results indicate that the use of ball handles with the cable passing between different fingers
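    The rmsEMG measure reported in this study is a standard root-mean-square amplitude envelope computed over a sliding window of the raw EMG trace. A minimal sketch follows; the window length and the sinusoidal test signal are arbitrary choices for illustration, not values from the study.

```python
import math

# Sliding-window RMS: the usual way an "rmsEMG" amplitude envelope is
# derived from a raw EMG signal.

def moving_rms(signal, window):
    """Return the RMS amplitude at each full window position."""
    return [math.sqrt(sum(s * s for s in signal[i:i + window]) / window)
            for i in range(len(signal) - window + 1)]

# Demo: a 50 Hz sine sampled at 1 kHz. Over whole cycles, the RMS of a
# unit-amplitude sine is 1/sqrt(2) ~ 0.707.
fs, f = 1000, 50
signal = [math.sin(2 * math.pi * f * n / fs) for n in range(400)]
envelope = moving_rms(signal, window=100)   # 100 samples = 5 full cycles
```

In practice the window (often tens of milliseconds) trades smoothness against temporal resolution of the activation pattern.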

  11. Uranium hexafluoride: Handling procedures and container descriptions

    International Nuclear Information System (INIS)

    1987-09-01

    The US Department of Energy (DOE) guidelines for packaging, measuring, and transferring uranium hexafluoride (UF6) have been undergoing continual review and revision for several years to keep them in phase with developing agreements for the supply of enriched uranium. Initially, K-1323, ''A Brief Guide to UF6 Handling,'' was issued in 1957. This was superseded by ORO-651, first issued in 1966 and reissued in 1967 to make editorial changes and to provide minor revisions in procedural information. In 1968 and 1972, Revisions 2 and 3, respectively, were issued as part of the continuing effort to present updated information. Revision 4, issued in 1977, included revisions to UF6 cylinders, valves, and methods of use. Revision 5 adds information dealing with pigtails, overfilled cylinders, definitions and handling precautions, and cylinder heel reduction procedures. Weighing standards previously presented in ORO-671, Vol. 1 (Procedures for Handling and Analysis of UF6) have also been included. This revision therefore supersedes ORO-671-1 as well as all prior issues of this report. These guidelines will normally apply in all transactions involving receipt or shipment of UF6 by DOE, unless stipulated otherwise by contracts or agreements with DOE or by notices published in the Federal Register. Any questions or requests for additional information on the subject matter covered herein should be directed to the United States Department of Energy, P.O. Box E, Oak Ridge, Tennessee 37831, Attention: Director, Uranium Enrichment Operations Division. 33 figs., 12 tabs

  12. Vestibule and Cask Preparation Mechanical Handling Calculation

    International Nuclear Information System (INIS)

    Ambre, N.

    2004-01-01

    The scope of this document is to develop the size, operational envelopes, and major requirements of the equipment to be used in the vestibule, cask preparation area, and the crane maintenance area of the Fuel Handling Facility. This calculation is intended to support the License Application (LA) submittal of December 2004, in accordance with the directive given by DOE correspondence received on the 27th of January 2004 entitled: ''Authorization for Bechtel SAIC Company L.L.C. to Include a Bare Fuel Handling Facility and Increased Aging Capacity in the License Application, Contract Number DE-AC28-01RW12101'' (Ref. 167124). This correspondence was appended by further correspondence received on the 19th of February 2004 entitled: ''Technical Direction to Bechtel SAIC Company L.L.C. for Surface Facility Improvements, Contract Number DE-AC28-01RW12101; TDL No. 04-024'' (Ref. 168751). These documents give the authorization for a Fuel Handling Facility to be included in the baseline. The limitations of this preliminary calculation lie within the assumptions of Section 5, as this calculation is part of an evolutionary design process

  13. Remote handling systems for the Pride application

    International Nuclear Information System (INIS)

    Kim, K.; Lee, J.; Lee, H.; Kim, S.; Kim, H.

    2010-10-01

    This paper describes the development of remote handling systems for use in pyro processing technology development. The remote handling systems mainly include a BDSM (Bridge-transported Dual-arm Servo-Manipulator) and a simulator, both of which will be applied to the Pride (Pyro process integrated inactive demonstration facility) under construction at KAERI. The BDSM, which will traverse the length of the ceiling, is designed to have two pairs of master-slave manipulators, each pair having kinematic similarity and force reflection. The simulator is designed to provide an efficient means of simulating and verifying the conceptual design, development, arrangement, and rehearsal of the pyro processing equipment and relevant devices from the viewpoint of remote operation and maintenance. The activities and progress made in developing these remote handling systems for the remote operation and maintenance of the pyro processing equipment and relevant devices in the Pride are presented. (Author)

  14. A Review of Toxicity and Use and Handling Considerations for Guanidine, Guanidine Hydrochloride, and Urea.

    Energy Technology Data Exchange (ETDEWEB)

    Ertell, Katherine GB

    2006-03-27

    This is a technical report prepared for Oregon Sustainable Energy, LLC, under Agreement 06-19 with PNNL's Office of Small Business Programs. The request was to perform a review of the toxicity and safe handling of guanidine. The request was later amended to add urea. This report summarizes the toxicity data available in the scientific literature and provides an interpretation of the results and recommendations for handling these compounds.

  15. CANISTER HANDLING FACILITY CRITICALITY SAFETY CALCULATIONS

    Energy Technology Data Exchange (ETDEWEB)

    C.E. Sanders

    2005-04-07

    This design calculation revises and updates the previous criticality evaluation for the canister handling, transfer and staging operations to be performed in the Canister Handling Facility (CHF) documented in BSC [Bechtel SAIC Company] 2004 [DIRS 167614]. The purpose of the calculation is to demonstrate that the handling operations of canisters performed in the CHF meet the nuclear criticality safety design criteria specified in the ''Project Design Criteria (PDC) Document'' (BSC 2004 [DIRS 171599], Section 4.9.2.2), the nuclear facility safety requirement in ''Project Requirements Document'' (Canori and Leitner 2003 [DIRS 166275], p. 4-206), the functional/operational nuclear safety requirement in the ''Project Functional and Operational Requirements'' document (Curry 2004 [DIRS 170557], p. 75), and the functional nuclear criticality safety requirements described in the ''Canister Handling Facility Description Document'' (BSC 2004 [DIRS 168992], Sections 3.1.1.3.4.13 and 3.2.3). Specific scope of work contained in this activity consists of updating the Category 1 and 2 event sequence evaluations as identified in the ''Categorization of Event Sequences for License Application'' (BSC 2004 [DIRS 167268], Section 7). The CHF is limited in throughput capacity to handling sealed U.S. Department of Energy (DOE) spent nuclear fuel (SNF) and high-level radioactive waste (HLW) canisters, defense high-level radioactive waste (DHLW), naval canisters, multicanister overpacks (MCOs), vertical dual-purpose canisters (DPCs), and multipurpose canisters (MPCs) (if and when they become available) (BSC 2004 [DIRS 168992], p. 1-1). It should be noted that the design and safety analyses of the naval canisters are the responsibility of the U.S. Department of the Navy (Naval Nuclear Propulsion Program) and will not be included in this document. In addition, this calculation is valid for

  16. CANISTER HANDLING FACILITY CRITICALITY SAFETY CALCULATIONS

    International Nuclear Information System (INIS)

    C.E. Sanders

    2005-01-01

    This design calculation revises and updates the previous criticality evaluation for the canister handling, transfer and staging operations to be performed in the Canister Handling Facility (CHF) documented in BSC [Bechtel SAIC Company] 2004 [DIRS 167614]. The purpose of the calculation is to demonstrate that the handling operations of canisters performed in the CHF meet the nuclear criticality safety design criteria specified in the ''Project Design Criteria (PDC) Document'' (BSC 2004 [DIRS 171599], Section 4.9.2.2), the nuclear facility safety requirement in ''Project Requirements Document'' (Canori and Leitner 2003 [DIRS 166275], p. 4-206), the functional/operational nuclear safety requirement in the ''Project Functional and Operational Requirements'' document (Curry 2004 [DIRS 170557], p. 75), and the functional nuclear criticality safety requirements described in the ''Canister Handling Facility Description Document'' (BSC 2004 [DIRS 168992], Sections 3.1.1.3.4.13 and 3.2.3). Specific scope of work contained in this activity consists of updating the Category 1 and 2 event sequence evaluations as identified in the ''Categorization of Event Sequences for License Application'' (BSC 2004 [DIRS 167268], Section 7). The CHF is limited in throughput capacity to handling sealed U.S. Department of Energy (DOE) spent nuclear fuel (SNF) and high-level radioactive waste (HLW) canisters, defense high-level radioactive waste (DHLW), naval canisters, multicanister overpacks (MCOs), vertical dual-purpose canisters (DPCs), and multipurpose canisters (MPCs) (if and when they become available) (BSC 2004 [DIRS 168992], p. 1-1). It should be noted that the design and safety analyses of the naval canisters are the responsibility of the U.S. Department of the Navy (Naval Nuclear Propulsion Program) and will not be included in this document. In addition, this calculation is valid for the current design of the CHF and may not reflect the ongoing design evolution of the facility

  17. Handling of Small-Scale Protests in China

    DEFF Research Database (Denmark)

    Gui, Xiaowei

    that is less than Sedaka, but more than an interview. A thorough analysis of the highly mixed reality of protest and protest-handling in the dissertation improves scholarly understanding of state-society relation and contentious politics in China. In particular, Chapter One is the brief introduction...... and argue why I believe the data gathered to be applicable to the study. The next three analytical chapters concentrate on three specific issues. Chapter 4 illustrates how and why petitions are mishandled. Chapter 5 explores how and why nail residents can succeed. Chapter 6 provides an explanation of why...

  18. Post-irradiation handling and examination at the HFEF complex

    International Nuclear Information System (INIS)

    Bacca, J.P.

    1980-01-01

    The Hot Fuel Examination Facility provides postirradiation handling and examination of fast reactor irradiation experiments and safety tests for the United States Breeder Reactor Program. Nondestructive interim examinations and destructive terminal examinations at HFEF derive data from tests irradiated in the Experimental Breeder Reactor No. II, in the Transient Reactor Test Facility (TREAT), and in the Sodium Loop Safety Facility. Similar support will be provided in the near future for tests irradiated in the Fast Flux Test Facility, and for the larger sodium loops to be irradiated in TREAT

  19. Emergency Handling for MAC Protocol in Human Body Communication

    Directory of Open Access Journals (Sweden)

    Kwon Youngmi

    2011-01-01

    Full Text Available Human body communication (HBC) is a technology that enables short-range data communication using the human body as a medium, like an electrical wire, thereby removing the need for a traditional antenna. HBC may be used as a type of data communication in a body area network (BAN) while the devices are in contact with the body. One important issue in a BAN is the emergency alarm, because it may be closely related to human life. For emergency data communication, the most critical factor is the time constraint. IEEE 802.15.6 specifies that an emergency alarm for the BAN must be notified in less than 1 s and that prioritization mechanisms must be provided for emergency traffic and notification. As one type of BAN, HBC must follow this recommendation too. Existing emergency handling methods in BANs are based on carrier sensing on radio frequencies to detect the status of channels. However, the PHY protocol in HBC does not provide carrier sensing, so the previous methods are not directly suitable for HBC. Additionally, in an environment where the emergency rate is very low, allocating dedicated slot(s) for emergencies in each superframe is very wasteful. In this work, we propose a specific emergency handling operation for the human body communication medium access control (HBC-MAC) protocol to meet the emergency requirements for BANs. We also show the optimal number of emergency slots for various combinations of beacon intervals and emergency rates.
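    The waste argument against dedicated emergency slots can be made concrete with a toy calculation. This is not the HBC-MAC scheme itself; the superframe length, slot count, and emergency probability below are illustrative assumptions.

```python
# Toy model (not the paper's HBC-MAC protocol): cost of reserving one
# emergency slot per superframe when emergencies are rare.

def dedicated_slot_cost(superframe_ms, slots_per_superframe, p_emergency):
    """Return (idle capacity fraction, worst-case alarm wait in ms).

    p_emergency: probability an emergency message arrives in a superframe.
    """
    # The reserved slot is idle whenever no emergency arrives.
    wasted_fraction = (1.0 - p_emergency) / slots_per_superframe
    # An alarm waits at most one superframe for its reserved slot.
    worst_case_wait_ms = superframe_ms
    return wasted_fraction, worst_case_wait_ms

waste, wait = dedicated_slot_cost(superframe_ms=250,
                                  slots_per_superframe=16,
                                  p_emergency=1e-4)
```

With these numbers, nearly 1/16 of capacity sits idle almost all the time, yet the IEEE 802.15.6 1-second deadline is met whenever the superframe is shorter than 1 s, which is the trade-off the abstract's slot-optimization addresses.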

  20. Museopathy: Exploring the Healing Potential of Handling Museum Objects

    Directory of Open Access Journals (Sweden)

    Helen Chatterjee

    2009-11-01

    Full Text Available To coincide with emerging arts and health practices, University College London Museums & Collections and University College London Hospitals Arts partnered to create a pilot project, entitled “Heritage in Hospitals”, which sought to assess whether handling museum objects has a positive impact on patient wellbeing. Quantitative data from 32 sessions conducted with patients in May through July (inclusive) of 2008 demonstrated, on average, an increase in self-reported measures of life satisfaction and health status after handling museum objects. Constant comparative analysis of the qualitative data collected from the sessions revealed two major recurring themes: “impersonal/educational” and “personal/reminiscence”. The first theme included instances where handling museum objects allowed patients to access truths about the objects ascertainable solely through touch (such as gauging weight, texture, temperature, and spatial relation to the body), to verify what was seen, to facilitate an intimate and imaginative connection with the museum objects and their origins, to investigate and explore the objects, to permit an interaction with the “rare” and “museum-worthy”, and to assist with aesthetic appreciation. The second theme illustrated the project’s potential to assist with counselling on issues of illness, death, loss and mourning, and to help restore dignity, respect and a sense of identity (particularly among elderly patients) by providing a springboard for reminiscing and the telling of life stories in a highly institutionalized setting. This paper contextualizes the project, explores the implications of the project’s methodology and its findings, and provides questions for future research.

  1. TECHNIQUES WITH POTENTIAL FOR HANDLING ENVIRONMENTAL SAMPLES IN CAPILLARY ELECTROPHORESIS

    Science.gov (United States)

    An assessment of the methods for handling environmental samples prior to capillary electrophoresis (CE) is presented for both aqueous and solid matrices. Sample handling in environmental analyses is the subject of ongoing research at the Environmental Protection Agency's National...

  2. Effects of handling on fear reactions in young Icelandic horses

    DEFF Research Database (Denmark)

    Marsbøll, Anna Feldberg; Christensen, Janne Winther

    2015-01-01

    To investigate the effect of a short-term standardised handling procedure on reactions of young horses in 2 types of fear tests (including and excluding human handling). Study design: An experimental study with 3-year-old Icelandic horses (n = 24). Methods: Handled horses (n = 12) were trained according to a standardised handling procedure whereas controls (n = 12) remained untrained. Behavioural and heart rate responses in a novel object test and 2 handling fear tests (HFTs) were measured. The HFTs were conducted with both an unknown (HFT-unknown) and a known handler (HFT-known). Results: There was no effect … correlated significantly between tests. Conclusions: Previous handling may affect the behavioural fear response of horses when handled by their usual handler, whereas this effect did not apply to an unknown handler. Heart rates appeared unaffected by handling and may be a more reliable indicator …

  3. A combined constraint handling framework: an empirical study

    DEFF Research Database (Denmark)

    Si, Chengyong; Hu, Junjie; Lan, Tian

    2017-01-01

    This paper presents a new combined constraint handling framework (CCHF) for solving constrained optimization problems (COPs). The framework combines promising aspects of different constraint handling techniques (CHTs) in different situations with consideration of problem characteristics. In order...

  4. Laboratory biosafety for handling emerging viruses

    Directory of Open Access Journals (Sweden)

    I. Made Artika

    2017-05-01

    Full Text Available Emerging viruses are viruses whose occurrence has risen within the past twenty years, or whose presence is likely to increase in the near future. Diseases caused by emerging viruses are a major threat to global public health. In spite of greater awareness of safety and containment procedures, the handling of pathogenic viruses remains a likely source of infection, and even mortality, among laboratory workers. There is a steady increase in both the number of laboratories and the number of scientists handling emerging viruses for diagnostics and research. The potential for harm associated with work on these infectious agents can be minimized through the application of sound biosafety concepts and practices. The main factor in the prevention of laboratory-acquired infection is well-trained personnel who are knowledgeable and biohazard-aware, who are perceptive of the various routes of transmission, and who are professional in safe laboratory practice management. In addition, appropriate facilities, practices and procedures must be used by laboratory workers for the handling of emerging viruses in a safe and secure manner. This review is aimed at providing researchers and laboratory personnel with basic biosafety principles to protect themselves from exposure to emerging viruses while working in the laboratory. This paper focuses on what emerging viruses are, why emerging viruses can cause laboratory-acquired infection, how to assess the risk of working with emerging viruses, and how laboratory-acquired infection can be prevented. Control measures used in the laboratory, designed such that they protect workers from emerging viruses and safeguard the public through the safe disposal of infectious wastes, are also addressed.

  5. Recent fuel handling experience in Canada

    International Nuclear Information System (INIS)

    Welch, A.C.

    1991-01-01

    For many years, good operation of the fuel handling system at Ontario Hydro's nuclear stations has been taken for granted, with fuel handling system-related problems usually contributing less than one percent of the total unavailability of the stations. While the situation at the newer Hydro stations continues generally to be good (with the specific exception of some units at Pickering B), some specific and some general problems have caused significant loss of availability at the older plants (Pickering A and Bruce A). Generally, the experience at the 600 MWe units in Canada has also continued to be good, with Point Lepreau leading the world in availability. As a result of working to correct identified deficiencies, there were some changes for the better, as some items of equipment that were a chronic source of trouble were replaced with improved components. In addition, the fuel handling system has been used three times as a delivery system for large-scale non-destructive examination of the pressure tubes, twice at Bruce and once at Pickering; performing these inspections this way has saved many days of reactor downtime. Under COG there are several programs to develop improved versions of some of the main assemblies of the fuelling machine head. This paper generally covers the events relating to Pickering in more detail, but also describes the problems with the Bruce fuelling machine bridges, since the 600 MWe stations have a bridge drive arrangement that is somewhat similar to Bruce's

  6. System for handling and storing radioactive waste

    Science.gov (United States)

    Anderson, John K.; Lindemann, Paul E.

    1984-01-01

    A system and method for handling and storing spent reactor fuel and other solid radioactive waste, including canisters to contain the elements of solid waste, storage racks to hold a plurality of such canisters, storage bays to store these racks in isolation by means of shielded doors in the bays. This system also includes means for remotely positioning the racks in the bays and an access tunnel within which the remotely operated means is located to position a rack in a selected bay. The modular type of these bays will facilitate the construction of additional bays and access tunnel extension.

  7. Remote handling equipment for CANDU retubing

    International Nuclear Information System (INIS)

    Crawford, G.S.; Lowe, H.

    1993-01-01

    Numet Engineering Ltd. has designed and supplied remote handling equipment for Ontario Hydro's retubing operation of its CANDU reactors at the Bruce Nuclear Generating Station. This equipment consists of ''Retubing Tool Carriers'' and ''Worktables'' which operate remotely or manually at the reactor face. Together they function to transport tooling to and from the reactor face, to position and support tooling during retubing operations, and to deliver and retrieve fuel channels and channel components. This paper presents the fundamentals of the process and discusses the equipment supplied in terms of its design, manufacturing, components and controls, to meet the functional and quality requirements of Ontario Hydro's retubing process. (author)

  8. Remote filter handling machine for Sizewell B

    International Nuclear Information System (INIS)

    Barker, D.

    1993-01-01

    Two Filter Handling machines (FHM) have been supplied to Nuclear Electric plc for use at Sizewell B Power Station. These machines have been designed and built following ALARP principles, with the functional objective being to remove radioactive filter cartridges from a filter housing and replace them with clean filter cartridges. Operation of the machine is achieved by prompting each distinct task via an industrial computer, or by prompting a full cycle using the automatic mode. The design of the machine features many aspects demonstrating ALARP while keeping the machine simple, robust and easy to maintain. (author)

  9. CLASSIFICATION OF THE MGR MUCK HANDLING SYSTEM

    International Nuclear Information System (INIS)

    R. Garrett

    1999-01-01

    The purpose of this analysis is to document the Quality Assurance (QA) classification of the Monitored Geologic Repository (MGR) muck handling system structures, systems and components (SSCs) performed by the MGR Safety Assurance Department. This analysis also provides the basis for revision of YMP/90-55Q, Q-List (YMP 1998). The Q-List identifies those MGR SSCs subject to the requirements of DOE/RW-0333P, ''Quality Assurance Requirements and Description'' (QARD) (DOE 1998). This QA classification incorporates the current MGR design and the results of the ''Preliminary Preclosure Design Basis Event Calculations for the Monitored Geologic Repository'' (CRWMS M and O 1998a).

  10. How to handle station black outs

    International Nuclear Information System (INIS)

    Reisch, Frigyes

    1986-01-01

    Station black out is defined as the loss of all high-voltage alternating current at a nuclear power site. An international study was made to survey the practices in the different countries. The best way to handle a station black out is to avoid it; therefore, the normal off-site and emergency on-site power supplies are briefly discussed. The measures in use to enhance the ability of nuclear power plants using Boiling Water Reactors or Pressurized Water Reactors to cope with a station black out are discussed in some detail. (author)

  11. The remote handling systems for ITER

    Energy Technology Data Exchange (ETDEWEB)

    Ribeiro, Isabel, E-mail: mir@isr.ist.utl.pt [Institute for Systems and Robotics/Instituto Superior Tecnico, Lisboa (Portugal); Damiani, Carlo [Fusion for Energy, Barcelona (Spain); Tesini, Alessandro [ITER Organization, Cadarache (France); Kakudate, Satoshi [ITER Tokamak Device Group, Japan Atomic Energy Agency, Ibaraki (Japan); Siuko, Mikko [VTT Systems Engineering, Tampere (Finland); Neri, Carlo [Associazione EURATOM ENEA, Frascati (Italy)

    2011-10-15

    The ITER remote handling (RH) maintenance system is a key component in ITER operation, both for scheduled maintenance and for unexpected situations. It is a complex collection and integration of numerous systems, each in turn being the integration of diverse technologies into a coherent, space-constrained, nuclearised design. This paper presents an integrated view and recent results related to the Blanket RH System, the Divertor RH System, the Transfer Cask System (TCS), the In-Vessel Viewing System, the Neutral Beam Cell RH System, the Hot Cell RH and the Multi-Purpose Deployment System.

  12. How to handle station black outs

    Energy Technology Data Exchange (ETDEWEB)

    Reisch, Frigyes [Swedish Nuclear Power Inspectorate, S-10252 Stockholm (Sweden)

    1986-02-15

    Station black out is defined as the loss of all high-voltage alternating current at a nuclear power site. An international study was made to survey the practices in the different countries. The best way to handle a station black out is to avoid it; therefore, the normal off-site and emergency on-site power supplies are briefly discussed. The measures in use to enhance the ability of nuclear power plants using Boiling Water Reactors or Pressurized Water Reactors to cope with a station black out are discussed in some detail. (author)

  13. Handling of sodium for the FFTF

    International Nuclear Information System (INIS)

    Ballif, J.L.; Meadows, G.E.

    1978-06-01

    Based on the High Temperature Sodium Facility (HTSF) experience and the extensive design efforts for FFTF, procedures are in place for the unloading of the tank cars and for the fill of the FFTF reactor. Special precautions have been taken to provide safe handling and to accommodate contingencies in operation. These contingencies include special protective suits allowing personnel to enter and correct conditions arising from fill operations in the course of moving 7.71 × 10⁵ kg (1.7 × 10⁶ lbs) of sodium from the tank cars into the reactor vessel and its loop system.

  14. Tire, accident, handling, and roadway safety

    International Nuclear Information System (INIS)

    Logan, R.W.

    1994-01-01

    The authors are developing technology for an integrated package for the analysis of vehicle handling and impact into roadside features and into other vehicles. The program involves the development and use of rigid-body algorithms and Lawrence Livermore National Laboratory's DYNA and NIKE finite-element codes. The goal is a tool for use by engineers in industry and at federal and state Departments of Transportation, allowing good quantitative results at the workstation level. At the same time, the work enhances the authors' competency in furthering the development of DYNA and NIKE, and their expertise in spaceframe design for impact and crashworthiness

  15. DISPOSAL CONTAINER HANDLING SYSTEM DESCRIPTION DOCUMENT

    Energy Technology Data Exchange (ETDEWEB)

    E. F. Loros

    2000-06-30

    The Disposal Container Handling System receives and prepares new disposal containers (DCs) and transfers them to the Assembly Transfer System (ATS) or Canister Transfer System (CTS) for loading. The system receives the loaded DCs from ATS or CTS and welds the lids. When the welds are accepted the DCs are termed waste packages (WPs). The system may stage the WP for later transfer or transfer the WP directly to the Waste Emplacement/Retrieval System. The system can also transfer DCs/WPs to/from the Waste Package Remediation System. The Disposal Container Handling System begins with new DC preparation, which includes installing collars, tilting the DC upright, and outfitting the container for the specific fuel it is to receive. DCs and their lids are staged in the receipt area for transfer to the needed location. When called for, a DC is put on a cart and sent through an airlock into a hot cell. From this point on, all processes are done remotely. The DC transfer operation moves the DC to the ATS or CTS for loading and then receives the DC for welding. The DC welding operation receives loaded DCs directly from the waste handling lines or from interim lag storage for welding of the lids. The welding operation includes mounting the DC on a turntable, removing lid seals, and installing and welding the inner and outer lids. After the weld process and non-destructive examination are successfully completed, the WP is either staged or transferred to a tilting station. At the tilting station, the WP is tilted horizontally onto a cart and the collars removed. The cart is taken through an air lock where the WP is lifted, surveyed, decontaminated if required, and then moved into the Waste Emplacement/Retrieval System. DCs that do not meet the welding non-destructive examination criteria are transferred to the Waste Package Remediation System for weld preparation or removal of the lids. 
The Disposal Container Handling System is contained within the Waste Handling Building System.
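The container flow described above is essentially a small state machine: prepare, load, weld, inspect, then either stage as a waste package or divert to remediation. The sketch below is an illustrative paraphrase of that flow, not an official interface; the state names are taken loosely from the text.

```python
from enum import Enum, auto

class DCState(Enum):
    """Stages of a disposal container (DC), paraphrased from the description."""
    NEW = auto()            # collars installed, tilted upright, outfitted
    LOADED = auto()         # filled by the ATS or CTS waste handling lines
    WELDED = auto()         # inner and outer lids installed and welded
    WASTE_PACKAGE = auto()  # welds accepted by non-destructive examination
    REMEDIATION = auto()    # failed NDE: weld preparation or lid removal

def after_nde(passed: bool) -> DCState:
    """Route a welded container according to the NDE result."""
    return DCState.WASTE_PACKAGE if passed else DCState.REMEDIATION
```

A container returned from remediation would re-enter the flow at the welding step, matching the round trip to the Waste Package Remediation System described above.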

  16. Solution for remote handling in accelerator installations

    International Nuclear Information System (INIS)

    Burgerjon, J.J.; Ekberg, E.L.; Grisham, D.L.; Horne, R.A.; Meyer, R.E.; Flatau, C.R.; Wilson, K.B.

    1977-01-01

    A description is given of a remote-handling system designed for the Los Alamos Clinton P. Anderson Meson Physics Facility (LAMPF), versatile enough to be used in a variety of situations found around particle accelerators. The system consists of a bilateral (force-reflecting) servomanipulator installed on an articulated hydraulic boom. The boom also carries the necessary tools and observation devices. The whole slave unit can be moved by crane or truck to the area of operation. A control cable connects the slave unit with the control station, located at a safe distance in a trailer. Various stages of development as well as some operating experience are discussed

  17. WASTE HANDLING BUILDING SHIELD WALL ANALYSIS

    International Nuclear Information System (INIS)

    Padula, D.

    2000-01-01

    The scope of this analysis is to estimate the shielding wall, ceiling or equivalent door thicknesses that will be required in the Waste Handling Building to maintain the radiation doses to personnel within acceptable limits. The shielding thickness calculated is the minimum required to meet administrative limits, and not necessarily what will be recommended for the final design. The preliminary evaluations will identify the areas which have the greatest impact on mechanical and facility design concepts. The objective is to provide the design teams with the necessary information to assure an efficient and effective design

  18. Materials Handling. Module SH-01. Safety and Health.

    Science.gov (United States)

    Center for Occupational Research and Development, Inc., Waco, TX.

    This student module on materials handling is one of 50 modules concerned with job safety and health. It presents the procedures for safe materials handling. Discussed are manual handling methods (lifting and carrying by hand) and mechanical lifting (lifting by powered trucks, cranes or conveyors). Following the introduction, 15 objectives (each…

  19. Execution Constraint Verification of Exception Handling on UML Sequence Diagrams

    NARCIS (Netherlands)

    Ciraci, S.; Sözer, Hasan; Aksit, Mehmet; Havinga, W.K.

    2011-01-01

    Exception handling alters the control flow of the program. As such, errors introduced in exception handling code may influence the overall program in undesired ways. To detect such errors early and thereby decrease the programming costs, it is worthwhile to consider exception handling at design time.

  20. Automation of 3D micro object handling process

    DEFF Research Database (Denmark)

    Gegeckaite, Asta; Hansen, Hans Nørgaard

    2007-01-01

    Most of the micro objects in industrial production are handled with manual labour or in semiautomatic stations. Manual labour usually makes handling and assembly operations highly flexible, but slow, relatively imprecise and expensive. Handling of 3D micro objects poses special challenges due to ...

  1. Getting to grips with remote handling and robotics

    Energy Technology Data Exchange (ETDEWEB)

    Mosey, D [Ontario Hydro, Toronto (Canada)

    1984-12-01

    A report on the Canadian Nuclear Society Conference on robotics and remote handling in the nuclear industry, September 1984. Remote handling in reactor operations, particularly in the Candu reactors is discussed, and the costs and benefits of use of remote handling equipment are considered. Steam generator inspection and repair is an area in which practical application of robotic technology has made a major advance.

  2. 21 CFR 58.107 - Test and control article handling.

    Science.gov (United States)

    2010-04-01

    21 CFR 58.107, Good Laboratory Practice for Nonclinical Laboratory Studies, Test and Control Articles: Test and control article handling. Procedures shall be established for a system for the handling of the test and...

  3. 30 CFR 75.817 - Cable handling and support systems.

    Science.gov (United States)

    2010-07-01

    30 CFR 75.817, High-Voltage Longwalls: Cable handling and support systems. Longwall mining equipment must be provided with cable-handling and support systems that are constructed, installed and maintained to minimize...

  4. [The risk of manual handling loads in the hotel sector].

    Science.gov (United States)

    Muraca, G; Martino, L Barbaro; Abbate, A; De Pasquale, D; Barbuzza, O; Brecciaroli, R

    2007-01-01

    The aim of our study was to evaluate the manual handling risk and the incidence of musculoskeletal pathologies in the hotel sector. The study was conducted on 264 hotel workers. The sample was divided, on the basis of job role, into the following groups: porters (both on the guest floors and in the kitchen); floor waiters; and services (gardeners and manual workers). The tasks were evaluated according to the NIOSH method. The presence of musculoskeletal pathologies was verified on the basis of reported symptoms, clinical findings and examination reports. The data were compared with a control group. Application of the NIOSH method showed an elevated synthetic (lifting) index (> 3) for each job profile, reaching 5 for porters. The clinical data showed an elevated incidence of pathologies of the spine, especially the lumbar spine, with a high prevalence in the group of male porters. In conclusion, we believe that manual handling represents a particularly significant risk for workers in the hotel sector.
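The NIOSH method cited above rates each lifting task with a lifting index: the ratio of the actual load to a recommended weight limit (RWL). As a rough illustration (not the study's own computation), the revised NIOSH lifting equation in metric units multiplies a 23 kg load constant by six reduction factors; the frequency and coupling multipliers (fm, cm) come from published NIOSH tables and are passed in here as assumed inputs.

```python
def recommended_weight_limit(h_cm, v_cm, d_cm, a_deg, fm=1.0, cm=1.0):
    """Revised NIOSH lifting equation (metric form).
    h_cm: horizontal hand distance; v_cm: vertical hand height;
    d_cm: vertical travel distance; a_deg: trunk asymmetry angle;
    fm, cm: frequency and coupling multipliers from the NIOSH tables."""
    lc = 23.0                                    # load constant, kg
    hm = min(1.0, 25.0 / h_cm)                   # horizontal multiplier
    vm = max(0.0, 1.0 - 0.003 * abs(v_cm - 75))  # vertical multiplier
    dm = min(1.0, 0.82 + 4.5 / d_cm)             # distance multiplier
    am = max(0.0, 1.0 - 0.0032 * a_deg)          # asymmetry multiplier
    return lc * hm * vm * dm * am * fm * cm

def lifting_index(load_kg, rwl_kg):
    """LI > 1 indicates increased risk; LI > 3 (as found here) is high risk."""
    return load_kg / rwl_kg

# Ideal lift geometry: the RWL equals the full 23 kg load constant.
print(recommended_weight_limit(25, 75, 25, 0))  # -> 23.0
```

An index above 3 for every job profile, as reported above, means the actual loads exceeded three times the recommended limit for those tasks.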

  5. Epidemiological study of health hazards among workers handling engineered nanomaterials

    International Nuclear Information System (INIS)

    Liou, Saou-Hsing; Tsou, Tsui-Chun; Wang, Shu-Li; Li, Lih-Ann; Chiang, Hung-Che; Li, Wan-Fen; Lin, Pin-Pin; Lai, Ching-Huang; Lee, Hui-Ling; Lin, Ming-Hsiu; Hsu, Jin-Huei; Chen, Chiou-Rong; Shih, Tung-Sheng; Liao, Hui-Yi; Chung, Yu-Teh

    2012-01-01

    The aim of this study was to establish and identify the health effect markers of workers with potential exposure to nanoparticles (20–100 nm) during manufacturing and/or application of nanomaterials. For this cross-sectional study, we recruited 227 workers who handled nanomaterials and 137 comparison workers who did not, from 14 plants in Taiwan. A questionnaire was used to collect data on exposure status, demographics, and potential confounders. The health effect markers were measured in the medical laboratory. Control banding from the Nanotool Risk Level Matrix was used to categorize the exposure risk levels of the workers. The results showed that the antioxidant enzyme superoxide dismutase (SOD) in risk level 1 (RL1) and risk level 2 (RL2) workers was significantly (p < 0.05) lower than in the control workers (control > RL1 > RL2). Another antioxidant, glutathione peroxidase (GPX), was significantly lower only in RL1 workers than in the control workers. The cardiovascular markers fibrinogen and ICAM (intercellular adhesion molecule) were significantly higher in RL2 workers than in controls, and a significant dose–response with an increasing trend was found for these two cardiovascular markers. Another cardiovascular marker, interleukin-6, was significantly increased among RL1 workers, but not among RL2 workers. The accuracy rate for remembering 7 digits and reciting them backwards was significantly lower in RL2 workers (OR = 0.48) than in controls, and a significantly reversed gradient was also found for the correct rate of backward memory (OR = 0.90 for RL1, OR = 0.48 for RL2, p < 0.05 in test for trend). Depression of antioxidant enzymes and increased expression of cardiovascular markers were found among workers handling nanomaterials. Antioxidant enzymes, such as SOD and GPX, and cardiovascular markers, such as fibrinogen, ICAM, and interleukin-6, are possible biomarkers for medical surveillance of workers handling engineered nanomaterials.
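The dose-response gradients reported above (control vs. RL1 vs. RL2) are typically checked with a trend test over ordered exposure groups. As an illustrative sketch only (the study's exact statistical procedure is not given here), a Cochran-Armitage test for a linear trend in proportions can be computed as:

```python
import math

def cochran_armitage_z(cases, totals, scores):
    """Z statistic for a linear trend in proportions across ordered groups
    (e.g. scores 0, 1, 2 for control, RL1, RL2); |Z| > 1.96 ~ p < 0.05."""
    n = sum(totals)
    pbar = sum(cases) / n  # pooled proportion
    num = sum(s * (c - t * pbar) for s, c, t in zip(scores, cases, totals))
    sw = sum(t * s for s, t in zip(scores, totals))
    sw2 = sum(t * s * s for s, t in zip(scores, totals))
    var = pbar * (1.0 - pbar) * (sw2 - sw * sw / n)
    return num / math.sqrt(var)

# Hypothetical counts: a case rate rising with exposure level gives Z > 0.
z = cochran_armitage_z(cases=[5, 10, 20], totals=[100, 100, 100], scores=[0, 1, 2])
```

The counts in the example are made up for illustration; the study's reported ORs (0.90 for RL1, 0.48 for RL2) would feed the same kind of ordered comparison.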

  6. Recent advances in remote handling at LAMPF

    International Nuclear Information System (INIS)

    Lambert, J.E.; Grisham, D.L.

    1985-01-01

    The Clinton P. Anderson Meson Physics Facility (LAMPF) has operated at beam currents above 200 microamperes since 1976. As a result, the main experimental beam line (Line A) has become increasingly radioactive over the years. Since 1976 the radiation levels have steadily increased from 100 mR/hr to levels that exceed 10,000 R/hr in the components near the pion production targets. During this time the LAMPF remote handling system, Monitor, has continued to operate successfully in the ever-increasing radiation levels, as well as with more complex remote-handling situations. This paper briefly describes the evolution of Monitor and specifically describes the complete rebuild of the A-6 target area, which is designated as the beam stop, but also includes isotope production capabilities and a primitive neutron irradiation facility. The new facility includes not only the beam stop and isotope production, but also facilities for proton irradiation and a ten-fold expansion in neutron irradiation facilities

  7. Safe handling of radioactive isotopes. Handbook 42

    International Nuclear Information System (INIS)

    1949-09-01

    With the increasing use of radioactive isotopes by industry, the medical profession, and research laboratories, it is essential that certain minimal precautions be taken to protect the users and the public. The recommendations contained in this handbook represent what is believed to be the best available opinions on the subject as of this date. As our experience with radioisotopes broadens, we will undoubtedly be able to improve and strengthen the recommendations for their safe handling and utilization. Through the courtesy of the National Research Council about a year ago, several hundred draft copies of this report were circulated to all leading workers and authorities in the field for comment and criticism. The present handbook embodies all pertinent suggestions received from these people. Further comment will be welcomed by the committee. One of the greatest difficulties encountered in the preparation of this handbook lay in the uncertainty regarding permissible radiation exposure levels - particularly for ingested radioactive materials. The establishment of sound figures for such exposure still remains a problem of high priority for many conditions and radioactive substances. Such figures as are used in this report represent the best available information today. If, in the future, these can be improved upon, appropriate corrections will be issued. The subject will be under continuous study by the two subcommittees mentioned above. The present Handbook has been prepared by the Subcommittee on the Handling of Radioactive Isotopes and Fission Products

  8. Safe handling of radioactive isotopes. Handbook 42

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1949-09-15

    With the increasing use of radioactive isotopes by industry, the medical profession, and research laboratories, it is essential that certain minimal precautions be taken to protect the users and the public. The recommendations contained in this handbook represent what is believed to be the best available opinions on the subject as of this date. As our experience with radioisotopes broadens, we will undoubtedly be able to improve and strengthen the recommendations for their safe handling and utilization. Through the courtesy of the National Research Council about a year ago, several hundred draft copies of this report were circulated to all leading workers and authorities in the field for comment and criticism. The present handbook embodies all pertinent suggestions received from these people. Further comment will be welcomed by the committee. One of the greatest difficulties encountered in the preparation of this handbook lay in the uncertainty regarding permissible radiation exposure levels - particularly for ingested radioactive materials. The establishment of sound figures for such exposure still remains a problem of high priority for many conditions and radioactive substances. Such figures as are used in this report represent the best available information today. If, in the future, these can be improved upon, appropriate corrections will be issued. The subject will be under continuous study by the two subcommittees mentioned above. The present Handbook has been prepared by the Subcommittee on the Handling of Radioactive Isotopes and Fission Products.

  9. Development of spent fuel remote handling technology

    International Nuclear Information System (INIS)

    Yoon, J. S.; Hong, H. D.; Kim, S. H.

    2004-02-01

    In this research, remote handling technology is developed for the advanced spent fuel conditioning process, which offers a possible solution for dealing with the rapidly increasing quantity of spent fuel. In detail, a fuel rod slitting device is developed for the decladding of the spent fuel. A series of experiments has been performed to find the optimal conditions for spent fuel voloxidation, which converts the UO₂ pellet into U₃O₈ powder. The design requirements of the ACP equipment for the hot test are established by analysing the modular requirements, radiation hardening and thermal protection of the process equipment, etc. A prototype of the servo manipulator is developed. The manipulator has an excellent performance in terms of payload-to-weight ratio, which is 30% higher than that of existing manipulators. To provide reliability and safety of the ACP, a 3-dimensional graphic simulator is developed. Using the simulator, the remote handling operation is simulated and, as a result, the optimal layout of the ACP is obtained. The supervisory control system is designed to control and monitor the several different unit processes. Also, a failure monitoring system is developed to detect possible accidents in the reduction reactor

  10. Radioactivity, shielding, radiation damage, and remote handling

    International Nuclear Information System (INIS)

    Wilson, M.T.

    1975-01-01

    Proton beams of a few hundred million electron volts of energy are capable of inducing hundreds of curies of activity per microampere of beam intensity into the materials they intercept. This adds a new dimension to the parameters that must be considered when designing and operating a high-intensity accelerator facility. Large investments must be made in shielding. The shielding itself may become activated and require special considerations as to its composition, location, and method of handling. Equipment must be designed to withstand large radiation dosages. Items such as vacuum seals, water tubing, and electrical insulation must be fabricated from radiation-resistant materials. Methods of maintaining and replacing equipment are required that limit the radiation dosages to workers. The high-intensity facilities of LAMPF, SIN, and TRIUMF and the high-energy facility of FERMILAB have each evolved a philosophy of radiation handling that matches their particular machine and physical plant layouts. Special tooling, commercial manipulator systems, remote viewing, and other techniques of the hot cell and fission reactor realms are finding application within accelerator facilities. (U.S.)

  11. Decontamination manual of RI handling laboratory

    International Nuclear Information System (INIS)

    Wadachi, Yoshiki

    2004-01-01

    Based on experience at the Japan Atomic Energy Research Institute (JAERI), essential and practical knowledge of radioactive contamination and its decontamination, and the method and procedure of floor decontamination, are described for researchers and managers in charge of handling radioisotopes (RI) in RI handling laboratories. The essential knowledge concerns the uniqueness of solid surface contamination derived from RI half-lives and quantities, surface contamination density limits, and the mode/mechanism of contamination. The principle of decontamination is to carry it out in a single operation, with recognition of the chemical form of the RI in use. As practical knowledge, there are physical and chemical methods of solid surface decontamination; the latter involve the use of inorganic acids, chelators and surfactants. Removal and replacement of contaminated solids, such as floor material, are often effective. Distribution mapping of surface contamination can be done by measuring the radioactivity in possibly contaminated areas, and is useful for planning effective decontamination. Floor surface decontamination covers both localized and widespread areas of the floor. It is essential to conduct the decontamination with reagent, proceeding from the more highly contaminated areas to the less contaminated ones. Skin decontamination with either neutral detergent or titanium oxide is also described. (N.I.)

  12. Robotic liquid handling and automation in epigenetics.

    Science.gov (United States)

    Gaisford, Wendy

    2012-10-01

    Automated liquid-handling robots and high-throughput screening (HTS) are widely used in the pharmaceutical industry to screen large libraries of small-molecule compounds for activity against disease-relevant target pathways or proteins. HTS robots capable of low-volume dispensing reduce assay setup times and provide highly accurate and reproducible dispensing, minimizing variation between sample replicates and eliminating the potential for manual error. Low-volume automated nanoliter dispensers ensure accurate pipetting within volume ranges that are difficult to achieve manually. They can also expand the range of screening conditions achievable from often limited amounts of valuable sample, and they reduce the consumption of expensive reagents. Accurate dispensing of lower volumes thus yields more information than could otherwise be obtained with manual dispensing. With the emergence of the field of epigenetics, an increasing number of drug discovery companies are beginning to screen compound libraries against a range of epigenetic targets. This review discusses the potential of low-volume liquid-handling robots for molecular biology applications such as quantitative PCR and epigenetics.

  13. Design and evaluation of a new ergonomic handle for instruments in minimally invasive surgery.

    Science.gov (United States)

    Sancibrian, Ramon; Gutierrez-Diez, María C; Torre-Ferrero, Carlos; Benito-Gonzalez, Maria A; Redondo-Figuero, Carlos; Manuel-Palazuelos, Jose C

    2014-05-01

    Laparoscopic surgery techniques have been demonstrated to provide substantial benefits to patients. However, surgeons work under demanding conditions because of the poor ergonomic design of the instruments. In this article, a new ergonomic handle design is presented. The handle is designed on ergonomic principles to provide both more intuitive manipulation of the instrument and a shape that reduces high-pressure zones in contact with the surgeon's hand. The ergonomic characteristics of the new handle were evaluated in objective and subjective studies. The experimental evaluation, performed with 28 volunteers, compared the new handle with the ring-handle (RH) concept of an instrument available on the market. The volunteers' muscle activation and motions of the hand, wrist, and arm were recorded while they performed different tasks; the measured data include electromyography and goniometry values. The subjective analysis revealed that most volunteers (64%) preferred the new prototype to the RH, reporting less pain and less difficulty in completing the tasks. The objective study revealed that the hyperflexion of the wrist required to manipulate the instrument is strongly reduced. The new ergonomic handle not only provides important ergonomic advantages but also improves efficiency in completing the tasks. Compared with RH instruments, the new prototype reduced the high-pressure areas and the extreme motions of the wrist. Copyright © 2014 Elsevier Inc. All rights reserved.

  14. Women of low socioeconomic status living with diabetes: Becoming adept at handling a disease

    Directory of Open Access Journals (Sweden)

    Wimonrut Boonsatean

    2015-12-01

    Objective: The objective of this study was to explore how Thai women of low socioeconomic status handle their type 2 diabetes. Methods: A qualitative interpretative method was used to study 19 women with type 2 diabetes in a suburban community in Thailand. Data were collected via semi-structured interviews and were analysed using inductive and constructive processes. Results: Participants' lives underwent many changes between their initial diagnoses and the later stages when they became adept at handling diabetes. Two themes emerged: (1) the transition to handling diabetes and (2) the influences of the social environment. The first theme encompassed confronting the disease, reaching a turning point in the process of adaptation, and developing expertise in handling diabetes. The second theme involved threats of loss of status and empowerment by families. These findings showed that becoming adept at handling diabetes required significant changes in the women's behaviours and required taking advantage of supportive influences from the social environment. Conclusion: The process of developing expertise in handling diabetes was influenced by both inner and outer factors that required adjustment in learning to live with diabetes. Furthermore, the reduction in social status that women experience when they become patients in the healthcare system may pose a barrier to women of low socioeconomic status becoming adept at handling diabetes. However, the empowerment the women received from their families acted as a powerful strategy to strengthen their handling of the disease. To develop accessible and sensitive health care for this population, it is important to pay attention to these findings.

  15. Remote Handling behind port plug in ITER

    International Nuclear Information System (INIS)

    Bede, O.; Neuberger, H.

    2006-01-01

    Different Test Blanket Modules (TBM) will be used in succession in the same equatorial ports of ITER. The remote handling operations for connecting and disconnecting an interface between the port plug of the EU-HCPB-TBM and the port cell equipment are investigated with the goal of reaching a quick and simple TBM exchange procedure. This paper describes the operations and systems required for connecting the TBM to its supply lines at this interface. The interface is located inside the free space of the port plug flange, between the port plug shield and the bioshield of the port cell behind it. The work site is reachable only through a narrow gate in the bioshield, opened temporarily during maintenance periods. This gate limits the dimensions of the whole system and its tools. The current design of the EU-HCPB-TBM foresees up to 9 supply lines that have to be connected inside the free space of one half of the port plug flange. The connection operations require positioning and adjustment of the tools for each pipe separately. Despite these tight constraints, it is still possible to find an industrial jointed-arm robot with sufficient payload that can penetrate into the working area. A mechanical system is necessary to move the robot from its storage place in the hot cell to the port plug over a distance of 6 m. Each operation requires different end-of-arm tools. The most specialized is a pipe positioner tool, which pulls the pipe ends together, aligns them before welding, and holds them in the proper position during the welding process. Weld seams can be made by an orbital welding tool, for which the pipe positioner tool has to provide space. An in-bore tool cannot be used because the pipes have no open ends through which the tool could be withdrawn. The orbital welding tool, designed for manual operation, must be modified to meet the requirements of remote handling.
The coolant is helium, so for eliminating the leak of helium it is

  16. Automated cassette-to-cassette substrate handling system

    Science.gov (United States)

    Kraus, Joseph Arthur; Boyer, Jeremy James; Mack, Joseph; DeChellis, Michael; Koo, Michael

    2014-03-18

    An automated cassette-to-cassette substrate handling system includes a cassette storage module for storing a plurality of substrates in cassettes before and after processing. A substrate carrier storage module stores a plurality of substrate carriers. A substrate carrier loading/unloading module loads substrates from the cassette storage module onto the plurality of substrate carriers and unloads substrates from the plurality of substrate carriers to the cassette storage module. A transport mechanism transports the plurality of substrates between the cassette storage module and the plurality of substrate carriers and transports the plurality of substrate carriers between the substrate carrier loading/unloading module and a processing chamber. A vision system recognizes recesses in the plurality of substrate carriers corresponding to empty substrate positions in the substrate carrier. A processor receives data from the vision system and instructs the transport mechanism to transport substrates to positions on the substrate carrier in response to the received data.
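    The data flow this abstract describes (vision system detects empty carrier recesses, processor instructs the transport mechanism to fill them from a cassette) can be sketched in a few lines. This is a minimal illustrative sketch only; all class and function names here are assumptions, not the actual system's API.

    ```python
    # Hypothetical sketch of the control loop described above: the "vision system"
    # reports empty recesses on a substrate carrier, and the "processor" directs
    # the transport step that moves substrates from the cassette into them.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class SubstrateCarrier:
        """Carrier plate with a fixed number of substrate positions."""
        positions: List[bool] = field(default_factory=lambda: [False] * 6)  # False = empty

    def detect_empty_recesses(carrier: SubstrateCarrier) -> List[int]:
        """Stand-in for the vision system: return indices of empty positions."""
        return [i for i, occupied in enumerate(carrier.positions) if not occupied]

    def load_carrier(carrier: SubstrateCarrier, cassette: List[str]) -> List[str]:
        """Processor logic: fill each detected empty recess from the cassette."""
        placed = []
        for idx in detect_empty_recesses(carrier):
            if not cassette:
                break                          # cassette exhausted
            substrate = cassette.pop(0)        # transport: pick from cassette
            carrier.positions[idx] = True      # transport: place on carrier
            placed.append(substrate)
        return placed

    carrier = SubstrateCarrier()
    cassette = [f"wafer-{n}" for n in range(8)]
    loaded = load_carrier(carrier, cassette)
    print(len(loaded), len(cassette))  # 6 substrates placed, 2 remain in the cassette
    ```

    The point of the sketch is the feedback structure: the transport step acts only on positions the vision step reports as empty, which is what lets the system tolerate partially filled carriers.
    
    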

  17. Handling newborn monkeys alters later exploratory, cognitive, and social behaviors.

    Science.gov (United States)

    Simpson, Elizabeth A; Sclafani, Valentina; Paukner, Annika; Kaburu, Stefano S K; Suomi, Stephen J; Ferrari, Pier F

    2017-08-18

    Touch is one of the first senses to develop and one of the earliest modalities for infant-caregiver communication. While studies have explored the benefits of infant touch in terms of physical health and growth, the effects of social touch on infant behavior are relatively unexplored. Here, we investigated the influence of neonatal handling on a variety of domains, including memory, novelty seeking, and social interest, in infant monkeys (Macaca mulatta; n=48) from 2 to 12 weeks of age. Neonates were randomly assigned to receive extra holding, with or without accompanying face-to-face interactions. Extra-handled infants, compared to standard-reared infants, exhibited less stress-related behavior and more locomotion around a novel environment, faster approach of novel objects, better working memory, and less fear towards a novel social partner. In sum, infants who received more tactile stimulation in the neonatal period subsequently demonstrated more advanced motor, social, and cognitive skills, particularly in contexts involving exploration of novelty, in the first three months of life. These data suggest that social touch may support behavioral development, offering promising possibilities for designing future early interventions, particularly for infants who are at heightened risk for social disorders. Copyright © 2017. Published by Elsevier Ltd.

  18. ITER - TVPS remote handling critical design issues

    International Nuclear Information System (INIS)

    1990-09-01

    This report describes critical design issues concerning remote maintenance of the ITER Torus Vacuum Pumping System (TVPS). The key issues under investigation are replacement of the regeneration/isolation valve seal and seal mechanism; the impact of inert gas operation; and the impact of remote handling (RH) on the building configuration and RH equipment requirements. Seal exchange concepts are developed and their impact on the valve design is identified. Concerns regarding the design and operation of RH equipment in an inert gas atmosphere are also explored. The report compares preliminary RH equipment options and pumping equipment maintenance frequencies, assesses their impact on the building design, and makes recommendations where conflicts exist between the pumping equipment and the building layout. (51 figs., 11 refs.)

  19. The environmental handling in the Oil Wells

    International Nuclear Information System (INIS)

    Carta Petrolera

    1998-01-01

    The oil industry produces environmental impacts on soil, air, water, fauna, and the socioeconomic aspects of the environment. For this reason the search for petroleum goes beyond what is commonly believed: once discovered, the oil must be extracted, transported, and treated to put it at the service of human development, and throughout this long and complex process environmental responsibility is a concern as old as the discovery of hydrocarbons itself. In the early days of the oil industry in Colombia, this activity was approached with little precaution regarding contamination; however, environmental laws around the world and the constant concern to preserve the environment led the country to enact strict legislation on the matter. Today Colombia has environmental legislation covering the handling of water, air, and soil within the oil industry.

  20. Handling process disturbances in petroleum production

    Energy Technology Data Exchange (ETDEWEB)

    Sten, T; Bodsberg, L; Ingstad, O; Ulleberg, T

    1988-06-01

    Factors important to the successful handling of major disturbances and crisis situations in petroleum production are discussed. Case studies based on interviews, questionnaires, and systematic observations were undertaken to identify critical factors in human-computer interface design, in operator competence and attitudes, and in work organization. It is shown that certain features of the human-computer interaction become critical when serious disturbances are encountered. Likewise, focusing on requirements during disturbances in particular has highlighted new aspects of operator competence and of work organization. The results are considered useful input to safety management in petroleum process plants, to the formulation of design specifications, and to identifying needs for further research on safety in offshore production.